Commit a096ba5

Merge pull request #5434 from foarsitter/uv-generated-project
Migrate to `uv` as package manager for the generated project
2 parents ad5287c + 7204f57 commit a096ba5

28 files changed

Lines changed: 470 additions & 309 deletions

README.md

Lines changed: 2 additions & 2 deletions
@@ -84,11 +84,11 @@ and then editing the results to include your name, email, and various configurat

 First, get Cookiecutter. Trust me, it's awesome:

-    pip install "cookiecutter>=1.7.0"
+    uv tool install "cookiecutter>=1.7.0"

 Now run it against this repo:

-    cookiecutter https://github.com/cookiecutter/cookiecutter-django
+    uvx cookiecutter https://github.com/cookiecutter/cookiecutter-django

 You'll be prompted for some values. Provide them, then a Django project will be created for you.

docs/2-local-development/developing-locally-docker.rst

Lines changed: 38 additions & 20 deletions
@@ -32,14 +32,32 @@ Build the Stack

 This can take a while, especially the first time you run this particular command on your development system::

-    $ docker compose -f docker-compose.local.yml build
+    docker compose -f docker-compose.local.yml build

 Generally, if you want to emulate production environment use ``docker-compose.production.yml`` instead. And this is true for any other actions you might need to perform: whenever a switch is required, just do it!

+After we have created our initial image, we need to generate a lockfile for our dependencies.
+Docker cannot write to the host system during builds, so we have to run the lockfile-generating command in the container.
+This is important for reproducible builds and ensures that the dependencies are installed correctly in the container.
+Updating the lockfile manually is normally not necessary when you add packages through ``uv add <package_name>``.
+
+This is done by running the following command: ::
+
+    docker compose -f docker-compose.local.yml run --rm django uv lock
+
+To be sure we are on the right track, we need to build our image again: ::
+
+    docker compose -f docker-compose.local.yml build
+
 Before doing any git commit, `pre-commit`_ should be installed globally on your local machine, and then::

-    $ git init
-    $ pre-commit install
+    git init
+    pre-commit install

 Failing to do so will result with a bunch of CI and Linter errors that can be avoided with pre-commit.

@@ -50,27 +68,27 @@ This brings up both Django and PostgreSQL. The first time it is run it might tak

 Open a terminal at the project root and run the following for local development::

-    $ docker compose -f docker-compose.local.yml up
+    docker compose -f docker-compose.local.yml up

 You can also set the environment variable ``COMPOSE_FILE`` pointing to ``docker-compose.local.yml`` like this::

-    $ export COMPOSE_FILE=docker-compose.local.yml
+    export COMPOSE_FILE=docker-compose.local.yml

 And then run::

-    $ docker compose up
+    docker compose up

 To run in a detached (background) mode, just::

-    $ docker compose up -d
+    docker compose up -d

 These commands don't run the docs service. In order to run docs service you can run::

-    $ docker compose -f docker-compose.docs.yml up
+    docker compose -f docker-compose.docs.yml up

 To run the docs with local services just use::

-    $ docker compose -f docker-compose.local.yml -f docker-compose.docs.yml up
+    docker compose -f docker-compose.local.yml -f docker-compose.docs.yml up

 The site should start and be accessible at http://localhost:3000 if you selected Webpack or Gulp as frontend pipeline and http://localhost:8000 otherwise.

@@ -79,8 +97,8 @@ Execute Management Commands

 As with any shell command that we wish to run in our container, this is done using the ``docker compose -f docker-compose.local.yml run --rm`` command: ::

-    $ docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
-    $ docker compose -f docker-compose.local.yml run --rm django python manage.py createsuperuser
+    docker compose -f docker-compose.local.yml run --rm django python manage.py migrate
+    docker compose -f docker-compose.local.yml run --rm django python manage.py createsuperuser

 Here, ``django`` is the target service we are executing the commands against.
 Also, please note that the ``docker exec`` does not work for running management commands.
@@ -136,7 +154,7 @@ The three envs we are presented with here are ``POSTGRES_DB``, ``POSTGRES_USER``

 One final touch: should you ever need to merge ``.envs/.production/*`` in a single ``.env`` run the ``merge_production_dotenvs_in_dotenv.py``: ::

-    $ python merge_production_dotenvs_in_dotenv.py
+    python merge_production_dotenvs_in_dotenv.py

 The ``.env`` file will then be created, with all your production envs residing beside each other.
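The essence of that merge step is concatenating the per-service env files into one ``.env``. A minimal sketch of the idea, assuming plain ``KEY=value`` files — the real ``merge_production_dotenvs_in_dotenv.py`` may differ in details, and the file names and values below are placeholders:

```python
import tempfile
from pathlib import Path

def merge_dotenvs(sources, target):
    """Concatenate several KEY=value files into one .env file."""
    text = "\n".join(src.read_text().strip() for src in sources)
    target.write_text(text + "\n")

# Throwaway files standing in for .envs/.production/.django and .postgres:
tmp = Path(tempfile.mkdtemp())
(tmp / ".django").write_text("DJANGO_DEBUG=False\n")
(tmp / ".postgres").write_text("POSTGRES_DB=mydb\n")
merge_dotenvs([tmp / ".django", tmp / ".postgres"], tmp / ".env")

merged = (tmp / ".env").read_text()
print(merged)
# DJANGO_DEBUG=False
# POSTGRES_DB=mydb
```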

@@ -149,15 +167,15 @@ Activate a Docker Machine

 This tells our computer that all future commands are specifically for the dev1 machine. Using the ``eval`` command we can switch machines as needed.::

-    $ eval "$(docker-machine env dev1)"
+    eval "$(docker-machine env dev1)"

 Add 3rd party python packages
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

-To install a new 3rd party python package, you cannot use ``pip install <package_name>``, that would only add the package to the container. The container is ephemeral, so that new library won't be persisted if you run another container. Instead, you should modify the Docker image:
-You have to modify the relevant requirement file: base, local or production by adding: ::
+To install a new 3rd party python package, you cannot simply use ``uv add <package_name>``, as that would only add the package to the container. The container is ephemeral, so that new library won't be persisted if you run another container. Instead, you should modify the Docker image:
+You have to modify ``pyproject.toml`` and add the package either to ``project.dependencies`` or to ``tool.uv.dev-dependencies``: ::

-    <package_name>==<package_version>
+    "<package_name>==<package_version>"

 To get this change picked up, you'll need to rebuild the image(s) and restart the running container: ::

@@ -176,7 +194,7 @@ If you are using the following within your code to debug: ::

 Then you may need to run the following for it to work as desired: ::

-    $ docker compose -f docker-compose.local.yml run --rm --service-ports django
+    docker compose -f docker-compose.local.yml run --rm --service-ports django


 django-debug-toolbar
@@ -190,8 +208,8 @@ docker

 The ``container_name`` from the yml file can be used to check on containers with docker commands, for example: ::

-    $ docker logs <project_slug>_local_celeryworker
-    $ docker top <project_slug>_local_celeryworker
+    docker logs <project_slug>_local_celeryworker
+    docker top <project_slug>_local_celeryworker

 Notice that the ``container_name`` is generated dynamically using your project slug as a prefix
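As a rough illustration of that naming scheme — an assumption based on the example names above, not the template's actual compose configuration — the container name is the project slug joined with the environment and service:

```python
def container_name(project_slug: str, env: str, service: str) -> str:
    # Hypothetical reconstruction of the <slug>_<env>_<service> pattern
    # seen in names like <project_slug>_local_celeryworker.
    return f"{project_slug}_{env}_{service}"

print(container_name("my_awesome_project", "local", "celeryworker"))
# my_awesome_project_local_celeryworker
```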

@@ -331,7 +349,7 @@ Assuming that you registered your local hostname as ``my-dev-env.local``, the ce

 Rebuild your ``docker`` application. ::

-    $ docker compose -f docker-compose.local.yml up -d --build
+    docker compose -f docker-compose.local.yml up -d --build

 Go to your browser and type in your URL bar ``https://my-dev-env.local``.

docs/2-local-development/developing-locally.rst

Lines changed: 19 additions & 29 deletions
@@ -1,37 +1,27 @@
 Getting Up and Running Locally
 ==============================

-.. index:: pip, virtualenv, PostgreSQL
+.. index:: PostgreSQL


 Setting Up Development Environment
 ----------------------------------

 Make sure to have the following on your host:

-* Python 3.12
+* uv https://docs.astral.sh/uv/getting-started/installation/
 * PostgreSQL_.
 * Redis_, if using Celery
 * Cookiecutter_

-First things first.
-
-#. Create a virtualenv: ::
-
-       $ python3.12 -m venv <virtual env path>
-
-#. Activate the virtualenv you have just created: ::
-
-       $ source <virtual env path>/bin/activate
-
 #. .. include:: generate-project-block.rst

 #. Install development requirements: ::

-       $ cd <what you have entered as the project_slug at setup stage>
-       $ pip install -r requirements/local.txt
-       $ git init # A git repo is required for pre-commit to install
-       $ pre-commit install
+       cd <what you have entered as the project_slug at setup stage>
+       uv sync
+       git init # A git repo is required for pre-commit to install
+       uv run pre-commit install

 .. note::

@@ -40,7 +30,7 @@ First things first.

 #. Create a new PostgreSQL database using createdb_: ::

-       $ createdb --username=postgres <project_slug>
+       createdb --username=postgres <project_slug>

 ``project_slug`` is what you have entered as the project_slug at the setup stage.

@@ -54,7 +44,7 @@ First things first.

 #. Set the environment variables for your database(s): ::

-       $ export DATABASE_URL=postgres://postgres:<password>@127.0.0.1:5432/<DB name given to createdb>
+       export DATABASE_URL=postgres://postgres:<password>@127.0.0.1:5432/<DB name given to createdb>

 .. note::
@@ -71,15 +61,15 @@ First things first.

 #. Apply migrations: ::

-       $ python manage.py migrate
+       uv run python manage.py migrate

 #. If you're running synchronously, see the application being served through Django development server: ::

-       $ python manage.py runserver 0.0.0.0:8000
+       uv run python manage.py runserver 0.0.0.0:8000

 or if you're running asynchronously: ::

-       $ uvicorn config.asgi:application --host 0.0.0.0 --reload --reload-include '*.html'
+       uv run uvicorn config.asgi:application --host 0.0.0.0 --reload --reload-include '*.html'

 If you've opted for Webpack or Gulp as frontend pipeline, please see the :ref:`dedicated section <bare-metal-webpack-gulp>` below.

@@ -136,11 +126,11 @@ Following this structured approach, here's how to add a new app:

 #. **Create the app** using Django's ``startapp`` command, replacing ``<name-of-the-app>`` with your desired app name: ::

-       $ python manage.py startapp <name-of-the-app>
+       uv run python manage.py startapp <name-of-the-app>

 #. **Move the app** to the Django Project Root, maintaining the project's two-tier structure: ::

-       $ mv <name-of-the-app> <django_project_root>/
+       mv <name-of-the-app> <django_project_root>/

 #. **Edit the app's apps.py** change ``name = '<name-of-the-app>'`` to ``name = '<django_project_root>.<name-of-the-app>'``.

@@ -166,7 +156,7 @@ For instance, one of the packages we depend upon, ``django-allauth`` sends verif

 #. Make it executable: ::

-       $ chmod +x mailpit
+       chmod +x mailpit

 #. Spin up another terminal window and start it there: ::

@@ -199,18 +189,18 @@ If the project is configured to use Celery as a task scheduler then, by default,

 Next, make sure `redis-server` is installed (per the `Getting started with Redis`_ guide) and run the server in one terminal::

-    $ redis-server
+    redis-server

 Start the Celery worker by running the following command in another terminal::

-    $ celery -A config.celery_app worker --loglevel=info
+    uv run celery -A config.celery_app worker --loglevel=info

 That Celery worker should be running whenever your app is running, typically as a background process,
 so that it can pick up any tasks that get queued. Learn more from the `Celery Workers Guide`_.

 The project comes with a simple task for manual testing purposes, inside `<project_slug>/users/tasks.py`. To queue that task locally, start the Django shell, import the task, and call `delay()` on it::

-    $ python manage.py shell
+    uv run python manage.py shell
 >> from <project_slug>.users.tasks import get_users_count
 >> get_users_count.delay()

@@ -231,11 +221,11 @@ If you've opted for Gulp or Webpack as front-end pipeline, the project comes con
 #. Make sure that `Node.js`_ v18 is installed on your machine.
 #. In the project root, install the JS dependencies with::

-       $ npm install
+       npm install

 #. Now - with your virtualenv activated - start the application by running::

-       $ npm run dev
+       npm run dev

 This will start 2 processes in parallel: the static assets build loop on one side, and the Django server on the other.

docs/Makefile

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@

 # You can set these variables from the command line.
 SPHINXOPTS =
-SPHINXBUILD = sphinx-build
+SPHINXBUILD = uv run sphinx-build
 SOURCEDIR = .
 BUILDDIR = _build
