Two of the biggest benefits of `pipenv` and `poetry` are dependency locking and hash checking. Dependency locking means you can specify the direct dependencies your code requires, for example `celery==4.4.*`, and the tooling will lock not only `celery` to a specific version, but also every dependency pulled in by `celery`. Hash checking ensures the package you download matches a hash of the package when it was first downloaded. This means that when `celery` (or any of its dependencies) is installed on a different system, it matches the same file hash as when it was set up by the developer. If the package is in some way tampered with between the two installs, hash checking will fail, preventing the new (and potentially malicious) package from being installed.
The combination of these two features makes your project installation deterministic: given the same requirements/lock file, you’ll always get the same set of packages installed. This prevents all manner of issues when dependencies can shift underneath the code between developers’ machines, CI, and deployment. It’s a huge improvement over how we previously managed dependencies.
`pipenv` and `poetry` have many other features, but I find myself rarely using them. Managing Python virtual environments is nice, but for experienced Python developers, skipping `python -m venv .venv` before installing a project is not game-changing. I have limited experience with `poetry`, but `pipenv`'s performance and edge-case bugs have been a regular sore point in the projects we use it on.
Locking Dependencies with pip-compile
If you’re willing to give up the bells and whistles, `pip-compile` (provided by `pip-tools`) is a viable alternative. It provides both locking and hash-pinning for your dependencies. Once installed, you’ll typically create a `requirements.in` file. This is where you define your project’s top-level dependencies (similar to `pipenv`’s `Pipfile` or `pyproject.toml` in `poetry`). A basic example might look something like this:
```
Django==2.2.*
psycopg2
celery>4.4
```
To “lock” these dependencies, you can run:
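A minimal sketch of the command (the `--generate-hashes` flag is assumed here, since the output shown below includes package hashes):

```shell
# Compile requirements.in into a fully pinned requirements.txt with hashes
pip-compile --generate-hashes requirements.in
```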
This generates a standard `requirements.txt` file with all dependencies, including the package hashes provided by PyPI. Here’s a line from that file:
```
pytz==2019.3 \
    --hash=sha256:1c557d7d0e871de1f5ccd5833f60fb2550652da6be2693c1e02300743d21500d \
    --hash=sha256:b02c06db6cf09c12dd25137e563b31700d3b80fcc4ad23abb7a315f2789819be \
    # via django
```
Note we didn’t have `pytz` in our `requirements.in`, but it’s included in `requirements.txt` because it is required by `django` (which `pip-compile` is kind enough to note in the file).
If your project uses a `Makefile`, this workflow is well suited for it. The following will allow you to run `make requirements.txt`, and the file will be regenerated if and only if `requirements.in` has changed since `requirements.txt` was last generated:
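A sketch of such a rule (`make`’s timestamp comparison on the prerequisite provides the if-and-only-if behavior; the `--generate-hashes` flag is an assumption carried over from the hashed output above):

```make
# Rebuild requirements.txt only when requirements.in is newer
requirements.txt: requirements.in
	pip-compile --generate-hashes --output-file requirements.txt requirements.in
```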
`pip-compile` also provides the option of selectively upgrading individual packages with the `--upgrade-package` argument, as noted in its `README.md`.
Installing Dependencies
Installing the dependencies is as simple as:
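The standard `pip` invocation works here; when a requirements file contains hashes, `pip` enforces hash-checking automatically:

```shell
pip install -r requirements.txt
```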
You can also use `pip-sync` (included in `pip-tools`) to reconcile an existing virtual environment with whatever exists in `requirements.txt`. Not only will this install or upgrade packages as needed, but it will also remove packages that are no longer in `requirements.txt`.
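A usage sketch: running `pip-sync` against the compiled file brings the environment exactly in line with it, removing anything extraneous:

```shell
# Install, upgrade, and uninstall as needed to match requirements.txt
pip-sync requirements.txt
```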
`pipenv` and `poetry` get a lot of attention in the Python community, and rightfully so. They are great projects that certainly scratch an itch for developers. That being said, if you’re just looking for something to handle dependency management, `pip-tools` may be all you need.