- Frost Ming
When you develop a Python project, you need to install the project's dependencies. For a long time, tutorials and articles have told you to use a virtual environment to isolate the project's dependencies, so that you don't contaminate the working set of other projects or the global interpreter, and avoid possible version conflicts. We usually have to do these things:
$ python3 -m venv venv                    # make a virtualenv named `venv`
$ . venv/bin/activate                     # activate the virtualenv
(venv) $ pip install -r requirements.txt  # install the dependencies
There are also workflow tools that simplify this process, such as Pipenv and Poetry. They create virtual environments for you behind the scenes and then install dependencies into them, and they are used by a wide range of users. A virtual environment contains a Python interpreter and has the same directory structure as a normal installation, so that it can be used as if it were a standalone Python installation.
The problems with virtual environments
Virtualenvs help us isolate project dependencies, but things get tricky when it comes to nested venvs: one installs the virtualenv manager (like Pipenv or Poetry) using a venv-encapsulated Python, then creates more venvs using that tool, all of which are based on the encapsulated Python. One day a minor release of Python comes out, and one has to check all those venvs and upgrade them if required before the outdated Python version can safely be deleted.
Another scenario is global tools. Many tools are not tied to any specific virtualenv and are supposed to work with each of them; examples are profiling tools and third-party REPLs. We also wish to install them in their own isolated environments. This is impossible to achieve with virtualenvs: even if you have activated the virtualenv of the target project you want to work on, the tool lives in its own virtualenv and can only see the libraries installed there. So we have to install the tool once for each project.
I've been maintaining the Pipenv project as a collaborator for the past two years and became a member of PyPA in early 2020. I keep asking myself whether a virtual environment is really a must-have for Python projects. npm, for example, doesn't need a cloned node binary per project, just a node_modules directory that is unique to each project.
PEP 582 -- Python local packages directory
The solution has existed for a long time. PEP 582 originated in 2018 and is still a draft proposal as of this writing, but I found that it is exactly Python's counterpart of node_modules.
Say you have a project with the following structure:
.
├── __pypackages__
│   └── 3.8
│       └── lib
└── my_script.py
As specified in PEP 582, if you run my_script.py with Python 3.8, __pypackages__/3.8/lib will be added to sys.path, and the libraries inside will become importable in my_script.py.
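To make the mechanism concrete, here is a minimal sketch of the path injection the PEP describes. This is not PEP 582's reference implementation (real support belongs in the interpreter or a wrapper, not in user code); the function name is made up for illustration:

```python
import pathlib
import sys


def add_pypackages(project_root="."):
    """Sketch of PEP 582-style path injection: prepend
    <project>/__pypackages__/X.Y/lib to sys.path, where X.Y is the
    running interpreter's major.minor version, so libraries installed
    there become importable without a virtualenv."""
    version = "{}.{}".format(*sys.version_info[:2])
    lib = pathlib.Path(project_root) / "__pypackages__" / version / "lib"
    if lib.is_dir() and str(lib) not in sys.path:
        sys.path.insert(0, str(lib))
    return lib
```

Note that the directory is keyed by the interpreter version, which is why the example tree above contains a `3.8` level.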
Now let's review the two problems I mentioned in the last section and see how they change with the power of PEP 582. For the first problem, the main cause is that a virtual environment is bound to a cloned Python interpreter, on which all subsequent library searching is based. This takes advantage of Python's existing mechanisms without any other complex changes, but it makes the entire virtual environment unusable once its Python interpreter is stale. With the local packages directory, there is no Python interpreter inside it any more; the library path is appended directly to sys.path, so you can freely move and copy the directory.
For the second, once again, you just run the tool against the project you want to analyze, and the __pypackages__ directory sitting inside the project will be loaded automatically. This way you only need to keep one copy of the global tool while making it work with multiple projects.
PDM -- A new Python package manager and workflow tool
Starting from the PEP, I made PDM, a new Python package manager and workflow tool that leverages PEP 582 to get rid of virtualenv entirely. It installs dependencies into the local packages directory __pypackages__ and makes Python interpreters aware of it with a very simple setup. It is not only an implementation of PEP 582 but also the only package manager that supports PEP 621, a new metadata format based on pyproject.toml which has recently become a standard. It is foreseeable that pip will also gradually support this format. Besides, PDM uses the same dependency resolver as pip and has a full-featured plugin system, allowing community-contributed plugins to enhance its functionality.
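For illustration, a minimal PEP 621 metadata table in pyproject.toml might look like the following (the project name and dependencies here are made up):

```toml
[project]
name = "my-project"
version = "0.1.0"
description = "An example project using PEP 621 metadata"
requires-python = ">=2.7,<4.0"
dependencies = [
    "requests>=2.20",
]
```

All of this lives in standard fields under the `[project]` table, rather than in a tool-specific section, which is what lets different tools share the same metadata.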
In PDM, PEP 582 is not mandatory; you can also stick with virtualenvs. PDM can detect existing venvs, though it does not create new ones.
Another noteworthy thing is its dependency resolution mechanism -- it tries to lock versions that are compatible with the requires-python value of the project. Say your project requires Python 2.7, or Python 3.6 and above, and you want to add pytest as a development dependency. In Pipenv (ver. 2020.11.15) you have to pin pytest = "<5" manually in Pipfile, and in Poetry (ver. 1.1.4), if you run poetry add -D pytest, you will get:
The current project's Python requirement (>=2.7,<3.0 || >=3.6,<4.0) is not compatible with some of the required packages Python requirement:
  - pytest requires Python >=3.6, so it will not be satisfied for Python >=2.7,<3.0
Yes, it tells you to raise your project's Python requirement. In PDM, however, you can lock successfully:
❯ pdm add -d pytest
Adding packages to dev dependencies: pytest
✔ 🔒 Lock successful
Changes are written to pdm.lock.
Changes are written to pyproject.toml.
Synchronizing working set with lock file: 11 to add, 0 to update, 0 to remove

  ✔ Install atomicwrites 1.4.0 successful
  ✔ Install colorama 0.4.4 successful
  ✔ Install packaging 20.8 successful
  ✔ Install more-itertools 5.0.0 successful
  ✔ Install pyparsing 2.4.7 successful
  ✔ Install attrs 20.3.0 successful
  ✔ Install pluggy 0.13.1 successful
  ✔ Install py 1.10.0 successful
  ✔ Install pytest 4.6.11 successful
  ✔ Install six 1.15.0 successful
  ✔ Install wcwidth 0.2.5 successful

🎉 All complete!
As you can see, pytest is pinned to the 4.* series, which is compatible with both Python 2.7 and Python 3.6+.
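The idea behind this behavior can be sketched roughly as follows. This is not PDM's actual resolver code, and the pytest release data below is invented for illustration; the sketch only compares lower bounds of `>=X.Y` specifiers, while a real resolver handles full specifier sets:

```python
def min_python(spec):
    """Extract the lower bound from a '>=X.Y' style Requires-Python
    specifier as a comparable tuple, e.g. '>=3.6' -> (3, 6)."""
    for part in spec.split(","):
        part = part.strip()
        if part.startswith(">="):
            return tuple(int(x) for x in part[2:].split("."))
    return (0,)  # no lower bound found


def pick(candidates, project_min_python):
    """candidates: [(version, requires_python)], newest first.
    Return the newest release whose Requires-Python lower bound does
    not exceed the lowest Python version the project supports."""
    for version, requires in candidates:
        if min_python(requires) <= project_min_python:
            return version
    return None


# Hypothetical release data, newest first (illustration only):
pytest_releases = [
    ("6.2.1", ">=3.6"),
    ("5.4.3", ">=3.5"),
    ("4.6.11", ">=2.7"),  # last series supporting Python 2
]

print(pick(pytest_releases, (2, 7)))  # a 2.7-compatible project gets 4.6.11
print(pick(pytest_releases, (3, 6)))  # a 3.6+ project gets the newest release
```

A resolver that filters candidates this way can always hand back a lock file satisfying requires-python, instead of erroring out the way Poetry does above.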
Some random words
You may have seen this comic before: there are already so many package managers in the Python world -- do we really need a new one? I don't think so. Gladly, we have seen many improvements in the Python packaging ecosystem and in the official installer pip, such as PEP 517/PEP 518 and the new dependency resolver, with more to come in the future. But before that day comes, why not try something different from tradition, and why not make something new that makes at least myself happy? If you think so too, PDM may hopefully be an option.