Improving the developer experience for Python on Ubuntu

I have been using Ubuntu as a development environment for Python professionally since 2007, and I still recall my excitement when I received those shiny CDs with the latest Ubuntu version back in the day. I enjoy Ubuntu as a Python development environment to this day, but when you have been using something for that long, you tend to develop a blind spot for issues new developers might face.

Developing with Python itself has its issues, especially when it comes to packaging, but when I have a look around, there seem to be a couple of recurring issues that are more specific to Python on Ubuntu:

  • people do not know how to install a new Python version

  • people try to upgrade an existing Python version and break their system

  • people are not aware that Python on Ubuntu is split into several packages, e.g. pip and venv are shipped separately

  • people are confused about whether they should use Python packages from PyPI or via apt, and what consequences each decision has
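To make the split-packaging point concrete, here is roughly what a new user runs into on a fresh Ubuntu install (a sketch; the package names are the ones used in the Ubuntu archive):

```shell
# On a stock install, "python3 -m venv demo" fails with a hint about
# python3-venv, and there is no pip at all. Both live in separate packages:
sudo apt install python3-pip python3-venv

# Only after that does the usual workflow succeed:
python3 -m venv ~/.venvs/demo
~/.venvs/demo/bin/pip install requests
```

This is the exact step the dangerous blog posts tend to skip.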

I would like to understand how we can work together to improve the experience.

Do you think this is a technical problem? Or would it make sense to provide official documentation on how to handle those cases? Then again, there will always be that dangerous blog post somewhere on the internet that suggests uninstalling the system Python.

Have you experienced papercuts yourself? Tell me!

Next week I will be at PyCon & PyData in Berlin where I will speak about that topic. I want to get feedback, and I am looking for suggestions to make Ubuntu the best Python development environment it can be!


One thing I’ve found, at least in my own opinion, is that the Python tooling ecosystem suffers from the same issue as the JavaScript framework ecosystem: there is a new tool every six months that everyone suggests you start using. The biggest papercut I’ve had is that I haven’t really found the best way to manage virtual environments. I’ve used pyenv, conda, raw venv, etc., but there are still challenges, especially if you have 30+ projects on your workstation.

For the technical side of the house, it would be nice to have a virtual environment manager that plugs into Ubuntu seamlessly. Also, documentation on how to make system-wide Python packages on Ubuntu would be really nice. I’ve often found myself getting slapped by Python’s interpreted nature, where you have to hunt down missing packages or manipulate PYTHONPATH so that everything works as expected.
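The PYTHONPATH dance in question looks roughly like this (the directory and module names here are made up for the example):

```shell
# A throwaway module living outside site-packages
mkdir -p /tmp/mytools
printf 'GREETING = "hello from mytools"\n' > /tmp/mytools/mymod.py

# This fails: the interpreter does not search /tmp/mytools
python3 -c 'import mymod' 2>/dev/null || echo "not found"

# Prepending the directory to PYTHONPATH makes the import work
PYTHONPATH=/tmp/mytools python3 -c 'import mymod; print(mymod.GREETING)'
```

It works, but it is exactly the kind of per-shell state that is easy to get wrong and hard to debug later.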

Also, telling people about pipx would be great for those who want CLI applications to be available system-wide, but don’t want to fetch them from the Ubuntu package archive. That way they don’t try to run pip as root :sweat_smile:
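For reference, the pipx route is something like the following (a sketch; black is just a stand-in for any CLI tool from PyPI):

```shell
# pipx itself is available from the Ubuntu archive
sudo apt install pipx

# Installs the tool into its own private venv and exposes the entry point
pipx install black

# Makes sure ~/.local/bin is on PATH so the entry points resolve
pipx ensurepath
```

Each tool gets an isolated environment, so nothing touches the system Python.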


I personally hate Python, not because the language is bad, but because the language makes breaking changes with every minor release that can break tons of apps and libraries. Libraries themselves can also make breaking changes, and there doesn’t seem to be a good mechanism for determining whether a library is or isn’t compatible with a particular application. Compound that with the fact that Python is interpreted rather than compiled and lacks static typing - some issues don’t appear until a program reaches the particular point in its code that makes it go haywire.

The way Ubuntu handles Python makes this even worse because now there’s only one or two versions of Python available at any one time, plus a slew of libraries and applications that are hopefully compatible with each other and one or both versions of Python. (Build-time unit tests can shake out ones that have compatibility issues, but there’s no guarantee they’ll work right or that they even are included with an app or library. Manuskript is a great example of an app that has been bitten by incompatibility issues.)

IMO, the Python experience in Ubuntu is somewhat substandard for application distribution, but it’s even more substandard for development because now people on an LTS version of Ubuntu are relying on old Python versions and outdated libraries to write new code! This is really bad because, for instance, I can still use distutils on my Jammy box and will be able to for the next three (or eight or even ten) years if I want to, for brand new code. But distutils doesn’t even exist anymore in more modern Python versions. It means that new code I write is outdated before I even finish typing it. Even if I know better than to rely on the system Python libraries and use pip instead, I still have to grapple with my Python version being out of date.

I think what might help things is if we somehow were able to treat Python the way Windows treats Visual C++ redistributables. If it was possible to install any version of Python on any Ubuntu release, that would alleviate some of the problems. (We obviously can’t maintain every Python library in the world, but why reimplement Python’s package manager in apt if we don’t have to?) Perhaps providing or at least pointing people to a way of doing this would be helpful. As for libraries, maybe we can somehow inform the user that system Python libraries are intended only for the use of the system’s software and should not be relied upon for development?

Just some random thoughts from someone who tries to avoid coding in Python. Maybe some of that will be helpful.


I’ve found that issue of tooling very annoying, especially because I don’t want to risk one of them causing issues with my host system, so I use LXD to do all development work inside LXC containers.


Do you refer to creating debs?

Yes :+1: - I’ve always found it a bit interesting since not every Python project follows the same directory structure.

That’s why I’ve always used venvs for shipping Python code bases, because then I know what versions of everything will be available. I only ever use the system Python interpreter if I’m writing a utility script specifically for the system, or if I need to do some pre-provisioning for a bigger, better installation.

But yes - Python does change at quite a quick cadence, so it makes it quite hard to support scripts over time unless you ship a venv for each Python code base you have. However, I can get things done so much faster in Python than in other languages.

If you have to write documentation about how your feature works, then you failed :smiley:

but seriously, one of the main problems is not even upgrading. As a maintainer of Python packages, I would like to locally run tests for all the environments my package supports. With tox this is easy. However, for tox to work, I need the Python interpreters to be available.
On Windows this is dead simple: install as many Pythons as you want, just add them to PATH.
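For readers who have not used tox: a minimal tox.ini like the following (a sketch) asks for one test environment per interpreter, and each of python3.9 through python3.12 must be discoverable on the system for the full matrix to run - which is exactly the hard part on Ubuntu.

```ini
[tox]
envlist = py39,py310,py311,py312

[testenv]
deps = pytest
commands = pytest
```

On Windows the interpreters just need to be installed; on Ubuntu most of them are not in the default archive at all.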

On Linux you cannot just apt install all the required versions; you usually need to add extra PPAs, and even then pieces like pip, distutils and venv come only as separate packages from those PPAs.
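For the record, the PPA most people mean here is deadsnakes; the usual incantation is roughly this (a sketch; note that even from the PPA, venv and distutils are separate versioned packages):

```shell
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update
sudo apt install python3.11 python3.11-venv python3.11-distutils
```

That is three extra steps, and one more set of packages to remember, per Python version.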

I think we can find a better approach to handle it on Ubuntu.
It should be as simple as that:

apt-get install python3.8
apt-get install python3.9
apt-get install python3.10
apt-get install python3.11
apt-get install python3.12

and all of these should come prebundled with default packages like pip, so a user can immediately do

python3.11 -m venv venv
source <...>/activate
pip install <...>

for developers who want to optimize their disk space, we can either provide instructions on how to compile Python themselves or provide something like apt-get install python3.11-minimal


@beliaev-maksim I had very similar thoughts.

As one of the maintainers of tox, I am very well aware of that issue. People install tox and then they are confused why their tests are not running for all the versions of Python they have specified.

Fedora handles this very smoothly: when you install tox as a system package, all supported Python versions are automatically installed as well.

For Ubuntu, and especially the LTS releases, this needs more thought. Which Python versions do you intend to support via system packages? The currently upstream-maintained ones? That would be 5-6 versions. But as this is an LTS, and new Python versions come out every year… do you want to maintain those 5-6 plus another 10-12 Python versions for a single Ubuntu release over its lifetime? This gets out of hand really quickly.

And FWIW, there is already a python3-minimal package out there, though I think there is room for discussion about what a minimal and what a full version of Python is, and maybe something in between.


One thing that Fedora does differently (and notice I don’t say “better”) is that we intentionally support only one Python version for use with other distribution packages. We ship 7-8 other CPython versions and 2-3 PyPys, which is indeed a lot. However, we only make sure those work with venv/tox/virtualenv; beyond that, we don’t support them. Other Fedora packages cannot use e.g. Python 3.10 just because they want or need it. That significantly reduces the “we need X different packages for each Python library” problem.


Do we not already cover Python (versions other than the main one) via Pro and multiverse support?

I do not want to talk “solutions” here; I just want to share the common use case that the majority of Python package maintainers and developers have. How to address it is a different question.
Maybe we could even provide snaps for all Python versions? :man_shrugging:
