
> Not only because the syntax is more human-friendly, but also because the Python interpreter is natively integrated in all Unix distros

That's a rather optimistic evaluation - literally anything beyond "import json" will likely lead you into the abyss of virtual envs. Try running something created with, say, Python 3.13.x on Ubuntu 22.04 or even 24.04 (LTS) / Rocky 9 and the whole can of worms opens.

things like virtual envs + containers (Docker-like) / version managers quickly become a must.



“import json” is the kind of thing which requires picking and installing libraries in batteries-not-included languages, and it’s just one of many modules which are in the standard library. That’s not a compelling basis for large projects but over the years I’ve shipped a ton of useful production code which never needed more than the stdlib and thus spent no time at all thinking about deployment or security patching.

Also, it’s not the 2000s any more. Using venv to isolate application installs is not very hard anymore and there have been decent package managers for a long time.
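
For anyone who hasn't looked since then, the whole dance is a couple of commands (package and script names here are just examples):

  python3 -m venv .venv
  .venv/bin/pip install requests
  .venv/bin/python my_script.py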


The official package manager is pip, it's broken, and there has been a new "permanent solution" replacement for it each year.


“broken” is hyperbole. It works fine for millions of people every day. If you have some specific scenarios where you want it to be better, it’s better to say those rather than just complain about an open source project.


Millions of people put it into Docker, or they just deal with it and you see the results with tons of Stackoverflow questions


> Millions of people put it into Docker, or they just deal with it and you see the results with tons of Stackoverflow questions

Arrogantly wrong.

I've coded in Python for almost 20 years. Many of those years I've had it as my primary language at work.

2024 was the first year I actually needed a virtualenv. Before that, I'd happily use pip to install whatever I want, and never had a version conflict that caused problems.

I often encounter junior folks who default to using a virtualenv, not because they need to, but because they've been brainwashed into believing that "you're doing it wrong if you don't use one".

In any case, people should use uv these days.


>2024 was the first year I actually needed a virtualenv. Before that, I'd happily use pip to install whatever I want, and never had a version conflict that caused problems.

Okay I'll bite: how did you deal with situations where you needed to work on two different projects that required different versions of the same dependency?


Some packages are written by researchers in fields that aren’t engineering. Sometimes you go all the way and reach for a VM.


I'm not denying that some need a VM or virtualenv. They exist for a reason.


I was talking about pip, not venv. I don't use venv either, not because I think it's a bad idea but because I can't be bothered. Stuff does end up conflicting unless I use Docker (lol) or uv.


If you use uv you are using virtual environments. You're merely being shielded from a few minutes worth of education about how they work (https://chriswarrick.com/blog/2018/09/04/python-virtual-envi...).


Well, I also don't have to deal with the venv then; uv does it for me. Which is how I suspected it works, because I don't know how else it would.


For some reason copying a venv directory causes problems with internal links. Fun all around.


The problem is twofold:

- all generated console scripts hardcode the path to python during creation for the shebang

- some of the binaries generated by install processes can link to absolute paths in the venv

Conda does quite a bit of work under the hood to reduce disk space and ran into related issues described here: https://docs.conda.io/projects/conda/en/stable/dev-guide/dee...
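
To make the first point concrete (the path is just an example), every console script pip generates inside a venv gets an absolute shebang, so a copied venv keeps pointing at the original location:

  $ head -1 .venv/bin/pip
  #!/home/alice/project/.venv/bin/python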


Well that explains why you think pip is broken


Because I went to the official pip "getting started" docs and did exactly what it says? It's bad even in venv though, like not managing your requirements.txt or installing conflicting stuff. The web is full of questions about this, with answers that involve installing some other thing.


pip freeze

but yes, pip predates the paradigm of package managers managing your dependency file instead of you managing your dependency file and then invoking the package manager
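
i.e. the classic manual workflow looks something like (requests as an example dependency):

  pip install requests
  pip freeze > requirements.txt       # you maintain this file yourself
  pip install -r requirements.txt     # later: reproduce the environment elsewhere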


As another commenter alluded: It's clear your knowledge of the topic at hand is poor.

It's wise to keep silent in such cases.


Oh yeah, pip is actually right, and everyone is using it wrong


The rest of the thread makes it clear that you expect a "package manager" to do more things than Pip does.

Pip is focused on installing the packages, not really "managing" them. The problem is that there are quite a few different takes on what "management" should entail. That's where all the alternatives are coming from.


Whichever you call it, at the end of the day it's more common to encounter dep problems in Python than with NodeJS, Rust, etc.

But also, ignoring things pip isn't meant to do, like version management, and just focusing on installing: pip's default of installing to some unclear system location was always confusing, particularly when Py2 vs 3 was a thing.
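
At least you can ask the interpreter directly where things will land, e.g.:

  python3 -m site          # prints sys.path and the site-packages locations
  python3 -m pip --version # also shows which interpreter this pip belongs to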


I don't get the python hate on here.

Pip is fine. It's been fine for at least the last 5 to 10 years.


I don't hate Python, I use it by choice every day. This is merely a downside.


Poetry has been decent for years. uv is new but great and will likely continue to be.


uv is soooo much faster than Poetry, especially for dependency resolution.


I would like to try uv, but I don't find Poetry that bad for dependency resolution times. If I have to wait 2 minutes once a week it's not the end of the world. Maybe if you're constantly installing new things that are tough to resolve it's an issue.


It's not the end of the world, but it's annoying. I used to work at a place that had a huge monorepo with tons of dependencies. Poetry would sit there grinding away for 4 or 5 minutes. With smaller projects, I agree it's not much of an issue. But, uv has other cool features and it's super snappy!


Not the end of the world but why wait 2 minutes when you can wait 2 seconds?


That's what I'm saying, not too long ago people were saying "just use poetry". Which, like uv, has its own lockfile format.


Did conda ever get faster?


I'm not arguing against venv to isolate installs; I'm saying that relying on the Python that comes with UNIX-like OSes is close to impossible, at least for web-related projects. E.g. Ansible does a lot [in terms of what code it generates] to keep itself compatible with whatever Python 3 versions it may find on systems (remote hosts).


I have a silly theory that I only half joke about that docker/containers wouldn't've ever taken off as fast as it did if it didn't solve the horrible python dependency hell so well. You know something is bad when fancy chrooting is the only ergonomic way of shipping something that works.

My first taste of Python was as a sysadmin, back in 2012 or so, installing a service written in Python on a server. The dependency hell, the stupid venv commands, all this absolute pain just to get a goddamn webserver running, good lord. It turned me off of Python for over a decade. Almost any time I saw it I just turned and walked away, not interested, no thanks. The times I didn't, I walked right back into that pile of bullshit and remembered why I normally avoided it. The way `brew` handles it on macOS is also immensely frustrating, breaking basic pip install commands, installing libraries as commands but in ways that make them not available to other python scripts, what a goddamn disaster.

And no, I really have no clue what I'm talking about, because as someone starting out this has been so utterly stupid and bewildering that I just move on to more productive, pleasant work with a mental note of "maybe when Python gets their shit together I'll revisit it".

However, uv has, at least for my beginner and cynical eyes, swept away most of the bullshit for me. At least superficially, in the little toy projects I am starting to do in Python (precisely because it's such a nicer experience), it sweeps away most of the horrid bullshit. `uv init`, `uv add`, `uv run`. And it just works*.
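
For the uninitiated, that whole flow is roughly this (package name just an example):

  uv init demo && cd demo
  uv add requests
  uv run main.py   # main.py is the stub uv init creates (exact name can vary by uv version)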


> I have a silly theory that I only half joke about that docker/containers wouldn't've ever taken off as fast as it did if it didn't solve the horrible python dependency hell so well.

I don't think this is a silly theory at all. The only possibly silly part is that containers specifically helped solve this problem just for python. Lots of other software systems built with other languages have "dependency hell."


Back in the early days of Red Hat, RPMs didn't really have good dependency management. Yes there were RPMs, yes you could download them, but getting the full dep tree was a PITA. Most people installed the full Linux distro rather than a lightweight version because of this.

Debian's apt-get was very "apt" at the time when it came out. It solved the entire issue for Debian. There was a point at which there was an apt-rpm for Red Hat. Yum tried to solve it for Red Hat, but didn't really work that well -- particularly if you needed to pin packages to certain versions.


I won't touch Python either, but because I've been burned debugging large Python programs. Something that would have taken a minute in a statically typed language took hours of tracing data through the program to understand what was supposed to be in a dict. There are alternative languages that are pithy, statically typed, can write programs quickly, and can grow into large code bases that are maintainable; so there is never a reason to start a new project with Python today.


I've seen the same thing in .NET and Java where there's 800 layers of interface and impl and it's an adventure trying to find the actual business logic in all the layers of indirection

>so there is never a reason to start a new project with Python today

Nothing else has an ML/data ecosystem that compares. Perl/Go are maybe a distant 2nd


I deal with that all the time cause an adjacent team uses Java with tons of boilerplate and frameworks. At that point, your static typing isn't so static. Takes them forever to make changes, to the point where I started taking over responsibilities using Python code.


Most python written at a large scale uses types (TypedDict) and/or Pydantic for safety and never plain dict objects. That's a code smell in any language, we can stuff data into `map[string]interface{}` all day long and cause problems downstream.


There's also dataclass in stdlib


It's fine if the author doesn't choose to do horrendous things. There are types too if you want them, but I don't.


> However, uv has, at least for my beginner and cynical eyes, swept away most of the bullshit for me.

uv is _much_ better than what came before. As someone who has had only glancing contact with Python throughout my career (terrible experiences at jobs with conda and pip), uv feels like Python trying to actually join the 21st century of package management. It's telling that it's in Rust and clearly takes inspiration from Cargo, which itself took inspiration from Ruby and Bundler.


The funniest thing for me is that you can use uv in mise to install Python cli programs in a surprisingly elegant manner.


Just curious: how?


You can do it with various mise commands, but the simplest way is to use a mise.toml that looks something like this

  [tools]
  python = "latest"
  uv = "latest"
  "pipx:yt-dlp" = "latest"

  [settings]

  [settings.pipx]
  uvx = true

  [settings.python]
  uv_venv_auto = true

These are all features of the [pipx backend] for mise, and the docs discuss what to do in the case of things like python updates, etc. The advantage of doing it this way, particularly for a global mise config, is that you treat these python tools as basically any other mise tool, so their versioning is easy to control.

I know mise isn't really a package manager. But with its support for things like this, be it for python, ruby, npm, or cargo, as well as more universal support from things like Ubi and the upcoming github backends, it's rapidly becoming my favorite package manager. I have a number of projects that aren't JS-based in any way, shape, or form but use particularly useful Node-based tools, like markdown-lint or prettier. Having to carry around a package.json felt weird, and with the way mise handles all of it, now I don't have to.

[pipx backend]: https://mise.jdx.dev/dev-tools/backends/pipx.html


They may be referring to uvx: https://docs.astral.sh/uv/guides/tools/

Or it could be something else, not sure.


In mise.toml I have:

  [tools]
  python = "3.12.11"
  ruff = "latest"

I get ruff installed and anything else needed without any fuss.


`uv tool install` I believe
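
e.g. (tool name just an example):

  uv tool install ruff   # installs the CLI into its own isolated venv and puts it on PATH
  uvx ruff --version     # or run it one-off without installing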


> You know something is bad when fancy chrooting is the only ergonomic way of shipping something that works.

Every language seems to have this problem. Or else how can we explain the proliferation of AppImage, Flatpak, Snap, ... ?


Yeah, if you look at a big Python vs Node vs Rust repo side by side, you'll notice Python is often the only one with a Dockerfile.


Basically the static vs. dynamic linking question from the C world. A trade-off as old as programming.


You should always use virtual envs. They're a single directory, how are they an abyss? Pip now complains if you try to install a package system wide.


For shadow IT, anything that requires "installation" vs. being in the base system or being added as a file to a project is an inconvenience.

It's why I like using Bottle for small Python frontends: download the file and import.

(I'm ranting based on personal experiences with IT in the past. Yes in general virtualenv is the baseline)


Compiling Python to an executable is always an option.


If you're dealing with a managed system, chances are, compilers are banned. You'll have to be unsafe and work outside the constrained environment - potentially violating policies, contracts, regulations and laws.


> You should always use virtual envs.

If you're not using dependencies, and are writing 3.x code, there's very little justification.


Not using dependencies is rather niche. Most people come to Python because they want to use pre-existing libraries.


I don't have to deal with this in JS or I think in other stuff like Golang. I give someone a package.json with versions of everything. npm install always sets deps up locally, but doesn't need to copy the entire NodeJS runtime.


… node_modules is your venv.

If we use uv from TFA, the commands are nearly 1:1:

  npm install     <=> uv sync
  npm install foo <=> uv add foo
> It doesn't have to also store a local copy of NodeJS

… which Node devs do have a thing for, too, called nvm, written in bash.


The difference is it's default, it always works the same everywhere, it actually writes down the deps, and I don't have to manually switch between them (or set up fancy bashrc triggers) like venvs.


> it's default

This point is true; the ecosystem simply can't change overnight. uv is getting there, I hope.

> it always works the same everywhere

`uv` works the same, everywhere?

> it actually writes down the deps

`uv add` does that, too.

> I don't have to manually switch between them (or set up fancy bashrc triggers) like venvs.

You don't have to do that with `uv`, either?


Hope they make uv default then. It's nice, but I have to separately create a project with it, and regular python commands don't work with it, both of which go back to it not being default. But even that won't fix all the old projects.


npm is not the default in js projects either, it's just the currently most popular manager. Old JS projects are their own bundle of fun.


It basically is by now. This is more of a recent thing for frontend JS, but NodeJS (which is more directly comparable to Python) had npm I think from the start.

Those browser JS libs installed via <script> tags though, honestly were pretty convenient in a way.


Yes, please use virtual envs or containers. I know it seems overly heavy and hard to manage, but you don't want to end up in a situation where you're afraid to update the system because the library upgrades might break some of your Python code.


+1 - I've been shocked at how many little portability issues I've had shipping software, even within the relatively constrained environment of fellow employees.

Minor differences between distro versions can make a big difference, and not everyone that uses a Python script knows how to use something like pyenv to manage different versions.


Also in some older cases the pre-included Python is v2 and the system relies on it, which is more of a liability.


this is solved by uv


> things like virtual envs

I consider my point still valid with uv - what did you want to express?

On uv specifically - something like 'asdf' compiles Python right on your system from the official sources, which means using your SSL libs, for example. uv brings its own Python binary - I feel worried about this.


uv works just as well with whatever Python you want to bring -- you're not required to use the Pythons that uv is capable of installing on your machine.
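
For example (the interpreter path here is whatever you already have on the system):

  # build the project venv on top of an existing interpreter instead of a uv-managed one
  uv venv --python /usr/bin/python3.12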


uv really is working super super super hard to absolve decades of sins and blood in the python world, and doing a good job at redemption.


Just use uv


I agree - Python without uv is masochistic. But that really negates the "Python is already available" advantage. If you have to install uv you can just as easily install Rust or Go or Deno.


More like a snake pit than a can of worms.


I agree that the built-in Python is typically not suitable for development, especially if you're planning to distribute your code and/or care about the versions of its dependencies. (And especially on Debian and its derivatives, in my experience; they even remove parts of the standard library, like Tkinter [0].)

I disagree that virtual environments represent an "abyss". It takes very little effort to learn how they work [1], plus there are a variety of tools that will wrap the process in various opinionated ways [2]. The environment itself is a very simple concept and requires very few moving parts; the default implementation includes some conveniences that are simply not necessary.

In particular, you don't actually need to "activate" a virtual environment; in 99% of cases you can just run Python by specifying the path to the environment's Python explicitly, and in the exceptional cases where the code is depending on environment variables being set (e.g. because it does something like `subprocess.call(['python', 'foo.py'])` to run more code in a new process, instead of checking `sys.executable` like it's supposed to, or because it explicitly checks `VIRTUAL_ENV` because it has a reason to care about activation) then you can set those environment variables yourself.
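
Concretely, instead of activating you can just address the environment's interpreter directly (`.venv` being wherever you created it):

  .venv/bin/python foo.py
  .venv/bin/pip list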

Creating a virtual environment is actually very fast. The built-in `venv` standard library module does it faster in my testing than the equivalent `uv` command. The slow part is bootstrapping Pip from its own wheel - but you don't need to do this [2]. You just have to tell `venv` not to, using `--without-pip`, and then you can use a separate copy of Pip cross-environment using its `--python` option (supported for almost the last 3 years now; it's a hack, but it works if you don't have to maintain EOL versions of anything). If you need heavy-duty support, there's also the third-party `virtualenv` [3].
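
A sketch of that combination (assumes a Pip recent enough to have the `--python` option; package name just an example):

  # create the venv without bootstrapping its own pip
  python3 -m venv --without-pip .venv
  # then drive it with an existing pip installed elsewhere
  pip --python .venv/bin/python install requests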

Much of the same tooling that manages virtual environments for you — in particular, pipx and uv, and in the hopefully near future, PAPER [4] — also does one-off script runs in a temporary virtual environment, installing dependencies described in the script itself following a new ecosystem standard [5]. Uv's caching system (and of course I am following suit) makes it very fast to re-create virtual environments with common dependencies: it has caches of unpacked wheel contents, so almost all of the work is just hard-linking file trees into the new environment.

[0]: https://stackoverflow.com/questions/76105218

[1]: https://chriswarrick.com/blog/2018/09/04/python-virtual-envi...

[2]: https://zahlman.github.io/posts/2025/01/07/python-packaging-...

[3]: https://virtualenv.pypa.io/

[4]: https://github.com/zahlman/paper

[5]: https://peps.python.org/pep-0723


Many of the people I coached after they asked for help would have been absolutely flabbergasted by these suggestions.

Activating a venv was at least something they could relate to.


No and no. I don't know how you even get to this level of "making it harder for yourself".

Say you want to use a specific version of python that is not available on Ubuntu.

1. Install build dependencies https://devguide.python.org/getting-started/setup-building/#...

2. Download whichever Python source version you want, https://www.python.org/downloads/source/. Extract it with tar

3. run ./configure --enable-optimizations --with-lto

4. run make -s -j [num cores]

5. sudo make altinstall

This will install that specific version without overwriting default system python.

You can then bash alias pip to python3.xx -m pip to make sure it runs the correct one.
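
e.g. (3.13 standing in for whichever version you altinstalled):

  alias pip='python3.13 -m pip'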

All the libraries and any pip-installed executables will be installed locally to the ~/.local folder under the specific python version.

Alternatively, if you work with other tools like node and want to manage different versions, you can use asdf, as it gives you per folder version selection.

Virtual environments are really only useful for production code, where you want to test with specific versions and lock those down.


Virtual environments are useful for isolating dependencies for different projects, not just isolating the interpreter.

(I mean, except on Windows, your venvs default to symlinking the interpreter and other shared bits, so you aren't really isolating the interpreter at all, just the dependencies.)


AFAIK, it's the same even on POSIX, when you create a venv, "./bin/python" are symlinks and the pyvenv.cfg has a hardcoded absolute path of the current python interpreter the venv module was using at the time of creation. It really doesn't isolate the interpreter.

(also one of the reasons why, if you're invoking venv manually, you absolutely need to invoke it from the correct python as a module (`python3.13 -m venv`) to make sure you're actually picking the "correct python" for the venv)


I meant that the symlink behavior is default except on Windows (though it may just be on POSIX platforms and I think of it as “except Windows” because I don't personally encounter Python on non-Windows non-POSIX platforms.)


> making it harder for yourself

Looking at just the first link, it looks way more complicated than venv. And I'm a C++ developer; imagine someone who's less experienced, or who just isn't familiar with C toolchains.


It’s really not hard; the person you’re replying to put a lot of details in. I’ve lost track of how many times I’ve built a Python interpreter. Virtual envs used to work badly with some tools and dependencies, so I had to do it a lot. It’s gotten better and now I only compile Python to get a version that’s not provided by my Linux distribution.


The first link is a sudo apt install command that you copy-paste into the terminal. In what world is that more complicated than venv?

Here it is for clarity

sudo apt-get install build-essential gdb lcov pkg-config \ libbz2-dev libffi-dev libgdbm-dev libgdbm-compat-dev liblzma-dev \ libncurses5-dev libreadline6-dev libsqlite3-dev libssl-dev \ lzma lzma-dev tk-dev uuid-dev zlib1g-dev libmpdec-dev libzstd-dev


Even what you just pasted is wrong. And it would give a confusing error if you used it. Try "sudo apt install vim \ make" to see.

It's the kinda thing an experienced engineer wouldn't have that much trouble with, but you should be able to recognize how much experiential knowledge is required to compile a complex C code base and what kinda dumb stuff can go wrong.

You probably don't need to do much of the stuff on that page to build, but "What is dnf?", "Is the pre-commit hook important?", "Do I need cpython?", "What's an ABI dump?" are questions many people will be wrestling with while reading.


> The first link is a sudo apt install command that you copy-paste into the terminal. In what world is that more complicated than venv?

venvs also aren't complicated.

I have built Python from source before, many times. I do it to test Python version compatibility for my code, investigate performance characteristics etc.

Re-building the same version of Python, simply in order to support a separate project using the same version with different dependencies, is a huge waste of time and disk space (hundreds of MB per installation, plus the mostly-shared dev dependencies). Just make the virtual environment. They are not hard to understand. People who want a tool to do that understanding for them are welcome to waste a smaller amount of disk space (~35MB) for uv. A full installation of pip weighs 10-15MB and may take a few seconds; you normally only need one copy of it, but it does take some work to avoid making extra copies.


Personal preference, but I prefer to use 'mise' instead of 'asdf' these days: https://mise.jdx.dev/



