
Python as a language is nice. Python's version and package management is nothing short of a nightmare.


This hasn't really been true for a while now. `uv` has radically improved the experience.
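If you haven't looked recently, the day-to-day workflow is roughly the sketch below (project and dependency names are just placeholders):

   uv init demo-app    # scaffold a pyproject.toml-based project
   cd demo-app
   uv add requests     # add a dependency and write/update uv.lock
   uv run python -V    # commands run inside the project's managed venv

It also manages interpreter versions itself, which takes care of a lot of the "which Python am I even on" confusion.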


For the last 20 years that has been the mantra. Some X "solves" all the problems.

Except it doesn't. It just creates another X that is popular for a while, and doesn't somehow retroactively "fix" all the chaotic projects that are a nightmare to install and upgrade. Yes, I understand people like Python. Yes, I understand the LLM bros love it. But in a real production environment, for real applications, you still want to avoid it because it isn't particularly easy to create robust systems for industrial use. You may survive if you can contain the madness in a datacenter somewhere and have people babysit it.


I think that's true until it isn't, and people are really rallying around uv.

Here's hoping it manages to actually solve the Python packaging issue (and lots of people are saying it already has for their use cases)!


Solving it involves, at a minimum, the Python maintainers making a choice, integrating it into the Python binary, and sticking to it. But that requires possibly annoying some people until a) whatever solution becomes mature and b) people get over it.


I've ignored the trends and just used the bog-standard requirements.txt with pip and a virtualenv, and have had no problems in the past 10+ years. Either way, you always want something that catches a broken production deploy, whatever tooling you use.
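For reference, that bog-standard workflow is just this (paths are the usual conventions, nothing project-specific):

   python -m venv .venv
   . .venv/bin/activate
   pip install -r requirements.txt

It's not fancy, but it's been the same few lines for over a decade, which is its own kind of robustness.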


"production deploys" sounds like something that is in a datacenter.


>Except it doesn't.

That is only true if you never reexamine the universality of your statement. I promise that it is possible to "solve" the mess that was Python's ecosystem, that uv has largely done so, and that your preconceptions are holding you back from taking advantage of it.


Here's the thing: there has never been a lack of people who have declared this problem solved in the 20 or so years since Python started poking its way into my professional life. (And for the 12-13 years before that I could gladly ignore it, since nobody did much of anything in it.) People have said this since the days of the Blackberry.

Multiple times people have explained why they think whatever they are madly in love with now is the definitive solution. And none of those times, over those couple of decades, did it turn out to be true.

I understand that you are enthusiastic about things. I get it. But perhaps you might understand that some people actually need to see things stick before they declare a winner? I'm not big on wishful thinking.


I would take a look at uv adoption. That's what makes it different. It nails everything that all the other tools have done, and it does it fast. So it's what people have been using for a while now. Even Poetry never seemed to get support this ubiquitous.


I'm not saying uv isn't catching on. I'm saying most Python software still doesn't use it and for meaningfully complete adoption to happen, a solution has to be the default solution, preferably included in the standard distributions of the language.


Some of the biggest codebases in the world are in Python; this is a bizarre statement that reeks of the HN superiority complex.


Every single language enthusiast says that some of the biggest codebases in the world are whatever their favorite major language is. And here's the thing: it is completely irrelevant whether the codebase is small or large. What counts is what it is like to use and maintain programs.

Python isn't the only language that has poor tooling. C/C++ is even bigger than Python in terms of established code base, and its tooling is nothing short of atrocious.

What helps is people realizing where tooling and production readiness should be. They can learn a lot from Rust and Go.

The "it's big, therefore it must be right" argument is nonsense. Worse yet: it is nonsense that excuses a lack of real improvement.


> But in a real production environment, for real applications, you still want to avoid it because it isn't particularly easy to create robust systems for industrial use.

This is silly and seems to discount the massive Python codebases found in "real production environment"s throughout the tech industry and beyond, some of which are singlehandedly the codebases behind $1B+ ventures and, I'd wager, many of which are "robust" and fit for "industrial use" without babysitting just because they're Python.

(I get not liking a given language or its ecosystem, but I suspect I could rewrite the same reply for just about any of the top 10-ish most commonly used languages today.)


I can get that Python's not for everyone; it certainly has its flaws, and maybe uv is just another transient solution that will come and go as others have. I might disagree, but I can accept that. What I can't accept is the idea that it should be avoided for real production environments, which is frankly a bit ridiculous considering all the real applications and real production environments running on Python.


There are still 20 years of projects using everything that came before uv. They didn't upgrade the moment uv came into existence. Data-science-land still uses other rubbish too.


> They didn’t upgrade the moment uv came into existence.

There's also projects that can't use `uv` because it doesn't like their current `requirements.txt`[0] and I have no bandwidth to try and figure out how to work around it.

[0] We have an install from `git+https` in there and it objects strongly for some reason. Internet searches have not revealed anything helpful.


Unrelated to uv, but the problem with having a git ref in requirements.txt is that pip will treat it as a fixed version, so it strictly narrows the other dependencies it can resolve, which gets exceptionally difficult to reason about once that package also loads a package from another git ref. Throwing everything into CodeArtifact (or the equivalent on other clouds) is better long term.
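For anyone who hasn't seen one, the kind of requirements.txt line under discussion looks roughly like this (package name, URL, and commit are made up):

   # direct reference pinned to one commit; pip treats it as the only acceptable version
   somepkg @ git+https://example.com/org/somepkg.git@abc1234

Because that pin is absolute, the resolver can't fall back to a different version of somepkg when other dependencies conflict with it.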


If you open even a brief issue and tag me @zanieb I'm happy to take a look!


Ta, will give that a go when I've got some free time.


Python's package management is OK. The main culprit is .so libraries.

Think JNI or cgo management.


Yes, I've never really understood the complaint about python packaging - building native code is not something that is ever easy to guarantee across multiple distributions and operating systems.

Those native packages can be in any language and require any odd combination of tools to build. Who has truly solved that problem?


If you don't need to link a C lib, you can build for any combination of arch and OS with a Go program. The default tooling lets you do so easily.

If you need to link a C lib, there are ways to set it up to cross-compile for other OSes (and maybe other archs).
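For the pure-Go case it really is just environment variables; the targets below are only examples, and the cgo line assumes a mingw cross-compiler is installed:

   GOOS=linux GOARCH=arm64 go build ./...     # Linux/ARM64 binary from any host
   GOOS=windows GOARCH=amd64 go build ./...   # same source, Windows binary
   CGO_ENABLED=1 CC=x86_64-w64-mingw32-gcc GOOS=windows GOARCH=amd64 go build ./...   # cgo needs a C cross-toolchain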


Take numpy as an example: it's gfortran mixed with C. How does cgo handle that?

And there's ffmpeg....


If you've got the object files and the compiler toolset, you can compile whatever you need. I have made a Go program that links ffmpeg and is cross-built from Linux to Windows and macOS. It was not super easy, but it's doable.


wheels, conda, docker, firecracker, unikernel, flatpak


Flatpak, Docker? I.e. include an almost complete distribution just to make one package work? But what if you want that package to work in the distribution you have now? What if you need two different packages that live in different Flatpaks or different Docker images?


Set them up, each in its own container, and use gRPC to communicate between them...

(Not even kidding, I've seen people do this)


Goodness gracious!


Well, if you're using gRPC anyway, and have the protocol buffers already... It's hard to resist the temptation after a few hours of installing broken package versions.

Incidentally, I suspect this is the spiritual origin of microservices...


Obviously don't try those first; I'm just saying there's a stack of options to work with for any level of isolation you desire.

Start with wheels if you just want pre-built binaries, move up to conda if you need arch- and OS-specific C libs that depend on system packages, Flatpak or Docker if you don't want those to mess up your host, or unikernel/Firecracker/VMs if you need kernel modules or hardware virtualization.
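As a sketch of the conda step in that progression, an environment file can pull the C- and Fortran-backed pieces from conda-forge instead of building them locally (names and versions are only illustrative):

   # environment.yml, used with: conda env create -f environment.yml
   name: sci
   channels:
     - conda-forge
   dependencies:
     - python=3.12
     - numpy
     - scipy
     - ffmpeg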


I'll throw in BLAS and LAPACK. Fuck, what a nightmare it always has been to get scipy running. I've always ended up with a wild mix of conda and pip installed shit that just wouldn't work.


Really? `pip install scipy` in a new environment just works for me. What concrete issues are you encountering?


It is not okay (maybe uv fixes this?)


Pure .py libs can be "vendorized" directly into the project.
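A minimal sketch of what that looks like, assuming the library has been copied into a _vendor/ directory inside the project (directory and package names are made up):

   # early in the project's entry point
   import sys
   from pathlib import Path

   # put the vendored copies ahead of site-packages so they win at import time
   sys.path.insert(0, str(Path(__file__).parent / "_vendor"))

   import tinylib  # hypothetical pure-Python package copied into _vendor/

This only works cleanly for pure-Python dependencies; anything with compiled extensions is back to the .so problem mentioned above.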


I expect that for every two Python users who know what `pip` is, there are three who are discouraged from thinking about the environment at all. Given that, focusing on the language and not the packaging is the right choice. Packaging is most often somebody else's problem.

It's just that HN users are more likely to be that somebody else. We probably have to deal with non-Python dependencies anyway, so we're reaching for bigger hammers like Docker or Nix. It would be nice if there weren't a mess of Python package managers, but whichever one I use ends up being a somewhat unimportant middle layer anyway.


Going to be honest: With uv it really isn't that bad anymore. Sure, the whole packaging ecosystem is still really bad, but uv abstracts over most of it. Unless you're doing some really esoteric stuff you'll be just fine.


Using a nix devshell has worked out well for scripts for me so far. I haven't figured out what that workflow is like for larger projects, though. I'm not interested in learning uv.


As a nix user, you hardly need to know uv, but you might later need to know how to work together with non-nix-users who use uv. It's easy.

There will be a file, uv.lock. You can use uv2nix to get a single nix package for all the project's Python dependencies, which you can then add to your devshell like anything else you find in nixpkgs (e.g. right alongside uv itself). It ends up being two or three lines of code to actually get the package, but you can just point an LLM at the uv2nix docs and at your flake.nix and it'll figure them out for you.

Your devshell will then track with changes that other devs make to that project's dependencies. If you want to modify them...

   edit pyproject.toml # package names and versions
   uv lock             # map names and versions to hashes (nix needs hashes, finds them in uv.lock)
   nix flake lock      # update flake.lock based on uv.lock
   nix develop         # now your devshell has it too
This way you're not maintaining separate sources of truth for what the project depends on, and the muggles need not think about your nix wizardry at all.


Didn't I say I wasn't interested?


You created a place where the interaction between nix devshells and uv was relevant, I put something there to help passers-by that might be interested in those topics. It wasn't actually for you.


There's nothing to learn. Get Claude Code to install uv into your devshell and cut over your requirements. 5 minutes.


AI? :vomit_emoji:


Not being comfortable with AI isn't a valid reason for being intimidated by new tech. At this point any developer should be able to write a simple command-line tool in a language that's brand new to them in a couple of hours. Simply using a new tool for the first time is trivial. Don't get left behind.


I'd rather work in a different field than be forced to use a LLM or vibe code all day.


Do you feel that way about IDEs also? Or compilers?


Or do it yourself, 10 minutes


It's like the (usually) interpreted equivalent to C/C++. There are lots of 'standard' package management choices.

And it seems like the package resolution is finally local by default, although that requires a 'virtualenv', which seems to be a legacy of the global packaging system.



