
Why does Debian do that instead of upgrading everyone to the latest git version?


Debian 'stable' releases mainly target people who like fixed upgrade cycles: no feature/API/etc. changes for the lifetime of a release, only security patches. That means you can have your servers apply updates routinely, with relatively little worry that something will stop working because of a change. Even minor feature changes can break things, especially for software run from scripts, so Debian prefers not to risk non-security updates, even small ones. Then you upgrade to the next release at a scheduled time, when you can debug any problems that come up. I believe RHEL releases work somewhat similarly.
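For example, a stable server can pull in just the security suite automatically via the unattended-upgrades package. A minimal sketch (the origin pattern assumes a reasonably recent stable release; check your release's shipped default config):

  # apt-get install unattended-upgrades
  #
  # /etc/apt/apt.conf.d/50unattended-upgrades (excerpt):
  Unattended-Upgrade::Origins-Pattern {
          "origin=Debian,codename=${distro_codename}-security";
  };
  #
  # /etc/apt/apt.conf.d/20auto-upgrades:
  APT::Periodic::Update-Package-Lists "1";
  APT::Periodic::Unattended-Upgrade "1";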

For my developer machine I prefer rolling updates, so I run Debian 'testing' on that one, which is basically a snapshot of what is going to be the next stable release, with daily updates (there are also distros like Arch that only do rolling release).

There's also a kind of in-between option: http://backports.debian.org/
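Backports let you stay on stable but opt in to newer versions of specific packages. A sketch (substitute your stable release's codename for 'bookworm'; 'somepackage' is a placeholder):

  # /etc/apt/sources.list.d/backports.list
  deb http://deb.debian.org/debian bookworm-backports main

  # backports are never installed by default; request them per package:
  apt-get -t bookworm-backports install somepackage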


"testing" is probably the worst choice. It's auto-generated from unstable with a 10 (iirc) day delay if no critical bugs are open. However, if a critical bug is fixed in unstable and another one found that affects both unstable and testing, then the release fixing the first bug won't migrate to testing, so you'll have an even more broken version for a longer time. Testing also receives no security updates. Unstable doesn't get security updates either (in the sense that the security team doesn't provide updates to unstable), but it usually gets fixed versions pretty quickly, and certainly before testing.

"unstable" with apt-listbugs installed works quite well for me. Sometimes I have to boot grml to roll back packages (check out the grml-rescueboot package), but that's very rare.


What if I want to decide for each individual package if it should get

(a) upgraded to the latest version

(b) upgraded to the latest version that the developer considers API compatible

(c) kept frozen

Is there a way to do that with the Debian package management system?


If you're on Debian and want to try running testing/unstable, you probably want to install "apt-listbugs" -- which will warn you about open bugs in software you're installing/updating. Another option is to run Debian stable, and keep a testing and/or unstable chroot, managed via schroot: https://wiki.debian.org/Schroot (that page and the manpages could use an update, but a schroot backed by LVM and set up to mount /home works well enough for running X11 apps (the MIT auth cookie is in $HOME)).
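A minimal sketch of such a setup (the chroot name, path, mirror, and user are illustrative; I used a plain directory chroot here rather than the LVM-backed kind):

  # bootstrap an unstable chroot
  debootstrap unstable /srv/chroot/unstable http://deb.debian.org/debian

  # /etc/schroot/chroot.d/unstable.conf
  [unstable]
  description=Debian unstable
  type=directory
  directory=/srv/chroot/unstable
  users=youruser
  profile=desktop   # profile geared toward GUI apps; its fstab handles /home

  # then enter it with:
  schroot -c unstable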

I would recommend trying to keep things separate like that (or via something like docker/kvm/vagrant/etc) -- rather than trying to mix'n'match. You'll likely be the only one trying a particular combination of versions, and it's unlikely to be much fun.


Generally, if you want to use only Debian packages, the answer is "stop wanting that".

But you can get a little closer if you run Debian stable and also include apt sources for testing and/or unstable, and do some clever things in /etc/apt/preferences to pin package priorities. It can get messy fast, though.
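A sketch of what that looks like (priorities are illustrative; see apt_preferences(5) for the details):

  # /etc/apt/sources.list: both suites available
  deb http://deb.debian.org/debian stable main
  deb http://deb.debian.org/debian unstable main

  # /etc/apt/preferences
  Package: *
  Pin: release a=stable
  Pin-Priority: 700

  Package: *
  Pin: release a=unstable
  Pin-Priority: 100

With priorities like these, everything follows stable by default, and you pull individual packages from unstable with apt-get -t unstable install, or the pkg/unstable syntax.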


You can override per-package, yes, if you really want to, and you can pin versions for (c). As others have said, in practice a newer version of one package often depends on a newer version of another (the system will generally stop you rather than blindly install incompatible things), and obviously you'll miss out on the testing of an integrated whole that's the selling point of Debian stable, but you can do it.
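For (c) specifically, holds are the simple mechanism ('somepackage' is a placeholder):

  apt-mark hold somepackage     # freeze at the currently installed version
  apt-mark showhold             # list held packages
  apt-mark unhold somepackage   # resume normal upgrades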


Not completely, but there are ways to achieve this partially. You can add multiple sources (say, stable and unstable) and set the priorities so that, by default, stable is preferred. Then you can choose to install certain packages from unstable. However, since these frequently depend on newer libraries (e.g. libc6), it's hard to pull this off without upgrading most of the fundamental libraries to their unstable versions. At that point, you might just as well run unstable.
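You can see how far the pull goes before committing by simulating the install (standard apt options; the package name is a placeholder):

  apt-get -s install somepackage/unstable   # -s: simulate only
  # the output shows which stable packages (often libc6 and friends)
  # would be dragged up to their unstable versions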


The problem with this concept of "stability" is that it depends on a particular software development philosophy that isn't shared by all projects.

It works well if developers make sure that bugs in older versions of their software are fixed even after the next version is released. I think that is the case for a lot of infrastructure-type software.

But if developers do not maintain old versions and fix bugs only by releasing a newer version of their software, then Debian's approach leads to stability only in the sense of a reliably buggy system.


Debian's system does not require developers to maintain older versions, only to clearly indicate security fixes. Debian patches its own older versions to include the security fixes.
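You can see those Debian-applied fixes in a package's version string and changelog (the commands are standard apt; the exact suffix convention varies by release, e.g. +deb12u1 on recent ones):

  apt-cache policy somepackage    # stable-update revisions carry a suffix like +deb12u1
  apt-get changelog somepackage   # lists the fixes Debian folded into its revision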


I understand that, but it means that Debian inevitably distributes buggy software in a supposedly 'stable' distribution.

And I don't mean buggy in the sense that all software is buggy. I mean buggier than the best compatible release version available.


> I understand that, but it means that Debian inevitably distributes buggy software in a supposedly 'stable' distribution.

Bugs are a matter of perspective. If alleged bugfixes actually make me modify my currently working setup, then I don't consider that much of a bugfix, just something that makes me do work for no real benefit:

http://stevelosh.com/blog/2012/04/volatile-software/

I like Debian stable. Two years is an entirely reasonable amount of time to be able to have most software in my OS immutable except for security fixes. For the tiny amount of software for which I may want the bleeding edge, there are language-specific "package" "managers" (lol npm) or I can just backport the software myself.
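The self-backport route is usually a couple of commands (a sketch, assuming deb-src lines for unstable are configured and build-essential is installed; note build-dep resolves against the stable version, so the newer version's build dependencies may differ):

  apt-get build-dep somepackage            # install build dependencies
  apt-get -b source somepackage/unstable   # fetch the newer source and build .debs
  dpkg -i somepackage_*.deb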


Many people are unhappy about living with old bugs or missing features for years, even if you are not. And as a result we get a proliferation of many different update mechanisms on the same system. Some of them interactive and unscriptable. Some less than secure. This is not an ideal situation by any stretch.

It's not a huge problem either as long as Linux is used almost exclusively by professionals and mostly on servers.


You know what people really hate? Change. Ask around how many people like it when Facebook changes its UI. Most don't. Sure, if there's a bug people hate living with the bug, but people really hate change even more.


I agree with you that this is a very strong sentiment. People don't want everything to change all the time underneath them, especially not the UI.

But freezing everything for years puts too many people in a situation where they just have to upgrade for one reason or another. It's not always their choice and it's rarely a desire for change that makes them do it.



