moomin's comments | Hacker News

I think, from an economic standpoint, there’s plenty of non-democratic countries that do this all the time and some of them do quite well. It’s more the end of an Experiment than the start of one.

Waiting for scientists to discover HUMAN.md


The article is kind of interesting: on the one hand, you’ve got a tool that can be used by ordinary citizens and political dissidents for legitimate reasons. On the other, the French police were mildly inconvenienced during their arrest of a small-time drug dealer.

Yes, really, that’s the argument.


It's about protecting kids/olds/fighting crime/drug dealer

"What would you like me to wrap the global surveillance in?"



I mean, this already isn’t permitted in the US yet somehow I’ve read her emails and his signal chats.


Send those doing that to jail. We already do for lower ranked individuals.


Friend of mine was gathering survey results for a kids programme in London. They take council estate kids to events and do childcare and a bunch of other stuff. When asked what they liked best, the kids kept talking about the food and how you could even have seconds. Meanwhile we’ve got food banks up and down the country struggling to keep up with demand. I know families with three kids to a room smaller than the one my youngest fills with books. I can assure you poverty is very real in London.


> I can assure you poverty is very real in London.

It seems to me this may be one part of the problem. De-industrialisation means people in developed nations have surged towards cities because that is where most jobs are. But lower paid jobs just don't pay enough to support a reasonable life for a poor family in London. It would be far easier to support a family on a minimum wage job in the NW or NE of the UK than in London. But there don't seem to be enough jobs to do that.

Additionally, and I say this admitting I am speaking from a position of relative ignorance, there are a huge number of non-UK-born immigrants living in state-subsidised housing in one of the most expensive cities in the world. I don't fully understand why this is, but maybe it is because people are placed close to other family, maybe because of jobs.

As an immigrant to the UK myself, I'm aware that I should be very sensitive to criticisms of the system, but it does feel weird to have more than 50% of social housing in the capital allocated to people not born in the country. Please take this comment with as much charity as you can, I fully admit I am not close to the reasons for this.


> it does feel weird to have more than 50% of social housing in the capital allocated to people not born in the country

It's worth adding the context that more than two thirds of that 50% have a British passport [0], and that around 40% of London's population is foreign-born [1]. This is more a natural product of circumstance than it is anything to do with preferring immigrants over British-born individuals.

[0] https://www.standard.co.uk/news/politics/fact-check-foreigne...

[1] https://en.wikipedia.org/wiki/Demographics_of_London


As I said, I don’t know the UK context. In the US context, I went to a pretty destitute public schooling system and we provided breakfast, lunch, and (to a limited subset) dinner - plus there is SNAP/EBT.

> I know families with three kids to a room smaller than the one my youngest fills with books.

Housing is much more of an issue for the very poor, at least in the US. But I don’t agree that it has gotten relatively worse on a large timescale.


I volunteer my time with Food Not Bombs. 20% of American children do not know where their next meal is coming from. Many are simultaneously overweight and malnourished, because the foodstuffs the US government subsidizes are calorically dense but nutritionally destitute.

Food banks, subsidized school meals, and SNAP/EBT prevent what would otherwise be children starving to death. As it stands though, the relief is insufficient. Many children from food insecure households have stunted growth and lifelong learning impairments from insufficient protein, calcium, etc.

Source: https://www.ers.usda.gov/topics/food-nutrition-assistance/fo...


You should consider touring Appalachia, especially eastern Kentucky. The poverty there will make you think you're in a third world country.

https://www.ohchr.org/en/press-releases/2018/06/contempt-poo...

I grew up in this area. My parents didn't have electricity until they were teenagers. (I'm 32 for reference.)


It might solve the problem, in as much as the problem is that not only can it be done, but it’s profitable to do so. This is why there’s no Rust problem (yet).


Honest to God not something I care about but: this is pretty much the nail in the coffin for “master”. I do know some people _did_ care about the name. Sometimes surprisingly senior people who never supported a tech upgrade want the name changed. In any event, it’s done, “main” won, it’s fine, let’s move on.


Maybe they resisted because it was a completely ridiculous waste of engineering resources all over the country and for absolutely no tangible reason other than white people trying to feel better about themselves.

I work in the field of film mastering (with countless product names with the word “master” in it) and luckily no one got the ridiculous idea in their head that we need to change this lingo.

Show me a single person who has a valid reason for me not calling my branch “master” or my bedroom “the master”. I honestly think this sort of ridiculing word policing is why we lost this last damned election. And if you’re somehow proud that you’ve renamed your git branches, you’re very likely a contributor to that lost election.


In Microsoft v. AT&T, decision 550 US 437 (2007), there was discussion about a golden disk, and the terminology changed to master disk during the course of the proceedings, because the disk wasn’t actually made of gold.

I remember that Justice Antonin Scalia objected: “I hope we can continue calling it the golden disk. It has a certain Scheherazade quality that really adds a lot of interest to this case.”

<https://www.supremecourt.gov/oral_arguments/argument_transcr...>


what, the golden ratio is not made of gold? have we been betrayed all this time?


The golden rule would deform pretty quickly and be useless for measurement.


Wait until you hear about showers


you mean, like baby showers? yeah, totally not what i expected...


This stuff reminds me of what my mother said about feminists trying to get people to spell women with a y. She didn't like it because it made feminism seem like something petty and frivolous.

If I put my tin foil hat on it feels like a psyops to make the left look like a bunch of morons.


Yeah, I mean, isn’t a masters degree even worse, then?


Now I just know why I couldn't get a job. It's because I got a bad and worse degree for two years


a masters degree is about mastery, not about being central/main/leading...


Masters degree is from “magister”, meaning director/chief/boss/leader


wikipedia says: The original meaning of the master's degree was thus that someone who had been admitted to the rank (degree) of master (i.e. teacher)

and

from Latin: magister, "teacher"

teaching requires mastery, not leadership, although the concepts are related because mastery is also good for leadership.

my armchair etymology suggests that master and mastery were closely related until it started to be applied to leadership as well.


Magister means teacher and the things I mentioned. See https://en.wiktionary.org/wiki/magister#Latin


That "waste" of resources was absolutely tiny. Took me just some minutes. And I didn't do it because of DEI, just because I think it's a better name.


Same for me, but kind of because of DEI. Basically, it offended some people, and even if I thought it was a little overblown, it took about 2 minutes to change the default name of future repos to be something else (which was at least as good, and perhaps better). It made some people happier at approximately zero cost to myself, so why not.


Making "some people" happier isn't zero cost if the people in question are intolerant lunatics with ideas corrosive to the social fabric. It's one reason why the pendulum is swinging fiercely in the other direction.


TIL "I'm uncomfortable calling it master-slave, can we do main-replica?" is the idea of an "intolerant lunatic" that is "corrosive to the social fabric".

Good Lord, just listen to yourself.

Red-lined districts still shape America to this day, and several red states have engaged in rampant racial districting to screw minority communities. You can't even pretend the history of slavery is in the past in America.


This is why I left California and am utterly done with you people.


I live over 1,000 miles from California, bud.


Yeah that’s how I feel about most progressive stuff - sure it might not bother me, but also changing doesn’t bother me either. It costs you so little to accommodate other people.


I used to feel that way up until about a year ago. At worst I would roll my eyes at the silliness and then move on, because this stuff rarely matters much one way or another.

But then the 2024 elections happened, along with a bunch of exit polls, voter interviews, and other data showing that a surprising (to me anyway) number of people hate this kind of virtue signalling to the point that it can sway their vote. It's very possible those swung votes have ushered in a host of harmful changes that I think do matter a great deal. So now I'm sick of this stuff, it's not only a waste of time it's actively harmful.


Sorry it was exit polls that convinced you not to care about other people so much?

Don’t fool yourself kiddo, you were always an asshole, you were just waiting for the right excuse, just like the rest of us.

The deal with progressive ideology is that it progresses. Fixing inequality, prejudice, and injustice are a lifelong project, because as fast as you address issues, bigots will create new things to be bullies about. You don’t get to just get off at some point and be like “oh okay things seem good enough now.”


Progressive ideology tends to treat moral progress as inevitable, while pursuing social transformation in ways that can undermine the very institutions and norms that make progress possible.

I don’t think it’s true that the culture war issues themselves were the cause of those swayed votes so much as there’s a propaganda machine running 24/7 stoking those resentments and using such cultural critique as fodder.

This works really well to whip people into an othering frenzy to distract them from voting for their own economic interests.


I’d like to see a study showing 1) people aware of this issue and 2) for whom it swung their vote to the right. That’d have to be, what, 10 complete idiots? “Well, I was going to vote for A, but some of B’s supporters asked if I would please be considerate, and that’s a bridge too far.”


Oh, sure, it was people asking for branch renaming.

It wasn't the multiple dedicated Neo-Nazi propaganda networks that call all minorities and immigrants and political opponents enemies of the state.

It wasn't the election of politicians that are actual, convicted rapists and felons who distract by pointing to those who can't fight back.


I have encountered at least two bugs due to the change in names.

Everything considered I invested an hour or more in total. I am pretty sure decades of engineering time and resources were invested over the years because some people didn't like a default globally used for decades.


I don't know, for me main makes more sense so I prefer main. The main branch isn't master of anything, but it's the main branch.


> The main branch isn't master of anything

It's master as in "master copy":

A "master copy" is an original version of a work from which other copies are made, serving as the definitive or controlling version


Yah, they are losing something with the name change that they don't even understand, because they apparently don't understand the intricacies of English. We would be better off changing it to "Gucci Mane"; then we could tag our branches off Gucci's hit singles. It only makes slightly less sense than switching to "main".


But git is a distributed VCS - there is no master copy. But there certainly is a main branch, which may have many copies on many machines.


That would make perfect sense if all branches had to be made from the default branch.

But they don't.

At `$CLIENT` we use `stable` as the default branch.

Use whatever works for you. Getting upset about a default that you can change is like getting upset about the default wallpaper of your OS.

And before you get all persnickety about that argument working both ways: the developers of git get to decide the defaults, and they did.

If you're so upset, fork it, revert the default branch name and maintain it yourself infinitely. That's definitely worth it just to keep a default branch name you like, right?


> If you're so upset, fork it, revert the default branch name and maintain it yourself infinitely. That's definitely worth it just to keep a default branch name you like, right?

No idea how you got that impression from my comment. It sounds like you're the one that's upset.

I don't care what you name your branches. I do think it's dumb to tell other people what (not) to name their branch though. But definitely not something I feel compelled to rearrange my life over.


Nobody is telling you what not to name your branches.

The people that wrote the software you're using for free, decided to change the default name.

That's it. Nobody has said you can't use whatever name you want.


> Nobody is telling you what not to name your branches.

> Nobody has said you can't use whatever name you want.

This is reductionist. The git people didn't pull this idea out of their butt. It came about because a lot of people were saying that we should not name our branches master.

I have no problem with what the git people did. Easy enough for them to change it, and it puts a dumb issue to bed (for them).

But I think it's fair for anyone to point out that the motivation was dumb, and to explain why it's dumb and how the word "master" is actually not an unreasonable choice in this context.

> Nobody has said you can't use whatever name you want.

Sure, until somebody makes the mistake of not renaming all of their old "master" branches and gets shamed by the word police over it.

Of course you're welcome to disagree.


> how the word "master" is actually not an unreasonable choice in this context.

It doesn't even make sense in this context though. The name just got copied from BitKeeper which had master and slave branches.

Git doesn't have that concept.

> Sure, until somebody makes the mistake of not renaming all of their old "master" branches and gets shamed by the word police over it.

How are you going to be shamed? I thought there's nothing wrong with it?


> How are you going to be shamed? I thought there's nothing wrong with it?

If you re-read my comments you will understand that I don't believe there's anything wrong with using the word "master" to name a branch. But other people do, which is why there was an uproar and the default name was ultimately changed to "main".

Not sure how you were able to misinterpret this.


So what's your point?

If you don't think there's anything wrong with it, why would you care if someone else says "hey change this".

Do the same thing you'd do if someone says "hey you should use mongodb it's web scale": tell them you disagree and won't be doing that.

If you don't think there's anything wrong with it, how can you be "shamed" into doing something you disagree with?


I am doing exactly as you say.

I made a comment saying I disagree with the word police and I think it's dumb to cast people as being insensitive for using a longstanding word that makes sense to many people in the context it's used in.


I actually worked in film audio engineering and Master is not the universally used term and hasn't been used uniformly throughout history. I have an analog Mackie mixer from the 2000s with "Main" as the name of the Main Bus that was designed before the whole debate took place.

As far as software goes, things are similar. The process of "Mastering" is an exception.

As far as git branches go, I am fine with main. It has two advantages over master aside from any cultural questions:

1. main is more self-explanatory for beginners who don't know how "master" was/is used in tech.

2. it is shorter. While two letters don't make a huge difference, that is still a subtle advantage.

Whether these two points alone are enough to justify the needed work (which is probably not a lot to be honest), IDK.


Makes sense when you release 3.0 and are basically allowed to introduce breaking changes.

In the tech field there are lots of people living on the very fringes of society, hidden away behind a keyboard.


Every time I push to master I get this song in my head:

  Master! Master!

Every time I push to main I have in my head:

  ...meh


Master of Puppets -- Metallica

To be fair, the song is about control and the abuse of power.


I heard that it's about drugs controlling you


you can call your branch whatever name you want. nobody cares, nobody is stopping you.


That was true before the 3.0 release. Why didn't the people offended by "master" just change the branch name? Because it was never about their own branch names. It was about everyone else's.


> Maybe they resisted because it was a completely ridiculous waste of engineering resources all over the country and for absolutely no tangible reason other than white people trying to feel better about themselves.

I think the resisting probably wasted more time than anything else.

We used the occasion to ensure that there was no hardcoded naming in our IaC, internal tooling and CI/CD. It was surprisingly easy, gave us a great excuse to do some much needed clean up and now everything can work with any branch used as the main one.

Was it extremely important? Probably not. Was it worth fighting against/having a strong opinion about? Probably not either.

Sometimes, it's easier to just go with the flow and try to turn things which seem meaningless into actual improvements. If it makes the people who think it's not meaningless feel better, well, even better. It surely didn't cost me much.


That "waste of resources" is completely made up; this changes nothing for any existing repo whatsoever. Any existing repo that updated did so completely voluntarily, no tool forced them to.

At most you could argue that you needed to run one additional command when pushing the initial commit during this transitional period where GitLab/GitHub had updated the name but Git itself has not. Therefore, now we're back to square one with less "waste" as you put it.


Not at all. This is just about defaults. People can still choose arbitrary branch names. People can still set the default branch name, as I have:

  [init]
      defaultBranch = master

I just think "master" is an awesome word. Master record. Mastering. It just sounds cool to me and I'm gonna keep using it.

I also think "main" is a stupid word that doesn't say much about anything. I even hate "main" functions.


Tip: the next time you need to name a function, don't use "a stupid word that doesn't say much about anything". That's not how you're supposed to name stuff in programming. :)


In that case, I think 'git' is probably the name we should have been looking to replace...


It's hard when compilers force you to use "main".

  int main(int argc, char **argv);

I once wrote a liblinux library for Linux software development with freestanding C. One of the things I did was replace the "main" function with a "liblinux_start" function.


The Ada compiler does not force you to use main.


That's just the default. Nobody stops you from specifying `gcc -Wl,--entry,foo`.


Standard C stops you. The C standard library is hardcoded to call the main function. Providing one's own ELF entry point also breaks libc initialization unless the exact same startfiles are used.

Freestanding C gets rid of the libc so that's not a problem.


Maybe you could also do some runtime linking trickery when dynamically linking libc?


The libc could easily add support for such a thing but I doubt they'd care enough to. They'd probably see it as adding complexity for zero gain.

I submitted an issue to the GCC bugzilla about renaming certain symbols generated by the compiler.

https://gcc.gnu.org/bugzilla/show_bug.cgi?id=113694

I don't know how to implement it and nobody else cared enough to.


It's one of those things I couldn't care less about what it's actually called as long as it's uniform everywhere.

I had a few frustrated evenings of debugging when Github changed the default to main and my local scripts expected "master".

All fixed now, but still an annoyance. Don't think about it much anymore.


Some of your scripts could possibly use `origin/HEAD` and reflect whatever origin thinks is the default branch. (Though obviously that assumes you always have an `origin` remote or something remote-like.) Including using the commit referenced by `origin/HEAD` to find the `origin/{branch-name}` that matches if you want a name to check locally.


PSA: You can run one git command and ignore this change and associated drama entirely. I don't care which you prefer, but let's not pretend like main "won" when sticking with master is as easy as:

    git config --global init.defaultBranch master


To be fair, ignoring the drama is just adapting to changes, which is crucial in this field. Our old repo defaulted to master, our new ones defaulted to main. No time was spent on bike shedding.


This has been the case on nearly every open source and proprietary project I have worked on.

Most people do not care and will stick with the old default on old projects and use the new default on new projects. Occasionally, it stokes conversations around possible third options that are more descriptive like stable or development, but the norm is to just go with the default.

Going out of your way to set the default to the old name really reeks to me of slacktivism. People probably think that they're taking a stand, but in actual fact others will just assume that your repository is older or that you have an old configuration.


I did this the first time git asked me, because getting asked every time would get annoying.

Glad it's going back to having a default and not asking.


git config --global init.defaultBranch default

Because Mercurial is using default as default, and if we have to resort to worse SCMs, let them behave like Mercurial anyway


main won


There’s a tool that used to be popular in the .NET community called GhostDoc, that did pretty much exactly what you’re describing: rewording the blindingly obvious. I loathed it. But in terms of filling a very specific and all-too-common niche of “My manager’s insisting we do this thing, but has allocated no time to it, and will never spend more than five minutes verifying that it is done” it was excellent. I feel like Google is just creating the next generation of that technology and it will be very effective at solving the same problem.


Arguably the answer is “When Barbara Liskov invented CLU”. It literally didn’t support inheritance, just implementation of interfaces, and here we have her explaining 15-odd years later why she was right the first time.

I used to do a talk about Liskov that included the joke “CLU didn’t support object inheritance. The reason for this is that Barbara Liskov was smarter than Bjarne Stroustrup.”


There is a reason C++ devs and only C++ devs have nightmares of diamond inheritance.

Oh, the damage that language has done to a generation, but at least it is largely past us now.


I haven't encountered diamond inheritance a single time in 10 years of writing/reading C++, so I definitely don't have nightmares about it. Maybe that was really a thing in the 90s or 2000s?


I have been programming professionally in c++ for 20 years. I remember once thinking "cool, I could use virtual inheritance here". I ended up not needing it.

MI is not an issue in c++, and if it were, the solution would be virtual inheritance.
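
For anyone who hasn't bumped into it, here's a minimal sketch of what that looks like; all class names are invented for illustration:

    #include <iostream>

    // The classic diamond, resolved with virtual inheritance so the
    // derived classes share a single base subobject.
    struct Device {
        int id = 0;
    };

    // "virtual" here means Printer and Scanner share one Device subobject.
    struct Printer : virtual Device {};
    struct Scanner : virtual Device {};

    // Without "virtual" above, Copier would contain two Device subobjects
    // and "copier.id" below would be ambiguous.
    struct Copier : Printer, Scanner {};

    int main() {
        Copier copier;
        copier.id = 42;  // unambiguous: only one Device exists in the object
        std::cout << copier.id << "\n";
        return 0;
    }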


Exactly. Unlike Java where every object inherits from Object, in C++ multiply inheriting from objects with a common base class is rare.

Some older C++ frameworks give all their objects a common base class. If that inheritance isn't virtual, developers may not be able to multiply inherit objects from that framework. That's fine, one can still inherit from classes outside the framework to "mix in" or add capabilities.

I've never understood the diamond pattern fear-mongering. It's just a rarely-encountered issue to keep in mind and handle appropriately.


> in C++ multiply inheriting from objects with a common base class is rare.

One example is COM (or COM-like frameworks) where every interface inherits from IUnknown. However, there is no diamond problem because COM interfaces are pure abstract base classes and the pure virtual methods in IUnknown are implemented only once in the actual concrete class.


Diamond inheritance is its own special kind of hell, but “protected virtual” members of java and c# are the “evil at scale” that’s still with us today. An easy pattern that leads to combinatorial explosion beyond the atoms in the universe. Trivially.

People need to look at a playing deck. 52 cards, and you get 8×10^67 possible orders of the deck. Don’t replicate this in code.


At least C# methods are not virtual by default like in Java.


Why do protected virtual methods lead to an explosion?


Protected = subclasses can call them.

Virtual = subclasses can override them.

So basically, any subclass can call the method, and that method may be overridden in any other subclass.
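
For illustration, a tiny sketch of that mechanism in C++ terms (the thread is about Java/C#, but the same construct exists here; all names are invented):

    #include <iostream>

    // "protected virtual": any subclass may both call and override the method.
    class Widget {
    protected:
        virtual void render() { std::cout << "Widget::render\n"; }
    };

    class Button : public Widget {
    public:
        // one subclass calling the protected member it inherited
        void draw() { render(); }
    };

    class IconButton : public Button {
    protected:
        // another subclass overriding it; Button::draw now ends up here
        void render() override { std::cout << "IconButton::render\n"; }
    };

    int main() {
        IconButton b;
        b.draw();  // prints "IconButton::render"
        return 0;
    }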


What is the issue with those overrides? They only affect that one path in the hierarchy of inheritance, no? Not a C++ user here, but I imagine it would be catastrophic if an unrelated (not on the path to the root superclass) class could override a method and affect unrelated classes/objects.


> They only affect that one path in the hierarchy of inheritance, no?

Not necessarily. If you create a diamond (or a spiderweb :) inheritance pattern, the number of places the method can be called and overridden grows fast.


Every language that permits diamond inheritance causes the devs who dare to use this feature at least some nightmares. It's not a C++ issue.


It's also cultural, possibly. Python supports diamond inheritance, and clearly states how it handles it (it ends up virtual in C++ terms). But in like 20 years of working with Python I can't remember encountering diamond inheritance in the wild once.


Django documentation explicitly recommended it for a short while. At one point, the Python community created all kinds of mixins on all kinds of random APIs.

Then people noticed it was bad, and stopped.


Mixins are usually explicitly orthogonal and rarely get subclassed, so diamond-shaped inheritance with mixins seems rare.


Diamond inheritance is in fact highly pervasive in Python. The reason is that every class is a subclass of object since Python 3 (Python 2 allows classic classes that are different). So every single time you use multiple inheritance you have diamond inheritance. Some of this diamond inheritance is totally innocuous, but mostly not, because a lot of classes override dunder methods on object like __setattr__. It was Guido van Rossum himself that observed the prevalence of diamond inheritance that led to Python 2.3 fixing the MRO, and introducing the super() function to make multiple inheritance sane.

You should read his essay: https://www.python.org/download/releases/2.2/descrintro/


> Diamond inheritance is in fact highly pervasive in Python.

I don't think that's true, because...

> So every single time you use multiple inheritance you have diamond inheritance.

Multiple inheritance is supported but not itself “highly pervasive” in Python

> It was Guido van Rossum himself that observed the prevalence of diamond inheritance

The essay you link does not support that claim. He doesn’t observe an existing prevalence, he describes new features being added simultaneously with the MRO fix that would present new use cases where diamond inheritance may be useful.

And, its true, diamond inheritance is more common in modern Python than it was with classic classes in ancient Python, but there is a huge leap between that and “highly pervasive”.


The MRO fix was added to Python 2.3. The new style classes that would cause diamond inheritance to be prevalent were already present in Python 2.2. So they weren’t simultaneous.

A better phrasing would be that Guido predicted the prevalence of diamond inheritance in Python and therefore found it necessary to fix the MRO.


The most evil code I’ve ever written was diamond inheritance where (some) of the base types were template parameters.

I needed it!

For reasons.

Good reasons? No… but I had my justification.


> at least it is largely past us now

What does this mean? There doesn't seem to be a popular alternative to C++ yet, unfortunately.


Aside from game dev, Rust is being used in quite a lot of green field work where C++ would have otherwise been used.

Game dev world still has tons of C++, but also plenty of C#, I guess.

Agreed that it’s not really behind us though. Even if Rust gets used for 100% of C++’s typical domains going forward (and it’s a bit more complicated than that), there’s tens? hundreds? of millions (or maybe billions?) of lines of working C++ code out there in the wild that’ll need to be maintained for quite a long time - likely on the order of decades.


The problem in Rust is that if B is inside of A,

    struct A {
        name: String,
        owned: B
    }

    struct B {
        name: String,
    }

you can't have a writeable reference to both A and B at the same time. This is alien to the way C/C++ programmers think. Yes, there are ways around it, but you spend a lot of time in Rust getting the ownership plumbing right to make this work.


> you can't have a writeable reference to both A and B at the same time

> but you spend a lot of time in Rust getting the ownership plumbing right to

I think you maybe meant to say something different because here's the most obvious thing:

    impl A {
        fn simultaneously_writeable(&mut self) -> (&mut str, &mut str) {
            (&mut self.name, &mut self.owned.name)
        }
    }

Now it may take you a while to figure out if you've never done Rust before, but this is trivial.

Did you perhaps mean simultaneous partial field borrows where you have two separate functions that return the name fields mutably and you want to use the references returned by those functions separately simultaneously? That's hopefully going to be solved at some point, but in practice I've only seen the problem rarely so you may be overstating the true difficulty of this problem in practice.

Also, even in a more complicated example you could use RefCell to ensure that you really are grabbing the references safely at runtime while side-stepping the compile time borrow checking rules.


It's kind of crazy that OOP is sold to people as 'thinking about the world as objects' and then people expect to have an object, randomly take out a part, do whatever they want with it and just stick it back in and voila.

This is honestly such an insane take when you think about what the physical analogue would be (which again, is how OOP is sold).

The proper thing here is that, if A is the thing, then you really only have an A and your reference into B is just that, and it should be represented as such, with appropriate syntactic sugar. In Haskell, you would keep around A and use a lens into B and both get passed around separately. The semantic meaning is different.


I recently had this problem in some Rust code. I was implementing A and had some code that would decide which of several 'B's to use. I then wanted to call an internal method on A (that takes a mutable reference to A) with a mutable reference to the B that I selected. That was obviously rejected by the compiler and I had to find a way around it.


It's not crazy at all, especially since the majority of programming is about digitalization of real-world things/processes.

eBay, Tinder, Youtube, Robinhood, etc, etc.

Those are all real world things that are now represented in digital world and adjusted for that.

Also "world" doesn't imply "physical", but that's a different matter.

And at the end of the day that was not wildly crazy, but wildly successful!

Such school of thinking enabled generations of software engineers who created all this digital world.


Wildly successful does not mean a good idea.

> Such school of thinking enabled generations of software engineers who created all this digital world.

Same could be said for imperative or functional programming for that matter.


As far as I know OOP has orders of magnitude higher market share than FP.

> Wildly successful does not mean a good idea.

Sure, but if there was FP instead of OOP, then would current digital world be better, as big, safer?

Who knows?


Rust depends on C++; until people cut their compilers loose from LLVM, GCC, and other C++ based runtimes, it is going to stay with us for a very long time.

That includes industry standards like POSIX and Khronos, CUDA, HIP and SYCL, MPI and OpenMP, that mostly acknowledge C and C++ in their definitions.


There's a growing group that believes no new projects should be started in C/C++ due to its lack of memory safety guarantees. Obviously we should be managing existing projects, but 1973 is calling, it's time to retire into long-tail maintenance mode.

https://security.googleblog.com/2025/11/rust-in-android-move...


I've programmed C++ for decades and I believe all sane C++ code styles disallow multiple inheritance (possibly excepting pure abstract classes which are nothing but interfaces). I certainly haven't encountered any for a long time even in the OO-heavy code bases I've worked with.
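
For what it's worth, that exception usually looks something like this (a minimal sketch with made-up interface names):

    #include <string>

    // Pure abstract classes acting as interfaces carry no state, so
    // inheriting several of them creates no diamond of data members.
    class Drawable {
    public:
        virtual ~Drawable() = default;
        virtual void draw() const = 0;
    };

    class Serializable {
    public:
        virtual ~Serializable() = default;
        virtual std::string serialize() const = 0;
    };

    // The concrete class implements both interfaces.
    class Shape : public Drawable, public Serializable {
    public:
        void draw() const override {}
        std::string serialize() const override { return "shape"; }
    };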


I'm spoiled by Python's incredibly sane inheritance and I always have to keep in mind that inheritance is a very different beast in other languages.


And python didn't get it right the first time either. It wasn't until python 2.3 when method resolution order was decided by C3 linearization that the inheritance in python became sane.

http://mail.python.org/pipermail/python-dev/2002-October/029...


Inheritance being "sane" in Python is a red herring for which many smart people have fallen (e.g. https://www.youtube.com/watch?v=EiOglTERPEo). It's like saying that building a castle with sand is not a very good idea because first, it's going to be very difficult to extract pebbles (the technical difficulty) and also, it's generally been found to be a complicated and tedious material to work with and maintain. Then someone discovers a way to extract the pebbles. Now we have a whole bunch of castles sprouting that are really difficult to maintain.


Python is slightly better because it can mostly be manipulated beyond recognition due to strong metaprogramming, but Python's operator madness is dangerous. Random code can run at any moment. It's useful for some things and a good scripting language, and a very well designed one, no question there. Still, it would be better if it supported proper type classes. It could retain the dynamic typing, just be more sensible.


I'm always surprised by how arrogant and unaware Python developers are. JavaScript/C++/etc developers are quite honest about the flaws in their language. Python developers will stare at a horrible flaw in their language and say "I see nothing... BTW JS sucks so hard.".

Let me give you just one example of Python's stupid implementation of inheritance.

In Python you can initialize a class with a constructor that's not even in the inheritance chain (sorry, inheritance tree because Python developers think multiple inheritance is a good idea).

    class A:
        def __init__(self):
            self.prop = 1

    class B:
        def __init__(self):
            self.prop = 2

    class C(A):
        def __init__(self):
            B.__init__(self)


    c = C()
    print(c.prop) # 2, no problem boss

And before you say "but no one does that", no, I've seen that myself. Imagine you have a class that inherits from SteelMan but calls StealMan in its constructor and Python's like "looks good to me".

I've seen horrors you people can't imagine.

* I've seen superclass constructors called multiple times.

* I've seen constructors called out of order.

* I've seen intentional skipping of constructors (with comments saying "we have to do this because blah blah blah)

* I've seen intentional skipping of your parent's constructor and instead calling your grandparent's constructor.

* And worst of all, calling constructors which aren't even in your inheritance chain.

And before you say "but that's just a dumb thing to do", that's the exact criticism of JS/C++. If you don't use any of the footguns of JS/C++, then they're flawless too.

Python developers would say "Hurr durr, did you know that if you add an object and an array in JS you get a boolean?", completely ignoring that that's a dumb thing to do, but Python developers will call superclass constructors that don't even belong to them and think nothing of it.

------------------------------

Oh, bonus point. I've seen people creating a second constructor by calling `object.__new__(C)` instead of `C()` to avoid calling `C.__init__`. I didn't even know it was possible to construct an object while skipping its constructor, but dumb people know this and they use it.

Yes, instead of putting an if condition in the constructor Python developers in the wild, people who walk among us, who put their pants on one leg at a time like the rest of us, will call `object.__new__(C)` to construct a `C` object.

    def init_c():
        c2 = object.__new__(C)
        c2.prop2 = 'three'
        print(c2.__dict__, type(c2)) # {'prop2': 'three'} <class '__main__.C'>

And Python developers will look at this and say "Wow, Python is so flawless".


> In Python you can initialize a class with a constructor that's not even in the inheritance chain

No, you can't. Or, at least, if you can, that’s not what you’ve shown. You’ve shown calling the initializer of an unrelated class as a cross-applied method within the initializer. Initializers and constructors are different things.

> Oh, bonus point. I've seen people creating a second constructor by calling `object.__new__(C)` instead of `C()` to avoid calling `C.__init__`.

Knowing that there are two constructors that exist for normal, non-native, Python classes, and that the basic constructor is Class.__new__, and that the constructor Class() itself calls Class.__new__() and then, if Class.__new__() returns an instance i of Class, also calls Class.__init__(i) before returning i, is pretty basic Python knowledge.

> I didn't even know it was possible to construct an object while skipping its constructor, but dumb people know this and they use it.

I wouldn’t use the term “dumb people” to distinguish those who—unlike you, apparently—understand the normal Python constructors and the difference between a constructor and an initializer.


> Knowing that there are two constructors that exist for normal, non-native, Python classes, and that the basic constructor is Class.__new__, and that the constructor Class() itself calls Class.__new__() and then, if Class.__new__() returns an instance i of Class, also calls Class.__init__(i) before returning i, is pretty basic Python knowledge.

I disagree that this is basic knowledge. In Python a callable is an object whose type has a __call__() method. So when we see Class() it's just a syntax proxy for Metaclass.__call__(Class). That's the true (first of three?) constructor, the one then calling instance = Class.__new__(cls), and soon after Class.__init__(instance), to finally return instance.

That's not basic knowledge.


> Knowing that there are two constructors that exist for normal, non-native, Python classes, and that the basic constructor is Class.__new__, and that the constructor Class() itself calls Class.__new__() and then, if Class.__new__() returns an instance i of Class, also calls Class.__init__(i) before returning i, is pretty basic Python knowledge.

I didn't know most of that, and I've performed in a nightclub in Python, maintained a CSP networking stack in Python, presented a talk at a Python conference, implemented Python extensions with both C and cffi, and edited the Weekly Python-URL!


Don't you mean 0Python?


I see what you did there.


1. Your first example is very much expected, so I don't know what's wrong here.

2. Your examples / post in general seem to be "people can break semantics and get to the internals just to do anything", which I agree is bad, but Python works off the principle of "we're all consenting adults" and just because you can, doesn't mean you should.

I definitely don't consent to your code, and I wouldn't allow it to be merged in main.

If you or your team members have code like this, and it's regularly getting pushed into main, I think the issue is that you don't have safeguards for design or architecture

The difference with JavaScript "hurr durr add object and array" - is that it is not an architectural thing. That is a runtime / language semantics thing. One would be right to complain about that


> The difference with JavaScript "hurr durr add object and array" - is that it is not an architectural thing. That is a runtime / language semantics thing. One would be right to complain about that

Exactly. One is something in plain sight in front of one's eyes, and the other one can be well hidden, not easy to spot.


Oh I've seen one team constructing an object while skipping the constructor for a class owned by another team. The second team responded by rewriting the class in C. It turns out you cannot call `object.__new__` if the class is written in native code. At least Python doesn't allow you to mess around when memory safety is at stake.


For what it's worth, pyright highlights the problem in your first example:

    t.py:11:20 - error: Argument of type "Self@C" cannot be assigned to parameter "self" of type "B" in function "__init__"
        "C*" is not assignable to "B" (reportArgumentType)
    1 error, 0 warnings, 0 information 

ty and pyrefly give similar results. Unfortunately, mypy doesn't see a problem by default; you need to enable strict mode.


I don't understand the problem with your first example. The __init__ method isn't special and B.__init__ is just a function. Your code boils down to:

    def some_function(obj):
      obj.prop = 2

    class Foo:
      def __init__(self):
        some_function(self)

    # or really just like

    class Foo:
      def __init__(self):
        self.prop = 2

Which like, yeah of course that works. You can setattr on any object you please. Python's inheritance system ends up being sane in practice because it promises you nothing except method resolution and that's how it's used. Inheritance in Python is for code reuse.

Your examples genuinely haven't even scratched the surface of the weird stuff you can do when you take control of Python's machinery—self is just a convention, you can remove __init__ entirely, types are made up and the points don't matter. Foo() isn't even special, it's just __call__ on the class's type and you can make that do anything.


With the assumptions typical of static class-based OO (but which may or may not apply in programs in Python), this naively seems like a type error, and even when it isn't it introduces a coupling where the class where the call is made likely depends on the internal implementation (not just the public interface) of the called class, which is...definitely an opportunity to introduce unexpected bugs easily.


Curious quirk of history that C++ peaked when Gen X was coming of age, who were disproportionately affected by lead poisoning.


There's nothing wrong with implementation inheritance, though. Generic typestate is implementation inheritance in a type-theoretic trench coat. We were just very wrong to think that implementation inheritance has anything to do with modularity or "programming in the large": it turns out that these are entirely orthogonal concerns, and implementation inheritance is best used "in the small"!


If CLU only supported composition, was the Liskov substitution principle still applicable to CLU?


CLU implemented abstract data types. What we commonly call generics today.

The Liskov substitution principle in that context pretty much falls out naturally, as the entire point is to substitute types into your generic data structure.


No, because the LSP is specifically about inheritance, or subtyping more generally. No inheritance/subtyping, no LSP.

It is true that an interface defines certain requirements of things that claim to implement it, but merely having an interface lacks the critical essence of the LSP. The LSP is not merely a banal statement that "a thing that claims to implement an interface ought to actually implement it". It is richer and more subtle than that, though perhaps from an academic perspective, still fairly basic. In the real world a lot of code technically violates it in one way or another, though.


Yes it is, as it is about the semantics of type hierarchies, not their syntax. If your software has type hierarchies, then it is a good idea for them to conform to the principle, regardless of whether the implementation language syntax includes inheritance.

It might be argued that CLU is no better than typical OO languages in supporting the principle, but the principle is still valid - and it was particularly relevant at the time Liskov proposed it, as inheritance was frequently being abused as just a shortcut to do composition (fortunately, things are better now, right?)


I mean, duh. The spicier take is that Barbara Liskov is smarter than Alan Kay.


Except that Smalltalk is so aggressively duck-typed that inheritance is not particularly first class except as an easy way to build derived classes using base classes as a template. When it comes to actually working with objects, the protocol they follow (roughly: the informally specified API they implement) is paramount, and compositional techniques have been a part of Smalltalk best practice since forever ago (something it took C++ and Java devs decades to understand). This allows you to abuse the snotdoodles out of the doesNotUnderstand: operator to delegate received messages to another object or other objects; and also the become: operator to substitute one object for another, even if they lie worlds apart on the class-hierarchy tree, usually without the caller knowing the switch has taken place. As long as they respond to the expected messages in the right way, it all adds up the same both ways.


I mean, it's not that hard to understand why composition is to be preferred, when you could easily just use composition instead of inheritance. It's just that people who don't want to think have been cargo-culting inheritance ever since they first heard about it, as they don't think much further than the first reuse of a method through inheritance.


No, it's not a complete replacement for inheritance.


Nor did I claim so.


Composition folks can get very dogmatic.

I have some data types (structs or objects) that I want to serialize and persist, and that have some common attributes or behaviors.

In Swift I can have each object conform to Hashable, Identifiable, Codable, etc etc... and keep repeating the same stuff over and over, or just create a base DataObject, and have the specific data object inherit it and just .

In Swift you can do it with both protocols (and extensions of them), but after a while they start looking exactly like object inheritance, and nothing like composition.

Composition was preferred when many other languages didn't support object orientation out of the gate (think Ada, Lua, etc) and tooling (IDEs) was primitive, but almost all modern languages do support it, and the tooling is insanely great.

Composition is great when you have behaviour that can be widely different, depending on runtime conditions. But, when you keep repeating yourself over and over by adopting the same protocols, perhaps you need some inheritance.

The one negative of inheritance is that when you change some behaviour of a parent class, you need to do more refactoring as there could be other classes that depend on it. But, again, with today's IDEs and tooling, that is a lot easier.

TLDR: Composition was preferred in a world where the languages didn't support proper object inheritance out of the gate, and tooling and IDEs were still rudimentary.


> In Swift I can have each object conform to Hashable, Identifiable, Codable, etc etc... and keep repeating the same stuff over and over, or just create a base DataObject, and have the specific data object inherit it and just .

But then if you need a DataObject with an extra field, suddenly you need to re-implement serialization and deserialization. This only saves time across classes with exactly the same fields.

I'd argue that the proper tools for recursively implementing behaviours like `Eq`, `Hashable`, or `(De)Serialize` are decorator macros, e.g. Java annotations, Rust's `derive`, or Swift's attached macros.


Yes, all behaviors should be implemented like definitions in category theory: X behaves like a Y over the category of Zs, and you have to recursively unpack the definition of Y and Z through about 4-5 more layers before you have a concrete implementation.


I'll be honest here. I don't know if any comment on this thread is a joke.

There are valid reasons to want each one of the things described, and I really need to add type reflexivity to the set here. Looks like horizontal traits are a completely unsolved problem, because every type of program seems to favor a different implementation of it.


    > The one negative of inheritance is that when you change some behaviour of a parent class, you need to do more refactoring as there could be other classes that depend on it. But, again, with today's IDEs and tooling, that is a lot easier.

It is widely known as the "unstable base class" problem.

Another one is that there are cases where hierarchies simply don't work well. Platypus cases.

Another one is that inheritance hides where stuff is actually implemented, and it can be tedious to find out when unfamiliar with the code. It is very implicit in nature.

    > TLDR: Composition was preferred in a world where the languages didn't support proper object inheritance out of the gate, and tooling and IDEs were still rudimentary.

I think this is rather a rewriting of history to fit your narrative.

Fact is that at least one very modern language that is gaining in popularity doesn't have any inheritance, and seems to do just fine without it.

Many people still go about "solving" problems by making every noun a class, which is, frankly, a ridiculous methodology of not wanting to think much. This kind of has been addressed by Casey Muratori, who formulated it approximately like this: Making 1-to-1 mappings of things/hierarchies to hierarchies of classes/objects in the code. (https://inv.nadeko.net/watch?v=wo84LFzx5nI) This kind of representing things in the code has the programmer frequently adjusting the code and adding more specializations to it.

One silly example of this is the ever popular but terrible example of making "Car" a class and then subclassing that with various types of cars and then those by brands of cars etc. New brand of car appears on the market? Need to touch the code. New type of car? Need to touch the code. Something about regulations about what every car needs to have changes? Need to touch the code. This is exactly how it shouldn't be. Instead, one should be thinking of underlying concepts and how they could be represented so that they can either already deal with changes, or can be configured from configuration files and do not depend on the programmer adding yet another class.

Composition over inheritance is actually something that people realized after the widespread over-use of inheritance, not the other way around, and not because of language deficiencies either. The problems with inheritance are not merely previously bad IDE or editor support. The problems are that in some cases it is bad design.


nVidia has been deeply involved in the software side, first with gaming, forever. It’s written into their DNA. Even when ATI/AMD could outperform them in raw hardware, nVidia worked well with every last game and worked with individual developers, even writing some of their code for them.

