
Who would have thought including a hard dependency on a third-party service with unclear long-term availability would be a problem!

Paid compilers and remotely accessible mainframes all over again - people apparently never learn.



Everyone that successfully avoided social media for the last decade escaped with their mental health. Everyone that carefully moderates their AI intake (e.g. doesn't depend on Claude Code) will also escape with their skills over the next decade; others will become AI fiends, just like the social media fiends. Just knowing that tech like the internet and AI can fuck your whole brain up is enough to be ahead of the curve. If you didn't learn the lesson from the uptake of video games, cellphones, TV, streaming (the list is endless), then you're not paying attention.

The destruction of spelling didn’t feel like doomsday to us. In fact, I think most people treated the utter annihilation of that skill as a joke. “No one knows how to spell anymore” - haha, funny, isn’t technology cute? Not really. We’ve gone up an order of magnitude, and not paying attention to how programming is on the chopping block is going to screw a lot of people out of that skill.


Very thoughtful comment, let me try to capture it more clearly.

Zuckerberg recently said that those not using AI glasses will be cognitively disadvantaged in the future.

Imagine an AI-native from the future and one of the fancy espresso machines that we have today. They'll know right away how to operate it from their AI assistant, but they won't be able to figure out how it works on their own.

That's the future that Zuckerberg wants. Naturally, fancy IT offices will likely be gone. The AI-native would have bought the coffee machine for a large sum, for the nostalgia, trying to combat the existential dread and feelings of failure that are fueled by their behavior being ever more directly coerced into consumption.


Curious - maybe one could spin up a study on using calculators instead of calculating manually and whether it leads to less of some type of thinking and affects our abilities. But even if that's true (I'm not sure; maybe it only holds in domains we don't feel we need to care much about), would people quitting calculators be a good thing for getting things done in the world?


For me the thing that atrophied basic math skills wasn't the calculator, which was invented decades before I was born, but the rise of the smartphone.

Sure, calculators are useful in professional life and in high school math and science. But you still had to do everyday math in all kinds of places and didn't always have a calculator at hand. The smartphone changed that.

I feel that's relevant in two ways: just like with math, a little bit of manual coding is going to make a huge difference compared to no manual coding, and any study like the one you propose would be hugely complicated by everything else that happened around the same time, both because of smartphones and the coinciding 2008 crash.


It’s only a hard dependency if you don’t know and never learn how to program.

For developers who read and understand the code being generated, the tool could go away and it would only slow you down, not block progress.

And even if you don’t, it really isn’t a hard dependency on a particular tool. There are multiple competing tools and models to choose from, so if you can’t make progress with one, switch to another. There isn’t much lock-in to any specific tool.


My experience has been that Claude can lay out a lot of things in minutes that would take me hours if not days. Often I can dictate the precise logic and Claude gets most of the way there; with a little prompting, Claude can usually get even further. The amount of work I can get done is much more substantial than it used to be.

I think there is a lot of reluctance to adopt AI for coding, but I'm seeing it as a step change for coding the way powerful calculators/workstation computers were for traditional engineering disciplines. The volume of work engineers could do when limited to a slide rule was much lower than what they can do now with a computer.


> For developers who read and understand the code being generated, the tool could go away and it would only slow you down

Recent research suggests it would in fact speed you up.

https://metr.org/blog/2025-07-10-early-2025-ai-experienced-o...


You should actually read the paper. Sample size of 16, only 1 of whom had used Cursor for more than 40 hours before. All of them were working in existing codebases where they were the primary author.


I did read the paper, and the HN discussion (which is how I found it). I recommend you read that; your comments were addressed.

https://news.ycombinator.com/item?id=44522772


Which of bjclark's specific points does the thread actually refute?


Interestingly, devs felt that it sped them up even though it slowed them down in the study.

So even if it’s not an actual productivity booster on individual tasks, perhaps it still could reduce cognitive load and possibly burnout in the long term.

Either way, it’s a tool that devs should feel free to use or not use according to their preferences.


> Paid compilers.

I don't think this one is a good comparison.

Once you had the binary, the compiler worked forever.[1]

The issue with them was around long term support for bugs and upgrade path as the language evolved.

---

[1] as long as you had a machine capable of running/emulating the instruction set for the binary.


Hm, I am assuming that paid compilers were largely gone before the whole "must have this dongle attached to the computer" industry? Because for software that uses those, "I paid for it" absolutely does not guarantee "I can still run it". The only reason it's not more of a problem is the planned obsolescence that means you're forced to upgrade sooner or later (but, unlike purely subscription-based services, you have some control over how frequently you pay).


Sadly, paid compilers still exist, and paid compilers requiring a licensing dongle still exist. The embedded development world is filled with staggering amounts of user hostility.


My understanding is that much of the established embedded world has moved to some flavour of GCC or (more commonly) Clang, just because maintaining a proprietary optimising compiler is more effort than modifying (and eventually contributing to) Clang.


Tough for me to speak about embedded in general, but many companies are on vendor toolchains or paid compilers by choice, and it is the right choice to make given the tradeoffs involved.

IAR for example is simply a fantastic compiler. It produces more compact binaries that use less memory than GCC, with lots and lots of hardware support and noticeably better debugging. Many companies have systems-engineering deadlines which are much less amenable to beta quality software, fewer software engineering resources to deal with GCC or build-chain quirks (often, overworked EEs writing firmware), and also a strong desire due to BOM cost to use cheaper/less dense parts. And if there is a compiler bug or quirk, there is someone on the other end of the line who will actually pick up the phone when you call.

That said, some of those toolchain+IDE combos absolutely do suck in the embedded world, mostly the vendor-provided ones (makes sense, silicon manufacturers usually aren't very good at or care much about software, as it turns out).


> Tough for me to speak about embedded in general, but many companies are on vendor toolchains or paid compilers by choice, and it is the right choice to make given the tradeoffs involved.

That's true in general. With paid licenses and especially subscriptions, you're not just getting the service, you're also entering a relationship with the provider.

For companies, that often matters more than the service itself - especially when support is part of this relationship. That's one of many reasons companies like subscriptions.

For individuals, that sucks. They don't need or want another relationship with some random party that they now need to keep track of. The relationship has so much power imbalance that it doesn't benefit the individual at all - in fact, for most businesses, such a customer is nothing more than a row in an analytics database - or less, if GDPR applies.


8051s pretty much mean Keil - they used to do license dongles, but it's all online now. You really don't get much more established than the 8051. If you pick up any cheap electronic product and crack it open to find a low part count PCB with a black epoxy blob on it, chances are very good there's an 8051 core with a mask ROM under the blob.

(Also AVR/PIC compiler from Microchip had a dongle as recently as February this year, and it looks like it's still available for sale even though its license isn't in the new licensing model).


> My understanding is that much of the established embedded world has moved to any one flavour of GCC or (more commonly) Clang

Clang is not commonly used professionally in the embedded space.


You need to run the cost/benefit analysis here: if I had avoided Claude Code, all that would have happened is I would have written much less code.

What's the difference between never using Claude, and using it and then hitting these lower limits? In the end, I'm left with no Claude in both situations, which leaves me better off for having used it (I wrote more code while Claude worked).


Did you write more code with Claude? Isn’t the point that you have in fact written less (because Claude wrote it for you)?

As for the cost, you are ignoring the situation where someone has depended on the tool so much that when it goes away they are atrophied and unable to continue as before. So no, in your scenario you’re not always left better off.


More lines of code usually turns out not to be a very good metric. Can it also help you do the same with fewer lines of code and reduced complexity?


> people apparently never learn

They don't, because ironically enough human meatbags all need to be trained from scratch, unlike LLMs which don't die every 80 years or so.

I doubt even 10% of the devs now were conscious when "paid compilers and remotely accessible mainframes" still existed.



