Jaws is the only movie (within reason I guess) that I don't let my 13 year old watch.

We live by the sea with one of the world's best marine reserves right off shore. There are plenty of fish including sharks living right off the beach and you need nothing more than a mask and snorkel to get right in amongst them.

When I watched Jaws as a kid when it first came out, it scared me shitless and I still carry some of that trauma whenever I am snorkeling over a deep canyon where the blue just goes on forever and you can't see the bottom.

I just don't want my child to miss out on that because of the ability of Hollywood to scare us.


I'd recommend you watch Jaws again, or at least some clips of it, just to see how cheesy it feels now. It's lost its teeth imo

"Jaws lost its teeth"? I see what you did there.

But yeah. I had the fortune to see Jaws on a bus in highland Bolivia (talk about a weird choice for forced onboard entertainment!), and it was more annoying than it was scary.


> Jaws is the only movie (within reason I guess) that I don't let my 13 year old watch.

On the other hand, I totally forgot about Sharknado until just now; that's my next movie night pick and my kid's gonna love it.


Do you have kids?

It's much easier to say to a child "you can't have a social media account, it's the law because experts have determined it's not healthy at your age" than "your mother and I think that social media is bad for you".


I tried that before they were 13. It was technically false, but then almost no social media company allows under-13s anyway.

It didn’t work very well.


> So you want to make it legally viable for religious parents to sue social spaces that allow for their children to question the religion of their parents?

Why not? In a court of law, dealing in facts, such lawsuits would only serve to highlight that religion is not a real thing. That would be a good thing for the world.


The subsequent Vietnam war reinforced this even more.

The only path that America had to win in Vietnam was to destroy it, including the population they were allegedly there to protect. Hence they lost.


I joined IBM over 40 years ago, like my pappy before me.

My main takeaway from IBM's longevity is just how astonishingly long big companies' death rattles can be, not how great IBM are at running software businesses.


Are they dying? IBM’s stock is up 160% over the past 5 years.

No. They are a multi-generational institution at this point and they are constantly evolving. If you work there it definitely FEELS like they are dying because the thing you spent the last 10 years of your career on is going away and was once heralded as the "next big thing." That said, IBM fascinated me when I was acquired by them because it is like a living organism. Hard to kill, fully enmeshed in both the business and political fabric of things and so ultimately able to sustain market shifts.

That's an interesting and enlightening way to look at it.

For me it was the death of IBM's preeminence in IT. When I started there a job at IBM was prestigious, a job for life. More than once I was told that we had a lengthy backlog of inventions and technological wonders that could be wheeled out of the plant if competitors ever nipped too closely at IBM's heels.

At that time IBM had never made a single person redundant - anywhere in the world. The company had an incredibly sophisticated internal HR platform that did elaborate succession planning and treated training and promotion as major workforce factors - there was little need to think much about recruitment because jobs were for life. IBM could win any deal, maybe needing only to discount a little if things were very competitive.

It's impossible to imagine now what a lofty position the company held. It's not unfair to say that, if not dead, the IBM of old is no longer with us.


To be fair, the whole job market has changed. Layoffs and the death of "a job for life" are not unique to IBM.

I think the pace of progress and innovation has, for better or worse, meant that companies can no longer count on successfully evolving only from the inside through re-training and promotions over the average employee's entire career arc (let's say 30 years).

The reality is that too many people who seek out jobs in huge companies like IBM are not looking to constantly re-invent themselves and learn new things and keep pushing themselves into new areas every 5-10 years (or less), which is table stakes now for tech companies that want to stay relevant.


Honestly, I think that's people reacting to the market more than it's the market reacting to people.

If your average zoomer had the ability to get a job for life that paid comparably well by a company that would look after them, I don't think loyalty would be an issue.

The problem is today, sticking with a company typically means below market reward, which is particularly acute given the ongoing cost of living crises affecting the west.


It is absolutely a different company. I had the opportunity to intern there twice in the late 70's and then was acquired by them in 2015; the IBM of 1978 and the IBM of 2015 were very different businesses. Having "grown up" so to speak in the Bay Area tech company ecosystem, where companies usually died when their founding team stopped caring, IBM was a company that had decided, as an institution, to encapsulate what it took to survive in the company's DNA. I had a lot of great discussions (still do!) with our integration executive (that is the person responsible for integrating the acquisition with the larger company) about IBM's practices around change.

I interned there one summer, 9 years ago, and even then it felt like the company had died 30 years prior. It's a super weird place to work.

I don't think they're dying at all, they've just become yet another consultancy/outsourcing shop

turning into a rent-seeking-behavior engine.

the final end-state of the company, like a glorious star turning into a black hole


exactly

It's just the same dynamic as old servers. They still work fine but power costs make them uneconomical compared to latest tech.

It’s far more extreme: old servers are still okay on I/O, and memory latency, etc. won’t change that dramatically, so you can still find productive uses for them. AI workloads are hyper-focused on a single type of work and, unlike most regular servers, are a limiting factor in direct competition with other companies.

I mean, you could use training GPUs for inference, right? That would be use case number 1 for an 8x A100 box in a couple of years. They can also be used for non-I/O-limited things like folding proteins or other 'scientific' use cases. Push comes to shove, I'm sure an old A100 will run Crysis.

All those use cases would probably use up 1% of the current AI infrastructure, let alone what they're planning to build.

Yeah, just like gas, possible uses will expand if AI crashes out, but:

* will these uses cover, say, 60% of all this infra?

* will these uses scale up to use that 60% within the next 5-7 years, while that hardware is still relevant and fully functional?

Also, we still have railroad tracks from the 1800s rail mania that were never truly used to capacity and dot com boom dark fiber that's also never been used fully, even with the internet growing 100x since. And tracks and fiber don't degrade as quickly as server hardware and especially GPUs.


> Push comes to shove, I'm sure an old A100 will run Crysis.

They don’t have video out ports!


Just like laptop dGPUs.

LambdaLabs is still making money off their Tesla V100s, A100s, and A6000s. The older ones are still capable enough to run some models and are very cheap, so if that's all you need, that's what you'll pick.

The V100 was released in 2017, the A100 in 2020, and the A6000 in late 2020.


That could change with a power generation breakthrough. If power is very cheap then running ancient gear till it falls apart starts making more sense

Power consumption is only part of the equation. More efficient chips => less heat => lower cooling costs and/or higher compute density in the same space.

Solution: run them in the north. Put a server in the basement of every home in Edmonton and use the excess heat to warm the house.

Hugely unlikely.

Even if the power is free you still need a grid connection to move it to where you need it, and, guess what, the US grid is bursting at the seams. This is not just due to data center demand; it was struggling to cope with the transition away from coal well before that point.

You also can’t buy a gas turbine for love nor money at the moment, and they’re not ever going to be free.

Plonking massive amounts of solar panels and batteries in the Nevada desert is becoming cheap, but it ain’t free, particularly as you’ll still need gas backup for a string of cloudy days.

If you think SMRs are going to be cheap I have a bridge to sell you; you’re also not going to build them right next to your data centre because the NRC won’t let you.

So that leaves fusion or geothermal. Geothermal is not presently “very cheap” and fusion power has not been demonstrated to work at any price.


I'm a little bit curious about this. Where does all the hardware from the big tech giants usually go once they've upgraded?

In-house hyperscaler stuff gets shredded, after every single piece of flash storage gets first drilled through and every hard drive gets bent by a hydraulic press. Then it goes into the usual e-waste recycling stream (ie. gets sent to poor countries where precious metals get extracted by people with a halved life expectancy).

Off-the-shelf enterprise gear has a chance to get a second life through remarketing channels, but much of it also gets shredded due to dumb corporate policies. There are stories of some companies refusing to offload a massive decom onto the second hand market as it would actually cause a crash. :)

It's a very efficient system, you see.


Similar to corporate laptops where, due to stupid policies, for most BigCos you can't really buy or otherwise get a used laptop, even as the former corporate user of said laptop.

Super environmentally friendly.


I use (relatively) ancient servers (5-10 years old) because their performance is completely adequate; they just use slightly more power. As a plus, it's easy to buy spare parts, and they run on DDR3, so I'm not paying the current "RAM tax". I generally get such a server, max out its RAM, max out its CPUs and put it to work.

Same, the bang for buck on a 5yo server is insane. I got an old Dell a year ago (to replace our 15yo one that finally died) and it was $1200 AUD for a maxed out recently-retired server with 72TB of hard drives and something like 292GB of RAM.

Just not too old. Easy to get into "power usage makes it not worth it" for any use case when it runs 24/7

Seriously. 24/7 adds up faster than most realize!

The idle wattage per module has shrunk from 2.5-3W down to 1-1.2W between DDR3 & DDR5. Assuming a 1.3W difference per stick (so 10.4W across 8 sticks, running 8760 hours a year), a DDR3 machine would increase your yearly power consumption by almost 1% (assuming an average 10,500kWh/yr household).

That's only a couple of dollars in most cases, but the gap only gets larger in every other respect. When I upgraded from Zen 2 to Zen 3 it was able to complete the same workload just as fast with half as many cores while pulling over 100W less. Sustained 100% utilization barely even heats a room effectively anymore!
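
For what it's worth, the back-of-envelope math on the assumed numbers above (1.3W per stick, 8 sticks, running 24/7, 10,500kWh/yr household) works out to:

    1.3 W x 8 sticks = 10.4 W extra at idle
    10.4 W x 8760 h  = ~91 kWh/yr
    91 / 10,500      = ~0.9% of household usage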


The one thing to be careful with Zen 2 onwards is that if your server is going to be idling most of the time then the majority of your power usage comes from the IO die. Quite a few times you'd be better off with the "less efficient" Intel chips because they save 10-20 Watts when doing nothing.

A similar one I just ran into: my Framework Desktop was idling @ 5W more than other reported numbers. Issue turned out to be the 10 year old ATX PSU I was using.

Wake on LAN?

Then you cannot enjoy some very useful and widely used home server functions like home automation or an NVR.

To be clear, this server is very lightly loaded, it's just running our internal network services (file server, VPN/DNS, various web apps, SVN etc.) so it's not like we're flogging a room full of GeForce 1080Ti cards instead of buying a new 4090Ti or whatever. Also it's at work so it doesn't impact the home power bill. :D

Maybe? The price difference on newer hardware can buy a lot of electricity, and if you aren't running stuff at 100% all the time the calculation changes again. Idle power draw on a brand new server isn't significantly different from one that's 5 years old.

Some is sold on the used market; some is destroyed. There are plenty of used V100 and A100 available now for example.

Manipulating this for creative accounting seems to be the root of Michael Burry’s argument, although I’m not fluent enough in his figures to map them here. Still, it's interesting to see IBM argue a similar case (somewhat), and to see comments ITT hitting the same known facts, in light of Nvidia’s counterpoints to him.

Burry just did his first interview in many years https://youtu.be/nsE13fvjz18?t=265

with Michael Lewis, about 30 mins long. Highlights - he thinks we are near the top; his puts are for two years' time. If you go long he suggests healthcare stocks. He's been long gold for some years, and thinks bitcoin is dumb. Thinks this is dotcom bubble #2, except instead of pro investors it's mostly index funds this time. Most recent headlines about him have been bad reporting.


> They still work fine but power costs make them uneconomical compared to latest tech.

That's not necessarily the driving financial decision; in fact I'd argue companies making data center hardware purchases barely look at this number. It's simpler than that - their support runs out and it's cheaper to buy a new piece of hardware (that IS more efficient) because the hardware vendors make extended support inordinately expensive.

Put yourselves in the shoes of a sales person at Dell selling enterprise server hardware and you'll see why this model makes sense.


Eh, not exactly. If you don't run the CPU at 70%+, the rest of the machine isn't that much more inefficient than a model a generation or two behind.

It used to be that a new server could use half the power of the old one at idle, but vendors figured out a while ago that servers also need proper power management, and it is much better now.

The last few gens' increases could be summed up as "a low % increase in efficiency, along with increases in TDP, memory channels and core count".

So for loads that aren't CPU bound the savings on a newer gen aren't nearly worth the replacement, and for bulk storage the CPU power usage is an even smaller part of the picture.


Definitely, single thread performance and storage are the main reasons not to use an old server. A 6 year old server didn't have NVMe drives, so SATA SSDs at best. That's a major slowdown if disk is important.

Aside from that there's no reason to not use a dual socket server from 5 years ago instead of a single socket server of today. Power and reliability maybe not as good.


NVMe is just a different form factor for what's essentially a PCIe connection, and adapters are widely available to bridge these formats. Surely old servers will still support PCIe?

That was then. Now, high-end chips are reaching 4, 3, 2 nm. Power savings aren't that high anymore. What's the power saving going from 4nm to 2nm?

+5-20% clockspeed at 5-25% lower voltages (which has been, and continues to be, the trend) adds up quickly from gen to gen, never mind density or IPC gains.
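
A rough illustration of why, using the classic dynamic-power approximation P ≈ C·V²·f and ignoring static/leakage power (so a sketch, not a measurement), taking a mid-range point from those figures:

    f: +10%, V: -15%  ->  1.10 x 0.85^2 ≈ 0.79

i.e. roughly 10% more clock for ~20% less switching power in a single generation, before any density or IPC gains.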

We can’t really go lower on voltage anymore without a very significant change in the materials used. Silicon band gap yadda yadda.

What if it's some junior given a job beyond their abilities, struggling manfully with whatever tools they have to hand? Is it worth publicly trashing their name? What does their name really add to this article?


A good lesson. If you as an employer look at this history, and handle it appropriately in the interview (what did you learn / what do you do better now, for example), you can figure out whether they did.

I'm sure lots won't, but if that is you as an employer you're worth nothing.


What, understanding, reviewing, and accepting a two-line patch is now a job beyond a junior's ability? Beyond the ability of anyone who can call themselves a "programmer," much less a "maintainer"?

As a certified former newborn, I can tell you that finding the tit as a newborn is way harder, and yet here we all are.

"Struggling manfully," my arse, I don't know if the bar can go any lower...


It discourages others from doing the same. It might not be much, but discussing various made-up "what if ..." scenarios also doesn't add much. We can just stick to the facts.


You're not wrong.

It's fashionable to dunk on OOP (because most examples - like employee being a subtype of person - are stupid) and ORM (because yes you need to hand write queries of any real complexity).

But there's a reason large projects rely on them. When used properly they are powerful, useful, time-saving and complexity-reducing abstractions.

Code hipsters always push new techniques and disparage the old ones, then eventually realise that there were good reasons for the status quo.

Case in point: the arrival of NoSQL and the wild uptake of MongoDB and the like last decade. Today people have re-learned the value of the R part of RDBMS.


Large projects benefited from OOP because large projects need abstraction and modularization. But OOP is not unique in providing those benefits, and it includes some constructs (e.g. inheritance, strictly-dynamic polymorphism) that have proven harmful over time.


Inheritance == harmful is quite an extreme position.


It may be extreme, but it's very common. It's probably the single most common argument used against OOP. If you drop out inheritance, most of the complaints about OO fall away.


Can you share where you've seen inheritance get dropped and it resulted in fewer complaints about OOP?


Anecdotally, yes. In work efforts where inheritance was kept to a minimum (shallow, more sensible class structures), there were far fewer complaints about it and far fewer problems caused by it.

Outside that, look to Go. Some people will waste a few pages and hours of their life arguing about whether it is or isn't OO, but it provides everything other OO languages provide except for inheritance (struct embedding kinda-sorta looks like inheritance, but it's composition and some syntax sugar to avoid `deeply.nested.references()`). It provides for polymorphism, encapsulation, and information hiding. The complaints about Go are never (or rarely) about its OO system.
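
To make that concrete, here's a minimal sketch in Go (hypothetical types, purely for illustration) showing polymorphism via an interface plus code reuse via embedding, with no inheritance anywhere:

    package main

    import "fmt"

    // Polymorphism: anything with a Store method satisfies this interface.
    type Storer interface {
        Store(key, value string) error
    }

    // A small type providing shared behaviour.
    type Logger struct{ prefix string }

    func (l Logger) Log(msg string) { fmt.Println(l.prefix, msg) }

    // Composition via embedding: DiskStore gets Log "for free",
    // but this is delegation plus syntax sugar, not an is-a relationship.
    type DiskStore struct {
        Logger
        dir string
    }

    func (d DiskStore) Store(key, value string) error {
        d.Log("storing " + key)
        return nil // a real implementation would write to d.dir
    }

    // save only cares that its argument satisfies Storer.
    func save(s Storer) error { return s.Store("answer", "42") }

    func main() {
        ds := DiskStore{Logger{"[disk]"}, "/tmp"}
        _ = save(ds) // DiskStore used polymorphically through Storer
    }

save() never asks what the concrete type is, and DiskStore reuses Logger's behaviour without there being any class hierarchy to fight with.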


OOP without inheritance is exactly what VB6 had… now it's cool again, I guess.


Always has been.

That position's not uncommon, but generally people who hold it prefer the Rust-style trait/interface system. To me it makes more sense: I don't care what this object is, so long as it guarantees to provide me with the functionality I need.


What would be the best use of his last few years? Sitting in an easy chair by the pool?

Using your last few years to exercise your brain and ward off cognitive decline might be the best way to ensure those last few years are fulfilling and not just marking time before the end.


That's quite a dichotomy here. He can exercise his brain and ward off cognitive decline without working on Colorforth specifically...


Some people have trouble doing meaningless intellectual pursuits like crosswords, sudoku etc.

Working on Colorforth might be the greatest meaning in his life.


He himself said he didn't think it was worth it anymore and that he very rarely codes now. I respect him enough to assume he has some other pursuit more worthy of his attention.


Maybe he's been suffering a cognitive decline that makes coding much less rewarding.


I suppose there's meaning in searching for abstract logical truths, but he might have other such pursuits. Or, he might even feel that it's mostly done already and became just another boring software maintenance project.

It's hard to imagine an extremely niche software tool being the greatest meaning in someone's life.


Still the same dichotomy. Who's to say his other pursuits are "meaningless", like crosswords, sudoku, etc.? For all we know he might have some other projects that he considers more useful.

He does not think working on Colorforth is worth it anymore, so it could actually be detrimental to do so.


>> Some people have trouble doing meaningless intellectual pursuits like crosswords, sudoku etc.

My Dad is like this. I'm like this. My son is like this.

Unless we're busy, pushing ourselves to build something, fix something or just outside doing something we don't feel the reward.

My Dad told me his motto: "A rolling rock gathers no moss - until it finally stops rolling." He told me that in his 50's - he's in his 80's, still out in the garage refinishing old furniture and giving it away. The drive the man has just never burns out.


Yeah! I can definitely see myself doing that. I told family and friends that my ideal way to die is a quick death at the computer when I'm over 80, working on some project. I just have to keep moving forward in some direction.

Perhaps he has chosen the best use of his last few years to his own satisfaction, and doesn't feel the need to share every last detail about himself on the internet.


He specifically mentions hiking and staying healthy, I'd imagine he's not going to stop using his brain completely.


> What would be the best use of his last few years? Sitting in an easy chair by the pool?

That's completely up to him, and if that's what he wants to do, then that's the best use. No one can say what is best for anyone else.


Google is also conflicted though.

The more they emulate ChatGPT's clean UI, the more they are failing to push ads in people's faces, which is what generates that $100B for them.

Their business model fails if their users don't experience a confusing crap-fest of ads.


> Their business model fails if their users don't experience a confusing crap-fest of ads.

I bet you OpenAI will implement ads far sooner than Google can hypothetically run out of money.


They don't have to do that today, though. Remember original Google? Or original Gmail? Neither had prominent ads for the first few years; a couple of decades later, the ads are everywhere.


> the more they are failing to push ads in people's faces

Won't be long until chat AI will include sponsored products and services in the output.


OpenAI isn't burning through tens of billions of dollars every year on its free tier for charity. It will dial up the advertising knob (and every other knob) to 11 the moment the cash starts to run out.


If ads are the future of revenue for AI then every AI company is in trouble, because ads won’t even come close to covering the costs.


Google also has GCP where it can monetize its hardware investments.


As always, they will keep it clean until they can crank that enshitification dial to 11.

