True, though that's because Apple basically bought NeXT, its IP, OS, and top executives, and Jobs had learned a lot from the mistakes he'd made. Also he was singularly driven and charismatic (like him or not).
Basically it's as if Intel bought AMD and switched, but AMD had some visionary CEO that could lead Intel to victory.
The days of Moore are over. Still, I used to hate Intel, but now I root for them. I guess I like underdogs.
I don't know how you can root for Intel. Until AMD made its comeback, Intel charged premium prices for very low performance improvement generation after generation.
Intel is still the primary CPU on most laptops produced today, and only in the last 6-12 months have I started seeing some options to buy AMD-powered laptops.
Even ignoring that, Intel's vPro/ME implementation is still scarier (potentially more invasive) than the somewhat equivalent AMD "feature".
Yes, I am old enough to remember the years 1994/1995, when Intel introduced the split between the "Pentium Pro" CPUs intended for "professionals", who should pay dearly for the right to have computers that function without errors, and the "Pentium" CPUs (2nd generation Pentium @ 90/100 MHz), intended for the naive laymen, whose computers are not important, so it does not matter if their computers make mistakes from time to time.
This market segmentation introduced by Intel broke the tradition created by IBM, which had taken care from the beginning to provide the IBM PC with error-detecting memory.
You are mostly right, but AMD laptops have been available for years. One of my few Windows systems is an HP Envy x360 laptop with an AMD 4500U, from mid-2020.
Any one discovery isn't going to suffice. Building a bunch of technologies to make that discovery usable and profitable is hard and long work.
We have known for years that carbon nanotubes can make great, very compact, very fast transistors. There's nothing close to a practical application of this discovery, because we barely know how to make nanotubes in tiny amounts, in a handful of labs, at great expense.
Have you? There were plenty of people predicting Moore’s law would end in x years (e.g. https://www.technologyreview.com/2000/05/01/236362/the-end-o..., from 2000), and some people (including Moore himself) argued that progression had slowed a bit, but I don’t remember anybody saying the law already had ended at any time.
The graph of CPU density, clock speed, etc. has a kink in it from a decade or two ago. So, yeah, at least what we thought of as "Moore's Law" is dead, and has been for a while.
Dennard scaling broke down around 2005 or 2006.
Are we one discovery away? No, probably not. Leakage current isn't a "one trick" problem, especially as the features continue to get smaller. It's a fundamentally hard problem, and as you go to smaller feature sizes, it keeps getting harder. As we go to 3D approaches, thermal issues probably aren't "one trick" away either.
If you go read what his article said, it's very clear and incontestable that it ended. Still, that doesn't stop a bunch of people who have no idea what Moore's Law was about from claiming whatever they want.
Anyway, where did you hear Moore's Law ended 30 years ago? That one doesn't make sense.
The most superficial and literal reading of Moore's Law is "the number of transistors on the most economical package doubles every X months" where X changed a bit over the decades.
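As a rough illustration (my own sketch, not something from the article), here's how much the choice of X matters once you compound that doubling rule over a decade; the starting transistor count and the candidate values of X below are made up purely for the example.

    # Rough sketch: compound a "doubles every X months" rule over a decade.
    # The starting count and the values of X are hypothetical, chosen only
    # to show how sensitive the projection is to the choice of X.

    def projected_transistors(start, months_elapsed, doubling_period_months):
        """Transistor count after compounding the doubling rule."""
        return start * 2 ** (months_elapsed / doubling_period_months)

    start = 1e9    # hypothetical chip with 1 billion transistors
    decade = 120   # months

    for x in (12, 18, 24):
        print(f"X = {x} months -> {projected_transistors(start, decade, x):.2e}")

    # X = 12 months -> 1.02e+12
    # X = 18 months -> 1.02e+11
    # X = 24 months -> 3.20e+10

An order of magnitude of difference between X = 12 and X = 24 after just ten years is exactly why quietly changing X on each retelling keeps the "law" unfalsifiable.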
Not only has the number of transistors on the largest package not followed an exponential for a decade (a good hint is that people denying it repeat the law with X changed on every single iteration), but as fabs adopt more and more complex processes, the number of transistors on the most economical package has been growing very slowly, with no prospect of a doubling any time soon.
On a deeper reading, Moore's paper is all about the economics of semiconductor fabrication. And not only are companies that aren't on the most advanced process no longer failing as they used to, but nobody is using the law anymore to size investments.
There is just no way to read Moore's Law and interpret it into something that still exists.
> People were saying it was going to end, not that it already ended.
My read is that people who like to "buy American" are often concerned with where the product is manufactured, as opposed to where the company is headquartered or where the product was designed. On that score AMD is quite a bit less American-made than Intel, simply due to being fabricated in Taiwan and assembled in Malaysia (as far as I can find).