Intel’s failing will redefine the industry in many ways. ARM and AMD and other players are taking chunks out of them at the cutting edge. Intel seems to be redefining itself as a contract fabricator rather than taking leadership on the design end of things … playing off its scale rather than its velocity. It’s a big change, and it probably has a lot to do with why Apple ditched them. Remember, Apple did this before when they switched to Intel from IBM/Motorola, who had also stagnated.
This take totally misses the mark on the realities of the situation.
Intel made a bad bet on tech and wasn’t able to shrink its node. TSMC made the right choice and was therefore able to ship better tech in the short term. Intel's designs are not directly related to that shortcoming.
TSMC makes way more chips than Intel. TSMC is therefore able to buy fabs and equipment and hire engineers at a lower price per chip (since there are more chips), so it can spend more on its fabs and invest more in research. Intel can’t keep up on the manufacturing side even if it can on the design side. The only way to justify the research and fab costs is to amortize them across more chips, which means Intel needs to manufacture for more than just Intel. It’s basically the AWS model: you can be your own best customer, but you can drive prices down for yourself with extra customers. Amazon didn’t abandon revolutionary 2-day shipping when it became a data center provider. Assuming Intel still has good designers left, they won’t abandon their own chips.
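The amortization argument boils down to simple arithmetic. A rough sketch with made-up numbers (the fixed and variable costs below are hypothetical, not real Intel or TSMC figures):

```python
# Illustrative only: amortizing a fixed fab + R&D cost over different
# production volumes. All dollar figures are invented for the example.

def cost_per_chip(fixed_cost, variable_cost, volume):
    """Per-chip cost = this chip's share of the fixed cost
    plus the marginal cost of producing it."""
    return fixed_cost / volume + variable_cost

FIXED = 20_000_000_000    # hypothetical fab build-out + research, $20B
VARIABLE = 40             # hypothetical marginal cost per chip, $40

for volume in (100_000_000, 500_000_000, 1_000_000_000):
    print(f"{volume:>13,} chips -> ${cost_per_chip(FIXED, VARIABLE, volume):,.2f}/chip")
```

Tripling or 10x-ing volume shrinks the fixed-cost share per chip toward zero, which is why a foundry selling to many customers can outspend an integrated manufacturer selling only its own parts.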
Multiple reports over the years indicate that it's not just a single bad bet that got Intel into this, but a corporate culture where engineers have less and less influence and MBAs more and more, leading to worse and worse tech. Similar to Boeing.
Yet Intel is still competitive with AMD, and now has an engineer in charge. Intel's problems feel pretty overstated: their products are still good, and they're launching new ones that will bring new competition to the GPU space.
I’m personally quite excited for what Intel has planned, and I work on M1 chips (opinions are my own). I think there’s a decent chance they’ll have a comeback in the next few years. Yes, AMD is doing awesome and ARM is bigger than ever. We’re definitely headed into a very interesting time for processors as consumers.
> The only way to justify the research costs and fab costs is to amortize it across more chips
Or make more money per chip, which is what Intel does. For an AMD chip that TSMC manufactures, TSMC takes a cut for manufacturing and AMD takes the rest, while Intel collects both portions on its own chips.
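The margin-split point can be made concrete with invented numbers (the selling price and foundry cut below are placeholders, not actual figures):

```python
# Illustrative only: how per-chip revenue splits differ for a fabless
# designer (AMD + TSMC) vs. an integrated manufacturer (Intel).
# All dollar amounts are hypothetical.

SELL_PRICE = 300   # assumed chip selling price, $
FAB_CUT = 120      # assumed foundry charge per chip, $

amd_take = SELL_PRICE - FAB_CUT   # fabless designer keeps price minus foundry cut
tsmc_take = FAB_CUT               # foundry keeps the manufacturing portion
intel_take = SELL_PRICE           # integrated manufacturer keeps both portions

print(amd_take, tsmc_take, intel_take)
```

Under these assumptions the integrated player books the designer's and the foundry's share of every chip, which is how higher per-chip profit can offset lower volume.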
In return it also has two sets of R&D to support and two sets of risks - architecture and manufacturing. If it falls behind on either of these it starts to lose.
TSMC, for example, can focus solely on manufacturing, assured that it will fill its fabs if it keeps pace.
Maybe Intel made super profits when x86 was the only game in town but that’s not the case any more.
>If it falls behind on either of these it starts to lose.
If TSMC falls behind in one of these, they lose, and they don't have the other. Is that an advantage, as you seem to put it? If they make a wrong choice like Intel did at 10nm, they immediately become a non-entity with no 'other' business. Having two money-making businesses puts Intel at a big advantage in terms of financing and owning its own platforms.
If TSMC falls behind a node, all of their orders will disappear to whoever has a more advanced node. They don't have another business. Instead of two risks, they have one risk that's identical to Intel's, and their entire business depends on it. That's a lot less antifragile.
Intel has two sets of risks, and in exchange, on many fewer chips, they made basically the same amount of money last year, when they were behind on CPUs by almost every metric. That's resilient. People talking about the fall of Intel are describing something Intel is actively maneuvering ahead of. TSMC has no chip-design risk, but much lower per-chip profits in exchange.
Vertical integration is great if it generates synergies. It’s really bad if the tie into the internal customer hinders the development of each part of the business.
Intel is not remotely robust, as it’s almost completely dependent on x86 and needs to catch up with TSMC. It lost smartphones in part because of x86. Now it’s fallen behind AMD because of manufacturing weaknesses. Hence a P/E ratio of 9 vs. ~20 for TSMC.
TSMC on the other hand has a huge variety of customers at a wide variety of nodes.
Intel never had smartphones to lose. Intel can get all the same advantages as AMD by simply buying chips from TSMC if it wants (and it already does for some chipsets), so there is no operational disadvantage. Intel has already mostly caught up to AMD and will soon be making significantly more per chip than AMD.
TSMC is competing for something Intel doesn't want to sell. Even when Intel was in the lead, it wasn't fabbing its newest process for third parties. You're declaring TSMC the victor in a game Intel never played. And if, in a handful of years, Intel regains its process advantage, you will likely still declare them the loser for not playing the cutting-edge-fab-for-other-companies game they don't want to play.
TSMC is not playing the same game Intel is, and in 2021, when by all accounts Intel was behind TSMC and AMD, they still managed to make similar profit to TSMC and laugh at AMD's inability to buy enough chips to mount anything close to real competition for either Intel or Nvidia.
Now they're also getting into graphics cards and have largely caught up with AMD designs. Their future is bright.
Intel has consistently tried to build a top tier GPU and failed year after year. Expecting them to suddenly break away from their history is extremely optimistic.
They're just starting to get into the business. It will improve. They don't have to have the best cards, they just have to compete in some segments from the start. It's all upwards from here.
That's the thing about experience - you keep accumulating it.
Stagnated? Weren't PowerPC chips pretty advanced compared to Intel whose chips were carrying a lot of baggage at the time? Given that PowerPC chips were based on RISC, I'd guess they're a lot closer to the M1 than modern Intel chips are.
My understanding is that IBM/Motorola's struggle with achieving volume is what doomed them, not a lack of innovation.
This is all way outta my area of understanding though.
> My understanding is that IBM/Motorola's struggle with achieving volume is what doomed them, not a lack of innovation.
Before Apple announced their Intel transition laptops were more than half of their Mac sales. Of their desktop sales, the iMac dominated over PowerMacs. So a majority of the systems they were selling had relatively tight thermal envelopes.
Neither IBM nor Motorola was willing (or able) to get computing power equivalent to x86 into those thermal envelopes. The G5 was a derivative of IBM's POWER chips they put in servers and workstations. They were largely unconcerned with thermals. Motorola saw the embedded space as more profitable and didn't want to invest in the G4 to make it more competitive.
Meanwhile Intel had introduced the Pentium M-derived Core series chips. Good thermals, high performance, multiple cores, and 64-bit. It was better performance than Apple's G5 in the thermal envelope of the G4.
Neither IBM nor Motorola had general issues with production volume. Apple's switch was all about the future direction of the architecture. There was no market for desktop PowerPC chips besides Apple, and neither IBM nor Motorola really wanted to compete directly with Intel; both saw their fortunes in other segments.
So Apple went with Intel because they were making chips compatible with what Apple wanted to do with the Mac. The first Intel Macs ran circles around the PowerPC machines they replaced with no major sacrifices needed in thermals or battery life.
So Intel innovation in the 00s got Apple to switch to them and a lack thereof got them to switch away again.
> Weren't PowerPC chips pretty advanced compared to Intel
PowerPC had better floating point performance which was important for graphics and publishing workflows. Photoshop performance comparisons seemed to happen at every year's MacWorld during that period.
Unfortunately, IBM treated POWER as a workstation chip, and making a laptop version was not on their radar. Of course, at the time, Pentium 4 chips weren't known for running cool either. The more popular laptops got, the more of a problem this was.
After Intel transitioned to the Core architecture, Apple transitioned to Intel so they could make laptops with a much better performance per watt than PowerPC offered.
People weren't buying laptops for everything during the PowerPC transition. They were buying desktops. No one doing "serious" work bought laptops in 1994. Not for coding, not for photo manipulation, or even gaming.
It wasn't until the 20-teens (2013 - 2015) that Macs for coding caught on. Apple transitioning to PowerPC made perfect sense for graphics workstations.
Right, but I didn't start seeing Mac laptops show up at work (software companies in the Midwest and Eastern U.S.) until 2014 when the MBP became a serious contender to Dells and HPs.
I had a Toshiba laptop in '95 for work, but it was a spare for when I was traveling. That pattern continued with Windows laptops as supplements for desktops in the office for the next 20 years. In 2015 my company went all MacBook Pro for everything, but they were trailing not trailblazing.
Students are always going to be the leading edge, because they have to be mobile; they get used to it, and then they bring it into the workplace when they arrive. It's one of the benefits of hiring new people.
If I recall, the scuttlebutt was that Motorola had promised Apple (meaning Steve Jobs) that faster clock speeds were just around the corner for a while, and when that repeatedly failed to materialize Apple (meaning Steve Jobs) got pissed and activated the Intel backup plan.
Well, it wasn't really clock speeds, it was performance per watt. The G5 was able to go into a (water-cooled) PowerMac, but IBM couldn't get it running cool and efficient enough for Apple laptops (or the Mac mini, IIRC). By 2005 Intel was, at the very least, probably prototyping multicore chips (I don't remember if IBM's processor offering to Apple was multicore at the time) that blew IBM (and previous Intel offerings) out of the water performance- and efficiency-wise, and Apple announced the transition.
It was two dual-core 2.5GHz CPUs. I have one sitting under my desk right now. ;)
Pretty advanced for 2005. Four 64-bit CPU cores, 16GB DDR2 RAM, and liquid cooled. It's still usable today, 17 years later, and it could handle 90% of what I do on a computer. It draws nearly 1000 watts under full load, though...
YouTube is extremely slow, and it can only play back 360p or lower video smoothly. There is no hardware H.264 acceleration. “New” Reddit is also very slow, but old Reddit loads fine. Hacker News is very fast, loading as quickly as on a modern computer.
Sort of. The G4 chip was still used in laptops until the Intel transition, and was produced by Motorola until they spun off their semiconductor division into Freescale, which continued producing the G4 until the end.
It was driven by power and heat for the portable market. Intel x86 laptops were the best at the time, and PPC couldn’t compete, especially in thin-and-light designs.
In general, IBM was just going in a different direction with POWER than Apple needed. IBM was, and is, focused on the highest-end, highest-priced segment of the server market.
The PPC processors may have been decent, but they were hamstrung by having to run an emulated 68K operating system, and Apple cheaped out with slower buses.
>Intel’s failing will redefine the industry in many ways. ARM and AMD and other players are taking chunks out of them at the cutting edge
Failing? Have you looked at Intel's 12th Gen CPUs? This trope was valid till the 10th Gen 14nm++++ era from 2019 but you might have overslept the last couple of years. Intel has improved massively since then starting with 11th Gen and Xe graphics.
Intel's 12th gen big-little tech really shook up the market and even AMD now is feeling the pressure.
11th-gen Intel chips were still 14nm, the top chip had fewer threads than the 10th gen because of thermals, and Intel Xe was, IIRC, only offered with the 11900/11900K (i.e., the top of the stack). Intel has had a stranglehold on integrated transcoders for a while, but AMD’s integrated Vega cores (soon to be RDNA2) still wipe the floor with current integrated Xe offerings gaming-wise…
One could argue it took ten years for Intel to have enough competition from ARM to actually wake up and do something again.
I don't care; I got a 12th-gen i7 with integrated graphics (in the weird time window and edge case where Intel was ahead of AMD again for a bit) which is super fast and was way better priced than Intel used to be.
No, consumer demand. It takes years to design, test, and prepare a new CPU architecture for manufacturing, so Intel had its big-little design in the pipeline long before Apple came out with the M1, the same way it took Apple over 10 years of iterations to get to the M1.
The real question is what is AMD gonna respond with?
I think you're mixing up some things. I couldn't just buy an ARM chip and plug it into a desktop PC or laptop, and besides, ARM chips, big-little or not, were terribly underpowered 10 years ago compared even to an Intel Celeron.
So calling it a 10-year lag because of a feature that had no relevance in the PC space back then is a misrepresentation.
Big-little made it to the PC market now because modern CPU cores are powerful enough that even low-performance ones can run a desktop environment and browsers without stutters. That was not always the case 10 years ago, so consumer demand always optimized around maximizing raw performance.
So the fact that ARM had this feature 10 years ago is largely irrelevant for this argument.
It's ARM's performance improvement at the top end in the last 10 years that changed the landscape for the PC industry to a degree, not big-little.
Not to mention, they intend to compete with Apple on transistor density before 2024. Time will tell how successful they are, but I do get a laugh out of the people who are counting Intel out of the game right now. Apple doesn't sell CPUs, they sell Macs. They aren't even competing in the same market segment.