Last I heard (very recently) the NVidia closed source drivers are still the most practical option, because the AMD closed source drivers are fickle and the AMD open source drivers are incomplete. Wrong?
No, you are dead right. Even though a lot of NVidia's stuff is closed source, the binaries have always had much better support for graphics and power management than the Radeon drivers ever did. I could never get things like graphics hibernate to work as well as it did with NVidia's drivers.
I believe I read earlier that they have promised to open-source a lot of their drivers.
True, if you are really serious about 3D in Linux then NVidia is the better option.
With AMD around, there is still some pressure on NVidia to someday follow suit. With AMD gone, NVidia no longer needs to worry, because Intel really isn't a valid competitor in the professional 3D space.
Right, but it was improving. If there's no high-performance 3D hardware target with a hope of having FOSS driver support, it'll put a stop to any innovation in that space. Hence, this is bad.
> Intel graphics cards don't count if you're doing serious 3D work.
(I started work on the 3D driver team at Intel in July, but everything I say is my own opinion.)
Historically, I think you might be right. Lately (the last few years) I think Intel has significantly improved both its hardware and its drivers.
The i965 driver is the only free software driver that supports OpenGL 3.1. The team I'm on is responsible for a huge number of improvements to the 3D stack, including rewriting the GLSL compiler, which benefits the Radeon and Nouveau drivers as well.
We've had engineers visit Valve in person and make sure that their stuff will work well on our drivers. I'd consider what Valve is doing to be "serious 3D work."
Sandy Bridge and Ivy Bridge are pretty respectable. Haswell will be even better. You should reconsider your position.
> The i965 driver is the only free software driver that supports OpenGL 3.1
And how does it perform compared with ATI and NVidia cards?
Valve is surely doing serious 3D work; that's why, on their Linux blog, they talk about an NVIDIA GeForce GTX 680, not an Intel graphics card.
And let's not forget that L4D is a DirectX 9-level game, so the requirements are pretty low by today's hardware standards.
> Sandy Bridge and Ivy Bridge are pretty respectable.
Not if you're blasting polygons. I have a dual-core system that beats a Sandy Bridge system hands down, thanks to an NVidia GeForce GT 240M.
> You should reconsider your position.
Does Intel's GPA (Graphics Performance Analyzers) finally support OpenGL?
Back in 2009 at Game Developers Conference Europe, I had the pleasure of talking with someone from Intel and telling them that OpenGL support was important. This was after hearing the guy spend an hour explaining how to analyse DirectX with GPA.
Nvidia might find itself in a similar position soon if its Tegra line-up doesn't become significantly more popular and successful. Intel is squeezing them out of the market with its integrated graphics in laptops, and I think that's quite a shame because I don't think Intel played fair there. They tried to force manufacturers to use integrated graphics even if the laptop used discrete graphics. Now that Intel's integrated graphics chips are becoming "good enough" for most people, it's going to be a very smooth transition to take Nvidia out of the laptop market.
So Intel is killing both AMD and Nvidia. Hopefully Nvidia has the last say with its ARM-based chips, though. If ARM ends up disrupting Intel in all its markets, and if Nvidia remains one of the best ARM chip makers, then Nvidia will ultimately win.
That can't be said about AMD, though. They're stuck in an x86 world where they're falling further and further behind Intel, and they haven't even begun to enter the ARM world; even if they do, it might be too late now, because they lack the ARM engineering experience of their competitors. Buying the OMAP division from TI might help, but OMAP isn't doing that great now either, and they always use overclocked last-gen GPUs to compete with the new mobile GPUs, which I find a bit annoying. Plus, I don't know if AMD can even afford them, or the turnaround that comes with that. Another interesting acquisition target could be MIPS, which is for sale, but AMD would then also have to promote the benefits of that architecture on their own, which seems even harder.
NVIDIA still owns the high-end scientific computing market. That, combined with their ARM products, could save them in the future. AMD... their only advantage has been x86, and that is a dying market; kudos to the board at least for dumping the CEO who wanted to focus on the high-end PC market. AMD still has really good GPU tech, enough to compete with NVIDIA in the high-end non-GPGPU market, but that is very niche.
Actually, I fear for NVIDIA's HPC market as well. Kepler 2, in my eyes, is not well suited for GPGPU; they've optimized it for the gaming market in order to go after AMD. That is, they introduced many more cores, but those cores now have less cache and fewer register resources available to them. Fewer registers, especially, is deadly - this has already been a limiting factor for many HPC applications.
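For context on why the register count matters so much: on NVIDIA GPUs the register file of each multiprocessor is shared by all resident threads, so a kernel that needs many registers per thread caps how many threads can be in flight at once (occupancy), and occupancy is what provides the latency hiding HPC codes depend on. Below is a minimal sketch of how you could inspect this with the CUDA runtime's occupancy query; the register_hungry kernel is invented purely for illustration, and the occupancy call is a real runtime API, though it only appeared in CUDA toolkits newer than what was current when this thread was written.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Toy kernel (hypothetical): the live accumulator array tends to push up
    // per-thread register usage, the resource the parent comment worries about.
    __global__ void register_hungry(const float *in, float *out, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;
        float acc[16];
        for (int k = 0; k < 16; ++k) acc[k] = in[i] * (k + 1);
        float s = 0.0f;
        for (int k = 0; k < 16; ++k) s += acc[k] * acc[k];
        out[i] = s;
    }

    int main() {
        const int blockSize = 256;

        // How many blocks of this kernel fit on one multiprocessor?
        // Fewer registers per thread -> more resident blocks -> more latency hiding.
        int blocksPerSM = 0;
        cudaOccupancyMaxActiveBlocksPerMultiprocessor(
            &blocksPerSM, register_hungry, blockSize, 0 /* dynamic shared mem */);

        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, 0);
        printf("registers available per block: %d\n", prop.regsPerBlock);
        printf("resident blocks of %d threads per SM: %d\n", blockSize, blocksPerSM);
        return 0;
    }

Compiling with nvcc --ptxas-options=-v reports the registers-per-thread count; the usual knobs are __launch_bounds__ or -maxrregcount, which reduce register pressure at the price of spilling values to slow local memory - exactly the trade-off being described here.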
Meanwhile Intel is pushing out its Xeon Phi boards with huge memory bandwidth and OpenMP support. If there isn't some kind of surprise in terms of performance on either Intel's or NVIDIA's side, I see dark clouds coming for NVIDIA.
Well Intel recently introduced its Xeon Phi co-processor, which according to them offers about the same performance as Nvidia's chips, while not needing to code in CUDA. But I never trust Intel's marketing anyway because they always seem to exaggerate in some way or be misleading on purpose, so we'll see how that goes. Plus, I'm interested to see what comes out of Nvidia's Project Denver or whatever they are calling it now (Nvidia's 64-bit custom ARMv8 SoC for HPC and servers, which should arrive in 2014).
>But I never trust Intel's marketing anyway because they always seem to exaggerate in some way or be misleading on purpose, so we'll see how that goes.
I think you can safely remove "Intel's" from that sentence and it remains as accurate, or perhaps more so. I really don't mean to hate on marketing, but their job is to sell a story. Good marketers walk that fine line between outright lying and carefully walking the "happy path" that makes whatever they are selling the cure for what ails you and fails to mention the problems. In general once you dig into things you almost always start finding the trade-offs/drawbacks that marketing "forgot" to mention. Thomas Sowell was right when he said: "There are no solutions ... only trade offs".
1. Intel is safe for the foreseeable future; they have a unique position as far as manufacturing and processes go.
2. I suspect the engineering barriers in developing an OMAP-class processor are much lower than what AMD has to deal with in the x86 and graphics markets. There is a large amount of off-the-shelf IP for mobile application processors, and they have lots of internal knowledge they could use to customize the ICs (graphics, core architecture).
The problem there is that I'm not sure who the buyers are - the two giants (Apple and Samsung) are making their own processors, so unless you're making cut price ICs (a very crowded market) you are somewhat limited. I don't think the Tegra is bad from a technical standpoint, it's just that the number of really high volume customers is not infinite.
Nevertheless, I'd love to see AMD stepping away from the x86 game into a wide ARM portfolio. I'm sure ARM would play ball; it'd be a major win for them, and with ATI's graphics/HPC expertise we could see killer solutions.
That might not last as long as you think. Apparently GlobalFoundries thinks it can reach parity with FinFET 14 nm chips in 2014, which is Intel's (new) timeline as well. I think this is happening for two reasons: Intel has experienced delays of a few months with each of its latest generations, and GloFo has also found a shortcut to the 14nm process:
"By moving to 14nm finfet while keeping 20nm interconnect, GF has brought forward the introduction of its 14nm process by one year."
When 3dfx started flailing, they sued NVidia for IP infringement until NVidia purchased them. I could see the board of AMD doing the same thing, patent trolling everyone in sight until somebody buys them out.
Right now, Intel's and ARM's businesses are very much segregated. Each of the two is making steps into the other's territory, but with relatively meager success.
To clarify: ARM's business is the embedded and booming smartphone market; Intel has the never-going-to-cease-to-exist high-performance server and workstation market. The fate of the thing in between, consumer and "office" PCs, is very much unknown.
Cloud-based data centers are beginning to adopt ARM for power-consumption reasons. If that trend picks up, along with everyone using cloud providers, then I think even the high-end PC server business could start dying. Incidentally, this is a market where NVIDIA could begin to really cream Intel; a lot of scientific computing has already moved over to CUDA/Tesla (granted, there is a huge difference between scientific and enterprise applications). I don't see the same thing happening to workstations, however; it's difficult to move that work into the cloud.
Apple? They can afford it, and they have already demonstrated interest in having their own chips. At the current valuation, it wouldn't cost much more than ATI did when AMD bought them.
Well, Apple is certainly an interesting choice, and maybe if this had happened like two years ago, they might've done it, too. However, I think they are too far along building their ARM SoC for Macs right now, and it would take them more than two years to put AMD's chips on the right track. Plus, if Apple bought them, would they even be allowed to keep them for themselves? Wouldn't that leave Intel as the only x86 supplier for Windows devices?
Having ATI's GPU on their SoC might be an upgrade. I am not convinced the Mac line will move to ARM, because of Boot Camp and Microsoft's restrictions on ARM platforms.
I don't know how much Apple really cares about the ability to run Windows on their hardware. I always had the impression that Boot Camp was done just as a bonus because it was easy to do, not because they saw dual-booting as something essential. Even if they did, that whole area is waning fast. I wouldn't be surprised if they tossed Boot Camp aside without a second thought.
Apple would be dropping the ball if they went to ARM, especially for all the people using MS products. Not to mention they would essentially destroy the businesses of VMware Fusion and Parallels, whose virtualization environments many, many people still use to this day in large enterprise organizations to maintain a dual-environment compatibility layer. If they want to throw out their enterprise market, then this would be a great way to go about doing it.
I think they know how many machines are using Boot Camp or VMware/Parallels, and it isn't a small percentage. These are the gateway products that allow a lot of folks to buy a Macintosh.
I think the people with the most interest in keeping AMD alive would be Intel's big customers who are not resellers. I'm thinking of people like Amazon, Google and Facebook.
Very good point. It's telling. Screw the company, they say, we're going to inflate this thing, take our profits, and go to the next con.
In this new world we're watching unfold, boards no longer do what is in the interest of the company. This guy puts it well:
2)"Companies should not be run in the interest of their owners." Shareholders are the most mobile of corporate stakeholders, often holding ownership for but a fraction of a second (high-frequency trading represents 70% of today's trading). Shareholders prefer corporate strategies that maximize short-term profits and dividends, usually at the cost of long-term investments. (This often also includes added leverage and risk, and reliance on socializing risk via 'too big to fail' status, and relying on 'the Greenspan put.') Chang adds that corporate limited liability, while a boon to capital accumulation and technological progress, when combined with professional managers instead of entrepreneurs owning a large chunk (eg. Ford, Edison, Carnegie) and public shares with smaller voting rights (typically limited to 10%), allows professional managers to maximize their own prestige via sales growth and prestige projects instead of maximizing profits. Another negative long-term outcome driven by shareholders is increased share buybacks (less than 5% of profits until the early 1980s, 90% in 2007, and 280% in 2008) - one economist estimates that had GM not spent $20.4 billion on buybacks between 1986 and 2002 it could have prevented its 2009 bankruptcy. Short-term stockholder perspectives have also brought large-scale layoffs from off-shoring. Governments of other countries encourage longer-term thinking by holding large shares in key enterprises (China Mobile, Renault, Volkswagen), providing greater worker representation (Germany's supervisory boards), and cross-shareholding among friendly companies (Japan's Toyota and its suppliers).
The real problem showed up when hedge funds became so popular. People putting money into their 401(k)s care a lot about long-term gains; people running hedge funds care mostly about short-term gains. Hedge funds end up filling board seats, not small-time investors.
It's actually a common problem with "money managers", as they generally share far more of the upside than the downside risk. Yet there is no way to consistently beat the market, so taking on trading risk is really their only option.
Really, high-frequency trading is the most common sort of trading? That is absolutely astonishing. Do you think that says anything about how much stock (by valuation) is invested long-term?
Even if, say, 95% of stock is held in long-term investments, high-frequency trades will dominate the metric "% of trades executed", since high-frequency trading holds the stock for less than 1/20th of the time long-term trading does.
Suppose num_long shares of stock are held in long-term investments, as a permanent feature of the economy. Suppose further that the average duration of a particular long-term investment is two years. Then every two years, num_long trades will occur under the banner of "long-term investing".
Suppose num_hft shares of stock are held in HFT investments, and that the average duration of an HFT investment is one day. Then every two years, 730.5 * num_hft trades will occur under the banner of "high-frequency trading".
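To put hypothetical numbers on it (the 95/5 split is only an illustration, borrowing the 95% figure from above): take num_long = 0.95 * N and num_hft = 0.05 * N for some total N shares. Over two years that gives 0.95 * N long-term trades versus 730.5 * 0.05 * N ≈ 36.5 * N high-frequency trades, so roughly 36.5 / (36.5 + 0.95) ≈ 97.5% of executed trades would be HFT even though only 5% of the shares are held that way.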
But who cares what the % of trades are? They don't get more weight as a stakeholder because they are trading a lot. In fact they have absolutely none, but that doesn't diminish the control held by the 95%.
News like this worries me because I've always imagined the world's improvements in mass market technology being driven by competition between major players like Intel and AMD.
In other words, I'm afraid that the loss of competition for Intel will slow the approach of the future.
The other sources of competition that Intel faces might be strong enough to motivate it, but I'm not as familiar with the influences that, say, ARM chips and Apple's own CPU development have had on Intel.
In desktops and servers, Intel has mostly been in competition with itself for about 5-6 years now. If it can't engineer faster parts, it can't get people to upgrade, which means no sales. I wouldn't worry about that too much.
This is more true than you'd think, unless you know how Intel is organized and operated.
When someone from Intel visits your company to discuss future product directions and you sign an NDA, they give you some documents with scary-looking bright yellow "CONFIDENTIAL - DO NOT DUPLICATE" covers.
The thing is, though, they don't much care if you turn the document over to AMD, Samsung, or whoever. They might yell at you or sue you or take you off of their Christmas card list, but they'll get over it. What the program manager is absolutely terrified about is that another manager at Intel will hear about the project and knife it.
Intel is a Darwinian dystopia. When Andy Grove titled his book, "Only the Paranoid Survive," he wasn't joking around or being overdramatic. It's an interesting company, quite unlike any other I've ever heard about.
Down goes the only graphics manufacturer offering proper graphics cards with documentation for open source developers.
Intel graphics cards don't count if you're doing serious 3D work.