These are welcome changes, as the practice of DEI (not its idealization) is actively discriminatory and intolerant of dissenting views. Let competence be the only metric.
If competence were the only metric (or even a metric this administration actually cared about), 90% of its appointees would not have been hired.
It's not an either-or. This administration can have the average IQ of a bdelloid rotifer, but that doesn't mean that every single ideological position of all the people they are mad at is valid.
False. Mass accumulations of capital do allow exploratory research without a clear path to commercial benefit, but such research is a cherry on top, a kind of motivator for researchers.
It's ironic that the much more significant ultimate success of deep learning happened despite a lack of government funding, if Hinton is to be believed. The '90s were a neural net winter, and the eventual breakthrough required faster computation, which was a private-sector success.
I lose zero sleep at the prospect of zero government funding for robotics research. If the advantages are there, profit seekers will find a way. We must stop demonizing private accumulations of capital, "ending" billionaires, and attacking "monopolies" that offer more things at lower cost. Small enterprises cannot afford a Bell Labs, a Watson Research, a DeepMind, a Xerox PARC, etc.
Hinton and his students worked for years on US (and then Canadian) government grants. The year AlexNet came out, Nvidia was awarded tens of millions by DARPA for Project Osprey.
It's an odd historical revisionism where, from Fairchild to the Internet to the web to AI, government grants and government spending are washed out of the picture. The government funded AI research for decades.
I think their point is the billions in private investment which preceded those millions.
I think this is a common issue in computer science, where credit is given to sexy "software applications" like AI when the real advances were in the hardware that enabled them, which everyone just views as an uninteresting commodity.
> I think their point is the billions in private investment which preceded those millions.
But the "billions" didn't precede the "millions". They're just completely incorrect, and anyone that knows even a tiny amount about the actual history can see it immediately. That's why these comment sections are so polarized. It's a bunch of people vibe commenting vs people that have spent even like an hour researching the industry.
The history of semiconductor enterprise in the US is just a bunch of private companies lobbying the government for contracts, grants, and legal/trade protections. All of them would've folded at several different points without military contracts or government research grants. Read Chip War.
You seem to be arguing that the second the government touches anything, everything it does gets credited to the government funding column. Seems simplistic to me, but you can believe what you like. Go back far enough and there was only private industry, and no government funding until the space race basically.
Either way, the fact remains that the billions spent developing GPUs preceded the millions spent to use those GPUs for AI. Not sure what it has to do with polarization of the comment section. I assume it's just people seeking an opportunity to heap abuse on anything close to a representative of the evil "other side".
> Go back far enough and there was only private industry, and no government funding until the space race basically.
How do you think the railroads were built in the US? The bonds of the Pacific Railroad Acts date back to the 1860s. Pretty easy to build a railway line when government foots the bill.
Government funding of research. We were talking about the NSF after all, not free markets versus central planning.
On that, though, I read somewhere that the hierarchical, committee-led operation of the funding agencies mirrors the way communist systems dole out money for everything else. Not sure if they were being completely serious.
From 1901 up to FDR's election in 1932, five Americans won Nobel Prizes in the sciences. There was not much government funding back then, and not much was going on either.
So your argument is that nothing is communism? The fact that it's a single large organization allocating resources is rather key to the whole point. That the same organizational structure is doing it is interesting to me, anyway. I suspected this line of thinking would be too triggering for some people, though.
A corporation is not an economic system, just a tiny participant in one. And I'd describe their decision-making as hierarchical, yes, but driven by middle managers implementing the agendas of higher-ups, not necessarily by committees. When corporations do operate by committee, they tend to be at their worst...
Many industries are uninvestable in their early days. How many get to the point where private funding makes sense without initial government funding for fundamental science and research? Where will we be in 15 years if the government pulls funding from agencies like the NSF? We might find that private money is then funding those future industries in other countries instead.
Seeing the recent tariff fights, and actually learning the story behind some of these industries, I am coming around to the view that other countries take over industries through specific agendas that target those industries and maintain a large degree of monopoly over them. The US has not reacted much because each country only took one industry or so, and it was a way to manipulate or appease them, but it is turning into death by a thousand cuts. I definitely think the US government needs to be a lot more involved than it has been, in a range of ways. That list of ridiculous-sounding cancelled NSF grants wasn't it, though. If you're talking about the SBIR program, that is pretty tiny; I assume it will continue, since it is legally set at 2% or whatever.
> You seem to be arguing that the second the government touches anything, everything it does gets credited to the government funding column.
Absolutely not. This is an obvious bad faith interpretation of my comment.
> Either way, the fact remains that the billions spent developing GPUs preceded the millions spent to use those GPUs for AI.
Again, you're just obviously completely factually wrong to anyone who has even a modicum of casual interest in the history of these technologies.
> Not sure what it has to do with polarization of the comment section. I assume it's just people seeking an opportunity to heap abuse on anything close to a representative of the evil "other side".
And one more time for the people in the back. Anyone with any amount of actual knowledge on the topic at hand can immediately dismiss your entire argument because it isn't based in anything resembling fact. It's just you wishing or hoping that it might be somewhere close to true. This is just that scene from Billy Madison: "Everyone in this room is now dumber for having listened to it. I award you no points, and may God have mercy on your soul."
I wonder if it has more to do with the approachability of software applications. If I even began to think I could compete with NVIDIA by delivering similar hardware, I'd very quickly realize I was an idiot. Meanwhile, as a single individual, there are still a reasonable number of commercial software markets I really do have some chance of tackling or competing in. As software complexity rises, it's becoming far less tractable than it was in, say, the '90s, but there are still areas that individuals and small sums of capital can enter. I think that makes the sector alluring in general.
Hardware is just capital-intensive in general, not even counting all the intellectual capital needed. So it's not that hardware is uninteresting or merely a commodity to me; in my mind it's just a stone wall: whatever is there is there, and that's it.
That difference in difficulties is kind of the point. Imagine, as an extreme, a company makes a machine with certain functions performed based on which button combinations you press. A second company gets a patent for using the first company's machine for doing various tasks by pressing various button combinations, which are new uses of the machine no one had thought of yet. Now the second company has all the bargaining power in the market and so gets giant margins, despite doing a tiny fraction of the work it takes to make those tasks possible.
I wonder if our current system ended up this way because it is the most efficient in terms of specialization, or because the patent system drove things in this direction: the people who deal with customers last (i.e., those making the software layer) have the best information about what tasks customers want to do with their computers, and hence patent the solutions first. That leaves hardware vendors no choice but to serve the software monopolies (one after another since the '80s).
You are suggesting unilateral disarmament. Allowing other nations, not all of them friendly, to take the lead in science and technology as they continue to fund their own research and poach our best and brightest.
Once something has a predictable ROI (can be productized and sold), profit seekers will find a way. The role of publicly funded research is to get ideas that are not immediately profitable to the stage that investors can take over. Publicly funded research also supports investor-funded R&D by educating their future work force.
The provided examples do not clearly support the idea that industry can compensate for a decrease in government-funded basic research. Bell Labs was the product of government action (antitrust enforcement), not a voluntary creation. The others are R&D (product development) organizations, not research organizations. Of those listed, Xerox PARC is the most significant, but from the profit-seeking perspective it's more of a cautionary tale since it primarily benefited Xerox's competitors. And Hinton seems to have received government support; his backpropagation paper at least credits ONR. As I understand it, the overall deep learning story is that basic research, including government-funded research, laid theoretical groundwork that capital investment was later able to scale commercially once video games drove development of the necessary hardware.
A bloc of countries is what makes them far less worrisome. They're too busy competing with each other; none is going to want the others spying on its own citizens for gain.
But in a "you had agency and chose to underinvest in defense" way.
That said, it's pretty unlikely the rest of the world could have defended against a technologically advanced Europe / Middle East / China, at their respective peaks, and especially after transoceanic sail enabled cross-sea logistics.