Inference keeps getting cheaper, so "it isn't cheap enough yet" isn't an issue. Even with zero efficiency innovations from here, compute, measured as cost per instruction, is the most deflationary commodity of all time.
So how was that ever going to be a problem?
At the start of a new tech cycle, the optimal choice for marginal costs, which will drop on their own over time, is to run them in the red. It would be a sign of gross incompetence to be fine-tuning those costs already.
Training spend is the giant expense. Either training costs are unsustainable, in which case training spend will hit a pause, or they are sustainable, in which case training spend will continue.
So, which is it?
Critical point: The majority of their costs are not required to serve the highest level of capability they have achieved at any given time.
That is unusual, in the sense that it is an exceptionally healthy cost structure. Note that not even open source offers a cost advantage, for training or for inference.
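To make that cost-structure point concrete, here is a minimal sketch with purely hypothetical numbers (the revenue, inference cost, and training spend figures are invented for illustration, not reported financials): the headline loss comes from spend that buys future capability, not from serving the capability already shipped.

```python
# Illustrative sketch only: all figures are made-up placeholders.
# The point is the structure of the argument, not the numbers.

# Hypothetical annual figures for a frontier lab, in billions of dollars.
revenue = 4.0            # income from serving the current best model
inference_cost = 1.5     # marginal cost of serving that model
training_spend = 5.0     # spend on the *next* generation of models

total_profit = revenue - inference_cost - training_spend
serving_profit = revenue - inference_cost  # if training spend paused

print(f"With training spend:      {total_profit:+.1f}B (in the red)")
print(f"Serving existing capability: {serving_profit:+.1f}B (in the black)")

# Under these assumed numbers, the loss is driven entirely by training
# spend. If training ever became unsustainable and paused, the serving
# business for the already-achieved capability would stand on its own.
```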