This post is written with its intellectual fly open. I'm not sure whether it was partly AI-generated, or whether the author has spent so much time ingesting AI-generated content that the tells have rubbed off, but this article has:
- Strange paragraph-lists with bolded first words. e.g. "The Cash Flow Mystery"
- The 'It's not just X; it's Y' meme: "Buying Groq wouldn't just [...], it could give them a chip that is actually [...]. It’s a supply chain hedge."
Tells like:
- "My personal read? NVIDIA is [...]"
- "[...]. Now I'm looking at Groq, [...]"
However, even if these parts were AI-generated, the post is simultaneously riddled with typos and weird phrases:
- "it looks like they are squeezing each other [sic] balls."
- Stylization of OpenAI as 'Openai'.
Not sure what to make of this low-quality prose.
Even if the conclusion is broadly correct, that doesn't mean the reasoning used to get there is consistent.
I do, at least, appreciate that the author was honest up-front with respect to use of Gemini and other AI tools.
It does amuse me when you have great, clean writing in some parts of a post, but then you have a sentence like
> As we head into 2026, when looking at Nvidia, openai and Oracle dynamics, it looks like they are squeezing each other balls.
Yeah I don't think there's a snowball's chance in heck that an LLM wrote that one, lol. My best guess is that the author combed over some of their prose with an LLM, but not all.
On the contrary, I've recently encountered many cases where ChatGPT randomly switched to a very familiar, casual style for a few sentences. It has a strong Reddit vibe when it does this, which I guess is not surprising.
I've seen this happen when AI is asked to spot-fix a single sentence in the middle of an essay and it fails to maintain the style of the rest of the writing.
If someone generates a ten thousand word slop essay with AI, do I have a moral obligation to critique its reasoning step by step instead of merely pointing out its origin?
If I do, it just so happens I have a ten thousand word rebuttal for you…
In another post:
https://philippeoger.com/pages/why-googles-tpu-could-beat-nv...
in the paragraph:
"What This Means for the Future
NVIDIA is not standing still; its Q3 Fiscal 2026 earnings were a record $57.0 billion in revenue, with Data Center revenue hitting $51.2 billion. But the growing adoption of TPUs introduces a long-term risk to NVIDIA's core business model."
they write about 2026 as if it had already happened. Could be a human typo or an AI mistake.
Nvidia's (fiscal) Q3 2026 financial statements have been released, and are what this is all about. Fiscal years may be correlated with calendar years, but sometimes (as in this case) the correlation is rather elastic.
I'm also unsure why they took a sudden tangent from the topic at hand to suggesting Oracle should buy Groq. It's like two separate half-baked blog posts merged into one, with no real segue or conclusion as to why that sudden hard left was relevant or meaningful.
1. I haven't commented on HN in a while and didn't want to dig up my password. Throwaway accounts are a tradition.
2. I don't want people to see my disparagement of the quality of prose in this article as indication of personal agreement or disagreement with any of the points in the article. I have no horse in this race. I just want to read high-quality material. I love HN, but I'm not sure how much longer HN will be a place I can frequent in this respect. Have the hills not eroded? What of childlike curiosity?
3. My comment is nothing special. Others also point out portions of this article may be AI generated. People can verify the contents of my comment independently and come to their own conclusions. It does not require that I lean on implied authority of some form.
I read a lot, it's basically all I do. I wish writers maintained the contract of spending at least as much energy writing out ideas as they expect their audience to expend while reading them.
I will now log out of this account and lose the password. I hope this was helpful. I intend no malice; I'm sure the author of this piece is a kind person and fun to hang out with. I hope they take this feedback the right way.
The problem with #2 and #3 is that they are equally, if not more, valuable to those wishing to hide their identity for nefarious reasons; i.e., that nameless "just makes you think, doesn't it" kind of farming. Identity (as in an account history) provides not only a community here, but also easily rules out a lot of malicious-wordsmithing concerns. Not that I place any of that on you in this case; it's just that the general use of throwaways is perhaps not as net-good for the site as advertised above.
Nothing wrong with #1 on its own of course, but if we're talking about what we'd like to see here then I'd lean more towards the value in discussions with individuals in the community than the value in the prose of the articles/comments.
>I love HN, but I'm not sure how much longer HN will be a place I can frequent in this respect. Have the hills not eroded? What of childlike curiosity?
Culture shifts, even if you want to pretend you're beyond trends. I know HN wants to say "we're not Reddit", but cultural osmosis from Reddit and the internet at large will change how you interact even here.
That said:
>I wish writers maintained the contract of spending at least as much energy writing out ideas as they expect their audience to expend while reading them.
Maybe they did. The thing is, it's rare to be both a skilled writer and a highly technical person (i.e., the audience here). I could spend 20 hours writing this piece (after researching) and it'd be worse than the work of someone who spent 2 hours on it but basically writes full time. Don't confuse aptitude with effort.
I pasted two paragraphs into GPTZero and got the following results: 19% AI, 65% a mix of human and AI, and the remaining 16% human. As I wasn't logged in, I did not get any other details.
> However, Groq’s architecture relies on SRAM (Static RAM). Since SRAM is typically built in logic fabs (like TSMC) alongside the processors themselves, it theoretically shouldn't face the same supply chain crunch as HBM.
It's true SRAM comes with your logic: get a TSMC N3 (or N6, or whatever) wafer and you've got SRAM. Unfortunately SRAM just doesn't have the capacity, so you have to augment with DRAM, which you see companies like D-Matrix and Cerebras doing. Perhaps you can use cheaper/more available LPDDR or GDDR (Nvidia has done this themselves with Rubin CPX), but that also has supply issues.
Note it's not really parameter storage that gets you (you can amortize parameters over multiple users); it's KV-cache storage, which scales with the user count.
Now, Groq does appear to be going for a pure SRAM play, but if the readily available pure-SRAM option comes at some multiple of the capital cost of the DRAM option, it's not a simple escape hatch from DRAM availability. (Rough numbers below.)
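To put the KV-cache point in numbers, here's a minimal back-of-envelope sketch in Python. The model shapes (80 layers, 8 KV heads, head dim 128, roughly a Llama-70B-class config) and the 32k context are illustrative assumptions on my part, not anything Groq-specific:

```python
# Rough KV-cache sizing, to illustrate why capacity scales with user count.

def kv_cache_bytes(n_layers, n_kv_heads, head_dim, seq_len, bytes_per_elem=2):
    # Per token, per layer: one K and one V vector of n_kv_heads * head_dim elems.
    per_token = 2 * n_layers * n_kv_heads * head_dim * bytes_per_elem
    return per_token * seq_len

# Assumed Llama-70B-like shapes: 80 layers, 8 KV heads (GQA), head_dim 128, FP16.
per_user = kv_cache_bytes(n_layers=80, n_kv_heads=8, head_dim=128, seq_len=32_768)
print(f"KV cache per user at 32k context: {per_user / 2**30:.1f} GiB")  # 10.0 GiB

for users in (1, 16, 256):
    total_gib = users * per_user / 2**30
    print(f"{users:4d} concurrent users -> {total_gib:,.0f} GiB of KV cache")
```

The weights are a one-time cost shared across every user on the box; that ~10 GiB of KV cache is per concurrent user, which is why capacity (and not just bandwidth) bites at scale.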
SRAM scaling also hit a wall a while ago, so you can't really count on new processes allowing for significantly higher density in the future. That's more of a longer-term issue with the SRAM gambit that'll come into play after the DRAM shortage is over though - logic and DRAM will keep improving while SRAM probably stays more or less where it is now.
You can still scale SRAM by stacking it in 3D layers, similar to the common approach now used with NAND flash. I think HBM DRAM is also directly die-stacked to begin with; apparently that's the best approach to scaling memory bandwidth too.
It'll be interesting to see if we get any kind of non-NAND persistent memory in the near future, that might beat some performance metrics of both DRAM and NAND flash.
NAND is built with dozens of layers on one die. HBM DRAM is a dozen-ish dies stacked and interconnected with TSVs, but only one layer of memory cells per die. AMD's X3D CPUs have a single SRAM die stacked on top of the regular CPU+SRAM, with TSVs in the L3 cache to connect to the extra SRAM. I'm not aware of anyone shipping a product that stacks multiple SRAM dies; the tech definitely exists but it may not be economically feasible for any mass-produced product.
The issue is size: SRAM is 6 transistors per bit, while DRAM is 1 transistor and a capacitor. Anyone who wants density starts with DRAM. There's never been motivation to stack.
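For a rough sense of what 6T vs. 1T1C means in practice, here's a toy density comparison. The bitcell areas are ballpark figures I've assumed for illustration, not datasheet values, and real arrays lose a large fraction of this to sense amps, decoders, and wiring:

```python
# Back-of-envelope bitcell density comparison (assumed ballpark cell areas).

UM2_PER_MM2 = 1e6

cells = {
    "SRAM (6T, ~5nm-class logic)": 0.021,   # assumed bitcell area in um^2
    "DRAM (1T1C, ~6F^2 @ F=16nm)": 0.0016,  # assumed bitcell area in um^2
}

for name, cell_um2 in cells.items():
    mbit_per_mm2 = UM2_PER_MM2 / cell_um2 / 1e6
    print(f"{name}: ~{mbit_per_mm2:,.0f} Mbit/mm^2 raw "
          f"(~{mbit_per_mm2 / 8:,.1f} MB/mm^2)")

# Under these assumptions DRAM comes out roughly an order of magnitude denser
# per unit area, before considering that DRAM wafers are also far cheaper
# than leading-edge logic wafers.
```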
> AMD's X3D CPUs have a single SRAM die stacked on top of the regular CPU+SRAM, with TSVs in the L3 cache to connect to the extra SRAM.
Just FYI, the latest X3D flipped the stack; the cache die is now on the bottom. This helps transfer heat from the compute die to the heatsink more effectively. In armchair-silicon-designer mode, one could imagine this setup also opens the door to multiple stacked cache dies: since they already interpose all the signals, why not add a second one? But I'm sure it's not that simple; for one, AMD wants the package z-heights to be consistent between the X3D and normal chips.
I agree with your description and conclusion. Additionally the companies that can make chip stacks like HBM in volume are the HBM manufacturers. As they are bottlenecked by the packaging/stacking right now (while also furiously building new plant capacity) I can't see them diverting manufacturing to stacking a new SRAM tech.
You really don't need to overthink it: the operating costs far surpass the funding available, and the value being created isn't significant enough (in future-cash-flow terms) to outpace those costs (though this is tech, so while I don't expect it, a miracle advancement would invalidate this).
Open to being corrected on this stance as I'm armchair qb'ing this (on a legitimate, factual basis, not a "but dude this is how I'm going to eat for the next few years" basis).
The critiques of 'circular funding' don't really make sense to me. If you invest 20 billion and you get back 20 billion, your profit is the same. Sure, your revenues look higher, but investors have access to all that information and should be taking it into account, just like all the other financial data.
Michael Burry is betting against AI growth translating into real profits as a whole, not the circular funding.
The problem is that stocks are often valued and traded on revenue growth, not profit.[0] So circular funding generates stock-price bumps when, as you said, there's no inherent value underneath. That's a recipe for a crash.
[0] Consider PagerDuty: incredibly profitable with little revenue growth, trading at 1.5x revenue, while high-revenue-growth, unprofitable companies are trading at 10x revenue.
I feel like it's almost more of a popular-stock thing. Consider if PagerDuty eked out an empty deal with any one of the "pop stocks", one with little impact on their real profitability. Would the stock trade differently or better? It feels like it really would in the modern market. Even if the numbers weren't a big change, the buzz would be.
Both are taken into account. Potential profitability is taken into account with growth companies, and circular funding has no effect on that. With unprofitable companies, the case is made on how risky the company is and what the potential profit will be in the future.
I would disagree, at least in the short term. Exhibit A: AMD's stock rose 36% at the announcement of their OpenAI circular deal. If 1+1 = 3 and there is potential profit to be gleaned from such a deal, then it isn't circular and is just plain good business. But the fact that AMD's stock collapsed back to where it was shortly after suggests otherwise.
This isn't to do with the deal being circular. It's more that AMD is thought to be falling behind in the AI race, and OpenAI doing a deal with them is a strong indicator that they might have the potential to come back.
The deal allows OpenAI to purchase up to 6GW of AMD GPUs, while AMD grants OpenAI warrants for up to 10% equity tied to performance milestones, creating a closed loop of compute, equity, and potentially self-funding hardware purchases. Circular.
On the announcement alone, AMD's stock rose to a level that effectively canceled out whatever liabilities they were committing to as part of the deal, so it was all gravy, despite being just a press release.
Why is that generous? This is clearly showing OpenAI's belief in AMD, which in turn would give investors a large amount of confidence. A lot of that market cap came from Nvidia, which lost around 50B that day while AMD gained 70B in market cap. It all makes sense to me.
Where do you see the 70B being erased? In any case, it's also plausible that confidence changes given a constant stream of new information, so I don't see how it would be problematic if the stock fell back on new information.
Why? Wash trading is about selling and then reacquiring the same asset for tax purposes. How is this analogous, other than that you presumably dislike both practices?
In crypto, wash trading usually refers to the practice of exchanges or project creators colluding to trade the same asset back and forth in order to make the volume/liquidity/popularity look greater than it is.
- "Our coin hit $100M daily volume, get on this rocketship before it's too late!"
- "Our exchange does $1B annually, so you know we're trustworthy!"
- "Hey investors, look at the massive demand for our GPUs (driven by the company we invested $100B)!"
Nvidia is buying customers that will likely have increasing need for Nvidia. Those investment dollars will be spent on Nvidia. Future dollars will be spent on Nvidia.
Second order effects are that everyone serviced by AI today will need even more AI tomorrow. Nvidia is there for that. They're increasing AI proliferation.
By increasing the number of engineers, dollars, watts spent on GPU, Nvidia grows its market.
The added benefit here is that Nvidia gets to share in the upside if any of these companies succeed in their goals.
It's as if Microsoft had Azure back before the dotcom boom and made investments in Google, Amazon, and Facebook in exchange for hosting them. (And maybe a few misfires, like WebVan.)
Burry's critique is that the Nvidia funding deals have them investing money in a company and getting both stock in that company and their own money back as chip purchases. They then book the chip sales as revenue, but they don't show the investment as a cost, since investments are treated separately from an accounting perspective. So it looks like they're growing revenue organically at no cost, while that doesn't seem logically consistent with what's actually happening.
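A toy sketch of that asymmetry, with made-up round numbers; this is a cartoon of the critique, not a claim about Nvidia's actual books:

```python
# Cartoon of the accounting asymmetry: the outflow is an asset, the inflow is revenue.

investment = 100.0   # cash invested in the customer
chip_sales = 100.0   # the customer spends it back on chips
cogs = 25.0          # assumed cost of building those chips (~75% gross margin)

# Income statement: the sale shows up as ordinary revenue and profit.
revenue = chip_sales
gross_profit = revenue - cogs
print(f"revenue {revenue:.0f}, gross profit {gross_profit:.0f}")      # 100, 75

# Balance sheet: the cash sent out the door is not an expense; it sits
# as an equity stake, so reported profit is untouched by the outflow.
equity_stake = investment
net_cash = -investment + chip_sales - cogs
print(f"equity stake carried at {equity_stake:.0f}, net cash {net_cash:+.0f}")  # 100, -25
```

So the income statement shows 75 of gross profit while the company is actually 25 of cash lighter, holding paper in the customer to make up the difference.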
Burry's critique is even more general than that when it comes to tech companies doing accounting fraud. It's his argument as to why "the market doesn't make sense" and his bets have failed -- which is why I'm not sure anyone would summarize it as "betting against AI growth translating into real profits as a whole"
The truth is you can't properly account for these transactions. If they are making legitimate equity investments (ie, that an independent investor would reasonably make) it's all fine. If they are investments that don't hold water, it's fraud.
It's not that different to any type of vendor financing. Vendor financing is legit, if done legitimately.
It's worse than that. One side of the "circle" is $40 billion, the other side is $300 billion. Why not just net them out and say $260 billion is going one way?
The real story is that Nvidia is accepting equity in their customers as a payment for their hardware. "What, you don't have cash to buy our chips? That's OK, you can pay by giving us 10% of everything you earn in perpetuity."
This has happened before; let's call it the "selling the goose that lays the golden eggs" scam. You can buy our machine that converts electricity into cash, but we will only take preorders; after all, it is such a good deal. Then, after building the machines with the preorder money, they of course plugged the machines in themselves instead of shipping them, claiming various "delays" in production. Here I'm talking about bitcoin mining hardware when it first appeared.
Nvidia is doing a similar thing; just instead of doing it 100% themselves, they are ~10% in by acquiring equity in their customers.
> Here I'm talking about bitcoin mining hardware when it first appeared.
Even better: we take preorders, and while we "delay" for a year, we run the ASICs ourselves with wildly outsized TH/s compared to the rest of the world. Once we develop the next generation, we release the "new" one to the public with a tenth of the hashpower.
I've run into this before in other industries as well. Sports franchises are notorious for expecting any company doing work for the franchise to spend some of the money earned back with the franchise in the form of advertising, suites, etc., to the point that the vendor makes very little money, if any.
You are right: Michael Burry is betting on AI growth being underwhelming. But if he convinces other investors that AI is bad as a general investment, for whatever reason (like the "circular economy" meme), then he stands to make a nice profit sooner. What's not to like?
It's certainly a problem when circular investment structures are used to get around legal limits on the amount of leverage or fractional reserve, or to dodge taxes from bringing offshore funds onshore.
Plenty of sneaky ways of using different accounting years offshore to push taxes forward indefinitely too, since the profit is never present at the year end.
If you invest $100B and get back $40B in sales, you're investing $60B of money and $40B of your products. This is simple stuff. The question is whether or not it is a good investment. Probably not.
There's a lot of "shoulds" that go out the window when you're basically in a hype cycle. We're high stakes rolling at this point. It's a matter of when the house goes broke.
>Michael Burry is betting against AI growth translating into real profits as a whole, not the circular funding.
No. Your revenue increased by 20bn and your profit increased by (for argument's sake) 5bn. You also have 20bn of investments that you then need to value.
Is that bad? It depends. Did the purchase of chips make sense? Would they have bought them if someone else, say an independent entity, had invested?
This isn't an investing site, but Coreweave is what I watch. All those freaking datacenters have to get built, come online, and work for all the promises to come true. Coreweave is already in a bit of a pickle; I feel like they are the first domino.
/not an investing/finance/anything to do with money expert.
Yeah, I saw this critique show up a few months ago and now I'm seeing it everywhere, even on major financial news sites like Bloomberg.[0] It's certainly worth discussing, but people are taking it as a gotcha to prove the AI boom is fake. However, all the AI companies have to buy from Nvidia anyway. And Nvidia has tons of cash; in fact, it has 4x the cash on hand now that it did in 2023, despite all the investments.[1] So yes, if they think the AI market will grow, then of course they will buy into it. If all of Nvidia's deals went bad, their stock would plummet, but not because they lost a few tens of billions; rather, because that would mean the AI market is going down in general. There is a great counterexample to the "AI is propped up by circular funding" argument in Google, which uses its own TPUs, builds its own AI, and integrates it into its own end-user products, no circular deals needed. If AI is propped up by anything, it is investors and companies thinking it will give them a huge return. Circular deals are a result of that: cash is going everywhere in that market, it's that simple. The AI boom may be a bubble, but not due to circular deals in particular.
I'm not sold on the circular funding argument either, though it certainly wouldn't surprise me if it (or some other form of corruption/collusion) turned out to be true to some extent. Personally, I firmly believe that Silicon Valley jumped the gun early on AI investment for fear of being left behind, over-estimating the potential of LLM-based AI (at least in the short term). Now they're stuck in the awkward position of not being able to admit it without shaking investor confidence, which they can't afford while they still need significantly more investment to mature the tech to the point where it starts paying significant returns relative to that investment.
> Michael Burry is betting against AI growth translating into real profits as a whole, not the circular funding.
It's not even so much that he's betting against that translating into profits, but rather that the pace of infrastructure investments is too out of sync with the timeline of realizing those profits, and also that throwing money at the problem doesn't necessarily move that break-even ROI timetable forward in a sustainable way (beyond a certain point).
That's what popped the dotcom bubble. It was the fundamental fallacy that potential profits and revenues were directly proportional to and/or dependent on investment, and more specifically the belief that more investment yielded not just greater returns but greater returns sooner, which just wasn't true, at least not beyond a certain point. So while many people associate the Pets.com flop with the dotcom bubble, it was actually over-investment in and by Cisco (chiefly, but not solely) that really precipitated the bubble bursting.
A lot of people see lots of parallels with the AI bubble in that context. If the ROI timeframe is greater than the viable lifecycle of hardware bought today, how wise is it to spend big today? Does it accelerate the timeframe if you spend more, and if so by how much, and up to what point? There's also something to be said about market momentum and strategic positioning, but that's hard to quantify, especially in the context of forecasting how impactful it will be on realizing your ROI at some indefinite point in the future.
0% NET accretive profit - the OP was saying that the invest/return wash doesn't affect prior profitability, just revenue. Obviously, the new profitability inclusive of the new revenue will actually be lower because of the zero-margin wash trade.
But you haven't made any money either. That's what "profit" means.
Also, "the asset" here means stocks of a company that is losing billions dollars per year. OpenAI has no clear path to become profitable, especially given the fact that Google has just leaprfroged them with their Gemini 3 model.
Just because it's legal and in the open doesn't mean it's sound or not creating perverse incentives. Investors that "should be taking that into account" probably are, and hoping that they come out on top when the bubble bursts. That means pain for many people. Those are very valid reasons to point the finger and criticize.
> Since SRAM is typically built in logic fabs (like TSMC) alongside the processors themselves, it theoretically shouldn't face the same supply chain crunch as HBM.
DRAM and logic fabs are both sold out so replacing one with the other doesn't really help. And SRAM uses ~6x more silicon area than DRAM.
Groq may be undervalued but not for supply chain reasons.
The specifics of the article make zero sense. Net income and operating cash flow are not the same thing, so there is no mystery about why they are different, especially in a business with large capex and long lead times.
NVIDIA's historic DSO figures have also ranged from 41 to 57 over the last 5 years, so again not that crazy.
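For anyone unfamiliar: DSO (days sales outstanding) is just receivables scaled to days of revenue. A quick sketch with placeholder figures; the receivables number here is invented for illustration, not pulled from a filing:

```python
# Days Sales Outstanding: how many days of revenue are sitting in receivables.

def dso(accounts_receivable, quarterly_revenue, days_in_quarter=91):
    return accounts_receivable / quarterly_revenue * days_in_quarter

# e.g. an assumed $30bn of receivables against $57bn of quarterly revenue:
print(f"DSO ~= {dso(30, 57):.0f} days")  # ~48 days, inside the 41-57 historic range
```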
> Net Income and Operating Cash Flow are not the same thing
This is the second article that gained a lot of traction on HN in the last few months with similar fundamental errors. The previous one was similarly written by someone with a dev background where they didn't understand the difference between gross margins, operating profit, and net income–concepts which were the basis for their analysis.
I wonder with this author or the previous, if they're just a little too confident in their abilities. Accounting and Finance are very deep and highly technical fields. Particularly for publicly traded companies where the businesses tend to be very complex and sophisticated and the consequences for making mistakes or committing fraud are severe. I'm not suggesting that non-finance people shouldn't try to analyze this stuff, but some humility might be in order.
It reminds me of a data scientist I worked with who was doing some modeling on companies' revenues. He believed he'd uncovered some secret, conspiratorial truth about Samsung having trillions in revenues, thereby making it the largest and most powerful entity on Earth. We had to gently inform him that he was looking at the revenues denominated in Korean Won and they needed to be converted to USD.
Don't get me wrong, I love SWEs, devs, DSs, etc. and I think they do tend to be much smarter than the average. But, in my experience, they also exhibit higher than average hubris.
Not a big fan of the circular observation. It's not the gotcha people seem to think.
If the baker sells bread to the butcher, and the butcher sells meat to the baker then they can still both go to bed with a belly full of sandwich (aka actual utility & substance).
Adding a third party to make it look more circle-y doesn't change that logic.
Round trip financing is mostly an issue if it is artificial (e.g. a circle of loans) and between affiliated parties, not when something of substance is delivered. Oracle is a business partner of nvidia but I'd wager they'll still kick up a fuss if they don't get their pallets of GB200s. They'll expect actual delivery...like you know...in a real sale.
That's the wrong analogy, because the butcher is giving money to the baker for the bread. If we fix the analogy, then the baker gives money to the butcher so that he buys the baker's bread with that money. The butcher cannot afford the bread without it!
Isn't this type of "circular funding" equivalent to bartering? And why is this a problem?
Basically, the gold rush has reached a point where the shovel seller wants a stake in the operation.
I wonder if we would one day have a situation where NVIDIA no longer wants to sell chips to anyone and just use them themselves. Some special developments would have to occur to reach this point I think.
The circular funding is concerning, but more concerning are suggestions that supply might be vastly exceeding demand. Not that people don't want chips, but that chip production now exceeds the ability to power the chips up and use them. The shortage is power and data-center racks ready to go. Folks are running numbers suggesting there's a bunch of chips now just sitting around.
Combine that with some cooling from an AI-hype bubble burst (see separate articles about companies missing quotas as folks aren't buying as much AI as the hype promised) and there's a potential ugly future where headline demand plummets on top of idle chips waiting to be powered on. Suddenly the market is flooded with chips nobody wants.
Isn't circular funding how the entire economy works?
I can see how you could make an argument that this particular ouroboros has an insufficient loop area to sustain itself, or more significantly, lacks connection to the rest of the economy, but money has to flow in circles/cycles or it doesn't work at all.
Parties in an economy don't normally buy something that they sell at the same time. It's hazier than that here, but still looks like Nvidia is buying GPUs from itself via OpenAI and Oracle.
Btw, there are examples involving sanctioned economies. Most US saffron comes from Spain, all of whose saffron comes from Iran. Azerbaijan exports way more gas than it produces, because it also buys from Russia.
Only when the interests are non-aligned, like a government funding roads.
When interests directly align and the parties are largely owned by the same people, it's wash trading.
The point of wash trading is to increase the value of an asset via net-zero activity. Since nothing is generated from the activity, it's circular; e.g., nothing physical changes hands.
Crypto trading is the golden child of wash trading as the primary mode of increasing the value of an asset.
It's unsurprising, then, that the company that got rich on crypto wash trading is making its own attempts to drive artificial demand.
There is no extra wafer capacity anywhere in the system and realistically it will take several years to unstick that. So not sure where the slack is that will let sudden massive competition appear to compete with Nvidia on the newest nodes?
> However, Groq’s architecture relies on SRAM (Static RAM). Since SRAM is typically built in logic fabs (like TSMC) alongside the processors themselves, it theoretically shouldn't face the same supply chain crunch as HBM.
>
> Looking at all those pieces, I feel Oracle should seriously look into buying Groq.
I don't see why. Graphcore bet on SRAM and that backfired because unless you go for insane wafer scale integration like Cerebras, you don't remotely get enough memory for modern LLMs. Graphcore's chip only got to 900MB (which is both a crazy amount and not remotely enough). They've pivoted to DRAM.
You could make an argument for buying Cerebras I guess, but even at 3x the price, DRAM is just so much more cost effective than SRAM I don't see how it can make any sense for LLMs.
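Quick arithmetic on why roughly 1GB of on-chip SRAM can't hold modern LLM weights; the model sizes here are illustrative round numbers:

```python
# Weights alone dwarf on-chip SRAM, before any KV cache or activations.

SRAM_PER_CHIP_GB = 0.9   # Graphcore-class on-chip SRAM, per the comment above

for params_b, bytes_per_param in [(8, 1), (70, 1), (70, 2)]:  # 8B/70B at INT8/FP16
    weights_gb = params_b * bytes_per_param
    chips = weights_gb / SRAM_PER_CHIP_GB
    print(f"{params_b}B params @ {bytes_per_param} B/param: "
          f"{weights_gb:.0f} GB of weights -> ~{chips:.0f} chips just for weights")
```

Under these assumptions even an 8B model at INT8 needs about nine chips before you've stored a single token of KV cache, which is the Graphcore problem in a nutshell.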
Forget about DRAM vs. SRAM or whatever: How does a cheaper source of non-Nvidia GPUs help Oracle? They’re not training models or even directly in the inference business. Their pitch is cloud infra for AI, and today that means CUDA & Nvidia or you’re severely limiting your addressable market.
This is what happens when highly confident uneducated people read slop from other uneducated people with an agenda (Twitter, Burry et al) and then regurgitate more slop.
There is no circular funding. There’s certainly circular speculation that is driving up the prices but the revenues are all accounted.
The DSO change is meaningless if you understand accounting.
The inventory building up is the cost of materials and incomplete inventory. It’s not chips sitting around waiting to be deployed.
> holding ~120 days of inventory seems like a huge capital drag to me.
Yeah I guess this guy who knows nothing about running a business like Nvidia is allowed to make confident statements like this despite no education or experience.
This article is garbage and he wasted his 48 hrs investigating the same things I read in another worthless tweet several weeks ago.
Are partially processed wafers on Nvidia's balance sheet or TSMC's? Or are you saying Nvidia is holding sawn die in inventory? Confused as to what state the product might be in for it to make sense to hold for 4 months when supposedly you can sell everything you make.
They are on Nvidia's balance sheet. You also don't sell freshly baked chips the way you do cookies; just because they are hot out of the oven doesn't mean you sell them right away.
75% margin means they have around $79.2bn of potential revenue sitting in inventory. Next quarter's revenue is projected to be $65bn, so about 110 days of stock (arithmetic below).
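Spelling that out, with figures taken from the comment above rather than re-verified against the filings:

```python
# Inventory at cost, grossed up by margin, expressed in days of forward revenue.

inventory_at_cost = 19.8   # $bn, implied by the figures above: 79.2 * (1 - 0.75)
gross_margin = 0.75

potential_revenue = inventory_at_cost / (1 - gross_margin)   # = 79.2
next_q_revenue = 65.0                                        # $bn, projected
days_of_stock = potential_revenue / next_q_revenue * 91      # ~91-day quarter

print(f"potential revenue in inventory: ${potential_revenue:.1f}bn")
print(f"days of stock: {days_of_stock:.0f}")                 # ~111 days
```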
I don't have an opinion on whether it's correct or not. I see AI writing, I stop reading. When compared to human-authored prose, AI writing is much more likely to be convincing-but-wrong. I don't need to be ingesting stuff engineered to be believable, with correctness only as a secondary concern.
If it's correct, it's usually because a better source has already written it correctly, or because it's trivial to somebody who knows what they're talking about.
AI writing is a sign that the author either doesn't know the material, doesn't want to write it, or both. Because distinguishing "too lazy to write it" and "too stupid to fact-check it properly" is very difficult on my end as a reader, it's better to be safe than sorry. There's no sense in exposing my malleable wetware to slop of unknown provenance.
There has been no shortage of human authors writing about this topic. I don't know who "Philippe Oger" is, so I have no reason to prefer him as a co-author.
Given that not all information is public in this scenario, there is no choice but to construct theories that are plausible regardless of the gaps in evidence; such is the basic nature of investing and economics both. Is your objection that available evidence was excluded that you consider to be materially relevant, or that theories were constructed when we don’t have the complete story, or..?
The problem isn’t theorising itself, investing is full of it. The issue is when speculation is presented with the tone and certainty of established fact. The article doesn’t merely offer possibilities in light of missing data; it states mechanisms and outcomes as though the evidence for them is already in hand. So the objection isn’t to building a model, but to blurring the line between assumption and demonstration, and to glossing over the range of alternative explanations that the same incomplete information could support.
If you think this comment was not written by the account holder or that account sales are occurring, please do as asked by the guidelines in such cases and contact the mods to report that, instead of posting about it in a discussion.
I wouldn't know if account sales were occurring; my first thought was that they sincerely wanted to have AI critique this article and to post it here. I don't think that's a good enough reason for me personally to flag.
I think if someone were trying to use AI to farm, they wouldn't post these types of critiques, but rather something safer.
But I did want to see what their reasoning for posting those AI critiques was, and they answered, so I got my curiosity satisfied to an extent.
Pull our POV back far enough, and isn't "circular funding" just "The economy?"
Money circulates; it's what it does. The real question is to what extent circulation among a small group of firms is either collusion in disguise (i.e. decisionmaking by only one actual entity falsely measured as multiple independent entities) or a fragile ecosystem masquerading as a healthy one (i.e. an "island economy" where things look great in the current status quo, but the moment the fish go away the entire cycle instantly collapses).
I appreciate the disclosures about Gemini and Nano Banana, but does that start to feel a little like a conflict of interest, or something similar, in an article discussing their competition?
The Burry short is just one data point, but the "facts we know" are piling up fast.
Here is a possible roadmap for the coming correction:
1. The Timeline:
We are looking at a winter. A very dark and cold winter. Whether it hits before Christmas or mid-Q1 is a rounding error; the gap between valuations and fundamentals has widened enough to be physically uncomfortable.
The Burry thesis—focused on depreciation schedules and circular revenue—is likely just the mechanical trigger for a sentiment cascade.
2. The Big Players:
Google: Likely takes the smallest hit. A merger between DeepMind and Anthropic is not far-fetched (unless Satya goes all the way).
By consolidating the most capable models under one roof, Google insulates itself from the hardware crash better than anyone else.
OpenAI: They look "half naked." It is becoming impossible to ignore the leadership vacuum. It’s hard to find people who’ve worked closely with Altman who speak well of his integrity, and the exits of Sutskever, Schulman, and others tell the real story.
For a company at that valuation, leadership credibility isn’t a soft factor—it’s a structural risk.
3. The "Pre-Product" Unicorns:
We are going to see a reality check for the ex-OpenAI, pre-product, multi-billion valuation labs like SSI and Thinking Machines.
These are prime candidates for "acquihres" once capital tightens. They are built on assumptions of infinite capital availability that are about to evaporate.
4. The Downstream Impact:
The second and third tier—specifically recent YC batches built on API wrappers and hype—will suffer the most from this catastrophic twister.
When the tide goes out, the "Yes" men who got carried away by the wave will be shouting the loudest, pretending they saw it coming all along
I don't believe your comment is just a direct dump of an LLM's output, mainly because of the minor typo of "acquihres", but as much as I'd love to ignore superficial things and focus on the substance of a post, the LLM smells in this comment are genuinely too hard to ignore. And I don't just mean because there are em-dashes; I do that too. Specifically, these patterns stink very strongly of LLM fluff:
> leadership credibility isn’t a soft factor—it’s a structural risk.
> The Timeline/The Big Players/The "Pre-Product" Unicorns/The Downstream Impact
If you really just write like this entirely naturally then I feel bad, but unfortunately I think this writing style is just tainted.
Final grade: D+.