Fuel is a minuscule percentage of the cost of building and operating a nuclear reactor.
The cost of uranium is currently about $90/kg. Citing a previous poster, around 70 thousand tonnes (70 million kg) of uranium is used every year, which comes to about 6.3 G$, or roughly $2.6/MWh spread over the world's ~2,400 TWh of annual nuclear generation. As per the NEI, the cost of nuclear energy is $108/MWh. At current prices, uranium accounts for about 2.4% of the cost of producing nuclear energy.
Keeping the same profit margins under a tenfold increase in the price of uranium would mean raising the price of nuclear energy by only roughly 20%.
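The arithmetic behind those two paragraphs can be checked in a few lines. A sketch, using the figures cited above (the ~2,400 TWh/yr generation figure is the one implied by the $6.3G and per-MWh numbers, not an independent source):

```python
# Back-of-the-envelope check of the fuel-cost share of nuclear energy.
# Inputs are the figures cited in the comment above.

uranium_price = 90.0                  # $/kg
annual_use_kg = 70_000 * 1_000        # ~70 thousand tonnes/yr, in kg
generation_mwh = 2_400 * 1_000_000    # ~2,400 TWh/yr, in MWh
all_in_cost = 108.0                   # $/MWh, per the NEI figure cited

fuel_spend = uranium_price * annual_use_kg     # total uranium spend, $/yr
fuel_per_mwh = fuel_spend / generation_mwh     # fuel cost per MWh generated
fuel_share = fuel_per_mwh / all_in_cost        # fraction of all-in cost

print(f"annual uranium spend: ${fuel_spend / 1e9:.1f}B")   # ~$6.3B
print(f"fuel cost: roughly ${fuel_per_mwh:.2f}/MWh")
print(f"fuel share of cost: {fuel_share:.1%}")             # ~2.4%

# A 10x uranium price adds 9x the current fuel cost to each MWh:
price_rise = 9 * fuel_per_mwh / all_in_cost
print(f"price rise at 10x uranium: {price_rise:.1%}")      # ~22%
```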
The cost of everything else about a nuclear reactor is built on a hugely understated, fossil-fuel-subsidised cost basis. Those costs are going to inflate when you don't have those fossil fuels around, particularly for elements for which hydrocarbons are essential inputs or exceptionally difficult to substitute (coking coal for steel, much or all transport, chemical feedstocks, industrial process heating).
But that's less of a concern to me than the technological stack and global risk associated with massive (15,000+ reactors) deployment of nuclear technology. The historical major-incident rate has been about 1 per 100 reactor-years of plant operation. How much are we going to cut that down? To 1 per 1,000 reactor-years? 1 per 10,000?
Because even at 1 per 10,000 reactor-years, that's one or more major nuclear accidents per year at a 15,000+ plant scale.
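The scaling here is simple expected-value arithmetic; a sketch, using the hypothetical improved incident rates from the question above rather than measured data:

```python
# Expected major accidents per year for a large reactor fleet, given an
# assumed per-reactor incident rate. The rates are the hypothetical
# figures discussed above, not measured values.

fleet_size = 15_000  # reactors

results = {}
for reactor_years_per_incident in (100, 1_000, 10_000):
    expected_per_year = fleet_size / reactor_years_per_incident
    results[reactor_years_per_incident] = expected_per_year
    print(f"1 incident per {reactor_years_per_incident:>6} reactor-years"
          f" -> {expected_per_year:g} expected incidents/year")
```

Even a hundredfold improvement over the historical rate still leaves an expectation above one major accident per year at that fleet size.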
Even getting the rate down to a few per century adds up over sufficient time, and I presume you're in this for the long haul.
Who manages plants and waste processing during major economic disruptions, or in times of total war between nuclear-powered states (which would be, let me check, um, all of them)?
The most immediate implication is that it simply invalidates a great deal of present future-cost/future-value analysis, by noting that the entire present price and cost regime is severely faulty. It may well be that money is the wrong unit of economic analysis. A number of people have suggested that currency should be considered to be backed by energy (though that's not the same as saying money is energy -- a crucial but nuanced distinction) -- Arthur C. Clarke, Kim Stanley Robinson, and R. Buckminster Fuller among them. I've traced the concept back to H.G. Wells (of whom Clarke was a fan).
Which would suggest that an energy-flows model of an economy is of interest. For the nuclear case, you'd want to work that out based on total available nuclear energy, conversion losses for provisioning synthesised hydrocarbons (or other fuels, though C-H chains are awfully appealing), and how much net free energy (for other activities) you're left with. Plus factors for risk and such.
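The structure of such an accounting is straightforward even if the inputs aren't. A toy sketch -- every number below is a made-up placeholder to show the shape of the model, not an estimate:

```python
# Toy energy-flows accounting for a nuclear-backed economy, as described
# above. All numeric inputs are placeholders, not estimates.

def net_free_energy(gross_twh, synfuel_demand_twh, synfuel_efficiency,
                    overhead_fraction):
    """Net energy left for other activity after synfuel provisioning.

    gross_twh          -- total nuclear generation (TWh/yr)
    synfuel_demand_twh -- delivered hydrocarbon energy needed (TWh/yr)
    synfuel_efficiency -- fraction of input electricity ending up as fuel
    overhead_fraction  -- fleet overhead (construction, enrichment,
                          waste handling) as a fraction of gross output
    """
    synthesis_input = synfuel_demand_twh / synfuel_efficiency
    overhead = overhead_fraction * gross_twh
    return gross_twh - synthesis_input - overhead

# Placeholder inputs: 30,000 TWh/yr gross, 5,000 TWh/yr of delivered
# synfuel at 40% synthesis efficiency, 10% fleet overhead.
free = net_free_energy(gross_twh=30_000, synfuel_demand_twh=5_000,
                       synfuel_efficiency=0.4, overhead_fraction=0.1)
print(f"net free energy: {free:,.0f} TWh/yr")  # -> 14,500 TWh/yr
```

The same function, fed with a renewables buildout's gross output and overheads, would give the comparable figure for the alternative schemes.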
That could be compared with the models for renewables-based alternatives.
A key benefit of most renewables schemes is that they pose relatively few widespread catastrophic systemic risks. Grid stability seems the main issue if we're still talking electrical systems (and the advantages of electricity are such that we almost certainly are). But solar and wind plants don't suddenly go into catastrophic meltdown, and the tech stack for each is relatively small.
(Hydro power can see regional catastrophic failure, see the Banqiao Dam disaster, 170k killed. But a lot of things had to go wrong, most of which aren't significantly different from what can befall a nuclear site, and the long-term consequences are fairly benign: the region is now home to 7+ million people, 40 years on.)
But, answering your question in part: it seems to me that a combined measure of net free energy and systemic risk is probably a better assessment criterion than some putative present "cost" analysis based on a flawed pricing system.
Do those calculations ($108/MWh) include the cost of safely disposing of the waste, and the insurance premiums to cover the damage done by an accident? Let alone the direct costs of the accidents themselves.
For a reasonably balanced view of costs and risks of nuclear power, I'm inclined to look at the IPCC's study of renewable or carbon-neutral energy alternatives, SRREN. I don't have numbers off the top of my head, but believe that nuclear is at least within ballpark competitiveness based on present cost estimates (see my immediately prior comment on why cost estimates are likely not the best assessment tool to use).
The IPCC assimilates data from a large range of sources and viewpoints. It's about as close to a consensus view as you'll find, and considerably more sober than the parent article here.