This phenomenon is the wilful destruction of valuable global commons at the hands of a very small number of companies. The number of individually accountable decision-makers driving this destruction is probably in the dozens or low hundreds.
Everybody and their dog is writing AI scrapers with CAPTCHA-bypassing functionality these days. None of that is new, but the scale is unprecedented.
By comparison, the corporate scrapers are almost the good guys: they respect robots.txt, set proper user agents, and so on. Others do neither, and crawl from residential IPs to evade blocking.
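For concreteness, here is a minimal sketch of what that "good guy" behavior looks like, using Python's standard urllib.robotparser. The bot name, robots.txt content, and user-agent string are illustrative assumptions, not any real crawler's policy:

```python
# Sketch of a well-behaved scraper: it checks robots.txt before
# fetching and identifies itself with a descriptive User-Agent.
# "ExampleCorpBot" and the URLs are hypothetical.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: ExampleCorpBot
Disallow: /private/
Crawl-delay: 10
"""

USER_AGENT = "ExampleCorpBot/1.0 (+https://example.com/bot-info)"

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

def may_fetch(url: str) -> bool:
    # A compliant scraper consults robots.txt for every URL
    # before issuing the request.
    return parser.can_fetch(USER_AGENT, url)

print(may_fetch("https://example.com/public/page"))   # → True
print(may_fetch("https://example.com/private/data"))  # → False
print(parser.crawl_delay(USER_AGENT))                 # → 10 (seconds)
```

A misbehaving scraper simply skips all of this: no robots.txt check, a spoofed browser user agent, and no crawl delay.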
The issue isn't new either, but serious attempts at a proper solution have been postponed because CDNs look like such an easy fix.