I had this same experience with the support interaction. Starting from scratch is a less-than-ideal process IMO, so much so that I haven't done it, in the hope that a future update fixes this and I don't have to jump through all the hoops.
I use two macOS "Locations" (Apple Menu -> Location): one which uses the router's default DNS (which is a Pi-Hole at home), and another which is set to use the Cloudflare DNS servers.
This allows me to switch location and instantly reload a page without the Pi-Hole interfering. Of course, this only works on my Mac.
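For anyone who would rather script the switch than dig through the Apple menu, here is a minimal sketch. It shells out to macOS's `networksetup` tool; the location name "Cloudflare" is an assumption standing in for whatever you called yours.

```python
import subprocess

def list_locations() -> list[str]:
    """Return the names of the network Locations configured on this Mac."""
    out = subprocess.run(
        ["networksetup", "-listlocations"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.splitlines()

def switch_location(name: str) -> None:
    """Switch the active network Location (same as Apple Menu -> Location)."""
    subprocess.run(["networksetup", "-switchtolocation", name], check=True)

if __name__ == "__main__":
    print(list_locations())
    # "Cloudflare" is a placeholder; use the name of your own Location.
    switch_location("Cloudflare")
```

Switching back is just another `switch_location` call, so both directions are easy to bind to shortcuts.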
As far as I know, Pi-Hole sets a relatively short TTL on its responses, but I think it should still cause a non-zero delay when you disable it, shouldn't it?
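If you want to check what TTL your Pi-Hole actually hands out, a quick sketch with dnspython (assumed installed, 2.x for `resolve()`) is below; the Pi-Hole address 192.168.1.2 and the queried domain are placeholders.

```python
import dns.resolver  # pip install dnspython

# Placeholder address: point this at your Pi-Hole.
resolver = dns.resolver.Resolver(configure=False)
resolver.nameservers = ["192.168.1.2"]

answer = resolver.resolve("example.com", "A")
# The TTL tells you roughly how long clients may keep the answer cached
# after you toggle Pi-Hole on or off.
print(f"TTL: {answer.rrset.ttl}s, records: {[r.address for r in answer]}")
```

Whatever that TTL is, the OS and browser may keep the old answer cached for up to that long, which would be where the delay after disabling Pi-Hole comes from.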
Haven't seen this data in the form of an interactive map before.
For anyone else wondering, "Geographic Access" is based on public transport times to selected public services. This is outlined in the methodology documentation [0].
GIMP and Krita focus on different things, even if some of their functionality overlaps. GIMP focuses on photo editing, whereas Krita focuses on digital painting. That focus affects which functionality gets prioritized in each program.
From Google's perspective it's probably too much work. I would assume this was part of the crawler code and was extracted into a library over time. While part of the monorepo, changesets probably didn't touch only this code but other parts as well, and the code probably depended on internal libraries (it now depends on Google's public Abseil library). Publishing all of that needs lots of review (also considering names and other personal information in commit logs, TODO comments, and the like).
Not only that, code libraries that weren't designed to be open source often have things in them that Google might not want to show: codenames, profanity, calling out specific companies…
Also, even if it is authoritatively managed in git now, the whole 20-year history certainly wasn't (since git is only 14 years old, and Google probably didn't adopt it on day one), and it's quite likely the commit history wasn't converted, so it's quite possible Google couldn't easily make the whole history available when publishing it to GitHub even if they wanted to.
I assume the authoritative version is still in Google's Piper-based repo, and before that it was in Perforce for a while ... so if there were interest, Google could dig deep. But I assume there are other projects where this would be even more interesting (how ranking changed over time, how storage formats for the index changed, ...).
I can attest to this. I work in a very large monorepo with tens of thousands of commits. Even files that otherwise rarely change get regular updates - usually repo-wide CodeMods. This makes blame less useful and the history quite noisy. I figure the robots.txt parser's history would be in a similar state - not very useful or interesting to read.
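As an aside, if the noise is mostly those mechanical commits, git can be told to skip them when blaming. A minimal sketch is below; it assumes git >= 2.23, a hypothetical `.git-blame-ignore-revs` file listing the codemod commit hashes, and uses `robots.cc` only as an example filename.

```python
import subprocess

# Hypothetical file listing the full hashes of repo-wide codemod commits,
# one per line; `git blame` then attributes lines to the commits before them.
IGNORE_FILE = ".git-blame-ignore-revs"

blame = subprocess.run(
    ["git", "blame", "--ignore-revs-file", IGNORE_FILE, "robots.cc"],
    capture_output=True, text=True, check=True,
)
print(blame.stdout)
```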
[0] https://mislav.net/2020/01/github-cli/