We're talking about different internets, though. You're talking about the hypothetical patched internet that uses Google's #! remapping, whereas I'm talking about the internet as it exists right now. If I go to Gawker with lynx right now, it will not work, period. The fact that the implementation details are documented somewhere, and that the implementation is trivial, doesn't mean it should become standard across the board.
I hate to invoke a slippery slope, but it seems a frightening proposition that $entity can start putting out arbitrary standards and suddenly the entire Internet infrastructure has to follow suit in order to be compatible. It's happened before, e.g. favicon.ico. Both are noble ideas (personalizing bookmarks and site feel, making Ajax content accessible) with troublesome implementations (forcing thousands of redundant GET /favicon.ico requests instead of declaring the icon with something like a <link> tag, forcing existing infrastructure to make changes if it wants to continue operating as usual).
All of this is moot, of course, if you just write your pages to fall back sensibly instead of doing what Gawker did and providing no backwards-compatible text-only fallback. Have JS rewrite your links from "foo/bar" to "#!foo/bar", and then non-compliant user agents and compliant browsers are both happy.
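A minimal sketch of that rewrite, assuming the server emits plain hrefs and marks Ajax-capable links with a `data-ajax` attribute (both the attribute name and the markup are illustrative, not Gawker's actual implementation):

```javascript
// Convert a crawlable path like "foo/bar" into its hash-bang form "#!foo/bar".
function toHashBang(path) {
  return '#!' + path;
}

// Progressive enhancement: this only runs in a script-capable browser.
// lynx and other non-JS user agents never execute it, so they keep the
// plain, working hrefs the server sent.
if (typeof document !== 'undefined') {
  var links = document.querySelectorAll('a[data-ajax]');
  for (var i = 0; i < links.length; i++) {
    var path = links[i].getAttribute('href');        // e.g. "foo/bar"
    links[i].setAttribute('href', toHashBang(path)); // now "#!foo/bar"
  }
}
```

The point of the design is that the #! form is the *enhancement*, not the baseline: the server-rendered links work everywhere, and JS upgrades them where it can.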
> If I go to Gawker with lynx right now, it will not work, period.
As a specific issue, that seems like a minus, but an exceedingly minor one, as lynx is probably a negligible proportion of Gawker's audience. In principle, backwards-compatibility is a great thing, until it impedes some kind of desirable change, such as doing something new or doing it more economically.
> it seems a frightening proposition that $entity can start putting out arbitrary standards
I generally do want someone putting out new standards, and sometimes it's worth breaking backwards-compatibility to an extent. So it really depends on $entity: if it's WHATWG, great. If it's Google, then more caution is warranted. But there have been plenty of cases of innovations (e.g. canvas) starting with a specific player and going mainstream from there. I do agree that Google's approach feels like an ugly hack in a way that is reminiscent of favicon.ico.
> All of this is moot, of course...
This is good general advice, but it's not always true. At least one webapp I've worked on has many important Ajax loads triggered by non-anchor elements; it's about as useful in lynx as Google Maps would be. The devs could go through and convert as much as possible to gracefully-degrading anchors, which would at least partly help with noscript, but it seems like a really bad use of resources, given the goals of that app.
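For what it's worth, the conversion being described would look roughly like this: replace bare click-handling elements with real anchors whose hrefs work without JS, then intercept the click when JS is present. The `/items/42` URL scheme, the `data-ajax` attribute, and `loadPanel` are all hypothetical stand-ins for the app's real routing and loader:

```javascript
// Hypothetical route parser; the app's actual routing is assumed, not shown.
function parseItemRoute(href) {
  var m = /^\/items\/(\d+)$/.exec(href);
  return m ? { itemId: Number(m[1]) } : null;
}

if (typeof document !== 'undefined') {
  // Instead of <div onclick=...>, the trigger is a real <a href="/items/42">
  // that navigates normally without JS. With JS, we hijack the click.
  document.addEventListener('click', function (e) {
    var a = e.target.closest ? e.target.closest('a[data-ajax]') : null;
    if (!a) return;
    var route = parseItemRoute(a.getAttribute('href'));
    if (!route) return;       // unrecognized URL: let the browser navigate
    e.preventDefault();       // recognized URL: load it in-page instead
    loadPanel(route.itemId);  // hypothetical Ajax loader
  });
}
```

The tradeoff the comment is pointing at is real, though: every such trigger also needs a server-rendered page behind its href for the fallback to mean anything, which is where the cost actually lives.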