Perhaps it does look like that. It also looks like Mozilla has a healthy incentive to shadow Google in the form of their recent search deal.
Is Mozilla doing the best thing for the Web or further cementing the current browser landscape? Will it be better when I have to write my crawlers to work with HTTP 1.1, SPDY, and HTTP Next?
You don't have to write your crawlers to even work with HTTP 1.1; virtually all web servers out there will respond just fine to HTTP 1.0 requests, even though HTTP 1.1 is ubiquitous and has been in use for over a decade now. What makes you think that SPDY-enabled web servers won't remain compatible with HTTP?
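To make that concrete: an HTTP 1.0 request is just a few lines of plain text, and a 1.1 (or SPDY-capable) server will still answer it. A minimal sketch in Python, where "example.com" is only an illustrative hostname:

```python
# Build a bare HTTP/1.0 GET request by hand. Any reasonably
# configured web server will accept and answer this, regardless
# of whether it also speaks HTTP/1.1 or SPDY.

def build_http10_get(host: str, path: str = "/") -> bytes:
    # HTTP/1.0 doesn't strictly require a Host header, but sending
    # one keeps name-based virtual hosting working.
    return (
        f"GET {path} HTTP/1.0\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    ).encode("ascii")

req = build_http10_get("example.com")
print(req.decode().splitlines()[0])  # GET / HTTP/1.0
```

Write those bytes to a socket on port 80 and you get a response back; the server may identify itself as HTTP/1.1 in its status line, but it honors the 1.0 request, which is exactly why old crawlers keep working.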
My objection to this turn of events is almost entirely related to the behavior of Google and their search deal with Mozilla. Mozilla has compromised their principles again and again for Google and it is accelerating.
I'm tired of Google's doublespeak and lies about the "Open Web" and open standards. If SPDY takes off, we will all have to think about supporting it, and if it's built correctly, that will be fine. I seriously have my doubts.
It would have been much less suspicious if the second browser to adopt SPDY had been Safari or IE. Right now, it looks like Google is essentially bribing an "independent" browser vendor to implement their half-baked standards so they can turn around and claim "it's a web standard!".
It's like 1999 but now with corporate protectorates!
What makes you think you have to do anything? If SPDY takes off, all you have to do is upgrade your HTTP library to take advantage of it. Besides, the servers still speak HTTP, so your crawlers will keep working even if you don't upgrade.