I think the idea would be to have a global index that sites use to mirror their data for searching.
This database would normally be accessible only to search engines, and the sites themselves could then disallow direct bot crawling in their robots.txt.
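As a rough sketch of what that robots.txt might look like, a site that had mirrored its content to the shared index could block ordinary crawlers outright while still allowing whatever agent keeps the mirror in sync. The "IndexMirrorBot" user-agent below is just an invented name to illustrate the idea, not a real crawler:

```
# Hypothetical robots.txt for a site that mirrors its data to the shared index.
# "IndexMirrorBot" is a made-up user-agent standing in for whatever agent
# would sync the mirror.
User-agent: IndexMirrorBot
Allow: /

# Everyone else (Googlebot, Bingbot, etc.) is told not to crawl directly,
# since search engines would read from the shared index instead.
User-agent: *
Disallow: /
```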
It occurs to me that this might have to be "invite only": Google invites the sites it trusts to put their data there, and if it catches someone "cheating" in one way or another, it stops indexing that site. Plus it wouldn't have to invite really small sites.