Because the prefetch is on a per-query basis and not on a per-page basis. It would be possible, but I think the processing overhead wouldn't be warranted.
Again, let's not worry about the side effect of Google using prefetch and instead worry about the disease: that the Mozilla group would put the feature in there to start with.
Why would you want to stop people prefetching your web site?
I go to extreme lengths to make sure my pages are fast loading, so visitors get a good first impression. Prefetch would be an added boost.
Is Google using prefetch on any of your pages, or is your question a hypothetical one?
|Why would you want to stop people prefetching your web site? |
Bandwidth costs. The page might be pre-fetched without the searcher actually clicking on your result = wasted bandwidth.
Isn't prefetch a browser-side technology...
effectively the bots are server-side technology...
why would Google want to include "prefetch" in a bot crawl?
they are already storing enough data about each page they index...
I would think that the bot would always want to find the latest version of a page... not anticipate that page, but actually fetch it and calculate the latest version...
Taken from the Google Help Center, Results Prefetching section:
|On some searches, Google automatically instructs your browser to start downloading the top search result before you click on it. |
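For reference, the mechanism behind this is Mozilla's link-prefetching support: the results page can include a hint like the following, which Mozilla-based browsers act on and other browsers ignore (illustrative markup only; the href is a made-up example, not an actual Google URL):

```html
<!-- A prefetch hint: the browser may fetch this URL during idle time. -->
<link rel="prefetch" href="http://www.example.com/top-result.html">
```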
|The page might be pre-fetched without the searcher actually clicking on your result = wasted bandwidth |
It'll be a tiny fraction of your total bandwidth. By far the majority of people will click on the link.
Have you seen the kind of search terms the prefetch is used for? It's things like "webmaster world". How many people are likely to type in webmaster world but not click on the link?
How about Google prefetches only graphic files and only from the Google cache saving us the bandwidth?
Mozilla prefetch command:
- uses bandwidth, with little indication that a user will ever visit.
- if the site is dynamic and uses cache-busting headers, the visitor will have to redownload the page anyway - thus, you get speed-penalized twice.
- falsifies referral information in your referral log.
- falsifies agent information in your agent log.
- causes system load.
- is basically an unstoppable spider.
Lastly, from those wacky, fun-loving affiliate guys, comes this little tidbit:
Mozilla's prefetch offers the perfect opportunity to blindly cloak away for Moz users with little risk of ever being caught - think about it.
I just disliked their suggestion of returning a 404 error to the x-moz agent...
Mainly because if someone looks at your page after it's been prefetched, how would you know they've read it?
Basically, my customers could claim that I haven't provided as much traffic due to pages being prefetched and not actually read, and I'd have no way of proving them wrong.
I thought if Google were aware that I didn't want any of my site prefetched, then they could be kind enough not to tell Mozilla to prefetch any of it.
End of my rant...
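For what it's worth, the 404-to-x-moz suggestion being criticized above hinges on the fact that Mozilla marks prefetch requests with an "X-moz: prefetch" request header. A minimal server-side sketch of that check (the function name and dict-based headers are my own illustrative choices, not any particular server's API):

```python
def is_prefetch_request(headers):
    # Mozilla sends "X-moz: prefetch" with prefetched requests,
    # which is what lets a server treat them differently.
    return headers.get("X-moz", "").strip().lower() == "prefetch"

# A normal click-through carries no X-moz header:
normal = {"User-Agent": "Mozilla/5.0"}
# A prefetch carries the marker:
prefetch = {"User-Agent": "Mozilla/5.0", "X-moz": "prefetch"}

# The disputed suggestion: answer prefetch requests with a 404.
status = "404 Not Found" if is_prefetch_request(prefetch) else "200 OK"
print(status)  # → 404 Not Found
```

The catch, as the poster notes, is that this still tells you nothing about whether a prefetched page was ever actually read.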
My gripe is as a user, with the browser fetching something that I would never have clicked on.
I saw that the other day when a site asked to set a cookie, and I was still looking at the SERP - not clicked on anything yet.
How many rogue sites are setting a prefetch to rogue content scam sites? I disabled it immediately.
There is an option to disable this in Mozilla 1.3+ which I had never noticed. It is in:
Preferences --> Advanced --> Cache
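The same switch is also exposed as a preference, so it can be flipped in about:config or a user.js file (preference name per Mozilla's prefetching documentation):

```
user_pref("network.prefetch-next", false);
```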