


Cloaking for Spidering Speed

OK to serve internally cached pages to spiders?

11:50 pm on Jul 17, 2003 (gmt 0)

Preferred Member

10+ Year Member

joined:Mar 20, 2003
votes: 0

I'm developing a site that relies on a data feed which is updated about every hour. I'm caching it anyway, so as not to strain the other server and to speed up page loads.

Using cached data, the page loads are very quick, usually under 0.01 seconds, but when this data needs to be updated on the fly, it can take between 1 and 20 seconds. Typically it's 3-7 seconds. Unfortunately, it's not feasible to preload and cache this data.

First of all, is that kind of page load time going to be a problem for spiders? Let's say pages with expired cache data load in an average of 5-6 seconds. I'm really only concerned about Googlebot and Inktomi/Slurp. If it would be a problem, would it be OK to serve a page containing the most recently cached data to those spiders?

I don't see a problem with it. Even a shadow spider would only see insignificant differences. Titles and headings would always be the same. Any thoughts? Am I taking a risk at all by doing that? I know Inktomi is known to send shadow spiders...would slight differences set off any cloaking flags?
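For what it's worth, the serve-stale-to-spiders idea amounts to a user-agent check in front of the cache logic. A rough sketch, where the bot substrings are the crawlers named above but everything else (`page_data`, its parameters) is hypothetical:

```python
# Substrings that identify the two crawlers discussed in this thread.
SPIDER_SIGNATURES = ("googlebot", "slurp")

def is_spider(user_agent):
    """Crude user-agent sniff for Googlebot and Inktomi/Slurp."""
    ua = (user_agent or "").lower()
    return any(sig in ua for sig in SPIDER_SIGNATURES)

def page_data(user_agent, cached, cache_is_fresh, refetch):
    """Serve stale cached data to spiders rather than make them
    wait out a 1-20 second refetch; humans trigger the refresh."""
    if cache_is_fresh or (is_spider(user_agent) and cached is not None):
        return cached
    return refetch()
```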

5:19 am on July 25, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member littleman is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:June 17, 2000
votes: 0

I've never done any testing to see how long Slurp or Google will stick around for a page request, though it would seem that 5 seconds isn't very long.

Still, I'd think feeding the bots the cached pages makes sense -- there would be less chance of getting a server error indexed. I can't see how this would be considered a negative thing in the eyes of the search engines.
