I am running a community site (a vBulletin forum), and Yahoo! Slurp spiders are crawling it very frequently.
They seem to outnumber my actual visitors, and their requests hit the forum's MySQL database hard enough to push it against the max_questions limit and degrade site performance.
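For context, max_questions is MySQL's per-account hourly query cap, set with the MAX_QUERIES_PER_HOUR grant option. My host applies something like the statement below (the user name and the 50000 figure are just placeholders):

GRANT USAGE ON *.* TO 'forumuser'@'localhost' WITH MAX_QUERIES_PER_HOUR 50000;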
How do I delay or reduce the Yahoo! Slurp crawling? I do want the Yahoo! spider to crawl and index my site, but at a reasonable rate that doesn't hurt performance.
Should I add a crawl delay to the robots.txt file? And how do I make it take effect immediately?
Is there any other solution?
Thanks in advance for any responses.
When WILL Yahoo wake up and pull their heads from their posteriors? Who knows.
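Add a Crawl-delay line for Slurp to your robots.txt. A minimal example - the value is the number of seconds to wait between fetches, and 10 is just a reasonable starting point:

User-agent: Slurp
Crawl-delay: 10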
This will make the bot wait 10 seconds between requests. Some other bots support Crawl-delay as well.
Additionally, have a look at the URLs that get crawled - some of them can probably be disallowed in robots.txt. You're the one who knows which pages are of zero interest to the search engines.
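On a vBulletin board the usual suspects are the action scripts and duplicate-content views. The paths below are only illustrative - check your own crawl logs before blocking anything:

User-agent: *
Disallow: /printthread.php
Disallow: /newreply.php
Disallow: /sendmessage.php
Disallow: /calendar.php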