Msg#: 3381302 posted 4:12 pm on Jun 28, 2007 (gmt 0)
I am running a community site (vBulletin forum), and far too many Yahoo! Slurp spiders are crawling it all the time.
These spiders seem to be querying my forum database (MySQL) more than my actual users and visitors do, causing it to hit the max_questions limit and degrading the site's performance.
How do I delay or reduce the crawling by the Yahoo! Slurp spiders? I do want Yahoo to crawl and index my site, but at a rate that doesn't hurt performance.
Do I introduce a crawl delay in the robots.txt file? How do I make it take effect immediately?
Is there any other solution?
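For reference, here's roughly what a Crawl-delay directive for Slurp looks like in robots.txt (the 10-second value is just an example — tune it to your server). One caveat: it won't take effect immediately, since crawlers only pick up changes the next time they re-fetch your robots.txt, which can take a day or more.

```
# robots.txt — ask Yahoo! Slurp to wait between requests
# (delay value in seconds is an example; adjust as needed)
User-agent: Slurp
Crawl-delay: 10

# other crawlers unaffected
User-agent: *
Disallow:
```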
Thanks in advance for any responses.
Msg#: 3381302 posted 8:38 pm on Jun 29, 2007 (gmt 0)
I have the exact same problem. Yahoo simply sucks and just sits there and crawls all day for, apparently, nothing, LOL. I think they literally all go meet on my forum for a beer or something. I am thinking of simply banning Yahoo from robots.txt altogether, as they are simply worthless anymore anyway. I hate to be that drastic, but they have serious problems. Google usually has ONE spider on my forum all day, MSN usually has 3 all day, and Yahoo has 100 or so all day... no doubt it's a waste.
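If you do go the drastic route and ban Slurp outright, the robots.txt entry would look something like this (a full block, as opposed to just slowing it down with Crawl-delay):

```
# robots.txt — block Yahoo! Slurp entirely (drastic option)
User-agent: Slurp
Disallow: /
```

Keep in mind this drops you from Yahoo's index entirely once they re-read the file, so the Crawl-delay approach is the safer first step.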
When WILL Yahoo wake up and pull their heads from their posteriors? Who knows.