Forum Moderators: martinibuster

Yahoo! Slurp is slurping up my site and a massive amount of bandwidth

7:37 pm on Nov 17, 2005 (gmt 0)

Full Member

10+ Year Member

joined:Sept 23, 2004
votes: 0

Hi people,

I haven't posted in this section before; I normally stick to the Google and AdSense forums.

I have been having a good look at my site's stats tonight, and I think I should be concerned.

Yahoo! Slurp is spidering my site like crazy.

So far this month -

Page Views: 56,566
Visits: 51,290
Bandwidth: 2,715,409 KB

Is this normal?

The worrying thing is the bandwidth they are consuming. At the going rate they will eat nearly 5 GB this month.
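
The "nearly 5 gig" figure holds up as a quick back-of-the-envelope extrapolation (assuming the stats above cover Nov 1-17 and "KB" means kilobytes):

```python
# Back-of-the-envelope check of the bandwidth projection above
# (assumes the stats cover Nov 1-17 and the figure is in kilobytes).
used_kb = 2_715_409
used_gb = used_kb / 1024 ** 2          # ~2.59 GB consumed so far
projected_gb = used_gb * 30 / 17       # extrapolate to a full 30-day month
print(round(projected_gb, 2))          # ~4.57 GB -- "nearly 5 gig"
```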

By a long, long way they are the busiest spider on my site. Google comes in at 1,128 visits and MSN at 794 visits so far.

And does anyone know anything more about "OmniExplorer"?

They say on their site ... "Omni-Explorer is a venture-backed startup based in Silicon Valley. Stay tuned to this site; we plan on launching shortly".

Who are they?

Thanks for all replies.

1:27 am on Nov 18, 2005 (gmt 0)

New User

10+ Year Member

joined:May 5, 2005
votes: 0

endomorph1 --

I'm seeing the same thing. I was dropped from Yahoo back in early July. They've been crawling me to the tune of 30,000+ pages daily ever since. (Yes, 30,000 pages daily!) I have multiple dedicated servers, and they're tuned well enough to handle the load. Otherwise, I'd probably have to ban Slurp.

I keep hoping that all of this activity from Slurp is a good sign and that they will eventually reinclude my site. But I'm not showing up at all, even after filing a reinclusion request and after Yahoo's most recent index update. I don't know what to think anymore...

It's as if Yahoo and Google have both gone insane. But I suppose that's a topic for another thread.

-- Orbiter

2:25 am on Nov 18, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 31, 2002
votes: 0

Look into using robots.txt to control what directories and pages are crawled, and use the Crawl-delay parameter to slow down Slurp to a tolerable level.

Yahoo robots help page [help.yahoo.com]


6:58 am on Nov 18, 2005 (gmt 0)

Full Member

10+ Year Member

joined:Sept 23, 2004
votes: 0


Thanks for that. My robots.txt file now looks like this:

User-agent: *
Disallow: /images/

User-agent: Slurp
Crawl-delay: 20

Do you think that is OK? I have not used a robots.txt file much, so I'm not sure of the syntax.
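
The syntax above can be sanity-checked with Python's standard-library robots.txt parser. Note that Crawl-delay is a non-standard extension, but Slurp honors it and `urllib.robotparser` (Python 3.6+) parses it. One caveat this sketch surfaces: because Slurp matches its own `User-agent: Slurp` group, it ignores the `*` group, so the `/images/` disallow does not apply to Slurp as written.

```python
# Sanity-check the robots.txt from the post using Python's
# standard-library parser (urllib.robotparser, Python 3.6+).
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /images/

User-agent: Slurp
Crawl-delay: 20
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Slurp matches its own group, so the 20-second delay applies to it.
print(rp.crawl_delay("Slurp"))                       # 20

# Caveat: Slurp ignores the "*" group once it matches its own group,
# so the /images/ disallow does NOT restrict Slurp.
print(rp.can_fetch("Slurp", "/images/pic.jpg"))      # True
print(rp.can_fetch("Googlebot", "/images/pic.jpg"))  # False
```

If you also want Slurp kept out of /images/, repeat the `Disallow: /images/` line inside the `User-agent: Slurp` group, since per-agent groups are not merged with the `*` group.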