
Alternative Search Engines Forum

Bandwidth Sucking Search Engines
IPs are now blocked, you packet robbing swine
incrediBILL
msg:465123
2:35 am on Apr 19, 2005 (gmt 0)


I'm not one to just randomly block every search engine on the planet, as some of the more radical posts I've read suggest, but I'm real tempted after this afternoon. Two of the better-known lesser search engines (you know who you are, as I blocked you mid-crawl, you swine!) came crawling during prime time tonight on a customer's ecommerce site and started sucking bandwidth like you wouldn't believe.

It seems like they were completely unthrottled, hitting it full tilt and generating 2 Mbps of site traffic for almost two hours before I noticed and blocked their IPs. Google, Yahoo and MSN seem to sneak in, crawl and go away relatively unnoticed, but these other nitwit crawlers seemed like they were trying to knock the server offline.

Don't they know that spider behavior like this hurts the sites they crawl, which gets them blocked and ultimately hurts themselves?

What are these people thinking?

Are these people thinking?

Unbelievable.

 

victor
msg:465124
6:51 am on Apr 19, 2005 (gmt 0)

Some spiders are so badly behaved all you can do is whack 'em with a blunt object.

Several offline readers are the same -- they'll hit your site at zillions of requests a second rather than pace themselves.

If you hadn't noticed in time -- like maybe you'd been asleep during the attack -- and you have a very large site, a couple of such attacks could use up your bandwidth quota.

Which is why I decided to go proactive. Rather than ban after the event, I monitor request rates by IP address. Any misbehavior gets the IP banned for anything between 10 minutes and 24 hours.

While banned, I send back a very short page saying (something like) "spider misbehaving", quoting their UA string etc., and explaining what they were doing wrong.

The page has no links on it, so they soon run out of harvested URLs to spider.

Then, later, I can use their search engine, find the "spider misbehaving" pages they've indexed, and email them to complain about the service they are giving their users.
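
For what it's worth, here is a minimal sketch of that kind of ban response as a Python CGI fragment. The wording and function name are illustrative assumptions, not victor's actual code:

    import os

    def ban_page():
        """Build the short, deliberately link-free ban page."""
        ua = os.environ.get("HTTP_USER_AGENT", "unknown agent")
        body = ("<html><body><p>Spider misbehaving: %s is requesting pages "
                "faster than this site allows. Slow down and try again "
                "later.</p></body></html>" % ua)
        # No links anywhere in the body, so a banned spider's URL queue
        # drains instead of growing.
        return "Content-Type: text/html\r\n\r\n" + body

Because the page carries no links, a banned spider harvests nothing new, which is what makes the scheme self-limiting.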

victor
msg:465125
8:26 am on Apr 19, 2005 (gmt 0)

Too late to edit the above:
There is a similar thread in progress:
[webmasterworld.com...]

That thread offers some PHP code to dynamically block IPs that are rampaging. I don't use PHP myself, but it looks like it could be useful to those who do.

ByronM
msg:465126
1:31 pm on Apr 19, 2005 (gmt 0)

Speaking as a search engine operator: what types of throttling do most sites like?

I generally enforce an 8 second delay between page fetches for each domain, and we try to keep the fetch list unsorted, so that over the scale of a 1 million page crawl the rule rarely has to kick in for the same domain twice in a row.

The only issue I have yet to work on is possibly doing an IP lookup to see if we are hitting a bunch of hosts on a shared server or something similar, but that doesn't seem to be what you are complaining about.
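
To illustrate the approach being described, here is a rough Python sketch of a per-domain delay combined with a shuffled fetch list; the names and the fetch() callback are placeholder assumptions, not ByronM's actual crawler:

    import random
    import time
    from urllib.parse import urlparse

    PER_DOMAIN_DELAY = 8.0  # seconds between fetches to any one domain

    def polite_crawl(urls, fetch):
        """Fetch urls while spacing requests to each domain by the delay."""
        last_fetch = {}  # domain -> time of the last request to it
        random.shuffle(urls)  # unsorted list: consecutive URLs rarely share a domain
        for url in urls:
            domain = urlparse(url).netloc
            wait = last_fetch.get(domain, 0.0) + PER_DOMAIN_DELAY - time.time()
            if wait > 0:
                time.sleep(wait)  # rarely triggers on a shuffled million-URL list
            last_fetch[domain] = time.time()
            fetch(url)

The point of the shuffle is that at million-page scale the sleep almost never fires, so politeness costs the crawler nearly nothing.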

incrediBILL
msg:465127
5:20 pm on Apr 19, 2005 (gmt 0)

"I generally have an 8 second delay between page fetches"

If that's all that was going on I probably wouldn't have lost my mind; 7 to 8 pages a minute is trivial.

I was watching as many as 4 pages a second get yanked off the site, and the real problem is that all my sites are database driven. When a serious amount of concurrent access hits the site, the disk churns and the CPU usage spikes to the point where everything crawls. It doesn't crash, mind you, as it can handle the load, but trying to stop the abuse under a heavy load is like working in a time warp.

Victor, thanks for the link to the script. My site doesn't use PHP, but I can easily adapt the concept.

victor
msg:465128
5:24 pm on Apr 19, 2005 (gmt 0)

Thanks for asking.

Generally, I do not want a bot sucking bandwidth or using CGI time any faster than a human would. Remember, the site could have many bots all clamoring for attention at any one moment.

The bots have got to be reasonable and give priority to the humans. The bots can keep going 24x7 to get more pages than a human ever would.

I put a crawl-delay of 15 seconds in when Microsoft's beta bot started rampaging. Crawl-delay is a non-standard robots.txt directive that only a few bots respect.
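
For anyone unfamiliar with the directive, the entry victor describes would look something like this in robots.txt (msnbot as the user-agent is my assumption; victor doesn't name the bot):

    User-agent: msnbot
    Crawl-delay: 15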

Perhaps you could sample some robots.txt files and see what crawl-delays are common. Or maybe ask Brett's robots survey to check for that:
[webmasterworld.com...]

incrediBILL
msg:465129
5:43 pm on Apr 19, 2005 (gmt 0)

"The bots can keep going 24x7 to get more pages than a human ever would"

My primary site would generate between 40k and 80k pages, and if someone wanted all of that in a hurry it would be very problematic. The server that was hit yesterday averages about 300 Kbps all day long, so when it suddenly skyrockets to 2 Mbps it's real easy to spot an issue. I keep historical traffic graphs for all my servers and domains, with a 12 month history, to see long-term trends (and abuse spikes), and the amazing thing is this really doesn't happen that frequently based on the graphs; nothing on this scale in the last 2 months, anyway.

jschmitz
msg:465130
5:54 pm on Apr 19, 2005 (gmt 0)

How are you guys tracking real-time traffic? Are you running your own servers? HOW are you DOING IT?

thanks

judy

incrediBILL
msg:465131
6:02 pm on Apr 19, 2005 (gmt 0)

I use a rather arcane tool called MRTG (Multi Router Traffic Grapher) that is updated in 5 minute increments.

[mrtg.biz...]

I have it set up to page my cell phone when the CPU exceeds 90% or bandwidth spikes over 200%, so I can catch abuse in real time. However, when you've forgotten the ringer is turned off on the phone, it doesn't help much :)

And yes, it's on my own server.
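
For reference, MRTG's threshold directives can drive that kind of alerting; a fragment along these lines, where the SNMP target and script path are illustrative guesses rather than incrediBILL's actual config:

    # mrtg.cfg fragment: run a paging script when the reading tops 90%
    Target[cpu]: 1.3.6.1.4.1.2021.11.9.0&1.3.6.1.4.1.2021.11.9.0:public@localhost
    MaxBytes[cpu]: 100
    ThreshMaxI[cpu]: 90%
    ThreshProgI[cpu]: /usr/local/bin/page-my-cell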

ByronM
msg:465132
6:35 pm on Apr 19, 2005 (gmt 0)

Can you use mod_throttle to slow down the IPs eating up your resources? (It may be more CPU-intensive than it's worth.)

I can't stand bots that go nuts, and it is bad practice. Have you verified the bot was all from a single host, or was it simply that they built a new fetch list that had every one of your pages in it, and you had a few dozen of the MSN spiders hitting your site?

I typically run 4 spiders at once, so I'm always thinking of ways to keep them behaving while still sustaining a fresh index. :)

incrediBILL
msg:465133
7:06 pm on Apr 19, 2005 (gmt 0)

"Can you use mod_throttle to slow down the IPs eating up your resources?"

Nope. We used to use mod_throttle, but the author said he's not upgrading it to work with Apache 2.0, and I'm sure as heck not going back to Apache 1.3, so alternative strategies will need to be employed.

It wasn't a mainstream spider like MSN; it was one I didn't care about, so I just blocked the IP.

victor
msg:465134
6:28 am on Apr 21, 2005 (gmt 0)

"How are you guys tracking real-time traffic? Are you running your own servers? HOW are you DOING IT?"

All my sites are dynamic, not static. That is, a browser or spider connects to a CGI script, not a pre-made page.

The common code at the start of each of my scripts calls a bad-bot check routine.

That keeps a sliding window of the last 3 minutes' worth of requests. If an IP address is on that list too many times, it gets banned for 5 minutes, i.e. it'll get the "too fast" message for any request in the next 5 minutes.

But if it's still spidering too fast when the 5 minute ban expires, it gets banned for 30 minutes -- with a slightly tougher message.

Then two hours.

Then 24 hours.

Anyone persistently stupid enough to get themselves into the 24-hour ban list, I review for manual eternal zapping.

That's a more complex scheme than the PHP solution (see message 3 in this thread). But it works for me, and it allows bad bots or offline readers to adjust their behavior and return later as welcome guests.
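
Here is a compact Python sketch of that escalation, assuming a request threshold of my own choosing (victor doesn't give one) alongside his 3-minute window and 5 minute / 30 minute / 2 hour / 24 hour schedule:

    import time
    from collections import defaultdict, deque

    WINDOW = 180                           # sliding window: last 3 minutes
    MAX_HITS = 60                          # "too many times" (assumed value)
    BAN_STEPS = [300, 1800, 7200, 86400]   # 5 min, 30 min, 2 h, 24 h

    hits = defaultdict(deque)              # ip -> recent request timestamps
    ban_level = defaultdict(int)           # ip -> bans earned so far
    banned_until = {}                      # ip -> time the current ban ends

    def bad_bot_check(ip):
        """Call at the top of every CGI script; True means serve the ban page."""
        now = time.time()
        if banned_until.get(ip, 0) > now:
            return True
        window = hits[ip]
        window.append(now)
        while window and window[0] < now - WINDOW:
            window.popleft()
        if len(window) > MAX_HITS:
            step = min(ban_level[ip], len(BAN_STEPS) - 1)
            banned_until[ip] = now + BAN_STEPS[step]
            ban_level[ip] += 1             # the next offense earns a longer ban
            window.clear()
            return True
        return False

An IP that climbs through all four steps lands on the 24-hour list, which is where the manual review for permanent zapping would kick in; a fuller version would also decay ban_level so a reformed bot really can come back as a welcome guest.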
