
Blocking Spiders

How to block spiders from crawling your site

   
7:03 pm on Jul 25, 2011 (gmt 0)



We are getting ready to implement a feature on our site that would create massive duplicate content. It is good for users, bad for spiders. We have weighed several options for avoiding the problem:

1) rel="canonical" - I don't think this is the best option, because spiders will still crawl the duplicate pages and use up our resources.

2) Block all major bot user agents from seeing this feature - would require constant updating, and we would never be able to block all bots.

3) Meta noindex, nofollow, noarchive - could be a workable solution (see the sketch below).
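
For reference, option 3 would mean a tag like this in the head of each duplicate page (a minimal sketch; whether you also want nofollow depends on whether those pages link anywhere unique):

    <!-- keep this page out of the index, don't follow its links, don't cache a copy -->
    <meta name="robots" content="noindex, nofollow, noarchive">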

What do you guys suggest, or what have you had the best luck with?
8:40 pm on Jul 25, 2011 (gmt 0)

penders (WebmasterWorld Senior Member)



4) Block known spider IPs? (Is there a maintained list of known spider IPs?) ... this would need updating over time, just like the user agent list.

5) Robots.txt?
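
A sketch of (5), assuming the duplicate pages all share a recognisable path - /feature/ here is only a placeholder:

    User-agent: *
    Disallow: /feature/

Good bots will skip anything under that path; bad bots will ignore the file entirely.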
8:50 pm on Jul 25, 2011 (gmt 0)

tangor (WebmasterWorld Senior Member)



Work smart, not hard... WHITELIST the bots allowed and disallow all the rest. These days that list of who's invited in is MUCH SMALLER than the other way around.
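
A whitelist robots.txt might look something like this (a sketch - which bots make your invite list is up to you; each named bot matches its own group, and everyone else falls through to the catch-all):

    User-agent: Googlebot
    Disallow:

    User-agent: bingbot
    Disallow:

    User-agent: *
    Disallow: /

An empty Disallow line means "allow everything" for that bot.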
8:57 pm on Jul 25, 2011 (gmt 0)

g1smd (WebmasterWorld Senior Member)



Bad bots ignore robots.txt so you need .htaccess rules to shut the door in their face.

Good bots obey robots.txt, so you can keep them out of various bits of the site quite easily if those URLs have an easy-to-recognise pattern or feature in them.
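
For the bad bots, something like this in .htaccess does the door-shutting (mod_rewrite; the user agent strings below are only examples - build your own list from your logs):

    # return 403 Forbidden to a few known bad user agents
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (libwww-perl|HTTrack|WebCopier) [NC]
    RewriteRule .* - [F]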
9:12 pm on Jul 25, 2011 (gmt 0)

tangor (WebmasterWorld Senior Member)



Thanks for the reminder that I left out a bit, g1smd... .htaccess, of course, is how it is done. As for robots.txt, the same applies: whitelist what you allow and disallow all the rest... then deal with those unruly bots which do not play nice. Mea culpa for leaving that out, as we say this many, many times in many, many threads... my bad!
9:15 pm on Jul 25, 2011 (gmt 0)

g1smd (WebmasterWorld Senior Member)



It's a multi-faceted approach. Once you have a month of raw site logs you will have details for well over 90% of the bots that might access your site.

After that, you'll just need a minor tweak to the rules now and again.
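
If you're on Apache with the stock combined log format, a quick way to pull that list out of the logs (this assumes the user agent is the sixth quote-delimited field, as it is in the combined format):

    awk -F'"' '{print $6}' access.log | sort | uniq -c | sort -rn | head -50

That gives you the fifty most common user agents by request count; anything you don't recognise goes on the research list.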
 
