
Blocking Google Only

From certain subdirectories

     
1:54 pm on Apr 10, 2007 (gmt 0)

5+ Year Member



Being a victim of Google's new algo, I need to stop Google from crawling certain subdirectories of my site. How do I do this?
1:14 pm on Apr 11, 2007 (gmt 0)

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Add this to your robots.txt:
User-agent: Googlebot
Disallow: /subdirectory1/
Disallow: /subdirectory2/
Disallow: /subdirectoryetc/

[google.com...]

To verify that your robots.txt is correct, you can use Google Webmaster Central.
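
If you want to sanity-check the rules offline as well, here is a minimal sketch using Python's standard-library urllib.robotparser (the example.com URLs and page names are placeholders, not anything from this thread):

from urllib.robotparser import RobotFileParser

# The rules suggested above, as a hypothetical robots.txt.
rules = """\
User-agent: Googlebot
Disallow: /subdirectory1/
Disallow: /subdirectory2/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot is blocked from the listed subdirectories...
print(rp.can_fetch("Googlebot", "http://example.com/subdirectory1/page.html"))  # False
# ...but the rest of the site, and every other crawler, is unaffected.
print(rp.can_fetch("Googlebot", "http://example.com/other/page.html"))          # True
print(rp.can_fetch("OtherBot", "http://example.com/subdirectory1/page.html"))   # True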

7:06 am on Apr 13, 2007 (gmt 0)

5+ Year Member



Hi Goodroi

Thanks, but Google's own pages say to write subdirectories as /open, without the trailing slash, not /closed/ as everyone else does.

7:18 pm on Apr 13, 2007 (gmt 0)

WebmasterWorld Administrator goodroi is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



To further complicate things, you can look at Google's own robots.txt file (http://www.google.com/robots.txt), which uses both styles :)
7:25 pm on Apr 13, 2007 (gmt 0)

WebmasterWorld Senior Member encyclo is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I believe there is a difference between the two styles (please correct me if I'm wrong!). As I understand it:

Disallow: /foo

Matches

/foo/
and
/foo.html

Whereas:

Disallow: /foo/

Matches the directory only.
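
That is the original prefix-matching behaviour, and you can see it directly with Python's standard-library urllib.robotparser, which implements those prefix rules (example.com and the blocked() helper are just illustration, not part of any spec):

from urllib.robotparser import RobotFileParser

def blocked(rule, path):
    # Build a one-rule robots.txt and ask whether Googlebot may fetch path.
    # example.com is a placeholder host.
    rp = RobotFileParser()
    rp.parse(["User-agent: Googlebot", "Disallow: " + rule])
    return not rp.can_fetch("Googlebot", "http://example.com" + path)

print(blocked("/foo", "/foo/"))       # True  - prefix match hits the directory
print(blocked("/foo", "/foo.html"))   # True  - and the file
print(blocked("/foo/", "/foo/"))      # True  - trailing slash: the directory
print(blocked("/foo/", "/foo.html"))  # False - the file stays crawlable

One side effect of pure prefix matching: Disallow: /foo would also block /foobar, so the trailing slash is the safer choice when you only mean the directory.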

 
