blocking certain files with robots.txt?

10:39 am on May 30, 2008 (gmt 0)

Full Member

Hi,

I have a directory, and each category has an "add URL" link. I see that Google is indexing this link for every category, e.g. www.mysite.com/dir/add_url.php?c=5, c=6, c=7, etc.

Is there a way to block every add_url link from being indexed with robots.txt, without going through the whole site and copy-pasting each URL into my robots.txt file?

thanks

8:39 pm on May 30, 2008 (gmt 0)

Junior Member

This should do the job:

Disallow: /*/add_url.php$
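
For reference, a complete robots.txt using that rule might look like the following sketch, assuming you want it to apply to all crawlers. Note that the * and $ wildcards are extensions honoured by Googlebot and some other major crawlers, not part of the original robots.txt standard, so lesser bots may ignore the pattern.

User-agent: *
Disallow: /*/add_url.php$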

7:38 am on May 31, 2008 (gmt 0)

Full Member

Thanks OutdoorMan!

2:22 pm on July 11, 2008 (gmt 0)

Senior Member g1smd

That will only block URLs that end with .php exactly, because the trailing $ anchors the pattern to the end of the URL.

If you need to block URLs with appended query strings too, then remove the $ from the end of the rule.
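
With the $ removed, the rule becomes Disallow: /*/add_url.php. To illustrate the difference, here is a minimal Python sketch that mimics Google-style wildcard matching by translating a rule into a regular expression; is_blocked is a made-up helper for illustration, not what any crawler actually runs:

import re

def is_blocked(pattern, path):
    # Translate Google-style robots.txt wildcards into a regex:
    # '*' matches any run of characters, and a trailing '$'
    # anchors the pattern to the end of the URL.
    regex = re.escape(pattern).replace(r'\*', '.*')
    if regex.endswith(r'\$'):
        regex = regex[:-2] + '$'
    return re.match(regex, path) is not None

print(is_blocked('/*/add_url.php$', '/dir/add_url.php'))      # True
print(is_blocked('/*/add_url.php$', '/dir/add_url.php?c=5'))  # False: $ anchors the end
print(is_blocked('/*/add_url.php', '/dir/add_url.php?c=5'))   # True once the $ is removed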

2:26 pm on July 11, 2008 (gmt 0)

Senior Member

If 'dir' is a constant part of the URL, then all that's needed is:

Disallow: /dir/add_url.php

Note that this is a plain prefix match: no URL beginning with /dir/add_url.php, with or without a query string, will be crawled or appear in search results.
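
If you want to sanity-check a plain prefix rule like this one, Python's standard-library robotparser understands it (it does not implement the * and $ wildcard extensions, but ordinary prefix rules work). A minimal sketch using this thread's example domain; other_page.php is just a hypothetical URL that should remain allowed:

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /dir/add_url.php",
])

# Prefix matching covers every query-string variant with one rule.
print(rp.can_fetch("*", "http://www.mysite.com/dir/add_url.php?c=5"))  # False (blocked)
print(rp.can_fetch("*", "http://www.mysite.com/dir/other_page.php"))   # True (allowed)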