Welcome to WebmasterWorld
blocking certain files with robots.txt?

htdawg msg:3662711 10:39 am on May 30, 2008 (gmt 0)

Hi,
I have a directory site, and for each category there is an "add URL" link. I see that Google is indexing this link for every category, i.e. www.mysite.com/dir/add_url.php?c=5, c=6, c=7, etc. Is there a way to block each add-URL link from being indexed with robots.txt, without going through the whole site and copy-pasting each URL into my robots.txt file?
OutdoorMan msg:3663205 8:39 pm on May 30, 2008 (gmt 0)
This should do the job:
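A sketch of the kind of rule meant here, assuming the end-anchored form implied by the replies below (the trailing $ is a wildcard extension honored by Googlebot and other major crawlers, not part of the original robots.txt standard):

```
User-agent: *
# The trailing $ anchors the match at the end of the URL,
# so only the bare /dir/add_url.php is blocked; URLs with
# appended query strings are NOT covered by this rule.
Disallow: /dir/add_url.php$
```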
htdawg msg:3663512 7:38 am on May 31, 2008 (gmt 0)
Thanks OutdoorMan!

g1smd msg:3696319 2:22 pm on Jul 11, 2008 (gmt 0)
That will only block stuff that ends with .php exactly.
If you need to block stuff with appended query strings, too, then remove the $ from the end of the rule.
Receptional Andy msg:3696322 2:26 pm on Jul 11, 2008 (gmt 0)
If 'dir' is a constant part of the URL, then all that's needed is
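A sketch of the un-anchored form meant here (assumed wording; without a trailing $ this is a plain prefix match, so the query-string variants are blocked as well):

```
User-agent: *
# Prefix match: blocks /dir/add_url.php and every variant
# with a query string, e.g. /dir/add_url.php?c=5
Disallow: /dir/add_url.php
```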
Note that this will mean no version of add_url.php will appear in search results.