
Forum Moderators: goodroi

blocking certain files with robots.txt?

   
10:39 am on May 30, 2008 (gmt 0)

Hi,

I have a directory site, and each category has an "add URL" link. I see that Google is indexing this link for every category, i.e. www.mysite.com/dir/add_url.php?c=5, c=6, c=7, etc.

Is there a way to block each add_url link from being indexed with robots.txt, without going through the whole site and copy-pasting every URL into my robots.txt file?

thanks

8:39 pm on May 30, 2008 (gmt 0)

This should do the job:

Disallow: /*/add_url.php$
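
As a complete file, that rule needs a User-agent line above it. A minimal sketch, assuming it should apply to all crawlers (the * and $ wildcards are extensions honored by Google and other major engines, not part of the original robots.txt standard):

User-agent: *
# block add_url.php in any subdirectory, but only when the URL ends in .php
Disallow: /*/add_url.php$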

7:38 am on May 31, 2008 (gmt 0)

Thanks OutdoorMan!

2:22 pm on Jul 11, 2008 (gmt 0)

That will only block URLs that end in .php exactly.

If you need to block versions with appended query strings too, remove the $ from the end of the rule.
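
A sketch of the difference, assuming Google-style wildcard matching:

# with the $ anchor: only the bare script is blocked
Disallow: /*/add_url.php$
# blocked:  /dir/add_url.php
# crawled:  /dir/add_url.php?c=5  (the query string means the URL no longer ends in .php)

# without the anchor: the script and all query-string variants are blocked
Disallow: /*/add_url.php
# blocked:  /dir/add_url.php
# blocked:  /dir/add_url.php?c=5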

2:26 pm on Jul 11, 2008 (gmt 0)



If 'dir' is a constant part of the URL, then all that's needed is:

Disallow: /dir/add_url.php

Note that this means no version of add_url.php under /dir/ will appear in search results.
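
Since plain Disallow rules are prefix matches, that one line covers every variant from the original post. A sketch, assuming the category URLs shown above:

User-agent: *
Disallow: /dir/add_url.php
# blocked:  /dir/add_url.php
# blocked:  /dir/add_url.php?c=5
# blocked:  /dir/add_url.php?c=6  ...and so on for every category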

 
