Sitemaps, Meta Data, and robots.txt Forum

    
blocking certain files with robots.txt?
htdawg
msg:3662711
10:39 am on May 30, 2008 (gmt 0)

Hi,

I have a directory site, and each category has an "add URL" link. I see that Google is indexing this link for every category, i.e. www.mysite.com/dir/add_url.php?c=5, c=6, c=7, etc.
Is there a way to block each add-url link from being indexed with robots.txt, without going through the whole site and copy-pasting each URL into my robots.txt file?

thanks

 

OutdoorMan
msg:3663205
8:39 pm on May 30, 2008 (gmt 0)

This should do the job:

Disallow: /*/add_url.php$

htdawg
msg:3663512
7:38 am on May 31, 2008 (gmt 0)

Thanks OutdoorMan!

g1smd
msg:3696319
2:22 pm on Jul 11, 2008 (gmt 0)

That will only block URLs that end in .php exactly.

If you need to block URLs with appended query strings too, then remove the $ from the end of the rule.

Receptional Andy
msg:3696322
2:26 pm on Jul 11, 2008 (gmt 0)

If 'dir' is a constant part of the URL, then all that's needed is:
Disallow: /dir/add_url.php

Note that this will mean no version of add_url.php will appear in search results.
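To see how the rules in this thread differ, here's a rough Python sketch of Google-style robots.txt path matching. Note that '*' and the trailing '$' are extensions supported by Google (and some other crawlers), not part of the original robots.txt standard, and this simplified matcher is an illustration, not a full implementation of any crawler's behavior:

```python
import re

def robots_rule_matches(rule: str, path: str) -> bool:
    """Check whether a Google-style Disallow rule matches a URL path.

    '*' matches any run of characters; a trailing '$' anchors the
    match at the end of the path. Otherwise the rule is a prefix.
    """
    anchored = rule.endswith("$")
    pattern = rule[:-1] if anchored else rule
    # Build a regex: escape literal characters, turn '*' into '.*'.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    regex = "^" + regex + ("$" if anchored else "")
    return re.search(regex, path) is not None

# The rule with a trailing $ blocks only the bare URL...
print(robots_rule_matches("/*/add_url.php$", "/dir/add_url.php"))       # True
# ...but not the query-string versions Google was indexing:
print(robots_rule_matches("/*/add_url.php$", "/dir/add_url.php?c=5"))   # False
# Dropping the $ (or using the plain prefix rule) blocks both:
print(robots_rule_matches("/*/add_url.php", "/dir/add_url.php?c=5"))    # True
print(robots_rule_matches("/dir/add_url.php", "/dir/add_url.php?c=5"))  # True
```

This is why the $ matters: with it, only URLs ending exactly in add_url.php are blocked; without it, every URL starting with that path (query strings included) is blocked.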
