Forum Moderators: goodroi


Banning googlebot from individual pages via robots.txt

Meta tags are just not doing the job properly

         

Vimes

8:07 am on Mar 31, 2005 (gmt 0)

10+ Year Member



Hi,

Recently I added Meta tags to some pages that I did not want indexed.

Now I want them completely removed from Googlebot's agenda. Is this the correct way of removing them from the index?

User-agent: googlebot
Disallow: /directory/lowerdirectory/*.htm

I would still like /directory/lowerdirectory.htm itself to be indexed; I just want anything under /lowerdirectory/ not to be indexed. There are only a few pages in this directory, so would it be better to just give the bot the exact location, i.e.

User-agent: googlebot
Disallow: /directory/lowerdirectory/fileiwantremoved.htm

I don't want to list just /fileiwantremoved.htm, as it's my understanding that all files with this name would be axed.

I have pages in different directories that end in the same name; some I want to remain indexed and others I don't.
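(Editor's note: a quick way to sanity-check which URLs a rule actually blocks is Python's standard urllib.robotparser. Note it implements the original robots.txt specification, which matches Disallow paths by prefix and has no wildcard support, so this sketch tests the exact-path rule rather than the *.htm pattern; example.com is just a placeholder host.)

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt using the exact-path rule from the post.
rules = """\
User-agent: googlebot
Disallow: /directory/lowerdirectory/fileiwantremoved.htm
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The named file is blocked for googlebot...
blocked_file = parser.can_fetch(
    "googlebot", "http://example.com/directory/lowerdirectory/fileiwantremoved.htm")

# ...while the similarly named page one level up stays crawlable,
# because Disallow rules are anchored at the start of the path.
upper_page = parser.can_fetch(
    "googlebot", "http://example.com/directory/lowerdirectory.htm")

# A file with the same name in a different directory is also unaffected.
other_dir = parser.can_fetch(
    "googlebot", "http://example.com/other/fileiwantremoved.htm")

print(blocked_file, upper_page, other_dir)  # False True True
```

Because rules match from the root of the path, a same-named file elsewhere is not touched, which addresses the worry above about all files with that name being axed.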

I hope this makes sense; any advice would be greatly appreciated.

Vimes.