Forum Moderators: goodroi


Need help removing multiple pages


tintin74

6:56 pm on Nov 14, 2005 (gmt 0)

10+ Year Member



Hi everybody,

Using the Google removal tool, I need to build a robots.txt file that stops search engines from crawling some pages (1,000 or more) like these:

1abc.html
2abc.html
3abc.html

Could I use something similar to:

User-agent: *
Disallow: /*abc.html

Would this rule work for me?

Thank you

Tintin

Dijkgraaf

8:49 pm on Nov 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Have a look at this thread here.
[webmasterworld.com...]
The conclusion is that the Google removal tool does not support wildcards.
Wildcards in disallows are not currently part of the robots.txt standard, so it is not recommended that you use them, even if some bots do support them.
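A standards-safe alternative is to list each page explicitly. Here is a minimal sketch (assuming the filenames really follow the `<number>abc.html` pattern from the question, numbered 1 through 1000) that generates the Disallow lines rather than typing them by hand:

```python
# Generate explicit robots.txt Disallow rules for pages named
# "<number>abc.html", since wildcard disallows are non-standard.
# The 1..1000 range is an assumption based on the "1000 or more"
# figure in the question; adjust to the real file list.
lines = ["User-agent: *"]
lines += [f"Disallow: /{i}abc.html" for i in range(1, 1001)]
robots_txt = "\n".join(lines)

# Write the result out as robots.txt for upload to the site root.
with open("robots.txt", "w") as f:
    f.write(robots_txt + "\n")
```

Every rule is plain-path only, so any crawler that honors robots.txt at all should respect it, with no reliance on wildcard extensions.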

tintin74

11:00 am on Nov 15, 2005 (gmt 0)

10+ Year Member



Thank you, Dijkgraaf

Tintin