Forum Moderators: goodroi
Using the Google removal tool, I need to build a robots.txt file that can exclude search engines from crawling some pages (1,000 or more) like these:
1abc.html
2abc.html
3abc.html
Could I use something like this:
User-agent: *
Disallow: /*abc.html
Would this pattern work for me?
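For context: Google documents `*` in a Disallow path as matching any sequence of characters, with the rule anchored at the start of the URL path. A quick sanity check of how that pattern would match the pages above (the regex translation is my own sketch of Google's documented behaviour, not code from any crawler):

```python
import re

def matches_disallow(path: str, rule: str) -> bool:
    """Check whether a URL path matches a Google-style Disallow rule.

    Google treats '*' as 'any sequence of characters' and matches the
    rule against the start of the path; this translation to a regex is
    an assumption based on Google's documented wildcard handling.
    """
    regex = "^" + ".*".join(re.escape(part) for part in rule.split("*"))
    return re.search(regex, path) is not None

# Pages from the question, plus one that should NOT be blocked:
for page in ["/1abc.html", "/2abc.html", "/3abc.html", "/contact.html"]:
    print(page, matches_disallow(page, "/*abc.html"))
```

By this reading, /1abc.html, /2abc.html, and /3abc.html would all be blocked, while an unrelated page like /contact.html would not. Note that wildcard support is a Google extension; the original robots.txt standard does not define `*` inside paths, so other crawlers may ignore it.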
Thank you
Tintin