User-agent: Googlebot
Disallow: /*?
Disallow: /scpages
Disallow: /merchant
Disallow: /cgi-local
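For reference, a Disallow path is matched against the URL path starting from its first character, so "Disallow: /scpages" only blocks URLs whose path begins with /scpages right at the root. Here is a quick local sanity check of that prefix behavior (a minimal sketch using Python's standard urllib.robotparser, which understands plain prefix rules but not Google's * and $ wildcards; the mysite.com URLs are just placeholders):

import urllib.robotparser

# Only the plain prefix rule is included here; urllib.robotparser
# does not implement Google's * and $ wildcard extensions.
rules = """\
User-agent: Googlebot
Disallow: /scpages
"""
rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A path that starts with /scpages is blocked...
print(rp.can_fetch("Googlebot", "http://mysite.com/scpages/page.html"))  # False
# ...but /scpages buried deeper in the path is not a prefix match, so it's allowed.
print(rp.can_fetch("Googlebot", "http://mysite.com/online-store/scstore/scpages/page.html"))  # True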
When I checked the following URL with the check tool in the robots.txt section of Google Webmaster Tools (formerly Sitemaps), it told me that the page is ALLOWED.
[mysite.com...]
I also have the "scpages" directory disallowed for all robots, like this:
User-agent: *
Disallow: /scpages/
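One more thing worth knowing, as I understand the robots.txt rules: a crawler picks the single user-agent group that matches it best and ignores all the others, so when a "User-agent: Googlebot" group exists, Googlebot obeys only that group and never falls back to the "User-agent: *" group. A small sketch of that group selection (same stdlib parser, placeholder URLs again):

import urllib.robotparser

# Googlebot has its own group, so the catch-all group does not apply to it.
rules = """\
User-agent: Googlebot
Disallow: /merchant

User-agent: *
Disallow: /scpages/
"""
rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot's own group says nothing about /scpages/, so the URL is allowed...
print(rp.can_fetch("Googlebot", "http://mysite.com/scpages/page.html"))  # True
# ...while a bot without a group of its own falls back to User-agent: *.
print(rp.can_fetch("SomeOtherBot", "http://mysite.com/scpages/page.html"))  # False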
I do not believe my robots.txt should allow the above URL to be crawled. Am I doing something wrong, or is the URL checking tool in Google Webmaster Tools nuts? If I've done it wrong, do I need a longer path statement, like this?
User-agent: Googlebot
Disallow: /online-store/scstore/scpages*
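If I'm reading Google's documented matching rules right, every Disallow value is already treated as a prefix of the URL path (plus query string), so the trailing * shouldn't change anything; what matters is that /scpages alone never matched the deeper /online-store/scstore/scpages path at all. Here is a rough approximation of that wildcard matching (google_style_match is just my own helper for illustration, not Google's actual code; the paths are placeholders):

import re

def google_style_match(pattern, path):
    # Rough approximation of Google's documented robots.txt matching:
    # every pattern is anchored at the start of the path, '*' matches
    # any run of characters, and a trailing '$' anchors the end.
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

# The longer rule matches the deep URL...
print(google_style_match("/online-store/scstore/scpages*", "/online-store/scstore/scpages/page.html"))  # True
# ...and it matches even without the trailing *, since every rule is a prefix match anyway.
print(google_style_match("/online-store/scstore/scpages", "/online-store/scstore/scpages/page.html"))   # True
# The original short rule never matched this URL at all.
print(google_style_match("/scpages", "/online-store/scstore/scpages/page.html"))  # False
# And /*? matches any URL whose path-plus-query contains a question mark.
print(google_style_match("/*?", "/online-store/scstore/scpages/page.html?id=3"))  # True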
I don't understand what the heck is going on. I think the check tool is wrong, but who knows? Maybe I'm missing something. Thanks for any help. Tonerman