Forum Moderators: goodroi

Message Too Old, No Replies

Google Sitemaps Robots.txt Question


Tonerman

3:45 pm on Oct 9, 2006 (gmt 0)

10+ Year Member



I have the following in my robots.txt file:

User-agent: Googlebot
Disallow: /*?
Disallow: /scpages
Disallow: /merchant
Disallow: /cgi-local

When I checked the following URL with the check tool in Google's robots.txt section of Google Webmasters Tools (formerly sitemaps) it tells me that the page is ALLOWED.

[mysite.com...]

I also have the directory "scpages" disallowed for all robots like this:

User-agent: *
Disallow: /scpages/

I do not believe my robots.txt should allow the above URL to be indexed. Am I doing something wrong, or is the URL-checking tool in Google Webmaster Tools nuts? If I've done it wrong, do I need a longer path statement, like this?

User-agent: Googlebot
Disallow: /online-store/scstore/scpages*

I don't understand what the heck is going on. I think the check tool is wrong, but who knows? Maybe I'm missing something. Thanks for any feedback. Tonerman

Tonerman

5:31 pm on Oct 9, 2006 (gmt 0)

10+ Year Member



OK, I'm officially stupid. I had it wrong. The correct syntax is:

User-agent: Googlebot
Disallow: /*?
Disallow: /online-store/scstore/scpages/

I'd love to blow my stupid question away. Maybe the moderator will take heart! Tonerman
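For anyone landing on this thread later: the behavior Tonerman ran into is that a Disallow path is matched as a prefix of the URL path, starting at the root, so `Disallow: /scpages` blocks `/scpages/...` but not a `scpages` directory nested deeper in the tree. This can be checked with Python's standard-library robots.txt parser. Note that `urllib.robotparser` implements the original 1994 exclusion spec, which has no wildcard support, so the `Disallow: /*?` rule from the thread is omitted here; `example.com` and the item paths are placeholders, not the poster's real URLs.

```python
from urllib import robotparser

def blocked(rules: str, path: str) -> bool:
    """Return True if Googlebot is disallowed from `path` under `rules`."""
    rp = robotparser.RobotFileParser()
    rp.parse(rules.splitlines())
    return not rp.can_fetch("Googlebot", "http://example.com" + path)

# The original rule from the first post.
original = "User-agent: Googlebot\nDisallow: /scpages\n"

# "/scpages" blocks paths that begin with "/scpages" ...
print(blocked(original, "/scpages/item.html"))                        # True
# ... but NOT a "scpages" directory nested deeper in the path.
print(blocked(original, "/online-store/scstore/scpages/item.html"))   # False

# The corrected rule from the follow-up post spells out the full path.
corrected = "User-agent: Googlebot\nDisallow: /online-store/scstore/scpages/\n"
print(blocked(corrected, "/online-store/scstore/scpages/item.html"))  # True
```

This matches what the Webmaster Tools checker reported: the deep URL was ALLOWED under the original rule because its path does not start with `/scpages`.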