
Why is Google following my exclusions?

Robots.txt file says don't go, and Google does!

4:30 am on Jul 14, 2004 (gmt 0)

10+ Year Member

My robots.txt file at www.mysite.com/robots.txt contains the following:

User-agent: *
Disallow: alert.asp

But Google follows every single one of them! I have alert.asp?issue=1 through issue=10000, and Googlebot crawls every single page.

What am I doing wrong?

4:41 am on Jul 14, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

How new is your robots.txt? Google often caches robots.txt for a while before picking up a fresh copy. If there is nothing wrong with the file, just give it some time.

4:54 am on Jul 14, 2004 (gmt 0)

WebmasterWorld Senior Member ogletree is a WebmasterWorld Top Contributor of All Time 10+ Year Member

If that is true, why do they read the thing every day? I have never seen a site that did not get at least two Googlebot hits a day: one to index a page and one to fetch robots.txt. I'm sure some smaller sites don't get that, but any site that has pages indexed in G and a PR4 or better should.

4:56 am on Jul 14, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

You know, one other way you could do it would be to do user-agent detection via ASP.

Put this at the top of your ASP page:

<%
' InStr is case-sensitive by default, so pass vbTextCompare --
' the real crawler identifies itself as "Googlebot" with a capital G.
If InStr(1, Request.ServerVariables("HTTP_USER_AGENT"), "googlebot", vbTextCompare) > 0 Then
    Response.Write "Sorry google, no access baby"
Else
%>
... the rest of your page ...

Then close it with <% End If %> at the bottom. That would go into effect immediately.
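
A stricter variant of the same idea, as a sketch (assuming classic ASP/VBScript; the wording of the denial message is illustrative): return a real HTTP 403 and stop processing, instead of serving a normal 200 page around the message:

<%
' Deny the request with an actual 403 status rather than a 200 page.
If InStr(1, Request.ServerVariables("HTTP_USER_AGENT"), "googlebot", vbTextCompare) > 0 Then
    Response.Status = "403 Forbidden"
    Response.Write "Access denied"
    Response.End   ' stop here; nothing below this line is sent
End If
%>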

5:49 am on Jul 14, 2004 (gmt 0)

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Simple fix:

User-agent: *
Disallow: [b]/[/b]alert.asp

Disallow values are URL-path prefixes, and the URL path always starts with a slash, so a rule without the leading "/" never matches anything.

Jim
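
To see why that works: crawlers treat each Disallow value as a prefix of the URL path, and the path component always begins with "/". Here is a minimal sketch of that prefix test, in VBScript to match the code above (the IsBlocked helper is hypothetical, for illustration only):

<%
' Hypothetical helper: does one Disallow value block a given URL path?
' Crawlers match Disallow values as simple prefixes of the path.
Function IsBlocked(disallowValue, urlPath)
    IsBlocked = False
    If Len(disallowValue) > 0 Then
        If Left(urlPath, Len(disallowValue)) = disallowValue Then
            IsBlocked = True
        End If
    End If
End Function

Response.Write IsBlocked("alert.asp", "/alert.asp")   ' False: the broken rule never matches
Response.Write IsBlocked("/alert.asp", "/alert.asp")  ' True: the leading slash fixes it
%>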