Why is Google following my exclusions? Robots.txt file says don't go and Google does!
anxvariety msg:1529247 4:30 am on Jul 14, 2004 (gmt 0)
My robots.txt file at www.mysite.com/robots.txt contains the following:
But Google follows every single one of them! I have alert.asp?issue=1 through 10000, and it crawls every single page.
What am I doing wrong?
digitalv msg:1529248 4:41 am on Jul 14, 2004 (gmt 0)
How new is your robots.txt? Google often caches robots.txt for a while before picking up a fresh copy. If there is nothing wrong with your robots.txt file, just give it some time.
ogletree msg:1529249 4:54 am on Jul 14, 2004 (gmt 0)
If that is true, why do they read the thing every day? I have never seen a site that did not get at least two Googlebot hits a day: one to crawl pages and one to fetch robots.txt. I'm sure some smaller sites don't get that, but any site that has pages indexed in Google and a PR4 or better should.
digitalv msg:1529250 4:56 am on Jul 14, 2004 (gmt 0)
You know, one other way you could do it would be user-agent detection via ASP:
Put this at the top of your ASP page:
If InStr(1, Request.ServerVariables("HTTP_USER_AGENT"), "googlebot", vbTextCompare) > 0 Then
    Response.Write "Sorry Google, no access"
Else
    ... the rest of your page
Then throw an "End If" at the bottom. (Note the vbTextCompare flag: Googlebot sends its name with a capital G, so a default case-sensitive InStr on "googlebot" would never match.) That would go into effect immediately.
jdMorgan msg:1529251 5:49 am on Jul 14, 2004 (gmt 0)
Simple fix:

User-agent: *
Disallow: [b]/a[/b]lert.asp

Jim
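The difference that leading slash makes can be sketched with Python's built-in robots.txt parser. This is only an illustration of the suspected bug (the thread never shows the original file, so the slash-less rule is an assumption); Python's parser simply never matches a path that doesn't start with "/", and the standard says every Disallow path should begin with one:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rule WITHOUT a leading slash -- the suspected bug
broken = RobotFileParser()
broken.parse([
    "User-agent: *",
    "Disallow: alert.asp",
])

# Jim's fix: the path begins with "/"
fixed = RobotFileParser()
fixed.parse([
    "User-agent: *",
    "Disallow: /alert.asp",
])

url = "http://www.mysite.com/alert.asp?issue=1"
print(broken.can_fetch("Googlebot", url))  # True  -- page is still crawlable
print(fixed.can_fetch("Googlebot", url))   # False -- page is blocked
```

Real crawlers vary in how leniently they treat malformed rules, so this only shows what a strict standards-based parser does with each file.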