|Google Site Diagnostics - robots.txt blocks urls|
Recently I did some changes to my robots.txt file and now I'm getting blocked urls in Google Site Diagnostics (for around 15 pages).
They look like this:
http://209.85.135.104/search?q=cache:1lNv7BWsX34J:www.mydomain.org/blog/node/58+yamaha+t-85&hl=de&gl=de&ct=clnk&cd=6
Google says it is blocked by robots.txt.
I don't understand what is happening, since it looks like I'm blocking a cached version of my own page? 220.127.116.11 is a Google search server.
Also, nothing in my robots.txt file blocks my blog, so those pages shouldn't be blocked.
What is going on?
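One way to sanity-check the claim that nothing in robots.txt blocks the blog is to run the rules through Python's standard urllib.robotparser. The robots.txt content and URLs below are hypothetical stand-ins, not the poster's actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules; substitute your site's real file.
rules = """User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Nothing above matches /blog/, so blog URLs should be crawlable.
print(rp.can_fetch("Googlebot", "http://www.mydomain.org/blog/node/58"))  # True
print(rp.can_fetch("Googlebot", "http://www.mydomain.org/private/page"))  # False
```

If the blog URL comes back True here but Google still reports it blocked, the problem is likely on Google's side of the reporting, not in the robots.txt file itself.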
is that google sitemaps or google adsense giving you those results? google adsense has had problems with their reporting
that is an adsense bug that google needs to fix
i wouldn't worry too much about it
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved