I don't want any of my site's images listed on search engines, so I have a blanket block on my /images/ folder in robots.txt (shown below), plus an additional Googlebot-Image/1.0 block. However, the Googlebot-Image bot still insists on crawling a few of my images, despite receiving a 403 error.
robots.txt:

User-agent: *
Disallow: /i

User-agent: Googlebot-Image/1.0
Disallow: /
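One thing I'm unsure about: Google's documentation lists the image crawler's user-agent token as plain "Googlebot-Image", without the version suffix, so perhaps that second group should read:

User-agent: Googlebot-Image
Disallow: /

Though even with the version string in place, that wouldn't explain the 403s.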
Log result:

403 GET 66.249.66.33 Googlebot-Image/1.0 /images/abc.gif
403 GET 66.249.66.33 Googlebot-Image/1.0 /robots.txt
I'm guessing that because Googlebot-Image receives a 403 for robots.txt itself, it cannot ascertain what it can and cannot crawl (as I understand it, Google treats a robots.txt that returns a 4xx status as though no robots.txt exists), and so carries on crawling.
What I can't figure out is why it is receiving a 403 in the first place.
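My working suspicion is a server-side rule that denies requests by user agent (e.g. an .htaccess or firewall rule matching "Googlebot"), since the 403 hits robots.txt as well as the images. A self-contained Python sketch of such a rule (entirely hypothetical, just to illustrate the pattern) reproduces the same log result:

```python
# Hypothetical demo: a tiny local server that denies any request whose
# User-Agent mentions "Googlebot", the way a misconfigured deny rule might.
import threading
import urllib.request
import urllib.error
from http.server import BaseHTTPRequestHandler, HTTPServer

class UADenyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        if "Googlebot" in ua:          # the suspected UA-based deny rule
            self.send_response(403)
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"User-agent: *\nDisallow: /i\n")

    def log_message(self, *args):      # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), UADenyHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/robots.txt"

def status(user_agent):
    """Fetch robots.txt with the given User-Agent and return the HTTP status."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        return urllib.request.urlopen(req).status
    except urllib.error.HTTPError as e:
        return e.code

ok = status("Mozilla/5.0")             # 200: a browser can read robots.txt
blocked = status("Googlebot-Image/1.0")  # 403: the bot is refused before robots.txt is served
print(ok, blocked)
server.shutdown()
```

If something like this is happening on my server, the bot would never see the Disallow rules at all, which would match what I'm observing.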
Can anyone shed any light on what might be happening, please?
Many thanks in advance.