Recently one of my sites suddenly started to perform very badly in Google. I decided to have a look at the server logs, and I'm seeing some very strange behaviour.
Googlebot turns up and requests my homepage several times during the course of a day. However, when it requests the page, an HTTP 404 status code is frequently returned. It comes back several hours later, requests the homepage again, and this time an HTTP 200 is returned.
There seems to be a loose correlation between the time of day and the HTTP code returned - 404s seem to be returned in the early morning (roughly between 06:00 and 11:00), whereas the rest of the time a 200 is returned.
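For anyone wanting to reproduce this kind of check, here is a minimal sketch of how the hour/status breakdown can be pulled out of an access log. It assumes the Apache combined log format; the sample lines (IPs, timestamps, byte counts) are invented purely for illustration.

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache combined log format.
sample_log = """\
66.249.66.1 - - [12/Mar/2009:06:14:02 +0000] "GET / HTTP/1.1" 404 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [12/Mar/2009:14:02:51 +0000] "GET / HTTP/1.1" 200 10240 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
72.30.65.10 - - [12/Mar/2009:06:20:11 +0000] "GET / HTTP/1.1" 200 10240 "-" "Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)"
"""

# Grab the hour field out of the [day/month/year:HH:MM:SS zone] timestamp.
hour_pattern = re.compile(r'\[\d+/\w+/\d+:(\d{2}):')

# Count (hour, status) pairs for Googlebot requests to the homepage only.
hits = Counter()
for line in sample_log.splitlines():
    if 'Googlebot' not in line or '"GET / ' not in line:
        continue
    hour = hour_pattern.search(line).group(1)
    # The status code is the first token after the closing quote of the request.
    status = line.split('" ')[1].split()[0]
    hits[(hour, status)] += 1

for (hour, status), count in sorted(hits.items()):
    print(f"{hour}:00  {status}  x{count}")
```

Run against a real log (read the file instead of `sample_log`), this makes the time-of-day pattern obvious at a glance.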
I don't understand why this is happening. As far as I'm aware there has been no server downtime - and if the server really were down, it couldn't have recorded the request in the log at all.
I've had a look at the homepage through an HTTP viewer and a 200 is always returned. I also used Firefox with a Googlebot user-agent string to view the page, and the page is fine.
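This is essentially the check I ran, sketched in code: fetch the homepage while identifying as Googlebot and look at the status code. The local test server below stands in for the real site (a hypothetical placeholder - point the request at your own homepage URL to test for real); a server that branches on User-Agent or time of day could return 404 here instead.

```python
import http.server
import threading
import urllib.request

# Stand-in server for the real site; it always answers 200,
# the behaviour I see from every client I've tried myself.
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-Type', 'text/html')
        self.end_headers()
        self.wfile.write(b"<html>ok</html>")

    def log_message(self, *args):
        pass  # suppress request logging noise

server = http.server.HTTPServer(('127.0.0.1', 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Request the page exactly as Googlebot identifies itself.
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/",
    headers={
        "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                      "+http://www.google.com/bot.html)"
    },
)
with urllib.request.urlopen(req) as resp:
    status = resp.status

print(status)
server.shutdown()
```

If this consistently prints 200 while the logs show Googlebot getting 404s at certain hours, the discrepancy has to come from something that distinguishes real Googlebot traffic (e.g. its IP ranges) or the time window, not the user-agent string alone.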
What's even weirder is that Slurp (Yahoo!'s crawler) appears to have no problems with the site. It requests the homepage during the same time periods as Googlebot, and a 200 is always returned.
Has anyone ever seen anything like this?