Further to an earlier comment here re: sites becoming unverified in WMT...
This seems to happen weekly (at least). I wonder if the following may give an insight as to why.
We have just re-verified our WMT sites (about 8 sites) in mid-afternoon UK time (the time of day may be relevant - see below). They all started out as Unverified - Error 200 (ok) (a typical and obviously incorrect reason).
A few of the sites verified immediately.
Some claimed to have "our server" errors - i.e. google's fault - including being unable to find a domain name. A retry seconds later resulted in verification. Possibly a busy time of day for google in the UK?
For one site google claimed a 403, which I return for a variety of "bad" browsers/bots/IPs. Re-verification a moment later went through with no problem.
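For anyone wondering what that kind of 403 rule looks like, here's a rough sketch in Apache 2.2 syntax - the IP range and UA patterns are just made-up examples of the general idea, not my actual rules:

```apache
# Deny a suspect IP range by default, but whitelist requests whose
# User-Agent matches a known Google crawler (example range/UAs only)
SetEnvIf Remote_Addr "^66\.249\.85\." suspect_ip
SetEnvIf User-Agent "(Googlebot|Google-Sitemaps)" google_ua

Order Deny,Allow
Deny from env=suspect_ip
Allow from env=google_ua
```

With Order Deny,Allow the Allow line overrides the Deny, so a Google UA gets through even from a blocked range - which is exactly why I was puzzled that this request got a 403 (see below: the block had been placed on the bare IP earlier, before the UA whitelisting could apply).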
Checking the logs for the rejected verification attempt (UA: Google-Sitemaps/1.0) came up with an IP in the 66.249.85.* range. Checking on this, there is no rDNS entry - doesn't google always use traceable rDNS for its robots? Hmm.
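The check I ran by hand is the standard forward-confirmed reverse DNS test that Google itself recommends for validating crawler IPs. A minimal sketch of it (the function name is my own; the googlebot.com/google.com suffixes are the documented ones for Google crawlers):

```python
import socket

def is_verified_google_crawler(ip):
    """Forward-confirmed reverse DNS: rDNS must point into a Google
    domain, and the forward lookup of that name must return the IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]       # reverse lookup
    except socket.herror:
        return False                             # no rDNS entry at all
    if not (host.endswith(".googlebot.com") or host.endswith(".google.com")):
        return False                             # rDNS not a Google name
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]   # forward lookup
    except socket.gaierror:
        return False
    return ip in forward_ips                     # confirm round trip
```

The IP in my logs fails at the very first step - no rDNS entry - which is what made me suspicious of it in the first place.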
I had in fact blocked this IP earlier, at 3 am this morning. I have the relevant range "whitelisted" for google UAs, so I was puzzled why I had blocked it. It turned out it had been used with a standard MSIE6 UA carrying certain invalid header credentials: I'm assuming a robot of some kind. It had also been used previously with a google Translate UA.
Whether this IP block is being used outside of google or for "Google Reach" I'm not sure, but shoving a valid robot like Sitemap through on a "general purpose" block is certainly not good, especially with no rDNS.
It occurs to me that this and similar types of access attempts by google may have resulted in Unverified status for some sites - sites that returned a Sod Off code for this range of IPs, with an Ok response later returned on a retest. I have nothing to establish this conjecture, however.
[edited by: tedster at 11:49 pm (utc) on July 29, 2008]
[edit reason] moved from another location [/edit]