I have some questions about reconsideration requests and would like to know your experience/best guess on them.
1. Does submitting a reconsideration request mean that your site will always get another MANUAL review, or can some responses simply be automated?
I believe there are quick-reply buttons, but is it possible that when a reconsideration request is submitted, Googlebot checks the violating page, and if nothing has changed on the page, the request is simply denied with an automated message?
On the other hand, if the violating page has been changed in some way, does the reconsideration request move to the 'next phase'?
2. Will the MANUAL review always come from a Google IP or an IP in Mountain View? Is this carved in stone?
3. Is there any way to check the raw access logs or analytics to see which pages were visited by Google staff? In other words, is there a way to pinpoint the pages being reviewed manually by Google, so that we can go over those specific pages more carefully?
4. Is it possible that when a manual review takes place, Google spoofs the referrer or the IP so that the visit is not easily traceable?
5. Might Google use other "clever" methods to look at the site so that the webmaster can't pinpoint anything?
6. How can we use any sort of analytics to track Google staff activity on the site? What are the footprints?
7. These days Google is sending out notifications like:
"Specifically look for unnatural links pointing TO your site"
The message above clearly indicates a backlink penalty. If you don't receive that message but instead get one saying "some pages still violate quality guidelines", is it safe to assume that Google has a problem ONLY with on-site things? Or is it also possible that the penalty is related to backlinks and Google simply isn't stating the reason explicitly? Does Google spell out the reason only in particular cases?
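On questions 2 and 3: Google's crawler documentation describes a two-step way to verify that an IP really belongs to Google (reverse-DNS the IP, then forward-resolve the hostname), though whether a human reviewer would ever show up from those same ranges is anyone's guess. A minimal sketch of that check, with the resolver functions injectable so the logic can be tested without live DNS (the example IP and hostname are illustrative):

```python
import socket

def is_verified_googlebot(ip, reverse_lookup=None, forward_lookup=None):
    """Two-step check from Google's crawler-verification docs:
    1. Reverse-DNS the IP; the hostname must end in googlebot.com or google.com.
    2. Forward-resolve that hostname; it must map back to the same IP.
    A user-agent string alone can be spoofed, so both steps matter."""
    reverse_lookup = reverse_lookup or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward_lookup = forward_lookup or socket.gethostbyname
    try:
        host = reverse_lookup(ip)
    except OSError:
        return False
    if not host.rstrip(".").endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return forward_lookup(host) == ip
    except OSError:
        return False
```

This only tells you an IP belongs to Google's crawling infrastructure; it says nothing about whether the request was automated or a person at a desk.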
Sorry for the long post, but I'm trying to resolve penalties and want to move in the right direction. I'm doing everything: improving on-page factors, removing spammy backlinks, documenting them, etc. Before I send the final request, I'd like your opinion on the questions above.
I don't think you can count on a human visit to your site, because Google employees can see a lot, and very quickly, through their internal tools. For example, they can quickly see whether there is a manual penalty, and if there isn't, they send you a boilerplate response saying there is no manual penalty to be removed.
I've always assumed that most of the manual inspection is done via internal tools anyway. If you submit a request and the site gets crawled afterwards, then they have everything they need to see on their own servers.
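As for questions 3 and 6, the most you can realistically do is sift your raw access logs yourself. A rough sketch that pulls the IP, path, and user agent out of combined-log-format lines and keeps the ones that claim to be Google (the log lines in the test are made up, and a claimed user agent proves nothing on its own, so follow up with reverse-DNS verification before trusting any hit):

```python
import re

# Apache/nginx combined log format:
# ip - - [time] "METHOD /path HTTP/1.1" status bytes "referrer" "user-agent"
LOG_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?:\S+) (?P<path>\S+) [^"]*" \d+ \S+ '
    r'"[^"]*" "(?P<ua>[^"]*)"'
)

def google_hits(lines):
    """Yield (ip, path) for requests whose user agent mentions Google.
    User agents are trivially spoofed, so this is a first filter only."""
    for line in lines:
        m = LOG_RE.match(line)
        if m and "google" in m.group("ua").lower():
            yield m.group("ip"), m.group("path")
```

Even with this, per questions 4 and 5, a reviewer browsing with a stock browser from an unremarkable IP would be indistinguishable from any other visitor.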