Tracking the sites' logs shows repeated visits from one or more of the Googlebots (real Googlebots, not just crawlers faking the user-agent). Of course, back when these sites were unbanned, they were getting far more visits from all or most of the Googlebots. The question is: since these sites are now banned, why do the Googlebots keep visiting them?
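For what it's worth, here is how I verified they are genuine: Google's documented test is a reverse DNS lookup on the visiting IP, then a forward lookup to confirm the name resolves back to the same address. A rough Python sketch of that check (the sample IP is only an illustration, not one from my logs):

    import socket

    def is_real_googlebot(ip):
        # Reverse lookup: a genuine Googlebot IP resolves to a host
        # name under googlebot.com (or google.com).
        try:
            host = socket.gethostbyaddr(ip)[0]
        except socket.herror:
            return False
        if not host.endswith(('.googlebot.com', '.google.com')):
            return False
        # Forward-confirm: the name must resolve back to the same IP,
        # otherwise the reverse record could be spoofed.
        try:
            return ip in socket.gethostbyname_ex(host)[2]
        except socket.gaierror:
            return False

    print(is_real_googlebot('66.249.66.1'))  # substitute an IP from your own logs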
Googlebot is a dumb machine; it cannot act like a human and review a site to see whether the offending tricks have been removed.
Actually it's more than a dumb machine.
It has been ten days now since I modified my robots.txt to "force" Googlebot not to spider what I "thought" Google might consider duplicate content, but Googlebot insists on spidering the disallowed pages.
My logs show that each bot downloads robots.txt and then, ironically, spiders those very pages, as if I had asked it to spider them rather than the opposite!
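For reference, the rules I added are of this form (the paths here are invented for illustration; mine cover the sections I suspected of looking like duplicate content):

    User-agent: Googlebot
    Disallow: /print/
    Disallow: /archive/

Worth noting: Google reportedly caches robots.txt for up to a day or so, so a change is not honored instantly, though that hardly accounts for ten days.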
To see if they still suck (in Googlebot's opinion)?
Correctamundo, and that explains why it insists on spidering pages that I told it not to spider.
Why exactly do they say they've forwarded your problem to the engineers if they know your site is banned and why it's banned?
I don't seem to understand the question.
Furthermore, many of the bans are based on algorithms that Google is constantly tweaking, so Google doesn't really want to lose the banned pages entirely; it just doesn't want to display them at the moment.
I have many pages cached in Google, including the index page.
Googlebot is coming to the site every day (980 hits yesterday).
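(That count comes from tallying the raw access log, roughly as below; the filename and the Apache combined log format are assumptions on my part, so adjust to suit.)

    from collections import Counter

    daily = Counter()
    with open('access.log') as log:  # assumed filename
        for line in log:
            # Count only lines whose user-agent claims Googlebot; combined-format
            # timestamps look like [12/Mar/2005:10:21:04 +0000].
            if 'Googlebot' in line and '[' in line:
                date = line.split('[', 1)[1].split(':', 1)[0]
                daily[date] += 1

    for date, hits in sorted(daily.items()):
        print(date, hits)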
I had a problem with my code that produced duplicate content by mistake; I've since cleaned it up and emailed Google, but no reply so far...
So, in anyone's opinion, am I banned, or is it just a question of waiting to be reindexed?
I am so confused; many people have different theories, and I just don't know who to believe!
Any ideas would be appreciated!