I'm pretty happy with our indexing at the moment. It usually takes 24-48 hours for a front-page item to hit the index, and 48-96 hours for a deep link. We've put a lot of work into optimizing for Google over the years, and I'm wondering whether a sitemap would help or hurt. I'm inclined to think it's not worth it in this situation.
However, I have been using a Google Sitemap for a few weeks now.
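For anyone who hasn't seen one, the sitemap itself is just a small XML file. Mine looks roughly like the sketch below, using the current sitemaps.org schema (the URL and date here are placeholders, not my real ones):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- one <url> entry per page you want crawled -->
        <loc>http://www.example.com/</loc>
        <lastmod>2005-10-01</lastmod>
        <changefreq>daily</changefreq>
      </url>
    </urlset>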
The normal Googlebot fetches approximately 30,000 URLs per day, and nothing changed during the first few weeks. But for the past few days the Mozilla Googlebot has been very active (up to 100,000 visits per day) and is deleting the visited URLs from the index, while the normal bot has gone very quiet (approx. 500 pages per day).
I use neither cloaking nor any other spam techniques.
The only problem was duplicate content caused by faults in my URL rewriting, which kept the site under a heavy duplicate content filter for many months.
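In case it helps anyone with a similar problem: the usual server-side cure for that kind of duplicate is a 301 redirect from the faulty URL form to the canonical one. A minimal sketch with Apache mod_rewrite, assuming the duplicates came from a stray query parameter (the parameter name "sid" is made up for illustration):

    RewriteEngine On
    # If the only query string is a bogus session parameter...
    RewriteCond %{QUERY_STRING} ^sid=[0-9a-f]+$
    # ...301-redirect to the same path with the query string stripped
    RewriteRule ^(.*)$ /$1? [R=301,L]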
I removed all of these URLs with the Google removal tool, and the duplicate content filter went away for six weeks.
For the past couple of days all of these URLs have been reappearing, and the deletion problem with the Mozilla bot started at the same time.
I don't know whether all of this is due to the Google Sitemap.
Does anybody have an idea why URLs are being deleted by the Mozilla bot?