Does anyone have a feel for how freshbot handles dynamic pages in comparison to deepcrawl bot?
I have a theory that Google is reworking the deepcrawl algorithm to try to handle dynamic pages better. Remember, until quite recently, these pages hardly appeared in Google at all.
Think of the problem. Regularly on here you see people saying things like "the Google index has 90,000 of my 110,000 pages." But what they are really talking about is content that might amount to a few thousand static pages which, because of dynamic URL generation, balloons into ridiculous numbers, and Google's index gets clogged with essentially duplicate content that, whether deliberate or not, is awfully close to spam.
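To make the duplicate-content point concrete, here is a minimal Python sketch of the kind of normalization a crawler would need: the same underlying page shows up under many crawlable URLs that differ only in session or sort parameters. The URLs and parameter names (sessionid, sort, etc.) are hypothetical examples, not anything Google has documented.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that vary per visitor or per view, not per document (hypothetical names).
NOISE_PARAMS = {"sessionid", "sid", "sort", "ref"}

def canonicalize(url: str) -> str:
    """Collapse a dynamic URL to a canonical key by dropping noise
    parameters and sorting the rest, so duplicate variants map together."""
    parts = urlsplit(url.lower())
    kept = sorted((k, v) for k, v in parse_qsl(parts.query) if k not in NOISE_PARAMS)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

urls = [
    "http://example.com/catalog.asp?item=42&sessionid=ABC123",
    "http://example.com/catalog.asp?sessionid=XYZ789&item=42",
    "http://example.com/catalog.asp?item=42&sort=price",
]

# All three crawlable URLs collapse to one page.
print({canonicalize(u) for u in urls})
# {'http://example.com/catalog.asp?item=42'}
```

Without something like this, those three URLs count as three "pages," which is how a few thousand real documents turn into 110,000 indexed entries.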
If Freshbot started from a relatively clean slate, it may handle some of these things better, so I am curious about people's experiences with how freshbot handles their dynamic content.
Could it be that Google declared a moratorium on deepbot crawls until they have an algorithm they like better?