jmccormac - 8:52 am on Jun 21, 2012 (gmt 0)
Search engines need to establish a baseline for how often a site is updated and how deep its content sits (in clicks away from the home page). From experience (this is exactly the problem I am working on for a country-level search engine), this can take up to a year of continual, scheduled spidering. A new site has a very specific link profile and link acquisition rate. Where no outside SEO campaign is involved, the process of a new site gaining inbound links is more like accretion: links are added gradually, and the site goes through a few bursts of new inbound links. What JohnMu seems to be hinting at is that comment spam, the use of meat bots, and off-topic/out-of-area links tend to flag new sites as potential problem sites.
What does this say about folks working hard to get their Panda recovery recognised? Does this resonate with anyone's experience?