edgeman - 10:39 pm on Oct 5, 2011 (gmt 0)
Our site has followed the same pattern as DaniWeb. To summarize: 50% drop in February, 100%+ recovery in July, 43% drop on September 27th, and now what appears to be a full recovery starting October 4th.
I posted earlier in this thread indicating the same pattern: Hit in late February, 100%+ recovery in July, 45% drop on September 27th and 110% recovery on October 4th (actually, recovery began around 6pm PST on October 3rd).
One thing I did do was submit a reconsideration request to Google last week, before I read anything about Panda hitting, because I thought we may have been penalized for a developer bug that caused a lot of UGC to be hidden from users but not from the engines.
Interestingly enough, we have some similarities here. We had a developer bug go into production that Google picked up on the weekend before September 27th. They sent us an automated notification in Google Webmaster Tools telling us they had discovered an extremely high number of pages. GWT reports they crawled more than 1.3 million pages over the weekend as a result of the bug; our average is 280k. There's no way we wanted these pages indexed. They serve a purpose on our site, but they are, by definition, "shallow" content that we intend to improve before allowing it to be indexed. We fixed the problem immediately, submitted a reconsideration request fearing a penalty, submitted a directory removal request through GWT to remove the content (completed within 24 hours) ... and then we waited.
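For anyone in a similar spot: the usual way to keep unfinished "shallow" pages out of the index while still letting users reach them is a robots meta tag in the page template. This is a minimal sketch, not the poster's actual fix; the template path is hypothetical:

```html
<!-- Hypothetical template fragment for the shallow pages.
     noindex keeps the page out of Google's index;
     follow still lets crawlers pass through its links. -->
<meta name="robots" content="noindex, follow">
```

Note that a robots.txt Disallow alone only blocks crawling; it does not remove pages that are already indexed, which is why a GWT directory removal request (which requires the directory to be blocked or the pages to return 404/noindex) was also needed here.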
However, seeing what happened with DaniWeb's recent recovery, and what Alika said above, I'm assuming the developer bug had nothing to do with it.
I'm considering the same. But I don't want to be quick to ignore what happened in our case. I would not be surprised whatsoever to hear that a site gets nailed by Panda after adding 1 million pages of less than ideal content.