My primary site is a pretty large site (> 250,000 pages of user-generated content) and had done well in Google through Big Daddy and the subsequent tweaking until this past week. Honestly, I'm not sure I had ever even seen a supplemental listing until this week. We've not paid a lot of attention to SEO, since whatever we have been doing seemed to work well.
Until Sunday June 18th at around 12:00 PM CST.
Sunday afternoon I noticed a sharp dip in traffic, and when I investigated, all I could find was the post from Adam about the 'bad data push' and how they were working on fixing the problem. Is there some way my site got caught up in this fix?
Something I should add: I use Google Alerts to notify me when my site name appears in a new listing on Google, and for weeks I have been getting almost daily notices from Google Alerts showing the "5 billion page site" with content scraped from my pages. I don't know whether any of those pages actually linked to my site; I never thought to look.
Now traffic is down 40% from the norm, and I've also lost about two-thirds of the pages that were shown by a "site:domain.com" command. I wonder if removing that site caused the backlinks to my site to be recalculated, though currently a "link:domain.com" shows the same number of backlinks as it has for the past month or so.
Just to be clear, I have no connection with the '5 billion page site' other than they scraped a lot of my content.
Any ideas or suggestions, or am I alone on this?