My tech blog was hit by the first Panda update but thankfully still gets about 10,000 hits daily. In an attempt to recover from Panda, we are working through all our content, rewriting articles or simply deleting the ones that are no longer useful today.
I've read many times that a site will mostly recover from Panda after removing some pages and rewriting much of the rest.
Which action would "probably" make more sense for a faster partial Panda recovery, given that even tedster confirms sites recover gradually rather than through a single Panda refresh?
A) Improving the top 100-200 articles that get the most hits.
- Doing this should improve the overall metrics for the whole site. I thought about this option because it makes sense to ensure the top articles are really good when Google is sending a lot of visitors to those pages. If they are not updated/improved to satisfy those visitors, Google may somehow detect it and stop sending people to those popular pages.
B) Improving the oldest ones first.
- Most of the old posts are short, poorly researched, outdated, or simply don't work at all anymore. Google has often said to fix the "bad" pages to recover from Panda. I think only a couple of the old posts are in the top 100-200.
Any advice or suggestions would be much appreciated. Thanks!