I am bringing this up because I think some people focus on one specific factor mentioned in the forums and convince themselves that this factor must be what affected their site. However, that specific issue may have nothing to do with the reason their site dropped, and they never explore other options (for example: "I added the word affiliate to my site and it dropped a month later, so that must be why I dropped").
People should try to keep an open mind, freely discuss possible reasons for a drop, and see which of these possibilities applies to their site. Most likely multiple factors will apply, but the more feedback posted from people being affected, the easier it is to narrow down which factors are more likely to be the reason for a penalty/drop in the SERPs. So I thought it would be good to post as many possible theories as we can come up with in one thread. It would be great for others to add to the list.
(Caveman has done a good job of pointing out a few possibilities, and here are some others that have been mentioned):
- Change in # (or %) of pages
- Semantic tweak
- Dupe filter tweaked (a toy sketch of what such a filter might compute appears after this list)
- A new technology implemented in the algo (possibly Block Level Link Analysis and Temporal Link Analysis, as discussed by Dr. Garcia)
- Affiliate pages (or pages with datafeeds) are getting filtered/dampened
- Decrease in the weight of certain links (e.g., interlinking, nav links, footer links)
- Over-Optimization filter (anchor text, etc)
- Rate of increase or decrease in links for a site (see the second sketch after this list)
- Manual penalties (it would not take much time for a few G employees to go through the top 1,000 SERPs that generate the most revenue via AdWords and filter out, or apply a dampening factor to, certain sites). They could cross-reference the spam reports they receive with the top-producing keywords (since those are the ones most likely to be spammed) and find spammy sites fairly quickly. G claims they want to do everything algorithmically, but their primary goal is not to produce the most automated search engine; it is to produce the best results for their users.
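To make the dupe-filter theory a little more concrete, here is a minimal sketch (Python, purely illustrative) of the classic shingling approach to near-duplicate detection. The shingle size and the 0.9 threshold are values I made up; nobody outside G knows what their filter actually computes.

```python
# Illustrative only: a classic shingle/Jaccard near-duplicate check.
# The shingle size (4 words) and the 0.9 threshold are invented
# values; Google's actual dupe filter is unknown.

def shingles(text, size=4):
    """Return the set of overlapping word n-grams ("shingles") in text."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(page_a, page_b, size=4):
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    a, b = shingles(page_a, size), shingles(page_b, size)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def looks_like_dupe(page_a, page_b, threshold=0.9):
    """Flag a pair of pages as near-duplicates above the threshold."""
    return similarity(page_a, page_b) >= threshold
```

If a filter like this were tightened (say, the threshold dropped from 0.9 to 0.8), pages built from the same datafeed would suddenly start tripping it, which would also fit the affiliate/datafeed theory above.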
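And for the rate-of-link-change theory, here is a second toy sketch of how a temporal signal might be computed from backlink counts sampled over time. The monthly snapshots, the 3x growth cap, and the dampening multiplier are all my own invented numbers, not anything G has confirmed:

```python
# Illustrative only: a toy "link velocity" check. The snapshot
# interval, the growth cap, and the dampening factor are all
# invented for this example.

def growth_rates(link_counts):
    """Month-over-month growth ratios from a list of backlink counts."""
    return [curr / prev
            for prev, curr in zip(link_counts, link_counts[1:])
            if prev > 0]

def dampening(link_counts, max_natural_growth=3.0):
    """Return a score multiplier < 1.0 if links grew suspiciously fast.

    A site whose backlink count more than triples in a month gets
    dampened in proportion to how far past the cap it went.
    """
    rates = growth_rates(link_counts)
    if not rates:
        return 1.0
    worst = max(rates)
    if worst <= max_natural_growth:
        return 1.0
    return max_natural_growth / worst  # e.g. 10x growth -> 0.3 multiplier

# Example: a site that jumped from 200 to 2,000 backlinks in a month
print(dampening([150, 180, 200, 2000]))  # prints 0.3
```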
What other possibilities do you think could have an influence (and I am stressing: possibilities)? I am fairly confident some of the items in the list above are not applicable and have no influence on the SERPs. But putting all of the theories out there at once may spark some thoughts from other people whose sites have been affected.
I know that much content is dynamic, so many sites never need to upload new pages to overwrite old ones, but uploading is still a signal of potentially fresh content. If your pages are .html, then new content would suggest they had to be re-uploaded. If they are .html but serve dynamic content, that may be a sign that a smart SEO guru is behind the site...