seoguy9999 - 2:04 am on Feb 27, 2011 (gmt 0)
I am working on a very high-traffic, high-PR site that has lost 20% of its traffic. I believe the hit is site-wide. Most of my rankings have dropped. Some have not, but in each and every case where I maintained a number 1 position, there was no real competition, so even a modest ding wouldn't push me down.
And the dings have been modest. But if everything goes down 4 places, you are going to lose a lot of your traffic as things get knocked to the second page.
Also, every part of my site has been hit, every page type, all with a similar drop. Even pages I believe are much higher quality.
So I believe it is site-wide, similar to a PR ding (though I don't think it is that either, because my higher-PR pages would still have maintained their number 1 rankings). It looks like a new judgement on a "site" as a whole. When Google announced the update, they talked about the quality of sites, not pages.
So how are they judging quality? And what makes a good page, algo-wise?
And could they now be making a site-wide assessment based on behavior? (For instance, a user bouncing back to the results and clicking a second result within 15 seconds of the first click would be a strong negative quality signal for the first result.) These are the things I am wondering about.
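To make that concrete, here is a minimal sketch of how a "pogo-sticking" signal could be computed from click logs. Everything in it is an assumption for illustration: the Click record, the 15-second window, and the per-URL counting are my speculation, not anything Google has published.

    from dataclasses import dataclass

    @dataclass
    class Click:
        # Hypothetical click-log record; field names are assumptions.
        timestamp: float  # seconds since the search was issued
        url: str          # which result was clicked

    POGO_WINDOW = 15.0    # the 15-second threshold speculated above

    def pogo_stick_counts(session_clicks):
        """Count, per URL, how often the user abandoned it for a
        different result within POGO_WINDOW seconds. A high count
        would be a negative quality signal for the abandoned URL."""
        counts = {}
        clicks = sorted(session_clicks, key=lambda c: c.timestamp)
        for prev, nxt in zip(clicks, clicks[1:]):
            if nxt.url != prev.url and nxt.timestamp - prev.timestamp <= POGO_WINDOW:
                counts[prev.url] = counts.get(prev.url, 0) + 1
        return counts

Aggregated over millions of sessions, a page that keeps getting abandoned quickly would stand out, and that kind of data is cheap for a search engine to collect.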
What are the negative quality signals?
- Large amounts of duplicate content taken from other domains
- Pages assembled from small pieces of content from your own site - typically done on large dynamic sites to make up for lack of content
- Lack of original images
- Heavy cross-linking, perhaps (content done for spiders)
- Lack of complete sentences
- Content that does not make sense (again, stuff assembled just to create a page, where beyond the title and maybe a sentence the rest does not hold up thematically)
- Poorly organized content (not sure how you do this with an algo, but eHow's content is certainly well organized)
- Content that is just too similar to other content on your site: shuffling the same content to get different results, or eHow's case of writing 10 articles on how to tie a bow-tie - not the same text, but really the same content with different titles, purely for SEO purposes (see the sketch after this list)
- Poor ad-to-content ratio
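On the "too similar" point: near-duplicate detection is a well-studied problem, and w-shingling with Jaccard similarity is a standard textbook technique for it. The sketch below is illustrative only; I have no idea whether Google uses shingling here, and the 4-word shingle size is an arbitrary choice.

    def shingles(text, w=4):
        """All contiguous w-word shingles of the text, lowercased."""
        words = text.lower().split()
        return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

    def jaccard(a, b):
        """Jaccard similarity: |A & B| / |A | B|."""
        if not a and not b:
            return 1.0
        return len(a & b) / len(a | b)

    copied = "how to tie a bow tie in five easy steps"
    reshuffled = "in five easy steps how to tie a bow tie"
    unrelated = "the weather in london is often rainy in november"
    print(jaccard(shingles(copied), shingles(reshuffled)))  # ~0.4: substantial overlap
    print(jaccard(shingles(copied), shingles(unrelated)))   # 0.0: no overlap

Rearranged or lightly rewritten copies still share a large fraction of their shingles, so pairs of pages on the same site with high similarity scores would be easy to flag in bulk.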
As someone said earlier in the thread, it is likely that the percentage of your indexed pages that they judge as low quality is what generates a low site-wide quality rating.
This could explain why some sites with really good content are getting knocked down. How does their overall pool of content look? Are they generating thousands of tag pages or something like that, drowning out the good content in the overall pool of indexed pages?
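Here is what that hypothesis might look like mechanically. Every number below is invented for illustration: the cutoff, the share that triggers a penalty, and the 4-position demotion are my guesses, chosen only to match what we seem to be seeing.

    LOW_QUALITY_CUTOFF = 0.4   # assumed: page scores below this count as low quality
    SITE_PENALTY_SHARE = 0.5   # assumed: penalty kicks in past this share
    SITE_DEMOTION = 4          # assumed: roughly the 4-place ding described above

    def site_demotion(page_scores):
        """Return a site-wide ranking demotion based on the share of
        low-quality pages in the indexed pool."""
        if not page_scores:
            return 0
        low = sum(1 for s in page_scores if s < LOW_QUALITY_CUTOFF)
        return SITE_DEMOTION if low / len(page_scores) >= SITE_PENALTY_SHARE else 0

    # A site with 200 great articles drowned out by 3,000 thin tag pages:
    scores = [0.9] * 200 + [0.2] * 3000
    print(site_demotion(scores))  # 4 -- the good content gets dragged down too

Under a model like this, pruning or noindexing the thin pages would matter more than improving the pages that are already good.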
Anybody have ideas on what other signals might indicate content quality, or any evidence that it is based on user behavior?