| 9:46 pm on Jun 2, 2011 (gmt 0)|
I wouldn't say
|entire site is either of low quality or high quality |
Rather, if certain sections of the site are deemed low quality, that can affect the ranking of a set of pages, which in turn affects overall traffic. On small sites this is easier to spot; on larger sites it is more difficult.
See tedster's theory of low-value pages affecting the high-value pages they link to. Unless Mr Tedster has changed this idea ...
| 9:52 pm on Jun 2, 2011 (gmt 0)|
Would it make sense then to block the google spider (or logged out users) from viewing anything that might potentially be low quality?
In other words, if google is in the business of "judging" rather than "ranking", maybe it makes sense to only show content that you want to have judged?
| 10:07 pm on Jun 2, 2011 (gmt 0)|
|See tedster's theory of low-value pages affecting the high-value pages they link to. Unless Mr Tedster has changed this idea ... |
It still looks to me as though pages that link directly TO the most demoted pages show the second most severe demotions. However, to complicate the analysis a bit, Google sometimes notices a high quality page here and there and gives it a boost - even though other pages on that same site are demoted.
|Would it make sense then to block the google spider (or logged out users) from viewing anything that might potentially be low quality? |
It all depends on what you think low quality actually is. With no one pulling out of a Panda nose-dive so far, I wouldn't buy into any particular theories about what Google is using for Panda criteria, outside of the most obvious - stub pages, copied or spun content, pages that overlap in a major way with other pages.
Googlebot should not be able to access anything that requires a login anyway, right?
So it boils down to this: what kind of pages are you currently publishing that you think are low quality? If you think you've got such low quality content on the site, then why publish it at all?
| 10:50 pm on Jun 2, 2011 (gmt 0)|
It's sad that if you're not linking to what a SEARCH ENGINE thinks is #1 worthy you're risking your own rankings. Sad indeed.
| 11:22 pm on Jun 2, 2011 (gmt 0)|
My own experience doesn't bear this out. The entire site got demoted across the board, and I cannot identify any particular pages that were singled out.
And, of course, the definition of what is 'low quality' is dubious to me. The only possible thing I can think of is that I have some pages that are a little low on word count -- but as it's a blog, those posts still generated a healthy discussion. I don't consider that low quality - but Google apparently does.
| 11:25 pm on Jun 2, 2011 (gmt 0)|
|The entire site across the board got demoted, and cannot identify and particular pages that were singled out. |
How can that possibly be? Certainly not all pages had the same entry traffic from search, right? Some had to be better performers than others, and of those you should be able to see which lost the most.
| 11:35 pm on Jun 2, 2011 (gmt 0)|
Well, one thing you could do is run some reports on which pages google sent ZERO traffic to in the last 6 months or so, and pretty much ban google from indexing those pages.
With so many trillions of pages on the web, google may be finding that the "deep web" is the part they don't want to waste server space indexing or ranking. Why not do them a favor and just remove their ability to index that content?
Just keep the content they are sending traffic to and then allow Google to index new content. If they do a good job with that, open up more of your site to them. If not, the social web is increasing, and google may not be the only game in town for much longer.
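The audit suggested above can be sketched in a few lines. This is a hypothetical example -- the page paths, the `google_referrals` field, and the sample numbers are invented; in practice the rows would come from an analytics export covering the last six months or so:

```python
# Hypothetical audit: find pages google sent zero search traffic to,
# as candidates for noindex or a robots.txt Disallow rule.
# The sample data below is invented for illustration.

sample_rows = [
    {"page": "/popular-article", "google_referrals": 1200},
    {"page": "/old-stub-page", "google_referrals": 0},
    {"page": "/duplicate-tag-page", "google_referrals": 0},
]

def zero_traffic_pages(rows):
    """Return pages that received no google referrals in the period."""
    return [r["page"] for r in rows if r["google_referrals"] == 0]

candidates = zero_traffic_pages(sample_rows)
for page in candidates:
    # Each of these could get <meta name="robots" content="noindex">
    # or be blocked in robots.txt.
    print(page)
```

Whether blocking such pages actually helps with Panda is, of course, exactly what this thread is debating.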
| 11:38 pm on Jun 2, 2011 (gmt 0)|
Yes, some pages were better placed in search, but as a percentage lost it's pretty similar. E.g. the page that got 1000 referrals per day now gets about 50. The page that got 100 now gets about 5. And yes... that's a 95% drop -- got hit by all 3 Panda iterations.
If I were to go through the top 25 landing pages from Google traffic pre-panda and post-panda -- they are not actually that different - except for the numbers that is...
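The uniform-drop observation above is easy to check with simple arithmetic, using the figures from the post (the page paths are placeholders):

```python
# Percentage drop per landing page, pre- vs post-Panda, using the
# round numbers quoted in the post above.

pre  = {"/page-a": 1000, "/page-b": 100}
post = {"/page-a": 50,   "/page-b": 5}

def pct_drop(before, after):
    """Percentage of traffic lost between the two periods."""
    return round(100 * (before - after) / before, 1)

drops = {page: pct_drop(pre[page], post[page]) for page in pre}
print(drops)  # both pages show the same 95% drop
```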
| 2:48 am on Jun 3, 2011 (gmt 0)|
I'd agree with synthese. I don't find that certain pages or subdirectories were necessarily hit differently.
I do see a larger percentage traffic loss on some pages that ranked in the top 5 for certain keywords. But that is what you would expect. We all know top ranking positions get a higher percentage of clicks, and therefore when they drop they lose a proportionally larger percentage of traffic. A drop from #1 to #25 will lose proportionally more than a drop from #10 to #35. The relationship isn't linear.
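The point above can be illustrated with assumed click-through rates by position -- these CTR values are made-up round numbers for the sketch, not measured data:

```python
# Assumed (invented) click-through rates by search position.
ctr = {1: 0.30, 10: 0.03, 25: 0.005, 35: 0.001}

def proportional_loss(old_pos, new_pos):
    """Fraction of search clicks lost moving from old_pos to new_pos."""
    return (ctr[old_pos] - ctr[new_pos]) / ctr[old_pos]

drop_1_to_25 = proportional_loss(1, 25)
drop_10_to_35 = proportional_loss(10, 35)
# Under these assumptions the #1 page loses a slightly larger share
# of its clicks, and a vastly larger absolute number of them.
print(round(drop_1_to_25, 3), round(drop_10_to_35, 3))
```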
But for my individual subdirectories as a whole, topics that were basically long-tail pages and not high rankers, the drop in traffic is essentially the same percentage.
I wish it were as easy as looking at Analytics and seeing which pages google no longer valued.
| 3:11 am on Jun 3, 2011 (gmt 0)|
|I wish it were as easy as looking at Analytics and seeing which pages google no longer valued. |
Exactly... it isn't as easy as many, including Vanessa Fox, make it out to be. If it were, we would have seen many more recoveries. Neither Analytics nor WMT really helps.