|Is a page low quality if Google still sends traffic?|
| 1:09 am on Nov 22, 2011 (gmt 0)|
I have a large, high traffic site (1+ million pages indexed), which was hit hard (-60%) by Panda 1.0 and has yet to recover. I've already pruned/noindexed a huge number of pages, but I know I've got to take things a step further if I'm going to have any chance of lifting the penalty (Yes, I'm also making other improvements in parallel). However, I'm faced with a conundrum... many of the remaining pages I'd like to eliminate for being low(ish) quality, or duplicative, still get traffic. For any one such page, it may only be a single visit, but in aggregate these pages account for perhaps 30% of all traffic that remains post-Panda. So, cutting them will hurt, and I may not be able to get the traffic back by restoring them if I change my mind about their value down the line.
Given the potential downside, am I being too harsh in my editorial judgement that these pages should be eliminated, or should I take the hint that Google still sees value in them? In other words, is traffic from Google confirmation that a page is of sufficient quality that it won't contribute to a Panda penalty?
| 5:35 am on Nov 22, 2011 (gmt 0)|
First, Google often does send a low level of traffic to Pandalyzed pages. I'd look at the query terms and see what that tells you. Chances are the traffic is not for very "big" keywords, but see what you can learn about that traffic - including whether it's "just" traffic or if it does your business any good.
Have you considered improving the quality of those pages rather than completely eliminating them?
| 7:25 pm on Nov 22, 2011 (gmt 0)|
I think there are some big questions here:
1. What counts as low quality? Are low-quality pages simply those getting low traffic? What if some are only missing targeted traffic because they're poorly optimized, SEO-wise?
2. Should you remove so called "low quality" pages or edit/improve the content of these pages?
3. Are these pages marked "red" in the eyes of the algorithm?
4. Might deleting the low-quality pages altogether eventually help the rankings of your higher-quality content?
I believe converting the many shallow pages (let's assume "low quality" means those getting low traffic) into more useful pages is a good solution, but how?
* Low quality is subjective in many cases.
| 9:39 pm on Nov 22, 2011 (gmt 0)|
|Chances are the traffic is not for very "big" keywords |
Yes, that's right. These pages target long tail keywords. Very long tail keywords in some cases, which are in some cases relatively minor variations of those targeted by other pages on the site. That said, collapsing pages targeting similar keywords is much easier said than done.
|but see what you can learn about that traffic - including whether it's "just" traffic or if it does your business any good. |
Avg bounce rate on these pages is ~48%, so there is user engagement. I think that's quite a bit better than most people (including me) would expect just eyeballing these pages.
|Have you considered improving the quality of those pages rather than completely eliminating them? |
Yes, we're making what improvements we can, but there's a limit to how much better these pages can be given the obscurity of the terms they target.
| 3:41 pm on Nov 23, 2011 (gmt 0)|
Low quality is indeed subjective. I'd be comfortable simply exercising my judgement if it didn't conflict with the fact that Google is sending traffic to these pages.
| 4:13 pm on Nov 23, 2011 (gmt 0)|
After thinking it over, I'd say that looking at the bounce rate, time on site, and pages/visit of specific pages is clearly an objective judgement.
Those with the lowest scores are indeed low quality and require improvement.
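To make that concrete, here's a rough sketch of a composite engagement score combining those three metrics. The weights, the 3-minute time cap, and the 5-page depth cap are all my own assumptions — nothing Google has published:

```python
# Composite engagement score from bounce rate, time on site, and
# pages per visit. Weights and caps are arbitrary assumptions.
def engagement_score(bounce_rate, avg_time_sec, pages_per_visit):
    # Normalize each metric to roughly 0..1, higher = better engagement.
    bounce_component = 1.0 - bounce_rate             # bounce_rate given as 0..1
    time_component = min(avg_time_sec / 180.0, 1.0)  # cap at 3 minutes
    depth_component = min(pages_per_visit / 5.0, 1.0)
    return (bounce_component + time_component + depth_component) / 3.0

def worst_pages(stats, n=10):
    # stats: dict of url -> (bounce_rate, avg_time_sec, pages_per_visit)
    ranked = sorted(stats, key=lambda url: engagement_score(*stats[url]))
    return ranked[:n]  # lowest-scoring pages first: improve (or prune) these
```

The lowest-scoring pages are the ones I'd triage first for rewriting or removal.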
P.S. I don't think Panda is as "clever" as we are...