
Google SEO News and Discussion Forum

    
High Quality Pages on Low Quality Sites - A Weakness in Panda?
dvduval - msg:4321296 - 9:08 pm on Jun 2, 2011 (gmt 0)

In my opinion one of the biggest mistakes regarding the Panda update is the idea that an entire site is either of low quality or high quality.

My personal view is:
- There can be high-quality sections of sites. For example, one section of a site might be maintained by a good writer.
- There can be high-quality pages, such as a well-written user submission intermixed with other submissions that were less worthwhile.
- Some content can appeal to a group of people that doesn't fit the mold - they don't care about trusting the people on the site, for example.

It seems to me that Google has decided to use a "broad brush" and make a single decision about an entire site.

To draw a comparison: for years, Consumer Reports said in its ratings that no Volkswagen was worth buying.
- Does that mean that all customers who bought VWs were unhappy with their experience?
- Could there have been specific features (pages, in this analogy) that a subset of consumers especially wanted in their car?

It almost seems that Google's goal now is to JUDGE which content people will like the most, rather than to be the best locator of that content.

 

johnhh - msg:4321305 - 9:46 pm on Jun 2, 2011 (gmt 0)

I wouldn't say the "entire site is either of low quality or high quality."

Rather, if certain sections of a site are deemed low quality, that can affect the ranking of a set of pages, which in turn affects overall traffic. On small sites this is easier to spot; on larger sites it is more difficult.

See tedster's theory of low-value pages affecting the high-value pages that link to them. Unless Mr Tedster has changed this idea ...

dvduval - msg:4321307 - 9:52 pm on Jun 2, 2011 (gmt 0)

Would it make sense, then, to block the Google spider (or logged-out users) from viewing anything that might potentially be low quality?

In other words, if Google is in the business of "judging" rather than "ranking", maybe it makes sense to show it only the content you want to have judged?
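The idea of showing Google only the content you want judged can be sketched as a robots.txt rule set. This is a hypothetical example - the paths are invented for illustration, and note that robots.txt only stops crawling; it does not remove already-indexed pages or change how Google judges them:

```
# Hypothetical sketch: keep Googlebot out of sections you
# suspect would be judged low quality (paths are made up).
User-agent: Googlebot
Disallow: /stubs/
Disallow: /user-submissions/unreviewed/

# Everyone else may crawl everything (empty Disallow = allow all).
User-agent: *
Disallow:
```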

tedster - msg:4321316 - 10:07 pm on Jun 2, 2011 (gmt 0)

"See tedster's theory of low-value pages affecting the high-value pages that link to them. Unless Mr Tedster has changed this idea ..."

It still looks to me as though pages that link directly TO the most demoted pages show the second most severe demotions. However, to complicate the analysis a bit, Google sometimes notices a high quality page here and there and gives it a boost - even though other pages on that same site are demoted.

"Would it make sense, then, to block the Google spider (or logged-out users) from viewing anything that might potentially be low quality?"

It all depends on what you think low quality actually is. With no one pulling out of a Panda nose-dive so far, I wouldn't buy into any particular theories about what Google is using for Panda criteria, outside of the most obvious - stub pages, copied or spun content, pages that overlap in a major way with other pages.

Googlebot should not be able to access anything that requires a login anyway, right?

So it boils down to this: what kind of pages are you currently publishing that you think are low quality? If you think you've got such low quality content on the site, then why publish it at all?

Sgt_Kickaxe - msg:4321337 - 10:50 pm on Jun 2, 2011 (gmt 0)

It's sad that if you're not linking to whatever a SEARCH ENGINE thinks is #1-worthy, you're risking your own rankings. Sad indeed.

synthese - msg:4321354 - 11:22 pm on Jun 2, 2011 (gmt 0)

My own experience doesn't bear this out. The entire site got demoted across the board, and I cannot identify any particular pages that were singled out.

And, of course, the definition of "low quality" is murky to me. The only thing I can think of is that some of my pages are a little low on word count -- but as it's a blog, they still produced healthy discussions. I don't consider that low quality -- but Google apparently does.

tedster - msg:4321360 - 11:25 pm on Jun 2, 2011 (gmt 0)

"The entire site got demoted across the board, and I cannot identify any particular pages that were singled out."

How can that possibly be? Certainly not all pages had the same entry traffic from search, right? Some had to be better performers than others, and of those you should be able to see which lost the most.

dvduval - msg:4321367 - 11:35 pm on Jun 2, 2011 (gmt 0)

Well, one thing you could do is run reports on which pages Google sent ZERO traffic to in the last six months or so, and then block Google from indexing those pages.

With so many trillions of pages on the web, Google may be deciding that the "deep web" is the part it doesn't want to waste server space indexing or ranking. Why not do them a favor and simply remove their ability to index that content?

Just keep the content they are already sending traffic to, and allow Google to index new content. If they do a good job with that, open up more of your site to them. If not, the social web is growing, and Google may not be the only game in town for much longer.
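The zero-traffic report described above can be sketched in a few lines of Python. This assumes a CSV export with `page` and `visits` columns - the format, column names, and paths are assumptions for illustration, not a real Analytics schema:

```python
import csv
import io

def zero_traffic_paths(report_csv):
    """Return site paths that received no search visits, given CSV
    text with 'page' and 'visits' columns (an assumed export format)."""
    reader = csv.DictReader(io.StringIO(report_csv))
    return [row["page"] for row in reader if int(row["visits"]) == 0]

def robots_disallow_block(paths):
    """Render the zero-traffic paths as robots.txt Disallow lines."""
    lines = ["User-agent: Googlebot"]
    lines += ["Disallow: %s" % p for p in paths]
    return "\n".join(lines)

# Illustrative data only.
report = """page,visits
/good-article,1200
/stub-page-1,0
/stub-page-2,0
"""
print(robots_disallow_block(zero_traffic_paths(report)))
```

From there you could paste the Disallow lines into robots.txt - though, as tedster cautions in this thread, it's far from certain that hiding such pages would lift a Panda demotion.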

synthese - msg:4321372 - 11:38 pm on Jun 2, 2011 (gmt 0)

Yes, some pages were better placed in search, but the percentage lost is pretty similar across the board. E.g. the page that got 1,000 referrals per day now gets about 50; the page that got 100 now gets about 5. And yes ... that's a 95% drop -- hit by all three Panda iterations.

If I go through the top 25 landing pages from Google traffic pre-Panda and post-Panda, the lists are not actually that different -- except for the numbers, that is ...

Broadway - msg:4321408 - 2:48 am on Jun 3, 2011 (gmt 0)

I'd agree with synthese. I don't find that certain pages or subdirectories were necessarily hit differently.

I do see a larger percentage traffic loss on some pages that ranked in the top 5 for certain keywords. But that is what you would expect. We all know top ranking positions get a higher percentage of clicks, so when they drop they lose a proportionally larger share of traffic. A drop from #1 to #25 loses proportionally more than a drop from #10 to #35 - the relationship isn't arithmetic.

But for my individual subdirectories as a whole - topics that were basically long-tail pages, not high rankers - the drop in traffic is essentially the same percentage.

I wish it were as easy as looking at Analytics and seeing which pages Google no longer valued.

indyank - msg:4321412 - 3:11 am on Jun 3, 2011 (gmt 0)

"I wish it were as easy as looking at Analytics and seeing which pages Google no longer valued."

Exactly ... it isn't as easy as many, including Vanessa Fox, make it out to be. If it were, we would have seen many more recoveries. Neither Analytics nor WMT really helps.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved