
Forum Moderators: Robert Charlton & goodroi

Allow low quality pages to be indexed? Lesser of two evils conundrum

     
7:35 am on Feb 5, 2018 (gmt 0)

Senior Member from GB 



Following the announcement that Google has been treating links to noindex pages as nofollow (i.e. dropping them from the link graph), I am wondering whether to make changes to a site and would welcome thoughts from other members.
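
(For anyone skimming, by 'noindex' I mean the standard robots meta tag in the head of each 'product' page, or the equivalent X-Robots-Tag HTTP header, e.g. set via Apache's mod_headers for non-HTML files; the header line is just illustrative:)

    <meta name="robots" content="noindex">
    Header set X-Robots-Tag "noindex"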

I have always had an 'if in doubt, noindex' policy for any content I consider low quality, and this site has content (let's call it 'product' content) that cannot be salvaged:
- content is not permanent;
- content is often brief;
- content is derivative and clichéd, because there are only limited ways to describe very similar products;
- content often has information obfuscated to preserve the anonymity of the product 'owner';
- we do not always ultimately control the exact product wording; it can be forced on us;
- content is widely syndicated on far bigger sites (aggregators etc.) with high authority and relevance.

The saving grace is that this is standard for the niche, so we are at no disadvantage compared to competitors.

Until now I have let the client organise the content into categories and write snippets that are (hopefully) unique, so that we end up with a (hopefully) reasonable category page.

I am taking over this process because the snippets have deteriorated over time in both length and quality. I hope to improve the user experience on the category pages by highlighting, as far as possible, what makes each 'product' different, to keep people on the site longer without their eyes glazing over.

That's the background: now my question.

Given the low quality of the content, and its transitory nature, which do people consider the lesser of two evils?

1) Keep the 'products' noindexed, and category pages bleed PR
There can be up to 80 'products' in some categories, and this is unfortunately unavoidable. Devising a classification system that distributes 'products' evenly is impossible because of overlap, trends in the niche, and changes in the business operations.

2) Index the 'products': PR is preserved, but hundreds of low-quality pages are indexed, perhaps at the expense of others
Lots of 410s will be required once 'products' expire, and Google never seems to forget URLs (a sketch of a server rule for this is below).
And because these 'product' pages are occasionally popular for a while, and even relatively heavily linked, I'm worried they may oust other pages: pages from the 'resources' part of the site, which are not at all well linked as yet but which add up to a decent, regular amount of traffic that can be relied on permanently.
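
(Serving the 410s themselves is the easy part; a minimal sketch, assuming Apache's mod_alias and made-up 'product' URLs - the second line is the pattern form for whole batches of expired URLs sharing a prefix:)

    Redirect gone /products/discontinued-widget
    RedirectMatch gone ^/products/2017-.*$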

Thanks for reading, hope I've been clear :)
9:13 am on Feb 5, 2018 (gmt 0)

Moderator from US: keyplyr


There's a new GSC (Google Search Console) nearing completion; you can see it now, linked from the current GSC.

In the new GSC there is (will be) much more data concerning indexed pages, even those that are disallowed by robots.txt or meta tags.
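
(Worth keeping the two mechanisms distinct when reading those reports: a robots.txt Disallow only blocks crawling, and a blocked URL can still end up indexed, whereas a noindex tag keeps a page out of the index but only if Google is allowed to crawl the page and see it. Minimal robots.txt example with a hypothetical directory:)

    User-agent: *
    Disallow: /drafts/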

At first glance, it appears Google no longer wants anything disallowed. If it's online, Google wants to index it. It is treating all disallowed files as errors to be fixed, then validated.

There should be more data populated to the reports soon that may explain it further. You may want to wait until then to make any changes.
10:44 am on Feb 5, 2018 (gmt 0)

Senior Member from GB 



Thanks - yes, I'm definitely in no rush to do this. I want to see the effect of the better-quality snippets first. Then I may test on one or two categories once I have some sort of benchmarking in place.