
Dodging risk of thin content penalty

8:53 am on Sep 8, 2013 (gmt 0)

New User from NL 

joined:Sept 7, 2013
posts: 31
votes: 0


I am developing a website where each page offers a time-related service for a different location around the world. There are about 22,000 pages on the site, and each page URL is listed in a submitted sitemap:

... www.mysite.com/product_1.html
... www.mysite.com/product_2.html
... www.mysite.com/product_3.html

Now, although the content is clearly different on each page, there is not a lot of text per page. Moreover, some could argue that it is not original content. However, my implementation is cleaner (better GUI) and much faster than the competition's. I also plan to make these pages work well on smartphones, which the competition's pages do not.

I see a possible risk of being penalized for an excess of thin content pages.

I am wondering whether converting my pages into a single page with URL query parameters would eliminate that risk:

... www.mysite.com/product.html?id=1
... www.mysite.com/product.html?id=2
... www.mysite.com/product.html?id=3

Does anyone have experience with this? Is this a good idea worth investing time in?
12:35 pm on Sept 8, 2013 (gmt 0)

Moderator This Forum from GB 

WebmasterWorld Administrator 5+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
votes: 141

Welcome to WebmasterWorld, JVerstry!

Query string parameters also form unique URLs.

Hence changing URLs from /product_1.html to /product.html?id=1 will not solve the problem; in fact it will flood your site with duplicates, doubling the number of product URLs.

You would then have to either return 404/410 for the old URLs or redirect them to the new URL versions. Then you *could* tell Google via the WMT URL parameter handling tool to index only one representative URL per id=nnn parameter value, provided you do not use the id= parameter in other URL constructs.
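If you did take the redirect route, it comes down to one rewrite rule. A minimal sketch, assuming an Apache server with mod_rewrite enabled and the example filenames from earlier in this thread:

```apache
# In .htaccess at the site root (assumes mod_rewrite is available)
RewriteEngine On

# Permanently (301) redirect /product_123.html to /product.html?id=123
RewriteRule ^product_(\d+)\.html$ /product.html?id=$1 [R=301,L]
```

On nginx or IIS the equivalent would be a rewrite/redirect rule with the same pattern, but the migration risks described below apply regardless of server.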

It would take Google some time to process the 404/410s and crawl the new URLs, and there is a risk that in the meantime the site's rankings will tank before recovering (and they may or may not recover to where they were). So personally I would not go for this solution.

What I would do instead is leave the URLs as they are and add a meta robots noindex to each of these thin pages.
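The noindex approach needs only one tag in the head of each thin page. A sketch:

```html
<head>
  <!-- Keeps this page out of the index; "follow" still lets
       crawlers pass through the links on the page -->
  <meta name="robots" content="noindex, follow">
</head>
```

Googlebot has to recrawl each page before the directive takes effect, so the thin pages will drop out of the index gradually rather than all at once.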
7:34 pm on Sept 8, 2013 (gmt 0)

New User from NL 

joined:Sept 7, 2013
posts: 31
votes: 0

OK, thanks, that makes sense. I am going to noindex them and keep the country-level pages. That should be enough.