
Google SEO News and Discussion Forum

    
Publish product pages gradually to avoid Panda thin content filter?
CheeryFox
5+ Year Member
Msg#: 4598672 posted 6:33 am on Aug 2, 2013 (gmt 0)

As far as I know, one of the reasons Panda hits a website is thin content. Say there are 10,000 products and I decide to build an online store with one page for each product. However, I can't write 10,000 product articles quickly; I will write them gradually, say 100 articles per month. So after the first month I'm ready to open the website. Should I publish only the 100 pages that have good articles, or can I publish all 10,000 product pages, 9,900 of which would be thin? Will Google devalue the 100 pages with good articles because they sit on a website with 9,900 thin-content pages? In other words, does the thin-content filter work on a per-page level or on a site level in this situation? And what is the best strategy here?

 

JD_Toims
WebmasterWorld Senior Member, Top Contributors Of The Month
Msg#: 4598672 posted 6:55 am on Aug 2, 2013 (gmt 0)

I've thought quite a bit about how to approach this situation, since I'm planning to publish a large number of pages (2.4 million) myself. The decision I've made is to "publish in stages" for search engines: I would "push the site out" but noindex anything that does not have unique content, then remove the noindex once a page is unique.

Basically, I think in your situation I'd noindex everything and remove the noindex from any page that contains unique content. If that's 100 pages in the first month, then I'd have 100 pages that were "index" and 9,900 that were "noindexed", and I'd continue to remove the noindex from others as uniqueness was created for them specifically.
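
To make that concrete, here's a minimal sketch of the per-page decision, assuming each product record keeps its hand-written copy in a field of its own (the field name and the word-count threshold are just illustrative, not a magic number):

# Minimal sketch: pick a robots meta directive per product page.
# Assumes each product dict has a "unique_description" field holding
# the hand-written copy (empty if only the generic feed text exists).

MIN_UNIQUE_WORDS = 150  # illustrative threshold for "real" unique content

def robots_directive(product):
    """Return the robots meta content for this product page."""
    copy = (product.get("unique_description") or "").strip()
    if len(copy.split()) >= MIN_UNIQUE_WORDS:
        return "index, follow"    # unique article exists: let it be indexed
    return "noindex, follow"      # thin page: keep it out of the index

def robots_meta_tag(product):
    """Render the tag that goes in the page <head>."""
    return '<meta name="robots" content="%s">' % robots_directive(product)

written = {"sku": "A100", "unique_description": "word " * 200}
thin = {"sku": "B200", "unique_description": ""}
print(robots_meta_tag(written))  # index, follow
print(robots_meta_tag(thin))     # noindex, follow

The nice part is that the same check flips a page to "index, follow" automatically the next time it's rendered after its article goes in, so there's nothing to track by hand.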

JS_Harris
WebmasterWorld Senior Member, 5+ Year Member
Msg#: 4598672 posted 7:15 am on Aug 2, 2013 (gmt 0)

Is it even possible for someone to write 2.4 million quality pages? That might be a red flag in and of itself.

The problem you describe, CheeryFox, is faced by all online stores: they all need the 'more' Google is looking for. You will want to avoid republishing the generic product description if your pages don't have more to offer, or they'll linger somewhere around page 15 until they do.

More doesn't have to be content, however. It could be:
- Cool comparison tools
- A new way of helping visitors get what they wanted
- Ease of use
- Etc.

Having something tangible that yields great user metrics can overcome generic descriptions to some extent, but as you pointed out, you will want to write your own sales points if at all possible. That has been widely known for a long time, so you're on the right path.
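
If you want a rough idea of how far a page's copy has drifted from the feed text, a quick check like the one below can flag the stragglers. This is only a sketch: the threshold and sample strings are assumptions, and a large catalogue would usually call for shingling or some other more robust comparison.

# Rough sketch: flag product pages whose copy is still mostly the
# generic manufacturer description. Threshold is illustrative.
from difflib import SequenceMatcher

def is_near_duplicate(page_copy, feed_copy, threshold=0.85):
    """True if the page text is still close to the generic feed text."""
    ratio = SequenceMatcher(None, page_copy.lower(), feed_copy.lower()).ratio()
    return ratio >= threshold

feed = "Acme 5000 widget with steel housing and 2-year warranty."
rewritten = ("We tested the Acme 5000 for a month; the steel housing "
             "survived two drops onto concrete and the 2-year warranty "
             "covered a stuck button.")

print(is_near_duplicate(feed, feed))       # True  -> still generic copy
print(is_near_duplicate(rewritten, feed))  # False -> reads as its own copy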

Did you know? JJ McCarthy, former head of eBay's partner network, once owned a company that did nothing but write unique product descriptions in bulk for online retailers before he joined eBay.

JD_Toims
WebmasterWorld Senior Member, Top Contributors Of The Month
Msg#: 4598672 posted 7:25 am on Aug 2, 2013 (gmt 0)

Is it even possible for someone to write 2.4 million quality pages? That might be a red flag in and of itself.

Yes, when you find yourself in a position to present unique information through databases and [other things] no one else is presenting to searchers, it's possible, even though being in that position is unlikely these days.

I know it's crazy to think there's anything anyone's missed on the entire Internet, but I've got a bit of crazy in me, and there's something people are missing in a niche I've been involved in that allows me to be the first to do something, even though it will take a *very* large number of pages to do it.

aristotle
WebmasterWorld Senior Member, 5+ Year Member, Top Contributors Of The Month
Msg#: 4598672 posted 1:12 pm on Aug 2, 2013 (gmt 0)

I know it's crazy to think there's anything anyone's missed on the entire Internet


There are enormous amounts of information in old books and newspaper articles that, as far as I can determine, have never been put on the web. Of course, most of it isn't of any commercial value, but some of it is relevant to important social, political and environmental issues, especially in providing a historical perspective. I've found that when you do add anything new to the web, even if it's about an obscure subject, it gets scraped and spread around, and eventually gets turned into a Wikipedia page.

Rasputin
5+ Year Member
Msg#: 4598672 posted 2:52 pm on Aug 2, 2013 (gmt 0)

There are lots of sites related to weather, driving distances, etc. that have thousands or millions of pages and have avoided Panda and Penguin penalties; in fact, quite often they seem to have done well despite all their pages being auto-generated.

So in principle I think it would be possible to publish loads of pages at once and not be penalised. Having said that, I wouldn't do it myself unless it was a test site unconnected to my other sites, and while you might get away with a small amount of text on each page if it is unique, I don't think you would get away with duplicate product descriptions.

Also, Panda sometimes seems to affect a whole site, other times certain sections or even individual pages more than others, but I'm not sure what 'swings the balance' between the two.

I suspect it is easier to stay out of Panda than to get released after being penalised (I could be wrong, just a hunch), so if it were me I would just post the quality articles 100 at a time.

Planet13
WebmasterWorld Senior Member, Top Contributor of All Time, Top Contributors Of The Month
Msg#: 4598672 posted 4:20 pm on Aug 2, 2013 (gmt 0)

Agree with JD_Toims:

"Basically, I think in your situation I'd noindex everything and remove the noindex from any page that contains unique content, so if that's 100 pages in the first month, then I'd have 100 pages that were "index" and 9900 that were "noindexed", then I'd continue to remove the noindex from others as "uniqueness was created" for them specifically."


Hope this helps.
