|Publish product pages gradually to avoid Panda thin content filter?|
As far as I know, one of the reasons Panda hits a website is thin content. Say there are 10,000 products, and I decided to build an online store with one page per product. However, I cannot write 10,000 product articles quickly; I will write them gradually, say 100 articles per month. So after the first month I am ready to open the website. Should I publish only the 100 pages for products that have good articles? Or can I publish all 10,000 product pages, including 9,900 thin content pages? Will Google devalue the 100 pages with good articles because they reside on a website with 9,900 thin content pages? In other words, does the thin content filter in this situation work on a per-page level or on a site level? And what is the best strategy here?
I've thought quite a bit about how to approach this situation, since I'm facing a similar one myself (publishing a large number of pages: 2.4 million). The decision I've made is to "publish in stages" for search engines: push the whole site out, but noindex anything that does not have unique content, then remove the noindex once a page's content is unique.
Basically, I think in your situation I'd noindex everything and remove the noindex from any page that contains unique content. So if that's 100 pages in the first month, you'd have 100 pages indexed and 9,900 noindexed; then I'd continue to remove the noindex from others as uniqueness is created for them specifically.
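The staging approach above can be sketched as simple template logic. This is a minimal illustration, not anything from the thread: the page record and its `unique_description` field are hypothetical, and the tag would be emitted in each product page's `<head>`.

```python
def robots_meta(page):
    """Return the robots meta tag for a product page.

    Pages without unique content are noindexed (but still "follow",
    so crawlers can traverse their links); once unique copy has been
    written for a page, it switches to being indexable.
    `page` is assumed to be a dict with a 'unique_description' field,
    a hypothetical schema used only for this sketch.
    """
    if page.get("unique_description"):
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'
```

With this in the page template, "removing the noindex" is just a matter of filling in the unique description for a product; no separate bookkeeping of which pages are staged is needed.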
Is it even possible for someone to write 2.4 million quality pages? That might be a red flag in and of itself.
The problem you describe, CheeryFox, is faced by all online stores: they all need the 'more' Google is looking for. You will want to avoid republishing the generic product description; if your pages don't have more to offer, they'll linger somewhere around page 15 until they do.
More doesn't have to be content, however.
- Cool comparison tools
- A new way of helping visitors get what they wanted
- Ease of use
Having something tangible that yields great user metrics can overcome generic descriptions to some extent, but as you pointed out, you will want to write your own sales copy if at all possible. That has been widely known for a long time, so you're on the right path.
Did you know? JJ McCarthy, former head of eBay's partner network, once owned a company that did nothing but write unique product descriptions in bulk for online retailers before he joined eBay.
|Is it even possible for someone to write 2.4 million quality pages? That might be a red flag in and of itself. |
Yes. When you find yourself in a position to present unique information, through databases and [other things] no one else is presenting to searchers, it's possible, even though being in that position is unlikely these days.
I know it's crazy to think there's anything anyone's missed on the entire Internet, but I've got a bit of crazy in me and there's something people are missing in a niche I've been involved in that allows for me to be the first to do something even though it will take a *very* large number of pages to do it.
|I know it's crazy to think there's anything anyone's missed on the entire Internet |
There are enormous amounts of information in old books and newspaper articles which, as far as I can determine, have never been put on the web. Of course, most of it isn't of any commercial value, but some of it is relevant to important social, political and environmental issues, especially in providing a historical perspective. I've found that when you do add anything new to the web, even if it's about an obscure subject, it gets scraped and spread around, and eventually gets turned into a Wikipedia page.
There are lots of sites related to weather, driving distances, etc. that have thousands or millions of pages and have avoided Panda and Penguin penalties; in fact they quite often seem to have done well despite all their pages being auto-generated.
So in principle I think it would be possible to publish loads of pages at once and not be penalised. Having said that, I wouldn't do it myself unless it was a test site unconnected to my other sites, and while you might get away with a small amount of text on each page if it is unique, I don't think you would get away with duplicate product descriptions.
Also, Panda sometimes seems to affect a whole site, and other times certain sections, or even individual pages, more than others; I'm not sure what swings the balance between the two.
I suspect it is easier to stay out of Panda than to get released after being penalised (I could be wrong, it's just a hunch), so if it were me I would just publish the 100 quality articles at a time.
Agree with JD_Tomis:
|"Basically, I think in your situation I'd noindex everything and remove the noindex from any page that contains unique content. So if that's 100 pages in the first month, you'd have 100 pages indexed and 9,900 noindexed; then I'd continue to remove the noindex from others as uniqueness is created for them specifically." |
Hope this helps.