Forum Moderators: Robert Charlton & goodroi
The site is only about a year and a half old, and for a while we just assumed its pages were sitting in that mysterious Google sandbox and that was why they weren't getting indexed. In early May 2006, all pages dropped out of the index entirely except for the home page and two others. On May 9th, 297 pages popped back in, but all except the home page were supplemental results. We re-submitted the sitemap to Google Sitemaps. On a side note, I found out around this same time that our client had also tried to be "helpful" by opening their own Google Sitemaps account and submitting a sitemap as well. I'm not sure if this hurt anything, but as soon as I found out I told them to shut down their account immediately. Over the last few weeks, all pages have been re-indexed, but most of the 984 pages in the Google index are in the Supplemental Index. We have worked hard with this client to optimize the site and ensure it follows search engine guidelines, but we can't figure out why the site is stuck in the supplemental index.
I recently discovered that there are other sites out there with content very similar to this one's. Shame on me - I assumed the product-description copy on the client's site was unique. However, this is an e-commerce site distributing the same products that several other sites do, and almost all of those other distributors use the same manufacturer-supplied copy as the descriptions for their products. The only possibility I see here is that this is a dupe content issue. Is it possible that the client just needs to re-write all product descriptions to be unique? I understand the point of avoiding dupe content, but on the other hand, there are only so many ways you can describe one product. If the dupe content issue is causing this, then in theory every e-comm site that sells these products would need to write unique copy for each product and discard the copy provided by the manufacturer? Is that realistic? I can't imagine that only one site gets to be "allowed" to sell these products while all the other sites carrying the exact same products (sites that are authorized to sell them and have every right to do so) are stashed away in the supplemental index, never to be found. On the other hand, dupe content is dupe content - how is Google to know the difference?
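Here's my rough mental model of how an engine might spot the duplication: break each description into overlapping word "shingles" and measure how many shingles the two texts share (Jaccard overlap). This is a toy Python sketch only - the shingle size, the sample copy, the "Acme 3000" and "WidgetWorld" names, and the whole approach are my assumptions for illustration, not anything Google has documented.

# Rough sketch of near-duplicate detection via word shingles + Jaccard overlap.
# Shingle size and sample descriptions are made up for illustration.

def shingles(text, size=4):
    """Set of overlapping word n-grams ("shingles") in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    """Shared shingles divided by total distinct shingles (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if (a or b) else 1.0

manufacturer_copy = ("The Acme 3000 blue widget features a durable steel frame, "
                     "a weather-resistant finish, and a two-year limited warranty.")
# A reseller pastes the same copy and tacks on one marketing line of its own.
reseller_copy = manufacturer_copy + " Order yours today from WidgetWorld."

overlap = jaccard(shingles(manufacturer_copy), shingles(reseller_copy))
print(f"Shingle overlap: {overlap:.2f}")  # 0.75 here: most shingles are shared

Even with the reseller's extra line tacked on, three quarters of the shingles are shared, so I can see why manufacturer copy pasted across dozens of sites could look like duplicate content.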
Any insight into this issue is greatly appreciated.
[edited by: jatar_k at 12:56 am (utc) on July 14, 2006]
[edit reason] no urls thanks [/edit]
There are several common reasons why pages end up in the supplemental index:
1. Not enough content on the page
2. Content on pages already found on other sites (dupe content)
3. Title tags not different page to page
4. Description tags not different page to page (a quick way to check points 3 and 4 is sketched after this list)
5. Not enough inbound links (especially if content is not unique)
6. Pages are too deep in the site (more than 2 levels deep, home page counts as level 1).
Those are just some points, but I'm sure there are others.
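For points 3 and 4, a quick self-check is easy to script: pull each page's title tag and meta description and see which URLs share the same pair. A rough Python sketch follows - the URLs and HTML are made-up examples, and the regex "parsing" is deliberately crude, good enough for a spot check but not a real crawler.

# Rough sketch: flag pages that share an identical <title> / meta description pair.
import re
from collections import defaultdict

def title_and_description(html):
    """Pull the <title> text and meta description out of raw HTML (crude regex)."""
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    desc = re.search(r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
                     html, re.I | re.S)
    return (title.group(1).strip() if title else "",
            desc.group(1).strip() if desc else "")

pages = {  # hypothetical pages; in practice, fetch these from the live site
    "/widgets/blue": '<title>Widgets | Acme Store</title>'
                     '<meta name="description" content="Buy widgets online.">',
    "/widgets/red":  '<title>Widgets | Acme Store</title>'
                     '<meta name="description" content="Buy widgets online.">',
    "/about":        '<title>About Acme Store</title>'
                     '<meta name="description" content="Who we are and what we sell.">',
}

by_tags = defaultdict(list)
for url, html in pages.items():
    by_tags[title_and_description(html)].append(url)

for (title, desc), urls in by_tags.items():
    if len(urls) > 1:
        print(f"Same title/description on {urls}: {title!r} / {desc!r}")

Anything that prints out is a candidate for rewriting so every page carries its own title and description.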
Often the duplication is within the site, and is caused by too little copy per page.
Google reads code, not the visible page; so a page with lots of shared content (ads, promo statements, links, navigation, etc.) and one paragraph of unique stuff really won't work - in the page code, that unique stuff is such a small proportion that Google smiles sadly and moves on.
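To put a rough number on that proportion, here's a toy calculation: a made-up product page where the only unique copy is a single paragraph and everything else is template, navigation, promo blurb and footer. The HTML and the resulting figure are purely illustrative assumptions, not a real measurement.

# Back-of-envelope version of the "small proportion" problem: how much of the
# page source is the one paragraph of unique copy? The HTML is entirely made up.

page_html = """<html><head><title>Blue Widgets | Acme Store</title></head><body>
<div id="nav"><a href="/">Home</a> <a href="/widgets">Widgets</a>
<a href="/about">About</a> <a href="/contact">Contact</a></div>
<div id="promo">Free shipping on orders over $50! Sign up for our newsletter
and get 10% off your first order. Follow us for weekly deals.</div>
<div id="content"><p>The Acme 3000 blue widget features a durable steel frame
and a weather-resistant finish, backed by a two-year warranty.</p></div>
<div id="footer">Copyright 2006 Acme Store. All rights reserved.</div>
</body></html>"""

unique_copy = ("The Acme 3000 blue widget features a durable steel frame "
               "and a weather-resistant finish, backed by a two-year warranty.")

ratio = len(unique_copy) / len(page_html)
print(f"Unique copy is {ratio:.0%} of the page source")  # roughly a fifth here

On a heavily templated e-commerce page the share of unique copy can easily be smaller still, which is exactly the problem described above.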
You can help by merging small pages if the content cannot be increased (e.g. a page on blue widgets and a second one on red widgets become one widgets page, with paragraphs on blue and red).
You can also help by using clean, off-page CSS and by removing clutter, pointless links and repeated marketing blurb (which really need not be on every page).
Final note: when you say 'several other sites', I'm guessing you mean 10,000. Why would Google list them all?
The one who provides the best content will win - let that be your client!