


Indexing Dynamic pages and Dup Content

Indexing Problems - Maybe Duplicate Content?

   
5:13 pm on Jan 10, 2010 (gmt 0)

5+ Year Member



I am new to WebmasterWorld, and this is my first post. I am not a programmer, but I do have some experience with SEO. I really need some help and advice on how I can fix my site.

My site has a PR4 and has been live since 2006 (a new site was launched in Oct 2008). There have been problems with the site from the beginning, and we have tried many things to fix them.

Here is a timeline of problems and attempted fixes.

1. Problem: The development ("working") site was indexed by Google because no robots.txt disallow or "noindex" directive had been placed on it.
Fix: We removed all pages from the working site and submitted a re-inclusion request to Google (see the sketch after this list).

2. Problem: Indexing of dynamic pages. Only approximately 7,500 pages appear to be indexed by Google, based on a site:mydomain.com query.
Fix: We have tried to work on the site map page, used an XML sitemap, and even had an SEO company consult on how to get the pages and vehicles indexed.
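
For reference, a minimal sketch of the kind of directives that keep a working (development) site out of the index, assuming it lives on its own hostname - the names here are only placeholders, not the real site:

    # robots.txt at the root of the development host only - disallows all crawling
    User-agent: *
    Disallow: /

    <!-- and/or a robots meta tag in the <head> of every development page -->
    <meta name="robots" content="noindex, nofollow">

Note that a Disallow stops crawling but does not remove URLs that are already indexed; for that, the pages need to stay crawlable long enough for a noindex tag (or a URL removal request) to be seen.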

This site has over 70,000 vehicles on it and over 700 Canadian auto dealers. It has HUGE potential to be a top contender in the auto industry, but it is lacking reach.

Any advice or knowledge you can pass on will be greatly appreciated.
Thank you
Ray

[edited by: brotherhood_of_LAN at 6:16 pm (utc) on Jan. 11, 2010]
[edit reason] No personal URLs, thanks! [/edit]

11:22 am on Jan 12, 2010 (gmt 0)

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



what does your internal navigation structure look like?

how much unique content is on each page as a percentage of the total content on the page?

if you have lots of boilerplate that is duplicated on tens of thousands of pages, or insufficient internal navigation/inbound links, a sitemap won't do much for you with crawling or indexing.

7:35 pm on Jan 16, 2010 (gmt 0)

5+ Year Member



The percentage of unique content per page is only about 5% per vehicle, and other pages could be even less than that.

I was looking at getting a mod_rewrite done on the site to remove the "?" from the URLs. How much would this help, if at all?
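
For illustration, a rough .htaccess sketch of what such a rewrite might look like on Apache with mod_rewrite enabled - the script name vehicle.php and the id parameter are placeholders, not the site's actual URLs:

    RewriteEngine On

    # 301 the old dynamic URL to the clean one so only one version stays indexed
    RewriteCond %{THE_REQUEST} \?id=([0-9]+)
    RewriteRule ^vehicle\.php$ /vehicle/%1? [R=301,L]

    # internally map the clean URL back to the real script (no redirect)
    RewriteRule ^vehicle/([0-9]+)/?$ /vehicle.php?id=$1 [L]

On its own a rewrite only produces prettier URLs; without the 301 from the old ?-style addresses (and internal links updated to match), both versions stay reachable and the duplication gets worse rather than better.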

[edited by: mack at 7:12 pm (utc) on Jan. 17, 2010]
[edit reason] Removed URL [/edit]

7:20 pm on Jan 17, 2010 (gmt 0)

WebmasterWorld Administrator mack is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Hi Ray,

I removed the link to your site. On WebmasterWorld we try to keep the topics as general as possible; by doing this, the thread may in the long term be of use to a lot more members.

When you say duplicate content, am I right in assuming some of your pages can be accessed using more than one URL? If this is the case, you need to ensure that each page can only be accessed at one location. Make sure your own internal linking structure uses only one URL per page; this will prevent search engines from discovering the other possible URLs.
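
Where variants are hard to eliminate entirely (sort orders, tracking parameters, session IDs and the like), a rel="canonical" link element is a useful hint on top of cleaning up the internal links. A sketch, with the URL purely as a placeholder:

    <!-- in the <head> of every variant of the page, point at the one preferred URL -->
    <link rel="canonical" href="http://www.example.com/vehicle/12345">

Bear in mind it is a hint rather than a directive, so fixing the internal linking and redirecting retired URLs remains the primary solution.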

The lack of unique content is, in my opinion, of greater concern. If there are many other sites using the very same data, then it may prove extremely difficult to rank.

Mack.

2:03 am on Jan 18, 2010 (gmt 0)

WebmasterWorld Administrator phranque is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



there are actually three distinct issues that are often referred to as duplicate content, and each requires a different solution:
- non-canonical urls - serving the same content for multiple urls within the control of one publisher (see the sketch after this list).
- heavy boilerplate - so much content is repeated on multiple pages that there is not enough unique content to distinguish the subject or message of one page from another.
- scraped or syndicated content - content that is shared among many publishers, or is stolen and used elsewhere without permission.
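
for the first of those, the usual remedy is a 301 (permanent) redirect from every duplicate address to the one url you want indexed - the classic case being www vs non-www versions of the same site. a rough .htaccess sketch for apache, with example.com standing in for the real domain:

    RewriteEngine On
    # redirect any non-www request to the www hostname, keeping the path
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]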

which "duplicate content" problem(s) are you having?