My site has a PR4 and has been live since 2006 (a new site launched Oct 2008). There have been problems with the site from the beginning, and we have tried many things to fix it.
Here is a timeline of problems and attempted fixes.
1. Problem: Our working (staging) site was indexed by Google because no robots.txt "Disallow" rule (or noindex directive) was in place on it.
Fix: We removed all pages from the working site and submitted a re-inclusion request to Google.
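For anyone in the same situation: a minimal robots.txt on the staging host (and only there) blocks all compliant crawlers. This is a sketch and assumes the staging copy runs on its own hostname:

```
# robots.txt on the staging host only - blocks all compliant crawlers
User-agent: *
Disallow: /
```

Note that robots.txt only stops crawling; URLs Google already knows about can stay in the index. To drop pages that are already indexed, you need a noindex meta tag (with crawling still allowed so Google can see it) or Google's URL removal tool.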
2. Problem: Indexing of dynamic pages. Only about 7,500 pages appear to be indexed in Google, based on a site:mydomain.com query.
Fix: We have worked on the HTML site map page, added an XML sitemap, and even had an SEO company consult on how to get the vehicle pages indexed.
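One thing worth checking on the XML sitemap side: a single sitemap file is capped at 50,000 URLs, so a site with 70,000+ vehicle pages needs a sitemap index pointing at several files. A minimal sketch (the file names and domain are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-vehicles-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-vehicles-2.xml</loc>
  </sitemap>
</sitemapindex>
```

Each referenced file then lists its own URLs in the usual urlset format.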
This site has over 70,000 vehicles on it and over 700 Canadian auto dealers. It has HUGE potential to be a top contender in the auto industry, but it is lacking reach.
Any advice or knowledge you can pass on will be greatly appreciated.
[edited by: brotherhood_of_LAN at 6:16 pm (utc) on Jan. 11, 2010]
[edit reason] No personal URLs, thanks! [/edit]
how much unique content is on each page as a percentage of the total content on the page?
if you have lots of boilerplate that is duplicated on tens of thousands of pages, or insufficient internal navigation/inbound links, a sitemap won't do much for you with crawling or indexing.
I was looking at getting a mod_rewrite rule added to the site to remove the "?" from the URLs. How much would this help, if at all?
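For reference, a mod_rewrite setup for this usually needs two rules: a 301 redirect from the old dynamic URL to the clean one, and an internal rewrite mapping the clean URL back to the script. This is only a sketch; the script name `vehicle.php`, the `id` parameter, and the `/vehicle/12345` path are all hypothetical stand-ins for whatever your site actually uses:

```
RewriteEngine On

# 301-redirect direct client requests for the old dynamic URL to the
# clean form. Matching against THE_REQUEST (the raw request line)
# avoids an infinite loop with the internal rewrite below.
RewriteCond %{THE_REQUEST} \s/vehicle\.php\?id=([0-9]+)[\s&]
RewriteRule ^vehicle\.php$ /vehicle/%1? [R=301,L]

# Internally map the clean URL back to the real script
RewriteRule ^vehicle/([0-9]+)/?$ /vehicle.php?id=$1 [L]
```

Rewriting alone won't help much unless your internal links also switch to the clean URLs; otherwise search engines keep discovering the old ones.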
[edited by: mack at 7:12 pm (utc) on Jan. 17, 2010]
[edit reason] Removed URL [/edit]
I removed the link to your site. On WebmasterWorld we try to keep topics as general as possible. By doing this, the thread may, long term, be of use to a lot more members.
When you say duplicate content, am I right in assuming some of your pages can be accessed at more than one URL? If so, you need to ensure that each page can only be accessed at one location, and that your own internal linking structure only uses one URL per page. This prevents search engines from discovering the other possible URLs.
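Alongside fixing the internal links, a rel=canonical link in the head of every variant tells search engines which URL you consider the preferred one. A sketch, with a hypothetical URL:

```html
<!-- In the <head> of every variant of the page;
     href is the one preferred URL for this content -->
<link rel="canonical" href="http://www.example.com/vehicle/12345" />
```

It is a hint rather than a directive, so it complements, but does not replace, consolidating to a single URL.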
The lack of unique content is in my opinion of greater concern. If there are many other sites using the very same data then it may prove extremely difficult to rank.
which "duplicate content" problem(s) are you having?