
Linking and indexing for millions of product pages



7:25 am on Aug 1, 2011 (gmt 0)

Good morning all,
I am reviewing the SEO status of < a website > and I have noticed that the individual product pages are not listed in Google. I have checked various things and all seems OK. The site has over 3 million items, so this is a major issue. I am looking for a webmaster who has experience with SEO on sites with a high page count. Please reply if you would like any information.

From what I can see the main sections are listed but not the items, and it's the items that are the main focus.

Many thanks.


[edited by: tedster at 10:41 pm (utc) on Aug 22, 2011]
[edit reason] no personal domain names, please [/edit]


2:09 pm on Aug 8, 2011 (gmt 0)

5+ Year Member

I'd say your first port of call would be Webmaster Tools; see what WMT has to say. I notice your XML sitemap does not go down to product level, and your product pages are really light on content.

Googlebot is obviously not reaching your product pages if they're not showing up, but to be honest it could also be content related.

One idea might be to create some backlinks to a product from something like Blogspot and see if the page gets indexed. I think you have a couple of issues to look at and some testing to do.

Secondly, if you post in a more relevant place you might get more responses.


2:16 pm on Aug 8, 2011 (gmt 0)

Hi Novus, first of all thank you for replying.
The issue is we have 3 million items, and I didn't think it would be sensible or time effective to produce a sitemap for all of them, but if really needed we could.

Content: we have tried, to be honest. Each item is a record; it has tracks, images and related items. I can certainly get some deeper links put on blogs. The site has run for 10 years and ranked really well, but we have had a redesign and Google seems to have forgotten us.

I suppose one question is: should I do a complete sitemap?




1:33 pm on Aug 9, 2011 (gmt 0)

5+ Year Member

Hi Mark

No probs. For whatever reason Googlebot is not getting to your product pages, so a full sitemap would be a good idea. How do your results look in other search engines? Do you see your products there?

I run a media ecommerce site with around 2,000 products. When I inherited the site it only had about 10% of pages indexed. The site is built on a horrible CMS with zero functionality, but I made sure I added as much content as possible to new products, optimized for keywords, and spent time building backlinks (articles, press releases, blogging etc.). I used a well-known blogging service to great effect to build some backlinks, which Googlebot followed.


10:50 pm on Aug 22, 2011 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Hello markward, and welcome to the forums. I moved your thread here to the Google SEO forum as a better place to get input.

Normally, depth of indexing on Google is related to PageRank - and in particular, solid backlinks to internal pages can be a big help. By "solid" I mean freely given "editorial" links rather than links you can just decide to place on your own. This can be challenging for a product "database" type of site, unless you have some remarkable content somewhere - possibly for strong categories within the site. I'd say the key for this site is summed up in two words: "content marketing".

If the current product pages are just generic feeds, with the same essential content available on other websites, then the job may be almost impossible. However, if you can enhance that content and add value, things do get better.

And finally, I agree that a sitemap is a good idea - specifically an XML sitemap (at this scale, a set of sitemaps with an index sitemap). At the very least, you'll get more accurate indexing information from Webmaster Tools that way than you would see using the site: operator.
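To make the "set of sitemaps with an index sitemap" idea concrete, here is a rough sketch of how the files could be generated, using only the Python standard library. The sitemaps.org protocol caps each sitemap file at 50,000 URLs, which is why a 3-million-page site needs a sitemap index. The file names and base URL below are placeholders, not anything from the site in question.

```python
# Sketch: split a large URL list into 50,000-URL sitemap files plus a
# sitemap index file, per the sitemaps.org protocol.
from xml.sax.saxutils import escape

SITEMAP_LIMIT = 50000  # protocol maximum URLs per sitemap file

def build_sitemaps(urls, base="https://www.example.com/sitemaps/"):
    """Return (index_xml, [(filename, sitemap_xml), ...])."""
    files = []
    for i in range(0, len(urls), SITEMAP_LIMIT):
        chunk = urls[i:i + SITEMAP_LIMIT]
        body = "".join(
            "<url><loc>%s</loc></url>" % escape(u) for u in chunk
        )
        xml = (
            '<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            "%s</urlset>" % body
        )
        files.append(("sitemap-%d.xml" % (i // SITEMAP_LIMIT + 1), xml))
    # The index file just lists the child sitemaps by URL.
    index_body = "".join(
        "<sitemap><loc>%s%s</loc></sitemap>" % (base, name)
        for name, _ in files
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        "%s</sitemapindex>" % index_body
    )
    return index, files
```

For 3 million product URLs this would produce 60 sitemap files plus one index; you would submit only the index in Webmaster Tools.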


12:16 am on Aug 23, 2011 (gmt 0)

Is it me?

Am I being dumb?

3 million items/products?

I'd better return to la la land:-)


1:13 am on Aug 23, 2011 (gmt 0)

hi mark

I am managing SEO for a couple of HUGE sites, over 20 million pages for one of them. We feed URLs to Google in various ways, and XML sitemaps are one of the best for URL discovery.
I would suggest checking some basic things as well, like robots.txt, header status codes, etc.
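Checking header status codes at this scale is easy to script. A minimal sketch, standard library only, that reports the HTTP status a crawler would see for a sample of product URLs (the URLs fed to it would be your own; nothing here is specific to the site in the thread):

```python
# Sketch: spot-check the HTTP status code returned for a URL,
# as a quick sanity check that product pages answer 200 rather
# than 404/301/500.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def check_status(url, timeout=10):
    """Return the HTTP status code for a HEAD request to url."""
    req = Request(url, method="HEAD",
                  headers={"User-Agent": "status-checker"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        # 4xx/5xx responses raise; the code is still what we want.
        return e.code
```

Run it over a few hundred randomly sampled product URLs; anything other than 200 is worth a closer look.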


1:32 am on Aug 23, 2011 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Also, with large numbers of URLs, having no canonical problems becomes very important. If you're using a CMS that has a good canonical link plug-in, use it. Even if you aren't using such a CMS, find a way to put an ACCURATE canonical link tag on every page.
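One way to audit whether every page carries an accurate canonical tag is to extract the tag and compare it to the URL that was actually crawled. A small sketch using only the standard library's HTML parser (the sample markup is hypothetical):

```python
# Sketch: pull the rel="canonical" href out of a page's HTML so it
# can be compared against the crawled URL.
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def find_canonical(html):
    """Return the canonical URL declared in html, or None."""
    p = CanonicalParser()
    p.feed(html)
    return p.canonical
```

A page whose canonical points somewhere other than its own preferred URL (or has none at all) is exactly the kind of error that wastes crawl effort on a 3-million-page site.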


1:38 am on Aug 23, 2011 (gmt 0)

@tedster I agree, but the issue here is that the pages are not getting indexed. A correct canonical tag will surely help with better SEO, but I believe it's not the root issue.


1:58 am on Aug 23, 2011 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

I fully agree. Here's how I see the connection. The issue with large websites is having enough PR to make it seem worthwhile to Google - that, and having a good link structure. The problem with canonical errors is that they sap PageRank, and they cause googlebot to give up indexing.

That's why I added a post about canonical advice. If you've got a plug-in, adding canonical tags can be a relatively easy thing to do - and then you've got easy insurance against one potential area of nasty trouble.
