If the product is totally gone and there is no exact or near replacement we often leave the page up for a while with a note about product no longer available, and a link to something else that might be suitable.
I don't agree with single entry point either.
How can a search engine judge which page is the most important if they all only have 1 on-site link? Amongst other things a crawler sees links and anchor text - not the 100x100 px logo that says "specials" or whatever...
Obviously off-site linking will help but our experience is that you need on-site link weight to key pages.
I've used Google Mini and its algo, as you'd expect, awards pagerank to those pages on the site that have the most links. That's the only way it can judge which pages are most important (assuming they have similar content...).
Google Search has to use the same algo but obviously adds external links to the equation.
I have a site with 200k pages and even though our off-site and on-site linking is correctly weighted, Google gets it wrong sometimes. Don't get me started on Yahoo, which can't handle 301s properly, or MSN, which just removed the site from its index because of the 301s, even though it was #1 for a very competitive search term for a year...
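For anyone following along, a permanent redirect is normally a one-liner in Apache config. This is a generic sketch, not the poster's actual setup — the paths are made up:

```apache
# Hypothetical example: permanently (301) redirect a retired product URL
# to its replacement so engines can transfer the old listing.
Redirect 301 /products/old-widget.html /products/new-widget.html
```

The `Redirect` directive comes from mod_alias; the 301 status is what tells the engine the move is permanent rather than temporary.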
Before anyone thinks you can learn Google secrets by buying a Google Mini - don't. It's good at what it does but a bag of Google tips it ain't (again, as you'd expect).
for extinct products, we keep the pages live but exclude them from the site's pr sphere of influence. the benefit of keeping these pages live is:
1. page continues to attract traffic
2. inbound links pointed at the product remain a source of link support
3. thematic content...the more the merrier
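The poster doesn't say how they take these pages out of the PR flow, but one common way to do it while leaving the page live (an assumption, not their stated method) is to nofollow the internal links pointing at it:

```html
<!-- Hypothetical markup: the page stays live and keeps its inbound links,
     but this internal link passes no PageRank to it. -->
<a href="/products/discontinued-widget.html" rel="nofollow">Old widget (discontinued)</a>
```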
Amongst other things a crawler sees links and anchor text - not the 100x100 px logo that says "specials" or whatever.
Actually, if that image is linked properly, the crawler sees the link just fine. What it doesn't see is anchor text. But, if you use the alt attribute on a linked image, you get a bit of play. ;)
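For example, a linked image with an alt attribute might look like this (the filenames here are made up):

```html
<!-- The crawler follows the link just fine; the alt text stands in
     for the anchor text it can't read off the image. -->
<a href="/specials.html"><img src="/img/specials.gif" alt="Weekly specials" width="100" height="100"></a>
```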
Optimizing an Amazon Store for Google [webmasterworld.com]
Writing many product descriptions without duplicating content
I just started doing some SEO work for an ecommerce site with about 15000 unique widgets in its inventory. A large percentage (I haven't been able to determine exactly what percentage--anywhere from 25-75%) of the product descriptions are written from a set of about thirty different templates. I'm concerned that these may be interpreted as duplicate content by such interested parties as Google, and I'm wondering what sort of strategies any of you have used to create variety in product descriptions without spending millions of hours starting from scratch with each product.
What prompted my concern is that we just got hit with the minus-30 penalty (happy day for everyone). I don't believe that this particular issue was the only reason we got hit (it may not have even been the biggest issue--I inherited some less than brilliant SEO), but I figure that in the interests of leaving no stone unturned, I ought to start working on it.
[edited by: tedster at 1:17 am (utc) on Dec. 7, 2006]
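One way to pin down that unknown percentage, rather than guessing 25-75%: compare the descriptions pairwise and score their overlap. This is a generic sketch (the template and descriptions are invented for illustration), using word-shingle Jaccard similarity — descriptions spun from the same template should score high:

```python
# Sketch: estimate how similar two product descriptions are using
# word-shingle Jaccard similarity. Template-generated copy scores high;
# independently written copy scores near zero.

def shingles(text, k=3):
    """Return the set of k-word shingles in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def jaccard(a, b):
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Invented examples: two descriptions from one template, one hand-written.
template = "The {} widget is a durable, high-quality tool for every workshop."
d1 = template.format("red")
d2 = template.format("blue")
d3 = "Hand-forged in small batches, this widget ships with a lifetime warranty."

print(jaccard(d1, d2))  # templated pair: high similarity
print(jaccard(d1, d3))  # unrelated copy: low similarity
```

Running every pair through this (or a faster minhash variant for 15000 items) and counting pairs above some threshold would give an actual duplicate percentage instead of a guess.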
You may have a legitimate concern there, if by "30 templates" you mean only 30 distinct product descriptions. If they are true templates, however, and they fill in a significant portion of the description dynamically, then things may be OK in this department.
Just to double check, you do mean the product descriptions in the <body>, correct? Not the meta descriptions, that is.
I took some measurements of my widget pages.
my widget pages have about 50 elements per page.
about 40 of them are repetitive. the rest have a lot of similarity.
what I have done is the following:
I am reducing some of the common elements into gif's
this should help me have a better ratio of repetitive content to regular content. so my page might show 2000 characters of information of which 60% is repetitive,
instead of 2200 characters of information of which 90% is repetitive.
I've also removed some fields that I feel don't help in the serps.
I also spent some time in making my code tighter and reducing white-spaces.
my problem is that my new site has over 100,000 pages of content, and within each page about 90% of the data is repetitive information.
I hope this helps. my only fear is that I will have server performance problems, since each common element is now an extra image request to the server. I am going to analyze which images get requested most often and try to pre-load images over a few pages before visitors get to the page that needs them.
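The "analyze which images get the most requests" step could be as simple as tallying image hits out of the access log. This is a hypothetical sketch — the log lines and filenames are invented, and the parsing assumes a common-log-format request line:

```python
# Hypothetical sketch: count GIF requests in a common-log-format access
# log to find the most-fetched images, i.e. the best preload candidates.
from collections import Counter

log_lines = [
    '1.2.3.4 - - [07/Dec/2006] "GET /img/header.gif HTTP/1.1" 200 512',
    '1.2.3.4 - - [07/Dec/2006] "GET /img/specs.gif HTTP/1.1" 200 1024',
    '5.6.7.8 - - [07/Dec/2006] "GET /img/header.gif HTTP/1.1" 200 512',
]

hits = Counter(
    line.split('"')[1].split()[1]  # the request path from the quoted request line
    for line in log_lines
    if '"GET ' in line and line.split('"')[1].split()[1].endswith('.gif')
)

# Most-requested images first.
for path, count in hits.most_common():
    print(path, count)
```

In practice the list would be read from the real log file rather than hard-coded, and the top paths would be the ones worth preloading a page or two early.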