With a wry smile, I look back at the last six months to a year with one of our clients, where we got it hopelessly wrong.
Here's the scenario:
The client - Markets multiple products in various overlapping industries. Their product range and services scream multiple websites aimed at individual niche markets. The ideal project really!
We built a range of sites all aimed at specific markets. No spam here. Each site has industry and market specific info. All content is driven out of the same content management system. Via a web interface, one can easily manage where the product appears within each site from the same system. No way for a spider to pick up that the content is driven from the same system (all appears to be published statically).
We have made 2 mistakes:
1) Duplicate content within a site. On some of the sites, we displayed the same product range within multiple categories.
2) Duplicate content across websites. Where appropriate, complementary product has been listed in the product sections of the respective sites.
This subject has been covered ad nauseam. First with Altavista and their patent-pending link dupe checker, and more recently with Google. The million $ question has been: at what point do Google and some of the other search engines consider sites' content to be sufficiently dissimilar to avoid penalty? This time, we were on the wrong side of right and have been penalised accordingly :(
Diversity in online markets doesn't necessarily mean product/service diversity, and this can lead to duplicate content.
On pages that are (IMO) less than 20% dissimilar, you do have two courses of action to avoid Google's wrath.
Crosslinking - can go bad if you overdo it, but there are significant benefits, particularly when used to avoid duplication.
An example: news releases. As all sites are owned by the same client the corporate news is likely to be the same in all sites.
Only one site should retain the news pages, and all other sites should link to those pages. If the design layout of each individual site is the same, the graphical imagery of the interface (shell) can also carry the appropriate design by using multiple cascading style sheets on the primary web pages, serving up the appropriate style based on where the request is coming from (one domain or another). If design integrity (look and feel) is not that important, this makes your job easier.
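To make the per-domain styling idea concrete, here is a minimal sketch of choosing a stylesheet from the request's Host header. The domain names and CSS paths are hypothetical placeholders, not from any real client setup:

```python
# Pick a stylesheet based on the requesting domain, so the same shared
# page shell can carry each site's own look and feel.
# Domains and paths below are hypothetical examples.

STYLESHEETS = {
    "www.acme-widgets.example": "/css/widgets.css",
    "www.acme-gadgets.example": "/css/gadgets.css",
}

DEFAULT_STYLESHEET = "/css/default.css"

def stylesheet_for(host: str) -> str:
    """Return the CSS file to serve for the given Host header.

    Normalises case and strips any :port suffix before the lookup.
    """
    domain = host.lower().split(":")[0]
    return STYLESHEETS.get(domain, DEFAULT_STYLESHEET)
```

The page template then emits a single `<link rel="stylesheet">` pointing at whatever `stylesheet_for()` returns, so every domain shares one set of templates while keeping its own branding.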
Unique elements, tags, and attributes in each domain also vary the content, and this can actually account for close to 10%, depending on how innovative you are.
One example used for a client (earth science) is Plate Tectonics and Tectonic Plates.
The scientific community only uses Plate Tectonics, as this defines the actual science.
The general population, not being as scientifically savvy as scientists, commonly refers to Tectonic Plates.
Quite a nice market diversity, and a beautiful way of differentiating normally duplicate pages on different sites in the text content, alt tags, titles (elements and attributes), as well as file names (images, pages, and objects) and directory names.
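A simple sketch of how that keyword variation can be applied across otherwise-duplicate pages: a per-domain lookup substitutes the phrase each audience actually uses into titles, alt text, and so on. The domains and template strings are hypothetical illustrations of the Plate Tectonics example, not the actual client's setup:

```python
# Vary otherwise-duplicate page text per domain by swapping in the
# keyword variant each audience uses (scientific vs. general public).
# Domain names and phrasing are hypothetical examples.

VARIANTS = {
    "science-site.example": "Plate Tectonics",    # scientific audience
    "consumer-site.example": "Tectonic Plates",   # general public
}

def render(template: str, host: str) -> str:
    """Fill the {keyword} slot with the variant for this domain."""
    keyword = VARIANTS.get(host, "Plate Tectonics")
    return template.format(keyword=keyword)

# The same template yields different titles per site:
title = render("{keyword} Explained | Earth Science Guide",
               "consumer-site.example")
```

The same substitution can be applied to alt attributes, file names, and directory names, which is where the "close to 10%" variation mentioned above comes from.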
In this case it is good to remember that "content" is not reserved for just "text".