I made several mistakes with the optimisation that really cost us.
I’d like to think that we’ve fixed all of the mistakes and I want to lay them out here and maybe get some thoughts on how you guys deal with clients and identifying mistakes you make in a case like this.
As the company operated in several countries, we had a section for each country. Many of the tours we sold went through several countries, so the same itineraries were duplicated in each country's section.
For example tour 1 went through Namibia, Botswana and Zimbabwe.
In each of those country folders we had:
This created a lot of duplicate pages.
The problem was – from the beginning we suspected this might run foul of the engines, but we failed to act on it, thinking there were enough non-duplicated pages that we'd be OK.
We did in fact get a top ten listing on AltaVista for a while and then got dumped. At the time we weren't too concerned about not being listed in AV, because we were waiting for the big guy, Google, to pick the site up. Another mistake – I did not document the exact day I noticed we were no longer in AV and go back to the log files and ANALYSE!
Added to this, we got a good listing in the Yahoo! directory after doing a bait and switch, and we were getting a fair amount of traffic from Yahoo! for our main keyword phrase. Now that Yahoo! is no longer displaying its directory straight up, traffic from there has dried up.
In order to get traffic we tried buying a few terms on Overture, and this has been quite successful. We've also gone on a huge reciprocal links campaign, and that too is producing traffic.
As for the client, he has also been pretty cool about it all. I’ve played open cards with him from the beginning and while I know he is frustrated, I’ve been doing a good job of keeping him patient.
It has been made easier to keep him happy, as he has never paid a fee for the site and optimisation.
So what I’ve learnt from this is:
- Duplicate content is absolute poison; if there is ANY doubt in your mind, get rid of it.
- If things go bad, obtaining reciprocal links, while it is slog work and frustrating, is a good way to substitute for/complement search engine traffic.
- Even on a limited budget, you can get good traffic from PPC engines
- Document everything! As soon as you think something is going down, MAKE time to analyse your notes.
- Google is all powerful…
- The wheels of the web turn slowly
- Be honest with clients
- Spend more time in these forums!
I'm a bit concerned because on a site I've just started working on, they moved all the product pages from a /directory/ to the root. It looks like the pages were simply replicated, so now Google has both sets – a duplicate of each page in the index. I had no inkling until I checked allinurl:, and it's a tense wait until the next update.
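If the old /directory/ copies aren't needed any more, one common fix is to 301-redirect the old paths onto the new root copies so the engines fold the two sets into one listing. A sketch only – it assumes an Apache server with mod_rewrite enabled, and "directory" here is just a stand-in for the real folder name:

```apache
# .htaccess in the site root (assumes Apache + mod_rewrite;
# "directory" is a placeholder for the actual folder)
RewriteEngine On

# Permanently redirect every page under /directory/ to the same
# filename in the root, collapsing the duplicate set into one.
RewriteRule ^directory/(.+)$ /$1 [R=301,L]
```

A 301 (rather than a 302) tells the spider the move is permanent, so the old URLs should eventually drop out of the index instead of sitting there as duplicates.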
Was there anything in the linking structure you can think of that you'd avoid if you had to do this over again?
Also, have you figured out a percentage of similarity that would have tagged pages as duplicates?
One thing I would do over though - I would keep all content that is relevant to various sections of a site in its own directory and not repeat it in each of the relevant sections. And then from that directory, link to each of the other sections...
I was wondering: when Page A is considered a duplicate of Page B, what percentage of the two pages is identical? Are they 100% alike, 50% alike, or is 3/4 of one page just like the other? We're always wondering how much of a page has to match another page for them to be considered duplicates of each other.
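Nobody outside the engines knows the exact threshold, but you can at least measure your own pages before they do. A rough sketch in Python using only the standard library – the engines' real duplicate detection (shingling, fingerprinting, whatever it is) is proprietary, so treat this ratio as a sanity check on your own content, not as their metric:

```python
# Sketch: estimate how similar two pages are with difflib from the
# Python standard library. The tour itinerary text and page names
# below are made-up examples, not real pages.
from difflib import SequenceMatcher

def similarity(page_a: str, page_b: str) -> float:
    """Return a similarity ratio between 0.0 (nothing shared)
    and 1.0 (identical) for two page texts."""
    return SequenceMatcher(None, page_a, page_b).ratio()

# Two country pages sharing the same db-generated itinerary,
# differing only in their heading.
itinerary = "Day 1: Windhoek. Day 2: Etosha. Day 3: Okavango Delta."
page_a = "Namibia Tours\n" + itinerary
page_b = "Botswana Tours\n" + itinerary

print(f"{similarity(page_a, page_b):.0%}")
```

In practice you'd strip the HTML and shared navigation templates out first, since a site-wide header and menu will inflate the ratio for every pair of pages on the site.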
And then from that directory, link to each of the other sections...
Just a gentle note: not sure how big your site is... but that can be a lot of links at the bottom of a lot of pages. A site map is waaaaaaaay more efficient. ;)
Welcome to Webmasterworld.
1. Develop sub-domains where heavy duplication exists. This would be relatively easy if the site is developed in separate directories, e.g. www.example.com/south_africa/ and www.example.com/congo/.
A sub-domain turns those URL references into south_africa.example.com/ and congo.example.com/. Now the duplication is, in effect, spread across different sites (as far as a search engine spider is concerned), and each sub-domain externally references the others.
2. Text content - even though the itineraries may be identical, the overall page themes are different (one is "South Africa", the other "Congo"). Wherever possible get these distinctions into the text, links, image alt tags and titles.
3. Keyphrase targeting - you are targeting a very special market, in that there is a split between keywords Vacation vs. Travel vs. Destinations, Discount vs Packages and many, many more.
Target specific keywords and phrases in different sub-domains, and match these up with the specific locations.
Using these methods will reduce duplicated content, particularly with regard to search terms.
4. Because the sub-domains make each section external (a different site) to the others, cross-linking should cover only related information. Congo and South Africa content are lower-level pages in your site hierarchy, so links between them shouldn't exist; at the same time, your "primary site" is about location itineraries, so visitors can still move back and forth between locations.
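Point 1 above could be rolled out without breaking existing links or bookmarks by redirecting the old directory URLs to the new sub-domains. Another sketch, assuming Apache with mod_rewrite and that DNS for the sub-domains is already in place (example.com and the folder names are taken from the post above; your real names will differ):

```apache
# .htaccess on www.example.com (assumes mod_rewrite and that the
# sub-domains already resolve)
RewriteEngine On

# Send the old directory URLs to their new sub-domain homes.
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^south_africa/(.*)$ http://south_africa.example.com/$1 [R=301,L]

RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^congo/(.*)$ http://congo.example.com/$1 [R=301,L]
```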
I would keep all content that is relevant to various sections of a site in its own directory and not repeat it in each of the relevant sections. And then from that directory, link to each of the other sections...
clickthinker, are you saying you'd only link the index pages of the different directories to each other? And not cross-link between pages within the directories?
I'm trying to follow your train of thought closely here to get a clear picture of what your navigation was like and how you'd do it, since you know the details of the site. I think it would help to examine your situation a little more closely so we know how to avoid it.
Our navigation was as follows - we had a site-wide header include with links to the index pages of the top-level sections, e.g. Tours, Accommodation, Contact Us.
And a left-hand site-wide include linking to each of the country pages and to the index page of each type of tour offered.
Our problem of duplicate content came in when you get to:
If a tour went through SA, Namibia and Botswana you could get the same content by going:
and the same for:
As these pages were all built out of a db, they were all exactly the same; the only change was the URL.
What we've done now is have a tour directory, and no matter where a person comes from within the site, they get the same page:
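In code terms that fix boils down to generating one canonical URL per tour, no matter which country section the visitor navigated through. A hypothetical sketch – the function names and path scheme here are mine, not clickthinker's actual code:

```python
# Hypothetical sketch: every db-driven tour page lives at exactly one
# canonical path, so Namibia's and Botswana's sections link to the
# same URL instead of serving their own copies of the itinerary.
def tour_url(tour_id: int) -> str:
    """One canonical path per tour under /tours/."""
    return f"/tours/tour_{tour_id}.html"

def country_page_links(tour_ids: list[int]) -> list[str]:
    """Country pages emit links to the canonical tour pages rather
    than rendering duplicate itinerary pages of their own."""
    return [tour_url(t) for t in tour_ids]

# Both the Namibia and Botswana sections link to the same page for
# tour 1, so the itinerary exists at a single URL in the index.
print(country_page_links([1, 2]))
```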
martinibuster - thanks for the site map tip!
Clickthinker, I'll try to summarize your situation - shout if I get it wrong:
1) You have a range of products (Tour 1, Tour 2).
2) You have a number of categories (Destination 1, Destination 2).
3) Often, the same products overlap in the various categories.
As a result, a large percentage of the product pages are identical (regardless of which category they fall in) or as close as butter is to margarine.
Regarding your experiences with:
Altavista.com - How similar is the link structure on these pages? I would bet that you were featured and then dropped because the link structure from one page to another is what AltaVista would regard as too close for comfort; hence you lost your listing.
Google.com - How many pages have been listed in Google, if any? This is circumstantial, but I strongly believe that Google has a method whereby they check a collection of the pages on a site to determine duplication and then penalise the site. I think this may have happened to you. So it's more than one page against another - it's patterns across a number of pages.
I'd be interested to get more feedback from you and some of the others. There have been loads of discussions regarding duplicate content, but I am still not sure whether we have found out what counts as being sufficiently dissimilar to avoid a penalty.