
Home / Forums Index / Marketing and Biz Dev / SEM Research Topics
Forum Library, Charter, Moderators: phranque

SEM Research Topics Forum

Making mistakes in SEO
keeping the ball rolling when you screw up

10+ Year Member

Msg#: 411 posted 8:46 pm on Nov 15, 2002 (gmt 0)

We built a site for a client and, instead of charging for the design, build and optimisation of the site, we went the revenue-sharing route – we get 8% on any bookings made from that site.
It's a travel site selling tours of Southern Africa. Many of our fee-based clients do exactly this, and we were confident that we could crack it…

I made several mistakes with the optimisation that really cost us.
I'd like to think that we've fixed all of the mistakes, and I want to lay them out here and maybe get some thoughts on how you guys deal with clients, and with identifying your own mistakes, in a case like this.

As the company operated in several countries, we had a section on each country. Many of the tours sold went through several countries so we had those itineraries being duplicated in each of the countries.
For example tour 1 went through Namibia, Botswana and Zimbabwe.

In each of those country folders we had:

This created a lot of duplicate pages.
The problem was – from the beginning we suspected this would run foul of the engines, but we failed to act on it, thinking there were enough non-duplicated pages that we'd be OK.

We did in fact get a top-ten listing on AltaVista for a while and then got dumped. At the time we weren't too concerned about not being listed in AV, because we were waiting for the big guy, Google, to pick the site up. Another mistake – I did not document the exact day I noticed we were no longer in AV and go back to the log files and ANALYSE!
Added to this, we got a good listing in the Yahoo! directory after doing a bait and switch, and we were getting a fair amount of traffic from Yahoo! for our main keyword phrase. Now that Yahoo! is no longer displaying its directory straight up, traffic from there has dried up.

In order to get traffic we tried buying a few terms on Overture, and this has been quite successful. We've also gone on a huge reciprocal-links campaign, and that too is producing traffic.

As for the client, he has also been pretty cool about it all. I've played open cards with him from the beginning, and while I know he is frustrated, I've been doing a good job of keeping him patient.
It has been easier to keep him happy because he has never paid a fee for the site or the optimisation.

So what I've learnt from this is:
- Duplicate content is absolute poison; if there is ANY doubt in your mind, get rid of it.
- If things go bad, obtaining reciprocal links – while it is slog work and frustrating – is a good way to substitute for or complement search engine traffic.
- Even on a limited budget, you can get good traffic from PPC engines.
- Document everything! As soon as you think something is going down, MAKE time to analyse your notes.
- Google is all powerful…
- The wheels of the web turn slowly.
- Be honest with clients.
- Spend more time in these forums!



WebmasterWorld Senior Member marcia

Msg#: 411 posted 9:43 pm on Nov 15, 2002 (gmt 0)

Thanks so much for sharing this with us, clickthinker. A lot of us worry about cross-linking and running into problems with duplicate content. It can be serious, and it's so easy to do accidentally.

I'm a bit concerned because on one site I've just started working on, they moved all the product pages from a /directory/ to the root. It looks like the pages were just replicated, so now Google has both sets – a duplicate of each page in the index. I had no inkling until I checked allinurl:, and it's a tense wait until the next update.

Was there anything in the linking structure you can think of that you'd avoid if you had to do this over again?

Also, have you figured out a percentage of similarity that would have tagged pages as duplicates?


10+ Year Member

Msg#: 411 posted 8:44 am on Nov 16, 2002 (gmt 0)

Hey Marcia.
I don't have an exact figure for what percentage of pages were duplicated – I would guess around 10–15% of the entire site.

One thing I would do over though – I would keep all content that is relevant to various sections of a site in its own directory and not repeat it in each of the relevant sections. And then from that directory, link to each of the other sections...


WebmasterWorld Senior Member marcia

Msg#: 411 posted 9:10 am on Nov 16, 2002 (gmt 0)

Thanks, clickthinker. That's not really too many.

I was wondering: when Page A is considered a duplicate of Page B, how identical are those two pages to each other? 100% alike, 50% alike, or three-quarters of one page just like the other? We're always wondering how much of a page has to match another page for the two to be considered duplicates of each other.
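As an aside on how such a percentage could even be measured: near-duplicate detection is commonly described in terms of "shingling" – breaking each page into overlapping word sequences and comparing the overlap between the two sets. The sketch below is purely illustrative (the function names `shingles` and `similarity` are mine, and no engine has published its actual threshold); it just shows how a 0–100% similarity score between two pages can be computed.

```python
def shingles(text, k=4):
    """Break text into the set of overlapping k-word sequences ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(page_a, page_b, k=4):
    """Jaccard similarity of two pages' shingle sets, from 0.0 to 1.0."""
    a, b = shingles(page_a, k), shingles(page_b, k)
    if not a and not b:
        return 1.0  # two empty pages are trivially identical
    return len(a & b) / len(a | b)
```

Two itinerary pages that differ only in a heading would score very close to 1.0 under a measure like this, which is consistent with identical db-generated pages being flagged no matter which directory serves them.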


WebmasterWorld Administrator martinibuster

Msg#: 411 posted 9:17 am on Nov 16, 2002 (gmt 0)

And then from that directory, link to each of the other sections...

Just a gentle note: not sure how big your site is... but that can be a lot of links at the bottom of a lot of pages. A site map is waaaaaaaay more efficient. ;)

Welcome to Webmasterworld.


WebmasterWorld Senior Member fathom

Msg#: 411 posted 9:23 am on Nov 16, 2002 (gmt 0)

There are a couple of quick strategies for reducing the chance of a duplicate-content penalty.

1. Develop sub-domains where heavy duplication exists. This would be relatively easy if the site is developed in separate directories: www.example.com/south_africa/ and www.example.com/congo/

A sub-domain turns those URL references into south_africa.example.com/ and congo.example.com/. Now the duplication is in fact spread across different sites (as far as a search engine spider is concerned), and each sub-domain externally references the others.
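The directory-to-sub-domain move can be pictured as a simple URL rewrite. The helper below is only a sketch of the mapping (`to_subdomain` is a hypothetical name, and `example.com` is the placeholder from the post); in practice the move also needs DNS entries for each sub-domain plus rewrite rules in the web server itself.

```python
from urllib.parse import urlsplit, urlunsplit

def to_subdomain(url):
    """Rewrite example.com/<section>/... as <section>.example.com/...

    Mirrors the move described above: the first path directory is
    hoisted into the hostname, so www.example.com/south_africa/
    becomes south_africa.example.com/.
    """
    scheme, host, path, query, frag = urlsplit(url)
    if host.startswith("www."):
        host = host[4:]  # drop the www. prefix before prepending the section
    section, _, rest = path.lstrip("/").partition("/")
    if not section:
        return url  # no directory to hoist; leave the root URL alone
    return urlunsplit((scheme, f"{section}.{host}", "/" + rest, query, frag))
```

Each country's pages then live on what a spider treats as a separate host, which is the whole point of the strategy.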

2. Text context – even though the itineraries may be identical, the overall page themes are different (one is "South Africa", the other "Congo"). Wherever possible, get these distinctions into the text, links, image alt tags and titles.

3. Keyphrase targeting – you are targeting a very special market, in that there is a split between keywords: Vacation vs. Travel vs. Destinations, Discount vs. Packages, and many, many more.

Target specific keywords and phrases in different sub-domains:

Vacation Travel
Discount Travel
Destination Packages
Discount Packages

and match these up with the specific locations.

Using these methods will reduce duplicated content, particularly in the search terms themselves.

4. Because the sub-domains make each section external (a different site) to the others, cross-linking should cover only related information. Congo and South Africa content are lower-level pages in your site hierarchy, so direct links between them shouldn't exist; at the same time, your primary site is about location itineraries, so visitors can still move back and forth between locations.


WebmasterWorld Senior Member marcia

Msg#: 411 posted 9:42 am on Nov 16, 2002 (gmt 0)

Were the pages all in the same directory? How was the file structure set up?

I would keep all content that is relevant to various sections of a site in its own directory and not repeat it in each of the relevant sections. And then from that directory, link to each of the other sections...

clickthinker, are you saying you'd only link the index pages of the different directories to each other? And not cross-link between pages within the directories?

I'm trying to follow your train of thought closely here to get a clear picture of what your navigation was like, and how you'd do it, since you know the details of the site. I think it would help to examine your situation a little closer so we can know to avoid it.


10+ Year Member

Msg#: 411 posted 10:45 am on Nov 16, 2002 (gmt 0)

Marcia – I've been taking a look to see what percentage of the site was made up of duplicate pages, and it's probably more than I initially thought...
You asked when Page A is considered a duplicate of Page B; this is how we worked it:
We had different types of tour for people to choose from, eg:

Popular tours
Adventure tours
Camping tours

Our navigation was as follows: we have a site-wide header include with links to the index pages of the top-level sections, eg, Tours, Accommodation, Contact Us.
And a left-hand site-wide include going to each of the country pages and to the index page of each type of tour offered.
Our problem of duplicate content came in when you get to:
If a tour went through SA, Namibia and Botswana you could get the same content by going:


and the same for:

As these pages were all built out of a db, they were all exactly the same; the only change was the URL.

What we've done now is have a tour directory, and no matter where a person comes from within the site, they get the same page:
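The fix described above – one tour directory, one URL per tour regardless of which country section the visitor browsed through – can be sketched as follows. The function names and URL pattern are illustrative (the actual site was database-driven and its paths were elided from the post); the point is only the before/after contrast.

```python
# Before the fix: the same db record was served under every country
# folder the tour passed through, creating one duplicate per country.
def duplicated_urls(tour_id, countries):
    return [f"/{country}/tour{tour_id}.html" for country in countries]

# After the fix: every country page links to a single tour directory,
# so each tour exists at exactly one URL.
def canonical_tour_url(tour_id):
    return f"/tours/tour{tour_id}.html"
```

A tour through three countries goes from three identical pages down to one, which removes the duplication at the source rather than trying to disguise it.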

martinibuster – thanks for the tip about a site map!


10+ Year Member

Msg#: 411 posted 2:32 pm on Nov 21, 2002 (gmt 0)

Hi All,

Clickthinker, I'll try to summarize your situation - shout if I get it wrong:

1) You have a range of products (Tour 1, Tour 2).
2) You have a number of categories (Destination 1, Destination 2).
3) Often, the same products overlap across the various categories.

As a result, a large percentage of the product pages are identical (regardless of which category they fall in) or as close as butter is to margarine.

Regarding your experiences with:
Altavista.com – How similar is the link structure on these pages? I would bet that you were featured and then dropped because the link structure from one page to another is what AltaVista would regard as too close for comfort; hence you lost your listing.

Google.com – How many pages, if any, has Google listed? This is circumstantial, but I strongly believe that Google has a method whereby they check a collection of the pages on a site to determine duplication and then penalise the site. I think this may have happened to you. So it's more than one page against another – it's patterns across a number of pages.

I'd be interested to get more feedback from you and some of the others. There have been loads of discussions regarding duplicate content, but I am still not sure whether we have worked out what counts as sufficiently dissimilar to avoid a penalty.
