
Google Penalties: applied to specific keywords?

2:44 pm on Mar 9, 2004 (gmt 0)

Full Member

10+ Year Member

joined:Sept 13, 2003
votes: 0

I had a site, say widgetadjectiveinfo.com, which has one main link to it from another site of mine with PR7. The page linking to it isn't closely related content-wise. Pre-Florida it ranked around #8 for a search on "widget adjective", but afterwards it disappeared completely. However, it is still ranked #1 for "widget adjective info" and is still being crawled by Google.

This has led me to believe that Google's optimisation penalty only applies to certain keywords, or that my site has been banned/penalised for certain keywords but not others.

Would I be better off registering a new domain name and taking down the first? In other words, post-Florida, were domain names blacklisted for keywords, or is this optimisation penalty dynamic, meaning I'll have to rework my site?

11:45 am on Mar 10, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member ciml is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:June 22, 2001
votes: 2

Yes, I think you've described it quite well. These keyphrase penalties are making a significant difference to some sites.

Changing the domain (even just the URLs by some accounts) may help in the very short term, but if you replicate the pattern it will presumably be caught again.

4:40 pm on Mar 10, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:May 6, 2002
votes: 0

Didn't GG say that it's not a penalty, just a matter of factors being weighted differently in the algorithm?
5:16 pm on Mar 10, 2004 (gmt 0)

New User

10+ Year Member

joined:Mar 5, 2003
votes: 0

That cannot be completely true... if what GG said were completely true, would a site at #1 fall out of the top 1000 for a very specific target-market phrase? NO! If what GG said were true, the site might fall to 100, or 300, or even 500. 500 would be better than completely gone.

This is something I saw and monitored since Florida. Just about anyone could get into the top 20 for this phrase if they had a PR of 4. The previous #1, which is by far the best (information-wise) website on the topic, was taken out of the top 1000 after being #1 for over 7 months. How can GG's explanation justify this?

Then, after Austin, the site started to pop up into the top 10, then the top 3, just when everyone was saying how Austin released the conspiratorial penalty factor.

5:27 pm on Mar 10, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 17, 2003
votes: 0

Penalising certain keywords is not a new thing; it started in July, was pulled back in August/September, and came back again later.

In July, many people thought Google was broken. They shouted, yelled and blamed Google on WebmasterWorld. It looks like the campaign worked and Google pulled the penalty (or 'filter' or 'algo', to make some people happy).

Webmasters were happy again. But implementing a new penalty (or filter, or algo) at the GooglePlex was meant to build a better Google. REMEMBER THIS. The GooglePlex guys and girls are paid to make a better Google.

But no one cared about the objective of the new work! There MUST BE reasons why the Plex wanted to implement it in the first place, even though it sucked when it first came out.

- Too much information about Google SEO.
- Too many members joining WebmasterWorld every day.
- Too many searches related to Google SEO returning a WebmasterWorld Forum thread in the SERPs (more members).
- Too many SEO sites being born every day, focusing on Google after all the other engines fell.
- Too many ebooks, free or paid, available online.

Consequence: even KIDS know how to do SEO for Google!

So what happened then? Many people got their sites into the top 10 of Google's SERPs. They were happy. Me too. I am one of the KIDS who learned SEO from WebmasterWorld.

The two main tactics we used were (especially the first one):
1. Link exchange
2. Buying links from high-PR sites

So what happened? Developers found an opportunity to make money. They developed tools to make link exchange easier. Some automate the whole link-exchange process and even host your links page! They even crawl your partner sites to see if they link back. I hate this bot and blocked it.

There is other software that crawls sites to find link-exchange partners for you, taking the title and meta description as the link info. It even shows ratings, backlinks... and will rank your site higher if you link back to them! And there are a few books that promote this software. I receive so many emails with the same style and template that I know exactly when people are using this tool.

Too many people are trying to manipulate the SERPs. And it is TOO EASY to MANIPULATE Google. If Google wants to survive and not end up like AltaVista, they need to do something! They need to change. And that is why a sucky algo/filter/penalty came out in July.

The July algo acted this way:
1. It scans for anchor text, and if too many identical anchor texts match the optimisation found on the page (keyword in the title, in the H1, in ALT text, in outbound links, many occurrences...), it triggers the penalty and your site drops out of sight for that keyword phrase.

2. The strange part is that an allinanchor: search for the same keyword phrase still returns you HIGH, and if you add a keyword to the penalised phrase, you are unaffected!
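The trigger described above (many identical inbound anchors plus the same phrase optimised all over the page) can be sketched as a toy heuristic. To be clear, this is purely illustrative: the function, the field names, and the 60% threshold are invented for the sketch, not anything Google published.

```python
# Toy illustration of the phrase-level trigger described above -- NOT
# Google's actual algorithm. The threshold and field names are made up.
from collections import Counter

def phrase_penalty_triggered(anchor_texts, page_signals, phrase,
                             anchor_share_threshold=0.6):
    """Flag a phrase when most inbound anchors repeat it AND the page
    is optimised for it in several places at once."""
    anchors = Counter(a.lower() for a in anchor_texts)
    total = sum(anchors.values())
    if total == 0:
        return False
    anchor_share = anchors[phrase.lower()] / total

    # On-page optimisation signals: title, H1, ALT text, outbound anchors.
    signal_hits = sum(
        phrase.lower() in page_signals.get(field, "").lower()
        for field in ("title", "h1", "alt", "outbound_anchors")
    )
    return anchor_share >= anchor_share_threshold and signal_hits >= 3

# A site whose inbound links and on-page text all repeat "widget adjective":
links = ["widget adjective"] * 9 + ["click here"]
page = {"title": "Widget Adjective", "h1": "Widget Adjective",
        "alt": "widget adjective photo", "outbound_anchors": "widget adjective"}
print(phrase_penalty_triggered(links, page, "widget adjective"))       # True
print(phrase_penalty_triggered(links, page, "widget adjective info"))  # False
```

Note how this matches the original poster's symptom: the 2-word phrase trips the flag, while the longer 3-word phrase (which the inbound anchors do not repeat) sails through untouched.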

The algo was not good and it harmed too many innocent sites. It took too few things into consideration (it was not sophisticated enough). So Google pulled it out. Webmasters became happy again. But Google's motives and objectives remained. Instead of giving up, they actually worked harder on it, trying to include more factors, more details, more circumstances to feed the new algo.

Then a mutant came out in Florida, with the same motive and objective: to fight and control MANIPULATION. The algo improved a lot over the July algo, but still, it sucked.

It has been improved update after update, until Brandy, where we start to see the shape of a better (though not perfect) algo. The KIDS' SEO strategies that used to work well no longer work that well: link-exchange automation tools, buying links, interlinking sites, multiple domains, subdomains...

The algo has also patched up some 'bugs' to stop us checking for the 'penalty': allinanchor: now returns positions similar to the normal search, the "keyword -asdasdasd" tactic has been removed, and so on.

But what I want to say is that Google is still VERY, VERY kind to us. They didn't implement a FULL penalty that removes your whole site; instead, they try to catch which keyword you are playing with and only kick its ass.

If you want to succeed with Google, the KIDS' strategies no longer work well. Old books need to be rewritten. Old Google SEO information needs to be updated. New SEO masters are needed.

Don't blame Google, blame SEOs. Blame the KIDS who succeeded at SEO. :)

5:35 pm on Mar 10, 2004 (gmt 0)

Preferred Member

10+ Year Member

joined:Dec 7, 2001
votes: 0

What you are seeing is an OOP ("Over-Optimised Page") loading factor.

The 2-word search phrase will have a different density and distribution pattern from the 3-word search phrase.

On just about every page, you will find that the 2-word phrase is more prevalent than the same phrase with an additional word (at either end) that makes up a related 3-word search phrase.

This is a common problem now for many pages that incur a "selective" loading factor for over-dense search terms (usually 2-word phrases) but perform well for secondary search terms (usually 3 words or more).

When judging density and distribution to avoid OOP, there is the further complication of each individual word's scattering around the page to take into account.
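The density/distribution point can be made concrete with a small sketch. It counts exact occurrences of a phrase and the share of the page's words the phrase covers; the example text and any implied thresholds are illustrative only, not known Google values.

```python
# Sketch of the density/distribution idea above. Counts are exact-phrase
# matches; "density" is the fraction of page words covered by the phrase.
import re

def phrase_stats(text, phrase):
    words = re.findall(r"[a-z']+", text.lower())
    p = phrase.lower().split()
    n = len(p)
    # Positions (word indices) where the phrase occurs -- its "distribution".
    positions = [i for i in range(len(words) - n + 1)
                 if words[i:i + n] == p]
    density = len(positions) * n / len(words) if words else 0.0
    return len(positions), round(density, 3), positions

text = ("Widget adjective news. Our widget adjective info page covers "
        "widget adjective info and more widget adjective tips for everyone.")
print(phrase_stats(text, "widget adjective"))       # 2-word phrase: 4 hits, denser
print(phrase_stats(text, "widget adjective info"))  # 3-word phrase: 2 hits, sparser
```

As the post predicts, the 2-word phrase is necessarily at least as frequent as any 3-word phrase that contains it, so it is the one more likely to cross an over-density line.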

The Yahoo/Inktomi/MSN combination becomes more attractive and shows a better effort:return ratio as the months go by. The Google bubble has not yet burst, but...

6:36 pm on Mar 10, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member ciml is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:June 22, 2001
votes: 2

AthlonInside, that is by far the best description I've seen.
6:55 pm on Mar 10, 2004 (gmt 0)

New User

10+ Year Member

joined:June 8, 2003
votes: 0


1. Lower your keyword density for the two-word phrase: once in the title, a couple of times in the text.

2. Increase the density of words Google thinks are related (search "kw1 ~kw2" and "kw2 ~kw1" to see which words Google considers related).

3. Link out to a couple of related authority sites from your home page, preferably using targeted keywords.

This technique seemed to work well for me, anyway.
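Step 1 of that recipe is mechanical enough to check automatically. Here is a tiny sketch; the function name and the "at most 3 body uses" cutoff are just this poster's rule of thumb turned into code, nothing official.

```python
# Toy check for step 1 above: phrase exactly once in the title, and only
# a few times in the body text. The cutoff is a rule of thumb, not a
# published Google threshold.
def follows_rule_one(title, body, phrase, max_body_uses=3):
    t = title.lower().count(phrase.lower())
    b = body.lower().count(phrase.lower())
    return t == 1 and b <= max_body_uses

print(follows_rule_one("Widget Adjective Guide",
                       "All about widget adjective care. "
                       "A widget adjective lasts for years.",
                       "widget adjective"))  # True
```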