
Google News Archive Forum

This 526 message thread spans 18 pages.
What The Early Research is Showing – Florida Update 2003
an analysis and aggregate of the current post-Florida update best practices
ryanallis1
msg:183422
9:14 am on Dec 3, 2003 (gmt 0)

I would welcome any comments and discussion on the following article (all URLs and specific keywords have been removed) that analyzes the current state of the Google update and suggests certain steps to take for both webmasters and Google...

Thank you,
Ryan Allis

On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every 2 months or so, this 'Dance' seems to be more like a drunken Mexican salsa than its usual conservative fox-trot.

Most likely, you will already know if your web site has been affected: you may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. While one could understand dropping down a few positions, since November 15 the sites that previously held these rankings have been nowhere to be found in the top 10,000 rankings. Such radical repositionings have left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding it necessary to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.

What the Early Research is Showing

From what early research shows, it seems that Google has put into place what has been quickly termed in the industry as an 'Over Optimization Penalty' (OOP) that takes into account the incoming link text and the on-site keyword frequency. If too many sites that link to your site use link text containing a word that is repeated more than a certain number of times on your home page, that page will be assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense Google is penalizing sites for being optimized for the search engines--without any forewarning of a change in policy.

Here is what else we know:

- The OOP is keyword specific, not site specific. Google has selected only certain keywords to apply the OOP for.

- Certain highly competitive keywords have lost many of the listings.

How to Know if Your Site Has Been Penalized

There are a few ways to know if your site has been penalized. The first, mentioned earlier: if you noticed a significant drop in traffic around the 15th of November, you have likely been hit. Here are ways to be sure:

1. Go to google.com. Check your site logs to see which terms were bringing you search engine traffic, and type in any term you recall being well ranked for. If your site is nowhere to be found, it has likely been penalized.

2. Type in the search term you suspect you are being penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). This will bypass the OOP, and you should see what your results would otherwise be.

3. Or, simply go to www.**** to have this automated for you. Just type in the search term and see quickly what the search engine results would be if the OOP was not in effect. This site, put up less than a week ago, has quickly gained in popularity, becoming one of the 5000 most visited web sites on the Internet in a matter of days.
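The gibberish trick in step 2 amounts to comparing two top-100 lists: the normal (possibly filtered) results and the unfiltered ones. Below is a minimal Python sketch of that comparison. It is not from the original article; the `removed_results` helper and the URL lists are hypothetical, and it assumes the "-gibberish" query really does bypass the filter. In practice you would paste in the real SERP URLs by hand.

```python
# Rough sketch of the comparison behind the "-gibberish" test above.

def removed_results(normal_results, unfiltered_results, top_n=100):
    """Return the unfiltered top results that vanish from the
    normal (possibly filtered) results."""
    normal = set(normal_results[:top_n])
    return [url for url in unfiltered_results[:top_n] if url not in normal]

# Hypothetical example: 3 of the 5 unfiltered results are gone.
unfiltered = ["a.com", "b.com", "c.com", "d.com", "e.com"]
normal = ["d.com", "x.com", "e.com", "y.com", "z.com"]

missing = removed_results(normal, unfiltered)
print(len(missing))  # -> 3
print(missing)       # -> ['a.com', 'b.com', 'c.com']
```

A large `missing` list on a query where your site used to rank is the signal the article calls a penalty.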

The Basics of SEO Redefined. Should One De-Optimize?

Search engine optimization consultants such as myself have known for years that the basics of SEO are:

- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links

Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is surely no denying that many low-quality sites have used link farms and spammed blog comments in order to increase their PageRank (Google's measure of site quality) and link popularity. However, a distinction must be made between these sites and quality sites with dozens or hundreds of pages of well-written informational content that have taken the time to properly build links.

So if you have been affected, what can you do? Should one de-optimize one's site, or wait it out? Should one create one site for Google and one for the 'normal engines?' Is this a case of a filter being turned on too tight that Google will fix in a matter of days, or of something much more?

These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:

1. Contact a few of your link partners via email. Ask them to change the link text so that the keyword you have been penalized for is not in the link text, or so that the keyphrase appears in a different order than the one you are penalized for.

2. Open up the page that has been penalized (usually your home page) and reduce the number of times the keyword appears on it. Keep the count under 5 occurrences for every 100 words on the page.

3. If you are targeting a keyphrase (a multiple-word keyword) reduce the number of times that your page has the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer" change this text on your site to "web site designer in florida" and "florida-based web site design services."

It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
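The keyword-frequency rule in step 2 can be checked mechanically. Here is a minimal Python sketch (not from the original article; the `keyword_density` helper and the sample page are made up) that measures occurrences per 100 words, the metric the "under 5 per 100 words" guideline refers to:

```python
import re

def keyword_density(text, phrase):
    """Occurrences of `phrase` per 100 words of `text`.
    A crude approximation: case-insensitive, exact phrase matches
    against a simple regex-based word count."""
    words = re.findall(r"[\w'-]+", text.lower())
    if not words:
        return 0.0
    occurrences = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return occurrences * 100.0 / len(words)

# A deliberately stuffed 27-word sample page: 10 hits of the phrase.
page = "blue widgets " * 10 + "are sold here. " + "We ship widgets fast."

print(round(keyword_density(page, "blue widgets"), 1))  # -> 37.0
print(keyword_density(page, "blue widgets") < 5.0)      # -> False: over the suggested limit
```

Phrase counting like this ignores word-boundary subtleties and markup, so treat the number as a rough screen, not a precise measure.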

Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing many of the low-quality web sites in their results that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results point to another potential answer.

A second theory, which has gained credence in the past days within the industry, is that in preparation for its Initial Public Offering (possibly this Spring), Google has developed a way to increase its revenue. How? By removing many of the sites that are optimized for the search engines on major commercial search terms, thereby increasing the use of its AdWords paid search results (cost-per-click) system. Is this the case? Maybe, maybe not.

Perhaps both of these reasons came into play. Perhaps Google execs thought they could

1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites,
3) because of #2, increase AdWords revenues, and
4) because of better results and more revenue, have a better chance at a successful IPO.

Sadly for Google, this plan had a fatal flaw.

What Google Should Do

While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:

1. Post a communiqué on its web site explaining in as much detail as it is able what it has done and what it is doing to fix it;

2. Reduce the weight of OOP;

3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and

4. Establish an appeal process through which site owners who feel they are following all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.

When this recent update broke on November 15, webmasters flocked by the thousands to industry forums such as webmasterworld.com. The mis-update was quickly titled "Florida Update 2003," and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, so everyone should just stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.

If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With Yahoo's recent acquisition of Inktomi, Alltheweb/FAST, and Altavista, it most likely will soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.

 

Miop
msg:183423
9:36 am on Dec 3, 2003 (gmt 0)

I'm suspicious of this link problem thingy - if the text used in the links is a culprit, how come so many shopping malls and directories are listing for top kw's? They have standard link text when linking. If anything, it makes me wonder if it's the sites where the link text wasn't exact enough which have been dropped. If you sign up with a shopping mall or directory, you use their standard link text, and they are all still at the top of the league...
Wouldn't it make sense that if G wanted the results to be more filtered for relevance, then link text exactly matching that found on the site would be a good thing, not a bad one?

rfgdxm1
msg:183424
5:15 pm on Dec 3, 2003 (gmt 0)

>I'm suspicious of this link problem thingy - if the text used in the links is a culprit, how come so many shopping malls and directories are listing for top kw's?

If you read what the OP stated: "Google has selected only certain keywords to apply the OOP for." Is it possible these shopping malls and directories aren't using penalized keywords? Also, this list of keywords would have to be hand-created; otherwise there would be too many cases of brand.com getting penalized for "brand", since these sites typically have a huge amount of inbound anchor text containing "brand". Thus, when creating this keyword list, Google may have been looking for cases where they wouldn't hurt well-known, clean sites - like shopping malls and directories. They may even have whitelisted the ones they knew about so as not to cause collateral damage.

jim_w
msg:183425
5:32 pm on Dec 3, 2003 (gmt 0)

If they did that, then that is one of the most arrogant, naive things I have ever seen a company do. What if your company is not a Fortune 1000 company but happens to have one of those KWs in its name? As many, many businesses do. What if, as a simple example, the KW happens to be ‘used cars’ and you have listings of new and used cars, with each stated as new or used?

Every page I have has our company name, address, phone number, etc. on it so that people know who and where we are. This also apparently seems to trip one of their KW filters. Having your name and address on at least one page was a recommendation by yahoo years ago, and makes good sense even if G doesn’t think so. So are they such a monopoly now that they can dictate what my business principles should be and how I run my company? Isn’t that what M$ got into so much hot water for? And if they did anything to the results to increase ad revenues, then they may have big problems with the FTC, just for openers.

usavetele
msg:183426
5:36 pm on Dec 3, 2003 (gmt 0)

"- use a site map (linked to from every page) that links to all of your pages"

So does this help with ranking, or just help the search engines find all your pages? Stupid Q, but I'm desperate to get my dang page out of the google trash can.

Thanks!

claus
msg:183427
5:41 pm on Dec 3, 2003 (gmt 0)

Regarding "over-optimization penalty" and the "commercial keyword list" - i (still) don't believe it.

These are some queries that are penalized, according to that website that most of the conspiracy advocates seem to use. Below are real queries from November 30:

  • foo bar (70 of top 100 removed)
  • turquoise (69 of top 100 removed)
  • hello world (61 of top 100 removed)
  • red blue (54 of top 100 removed)
  • 1 2 3 4 5 6 7 8 (39 of top 100 removed)
  • some random words (28 of top 100 removed)
  • keyword phrase (26 of top 100 removed)
  • blue widgets (13 of top 100 removed)
  • lorem ipsum dolor sit amet (7 of top 100 removed)
Post #386: [webmasterworld.com...]

It is quite easy to find other similarly nonsensical "banned" phrases. Now, do any of you seriously believe that any of these keyword phrases return "too optimised" sites? Do they seem commercial?

What you think you are seeing is in fact something else. The terms you think are "banned" seem, instead, to be terms that (return a range of pages that do not) qualify for a "broad match". The exact nature of this broad match is what you should be reflecting on instead of seeking conspiracies.

Added: sorry if i'm sounding rude, it annoys me to see stuff like this repeated when it is so easy to see it's not true

/claus

[edited by: claus at 5:48 pm (utc) on Dec. 3, 2003]

Miop
msg:183428
5:46 pm on Dec 3, 2003 (gmt 0)

If you looked up 'blue widgets' since the update, a few would be bound to be removed. :)

Miop
msg:183429
5:49 pm on Dec 3, 2003 (gmt 0)

Actually I am torn between the commercial kw thingy, and the notion that they are looking for very accurate matches with pages with the lowest kw density...
If you sell blue widgets and your links say 'blue widgets' and your site is a genuine blue widget whatever, then presumably you don't need to stuff the page with kw's and sites that do might trip a filter.

john316
msg:183430
5:53 pm on Dec 3, 2003 (gmt 0)

OOP is not an accurate assessment, try BA.

Buy Adwords.

It would seem that a new psychological disorder should be coined: "battered webmaster syndrome"; you get zapped for having a useful site and it's your fault.

Please folks, just use Altavista, don't feed the hand that slapped ya.

superscript
msg:183431
5:54 pm on Dec 3, 2003 (gmt 0)

Actually I am torn between the commercial kw thingy, and the notion that they are looking for very accurate matches with pages

There's something in this - I'm sunk for most 2 KW phrases - but pop up fine right at the top of the SERPs for more specific 3 KW phrases. It has certainly hurt my revenue, but customers are still managing to find my products. This is why it's so hard to figure out whether I have been 'penalised' as such.

MetropolisRobot
msg:183432
5:54 pm on Dec 3, 2003 (gmt 0)

I believe there may be something to do with inward links, but it may be the number of links from another site all featuring the same phrase.

Say a company has a main domain A and a secondary domain B. B has many links to A, all featuring the same phrase. "Blue Widgets in X" where X may be the name of a state etc etc.

If it is inward-link related then it could be an issue. What's to stop me now from setting up multiple links to my competitor featuring the phrase? That's what makes me hesitate to say that it is this alone....

I know that the top site in the category I'm looking at has a two-keyword density in excess of 38%, so I don't believe it is KW density alone.

DylanW
msg:183433
5:55 pm on Dec 3, 2003 (gmt 0)

I'm suspicious of this link problem thingy - if the text used in the links is a culprit, how come so many shopping malls and directories are listing for top kw's?
Just a complete guess out of nowhere, but what if it had something to do with excessive link text as well? Since most links are fairly short, this would indicate "keyword stuffing" on link exchanges if you see the same long link text showing up time and again. For example, I've been requesting that people who link to our site use link text like "KeyPhrase 1 from OurCompany - Since 1956" (depending on the page they're linking to); however, I've seen people request that links to them use link text like "TheirCompany - key phrase 1, key phrase 2, and key phrase 3" or something to that effect.

Of course, if it is a link text issue, or even an issue of how people link to you, fixing it is going to be no easy task if you've got a whole lot of links. Then again, it's always been said you can't be penalized for people linking to you, simply because this would allow people to sabotage you.
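If repeated identical anchor text is the trigger being suspected here, a backlink report can at least be screened for suspicious uniformity. A minimal sketch, assuming you already have the inbound anchor strings collected (the list below is entirely made up, echoing the examples in the post above):

```python
from collections import Counter

# Hypothetical inbound-link anchor texts; in practice you would
# collect these from a backlink report.
anchors = [
    "TheirCompany - key phrase 1, key phrase 2, and key phrase 3",
    "TheirCompany - key phrase 1, key phrase 2, and key phrase 3",
    "TheirCompany - key phrase 1, key phrase 2, and key phrase 3",
    "TheirCompany",
    "a useful widget resource",
]

counts = Counter(a.lower() for a in anchors)
total = len(anchors)

# Flag any anchor string accounting for more than half of all links.
for text, n in counts.most_common():
    share = n / total
    flag = "  <- suspiciously uniform?" if share > 0.5 and n > 1 else ""
    print(f"{n}/{total} ({share:.0%}) {text!r}{flag}")
```

The threshold is arbitrary; the point is simply that a natural link profile tends to show varied anchor text, while a traded-links profile repeats one long phrase.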

[edited by: DylanW at 5:57 pm (utc) on Dec. 3, 2003]

jim_w
msg:183434
5:55 pm on Dec 3, 2003 (gmt 0)

claus
>>"too optimised" <<

I don’t think it is an issue of that so much as G has made a decision on what is a broad match, and sometimes their theory isn’t correct. Sometimes a set of KWs MUST be used together for the context to make sense to a human reader. I am also convinced of this based on the fact that not only have my hits gone down, but people are now looking for us in G using my name or our company name, and the number of hits from other SEs is starting to go up, as is traffic. But very, very slowly. Based on my 2 weeks or so of data, it does seem to be a trend, but too soon to call. So G may NOT be supplying the end users with what they are looking for now [edit]at least based on one set of 2 word phrases[/edit]. Personally, I feel I almost have enough data and statistics to turn it over to the FTC for an investigation.

Miop
msg:183435
6:02 pm on Dec 3, 2003 (gmt 0)

There is the added complication that some sites may yet fall from grace, so examining their kw density too closely might be detrimental to your health!
If I had the time I would be inclined to check a range of subjects across the top ten sites and try to spot any patterns...
Fortunately our sales are satisfactory, but some of the kw's we are being found for are truly bizarre and totally different to the past 6 months.

Miop
msg:183436
6:04 pm on Dec 3, 2003 (gmt 0)

NB. I am going to attempt to remove inward links from the equation - it would not be rational for G to penalize these, given webmasters' lack of control over them. Or G has thrown a complete wobbly and doesn't care about rationality...

rfgdxm1
msg:183437
6:06 pm on Dec 3, 2003 (gmt 0)

>Regarding "over-optimization penalty" and the "commercial keyword list" - i (still) don't believe it.

claus makes a convincing argument. 70 out of the top 100 sites for "foo bar" have disappeared in this last update? That makes no sense at all on the theory there is an over-optimization penalty, or a "commercial keyword list" Google is going after. The evidence claus presents points more in the direction of some major shift in the algo.

superscript
msg:183438
6:08 pm on Dec 3, 2003 (gmt 0)

Anyone for good ole keyword-stuffing as the culprit? I know there are some exceptions - but it's nice and simple and would, from Google's point of view, quickly eliminate the most basic spammers.

<edit: and potentially mess up the listings of competing search engines if webmasters react to it by reducing keyword density >

[edited by: superscript at 6:12 pm (utc) on Dec. 3, 2003]

MetropolisRobot
msg:183439
6:10 pm on Dec 3, 2003 (gmt 0)

superscript

as i said, i found the major competitor in my category to have a KW density of 38% so i'm ruling that one out unless it is offset by something else.

Miop
msg:183440
6:12 pm on Dec 3, 2003 (gmt 0)

I do agree at the moment, superscript - I have removed a few to see what happens (though I hadn't stuffed the page - I just sell a lot of different kinds of fuzzy widgets!). Unfortunately googlebot seems to be crawling me, but the index is not updating at the moment.

HenryUK
msg:183441
6:12 pm on Dec 3, 2003 (gmt 0)

OK

Here are my observations, based on some of my key phrases, using the method outlined in the original post.

1) On some I am doing better because some spammy sites have been removed. Hoorah for me but not very interesting.

2) On one major key phrase I am doing the same as before, but it is different sites that are above and below me. What this shows is that it is not simply a case of certain sites being "penalised" or not. If my site were "penalised" it wouldn't be #6 out of about 6 million, as it was before. However, some other sites have been promoted ahead of it, to replace the ones that have been "penalised". If there is an OOP then it is on some kind of sliding scale rather than a binary on-off thing. Which would make some kind of sense given Google's general approach to these things. Incidentally, in this case at least some of the sites "removed"/"penalised" weren't obvious stuffers.

cheers

[edited by: HenryUK at 6:15 pm (utc) on Dec. 3, 2003]

jim_w
msg:183442
6:13 pm on Dec 3, 2003 (gmt 0)

>>Anyone for good ole keyword-stuffing as the culprit?<<

Sites in the 1 and 2 position for my 2 KWs, have more than I do, so I doubt that is it.

rfgdxm1
msg:183443
6:14 pm on Dec 3, 2003 (gmt 0)

>Anyone for good ole keyword-stuffing as the culprit? I know there are some exceptions - but it's nice and simple and would, from Google's point of view, quickly eliminate the most basic spammers.

Check the "foo bar" example. Few of those pages that dropped off the radar were keyword stuffed.

superscript
msg:183444
6:15 pm on Dec 3, 2003 (gmt 0)

"i found the major competitor in my category to have a KW density of 38%"

Point taken, but the SERPs are a bit of a mess, there's a lot of crawling to do. It's possible some sites have slipped through the net. Certainly low keyword density looks like a major factor to me. The only competitor of mine who hasn't budged an inch in the SERPs and is still at number one, has only a single mention of the main keyword in the body text of his index page.

Miop
msg:183445
6:18 pm on Dec 3, 2003 (gmt 0)

rfgdxm1 - maybe they weren't stuffed for foo bar, but didn't have enough precise links pointing towards them saying 'foo bar here'.

Kirby
msg:183446
6:20 pm on Dec 3, 2003 (gmt 0)

>claus makes a convincing argument. 70 out of the top 100 sites for "foo bar" have disappeared in this last update? That makes no sense at all on the theory there is an over-optimization penalty, or a "commercial keyword list" Google is going after. The evidence claus presents points more in the direction of some major shift in the algo.

I think this is a sound explanation.

Both the 'over-seo' and 'commercial keyword list' arguments have too many proven exceptions to be valid.

Combine an algo shift with a shake-up in the Google directory, and does that result in Florida?

Miop
msg:183447
6:20 pm on Dec 3, 2003 (gmt 0)

Another reason for my thinking it might be a double combo of kw density/inward link accuracy is my site's appearance for words it has never appeared for before. I didn't even twig that the site was optimised for those until I looked through my own internal links, which I had done and forgotten about.

Chndru
msg:183448
6:21 pm on Dec 3, 2003 (gmt 0)

What The Early Research is Showing is... it's too early to analyze anything, especially a half-baked algo. :)

give 'em a week for the dust to settle down

quotations
msg:183449
6:22 pm on Dec 3, 2003 (gmt 0)

Perhaps I should try the experiment of adding a link to a major competitor on every page I have, using the two-word phrase they rank number one for as the anchor text.

That should settle the question of whether it is anchor text related.

800,000+ links to them with identical kw1 kw2 anchor text ought to have some impact if that is a criterion.

superscript
msg:183450
6:23 pm on Dec 3, 2003 (gmt 0)

The site I referred to earlier, with only one mention of the KW in his index, has *always* been at number one - so the site is noteworthy, because whatever he has been doing, whether consciously or not, has worked both pre- and post-Florida. His allintext: results: nowhere. His allinanchor: results: top dog.

It looks like he's got a very large number of low ranking, but genuine and organic inbound links.

[edited by: superscript at 6:24 pm (utc) on Dec. 3, 2003]

jim_w
msg:183451
6:23 pm on Dec 3, 2003 (gmt 0)

>>especially a half-baked algo<<

or half-raw, depending on whether you are a pessimist or an optimist


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved