Thank you,
Ryan Allis
On November 15, 2003, the SERPs (Search Engine Result Pages) in Google were dramatically altered. Although Google has been known to go through a reshuffling (appropriately named a Google Dance) every 2 months or so, this 'Dance' seems to be more like a drunken Mexican salsa than its usual conservative fox-trot.
Most likely, you will already know if your web site has been affected. You may have seen a significant drop-off in traffic around Nov. 15. Three of my sites have been hit. One could understand dropping down a few positions, but since November 15 the sites that previously held these rankings have been nowhere to be found in the top 10,000 results. Such radical repositionings have left many mom-and-pop and small businesses devastated and out of luck for the holiday season. With Google controlling approximately 85% of Internet searches, many businesses are finding they need to lay off workers or rapidly cancel inventory orders. This situation deserves a closer look.
What the Early Research is Showing
From what early research shows, it seems that Google has put into place what has quickly been termed in the industry an 'Over Optimization Penalty' (OOP) that takes into account the incoming link text and the on-site keyword frequency. If too many of the sites that link to your site use link text containing a word that is also repeated more than a certain number of times on your home page, that page will be assessed the penalty and either demoted to oblivion or removed entirely from the rankings. In a sense, Google is penalizing sites for being optimized for the search engines, without any forewarning of a change in policy.
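To make that hypothesis concrete, here is a minimal sketch, in Python, of what such a filter could look like. This is speculation about speculation: the two signals are the ones described above, but the threshold values, function names, and inputs are invented purely for illustration and are not anything Google has confirmed.

# Hypothetical sketch of the suspected Over Optimization Penalty (OOP).
# The threshold values below are invented for illustration only.

def on_page_count(page_text, keyword):
    """Count how often the keyword appears in the page's visible text."""
    return page_text.lower().count(keyword.lower())

def matching_anchor_count(inbound_anchors, keyword):
    """Count inbound links whose anchor text contains the keyword."""
    return sum(keyword.lower() in anchor.lower() for anchor in inbound_anchors)

def trips_oop(page_text, inbound_anchors, keyword,
              max_on_page=20, max_matching_anchors=50):
    """True if the keyword is repeated heavily on the page AND many
    inbound links use that same keyword as their link text."""
    return (on_page_count(page_text, keyword) > max_on_page and
            matching_anchor_count(inbound_anchors, keyword) > max_matching_anchors)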
Here is what else we know:
- The OOP is keyword specific, not site specific. Google has selected only certain keywords to apply the OOP to.
- Certain highly competitive keywords have lost many of their previous listings.
How to Know if Your Site Has Been Penalized
There are a few ways to know if your site has been penalized. The first, mentioned earlier, is a significant drop in traffic around the 15th of November; if you saw one, you have likely been hit. Here are ways to be sure:
1. Go to google.com and type in any search term you recall being well ranked for (check your site logs to see which terms sent you search engine traffic). If your site is nowhere to be found, it has likely been penalized.
2. Type in the search term you suspect you are being penalized for, followed by "-dkjsahfdsaf" (or any other similar gibberish, without the quotes). This removes the OOP from the results, so you can see what your rankings would otherwise be (a small helper for building the two queries follows this list).
3. Or, simply go to www.**** to have this automated for you. Just type in the search term and see quickly what the search engine results would be if the OOP was not in effect. This site, put up less than a week ago, has quickly gained in popularity, becoming one of the 5000 most visited web sites on the Internet in a matter of days.
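For step 2, here is a minimal sketch, in Python, that builds the two query URLs (normal and gibberish-negated) so the filtered and unfiltered result pages can be compared side by side in a browser. The gibberish token is arbitrary, and fetching or parsing the result pages is deliberately left out.

# Build the normal query URL and the gibberish-negated one for comparison.
# The negated token can be any string unlikely to appear on real pages.
from urllib.parse import urlencode

def comparison_urls(term, gibberish="dkjsahfdsaf"):
    base = "http://www.google.com/search?"
    filtered = base + urlencode({"q": term})
    unfiltered = base + urlencode({"q": "%s -%s" % (term, gibberish)})
    return filtered, unfiltered

if __name__ == "__main__":
    for url in comparison_urls("florida web designer"):
        print(url)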
The Basics of SEO Redefined. Should One De-Optimize?
Search engine optimization consultants such as myself have known for years that the basics of SEO are the following (a rough audit sketch follows the list):
- put your target keyword or keyphrase in your title, meta-tags, and alt-tags
- put your target keyword or keyphrase in an H1 tag near the top of your page
- repeat your keyword or keyphrase 5-10 times throughout the page
- create quality content on your site and update it regularly
- use a site map (linked to from every page) that links to all of your pages
- build lots of relevant links to your site
- ensure that your target keyword or keyphrase is in the link text of your incoming links
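As an illustration only, the sketch below checks a page against the on-page items in the list (keyword in the title, keyword in an H1, and the 5-10 repetition rule of thumb). It uses naive regular expressions rather than a real HTML parser, so treat it as a rough audit, not a definitive one.

# Rough audit of a page against the on-page basics listed above.
import re

def audit_page(html, keyword):
    kw = keyword.lower()
    text = re.sub(r"<[^>]+>", " ", html).lower()               # crude tag stripping
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    h1 = re.search(r"<h1[^>]*>(.*?)</h1>", html, re.I | re.S)
    occurrences = text.count(kw)
    return {
        "keyword in title": bool(title and kw in title.group(1).lower()),
        "keyword in h1": bool(h1 and kw in h1.group(1).lower()),
        "occurrences on page": occurrences,
        "within the 5-10 rule of thumb": 5 <= occurrences <= 10,
    }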
Now, however, the best practices for keyword frequency and link text will likely trigger the Google OOP. There is no denying that many low-quality sites have used link farms and spammed blog comments in order to increase their PageRank (Google's measure of site quality) and link popularity. However, a distinction must be made between these sites and quality sites with dozens or hundreds of pages of well-written informational content that have taken the time to build links properly.
So if you have been affected, what can you do? Should one de-optimize their site, or wait it out? Should one create one site for Google and one for the 'normal engines'? Is this a case of a filter being turned on too tight, one that Google will fix in a matter of days, or something much more?
These are all serious questions that no one seems to have answers to. At this point we recommend making the following changes to your site if, and only if, your rankings seem to have been affected:
1. Contact a few of your link partners via email. Ask them to change the link text so that the keyword you have been penalized for is not in the link text, or so that the keyphrase appears in a different order than the one you are being penalized for.
2. Open up the page that has been penalized (usually your home page) and reduce the number of times the keyword appears on it. Keep it under 5 occurrences for every 100 words on the page (see the density sketch below).
3. If you are targeting a keyphrase (a multiple-word keyword) reduce the number of times that your page has the target keyphrase in the exact order you are targeting. Mix up the order. For example, if you are targeting "Florida web designer" change this text on your site to "web site designer in florida" and "florida-based web site design services."
It is important to note that these 'de-optimization' steps should only be taken if you know that you have been affected by the Google OOP.
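The density rule in step 2 is easy to check mechanically. Below is a minimal sketch that counts occurrences per 100 words of visible text; the tag stripping is deliberately naive, and the 5-per-100 limit is simply the rule of thumb suggested above, not a documented Google threshold.

# Check a page against the suggested limit of 5 keyword occurrences
# per 100 words of visible text.
import re

def keyword_density(html, keyword):
    """Return (occurrences, occurrences per 100 words of visible text)."""
    text = re.sub(r"<[^>]+>", " ", html).lower()
    words = text.split()
    occurrences = text.count(keyword.lower())
    per_100 = 100.0 * occurrences / max(len(words), 1)
    return occurrences, per_100

def over_limit(html, keyword, limit=5.0):
    """True if the keyword exceeds the suggested 5-per-100-words limit."""
    return keyword_density(html, keyword)[1] > limit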
Why did Google do this? There are two possible answers. First, it is possible that Google has simply made an honest (yet very poor) attempt at removing many of the low-quality web sites in their results that had little quality content and received their positions from link farms and spamdexing. The evidence and the search engine results point to another potential answer.
A second theory, which has gained credence within the industry in recent days, is that in preparation for its Initial Public Offering (possibly this Spring), Google has developed a way to increase its revenue. How? By removing many of the sites that are optimized for the search engines on major commercial search terms, thereby increasing the use of its AdWords paid search results (cost-per-click) system. Is this the case? Maybe, maybe not.
Perhaps both of these reasons came into play. Perhaps Google execs thought they could
1) improve the quality of their rankings,
2) remove many of the 'spammy' low-quality sites,
3) because of #2, increase AdWords revenues and
4) because of better results and more revenue have a better chance at a successful IPO.
Sadly for Google, this plan had a serious flaw.
What Google Should Do
While there are positives that have come from this OOP filter, the filter needs to be adjusted. Here is what Google should do:
1. Post a communiqué on its web site explaining in as much detail as they are able what they have done and what they are doing to fix it;
2. Reduce the weight of OOP;
3. If the OOP is indeed a static penalty that can only be removed by a human, change it to a dynamic penalty that is analyzed and assessed with each major update; and
4. Establish an appeal process through which site owners who feel they are following all the rules and have quality content can have a human (or enlightened spider) review their site and remove the OOP if appropriate.
When this recent update broke on November 15, webmasters flocked in the thousands to industry forums such as webmasterworld.com. The update was quickly dubbed the "Florida Update 2003," and the initial common wisdom was that Google had made a serious mistake that would be fixed within 3-4 days, and that everyone should just stay put and wait for Google to 'fix itself.' While the rankings are still dancing, this fix has yet to come. High-quality sites with lots of good content that have done everything right are being severely penalized.
If Google does not act quickly, it will soon lose market share and its reputation as the provider of the best search results. With its recent acquisitions of Inktomi, Alltheweb/FAST, and Altavista, Yahoo will most likely soon renege on its deal to serve Google results and may, in the process, create the future "best search engine on the 'net." Google, for now, has gone bananas in its recent merengue, and it may soon be spoiled rotten.
Not complaining, just trying to find out what is happening. The challenge is that there seem to be so many discrepancies and contradictions.
For example, the search for 'big city real estate' returns only one of the pre-florida results still in top 10, with the rest primarily directory/authority sites. The next highest pre-florida result is #69, and a total of 88 of the pre-florida top 100 are gone.
Do a search for 'small city real estate' and results are back to pre-florida, with none of the directory/authority sites.
At first I attributed the number of directory and/or authority sites now showing for 'big city real estate' to the fact that as a big city it would have far more applicable directory/authority sites.
This theory gets blown out of the water when you do a search for 'big city2 real estate'. The results are pretty much the same as pre-florida, sans the directories.
After examining numerous city results, I'm looking for reasons that explain why this happens. Many of the sites that disappeared are real estate template sites, yet these same company template sites are untouched for other 'city real estate' searches.
I understand the noise level is always higher after updates, but if we could stick to analysis and problem solving instead of the 'serps are great, stop complaining' mantra, everyone would benefit.
foo bar
and
foo bar -dwjkcbhhcf
should come up with the same results if some arcane operation / filter is not at work. I have checked the top 20 sites in the category I target and none of them have dwjkcbhhcf in their content, but they still show up under case II. The difference is that I don't show up under case I.
None of my sites were adversely affected by the Florida update. I've even improved on a few terms.
For years/months/whatever, many of the webmasters here would complain and whine to Google about all the spam sites that were beating them in the serps, and in most of those circumstances they were claiming spam just because a site wasn't totally to their liking and was ahead of them.
When Google started that asinine practice of auto-banning sites just because the color of the text was the same as the background (instead of simply ignoring the text, so as not to risk banning good sites), these same people cheered loudly and were happy they now had a little less competition, and couldn't care less about the quality of the serps.
Well, now the shoe is on the other foot. (I'm _not_ referring to all of you and it sucks that your sites are gone, but the whiners in question know who they are and are getting what they deserve.)
Don't you wish you had complained less about spam and instead focused on your own content? Don't you wish you had advocated for a more tolerant Google instead of a strict Google that tosses sites at a whim?
That "relevant" site you own is probably considered spam by someone else. Maybe the quality of the serps is better without your sites?
Have a happy Christmas.
I think that claus asserts that if you use the - operator to qualify your search, you are also implying an exact match on your first term, so the results are more like an exact match of the search rather than a broad match.
To test this I have done many searches using the exclusion operator and then a search using quotes (without the exclusion operator). Although in most cases the two results aren't exactly alike, they are pretty close.
I'm sure claus will correct me if I've put words into his mouth!
I think that claus asserts that if you use the - operator to qualify your search, you are also implying an exact match on your first term, so the results are more like an exact match of the search rather than a broad match.
A very reasonable explanation, but the question remains: why are some terms broad-matched and others not?
Chndru: "It's time to deal with it rather than venting empty threats/accusations and conspiracies at G."
Just to clarify - I wasn't venting conspiracies at G. I did point out that, as I haven't done many searches recently, I couldn't say whether Google was working properly or not. I merely highlighted the programme, and tried to repeat as much of it as I could remember (I didn't realize it would be available on video), to make another point. I don't necessarily agree with what they say, but it is only the second time I have ever seen Google mentioned on TV in Europe (the first time was in September: Canal Sur News in the South of Spain mentioned Google, as the local Government had appeared in the top 10 queries in August - not a particularly interesting item if you ask me!).

Anyway, it is the perception of Google rather than how it is actually performing that was the point - sometimes "to be is to be perceived". The programme did mention the Advanced Search page and using search commands to refine the searches. (Personally I don't find the Advanced Search page particularly easy to use or intuitive, and sometimes I sense a bit of ambivalence from Google with regard to refining queries. I know that they give tips, but in [google.com...] they say "DON'T bother with advanced search techniques, such as +,-,quotes, etc. unless the most obvious keywords don't work", which to me would suggest that they want people to try unrefined queries first and then, if those fail, use operators, rather than using operators from the start.)

I just thought it was significant that a mainstream (not a specifically technical) programme should devote so much time to a Google update - even if there isn't a problem, they made it sound as if one existed.
I hope that clarifies things.
With Kind Regards
PBG
I'm as happy to search for
foo bar -****xxx
as the next person. That facility is provided for those of us who really want to do some serious googling.
The vast majority of searches are of the form
foo bar
and then people cycle through the first few pages looking for what they want. Therefore, if your results appear in response to a simple search you have it made; otherwise, for an e-commerce site that relies on casual searchers, you are sunk.
At first I attributed the number of directory and/or authority sites now showing for 'big city real estate' to the fact that as a big city it would have far more applicable directory/authority sites. This theory gets blown out of the water when you do a search for 'big city2 real estate'. The results are pretty much the same as pre-florida, sans the directories.
Hi Kirby,
It is examples of conflicting results like this, together with the vast range of on/off-page criteria found by webmasters here, that convince me that a spam filter is being applied to certain keywords and not to others.
I'm proposing the theory that this filter works on the result set for a given search. Any sites that are deemed to be OK pass through the filter and are weighted by the standard Google algo, any that are not OK are stopped by the filter and no algo is applied to them.
When you think about it, this is an efficient way of penalising sites. By not giving them any weight for any of the normal SEO techniques, they drop out of the results, and you save on processing power because they can just be disregarded.
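To illustrate the theory above, here is a minimal "filter first, then rank" sketch. It is purely hypothetical: passes_filter and score are placeholders for Google's unknown spam filter and ranking algorithm.

# Sketch of the proposed mechanism: filter the candidate result set for a
# query first, then apply the normal ranking only to the pages that pass.
def rank_results(candidates, query, passes_filter, score):
    survivors = [page for page in candidates if passes_filter(page, query)]
    return sorted(survivors, key=lambda page: score(page, query), reverse=True)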
If you do the search for the term that results in your site being dropped out of the SERPs, but using one of these operators, does it come back in? (A small helper for generating these test queries follows below.)
intitle:
allintitle:
allintext:
allinanchor:
search term -site:www.google.com
Sid
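For convenience, a tiny sketch that spells out those diagnostic queries for a given term. The operators are standard Google search operators; the helper itself is just string formatting and assumes nothing beyond that.

def diagnostic_queries(term):
    """Build the test queries listed above for a given search term."""
    return [
        "intitle:%s" % term,
        "allintitle:%s" % term,
        "allintext:%s" % term,
        "allinanchor:%s" % term,
        "%s -site:www.google.com" % term,
    ]

for query in diagnostic_queries("big city real estate"):
    print(query)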
This leaves me wondering why "Chicago" would be treated differently than "Detroit" or "Houston"?
Also, why do "city_name term" and "term in city_name" return results that are so different?
Seems to me that over optimization is exactly what sank people. Being over analytical makes you jaded towards what constitutes a good site.
We definitely analyze the hell out of things nowadays at WebmasterWorld (as evidenced by this thread).
yeah... i really thought i did answer that ;) The key is in the literal interpretation of that first sentence of the quote:
You can increase the accuracy of your searches by adding operators that fine-tune your keywords
...adding a space and a minus sign is "adding operators", no matter what you write after that minus sign. As a result of this, the search box reverts from the "broad match query" (post-Florida default) to the "specific match query" (pre-Florida default).
Pre-Florida Google did what any other SE does - tried to find an exact match. Post-Florida, Google does something that other SE's just can't do, that's probably why it's so hard to understand.
>> Why Chicago and not Houston... / why some, not others...
The easy answer would be that Chicago starts with C and Houston starts with H and the "broad match feature" hasn't reached H yet.
I'm not sure it's the right answer, but i'm sure that the reason why some search terms are affected more than others is the straightforward one: because the "broad match database" (for lack of better words) hasn't been expanded to all words yet.
>> google has gone for more specific search terms, three words plus
I've seen a move away from the broad match, toward the exact match (pre-Florida style), as well over the last couple of days.
I'm sure post-Florida will return full scale, as that is something other SE's just can't do. Remember the rumors of MS going into search? This is, if nothing else, a demonstration of superior technology.
/claus
The easy answer would be that Chicago starts with C and Houston starts with H and the "broad match feature" hasn't reached H yet.
Now if that isn't pure speculation - I'd like to know what is ;)
How are Alaskan and Zairean real estate terms faring? :)
I was hoping to provide more information along with this post but I cannot at this time. I am providing this in the hope the more experienced SEO's can run with it.
These are the example keywords. I have not tested these, but if you replace the example with your keywords it may surprise you.
Keyword 1 reno
Keyword 2 hotels
Search reno hotels - below the basement level.
Search re-no hotels - bingo, back to expected position.
Search reno hotel-s - bingo, back to expected position.
In reality, by actually making these changes to the pages in an organized fashion, I would think the filter can be tricked.
The smart way to go about this would be to totally de-optimise a page and see if it will rank using the hyphen solely in the title and one occurrence on the page.
If so then continue with the SEO to determine exactly where the filter trips.
I am doing this now but the time factor is two to three days per change to see the results.
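As an aid for that kind of testing, here is a minimal sketch that generates hyphen-split variants of each keyword in a phrase, along the lines of the re-no / hotel-s examples above. The idea that hyphenation dodges the filter is the poster's untested hypothesis; the function names are just illustrative.

def hyphen_variants(word):
    """Split a word with a hyphen at every interior position, e.g. 'reno' -> 're-no'."""
    return ["%s-%s" % (word[:i], word[i:]) for i in range(1, len(word))]

def query_variants(keywords):
    """For each keyword in the phrase, swap in each of its hyphenated forms."""
    variants = []
    for i, keyword in enumerate(keywords):
        for hyphenated in hyphen_variants(keyword):
            variants.append(" ".join(keywords[:i] + [hyphenated] + keywords[i + 1:]))
    return variants

print(query_variants(["reno", "hotels"]))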
I looked at several cities:
Denver - more directory/authority sites
San Diego - more directory/authority sites
Las Vegas - 5 of 1st 6 are pre-florida, non-directory sites
Boston - more directory/authority sites
Orange County - mix of pre-florida and directory/authority sites
Naples, Florida - mostly directory sites w/ listings of pre-florida sites
Before the critiques of the serps begin: I'm not expressing an opinion on the quality, just trying to understand the results, since a weather site with a real estate link comes up a lot. I'm also seeing other pages come up simply because they link to a page one might expect to be a relevant result. I'm looking for a way to reconcile theming, broad match, and these oddball pages.
Any thoughts?
I don't care much for technology except technology that improves my life. Don't see *that* anywhere around here. Not yet anyway.
Sure do love my Blackberry though. Now that's cool technology. OK, so I'm simple minded. ;-)
< 'blackberry' ... now that's an interesting search term. (Mods, if I wasn't supposed to say that word, please delete.)>
They also talk about the possibility of Google trying to get ad revenue as the cause of this algo change.
Nothing that hasn't been discussed here on the forums but it appears that the press is writing about it. I wouldn't want this kind of press if I were Google.
The title of the Article is:
Google Changes Rankle Merchants
Thursday, December 4th, Atlanta Journal-Constitution
I have no opinion either way but wanted to post this for everybody to see.
I guess that's why I'm confused about Siterank. It seems that a weather site should not be an authoritative site about Chicago. Perhaps it's an authoritative site about weather in Chicago, in which case I would expect it to be on the first page for a search on Chicago weather or weather in Chicago.
It seems that if I were to search for "city_name mortgage" I would see banks and other large lenders as authoritative, but that's not the case.
The alphabet theory doesn't really work either, as searches for "Boston term" and "term in Boston" still return what I would expect to see.
Also there seems to be a city size issue. The largest cities seem to be affected while the smaller cities seem to return what I would expect to see.
Also, I definitely see a difference in KW usage in the new terms showing up in the big cities. KW density is far lower, so low as to be almost a non-issue. Whatever is floating these guys to the top in big cities has little to do with KW relevance.
I'm also seeing that many of these sites have incoming links from just as many non-industry or non-topic related sites as they do same industry or topic.
intitle:
allintitle:
allintext:
allinanchor:
search term -site:www.google.com
The site comes back in, but few of the other pre-florida pages that are gone return, with the exception of "search term -site:www.google.com". This returns the same results as "search term -asdf-fdsa".
The allintitle, allintext, and allinanchor results are drastically different from pre-florida. Why would that be?
1. The fact that 'tahoe wedding' trips the filter and 'tahoe wedding packages' does not, yet 'lake tahoe wedding packages' does would lead one to believe that having two of three poison words in a search will not trip the filter, but three out of four will.
2. There is no Google directory listed at the top of the 'lake tahoe wedding packages' search, but it is tripping the filter.
3. There is only one adword for the search that doesn't trip the filter, there are many for the rest.
4. Heavy cross linking is the way to go.