Kackle - can you explain the "dictionary" for me? And how I might benefit from it? I'm reading your posts hard but don't see where you're coming from.
Sure. But you have to act quickly. Google will fix this one just like they fixed the hyphen.
1. Google is demoting pages/sites that are over-optimized for certain keywords or keyword combinations. It does this by looking up search terms in a dictionary of target keywords or keyword pairs that it has compiled. This dictionary is Top Secret, because if you knew what was in the dictionary, you could avoid these words in your optimization efforts.
2. If the search term or terms hit on a dictionary entry, the search results for that user's search are flagged. This means that before the results are delivered, the order of the links, or even the inclusion of links, is adjusted so as to penalize pages that have over-optimized for those terms. Most likely the title, headlines, links and anchor text are examined. It's possible that external anchor text pointing to the page has also been pre-collected and is available for scanning, but this is much less likely. (Besides, external links are not something within your immediate control, so don't worry about them right now.)
3. You want to find out which of the keywords relevant to your site are in Google's dictionary. Compile as many relevant keywords as you can think of that searchers might use to find your site. Now take these words singly and in pairs, according to how users might search. Run two searches for each combination and compare the results: one normal search, and one with nonsense exclusion terms appended (e.g. -ljxasldj -kdjakdls, mentioned later in this thread), which appears to bypass the filter and show the pre-filter results. (See the sketch just below for one way to automate this.)
4. If the results are strikingly different for the pre-filter and the post-filter search on a particular term or combination of terms, it means that some variation of those terms has been flagged because something was found in Google's dictionary.
5. Do lots of searches and you can come up with a list of "sensitive" words that you'll want to avoid when you re-optimize your pages.
It's a nice weekend project.
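For anyone who wants to automate step 3, here's a rough Python sketch. Everything in it is my own assumption: that Google serves plain HTML at /search, that the nonsense-exclusion trick (the "-ljxasldj -kdjakdls" test mentioned later in this thread) really does bypass the filter, and that a crude regex is good enough to pull result URLs off the page. Treat it as a weekend-project starting point, not gospel:

import re
import urllib.parse
import urllib.request

# Nonsense exclusions; assumed (not proven) to disable the filter.
FILTER_BYPASS = "-ljxasldj -kdjakdls"

def top_urls(query, count=10):
    """Fetch a Google results page and crudely extract result URLs."""
    url = "http://www.google.com/search?q=" + urllib.parse.quote(query)
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req, timeout=15).read().decode("latin-1", "replace")
    # Google's markup changes often; adjust this pattern as needed.
    return re.findall(r'<a href="(http[^"]+)"', html)[:count]

def looks_flagged(phrase, threshold=0.5):
    """Low overlap between normal and bypassed results suggests the
    phrase hits the hypothetical dictionary."""
    normal = set(top_urls(phrase))
    bypassed = set(top_urls(phrase + " " + FILTER_BYPASS))
    union = normal | bypassed
    overlap = len(normal & bypassed) / len(union) if union else 1.0
    return overlap < threshold

for phrase in ["blue widgets", "cheap blue widgets"]:
    print(phrase, "->", "possibly flagged" if looks_flagged(phrase) else "looks clean")

Run each comparison a few times before trusting it; ordinary index churn can also make two result sets differ.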
I think the suggestion that Google is temporarily messed up makes more sense than some of the wackier theories in this thread. There may also be less emphasis on allintext: and more on anchor text.
Google should hire 100 guys to go through spam reports....
that would keep the results a lot cleaner.....
There are some things that humans can do better than computers :)
However....
Google does respond to spam reports, and sites have been deleted due to human intervention. Google is trying to build a foolproof system that doesn't require human beings.
With billions of web pages indexed already and the expected growth in the coming years, it will be very difficult for humans to manage. Google has already managed to index all those web pages, display the correct results and update on a daily basis.
When you start to employ many humans to do the job, you run the risk of sites being banned for no particular reason (a few dollars in the pocket would do the trick).
Although I have suffered in this update so far, I have more trust in computers than humans; computers don't make mistakes, they just need to be reconfigured.
Google is going through this reconfiguration period and it's going to be a very bumpy ride, but at the end of the day they will have the best free search engine, producing the most relevant and up-to-date results on the internet.
This is no short term tactic by Google.
:)
If this is the case I would say Google has lost its direction and forgot what got them so far in the first place.
The results are no good. People don't have time to sort through search results that are no good.
JMO
It made sense a few days ago when Google delivered poor results to Yahoo! It was a good idea to make Yahoo! perform poorly until the switch occurs.
During the last 72 hours, Google has been displaying the same poor results as Yahoo! In addition, Google's directory results look really bad.
If a Yahoo! user obtains poor search results, he or she has the option to access Yahoo!'s directory results which look better than results in Google's directory.
There is something wrong with Google.
On the world's largest search engine, if my existing customer base or future customers type in "widgets.com", no page exists.
Is that even legal? At this point, I don't care about Google referrals, but I would like the ability for customers to type in my domain name and get my site.
Cut us a break, Google!
jd
Our site is a two-word (no hyphen) .com named after an island off the coast of SA. There is an English way of saying it and two Spanish ways of saying it, one with "de" between the words, which returns different results than the two-word search without it. Our title and description have been the same, with minor adjustments, since the site was created in 1995. It gets us wonderful placement in ALL search engines, and the site is filled with content.
My fear is this: if, once the update is over, I have to change the home page title, description and text (because the name is repeated many times to provide the 3 ways of saying it), what impact will that have on the other search engines?
Are we back to doing doorway pages which will just be filtered at some future update?
Google has to return to a REASONABLE level of optimization, to reflect the real-world problems that we face in creating proper web sites. Because we have this problem of multiple names with one recurring word, is our web site to be penalized?
I would guess that there are thousands of web sites dealing with two word locations that have this same problem.
I might say that as soon as we append the country name we show up just fine. Unfortunately most searchers just enter the two word version.
It's been this way for weeks! There are several threads that talk about it.
Directory search is just showing the same results as the regular search, and has been for some time. It has nothing to do with this thread.
In the last UPDATE this happened to the major keywords, but the minor results seemed to stay good -- now the minor results have become irrelevant (lower quality) too.
When you search for key phrases such as "dry hair" and all you get are message board listings and guestbook pages then things are not good.
Then why, oh why, have I just searched my keywords (for this example I will use "great widgets") and got a site with 32 uses of the phrase "great widgets" and another 47 uses of just "widgets"?
This is a very big money keyword and this spammer has 2nd place in Google. Is he lucky?
Well, a little detective work shows this site is a member of googlesyndication.com, which means passing visitors onward (this site doesn't actually offer anything; it is just a portal) earns the site money from Google. Is this a new way for people to get good listings in Google? I always thought they were above commercialism!
If Google Guy wants to respond, he can tell me how to join this club and get my site back onto the first page; I for one will be very pleased.
It appears that a few things are happening:
1) I'm getting more 3-keyword-plus traffic, as customers are figuring out that single and two-keyword combos return crap.
2) The AdSense traffic has increased, as customers are finding that the ads on the right are much more relevant than the results on the left.
I don't know why Google likes to spend so much time tweaking its algorithm. All the bad guys are eventually going to poke and prod the system until they find out what is causing the penalty, the level of spam will slowly increase again, and the cycle will repeat itself.
Meanwhile all the good guys are doing the same thing, because we feel that we have unjustly been dropped. It basically forces companies that don't do SEO into SEO.
On our top search phrase there used to be around 8 million results, now over 16 million! We were no. 3, now nowhere - although we have still kept most of our rankings for less competitive keywords.
If a filter is in place, it has let in a lot more sites!
TM
Google Directory results have changed again though. Some datacentres are still showing old data, and some have new data. One datacentre with new data reverted to old data yesterday. One datacentre that wasn't responding last week now has the new version in it.
Responses: -fi timeout; -in old; -va new; -ab old; -dc new; -cw new; -ex new; -zu NO; -sj NO; -mc NO; -kr NO; -gv NO.
(... where "new" means "includes a category added to dmoz.org in May, that then showed in Google SERPs in July, but only appeared in the Google Directory for the first time on November 2nd")
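If anyone wants to repeat the check, here is a rough way to script it. All assumptions are mine: that the two-letter suffixes map to hosts like www-fi.google.com, and that querying each host for a marker phrase known to exist only in the new Directory data is enough to tell old from new. The marker below is a placeholder; substitute your own.

import urllib.parse
import urllib.request

DATACENTRES = ["fi", "in", "va", "ab", "dc", "cw", "ex", "zu", "sj", "mc", "kr", "gv"]
# Placeholder: any phrase that appears only in the new Directory data.
MARKER = "example new category"

def check(dc):
    host = "www-%s.google.com" % dc
    url = "http://%s/search?q=%s" % (host, urllib.parse.quote('"%s"' % MARKER))
    try:
        html = urllib.request.urlopen(url, timeout=10).read().decode("latin-1", "replace")
    except Exception as exc:
        return "NO (%s)" % exc.__class__.__name__  # no response, like -fi above
    return "new" if MARKER in html else "old"

for dc in DATACENTRES:
    print("-%s: %s" % (dc, check(dc)))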
What is curious is that there has been no statement from GG and no news headlines (as far as I can find by doing a news search).
Outside of SEO forums an average surfer would think this is a non-event.
I've also noticed that a few online retail sites in the UK have dropped dramatically, as their company names happen to be "money keyword ltd" or similar, and a lot of other sites link to them with the anchor text "money keyword ltd", as you would expect. Of the 4 or 5 sites I have noticed (online retailers), none of them have reciprocal links, so the linking is solely inbound. Yet it appears that because the page says "welcome to money keyword" or something similar, with some anchor text also featuring "money keyword", they have suffered.
I understand why Google has implemented an algo change in order to combat spam, but I think they have turned the dial a little too far and lots of good sites have been hit.
Personally I have not been affected by the changes, as my sites are non-commercial and fortunately they have held their positions, but I am seeing what the rest of you are seeing, as the results for some searches do seem a little odd.
Once I calmed down, however, I realized that many, if not almost all, of my competitors suffered the same fate. If you punch in some of the most popular keywords for my industry (services oriented), you will see that the results you currently get on Google's live site are almost all non-commercial sites in the top 30. For example, you get enthusiast sites that mention my service industry, with some links, but no actual providers are showing up.
This puzzled me for a short time, and I was thinking that Google was in chaos, not providing relevant links. Then it struck me: why wouldn't Google wipe out organic results for commercial sites and basically force them to use AdWords? In my industry, many terms are going for $3+ on AdWords and Overture. Google is losing some serious $$ to the top five or six organically placed companies. Why not remove them from the listings under the guise of 'search purity', both to convince end-users that they are getting better, less commercially laden results and to seriously fatten the bottom line with increased AdWords revenue, as the commercial sites are forced to use this avenue (AdWords) to drive customers to their sites? It is a win-win for Google and a lose-lose for commercial sites. I am concerned.
In common with many here, I've had my golden goose shot. A big-earning page that had been at #1 has been wiped from the results for the most important two-word search in the key market it is targeted at.
I'm sure that it is not as simple as this but...
I noticed that another page on the same site has retained the #1 spot for another two-word search, and do you know what, it doesn't have any <h> tags on the page. And the page that has been dropped has (apparently) been dropped only for the two-word phrase that is in its <h> tags.
One particular two-word phrase pulls in (or used to) something like 70% of the traffic to one of my sites, and guess what, I put it in <h1> and <h2> tags. This page has been dropped.
In my search for a solution I just did a search for those two words, and none of the top 10 sites had properly formed <h> tags. 8 of them had no <h> tags at all, and the others had style code enclosed in the tag.
Has anyone else here noticed this effect?
I guess that <h> tags are an easy target. Google was known to weigh these highly and with a bit of CSS you can easily change the viewable text to look like body text or whatever.
I would be interested to hear comments on this hypothesis.
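If you want to repeat that top-10 check without eyeballing every page, here is a rough script. Assumptions on my part: you paste the result URLs in by hand, and counting <h1>-<h6> tags with a regex is a fair proxy for "properly formed <h> tags" (it won't catch headings disguised with CSS, which is exactly the trick mentioned above).

import re
import urllib.request

# Placeholder: paste in the actual top-10 result URLs by hand.
URLS = ["http://example.com/"]

HEADING = re.compile(r"<h([1-6])[^>]*>", re.IGNORECASE)

for url in URLS:
    try:
        html = urllib.request.urlopen(url, timeout=10).read().decode("latin-1", "replace")
    except Exception:
        print(url, "-> fetch failed")
        continue
    levels = HEADING.findall(html)
    print(url, "->", len(levels), "heading tag(s), levels:", sorted(set(levels)))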
Best wishes
Sid
I have noticed the same thing. My competitor has no <h1> text and has remained at #1. I have one <h1> tag, at the very bottom of the web page, containing the two-word combo, and I got nuked for that combo.
Everything else is basically the same between the competitor and me.
Can anybody else confirm this?
Update: None of the top 10 for my keywords has any <h1> or <h2> text.
I will keep everybody informed, if my home page comes back from the depths of despair.
Bull and Prejustic, your profiles indicate that you are outside North America; I wonder if this may have anything to do with you being unable to confirm the results (i.e. Google has not dealt with European sites yet).
A site that was optimized about 5 months ago absolutely failed to rank for a fairly easy 3-keyword phrase. Now it pretty much owns the 1st page for that search.
Another page (different site) that never used to show up for that same phrase started showing up mid-week, but then the page was modified to use <h2> tags, and starting last night it disappeared for that phrase. The <h2> tags are now being removed, and we will see quickly if it makes a difference. (The site gets freshed daily.)
Write two keywords:
Okay, so you're not content with widgets, they also have to be blue. Or is it that you're not content with blue, it must also be widgets? Or is it really blue gadgets you want, but you don't know the name? Before, this would get closer matches; now the broad match kicks in.
With great respect, this doesn't explain what we see. We have a number of sites in different categories. Only one of them is in a competitive AdWords category, and that category pre-Florida saw a mix of well-SEO'd and over-SEO'd (spammy) sites in the top spots.
Our sites in the less competitive categories are up in traffic. The one site in the competitive category saw its index page drop from page 2 to page 37, then disappear altogether.
Since Florida, this one competitive category now shows SERP's dominated by directory pages (and not even the best choices; probably the broad match factor), news sites, book sites, and .edu / .gov sites with varying degrees of connection to the topic. With one exception, all of the sites in the previous top 20 spots are gone.
If broad matching *without severe filtering* were the case, one would have expected to see at least some of the very clean sites from the past still showing up. They are good sites, high traffic, relevant, etc. True, they were SEO'd... but in a way consistent with Brett's 12 rules and without spam.
If I saw the spammy sites get blown away and a majority of clean ones remain - mixed in with the broader match results - I could agree that this is a broad match push. But when a massive number of clean, relevant sites gets wiped away along with the spammy ones, then the baby has been thrown out with the bathwater, and it seems clear that broad matching alone is not all that is at work here. Broad matching might or might not improve the SERP's. But broad matching, if it eliminates previously relevant, clean pages, is surely not something any thinking SE would choose.
Thus, unless I'm missing something, I'm left with Kackle's suppositions as the best, if somewhat improbable, explanations for what I see.
That said, I see an exception so far for every one-factor-only rule in the theory.
The one site that I noted above (the one that slips through to remain in the current SERP's of this competitive category) continues to show KW densities of 10%-12% for two-word and three-word keyphrases that are clearly in the 'dictionary', if the dictionary exists (based on the "-ljxasldj -kdjakdls" test). Both of these phrases are in the site's title and on the page with high density. Also, the homepage uses <H1> tags with both phrases.
So if the dictionary exists, it must trigger a filter that contains multiple conditions. And I can confirm that the conditions are not simply:
--"keyword phrase in title"
--"high keyword phrase density"
--"use of <H1> tags"
--any combination of the above...
I'm still searching for a set of conditional IF's that can be proven to exclude sites across all examples...
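For what it's worth, this is how I've been keeping the bookkeeping straight while hunting for those conditional IFs. Every predicate and threshold below is my own guess for record-keeping, not anything Google has confirmed; the point is just to note which combinations fire on filtered pages but not on survivors like the site above.

# Feature dicts are filled in by hand from each page you inspect.
def kw_in_title(page):
    return page["phrase"] in page["title"].lower()

def high_density(page, threshold=0.10):
    return page["density"] >= threshold

def uses_h1(page):
    return page["h1_count"] > 0

CONDITIONS = [kw_in_title, high_density, uses_h1]

def fired(page):
    """List the conditions that fire for a page; compare filtered pages
    against survivors to rule combinations in or out."""
    return [c.__name__ for c in CONDITIONS if c(page)]

# The surviving site described above fires all three conditions yet is
# not filtered, so neither any single condition nor their plain
# conjunction can be the whole filter.
survivor = {"phrase": "blue widgets", "title": "Blue Widgets Inc",
            "density": 0.11, "h1_count": 2}
print(fired(survivor))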
One question: do people believe this dictionary, if it exists, is tied only to homepages right now?
My sites have plunged.
All else seems equal (Titles, etc.)
Of course the older sites do have more backlinks.
So # of backlinks and/or age makes a difference?
-c
Cayenne, our site has been in Google since day 1 and had ranked well on our key phrases since January 2001, yet it has tanked.
Caveman, I don't think it is the h tag alone, but there is a nugget there that says, all else being equal, the h tag will do you in.
I'm not seeing any comments on the huge number of Amazon results that have re-appeared.