Forum Moderators: open
Kackle - can you explain the "dictionary" for me? And how I might benefit from it - I'm reading your posts hard but don't see where you're coming from.
Sure. But you have to act quickly. Google will fix this one just like they fixed the hyphen.
1. Google is demoting pages/sites that are over-optimized for certain keywords or keyword combinations. It does this by looking up search terms in a dictionary of target keywords or keyword pairs that it has compiled. This dictionary is Top Secret, because if you knew what was in the dictionary, you could avoid these words in your optimization efforts.
2. If the search term or terms hit on a dictionary entry, the search results for that user's search are flagged. This means that before the results are delivered, the order of the links, or even the inclusion of links, is adjusted so as to penalize pages that have over-optimized for those terms. Most likely the title, headlines, links and anchor text are examined. It's possible that external anchor text pointing to that page has also been pre-collected and is available for scanning, but this is much less likely. (Besides, external links are not something within your immediate control, so don't worry about it right now.)
3. You want to find out which keywords that are relevant to your site are in Google's dictionary. Compile as many relevant keywords as you can think of that searchers might use to find your site. Now take these words singly and in pairs, according to how users might search. Run two searches for each combination - one normal, and one with a nonsense exclusion term like -wqwqzw appended to bypass the filter - and compare the results.
4. If the results are strikingly different for the pre-filter and the post-filter search on a particular term or combination of terms, it means that some variation of those terms has been flagged because something was found in Google's dictionary.
5. Do lots of searches and you can come up with a list of "sensitive" words that you'll want to avoid when you re-optimize your pages.
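The comparison in steps 4 and 5 can be sketched in a few lines. This is a toy illustration, assuming you have already collected the top result URLs for both the normal (filtered) search and the same search with the -wqwqzw exclusion (unfiltered); the fetching itself is left out, and the example URLs are made up.

```python
def compare_serps(filtered, unfiltered):
    """Return URLs the filter appears to have dropped, plus rank shifts
    (positive = demoted in the filtered results) for URLs in both lists."""
    dropped = [url for url in unfiltered if url not in filtered]
    shifts = {}
    for rank, url in enumerate(unfiltered, start=1):
        if url in filtered:
            shifts[url] = filtered.index(url) + 1 - rank
    return dropped, shifts

# Hypothetical example data standing in for scraped result URLs:
unfiltered = ["a.com", "b.com", "c.com", "d.com"]   # search with -wqwqzw
filtered   = ["c.com", "a.com", "d.com"]            # normal search

dropped, shifts = compare_serps(filtered, unfiltered)
print(dropped)   # ['b.com']  - candidates hit by the filter
print(shifts)    # {'a.com': 1, 'c.com': -2, 'd.com': -1}
```

A search term whose `dropped` list is large, or whose rank shifts are strikingly different, is a candidate "dictionary" entry.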
It's a nice weekend project.
What in the world is going on over there? At this moment, Google is useless to me (and to the searcher, IMHO). Almost every search I do returns DMOZ, Yahoo directories, Google directories, Epinions and KMart. Not exactly focused results if you ask me.
I just wonder about my funky AdWords campaign results and the relationship to the Florida fiasco.
Hmmmm.... November 22 is an appropriate day for conspiracy theories.
Googlebot visited us and today we are showing fresh results with NO improvement in SERPs. We have only 3 external links to the site that use keyword1 keyword2 in them. The index page has a PR4.
Too bad it didn't help. But I congratulate you on this careful, rational test and your concise report of your results.
This would suggest that Google is keeping, along with a compressed version of the entire page (which is used to extract the snippet), a list of keywords from external links to that page.
I had my doubts that this was happening with external links, because it would mean that much more planning went into this filter than otherwise. The external link anchor text is most likely not collected on the fly. That would be too expensive in terms of CPU overhead. It's the sort of thing that would be collected once per crawl -- just like the old PageRank had to be computed once per monthly crawl.
But it makes sense that Google would someday use external links as part of the filter scan. I'd try zapping the one mention of your phrase in the index text, so as to completely sever the connection of that page from the external anchor text.
The reason I say that is because my favorite example of external anchor text making all the difference was that a search for "discount brokers" (don't use the quotes) used to produce an empty directory in the number one or two spot. I checked it three days ago and this was still happening. It's been empty for a year, and Google has been doing this since I first noticed it in April. But I checked just now and it is gone! For the very first time since April!
Maybe Google now requires that the external anchor text appear somewhere on the page before they give it credit. Now that would be, in my book, a real improvement.
Keep in mind that Google can react very quickly with this new filter. There might be some AI (neural networks, self-organizing maps, etc) elements involved, but even so, it seems to me that Google would do most of the "training" of the algo off-line and begin with a fairly stable dictionary. I can't believe that major dictionary shifts would be tolerated online -- it would be too unstable and ruin Google's reputation.
One more thing. There are a lot of comments about keyword1 and keyword2, etc. It would be very helpful if more posters tried to make a determination, while the "-wqwqzw" test is still available to us, whether either or both of the keywords appear to be "dictionary sensitive," either alone or when paired, and when paired, whether they are sensitive to which one is first.
Occam's Razor suggests that all else being equal, you should go with the simplest, least elaborate theory that explains the phenomenon... it seems to me that if non-competitive and informational searches have been very stable while competitive commercial ones have been volatile, the simplest explanation is that one of the previous major optimization factors has been reduced in importance, causing a big shake-up among the most highly optimized searchterms. If it was a penalty, then either all the searches would have been volatile across the board (not true) or Google would have had to spend a tremendous amount of time on this putative dictionary, which wouldn't even account for why some commercial and even spammy results are doing fine for competitive terms. My money's on excessive repetition of keywords simply having been de-weighted somehow, possibly combined with some merging of singular and plural words and the like. That would account for all the industry-specific chaos without elaborate conspiracy theories or incorrect predictions.
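To make the singular/plural merging theory concrete: if Google now folds simple plurals into their singular form before scoring, pages stuffed with both forms would lose their double counting overnight. The naive stripper below is purely an assumption for illustration, not Google's actual rule.

```python
def naive_singular(word):
    """Crude plural-to-singular folding, for illustration only."""
    if word.endswith("ies") and len(word) > 3:
        return word[:-3] + "y"                    # agencies -> agency
    if word.endswith("es") and word[:-2].endswith(("s", "x", "ch", "sh")):
        return word[:-2]                          # boxes -> box
    if word.endswith("s") and not word.endswith("ss"):
        return word[:-1]                          # brokers -> broker
    return word

terms = ["broker", "brokers", "discount", "agencies"]
print([naive_singular(t) for t in terms])
# ['broker', 'broker', 'discount', 'agency']
```

Under such a merge, "broker" and "brokers" count as one term, which alone would reshuffle heavily optimized commercial searches while leaving informational ones largely untouched.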
I'm sorry this is causing such financial distress for so many people, because it's rather fascinating from a purely observational perspective. )-:
==========================
This Florida update has been quite harsh on online businesses, and the timing made it worse - many people, whether their sites got bumped up or dumped, were caught unprepared!
Those bumped from nowhere into top spots are getting far more traffic, orders and inquiries than they anticipated or prepared for, so handling the volume is difficult - products running out of stock, employees working overloaded - resulting in lower-quality service to the customers who are the end users.
Those who got dumped, on the other hand - I can imagine the frustration: products sitting unsold in the warehouse, employees with no work to do, and so on...
[although I have worries about the Google Razor ;-) ]
>>> Hmmmm.... November 22 is an appropriate day for conspiracy theories.
Heck, Every day is an appropriate day for conspiracy theories on these boards. Even the date of the first Apollo landing will do!
we see NO penalty for keyword1 or keyword2 or keyword2 keyword1
However, when we search keyword1 keyword2 keyword3,
an interior page ranks higher than our index page; our index page is indented under our #3-ranking interior page.
Anyone been hit badly on sites that haven't developed new links lately, i.e. messed up worse than pre-September levels?
the search for keyword8 <country name> on www-va brings up more than 3.5 million results, with me at #2 with a sub-sub-sub page (PR 0). keyword8 is a very, very generic term, returning more than 17 million pages on Google. It's really our 8th keyword, as we use it only for descriptive purposes.
The search for keyword1 keyword8 <country name> on the same datacenter has only a total of 33,600 pages. And we cannot be found at all.
Keyword1 is the name of the industry I'm working in. A search for keyword1 shows my site on -va on #80 the index page being a PR6
Both keywords appear in the title of the page, and both twice in the body. There's no description for this page.
Does this sound like anyone will like the results? Does anyone have any questions?
>>> The search for keyword1 keyword8 <country name> on the same datacenter has only a total of 33,600 pages. And we cannot be found at all.
I repeated that with the "-wqwqzw" and my sub sub sub page shows up #1 plus a second much more relevant page from my site. Hope this helps
So today I downloaded that abc tool for checking pagerank across the data centers.
When I input the URL of my site, the program reports "Dea" in every slot for every data center, instead of a PageRank value.
I have input some of my other sites and it works fine.
Does anyone know the reason for this? Please!
As for this Google update, to be totally truthful, some of the results are not quite as bad as some are making out. My sites are far down the SERPs, unfortunately, but once I figure out why, I can go forward.
My favourite set of results so far are at [labs.google.com...]
lol - I don't think that set of results counts for anything toward the latest update, but the results are completely different :)
Again, NO. My 2 amateur sites happen to have all the above, and I have NOT dropped for the relevant keywords.
I believe it! My jewelry site vanished after being #2 or #3, and always in the top ten, on a certain two-word search for two years. I've noticed the other top-ten sites have also dropped drastically or disappeared. Now just junk is coming up in the results.
Noticing an increase of other search engines in the referral stats. At least I'm still #2 or #3 in those.
Trying to find good search results now is a waste of time. I've switched to other search engines myself. I hope they sort out the mess soon.
1) The domain does NOT end in .com
2) The page is in a language other than English
off topic , gotta wonder about Brett's bandwidth bill this month?