Kackle - can you explain the "dictionary" for me? And how I might benefit from it - I'm reading your posts hard but don't see where you're coming from.
Sure. But you have to act quickly. Google will fix this one just like they fixed the hyphen.
1. Google is demoting pages/sites that are over-optimized for certain keywords or keyword combinations. It does this by looking up search terms in a dictionary of target keywords or keyword pairs that it has compiled. This dictionary is Top Secret, because if you knew what was in the dictionary, you could avoid these words in your optimization efforts.
2. If the search term or terms hit on a dictionary entry, the search results for that user's search are flagged. This means that before the results are delivered, the order of the links, or even the inclusion of links, is adjusted so as to penalize pages that have over-optimized for those terms. Most likely the title, headlines, links, and anchor text are examined. It's possible that external anchor text pointing to the page has also been pre-collected and is available for scanning, but this is much less likely. (Besides, external links are not something within your immediate control, so don't worry about them right now.)
3. You want to find out which of the keywords relevant to your site are in Google's dictionary. Compile as many relevant keywords as you can think of that searchers might use to find your site. Now take these words singly and in pairs, according to how users might search. Run two searches for each combination - one normal, and one with a nonsense exclusion term appended, which appears to bypass the filter - and compare the results.
4. If the results are strikingly different for the pre-filter and the post-filter search on a particular term or combination of terms, it means that some variation of those terms has been flagged because something was found in Google's dictionary.
5. Do lots of searches and you can come up with a list of "sensitive" words that you'll want to avoid when you re-optimize your pages.
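The compare-the-two-searches step can be made less tedious with a small script. This is just an illustrative sketch (the URLs below are placeholders - you would still collect the two top-10 lists by hand from the normal and the filtered search): it measures how much two ranked result lists diverge.

```python
# Sketch for step 4: quantify how different two top-10 result lists are.
# Low overlap / big rank shifts suggest the term hit the "dictionary".

def serp_divergence(normal, filtered):
    """Return (overlap_count, avg_rank_shift) for two ranked URL lists."""
    common = set(normal) & set(filtered)
    if not common:
        return 0, None
    # For each URL in both lists, how far did it move?
    shift = sum(abs(normal.index(u) - filtered.index(u)) for u in common)
    return len(common), shift / len(common)

# Placeholder results pasted in by hand:
normal_top10 = ["a.example", "b.example", "c.example", "d.example", "e.example"]
filtered_top10 = ["c.example", "f.example", "a.example", "g.example", "h.example"]

overlap, avg_shift = serp_divergence(normal_top10, filtered_top10)
print(overlap, avg_shift)  # 2 2.0
```

A strikingly low overlap for one phrase, compared with a near-identical pair of lists for a similar phrase, is the signal you are looking for.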
It's a nice weekend project.
They have observed incorrectly!
Firstly, it would be very easy for users to directly affect other people's sites by putting up "targeted" anchor-text links and directing them to a competitor's site.
Secondly, if your site's name is red-widgets.com, I DON'T see how they could possibly penalize a site for having links pointing to it such as visit < red-widgets.com >.
I don't think this has anything to do with anchor text selection regarding inbound links.
(he said..., as he collapsed on the floor)
<edit: grammatical for dramatic effect>
Whatever the reason, the -systsrs exclusion is merely bringing up pre-Florida-style results. Another way of saying that is: heavily anchor-text-weighted results. Why they left that data there (at least temporarily) is one question. Why something so mundane generates a conspiracy theory is another question.
I don't think you appreciate the significance of this latest update.
In Google's history to date, ranking has been primarily determined by PageRank. This calculation was so intensive that it took several days to compute for the entire web. Consequently, it was done once per monthly crawl.
Other factors were used for ranking on-the-fly. Since every search result produced only the top 10 or top 100 or top 1000 results based on PageRank, this subset of results was small enough to put through a real-time analysis that included dozens of other factors. Most of these involved on-page factors -- title, headlines, domain, path + filename, etc. The external link analysis and anchor text was presumably done at the same time as the crawl, or when PageRank was calculated. In fact, some of the on-page considerations were probably pre-computed also, as there is room for coding it in the inverted index, as described in that famous early paper by Brin and Page.
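None of us has Google's actual code, but the two-phase scheme described above - a precomputed global score selecting a candidate set, then real-time on-page analysis reordering that small subset - can be sketched like this (all scores, weights, field names, and data are made up for illustration):

```python
# Toy two-phase ranker in the spirit described above. Phase 1 does a
# cheap cut by a precomputed score (think PageRank); phase 2 runs
# on-the-fly on-page analysis only over that small candidate subset.

def rank(query, index, pagerank, candidates=1000):
    # Phase 1: select top candidates by precomputed score
    docs = sorted(index, key=lambda d: pagerank[d["url"]], reverse=True)[:candidates]

    # Phase 2: real-time on-page factors (title, URL) reorder the subset
    def onpage_score(doc):
        score = pagerank[doc["url"]]
        if query in doc["title"].lower():
            score += 1.0  # invented title-match weight
        if query in doc["url"]:
            score += 0.5  # invented domain/path-match weight
        return score

    return sorted(docs, key=onpage_score, reverse=True)

index = [
    {"url": "red-widgets.example", "title": "Red Widgets"},
    {"url": "portal.example", "title": "Everything Portal"},
]
pagerank = {"red-widgets.example": 0.3, "portal.example": 0.6}

results = rank("widgets", index, pagerank)
print([d["url"] for d in results])  # the on-topic page outranks the higher-PR portal
```

The point of the sketch is the architecture: a new filter applied inside phase 2 could cancel pages on the fly without touching anything precomputed, which matches the switch-on/switch-off behavior people are observing.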
With this update, we see a new kind of on-the-fly ranking that is throwing out pages that always ranked high under the old criteria. It's effectively a cancellation of previous criteria. It seems that this is done on the fly, otherwise we wouldn't be able to switch it on and off by exploiting a minor bug.
It appears that this new filter is based on keywords, plus what Google perceives as over-optimization. This is a departure from old methods. The mere fact that top sites are tanking based on their use of particular keywords is evidence enough that something new has happened.
It's not just another ranking phenomenon. If you were sitting pretty for years in Google, and suddenly you disappear for your carefully-chosen keywords, it's not just "Ho-hum, the algo was tuned again."
The imperative is to understand what has changed. The mere fact that WebmasterWorld traffic related to this update has skyrocketed should tell you that something is different. The posts from dozens of webmasters who are trying to understand what happened should tell you that something is different. Yet you sit at your keyboard and tell us that everything is normal.
The change in Google for top English-language e-commerce sites is so drastic that it suggests a reordering of priorities at the Googleplex. Alternately, it's possible that Google is increasingly incompetent with their algorithms. Or perhaps Google is competent when they want to be, but they are so focused on getting rich that they don't care about algorithms anymore. Any of these three possibilities is very interesting to many people -- even if you aren't one of those webmasters who is suddenly seeing a loss of traffic and income.
I don't understand how you can dismiss all this interest in this update, and claim that everything is normal. Clearly something has changed, and you have yet to tell us what Google is doing differently that explains all the changes that have been observed. Your complaint that newbies are misleading other newbies won't fly.
Sorry about that, but it is simply unconvincing. Something else is happening here.
Well, hopefully this will help people see that you obviously know nothing about this. You should stop confusing people.
For the past six months anchor text has been the primary determiner of the SERPs, as allinanchor: has closely paralleled the actual SERPs.
===
"Clearly something has changed"
The algorithm.
Maybe I've finally lost it and need a few brews to clear out the cobwebs. Hmmm Good Idea!
The past 6 months have been no different from any other as far as what we (not in the Google know) *know*. All guesswork and supposition.
Dave
Search 1 (8 million results, Top-10) - normal / with -[exclusion]:
All keywords in title....6 / 9
Exact phrase in title....0 / 8
Only 1 keyword in title..2 / 0
No keyword in title......2 / 1
Search 2 (4 million results, Top-10) - normal / with -[exclusion]:
All keywords in title....7 / 9
Exact phrase in title....5 / 8
Only 1 keyword in title..3 / 1
No keyword in title......0 / 0
See the changes on the top listings for titles?
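For anyone wanting to reproduce this tally on their own searches, here is a rough sketch of how the four title buckets above could be counted (purely illustrative - note the naive substring matching would, for example, find "red" inside "hundred"):

```python
# Classify a result's title into the four buckets used in the table.
# Buckets are mutually exclusive: an exact-phrase match is counted
# before the all-keywords bucket, matching how the tallies add to 10.

def title_bucket(title, phrase):
    words = phrase.lower().split()
    t = title.lower()
    present = [w for w in words if w in t]  # naive substring test
    if phrase.lower() in t:
        return "exact phrase in title"
    if len(present) == len(words):
        return "all keywords in title"
    if len(present) == 1:
        return "only 1 keyword in title"
    if not present:
        return "no keyword in title"
    return "some keywords in title"

# Placeholder titles:
print(title_bucket("Cheap Red Widgets Sale", "red widgets"))  # exact phrase in title
print(title_bucket("Red Hot Widgets", "red widgets"))         # all keywords in title
print(title_bucket("Widgets Warehouse", "red widgets"))       # only 1 keyword in title
```

Run it over the top-10 titles from each of the two searches and compare the counts.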
Note - I've noticed that some searches are now returning the same (bad) results with or without the -dsajkd. Are they "fixing" this little bug?
Keyword1 keywording (notice the "ing" at the end of the second keyword) and my page is lost.
Keyword1 keyword (no "ing") and I am number one again.
This probably doesn't apply to most searches that are singular/plural, but for the "ing" type words, this might be significant.
My title and optimization are based on keyword1 keywording. I do have the non-"ing" word form a couple of times in the text, but not in the title.
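If this holds up, whatever is filtering seems to key on the literal term rather than the stem. A naive suffix-stripping stemmer (nothing like whatever Google actually runs - purely illustrative) shows how the two forms would collapse if stemming were applied:

```python
# Toy stemmer: strip a couple of common suffixes so "keywording" and
# "keyword" map to the same stem. If the filter stemmed terms this way,
# the two searches would behave the same -- but they don't.

def naive_stem(word):
    for suffix in ("ing", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

print(naive_stem("keywording"))  # keyword
print(naive_stem("keyword"))     # keyword
```

The fact that the "ing" form tanks while the bare form ranks #1 suggests the flagged entries are literal strings, not stems.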
Multiple keywords are more specific. There is no doubt about that. However, the average searcher often uses just the main keyword(s). If you look at a list of the top few hundred searched keywords on the net, you will see most of the big ones are single words.
For example, "jokes" is more common than "blonde jokes" or "dirty jokes".
hmmm...
Keyword1 keywording (notice the "ing" at the end of the second keyword) and my page is lost. Keyword1 keyword (no "ing") and I am number one again.
This probably doesn't apply to most searches that are singular/plural, but for the "ing" type words, this might be significant.
My title and optimization are based on keyword1 keywording. I do have the non-"ing" word form a couple of times in the text, but not in the title.
WOW...us too! Now that's an interesting find! :)
That said, after a little bit of poking around, it seems to me that the new algo is having the most effect on searches that list lots of AdWords ads on the right.
If there is screening based on advertising, well...
WBF (-qwdter)
I guess I am one of the n00bs that stevb keeps referring to, but I have to say that a week ago, if somebody typed in those keywords, they found sites that offered those products. Now when those keywords are typed in, they find 8 out of 10 directory sites. I suppose those sites are relevant, but it definitely makes the end user jump through more hoops. Are they really more relevant than the previous results?
I was working on one of my clients' websites. The site is a couple of years old. We started working on it four months ago. The optimized content has been created and uploaded. The link-popularity building has just started; there are only 10 inbound links. Google ranked it in the top 10 for 3 keywords, and in the top 40 for the other three keywords. A month later the site is nowhere within the top 300. We have not used any spam tricks.
Please advise. Will it again come up? What is happening?
The anchor text theory continues to explain the discrepancies simply and adequately. Results for Mongolian yurts are only slightly affected, because no one in their right mind is optimizing for that. Results for hotly contested phrases are going crazy, because everyone with 500 anchor-text links was in the top two pages last month and now suddenly they're on a level playing field with sites that only have 30 anchor-text links, and consequently some sites are gaining or losing 150 places while this sorts out.
I appreciate what Flicker is saying in post 977, but wouldn't the real estate site that is showing up for "jewelry" searches seem to argue against this? If the algo is now simply giving less weight to off-page factors, then how does this site score in the top ten with just a page called "ring"?
I'm not sure which datacentre I'm seeing, but it seems that a "switch" has just been flicked. A lot of anchor-text pages (GB spam) have just dropped out of the SERPs for my KW1 KW2, etc. Incidentally, how do I tell which DC my live Google is using? Thanks.
irishaff, the DC you get will bounce around; it depends on which one Google pointed you to the last time your PC did a DNS inquiry to them.
However, there is a way to tell.
Right after you get an interesting search result (i.e. - before it changes), check what IP address your PC is using for Google.
One way to do this is to open a DOS or Command-Line window, and then type the command -
ping www.google.com
The first line that comes back will look something like this -
Pinging www.google.com [216.239.37.100] with 32 bytes...
That # (216.239.37.100) is the IP Address you are using for Google at the time.
That will also identify which DC you were using.
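If you want to automate the extraction, a small helper (illustrative only) can pull the IP address out of that first line of ping output:

```python
# Extract the bracketed IP address from the first line of ping output,
# e.g. "Pinging www.google.com [216.239.37.100] with 32 bytes of data:"

import re

def ip_from_ping(line):
    m = re.search(r"\[(\d{1,3}(?:\.\d{1,3}){3})\]", line)
    return m.group(1) if m else None

line = "Pinging www.google.com [216.239.37.100] with 32 bytes of data:"
print(ip_from_ping(line))  # 216.239.37.100
```

Note the IP only tells you which datacentre you hit at that moment; the next DNS lookup may hand you a different one.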