Update Florida - Nov 2003 Google Update Part 4

         

Kackle

5:57 am on Nov 22, 2003 (gmt 0)



Continued from: [webmasterworld.com...]

Kackle - can you explain the "dictionary" for me? And how I might benefit from it - Im reading your posts hard but dont see where youre coming from.

Sure. But you have to act quickly. Google will fix this one just like they fixed the hyphen.

1. Google is devaluing pages/sites that are over-optimized for certain keywords or keyword combinations. It does this by looking up search terms in a dictionary of target keywords or keyword pairs that it has compiled. This dictionary is Top Secret, because if you knew what was in the dictionary, you could avoid these words in your optimization efforts.

2. If the search term or terms hit on a dictionary entry, the search results for that user's search are flagged. This means that before the results are delivered, the order of the links, or even the inclusion of links, is adjusted so as to penalize pages that have over-optimized for those terms. Most likely the title, headlines, links and anchor text are examined. It's possible that external anchor text pointing to that page has also been pre-collected and is available for scanning, but this is much less likely. (Besides, external links are not something within your immediate control, so don't worry about them right now.)

3. You want to find out which keywords relevant to your site are in Google's dictionary. Compile as many relevant keywords as you can think of that searchers might use to find your site. Now take these words singly and in pairs, according to how users might search. Run two searches for each combination (one normal, one with nonsense exclusion terms appended) and compare the results.

4. If the results are strikingly different for the pre-filter and the post-filter search on a particular term or combination of terms, it means that some variation of those terms has been flagged because something was found in Google's dictionary.

5. Do lots of searches and you can come up with a list of "sensitive" words that you'll want to avoid when you re-optimize your pages.

It's a nice weekend project.
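The compare-and-flag procedure in steps 3-5 could be sketched like this. This is a hypothetical illustration, not anything Google has confirmed; it assumes you have already collected the two ranked URL lists by hand, and the function names and the 50% threshold are my own inventions:

```python
# Hypothetical sketch: compare a "normal" (post-filter) result list with
# the list returned when nonsense exclusion terms are appended (pre-filter).
# Both inputs are ranked lists of URLs, collected manually.

def compare_serps(post_filter, pre_filter, top_n=10):
    """Return the overlap of the top-N results and the URLs that vanished."""
    post_top = post_filter[:top_n]
    pre_top = pre_filter[:top_n]
    overlap = [url for url in pre_top if url in post_top]
    vanished = [url for url in pre_top if url not in post_filter]
    return overlap, vanished

def looks_filtered(post_filter, pre_filter, top_n=10, threshold=0.5):
    """Heuristic guess: if fewer than `threshold` of the pre-filter top-N
    survive into the normal results, assume the terms hit the dictionary."""
    overlap, _ = compare_serps(post_filter, pre_filter, top_n)
    return len(overlap) / top_n < threshold

# Example with made-up result lists:
pre = [f"site{i}.example" for i in range(1, 11)]
post = ["siteA.example", "siteB.example"] + pre[6:]
print(looks_filtered(post, pre))  # True: only 4 of the old top 10 survive
```

Run the plain search and the nonsense-exclusion search for each keyword combination, feed both top-10 lists in, and any True is a candidate "sensitive" combination.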

plasma

5:20 am on Nov 23, 2003 (gmt 0)

10+ Year Member



>Wht ds tht mn?
Wll ppl chng thr srch bhvr?

Like on Jeopardy, would you like to buy a vowel?

I would like to buy an E like Esmeralda ;)
I would buy a D like Dominic, too, but that is not a vowel =)

Goanna1

5:24 am on Nov 23, 2003 (gmt 0)

10+ Year Member



If there is a filter, it is not clear what is activating it. I have found exceptions to each of the following theories in the top 10 of a highly competitive set of keywords:

1. over optimization:
(I can see sites in the top ten that have the searched keywords in the title, H tags, text, links, and image tags.)

2. internal anchor text:

3. external anchor text:
(I can see sites on top that have hundreds of incoming links with the searched keywords.)

4. "money words":


Does anybody have any ideas?

ronhollin

5:35 am on Nov 23, 2003 (gmt 0)

10+ Year Member



I think I may have figured it out! Go back and read otnot's post # 1005. I think he may be on to something. I did a little experiment that consisted of about 10 searches. I took text/content from my index page and typed it into Google. The text was very "non-optimized" phrases. These were not keywords. I think Google is going away from the Title/H1/text-links way of ranking sites and going strictly with content. Every single search that I did with content from my site, we came up number one for. These were just words on my site, not keywords. Crazy words that we should never come up for.

claus

5:48 am on Nov 23, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Just re-read the whole thread, parts 1,2,3 in reverse order... feel a bit tired now, thankfully a lot had been modded out from part 1.

Now that i think i might perhaps claim at least a little bit more than no understanding at all, i find the whole idea excellent.

  • singular-plural,
  • more results showing up,
  • dictionary (the "did you mean" thingy, that is..)
  • recently introduced synonyms operator
  • reshuffling of the directory
  • an english language only update,
  • pages ranking for synonym keywords,
  • directories doing well, having lots of words,
  • results getting more fuzzy the more kw's you use instead of the other way round
  • index pages, dropping (for the optimized kw's)
  • shopping, ecommerce, product-specific, dropping (for the optimized kw's)
  • referrals from a broader array of keywords, and more phrases
  • hard to find very specifics unless "quoting"
  • and the "-asads" that is not a bug but a feature, (synonym for "exact phrase" which isn't default)

Broad matching is the main new component. I'm not in doubt about that.

Only.... right now the www serps seem pretty boring...are we back in the garage again? Please don't tell me that some management or marketing guy got cold feet...

/claus

aspdesigner

5:50 am on Nov 23, 2003 (gmt 0)

10+ Year Member



RonHollin, check out my post #679 way back on page 46; it goes into more detail on this.


These were just words on my site and not keywords. Crazy words that we should never come up for.

Exactly my point. Taking this approach results in poor SERP quality, which is why even AltaVista decided this approach to search rankings was not a good idea years ago.

Doing this to fight SEO, and destroying search quality in the process, is insanity.

plasma

5:52 am on Nov 23, 2003 (gmt 0)

10+ Year Member



Every single search that I did with the content on my site we came up number one for. These were just words on my site and not keywords. Crazy words that we should never come up for.

1. How many results were for these searches?
2. would a 'real' searcher use these search phrases for that topic?

I doubt that Google did/will tweak its algo to revert to the stone age of search engines.
Google's success lies in link popularity.

Google can change its algo but it can't change people's search habits.

plasma

5:55 am on Nov 23, 2003 (gmt 0)

10+ Year Member



BTW: The board is broken; for at least the third time it's displaying that this was my 249th post.

Powdork

5:58 am on Nov 23, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Like on Jeopardy, would you like to buy a vowel?
What is: Wheel of Fortune.;)

Kackle

6:07 am on Nov 23, 2003 (gmt 0)



<<In Google's history to date, ranking has been primarily determined by PageRank. >>

I hope you don't seriously believe this.

Such nitpicking.

The key word is "primarily." PageRank broke down last April, when the deep crawl stopped, and I argued that Google was broken in this thread [webmasterworld.com].

But for the 2.5 years I was watching prior to that, PageRank was primary. Moreover, the crawl was driven by PageRank. This was important for large sites. If you didn't have the PageRank, you couldn't get your whole site crawled. I'd always get about half of mine crawled before Googlebot got tired and went away.

The importance of PageRank was further evident on large sites when the deep pages had to inherit their PR from the main page. Typically, on a site with a PR 7 that had 50,000 deep pages in Google, each deep page would end up with a PR of zero. This was with optimum internal linking. It wasn't a penalty, it was a simple lack of juice. The situation is a lot better now for my deep pages; most are 3 or 4 while the home page is still a 7. This makes a huge, huge difference when 99.9 percent of your new visitors discover your site via a deep page hit in Google. Really, really huge. Like 1,000 referrals a day two years ago, vs. 25,000 now -- almost all from Google in both cases.

So yes, after three years of trying to get all 120,000 pages in Google, and watching how well they do, I'd say that in Google's history to date, ranking has been primarily determined by PageRank.

And GoogleGuy would probably agree with me, and he is all-wise and never lies. So there.

Powdork

6:17 am on Nov 23, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Broad matching is the main new component. I'm not in doubt about that.
From message 741
Broad Matching + overrelevancy filter = poor results

Yup, Didn't like it with adsense, don't like it here.

Plasma, methinks the update thread posts don't count towards your totals.

To me, Broad Matching should be a dating service.;)

aspdesigner

6:19 am on Nov 23, 2003 (gmt 0)

10+ Year Member



I noticed a few people mentioning that their AdWords hits were down.

A change in the algo should not have any adverse effect on PPC click-throughs.

I wonder if users are simply getting frustrated with the poor quality results and going elsewhere?

Powdork

6:27 am on Nov 23, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I wonder if users are simply getting frustrated with the poor quality results and going elsewhere?
I just had to look up restaurant industry stats and I did it with AltaVista. Both SEs had the info I was looking for, but I am not happy with G and am happy with AV.

Actually, neither had what i was looking for. That's a good thing because what i am looking for is behind members only stuff and if a search engine had it, that would be bad.

willardnesss

6:35 am on Nov 23, 2003 (gmt 0)

10+ Year Member



Just got back...

Kackle, otnot and I all seem to be onto something that can be scientifically proven... I can get truly predictable results on a whole range of websites when you use the following tests.

Although all of us are junior members, I wish Steveb would stop spinning this into a conspiracy theory, stop making condescending remarks, and actually go and test the theory out (your '-fufuf = allinanchor' theory was completely wrong, so go do some tests before you go on the attack).

I'm not saying Google is doing anything wrong, and I don't think they have a sinister motive...I'm just doing tests, analyzing the results, then trying to show predictability...that's how theories get proven, right?

First of all, there seems to be just one main database/index that Google is pulling from to return results... In its raw unfiltered form, it will return results similar to pre-Florida update. (Nobody's pages have been kicked out of the main index, and there are not separate databases for spammers and non-spammers.) The only thing different is that Google is now applying a new filter as results are returned to the user... if you know anything about database design & query statements: if you want to refine/filter your results, you don't delete the records from your database, you just add more if/then criteria to your database queries...

It is quite obvious that there are just new filters/refinements being applied to the same old index we had before (maybe slight updates were made to this index by the last crawl, but it is more or less the same).

The records are still in the database, because you can still find your site at its pre-florida position if you use the following techniques:

1: keyword1 keyword2 -gggg -ddddd
I've checked this with multiple sites, and this has been true with all of them... (let me know if this is not the case).

2: Use otnot's test: search for 1 unique word from your title, and 2 unique words from your body text that are not the 'optimized keyword combos'... most likely your site will come up in the top 10 (if you were there pre-Florida).

3: Now try searching with your optimized keywords... You vanish! This is because the new FILTER is checking to see if those 'popular' 'commercial' 'money' keyword combos are being used excessively on your pages (in the title, density in body text, possibly external anchor text)... If they are excessive, then you are filtered out. You are not erased from the database, you are just filtered out of the results that are returned to the user.

Now where does Google get these optimized spammy keyword combos? WE created the database for them... Google AdWords combos is my guess..... THIS IS NOT A CONSPIRACY! THIS IS NOT SINISTER! (Steveb) I think Google is wise if this is the case... who spams? People selling things, right? Who uses AdWords? People selling things, right? If Google wants to filter blatant spammers, then just check for 'fishy' activity with these AdWords keyword combos... the database is already created for them (no need for AI). Makes sense to me, and I applaud Google if this is their intent... I just think they have cranked the threshold of the filter up a bit TOO high.

Current Basic (dumbed down) Filter Example: Filter out all pages that use $$$ keyword combos over 3 times on 1 page... Perhaps if they just turn it down a notch to allow 5 times per page, then more 'white hat' sites will suddenly reappear... This could apply to titles, text, anchor text, etc.
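That dumbed-down filter could be written out in a few lines. This is purely illustrative; the flagged-combo list, the threshold, and the function names are all guesses, since nobody outside Google knows the real criteria:

```python
# Illustrative sketch of the threshold filter described above: drop any page
# whose text uses a flagged "money" keyword combo more than `max_hits` times.
# The flagged combos and the threshold of 3 are pure assumptions.

def passes_filter(page_text, flagged_combos, max_hits=3):
    """Return False if any flagged combo appears more than max_hits times."""
    text = page_text.lower()
    return all(text.count(combo.lower()) <= max_hits for combo in flagged_combos)

def filter_results(pages, flagged_combos, max_hits=3):
    """Keep only pages that survive the over-optimization check."""
    return [p for p in pages if passes_filter(p, flagged_combos, max_hits)]

pages = [
    "cheap widgets cheap widgets cheap widgets cheap widgets buy now",
    "a review of widgets, with one mention of cheap widgets",
]
print(filter_results(pages, ["cheap widgets"]))  # only the second page survives
```

Note that this is a post-query filter: the pages stay in the index, exactly as described above; they are just dropped as the results are returned to the user.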

Before slamming this theory, give it a try. I think you will find some educational results.... I wouldn't make any adjustments just yet though...the new backlinks have not been added....check -va for the NEW backlinks..but search results on -va seem to be the same as plain old google results.

the -fufu technique that Kackle discovered is a nice bug, but I'm not sure if google will race to fix it....You can also use otnot's technique to reveal which keyword combos are being filtered...and this is not a bug..it's just scientific method.

Adios.

As pointed out by other users, the -fufu -ffff -gggg things need to be different... sorry, in my tests I'd just slam my fingers on the keyboard to get: qwor qowiu oqiwu yrqw etc., but for this message I cut/pasted the -fufuf... sorry for the confusion.

[edited by: willardnesss at 8:20 am (utc) on Nov. 23, 2003]

seasalt

6:36 am on Nov 23, 2003 (gmt 0)

10+ Year Member



Where is Google Guy?

I guess Napoleon's steps to a Google update must be correct. I forget which step it was, but this must be the one where the SERPs are verifiably messed up and GG is absent and without any meaningful input.

seasalt

P.S. The obfuscation wears a little thin. IMO

Kirby

6:37 am on Nov 23, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



OK kackle, I tested and searched, searched and tested, but 24 hours later I'm not buying the -asdfasd theory. For one example, when I check "2 word city real estate" (a 4-word search) the results are the same either way. If I substitute 'homes' for 'real estate' (a 3-word search), again no difference. If I add 'for sale' (now a 5- or 6-word search), still no major difference. The results are dominated by so-called authority and directory sites. All previously relevant 1-10 results are gone and don't reappear using any of the conspiracy-theory search suggestions.

The only exceptions I'm seeing in one search are two sites that had huge backlinks relative to the niche, but little on-page optimization. They appear to actually be there because of the user-centric on-page content.

Not smart enough to know what any of this means, but I'm betting Googleplex is burning the candle at both ends.

Dazzlindonna, at first I meant codejam as a joke - not so sure anymore.

Kirby

6:43 am on Nov 23, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The records are still in the database, because you can still find your site at its pre-florida position if you use the following techniques:

1: keyword1 keyword2 -fufuf -fufuf
I've checked this with multiple sites, and this has been tru with all of them....(let me know if this is not the case).

Not the case! Back to the drawing board.

willardnesss

6:43 am on Nov 23, 2003 (gmt 0)

10+ Year Member



Hey Kirby, try this:

2 word city real estate -ggg -dddd -fffff -iiii -sssss

Make sure to put multiple -fufu -ytyty in the search!

Let me know what happens now.

[edited by: willardnesss at 8:21 am (utc) on Nov. 23, 2003]

LateNight

6:49 am on Nov 23, 2003 (gmt 0)

10+ Year Member



Kirby - on the 3 and above kw searches 2 -argf are required to view former results ie.

keyword1 keyword2 keyword3 -argf -argf will show pre-Florida until the programmer fixes the bug.

<EDIT>What willardnesss said</EDIT>

quotations

7:07 am on Nov 23, 2003 (gmt 0)

10+ Year Member



When I search for kw1 kw2 -fufuf -fufuf I get results which are nothing whatsoever like the pre-Florida results.

The site which has been at #1 for five years shows up #4 and the #2 site for the past four years shows up as #5. A major spam site shows up #1 and two of my sites are #2 and #3.

kw1 kw2 shows the old #1 site at #1, old #2 site at #2 spam site at #3 and my sites at #4 and #5.

Pre Florida my sites were #8 and #9 and the spam site was at #18. #1 and #2 were at #1 and #2, just like they have been for five and four years.

Kirby

7:08 am on Nov 23, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Still doesnt work.

But I just checked pure search and 7 out of 10 of last month's Google results are there for my kws... and Im #1 for the 2 main ones. So while G may have knocked me silly for:
A. Over optimization of H1, title, etc
B. Specific anchor text
C. Actually having words that describe my site on my site
D. all of the above
E. Who the heck knows

...INK decided to put up decent results that closely mirror Google's serps of two weeks ago.

IMO, outguessing Google is pointless. Even if you do "figure it out", dont make any changes because G isnt going to let this stand. They will get this fixed since there is no way to defend these serps when compared against AV and INK.

allanp73

7:15 am on Nov 23, 2003 (gmt 0)

10+ Year Member



It was annoying to read steveb and Bradbristol's tug-of-war posts. Obviously Steveb wasn't listening.
Let's look at the facts:
1) allinanchor and -fgfgfg are not the same and never have been. allinanchor always produces fewer results, and results which use the phrase in the anchor text only. The -fgfgf search should actually have no effect on the serps since, as stated before, no sane page would have this text. So the -dfdfdf is the same as natural results.

2) A strange thing happens when you type keyword phrase and -dfdfd -dddddd (I noticed the two minus thingies have to be different). Suddenly more results appear than the natural search results. This would imply that there is some filter being added to the serps. Possibly based on a dictionary lookup, or possibly based on over-optimized sites.

3) The break down:
sample search:
normal search: 1,620,000 results
allinanchor: 2000 results
quotes around phrase (exact search): 32,300 results
phrase -fgfgfgf: 1,650,000 results
phrase -fgfgfgf -fgqqqqq: 1,650,000 results
(all results show fresh tags)
The best quality results were shown with the double minus. Even the exact phrase shows signs of filtering.

steveb

7:17 am on Nov 23, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"(your '-fufuf = allinanchor' theory was completely wrong, so go do some tests before you go on the attack)."

Actually it is plainly right. One bit of evidence is that people have been saying over and over again that they see themselves where they were before the update. If you'd actually look you could see that the anchor text results now very closely correspond, and this is true for all types of searches where anchor text was the key factor before, and it has nothing to do with some imaginary dictionary. A simple command shows a search phrase (at least now) under the previous algorithm.

They changed the algorithm. Like with anything where there is a major change, some things did not go smoothly.

Instead of myths that can be disproved in ten seconds, people would do well to examine the algorithm: the increased weight of on-page word factors, the diminished role of anchor text, the increase in generic authority, etc.

==

"allinanchor always produces fewer results and results which use the phrase in the anchor text only."

I give up. If you honestly didn't know how anchor text was affecting the previous serps, then there is no way for you to discuss any of this.

aspdesigner

7:18 am on Nov 23, 2003 (gmt 0)

10+ Year Member



Kirby, as I indicated back on post #989 on Page 66, I think they may have fixed this. (While I was doing tests then, the results suddenly snapped to identical!)

With regards to the title relevancy tests I ran then, I have since tried the same tests with a couple of other SEs, to see how Google compared. Here are the results -

Search 1 (8 million in Google, Top-10 results)

values = all keywords in title, exact phrase in title
(larger # is better)

AllTheWeb: 10, 7
old algo (-): 9, 8
AltaVista: 8, 8
New Google: 6, 0

values = only 1 keyword in title, no keyword in title
(smaller # is better)

AllTheWeb: 0, 0
old algo (-): 0, 1
AltaVista: 1, 1
New Google: 2, 2

Search 2 (4 million in Google, Top-10 results)

values = all keywords in title, exact phrase in title
(larger # is better)

old algo (-): 9, 8
AllTheWeb: 9, 4
New Google: 7, 5
AltaVista: 6, 3

values = only 1 keyword in title, no keyword in title
(smaller # is better)

old algo (-): 1, 0
AllTheWeb: 1, 0
New Google: 3, 0
AltaVista: 4, 0

Note that in Test 1, the New Google consistently got the worst scores of the 4 engines, and in Test 2, it was in second-to-last place.
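The scoring behind the tables above could be reproduced with a sketch like this. This is my reading of the methodology, not a confirmed procedure, and the titles and keywords are made up:

```python
# Sketch of the title-relevancy scoring used in the tables above: for a
# top-10 result set, count titles containing every keyword, and titles
# containing the exact phrase. Matching is naive substring, case-insensitive.

def score_titles(titles, keywords):
    """Return (all-keywords-in-title count, exact-phrase-in-title count)."""
    phrase = " ".join(keywords).lower()
    all_kw = sum(all(k.lower() in t.lower() for k in keywords) for t in titles)
    exact = sum(phrase in t.lower() for t in titles)
    return all_kw, exact

titles = [
    "Blue Widget Store - buy blue widgets",
    "Widgets in blue and other colors",
    "Cooking book review",
]
print(score_titles(titles, ["blue", "widgets"]))  # (2, 1)
```

Running it against the top 10 from each engine for the same query gives number pairs directly comparable to the ones in the tables.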

BTW, on the other SEs I tried, ALL of the Top-10 listings for the search "jewely" were - guess what - related to jewelry!

Kirby

7:24 am on Nov 23, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



(I noticed the two minus thingies have to be different.)

We have a winner! Back to good results 1-10. So this means that all 10 from pre-florida pi$$ed off the Google gods?

Kirby

7:31 am on Nov 23, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



They changed the algorithm. Like with anything where there is a major change, some things did not go smoothly.

Instead of myths that can be disproved in ten seconds, people would do well to examine the algorithm: the increased weight of on-page word factors, the diminished role of anchor text, the increase in generic authority, etc.

steveb, I believe this is the answer.

quoting myself >The only exceptions I'm seeing in one search are two sites that had huge backlinks relative to the niche, but little on-page optimization. They appear to actually be there because of the user-centric on-page content.

allanp73

7:39 am on Nov 23, 2003 (gmt 0)

10+ Year Member



Steveb,
I have an excellent understanding of allinanchor and the pre-Florida serps. I don't see why you are being so stubborn. The double-minus results are not the pre-Florida allinanchor results, and they are not the same in any way as the allinanchor results; allinanchor produces far fewer results.
The double-minus results deserve some proper analysis. I have not determined why they produce different results, but I do know the results are less spammy and seem to give the largest sampling of the Google index.

aspdesigner

7:40 am on Nov 23, 2003 (gmt 0)

10+ Year Member



Yep! Double-dash and we're Top-10.

Without it we're down on page 3! We got replaced by a book review on a cooking web site that just happened to include the keywords sprinkled somewhere in the text, for a search that has nothing to do with either books or cooking.

But it's definitely a money search, with lots of AdWords on the results page. Funny, all of them are dead-on for the money topic I searched on (unlike the "new, improved" Google listings!)

Isn't progress wonderful?

TheDave

7:43 am on Nov 23, 2003 (gmt 0)

10+ Year Member



Running with the "jewelry" thing b4 it gets mod'd: go check the -wiuytuiw -wtuwitu -wtwruitywi results on jewelry. There's a spammer there with 3 listings. Now let's look at the filtered results. There's a listing at #4 about real estate. I definitely think they need to tone those filters down just a tad. They got rid of the duplicate guy, but somehow started returning irrelevant results along with the old results.

allanp73

7:46 am on Nov 23, 2003 (gmt 0)

10+ Year Member



Kirby,

I think you're wrong. I don't believe on-page factors are producing a benefit. In fact I'm seeing the opposite: sites with excellent page text are ranking extremely poorly. In fact, for many searches the allinanchor and natural top-ranked serps are the same, if not very similar.
If anything, anchor text has become more important. However, I believe in a third case where sites are being dropped for over-optimization. It seems Google is testing a filter which has gone too far. Possibly they will reconsider its use.

merlin30

8:32 am on Nov 23, 2003 (gmt 0)

10+ Year Member



Some information to support the theory of the filter.

Pre-Florida, my site was listed well in both normal and allinanchor: for key1 key2 key3. The thing was, the only anchor text where I had those keywords was in internal backlinks to my home page.

Post-Florida, my site along with dozens of others was killed for that phrase. As predicted, doing the search with -dfdf -dfdfd brings back all those sites.

Now, if I use the phrase key1 key2-plural key3 I'm back! I don't use the plural of key2 in my internal anchor text, but make no mistake, I use it onpage as richly as its singular. Google didn't like so much anchor text using it.

To summarize:

key1 key2 key3 (nowhere)
key1 key2 key3 -dfdf -dfdfd (top 5)

allinanchor: key1 key2 key3 (nowhere)
allinanchor: key1 key2 key3 -dfdf -dfdfd (top 5)

Notice that the filter also kicks in on allinanchor!

key1 key2-plural key3 (top 5)
allinanchor key1 key2-plural key3 (nowhere - to be expected)

And, for those still skeptical about the filter and the -dfdf thing, try a search on, let's say, "Roman History" with and without the -dfdf -dfdfd - in both cases the results are a near exact match, as would be expected. Now pick lots of different commercial-type searches and you will see a marked difference - lots of sites excluded without the -dfdf, as the phrase should make a filter kick in. Clearly, a bug is stopping the filter from kicking in when you type in exclusion phrases.

This 626 message thread spans 21 pages.