
Update Florida - Nov 2003 Google Update Part 2

GoogleGuy

4:50 pm on Nov 17, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Continued from part 1: [webmasterworld.com...]


I stopped by several times yesterday, but it seemed like people were into the analysis stage already. caveman, this update didn't add any penalties for hyphenated domains, so that's not a factor. Just a reminder that people with specific feedback (good or bad) can send it to webmaster [at] google.com with the keyword "floridaupdate" somewhere in the email. I've mentioned that a few times, but as more than one person has pointed out, it can take 2-3 hours to read the whole thread from beginning to end. :)

notsleepy

4:33 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



Title of page: word1 word2 word3 word4

Result of search for word1 word2 word3: TOP RESULT

Result of search for word1 word2 word3 word4: NOWHERE TO BE SEEN IN THE FIRST 10 PAGES (and on page 10 none of those words are even mentioned in the title).

Obviously my page is getting a penalty because the title too closely matches the search phrase. How else can this be explained?

Maybe you should try changing your title from "Keyword1 Keyword2" to "Company name - the best source for keyword1 keyword2" and see if you do better in the SERPs?

SWG: I don't find this to be true. I have a site with 1,800 pages, plenty of which disprove this theory.

Macro

4:36 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Florida! I like it, I like it, I like it, I like it. I liiikkkke it. It's rocking....

I don't know whether I've dropped on some keywords and gone up on others but I've got a 30% boost in traffic recently. I've had a more detailed look at my logs. It's all coming from Google. On all my sites. And it's since Florida started.

I don't know if this helps you guys figure out Florida... but here's some info about my sites:

I haven't checked allinurl, allinanchor or any of the other tools all you experts seem to be using. I haven't done a detailed analysis of linkbacks and anchor text. I haven't got any optimisation software. Heck, one of my sites is 200 pages in size without a single H1 tag (I only learnt about H1 recently). My sites are whitehat, or at least not intentionally anything but whitehat. I interlink all of them with a lot of cross linking of all kinds (relevant cross linking, methinks), don't have alt tags on most images, and have never done a keyword density check, so KWD may vary wildly from page to page depending on the content. I'm awful at page titles and descriptions; I must really revise them all to dmoz editor guideline standards. In fact I'm awful at HTML; I use Frontpage with all its attendant problems, like code bloat. I have a few hundred inward links (PR1-PR7). I have a few FFA-type inward links which I mistakenly put in place some years ago and which still seem to be there. Of the other inward links, about 50% are reciprocal. Links from authority sites aren't reciprocated, but I do have a few of those. And I do like adding content. For the last 2-3 years I've been adding a few new pages every month, usually consisting of my own ramblings on subjects I like to think I know a little about.

All the sites taken together have about 200 PR5 pages, 300 PR4 pages and about 1000 pages of PR0-3. Total unique visitors I guess is about 3-5K per day. Not a big sample so I don't know if anybody can draw any conclusions from any of this. Or whether these sites/topics/confused ramblings have affected SERPS because of some quirk/s. Or even whether people have just been searching for different KWs since the Florida update started.

skipfactor

4:38 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>i think stevexyz is right, they are applying one algo for profitable keywords and other for non-profitable

All this talk of a mighty anti-SEO algo change is poppycock. Hope everyone's foolishly making changes based upon this silly assumption 'cause it's a zero-sum game right? I will refrain from saying Google is broke because we're all guilty of an occasional bug--doesn't mean it don't work, just needs fixing.

I was a victim of the last allinanchor/index page bug, and I didn't change a thing and came out smelling like a rose when Google got it right. Then, many old-timers were saying major algo change, index pages had been devalued, blah-blah-blah. That simply wasn't the case; it was a bug.

Hopefully, everyone's followed the guidelines & built and marketed plenty of internal content that doesn't rely on the index for traffic. I've taken a hit on rankings but stats are holding up pretty well, all things considered. It's going to be a long couple of weeks before Google fixes the bug; get comfortable with it.

HarryM

4:40 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



My impression is that the update is still in progress, with changes still happening on -va.

Doing a search for www.mydomain.com shows the same number of pages on www, www2, www3, -dc, -qv and -ab, including one obsolete page URL.

-va is showing one page fewer, and the obsolete page has gone.

-ex, -in, -cw, -fi are showing older results.

.co.uk is the same as www etc if searching all the web, but is (as usual) hopelessly wrong for just UK results. (Mine is a .com site hosted in the UK.)

Note: the page counts are those actually shown in the SERPs. The "about" numbers are inaccurate on some DCs.
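For anyone who wants to repeat this kind of datacenter comparison without typing each query by hand, here is a rough Python sketch. The datacenter hostnames and the "about N results" markup it scrapes are assumptions pieced together from what posters describe, so treat it as illustrative only:

import re
import urllib.request

# Hostnames assumed from the -va/-dc/-in style shorthand used in this thread.
DATACENTERS = ["www", "www2", "www3", "www-va", "www-dc", "www-ex",
               "www-in", "www-cw", "www-fi", "www-ab", "www-qv"]

def result_count(dc, domain):
    """Fetch a site: query from one datacenter and scrape the result count."""
    url = f"http://{dc}.google.com/search?q=site:{domain}"
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req, timeout=10).read().decode("latin-1", "replace")
    match = re.search(r"of about <b>([\d,]+)</b>", html)  # assumed result markup
    return int(match.group(1).replace(",", "")) if match else None

for dc in DATACENTERS:
    try:
        print(dc, result_count(dc, "www.mydomain.com"))
    except OSError as err:
        print(dc, "unreachable:", err)

Mismatched counts across datacenters are what posters here read as "the update is still rolling".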

needinfo

4:43 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



Obviously my page is getting a penalty because the title too closely matches the search phrase. How else can this be explained?

notsleepy, I respect your opinion but to suggest making your page TITLE not reflect the search term for that page is absolutely crazy.

Even if, as suggested before, this is the case when other page/link factors are considered by Google, making your page TITLE not reflect the search term you're targeting is mad IMO.

dross

4:51 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



The filters are nowhere near what they should be, and they are so screwed up. If Google's filters are supposed to filter out spam so that only "relevant" sites appear, why does a keyword that I have worked so hard to optimize "correctly" show a site on the first page that has black text on a black border? That is so clearly spam, so how is the "great Google" missing this one?!

Seriously Pissed
David

dazzlindonna

4:51 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



skipfactor, i asked in this thread many pages ago whatever happened to the old index page problem, but never got an answer. it would have been good to know that it came back on its own because it was a bug. thanks for sharing that; keep in mind that some of us didn't know.

stevexyz

4:52 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



Overall my traffic has not changed that dramatically. I could not stand the tactic I call "anchor text spamming"; I refused to do it, and to be honest am very pleased that G has applied this new filter to stop it from happening. The thing is, anchor text spamming has only been taking place on "profitable" keywords, so I think my theory makes a lot of sense. Well done, G. I have concentrated all along on providing answers to the "keyword questions" with top-notch content, and it looks like it's paying off. It was depressing to see sites shoot up the rankings just by using the "silver bullet" method of anchor text SPAM.

onedumbear

4:56 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



For those of you wondering about changing your titles...
I have a few friends who have tried this already. One in particular was very quick to change her title: the day after the update started she changed the title of her homepage to "company name" instead of "red and blue widgets by company name".
It has not helped her at all, and the only thing that has changed is the title in the SERPs.

kaled

4:57 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm one of the people who apparently hasn't suffered in this update and has been keeping quiet.

In all the searches for my site (four areas) I see no significant changes. HOWEVER, traffic from Google is down by something like 80% or more - very strange.

Some months ago someone suggested that MS may have a saboteur working at Google. Well, if Google is as badly broken as this thread suggests (from all my searches it still seems OK), then I think the saboteur/sleeper has struck. Certainly, if the public lose faith in Google, I would expect MSN to be the big winner.

This theory may be nonsense, but the timing is very interesting. Also, I don't think many people who've watched MS over the years would doubt that this sort of dirty trick was possible.

Kaled.

PS
I'd be very interested to hear from anyone whose SERPs appear unchanged but whose traffic from Google has dropped significantly. I live in the UK and my site is hosted here. I'm wondering if Google are using some strange geolocation filters.

troi21

5:03 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



I just checked my backlinks on the datacenters and they are different on each one. I will take this as proof that it isn't over yet.

synergy

5:14 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



Certainly, if the public lose faith in Google, I would expect MSN to be the big winner.

Case example: my fiancee's best friend, who is a hairdresser and not computer-savvy (doesn't even own a computer), had this to say:

"I was searching for something at work yesterday and I was getting really aggrevated because a lot of them had nothing to do with what I was searching for."

It's not just us seeing this, folks :)

skyhighpn

5:15 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



Just before this update started, I believe we were seeing different results on -mc and -in. At that time, I found -mc's results to be very good (for me), and very relevant for other searches I tested. Of course, now I find -mc to be awful. Anybody think it's possible that they used -mc to do a test update, to see how the results looked before rolling it out everywhere? Am I the only one that saw this on -mc just before the update? Just a thought.

Nikke

5:15 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



different on each one. I will take this as proof that it isn't over

It for sure isn't over. I've been trying different searches on www.google.com, www.google.fi and www.google.se, and the SERPs are all different for some of my keywords.

However, this isn't true for all keywords?!

The site I'm watching the closest seems to be getting more traffic since Florida started, but it is a new site where we are adding about 20 pages per day, so it might just be the novelty effect. Older sites also do well, but they are in a language spoken by about 9 million people, where most users seem to use MSN, so it's hard to tell what effect Florida might have on them.

[edited by: Nikke at 5:21 pm (utc) on Nov. 19, 2003]

skipfactor

5:17 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Some months ago someone suggested that MS may have a saboteur working at Google.

Best conspiracy theory yet and an evil mole no doubt! :)

I could not stand the algo which I called "anchor text spamming". I refused to do this and to be honest am very pleased that G has applied this new filter to stop this from happening

Funny, I see a low PR site that derives its ranking from anchor text spamming (doorways with 100 links, all the same anchor text) ruling the day--pre-Florida and post-Florida. Anchor text filter my fanny.

Nikke

5:20 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



www-va is showing new backlinks!

It's looking horrible. Backlinks are down by 60%...

[edited by: Nikke at 5:23 pm (utc) on Nov. 19, 2003]

johnnydequino

5:20 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



So, are we fixed yet? What's the latest?

Did Google kill JFK?

jd

talismon

5:21 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



I am still not convinced these results are any good. However, like Macro, looking at traffic reports, I too have seen a 25-30% boost in traffic from Google. Granted, I am comparing only a few days of traffic. People are too concerned with SERPs when they should be looking at overall traffic. There could be hundreds of targeted phrases that you have improved on and don't even know it!

too much information

5:47 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well, from what I'm seeing with my results, -in has not spread to the other servers yet.

Is -in showing results that I can expect to see in another few days, or is it still shuffling around?

rfgdxm1

5:50 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>Well, from what I'm seeing with my results, -in has not spread to the other servers yet.

Looks to me like it partially has. However, the index has been cooking some more on -in, and the others still haven't picked that up. We are in mid-dance, and nowhere near the end.

>Is -in showing results that I can expect to see in another few days, or is it still shuffling around?

Still shuffling.

ronin

5:51 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I suspect figment and tiffany are barking up the right tree: the new element introduced into the algo is broad matching.

Sites which look like they have been dropped way down the SERPs actually haven't - they are still as high as they were before... for that specific keyphrase.

However, many other sites have moved up above those sites, because broad matching means there is now much more general competition... because SERPs are now topic-based rather than keyphrase-based.

My latest guess is that Google's AI analyses the keyphrase and, instead of returning sites highly optimised for that exact phrase, ascertains the topic and returns sites related to the topic and only broadly (rather than specifically) related to the keyphrase.

Just a hypothesis...
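To make the hypothesis concrete, here is a toy Python sketch of how topic-based scoring could outrank exact-phrase optimisation. The topic dictionary, site names and scoring are all invented for illustration; none of this is Google's actual algorithm:

# Invented topic dictionary: maps a keyphrase to broader topic vocabulary.
TOPIC_TERMS = {
    "cheap flights": {"airline", "airfare", "travel", "booking", "ticket"},
}

def topic_score(page_words, keyphrase):
    """Score a page by overlap with the topic, not by the exact phrase."""
    topic = TOPIC_TERMS.get(keyphrase, set(keyphrase.split()))
    return len(set(page_words) & topic)

pages = {
    "phrase-optimised.example.com": "cheap flights cheap flights deals".split(),
    "broad-topic.example.com": "airline ticket booking airfare travel guide".split(),
}
for url, words in sorted(pages.items(), key=lambda p: -topic_score(p[1], "cheap flights")):
    print(url, topic_score(words, "cheap flights"))
# The broadly topical page outscores the one stuffed with the exact phrase,
# which is the kind of reshuffle ronin describes.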

Bobby_Davro

5:54 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



No one else seems to have mentioned this yet, but today in the UK, and in Germany, I noticed a massive drop in the number of Freshbot pages. Certainly for the areas that I watch, there were NO fresh pages at all at one point. Some seem to have been added back in since.

I suggest that perhaps Google has changed the Freshbot algo as well and needed to flush out all of the old pages?

ridgway

5:58 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



Long time lurker, first time poster.
No GI Joe with the Kung Fu grip for Christmas this year.

Followed the sage advice here in the forums over the last year. Slowly built up a dozen sites for self and clients, achieved excellent rankings and lots of #1 returns. Felt proud doing it with hard work and no funny stuff. Esme/dom smacked me pretty good, but all recovered. However, even at the depths, it was never this bad. Sites that were ranked #1 for two-word key phrases are now buried deeper than 500, which is as far as I had the stomach to check. However, I have noticed two things that give me pause.

I used to show up at #1 for the phrase "area widgets"; now that phrase is buried 500+ down, but the same site shows #1 for "area widget" and "area-widgets". Just tried this on another site hammered in Florida, with the same results. Something regarding plural/singular and/or use of the hyphen has changed. Which leads me to a question: why would an innocuous character like a hyphen, akin to a space, lift a site from the bowels of 500+ to a first-place result, when it has no direct material effect on the RELEVANCY of the search? And why would the plural trigger the site to drop 500+ places? (One guess is sketched just after this post.)

Normally, I'd just take the prescribed walk around the lake and, in the words of Lock Stock and Two Smoking Barrels, "Chill, Whin-stonn." But just before retiring last night, I read the GoogleGuy post that basically says it's done, and "some will like it, some won't".

Just let one voice from the desert state: I do not believe there is any kind of conspiracy to boost adwords bids, I do not believe there is an MS saboteur in Google, I do not believe this is the beginning of the end for Google, and I DO believe that Google attempts to provide the most relevant possible search results.

But in my opinion, there has been a huge step backwards, or forward into a morass (thus far). A former unabashed Google supporter is now just going to bite his lip and be quiet. Still hopeful it's not over, and that results will recover a la esme/dom.
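One guess at ridgway's hyphen puzzle, consistent with the word-pair filter theory discussed further down the thread: if the query parser keeps a hyphenated word as a single token, the hyphenated search never matches a filtered two-word pair. A toy Python sketch; the tokenizer behaviour and the "hot pair" list are both invented:

def query_terms(query):
    # Assumed behaviour: whitespace split only, so a hyphenated word
    # ("area-widgets") stays one token rather than becoming a pair.
    return tuple(query.lower().split())

HOT_PAIRS = {("area", "widgets")}  # hypothetical filtered word pair

for q in ("area widgets", "area-widgets", "area widget"):
    terms = query_terms(q)
    print(f"{q!r}: terms={terms}, filter applies={terms in HOT_PAIRS}")
# Only 'area widgets' produces the pair ('area', 'widgets'); the hyphenated
# and singular forms yield different term tuples, so the filter never fires.

If something like this is in play, it would also explain why the singular form ranks while the plural is buried.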

nutsandbolts

6:00 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Very nice first post ridgway. I think many people feel the same.

HarryM

6:05 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well, from what I'm seeing with my results, -in has not spread to the other servers yet.

From what I see re my own site, -in is old stuff. IMHO the most likely candidate for more work and rollout is -va.

Kackle

6:06 pm on Nov 19, 2003 (gmt 0)



For what it's worth, folks, it's a keyword-pair filter we're seeing.

1. There is a word-pair filter in place. This is not a penalty in the old manner of Google penalties -- as, for example, when they gave SearchKing a PR zero. This is a filter that is applied to the SERPs. If a given search produces 500 results, the order in which the results are displayed is determined, as usual, by PR and all the other algos. However, there is a new added filter such that if the searcher's search terms match the word-pair filter of particular pages within those 500, then those particular pages are placed further down in the queue for display purposes. PR stays the same, and a site's ranking for anything other than word-pair hits stays the same. Using a hyphen in your search terms defeats the word-pair matching.

2. This algo was apparently applied only to English-language pages. Perhaps only to dot-coms within English-language pages. Perhaps only to the most-frequent, most-competitive word pairs, as determined by an English-language word-pair dictionary organized by frequency of appearance, across all English-language pages. There is probably a front-end dictionary that's consulted for all two-word searches, and if the word pair is found in the "hit list" dictionary, then the SERPs are put through an extra step of consulting the threshold number for pages that score highly for those two words. It's probably done on the fly for each page in the SERPs -- the page is already decompressed to get the snippet out, and if you confine the scan to word-pairs inside of titles, links, anchors, and headlines, then it would not incur much overhead.

3. Single-word search terms, and most searches that use more than two words, are not affected. It seems to be a word-pair filter. If the string in which the word-pair was found contains three words, it was probably not considered for inclusion. If it contains more than three, it was probably parsed for pairs within the string. Hard to say. Suffice it to say that most SEOs use two-word terms to optimize their sites, and targeting two-word combinations makes a lot of sense from Google's point of view.

4. Anchor text in external links might be included in the word-pair hit count for a page. Hard to say about that. However, it's already clear that anchor text in external links alone will not trigger the filter. On-page word pairs in titles, links, and headlines are of fundamental importance. Internal linking is very important (and it's very easy to pick out whether a link on a page is an internal link to another page on the same site). If it's done on the fly, as suggested in 2) above, then I doubt that external links play any role at all.

5. There's a threshold applied for word-pair density. It might be a moving target, as in a threshold determined as a percentage of all text, or all linking text, or whatever, for a particular page.

Bottom line: Google was so predictable for so long, with respect to the role played by anchor text in links, that almost all SEOs have been doing too much of this in the last six to twelve months. Basically, it's Google's fault that it's been this easy for this long. Now Google is trying to correct it. But the correction leaves a lot of spam that replaces the filtered sites, simply because these sites missed getting hit by the filter.
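Here is one possible reading of the mechanism Kackle describes, as a short Python sketch. Every concrete detail (the page field names, the 15% threshold, the hot-pair dictionary) is invented for illustration; this is one interpretation of the hypothesis, not Google's code:

HOT_PAIRS = {("cheap", "widgets")}  # invented front-end "hit list" dictionary

def pair_density(fields, pair):
    """Fraction of the words in titles/links/headlines occupied by the pair."""
    words = " ".join(fields).lower().split()
    if not words:
        return 0.0
    hits = sum(1 for a, b in zip(words, words[1:]) if (a, b) == pair)
    return (2 * hits) / len(words)

def apply_pair_filter(ranked_pages, query, threshold=0.15):
    """Re-order PR-ranked results: push pages over the density threshold
    to the back of the queue. PR itself is untouched, matching point 1."""
    terms = tuple(query.lower().split())
    # Points 2-3: only plain two-word searches found in the hot-pair
    # dictionary are filtered; a hyphenated query never forms the pair.
    if len(terms) != 2 or terms not in HOT_PAIRS:
        return ranked_pages
    kept, demoted = [], []
    for page in ranked_pages:
        fields = [page["title"]] + page["links"] + page["headlines"]
        (demoted if pair_density(fields, terms) > threshold else kept).append(page)
    return kept + demoted  # filtered pages go to the back for display purposes

Because the demotion happens at display time, per query, it would match the observation that a site's PR and its rankings for other phrases are unaffected.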

Bobby_Davro

6:07 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



HarryM,
agreed: -va and -dc look like contenders for the next everflux update over the next couple of days.

George Abitbol

6:08 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



Well, from what I'm seeing with my results, -in has not spread to the other servers yet.

Is -in showing results that I can expect to see in another few days, or is it still shuffling around?

I don't know if this offers any kind of answer to that, but from where I check (France), the results for site:www.mysite.com -jkzjhekzjeh are:

- ex, va, dc, ab, cw : 927 pages
- fi : 928 pages
- in : 17 pages
- gv : 628 pages
- kr, mc : 624 pages
- zu, sj : servers down

As for backlinks, only VA shows some. New PR shown only on VA too (maybe the only logical thing to be found ;-)).

You guys probably don't see the same things from where you are, but those results can't be finished updating. And I can't see how -in could be leading the way.

Fred

too much information

6:09 pm on Nov 19, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I was just wondering about the old '-' character in the title.

I have the title "kword1 kword2 kword3 - company name", and I was #2 for the "kword1 kword2 kword3" search until this update; now I'm nowhere on -in.

The other thing I was thinking about is that I have an H1 that matches my title (or at least part of it). I added the tag a few months back and shot right up to the top. Now I show up great for two-word combinations of my keywords, but not the three-word combination. Dropping the '-' won't change the meaning or the words in my title, but it should be an interesting test to see what happens.

I'll post my results...

Napoleon

6:12 pm on Nov 19, 2003 (gmt 0)



Interesting one, Kaled... yet another theory to add to the list of possibilities.

Any of them could be correct, of course, but the most viable, I guess, would be the AdWords conspiracy (or perhaps the one that has them boosting the big sites with deep pockets to tempt them at IPO time).

However, the most likely explanation, I think, is still that Google has just got it wrong. Maybe they aren't as clever as many people think they are.

They produce a product (algo) and test it on their own test bed. Then they release it. In the real world, it sucks. The results are quite a lot worse than they were before.

That sort of thing happens with many products and many companies. It happens in almost every industry.

The issue though is where they go from here. The good companies address the issues very quickly, to limit the damage. They backtrack, or they apply a quick fix.

Google in the past has done exactly this.

The telling factor this time is whether they do it again. If we see a fix or a backtrack soon, fair enough. Everyone makes mistakes.

If they don't, and decide to run with an inferior index... then we'd better start looking at those theories very closely.
