
Update Florida - Nov 2003 Google Update Part 3

LaBonne

5:41 pm on Nov 19, 2003 (gmt 0)

10+ Year Member



continued from: [webmasterworld.com...]

The panic is settling down, the whine of worry is receding to a steady hum in the back of my head, and several recovery plans are forming...

I lost my index page entirely, due to lazy keyword stuffing. My fault! Unfortunately, mine is a very small business: no listing = no food (let alone xmas).

I was planning on overhauling the website anyway, and I've given myself until 1/1/04 before I accept an opening with another business and abandon my own. The question now is whether to: overhaul the index page and resubmit to Google immediately; overhaul the entire website and resubmit the whole thing in a few weeks; or overhaul the website (starting with the index page, of course) and wait for Googlebot. Time is most definitely a factor.

...are any of these plans likely to restore my index page to the directory before I have to throw in the towel in January?

There are also longer range options of starting over with a new website and closing the old.

Mahalo Nui Loa! (Thank you very much!)

mrwhy2k

8:03 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



"they took a old dataset and now compare it to the fresh crawl , and only links that are in the "old" and in the "new" crawl counts as PR"

Under this theory, how would new sites and new content ever get PR? How would new sites ever get indexed if Google only followed links from old data sets?
I have some new sites (less than 90 days old) in Google right now that held their top rankings, while others fell. There is no "old" data to compare to, so you would think those sites would lose their rankings, but that is not 100% true in my case.
Just another inconsistency, I suppose.

I think this theory holds some merit, but I'm trying to poke holes in it.

caveman

8:08 pm on Nov 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



BTW, it strikes me as humorously odd that:

--On the one hand, G knows (from Adwords, etc.) that the more closely a keyword search phrase matches an ad headline, the higher the click-thru...

--On the other hand, extraordinary steps seem to have been taken in Florida to weed out closely matching paired keywords, in favor of related, but not identical matches.

Talk about pretzel logic...gotta laugh at that one. :-)

I'm hoping this is all just about testing, albeit at the expense of the searcher, to learn some really good stuff! Then, soon we'll be back with SERPs that are bigger and better than ever! G, don't let us down!

WebSempster

8:09 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



new data for vonna:

www-va now: 2,670,000 results, not found in top 100

21/11/2003 06:50:25  1,120,000  not found in top 50
14/11/2003 06:56:18  1,100,000  found at 29
13/11/2003 18:14:52  1,100,000  found at 29

ciml

8:18 pm on Nov 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



johnnydequino:
> Looks like my site is not even showing up for keyword1-keyword2 anymore, let alone keyword1 keyword2.

I think quite a few people expected that not to last.

I haven't noticed anything odd about the va + searches. Maybe I was too slow?

willybfriendly

8:25 pm on Nov 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Time cannot be THE major factor. A site I put up less than 30 days ago ranks at #8 for a three-word term and #9 for a two-word term.

A site I have had up for 9 months is MIA.

One I put up a couple of years ago is suddenly at #10 for a two-word term; out of the blue, I might add, since it was always a loser.

Another about three years old has risen into the top ten across multiple terms.

If time is a factor, it appears to involve sites first indexed around the time of Esmeralda.

And Brett, if it is themes, Google needs to take some lessons from Teoma :)

WBF

rfgdxm1

8:36 pm on Nov 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>These fairly sophisticated scripts that are showing up in the heavy hitter aff leagues are stemming out of the auto Yahoo/Hotmail mail signers that the spam communities use. Those were pretty evolved scripts. Adapting that same technique to PRGen scripts is fairly easy. That's how they got all those auto guest books scripts. Now they are just turning them on for the forums that allow sigs.

This is the sort of thing that has been an issue for Google since the beginning. A lot of old message board scripts were modeled on Usenet, and thus incorporated .sigs. So for a long time Google rankings have been influenced by .sigs, almost always without the signer even thinking of Google. The question is whether Google has figured out how to ignore these. Most message boards use standard scripts that are obvious to spot, just like guestbooks. Google finally managed to learn how to ignore guestbooks, and my guess is that if they haven't already done the same with message boards, they soon will. However, the spammers may keep doing it even if Google does ignore the links. I noticed that a lot of guestbook spam was blatant, including hyping the site; obviously they wouldn't have done that if they were thinking about Google, since making the spam obvious encourages the site owner to delete it.

rfgdxm1

8:41 pm on Nov 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>And Brett, if it is themes, Google needs to take some lessons from Teoma

When you mentioned themeing with Google, Brett, did you mean the Teoma style? There are two possible kinds of themeing. One is the Teoma style, which tries to spot communities of sites on a common topic. The other doesn't try to spot communities, but rather comprehensive sites on a topic: it looks at how many pages the site has on the topic and boosts the site for searches on it. IOW, if my site has just one page mentioning "purple widgets", it will rank lower than if Googlebot spots that I have 500 pages mentioning "purple widgets."
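
To make the second kind concrete, here is a minimal Python sketch of a "comprehensive site" boost; the function, the log scaling and the sample data are all invented for illustration, not anything Google has confirmed:

    import math

    def topic_boost(site_pages, phrase):
        # Count how many of the site's pages mention the phrase; a site with
        # many on-topic pages gets a larger multiplier than a single mention.
        hits = sum(1 for text in site_pages.values()
                   if phrase.lower() in text.lower())
        return 1.0 + math.log1p(hits)

    pages = {"/a": "All about purple widgets.",
             "/b": "Purple widgets, pricing and specs.",
             "/c": "Contact us."}
    print(topic_boost(pages, "purple widgets"))  # ~2.1 for two matching pages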

ogletree

8:46 pm on Nov 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'm starting to hear from a lot of average joe surfers that they can't find anything. I have been asking around my company. They have no idea how things work, nor do they care; they just do their jobs and have no idea what I do. It's weird: I have noticed problems before, but everybody said they were finding what they needed. This time Google has gone too far.

willybfriendly

8:46 pm on Nov 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>IOW, if my site has just one page mentioning "purple widgets", it will rank lower than if Googlebot spots that I have 500 pages mentioning "purple widgets."

Hard to accept this as the model, given the number of sites with only one page, and/or only a passing reference to the topic, showing up.

Search for tacks - #7 = "Yahoo tacks fees onto e-mail..." Hard to find any theme in that top-ten result!

WBF

merlin30

8:59 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



Themeing and context-sensitive searching, especially for one- and two-word searches, requires some interaction with the user - otherwise the context stays inside the user's head and the SE has no real chance of guessing accurately. Google doesn't interact this way with users at the moment, so I don't think the themeing idea is responsible for the change in the SERPs.

glengara

8:59 pm on Nov 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Relax and enjoy the weekend, this isn't an algo tweak or a new filter or two, it's just another SNA*U.
(There, is that better?)

[edited by: glengara at 9:01 pm (utc) on Nov. 21, 2003]

Kackle

9:06 pm on Nov 21, 2003 (gmt 0)



>Looks like my site is not even showing up for keyword1-keyword2 anymore, let alone keyword1 keyword2.

This is confirmed by looking at actual keyword pairs mentioned on other forums days ago.

This supports my suspicion that there is a lookup of two-word search terms in a new Google dictionary of the most popular two-word search terms used on commercial, English-language sites.

A lookup like this is probably being done semi-centrally, by referring two-word searches to dedicated machines at one or two locations over their internal network. That way they can respond quickly without waiting for the next update.

Until a couple of days ago, it was embarrassing to Google that hyphenated word pairs were not recognized as matches in this lookup, and were therefore producing non-filtered results: the hyphenated word pair didn't flag the search as requiring the new filter treatment. This glitch provided us with one of the best clues that a special filter was in place.

Now the hyphen matching in that lookup is fixed. Fixing the lookup is a lot easier than rolling back the dance would have been, and it's the only thing that has been definitively fixed. On the other hand, they could have turned the lookup off just as easily, returning immediately with the flag still set to zero instead of going through the lookup at all. Instead, Google decided to improve the lookup by fixing the matching glitch.

This is strong evidence that the filter is deliberate. Google isn't broken, Google is filtering. The nature of the dictionary is Top Secret. It might be more sophisticated than just keyword pairs. Also, it's a dictionary designed for its anti-spam and/or anti-SEO effects, and is therefore limited in size. A lot of the "my site did this" vs. the "you're wrong, my site didn't do that" that we see on WebmasterWorld comes from the fact that we cannot use real keywords on WebmasterWorld. That means we cannot figure out what's listed in the dictionary and what isn't, and we just go back and forth.
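
If that theory is right, the glitch could have been nothing more than an exact-string lookup that never normalized hyphens. A toy Python sketch of the hypothesized mechanism (the dictionary entries are made up, and this is a guess at the behavior, not Google's actual code):

    # The hypothesized dictionary: a set of popular two-word commercial terms.
    FILTERED_PAIRS = {"cheap widgets", "widget dealers"}   # invented entries

    def needs_filter_buggy(query):
        # Pre-fix: an exact-string lookup, so "cheap-widgets" escapes it.
        return query.lower() in FILTERED_PAIRS

    def needs_filter_fixed(query):
        # Post-fix: hyphens normalized to spaces before the lookup.
        return query.lower().replace("-", " ") in FILTERED_PAIRS

    print(needs_filter_buggy("cheap-widgets"))   # False: unfiltered (the glitch)
    print(needs_filter_fixed("cheap-widgets"))   # True: filtered SERP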

john316

9:11 pm on Nov 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>The nature of the dictionary is Top Secret.

Maybe the AdWords dictionary?

vonna

9:13 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



mrwhy2k:
>Under this theory, how would new sites and new content ever get PR? How would new sites ever get indexed if Google only followed links from old data sets?
>I have some new sites (less than 90 days old) in Google right now that held their top rankings, while others fell. There is no "old" data to compare to, so you would think those sites would lose their rankings, but that is not 100% true in my case.
>Just another inconsistency, I suppose.
>
>I think this theory holds some merit, but I'm trying to poke holes in it.
--

Google "normally" compares to last month dataset ( I think they started doing this in aug. - sep. ), so if your site is added in august ( before the august update ) you will get PR in sep update. In the "OLD" days you could give a site PR within days ( not any more ).

Google do follow / index and list new sites , but now they do not get any PR before the inlinks has been seen on the same page for 1 - 2 month.
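
If that is right, the rule being described is just a set intersection over two crawl snapshots. A minimal Python sketch of my reading of the theory (the sites and pages are invented, and nothing here is confirmed Google behavior):

    # Each crawl is a set of (linking_page, target) pairs; under the theory,
    # only links present in BOTH snapshots contribute PR this update.
    old_crawl = {("pageA", "mysite.example"), ("pageB", "mysite.example")}
    new_crawl = {("pageB", "mysite.example"), ("pageC", "mysite.example")}

    counted = old_crawl & new_crawl
    print(counted)   # only the pageB link counts; pageC is too new to count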

HyperGeek

9:15 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



RE: Update Florida - 2600+ posts later.

And still, absolutely no one has mentioned working towards the Inktomi transition while all of this other stuff is going on, nor studying these odd results with the intention of possibly capitalizing on this situation if it were ever to occur again.

We are... and whether it's a Google-fart or not, why should your optimizing and SE campaigns end at Google? Scared of doing some real SEO?

>Relax and enjoy the weekend, it's not an algo tweak, it's not a new filter, it's a good old fashioned Google SNA*U.

Glengara, remind me to buy you a pint at Pubcon FL.

>Anyone else tired of bashing their head against the wall?

I never started. Then again, play the game for a certain length of time and you get used to big changes and temporary wormholes.

Do you really think that your site is gone forever from the top of the SERPs, even if you have one of the most relevant sites out there for its subject matter? Google would never shoot itself in the foot, and that's what that frame of mind implies: that Google is no longer interested in providing relevant results.

>This can't be good, and my two clients are very likely not the only real businesses in the world who are ticked at Google.

Of course they aren't. But then again, there are a lot of BADLY RUN businesses whose principals have absolutely no stamina for adjusting to change and waiting for better results.

vbjaeger

9:25 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



>IOW, if my site has just one page mentioning "purple widgets", it will rank lower than if Googlebot spots that I have 500 pages mentioning "purple widgets."

I am not sure this is far off. All of the datacenters except VA are showing directories and authority sites for 8 out of the top 10 results for a particular 3-keyword phrase that I watch.

Kackle

10:17 pm on Nov 21, 2003 (gmt 0)



Another thing. The reason the hyphen glitch got fixed so fast (has Google ever moved this fast?) is that as long as the glitch was in place, a clever SEO could have run search scripts for keywords with and without hyphens, and determined from the difference between the result sets whether the terms were in the dictionary or not. If you could whip out a script fast enough, a couple hundred runs would have given you a bird's-eye view of what was in the dictionary for the sort of sites you manage.

And, more importantly, what was not in the dictionary.

The same thing would work if you have a pre-update search vs. a filtered search.

Too late. Anyone who discovers an equivalent glitch probably won't be posting it.
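
The probe itself would only have taken a few lines. Here is a Python sketch with canned data standing in for real SERP fetches; get_top_urls and the 0.5 threshold are placeholders, not a real API, and actually scraping results would be against Google's terms anyway:

    # Canned results stand in for real SERP fetches; how you would actually
    # fetch and parse them is deliberately left out.
    CANNED = {
        "cheap widgets": ["siteA", "siteB", "siteC"],  # filtered-looking
        "cheap-widgets": ["siteX", "siteY", "siteZ"],  # pre-fix, unfiltered
    }

    def get_top_urls(query):
        return CANNED[query]              # placeholder for a real SERP scrape

    def probably_in_dictionary(word1, word2):
        plain = set(get_top_urls(word1 + " " + word2))
        hyphen = set(get_top_urls(word1 + "-" + word2))
        overlap = len(plain & hyphen) / max(len(plain | hyphen), 1)
        return overlap < 0.5              # big divergence -> likely filtered

    print(probably_in_dictionary("cheap", "widgets"))  # True with canned data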

ronin

10:30 pm on Nov 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



So are we confirming, or still hypothesizing, that Google has introduced a filter for popular keyword strings?

Are we still theorising that Google checks whether a given page contains any high-density on-page keyword strings... and if it does, whether those same keyword strings are also listed in "the dictionary"... and if they are, Googlebot looks at inbound link text to see if a high proportion of links also contain the same keyword string... and if everything matches, a filter is applied to the page (regardless of whether it's a genuine non-spam resource or one of Chef Spam's plats du jour)?

How does that fit in with the theory that Google is broad-matching query strings with pages?

Does it simply look like broad-matching, because the most relevant pages with high on-page keyword-string density and many identical inbound links are being filtered out and leaving behind only those pages which are broadly relevant?
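
Strung together, those conditions would read roughly like the Python sketch below; every name and threshold in it is invented for illustration, and it is a guess at the hypothesis, not at Google's actual code:

    def page_is_filtered(query, page_text, anchor_texts, dictionary):
        # 1. The query must be in the hypothesized dictionary at all.
        if query.lower() not in dictionary:
            return False
        # 2. The exact string must appear at high density on the page.
        words = max(len(page_text.split()), 1)
        if page_text.lower().count(query.lower()) / words < 0.02:
            return False
        # 3. A high proportion of inbound anchors must repeat the string.
        if not anchor_texts:
            return False
        matching = sum(1 for a in anchor_texts if query.lower() in a.lower())
        return matching / len(anchor_texts) >= 0.5

    print(page_is_filtered("cheap widgets",
                           "cheap widgets " * 10 + "and more text " * 10,
                           ["cheap widgets", "cheap widgets", "our friends"],
                           {"cheap widgets"}))   # True: all three conditions met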

soapystar

10:34 pm on Nov 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



did you ever take your kids to work when they were off school? and you know how you always said.."Now be sure not to touch anything!"...you think maybe someone at googletopia took his kid to work..and the kid touched the wrong switch without telling anyone?..and they still haven't noticed?

steveb

10:40 pm on Nov 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Next person who uses the word "filter" to describe the normal ranking process gets taken out behind the dugout and shot.

Kirby quotes something I wrote as being written by Napoleon, and then says I'm agreeing with Brett. Cold....

"Disappearing" sites is a problem I don't see at all in my area, but assuming it is real in some areas, people who dropped a lot in the SERPs are not the same as "disappeared". Lord knows there are plenty of sites in the SERPs that either should lose hundreds of places in rank or could plausibly be ranked much lower than they are. "Not in the index" is completely different.

This is easily Google's most positive algo move this year. Not counting the disappeared phenomenon, the problems seem to be:
- certain types of spam (often created by one person/entity) have been able to get large blocks of crap ranking right next to each other. Google seems to be removing these when they find them though, so offering them specific feedback on plainly obvious garbage is a good idea.
- problems with figuring out duplicate pages/content
- duplicate anchor text-based crap still does okay, though this junk's days may be numbered
- authority sites are, thank God, finally being valued over anchor-text drivel; however, Google needs to better value context.

There is zero "localrank" sensibility applied now, which means an authority site (like a newspaper) that happens to mention keywords in an article may be doing better than they should, at the expense of "authority on this topic" sites. Said it before, will say it again... the roots of this update are a tremendous improvement. Quality content -- and a *history* of quality content -- is being rewarded, but the stew isn't done. It needs more topic-context, valuation of authority within a niche. If that is the next step (and since Google bought that localrank technology, one would guess that it is either the next step or the one after that), then Google could have something truly amazing on their hands.

makemetop

10:42 pm on Nov 21, 2003 (gmt 0)



>Chef Spam's plats du jour..

LOL - The best phrase to come out of this thread :)

The fun and games sort of reminds me of INK in 2000, pre-PFI (the AV turkey shoot was not to encourage payment, I don't think) - it looks like it might be get-your-credit-cards-out time! The Google free lunch may soon be over for commercial sites.

DocElder

10:45 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



Seems something ought to be done about webmasters that create a bunch of phony domains and link them together. Many are using proxy domain registrars to hide from Google the fact that they are the same owner. Google should penalize any commercial site that uses a proxy registrar, since no legitimate business is going to hide its place of business. This would quickly eliminate the scum from high rankings. I think this is a great idea.

viggen

10:46 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



did you ever take your kids to work when they were off school? and you know how you always said.."Now be sure not to touch anything!"...

...my bet is on the Kaltix guys, they also wanted to play with those buttons. ;)

steveb

10:49 pm on Nov 21, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Also a problem is the double-edged sword of valuing words on a page. Link pages that just repeat the same words over and over (not as spam, but just naturally in listing site titles) are doing much too well. Boosting the valuation of the page title would help diminish this a bit, since genuine link pages will normally cover many topics.

Emphasizing the page title a bit more would also address the problem of newspaper-like pages ranking highly just because the text happens to appear in one article. If the title mattered more, and a keyword wasn't in the title, these newspaper examples would have to drop drastically.
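
In scoring terms, the suggestion is just a larger title multiplier. A toy Python sketch with made-up weights, to show the effect, not any actual algorithm:

    def score(keyword, title, body, title_weight=3.0):
        kw = keyword.lower()
        title_part = title_weight if kw in title.lower() else 0.0
        body_part = min(body.lower().count(kw), 5) * 0.5  # capped body credit
        return title_part + body_part

    # A focused page beats an article that merely mentions the phrase:
    print(score("purple widgets", "Purple Widgets Guide",
                "purple widgets " * 3))                     # 4.5
    print(score("purple widgets", "Daily News",
                "one story mentions purple widgets"))       # 0.5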

Jakpot

10:51 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



This is easily Google's most positive algo move this year

I agree as far as my web pages go.
rankings are up
traffic is up for last few days
sales are up

Google SERPs are fine with me

BryantStevens

10:52 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



Well, I am definitely seeing some keyword density issues.

I see pages that have the keywords listed once, with about a paragraph of content, coming back as number one for that keyword-pair search.

Then I cannot even find my page using those same keywords. I have several paragraphs of content. My page is decently linked and has a fair keyword density, as is recommended here. The page that is coming back number one has one or two incoming links.

airpal

10:54 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



Makemetop, please shed some light on this. I've heard that Google is simply reverting back to June's listings and slowly working its way up to the present date in calculating backlinks/PR, while adding new sites to the mix and applying a slight filter to everything. That's somewhat different from what you mentioned in your post in another forum, which suggested there are on-the-fly sensitivity levels/bars according to the average optimization of the SERPs for specific keywords.

Valuing and awaiting your expert opinion...

[edited by: airpal at 10:55 pm (utc) on Nov. 21, 2003]

mrwhy2k

10:54 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



hey Doc
>>Seems something ought to be done with webmasters that create a bunch of phony domains and link them together. Many are using proxy domain registrars to hide the fact they are the same owner from Google. Google should penalize any commercial site that uses a proxy registrar....

The webmasters that register hundreds of domains could just put a different name and business name each time they buy a domain... they don't HAVE to use a proxy to hide their identity. Besides, many large, legitimate companies are now making small portals to go after niche-targeted audiences.

Starbucks doesn't have just one coffee shop, why should webmasters have just one website? Don't wait for the people to come to you, go after the people.

jrokesmith

10:58 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



Googleguy mentioned in a post a while ago that they would be adding some "subtle filters" soon, during the Dom and Esm updates. In some areas that I watch as a benchmark, I have seen anywhere from 10 to 15 of the top twenty results (not counting the spammers that got pulverised, only what seem like legit sites) drop from high rankings to Never Never Land. Other areas are almost untouched. Most of the areas hit seem to be areas that spammers like to inhabit; the untouched areas were relatively free of spammers before the update. If ronin is right (good summary of information and hypotheses posted here), then Google may be refining the "filter" to decrease the casualties and make the results more usable in the next "iteration". Does anyone remember which sites in their areas got dropped and which stayed through the update for competitive keywords? Any commonalities?

DocElder

11:02 pm on Nov 21, 2003 (gmt 0)

10+ Year Member



Yes, multiple domains can be registered for good reasons. But why would a legitimate business hide its identity at the registrar? This is the point I was making. It is the spammers that are hiding their identities, since they have probably run out of family members to buy sites for.

Google surely wants to eliminate this activity. I was just pointing out a way to do it.
