
Google News Archive Forum

September, 2002 Google Update Discussion - Part 1
Discussing the major changes that took place
dauction




msg:120053
 3:32 am on Sep 28, 2002 (gmt 0)

How on earth can they justify dropping sites that were ranked in the top 10 and are now page 20 and NOTHING at all has changed on the sites from the last month?

The biggest thing is they move the toilet mid-stream without a hint that they are going to do it... (change the rules)

Google's a joke..

tired of their games..

off to support ANY other search engine.. enough of this change-the-rules-every-month nonsense.. goodbye Google.. Good riddance..

 

glengara




msg:120443
 12:21 pm on Oct 2, 2002 (gmt 0)

Makes sense to me. For a page that dropped 50 places, the main keyphrase is a four-word one, and analysis shows a couple of three-word phrases at a density of over 50%!

For the secondary keyphrase, at under 20%, the page remains where it was.

MeditationMan




msg:120444
 12:26 pm on Oct 2, 2002 (gmt 0)

<<should have mentioned - the percentages quoted are for a two word phrase. If you're targeting single words I expect the figures to be lower, or higher for three word phrases>>

Could you clarify what you mean here, deejay? Are you suggesting that for a single keyword the point at which you'd be considered to be spamming would be less than 14%?

My own site had a keyword density of 18% for a single word, and I dropped from 7 to 13. It's not catastrophic since I still get a lot of Google traffic for phrases of two words and up. I'd suspected k/w density might be the problem, and so I've tweaked it down to 14% for that single word. It's tough getting it even that low, but if the danger zone for density starts below 14% then I'd better get out the chainsaw.

That was a good piece of research, by the way. Kudos to you!

JonnyWales




msg:120445
 12:40 pm on Oct 2, 2002 (gmt 0)

Using the keyword density tool on this site for my main 2-word keyword gives a density of 34%. This seems very high; however, there are, apparently, only 38 two-word phrases on the page, of which this one appears 13 times.

Looking at the 2-word phrase as two single keywords gives 14% and 8%, which seems much less excessive.

I suppose what I'm trying to say is that reducing my 2-word keyword to a more manageable 10% would mean it appears fewer than 4 times on the page and potentially dilutes the message I'm trying to get across.
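
For anyone curious how those percentages come about, here is a minimal sketch of a density counter in the spirit of the figures quoted above: occurrences of the phrase divided by the total number of word groups of the same length. The WebmasterWorld tool almost certainly counts differently (stop words, tags, alt text and so on), so treat this as an illustration only, in Python:

import re

def keyword_density(text, phrase):
    # Occurrences of `phrase` divided by the total number of n-word groups,
    # where n is the number of words in the phrase (2 for "blue widgets").
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    n = len(target)
    groups = [words[i:i + n] for i in range(len(words) - n + 1)]
    if not groups:
        return 0.0
    hits = sum(1 for g in groups if g == target)
    return 100.0 * hits / len(groups)

By this way of counting, 13 hits out of 38 two-word groups gives roughly the 34% mentioned above.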

Jakpot




msg:120446
 12:41 pm on Oct 2, 2002 (gmt 0)

"Are you suggesting that for a single keyword the point at which you'd be considered to be spamming would be less than 14%? "

Did you mean more than 14%?

Some #1 pages I checked had less than 1%
Seems there is little consistency as a result of this update.
Page "scoring" is a mystery - And Google wants it this way.

Marcia




msg:120447
 12:44 pm on Oct 2, 2002 (gmt 0)

>evidence to the contrary.

I have a site doing fine, it moved up, and it's got word-widgets-1 to 5 leading to product pages. Which doesn't necessarily mean it's OK; the whole page and what's on the target pages also factor in.

But I am still seeing excessive density as an issue, even when it's not in the link text. An acquaintance's site that's very relevant dropped way down from the first page. I ran a check and she's 37%.

A couple of mine hit the same ranking with almost identical density for the single and two-word phrases, I'm running a check on a load of them now. The density issue is not new to this month, and it is an important factor.

MeditationMan




msg:120448
 1:37 pm on Oct 2, 2002 (gmt 0)

I checked up on the sites that are above mine for my single keyword, and yet have comparable or fewer links (the others, including the top two, have double the links my own site has, and so I didn't see much point in doing that comparison).

The numbers of links to each site have not changed dramatically. My site (#13) has increased the number of links since last month by 38, while the rest have declined or are unchanged.

The figures in the table are: position, keyword density, # links, and change in SERP since last index, respectively.

#3 14.13% 442 0
#4 17.65% 252 +7
#6 26.45% 316 +3
#7 12.58% 242 +3
#8 9.38% 764 -4
#9 8.84% 558 -1
#10 9.00% 206 -4
#11 36.36% 234 +1
#12 14.29% 414 ?
#13 18.24% 560 -6

The biggest change was site #4, which went up seven places. It has approx the same density as my own site, and less than half the links. My own site has far more content.

#6 and #11 have very high densities, and yet have gone up.

But the three sites with densities of below 10% have gone down.

Frankly, I'm puzzled. I don't see any great case for arguing that keyword density is behind the drop my own site took, although you could argue that a keyword density of less than 10% is harmful.

The only complicating factor I can think of is that just before the reindexing I decided to remove the sitemap we used to have. I thought it was a bit spammy. When I saw the drop I immediately put it back in!
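
A rough way to sanity-check a table like the one above is to compute the linear correlation between density and the change in position. The sketch below (plain Python, no libraries) uses the figures from the table, skipping #12 because its change is unknown; it is a throwaway check, not a claim about how Google actually weighs anything.

def pearson(xs, ys):
    # Standard Pearson correlation coefficient.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# (keyword density %, change in SERP position) pairs from the table.
rows = [(14.13, 0), (17.65, 7), (26.45, 3), (12.58, 3), (9.38, -4),
        (8.84, -1), (9.00, -4), (36.36, 1), (18.24, -6)]

print(round(pearson([d for d, _ in rows], [c for _, c in rows]), 2))

With only nine pairs the result is at best a weak trend, which fits the "frankly, I'm puzzled" conclusion: density alone doesn't explain the movement.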

David_1cog




msg:120449
 3:07 pm on Oct 2, 2002 (gmt 0)

I didn't make clear in my OP that I was mainly referring to repetition in the <title> tag. I still think this theory is holding up after a few days noodling around.

I also believe (as others have stated in separate threads) that keyword importance in the <title> has been reduced - many high-ranking sites do not have all keywords in the <title>.

gmoney




msg:120450
 5:06 pm on Oct 2, 2002 (gmt 0)

David_1cog,

I have seen some evidence in my site that gives some weight to your theory. Since this update, I decided to tone down the use of “widgets” in the title as well as in the footer of my pages.

Somewhere along the line I had gone a bit wild with my use of widget in the footer (ie. buy widget, widget products, widget search, widget info, widget types etc.). I feel better now that I have cut back on the widget rhetoric, hopefully Googlebot will like it as well.

<added>I think my overuse of "widget" hurt my ranking for "widget services" even though "widget services" wasn't overused.</added>

I'm not advocating that everybody rush to change their pages to emulate current high-ranking pages, because I can't imagine that Google is satisfied with their new algorithm. I'm guessing that they will scale back some of their major algorithm changes somewhat. After all, the previous month's SERPs were widely regarded as the best . . . and now a large number of the SERPs are drastically different.

chiyo




msg:120451
 5:38 pm on Oct 2, 2002 (gmt 0)

Ok, so according to recent posts here, the only search engines now returning relevant results for commercial queries are the ones with so little traffic that the great majority of SEOs are not targeting them!

This makes GoogleGuy's surprising comment several months back, that we should be putting effort into other search engines and not Google, much more understandable!

Yidaki




msg:120452
 7:15 pm on Oct 2, 2002 (gmt 0)

The good and the bad ...

My experience is that keyword density can be used to spam as well as to reflect high quality. For that reason I wouldn't make myself crazy about density ... although I like the WebmasterWorld density tool ;)

My example: I run a specialized directory. All listings have a highly descriptive title and description to reflect what they offer and to describe the category they fit in. Each directory subsite contains 10 links. So there are categories where:

- "widgets" is mentioned 10 x within the titles of the listings
- "widget selling" is mentioned 10 x within their descriptions
- "sites about widgets" is mentioned 1 x within the title of the category page
- and maybe 5 x within the surrounding content of the category page

I would say that this category is really authoritative for "widgets". If you ask me, I SHOULD hold top positions at any SE.

Big but - a spammer's site that uses repeated keywords, alt texts, hidden text etc. to build an artificial density is - in the eyes of a robot - also authoritative for the targeted word(s).

Who decides now what's good and what's bad ...

I wouldn't like to be a se robot. :)

gmoney




msg:120453
 7:38 pm on Oct 2, 2002 (gmt 0)

I am wondering how many of you are seeing an increase in irrelevant referrals from Google in your log files.

After looking at the relevancy of various SERPs, I thought there would be a dramatic increase in unrelated referrals to my site for keywords I didn’t even know I was ranked for. So far, my referrals have roughly the same relevancy to my site as before the new algorithm.

If webmasters see a dramatic increase in unrelated referrals then that would tend to indicate that the new algorithm is not as effective as the old one.

cminblues




msg:120454
 8:45 pm on Oct 2, 2002 (gmt 0)


Chiyo:
In other words..
Google has decided to be non-relevant for commercial queries?

Maybe Google is tired of being 'owned' in commercial areas.
But what is the price of this kind of 'reaction' we all see?
Less relevant results.

So, think of this:

case 1] Google commercial areas 'owned' by SEOs
-> no revenue for Google, apart from some relevance in the results,
due to the fact that the people ranking well have enough money to make good content.

case 2] Google commercial areas 'owned' by AdWords buyers.
-> some revenue for Google. And, Google hopes, these people
buying 'ranking' will make good content, 'cause they also have some $$, if they can afford AdWords.

I don't mean, in case 2], that Google will start selling ranking.

I mean that maybe Google wants users to start thinking:

commercial queries -> click on the AdWords
non-commercial queries -> click on the 'normal' results.

I know the above is a little bit 'overloaded', but I'm firmly convinced there is also some truth in it.

cminblues


Marcos




msg:120455
 8:58 pm on Oct 2, 2002 (gmt 0)

Well, this is what we have found so far. Mr Vitaplease's comments here [webmasterworld.com] seem to be mostly right. In order to fight "Googlebombing" and "PageRank for sale", they may have downgraded results when the keyword is not in some important part of the on-page text (to stop Googlebombing), and anchor text may have been tuned down in the ranking (to stop PageRank monetization), especially if the linking pages do not have a good PageRank to begin with. Internal links, and links from interlinked pages, may have been tuned down also.

Still, we have seen as many as 200 regional competitive cats easily dominated by unscrupulous Dmoz editors. We have done some testing on that.

To test whether we really are in front of a Dmoz-dominated update, we set up a small Aspseek-based search engine - a GNU search engine with a crude PageRank-like ranking system. We indexed around 1,500,000 pages, using as the starting point 700 very competitive Dmoz cats, including up to 250 pages per site, following up to 10 outside links, with up to 100 pages per outside link.
What we found was that 59% of our top 20 results on the 100 cat-related competitive keywords were also top 20 in Google's new index, and 26% of our top 10 were also in Google's top 10.

But we must also say that we have not been able to find such a compelling relationship using non-competitive categories. A 2,000,000-page index built from non-competitive regional cats, using non-competitive keywords, showed a very small correlation between top 10, top 20, and even top 50 results.
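
For reference, the overlap figure described above can be computed as a simple set intersection over the two top-N lists. A minimal sketch in Python - the URL lists are placeholders, and a real comparison obviously depends on how the URLs are normalised:

def top_n_overlap(ours, theirs, n=20):
    # Fraction of our top-n results that also appear in the other engine's top-n.
    top_ours = set(ours[:n])
    top_theirs = set(theirs[:n])
    if not top_ours:
        return 0.0
    return len(top_ours & top_theirs) / len(top_ours)

# Hypothetical ranked result lists for one keyword.
our_results = ["example.com/a", "example.org/b", "example.net/c"]
google_results = ["example.org/b", "other.example/d", "example.com/a"]
print(top_n_overlap(our_results, google_results, n=20))

Averaged over the 100 competitive keywords, that is presumably the kind of calculation behind the 59% and 26% figures.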

So, our working theory right now is that, yes, as Don Vitaplease and others are pointing out, small changes, probably committed in order to fight both Googlebombing and PageRank commoditization, have affected the index accuracy in many different ways.

We think the index is unbalanced, or at least much more unbalanced than the last one, and, as a result, the weight of some previously not-so-important characteristics has been souped-up, opening the door for abuse.

In our case, souped-up Dmoz weight is the main factor now, prompting my initial, inaccurate claim about Dmoz empowerment. It does not seem to be the case that Google has chosen a drastic reduction of popular linking, as I voiced in a previous post. That may be the final effect of the algo change in some of the most competitive cats, but probably not the desired effect. So much for pro-AdWords, anti webmasters/SEOs conspiracy theories, and my apologies to the Googleserfs for so vehemently suggesting that (just in case they cared :) ). I hope Herr Googleguy can now stop laughing at me :) :)

But we do think this update and the changes committed are, to say the least, unbalanced, and the new algo is rampantly open to easy abuse. Let's hope Google's good old PhD common sense returns soon, and a new, improved update takes place as soon as possible. In the meantime, I guess we can spend our time pointing out spammers to Google; they are easy to find now: usually sitting at a keyword near you, between #1 and #10. ;)

rogerd




msg:120456
 9:03 pm on Oct 2, 2002 (gmt 0)

Once again, Marcos, nice post! Thanks for sharing your research!

europeforvisitors




msg:120457
 10:56 pm on Oct 2, 2002 (gmt 0)

Google controls the statistical analysis through the algos. One minor change and 50% of the people here could be seriously affected.

Not only that, but they could be seriously affected for one keyword or keyphrase but not hurt at all for another.

Example:

One of my travel content site's major subtopics is a destination in Europe. I've built a "site within a site" around it. Last month, its index.htm page was in Google's top 10 search results; suddenly it's down in the 30s. Yet at the same time, some of my other keyphrase topics haven't slipped at all. Why? Beats me.

Fortunately, I have a large site with several thousand pages on many different subjects, so most traffic doesn't come into my site through my home page or my subtopics' index.htm pages. Users typically arrive by searching on keywords or keyphrases that are found in my articles, links pages, etc. So even if Google doesn't send much traffic to my Elbonian index.htm page, users are likely to get there by searching on "Elbonia apartment rentals," "Elbonia railroad station," "Piazza della Elbonia," etc. and clicking a navigation link to find information on other aspects of travel in Elbonia. And even if users never get to my Elbonian index.htm page, it really doesn't matter, because that index.htm page is merely one point of entry to my site.

What works for a content site like mine may not work for an e-commerce site, obviously. But for some of us, traffic on the home page--or from searches on any specific keyword--may not be that big a deal...and content diversity is one way to minimize the impact of changes in Google's algorithm from one month to the next.

chiyo




msg:120458
 1:37 am on Oct 3, 2002 (gmt 0)

Crimblues wrote..

Chiyo:
In other words..
Google has decided to be non-relevant for commercial queries?
Maybe Google is tired of being 'owned' in commercial areas.

What I meant was that Googleguy knew that all of us SEOs targeting Google was leading to problems in the relevance of SERPs for competitive areas. Basically, Google would be happier if SEOs didn't exist at all (of course). Their algos just can't compete against thousands of professional and amateur SEOs, especially where the content is usually very similar.

Therefore his post months back asking (pleading!) for us to target other search engines had some method in its initial "madness". The fact that people are saying that search engines which don't get the attention from SEOs that Google does have better results in competitive commercial areas seems to back up the real reasons why he said that.

My view is that any search engine that gets the attention from SEOs that Google has will eventually suffer in competitive areas. Googleguy just wanted to share that nuisance more equitably (from Google's point of view) :)

At risk of repeating myself, Google's strengths in PR and link popularity just do not work with commercial sites where SEO is very evident, as commercial sites do not naturally link to each other and have to use workarounds and artificial methods to get to the top. All of which makes for relevance problems in the SERPs.

AdWords may indeed provide better relevance for commercial sites, the fact that it is PPC aside. The observation that Google may be trying to force commercial sites to pay for AdWords by providing sub-optimal main database results is an attractive theory, but another explanation may be just that they CAN'T reliably rank commercial competitive categories as they can in informational areas where backlinks and citation come naturally. That's been increasingly obvious for months.

john316




msg:120459
 2:16 am on Oct 3, 2002 (gmt 0)

Chiyo:

You are almost advocating a "bait and switch" type thing here.

"hey you guys, we love the commercial stuff for now because our users (whom we cherish above all else) are eating this stuff up and making us the king of search. However, we are going to be pulling the trigger on a CPC model pretty soon and we'll be adjusting the algo to accomodate that."

I personally don't have a problem with that, just put a disclosure up like everyone else and quit hiding behind the algo mystique. Take your chances, see if the shoppers (the profitable "researchers") buy into the whole thing, but be up front.

As good as google is, no one is going to believe that they can't find/return commercial serps for commercial queries.

darnbarn




msg:120460
 2:51 am on Oct 3, 2002 (gmt 0)

(lurker post)
I have a question for those of you that seem to understand how the update works a bit more than I do....

I have noticed that the Google directory seems to add sites directly from the dmoz directory. In the most recent Google update, I see that most of my clients' sites that were added to dmoz are also now added to Google with the same title and description.

My question is:
While many of these categories have been updated, I have noticed that several categories have not been. These are commercial categories, where sites have been added over a month ago, but the google directory does not reflect any of the recent changes.

This is not an issue of not adding a specific website, but that the directory is not updating from these categories at all.

Is this normal? Is this a sign?

I look forward to hearing what some of you have to say about this.

cminblues




msg:120461
 2:54 am on Oct 3, 2002 (gmt 0)


Chiyo wrote:
another explanation may be just that they CAN'T reliably rank commercial competitive categories as they can in informational areas where backlinks and citation come naturally.

Hmmm.. so after 4 years and billions of pages Google has a problem ranking something?

Shutting down well-positioned commercial sites doesn't seem to me a way to make it better, nor an error in a new, improved algo.

It's only what it is:
Killing business.
I am Google, the serps are my land, and I want to make my biz in my home.

cminblues


cminblues




msg:120462
 3:09 am on Oct 3, 2002 (gmt 0)


Darnbarn:
While many of these categories have been updated, I have noticed that several categories have not been. These are commercial categories, where sites have been added over a month ago, but the google directory does not reflect any of the recent changes.

This seems related to the last posts..
Nth face of the "adwords algo"? ;)

cminblues


chiyo




msg:120463
 3:22 am on Oct 3, 2002 (gmt 0)

Hmmm.. so after 4 years and billions of pages Google has a problem ranking something?

So it seems.

It may be just because of the sheer volume that reported problems are occurring now.

I think SEO targeting of Google is at a level never seen before. I think cracks may be starting to show, if reports here of poorly relevant SERPs are any guide.

Shutting down well-positioned commercial sites doesn't seem to me a way to make it better, nor an error in a new, improved algo.

Why were these sites well positioned and now "shut down"? Maybe other sites have been optimized better since last month? Maybe there are new sites and players? Maybe the reason for the good positioning earlier was taking advantage of a part of the algo that has now changed? There seems to be a perception amongst many here that if you have held a top position for a while you somehow "own" it. That's a curious perception, and it's pragmatically flawed.

After the SEO community decided several months back that Google was really the only cost-effective search engine to target, there was obviously going to be a lot more competition, a lot more spam (since guaranteeing long-term top rankings by normal methods for highly commercial sites is illusory, as I have argued previously), and a lot more sites, many vigorously SEO'd. Even 50 PhDs can't keep up with that, though when the pressure comes off in future the relevancy for these types of queries might well improve, who knows? At the moment it's a battleground out there for commercial sites. We know. We slipped with our commercial site this time, and several competitors have come in basically copying its key elements. We target other promotional avenues for this site, including AdWords.

chiyo




msg:120464
 3:43 am on Oct 3, 2002 (gmt 0)

see if the shoppers (the profitable "researchers") buy into the whole thing

Profitable for whom? Site owners or google?

And if AdWords is getting "shoppers" better results, it shouldn't make a difference to them whether they click on an AdWord or a free listing, so they still come back to Google.

Are "shoppers" really the only "profitable researchers"? Google survives only because people know that when they go there they will not be bombarded with commercial spin when they are looking for information. I believe many will buy things related to their search terms, and they will be more pleased to find these in a separate AdWords section than in the main index, which they expect to be objective and non-commercial. So "real researchers" are shoppers too, and I don't think the number of people who go to Google specifically to buy something is as large as you may think. My feeling is that it's quite a small percentage, and they are not necessarily profitable.

cminblues




msg:120465
 3:52 am on Oct 3, 2002 (gmt 0)


Why were these sites well positioned and now "shut down"? Maybe other sites have been optimized better since last month? Maybe there are new sites and players? Maybe the reason for the good positioning earlier was taking advantage of a part of the algo that has now changed?

IMHO the point here is the penalties, not the different ranks.
Let's see what will happen next months. :)

cminblues


chiyo




msg:120466
 5:27 am on Oct 3, 2002 (gmt 0)

cminblues.. just realised I was typing your name wrong! Sorry. I agree, let's see what happens next update and the one after, and I understand what you are saying re penalties vs rankings..

john316




msg:120467
 1:08 pm on Oct 3, 2002 (gmt 0)

>> Profitable for whom? Site owners or google? <<

Actually a good set of serps is profitable to the shopper. How long do you shop at stores with minimal selection?

bcc1234




msg:120468
 2:05 pm on Oct 3, 2002 (gmt 0)

I noticed one thing - can anybody confirm this?

The traffic started coming in waves. I get many more referrals for about 30 minutes, then far fewer for several hours.

I noticed that on several sites, some of them pretty old and in the index for a long time.

It's as if Google shuffled several indices to expose more sites to visitors.

Weird - the SERPs seem stable but the traffic intensity varies.
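
One way to check for that kind of wave pattern is to bucket Google referrals from the raw access log into half-hour slots and eyeball the counts. The sketch below assumes the common Apache combined log format and a file called access.log; both the filename and the simple "google." referrer test are just examples.

import re
from collections import Counter
from datetime import datetime

LOG_LINE = re.compile(r'\[(?P<ts>[^\]]+)\].*?"[^"]*" \d+ \S+ "(?P<ref>[^"]*)"')

buckets = Counter()
with open("access.log") as fh:
    for line in fh:
        m = LOG_LINE.search(line)
        if not m or "google." not in m.group("ref"):
            continue
        ts = datetime.strptime(m.group("ts").split()[0], "%d/%b/%Y:%H:%M:%S")
        # Round down to the half hour.
        buckets[ts.replace(minute=(ts.minute // 30) * 30, second=0)] += 1

for slot in sorted(buckets):
    print(slot, buckets[slot])

If the referrals really do arrive in waves, the counts should spike in a few slots and stay low in between.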

cminblues




msg:120469
 2:20 pm on Oct 3, 2002 (gmt 0)


It's as if Google shuffled several indices to expose more sites to visitors.

Exactly the same thing here.
Scrambling the results often is part of the "AdWords Algo"..
Oops.. I've misspelled it.. It's the "We want to make it better" Algo..

Anyway, the SERPs and their variations are the only thing I care about.
We're on the same wavelength as Google about users, eh.. ;)

cminblues


4eyes




msg:120470
 2:48 pm on Oct 3, 2002 (gmt 0)

Worth stating, if a little obvious....

We all need to consider that there are two ways for your site to improve its ranking

1) Factors pertaining to your site
2) Factors on the sites above you being penalised or de-emphasised

I have a couple of dead sites that have risen through the ranks simply because others above have been penalised by this update.

Of course, in a way that means that my sites have factors that are now given more emphasis.

However, if I had improved the anchor text (or some other factor) and then seen my site rise, I might then consider (in error) that my anchor text change was a plus factor. The real reason might be that 20 sites above me got hit badly because of too high a KWD. My anchor text change might actually be detrimental in the algo, just not enough to outweigh the other sites' penalties.

When so many variables are changing at the same time it is almost impossible to analyse based on a small sample.

grayhair




msg:120471
 5:06 pm on Oct 3, 2002 (gmt 0)

hi,

Just an observation - it seems to me that Google used to assume blue widgets meant "blue widgets" as a keyword phrase and showed those relevant results first. Recently it seems that - blue birds dancing on widgets - ranks higher.

The reason I say this is that my site has fallen from page 1 to page 5 for blue widget but shows up at #4 (page 1) where it was before for "blue widget".

How many people know to include "" in their search? It should be assumed that whatever is typed for a search should be treated as a phrase and that the person is not looking for a random distribution of the words.

Google used to be good at this whereas other engines were not; now it seems they have changed?
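
In code terms, the difference being described is roughly the difference between an "all words anywhere" match and an "adjacent, in order" match. A minimal sketch (real proximity scoring is of course far more involved than a yes/no test):

def matches_all_words(doc, query):
    # Unquoted-style match: every query word appears somewhere in the document.
    words = doc.lower().split()
    return all(w in words for w in query.lower().split())

def matches_phrase(doc, query):
    # Quoted-style match: the query words appear adjacent and in order.
    words = doc.lower().split()
    target = query.lower().split()
    return any(words[i:i + len(target)] == target
               for i in range(len(words) - len(target) + 1))

doc = "blue birds dancing on widgets"
print(matches_all_words(doc, "blue widgets"))  # True
print(matches_phrase(doc, "blue widgets"))     # False

A proximity bonus sits somewhere in between: all-words matches still qualify, but the closer together the words appear, the higher the page scores.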

ciml




msg:120472
 5:43 pm on Oct 3, 2002 (gmt 0)

Welcome to WebmasterWorld, grayhair.
(I recommend paynt's welcome post [webmasterworld.com])

The more I look at results this month, the more I come to the same conclusion as you. The 'proximity bonus' and use of anchor text were two of the things that made Google better than the rest.

AAnnAArchy




msg:120473
 6:46 pm on Oct 3, 2002 (gmt 0)

grayhair
The reason I say this is that my site has fallen from page 1 to page 5 for blue widget but shows up at #4 (page 1) where it was before for "blue widget".

Exactly! In one of my categories, the "" makes all the difference between good results and worthless .edu/.org results that have nothing to do with the subject at all. I have sites on both sides of the issue -- some sites that are nowhere to be found and other sites that are being found for "fun blue widgets games" when the site is about, say, fun games. I don't have blue widgets on my site, so therefore I don't want to be found by people searching for such. Blue widgets in this case stands for something quite gross and unrelated to my site.

AAnn
