|September, 2002 Google Update Discussion - Part 1|
Discussing the major changes that took place
How on earth can they justify dropping sites that were ranked in the top 10 and are now page 20 and NOTHING at all has changed on the sites from the last month?
The biggest thing is they move the toilet midstream without a hint they are going to do it... (change the rules)
Google's a joke..
tired of their games..
off to support ANY other search engine..enough of this every-month change-the-rules nonsense..good bye Google..Good riddance..
lol, I think we both are running into the same problem in this discussion :)
Been away for the weekend, but wanted to give my rant...
Actually, I won't rant too much, nor blame Google for anything specific, just tell you what happened from my perspective:
My PR 6-7 site is completely gone from the index. Grey toolbar. This caused my Yahoo listings to drop from #1 to the bottom (2,000 - 3,000th) of Yahoo results (not google.yahoo) on the competitive keywords (since they use Google's PR to rank them). Financial loss here is devastating.
My PR 5 site remained unchanged but I dropped from #1 position on my top keyword to about the 10th page of results. Visitors from google.com have all but dried up except for those random 4-10 word queries.
A fairly new site, which is listed in Zeal and Yahoo and was being updated in Google on a daily basis for the past two months, is gone from Google. PR is grey. Yahoo listing dropped to over 3,000.
My PR 4 site remained a PR4 but my listings for the top keywords dropped so far down I can't find them. Traffic has almost completely ceased.
Another new site that was indexed in Google results mid month showed up in the results on www2 and www3 and was then given PR0. The only link to it is from Zeal/MSN and it has several affiliate links on the homepage so I wasn't surprised by this.
I guess what frustrates me is that I built and designed these sites to be most relevant to visitors, and although I added a little keyword density to help optimize them on Google, I was very careful not to crosslink them or add any other links or content that I thought could penalize me. I never added a single banner ad, popup or exit ad, and I only used affiliate links on some of the deeper pages.
It really hurts because I have spent about 4-5 hours per night for the past six months working on these sites and the content on them. Then suddenly, overnight, they are worth about 25% of what they were worth a week ago. I would like to try to increase my rankings again, but I have NO IDEA what the hell just happened! I am very frustrated!
<<The only link to it is from Zeal/MSN and it has several affiliate links>>
Affiliate links are no problem if Google likes the page otherwise. I have pages ranked #1 for their target terms that are chock full of affiliate links.
I suspect the first part of that sentence is more of a problem -- "the only link to it ..." That's where you can take some practical action.
Do some research to find other sites who might find your content useful, and invite them to link to you. Reciprocate if you have to. Then find some more, and some more ... The more links in, the better, because the traffic they send will provide a stable base amidst the SE ups and downs.
It's okay to have 80% Google traffic if the other 20% is enough to survive on if you had to.
>> I have a feeling that a silent minority or even majority of webmasters are pretty impressed with the new Google results overall, as i am <<
I presume that's a joke. Goodness knows what topic area you are searching in.
Some great points in this thread though, especially with respect to responsibility.
Sorry Google, but morality HAS to be an issue amongst decent folk. Hard nosed capitalism, where the strong don't give a hoot about the weak, is pretty ugly.
That is basically what seems to have occurred with some of the guys on here. They are your 'collateral damage' - the innocent bystanders.
Given the numbers that seem to be involved... have you REALLY done the homework to keep this to an absolute minimum? Have you REALLY given due consideration to the small but high quality sites that some of them produce? It certainly doesn't look like it... but only you know the answers to these questions. Only you can decide how much of a social conscience to have, if any.
I think that we are going to have to disagree about whether Google should be responsible or not. Your earlier posts suggest that you have taken the ultra-capitalist approach that companies aren't responsible to anyone but their shareholders, not even to their customers, users and employees. I would rather think that companies should be responsible to *everyone* associated with and/or affected by them.
Whether Google likes it or not, whether they want it or not, people's lives do depend upon them. Google is now the driving force behind a large proportion of web development. Whole sites are built from the ground up to obey Google's fuzzy rules. Google's linking dependence actually dictates how sites are designed and structured, which sites are linked to, how they are linked to, etc. For me, Google has been the death of Flash-based sites, for example, which is not a bad thing IMHO, but still significant. With this power comes responsibility, and Google aren't taking it seriously.
What is more annoying is that the whole matter can be so easily fixed. Just state the rules clearly, openly, in detail and with examples and then nobody can complain if they are dropped (unless they are caught up randomly). Google should also provide an established appeal route for sites that have been dropped, and perhaps a prewarning list of URLs of sites that are likely to be dropped in the next index if they don't conform to the rules.
Now to a different issue:
From a spam-reducing point of view, I don't think that Google's current tactics work either. Because of the inherent instability of the index and the fact that webmasters can't trust Google to keep their sites present, I wonder just how many webmasters re-use their content across several domains. I provide SEO services and always advise clients to build at least two if not three sites, obviously obeying all the rules about not simply duplicating. To minimise the unpleasantness of this I always make sure each site is differently focused and provides at least some unique content. However, you get the picture.
And why do I do this? Because Google is so unstable. The only way to make sure that a company is not killed by a random Google blitz is to make sure that the company is not dependent upon a single site. I am just following Googleguy's advice of "diversifying". However, the end result is that Google has three lots of the same content, albeit in different forms, thus reducing the quality of Google's index.
Would I do this if I knew sites weren't going to drop out of Google randomly? No, of course not. It is a lot of effort, but the only way to play it safe. I will repeat: Google forces webmasters into making duplicate sites. Discuss ;)
No law requires a social conscience. Very few have one. Others might emulate a social conscience, depending upon the sanctions. The more severe the sanctions for not having one, the more people will develop one.
There has been a lot of whining this update, and maybe somewhat understandably.
What all the whining SEOs don't understand, though, is that THIS is what you are paid for.....to know what the ongoing changing rules are. Yes, the rules have changed again......we should hope for this if Google is to maintain the relevancy of its results.
I agree that this was a bad month, but I think that some issues that needed to be taken care of were. I think we will see some of the downplayed variables tweaked back up to the level they need to be at to obtain the best results.
I also think that as much as we would like this to be a reliable occupation, it is a shifty game, and if you play it you have to be willing to deal with the consequences whatever they may be.
Even though I don't follow the advice myself...I think we all know that the safest optimization.....is no optimization.
I have a site (not the one in my profile) with 900 pages.
A catalog based site selling pens.
Nothing has changed at all; well, maybe I had some shifts 2 or 3 spots up or down, nothing major.
The site is not optimized, some pages are even missing titles.
Just made a mental note never to cross swords with you in a court of law (or was that philosophy)
Haha, I'm just glad somebody understood what I said. I just reread it and I don't understand it. :)
stuntdubl, I can see what you are saying, but this is half of my complaint: it simply isn't possible to know the rules precisely, because Google doesn't tell anyone. Instead, we are left to guess from month to month what new rule Google has cooked up for the update. Why are webmasters left guessing the reason for a site dropping out?
I also disagree that "the safest optimization.....is no optimization". Half of the work that I do for other people is making sure that sites meet all of the Google and Yahoo rules that I am aware of. So many sites still have hidden comment tags, hidden text, massive meta tags, browser incompatibility etc. and so I clean them up to make them acceptable. I do regard this as optimisation work since it is a prerequisite for not being banned! ;)
WRT it being a shifty business... well, there really is no need for it to be. Google can be as stable as they want to be. This is the new world industry and we need to encourage it in the right direction. I don't believe that Google is helping by being unreliable and unpredictable. They are part of the industry, and a large defining part of it. If Google is seen as unreliable by webmasters, then so is the web business, which in turn leads to job insecurity and slowed growth. What we will be left with is the group of people willing to take the most risks, but not those who require a stable income.
[edited by: Bobby_Davro at 9:18 pm (utc) on Sep. 30, 2002]
Well my theory is: you get what you pay for.
How much money did you give to Google to get those first page listings? NOTHING!
You made all that money by winning lotto Google for the given month.
If Google changes its algorithm, they have to answer to nobody but themselves.
You know why I can say all this. Well I beat the odds and won lotto Google again this month :)
And that folks is my last rant on Google (well until the next update at least).
September 2002 Google Update discussion continues:
I did some analysis on the top 3 sites for a phrase in my industry. Sites 1 and 3 are sites I expect to see in the top 10, and both have a toolbar PR of 5. Site A shows 98 backlinks; Site B has 158 backlinks. Of these backlinks, Site A has 7 external links that include the phrase, and Site B has 5. The remaining links for both sites are internal site links that do not use the phrase. Keywords in the title or on the page seem to matter much less than they did in previous updates.
Here are my ideas:
1) The new rule out of this update is that exact phrase anchor text in external links is what matters. And I would suggest that it is the accumulated PR of this subset of links that trumps. In other words 3 PR6 backlinks with the anchor text in them should trump 8 PR4 backlinks with the anchor text in them because of the logarithmic scale of PR (but I am hypothesizing now on this extension of the theory).
I built on martinibuster's theory for this, starting at message #12 here: [webmasterworld.com...]
>>> martinibuster, I agree with your theory.
2) Exact phrase matching in the title or on the page used to be very important. Now it seems to be much less so, and is one of the main factors in why this update appears more spammy to some of us.
>>> Site 1 above doesn't have the exact phrase on the page. It does have the phrase concatenated together however (i.e. "keyword1keyword2" instead of "keyword1 keyword2", which is the exact phrase).
3) Google may be filtering out or discounting the weight of internal links.
My page used to hold position #11 for the phrase whose top 3 I analyzed. Now I am at #60. This page has only two external links showing up in www2 with the phrase in the anchor text; all other links to it (50 or so pages) are internal links with the anchor text in them. This page was in the top 5-8 up until last month.
Now here is where it gets weird with the internal links theory. I have two other pages that are top 5 for their keywords, both in last month's update and in this month's. Why didn't these pages sink? One of them is competing with 2.5M results; the phrase above is competing with 2M results. What is different about these pages is that there are no external links to them: all links are from internal pages with the link text included. So I think that maybe internal links are discounted unless internal links are all Google has to work with for that page.
4) Huge PR differences can trump.
I didn't mention Site 2 of the search above. It's not a relevant result: it is a big software company with a PR of 8, and the two keywords appear separately on the page. The keyword phrase does not appear in the backlinks, I think. I didn't check for this, but it wouldn't make a whole lot of sense for this particular keyword phrase to be in this software company's backlinks; most of their backlinks are logo-image links.
I'd love to hear your input, challenges, or different theories. I don't care if I am right or wrong. I just want to crack the algo.
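Idea 1 above, that the accumulated PR of the phrase-bearing external links is what trumps, can be sketched numerically. This is purely illustrative scaffolding for the hypothesis: the logarithmic base (6 here) and the whole formula are guesses, not anything Google has published.

```python
# Sketch of the "accumulated anchor-text PR" hypothesis (idea 1).
# Assumption: toolbar PR is logarithmic with some unknown base
# (6 is an arbitrary guess), so a PR n backlink is worth base**n raw units.
def anchor_weight(backlink_prs, base=6):
    """Total raw weight of backlinks whose anchor text contains the phrase."""
    return sum(base ** pr for pr in backlink_prs)

# 3 PR6 backlinks with the phrase vs 8 PR4 backlinks with the phrase:
few_strong = anchor_weight([6, 6, 6])  # 3 * 6**6 = 139,968
many_weak = anchor_weight([4] * 8)     # 8 * 6**4 = 10,368
assert few_strong > many_weak          # the few strong links trump
```

Note that the conclusion barely depends on the base: for any base above roughly 1.64, three PR6 links outweigh eight PR4 links, which is the "few strong links trump many weak ones" claim.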
I can only speak from my experience, but it is counter to your hypothesis. I, like many of the former top 10 sites in my category, took a huge SERP drop, and on a keyphrase that is the first three words of my title (e.g. TITLE: keywd1 keywd2 keywd3 blah blah blah blah blah). Links in are in the form of: keywd1 keywd2 keywd3: descriptive info.
Of the many theories I see floating about, I think yours is definitely a strong candidate. I have a theory built off other theories, which starts on another thread here
After just watching (or should I say wading) through the update pages, and listening to different reactions and results..
I am wondering if there are several sets of new criteria. Is it possible that certain sites (i.e. the SERP giants) have one set of standards and criteria, the medium-to-large sites mentioned in the above thread (which has affected most of us here) a different set, and the smaller sites yet another of their own? I realize this sounds absurd, but think about it. The SERPs are going to hold their weight for certain criteria depending on the magnitude of the site. There could be a filter or algo that examines a site (theme, PR, popularity, etc.), and the site is given a set of standards to be judged on based on which "category" the site falls into. For the medium to large sites, it may not have as much to do with the quantity of inbounds, but perhaps the description, anchor text, or quality..
An example I can give has been discussed before: Yahoo and ODP backlinks are showing for some sites, but not others. Some have been dropped entirely, others added. There really is NO standard on this update that applies across the board; it depends on the site. I have seen a pattern of similar site results, either in the forums or in my own curiosity checking. Certain sites all seem to fall into the same category or standard of this mystery, while others have been relatively unaffected.
My two cents.
That theory could hold some weight, argus. To be honest, the only consistency I've seen this update is INCONSISTENCY. As I waded through the update thread, it seemed for every seemingly valid post and argument, there was a counter story that was COMPLETELY opposite.
Everyone has looked at the commonly known variables, but even if your theory doesn't play out, I think the answer we are all searching for lies in a new (and probably somewhat simple) variable that has as of yet slipped under our radars.
argusdesigns and stuntdubl,
Inconsistency... yes, I hear you. I am looking/hoping for one or two factors that would apply across the board, but that may not be the case. When I read the update thread, I wonder how many people have really analyzed any SERPs closely. It gets really interesting once you do.
Your theory isn't absurd. Maybe there are multiple criteria. It would make some sense to try to make the results more relevant to the market that was searching for it.
Have you compared the number of external links you have with your phrase to the number of external links with your phrase that the top sites have?
|exact phrase anchor text in external links is what matters |
Which continues to prove itself out with each update.
|Exact phrase matching in the title or on the page used to be very important. Now it seems to be much less so |
I suggest it's too early to tell, that spammy results often settle down once Google calms down, and that we may want to wait and see with this one. I at least haven't had time to do more than gather the data in.
|Google may be filtering out or discounting the weight of internal links |
I'm not seeing that from the sites I'm looking at. What I see so far is that the internal linking is being grouped together better in the backward link checks I've been running. I'll watch for any sign of internal links being discounted, but right now I'm not seeing that. Like I said, though, I'm just gathering in the data and haven't had a chance to analyze anything yet. I like to give things a chance to settle down.
|So I think that maybe internal links are discounted unless internal links are all Google has to work with for that page. |
That's an interesting theory. Thanks for the heads up, I'll keep an eye out for that as I work my way through my data.
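The quoted theory, internal links discounted unless they are all Google has for a page, fits into a tiny conditional. Everything below is a made-up placeholder: the weights, the discount factor, and the structure are my illustration of the theory, not a known formula.

```python
def link_score(external_weights, internal_weights, internal_discount=0.1):
    """Toy scoring: internal links are heavily discounted when external
    links exist, but counted in full when they are the only links a page
    has (all numbers here are made up for illustration)."""
    if external_weights:
        return sum(external_weights) + internal_discount * sum(internal_weights)
    return sum(internal_weights)

# A page with 2 external + 50 internal phrase links vs a page with
# only the 50 internal links:
mixed = link_score([1.0, 1.0], [1.0] * 50)   # 2.0 + 0.1 * 50 = 7.0
internal_only = link_score([], [1.0] * 50)   # 50.0
assert internal_only > mixed
```

Under this toy rule, the internal-only pages keep their full score while the page with a couple of external links sinks, which is the pattern reported earlier in the thread.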
|I am wondering if there are several sets of new criteria. Is it possible that certain sites (i.e. the SERP giants) have one set of standards and criteria, the medium-to-large sites mentioned in the above thread (which has affected most of us here) a different set, and the smaller sites yet another of their own? I realize this sounds absurd, but think about it. The SERPs are going to hold their weight for certain criteria depending on the magnitude of the site. There could be a filter or algo that examines a site (theme, PR, popularity, etc.), and the site is given a set of standards to be judged on based on which "category" the site falls into. For the medium to large sites, it may not have as much to do with the quantity of inbounds, but perhaps the description, anchor text, or quality.. |
It seems to me that just a few minor tweaks to the algo can have the same effect. For example, the Google bombing with "go to hell" could be handled fairly well by changing the link_text term in the algo to

(link_text * (linking_site_PR / 6))
In addition to the normal place where the linking site's PR is considered, they decide to modify the link text weight specifically by that, discounting any sites < PR6 and promoting link text from sites > PR6.
Now if they change those PR numbers to the voting value of the different PR levels, it would make the text in the links of a PR4 site worth effectively 1, and the link text from a PR8 site would suddenly be worth almost as much as most of the other factors combined.
BTW, I am not saying that that is what they did.
Little tweaks can do a lot.
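The hypothetical tweak above could look like this in code. Both variants are pure speculation: the /6 threshold and the exponential "voting value" base come straight from the post's guesswork, not from anything Google has disclosed.

```python
def anchor_text_weight(text_score, linking_pr):
    # linear variant: discount anchor text from sub-PR6 sites,
    # promote anchor text from sites above PR6
    return text_score * (linking_pr / 6)

def anchor_text_weight_voting(text_score, linking_pr, base=6):
    # "voting value" variant: substitute each PR level's (assumed)
    # exponential value, normalized so a PR6 link is worth 1x
    return text_score * (base ** linking_pr) / (base ** 6)

# In both variants a PR4 anchor shrinks while a PR8 anchor is amplified,
# far more dramatically under the voting-value version:
assert anchor_text_weight(1.0, 4) < 1.0 < anchor_text_weight(1.0, 8)
assert anchor_text_weight_voting(1.0, 4) < 1.0 < anchor_text_weight_voting(1.0, 8)
```

The voting-value variant makes a PR4 anchor worth 1/36 of a PR6 anchor and a PR8 anchor worth 36 times as much, which matches the intuition that high-PR link text could then outweigh most other factors combined.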
An alternative to your "several sets of criteria" theory, Argus, is that this algo change affected medium to medium-large sites dramatically, probably via the counting/weighting of links in, which is roughly comparable across these sites in different categories (this is on another thread so I won't go into detail here).
The result: giants are untouched (if you have 5,000 links in, losing 1,000 won't hurt you) but small sites vault over the medium-big sites which used to get a huge serp boost from their 50 - 300 links in.
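That arithmetic can be made concrete under one added assumption of mine (not stated in the thread): that the rank boost from inbound links grows roughly with their logarithm. Then losing links hurts in proportion to the fraction lost, not the absolute count, so the giant barely notices while the medium site's edge evaporates. All counts below are hypothetical.

```python
import math

def boost(links_in):
    # assumed: rank boost grows with the log of the inbound link count
    return math.log10(max(links_in, 1))

# Hypothetical numbers: a giant loses 1,000 of its 5,000 links,
# while the update discounts 200 of a medium site's 300 links.
giant_drop = boost(5000) - boost(4000)   # small change
medium_drop = boost(300) - boost(100)    # most of the edge is gone
assert medium_drop > giant_drop
```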
I think we all have some very plausible theories, and in one way or another we may all hold a piece of this puzzle. It will be interesting to pan through the data after things have settled down. I can't quite put my finger on it, but as sasquatch mentioned, a minor tweak can have a huge effect; it is an element to keep in mind. Also, remember that even the 'theory of chaos' (inconsistency) has a very predictable and consistent nature.
Funny that you should mention Chaos Theory. I was just thinking of ways that might apply to SE design. I would not be at all surprised if there were a few chaos hackers working on their code.
As it is, PageRank fits into chaos theory quite well. Time to go dig out a couple of my old books.
I like your theory. I have been thinking that this may be part of the new algo also. It would be really easy to change the weighting of the PR scale. I also agree with what you wrote in another thread about LYAO on people looking for one factor. I am sure there are multiple factors that Google has tweaked.
I like your chaos theory comment sasquatch!
When you think about it, it must be a nightmare for Google to try and predict the effects of even minor algo changes. All they can realistically deal with is summary data given the time constraints, so unanticipated shifts that are quite significant may go unnoticed. I assume they also track the effect on major players, as they don't want the trouble that would come from delisting a Fortune 500 company, but that leaves the rest of us vulnerable.
Ahh yes, Chaos theory: fractal geometry. Is that the Lorenz theory or something? Trying SERPs... I think this is definitely something to look into. Type Chaos Theory into search, and in the SERPs look at #9. Sound familiar?
Combine some basic logic, fractal geometry and chaos theory and voilà! You now have the key to the algo. Well, that is, if you can relate the infinite possibilities.
Would you maybe be thinking of this little line: "Chaos theory also covers the reverse: finding the order in what appears to be completely random data."?
Infinite possibilities are not a problem if you do not want an exact answer. If all you are interested in is probabilities (I feel lucky), then it plays out well. You are just working to narrow the possibilities and sometimes you are wrong.
I imagine that after just about every update there are a lot of people down at the Googleplex wondering "how the hell did that happen", and they even have access to the algo.
That is exactly the line I was thinking!
|Infinite possibilities are not a problem if you do not want an exact answer. If all you are interested in is probabilities (I feel lucky), then it plays out well. You are just working to narrow the possibilities and sometimes you are wrong. |
That is exactly what we do as SEOs... great analogy on the "feeling lucky". And you're right, people at the Googleplex are probably wondering that, and they probably have to call their MIT class-buddies at NASA to get the big picture.
MIT class buddies? ...I thought they were from Stanford :)
Neutralizing the googlebomb by discounting lower quality links would make a lot of sense of the changes we've seen in this update.
Why The Search Engine Is Not Always Right [forbes.com]
Google can't afford bad press that implies that they have bad results/or that they are spammable. Problem is, their solution has caused a whole bunch of new irrelevant results. I expect some of this to roll-back in coming months.
When I'm on my deathbed (which could be soon, after the latest update), would you visit me and show me the month-by-month algo history for the last couple years? I promise not to tell...