September 2002 Google Update Discussion - Part 1
Discussing the major changes that took place
How on earth can they justify dropping sites that were ranked in the top 10 and are now page 20 and NOTHING at all has changed on the sites from the last month?
The biggest thing is they move the toilet mid stream without a hint they are going to do it...(change the rules)
Google's a joke..
tired of their games..
off to support ANY other search engine.. enough of this every-month change-the-rules nonsense.. goodbye Google.. good riddance..
When I'm on my deathbed (which could be soon, after the latest update), would you visit me and show me the month-by-month algo history for the last couple years? I promise not to tell...
|Site 2 of the search above I didn't mention. It's not a relevant result: it's a big software company with a PR of 8, and the two keywords appear separately on the page. |
I see this happening on terms I track.
I made some new pages for a two word search term where last month none of the competing pages in the SERPs were in DMOZ and most had PR 3-4. I thought I could get my new pages into the top 10 SERPs pretty easily with links from a couple of on-topic PR5 pages.
The new pages didn't rank as well as I thought they would under the latest algo, but I was surprised to find some other existing pages of mine in the SERPs for this search term. These existing pages only had the keywords in the text, and sometimes the keywords weren't even together on the page. They weren't particularly high PR pages (PR5), but they were in DMOZ. With the new algo all of the top 10 SERPs for this term are now DMOZ pages, completely different results from last month where none of the SERPs were in DMOZ.
Question about Google...
I noticed that my site had a number of superfluous and irrelevant key phrases repeated 2, 3, and even 4 times. This obviously dilutes the key phrases that I would like my site to appear for. How does removing these repetitious terms affect positioning?
Once removed, my #1 search phrase represents 38% of the two word phrases repeated on my home page and is repeated 19 times. Is this too high?
[edited by: mbennie at 8:11 am (utc) on Sep. 30, 2002]
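For what it's worth, the kind of phrase-share check mbennie describes can be scripted. Here is a minimal sketch; the sample text and the "blue widgets" phrase are made up for illustration, and this simple whitespace split ignores punctuation and markup:

```python
from collections import Counter

def two_word_phrase_shares(text):
    """Count every two-word phrase (bigram) on a page and report
    each phrase's count and its share of all two-word phrases."""
    words = text.lower().split()
    bigrams = [f"{a} {b}" for a, b in zip(words, words[1:])]
    counts = Counter(bigrams)
    total = len(bigrams)
    return {phrase: (n, n / total) for phrase, n in counts.items()}

# Hypothetical page text; "blue widgets" is 2 of the 6 bigrams here.
stats = two_word_phrase_shares("blue widgets and more blue widgets here")
count, share = stats["blue widgets"]  # count == 2, share == 1/3
```

Run on the real page text, the share for the #1 phrase is what the post above calls its percentage of two-word phrases.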
PR, internal links, external links, metas, keyword density, etc. I think what they do is have a big Wheel of Fortune type wheel with many different algos. Each algo has a different % for each factor.
Example spin the wheel and it can land on 20%PR, 15% external links etc. Each update they spin this wheel and come to the different forums to see the chaos they have produced.
I used to do fine with "Exact Phrase External Links".
1. What I see with this update is that if the anchor text is not in some important part of the on-page text (title/H1 etc.) of the page receiving the link, the ranking falls. This is a logical step toward combating googlebombing.
2. A one-off high-PR anchor-text link equalling "search query" I had from a topical directory (not ODP) pushed me onto the first page two updates ago. With this update I'm on page three for that "search query".
One could say Google gives you a temporary boost for a new quality link, but I would guess that the "logarithmic PR" boost of that anchor text in ranking has been tuned down. That is, an external incoming anchor-text link from a PR7 page containing "blue-widgets", pointing at a page with "blue-widgets" in all the important areas, is no longer worth the log factor to the power of three of the same anchor text coming from a PR4 page (not that I ever believed it was such a strong factor in ranking). In any case, some more even distribution has been put in place. This is a logical step against buying high-PR anchor-text links.
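To make that "log factor to the power of three" concrete: if an anchor-text link's weight grew exponentially with the source page's toolbar PR, the gap between a PR7 and a PR4 source would look like this. This is purely a toy model, and the base of 3 is an assumption, not anything Google has confirmed:

```python
def anchor_weight(source_pr, base=3.0):
    """Toy model: treat toolbar PR as a log scale, so a link's
    weight grows as base**PR. The base of 3 is an assumption."""
    return base ** source_pr

# Under this model a PR7 link is worth base**(7-4) = 27x a PR4 link.
ratio = anchor_weight(7) / anchor_weight(4)
```

Tuning that boost down would mean flattening the base toward 1, which matches the "more even distribution" guess above.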
Also I am interested if people did this backlink check "all in url check" as I mentioned in the emotions thread. [webmasterworld.com]
Please check for yourself using the function: "allinanchor: keyphrase"
and comparing those results with a normal search for "keyphrase" in both www and www2.
"Allinanchor: keyphrase" results used to be nearly identical to "keyphrase" results for medium competitive keyphrases.
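A quick way to quantify that comparison is to measure how much the two top-10 lists overlap. Here is a sketch with made-up URL lists; in practice you would paste in the top results from each search by hand:

```python
def top_n_overlap(results_a, results_b, n=10):
    """Fraction of the top-n results that appear in both lists."""
    shared = set(results_a[:n]) & set(results_b[:n])
    return len(shared) / n

# Hypothetical top-3 lists for "keyphrase" vs allinanchor: keyphrase
normal_results = ["siteA.com", "siteB.com", "siteC.com"]
anchor_results = ["siteB.com", "siteC.com", "siteD.com"]
overlap = top_n_overlap(normal_results, anchor_results, n=3)  # 2/3
```

An overlap near 1.0 would support the "nearly identical" observation above; a drop after the update would support the theory that anchor text's weight changed.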
Yes, that seems likely to me. I posted elsewhere that my many sites were largely OK or slightly improved with this update. Anchor text closely correlates to phrases for which target pages were optimized. Your post would explain what happened for me.
vitaplease, I think you hit it on the nose.
Many of the inbound external links I have contain a phrase similar to "Blue and Red Widgets at Widgetworld.com" My title says the same thing. My H1 tag says the same thing.
Consequently, my rankings for "blue widgets" and "red widgets" have gone through the roof. Even related phrases like "blue wodgets" and "red bidgets" are better, because "blue" and "red" match in the external links as well as in the H1 and title.
"To be honest, the only consistency I've seen this update is INCONSISTENCY"
Googleguy - Where do we go from here?
egomaniac, I am curious whether your study took into account how those external links were placed in the code, and the freshness factor of the linking pages.
If Google allows you to filter your results by date and by where the term appears in the code, then this data is already stored and can be easily referenced during updates.
We already know how important freshness is to Google with that whole 'Freshness Date' thing they used to display in serps. (now just a date) I would think it common sense on Google's part to not give as much weight to links coming from old dusty pages. Maybe this has become more or less of a factor...
We also know that how a link is factored into an entire page does play a role in its importance. A link on a side bar with just a small font text link and maybe an image or a few words is a lot different than the same exact link buried in the middle of a relevant paragraph, regardless of the PR on either page. A link needs to be surrounded by content and relevant to the page it sits on and links to for the full effect. How many of those links come from the 'Similar pages' subset? How many come from pages with the term in the title or in a heading? It all has to match up. I think this may be the main factor in the shifts you're seeing right now. It is probably being tweaked a bit...
We know Google looks at the code on the page quite a bit to get rid of the link farms and link swapping programs, so this is where I would look for changes in the future. The flow of the code is where the secrets lie...
...just my theories. I don't do this work as much anymore, so the time spent searching for the answer I'll leave to you sharp kids.
Last month I was No6 for keyword keyword. Now I'm No2.
One of my competitors was at No2 and is now on the 2nd page ;)
My competitor has hundreds of 'keyword keyword' anchor text links pointing to his site. I have about 10.
So, I don't know what is going on but I like it.
|One strange observation that I am getting: Normally, Google gives mysite.com|
the same page rank, except perhaps when they are first indexed. Why would a big site like Commission Junction be displaying different page ranks for the mysite.com version and the www.mysite.com version?
http:// version gets page rank of 3
http://www. version gets page rank of 7
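The usual explanation is that Google treats mysite.com and www.mysite.com as two separate URLs, so links (and hence PR) split between them. Webmasters typically fix this with a server-side 301 redirect to one canonical host; the mapping itself is trivial to sketch (preferring the www form here is an arbitrary choice):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Normalize a URL to its www form so that both host variants
    map to a single canonical page."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if not host.startswith("www."):
        host = "www." + host
    return urlunsplit((parts.scheme, host, parts.path,
                       parts.query, parts.fragment))

# Both variants collapse to one form:
same = canonical_url("http://mysite.com/") == canonical_url("http://www.mysite.com/")
```

With a redirect in place, links to either form accrue to the same page instead of splitting the PR 3/7 way described above.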
Here is a brief history that may provide some insights:
In December 2000, a site I manage was #15 for a very competitive two word phrase (3 million results). It had been optimized for:
1) Keyword1 Keyword2 in the title, H1, text, etc.
2) 250 or so natural, external links that had built up over time. No control over the link text... Pagerank was 5.
I decided to join the fray and built some interlinked mini-sites with similar themes and with Keyword1 Keyword2 as the link text to the site in question. By February 2001, it was #4, and then #3 by March (Pagerank was still 5). I decided to back off on the linking strategy as I was dominating too many keyword listings with too many sites. (As I said, it is a very competitive phrase).
By July 2001, it was 16, and then out of the top 30 by August. Since then, the top 10 has been dominated by Fortune 500 companies and big government sites, with a couple exceptions. Currently the Pageranks of the top 10 are: 8,7,8,8,8,6,5 (Exception: this is an industry trade association...?),7,7,6 (Exception: independent site).
In May 2002, I was able to add over 100 backlinks with Keyword1 Keyword2 as the link text from external pages of PR 6 and 7 and unrelated themes. This boosted the site to 8, up from 80. It also boosted the Pagerank to 6. In June, I was able to add a few more PR7 backlinks. The June update had the site at 7 with PR 7. By 9/1/02, it was 5.
During the September dance, it was as high as 4, but the new algo dropped it to 16.
My tentative conclusion is that the theme of the linking sites is more important in the new algo. I will be developing some hard data to see if this bears out. Meanwhile, I don't plan to make any changes until at least another update. I have been whacked before, only to be restored the next month.
My total backward links (link: www.mydomain.com) went down with this update from about 2,500 to 1,000. The reason is that internal links do not show up any more for most pages with PR4. Pages with external links or with PR5 or PR6, however, still show backward links. This would support your theory that some sort of filtering out of internal links took place in this update.
Also your theory of exact phrase matching in external anchor text with text in <title> or <Hx> sections would explain the differences in ranking of many of my pages (many up, some down). And it makes perfect sense as an answer to the Google bomb problem.
> What I see with this update, is that if the anchortext text is not in some important part of the on-page text (title/H1 etc) of the page recieving the link, the ranking falls.
Interesting. I had been doing quite well (multiple top ten rankings across a variety of keywords) the past 5 months by building pages with the keyword in important places on the page, and then using the keyword phrase in internal anchor text links. With this update I saw a drop on many phrases. If this theory about "external links + anchor text" is what drives the SERP for a keyphrase, then my ranking drop would make sense.
I just reviewed the anchor phrases of my own external back-links, and got a reminder of how few of them contain my target keyphrases. Most of them contain only one of the two needed keywords of the target phrase for my domains. The main reason I don't have a lot of external links with my keyphrases is that I have been finishing building the site.
On the two pages that I studied, all of the incoming links with the anchor text were links that had been around for awhile. Some were DMOZ links, some were relevant topical directories, and some were just links pages.
I don't think that total quantity of links matters. I think it is the cumulative PR coming from the pages with the linked anchor text that makes the difference.
My backlinks went down as well. I lost a couple of dozen or so internal links (from 75 to 50). But I also lost some PR5 external links (with good relevant anchor text in my links), which I can't figure out a reason for.
Hmm, so ya gotta have dmoz listings to get high rank, eh?
We all know where this is leading. I'll go in, sign up to be an editor, put my site on, and smack out non-relevant sites. Then do nothing else.
Nice going, google.
I don't know that you gotta have DMOZ listings to rank high. I think you just have to have links from high PR pages with your search phrase in the anchor text. I think the DMOZ factor may just be a coincidence with the fact that many DMOZ pages are high in PR.
On the other hand, someone has suggested to me that the keyword context of the page giving a link (e.g. a page with an external link to you) may also matter in how much influence it throws towards the page receiving that link (e.g. the page you want ranked). I think this is possible also, though I haven't tested for it yet.
One or both of these theories could explain why a DMOZ link matters and is a good thing to get. But getting a DMOZ link is not an exclusive way to boost your site, in my opinion. Besides, think about it from an engineer's perspective. Boosting the influence of DMOZ alone in the algo is a crude hand-tweak. Factoring in the source page's PR, anchor text, and (possibly) keyword context is *elegant*.
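That "elegant" combination can be illustrated with a toy link-scoring function. The multipliers below are invented purely for illustration; nobody outside Google knows the real weights:

```python
def link_score(source_pr, anchor_matches_query, context_matches_topic):
    """Toy model: score one inbound link from its source page's PR,
    whether its anchor text matches the query, and whether the
    surrounding page is on-topic. All weights are assumptions."""
    score = float(source_pr)
    if anchor_matches_query:
        score *= 2.0   # anchor text contains the search phrase
    if context_matches_topic:
        score *= 1.5   # linking page is topically related
    return score

# A PR7 DMOZ link (matching anchor text, on-topic category page)...
dmoz_link = link_score(7, True, True)       # 21.0
# ...beats a PR7 link with no anchor or topic match:
generic_link = link_score(7, False, False)  # 7.0
```

Under any model of this shape, DMOZ links score well because they tend to be high-PR, keyword-anchored, and topical, without DMOZ itself needing a special-case boost.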
I have an entire site indexed in Google with on-page factors strongly optimized, PR6, and ranking pretty well for most of the keywords. It has 25 pages. It doesn't have many external links (no more than 20) pointing to it with keywords in the anchor text. There are 40 internal links pointing to each page with the keyword. Of course it is in DMOZ and Yahoo. The market is pretty competitive (1 million pages). Last month it was #4, #6 (more or less in the middle of the SERP) for a few keyword combos. Now, after the update, I am #1 and #2 for 5 combinations of the main keyword (blue widget, red widget, etc...). My main competitors have disappeared and there are 200,000 fewer pages for the most competitive KW combo. I think that, at least in this case, internal links are really, really important.
I strongly confirm the observation and analysis of vitaplease. I'd like to emphasize that the single importance of high PR has been dropped and would like to add to the discussion that the importance of an Inbound-Link from ODP with the kw-phrase in the Title = Anchor-Text may also have been reduced.
If all these observations are correct, we now have a good reaction of google on the following problems:
- People purchasing links to improve PR
- People abusing categories of the ODP to produce "Keyword1 Keyword2" titles
- People doing google-bombings.
I see the following problem: in reducing the abuse potential of these ranking criteria, Google may have also reduced some good and very powerful aspects of its algo. One of the good things was the relatively low importance of on-page source code compared to the high importance of inbound links plus specific anchor text, which made source-code spam useless; now it may become effective again.
The most dramatic thing is that this update still doesn't take care of guestbook entries, even though this technique has filled hundreds of threads on different message boards.
I'll give you an example: since the new update, positions 1-3 on the SERP for one very competitive keyword string all have the following factors:
- between 100 and 400 Inbound-Links from Guestbooks ;-)
- no dmoz-listing, no or almost no other inbound-links than from the guestbooks
- high to massive source-code density within hx and normal text
I am wondering why guestbooks aren't filtered yet. Does anyone have an answer?
All these results, theories, and ideas are backed by information gleaned from totally different industries. We have to understand as a body of interested parties here that what may be working for one industry may not work for another. Look to the obvious, of course, but take a step back and see what's happening in the industry you are marketing in. These industry-wide changes can greatly influence across-the-board results. It's good to watch for new businesses coming in, and for older more established sites changing positions, and why. I think we need to consider that it may be something else besides the search engine (or Google specifically) that may influence the rankings.
There's a reason Brett's great Google site takes 12 months for success, but think of what one person in one industry can do over a twelve month period to change an industry's rankings. Now, for competitive niches, don't you think more than a few of those site directors read here and are perhaps applying that theory to their strategy? Think of all the great SEOs working for themselves and others and how they probably read here, follow advice, and then apply it. How long does that take to start working through, and how do those apparently subtle changes eventually rock the results an industry has grown accustomed to? Do we attribute this to changes in Google's algo, or to great SEO practices being applied and starting to take hold?
Just something to consider while we look at this months results.
|1. What I see with this update is that if the anchor text is not in some important part of the on-page text (title/H1 etc.) of the page receiving the link, the ranking falls. |
This is a logical step towards facing googlebombing.
I think you're right, vitaplease.
A common googlebomb was "go to hell" (between quotes). The first page that appeared in the list was Microsoft (I suppose this was a joke in response to Microsoft's "Where do you want to go today?"). There were also some other pages from giants like AOL. This happens no more. The page which was the second one in the listings is now the first one and MS and AOL don't appear.
I normally see the results from .co.uk reflecting the results from .com then filtering out results that are not uk sites.
A change I can see is that PR seems to have a strong effect in UK results and be toned down in .com
I have a UK site top for a search term in .com (it used to be top in UK results) but beaten by 2 slightly higher PR sites in UK results.
Small stuff but something.
I thought until today that you were right. But I found sites of mine which carry the keyword only as outgoing links (and that is the only place it appears) and which are positioned better than the site which is the target of the above links and has the keyword in its domain, title, and H1.
PR is identical on all sites.
paynt: >>There's a reason Brett's great Google site takes 12 months for success<<
I have been a disciple of Brett's for two years, and I thought I was doing my sites according to his gospel. They ALL were bashed by Google this update. (The site I describe above went from 5 to 16, but others fared far worse.)
Either I have misapplied some of Brett's teachings, or the Google algo did in fact undergo a major tweak.
It would be good to hear from Brett. How did the "Great Google Site" fare this month?
I feel any Google algo change should have subtle, barely perceptible effects. Such a wild change in results (as far as I can see) cannot be good especially because their results have been so great in the past.
Thank you, Paynt, for a little grounding.
Several threads with discussions of the September 2002 Google update combined, continued here in Part 2:
Discussion of September 2002 update: several threads combined. Second of two parts, beginning in Part One [webmasterworld.com]
Everyman made an excellent post in another thread calling for everyone to put their cards on the table, but it was buried on page 10 and was ignored, so I think we should start in a new topic (say here).
Let's share some specifics, without giving away who we are, so we can analyse what's happened here.
There was a big reshuffle in many areas for "Location widgets" and "widgets location". This happened in 2 cities we are active in. We lost 1 domain in each city. We still have our main site at the top which we try to keep squeaky clean. The ones we lost were our "insurance sites". They are small time with less content and only the first page or 2 on each site were optimised.
The big surprise was the return of a site which still has only pagerank 3, but has MANY incoming links with good anchor text, because they run an affiliation scheme. They have been out of the SERPs since 2000. They now beat out many better optimised pages with far superior pagerank.
1) So why are they back, and why did we drop? Well, I can tell you one thing that was common between all sites that dropped:
All my internal links to the home page contained only the word "home", whereas the site which stayed top has links to "Location1, widgets, location 2, location 3". We just did this last month to make sure we maximized the value of our internal links.
This suggests anchor text on internal links is key.
2) Our sites which dropped have FAR fewer incoming links, again suggesting that the number of incoming external links with good anchor text is key.
3) Our competitor's site has hundreds of affiliate links containing the text "widgets Location". Even though he is only pagerank 3, he ranks third in the SERPs. This suggests the number of incoming links with good anchor text is important.
I hope I have kicked off a useful discussion. If you have anything to add, please do, and be specific, and let's keep emotions out of this thread. Thanks.
[edited by: NFFC at 11:23 am (utc) on Sep. 29, 2002]
[edited by: Marcia at 9:51 pm (utc) on Sep. 30, 2002]
[edit reason] Added reference to Part 1 [/edit]
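One way to audit the internal-anchor-text point on your own pages is to pull out every link's anchor text and look for internal links that just say "home". A minimal sketch using Python's standard html.parser; the sample HTML is hypothetical:

```python
from html.parser import HTMLParser

class AnchorTextCollector(HTMLParser):
    """Collect (href, anchor text) pairs from an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

collector = AnchorTextCollector()
collector.feed('<a href="/">home</a> <a href="/w">Location widgets</a>')
# collector.links == [("/", "home"), ("/w", "Location widgets")]
```

Any internal link that surfaces with anchor text like "home" or "click here" is a candidate for the keyword-rich rewrite described above.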
I was going to post in the other thread but it was too long.
I have had a look around, and I have a feeling that something which I think started in the previous update has been given more weight in this update: freshness.
I believe that pages with more frequently updated content are being given more weight. So I believe the principles from before still apply in terms of keywords, titles, etc., but that the freshness aspect is now far more important than before.
One of our sites this month dropped one PR but has increased in position from 3 to 1 (even beating out CNN who were 1) and other pages have dropped from 2 to 5 (all on first page).
I also note that when searching for X, some sites (as we do) have a page listing plus an indented listing. Our indented listing has changed this time round.
All this (without giving specifics, I know) leads me to believe that Google has altered its algo slightly: it wants not only the most accurate results but also the most frequently updated and freshest results. I humbly believe this was tested in the last update and is now in full force.
Of course I could be wrong !
Edit in: of course this has one big flaw, which may be the reason for so much discourse this update, and that is that some pages do not need to be updated on a regular basis.
If you take the examples of <edit>widgets</edit> as SlyDog gave above then those pages would not need to change greatly every month, neither would many other pages, and so this may give smaller, newer, and possibly spammier sites a chance.
[edited by: NFFC at 11:26 am (utc) on Sep. 29, 2002]
|This suggests the number of incoming links with good anchor text is important |
I think you're a bit late... That's what used to be important, but it doesn't always seem to be the main factor now. That's the difference between this update and the previous one.
Looking at it logically, what was wrong with Google? Judging from the size of the Google forum here and by the amount of posts praising Google - not a whole lot.
However, what was the single biggest threat to Google's reputation? Page Rank for Sale. This had all the makings of a disaster, and what we are experiencing from this update is an attempt to combat PR for Sale, it had to be done immediately and it also had to set an example.
Unfortunately there is absolutely no way you can combat the problem of PR for Sale without causing:
A) A massive change in SERPs
B) Some widespread damage in the process.
I think the biggest change in SERPs is apparent in very competitive money making areas. This is because these areas are worth spending time and money optimising sites for.
It's one thing having a beautifully optimised site, but a high page rank is the seal of approval. It's the difference between being #20 and #1.
Google has apparently discounted the importance of backward links (maybe only in competitive areas) and paid special attention to any backward links that may have been gained by unsavoury methods.
It would explain why one site of mine that I haven't touched in over 3 months goes from strength to strength every month, while others, for which I have actively been seeking links, have suffered slightly.
Google are enforcing the way in which backward links are supposed to function, i.e. genuine "votes" for your site.
Other than that, what seems to be a major change in SERPs in competitive areas is actually the void or gap being filled by sites that would have normally been on page 2 or 3 which are still relevant, but just not so in your face relevant.
"Real" Pagerank (as opposed to a toolbar guess) has been given to dynamic pages for the first time. Dynamic pages tend to be updated more frequently. That's why I suspect that there's been a major shuffle in some competitive areas. Those areas - like <edit>widgets</edit>, to use the above example - tend to have a lot of dynamic sites for all the different information. Pages "not yet ranked by Google" have always shown up in SERPs, but there would have to be an almost exact term match and little competition for one to show up near the top. Now that there are millions of pages with a real page rank that didn't have one before (even a PR1 is better than not ranked, right?), the SERPs are going to be majorly different in those areas.
[edited by: NFFC at 11:27 am (utc) on Sep. 29, 2002]