
Update Austin Part 2



11:04 pm on Jan 28, 2004 (gmt 0)

10+ Year Member

continued from: [webmasterworld.com...]

My new website's PR was raised from PR1 to PR5 in the last PR update, but I still haven't noticed any improvement in the SERPs!

Does it take time for SERPs to improve, or should I lose hope?

Is it worth chasing PR5, PR6, or PR7 when people find you in the SERPs via a five-word keyword phrase?

Who said that PR4 websites get around 300 hits/day and PR6 websites get around 3,000 hits/day, and so on?

Hoax! My PR5 website barely gets 100 hits/day while another PR5 website gets 17,000 hits/day!

And yes, in case you were wondering, I am very much a newbie in the SEO world.



9:29 am on Jan 30, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

I had thought the whole thing was just another Snafoogle, but I now have to assume it's both deliberate and momentous.
We may well be watching a chrysalis turning into a butterfly; the in-between stages are slow, messy, and can be quite repelling.


9:34 am on Jan 30, 2004 (gmt 0)

10+ Year Member

As some others have already reported, I see *very* different results on local versions of Google (.nl, .de, .se, etc.) vs. Google.com. When I search for one of my sites, e.g. old blue widgets, I'm not found in the first 100 results on the .com domain.
However, the localized versions show me in 4th position. Now here's the funny bit: when I change "&hl=en" in the query string of the .com search to a localized language (hl = home language, I guess?), say "de" or "nl", the site is listed 4th on the .com!

Do you guys think Google ranks content based on geographical location? It can't judge it by the content, as the whole site is in English. Maybe it sees a European IP address and decides the site shouldn't rank well in the US (and other countries :/ ) because of that? Just a theory, though...
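For anyone who wants to reproduce this comparison, here is a minimal sketch that builds the same query under different interface languages. The helper name and defaults are my own; `hl` is the interface-language parameter the poster is describing:

```python
from urllib.parse import urlencode

def google_search_url(query, hl="en", host="www.google.com"):
    """Build a Google search URL with an explicit interface-language
    (hl) parameter. Helper name and defaults are my own."""
    return f"https://{host}/search?{urlencode({'q': query, 'hl': hl})}"

# Compare the same query under different interface languages:
for lang in ("en", "de", "nl"):
    print(google_search_url("old blue widgets", hl=lang))
```

Open the printed URLs side by side and you can check whether only `hl` (and not the country domain) is enough to change your ranking.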


10:39 am on Jan 30, 2004 (gmt 0)

10+ Year Member

If G is using local rank, then they have not introduced the IP flush. I can see a very competitive SERP on G UK with two sites in the top 10 sharing the same IP.


10:45 am on Jan 30, 2004 (gmt 0)

10+ Year Member

Have you tried several local Google searches vs. G.com? If so, are your positions different as well (better on local)?


11:12 am on Jan 30, 2004 (gmt 0)

10+ Year Member

help@google.com and webmaster@google.com will not respond to Austin inquiries.


5:03 pm on Jan 30, 2004 (gmt 0)

10+ Year Member

I show the following data centers aligned now:
www-dc: DEFUNCT 2004-01-25
new-dc1: []
new-dc2: [] www.google.akadns.net
new-dc3: []
new-dc4: []
new-dc5: []
new-dc6: []
new-dc7: []

What does this mean, if anything?


5:26 pm on Jan 30, 2004 (gmt 0)

10+ Year Member


I've just noticed the same!

Doing a two-keyword search for my site (an Irish site) on google.it brings the site up second for those particular keywords.

But on google.com it is nowhere... that is, unless I change the language from 'en' to 'it'.

But why is this? My site is in English. Surely that's a major flaw.



5:28 pm on Jan 30, 2004 (gmt 0)

10+ Year Member

Tribal, I have exactly the same thing, but although I live in Europe my site is on a server in the USA, so I don't understand it. It is very, very odd!


5:33 pm on Jan 30, 2004 (gmt 0)

10+ Year Member

Robert123, nothing's changed there for me. There is still different data there a good percentage of the time.


6:40 pm on Jan 30, 2004 (gmt 0)

WebmasterWorld Senior Member kaled is a WebmasterWorld Top Contributor of All Time 10+ Year Member

It's widely believed that Florida introduced filters that had the effect of junking many highly placed results. Therefore, the most likely explanation for national Googles such as .de showing different (better) results is that the filters have been changed or omitted.

This raises the question: is it a mess-up or a policy change?

I just did a few quick searches on .de and found pre-Florida-type results. One of my own rankings went from #80 to #20.


[edited by: Marcia at 4:43 pm (utc) on Jan. 31, 2004]
[edit reason] Minor semantic modification. [/edit]


6:49 pm on Jan 30, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

>> Do you guys think Google ranks content based on geographical location?

Yes, change the interface language (hl=xy) and you get different SERPs. It's been like that for some time - I have one interesting example from www.google.com:

A keyword that is identical in Danish and Norwegian - it even means the same thing. There are a Danish page and a Norwegian page in the first two spots. If I choose "hl=da" the Danish page is #1, and if I choose "hl=no" the Norwegian page is #1.

Then, if I choose "hl=en" (English), the Danish page is gone completely (there's another page from the Danish site around #450 or so).


7:08 pm on Jan 30, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

If a site is in English and is targeted to English visitors, then why is it being filtered out for EN?


7:58 pm on Jan 30, 2004 (gmt 0)

10+ Year Member

^--- I would like to know the same!


8:06 pm on Jan 30, 2004 (gmt 0)

10+ Year Member

It seems to me that there are now: 1) more pages with titles not containing the keywords at all; 2) more .pdf files. True or false impression?


11:26 pm on Jan 30, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Referring to the first post: generalising approximate hits by PR is not a good idea. A new PR3 website that I built a month back gets 7,000 hits a day, whereas a PR6 website I built 5 years back gets 3,000 hits a day.


12:48 am on Jan 31, 2004 (gmt 0)

10+ Year Member

Are there some generalizations about the algo change?

I have followed Brett's advice and had been doing pretty well, but now most of my internal pages are lost and are only found if I add my city name to the search. A couple of the pages ranked quite high and are simply gone without the name of the city in the search. Any thoughts? Have I been themed as my city?

I don't do anything spammy, home page has a PR5, other internal pages PR4, and subweb PR3.

I'd also like to add my $.02 about the SERPs from a searcher's standpoint: if I want to buy a book about widgets, I'll use the word book in the search.


1:44 am on Jan 31, 2004 (gmt 0)

10+ Year Member

I've noticed that I'm getting a bit more traffic since the update.

However, the terms for which I am found are much less targeted and seem to appear only once in a page.

The result is a much worse conversion ratio, despite the extra traffic.

Face it, people: Google just wants to boost its IPO, and good free terms are becoming an endangered species.


4:15 pm on Jan 31, 2004 (gmt 0)

10+ Year Member

The importance of theming

I am sorry, but I think I misled some people with this statement. What I meant was the importance of Authority. You can be a web site with a reasonable PageRank, very heavily themed to a specific subject, and still find yourself simply vanishing from the SERPs as a result of this Austin update. The reason is that you are not an Authority site, and there are hundreds of "Authority sites" out there that have just bumped you down the index. The important thing is that you have not been banned or penalised in any way; you have just been pushed down by sites whose relevance is highly questionable.

So how do you become an Authority site?

By being:
a) a large site
b) linked to by sites themed in the same area
c) linked to by other authoritative sites with the context of the link being relevant
d) having a reasonably good PageRank (which kind of follows on from the above)

So why have sites disappeared?

We are not convinced that there is a database of "trigger words" that results in a new algorithm being used. Rather, we believe that there are strong themes and weak themes. A strong theme is jobs or hotels; a weak theme is accountants. The stronger the theme, the more dramatic the effect on search results. In any event it is a little academic which of the above is correct. What is beyond doubt is that if you search for accountancy jobs in new hampshire, for example, the results have changed dramatically: instead of a whole load of web sites (or indeed spammy pages), you now get hundreds and hundreds of job portals, university sites, and other authoritative sites related to the word "jobs".

And here is the Nub!

This is the single most important thing to note about the way Google has changed. Do lots of searches for 3-, 4-, or 5-word phrases like the one above, look through the results, and you will find many, many examples of sites that simply are not a good match. Here is what is happening:

In the above example, the strongest-themed word in the search term is "jobs". So Google collects all of its Authority sites related to that word and works out which ones are a match.

Now here is the amazing bit: as long as these authority sites contain the words "accountancy", "new" and "hampshire", they are a match, EVEN THOUGH those words may be in different parts of the page.

Get this, it is very important: if a jobs site on a particular page has one job that is an accountancy job in Los Angeles and another job that is a sales job in New Hampshire, then Google says, yes, that page is relevant!

You may have a bit of difficulty replicating this for job terms unless you go really specific with the discipline or area, but what is a much better example is this:

Take a specific town or village, put it at the beginning of a search query, and follow it with country house hotel (for example "waterloo country house hotel", which, by the way, does not exist) and just watch in disbelief as the portals all come charging at you!

Why does this matter? Well, we have a client who is XXX Country House Hotel and was always number one for that term, as he properly should be. Now, search for XXX country house and he is number one; search for XXX country house hotel (now the authority trigger kicks in, as the word "hotel" is there) and he is not in the first 200 results. When you examine the results it is just staggering: the first two are fair enough, because they are portals that sell rooms in his hotel, but from then on EVERY SINGLE ONE IS IRRELEVANT and is producing the wrong hotels. So, for instance, in one case it is XXX Guest House in another part of the country. Why does that site come up? Because in one place on that page there is XXX (the wrong guest house), and in another part of the page there is YYY Country House.

So Google has said: your sites are all authorities on hotels, and provided that somewhere on one of your pages there exists XXX, in another place country, and in another place house, then you are a match - a match ranked above a web site that contains the full term (being, of course, the hotel's own site)!
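The matching behaviour described above can be illustrated with a toy model (this is entirely my own sketch of the *observed* behaviour, not Google's actual algorithm): a bag-of-words match accepts a page whenever every query word appears somewhere on it, while a phrase match requires the words to occur together.

```python
import re

def page_words(text):
    """All distinct words on a page, lowercased, punctuation stripped."""
    return set(re.findall(r"[a-z]+", text.lower()))

def bag_of_words_match(page_text, query):
    """Relevant if every query word appears *somewhere* on the page."""
    words = page_words(page_text)
    return all(w in words for w in query.lower().split())

def phrase_match(page_text, query):
    """Relevant only if the query occurs as one contiguous phrase."""
    return query.lower() in page_text.lower()

# A page listing two unrelated jobs, as in the example above:
page = "Accountancy jobs in Los Angeles. Sales jobs in New Hampshire."

print(bag_of_words_match(page, "accountancy jobs new hampshire"))  # True
print(phrase_match(page, "accountancy jobs in new hampshire"))     # False
```

The page matches the query under the bag-of-words rule even though no accountancy job in New Hampshire exists on it, which is exactly the mismatch being complained about.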

So what does this mean for optimisation?

What it means for web site owners is that life has suddenly got much more difficult for them, and for all those people trying to get rid of web intermediaries I am afraid that Google has just made this almost impossible.

Your average small business has for a long time accepted that they cannot realistically achieve top ranking under generic terms but this was compensated for by the fact that they could get top under niche phrases, which frankly is a bit like the real world of corporate versus smaller niche businesses. But now that opportunity has slipped for many of them.

In terms of what it means for professional optimisers, well frankly not a lot. It is just a case of moving on to the next optimisation method. Watch out Google, the number of portals you have to index is just about to go through the roof! If you thought 6 billion pages was a lot to index, we think you can add another few billion to that figure in the next few months.

What about Google?

Well, we think that for one- and two-word searches this has probably been a good update, although arguably nothing much has changed in the results in these cases, as theming and the portal effect were already key factors for top generic terms.

However, for 3- and 4-word terms the results are absolutely terrible. If I am looking to contact the XXX Country House Hotel, I have absolutely no chance of finding them, and the portals certainly are not going to give me their number, as they want to sell me a room directly.

We believe that as long as this algorithm stays, Google will lose market share. Already there is talk of it. We ourselves handle traffic at a rate of about 50 million visitors per year, and whilst we lost about 10% of our positions and a proportionate amount of traffic from Google, we have already seen traffic increase from other search engines. It is only a trickle now, but we think it will continue.

The greatest beneficiary will probably be FAST, which is quite ironic. For a search engine that could be adjudged a clone of the old Google (remember when Google was a good search engine, back last week?) and was therefore unlikely to win market share, they must be rubbing their eyes in disbelief at how Google has just stepped aside and let them be the only search engine left delivering relevant results!

I have a very interesting question that I wonder if anyone knows the answer to: does Google ever do usability testing of its search engine, i.e. bring in a thousand users and tell them to start searching? It would not surprise me if they did not, which, when you think about it, is quite staggering considering the position it is currently in and the innate value in the quality of its results.

Light at the end of the tunnel!

There is one glimmer of hope for webmasters: that Google recognises how bad this algorithm is and reverses it, or amends it so that the results are not so irrelevant for 3- and 4-word searches. However, you would not want to bank on it.

I hope this helps clarify some people's thoughts and as I said if you want to see the full report when it is ready, drop me a sticky mail.

[edited by: Marcia at 6:22 pm (utc) on Jan. 31, 2004]
[edit reason] Pls refer to previous stickymail. [/edit]


4:24 pm on Jan 31, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Pimpernel, thank you for such a detailed post. I agree with you completely, as this is the same conclusion I have come to. But you explained far better than I ever could have.


4:27 pm on Jan 31, 2004 (gmt 0)

10+ Year Member

so it is over then--officially?


4:32 pm on Jan 31, 2004 (gmt 0)

10+ Year Member

There is a definite filter on repeated keywords in the title tag that is only applied to certain searches.

My page title reads: "yellow widgets, blue widgets, big and small". For yellow widget I am #3, but I am nowhere for blue widget. The first search is non-competitive (30,00 pages), while the latter is highly competitive (479,000). Prior to the recent algo change I was number 15 for blue widget.

But when I search for "favorite blue widget" I am #1.


4:42 pm on Jan 31, 2004 (gmt 0)

10+ Year Member


"so it is over then--offically?"

It's never over ;) it keeps changing and that will always be.

How far in the future can you plan for?



4:43 pm on Jan 31, 2004 (gmt 0)

10+ Year Member

truth_speak, I am taking a real chance here answering your post blindly, but here is what you should do. Search for widget in Google. Then take the top 10 or 20 results and search within each of those sites for the word "yellow". My guess is that you will find that many of these Authority sites (assuming, of course, that "widgets" is a strong theme) do not contain the word "yellow" but do contain the word "blue", and that is why they are bumping you down the list under blue widgets but not under yellow widgets.

If it is not a strong theme, then the title tag will make a big difference, and I would suggest you switch the order around, as words at the beginning of the title will in general do better.
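The check suggested above can be automated once you have the text of the top-ranking pages in hand. This is a minimal sketch; the site names and page texts are placeholder data (in practice you would paste in the contents of the actual top 10-20 results):

```python
# Hypothetical page texts for the top results under "widgets";
# replace these with the real page contents you want to audit.
top_pages = {
    "authority-site-1.example": "We sell blue widgets and green widgets.",
    "authority-site-2.example": "Blue widgets, red widgets, wholesale widgets.",
    "authority-site-3.example": "Yellow widgets and blue widgets in stock.",
}

# Count how many competing pages mention each modifier word:
for word in ("yellow", "blue"):
    hits = [site for site, text in top_pages.items() if word in text.lower()]
    print(f"'{word}' appears on {len(hits)} of {len(top_pages)} pages: {hits}")
```

If "blue" shows up on far more of the authority pages than "yellow", that would fit the explanation above for why the blue widget ranking dropped while yellow widget held.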


4:50 pm on Jan 31, 2004 (gmt 0)

10+ Year Member


The amount of user data they have has been talked about here; I believe Brett has mentioned it in another thread somewhere. I'm not sure if they are picking Joe Blow off the street, but there are tons of Joe Blows with toolbars. I am sure that they don't just have "users" but have different classes/categories of users: employees, Joe Blows off the street, and Joe Blows with the toolbar. That's what I would do, but then again I would also break those categories of users down even further based on trends and the sites viewed by the users. I would even take it a step further: read up on the P2P papers and think about how you could incorporate the thinking outlined there into the toolbar.

There would be tons to learn from this data as already stated.

We are far from the "end"

Just some thoughts,


4:57 pm on Jan 31, 2004 (gmt 0)


If I search on the names of hotels ("Hotel Whatsit"), I tend to get the hotels' own Web sites at the top of the SERPs. That's a big improvement over a few weeks ago, when affiliate or booking-site pages would have predominated.

On the other hand, if I search on "Hotel Whatsit Shelbyville," I tend to get the boilerplate affiliate or booking-site pages.

In both cases, pages from TripAdvisor are often in the top 10. That isn't as bad as it sounds, because TripAdvisor pages often include user reviews and links to descriptions at Frommer's, Fodors, or other guidebooks.

On balance, I think the current SERPs make it easier to find hotel information if you're looking for a specific hotel. A general search on "Shelbyville hotels" continues to yield a mess of boilerplate sites, however. As a user, I'd be happier if such general searches yielded a list of individual hotel Web sites and reviews instead of booking sites and hotel directories.

We hear a lot of complaints on this board that Google is "trying to make commercial sites buy AdWords." From a user's perspective, that might be a good thing. Instead of getting search results that look like a jumbled-up mixture of guidebook listings and Yellow Pages ads (as they do now), I'd have the guidebook listings in the white "search results" area and the Yellow Pages ads (AdWords) neatly segregated into their own column alongside.

This would be better for me as a user, and it might also be better for e-commerce businesses in the long run, because it would train shoppers to use AdWords as a resource just as they now use Yellow Pages directories.


5:10 pm on Jan 31, 2004 (gmt 0)

10+ Year Member

I am not sure what you mean, on any level. As I am still seeing different results on google.it, etc., I wanted to know if people are seeing any new sets of SERPs. With regard to "how far in the future can you plan for", I would say it is important that people not knee-jerk react yet.


5:13 pm on Jan 31, 2004 (gmt 0)

10+ Year Member

europeforvisitors, I hope I am right in saying that every time you find a hotel name at the top, it will have an Open Directory listing (which is another point altogether - we think Open Directory has just become more important). But if the hotel does not have an Open Directory listing, I think you will find it very hard to find it.

I absolutely agree with you about TripAdvisor - it was one of the first two results under XXX Country House Hotel that were absolutely relevant. It is the stuff after that, which had no relevance, that bothers me, and much more importantly the fact that they bumped the hotel's own web site off the planet.


5:30 pm on Jan 31, 2004 (gmt 0)

Google has made some serious mistakes. There is no doubt about that. These gross mistakes have extinguished thoughts of an advancing IPO. Mistakes like these can and do result in the overnight and complete loss of market share.
Google is presently showing its calibre of leadership and where these decision makers will take us. For now, we can expect only further deterioration of search engine function.

With this kind of performance, an IPO is off the shelf for at least 3 years: the time they will need to rebuild and re-establish stability.



5:40 pm on Jan 31, 2004 (gmt 0)

10+ Year Member


The reason I say that is that, to some degree, there is such a thing as a Rolling Update that will always be shifting around. How far along they are towards a full-on Rolling Update, who knows. Note that this past "update" was more of an "upgrade" than the updates we have seen in the past. I believe the goal is to update PageRank on the fly, which says a lot about how much one could tell about the internet when able to basically calculate (or see) changes as they are happening.

As for how far we can plan for the future, that comment basically stems from this:

GoogleBots must be looked at as editors. That is one of Google's goals: to have millions of editors scanning the internet, editing away with the judgement a human would have.

So as far as planning for the future...plan on them attaining that goal and plan on your sites having a Google editor looking at your sites everyday, eventually.

Oh, and you will drive yourself crazy manually watching results as they are rolling. Some heavy-duty ranking reporting and site analysis in a controlled environment is valuable... but overall, either way, the BEST thing to do is to work on your website.



6:03 pm on Jan 31, 2004 (gmt 0)

10+ Year Member

"so it is over then--offically?"

As far as this update goes... "stick a fork in it".
Filters will probably be adjusted in the next few days, as happened post-Florida.

This 238-message thread spans 8 pages.
