Forum Moderators: open
[edited by: idoc at 1:35 am (utc) on Jan. 28, 2004]
Maybe different IPs are getting fed different results (like is going on with Yahoo), but for sure it is pointless to talk about ".it" since people are seeing different things on that.
Going through the various DNS datacenters brings up lots of differences, and the degree of variance seems to differ from niche to niche. In my main niche all the new results are fairly similar. Some other niches that people have sent me look quite different.
The one client who is nowhere to be found in these new, work-in-progress SERPs is the client who has created THE authority site for their market. They have more information about the industry, the products, latest news, help documents, etc., than any of their main competitors. They sell no products through the site, but they do display what's available. The site has not been over-optimized. (It's barely been optimized at all by my current standards.)
But is it the quantity and quality of authoritative information on a site that confers authority status? How about the linkage situation for that site at this point in time?
From the Hilltop paper:
HITS produces two distinct but related types of pages in response to a query topic: hubs and authorities. Hubs and authorities exhibit a mutually reinforcing relationship: a good hub points to many good authorities; a good authority is pointed to by many good hubs (pages can be both good authorities and good hubs).
Hubs - links pointing out
Authorities - links pointing in
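To make that mutual reinforcement concrete, here is a minimal sketch of the classic HITS iteration on a made-up toy link graph. The page names and links are invented for illustration; this is Kleinberg's textbook formulation, not anything Google is confirmed to run.

```python
# Toy HITS iteration: hub score = sum of authority scores of pages it
# links OUT to; authority score = sum of hub scores of pages linking IN.
from math import sqrt

# Directed links: page -> pages it links out to (hypothetical graph)
links = {
    "hub1": ["auth1", "auth2", "auth3"],
    "hub2": ["auth1", "auth2"],
    "auth1": [],
    "auth2": ["auth1"],
    "auth3": [],
}

pages = list(links)
hub = {p: 1.0 for p in pages}
auth = {p: 1.0 for p in pages}

for _ in range(50):  # iterate until the scores stabilize
    # Authority: total hub weight of pages pointing IN
    auth = {p: sum(hub[q] for q in pages if p in links[q]) for p in pages}
    # Hub: total authority weight of pages pointed OUT to
    hub = {p: sum(auth[q] for q in links[p]) for p in pages}
    # Normalize so the scores don't grow without bound
    for scores in (auth, hub):
        norm = sqrt(sum(v * v for v in scores.values())) or 1.0
        for p in pages:
            scores[p] /= norm

best_auth = max(pages, key=auth.get)  # page most linked-to by good hubs
best_hub = max(pages, key=hub.get)    # page linking to the best authorities
```

In this toy graph "auth1" ends up with the top authority score because both hubs (and even another authority) point at it, which matches the intuition in the thread: lots of inbound links from respected hubs confers authority status.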
Kind of makes sense in light of what a member posted just the other day: comparatively irrelevant, totally unoptimized interior pages ranking for search terms on sites with what seemed to him exceptionally high PR on the root index page.
Following the logic, sites with an enormous number of inbound links to the root page, particularly from respected hubs, would have authority status according to the classic definition. Which, in a lot of the cases we're now seeing, means shopping sites, yes?
If a site is put up, thinking on a smaller scale than the biggies we're seeing, that's created as a quality hub for a particular niche, won't it follow that it should get a lot of inbound links and consequently gain some authority status?
How many of us have seen pages over time ranking for search terms based on what appears to be the anchor text of an outbound link on the page? It's always been that way to a degree for non-competitive terms, given a couple of supporting factors. Still is.
Even though it's impossible to glean any details until this is all over, aren't we seeing a general trend that seems to be increasing? Fact is, we're seeing a lot of authority sites where we hadn't in the past, regardless of how "good" that individual page in the SERPs is.
What makemetop says makes perfect sense from a theoretical standpoint.
My reaction to all this is that I've got no choice but to quietly accept how it is because I can't change it, and then take whatever steps are necessary to adapt to what's the reality of it after the dust settles.
A fuller description of what makemetop posted... and though this is not over, I agree the trend is there. Yes, the pages have shuffled for 4 days now, but pretty much we see the same end result. Google seems to like incoming links and anchor text... always has, yes, but now somehow it seems to count for more... even above content itself. Maybe because keyword content counts for less, maybe because related keywords paired with keywords count for more than keyword-keyword alone. Who knows or cares at this point till it's over.
I guess so far, what difference does it make what is causing the trend... the trend is there. If you can emulate an authority site - increase internal pages, internal links, links from external themed sites - then to Google you are an authority. The bots don't mind if the rest of the world thinks you aren't an authority. If the structure and PR are there, you have it.
[edited by: ciml at 8:22 am (utc) on Jan. 28, 2004]
[edit reason] Examplified domains. [/edit]
Meaning, just because you have the best, most accurate content doesn't mean you are an "authority site". Google's bots can't read a "how to bake bread" article and decide if it is accurate or nonsense. The bots need recognition of that site, links, from sites that are they themselves recognized as authorities.
Good content is king if it is recognized. If good content is essentially hidden under a basket, it can't be recognized as authoritative.
The page has the words in the keyphrase somewhere on the page - but not together. Blue is mentioned somewhere. Widgets is mentioned somewhere else. But the page is actually about Blue-blooded royalty and there is a mention at the bottom of the page that the site once used a widget to create the site.
This authority site SHOULD show up in a search about royalty, but NOT about widgets - yet it does. Sigh.
I can see pre-florida results with a little variation in results.
Yahoo (.com) is showing Google results, with the top results being the Yahoo Directory listings.
Netscape controls the very first result (maybe picked by its own research team as the most relevant), and takes all the others in a similar pattern from Google.
AOL Search (http://search.aol.com/aolcom/index.jsp) is showing exactly the same results as Google.
[edited by: guddu at 7:33 am (utc) on Jan. 28, 2004]
"Fact is, we're seeing a lot of authority sites where we hadn't in the past, regardless of how "good" that individual page in the SERPs is."
Excellent point. This makes me think that this update plus the last two (Florida plus whatever we called the December update) have now brought both hubs and authorities more prominently into Google's algo.
In this month's Planet Ocean newsletter (1/4/04), John Heard made the observation that Hubs are now a prominent factor in Google's algo post-Florida. They always have been to a degree in my opinion. But Florida seemed to turn that knob up.
Now some of you are observing that authorities are being given some prominence too. Makes sense to me. John Heard opined that the new Google was serving up a mixture of different types of sites in the top 10-20 results: some based on being a hub, some based on being semantically the best, etc. If this is their new strategy, it makes sense to layer in some authorities as well.
I think the difference is that the old pre-Florida Google SERPS tended to be dominated by sites that mastered a set of algorithm factors like anchor text, PR, and title tags.
Now there are multiple ways to get to the top of Google. Sort of like affirmative action quotas for web pages... "let's target a set percentage of each class of web pages to get ranked in the top 20". In the past, one class of algorithm masters always won. Now there are winners from multiple classes of web pages.
According to my understanding, regional websites are included (with a little preference over other results) in the regional Google results. The sorting of results differs a bit depending on the websites fetched.
If you watch closely for a single keyword, say "architects", on both google.it and google.com, you'll be able to see the difference. But this logic may not apply to 2-3 keyword searches, for example "commercial architects" or "commercial architecture design".
Also, the sorting depends on which websites are fetched for a particular keyword.
The answer to your question: MY WEBSITES ARE PERFORMING WELL IN ALL GOOGLE DATACENTERS for specific keywords even though there is some amount of fluctuation in results.
(sorry mods, i have to give a keyword to explain things)
SEO tricks will be rewarded and punished, carrot and hammer, from update to update. Each update brings punishment for another SEO trick.
-> Rewarded, because an SEO trick like anchor text is also a good (non-SEO) way to determine good results for a keyword ("good" from the searcher's perspective, and that's all that counts).
-> Punished, because you SEO guys have overdone it, to the extent that results have become increasingly rubbish.
So all the webmasters who don't know anything about SEO will be fine; their rewards won't be too exciting, but their punishments won't hurt too much either. (For me as a searcher that means: start looking on result page 2 for competitive searches. ;-)
All the SEOs who try to approach the red line without crossing it will end up crossing it anyway, and the only way to avoid that is to scale back, say, 50% on all applied tricks (thus acting more like a non-SEO webmaster).
So I must tune in with those who say: deoptimize your sites (or start optimizing, as Brett puts it), and start working on usability, authority, etc. What makes your visitors happy will make Google happy.
[edited by: muesli at 9:45 am (utc) on Jan. 28, 2004]
A site using the above has made it through Florida and Austin and remains at number one.
I just can't understand how Google specifies in its guidelines that doorways, hidden text, and hidden links must not be used, and yet it doesn't have the technology to get rid of them.
It looks to me that Google remains an easy target for spammers.
I see no improvement. :)
Edit: unnecessary flame
[edited by: zafile at 10:53 am (utc) on Jan. 28, 2004]
muesli, your theory is fine but... I can see a lot of totally IRrelevant results checking my pages' positions right now. A lot of simply silly pages at ##1-5 now.
That's what I'm trying to say with my theory: some tricks will work, some others will be punished. Google wants to educate SEO people to refrain from tricks altogether, but this will take a couple of updates (depending on how fast SEO people are willing to learn the lesson).
Let's put it this way: many of the irrelevant pages at the top now will be gone altogether after the next update, and some others will reappear. This is supposed to drive customers away from SEO firms who continue to work the old way.
If my theory is correct (many chances it isn't ;-), the SEO business is about to go through a giant shift - rather than Google losing reach!