It is really hard to understand this batch of algo changes by 'reverse engineering' (which is effectively what we collectively try to do on WW).
But it seems the new algo(s) have removed the boundaries around quality sites.
That is: what links to a page, the content of the page itself, and what that page links to, seem to have been merged into one continuous idea for evaluation by Google, which it then attempts to rank.
It doesn't appear to be working very well because simple lists of sites now often rank higher than the sites themselves. The algo has turned what was an index, into an index of indexes.
(In my particular case, which is illustrative only & admittedly statistically insignificant, I have seen my own CV, and a site that simply describes my site, ranking higher than the site itself.)
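The "merged into one continuous idea" effect described above is reminiscent of a HITS-style hub/authority computation, where a page's hub score and the authority scores of the pages it links to reinforce each other. A toy sketch on a tiny hypothetical link graph (this is an illustration of the general technique, not Google's actual algorithm; the page names are invented):

```python
# Toy HITS (hubs & authorities) power iteration on a hypothetical link graph.
# Illustrates how a pure link-list page earns the top hub score simply by
# linking out to every authority -- not Google's real algorithm.
import math

links = {                          # page -> pages it links to (hypothetical)
    "directory": ["site_a", "site_b", "site_c"],
    "site_a": ["site_b"],
    "site_b": [],
    "site_c": ["site_a"],
}
pages = list(links)
hub = {p: 1.0 for p in pages}
auth = {p: 1.0 for p in pages}

for _ in range(50):                # iterate until scores stabilise
    # authority: sum of hub scores of pages linking in
    auth = {p: sum(hub[q] for q in pages if p in links[q]) for p in pages}
    norm = math.sqrt(sum(v * v for v in auth.values()))
    auth = {p: v / norm for p, v in auth.items()}
    # hub: sum of authority scores of pages linked out to
    hub = {p: sum(auth[q] for q in links[p]) for p in pages}
    norm = math.sqrt(sum(v * v for v in hub.values()))
    hub = {p: v / norm for p, v in hub.items()}

print(max(hub, key=hub.get))       # -> directory
```

The pure link list wins the hub ranking without contributing any content of its own, which is exactly the "index of indexes" complaint.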
Perhaps it is time for Google to roll it back, take stock, and re-think.
People are now, understandably, even openly discussing creating directories themselves in order to rank well (in fact I had the same idea the other day) - but surely this isn't the solution. There's little point in a laudable Google mission statement unless all its employees know it, understand it, and follow it. This state of affairs can't be healthy for anyone - especially the WWW :)
Edit: Grammatical / clarity.
[edited by: Chelsea at 8:49 pm (utc) on Feb. 5, 2004]
|Perhaps it is time for Google to roll it back, take stock, and re-think |
they already did that once....you think they would do that again?
I'm not arguing :) but when did they roll back? I've seen some dodgy past algos repaired in a week or so, but this unfathomable situation has been going on for nearly 3 months now.
[edited by: Chelsea at 9:00 pm (utc) on Feb. 5, 2004]
I can't think of any other way around it.
It's clear that the problem lies in the core of the new algo. There is no way to fix it with a few minor patches.
The idea of giving so much power to certain authoritative (read: preferred) sites is in fact the main reason we are here now. All this flooding of the results with DMOZ and other directories is the most logical consequence of that idea.
This is a gas... The last few days, as I have stated, my site for my money phrase has ping ponged between #1 and #30. Now it seems everyone else is seeing sites popping in and out, yet for the past 36 hours, I have been rock solid at #16.
Very Strange. It seems I am either behind or ahead of everyone else.
[edited by: webdude at 9:27 pm (utc) on Feb. 5, 2004]
|Very Strange. It seems I am either behind or ahead of everyone else |
But don't forget that it was possible to partially understand this yo-yo effect when Google displayed its various datacentres - but they've been pulled.
It is a very strange situation, and if a member were to say 'I think Google is bust', IMO, this would be very hard to argue with now.
Although I recall that after Florida such ideas about 'Google being broken' were rejected outright and regarded as mere 'conspiracy theories' :)
(This was always a little unfair, since conspiracy theories are invariably too complicated to ring true. But something being 'bust' is extremely commonplace :) esp. in the UK ;)
Hey Brett, you better close this thread, it doesn't seem we're ready for it yet.
|it doesn't seem we're ready for it yet. |
What do you mean?
[edited by: Chelsea at 10:31 pm (utc) on Feb. 5, 2004]
Ready to accept the premise of the thread ;-)
it's only speculation / observation :)
|I'm not arguing :) but when did they roll-back? I've seen some dodgy past algo's repaired in a week or so, but this unfathomable situation has been going on for nearly 3 months now. |
Just after Dominic... or was it Florida? I forget now! But the first time, when it just looked like a keyword filter... that first time it was rolled back for a few weeks...
Google doesn't rank sites, it ranks pages. It's odd that a lot of people still don't get that.
If you still believe this Steveb, then you haven't understood the recent drastic changes in the algo.
Of course Google must consider pages; they're the basic unit of the WWW. But the way it relates these pages to each other now seems entirely different. In the past, the way that Google ranked pages gave some weight to the value of a site, and I guess some weight to internal linking. Not so now. Now it ranks collections of pages - whether or not they are engineered by huge, and very clever, linking campaigns.
Let's enter the real world, not the salad days of Google's past :)
[edited by: Chelsea at 11:10 pm (utc) on Feb. 5, 2004]
For those that are seeing some improvement, have you made any changes?
|It doesn't appear to be working very well because simple lists of sites now often rank higher than the sites themselves. The algo has turned what was an index, into an index of indexes. |
|Agreed. It looks like a problem with the use of "hubs". A hub is a great way of gathering pages for SERPs; however, it should never appear in the SERPs itself unless the user includes a search term such as "links" or "directory".
Unfortunately the problem with that last sentence is the phrase "such as" which is impossible to implement algorithmically. My (latest!) suspicion is that Google have therefore not bothered to do so. Hub pages are considered as relevant as content pages.
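Even a crude algorithmic stab at "hub-ness" is possible, though easy to game; here is a hypothetical heuristic (the threshold and function name are invented for illustration) that flags pages which are mostly links with little surrounding prose:

```python
# Hypothetical hub-page heuristic: flag a page as a "hub" when its outbound
# link count is high relative to its body text. A crude sketch only --
# real engines would use far richer signals, and this one is trivially gamed.

def looks_like_hub(outbound_links: int, words_of_body_text: int,
                   max_words_per_link: int = 25) -> bool:
    """Return True for pages that are mostly links with little prose."""
    if outbound_links == 0:
        return False
    return words_of_body_text / outbound_links < max_words_per_link

print(looks_like_hub(outbound_links=120, words_of_body_text=600))   # directory-style page -> True
print(looks_like_hub(outbound_links=5, words_of_body_text=1500))    # article-style page -> False
```

A demotion pass could then keep such pages out of the main SERPs unless the query itself asks for links or directories - which is exactly the "such as" judgment that is hard to capture exhaustively.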
Good search engines decrease the number of clicks to relevant content. Florida and Austin have increased the number of clicks.
|Agreed. It looks like a problem with the use of "hubs". |
Let's all 'hub out' then!
Deep Purple famously said in a live concert "let's have everything louder than everything else"
(which is of course absurd)
So let's have an Internet with "everything linking out to everything else"
(Which is equally absurd)
I hope Google know what they're doing - it looks like a total disaster in the making to me :(
And it will be really easy for a competing search engine to improve upon these results: they just need to dump the pages with huge numbers of outbound links (which Google seems to admire) and replace them with pages focused on a specific topic.
After all, these pages that Google is now serving up aren't *search results*, they are *search pages* - Who wants to search twice?
It increasingly looks like an abdication of responsibility, as well as being an irritation :)
[edited by: Chelsea at 11:58 pm (utc) on Feb. 5, 2004]
|Good search engines decrease the number of clicks to relevant content. Florida and Austin have increased the number of clicks. |
I used to (1) click on Google (2) enter my search and click (3) click on a target website.
Now I (1) click on Google (2) enter my search and click (3)(4)(5)(6) click through irrelevant results (7) click on vivisimo (8) enter my search term and click (9) click on a target website.
I really, really wish I could add ":-)" but I'm afraid it's true.
Hissingsid, maybe you're right about there being a bug in the SERPs, because I recently questioned Google about some discrepancies showing in AdWords, and they emailed me back saying that there is a bug in AdWords... so maybe it's in the SERPs too?
Chelsea, you are not understanding the relationship of pages and sites. You have to get yourself into the post-Austin world.
It is all about PAGES. Google is ranking PAGES from slate, or cnn, or other authority domains. Those PAGES are beating full-fledged domains worth of content, because domains of niche content matter less (very little).
The effect you are mistaken about, imo, is that those CNN pages are given high authority ranks because they reside on authoritative domains. The domain content means nothing. It is a serious mistake to think that. The page content -- or more accurately, the APPARENT-to-a-bot page content, is what matters.
People keep saying "my site this" or "my site that" while missing a fundamental of the post-Florida world: PAGES, even those with long URLs deep on large domains, are what is being algorithmically judged. Don't confuse the value of having links from CNN to a CNN news article on widgets with Google thinking CNN is all about widgets. Google is saying that it trusts CNN's judgement and that this widgets article is worth ranking well.
And, to long time readers of webmasterworld none of this should be a surprise.
GoogleGuy told us when Googlebot got better at indexing long URLs, and webmasterworld members noticed.
GoogleGuy encouraged people to focus on multiple keywords rather than putting all their eggs in one basket.
Multiple pages focused on multiple things/keywords on a large, stable, authoritative domain is the direction to go in.
One side note on this partly explains why directory pages are doing well... they are PAGES that have a high concentration of keyword content, that are likely titled well, that link to authoritative sites, and that are linked from their authoritative parent. These PAGES have no depth, and some folks think it is bad search engineering that they outrank domains full of content. Maybe, but the point is that it is a page being ranked, not the domain. In other words, one directory page with words and links on it is going to kick the butt of an index page of a large on-topic domain that is just a Flash graphic.
Don't want to give the wrong impression about the above though. I fully believe Google will be valuing niche authority much higher as its algorithm switch process unfolds.
Deep pages of large, niche sites will dominate those deep CNN-type pages as this process matures.
(Warning: speculation ahead.) It's possible that if your listing gets clicked more than the expected average for a given search term, it may get "some kind of mojo points." Google may measure this extra clicking as votes for your site and perhaps use this information in their algorithm.
"If nobody's clicking on the search results Google knows it's not delivering what they're looking for. What are people who are drilling down into page two or three clicking on? Google may float those up higher."
They can very easily identify the search results that fail to generate the click through rates that they should and begin adjusting their results that way.
Google used to have a feedback tab in their tool bar, and a link at the bottom of the results that asked "how are these results?" These methods of measuring the value of results required interaction from the searcher. Tracking clicks allows Google to do the work, and clicks tell a more complete story than direct user feedback.
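The click-tracking speculation above can be sketched as a simple re-rank: compare each result's observed click-through rate with the rate expected at its position, and float over-performers upward. Everything here - the expected-CTR curve, the numbers, and the boost mechanism - is invented for illustration:

```python
# Hypothetical CTR-based re-ranking sketch. A result whose observed
# click-through rate beats the rate expected at its position gets boosted.
# The position-bias curve and all figures below are made up.

expected_ctr = [0.30, 0.15, 0.10, 0.07, 0.05]   # assumed CTR for ranks 1-5

results = [                       # (url, clicks, impressions), in rank order
    ("a.example", 320, 1000),     # rank 1: roughly as expected
    ("b.example", 260, 1000),     # rank 2: clicked far more than expected
    ("c.example", 40, 1000),      # rank 3: under-performing
    ("d.example", 90, 1000),      # rank 4: over-performing
    ("e.example", 20, 1000),      # rank 5
]

def rerank(results, expected):
    """Order results by observed-CTR / expected-CTR (ratio > 1 = boost)."""
    scored = []
    for rank, (url, clicks, views) in enumerate(results):
        ratio = (clicks / views) / expected[rank]
        scored.append((ratio, url))
    return [url for ratio, url in sorted(scored, reverse=True)]

print(rerank(results, expected_ctr))   # b.example floats to the top
```

Whether Google actually did anything like this in 2004 is pure speculation, as the posts above say; the sketch just shows that the mechanism is cheap to implement once you log clicks.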
"Google is ranking PAGES from slate, or cnn, or other authority domains."
That is part of the stupidity of Google's new algo. I've seen many pages from these domains rank high just because they mention the words of the search phrase on the page. The page has nothing to do with the search phrase, yet that's what Google thinks is relevant.
I don't think it is stupidity so much as immaturity of the algo. Obviously Google can more easily recognize generic authority as opposed to niche authority. It needs to get better at the more difficult task.
Doesn't that bring us back to a point that has been made several times? If the algo isn't ready, maybe Google should have waited until it was ready - or at least more ready than it is now.
IMO Steveb is right on the money. We're in a process, present results have little bearing on eventual ones, and trying to decipher anything from them now will drive you mad, drive you to drink, depress you, or make you quit.
I don't know what to say about this, except I'm stunned.
Do these searches:
keyword +keyword (use the same word twice)
These put up completely different results from each other and from anything ever seen on this planet before.
steve - that's been the case since Austin went live. Very bizarre and lends great weight to the view that the algo is very much a work in progress. If anyone can come up with an explanation as to why these three should be different, I'd love to hear it...
"Multiple pages focused on multiple things/keywords on a large, stable, authoritative domain is the direction to go in."
That is where the site comes in. Unless you are DMOZ or some other large directory, your site has to focus on one (or a few) themes.
If you want to sell everything to everybody you won't sell anything to anybody...
OK, I do not believe that this is about pages any more. It is in fact about sites! But maybe we have to give that word a new meaning.
Until now, when I searched for a keyword, the first results pointed to pages dedicated to the keyword. Now they point to sites that merely mention the keyword.
When I search for a green widget the first results probably will contain :
1. Link to google directory
2. Link to DMOZ
3. Article on a newspaper or TV site saying somewhere in the text "She was wearing a green widget tonight"
4. Link to a page on a large online retailer (in most cases not exactly the page selling the green widget, but a page that links to it)
5. Link to a large forum, where there is a post by a member, nicknamed green widget
I think we should not mix up the meanings of the words 'site' and 'domain'. But anyway, the sites are the kings now.
The king (read: content) is dead! Long live the king (authority)!
The main problem with linking is this: until now, when you placed a link on page A to page B with the anchor text 'green widget', a search for green widget showed page B. Now it shows page A.
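That anchor-text shift can be sketched as a one-parameter change in where anchor terms get indexed: against the link target (the classic behaviour) or against the linking page itself (the behaviour being complained about). The page names and the `credit` switch are hypothetical, purely to illustrate the difference:

```python
# Hypothetical sketch of anchor-text attribution. Classically, anchor text
# is indexed against the link TARGET; the complaint in this thread is that
# it now appears to score the SOURCE (linking) page instead.
from collections import defaultdict

links = [("page_a", "page_b", "green widget")]  # (source, target, anchor text)

def build_index(links, credit="target"):
    """Map each anchor term to the set of pages it is credited to."""
    index = defaultdict(set)
    for source, target, anchor in links:
        page = target if credit == "target" else source
        for term in anchor.split():
            index[term].add(page)
    return index

print(build_index(links, credit="target")["green"])   # classic behaviour
print(build_index(links, credit="source")["green"])   # observed behaviour
```

Under the classic setting a search for "green widget" surfaces page B (the linked-to content); under the other setting it surfaces page A (the page carrying the link) - which is exactly why link lists would float up.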
[edited by: pavlin at 11:52 am (utc) on Feb. 6, 2004]