Forum Moderators: open
We are now back to page #2 or #3 on most two-word searches. We have reduced keyword density, but I think it is more a tweak that Google is doing.
Keep it coming, Google.
But it seems the new algo(s) have removed the boundaries around quality sites.
That is: what links to a page, the content of the page itself, and what that page links to, seem to have been merged into one continuous idea for evaluation by Google, which it then attempts to rank.
It doesn't appear to be working very well, because simple lists of sites now often rank higher than the sites themselves. The algo has turned what was an index into an index of indexes.
(In my particular case, which is illustrative only & admittedly statistically insignificant, I have seen my own CV, and a site that simply describes my site, ranking higher than the site itself.)
Perhaps it is time for Google to roll it back, take stock, and re-think.
People are now, understandably, even openly discussing creating directories themselves, in order to rank well (in fact I had the same idea the other day) - but surely this isn't the solution. There's little point in a laudable Google mission statement unless all its employees know and understand it, and it is followed. This state of affairs can't be healthy for anyone - especially the WWW :)
Edit: Grammatical / clarity.
[edited by: Chelsea at 8:49 pm (utc) on Feb. 5, 2004]
I'm not arguing :) but when did they roll back? I've seen some dodgy past algos repaired in a week or so, but this unfathomable situation has been going on for nearly three months now.
[edited by: Chelsea at 9:00 pm (utc) on Feb. 5, 2004]
Very Strange. It seems I am either behind or ahead of everyone else.
[edited by: webdude at 9:27 pm (utc) on Feb. 5, 2004]
Very Strange. It seems I am either behind or ahead of everyone else.
But don't forget that it was possible to partially understand this yo-yo effect when Google displayed its various datacentres - but they've been pulled.
It is a very strange situation, and if a member were to say "I think Google is bust", IMO, this would be very hard to argue with now.
Although I recall that after Florida such ideas about 'Google being broken' were rejected outright and regarded as mere 'conspiracy theories' :)
(This was always a little unfair, since conspiracy theories are invariably too complicated to ring true. But something being 'bust' is extremely commonplace :) esp. in the UK ;)
it doesn't seem we're ready for it yet.
What do you mean?
[edited by: Chelsea at 10:31 pm (utc) on Feb. 5, 2004]
it's only speculation / observation :)
I'm not arguing :) but when did they roll back? I've seen some dodgy past algos repaired in a week or so, but this unfathomable situation has been going on for nearly three months now.
Just after Dominic... or was it Florida? I forget now! But the first time, when it was just looking like a keyword filter... that first time it was rolled back for a few weeks.
Of course Google must consider pages; they're the basic unit of the WWW. But the way it relates these pages to each other now seems entirely different. In the past, the way that Google ranked pages gave some weight to the value of a site; I guess there was some weight to internal linking too. Not so now. Now it ranks collections of pages - whether they are engineered by huge and very clever linking campaigns or not.
Let's enter the real world, not the salad days of Google's past :)
[edited by: Chelsea at 11:10 pm (utc) on Feb. 5, 2004]
It doesn't appear to be working very well, because simple lists of sites now often rank higher than the sites themselves. The algo has turned what was an index into an index of indexes.
Agreed. It looks like a problem with the use of "hubs". A hub is a great way of gathering pages for the SERPs; however, it should never appear in the SERPs itself unless the user includes a search term such as "links" or "directory".
Unfortunately the problem with that last sentence is the phrase "such as" which is impossible to implement algorithmically. My (latest!) suspicion is that Google have therefore not bothered to do so. Hub pages are considered as relevant as content pages.
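The rule proposed above can at least be sketched in code. This is a minimal, purely illustrative toy, not anything Google actually does: the heuristic, the trigger words, and the thresholds are all assumptions made up for the example.

```python
# Toy sketch of the proposed rule: suppress "hub" pages from results
# unless the query itself signals hub intent. All names and thresholds
# here are illustrative assumptions, not real ranking code.

HUB_INTENT_WORDS = {"links", "directory", "resources"}

def looks_like_hub(page):
    """Crude hub heuristic: many outbound links, little body text."""
    return page["outbound_links"] > 50 and page["word_count"] < 500

def filter_results(results, query):
    """Drop hub pages unless the query asks for one."""
    if HUB_INTENT_WORDS & set(query.lower().split()):
        return results  # the user wants a hub; keep everything
    return [p for p in results if not looks_like_hub(p)]

results = [
    {"url": "widget-maker.example", "outbound_links": 12, "word_count": 1400},
    {"url": "widget-links.example", "outbound_links": 200, "word_count": 150},
]
print([p["url"] for p in filter_results(results, "blue widgets")])
# the hub page is dropped for a plain query
```

The hard part, as noted, is the "such as": the trigger-word set above is a fixed list, which is exactly the kind of open-ended judgement that is awkward to implement algorithmically.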
Agreed. It looks like a problem with the use of "hubs".
Let's all 'hub out' then!
Deep Purple famously said in a live concert "let's have everything louder than everything else"
(which is of course absurd)
So let's have an Internet with "everything linking out to everything else"
(Which is equally absurd)
I hope Google know what they're doing - it looks like a total disaster in the making to me :(
And it will be really easy for a competing search engine to improve upon these results: it just needs to dump the pages with huge numbers of outbound links (which Google seems to admire) and replace them with pages focused on a specific topic.
After all, these pages that Google is now serving up aren't *search results*, they are *search pages*. Who wants to search twice?
It increasingly looks like an abdication of responsibility, as well as being an irritation :)
[edited by: Chelsea at 11:58 pm (utc) on Feb. 5, 2004]
Good search engines decrease the number of clicks to relevant content. Florida and Austin have increased the number of clicks.
Absolutely.
I used to (1) click on Google (2) enter my search and click (3) click on a target website.
Now I (1) click on Google (2) enter my search and click (3)(4)(5)(6) click through irrelevant results (7) click on vivisimo (8) enter my search term and click (9) click on a target website.
I really, really wish I could add ":-)" but I'm afraid it's true.
It is all about PAGES. Google is ranking PAGES from Slate, or CNN, or other authority domains. Those PAGES are beating full domains' worth of content, because domains of niche content now matter very little.
Where you are mistaken, imo, is this: those CNN pages are given high authority ranks because they reside on authoritative domains. The domain's overall content means nothing -- it is a serious mistake to think it does. The page content, or more accurately the APPARENT-to-a-bot page content, is what matters.
People keep saying "my site this" or "my site that" while missing a fundamental of the post-Florida world: PAGES, even those with long URLs deep on large domains, are what is being algorithmically judged. Don't confuse the value of having links from CNN to a CNN news article on widgets with Google thinking CNN is all about widgets. Google is saying that it trusts CNN's judgement and that this widgets article is worth ranking well.
And, to long time readers of webmasterworld none of this should be a surprise.
GoogleGuy told us when Googlebot got better at indexing long URLs, and webmasterworld members noticed.
GoogleGuy encouraged people to focus on multiple keywords rather than putting all their eggs in one basket.
Multiple pages focused on multiple things/keywords on a large, stable, authoritative domain is the direction to go in.
One side note on this partly explains why directory pages are doing well... they are PAGES that have a high concentration of keyword content, are likely titled well, link to authoritative sites, and are linked from their authoritative parents. These PAGES have no depth, and some folks think it is bad search engineering that they outrank domains full of content. Maybe, but the point is that it is a page being ranked, not the domain. In other words, one directory page with words and links on it is going to kick the butt of an index page of a large on-topic domain that is just a Flash graphic.
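The page-level factors listed above (keyword concentration, a good title, links out to authorities, a link in from an authoritative parent) can be mocked up as a toy scoring function. The weights, feature names, and the density cap are all invented for illustration; nobody outside Google knows the real factors.

```python
# Toy page-level score built only from the factors the post lists.
# All weights and feature names are assumptions for the example.

def page_score(page):
    score = 0.0
    if page["keyword_in_title"]:
        score += 2.0
    # cap the density credit so stuffing past 5% earns nothing extra
    score += min(page["keyword_density"], 0.05) * 40
    if page["links_to_authorities"]:
        score += 1.0
    if page["linked_from_authoritative_parent"]:
        score += 1.5
    return score

directory_page = {
    "keyword_in_title": True, "keyword_density": 0.08,
    "links_to_authorities": True, "linked_from_authoritative_parent": True,
}
flash_index_page = {  # big on-topic domain, but the index page is one graphic
    "keyword_in_title": True, "keyword_density": 0.0,
    "links_to_authorities": False, "linked_from_authoritative_parent": True,
}
print(page_score(directory_page) > page_score(flash_index_page))  # True
```

Under this toy model, a shallow but keyword-dense, well-linked directory page beats the Flash-only index page of a content-rich domain, which is exactly the outcome described above.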
"If nobody's clicking on the search results Google knows it's not delivering what they're looking for. What are people who are drilling down into page two or three clicking on? Google may float those up higher."
They can very easily identify the search results that fail to generate the click through rates that they should and begin adjusting their results that way.
Google used to have a feedback tab in their tool bar, and a link at the bottom of the results that asked "how are these results?" These methods of measuring the value of results required interaction from the searcher. Tracking clicks allows Google to do the work, and clicks tell a more complete story than direct user feedback.
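The click-tracking idea above can be sketched as a simple reranker: a result that gets fewer clicks than its position would normally earn is demoted, and an over-performer on page two floats up. The expected-CTR table and the data are made up for the example; this is speculation about what such a system might look like, not a claim about Google's implementation.

```python
# Toy click-through reranking, as speculated above: re-sort results by
# how much each one over- or under-performs the CTR expected for its
# position. Expected-CTR values here are illustrative assumptions.

EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10}

def adjust(results):
    """results: list of (url, position, observed_ctr) tuples.
    Returns them re-sorted by observed/expected click performance."""
    def performance(r):
        url, pos, ctr = r
        return ctr / EXPECTED_CTR.get(pos, 0.05)
    return sorted(results, key=performance, reverse=True)

serp = [
    ("a.example", 1, 0.05),  # top slot, but hardly anyone clicks it
    ("b.example", 2, 0.20),  # out-performing its slot
    ("c.example", 3, 0.09),
]
print([url for url, _, _ in adjust(serp)])
# ['b.example', 'c.example', 'a.example']
```

The appeal is exactly what the post says: the searcher does the rating implicitly, with no feedback tab required.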
That is part of the stupidity of Google's new algo. I've seen many pages from these domains rank high just because they mention the words of the search phrase on the page. The page has nothing to do with the search phrase, yet that's what Google thinks is relevant.
That is where the site comes in. Unless you are dmoz or some kind of large directory, your site has to focus on one (or a few) themes.
If you want to sell everything to everybody you won't sell anything to anybody...