|Are search engines getting too complicated?|
| 3:36 pm on Sep 28, 2010 (gmt 0)|
does anyone else think that the time has come for search engines to roll back their algos to how they were a few years ago?
they check about a billion things now: links, backlinks, load times, repeat visits, bounce-outs, blah blah. but for a lot of queries, that is all just noise. all we really want to know is whether the content we want is written on the page.
imagine this... if i want to get a bus somewhere and ask a policeman which bus goes there, i want him to say 65. because that's the answer. which bus goes there? easy, he says... it's the number 65. job done.
but if i ask a search engine the same question, he'd probably be checking the number of passengers on each bus, how many stops it makes, what the fare is, how many people just go one stop, how many times the bell rings, how many breakdowns it has, how many seats are free... all noise. none of which helps me get the answer.
to test it out, i searched for "bus from (city) to (city)"
that is an easy question with a specific answer.
if the search engine restricted itself to just finding those exact words on the page, then presumably it would end up with a list of pages with the answer on them. but because of all the noise it factors in, i end up with a page containing train times, city tours, sightseeing bus companies and links to the latest traffic and travel news. plus a load of suggestions for other destinations i might want to search for instead.
it's like they've taken the word "bus" and chucked in everything that relates to it. the only thing they've missed out is a list of rhyming words and photos (...oh, wait a minute, they have actually given me some photos of buses too. but not the ones that go to the city.)
this is my point:
i reckon if i tried this query years ago, when the actual words contained on a page were rated a lot higher in the algo than they are now, then the serps would have been dominated by pages containing the answer. but now the majority seems to consist of bus-related trusted pages.
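the contrast the poster is describing can be sketched in a few lines of python. this is a purely hypothetical toy, not how any real engine works: the page data, the "trust" number and both scorers are made up for illustration. an exact-phrase matcher returns only pages that literally contain the query, while a scorer that blends in an off-page signal can rank a "trusted" non-answer page first.

```python
# toy sketch (hypothetical): exact-phrase matching vs. multi-signal ranking
pages = [
    {"url": "bus-times", "text": "bus from springfield to shelbyville leaves hourly", "trust": 0.2},
    {"url": "city-tours", "text": "sightseeing tours and travel news for springfield", "trust": 0.9},
]

def exact_match(query, pages):
    # keep only pages that literally contain the query phrase
    return [p["url"] for p in pages if query in p["text"]]

def multi_signal(query, pages):
    # blend word overlap with a made-up "trust" signal
    def score(p):
        overlap = sum(w in p["text"] for w in query.split())
        return overlap + 5 * p["trust"]  # trust can outweigh the actual words
    return [p["url"] for p in sorted(pages, key=score, reverse=True)]

query = "bus from springfield to shelbyville"
print(exact_match(query, pages))   # only the page with the answer
print(multi_signal(query, pages))  # the trusted tour page ranks first
```

with these invented weights, the words-only matcher returns just the timetable page, while the blended scorer puts the sightseeing page on top. that is the "noise" effect, exaggerated for the demo.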
| 7:04 am on Oct 13, 2010 (gmt 0)|
There are very valid reasons for all that "noise" used by SE algos. The problem is, people will always find ways to beat them and get their sites listed on top for their target keywords.
A bus seller's site might still get listed first when you search for "bus from (city) to (city)", even if all the search engines reverted to their old algos, simply because companies with the resources will market, link, promote and find ways to rank.
"if the search engine restricted itself to just finding those exact words on the page then presumably it would end up with a list of pages with the answer on."
How I wish things were that simple, but people would just stuff random phrases and crap onto their sites to rank for different words. Without that "noise" there would be nothing to filter out the crap.
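the gaming problem is easy to demonstrate with a toy words-only scorer (hypothetical names and text, not any real algorithm): if relevance is just a count of the query phrase on the page, a spam page that repeats the phrase fifty times out-scores the honest page that actually has the answer.

```python
# toy sketch (hypothetical): keyword stuffing beats a words-only scorer
def word_count_score(query, text):
    # naive on-page relevance: count occurrences of the query phrase
    return text.count(query)

honest = "the bus from springfield to shelbyville is the number 65"
stuffed = "bus from springfield to shelbyville " * 50  # spam page repeats the phrase

query = "bus from springfield to shelbyville"
print(word_count_score(query, honest))   # 1
print(word_count_score(query, stuffed))  # 50 -- the stuffed page wins
```

this is the argument for the extra signals: with nothing but on-page words to go on, there is no way to tell the stuffed page from the useful one.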
I'm not for or against the current algos. Google, Yahoo or any other search engine will never be perfect, no matter how much money or resources they have.
Just my two cents..
| 7:19 am on Oct 13, 2010 (gmt 0)|
I'm all for having access to a search engine that works entirely as a tool in the searcher's hands and relies on the searcher's own expertise.
It would still have gatekeeping for safety and security, but no artificial intelligence in the algorithm.
Help users get smarter, not search engines.