I realised a while ago that I have subconsciously developed the habit (or have been conditioned into it!) of clicking on the 5th or 6th page or so of Google search links without even checking the first few pages! I then changed my Firefox home page (read: 'regular search engine') to a different engine, with pleasant results, though I still prefer Google for very specific search-word combinations and boolean searches.
Am I lagging behind the times, has the Google target audience changed, is Google perhaps not as useful as it once was, or can you think of another more valid reason for my experience of the Google search engine?
For most searches, it's just fine - but certainly some are so spam ridden that you need to dig deep, or vary your search term.
But that is true of all search engines, isn't it?
Or have you found one better than Google? Don't fall into the Google-bashers' trap of blaming Google for spam sites; that's just too cheesy ;)
Google is engaged in an arms race with, well, pretty much anyone who owns a website, but especially "black hat" SEO.
The original Google algorithm was spectacularly successful. This changed the landscape: Joe Public started using search for everything. That made it important to rank in search, especially Google, which led to SEO: search engine optimisation, i.e. the art of trying to get better rankings in search engines. Problem is, SEO pollutes the integrity of the results. Google needed to adjust their algorithm accordingly. SEO adjusted accordingly. And so on.
Google improves its algorithm over time. Today's Google algorithm is, without a doubt, more sophisticated than the algo of '98. But even with a better algorithm, it's harder to provide clean SERPs now, because of black, white & gray SEO. The severity of this problem varies considerably depending on the domain.
In my experience, long tail searches are still pretty clean, and can be surprisingly good. Most of my google searches are as long-tail as I can make them.
Big commercial terms are bringing back big commercial sites rather than information-based sites that review products and then lead the now-informed user on to the commercial sites.
The reason many of us are skipping past the first set of results is because they are composed of commercial entities and wikipedia. Spam is not a major problem in most sectors at the present time.
To put it another way - AdWords show you the commercial entities who feel they should be advertising on the results pages. Natural results show you exactly the same thing.
But now the web, just like the Malls and the High Street, is dominated by Big Name companies with big sites, big affiliate schemes - and big budgets.
And, just like the Malls and the High Street, that's a trend that is here to stay. Finding a quality independent site is going to get more difficult - and promoting one, even more so.
I'm not sure that Google (or any SE) can do anything about that - and I'm not sure they'd see it as something they should be doing something about.
The reason many of us are skipping past the first set of results is because they are composed of commercial entities and wikipedia.
Yup. The problem is, for most of us, when we do a search, at least 50% of the time we are not looking for commercial sites. We are looking for information - and not information related to any product. So, typically, the first few results are not relevant.
Google's emphasis on keywords, and lack of any ability to understand or communicate the real intent of the search is getting to be very limiting.
I have proposed that search engines need to move toward truly understanding human languages - not keyword mumbo-jumbo. As well, they need to be more configurable as to users' specific preferences (for example, global filtering: "never show me eBay results"). And they need to provide a way for users to indicate the class of result sites they want to see (non-commercial, government agency, educational institution, manufacturer, online store, etc.).
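For what it's worth, a crude version of that "global filtering" preference is already possible today with Google's real `-site:` exclusion operator - a preference layer could simply rewrite every query before sending it. A minimal sketch (the blocklist and the function name are made up for illustration):

```python
# Sketch of client-side "global filtering": rewrite each query to
# append Google's "-site:" exclusion operator for every domain the
# user has blacklisted. The blocklist itself is a hypothetical
# user preference, not a real Google feature.

BLOCKED_SITES = ["ebay.com", "pricegrabber.com"]  # user's "never show me" list

def apply_global_filter(query: str, blocked=BLOCKED_SITES) -> str:
    """Append a -site: exclusion for each blocked domain."""
    exclusions = " ".join(f"-site:{domain}" for domain in blocked)
    return f"{query} {exclusions}".strip()

print(apply_global_filter("digital camera reviews"))
# -> digital camera reviews -site:ebay.com -site:pricegrabber.com
```

Of course, stuffing exclusions into the query string is a workaround, not a preference system - which is rather the point being made above.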
I don't see any of this coming from Google, though. They're gonna ride that broken-down pony till it's lying, collapsed, in the middle of the stream, yelling "giddy-up" while the poor thing tries to take a drink...
I have proposed that search engines need to move toward truly understanding human languages - not keyword mumbo-jumbo.
There are horses for courses in the search engine landscape, and given responses here, I am determined to experiment more widely.
No doubt it would be nice if some information regarding the tone/intention of a search could be conveyed to a search engine at the time of a query; natural language comprehension would be ideal, but while it is not yet achievable, is it the only solution? Natural language comprehension is a great goal to work towards, however.
No doubt it would be nice if some information regarding the tone/intention of a search could be conveyed to a search engine at the time of a query; natural language comprehension would be ideal, but while it is not yet achievable, is it the only solution?
There's a home-page discussion right now about Google's "classifying" of web-site type.
What is ironic is that what they are doing with this information is exactly the opposite of what I think would be useful to do with it. They are using it to ensure the appearance of a number of types of websites on the first result page.
Perhaps this makes sense for short (such as one-word) searches. (Which appears to be how they are currently using this.) But, why not simply LET THE USER TELL GOOGLE what type of website they want!
Google doesn't like users to tell them anything, though. They just want them to type away like blind monkeys, and Google will try to figure out what they meant.
They could start by providing keywords for their current website categories that are accessed through separate search pages - e.g. blog, shopping, news, etc.
It seems pretty silly to me that you have to go to a special page to do, say, a news search, and can't combine categories. e.g. "show me blogs and news".
You just have to know the google tricks and ways in order to get the best out of it.
Oh, if somebody would just make a search engine which didn't have any commercial results in it at all... and then structure the results accordingly: say, five rows of types of results across the screen, with an interactive interface - "deeper" or "higher ratings" buttons - and not a list of 500,000 results of utter bull.
Got all the ideas here, but not the resources to make it happen. Back to saving up some investment money...
Sincerely and have fun,
Search engines don't have to provide total natural language understanding.
People don't treat Google like the lady at the library counter. They learn the system, how search engines work, what works, what doesn't, how to maximise their chances of success. They learn the tweaks and the hacks.
The options for "images" "news" etc are an example of what I'm talking about: they provide people with tools beyond just saying "tell me about x".
I can't recall Google making any significant improvement to their "search language" recently. It is essentially static. Pick up a second-edition "Google Hacks" (published 2005) and, save for code search, Google doesn't do anything significant today that it didn't then.
What they do have is pretty basic. They have a few simple operators, for "and" and "or", and a handful of more specialized operators that they call "query modifiers" and "alternate query types". These are the various operators ending in ":", such as "site:".
There are a number of additional search options, but there is no way to access them in the "search language". They require keyword parameters in the URL, and few if any users are going to do that.
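To illustrate what "keyword parameters in the URL" means in practice, here is a sketch of building such a URL by hand. The parameter names shown (`num` for results per page, `as_qdr=m` for "past month") are the ones that circulated in Google Hacks-era writeups; treat the exact names as assumptions:

```python
# Building a Google search URL by hand, because some options exist
# only as URL parameters, not as operators in the search box.
# "num" and "as_qdr" are parameter names reported in Google Hacks-era
# documentation; treat them as assumptions, not a guaranteed API.
from urllib.parse import urlencode

def google_url(query: str, **params) -> str:
    """Assemble a search URL from the query plus extra URL parameters."""
    qs = urlencode({"q": query, **params})
    return f"http://www.google.com/search?{qs}"

print(google_url("firefox extensions", num=50, as_qdr="m"))
# -> http://www.google.com/search?q=firefox+extensions&num=50&as_qdr=m
```

Which makes the point: a power user can do this, but nobody is going to hand-edit URLs at the library counter.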
Several of the options on the advanced search page have no equivalent in the search language! None of the subject-specific searches have any support in the search language. (There's no way to specify that you want an academic search, for example, let alone, God forbid, that you might just want photos that are in the news. Nope, pick one: you get photos, or you get news, and first you have to go to a different URL before you even do the search.)
Regular expressions are supported only for code search. I guess anybody who isn't a programmer is too stupid to use regular expressions. OK, regular expressions are obscure and techie - so, invent a better way of doing regular expressions more suitable for the unwashed public.
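For anyone wondering what regular expressions buy you over plain keywords: one pattern matches a whole family of strings, which is exactly what code search needs. A tiny illustration (the sample snippet is invented):

```python
# What regex gives you over keyword search: one pattern, many matches.
# Here, find every call whose name starts with "open" in a code snippet.
import re

source = "openFile(path); openSocket(host); close(fd); openPipe();"
# \bopen\w+ : a word beginning with "open" followed by more word characters
matches = re.findall(r"\bopen\w+", source)
print(matches)
# -> ['openFile', 'openSocket', 'openPipe']
```

A keyword search for "open" can't do that without drowning you in false hits - which is why it's a shame the feature is walled off in code search only.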