This 40 message thread spans 2 pages.
Is Google Classifying 'Types' of Websites and Search Terms?
Deeper Implications of Universal Search
| 6:34 am on Sep 9, 2007 (gmt 0)|
In our ongoing September SERP Changes [webmasterworld.com] thread, gehrlekrona made an observation that I feel is worth some extra attention.
|GOGG seems to be trying to do some "web site typing", i.e. trying to find a way to clump sites together in a taxonomy. Maybe they ARE planning to have their search categorized |
I think Google has been classifying types of sites for a while, and now it's getting more granular and sophisticated. This kind of sorting and classifying showed up a while back for some of the highly competitive searches. One day we woke up and WHAM!, the whole first page looked profoundly different, with very different types of sites (often informational rather than commercial) being featured. And the previously dominant domains were pushed down or off page one.
Search terms themselves can also be sorted into various taxonomies, especially the 1-word and 2-word queries. Search term taxonomies could be built through "user intention" studies, which are especially challenging for those short queries. Many members here have mentioned the recent earthquake we've seen in the results for many 1-word searches.
With the advent of Universal Search, Google now has the infrastructure to force integrate selections from any class of websites onto the first page. So the implications of Universal Search can go well past the obvious and publicised taxonomies of images, video, news, books, maps, blogs. Even more than a simple "commercial" and "informational" taxonomy, there could also be classes like brochureware sites, trademark holders, businesses with a physical world presence, manufacturers, B2B, multi-topic (encyclopedic) and on and on. One factor Google could then tweak would be which classes of sites to force integrate into the results for which kinds of search terms.
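As a toy sketch of that conjecture, here is roughly what "force integrating" a channel per query class could look like. Everything here is invented for illustration - the class names, the trigger table, and the slot logic are assumptions, not anything Google has published:

```python
# Hypothetical sketch of force-integrating "channels" (classes of sites)
# into page one based on a taxonomy of query classes. All names and
# rules are invented for illustration - not Google's actual algorithm.

# Which site/channel classes to force-integrate for each query class.
CHANNELS_BY_QUERY_CLASS = {
    "product": ["manufacturer", "review", "e-commerce"],
    "news_event": ["news", "blog"],
    "ambiguous_1word": ["encyclopedic", "news", "commercial"],
}

def blend_page_one(query_class, organic, channel_results, slots=10):
    """Reserve a slot for each forced channel, then fill the rest
    of page one with ordinary organic results."""
    page = []
    for channel in CHANNELS_BY_QUERY_CLASS.get(query_class, []):
        hits = channel_results.get(channel, [])
        if hits:
            page.append(hits[0])  # force-integrate the channel's top hit
    # Fill the remaining slots with organic results not already shown.
    for result in organic:
        if len(page) >= slots:
            break
        if result not in page:
            page.append(result)
    return page[:slots]
```

The point of the sketch is the tweakable knob described above: changing one entry in the trigger table changes which classes of sites crowd onto page one for a whole class of queries, pushing previously dominant organic results down or off the page.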
This forced crazy-quilting of Page 1 might well push more users to Page 2 - and that may even be seen as a potentially positive evolution at Google. It does give more ad impressions, for instance. I'm not saying this is the only motivation, not by a long shot. But it would be one kind of positive outcome as long as there is no negative impact on market share. And the average search user might well appreciate the variety of choices. Time will tell.
| 11:48 am on Sep 9, 2007 (gmt 0)|
|This forced crazy-quilting of Page 1 might well push more users to Page 2 - and that may even be seen as a potentially positive evolution at Google. |
Surely though there's a bit of a Catch-22 there, in that when users start to realise it takes longer to find what they want, they will shift to other search engines over time?
| 12:43 pm on Sep 9, 2007 (gmt 0)|
I can't imagine a day when "the average search user might well appreciate the variety of choices" if they are digging to page 2, 3 and beyond to really find what they are looking for. They will move on and give another search engine a try. I know I would.
| 2:56 pm on Sep 9, 2007 (gmt 0)|
We do that already since it seems impossible to find what you want on the first page anymore.
Maybe in the near future we'll have new meta tags where we decide where we want to belong in the search. You'll have your title, the description and a top level like widgets, and then sub-categories like classifieds, books, articles and so on: <META NAME="Taxonomy" CONTENT="Widgets, Classifieds">
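Spelled out, such a tag might sit in the page head like this. To be clear, this is purely hypothetical markup - the tag name and values are invented for illustration, and no search engine actually supported a taxonomy tag:

```html
<!-- Hypothetical markup only: no engine supported a "taxonomy" meta tag -->
<head>
    <title>Acme Widgets - Classified Listings</title>
    <meta name="description" content="Buy and sell widgets in our classifieds.">
    <meta name="taxonomy" content="Widgets, Classifieds">
</head>
```

Of course, like the old keywords meta tag, anything this easy to self-declare would be gamed immediately - which is probably why classification would have to be inferred by the engine rather than declared by the site.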
It seems that they are trying to do this already because my site has been showing up with "typed" sites lately.
| 4:47 pm on Sep 9, 2007 (gmt 0)|
|I can't imagine a day when "the average search user might well appreciate the variety of choices" if they are digging to page 2, 3 and beyond to really find what they are looking for. |
Does the average user have to dig to page 2, 3, or beyond to find what he or she is looking for? As someone who uses Google Web Search every day, I seldom find that to be the case--whether I'm looking for tourist information, reviews of digital cameras, help with software error messages, or advice on Draft Q cordless widgets. In the overall scheme of things, Google Web Search works about as well as it ever did--or possibly better in some cases. (A few years ago, it was almost impossible to find a hotel's Web site amid a flood of boilerplate affiliate pages, for example; nowadays it's likely to be in the top few results, behind the "user reviews" at TripAdvisor or VirtualTourist.)
As for the "crazy quilt" effect, there's always been a "crazy quilt" of results in Google Web Search, simply because (like most search engines) it uses an all-purpose, one-size-fits-all index. Historically, a search on "widgetco wc-1 digital camera" might have yielded results for:
- The Widgetco, Inc. product page for the WC-1;
- Reviews of the WC-1 at several major review sites;
- A press release or trade-magazine article about the WC-1;
- A forum thread about the WC-1.
At least any Google News or Google Images results that may be displayed will stand out from the rest of the page, making news or image results easier to skim over if the reader is just looking for information to help in making a purchase.
If Google is moving to classify "types" of Web sites and search terms, that may be a good thing, especially if that encourages users to learn better search techniques. Let's say that Joe User wants to buy the Widgetco WC-1 camera that I mentioned above. Once he learns that adding the word "buy" or "dealers" to a keyphrase will deliver e-commerce results, he'll be less likely to just type in "widgetco wc-1 camera" or "chronotronique watch" or "inhalex vacuum cleaner" and take potluck.
| 5:29 pm on Sep 9, 2007 (gmt 0)|
|Search terms themselves can also be sorted into various taxonomies, especially the 1-word and 2-word queries. Search term taxonomies could be built through "user intention" studies, which are especially challenging for those short queries. Many members here have mentioned the recent earthquake we've seen in the results for many 1-word searches. |
The major churn I've been seeing for single-word searches over the past several days seems to have reverted to "normal" for the moment.
One of the words I was monitoring applies to several different market areas. In the recent churn, I noticed several market areas I don't normally see rising up to page one. I'm guessing that Google is playing with various ways of sorting these out.
Right now, with the old results back, I'm seeing page one dominated by results from one market, with the right column dominated by ads from another market. Eventually, I suspect, there will be divisions for these within the organic results.
| 5:42 pm on Sep 9, 2007 (gmt 0)|
|In the recent churn, I noticed several market areas I don't normally see rising up to page one. I'm guessing that Google is playing with various ways of sorting these out. |
That's exactly the kind of sign I've been seeing. Seems like there must be some kind of taxonomy of query terms and websites in play - it's too strong an effect to be a coincidence.
|page one dominated by results from one market, with the right column dominated by ads from another market. |
The presence of various markets in AdWords could be an indicator of discrete markets on the same 1-word search. Maybe Google's "Chinese wall" between AdWords and organic search is being dropped just a bit?
| 6:36 pm on Sep 9, 2007 (gmt 0)|
|The presence of various markets in AdWords could be an indicator of discrete markets on the same 1-word search. Maybe Google's "Chinese wall" between AdWords and organic search is being dropped just a bit? |
When I target a site, that's one of the things I look for... whether the ads suggest either multiple meanings in search terms or multiple audiences for the same term, either of which might make ranking for some queries especially difficult.
In one manufacturing area, e.g., I ran across a large market in collectibles for the same item. To some extent, this double meaning influences searches themselves, as searchers try to refine initial searches to ferret out the results they want.
Universal Search might cut both ways, making top spots for some channels easier to get, but reducing the number of top spots in others.
|If Google is moving to classify "types" of Web sites and search terms, that may be a good thing, especially if that encourages users to learn better search techniques. |
Something for Google to keep in mind. This is a fine point... but, by handing lazy searchers a smattering of specific top results, broken down by channels, Google might also be removing the motivation for these users, at least, to learn better searching techniques.
| 7:12 pm on Sep 9, 2007 (gmt 0)|
|but, by handing lazy searchers a smattering of specific top results, broken down by channels, Google might also be removing the motivation for these users, at least, to learn better searching techniques. |
Or it might be teaching them. How many Google Web Search users are aware that they might find what they're looking for more easily with Google News, Google Images, Google Products, Google Patents, etc.? Giving them a taste of results from other channels may help to broaden their search habits.
In any case, I think we're talking about two different topics in this thread:
1) Google Universal Search, which is the aggregation of search results from different Google search tools on a single SERP;
2) Whether Google is classifying types of sites or pages within its search index and Web Search algorithm (e.g., as editorial or reference pages, e-commerce pages, affiliate pages, forum pages, etc.).
The first topic has been discussed here before (at length); the second topic has been discussed in a more speculative "what if" or "maybe someday" context, and the premise of this thread seems to be that "someday" may be arriving at long last.
| 7:33 pm on Sep 9, 2007 (gmt 0)|
|"someday" may be arriving at long last. |
Yes. From what we're seeing on quite a few 1-word searches, I do think it is arriving. Related thread: Major Shift in One Word Search Results [webmasterworld.com]
My purpose for even mentioning Universal Search in responding to gehrlekrona's observation on website taxonomies was that the infrastructure Google needed to develop for it could have widely extended uses. The challenges that Google was facing included:
1. Developing comparable relevance ratings for different channels
2. Programming the ability to insert any given channel onto page 1
3. Deciding on the query term triggers that insert a given channel into the SERP
My conjecture is that this infrastructure can be applied to any segment of the total index that Google wants to create, and not just the previously stand-alone search channels.
| 8:09 pm on Sep 9, 2007 (gmt 0)|
|My conjecture is that this infrastructure can be applied to any segment of the total index that Google wants to create, and not just the previously stand-alone search channels. |
That's an interesting thought (and one worth highlighting).
| 8:18 pm on Sep 9, 2007 (gmt 0)|
From what I have noticed lately, users are learning how to search. Maybe Ask has taught them to use more specific searches.
One noticeable change, and this might be where Google started, is differentiating between "to sell" and "to buy".
From there it is pretty easy to see which sites offer anything to sell, and you can easily weed out the ones that don't sell anything. If you are using "for sale" on your site, then you probably have something people can buy.
If you have something to sell, then you might want classifieds so you can post your items, so then Google will show you all their classifieds if you don't specify your item.
These are just loose ends for now, but I see in my stats that people try to be more specific about what they want. I don't see a lot of one-word searches, but that might just be because I am not high enough in the results, which is OK for me. I am not sure that people use one-word searches much anymore. They learn pretty fast that they need to type in more to find what they want, and lately I have had to type in long sentences and key phrases to find what I needed.
Also what I see from looking at the SERP's is that my site has shifted focus and that it is now more closely related to the same type of web sites. It might just be a coincidence, but it doesn't seem like it.
| 8:57 pm on Sep 9, 2007 (gmt 0)|
|These are just loose ends for now, but I see in my stats that people try to be more specific about what they want. I don't see a lot of one-word searches, but that might just be because I am not high enough in the results, which is OK for me. I am not sure that people use one-word searches much anymore. |
My top search phrase is a single keyword that relates to a major topic of my site. However, it represented only 0.4% of my August search referrals from Google. My August server logs show 796 search phrases that contained that keyword and probably several hundred more that didn't contain the specific keyword but were about the topic.
I do rank high for that single keyword (#3 out of 41 million), so my experience tends to support your observation.
Given the sheer numbers of pages for any major search term, it would be understandable if users were getting savvy enough to search on "widgetville hotel rooms" or "widgetville tourist information" instead of just "widgetville," or on "widgetco wc-1 camera reviews" instead of just "widgetco" or "wc-1." And if multiple-word searches really are a trend, they should make it easier for Google to provide search results that better fit the user's needs--in some cases with a mixture of "Google Universal Search" results, and in some cases with traditional, plain-vanilla Web search results alone.
| 9:27 pm on Sep 9, 2007 (gmt 0)|
What I believe is that Google is logically trying to manage the web's explosion in page numbers by making results more granular, giving more weight to subject orientation and 'freshness' of content rather than weight and vintage.
The outcome has been positive on some aspects but -as we all know- has had some quite bad collateral damage.
I believe we all realize it's a difficult balance that G has to achieve. What I find annoying is the lack of communication and the ever-growing arrogance of these guys. You may find interesting the latest article on Google published by The Economist 09/06/2007 about Googlers and Xooglers (ex Google employees) and how the company is managed.
| 9:50 pm on Sep 9, 2007 (gmt 0)|
I agree that it is annoying to say the least that there is no communication whatsoever. It is like a Secret Society where there is no insight at all but to a few people. I am pretty sure there are web site owners that have a direct line to Google, web sites that have special AdSense code without the Google Logo and web sites with a lot of lawyers that can push the right buttons if a site disappears overnight. Remember Amazon.com? It disappeared overnight but was back the same day. Who can do that?
With all the data Google collects through their Urchin, or whatever it's called now, they have all the necessary tools they need to figure out a way to type sites - and if they don't do it, then someone else will, and they will get a lot of users because it'll be more targeted and easier to find things when searching. It's just a matter of time.
Like I said, people are getting smarter because they have been forced to. One-word keywords are a thing of the past, and I think that is why Google might have a hard time right now showing the "right results". Someone might be looking for a widget and Google brings up 48 million pages. They don't see what they were looking for, so they type in "widgets for sale" and don't find anything close to where they live either, so they type in "widgets for sale in Chicago" and bingo... there's a site that sells widgets in my neighborhood.
So what is Google going to do with all their 48 million pages of "widget junk"? They need to classify them, type them!
| 10:23 pm on Sep 9, 2007 (gmt 0)|
One thing worth pondering might be what can you do to help your site get found, practically, in this kind of search environment. I've got a few ideas, and they are probably valuable even if these kinds of taxonomies aren't affecting you right now. Here's what I'm up to, in practical terms:
1. Watch my server logs and know what searches really matter for the business's bottom line.
2. Be aware of any semantic confusion around a site's major search terms, and do my best to clarify which kind of "widgets" the site offers. For a hypothetical example, if I'm working with a site that sells glass windows, I'd be very judicious with mentions of anything related to Microsoft, software, programs, etc. Even "this site best viewed with Internet Explorer" might create chaos.
3. Regularly consider whether chasing 1-word searches - especially those "big trophy words" - is worth the effort in any particular market. In some markets, certain single words still have a big payoff, but that's always up for another look at any time.
4. If website taxonomies kick in with great strength on a single word search, then I might be chasing one of only two or three spots available on the Home Page to my website's classification, instead of one out of 10. In terms of resources and energy used, this can be like paying for a full page ad in the NYTimes to publicize the local peach festival.
5. Keep the major Universal Search channels in mind, and as is practical, create content that has a chance of showing up in one of those dedicated spots - video, news, blogs and so on.
| 12:01 am on Sep 10, 2007 (gmt 0)|
That was a good summary, tedster!
1. I check my stats a LOT every day and can spot within the hour what is happening to SE's
2. I watch my logs to check and see what people have been searching for to get to my site. I can see if it tips over one way or the other and then I try to correct it the way I want it to go.
3. 1-word keywords are just a waste of time, if you ask me. No traffic from that. It's only good for your self-esteem :)
4. I think the only SEO we need in the future is to have our taxonomies straightened out, if we don't have that already. Maybe we should think about setting our site navigation to taxonomy levels, i.e. keeping our sites structured that way?
5. I know it's a good thing, experienced first hand, to be precise and not wander off into different fields not really related. Google gets confused and you lose rankings if your additions are not within a certain area/taxonomy. I usually think of Google as made up of shelves, and our sites are just a lot of documents stored on these shelves. Their algo looks at the documents and shuffles them around, up or down, depending on document changes or how a document is related to something else on their shelves. Before, it didn't matter how documents were related to each other, but a while back they needed to be related within a taxonomy to be counted.
| 1:12 am on Sep 10, 2007 (gmt 0)|
The initial introduction of universal search saw a major weakening in both relevance and niche, and, for lack of a better way to put it, less dependence on anchor text.
One single use of a word on a page, like "Bush's gamble in Iraq..." could get an entirely unrelated article listed on page 1 for a very competitive keyword.
The recent evolution is just going further down that path. Niche and topical authority mean far less, while the incidence of words on a page matters more (for one-word searches only).
People have often complained about Google's anchor text reliance, wanting more "on page" reliance. Well, here we see the foolishness of that idea. Lower-importance, less truly relevant pages replace objectively better pages from the same domain; pages with a high incidence of words (like the Bush gamble thing) rank well for terms that have a very different meaning than the isolated one word would suggest (in this case, both "bush" and "gamble" are terms where anchor text and theme/context matter, since the words have multiple uses and meanings).
Google has made it so that, at least for some types of searches (notably one-word ones), every website on earth can rank for any term. Obviously there are still other algorithmic elements too, so you can't just put "Iraq" on a needlepoint page a dozen times and expect it to rank in the top ten for an Iraq search, but it is now far more likely that things like that will happen.
I suspect Google has just gotten so bad at finding niches and niche linking that they took one further major step away from even trying to identify them.
It's kinda scary to think of all the things a page with this text would rank for:
"Bush burning over Iraq gamble"
| 3:24 am on Sep 10, 2007 (gmt 0)|
On the flip side of avoiding semantic confusion, it seems to make sense to make sure your site fits solidly into its definition. Wikipedia's definition is now the first Google result on a word I watch. Reading that definition, the two-word phrase my site is more specifically "about" is not covered there. I wonder if it would make sense to add to the Wikipedia definition *or* make sense *not to* - since as long as it isn't covered in Wikipedia, maybe my site avoids being lumped into the group (and whatever that now means). And, in viewing the terms emphasized and seeming (per Wikipedia) to be meaningful synonyms of this word, do you make sure you also sprinkle the synonyms throughout your site? If Google deems Wikipedia to be the authority, then it would follow that Google prefers to see other sites with a similar pattern of terminology usage.
| 3:28 am on Sep 10, 2007 (gmt 0)|
|do you make sure you also sprinkle the synonyms throughout your site? |
I'd say no - just make sure you use some of them on some key pages, and make sure you don't keep your copy on too tight a leash. Overdoing synonyms and semantically related keywords can make your term co-occurrence factor way too high, and that can trigger a spam detection filter. Those mixmaster, autogenerated scraper pages can end up with just this kind of footprint.
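A rough way to sanity-check that kind of footprint on your own copy might be a simple density count - a toy stand-in for whatever co-occurrence measure a spam filter might use, with an arbitrary threshold, not any engine's actual test:

```python
import re
from collections import Counter

def synonym_density(text, synonyms):
    """Fraction of all words in the text that are the target term
    or one of its synonyms - a crude proxy for the 'term co-occurrence
    factor' discussed above."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    counts = Counter(words)
    hits = sum(counts[s.lower()] for s in synonyms)
    return hits / len(words)

def looks_stuffed(text, synonyms, threshold=0.2):
    # The 20% threshold is an arbitrary illustration, not a known limit.
    return synonym_density(text, synonyms) > threshold
```

Natural copy uses a target term and its synonyms sparingly; autogenerated scraper pages tend to pack them in, which is exactly what a density check like this would flag.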
| 4:00 am on Sep 10, 2007 (gmt 0)|
Do there seem to be different expectations now (from Google) for different types of sites? Image sites naturally have less text, since the images, theoretically, speak for themselves. In the past, though, Google treated them the same and it was hard to rank well against a traditional content site. Stats on my image site have doubled this month. I don't know if it's due to universal search, due to the thinning out of other sites that were above mine, or possibly due to a change in expectations for this kind of site.
| 9:59 am on Sep 10, 2007 (gmt 0)|
This makes sense now. I wrote a post in another forum a few days ago wondering why six forums had jumped to the number one ranking for six different but very competitive topics I follow.
It didn't make sense; they do not offer the best visitor experience if you want information, but they do give you a place to hang out.
SEO-wise they are a nightmare. Being categorized might explain why they took the number one spot, even in front of the manufacturer site, I guess. Not sure it's a good move though.
Forums are in!
| 10:02 am on Sep 10, 2007 (gmt 0)|
Yahoo! is going "behavioral" with their ad targeting. Perhaps Google will go behavioral with their SERPs?
A different set of results for each individual, perhaps? Categorizing makes that easier, I think.
| 1:36 pm on Sep 10, 2007 (gmt 0)|
Fascinating topic to say the least.
My question is this: would this move towards "classification" require a change in our collective SEO philosophies?
I would say no.
If anything, this further reinforces the idea of focusing on relevant backlinks and building out more "universal" content (text, video, images, user-generated, etc.)
| 1:47 pm on Sep 10, 2007 (gmt 0)|
I would say yes to the SEO.
It looks like you need to SEO towards the classifications/typing of web sites.
It seems that links have been devalued, and a quick look this morning when my site disappeared once again shows that typing is in full force right now. Classifieds after classifieds, and a lot of them from the UK, ZA and other weird places...
| 2:19 pm on Sep 10, 2007 (gmt 0)|
|This forced crazy-quilting of Page 1 might well push more users to Page 2 |
This reminded me of something I've been wondering about (to myself, mostly) for some time now, and it's mildly related to this thread - what would be the likelihood - and the effect - of Google deciding one day to change the default page from 10 results to 20? From my experience with clients, I believe that most users probably don't set their own preferences, and just leave it to the default, which of course is ten. But now that there's Universal Search pushing things down and off of page one - rather than depend upon the user clicking NEXT to go to the next ten, what if they decided to just show more results on page one?
| 2:35 pm on Sep 10, 2007 (gmt 0)|
They already do that!
I'm not sure you have seen this, but if you go straight to their 20-result page (I've done that from my stat pages) and then click through to the first page, the results are different.
| 2:37 pm on Sep 10, 2007 (gmt 0)|
That's not quite what I'm saying. If you do a search in Google, and you don't have a preference set (or aren't logged in to your Google Account) then by default, your search pulls up ten results. What if we all woke up one day and the default search pulled up twenty results?
| 2:43 pm on Sep 10, 2007 (gmt 0)|
|But now that there's Universal Search pushing things down and off of page one - rather than depend upon the user clicking NEXT to go to the next ten, what if they decided to just show more results on page one? |
Two factors (and probably more) would need to be considered before making such a change:
1) What the typical user sees (which is determined by screen resolution and the size of the browser window);
2) How users feel about clicking vs. scrolling on a search-results page. (Interesting bit of history: Back in the late 1990s, WIRED reported on an academic study where users were shown two versions of an article: one version with all text on a single page, and a second version with more text spread across multiple pages. The majority of users preferred the multiple-page version, which they incorrectly perceived as being shorter than the version that required scrolling.)
| 9:39 pm on Sep 10, 2007 (gmt 0)|
I've read people saying that your whole site has to be theme related, being careful not to slip into other realms than your main subject.
Example: on a glass windows site, not putting anything related to Microsoft or Explorer.
While this may be good advice, it demonstrates that some of us believe Google narrows your theme but not their own. Evidence of this is that, lately, YouTube is ranking well on very unrelated subjects.
Why would this non-strict-subject-related filter be lifted for YouTube?
Is YouTube exempt from this?
Why would other sites with diverse content (like directories listing different subjects) be filtered and not Google's group companies?
Google is becoming a strange place.