Search just became Search 2.0
| 8:33 pm on May 11, 2006 (gmt 0)|
Google II - "Now it's personal"
|Google Co-op is a community of organizations, businesses, and individuals working together to help improve Google search. |
Major implications for the traditional webmaster and SEM marketer?
| 8:09 am on May 20, 2006 (gmt 0)|
I was reviewing the press releases and guidelines for Google Co-op. The stated pitch -- "Google Co-op is a platform which enables you to use your expertise to help other users find information." -- seems to mean something along the lines of "We will let qualified professionals tag our SERPs."
This really sounds to me like Google saying that if you are a credible enough authority on a subject, you are welcome to massage the results for them.
Could this be the beginning of the end of a computerized Google algorithm that represents listings in the top half of the first page?
| 12:19 pm on May 22, 2006 (gmt 0)|
|Could this be the beginning of the end of a computerized Google algorithm that represents listings in the top half of the first page? |
It looks like a step in that direction.
Authority figures in their field will soon have the ability to "set" the SERPS?
Of course, that brings its own inherent problems and concerns. I'm not sure that I see it as "better" than link structure and traditional SE measures of "authority".
At the moment, it will follow that only "authorities" that you subscribe to will have an effect on your SERPS. It remains to be seen how effective the program turns out to be, but there is a very strong possibility that it completely throws a spanner in the works for traditional SEO (if there is such a thing).
One set of SERPS for me, another set of SERPS for you. Getting "in" with the known authority figures may become a big element of SEM.
| 4:14 pm on May 22, 2006 (gmt 0)|
It's an interesting situation. The Co-op is an umbrella for two essentially unrelated products: topics and "subscribed links". I'll talk about "topics", which can be used to offer "refined" SERPs to people who subscribe to your view of the world.
- This is hands-down the most botched and poorly prepared roll-out of any new Google service. People are encouraged to "contribute" their annotations to "existing topics" with absolutely no clear instructions on how to do so. Google groups to support the topics are created, but after an initial "charter" message, no Google employee answers any question. Volunteers are seen wandering these groups wondering how to contribute and what the hell is going on.
- The user interface for refinements... sucks pretty bad. Define as many refinements as you want, but if the user's search term triggers your refinements, only the first three items in each of the first three categories (which Google wants to call "facets" for some reason) will display. It is not visually obvious that more categories/items exist. It is not really clear that clicking on the "refine your [whatever] search" link will display your complete list of refinements. And if all your refinements fit nicely in the 3x3 list, then clicking on the "refine your [whatever] search" link will simply display the same refinements. If you click on any refinement, the refined search shows the complete set of refinements, but now with the category titles displayed. You also now see a "Clear refinements" link; it's hard to guess what the typical Google user might think clicking on that link will do.
- The documentation claims that your Co-op results will appear "immediately" for you (a lag for your subscribers) after you upload your URL annotations and context file (an XML file of specifications). In fact, they don't appear immediately at all, leading to much initial confusion. Indeed, even days after your topic has been defined, subscribers may see your results only some of the time (for no apparent reason that Google has documented). For example, this morning I went to look at a topic I have defined refinements for. I had to hit the Search button 4 times before my refinements finally appeared. Maybe one has to go around hitting every datacenter once to get them "primed" to avoid this effect. Maybe Google deliberately turns them on and off even for the same client to do some kind of relevance testing. Who knows?
- Some of the confusion comes from the fact that some "topics" are special because Google is "developing" them. They ask for contributors to these topics, but don't say how to contribute, how they'll decide which contributions to accept, etc. These "special" topics display differently in Google. People don't have to subscribe for them to appear, and there is no "remove" link.
- People have to manually "subscribe" to you (which is to say, all topic refinements associated with your account) in order to see anything different.
- The granularity is bizarre. If you're an expert in both Chinese cooking and astrophysics and develop "topics" for both, other people can only subscribe to both your topics or none.
- You're limited to 1,000 "annotations" (URL patterns that you can identify as worthy of moving up or down in the SERPs). Sounds like a lot, until you realize that's only 200 URLs with 5 different categorizations each.
- The only visible solution to the annotation limit and to the inability to offer separate subscriptions to unrelated topics in your account is... to just use up lots of Google accounts, where the purpose of each is simply to be a container for a separate topic or to add additional URL annotations to one of your other topics. In the latter case, you'll have to somehow convince your users to subscribe to your multiple accounts if you want them to see the results you've defined. And, like so many things with Co-op, the effect of the order of account subscription in the face of conflicting annotations is not documented.
- Your annotations can only be used to "refine" the SERPs, or to remove specific URL patterns from them -- you cannot add new URLs to existing search results. Refinements can only reorder SERPs or remove URLs from them. This means, for one example, that if your "widgets" website is sandboxed, you won't be able to construct a refined search that is the same as searching for "widgets", except with URLs from your website appearing at the top.
- To be clear, I use the mythical concept of "sandbox" here to simply mean "not appearing anywhere in the top 1,000 (that is, visible) SERPs". There is no sandbox conspiracy here, and the effect can be seen in most any website for some term. For example, dmoz.org is not believed to be sandboxed, yet I cannot use a Co-op refinement to move dmoz.org/Computers/Software/Backup/ to the top of the SERPs for the search query "abc". The reason is, even though that PR6 page does contain the word "abc", it does not rank for it in the visible SERPs (the top 1,000). You can't use refinements to inject URLs into the SERPs that aren't already there.
- You can also modify SERPs by appending new terms to the user's original search term. For example, if you search for "widget" and then click on the refinement named "repair", I can arrange for the following SERPs to be displayed: "widget (repair OR damage OR fixing)".
- You can also create refinements that display a specific web page instead of refined SERPs. A possibly unintended consequence of this feature is that I can construct a "refinement" that turns into a fetch of www.google.com/search?q=newterms. If those new terms also trigger some of my other defined topic refinements, then you get the effect of "chained-together" or hierarchical refinements -- even though the system is really not set up to offer any hierarchy of topics.
- You don't get to modify the results of your subscribers a priori. Only if they type in a search query that matches one of your patterns will a "teaser" be displayed that invites them to "refine" that search. Only if they then click on one of the links in that box (limited to 9 items) will you get a chance to display some modified SERPs for them.
- It's very easy at all times for anyone to "unsubscribe". So folks who try to get people to "subscribe" to "refinements" that essentially just make their personal website the top of all SERPs are likely to find those hard-won subscribers dropping like flies.
- The interaction between subscribing to multiple publishers of "topics" is poorly defined. It appears that if two different publishers (on purpose or by accident) use the same labels for their URL annotations, then the annotations of one publisher may be applied to the SERPs-rearranging rules of another. However, there is no mechanism for multiple publishers to note that they are sharing labels. There is no mechanism for a group of coordinated publishers to let users subscribe to them as a group.
- People's initial response seems to invariably be "OMG, spammers will be all over this." What they miss is: nobody sees your SERPs refinements unless they have a Google account, are logged into their Google account, and have manually subscribed to your Co-op account. About the only real opportunity for vicious spam (as opposed to subtle, very gray-area spam) that I can see is to use spyware to auto-subscribe people to your Co-op spam view of the world. Even then, all a spammed subscriber has to do is click that "remove" link when they see your spammy refinements appear. The spammer could combat this by creating 1,000 different Co-Op accounts and subscribing each victim to all of them. Wouldn't be too hard for Google to automate detection of this though (e.g., when you get a burst of "removes" for an account, suspend it until a human can glance at it and decide whether it's bogus or not).
- It's easy to accidentally create really bad refined searches. For example, until very recently, you could search on "bird flu" and see refinements offered by the "special" topic of "health", which is being supported by bigwigs like the CDC. Unfortunately, if you then clicked on the "alternative medicine" refinement, you would see a page dominated by the CDC -- but where none of the results had anything to do with "alternative medicine". They've fixed that particular case now, but it goes to show that it's difficult to examine all the possible implications of a set of rules for matching search queries combined with a set of labels for scoring URLs.
- As a subscriber, you get to peer into the annotations of any particular publisher. You can see the URLs that publisher has attached labels to, and what the names of those labels are. However, you don't get to see the "score" (used in rank ordering) for each URL, nor can you see the "context file" that defines what search patterns will be matched and how those annotations will be applied to which searches. That means that I could see that the refined results for "Alternative Medicine" were bogus, but I had very little ability to detect whether that was an accident or a flat-out deliberate spamming attempt. In the case of the CDC, I'm sure it was accidental. In the general case, I might want the general public to be able to scrutinize the details of a set of refinements that Google is considering rolling out into their "normal" SERPs.
- The holy grail here is Google's tease that, perhaps, someday, maybe, if you construct topic refinements that lots of people sign up for and love, Google may use your refinements in the SERPs they show to everyone, not just your subscribers. Some problems there include: are they going to display the refinements defined in your account from that point on? In that case, you can now start doing some subtle spamming now that your results are "live". If it's the reverse case, then Google gets a "frozen" copy of your refinements that slowly drifts towards irrelevance.
- It seems to me there was a lovely opportunity to tie Co-op into the AdSense search boxes in order to give AdSense publishers the ability to offer something more compelling than the current weakly "themed" search. Maybe that will happen in the future.
- Bottom line for webmasters: This is a tool for promotion. If you can create refinements that are useful enough to garner and keep subscribers, and that still generate more traffic for you than the normal SERPs, then this could be worth the effort for particular niches. That assumes that Co-op is going to become popular among users, which is far from certain. Do consider how many people in your target audience have a Google account, or even know what one is.
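To make the annotation mechanics described earlier in this post concrete, here is a minimal Python sketch that generates an annotations upload file. The element and attribute names (`Annotations`, `Annotation`, `about`, `Label`, `name`) are my best recollection of the Co-op format and should be treated as assumptions, not gospel -- check Google's own docs before uploading anything.

```python
# Hedged sketch: build a Co-op-style annotations XML file.
# Element/attribute names are assumptions, not verbatim from Google's docs.
import xml.etree.ElementTree as ET

def build_annotations(urls_with_labels):
    """urls_with_labels: list of (url_pattern, [label, ...]) tuples."""
    root = ET.Element("Annotations")
    for pattern, labels in urls_with_labels:
        # Each annotation attaches one or more labels to a URL pattern.
        ann = ET.SubElement(root, "Annotation", about=pattern)
        for label in labels:
            ET.SubElement(ann, "Label", name=label)
    return ET.tostring(root, encoding="unicode")

xml_text = build_annotations([
    ("www.example.com/recipes/*", ["chinese_cooking"]),
    ("www.example.com/stars/*", ["astrophysics"]),
])
```

If the arithmetic above is right, each URL/label pairing counts against the 1,000-annotation budget, so labeling the same URLs heavily gets expensive fast.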
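The query-rewrite refinement described earlier ("widget (repair OR damage OR fixing)") is simple enough to sketch in a few lines of Python; the function name and shape here are invented for illustration:

```python
# Minimal sketch of the query-rewrite behavior: clicking a refinement
# appends a parenthesized OR-list of extra terms to the user's query.
def refine_query(original_query, rewrite_terms):
    """Return the original query with rewrite terms OR'd together."""
    return f"{original_query} ({' OR '.join(rewrite_terms)})"

refined = refine_query("widget", ["repair", "damage", "fixing"])
# refined == "widget (repair OR damage OR fixing)"
```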
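The anti-spam countermeasure suggested earlier in this post (suspend an account when it gets a burst of "remove" clicks, pending human review) could be as simple as a sliding-window counter. A hedged Python sketch, with all thresholds invented for illustration:

```python
# Hedged sketch of the suggested abuse heuristic: flag an account for
# human review if too many "remove" clicks arrive in a short window.
from collections import deque

class RemoveBurstDetector:
    def __init__(self, max_removes=50, window_seconds=3600):
        self.max_removes = max_removes
        self.window = window_seconds
        self.events = deque()  # timestamps of "remove" clicks

    def record_remove(self, timestamp):
        """Record one remove click; return True if the account should be flagged."""
        self.events.append(timestamp)
        # Drop clicks that have fallen out of the sliding window.
        while self.events and timestamp - self.events[0] > self.window:
            self.events.popleft()
        return len(self.events) >= self.max_removes

det = RemoveBurstDetector(max_removes=3, window_seconds=60)
flags = [det.record_remove(t) for t in [0, 10, 20, 300]]
# Three removes inside 60s trigger the flag; the late click at t=300 does not.
```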
| 6:22 am on May 24, 2006 (gmt 0)|
I'm sorry to sound skeptical about this, but given the size of the internet, and the problems with other attempts at human classification, I just don't think it is the best option. It almost sounds like an admission of defeat to the spammers. It probably isn't that, but the people they pick to "adjust" results will have billions of dollars in their hands with each "adjustment", and that is a lot of authority to give away.
| 7:06 am on May 24, 2006 (gmt 0)|
feel better ronburk?
| 7:31 am on May 24, 2006 (gmt 0)|
|Could this be the beginning of the end of a computerized Google algorithm that represents listings in the top half of the first page? |
The beginning was long ago.
Remember the friendly Dutchman? Remember hand tagging of results by Google as "Vital"? No news here... just more smoke.
| 9:25 am on May 24, 2006 (gmt 0)|
I don't think Google understands its users. It seems to me this is a quirky little tool that cost loads to build and will be used by almost no one.
People want relevant search results and they don't want to fart about subscribing to various Coop thingies.
| 9:25 am on May 24, 2006 (gmt 0)|
Sounds like something similar to, but not exactly like, dmoz for search results!
As described above, the refinements would take the first part of the results page, the second part would be normal search results, and a third would be topic-centric -- like a "did you mean?" suggestion.
Remember that new layout testing?
| 10:01 am on May 24, 2006 (gmt 0)|
You really get a sense of absolutely no excitement around this "tool". I mean, imagine how happy people are to be able to log in to Google and subscribe to links of websites they already know about?
| 10:44 am on May 24, 2006 (gmt 0)|
Most of Google's new toys are useful or at least interesting to me. This one however, leaves me totally cold. It has the potential to really mess things up.
I hope it goes away soon.
| 11:51 am on May 24, 2006 (gmt 0)|
This looks mostly like a sad attempt at categorization or related searches.
"Clusty.com" does a better job at this.
Based on this first look, I'd never use this again. It doesn't even make sense to the end user.
| 1:13 pm on May 24, 2006 (gmt 0)|
I'm just about to book a holiday - now, before I do a search on my chosen destination, I will just go and browse the Google Co-op directory and decide which travel "experts" I will subscribe to first....
It's just not going to happen, is it? - the only way this will function is if Google steadily increases the number of "experts" that users are automatically subscribed to by default.
That is no doubt the intention - we will find there is a category of "trusted" experts and the SERPS will be increasingly controlled by this cartel.
The number of annotations that a website has from "trusted" experts will become yet another "quality signal" that affects rankings - whether or not users refine their searches.
It won't just be "trusted IBLs" that matter, but "trusted annotations" as well.
What will that do to the freshness of the SERPS?
| 1:16 pm on May 24, 2006 (gmt 0)|
Google wants people to publish their stuff on its pages rather than be a search engine that's really good at sending people to the appropriate page on the internet.
Which is a good strategy if they've given up on being able to stay one step ahead of the spammers. At least now the spammers will be using Google's platform to achieve their goals.
| 1:21 pm on May 24, 2006 (gmt 0)|
"Dissatisfied? Help us improve 2.0"
Interactive algo training/building experiment. Mere addition of a data channel. The learning machine needs more data. "More data!" Nothing unexpected here, nor a permanent shift to human sorting, yet a great science experiment. Wish I could be in the design room. Looks like a lot of fun. Run as a discrete test on one data center. Reminds me of the Deep Blue versus Garry Kasparov chess match.
Google Deep Think build v1.0: "The answer is . . . 12 . . no, no, no . . 49 . . no, no . . wait . . "
Google Deep Think build v3.3: "The answer is . . . buy phentermine now . . no, no, no . . . "
| 3:17 pm on May 24, 2006 (gmt 0)|
I'd suggest an alternative theory.
Google wants to understand exactly what the user wants. But when a user queries "movies", what does she mean? A DVD? Local movie listing? A movie database perhaps?
Queries of this broad nature are, IMHO, precisely the problem that Page and others mention when they say they want to give users exactly what they mean.
So how do you solve that problem?
You can have an automated system that extracts concepts. We already know that Google understands concepts to a certain level; that is, it can, at the very least, identify sister concepts to a major concept.
So how about this: Google wants to "reality-check"/"refine" its own conceptual answers against human answers.
The dilemma they face is spamming. On face value, it seems like the simplest answer is to choose Co-ops that cover a variety of topics.
| 4:11 pm on May 24, 2006 (gmt 0)|
|but the people they pick to "adjust" results |
They don't pick people. They let algorithms that measure search result satisfaction pick from data that people submit. Being a submitter with lots of money doesn't necessarily help you. Being a guy who just wants to plug his own website definitely hurts the odds of your "refinements" ever being selected for widespread use.
Remember, if your "refined" results don't look like any kind of improvement, that "remove" link is right there in front of every user, every time. It's much, much easier to unsubscribe to a publisher than to subscribe in the first place.
|People want relevant search results and they don't want to fart about subscribing to various Coop thingies. |
True. That's why you don't win by getting a million users to subscribe to you (which you could never do). You win by getting a modest number of subscribers in your niche to subscribe to you, and attracting enough Google attention (or algorithm attention) to say "Hmmm, maybe this guy's topic refinements are worth a test with the general public."
Just like Google every once in a while takes my page 8 listing and puts it on page 1 to test whether it deserves to move up in the ranking, they can auto-detect plausible Co-op "topics" and test them by randomly dropping them in front of the general public (people who didn't subscribe to you, who aren't even necessarily logged into Google). If they measure satisfaction at that point, then they can put that snapshot into the general search results -- and go right on testing, including testing revisions you make to your original "topics".
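The kind of randomized exposure test described here can be sketched as a simple A/B split: occasionally serve a candidate topic's refinements to random non-subscribers and compare a satisfaction metric against the normal results. Everything in this sketch (names, the exposure rate, the satisfaction metric) is an assumption for illustration:

```python
# Hedged sketch: randomly expose a fraction of queries to "refined"
# results and promote the topic only if satisfaction measures better.
import random

def run_exposure_test(queries, serve_refined, serve_normal, score,
                      rate=0.1, seed=42):
    """Split queries randomly; return True if refined results score higher."""
    rng = random.Random(seed)
    refined_scores, control_scores = [], []
    for q in queries:
        if rng.random() < rate:
            refined_scores.append(score(serve_refined(q)))
        else:
            control_scores.append(score(serve_normal(q)))
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return mean(refined_scores) > mean(control_scores)
```

The point of the sketch is only the shape of the test: subscribers never need to be involved, and the publisher never sees which queries were part of the experiment.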
|Sounds like something similar to, but not exactly like, dmoz for search results! |
Exactly. It's a framework for distributing the work of editing a directory. You get a different (and arguably richer) set of tools for presenting your "directory" information to users. You get Google algorithms instead of meta-editors deciding which "editors" are going to have their "work" ever see the light of day (appearing in the main Google search results).
It will be interesting to see if any of the folks who constantly rant about dmoz will invest effort in this. Here's your chance to make your own directory. You just have to do all the work for your niche, sell people on subscribing to it, and then be good enough quality that Google actually someday wants to use it. I suspect many of the folks that get rejection slips from dmoz will likewise be rejected by the Google algorithms associated with Co-op.
|It has the potential to really mess things up. |
That sentiment usually comes from people who don't understand how difficult it is to spam. It has no more potential to mess things up than any of the other hundreds of factors that go into deciding what Google's going to display for any given search query.
All speculation on my part. I have no inside track other than having played with it more than most.
| 4:14 pm on May 24, 2006 (gmt 0)|
"I don't think Google understands its users."
This is where the Google PhDs and the common folk (like me, with a four-year degree) diverge. I don't "get" it, don't need it, and can't afford the time to even understand it.
Marissa was in a vulnerable mood the day they put this in front of her.
| 8:12 pm on May 24, 2006 (gmt 0)|
My vote says: very sad. Let Google do it; they will fail faster. Great for investors wanting to short the stock, and good for MSN and Yahoo, who seem to have a better footing on algorithms and search anyway. I don't care what the official opinion is, but Google is getting further and further behind the 8-ball on search results.
End of opinion - end of story...
| 4:38 am on May 25, 2006 (gmt 0)|
My first impression was not how this could be manipulated by users, but how easily google could test this. Excellent observations, ronburk.
| 9:53 am on May 25, 2006 (gmt 0)|
Say no more.
| 5:44 pm on May 26, 2006 (gmt 0)|
Check out search 2.0 delivered as RSS. Just for a taste, try our tool and Google with Google Video in e-mail. This is a Google alpha, folks.