| This 174 message thread spans 6 pages |
|Google AJAX-Powered SERPs Kill Referrals|
Story on Clicky [getclicky.com] about Google testing AJAX-powered SERPs. I believe the finder's credit goes to this story [smackdown.blogsblogsblogs.com]. I think it is a major story, with big implications for site owners and webmasters.
About 7 years ago, in the classic "3 characters from done [webmasterworld.com]" thread, we discussed the SEO ramifications of search engines switching from GET to POSTed data. It never came to pass, though, that browser programmers would allow an unchecked form "post" submission. Still, the hiding or obscuring of referral data is nothing new:
March 2002: [webmasterworld.com]
|Both AOL and Overture routinely encode urls and session strings. There are some meta engines and javascript-based forms on engines that already hide referral strings. Most of us have seen this effect from MSN Search already. |
I honestly don't think G is shortsighted enough to do something like that system-wide just to obfuscate SEO efforts. Websites would have little incentive to look to Google for traffic or to optimize for it. We wouldn't know what or how to optimize for keywords - optimization would be shots in the dark. Our only option would be to look for other big sources of traffic [webmasterworld.com].
There has always been an unwritten trust between websites and search engines: they can use our data as long as they send us traffic. Referrerless traffic pretty much breaks that unwritten but implicitly agreed-upon trust and treats websites as commodities - just food for the engine.
On the other hand, we have to respect a website's - even Google's - opportunity to innovate. I think we have to see what G is doing with the AJAX before passing final judgment on it and its intentions. I doubt it is Google's intent to break log analyzers and keyword trackers with this test; I think that is a by-product of whatever AJAX implementation Google is currently testing.
On the third hand, it would probably drive a lot of webmasters to look to AdWords for traffic. hmmm
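As a side note on the mechanics (just a minimal sketch with made-up URLs, not Google's actual code): the reason the AJAX test kills keyword referrals is close kin to the old GET-vs-POST worry. With a classic GET search the query lives in the URL, so it rides along in the Referer header when a user clicks through; with POST it lives in the request body; and with a fragment-based AJAX URL it sits after the "#" - and browsers never send fragments to any server:

```python
from urllib.parse import urlparse, parse_qs

# Classic GET-style SERP URL: the query is in the URL's query string,
# so a destination site can see it in the Referer header.
classic = "http://www.google.com/search?q=blue+widgets&hl=en"
parsed = urlparse(classic)
print(parse_qs(parsed.query).get("q"))   # ['blue widgets']

# Fragment-style SERP URL (hypothetical form): the query sits in the
# fragment. Browsers never transmit the fragment, so a bare referrer
# like "http://www.google.com/" is all the destination site would see.
ajax = "http://www.google.com/#q=blue+widgets"
parsed = urlparse(ajax)
print(parse_qs(parsed.query).get("q"))   # None - the query string is empty
print(parsed.fragment)                   # 'q=blue+widgets' (client-side only)
```

So a log analyzer parsing referrers gets the keywords in the first case and nothing at all in the second, even though the user typed the same search.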
|I don't work for Google, I'm not an attorney, and I'm not defending Google's right to do anything. (A lawyer might point out that Google Image Search has previewed entire works for quite some time, though.) |
Yet, in another thread that you started, on the possibility of G providing the functionality to preview pages in detail, you made comments such as:
|Actually, it's all about the search experience--which translates into a better search product... |
On the whole, it could be a good development... if the Google Search team can improve the user experience through more informative SERPs...
As a user and as a publisher, I think the idea holds great promise...
As far as image search goes, I seem to remember the argument was that a thumbnail was "fair use" inasmuch as the visual information was the only way to effectively identify and index the image.
I might also point out that image search does not carry ads - i.e. it is not commercialized (yet).
Thinking about the "mouseover preview" idea some more:
- Google could easily make this an "opt-out" feature, but Webmasters who opted out would be at a disadvantage in the SERPs.
- Or Google could make it an "opt-in" feature, which would (or ought to) muzzle the "Google is stealing my pages without my permission" crowd while offering a benefit to site owners who chose to participate.
Would you opt out or in, if you had a choice?
-- opt out or in --
That is the question! So far my Firefox search/URL suggestion box is filled with # searches from testing this nonsense. I browse with JS disabled - only trusted sites. G is not one of them, sorry.
Clearing search history is inconvenient for me as a user; I have other sites in that list. And visiting Google with JS turned off and choosing from the list is a pain because it displays the default search page. If I remember correctly there was a <noscript> thingy, and unless the experience without JS on Google is the same, I'd say it would be hypocritical if we go by the "first page sees all" rule for the testing phase.
P.S. Don't forget Flowers
" (making it possible for searchers to preview sites from the SERP) is too compelling to ignore if it can be done. If Google doesn't do it, somebody else will, and I can't see Google being willing to let a competitor or an upstart be the game-changer. "
Ask.com has been doing the preview thing for about the last year; can't say people find it that great, as they are not getting any market share.
|Ask.com has been doing the preview thing for about the last year; can't say people find it that great, as they are not getting any market share. |
1) Ask.com's implementation is pretty weak, and it's almost invisible to casual users. (The user has to click a binocular icon, and the resulting preview pane doesn't show much. It isn't the equivalent of the default mouseover behavior in Brett's "imagine this..." scenario.)
2) Ask.com has such a tiny market share that most searchers have never been exposed to the binocular icon, let alone to Ask.com's small preview panes.
Still, your guess is as good as mine. But Google won't have to rely on guesswork--it will have test data to guide its decisions.
BTW, as far as opt-in vs. opt-out goes, I'd guess that, if Brett's scenario were implemented, it would be opt-out, because opt-in wouldn't provide a critical mass of previewable search results. Of course, if Google wanted to be evil, it could sell a "mouseover preview" option in organic SERPs, just like the optional boldface business listings in the White Pages of my local telephone directory. :-)
|brotherhood of LAN|
Drifting off topic slightly, but expanding on the idea of previews: you have to wonder what a search engine's end goal is.
Is it purely an (economical) way of making the web's information findable, or a journey towards an encyclopedia of knowledge? Back in the day the only major add-on to G was a dictionary...
|you have to wonder what a search engine's end goal is |
Whatever you think of them, it seems reasonable (and sensible).
Gee, that statement wasn't written, refined, and approved by the legal team at all, was it?
Very conversational and all.
You know, Matt's usual loose self, joking about this and that.
For those who don't catch my sarcasm, let me rephrase:
You, as a webmaster, should be very concerned that the ONLY comment on this "test" was obviously generated by the legal department.
Considering the NOISE around the web about it, Google's silence - withOUT publicly refuting some of the claims made in this thread and on MANY other boards - is NOT a good sign that this test is going to lead to something that will make ANY webmasters happy.
|Considering the NOISE around the web about it, Google's silence - withOUT publicly refuting some of the claims made in this thread and on MANY other boards - is NOT a good sign that this test is going to lead to something that will make ANY webmasters happy. |
Why would they want to get into a public catfight or subject themselves to abuse over something that hasn't happened?
If you worked for Google, would you consider that to be a productive use of your time?
|If you worked for Google, would you consider that to be a productive use of your time? |
lol what does your argument have to do with what i wrote?
Generally, MC will post something a little more... how do you say... something actually written by him when he posts about "accidents", "tests", etc.
The simple fact that what he wrote on seomoz was not written BY HIM but obviously BY the legal team and official public relations dept SHOULD have you concerned.
It's a glaring red flag to those of us who can guess "what we would do if we were Goog", because that scenario is NOT pretty for ANY of the webmasters here.
[edited by: tedster at 12:08 am (utc) on Feb. 9, 2009]
"The simple fact..."
Your view on "facts" gets more unfactual all the time.
|Why would they want to get into a public catfight or subject themselves to abuse over something that hasn't happened? |
Ah, a question I have confronted from our Board of Directors many, many times.
There are any number of reasons why corporate doublespeak does not work in the age of the Internet. Time spent interacting with "the market" is always well spent, and rumor control is far easier to accomplish than damage control. "Abusers" lose their power when they are treated with dignity and respect. Even though the abuser may never change, those eavesdropping on the interaction will form their allegiances based on what they hear (or don't, as the case may be).
These are concepts that G once understood and mastered. (That would be back when "GoogleGuy" frequented these boards and actually interacted with webmasters.)
We, and others, have often found our suppliers to be one of our richest sources of new ideas, especially when it comes to products - yes, even more than customers. For G, it would seem to me, the webmaster is the supplier.
Ignore them at your own risk.
|If you worked for Google, would you consider that to be a productive use of your time? |
To answer your question in a single word, "Yes".
|"The simple fact..." |
|Your view on "facts" gets more unfactual all the time. |
Agreed - this whole thread keeps veering toward guesswork and opinion, and calling an opinion a fact doesn't make it one.
Does anyone have any more observations about the actual AJAX code that Google deployed? I can't find any more examples at present - did the "test" go away?
|Agreed - this whole thread keeps veering toward guesswork and opinion, and calling an opinion a fact doesn't make it one. |
It's called "chunking up"
Or 3rd tier thinking.
You know like, "what does this mean to me?"
"How will this affect ME in the future?"
The deployment of the AJAX is called "chunking down" - ie "give me details, give me examples"
But at the end of the day, one still needs to chunk it back UP after chunking down to see how it will affect MY business.
And by definition that means one has to "speculate" "guess" and "predict"
This basic function of thinking is how one stays AHEAD of the game, instead of continually playing catch up
This thread is about Google not passing referer keyword data. That takes no guesswork on our part, no "what if" ...because it really did happen. The whole side issue about previews is pure conjecture -- it "might" happen, it "could" happen. Well, sure, lots of things could happen.
But what DID happen was that up to 10% of some webmasters' Google traffic carried no keyword data last week.
Matt Cutts said that wasn't the intention of the test. Well, even if it wasn't intentional, it was a pretty danged short-sighted thing to do. And if it ever happens again, I personally hope the shout around the web is a thousand times as loud as this one has been.
Anyone who isn't using their server logs to discover keyword data but only looking at raw traffic numbers should dig down and start using this valuable resource immediately. It's pure gold, and has been part of webmastering since the beginning. Through keyword referer data, your actual visitors tell you what they value about your site, what they want from you and what they expect from you.
It's my view that no company, no matter how big, should even dream about taking that resource away.
[edited by: tedster at 7:21 am (utc) on Feb. 9, 2009]
|Anyone who isn't using their server logs to discover keyword data but only looking at raw traffic numbers should dig down and start using this valuable resource immediately. It's pure gold, and has been part of webmastering since the beginning. |
And in some countries, you have to keep the access log files for a certain period. Of course, if Google does not provide keyword information in the referrer, that data cannot be kept by the owner. I'd guess that governments might not be very happy about this.
And the last thing Google wants is interaction with critical governments (which is 1000x worse than interacting with their AdSense partners, which is 1000x worse than interacting with paying customers).
It's certainly possible that Google didn't realise referrer data would be lost, or the implications of this - you only have to look at the 'flagging all sites as malware' incident to see that Google has sprawling, uncontrollable processes and there may not always be the oversight expected from a large company.
And on the flipside, you only have to look at some of the testing that has enraged AdSense and AdWords publishers over the years to see that Google is prepared to conduct large-scale live tests, seemingly regarding webmasters' problems as acceptable collateral damage.
I don't see any conspiracy or the sky falling, though. Or anything new. Google's site is Google's, and a site owner's site is their own. I think many here have more than happily coped with (or profited from) the various changes at Google over the years. Google both pushes, and is pushed by, individual websites. Besides, it would get boring if things didn't change ;)
[edited by: Receptional_Andy at 1:04 pm (utc) on Feb. 9, 2009]
I am no longer seeing the fragment identifier in Google search like I was last week.
Something of interest regarding the discussion here: Michael Mahemoff commented on the fragment identifier [webmasterworld.com], having noticed it being used in Google Docs before the AJAX-powered SERP discussions erupted. So it does indeed carry beyond the scope of search alone, being tested/used in other services that Google offers.
Wow :0, just wow!
168 posts, and if a single post made reference to users' privacy then I missed it (my apologies if you were the sole one ;) ). That might have been par for the course in 2002 but today? You should all be ashamed of yourselves ;) .
I'll admit I've previously found aggregate referrer information to be useful to me. If it goes away though I'll have a hard time justifying any sense of outrage. And the AOL experiment made it quite clear that the only thing stopping individual drill down into information possessed is the morals of the person doing the possessing. Do you trust those behind every web site you visit?
Can anybody provide me with an explanation of why we should even be entitled to know the domain (where) a visitor comes from, let alone query strings (what they were doing while they were there)?
Let's face it. We've been taking advantage for personal profit of the lack of safeguards built into a system invented by academia, in the same way spammers take advantage of same in email! (No, I don't seriously consider this as bad as spamming or I wouldn't have done it myself (but I still need a shower), but it does have a similar smell.)
[edited by: Status_203 at 2:08 pm (utc) on Feb. 19, 2009]
Followup: Matt Cutts stated at PubCon Austin this week that this was a test of speed. They found that using AJAX to fetch the results might increase SERP speed. They did not anticipate the loss of referral string information at the start of the test.
When I asked Matt why they didn't just come out and say that, Matt said that was nice feedback and he would let them know.
|When I asked Matt why they didn't just come out and say that, Matt said that was nice feedback and he would let them know. |
How are we supposed to interpret this?!
I have two different opinions on what this means.
Both would be considered "Google Noise" but they are the ONLY explanations.
A.) Google lies.
B.) Google is clueless.
C.) Some from column A, some from column B.
I find it "odd," to say the least, they just didn't err.. what's the word?
And quiet down the uproar.
'They did not anticipate the loss of referral string' = clueless.
Right arm, whatever it is for Google, doesn't know what the left arm, whatever that is, is doing.
|Right arm, whatever it is for Google, doesn't know what the left arm, whatever that is, is doing. |
This is far from the first time that we've seen this kind of problem crop up. Google has become rather big, and it is very compartmentalized. Each group within Google can be so focused on their goal that they are unaware of the bigger picture.
I consider it a major "growing pain" of the kind that afflicts many organizations. I see it with my larger clients quite frequently - IT makes a move that makes sense, but just within their viewpoint. In actuality, they end up zapping the in-house SEO efforts, or the marketing department goals.
Google is still immature as companies go, a youngster whose growth spurt was rather astounding. The webmaster community will (and should) call them on their worst "adolescent" missteps. Our viewpoint is external and we will notice some things before anyone within the plex does.
As another side note, at the Austin Pubcon, Matt also mentioned that the Google Analytics team also hit the roof about this particular problem.
|Our viewpoint is external and we will notice some things before anyone within the plex does. |
|the Google Analytics team also hit the roof about this particular problem |
Interestingly enough, this was noticed by a member here first too, and referenced earlier in this very discussion:
|An interesting post was started in Website Analytics - Tracking and Logging titled |
Tracking traffic sources [webmasterworld.com]
Why does Google Analytics give you a source you can't track?
Anybody here using GA who can comment in that thread on whether the discussion here might explain what is being noticed in GA?