This 174-message thread spans 6 pages.
|Google AJAX-Powered SERPs Kill Referrals|
Story on Clicky [getclicky.com] about Google testing AJAX-powered SERPs. I believe the finder's credit goes to this story [smackdown.blogsblogsblogs.com]. I think it is a major story, with big implications for site owners and webmasters.
About 7 years ago, in the classic "3 characters from done [webmasterworld.com]" thread, we talked about the ramifications for SEO of search engines switching from GET to POSTed data. That never came to pass, because browser makers never allowed an unprompted form POST submission. The hiding or obscuring of referral data, however, is nothing new:
March 2002: [webmasterworld.com]
|Both AOL and Overture routinely encode urls and session strings. There are some meta engines and JavaScript-based forms on engines that already hide referral strings. Most of us have seen this effect from MSN Search already. |
I honestly don't think G is shortsighted enough to do something like that system-wide just for the sake of obfuscating SEO efforts. Websites would have little incentive to look to Google for traffic or to optimize for it. We wouldn't know what or how to optimize for keywords - optimization would be shots in the dark. Our only option would be to look for other big sources of traffic [webmasterworld.com].
There has always been an unwritten trust between websites and search engines: they could use our data as long as they sent us traffic. Referrerless traffic pretty much breaks that unwritten but implicitly agreed-upon trust and treats websites as commodities - just food for the engine.
On the other hand, we have to respect a website's - even Google's - opportunity to innovate. I think we have to see what G is doing with the Ajax before passing final judgment on it and its intentions. I doubt it is Google's intent to break log analyzers and keyword trackers with this test; I think that is a by-product of whatever Ajax implementation Google is currently testing.
On the third hand, it would probably drive a lot of webmasters to look to AdWords for traffic. Hmmm.
IMHO, signor_john has it exactly right regarding Google's view of the effects this might have. Powerful companies can force change and this change has plenty of potential benefit to Google with very little downside to them. The issue of what happens to third parties and how those parties may react becomes for Google largely a PR task, not a strategic restriction (and G can point to existing browser technology as a part of their PR strategy).
Notable benefits from G's POV would seem to include:
-- encouraging more time on Google.com (if you thought cached pages were huge ...),
-- cutting off, at least temporarily, the two-way flow of very valuable information,
-- opening up more revenue streams, should G choose to monetize kw data currently being distributed freely,
-- encouraging more webmasters to share information with G (in order to get information),
-- discouraging SEO,
-- encouraging webmasters to focus on "creating great content" rather than on "manipulating SERPs."
What's not to like?
I'd add the following to caveman's assessment:
-- a huge "cool factor" among searchers, depending on how the AJAX is implemented.
|anti-SERPs-scraping initiative, then they'll probably just switch the data into your registered Webmaster Tools account, thereby giving the legitimate owner all the data they could want |
Now I am starting to warm up to this idea.
So, going to page 2 on serps... will that break the back button?
While I hate this from an SEO POV, I'm finding it hard to see any justified reason for the outcry, beyond sour grapes.
|So, going to page 2 on serps... will that break the back button? |
No, not as long as the page has been cached by the browser. Google does not seem to be using AJAX on the paging. They are GET requests.
Statement retracted ... this is interesting ... paging works fine, as does clicking the back button in the browser. However, the forward-paging request URI (the "Next" link at the base of the page) and the address in the address bar are different. Compare on a search for "example" (I stripped the scheme and G domain and then attempted to line the two up here for comparison):
A quick peek at the http headers is showing no redirection, so perhaps they are indeed using AJAX on the paging.
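For anyone wondering how paging can change the address without a new page request, the trick appears to be moving the query into the URL fragment. A toy sketch of the idea (my own illustration, not Google's code - the fragment is never sent to the server or exposed in a referrer, yet changing it still creates a history entry, which is why the back button keeps working):

```javascript
// Extract the "q" parameter from either the query string (?q=...)
// or the fragment (#q=...), preferring the fragment when present.
// Everything after "#" stays in the browser: it is not part of the
// HTTP request and never shows up in a referrer query string.
function readQuery(url) {
  var hash = url.indexOf('#');
  var qs = url.indexOf('?');
  var params = '';
  if (hash !== -1) {
    params = url.slice(hash + 1);
  } else if (qs !== -1) {
    params = url.slice(qs + 1);
  }
  var pairs = params.split('&');
  for (var i = 0; i < pairs.length; i++) {
    var kv = pairs[i].split('=');
    if (kv[0] === 'q') {
      return decodeURIComponent(kv[1] || '');
    }
  }
  return null;
}

// Classic URL: the server (and any referrer) sees the query.
readQuery('http://www.google.com/search?q=example');    // "example"
// Fragment URL: only client-side script ever sees it.
readQuery('http://www.google.com/#q=example&start=10'); // "example"
```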
[edited by: coopster at 5:54 pm (utc) on Feb. 4, 2009]
How are you seeing the ajax, coop? Is there a special URL you are using? Did you get a cookie for the test? Care to share? We have yet to run into anyone that is getting the ajax code. Are you getting the drill-down stuff?
The more I think about this, the better I like it. Consider the opportunities that it offers:
FOR THE USER: Easier searching, with fewer detours and better results.
FOR THE SITE OWNER: The chance to entice searchers with relevant content and effective page design--without having to obsess about being #1 on a SERP, worry about the snippet that gets served up with the page title, or feel at a disadvantage on Universal Search SERPs that include images, video clips, etc.
FOR THE SEO: An opportunity to move beyond the technical or the mechanical by consulting on page design and content presentation, which will become more important if users are able to preview pages in the search engine's results. Smart SEOs will also develop formulas and "what if" scenarios to help site owners make strategic design decisions, such as how to find the sweet spot between optimum ad placement and optimum clickthrough rates from SERPs.
Who will be hurt by this kind of change? Two groups:
- SEOs who are unable to adapt (nothing new about that--how many SEOs are still making a living with their AltaVista/InfoSeek SEO techniques?)
- Site owners whose content presentation doesn't interest or entice users, such as template-based sites with little content on the page or "made for AdSense" sites that have three AdSense units above the fold.
Just reading Brett's description of mousing over search results and seeing page previews makes today's Google SERPs seem old-fashioned and quaint. This is a change that needs to happen and will happen--the sooner, the better.
The google homepage is now the #1 site referral for the month of February on our main site. 2% of overall traffic yesterday.
Looking through my Analytics, it appears that they must have tested this stuff back on January 26th as well. Sometime Monday they started again and yesterday it went up even more.
I was able to get the ajax results by signing out of google and searching from google.com using firefox. I didn't get them when I was using iGoogle (signed in or signed out).
|An opportunity to move beyond the technical or the mechanical by consulting on page design and content presentation, which will become more important if users are able to preview pages in the search engine's results. Smart SEOs will also develop formulas and "what if" scenarios to help site owners make strategic design decisions, such as how to find the sweet spot between optimum ad placement and optimum clickthrough rates from SERPs. |
an excellent segue toward more focus on A/B and multivariate testing for optimal element positioning.
I can confirm everything being said about logs, as I've seen the same thing in my tests, but that is a no-brainer -- it's already been reported, and indeed you get "http://www.google.com/" in your referer and that's it. However, if you turn JS off, you fall back to the original "q" query string, and the full referer and query string are still there. So anybody with JS turned off who visits G and follows a search result will still show the referer query string in your logs. Not that this means much, just wanted to share the information.
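For anyone rolling their own log analysis, the behavior above boils down to something like this (a hypothetical helper of my own, not any particular analytics package): you get the keyword when the classic "?q=..." referrer arrives, and null when the AJAX SERP collapses the referrer to the bare homepage URL.

```javascript
// Hypothetical log-analysis helper: pull the search phrase out of a
// Google referrer when it is present. With the AJAX SERPs the referrer
// collapses to "http://www.google.com/" and this returns null; with
// JS turned off, the classic "?q=..." referrer still arrives intact.
function searchTermFromReferer(referer) {
  if (!referer || referer.indexOf('google.') === -1) return null;
  var match = /[?&]q=([^&#]*)/.exec(referer);
  if (!match || match[1] === '') return null;
  // "+" encodes a space in query strings.
  return decodeURIComponent(match[1].replace(/\+/g, ' '));
}

searchTermFromReferer('http://www.google.com/search?q=blue+widgets'); // "blue widgets"
searchTermFromReferer('http://www.google.com/');                      // null
```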
We're talking about two different things, Coop. I am not getting any of the ajaxified results - regardless of browser. Yes, there is JS on the page, but it isn't doing anything but tracking. The drill-down you are talking about is Google Suggest; that isn't what I mean. There apparently is new ajax functionality being tested on the serps that opens "more info" (eg: mouse over to get "More results from...").
I agree that this is probably huge.
Google is known to give out as little information as possible. And they deliberately choose which information they give out.
So it is rather unrealistic to think that they will give that information to webmasters once they have the ability to control it.
If you want to get a glimpse into the future - just have a look at AdSense reports. There is hardly any valuable information in there, and critics are usually told to simply shut up: "You want Google's money, so accept the deal. You don't have a right to get that information. Accept the system 'as is', or do not use Google for monetization." - Now, with the AJAX referrers, it's easy to see how this may be translated to Google search: "You want Google's traffic, so accept the deal. You don't have a right to get that information. Accept the system 'as is', or do not use Google for getting traffic."
I see only one chance to stop Google: webmasters need to block the Google bot and actively promote other search engines. I know how hard this may be. It's certainly easier said than done. Even for our tiny group of sites. But it needs to be done. TBH, I don't even want to start thinking about a time where Google IS the Internet.
>>talking two different things
Oh, that "drill down". Understood - and yes, I thought you meant G Suggest. However, in Firebug, after G Suggest is where I see the other AJAX src being created and retrieved during the same request. You have to sift through some chrome JS calls, but in the remaining (obfuscated filename) calls you will find the XMLHttpRequest code and the changes to window.location as is being discussed here.
We seem to have bridged into another discussion regarding the other "experiment" going on. I think you may find that stuff in the accessibility area of G Labs ... [google.com...]
|That means that this is more than a simple one shot test, there are millions of people getting this test. |
I encountered it in Japan.
Its HTML code was totally different from that of the normal SERP. There were no URLs to be found in the code, even though they showed on the SERP. That means rank-checking software couldn't retrieve the ranking data.
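That would explain the rank-checker breakage: a scraper that pulls result URLs out of the fetched HTML finds nothing when the results are injected later by script. A toy illustration (regex-based, purely for demonstration - real rank checkers are more elaborate):

```javascript
// Toy rank checker: scan raw SERP HTML for result links.
// Against a classic SERP this finds the anchors; against the AJAX
// test page, the fetched HTML contains no result URLs at all (they
// are filled in later by script), so the scraper comes back empty.
function extractResultUrls(html) {
  var urls = [];
  var re = /<a[^>]+href="(https?:\/\/[^"]+)"/g;
  var m;
  while ((m = re.exec(html)) !== null) {
    urls.push(m[1]);
  }
  return urls;
}

var classicSerp = '<a href="http://example.com/page">Example</a>';
var ajaxSerp = '<div id="res"></div><script src="serp.js"></script>';
extractResultUrls(classicSerp); // ["http://example.com/page"]
extractResultUrls(ajaxSerp);    // []
```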
I didn't require a cookie to get the AJAX results (indeed I had cookies disabled). The only noticeable difference that I saw was the lack of a new URL request, and the different URL. So to me, it seems like a technology test more than anything - to assess support for this method of displaying results. I saw this from a US-IP only, and nothing from the UK.
I'm not seeing this behavior on a site search. For example, if I use the search feature at the top of the WebmasterWorld forum page, the query-string method remains and no fragment identifiers are being used.
I can get these results for a site search, coop, but only if I start from the Google homepage (or [google.com...]
What I can't quite figure out is that when I request the AJAX-y URL, Live HTTP Headers records a request for the traditional URL with a question mark. Maybe Google is requesting the whole page via AJAX and then just switching the content, but that would be slightly strange.
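If that is what is happening, the translation from the address-bar URL to the URL actually fetched in the background would be trivial - something like this speculative sketch (my own guess at the mapping, not Google's implementation):

```javascript
// Speculative sketch: map the fragment URL shown in the address bar
// to the traditional query-string URL that Live HTTP Headers shows
// being requested behind the scenes.
function fragmentToRequestUrl(addressBarUrl) {
  var hash = addressBarUrl.indexOf('#');
  if (hash === -1) return addressBarUrl; // already a traditional URL
  var base = addressBarUrl.slice(0, hash).replace(/\/$/, '');
  return base + '/search?' + addressBarUrl.slice(hash + 1);
}

fragmentToRequestUrl('http://www.google.com/#q=example');
// "http://www.google.com/search?q=example"
```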
|I see only one chance to stop Google: webmasters need to block the Google bot and actively promote other search engines. |
Why? To keep users from making use of an improved search tool that gives useful, relevant Web pages an advantage over useless or irrelevant Web pages for a given search string?
As a user, I think this is a great idea. Right now, if Joe User searches Google for something like "pc won't recognize usb widget," he'll get a hodgepodge of results that range from articles on the topic to casual mentions of "pc," "usb," and "widget" in a 100-message forum thread. There's no way for Joe to identify the helpful search results without visiting each of the listed pages, and the author of the helpful article on "What to Do When Your PC Won't Recognize Your USB Widget" is competing with a long list of less relevant or useful results. The scenario that Brett describes (mouse over the search results to see what's on the underlying pages) is a win-win-win situation for Google, for the site owners with the most relevant or useful results, and--above all--for Joe the previously frustrated user.
There's a lot less "win" for webmasters whose sites are monetized through advertising and so depend on an actual visitor for their revenues. However, some of that impact would also hit AdSense advertisers and hence AdWords clicks - so Google would have to think about that issue long and hard. Some of their "win" might also go down the drain.
|I can get these results for a site search, coop, but only if I start from the Google homepage (or [google.com...] |
OK, just did the same using "site" in the search and did indeed get the results. However, as you mentioned, do it from the advanced search page and it is a different story. For me at least.
|What I can't quite figure out is that when I request the AJAX-y URL, live HTTP headers records a request for the traditional URL with a question mark. Maybe Google are requesting the whole page via AJAX and then just switching the content, but that would be slightly strange. |
Lot of Google-juice drinkers in this thread.
So let's try again to see the LONG TERM impact of this situation.
Since no one on here, last I checked, works FOR Google.
|for Joe the previously frustrated user |
Joe User doesn't normally search for "What to Do When Your PC Won't Recognize Your USB Widget" etc.
That's "Joe Geek" (a very SMALL population of searchers)
Joe User searches for "XYZ song lyrics" "sports scores" "celebrity gossip"
Don't believe me? Check out the zeitgeist sometime.
At best, "Young Joe User" searches for "xbox gaming cheats" as that population's most "technical" term.
Ethics? who cares about ethics?
This is about MONEY and control of information.
The money all of us have built businesses for.
Here's a very EASY and probable scenario for this "test"
(beyond all the VERY short-sighted thinking about its "implementation"... who cares?!)
What do you do 1-2 years down the line, when everyone has, once again, gracefully kowtowed to Goog's demands; every website has sheepishly handed over all their info to Goog for accurate record-keeping, and
"oh by the way, we are now CHARGING for WMT and analytics.
"Cause it's too much of a drain on our servers to give away for free"
Geez, this is a BASIC scenario of what's going to happen.
Let alone more "conspiratorial" scenarios.
Who wants to bet AGAINST me on this scenario?!
I'm laying 3 to 1 odds...
[edited by: whitenight at 9:53 pm (utc) on Feb. 4, 2009]
This change may make many small niches with low-volume keywords unprofitable.
Traffic distribution vs. website rank is going to change. Entries will be more equally distributed between the listings. The top listings will lose their traffic share; for small niches, they may lose to the point where it isn't worth the effort anymore.
They could dramatically improve the user experience with some jQuery or AJAX.
How about an Images link that only requires you to mouse over it to see a dynamic 5x5 list of image results that "hover" in a chromeless window over the SERPs?
How about a Videos link that does the same? News results? Blog results? Maps?
And that's just some of the basic stuff.
|There's a lot less "win" for webmasters whose sites are monetized through advertising and so depend on an actual visitor for their revenues. |
It may not be good for the kind of Web site that counts on users arriving on a page with three AdSense units above the fold and exiting via the nearest PPC link. But in the long run, it should benefit sites that present an attractive, informative face to users by sending them a bigger share of search referrals.
In any case, Google Search's main concern has to be about the quality of its product and its user experience. It's wishful thinking to imagine that Google would--or should--keep the status quo intact just because progress might not benefit every Webmaster or SEO.
I was able to get it to work after clearing all cookies in FF.
The referrer itself is www.google.com. I don't use any analytics programs on my sites; I wrote my own code that, in the old way, would give me the data I was interested in: the query string, default language, and the page number on G search where the link was clicked. That, coupled with the IP (general geo location of the user), let me do some data digging and understand the patterns, which made it possible to deliver a more relevant site experience to my visitors and make my sites better for them. In a way, I kind of knew this was coming when they introduced the OQ (original query - which is what was used to suggest the NEW, Google-preferred, high-paying AdWords string) variable in the referrer. How is your long tail doing now?
The referrer is not missing; it is basically stripped.
ZETT, I don't think the average site owner would know what hit them, or why their analytics program stopped working. Blocking the GBOT? They will go to the web to search for an answer, and guess where?
If this sticks, they would actually be killing several birds with the same stone. SERP scraping is one (once implemented, no need to be blamed for sending the canned response to the webmaster). It will also encourage the individuals doing so to visit the site manually to get the content; in order to get to the site, they would go to g…com, generating impressions and clicks on the ads. Traffic to the main G site is what they are looking for.
The referrer is not missing; it is basically stripped. Oh, Google knows…
I am sure G does not give a flying -ask- about the analytics programs out there; that is their right. But a move like this from G pretty much screams in my face. The PR is starting to work. They even have the word "Child" on their home page. I see where this is going: Privacy, Security +++… I wonder if this has something to do with a recent press release from MySpace saying they have identified the predators. The only thing we know is that the data will be available to G, and that is what they're after, ALWAYS.
This is an incomplete train of thought, but I have things to do in a few minutes…
NO QS FOR YOU.
Quick Question to Toolbar/IE developers out there:
If the AJAX functionality is fixed from the browser-support point of view, does it enable the toolbar to capture the on-page events/data?
How are webmasters supposed to load dynamic content based on user intent now?
For example, we publish phone number #1 if users query X, and phone number #2 if users query Y. We determine the query using the referral string. It doesn't look like this will be possible much longer.
Any workaround for situations like this?
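One partial workaround - sketched here with made-up numbers and keyword buckets, purely as illustration - is to keep the referrer-based switch as a progressive enhancement and fall back to a default when the query string is gone:

```javascript
// Hypothetical example: pick a phone number from the referrer keyword
// when available, otherwise fall back to a default. Once the AJAX
// SERPs strip the query string, most visitors will hit the fallback,
// so the default has to stand on its own (or the choice has to move
// to dedicated landing pages / on-site behavior instead).
var PHONE_BY_KEYWORD = {
  'widget repair': '555-0101', // phone #1 for query X
  'widget parts':  '555-0202'  // phone #2 for query Y
};
var DEFAULT_PHONE = '555-0100';

function phoneForReferer(referer) {
  var match = /[?&]q=([^&#]*)/.exec(referer || '');
  if (match) {
    var query = decodeURIComponent(match[1].replace(/\+/g, ' ')).toLowerCase();
    for (var keyword in PHONE_BY_KEYWORD) {
      if (query.indexOf(keyword) !== -1) return PHONE_BY_KEYWORD[keyword];
    }
  }
  return DEFAULT_PHONE;
}

phoneForReferer('http://www.google.com/search?q=widget+repair+shop'); // "555-0101"
phoneForReferer('http://www.google.com/');                            // "555-0100"
```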
I'm finally seeing this after clearing cookies and restarting Firefox, but it's not at all what I expected. In fact, the average end user will never notice a difference. It's no faster, slicker, more attractive, or anything - just vanilla Google results.
In other words, there's clearly some motive here *other* than improving the user experience.