About 7 years ago, in the classic "3 characters from done [webmasterworld.com]" thread, we talked about the ramifications for SEO of search engines switching from GET to POSTed data. It never came to pass, though, because browser makers never allowed an unchecked form POST submission. Still, the hiding or obscuring of referral data is nothing new:
March 2002: [webmasterworld.com]
Both AOL and Overture routinely encode URLs and session strings. There are some meta engines and JavaScript-based forms on engines that already hide referral strings. Most of us have seen this effect from MSN Search already.
I honestly don't think G is shortsighted enough to do something like that system-wide for the sake of obfuscating SEO efforts. Websites would have little incentive to look to Google for traffic or to optimize for it. We wouldn't know what or how to optimize for keywords - optimization would be shots in the dark. Our only option would be to look for other big sources of traffic [webmasterworld.com].
There has always been an unwritten trust between websites and search engines: they could use our data as long as they sent us traffic. Referrerless traffic pretty much breaks that unwritten but implicitly agreed-upon trust and treats websites as commodities - just food for the engine.
On the other hand, we have to respect a website's - even Google's - opportunity to innovate. I think we have to see what G is doing with the Ajax before passing final judgment on it and its intentions. I doubt that it is Google's will to break log analyzers and keyword trackers with this test. I think that is a by-product of whatever Ajax implementation Google is currently testing.
On the third hand, it would probably drive a lot of webmasters to look to AdWords for traffic. hmmm
Notable benefits from G's POV would seem to include:
-- encouraging more time on Google.com (if you thought cached pages were huge ...),
-- cutting off, at least temporarily, the two-way flow of very valuable information,
-- opening up more revenue streams, should G choose to monetize kw data currently being distributed freely,
-- encouraging more webmasters to share information with G (in order to get information),
-- discouraging SEO,
-- encouraging webmasters to focus on "creating great content" rather than on "manipulating SERPs."
What's not to like?
Traditional URL: /search?hl=en&safe=off&q=example&start=10&sa=N
New AJAX-style URL: /#hl=en&safe=off&q=example&start=10&sa=N&fp=aaBBccDDeeF
A quick peek at the HTTP headers shows no redirection, so perhaps they are indeed using AJAX on the paging.
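Worth spelling out why this breaks logging: everything after the # (the fragment) never leaves the browser, so server logs and downstream referrers lose the query. A rough sketch of how a page reads it client-side - the parsing below is generic, not Google's actual code:

  // The traditional URL sends the query to the server:
  //   GET /search?hl=en&safe=off&q=example&start=10&sa=N HTTP/1.1
  // The new URL sends only "GET / HTTP/1.1" -- the fragment stays in the
  // browser and has to be read with script:
  var fragment = window.location.hash.substring(1); // drop the leading "#"
  var params = {};
  var pairs = fragment.split("&");
  for (var i = 0; i < pairs.length; i++) {
    var parts = pairs[i].split("=");
    params[parts[0]] = decodeURIComponent(parts[1] || "");
  }
  // params["q"] is now "example" -- visible to Google's own script,
  // invisible to the server log of any site the user clicks through to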
FOR THE USER: Easier searching, with fewer detours and better results.
FOR THE SITE OWNER: The chance to entice searchers with relevant content and effective page design--without having to obsess about being #1 on a SERP, worry about the snippet that gets served up with the page title, or feel at a disadvantage on Universal Search SERPs that include images, video clips, etc.
FOR THE SEO: An opportunity to move beyond the technical or the mechanical by consulting on page design and content presentation, which will become more important if users are able to preview pages in the search engine's results. Smart SEOs will also develop formulas and "what if" scenarios to help site owners make strategic design decisions, such as how to find the sweet spot between optimum ad placement and optimum clickthrough rates from SERPs.
Who will be hurt by this kind of change? Two groups:
- SEOs who are unable to adapt (nothing new about that--how many SEOs are still making a living with their AltaVista/InfoSeek SEO techniques?)
- Site owners whose content presentation doesn't interest or entice users, such as template-based sites with little content on the page or "made for AdSense" sites that have three AdSense units above the fold.
Just reading Brett's description of mousing over search results and seeing page previews makes today's Google SERPs seem old-fashioned and quaint. This is a change that needs to happen and will happen--the sooner, the better.
An opportunity to move beyond the technical or the mechanical by consulting on page design and content presentation...
An excellent segue toward more focus on A/B and multivariate testing for optimal element positioning.
No, I'm not using any special URL. Just allow JavaScript for the Google domain and visit their home page. Type in a search. I disabled cookies and cleared the browser cache and still get this option, so it is not reliant on cookies whatsoever. But that doesn't matter too much when you are allowing JavaScript to run back and forth. Information is sent across that channel, not via cookie. There is a JavaScript object being returned that has its src created on the fly. That object has certain properties with which I am not familiar, so I am assuming they are experimental interface variables.
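Something along these lines is what I mean by a src created on the fly; the endpoint path and callback name below are my placeholders, not anything out of Google's obfuscated file:

  // Sketch of the classic dynamic-script (JSONP-style) pattern: build a <script>
  // element at runtime so data flows back without a page load or a cookie.
  function fetchSuggestions(query) {
    var s = document.createElement("script");
    // "/complete/example" and "cb" are illustrative names only
    s.src = "/complete/example?q=" + encodeURIComponent(query) + "&cb=handleResults";
    document.getElementsByTagName("head")[0].appendChild(s);
  }
  // The returned script is expected to call this with the results
  function handleResults(data) {
    // ... update the suggestion dropdown with data ...
  }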
I pulled G's JavaScript exactly one month ago to the day (just checked my research logs) on a whim. Yes, the drill-down stuff you can watch as it happens (Firebug is great for this). In typing "example" I watched it chasing my keystrokes to retrieve and display results. I didn't realize the address bar was changing on pagination, though. I just pulled the code to check it, and sure enough, there is a function running a regular expression replace on the location.
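The effect looks roughly like this reconstruction - my own function name, since the real one is obfuscated:

  // On pagination, rewrite the start= value inside the fragment.
  // Changing only location.hash updates the address bar without a reload.
  function goToPage(start) {
    var hash = window.location.hash; // e.g. "#hl=en&safe=off&q=example&start=10"
    window.location.hash = hash.replace(/start=\d+/, "start=" + start);
  }
  goToPage(20); // address bar now reads ...&start=20, with no conventional page load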
Google is known to give out as little information as possible. And they deliberately choose which information they give out.
So it is rather unrealistic to think that they will give that information out to webmasters once they have the ability to control it.
If you want to get a glimpse into the future, just have a look at AdSense reports. There is hardly any valuable information in there, and critics are usually told to simply shut up: "You want Google's money, so accept the deal. You don't have a right to get that information. Accept the system 'as is', or do not use Google for monetization." Now, with the AJAX referrers, it's easy to see how this may translate to Google search: "You want Google's traffic, so accept the deal. You don't have a right to get that information. Accept the system 'as is', or do not use Google for getting traffic."
I see only one chance to stop Google: webmasters need to block the Google bot and actively promote other search engines. I know how hard this may be; it's certainly easier said than done, even for our tiny group of sites. But it needs to be done. TBH, I don't even want to start thinking about a time when Google IS the Internet.
Oh, that "drill down". Understood, and yes, I thought you meant Google Suggest. However, in Firebug, after the Suggest call is where I see the other AJAX src being created and retrieved during the same request. You have to sift through some chrome JS calls, but in the remaining (obfuscated filename) calls you will find the XMLHttpRequest code and the changes to window.location being discussed here.
We seem to have bridged into another discussion regarding the other "experiment" going on. I think you may find that stuff in the accessibility area of G Labs ... [google.com...]
That means this is more than a simple one-shot test; millions of people are getting it.
I encountered it in Japan.
Its HTML code was totally different from that of the normal SERP. There were no URLs found in the code, though they showed on the SERP. That means rank-checking software couldn't retrieve the ranking data.
What I can't quite figure out is that when I request the AJAX-y URL, Live HTTP Headers records a request for the traditional URL with a question mark. Maybe Google are requesting the whole page via AJAX and then just switching the content, but that would be slightly strange.
I see only one chance to stop Google: webmasters need to block the Google bot and actively promote other search engines.
Why? To keep users from making use of an improved search tool that gives useful, relevant Web pages an advantage over useless or irrelevant Web pages for a given search string?
As a user, I think this is a great idea. Right now, if Joe User searches Google for something like "pc won't recognize usb widget," he'll get a hodgepodge of results that range from articles on the topic to casual mentions of "pc," "usb," and "widget" in a 100-message forum thread. There's no way for Joe to identify the helpful search results without visiting each of the listed pages, and the author of the helpful article on "What to Do When Your PC Won't Recognize Your USB Widget" is competing with a long list of less relevant or useful results. The scenario that Brett describes (mouse over the search results to see what's on the underlying pages) is a win-win-win situation for Google, for the site owners with the most relevant or useful results, and--above all--for Joe the previously frustrated user.
I can get these results for a site search, coop, but only if I start from the Google homepage (or [google.com...]). OK, just did the same using "site" in the search and did indeed get the results. However, as you mentioned, do it from the advanced search page and it is a different story. For me, at least.
What I can't quite figure out is that when I request the AJAX-y URL, Live HTTP Headers records a request for the traditional URL with a question mark. Maybe Google are requesting the whole page via AJAX and then just switching the content, but that would be slightly strange.

I assume you are referring to the navigation links, like the "Next" link at the base of the page? Yeah, it's special. Have another look at the cache headers coming down with it and my notes earlier on the JavaScript.
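If that's what is happening, the "Next" handler would be doing something like this sketch - the function name and element id are my placeholders, not from Google's code:

  // Hypothetical sketch: fetch the traditional ?-URL in the background and
  // swap the results into the current page, updating only the fragment.
  function loadResults(queryString) {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/search?" + queryString, true); // the request Live HTTP Headers records
    xhr.onreadystatechange = function () {
      if (xhr.readyState === 4 && xhr.status === 200) {
        // "res" is a placeholder id; in reality only the results block would be extracted
        document.getElementById("res").innerHTML = xhr.responseText;
        window.location.hash = queryString; // address bar changes, no reload
      }
    };
    xhr.send(null);
  }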
for Joe the previously frustrated user
Joe User doesn't normally search for "What to Do When Your PC Won't Recognize Your USB Widget" etc.
That's "Joe Geek" (a very SMALL population of searchers)
Joe User searches for "XYZ song lyrics" "sports scores" "celebrity gossip"
Don't believe me? Check out the zeitgeist sometime.
At best, "Young Joe User" searches for "xbox gaming cheats" as that population's most "technical" term.
...ethics...
This is about MONEY and control of information.
The money all of us have built businesses for.
Here's a very EASY and probable scenario for this "test"
(beyond all the VERY short-sighted thinking about its "implementation"... who cares?!)
What do you do 1-2 years down the line, when everyone has, once again, gracefully kowtowed to Goog's demands; every website has sheepishly given over all their info to Goog for accurate record-keeping, and...
"Oh, by the way, we are now CHARGING for WMT and analytics,
'cause it's too much of a drain on our servers to give them away for free."
Geez, this is a BASIC scenario of what's going to happen.
Let alone more "conspiratorial" scenarios.
Who wants to bet AGAINST me on this scenario?!
I'm laying 3 to 1 odds...
Traffic distribution vs. website rank is going to change. Traffic will be more equally distributed among the listings. The top listings will lose their share; for small niches they may lose to the point where it isn't worth the effort anymore.
How about an Images link that only requires you to mouse over it to see a dynamic 5x5 list of image results that "hover" in a chromeless window over the SERPs?
How about a Videos link that does the same? News results? Blog results? Maps?
And that's just some of the basic stuff.
There's a lot less "win" for webmasters whose sites are monetized through advertising and so depend on an actual visitor for their revenues.
It may not be good for the kind of Web site that counts on users arriving on a page with three AdSense units above the fold and exiting via the nearest PPC link. But in the long run, it should benefit sites that present an attractive, informative face to users by sending them a bigger share of search referrals.
In any case, Google Search's main concern has to be about the quality of its product and its user experience. It's wishful thinking to imagine that Google would--or should--keep the status quo intact just because progress might not benefit every Webmaster or SEO.
The referrer itself is www.google.com. I don't use any analytics programs on my sites. I wrote my own code that, in the old way, would give me the data I was interested in: the query string (QS), the default language, and the page number on Google search where the link was clicked. Coupled with the IP (general geo location of the user), I was able to do some data digging and understand the patterns, which made it possible to deliver a more relevant site experience to my visitors. In a way, I kind of knew this was coming when they introduced the OQ (original query - the one used to suggest the new, Google-preferred, high-paying AdWords string) variable in the referrer. How is your long tail doing now?
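For anyone curious, the kind of home-grown parsing I mean looks roughly like this; q, hl and start are the parameters the old Google referrer actually carried, everything else here is illustrative:

  // Minimal sketch of keyword/language/page extraction from an old-style
  // referrer, e.g. "http://www.google.com/search?hl=en&q=blue+widgets&start=10"
  function parseReferrer(ref) {
    var q = ref.match(/[?&]q=([^&]*)/);       // the query string (QS)
    var lang = ref.match(/[?&]hl=([^&]*)/);   // interface language
    var start = ref.match(/[?&]start=(\d+)/); // 10 results per page, so start=10 is page 2
    return {
      query: q ? decodeURIComponent(q[1].replace(/\+/g, " ")) : null,
      language: lang ? lang[1] : null,
      page: start ? Math.floor(parseInt(start[1], 10) / 10) + 1 : 1
    };
  }
  // With the AJAX SERPs the referrer is just "http://www.google.com/" --
  // every match above fails and all of this data goes dark.
  var visit = parseReferrer(document.referrer);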
The referrer is not missing; it is basically stripped.
ZETT, I don't think the average site owner would know what hit them, or why their analytics program stopped working. Blocking the GBOT? They will go to the web to search for the answer, and guess where?
If this sticks, they would actually be killing several birds with one stone. SERP scraping is one (once implemented, no need to be blamed for sending a canned response to the webmaster). It would also force the individuals doing so to visit the site manually to get the content, and to get to the site they would go through g…com, generating impressions and clicks on the ads. Traffic to the main G site is what they are looking for.
The referrer is not missing; it is basically stripped. Oh, Google knows…
I am sure G does not give the flying –ask- about the analytics programs out there; that is their right. But a move like this from G pretty much screams in my face. The PR is starting to work. They even have the word "Child" on their home page. I see where this is going. Privacy, Security +++… I wonder if this has something to do with a recent press release from MySpace saying they have identified the predators. The only thing we know is that the data will be available to G, and that is what they're after, ALWAYS.
This is an incomplete train of thought, but I have things to do in a few minutes…
NO QS FOR YOU.
Blend27
For example, we publish phone number #1 if users query X, and phone number #2 if users query Y. We determine the query using the referral string. It doesn't look like this will be possible much longer.
Any workaround for situations like this?
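For reference, the technique that breaks looks something like this; the phone numbers, keyword tests, and element id are placeholders for the X/Y example above:

  // Sketch of query-based content switching driven by the referral string.
  function pickPhoneNumber(ref) {
    var m = ref.match(/[?&]q=([^&]*)/);
    var q = m ? decodeURIComponent(m[1].replace(/\+/g, " ")).toLowerCase() : "";
    if (q.indexOf("x") !== -1) { return "555-0101"; } // users who queried X
    if (q.indexOf("y") !== -1) { return "555-0102"; } // users who queried Y
    return "555-0100"; // default -- all you can serve once the query is stripped
  }
  // "phone" is a placeholder element id on the landing page
  document.getElementById("phone").innerHTML = pickPhoneNumber(document.referrer);

Once the referrer is cut down to a bare www.google.com, the match fails and every visitor gets the default number.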