
Google Announces Page PreFetching (again) - beta

     

johnmoose

3:23 pm on Aug 2, 2011 (gmt 0)

5+ Year Member




System: The following 4 messages were cut out of thread at: http://www.webmasterworld.com/google/4326046.htm [webmasterworld.com] by engine - 3:17 pm on Aug 3, 2011 (utc +1)


< moved from another location >

Now this is going to create fake hits on web servers without any real visitors..

[chrome.blogspot.com]

Me not so happy...

Leosghost

1:26 am on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member leosghost is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



< moved from another location >

Automatic updates {of stable version of the Chrome 13 browser, with Instant Pages built in} will roll it out to all Chrome users.. interesting to see what it does to stats and serps..

story and links here
[theregister.co.uk...]

Google has released a new stable version of its Chrome browser, adding an "Instant Pages" service that attempts to accelerate your Google searches by rendering pages before you actually click on them.

[edited by: Robert_Charlton at 8:27 am (utc) on Aug 3, 2011]

Robert Charlton

7:29 am on Aug 3, 2011 (gmt 0)

WebmasterWorld Administrator robert_charlton is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



From the article....
Asked at a press event earlier this summer how often Instant Pages chose the wrong link, Google Fellow Amit Singhal did not say. But there will surely be many cases where the service renders a page that you don't click on, and this could skew traffic numbers for sites across the web. In order to account for this fake traffic, webmasters must tap into a new Page Visibility API [code.google.com] that Google has submitted to the W3C as a standard.


Using the Page Visibility API
Google Chrome (Labs)
http://code.google.com/chrome/whitepapers/pagevisibility.html [code.google.com]
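For what it's worth, a minimal sketch of the idea from that whitepaper - deferring a page-view count until the page is actually shown. The countWhenVisible wrapper and fireBeacon callback are hypothetical names for illustration; Chrome 13 shipped the API vendor-prefixed (webkitHidden / webkitvisibilitychange), so the code probes for both the prefixed and the proposed unprefixed names:

```javascript
// Sketch: only count a page view once the page is actually visible.
// Prerendered pages start hidden, so the beacon waits for the
// visibilitychange event instead of firing on load.
function countWhenVisible(doc, fireBeacon) {
  // Feature-detect the unprefixed draft name, fall back to webkit's
  var hidden = doc.hidden !== undefined ? 'hidden' : 'webkitHidden';
  var change = hidden === 'hidden' ? 'visibilitychange'
                                   : 'webkitvisibilitychange';

  if (!doc[hidden]) {
    fireBeacon(); // normal visit: count immediately
    return;
  }

  // Prerendered (or otherwise hidden) page: wait until it is shown
  doc.addEventListener(change, function onShow() {
    if (!doc[hidden]) {
      doc.removeEventListener(change, onShow);
      fireBeacon(); // count only the real visit
    }
  });
}
```

On a prerendered page the count fires only once the tab is actually shown; a normal visit counts immediately. As Fotiman notes further down, this only helps with client-side counting - the server has already served the page either way.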

Brett_Tabke

2:12 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Then why don't they serve the page from their own Ripped copy of the page?

Davidcjmad1

3:26 pm on Aug 3, 2011 (gmt 0)



It will be interesting to see how this ties in with google analytics..

StoutFiles

3:32 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



It will be interesting to see how this ties in with google analytics..


It likely won't. As witnessed many times before, one part of Google has no idea what the other parts are up to.

Leosghost

3:39 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member leosghost is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Then why don't they serve the page from their own Ripped copy of the page?

Probably because that would get them into trouble with their use of the "fair use" argument and use of "cache" ..they always were ( in the opinion of many of us ) well over the line with serving their "cache" ..maybe their lawyers said that the judges might consider serving from their "rips" would definitely contravene "fair use" ..and in the case of pages hosting illegal material ..get them into big trouble ..?

Either way..how it is set up now..it is the site owner and the end user who are actually paying for this "Google feature"..both are paying the bandwidth or the data charges ..and for a "benefit" that we only have Google's word the visitor actually wants ..most pages ( unless they are running Google's own slow analytics codes etc ) load fast enough for everyone.

Fotiman

3:46 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member fotiman is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month




In order to account for this fake traffic, webmasters must tap into a new Page Visibility API [code.google.com] that Google has submitted to the W3C as a standard.

This really doesn't make any sense at all.

The Visibility API is code that runs client side, AFTER the page has already been served! By that time, it's too late to do anything about it. Yes, it can do things like minimize the effect of pages that make AJAX requests to the server, but it really doesn't address the issue of skewed traffic numbers from pages that are pre-rendered.

Alcoholico

4:01 pm on Aug 3, 2011 (gmt 0)

5+ Year Member



RewriteCond %{HTTP_USER_AGENT} ^.*Chrome*$ 
RewriteRule ^/* http://www.example.com/sorry.html [L]

Fotiman

4:13 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member fotiman is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



@Alcoholico, that would prevent everyone who uses Chrome from viewing your pages, not just the pre-rendering. That would be a terrible solution.

Pfui

4:43 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Hmm. Cached "Google Web Preview" files serve double-duty as "Google Instant Pages"? Makes sense. Unfortunately.

Pfui

4:57 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



P.S./FWIW

On my just-updated Mac Chrome 13, the following (Preferences/ Under the Hood) setting was 'enabled for me' by default:

Predict network actions to improve page load performance

Your checkmarks may vary: "Prerender is currently enabled for many users of Chrome 13 by default. However..." [code.google.com...]

rlange

5:09 pm on Aug 3, 2011 (gmt 0)



Fotiman wrote:
This really doesn't make any sense at all.

The Visibility API is code that runs client side, AFTER the page has already been served! By that time, it's too late to do anything about it. Yes, can do things like minimize the effect of pages that make AJAX requests to the server, but it really doesn't address the issue of skewed traffic numbers from pages that are pre-rendered.

It can also be used to prevent JavaScript-based tracking like Google Analytics. However, unless Chrome sends a different user agent string when prefetching, log-based analytics will still be inflated.

--
Ryan

Hope_Fowl

6:32 pm on Aug 3, 2011 (gmt 0)

5+ Year Member



"Welcome. Your Chrome browser might have pre-fetched this web page before you decided to visit us. Please click HERE to continue to the actual page, now that you are ready to enjoy our delightful content."

johnmoose

6:32 pm on Aug 3, 2011 (gmt 0)

5+ Year Member



@pfui: I checked my setting (enable instant on the basics page) and it was off. Must say that I upgraded from a previous version too. Can't tell what a fresh install would be on.

Chrispcritters

6:42 pm on Aug 3, 2011 (gmt 0)

5+ Year Member



Anyone with a #1 SERPS result seeing any influx of Chrome traffic?

Does the pre-fetch include execution of JavaScript?

Leosghost

6:44 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member leosghost is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Does the pre-fetch include execution of JavaScript?

Yes.. according to all the info available from Google and elsewhere to date...hence their "requirement" for their "api" ..if you want to make any attempt to make sense of the dog's breakfast that your analytics /stats and adsense etc are going to be in ...or ..work on other ways of crunching your raw logs to account for "prefetch" screwing with them..and if you are on limited bandwidth hosting deals ..watch your logs..because you might get hit for "overages"..on sites which have a lot of "heavy pages"..

Likewise your data plans on phones.

[edited by: Leosghost at 6:52 pm (utc) on Aug 3, 2011]

rlange

6:51 pm on Aug 3, 2011 (gmt 0)



johnmoose wrote:
@pfui: I checked my setting (enable instant on the basics page) and it was off. Must say that I upgraded from a previous version too. Can't tell what a fresh install would be on.

I'm pretty sure that setting is for the "instant search results" feature, not the prefetching feature. The prefetching option is located in Under The Hood and is, I believe, "Predict network actions to improve page load performance".

Google needs to get their feature naming sorted out...

--
Ryan

johnmoose

7:16 pm on Aug 3, 2011 (gmt 0)

5+ Year Member



@rlange: yep. It is. I tried it and switched it off again. I do not want websites flashing in my screen while I am typing my search query. Which could be long tail.
I like to see the normal search results while typing my query, not every #1 serp website for any half-baked queries my fingers type.

As far as I am concerned, I will never turn it on again.

John

Chrispcritters

7:25 pm on Aug 3, 2011 (gmt 0)

5+ Year Member



v13 accounted for only 3% of Chrome traffic in the past 30 days. It's at 14% of Chrome traffic today. I guess I'll have to wait a few more days for a higher level of adoption to see how much additional traffic load there is.

Davidcjmad1

8:26 pm on Aug 3, 2011 (gmt 0)



" Then why don't they serve the page from their own Ripped copy of the page? "

Dynamic content.

lucy24

9:20 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month



RewriteCond %{HTTP_USER_AGENT} ^.*Chrome*$
RewriteRule ^/* http://www.example.com/sorry.html [L]

A simple . would do for the pattern. As written, it just says "any request that either does or does not begin with a slash". Same for the user agent; all you need is Chrome without anchors. As written, it means "any User Agent that ends in 'Chrome' or 'Chrom'" (would that be the browser's German name?). Sorry. Just came from the Apache forum.

that would prevent everyone who uses Chrome from viewing your pages, not just the pre-rendering

And your point is...?
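Putting those corrections together, the two rules would tighten to something like this - still a block-all-Chrome rule, with example.com standing in for a real hostname:

```apache
# "Chrome" anywhere in the UA string is enough; no wildcards or anchors needed
RewriteCond %{HTTP_USER_AGENT} Chrome
# "." matches any non-empty request path
RewriteRule . http://www.example.com/sorry.html [R,L]
```

If sorry.html lived on the same host, an extra RewriteCond excluding it would be needed to avoid a redirect loop.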

Sgt_Kickaxe

9:35 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member sgt_kickaxe is a WebmasterWorld Top Contributor of All Time 5+ Year Member



This is showing up in analytics as direct traffic and Google as the source. My charts are more accurate without this data, so a way to set this data aside or give it its own "prefetched" heading is a must. Prefetch needs some work.

My charts also show me some odd behavior - my site is getting the same total traffic give or take a small % but as the direct traffic from Google increases the search traffic decreases. It's almost as if I am capped within a range and prefetched is costing me natural visitors, could that be?

Google - please respect the noarchive meta tag and do not prefetch pages that have it. thank you!

edit:
First, we’ve added some awesome to the omnibox by suggesting partial matches for URLs and page titles from your browsing history.

That's in the blog post?! - Google get the heck out of my browsing history!

[edited by: Sgt_Kickaxe at 9:45 pm (utc) on Aug 3, 2011]

Leosghost

9:41 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member leosghost is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



My charts are more accurate without this data so a way to set this data aside or give it it's own "prefetched" heading is a must.

Agree totally ..and not using something from G to do the "triage"

There has been a lot of speculation about capping and shaping over the years ..and some fairly convincing evidence that G may well have been doing this before ..if that was the case I don't suppose they would stop now..it may just become more apparent.

Google - please respect the noarchive meta tag and do not prefetch pages that have it. thank you!

Have you seen them prefetch "noarchive" pages yet? ..I think we would all hope that they respect "noarchive" when prefetching ..

[edited by: Leosghost at 9:45 pm (utc) on Aug 3, 2011]

Sgt_Kickaxe

9:44 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member sgt_kickaxe is a WebmasterWorld Top Contributor of All Time 5+ Year Member



if that was the case I don't suppose they would stop now..it may just become more apparent.

It is evident and probably necessary. Imagine a search engine that gives me more traffic quickly after I make a positive change? That type of instant feedback, which is based on merit, would spawn countless ebooks titled "I TRIPLED MY SEARCH TRAFFIC BY DOING THIS".

Can't have that :-)

Leosghost

9:47 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member leosghost is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Damn ..there goes what was to have been my next ebook ! ;-)

memories of Altavista come flooding back

Davidcjmad1

9:58 pm on Aug 3, 2011 (gmt 0)



"My charts also show me some odd behavior - my site is getting the same total traffic give or take a small % but as the direct traffic from Google increases the search traffic decreases. It's almost as if I am capped within a range and prefetched is costing me natural visitors, could that be ? "

Decrease as a percentage or in average number of visits? It could be that Chrome is prefetching a lot of searches that never turn into visits?

dstiles

9:59 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member dstiles is a WebmasterWorld Top Contributor of All Time 5+ Year Member



Does anyone have real information that the same chrome UA is used for both prefetch AND fetch?

Is the prefetch header X-MOZ set for prefetches? If it's not then google is mis-using the web protocol. Again!
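If some such header is set, it can at least be split out at the server. A sketch for Apache, assuming the Firefox-style X-moz: prefetch header - whether Chrome's prerenderer sends anything comparable is exactly the open question here:

```apache
# Flag requests that declare themselves as prefetches
SetEnvIf X-moz prefetch is_prefetch
# Keep them out of the main access log so the stats stay honest
CustomLog logs/prefetch_log combined env=is_prefetch
CustomLog logs/access_log combined env=!is_prefetch
```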

loner

10:36 pm on Aug 3, 2011 (gmt 0)

5+ Year Member



"Then why don't they serve the page from their own Ripped copy of the page?"

Coming soon!

Fotiman

11:48 pm on Aug 3, 2011 (gmt 0)

WebmasterWorld Senior Member fotiman is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month




And your point is...?

I certainly wouldn't want to shut out the current #2 browser from my website, and if I encountered sites that were shutting me out, I would stop visiting them.