Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 61 message thread spans 3 pages.
Google Announces Page PreFetching (again) - beta
johnmoose
msg:4346707 - 3:23 pm on Aug 2, 2011 (gmt 0)


System: The following 4 messages were cut out of thread at: http://www.webmasterworld.com/google/4326046.htm [webmasterworld.com] by engine - 3:17 pm on Aug 3, 2011 (utc +1)


< moved from another location >

Now this is going to create fake hits on web servers without any real visitors..

[chrome.blogspot.com]

Me not so happy...

 

Leosghost
msg:4346927 - 1:26 am on Aug 3, 2011 (gmt 0)

< moved from another location >

Automatic updates {of stable version of the Chrome 13 browser, with Instant Pages built in} will roll it out to all Chrome users.. interesting to see what it does to stats and serps..

story and links here
[theregister.co.uk...]

Google has released a new stable version of its Chrome browser, adding an "Instant Pages" service that attempts to accelerate your Google searches by rendering pages before you actually click on them.

[edited by: Robert_Charlton at 8:27 am (utc) on Aug 3, 2011]

Robert Charlton
msg:4347023 - 7:29 am on Aug 3, 2011 (gmt 0)

From the article....
Asked at press event earlier this summer how often Instant Pages chose the wrong link, Google Fellow Amit Singhal did not say. But there will surely be many cases where the service renders a page that you don't click on, and this could skew traffic numbers for sites across the web. In order to account for this fake traffic, webmasters must tap into a new Page Visibility API [code.google.com] that Google has submitted to the W3C as a standard.


Using the Page Visibility API
Google Chrome (Labs)
http://code.google.com/chrome/whitepapers/pagevisibility.html [code.google.com]
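As a rough sketch of how that API could be used to avoid counting prerendered views, assuming the webkit-prefixed names from the 2011 whitepaper (`document.webkitHidden`, the `webkitvisibilitychange` event); `trackWhenVisible` and `fire` are hypothetical names, not a Google API:

```javascript
// Sketch only: gate an analytics beacon on page visibility, so a
// prerendered (hidden) page is not counted until the user actually
// lands on it. Uses the webkit-prefixed Page Visibility API.
function trackWhenVisible(doc, fire) {
  if (!doc.webkitHidden) {
    // Page is actually visible (a normal visit): count it now.
    fire();
    return;
  }
  // Page was prerendered and is still hidden: wait until it becomes
  // visible before counting the visit.
  doc.addEventListener("webkitvisibilitychange", function handler() {
    if (!doc.webkitHidden) {
      doc.removeEventListener("webkitvisibilitychange", handler);
      fire();
    }
  });
}
```

In a real page this would wrap the analytics snippet, e.g. `trackWhenVisible(document, sendBeacon)`; later browsers expose the unprefixed `document.hidden` and `visibilitychange` instead.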

Brett_Tabke
msg:4347169 - 2:12 pm on Aug 3, 2011 (gmt 0)

Then why don't they serve the page from their own Ripped copy of the page?

Davidcjmad1
msg:4347215 - 3:26 pm on Aug 3, 2011 (gmt 0)

It will be interesting to see how this ties in with google analytics..

StoutFiles
msg:4347218 - 3:32 pm on Aug 3, 2011 (gmt 0)

It will be interesting to see how this ties in with google analytics..


It likely won't. As witnessed many times before, one part of Google has no idea what the other parts are up to.

Leosghost
msg:4347222 - 3:39 pm on Aug 3, 2011 (gmt 0)

Then why don't they serve the page from their own Ripped copy of the page?

Probably because that would get them into trouble with their "fair use" argument and their use of "cache"... they always were (in the opinion of many of us) well over the line with serving their "cache"... maybe their lawyers said that judges might consider serving from their "rips" a definite contravention of "fair use", and in the case of pages hosting illegal material, something that could get them into big trouble?

Either way, as it is set up now, the site owner and the end user are the ones actually paying for this "Google feature". Both are paying the bandwidth or the data charges, for a "benefit" that we only have Google's word the visitor actually wants. Most pages (unless they are running Google's own slow analytics code etc.) load fast enough for everyone.

Fotiman
msg:4347224 - 3:46 pm on Aug 3, 2011 (gmt 0)


In order to account for this fake traffic, webmasters must tap into a new Page Visibility API [code.google.com] that Google has submitted to the W3C as a standard.

This really doesn't make any sense at all.

The Visibility API is code that runs client side, AFTER the page has already been served! By that time, it's too late to do anything about it. Yes, you can do things like minimize the effect of pages that make AJAX requests to the server, but it really doesn't address the issue of skewed traffic numbers from pages that are pre-rendered.

Alcoholico
msg:4347234 - 4:01 pm on Aug 3, 2011 (gmt 0)

RewriteCond %{HTTP_USER_AGENT} ^.*Chrome*$
RewriteRule ^/* http://www.example.com/sorry.html [L]

Fotiman
msg:4347237 - 4:13 pm on Aug 3, 2011 (gmt 0)

@Alcoholico, that would prevent everyone who uses Chrome from viewing your pages, not just the pre-rendering. That would be a terrible solution.

Pfui
msg:4347248 - 4:43 pm on Aug 3, 2011 (gmt 0)

Hmm. Cached "Google Web Preview" files serve double-duty as "Google Instant Pages"? Makes sense. Unfortunately.

Pfui
msg:4347255 - 4:57 pm on Aug 3, 2011 (gmt 0)

P.S./FWIW

On my just-updated Mac Chrome 13, the following (Preferences/ Under the Hood) setting was 'enabled for me' by default:

Predict network actions to improve page load performance

Your checkmarks may vary: "Prerender is currently enabled for many users of Chrome 13 by default. However..." [code.google.com...]

rlange
msg:4347259 - 5:09 pm on Aug 3, 2011 (gmt 0)

Fotiman wrote:
This really doesn't make any sense at all.

The Visibility API is code that runs client side, AFTER the page has already been served! By that time, it's too late to do anything about it. Yes, can do things like minimize the effect of pages that make AJAX requests to the server, but it really doesn't address the issue of skewed traffic numbers from pages that are pre-rendered.

It can also be used to prevent JavaScript-based tracking like Google Analytics. However, unless Chrome sends a different user agent string when prefetching, log-based analytics will still be inflated.

--
Ryan

Hope_Fowl
msg:4347311 - 6:32 pm on Aug 3, 2011 (gmt 0)

"Welcome. Your Chrome browser might have pre-fetched this web page before you decided to visit us. Please click HERE to continue to the actual page, now that you are ready to enjoy our delightful content."

johnmoose
msg:4347312 - 6:32 pm on Aug 3, 2011 (gmt 0)

@Pfui: I checked my setting (enable Instant, on the Basics page) and it was off. I should say that I upgraded from a previous version too, so I can't tell what a fresh install's default would be.

Chrispcritters
msg:4347322 - 6:42 pm on Aug 3, 2011 (gmt 0)

Anyone with a #1 SERPS result seeing any influx of Chrome traffic?

Does the pre-fetch include execution of JavaScript?

Leosghost
msg:4347324 - 6:44 pm on Aug 3, 2011 (gmt 0)

Does the pre-fetch include execution of JavaScript?

Yes, according to all the info available from Google and elsewhere to date... hence their "requirement" for their "API", if you want to make any sense of the dog's breakfast that your analytics/stats and AdSense etc. are going to be in... or work on other ways of crunching your raw logs to account for "prefetch" screwing with them. And if you are on a limited-bandwidth hosting deal, watch your logs, because you might get hit for "overages" on sites which have a lot of "heavy" pages.

Likewise your data plans on phones.

[edited by: Leosghost at 6:52 pm (utc) on Aug 3, 2011]

rlange
msg:4347329 - 6:51 pm on Aug 3, 2011 (gmt 0)

johnmoose wrote:
@pfui: I checked my setting (enable instant on the basics page) and it was off. Must say that I upgraded from a previous version too. Can't tell what a fresh install would be on.

I'm pretty sure that setting is for the "instant search results" feature, not the prefetching feature. The prefetching option is located in Under The Hood and is, I believe, "Predict network actions to improve page load performance".

Google needs to get their feature naming sorted out...

--
Ryan

johnmoose
msg:4347345 - 7:16 pm on Aug 3, 2011 (gmt 0)

@rlange: yep, it is. I tried it and switched it off again. I do not want websites flashing onto my screen while I am typing my search query, which could be long tail.
I like to see the normal search results while typing my query, not the #1 SERP website for every half-baked query my fingers type.

As far as I am concerned, I will never turn it on again.

John

Chrispcritters
msg:4347348 - 7:25 pm on Aug 3, 2011 (gmt 0)

v13 accounted for only 3% of Chrome traffic in the past 30 days. It's at 14% of Chrome traffic today. I guess I'll have to wait a few more days for a higher level of adoption to see how much additional traffic load there is.

Davidcjmad1
msg:4347375 - 8:26 pm on Aug 3, 2011 (gmt 0)

" Then why don't they serve the page from their own Ripped copy of the page? "

Dynamic content.

lucy24
msg:4347395 - 9:20 pm on Aug 3, 2011 (gmt 0)

RewriteCond %{HTTP_USER_AGENT} ^.*Chrome*$
RewriteRule ^/* http://www.example.com/sorry.html [L]

A simple . would do for the pattern. As written, it just says "any request that either does or does not begin with a slash". Same for the user agent; all you need is Chrome without anchors. As written, it means "any User Agent that ends in 'Chrome' or 'Chrom'" (would that be the browser's German name?). Sorry. Just came from the Apache forum.
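Applied literally, a minimal corrected version — which, as written, still redirects every Chrome request rather than just prefetches — would be:

```
RewriteCond %{HTTP_USER_AGENT} Chrome
RewriteRule . http://www.example.com/sorry.html [L]
```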

that would prevent everyone who uses Chrome from viewing your pages, not just the pre-rendering

And your point is...?

Sgt_Kickaxe
msg:4347402 - 9:35 pm on Aug 3, 2011 (gmt 0)

This is showing up in analytics as direct traffic with Google as the source. My charts are more accurate without this data, so a way to set this data aside or give it its own "prefetched" heading is a must. Prefetch needs some work.

My charts also show me some odd behavior: my site is getting the same total traffic, give or take a small %, but as the direct traffic from Google increases, the search traffic decreases. It's almost as if I am capped within a range and prefetch is costing me natural visitors. Could that be?

Google - please respect the noarchive meta tag and do not prefetch pages that have it. thank you!

edit:
First, we’ve added some awesome to the omnibox by suggesting partial matches for URLs and page titles from your browsing history.

That's in the blog post?! - Google get the heck out of my browsing history!

[edited by: Sgt_Kickaxe at 9:45 pm (utc) on Aug 3, 2011]

Leosghost
msg:4347405 - 9:41 pm on Aug 3, 2011 (gmt 0)

My charts are more accurate without this data so a way to set this data aside or give it it's own "prefetched" heading is a must.

Agree totally... and not by using something from G to do the "triage".

There has been a lot of speculation about capping and shaping over the years, and some fairly convincing evidence that G may well have been doing this before. If that was the case, I don't suppose they would stop now; it may just become more apparent.

Google - please respect the noarchive meta tag and do not prefetch pages that have it. thank you!

Have you seen them prefetch "noarchive" pages yet? I think we would all hope that they respect "noarchive" when prefetching...

[edited by: Leosghost at 9:45 pm (utc) on Aug 3, 2011]

Sgt_Kickaxe
msg:4347407 - 9:44 pm on Aug 3, 2011 (gmt 0)

if that was the case I don't suppose they would stop now..it may just become more apparent.

It is evident and probably necessary. Imagine a search engine that gives me more traffic quickly after I make a positive change. That type of instant feedback, based on merit, would spawn countless ebooks titled "I TRIPLED MY SEARCH TRAFFIC BY DOING THIS".

Can't have that :-)

Leosghost
msg:4347409 - 9:47 pm on Aug 3, 2011 (gmt 0)

Damn ..there goes what was to have been my next ebook ! ;-)

memories of Altavista come flooding back

Davidcjmad1
msg:4347418 - 9:58 pm on Aug 3, 2011 (gmt 0)

"My charts also show me some odd behavior - my site is getting the same total traffic give or take a small % but as the direct traffic from Google increases the search traffic decreases. It's almost as if I am capped within a range and prefetched is costing me natural visitors, could that be ? "

A decrease as a percentage, or in average number of visits? It could be that Chrome is prefetching a lot of searches that never turn into visits.

dstiles
msg:4347422 - 9:59 pm on Aug 3, 2011 (gmt 0)

Does anyone have real information that the same chrome UA is used for both prefetch AND fetch?

Is the X-MOZ prefetch header set for these prefetches? If it's not, then Google is misusing the web protocol. Again!
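For comparison, Firefox's link prefetch does send an "X-moz: prefetch" request header, and a server could refuse such flagged requests with a sketch like the one below; whether Chrome's prerender sends any equivalent header is exactly the open question here.

```
# Refuse any request flagged as a prefetch via the X-Moz header
RewriteCond %{HTTP:X-Moz} prefetch [NC]
RewriteRule . - [F]
```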

loner
msg:4347440 - 10:36 pm on Aug 3, 2011 (gmt 0)

"Then why don't they serve the page from their own Ripped copy of the page?"

Coming soon!

Fotiman
msg:4347455 - 11:48 pm on Aug 3, 2011 (gmt 0)


And your point is...?

I certainly wouldn't want to shut out the current #2 browser from my website, and if I encountered sites that were shutting me out, I would stop visiting them.


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved