System Requirements
Operating System: Win XP or Win 2000 SP3+
Browser: IE 5.5+ or Firefox 1.0+
Availability: For users in North America and Europe (during beta testing phase)
Press Release:
Google Web Accelerator significantly reduces the time that it takes broadband users to download and view web pages. The Google Web Accelerator appears as a small speedometer in the browser chrome with a cumulative "Time saved" indicator.
Here's how it works. A user downloads and installs the client and begins browsing the web as she normally would. In the background, the Google Web Accelerator employs a number of techniques to speed up the delivery of content to users:
# Sending your page requests through Google machines dedicated to handling Google Web Accelerator traffic.
# Storing copies of frequently looked at pages to make them quickly accessible.
# Downloading only the updates if a web page has changed slightly since you last viewed it.
# Prefetching certain pages onto your computer in advance.
# Managing your Internet connection to reduce delays.
# Compressing data before sending it to your computer.
Looks like some of the Mozilla hires are paying dividends.
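The "storing copies" and "downloading only the updates" points lean on plain HTTP caching, so it doesn't hurt to make sure your own server hands out sensible freshness headers. A rough sketch only, assuming Apache with mod_expires; the lifetimes are made-up examples, not recommendations, and nothing here touches per-user HTML:
# Rough sketch; assumes Apache with mod_expires enabled.
<IfModule mod_expires.c>
    ExpiresActive On
    # Static assets rarely change, so let caches hold them for a week.
    ExpiresByType image/gif "access plus 7 days"
    ExpiresByType image/jpeg "access plus 7 days"
    ExpiresByType text/css "access plus 1 day"
</IfModule>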
My users are seeing this (with Web Accelerator)
GET index.html (sees non logged in version)
GET login.html
POST login.html > 302 (sets auth cookie) > index.html
User is now logged in....
GET index.html (still sees non logged in version, should be seeing logged in version)
If you bash CTRL-F5 hard enough and do whatever else you can to finally get to the logged in version, the same thing happens again when they logout...
GET logout.html > 302 (clears auth cookie) > index.html
GET index.html (still sees logged in version)
Aggggghhhh! My customers for this particular application are techie types who are likely to be trying the WA out!
My immediate solution to this has been to move the logged in area over to a sub-domain called members.example.com instead of www.example.com (just set it up as an alias).
Where do I sign-up for my free SSL cert, Google? ;)
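Another thing worth trying alongside the sub-domain trick: mark the per-user pages as uncacheable for shared caches. Whether the accelerator actually honours these headers is exactly the open question, but it is the standard signal. Rough sketch only, assuming Apache with mod_headers; the filenames just match the trace above:
<IfModule mod_headers.c>
    <FilesMatch "^(index|login|logout)\.html$">
        # Tell shared caches (proxies, accelerators) not to reuse these pages.
        Header set Cache-Control "private, no-cache, must-revalidate"
        # The response depends on the auth cookie, so say so.
        Header append Vary "Cookie"
    </FilesMatch>
</IfModule>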
Here's a new dilemma to consider:
Consider that the Google WA cache could actually significantly lower bandwidth on some very popular static sites; the net result might be taking money out of the mouths of web hosting companies that charge fees when account bandwidth goes "over the limit" in a given month.
On the flip side of the coin, prefetching may push some sites over the limit as well.
Therefore, this technology has the potential to impact hosts and/or webmasters monetarily.
Save your logs, send them a bill :)
You are missing what the issue is for me and my sites (and others with similar concerns). You say that "they proxy requests", but Google's accelerator does not just proxy requests: it automates them and proxies some requests that the user *does not request*. It's the additional automated requests that will cause increased load and cost for me, without any monetization to pay for those costs.
It has the potential to threaten the viability of some sites.
Can someone tell me, maybe from using a sniffer, whether the WA talks to something.google.com? Or maybe it uses a static IP address instead for speed?
In the first case, the famous Google cookie is sent by your browser. In the second case, it isn't.
I'm interested in whether the famous cookie with the globally-unique ID in it is offered up by your browser when you use WA. That cookie gets sent to anything at google.com.
If the cookie is offered, then I think we have more of a privacy problem than we would otherwise.
And for those who think this is not also about cloaking and purifying the search results, well, it is my personal humble opinion that it very much is. Just think about it: now, on top of everything else mentioned, Google will have access to queries made to other search engines. Google can analyze the principal differences between queries made on Yahoo versus Google :-)
Actually, I think it's a brilliant product, for Google of course.
Other Tricks:
Run a comparison in the background between what GBot collected and what was prefetched via the Web Accelerator.
If there's a difference, penalise, delete or investigate.
This could have some of the most significant impact on SEO, black hat and white hat alike, ranging from includes to cloaking to redirects and much more.
I think this may be G's attempt to clean up the SERPs without human intervention.
And as many of G's past attempts have shown, fallout to the innocent is not seen as that important by G.
And we may see one of the biggest changes to the SERPs in a long time towards the end of the year if this is implemented: both innocent and dark-side SEO caught, six months of data collected, so October or November comes to mind.
just my very jaundiced view
steve
For all the money Google makes you webmasters in advertising revenue, you could at least lighten up and let them treat the end-users for a change.
And it's not like they aren't using the service on their own web page. It can't be that bad.
Load Time for 3647 Pages
Without Google Web Accelerator: 1.0 hr
With Google Web Accelerator: 47.1 mins
I'll take it.
I see this behaviour on some sites when I do automated link checking, so I imagine there's some way to implement it. But I can't see how to do it with a .htaccess file.
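If the aim is to turn away the accelerator's prefetch requests from .htaccess, something like this might do it. It's only a sketch, and it leans on one assumption: that the prefetch requests carry the "X-moz: prefetch" header that Mozilla-style prefetching uses. If they don't, the rule simply never matches.
<IfModule mod_rewrite.c>
    RewriteEngine On
    # Send a 403 to anything that identifies itself as a prefetch.
    RewriteCond %{HTTP:X-moz} prefetch [NC]
    RewriteRule .* - [F,L]
</IfModule>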
> The rest of the web is still stuck in the uncompressed dark ages of the 70's.
That's one hell of a generalization. More and more server owners are waking up to the benefits of HTTP compression. As the price of dedicated servers freefalls and shared servers become less common, webmasters are now able to take compression into their own hands rather than letting greedy web hosts send uncompressed data to artificially increase bandwidth usage (and therefore profits).
> One thing we know for absolute certainty is that bandwidth requirements and page sizes are going to increase.
Your definition of absolute is very different to mine!
Bandwidth requirements probably will go up barring any radical new inventions. File sharing has already taken off to a large extent and multimedia streaming is set to take off in a big way too.
However, the Web Accelerator doesn't do a thing about the high-bandwidth uses of the Internet. It deals with web pages.
I see a downward trend in web page file sizes.
My oldest site is 10 years old. Thanks to the widespread adoption of CSS, external JavaScript and browsers that support HTTP compression, I've been able to decrease the average web page on that site from about 10KB down to 1.5KB.
Slowly, more and more web site owners are taking the time out to make their code leaner and faster.
> A compressing proxy does indeed stand to speed up the web.
Possibly so. But this utility wastes far more data through prefetches than it saves in compression. Widespread use of the utility will slow down the Internet as a whole.
I agree in principle with claus's feelings on the product, but I disagree on some of the specifics:
> a made up consumer need
I agree with claus too, and with your statements regarding compression. However, I will make my point again that compression is one small part of this application. Another part is prefetching, and that alone wastes all the bandwidth that the compression saves.
> it does not honor robots.txt
It is a proxy on behalf of a human, it isn't a bot.
The proxy is a proxy. The client application you download is a prefetching agent, which makes it a bot. It accesses resources without knowing what they are. I have seen it add items to shopping baskets of its own accord, even when robots.txt specifically forbids automated user agents from doing such things.
> User-Agent string for that file.
It can't, claus, it *has* to pass the UA unfettered.
That does not mean it can't identify itself in other ways via HTTP, especially with respect to its prefetching activities.
> wastes your bandwidth.
Agreed. How much it uses is open for debate.
Well, I'd hardly say it's open to debate, but it varies based on the way the user uses their web browser. However, in my studies it prefetched at least three pages for every one that I accessed.
I think the take away here is that if everyone would just install GZip on their websites, we would have the same effect in ALL browsers and not just in IE/Moz.
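For anyone who wants to do exactly that, here's a rough sketch assuming Apache 2.x with mod_deflate (on Apache 1.3 you'd reach for mod_gzip instead). It compresses text responses for any browser that sends Accept-Encoding: gzip, accelerator or not:
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/x-javascript
    # Standard workarounds for old browsers that mishandle compression.
    BrowserMatch ^Mozilla/4 gzip-only-text/html
    BrowserMatch ^Mozilla/4\.0[678] no-gzip
    BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
</IfModule>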
Agreed.
I think more people should de-bloat their web sites too. I saw one homepage today that weighs in at a massive 40KB. For every 1 byte of visible data, there's 4 bytes of bandwidth usage. That's not a ratio I'd be pleased about.
On top of that, they're not using any HTTP compression.
What absolute rubbish.
It's very clear to me that you are the one making the assumptions. If you'd bothered to analyse the Web Accelerator, you'd see that it does indeed prefetch Anchors.
Is there a way to set things up so that the server sends no response to the prefetch request? Simply ignores it? It seems to me this would be better than feeding a 403 or a 404 or a redirect to a different page.
# With these reject routes in place, replies to those ranges can't leave the box, so connections from the accelerator proxies never complete:
/sbin/route add -net 72.14.192.0 netmask 255.255.240.0 reject
/sbin/route add -net 64.233.160.0 netmask 255.255.224.0 reject
This affects all domains on that box. The question is, will Google retaliate by assuming that these domains are also broken for Googlebot? As far as I can determine, Googlebot is on the 66.249.*.* Class B these days, so these blocks wouldn't affect the Googlebot unless Google makes this assumption.
If you have a box where you don't care if you're listed in Google, it's worth a try. If you have a box where you depend on your Google referrals, I'd recommend a wait-and-see approach.
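A middle ground, if you'd rather not touch the routing table for the whole box: deny the same ranges per-site from .htaccess or the vhost config, so only the sites you choose are affected. The requests get a 403 rather than silence. Sketch only, using the two ranges posted above (255.255.240.0 is /20, 255.255.224.0 is /19):
Order Allow,Deny
Allow from all
Deny from 72.14.192.0/20
Deny from 64.233.160.0/19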
I got "Forbitten"
when I log in webmasterworld and have this "Accelerator" switched on.Now I switched it off and could log in.
(Brett)
> So before we go down this road of lynching them, let's be sure what the issues and consequences are for us. I don't think we can see those yet.
It might have taken him a day, but I think it's safe to assume that Brett doesn't like the "issues and consequences" that arise from Web Accelerator.
However, Google's privacy invasion is not new, its data expropriation is legendary, and its total disinterest in anything other than its own interests is symptomatic, even more so since the company went public. I'm not as scandalized simply because it's not all new.
However, I can't see the long-term benefits of banning the accelerator IPs: I think that site owners will lose out in the long run with such a strategy. It may seem a good (if rather knee-jerk) reaction to the initial analysis, but I think we need more time to fully analyse the implications that the accelerator brings.
> All of them. You'll have to look somewhere else for your conspiracy theory.
How do you know? Parts of many web pages are dependent on the geolocation of the IP address. Parts of many pages are dependent on stored cookie values. I don't recall mentioning any conspiracy theory. I am just suggesting that the output on your monitor can be different based solely on whether you are using the Web Accelerator.
You can delete it with this:
/sbin/route del -net 64.233.160.0 netmask 255.255.224.0 reject
I don't recommend it because a lot of plain www.google.com gets resolved to addresses in that range, and these would be unreachable from your box. The block on the 72.*.*.* Class A seems to be rather new, and I'm not aware that it is used for anything else by Google. So that block is probably okay.
> I can't see the long-term benefits of banning the accelerator IPs: I think that site owners will lose out in the long run with such a strategy
Depends on what the penetration for WA -- and those like it that are sure to follow -- turns out to be. I think it was pure genius on G's part to show prefetched links with a double underline. Some users will tend to click on those "accelerated" links, putting pressure on those webmasters who 'opted out' to opt back in; you can't be known as the slowpoke in a fast-moving herd.
Now, about those double underlined links and futzing around with our pages, that's another whole can of worms that hopefully some bigger guns than us poor folk here will take up with the big G.
Is anybody else worried that Google's prefetching will skew your web traffic reports? It's pretty much the same thing as the "background loading" trick that affiliate cookie pushers have been using to grab affiliate commissions.
You'll have a lot of hits on your pages that may have been prefetched and were never really looked at, inflating your page hit counts artificially.
I think we'll need to add Javascript "mouse movement" trackers to our pages, just to verify that someone's actually reading our pages.
Thanks.
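A server-side complement to the mouse-movement idea: keep the prefetched hits out of the main access log so the reports only count pages somebody actually asked for. Sketch only, for the main server config (CustomLog isn't available in .htaccess), and again assuming the prefetch requests carry an "X-moz: prefetch" header; the log paths are just placeholders:
SetEnvIf X-moz prefetch prefetch_hit
# Main log: everything except prefetches.
CustomLog /var/log/apache/access_log combined env=!prefetch_hit
# Optional: keep the prefetches in their own log for auditing.
CustomLog /var/log/apache/prefetch_log combined env=prefetch_hit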
> Parts of many web pages are dependent on the geolocation of the IP address
Yep, all of that hard work that web developers put into implementing geo-targeting on their sites will be pooped away.
All of those advertisers that didn't want to target those outside of the U.S now have to worry about losing money.
So in other words, the user who saved his/her 34 seconds of surfing time from using web accelerator can use that time to manually exclude sites that use geo-targeting.
You have a hidden link on your page, say to bot-trap.htm. I come along with WA and just happen to mouse over the link, Goo prefetches the page and falls into the trap.
Consequences?