
Google SEO News and Discussion Forum

Google Windows Web Accelerator
Brett_Tabke




msg:736302
 8:09 pm on May 4, 2005 (gmt 0)

[webaccelerator.google.com...]


System Requirements
Operating System: Win XP or Win 2000 SP3+
Browser: IE 5.5+ or Firefox 1.0+
Availability: For users in North America and Europe (during beta testing phase)

Press Release:

Google Web
Accelerator significantly reduces the time that it takes broadband users to
download and view web pages. The Google Web Accelerator appears as a small
speedometer in the browser chrome with a cumulative "Time saved" indicator.

Here's how it works. A user downloads and installs the client and begins
browsing the web as she normally would. In the background, the Google Web
Accelerator employs a number of techniques to speed up the delivery of
content to users.

Looks like some of the Mozilla hires are paying dividends.

 

davidgautier




msg:736452
 5:18 pm on May 5, 2005 (gmt 0)

I saw the same page macdave saw, although now it works fine.

macdave




msg:736453
 5:30 pm on May 5, 2005 (gmt 0)

Ah, looks like they're doing UA cloaking. Browsing with Safari, the WA home page redirects, and the Webmaster Help page just says "The requested URL was not found on this server." When I switch to Firefox, it's all there.

py9jmas




msg:736454
 5:37 pm on May 5, 2005 (gmt 0)

Web Accelerator is a caching proxy server. Proxy servers don't look at robots.txt, they proxy requests. They also cache responses to speed up future requests, and to reduce bandwidth use/latency/server load. This is what Squid does. This is what Microsoft web proxy does. This is what AOL's proxies do. Why are you all upset by this? It is nothing new.

# Sending your page requests through Google machines dedicated to handling Google Web Accelerator traffic.

It's a proxy.
# Storing copies of frequently looked at pages to make them quickly accessible.

It's a caching proxy
# Downloading only the updates if a web page has changed slightly since you last viewed it.

It's a caching proxy
# Prefetching certain pages onto your computer in advance.

It uses Mozilla's standard prefetching algos
# Managing your Internet connection to reduce delays.

It fiddles with your connection settings. Lots of free/shareware programs have been doing this for years.
# Compressing data before sending it to your computer.

Wow, it even supports gzip.

dmorison




msg:736455
 5:41 pm on May 5, 2005 (gmt 0)

It is completely screwing up one website of mine where the content of the home page changes when you are logged in.

My users are seeing this (with Web Accelerator)

GET index.html (sees non logged in version)
GET login.html
POST login.html > 302 (sets auth cookie) > index.html

User is now logged in....

GET index.html (still sees non logged in version, should be seeing logged in version)

If you bash CTRL-F5 hard enough and do whatever else you can to finally get to the logged-in version, the same thing happens again when they log out...

GET logout.html > 302 (clears auth cookie) > index.html
GET index.html (still sees logged in version)

Aggggghhhh! My customers for this particular application are techie types who are likely to be trying the WA out!

My immediate solution to this has been to move the logged in area over to a sub-domain called members.example.com instead of www.example.com (just set it up as an alias).

Where do I sign-up for my free SSL cert, Google? ;)
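
For anyone hitting the same problem, here is a minimal sketch of the standard cache-control headers for per-user pages, assuming Apache with mod_headers enabled and the page names from the example above. Whether the accelerator's proxy actually honors them is another question, but this is what shared caches are supposed to respect:

# Minimal sketch, assuming mod_headers; page names are from the example above.
# "private" tells shared caches not to reuse one visitor's copy for another,
# and "Vary: Cookie" ties any cached copy to the auth cookie.
<FilesMatch "^(index|login|logout)\.html$">
    Header set Cache-Control "private, no-cache, must-revalidate"
    Header append Vary "Cookie"
</FilesMatch>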

incrediBILL




msg:736456
 5:53 pm on May 5, 2005 (gmt 0)

Yesterday we all screamed about the EXTRA bandwidth that Google WA would steal from everyone.

Here's a new dilemma to consider:

If the Google WA cache actually lowers bandwidth significantly on some very popular static sites, the net result might be taking money out of the pockets of web hosting companies that charge fees when an account's bandwidth goes "over the limit" in a given month.

On the flip side of the coin, prefetching may push some sites over the limit as well.

Therefore, this technology has the potential to impact hosts and/or webmasters monetarily.

Save your logs, send them a bill :)

Craven de Kere




msg:736457
 6:05 pm on May 5, 2005 (gmt 0)

py9jmas wrote: "Web Accelerator is a caching proxy server. Proxy servers don't look at robots.txt, they proxy requests. They also cache responses to speed up future requests, and to reduce bandwidth use/latency/server load. This is what Squid does. This is what Microsoft web proxy does. This is what AOL's proxies do. Why are you all upset by this? It is nothing new."

You are missing what the issue is for me and my sites (and others with similar concerns). You say that "they proxy requests", but Google's accelerator does not just proxy requests: it automates them, fetching some pages that the user *does not request*. It's those additional automated requests that will cause increased load and cost for me, without monetization to pay for those costs.

It has the potential to threaten the viability of some sites.

Scarecrow




msg:736458
 6:11 pm on May 5, 2005 (gmt 0)

I don't have the right hardware to install this thing.

Can someone tell me, maybe from using a sniffer, whether the WA talks to something.google.com? Or maybe it uses a static IP address instead for speed?

In the first case, the famous Google cookie is sent by your browser. In the second case, it isn't.

I'm interested in whether the famous cookie with the globally-unique ID in it is offered up by your browser when you use WA. The cookie is sent with requests to anything.google.com.

If the cookie is offered, then I think we have more of a privacy problem than we would otherwise.

LeoXIV




msg:736459
 6:15 pm on May 5, 2005 (gmt 0)

This is absurd. It should be OPT-IN rather than opt-out anyway.

And for those who think this is not also about cloaking and purifying the search results, well, it is my personal humble opinion that it very much is. Just think about it: on top of everything else mentioned, Google will have access to queries made to other search engines. Google can analyze the principal differences between queries made on Yahoo versus Google :-)

Actually, I think it's a brilliant product, for Google of course.

Scarecrow




msg:736460
 6:28 pm on May 5, 2005 (gmt 0)

And if it's a static IP so that no google.com cookie is sent, please check to see if there is any sort of serial number on the WA that is perhaps sent along in the URL.

Brett_Tabke




msg:736461
 6:29 pm on May 5, 2005 (gmt 0)

How I see it working:
  • A user clicks a link in IE.
  • The URL is sent to Google.
  • Google fetches the URL from the website.
  • Google then sends the URL data back to the user in gzip-compressed form.

Other Tricks:

  • It guesses the most logical link on the page to fetch.
  • It auto-grabs pages that are likely to be fetched by users, based on other users' click tracks.
  • It auto-fetches any URL you mouse over.
  • Although it may use the prefetch mechanism from Mozilla, that algo is *NOT* implemented in IE.

steve40




msg:736462
 7:01 pm on May 5, 2005 (gmt 0)

Hmm, having looked at this, I can see one major use for it apart from those others have mentioned and G's stated reasoning:

Run a comparison in the background between pages as GBot collected them and as they were prefetched via Web Accelerator; if they differ, penalise, delete, or investigate.

This could have some of the most significant impact on SEO, black hat and white hat alike, ranging from includes to cloaking to redirects and much more.

I think this may be G's attempt to clean up the SERPs without human intervention.

And as many of G's past attempts have shown, fallout to the innocent is not seen as that important by G. We may see one of the biggest changes to the SERPs in a long time towards the end of the year if this is implemented, with both innocent and dark-side SEO caught once six months of data has been collected (October or November comes to mind).

Just my very jaundiced view.
steve

BReflection




msg:736463
 7:05 pm on May 5, 2005 (gmt 0)

This isn't about used or unused bandwidth of servers and webmasters, it's about all the unused bandwidth of users.

For all the money Google makes you webmasters in advertising revenue, you could at least lighten up and let them treat the end-users for a change.

And it's not like they aren't using the service on their own web page. It can't be that bad.

Load Time for 3647 Pages
Without Google Web Accelerator: 1.0 hr
With Google Web Accelerator: 47.1 mins

I'll take it.

Powdork




msg:736464
 7:17 pm on May 5, 2005 (gmt 0)

How many of those 3647 pages were shown as intended by the author? How many times did the user end up somewhere different because of cookie values or the lack thereof? How many other pages were served needlessly to satisfy the need for those 3600 pages? What legitimate surfing need is there to load 60 pages per minute?

jomaxx




msg:736465
 7:21 pm on May 5, 2005 (gmt 0)

Is there a way to set things up so that the server sends no response to the prefetch request? Simply ignores it? It seems to me this would be better than feeding a 403 or a 404 or a redirect to a different page.

I see this behaviour on some sites when I do automated link checking, so I imagine there's some way to implement it. But I can't see how to do it with a .htaccess file.

walkman




msg:736466
 7:30 pm on May 5, 2005 (gmt 0)

or a custom page that explains why the page is blank ;).
That would work great for some Norton users too who want to view the content with no ads.

"Is there a way to set things up so that the server sends no response to the prefetch request? Simply ignores it? It seems to me this would be better than feeding a 403 or a 404 or a redirect to a different page. "

[edited by: walkman at 7:32 pm (utc) on May 5, 2005]
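
A rough sketch of that custom-page idea in .htaccess terms, assuming mod_rewrite and a hypothetical /prefetch-blocked.html explanation page. Prefetch requests are identified by the X-moz header that Mozilla-style prefetching sends (the same header the WebmasterWorld rule quoted later in this thread checks for); they get a 403 whose body is the custom page, while normal clicks go through untouched:

# Rough sketch, assuming mod_rewrite; /prefetch-blocked.html is a
# hypothetical page explaining why the content was withheld.
ErrorDocument 403 /prefetch-blocked.html
RewriteEngine On
RewriteCond %{HTTP:X-moz} ^prefetch$ [NC]
RewriteCond %{REQUEST_URI} !prefetch-blocked\.html$
RewriteRule .* - [F]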

mrMister




msg:736467
 7:32 pm on May 5, 2005 (gmt 0)

The rest of the web is still stuck in the uncompressed dark ages of the 70's.

That's one hell of a generalization. More and more server owners are waking up to the benefits of HTTP compression. As the price of dedicated servers freefalls and shared servers become less common, webmasters are now able to take compression into their own hands rather than letting greedy web hosts send uncompressed data to artificially increase bandwidth usage (and therefore profits).

One thing we know for absolute certainty is that bandwidth requirements and page sizes are going to increase.

Your definition of absolute is very different to mine!

Bandwidth requirements probably will go up barring any radical new inventions. File sharing has already taken off to a large extent and multimedia streaming is set to take off in a big way too.

However, the Web Accelerator doesn't do a thing about the high-bandwidth uses of the Internet. It deals with web pages.

I see a downward trend in web page file sizes.

My oldest site is 10 years old. Thanks to the widespread adoption of CSS, external JavaScript and browsers that support HTTP compression, I've been able to decrease the average web page on that site from about 10KB down to 1.5KB.

Slowly, more and more web site owners are taking the time out to make their code leaner and faster.

A compressing proxy does indeed stand to speed up the web.

Possibly so. But this utility wastes far more data through prefetches than it saves in compression. Widespread use of the utility will slow down the Internet as a whole.

I agree in principle with claus's feelings on the product, but I disagree on some of the specifics:

> a made up consumer need

I agree with claus too, and with your statements regarding compression. However, I will make my point again that compression is one small part of this application. Another part is prefetching, and that alone wastes all the bandwidth that the compression saves.

> it does not honor robots.txt

It is a proxy on behalf of a human, it isn't a bot.

The proxy is a proxy. The client application you download is a prefetching agent, which makes it a bot. It accesses resources without knowing what they are. I have seen it add items to shopping baskets of its own accord, even when robots.txt specifically forbids automated user agents from doing so.

> User-Agent string for that file.

It can't claus, it *has* to pass the UA unfettered.

That does not mean it can't identify itself in other ways via HTTP, especially with respect to its prefetching activities.


> wastes your bandwidth.

Agreed. how much it uses is open for debate.

Well, I'd hardly say it's open to debate, but it varies based on the way the user uses their web browser. However, in my studies it prefetched at least three pages for every one that I actually accessed.

I think the take away here is that if everyone would just install GZip on their websites, we would have the same effect in ALL browsers and not just in IE/Moz.

Agreed.

I think more people should de-bloat their web sites too. I saw one homepage today that weighs in at a massive 40KB. For every 1 byte of visible data, there's 4 bytes of bandwidth usage. That's not a ratio I'd be pleased about.

On top of that, they're not using any HTTP compression.
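
For what it's worth, a minimal sketch of doing exactly that on Apache 2.x with mod_deflate (Apache 1.3 users would reach for mod_gzip instead); the BrowserMatch lines are the usual workarounds for old browsers with broken gzip support:

# Minimal sketch, assuming Apache 2.x with mod_deflate enabled:
# compress the text formats that make up most page weight.
AddOutputFilterByType DEFLATE text/html text/plain text/css application/x-javascript
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html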

BReflection




msg:736468
 7:42 pm on May 5, 2005 (gmt 0)

How many of those 3647 pages were shown as intended by the author?

All of them. You'll have to look somewhere else for your conspiracy theory.

mrMister




msg:736469
 7:43 pm on May 5, 2005 (gmt 0)

[quote]> for a site like WebmasterWorld it's probably a lot

It is ZERO. We do not allow prefetching.[/quote]

For some webmasters, banning users just because they unwittingly use a poorly thought-out Web Accelerator is not an option.

mrMister




msg:736470
 7:48 pm on May 5, 2005 (gmt 0)

[quote]Everybody seems to be assuming Google will pre-fetch anchor links, i.e. links in the <a> tag.[/quote]

What absolute rubbish.

It's very clear to me that you are the one making the assumptions. If you'd bothered to analyse the Web Accelerator, you'd see that it does indeed prefetch anchors.

Scarecrow




msg:736471
 8:07 pm on May 5, 2005 (gmt 0)

Is there a way to set things up so that the server sends no response to the prefetch request? Simply ignores it? It seems to me this would be better than feeding a 403 or a 404 or a redirect to a different page.

Sure, at least on a Linux box. This will hang the WA and the WA will have to give up after timing out. It will make the WA seem incompetent and/or broken:

/sbin/route add -net 72.14.192.0 netmask 255.255.240.0 reject

/sbin/route add -net 64.233.160.0 netmask 255.255.224.0 reject

This affects all domains on that box. The question is, will Google retaliate by assuming that these domains are also broken for Googlebot? As far as I can determine, Googlebot is in the 66.249.*.* range (a /16) these days, so these blocks wouldn't affect Googlebot unless Google makes this assumption.

If you have a box where you don't care if you're listed in Google, it's worth a try. If you have a box where you depend on your Google referrals, I'd recommend a wait-and-see approach.

mrMister




msg:736472
 8:09 pm on May 5, 2005 (gmt 0)

I got "Forbitten"
when I log in webmasterworld and have this "Accelerator" switched on.

Now I switched it off and could log in.

(Brett)
So before we go down this road of lynching them, lets be sure what the issues and consequences are for us. I don't think we can see those yet

It might have taken him a day, but I think it's safe to assume that Brett doesn't like the "issues and consequences" that arise from Web Accelerator.

encyclo




msg:736473
 8:24 pm on May 5, 2005 (gmt 0)

I don't think there are very many people happy about the Web Accelerator, me included. It is a huge invasion of privacy for users, it will cause nothing but problems for site owners, and it gives all the advantages to Google. If I gave the mistaken impression that I condoned it or approved of it, let me correct that right now.

However, Google's privacy invasion is not new, its data expropriation is legendary, and its total disinterest in anything other than its own interests is symptomatic, even more so since the company went public. I'm not as scandalized simply because none of it is new.

However, I can't see the long-term benefits of banning the accelerator IPs: I think that site owners will lose out in the long run with such a strategy. It may seem a good (if rather knee-jerk) reaction to the initial analysis, but I think we need more time to fully analyse the implications the accelerator brings.

Powdork




msg:736474
 8:32 pm on May 5, 2005 (gmt 0)

All of them. You'll have to look somewhere else for your conspiracy theory.
How do you know? Parts of many web pages depend on the geolocation of the IP address. Parts of many pages depend on stored cookie values. I don't recall mentioning any conspiracy theory. I am just suggesting that the output on your monitor can differ based solely on whether you are using the Web Accelerator.

Scarecrow




msg:736475
 8:33 pm on May 5, 2005 (gmt 0)

On second thought, I don't recommend this block:
/sbin/route add -net 64.233.160.0 netmask 255.255.224.0 reject

You can delete it with this:
/sbin/route del -net 64.233.160.0 netmask 255.255.224.0 reject

I don't recommend it because a lot of plain www.google.com traffic resolves to addresses in that range, and those would be unreachable from your box. The block in the 72.*.*.* range seems to be rather new, and I'm not aware that it is used for anything else by Google. So that block is probably okay.

jimbeetle




msg:736476
 8:35 pm on May 5, 2005 (gmt 0)

I can't see the long terms benefits of banning the accelerator IPs: I think that site owners will lose out in the long run with such a strategy

Depends on what the penetration of WA -- and of those like it that are sure to follow -- turns out to be. I think it was pure genius on G's part to show prefetched links with a double underline. Some users will tend to click on those "accelerated" links, putting pressure on webmasters who 'opted out' to opt back in; you can't afford to be known as the slowpoke in a fast-moving herd.

Now, about those double underlined links and futzing around with our pages, that's another whole can of worms that hopefully some bigger guns than us poor folk here will take up with the big G.

LeoXIV




msg:736477
 8:38 pm on May 5, 2005 (gmt 0)

Imagine you have an extensive PPC campaign: how can you afford to ban the Web Accelerator?

androidtech




msg:736478
 8:45 pm on May 5, 2005 (gmt 0)

If this has been talked about before, mea culpa. This thread is 18 pages long.

Is anybody else worried that Google's prefetching will skew your web traffic reports? It's pretty much the same thing as the "background loading" trick that affiliate cookie pushers have been using to grab affiliate commissions.

You'll have a lot of hits on your pages that may have been prefetched and were never really looked at, inflating your page hit counts artificially.

I think we'll need to add Javascript "mouse movement" trackers to our pages, just to verify that someone's actually reading our pages.

Thanks.
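
A rough sketch of one way to keep the stats clean, assuming Apache with mod_setenvif and mod_log_config; the log file paths are examples only. Prefetch requests are tagged by the X-moz header they carry and kept out of the main access log. This doesn't stop the requests or save any bandwidth, it just stops them from inflating the page counts:

# Rough sketch: tag prefetch requests and log them separately.
SetEnvIf X-moz prefetch prefetch_req
CustomLog logs/access_log combined env=!prefetch_req
CustomLog logs/prefetch_log combined env=prefetch_req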

Powdork




msg:736479
 9:00 pm on May 5, 2005 (gmt 0)

It is ZERO. We do not allow prefetching.

RewriteCond %{HTTP:X-moz} ^prefetch
RewriteRule ^.* - [F]


Why does the accelerator show I am saving about .15 seconds per WW page visited?

msgraph




msg:736480
 9:17 pm on May 5, 2005 (gmt 0)

Parts of many web pages are dependent on the geolocation of the ip address

Yep, all of that hard work that web developers put into implementing geo-targeting on their sites will be pooped away.

All of those advertisers that didn't want to target users outside of the U.S. now have to worry about losing money.

So in other words, the user who saved his/her 34 seconds of surfing time from using web accelerator can use that time to manually exclude sites that use geo-targeting.

jimbeetle




msg:736481
 9:27 pm on May 5, 2005 (gmt 0)

Hmmm. This thread [webmasterworld.com] got me thinking.

You have a hidden link on your page, say to bot-trap.htm. I come along with WA and just happen to mouse over the link; Goo prefetches the page and falls into the trap.

Consequences?

walkman




msg:736482
 9:36 pm on May 5, 2005 (gmt 0)

"You have a hidden link on your page"

consequences? maybe a ban ;)
