Forum Moderators: Robert Charlton & goodroi
System Requirements
Operating System: Win XP or Win 2000 SP3+
Browser: IE 5.5+ or Firefox 1.0+
Availability: For users in North America and Europe (during beta testing phase)
Press Release:
Google Web Accelerator significantly reduces the time that it takes broadband users to download and view web pages. The Google Web Accelerator appears as a small speedometer in the browser chrome with a cumulative "Time saved" indicator.

Here's how it works. A user downloads and installs the client and begins browsing the web as she normally would. In the background, the Google Web Accelerator employs a number of techniques to speed up the delivery of content to users.
Looks like some of the Mozilla hires are paying dividends.
In general I don't think that's true, especially if you turn off referrers in your browser. Please see [desktop.google.com...]
webmasters can simply ignore prefetch requests if they so choose.
GG, this is not specific enough. Please define "ignore". IMO the logical thing to do is to send a 403, but that's not acceptable if there's a chance the end user will be shown that same error.
If you mean rejecting the request outright, please suggest a mechanism to do that. I specifically asked about this here and the only method proposed was based on IP address (obviously not an ideal solution).
FWIW, I don't have a bandwidth problem and don't intend to block prefetching unless it causes problems. My main concern is that I specifically bar robots from executing scripts in my /cgi-bin/ and /scripts/ directories, because of server load issues, not to mention unexpected and undesirable results in some cases.
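FWIW, prefetch requests reportedly announce themselves with an `X-moz: prefetch` request header (the same one Firefox's link prefetching sends), so one sketch of "rejecting the request outright" in Apache is a SetEnvIf rule in .htaccess. This assumes mod_setenvif is available and that you can live with the 403 concern raised above:

```apache
# Sketch: deny requests that identify themselves as prefetches via the
# "X-moz: prefetch" header. Assumes mod_setenvif is loaded; note that a
# user whose prefetched page is later shown may see the 403.
SetEnvIfNoCase X-moz prefetch block_prefetch

Order Allow,Deny
Allow from all
Deny from env=block_prefetch
```

You could scope the same rule to just /cgi-bin/ or /scripts/ by placing the .htaccess file in those directories.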
I read somewhere a few days ago, maybe at SEW, that sending info out is the DEFAULT, not the other way around.
We have absolutely zero need for this, and in most cases, it can potentially be harmful.
- we already compress our pages
- all of our pages use strict cache-control and expires headers to stop caching by browser and/or proxy as our pages are extremely dynamic
- We do not use query strings in our URLs as exposed to the outside world. We have gone to great lengths to hide ugly URLs with query strings behind mod_rewrite.
- We use cookies heavily for session and state passing.
- This will screw up tracking and reporting.
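The strict cache-control and expires headers mentioned above might look like this in Apache — a sketch assuming mod_headers is loaded, with an illustrative `.php` file pattern:

```apache
# Sketch (assumes mod_headers; the ".php" pattern is illustrative):
# mark dynamic pages as uncacheable by browsers and proxies alike.
<FilesMatch "\.php$">
  Header set Cache-Control "private, no-cache, no-store, must-revalidate"
  Header set Pragma "no-cache"
  Header set Expires "Thu, 01 Jan 1970 00:00:00 GMT"
</FilesMatch>
```

The complaint in the thread is that a middlebox cache which ignores or mishandles these headers defeats the whole setup.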
Forcing this on web sites with no way for a site to opt out is just plain rude and unprofessional.
Anyway--welcome to WebmasterWorld, and I hope you stick around and enjoy the less acrimonious threads too. :)
I wonder though if this is really worth it for google? Have you guys looked around the web lately at what people are saying, googleguy?
[news.google.com...]
just an observation:
the first 3 news links have AdSense on them :-)
Forcing this on web sites with no way for a site to opt out is just plain rude and unprofessional.
Amen.
GoogleDude keeps hyping that it'll speed everyone up [and it doesn't speed us all up] but doesn't seem to care that SOME of us webmasters just don't want it interfering with our sites and, more importantly, our SITE STATS, which we use as a basis for marketing the site. Where does the "wait and see what we can do, it's in beta" attitude show one ounce of respect to the webmasters who don't want it touching their sites?
Again today Google's servers ping slower [much slower this time] than most of my destination servers, so putting Google between me and my destination adds sluggish response times just to get the promise of HTML compression, which my Apache server already supports. Google can't really compress the images, so the faster download from my destination server nearly doubles, slowing to a crawl as the images get relayed through Google.
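For reference, the HTML compression Apache "already supports" is typically a one-line mod_deflate filter — a sketch assuming Apache 2.x with mod_deflate loaded:

```apache
# Sketch (Apache 2.x, mod_deflate loaded): gzip-compress text responses
# before they leave the server, so no intermediary is needed for this.
AddOutputFilterByType DEFLATE text/html text/plain text/css
```

Images (JPEG, GIF, PNG) are already compressed formats, which is the point above: relaying them through an extra hop adds latency without shrinking them.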
What in the heck are they saving me?
Besides, who'll stop them next April Fool's Day when they decide to punk the net and feed the WA cache thru Snoop Dogg's Shizzolator?
Here’s the problem: Google is essentially clicking every link on the page — including links like “delete this” or “cancel that.” And to make matters worse, Google ignores the Javascript confirmations.
Serving with Google Web Accelerator can only be faster in some cases as long as a very small minority is using it. As soon as the tool is used by a significant fraction of the online population, the complete web will slow down, because it is forced to transport several times as much data.
And even while only a minority is using it: At least half of the time the additional overhead will necessarily make access to pages slower, instead of faster. Don't get fooled by the software informing you about "time saved". It's lying, because it has no way of knowing how fast the direct request would have been.
In sum, this thing is a very bad idea. It turns its users into parasites wasting other people's resources.
Proxy caches have the potential to speed up users' connections and reduce the bandwidth for a webmaster; that's why lots of ISPs use them.
The ISP is one of the few places where a proxy cache makes sense, because the proxy there is guaranteed to be within the natural path of the request. In contrast, Google's proxy is guaranteed to divert most traffic from the shortest path, thus wasting bandwidth and slowing down a large number of page requests.
Sure, yah, so you feel vindicated about Gmail et al, so you really think that somehow that means no matter what spyware crap you put out that it's just fine, because Gmail was?
The web accelerator is breaking sites. It causes harm to site owners, it causes harm to advertisers, and in some cases causes harm to users.
The only benefit to this is the information that M$...sorry, Google, gets to glean from it.
I've been a huge Google fan, wasn't so worried about Gmail, but this is the deal breaker. This is flat out "evil" and puts you smack dab in the middle of the corporate greed and irresponsibility stereotype that somehow managed to avoid Google until now.
<If you go back a year to Gmail's debut, there were also people who wanted to block Gmail right after it was introduced. Fast forward a year, and many many people like Gmail now that they've seen the direction that we've gone with it: 2 gigs of storage and growing, a solid UI, free POP access, and free email forwarding as well.>
With all due respect, please don't underestimate the intelligence of fellow members.
Gmail is an opt-in operation where people have a choice, whereas in the case of WA, the whole operation is an opt-out one.
This thread and several other recent threads illustrate beyond doubt that Google has lost a big portion of the goodwill it had among publishers, even those who were considered most Google-friendly.
I admire the courage of those decent publishers who have chosen to block WA and have expressed their reasons and positions in public. God bless you all.
why can't you smart people at Google just focus on delivering cutting-edge compression technologies to speed up the web instead of coming up with something so controversial as Prefetching?
Also, I'd like to see G pair up with some of the companies that are at the next-level on this thing, such as OnSpeed, so you don't have to re-invent the wheel.
why can't you smart people at Google just focus on delivering cutting-edge compression technologies to speed up the web instead of coming up with something so controversial as Prefetching?
That's a fair question, zjacob. I think if you go back and look at Googlebot now vs. two or even one year ago, today's Googlebot is much more likely to support gzipped content when fetching pages. So that's one thing. The other thing that I can think of is enabling prefetching for Google search results where we have high confidence that the user will click on the first search result.
But last year people were complaining that Gmail wasn't opt in, because if you mailed someone using Gmail, it might generate ads.
Why do you insist on stating that because the furor died down before with a totally different product, that this means that your new "product" is not bad? It is breaking sites. What part of that do you not understand? Every time you post, it's always how we're "overreacting."
It's not overreacting when people are fed caches of other users information. It's not overreacting when statistics we all use on our sites for advertising are now rendered useless on a whim by the great "Google." It's not overreacting when products are getting added to shopping carts without consent.
All for what? Allegedly to speed up broadband users? You say you feel we're savvy, so why do you pretend this is for the users benefit? This is, pure and simple, information collecting for Google's benefit. You're insulting our intelligence to act like that's not the case.
<I want to make sure that I take comments from here to the people who work on the web accelerator.>
Very kind of you. Much appreciated.
But why don't you folks at Google understand that publishers spend hours, days, weeks and years creating decent sites with valuable content, and it should be those publishers who decide how and in which way they wish to display the contents of their sites.
One thing WA and Gator have in common; they force their own will on the legitimate owners of websites.
Pls correct me if I'm wrong.
One thing WA and Gator have in common; they force their own will on the legitimate owners of websites.
... and both can be blocked via .htaccess (Gator has been for quite some time).
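As noted earlier in the thread, the only rejection mechanism proposed so far for WA is IP-based. A sketch of what that looks like in .htaccess follows; the address range below is a documentation placeholder, not a verified list of the accelerator's proxy addresses:

```apache
# Sketch: deny the accelerator proxy's requests by source address.
# 192.0.2.0/24 is a placeholder (RFC 5737 documentation range) --
# substitute the proxy ranges you actually observe in your own logs.
Order Allow,Deny
Allow from all
Deny from 192.0.2.0/24
```

The obvious weakness, also raised above, is that Google can add or change proxy addresses at any time, so the list needs ongoing maintenance.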
As a Webmaster I don't see that same choice when it comes to the sites I manage when it comes to WA.
From the few tests I ran I can say that as soon as a simple tag is posted that does not involve a 403, I will have to spend a couple of days inserting code so that this thing is blocked from fetching any content on my sites.
The other thing that I can think of is enabling prefetching for Google search results where we have high confidence that the user will click on the first search result.
LMAO - you go right ahead with that plan and watch dynamic ecommerce sites with NO-CACHE try their best to get OUT of the number one position when it tanks their server with a few hundred thousand wasted requests each day and shoppers can't shop because the shopping engine is crawling under the load. Are you guys actually contemplating the impact of your actions with this Web Accelerator thing?
BTW, when you implement this new pre-cache technology for the #1 spot, what will you tell webmasters like myself with NO-CACHE pages whose server bandwidth charges suddenly DOUBLE or TRIPLE that month?
I sure as hell won't be picking up your tab for unwanted traffic.
Do you have an address at Google where I can send the invoice for the excessive bandwidth charges when this is implemented?
As a computer user I have the choice to NOT download the toolbar, Desktop Search or use Gmail. As a Webmaster I don't see that same choice for WA on the sites I manage.
Very true.
Why do you insist on stating that because the furor died down before with a totally different product, that this means that your new "product" is not bad?
I think the answer is that he has a very tough position right now.
you go right ahead with that plan and watch dynamic ecommerce sites with NO-CACHE try their best to get OUT of the number one position
Now, that is funny. #4 or #5 would be the new best spot?
[edited by: oneguy at 9:19 pm (utc) on May 6, 2005]
I'm going to be polite to you because I know you're just our liaison to Google. Ranting at you would probably be as productive as acting belligerent toward a help desk operator and then wondering why you couldn't get the help you needed.
I have spent many hours making all my links seem static. The content on those pages however is always dynamic. So I was a little miffed when I got a reply from Google suggesting that I can block the Web Accelerator by adding a question mark to each of the links on thousands of pages.
If we're as intelligent as you claim, you'll hopefully understand that we webmasters, well myself at least, view that as a bogus reply which borders on the unbelievable.
Please pass those sentiments along to whomever you consult with at Google.
1. If you have two browsers running, IE and Mozilla, and you go to Mozilla and do some surfing, then when you come back to IE, the WA stops working. You need to hit refresh to get it going again.
2. If you hit refresh on WebmasterWorld, it causes some page formatting errors when new posts have been added. In other words, it has a problem with dynamic content! Or this may be the result of WebmasterWorld blocking WA.
[edited by: Namaste at 9:24 pm (utc) on May 6, 2005]
I have two more meetings back to back, but I'll stop by later.
WA still takes credit for saving a second
That's the sleight-of-hand magic trick here: they pre-fetch and download it before you ask for it, and that's the "seconds" they saved you by avoiding the download delay. In reality they're just choking our broadband pipes with things you may or may not want.
My site's home page has conservatively over 200 links on it, can you imagine someone moving their cursor across my page as they read it?
Can you imagine the impact on their downloading bandwidth?
Now imagine this is happening on the office DSL line with 50 co-workers also pre-fetching away.
Bottleneck.
Emails stop sending / receiving.
Network down.
Worker online productivity drops 100% thanks to GWA :)
IT guy that installed it network-wide is fired on the spot.
Quivering remaining IT guy trembles from desk to desk uninstalling it with an angry CEO standing over him.