Operating System: Win XP or Win 2000 SP3+
Browser: IE 5.5+ or Firefox 1.0+
Availability: For users in North America and Europe (during beta testing phase)
Accelerator significantly reduces the time that it takes broadband users to
download and view web pages. The Google Web Accelerator appears as a small
speedometer in the browser chrome with a cumulative "Time saved" indicator.
Here's how it works. A user downloads and installs the client and begins
browsing the web as she normally would. In the background, the Google Web
Accelerator employs a number of techniques to speed up the delivery of
content to users.
Looks like some of the Mozilla hires are paying dividends.
However, I feel that something as potentially harmful as this should have gone through much more rigorous testing and public consultation before being let loose on the Internet.
Benefit for the site owner: Automatic clicks on your site ads, automatic additions to your customers' shopping baskets!
Yes, but can it prefetch their credit card info and submit it later :)
Well, even though the comments in this thread are mostly/all negative, I believe it gives Google some valuable feedback they may/may not have thought about. I guess they could have struck a silent deal with AOL for broadband and its users wouldn't have a clue ;)
Hello everybody, I'm already in love with this forum :-)
so Leoxiv said that
if Page A(index 1)!= Page A(index 2)
then Page A is CLOAKING
it's a splendid observation, but how on earth can they measure inequality that precisely?! there might be just tiny differences ...
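The comparison doesn't have to be byte-for-byte; a similarity threshold can absorb those tiny differences (timestamps, rotating ads) while still flagging a wholesale content swap. A minimal sketch of that idea using Python's standard difflib; the threshold value and the function itself are illustrative assumptions, not anything Google has documented:

```python
import difflib

def looks_like_cloaking(page_a: str, page_b: str, threshold: float = 0.90) -> bool:
    """Compare two fetches of the same URL; flag possible cloaking only
    when they differ by more than the allowed similarity threshold."""
    ratio = difflib.SequenceMatcher(None, page_a, page_b).ratio()
    return ratio < threshold

# A tiny difference (e.g. a timestamp) stays under the radar:
print(looks_like_cloaking("<html>Hello 10:01</html>", "<html>Hello 10:02</html>"))  # False
# Wholly different content on the second fetch trips the check:
print(looks_like_cloaking("<html>Hello</html>", "<html>Buy cheap widgets!</html>"))  # True
```

The threshold is the tuning knob: set it too high and every dynamic page looks like cloaking, too low and only total rewrites get caught.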
To make the accelerator work, your internet security program needs to approve the accelerator's connection to the internet.
Try removing the accelerator from your security program's firewall access rules. When you've done that, restart the accelerator and try to open a page - interesting error page, eh?
The Google Accelerator is an application that cooperates with your browser and checks for data on the requested page via a port other than 80 (HTTP).
Ergo, no proxy IP and no user agents.
One tool that I love is their desktop search...you can control everything, and clean the history /cache anytime you want.
1. if 5% of internet users utilise this technology then the web is slowed down because of the huge extra bandwidth requirements by this 5%.
Thus another 10% of users say "the internet is so slow", and of course they look for ways to improve their speed. They therefore download the 'Web Accelerator'.
This compounds the speed issue, making it even slower for the other 85%.
This keeps going until the Web Accelerator has 100% market share. And of course, since the internet has been slowed down so significantly, we all end up running no faster than we do now.
2. The Web Accelerator clicks on other PPC advertising, making the advertisers pay more and look for alternatives that are more cost effective, i.e. AdWords.
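The feedback loop in point 1 can be sketched as a toy model: accelerator users consume some multiple of normal bandwidth, the net slows for everyone, and the perceived slowdown drives further adoption. Every number here (the bandwidth multiplier, the adoption response) is a made-up assumption for illustration, not a measurement:

```python
def simulate_adoption(prefetch_multiplier: float = 3.0,
                      initial_share: float = 0.05,
                      rounds: int = 10) -> list[float]:
    """Toy model of the compounding loop: prefetching users multiply
    total load, and higher load (slower web) drives more adoption."""
    share = initial_share
    history = [share]
    for _ in range(rounds):
        # Total load relative to a world with no prefetching.
        load = 1 + share * (prefetch_multiplier - 1)
        # Assumed response: perceived slowdown scales further adoption.
        share = min(1.0, share * load)
        history.append(share)
    return history

for step, share in enumerate(simulate_adoption()):
    print(f"round {step}: {share:.1%} of users running the accelerator")
```

Under these assumptions the share only ratchets upward, which is the poster's point: the loop has no stable resting place short of saturation.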
if 5% of internet users utilise this technology then the web is slowed down because of the huge extra bandwidth requirements by this 5%
Yes, you hit the nail on the head and win a stuffed animal of your choice.
Imagine if EVERYONE installs this mess and web usage, which may already be borderline, goes up 3x, 4x, or more within a few months! The bandwidth for most of the servers we host hits 80%+ for a couple of hours a day during peak times, and this technology would literally destroy access times during peak. Packets would be dropping all over the place, and the sheer economics of the situation don't justify adding more bandwidth to the network.
The only solution would be to block any pre-fetch technology from all servers.
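Blocking at the server looks feasible because, as noted later in this thread, the pre-fetch requests reportedly carry a special header (the Firefox-style "X-moz: prefetch"). A minimal sketch as WSGI middleware; both the header name and the approach are assumptions based on reports in the thread, not documented Google behavior:

```python
def block_prefetch(app):
    """WSGI middleware that refuses requests carrying the prefetch hint.
    Assumes prefetches are marked with an 'X-moz: prefetch' header, as
    reported in this thread; adjust if the client changes its marker."""
    def wrapper(environ, start_response):
        if environ.get("HTTP_X_MOZ", "").lower() == "prefetch":
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Prefetch requests are not served here.\n"]
        return app(environ, start_response)
    return wrapper
```

Normal user-initiated requests pass through untouched; only requests self-identified as prefetches get the 403, so real visitors never notice the block.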
What is their ultimate goal with pre-fetch?
Has anyone considered what reason Google could possibly have for unleashing this pre-fetch menace other than the obvious comments earlier to track site access more closely?
First they announced enabling "Enhanced searching with Firefox":
Now this Web Accelerator release seems designed just to snag the IE crowd: whatever their reasons, they want to be able to shove the technology down everyone's throats, and Microsoft IE users were next in line.
Here's the clue on their web site that it's not an optimization technology but a bandwidth hog technology:
Dial-up users may not see much improvement, as Google Web Accelerator is currently optimized to speed up web page loading for broadband connections
Nothing new here, mostly a restatement of what's already been covered: it just doesn't add up to create a meaningless technology to solve a problem that doesn't exist [does anyone you know perceive broadband slowness as an issue?] while creating a bunch of new problems that didn't exist before. Even if your local broadband provider WAS slow due to overloading, this will just make it worse, not better.
So WHAT IS THE MOTIVATION for choking the net with more requests and slowing down access to some dynamic sites (now potentially overloaded, since you can't cache them and have them work properly), by the very nature of a technology designed to make things faster?
There must be more to the story that we're not privy to at this point, as it doesn't pass the sniff test.
[edited by: incrediBILL at 1:51 am (utc) on May 5, 2005]
mrMister, the one thing I'd say is that it's only been on Labs for a few hours, so folks may want to give it a little while before judging it or deciding what they think.
That's a fair enough statement.
I have been quick to jump to conclusions on some of the bandwidth usage issues.
However, I do think that the Webmaster should be given more control over which pages can be accessed by the app, especially during beta testing.
It's just impossible for the programmers to pre-empt every possible problem that could occur on any of the billions of web pages out there.
The webmaster of a site is in a much better position to gauge the problems that might occur on their site, and therefore they should have the final say on whether the caching and prefetching goes ahead or not.
There are a number of problematic issues that I now have to deal with regarding this pre-fetching on some of my sites. None of these would occur if the app obeyed my robots.txt files.
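If the app did consult robots.txt, honoring a webmaster's wishes would take very little code on the client side. Python's standard urllib.robotparser shows what a compliant prefetcher would have to do; the user-agent string, the sample rules, and the URLs here are placeholders for illustration:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# In a real client these lines would be fetched from the target site's
# /robots.txt; hard-coded here as a sample policy.
rp.parse([
    "User-agent: *",
    "Disallow: /cart/",
    "Disallow: /admin/",
])

# A compliant prefetcher would check every candidate URL before fetching it.
print(rp.can_fetch("ExamplePrefetcher", "http://example.com/article.html"))    # True
print(rp.can_fetch("ExamplePrefetcher", "http://example.com/cart/add?item=3"))  # False
```

A Disallow on shopping-basket and admin paths is exactly the kind of per-site knowledge the webmaster has and the prefetching client cannot guess.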
This may be the answer in itself:
>> Now this Web Accelerator release seems designed just to snag the IE crowd for whatever reason.
I would say 'whatever reason' could be to try to stay a step ahead of M, by getting people used to using this technology, before M decides to, for some odd reason, add MSN Search to IE browsers.
Mere speculation on my part.
[edited by: jd01 at 1:36 am (utc) on May 5, 2005]
Why did a Google employee apparently state that Google sends a Google UA for the prefetch when in fact it does not do this? Same sniff test issue here.
A few random questions:
- Are the cached pages the same as those displayed in the serps under the cache link?
- Does the meta noarchive tag have any influence?
- Are all pre-fetch requests still using the special header?
For the program itself:
- Is the accelerator cache in a state readable by other programs, i.e.:
- Is it in text or binary format?
- Can Google Desktop Search read it?
Are the cached pages the same as those displayed in the serps under the cache link?
Now you're pointing to something that borders on a real reason to make the technology compelling, which I had completely overlooked. If they are actually combining this pre-fetch technology with the regular search engine indexing technology, then pre-fetched pages could theoretically be updated in the SERPs based on user demand, keeping their index even more fresh in ... REAL-TIME.
I know I get indexed every 2 nights already so they seem to be getting closer IMO.
Could this possibly be a precursor to live interactive search engines with SERPs updated in real-time?
This MAY pass the sniff test if in fact this is the ultimate goal.
If they are actually combining this pre-fetch technology with the regular search engine indexing technology then pre-fetched pages could theoretically be updated in the SERPs based on user demand to keep their index even more fresh in ... REAL-TIME.
re-arranging 8 billion pages in a database in real time? I don't think even Google have the computing power to do that.
I'd have thought that the toolbar would be enough to be able to fairly accurately calculate the popularity of sites on the Internet.
Alexa seems to do an adequate job of this, and I suspect there are fewer Alexa toolbars in circulation than Google ones.
re-arranging 8 billion pages in a database in real time?
Maybe just the index for the keyword you used to get to the site - just a thought.
I believe that's what lexiv said in msg #26. I am not sure real time would be possible. They don't really need real time.
80 posts later it's hard to remember :)
I wasn't thinking instantaneous exactly, but anything less than a few days would be an improvement.
[edited by: incrediBILL at 3:25 am (utc) on May 5, 2005]
My sites struggle under their load as it is. Google requests that we do not send automated queries to them, because it increases their load without monetization and can cause them problems. It's just plain mean-spirited to unleash these unrequested requests on us.
Cry foul webmasters, it's not fair for Google to generate unrequested requests to our sites.
C'mon Google, do no evil. My site's speed is limited by load, not data transfer and this application will only contribute to my load.
Give me an easy way to opt out of prefetching, please. If you expect people to have the courtesy to respect Google's load issues, then don't unleash a DDoS on us.