| 12:20 am on May 5, 2005 (gmt 0)|
I don't begrudge Google for creating this app. Some of the things coming out of their Research Labs have been very impressive.
However, I feel that they should have put something as potentially harmful as this through a lot more rigorous testing and public consultation before letting it loose on the Internet.
| 12:22 am on May 5, 2005 (gmt 0)|
|Benefit for the site owner: Automatic clicks on your site ads, automatic additions to your customers' shopping baskets! |
Yes, but can it prefetch their credit card info and submit it later? :)
Well, even though the comments in this thread are mostly, if not all, negative, I believe it gives Google some valuable info they may or may not have thought about. I guess they could have struck a silent deal with AOL for broadband and its users wouldn't have a clue ;)
| 12:23 am on May 5, 2005 (gmt 0)|
I wonder what Amazing ones will think of it?
| 12:23 am on May 5, 2005 (gmt 0)|
It's funny that my first post on this forum would imply I have cloaked pages. I'd rather say 'protected'...
Hello everybody - I'm already in love with this forum :-)
So Leoxiv said that
if Page A (index 1) != Page A (index 2)
then Page A is CLOAKING
It's a splendid observation, but how on earth can they measure inequality that precisely?! There might be just tiny differences...
| 12:23 am on May 5, 2005 (gmt 0)|
|Just thought it beared mentioning all that money Google spent on hiring only those with the highest education, the supposedly bestest and brightest |
Yeah, but didn't they hire a Microsoft guy as well? ;-)
[edited by: mrMister at 1:02 am (utc) on May 5, 2005]
| 12:25 am on May 5, 2005 (gmt 0)|
To make the accelerator work, your internet security program needs your approval to connect it to the internet.
Try removing the accelerator from your security program's firewall internet access list. When you've done that, restart the accelerator and try to open a page - interesting error page, eh?
The Google Accelerator is an application that cooperates with your browser and fetches the requested page's data via a port other than 80 (HTTP).
Ergo, no proxy IP and no user agents.
| 12:43 am on May 5, 2005 (gmt 0)|
Personally: I installed it and saved some 4 seconds after browsing for a while. Decided that 1.5 MB DSL is fast enough for me.
One tool that I love is their desktop search...you can control everything, and clean the history /cache anytime you want.
| 12:48 am on May 5, 2005 (gmt 0)|
Let's review this system:
1. If 5% of internet users utilise this technology, then the web is slowed down because of the huge extra bandwidth requirements of this 5%.
Thus another 10% of users say "the internet is so slow," and of course they look for how to improve their speed. They therefore download the Web Accelerator.
This compounds the speed issue, making it even slower for the other 85%.
This keeps going until the Web Accelerator has 100% market share. And of course, as the internet has been slowed down so significantly, we all end up running no faster than we do now.
2. The Web Accelerator clicks on other PPC advertising, making those advertisers pay more and look for alternatives that are more cost-effective, i.e. AdWords.
| 12:52 am on May 5, 2005 (gmt 0)|
I'll pass on feedback from this thread to the people that worked on this. I'll ask them to read the thread too, in case there are any issues that they can tackle (e.g. someone mentioned an issue with cookies..)
| 12:56 am on May 5, 2005 (gmt 0)|
That's a quick response from GoogleGuy. Thank you.
At least with Google, these issues get quick attention - which is a lot more than can be said for a lot of other big Internet companies.
| 1:04 am on May 5, 2005 (gmt 0)|
BlackTulip, measuring that inequality is relatively easy for Google; their Is_Near_Duplicate function would do the job. Basically, any 'healthy' page is a near-duplicate of itself.
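Google's actual Is_Near_Duplicate function isn't public, but one common way to measure "near enough" is to compare word shingles with a Jaccard similarity threshold, so tiny differences (a date stamp, a rotating ad slot) don't trip the check. A minimal sketch of that idea - function names and the threshold are illustrative, not Google's:

```python
def shingles(text, k=4):
    """Break text into overlapping word k-grams (shingles)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def is_near_duplicate(a, b, threshold=0.9):
    """Two texts are near-duplicates if their shingle sets mostly overlap
    (Jaccard similarity at or above the threshold)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return a == b
    return len(sa & sb) / len(sa | sb) >= threshold
```

Fetch the same URL twice (once as a crawler, once as a user) and a page that fails `is_near_duplicate` against itself is a cloaking candidate, while pages with only tiny differences still pass.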
| 1:06 am on May 5, 2005 (gmt 0)|
|if 5% of internet users utilise this technology then the web is slowed down because of the huge extra bandwidth requirements by this 5% |
Yes, you hit the nail on the head and win a stuffed animal of your choice.
Imagine if EVERYONE installs this mess and web usage, which may already be borderline, goes up 3x, 4x, or more within a few months! The bandwidth for most of the servers we host hits 80%+ for a couple of hours a day during peak times, and this technology would literally destroy access times during peak. Packets would be dropping all over the place, and the sheer economics of the situation don't justify adding more bandwidth to the network.
The only solution would be to block any pre-fetch technology from all servers.
| 1:11 am on May 5, 2005 (gmt 0)|
mrMister, the one thing I'd say is that it's only been on Labs for a few hours, so folks may want to give it a little while before judging it or deciding what they think. I pinged one of the people who worked on it to ask for more info.
| 1:25 am on May 5, 2005 (gmt 0)|
The whole concept of this technology (unless I have it wrong) is pre-fetching.
As an owner of many sites with significant traffic, one of my greatest costs is bandwidth. This pre-fetching has the side effect of greatly increasing my bandwidth costs.
This is the biggest concern for me.
If this product is adopted, I will be forced to block any prefetching attempts.
| 1:25 am on May 5, 2005 (gmt 0)|
Check out the fifth paragraph - something on how to ID a prefetch request.
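For those wanting to block prefetches server-side: reports at the time were that these prefetch requests carry an `X-moz: prefetch` header. Assuming that header really is present, a server could simply refuse such requests with a 403. A minimal WSGI sketch (untested against the actual accelerator; the header name is an assumption from those reports):

```python
def block_prefetch(app):
    """WSGI middleware: return 403 Forbidden for requests flagged as prefetches
    via an 'X-moz: prefetch' request header; pass everything else through."""
    def middleware(environ, start_response):
        if environ.get("HTTP_X_MOZ", "").lower() == "prefetch":
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Prefetch requests are not served here."]
        return app(environ, start_response)
    return middleware
```

Normal browser requests are unaffected; only requests carrying the prefetch header get the 403, which saves the bandwidth of serving the full page.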
| 1:28 am on May 5, 2005 (gmt 0)|
OK, my final ponderings on this issue:
What is their ultimate goal with pre-fetch?
Has anyone considered what reason Google could possibly have for unleashing this pre-fetch menace, other than (as suggested earlier) tracking site access more closely?
First they announced enabling "Enhanced searching with Firefox":
Now this Web Accelerator release seems designed to snag the IE crowd. For whatever reason, they want to be able to push this technology on everyone, and Microsoft IE users were next in line.
Here's the clue on their web site that it's not an optimization technology but a bandwidth hog technology:
|Dial-up users may not see much improvement, as Google Web Accelerator is currently optimized to speed up web page loading for broadband connections |
Nothing new here, mostly a restatement of what's already been covered: it just doesn't add up to create a meaningless technology to solve a problem that doesn't exist (does anyone you know perceive broadband slowness as an issue?) while creating a bunch of new problems that didn't exist before. Even if your local broadband provider WAS slow due to overloading, this will just make it worse, not better.
So WHAT IS THE MOTIVATION for choking the net with more requests and slowing down access to some dynamic sites (now potentially overloaded, since you can't cache them and make them work properly), all by the very nature of a technology designed to make things faster?
There must be more to the story that we're not privy to at this point, as it doesn't pass the sniff test.
[edited by: incrediBILL at 1:51 am (utc) on May 5, 2005]
| 1:29 am on May 5, 2005 (gmt 0)|
|mrMister, the one thing I'd say is that it's only been on Labs for a few hours, so folks may want to give it a little while before judging it or deciding what they think. |
That's a fair enough statement.
I have been quick to jump to conclusions on some of the bandwidth usage issues.
However, I do think that the Webmaster should be given more control over which pages can be accessed by the app, especially during beta testing.
It's just impossible for the programmers to pre-empt every possible problem that could occur on any of the billions of web pages out there.
The webmaster of the site is in a much better position to gauge the possible problems that might occur on their site, and therefore they should have the final say on whether the caching and prefetching go ahead or not.
There are a number of problematic issues that I now have to deal with regarding this pre-fetching on some of my sites. None of these would occur if the app obeyed my robots.txt files.
| 1:34 am on May 5, 2005 (gmt 0)|
This may be the answer in itself:
>> Now this Web Accellerator release seems designed just to snag the IE crowd for whatever reason.
I would say 'whatever reason' could be to try to stay a step ahead of M by getting people used to this technology before M decides, for some odd reason, to add MSN Search to IE browsers.
Mere speculation on my part.
[edited by: jd01 at 1:36 am (utc) on May 5, 2005]
| 1:35 am on May 5, 2005 (gmt 0)|
|6. Will Google Web Accelerator affect traffic for ads on my site? |
No. We do not prefetch ads
Um, how does Google know what an ad is compared to a link?
| 1:48 am on May 5, 2005 (gmt 0)|
I'm still trying to figure out why Google created a special tag to let you opt in when there apparently isn't a way to opt out. This also doesn't pass the sniff test.
Why did a Google employee apparently state that Google sends a Google UA for the prefetch when in fact it does not? Same sniff test issue here.
| 1:48 am on May 5, 2005 (gmt 0)|
I don't do Windows, so I can't test this. It looks like a transparent proxy server running on port 9100. From the help description, it pushes all your HTTP requests through Google servers, which use a combination of cached pages and pre-fetching, combined with data compression (gzip?), to help avoid delays. Much as I enjoy a good lynching of an evening, at first glance there's not much going on here that could be seen as sinister, other than the same pre-fetch debate that's been going on for the last couple of weeks.
A few random questions:
- Are the cached pages the same as those displayed in the serps under the cache link?
- Does the meta noarchive tag have any influence?
- Are all pre-fetch requests still using the special header?
For the program itself:
- Is the accelerator cache readable by other programs? i.e.
- Is it in text or binary format?
- Can Google Desktop Search read it?
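One of the questions above - whether the accelerator really is a local proxy listening on port 9100 - is easy to check from a script, even without digging into the program itself. A quick sketch (the host and port are assumptions taken from the post above):

```python
import socket

def proxy_listening(host="127.0.0.1", port=9100, timeout=0.5):
    """Return True if something is accepting TCP connections on host:port,
    e.g. the accelerator's reported local proxy on port 9100."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Run it with the accelerator installed and again after uninstalling; if the port stops answering, that supports the transparent-proxy theory.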
| 2:03 am on May 5, 2005 (gmt 0)|
|Are the cached pages the same as those displayed in the serps under the cache link? |
Now you're pointing to something that could be bordering on a real reason to make this technology compelling - something I had completely overlooked. If they are actually combining this pre-fetch technology with the regular search engine indexing technology, then pre-fetched pages could theoretically be updated in the SERPs based on user demand, keeping their index even more fresh in... REAL-TIME.
I know I get indexed every 2 nights already so they seem to be getting closer IMO.
Could this possibly be a precursor to live interactive search engines with SERPs updated in real-time?
This MAY pass the sniff test if in fact this is the ultimate goal.
| 2:14 am on May 5, 2005 (gmt 0)|
I'd assume this would pretty much negate any CPM based advertising, unless I'm missing something.
| 2:16 am on May 5, 2005 (gmt 0)|
|If they are actually combining this pre-fetch technology with the regular search engine indexing technology then pre-fetched pages could theoretically be updated in the SERPs based on user demand to keep their index even more fresh in ... REAL-TIME. |
Re-arranging 8 billion pages in a database in real time? I don't think even Google has the computing power to do that.
I'd have thought that the toolbar would be enough to fairly accurately calculate the popularity of sites on the Internet.
Alexa seems to do an adequate job of this, and I suspect there are fewer Alexa toolbars in circulation than Google ones.
| 2:46 am on May 5, 2005 (gmt 0)|
|If they are actually combining this pre-fetch technology with the regular search engine indexing technology |
I believe that's what lexiv said in msg #26. I am not sure if real time would be possible. They don't really need real time.
| 2:48 am on May 5, 2005 (gmt 0)|
Not sure if this has already been said, but did you notice that it's caching AdSense?
I installed it and started seeing PSAs on my sites all day long, and sometimes some weird foreign-language ads.
Removed it, and surprise, everything is back to normal!
Anyone else noticed it?
| 3:04 am on May 5, 2005 (gmt 0)|
|re-arranging 8 billion pages in a database in real time? |
Maybe just the index for the keyword you used to get to the site - just a thought.
|I believe thats what lexiv said in msg #26. I am not sure if real time would be possible. They dont need real time really. |
80 posts later it's hard to remember :)
I wasn't thinking instantaneous exactly, but anything less than a few days would be an improvement.
[edited by: incrediBILL at 3:25 am (utc) on May 5, 2005]
| 3:14 am on May 5, 2005 (gmt 0)|
Yes BlackTulip, but I didn't mention the real-time thing.
|Craven de Kere|
| 4:47 am on May 5, 2005 (gmt 0)|
My sites struggle under their load as it is. Google requests that we not send automated queries to them, as doing so increases their load without monetization and can cause them problems. It's just plain mean-spirited to unleash these unrequested requests on us.
Cry foul webmasters, it's not fair for Google to generate unrequested requests to our sites.
C'mon Google, do no evil. My site's speed is limited by load, not data transfer and this application will only contribute to my load.
Give me an easy way to opt out of prefetching, please. If you expect people to have the courtesy to respect Google's load issues, then don't unleash a DDoS on us.
| 5:01 am on May 5, 2005 (gmt 0)|
There is some good help for webmasters at:
It addresses issues like Page fetching, Advertising, usage, stats etc.
| 5:24 am on May 5, 2005 (gmt 0)|
I assume that if we block Google from caching our pages, our sites are immune from this?
Email, blog, guestbook and formmail spammers are going to absolutely love it.
Immediately blocked from all the sites on my servers. No need, no way.
Google you are getting carried away with yourselves.
btw, AOL has been using a proxy cache for years and it's the first thing users want to bypass.
Why would any broadband user want a 1%-5% speed improvement?