
Forum Moderators: Robert Charlton & aakk9999 & andy langton & goodroi


Google Windows Web Accelerator

     
8:09 pm on May 4, 2005 (gmt 0)

Administrator from US 

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Sept 21, 1999
posts:38059
votes: 13


[webaccelerator.google.com...]


System Requirements
Operating System: Win XP or Win 2000 SP3+
Browser: IE 5.5+ or Firefox 1.0+
Availability: For users in North America and Europe (during beta testing phase)

Press Release:

Google Web
Accelerator significantly reduces the time that it takes broadband users to
download and view web pages. The Google Web Accelerator appears as a small
speedometer in the browser chrome with a cumulative "Time saved" indicator.

Here's how it works. A user downloads and installs the client and begins
browsing the web as she normally would. In the background, the Google Web
Accelerator employs a number of techniques to speed up the delivery of
content to users.

Looks like some of the Mozilla hires are paying dividends.

12:20 am on May 5, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 24, 2005
posts:965
votes: 0


I don't begrudge Google for creating this app. Some of the things coming out of their Research Labs have been very impressive.

However, I feel that they should have put something as potentially harmful as this through much more rigorous testing and public consultation before letting it loose on the Internet.

12:22 am on May 5, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 29, 2002
posts:1954
votes: 0


Benefit for the site owner: Automatic clicks on your site ads, automatic additions to your customers' shopping baskets!

Yes, but can it prefetch their credit card info and submit it later :)

Well, even though the comments in this thread are mostly, if not entirely, negative, I believe it gives Google some valuable info they may not have thought about. I guess they could have struck a silent deal with AOL for broadband and its users wouldn't have a clue ;)

12:23 am on May 5, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Mar 19, 2003
posts:804
votes: 0


I wonder what Amazing ones will think of it?
12:23 am on May 5, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Apr 4, 2005
posts:43
votes: 0


It's funny that my first post on this forum would imply I have cloaked pages. I'd rather say 'protected'...

Hello everybody; I'm already in love with this forum :-)

So Leoxiv said that:


if Page A (index 1) != Page A (index 2)

then Page A is CLOAKING

It's a splendid observation, but how on earth can they measure inequality as precisely as that? There might be just tiny differences...

12:23 am on May 5, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 24, 2005
posts:965
votes: 0


Just thought it bore mentioning, all that money Google spent on hiring only those with the highest education, the supposedly bestest and brightest

Yeah, but didn't they hire a Microsoft guy as well? ;-)

[edited by: mrMister at 1:02 am (utc) on May 5, 2005]

12:25 am on May 5, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 17, 2005
posts:88
votes: 0


Next experiment...

To make the accelerator work, your internet security program needs to approve the accelerator's connection to the internet.

Try removing the accelerator from your security program's firewall access rules. When you've done that, restart the accelerator and try to open a page - interesting error page, eh?

The Google Accelerator is an application that cooperates with your browser and fetches data for the requested page via a port other than 80 (HTTP).

Ergo, no proxy IP and no user agents.

12:43 am on May 5, 2005 (gmt 0)

Senior Member

joined:Dec 29, 2003
posts:5428
votes: 0


Personally: I installed it and saved some 4 seconds after browsing for a while. I decided that 1.5 Mb DSL is fast enough for me.

One tool that I love is their desktop search... you can control everything, and clean the history/cache anytime you want.

12:48 am on May 5, 2005 (gmt 0)

Full Member

10+ Year Member

joined:Oct 23, 2003
posts:314
votes: 0


OMG.
Let's review this system:

1. If 5% of internet users utilise this technology, then the web is slowed down because of the huge extra bandwidth required by this 5%.

Thus another 10% of users say, "the internet is so slow," and of course they look for ways to improve their speed. They therefore download the 'Web Accelerator'.

This compounds the speed issue, making it even slower for the other 85%.

This keeps going until the web accelerator has 100% market share. And of course, as the internet is slowed down so significantly, we all end up running no faster than now.

2. The web accelerator clicks on other PPC advertising, making the advertisers pay more and look for alternatives that are more cost effective, i.e. AdWords.

12:52 am on May 5, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Oct 8, 2001
posts:2882
votes: 0


I'll pass on feedback from this thread to the people who worked on this. I'll ask them to read the thread too, in case there are any issues they can tackle (e.g. someone mentioned an issue with cookies).
12:56 am on May 5, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 24, 2005
posts:965
votes: 0


That's a quick response from GoogleGuy. Thank you.

At least with Google, these issues get quick attention - which is a lot more than can be said for a lot of other big Internet companies.

1:04 am on May 5, 2005 (gmt 0)

Full Member

10+ Year Member

joined:June 18, 2004
posts:327
votes: 0


BlackTulip, measuring that inequality is relatively easy for Google; their Is_Near_Duplicate function would do the job. Basically, any 'healthy' page is a near-duplicate of itself.
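For illustration, near-duplicate detection is commonly done with word shingles and Jaccard similarity. A minimal sketch - the function name echoes the poster's hypothetical `Is_Near_Duplicate`, and the shingling approach and 0.7 threshold are illustrative assumptions, not Google's published method:

```python
def shingles(text, k=4):
    """Return the set of k-word shingles of a lowercased text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def is_near_duplicate(page_a, page_b, threshold=0.7):
    """True when the Jaccard similarity of the shingle sets meets the threshold."""
    a, b = shingles(page_a), shingles(page_b)
    if not a and not b:
        return True  # two empty pages are trivially duplicates
    return len(a & b) / len(a | b) >= threshold

# Any "healthy" page is a near-duplicate of itself, and tiny differences
# (the concern raised above) still score well above the threshold:
v1 = ("Widgets for sale here. We stock red widgets and blue widgets "
      "at great prices every day.")
v2 = ("Widgets for sale here. We stock red widgets and blue widgets "
      "at great prices every single day.")
```

Here `is_near_duplicate(v1, v1)` is trivially true, and `v1` vs `v2` scores a Jaccard similarity of 0.8, so the "tiny differences" BlackTulip worried about wouldn't trigger a cloaking flag, while a genuinely different page served to the crawler would.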
1:06 am on May 5, 2005 (gmt 0)

Administrator from US 

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 25, 2005
posts:14650
votes: 94


if 5% of internet users utilise this technology then the web is slowed down because of the huge extra bandwidth requirements by this 5%

Yes, you hit the nail on the head and win a stuffed animal of your choice.

Imagine if EVERYONE installs this mess and web usage, which may already be borderline, goes up 3x, 4x, or more within a few months! The bandwidth for most of the servers we host hits 80%+ for a couple of hours a day during peak times, and this technology would literally destroy access times during peak. Packets would be dropping all over the place, and the sheer economics of the situation don't justify adding more bandwidth to the network.

The only solution would be to block any pre-fetch technology from all servers.

Not pretty.

1:11 am on May 5, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member googleguy is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Oct 8, 2001
posts:2882
votes: 0


mrMister, the one thing I'd say is that it's only been on Labs for a few hours, so folks may want to give it a little while before judging it or deciding what they think. I pinged one of the people who worked on it to ask for more info.
1:25 am on May 5, 2005 (gmt 0)

Full Member

10+ Year Member

joined:Oct 23, 2003
posts:314
votes: 0


Googleguy,
The whole concept of this technology (unless I have it wrong) is pre-fetching.
Thus, as an owner of many sites with significant traffic, one of my greatest costs is bandwidth. This concept of pre-fetching has the side effect of greatly increasing my bandwidth costs.
This is the biggest concern for me.
If this product is adopted, then I would be forced to block any prefetching attempts.
1:25 am on May 5, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 17, 2005
posts:88
votes: 0


[webaccelerator.google.com...]

Check out the fifth paragraph - something on how to ID a prefetch request.
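The header in question is the Mozilla-style `X-moz: prefetch` request header, which Google Web Accelerator's prefetches were reported to carry. A minimal WSGI sketch of refusing such requests server-side - returning 403 is one policy choice, and the inner `site` app is an invented stand-in, not anything from Google's documentation:

```python
def block_prefetch(app):
    """WSGI middleware: refuse requests flagged as prefetches.

    Assumes the accelerator marks prefetch requests with an
    "X-moz: prefetch" header (the Mozilla prefetch convention);
    normal user-initiated requests pass through untouched.
    """
    def wrapper(environ, start_response):
        if environ.get("HTTP_X_MOZ", "").lower() == "prefetch":
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Prefetching disabled on this site.\n"]
        return app(environ, start_response)
    return wrapper

# Hypothetical inner application, for demonstration only:
def site(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html>normal page</html>"]

guarded = block_prefetch(site)
```

A prefetch request hits the 403 branch; a regular browser request falls through to the site unchanged.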

1:28 am on May 5, 2005 (gmt 0)

Administrator from US 

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 25, 2005
posts:14650
votes: 94


OK, my final ponderings on this issue:

What is their ultimate goal with pre-fetch?

Has anyone considered what reason Google could possibly have for unleashing this pre-fetch menace, other than the obvious comments earlier about tracking site access more closely?

First they announced enabling "Enhanced searching with Firefox":
[webmasterworld.com...]

Now this Web Accelerator release seems designed just to snag the IE crowd. For whatever reason, they want to be able to shove the technology down everyone's throats, and Microsoft IE users were next in line.

Here's the clue on their web site that it's not an optimization technology but a bandwidth hog technology:

Dial-up users may not see much improvement, as Google Web Accelerator is currently optimized to speed up web page loading for broadband connections

Nothing new here, mostly restatement of what's already been covered: it just doesn't add up to create meaningless technology to solve a problem that doesn't exist [anyone you know perceive broadband slowness as an issue?] and create a bunch of new problems that didn't exist before. Even if your local broadband provider WAS slow due to overloading, this will just make it worse, not better.

So WHAT IS THE MOTIVATION for choking the net with more requests and slowing down access to some dynamic sites (now potentially overloaded, as you can't cache them and make them work properly) by the very nature of a technology designed to make it faster?

There must be more to the story that we're not privy to at this point, as it doesn't pass the sniff test.

[edited by: incrediBILL at 1:51 am (utc) on May 5, 2005]

1:29 am on May 5, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 24, 2005
posts:965
votes: 0


mrMister, the one thing I'd say is that it's only been on Labs for a few hours, so folks may want to give it a little while before judging it or deciding what they think.

That's a fair enough statement.

I have been quick to jump to conclusions on some of the bandwidth usage issues.

However, I do think that webmasters should be given more control over which pages can be accessed by the app, especially during beta testing.

It's just impossible for the programmers to pre-empt every possible problem that could occur on any of the billions of web pages out there.

The webmaster of the site is in a much better position to gauge the possible problems that might occur on their site, and therefore they should have the final say on whether the caching and prefetching goes ahead.

There are a number of problematic issues that I now have to deal with regarding this pre-fetching on some of my sites. None of these would occur if the app obeyed my robots.txt files.

1:34 am on May 5, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 9, 2005
posts:1509
votes: 0


IncrediBILL,

This may be the answer in itself:

>> Now this Web Accellerator release seems designed just to snag the IE crowd for whatever reason.

I would say 'whatever reason' could be trying to stay a step ahead of M, by getting people used to this technology before M decides, for some odd reason, to add MSN Search to IE browsers.

Mere speculation on my part.

Justin

[edited by: jd01 at 1:36 am (utc) on May 5, 2005]

1:35 am on May 5, 2005 (gmt 0)

Full Member

10+ Year Member

joined:Oct 23, 2003
posts:314
votes: 0


[webaccelerator.google.com...]

6. Will Google Web Accelerator affect traffic for ads on my site?
No. We do not prefetch ads

Um, how does Google know what an ad is compared to a link?

1:48 am on May 5, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Sept 17, 2002
posts:2251
votes: 0


I'm still trying to figure out why Google created a special tag to let you opt in when there apparently isn't a way to opt out. This also doesn't pass the sniff test.

Why did a Google employee apparently state that Google sends a Google UA for the prefetch when in fact it does not? Same sniff-test issue here.

1:48 am on May 5, 2005 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member encyclo is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Aug 31, 2003
posts:9068
votes: 4


I don't do Windows, so I can't test this. It looks like a transparent proxy server running on port 9100. From the help description, it pushes all your HTTP requests through Google servers, which use a combination of cached pages and pre-fetching combined with data compression (gzip?) to help avoid delays. Much as I enjoy a good lynching of an evening, at first glance there's not much going on here that could be seen as sinister, other than the same pre-fetch debate already going on for the last couple of weeks.

A few random questions:

- Are the cached pages the same as those displayed in the serps under the cache link?
- Does the meta noarchive tag have any influence?
- Are all pre-fetch requests still using the special header?

For the program itself:

- Is the accelerator cache in a state readable by other programs? i.e.:
- Is it in text or binary format?
- Can Google Desktop Search read it?
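The compression half of encyclo's description is easy to demonstrate: typical HTML is repetitive markup, so gzip alone can recover much of the transfer time even before caching or prefetching enter the picture. A quick sketch with an invented sample page:

```python
import gzip

# An invented, highly repetitive page: 200 identical rows of markup.
html = ("<html><body>"
        + "<div class='row'><a href='/item'>widget</a></div>" * 200
        + "</body></html>").encode()

compressed = gzip.compress(html)

# The proxy would send the compressed bytes; the client inflates them.
# Repetitive markup typically compresses to a small fraction of its size.
ratio = len(compressed) / len(html)
```

Real pages compress less dramatically than this contrived example, but text-heavy markup still routinely shrinks severalfold.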

2:03 am on May 5, 2005 (gmt 0)

Administrator from US 

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 25, 2005
posts:14650
votes: 94


encyclo said:
Are the cached pages the same as those displayed in the serps under the cache link?

AH HA!

Now you're pointing to something that could border on a real reason to make the technology compelling, one I had completely overlooked. If they are actually combining this pre-fetch technology with the regular search engine indexing technology, then pre-fetched pages could theoretically be updated in the SERPs based on user demand, keeping their index even more fresh in... REAL TIME.

I know I get indexed every 2 nights already, so they seem to be getting closer, IMO.

Could this possibly be a precursor to live interactive search engines with SERPs updated in real-time?

This MAY pass the sniff test if in fact this is the ultimate goal.

2:14 am on May 5, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:May 10, 2004
posts:80
votes: 0


I'd assume this would pretty much negate any CPM-based advertising, unless I'm missing something.
2:16 am on May 5, 2005 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 24, 2005
posts:965
votes: 0


If they are actually combining this pre-fetch technology with the regular search engine indexing technology then pre-fetched pages could theoretically be updated in the SERPs based on user demand to keep their index even more fresh in ... REAL-TIME.

Re-arranging 8 billion pages in a database in real time? I don't think even Google has the computing power to do that.

I'd have thought that the toolbar would be enough to calculate the popularity of sites on the Internet fairly accurately.

Alexa seems to do an adequate job of this, and I suspect there are fewer Alexa toolbars in circulation than Google ones.

2:46 am on May 5, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Apr 4, 2005
posts:43
votes: 0


If they are actually combining this pre-fetch technology with the regular search engine indexing technology

I believe that's what lexiv said in msg #26. I am not sure if real time would be possible. They don't really need real time.

2:48 am on May 5, 2005 (gmt 0)

New User

10+ Year Member

joined:Apr 13, 2005
posts:18
votes: 0


Not sure if this has already been said, but did you notice that it's caching AdSense?
I installed it and started seeing PSAs on my sites all day long, and sometimes some weird foreign-language ads.
Removed it, and surprise, everything is back to normal!

Anyone else noticed it?

3:04 am on May 5, 2005 (gmt 0)

Administrator from US 

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Jan 25, 2005
posts:14650
votes: 94


re-arranging 8 billion pages in a database in real time?

Maybe just the index for the keyword you used to get to the site - just a thought.

I believe thats what lexiv said in msg #26. I am not sure if real time would be possible. They dont need real time really.

80 posts later it's hard to remember :)

I wasn't thinking instantaneous exactly, but anything less than a few days would be an improvement.

[edited by: incrediBILL at 3:25 am (utc) on May 5, 2005]

3:14 am on May 5, 2005 (gmt 0)

Full Member

10+ Year Member

joined:June 18, 2004
posts:327
votes: 0


Yes BlackTulip, but I didn't mention the real-time thing.
4:47 am on May 5, 2005 (gmt 0)

Junior Member

10+ Year Member

joined:Apr 1, 2004
posts:51
votes: 0


Ouch!

My sites struggle under their load as it is. Google requests that we not send automated queries to them, as doing so increases their load without monetization and can cause them problems. It's just plain mean-spirited to unleash these unrequested requests on us.

Cry foul, webmasters - it's not fair for Google to generate unrequested requests to our sites.

C'mon Google, do no evil. My sites' speed is limited by load, not data transfer, and this application will only add to my load.

Give me an easy way to opt out of prefetching, please. If you expect people to have the courtesy to respect Google's load issues, then don't unleash a DDoS on us.

5:01 am on May 5, 2005 (gmt 0)

Preferred Member

10+ Year Member

joined:June 29, 2003
posts:363
votes: 0


There is some good help for webmasters at:
[webaccelerator.google.com...]

It addresses issues like page fetching, advertising, usage stats, etc.

Enjoy.

This 476-message thread spans 16 pages.