Home / Forums Index / Google / Google SEO News and Discussion
Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

This 476 message thread spans 16 pages; this is page 5.
Google Windows Web Accelerator
Brett_Tabke




msg:736302
 8:09 pm on May 4, 2005 (gmt 0)

[webaccelerator.google.com...]


System Requirements
Operating System: Win XP or Win 2000 SP3+
Browser: IE 5.5+ or Firefox 1.0+
Availability: For users in North America and Europe (during beta testing phase)

Press Release:

Google Web
Accelerator significantly reduces the time that it takes broadband users to
download and view web pages. The Google Web Accelerator appears as a small
speedometer in the browser chrome with a cumulative "Time saved" indicator.

Here's how it works. A user downloads and installs the client and begins
browsing the web as she normally would. In the background, the Google Web
Accelerator employs a number of techniques to speed up the delivery of
content to users.

Looks like some of the Mozilla hires are paying dividends.

 

walkman




msg:736422
 2:24 pm on May 5, 2005 (gmt 0)

"Agreed. how much it uses is open for debate."

For a site like WebmasterWorld it's probably a lot. I have a 1000GB limit on mine and don't even come close to reaching it, so it doesn't bother me, but I could see other sites being upset.

As far as spyware goes: sadly, I think we lost the war. Between the toolbar, this, and other future products, they'll know your every move.

Brett_Tabke




msg:736423
 2:27 pm on May 5, 2005 (gmt 0)

> for a site like WebmasterWorld it's probably a lot

It is ZERO. We do not allow prefetching.

RewriteCond %{HTTP:X-moz} ^prefetch
RewriteRule ^.* - [F]

That is the only extra bandwidth at issue. Everything else is just normal surfing.
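The same header check can be sketched outside of Apache as well. The following Python is only an illustration (the function names are made up here; the only thing taken from the rules above is the "X-moz: prefetch" request header):

```python
# A minimal sketch (illustrative names, not from the thread): reject
# requests that carry the "X-moz: prefetch" header, mirroring the
# .htaccess rules above.

def is_prefetch(headers):
    """True when the request carries an X-moz header starting with 'prefetch'."""
    # HTTP header names are case-insensitive, so normalize before looking up.
    normalized = {name.lower(): value for name, value in headers.items()}
    return normalized.get("x-moz", "").lower().startswith("prefetch")

def handle(headers):
    """Return the HTTP status to send: 403 Forbidden for prefetch, 200 otherwise."""
    return 403 if is_prefetch(headers) else 200
```

Note the case-insensitive lookup: a client is free to send the header as "X-Moz" or "x-moz", so matching on an exact spelling would miss some requests.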

The Contractor




msg:736424
 2:32 pm on May 5, 2005 (gmt 0)

> There's a lot of hype and misinformation here.

Ok, so could you please tell me a benefit to the user or site owner that isn't at the expense of the site owners?

> It is the cloakers who are screaming loudest today, mostly about the (undoubtedly important) privacy concerns, but their real fear is coming from the massive breach in their technology that the accelerator is bringing.

Nope, not me. I don't cloak anything, and I fail to see how anyone could put a positive spin on this product.

This should really get interesting when people try to put a positive spin on it... sometimes you have to call it as you see it. I'm sure Google's intentions may be good, in an effort to get users to provide them with more data, but how it's going about it is wrong. It costs others money and breaks other sites (cookies), period.

Brett_Tabke




msg:736425
 2:36 pm on May 5, 2005 (gmt 0)

> I'm sure Google's intentions may be good

For their stockholders, yes. I don't think we know their ultimate intentions with this product. (My personal side thought is that this is the foundation for a Google ISP.)

> tell me a benefit to the user

The speed-up of Moz/IE (which need all the help they can get in that dept).

> or site owner

None that I can see, other than allowing users another method of getting to your site.

The big issue I see is in all the geotargeting that people are doing today. If everything goes through a single set of IPs at Google, how do we target LOCAL?

gaouzief




msg:736426
 2:37 pm on May 5, 2005 (gmt 0)

> It is the cloakers who are screaming loudest today, mostly about the (undoubtedly important) privacy concerns, but their real fear is coming from the massive breach in their technology that the accelerator is bringing

Cloaking is not the issue here. The issue, in my opinion, is about a broader Google vision towards web users.

Google used to distinguish itself among the other internet giants by offering genuine, transparent services to users, starting with web search, news aggregation...

Now this Web Accelerator thing is truly a move away from Google's traditional way of doing things.

Why would broadband users need to accelerate page access by a few seconds anyway? This is junk.

The true answer to the question "why only broadband?" is that the Web Accelerator needs large bandwidth to give you the impression that it is actually delivering a speed improvement. What a joke.

Looking at the "Webmaster Guidelines"
[webaccelerator.google.com...] I had a feeling that this page could well belong on MSDN:

Advertising links:
How on earth could it securely identify every available type of ad link out there? No way.

Web apps:
How on earth could we stop it from messing with our CMSs and e-shops that are not under HTTPS? No way.

In conclusion, this "freeware" will not offer any reasonably positive service to users. It creates a mess on the web and takes advantage of users' computer processing power and bandwidth, as well as their private web usage information, for Google's data-mining needs!

What a great piece of software!

queritor




msg:736427
 2:40 pm on May 5, 2005 (gmt 0)

Everybody seems to be assuming Google will pre-fetch anchor links, i.e. links in the <a> tag.

If they adhere to the Mozilla specs then they will only fetch links specified within the <link> tag. These must be explicitly set in the header of the document and contain a prefetch attribute.

The fact that these tags aren't in wide use may be the reason that people are seeing negligible performance increases when testing the accelerator.

[edited by: queritor at 2:43 pm (utc) on May 5, 2005]
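If the accelerator really does follow the Mozilla spec, only explicitly declared targets qualify. A rough Python sketch of that rule (class and function names here are illustrative; the behaviour — honoring `<link rel="prefetch" href="...">` but not ordinary `<a>` links — is the Mozilla-spec behaviour queritor describes):

```python
from html.parser import HTMLParser

# Sketch: collect only URLs declared via <link rel="prefetch" href="...">,
# ignoring ordinary anchor links, per the Mozilla prefetching spec.

class PrefetchLinkFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.targets = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        rel_tokens = attr_map.get("rel", "").lower().split()
        if tag == "link" and "prefetch" in rel_tokens and attr_map.get("href"):
            self.targets.append(attr_map["href"])

def prefetch_targets(html):
    """Return the list of URLs a spec-following client may prefetch."""
    finder = PrefetchLinkFinder()
    finder.feed(html)
    return finder.targets
```

Run against a typical page with no such tags, this returns an empty list — consistent with queritor's point that sparse adoption of the tag would explain the negligible speed-ups people are measuring.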

Brett_Tabke




msg:736428
 2:43 pm on May 5, 2005 (gmt 0)

I don't think we know what Google is using for the prefetch algo in the proxy, queritor. Just because Moz does it one way doesn't mean they did it the same way in the IE proxy/accelerator.

We really need to figure that part out quickly. It may be much more aggressive at prefetching (spidering), than we think it is...

The Contractor




msg:736429
 2:48 pm on May 5, 2005 (gmt 0)

> The speed-up of Moz/IE (which need all the help they can get in that dept)

I used it for over an hour last night and saved .5 seconds. I wouldn't care if it saved me a whole 15 minutes (which would require it to be 1800X more efficient than it is now). Breaking sites and/or speeding things up at the expense of other site owners isn't worth it to me. If you have slow broadband, there are other providers out there.

I have no qualms about blocking it and telling visitors why they were blocked. It's no different than presenting a user with an error like "You must have cookies enabled to shop this site" for a cookie-based shopping cart, or a login error on a site that requires cookies.

The Contractor




msg:736430
 2:56 pm on May 5, 2005 (gmt 0)

> (my personal side thought is that this is the foundation for a Google ISP)

There you go. I wouldn't blame them if they did, either. I'm not against them being creative or creating tools, even though I don't kid myself about the reasons behind them. Nothing wrong with that. The problem I have is when they do it at the expense of others, with no respect for the site owners.

claus




msg:736431
 3:04 pm on May 5, 2005 (gmt 0)

>> Google has been doing this for years, and the accelerator is just another step down that road

encyclo, I'm usually not very negative about new products from Google - normally I can see that they're at least a bit useful, and I am also normally one of the first to accept the tradeoff between privacy and benefits.

In this case, I maintain my view that this is the single worst product ever to come from Google, and that includes Autolink, Search History, Prefetch, and all the other controversial stuff they've launched recently.

This "product" is nothing more than a very lame [1] excuse for:
- getting to measure the whereabouts of the users,
- proxying the "useful parts of the net",
- establishing a new kind of "trusted links", and de facto,
- a new Google-only division of the internet.

Now, what about the users on my sites
- Will that become bots only, as everything is cached?
- Do I want that?
- Do I trust that my site will always look the same through the Google proxy as it will on my server?
- Can I control this in any way?
- How will I know how many users I have, and/or
- What their behaviour is on my sites?
- Will my advertisers trust my figures?
- Can I?

You should give it some extra thought. There's a whole chain of events in how this little thing operates, and what happens as/if it gets widely adopted - I only mentioned a few issues above. This is definitely not as harmless as it seems at first look.

This is a pest, and a serious one at that. Yes, those are very hard words. Ban it now, or become sorry later.

And no, it's not about cloaking at all. Not for me at least. I've got better things to do than making a few web pages look different to some bot. However, I will not hesitate to ban a bot (even GB) from access if I feel it's not giving value to my sites, to the internet in general, or to my user base. If it's a piece of cr*p then it's out and it can't get in. Like that.


[1] Yes, the concept of broadband acceleration is lame. Even if it's written on a Google web page.

claus




msg:736432
 3:15 pm on May 5, 2005 (gmt 0)

>> Why?

Imho, this is just another way of rating/ranking sites. See the recent history patent threads for discussion.

---
Added: I see quite a few posts have been added after encyclo's, which was the last one when I wrote the above.

Brett_Tabke




msg:736433
 3:29 pm on May 5, 2005 (gmt 0)

I've been using it and studying the prefetch/precache mechanism this morning.

I am disturbed by what I am seeing.

It appears that if you simply mouse over a link, it caches that link. It happens on about 9 out of 10 links I am trying. It seems that if it is in a SMALL font, or a CGI URL, it is NOT cached. If it is a static URL in a normal font, it is cached.

Please correct me if that is wrong.

walkman




msg:736434
 3:31 pm on May 5, 2005 (gmt 0)

I doubt this is for a Google ISP. Margins are too low, and they'd have to deal with (sorry to say, but it's the truth in many cases) people who need help checking whether the modem, or even the computer, is plugged in. Google can make more from one user in just one click than AOL makes in a month. Some said Google was offering domains too...

IMO, this is to see where people go, and that can be used to rank sites and/or sell the information to Kraft, Pepsi, etc., a year or two from now. A global warehouse of all kinds of data.

dauction




msg:736435
 3:31 pm on May 5, 2005 (gmt 0)

The good thing is you can now visit any forums you have been banned from ;) (even for the masses that couldn't have figured out how to proxy surf before)

"Do No Evil"

MatthewHSE




msg:736436
 3:36 pm on May 5, 2005 (gmt 0)

I haven't taken the time to read this entire thread carefully, but has anyone tried playing with the prefetching preferences? For instance, if you disable prefetching completely, is there any speed increase?

I have noticed an incredible speed boost in IE, though it's still not up to a properly-tweaked Firefox install. Firefox is showing far less improvement, probably because I already have pipelining and other preferences set to the point that pages download pretty near instantly anyway.
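For anyone wanting to experiment with those preferences, these are the Firefox about:config entries being referred to — assuming the standard pref names of that era (a user.js fragment; verify the names in your own build before relying on them):

```
// Hypothetical user.js fragment - check these pref names in about:config.
user_pref("network.prefetch-next", false);           // disable <link rel="prefetch"> handling
user_pref("network.http.pipelining", true);          // the pipelining tweak mentioned above
user_pref("network.http.pipelining.maxrequests", 8); // requests per pipelined connection
```

Toggling network.prefetch-next off while leaving the accelerator installed is one way to test how much of any speed gain comes from prefetching versus Google's proxy caching and compression.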

davidgautier




msg:736437
 4:10 pm on May 5, 2005 (gmt 0)

Brett_Tabke :
"Spyware? Agreed, but that is not new - it is just *more* of the same thing they have with the toolbar and all their sources of data now.
Proxy? So what?"

Right on, this is all it is: SPYWARE.
And now, go trust Google. TrustRank? Maybe SpywareRank? IMHO they just blew the biggest advantage they had over other engines: their image.

jomaxx




msg:736438
 4:14 pm on May 5, 2005 (gmt 0)

For anybody who is blocking this by IP, it was previously reported that the prefetching was coming from 72.14.192.*. My testing of the accelerator showed prefetches coming from 64.233.172.18.

Before I change my .htaccess, I'd be very interested to know (a) if any 404s or 403s I send might end up affecting Google's search results (e.g. pages removed from index), and (b) if this could lead to a loss of functionality for end users.
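Rather than hand-maintaining per-octet regexes as new source addresses turn up, the reported sources can be checked against CIDR blocks. A Python sketch (the /20 is exactly the 72.14.192.0 - 72.14.207.255 range discussed earlier; the /23 is my assumption, chosen only to cover the 64.233.172.* and 64.233.173.* addresses reported in this thread):

```python
import ipaddress

# Sketch: check reported prefetch source addresses against CIDR blocks
# instead of per-octet regexes. Ranges are those mentioned in the thread;
# the /23 is an assumption covering the observed 64.233.172-173 addresses.
PREFETCH_NETS = [
    ipaddress.ip_network("72.14.192.0/20"),   # 72.14.192.0 - 72.14.207.255
    ipaddress.ip_network("64.233.172.0/23"),  # 64.233.172.* and 64.233.173.*
]

def from_prefetch_range(ip):
    """True if the address falls inside one of the reported source blocks."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in PREFETCH_NETS)
```

This is handy for log analysis; for blocking in .htaccess you would still translate the blocks back into regex or use Apache's own Deny/Allow by CIDR.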

The Contractor




msg:736439
 4:15 pm on May 5, 2005 (gmt 0)

> It appears that if you simply mouse over a link, it caches that link. It happens on about 9 out of 10 links I am trying. It seems that if it is in a SMALL font, or a CGI URL, it is NOT cached. If it is a static URL in a normal font, it is cached.
>
> Please correct me if that is wrong.

Nope, you are exactly correct.

claus




msg:736440
 4:18 pm on May 5, 2005 (gmt 0)

>> the complete (and most compact) required code to block this thing

If you wish to add the prefetch check as well, you can do it like this:

-------------------------------------
# google proxy: 72.14.192.0 - 72.14.207.255
RewriteCond %{REMOTE_ADDR} ^72\.14\.(19[2-9]|20[0-7]) [OR]
# google prefetch
RewriteCond %{HTTP:X-moz} ^prefetch
RewriteRule .* - [F]
-------------------------------------

>> the relevant text of your custom 403

I don't really have a specific text for this situation, only a general 403 text - but perhaps I should add a brief paragraph about this thing.

Anyway, suppose you have a custom error page with the file name "custom403.htm" - then you should change the last line above, like this:

-------------------------------------
# google proxy: 72.14.192.0 - 72.14.207.255
RewriteCond %{REMOTE_ADDR} ^72\.14\.(19[2-9]|20[0-7]) [OR]
# google prefetch
RewriteCond %{HTTP:X-moz} ^prefetch
RewriteRule !custom403\.htm - [F]
-------------------------------------

Added: Make sure the pipe character ("|") in the REMOTE_ADDR line survives when you copy this; the forum software mangles it on display, so don't just copy and paste blindly.

[edited by: claus at 4:22 pm (utc) on May 5, 2005]
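It's worth sanity-checking the octet pattern above against its stated range (72.14.192.0 - 72.14.207.255): the alternation needs a real pipe, and the second branch must be 20[0-7] rather than 20[1-7], or 72.14.200.* addresses slip through unblocked. A quick Python sketch:

```python
import re

# Sanity check for the .htaccess octet pattern: the target range
# 72.14.192.0 - 72.14.207.255 requires 19[2-9] OR 20[0-7] for the
# second octet (20[1-7] would wrongly let 72.14.200.* through).
PROXY_RE = re.compile(r"^72\.14\.(19[2-9]|20[0-7])\.")

def is_google_proxy(ip):
    """True when the address is inside 72.14.192.0 - 72.14.207.255."""
    return bool(PROXY_RE.match(ip))
```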

msgraph




msg:736441
 4:29 pm on May 5, 2005 (gmt 0)

Claus msg #130

Excellent post, one of the best in this thread. This is all about Google helping the users in the way they, Google, think users need to be helped.

Even though this tool has just been released, no matter how much they improve it, I would never feel comfortable knowing that there is even a slight chance of something from my site getting served incorrectly, whether it be cookies, scripts, etc.

Google is always pitching its tools as "we felt this would help the user most". Well, how can I help MY visitors if this tool comes along and messes with the cookies or stats that I need to analyze in order to further help them?

Scarecrow




msg:736442
 4:46 pm on May 5, 2005 (gmt 0)

I have two servers, and a total of several domains. All are nonprofit.

I've checked my logs for Google's accelerator scraping. There are about 200 GETs on each server for today only. One server shows most of them from 72.14.192.* and the other server shows most of them from 72.14.194.*. I also have about 20 from 64.233.172.* and another 20 from 64.233.173.*.

One thing that disturbs me is that every single page on all of my domains has shown the NOARCHIVE meta for years now. Google does not consider this meta to be a prohibition for the accelerator. There is no opt-out to save your bandwidth! I don't see the accelerator checking for robots.txt either. To put it bluntly, Google considers this latest scraping to be a non-search function, and none of the old standards apply. Of course, you can be sure they save everything they grab for future use.

The other thing that disturbs me is that on one server (remember, this is only for the last 12 hours), I saw 8 different accelerator GETs in a single one-second period. With 130,000 static pages on this site, should I be worried about load problems from Google? And for what -- so that Google can collect more information on people who want to access my sites?

If the Googlebot is hitting me this hard, I put up with it because I want the referrals from Google. I can also exercise control with robots.txt and NOARCHIVE. But the accelerator running in addition to the Googlebot is where I have to draw the line.

Craven de Kere




msg:736443
 4:48 pm on May 5, 2005 (gmt 0)

Bird wrote:

"Robots.txt is designed for spiders with automatic methods to determine which link to crawl next. With Google's Web Accelerator, however, each request is directly triggered by the behaviour of a human user."

All spiders are directly triggered by human behavior.

Automated requests are automated requests are automated requests.

Prefetching can't be done without some requests being ones that an eyeball never sees.

Craven de Kere




msg:736444
 4:51 pm on May 5, 2005 (gmt 0)

Brett Wrote:

"> it does not honor robots.txt

It is a proxy on behalf of a human, it isn't a bot."

It's more than a proxy. It is sending automated requests. Those automated requests do qualify as being a bot.

Just as does software for offline viewing of a website. A single human visitor may be the one using it, but automated requests means it *is* a bot.

Brett_Tabke




msg:736445
 4:57 pm on May 5, 2005 (gmt 0)

Yes, there are some big gray areas in the terminology here, Craven de Kere. I think we should all be as clear and accurate as possible. There is way too much bad info coming out here and on other forums about the program.

macdave




msg:736446
 5:06 pm on May 5, 2005 (gmt 0)

[webaccelerator.google.com ] now redirects to [toolbar.google.com ], which has no mention of the accelerator. Has Google pulled the plug on the accelerator?

Powdork




msg:736447
 5:13 pm on May 5, 2005 (gmt 0)

What happens to AdSense when you view pages through the accelerator AND you have an ad blocker installed? Is Google putting their broken ads back together?
What is the difference between Google breaking your pages and Norton breaking your pages? What is the difference between Google and Norton (or Network Solutions, or MS)? The differences are drying up in a hurry.

Powdork




msg:736448
 5:14 pm on May 5, 2005 (gmt 0)

It still takes me to the accelerator page in both IE and Firefox.

incrediBILL




msg:736449
 5:16 pm on May 5, 2005 (gmt 0)

> http://webaccelerator.google.com/ now redirects to [toolbar.google.com...] which has no mention of the accelerator. Has Google pulled the plug on the accelerator?

Doesn't do that here - [webaccelerator.google.com...] works like it did yesterday.

Craven de Kere




msg:736450
 5:17 pm on May 5, 2005 (gmt 0)

Maybe they've decided to do unto webmasters as they demand that webmasters do unto them.

Craven de Kere




msg:736451
 5:18 pm on May 5, 2005 (gmt 0)

"It still takes me to the accelerator page in both IE and Firefox"

For a few minutes it was going to the toolbar page for me. Back to the accelerator now.

davidgautier




msg:736452
 5:18 pm on May 5, 2005 (gmt 0)

I saw the same page macdave saw although now it works fine.



All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved