" davidgautier, I'm sorry that I don't understand why this steals wires. Which wires? "
It's a metaphor.
oh not at all, dmorison. Indeed I myself am an ardent fan of data mining/prediction algorithms. As I said before, I admire the thought process ... it's part of evolution.
davidgautier, you said "Of course, searches through google desktop search are track-able by google."
In general I don't think that's true, especially if you turn off referrers in your browser. Please see [desktop.google.com...]
|webmasters can choose to just ignore prefetch requests if they so choose. |
GG, this is not specific enough. Please define "ignore". IMO the logical thing to do is to send a 403, but that's not acceptable if there's a chance the end user will be shown that same error.
If you mean rejecting the request outright, please suggest a mechanism to do that. I specifically asked about this here and the only method proposed was based on IP address (obviously not an ideal solution).
FWIW, I don't have a bandwidth problem and don't intend to block prefetching unless it causes problems. My main concern is that I specifically bar robots from executing scripts in my /cgi-bin/ and /scripts/ directories, because of server load issues, not to mention unexpected and undesirable results in some cases.
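For reference, barring well-behaved robots from those directories is normally done with a robots.txt along these lines (paths as described above; the complaint is precisely that a prefetching proxy acting on behalf of a human user may not honor it):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /scripts/
```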
"In general I don't think that's true, especially if you turn off referrers in your browser. Please see [desktop.google.com...] "
I read somewhere a few days ago, maybe it was at SEW, that sending the info out is the DEFAULT, not the other way around.
I wonder though if this is really worth it for google? Have you guys looked around the web lately at what people are saying, googleguy?
So far it's getting terrible reviews and being blocked in many places, it's causing damage to sites, and that's just scratching the surface, without even getting into the privacy issues.
I just want to add my voice to the list of people requesting a way to opt our site out of this (and sending 403s to clients is not the ideal solution).
We have absolutely zero need for this, and in most cases, it can potentially be harmful.
- we already compress our pages
- all of our pages use strict cache-control and expires headers to stop caching by browser and/or proxy as our pages are extremely dynamic
- We do not use query strings in our URLs as exposed to the outside world. We have gone to great lengths to hide ugly URLs with query strings behind mod_rewrite.
- This will screw up tracking and reporting.
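For context, the setup described above is roughly this kind of Apache configuration (an illustrative sketch only, not the poster's actual config; the script name and parameter are made up):

```apache
# Illustrative sketch only -- not the actual site config.

# Strict no-cache headers for highly dynamic pages
<IfModule mod_headers.c>
    Header set Cache-Control "no-cache, no-store, must-revalidate"
    Header set Expires "0"
</IfModule>

# Hide an ugly query-string URL behind a clean path:
# /products/123 is served internally by /show.cgi?item=123
# (show.cgi and "item" are hypothetical names)
RewriteEngine On
RewriteRule ^products/([0-9]+)$ /show.cgi?item=$1 [L]
```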
Forcing this on web sites with no way for a site to opt out is just plain rude and unprofessional.
davidgautier, during the last privacy controversy before this one, the two people I talked to in person didn't realize that the feature was opt-in, not opt-out. I believe you're incorrect about Google Desktop searches being sent to Google, for example, as the link I mentioned pointed out. I've been here when people wanted to block Gmail last year. I've been here when usenet folks flamed us because they thought that buying Deja was a terrible idea. I was here during update Fritz when people were angry that "everflux" was causing their monthly rankings to change faster. I've been here when people claim that digitizing books and making them available will lead to some sort of cultural hegemony. In each case, I've believed that Google is trying to do the right thing. If folks take the long view (longer than two days, at least), I think many people are willing to take a step back and wait to see how things look after the initial reaction.
Anyway--welcome to WebmasterWorld, and I hope you stick around and enjoy the less acrimonious threads too. :)
P.S. I have to duck out for a short meeting and grab some lunch, but I'll be back after that.
|I wonder though if this is really worth it for google? Have you guys looked around the web lately at what people are saying, googleguy? |
just an observation:
the first 3 news links have AdSense on them :-)
|Forcing this on web sites with no way for a site to opt out is just plain rude and unprofessional. |
GoogleDude keeps hyping that it'll speed everyone up [and it doesn't speed us all up], but doesn't seem to care that SOME of us webmasters just don't want it interfering with our sites and, more importantly, our SITE STATS, which we use as a basis for marketing the site. I don't see where the "wait and see what we can do, it's in beta" attitude shows one ounce of respect to the webmasters who don't want it touching their sites.
Again today Google's servers ping slower [much slower this time] than most of my destination servers, so putting Google between me and my destination is adding sluggish response times just to get the promise of HTML compression, which my Apache server already supports. Google can't really compress the images, so the faster download from my destination server slows to a crawl as the images get relayed through Google.
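On the compression point: Apache-side HTML compression is typically just a couple of lines (an illustrative mod_deflate sketch for Apache 2.x; older 1.3 servers used mod_gzip instead):

```apache
# Compress text content at the origin server itself
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css
</IfModule>
# JPEG/GIF images are already compressed, so relaying them
# through another proxy can't shrink them further
```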
What in the heck are they saving me?
Besides, who'll stop them next April Fool's Day when they decide to punk the net and feed the WA cache thru Snoop Dogg's Shizzolator?
this [37signals.com] is quite amusing:
Am I crazy, or do webmasters NOT want their sites accessed faster, thereby improving user experience?
Serving with Google Web Accelerator can only be faster in some cases, and only as long as a very small minority is using it. As soon as the tool is used by a significant fraction of the online population, the entire web will slow down, because it is forced to transport several times as much data.
And even while only a minority is using it: At least half of the time the additional overhead will necessarily make access to pages slower, instead of faster. Don't get fooled by the software informing you about "time saved". It's lying, because it has no way of knowing how fast the direct request would have been.
In sum, this thing is a very bad idea. It turns its users into parasites wasting other people's resources.
Proxy caches have the potential to speed up users' connections and reduce bandwidth for a webmaster; that's why lots of ISPs use them.
The ISP is one of the few places where a proxy cache makes sense, because the proxy there is guaranteed to be within the natural path of the request. In contrast, Google's proxy is guaranteed to divert most traffic from the shortest path, thus wasting bandwidth and slowing down a large number of page requests.
Sure, yeah, so you feel vindicated about Gmail et al. Do you really think that means that no matter what spyware crap you put out, it's just fine, because Gmail was?
The web accelerator is breaking sites. It causes harm to site owners, it causes harm to advertisers, and in some cases causes harm to users.
The only benefit to this is the information that M$...sorry, Google, gets to glean from it.
I've been a huge Google fan and wasn't so worried about Gmail, but this is the deal breaker. This is flat-out "evil" and puts you smack dab in the middle of the corporate greed and irresponsibility stereotype that somehow managed to avoid Google until now.
Thanks googleguy for the welcome.
I'll just finish with this for now. I still believe Google has crossed the line a few times in the past, but it got around it and people forgot. This time, though, it is going way beyond anything you have done so far.
Just because you are here dealing with it every time doesn't make it right in any way.
The way I see it, there is a real need for an honest, privacy-conscious, real "do no evil" search engine out there.
<If you go back a year to Gmail's debut, there were also people who wanted to block Gmail right after it was introduced. Fast forward a year, and many many people like Gmail now that they've seen the direction that we've gone with it: 2 gigs of storage and growing, a solid UI, free POP access, and free email forwarding as well.>
With all due respect, please don't underestimate the intelligence of fellow members.
Gmail is an opt-in operation where people have a choice, whereas in the case of WA the whole operation is an opt-out one.
This thread and several other recent threads illustrate beyond doubt that Google has lost a big portion of the goodwill it had among publishers, even those who were considered most Google-friendly.
I admire the courage of those decent publishers who have chosen to block WA and have expressed their reasons and positions in public. God bless you all.
Why can't you smart people at Google just focus on delivering cutting-edge compression technologies to speed up the web instead of coming up with something as controversial as prefetching?
Also, I'd like to see G pair up with some of the companies that are at the next-level on this thing, such as OnSpeed, so you don't have to re-invent the wheel.
"Gmail is an opt-in operation where people have a choice."
But last year people were complaining that Gmail wasn't opt-in, because if you mailed someone using Gmail, it might generate ads. reseller, I'm not trying to underestimate anyone's intelligence. WebmasterWorld is an incredibly savvy group of people, which is why I want to make sure that I take comments from here to the people who work on the web accelerator.
|why can't you smart people at Google just focus on delivering cutting-edge compression technologies to speed up the web instead of coming up with something so controversial as Prefetching? |
That's a fair question, zjacob. I think if you go back and look at Googlebot now vs. two or even one year ago, today's Googlebot is much more likely to support gzipped content when fetching pages. So that's one thing. The other thing that I can think of is enabling prefetching for Google search results where we have high confidence that the user will click on the first search result.
|But last year people were complaining that Gmail wasn't opt in, because if you mailed someone using Gmail, it might generate ads. |
Why do you insist on stating that because the furor died down before with a totally different product, that this means that your new "product" is not bad? It is breaking sites. What part of that do you not understand? Every time you post, it's always how we're "overreacting."
It's not overreacting when people are fed caches of other users' information. It's not overreacting when statistics we all use on our sites for advertising are rendered useless on a whim by the great "Google." It's not overreacting when products are added to shopping carts without consent.
All for what? Allegedly to speed things up for broadband users? You say you think we're savvy, so why do you pretend this is for the users' benefit? This is, pure and simple, information collecting for Google's benefit. You're insulting our intelligence by acting like that's not the case.
<I want to make sure that I take comments from here to the people who work on the web accelerator.>
Very kind of you. Much appreciated.
But why don't you folks at Google understand that publishers spend hours, days, weeks and years creating decent sites with valuable content, and that it should be those publishers who decide how and in what way they wish to display the contents of their sites?
One thing WA and Gator have in common: they force their own will on the legitimate owners of websites.
Please correct me if I'm wrong.
|One thing WA and Gator have in common; they force their own will on the legitimate owners of websites. |
... and both can be blocked via .htaccess (Gator has been for quite some time).
As a computer user I have the choice to NOT download the toolbar, Desk Top Search or use Gmail.
As a Webmaster I don't see that same choice when it comes to WA and the sites I manage.
From the few tests I ran, I can say that as soon as a simple tag is posted that does not involve a 403, I will have to spend a couple of days inserting code so that this thing is blocked from fetching any content on my sites.
|The other thing that I can think of is enabling prefetching for Google search results where we have high confidence that the user will click on the first search result. |
LMAO - you go right ahead with that plan and watch dynamic ecommerce sites with NO-CACHE try their best to get OUT of the number one position when it tanks their servers with a few hundred thousand wasted requests each day and shoppers can't shop because the shopping engine is crawling under the load. Are you guys actually contemplating the impact of your actions with this Web Accelerator thing?
BTW, when you implement this new pre-cache technology for the #1 spot, what will you tell webmasters like myself with NO-CACHE pages whose server bandwidth charges suddenly DOUBLE or TRIPLE that month?
I sure as hell won't be picking up your tab for unwanted traffic.
Do you have an address at Google where I can send the invoice for the excessive bandwidth charges when this is implemented?
|As a computer user I have the choice to NOT download the toolbar, Desk Top Search or use Gmail. |
|As a Webmaster I don't see that same choice when it comes to the sites I manage when it comes to WA. |
|Why do you insist on stating that because the furor died down before with a totally different product, that this means that your new "product" is not bad? |
I think the answer is that he has a very tough position right now.
|you go right ahead with that plan and watch dynamic ecommerce sites with NO-CACHE try their best to get OUT of the number one position |
Now, that is funny. #4 or #5 would be the new best spot?
[edited by: oneguy at 9:19 pm (utc) on May 6, 2005]
I find the WA is taking some undue credit.
On one of our sites, we have aggressively implemented compression and browser-side caching. Now when I go to links I have just surfed to, where all the objects are cached in the browser, GWA still takes credit for saving a second!
I'm going to be polite to you because I know you're just our liaison to Google. Ranting at you would probably be as productive as acting belligerent toward a help desk operator and then wondering why you couldn't get the help you needed.
I have spent many hours making all my links seem static. The content on those pages however is always dynamic. So I was a little miffed when I got a reply from Google suggesting that I can block the Web Accelerator by adding a question mark to each of the links on thousands of pages.
If we're as intelligent as you claim, you'll hopefully understand that we webmasters, well myself at least, view that as a bogus reply which borders on the unbelievable.
Please pass those sentiments along to whomever you consult with at Google.
New IP range for the accelerator: 220.127.116.11
I've also noticed 2 bugs in WA:
1. If you have 2 browsers running, IE & Mozilla, and you go to Mozilla and do some surfing, when you come back to IE the WA stops working. You need to hit refresh to get it going again.
2. If you hit refresh on WebmasterWorld, it causes some page formatting errors when new posts have been added. In other words, it has a problem with dynamic content! Or maybe this is the result of WebmasterWorld blocking WA.
[edited by: Namaste at 9:24 pm (utc) on May 6, 2005]
incrediBILL, we already do this: [google.com...]
I haven't heard of any complaints about it; after the initial reaction, most people seem to like it. I haven't seen sites trying not to rank well or heard any complaints about sites slowing down. But the prefetching uses the "X-moz: prefetch" header as described at [mozilla.org...]
so it could be blocked using that.
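A minimal .htaccess sketch of that approach (illustrative only; assumes mod_setenvif is available, and that returning a 403 to prefetch requests is acceptable for your site):

```apache
# Deny any request carrying the "X-moz: prefetch" header
<IfModule mod_setenvif.c>
    SetEnvIf X-moz prefetch is_prefetch
    Order Allow,Deny
    Allow from all
    Deny from env=is_prefetch
</IfModule>
```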
I have two more meetings back to back, but I'll stop by later.
|WA still takes credit for saving a second |
That's the sleight-of-hand magic trick here: they pre-fetch and download it before you ask for it, so those are the "seconds" they saved you by avoiding the download delay. In reality they're just choking our broadband pipes with things you may or may not want.
My site's home page conservatively has over 200 links on it. Can you imagine someone moving their cursor across my page as they read it?
Can you imagine the impact on their downloading bandwidth?
Now imagine this is happening on the office DSL line with 50 co-workers also pre-fetching away.
Emails stop sending / receiving.
Worker online productivity drops 100% thanks to GWA :)
IT guy that installed it network-wide is fired on the spot.
Quivering remaining IT guy trembles from desk to desk uninstalling it with an angry CEO standing over him.
GaryK, I take your point. I believe that all the requests have "X-moz: prefetch" headers on them, so that's one thing you could use. But I will definitely communicate your comments about sites that look static (i.e. have no '?') but are generated dynamically.