Google Aims to Double Web Speed with SPDY Networking Protocol
tedster
msg:4025332
8:02 pm on Nov 15, 2009 (gmt 0)

In a Chromium Blog [blog.chromium.org] article last week, Google shared information about SPDY - a research project that aims to make the web a LOT faster and more secure at the same time.

One technical goal is to make SSL the underlying protocol for all browsing. This would improve security while still offering compatibility with existing network infrastructure. To overcome the added latency from SSL, the SPDY layer reworks the way that concurrent, interleaved streams flow over a single TCP connection.
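
To picture the multiplexing idea, here's a toy Python sketch (my own illustration, not SPDY's actual framing format; the white paper below specifies the real one). Each chunk on the wire carries a stream ID, so several logical streams interleave over one connection and get reassembled on the other side:

import struct

def frame(stream_id, payload):
    # Prefix each chunk with its stream ID and byte length
    return struct.pack("!II", stream_id, len(payload)) + payload

def demux(wire):
    # Reassemble interleaved chunks into per-stream byte strings
    streams, offset = {}, 0
    while offset < len(wire):
        stream_id, length = struct.unpack_from("!II", wire, offset)
        offset += 8
        streams[stream_id] = streams.get(stream_id, b"") + wire[offset:offset + length]
        offset += length
    return streams

# An HTML, a CSS, and an image response interleaved over one "connection"
wire = (frame(1, b"<html>") + frame(3, b"body{}") +
        frame(5, b"GIF89a") + frame(1, b"</html>"))
print(demux(wire))  # {1: b'<html></html>', 3: b'body{}', 5: b'GIF89a'}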

Here's the white paper:
[sites.google.com...]

 

incrediBILL
msg:4025334
8:19 pm on Nov 15, 2009 (gmt 0)

Finally, something truly viable to speed up browsing, unlike that failed "web accelerator" nonsense.

The only problem I see with SPDY is that it'll hammer servers for resources faster than the current browser protocols, so some servers already operating near capacity will be easily overloaded and need more hardware.

vrtlw
msg:4025463
12:40 am on Nov 16, 2009 (gmt 0)

The only problem I see with SPDY is that it'll hammer servers for resources faster than the current browser protocols, so some servers already operating near capacity will be easily overloaded and need more hardware.

And you think that is a bad thing? Might deter some of the MFA/Spammers/Cheapie affiliates from ruining the web as we once knew it.

motorhaven
msg:4025474
1:21 am on Nov 16, 2009 (gmt 0)

I don't think hardware resources will be an issue. Given the stable nature of web protocols and their installation on literally millions of systems, I doubt we'll see any browsers ship without backwards compatibility with existing protocols for a long, long time.

Those with the extra capacity will adopt it (maybe); those without won't.

Also, depending on the nature of the resource usage, things like this can sometimes actually decrease resource consumption. For example, if the limiting factor is RAM and/or the maximum number of available TCP connections, any protocol which transmits the page faster will free up connections quicker and reduce RAM usage.

A good example is gzip/deflate compression. Technically it uses more resources, but I've had high-load systems which saw a decrease in overall load: more connections available/idle during peak times and, as a result, more RAM available for disk caching. Compression made these systems faster, especially since they had many pages/scripts which could easily swamp a small disk cache.
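
To put a rough number on that, here's a quick standard-library sketch showing how dramatically repetitive HTML shrinks under gzip, which is exactly why each response ties up its connection and buffers for so much less time:

import gzip

# Repetitive markup, typical of table- or list-heavy pages
html = ("<div class='row'><span class='cell'>item</span></div>\n" * 500).encode()
gz = gzip.compress(html)
print(f"raw: {len(html)} bytes, gzipped: {len(gz)} bytes "
      f"({len(gz) / len(html):.1%} of original)")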

J_RaD
msg:4025931
6:04 pm on Nov 16, 2009 (gmt 0)

Do we really need fair, open, and honest Google developing internet protocols which move INFORMATION around?

motorhaven
msg:4025957
6:29 pm on Nov 16, 2009 (gmt 0)

The protocol and specifications are public. Google is not forcing this on anyone, and people are free to make their own implementations. Sometimes the mud-slinging for the sake of mud-slinging gets a little like wearing a tin foil hat.

Fotiman
msg:4025973
6:46 pm on Nov 16, 2009 (gmt 0)

@J_RaD
I agree with motorhaven... mud-slinging for the sake of mud-slinging is not productive.
If company <fill-in-the-blank> is able to develop a better specification/standard than what is available today, and that specification is open, then it shouldn't matter which company it is.

iThink
msg:4025980
6:53 pm on Nov 16, 2009 (gmt 0)

And you think that is a bad thing? Might deter some of the MFA/Spammers/Cheapie affiliates from ruining the web as we once knew it.

Wishful thinking.

motorhaven
msg:4025989
6:59 pm on Nov 16, 2009 (gmt 0)

Java
.Net
C#
rel="nofollow"
Ethernet
USB
PCI

All examples of things developed by private companies which have become universal standards. (Though non-Microsoft .Net implementations are still weak)

Chico_Loco
msg:4026227
2:30 am on Nov 17, 2009 (gmt 0)

Do we really need fair, open, and honest Google developing internet protocols which move INFORMATION around?

Well, Google certainly do have sufficient research funding.

zeus
msg:4027271
12:43 pm on Nov 18, 2009 (gmt 0)

Now I'm no server dude, but will Google have a hand in this new technology when it's done? I mean, will it be a Google service, or is it just something new where Google doesn't hold any rights to it, like open source?

Seb7
msg:4027519
7:20 pm on Nov 18, 2009 (gmt 0)

gzip has been around for years, giving the kinds of speed gains SPDY is claiming. HTML is so uncompressed and HTTP is so old that I'm sure we can achieve 200% speed increases quite easily.

My only concern is that any new protocols should be kept really simple, so that small devices like phones can easily process them.
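
For reference, that gzip negotiation already works over plain HTTP/1.1 with nothing but a pair of headers. A minimal sketch using Python's standard library (example.com is just a stand-in host here, and whether a server actually compresses is its own choice):

import gzip
import urllib.request

# Ask for gzip; the server reports what it actually sent via Content-Encoding
req = urllib.request.Request("http://example.com/",
                             headers={"Accept-Encoding": "gzip"})
with urllib.request.urlopen(req) as resp:
    encoding = resp.headers.get("Content-Encoding")
    body = resp.read()
if encoding == "gzip":
    body = gzip.decompress(body)
print(encoding, len(body), "bytes after decoding")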
