
Google Aims to Double Web Speed with SPDY Networking Protocol

     
8:02 pm on Nov 15, 2009 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



In a Chromium Blog [blog.chromium.org] article last week, Google shared information about SPDY - a research project that aims to make the web a LOT faster, and more secure at the same time.

One technical goal is to make SSL the underlying protocol for all browsing. This would improve security while still offering compatibility with existing network infrastructure. To overcome the added latency from SSL, the SPDY layer reworks the way concurrent, interleaved streams flow over a single TCP connection.

Here's the white paper:
[sites.google.com...]
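
For a rough sense of what that multiplexing means in practice, here is a toy sketch of several logical streams interleaved over one connection. The (stream_id, length, payload) framing is invented purely for illustration; it is not the actual SPDY wire format.

    # Toy illustration of SPDY-style stream multiplexing over one connection.
    # The (stream_id, length, payload) framing is invented for this example;
    # it is NOT the real SPDY wire format.
    import struct

    def frame(stream_id, payload):
        # 4-byte stream id + 4-byte payload length, then the payload chunk
        return struct.pack("!II", stream_id, len(payload)) + payload

    def interleave(streams, chunk_size=8):
        # Emit frames round-robin so no single stream blocks the others.
        offsets = {sid: 0 for sid in streams}
        while any(offsets[sid] < len(data) for sid, data in streams.items()):
            for sid, data in streams.items():
                if offsets[sid] < len(data):
                    chunk = data[offsets[sid]:offsets[sid] + chunk_size]
                    offsets[sid] += chunk_size
                    yield frame(sid, chunk)

    # Three requests' worth of data share one TCP connection instead of three.
    streams = {1: b"<html>page</html>", 2: b"body{color:red}", 3: b"GIF89a..."}
    wire = b"".join(interleave(streams))
    print(len(wire), "bytes interleaved on a single connection")

HTTP/1.x, by contrast, would typically open a separate connection (or wait on a serial pipeline) for each of those resources.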

8:19 pm on Nov 15, 2009 (gmt 0)

WebmasterWorld Administrator incredibill is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Finally, something truly viable to speed up browsing, unlike that failed "web accelerator" nonsense.

The only problem I see with SPDY is that it'll hammer servers for resources faster than the current browser protocols, so some servers already operating near capacity will be easily overloaded and need more hardware.

12:40 am on Nov 16, 2009 (gmt 0)

10+ Year Member



The only problem I see with SPDY is that it'll hammer servers for resources faster than the current browser protocols, so some servers already operating near capacity will be easily overloaded and need more hardware.

And you think that is a bad thing? Might deter some of the MFA/spammers/cheapie affiliates from ruining the web as we once knew it.

1:21 am on Nov 16, 2009 (gmt 0)

10+ Year Member



I don't think hardware resources will be an issue. Given the stable nature of web protocols and their installation on literally millions of systems, I doubt we'll see any browsers without backwards compatibility with existing protocols for a long, long time.

Those with the extra capacity will adopt it (maybe), those without won't.

Also, depending on the nature of the resource usage, things like this can sometimes actually decrease resource consumption. For example, if the limiting factor is RAM and/or the maximum number of available TCP connections, any protocol that transmits the page faster will free up connections sooner and reduce RAM usage.
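
A quick back-of-the-envelope version of that point (the numbers are made up; this is just Little's law applied to connections):

    # Concurrent connections ~= request rate x time each connection stays open
    # (Little's law). All numbers below are invented for illustration.
    requests_per_second = 200
    seconds_open_slow = 1.5   # slower transfer, connection held longer
    seconds_open_fast = 0.5   # faster transfer frees the slot sooner

    print(requests_per_second * seconds_open_slow)   # ~300 connections in flight
    print(requests_per_second * seconds_open_fast)   # ~100 connections in flight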

A good example is gzip/deflate compression. Technically it uses more resources, but I've had high-load systems that saw a decrease: more connections available/idle during peak times and, as a result, more RAM available for disk caching. Compression made these systems faster, especially since they had many pages/scripts that could easily swamp a small disk cache.
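
If you want to see the bandwidth side of that trade-off, Python's standard zlib module (the same deflate used by gzip/mod_deflate) makes it easy to check; the sample markup below is obviously made up:

    # Rough look at how well repetitive HTML shrinks under deflate/gzip.
    # The sample markup is invented; real pages vary, but HTML tends to be
    # highly repetitive and compresses well.
    import zlib

    html = ("<div class='item'><a href='/widget'>widget</a></div>\n" * 200).encode()
    compressed = zlib.compress(html, 6)   # level 6 is the usual default
    print(len(html), "bytes raw ->", len(compressed), "bytes compressed")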

6:04 pm on Nov 16, 2009 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member



Do we really need fair, open and honest Google developing internet protocols which move INFORMATION around?
6:29 pm on Nov 16, 2009 (gmt 0)

10+ Year Member



The protocol and specifications are public. Google is not forcing this on anyone, and people are free to make their own implementations. Sometimes the mud-slinging for the sake of mud-slinging gets a little like wearing a tin foil hat.
6:46 pm on Nov 16, 2009 (gmt 0)

WebmasterWorld Senior Member fotiman is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month



@J_RaD
I agree with mororhaven... mud-slinging for the sake of mud-slinging is not productive.
If company <fill-in-the-blank> is able to develop a better specification/standard than what is available today and that specification is open, then it shouldn't matter what company it is.
6:53 pm on Nov 16, 2009 (gmt 0)

10+ Year Member



And you think that is a bad thing? Might deter some of the MFA/spammers/cheapie affiliates from ruining the web as we once knew it.

Wishful thinking.

6:59 pm on Nov 16, 2009 (gmt 0)

10+ Year Member



Java
.Net
C#
rel="nofollow"
Ethernet
USB
PCI

All are examples of things developed by private companies that have become universal standards. (Though non-Microsoft .Net implementations are still weak.)

2:30 am on Nov 17, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Do we really need fair, open and honest Google developing internet protocols which move INFORMATION around?

Well, Google certainly do have sufficient research funding.

12:43 pm on Nov 18, 2009 (gmt 0)

WebmasterWorld Senior Member zeus is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Now, I'm no server dude, but will Google have a hand in this new technology when it's done? Meaning, will it be something like a Google service, or is it something new where Google doesn't hold any rights to it, like open source?
7:20 pm on Nov 18, 2009 (gmt 0)

5+ Year Member



gzip has been around for years, delivering the kind of speed gains SPDY is claiming. HTML is so redundant and HTTP is so old that I'm sure we can achieve 200% speed increases quite easily.
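The SPDY white paper also calls out header compression, since HTTP/1.x resends bulky, uncompressed headers with every request. A rough illustration of how much that redundancy shrinks under deflate (the request itself is invented):

    # HTTP/1.x repeats headers like these, uncompressed, on every request.
    # SPDY compresses them; plain deflate gives a rough feel for the savings.
    # The request below is invented for illustration.
    import zlib

    headers = (
        "GET /style.css HTTP/1.1\r\n"
        "Host: www.example.com\r\n"
        "User-Agent: Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36\r\n"
        "Accept: text/css,*/*;q=0.1\r\n"
        "Accept-Encoding: gzip,deflate\r\n"
        "Cookie: session=abc123; prefs=compact; tracking=xyz\r\n"
        "\r\n"
    ).encode()

    print(len(headers), "bytes raw ->", len(zlib.compress(headers)), "bytes deflated")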

My only concern is that any new protocols should be kept really simple, so that small devices like phones can easily process them.

 
