Finally, something truly viable for speeding up browsing, unlike that failed "web accelerator" nonsense.
The only problem I see with SPDY is that it will request resources from servers faster than current browser protocols do, so servers already operating near capacity will be easily overloaded and will need more hardware.
|The only problem I see with SPDY is that it will request resources from servers faster than current browser protocols do, so servers already operating near capacity will be easily overloaded and will need more hardware. |
And you think that is a bad thing? It might deter some of the MFA sites/spammers/cheapie affiliates from ruining the web as we once knew it.
I don't think hardware resources will be an issue. Given the stable nature of web protocols and their installation on literally millions of systems, I doubt we'll see any browsers drop backwards compatibility with existing protocols for a long, long time.
Those with the extra capacity will adopt it (maybe), those without won't.
Also, depending on the nature of the resource usage, things like this can sometimes actually decrease resource consumption. For example, if the limiting factor is RAM and/or the maximum number of available TCP connections, any protocol that transmits the page faster will free up connections sooner and reduce RAM usage.
A good example is gzip/deflate compression. Technically it uses more CPU, but I've had high-load systems that saw an overall decrease in resource usage: more connections were available or idle during peak times and, as a result, more RAM was available for disk caching. Compression made these systems faster, especially since they served many pages/scripts that could easily swamp a small disk cache.
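To see why compression frees up connections and RAM so effectively, here's a minimal sketch (the HTML snippet is made up for illustration) showing how well repetitive markup compresses with Python's standard gzip module:

```python
import gzip

# A small, repetitive HTML payload -- markup like this, full of repeated
# tags and attributes, is exactly what deflate handles well.
html = (
    b"<html><body>"
    + b"<div class='item'><p>Hello, world!</p></div>" * 200
    + b"</body></html>"
)

# Level 6 is the usual default trade-off between CPU cost and ratio.
compressed = gzip.compress(html, compresslevel=6)

print(f"{len(html)} bytes raw, {len(compressed)} bytes gzipped")
```

The compressed payload comes out at a small fraction of the original, so each response occupies the TCP connection (and its buffers) for far less time.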
do we really need fair, open and honest google developing internet protocols which move INFORMATION around?
The protocol and specifications are public. Google is not forcing this on anyone, and people are free to make their own implementation. Sometimes the mud-slinging for the sake of mud-slinging starts to look a little like wearing a tin foil hat.
I agree with mororhaven... mud-slinging for the sake of mud-slinging is not productive.
If company <fill-in-the-blank> is able to develop a better specification/standard than what is available today and that specification is open, then it shouldn't matter what company it is.
|And you think that is a bad thing? Might deter some of the MFA/Spammers/Cheapie affiliates from ruining web as we once knew it. |
All examples of things developed by private companies which have become universal standards. (Though non-Microsoft .Net implementations are still weak)
|do we really need fair open and honest google developing internet protocals which move INFORMATION around? |
Well, Google certainly does have sufficient research funding.
Now I'm no server dude, but will Google have a hand in this new technology when it's done? I mean, will it be like a Google service, or is it something new where Google doesn't hold any rights to it, like open source?
gzip has been around for years, delivering the kinds of speedups SPDY is claiming. HTML is so uncompressed and HTTP is so old that I'm sure we can achieve 200% speed increases quite easily.
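One place where HTTP's age really shows is its headers: every request to the same host re-sends nearly identical header text uncompressed, which SPDY addresses with header compression. A rough sketch (the header values are hypothetical, and plain zlib stands in for SPDY's actual header-compression scheme) of how much redundancy there is:

```python
import zlib

# Hypothetical HTTP/1.1 request headers, repeated for 20 requests to the
# same host -- the kind of redundancy SPDY's header compression targets.
headers = (
    "GET /style.css HTTP/1.1\r\n"
    "Host: www.example.com\r\n"
    "User-Agent: Mozilla/5.0 (X11; Linux x86_64)\r\n"
    "Accept: text/html,application/xhtml+xml\r\n"
    "Accept-Encoding: gzip, deflate\r\n"
    "Cookie: session=abc123\r\n"
    "\r\n"
)
raw = (headers * 20).encode()

# A single compression stream across all 20 requests exploits the
# near-total repetition between them.
compressed = zlib.compress(raw, 9)
print(f"{len(raw)} bytes raw, {len(compressed)} bytes compressed")
```

Since the 20 header blocks are almost identical, the compressed stream is a small fraction of the raw size, which is where a lot of SPDY's claimed savings on small-resource-heavy pages would come from.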
My only concern is that any new protocol should be kept really simple, so that small devices like phones can process it easily.