
Every new release more bloated than the last

I'm getting fed up with that


vordmeister

6:23 pm on Nov 16, 2009 (gmt 0)

10+ Year Member Top Contributors Of The Month



Why is it that you find some software that you like - maybe a forum, CMS or shop - and then 3 years down the line the software becomes incredibly bloated?

I was looking around for some shop software at the start of the year and got so completely fed up that I decided to write my own. Some of those shops take 10 seconds to load.

It turns out it is possible for a shop to have less than a 20kb download for each page, including style sheets and other rubbish (excluding images), for something that is more fully featured for the user than most offerings. Each page of my creation downloads in less than half a second, and I'm on a really slow connection.

I run a forum and the current version takes a massive 4 seconds to download for new visitors on a similar connection to me. The new version takes 7 seconds. When I first adopted the software it was 2 seconds.

Why is this going on? The new 7 second forum doesn't do anything different to the old 2 second forum. It is presumably more secure so I'll have to upgrade or write my own forum (which would take at least 6 months).

What's the need for the unnecessary 600kb of code? How come new and improved versions aren't faster?

It's a serious question, but for more imagination I'd prefer responses to be in the spirit of FOO :-)

And what can we do about it? I reckon web developers should all be on dialup.

LifeinAsia

6:39 pm on Nov 16, 2009 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Super-sized code!

Imagine that at the server level, every request is greeted with a little nano-worker asking, "You want a style sheet with that request?" So not only do you have the extra overhead of an unnecessary style sheet tacked on, but you also have the delay while the style sheet question goes back to the originating server for an ACK/NACK response.

So smart coders try to reduce the overhead by tacking on a pre-emptive "StyleSheet=False" header block to the request.

But then one day there's a new nano-worker request: "Do you want to make a donation to Net Neutrality today?" So we have another request going back to the originating server. Again, smart programmers eventually add a "NetNeutralityDonation=False" header block to their requests.

And the game goes on and on...

rocknbil

7:22 pm on Nov 16, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Why is this going on?

Having coded some "generic" apps myself, here's my take as to why.

You want your applications to be configurable and fit into "any" environment. In order to do this, the distro has to have a lot of features and items that many people won't need.

Every time users request a "new feature" it means adding a new set of programming, and likely more graphics to support that feature. If Mr. Tabke folded to the "we want smileys" requests (which I'm sure he gets) it would mean including those graphics in the distro.

Here's an example [webmasterworld.com], sort of, in reverse. If the cart in question had the features this client needs, it would probably be 10x larger and more complex.

This is why custom programming is so important: it allows you to trim the fat and control precisely what you need - no more, no less.

I've watched Acrobat go from a slim, trim, usable app to sheesh, what is it now, over 250 MB for just the compressed download?

vordmeister

8:16 pm on Nov 16, 2009 (gmt 0)

10+ Year Member Top Contributors Of The Month



Fair point there rocknbil. There's no way I could sell my shop. It suits me perfectly, but to make it suit even a single other developer the code would double, both server side and client side. I've simplified things far too much.

One thing that worries me is that it is difficult for new software to compete on the internet with established software. The same goes for new sites competing with established sites. The number 1 SEO rule these days seems to be to have started your site before 2002. Suits me very well but isn't great for competition.

And established software can only become more bloated. Those of us with 1Mb connections are already having déjà vu of the dial-up days. It won't be that long before the 50Mb connections start feeling the same.

kaled

9:04 pm on Nov 16, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The main reason software gets slower with each iteration is that programmers use cutting-edge hardware for development and testing. I make a point of using ordinary hardware for software development (for Windows); consequently, my software remains fast.

For instance, I added a new feature to one program a few weeks ago and it was taking about 20 seconds to run. I experimented and found a small code change got it down to 2 seconds - that same code is also used in the startup sequence so that's a bit faster too now (but I can't notice the difference).

Kaled.

wyweb

9:38 pm on Nov 16, 2009 (gmt 0)



Well, it's also job security. Are you going to release an app that's good forever? One program that never has to be upgraded? Software engineers have to make a living too.

I was competent in Flash 5 and Paint Shop Pro 7. I mean I could do what I needed to do in each in terms of modifying someone else's original work. Later releases were so drastic... I mean forget the cost of an upgrade. I had to buy the book to essentially relearn the program.

I know why software manufacturers do this. Keep the money coming in. Plus, as operating systems and PC capacities have become larger and able to handle more, software vendors have felt as though they can make their programs bigger and more resource intensive as well.

This isn't likely to change. Personally I liked Win 98 SE extremely well. I was very, very comfortable with that OS, and when XP came out I was so reluctant to change it wasn't even funny. The only reason I DID change is because I had to buy a new machine. Now I like XP. I don't want to change but I know I'll have to some day.

I avoided Vista like the plague and am now glad I did. Win 7? I don't know.. we'll see. My current computer is 3 years old so I'm about due for a new one but I'm definitely going to wait and see. I'm not one to jump on any bandwagon just because it's rolling by my house.

graeme_p

8:44 am on Nov 17, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@vordmeister, frameworks make it a lot easier to write your own as well. I have started using Django and it gets stuff done fast. It may run a little slower than if you wrote everything from scratch, but you can get quite good results.

@wyweb, a lot of software is used because it is what people know and they do not want to learn anything else, even if the learning curve is shallow (e.g. MS Word), or because they have invested so much in learning it (e.g. Photoshop), so changing it drastically is risky.

I agree with a cautious upgrade cycle. If it ain't broke, don't fix it. There is no need to get the latest of everything.

Running software for old or slow computers on new hardware can be a revelation. I have a (very) cheap new desktop in the house running PCLinuxOS LXDE and it flies.

You might be able to do the same with Windows 7: run the netbook version (there is, or will be, a netbook version to take over from XP, I assume) on your desktop, for example.

vordmeister

4:32 pm on Nov 17, 2009 (gmt 0)

10+ Year Member Top Contributors Of The Month



New cars are the same. Apparently there's no market for a basic car without the half ton of features you carry with you everywhere to burn more gas. That one is easy - I don't buy new cars. My old one still works and it still does 45 miles to the UK gallon.

With web based software I like to keep up to date for security reasons. I recall the days when new features also added something for me, e.g. "Wow, I can upload photos". Now the new features seem to be heading towards adding MySpace, Blogger, Facebook and Twitter to every single site. Next we are sure to get them on every single page.

I've been timing sites today. Most impressive was the UK Times website which managed to get a 5kb article to me in 76 seconds. And I'm only down the road - would have been quicker walking.

For the web especially, software architecture seems to be getting worse. For software that's sold, it seems to me it should be possible to configure it and compile it before sticking it onto the web. That way I could have a single snippet of Ajax without requiring users to download 300k of Ajax on their first visit. If I wanted to use another feature I'd be happy to compile it again.
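That configure-then-build idea can be sketched in a few lines of Python. Everything here is made up for illustration (the module names and snippets are not from any real product): the point is that a build step ships only the features a site actually enables, instead of the whole library.

```python
# Sketch of a configure-then-build step: bundle only the enabled
# features instead of shipping every snippet to every visitor.
# MODULES and its contents are hypothetical placeholders.
MODULES = {
    "ajax_search": "function ajaxSearch(){/* ... */}",
    "image_zoom":  "function imageZoom(){/* ... */}",
    "carousel":    "function carousel(){/* ... */}",
}

def build_bundle(enabled):
    """Concatenate only the requested feature snippets."""
    return "\n".join(MODULES[name] for name in enabled)

# A site that only wants search ships one snippet, not the lot.
bundle = build_bundle(["ajax_search"])
assert "ajaxSearch" in bundle
assert "carousel" not in bundle
```

The same idea scales up to real asset pipelines: the bundle is generated once at deploy time, so the per-page download stays proportional to the features actually used.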

rocknbil

7:47 pm on Nov 17, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yeah, that's the other side of the coin - but I don't think it has to do with application bloat, it has to do with methods of deployment.

Any remote request you make - and this includes Analytics and much of jQuery/Ajax or other on-page "widgets" - slows down your page load. Add five or six of them and it's no longer the program's execution that's the problem; it's how you've assembled various technologies, leaving you reliant on the response of remote servers over which you have no control.

Shining example: mySpace.

D_Blackwell

7:15 am on Nov 18, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



.....there's no market for a basic car without the half ton of features you carry with you.....

I believe there is a market - but no cars to fill it. They make the money on all the junk bundled into the car. A stripped-down car would probably have to be ordered, and then they would sting you as hard as they could to recover the extra profit they'd miss out on from you not buying all the extras.

kaled

11:18 am on Nov 18, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



They make the money on all the junk bundled into the car.

It's off-topic, however, this is nonsense. The bundled junk is there to sell the car not make a profit. It's the non-bundled extras that serve to make a profit.

Take it from someone who knows: in software, extra features may add considerably to code bloat, but they do not add considerably to loss of speed - that is down to sloppy programming and the attitude of "if it ain't fast enough, buy a new computer".

For instance, I often see code in which a function is called several times rather than once (with the result saved). As a general rule, it is best to use a cache-and-invalidate approach to complex operations, but this is rarely done because nobody cares about efficient programming any more (except, perhaps, game programmers).
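The cache-and-invalidate pattern mentioned above can be sketched in Python as follows. The `expensive_lookup` function here is a hypothetical stand-in for any complex operation; the call counter only exists to show the saving.

```python
# Minimal cache-and-invalidate sketch: compute once, reuse the
# result, and recompute only after an explicit invalidation.
class CachedValue:
    def __init__(self, compute):
        self._compute = compute   # the expensive operation
        self._value = None
        self._valid = False

    def get(self):
        if not self._valid:       # recompute only when stale
            self._value = self._compute()
            self._valid = True
        return self._value

    def invalidate(self):         # call when the underlying data changes
        self._valid = False

calls = 0
def expensive_lookup():           # hypothetical costly function
    global calls
    calls += 1
    return 42

cached = CachedValue(expensive_lookup)
assert cached.get() == 42
assert cached.get() == 42
assert calls == 1                 # computed once, not twice
cached.invalidate()
cached.get()
assert calls == 2                 # recomputed only after invalidation
```

The point is that repeated callers pay nothing after the first call; the only discipline required is invalidating when the inputs actually change.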

Kaled.

graeme_p

1:18 pm on Nov 18, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Apparently there's no market for a basic car without the half ton of features you carry with you everywhere to burn more gas.

They are sold in the third world!

My pet hate is power steering in very small cars. They do not need it, and you can feel the road better without it.

lawman

12:31 pm on Nov 19, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'm very practical when it comes to cars. The only option I ordered was HID headlamps:

[webmasterworld.com...]

:)

vordmeister

6:07 pm on Nov 30, 2009 (gmt 0)

10+ Year Member Top Contributors Of The Month



I'm back working in the car industry for the first time this year. My development car looks like the car from the "Back to the Future" films at the moment - I've made some tweaks.

I've figured out the difference between new cars and new software.

Developing new cars takes a long time, and this allows engine development to keep pace with the weight of new features. You can generally expect your new car to go faster than your old one.

It's possible to develop software very quickly, and this means new features can be added faster than computers and networks like the internet can develop. So you would generally expect any new software you buy to be slower than the old stuff was when you first decided to buy it.

I asked for a refund on some new internet software this week. I decided I would rather secure the old version myself than take the hit on speed.

blend27

7:11 pm on Dec 13, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Many software (forum/CMS/shop) vendors forget to tell the masses that $3 hosting might not be the best option when choosing a host.

But most of the time it is the thing that LifeinAsia describes. Some call it "modular code". Most of those only sometimes have a clue.

vordmeister

8:05 pm on Dec 13, 2009 (gmt 0)

10+ Year Member Top Contributors Of The Month



I pay a massive £2400/year for my hosting. I spent a lot of time selecting the location and provider. Low load, code cache and gzip enabled. The datacenter seems to catch fire quite often, but whenever it's not burning it's super speedy.

The only thing I don't have is a fast internet connection at home. It's not really possible in the back of beyond where I'm located, and that forces some discipline. I recently bought a bit of code (a CMS attached to a forum update) that had a 10 second load time to me from my server. I sent it back for a refund.

I get annoyed by the use of AJAX. "We've used AJAX because it's really fast," say the developers. I say I have to download not just the page I want to look at, but also any other page I might want to look at in the future, before I can see anything. That's abuse of AJAX, fair enough, but it's the normal use as far as I can see.
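The complaint above boils down to eager prefetching versus lazy loading, which a small Python sketch can make concrete. `fetch_page` is a made-up stand-in for a network request; the counter just tallies downloads.

```python
# Contrast eager prefetch (download every page up front) with
# lazy loading (fetch only what the visitor actually views).
downloads = 0

def fetch_page(name):             # hypothetical stand-in for an HTTP request
    global downloads
    downloads += 1
    return f"<html>{name}</html>"

PAGES = ["home", "about", "shop", "contact"]

# Eager: everything is fetched before the visitor sees anything.
downloads = 0
prefetched = {name: fetch_page(name) for name in PAGES}
eager_cost = downloads            # 4 downloads before first paint

# Lazy: fetch on first view only, then reuse the cached copy.
downloads = 0
cache = {}
def view(name):
    if name not in cache:
        cache[name] = fetch_page(name)
    return cache[name]

view("home")
lazy_cost = downloads             # 1 download for 1 page viewed

assert eager_cost == 4
assert lazy_cost == 1
```

On a slow connection the difference is exactly the "download everything before I can see anything" delay described above.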

The new-fangled object-oriented programming is abused in much the same way, but this time it slows the server. Rather than just processing the code needed to send me a page, let's process the code needed for the whole site on every hit. Developers turning to OOP for their own convenience (as Blend says) add a massive load on any server, never mind the cheap ones.

I've just about given up with the way things are going.

kaled

12:38 am on Dec 14, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Object orientation is inherently slower, but the difference is marginal. It can lead to code bloat - that rather depends on how smart the linker is (for compiled code) and how well written the code is. For instance, virtual methods are typically linked even if they are never called, but static methods are usually discarded if they are not called.

The real speed killer is often event-driven programming. This can lead to repeated function calls that serve no purpose at all.

Kaled.

graeme_p

5:44 am on Dec 14, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I thought event driven servers were supposed to be fast and efficient: Lighttpd, Twisted, etc.?

As we are talking about web apps, it's mostly dynamic, byte-code-compiled languages. How much impact does OOP have there? Is there a difference between non-OO languages (if there are any left) and using a non-OO style in an OO language?

kaled

12:15 pm on Dec 14, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm not sure what you mean by "dynamic byte-code compiled languages". Languages are normally compiled (e.g. C++), semi-compiled (e.g. Java) or fully interpreted (e.g. very old fashioned BASIC).

Ultimately, all interactive computer software is event-driven. There is a loop that scans for events and a big case statement or if-then-else chain that chooses which code to call for each event. That loop can itself be inefficient, especially when decision-taking has to pass through many objects; however, the real problem arises from repeated function calls. For instance...

Suppose a window is resized by the user; this will typically trigger a resize event and a move event, which will then cause the code that records window position to be called twice. Event loops can quickly become incredibly complicated, but improving their speed is considered a low priority because hardware is so extraordinarily good - it's reliability that counts.
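One common fix for that double-call problem is to coalesce pending events before dispatching them. Here is a rough Python sketch of the idea (the event names and handlers are invented for illustration, not from any real toolkit):

```python
# Sketch of an event loop that coalesces redundant events: a user
# resize generates both "resize" and "move", but the handler that
# records window geometry should run once, not twice.
from collections import OrderedDict

handled = []

def record_window_geometry(event):   # hypothetical handler
    handled.append(event)

HANDLERS = {"resize": record_window_geometry,
            "move":   record_window_geometry}

def run_loop(events):
    # Coalesce: keep only the *last* pending event per handler,
    # so a handler shared by several events fires once per batch.
    pending = OrderedDict()
    for name in events:
        pending[HANDLERS[name]] = name
    for handler, name in pending.items():
        handler(name)

run_loop(["resize", "move"])
assert handled == ["move"]           # one geometry update, not two
```

GUI toolkits do something similar when they compress paint or geometry events in the queue; the sketch only shows the batching idea itself.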

Kaled.

JAB Creations

3:34 pm on Dec 14, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A blog is a blog.

A forum is a forum.

A product page is a product page.

A shopping cart is a shopping cart.

Each thing I've listed is a module...and while you should connect your modules you shouldn't mix them.

No page should be every page yet if you look at most sites there are three side bars, an 800 pixel width, etc.

Throw in frameworks like jQuery and that's 40KB (after compression) that has to be downloaded, uncompressed, parsed, etc.

A successful page is limited to three general "areas": content, navigation, and side. The side (however implemented) doesn't mix modules but blends them just enough to keep visitors at the site and able to interact with it on their terms.

- John

graeme_p

11:29 am on Dec 15, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Bytecode is what you call "semi-compiled". I think the term bytecode compiled is now more common.

Dynamic is also a commonly used term. It covers things like being able to alter objects (structure as well as data) at run time, redefining functions at run time, etc. TCL is very much dynamic; C definitely is not. There are many languages in between!

Most of the time event loops are in the OS or the language runtime. They cause minimal CPU usage, and using them adds little extra load. Run Lighttpd plus Twisted plus a full desktop OS on a low-end PC: what is the CPU load when they are all doing nothing, despite all the event loops running?

Funnily enough, I did just write a little script with a hand-rolled event loop to stop my laptop from over-heating.

@John, I can understand that problem, but it is more of a site design and implementation fault, rather than the software being over-specified (vordmeister's original complaint) or written using inefficient techniques/languages (kaled's view).