Using Modules in Perl
The experts are wrong
IanKelley posted 4:17 am on Sep 4, 2008 (gmt 0)

This post is a bit random...

A small yet niggling pet peeve of mine for many years now, even though I rarely use perl these days, is the prevailing wisdom among experienced perl users that modules are the way to go whenever possible.

This is completely untrue in many circumstances and is, IMO, one of the reasons that perl has been replaced by PHP as the de facto web language.

I'll use an example from my own experience... Once upon a time there was a popular, and professionally written, perl script collection that happened to run a large number of sites I did contract work for.

These sites generally made a lot of money and had a lot of traffic. In addition, the work being done by the script was unusually (and necessarily) resource-intensive. As a result they tended to bring servers to their knees.

For quite a while I assumed that this was a natural limit, that it was just all the hardware could handle. I found a couple of areas to optimize but, largely, the scripts seemed very efficient.

Then one day, at a site which I owned and could therefore tinker with freely, I was trying to squeeze some extra performance out of a specialized area of these scripts. I decided to try dropping the modules, the most important one being the well loved CGI perl module.

After replacing the module with my own functions the result was an immediate tripling of performance. Suddenly an overloaded server was no longer overloaded.
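The post doesn't show the replacement functions, but as a rough sketch of the kind of hand-rolled parsing that might stand in for CGI.pm in a case like this (URL-encoded GET/POST only, no multipart uploads; the "name" field is just an example):

#!/usr/bin/perl
# Hypothetical minimal stand-in for CGI.pm's param() -- handles only
# URL-encoded GET/POST data, no multipart uploads, no cookies.
use strict;
use warnings;

sub parse_params {
    my $data = '';
    if (($ENV{REQUEST_METHOD} || '') eq 'POST') {
        read(STDIN, $data, $ENV{CONTENT_LENGTH} || 0);
    } else {
        $data = $ENV{QUERY_STRING} || '';
    }
    my %params;
    for my $pair (split /[&;]/, $data) {
        my ($key, $value) = split /=/, $pair, 2;
        next unless defined $key && defined $value;
        for ($key, $value) {
            tr/+/ /;                                # '+' means space
            s/%([0-9A-Fa-f]{2})/chr(hex($1))/eg;    # URL-decode
        }
        $params{$key} = $value;
    }
    return \%params;
}

my $params = parse_params();
print "Content-type: text/html\n\n";
print "Hello, ", (defined $params->{name} ? $params->{name} : 'world'), "\n";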

Just like that I had saved thousands of dollars in new hardware infrastructure.

I have since seen similar issues on popular and successful websites that were written in perl. In each case they would be running on a server farm when they really should have needed only a few servers (and in some cases just one).

The overhead of loading a module that contains endless lines of unneeded code kills high-load applications. Or so I assume.

A PHP script written to do the same thing, meanwhile, would use built-in functions implemented in C and avoid the issue entirely.

Sure, perl modules are great for little apps on little sites where performance is never going to be an issue, but on a site where performance can mean the difference between 4 and 5 figures monthly in hardware costs, they are a huge mistake.

And yet 9 out of 10 perl experts will tell you differently with a straight face. :-)

 

janharders posted 4:55 am on Sep 4, 2008 (gmt 0)

sounds a lot like you're using perl as a cgi. of course, that's slow. go mod_perl - you'll love the speed (really, I'm talking speed here, not just a factor of 3) and modules you use can be loaded at server startup, so every handler can use them without taking the time to load them each and every time ...
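For the record, preloading under mod_perl is usually done with a startup script pulled in from httpd.conf; a sketch (the path and the module list are just placeholders):

# httpd.conf -- pull in a startup script once, when Apache starts
PerlRequire /etc/httpd/conf/startup.pl

# startup.pl -- preload heavy modules into the parent httpd so the
# children share the compiled code instead of compiling it per request
use strict;
use CGI ();
use DBI ();
1;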

IanKelley posted 5:51 am on Sep 4, 2008 (gmt 0)

go mod_perl

And lose the ability to distribute the application.

you'll love the speed

Why when I can use PHP with the everyday stuff built in and pre-compiled? Even faster execution, lower development time.

The only thing I use perl for these days is file access and system utilities since it gets better permissions by default on most server setups.

Not trying to start a PHP versus Perl thread though. I'm still fond of perl, I miss the shorthand :-)

modules you use can be loaded at server startup

Which is to say, load a lot of code you will never use into memory.

I'd rather cache database queries, etc. in those blocks thanks!

janharders posted 6:19 am on Sep 4, 2008 (gmt 0)

>>And lose the ability to distribute the application.

I thought you were talking about heavy-traffic websites -- they won't mind using mod_perl, at least I never encountered that. Only regular hosting requires CGI (although: just use a generic interface and your app will run just fine on both).

>>Why when I can use PHP with the everyday stuff built in and pre-compiled? Even faster execution, lower development time.

I don't know about development time, guess that's an individual thing. I'm pretty sure though that mod_perl still beats php in speed, once it's loaded (e.g. if you don't load your handlers at startup, the first request might take a second).

plus, and that's the real big thing for me: controlling whatever you want to, not just the ResponsePhase in the request cycle of apache.
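For anyone following along, a bare-bones mod_perl 2 response handler looks roughly like this (the package name and URL are made up); the same mechanism lets you register handlers for the other request phases as well:

# MyApp/Handler.pm -- minimal mod_perl 2 response handler (sketch)
package MyApp::Handler;
use strict;
use warnings;
use Apache2::RequestRec ();
use Apache2::RequestIO ();
use Apache2::Const -compile => qw(OK);

sub handler {
    my $r = shift;                     # the Apache2::RequestRec object
    $r->content_type('text/plain');
    $r->print("served from inside apache, no new process per request\n");
    return Apache2::Const::OK;
}

1;

# httpd.conf:
#   <Location /myapp>
#       SetHandler perl-script
#       PerlResponseHandler MyApp::Handler
#   </Location>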

>>Which is to say, load a lot of code you will never use into memory.

hey, you were the one talking about php ;)
basically, that's what php does. afair, it has like 3000 built-in functions. and, in a sane shared hosting environment, you'll most likely lose php's speed, since there are very few adventure-loving admins that run mod_php on a server with alien scripts...

IanKelley posted 6:35 am on Sep 4, 2008 (gmt 0)

In any case, you're still arguing PHP versus Perl, and I was talking about perl modules versus custom functions.

janharders posted 6:41 am on Sep 4, 2008 (gmt 0)

I know ... but still, give mod_perl a try.
I was once working on a heavy dynamic content website a few years back. It was sitting on its own server and running just fine, until traffic went up a few hundred percent. It wasn't so much the module loading time that brought the server to a halt but the whole process generation inherent to cgi. I jumped to mod_perl, got the basic stuff migrated in one evening, and while the traffic remained the same, server load went from insane to nothing; delivery speed was great, no matter how many parallel users.
Of course, even mod_perl has its limitations (e.g. you may need multiple servers) but they're far beyond those of cgi scripts and I've never reached them or come close.

Edit: just to get back on topic - you're right, and nobody would argue against you; that's why there are many lightweight versions of the standard heavy modules around, aimed at those who need the most performance.
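CGI::Simple is one example of that kind of lightweight replacement: it keeps (roughly) CGI.pm's query-parsing interface but leaves out the HTML-generation functions. A usage sketch:

#!/usr/bin/perl
# Sketch: CGI::Simple as a lighter drop-in for basic CGI.pm usage.
use strict;
use warnings;
use CGI::Simple;

my $q    = CGI::Simple->new;
my $name = $q->param('name');          # same param() call as CGI.pm

print $q->header(-type => 'text/html');
print 'Hello, ', (defined $name ? $name : 'world'), "\n";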

perl_diver posted 4:42 pm on Sep 4, 2008 (gmt 0)

Modules are often written as a one-size-fits-all, cross-platform solution, in which case they can be inefficient, sometimes grossly so. They can be safer and speed up development time, but if you really need efficiency and you know what you are doing, write your own code. You can also use XS with perl to run C code.

[perldoc.perl.org...]
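The perldoc link above covers XS itself; Inline::C is a lower-friction way to get at the same thing from a script, roughly like this (add() is just a toy function):

#!/usr/bin/perl
# Sketch: Inline::C compiles the embedded C the first time the script
# runs and caches the compiled object for later runs.
use strict;
use warnings;
use Inline C => <<'END_C';
int add(int a, int b) {
    return a + b;
}
END_C

print add(2, 3), "\n";   # the C function is callable as plain perl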

perl_diver posted 4:51 pm on Sep 4, 2008 (gmt 0)

The reason 9 out of 10 experts tell people to use modules is that, on forums, the people asking questions generally have little or no idea of what they are doing or getting themselves into. In that case using a module is a very good idea. Especially in the insanely insecure world of CGI scripts. There are too many things that you need to know about to be able to write secure CGI scripts. Your average person is better off using the CGI module (and other modules).
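To make that concrete, the kind of script beginners are usually steered towards looks something like this: taint mode on, parameter parsing and output escaping delegated to the module (the "name" field is just an example):

#!/usr/bin/perl -T
# Typical beginner-safe CGI.pm usage: taint mode on, parameters parsed
# by the module, output escaped before it is echoed back.
use strict;
use warnings;
use CGI;

my $q    = CGI->new;
my $name = $q->param('name');

print $q->header(-type => 'text/html');
print 'Hello, ', $q->escapeHTML(defined $name ? $name : 'world'), "\n";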

If they ever get into the desirable position of having so much traffic to a site they need to squeeze out all the performance they can get, then they will have to invest accordingly, in hardware, software, and everything in between. That is a step up most people will never face.

IanKelley posted 7:44 pm on Sep 4, 2008 (gmt 0)

the people asking questions generally have little or no idea of what they are doing or getting themselves into. In that case using a module is a very good idea.

You're right and it's understandable.

Nevertheless, what it really translates to is an excuse. It's the act of passing on poor programming practices for convenience's sake.

Your average person is better off using the CGI module (and other modules).

It's not just your average person. Many professional programmers and gurus routinely use the CGI module (et al). As they will very loudly tell you :-)

If they ever get into the desirable position of having so much traffic to a site they need to squeeze out all the performance they can get, then they will have to invest accordingly

Go back and rewrite the entire back end? That rarely happens; instead you end up with more and more hardware.

That is a step up most people will never face.

I suppose it's a matter of perspective.

In my experience websites with millions of daily requests are far from rare. And if your application is high load (lots of data crunching, database queries, external requests, etc...) you really only need to get into the 6 figure requests range before performance becomes an issue.

Also consider that for a small-time webmaster the difference between one server and a database + web server setup is a big deal. Especially when, in most cases, the realistic move is from one to three servers.

The bottom line is that if you're coding for the web, efficiency is important from the very beginning. It's not realistic to expect to be able to go back and fix your efficiency mistakes later when the application becomes successful.

IanKelley posted 7:45 pm on Sep 4, 2008 (gmt 0)

I know ... but still, give mod_perl a try.

I've used it in the past. I'm definitely not saying there's anything wrong with it.

rocknbil posted 4:14 pm on Sep 5, 2008 (gmt 0)

I thought I was the only one; I totally agree - but I am guilty as charged: the CGI module allows me to be lazy when parsing multipart form data, and the DBI module makes connecting to MySQL stupid easy. Everything else is built from scratch; it makes it so much easier to know what's going on when things go sour.
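For reference, those two conveniences look roughly like this in a script (the DSN, credentials, table, and form field names are all placeholders):

#!/usr/bin/perl
# Sketch of the two conveniences: CGI.pm handling a multipart upload
# and DBI talking to MySQL with a placeholder query.
use strict;
use warnings;
use CGI;
use DBI;

my $q  = CGI->new;
my $fh = $q->upload('attachment');   # filehandle for a multipart/form-data file field (undef if none)

my $dbh = DBI->connect(
    'DBI:mysql:database=example;host=localhost',   # placeholder DSN
    'dbuser', 'dbpass',
    { RaiseError => 1, AutoCommit => 1 },
);

my $sth = $dbh->prepare('SELECT title FROM widgets WHERE id = ?');
$sth->execute(scalar $q->param('id'));
my ($title) = $sth->fetchrow_array;

print $q->header(-type => 'text/plain');
print defined $title ? $title : 'not found', "\n";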

perl_diver posted 8:28 pm on Sep 5, 2008 (gmt 0)

I guess you could call it lazy if you knew how to write the code to do what you needed but just wanted to use the module instead. I use modules all the time though and I am capable of writing most things myself, and I don't think I'm lazy. If there was a time and place where a module was not practical for whatever reason then sure, I would write the code myself.


The bottom line is that if you're coding for the web, efficiency is important from the very beginning. It's not realistic to expect to be able to go back and fix your efficiency mistakes later when the application becomes successful.

Nearly every major website that has been around for a number of years has done just that several times over. I remember when ebay used perl and had one server. There is no way the founder (his name escapes me now) could have predicted the kind of traffic ebay was going to generate from the very beginning. Many popular sites started out on a shoestring budget and had to do total overhauls; some did them a number of times and I assume they continue to do them.

IanKelley posted 8:57 pm on Sep 5, 2008 (gmt 0)

So we should write applications with no foresight because that's how it was done in the early days of the web?

Is there any good reason not to write efficient code from the beginning?

These days pretty much everyone launches websites with the intent of getting large quantities of traffic. And the total net population is large enough to make that a reasonable goal.

perl_diver posted 10:04 pm on Sep 5, 2008 (gmt 0)

If you are starting a website on a shoestring, you will most likely use whatever is available at low or no cost and worry about growth pains as they occur; that is the standard business model for 99% of startups. If you have lots of resources, like eHarmony did before "opening" the doors, that's a different story. Hire good programmers and engineers from the start and use the best technology available from the beginning.

I'm working on a guy's website right now, started on a shoestring; he is now purchasing his first server and hiring people to help him now that he is generating income and traffic. That's really what is so cool about an internet business: you can literally start it with a few dollars and expand as your needs increase. There is really no need to start with all the bells and whistles and blow a wad of cash unless you have those resources and can afford to lose them in case the website fails to generate enough revenue.

IanKelley posted 1:55 am on Sep 7, 2008 (gmt 0)

I totally agree about internet business. Watching friends and clients achieve success online is almost as much fun as achieving it yourself.

But that doesn't have much to do with coding for the web.

you will most likely use whatever is available at low or no cost

Efficient code is free.
