This question is probably better answered in a search engine forum. I'd guess the answer depends on how well coded the pages your Perl program generates are. If they have the proper headers and formatting that search engines look for, they should be fine.
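By "proper headers" I mean at minimum a correct Content-Type followed by valid HTML. A bare-bones Perl CGI sketch of that (the markup is just a placeholder, not from any real site):

```perl
#!/usr/bin/perl
# Minimal CGI sketch: send a proper Content-Type header, then valid HTML.
use strict;
use warnings;

# The blank line after the header block is required by the CGI protocol.
print "Content-Type: text/html; charset=UTF-8\n\n";

print <<'HTML';
<!DOCTYPE html>
<html>
<head><title>Example page</title></head>
<body><p>Generated by Perl, but indistinguishable from static HTML to a crawler.</p></body>
</html>
HTML
```

A crawler only sees the HTTP response, so as long as the output looks like a well-formed static page it doesn't matter what produced it.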
Every website I've built for myself in the past year has been 100% Perl-coded and I have no issues getting indexed or crawled. The pages don't appear to be scripts (they have .htm extensions), but I haven't had any issues even with query strings like http://example.com/file.htm?this=1234&that=4321.
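In case anyone's wondering how a .htm URL can be backed by a script: the exact setup varies, but one common approach is a couple of lines in .htaccess (just a sketch; your server may be configured differently):

```apache
# Hypothetical .htaccess sketch: have Apache run .htm files in this
# directory as CGI scripts instead of serving them as static files.
Options +ExecCGI
AddHandler cgi-script .htm
```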
Yup, almost all my sites are completely generated by Perl scripts and the lowest-ranked one (other than the brand new one) is PR4 and ranks great. I do use mod_rewrite to make the URLs look like [blah_blah...], but back when I left them as blahblah.cgi?page=this&stuff=whatever they got indexed too.
I do recommend mod_rewrite or something similar, though, since every search engine can index rewritten URLs, while some may still not do well with query strings like blahblah.cgi?this=that.
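For anyone who hasn't set it up before, the rewrite is roughly this in .htaccess (a sketch only; the URL pattern and parameter names are placeholders for whatever your script actually expects):

```apache
# Sketch: map a static-looking URL onto the real CGI query string, e.g.
#   /page/this/whatever.htm  ->  /blahblah.cgi?page=this&stuff=whatever
RewriteEngine On
RewriteRule ^page/([^/]+)/([^/]+)\.htm$ /blahblah.cgi?page=$1&stuff=$2 [L,QSA]
```

The [L] flag stops further rewriting once the rule matches, and [QSA] appends any extra query string the visitor sends, so existing links keep working.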