| 9:43 pm on Mar 27, 2006 (gmt 0)|
Do you mean the site runs out of a cgi-bin directory, with URLs like
| 9:49 pm on Mar 27, 2006 (gmt 0)|
This question is probably better answered in a search engine forum. I'd guess the answer depends on how well coded the pages are that your Perl program generates. If they have the proper headers and formatting that search engines look for, then they should be OK.
| 9:50 pm on Mar 27, 2006 (gmt 0)|
do you use static urls or are they loaded with parameters?
that wouldn't be fixed by building a new site
there are no problems with Perl sites simply because they are Perl; the same is true of any language. Problems usually stem from other issues.
| 9:23 pm on Mar 28, 2006 (gmt 0)|
I would suggest using URL rewrite rules.
| 9:27 pm on Mar 28, 2006 (gmt 0)|
The technology you use to generate your web pages is irrelevant to your search rankings. Are you outputting HTML with standard <a href=> links? If so, you'll be fine.
| 7:48 am on Mar 30, 2006 (gmt 0)|
Every website I've built for myself in the past year has been 100% perl coded and I have no issues getting indexed or crawled. The pages don't appear to be scripts (.htm extensions), but I haven't had any issues even with strings like http://example.com/file.htm?this=1234&that=4321.
| 3:46 pm on Apr 11, 2006 (gmt 0)|
Yup, almost all my sites are completely generated by Perl scripts, and the lowest-ranked one (other than the brand new one) is PR4 and ranks great. I do use mod_rewrite to make the URLs look like [blah_blah...], but back when I left them as blahblah.cgi?page=this&stuff=whatever, I got those indexed as well.
I do recommend mod_rewrite or similar, though, since all search engines can index rewritten URLs, while some of them may still not do well with blahblah.cgi?this=that .
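For anyone trying this, a minimal sketch of the kind of rule involved (assumes mod_rewrite is enabled; the URL pattern and the script name blahblah.cgi with its page/stuff parameters are just placeholders matching the example above):

```apache
# .htaccess sketch, not a drop-in rule
RewriteEngine On

# Map a static-looking URL such as /page/this/whatever
# onto the real script: /blahblah.cgi?page=this&stuff=whatever
RewriteRule ^page/([^/]+)/([^/]+)/?$ /blahblah.cgi?page=$1&stuff=$2 [L,QSA]
```

The [L] flag stops further rewriting for the request, and [QSA] preserves any query string the visitor supplied alongside the generated parameters.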
| 10:47 pm on Apr 14, 2006 (gmt 0)|
Mod_rewrite is the answer, whether the question is perl or php.