
Perl Server Side CGI Scripting Forum

    
Perl based website - Ranking in the engines
Anyone know if using Perl-based programming...
marketingmagic
5+ Year Member
Msg#: 4418 posted 9:24 pm on Mar 27, 2006 (gmt 0)

We've just launched a new site that has been completely programmed in Perl.

Just wondering if anyone has any comments on the search engines' ability to crawl and index such a site.

MSN has pretty much dropped the new site with the exception of a few pages. Yahoo hasn't had us in there for ages due to some sort of penalty. (I was hoping the new site would address this.)

Google still has us listed, but primarily just for the homepage. Very few of the interior pages are ranked.

Anyone have any experience with this type of site?

 

zCat
10+ Year Member
Msg#: 4418 posted 9:43 pm on Mar 27, 2006 (gmt 0)

Do you mean the site runs out of a cgi-bin directory, with URLs like
/cgi-bin/site.pl?page_id=12345 etc.?

perl_diver
5+ Year Member
Msg#: 4418 posted 9:49 pm on Mar 27, 2006 (gmt 0)

This question is probably better answered in a search engine forum. I guess the answer depends on how well coded the pages are that your Perl program generates. If they have the proper headers and formatting the search engines look for, then they should be OK.
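For what it's worth, a minimal sketch of what "proper headers and formatting" can look like from a Perl CGI script (the script name, the page_id parameter, and the link target are placeholders, not anything from the actual site):

#!/usr/bin/perl
use strict;
use warnings;
use CGI;

my $q = CGI->new;

# hypothetical page id from the query string, e.g. site.pl?page_id=12345
my $page_id = $q->param('page_id');
$page_id = '1' unless defined $page_id && $page_id =~ /^[0-9]+$/;

# a proper Content-Type header has to go out before any markup
print $q->header( -type => 'text/html', -charset => 'UTF-8' );

# plain, crawlable HTML: a real <title> and standard <a href> links
print "<html><head><title>Example page $page_id</title></head><body>\n";
print "<h1>Example page $page_id</h1>\n";
print qq{<p><a href="/cgi-bin/site.pl?page_id=12346">Next page</a></p>\n};
print "</body></html>\n";

As long as the output is ordinary HTML with real links, the crawler never knows or cares that Perl produced it.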

jatar_k
WebmasterWorld Administrator jatar_k is a WebmasterWorld Top Contributor of All Time 10+ Year Member
Msg#: 4418 posted 9:50 pm on Mar 27, 2006 (gmt 0)

do you use static URLs or are they loaded with parameters?

>> penalty

that wouldn't be addressed just by launching a new site

there are no problems with Perl sites simply because they are Perl; the same goes for any language. ranking problems usually stem from other issues

flashfan
10+ Year Member
Msg#: 4418 posted 9:23 pm on Mar 28, 2006 (gmt 0)

I would suggest using URL rewrite (mod_rewrite) rules.
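Something along these lines, assuming Apache with mod_rewrite enabled and a script at /cgi-bin/site.pl that takes a page_id parameter (the paths and parameter name are only placeholders):

# .htaccess in the document root (requires mod_rewrite)
RewriteEngine On

# Map a clean URL like /page/12345.html onto the real Perl script.
# The query string is rebuilt internally, so visitors and crawlers
# only ever see the clean URL.
RewriteRule ^page/([0-9]+)\.html$ /cgi-bin/site.pl?page_id=$1 [L,QSA]

The internal links in the generated pages should then point at the /page/12345.html form rather than the query-string form.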

MichaelBluejay
WebmasterWorld Senior Member 10+ Year Member
Msg#: 4418 posted 9:27 pm on Mar 28, 2006 (gmt 0)

The technology you use to generate your web pages is irrelevant to your search rankings. Are you outputting HTML with standard <a href=> links? If so, you'll be fine.

webgo2
5+ Year Member
Msg#: 4418 posted 7:48 am on Mar 30, 2006 (gmt 0)

Every website I've built for myself in the past year has been 100% Perl-coded, and I have no issues getting indexed or crawled. The pages don't appear to be scripts (they use .htm extensions), but I haven't had any issues even with URLs like http://example.com/file.htm?this=1234&that=4321.
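One way to get .htm URLs served by Perl on Apache (a sketch, assuming the host allows these overrides; other setups such as a rewrite rule work just as well):

# .htaccess: one possible way to serve Perl output under .htm URLs on Apache.
# Each .htm file is then an ordinary Perl CGI script with a shebang line,
# and query strings like file.htm?this=1234&that=4321 reach it as usual.
Options +ExecCGI
AddHandler cgi-script .htm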

JollyK
10+ Year Member
Msg#: 4418 posted 3:46 pm on Apr 11, 2006 (gmt 0)

Yup, almost all my sites are completely generated by Perl scripts, and the lowest one (other than the brand new one) is PR4 and ranks great. I do use mod_rewrite to make URLs look like [blah_blah...] but when I used to leave them as blahblah.cgi?page=this&stuff=whatever, I got those indexed also.

I do recommend mod_rewrite or similar, though, since all search engines can index rewritten URLs, and some of them may still not do well with blahblah.cgi?this=that.
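Clean-looking URLs can also be had without mod_rewrite by letting the script dispatch on PATH_INFO; a rough sketch (the paths and page names are made up):

#!/usr/bin/perl
use strict;
use warnings;
use CGI;

# With URLs like /cgi-bin/site.pl/products/widgets the part after the
# script name arrives in PATH_INFO, so the links look static without
# any rewrite rules at all.
my $q    = CGI->new;
my $path = $q->path_info() || '/';    # e.g. "/products/widgets"

my %pages = (
    '/'                 => 'Home',
    '/products/widgets' => 'Widgets',
);

my $title  = $pages{$path} || 'Not found';
my $status = $pages{$path} ? '200 OK' : '404 Not Found';

print $q->header( -type => 'text/html', -status => $status );
print "<html><head><title>$title</title></head>";
print "<body><h1>$title</h1></body></html>\n";

Whichever route you take, returning a real 404 for unknown pages keeps the engines from indexing error pages.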

JK

BananaFish
5+ Year Member
Msg#: 4418 posted 10:47 pm on Apr 14, 2006 (gmt 0)

mod_rewrite is the answer, whether the question is Perl or PHP.
