
Forum Moderators: coopster & jatar k & phranque


Perl based website - Ranking in the engines

Anyone know if using Perl-based programming...

     
9:24 pm on Mar 27, 2006 (gmt 0)

Full Member

10+ Year Member

joined:Oct 25, 2005
posts:307
votes: 0


We've just launched a new site that has been completely programmed in Perl.

Just wondering if anyone has any comments on the search engines' ability to crawl and index such a site.

MSN has pretty much dropped the new site with the exception of a few pages. Yahoo hasn't had us in there for ages due to some sort of penalty. (I was hoping the new site would address this.)

Google has us listed still, but primarily for only the homepage. Very few of the interior pages are ranked.

Anyone have any experience with this type of site?

9:43 pm on Mar 27, 2006 (gmt 0)

Preferred Member

10+ Year Member

joined:Oct 1, 2004
posts:607
votes: 0


Do you mean the site runs out of a cgi-bin directory, with URLs like
/cgi-bin/site.pl?page_id=12345
etc.?
9:49 pm on Mar 27, 2006 (gmt 0)

Preferred Member

10+ Year Member

joined:Jan 5, 2006
posts:536
votes: 0


This question is probably better answered in a search engine forum. I guess the answer depends on how well coded the pages your Perl program generates are. If they have the proper headers and formatting the search engines look for, then they should be OK.
9:50 pm on Mar 27, 2006 (gmt 0)

Administrator

WebmasterWorld Administrator jatar_k is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:July 24, 2001
posts:15755
votes: 0


do you use static urls or are they loaded with parameters?

>> penalty

that wouldn't be addressed with a new site

there are no problems with perl sites because they are perl, same with any language, they usually stem from other issues

9:23 pm on Mar 28, 2006 (gmt 0)

Full Member

10+ Year Member

joined:Feb 24, 2004
posts:219
votes: 0


I would suggest using URL rewrite rules (e.g. mod_rewrite).
9:27 pm on Mar 28, 2006 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Feb 12, 2003
posts:1199
votes: 0


The technology you use to generate your web pages is irrelevant to your search rankings. Are you outputting HTML with standard <a href=> links? If so, you'll be fine.
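As a rough sketch of that point (the script name and page links below are made-up examples, not from this thread): a Perl CGI script just needs to emit a proper Content-Type header followed by ordinary HTML with plain <a href> links, and a crawler treats the result like any static page.

```perl
#!/usr/bin/perl
# Minimal sketch: a CGI script whose output is ordinary, crawlable HTML.
# The page names and titles are hypothetical illustrations.
use strict;
use warnings;

# The header must come first, followed by a blank line.
print "Content-Type: text/html\n\n";

print <<'HTML';
<html>
<head><title>Widgets</title></head>
<body>
<h1>Widgets</h1>
<!-- Standard <a href> links: spiders follow these like any static page -->
<p><a href="/widgets/red.htm">Red widgets</a></p>
<p><a href="/widgets/blue.htm">Blue widgets</a></p>
</body>
</html>
HTML
```

From the search engine's side there is nothing here to distinguish this response from a hand-written .html file.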
7:48 am on Mar 30, 2006 (gmt 0)

New User

10+ Year Member

joined:Nov 26, 2005
posts:21
votes: 0


Every website I've built for myself in the past year has been 100% perl coded and I have no issues getting indexed or crawled. The pages don't appear to be scripts (.htm extensions), but I haven't had any issues even with strings like http://example.com/file.htm?this=1234&that=4321.
3:46 pm on Apr 11, 2006 (gmt 0)

Preferred Member

10+ Year Member

joined:July 19, 2003
posts:538
votes: 0


Yup, almost all my sites are completely generated by Perl scripts and the lowest one (other than the brand new one) is PR4 and ranks great. I do use mod_rewrite to make the URLs look like [blah_blah...] but when I used to leave them as blahblah.cgi?page=this&stuff=whatever I got those indexed also.

I do recommend mod_rewrite or similar, though, since all search engines can index rewritten URLs, and some of them may still not do well with blahblah.cgi?this=that.
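For reference, a mod_rewrite rule along these lines is a sketch of how the mapping works (the script path and parameter names echo the hypothetical blahblah.cgi example above; this fragment would go in httpd.conf or an .htaccess file, and is not a drop-in configuration):

```apache
# Sketch only: map a friendly URL like /page/this/whatever
# onto the CGI script's query-string form. Adjust the path,
# pattern, and parameter names to your own script.
RewriteEngine On
RewriteRule ^page/([^/]+)/([^/]+)$ /cgi-bin/blahblah.cgi?page=$1&stuff=$2 [L,QSA]
```

The [L] flag stops further rewriting for the request, and [QSA] appends any existing query string rather than discarding it.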

JK

10:47 pm on Apr 14, 2006 (gmt 0)

Full Member

10+ Year Member

joined:Sept 7, 2005
posts:242
votes: 0


mod_rewrite is the answer, whether the question is Perl or PHP.