Forum Moderators: LifeinAsia


Best platform (from SE friendly viewpoint) for ecommerce sites


chelseaandco

8:12 pm on Jun 9, 2003 (gmt 0)

10+ Year Member



Hello,
Again, I'm new here and have some pressing questions. I am in the process of hiring someone to re-design my gift site with shopping cart and online ordering. I was wondering which is the most search engine friendly platform? My current site is with ColdFusion and not doing well with search engine placement.

What about ASP or PHP?

Thanks,
Michele

Travoli

9:25 pm on Jun 9, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Hi Michele

The database software is not nearly as important as the URLs that your site is built with. You'll want to make sure they keep the URLs as static-looking as possible, which means no "?", "=", etc.

There are several ways that this can be done. Does this make sense?
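One of those ways can be sketched concretely: the server accepts a static-looking URL and maps it internally to the real dynamic one. Here is a minimal Python sketch of that mapping (the /widgets/ path scheme and the store URL are made up for illustration, not anyone's actual site):

```python
import re

def static_to_dynamic(path):
    """Map a crawler-friendly, static-looking path to the dynamic
    query-string URL the application actually serves.
    The /widgets/<id>.html scheme here is hypothetical."""
    m = re.match(r"^/widgets/([0-9]+)\.html$", path)
    if m:
        return "/store/index.php?action=item&id=" + m.group(1)
    return path  # everything else passes through unchanged

print(static_to_dynamic("/widgets/73.html"))
# -> /store/index.php?action=item&id=73
```

The visitor (and the spider) only ever sees /widgets/73.html; the "?" and "=" stay server-side.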

Storyteller

10:16 pm on Jun 10, 2003 (gmt 0)

10+ Year Member



Mason (www.masonhq.com) has some unique features that let you build extremely SEO-friendly sites.

On a site I'm currently building, I arranged for the titles defined within pages to be used automatically as the 'title' attribute on every anchor that points to that page (or, if a special link_title page attribute is defined, that is used instead). There are more tricks that make life a lot easier for SEO, thanks to the inheritance concept and the URL flexibility Mason offers.

I've seen a lot of frameworks, but this one tops them all for me as far as SEO goes.

netguy

12:30 am on Jun 11, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



travoli - I completely agree about keeping URLs, and especially PHP query strings, as short as possible. A client went out and hired his brother-in-law to code his store in PHP. A typical product URL looked like this:

www.clients-domain.com/store/index.php?action=item&id=73&subid=&PHPSESSID=e574673a01a7d9a3bfb65a173e9150b9

Needless to say, it was a complete disaster: all 500+ pages were ignored by Google.
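As an aside, the PHPSESSID in that URL is avoidable through configuration alone. These are standard php.ini session directives (the same flags can usually also be set per-directory with php_flag in .htaccess):

```ini
; Keep session IDs out of URLs entirely
session.use_only_cookies = 1  ; track sessions via cookies only
session.use_trans_sid = 0     ; never append PHPSESSID=... to links
```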

As you said travoli, there is a right way, and a wrong way.

ircgeeks

1:32 am on Jun 11, 2003 (gmt 0)

10+ Year Member



Even in bad URL situations you can always use mod_rewrite. I took a site from 3 pages deep to 3,000 deep using mod_rewrite, a lot faster than starting from scratch.

netguy

1:57 am on Jun 11, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



ircgeeks...

Care to elaborate on how to handle 'fat' PHP files with mod_rewrite, for future reference? We completely redid the site, then added a strong front-end to satisfy the SEs.

In my example above, we're back to #5 on page 1 (from page 6), but we may have been able to correct this more efficiently.

mincklerstraat

5:39 pm on Jun 30, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Definitely use mod_rewrite. It comes standard in most Apache server setups; you just need to make sure it's enabled in the config file httpd.conf. mod_rewrite takes an incoming page request for a file like index12-7.html and converts it into a URL the server really understands, like "index.php?this=12&that=7".

My .htaccess file first makes sure Apache is rewriting with these lines:
Options +FollowSymlinks
RewriteEngine On

Then it has lines like this:
RewriteRule ^page([1-9][0-9]*)\.html$ secd.php?artid=$1
RewriteRule ^page([1-9][0-9]*)-([1-9][0-9]*)\.html$ secd.php?artid=$1&page=$2

These are rules telling Apache, when it gets a request for page26.html, to translate it into the URL secd.php?artid=26.
It will also go from page12-3.html to secd.php?artid=12&page=3.

The ([1-9][0-9]*) is a regular expression matching a digit from 1-9 followed by any number of digits 0-9, and the \.html just corresponds to the .html in the incoming URL (the dot is escaped because a bare "." in a regex matches any character). The $1 reuses the digits captured between the parentheses. In the second rewrite rule there are two sets of numbers, mapped onto the URL secd.php with the two numbers taking the places of $1 and $2. So even though no such HTML file actually exists on your server, Apache takes the number(s) and feeds them as parameters to secd.php. Google for mod_rewrite; one of the developers of Apache has a nice article on using it.
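Those two patterns can be checked outside Apache. Here is a quick Python test of them (dots escaped and end-anchored, since in a regex a bare "." matches any character):

```python
import re

# The patterns from the RewriteRules above, dot escaped and end-anchored
one_part = re.compile(r"^page([1-9][0-9]*)\.html$")
two_part = re.compile(r"^page([1-9][0-9]*)-([1-9][0-9]*)\.html$")

m = one_part.match("page26.html")
print(m.group(1))                  # -> 26, the value Apache puts in $1

m = two_part.match("page12-3.html")
print(m.group(1), m.group(2))      # -> 12 3, i.e. $1 and $2
```

Note that page12-3.html does not match the one-part pattern at all, so the two rules never collide.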

If you are using a content-management system, chances are good that it has lots of fancy ways to show essentially the same page in different views. Don't try, then, to make rewrite rules corresponding to every last page. You might find that there are places where search robots (which are pretty dumb in this way) get stuck in infinite loops, or go mad clicking all over the place and drowning your server. Just rewrite the URLs for the pages that matter, and leave out the URLs that are likely to be only duplicate content (which you'd get punished for anyway). Even better, put these in a directory that's marked as no-go with robots.txt.
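A robots.txt for that no-go directory is only a couple of lines (the /print/ directory name is just a hypothetical example of a duplicate-content view):

```
User-agent: *
Disallow: /print/
```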

You will also, of course, have to make your software generate these new URLs in its links. Often this is done by buffering the page output with ob_start() and the other output-buffering functions, then rewriting the links before the page is sent. Or you can change the software to produce these URLs itself, which makes for better performance / speed on your site.
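The ob_start() approach boils down to a regex substitution over the finished page: the inverse of the rewrite rules. A Python sketch of the idea (assuming the secd.php?artid=N&page=M scheme from earlier in the thread, and links written with a bare &):

```python
import re

def staticize_links(html):
    """Rewrite dynamic links in a rendered page to their static
    equivalents, the way an ob_start() callback would.
    The two-parameter pattern must run first, or the one-parameter
    pattern would eat the front of artid=N&page=M links."""
    html = re.sub(r"secd\.php\?artid=([0-9]+)&page=([0-9]+)",
                  r"page\1-\2.html", html)
    html = re.sub(r"secd\.php\?artid=([0-9]+)", r"page\1.html", html)
    return html

print(staticize_links('<a href="secd.php?artid=12&page=3">next</a>'))
# -> <a href="page12-3.html">next</a>
```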

You also don't want too many rewrite rules, to keep your site's speed up.

I'm currently writing a bit of software as a front-end for one of the more popular PHP content management systems; it produces spider-friendly URLs in combination with mod_rewrite and only needs a small number of rewrite rules, but it isn't shopping cart software. However, you can always put your spider-friendly content on non-shopping-cart pages and leave the "put this in your shopping cart" button on the usual ugly multi-parameter PHP URLs, as long as your important content is there to get spidered. Sticky me if interested.