Which E-Commerce package is the most search engine friendly?

opiesilver

10:13 am on Mar 19, 2004 (gmt 0)

Hi all,

I've got a site that I've been running Interchange on for about three years, but it doesn't seem to index very well. What I need is your opinion on which package is the best and most search engine friendly, i.e. which one indexes well. I'm open to all suggestions; please let me know why you think some are better than others.

Thanks!

derekwong28

3:41 pm on Mar 19, 2004 (gmt 0)

As a rule, static HTML pages index better than dynamic pages. To rank well, it is important to put your keywords into the title and meta tags, and perhaps even into the page URL.
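
For example, a static product page head might look something like this (an illustrative sketch; the file name, store name, and keywords are all made up):

<!-- served as a static URL with the keyword in it, e.g. /blue-widgets.html -->
<head>
  <title>Blue Widgets - Acme Widget Store</title>
  <meta name="keywords" content="blue widgets, buy widgets online">
  <meta name="description" content="Acme's range of blue widgets.">
</head>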

There should be no problems with LiteCommerce or X-Cart, which can generate an HTML catalog. You can also look at Actinic, which is totally HTML.

sun818

4:28 pm on Mar 19, 2004 (gmt 0)

I looked at the LiteCommerce demo. Each product page needs to have a unique title, both for clearer communication to your customers and for better search engine listings. LiteCommerce does not do this... yet.

derekwong28

4:42 pm on Mar 19, 2004 (gmt 0)

Hi Sun. You can get LiteCommerce to do this with a minor modification to the code, so that the keywords you put in the meta tags field for individual products come up in the title. You have to ask the LiteCommerce support team to show you how to do it.

sun818

4:56 pm on Mar 19, 2004 (gmt 0)

Thanks, Derek. I have not used the script myself, so I have no idea what the code looks like; I was just going by the demo. There's a real need out there for a low-cost, search engine friendly e-commerce/catalog program that is easy to install.

tberthel

2:09 am on Mar 21, 2004 (gmt 0)

I use Weblisket(TM). It uses static pages.

Ian_Cowley

4:38 pm on Mar 22, 2004 (gmt 0)

Hi,

Interchange can be made *very* search engine friendly. I have built many sites that look just like static sites but are in fact huge database-driven websites.

It's also robot (spider) aware and will turn off session handling to stop duplicate pages making their way into the SERPs.

The freedom of feature implementation in Interchange is unsurpassed. You can database-drive meta fields, titles, headings, etc., and small tweaks can easily be made without having to edit loads of HTML pages. If you wanted to, you could even serve up different pages to different spiders, although that would be unethical.
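
As a rough sketch of what database-driven titles look like in a product page template (the tags available and the field names depend entirely on your catalog; 'title' and 'meta_description' here are hypothetical fields):

<!-- flypage sketch: title and meta pulled from the products table -->
<title>[item-field title] - My Widget Store</title>
<meta name="description" content="[item-field meta_description]">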

You need to make sure you have at least Interchange version 4.9 or above (that's when it became spider-aware), and really you need to use Interchange ActionMaps so you don't have to pass CGI parameters.

Actionmaps allow:
[domain.com...]

instead of:
[domain.com...]
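
For the curious, an ActionMap is defined in catalog.cfg. A minimal sketch, assuming a hypothetical 'item' action that maps /item/SKU onto the product flypage (the ActionMap directive is real; the action name and URL layout are made up):

ActionMap  item  <<EOR
sub {
    my $path = shift;        # e.g. "item/WIDGET1"
    $path =~ s{^item/}{};    # strip the action name, leaving the item code
    # A page named after an item code is served as the product flypage,
    # so just point mv_nextpage at it.
    $CGI->{mv_nextpage} = $path;
    return 1;                # 1 = let Interchange deliver the page as usual
}
EOR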

Sticky me if you want to see some URLs.

Ian

opiesilver

5:45 am on Mar 24, 2004 (gmt 0)

Could you post a link explaining how to use this feature in Interchange? I can't find it in the RTFM pages at all.

cwenham

6:18 am on Mar 24, 2004 (gmt 0)

I have an Interchange-based site with a PR of 6; it gets crawled regularly.

The key is to get it to hide the session key and "page count" from the URL when a spider comes around. The spider-aware code introduced in 4.9 does that. However, I don't think it's very easy to back-port it to earlier versions of Interchange without several hours of developer time.

Migrating to a newer version of Interchange is not dead simple, either, but it's mostly a case of installing the new version in another directory, then running it in parallel to your old one while you debug the catalog. More like a couple hours of plodding work, versus reverse-and-re-engineering.

The main documentation on the new RobotUA directive is here:

[icdevgroup.org...]
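
For flavour: RobotUA is a global directive (interchange.cfg) that takes a comma-separated list of wildcarded user-agent patterns to treat as robots. A sketch only; the patterns below are illustrative, not the canonical list in the docs above:

RobotUA  Googlebot*, *Slurp*, msnbot*, Teoma*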

opiesilver

7:57 am on Mar 24, 2004 (gmt 0)

Is there a way to make this a permanent change for all users and not just the bots? I don't particularly like the look of those URLs either. I could do it with Apache's mod_rewrite, but I have heard that it adds considerably to the server load, so I'm not sure I want to go that route.

Ian_Cowley

9:02 am on Mar 24, 2004 (gmt 0)

opiesilver - Interchange is smart and only uses IDs in the URLs if the browser does not accept cookies. If you apply the RobotUA stuff to all users, then Interchange will cease to function correctly, as the session handling will not work.

Having seen your site, the first thing to do, and perhaps the most important, is to get rid of the cgi-bin/cart part of the URL. Do this using mod_rewrite.
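
Something along these lines in the Apache config would do it (a sketch; the /shop prefix and the cgi-bin/cart path are examples - match them to your own setup):

RewriteEngine On
# serve clean /shop/... URLs while Apache quietly hands the
# request to the Interchange link program in cgi-bin
RewriteRule ^/shop/(.*)$ /cgi-bin/cart/$1 [PT,L]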

cwenham

4:28 pm on Mar 24, 2004 (gmt 0)

Ian is right that eliminating the session key from all URLs will affect session tracking, unless you _want_ to make the site usable only for people with cookies turned on.

However, here's the gun to shoot your foot with ;-)

The following lines added to your catalog.cfg will suppress URL session IDs and mv_pc page counts:

# make generated page URLs end in .html so they look static
ScratchDefault mv_add_dot_html 1
# never append the session ID to URLs
ScratchDefault mv_no_session_id 1
# never append the mv_pc page counter to URLs
ScratchDefault mv_no_count 1

But in my experience Interchange will still fall back to using URL session IDs if it can't set a cookie, even with the above parameters set. I think this is a bug in Interchange because, according to the documentation, that's not supposed to happen. I've found that the following hack to lib/Vend/Util.pm does the trick:

Find the function vendUrl, and scroll down until you find this line:

push @parms, "$::VN->{mv_session_id}=$id" if $id;

Comment it out. Then go down one more to find this line:

push @parms, "$::VN->{mv_pc}=$ct" if $ct;

And comment that one out, too. Save the file and restart Interchange, and your store will continue to track sessions if cookies are enabled, but under no circumstances will it put session IDs or page counts in the URL.
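
In other words, after the edit those two lines inside vendUrl are simply disabled (surrounding code elided):

#push @parms, "$::VN->{mv_session_id}=$id" if $id;
#push @parms, "$::VN->{mv_pc}=$ct" if $ct;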

As for using mod_rewrite, I don't recommend it, either. You're better off using mod_interchange, which actually removes overhead rather than adding it. With mod_interchange you can have a URL like:

[example.com...]

Instead of:

[example.com...]

or

[example.com...]
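
The Apache side of mod_interchange is a short stanza. A sketch, with a hypothetical location and socket path (check the mod_interchange README for the exact directives for your version):

<Location /shop>
    SetHandler interchange-handler
    # point at the Interchange daemon's UNIX socket (path made up)
    InterchangeServer /var/run/interchange/mycatalog.socket
</Location>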

Kevin Walsh says he even has an experimental version of mod_interchange that lets you serve straight from the root URL, like:

[example.com...] <-- Interchange page.

Ian_Cowley

4:48 pm on Mar 24, 2004 (gmt 0)

The scratch defaults are behaving properly; it's not a bug. If cookies are not enabled, it will fall back to putting the session ID in the URL.

It's a bad idea to patch the core - the change will be lost on upgrade. If you want to, you can override the sub from the config instead.
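
In plain Perl terms, an override wraps the original rather than editing it. A hypothetical sketch only (where Interchange best lets you hook this in is a question for the docs, and the URL regexes are rough heuristics):

{
    no warnings 'redefine';
    my $orig = \&Vend::Util::vendUrl;
    *Vend::Util::vendUrl = sub {
        my $url = $orig->(@_);
        $url =~ s/[?&;]mv_pc=[^&;]*//g;   # drop the page counter only
        $url =~ s/^([^?]*)&/$1?/;         # repair the separator if one was eaten
        return $url;
    };
}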

Commenting out those push @parms lines will cause havoc - it won't allow anyone with cookies disabled to add products to the basket, because they lose their session. You'd be surprised how many privacy freaks there are out there with cookies disabled!

mod_interchange - yeah, fair enough, although it doesn't work with Apache 2. We have sites doing 30,000-plus uniques a day on a three-year-old server. They all use mod_rewrite to give:

[example.com...]

We haven't seen any problems from doing this.

cwenham

5:14 pm on Mar 24, 2004 (gmt 0)

Yes, you don't want to use that hack on any site that absolutely needs to track sessions for everybody, like virtually all shopping cart sites.

I use the hack on two sites to disable session IDs and page counts for everybody, but neither of them uses the shopping-cart functions of Interchange. Both are content sites serving AdSense, and when the URLs are forced to be unique it confuses Google. The result is that all your pages start delivering charity ads instead of the, ahem, more profitable ones ;-)