Forum Moderators: buckworks


X-Cart - Static HTML questions

Trying to figure this out


Spine

6:36 pm on Apr 14, 2005 (gmt 0)

10+ Year Member



Hi all, I've been handed a site that was built with X-Cart. Right now the inner product pages have no PageRank, even though the site is close to 10 years old, has decent links, and a PageRank of 5.

The ISP hosting the site is unwilling to try a mod_rewrite solution, so I'm looking at the static HTML option, but I have a couple of questions.

1. The ISP is concerned that building the static version of the site is a big 'job' for the server, and wants to know how often we'd need to run this strenuous task (after adding new products). I'm thinking 2-3 times a month tops, maybe just once.

The site has about 500 products, so roughly how long would it take to generate a static version of the site?

I don't see how it could be that much different from rebuilding a large blog in Movable Type, which I do all the time with my personal ISP.

2. Would this cause a duplicate content issue with Google? I just fixed one on this site (URL-only listings because pages were too similar). Would there be a way to prevent a duplicate content issue, with robots.txt or something?

3. Anything else I need to know about building a static version? :)

I think it's important to get the PageRank circulating around this site. Right now the index and static pages (privacy.php etc.) all have PR 5, while the category and product pages all have PR n/a.

The ISP thinks it's purely a content issue, but up until a month ago, 90% of the pages were URL-only listings in Google, so...

sun818

7:32 pm on Apr 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Regarding duplicate content, you have two issues here. One is duplicate content within the site: a dynamic URL and a static URL serving the same page. The other is duplicate content outside the site. One thought for the first: analyze the current URL, and if it is the dynamic version, generate a robots noindex tag. As for the second issue, if you sell a blue widget that a competitor also sells, there is not much you can do about it. You could put a noindex on those products and promote your site in other ways.
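To illustrate that first idea, here is a rough sketch (in Python, purely hypothetical, not actual X-Cart code): if the requested URL carries a query string, it's the dynamic version, so emit a robots noindex tag; the static copy stays indexable.

```python
# Hypothetical sketch: emit a robots noindex tag only for dynamic URLs
# (those with a query string), so only the static copies get indexed.
from urllib.parse import urlparse

def robots_meta(url):
    # A query string marks the dynamic version of the page.
    if urlparse(url).query:
        return '<meta name="robots" content="noindex,follow">'
    # Static copy: no meta tag, page stays indexable.
    return ""

print(robots_meta("http://example.com/product.php?productid=42"))
print(robots_meta("http://example.com/product-42.html"))
```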

Ozark

8:59 pm on Apr 14, 2005 (gmt 0)

10+ Year Member



1. the ISP is concerned that building the static version of the site is a big 'job' for the server, and wants to know how often we'd need to do this strenuous task (after adding new products). I'm thinking 2-3 times a month tops, maybe just once.

The site has about 500 products, so roughly how long would it take to generate a static version of the site?

I have a site with about 100 products and it takes about 3 minutes to produce. I'm surprised the ISP is concerned; most servers should handle this task without any problems.

2. would this cause a duplicate content issue with Google? I just fixed one with this site (URL only listings because pages were too similar). Would there be a way to prevent a duplicate content issue, with robots.txt or something?

Yes, it could cause a duplicate content issue. I use robots.txt to solve this problem.

lorax

9:04 pm on Apr 14, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Re: server load
It's not a big load. The script is generating HTML files, not doing something processor-intensive like indexing the site. It's simply using PHP to create files, in your case about 500, give or take. Should be painless.
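For a sense of scale, a catalog build boils down to a loop like this (a hypothetical Python sketch, not the actual X-Cart script): fetch the product data, fill a template, write one file per product. The file-writing itself is trivial; any real cost on a live install would be in the database queries and templating.

```python
# Hypothetical sketch of a static-catalog build: one small HTML file
# per product, written from a template.
import os
import tempfile

TEMPLATE = (
    "<html><head><title>{name}</title></head>"
    "<body><h1>{name}</h1><p>{desc}</p></body></html>"
)

def build_static_catalog(products, out_dir):
    """Write one product-<id>.html file per (id, name, desc) tuple."""
    os.makedirs(out_dir, exist_ok=True)
    for pid, name, desc in products:
        path = os.path.join(out_dir, "product-{}.html".format(pid))
        with open(path, "w") as f:
            f.write(TEMPLATE.format(name=name, desc=desc))
    return len(products)

# 500 dummy products, roughly the catalog size discussed in the thread.
products = [(i, "Widget {}".format(i), "A fine widget.") for i in range(500)]
out = tempfile.mkdtemp()
count = build_static_catalog(products, out)
print(count)  # 500
```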

Spine

5:27 am on Apr 15, 2005 (gmt 0)

10+ Year Member



For some reason they are reluctant to try this, I'll keep this all in mind though, thanks.

sun818

5:58 am on Apr 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It's not so much the generation of static HTML files; I'm under the impression that X-Cart is notorious for using a lot of CPU time in day-to-day use of the product.

lorax

12:54 pm on Apr 15, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>> X-Cart is notorious for utilizing lots of CPU time

I'll let you know if I run into any problems on the 3 installs I've got in the works.

Spine

5:38 pm on Apr 16, 2005 (gmt 0)

10+ Year Member



I'd be curious to know about the load stuff.

Damn, the version of X-Cart I'm having to use is 3.x, and it doesn't seem very search engine friendly.

If I want to change the title of a category or product, I can't just change the contents of the <title> tag; I have to change the header and the incoming anchor text as well, which seems a bit heavy-handed and limits how many keywords I can get into the <title>.

I'd like to have similar but different anchor text, title and headers.

Spine

6:19 pm on May 1, 2005 (gmt 0)

10+ Year Member



What's the best way with robots.txt to prevent a duplicate content issue if I generate the static version of a site?

I'm also a bit concerned about how Google would handle this in general.

The site in question was full of URL only listings, and I fixed that for the owner, but there is still NO PR showing on any of these internal pages, and they rank poorly in the SERPs.

Will making a static version of the site, in essence changing its structure, be likely to throw the whole site/domain into the Google sandbox?

Ozark

7:00 pm on May 1, 2005 (gmt 0)

10+ Year Member



User-agent: Googlebot
Disallow: /*?

(see www.google.com/intl/en/webmasters/faq.html#12)

You can also disallow the directory your dynamic cart is located in:

User-agent: *
Disallow: /directoryofyourcart/

If your cart is in the root:

User-agent: *
Disallow: /cart.php
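If you want to sanity-check rules like these before deploying, Python's standard urllib.robotparser can simulate a crawler against the plain-prefix rules. (Note: it follows the original robots.txt standard and does not understand Google's "/*?" wildcard extension, so that line can't be verified this way; only the prefix-style Disallow rules are checked below.)

```python
# Check that a plain-prefix Disallow rule blocks the dynamic cart URL
# while leaving a static page crawlable. example.com is a placeholder.
import urllib.robotparser

rules = [
    "User-agent: *",
    "Disallow: /cart.php",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

blocked = rp.can_fetch("Googlebot", "http://www.example.com/cart.php")
allowed = rp.can_fetch("Googlebot", "http://www.example.com/product-42.html")
print(blocked, allowed)  # False True
```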


Spine

4:03 am on May 4, 2005 (gmt 0)

10+ Year Member



Cool, thanks. It's fairly obvious, but for some reason I felt I should ask around before I commit to this.

Spine

5:23 pm on May 19, 2005 (gmt 0)

10+ Year Member



Any updates?

Has anyone tried building a static html catalog lately who can comment on the server load during the building process?