
Forum Moderators: open


CMS-built pages - how do you optimise them?

How do search engines treat these?

12:34 pm on Jul 8, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



We're using a content management system to build some sites. How can we optimise some of the pages that it produces?

Is it possible, or do we have to create additional static HTML pages?

12:39 pm on Jul 8, 2002 (gmt 0)

WebmasterWorld Administrator brett_tabke is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



That is all specific to the utility. Every one of those programs is different.

The ones I've seen often have something where you can inject code, edit in a stock editor, or manipulate the page entities. It's just a question of the app. I'm sure there is a spot in there where you can control things.

2:34 pm on Jul 8, 2002 (gmt 0)

10+ Year Member



I have optimised several content management systems, and I've had good and bad experiences.

Sometimes you can give the developers a list of identified problems and they will fix them within a week. Unfortunately, I have also had a lot of cases where adjustments weren't possible and I had to inform the client that their site would never be crawled by a SE.

There always seems to be one common problem, which is the "?" in the URLs. This can be solved with an ISAPI filter, which rewrites the URL.

URL before: /view.asp?id=XX
URL after: /XX.asp

The URLs in the HTML also need to be written as /XX.asp instead of /view.asp?id=XX, which means that all templates and navigation need to be adjusted accordingly.

I have experienced that many CMS developers find this solution very complex and beyond their capabilities. However, I've also had the pleasure of working with a developer who wrote the ISAPI filter in C/C++ and adjusted the templates, and it didn't take more than a day's work.

Once the URL structure is SE friendly, you can start optimising the site. However, the room for optimisation is often very limited, because a site is usually built around 5-10 templates. Normally all pages have a document title or header, which can be used in the title tag. All you have to do is add this field in the template: <title>$$TITLE;</title>

Making the URLs SE friendly and ensuring unique titles on all pages may seem like very low-tech SEO, but it is very effective.

I have a few clients who had a hopeless CMS with about 1,000 pages, of which only the front page was indexed. Now all 1,000 pages are indexed with unique and relevant titles. You can imagine what happened to the internal link popularity, and the traffic from the SEs exploded.

3:39 pm on Jul 8, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



That's what I'm after. I just want some unique page TITLES, along with META DESCRIPTIONs for a handful of these template-driven pages.

So that's possible?

3:42 pm on Jul 8, 2002 (gmt 0)

10+ Year Member



Anything is possible, but the key is having a good relationship with a competent developer.
4:37 pm on Jul 8, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Brett, it's an ASP based system which talks to a SQL 2000 database to produce the pages.

There are two issues as I see it. Firstly, I want to be sure I can optimise a handful of these pages through the existing system (an admin panel).

Secondly, I want all the other internal pages to be fully included in search engine listings. I'm not going to optimise these but I just want the text on these pages to be searchable.

Does that explain it better?

Thanks in advance.

11:34 am on Aug 5, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Any more thoughts on this question, guys?
2:54 pm on Sep 27, 2002 (gmt 0)

10+ Year Member



All of my company's clients' web sites are catalog-management, database-driven sites. I used the xqasp product (it changes the link structure to look static - search for xqasp in Google for more info), and we do things like including the department name, sub-department name, and keywords (from the db) in the TITLE, BODY, etc., so each page is unique...

One company this month went from having only about 80 pages indexed in Google to over 8,000... Some of the pages still need optimizing, though; the site is using Site Server Commerce Edition 3.0. For instance, there are a few drop-downs at the top of the page, and the text from the drop-downs is getting into the one-to-two-line results in Google...

All in all though, the goal was to deep-link departments and products and that went exceedingly well.

Scott

3:01 pm on Sep 27, 2002 (gmt 0)

10+ Year Member



I have experienced up to 500% more traffic on different sites so it's definitely worth the trouble.