Deprecated - Search Engine Submission Forum

    
CMS-built pages - how do you optimise them?
How do search engines treat these?
fom2001uk
msg:702694
12:34 pm on Jul 8, 2002 (gmt 0)

We're using a content management system to build some sites. How can we optimise some of the pages that it produces?

Is it possible, or do we have to create additional static HTML pages?

 

Brett_Tabke
msg:702695
12:39 pm on Jul 8, 2002 (gmt 0)

That is all specific to the utility. Every one of those programs is different.

The ones I've seen often have something where you can inject code, edit in a stock editor, or manipulate the page entities. It's just a question of the app. I'm sure there is a spot in there where you can control things.

Torben Lundsgaard
msg:702696
2:34 pm on Jul 8, 2002 (gmt 0)

I have optimised several content management systems, and I've had both good and bad experiences.

Sometimes you can give the developers a list of identified problems and they will fix them within a week. Unfortunately, I have also had a lot of cases where adjustments weren't possible and I had to inform the client that their site would never be crawled by a SE.

There always seems to be one common problem, which is the "?" in the URLs. This can be solved with an ISAPI filter, which rewrites the URL.

URL before: /view.asp?id=XX
URL after: /XX.asp

The URLs in the HTML also need to be written as /XX.asp instead of /view.asp?id=XX, which means that all templates and navigation need to be adjusted accordingly.

In my experience, many CMS developers find this solution very complex and beyond their capabilities. However, I've also had the pleasure of working with a developer who wrote the ISAPI filter in C/C++ and adjusted the templates, and it didn't take more than a day's work.
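
For anyone curious, the core of such a filter is small. Here is a sketch of the idea only, not production code; it assumes the id is numeric and that the only mapping needed is /XX.asp back to /view.asp?id=XX, matching my example above:

#include <windows.h>
#include <httpfilt.h>
#include <stdio.h>
#include <string.h>

/* Tell IIS which notifications this filter wants. */
BOOL WINAPI GetFilterVersion(PHTTP_FILTER_VERSION pVer)
{
    pVer->dwFilterVersion = HTTP_FILTER_REVISION;
    pVer->dwFlags = SF_NOTIFY_ORDER_DEFAULT | SF_NOTIFY_PREPROC_HEADERS;
    strcpy(pVer->lpszFilterDesc, "Static-looking URL rewriter");
    return TRUE;
}

/* Map the static-looking URL back to the real CMS URL before IIS
   processes the request, so browsers and spiders only ever see /XX.asp. */
DWORD WINAPI HttpFilterProc(PHTTP_FILTER_CONTEXT pfc,
                            DWORD notificationType, LPVOID pvNotification)
{
    if (notificationType == SF_NOTIFY_PREPROC_HEADERS)
    {
        PHTTP_FILTER_PREPROC_HEADERS pHeaders =
            (PHTTP_FILTER_PREPROC_HEADERS)pvNotification;
        char url[1024];
        DWORD cbUrl = sizeof(url);
        int id;

        /* "url" is the special pseudo-header IIS lets a filter
           read and rewrite at this notification. */
        if (pHeaders->GetHeader(pfc, "url", url, &cbUrl)
            && sscanf(url, "/%d.asp", &id) == 1)
        {
            char rewritten[64];
            sprintf(rewritten, "/view.asp?id=%d", id);
            pHeaders->SetHeader(pfc, "url", rewritten);
        }
    }
    return SF_STATUS_REQ_NEXT_NOTIFICATION;
}

The sscanf pattern is deliberately naive (it would also match /123.aspx, for instance), so a real filter would validate the URL more carefully, but this is the whole trick: the spider requests /XX.asp, the filter rewrites it, and the CMS never knows the difference.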

Once the URL structure is SE friendly you can start optimising the site. However, the room for optimisation is often very limited because a site is usually built around 5-10 templates. Normally all pages have a document title or header, which can be used in the title tag. All you have to do is add this field in the template: <title>$$TITLE;</title>
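
If the CMS also stores a summary or description for each document, the same trick gives every page a unique meta description too. The $$DESCRIPTION; name below is only an illustration; use whatever field your CMS actually exposes:

<title>$$TITLE;</title>
<meta name="description" content="$$DESCRIPTION;">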

Making the URLs SE friendly and ensuring unique titles on all pages may seem like very low-tech SEO, but it is very effective.

I have a few clients who had a hopeless CMS with about 1000 pages, of which only the front page was indexed. Now all 1000 pages are indexed with unique and relevant titles. You can imagine what happened to the internal link popularity, and the traffic from the SEs exploded.

fom2001uk
msg:702697
3:39 pm on Jul 8, 2002 (gmt 0)

That's what I'm after. I just want some unique page TITLEs, along with META DESCRIPTIONs, for a handful of these template-driven pages.

So that's possible?

Torben Lundsgaard
msg:702698
3:42 pm on Jul 8, 2002 (gmt 0)

Anything is possible, but the key is having a good relationship with a competent developer.

fom2001uk
msg:702699
4:37 pm on Jul 8, 2002 (gmt 0)

Brett, it's an ASP-based system which talks to a SQL Server 2000 database to produce the pages.

There are two issues as I see it. Firstly, I want to be sure I can optimise a handful of these pages through the existing system (an admin panel).

Secondly, I want all the other internal pages to be fully included in search engine listings. I'm not going to optimise these, but I just want the text on these pages to be searchable.

Does that explain it better?

Thanks in advance.

fom2001uk
msg:702700
11:34 am on Aug 5, 2002 (gmt 0)

Any more thoughts on this question, guys?

semick
msg:702701
2:54 pm on Sep 27, 2002 (gmt 0)

All of my company's clients' web sites are catalog-management, database-driven sites. I used the xqasp product (it changes the link structure to look static; search for xqasp in Google for more info), and we do things like including the department name, sub-department name, and keywords (from the db) in the TITLE, BODY, etc., so each page is unique...

The one company this month went from having only about 80 pages indexed in Google to over 8,000... Some of the pages still need to be optimized though; the site is using Site Server Commerce Edition 3.0. For instance, there are a few drop-downs at the top of the page, and the text from the drop-downs is getting into the one-to-two-line results in Google...

All in all though, the goal was to deep-link departments and products and that went exceedingly well.

Scott

Torben Lundsgaard
msg:702702
3:01 pm on Sep 27, 2002 (gmt 0)

I have experienced up to 500% more traffic on different sites, so it's definitely worth the trouble.
