
CMS and SEO

How does a content management system affect internal PR?


sooperfly

3:48 pm on Aug 30, 2003 (gmt 0)



I have asked this in other forums and couldn't get much help.

I have a client who is a developer of a CMS, and I am now working with them on optimizing their site and some of their clients' sites; I also potentially wish to use the product myself. The problem I am finding is that there is no PR on any page other than the homepage.

Does anyone have any experience with the effect of CMS technology on SEO? Other than building a site map and working on the internal link structure, which I have done, is there any way to transfer PR to internal pages?

I want to say that I saw PR on their internal pages a few months back; it disappeared right before I started to work with them. I am having a hard time working this out with all the variables: the CMS, the way the pages are named, reduced backlinks in Google, a sometimes "broken" toolbar, and the fact that Google is not showing new content on their site from the past 30 days even though the PR and backlinks dropped again today.

Thanks for any help here.

John_Caius

8:52 pm on Aug 30, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Use allinurl:www.domain.com to check whether Google can spider to the internal pages. Otherwise it may just be that the front page doesn't have enough PR to give internal pages a PR of >1. This is often the case if your internal link structure is just a sitemap to hundreds of 'equivalent' pages.

Mark_A

11:28 am on Sep 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The fact that pages are generated by a CMS (Content Management System) should make no intrinsic difference to the way that search engines see the site; the same rules apply as to any other site.

What are the resulting web pages like? What content, code and linking do they have, what are the URL strings like, how does navigating around the site work: links, anchor text, navigation menus, page code, content location, and so on.

It does not matter what generated the site or how it was generated; what matters is how usable the resulting site is where SEs are concerned.

Of course, if internal linking is weak, inward links are weak, all URLs include query strings or, worse, unique visitor ID codes, links require some POST or GET command (button pushes and the like), there is a lot of code in the page rather than content, or page titles, H1s and alt tags have not been made as straightforward as they might be on a static site ...

Well, then it won't get the same results as it might on a site built in a more standard way.
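A common fix for the query-string problem described above is URL rewriting, so the CMS keeps its dynamic script while spiders see plain static-looking URLs. This is only a sketch; it assumes Apache with mod_rewrite enabled, and the script name and parameters (`index.cgi`, `page`, `cat`) are hypothetical stand-ins for whatever the CMS actually uses:

```apache
# .htaccess sketch -- assumes Apache with mod_rewrite enabled.
# Hypothetical CMS whose real URLs look like /index.cgi?page=widgets&cat=5
RewriteEngine On

# Serve the spider-friendly URL /widgets-5.html by rewriting it
# internally to the real CMS script; no session or visitor ID is
# attached, so every spider sees the same stable URL.
RewriteRule ^([a-z-]+)-([0-9]+)\.html$ /index.cgi?page=$1&cat=$2 [L]
```

Internal links on the site would then point at the `/widgets-5.html` form, never at the raw query-string form.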

It's just not to do with it being a CMS or not a CMS.

That's my 2p anyhow.

chiyo

11:33 am on Sep 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think Mark_A put it well. As long as the pages are created server side before the robot reads them, robots need not know they are reading a CMS-generated site rather than any other template-based site. Some CMSs have complex file names for pages, which need to be made simpler; that is usually the culprit for the problems you are talking about, if the navigation structure and internal links are all OK.

Marcia

11:59 am on Sep 1, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've been looking for a CMS so someone can update existing pages and add pages to their own site, and I've had a hard time finding one that results in search-engine-friendly pages as is.

I did find one little Perl script developed by some programmer in a small obscure town in Northern California that looks like it's the closest to what we need.

It generates pages with an .html file extension; you can use it in whichever directories you want, use different templates based on your own design, and put links where you want them.

It is definitely NOT a full-featured CMS, but I'll be writing to him this week to show him an example of what's needed, explain, and see what it'll take for him to customize it.

I've been looking for weeks, trying to find something simple that's workable that won't mess up the existing page names. The simpler the better.

I've looked at PHP, but I'm inclined toward Perl at this point, so that we can use a tracking program that needs SSI and still stay with .html or .htm pages.
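For what it's worth, keeping SSI working on .html pages without renaming everything to .shtml is usually an Apache configuration matter rather than a scripting one. A minimal sketch, assuming Apache with SSI support compiled in:

```apache
# httpd.conf or .htaccess sketch -- assumes Apache with SSI support.
# Parse .html files for server-side includes (normally only .shtml is).
Options +Includes
AddHandler server-parsed .html
```

Each page can then call the tracker with a line such as `<!--#include virtual="/cgi-bin/tracker.pl" -->`, where the script path is a hypothetical example. Note that parsing every .html page for includes does add some server overhead.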