| 6:41 pm on Nov 24, 2004 (gmt 0)|
I use a CMS or (at the very least) PHP includes on every site I build. I universally ensure that the site is navigable without session IDs or cookies. All URLs are rewritten to search-engine- and user-friendly, non-language-specific versions (e.g. .htm, not .php).
However, the biggest problem is that those seemingly static pages still behave dynamically when it comes to caching: and that is the biggest remaining hurdle. It's open knowledge that supporting "If-Modified-Since" is a good thing [webmasterworld.com], as it permits more rapid indexing, but because most CMSs don't attempt to cache or generate static pages, each request for a page makes a call to the database and a new page is generated, even if the content hasn't changed in days or even months.
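For the curious, a minimal sketch of what honouring If-Modified-Since can look like in PHP, assuming the page's modification timestamp can be fetched from the CMS database (the function names here are hypothetical, not from any particular CMS):

```php
<?php
// Hypothetical helper: decide whether a 304 can be sent, given the
// page's last-modified Unix timestamp and the raw If-Modified-Since
// request header (or null if the client didn't send one).
function can_send_304($last_modified, $if_modified_since)
{
    if ($if_modified_since === null) {
        return false;
    }
    $client_time = strtotime($if_modified_since);
    return $client_time !== false && $last_modified <= $client_time;
}

// Typical usage in a page script ($page_mtime would come from the CMS db):
function serve_page($page_mtime, $content)
{
    $ims = isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
         ? $_SERVER['HTTP_IF_MODIFIED_SINCE'] : null;
    header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $page_mtime) . ' GMT');
    if (can_send_304($page_mtime, $ims)) {
        header('HTTP/1.1 304 Not Modified');
        exit;
    }
    echo $content;
}
?>
```

With that in place a spider re-requesting an unchanged page gets a cheap 304 instead of a freshly regenerated copy.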
On sites where I control the content, I have started using the cheap and easy trick of running the CMS, mod_rewrite and everything on the development server, then simply using wget to rip the entire site locally and generate completely static pages for upload to the live server. As Apache automatically sends correct caching headers for static files, it's a good compromise.
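That rip-and-upload step can be sketched like this (the dev hostname and paths are made up; substitute your own):

```shell
#!/bin/sh
# Hypothetical dev-server URL - substitute your own.
DEV_URL="${DEV_URL:-http://dev.example.com/}"

# Mirror the dynamic dev site into ./static-copy as plain files.
#   --mirror        recursive retrieval with timestamping
#   --convert-links rewrite links so the copy works standalone
#   --no-parent     stay inside the site root
wget --mirror --convert-links --no-parent \
     --directory-prefix=./static-copy "$DEV_URL" \
  || echo "mirror failed (is $DEV_URL reachable?)"

# ./static-copy can then be uploaded to the live server, where Apache
# serves the pages as ordinary static files with correct Last-Modified
# headers.
```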
On clients' sites where they control the content via the CMS, however, that is not an option. I can add some cache-control headers, but unless I can reliably calculate the Last-Modified date, it won't help indexing much.
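For the cache-control half, this is roughly the sort of .htaccess fragment that can be added even without a reliable Last-Modified date (it assumes mod_expires and mod_headers are loaded, and the one-day lifetime is an illustrative number, not a recommendation):

```apache
<IfModule mod_expires.c>
    ExpiresActive On
    # Treat the rewritten .htm pages as cacheable for a day (illustrative)
    ExpiresByType text/html "access plus 1 day"
</IfModule>
<IfModule mod_headers.c>
    # Allow shared caches (and spiders that honour it) to reuse the page
    Header set Cache-Control "public, max-age=86400"
</IfModule>
```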
| 7:05 pm on Nov 24, 2004 (gmt 0)|
We are getting closer to separating content management from web page display, as the CMS systems move towards XML and CSS standards.
In the past, I agree, most CMSs were poor choices if you intended to spend time optimizing. Too much code, too many assumptions. They seem to treat meta tags as site-wide tags (instead of per-page tags), serve the same content on multiple URLs depending on the path to the page, and modifying their code is too complex for my liking.
Even with the latest versions today I think the popular CMSs are still a poor choice, but they are good at content management. I use them for multi-author content management with review, plus private display. I then pull the content into another, SEO'd site used for public display. That is usually a static site, generated on demand from the content.
| 7:09 pm on Nov 24, 2004 (gmt 0)|
|Make sure the CMS outputs clean markup to help search engines index your site better |
YES! You should really be able to edit every single bit of HTML that your CMS is putting out.
| 7:33 pm on Nov 24, 2004 (gmt 0)|
At my site I began using PHP files, each containing an identifier used to pull content out of a MySQL table. The system worked well for adding content and produced simple URLs, e.g. www.someweb.com/index.php. But I scrapped the idea because, ultimately, it says to search engines (imo) that the page is dynamic and could change, making the page less reliable in the eyes of a search engine.
To make my site more search-engine friendly I created a script that takes a page's content as input and outputs a static HTML file. This new process seems to be working better, as I see more of my pages being indexed.
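A minimal sketch of that kind of generator, assuming content arrives as a title and a body (the template, function names and file layout here are invented for illustration):

```php
<?php
// Render a simple static page from a title and a body of HTML.
function render_static_page($title, $body)
{
    $t = htmlspecialchars($title);
    return "<html>\n<head><title>$t</title></head>\n"
         . "<body>\n<h1>$t</h1>\n$body\n</body>\n</html>\n";
}

// Write the rendered page out as a flat .html file the web server
// can serve (and cache) like any other static document.
function write_static_page($dir, $slug, $title, $body)
{
    $path = rtrim($dir, '/') . '/' . $slug . '.html';
    file_put_contents($path, render_static_page($title, $body));
    return $path;
}
?>
```

Run once per page whenever the content changes, it gives you a directory of plain files to upload.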
| 7:57 pm on Nov 24, 2004 (gmt 0)|
In addition to the great resource site posted previously, there is one that allows you to 'test-drive' many of the more popular CMS default installations.
The site even lets you mess around with the backend/admin functions so you can see all its capabilities.
Google for 'open source cms' and you should find it.
| 8:16 pm on Nov 24, 2004 (gmt 0)|
It depends on how much you need from your CMS. For many sites our semi-basic custom CMS is perfect and makes SEO simple, even for clients.
It might not be the most easily changeable in terms of SEO fluctuations but it allows for our clients or our clients' webmasters to manage their sites effectively when major additions or revisions are not necessary.
I assume things get complicated when you get into session IDs and the like - more demand for functionality creates the need for more difficult SEO measures. But for more basic content-driven sites, what you want is a CMS that allows copywriters, for instance, to upload work directly from Word, and that dynamically plugs in the optimization techniques necessary for a competitive on-page layout.
Ever seen the ad-serving technology that scans your pages for words and turns them into PPC ads you can earn revenue from? Something like that can be used to create awesome internal linking, so that your CMS-driven site doesn't seem too templated and non-unique.
You just have to make mods or develop your own.
| 9:46 pm on Nov 24, 2004 (gmt 0)|
|ultimately it says to search engines, imo, that the page is dynamic and could change |
If you configure Apache and PHP correctly, there is no way any search engine could tell whether your pages are being pulled from a database or are static. If you can't do this, it means you haven't spent enough time RTFM. A .php extension does not necessarily mean the page was made using PHP, nor does an .html extension mean the page is static.
In my experience, every header that Apache or PHP sends out can be configured and modified.
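Two common ways to do that with Apache, sketched here under the assumption that mod_rewrite is available (filenames are hypothetical): either tell Apache to run .htm files through PHP, or keep real .php scripts and map the clean URL to them internally.

```apache
# Option 1: parse .htm files with PHP (the handler name varies by
# PHP version and server setup - this one is typical of the era).
AddType application/x-httpd-php .htm

# Option 2: keep real .php scripts, but serve them under .htm URLs.
<IfModule mod_rewrite.c>
    RewriteEngine On
    # Internally map /page.htm to /page.php; the browser (and the
    # spider) only ever sees the .htm URL.
    RewriteRule ^(.+)\.htm$ /$1.php [L]
</IfModule>
```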
| 1:56 am on Nov 25, 2004 (gmt 0)|
|It's entirely possible to use some sort of SSI-based system (Apache includes or slightly more complex house-style generators, which are not really CMSs in the true sense) to maintain sites using 'real' HTML files. I maintain at least 10,000 flat-file, extremely SEO-friendly pages in under an hour a day using little more than that. |
I assume that content writing and SEO are separate, and also that people need to edit and add content more frequently. We generally have a team of expert writers who know nothing outside their field of expertise; we teach them Dreamweaver (that's it). Our SEO person can then relax and work on links etc., while our content writers create pages which are already optimized.
Thus, IMHO, a CMS is almost a must for almost all sites.
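For anyone unfamiliar with the SSI approach quoted above, a small sketch (the filenames are invented; SSI must be enabled server-side, e.g. with Options +Includes and AddOutputFilter INCLUDES .html in .htaccess):

```html
<!-- page.html: a 'real' flat HTML file; only the shared chrome is
     pulled in at serve time via Apache server-side includes -->
<html>
<head><title>Widgets - Example Co</title></head>
<body>
<!--#include virtual="/inc/header.html" -->
<h1>Widgets</h1>
<p>Hand-written, fully static, SEO-friendly content goes here.</p>
<!--#include virtual="/inc/footer.html" -->
</body>
</html>
```

Change header.html once and every page picks it up, which is how thousands of flat files stay maintainable.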
| 12:54 pm on Nov 25, 2004 (gmt 0)|
Actually, a nice "CMS" would be something like the Blogger service, which makes pages with the keywords stuffed in, and they are spiderable ;)
A CMS is great if you have more than one person working on the website... It is also faster than manually creating each new page (copy/paste, add a link to it, then write).
Anyway, I still use the old-fashioned way... copy/paste and write in Dreamweaver.
| 4:35 pm on Nov 26, 2004 (gmt 0)|
Hi there folks. I am new here, although I have been lurking for some years :) Just want to tell you about <snip> There are no querystring params at all, and all URIs look like they are part of the filesystem <snip>
[edited by: Brett_Tabke at 8:45 pm (utc) on Nov. 26, 2004]
[edit reason] please - no urls [/edit]