
Webmaster General Forum

Designing a Low Maintenance Website
Feel free to add your own tips
graywolf
msg:373299
6:49 pm on Jan 14, 2005 (gmt 0)

Making the transition from working for someone else to working for myself, my methods for building sites have changed as well. I now try to keep things as low-maintenance as possible. Here are my tips; feel free to add your own.

Site Architecture

  • Don't display the technology running the website
    Bad: example.com/blue-widgets.html, example.com/blue/widgets/index.php, example.com/product.php?prod=blue widgets
    Good: example.com/blue-widgets/ , example.com/blue/widgets/ , example.com/product/blue-widgets
    Why: If you want to change from HTML, ASP, PHP, JSP or some other technology to something different, you'll lose all of those indexed URLs. Sure, you can play around with .htaccess later on if you want, but if you do it right the first time you won't have to. (See the .htaccess sketch after this list.)

  • Develop a set of base site pages
    Most of your sites have a lot of the same pages, right? Home, contact us, about us, privacy, terms of use, site map, etc. Develop a list of all of these pages, put them in the default folder, and just eliminate the ones you don't need on a project-by-project basis.

  • Error pages
    All of your websites have custom 401, 403, 404, and 500 error pages, right? If you took the time to make them once and put them in your default site folder, they would. It's also a good idea to reference them in the base .htaccess file (see the sketch after this list). See the advanced error page section below.

  • Favicon file
    You use a favorite icon file, don't you? When I come to a website that has one, it shows me someone cares about the site. OK, you will have to make a new one for each website, but heck, it only takes 15 minutes, right?

  • Robots.txt
    Get it set up the way you want, blocking whatever rogue spiders you feel are necessary. This way you know it's there and it only has to be tweaked to each site's particulars later on.
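
A minimal .htaccess sketch of the two server-side items above (hiding the extension and wiring up the custom error pages), assuming Apache with mod_rewrite enabled; product.php and the /errors/ paths are just placeholder names:

    # Hide the technology: serve extensionless URLs from a PHP script
    # without exposing .php (assumes mod_rewrite is available).
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^([a-z0-9-]+)/?$ /product.php?prod=$1 [L,QSA]

    # Custom error pages pulled from the default site folder.
    ErrorDocument 401 /errors/401.html
    ErrorDocument 403 /errors/403.html
    ErrorDocument 404 /errors/404.html
    ErrorDocument 500 /errors/500.html

If you later switch from PHP to something else, only the RewriteRule target changes; the public URLs stay the same.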

Content Management

  • Separate content from presentation
    You're using CSS and keeping all of your layout information separate from the actual content, aren't you? Say you don't have the time to learn; well, you should. Once you get it, and I mean really get it, and see the power of CSS, you'll never go back. One two-hour plane ride with "Designing with Web Standards" and I was a convert.

  • Separate content from programming
    Let's look at the contact form. It has some blah-blah copy and then some form elements. Separate the copy into one file. Put the programming that builds, error-checks and sends the form in a separate include file. This way, when you build your next project, you only have to worry about changing the blah-blah copy.

  • Code Re-Use
    Have an include file with all of your common functions and site variables in it. Build a library of all of your functions so they are all there at your fingertips.

  • Common include files
    Chances are your top masthead, bottom footer, and side navigation only come in one or two versions. Make them include files; this way, when something changes, you only have to change one file, not four hundred. Be really crafty and define your site name and site URL as variables in your code re-use file, and then use those variables in your common includes. (See the sketch after this list.)

  • Controlling spam
    Never display an email address on your website; only use a contact form. OK, that's not always practical, so if you have to display an email address, obfuscate it or, even better, use an image.
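
A rough sketch of the variables-plus-includes idea in PHP; config.php and header.php are made-up file names:

    <?php
    // config.php -- site-wide variables, defined once per site.
    $site_name = 'Example Widgets';
    $site_url  = 'http://www.example.com';
    ?>

    <?php // header.php -- the common masthead, included on every page. ?>
    <p><a href="<?php echo $site_url; ?>/"><?php echo $site_name; ?></a></p>

    <?php
    // Any page on the site just pulls in the shared pieces.
    require 'config.php';
    require 'header.php';
    ?>

Change the masthead once and every page picks it up; change the site name or URL and only config.php needs editing.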

Intermediate Ideas

  • Link Trades
    If you are the link-trading type, use an automated submission and checking form. Get one that checks the links are up before accepting the submission. Make sure the links have to be approved before they are visible. Make sure it can periodically check that the reciprocal link is still up, and email and remove deadbeats who take them down.

  • Monitor uptime
    Set up a test page that has a brief static text message. Subscribe to or buy an uptime-checking service; you can get programs to run from your home development server for as little as $30. Set up an email address, emergency@example.com, that forwards to the main email you check regularly. Have the monitoring program alert you if the test page is down. When you go on vacation, redirect emergency@example.com to your alternate email, cell phone or other device. Have all of your websites use the one address so you only have to change it in one place.

  • Error pages part II
    Have your error pages record the error in a text file or database. For some errors you may even want to add a form so the user can give you more information; record the IP, session variables, or other info in hidden fields so you can debug them later on. (See the logging sketch after this list.)

  • Spy vs. Spy
    Check for people who are trying to reverse-engineer your work. If you see someone with an "allin" query in the referrer, record their IP and the pages they visit. Store the info in a text file or database and have it mailed to you on a daily basis (see the sketch after this list).

  • Daily reporting
    How about setting up a job or script that emails you all of the link requests, errors, and other pertinent info on a daily basis?
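
A rough PHP sketch of the two logging ideas above (error pages part II and Spy vs. Spy); the log file paths are made up, and the exact fields you record are up to you:

    <?php
    // log404.php -- include this at the top of the custom 404 page.
    $line = implode("\t", array(
        date('Y-m-d H:i:s'),
        $_SERVER['REMOTE_ADDR'],                          // who
        $_SERVER['REQUEST_URI'],                          // what they asked for
        isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '-',
    ));
    file_put_contents('/path/to/logs/404.log', $line . "\n", FILE_APPEND);

    // Spy vs. Spy: also flag visitors arriving from "allin" search queries.
    if (isset($_SERVER['HTTP_REFERER']) && strpos($_SERVER['HTTP_REFERER'], 'allin') !== false) {
        file_put_contents('/path/to/logs/allin.log', $line . "\n", FILE_APPEND);
    }
    ?>

A daily cron job can then mail you the contents of those log files.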

    Have any other tips? Feel free to add them.


    MultiMan
    msg:373329
    11:01 am on Jan 17, 2005 (gmt 0)

    >> Can anyone else share their experience with "http://example.com/blah/page.html" vs "http://example.com/blah/page"? I am about to make this call and I am trying to decide which rewrite would be better. Doesn't .html just feel more soothing?

    I learned that lesson a few years ago. Now, just about every new file I upload to a site is turned into a directory index file, so the URL is always
    www.site.com/fileasdirectory/
    (or deeper subdirectory URLs like that too).
    You learn this lesson when you are forced to convert 100s (and 1000s) of .html pages into an upgraded technology like .asp or .php in order to capitalize on server-side features (such as a common include-file system of web development).

    I once had to make such a conversion to .asp and it was an extremely huge and tedious job. Years later, that site may soon be converted to .php files instead (as the site is moved from a Windows to a Unix server). But since all URLs for the site have been made simple directory index files (with directory-only URLs), I only have to change from index.asp files to index.php files (in each directory) while the URLs remain constant. That means the pages' rankings in SEs will be unaffected, which is an important matter, of course.

    So, unless one is using multi-variable query-string URLs (for specific reasons), it is easiest to always design sites by making all webpages as individual directory index files. Indeed, "hide the technology" is an excellent recommendation for this -- much more soothing.

    raptorix
    msg:373330
    11:12 am on Jan 17, 2005 (gmt 0)

    Well, I don't agree; it's only 20 seconds' work to configure your web server to parse a certain extension, i.e. it's very easy to parse .asp as PHP and vice versa (a sketch follows below). I don't think the theory about not showing your "technology" is correct. The only sensible reason why you should make a separate directory for each page is for SEO.
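
    For what it's worth, a sketch of the kind of one-line server change described here, assuming Apache with mod_php; the exact handler/type name varies between PHP installations, so treat it as illustrative only:

        # Run .asp files through the PHP engine alongside .php files.
        # (Some setups use AddType application/x-httpd-php instead.)
        AddHandler application/x-httpd-php .php .asp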

    [edited by: trillianjedi at 1:13 pm (utc) on Jan. 20, 2005]
    [edit reason] Terminology softening ;-) [/edit]

    ncw164x
    msg:373331
    11:14 am on Jan 17, 2005 (gmt 0)

    >> separate directory for each page is for SEO

    Absolutely correct, and no other reason.

    drdsl2000
    msg:373332
    3:15 pm on Jan 17, 2005 (gmt 0)

    "Link Trades
    If you are the link-trading type, use an automated submission and checking form. Get one that checks the links are up before accepting the submission. Make sure the links have to be approved before they are visible. Make sure it can periodically check that the reciprocal link is still up, and email and remove deadbeats who take them down."

    Can anybody name some of these programs? I know I used them on other sites.

    Much Appreciated
    Craigster

    meta4ic
    msg:373333
    4:19 pm on Jan 17, 2005 (gmt 0)

    For link trade scripts, I've had reasonably good luck with Reciprocal Manager. It's not freeware, but it's cheap, and the license lets you use it on an unlimited number of sites (as long as they're yours, if I recall correctly).

    It's in PHP, and does recip checking (along with a degree of cheat detection) both on submission, and periodically. You can configure it to send warning e-mails, deletes, etc., if they drop the link back to you. It'll pend new submissions for you to approve, and auto-generates the link pages. The templating system is a little old-skool, but not terribly so.

    - Chuck

    elgumbo
    msg:373334
    1:45 pm on Jan 18, 2005 (gmt 0)

    Do not delete your data.

    This is especially important if a non-techie is responsible for site updates. I use an INT field called SHOW on every record and add the following WHERE clause to my SQL: "SHOW = '1'".

    Default the SHOW field to 1, and when the user "deletes" a record I just set SHOW to 0. The record is removed from the site and the user can "undelete" any mistakes at a later date (a quick SQL sketch follows below).

    I can't tell you how much time this has saved me when somebody has deleted a record in error.
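
    A quick sketch of that pattern in SQL; the table name and id are invented, and note that SHOW is a reserved word in MySQL, so it needs backtick-quoting (or a different column name such as IS_VISIBLE):

        -- "Delete" a record without losing it: just hide it.
        UPDATE articles SET `SHOW` = 0 WHERE id = 42;

        -- Every public-facing query filters on the flag.
        SELECT * FROM articles WHERE `SHOW` = 1;

        -- Undelete is a single update away.
        UPDATE articles SET `SHOW` = 1 WHERE id = 42;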

    ronburk
    msg:373335
    7:57 pm on Jan 18, 2005 (gmt 0)


    >> This is a static file reference. Change it to this:
    >> <img src="/images/test.gif">
    >> Doing so means the exact same thing code-wise


    Change that to "means roughly the same thing" and you'll be right.

    If you have a <base> element that sets the base for relative URLs to your own domain, then the relative URL will be equivalent. Otherwise, some crawlers (including some past versions of Googlebot) that arrive at your page via a redirect will assume that the base for resolving relative URLs is the domain the redirect came from.

    Without a <base> element, the relative URL does not mean exactly the same thing code-wise to all clients.
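
    For reference, the element in question is a one-liner in the document head (example.com stands in for your own site):

        <head>
          <base href="http://www.example.com/">
          <!-- Relative URLs such as images/test.gif now resolve against
               example.com, regardless of how the visitor or crawler
               arrived at the page. -->
        </head>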

    encyclo
    msg:373336
    1:31 am on Jan 20, 2005 (gmt 0)

    One aspect I would add to the list is portability. You need your entire website to be "transportable", in other words, easy to switch from one server to another, or from one hosting company to another.

    Unless your website is very complex, try to use standard, widely available technologies: use PHP, Perl or ASP, not Objective-CAML with a home-grown .NET plugin. Apache and IIS are standard too, and try not to require any unusual modules (mod_rewrite is very common, mod_speling far less so). This is particularly important when you are using shared hosting (as most small and medium websites do), but even with a dedicated box, setting up a new server with obscure software is time-consuming and problematic. If you switch hosting providers regularly, outsourcing your DNS is a good move: you can buy a new hosting plan, upload the site, test it and switch the DNS in less than an hour, rather than it taking days.

    On the markup side, try to keep things simple too: avoid stuff like browser-sniffing where you are constantly updating your scripts to account for new browser or bot versions. Use web standards (but not too cutting-edge) so you don't have to tweak your code for each version increment of every browser, and keep the basic templates validated, whilst accepting that some areas of a site (including user input) might fail validation occasionally.

    raptorix
    msg:373337
    12:56 pm on Jan 20, 2005 (gmt 0)

    As a professional web developer I think I can add some other tips.

    Security
    - Never store passwords in plain text in your database; always hash or encrypt them! (A rough sketch of this and the next point follows after this list.)
    - Always check user input, especially when you don't use stored procedures to execute SQL!
    - Make sure the database is not running under root/administrator privileges!
    - Secure "mail a friend" functionality; two years ago we found that someone had used the send-a-friend feature to send a spam run.
    - NEVER send detailed error messages back to the browser; this is key information for a potential hacker or worm. Instead, log them to a file.
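
    A rough PHP sketch of the first two points, assuming the mysqli extension with prepared statements; the table, column and connection details are invented, and password_hash() is the modern helper (older installs used a salted hash by hand):

        <?php
        // Never store plain-text passwords: keep only a hash.
        $hash = password_hash($_POST['password'], PASSWORD_DEFAULT);

        // Treat user input as untrusted: bind it, don't splice it into SQL.
        $db    = new mysqli('localhost', 'app_user', 'secret', 'example_db');
        $email = $_POST['email'];
        $stmt  = $db->prepare('INSERT INTO users (email, pass_hash) VALUES (?, ?)');
        $stmt->bind_param('ss', $email, $hash);
        $stmt->execute();

        // And keep detailed errors out of the browser: log them to a file instead.
        ini_set('display_errors', '0');
        ini_set('log_errors', '1');
        ini_set('error_log', '/path/to/logs/php_errors.log');
        ?>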

    Performance
    - Put often-used JavaScript in include files.
    - Check the execution time of your SQL statements; most commercial databases have analysis tools that can give you tips.
    - Think about caching often-requested data, e.g. the data behind dropdown boxes does not change very often, so it can be wise to keep it in memory instead of querying the database every time.
    - Use HTTP caching; most web servers support it and it will boost your performance! (See the sketch after this list.)
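
    As a small illustration of the HTTP caching point in PHP: send Cache-Control/Expires headers before any output so browsers and proxies can reuse the response (the one-hour lifetime is an arbitrary example):

        <?php
        // Let clients and proxies cache this response for an hour.
        $lifetime = 3600; // seconds
        header('Cache-Control: public, max-age=' . $lifetime);
        header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $lifetime) . ' GMT');
        header('Last-Modified: ' . gmdate('D, d M Y H:i:s', filemtime(__FILE__)) . ' GMT');
        ?>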

    I will give some tips on other subjects later today.

    CanadianChris
    msg:373338
    3:42 pm on Jan 20, 2005 (gmt 0)

    Scheduled Backups

    Backup, backup, backup!

    At a minimum, I back up all my sites every day. For the busier sites I have a cron script that backs up the database every 6 hours and automatically saves the data to a storage server in a different location (a sketch of that kind of job follows below).

    You would be surprised at how easily you can screw up all the data in your site by missing one or two words in a DB query.

    The same goes for pages and files. Never, never, never delete anything! That image you just deleted, which you paid $100 for, was your only copy, and it's gone now. If you want to conserve space or keep your directories clean, back everything up off your server before deleting anything.
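
    A sketch of that kind of cron job, assuming a Unix host with mysqldump available; the database name, credentials and paths are placeholders:

        # Every 6 hours: dump the database to a datestamped, compressed file.
        # (% must be escaped as \% inside crontab entries.)
        0 */6 * * * mysqldump -u backup_user -pSECRET example_db | gzip > /backups/example_db_$(date +\%Y\%m\%d\%H).sql.gz

    A second job (or the same script) then copies the file to the off-site storage server, e.g. with rsync or scp over SSH keys.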

    raptorix
    msg:373339
    4:06 pm on Jan 20, 2005 (gmt 0)

    True, Chris; that's why it's smart to put ALL your content in the database. Then you can make incremental backups, so you waste a minimum of space and keep a maximum of backups.

    Kysmiley
    msg:373340
    9:37 pm on Jan 23, 2005 (gmt 0)

    henry-0 posted this
    ini_set('include_path', realpath($_SERVER['DOCUMENT_ROOT'] . '/../includes/') . PATH_SEPARATOR . '.' . PATH_SEPARATOR . ini_get('include_path'));
    Some pretty neat stuff going on in here.

    First, notice the use of the realpath() function to put together the canonicalized absolute pathname from a relative link (/../includes/), which, by the way, resides outside the public document root ;) -- it tidies things up quite nicely.

    I am assuming that if I am on a shared server I will not have access to the config file, or do I just create another one? I'm getting ready to upgrade my site and I find this thread very insightful. I want to do it the right/short way from the start. It would make it so much easier if I could set my server to auto-route include files from a relative path. It is a constant battle for me to find the correct path to use: one day ../dir/file.php will work, and the next day what appears to be the same kind of path, just with different names, won't, so setting it up like this would help me out very much if it can be done.
    Pat
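
    For what it's worth, that ini_set() line does not need php.ini access: it can live in a small bootstrap file in the web root that every page pulls in with one fixed require, after which plain relative requires resolve via the include path. A sketch, with made-up file names:

        <?php
        // bootstrap.php -- lives in the public web root.
        // Adds the shared, non-public includes directory to the include path.
        ini_set('include_path',
            realpath($_SERVER['DOCUMENT_ROOT'] . '/../includes/')
            . PATH_SEPARATOR . ini_get('include_path'));
        ?>

        <?php
        // any-page.php -- one absolute require to the bootstrap,
        // then ordinary requires are found via the include path.
        require $_SERVER['DOCUMENT_ROOT'] . '/bootstrap.php';
        require 'functions.php';
        ?>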

