Have any other tips? Feel free to add them.
Can anyone else share their experience with "http://example.com/blah/page.html" vs "http://example.com/blah/page"? I am about to make this call, and I am trying to decide which rewrite would be better. Doesn't .html just feel more soothing?
I learned that lesson a few years ago. Now, just about every new file I upload to a site is turned into a directory index file, so the URL is always a directory-only URL, with no filename or extension (or a deeper subdirectory URL like that too).
You learn this lesson when you are forced to convert 100s (and 1000s) of .html pages to an upgraded technology like .asp or .php in order to capitalize on server-side technology (such as a common include-file system of web development).
I once had to make such a conversion to .asp, and it was an extremely huge and tedious job. Years later, that site may soon be converted to .php files instead (as the site is moved from a Windows to a Unix server). But since all URLs for the site were made as simple directory index files (with directory-only URLs), I only have to change the index.asp files to index.php files (in each directory) while the URLs remain constant. That means the pages' rankings in SEs will be unaffected, which is an important matter, of course.
So, unless one is using multi-variable query-string URLs (for specific reasons), it is easiest to always design sites by making all webpages as individual directory index files. Indeed, "hide the technology" is an excellent recommendation for this -- much more soothing.
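For Apache users, the directory-index scheme described above can be set up with a sketch like this (the fallback order shown is an assumption):

```apache
# .htaccess or httpd.conf -- uses Apache's standard mod_dir.
# With every page stored as a directory index file,
# http://example.com/widgets/ serves whichever index file exists,
# and the URL never exposes the technology (.html, .asp, .php).
DirectoryIndex index.php index.asp index.html
```

Switching the site from .asp to .php then means swapping the index files inside each directory while every public URL stays the same.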
[edited by: trillianjedi at 1:13 pm (utc) on Jan. 20, 2005]
[edit reason] Terminology softening ;-) [/edit]
Can anybody name some of these programs? I know I used them on other sites.
It's in PHP, and does recip checking (along with a degree of cheat detection) both on submission, and periodically. You can configure it to send warning e-mails, deletes, etc., if they drop the link back to you. It'll pend new submissions for you to approve, and auto-generates the link pages. The templating system is a little old-skool, but not terribly so.
This is especially important if a non-techie is responsible for site updates. I use an INT field called SHOW on every record and add the following WHERE clause to my SQL: SHOW = 1.
Default the SHOW field to 1, and when the user "deletes" a record I just set SHOW to 0. The record is removed from the site, and the user can "undelete" any mistakes at a later date.
I can't tell you how much time this has saved me when somebody has deleted a record in error.
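A minimal sketch of this soft-delete pattern (the table and column names are illustrative, and the flag is called show_flag here because SHOW is a reserved word in some databases; SQLite stands in for whatever database you use):

```php
<?php
// Soft-delete sketch: "deleting" a record just flips a flag to 0,
// so nothing is ever really destroyed and mistakes are reversible.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE articles (id INTEGER PRIMARY KEY, title TEXT, show_flag INTEGER DEFAULT 1)');
$db->exec("INSERT INTO articles (title) VALUES ('First post')");
$db->exec("INSERT INTO articles (title) VALUES ('Old post')");

// "Delete" record 2 -- it disappears from the site but stays in the table.
$db->prepare('UPDATE articles SET show_flag = 0 WHERE id = ?')->execute([2]);

// Every public-facing query filters on the flag.
$live = $db->query('SELECT title FROM articles WHERE show_flag = 1')
           ->fetchAll(PDO::FETCH_COLUMN);

// "Undelete" is just the reverse update.
$db->prepare('UPDATE articles SET show_flag = 1 WHERE id = ?')->execute([2]);
?>
```

The one discipline this requires is that every query shown to visitors includes the show_flag = 1 condition.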
This is a static file reference. Change it to this: <img src="/images/test.gif">
Doing so means the exact same thing code-wise.
If you have a <base> element that sets the base for relative URLs to your own domain, then the relative URL will be equivalent. Otherwise, some crawlers (including some past versions of Googlebot) that arrived at your page via a redirect will assume that the base for resolving relative URLs is the domain the redirect came from.
Without a <base> element, the relative URL does not mean exactly the same thing code-wise to all clients.
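A sketch of the fix the reply describes (the domain is illustrative):

```html
<head>
  <!-- Pins the base for relative URLs to your own domain, so even a
       client that arrived via a cross-domain redirect resolves
       /images/test.gif against example.com rather than against the
       domain the redirect came from. -->
  <base href="http://example.com/">
</head>
```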
Unless your website is very complex, try to use standard, widely-available technologies: use PHP, Perl or ASP, not Objective-CAML with a home-grown .NET plugin. Apache and IIS are standard too, and try not to require any unusual modules (mod_rewrite is very common, mod_speling is far less so). This is particularly important when you are using shared hosting (as most small and medium websites do), but even with a dedicated box, setting up a new server with obscure software is time-consuming and problematic. If you regularly switch hosting providers, outsourcing your DNS is a good move: you can buy a new hosting plan, upload the site, test, and switch the DNS in less than an hour, rather than it taking days.
On the markup side, try to keep things simple too: avoid stuff like browser-sniffing where you are constantly updating your scripts to account for new browser or bot versions. Use web standards (but not too cutting-edge) so you don't have to tweak your code for each version increment of every browser, and keep the basic templates validated, whilst accepting that some areas of a site (including user input) might fail validation occasionally.
-Never store passwords in plain text in your database; always hash them!
-Always check user input, especially when you don't use stored procedures to execute SQL!
-Make sure the database is not running under root/administrator privileges!
-Secure "mail a friend" functionality; two years ago we discovered that someone had used our send-a-friend feature to send a spam run.
-NEVER send detailed error messages back to the browser; they are key information for a potential hacker or worm/virus. Instead, log them to a file.
-Check the execution time of your SQL statements; most commercial databases have analysis tools that can give you tips.
-Think about caching often-requested data. E.g., the data behind dropdown boxes does not change very often, so it can be wise to keep it in memory instead of querying it out of your database every time.
-Use HTTP caching; most web servers support it and it will boost your performance!
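The first two tips can be sketched in modern PHP like this (the table and column names are illustrative, and SQLite stands in for your real database):

```php
<?php
// Tip 1: never store the plain-text password -- store a one-way hash.
// Tip 2: prepared statements keep user input out of the SQL string,
// so input like "x'; DROP TABLE users; --" cannot inject.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE users (name TEXT, pass_hash TEXT)');

// Registration: hash the password before it ever touches the database.
$hash = password_hash('s3cret', PASSWORD_DEFAULT);
$stmt = $db->prepare('INSERT INTO users (name, pass_hash) VALUES (?, ?)');
$stmt->execute(['alice', $hash]);   // user input bound, never concatenated

// Login: fetch the stored hash and verify; never compare plain text.
$stmt = $db->prepare('SELECT pass_hash FROM users WHERE name = ?');
$stmt->execute(['alice']);
$row = $stmt->fetch(PDO::FETCH_ASSOC);
$ok  = password_verify('s3cret', $row['pass_hash']);
?>
```

Because the hash is one-way, even a full database leak doesn't hand the attacker working passwords.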
I will give some tips on other subjects later today.
Backup, backup, backup!
At minimum, I back up all my sites every day. For the busier sites, I have a cron script that backs up the database every 6 hours and automatically saves the data to a storage server in a different location.
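A sketch of such a cron setup, assuming MySQL and scp access to the storage box (hostnames, paths, and the schedule offsets are illustrative):

```crontab
# Every 6 hours: dump the database, compress it, then push the dump
# off-site. (The % character must be escaped as \% inside a crontab.)
0 */6 * * *  mysqldump --single-transaction mydb | gzip > /backups/mydb-$(date +\%F-\%H).sql.gz
15 */6 * * * scp /backups/mydb-*.sql.gz backup@storage.example.com:/srv/backups/
```

The off-site copy is the part that matters: a backup sitting on the same server dies with the server.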
You would be surprised at how easily you can screw up all the data in your site by missing one or two words in a db query.
The same goes for pages and files. Never, never, never delete anything! That image you just deleted, the one you paid $100 for, was your only copy, and it's gone now! If you want to conserve space or keep directories clean, then back everything up off your server every day before deleting anything.
First, notice the use of the realpath() function to put together the canonicalized absolute pathname from a relative link (/../includes/) -- which, by the way, resides below the public document root ;) -- it tidies things up quite nicely.
I am assuming that if I am on a shared server I will not have access to the config file -- or do I just create another one? I'm getting ready to upgrade my site and find this thread very insightful. I want to do it the right/short way from the start. It would make it so much easier if I could set my server to auto-route include files from a relative path. It is a constant battle for me to find the correct path to use. One day ../dir/file.php will work, and the next day what appears to be the same kind of path, just with different names, won't work. Setting it up like this would help me out very much, if it can be done.
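On shared hosting you usually can't edit php.ini, but you can often set the include path at runtime from your own code. A hedged sketch (the /../includes location is an assumption for illustration):

```php
<?php
// At the top of a shared entry point (for example, a header file that
// every page requires): anchor the include path to one absolute
// directory, so includes resolve the same way from any depth in the
// site instead of depending on each page's relative position.
$includes = realpath(__DIR__ . '/../includes');
if ($includes !== false) {
    set_include_path($includes . PATH_SEPARATOR . get_include_path());
}
// From here on, `require 'common.php';` finds includes/common.php
// from any page, no matter how deep its directory is.
?>
```

On hosts running mod_php that allow per-directory overrides, a `php_value include_path` line in .htaccess achieves the same thing without touching code.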