Forum Moderators: coopster
1 - Ensure register_globals is firmly OFF; it will stop you making some of the most common scripting mistakes, such as people overwriting session variables via the URL.
2 - Use mysql_escape_string() on EACH AND EVERY user-submitted string you put into your SQL statements.
3 - For reasonable security, use sessions and check the value of a variable (e.g. $userstatus) on every page. Try not to use it as true/false; use an exact number, such as logged in being ($userstatus == 74). This is a great defense against many of the most common forms of attack.
4 - Whenever you store a password, store only the md5() hash of the password - that way, _if_ someone gets into your database, they don't have a password they can use. To check passwords at later logins, test md5($login_password) == $stored_md5_password.
5 - If you use .inc files, set your .htaccess so that PHP parses them - that way they won't return the source code if someone sneakily guesses/finds out the URL and enters it into their browser.
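Point 4 as a runnable sketch (md5 as the post recommends; the variable names and sample password are mine, purely illustrative):

```php
<?php
// Signup: hash the password once and store only the hash.
$signup_password = 'hunter2';             // illustrative value only
$stored_md5_password = md5($signup_password);

// Later login: hash the attempt and compare hash to hash,
// so the plain-text password is never kept anywhere.
$login_password = 'hunter2';
$login_ok = (md5($login_password) === $stored_md5_password);
```

The same pattern applies with any one-way hash; md5 was simply the standard advice at the time.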
#5 is always good advice - or just don't use the .inc extension at all; .inc.php works too.
Error Handling/Trapping - check all data and disallow variations, or use default cases or catch-alls to trap exceptions.
need to use sessions
I'll add another one:
Place @ before every function likely to give an error - I'm thinking of database calls and file handling, especially from external sites, here. The default error it spews out not only looks bad for your site, but reveals your actual server path - and possibly your MySQL username.
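As a minimal sketch of the idea (the path below is hypothetical), @ silences the default warning so you can handle the failure yourself instead of leaking the server path:

```php
<?php
// @ suppresses the default warning, which would otherwise print the
// real filesystem path to the visitor; check the return value instead.
$fp = @fopen('/no/such/file.txt', 'r'); // hypothetical path
if ($fp === false) {
    echo 'Sorry, that resource is unavailable.';
}
```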
only once you are putting it live, I know that's what you mean vincevincevince but I just want to make sure everyone is clear.
The @ symbol will suppress error messages, and while the site is still in some form of development you will run yourself ragged chasing errors you suppressed.
place @ before every function likely to give an error - i'm going for database calls and file handling, especially from external sites here - the default error it spews out not only looks bad for your site - but it reveals your actual server path - and possibly your mySql username.
I'm no expert on this, but I think it might be worth doing this in the php.ini file ... see [php.net...] . You can turn off displaying errors and have everything log to a text file. This way you don't have to track down every function.
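The same effect can be sketched at runtime with ini_set() - though php.ini itself is the better place, as the poster says. The log path is a placeholder, not a real location:

```php
<?php
// Hide errors from visitors, but keep them for yourself in a log file.
ini_set('display_errors', '0');
ini_set('log_errors', '1');
ini_set('error_log', '/var/log/php-errors.log'); // placeholder path
```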
2 - Use mysql_escape_string() on EACH AND EVERY user submitted string you put into your SQL statements.
Might want to check the value of magic_quotes_gpc first - escaping twice is safer than not at all, but can still screw up your site.
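A common pattern from that era, sketched here (the function name is mine): undo magic quotes first, so the string ends up escaped exactly once:

```php
<?php
// Strip the slashes magic_quotes_gpc added, if it is on at all, so
// that the later mysql_escape_string() call escapes exactly once.
function clean_gpc($value) {
    if (function_exists('get_magic_quotes_gpc') && get_magic_quotes_gpc()) {
        return stripslashes($value);
    }
    return $value;
}

// Usage sketch: $safe = mysql_escape_string(clean_gpc($_POST['name']));
```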
5 - If you use .inc files, set your .htaccess so that php parses them - that way they won't return the source code if someone sneakily guesses/finds out the URL and enters it into their browser.
Or put them in a directory outside the web root, so there is no URL for them.
Always delete (possibly after backup) any installation scripts supplied with ready-made systems, typically install.php or setup.php.
Asandir...
How about a very very commonly used function - taking an email address to save as part of a registration / post to a site / email to. I'll be impressed when you can do that without reading form data :-)
Or did I miss the point?
Every bit of data coming from outside of "my" system gets treated as if it is "hostile" - every bit of data from a form, for example, is treated as a potential SQL injection, etc.
What you said in post #1, point 2, really.
phpinfo() is actually a function that produces a page with all of those variables. It can be added to a file named whatever you want, but yes - if someone were to guess the name of that file, they would have access to all of the server info.
The disabling of phpinfo() and mail() is recommended only where you will let a third party run PHP on your service - for example, if you are a host with virtual PHP-enabled domains.
If guest posting is disabled and you have a limit on post sizes/subject sizes then it is easy to delete posts by user. Minimum delays can piss your members off. The functions to watch out for are anything that sends email, like lost password, registration etc, or any function that works the CPU like Search. As I said, you can always delete posts by user, but the emailer and search use CPU resources which will eventually completely bog the server.
If you have your website virtually hosted, then the host will eventually drop your site to release CPU resources.
Dealing with sessions tables filling up is another issue, too.
Not sure I agree with your views, though. You can reduce user irritation by making the delay appropriate for the particular audience. For example, WebmasterWorld uses 15s I think.
You raise good points about controlling the server load, but flood control is important for other reasons too. Here are a few examples:
Shawn
I often have competitors try to extract my database. I even once got a funny email from a Microsoft employee complaining that he didn't manage to extract my database, and asking if I could please send it to him (what ARE they thinking?).
What I did for my anti-hammering is to have a minimum time before it kicks in, for example 5 mins, and then have a threshold of hits/second that will trigger a timed ban - so a user can do maybe one or two hits per second, or perhaps 30 per minute or less. Should stop most extraction spiders. Of course, you have to be careful not to hit any search engines (easy in my case - I only affected IPs from my country, which doesn't host any SEs).
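The anti-hammering described above, sketched as a pure function. The warm-up period and rate threshold are the poster's example numbers; the function and parameter names are mine:

```php
<?php
// Ban only after a minimum warm-up period has elapsed, then only when
// the average request rate exceeds roughly 30 hits per minute.
function should_ban($hits, $elapsedSeconds, $warmupSeconds = 300, $maxPerMinute = 30) {
    if ($elapsedSeconds < $warmupSeconds) {
        return false; // minimum time before the check kicks in
    }
    $perMinute = ($hits / max(1, $elapsedSeconds)) * 60;
    return $perMinute > $maxPerMinute;
}
```

A real implementation would track $hits per IP (in a sessions table or similar) and turn a true result into a timed ban.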
SN
Well, if the forums are set up to run reports like 'delete all posts by user', then it is not a biggie. My comments are based on the preferred denial-of-service attacks on PHP websites. Whether forums or not, attackers' first preferences are always functions that sap server CPU resources, i.e. anything that dispatches email, registration scripts, then the search scripts, etc.
Disabling attacks of that nature means creating a sessions include script that will more accurately determine whether it is indeed a browser that is requesting/posting, or an attack script.
Look at the way browsers deal with cookies. For instance, 99% of attack apps cannot store cookie data and then return it on future requests from a given proxy/IP.
So create a session based on the requesting IP, which allows it to have persistent data even if the requester is not giving back the PHPSESSID via cookie or URL.
Create a hash based on several different persistent pieces of information, send it to the requester as a cookie, then have them re-request the document - ensuring the user has the correct cookie before allowing access to any document.
Another way to deal with post floods, rather than arbitrarily forcing delays between posts, is to start a counter which would allow, say, 10 consecutive post requests less than 20 seconds apart. If that counter gets past 10, a message tells them they've been blocked to prevent spamming. If that counter gets past 40, then that IP is banned via .htaccess.
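That counter, sketched with the thresholds from the post (the function name and return strings are illustrative):

```php
<?php
// Escalate as the consecutive-fast-post counter grows:
// past 10 -> show a "blocked" message; past 40 -> .htaccess IP ban.
function flood_action($fastPostCount) {
    if ($fastPostCount > 40) {
        return 'ban_ip';        // append the IP to an .htaccess deny list
    }
    if ($fastPostCount > 10) {
        return 'block_message'; // tell them they have been blocked
    }
    return 'allow';
}
```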
So each time a document is requested, a random hash would need to be sent to them as a cookie. Whenever they perform a POST operation, it makes sure the cookie they send matches the last hash that was sent.
This will effectively stop attacks made from applications that employ huge anonymous proxy lists.
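A sketch of that hash/cookie handshake. The function names, cookie name, and secret are assumptions for illustration, not anyone's actual implementation:

```php
<?php
// Issue a fresh, unpredictable token with each page, tied to the
// visitor's IP and a server-side secret.
function make_token($ip, $secret) {
    return md5($ip . '|' . uniqid('', true) . '|' . $secret);
}

// On POST, accept only if the cookie sent back matches the last
// token issued to this visitor.
function token_ok($cookieValue, $expectedToken) {
    return $cookieValue !== '' && $cookieValue === $expectedToken;
}

// Usage sketch:
// $t = make_token($_SERVER['REMOTE_ADDR'], $secret);
// setcookie('post_token', $t);
// $_SESSION['expected'] = $t;   // compare on the next POST
```

An attack script cycling through anonymous proxies cannot return the cookie, so every POST fails the check.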