Forum Moderators: coopster
However, if I do it the easy way, hackers can put in (http://www.example of a bad site.com), for example: (http://www.example.com/index.php?module=http://www.example of a bad site.com)
I don't want that to be possible, so how would I go about preventing it on my server so I don't have problems with my network admin?
I don't want to have to maintain a list; I just want to be able to check for specific characters like http:// and have them removed, then let the page either cause a 404 error or tell them NO DIRTY SITES. The 404 would be better, but anyway, how would I go about checking for and removing http:// from the query?
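To illustrate the kind of check being asked about, here is a minimal sketch that rejects any module value containing a URL scheme (the parameter name and the 404 response are just placeholders for whatever the site actually uses):

```php
<?php
// Hypothetical sketch: reject any ?module= value that looks like a URL
// before it ever reaches an include statement.
$module = isset($_GET['module']) ? $_GET['module'] : '';

// stripos() matches case-insensitively, so HTTP:// is caught as well.
if (stripos($module, '://') !== false) {
    header('HTTP/1.0 404 Not Found');
    die();
}
```

Note that stripping the bad characters and then trying to include the result is riskier than simply refusing the request, since attackers can often work around a blacklist; the whitelist approaches in the replies below are safer.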
Thank you ahead of time
Jordon Bedwell
[edited by: coopster at 3:19 pm (utc) on Aug. 16, 2004]
[edit reason] examplified urls as per TOS [webmasterworld.com] [/edit]
This comes from the WebmasterWorld Bag o' Tricks.
Credit goes to: Andreas
Validating an URI [webmasterworld.com]
#############OTHER###########
Resolving a relative URI [webmasterworld.com]
[edited by: coopster at 3:26 pm (utc) on Aug. 16, 2004]
[edit reason] linked up Bag-O-Tricks for PHP II as per henry0 request [/edit]
Protect yourself with a switch statement, something like this:
switch ($_GET['page']) {
    case 'home':
        $file = 'home.php';
        break;
    case 'forum':
        $file = 'forum.php';
        break;
    default:
        // Anything that isn't an exact match ends up here.
        die('Have at you, vile hacker!');
}
You wouldn't have to worry about http:// or anything else, since only an exact match against one of the options in the switch statement will actually do anything. Otherwise, the script just stops executing and gives the wannabe hacker a message.
Or, as has already been suggested, check the page parameter against an array, and then include/process it. :)
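The array approach mentioned above might look something like this (a minimal sketch; the page names and file paths are made-up placeholders):

```php
<?php
// Whitelist of allowed page parameters mapped to the files they load.
// Only keys in this array can ever be included.
$pages = array(
    'home'  => 'home.php',
    'forum' => 'forum.php',
);

$page = isset($_GET['page']) ? $_GET['page'] : 'home';

if (array_key_exists($page, $pages)) {
    include $pages[$page];
} else {
    // Unknown value: refuse outright rather than trying to clean it up.
    header('HTTP/1.0 404 Not Found');
    die('Page not found.');
}
```

Because the user's input is only ever used as an array key, a value like http://www.example of a bad site.com simply fails the lookup and hits the 404 branch.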
Alex ...