Msg#: 3550074 posted 8:48 pm on Jan 16, 2008 (gmt 0)
I'm creating a site where people will log in and build a profile. Then they can share the profile with select others. Users will need to register, but the info they enter isn't particularly sensitive.
What I'm concerned about is: can/will malicious robots enter the site and wreak havoc with the databases?
Over the past few years, I've started to see those "systems" where a person has to type in a code (shown on an image that robots can't read) to get into the site. Would anybody recommend I use one of these, and if so, are there any recommendations on best practices?
Msg#: 3550074 posted 9:08 pm on Jan 16, 2008 (gmt 0)
Taking preventative measures in any form that touches the database, to prevent SQL injection and the like, would be a good place to start.
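A minimal sketch of what "preventing SQL injection" looks like in practice, using Python's standard sqlite3 module (the table and column names here are just hypothetical stand-ins for your profile tables):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")

def add_user(name, email):
    # The ? placeholders let the database driver handle escaping,
    # so hostile input like "'; DROP TABLE users; --" is stored as
    # plain text instead of being executed as SQL.
    conn.execute("INSERT INTO users (name, email) VALUES (?, ?)", (name, email))
    conn.commit()

add_user("alice", "a@example.com")
add_user("'; DROP TABLE users; --", "evil@example.com")
```

The key point is never to build the SQL string by concatenating user input; always pass the values separately as parameters.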
You can stop search engine spiders and robots from accessing certain areas of your site with a robots.txt file; however, I would have thought any knowledgeable hacker would simply ignore that file and spider their way through the entire site anyway.
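For what it's worth, a robots.txt along these lines would keep well-behaved crawlers out of the member areas (the paths are hypothetical; malicious bots will ignore this file entirely, so treat it as a courtesy notice, not security):

```
User-agent: *
Disallow: /profiles/
Disallow: /admin/
```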
Can you restrict certain areas with a password? If the user has to log in to access the areas where the info is displayed, bots can't get through. You can implement a captcha on the form too, to help keep bots out.
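The login-restriction idea above boils down to a server-side check before rendering any profile page. A framework-agnostic sketch (the session dict and key names are hypothetical):

```python
def can_view_profile(session):
    # Only authenticated users reach the profile pages; anonymous
    # visitors, including bots, get bounced to the login form instead.
    return bool(session.get("user_id"))

# A logged-in session carries a user id; an anonymous one does not.
print(can_view_profile({"user_id": 42}))  # True
print(can_view_profile({}))               # False
```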
Msg#: 3550074 posted 9:43 am on Jan 17, 2008 (gmt 0)
If you're simply worried about a bot entering data, i.e. adding spam links: if it's a proprietary page, chances are they won't bother, because they won't recognize what it is. These bots are designed to attack specific installations on a large scale. For example, there are millions of phpBB forums installed worldwide, and every forum has the same captcha system, so if you can break one you can break them all (assuming the site admin hasn't taken precautions to prevent this). The same goes for sign-up pages on large sites like Yahoo; the bots are looking for sites and pages that can easily be circumvented.
To prevent this on individual custom pages, or on even more vulnerable things like a forum, creating a unique captcha such as a question is probably your best bet. Example: highlight some text somewhere on the page and ask the user to type in the highlighted text to validate that it's a human and not a bot. Simple but very effective: first, bots don't answer questions (at least not yet); second, the question can be unique on every site. The best part is that it gets rid of the barely legible image and is quite accessible to anyone.
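The "type the highlighted text" idea above can be sketched in a few lines. This is only an illustration of the flow, not a hardened implementation; the word list and the session dict are hypothetical, and in a real app the answer would live in server-side session storage:

```python
import random

# Hypothetical pool of words that could be highlighted on the page.
WORDS = ["profile", "share", "register", "login"]

def issue_challenge(session):
    # Pick a word to highlight on the page and remember the answer
    # server-side, so the bot can't read it out of the form markup.
    word = random.choice(WORDS)
    session["captcha_answer"] = word
    return word  # render this word highlighted in the form

def check_answer(session, submitted):
    # Compare case-insensitively so a human typing "Profile" still passes.
    return submitted.strip().lower() == session.get("captcha_answer", "").lower()

session = {}
word = issue_challenge(session)
print(check_answer(session, word))     # True
print(check_answer(session, "wrong"))  # False
```

Because each site can choose its own question and word pool, a script written to break one site's challenge doesn't transfer to the next, which is exactly the advantage over a shared off-the-shelf captcha.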