I'm creating a site where people will log in and build a profile, which they can then share with select others. Users will need to register, but the info they enter isn't particularly sensitive.
What I'm concerned about is whether malicious robots can (or will) enter the site and wreak havoc with the databases.
In recent years I've started to see those systems where a person has to type in a code (shown in an image that robots can't read) to enter the site. Would anybody recommend I use one of these, and if so, are there any best practices?
You can ask search engine spiders and robots to stay out of certain areas of your site using the robots.txt file. However, robots.txt is purely voluntary, so any knowledgeable hacker would simply ignore it and spider their way through the entire site.
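As a sketch, a robots.txt that asks well-behaved crawlers to stay out of a hypothetical /profiles/ area would look like this (the path is just an example, and as noted, malicious bots are free to ignore it):

```
User-agent: *
Disallow: /profiles/
```

This file goes in the root of the site (e.g. example.com/robots.txt).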
Can you restrict certain areas with a password? If the user has to log in to access the areas where the info is displayed, the bots can't get through. You can implement a captcha on the registration form too, to help keep bots out.
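As a minimal sketch of the "restrict areas behind a login" idea, assuming a dict-like session object such as the one most web frameworks provide (the function and key names here are hypothetical):

```python
def is_logged_in(session):
    """A visitor counts as logged in if the session records a user."""
    return session.get("user") is not None

def view_profile(session, profile_name):
    """Serve a profile page only to authenticated users.

    Bots that haven't registered and logged in have no session user,
    so they never get past this check to the data behind it.
    """
    if not is_logged_in(session):
        return ("403 Forbidden", "Please log in to view profiles.")
    return ("200 OK", "Profile of " + profile_name)
```

The point is simply that the protected data is never rendered at all unless the check passes, rather than being hidden client-side.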
Hope that helps
To prevent this on individual custom pages, or on more vulnerable things like a forum, creating a unique captcha such as a question is probably your best bet. Example: have some text highlighted somewhere on the page and ask the visitor to type in the highlighted text to validate that it's a human and not a bot. Simple but very effective: first, bots don't answer questions (at least not yet), and second, the question can be unique on every site. The best part is that it gets rid of the hardly legible image and is quite accessible to anyone.
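The "type the highlighted word" idea above can be sketched server-side like this. Everything here is hypothetical (word list, session keys, HTML); the essential points are that the expected answer lives only in the server-side session and that each challenge is single-use:

```python
import random

# Example word pool -- in practice pick words relevant to your own site.
WORDS = ["profile", "share", "member", "login"]

def render_challenge(session):
    """Pick a word, remember it server-side, and return form HTML.

    The page highlights the word (here with <strong>); the visitor is
    asked to type it back into a text field.
    """
    word = random.choice(WORDS)
    session["expected"] = word
    return "Please type the highlighted word: <strong>" + word + "</strong>"

def validate(session, answer):
    """Check the submitted answer against the stored one.

    pop() makes each challenge single-use, so a bot can't replay a
    previously correct answer.
    """
    expected = session.pop("expected", None)
    return expected is not None and answer.strip().lower() == expected
```

A usage note: comparing case-insensitively and stripping whitespace keeps the check friendly to humans while still stopping a bot that doesn't parse the page.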