Msg#: 3265694 posted 6:21 pm on Feb 27, 2007 (gmt 0)
This is a must-read for all programming web developers: [honeynet.org...]
Web applications present a very high risk and an attractive target to attackers for the following reasons: First, the quality of the code is often rather poor, and many vulnerabilities in commonly used code are published. Second, attacks can often be performed using PHP and shell scripts, which are much easier to develop and use than buffer-overflow exploits. Third, tools such as search engines provide a very easy way for attackers to locate vulnerable web applications. We believe that web servers present relatively high-value targets for attackers, since they are more likely to have higher-bandwidth connections than the average desktop computer. They will also typically need to access the organisation's databases, and so may provide a stepping stone for an attacker who wishes to recover such data.
The paper is rather thick in spots, but stick with it, as there are some subtle gems in there worth finding.
Msg#: 3265694 posted 7:11 pm on Feb 28, 2007 (gmt 0)
While I didn't find anything in the article that dealt with search engine spider identification, it was an interesting read nonetheless.
It's amazing how lax some long-time programmers can be when it comes to recognizing and dealing with potential security threats, both offline and online.
I wonder if part of the problem has to do with there being more independent web developers who try to handle everything themselves?
A person might be a fine VB, VBScript, PHP, et cetera programmer yet know only the absolute basics about databases.
If you don't have a firm grasp of the nuances of a db query, it's really easy to get yourself into big trouble quite rapidly. The article cites some examples that should be part of any basic primer on databases.
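To illustrate the kind of db-query trouble I mean, here's a minimal sketch (in Python with an in-memory SQLite table, since that's self-contained; the same idea applies to PHP). The table, the `login_*` helper names, and the attack string are all hypothetical, just to show the classic mistake of pasting user input into SQL text versus using placeholders:

```python
import sqlite3

# Toy user table standing in for a real application's database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

def login_unsafe(name, password):
    # The classic mistake: user input is pasted straight into the SQL text.
    query = (f"SELECT COUNT(*) FROM users "
             f"WHERE name = '{name}' AND password = '{password}'")
    return conn.execute(query).fetchone()[0] > 0

def login_safe(name, password):
    # Placeholders let the driver treat input as data, never as SQL syntax.
    query = "SELECT COUNT(*) FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchone()[0] > 0

# A textbook injection string: the -- comments out the password check.
attack = "alice' --"
print(login_unsafe(attack, "wrong"))  # True: attacker is "logged in"
print(login_safe(attack, "wrong"))    # False: input treated as a literal
```

The unsafe version turns the attacker's quote and comment marker into SQL, so the password clause never runs; the parameterized version compares the whole string literally and the login fails, as it should.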