OWASP Code Review Guide Survey
As some of you may know, the OWASP Code Review Guide team ran a survey of the attendees at AppSec USA. We wanted to find out how attendees rated the effectiveness of various security tools/reviews at finding issues, such as business logic problems, or each of the OWASP Top 10. Our intention was to evaluate whether Secure Code Review (the topic of our guide) is seen as an effective security process in an organization's SDLC. These results (below) will be included in the next version of the guide.
We want to thank everyone who took part, and to share the results of the survey (it is Security Awareness Month, after all). In the first part of our survey we asked attendees to rate which of the following security tools/reviews were the most effective in finding:
1) General security vulnerabilities
2) Privacy issues
3) Business logic bugs
4) Compliance issues (such as HIPAA, PCI, etc.)
5) Availability issues
Next we concentrated on the OWASP Top 10 issues; this time the results were as follows:
Please feel free to make use of this survey in whatever way you want. Also feel free to discuss any of the outcomes, for example:
a) A high percentage of people prefer manual pen testing as a way of detecting availability/traffic load issues. Is this specific to any tool, or is it simply because 'load' or 'DoS' testing was not an option?
b) For A1, Injection, source code scanning was three times more popular than manual pen testing; does that match your experience?
c) For A9, Using Components with Known Vulnerabilities, automated vulnerability scans were far more popular than the rest.
Just to note, this type of activity was a great outcome of the Project Summit that took place before the conference. This survey is just one of the many valuable things to come from that summit. Thanks to Larry for digitizing this info.
Best of luck,
Gary Robinson
Larry Conklin