I’m taking a few moments to provide my position and thoughts on this release as the senior person hired to oversee OWASP Projects and a full-time member of the OWASP Foundation staff. In addition, I have spent a long time in the AppSec field and have a long history as part of the OWASP community - project leader, Foundation board member, speaker, trainer, etc.
First, some quick background: The OWASP Top 10 is one of the most well-known and widely referenced OWASP projects. Its long history with OWASP and its use by the greater security community are well established. Also, the original project leaders (Jeff Williams and Dave Wichers) not only started this project at OWASP but have a long and prolific history of contributions to OWASP and AppSec in general. OWASP and the information security community are better off for the creation and continued maintenance of the OWASP Top 10, so thanks for your work to date.
Matt’s personal take on this release, after watching the project through every release since 2008: this is the most open and visible release to date. Why?
- The release candidate was shared publicly with the stated goal of getting feedback from the community BEFORE the 2017 release is finalized. Yes, PDFs aren’t the most feedback-friendly format for an RC, but the project is actively seeking feedback well before a final release. That’s not new for this project, but it is very important.
- The data call was open and public. Announced back in May 2016, the call for data for this release was made publicly and without any significant restrictions. The project leader accepted both emailed and Google Form submissions. Additionally, the project accepted suggestions on the data being collected and possible omissions to consider, as seen in this thread.
- The collected data used to create the RC is publicly available on GitHub. Again, Excel isn’t my preferred data format, but the data used to derive the RC is easily and publicly accessible. This is also an improvement over past releases.
- I have seen no violation of OWASP project policy or community norms by this project. Yes, every release has perceived winners and losers depending on what problem vendors believe their secret sauce solves. That’s the nature of narrowing down the possible risks to just 10.
It seems like most of the discussion is about the new entries - A7 and A10. Much of this discussion has been great and has helped clarify where the language used in the Top 10 didn’t hit its intended target. Dave Wichers (project leader) provided a detailed explanation of what A7 covers and how hard it was to condense that into the space available. This was further clarified by some good feedback from Colin Watson. A reasonable overview of the pros and cons can be found at CISO online. Particular high praise should go to Brian Glas for his pair of excellent blog posts examining the data and different ways to view what was collected. Those posts are stellar examples of what an open process can create simply by making things visible.
This is where I feel obligated to make yet another request for more involvement from all the players in the AppSec community. We have a mountain of work to get done in this field, and we’ll make more headway working together than tearing each other apart with infighting.
Inspired by the insights in the feedback already gathered and wanting to get more, Dave Wichers has decided to attend the OWASP Summit in London in June and is organizing a set of working sessions on the OWASP Top 10. There’s still time to register if this is how you’d like to participate. Also, if you’re an OWASP leader, Community Engagement Funding or your project/chapter funds can help defray the costs of attending the summit. If the summit doesn’t work for you, you can always join the Top 10 mail list and comment there. This project will only be as good as the contributions of the AppSec community, so get involved in this and future releases.
All this said, let me be direct with you: If you feel there’s a violation of the rules for OWASP projects, I would like to know the specifics of your complaint. I haven’t seen any violations but if you feel that you have, please submit a case via the “Contact Us” form so I can keep track of any submissions.
When you’re providing those specifics, please keep the following in mind: the rules and norms of OWASP are documented in the Core Values, Core Purpose, and Code of Ethics and Principles on the “About OWASP” wiki page. Specifically for projects, there’s also the Project Handbook, whose content has been moved to GitHub for an update/review this year - issues and PRs are gladly accepted to help improve it during this iteration. Also note that there are plenty of cognitive biases out there - try to check those at the door.
Is the process around the creation and release of the OWASP Top 10 perfect? Nope.
However, as a volunteer-driven effort that has been exceptionally valuable to OWASP and the greater software community, the process of producing a new release improves with each iteration. So, please, join the discussion on the Top 10 mail list and help ensure this and future releases continue on that trajectory of improvement.
“Keep the stones you are about to throw in your pocket. Use those stones to build a bridge.”
-- Michael Coates on the Leaders List