"Using the software security framework introduced in October (A Software Security Framework: Working Towards a Realistic Maturity Model), we interviewed nine executives running top software security programs in order to gather real data from real programs. Our goal is to create a maturity model based on these data, and we're busy working on that (stay tuned here for more). However, in the course of analyzing the data we gathered, we unearthed some surprises that we share in this article."
...
"Of the twenty-three large-scale software security initiatives we are aware of, we chose nine that we considered the most advanced. Our nine organizations are drawn from three verticals: financial services, independent software vendors, and technology firms.
On average, the target organizations have practiced software security for five years and four months (with the newest initiative being two and a half years old and the oldest initiative being a decade old). All nine have an internal group devoted to software security that we choose to call the Software Security Group or SSG. SSG size on average is 41 people (smallest 12, largest 100, median 35) with a "satellite" of others (developers, architects and people in the organization directly engaged in and promoting software security) of 79 people (smallest 0, largest 300, median 20). The average number of developers among our targets was 7550 people (smallest 450, largest 30,000, median 5000), yielding an average percentage of SSG to development of just over 1%.
We conducted the nine interviews in person and spent two hours going over each software security initiative in a conversation guided by the software security framework."
Here's the high-level list of the top 10 issues (read the article for more information on them).
9. Not only are there no magic software security metrics, bad metrics actually hurt.
8. Secure-by-default frameworks can be very helpful, especially if they are presented as middleware classes (but watch out for an over-focus on security "stuff").
7. Web application firewalls are not in wide use, especially not as Web application firewalls.
6. Involving QA in software security is non-trivial... Even the "simple" black box Web testing tools are too hard to use.
5. Though software security often seems to fit an audit role rather naturally, many successful programs evangelize (and provide software security resources) rather than audit even in regulated industries.
4. Architecture analysis is just as hard as we thought, and maybe harder.
3. Security researchers, consultants and the press care way more about the who/what/how of attacks than practitioners do.
2. All nine programs we talked to have in-house training curricula, and training is considered the most important software security practice in the two most mature (by any measure) software security initiatives we interviewed.
1. Though all of the organizations we talked to do some kind of penetration testing, the role of penetration testing in all nine practices is diminishing over time.
0. Fuzz testing is widespread.
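Item 8's "secure-by-default middleware classes" idea can be sketched as a small wrapper whose substitutions are HTML-escaped unless a developer explicitly opts out — a minimal, hypothetical illustration; the class name and API below are not from the article:

```python
import html

class SafeTemplate:
    """Hypothetical secure-by-default middleware class: every substituted
    value is HTML-escaped, so injection-prone output is the opt-out case,
    not the default."""

    def __init__(self, template: str):
        self.template = template

    def render(self, **values) -> str:
        # Escape all values before substitution; a developer would need a
        # separate, explicit API to emit raw markup.
        escaped = {k: html.escape(str(v)) for k, v in values.items()}
        return self.template.format(**escaped)

page = SafeTemplate("<p>Hello, {name}!</p>")
print(page.render(name="<script>alert(1)</script>"))
# the injected <script> tag is escaped and rendered inert
```

The design point is the default: developers using such a class get output encoding without having to remember it, which is what makes the framework "secure by default" rather than merely "securable".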
Article Link
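Item 0's fuzz testing can be illustrated with a minimal random-mutation fuzzer. The target parser and mutation strategy below are illustrative assumptions, not anything described in the article:

```python
import random

def parse_record(data: bytes) -> dict:
    """Toy parser under test: expects ASCII 'key=value' input."""
    text = data.decode("ascii")      # raises UnicodeDecodeError on bad bytes
    key, value = text.split("=", 1)  # raises ValueError if '=' is missing
    return {key: value}

def mutate(seed: bytes, n_flips: int = 3) -> bytes:
    """Randomly overwrite a few bytes of a known-good seed input."""
    buf = bytearray(seed)
    for _ in range(n_flips):
        buf[random.randrange(len(buf))] = random.randrange(256)
    return bytes(buf)

def fuzz(seed: bytes, iterations: int = 1000) -> list:
    """Feed mutated inputs to the parser; collect inputs it fails on."""
    failures = []
    for _ in range(iterations):
        candidate = mutate(seed)
        try:
            parse_record(candidate)
        except (UnicodeDecodeError, ValueError):
            failures.append(candidate)
    return failures

failures = fuzz(b"user=alice")
print(f"{len(failures)} of 1000 mutated inputs were rejected")
```

Real fuzzers (coverage-guided tools, protocol-aware generators) are far more sophisticated, but the loop above — mutate a valid input, run the target, record failures — is the core idea the surveyed programs have adopted widely.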
Source: www.cgisecurity.net