
The OWASP Application Security Metrics Project

Bob Austin
Application Security Metrics Project
KoreLogic, Inc.
P 804.379.4656

Copyright © 2006 - The OWASP Foundation

Permission is granted to copy, distribute and/or modify this document
under the terms of the Creative Commons Attribution-ShareAlike 2.5
License. To view this license, visit

Oct 2006
Presentation Objectives

Drivers for Security Metrics
Review the Project Plan
Work Accomplished to Date; Next Steps
Provide Application Security Metrics
Solicit Feedback and Participation

OWASP AppSec Seattle 2006 2

The Best Metrics… Can Answer Hard Questions

How secure am I?
Am I better than this time last year?
Am I spending the right amount of money?
How do I compare to my industry peers (senior management’s favorite question)?

Source: Dr. Dan Geer

Forrester Survey: “What are your top three drivers for measuring information security?”

Justification for security: 63%
Regulations: 51%
Loss of reputation: 37%
Better stewardship: 26%
Report progress to business
Manage risk: 11%

Base: 40 CISOs and senior security
Source: “Measuring Information Security Through Metrics And Reporting”, Forrester Research, Inc., May 2006
Forrester Survey: What do CISOs want to measure?
“As a CISO, if you have a choice of measuring and monitoring up to five areas in security, which ones would you measure?”

Regulatory compliance: 62%
Incident handling and response: 59%
Corporate governance: 55%
Risk management process
Security awareness: 52%
Vulnerability and patch management
Application security: 34%

Base: 34 CISOs and senior security
Source: “Measuring Information Security Through Metrics And Reporting”, Forrester Research, Inc., May 2006
Project Goal and Roadmap
Project Goal:
Address the current lack of effective application security metrics by
identifying, sharing and evolving useful metrics and metric processes to
benefit the OWASP community.
Current Project Contributors: Jeff Williams (Aspect Security), Cliff
Barlow (KoreLogic), Matt Burton (Mitre)

Phase One: Develop Project Approach ➨ Conduct research; identify leading practices ➨ Develop/Conduct Initial Survey ➨ Publish Survey Results

Current Project Status

Phase Two: OWASP Feedback ➨ Identify Short List of Needed Metrics; Perform Gap Analysis ➨ Create Approach to Develop Needed Metrics
Phase One – Application Security Metrics Baseline Survey Plan

Information Capture ➨ Analysis ➨ Survey Results

Information Capture: Interviews; Research; Survey
Analysis: Identify key findings from survey (common themes, barriers, concerns); Assess survey participant-provided metrics
Survey Results: Application security metrics for OWASP community use; Provide set of “best practices” associated with an application security metrics program; Use results for Phase Two of the Project

Useful Resources from Research
 OWASP CLASP Project – “Monitor Security Metrics”
 Dr. Dan Geer’s “Measuring Security” Tutorial
 Other Initiatives: Metricon 1.0
 Secure Software Development Life Cycle:
 “The Security Development Lifecycle”, Howard and Lipner
 “Security in the Software Lifecycle”, DHS, Cybersecurity Div.
 Information Security Metrics Standard - ISO 27004
 Dr. Larry Gordon, Cybersecurity Economics Research Projects
 Resources from NIST:
 Security Metrics Guide for Information Technology Systems
 Guide for Developing Performance Metrics for Information Security
 NIST Software Assurance Metrics and Tool Evaluation (SAMATE)


Organizing Metric Types

Process Metrics – Information about the processes themselves; evidence of maturity. Examples:
 Secure coding standards in use
 Avg. time to correct critical vulnerabilities

Vulnerability Metrics – Metrics about application vulnerabilities themselves. Examples:
 By vulnerability type
 By occurrence within a software development life cycle phase

Management Metrics – Metrics specifically designed for senior management. Examples:
 % of applications that are currently security “certified” and accepted by business partners
 Trending: critical unresolved, accepted risks
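As a sketch, one metric of each type can be computed from the same underlying finding records. The record fields, application names, and finding data below are purely illustrative, not from the survey:

```python
from collections import Counter
from statistics import mean

# Hypothetical finding records; field names are assumptions for illustration.
findings = [
    {"app": "portal",  "type": "XSS",  "severity": "critical", "days_to_fix": 12},
    {"app": "portal",  "type": "SQLi", "severity": "critical", "days_to_fix": 30},
    {"app": "billing", "type": "XSS",  "severity": "medium",   "days_to_fix": 45},
]
certified_apps = {"billing"}                     # apps security "certified"
all_apps = {f["app"] for f in findings}

# Process metric: avg. time to correct critical vulnerabilities.
avg_fix_critical = mean(f["days_to_fix"] for f in findings
                        if f["severity"] == "critical")

# Vulnerability metric: findings broken down by vulnerability type.
by_type = Counter(f["type"] for f in findings)

# Management metric: % of applications that are security "certified".
pct_certified = 100 * len(certified_apps & all_apps) / len(all_apps)

print(avg_fix_critical)   # 21
print(by_type)            # Counter({'XSS': 2, 'SQLi': 1})
print(pct_certified)      # 50.0
```

The point of the split is that each metric type answers a different audience: the first speaks to process maturity, the second to engineering, the third to senior management.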


Opportunities for Metrics - Secure Development Life Cycle (SDL)

Were software assurance activities conducted at each lifecycle phase?

Phases: Concept ➨ Designs Complete ➨ Test Plans Complete ➨ Code Complete ➨ Deploy ➨ Post Deployment

Activities across the phases: team member training; security & least-privilege review; secure coding guidelines; data mutation tests; check-ins checked; use tools; security push/audit; secure questions during interviews; review old defects; learn & refine (on-going).

Source: Microsoft
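The slide’s question — were assurance activities conducted at each phase? — can be sketched as a per-phase checklist rolled up into a coverage percentage. The phase names echo the diagram; the specific activities and flags below are hypothetical:

```python
# Hypothetical record of which assurance activities were performed per phase.
sdl_activities = {
    "Designs Complete": {"threat model": True, "least-priv review": True},
    "Code Complete":    {"secure coding guidelines": True, "static analysis": False},
    "Deploy":           {"security push/audit": False},
}

def phase_coverage(activities):
    """% of planned assurance activities actually performed, per phase."""
    return {phase: 100 * sum(done.values()) / len(done)
            for phase, done in activities.items()}

print(phase_coverage(sdl_activities))
# {'Designs Complete': 100.0, 'Code Complete': 50.0, 'Deploy': 0.0}
```

A roll-up like this makes phase gates measurable: a gate can be defined as "coverage must be 100% before the next phase begins."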
Examples of Application Security Metrics

Process Metrics
 Is an SDL process used? Are security gates enforced?
 Secure application development standards and testing criteria?
 Security status of a new application at delivery (e.g., % compliance with organizational security standards and application system requirements).
 Existence of developer support website (FAQs, code fixes, lessons learned, etc.)?
 % of developers trained, using organizational security best practice technology, architecture and processes.

Management Metrics
 % of applications rated “business-critical” that have been tested.
 % of applications which business partners, clients, regulators require be “certified”.
 Average time to correct vulnerabilities (trending).
 % of flaws by lifecycle phase.
 % of applications using centralized security.
 Business impact of critical security incidents.
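A minimal sketch of the "average time to correct vulnerabilities (trending)" management metric, assuming a hypothetical remediation log that records the quarter each finding was closed and how many days it stayed open:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical remediation log: (quarter closed, days the finding was open).
closed = [("2006Q1", 40), ("2006Q1", 20), ("2006Q2", 25), ("2006Q2", 15)]

# Group remediation times by reporting period.
trend = defaultdict(list)
for quarter, days in closed:
    trend[quarter].append(days)

# Trending view: average time-to-correct per quarter.
avg_by_quarter = {q: mean(d) for q, d in sorted(trend.items())}
print(avg_by_quarter)   # {'2006Q1': 30, '2006Q2': 20}
```

The value for senior management is the direction of the series, not any single number: a falling average suggests the remediation process is improving.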


Examples of Application Security Metrics

Vulnerability Metrics
 Number and criticality of vulnerabilities found.
 Most commonly found vulnerabilities.
 Reported defect rates based on security testing (per developer/team, per application).
 Root cause of “vulnerability recidivism”.
 % of code that is re-used from other applications*
 % of code that is third party (e.g., libraries)*
 Results of source code analysis**:
 Vulnerability severity by project, by organization
 Vulnerabilities by category, by project, by organization
 Vulnerability +/- over time by project
 % of flaws by lifecycle phase (based on when testing occurs)

Source: * WebMethods, ** Fortify Software
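The source-code-analysis roll-ups above (by category and project, +/- over time) can be sketched with counters over scan results. The project names and vulnerability categories below are hypothetical:

```python
from collections import Counter

# Hypothetical scan results for two reporting periods: (project, category).
jan = [("portal", "XSS"), ("portal", "SQLi"), ("billing", "XSS")]
jun = [("portal", "XSS"), ("billing", "CSRF")]

def by_project_and_category(results):
    """Count findings per (project, category) pair."""
    return Counter(results)

# Vulnerabilities by category, by project (current period).
current = by_project_and_category(jun)

# Vulnerability +/- over time: current counts minus the earlier period's.
delta = by_project_and_category(jun)
delta.subtract(by_project_and_category(jan))

print(dict(current))
print(dict(delta))   # negative values mean fewer findings than before
```

`Counter.subtract` keeps negative counts, which is exactly what a +/- trend needs; a plain `Counter` difference (`a - b`) would silently drop improvements.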
The Path Forward

Complete KoreLogic-sponsored surveys

Encourage others to complete survey forms
Create metrics taxonomy. Test drive it.
Collaborate/share with other metrics initiatives
“Will Work for Metrics”. Volunteers needed!
Solicit survey participants. Collect survey data.
Help analyze survey data
Donate useful application security metrics
Help plan Phase Two


Our Security Metrics Challenge

“A major difference between a ‘well developed’ science such as physics and some of the less ‘well-developed’ sciences such as psychology or sociology is the degree to which things are measured.”
Source: Fred S. Roberts, ROBE79

“Give information risk management the quantitative rigor of financial information…”
Source: CRA/NSF, 10 Year Agenda for Information Security Research, cited by Dr. Dan Geer


Supplemental Slides and Metrics


Resources – Security Metrics
 Security Metrics Standards:
 ISO 27004 - a new ISO standard on Information Security Management Measurements
 Other metrics initiatives:
 Metricon 1.0 presentations
 Dan Geer’s “Measuring Security” tutorial (PDF)
 Developing metrics programs:
 Security Metrics Guide for Information Technology Systems
 Guide for Developing Performance Metrics for Information Security
 Establishing an Enterprise Application Security Program, Tony Canike, OWASP 2005
 Metrics-related Tools:
 NIST Software Assurance Metrics and Tool Evaluation (SAMATE)
 Metrics-related Models, Frameworks
 Current Articles on Metrics
 Metric-related Financial and Econometric Resources:
 Economics and Security Resource Page (Ross Anderson)
 Dr. Larry Gordon, University of Maryland, Cybersecurity Economics Research Projects


Resources – Software Assurance
 “A Clinic to Teach Good Programming Practices”, Matt Bishop
 Team Software Process for Secure Systems (TSP-Secure)
 OMG’s Software Assurance Workshop 2007
 DHS Cybersecurity Division Software Assurance Initiatives:
 Software Assurance Measurement Workshop, Oct 2006
 Software Assurance Program
 Software Assurance Forum
 CERT Secure Coding Standards
 CRA Conference on “Grand Research Challenges in Information Security & Assurance”


Resources – General Software Measures & Metrics
 Measures and Metrics Web Sites
 Software Process Metrics Organizations:
 Software Metrics Symposium
 Capability Maturity Model Integration (CMMI) Performance and Decision Analysis
 History of Software Measurement (Horst Zuse), http://irb.cs.tu-
 NASA Software Engineering Laboratory, Experience Factory
 ISO/IEC 15939, Software Engineering - Software Measurement Process
 Software Metrics Glossary
 2006 State of Software Measurement Practice Survey


Really Bad Metrics Advice
 According to my data, roughly 122.45 percent of this journal's 347,583,712 readers need some sharpening up on
how to effectively collect and use metrics. There is less than a 0.0345 percent chance that this column will help.
 Q: I'm a manager who believes in keeping metrics simple, which is why I've limited the number we collect to 62. But I also want to simplify their collection—do you know where I can find timecard readers designed for bathroom stalls?
 A: Try voice print-activated stalls with timed door locks. But first, are you really trying to collect 62 metrics? 62?
[snicker snort chortle] You're obviously clueless about the "KISS" principle: Keep It Stupefyingly Strenuous. You can
collect a lot more than 62 different metrics. The accepted rule of thumb for the number of metrics you can
reasonably work with is this: "Seven, plus or minus the square of the number of door knobs in your home."
Remember, if something can be measured, it must be measured, and all metrics are equally critical.
 Q: I feel vindicated. Now I can introduce additional metrics for every obscure area of our process improvement model. Naturally, I plan to drop the whole wad as an enforced edict and then make myself unavailable for a few weeks.
 A: Bravo! But be sure you don't overcomplicate things by defining every minute detail, such as data integrity
standards or what you plan to do with the data. People learn nothing from constant handholding. Your job is to sit
back and wait for those reliable numbers to start pouring in.
 Q: Great! What do you suggest I do with all that data?
 A: What should you do with the data? Do? That question implies that metrics are a means to some end. Don't
waste resources—time spent analyzing metrics is time that could have been spent collecting even more metrics.
 Q: My boss keeps asking for data on stuff I don't think can be quantified—and it's often common sense stuff he
could just ask us! Aren't metrics just a big sham?
 A: Shhh! You're right, metrics are actually an extensive conspiracy—but an extremely helpful one. When people
want to make decisions based on "facts" rather than "opinions," you need metrics to push your personal agenda
under the guise of unassailable objectivity. Perception is everything:
 Politicized emotional drivel: "Let's try my approach. Her plan isn't working."
 Objective insight: "A consumptive analysis of my plan projects an 84.67 percent increased density of pro-active
rationals within six months. However, her key preambulatory vindicators are creating a 24.38 percent downward
sloping polymorphic trend. Plus, she wears really cheesy business suits."

