Version: 1.5
Description: One aim of the Software Assurance Maturity Model (SAMM) is to help organizations build software security assurance
programs. The current position and future targets can be charted and the SAMM document includes roadmap templates
for different industries. This spreadsheet helps produce roadmaps once the plan is known. It is structured with four
phases of improvement, as in SAMM, although it could be altered to suit any number of stages.
SAMM
The Software Assurance Maturity Model (SAMM) was created by Pravir Chandra and is now an Open Web Application Security Project (OWASP) project.
SAMM is licensed under the Creative Commons Attribution-Share Alike 3.0 License
https://www.owasp.org/index.php/OWASP_SAMM_Project
SAMM Assessment Interview: Brick Builder For Acme Brick Co
Instructions
Interview an individual based on the questions below organized according to SAMM Business Functions and Security Practices.
Select the best answer from the multiple choice drop down selections in the answer column.
Document additional information such as how and why in the "Interview Notes" column.
The formulas in hidden columns F-H will calculate the scores and update the Rating boxes and other worksheets as needed.
Once the interview is complete, go to the "Scorecard" sheet and follow instructions.
Governance
Strategy & Metrics Answer Interview Notes Rating
Is there a software security assurance program in place? Yes, it's less than a year old
Guidance: Assurance program is documented and accessible to staff.
Guidance: Assurance program has been used in recent development efforts.
Guidance: Staff receives training against assurance program and responsibilities.
1.48
Are development staff aware of future plans for the assurance program? Yes, a small percentage are/do
Guidance: Assurance program goals are documented and accessible to staff.
Guidance: Assurance program goals have been presented to staff.
SM1
Guidance: A plan has been put in place to reach those goals in a specific period of time.
Do the business stakeholders understand your organization’s risk profile? Yes, the majority of them are/do
Guidance: Organization has documented motivation behind creating a software security assurance program.
Guidance: Assurance program has been customized to align with the organization's motivation and goals.
Guidance: Worst-case scenarios for organization's application and data assets have been collected and documented.
Guidance: Scenarios, contributing factors, and mitigating factors have been reviewed with business owners and other stakeholders.
Are many of your applications and resources categorized by risk? Yes, at least half of them are/do
Guidance: A data and application risk classification system has been documented.
Guidance: Evaluation criteria have been created to apply the classification system to data and applications.
Guidance: Staff receives training in how to apply evaluation criteria to application and data assets.
Guidance: Most applications and data have been categorized using these evaluation criteria.
SM2
Are risk ratings used to tailor the required assurance activities? Yes, the majority of them are/do
Guidance: The assurance program is customized based on data and application risk classification.
Does the organization know what’s required based on risk ratings? Yes, at least half of them are/do
Guidance: Staff receives training according to documented assurance program and risk classifications.
Is per-project data for the cost of assurance activities collected? Yes, at least half of them are/do
Guidance: Baseline security costs are estimated for each project based on its assigned security assurance road map and risk category.
Guidance: Statistics are collected for spending related to security incidents or breaches.
Guidance: Actual security spending is tracked for each project.
Guidance: Actual spending vs. estimated spending is evaluated on a quarterly basis.
Guidance: Spending statistics and historical data are used to make case-by-case decisions on security expenditures.
SM3 Guidance: Security spending decisions are made on a per project basis with consideration for expense versus loss potential.
Does your organization regularly compare your security spend with that of other organizations? Yes, we did it once
Guidance: Statistics regarding similar organization's security spending is collected regularly.
Guidance: Potential cost savings from purchasing products or switching vendors for security tools are compared.
Guidance: Security cost-comparison exercises are conducted at least annually.
Are compliance requirements specifically considered by project teams? Yes, but on an ad hoc basis
0.90
Guidance: External, third-party regulatory or compliance requirements have been identified.
PC1 Guidance: A consolidated list of regulatory and compliance requirements has been mapped to security requirements.
Guidance: Control statements or responses have been created for each security requirement indicating how the organization will meet
the requirement.
Guidance: Security requirements are added to each project based on applicable regulatory and compliance standards.
Guidance: The organization researches and updates regulatory or compliance requirements biannually.
Does the organization utilize a set of policies and standards to control software development? Yes, there is a standard set
Guidance: A set of security policies has been created based on compliance drivers.
Guidance: Optional or recommended compliance items have been added to security policies.
Guidance: Requirements based on known business drivers for security have been added to security policies.
Guidance: Common or similar policies have been grouped, generalized, and rewritten to satisfy compliance and security requirements.
Guidance: Security policies do not include requirements that are too costly or difficult for project teams to comply with.
Guidance: Awareness programs have been created to advertise and spread awareness of security policies.
PC2
Are project teams able to request an audit for compliance with policies and standards? Yes, a small percentage are/do
Guidance: A process has been created for project teams to request an audit against security policies and compliance requirements.
Guidance: Internal audits are prioritized based on business risk indicators.
Guidance: Each project undergoes an audit at least biannually.
Guidance: Awareness programs have been created to advertise and spread awareness of the organization's audit process.
Guidance: Audit results are reviewed by project stakeholders including per requirement pass/fail status, impact, and remediation.
Are projects periodically audited to ensure a baseline of compliance with policies and standards? Yes, a small percentage are/do
Guidance: Compliance and security gates are established throughout the development process.
Guidance: Automated tools (code review, penetration testing, etc.) are used to assist in identifying non-compliance prior to the audit process.
Guidance: An exception approval process has been created for legacy or other specialized projects.
PC3
Does the organization systematically use audits to collect and control compliance evidence? Yes, localized to business areas
Guidance: An automated system is used to capture, organize, and display audit data and documentation.
Guidance: Access to audit data is controlled on a need-to-know basis.
Guidance: Instructions and procedures for accessing audit data are published and advertised to project groups.
Are those involved in the development process given role-specific security training and guidance? Yes, at least half of them are/do
Guidance: Role-specific application security training is given to developers, architects, QA, etc.
Guidance: Managers and requirements specifiers receive training in security requirements planning, vulnerability and incident
management, threat modeling, and misuse/abuse case design.
Guidance: Testers and auditors receive training in code review, architecture and design analysis, runtime analysis, and effective
security test planning.
Guidance: Developer training includes security design patterns, tool-specific training, threat modeling, and software assessment techniques.
EG2
Guidance: Role-specific training is provided at least annually as well as on demand based on need.
Are stakeholders able to pull in security coaches for use on projects? Yes, a small percentage are/do
Guidance: Internal or external security experts are available to project teams for consultation.
Guidance: The process for requesting these experts is advertised to project teams.
Guidance: A set of security analysts or security-savvy developers has been selected as security coaches.
Is security-related guidance centrally controlled and consistently distributed throughout the organization? Yes, teams write/run their own
Guidance: A centralized repository has been created to organize secure development information, resources, and processes.
Guidance: An approval board and change control management process is in place to control modification of information in this repository.
Guidance: A method for collaboration and communication of secure development topics has been provided.
Guidance: Content is searchable based on common factors like platform, language, library, life-cycle stage, etc.
EG3
Are developers tested to ensure a baseline skill-set for secure development practices? Yes, we did it once
Guidance: Exams are used to verify retention of security knowledge in a per training module or per role context.
Guidance: Exams are given to staff at least biannually.
Guidance: Staff are organized or ranked based on exam scores.
Guidance: Some security activities or gates require staff of a certain rank to sign off before the item is marked as complete.
Construction
Threat Assessment Answer Interview Notes Rating
Do projects in your organization consider and document likely threats? Yes, a small percentage are/do
Guidance: Likely worst-case scenarios are documented for each project based on its business risk profile.
Guidance: Attack trees or a threat model is created for each project tracing the preconditions necessary for a worst-case scenario to be realized.
Guidance: Attack trees or threat models are expanded to include potential security failures in current and historical functional requirements.
1.10
Guidance: When new features are added to a project, attack trees or threat models are updated.
TA1
Does your organization understand and document the types of attackers it faces? Yes, a small percentage are/do
Guidance: Potential external threat agents and their motivations are documented for each project.
Guidance: Potential internal threat agents, their associated roles, and potential damage are documented for each project or architecture type.
Guidance: A common set of threat agents, motivations, and other information is collected at the organization level and re-used within projects.
Do project teams regularly analyze functional requirements for likely abuses? Yes, a small percentage are/do
Guidance: Each project derives abuse-cases from its use-cases.
Guidance: As project requirements or features are added, abuse-cases are updated.
Do project teams use a method of rating threats for relative comparison? Yes, a small percentage are/do
Guidance: A documented weighting system based on documented threat agents, exploit value, technical difficulty, and other factors is used to rank threats.
TA2
Guidance: Remediation of vulnerabilities is prioritized based on the weighting system.
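As an illustration of such a weighting system, a minimal sketch follows. The factor names, weights, and threat entries are hypothetical examples (not part of SAMM); an organization would substitute its own documented factors.

```python
# Illustrative threat-weighting sketch. Factor names and weights below are
# hypothetical examples, not a SAMM-defined scheme.

# Each threat is scored 1-10 per factor; higher weighted totals rank higher.
WEIGHTS = {
    "threat_agent_skill": 0.3,  # capability of the documented threat agent
    "exploit_value": 0.4,       # value an attacker gains from exploitation
    "technical_ease": 0.3,      # inverse of technical difficulty
}

def threat_score(factors: dict) -> float:
    """Weighted score used to rank threats for relative comparison."""
    return sum(WEIGHTS[name] * factors[name] for name in WEIGHTS)

threats = {
    "SQL injection in login": {"threat_agent_skill": 4, "exploit_value": 9, "technical_ease": 8},
    "Physical theft of backup": {"threat_agent_skill": 2, "exploit_value": 7, "technical_ease": 3},
}

# Remediation is prioritized by descending score, per the guidance above.
ranked = sorted(threats, key=lambda t: threat_score(threats[t]), reverse=True)
print(ranked)
```

Whatever factors are chosen, documenting them alongside the scores keeps the relative comparison repeatable between projects.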
Are stakeholders aware of relevant threats and ratings? Yes, at least half of them are/do
Guidance: Potential threats and ratings are reviewed with project stakeholders.
Do project teams specifically consider risk from external software? Yes, the majority of them are/do
Guidance: Third-party, external libraries and code are clearly identified and documented for each project.
Guidance: Project threat models are updated based on identified threat agents and motivations for third party libraries and code.
Are the majority of the protection mechanisms and controls captured and mapped back to threats? Yes, a small percentage are/do
Guidance: An assessment for each project has been conducted to identify mitigating controls that prevent preconditions identified in attack trees or threat models.
TA3
Guidance: This assessment is updated each time new features or requirements are introduced or the attack tree is modified.
Guidance: Mitigating controls have been documented within the attack tree or threat model.
Guidance: Mitigating controls or security requirements have been added to each project to address any preconditions that still lead to a
successful attack within attack trees.
Do stakeholders review access control matrices for relevant projects? Yes, a small percentage are/do
Guidance: Users, roles, and privileges are identified in each project.
Guidance: Resources and capabilities are identified in each project.
Guidance: A matrix of roles and capabilities is documented for each project.
Guidance: As new features are introduced, the matrix documentation is updated.
SR2 Guidance: The matrix is reviewed with project stakeholders prior to release.
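A role-and-capability matrix of the kind described above can be sketched as a simple lookup structure. The roles and capabilities here are hypothetical examples for illustration only.

```python
# Minimal sketch of a project access control matrix per the guidance above.
# Roles and capabilities are hypothetical examples.

# Rows are roles; each maps to the set of capabilities granted to it.
ACCESS_MATRIX = {
    "anonymous": {"view_catalog"},
    "customer":  {"view_catalog", "place_order", "view_own_orders"},
    "operator":  {"view_catalog", "view_all_orders", "issue_refund"},
}

def is_permitted(role: str, capability: str) -> bool:
    """Look up whether a role grants a capability; unknown roles get nothing."""
    return capability in ACCESS_MATRIX.get(role, set())

# As new features are introduced, the matrix is updated and re-reviewed:
ACCESS_MATRIX["customer"].add("cancel_own_order")

print(is_permitted("customer", "cancel_own_order"))
print(is_permitted("anonymous", "issue_refund"))
```

Keeping the matrix in a reviewable, machine-checkable form makes the stakeholder review prior to release straightforward to repeat.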
Do project teams specify requirements based on feedback from other security activities? Yes, at least half of them are/do
Guidance: Additional security requirements are created based on feedback from code reviews, penetration tests, risk assessments, or
other security activities.
Do stakeholders review vendor agreements for specific security requirements? Yes, a small percentage are/do
Guidance: During the creation of third-party agreements, security requirements, activities, and processes are considered for inclusion.
Are audits performed against the security requirements specified by project teams? Yes, we did it once
SR3 Guidance: Audits are routinely performed to ensure security requirements have been specified for all functional requirements.
Guidance: Audits also verify attack trees are constructed and mitigating controls are annotated.
Guidance: A list of unfulfilled security requirements and their projected implementation date is documented.
Guidance: Security requirement audits are performed on every development iteration prior to the implementation of code.
Do you advertise shared security services with guidance for project teams? Yes, across the organization
Guidance: A list of reusable resources is collected and categorized based on the security mechanisms they fulfill (LDAP server, single
sign-on server, etc.).
Guidance: The organization has selected a set of reusable resources to standardize on.
Guidance: These resources have been thoroughly audited for security issues.
Guidance: Design guidance has been created for secure integration of each component within a project.
Guidance: Project groups receive training regarding the proper use and integration of these components.
SA2
Are project teams provided with prescriptive design patterns based on their application architecture? Yes, the standard set is integrated
Guidance: Each project is categorized based on architecture (client-server, web application, thick client, etc.).
Guidance: A set of design patterns is documented for each architecture (Risk-based authentication system, single sign-on, centralized
logging, etc.).
Guidance: Architects, senior developers, or other project stakeholders identify applicable and appropriate patterns for each project
during the design phase.
Do project teams build software from centrally-controlled platforms and frameworks? Yes, a small percentage are/do
Guidance: Reusable code components based on established design patterns and shared security services have been created for use
within projects across the organization.
Guidance: Reusable code components are regularly maintained, updated, and assessed for risk.
SA3 Are project teams audited for the use of secure architecture components? Yes, we do it every few years
Guidance: Audits include evaluation of usage of recommended frameworks, design patterns, shared security services, and reference
platforms.
Guidance: Results are used to determine if additional frameworks, resources, or guidance need to be specified as well as the quality of
guidance provided to project teams.
Verification
Design Review Answer Interview Notes Rating
Do project teams document the attack perimeter of software designs? Yes, the majority of them are/do
Guidance: Each project group creates a simplified one-page architecture diagram representing high-level modules.
Guidance: Each component in the diagram is analyzed in terms of accessibility of the interface from authorized users, anonymous users, operators, application-specific roles, etc.
1.85
Guidance: Interfaces and components with similar accessibility profiles are grouped and documented as the software attack surface.
Guidance: One-page architecture diagram is annotated with security-related functionality.
Guidance: Grouped interface designs are evaluated to determine whether security-related functionality is applied consistently.
Guidance: Architecture diagrams and attack surface analysis are updated when an application's design is altered.
DR1
Do project teams check software designs against known security risks? Yes, at least half of them are/do
Guidance: Each project group documents a list of assumptions the software relies on for safe execution.
Guidance: Each project group documents a list of security requirements for the application.
Guidance: Each project's one-page architecture diagram is evaluated for security requirements and assumptions. Missing items are
documented as findings.
Guidance: Evaluations are repeated when security requirements are added or changes to the high-level system design occur within a project.
Do project teams specifically analyze design elements for security mechanisms? Yes, at least half of them are/do
Guidance: Each interface within the high-level architecture diagram is formally inspected for security mechanisms (includes internal and
external application tiers).
Guidance: Analysis includes the following minimum categories: authentication, authorization, input validation, output encoding, error
handling, logging, cryptography, and session management.
Guidance: Each software release is required to undergo a design review.
DR2
Are project stakeholders aware of how to obtain a formal secure design review? Yes, at least half of them are/do
Guidance: A process for requesting a formal design review is created and advertised to project stakeholders.
Guidance: The design review process is centralized and requests are prioritized based on the organization's business risk profile.
Guidance: Design reviews include verification of software's attack surface, security requirements, and security mechanisms within
module interfaces.
Does the secure design review process incorporate detailed data-level analysis? Yes, a small percentage are/do
Guidance: Project teams identify details on system behavior around high-risk functionality (such as CRUD of sensitive data).
Guidance: Project teams document relevant software modules, data sources, actors, and messages that flow between data sources or
business functions.
Guidance: Utilizing the data flow diagram, project teams identify software modules that handle data or functionality with differing
sensitivity levels.
DR3 Does a minimum security baseline exist for secure design review results? Yes, the standard set is integrated
Guidance: A consistent design review program has been established.
Guidance: Criteria are created to determine whether a project passes the design review process (for example, no high-risk findings).
Guidance: Release gates are used within the development process to ensure projects cannot advance to the next step until the project
successfully completes a design review.
Guidance: A process is established for handling design review results in legacy projects, including a requirement to establish a time
frame for successfully completing the design review process.
Can project teams access automated code analysis tools to find security problems? Yes, there is a standard set
Guidance: The organization has reviewed open source, commercial, and other solutions for performing automated code reviews and
selected a solution that will best fit the organization.
Guidance: Automated code analysis has been integrated within the development process (at code check-in for example).
IR2
Do stakeholders consistently review results from code reviews? Yes, a small percentage are/do
Guidance: Project stakeholders review and accept any risks that they choose not to address.
Guidance: Project stakeholders have created a plan for addressing findings in legacy code.
Do project teams utilize automation to check code against application-specific coding standards? Yes, across the organization
Guidance: Automated code review tools are customized to perform additional API checks, verify organization coding standards, etc.
(including the addition of custom rules).
Does a minimum security baseline exist for code review results? Yes, teams write/run their own
IR3
Guidance: Each project contains a checkpoint in the development process that requires a specific level of code review results to be met
before release.
Guidance: The organization has established an exception process for legacy code, which requires a certain level of assurance to be
met within a specific time period.
Are stakeholders aware of the security test status prior to release? Yes, a small percentage are/do
Guidance: Penetration testing issues are reviewed with project stakeholders.
Guidance: Project stakeholders select issues to remediate prior to release.
Guidance: Project stakeholders set a time line for addressing identified issues or accept outstanding risks.
Do projects use automation to evaluate security test cases? Yes, at least half of them are/do
Guidance: The organization has reviewed open source, commercial, and other solutions for performing automated security testing and
selected a solution that will best fit the organization.
Guidance: Automated security testing has been integrated within the development process.
ST2
Do projects follow a consistent process to evaluate and report on security tests to stakeholders? Yes, a small percentage are/do
Guidance: Automated security testing occurs across projects on a regular, scheduled basis.
Guidance: A process has been created for reviewing security testing results with project stakeholders and remediating risk.
Are security test cases comprehensively generated for application-specific logic? Yes, a small percentage are/do
Guidance: Using automated tools, unit tests, or other similar methods, a comprehensive set of security test cases is constructed and
evaluated for each project.
Does a minimum security baseline exist for security testing? Yes, teams write/run their own
ST3 Guidance: Each project contains a checkpoint in the development process that requires a specific level of security testing results to be
met before release.
Guidance: The organization has established an exception process for handling security testing results in legacy projects, which requires
a certain level of assurance to be met within a specific time period.
Operations
Issue Management Answer Interview Notes Rating
Do projects have a point of contact for security issues or incidents? Yes, the majority of them are/do
Guidance: Each project or development group has assigned a security-savvy developer to be the point of contact for security issues.
Guidance: The organization maintains a centralized list of applications, projects, and points of contact regarding security issues.
1.93
Does your organization have an assigned security response team? Yes, it's a number of years old
Guidance: The organization has defined a centralized security response team responsible for managing incidents, vulnerability reports, remediation, and reporting.
IM1
Guidance: During an incident, the security response team provides briefings and upward communication.
Are project teams aware of their security point(s) of contact and response team(s)? Yes, the majority of them are/do
Guidance: The security response team meets with project groups at least annually to brief individuals on the incident response process.
Are project stakeholders aware of relevant security disclosures related to their software projects? Yes, at least half of them are/do
Guidance: A formal, documented process has been established for tracking, handling, and communicating incidents internally.
Are incidents inspected for root causes to generate further recommendations? Yes, at least half of them are/do
Guidance: Incident response teams investigate root causes of security failures leading to an incident or security issue.
Guidance: The root cause is used to analyze the project or software for additional potential failures.
Guidance: The root cause is compared to security requirements and existing processes to determine how to improve security assurance efforts.
Guidance: The root cause is reviewed with project stakeholders and management to determine a mitigation process and time frame.
Do projects consistently collect and report data and metrics related to incidents? Yes, a small percentage are/do
IM3 Guidance: The organization's centralized incident response process is expanded to collect and record metrics.
Guidance: Metrics such as frequency of software projects affected by incidents, system downtime and cost from loss of use, human resources taken in handling and cleanup of the incident, estimates of long-term costs such as regulatory fines or brand damage, etc. are collected.
Guidance: Past security incidents are recorded and reviewed every six months and recommendations to improve the organization or
software assurance process are made.
Is a consistent process used to apply upgrades and patches to critical dependencies? Yes, across the organization
Guidance: A documented ongoing process has been created at the organization level to consistently identify and apply security patches.
Guidance: The patch management process requires security patches to be applied within a specific time window based on risk.
Guidance: Project teams share a list of third-party components and a source for updates with the operations team.
Do projects leverage automation to check application and environment health? Yes, a small percentage are/do
Guidance: The organization has reviewed open source, commercial, and other solutions for performing automated monitoring and patch
management and has selected a solution that will best fit the organization.
EH2 Guidance: Automated monitoring and patch management tools and processes have been integrated within the organization's operational
environments.
Guidance: Project teams and the operations team document and implement application-level health checks.
Guidance: The automated monitoring and patch management tools generate alerts and a documented process for handling and
responding to these alerts has been established.
Guidance: Project teams and the operations team review configuration changes and alerts at least quarterly in order to improve
current processes.
Are stakeholders aware of options for additional tools to protect software while running in operations? Yes, the standard set is integrated
Guidance: The security team or operations team reviews optional tools for protecting software with project stakeholders.
Guidance: Appropriate solutions such as a WAF, IPS, HIDS, etc. are adopted for each project's operational environment.
Does a minimum security baseline exist for environment health (versioning, patching, etc)? Yes, localized to business areas
EH3 Guidance: Project-level audits include analysis and testing of the operational environment in which the software resides.
Guidance: Audits include verification of compliance with the organization's patch management process.
Guidance: Operational environment audits occur at least every six months.
Guidance: The organization has established an exception process for legacy operational environments, which requires a certain level of
assurance to be met within a specific time period.
Are project releases audited for appropriate operational security information? Yes, we did it once
Guidance: The audit process includes verification that projects' operational security guides are complete, contain sufficient details, and
are up-to-date.
Guidance: The organization has established an exception process for legacy projects, which requires a certain level of assurance within
the software's operational environment to be met within a specific time period.
OE3
Is code signing routinely performed on software components using a consistent process? Yes, but on an ad hoc basis
Guidance: Code signing is used to ensure and verify the authenticity of developed code.
Guidance: A code signing and key management process has been documented for the organization.
Guidance: Project teams work with security auditors to determine which components warrant including within the code signing process.
Guidance: List of included code components is reviewed and updated regularly.
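The sign-and-verify flow behind this practice can be sketched as follows. Production code signing uses asymmetric keys under a managed key management process; this standard-library sketch substitutes an HMAC shared secret purely to show the shape of the flow, and the key and artifact bytes are hypothetical.

```python
import hmac
import hashlib

# Illustrative sign/verify flow for a released component. Real code signing
# uses asymmetric keys and a PKI; an HMAC shared secret stands in here only
# to demonstrate the process. Key and artifact contents are hypothetical.

SIGNING_KEY = b"example-key-managed-per-the-documented-process"

def sign_component(component: bytes) -> str:
    """Produce a detached signature recorded alongside the release artifact."""
    return hmac.new(SIGNING_KEY, component, hashlib.sha256).hexdigest()

def verify_component(component: bytes, signature: str) -> bool:
    """Verify authenticity before deployment, using constant-time comparison."""
    expected = sign_component(component)
    return hmac.compare_digest(expected, signature)

artifact = b"built-module-bytes"
sig = sign_component(artifact)

print(verify_component(artifact, sig))              # unmodified artifact
print(verify_component(artifact + b"x", sig))       # tampering is detected
```

Which components warrant inclusion in the signing process, and how the key material is stored and rotated, follow from the organization's documented key management process.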
SAMM Assessment Scorecard: Brick Builder For Acme Brick Co
Notes:
Data in this worksheet is automatically imported from the Interview and Roadmap worksheets and will automatically update when those
worksheets change. This is mostly a read-only worksheet; changes should be made in the Interview or Roadmap worksheets.
Current Ratings:
Governance: Strategy & Metrics 1.48, Policy & Compliance 0.90, Education & Guidance 1.05
Construction (1.37): Threat Assessment 1.10, Security Requirements 1.55, Secure Architecture 1.45
Verification (1.34): Design Review 1.85, Implementation Review 1.20, Security Testing 1.12
Operations (1.53): Issue Management 1.93, Environment Hardening 1.70, Operational Enablement 1.05

SAMM Phase 1 Score (Operations 1.56):
Threat Assessment 1.20, Security Requirements 1.70, Secure Architecture 1.45, Design Review 1.85, Implementation Review 1.20, Security Testing 1.12, Issue Management 1.93, Environment Hardening 1.70, Operational Enablement 1.05

SAMM Phase 2 Score (Operations 1.66):
Threat Assessment 1.35, Security Requirements 1.70, Secure Architecture 1.70, Design Review 1.85, Implementation Review 1.35, Security Testing 1.12, Issue Management 2.08, Environment Hardening 1.70, Operational Enablement 1.20

SAMM Phase 3 Score (Operations 1.79):
Threat Assessment 1.92, Security Requirements 1.95, Secure Architecture 1.70, Design Review 2.00, Implementation Review 1.60, Security Testing 1.27, Issue Management 2.08, Environment Hardening 1.85, Operational Enablement 1.45

SAMM Phase 4 Score (Operations 1.93):
Threat Assessment 1.92, Security Requirements 1.95, Secure Architecture 1.70
1.45
Construction Secure Architecture 1.70 0.35 1.00 0.35 Verification Design Review 0.00 0.00 2.00 0.00
Issue Management 2.08 -1.00 1.35 Threat Assessment Implementation
Verification Design Review 2.00 0.75 0.50 0.75 Verification 0.00 0.00 2.25 0.00
Review
Verification Implementation Review 2.25 0.75 1.00 0.50 1.12 Verification Security Testing 0.00 0.00 1.52 0.00
1.70
Verification Security Testing 1.52 0.67 0.35 0.50 Operations Issue Management 0.00 0.00 0.00 2.33
1.35
Sec urity Testing 1.70 Securi ty R equirements
Environment
Operations Issue Management 2.33 0.83 0.75 0.75 Operations 0.00 0.00 0.00 1.85
1.85 Hardening
Operational
Operations Environment Hardening 1.85 0.75 0.35 0.75 Operations 0.00 0.00 0.00 1.60
Impl ementation R eview Secure Archi tecture Enablement
Operations Operational Enablement 1.60 0.50 0.60 0.50 Design Review SAMM Phase 4 Score
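In each phase block of the roadmap data, a practice's score is stored next to three per-level contributions that sum to it (e.g. Threat Assessment in Phase 1: 0.20 + 0.40 + 0.60 = 1.20). A minimal consistency check of that layout, using values copied from the Phase 1 rows:

```python
# Phase 1 rows from the roadmap data: practice score followed by the
# three per-maturity-level contributions recorded alongside it.
PHASE1_ROWS = {
    "Threat Assessment": (1.20, 0.20, 0.40, 0.60),
    "Security Requirements": (1.70, 1.00, 0.35, 0.35),
    "Secure Architecture": (1.45, 0.35, 0.75, 0.35),
    "Design Review": (1.85, 0.75, 0.50, 0.60),
}

def consistent(rows, tol=0.005):
    """True when every score equals the sum of its level contributions."""
    return all(abs(score - sum(levels)) < tol
               for score, *levels in rows.values())

ok = consistent(PHASE1_ROWS)  # the four Phase 1 rows above check out
```

The small tolerance absorbs floating-point noise from summing two-decimal values.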
SAMM Assessment Interview: Brick Builder For Acme Brick Co
Instructions
The questions and answers from the Interview tab are copied over automatically (and update when changed).
There are four phases of improvement by default; if you need more, you should be able to copy and paste additional phase columns.
There are hidden columns for each phase that keep track of the answer values and scoring formulas.
"Improving" answers are highlighted in green to indicate where improvements are made.
"Weakening" answers are highlighted in red to indicate where answers are lower than before.
The scores are imported into the Roadmap Chart and Scorecard worksheets and are updated automatically.
The left panes are frozen, so the projections can be scrolled to align with the questions to create "views".
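The workbook's scoring formulas live in hidden columns and are not visible in this export, so the sketch below is a hypothetical illustration of the mechanism: each multiple-choice answer maps to a numeric value, and a practice's rating averages those values onto SAMM's 0-3 scale. The ANSWER_VALUES weights are assumptions, not the spreadsheet's actual weights.

```python
# Hypothetical sketch of the hidden scoring columns. The answer-to-value
# mapping below is an illustrative assumption; the real weights are in
# hidden columns of the workbook and are not recoverable from this export.
ANSWER_VALUES = {
    "No": 0.0,
    "Yes, a small percentage are/do": 0.25,
    "Yes, at least half of them are/do": 0.5,
    "Yes, the majority of them are/do": 0.75,
    "Yes, across the organization": 1.0,
}

def practice_rating(answers, max_rating=3.0):
    """Average the answer values and scale to SAMM's 0-3 rating range."""
    if not answers:
        return 0.0
    avg = sum(ANSWER_VALUES[a] for a in answers) / len(answers)
    return round(avg * max_rating, 2)

rating = practice_rating([
    "Yes, a small percentage are/do",
    "Yes, the majority of them are/do",
])  # (0.25 + 0.75) / 2 * 3.0 = 1.5
```

Under this assumed mapping, a practice answered half "small percentage" and half "majority" rates 1.5 of 3.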
Organization:
Project:
Interview Date:
Interviewer:
Persons Interviewed:
Governance
Strategy & Metrics
Is there a software security assurance program in place?
SM1
Are development staff aware of future plans for the assurance program?
Do the business stakeholders understand your organization’s risk profile?
Policy & Compliance
Does the organization utilize a set of policies and standards to control software development?
PC2
Are project teams able to request an audit for compliance with policies and standards?
Are projects periodically audited to ensure a baseline of compliance with policies and standards?
PC3
Does the organization systematically use audits to collect and control compliance evidence?
Education & Guidance
Have developers been given high-level security awareness training?
EG1
Does each project team understand where to find secure development best-practices and guidance?
Are those involved in the development process given role-specific security training and guidance?
EG2
Are stakeholders able to pull in security coaches for use on projects?
Is security-related guidance centrally controlled and consistently distributed throughout the organization?
EG3
Are developers tested to ensure a baseline skill-set for secure development practices?
Construction
Threat Assessment
Do projects in your organization consider and document likely threats?
TA1
Does your organization understand and document the types of attackers it faces?
Do project teams regularly analyze functional requirements for likely abuses?
TA2
Do project teams use a method of rating threats for relative comparison?
Are stakeholders aware of relevant threats and ratings?
Secure Architecture
Do you advertise shared security services with guidance for project teams?
SA2
Are project teams provided with prescriptive design patterns based on their application architecture?
Verification
Design Review
Does the secure design review process incorporate detailed data-level analysis?
DR3
Does a minimum security baseline exist for secure design review results?
Implementation Review
Do project teams have review checklists based on common security related problems?
IR1
Do project teams review selected high-risk code?
Can project teams access automated code analysis tools to find security problems?
IR2
Do stakeholders consistently review results from code reviews?
Do project teams utilize automation to check code against application-specific coding standards?
IR3
Does a minimum security baseline exist for code review results?
Security Testing
Do projects specify security testing based on defined security requirements?
ST1
Is penetration testing performed on high risk projects prior to release?
Are stakeholders aware of the security test status prior to release?
Operations
Issue Management
Does the organization utilize a consistent process for incident reporting and handling?
IM2
Are project stakeholders aware of relevant security disclosures related to their software projects?
Environment Hardening
Are stakeholders aware of options for additional tools to protect software while running in operations?
EH3
Does a minimum security baseline exist for environment health (versioning, patching, etc)?
Operational Enablement
Are security notes delivered with each software release?
OE1
Are security-related alerts and error conditions documented on a per-project basis?
Organization: Acme Brick Co
Project: Brick Builder
Interview Date: 28-Feb-17
Interviewer: Steve
Persons Interviewed: Willy Thomas, Kate Smith, Joe Kats, Ars Hickory, Rick Links
Scorecard (flattened in this export). For each interview question, the scorecard records the selected answer and the computed practice rating in "Answer / Rating" column pairs, for the Current State and each of the four phase projections. The per-question alignment of answers was lost when the grid was flattened; the practice ratings that remain recoverable are:

Current State / Phase 1 Projection: Threat Assessment 1.10 / 1.20; Security Requirements 1.55 / 1.70.
Phase 2 / Phase 3 Projection: Threat Assessment 1.35 / 1.92; Security Requirements 1.70 / 1.95; Design Review 1.85 / 2.00; Implementation Review 1.35 / 1.60; Issue Management 2.08 / 2.08; Environment Hardening 1.70 / 1.85; Operational Enablement 1.20 / 1.45.
Phase 4 Projection: Issue Management 2.33; Environment Hardening 1.85; Operational Enablement 1.60.

Answer options appearing in this scorecard include: "Yes, a small percentage are/do"; "Yes, at least half of them are/do"; "Yes, the majority of them are/do"; "Yes, localized to business areas"; "Yes, across the organization"; "Yes, we did it once"; "Yes, we do it every few years"; "Yes, we do it at least annually"; "Yes, it's a number of years old"; "Yes, teams write/run their own"; "Yes, there is a standard set"; "Yes, the standard set is integrated"; "Yes, the required standard set is integrated".
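The scorecard's green/red highlighting rule described in the instructions (improving answers in green, weakening in red) can be sketched over the phase ratings. The classify helper is illustrative, not the workbook's formula; the sample ratings are Issue Management's Phase 1 through Phase 4 scores from the roadmap data.

```python
# Sketch of the "Improving"/"Weakening" highlighting rule: a rating that
# rises versus the previous phase is flagged improving (green in the
# workbook), one that falls is flagged weakening (red).
def classify(ratings):
    """Label each phase-over-phase change."""
    labels = []
    for prev, cur in zip(ratings, ratings[1:]):
        if cur > prev:
            labels.append("improving")
        elif cur < prev:
            labels.append("weakening")
        else:
            labels.append("unchanged")
    return labels

# Issue Management ratings for Phases 1 through 4.
labels = classify([1.93, 2.08, 2.08, 2.33])
```

Here the Phase 1 to Phase 2 and Phase 3 to Phase 4 steps would be highlighted green, and the flat Phase 2 to Phase 3 step left unhighlighted.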
[Roadmap charts: one chart per security practice plotting its score (0.00 to 3.00) across nine assessment points, with series for Phase 1 through Phase 4. Chart titles recoverable from this export: Policy & Compliance, Threat Assessment, Security Requirements, Secure Architecture, Design Analysis, Implementation Review, Security Testing, Issue Management, Environment Hardening, Operational Enablement. Only axis ticks and titles survived the export; the charts themselves are in the workbook.]