
Evaluating Systems

• Information Assurance
Reading Material
• Chapter 21 of Computer Security: Art and Science
• The Orange Book and the rest of the Rainbow Series
– http://nsi.org/Library/Compsec/orangebo.txt
• The common criteria
– Lists all evaluated protection profiles and products
– http://www.commoncriteriaportal.org
Outline
• Motivation for system evaluation
• Specific evaluation systems
– TCSEC/Orange Book
Trusted Computer System Evaluation Criteria (TCSEC) or the
“Orange Book”
– Interim systems
– Common Criteria
Evaluation Goals
• Oriented to purchaser/user of system
• Assurance that system operates as advertised
Evaluation Options
• Rely on vendor/developer evidence
– Self-evaluate vendor design docs, test results, etc.
– Base on reputation of vendor
• Rely on an expert
– Read product evaluations from trusted source
– Penetration testing
Formal Evaluation
• Provide a systematic framework for system
evaluation
– More consistent evaluation
– Better basis for comparing similar products
• Trusted third party system for evaluation
• Originally driven by needs of government and
military
TCSEC: 1983-1999
• Trusted Computer System Evaluation Criteria (TCSEC)
also called the Orange Book
– Specifies evaluation classes (C1, C2, B1, B2, B3, A1)
– Specifies functionality and assurance requirements for
each class
• Functional Model builds on
– BLP (mandatory labeling)
– Reference Monitors
Reference Monitor
• Reference Monitor – abstract machine that mediates
all access to objects by subjects
• Reference Validation Mechanism (RVM) –
implementation of a Reference Monitor (see the sketch after this list)
– Tamper-proof
– Well defined
– Never bypassed
– Small enough for analysis and testing
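To make the "mediates all access" idea concrete, here is a minimal sketch in Python (mine, not from the slides): a toy reference validation mechanism that combines a BLP-style "no read up" mandatory check with a discretionary ACL check. The subjects, objects, labels, and ACLs are all hypothetical.

# Minimal sketch of a reference validation mechanism (RVM).
# All names, labels, and ACLs are illustrative, not from any real system.

LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

# Hypothetical BLP-style security labels for subjects and objects (MAC).
subject_labels = {"alice": "secret", "bob": "confidential"}
object_labels = {"payroll.db": "confidential", "war_plan.doc": "top secret"}

# Hypothetical discretionary ACLs: object -> set of subjects allowed to read.
acls = {"payroll.db": {"alice"}, "war_plan.doc": {"alice", "bob"}}

def rvm_read_check(subject, obj):
    """Mediate every read request: both the mandatory (BLP 'no read up')
    check and the discretionary (ACL) check must pass."""
    mac_ok = LEVELS[subject_labels[subject]] >= LEVELS[object_labels[obj]]
    dac_ok = subject in acls.get(obj, set())
    return mac_ok and dac_ok

print(rvm_read_check("alice", "payroll.db"))   # True: label dominates and on ACL
print(rvm_read_check("bob", "war_plan.doc"))   # False: blocked by 'no read up'

The sketch shows only the mediation logic; the tamper-proof and "never bypassed" properties of a real RVM are architectural and cannot be captured in a few lines.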
Trusted Computing Base (TCB)
• Includes all protection mechanisms including HW,
firmware, and software responsible for enforcing the
security policy
• Strong boundary around the TCB is critical
– Any code trusted by an element of the TCB must be part of the
TCB too.
– If a portion of the TCB is corrupted, assume that all of the
TCB can be corrupted
TCSEC Functional Requirements
• DAC
• Object Reuse
– Sufficient clearing of objects between uses in resource pool
– E.g., zeroing pages in the memory system (see the sketch after this list)
• MAC and Labels
• Identification and Authentication
• Audit
– requirements increase at higher classes
• Trusted Path
– Non-spoofable means to interact with TCB
– Ctrl-Alt-Del in Windows
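As a toy illustration of the object-reuse requirement (my own sketch, not from the slides), the buffer pool below scrubs each buffer before handing it to the next user, so residual data cannot leak between allocations.

# Illustrative object-reuse sketch: clear storage before it is reallocated.

class BufferPool:
    def __init__(self, count, size):
        self._free = [bytearray(size) for _ in range(count)]

    def allocate(self):
        buf = self._free.pop()
        # Object reuse requirement: scrub residual contents before reuse.
        for i in range(len(buf)):
            buf[i] = 0
        return buf

    def release(self, buf):
        self._free.append(buf)   # it will be scrubbed on the next allocate()

pool = BufferPool(count=2, size=8)
b = pool.allocate()
b[:6] = b"secret"
pool.release(b)
print(pool.allocate())   # bytearray of zeros: the old contents are gone

An operating system does the same thing at a different level when it zeroes pages before giving them to a new process.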
TCSEC Assurance Requirements
• Configuration Management
– For TCB
• Trusted Distribution
– Integrity of mapping between master and installations
• System Architecture
– Small and modular
• Design Specification – varies between classes
• Verification – varies between classes
• Testing
• Product Documentation
TCSEC Classes
• D – Catch all
• C1 – Discretionary Protection
– Identification and authentication and DAC
– Minimal Assurance
• C2 – Controlled Access Protection
– Adds object reuse and auditing
– More testing requirements
– Windows NT 3.5 was evaluated at C2
TCSEC Classes
• B1 – Labeled Security Protection
– Adds MAC for some objects
– Stronger testing requirements. Informal model of the
security policy.
– Trusted Unixes tended to be B1
• B2 – Structured protection
– MAC for all objects. Additional logging. Trusted Path.
Least privilege.
– Covert channel analysis, configuration management, more
documentation, formal model of security policy
TCSEC Classes
• B3 – Security Domains
– Implements full RVM. Requirements on code modularity,
layering, simplicity.
– More stringent testing and documentation.
• A1 – Verified Protection
– Same functional requirements as B3
– Significant use of formal methods in assurance
– Honeywell’s SCOMP
TCSEC Evaluation process
• Originally controlled by government
– No fee to vendor
– Could reject an evaluation application if the product was not of
interest to the government
• Later introduced fee-based evaluation labs
• Evaluation phases
– Design analysis – no source code access
– Test analysis
– Final review
TCSEC Evaluation Issues
• Evaluating a specific configuration
– E.g., Windows NT, no applications installed, no network
– New patches, versions require re-certification
• RAMP introduced to ease re-certifications
• Long time for evaluation
– Sometimes product was obsolete before evaluation
finished
• Criteria Creep
– B1 means something more in 1999 than it did in 1989
Interim Efforts in the ’90s
• Canadian Trusted Computer Product Evaluation
Criteria (CTCPEC)
• Information Technology Security Evaluation Criteria
(ITSEC) – Western Europe
• Commercial International Security Requirements
(CISR) – AmEx and EDS
• Federal Criteria – NSA and NIST
FIPS 140
• Framework for evaluating Cryptographic Modules
• Still in Use
• Addresses
– Functionality
– Assurance
– Physical security
FIPS 140-2 Security Levels
• Security Level 1 – Uses a FIPS-approved crypto
algorithm.
• Security Level 2 – Adds physical security requirements,
e.g. Tamper-evident coatings
• Security Level 3 – Greater physical security. Protects
data even if the hardware falls into the wrong hands.
• Security Level 4 – Greatest physical security. Detects
and responds to environmental attacks and unauthorized
physical access.
Common Criteria – 1998 to today
• Pulls together international evaluation efforts
– Evaluations are mutually recognized between countries
• Three top level documents
– Common Criteria Documents
• Describe functional and assurance requirements. Defines
Evaluation Assurance Levels (EALs)
– CC Evaluation Methodology (CEM)
• More details on the evaluation. Complete through EAL5 (at least)
– Evaluation Scheme
• Country-specific rules for how CC evaluations are performed in
that country
• Directed by NIST in US
CC Terminology
• Target of Evaluation (TOE)
– The product being evaluated
• TOE Security Policy (TSP)
– Rules that regulate how assets are managed, protected,
and distributed in a product
• TOE Security Functions (TSF)
– Implementation of the TSP
– Generalization of the TCB
Protection Profile (PP)
• Profile that describes the security requirements for a
class of products
– List of evaluated PP’s
– http://www.commoncriteriaportal.org/pp.html
• Replaces the fixed set of classes from TCSEC
• ISSO created some initial profiles to match TCSEC
classes
– Controlled Access Protection Profile (CAPP) corresponds to
C2
– Labeled Security Protection Profile (LSPP) corresponds to
B1
Product evaluation
• Define a security target (ST)
– May leverage an evaluated protection profile
• Evaluated with respect to the ST
CC Functional Requirements
• Defined in a taxonomy (see the sketch after this list)
– 11 top-level classes
• E.g., FAU – Security audit and FDP – User Data Protection
– Each class divided into families
• E.g., FDP_ACC – Access control policy
– Each family divided into components
• E.g., FDP_ACC.2 – Complete access control
– Each component contains requirements and dependencies
on other requirements
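The class/family/component structure is easy to picture as nested data. Below is a small sketch (my own) using only the identifiers named above; the dependency shown for FDP_ACC.2 is the usual one on FDP_ACF.1, but treat the details as illustrative rather than a quotation from the CC catalogue.

# Illustrative fragment of the CC functional-requirement taxonomy:
# class -> family -> component, using the examples from this slide.
cc_functional = {
    "FAU": {                      # class: Security audit
        "name": "Security audit",
        "families": {},
    },
    "FDP": {                      # class: User data protection
        "name": "User data protection",
        "families": {
            "FDP_ACC": {          # family: Access control policy
                "name": "Access control policy",
                "components": {
                    "FDP_ACC.2": {
                        "name": "Complete access control",
                        # components may depend on other components
                        "dependencies": ["FDP_ACF.1"],
                    },
                },
            },
        },
    },
}

comp = cc_functional["FDP"]["families"]["FDP_ACC"]["components"]["FDP_ACC.2"]
print(comp["name"], comp["dependencies"])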
CC Assurance Requirements
• Similar class, family, component taxonomy
• Eight product-oriented assurance classes
– ACM – Configuration Management
– ADO – Delivery and Operation
– ADV – Development
– AGD – Guidance Documentation
– ALC – Life Cycle
– ATE – Tests
– AVA – Vulnerability Analysis
– AMA – Maintenance of Assurance
Evaluation Assurance Levels
• 7 fixed EALs
– EAL1 – Functionality Tested
– EAL2 – Structurally Tested
– EAL3 – Methodically tested and checked
• Analogous to C2
– EAL4 – Methodically Designed, Tested, and Reviewed
– EAL5 – Semiformally Designed and Tested
– EAL6 – Semiformally Verified Design and Tested
– EAL7 – Formally Verified Design and Tested
CC Evaluation Process in US
• NIST provides accreditation of third party evaluation
labs
– Vendor pays lab
– Lab works with oversight board
• Evaluate both PP’s and Products
• List of evaluated products
– http://www.commoncriteriaportal.org/products.html
Certifying Process
• Gain assurance from knowledge of developers process
– ISO 9000
– SEI's Capability Maturity Model (CMM)
– System Security Engineering Capability Maturity Model (SSE-
CMM)
• http://www.sse-cmm.org
System Security Engineering
Capability Maturity Model
• SSE-CMM - http://www.sse-cmm.org
– Based on SEI’s SE-CMM
• Divide software development into process areas
(which are further divided into processes)
– E.g., Assess Threat, Coordinate Security, Assess impact
• Plus some process areas from base SE-CMM
– E.g., Ensure Quality, Plan Technical Effort
Capability Maturity Levels
• An organization is evaluated at a maturity level for
these process areas and processes
1. Performed informally
2. Planned and tracked
3. Well-defined
4. Quantitatively controlled
5. Continuously improving
Key Points
• Evaluation for the benefit of the customer
• Product Evaluations
– Functional Requirements
– Assurance Requirements
• Process Evaluation
Standardization and Security Criteria:
Security Evaluation of Computer Products
Guide to Computer Network Security
Introduction
• Buying a computer product is not easy because of the
complexity of computer products to the ordinary person.
• One cannot always rely on the words of the manufacturers
and product vendors to ascertain the suitability and reliability
of the products.
• This is currently the case in both computer hardware and
software products.
• It is a new computer security problem all computer product
buyers must grapple with and computer network managers
must try to mitigate as they acquire new computer products.
• There are several approaches, including standardization and
security evaluation of products.
Product Standardization
• A standard is a document that establishes uniform
engineering or technical specifications, criteria,
methods, processes, or practices. Some standards
are mandatory while others are voluntary.
• Standardization is then a process of agreeing on
these standards.
Product Standardization
• The standardization process consists of several stages
through which the product specifications must
undergo.
• First the specifications undergo a period of development
and several iterations of review by the interested
engineering or technical community and the revisions are
made based on members' experiences. These revisions are
then adopted by the Steering Committee as draft
standards. The goals of this process are to create standards
that:
• are technically excellent;
• have prior implementation and testing;
• have clear, concise, and easily understood documentation;
• foster openness and fairness.
Need for Standardization of
(Security) Products
• Products and indeed computer products are produced by many
different companies with varying technical and financial capabilities
based on different technical design philosophies.
• The interface specifications for products meant to interconnect must
be compatible.
• Standardization reduces conflicts in the interface specifications.
Security Evaluations
• Buyers of computer products cannot always rely on
the words of the manufacturers and those of the
product vendors to ascertain the suitability and
reliability of the products
• The security evaluation gives the buyer a level of
security assurance that the product meets the
manufacturer’s stated claims and also meets the
buyer’s expectations
• The process of security evaluation, based on criteria,
consists of a series of tests based on a set of levels
where each level may test for a specific set of
standards.
• The process itself starts by establishing the following:
• Purpose
• Criteria
• Structure/Elements
• Outcome/benefit
• Purpose of Evaluation
• Based on the Orange Book, a security assessment of a computer
product is done for:
• Certification – to certify that a given product meets the stated security
criteria and, therefore, is suitable for a stated application. Currently, there
are a variety of security certifying bodies of various computer products.
This independent evaluation provides the buyer of the product added
confidence in the product.
• Accreditation – to decide whether a given computer product, usually
certified, meets stated criteria for and is suitable to be used in a given
application. Again, there are currently several firms that offer
accreditation to students after they use, and are examined on their
proficiency in the use of, a certified product.
• Evaluation - to assess whether the product meets the security
requirements and criteria for the stated security properties as claimed.
• Potential market benefit, if any, for the product. If the product passes
certification, it may have big market potential.
• Criteria
• Security evaluation criteria are a collection of security standards that
define several degrees of rigor acceptable at each testing level of
security in the certification of a computer product.
• Criteria also may define the formal requirements the product needs to
meet at each Assurance Level. Each security criteria consists of several
Assurance Levels with specific security categories in each level.
• Before any product evaluation is done, the product evaluator must
state the criteria to be used in the process in order to produce the
desired result. By stating the criteria, the evaluator directly states the
Assurance Levels and categories in each Assurance Level that the
product must meet. The result of a product evaluation is the
statement whether the product under review meets the stated
Assurance Levels in each criteria category.
• Process of Evaluation
• The evaluation of a product can take one of the following
directions:
• Product-oriented - which is an investigative process to thoroughly
examine and test every stated security criterion and determine to
what extent the product meets these stated criteria in a variety of
situations.
• Process-oriented – which is an audit process that assesses the
developmental process of the product and the documentation
done along the way, looking for security loopholes and other
security vulnerabilities.
• Structure of Evaluation
• The structure of an effective evaluation process, whether
product-oriented or process-oriented, must consider the
following items:
• Functionality - because acceptance of a computer security
product depends on what and how much it can do. If the product
does not have enough functionality, and in fact if it does not have
the needed functionalities, then it is of no value.
• Effectiveness - after assuring that the product has enough
functionalities to meet the needs of the buyer, the next key
question is always whether the product meets the effectiveness
threshold set by the buyer in all functionality areas. If the product
has all the needed functionalities but they are not effective
enough, then the product cannot guarantee the needed security
and, therefore, the product is of no value to the buyer.
• Assurance – to give the buyer enough confidence in the product,
the buyer must be given an assurance, a guarantee, that the
product will meet nearly all, if not exceed, the minimum stated
security requirements.
• Outcome/Benefits
• The goal of any product producer and security evaluator is
to have a product that gives the buyer the best outcome
and benefits
Computer Products Evaluation Standards
• Among the many standards organizations that developed the
most common standards used by the computer industry today
are the following:
• American National Standards Institute (ANSI)
• British Standards Institute (BSI)
• Institute of Electrical and Electronics Engineers Standards Association
(IEEE-SA)
• International Information System Security Certification Consortium
(ISC)2
• International Organization for Standardization (ISO)
• Internet Architecture Board (IAB)
• National Institute of Standards and Technology (NIST)
• National Security Agency (NSA)
• Organization for the Advancement of Structured Information
standards (OASIS)
• Underwriters Laboratories (UL)
• World Wide Web Consortium (W3C)
Major Evaluation Criteria
• The Orange Book
• Most of the security criteria and standards in product security evaluation have their
basis in The Trusted Computer System Evaluation Criteria (TCSEC), the first collection of
standards used to grade or rate the security of computer system products. The TCSEC
has come to be a standard commonly referred to as "the Orange Book" because of its
orange cover. The criteria were developed with three objectives in mind:
• to provide users with a yardstick with which to assess the degree of trust that can
be placed in computer systems for the secure processing of classified or other
sensitive information;
• to provide guidance to manufacturers as to what to build into their new, widely-
available trusted commercial products in order to satisfy trust requirements for
sensitive applications; and
• to provide a basis for specifying security requirements in acquisition specifications
• The criteria also address two types of requirements:
• specific security feature requirements
• assurance requirements.
• The U.S. Federal Criteria
• The U.S. Federal Criteria, drafted in the early 1990s, were
meant to be a replacement for the old TCSEC criteria.
However, these criteria were never approved; events
overran them when the international criteria effort used
some of them in developing the ISO-based Common
Criteria (CC). Many of their ideas were
incorporated in the Common Criteria.
• The Information Technology Security Evaluation Criteria (ITSEC)
• While the U.S. Orange Book criteria were developed in 1983, the
Europeans did not define unified evaluation criteria until the
1980s when the United Kingdom, Germany, France and the
Netherlands harmonized their national criteria into a European
Information Security Evaluation Criteria (ITSEC). Since then, they have
been updated and the current issue is Version 1.2, published in 1991
followed two years later by its user manual, the IT Security Evaluation
Manual (ITSEM), which specifies the methodology to be followed when
carrying out ITSEC evaluations. ITSEC was developed because the
Europeans thought that the Orange Book was too rigid. ITSEC was
meant to provide a framework for security evaluations that could
accommodate new and future security requirements. It puts much more
emphasis on integrity and availability.
• The Trusted Network Interpretation (TNI): The Red
Book
• The Trusted Network Interpretation (TNI) of the TCSEC, also
referred to as "The Red Book," is a restating of the
requirements of the TCSEC in a network context. It
attempted to address network security issues. It is seen by
many as a link between the Orange Book and the new criteria that
came after. Some of the shortfalls of the Orange Book that
the Red Book tries to address include the distinction
between two types of computer networks:
• Networks of independent components with different jurisdictions
and management policies
• Centralized networks with single accreditation authority and
policy.
• The Common Criteria (CC)
• The Common Criteria (CC) occasionally, though incorrectly,
referred to as the Harmonized Criteria, is a multinational
successor to the TCSEC and ITSEC that combines the best
aspects of ITSEC, TCSEC, CTCPEC (Canadian Criteria), and
the U.S. Federal Criteria (FC) into the Common Criteria
for Information Technology Security Evaluation. The CC was
designed to be an internationally accepted set of criteria in
the form of an International Organization for Standardization (ISO)
standard.
Does Evaluation Mean Security?

• The evaluation of a product against a standard or criteria
does not mean that the product is assured of security. No
evaluation of any product can guarantee such security.
However, an evaluated product can demonstrate, through the
evaluating criteria, that it does have certain security
features and parameters to counter known threats.
• The development of new security standards and criteria will
no doubt continue to result in better ways of security
evaluation and certification of computer products and will,
therefore, enhance computer systems’ security.
Security Evaluation
Contents
• Introduction
• The Orange Book
• TNI-The Trusted Network Interpretation
• Information Technology Security Evaluation
Criteria
• The Common Criteria
• Security Analysis
What is an Evaluation?
• Independent Verification and Validation (IV&V) by an accredited and
competent Trusted Third Party
• Provides a basis for international Certification against specific formal
standards (i.e. CC) by national authorities
Evaluation Process
• (Diagram) Assurance techniques produce independent evaluations, which
provide formal evidence of assurance, giving information asset owners
confidence that privacy requirements are properly managed to protect
privacy rights.
The target of the evaluation
• Products, e.g., operating systems, which will be used
in a variety of applications and have to meet the
generic security requirements.
• Systems, i.e., a collection of products assembled to
meet the specific requirements of a given
application.
The purpose of the evaluation
• Evaluation: assessing whether a product has the security properties
claimed for it.
• Certification: assessing whether a product is suitable for a given
application.
• Accreditation: deciding that a product will be used in a given
application.
The method of the evaluation
• A method must prevent situations in which:
• The product is later found to contain a serious flaw
• Different evaluations of the same product disagree in
their assessment
• Product-Oriented:
• Examine and test the product.
• Different evaluations may give different results.
• Process-Oriented:
• Look at the documentation and the process of product
development.
• Easier to achieve repeatable results, but may not be very
useful
The structure of the evaluation criteria
• Functionality:
• The security features of a system, e.g., DAC (discretionary), MAC (mandatory),
authentication, auditing
• Effectiveness:
• Are the mechanisms used appropriate for the given security requirements?
• Assurance:
• The thoroughness of the evaluation
Organizations of the evaluation process
• Government agency: backs the evaluation process and issues the
certification
• Accredited private enterprises: enforce the consistency of evaluations
(repeatability and reproducibility)
What Do CC Evaluations Give Us? –
Benefits
• Confidence & Trust in privacy and security characteristics of products and the
processes used to develop and support them (full product life cycle)
• Build official assurance arguments
• Prove technologies are indeed privacy enhancing as claimed
• formal, independently verifiable and repeatable methods
• Provide basis for international certification
• Provide Certification Report
• Differentiate products
• Formally support demonstrable due diligence/care
The costs of evaluation
• Costs:
• Fee paid to the evaluation lab
• Time to collect the required evidence
• Time and money for training evaluators
• Time and effort liaising with the evaluation team
The Orange Book
• Trusted Computer System Evaluation Criteria (1985)
• A yardstick for users to assess the degree of trust that can be placed in
computer security systems;
• Guidance for manufacturers of computer security systems;
• A basis for specifying security requirements when acquiring a computer
security system
Security and evaluation categories
• Security policy:
• mandatory and discretionary access control policies expressed in terms of
subjects and objects
• Marking of objects:
• Labels specify the sensitivity of objects
• Identification of subjects:
• Individual subjects must be identified and authenticated
• Accountability:
• Audit logs of security-relevant events have to be kept (see the sketch
after this list).
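The accountability bullet is easy to make concrete. Below is a minimal sketch (mine, not from the slides) of an audit trail for security-relevant events, using Python's standard logging module; the field names and events are hypothetical.

# Minimal sketch of audit logging for security-relevant events.
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(filename="audit.log", level=logging.INFO, format="%(message)s")
audit = logging.getLogger("audit")

def audit_event(subject, action, obj, outcome):
    """Append one security-relevant event to the audit trail."""
    record = {
        "time": datetime.now(timezone.utc).isoformat(),
        "subject": subject,    # the identified and authenticated individual
        "action": action,      # e.g. read, write, login
        "object": obj,
        "outcome": outcome,    # granted / denied
    }
    audit.info(json.dumps(record))

audit_event("alice", "read", "payroll.db", "granted")
audit_event("bob", "read", "war_plan.doc", "denied")

Each record ties a security-relevant action back to an identified subject, which is exactly what the accountability requirement asks for.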
Security and evaluation categories (Cont’d)
• Assurance:
• Operational: architecture
• Life cycle: design, test, and configuration management
• Documentation:
• Required by system managers, users and evaluators
• Continuous protection:
• Security mechanisms cannot be tampered with.
Four security divisions
• D: Minimal protection
• C: Discretionary protection
• B: Mandatory protection
• A: Verified protection
Orange Book Ratings
• D: Minimal Protection
• Did not qualify for a higher class
• C1: Discretionary Security Protection
• Resources protected by ACLs, memory overwrites prevented
• C2: Controlled Access Protection
• Access control at user level, clear memory when released
• B1: Labeled Security Protection
• Users, files, processes, etc, must be labeled
• B2: Structured Protection
• Big step, covert channels, secure kernel, etc.
• B3: Security Domains
• Auditing, secure crash
• A1: Verified Design
• Same requirements, more rigorous implementation
TNI-The Trusted Network
Interpretation-The Red Book
• Two kinds of networks
• Networks of independent components, with different jurisdictions, policies,
management, etc.
• Centralized networks with single accredited authority, policy and network
trusted computing base
• The Red Book only considers the second type.
• The vulnerability of the communication paths
• Concurrent and asynchronous operation of the network components
Red book Policy
• Security policies deal with secrecy and integrity.
• Node names as DAC group identifiers in C1
• Audit trails should log the user of cryptographic keys.
• In the red book, integrity refers to
• The protection of data and labels against unauthorized
modification
• The correctness of message transmission, and authentication of
the source and destination of a message (see the sketch after this list).
• Labels indicate whether an object had ever been
transmitted between nodes
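One standard way to obtain the correctness of message transmission and the authentication of its source is a keyed message authentication code. The sketch below is mine, not from the Red Book; it uses Python's hmac module and assumes a hypothetical pre-shared key between the two nodes.

# Sketch: message integrity and source authentication with an HMAC.
import hmac
import hashlib

shared_key = b"pre-shared key known only to the two nodes"   # hypothetical

def protect(message):
    """Sender attaches a MAC computed over the message."""
    tag = hmac.new(shared_key, message, hashlib.sha256).hexdigest()
    return message, tag

def verify(message, tag):
    """Receiver recomputes the MAC; a mismatch means tampering or a forged source."""
    expected = hmac.new(shared_key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

msg, tag = protect(b"label: SECRET; payload: status report")
print(verify(msg, tag))                  # True: unmodified, sent by a key holder
print(verify(msg + b" tampered", tag))   # False: integrity violation detected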
Other Security Services in the Red Book
• Describe Services
• Functionality
• Strength: how well it is expected to meet its objective
• Assurance: derived from theory, testing, SE practice, validation and
verification.
• Rating
• None
• Minimum (C1)
• Fair (C2)
• Good (B2)
• Some services are rated only as “not offered” or “present”
Services in the Red Book
• Communication integrity
– Authentication
– Communication field integrity
– Non-repudiation
• Denial of service
– Continuity of operation
– Protocol-based protection
– Network management
• Compromise protection
– Data confidentiality
– Traffic confidentiality
Windows NT security rating
• Windows NT is only secure for such purposes (e.g., “C2 certified”) if you:
• Run the particular Compaq or Digital hardware models specified by NIST,
• Run the particular version of Windows NT (3.50) specified by NIST,
• Remove the floppy drive from the computer,
• Remove network connectivity, and
• Configure Windows NT as specified by NIST.
UNIX security rating
• The Unix system is only as secure as the C1 criterion
• provides only discretionary security protection (DSP)
against browsers or non-programmer users
• AT&T, Gould and Honeywell made vital changes to the
kernel and file system in order to produce a C2 rated
Unix operating system.
• Have to sacrifice some of the portability of the Unix
system. It is hoped that in the near future a Unix system
with an A1 classification will be realized, though not at
the expense of losing its valued portability.
• http://secinf.net/unix_security/Unix_System_Security_Issues.html
ITSEC: Information Technology Security Evaluation Criteria

• Harmonized European criteria (1991) that refer to:
• Effectiveness: how well a system is suited for countering the threats
envisaged.
• Correctness: assurance aspects relating to the development and
operation of a system.
The Evaluation Process
• TOE (Target of Evaluation)
• An IT system, part of a system or product that has been
identified as requiring security evaluation; i.e., that which is
being evaluated.
• A Target of Evaluation (TOE) is the specific IT product or
system that is subject to evaluation.
• It is particularly relevant to, and part of the standard terms
within, Common Criteria and ITSEC.
• A Security Target (ST) contains the IT security
objectives and requirements as pertaining to a
specific target of evaluation with the definition of its
functional and assurance measures.
• http://www.itsecurity.com/papers/border.htm
Security target
• All aspects of the TOE that are relevant for evaluation
• Security objectives
• Statements about the system environment
• Assumptions about the TOE environment
• Security functions
• Rationale for security functions
• Required security mechanisms
• Required evaluation level
• Claimed rating of the minimum strength of mechanisms
• http://www.rycombe.com/itsec.htm
• Definition (Matt Bishop, 2003):
• A security target is a set of security requirements and specifications to
be used as the basis for evaluation of an identified product or system.
Security Functionality
• Security functionality description:
• Security objectives:
• Why is the functionality wanted?
• Security functions:
• What is actually done?
• Security mechanisms:
• How is it done?
Security functions in ITSEC
• Identification and authentication (see the sketch after this list)
• Access control
• Accountability: record the exercise of rights
• Audit: detect and investigate events that might
represent threats to security
• Object reuse
• Accuracy: correctness and consistency of data
• Reliability: consistency and availability of service
• Data exchange: referring to the International
standard ISO 7498-2
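A minimal sketch of the first function in this list, identification and authentication, is shown below (my own, using salted password hashing with PBKDF2 from Python's standard library); the in-memory user table and parameters are hypothetical.

# Sketch of identification and authentication with salted password hashing.
import hashlib
import hmac
import os

_users = {}   # username -> (salt, derived key); stand-in for a password file

def enroll(username, password):
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    _users[username] = (salt, key)

def authenticate(username, password):
    """Identification (claimed username) plus authentication (password check)."""
    if username not in _users:
        return False
    salt, stored = _users[username]
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, stored)

enroll("alice", "correct horse battery staple")
print(authenticate("alice", "correct horse battery staple"))   # True
print(authenticate("alice", "guess"))                          # False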
Security rating in ITSEC
• F1: C1 Discretionary security protection
• F2: C2 Controlled Access Protection
• F3: B1 Labeled Security Protection
• F4: B2 Structured Protection
• F5: B3 & A1 Security domain and verified design
• F6: high integrity
• F7: high availability
• F8: data integrity during communication
• F9: for high confidentiality (cryptographic devices)
• F10: for networks with high demands on confidentiality and integrity
Assurance of Effectiveness
• An assessment of effectiveness should examine
• Suitability of functionality
• Binding of functionality (compatibility)
• Strength of mechanism
• Ease of use
• Assessment of security vulnerabilities within the
construction of the TOE, e.g., ways of bypassing or
corrupting security enforcing functions
• Assessment of security vulnerabilities within the
operation of the TOE.
Assurance of Correctness
• Seven levels E0-E6 specify the list of documents
that have to be provided by the sponsor and the
actions to be performed by the evaluator.
• Development Process:
• Following the stages of a top-down methodology, security
requirements, architectural design, detailed design, and
implementation are considered.
• Development evaluation:
• includes configuration control and, from class E2 upwards,
developer security, e.g., the confidentiality of documents.
• Operation:
• Refers to operational documentation, including delivery,
configuration, start-up, and operation.
Seven Evaluation classes
• E0: fail
• E1: a security target and an informal description of the target
• E2: + informal description of detailed design, configuration
control and a controlled distribution process
• E3: + a detailed design and the source code corresponding to
the security functions shall be provided
• E4: formal model of the security policy; rigorous approach and
notation for architectural and detailed design, vulnerability
analysis
• E5: close correspondence between detailed design and source
code, vulnerability analysis based on source code
• E6: formal description of the security architecture of the TOE,
consistent formal model of security policy, possible to relate
portions of the executable form of TOE to the source code
Correspondence between Orange Book and ITSEC

OB ITSEC
D E0
C1 F1+E1
C2 F2+E2
B1 F3+E3
B2 F4+E4
B3 F5+E5
A1 F5+E6
Common Criteria ISO 15408
• International ISO IT security standard for formally specifying IT
security requirements and how these are to be independently
evaluated and tested so products may be formally certified as being
trustworthy (first version 1996)
• 3-Part Standard, plus evaluation methodology
• http://www.corsec.com/ccc_faq.php
CC Evaluations Involve:
• ANALYSIS
• Product Documentation
• Product Design (Security & Privacy Focus)
• Development Processes & Procedures
• Operation & Administration Guidance and Procedures
• Vulnerability Assessments
• TESTING
• Independent & Witnessed
• Fully Documented & Repeatable
• REPORTS
• Lead to International Certification
Types of Common Criteria Evaluations
• Categories of evaluations:
– Protection Profile
– Security Target (typically as the first step in an EAL)
– Evaluation Assurance Levels (EALs)
Scope
• Interviews
• Full Documentation Review
• Independent Testing
• Witness of Developer Testing
• Observation Reports When Required
• Deliverables:
• Security/Privacy Target or Protection Profile
• Evaluation Technical Report
• Certification Report (published by CSE, and recognized
by NSA and other Certification Bodies)
Protection Profiles
• A Protection Profile (PP) is an implementation-independent
statement of security requirements that is shown to address
threats that exist in a specified environment.
• A PP would be appropriate in the following cases:
• A consumer group wishes to specify security requirements for an
application type
• A government wishes to specify security requirements for a class of
security products
• An organization wishes to purchase an IT system to address its security
requirements
• A certified protection profile is one that a recognized
Certification Body asserts as having been evaluated by a
laboratory competent in the field of IT security evaluation to
the requirements of the Common Criteria and Common
Methodology for Information Technology Security Evaluation.
The evaluation process
• (Diagram) Develop Security Target (vendor or consultant) → Documentation
preparation (vendor, consultant, or lab) → Conduct evaluation (lab) →
NIAP CCEVS issues certificate
• Work not necessarily performed by the CCTL:
• Documentation preparation
• Writing the Security Target
• Other consulting
• Evaluations must be performed by lab personnel
• CCTL: Common Criteria Testing Laboratory
• CCEVS: Common Criteria Evaluation and Validation Scheme
• The National Information Assurance Partnership (NIAP) is the
governing body for all CCTLs in the U.S.
Required evaluation materials
• Security Target
• TOE (target of evaluation)
• Configuration Management documentation
• Functionality Specification
• High and low level design documentation
• User and Administrator’s guides
• Life-cycle documentation
• Development tool documentation
• Security Policy model
• Correspondence analyses
• Installation and start-up procedures
• Delivery procedures
Steps in the evaluation process
• (Chart) Relative evaluation effort per task (input task, configuration
management, development, life-cycle support, testing, output task) for
EAL 2 versus EAL 4
Results of the evaluation process
• Outcome of Common Criteria testing: a Validation Certificate
• In the U.S. this follows approval of lab test results
• Public posting of the ST, validation report, and certificate
How the Process Works
1. Privacy (and security) requirements for a
technology and associated claims are precisely
specified using the CC
2. Technology is built, documented and tested to
these requirements
3. Technology is submitted to nationally accredited
labs for evaluation against the standards
4. Evaluation is conducted under the oversight of
national authority
Process (Continued)
5. Once vendor claims are proven, national authority confers
certification and publishes a Certification Report
6. Results are internationally recognized under a Mutual Recognition
Arrangement
Evaluation assurance levels (EAL)
• To meet the great variation in required levels of security within and
between both government and commercial interests, there are seven
levels of evaluation (EAL-1 through EAL-7).
• Only the first four levels can be evaluated by commercial laboratories.
EAL (Cont’d)
• EAL-1 examines the product and its documentation for
conformity, establishing that the Target does what its
documentation claims.
• EAL-2 tests the structure of the product through an
evaluation, which includes the product’s design history and
testing.
• EAL-3 evaluates a product in design stage, with
independent verification of the developer’s testing results,
and evaluates the developer’s checks for vulnerabilities, the
development environmental controls, and the Target’s
configuration management.
• EAL-4 is an even more in-depth analysis of the
development and implementation of the Target and may
require more significant security engineering costs.
EAL (Cont’d)
• EALs 5-7 require even more formality in the design process and
implementation, analysis of the Target’s ability to handle attacks and
prevent covert channels, for products in high-risk environments.
• In the United States, evaluation to EALs 5-7 must be done by the
National Security Agency (NSA) for the U.S. government.
Correspondences
OB ITSEC CC
D E0 NA
NA NA EAL1
C1 F1+E1 EAL2
C2 F2+E2 EAL3
B1 F3+E3 EAL4
B2 F4+E4 EAL5
B3 F5+E5 EAL6
A1 F5+E6 EAL7
International evaluations history
• TCSEC (1983)
• Trusted Computer System Evaluation
Criteria (U.S.)
• ITSEC (1991)
• Information Technology Security
Evaluation Criteria
(Europe)
• CTCPEC (1993)
• Canadian Trusted Computer Product
Evaluation Criteria (CTCPEC)
International evaluations history
• (Diagram) Lineage of evaluation criteria: TCSEC – U.S. (Orange Book, 1985);
U.K. Confidence Levels (1989); German Criteria; French Criteria → ITSEC (1991);
Canadian Criteria (1993); U.S. Federal Criteria draft (1993) → Common Criteria
V1 (1996), V2 (1998)
• www.commoncriteria.org
Common Criteria participating countries
• Certificate-producing countries: Australia, New Zealand, Canada, France,
Germany, United Kingdom, United States
• Certificate-consuming countries: Austria, Finland, Greece, Israel, Italy,
Netherlands, Norway, Spain, Sweden
Security Analysis
• Phases:
• Identification of the system and its assets
• Valuation of the assets
• security levels
• Identification of vulnerabilities and threats
• Valuation of vulnerabilities and threats
• Assessment of risks on assets (see the sketch after this list)
• depending on security levels and misuse likelihoods
• stop if all risks are bearable
• Planning and design of countermeasures
• Analysis of the extended system
• countermeasures are also vulnerable
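A toy sketch (mine, not from the slides) of the valuation and risk-assessment steps above: each risk is scored as asset value times misuse likelihood, and countermeasures are planned only for risks above a bearable threshold. All numbers are hypothetical.

# Toy risk assessment: risk = asset value x misuse likelihood.

assets = {"customer database": 9, "public web page": 2}        # valuation (0-10)
threats = {                                                     # likelihood (0-1)
    ("customer database", "SQL injection"): 0.4,
    ("customer database", "insider misuse"): 0.2,
    ("public web page", "defacement"): 0.5,
}
BEARABLE = 1.5   # hypothetical threshold for a bearable risk

def assess(assets, threats):
    risks = {(a, t): assets[a] * p for (a, t), p in threats.items()}
    unbearable = {k: r for k, r in risks.items() if r > BEARABLE}
    return risks, unbearable

risks, todo = assess(assets, threats)
for (asset, threat), score in sorted(risks.items(), key=lambda kv: -kv[1]):
    action = "plan countermeasure" if (asset, threat) in todo else "bearable"
    print(f"{asset} / {threat}: risk={score:.1f} ({action})")

# After countermeasures are added, the extended system must be re-analysed,
# because the countermeasures themselves introduce new vulnerabilities.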
Security Analysis
• Common Criteria Trust Concept:
• Owner of an asset has to trust countermeasures built up by audits
• concept is insufficient since it does not concentrate on parts of the audited system!
• Potentially Trusted System Parts:
• Principals with access to an asset
• all principals may be benevolent
• Asset itself
• free of vulnerabilities
• Countermeasures
• sufficient protection
• immune against attacks on itself
• Reduction of the analysis process by considering trust in system parts
IS 2150 / TEL 2810
CC Evaluation, Risk Management, Legal
Issues, Physical Security
