
Introduction to Computer Security

1. Ethical Issues
2. State of the Art
3. Fundamentals of Computer Security
4. Security Engineering: Lifecycle and Principles

1. Ethical Issues
2. State of the Art


CSI/FBI Annual Survey for 2003 (sample of 530 companies)

- Total annual financial losses (only 251 of the 530 companies reported their losses, i.e. 47%): $201,797,340
- Detected attacks during the last 12 months: 92%
- Acknowledged financial losses due to security breaches: 75%
- Use a mixture of prevention, detection, and reaction technologies: 99%
- Use prevention technologies such as firewalls, access control, and physical security: 90%
- Also use intrusion detection systems: 75%

Popular Security Technologies

- Many organizations address security from three different perspectives: prevention, detection, and reaction.
- Prevention technologies include firewalls, encryption, access control, physical security, etc.
- Detection technologies include intrusion detection systems (IDS), digital watermarking, etc.
- Reaction technologies include forensics systems, trace-back tools, etc.

Classes of Security Threats

- System modification: may involve intrusion into the system itself.
- Invasion of privacy: involves disclosing information about a user or host machine that should not be publicized.
- Denial of service: makes system resources unavailable.
- Antagonism: attacks that merely antagonize or annoy a user.

3. Fundamentals of Computer Security


What is Computer Security?

From the dictionary: safety and freedom from worry when using computers.
More technically:
- confidentiality, integrity, and availability
- identification and authentication, access control, audit, and assurance
In fact, security is what you get if all of the above technology works as advertised.
The main thing the computer industry does to provide safety is called protection. What it does to provide freedom from worry is called assurance.

Protection

Provided by a set of security services (countermeasures), each designed to prevent a specific kind of bad thing (threat) from happening.
Example: a file system access control mechanism.
Three kinds of protection: authorization, accountability, and availability.

Authorization

Protects the system against attempts to break the rules.

[Diagram: Users -> The Rules -> Protected Resources]

The rules generally say things like (a toy check of these rules is sketched below):

- Anyone can read unclassified data.
- No one outside the company can read proprietary data.
- No one inside the company can read confidential data without first demonstrating a need to know.
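
As an illustration, here is a minimal Python sketch of an access control check that encodes rules of this kind. The classification labels, the Principal record, and the may_read function are invented for this example; they are not part of any particular system.

    from dataclasses import dataclass, field

    # Hypothetical data classifications used by the example rules.
    UNCLASSIFIED, PROPRIETARY, CONFIDENTIAL = "unclassified", "proprietary", "confidential"

    @dataclass
    class Principal:
        name: str
        inside_company: bool
        need_to_know: set = field(default_factory=set)   # resources this user has justified access to

    def may_read(user: Principal, resource: str, label: str) -> bool:
        """Return True only if the stated rules allow `user` to read `resource`."""
        if label == UNCLASSIFIED:
            return True                                   # anyone can read unclassified data
        if label == PROPRIETARY:
            return user.inside_company                    # no one outside the company may read it
        if label == CONFIDENTIAL:
            return user.inside_company and resource in user.need_to_know
        return False                                      # unknown label: deny by default

    alice = Principal("alice", inside_company=True, need_to_know={"payroll"})
    print(may_read(alice, "payroll", CONFIDENTIAL))       # True: insider with a demonstrated need to know
    print(may_read(alice, "merger-plan", CONFIDENTIAL))   # False: no demonstrated need to know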

Two broad categories of authorization mechanisms:

Access control mechanisms:
- enforce the rules
- used in environments that can be trusted to run a program to check whether the rules are being violated.

Data protection mechanisms:
- used when the environment isn't able to run a program to check the rules, or isn't trusted to enforce the rules even if it can check them. Example: a telephone wire can't run a program; a PC running DOS can run programs, but DOS isn't a secure OS.
- normally implemented using encryption (a minimal sketch follows below).
- Confidentiality protection keeps unauthorized readers from snooping through protected data.
- Integrity protection keeps vandals from making unauthorized changes to protected data.
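
Here is a minimal sketch of encryption-based data protection using the third-party cryptography package (an assumption; the notes do not name a particular library or cipher). A Fernet token provides confidentiality (snoopers see only ciphertext) and integrity (any tampering is detected when the token is decrypted).

    from cryptography.fernet import Fernet, InvalidToken  # pip install cryptography

    key = Fernet.generate_key()   # the key itself must of course be kept secret
    box = Fernet(key)

    # Confidentiality: anyone reading the wire or the disk sees only ciphertext.
    token = box.encrypt(b"proprietary data: Q3 figures")

    # Integrity: decryption fails if the token has been modified in any way.
    try:
        print(box.decrypt(token))
    except InvalidToken:
        print("data was tampered with, or the wrong key was used")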

Accountability

Assumption: there's no way to prevent authorized users with evil intent from doing things which the rules don't allow. Hence, the only rule that could keep all your resources safe is: no one is allowed to do anything.
Accountability: you can tell who did what, when.
Two strengths of accountability: audit and non-repudiation.

Audit: a weak form of accountability

When someone suspects foul play, the audit log is examined to discover evidence of the deed and the identity of the perpetrator (a toy sketch of such a log follows below).
Limitation: some kinds of foul play can't be accurately diagnosed using audit.

[Diagram: The System -> Audit Log]
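
A toy sketch of the idea: every system action is appended to a log that records who did what, and when. The record layout and function name are invented for illustration; a real audit log would live in protected, append-only storage.

    import time

    audit_log = []   # stands in for protected, append-only storage

    def record(user: str, action: str, target: str) -> None:
        """Append a 'who did what, when' entry before the system performs the action."""
        audit_log.append({"when": time.time(), "who": user, "did": action, "to": target})

    record("bob", "delete", "/payroll/2003.xls")

    # When foul play is suspected, the log is examined for evidence.
    for entry in audit_log:
        print(entry)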

Non-repudiation: a stronger form of accountability

Requires users to sign their requests for system actions (sketched below).

[Diagram: The System, Signature, Audit Log]

Depends on:
- the strength of the digital signature algorithm, and
- the secrecy of each user's signature key
to guarantee that privileged users and system administrators can't forge other users' signatures.
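
The sketch below illustrates signed requests with Ed25519 signatures from the third-party cryptography package (an assumption; the notes do not prescribe an algorithm). Because only the user holds the private key, a verified signature ties the logged request to that user and cannot be forged by administrators who hold only the public key.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    private_key = Ed25519PrivateKey.generate()   # kept secret by the user
    public_key = private_key.public_key()        # known to the system

    request = b"transfer $500 from account 42 to account 7"
    signature = private_key.sign(request)        # the user signs the request for a system action

    # The system stores (request, signature) in the audit log and verifies the signature.
    try:
        public_key.verify(signature, request)
        print("valid signature: the user cannot later deny making this request")
    except InvalidSignature:
        print("invalid signature: reject the request")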

Availability

A resource is available if it's there when you need it.
A bad guy can do various things to deny the use of a resource:
- Destroy or damage the resource
- Interfere with the communications between you and the resource
- Interfere with your ability to pass the authorization check required for use of the resource

[Diagram: User -> network -> Server, with a Hacker mounting attacks on the network]

Two approaches to availability protection: service continuity and disaster recovery.

- Service continuity: make sure that you can always get to your resources. Usually involves keeping many active copies of each resource and keeping a couple of independent communication paths to each copy (a small failover sketch follows these two items).

- Disaster recovery: assumes that service will eventually be interrupted, and figures out how you can get back up and running after the interruption. Consists of keeping backup copies of everything and planning in advance how the backups will be activated and used in an emergency.
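
A minimal sketch of the service continuity idea: keep several active copies of a resource and fall back to the next copy when one is unreachable. The replica URLs are hypothetical placeholders.

    import urllib.request
    import urllib.error

    # Hypothetical replicas, ideally reachable over independent communication paths.
    REPLICAS = [
        "https://mirror1.example.com/resource",
        "https://mirror2.example.com/resource",
        "https://mirror3.example.com/resource",
    ]

    def fetch_with_failover(replicas, timeout=2.0):
        """Return the first copy that answers; fail only if every copy is unreachable."""
        for url in replicas:
            try:
                with urllib.request.urlopen(url, timeout=timeout) as resp:
                    return resp.read()
            except (urllib.error.URLError, OSError):
                continue   # this copy, or the path to it, is down: try the next one
        raise RuntimeError("no replica of the resource is currently available")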

Assurance

The set of things the builder and the operator of a system do to convince you that it really is safe to use.
Means that the system keeps its security promises: the system can enforce the policy you're interested in, and the system works.
Based on an assurance argument, which tries to prove three things:
- the system's protection mechanisms are correct (e.g. not full of bugs, and enforce the stated policy)
- the system always uses its protection mechanisms when they are needed
- there's no way to circumvent the system's protection mechanisms
Assurance has to be done throughout the system's lifetime.

Three kinds of assurance contribute to a strong assurance argument:

- Design assurance: use of good security engineering practices to identify important threats and to choose appropriate countermeasures.
- Development assurance: use of disciplined processes to implement the design correctly and to deliver the final system securely and reliably.
- Operational assurance: mandates secure installation, configuration, and day-to-day operation of the system.

Good records of what has been done during every phase of the system's life must be kept as evidence; they help in deciding how much faith in the system's security is justified.

4. Security Engineering: Lifecycle and Principles


Lifecycle

1. Define the application and the resources to be protected
2. Identify security vulnerabilities and threats
3. Estimate risks
4. Prioritize risks (a toy scoring sketch follows this list); continue until the risk is acceptably low
5. Establish the security policy
6. Specify the system architecture
7. Select and implement security services and mechanisms
8. Deploy and maintain

Repeat the procedure when a certain interval has expired or circumstances have changed.
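
As a toy illustration of the risk estimation and prioritization steps, the sketch below scores each threat as likelihood times impact (a common model, assumed here; the notes do not prescribe a formula) and handles the highest scores first. The threat entries and numbers are invented.

    # Invented threat list; likelihood in [0, 1], impact on an arbitrary 1-10 scale.
    threats = [
        {"threat": "SQL injection on the web form", "likelihood": 0.6, "impact": 8},
        {"threat": "stolen backup tape",            "likelihood": 0.1, "impact": 9},
        {"threat": "password guessing",             "likelihood": 0.8, "impact": 5},
    ]

    # Estimate each risk as likelihood x impact.
    for t in threats:
        t["risk"] = t["likelihood"] * t["impact"]

    # Prioritize: address the highest risks first.
    for t in sorted(threats, key=lambda t: t["risk"], reverse=True):
        print(f'{t["risk"]:5.2f}  {t["threat"]}')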

Design Principles and Guidelines

- Least privilege: every user and process should have the least set of access rights necessary.
- Economy of mechanism: the design should be sufficiently small and simple that it can be verified and correctly implemented.
- Complete mediation: every access should be checked for authorization (illustrated in the sketch after this list).
- Open design: security should not depend on the design being secret or on the ignorance of attackers.
- Separation of privilege: where possible, access to objects should depend on more than one condition being satisfied.
- Least common mechanism: mechanisms shared by multiple users provide potential information channels and therefore should be minimized.
- Psychological acceptability: mechanisms must be easy to use so that they are applied correctly and not bypassed.
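
To make two of these principles concrete, here is a small Python sketch; the permission table and function names are invented for illustration. The decorator applies complete mediation by checking authorization on every call, and the table grants each principal only the rights it needs (least privilege).

    import functools

    # Hypothetical permission table: each principal holds only the rights it needs.
    PERMISSIONS = {"alice": {"read_report"}, "backup_daemon": {"read_report", "write_archive"}}

    def mediated(permission):
        """Complete mediation: every call to the wrapped operation is checked for authorization."""
        def decorator(func):
            @functools.wraps(func)
            def wrapper(user, *args, **kwargs):
                if permission not in PERMISSIONS.get(user, set()):
                    raise PermissionError(f"{user} lacks the {permission} right")
                return func(user, *args, **kwargs)
            return wrapper
        return decorator

    @mediated("write_archive")
    def write_archive(user, data):
        print(f"{user} archived {len(data)} bytes")

    write_archive("backup_daemon", b"nightly dump")   # allowed: the right was granted
    try:
        write_archive("alice", b"oops")               # alice was never given this right
    except PermissionError as err:
        print(err)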
