Vulnerability Management: High-impact Strategies - What You Need to Know: Definitions, Adoptions, Impact, Benefits, Maturity, Vendors



Published by Emereo Publishing
Vulnerability management is the cyclical practice of identifying, classifying, remediating, and mitigating vulnerabilities. This practice generally refers to software vulnerabilities in computing systems.

This book is your ultimate resource for Vulnerability Management. Here you will find the most up-to-date information, analysis, background and everything you need to know.

In easy to read chapters, with extensive references and links to get you to know all there is to know about Vulnerability Management right away, covering: Vulnerability management, AAA protocol, Information technology security audit, Automated information systems security, Canary trap, CBL Index, CESG Claims Tested Mark, Chroot, Commercial Product Assurance, Common Criteria Testing Laboratory, Composite Blocking List, Computer forensics, Computer security policy, Computer Underground Digest, Cryptographic Module Testing Laboratory, Control system security, Cyber security standards, Cyber spying, Cyber-security regulation, Defense in depth (computing), Department of Defense Information Assurance Certification and Accreditation Process, Department of Defense Information Technology Security Certification and Accreditation Process, Differentiated security, DShield, Dynablock, Enterprise Privacy Authorization Language, Evaluation Assurance Level, Exit procedure, Filesystem permissions, Full disclosure, Fuzz testing, Google hacking, Hardening (computing), Host protected area, Identity management, Internet ethics, Intruder detection, Labeled Security Protection Profile, Erik Laykin, Mobile device forensics, MyNetWatchman, National Information Assurance Certification and Accreditation Process, National Information Assurance Training and Education Center, National Strategy to Secure Cyberspace, Need to know, Network security policy, Not Just Another Bogus List, Off-site data protection, Open Vulnerability and Assessment Language, Patch Tuesday, Penetration test, Presumed security, Privilege revocation, Privilege separation, Protection mechanism, Protection Profile, Responsible disclosure, RISKS Digest, Same origin policy, Schneier's Law, Secure attention key, Secure by default, Secure error messages in software systems, Security controls, Security management, Security Target, Security through obscurity, Security-evaluated operating system, Setuid, Shibboleth (computer 
security), Software forensics, System High Mode, System Security Authorization Agreement, Trust negotiation, Trusted computing base, XACML, XTS-400, 201 CMR 17.00, Asset (computer security), Attack (computer), Federal Information Security Management Act of 2002, Health Insurance Portability and Accountability Act, Information Assurance Vulnerability Alert, IT risk, IT risk management, Month of bugs, Nikto Web Scanner, North American Electric Reliability Corporation, Payment Card Industry Data Security Standard, Sarbanes–Oxley Act, Security Content Automation Protocol, Threat (computer), Vulnerability (computing), Network security, Administrative domain, AEGIS SecureConnect, Aladdin Knowledge Systems, Alert Logic, Anomaly-based intrusion detection system, Anti-pharming, Anti-phishing software, Anti-worm, Application-level gateway, ARP spoofing, Asprox botnet, Attack tree, Authentication server, Avaya Secure Network Access, Avaya VPN Router, Bagle (computer worm), Barracuda Networks, Bastion host, Black hole (networking), BLACKER, Blue Cube Security, BNC (software), Botnet, BredoLab botnet, Bro (software), Byzantine Foothold, Captive portal, Capture the flag, Check Point, Check Point Abra, Check Point VPN-1, Christmas tree packet, Cisco ASA, Cisco Global Exploiter, Cisco PIX...and much more.

This book explains in depth the real drivers and workings of Vulnerability Management. It reduces the risk of your technology, time, and resource investment decisions by enabling you to compare your understanding of Vulnerability Management with the objectivity of experienced professionals.


Published by: Emereo Publishing on Aug 02, 2011
Copyright: Traditional Copyright. All rights reserved.




Vulnerability Management


Kevin Roebuck



High-impact Strategies - What You Need to Know: Definitions, Adoptions, Impact, Benefits, Maturity, Vendors

Topic-relevant selected content from the highest rated entries, typeset, printed and shipped. Combine the advantages of up-to-date and in-depth knowledge with the convenience of printed books.

A portion of the proceeds of each book will be donated to the Wikimedia Foundation to support their mission: to empower and engage people around the world to collect and develop educational content under a free license or in the public domain, and to disseminate it effectively and globally.

The content within this book was generated collaboratively by volunteers. Please be advised that nothing found here has necessarily been reviewed by people with the expertise required to provide you with complete, accurate or reliable information. Some information in this book may be misleading or simply wrong. The publisher does not guarantee the validity of the information found here. If you need specific advice (for example, medical, legal, financial, or risk management) please seek a professional who is licensed or knowledgeable in that area.

Sources, licenses and contributors of the articles and images are listed in the section entitled “References”. Parts of the book may be licensed under the GNU Free Documentation License; a copy of this license is included in the section entitled “GNU Free Documentation License”. All used third-party trademarks belong to their respective owners.

Contents

Vulnerability management, AAA protocol, Information technology security audit, Automated information systems security, Canary trap, CBL Index, CESG Claims Tested Mark, chroot, Commercial Product Assurance, Common Criteria Testing Laboratory, Composite Blocking List, Computer forensics, Computer security policy, Computer Underground Digest, Cryptographic Module Testing Laboratory, Control system security, Cyber security standards, Cyber spying, Cyber-security regulation, Defense in depth (computing), Department of Defense Information Assurance Certification and Accreditation Process, Department of Defense Information Technology Security Certification and Accreditation Process, Differentiated security, DShield, Dynablock, Enterprise Privacy Authorization Language, Evaluation Assurance Level, Exit procedure, Filesystem permissions, Full disclosure, Fuzz testing, Google hacking, Hardening (computing), Host protected area,

Identity management, Internet ethics, Intruder detection, Labeled Security Protection Profile, Erik Laykin, Mobile device forensics, MyNetWatchman, National Information Assurance Certification and Accreditation Process, National Information Assurance Training and Education Center, National Strategy to Secure Cyberspace, Need to know, Network security policy, Not Just Another Bogus List, Off-site data protection, Open Vulnerability and Assessment Language, Patch Tuesday, Penetration test, Presumed security, Privilege revocation, Privilege separation, Protection mechanism, Protection Profile, Responsible disclosure, RISKS Digest, Same origin policy, Schneier's Law, Secure attention key, Secure by default, Secure error messages in software systems, Security controls, Security management, Security Target, Security through obscurity, Security-evaluated operating system, setuid, Shibboleth (computer security), Software forensics, System High Mode,

System Security Authorization Agreement, Trust negotiation, Trusted computing base, XACML, XTS-400, 201 CMR 17.00, Asset (computer security), Attack (computer), Federal Information Security Management Act of 2002, Health Insurance Portability and Accountability Act, Information Assurance Vulnerability Alert, IT risk, IT risk management, Month of bugs, Nikto Web Scanner, North American Electric Reliability Corporation, Payment Card Industry Data Security Standard, Sarbanes–Oxley Act, Security Content Automation Protocol, Threat (computer), Vulnerability (computing), Network security, Administrative domain, AEGIS SecureConnect, Aladdin Knowledge Systems, Alert Logic, Anomaly-based intrusion detection system, Anti-pharming, Anti-phishing software, Anti-worm, Application-level gateway, ARP spoofing, Asprox botnet, Attack tree, Authentication server, Avaya Secure Network Access, Avaya VPN Router, Bagle (computer worm),

Barracuda Networks, Bastion host, Black hole (networking), BLACKER, Blue Cube Security, BNC (software), Botnet, BredoLab botnet, Bro (software), Byzantine Foothold, Captive portal, Capture the flag, Check Point, Check Point Abra, Check Point VPN-1, Christmas tree packet, Cisco ASA, Cisco Global Exploiter, Cisco PIX, Cisco Security Agent, Cisco Systems VPN Client, Clarified Networks, Clear Channel Assessment attack, Client Puzzle Protocol, Cloudvpn, Codenomicon, Columbitech, Computer security, Context-based access control, ContraVirus, Core Impact, Core Security, Countermeasure (computer), Cryptek, Cutwail botnet, CVSS, CyberCIEGE, Dark Internet,

Data breach, Deep packet inspection, Denial-of-service attack, Device fingerprint, DHIPDS, Digital Postmarks, Digital security, Distributed firewall, DMZ (computing), DNS hijacking, Donbot botnet, Dual-homed, Egress filtering, Entrust, Evil bit, Extensible Threat Management (XTM), Extranet, Fail2ban, Fake AP, Finjan, Firewalk (computing), Firewall (computing), Firewall pinhole, Firewalls and Internet Security, Fortinet, Forward-confirmed reverse DNS, General Dynamics C4 Systems, Generalized TTL security mechanism, Global Internet Freedom Consortium, Greynet, Grum botnet, Guided tour puzzle protocol, Gumblar, Hole punching, Honeyd, HoneyMonkey, Honeynet Project, Honeypot (computing),

Honeytoken, Host Identity Protocol, ICMP hole punching, Identity driven networking, IEC 62351, IEEE 802.1X, IF-MAP, Ingress filtering, Institute for Applied Network Security, Integrated Windows Authentication, Inter-protocol communication, Inter-protocol exploitation, Internet censorship, Internet security, Internet Storm Center, IntruShield, Network intrusion detection system, Intrusion prevention system, IP address spoofing, IP blocking, IP fragmentation attacks, Kaspersky Anti-Virus, Kerberos (protocol), Kerio Control, Key distribution center, Knowledge-based authentication, Kraken botnet, Lethic botnet, List of cyber attack threat trends, Lock-Keeper, lorcon, Lumeta Corporation, MAC flooding, Managed security service, Managed VoIP Service, Mariposa botnet, Mega-D botnet, Messaging Security,

Metasploit Project, Middlebox, Miredo, Mobile virtual private network, Monoculture (computer science), Mu Dynamics, MySecureCyberspace, NAT traversal, NeoAccel, NetBox Blue, Network Access Control, Network Admission Control, Network Based Application Recognition, Network encryption cracking, Network intelligence, Network Security Toolkit, Nfront security, NIST RBAC model, NTLM, Null session, OCML, Online Armor Personal Firewall, Open proxy, OpenVPN, Operation Cyber Condition Zebra, Operation: Bot Roast, OSSEC, Oulu University Secure Programming Group, Outbound content compliance, Packet capture, Packet Storm, PacketFence, Pass the hash, Password length parameter, Personal firewall, Philippine Honeynet Project, Phoning home, Port forwarding,

Port knocking, Port triggering, Prelude Hybrid IDS, Protected computer, Proxy list, Pseudoserver, Racoon (KAME), Real-time adaptive security, Rogue access point, Rogue DHCP, Rustock botnet, Safe@Office, SAMP (Security Attribute Modulation Protocol), Sandstorm Enterprises, Screened-subnet firewall, Screening router, Secure Password Authentication, Secure Service Network, Securelist.com, SecureWorks, Security Protocols Open Repository, Security service (telecommunication), Security Task Manager, Semantic URL attack, Service scan, Session hijacking, Sguil, Shadowserver, Shell shoveling, Snort (software), Sourcefire Vulnerability Research Team, Split tunneling, Spoofed URL, Spoofing attack, Spyware, Srizbi botnet, SSL-Explorer, SSL-Explorer: Community Edition,

Standard Access Control List, Stateful firewall, Stealth wallpaper, Stepping stone (computer security), Stockade (software), Stonesoft Corporation, Storm botnet, Sucuri, Sunbelt Personal Firewall, Suricata (software), Sybil attack, SYN cookies, TACACS, TACLANE, Tarpit (networking), TCP Cookie Transactions, TCP Gender Changer, TCP hole punching, TCP reset attack, tcpcrypt, TeamF1, Herbert Hugh Thompson, Thresh (software), Ticket (IT security), Transaction verification, Trusted Network Connect, Trusted path, TrustPort, Twinge attack, Typhoid adware, UDP hole punching, Unified threat management, UT-VPN, Verisys, Virtual private network, Virtual private server, VLAN hopping, VPN-1, VSX NGX,

Vyatta, w3af, Waledac botnet, Warchalking, Wardriving, Warflying, WarVOX, Warzapping, Web application security, Web content security, WebScarab, Wi-Fi Protected Access, Wired Equivalent Privacy, Wireless LAN security, Wireless security, Woo–Lam, XKMS, XSA, XSS worm, Zenux, Zero-day attack, Zombie computer, ZoneAlarm Z100G, Zorp firewall, Zscaler, References (Article Sources and Contributors; Image Sources, Licenses and Contributors), Article Licenses (License).

Vulnerability management

"Vulnerability management is the cyclical practice of identifying, classifying, remediating, and mitigating vulnerabilities."[1] This practice generally refers to software vulnerabilities in computing systems.

Vulnerability Management Programs

While program definitions vary in the industry, Gartner, a prominent IT analyst company, defines six steps for vulnerability management programs:[2]

1. Define Policy - Organizations must start out by determining what the desired security state for their environment is. This includes determining desired device and service configurations and access control rules for users accessing resources.
2. Baseline the Environment - Once a policy has been defined, the organization must assess the true security state of the environment and determine where instances of policy violations are occurring.
3. Prioritize Vulnerabilities - Instances of policy violations are vulnerabilities. These vulnerabilities are then prioritized using risk and effort-based criteria.
4. Shield - In the short term, the organization can take steps to minimize the damage that could be caused by the vulnerability by creating compensating controls.
5. Mitigate Vulnerabilities - Ultimately, the root causes of vulnerabilities must be addressed. This is often done via patching vulnerable services, changing vulnerable configurations or making application updates to remove vulnerable code.
6. Maintain and Monitor - Organizations' computing environments are dynamic and evolve over time, as do security policy requirements. In addition, additional security vulnerabilities are always being identified. For this reason, vulnerability management is an ongoing process rather than a point-in-time event.

Managing Known Vulnerabilities Versus Unknown Vulnerabilities

Typical tools used for identifying and classifying known vulnerabilities are vulnerability scanners. These tools look for vulnerabilities known and reported by the security community, which typically are already fixed by relevant vendors with patches and security updates. Zero-day vulnerabilities are problems that vulnerability scanners cannot detect, and which do not have any patches or updates available from vendors. An unknown vulnerability management process augments known vulnerability management by introducing tools and techniques such as network analyzers for mapping attack surface and fuzzers for finding zero-day vulnerabilities.[3]

Vulnerability Management for Applications Versus Hosts and Infrastructure

Host and infrastructure vulnerabilities can often be addressed by applying patches or changing configuration settings. Custom software or application-based vulnerabilities often require additional software development in order to fully mitigate. Technologies such as web application firewalls can be used in the short term to shield systems, but to address the root cause, changes must be made to the underlying software.
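The "prioritize using risk and effort-based criteria" step above can be sketched in code. This is a minimal illustration only: the scoring formula, field names, and vulnerability records are hypothetical assumptions, not part of the Gartner program or any standard.

```python
# Hypothetical sketch of risk/effort-based vulnerability prioritization.
# The weighting scheme and record fields are illustrative assumptions.

def priority_score(vuln):
    """Higher score = fix sooner: high risk, low remediation effort."""
    return vuln["risk"] / vuln["effort"]

def prioritize(vulns):
    """Order findings so the highest-leverage fixes come first."""
    return sorted(vulns, key=priority_score, reverse=True)

vulns = [
    {"id": "CVE-A", "risk": 9.8, "effort": 2},   # critical, easy patch
    {"id": "CVE-B", "risk": 5.0, "effort": 5},   # moderate, moderate effort
    {"id": "CVE-C", "risk": 7.5, "effort": 30},  # serious, needs a rewrite
]

for v in prioritize(vulns):
    print(v["id"], round(priority_score(v), 2))
# → CVE-A 4.9, CVE-B 1.0, CVE-C 0.25
```

Note how the formula deliberately ranks an easy critical fix above a hard-to-fix serious flaw; a real program would feed in CVSS scores, asset value and remediation cost estimates instead of these toy numbers.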

References

[1] Foreman, P: Vulnerability Management, page 1. Taylor & Francis Group, 2010. ISBN 978-1-4398-0150-5
[2] Williams, A and Nicollet, M: Improve IT Security With Vulnerability Management. Gartner ID Number: G00127481, May 2005 (http://www.gartner.com/DisplayDocument?doc_cd=127481)
[3] Anna-Maija Juuso and Ari Takanen: Unknown Vulnerability Management. Codenomicon whitepaper, October 2010 (http://www.codenomicon.com/solutions/unknown-vulnerability-management/)

External links

• http://denimgroup.typepad.com/denim_group/2009/03/owasp-minneapolis-st-paul-slide-deck-and-video-online.html
• Q&A on vulnerability management with QualysGuard product manager Eric Perraudeau (http://www.net-security.org/article.php?id=1282)
• Webcasts on Unknown (Zero-Day) Vulnerability Management Process (http://www.codenomicon.com/solutions/unknown-vulnerability-management/webcasts.shtml)

AAA protocol

In computer security, AAA commonly stands for authentication, authorization and accounting.

Authentication

Authentication refers to the process where an entity's identity is authenticated, typically by providing evidence that it holds a specific digital identity such as an identifier and the corresponding credentials. Examples of types of credentials are passwords, one-time tokens, digital certificates, and phone numbers (calling/called).

Authorization

The authorization function determines whether a particular entity is authorized to perform a given activity, typically inherited from authentication when logging on to an application or service. Authorization may be determined based on a range of restrictions, for example time-of-day restrictions, physical location restrictions, or restrictions against multiple access by the same entity or user. A typical authorization in everyday computer life is, for example, granting read access to a specific file for an authenticated user. Examples of types of service include, but are not limited to: IP address filtering, address assignment, route assignment, Quality of Service/differential services, bandwidth control/traffic management, compulsory tunneling to a specific endpoint, and encryption.

Accounting

Accounting refers to the tracking of network resource consumption by users for the purposes of capacity and trend analysis, cost allocation, and billing.[1] In addition, it may record events such as authentication and authorization failures, and include auditing functionality, which permits verifying the correctness of procedures carried out based on accounting data. Real-time accounting refers to accounting information that is delivered concurrently with the consumption of the resources. Batch accounting refers to accounting information that is saved until it is delivered at a later time. Typical information that is gathered in accounting is the identity of the user or other entity, the nature of the service delivered, when the service began, and when it ended, and if there is a status to report.
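The three functions can be illustrated with a toy sketch. All names, stores and data below are hypothetical; real deployments implement AAA with protocols such as RADIUS or Diameter against directory and billing back ends, not in-memory dictionaries.

```python
import time

# Toy AAA sketch (illustrative only): hypothetical in-memory stores.
CREDENTIALS = {"alice": "s3cret"}          # authentication data
PERMISSIONS = {"alice": {"read:/report"}}  # authorization data
ACCOUNTING_LOG = []                        # accounting records

def authenticate(user, password):
    """Verify the entity's claimed identity against its credentials."""
    return CREDENTIALS.get(user) == password

def authorize(user, action):
    """Decide whether an authenticated entity may perform an action."""
    return action in PERMISSIONS.get(user, set())

def account(user, action, status):
    """Record who did what, with what outcome, and when it happened."""
    ACCOUNTING_LOG.append({"user": user, "action": action,
                           "status": status, "time": time.time()})

if authenticate("alice", "s3cret") and authorize("alice", "read:/report"):
    account("alice", "read:/report", "granted")

print(ACCOUNTING_LOG[0]["status"])  # prints "granted"
```

The point of the sketch is the separation of concerns: a failed password check never reaches authorization, and accounting records the outcome either way in a real system.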

List of AAA Protocols

• RADIUS
• Diameter
• TACACS
• TACACS+

Usage of AAA servers in CDMA data networks

AAA servers in CDMA data networks are entities that provide Internet Protocol (IP) functionality to support the functions of authentication, authorization and accounting. The AAA server in the CDMA wireless data network architecture is similar to the HLR in the CDMA wireless voice network architecture.

Types of AAA servers include the following:

• Access Network AAA (AN-AAA) – Communicates with the RNC in the Access Network (AN) to enable authentication and authorization functions to be performed at the AN. The interface between AN and AN-AAA is known as the A12 interface.
• Broker AAA (B-AAA) – Acts as an intermediary to proxy AAA traffic between roaming partner networks (i.e., between the H-AAA server in the home network and the V-AAA server in the serving network). B-AAA servers are used in CRX networks to enable CRX providers to offer billing settlement functions.
• Home AAA (H-AAA) – The AAA server in the roamer's home network. The H-AAA is similar to the HLR in voice. The H-AAA stores user profile information, responds to authentication requests, and collects accounting information.
• Visited AAA (V-AAA) – The AAA server in the visited network from which a roamer is receiving service. The V-AAA in the serving network communicates with the H-AAA in a roamer's home network. Authentication requests and accounting information are forwarded by the V-AAA to the H-AAA, either directly or through a B-AAA.

Current AAA servers communicate using the RADIUS protocol. As such, TIA specifications refer to AAA servers as RADIUS servers. However, future AAA servers are expected to use a successor protocol to RADIUS known as Diameter. The behavior of AAA servers (RADIUS servers) in the CDMA2000 wireless IP network is specified in TIA-835.

References

[1] Bernard Aboba, Jari Arkko, David Harrington: "Introduction to Accounting Management". RFC 2975, IETF, Oct. 2000.

External links

• The webpage of the Authentication, Authorization and Accounting IETF working group (http://tools.ietf.org/wg/aaa/)

Information technology security audit

A computer security audit is a manual or systematic measurable technical assessment of a system or application. Manual assessments include interviewing staff, performing security vulnerability scans, reviewing application and operating system access controls, and analyzing physical access to the systems. Automated assessments, or CAATs, include system generated audit reports or using software to monitor and report changes to files and settings on a system. Systems can include personal computers, servers, mainframes, network routers, switches. Applications can include Web Services, Microsoft Project Central, Oracle Database (examples only).

Audit Event Reporting

During the last few decades, systematic audit record generation (also called audit event reporting) can only be described as ad hoc. Ironically, in the early days of mainframe and mini-computing with large scale, single-vendor, custom software systems from companies such as IBM and Hewlett Packard, auditing was considered a mission-critical function. Over the last thirty years, commercial off-the-shelf (COTS) software applications and components, and micro computers, have gradually replaced custom software and hardware as more cost-effective business management solutions. During this transition, the critical nature of audit event reporting gradually transformed into low priority customer requirements. Software consumers, having little else to fall back on, have simply accepted the lesser standards as normal. The consumer licenses of existing COTS software disclaim all liability for security, performance and data integrity issues. The importance of audit event logging has increased with recent new (post-2000) US and worldwide legislation mandating corporate and enterprise auditing requirements.

Traditional Logging

Using traditional logging methods, applications and components submit free-form text messages to system logging facilities such as the Unix Syslog process, or the Microsoft Windows System, Security or Application event logs. Java applications often fall back to the standard Java logging facility, log4j. These text messages usually contain information only assumed to be security-relevant by the application developer, who is often not a computer- or network-security expert. The fundamental problem with such free-form event records is that each application developer individually determines what information should be included in an audit event record, and the overall format in which that record should be presented to the audit log. This variance in formatting among thousands of instrumented applications makes the job of parsing audit event records by analysis tools (such as the Novell Sentinel product, for example) difficult and error prone. Such domain and application specific parsing code included in analysis tools is also difficult to maintain, as changes to event formats inevitably work their way into newer versions of the applications over time.

Modern Auditing Services

Most contemporary enterprise operating systems, including Microsoft Windows, Mac OS X, Solaris, and FreeBSD (via the TrustedBSD Project), support audit event logging due to requirements in the Common Criteria (and more historically, the Orange Book). Both FreeBSD and Mac OS X make use of the open source OpenBSM library and command suite to generate and process audit records. Open source projects such as OpenXDAS, a Bandit project identity component, have begun to take their place in software security reviews as not only an improvement, but a requirement. OpenXDAS is based on the Open Group Distributed Auditing Service specification, and has begun to show prominence in the security community as a more structured alternative to free-form text audit logging. The XDAS specification defines a well-considered event format for security-related events, an event taxonomy with event types that cover most security-related event scenarios, and a standardized API for event submission and management.

Performing an Audit

Generally, computer security audits are performed by:

1. Federal or State Regulators - Certified accountants, CISA. Federal OTS, OCC, DOJ, etc.
2. Corporate Internal Auditors - Certificated accountants, CISA.
3. Corporate Security Staff - Security managers, CISSP, CISM.
4. IT Staff - Subject matter experts, oversight support.

References

• The OpenXDAS project (http://openxdas.sourceforge.net)

External links

• Information Technology Audit Resources (http://auditnet.org)
• Information Systems and Audit Control Association (ISACA) (http://www.isaca.org/Template.cfm?Section=IT_Audit_Basics&Template=/ContentManagement/ContentDisplay.cfm&ContentID=11234)
• The Institute of Internal Auditors (http://www.theiia.org/itaudit/index.cfm?fuseaction=forum&fid=5444)
• The OpenXDAS Distributed Auditing Service project (http://openxdas.sourceforge.net)
• The Bandit Project (http://www.bandit-project.org)
• OpenBSM Project (http://www.openbsm.org)
• TrustedBSD Project (http://www.trustedbsd.org)
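The contrast drawn under Traditional Logging, between free-form log text and a structured audit event, can be sketched briefly. The structured record below is an illustrative assumption only; it mimics the idea of a fixed event taxonomy and format but is not the actual XDAS event layout.

```python
import json
import logging

# Free-form logging: each developer invents an ad hoc message format,
# which makes automated parsing of audit events brittle and error prone.
logging.basicConfig(format="%(message)s", level=logging.INFO)
logging.info("user alice failed login from 10.0.0.5")

# A structured alternative (illustrative, not the real XDAS format):
# fixed, named fields make events machine-parseable across applications.
event = {
    "event_type": "AUTHENTICATION_FAILURE",  # drawn from a fixed taxonomy
    "initiator": "alice",
    "source": "10.0.0.5",
    "outcome": "denied",
}
print(json.dumps(event))
```

A consumer such as a security event analysis tool can dispatch on `event_type` directly instead of maintaining per-application regular expressions over free text.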

Automated information systems security

In telecommunication, the term automated information systems security refers to measures and controls that ensure confidentiality, integrity, and availability of the information processed and stored by automated information systems. The unauthorized disclosure, modification, or destruction may be accidental or intentional. Automated information systems security includes consideration of all hardware and software functions, characteristics and features; operational procedures, accountability procedures, and access controls at the central computer facility, remote computer, and terminal facilities; management constraints; physical structures and devices, such as computers, transmission lines, and power sources; and personnel and communications controls needed to provide an acceptable level of risk for the automated information system and for the data and information contained in the system. It also includes the totality of security safeguards needed to provide an acceptable protection level for an automated information system and for the data handled by it. In INFOSEC, automated information systems security is a synonym for computer security.

Canary trap

A canary trap is a method for exposing an information leak which involves giving different versions of a sensitive document to each of several suspects and seeing which version gets leaked. The term was coined by Tom Clancy in his novel Patriot Games, though Clancy did not invent the technique. The fictional character Jack Ryan describes the technique he devised for identifying the sources of leaked classified documents: each summary paragraph has six different versions, and the mixture of those paragraphs is unique to each numbered copy of the paper. If a reporter quotes something from two or three of those paragraphs, we know which copy he saw and, therefore, who leaked it. There are over a thousand possible permutations, but only ninety-six numbered copies of the actual document. The reason the summary paragraphs are so lurid is to entice a reporter to quote them verbatim in the public media. A refinement of this technique uses a thesaurus program to shuffle through synonyms, thus making every copy of the document unique.

Barium meal test

According to the book Spycatcher by Peter Wright (published in 1987), the technique is standard practice that has been used by MI5 (and other intelligence agencies) for many years under the name "Barium meal test". The actual method (usually referred to as a Barium meal test in espionage circles) is more sophisticated than a canary trap because it is flexible and may take many different forms. However, the basic premise is to reveal a secret to a suspected enemy (but nobody else), then monitor whether there is evidence of the fake information being utilised by the other side. For example, a double agent could be offered some tempting "bait": he could be told that important information was stored at a dead drop site. The fake dead drop site could then be periodically checked for signs of disturbance. If the site showed signs of being disturbed (in order to copy the microfilm stored there), then this would confirm that the suspected enemy really was an enemy, e.g. a double agent.

Embedding information

The technique of embedding significant information in a hidden form in a medium has been used in many ways, which are usually classified according to intent:
• Watermarks are used to show that items are authentic and not forged.
• Steganography is used to hide a secret message in an apparently innocuous message.
• A canary trap hides information in a document that uniquely identifies it, so that copies of it can be traced.

Screener versions of DVDs are often marked in some way so as to allow the tracking of unauthorised releases to their source.

Appearances in fiction

The canary trap was also used in Clancy's (chronologically) earlier novel Without Remorse, when a CIA official alters a report given to a senator in order to escape detection. The technique (not named) was used in the 1970s BBC television serial 1990, and later in the TV short series with the same name. A variation of the canary trap was used in Miami Vice, with various rendezvous dates leaked to different groups, revealing an internal leak who was giving information to the KGB. Barium meals are also administered in Robert Littell's book The Company.

Appearances in media

When distributing Broken to friends, Trent Reznor claims that he watermarked the tapes with dropouts at certain points so that he could identify the source if a leak surfaced.

External links
• Fingerprinting (http://www.cs.utsa.edu/~wagner/CS1023/readings/finger.html) gives a good overview of different kinds of canary trap techniques.
• EFF.org DocuColor Tracking Dot Decoding Guide (http://www.eff.org/Privacy/printers/docucolor/) - how to read the date, time, and printer serial number from forensic tracking codes in a Xerox DocuColor color laser printout.

CBL Index

The CBL Index is the ratio between the number of IP addresses in a given IP subnet (Subnetwork) and the number of CBL (Composite Blocking List) listings in that subnet. It may be used to measure how "clean" (of compromised computers) a given subnet is: the higher the number, the "cleaner" the subnet. The CBL Index may be represented in Decibels (dB) or as a CIDR suffix (*/xx).

Rationale

The CBL DNSBL (Composite Blocking List) lists IP addresses that are compromised by a virus or spam-sending infection (computer worm, computer virus, or spamware). The CBL Index may therefore be used for estimating the overall anti-spam performance of an ISP or AS operator, and is a reasonably good tool for getting estimates of a subnet's "outgoing spam reputation". The CBL's full zone (data) is available publicly via rsync for download; you are encouraged to register for it - see http://cbl.abuseat.org for more detail.

Example

In the CBL zone dated 2007-07-07T21:03+00:00 there were 166,086 IP addresses listed from the 83.0.0.0/11 network. The number of IP addresses in a */11 network is 2**(32-11) = 2,097,152, so the CBL Index for the net was 2,097,152/166,086 = 12.6 (*/28, 11.0 dB).

Note: other spam researchers prefer to use the percentage of IPs in a subnet that are listed. Using percentages is better suited for "unclean" subnets, because "clean" nets have significantly less than 1% of addresses listed.

The CBL Index should be treated with caution: subnets often contain IPs with radically different purposes, and assuming that all IPs within a subnet represent the same risk/reputation is potentially dangerous.
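The arithmetic in the example can be reproduced directly. The dB and CIDR-suffix conversions below are inferred from the worked example (10·log10 of the index, and 32 minus log2 of the index, respectively), since the formulas are not stated explicitly:

```python
import math

def cbl_index(prefix_len, listings):
    """Return (index, dB, CIDR suffix) for a subnet with the given
    prefix length and number of CBL listings."""
    total = 2 ** (32 - prefix_len)   # IP addresses in the subnet
    index = total / listings         # higher = cleaner
    db = 10 * math.log10(index)      # decibel representation (inferred)
    cidr = 32 - math.log2(index)     # */xx representation (inferred)
    return index, db, cidr

# The 83.0.0.0/11 example: 2,097,152 / 166,086 = 12.6 (*/28, 11.0 dB)
idx, db, cidr = cbl_index(11, 166086)
```

Running this reproduces the figures quoted in the example above within rounding.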

CESG Claims Tested Mark

The CESG Claims Tested Mark (abbreviated as CCT Mark), formerly CSIA Claims Tested Mark[1], is a UK Government Standard for computer security. The CCT Mark is based upon a framework where vendors can make claims about the security attributes of their products and/or services, and independent testing laboratories can evaluate the products/services to determine if they actually meet the claims. In other words, the CCT Mark provides a quality assurance approach to validate whether the implementation of a computer security product or service has been performed in an appropriate manner, and it has been specifically designed for timescale and cost efficiency.

Comparisons

The CCT Mark is often compared to the international Common Criteria (CC), which is simultaneously both correct and incorrect:
• Both provide methods for achieving a measure of assurance of computer security products and systems.
• Neither can provide a guarantee that approval means that no exploitable flaws exist, but rather reduce the likelihood of such flaws being present.
• The Common Criteria is constructed in a layered manner, with multiple Evaluation Assurance Level (EAL) specifications being available, with increasing complexity, timescale and costs as the EAL number rises.
• Common Criteria is supported by a Mutual Recognition Agreement (MRA), which, at the lower EAL numbers at least, means that products tested in one country will normally be accepted in other markets.
• The CCT Mark is aimed at the same market as the lower CC EAL numbers (currently EAL1/2).

History

The CCT Mark was developed under the auspices of the UK Government's Central Sponsor for Information Assurance[2] (CSIA), which is part of the Cabinet Office's Intelligence, Security and Resilience (ISR) function. The role of providing specialist input to the CCT Mark fell to CESG as the UK National Technical Authority (NTA) for Information Security, who assumed responsibility for the scheme as a whole on 7 April 2008.

Operation

All Testing Laboratories must comply with ISO 17025, with the United Kingdom Accreditation Service (UKAS) carrying out the accreditation.

Future

As of September 2010, CESG have announced that the product assurance element of the CCT Mark will be overtaken by the new Commercial Product Assurance (CPA) approach. It is unclear as yet whether the CCT Mark will remain in existence for assurance of Information Security services.

External links
• The official website of the CESG Claims Tested Mark [3]

References
[1] FAQs About CCTM (http://www.cctmark.gov.uk/FAQs/tabid/56/Default.aspx)
[2] Central Sponsor for Information Assurance (CSIA) (http://www.csia.gov.uk)
[3] http://www.cctmark.gov.uk/

chroot

A chroot on Unix operating systems is an operation that changes the apparent root directory for the current running process and its children. A program that is run in such a modified environment cannot name (and therefore normally cannot access) files outside the designated directory tree. The modified environment is called a "chroot jail" or (less commonly) a "chroot prison". The term "chroot" may refer to the chroot(2) system call or the chroot(8) wrapper program.

History

The chroot system call was introduced during development of Version 7 Unix in 1979, and was added to BSD by Bill Joy on 18 March 1982 – 17 months before 4.2BSD was released – in order to test its installation and build system.

Uses

A chroot environment can be used to create and host a separate virtualized copy of the software system. This can be useful for:

Testing and development: A test environment can be set up in the chroot for software that would otherwise be too risky to deploy on a production system.

Dependency control: Software can be developed, built and tested in a chroot populated only with its expected dependencies. This can prevent some kinds of linkage skew that can result from developers building projects with different sets of program libraries installed.

Compatibility: Legacy software or software using a different ABI must sometimes be run in a chroot because their supporting libraries or data files may otherwise clash in name or linkage with those of the host system.

Recovery: Should a system be rendered unbootable, a chroot can be used to move back into the damaged environment after bootstrapping from an alternate root file system (such as from installation media, or a Live CD).

Privilege separation: Programs are allowed to carry open file descriptors (for files, pipelines and network connections) into the chroot, which can simplify jail design by making it unnecessary to leave working files inside the chroot directory. This also simplifies the common arrangement of running the potentially-vulnerable parts of a privileged program in a sandbox, in order to pre-emptively contain a security breach. Note that chroot is not necessarily enough to contain a process with root privileges.

Limitations

• The chroot mechanism is not intended to defend against intentional tampering by privileged (root) users. On most systems, chroot contexts do not stack properly, and chrooted programs with sufficient privileges may perform a second chroot [1] to break out. To mitigate the risk of this security weakness, chrooted programs should relinquish root privileges as soon as practical after chrooting; alternatively, other mechanisms, such as FreeBSD Jails, should be used instead. Note that some systems, such as FreeBSD, take precautions to prevent the second chroot attack.[2]
• On systems that support device nodes on ordinary filesystems, a chrooted root user can still create device nodes and mount the file systems on them; thus, the chroot mechanism is not intended by itself to be used to block low-level access to system devices by privileged users.
• Only the root user can perform a chroot. This is intended to prevent users from putting a setuid program inside a specially-crafted chroot jail (for example, with a fake /etc/passwd and /etc/shadow file) that would fool it into a privilege escalation, but it also makes chroot difficult to use as a general sandboxing mechanism.
• The chroot mechanism in itself is not intended to restrict the use of resources like I/O, bandwidth, disk space or CPU time. Most Unixes are not completely file system-oriented and leave potentially disruptive functionality like networking and process control available through the system call interface to a chrooted program.
• At startup, programs expect to find scratch space, configuration files, device nodes and shared libraries at certain preset locations. For a chrooted program to successfully start, the chroot directory must be populated with a minimum set of these files.

Extensions

Some Unixes offer extensions of the chroot mechanism to address at least some of these limitations. See:
• Implementations of operating system-level virtualization technology

Graphical applications on chroot

It is possible to run graphical applications in a chrooted environment, using some methods[3] [4]:
• Use xhost (or copy the secret from .Xauthority)
• Use a nested X server like Xnest or the more modern Xephyr (or start a real X server from inside the jail)
• Access the chroot via SSH using the X11 forwarding (ssh -X) feature
• Use openroot [5] if your X server has been started with -nolisten tcp and you do not run an ssh server
• Use an X11 VNC server and connect a VNC client outside the environment
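The advice to "relinquish root privileges as soon as practical after chrooting" implies a strict call order: enter the jail while still root, then drop supplementary groups, the group ID, and finally the user ID. A minimal sketch follows; the syscall wrappers are injectable parameters only so that the ordering can be exercised without actually running as root:

```python
import os

def enter_jail(path, uid, gid, *, _chroot=os.chroot, _chdir=os.chdir,
               _setgroups=os.setgroups, _setgid=os.setgid, _setuid=os.setuid):
    """Confine the current process to `path`, then drop root privileges.

    The order is security-critical: chroot() requires root, so it must
    come first, and setuid() must come last, because once the UID is
    dropped the process can no longer chroot, change GID or clear groups.
    """
    _chroot(path)    # jail the filesystem view (requires root)
    _chdir("/")      # ensure the working directory is inside the jail
    _setgroups([])   # clear supplementary groups while still root
    _setgid(gid)     # drop the group ID before the user ID
    _setuid(uid)     # finally drop root; this is irreversible
```

Real daemons typically also close any inherited file descriptors pointing outside the jail, since (as noted above) open descriptors survive the chroot.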

Notable applications

• The Postfix mail transfer agent operates as a pipeline of individually-chrooted helper programs.
• Like 4.2BSD before it, the Debian and Ubuntu internal package-building farms use chroots extensively to catch unintentional build dependencies between packages. SUSE uses a similar method with its build program, and Red Hat, Fedora, and various other RPM-based distributions build all RPMs using a chroot tool such as mock [6].
• Many FTP servers for POSIX systems use the chroot mechanism to sandbox untrusted FTP clients. This may be done by forking a process to handle an incoming connection, then chrooting the child (to avoid having to populate the chroot with libraries required for program startup).
• If privilege separation is enabled, the OpenSSH daemon will chroot an unprivileged helper process into an empty directory to handle pre-authentication network traffic for each client. The daemon can also sandbox SFTP and shell sessions in a chroot (from version 4.9p1 onwards).[7]

References
[1] http://www.bpfh.net/simes/computing/chroot-break.html
[2] http://man.freebsd.org/chroot/2
[3] http://wiki.mandriva.com/en/Development/Howto/Chroot#Launch_X_Applications_inside_the_chroot
[4] http://gentoo-wiki.com/HOWTO_startx_in_a_chroot
[5] http://www.elstel.com/openroot/
[6] http://fedoraproject.org/wiki/Projects/Mock
[7] "sshd_config(5) manual page" (http://www.openbsd.org/cgi-bin/man.cgi?query=sshd_config). 2008-04-05. Retrieved 2008-04-27.

External links
• chroot(2) (http://www.freebsd.org/cgi/man.cgi?query=chroot&sektion=2): change root directory – FreeBSD System Calls Manual
• chroot(8) (http://www.freebsd.org/cgi/man.cgi?query=chroot&sektion=8): change root directory – FreeBSD System Manager's Manual
• chroot(2) (http://www.kernel.org/doc/man-pages/online/pages/man2/chroot.html): change root directory – Linux Programmer's Manual – System Calls
• openroot (http://www.elstel.com/openroot/) - an extended chroot with X11 access, /dev & /media automounting & more

Commercial Product Assurance

Commercial Product Assurance (abbreviated as CPA) is (as of September 2010) an emergent UK Government Standard for computer security. It is intended to supplant other approaches, such as Common Criteria (CC) and the CCT Mark, for UK government use.

Organisation

CPA is being developed under the auspices of the UK Government's CESG[1], as the UK National Technical Authority (NTA) for Information Security.

Comparisons

In comparison to other schemes:
• Unlike Common Criteria, there is no Mutual Recognition Agreement (MRA) for CPA, which means that products tested in the UK will not normally be accepted in other markets.
• Unlike the CCT Mark, the coverage of CPA is limited to Information Security products, and therefore excludes services.
• The target audience for CPA also appears to be focused on Central Government ("I'm protecting Government data")[2], rather than including the Wider Public Sector (WPS) and Critical National Infrastructure (CNI) segments that were target customers for the CCT Mark.

References
[1] CESG Home Page (http://www.cesg.gov.uk/)
[2] CESG CPA Home Page (http://www.cesg.gov.uk/products_services/iacs/cpa/index.shtml)

Common Criteria Testing Laboratory

A Common Criteria Testing Laboratory (CCTL) is an information technology (IT) computer security testing laboratory that is accredited to conduct IT security evaluations for conformance to the Common Criteria international standard. In the United States, the National Institute of Standards and Technology (NIST) National Voluntary Laboratory Accreditation Program (NVLAP) accredits CCTLs to meet National Information Assurance Partnership (NIAP) Common Criteria Evaluation and Validation Scheme (CCEVS) requirements and conduct IT security evaluations for conformance to the Common Criteria.

CCTL requirements

These laboratories must meet the following requirements:
• NIST Handbook 150, NVLAP Procedures and General Requirements
• NIST Handbook 150-20, NVLAP Information Technology Security Testing — Common Criteria
• NIAP-specific criteria for IT security evaluations and other NIAP-defined requirements

CCTLs enter into contractual agreements with sponsors to conduct security evaluations of IT products and Protection Profiles using the CCEVS, other NIAP-approved test methods derived from the Common Criteria, the Common Methodology, and other technology-based sources. CCTLs must operate within the guidelines established by the CCEVS and must observe the highest standards of impartiality, integrity and commercial confidentiality.

NIAP-approved CCTLs must agree to the following:
• Be located in the U.S. and be a legal entity, duly organized and incorporated, validly existing and in good standing under the laws of the state where the laboratory intends to do business
• Accept U.S. Government technical oversight and validation of evaluation-related activities in accordance with the policies and procedures established by the CCEVS
• Accept U.S. Government participants in selected Common Criteria evaluations

CCTL accreditation

A testing laboratory becomes a CCTL when the laboratory is approved by the NIAP Validation Body and is listed on the Approved Laboratories List [1]. NVLAP accreditation is the primary requirement for achieving CCTL status; to become a CCTL, a testing laboratory must go through a series of steps that involve both the NIAP Validation Body and NVLAP. Some scheme requirements that cannot be satisfied by NVLAP accreditation are addressed by the NIAP Validation Body; at present, there are only three scheme-specific requirements imposed by the Validation Body. To avoid unnecessary expense and delay in becoming a NIAP-approved testing laboratory, it is strongly recommended that prospective CCTLs ensure that they are able to satisfy the scheme-specific requirements prior to seeking accreditation from NVLAP. This can be accomplished by sending a letter of intent [2] to the NIAP prior to entering the NVLAP process.

Additional laboratory-related information can be found in CCEVS publications:
• #1 Common Criteria Evaluation and Validation Scheme for Information Technology Security — Organization, Management and Concept of Operations
• #4 Common Criteria Evaluation and Validation Scheme for Information Technology Security — Guidance to Common Criteria Testing Laboratories

External links
• NIAP Common Criteria Evaluation and Validation Scheme [3]
• Common Criteria Testing Laboratories [4]
• The Common Criteria standard documents [5]
• Common Criteria Recognition Agreement [6]
• List of Common Criteria evaluated products [7]
• ISO/IEC 15408 [8] — available free as a public standard

References
[1] http://www.niap-ccevs.org/cctls/
[2] http://www.niap-ccevs.org/forms/ltr-of-intent.cfm
[3] http://www.niap-ccevs.org
[4] http://www.niap-ccevs.org/cctls
[5] http://www.niap-ccevs.org/cc_docs
[6] http://www.commoncriteriaportal.org
[7] http://www.commoncriteriaportal.org/products.cfm
[8] http://isotc.iso.org/livelink/livelink/fetch/2000/2489/Ittf_Home/PubliclyAvailableStandards.htm

Composite Blocking List

In computer networking, the Composite Blocking List (CBL) is a DNS-based Blackhole List of suspected E-mail spam senders. The CBL takes its source data from very large spamtraps/mail infrastructures, and only lists IPs exhibiting characteristics such as:
• Open proxies of various sorts (HTTP, socks, AnalogX, wingate etc.)
• Worms/viruses that do their own direct mail transmission
• Trojan horses or "stealth" spamware
• Misconfigured mail servers (for example, servers that send HELO with 'localhost' or a similar incorrect domain)

The CBL lists these IPs without doing open proxy tests of any kind, and it does not provide public access to gathered evidence. Entries automatically expire after a period of time. CBL data are used in the Spamhaus XBL list.

External links
• The CBL [1]
• CBL lookup and removal page [2]

References
[1] http://cbl.abuseat.org/
[2] http://cbl.abuseat.org/lookup.cgi
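A DNSBL such as the CBL is conventionally queried by reversing the IP address's octets and looking the result up under the list's DNS zone: a listed address resolves to an A record, an unlisted one returns NXDOMAIN. A sketch of that convention (the zone name comes from the article; check cbl.abuseat.org's usage policy before automating live lookups):

```python
import socket

CBL_ZONE = "cbl.abuseat.org"

def dnsbl_query_name(ip, zone=CBL_ZONE):
    """Build the DNSBL lookup name: reversed octets under the zone."""
    return ".".join(reversed(ip.split("."))) + "." + zone

def is_listed(ip, zone=CBL_ZONE):
    """True if `ip` is listed, i.e. the DNSBL name resolves."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True
    except socket.gaierror:  # NXDOMAIN means "not listed"
        return False
```

For example, checking 192.0.2.1 issues a DNS query for the name `1.2.0.192.cbl.abuseat.org`.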

Computer forensics

Computer forensics (sometimes known as computer forensic science[1]) is a branch of digital forensic science pertaining to legal evidence found in computers and digital storage media. The goal of computer forensics is to examine digital media in a forensically sound manner with the aim of identifying, preserving, recovering, analyzing and presenting facts and opinions about the information. Although it is most often associated with the investigation of a wide variety of computer crime, computer forensics may also be used in civil proceedings. The discipline involves similar techniques and principles to data recovery, but with additional guidelines and practices designed to create a legal audit trail.

Evidence from computer forensics investigations is usually subjected to the same guidelines and practices as other digital evidence. It has been used in a number of high-profile cases and is becoming widely accepted as reliable within US and European court systems.

Overview

In the early 1980s personal computers became more accessible to consumers, leading to their increased use in criminal activity (for example, to help commit fraud). At the same time, several new "computer crimes" were recognized (such as hacking). The discipline of computer forensics emerged during this time as a method to recover and investigate digital evidence for use in court. Today it is used to investigate a wide variety of crime, including child pornography, fraud, cyberstalking, murder and rape. The discipline also features in civil proceedings as a form of information gathering (for example, Electronic discovery).

Forensic techniques and expert knowledge are used to explain the current state of a digital artifact, such as a computer system, a storage medium (e.g. hard disk or CD-ROM) or an electronic document (e.g. an email message or JPEG image).[2] The scope of a forensic analysis can vary from simple information retrieval to reconstructing a series of events. In the 2002 book Computer Forensics, authors Kruse and Heiser define computer forensics as involving "the preservation, identification, extraction, documentation and interpretation of computer data".[3] They go on to describe the discipline as "more of an art than a science", indicating that forensic methodology is backed by flexibility and extensive domain knowledge.

Use as evidence

In court, computer forensic evidence is subject to the usual requirements for digital evidence, requiring information to be authentic, reliably obtained and admissible. Different countries have specific guidelines and practices for the recovery of evidence. In the United Kingdom, examiners often follow guidelines from the Association of Chief Police Officers which help ensure the authenticity and integrity of evidence; while the guidelines are voluntary, they are widely accepted in the courts of England, Wales and Scotland.

Computer forensics has been used as evidence in criminal law since the mid 1980s. Some notable examples include:[4]

BTK Killer: Dennis Rader was convicted of a string of serial killings that occurred over a period of sixteen years. Towards the end of this period, Rader sent letters to the police on a floppy disk. Metadata within the documents implicated an author named "Dennis" at "Christ Lutheran Church"; this evidence helped lead to Rader's arrest.

Joseph E. Duncan III: A spreadsheet recovered from Duncan's computer contained evidence that showed him planning his crimes. Prosecutors used this to show premeditation and secure the death penalty.[5]

Sharon Lopatka: Hundreds of emails on Lopatka's computer led investigators to her killer, Robert Glass.[4]

Corcoran Group: This case confirmed parties' duties to preserve digital evidence when litigation has commenced or is reasonably anticipated. Hard drives were analyzed by a computer forensics expert, who could not find relevant emails that the Defendants should have had; although no evidence of deletion was found on the hard drives, evidence came out before the Court that the Defendants were found to have intentionally destroyed emails, and to have misled and failed to disclose material facts to the Plaintiffs and the Court.[4]

Forensic process

Computer forensic investigations usually follow the standard digital forensic process (acquisition, analysis and reporting). Investigations are performed on static data (i.e. acquired images) rather than "live" systems. This is a change from early forensic practices which, due to a lack of specialist tools, saw investigations commonly carried out on live data.[5]

Techniques

A number of techniques are used during computer forensics investigations.

Cross-drive analysis: A forensic technique that correlates information found on multiple hard drives. The process, which is still being researched, can be used for identifying social networks and for performing anomaly detection.[6] [7]

[Figure: A portable Tableau write-blocker attached to a hard drive.]

Live analysis: The examination of computers from within the operating system using custom forensics or existing sysadmin tools to extract evidence. The practice is useful when dealing with Encrypting File Systems, for example, where the encryption keys may be collected and, in some instances, the logical hard drive volume may be imaged (known as a live acquisition) before the computer is shut down.[8]

Deleted files: A common technique used in computer forensics is the recovery of deleted files.[9] Most operating systems and file systems do not always erase physical file data, allowing it to be reconstructed from the physical disk sectors. Modern forensic software has its own tools for recovering or carving out deleted data; file carving involves searching for known file headers within the disk image and reconstructing deleted materials.
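File carving as just described can be illustrated with a toy carver for JPEG data: it scans a raw buffer for the JPEG start-of-image marker and cuts at the next end-of-image marker. Real carvers also validate internal file structure; this sketch shows only the header-search idea:

```python
SOI = b"\xff\xd8\xff"  # JPEG start-of-image marker (plus first segment byte)
EOI = b"\xff\xd9"      # JPEG end-of-image marker

def carve_jpegs(data):
    """Return candidate JPEG byte strings carved from a raw buffer."""
    carved, pos = [], 0
    while True:
        start = data.find(SOI, pos)
        if start == -1:
            break                       # no further headers
        end = data.find(EOI, start + len(SOI))
        if end == -1:
            break                       # truncated file: stop carving
        carved.append(data[start:end + len(EOI)])
        pos = end + len(EOI)            # continue scanning after this file
    return carved

# A pretend disk image: filesystem slack around one deleted JPEG.
disk_image = b"slack" + b"\xff\xd8\xff\xe0JFIF-data\xff\xd9" + b"more slack"
recovered = carve_jpegs(disk_image)
```

The same loop generalizes to any file type with known header (and, ideally, footer) signatures, which is how carving recovers files whose directory entries are gone.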

Volatile data

When seizing evidence, if the machine is still active, any information stored solely in RAM that is not recovered before powering down may be lost. One application of "live analysis" is to recover RAM data (for example, using Microsoft's COFEE tool) prior to removing an exhibit, though it can be impractical to do this during a field examination.[4]

RAM can be analyzed for prior content after power loss, because the electrical charge stored in the memory cells takes time to dissipate. The length of time for which data recovery is possible is increased by low temperatures and higher cell voltages; holding unpowered RAM below −60 °C helps preserve the residual data by an order of magnitude, thus improving the chances of successful recovery.[10]

Analysis tools

A number of open source and commercial tools exist for computer forensics investigation. Typical forensic analysis includes a manual review of material on the media, reviewing the Windows registry for suspect information, discovering and cracking passwords, keyword searches for topics related to the crime, and extracting e-mail and pictures for review.[4]

Certifications

There are several computer forensics certifications available. Many state laws in the United States require computer forensic expert witnesses to have a professional certification or a private investigator's license.

References

[1] Michael G. Noblett; Mark M. Pollitt; Lawrence A. Presley (October 2000). "Recovering and examining computer forensic evidence" (http://bartholomewmorgan.com/resources/RecoveringComputerEvidence.pdf). Retrieved 26 July 2010.
[2] A. Yasinsac; R.F. Erbacher; D.G. Marks; M.M. Pollitt (2003). "Computer forensics education" (http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.9510&rep=rep1&type=pdf). IEEE Security & Privacy. Retrieved 26 July 2010.
[3] Warren G. Kruse; Jay G. Heiser (2002). Computer Forensics: Incident Response Essentials (http://books.google.co.uk/books?id=yMdNrgSBUq0C). Addison-Wesley. pp. 392. ISBN 0201707195. Retrieved 6 December 2010.
[4] Casey, Eoghan (2004). Digital Evidence and Computer Crime, Second Edition (http://books.google.com/?id=Xo8GMt_AbQsC). Elsevier. ISBN 0-12-163104-4.
[5] Various (2009). Eoghan Casey, ed. Handbook of Digital Forensics and Investigation (http://books.google.co.uk/books?id=xNjsDprqtUYC). Academic Press. pp. 567. ISBN 0123742676. Retrieved 27 August 2010.
[6] Garfinkel, Simson L. (August 2006). "Forensic Feature Extraction and Cross-Drive Analysis" (http://www.simson.net/clips/academic/2006.DFRWS.pdf). DFRWS. Retrieved 27 August 2010.
[7] "EXP-SA: Prediction and Detection of Network Membership through Automated Hard Drive Analysis" (http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0730389).
[8] Maarten Van Horenbeeck (24). "Technology Crime Investigation" (http://www.daemon.be/maarten/forensics.html). Retrieved 18 August 2010.
[9] Aaron Phillip; David Cowen; Chris Davis (2009). Hacking Exposed: Computer Forensics (http://books.google.co.uk/books?id=nNpQAAAAMAAJ). McGraw Hill Professional. pp. 544. ISBN 0071626778. Retrieved 27 August 2010.
[10] J. Alex Halderman; Seth D. Schoen; Nadia Heninger; William Clarkson; William Paul; Joseph A. Calandrino; Ariel J. Feldman; Jacob Appelbaum; Edward W. Felten (2008-02-21). Lest We Remember: Cold Boot Attacks on Encryption Keys (http://citp.princeton.edu/memory/). Princeton University. Retrieved 2009-11-20.

Further reading

• A Practice Guide to Computer Forensics, First Edition (Paperback), by David Benton and Frank Grindstaff.
• Incident Response and Computer Forensics, Second Edition (Paperback), by Chris Prosise, Kevin Mandia and Matt Pepe. "Truth is stranger than fiction..." (more)
• Casey, Eoghan; Stellatos, Gerasimos J. (2008). "The impact of full disk encryption on digital forensics". Operating Systems Review 42 (3): 93–98. doi:10.1145/1368506.1368519.
• YiZhen Huang; YangJing Long (2008). "Demosaicking recognition with applications in digital photo authentication based on a quadratic pixel correlation model" (http://pages.cs.wisc.edu/~huangyz/cvpr08_Huang.pdf). Proc. IEEE Conference on Computer Vision and Pattern Recognition: 1–8.
• George M. Mohay (2003). Computer and Intrusion Forensics (http://books.google.com/books?id=z4GLgpwsYrkC). Artech House. pp. 395. ISBN 1580533698.
• Ross, S.; Gow, A. (1999). Digital Archaeology? Rescuing Neglected or Damaged Data Resources (http://www.ukoln.ac.uk/services/elib/papers/supporting/pdf/p2.pdf). Bristol & London: British Library and Joint Information Systems Committee. ISBN 1-900508-51-6.

Related journals

• Journal of Digital Forensics, Security and Law (http://www.jdfsl.org)
• International Journal of Digital Crime and Forensics (http://www.dcs.warwick.ac.uk/~ctli/IJDCF.html)
• Journal of Digital Investigation (http://www.elsevier.com/wps/find/journaldescription.cws_home/702130/description#description)
• International Journal of Digital Evidence (http://www.utica.edu/academic/institutes/ecii/ijde/)
• International Journal of Forensic Computer Science (http://www.ijofcs.org)
• Journal of Digital Forensic Practice (http://www.tandf.co.uk/journals/titles/15567281.asp)
• Cryptologia (http://www.tandf.co.uk/journals/titles/01611194.asp)
• Small Scale Digital Device Forensic Journal (http://www.ssddfj.org)

External links

• US NIST Digital Data Acquisition Tool Specification (http://www.cftt.nist.gov/Pub-Draft-1-DDA-Require.pdf) (PDF)
• Forensics Wiki (http://www.forensicswiki.org), a Creative Commons wiki of computer forensics information
• Electronic Evidence Information Center (http://www.e-evidence.info)
• Computer Forensics World Forum (http://www.computerforensicsworld.com)
• Forensic Focus (http://www.forensicfocus.com)
• Original Computer Forensics Wiki (http://computer-forensics.safemode.org)
• Digital Forensic Research Workshop (DFRWS) (http://www.dfrws.org)
• Computer Forensic Whitepapers (SANS) (http://computer-forensics.sans.org/community/whitepapers)
• Forensic Science Information and Resources (http://www.forensicsciencenews.com)

Computer security policy 20

Computer security policy
A computer security policy defines the goals and elements of an organization's computer systems. The definition can be highly formal or informal. Security policies are enforced by organizational policies or security mechanisms. A technical implementation defines whether a computer system is secure or insecure. Formal policy models can be categorized into the core security principles of Confidentiality, Integrity and Availability. For example, the Bell-La Padula model is a confidentiality policy model, whereas the Biba model is an integrity policy model.

Formal description
If a system is regarded as a finite-state automaton with a set of transitions (operations) that change the system's state, then a security policy can be seen as a statement that partitions these states into authorized and unauthorized ones. Given this simple definition, one can define a secure system as one that starts in an authorized state and will never enter an unauthorized state.

Formal policy models
Confidentiality policy model
• Bell-La Padula model
Integrity policy models
• Biba model
• Clark-Wilson model
Hybrid policy model
• Chinese Wall (also known as the Brewer and Nash model)

Policy languages
To represent a concrete policy, especially for automated enforcement of it, a language representation is needed. There exist many application-specific languages that are closely coupled with the security mechanisms that enforce the policy in that application. Compared with these, abstract policy languages, e.g. the Domain Type Enforcement Language, are independent of the concrete mechanism.

References
• Bishop, Matt (2004). Computer security: art and science. Addison-Wesley.
• McLean, John (1994). "Security Models". Encyclopedia of Software Engineering 2. New York: John Wiley & Sons, Inc. pp. 1136–1145.
• Feltus, Christophe (2008). Preliminary Literature Review of Policy Engineering Methods - Toward Responsibility Concept [1]. Proceeding of the 3rd international conference on information and communication technologies: from theory to applications (ICTTA 08), Damascus, Syria.
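The formal description above can be made concrete with a short sketch. The state names, transitions, and policy below are invented for illustration; the check itself is just a breadth-first search verifying that no unauthorized state is reachable from the initial state.

```python
from collections import deque

# Toy finite-state automaton: each transition is an operation that changes
# the system's state. The policy partitions the states into authorized and
# unauthorized sets. (All names here are hypothetical.)
transitions = {
    "logged_out": ["logged_in"],
    "logged_in": ["logged_out", "admin_shell"],
    "admin_shell": ["logged_in"],
}
authorized = {"logged_out", "logged_in"}  # policy: admin_shell is unauthorized

def is_secure(start, transitions, authorized):
    """Secure per the definition above: the system starts in an
    authorized state and can never enter an unauthorized one."""
    if start not in authorized:
        return False
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in transitions.get(queue.popleft(), []):
            if nxt not in authorized:
                return False  # an unauthorized state is reachable
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return True

print(is_secure("logged_out", transitions, authorized))  # False: admin_shell is reachable
```

Removing the "admin_shell" transition, or adding that state to the authorized set, makes the same check return True, mirroring the two ways a system can be made secure under this definition: restricting the operations or changing the policy.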

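To contrast the Bell-La Padula and Biba models named above, here is a minimal sketch. The level ordering and labels are assumptions made for the example, not part of either model's formal definition.

```python
# Ordered security levels (hypothetical labels chosen for the example).
LEVELS = {"public": 0, "confidential": 1, "secret": 2}

def blp_allows(subject, obj, op):
    """Bell-La Padula (confidentiality): no read up, no write down."""
    s, o = LEVELS[subject], LEVELS[obj]
    return o <= s if op == "read" else s <= o

def biba_allows(subject, obj, op):
    """Biba (integrity): no read down, no write up -- the dual of BLP."""
    s, o = LEVELS[subject], LEVELS[obj]
    return o >= s if op == "read" else s >= o

# A "confidential" subject may read "public" data under Bell-La Padula,
# but Biba forbids it: reading less-trusted data threatens integrity.
print(blp_allows("confidential", "public", "read"))   # True
print(biba_allows("confidential", "public", "read"))  # False
```

The one-line duality between the two predicates is the standard observation that Biba is the mirror image of Bell-La Padula, with integrity levels in place of confidentiality levels.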
Computer security policy 21

References
[1] http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=4529912&language=fr

Computer Underground Digest

Computer Underground Digest
Editor Gordon Meyer and Jim Thomas
Categories Online magazine
Frequency Bi-weekly
First issue March 28, 1990[1]
Country United States
Language English

The Computer Underground Digest (CuD) was a weekly online newsletter on early Internet cultural, social, and legal issues published by Gordon Meyer and Jim Thomas from March, 1990 to March, 2000. It existed primarily as an email mailing list and on USENET, though its archives were later provided on a website. The CuD published commentary from its membership on subjects including the legal and social implications of the growing Internet (and later the web), book reviews of topical publications, and many off-topic postings by its readership. The newsletter came to prominence when it published legal commentary and updates concerning the "hacker crackdowns" and federal indictments of Leonard Rose and Craig Neidorf of Phrack. Overtaken by the growth of online forums on the web, it ceased publication in March, 2000.[2]

History
Meyer and Thomas were Criminal Justice professors at Northern Illinois University, and intended the newsletter to cover topical social and legal issues generated during the rise of telecommunications and the Internet.

External links
• Computer Underground Digest [3]
• CuD [4] on textfiles.com

References
[1] "Electronic Magazines: CUD (The Computer Underground Digest)" (http://www.textfiles.com/magazines/CUD/). textfiles.com. Retrieved 2009-05-10.
[2] Steve Mizrach (2009). "The electronic discourse of the computer underground" (http://www.fiu.edu/~mizrachs/cudisc.html). Florida International University. Retrieved 2009-05-10. "Gordon Meyer, a sociologist who has since left academia but continues to be involved in the computer industry, wrote in his seminal paper The Social Organization of the Computer Underground that the 'computer underground consists of actors in three roles - computer hackers, phone phreaks, and software pirates.'"
[3] http://cu-digest.org/
[4] http://www.textfiles.com/magazines/CUD/

Cryptographic Module Testing Laboratory 22

Cryptographic Module Testing Laboratory
A Cryptographic Module Testing Laboratory (CMTL) is an information technology (IT) computer security testing laboratory that is accredited to conduct cryptographic module evaluations for conformance to the FIPS 140-2 U.S. Government standard. The National Institute of Standards and Technology (NIST) National Voluntary Laboratory Accreditation Program (NVLAP) accredits CMTLs to meet Cryptographic Module Validation Program (CMVP) standards and procedures.

CMTL requirements
These laboratories must meet the following requirements:
• NIST Handbook 150, NVLAP Procedures and General Requirements
• NIST Handbook 150-17, Information Technology Security Testing - Cryptographic Module Testing
• NVLAP Specific Operations Checklist for Cryptographic Module Testing

FIPS 140-2 in relation to the Common Criteria
A CMTL can also be a Common Criteria (CC) Testing Laboratory (CCTL). The CC and FIPS 140-2 are different in the abstractness and focus of their tests. FIPS 140-2 testing is against a defined cryptographic module and provides a suite of conformance tests to four FIPS 140 security levels. FIPS 140-2 describes the requirements for cryptographic modules and includes such areas as physical security, key management, self tests, roles and services, etc. The CC is an evaluation against a Protection Profile (PP), usually created by the user, or a security target (ST). Typically, a PP covers a broad range of products.
• A CC evaluation does not supersede or replace a validation to either FIPS 140-1 or FIPS 140-2, and a CC certificate cannot be a substitute for a FIPS 140-1 or FIPS 140-2 certificate.
• The four security levels in FIPS 140-1 and FIPS 140-2 do not map directly to specific CC EALs or to CC functional requirements.
• The standard was initially developed in 1994 - prior to the development of the CC. FIPS 140-1 required evaluated operating systems that referenced the Trusted Computer System Evaluation Criteria (TCSEC) classes C2, B1 and B2. TCSEC is no longer in use and has been replaced by the Common Criteria; consequently, FIPS 140-2 now references the Common Criteria.
• If the operational environment is a modifiable operational environment, however, the operating system requirements of the Common Criteria are applicable at FIPS Security Levels 2 and above.

External links
• List of CMTLs [1] from NIST

References
[1] http://csrc.nist.gov/cryptval/

Control system security 23

Control system security
Control system security is the prevention of intentional or unintentional interference with the proper operation of industrial automation and control systems. These control systems manage essential services including electricity, petroleum production, water, transportation, manufacturing, and communications. They rely on computers, networks, operating systems, applications, and programmable controllers, each of which could contain security vulnerabilities. The 2010 discovery of the Stuxnet worm demonstrated the vulnerability of these systems to cyber incidents. The United States and other governments have passed cyber-security regulations requiring enhanced protection for control systems operating critical infrastructure. Control system security is known by several other names such as SCADA security, PCN security, industrial network security, and control system cyber security.

Risks
Insecurity of industrial automation and control systems can lead to the following risks:
• Safety
• Environmental impact
• Lost production
• Equipment damage
• Information theft
• Company image

Vulnerability of control systems
Industrial automation and control systems have become far more vulnerable to security incidents due to the following trends that have occurred over the last 10 to 15 years:
• Heavy use of Commercial Off-the-Shelf Technology (COTS) and protocols. Integration of technology such as MS Windows, SQL, and Ethernet means that process control systems are now vulnerable to the same viruses, worms and trojans that affect IT systems.
• Increased connectivity. Enterprise integration (using plant, corporate and even public networks) means that process control systems (legacy) are now being subjected to stresses they were not designed for.
• Demand for remote access. 24/7 access for engineering, operations or technical support means more insecure or rogue connections to control systems.
• Public information. Manuals on how to use control systems are publicly available to would-be attackers as well as to legitimate users.

Regulation of control system security is rare. The United States, for example, only does so for the nuclear power and the chemical industries.[1]

Control system security 24

Government efforts
The U.S. Government Computer Emergency Readiness Team (US-CERT) [2] has instituted a Control Systems Security Program (CSSP [3]) which has made available a large set of free National Institute of Standards and Technology (NIST) standards documents regarding control system security [4].

Control system security standards

ISA99
ISA99 is the Industrial Automation and Control System Security Committee of the International Society for Automation (ISA). The committee is developing a multi-part series of standards and technical reports on the subject, several of which have been publicly released. Work products from the ISA99 committee are also submitted to IEC as standards and specifications in the IEC 62443 series.
• ISA-99.01.01 (formerly referred to as "Part 1") (ANSI/ISA 99.00.01 [5]) is approved and published.
• ISA-99.01.02 is a master glossary of terms used by the committee. This document is still a working draft but the content is available on the committee Wiki site (http://isa99.isa.org/ISA99%20Wiki/Master%20Glossary.aspx).
• ISA-99.01.03 identifies a set of compliance metrics for IACS security. This document is currently under development.
• ISA-99.02.01 (formerly referred to as "Part 2") (ANSI/ISA 99.02.01-2009 [6]) addresses how to establish an IACS security program. This standard is approved and published. It has also been approved and published by the IEC as IEC 62443-2-1 [7].
• ISA-99.02.02 addresses how to operate an IACS security program. This standard is currently under development.
• ISA-TR99.02.03 is a technical report on the subject of patch management. This report is currently under development.
• ISA-TR99.03.01 ([8]) is a technical report on the subject of suitable technologies for IACS security. This report is approved and published.
• ISA-99.03.02 addresses how to define security assurance levels using the zones and conduits concept. This standard is currently under development.
• ISA-99.03.03 defines detailed technical requirements for IACS security. This standard is currently under development.
• ISA-99.03.04 addresses the requirements for the development of secure IACS products and solutions. This standard is currently under development.
• Standards in the ISA-99.04.xx series address detailed technical requirements at the component level. These standards are currently under development.
More information about the activities and plans of the ISA99 committee is available on the committee Wiki site ([9]).

Control system security 25

American Petroleum Institute
API 1164 Pipeline SCADA Security [10]

North American Electric Reliability Committee (NERC)
NERC Critical Infrastructure Protection (CIP) Standards [11]

Guidance documents
American Chemistry Council ChemITC Guidance Documents [12]

Insightful articles
Industrial Networking Security [13]

Control system security certification

ISA Security Compliance Institute
Related to the work of ISA 99 is the work of the ISA Security Compliance Institute [14]. The ISA Security Compliance Institute (ISCI) has developed compliance test specifications for ISA99 and other control system security standards. They have also created an ANSI [15] accredited certification program called ISASecure for the certification of industrial automation devices such as programmable logic controllers (PLC), distributed control systems (DCS) and safety instrumented systems (SIS). These types of devices provide automated control of industrial processes such as those found in the oil & gas, chemical, electric utility, manufacturing, food & beverage and water/wastewater processing industries. There is growing concern from both governments as well as private industry regarding the risk that these systems could be intentionally compromised by "evildoers" such as hackers, disgruntled employees, organized criminals, terrorist organizations or even state-sponsored groups. The recent news about the industrial control system malware known as Stuxnet has heightened concerns about the vulnerability of these systems.

References
[1] Gross, Michael Joseph (2011-04). "A Declaration of Cyber-War" (http://www.vanityfair.com/culture/features/2011/04/stuxnet-201104). Vanity Fair. Condé Nast. Retrieved March 03, 2011.
[2] http://www.us-cert.gov/
[3] http://www.us-cert.gov/control_systems/
[4] http://www.us-cert.gov/control_systems/csstandards.html
[5] http://www.isa.org/Template.cfm?Section=Standards2&template=/Ecommerce/ProductDisplay.cfm&Productid=9661
[6] http://www.isa.org/Template.cfm?Section=Standards&template=/Ecommerce/ProductDisplay.cfm&ProductID=10243
[7] http://www.iec.ch/cgi-bin/procgi.pl/www/iecwww.p?wwwlang=E&wwwprog=pro-det.p&He=IEC&Pu=62443&Pa=2&Se=1&Am=&Fr=&TR=&Ed=1
[8] http://www.isa.org/Template.cfm?Section=Shop_ISA&Template=/Ecommerce/ProductDisplay.cfm&Productid=9665
[9] http://isa99.isa.org/ISA99%20Wiki/Home.aspx
[10] http://www.api.org/Standards/new/api-standard-1164.cfm
[11] http://www.nerc.com/page.php?cid=2|20
[12] http://www.americanchemistry.com/s_chemitc/sec.asp?CID=1641&DID=6201
[13] http://www.bin95.com/Industrial-network-security.htm
[14] http://www.isasecure.org/
[15] https://www.ansica.org/wwwversion2/outside/PROpilotISA.asp?menuID=1

Control system security 26

External links
• ISA 99 Standards (http://www.isa.org/isa99/)
• ISA Security Compliance Institute (http://www.isasecure.org/)
• NERC Standards (see CIP 002-009) (http://www.nerc.com/page.php?cid=220/)
• NIST webpage (http://www.nist.gov)
• The Repository of Industrial Security Incidents (http://www.securityincidents.org/)

Cyber security standards

Cyber security standards are security standards which enable organizations to practice safe security techniques to minimize the number of successful cyber security attacks. These guides provide general outlines as well as specific techniques for implementing cyber security. For certain specific standards, cyber security certification by an accredited body can be obtained. There are many advantages to obtaining certification, including the ability to get cyber security insurance. The International Society of Automation (ISA) developed cyber security standards for industrial automation control systems (IACS) that are broadly applicable across manufacturing industries. The series of ISA industrial cyber security standards are known as ISA-99 and are being expanded to address new areas of concern.

History
Cyber security standards have been created recently because sensitive information is now frequently stored on computers that are attached to the Internet. Also, many tasks that were once done by hand are now carried out by computer; therefore there is a need for Information Assurance (IA) and security. Cyber security is important in order to guard against identity theft. Businesses also have a need for cyber security because they need to protect their trade secrets, proprietary information, and personally identifiable information (PII) of their customers or employees. The government also has the need to secure its information. One of the most widely used security standards today is ISO/IEC 27002, which started in 1995. The National Institute of Standards and Technology (NIST) has released several special publications addressing cyber security. Three of these special papers are very relevant to cyber security: the 800-12 titled "Computer Security Handbook," the 800-14 titled "Generally Accepted Principles and Practices for Securing Information Technology," and the 800-26 titled "Security Self-Assessment Guide for Information Technology Systems".

ISO 27002
ISO 27002 incorporates both parts of the BS 7799 standard. Sometimes ISO/IEC 27002 is referred to as BS 7799 part 1, and sometimes it refers to part 1 and part 2. This standard consists of two basic parts, BS 7799 part 1 and BS 7799 part 2, both of which were created by the British Standards Institute (BSI). BS 7799 part 1 provides an outline for cyber security policy, whereas BS 7799 part 2 provides a certification. The outline is a high level guide to cyber security. Recently this standard has become ISO 27001. ISO 27001 (ISMS) replaces BS 7799 part 2, but since it is backward compatible, any organization working toward BS 7799 part 2 can easily transition to the ISO 27001 certification process. There is also a transitional audit available to make it easier, once an organization is BS 7799 part 2-certified, for the organization to become ISO 27001-certified. It is most beneficial for an organization to obtain a certification to be recognized as compliant with the standard. The certification, once obtained, lasts three years and is periodically checked by the BSI to ensure the organization continues to be compliant throughout that three year period. ISO/IEC 27002 states that information security is characterized by integrity, confidentiality, and availability. The ISO/IEC 27002 standard is arranged into eleven control areas: security policy, organizing information security, asset management, human resources security, physical and environmental security, communication and operations, access controls, information systems

Cyber security standards 27

acquisition/development/maintenance, incident handling, business continuity management, and compliance.

Standard of good practice
In the 1990s, the Information Security Forum (ISF) published a comprehensive list of best practices for information security, published as the Standard of Good Practice (SoGP). Originally the Standard of Good Practice was a private document available only to ISF members, but the ISF has since made the full document available to the general public at no cost. The ISF continues to update the SoGP every two years; the latest version was published in February 2007. Among other programs, the ISF offers its member organizations a comprehensive benchmarking program based on the SoGP.

NERC
The North American Electric Reliability Corporation (NERC) has created many standards. The most widely recognized is NERC 1300, which is a modification/update of NERC 1200. The newest version of NERC 1300 is called CIP-002-1 through CIP-009-2 (CIP = Critical Infrastructure Protection). These standards are used to secure bulk electric systems, although NERC has created standards within other areas. The bulk electric system standards also provide network security administration while still supporting best practice industry processes.

NIST
1. Special publication 800-12 provides a broad overview of computer security and control areas. It also emphasizes the importance of the security controls and ways to implement them. Initially this document was aimed at the federal government, although most practices in this document can be applied to the private sector as well. Specifically, it was written for those people in the federal government responsible for handling sensitive systems.
2. Special publication 800-14 describes common security principles that are used. It provides a high level description of what should be incorporated within a computer security policy. It describes what can be done to improve existing security as well as how to develop a new security practice. Eight principles and fourteen practices are described within this document.
3. Special publication 800-26 provides advice on how to manage IT security. This document emphasizes the importance of self assessments as well as risk assessments.
4. Special publication 800-37, updated in 2010, provides a new risk approach: "Guide for Applying the Risk Management Framework to Federal Information Systems".
5. Special publication 800-53 rev3, "Guide for Assessing the Security Controls in Federal Information Systems", updated in August 2009, specifically addresses the 194 security controls that are applied to a system to make it "more secure."

Cyber security standards 28

ISO 15408
This standard develops what is called the "Common Criteria". It allows many different software applications to be integrated and tested in a secure way.

RFC 2196
RFC 2196 is a memorandum published by the Internet Engineering Task Force for developing security policies and procedures for information systems connected to the Internet. The RFC 2196 provides a general and broad overview of information security including network security, incident response, and security policies. The document is very practical and focuses on day-to-day operations.

ISA-99
ISA99 is the Industrial Automation and Control System Security Committee of the International Society for Automation (ISA). The committee is developing a multi-part series of standards and technical reports on the subject, several of which have been publicly released. Work products from the ISA99 committee are also submitted to IEC as standards and specifications in the IEC 62443 series.
• ISA-99.01.01 (formerly referred to as "Part 1") (ANSI/ISA 99.00.01 [5]) is approved and published.
• ISA-99.01.02 is a master glossary of terms used by the committee. This document is still a working draft but the content is available on the committee Wiki site (http://isa99.isa.org/ISA99%20Wiki/Master%20Glossary.aspx).
• ISA-99.01.03 identifies a set of compliance metrics for IACS security. This document is currently under development.
• ISA-99.02.01 (formerly referred to as "Part 2") (ANSI/ISA 99.02.01-2009 [6]) addresses how to establish an IACS security program. This standard is approved and published. It has also been approved and published by the IEC as IEC 62443-2-1 [7].
• ISA-99.02.02 addresses how to operate an IACS security program. This standard is currently under development.
• ISA-TR99.02.03 is a technical report on the subject of patch management. This report is currently under development.
• ISA-TR99.03.01 ([8]) is a technical report on the subject of suitable technologies for IACS security. This report is approved and published.
• ISA-99.03.02 addresses how to define security assurance levels using the zones and conduits concept. This standard is currently under development.
• ISA-99.03.03 defines detailed technical requirements for IACS security. This standard is currently under development.
• ISA-99.03.04 addresses the requirements for the development of secure IACS products and solutions. This standard is currently under development.
• Standards in the ISA-99.04.xx series address detailed technical requirements at the component level. These standards are currently under development.
More information about the activities and plans of the ISA99 committee is available on the committee Wiki site ([9])

ISA Security Compliance Institute
Related to the work of ISA 99 is the work of the ISA Security Compliance Institute [14]. The ISA Security Compliance Institute (ISCI) has developed compliance test specifications for ISA99 and other control system security standards. They have also created an ANSI [15] accredited certification program called ISASecure for the certification of industrial automation devices such as programmable logic controllers (PLC), distributed control systems (DCS) and safety instrumented systems (SIS). These types of devices provide automated control of

industrial processes such as those found in the oil & gas, chemical, electric utility, manufacturing, food & beverage and water/wastewater processing industries. There is growing concern from both governments as well as private industry regarding the risk that these systems could be intentionally compromised by "evildoers" such as hackers, disgruntled employees, organized criminals, terrorist organizations or even state-sponsored groups. The recent news about the industrial control system malware known as Stuxnet has heightened concerns about the vulnerability of these systems.

Cyber security standards 29

References
1. Department of Homeland Security. A Comparison of Cyber Security Standards Developed by the Oil and Gas Segment. (November 5, 2004)
2. Guttman, M., Swanson, M., National Institute of Standards and Technology; Technology Administration; U.S. Department of Commerce. Generally Accepted Principles and Practices for Securing Information Technology Systems (800-14). (September 1996)
3. National Institute of Standards and Technology; Technology Administration; U.S. Department of Commerce. An Introduction to Computer Security: The NIST Handbook, Special Publication 800-12.
4. Swanson, M., National Institute of Standards and Technology; Technology Administration; U.S. Department of Commerce. Security Self-Assessment Guide for Information Technology Systems (800-26).
5. The North America Electric Reliability (NERC). http://www.nerc.com. Retrieved November 12, 2005.

External links
• [1]
• [2]
• NEWS about ISO 27002 [3]
• BS 7799 certification [4]
• ISO webpage [5]
• BSI website [6]
• NERC Standards (see CIP 002-009) [11]
• NIST webpage [7]
• The Information Security Forum (ISF) [8]
• The Standard of Good Practice (SoGP) [9]
• CYBER-ATTACKS! Trends in US Corporations [10]
• Securing Cyberspace-Media [11]
• Presentation by Professor William Sanders, University of Illinois [12]
• Carnegie Mellon University Portal for Cyber Security [13]
• Critical Infrastructure Protection [14]
• Cybertelecom :: Security - Surveying federal cyber security work [15]
• Global Cybersecurity Policy Conference [16]
• The Repository of Industrial Security Incidents [17]
• Rsam: Standards Based IT GRC Management Platform [18]

Yahoo News Canada. com/ products_iso. frontpage [6] http:/ / www. cs. Alex (March 30.Simple Google search quickly finds link to software for Ghost Rat program used to target governments (http://www. securityforum.But government officials choose words carefully. Ontario. rsam. org/ [18] http:/ / www. securityincidents. org/ iso/ en/ ISOOnline. org [9] http:/ / www. edu/ DCS/ 2007-08/ ETSI-2008-02-27. retrieved 2009-03-31 • Chinese-based cyber spy network exposes need for better security: Cdn researchers (http://ca. html [5] http:/ / www. governments and enemies for personal.news. sensitive. It may wholly be perpetrated online from computer desks of professionals on bases in far away countries or may involve infiltration at home by computer trained conventional spies and moles or in other cases may be the criminal handiwork of amateur malicious hackers and software programmers.cms). com/ index. NATO (http://www.thestar. net [4] http:/ / www. itmanagementnews. Canada.Cyber security standards 30 References [1] http:/ / www.thestar. cybertelecom. isfsecuritystandard. retrieved 2009-03-31 . retrieved 2009-04-04 • All about Chinese cyber spying (http://infotech.com/quickiearticleshow/4334292.com/News/ World/article/611481). networks or individual computers through the use of cracking techniques and malicious software including Trojan horses and spyware. infotech. Detail& HearingID=261 [12] http:/ / media. stevens. isasecure. 2009. org/ isa99 [2] http:/ / www. com/ [14] https:/ / inlportal. 2009). Cathal (Mar 31. retrieved 2009-04-01 • Cooper. Toronto investigator helped expose hacking of embassies. molemag. cfm?Fuseaction=Hearings. bsi-global. senate. bizforum. Cyberspies' code a click away . asx [13] https:/ / www.thestar.com (Times of India). Ontario. groups. gov/ portal/ server. org/ security/ [16] http:/ / www. inl. com/s/capress/090330/national/computer_spying). from individuals.indiatimes.yahoo.com/news/canada/article/610329). Toronto. 
edu/ cyberpolicy/ [17] http:/ / www. Canada. March 30. com [10] http:/ / www. retrieved 2009-04-04 • Kelly. htm [11] http:/ / hsgac. Ontario. Cyber spying typically involves the use of such illegally gained access to secrets and classified information or illegally gained control of individual computers or whole networks for an unethical and illegal strategic advantage and for psychological. References • Bill Schiller. economic.indiatimes. sleuth says. gov/ index. nist.com/News/World/Article/610860). htm/ Cyber spying Cyber spying or Cyber espionage is the act or practice of obtaining secrets without the permission of the holder of the information (personal. rivals. iso. mysecurecyberspace. ASIA BUREAU (Apr 01. political and physical subversion activities and sabotage. political or military advantage using illegal exploitation methods on the Internet. Toronto. March 30. com/ itmanagementnews-54-20040224BS7799CompliancyandCertification. Toronto. Chinese ridicule U of T spy report . competitors. org [3] http:/ / www. 2009. isa. gov [8] http:/ / www. 2009). org/ whitepapers/ rand001. We can lead in cyber spy war. Canada. never denying country engages in cyber-espionage (http://www. proprietary or of classified nature). xalter [7] http:/ / www. uiuc. pt?open=514& objID=1275& parentname=CommunityPage& parentid=5& mode=2& in_hi_userid=200& cached=true [15] http:/ / www. 2009).

• Steve Herman (30 March 2009). "Exiled Tibetan Government Expresses Concern over Cyber-Spying Traced to China" (http://www.globalsecurity.org/intell/library/news/2009/intell-090330-voa01.html). New Delhi: GlobalSecurity.org.
• Patrick Goodenough, International Editor (March 30, 2009). "China Rejects Cyber Spying Allegations, 'Dalai Lama Propaganda'" (http://www.cnsnews.com/public/content/article.aspx?RsrcID=45797). CNSNews.com. Retrieved 2009-03-31.
• "Chinese government accused of cyber spying" (http://www.belfasttelegraph.co.uk/news/world-news/chinese-government-accused-of-cyber-spying-14248347.html). Belfast Telegraph. 30 March 2009.
• Harvey, Mike (March 29, 2009). "'World's biggest cyber spy network' snoops on classified documents in 103 countries" (http://www.timesonline.co.uk/tol/news/uk/crime/article5996253.ece). The Times (London). Retrieved 2009-03-30.
• Kim Covert (March 28, 2009). "Canadian researchers uncover vast Chinese cyber spy network" (http://www.nationalpost.com/story.html?id=1440426). National Post (Canwest News Service), Don Mills, Ontario, Canada. Retrieved 2009-03-30.
• "Major cyber spy network uncovered" (http://news.bbc.co.uk/2/hi/americas/7970471.stm). BBC News. 29 March 2009. Retrieved 2009-03-30.
• "Cyber spy network 'smoking gun' for China: expert" (http://www.ctv.ca/servlet/ArticleNews/story/CTVNews/20090329/China_Hackers_090329/20090329?hub). CTV News. March 29, 2009. Retrieved 2009-03-30.
• "US warned of China 'cyber-spying'" (http://news.bbc.co.uk/2/hi/asia-pacific/7740483.stm). BBC News. 20 November 2008.
• "German court limits cyber spying" (http://news.bbc.co.uk/2/hi/europe/7266543.stm). BBC News. 27 February 2008.
• Rowan Callick; Jane Macartney (December 7, 2007). "Chinese fury at cyber spy claims" (http://www.theaustralian.news.com.au/story/0,25197,22882854-2703,00.html). The Australian.
• Mark Hosenball (June 2, 2008). "INTELLIGENCE - Cyber-Spying for Dummies" (http://www.newsweek.com/id/138520). Newsweek. Retrieved 2009-04-01.
• Walton, Gregory (April 2008). "Year of the Gh0st RAT" (http://www.beijing2008conference.com/articles.php?id=101). World Association of Newspapers. Retrieved 2009-04-01.

External links
• Congress to Investigate Google Charges Of Chinese Internet Spying (AHN) (http://www.allheadlinenews.com/articles/7017511426)
• Information Warfare Monitor - Tracking Cyberpower (University of Toronto, Canada/Munk Centre) (http://infowar-monitor.net/index.php)
• Twitter: InfowarMonitor (http://twitter.com/InfowarMonitor)
• Spy software for cyber spies (http://www.SnoopPal.com)

Cyber-security regulation

In the United States government, cyber-security regulation comprises directives from the Executive Branch and legislation from Congress that safeguard information technology and computer systems. The purpose of cyber-security regulation is to force companies and organizations to protect their systems and information from cyber-attacks. Cyber-attacks include viruses, worms, Trojan horses, phishing, denial of service (DOS) attacks, unauthorized access (stealing intellectual property or confidential information) and control system attacks.[howstuffworks] There are numerous measures available to prevent cyber-attacks. Cyber-security measures include firewalls, anti-virus software, intrusion detection and prevention systems, encryption and login passwords.

Reasons for cyber-security
The United States government believes the security of computer systems is important to the world for two reasons. The increased role of Information Technology (IT) and the growth of the e-commerce sector have made cyber-security essential to the economy. Also, cyber-security is vital to the operation of safety-critical systems, such as emergency response, and to the protection of infrastructure systems, such as the national power grid.[gordon] Federal and state governments in the United States have attempted to improve cyber-security through regulation and collaborative efforts between government and the private-sector to encourage voluntary improvements to cyber-security.

Federal government regulation
There are few federal cyber-security regulations, and the ones that exist focus on specific industries. The three main cyber-security regulations are the 1996 Health Insurance Portability and Accountability Act, the 1999 Gramm-Leach-Bliley Act and the 2002 Homeland Security Act, which included the Federal Information Security Management Act (FISMA).[heiman] These three regulations mandate that healthcare organizations, financial institutions and federal agencies protect their systems and information.[heiman] For example, FISMA, which applies to every government agency, "requires the development and implementation of mandatory policies, principles, standards, and guidelines on information security."[heiman] But these regulations do not address numerous computer-related industries, such as Internet Service Providers (ISPs) and software companies.[heiman] Furthermore, these regulations do not specify what cyber-security measures must be implemented and require only a "reasonable" level of security. The vague language of these regulations leaves much room for interpretation. Bruce Schneier, founder of Cupertino's Counterpane Internet Security, argues that companies will not make sufficient investments in cyber-security unless government forces them to do so.[lemos] He also states that successful cyber-attacks on government systems still occur despite government efforts.[kirby]

State government regulation
State governments have attempted to improve cyber-security by increasing public visibility of firms with weak security.[lemos] In 2003, California passed the Notice of Security Breach Act, which requires that any company that maintains personal information of California citizens and has a security breach must disclose the details of the event.[privacy] Personal information includes name, social security number, driver's license number, credit card number or financial information.[privacy] Several other states have followed California's example and passed similar security breach notification regulations.[privacyrights] These security breach notification regulations punish firms for their cyber-security failures while giving them the freedom to choose how to secure their systems. Also, this regulation creates an incentive for companies to voluntarily invest in cyber-security to avoid the potential loss of reputation and the resulting economic loss that can come from a successful cyber-attack.

In 2004, California passed California Assembly Bill 1950, which also applies to businesses that own or maintain personal information for California residents.[rasmussen] This regulation dictates that businesses maintain a reasonable level of security and that these required security practices also extend to business partners.[rasmussen] This regulation is an improvement on the federal standard because it expands the number of firms required to maintain an acceptable standard of cyber-security.[rasmussen] However, like the federal legislation, it requires a "reasonable" level of cyber-security, which leaves much room for interpretation until case law is established.

Proposed regulation
The U.S. Congress has proposed numerous bills that expand upon cyber-security regulation. The Consumer Data Security and Notification Act amends the Gramm-Leach-Bliley Act to require disclosure of security breaches by financial institutions. Congressmen have also proposed "expanding Gramm-Leach-Bliley to all industries that touch consumer financial information, including any firm that accepts payment by a credit card."[epic] Congress has proposed cyber-security regulations similar to California's Notice of Security Breach Act for companies that maintain personal information.[schmitt] The Information Protection and Security Act requires that data brokers "ensure data accuracy and confidentiality, authenticate and track users, detect and prevent unauthorized activity, and mitigate potential harm to individuals."[epic]
In addition to requiring companies to improve cyber-security, Congress is also considering bills that criminalize cyber-attacks. The Securely Protect Yourself Against Cyber Trespass Act (SPY ACT) is a bill of this type. It focuses on phishing and spyware; it was passed on May 23, 2005 in the United States House of Representatives and is currently in committee in the Senate. This bill "makes unlawful the unauthorized usage of a computer to take control of it, modify its setting, collect or induce the owner to disclose personally identifiable information, install unsolicited software, and tamper with security, anti-spyware, or anti-virus software."[epic]

Other government efforts
In addition to regulation, the federal government has tried to improve cyber-security by allocating more resources to research and collaborating with the private-sector to write standards. In 2003, the President's National Strategy to Secure Cyberspace made the Department of Homeland Security (DHS) responsible for security recommendations and researching national solutions.[heiman] The plan calls for cooperative efforts between government and industry "to create an emergency response system to cyber-attacks and to reduce the nation's vulnerability to such threats."[whitehouse] In 2004, Congress allocated $4.7 billion toward cyber-security and achieving many of the goals stated in the President's National Strategy to Secure Cyberspace.[lemos] Some industry security experts state that the President's National Strategy to Secure Cyberspace is a good first step but is insufficient. Bruce Schneier stated that "The National Strategy to Secure Cyberspace hasn't secured anything yet."[epic] However, the President's National Strategy clearly states that the purpose is to provide a framework for the owners of computer systems to improve their security, rather than the government taking over and solving the problem. Yet companies that participate in the collaborative efforts outlined in the strategy are not required to adopt the discovered security solutions.

Pro-regulation opinions
While experts agree that cyber-security improvements are necessary, there is disagreement about whether the solution is more government regulation or more private-sector innovation. Many government officials and cyber-security experts believe that the private-sector has failed to solve the cyber-security problem and that regulation is needed. Richard Clarke states that "industry only responds when you threaten regulation. If industry doesn't respond [to the threat], you have to follow through."[kirby] He believes that software companies must be forced to produce more secure programs.[pbs] Bruce Schneier also supports regulation that encourages software companies to write more secure code through economic incentives.[rasmussen] U.S. Rep. Rick Boucher (D-VA) proposes improving cyber-security by making software companies liable for security flaws in their code.[free2innovate] In addition to improving software security, Clarke believes that certain industries, such as utilities and ISPs, require regulation.[free2innovate]

Anti-regulation opinions
On the other hand, many private-sector executives believe that more regulation will restrict their ability to improve cyber-security. Harris Miller, president of the Information Technology Association of America, believes that regulation inhibits innovation.[free2innovate] Rick White, President and CEO of TechNet, also opposes more regulation. He states that "the private-sector must continue to be able to innovate and adapt in response to new attack methods in cyber space, and toward that end, we commend President Bush and the Congress for exercising regulatory restraint."[free2innovate] Another reason many private-sector executives oppose regulation is that it is costly.[menn] Firms are just as concerned about regulation reducing profits as they are about regulation limiting their flexibility to solve the cyber-security problem efficiently.

References
1. "A chronology of data breaches reported since the ChoicePoint incident" [1]. (2005). Retrieved October 13, 2005.
2. "Electronic privacy information center bill track: Tracking privacy, speech and civil liberties in the 109th congress" [2]. (2005). Retrieved October 23, 2005.
3. "How computer viruses work" [3]. (2005). Retrieved October 10, 2005.
4. "The National Strategy to Secure Cyberspace" [4]. (2003). Washington, D.C. Retrieved December 4, 2005.
5. "Notice of security breach - civil code sections 1798.29 and 1798.82-1798.84" [5]. (2003). Retrieved October 23, 2005.
6. "Richard Clarke interview" [6]. (2003). Retrieved December 4, 2005.
7. Gordon, L., Loeb, M., Lucyshyn, W. & Richardson, R. (2005). "2005 CSI/FBI computer crime and security survey" [7]. Retrieved October 10, 2005.
8. Heiman, B. J. (2003). Cybersecurity regulation is here. RSA security conference, Washington, D.C. Retrieved October 17, 2005.
9. Kirby, C. (2003, December 4). "Forum focuses on cyber-security". San Francisco Chronicle.
10. Lemos, R. (2003). "Bush unveils final cybersecurity plan" [8]. Retrieved December 4, 2005.
11. Menn, J. (2002, January 14). "Security flaws may be pitfall for Microsoft". Los Angeles Times, p. C1. Retrieved October 31, 2005.
12. Rasmussen, M. & Brown, A. (2004). "California Law Establishes Duty of Care for Information Security" [9]. Retrieved October 31, 2005.
13. Schmitt, E., Charron, C., Anderson, E. & Joseph, J. (2004). "What Proposed Data Laws Will Mean for Marketers" [9]. Retrieved December 14, 2005.

External links
• Microsoft cybersecurity for government [10]

References
[1] http://www.privacyrights.org/ar/ChronDataBreaches.htm
[2] http://www.epic.org/privacy/bill_track.html
[3] http://www.howstuffworks.com/virus.htm
[4] http://georgewbush-whitehouse.archives.gov/pcipb/cyberspace_strategy.pdf
[5] http://www.privacy.ca.gov/code/cc1798.htm
[6] http://www.pbs.org/wgbh/pages/frontline/shows/cyberwar/interviews/clarke.html
[7] http://www.usdoj.gov/criminal/cybercrime/FBI2005.pdf
[8] http://news.zdnet.com/2100-1009_22-984697.html
[9] http://www.forrester.com/Research/Document/0,7211,35913,00.html
[10] http://www.microsoft.com/industry/government/guides/security/default.aspx

Defense in depth (computing)

Defense in depth is an information assurance (IA) strategy in which multiple layers of defense are placed throughout an information technology (IT) system. It addresses security vulnerabilities in personnel, technology and operations for the duration of the system's life cycle.[1] [2] The concept was conceived by the National Security Agency (NSA) as a comprehensive approach to information and electronic security. Defense in depth was originally a military strategy that seeks to delay, rather than prevent, the advance of an attacker by yielding space in order to buy time. In terms of computer network defense, defense in depth measures should not only prevent security breaches but also buy an organization time to detect and respond to an attack, thereby reducing and mitigating the consequences of a breach.

Background
The idea behind the defense in depth approach is to defend a system against any particular attack using several varying methods. It is a layering tactic. The placement of protection mechanisms, procedures and policies is intended to increase the dependability of an IT system, where multiple layers of defense prevent espionage and direct attacks against critical systems.

Examples
Using more than one of the following layers constitutes defense in depth:
• Physical security (e.g. deadbolt locks)
• Authentication and password security
• Hashing passwords
• Anti-virus software
• Firewalls (hardware or software)
• DMZ (demilitarized zones)
• IDS (intrusion detection systems)
• Packet filters
• VPN (virtual private networks)
• Logging and auditing
• Biometrics
• Timed access control

• Software/hardware not available to the public (but see also security through obscurity)

References
[1] Defense in Depth: A practical strategy for achieving Information Assurance in today's highly networked environments. (http://www.nsa.gov/ia/_files/support/defenseindepth.pdf)
[2] OWASP Wiki: Defense in depth (http://www.owasp.org/index.php/Defense_in_depth)

Department of Defense Information Assurance Certification and Accreditation Process

The DoD Information Assurance Certification and Accreditation Process (DIACAP) is the United States Department of Defense (DoD) process to ensure that risk management is applied on information systems (IS). DIACAP defines a DoD-wide formal and standard set of activities, general tasks and a management structure process for the certification and accreditation (C&A) of a DoD IS that will maintain the information assurance (IA) posture throughout the system's life cycle.

History
DIACAP is the result of an NSA-directed shift in the underlying security paradigm and succeeds its predecessor, DITSCAP. An interim version of the DIACAP was signed July 6, 2006 and superseded DITSCAP. The final version is titled Department of Defense Instruction 8510.01 and was signed on November 28, 2007. It supersedes the Interim DIACAP Guidance. One major change in DIACAP from DITSCAP is the embracing of the idea of information assurance controls (defined in DoDD 8500.1 and DoDI 8500.2) as the primary set of security requirements for all automated information systems (AISs). The IA Controls are determined based on the system's mission assurance category (MAC) and confidentiality level (CL).

Process
• System Identification Profile
• DIACAP Implementation Plan
• Validation
• Certification Determination
• DIACAP Scorecard
• POA&M
• Approval to Operate Decision

References
• DIACAP Knowledge Service [1] (requires DoD PKI certificate)
• DIACAP Guidance at the DoD Information Assurance Support Environment [2]
• Full list of DIACAP Phases, with instructions, at GovITwiki [3]
• Department of Defense Instruction 8510.01: DoD Information Assurance Certification and Accreditation Process [4]
• Department of Defense Directive 8500.1: Information Assurance (IA) [5]
• Department of Defense Instruction 8500.2: Information Assurance (IA) Implementation [6]

[1] https://diacap.iaportal.navy.mil/
[2] http://iase.disa.mil/diacap/
[3] http://govitwiki.com/wiki/Defense_Information_Assurance_Certifications_and_Accreditation_Process_(DIACAP)
[4] http://www.dtic.mil/whs/directives/corres/pdf/851001p.pdf
[5] http://www.dtic.mil/whs/directives/corres/pdf/850001p.pdf
[6] http://www.dtic.mil/whs/directives/corres/pdf/850002p.pdf

Department of Defense Information Technology Security Certification and Accreditation Process

The Department of Defense Information Assurance Certification and Accreditation Process (DIACAP) is a process defined by the United States Department of Defense (DoD) for managing risk. DIACAP replaced the former process, known as DITSCAP (Department of Defense Information Technology Security Certification and Accreditation Process), in 2006. A similar methodology, NIACAP, is used for the certification and accreditation (C&A) of national security systems outside of the DoD. DIACAP applies to the acquisition, operation and sustainment of any DoD system that collects, stores, transmits, or processes unclassified or classified information since December 1997. The DIACAP process was refined by the publication of the DIACAP Application Manual.

DoD Instruction (DoDI) 8510.01 establishes a standard DoD-wide process with a set of activities, general tasks and a management structure to certify and accredit an Automated Information System (AIS) that will maintain the Information Assurance (IA) posture of the Defense Information Infrastructure (DII) throughout the system's life cycle. It identifies four phases:
1. System Definition
2. Verification
3. Validation
4. Re-Accreditation
DIACAP also uses weighted metrics to describe risks and their mitigation.
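The text above notes that DIACAP uses weighted metrics to describe risks and their mitigation; the actual scoring rules are defined in DoD guidance. Purely as a generic, hypothetical illustration of how weighted risk metrics work, a finding's severity and likelihood can be combined into a single residual score and used to rank a scorecard (all names and weights below are invented for the example):

```python
# Hypothetical illustration of weighted risk metrics; the scales and
# field names are invented for this sketch, not taken from DIACAP.

def risk_score(severity, likelihood, mitigation=0.0):
    """Combine a 1-5 severity and 1-5 likelihood into a 0-25 score,
    reduced by the fraction of the risk already mitigated (0.0-1.0)."""
    if not (1 <= severity <= 5 and 1 <= likelihood <= 5):
        raise ValueError("severity and likelihood must be in 1..5")
    return severity * likelihood * (1.0 - mitigation)

def scorecard(findings):
    """Return findings sorted from highest to lowest residual risk."""
    return sorted(
        findings,
        key=lambda f: risk_score(f["severity"], f["likelihood"],
                                 f.get("mitigation", 0.0)),
        reverse=True,
    )
```

A fully mitigated finding (mitigation=1.0) scores zero, so it naturally drops to the bottom of the scorecard.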

References
• DIACAP website [2]
• DoD 8510.1-M, DITSCAP Application Manual, July 31, 2000 [1]

Cancelled Reference
• DoDI 5200.40, December 30, 1997 [2]

[1] http://www.dtic.mil/whs/directives/corres/html/851001m.htm
[2] http://www.dla.mil/j-6/dlmso/eLibrary/Documents/PKI/i520040.pdf

Differentiated security

Differentiated security is a form of computer security that deploys a range of different security policies and mechanisms according to the identity and context of a user or transaction. One way of achieving this is by subdividing the population into small differentiated clusters; at the extreme, each individual belongs to a different class. This makes it much more difficult to scale or replicate attacks, since each cluster/individual has a different security profile and there should be no common weaknesses.

External links
• Differentiated security in wireless networks [1], Andreas Johnsson

References
[1] http://www.it.uu.se/research/reports/2002-015/2002-015-nc.pdf
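The clustering idea above can be sketched in a few lines: each user is mapped deterministically to one of many small clusters, and each cluster carries its own security profile, so an attack tuned against one profile does not transfer to the rest of the population. The cluster count and policy fields below are invented for illustration:

```python
import hashlib

# Illustrative per-cluster security profiles (invented for this sketch).
POLICIES = [
    {"session_timeout_min": 10, "mfa": True},
    {"session_timeout_min": 15, "mfa": False},
    {"session_timeout_min": 5,  "mfa": True},
    {"session_timeout_min": 30, "mfa": False},
]

def cluster_for(user_id: str, n_clusters: int = len(POLICIES)) -> int:
    """Deterministically map a user to a cluster via a stable hash,
    so the same user always lands in the same cluster."""
    digest = hashlib.sha256(user_id.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % n_clusters

def policy_for(user_id: str) -> dict:
    """Look up the security profile for the user's cluster."""
    return POLICIES[cluster_for(user_id)]
```

Because the mapping is a stable hash rather than a stored attribute, no extra per-user state is needed, yet neighboring accounts generally end up with different profiles.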

DShield

DShield is a community-based collaborative firewall log correlation system. It receives logs from volunteers world wide and uses them to analyze attack trends. It is used as the data collection engine behind the SANS Internet Storm Center (ISC). It was officially launched at the end of November 2000 by Johannes Ullrich. Since then, it has grown to be a dominating attack correlation engine with worldwide coverage.

The goal of the DShield project is to allow access to its correlated information to the public at no charge to raise awareness and provide accurate and current snapshots of internet attacks. Several data feeds are provided to users to either include in their own web sites or to use as an aide to analyze events. Analysis provided by DShield has been used in the early detection of several worms, like "Ramen", Code Red, "Leaves", "SQL Snake" and more. DShield data is regularly used by researchers to analyze attack patterns. DShield is regularly used by the media to cover current events.

External links
• DShield Homepage [1]
• Internet Storm Center [2]
• How to Participate in DShield [3]

References
[1] http://www.dshield.org
[2] http://isc.sans.org
[3] http://www.dshield.org/howto.html

Dynablock

Dynablock is a name which was used by Easynet from 2001 to 2003 for their Dialup Users List, a DNSBL of Internet addresses that appeared to be assigned dynamically, i.e. to dialup and residential broadband users. Updates of Dynablock stopped December 2003, but it became the basis for NJABL's and SORBS' own dynamic IP lists, with NJABL using the 'dynablock' name for their list. The dynamic list parts of NJABL and SORBS have been developed independently since then. In early 2007, NJABL passed their data along to The Spamhaus Project [1] for use in their PBL [2] service.

References
[1] http://www.spamhaus.org
[2] http://www.spamhaus.org/pbl
• http://www.njabl.org/dynablock.html
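DNSBLs such as the dynamic-IP lists descended from Dynablock are queried over ordinary DNS: the client reverses the octets of an IPv4 address, appends the list's zone, and checks whether the resulting name resolves. A minimal sketch follows; the zone names used are only examples, and a real client should honor each list's usage policy before querying it:

```python
import socket

def dnsbl_name(ip: str, zone: str) -> str:
    """Build the DNSBL query name: reversed IPv4 octets + list zone."""
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError("expected an IPv4 dotted quad")
    return ".".join(reversed(octets)) + "." + zone

def is_listed(ip: str, zone: str) -> bool:
    """True if the DNSBL returns an A record for the address,
    i.e. the address is on the list (requires network access)."""
    try:
        socket.gethostbyname(dnsbl_name(ip, zone))
        return True
    except socket.gaierror:
        return False
```

For example, checking 192.0.2.1 against a hypothetical zone dnsbl.example.org queries the name 1.2.0.192.dnsbl.example.org; an answer means "listed", NXDOMAIN means "not listed".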

Enterprise Privacy Authorization Language

Enterprise Privacy Authorization Language (EPAL) is a formal language for writing enterprise privacy policies to govern data handling practices in IT systems according to fine-grained positive and negative authorization rights. It has been submitted by IBM to the World Wide Web Consortium (W3C) to be considered for recommendation.

References
• EPAL 1.2 [1], submission to the W3C, 10 Nov 2003
• Technology Report on EPAL [2] from OASIS
• A Comparison of Two Privacy Policy Languages: EPAL and XACML [3], by Anne Anderson, Sun Microsystems Laboratories

[1] http://www.w3.org/Submission/2003/SUBM-EPAL-20031110/
[2] http://xml.coverpages.org/epal.html
[3] http://research.sun.com/techrep/2005/smli_tr-2005-147/TRCompareEPALandXACML.html

Evaluation Assurance Level

The Evaluation Assurance Level (EAL1 through EAL7) of an IT product or system is a numerical grade assigned following the completion of a Common Criteria security evaluation, an international standard in effect since 1999. The increasing assurance levels reflect added assurance requirements that must be met to achieve Common Criteria certification. The intent of the higher levels is to provide higher confidence that the system's principal security features are reliably implemented. The EAL level does not measure the security of the system itself; it simply states at what level the system was tested.

To achieve a particular EAL, the computer system must meet specific assurance requirements. Most of these requirements involve design documentation, design analysis, functional testing, or penetration testing. The higher EALs involve more detailed documentation, analysis, and testing than the lower ones. Achieving a higher EAL certification generally costs more money and takes more time than achieving a lower one. The EAL number assigned to a certified system indicates that the system completed all requirements for that level.

Although every product and system must fulfill the same assurance requirements to achieve a particular level, they do not have to fulfill the same functional requirements. The functional features for each certified product are established in the Security Target document tailored for that product's evaluation. Therefore, a product with a higher EAL is not necessarily "more secure" in a particular application than one with a lower EAL, since they may have very different lists of functional features in their Security Targets. A product's fitness for a particular security application depends on how well the features listed in the product's Security Target fulfill the application's security requirements. If the Security Targets for two products both contain the necessary security features, then the higher EAL should indicate the more trustworthy product for that application.

Assurance levels

EAL1: Functionally Tested
EAL1 is applicable where some confidence in correct operation is required, but the threats to security are not viewed as serious. It will be of value where independent assurance is required to support the contention that due care has been exercised with respect to the protection of personal or similar information. EAL1 provides an evaluation of the TOE (Target of Evaluation) as made available to the customer, including independent testing against a specification and an examination of the guidance documentation provided. It is intended that an EAL1 evaluation could be successfully conducted without assistance from the developer of the TOE, and for minimal cost. An evaluation at this level should provide evidence that the TOE functions in a manner consistent with its documentation, and that it provides useful protection against identified threats.

EAL2: Structurally Tested
EAL2 requires the cooperation of the developer in terms of the delivery of design information and test results, but should not demand more effort on the part of the developer than is consistent with good commercial practice. As such it should not require a substantially increased investment of cost or time. EAL2 is therefore applicable in those circumstances where developers or users require a low to moderate level of independently assured security in the absence of ready availability of the complete development record. Such a situation may arise when securing legacy systems.

EAL3: Methodically Tested and Checked
EAL3 permits a conscientious developer to gain maximum assurance from positive security engineering at the design stage without substantial alteration of existing sound development practices. EAL3 is applicable in those circumstances where developers or users require a moderate level of independently assured security, and require a thorough investigation of the TOE and its development without substantial re-engineering.

EAL4: Methodically Designed, Tested, and Reviewed
EAL4 permits a developer to gain maximum assurance from positive security engineering based on good commercial development practices which, though rigorous, do not require substantial specialist knowledge, skills, and other resources. EAL4 is the highest level at which it is likely to be economically feasible to retrofit to an existing product line. EAL4 is therefore applicable in those circumstances where developers or users require a moderate to high level of independently assured security in conventional commodity TOEs and are prepared to incur additional security-specific engineering costs.

Commercial operating systems that provide conventional, user-based security features are typically evaluated at EAL4. Examples of such operating systems are AIX,[1] HP-UX,[1] FreeBSD,[1] Novell NetWare,[1] Solaris,[1] SUSE Linux Enterprise Server 9,[2] SUSE Linux Enterprise Server 10,[3] Red Hat Enterprise Linux 5,[4] Windows 2000 Service Pack 3,[1] Windows 2003,[1] [5] Windows XP,[1] [5] Windows 7,[1] [6] and Windows Server 2008 R2.[1] [6]

Operating systems that provide multilevel security are evaluated at a minimum of EAL4. Examples include Trusted Solaris, Solaris 10 Release 11/06 Trusted Extensions,[7] an early version of the XTS-400, and VMware ESXi version 3.5[8] and 4.0 (EAL 4+).

EAL5: Semiformally Designed and Tested
EAL5 permits a developer to gain maximum assurance from security engineering based upon rigorous commercial development practices supported by moderate application of specialist security engineering techniques. Such a TOE will probably be designed and developed with the intent of achieving EAL5 assurance. It is likely that the additional costs attributable to the EAL5 requirements, relative to rigorous development without the application of specialized techniques, will not be large. EAL5 is therefore applicable in those circumstances where developers or users require a high level of independently assured security in a planned development and require a rigorous development approach without incurring unreasonable costs attributable to specialist security engineering techniques. Numerous smart card devices have been evaluated at EAL5, as have multilevel secure devices such as the Tenix Interactive Link. XTS-400 (STOP 6) is a general-purpose operating system which has been evaluated at EAL5 augmented. LPAR on IBM System z is EAL5 certified.[9]

EAL6: Semiformally Verified Design and Tested
EAL6 permits developers to gain high assurance from application of security engineering techniques to a rigorous development environment in order to produce a premium TOE for protecting high value assets against significant risks. EAL6 is therefore applicable to the development of security TOEs for application in high risk situations where the value of the protected assets justifies the additional costs.

EAL7: Formally Verified Design and Tested
EAL7 is applicable to the development of security TOEs for application in extremely high risk situations and/or where the high value of the assets justifies the higher costs. Practical application of EAL7 is currently limited to TOEs with tightly focused security functionality that is amenable to extensive formal analysis. The Tenix Interactive Link Data Diode Device and the Fox Data Diode[10] have been evaluated at EAL7 augmented. Open Kernel Labs has also performed formal verification of their seL4 microkernel OS,[11] allowing devices running seL4 to achieve EAL7.[12]

Implications of assurance levels
Technically speaking, a higher EAL means nothing more, or less, than that the evaluation completed a more stringent set of quality assurance requirements. It is often assumed that a system that achieves a higher EAL will provide its security features more reliably (and the required third-party analysis and testing performed by security experts is reasonable evidence in this direction), but there is little or no published evidence to support that assumption.

Impact on cost and schedule
In 2006, the US Government Accountability Office published a report on Common Criteria evaluations that summarized a range of costs and schedules reported for evaluations performed at levels EAL2 through EAL4. In the mid to late 1990s, vendors reported spending US$1 million and even US$2.5 million on evaluations comparable to EAL4. There have been no published reports of the cost of the various Microsoft Windows security evaluations.

[Figure: Range of completion times and costs for Common Criteria evaluations at EAL2 through EAL4.]

Augmentation of EAL requirements
In some cases, the evaluation may be augmented to include assurance requirements beyond the minimum required for a particular EAL. Officially this is indicated by following the EAL number with the word augmented and usually with a list of codes to indicate the additional requirements. As shorthand, vendors will often simply add a "plus" sign (as in EAL4+) to indicate the augmented requirements.

EAL notation
The Common Criteria standards denote EALs as shown in this article: the prefix "EAL" concatenated with a digit 1 through 7 (examples: EAL1, EAL3, EAL5). In practice, some countries place a space between the prefix and the digit (EAL 1, EAL 3, EAL 5). The use of a plus sign to indicate augmentation is an informal shorthand used by product vendors (EAL4+ or EAL 4+).

References
[1] Common Criteria certified product list (http://www.commoncriteriaportal.org/products_OS.html#OS)
[2] Certification Report for SUSE Linux Enterprise Server 9 (http://www.commoncriteriaportal.org/files/epfiles/0256a.pdf)
[3] SUSE Linux Enterprise Server 10 EAL4 Certificate (http://www.niap-ccevs.org/cc-scheme/st/?vid=10271)
[4] Red Hat Enterprise Linux Version 5 EAL4 Certificate (http://www.niap-ccevs.org/cc-scheme/st/?vid=10125)
[5] Windows Platform Products Awarded Common Criteria EAL 4 Certification (http://www.microsoft.com/presspass/press/2005/dec05/12-14CommonCriteriaPR.mspx#Microsoft)
[6] Microsoft Windows 7, Windows Server 2008 R2 and SQL Server 2008 SP2 Now Certified as Common Criteria Validated Products (http://technet.microsoft.com/en-us/library/dd229319.aspx)
[7] Solaris 10 Release 11/06 Trusted Extensions EAL 4+ Certification Report (http://www.sun.com/software/security/securitycert/docs/Solaris_10_TX_CR_v1.0_11_june_PDF.pdf)
[8] VMware Infrastructure Earns Security Certification for Stringent Government Standards (http://www.vmware.com/company/news/releases/common_criteria.html)
[9] IBM System z Security (http://www-03.ibm.com/systems/z/security/ccs_certification.html); IBM System z partitioning achieves highest certification (http://www-03.ibm.com/systems/z/security/certification.html)
[10] Fox Data Diode Certifications (http://www.datadiode.eu/technology/certifications)
[11] http://www.ok-labs.com/whitepapers/sample/sel4-formal-verification-of-an-os-kernel
[12] http://www.ok-labs.com/releases/release/ok-labs-and-galois-partner-in-original-research-for-ultra-secure-systems

External links
• GAO (March 2006) (PDF). INFORMATION ASSURANCE: National Partnership Offers Benefits, but Faces Considerable Challenges (http://www.gao.gov/new.items/d06392.pdf). Report GAO-06-392. United States Government Accountability Office. Retrieved 2008-11-19.
• Smith, Richard (October 2000). "Trends in Government Endorsed Security Product Evaluations" (http://csrc.nist.gov/nissc/2000/proceedings/papers/032.pdf). Proc. 20th National Information Systems Security Conference. Retrieved 2006-07-10.
• Charu Chaubal (February 2007) (PDF). Security Design of the VMware Infrastructure 3 Architecture (http://www.vmware.com/pdf/vi3_security_architecture_wp.pdf). VMware, Inc. 20070215 Item: WP-013-PRD-01-01.
• CCEVS Validated Products List (http://www.niap-ccevs.org/vpl/)
• Common Criteria Assurance Level information from IACS (http://www.cesg.gov.uk/site/iacs/index.cfm?menuSelected=1&displayPage=13)
• IBM AIX operating system certifications (http://www-03.ibm.com/servers/aix/products/aixos/certifications/)
• Microsoft Windows and the Common Criteria Certification (http://www.windowsecurity.com/articles/Windows-Common-Criteria-Certification-Part-I.html)
• SUSE Linux awarded government security cert (http://www.linuxsecurity.com/content/view/118374/65/)
• Understanding the Windows EAL4 Evaluation (http://web.archive.org/web/20060527063317/http://eros.cs.jhu.edu/~shap/NT-EAL4.html)
• XTS-400 information (http://www.baesystems.com/ProductsServices/bae_prod_csit_xts400.html)

Exit procedure

Exit procedure is a security term in computing that ensures that knowledge about a computer system remains more or less closed to the people with access to it. Exit procedures, as they are known, should be in place at every work and school location. An effective exit procedure consists of documented standard processes that are carried out for each person who has ceased employment, as well as measures to ensure that cessations are detected and reported so the processes will be completed.

When a person leaves the place where they worked or studied, they may leave behind logons with access to networks. They may also take with them knowledge of many kinds of passwords outside of the network, such as building security codes or banking passwords. It is important that steps are taken to disable or negate all of those access privileges when a person leaves, to ensure that security integrity is maintained. It is worth remembering that a person who is no longer working at a site is unable to detect the use of their old logon by another person.

An exit procedure will also cover other issues, such as the recovery of equipment, keys and credit cards. A person who has left a worksite should expect their privileges to be removed and for records to be kept showing they have returned property and keys. There is no reason for an adverse inference to be drawn when a standard exit procedure is carried out after a person has ceased employment.

Filesystem permissions

Most current file systems have methods of administering permissions or access rights to specific users and groups of users. These systems control the ability of the users affected to view or make changes to the contents of the filesystem.

Differences between operating systems

Unix-like and otherwise POSIX-compliant systems, including Linux-based systems and all Mac OS X versions, have a simple system for managing individual file permissions, which in this article are called "traditional Unix permissions". Most of these systems also support some kind of access control lists (ACLs), either proprietary (old HP-UX ACLs, for example), POSIX.1e ACLs (based on an early POSIX draft that was abandoned), or NFSv4 ACLs, which are part of the NFSv4 standard.

Linux supports POSIX.1e ACLs. There is experimental support for NFSv4 ACLs for the ext3 filesystem.[2] FreeBSD supports POSIX.1e ACLs on UFS and NFSv4 ACLs on ZFS; there is experimental support for NFSv4 ACLs for UFS.[4] Solaris ACL support depends on the filesystem being used: the older UFS filesystem supports POSIX.1e ACLs, while ZFS supports only NFSv4 ACLs.[3]

Mac OS X versions 10.3 ("Panther") and prior use POSIX-compliant permissions. Mac OS X, beginning with version 10.4 ("Tiger"), also supports the use of NFSv4 ACLs. These versions still support "traditional Unix permissions" as used in previous versions of Mac OS X, and the Apple Mac OS X Server version 10.4+ File Services Administration Manual recommends using only traditional Unix permissions if possible. Mac OS X also still supports the Mac OS Classic's "Protected" attribute.

DOS variants (including MS-DOS, Windows 95, Windows 98, and Windows Me) do not have permissions, only file attributes. There is a read-only attribute (R), which can be set or unset on a file by any user or program, and therefore does not prevent that user from changing or deleting the file. There is no permission in these systems which would keep a user from reading a file.

Classic Mac OSes are similar to DOS variants and DOS-based Windows: they do not support permissions, but only a "Protected" file attribute.

Microsoft Windows NT and its derivatives (including Windows 2000 and Windows XP) use access control lists (ACLs) to administer a more complex and varied set of permissions.[1]

OpenVMS (a.k.a. VMS) uses a permission scheme similar to that of Unix, but more complex. There are four categories (System, Owner, Group, and World) and four types of access permissions (Read, Write, Execute and Delete). The categories are not mutually disjoint: World includes Group, which in turn includes Owner. The System category independently includes system users (similar to superusers in Unix).[5]

The AmigaOS Filesystem, AmigaDOS, supports a relatively advanced permissions system for a single-user OS. In AmigaOS 1.x, files had Archive, Read, Write, Execute and Delete (collectively known as ARWED) permissions/flags. In AmigaOS 2.x and higher, additional Hold, Script, and Pure permissions/flags were added.

IBM z/OS implements file security via RACF (Resource Access Control Facility).[6]

Traditional Unix permissions

Permissions on Unix-like systems are managed in three distinct classes. These classes are known as user, group, and others. In effect, Unix permissions are a simplified form of access control lists (ACLs).

Classes

Files and directories are owned by a user. The owner determines the file's owner class, and distinct permissions apply to the owner. Files and directories are assigned a group, which defines the file's group class. Distinct permissions apply to members of the file's group; the owner does not need to be a member of the file's group. Users who are neither the owner nor a member of the group comprise a file's others class, and distinct permissions apply to others.

The effective permissions are determined based on the user's class. For example, the user who is the owner of the file will have the permissions given to the owner class, regardless of the permissions assigned to the group class or others class.

Permissions

There are three specific permissions on Unix-like systems that apply to each class:
• The read permission, which grants the ability to read a file. When set for a directory, this permission grants the ability to read the names of files in the directory (but not to find out any further information about them, such as contents, file type, size, ownership, or permissions).
• The write permission, which grants the ability to modify a file. When set for a directory, this permission grants the ability to modify entries in the directory. This includes creating files, deleting files, and renaming files.
• The execute permission, which grants the ability to execute a file. This permission must be set for executable binaries (for example, a compiled C++ program) or shell scripts (for example, a Perl program) in order to allow the operating system to run them. When set for a directory, this permission grants the ability to traverse its tree in order to access files or subdirectories, but not to see the content of files inside the directory (unless read is also set).

The effect of setting the permissions on a directory (rather than a file) is "one of the most frequently misunderstood file permission issues" (Hatch 2003). When a permission is not set, the rights it would grant are denied.

Unlike ACL-based systems, permissions on a Unix-like system are not inherited: files created within a directory will not necessarily have the same permissions as that directory. When a new file is created on a Unix-like system, its permissions are determined from the umask of the process that created it.
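The class-selection rule described above can be sketched in a few lines. This is an illustrative model only (the function and variable names are invented for this example, not a system API): exactly one class is chosen, and only that class's bit is tested.

```python
import stat

def may_read(uid, gids, st_uid, st_gid, mode):
    """Traditional Unix check: pick one class (owner, then group,
    then others) and test only that class's read bit."""
    if uid == st_uid:
        return bool(mode & stat.S_IRUSR)   # owner class applies, even if
    if st_gid in gids:                     # another class is more permissive
        return bool(mode & stat.S_IRGRP)
    return bool(mode & stat.S_IROTH)

# mode 0o040 is ----r----- : only the group class may read.
assert may_read(1000, [100], 1000, 100, 0o040) is False  # owner is denied
assert may_read(1001, [100], 1000, 100, 0o040) is True   # group member reads
assert may_read(1002, [200], 1000, 100, 0o040) is False  # others denied
```

Note the first assertion: the owner here is also a member of the file's group, yet cannot read, because the owner class is selected first and its bits alone decide the outcome.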

Notation of traditional Unix permissions

Symbolic notation

There are many ways by which Unix permission schemes are represented. The most common form is symbolic notation: a file-type character followed by three groups of three characters, where the first group shows what the owner can do, the second what the group members can do, and the third what other users can do.

The first character indicates the file type:
• - denotes a regular file
• d denotes a directory
• b denotes a block special file
• c denotes a character special file
• l denotes a symbolic link
• p denotes a named pipe
• s denotes a domain socket

Each class of permissions is represented by three characters. The first set of characters represents the user class, the second set represents the group class, and the third and final set represents the others class. Each of the three characters represents the read, write, and execute permissions respectively:
• r if the read bit is set, - if it is not.
• w if the write bit is set, - if it is not.
• x if the execute bit is set, - if it is not. In the execute position, s or t denote executable and setuid/setgid/sticky, while S or T denote setuid/setgid or sticky without execute.

The following are some examples of symbolic notation:
• -rwxr-xr-x for a regular file whose user class has full permissions and whose group and others classes have only the read and execute permissions.
• crw-rw-r-- for a character special file whose user and group classes have the read and write permissions and whose others class has only the read permission.
• dr-x------ for a directory whose user class has read and execute permissions and whose group and others classes have no permissions.

Octal notation

Another common method for representing Unix permissions is octal notation, a three- or four-digit base-8 value. With three-digit octal notation, each numeral represents a different component of the permission set: user class, group class, and "others" class respectively. Each of these digits is the sum of its component bits (see also Binary numeral system), so specific bits add to the sum as it is represented by a numeral:
• The read bit adds 4 to its total (in binary 100),
• The write bit adds 2 to its total (in binary 010), and
• The execute bit adds 1 to its total (in binary 001).
These values never produce ambiguous combinations; each sum represents a specific set of permissions.
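As a worked illustration of the two notations, the mapping from a nine-character symbolic string to its octal value can be computed by summing r=4, w=2, x=1 within each triplet. This is a small sketch, not a standard-library routine, and it ignores setuid/setgid/sticky markers for simplicity:

```python
def symbolic_to_octal(sym: str) -> int:
    """Convert a triplet string such as 'rwxr-xr-x' to its octal value
    by summing r=4, w=2, x=1 within each group of three."""
    assert len(sym) == 9
    value = 0
    for i in range(0, 9, 3):
        digit = sum(w for ch, w in zip(sym[i:i + 3], (4, 2, 1)) if ch != "-")
        value = value * 8 + digit
    return value

assert oct(symbolic_to_octal("rwxr-xr-x")) == "0o755"
assert oct(symbolic_to_octal("rw-rw-r--")) == "0o664"
assert oct(symbolic_to_octal("r-x------")) == "0o500"
```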

These are the examples from the symbolic notation section given in octal notation:
• "-rwxr-xr-x" would be represented as 755 in three-digit octal.
• "-r-x------" would be represented as 500 in three-digit octal.
• "-rw-rw-r--" would be represented as 664 in three-digit octal.

Here is a summary of the meanings for individual octal digit values:
0  ---  no permission
1  --x  execute
2  -w-  write
3  -wx  write and execute
4  r--  read
5  r-x  read and execute
6  rw-  read and write
7  rwx  read, write and execute

Note the similarity to binary counting (starting at the right and going left):
0: 000   1: 001   2: 010   3: 011   4: 100   5: 101   6: 110   7: 111

Octal digit values can be added together to make symbolic notations:
(4=r)+(1=x) == (5=r-x)
(4=r)+(2=w) == (6=rw-)
(4=r)+(2=w)+(1=x) == (7=rwx)

Here is a summary showing which octal digits affect permissions for user, group, and other:
• UGO = User, Group, Other
• 777 = "-rwxrwxrwx" = rwx for all
• 754 = "-rwxr-xr--" = rwx for owner, r-x for group, r-- for other
• 124 = "--x-w-r--" = x for owner, w for group, r for other

Listing permissions

ls -l prints symbolic notation for files and directories; stat -c '%n %a %A' can be used to print out octal notation.

User private group

Some systems diverge from the traditional POSIX model of users and groups by creating a new group – a "user private group" – for each user. In this case, however, no other users must be added to the "user private group", or they will have write permission on all files. The "user private group" scheme can be preferred for a variety of reasons,[7][8][9] including using a umask of 002 and not having every "user" able to write to newly created files.
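The umask rule and the two listing commands above can be demonstrated with Python's standard os and stat modules. This is a sketch: the temporary file name is arbitrary, and the behavior assumes a POSIX filesystem without default ACLs.

```python
import os
import stat
import tempfile

old_umask = os.umask(0o027)        # new files get 0o666 & ~umask == 0o640
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "example")
with open(path, "w"):
    pass
assert stat.S_IMODE(os.stat(path).st_mode) == 0o640

os.chmod(path, 0o754)              # rwx for owner, r-x for group, r-- for other
st = os.stat(path)
assert oct(stat.S_IMODE(st.st_mode)) == "0o754"    # what `stat -c '%a'` shows
assert stat.filemode(st.st_mode) == "-rwxr-xr--"   # what `ls -l` shows

os.remove(path)
os.rmdir(tmpdir)
os.umask(old_umask)
```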

References
[1] http://technet.microsoft.com/en-us/library/bb727008.aspx
[2] http://www.suse.de/~agruen/nfs4acl/
[3] http://docs.sun.com/app/docs/doc/819-5461/ftyxi?a=view
[4] http://wiki.freebsd.org/NFSv4_ACLs
[5] http://h71000.www7.hp.com/doc/731final/6489/6489pro_025.html#int_prot_code
[6] http://publib.boulder.ibm.com/infocenter/zos/v1r9/index.jsp?topic=/com.ibm.zos.r9.ieab600/xddprot.htm
[7] O'Reilly ONLamp Blog, Using User Private Groups (http://www.oreillynet.com/onlamp/blog/2006/09/using_user_private_groups.html)
[8] Red Hat 9 Manual, User Private Groups (http://www.redhat.com/docs/manuals/linux/RHL-9-Manual/ref-guide/s1-users-groups-private-groups.html)
[9] Red Hat Enterprise Linux 5 Manual, User Private Groups (http://www.redhat.com/docs/manuals/enterprise/RHEL-5-manual/Deployment_Guide-en-US/s1-users-groups-private-groups.html)

External links
• Apple Mac OS X Server version 10.4+ File Services Administration Manual (see pages 16-26) (http://manuals.info.apple.com/en/File_Services_v10.4.pdf)
• "Linux File Permission Confusion" (http://www.hackinglinuxexposed.com/articles/20030417.html) by Brian Hatch, 2003.
• "Linux File Permission Confusion pt 2" (http://www.hackinglinuxexposed.com/articles/20030424.html) by Brian Hatch, 2003.
• The Linux Cookbook: Groups and How to Work in Them (http://dsl.org/cookbook/cookbook_9.html) by Michael Stutz, 2004.

Full disclosure

In computer security, full disclosure means to disclose all the details of a security problem which are known. It is a philosophy of security management completely opposed to the idea of security through obscurity. The concept of full disclosure is controversial, but not new: it has been an issue for locksmiths since the 19th century.

Definition

Full disclosure requires that full details of a security vulnerability are disclosed to the public, including details of the vulnerability and how to detect and exploit it. The theory behind full disclosure is that releasing vulnerability information immediately results in quicker fixes and better security. Fixes are produced faster because vendors and authors are forced to respond in order to protect their systems from potential attacks, as well as to protect their own image. Security is improved because the window of exposure, the amount of time the vulnerability is open to attack, is reduced. In the realm of computer vulnerabilities, disclosure is often achieved via mailing lists such as the Full-Disclosure mailing list and by other means.

Various interpretations

Even among those who believe in disclosure, there are differing policies about when, to whom, and how much to disclose.

Some believe that, in the absence of any public exploits for the problem, full and public disclosure should be preceded by disclosure of the vulnerability to the vendors or authors of the system. This private advance disclosure allows the vendor time to produce a fix or workaround. This philosophy is sometimes called responsible disclosure, and advocates of this approach also apply the term "responsible disclosure" to vulnerability reports that are not public. In the case that a vendor is notified and a fix is not produced within a reasonable time, disclosure is generally made to the public. Opinions differ on what constitutes a reasonable time: fourteen to thirty days is typical, although the period could be a matter of hours. Internet Security Systems, for example, was widely criticized for allowing less than eight hours before disclosing details of a vulnerability in the Apache HTTP Server.[1]

One challenge with "responsible disclosure" is that some vendors do not respond, or inordinately delay their response. As long as a vulnerability is not widely known to the public (with enough detail to reproduce the attack), vendors may refuse to fix the vulnerability or refuse to give it enough priority to actually repair it. Unfortunately, vulnerabilities reported to a vendor may already be exploited, or may soon be detected by someone with intent to exploit them. Many security researchers cite vendors' past failure to respond to vulnerability reports as the reason that they fully disclose vulnerabilities, since otherwise many vendors would never fix even critical vulnerabilities in their products. Thus, most security researchers set maximum times (such as 14 days or 30 days) before fully revealing a vulnerability to the public.

To address the controversy of disclosing harmful information to the general Internet community, Rain Forest Puppy developed the RFPolicy in 2000, which is an attempt to formalize the process of alerting vendors to security problems in their products, and to establish guidelines on what to do if the vendor fails to respond.

Limited disclosure is an alternative approach, where full details of the vulnerability are provided to a restricted community of developers and vendors while the public is only informed of a potential security issue.

History

The issue of full disclosure was first raised in the context of locksmithing, in a 19th century controversy regarding whether weaknesses in lock systems should be kept secret in the locksmithing community, or revealed to the public. According to A. C. Hobbs:

A commercial, and in some respects a social doubt has been started within the last year or two, whether it is right to discuss so openly the security or insecurity of locks. Many well-meaning persons suppose that the discussion respecting the means for baffling the supposed safety of locks offers a premium for dishonesty, by showing others how to be dishonest. This is a fallacy. Rogues are very keen in their profession, and know already much more than we can teach them respecting their several kinds of roguery. Rogues knew a good deal about lock-picking long before locksmiths discussed it among themselves, as they have lately done. If a lock, let it have been made in whatever country, or by whatever maker, is not so inviolable as it has hitherto been deemed to be, surely it is to the interest of honest persons to know this fact, because the dishonest are tolerably certain to apply the knowledge practically; and the spread of the knowledge is necessary to give fair play to those who might suffer by ignorance. It cannot be too earnestly urged that an acquaintance with real facts will, in the end, be better for all parties. Some time ago, when the reading public was alarmed at being told how London milk is adulterated, timid persons deprecated the exposure, on the plea that it would give instructions in the art of adulterating milk; a vain fear: milkmen knew all about it before, whether they practiced it or not; and the exposure only taught purchasers the necessity of a little scrutiny and caution, leaving them to obey this necessity or not, as they pleased.

— A. C. Hobbs (Charles Tomlinson, ed.), Locks and Safes: The Construction of Locks. Published by Virtue & Co., London, 1853 (revised 1868).

The full disclosure debate came back to life through dissatisfaction with the methods employed by the Internet security infrastructure in the early 1990s. Software security vulnerabilities were reported to CERT, which would then inform the vendor of that software. Public disclosure of the hole would not take place until the vendor had readied a patch to fix it. Since CERT and the vendors were aware of the holes, but attempted to keep them secret even from the administrators of machines being cracked in the field, it was felt that CERT's policies were a manifestation of an impractical, "ivory tower" attitude. In the meantime, the vulnerabilities were actively exploited by crackers; and, since the disclosures were private, some vendors took years to produce a fix, or never produced a fix at all. The desire by software companies to ignore warnings and rely on crackers' ignorance of the problem is a form of the questionable security through obscurity approach. In response to this, mailing lists and other avenues for full disclosure were established, notably the Full-Disclosure mailing list, used by researchers such as HD Moore.

Mailing list

Full-Disclosure[2] is viewed as radical by many not knowing how the security community works; it is in fact a crucial part of the infrastructure in the security community, as it is the only significant security list not moderated by a commercial entity with its own hidden agenda. Full-Disclosure is out of Secunia's control and is solely run by John Cartwright, who has been active in maintaining Full-Disclosure since it was launched by Len Rose and John Cartwright in 2002 as an alternative to the moderated and biased lists predominant at the time. Though Secunia often disagrees with the conclusions being drawn on Full-Disclosure, or with some of the irresponsible disclosures being made, Secunia never attempts to intervene.[3]

n3td3v was banned from the Full-Disclosure mailing list on January 21, 2009.[4] n3td3v is thought to have been banned in response to his widespread criticism of what he saw as irresponsible disclosure practices carried out by some security researchers, while others accuse him of being an internet troll.[6] Some saw the banning of n3td3v as an attack on freedom of speech, in an email post to the list on August 31, 2009.[5]

Controversy

Full disclosure can be controversial, as often these disclosures include code or executable tools to exploit the vulnerability. The argument against disclosure is that providing complete details or tools to malicious attackers, such as black hats and script kiddies, allows them to take advantage of vulnerabilities more quickly and makes attacks more widespread. The advantage of disclosure is that white hats will obtain the information, and that the vulnerability will be detected and patched more quickly. However, this argument assumes that without disclosure such tools and attacks would not have occurred.

In August 2010, HD Moore found about 40 vulnerabilities related to DLL load hijacking in Windows applications, which Rapid7 was going to publish under its vulnerability disclosure policy.[7] Acros, a Slovenian security firm, found one related vulnerability for iTunes and decided to publish without alerting the vendor, saying "it hasn't paid out well" in the past and "we've found better markets for this kind of information".[8]

References
[1] Robert Lemos (2007-01-23). "Bug brokers offering higher bounties" (http://www.securityfocus.com/news/11419). SecurityFocus. Retrieved 2011-01-06.
[2] http://lists.grok.org.uk/pipermail/full-disclosure/
[3] "Googler criticized for disclosing Windows-related flaw" (http://news.cnet.com/8301-27080_3-20007421-245.html). 2010-06-11. Retrieved 2011-01-06.
[4] http://lists.grok.org.uk/pipermail/full-disclosure/2009-January/067676.html
[5] http://lists.grok.org.uk/pipermail/full-disclosure/2009-August/070465.html
[6] "Researcher attempts to shed light on security troll" (http://www.securityfocus.com/news/11437). 2006-10-20. Retrieved 2011-01-06.
[7] Rapid7 Vulnerability Disclosure Policy (http://www.rapid7.com/disclosure.jsp)
[8] Rapid7 blog: Application DLL Load Hijacking (http://blog.rapid7.com/?p=5325)

External links
• Full Disclosure Debate Bibliography, by date (http://www.wildernesscoast.org/bib/disclosure-by-date.html)
• Full Disclosure (http://www.schneier.com/crypto-gram-0111.html#1) from Bruce Schneier's Crypto-Gram, November 2001
• Publicizing Vulnerabilities (http://www.schneier.com/crypto-gram-0002.html#PublicizingVulnerabilities) from Bruce Schneier's Crypto-Gram, February 2000
• Full Disclosure and the Window of Exposure (http://www.schneier.com/crypto-gram-0009.html#1) from Bruce Schneier's Crypto-Gram, September 2000
• A history of the CERT Advisory CA-93:15 case, which spawned the movement in the first place (http://www. ...)
• Matt Blaze, Is it harmful to discuss security vulnerabilities? (http://www.crypto.com/hobbs.html), downloaded October 2005
• Matt Mecham: Why full disclosure is bad (http://ips2.blogs.com/matts_blog/2004/09/why_full_disclo.html)
• Why publishing exploit code is *generally* a bad idea if you're paid to protect (http://spirit. .../2010/06/why-publishing-exploit-code-is-generally-a-bad-idea-if-youre-paid-to-protect.html)
• Full-Disclosure mailing list (http://lists.grok.org.uk/mailman/listinfo/full-disclosure)
• Full-Disclosure Mailing List Charter document (http://lists.grok.org.uk/full-disclosure-charter.html)
• Full Disclosure security community on Facebook (https://www.facebook.com/pages/Full-Disclosure/186727924708838)

Fuzz testing

Fuzz testing or fuzzing is a software testing technique, often automated or semi-automated, that involves providing invalid, unexpected, or random data to the inputs of a computer program. The program is then monitored for exceptions such as crashes or failing built-in code assertions. Fuzzing is commonly used to test for security problems in software or computer systems.

The term first originates from a class project at the University of Wisconsin in 1988, although similar techniques have been used in the field of quality assurance, where they are referred to as robustness testing, syntax testing or negative testing.[1]

There are two forms of fuzzing program, mutation-based and generation-based, which can be employed as white-, grey- or black-box testing. File formats and network protocols are the most common targets of testing, but any type of program input can be fuzzed. Interesting inputs include environment variables, keyboard and mouse events, and sequences of API calls. Even items not normally considered "input" can be fuzzed, such as the contents of databases, shared memory, or the precise interleaving of threads.

For the purpose of security, input that crosses a trust boundary is often the most interesting.[2] For example, it is more important to fuzz code that handles the upload of a file by any user than it is to fuzz the code that parses a configuration file that is accessible only to a privileged user.

History

The term "fuzz" or "fuzzing" originates from a 1988 class project at the University of Wisconsin, taught by Professor Barton Miller. The assignment was titled "Operating System Utility Program Reliability - The Fuzz Generator".[3][4] The project developed a basic command-line fuzzer to test the reliability of Unix programs by bombarding them with random data until they crashed. The test was repeated in 1995.

One of the earliest examples of fuzzing dates from before 1983. "The Monkey" was a Macintosh application developed by Steve Capps prior to 1983; it used journaling hooks to feed random events into Mac programs, and was used to test for bugs in MacPaint.[5]

Uses

Fuzz testing is often employed as a black-box testing methodology in large software projects where a budget exists to develop test tools. Fuzz testing is one of the techniques which offers a high benefit-to-cost ratio.

The technique can only provide a random sample of the system's behavior, and in many cases passing a fuzz test may only demonstrate that a piece of software can handle exceptions without crashing, rather than behaving correctly. This means fuzz testing is an assurance of overall quality, rather than a bug-finding tool, and it is not a substitute for exhaustive testing or formal methods. As a gross measurement of reliability, fuzzing can suggest which parts of a program should get special attention, in the form of a code audit, application of static analysis, or partial rewrites.
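In the spirit of the original command-line experiments, a minimal "dumb" fuzzer can be sketched in a few lines. This is illustrative only: the target function is a toy stand-in for the program under test, and all names are invented for the example.

```python
import random

def target(data: bytes) -> None:
    """Toy stand-in for the program under test."""
    data.decode("ascii")   # raises UnicodeDecodeError for bytes > 127

def fuzz(trials=200, max_len=32, seed=1):
    """Feed random byte strings to target(), keeping inputs that raise."""
    rng = random.Random(seed)          # fixed seed makes runs reproducible
    failures = []
    for _ in range(trials):
        data = bytes(rng.randrange(256) for _ in range(rng.randrange(max_len)))
        try:
            target(data)
        except Exception:
            failures.append(data)      # preserve the failing input
    return failures

crashes = fuzz()
assert crashes                                        # random bytes found failures
assert all(any(b > 127 for b in c) for c in crashes)  # each has a non-ASCII byte
```

Keeping every failing input (and the seed that produced it) is what makes such a harness useful: each recorded case can be replayed against the program under a debugger.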

g. However. or involve proprietary extensions to published protocols. and then using model-based test generation techniques in walking through the specifications and adding anomalies in the data contents. If the fuzz stream is pseudo-random number-generated.[8] Techniques Fuzzing programs fall into two different categories. For example. Reproduction and isolation So as to be able to reproduce errors. A specification-based fuzzer involves writing the entire array of specifications into the tool. which are important for software that does not control its input. This "smart fuzzing" technique is also known as robustness testing. randomly mutated protocol packets. some fuzzing software will help to build a test case. it can be used to find incorrect-serialization bugs by complaining whenever a program's serializer emits something that the same program's parser rejects. The understanding can be based on a specification. fuzzing software will often record the input data it produces. White-box fuzzing uses symbolic execution and [15] Evolutionary fuzzing leverages feedback from code coverage. or they can mutate examples from test suites or real life. Another common technique that is easy to implement is mutating existing input (e. and sequences. the seed value can be stored to reproduce the fuzz attempt. network protocols. . Since fuzzing often generates invalid input it is used for testing error-handling routines. syntax testing. which is used for debugging. grammar testing. Fuzz testing can be combined with other testing techniques. structures. since a specification is a prerequisite for writing such a fuzzer. the test data is preserved. Once a bug is found. and 2) Many useful protocols are proprietary. test coverage for new or proprietary protocols will be limited or nonexistent. [12] There are two limitations of protocol-based fuzzing based on protocol implementations of published specifications: 1) Testing cannot proceed until the specification is relatively mature. 
approach of exploratory testing. files from a test suite) by flipping bits at random or moving blocks of the file around. usually before applying it to the software. and GUI-based applications and services.Fuzz testing 54 Types of bugs As well as testing for outright crashes. using testcase reduction tools such as Delta or Lithium.[14] These fuzzers can generate test cases from scratch.[6] It can also find unintentional differences between two versions of a program[7] or between two implementations of the same specification. messages.[9] [10] [11] The protocol awareness can also be created heuristically from examples using a tool such as Sequitur [13].[1] The simplest form of fuzzing technique is sending a stream of random bits to software. Simple fuzzing can be thought of as a way to automate negative testing. with mostly-valid input tending to trigger the "deepest" error cases.[16] effectively automating the constraint solving. or as events. Fuzzing can also find some types of "correctness" bugs. and (input) fault injection. Mutation based fuzzers mutate existing data samples to create test data while generation based fuzzers define new test data based on models of the input. If the computer crashes outright. This technique of random inputs still continues to be a powerful tool to find bugs in command-line applications. the most successful fuzzers have detailed understanding of the format or protocol being tested. where any bug affecting memory safety is likely to be a severe vulnerability. fuzz testing is used to find bugs such as assertion failures and memory leaks (when coupled with a memory debugger). They can concentrate on valid or invalid input. either as command line options. If fuzzing is based only on published specifications. The methodology is useful against large applications.
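The mutation and seed-recording ideas above can be combined in a small sketch (all names here are illustrative): flip random bits in a sample input while recording only the PRNG seed, which is enough to regenerate any failing mutant exactly.

```python
import random

def mutate(sample: bytes, seed: int, flips: int = 8) -> bytes:
    """Return sample with `flips` random bit-flips, fully determined by seed."""
    rng = random.Random(seed)
    data = bytearray(sample)
    for _ in range(flips):
        pos = rng.randrange(len(data))      # pick a byte...
        data[pos] ^= 1 << rng.randrange(8)  # ...and flip one of its bits
    return bytes(data)

sample = b"GIF89a" + b"\x00" * 26           # e.g. a file taken from a test suite
m1 = mutate(sample, seed=1234)
m2 = mutate(sample, seed=1234)
assert m1 == m2                             # storing the seed reproduces the mutant
assert mutate(sample, seed=1234, flips=1) != sample  # one flip always changes it
assert len(m1) == len(sample)
```

Storing `(seed, flips)` for each trial instead of the mutated file itself keeps the fuzzer's crash log tiny while still allowing exact reproduction.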

Advantages and disadvantages
The main problem with fuzzing to find program faults is that it generally finds only very simple faults. The computational complexity of the software testing problem is of exponential order, and every fuzzer takes shortcuts to find something interesting in a timeframe that a human cares about. A primitive fuzzer may have poor code coverage; for example, if the input includes a checksum which is not properly updated to match other random changes, only the checksum validation code will be verified. Code coverage tools are often used to estimate how "well" a fuzzer works, but these are only guidelines to fuzzer quality. Every fuzzer can be expected to find a different set of bugs.

On the other hand, bugs found using fuzz testing are sometimes severe, exploitable bugs that could be used by a real attacker. This has become more common as fuzz testing has become more widely known, as the same techniques and tools are now used by attackers to exploit deployed software. This is a major advantage over binary or source auditing, or even fuzzing's close cousin, fault injection, which often relies on artificial fault conditions that are difficult or impossible to exploit.

The randomness of inputs used in fuzzing is often seen as a disadvantage, as catching a boundary value condition with random inputs is highly unlikely.

Fuzz testing enhances software security and software safety because it often finds odd oversights and defects which human testers would fail to find, and which even careful human test designers would fail to create tests for.

References
[1] Michael Sutton, Adam Greene, Pedram Amini (2007). Fuzzing: Brute Force Vulnerability Discovery. Addison-Wesley. ISBN 0321446119.
[2] John Neystadt (2008-02). "Automated Penetration Testing with White-Box Fuzzing" (http://msdn.microsoft.com/en-us/library/cc162782.aspx). Microsoft. Retrieved 2009-05-14.
[3] Barton Miller (2008). "Preface". In Ari Takanen, Jared DeMott and Charlie Miller, Fuzzing for Software Security Testing and Quality Assurance. ISBN 978-1-59693-214-2.
[4] "Fuzz Testing of Application Reliability" (http://pages.cs.wisc.edu/~bart/fuzz/). University of Wisconsin-Madison. Retrieved 2009-05-14.
[5] "Macintosh Stories: Monkey Lives" (http://www.folklore.org/StoryView.py?story=Monkey_Lives.txt). Folklore.org. 1999-02-22. Retrieved 2010-05-28.
[6] Jesse Ruderman. "Fuzzing for correctness" (http://www.squarefree.com/2007/08/02/fuzzing-for-correctness/). squarefree.com. Retrieved 2010-05-28.
[7] Jesse Ruderman. "Fuzzing TraceMonkey" (http://www.squarefree.com/2008/12/23/fuzzing-tracemonkey/). squarefree.com. Retrieved 2010-05-28.
[8] Jesse Ruderman. "Some differences between JavaScript engines" (http://www.squarefree.com/2008/12/23/differences/). squarefree.com. Retrieved 2010-05-28.
[9] "Robustness Testing Of Industrial Control Systems With Achilles" (http://wurldtech.com/resources/SB_002_Robustness_Testing_With_Achilles.pdf) (PDF).
[10] "Software Testing Techniques by Boris Beizer. International Thomson Computer Press; 2 Sub edition (June 1990)" (http://www.amazon.com/dp/1850328803). Amazon.com. Retrieved 2010-05-28.
[11] Kaksonen, Rauli (2001). A Functional Method for Assessing Protocol Implementation Security (Licentiate thesis) (http://www.vtt.fi/inf/pdf/publications/2001/P448.pdf) (PDF). Espoo: Technical Research Centre of Finland, VTT Publications 447. 128 p. + app. 15 p. ISBN 951-38-5873-1 (soft back ed.), ISBN 951-38-5874-X (on-line ed.).
[12] "Software Fault Injection: Inoculating Programs Against Errors by Jeffrey M. Voas and Gary McGraw" (http://www.amazon.com/dp/0471183814). John Wiley & Sons. January 28, 1998.
[13] http://sequitur.info/
[14] Dan Kaminsky (2006). "Black Ops 2006" (http://usenix.org/events/lisa06/tech/slides/kaminsky.pdf) (PDF).
[15] Patrice Godefroid, Adam Kiezun, Michael Y. Levin. "Grammar-based Whitebox Fuzzing" (http://people.csail.mit.edu/akiezun/pldi-kiezun.pdf) (PDF). Microsoft Research.
[16] "VDA Labs" (http://www.vdalabs.com/tools/efs_gpf.html).

Further reading
• Ari Takanen, Jared D. DeMott, Charles Miller. Fuzzing for Software Security Testing and Quality Assurance. ISBN 978-1-59693-214-2.

External links
• University of Wisconsin Fuzz Testing (the original fuzz project) (http://www.cs.wisc.edu/~bart/fuzz) Source of papers and fuzz software.
• Look out! It's the Fuzz! (IATAC IAnewsletter 10-1) (http://iac.dtic.mil/iatac/download/Vol10_No1.pdf)
• Designing Inputs That Make Software Fail (http://video.google.com/videoplay?docid=6509883355867972121), conference video including fuzzy testing
• Building 'Protocol Aware' Fuzzing Frameworks (http://docs.google.com/viewer?url=https://github.com/s7ephen/Ruxxer/raw/master/presentations/Ruxxer.ppt), conference presentation video
• Link to the Oulu (Finland) University Secure Programming Group (http://www.ee.oulu.fi/research/ouspg/)
• JBroFuzz - Building A Java Fuzzer (http://video.google.com/videoplay?docid=-1551704659206071145), conference presentation video

Google hacking
Google hacking is a computer hacking technique that uses Google Search and other Google applications to find security holes in the configuration and computer code that websites use.

Basics
Google hacking involves using advanced operators in the Google search engine to locate specific strings of text within search results. Some of the more popular examples are finding specific versions of vulnerable Web applications. The following search query will locate all websites that have the words "admbook" and "version" in the title of the website. It also checks to ensure that the web page being accessed is a PHP file:

intitle:admbook intitle:version filetype:php [1]

It is normal for default installations of applications to include their running version in every page they serve, for example, "Powered by XOOPS 2.2.3 Final". A search query for that particular text would locate all web pages that contain it.

One can even retrieve the username and password list from Microsoft FrontPage servers by inputting the given microscript in the Google search field:

"#-Frontpage-" inurl:administrators.pwd

Devices connected to the Internet can be found. A search string such as inurl:"ViewerFrame?Mode=" will find public web cameras.

Another technique is searching for insecure coding practices in the public code indexed by Google Code Search or other source code search engines.

Google Hacking Tools
• Stach & Liu - Google Hacking Diggity Project [2]
• McAfee/Foundstone - SiteDigger v3.0 [3]

External links
• "Google Help: Cheat Sheet" [4], Google (printable)
• "Google Hacking & Deep Web" [5], GERMAN

References
[1] http://www.google.com/search?q=intitle%3Aadmbook+intitle%3Aversion+filetype%3Aphp
[2] http://www.stachliu.com/resources/tools/google-hacking-diggity-project/
[3] http://www.mcafee.com/us/downloads/free-tools/sitedigger.aspx
[4] http://www.google.com/help/cheatsheet.html
[5] http://www.boris-koch.de/2011/01/19/google-hacking-datenpannen-und-deep-web/

Hardening (computing)
In computing, hardening is usually the process of securing a system by reducing its surface of vulnerability. A system has a larger vulnerability surface the more that it does; in principle a single-function system is more secure than a multipurpose one. Reducing available vectors of attack typically includes the removal of unnecessary software, unnecessary usernames or logins, and the disabling or removal of unnecessary services.

There are various methods of hardening Unix and Linux systems. This may involve, among other measures, applying a patch to the kernel such as Exec Shield or PaX; closing open network ports; and setting up intrusion-detection systems, firewalls and intrusion-prevention systems. There are also hardening scripts and tools like Bastille Linux, JASS [1] for Solaris systems and Apache/PHP Hardener [2] that can, for example, deactivate unneeded features in configuration files or perform various other protective measures.

External links
• IT Security Topic — Hardening [3] - www.colorado.edu
• Hardening Your Computing Assets [4] (PDF) - www.globalsecurity.org

References
[1] http://sun.com/software/security/jass/
[2] http://www.syhunt.com/hardener/
[3] http://www.colorado.edu/cns/security/awareness/hardening/
[4] http://www.globalsecurity.org/military/library/report/1997/harden.pdf
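One of the hardening steps named above, closing open network ports, starts with knowing which ports are actually reachable. The following is a minimal audit sketch, not a substitute for the dedicated tools mentioned in the article; the function names are chosen for this example:

```python
import socket

def port_is_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0   # 0 means connected

def audit(host: str, ports) -> list:
    """List the ports in `ports` that accept connections; every entry
    should correspond to a service that is deliberately enabled."""
    return [p for p in ports if port_is_open(host, p)]
```

Running `audit("127.0.0.1", range(1, 1025))` on the machine being hardened surfaces listening services; anything unexpected in the result is a candidate for disabling.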

Host protected area
The host protected area, sometimes referred to as hidden protected area,[1] is an area of a hard drive that is not normally visible to an operating system (OS).

History
HPA was first introduced in the ATA-4 standard CXV (T13, 2001).[2]

How it works
The IDE controller has registers that contain data that can be queried using ATA commands. The data returned gives information about the drive attached to the controller. There are three ATA commands involved in creating and utilizing a hidden protected area:
• IDENTIFY DEVICE
• SET MAX ADDRESS
• READ NATIVE MAX ADDRESS

Operating systems use the IDENTIFY DEVICE command to find out the addressable space of a hard drive. The IDENTIFY DEVICE command queries a particular register on the IDE controller to establish the size of a drive. This register can, however, be changed using the SET MAX ADDRESS ATA command. If the value in the register is set to less than the actual hard drive size, then effectively a host protected area is created. It is protected because the OS will work with only the value in the register that is returned by the IDENTIFY DEVICE command and thus will never be able to address the parts of the drive that lie within the HPA.

The HPA is useful only if other software or firmware (e.g. BIOS) is able to utilize it. Software and firmware that are able to utilize the HPA are referred to as 'HPA aware'. The ATA command that these entities use is called READ NATIVE MAX ADDRESS. This command accesses a register that contains the true size of the hard drive. To use the area, the controlling HPA-aware program changes the value of the register read by IDENTIFY DEVICE to that found in the register read by READ NATIVE MAX ADDRESS. When its operations are complete, the register read by IDENTIFY DEVICE is returned to its original fake value.

The diagram shows how a host protected area (HPA) is created:
1. IDENTIFY DEVICE returns the true size of the hard drive. READ NATIVE MAX ADDRESS returns the true size of the hard drive.
2. SET MAX ADDRESS reduces the reported size of the hard drive. READ NATIVE MAX ADDRESS returns the true size of the hard drive. An HPA has been created.
3. IDENTIFY DEVICE returns the now fake size of the hard drive. READ NATIVE MAX ADDRESS returns the true size of the hard drive; the HPA is in existence.
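The detection logic implied by these two commands reduces to a subtraction: an HPA exists exactly when READ NATIVE MAX ADDRESS reports more sectors than IDENTIFY DEVICE. A small sketch of that arithmetic (the function name `hpa_sectors` is hypothetical; real tools obtain the two values via ATA commands or hdparm):

```python
def hpa_sectors(identify_device_max: int, native_max: int) -> int:
    """Number of sectors hidden by an HPA, given the sector count
    reported by IDENTIFY DEVICE and by READ NATIVE MAX ADDRESS.
    A positive result means an HPA exists; zero means the whole
    drive is visible to the OS."""
    if native_max < identify_device_max:
        raise ValueError("native size cannot be below reported size")
    return native_max - identify_device_max

# Sector counts from the dmesg example later in this article:
# current capacity 12000 sectors, native capacity 120103200 sectors.
assert hpa_sectors(12000, 120103200) == 120091200
```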

Use
• HPA can be used by various booting and diagnostic utilities, normally in conjunction with the BIOS. An example of this implementation is the Phoenix FirstBIOS, which utilizes BEER (boot engineering extension record) and PARTIES (protected area run-time interface extension services).
• Computer manufacturers may use the area to contain a preloaded OS for install and recovery purposes (instead of providing DVD or CD media). Dell notebooks hide the Dell MediaDirect utility in the HPA; IBM and LG notebooks hide system restore software in the HPA.
• HPA is also used by various theft recovery and monitoring service vendors. For example, the laptop security firm Computrace uses the HPA to load software that reports to their servers whenever the machine is booted on a network. The HPA is useful to them because even when a stolen laptop has its hard drive formatted the HPA remains untouched.
• HPA can also be used to store data that is deemed illegal and is thus of interest to government and police computer forensics teams.
• Some vendor-specific external drive enclosures (Maxtor) are known to use HPA to limit the capacity of unknown replacement hard drives installed into the enclosure. When this occurs, the drive may appear to be limited in size (e.g. 128 GB), which can look like a BIOS or dynamic drive overlay (DDO) problem. In this case, one must use software utilities (see below) that use READ NATIVE MAX ADDRESS and SET MAX ADDRESS to change the drive's reported size back to its native size, and avoid using the external enclosure again with the affected drive.
• Some rootkits hide in the HPA to avoid being detected by anti-rootkit and antivirus software.

Identification and manipulation
Identification of an HPA on a hard drive can be achieved by a number of tools and methods.

Identification tools
• The Sleuth Kit (free, open software) by Brian Carrier. (HPA identification is currently Linux-only.)
• The ATA Forensics Tool (TAFT)[3] by Arne Vidstrom.
• EnCase by Guidance Software
• Access Data's Forensic Toolkit

Identification methods
Using Linux, there are a couple of ways to detect the existence of an HPA. The latest Linux versions will print a message when the system is booting. For example:

dmesg | less
[...] hdb: Host Protected Area detected.
current capacity is 12000 sectors (6 MB)
native capacity is 120103200 sectors (61492 MB)

With the program hdparm, version >= 8.0, where X is your drive letter:

hdparm -N /dev/sdX

For versions of hdparm < 8.0, one can compare the number of sectors output from 'hdparm -I' with the number of sectors reported for the hard drive model's published statistics.

Manipulation tools
Creating and manipulating an HPA on a hard drive can be achieved by a number of tools:
• hdparm is a Linux program for reading and writing ATA and SATA hard drive parameters.
• HDAT2[4] by Lubomir Cabla.
• setmax[5] by Andries E. Brouwer.
• Feature Tool[6] by Hitachi Global Storage Technologies.
• MHDD (created by Dmitry Postrigan)[7] is a freeware tool for hard drives that among other low-level functionalities provides information about the HPA state of a disk and can manipulate it.
• FreeBSD has the hw.ata.setmax sysctl which can be set to 1.

Manipulation methods
Using the Linux program hdparm with version >= 8.0 you can modify the HPA directly. Where ABC is the number of visible sectors and X is the drive letter:

hdparm -NpABC /dev/sdX

References
[1] Hidden Protected Area - ThinkWiki (http://www.thinkwiki.org/wiki/Hidden_Protected_Area)
[2] Host Protected Areas (https://www.utica.edu/academic/institutes/ecii/publications/articles/EFE36584-D13F-2962-67BEB146864A2671.pdf)
[3] vidstrom.net - security tools (http://vidstrom.net/stools/taft/)
[4] HDAT2/CBL Hard Disk Repair Utility (http://www.hdat2.com/)
[5] http://www.win.tue.nl/~aeb/linux/setmax.c
[6] Support - Downloads and Utilities (http://www.hitachigst.com/hdd/support/download.htm)
[7] http://hddguru.com/content/en/software/2005.10.02-MHDD/

External links
• The Sleuth Kit (http://www.sleuthkit.org/informer/sleuthkit-informer-17.html#hpa)
• International Journal of Digital Evidence (https://www.utica.edu/academic/institutes/ecii/publications/articles/EFE36584-D13F-2962-67BEB146864A2671.pdf)
• Dublin City University Security & Forensics wiki (http://polya.computing.dcu.ie/wiki/index.php/Using_ATA_commands_on_hard_disks_._why_bother?)
• Wiki Web For ThinkPad Users (http://www.thinkwiki.org/wiki/Hidden_Protected_Area)

Identity management
"IdM" redirects here; it can also mean Middle-of-the-Road Italy.

Identity management (or ID management, or simply IdM) is a broad administrative area that deals with identifying individuals in a system (such as a country, a network, or an organization) and controlling access to the resources in that system by placing restrictions on the established identities of the individuals.

History
Identity management (IdM) is a term related to how humans are identified and authorized across computer networks. It covers issues such as how users are given an identity, the protection of that identity, and the technologies supporting that protection (e.g., network protocols, digital certificates, passwords, etc.). While the term management requires little explanation, the term identity is a more abstract concept that will always be difficult to define in a way that satisfies everyone. It is a concept that is fluid and contextual, depending on a number of factors including culture. Thus the term management is appended to "identity" to indicate that there is a technological and best-practices framework around a somewhat intractable philosophical concept.

Digital identity is personal identifying information (PII) selectively exposed over a network; see the OECD[1] and NIST[2] guidelines on protecting PII[3] and the risk of identity theft. Digital identity can be interpreted as the codification of identity names and attributes of a physical instance in a way that facilitates processing. In each organisation there is normally a role or department that is responsible for managing the schema of digital identities of its staff and its own objects, these represented by object identities or object identifiers (OID).[4]

Identity management is multidisciplinary and covers many dimensions, such as:
• Technical – Employs identity management systems (identification, and control of access to, e.g., buildings and data within an organization).
• Legal – Deals with legislation for data protection.
• Police – Deals with identity theft.
• Social and humanity – Deals with issues such as privacy.
• Security – Manages elements such as access control.
• Organizations – Hierarchies and divisions of access.

[Figure: the SAML protocol is used to exchange identity information between two identity domains.]

Perspectives on IdM
In the real-world context of engineering online systems, identity management can involve three perspectives:
1. The pure identity paradigm: Creation, management and deletion of identities without regard to access or entitlements.
2. The user access (log-on) paradigm: For example, a smart card and its associated data used by a customer to log on to a service or services (a traditional view).
3. The service paradigm: A system that delivers personalized, role-based, online, on-demand, multimedia (content), presence-based services to users and their devices.

Pure identity paradigm
A general model of identity can be constructed from a small set of axiomatic principles, for example that all identities in a given abstract namespace are unique and distinctive, or that such identities bear a specific relationship to corresponding entities in the real world. An axiomatic model of this kind can be considered to express "pure identity" in the sense that the model is not constrained by the context in which it is applied.

In most theoretical and all practical models of digital identity, a given identity object consists of a finite set of properties, some of which are shared and some of which are unique within a given name space. These properties may be used to record information about the object, either for purposes external to the model itself or so as to assist the model operationally, for example in classification and retrieval. A "pure identity" model is strictly not concerned with the external semantics of these properties.

The most common departure from "pure identity" in practice occurs with properties intended to assure some aspect of identity, for example a digital signature or software token which the model may use internally to verify some aspect of the identity in satisfaction of an external purpose. To the extent that the model attempts to express these semantics internally, it is not a pure model. Contrast this situation with properties which might be externally used for purposes of information security such as managing access or entitlement, but which are simply stored and retrieved, in other words not treated specially by the model. The absence of external semantics within the model qualifies it as a "pure identity" model.

Identity management, then, can be defined as a set of operations on a given identity model, or as a set of capabilities with reference to it. In practice, identity management is often used to express how identity information is to be provisioned and reconciled between multiple identity models.

In general, an entity can have multiple identities, and each identity can consist of multiple attributes or identifiers. The diagram below illustrates the conceptual relationship between identities and the entities they represent, as well as between identities and the attributes they consist of.
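The pure-identity axioms above — identities unique within a namespace, each consisting of a finite set of properties with no external semantics, and operations such as creation and deletion — can be captured in a small data model. This is an illustrative sketch under those assumptions, not any standard's reference model; the class names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Identity:
    """A name unique within a namespace plus a finite set of
    properties, which are stored and retrieved but carry no external
    semantics in the model itself ("pure identity")."""
    namespace: str
    name: str
    properties: frozenset = field(default_factory=frozenset)

class Namespace:
    """Enforces the axiom that identities are unique and distinctive
    within a given abstract namespace, and supports the creation,
    management and deletion operations of the pure identity paradigm."""
    def __init__(self, label: str):
        self.label = label
        self._ids = {}

    def create(self, name: str, properties=()) -> Identity:
        if name in self._ids:
            raise ValueError("identity %r already exists in %s" % (name, self.label))
        ident = Identity(self.label, name, frozenset(properties))
        self._ids[name] = ident
        return ident

    def delete(self, name: str) -> None:
        del self._ids[name]
```

Nothing in the model inspects a property's meaning; a digital signature stored as a property, for instance, would be a departure from pure identity only if the model itself verified it.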

User access paradigm
User access requires each user to assume a unique digital identity across applications and networked infrastructures, which enables access controls to be assigned and evaluated against this identity. The use of a unique identity across all systems eases the monitoring and verification of potential unauthorized access, and allows the organization to keep tabs on excessive privileges granted to any individual within the company. From the user lifecycle perspective, user access can be tracked from new hire, through suspension, to termination of the employee.

Service paradigm
In the service paradigm perspective, where organizations evolve their systems to the world of converged services, the scope of identity management becomes much larger, and its application more critical. The scope of identity management includes all the resources of the company deployed to deliver online services. These may include devices, network equipment, servers, portals, content, applications and/or products, as well as a user's credentials, address books, preferences, entitlements and telephone numbers. See Service Delivery Platform and Directory service.

Emerging fundamental points
• IdM provides significantly greater opportunities to online businesses beyond the process of authenticating and granting access to authorized users via cards, tokens and web access control systems.
• IdM embraces what the user actually gets in terms of products and services and how and when they acquire them. Accordingly, IdM applies to the products and services of an organization, such as health, media, insurance, travel and government services. It is also applicable to the means by which these products and services are provisioned and assigned to (or removed from) "entitled" users.
• IdM covers the machinery (system infrastructure components) that delivers such services, because a system may assign the service of a user to: a particular network technology, security domain, mail server, soft switch, voice mailbox, product catalog set, media server, content title, usage right, policy-based routing, billing system, CRM, help desk, etc.
• IdM can deliver single-customer views that include the presence and location of the customer, single products and services, as well as single IT infrastructure and network views to the respective parties, with security and single-customer viewing facilities.
• IdM provides the focus to deal with system-wide data quality and integrity issues often encountered by fragmented databases and workflow processes.
• Critical factors in IdM projects include consideration of the online services of an organization (what the users log on to) and how they are managed from an internal and customer self-care perspective.
• User-based IdM has started to evolve away from username/password and web-access control systems toward those that embrace preferences, parental controls, entitlements, presence and loyalty schemes.
• It is equally important for users to correctly identify and authenticate service providers as it is for service providers to identify and authenticate users. This aspect has largely been ignored during the early development of identity management, but will have to be taken seriously in the future.
• IdM relates intrinsically to information engineering, security and privacy. Today, many organizations face a major clean-up in their systems if they are to bring identity coherence into their sphere of influence. Such coherence has become a prerequisite for delivering unified services to very large numbers of users on demand — cheaply, with security and privacy.

Issues
The management of identity raises a certain number of issues, such as privacy concerns that may lead to the implementation of a surveillance society (Taylor, Lips & Organ 2009), or risks related to the stealing of identity (identity theft). The advent of the social web, and in particular the important development of online social networking services, for which the management of the identities of their members represents a core element, also creates a certain number of risks related to the disclosure of personal information (Gross, Acquisti & Heinz 2008), and in particular to the loss of an individual's privacy (Taylor 2008).

Addressing the identity issues
Addressing these different issues may be done by legislation, for instance with data protection legislation or human rights legislation (Pounder 2009), or via the use of technical systems, for instance with the use of privacy enhancing technologies.

Research
Research related to the management of identity covers a variety of disciplines (such as technology, social sciences, the humanities and the law (Halperin & Backhouse 2009)) and areas, and tries to investigate many different issues (technical, societal, etc.), but has to combine the different dimensions, such as:
• legal
• technical
• societal, including socio-psychological aspects (social engineering)
• police (i.e. forensics)

First, it should be noted that people, and in particular young people (15-25), irrespective of their activities, are well aware of the risks of eID-enabled services (Lusoli & Miltgen 2009). More specifically, young people:
• are often very knowledgeable about these systems (web 2.0) that they use frequently and for a long time
• have a high level of perception of the risk associated with these tools

European research
Within the Seventh Research Framework Programme of the European Union from 2007 to 2013, several new projects related to Identity Management started, against the backdrop of an increased risk to the privacy of the citizen in the Information Society:
• PrimeLife [5] will develop concepts and technologies to help individuals to protect their autonomy and retain control over personal information.
• SWIFT [6] focuses on extending identity functions and federation to the network while addressing usability and privacy concerns, and leverages identity technology as a key to integrate service and transport infrastructures for the benefit of users and the providers.
• PICOS investigates and develops a state-of-the-art platform for providing trust, privacy and identity management in mobile communities.

Other identity-related projects from older European Union funded framework programmes include:
• FIDIS (Future of Identity in the Information Society) [7]
• GUIDE [8]
• PRIME [9]

Publications
Different academic journals can be used to publish articles related to identity management, such as:
• Ethics and Information Technology [10]
• Identity in the Information Society [11]
• Surveillance & Society [12]
Less specialized journals may also publish on the topic, and for instance have special issues on identity, such as:
• Online Information Review [13]. See for instance the Special Issue on Digital ID management (Volume 33, Issue 3, 2009).

Standardization
ISO (and more specifically ISO/IEC JTC1, SC27 IT Security techniques) is conducting some standardization work for identity management (ISO 2009), including the definition of different identity-related terms and the elaboration of a framework for identity management.

Implementation challenges
• Getting all stakeholders to have a common view of the area, which requires them to come together and discuss the issues
• Expectation to make the IdM a data synchronization engine for application data
• Envisaging an appropriate business process leading to post-production challenges
• Lack of leadership and support from sponsors
• Overlooking change management — expecting everybody to go through the self-learning process
• Lack of definition of the post-production phase in a project plan — for a smooth transition of the system to the end-user community, it becomes critical that an organization gears up for proper support through a transition phase or stabilization phase, which may take from three to six months
• Lack of focus on integration testing
• Lack of consistent architectural vision
• Expectations for "over-automation"
• Deploying too many IdM technologies in a short time period

References
• Gross, Ralph; Acquisti, Alessandro; Heinz, H. John (2005). "Information revelation and privacy in online social networks" [14]. Proceedings of the 2005 ACM Workshop on Privacy in the Electronic Society. pp. 71–80. doi:10.1145/1102199.1102214.
• Halperin, Ruth; Backhouse, James (2008). "A roadmap for research on identity in the information society". Identity in the Information Society (Springer) 1 (1): 71. doi:10.1007/s12394-008-0004-0.
• ISO, IEC (2009). Information Technology — Security Techniques — A Framework for Identity Management [16]. ISO/IEC WD 24760 (Working draft).
• Lusoli, Wainer; Miltgen, Caroline (2009). "Young People and Emerging Digital Services. An Exploratory Survey on Motivations, Perceptions and Acceptance of Risks" [15]. JRC Scientific and Technical Reports (Sevilla: EC JRC IPTS) (EUR 23765 EN). doi:10.2791/68925.
• Pohlman, N. (2008). Oracle Identity Management: Governance, Risk, and Compliance Architecture. Auerbach Publications. ISBN 978-1420072471.
• Pounder, C. (2009). "Nine principles for assessing whether privacy is protected in a surveillance society". Identity in the Information Society (Springer) 1: 1. doi:10.1007/s12394-008-0002-2.
• Taylor, John A.; Lips, Miriam; Organ, Joe (2009). "Identification practices in government: citizen surveillance and the quest for public service improvement". Identity in the Information Society (Springer) 1: 135. doi:10.1007/s12394-009-0007-5.

• Taylor, John A. (2008). "Zero Privacy". IEEE Spectrum 45 (7): 20. doi:10.1109/MSPEC.2008.4547499.
• Williamson, Graham; Yip, David; Sharoni, Ilan; Spaulding, Kent (September 1, 2009). Identity Management: A Primer. MC Press. ISBN 978-1-58347-093-0.

Notes
[1] Functional requirements for privacy enhancing systems (http://www.oecd.org/dataoecd/36/30/38573952.pdf). Fred Carter, OECD Workshop on Digital Identity Management, Trondheim, Norway, 09 May 2007 (PPT presentation)
[2] Guide to Protecting the Confidentiality of Personally Identifiable Information (PII) (http://csrc.nist.gov/publications/drafts/800-122/Draft-SP800-122.pdf). Recommendations of the National Institute of Standards and Technology, January 2009
[3] PII (Personally Identifiable Information) (http://www.cdt.org/privacy/issues/pii/). The Center For Democracy & Technology, September 14, 2007
[4] Object Id's (OID's) (http://doc.sumy.ua/db/pgsql_book/node72.html), in Bruce Momjian, PostgreSQL: Introduction and Concepts, 1999
[5] http://www.primelife.eu/
[6] http://www.ist-swift.org/
[7] http://www.fidis.net/home/
[8] http://istrg.som.surrey.ac.uk/projects/guide/
[9] https://www.prime-project.eu/
[10] http://www.editorialmanager.com/etin/
[11] http://www.springer.com/computer/journal/12394
[12] http://www.surveillance-and-society.org/
[13] http://info.emeraldinsight.com/products/journals/journals.htm?id=oir
[14] http://doi.acm.org/10.1145/1102199.1102214
[15] http://ipts.jrc.ec.europa.eu/publications/pub.cfm?id=2119
[16] http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=51625

Notable Identity Management Products
• Quest One Identity Manager (http://www.quest.com/quest-one-identity-manager/)
• Identity Forge (http://www.identityforge.com/)
• EmpowerID (http://www.empowerid.com/)
• Microsoft ForeFront Identity Manager (http://www.microsoft.com/forefront/identitymanager/en/us/default.aspx)

External links
• General Public Tutorial about Privacy and Identity Management (http://www.primelife.eu/tutorials/gpto/)
• Identity Management Overview (http://www.hitachi-id.com/)
• Identity management terminology (http://identity-manager.hitachi-id.com/docs/identity-management-terminology.html) (free, no registration required)
• Identity management: the expert view (http://www.computerweekly.com/Articles/2007/07/23/225715/identity-management-the-expert-view.htm) (Computer Weekly)
• Secure Widespread Identities for Federated Telecommunications (SWIFT) (http://www.ist-swift.org/)
• Federation for Identity and Cross-Credentialing Systems (FiXs) (http://www.FiXs.org/)
• Identity management and information sharing in ISO 18876 Industrial automation systems and integration (http://www.iso.org/iso/search.htm?qt=18876&searchSubmit=Search&sort=rel&type=simple&published=on)
• Sloppiness in access and authorization management can cost enterprises dearly (http://www.itsecuritystandard.org)

Internet ethics

In January 1989 the Internet Architecture Board (IAB) issued a statement of policy concerning Internet ethics. This document is referred to as RFC 1087, "Ethics and the Internet". An extract of RFC 1087 follows:

The IAB strongly endorses the view of the Division Advisory Panel of the National Science Foundation Division of Network, Communications Research and Infrastructure which, in paraphrase, characterized as unethical and unacceptable any activity which purposely:
• Seeks to gain unauthorized access to the resources of the Internet.
• Disrupts the intended use of the Internet.
• Wastes resources (people, capacity, computer) through such actions.
• Destroys the integrity of computer-based information.
• Compromises the privacy of users.

References
• RFC 1087 "Ethics and the Internet"

Intruder detection

In information security, intruder detection is the art of detecting the intruders behind attacks as unique persons. This technique tries to identify the person behind an attack by analyzing their computational behaviour. This concept is sometimes confused with Intrusion Detection (IDS) techniques, which are the art of detecting intruder actions.

Theory
Intruder detection systems try to detect who is attacking a system by analyzing his or her computational behaviour or biometric behaviour.

Some of the parameters used to identify a person:
• Keystroke dynamics (aka keystroke patterns, typing pattern, typing behaviour)
• Patterns using an interactive command interpreter:
  • Commands used
  • Commands sequence
  • Accessed directories
  • Character deletion
• Patterns on the network usage:
  • IP address used
  • ISP
  • Country
  • City
  • Ports used
  • TTL analysis
  • Operating system used to attack

• Protocols used
• Connection times patterns

Keystroke dynamics
Keystroke dynamics is paramount in intruder detection techniques because it is the only parameter that has been classified as a real 'behavioural biometric pattern'. Keystroke dynamics analyzes the times between keystrokes issued on a computer keyboard or cellular phone keypad, searching for patterns. The first techniques used statistics and probability concepts such as the standard deviation and the mean; later approaches use data mining, neural networks, support vector machines, etc.

History
Some earlier works reference the concepts of Intruder Authentication, Intruder Verification, or Intruder Classification, but the Si6 project was one of the first projects to deal with the full scope of the concept.

Translation confusion
There is confusion over the Spanish translation of 'intrusion detection system' (IDS). Some people translate it as 'Sistemas de Detección de Intrusiones', but others translate it as 'Sistemas de Detección de Intrusos'. Only the former is correct.

External links
• P0f OS fingerprinting tool [1]
• Si6 Paranoid Project [2]

References
[1] http://lcamtuf.coredump.cx/p0f.shtml
[2] http://www.citefa.gov.ar/SitioSI6_EN/si6.htm
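The statistical approach described above can be sketched in a few lines: collect inter-keystroke intervals for a known user, model them with a mean and standard deviation, and flag a new typing sample whose rhythm deviates too far. This is a minimal illustration of the idea only; the timing data and the z-score threshold are assumptions for the example, not part of any named tool.

```python
import statistics

def profile(timings):
    """Build a simple keystroke profile: mean and sample standard
    deviation of the inter-keystroke intervals (in milliseconds)."""
    return statistics.mean(timings), statistics.stdev(timings)

def matches_profile(profile_stats, sample, max_z=2.0):
    """Accept a session as the profiled typist if the mean interval of
    the new sample lies within max_z standard deviations of the
    profiled mean (a deliberately crude z-score test)."""
    mean, stdev = profile_stats
    z = abs(statistics.mean(sample) - mean) / stdev
    return z <= max_z

# Illustrative data: intervals recorded while a known user typed.
known_user = [112, 95, 130, 101, 118, 99, 125, 108]
stats = profile(known_user)

# A session with a similar rhythm is accepted...
print(matches_profile(stats, [105, 120, 98, 115]))   # True
# ...while a much slower typist is flagged as a different person.
print(matches_profile(stats, [260, 240, 280, 255]))  # False
```

Real systems replace the single z-score with the data-mining and neural-network approaches mentioned above, but the profile/compare structure stays the same.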

Labeled Security Protection Profile

Labeled Security Protection Profile (LSPP) is a protection profile within the Common Criteria for Information Technology Security Evaluation, a set of security functional and assurance requirements for IT products. The LSPP requirements are derived from the B1 class of the US Department of Defense security standard called Trusted Computer System Evaluation Criteria (TCSEC), which was originally published in 1985. For example, z/OS V1.7 was certified EAL4+ for LSPP in 2006.

Erik Laykin

Erik Laykin (born 1964) of Los Angeles, California, is a noted computer forensic/eDiscovery expert and cyber-crime investigator.

Overview
Erik Laykin is a computer forensic/eDiscovery and cyber-crime investigations pioneer and past President and Pacific Rim Director of the L.A. FBI InfraGard program [1], as well as Chairman of the Advisory Board of the EC-Council (Electronic Commerce Council) [2]. Laykin has served as an expert witness throughout the United States in both Federal and State court, and has also served as a court-appointed Special Master in matters involving complex eDiscovery and technology investigations. He is a fellow of the Academy of Court Appointed Masters.
• Co-founded Online Security, LLC [3] with Charlie Balot in the mid-1990s.
• Established initial computer forensic capabilities on behalf of leading American investigative agencies: Kroll Inc., Investigative Group International [4] and Pinkerton Investigations (Pinkerton National Detective Agency) [5][6].
• Established the first commercial computer forensic labs in Los Angeles, Chicago and Hong Kong [7], catering to the legal and corporate community.
• Coined the term "Find the Digital Smoking Gun".
• Director of Navigant Consulting Inc. (NYSE:NCI) [8][9] from 2004 to 2008, where he established the Information Technology Investigations Practice and worked with noted industry leaders Jennifer Baker and James Gordon, who was portrayed by William H. Macy in the film A Civil Action.
• Managing Director of Duff & Phelps (NYSE:DUF) [10][11] as of 2008, where he is the Co-Practice Chair of the Global Electronic Discovery and Investigations practice.
• Led the Data Forensics and eDiscovery investigative team (2009–2010) on behalf of the court-appointed Examiner in the Lehman Brothers bankruptcy [12], Anton R. Valukas, Chairman of Jenner and Block, whose extensive report brought to light Lehman Brothers' usage of Repo 105 (see: Lehman bankruptcy).
• Appointed by California Insurance Commissioner and gubernatorial candidate Steve Poizner to the California Department of Insurance Anti-Fraud Task Force [13][14], where he chaired the technology committee [15].
• Featured in the 2010 History Channel / A&E Network series "Partners in Crime" [16], in the episode "Control-Alt-Delete", which highlighted a number of his high-tech investigations in Asia (previews: [17][18][19][20]). The program was syndicated by the Crime & Investigation Network (South East Asia) throughout Asia.

Organizations Supported
• FBI InfraGard Program [21] (Past President of the Los Angeles Chapter and Director for the Pacific Rim)
• Electronic Commerce Council [22] (Chairman of the Advisory Council)
• California Judicial Council Sub-Committee on Electronic Evidence
• United States Secret Service Electronic Crimes Task Force [23]
• American Bar Association [24]
• Forensic Expert Witness Association [25][26]

References
[1] http://www.infragard.org
[2] http://www.eccouncil.org/about_us/governance/honorary_council.asp
[3] http://www.onlinesecurity.com
[4] http://www.igint.com
[5] http://www.ci-pinkerton.com
[6] http://www.esecurityplanet.com/prodser/article.php/2199301/Pinkerton-CA-Combine-to-Package-Security-Services.htm
[7] http://www.zdnetasia.com/online-labs-looking-to-establish-hong-kong-office-13019768.htm
[8] http://www.navigantconsulting.com
[9] Navigant Consulting Adds Computer Forensics and Specialized Technology Experts to Discovery Services Practice | Business Wire (http://www.businesswire.com/news/home/20050207005650/en/Navigant-Consulting-Adds-Computer-Forensics-Specialized-Technology)
[10] http://www.duffandphelps.com
[11] Erik Laykin | Managing Director | Duff & Phelps (http://www.duffandphelps.com/expertise/our_team/Pages/bio.aspx?ID=183&list=People)
[12] Examiner's Report on Lehman Brothers Bankruptcy Unsealed | News | Duff & Phelps (http://www.duffandphelps.com/Pages/newsDetail.aspx?list=News&ID=266&DisplayLang=us)
[13] http://www.courtappointedmasters.org/
[14] http://www.courtappointedmasters.org/member_current.asp
[15] http://www.insurance.ca.gov/0400-news/0100-press-releases/0060-2007/release047-07.cfm
[16] http://www.citvasia.com/partnersincrime.aspx
[17] http://www.citvasia.com/PIC/computer1.html
[18] http://www.citvasia.com/PIC/computer2.html
[19] http://www.citvasia.com/PIC/computer3.htm
[20] http://www.citvasia.com/PIC/becoming3.html
[21] http://www.infragard.org
[22] http://www.eccouncil.org
[23] http://www.secretservice.gov/ectf.shtml
[24] http://www.abanet.org
[25] http://www.forensic.org
[26] http://www.forensic.org/find-an-expert/erik-j.-laykin.html

Mobile device forensics

Mobile device forensics is a branch of digital forensics relating to the recovery of digital evidence or data from a mobile device under forensically sound conditions. The phrase "mobile device" usually refers to mobile phones; however, it can also relate to any digital device that has both internal memory and communication ability. The memory type, custom interface and proprietary nature of mobile devices require a different forensic process compared to computer forensics. Mobile devices can be used to save several types of personal information such as contacts, photos, calendars and notes, as well as SMS and MMS messages. Newer generations of smartphones also include wider varieties of information, from web browsing and Wireless network settings to e-mail and other forms of rich internet media, including important data now retained on smartphone 'apps'.

History
As a field of study, forensic examination of mobile devices dates from the late 1990s and early 2000s. The role of mobile phones in crime had long been recognised by law enforcement, but the forensic study of mobile devices is a relatively new field, dating from the early 2000s. A proliferation of phones (particularly smartphones) on the consumer market caused a demand for forensic examination of the devices, which could not be met by existing computer forensics techniques. With the increased availability of such devices and the wider array of communication platforms they support (e.g. email, web browsing), law enforcement is now much more likely to encounter a suspect with a mobile device in his possession, and so demand for the analysis of mobiles has increased exponentially in the last decade.

Early efforts to examine mobile devices used techniques similar to the first computer forensics investigations: analysing phone contents directly via the screen and photographing important content. Over time, commercial tools appeared which allowed examiners to recover phone memory with minimal disruption and analyse it separately. In more recent years these commercial techniques have developed further, and the recovery of deleted data from proprietary mobile devices has become possible with some specialist tools.[1]

Types of evidence
As mobile device technology advances, the amount and types of data that can be found on a mobile device are constantly increasing. Evidence that can potentially be recovered by law enforcement agents from a mobile phone may come from several different sources, including the SIM card, the handset and attached memory cards. Traditionally, mobile phone forensics has been associated with recovering SMS and MMS messaging, as well as call logs, contact lists and phone IMEI/ESN information. Each device often has to have custom extraction techniques used on it.

Service provider logs
The European Union requires its member countries to retain certain telecommunications data for use in investigations. This includes data on calls made and retrieved. The location of a mobile phone can be determined, and this geographical data must also be retained. (This is, however, a different science from the forensic analysis undertaken once the mobile phone has been seized.)

Forensic process
The forensics process for mobile devices broadly matches other branches of digital forensics; however, some particular concerns apply.

Seizure
Seizing mobile devices is covered by the same legal considerations as other digital media. Mobiles will often be recovered switched on; as the aim of seizure is to preserve evidence, the device will often be transported in the same state to avoid a shutdown changing files. One of the main ongoing considerations for analysts is preventing the device from making a network/cellular connection, which may bring in new data, overwriting evidence. To prevent a connection, mobile devices will often be transported and examined from within a Faraday cage (or bag).[2]

[Image: iPhone in an RF shield bag]

Acquisition
The second step in the forensic process is acquisition, in this case usually referring to retrieval of material from a device (as compared to the bit-copy imaging used in computer forensics). Because of the proprietary nature of mobiles it is often not possible to acquire data with the device powered down; most mobile device acquisition is performed live. With more advanced smartphones using advanced memory management, connecting the device to a recharger and putting it into a Faraday cage may not be good practice: the mobile device would recognize the network disconnection and therefore change its status information, which can trigger the memory manager to write data.[3]

Most acquisition tools for mobile devices are commercial in nature and consist of a hardware and a software component, often automated.

Examination and analysis
As an increasing number of mobile devices use high-level file systems, similar to the file systems of computers, methods and tools can be taken over from hard disk forensics or need only slight changes.[4] The FAT file system is generally used on NAND memory. A difference is the block size used, which is larger than the 512 bytes of hard disk sectors and depends on the memory type used, e.g. NOR type 64, 128, 256 and NAND memory 16, 128, 256, or 512 kilobyte.[5] Different software tools can extract the data from the memory image. One could use specialized and automated forensic software products or

generic file viewers such as any hex editor to search for characteristics of file headers. The advantage of the hex editor is the deeper insight into the memory management, but working with a hex editor means a lot of handwork and requires file system as well as file header knowledge. In contrast, specialized forensic software simplifies the search and extracts the data, but may not find everything. AccessData, Sleuthkit, and EnCase, to mention only some, are forensic software products to analyze memory images.[6] Since there is no tool that extracts all possible information, it is advisable to use two or more tools for examination. There is currently (February 2010) no software solution to get all evidence from flash memories.[7]

[Image: RTL Aceso, a mobile device acquisition unit]

Data acquisition types

Physical acquisition
Physical acquisition implies a bit-by-bit copy of an entire physical store (e.g. a memory chip); it produces a bit-by-bit copy of the device's flash memory. A physical acquisition has the advantage of allowing deleted files and data remnants to be examined. Physical extraction acquires information from the device by direct access to the flash memories. Generally this is harder to achieve because the device vendor needs to secure against arbitrary reading of memory, and a device may be locked to a certain operator. A physical extraction is the method most similar to the examination of a personal computer, and generally the physical extraction is split into two steps: the dumping phase and the decoding phase.

Logical acquisition
Logical acquisition implies a bit-by-bit copy of logical storage objects (e.g. directories and files) that reside on a logical store (e.g. a file system partition). Logical extraction acquires information from the device using the vendor interface for synchronizing the contents of the phone with a personal computer. This usually does not produce any deleted information, due to it normally being removed from the file system of the phone. However, in some cases the phone may keep a database file of information which does not overwrite the information but simply marks it as deleted and available for later overwriting; in this case, if the device allows file system access through its synchronization interface, it is possible to recover deleted information. A logical extraction is generally easier to work with as it does not produce a large binary blob, and logical acquisition has the advantage that system data structures are easier for a tool to extract and organize. However, a skilled forensic examiner will be able to extract far more information from a physical extraction.
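The file-header search that an examiner would otherwise do by hand in a hex editor can be sketched as a scan of the raw dump for known magic numbers. The signatures below are real file-format magic bytes; the dump contents and offsets are illustrative assumptions for the example.

```python
# Minimal sketch of signature-based searching over a raw memory image,
# the kind of search otherwise done manually in a hex editor.
SIGNATURES = {
    b"\xff\xd8\xff": "JPEG",           # JPEG SOI marker
    b"\x89PNG\r\n\x1a\n": "PNG",       # PNG file signature
    b"SQLite format 3\x00": "SQLite",  # databases common on smartphones
}

def find_headers(image: bytes):
    """Return sorted (offset, type) pairs for every known header."""
    hits = []
    for magic, name in SIGNATURES.items():
        pos = image.find(magic)
        while pos != -1:
            hits.append((pos, name))
            pos = image.find(magic, pos + 1)
    return sorted(hits)

# Illustrative dump: padding with an embedded JPEG header and a PNG header.
dump = b"\x00" * 16 + b"\xff\xd8\xff\xe0" + b"\x00" * 12 + b"\x89PNG\r\n\x1a\n"
print(find_headers(dump))  # [(16, 'JPEG'), (32, 'PNG')]
```

A carving tool would go on to extract the bytes following each hit; this sketch only locates candidate headers, which is exactly where deleted-but-unoverwritten files in a physical image are found.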

Manual acquisition
The user interface can be utilized to investigate the content of the memory: the device is used as normal and pictures are taken from the screen, e.g. with Project-a-Phone [8]. This method has the advantage that the operating system makes the transformation of raw data into human-interpretable information. The disadvantages are that only data visible to the operating system can be recovered and that all data are only available in the form of pictures.

External memory
External memory devices are SIM cards, SD cards, MMC cards, CF cards, and the Memory Stick. The SIM and memory cards need a card reader to make the copy. The SIM card is soundly analyzed, such that it is possible to recover (deleted) data like contacts or text messages [3]. For external memory and the USB flash drive, appropriate software, e.g. the Unix command dd, is needed to make the bit-level copy. Many USB drives and memory cards have a write-lock switch that can be used to prevent data changes while making a copy. If the USB drive has no protection switch, a write blocker [10] can be used to mount the drive in a read-only mode, or, in an exceptional case, the memory chip can be desoldered. Furthermore, USB flash drives with memory protection do not need special hardware and can be connected to any computer.

Internal memory
This section describes various possibilities to save the internal storage, nowadays mostly flash memory.

System commands
Mobile devices do not provide the possibility to run or boot from a CD, connect to a network share or connect to another device with clean tools. Therefore system commands could be the only way to save the volatile memory of a mobile device. With the risk of modified system commands, it must be estimated whether the volatile memory is really important. A similar problem arises when no network connection is available and no secondary memory can be connected to the mobile device, because the volatile memory image must then be saved on the internal non-volatile memory, where the user data is stored, and most likely deleted important data will be lost. System commands are the cheapest method, but imply some risks of data loss. Every command usage with options and output must be documented.

AT commands
AT commands are old modem commands, e.g. the Hayes command set and Motorola phone AT commands, and can therefore only be used on a device that has modem support. Using these commands one can only obtain information through the operating system, such that no deleted data can be extracted [3].

Flasher tools
A flasher tool is programming hardware and/or software that can be used to program (flash) the device memory, e.g. EEPROM or flash memory. These tools mainly originate from the manufacturer or from service centers for debugging, repair, or upgrade services. They can overwrite the non-volatile memory, and some, depending on the manufacturer or device, can also read the memory to make a copy, originally intended as a backup. The memory can be protected from reading, e.g. by software command or destruction of fuses in the read circuit [11]; note, this would not prevent writing or using the memory internally by the CPU. The flasher tools are easy to connect and use, but some can change the data and have other dangerous options or do not make a complete copy [4].

JTAG
Existing standardized interfaces for reading data are built into several mobile devices, e.g. to get position data from GPS equipment (NMEA) or to get deceleration information from airbag units [9]. Not all mobile devices provide such a standardized interface, nor does there exist a standard interface for all mobile devices, but all manufacturers have one problem in common: the miniaturizing of device parts opens the question of how to automatically test the functionality and quality of the soldered integrated components. For this problem an industry group, the Joint Test Action Group (JTAG), developed a test technology called boundary scan.

Despite the standardization, there are four tasks before the JTAG device interface can be used to recover the memory. To find the correct bits in the boundary scan register one must know which processor and memory circuits are used and how they are connected to the system bus. When not accessible from outside, one must find the test points for the JTAG interface on the printed circuit board and determine which test point is used for which signal. The JTAG port is not always soldered with connectors, such that it is sometimes necessary to open the device and re-solder the access port [4]. Finally, the protocol for reading the memory must be known and the correct voltage must be determined to prevent damage to the circuit [3].

The boundary scan produces a complete forensic image of the volatile and non-volatile memory. The risk of data change is minimized and the memory chip does not have to be desoldered. However, generating the image can be slow, not all mobile devices are JTAG enabled, and it can be difficult to find the test access port [5].

Forensic desoldering
Commonly referred to as a "chip-off" technique within the industry, this is the last and most intrusive method to get a memory image: desolder the non-volatile memory chip and connect it to a memory chip reader. This method contains the potential danger of total data destruction: it is possible to destroy the chip and its content because of the heat required during desoldering. Before the invention of the BGA technology it was possible to attach probes to the pins of the memory chip and to recover the memory through these probes. The BGA technique bonds the chips directly onto the PCB through molten solder balls, such that it is no longer possible to attach probes.

Desoldering the chips is done carefully and slowly, so that the heat does not destroy the chip or data. Before the chip is desoldered the PCB is baked in an oven to eliminate remaining water; otherwise the remaining water would blow the chip package apart at desoldering, producing the so-called "popcorn effect". (Image caption: Here you can see that moisture in the circuit board turned to steam when it was subjected to intense heat.) There are mainly three methods to melt the solder: hot air, infrared light, and steam-phasing. The infrared light technology works with a focused infrared light beam onto a specific integrated circuit and is used for small chips. The hot air and steam methods cannot focus as much as the infrared technique.

Chip re-balling
After desoldering the chip, a re-balling process cleans the chip and adds new tin balls to the chip. Re-balling can be done in two different ways.
• The first is to use a stencil. The stencil is chip-dependent and must fit exactly. Then the tin-solder is put on the stencil. After cooling the tin, the stencil is removed and, if necessary, a second cleaning step is done.
• The second method is laser re-balling [12][13][14]. Here the stencil is programmed into the re-balling unit. A bondhead (which looks like a tube/needle) is automatically loaded with one tin ball from a solder ball singulation tank. The ball is then heated by a laser, such that the tin-solder ball becomes fluid and flows onto the cleaned chip. Instantly after melting the ball, the laser turns off and a new ball falls into the bondhead. While reloading, the bondhead of the re-balling unit changes position to the next pin.

A third method makes the entire re-balling process unnecessary. The chip is connected to an adapter with Y-shaped springs or spring-loaded pogo pins. The Y-shaped springs need to have a ball on the pin to establish an electric connection, but the pogo pins can be used directly on the pads on the chip without the balls [3][4].

The advantage of forensic desoldering is that the device does not need to be functional and that a copy can be made without any changes to the original data. The disadvantage is that the re-balling devices are expensive, so this process is very costly, and there are some risks of total data loss. Hence, forensic desoldering should only be done by experienced laboratories [5]. For a detailed discussion see Gubian and Savoldi, 2007. For a wide overview of NAND flash forensics see Salvatore Fiorillo, 2009 [7].

Tools
Early investigations consisted of live analysis of mobile devices, with examiners photographing or writing down useful material for use as evidence. This had the disadvantage of risking the modification of the device content, as well as leaving many parts of the proprietary operating system inaccessible. In recent years a number of hardware/software tools have emerged to recover evidence from mobile devices. Most tools consist of a hardware portion, with a number of cables to connect the phone to the acquisition machine, and some software, to extract the evidence and, occasionally, to analyse it.[15] Some current tools, as examples of mobile forensics tools available to forensic investigators, include those by Radio Tactics, Cellebrite UFED, Micro Systemation XRY, Oxygen Forensic Suite 2 and Paraben Device Seizure.

Controversies
In general there exists no standard for what constitutes a supported device in a specific product. This has led to a situation where different vendors define a supported device differently: for instance, a device where logical extraction using one product only produces a list of calls made by the device may be listed as supported by that vendor, while another vendor can produce much more information. A situation such as this makes it much harder to compare products based on vendor-provided lists of supported devices. Furthermore, different products extract different amounts of information from different devices. This leads to a very complex landscape when trying to get an overview of the products. In general, this means that testing a product extensively before purchase is strongly recommended, and it is quite common to use at least two products which complement each other.

Anti-forensics
Anti-computer forensics is more difficult because of the small size of the devices and the user's restricted data accessibility [5]. Nevertheless, there are developments to secure the memory in hardware with security circuits in the CPU and memory chip, such that the memory chip cannot be read even after desoldering [16][17].

References
[1] Casey, Eoghan (2004). Digital Evidence and Computer Crime, Second Edition. Elsevier. ISBN 0-12-163104-4.
[2] Jansen, Wayne; Ayers, Rick (May 2007). Guidelines on cell phone forensics. National Institute of Standards and Technology. Retrieved from http://csrc.nist.gov/publications/nistpubs/800-101/SP800-101.pdf
[3] Willassen, Svein Y. (2006). Forensic analysis of mobile phone internal memory. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.6742&rep=rep1&type=pdf
[4] Breeuwsma, Marcel; de Jongh, Martien; Klaver, Coert; van der Knijff, Ronald; Roeloffs, Mark (2007). Forensic Data Recovery from Flash Memory. Small Scale Digital Device Forensics Journal, Volume 1 (Number 1). Retrieved from http://www.ssddfj.org/papers/SSDDFJ_V1_1_Breeuwsma_et_al.pdf
[5] van der Knijff, Ronald (2007). 10 Good Reasons Why You Should Shift Focus to Small Scale Digital Device Forensics. Retrieved from http://www.dfrws.org/2007/proceedings/vanderknijff_pres.pdf
[6] Ayers, Rick; Jansen, Wayne; Cilleros, Nicolas; Daniellou, Ronan (October 2005). Cell Phone Forensic Tools: An Overview and Analysis. Retrieved from http://csrc.nist.gov/publications/nistir/nistir-7250.pdf
[7] Fiorillo, Salvatore (December 2009). Theory and practice of flash memory mobile forensics. Theosecurity.com. Retrieved from http://igneous.scis.ecu.edu.au/proceedings/2009/forensics/Fiorillo.pdf
[8] http://www.projectaphone.com/
[9] Casey, Eoghan. Handbook of computer crime investigation: forensic tools and technology. Academic Press.
[10] http://www.forensicswiki.org/wiki/Write_Blockers
[11] Salt, Tom; Drake, Rodney (1995). US Patent 5469557: Code protection in microcontroller with EEPROM fuses. http://www.patentstorm.us/patents/5469557/description.html
[12] Homepage of Factronix (http://www.factronix.com/index.php?view=content&id=146&themeid=279&parentid=56&tpl=1&bild=1)
[13] Video: Scheme of the laser re-balling process (http://www.factronix.com/_downloads/laser.wmv)
[14] Video: Re-balling process (http://www.factronix.com/_downloads/Reballing.wmv)
[15] Kipper, Greg. Wireless Crime and Forensic Investigation. Auerbach Publications.
[16] US Patent 5937063: Secure Boot Patent (http://www.freepatentsonline.com/5937063.html)
[17] Sundaresan, Harini (July 2003). OMAP platform security features. Texas Instruments. Retrieved from http://focus.ti.com/pdfs/vf/wireless/platformsecuritywp.pdf

External links
• Video Tutorial on iPhone 4 Forensics (http://resources.infosecinstitute.com/iphone-security-iphone-forensics/)
• Seminar 'Covert Channels and Embedded Forensics' (http://www.crypto.rub.de/its_seminar_ss09.html)
• Conference 'Mobile Forensics World' (http://www.mobileforensicsworld.org/)
• SG Punja, RP Mislan. "Mobile Device Analysis" (http://www.ssddfj.org/papers/SSDDFJ_V2_1_Punja_Mislan.pdf). Small Scale Digital Device Forensics Journal. Retrieved 23 August 2010.

MyNetWatchman

MyNetWatchman is a community-based collaborative firewall log correlation system. It receives logs from volunteers worldwide and uses them to analyze attack trends. The goal of the MyNetWatchman project is to allow public access to its correlated information at no charge, to raise awareness and provide accurate and current snapshots of internet attacks. Several data feeds are provided to users, to either include in their own web sites or use as an aid in analyzing events.

External links
• Official website [1]

References
[1] http://www.mynetwatchman.com

National Information Assurance Certification and Accreditation Process

The National Information Assurance Certification and Accreditation Process (NIACAP) is the minimum-standard process for the certification and accreditation of computer and telecommunications systems that handle U.S. national-security information. NIACAP is derived from the Department of Defense Information Technology Security Certification and Accreditation Process (DITSCAP), and it plays a key role in the National Information Assurance Partnership.

References
• "National Policy on Certification and Accreditation of National Security Systems" [1] (PDF). Retrieved December 30, 2009.
• "National Information Assurance Certification and Accreditation Process (NIACAP)" [2] (PDF). Retrieved December 30, 2009.
[1] http://www.cnss.gov/Assets/pdf/CNSSP-6.pdf
[2] http://www.cnss.gov/Assets/pdf/nstissi_1000.pdf

National Information Assurance Training and Education Center

The National Information Assurance Training and Education Center (NIATEC) is an American consortium of academic, industry, and government organizations that works to improve the literacy, awareness, training and education standards in Information Assurance. NIATEC is associated with Idaho State University and is a National Security Agency Center of Academic Excellence in Information Assurance Education.[1] Dr. Corey Schou is the director of NIATEC; Dr. James Frost is the associate director. The group subscribes to the (ISC)² Code of Ethics.

NIATEC has been active in the development of training standards associated with both National Institute of Standards and Technology Special Publication 800-16 and Committee on National Security Systems Instructions 4011, 4012, 4013, 4014, 4015, and 4016. The Centers of Academic Excellence and NIATEC are components of a plan to establish a federal cyber-corps to defend against cyber-based disruption and attacks. NIATEC serves to develop professionals with IA expertise in various disciplines and ultimately contributes to the protection of the National Information Infrastructure.

References
[1] NSA official web site page listing National Centers of Academic Excellence in Information Assurance Education (http://www.nsa.gov/ia/academic_outreach/nat_cae/institutions.shtml)

External links
• Official website (http://niatec.info/)

National Strategy to Secure Cyberspace

In the United States government, the National Strategy to Secure Cyberspace is a component of the larger National Strategy for Homeland Security. The National Strategy to Secure Cyberspace was drafted by the Department of Homeland Security in reaction to the September 11, 2001 terrorist attacks. Released on February 14, 2003, after a year of research by businesses, universities, and government, and after five months of public comment, the plan advises a number of security practices as well as promotion of cyber security education. It offers suggestions, not mandates, to business, academic, and individual users of cyberspace to secure computer systems and networks.

The National Strategy to Secure Cyberspace (February 2003)

The National Strategy to Secure Cyberspace identifies three strategic objectives: (1) prevent cyber attacks against America's critical infrastructures; (2) reduce national vulnerability to cyber attacks; and (3) minimize damage and recovery time from cyber attacks that do occur. To meet these objectives, the National Strategy outlines five national priorities, with the new cabinet-level United States Department of Homeland Security as the lead agency protecting IT.[1]

The first priority, the creation of a National Cyberspace Security Response System, focuses on improving the government's response to cyberspace security incidents and reducing the potential damage from such events. It calls for a single federal center to help detect, monitor and analyze attacks. The second, third, and fourth priorities (the development of a National Cyberspace Security Threat and Vulnerability Reduction Program, the creation of a National Cyberspace Security Awareness and Training Program, and the necessity of Securing Governments' Cyberspace) aim to reduce threats from, and vulnerabilities to, cyber attacks. The fifth priority, the establishment of a system of National Security and International Cyberspace Security Cooperation, intends to prevent cyber attacks that could impact national security assets and to improve the international management of and response to such attacks.

Ultimately, the National Strategy encourages companies to regularly review their technology security plans, and individuals who use the Internet to add firewalls and anti-virus software to their systems. It also calls for expanded cyber security research and improved government-industry cooperation.

Notes
[1] "The National Strategy to Secure Cyberspace" (http://www.dhs.gov/xlibrary/assets/National_Cyberspace_Strategy.pdf) (PDF). U.S. government via Department of Homeland Security. February 2003. p. 16. Retrieved 2008-05-18.

External links
• National Strategy to Secure Cyberspace (http://www.dhs.gov/xprevprot/programs/editorial_0329.shtm) by the DHS
• Statement by the Press Secretary on Conclusion of the Cyberspace Review (http://www.whitehouse.gov/the_press_office/Statement-by-the-Press-Secretary-on-Conclusion-of-the-Cyberspace-Review/). The White House, Press Office. April 17, 2009.

Need to know

The term "need to know", when used by government and other organizations (particularly those related to the military or espionage), describes the restriction of data which is considered very sensitive. Under need-to-know restrictions, even if one has all the necessary official approvals (such as a security clearance) to access certain information, one would not be given access to such information, or read into a clandestine operation, unless one has a specific need to know; that is, access to the information must be necessary for the conduct of one's official duties. As with most security mechanisms, the aim is to make it difficult for unauthorized access to occur, without inconveniencing legitimate access. Need-to-know also aims to discourage "browsing" of sensitive material by limiting access to the smallest possible number of people.

Examples

The Battle of Normandy in 1944 is an example of a need-to-know restriction. Though thousands of military personnel were involved in planning the invasion, only a small number of them knew the entire scope of the operation; the rest were only informed of data needed to complete a small part of the plan. The same is true of the Trinity project.[1]

In computer technology

The discretionary access control mechanisms of some operating systems can be used to enforce need to know. In this case, the owner of a file determines whether another person should have access. Need to know is often concurrently applied with mandatory access control schemes, in which the lack of an official approval (such as a clearance) may absolutely prohibit a person from accessing the information. This is because need to know can be a subjective assessment. Mandatory access control schemes can also audit accesses, in order to determine if need to know has been violated.

The term is also used in graphical user interface design, where computers control complex equipment such as airplanes: when many different pieces of data are dynamically competing for finite UI space, safety-related messages are given priority.

In medical education

In medical education, "need to know" describes material that is high yield, that is, material that will unequivocally appear on an exam or be used to save a patient's life.

Problems and criticism

Need-to-know (like other security measures) can be misused by personnel who wish to refuse others access to information they hold in an attempt to increase their personal power, prevent unwelcome review of their work, prevent embarrassment resulting from actions or thoughts, or cover up illegal actions.

References
[1] http://en.wikipedia.org/wiki/Trinity_%28nuclear_test%29
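The discretionary access control described above can be sketched with ordinary file permissions. The snippet below is a minimal illustration, not part of any particular operating system's need-to-know tooling; the temporary file stands in for a sensitive document, and POSIX permission bits are assumed.

```python
import os
import stat
import tempfile

def restrict_to_owner(path):
    """Enforce a crude need-to-know rule: only the file's owner keeps access.

    Group and world permission bits are cleared, so even users with general
    access to the system (the analogue of holding a clearance) cannot read
    the file unless the owner explicitly grants it.
    """
    os.chmod(path, stat.S_IRUSR | stat.S_IWUSR)  # owner read/write only (0o600)
    return stat.S_IMODE(os.stat(path).st_mode)

# Demonstration on a temporary file standing in for a sensitive document.
fd, path = tempfile.mkstemp()
os.close(fd)
print(oct(restrict_to_owner(path)))  # 0o600 on POSIX systems
os.remove(path)
```

Mandatory access control, by contrast, is enforced by the operating system itself (for example, through security labels) and cannot be loosened by the file's owner.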

Network security policy

A network security policy is a generic document that outlines rules for computer network access, determines how policies are enforced, and lays out some of the basic architecture of the company's security/network security environment. The document itself is usually several pages long and written by a committee. It is a complex document, meant to govern data access, web-browsing habits, use of passwords and encryption, email attachments and more. It specifies these rules for individuals or groups of individuals throughout the company.

A security policy goes far beyond the simple idea of "keep the bad guys out". It should keep malicious users out and also exert control over potentially risky users within your organization. The first step in creating a policy is to understand what information and services are available (and to which users), what the potential is for damage, and whether any protection is already in place to prevent misuse. In addition, the security policy should dictate a hierarchy of access permissions, that is, grant users access only to what is necessary for the completion of their work.

While writing the security document can be a major undertaking, a good start can be achieved by using a template. The National Institute of Standards and Technology provides a security-policy guideline. The policies could be expressed as a set of instructions that can be understood by special-purpose network hardware dedicated to securing the network.

External links
• National Institute of Standards and Technology [1]

References
[1] http://csrc.nist.gov/
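The closing idea, that policies can be expressed as instructions consumed by special-purpose hardware, can be sketched as a first-match rule table. The rule fields and the sample addresses below are invented for illustration and do not come from any real policy or device.

```python
import ipaddress

# A toy network security policy: ordered rules with first-match semantics,
# ending in a default deny, the way a firewall rulebase is typically read.
RULES = [
    {"action": "deny",  "src": "any",        "dst_port": 23},     # forbid telnet
    {"action": "allow", "src": "10.0.0.0/8", "dst_port": 443},    # internal HTTPS
    {"action": "deny",  "src": "any",        "dst_port": "any"},  # default deny
]

def evaluate(src_ip, dst_port):
    """Return the action of the first rule matching the connection attempt."""
    for rule in RULES:
        src_ok = rule["src"] == "any" or \
            ipaddress.ip_address(src_ip) in ipaddress.ip_network(rule["src"])
        port_ok = rule["dst_port"] in ("any", dst_port)
        if src_ok and port_ok:
            return rule["action"]
    return "deny"

print(evaluate("10.1.2.3", 443))    # allow
print(evaluate("203.0.113.9", 23))  # deny
```

The final catch-all rule implements the "hierarchy of access permissions" idea: anything a rule does not explicitly allow is denied.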

Not Just Another Bogus List

Not Just Another Bogus List, or NJABL, is a DNS blacklist. NJABL maintains a list of known and potential spam sources (open mail relays, open proxies, open form-to-mail HTTP gateways, dynamic IP pools, and direct spammers) for the purpose of being able to tag or refuse e-mail and thereby block spam from certain sources. NJABL automatically retests only listed open relays, every 90 days.[1]

The Open Proxy IPs portion (only) of NJABL data is used in the Spamhaus XBL list.[2] NJABL's dynamic IP list originally came from Dynablock but has been developed independently since Dynablock stopped updating in December 2003.[1] The SORBS dynamic IP list is also a development from Dynablock, but is more aggressively inclusive than NJABL's version.

External links
• NJABL.org [3] - home site of the Not Just Another Bogus List project.

References
[1] NJABL.org FAQ (http://www.njabl.org/faq.html)
[2] Exploits Block List (http://www.spamhaus.org/xbl/index.lasso)
[3] http://njabl.org

Off-site data protection

In computing, off-site data protection, or vaulting, is the strategy of sending critical data out of the main location (off the main site) as part of a disaster recovery plan. Data is usually transported off-site using removable storage media such as magnetic tape or optical storage. Data can also be sent electronically via a remote backup service, which is known as electronic vaulting or e-vaulting. Sending backups off-site ensures systems and servers can be reloaded with the latest data in the event of a natural disaster, accidental error, or system crash. Sending backups off-site also ensures that there is a copy of pertinent data that isn't stored on-site.

Although some organizations manage and store their own off-site backups, many choose to have their backups managed and stored by third parties who specialize in the commercial protection of off-site data. Off-site backup services are convenient for companies that back up pertinent data on a daily basis (classified and unclassified).

Data vaults

The storage of off-site data is also known as vaulting, as backups are stored in purpose-built vaults. There are no generally recognized standards for the type of structure which constitutes a vault. That said, commercial vaults typically fit into three categories:
• Underground vaults - often converted defunct cold war military or communications facilities, or even disused mines.
• Free-standing dedicated vaults
• Insulated chambers sharing facilities - often implemented within existing record center buildings.
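Electronic vaulting can be sketched in a few lines: bundle the data, record a checksum for later verification, and write the archive to the vault location. The directories below are temporary stand-ins; a real e-vaulting setup would transmit the archive to a remote site over the network.

```python
import hashlib
import tarfile
import tempfile
from pathlib import Path

def vault(src_dir, offsite_dir):
    """Archive src_dir into the off-site location and return (path, sha256)."""
    archive = Path(offsite_dir) / "backup.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(src_dir, arcname="data")
    digest = hashlib.sha256(archive.read_bytes()).hexdigest()
    return archive, digest

# Demonstration: the "main site" and the "vault" are temporary directories.
site, offsite = Path(tempfile.mkdtemp()), Path(tempfile.mkdtemp())
(site / "records.txt").write_text("pertinent data\n")
archive, digest = vault(site, offsite)
print(archive.exists(), len(digest))  # True 64
```

Storing the checksum alongside the archive lets the vault verify, on restore, that the backup was not corrupted in transit or storage.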

Statutory obligations

Data protection statutes are usually non-prescriptive within the commercial IT arena as to how data is to be protected, but they increasingly require the active protection of data. Statutes which mandate the protection of data are:
• Federal Information Security Management Act (FISMA)
• Federal Information System Controls Audit Manual (FISCAM)
• Health Insurance Portability and Accountability Act
• Sarbanes-Oxley (SOX)
• Basel II
• Gramm-Leach-Bliley (GLBA)
• Data Protection Act 1998
• Foreign Corrupt Practices Act ("FCPA") - the FCPA of 1977

United States Federal entities have specific requirements as defined by the U.S. National Institute of Standards and Technology (NIST). NIST documentation can be obtained at http://csrc.nist.gov/publications/PubsSPs.html, and commercial agencies have the option of using these documents for compliance requirements.

• History - today's regulatory requirements started with the "Rainbow" Series, and every organization has used these standards to develop "their" version of compliance. Don't get wrapped around the NIC on compliance: use "due care", apply "due diligence", and base your infrastructure using security as the foundation.

Legal precedents
• Thomas F. Linnen, et al. v. A.H. Robins Company, Inc., et al., 10 Mass. L. Rptr. 189 (Mass. Super. Ct. 1999), 1999 WL 462015 (No. 97-2307)
• Zubulake v. UBS Warburg
• Coleman (Parent) Holdings, Inc. v. Morgan Stanley & Co., Inc., 2005 Extra LEXIS 94 (Fla. Cir. Ct. Mar. 23, 2005)
• FJS Electronics v. Fidelity Bank

References
• Protecting Data Off-Site, Gerard Nicol, 2006, ISBN 0-9802859-0-9

Open Vulnerability and Assessment Language

Open Vulnerability and Assessment Language (OVAL) is an international, community standard to promote open and publicly available security content and to standardize the transfer of this information across the entire spectrum of security tools and services. OVAL includes a language used to encode system details, and an assortment of content repositories held throughout the community. The language standardizes the three main steps of the assessment process: representing configuration information of systems for testing; analyzing the system for the presence of the specified machine state (vulnerability, configuration, patch state, etc.); and reporting the results of this assessment. The repositories are collections of publicly available and open content that utilize the language.

The OVAL community has developed three schemas written in Extensible Markup Language (XML) to serve as the framework and vocabulary of the OVAL Language. These schemas correspond to the three steps of the assessment process: an OVAL System Characteristics schema for representing system information, an OVAL Definition schema for expressing a specific machine state, and an OVAL Results schema for reporting the results of an assessment.

Content written in the OVAL Language is located in one of the many repositories found within the community. One such repository, known as the OVAL Repository, is hosted by The MITRE Corporation. It is the central meeting place for the OVAL Community to discuss, analyze, store, and disseminate OVAL Definitions. Each definition in the OVAL Repository determines whether a specified software vulnerability, configuration issue, program, or patch is present on a system.

The information security community contributes to the development of OVAL by participating in the creation of the OVAL Language on the OVAL Developers Forum and by writing definitions for the OVAL Repository through the OVAL Community Forum. An OVAL Board consisting of representatives from a broad spectrum of industry, academia, and government organizations from around the world oversees and approves the OVAL Language and monitors the posting of the definitions hosted on the OVAL Web site. This means that OVAL, which is funded by US-CERT at the U.S. Department of Homeland Security for the benefit of the community, reflects the insights and combined expertise of the broadest possible collection of security and system administration professionals worldwide.

OVAL Language

The OVAL Language [1] standardizes the three main steps of the assessment process: representing configuration information of systems for testing; analyzing the system for the presence of the specified machine state (vulnerability, configuration, patch state, etc.); and reporting the results of this assessment.

OVAL Interpreter

The OVAL Interpreter [2] is a freely available reference implementation created to show how data can be collected from a computer for testing based on a set of OVAL Definitions, and then evaluated to determine the results of each definition. The OVAL Interpreter demonstrates the usability of OVAL Definitions and can be used by definition writers to ensure correct syntax and adherence to the OVAL Language during the development of draft definitions. It is not a fully functional scanning tool and has a simplistic user interface, but running the OVAL Interpreter will provide you with a list of result values for each evaluated definition.
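The three assessment steps can be illustrated with a deliberately simplified sketch: a definition expressing a machine state to test for, collected system characteristics, and an evaluated result. The XML below only mimics the flavor of the real OVAL schemas, which are far richer; the ids, attribute names, and naive version comparison are invented for this example.

```python
import xml.etree.ElementTree as ET

# Step 1: a toy "definition" expressing the machine state to test for
# (here: whether the installed version is still below 2.0, i.e. unpatched).
DEFINITION = """
<definition id="example:def:1" class="patch">
  <criterion test="installed_version" operator="at_least" value="2.0"/>
</definition>
"""

# Step 2: toy "system characteristics" collected from the host.
SYSTEM_CHARACTERISTICS = {"installed_version": "1.4"}

# Step 3: evaluate the definition against the collected data and report.
def evaluate(definition_xml, collected):
    root = ET.fromstring(definition_xml)
    for crit in root.findall("criterion"):
        observed = collected.get(crit.get("test"), "")
        # Naive string comparison; real OVAL defines proper datatypes.
        if crit.get("operator") == "at_least" and observed < crit.get("value"):
            return "true"   # the tested (unpatched) state is present
    return "false"

print(evaluate(DEFINITION, SYSTEM_CHARACTERISTICS))  # true
```

In real OVAL these three pieces would be instances of the Definition, System Characteristics, and Results schemas respectively.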

OVAL Repository

The OVAL Repository [3] is the central meeting place for the OVAL Community to discuss, analyze, store, and disseminate OVAL Definitions. Other repositories in the community also host OVAL content, which can include OVAL System Characteristics files and OVAL Results files as well as definitions.

External links
• OVAL web site [4]
• Gideon Technologies (OVAL Board member) corporate web site [5]
• www.itsecdb.com [6] - portal for OVAL definitions from several sources
• oval.secpod.com [7] - SecPod OVAL Definitions Professional Feed

References
[1] http://oval.mitre.org/language/index.html
[2] http://oval.mitre.org/language/download/interpreter/index.html
[3] http://oval.mitre.org/repository/index.html
[4] http://oval.mitre.org/
[5] http://www.gideontechnologies.com/
[6] http://www.itsecdb.com/
[7] http://oval.secpod.com

Patch Tuesday

Patch Tuesday is usually the second Tuesday of each month, the day on which Microsoft releases security patches; it is an event for which system administrators may prepare. Starting with Windows 98, Microsoft included a "Windows Update" system that would check for patches to Windows and its components, which Microsoft would release intermittently. With the release of Microsoft Update, this system also checks for updates to other Microsoft products, such as Office, Visual Studio and SQL Server.

In order to reduce the costs related to the deployment of patches, Microsoft introduced "Patch Tuesday" in October 2003.[2] In this system, security patches are accumulated over a period of one month and then dispatched all at once on the second Tuesday of the month.[1] Patch Tuesday begins at 17:00 UTC. Sometimes there is an extraordinary Patch Tuesday, 14 days after the regular Patch Tuesday. There are also updates which are published daily (e.g., definitions for Windows Defender and Microsoft Security Essentials) or irregularly. The non-Microsoft terms for the following day are "Exploit Wednesday" and "Day Zero", when attacks may be launched against the newly announced vulnerabilities. Some speculate that Tuesday was selected so that post-patch problems could be troubleshot and resolved before the weekend, though certainly not every patch-induced injury may be cured in that time.

Patch-deployment costs

Earlier versions of the Windows Update system suffered from two problems. The first was that less-experienced users were often unaware of Windows Update and did not install it. Microsoft's solution was the "Automatic Update", which notified each user that an update was available for their system. The second problem was that customers with many copies of Windows, such as corporate users, not only had to update every Windows deployment in the company but also had to uninstall patches issued by Microsoft that broke existing functionality.
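The "second Tuesday of the month" schedule is easy to compute: find the first Tuesday, then add seven days. A small sketch:

```python
import datetime

def patch_tuesday(year, month):
    """Return the date of the second Tuesday of the given month."""
    first = datetime.date(year, month, 1)
    # date.weekday(): Monday is 0, so Tuesday is 1.
    days_until_tuesday = (1 - first.weekday()) % 7
    return first + datetime.timedelta(days=days_until_tuesday + 7)

print(patch_tuesday(2003, 10))  # 2003-10-14, the first regular Patch Tuesday
```

Because the first Tuesday falls on day 1 through 7, the second Tuesday always falls on day 8 through 14 of the month.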

Security implications

The most obvious security implication is that security problems that have a solution are withheld from the public for a period of up to a month. This policy is adequate when the vulnerability is not widely known or is extremely obscure, but that is not always the case. There have been cases where either vulnerability information or actual worms were released to the public a day or two before Patch Tuesday, knowing that there will be an entire month before Microsoft releases any patch to fix it. This did not leave Microsoft enough time to incorporate a fix for said vulnerabilities, and thus left a one-month window for attackers to exploit the hole. Malware authors can sit on a newly discovered exploitation entry point until after a given Patch Tuesday: starting to abuse an unpatched entry point on that day theoretically gives malicious code writers the longest period of time before a fix is supplied to users. Microsoft, however, issues critical patches as they become ready, so this is not generally a problem.

Exploit Wednesday

Many exploitation events are seen shortly after the release of a patch. By analyzing the patch, exploit developers can more easily figure out how to exploit the underlying vulnerability [3] and attack systems that have not been patched.[5] Therefore the term "Exploit Wednesday" was coined. Attacks are also sometimes launched against a vulnerability before a patch is available to formally fix it.[4]

Bandwidth impact

Microsoft's download servers do not honor the TCP slow-start congestion control strategy.[6] As a result, other uses of the Internet may be significantly slowed from machines actively retrieving updates. This can be particularly noticeable in environments where many machines individually retrieve updates over a shared, bandwidth-constrained link, such as those found in many small to medium sized businesses. To some extent the bandwidth demands of patching a group of computers can be alleviated by deploying Windows Server Update Services.

References
[1] "Security updates" (http://www.microsoft.com/athome/security/update/bulletins/default.mspx). Microsoft. Retrieved 2006-12-12.
[2] http://news.cnet.com/Microsoft-details-new-security-plan/2100-1002_3-5088846.html
[3] HD Moore: Better, Faster, Stronger: DLLHijackAuditKit v2 (http://blog.metasploit.com/2010/08/better-faster-stronger.html)
[4] Kurtz, George (2010-01-14). "Operation "Aurora" Hit Google, Others" (http://siblog.mcafee.com/cto/operation-aurora-hit-google-others/). mcafee.com. Retrieved 2010-01-14.
[5] Leffall, Jabulani (2007-10-12). "Are Patches Leading to Exploits?" (http://redmondmag.com/news/article.asp?editorialsid=9143). Retrieved 2007-11-02.
[6] Strong, Ben (2010-11-25). "Google and Microsoft Cheat on Slow Start" (http://blog.benstrong.com/2010/11/google-and-microsoft-cheat-on-slow.html) (blog). benstrong.com.
• Evers, Joris (2005-09-09). "Microsoft pulls 'critical' Windows update" (http://news.cnet.com/Microsoft+pulls+critical+Windows+update/2100-7349_3-5857338.html). CNET News.com. Retrieved 2009-02-25.

External links
• Microsoft: Bulletins and Advisories (http://www.microsoft.com/technet/security/bulletinsandadvisories/default.mspx) (Security Bulletin List and Search (http://www.microsoft.com/technet/security/current.aspx))
• Microsoft Support Website (http://support.microsoft.com)
• HD Moore: Exploiting DLL Hijacking Flaws, HD Moore's blog (http://blog.metasploit.com/2010/08/exploiting-dll-hijacking-flaws.html) - report on the DLL hijacking vulnerability and exploit that led to many patches on the Patch Tuesday of August 2010
• Bruce Schneier's blog (http://www.schneier.com/blog/archives/2006/07/zeroday_microso.html) - example of a quick patch response
• Bruce Schneier's blog (http://www.schneier.com/blog/archives/2006/09/microsoft_and_f.html) - example of a report about a vulnerability found in the wild, with timing seemingly coordinated with "Patch Tuesday"; the quick patch was issued not due to a security issue but for DRM-related reasons

Penetration test

A penetration test, occasionally pentest, is a method of evaluating the security of a computer system or network by simulating an attack from a malicious source, known as a Black Hat Hacker, or Cracker. The process involves an active analysis of the system for any potential vulnerabilities that could result from poor or improper system configuration, from known and unknown hardware or software flaws, or from operational weaknesses in process or technical countermeasures. This analysis is carried out from the position of a potential attacker and can involve active exploitation of security vulnerabilities.

Any security issues that are found will be presented to the system owner, together with an assessment of their impact, and often with a proposal for mitigation or a technical solution. The intent of a penetration test is to determine the feasibility of an attack and the amount of business impact of a successful exploit, if discovered. It is a component of a full security audit. For example, the Payment Card Industry Data Security Standard (PCI DSS), a security and auditing standard, requires both annual and ongoing penetration testing (after system changes).

Black box vs. white box

Penetration tests can be conducted in several ways. The most common difference is the amount of knowledge of the implementation details of the system being tested that is available to the testers. Black box testing assumes no prior knowledge of the infrastructure to be tested; the testers must first determine the location and extent of the systems before commencing their analysis. At the other end of the spectrum, white box testing provides the testers with complete knowledge of the infrastructure to be tested, often including network diagrams, source code, and IP addressing information. There are also several variations in between, often known as grey box tests. Penetration tests can also be described as "full disclosure" (white box), "partial disclosure" (grey box), or "blind" (black box) tests, based on the amount of information provided to the testing party.

Black box testing simulates an attack from someone who is unfamiliar with the system. White box testing simulates what might happen during an "inside job" or after a "leak" of sensitive information, where the attacker has access to source code, network layouts, and possibly even some passwords.

The relative merits of these approaches are debated. The services offered by penetration testing firms span a similar range, from a simple scan of an organization's IP address space for open ports and identification banners to a full audit of source code for an application.
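The "simple scan of an organization's IP address space for open ports" mentioned above is typically a TCP connect() scan. The sketch below probes only a local socket created for the demonstration; port scanning real hosts without written authorization is, in many jurisdictions, illegal.

```python
import socket

def scan(host, ports, timeout=0.25):
    """Return the subset of ports that accept a TCP connection (a connect scan)."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports

# Demonstration target: one listening socket on an ephemeral loopback port.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]
print(scan("127.0.0.1", [port]) == [port])  # True
listener.close()
```

Real scanners such as those used in black box engagements add banner grabbing, timing controls, and stealthier probe types on top of this basic idea.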

Rationale

A penetration test should be carried out on any computer system that is to be deployed in a hostile environment, in particular any Internet-facing site, before it is deployed. This provides a level of practical assurance that any malicious user will not be able to penetrate the system.

Black box penetration testing is useful in cases where the tester assumes the role of an outside hacker and tries to intrude into the system without adequate knowledge of it. However, black box penetration testing is a labor-intensive activity and requires expertise to minimize the risk to targeted systems. Basic white box penetration testing, by contrast, is often done as a fully automated, inexpensive process.

Risks

Penetration testing can be an invaluable technique in any organization's information security program. However, the possibility exists that systems may be damaged in the course of penetration testing and may be rendered inoperable, even though the organization benefits in knowing that the system could have been rendered inoperable by an intruder. Although this risk is mitigated by the use of experienced penetration testers, it can never be fully eliminated. Furthermore, testing may slow the organization's network response time due to network scanning and vulnerability scanning.

Methodologies

The Open Source Security Testing Methodology Manual (OSSTMM) is a peer-reviewed methodology for performing security tests and metrics. The OSSTMM test cases are divided into five channels which collectively test: information and data controls; personnel security awareness levels; fraud and social engineering control levels; computer and telecommunications networks, wireless devices, and mobile devices; and physical security access controls, security processes, and physical locations such as buildings, perimeters, and military bases. The OSSTMM focuses on the technical details of exactly which items need to be tested, what to do before, during, and after a security test, and how to measure the results. New tests for international best practices, laws, regulations, and ethical concerns are regularly added and updated. OSSTMM is also known for its Rules of Engagement, which define for both the tester and the client how the test needs to properly run, from prohibiting false advertising by testers to how the client can expect to receive the report.

The National Institute of Standards and Technology (NIST) discusses penetration testing in SP 800-115.[1] [2] NIST's methodology is less comprehensive than the OSSTMM; however, it is more likely to be accepted by regulatory agencies. For this reason, NIST refers to the OSSTMM.

The Information Systems Security Assessment Framework (ISSAF) is a peer-reviewed structured framework from the Open Information Systems Security Group that categorizes information system security assessment into various domains and details specific evaluation or testing criteria for each of these domains. It aims to provide field inputs on security assessment that reflect real-life scenarios. The ISSAF should primarily be used to fulfill an organization's security assessment requirements, and may additionally be used as a reference for meeting other information security needs. It includes the crucial facet of security processes, and their assessment and hardening, to get a complete picture of the vulnerabilities that might exist. The ISSAF, however, is still in its infancy.

Standards and certification

The process of carrying out a penetration test can reveal sensitive information about an organization. It is for this reason that most security firms are at pains to show that they do not employ ex-black-hat hackers and that all employees adhere to a strict ethical code. There are several professional and government certifications that indicate a firm's trustworthiness and conformance to industry best practice.

For organisations, the Council of Registered Ethical Security Testers (CREST) is a UK non-profit association created to provide recognised standards and professionalism for the penetration testing industry.[3] [4] CREST provides a provable validation of security testing methodologies and practices, aiding with client engagement and procurement processes and proving that the member company is able to provide testing services to the CREST standard. Three certifications are currently offered: the CREST Registered Tester and two CREST Certified Tester qualifications, one for infrastructure and one for application testing.

The Tiger Scheme certifies the individual, not the company. It offers two certifications: Qualified Tester (QST) and Senior Security Tester (SST). The SST is technically equivalent to the CHECK Team Leader certification, and the QST is technically equivalent to the CHECK Team Member certification.[8]

In 1999, SANS founded GIAC, the Global Information Assurance Certification, which according to SANS has been undertaken by over 20,000 members to date.[5] SANS provides a wide range of computer security training leading to a number of SANS qualifications, which are widely available worldwide. Two of the GIAC certifications are penetration-testing specific: the GIAC Certified Penetration Tester (GPEN) certification [6] and the GIAC Web Application Penetration Tester (GWAPT) certification.[7]

The Information Assurance Certification Review Board (IACRB) manages a penetration testing certification known as the Certified Penetration Tester (CPT). The CPT requires that the exam candidate pass a traditional multiple-choice exam, as well as a practical exam that requires the candidate to perform a penetration test against live servers.

The International Council of E-Commerce Consultants certifies individuals in various e-business and information security skills. These include the Certified Ethical Hacker course, the Computer Hacking Forensics Investigator program, the Licensed Penetration Tester program and various other programs.

Offensive Security offers an Ethical Hacking certification (Offensive Security Certified Professional), a training spin-off of the BackTrack Penetration Testing distribution. The OSCP is a real-life penetration testing certification, requiring holders to successfully attack and penetrate various live machines in a safe lab environment. Upon completion of the course, students become eligible to take a certification challenge, which has to be completed within twenty-four hours. Documentation must include the procedures used and proof of successful penetration, including special marker files.

Government-backed testing also exists in the US, with standards such as the NSA Infrastructure Evaluation Methodology (IEM).

For web applications, the Open Web Application Security Project (OWASP) provides a framework of recommendations that can be used as a benchmark.

Web application penetration testing

Web application penetration testing refers to a set of services used to detect various security issues with web applications and identify vulnerabilities and risks, including:
• Known vulnerabilities in COTS applications
• Technical vulnerabilities: URL manipulation, SQL injection, cross-site scripting, back-end authentication, password in memory, session hijacking, buffer overflow, web server configuration, credential management, Clickjacking, etc.
• Business logic errors: day-to-day threat analysis, unauthorized logins, personal information modification, pricelist modification, unauthorized funds transfer, breach of customer trust, etc.

The Open Web Application Security Project (OWASP), an open source web application security documentation project, has produced documents such as the OWASP Guide [9] and the widely adopted OWASP Top 10 [10] awareness document.

The Firefox browser is a popular web application penetration testing tool, with many plugins [11] designed for web application penetration testing. Damn Vulnerable Web App, otherwise known as DVWA, [12] is an open source web application which has been made to be vulnerable so that security professionals and students can learn more about web application security. Foundstone's Hacme Bank [13] simulates a banking application; it helps developers and auditors practice web application attacks, including input validation flaws such as SQL injection and Cross Site Scripting (XSS).

References
[1] Special Publication 800-42, Guideline on Network Security Testing (http://csrc.nist.gov/publications/nistpubs/800-42/NIST-SP800-42.pdf)
[2] Special Publication 800-115, Technical Guide to Information Security Testing and Assessment, September 2008 (replaces SP800-42) (http://csrc.nist.gov/publications/nistpubs/800-115/SP800-115.pdf)
[3] "Infosec 2008: UK association of penetration testers launched" (http://www.computerweekly.com/Articles/2008/04/24/230417/infosec-2008-uk-association-of-penetration-testers.htm). Computer Weekly. 2008-04-24. Retrieved 2008-08-16.
[4] King, Leo (2008-04-24). "Security testing standards council launched" (http://www.computerworlduk.com/management/security/cybercrime/news/index.cfm?newsid=8730). Computerworld UK. Retrieved 2008-08-16.
[5] http://www.crest-approved.org/why_certify.php
[6] http://giac.org/certifications/security/GPEN.php, SANS Institute
[7] http://giac.org/certifications/security/GWAPT.php
[8] http://www.cesg.gov.uk/products_services/iacs/check/index.shtml
[9] http://www.owasp.org/index.php/OWASP_Guide_Project
[10] http://www.owasp.org/index.php/OWASP_Top_Ten_Project
[11] https://addons.mozilla.org/en-US/firefox/collection/webappsec
[12] http://www.dvwa.co.uk/
[13] http://www.foundstone.com/us/resources/proddesc/hacmebank.htm

Further reading
• Johnny Long, Google Hacking for Penetration Testers, Syngress, 2007, ISBN 978-1597491761
• Stuart McClure, Hacking Exposed: Network Security Secrets and Solutions, McGraw-Hill, 2009, ISBN 978-0071613743

External links
• List of Network Penetration Testing software (https://mosaicsecurity.com/categories/27-network-penetration-testing), Mosaic Security Research
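The input validation flaws mentioned above can be illustrated concretely. The following toy sketch (not taken from any of the tools named in this section; the table, user and login functions are hypothetical) shows how a classic SQL injection payload bypasses a naively concatenated query, while a parameterized query treats the same input as plain data:

```python
import sqlite3

# In-memory demo database with one hypothetical account.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_unsafe(name, password):
    # Vulnerable: attacker-controlled input is pasted into the SQL text,
    # so the input can change the structure of the query itself.
    q = "SELECT * FROM users WHERE name = '%s' AND password = '%s'" % (name, password)
    return conn.execute(q).fetchone() is not None

def login_safe(name, password):
    # Parameterized query: input is bound as data, never parsed as SQL.
    q = "SELECT * FROM users WHERE name = ? AND password = ?"
    return conn.execute(q, (name, password)).fetchone() is not None

# A classic injection payload bypasses the unsafe check but not the safe one.
payload = "' OR '1'='1"
print(login_unsafe("alice", payload))  # True: authentication bypassed
print(login_safe("alice", payload))    # False: treated as a literal password
```

The unsafe variant succeeds because the resulting WHERE clause becomes `... AND password = '' OR '1'='1'`, which is true for every row.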

Presumed security

Presumed security is a principle in security engineering that a system is safe from attack due to an attacker assuming, on the basis of probability, that it is secure. The reasons for an attacker to make this assumption may range from personal risk (the attacker believes the system owners can easily identify, capture and prosecute them) to technological knowledge (the attacker believes the system owners have sufficient knowledge of security techniques to ensure no flaws exist, rendering an attack moot). Although this approach to security is implicitly understood by security professionals, it is rarely discussed or documented.

Presumed security is the opposite of security through obscurity. A system relying on security through obscurity may have actual security vulnerabilities, but its owners or designers deliberately make the system more complex in the hope that attackers are unable to find a flaw. Conversely, a system relying on presumed security makes no attempt to address its security flaws, which may be publicly known, but instead relies upon potential attackers simply assuming that the target is not worth attacking.

The phrase "presumed security" appears to have been first coined by the security commentary website Zero Flaws.[1] The article uses the Royal Military Academy Sandhurst as an example, focusing on the apparent lack of entry security and contrasting it against the presumed security a military installation will have. The article also details the flaws inherent in a trust seal such as the Verisign Secure Site seal, and explains why this presumed security approach is actually detrimental to an overall security posture.

References & notes
[1] Zero Flaws: Presumed Security (http://www.zeroflaws.net/presumedsecurity)

Privilege revocation

Privilege revocation is the act of an entity giving up some, or all of, the privileges they possess, or of some authority taking those (privileged) rights away.

Computer security

In computing security, privilege revocation is a measure taken by a program to protect the system against misuse of itself. Revocation of privileges is a technique of defensive programming: if a program does not revoke privileges, it risks the escalation of privileges. Privilege revocation is a variant of privilege separation, whereby the program terminates the privileged part immediately after it has served its purpose.

Honoring the principle of least privilege at a granularity provided by the base system, such as sandboxing of (to that point successful) attacks to an unprivileged user account, helps the reliability of computing services provided by the system: the chances of restarting such a process are better, and other services on the same machine aren't affected (or at least probably not as much as in the alternative case, i.e. a privileged process gone haywire).
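The revocation of privileges described above can be sketched for a Unix-like system. This is an illustrative fragment, not a complete hardening routine (a fuller version would also clear supplementary groups with setgroups()); the uid/gid of 65534 for the "nobody" account is an assumption that varies by system:

```python
import os

def drop_privileges(uid, gid):
    """Irreversibly give up elevated privileges by switching to an
    unprivileged user and group. Order matters: the group is changed
    first, because after setuid() the process would no longer have the
    right to call setgid()."""
    os.setgid(gid)
    os.setuid(uid)

# Typical use in a daemon: acquire privileged resources first (for
# example, bind a port below 1024 while still root), then revoke.
if os.getuid() == 0:
    drop_privileges(65534, 65534)  # "nobody" on many systems (assumption)
```

After the call, any later attempt to regain root (for example `os.setuid(0)`) fails with a permission error, which is exactly the escalation the technique prevents.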

Law terminology

In law, the general term is often used when discussing some paper, such as a driver's licence, being voided after a (negative) condition is met by the holder.

References
• State of Rhode Island General Assembly, AN ACT RELATING TO SUSPENSION OF SCHOOL BUS DRIVER'S CERTIFICATES [1], CHAPTER 36, 97-H 5836 am, Approved July 1, 1997
• Protection Profile for Privilege-Directed Content [2], Authoriszor Ltd., Ref: Auth_CC/PP/DES/01, Issue 1.3, 22 December 2000
• LOMAC: Low Water-Mark Integrity Protection for COTS Environments [3] by Timothy Fraser

[1] http://www.rilin.state.ri.us/PublicLaws/law97/law97036.htm
[2] http://www.cesg.gov.uk/products_services/iacs/cc_and_itsec/media/protection-profiles/authpp.pdf
[3] http://alum.wpi.edu/~tfraser/Papers/timothy-fraser-2000-1.pdf

Privilege separation

In computer programming and computer security, privilege separation is a technique in which a program is divided into parts which are limited to the specific privileges they require in order to perform a specific task. This is used to mitigate the potential damage of a computer security attack.

A common method to implement privilege separation is to have a computer program fork into two processes. The main program drops privileges, and the smaller program keeps privileges in order to perform a certain task. The two halves then communicate via a socket pair. Thus, any successful attack against the larger program will gain minimal access, even though the pair of programs will be capable of performing privileged operations.

Privilege separation is traditionally accomplished by distinguishing a real user ID/group ID from the effective user ID/group ID, using the setuid(2)/setgid(2) and related system calls, which were specified by POSIX. If these are incorrectly positioned, gaps can allow widespread network penetration. Many network service daemons have to do a specific privileged operation, such as opening a raw socket or an Internet socket in the well-known ports range. Such software tends to separate privileges by revoking them completely after the critical section is done, and to change the user it runs under to some unprivileged account after doing so. This action is known as dropping root under Unix-like operating systems. The unprivileged part is usually run under the "nobody" user or an equivalent separate user account.

Privilege separation can also be done by splitting the functionality of a single program into multiple smaller programs, and then assigning the extended privileges to particular parts using file system permissions. That way the different programs have to communicate with each other through the operating system, so the scope of the potential vulnerabilities is limited (since a crash in the less privileged part cannot be exploited to gain privileges, merely to cause a denial-of-service attack).

Administrative utilities can require particular privileges at run-time as well. Solaris implements a separate set of functions for privilege bracketing.

Separation of privileges is one of the major OpenBSD security features. The implementation of Postfix was also focused on comprehensive privilege separation.
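The fork-and-socket-pair pattern described above can be sketched as follows. This is an illustrative toy, not OpenSSH's or Postfix's actual implementation; the message strings and the "nobody" uid/gid of 65534 are assumptions:

```python
import os
import socket

def spawn_worker():
    """Fork into a privileged parent and an unprivileged child that
    talk over a socket pair, as in classic privilege separation."""
    parent_sock, child_sock = socket.socketpair()
    pid = os.fork()
    if pid == 0:                      # child: the large, untrusted half
        parent_sock.close()
        if os.getuid() == 0:          # drop root if we have it; 65534 is
            os.setgid(65534)          # assumed to be "nobody" here
            os.setuid(65534)
        # Ask the privileged parent to perform one narrow operation.
        child_sock.sendall(b"need-privileged-op")
        reply = child_sock.recv(64)
        child_sock.close()
        os._exit(0 if reply == b"done" else 1)
    else:                             # parent: small, keeps its privileges
        child_sock.close()
        request = parent_sock.recv(64)
        if request == b"need-privileged-op":
            # ... the single privileged task would happen here ...
            parent_sock.sendall(b"done")
        parent_sock.close()
        _, status = os.waitpid(pid, 0)
        return os.waitstatus_to_exitcode(status)

print(spawn_worker())
```

The key property is that a compromise of the child buys the attacker only the unprivileged account plus whatever narrow requests the parent is willing to honor over the socket.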

External links
• Theo de Raadt: Exploit Mitigation Techniques in OpenBSD [1] (slides)
• Niels Provos, Markus Friedl, Peter Honeyman: Preventing Privilege Escalation [2] (paper)
• Niels Provos: Privilege Separated OpenSSH [3] (project)
• Trusted Solaris Developer's Guide: Bracketing Effective Privileges [4]

[1] http://www.openbsd.org/papers/ven05-deraadt/
[2] http://niels.xtdnet.nl/papers/privsep.pdf
[3] http://www.citi.umich.edu/u/provos/ssh/privsep.html
[4] http://docs.sun.com/app/docs/doc/816-1042/6m7g4ma52?a=view

Protection mechanism

In computer science, protection mechanisms are built into a computer architecture to support the enforcement of security policies.[1] A simple definition of a security policy is "to set who may use what information in a computer system".[1] The access matrix model, first introduced in 1971,[2] is a generalized description of operating system protection mechanisms.[3] The separation of protection and security is a special case of the separation of mechanism and policy.[4]

Notes
[1] Jones 1975
[2] Lampson 1971
[3] Landwehr 1981
[4] Wulf 74, pp. 337-345

References
• Anita K. Jones, Richard J. Lipton: The enforcement of security policies for computation (http://doi.acm.org/10.1145/800213.806538). Proceedings of the fifth ACM Symposium on Operating Systems Principles, Austin, Texas, United States, 1975, pp. 197-206.
• Butler W. Lampson (1971). "Protection". Proceedings of the 5th Princeton Conference on Information Sciences and Systems, p. 437.
• Carl E. Landwehr: Formal Models for Computer Security (http://doi.acm.org/10.1145/356850.356852). ACM Computing Surveys, Volume 13, Issue 3 (September 1981), pp. 247-278. Also available at http://crypto.stanford.edu/~ninghui/courses/Fall03/papers/landwehr_survey.pdf
• W. Wulf, E. Cohen, W. Corwin, A. Jones, R. Levin, C. Pierson, F. Pollack (June 1974). "HYDRA: the kernel of a multiprocessor operating system" (http://portal.acm.org/citation.cfm?id=364017&coll=portal&dl=ACM). Communications of the ACM 17 (6): 337-345. doi:10.1145/355616.364017. ISSN 0001-0782. Also available at http://www.cs.virginia.edu/papers/p337-wulf.pdf
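The access matrix model cited above can be sketched as a mapping from (subject, object) pairs to sets of rights, with a reference-monitor check consulting the matrix before every operation. The subjects, objects and rights below are invented purely for illustration:

```python
# Illustrative sketch of an access matrix: rows are subjects, columns are
# objects, and each cell holds the subject's rights over that object.
access_matrix = {
    ("alice", "file1"): {"read", "write"},
    ("alice", "file2"): {"read"},
    ("bob",   "file2"): {"read", "write", "own"},
}

def check_access(subject, obj, right):
    """The reference-monitor check: permit an operation only if the
    requested right appears in the cell for (subject, object)."""
    return right in access_matrix.get((subject, obj), set())

print(check_access("alice", "file1", "write"))  # True
print(check_access("alice", "file2", "write"))  # False
```

Real protection mechanisms store this matrix differently (per-object access control lists, or per-subject capability lists), but the policy question they answer is the same cell lookup.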

Protection Profile

A Protection Profile (PP) is a document used as part of the certification process according to the Common Criteria (CC). As the generic form of a Security Target (ST), it is typically created by a user or user community and provides an implementation-independent specification of information assurance security requirements. A PP is a combination of threats, security objectives, assumptions, security functional requirements (SFRs), security assurance requirements (SARs) and rationales.

A PP specifies generic security evaluation criteria to substantiate vendors' claims for a given family of information system products. Among others, it typically specifies the Evaluation Assurance Level (EAL), a number 1 through 7, indicating the depth and rigor of the security evaluation, usually in the form of supporting documentation and testing, that a product meets the security requirements specified in the PP.

The National Institute of Standards and Technology (NIST) and the National Security Agency (NSA) have agreed to cooperate on the development of validated U.S. government PPs.

Purpose

A PP states a security problem rigorously for a given collection of systems or products, known as the Target of Evaluation (TOE), and specifies security requirements to address that problem without dictating how these requirements will be implemented. A PP may inherit requirements from one or more other PPs.

In order to get a product evaluated and certified according to the CC, the product vendor has to define a Security Target (ST) which may comply with one or more PPs. In this way a PP may serve as a template for the product's ST.

Problem areas

Although the EAL is easiest for laymen to compare, its simplicity is deceptive, because the number is rather meaningless without an understanding of the security implications of the PP(s) and ST used for the evaluation. Technically, comparing evaluated products requires assessing both the EAL and the functional requirements. Unfortunately, interpreting the security implications of the PP for the intended application requires very strong IT security expertise. Evaluating a product is one thing, but deciding if some product's CC evaluation is adequate for a particular application is quite another. It is not obvious what trusted agency possesses the depth of IT security expertise needed to evaluate the applicability of Common Criteria evaluated products.

The problem of applying evaluations is not new. It was addressed decades ago by a massive research project that defined software features that could protect information, evaluated their strength, and mapped the security features needed for specific operating environment risks. The results were documented in the Rainbow Series. Rather than separating the EAL and functional requirements, the Orange Book followed a less advanced approach, defining functional protection capabilities and appropriate assurance requirements as a single category. Seven such categories were defined in this way. Further, the Yellow Book defined a matrix of security environments and assessed the risk of each, then established precisely which security environments were valid for each of the Orange Book categories. This approach produced an unambiguous layman's cookbook for determining whether a product was usable in a particular application. The loss of this application technology seems to have been an unintended consequence of the superseding of the Orange Book by the Common Criteria.

Security devices with PPs

Validated US Government PPs
• Anti-Virus
• Key Recovery
• PKI/KMI
• Biometrics
• Certificate Management
• Tokens
• DBMS
• Firewalls
• Operating System
• IDS/IPS
• Peripheral Switch

Draft US Government PPs
• Switches and Routers
• Biometrics
• Remote Access
• Mobile Code
• Secure Messaging
• Multiple Domain Solutions
• VPN
• Wireless LAN
• Guards
• Single-Level Web Server
• Separation Kernel

Validated Non-U.S. Government PPs
• Smart Cards
• Remote electronic voting systems[1]

External links
• NIAP Protection Profiles [2]
• Computer Security Act of 1987 [3]

References
[1] M. Volkamer (2009). Evaluation of Electronic Voting, Chapter 8 (http://springerlink.com/content/978-3-642-01661-5/contents/). Springer. ISBN 978-3-642-01661-5.
[2] http://www.niap-ccevs.org/pp/
[3] http://www.cio.gov/archive/computer_security_act_jan_1998.html

Responsible disclosure

Responsible disclosure is a computer security term describing a vulnerability disclosure model. It is like full disclosure, with the addition that all stakeholders agree to allow a period of time for the vulnerability to be patched before the details are published.

Developers of hardware and software often require time and resources to repair their mistakes. Hackers and computer security scientists have the opinion that it is their social responsibility to make the public aware of vulnerabilities with a high impact; hiding these problems could cause a feeling of false security. To avoid both dangers, the involved parties join forces and agree on a period of time for repairing the vulnerability and preventing any future damage. Depending on the potential impact of the vulnerability, this period may vary between a few weeks and several months. It is easier to patch software by using the Internet as a distribution channel.

Responsible disclosure fails to satisfy security researchers who expect to be financially compensated, while reporting vulnerabilities to the vendor with the expectation of compensation might be viewed as extortion. While a market for vulnerabilities has developed, vulnerability commercialization remains a hotly debated topic tied to the concept of vulnerability disclosure. Today, the two primary players in the commercial vulnerability market are iDefense, which started its vulnerability contributor program (VCP) in 2003, and TippingPoint, with its zero-day initiative (ZDI), started in 2005. These organisations follow the responsible disclosure process with the material bought. Between March 2003 and December 2007, an average of 7.5% of the vulnerabilities affecting Microsoft and Apple were processed by either VCP or ZDI.[1]

Many, if not all, of the CERT groups coordinate responsible disclosures. Vendor-sec is a responsible disclosure mailing list.

Security vulnerabilities resolved by applying responsible disclosure:
• Dan Kaminsky's discovery of DNS cache poisoning: 5 months[2]
• Radboud University Nijmegen breaks the security of the MIFARE Classic cards: 6 months[3]
• MIT students find vulnerability in the Massachusetts subway security (MBTA vs. Anderson): 5 months[4]
• MD5 collision attack that shows how to create false CA certificates: 1 week[5]

References
[1] "Paper measuring the prevalence of responsible disclosure and model of the processes of the security ecosystem" (http://www.techzoom.net/papers/weis_security_ecosystem_2009.pdf).
[2] "Dan Kaminsky discovery of DNS cache poisoning" (http://www.cert.org/netsa/publications/faber-OARC2008.pdf).
[3] "Researchers break the security of the MIFARE Classic cards" (http://www2.ru.nl/media/pressrelease.pdf).
[4] "MIT students find vulnerability in the Massachusetts subway security" (http://tech.mit.edu/V128/N30/subway.pdf).
[5] "MD5 collision attack that shows how to create false CA certificates" (http://www.phreedom.org/blog/2009/verisign-and-responsible-disclosure/).

RISKS Digest

The RISKS Digest, or Forum On Risks to the Public in Computers and Related Systems, is an online periodical published since 1985 by the Committee on Computers and Public Policy of the Association for Computing Machinery. The editor is Peter G. Neumann.

It is a moderated forum concerned with the security and safety of computers, software, and technological systems. Security and risk here are taken broadly: RISKS is concerned not merely with so-called security holes in software, but with unintended consequences and hazards stemming from the design (or lack thereof) of automated systems. Other recurring subjects include cryptography and the effects of technically ill-considered public policies. RISKS also publishes announcements and calls for papers from various technical conferences, as well as technical book reviews (usually by Rob Slade, though occasionally by others).

Although RISKS is a forum of a computer science association, most contributions are readable and informative to anyone with an interest in the subject. It is heavily read by system administrators and computer security managers, as well as computer scientists and engineers.

The RISKS Digest is published on a frequent but irregular schedule through the moderated Usenet newsgroup comp.risks, which exists solely to carry the Digest. Summaries of the forum appear as columns edited by Neumann in the ACM SIGSOFT Software Engineering Notes (SEN) and the Communications of the ACM (CACM).

External links
• RISKS Digest (Usenet newsgroup comp.risks)
• Google Groups interface to comp.risks [1]
• RISKS Digest web archive [2]
• Mailing list subscription web interface [3]

References
[1] http://groups.google.com/group/comp.risks
[2] http://www.risks.org/
[3] http://lists.csl.sri.com/mailman/listinfo/risks

Same origin policy

In computing, the same origin policy is an important security concept for a number of browser-side programming languages, such as JavaScript. The policy permits scripts running on pages originating from the same site to access each other's methods and properties with no specific restrictions, but prevents access to most methods and properties across pages on different sites.

This mechanism bears a particular significance for modern web applications that extensively depend on HTTP cookies to maintain authenticated user sessions, as servers act based on the HTTP cookie information to reveal sensitive information or take state-changing actions. A strict separation between content provided by unrelated sites must be maintained on the client side to prevent the loss of data confidentiality or integrity.

History

The concept of same origin policy dates back to Netscape Navigator 2.0. Close derivatives of the original design are used in all current browsers and are often extended to define roughly compatible security boundaries for other web scripting languages, such as Adobe Flash, or for mechanisms other than direct DOM manipulation, such as XMLHttpRequest.

Origin determination rules

The term "origin" is defined using the domain name, application layer protocol, and (in most browsers) TCP port of the HTML document running the script. Two resources are considered to be of the same origin if and only if all these values are exactly the same. To illustrate, the following table gives an overview of typical outcomes for checks against the URL "http://www.example.com/dir/page.html":

Compared URL | Outcome | Reason
http://www.example.com/dir/other.html | Success | Same protocol and host
http://www.example.com/dir2/other.html | Success | Same protocol and host
http://www.example.com:81/dir/other.html | Failure | Same protocol and host but different port
https://www.example.com/dir/other.html | Failure | Different protocol
http://en.example.com/dir/other.html | Failure | Different host
http://example.com/dir/other.html | Failure | Different host (exact match required)
http://v2.www.example.com/dir/other.html | Failure | Different host (exact match required)

Additional document.domain logic

An important extension to the same origin policy, implemented for JavaScript DOM access (but not for most of the other flavors of same-origin checks), is that two sites sharing a common top-level domain may opt to communicate despite failing the "same host" check, by mutually setting their respective document.domain DOM property to the same qualified, right-hand fragment of their current host name. For example, if http://en.example.com/ and http://fr.example.com/ both set document.domain to "example.com", they would be from that point on considered same-origin for the purpose of DOM manipulation.
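The origin determination rules in the table above can be expressed as a small comparison function. This is an illustrative sketch of the scheme/host/port check, not any browser's actual implementation:

```python
from urllib.parse import urlsplit

def origin(url):
    """Return the (scheme, host, port) triple that defines an origin.
    An absent port defaults to the scheme's standard port."""
    parts = urlsplit(url)
    default = {"http": 80, "https": 443}.get(parts.scheme)
    return (parts.scheme, parts.hostname, parts.port or default)

def same_origin(a, b):
    # Two URLs are same-origin iff protocol, host and port all match exactly.
    return origin(a) == origin(b)

base = "http://www.example.com/dir/page.html"
print(same_origin(base, "http://www.example.com/dir2/other.html"))   # True
print(same_origin(base, "http://www.example.com:81/dir/other.html")) # False
print(same_origin(base, "https://www.example.com/dir/other.html"))   # False
print(same_origin(base, "http://en.example.com/dir/other.html"))     # False
```

Note that the path plays no role, and that the host comparison is an exact string match, which is why even a parent domain like example.com or a deeper subdomain like v2.www.example.com fails the check.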

Corner cases and exceptions

The behavior of same-origin checks and related mechanisms is not well-defined in a number of corner cases, such as for protocols that do not have a clearly defined host name or port associated with their URLs (file:, data:, etc.). This historically caused a fair number of security problems, such as the generally undesirable ability of any locally stored HTML file to access all other files on the disk, or to communicate with any site on the Internet.

In addition, many legacy cross-domain operations predating JavaScript are not subjected to same-origin checks; one such example is the ability to include scripts across domains, or to submit POST forms.

Lastly, certain types of attacks, such as DNS rebinding or server-side proxies, permit the host name check to be partly subverted, and make it possible for rogue web pages to directly interact with sites through addresses other than their "true", canonical origin. The impact of such attacks is limited to very specific scenarios, since the browser still believes that it is interacting with the attacker's site, and therefore does not disclose third-party cookies or other sensitive information to the attacker.

Workarounds

To enable developers to circumvent the same origin policy in a controlled manner, a number of "hacks" such as using the fragment identifier or the window.name property have been used to pass data between documents residing in different domains. With the HTML5 standard, a method was finally formalized for this: the postMessage interface, which is only available on recent browsers. JSONP is a popular cross-domain alternative to XMLHttpRequest (Ajax).

For supporting older browsers, the JavaScript library easyXDM [1] can be used. It provides a unified API for the postMessage interface as well as a number of hacks used to allow cross-domain messaging (XDM), while at the same time ensuring that the communication is secure. The library also allows you to set up methods for remote procedure calls, and can therefore be used to easily access methods with arguments and return values across the domain boundary. easyXDM is currently used by major sites such as Disqus, Twitter and Scribd.

External links
• A detailed comparison of several flavors of same-origin policies [2]
• A review of deficiencies in same-origin policies and their implication for web security [3]
• Sample vendor-provided same origin policy specification [4]
• Defeating, breaking and bypassing the Same Origin Policy [5]

References
[1] http://easyxdm.net
[2] http://code.google.com/p/browsersec/wiki/Part2#Same-origin_policy
[3] http://taossa.com/index.php/2007/02/08/same-origin-policy/
[4] http://www.mozilla.org/projects/security/components/same-origin.html
[5] http://www.azizsaleh.com/index.php/ScriptsAndResources/Flex-Bypass-Same-Origin-Policy

Schneier's Law

The term Schneier's law was coined by Cory Doctorow in his speech about Digital Rights Management for Microsoft Research.[1] The law is phrased as:

"Any person can invent a security system so clever that he or she can't imagine a way of breaking it."

He attributes this to Bruce Schneier, presumably making reference to his book Applied Cryptography, although the principle predates its publication. In The Codebreakers, David Kahn states: "Few false ideas have more firmly gripped the minds of so many intelligent men than the one that, if they just tried, they could invent a cipher that no one could break."

References
• Bruce Schneier (1994). Applied Cryptography. John Wiley & Sons. ISBN 0-471-59756-2
[1] Cory Doctorow (2004-06-17). "Microsoft Research DRM talk" (http://web.archive.org/web/20061202192720/http://www.dashes.com/anil/stuff/doctorow-drm-ms.html). Archived from the original (http://www.dashes.com/anil/stuff/doctorow-drm-ms.html) on 2006-12-02. Retrieved 2006-12-31.

Secure attention key

The secure attention key (SAK) is a special key combination to be pressed on a computer keyboard before a login screen is presented. The secure attention key is used to make login spoofing more difficult: only the kernel, which is the part of the operating system that interacts directly with the hardware, can detect whether the secure attention key has been pressed, so a login prompt that appears in response to it cannot have been forged by an ordinary program. Users are advised to be suspicious of login prompts that appear without this key combination having been pressed. In Microsoft Windows this is handled by the Winlogon component.[1]

Examples

Some examples are:
• Ctrl+Alt+Del for Windows NT-based systems (called the Secure Attention Sequence)
• Ctrl+Alt+Pause or the SysRq+K sequence for Linux[2]
• Ctrl+X Ctrl+R for AIX
• Break for OpenVMS
• Shift+Stop for PLATO IV in the 1970s

Use by BIOS

A similar combination, such as Ctrl+Alt+Del, is often used by a PC system BIOS to force a reboot during a boot sequence.

References
[1] Soumitra Sengupta (2005-10-12). "Using CTRL+ALT+DELETE KEYS for logging on" (http://support.microsoft.com/kb/555476). Microsoft. Retrieved 2011-05-30.
[2] Andrew Morton (2001-03-18). "Linux 2.4.2 Secure Attention Key (SAK) handling" (http://www.kernel.org/doc/Documentation/SAK.txt). Linux Kernel Organization. Retrieved 2011-05-30.

Secure by default

Security by default, in software, means that the default configuration settings are the most secure settings possible, which are not necessarily the most user-friendly settings. In many cases, security and user friendliness are weighed based on both risk analysis and usability tests. This leads to the discussion of what the most secure settings actually are; as a result, the precise meaning of "secure by default" remains undefined.

Secure defaults can, however, lead to less functionality or reduced flexibility, and not all users will care about security: some may be obstructed by secure settings. A common example is whether or not blank passwords are allowed for login; not everyone can, or is willing to, type or memorize a password. Similarly, some servers or devices that have an authentication system come with default usernames and passwords. If these are not properly changed, anyone who knows the default configuration can successfully authenticate.

Another way to secure a program or system is through abstraction, where the user is presented an interface in which the user cannot (or is discouraged to) cause accidental data loss. Having user control preferences does not typically cause this, but comes at the cost of a larger part of the user interface being devoted to configuration controls. If a program uses secure configuration settings by default, the user will be better protected.

Operating systems

OpenBSD claims to be the only operating system that is fully secure by default. This, however, does not mean it is inherently the most secure operating system, because that depends on the definition of an operating system. There are many operating systems that are not capable of networking with other systems, and, considering the amount of network-based security compromises today, one can argue that such an operating system is more secure. OpenBSD is a network operating system. In a network operating system, secure by default typically means first and foremost that there are no listening INET(6) domain sockets after installation; that is, no open network ports. This can be checked on the local machine with a tool such as netstat, and remotely with a port scanner such as nmap. As a general rule, a secure network is only as secure as the least secure node in the entire network.

Ubuntu is a GNU/Linux distribution aimed at the desktop user that by default hides the administrative account and only allows the first user to gain administrative privileges for certain system tasks (such as installing system updates and managing disk drives), while users with limited rights can still fully utilise the system. Mac OS X does not hide this account. Microsoft Windows and Linspire have been criticised for allowing the user to have administrative privileges without warning, a potential threat to the system. Windows Vista attempts to remedy this situation through its User Account Control system.
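The "no open network ports" property mentioned above (checked locally with netstat, remotely with nmap) can also be probed directly: a TCP connection that completes means something is listening on that port. The port list below is an arbitrary illustrative sample, not a complete audit:

```python
import socket

def port_open(host, port, timeout=0.5):
    """Probe a single TCP port the way a scanner such as nmap would:
    a completed connect() means something is listening there."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# Minimal local sketch: a secure-by-default installation should report
# no listeners on common service ports such as these.
for p in (22, 25, 80, 443):
    if port_open("127.0.0.1", p):
        print("port %d is listening" % p)
```

Dedicated scanners are far more capable (UDP, service fingerprinting, timing control), but the underlying check is this simple connect attempt.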

Secure error messages in software systems

In computer security and usability of software systems, an important issue is the design of error messages in a way that prevents security vulnerabilities. This aspect of software security has only recently begun to receive increased attention. Some of the primary recommended design principles include:
• Don't give error messages that could be exploited by a cracker to obtain information that is otherwise difficult to obtain. A commonly-cited example is a login system that shows either "Invalid user" or "Invalid password" depending on which is incorrect; this allows an attacker to determine a valid username without knowledge of any user passwords. Another common example is the IIS 5.0 web server's error page, which features a complete technical description of the error, including a source code fragment. If this additional information is sometimes useful for debugging or advanced diagnosis, either hide it by default, log it in a separate location, or require special privileges to view it.
• When asking a question, give the user enough information to make an intelligent decision. Otherwise, for lack of information, they will choose the option that allows them to make progress, often resulting in compromised security.
• Don't give so much information that the user is overwhelmed or confused and so unable to make an intelligent decision. Again, if this information is useful, log it in a separate location or strictly limit access to it.

External links
• Everett McKay, MSDN: Writing Error Messages for Security Features [1]

References
[1] http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnsecure/html/securityerrormessages.asp
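The "Invalid user" / "Invalid password" leak described above can be made concrete with a sketch. The user table and message strings are hypothetical, and a real system would of course store salted password hashes rather than plaintext.

```python
# Hypothetical credential store, for illustration only; real systems
# must store salted password hashes, never plaintext passwords.
USERS = {"alice": "correct horse battery staple"}

def login_leaky(username: str, password: str) -> str:
    """Insecure: the error text reveals whether the username exists."""
    if username not in USERS:
        return "Invalid user"
    if USERS[username] != password:
        return "Invalid password"
    return "Welcome"

def login_secure(username: str, password: str) -> str:
    """Safer: one generic message, so attackers cannot enumerate users."""
    if USERS.get(username) != password:
        return "Invalid username or password"
    return "Welcome"
```

With login_leaky, an attacker who sees "Invalid password" has just confirmed a valid account name; login_secure returns the same reply for both failure cases, so nothing is learned.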

Security controls

Security controls are safeguards or countermeasures to avoid, counteract or minimize security risks. To help review or design security controls, they can be classified by several criteria, for example according to the time that they act relative to a security incident:
• Before the event, preventive controls are intended to prevent an incident from occurring, e.g. by locking out unauthorized intruders.
• During the event, detective controls are intended to identify and characterize an incident in progress, e.g. by sounding the intruder alarm and alerting the security guards or police.
• After the event, corrective controls are intended to limit the extent of any damage caused by the incident, e.g. by recovering the organization to normal working status as efficiently as possible.
(Some security professionals would add further categories such as deterrent controls and compensation.)

Security controls can also be categorized according to their nature, for example:
• Physical controls, e.g. fences, doors, locks and fire extinguishers
• Procedural controls, e.g. incident response processes, management oversight, security awareness and training
• Technical controls, e.g. user authentication (login) and logical access controls, antivirus software, firewalls
• Legal and regulatory or compliance controls, e.g. privacy laws, policies and clauses

A similar categorization distinguishes controls involving people, technology and operations/processes. Information security controls protect the confidentiality, integrity and/or availability of information (the so-called CIA Triad). Again, some would add further categories such as non-repudiation and accountability, depending on how narrowly or broadly the CIA Triad is defined; others argue that these are subsidiary categories. This is simply a matter of semantics.

In telecommunications, security controls are defined as security services as part of the OSI Reference Model by ITU-T Recommendation X.800. X.800 and ISO 7498-2 (Information processing systems – Open systems interconnection – Basic Reference Model – Part 2: Security architecture) are technically aligned.

Information security standards and control frameworks
Numerous information security standards promote good security practices and define frameworks or systems to structure the analysis and design for managing information security controls. Some of the most well known are outlined below. Risk-aware organizations may choose proactively to specify, design, implement, operate and maintain their security controls, usually by assessing the risks and implementing a comprehensive security management framework such as ISO/IEC 27002, the Information Security Forum's Standard of Good Practice for Information Security, and NIST SP 800-53 (more below). Organizations may also opt to demonstrate the adequacy of their information security controls by being independently assessed against certification standards such as ISO/IEC 27001.

International information security standards
ISO/IEC 27001
1. Risk assessment and treatment - analysis of the organization's information security risks
2. Security policy - management direction
3. Organization of information security - governance of information security
4. Asset management - inventory and classification of information assets
5. Human resources security - security aspects for employees joining, moving and leaving an organization
6. Physical and environmental security - protection of the computer facilities

7. Communications and operations management - management of technical security controls in systems and networks
8. Access control - restriction of access rights to networks, systems, applications, functions and data
9. Information systems acquisition, development and maintenance - building security into applications
10. Information security incident management - anticipating and responding appropriately to information security breaches
11. Business continuity management - protecting, maintaining and recovering business-critical processes and systems
12. Compliance - ensuring conformance with information security policies, standards, laws and regulations

U.S. Federal Government information security standards
From NIST Special Publication SP 800-53 revision 3:
1. AC Access Control
2. AT Awareness and Training
3. AU Audit and Accountability
4. CA Certification, Accreditation, and Security Assessments
5. CM Configuration Management
6. CP Contingency Planning
7. IA Identification and Authentication
8. IR Incident Response
9. MA Maintenance
10. MP Media Protection
11. PE Physical and Environmental Protection
12. PL Planning
13. PS Personnel Security
14. RA Risk Assessment
15. SA System and Services Acquisition
16. SC System and Communications Protection
17. SI System and Information Integrity
18. PM Program Management

U.S. Department of Defense information security standards
From DoD Instruction 8500.2 [6] there are 8 Information Assurance (IA) areas, and the controls are referred to as IA controls:
1. DC Security Design & Configuration
2. IA Identification and Authentication
3. EC Enclave and Computing Environment
4. EB Enclave Boundary Defense
5. PE Physical and Environmental
6. PR Personnel
7. CO Continuity
8. VI Vulnerability and Incident Management
DoD assigns the IA control per CIA Triad leg.

References
• Information Security Forum's Standard of Good Practice for Information Security [9]
• NIST SP 800-53 Revision 3 [1]
• DoD Instruction 8500.2 [6]
• FISMApedia Terms [2]
[1] http://csrc.nist.gov/publications/nistpubs/800-53-Rev3/sp800-53-rev3-final_updated-errata_05-01-2010.pdf
[2] http://fismapedia.org/index.php?title=Category:Term

Security management

Security management is a broad field of management related to asset management, physical security and human resource safety functions. It entails the identification of an organization's information assets and the development, documentation and implementation of policies, standards, procedures and guidelines. Management tools such as information classification, risk assessment and risk analysis are used to identify threats, classify assets and rate system vulnerabilities so that effective controls can be implemented.

In network management, security management is the set of functions that protects telecommunications networks and systems from unauthorized access by persons, acts, or influences, and that includes many subfunctions, such as creating, deleting, and controlling security services and mechanisms; distributing security-relevant information; reporting security-relevant events; controlling the distribution of cryptographic keying material; and authorizing subscriber access, rights, and privileges.

Loss prevention
Loss prevention focuses on what your critical assets are and how you are going to protect them. A key component of loss prevention is assessing the potential threats to the successful achievement of the goal. This must include the potential opportunities that further the objective (why take the risk unless there's an upside?). Balance probability and impact, then determine and implement measures to minimize or eliminate those threats.

Risk types
External
• Strategic - competition and customer demand
• Operational - regulation, suppliers, contracts
• Financial - FX, credit
• Hazard - natural disaster, cyber, external criminal act
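The "balance probability and impact" step above is often operationalised as a simple scoring matrix. The scales and threat names below are made-up examples for illustration, not part of any cited standard.

```python
# Toy risk register: score = probability (1-5) x impact (1-5).
# Threat names and numbers are illustrative only.
threats = {
    "external criminal act": (2, 5),
    "supplier failure":      (3, 3),
    "FX exposure":           (4, 2),
}

def risk_score(probability: int, impact: int) -> int:
    """Combine likelihood and consequence into a single ranking value."""
    return probability * impact

# Rank threats so mitigation effort goes to the highest scores first.
ranked = sorted(threats.items(),
                key=lambda kv: risk_score(*kv[1]),
                reverse=True)
for name, (p, i) in ranked:
    print(f"{name}: {risk_score(p, i)}")
```

A low-probability, high-impact hazard can outrank a frequent but minor operational risk, which is exactly the trade-off the "balance probability and impact" guidance is about.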

Internal
• Strategic - R&D
• Operational - systems and processes (HR, payroll)
• Financial - liquidity, cash flow
• Hazard - safety & security, employee & equipment

Risk options
• Accept - some risk is inherent in business
• Transfer - insurance
• Reduce - specific systems and processes
• Eliminate - ideal, but not always realistic

Loss prevention strategy
• Deter
• Detect
• Deny
• Delay
• Detain

Range of tools
These tools are helpful in reducing and eliminating conflicts:
• Simple locks
• Local alarms
• Barriers
• Security lighting
• Sophisticated locks
• Access control cards
• Biometrics
• Monitored alarms
• Personnel
• Perimeter alarms
• Personnel with communications capability
• Coordination with LE
• Armed security

References
• This article incorporates public domain material from websites or documents of the General Services Administration (in support of MIL-STD-188).
• Rattner, Daniel. "Loss Prevention & Risk Management Strategy." Security Management. Lecture. Northeastern University, Boston. 5 Mar. 2010.
• Rattner, Daniel. "Risk Assessments." Security Management. Lecture. Northeastern University, Boston. 15 Mar. 2010.
• Rattner, Daniel. "Internal & External Threats." Security Management. Lecture. Northeastern University, Boston. 8 Apr. 2010.
• "BBC NEWS | In Depth." BBC News. Web. 18 Mar. 2011. <http://news.bbc.co.uk/2/shared/spl/hi/guides/456900/456993/html/>.

External links
• Security Management Jobs Online [1]

References
[1] http://www.securitymanagementjobsonline.com

Security Target

In an IT product certification process according to the Common Criteria (CC), a Security Target (ST) is the central document, typically provided by the developer of the product. An ST defines information assurance security requirements for the given information system product, which is called the Target of Evaluation (TOE). An ST is a complete and rigorous description of a security problem in terms of TOE description, threats, assumptions, security objectives, security functional requirements (SFRs), security assurance requirements (SARs), and rationales. The SARs specify security evaluation criteria, usually in the form of supporting documentation and testing, to substantiate the vendor's claims that the product meets the SFRs. The SARs are typically given as a number from 1 through 7 called the Evaluation Assurance Level (EAL), indicating the depth and rigor of the security evaluation. An ST may refer to one or more Protection Profiles (PPs); in such a case, the ST must fulfill the generic security requirements given in each of these PPs, and may define further requirements. An ST contains some (but not very detailed) implementation-specific information that demonstrates how the product addresses the security requirements.

Security through obscurity

Security through (or by) obscurity is a pejorative referring to a principle in security engineering which attempts to use secrecy (of design, implementation, etc.) to provide security. A system relying on security through obscurity may have theoretical or actual security vulnerabilities, but its owners or designers believe that the flaws are not known, and that attackers are unlikely to find them. The technique stands in contrast with security by design, although many real-world projects include elements of both strategies.

A system may use security through obscurity as a defense in depth measure: while all known security vulnerabilities would be mitigated through other measures, public disclosure of products and versions in use makes them early targets for newly discovered vulnerabilities in those products and versions. An attacker's first step is usually information gathering; this step is delayed by security through obscurity.

Background
There is scant formal literature on the issue of security through obscurity. Books on security engineering will cite Kerckhoffs' doctrine from 1883, if they cite anything at all. For example, in a discussion about secrecy and openness in Nuclear Command and Control:[1]

[T]he benefits of reducing the likelihood of an accidental war were considered to outweigh the possible benefits of secrecy.

This is a modern reincarnation of Kerckhoffs' doctrine, first put forward in the nineteenth century,[2] that the security of a system should depend on its key, not on its design remaining obscure. In the field of legal academia, Peter Swire has written about the trade-off between the notion that "security through obscurity is an illusion" and the military notion that "loose lips sink ships",[3] as well as how competition affects the incentives to disclose.[4]

The principle of security through obscurity was more generally accepted in cryptographic work in the days when essentially all well-informed cryptographers were employed by national intelligence agencies, such as the National Security Agency. Now that cryptographers often work at universities, where researchers publish many or even all of their results and publicly test others' designs, or in private industry, where results are more often controlled by patents and copyrights than by secrecy, the argument has lost some of its former popularity. An example is PGP, released as source code and generally regarded (when properly used) as a military-grade cryptosystem. The wide availability of high-quality cryptography was disturbing to the US government, which seems to have been using a security through obscurity analysis to support its opposition to such work. Indeed, such reasoning is very often used by lawyers and administrators to justify policies which were designed to control or limit high-quality cryptography only to those authorized.

Viewpoints
Arguments for
Perfect or "unbroken" solutions provide security, but absolutes may be difficult to obtain. Although relying solely on security through obscurity is almost always a very poor design decision, keeping secret some of the details of an otherwise well-engineered system may be a reasonable tactic as part of a defense in depth strategy. Here, the goal is simply to reduce the short-run risk of exploitation of a vulnerability in the main components of the system: security through obscurity may (but cannot be guaranteed to) act as a temporary "speed bump" for attackers while a resolution to a known security issue is implemented. There is a general consensus, even among those who argue in favor of security through obscurity, that it should never be used as a primary security measure. It is, at best, a secondary measure, and disclosure of the obscurity should not result in a compromise.

Security through obscurity can also be used to create a risk that can detect or deter potential attackers. The essence of this approach is that raising the time or risk involved deters an attack: lacking the security layout of the target, the attacker is denied the information required to make a solid risk-reward decision about whether to attack in the first place. For example, consider a computer network that appears to exhibit a known vulnerability. The attacker must consider whether to attempt to exploit the vulnerability or not. If the system is set to detect this vulnerability, it will recognize that it is under attack and can respond, either by locking the system down until proper administrators have a chance to react, by monitoring the attack and tracing the assailant, or by disconnecting the attacker. An example of where this would be used is as part of a honeypot. A variant of this defense is to have a double layer of detection of the exploit, both of which are kept secret but one of which is allowed to be "leaked"; the idea is to give the attacker a false sense of confidence that the obscurity has been uncovered and defeated. In neither of these cases is there any actual reliance on obscurity for security; these are perhaps better termed obscurity bait in an active security defense.

Indeed, it can be argued that a sufficiently well-implemented system based on security through obscurity simply becomes another variant on a key-based scheme, with the obscure details of the system acting as the secret key value. A commonly-accepted example of this is the cryptographic salts of a password storage system. Should the system be compromised, both the list of the hashed passwords and the salt may be compromised: in this case, an attacker will be able to run a dictionary attack against the hashed passwords, and their security will rely only on the quality of the user-chosen passwords. The obscured salt may then be considered a secondary-level key: compromising it does not fully compromise the passwords, but does weaken security significantly. Where the salt-generating algorithm is not compromised, given a good enough hashing algorithm and salt, dictionary attacks can become infeasible.
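The salted-password scheme described above can be sketched with standard-library primitives. The iteration count here is illustrative, and a production system would normally use a vetted password-hashing library rather than rolling its own.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a slow, salted hash; the random salt is stored next to it."""
    if salt is None:
        salt = os.urandom(16)  # per-user random salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                                 100_000)  # iteration count is illustrative
    return salt, digest

def verify_password(password, salt, expected):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                                    100_000)
    return hmac.compare_digest(candidate, expected)
```

Because every user gets a fresh salt, identical passwords produce different stored hashes, so one precomputed dictionary cannot be run against the whole password file at once; even if the salts leak, the attacker must rerun the dictionary attack per user.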

Arguments against
In cryptography proper, the argument against security by obscurity dates back to at least Kerckhoffs' principle, put forth in 1883 by Auguste Kerckhoffs. The principle holds that the design of a cryptographic system should not require secrecy and should not cause "inconvenience" if it falls into the hands of the enemy. This principle has been paraphrased in several ways:
• System designers should assume that the entire design of a security system is known to all attackers, with the exception of the cryptographic key.
• The security of a cryptographic system resides entirely in the cryptographic key.
• In the 1940s, Claude Shannon put it bluntly: "the enemy knows the system".

The greater the number of points of compromise in a system, the greater the chance that an attack on one of those points exists, or will be developed. Systems which include secrets of design or operation which are also points of compromise are less secure than equivalent systems without these points of compromise, if the effort required to obtain the vulnerability caused by the secret design or method of operation, plus the effort to exploit this vulnerability, is less than the effort required to obtain the secret key. The security level of the system is then reduced to the effort required to exploit the vulnerability.

For example, if somebody stores a spare key under the doormat in case they are locked out of the house, then they are relying on security through obscurity. The theoretical security vulnerability is that anybody could break into the house by unlocking the door using that spare key. Furthermore, since burglars often know likely hiding places, the house owner will experience greater risk of a burglary by hiding the key in this way, since the effort of finding the key is likely to be less effort to the burglar than breaking in by another means. The owner has in effect added a vulnerability to the system (the fact that the entry key is stored under the doormat), and one which is very easy to guess and exploit.

In the past, several algorithms, or software systems with secret internal details, have seen those internal details become public. Accidental disclosure has happened several times, for instance in the notable case of GSM confidential cipher documentation being contributed to the University of Bradford without the usual confidentiality requirements being imposed:
• The A5/1 cipher for the GSM mobile cellular telephone system became public knowledge partly through reverse engineering.
• Details of the RSADSI (RSA Data Security, Inc.[5]) cryptographic algorithm software were revealed, probably deliberately, through publication of alleged RC4 source on Usenet.
• Cisco router operating system software was accidentally exposed to public access on a corporate network.
• Details of Diebold Election Systems voting machine software were published on a publicly accessible Web site. (See Bev Harris.)

Furthermore, vulnerabilities have been discovered and exploited in software even when the internal details remained secret. Vulnerabilities in various versions of Microsoft Windows, its default web browser Internet Explorer, and its mail applications Outlook and Outlook Express have caused worldwide problems when computer viruses, Trojan horses, or computer worms have taken advantage of them; assorted government agencies (e.g. the US Department of Commerce) have from time to time issued security warnings about the use of that software.

Taken together, these and other examples suggest that it is difficult or ineffective to keep the details of systems and algorithms secret. Linus's law, that many eyes make all bugs shallow, also suggests improved security for algorithms and protocols whose details are published: more people can review the details of such algorithms, identify flaws, and fix the flaws sooner. Proponents of this viewpoint expect that the frequency and severity of security compromises will be less severe for open than for proprietary or secret software. Operators and developers/vendors of systems that rely on security by obscurity may keep the fact that their system is broken secret, to avoid destroying confidence in their service or product and thus its marketability, and this may

amount to fraudulent misrepresentation of the security of their products. Application of the law in this respect has been less than vigorous, in part because vendors almost universally impose terms of use as part of licensing contracts in order to disclaim their apparently existing obligations under statutes and common law that require fitness for use or similar quality standards.

Open source repercussions
Software which is deliberately released as open source once experienced a security debacle 20 years ago (e.g. the Morris worm of 1988 spread through some obscure, though widely visible to those who looked, vulnerabilities). An argument sometimes used against open-source security is that developers tend to be less enthusiastic about performing deep reviews than they are about contributing new code; such work is sometimes seen as less interesting and less appreciated by peers. Combined with the fact that open source is dominated by a culture of volunteering, security sometimes receives less thorough treatment than it might in an environment in which security reviews were part of someone's job description.[6]

On the other hand, research indicates that open-source software does have higher flaw discovery, quicker flaw discovery, and quicker turnaround on patches. For example, one study[7] reports that Linux source code has 0.17 bugs per 1000 lines of code, while non-open-source commercial software generally scores 20-30 bugs per 1000 lines. Instances have been known, from at least the 1960s, of companies delaying release of fixes or patches to suit their corporate priorities rather than customer concerns or risks. Just because there is not an immediate financial incentive to patch a product does not mean there is not any incentive to patch it; moreover, if the patch is that significant to the user, having the source code means the user can technically patch the problem themselves.

Security through minority
A variant of the basic approach is to rely on the properties (including whatever vulnerabilities might be present) of a product which is not widely adopted, thus lowering the prominence of those vulnerabilities (should they become known) against random or even automated attacks. This approach does not currently appear to have a single defining term: "minority"[8] is the most common, with "rarity",[9] "unpopularity",[10] "scarcity", "lack of interest", and others also being used. This variant is most commonly encountered in explanations of why the number of known vulnerability exploits for products with the largest market share tends to be higher than a linear relationship to market share would suggest,[8] but it is also a factor in product choice for some large organisations.

Security through minority may be helpful for organisations which will not be subject to targeted attacks, suggesting the use of a product in the long tail. However, finding a new vulnerability in a market-leading product is likely harder than for obscure products, as the low-hanging-fruit vulnerabilities are more likely to have already turned up, which may suggest these products are better for organisations which expect to receive many targeted attacks. The issue is further confused by the fact that new vulnerabilities in minority products cause all known users of that (perhaps easily identified) product to become targets; with market-leading products, the likelihood of being randomly targeted with a new vulnerability remains greater. The whole issue is closely linked with, and in a sense depends upon, the widely used term security through diversity: the wide range of "long tail" minority products is clearly more diverse than a market leader in any product type, so a random attack will be less likely to succeed. These arguments are hard to prove, especially if an analysis, however diligent and time-consuming, does not turn up much of interest.

Historical notes
There are conflicting stories about the origin of this term. Fans of MIT's ITS say it was coined in opposition to Multics users down the hall, for whom security was far more an issue than on ITS. Within the ITS culture the term referred, self-mockingly, to the poor coverage of the documentation and obscurity of many commands, and to the attitude that by the time a tourist figured out how to make trouble he'd generally got over the urge to make it, because he felt part of the community. One instance of deliberate security through obscurity on ITS has been noted: the command to allow patching the running ITS system (altmode altmode control-R) echoed as ##^D. Typing Alt Alt Control-D set a flag that would prevent patching the system even if the user later got it right.[11]

References
[1] See page 240 of: Anderson, Ross (2001). Security Engineering: A Guide to Building Dependable Distributed Systems (http://www.cl.cam.ac.uk/~rja14/book.html). New York, NY: John Wiley & Sons, Inc. ISBN 0-471-38922-6.
[2] Auguste Kerckhoffs (January 9, 1883). "La Cryptographie Militaire" (http://www.cl.cam.ac.uk/users/fapp2/kerckhoffs/). Journal des Sciences Militaires: 5–38. Retrieved 2008-05-01.
[3] Peter P. Swire (2004). "A Model for When Disclosure Helps Security: What is Different About Computer and Network Security?" (http://ssrn.com/abstract=531782). Journal on Telecommunications and High Technology Law 2. Retrieved 2008-05-01.
[4] Peter P. Swire (January 2006). "A Theory of Disclosure for Security and Competitive Reasons: Open Source, Proprietary Software, and Government Agencies" (http://ssrn.com/abstract=842228). Houston Law Review 42. Retrieved 2008-05-01.
[5] http://www.rsasecurity.com/
[6] See "How Closely Is Open Source Code Examined?" (http://www.schneier.com/crypto-gram-0308.html), Crypto-Gram Newsletter, by Bruce Schneier.
[7] Linux: Fewer Bugs Than Rivals (http://www.wired.com/software/coolapps/news/2004/12/66022)
[8] "Mac Users Finally Waking Up to Security" (http://geeksaresexy.blogspot.com/2006/12/mac-users-finally-waking-up-to.html) by Kiltak, [Geeks are Sexy] Technology News, December 19, 2006.
[9] Crypto-Gram Newsletter, August 15, 2003 (http://www.schneier.com/crypto-gram-0308.html), by Bruce Schneier.
[10] "When 'Security Through Obscurity' Isn't So Bad" (http://slashdot.org/article.pl?sid=01/07/23/2043209&mode=thread&threshold=1) by CmdrTaco, Slashdot, July 23, 2001.
[11] http://catb.org/jargon/html/S/security-through-obscurity.html

External links
• Eric Raymond on Cisco's IOS source code 'release' v Open Source (http://lwn.net/Articles/85958/)
• Computer Security Publications: Information Economics, Shifting Liability and the First Amendment (http://www.eplaw.us/data/ComputerSecurityPublications.pdf) by Ethan M. Preston and John Lofton
• "Security Through Obscurity" Ain't What They Think It Is (http://web.archive.org/web/20070202151534/http://www.bastille-linux.org/jay/obscurity-revisited.html) by Jay Beale
• Secrecy, Security and Obscurity (http://www.schneier.com/crypto-gram-0205.html#1) by Bruce Schneier
• "Security through obsolescence", eWeek, June 6, 2002 (http://www.eweek.com/article2/0,,1536427,00.asp) by Larry Seltzer
• linux.com article (http://www.linux.com/articles/23313) by Robin Miller

Security-evaluated operating system

In computing, security-evaluated operating systems have achieved certification from an external security-auditing organization, such as a B2 or A1 CSC-STD-001-83 "Department of Defense Trusted Computer System Evaluation Criteria" or Common Criteria (CC) certification.

Note that meeting a given set of evaluation criteria does not make a computer operating system "secure": often the evaluated scenarios are extremely limited compared to the normal environments in which operating systems usually run. Certificates do not endorse the "goodness" of an IT product by any organization that recognizes or gives effect to the certificate; a certificate represents the successful completion of a validation that a product met CC requirements for which it was evaluated/tested. Certification applies to a particular configuration of the system running on a certain set of hardware; the certificate is only valid for this specific configuration and does not extend to the same software if any aspect of the installation varies in any way. Moreover, because of the costs that ensue, the field of operating systems which can apply to be evaluated is restricted to those with strong financial backing. In general, major vendors get listed; this does not mean that other solutions, such as open-source solutions, couldn't reach or exceed this level of security under certain circumstances, without even having tried to pass this advocacy evaluation.

Trusted Solaris
Trusted Solaris is a security-focused version of the Solaris Unix operating system. Aimed primarily at the government computing sector, Trusted Solaris adds detailed auditing of all tasks, pluggable authentication, mandatory access control, additional physical authentication devices, and fine-grained access control (FGAC). Versions of Trusted Solaris through version 8 are Common Criteria certified; see [1] and [2]. Trusted Solaris Version 8 received the EAL4 certification level augmented by a number of protection profiles. See [3] for an explanation of the Evaluation Assurance Levels.

BAE Systems' STOP
BAE Systems' STOP version 6.0.E received an EAL4+ in April 2004, and the 6.1.E version received an EAL5+ certification in March 2005. STOP version 6.4 U4 received an EAL5+ certification in July 2008. Versions of STOP prior to STOP 6 have held B3 certifications under TCSEC. While STOP 6 is binary compatible with Linux, it does not derive from the Linux kernel. See [4] for an overview of the system.

Red Hat Enterprise Linux 5
Red Hat Enterprise Linux 5 achieved EAL4+ in June 2007.[5][6]

See the news release at heise.de.[7]

Microsoft Windows
The following versions of Microsoft Windows have received EAL 4 Augmented ALC_FLR.3 certification:
• Windows 2000 Server, Advanced Server, and Professional, each with Service Pack 3 and Q326886 Hotfix, operating on the x86 platform, were certified as CAPP/EAL 4 Augmented ALC_FLR.3 [8] in October 2002.
• Windows XP Professional and Embedded editions, with Service Pack 2, and Windows Server 2003 Standard and Enterprise editions (32-bit and 64-bit), with Service Pack 1, were all certified [9] in December 2005. (This includes standard configurations as Domain Controller, Server in a Domain, Stand-alone Server, Workstation in a Domain, Stand-alone Workstation.) [10][11]

Mac OS X
Apple's Mac OS X and Mac OS X Server running 10.3.6, both with the Common Criteria Tools Package installed, were certified at CAPP/EAL3 in January 2005.[12] Mac OS X and Mac OS X Server running the latest version, 10.4.6, have not yet been fully evaluated; however, the Common Criteria Tools package is available.

Novell SUSE Linux Enterprise Server
Novell's SUSE Linux Enterprise Server 9 running on an IBM eServer was certified at CAPP/EAL4+ in February 2005.

HP OpenVMS and SEVMS
OpenVMS and SEVMS, CC B1/B3,[14] formerly of Digital Equipment Corporation (DEC), later Compaq, now Hewlett-Packard (HP).

GEMSOS
Gemini Multiprocessing Secure Operating System [13] is a TCSEC A1 system that runs on x86 processor type COTS hardware.[15]

Green Hills INTEGRITY
Green Hills Software's INTEGRITY-178B real-time operating system was certified at Common Criteria EAL6+ in September 2008, [16] running on an embedded PowerPC processor on a Compact PCI card.

External links
• NIST published list of CC Evaluated Products [17]
• Roger R. Schell: GEMSOS presentation [18]

References
[1] http://wwws.sun.com/software/security/securitycert/trustedsolaris.html
[2] http://wwws.sun.com/software/security/securitycert/images/TSol8_7-03CMS.jpg
[3] http://csrc.nist.gov/cc/Documents/CC%20v2.1%20-%20HTML/PART3/PART36.HTM
[4] http://www.baesystems.com/ProductsServices/bae_prod_csit_xts400.html
[5] http://www.niap-ccevs.org/cc-scheme/st/?vid=10165
[6] http://www.niap-ccevs.org/cc-scheme/st/index.cfm/vid/10125
[7] http://www.heise.de/english/newsticker/news/56451
[8] http://niap.nist.gov/cc-scheme/st/ST_VID4002-VR.pdf
[9] http://www.microsoft.com/presspass/press/2005/dec05/12-14CommonCriteriaPR.mspx
[10] http://niap.nist.gov/cc-scheme/vpl/vpl_type.html#operatingsystem
[11] http://www.apple.com/downloads/macosx/apple/commoncriteriatools.html
[12] http://www.apple.com/support/downloads/commoncriteriatoolsfor104.html
[13] http://www.aesec.com/
[14] OpenVMS security presentation (http://www.decus.de/slides/sy2000/Vortraege_2803/1M01.PDF)
[15] http://citeseer.ist.psu.edu/428108.html
[16] http://www.niap-ccevs.org/st/vid10119/
[17] http://niap.nist.gov/cc-scheme/vpl/vpl_type.html
[18] http://amcis2005.isqa.unomaha.edu/Schell-AMCIS-Keynote-050813a.pdf

setuid

setuid and setgid (short for "set user ID upon execution" and "set group ID upon execution", respectively) are Unix access rights flags that allow users to run an executable with the permissions of the executable's owner or group. They are often used to allow users on a computer system to run programs with temporarily elevated privileges in order to perform a specific task. While the assumed user id or group id privileges provided are not always elevated, at a minimum they are specific.

setuid and setgid are needed for tasks that require higher privileges than those which common users have, such as changing their login password. Some of the tasks that require elevated privileges may not immediately be obvious, though, such as the ping command, which must send and listen for control packets on a network interface.

setuid on executables
When a binary executable file has been given the setuid attribute, normal users on the system who have permission to execute this file gain the privileges of the user who owns the file (commonly root) within the created process. When root privileges have been gained within the process, the application can then perform tasks on the system that regular users normally would be restricted from doing. The invoking user will be prohibited by the system from altering the new process in any way, such as by using ptrace, LD_LIBRARY_PATH or sending signals to it (signals from the terminal will still be accepted, however).[1]

While this setuid feature is very useful in many cases, it can pose a security risk if the setuid attribute is assigned to executable programs that are not carefully designed. Users can exploit vulnerabilities in flawed programs to gain permanent elevated privileges, or unintentionally execute a trojan horse program. Due to the increased likelihood of security flaws, many operating systems ignore the setuid attribute when applied to executable shell scripts. The presence of setuid executables also explains why the chroot system call is not available to non-root users on Unix; see limitations of chroot for more details.

The setgid attribute will allow for changing the group-based privileges within a process, like the setuid flag does for user-based privileges.

The setuid and setgid bits are normally set with the command chmod by setting the high-order octal digit to 4 (for setuid) or 2 (for setgid). For example, `chmod 6711` will set the setuid and setgid bits (4+2=6), make the file read/write/executable for the owner (7), and executable by the group (first 1) and others (second 1). All chmod flags are octal, and the least significant bit of the high-order octal digit is used for a special mode known as the sticky bit. Most implementations of the chmod command also support finer-grained, symbolic arguments to set these bits; this is shown in the demonstration below as the `chmod ug+s` command.

Demonstration
The demonstration C program below simply obtains and reveals the real and effective user and group ID currently assigned to the process. Note that it will silently fail to change the effective UID if run on a volume mounted with the `nosuid` option. The commands shown first compile the program as user `bob` and subsequently use `chmod` to establish the setuid and setgid bits. The effectiveness of the `chmod` command is checked with `ls -l`. The `su` command, itself a client of the setuid feature, is then used to assume the id of `alice`, and finally the demonstration program is run, revealing the expected identity change, consistent with the /etc/passwd file.

[bob@foo]$ cat /etc/passwd
alice:x:1007:1007::/home/alice:/bin/bash
bob:x:1008:1008::/home/bob:/bin/bash
[bob@foo]$ cat printid.c
#include <sys/types.h>
#include <stdio.h>
#include <unistd.h>

int main(void) {
    printf("Real      UID = %d\n", getuid());
    printf("Effective UID = %d\n", geteuid());
    printf("Real      GID = %d\n", getgid());
    printf("Effective GID = %d\n", getegid());
    return 0;
}
[bob@foo]$ gcc -Wall printid.c -o printid
[bob@foo]$ chmod ug+s printid
[bob@foo]$ su alice
Password:
[alice@foo]$ ls -l
-rwsr-sr-x 1 bob bob 6944 2007-11-06 10:22 printid
[alice@foo]$ ./printid
Real      UID = 1007
Effective UID = 1008
Real      GID = 1007
Effective GID = 1008
[alice@foo]$

setuid and setgid on directories
The setuid and setgid flags, when set on a directory, have an entirely different meaning. Setting the setgid permission on a directory (chmod g+s) causes new files and subdirectories created within it to inherit its group ID, rather than the primary group ID of the user who created the file (the owner ID is never affected, only the group ID). Newly created subdirectories inherit the setgid bit. Note that setting the setgid permission on a directory only affects the group ID of new files and subdirectories created after the setgid bit is set; it is not applied to existing entities. Setting the setgid bit on existing subdirectories must be done manually, with a command such as the following:

[root@foo]# find /path/to/directory -type d -exec chmod g+s {} \;

The setuid permission set on a directory is ignored on UNIX and Linux systems.[2] FreeBSD can be configured to interpret it analogously to setgid, namely, to force all files and sub-directories to be owned by the top directory owner.[3] In FreeBSD, directories behave as if their setgid bit were always set, regardless of its actual value. As is stated in open(2),[4] "When a new file is created it is given the group of the directory which contains it."

Security
Programs that use this bit must be carefully designed to be immune to buffer overrun attacks. Successful buffer overrun attacks on vulnerable applications allow the attacker to execute arbitrary code under the rights of the process being exploited. In the event a vulnerable process uses the setuid bit to run as root, the code will be executed with root privileges, in effect giving the attacker root access to the system on which the vulnerable process is running.

History
The setuid bit was invented by Dennis Ritchie. His employer, AT&T, applied for a patent in 1972; the patent was granted in 1979 as patent number US 4135240 [5] "Protection of data file contents". The patent was later placed in the public domain.[6]

References
[1] http://www.faqs.org/faqs/unix-faq/faq/part4/section-7.html
[2] Bauer, Mick (2004). "Paranoid Penguin - Linux Filesystem Security, Part II" (http://www.linuxjournal.com/article/7727). Linux Journal.
[3] "chmod manpage on www.freebsd.org" (http://www.freebsd.org/cgi/man.cgi?query=chmod&apropos=0&sektion=0&manpath=FreeBSD+6.1-RELEASE&format=html).
[4] http://www.freebsd.org/cgi/man.cgi?query=open&sektion=2
[5] http://v3.espacenet.com/textdoc?DB=EPODOC&IDX=US4135240
[6] "Summary of key software patents" (http://www.textfiles.com/law/softpat.txt).

External links
• Chen, Hao; Wagner, David; and Dean, Drew. Setuid Demystified (http://www.cs.berkeley.edu/~daw/papers/setuid-usenix02.pdf) (pdf)
• Tsafrir, Dan; Da Silva, Dilma; and Wagner, David. The Murky Issue of Changing Process Identity: Revising Setuid Demystified (http://www.eecs.berkeley.edu/~daw/papers/setuid-login08b.pdf) (pdf)
• Pollock, Wayne. Unix File and Directory Permissions and Modes (http://wpollock.com/AUnix1/FilePermissions.htm)

Shibboleth (computer security)

Within the field of computer security, the word shibboleth is sometimes used with a different meaning than the usual one of verbal, linguistic differentiation. The general concept of a shibboleth is to test something, and based on the response to take a particular course of action. This principle is frequently used in computer security. The most commonly seen usage is logging on to a computer with a password: if the password is entered correctly, the user can log on to the computer; if the password entered is incorrect, access is blocked.

There are various classes of computer security-related shibboleth:
• Class 1: Something known, perhaps a password or another fact.
• Class 2: Something held, a card or a physical tag of some kind.
• Class 3: Something that is, a biometric feature such as a fingerprint or an iris scan.
The three classes are also jokingly referred to as "something you forget," "something you lose," and "something you cease to be."

In general, it is considered more secure to combine various classes of shibboleth rather than just requiring a class 1 shibboleth, as is common today. For example, a high-security system might require an authorized user to log in by entering a password, providing an encoded card and passing a biometric test.

Software forensics

Software forensics is a field concerned with evidence of intention from the examination of software. It is used, for example, to determine whether a problem created by software is intentional or the result of carelessness. The field is an outgrowth of computer virus research and malware intent determination.

System High Mode

System High Mode (also referred to simply as System High) is a mode of using an automated information system (AIS) that pertains to an environment containing restricted data classified in a hierarchical scheme, such as Top Secret, Secret and Unclassified. In System High Mode, all information in the AIS is treated as if it were classified at the highest security level of any data in the AIS, such as Top Secret. Unclassified information can exist in a Secret System High computer, but it must be treated as Secret, and therefore it can never be shared with unclassified destinations (unless downgraded by reliable human review, which is itself risky because of the lack of omniscient humans).

There is no known technology to securely declassify system high information by automated means, because no reliable features of the data can be trusted after having been potentially corrupted by the system high host. In particular, this precludes use of the features of objects (e.g. content or format) produced by or exposed to an AIS operating in system high mode as criteria to securely downgrade those objects. When unreliable means are used (including Cross-Domain Solutions and Bypass Guards), a serious risk of system exploitation via the bypass is introduced. Nevertheless, it has been done where the resulting risk is overlooked or accepted.

System High Mode is distinguished from other modes (such as multilevel security) by its lack of need for the system to contribute to the protection or separation of unequal security classifications. System high pertains to the IA features of the information processed, and specifically not to the strength or trustworthiness of the system.

Sources
• NCSC (1985). "Trusted Computer System Evaluation Criteria" (a.k.a. the TCSEC, "Orange Book", or DOD 5200.28-STD). National Computer Security Center.

System Security Authorization Agreement

A System Security Authorization Agreement (SSAA) is an information security document used in the United States Department of Defense (DoD) to describe and accredit networks and systems. The SSAA is part of the Department of Defense Information Technology Security Certification and Accreditation Process, or DITSCAP. The DoD instruction that describes DITSCAP and provides an outline for the SSAA document is DODI 5200.40, issued in December 1997. The DITSCAP application manual (DoD 8510.1-M), published in July 2000, provides additional details.

DITSCAP was replaced on July 6, 2006 by an Interim Department of Defense (DoD) Certification and Accreditation (C&A) Process Guidance, the DoD Information Assurance Certification and Accreditation Process (DIACAP). The Interim Guidance cites SSAA documentation, but only as a transition to the replacement for DITSCAP. The SSAA is not cited in the DIACAP as a necessary documentation format, but it is still required by some organizations for software release processes.

References
• DoD Information Assurance Portal [1]

[1] http://iase.disa.mil/ditscap/

Trust negotiation

Trust negotiation is an approach to gradually establishing trust between strangers online through the iterative exchange of digital credentials. In contrast to a closed system, where the interacting entities have a preexisting relationship (often proved by typing a username and password), trust negotiation is an open system, and complete strangers can build trust in one another. This is done by disclosing digital credentials.

Digital credentials are the computer analog to paper credentials, such as a driver's license, credit card, or student ID. Rather than proving the credential owner's identity, digital credentials assert that their owner possesses certain attributes. Credentials are digitally signed in order to allow third parties to verify them. A student might receive a credential from his or her university certifying that they are a student at that university. The student could then use that credential, for example, to prove they are a student in order to qualify for a student discount at an online bookstore.

Researchers at Brigham Young University built a software prototype of trust negotiation called TrustBuilder. TrustBuilder uses X.509 certificates as its credentials and runs on top of several common Internet protocols, including HTTP, TLS, and SSH. Researchers from the University of Illinois at Urbana-Champaign are working on the next-generation version of TrustBuilder, titled TrustBuilder2.

External links
• Internet Security Research Lab [1]: A research lab that has done research on many areas of trust negotiation.
• PeerTrust: Automated Trust Negotiation for Peers on the Semantic Web [2]: The PeerTrust project develops and investigates policy languages to describe trust and security requirements on the Semantic Web. The PeerTrust system uses guarded distributed logic programs as the basis for a simple yet expressive policy and trust negotiation language, built upon the rule layer of the Semantic Web layer cake. Such policies will be one component of a run-time system that can negotiate to establish trust on the Semantic Web.

References
[1] http://isrl.byu.edu/
[2] http://www.l3s.de/peertrust/

Trusted computing base

The trusted computing base (TCB) of a computer system is the set of all hardware, firmware, and/or software components that are critical to its security, in the sense that bugs or vulnerabilities occurring inside the TCB might jeopardize the security properties of the entire system. By contrast, parts of a computer system outside the TCB must not be able to misbehave in a way that would leak any more privileges than are granted to them in accordance with the security policy.

The careful design and implementation of a system's trusted computing base is paramount to its overall security. Modern operating systems strive to reduce the size of the TCB so that an exhaustive examination of its code base (by means of manual or computer-assisted software audit or program verification) becomes feasible.

Definition and characterization
The term trusted computing base goes back to Rushby,[1] who defined it as the combination of kernel and trusted processes. The latter refers to processes which are allowed to violate the system's access-control rules. In the classic paper Authentication in Distributed Systems: Theory and Practice,[2] Lampson et al. define the TCB of a computer system as simply "a small amount of software and hardware that security depends on and that we distinguish from a much larger amount that can misbehave without affecting security." Both definitions, while clear and convenient, are neither theoretically exact nor intended to be, as e.g. a network server process under a UNIX-like operating system might fall victim to a security breach and compromise an important part of the system's security, yet is not part of the operating system's TCB. The Orange Book, another classic computer security literature reference, therefore provides[3] a more formal definition of the TCB of a computer system, as the totality of protection mechanisms within it, including hardware, firmware, and software, the combination of which is responsible for enforcing a computer security policy.

In other words, a given piece of hardware or software is a part of the TCB if and only if it has been designed to be a part of the mechanism that provides its security to the computer system. In operating systems, this typically consists of the kernel (or microkernel) and a select set of system utilities (for example, setuid programs and daemons in UNIX systems). In programming languages that have security features designed in, such as Java and E, the TCB is formed of the language runtime and standard library.[4]

The Orange Book further explains that

[t]he ability of a trusted computing base to enforce correctly a unified security policy depends on the correctness of the mechanisms within the trusted computing base, the protection of those mechanisms to ensure their correctness, and the correct input of parameters related to the security policy.



Properties of the TCB
Predicated upon the security policy: “TCB is in the eye of the consultant”
It should be pointed out that, as a consequence of the above Orange Book definition, the boundaries of the TCB depend closely upon the specifics of how the security policy is fleshed out. In the network server example above, even though, say, a Web server that serves a multi-user application is not part of the operating system's TCB, it has the responsibility of performing access control so that the users cannot usurp the identity and privileges of each other. In this sense, it definitely is part of the TCB of the larger computer system that comprises the UNIX server, the users' browsers and the Web application; in other words, breaching into the Web server through e.g. a buffer overflow may not be regarded as a compromise of the operating system proper, but it certainly constitutes a damaging exploit on the Web application. This fundamental relativity of the boundary of the TCB is exemplified by the concept of the target of evaluation (TOE) in the Common Criteria security process: in the course of a Common Criteria security evaluation, one of the first decisions that must be made is the boundary of the audit in terms of the list of system components that will come under scrutiny.

A prerequisite to security
Systems that don't have a trusted computing base as part of their design do not provide security of their own: they are only secure insofar as security is provided to them by external means (e.g. a computer sitting in a locked room without a network connection may be considered secure depending on the policy, regardless of the software it runs). This is because, as David J. Farber et al. put it,[5] [i]n a computer system, the integrity of lower layers is typically treated as axiomatic by higher layers. As far as computer security is concerned, reasoning about the security properties of a computer system requires being able to make sound assumptions about what it can, and more importantly, cannot do; however, barring any reason to believe otherwise, a computer is able to do everything that a general Von Neumann machine can. This obviously includes operations that would be deemed contrary to all but the simplest security policies, such as divulging an email or password that should be kept secret; however, barring special provisions in the architecture of the system, there is no denying that the computer could be programmed to perform these undesirable tasks. These special provisions that aim at preventing certain kinds of actions from being executed, in essence, constitute the trusted computing base. For this reason, the Orange Book (still a reference on the design of secure operating systems as of 2007) characterizes the various security assurance levels that it defines mainly in terms of the structure and security features of the TCB.

Software parts of the TCB need to protect themselves
As outlined by the aforementioned Orange Book, software portions of the trusted computing base need to protect themselves against tampering to be of any effect. This is due to the von Neumann architecture implemented by virtually all modern computers: since machine code can be processed as just another kind of data, it can be read and overwritten by any program barring special memory management provisions that subsequently have to be treated as part of the TCB. Specifically, the trusted computing base must at least prevent its own software from being written to. In many modern CPUs, the protection of the memory that hosts the TCB is achieved by adding in a specialized piece of hardware called the memory management unit (MMU), which is programmable by the operating system to allow and deny access to specific ranges of the system memory to the programs being run. Of course, the operating system is also able to disallow such programming to the other programs. This technique is called supervisor mode; compared to more crude approaches (such as storing the TCB in ROM, or equivalently, using the Harvard architecture), it has the advantage of allowing the security-critical software to be upgraded in the field, although

allowing secure upgrades of the trusted computing base poses bootstrap problems of its own.[6]


Trusted vs. trustworthy
As stated above, trust in the trusted computing base is required to make any progress in ascertaining the security of the computer system. In other words, the trusted computing base is “trusted” first and foremost in the sense that it has to be trusted, and not necessarily that it is trustworthy. Real-world operating systems routinely have security-critical bugs discovered in them, which attests to the practical limits of such trust.[7] The alternative is formal software verification, which uses mathematical proof techniques to show the absence of bugs. Researchers at NICTA and its spinout Open Kernel Labs have recently performed such a formal verification of seL4 [8], a member of the L4 microkernel family, proving functional correctness of the C implementation of the kernel.[9] This makes seL4 the first operating-system kernel which closes the gap between trust and trustworthiness, assuming the mathematical proof and the compiler are free from error.

TCB size
Due to the aforementioned need to apply costly techniques such as formal verification or manual review, the size of the TCB has immediate consequences on the economics of the TCB assurance process, and the trustworthiness of the resulting product (in terms of the mathematical expectation of the number of bugs not found during the verification or review). In order to reduce costs and security risks, the TCB should therefore be kept as small as possible. This is a key argument in the debate opposing microkernel proponents and monolithic kernel aficionados.[10] The Coyotos kernel, for example, will be of the microkernel kind for this reason, despite the possible performance issues that this choice entails.

AIX materializes the trusted computing base as an optional component in its install-time package management system.[11]

References
[1] Rushby, John (1981). "Design and Verification of Secure Systems". 8th ACM Symposium on Operating System Principles. Pacific Grove, California, US. pp. 12–21.
[2] B. Lampson, M. Abadi, M. Burrows and E. Wobber, Authentication in Distributed Systems: Theory and Practice (http://citeseer.ist.psu.edu/lampson92authentication.html), ACM Transactions on Computer Systems 1992, on page 6.
[3] Department of Defense trusted computer system evaluation criteria (http://csrc.nist.gov/secpubs/rainbow/std001.txt), DoD 5200.28-STD, 1985. In the glossary under entry Trusted Computing Base (TCB).
[4] M. Miller, C. Morningstar and B. Frantz, Capability-based Financial Instruments (An Ode to the Granovetter diagram) (http://www.erights.org/elib/capability/ode/ode-linear.html), in paragraph Subjective Aggregation.
[5] W. Arbaugh, D. Farber and J. Smith, A Secure and Reliable Bootstrap Architecture (http://citeseer.ist.psu.edu/article/arbaugh97secure.html), 1997, also known as the “aegis papers”.
[6] A Secure and Reliable Bootstrap Architecture (http://citeseer.ist.psu.edu/article/arbaugh97secure.html), op. cit.
[7] Bruce Schneier, The security patch treadmill (http://www.schneier.com/crypto-gram-0103.html#1) (2001)
[8] http://ertos.org/research/l4.verified
[9] Klein, Gerwin; Elphinstone, Kevin; Heiser, Gernot; Andronick, June; Cock, David; Derrin, Philip; Elkaduwe, Dhammika; Engelhardt, Kai et al. (October 2009). "seL4: Formal verification of an OS kernel" (http://www.sigops.org/sosp/sosp09/papers/klein-sosp09.pdf). 22nd ACM Symposium on Operating System Principles. Big Sky, Montana, US. pp. 207–220.
[10] Andrew S. Tanenbaum, Tanenbaum-Torvalds debate, part II (http://www.cs.vu.nl/~ast/reliable-os/) (12 May 2006)
[11] AIX 4.3 Elements of Security (http://www.redbooks.ibm.com/pubs/pdfs/redbooks/sg245962.pdf), August 2000, chapter 6.



XACML

XACML stands for eXtensible Access Control Markup Language. It is a declarative access control policy language implemented in XML, together with a processing model describing how to interpret the policies. The latest version, 2.0, was ratified by the OASIS standards organization on February 1, 2005. The planned version 3.0 will add generic attribute categories for the evaluation context and a policy delegation profile (administrative policy profile). The first committee draft of XACML 3.0 was released April 16, 2009.[1] The first version of the administrative policy profile working draft was publicly released on April 1, 2009.[2]

Non-normative terminology (following RFC 2904 [3], except for PAP)
• PAP: Policy Administration Point - point which manages policies
• PDP: Policy Decision Point - point which evaluates and issues authorization decisions
• PEP: Policy Enforcement Point - point which intercepts a user's access request to a resource and enforces the PDP's decision
• PIP: Policy Information Point - point which can provide external information to a PDP, such as LDAP attribute information

Policy Elements
XACML is structured into three levels of elements:
• PolicySet,
• Policy, and
• Rule.
All three elements can contain Target elements. PolicySet elements can contain PolicySet and Policy elements; a Policy can contain Rules. See http://docs.oasis-open.org/xacml/3.0/xacml-3.0-core-spec-cs-01-en.pdf for more details.

Policies are defined with a collection of Rules. Both Rules and Requests use Subjects, Resources and Actions.
• A Subject element is the entity requesting access. A Subject has one or more Attributes.
• The Resource element is a data, service or system component. A Resource has a single Attribute.
• An Action element defines the type of access requested on the Resource. Actions have one or more Attributes.
• An Environment element can optionally provide additional information.

Within XACML, a concept called obligations can be used. An obligation is a directive from the Policy Decision Point (PDP) to the Policy Enforcement Point (PEP) on what must be carried out before or after an access is granted. If the PEP is unable to comply with the directive, the granted access may or must not be realized. The augmentation of obligations eliminates a gap between formal requirements and policy enforcement. An example of an obligation could look like this:

Access control rule:
Allow access to resource MedicalJournal with attribute patientID=x
if Subject match DesignatedDoctorOfPatient
and action is read
with obligation

on Permit: doLog_Inform(patientID, Subject, time)
on Deny:   doLog_UnauthorizedLogin(patientID, Subject, time)

XACML's obligations can be an effective way to meet formal requirements (non-repudiation, for example) that can be hard to implement as access control rules. Furthermore, any formal requirements will be part of the access control policy as obligations and not as separate functions, which makes policies consistent and makes centralization of the IT environment easier to achieve.


New in XACML 3.0
The implementation of delegation is new in XACML 3.0. The delegation mechanism is used to support decentralized administration of access policies. It allows an authority (the delegator) to delegate all or part of its own authority, or someone else's authority, to another user (the delegate) without any need to modify the root policy. This is because, in this delegation model, the delegation rights are separated from the access rights; these are instead referred to as administrative control policies.[4] Access control and administrative policies work together, as in the following scenario:

The many services of a partnership of companies are protected by an access control system. The system implements the following central rules to protect its resources and to allow delegation:

Access control rule #1: Allow access to resource with attribute WebService if subject is Employee and action is read or write.

Administration control rule: Allow delegation of access control rule #1 to subjects with attribute Consultant. Conditions: the delegation must expire within 6 months, and the resource must not have attribute StrictlyInternal.

(Attributes can be fetched from an external source, e.g. an LDAP catalog.) When a consultant enters the corporation, a delegation can be issued locally by the consultant's supervisor, granting the consultant access to systems directly. The delegator (the supervisor in this scenario) may only have the right to delegate a limited set of access rights to consultants.



Other features
Other new features of XACML 3.0 are listed at http://www.webfarmr.eu/2010/07/enhancements-and-new-features-in-xacml-3-axiomatics/

References
[1] "XACML 3.0 - work in progress" (http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=xacml#CURRENT). OASIS (oasis-open.org). Retrieved 9 September 2009.
[2] "XACML v3.0 Hierarchical Resource Profile Version 1.0" (http://xml.coverpages.org/XACML-v30-HierarchicalResourceProfile-WD7.pdf). OASIS (oasis-open.org). Retrieved 1 April 2009.
[3] http://tools.ietf.org/html/rfc2904
[4] XACML v3.0 Administrative Policy Version 1.0 (http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=xacml)

External links
• Axiomatics - XACML Policy Server (http://www.axiomatics.com/)
• BiTKOO - XACML 2.0 and 3.0 for Cloud, SharePoint and DB-Access (http://www.bitkoo.com/)
• Enterprise XACML (http://code.google.com/p/enterprise-java-xacml/)
• eXtensible Access Control Markup Language (http://xml.coverpages.org/xacml.html)
• HERAS-AF: An Open Source Project providing an XACML-based Security Framework (http://www.herasaf.org/)
• JBoss XACML - Open Source LGPL licensed library (https://www.jboss.org/community/docs/DOC-10840)
• Jericho Systems Corp. - XACML-compliant EnterSpace Decisioning Service (http://www.jerichosystems.com/)
• NextLabs - XACML Information Risk Management (http://www.nextlabs.com/html/?q=products)
• OASIS XACML committee website (http://www.oasis-open.org/committees/xacml/)
• OASIS declaration of issues with two software patents of IBM (http://www.oasis-open.org/committees/xacml/ipr.php)
• Oracle Entitlements Server (http://www.oracle.com/us/products/middleware/identity-management/oracle-entitlements-server/index.html)
• XACML 2.0 PDP and PAP implemented as Axis2 web services (http://gryb.info/xacml/doc/XACMLightReference.html)
• XACML authorization for many PAM enabled applications (http://pamxacml.sourceforge.net/)
• SICSACML XACML (http://www.sics.se/node/2465)
• SunXACML (http://sunxacml.sourceforge.net/)



XTS-400

Company / developer: BAE Systems
Working state: Current
Source model: Closed source
Latest stable release: 6.5 / August 2008
Supported platforms: x86
Kernel type: Monolithic kernel
Official website: www.baesystems.com [4]

The XTS-400 is a multi-level secure computer operating system. It is multi-user and multitasking. It works in networked environments and supports Gigabit Ethernet and both IPv4 and IPv6.

The XTS-400 is a combination of Intel x86 hardware and the STOP (Secure Trusted Operating Program) operating system. XTS-400 was developed by BAE Systems, and was originally released as version 6.0 in December 2003. STOP provides "high-assurance" security and was the first general-purpose operating system with a Common Criteria assurance level rating of EAL5 or above.[1] The XTS-400 can host, and be trusted to separate, multiple, concurrent data sets, users, and networks at different sensitivity levels.

The XTS-400 provides both an "untrusted" environment for normal work and a "trusted" environment for administrative work and for privileged applications. The untrusted environment is similar to traditional Unix environments. It provides binary compatibility with Linux applications, running most Linux commands and tools as well as most Linux applications without the need for recompiling. This untrusted environment includes an X Window System GUI, though all windows on a screen must be at the same sensitivity level.

To support the trusted environment and various security features, STOP provides a set of proprietary APIs to applications. In order to develop programs that use these proprietary APIs, a special software development environment (SDE) is needed. The SDE is also needed in order to port some complicated Linux/Unix applications to the XTS-400. A new version of the STOP operating system, STOP 7, has since been introduced, with claims of improved performance and new features such as RBAC.

As a high-assurance, MLS system, XTS-400 can be used in "cross-domain" solutions. A cross-domain solution will typically require a piece of privileged software to be developed which can temporarily circumvent one or more security features in a controlled manner. Such pieces are outside the CC evaluation of the XTS-400, but they can be accredited.

The XTS-400 can be used as a desktop, server, or network gateway. The interactive environment, typical Unix command line tools, and a GUI are present in support of a desktop solution. Since the XTS-400 supports multiple, concurrent network connections at different sensitivity levels, it can be used to replace several single-level desktops connected to several different networks.

In support of server functionality, the XTS-400 can be purchased in a rack-mount configuration, accepts a UPS, allows multiple network connections, accommodates many hard disks on a SCSI subsystem (also saving disk blocks using a "sparse file" implementation in the file system), and provides a trusted backup/save tool. Server software, such as an Internet daemon, can be ported to run on the XTS-400.

A popular application for high-assurance systems like the XTS-400 is to "guard" information flow between two networks of differing security characteristics. Several customer guard solutions are available based on XTS systems.


XTS-400 version 6.0.E completed a Common Criteria (CC) evaluation in March 2004 at EAL4 augmented with ALC_FLR.3 (validation report CCEVS-VR-04-0058). Version 6.0.E also conformed with the protection profiles entitled "Labeled Security Protection Profile" (LSPP) and "Controlled Access Protection Profile" (CAPP), though both profiles are surpassed in both functionality and assurance. XTS-400 version 6.1.E completed evaluation in March 2005 at EAL5 augmented with ALC_FLR.3 and ATE_IND.3 (validation report CCEVS-VR-05-0094), still conforming to the LSPP and CAPP. The EAL5+ evaluation included analysis of covert channels and additional vulnerability analysis and testing by the National Security Agency. XTS-400 version 6.4.U4 completed evaluation in July 2008 at EAL5 augmented with ALC_FLR.3 and ATE_IND.3 (validation report CCEVS-VR-VID10293-2008), also still conforming to the LSPP and CAPP. Like its predecessor, it also included analysis of covert channels and additional vulnerability analysis and testing by the National Security Agency. The official postings for all the XTS-400 evaluations can be seen on the Validated Product List.[3] [4]

The main security feature that sets STOP apart from most operating systems is the mandatory sensitivity policy. Support for a mandatory integrity policy also sets STOP apart from most MLS or trusted systems. While a sensitivity policy deals with preventing unauthorized disclosure, an integrity policy deals with preventing unauthorized deletion or modification (such as the damage that a virus might attempt). Normal (i.e., untrusted) users do not have the "discretion" to change the sensitivity or integrity levels of objects. The Bell-La Padula and Biba formal models are the basis for these policies. Both the sensitivity and integrity policies apply to all users and all objects on the system.
STOP provides 16 hierarchical sensitivity levels, 64 non-hierarchical sensitivity categories, 8 hierarchical integrity levels, and 16 non-hierarchical integrity categories. The mandatory sensitivity policy enforces the U.S. DoD data sensitivity classification model (i.e., Unclassified, Secret, Top Secret), but can be configured for commercial environments.

Other security features include:
• Identification and authentication, which forces users to be uniquely identified and authenticated before using any system services or accessing any information. The user's identification is used for access control decisions and for accountability via the auditing mechanism.
• Discretionary access control (DAC), which appears just as in Unix/Linux, including the presence of access control lists on every object. Set-id functionality is supported in a controlled fashion.
• A mandatory "subtype" policy, which allows some of the functionality of trusted systems that support a full "Type Enforcement" or "Domain-Type Enforcement" policy.
• Auditing of all security-relevant events, and trusted tools to allow administrators to detect and analyze potential security violations.
• Trusted path, which allows a user to be sure s/he is interacting directly with the TSF during sensitive operations. This prevents, for example, a Trojan horse from spoofing the login process and stealing a user's password.
• Isolation of the operating system code and data files from the activity of untrusted users and processes. Thus, even if, for example, a user downloads a virus, the virus will be unable to corrupt or affect the operating system.
• Separation of processes from one another (so that one process/user cannot tamper with the internal data and code of another process).
• Reference monitor functionality, so that no access can bypass scrutiny by the operating system.
• Strong separation of administrator, operator, and user roles using the mandatory integrity policy.
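The mandatory checks described above can be illustrated with the label-dominance rule at the heart of the Bell-La Padula and Biba models. The following Python sketch is illustrative only (it is not BAE code, and the label values are invented): a label pairs a hierarchical level with a set of non-hierarchical categories, reads are governed by sensitivity ("no read up"), and writes by integrity.

```python
# Illustrative sketch of mandatory sensitivity (Bell-La Padula) and
# integrity (Biba) checks. Levels and categories are invented examples.

def dominates(level_a, cats_a, level_b, cats_b):
    """Label A dominates label B if its level is >= B's and its category
    set is a superset of B's."""
    return level_a >= level_b and cats_a >= cats_b

def may_read(subject, obj):
    # Sensitivity: the subject's label must dominate the object's
    # ("no read up").
    return dominates(subject["sens"], subject["sens_cats"],
                     obj["sens"], obj["sens_cats"])

def may_write(subject, obj):
    # Integrity (Biba simple integrity): the subject's integrity label
    # must dominate the object's, preventing low-integrity software
    # (e.g. a virus) from modifying high-integrity data.
    return dominates(subject["integ"], subject["integ_cats"],
                     obj["integ"], obj["integ_cats"])

analyst = {"sens": 3, "sens_cats": {"crypto"}, "integ": 2, "integ_cats": set()}
secret_doc = {"sens": 2, "sens_cats": {"crypto"}, "integ": 1, "integ_cats": set()}
top_doc = {"sens": 4, "sens_cats": set(), "integ": 2, "integ_cats": set()}

assert may_read(analyst, secret_doc)    # 3 >= 2 and categories covered
assert may_write(analyst, secret_doc)   # integrity 2 >= 1
assert not may_read(analyst, top_doc)   # no read up
```

In STOP the same idea applies with 16 sensitivity levels, 64 sensitivity categories, 8 integrity levels, and 16 integrity categories, and untrusted users cannot change these labels.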

• Residual information (i.e., object reuse) mechanisms to prevent data scavenging.
• Trusted, evaluated tools for configuring the system, managing security-critical data, and repairing file systems.
• Self-testing of security mechanisms, on demand.
• Exclusion of higher-layer network services from the trusted security functions (TSF), so that the TSF is not susceptible to the publicly known vulnerabilities in those services.

Mandatory policies cannot be disabled. Policy configuration does not require a potentially complicated process of defining large sets of domains and data types (and the attendant access rules). STOP comes in only a single package, so that there is no confusion about whether a particular package has all security features present. Though the security features of the XTS system put some restrictions on the API and require additional, proprietary interfaces, conformance is close enough that most applications will run on the XTS without recompilation.

To maintain the trustworthiness of the system, the XTS-400 must be installed, booted, and configured by trusted personnel. The site must also provide physical protection of the hardware components. The system, including software upgrades, is shipped from BAE Systems in a secure fashion. For customers who want them, XTS-400 supports a Mission Support Cryptographic Unit (MSCU) and Fortezza cards. The MSCU performs "type 1" cryptography and has been separately scrutinized by the U.S. National Security Agency.

Hardware
The CC evaluation forces particular hardware to be used in the XTS-400. Though this places restrictions on the hardware configurations that can be used, several configurations are available, including rack-mount and tower form factors. The XTS-400 uses only standard PC, COTS components, except for an optional MSCU (Mission Support Cryptographic Unit). The hardware is based around an Intel Xeon (P4) CPU at up to 2.8 GHz speeds. Up to 2 GB of main memory is supported. A PCI bus is used for add-in cards such as Gigabit Ethernet. Up to 16 simultaneous Ethernet connections can be made, all of which can be configured at different mandatory security and integrity levels. A SCSI subsystem is used to allow a number of high-performance peripherals to be attached. One SCSI peripheral is a PC Card reader that can support Fortezza. Multiple SCSI host adapters can be included.

History
The XTS-400 has been preceded by several evaluated ancestors, all developed by the same group: SCOMP (Secure Communications Processor), "XTS-200", and "XTS-300". All of the predecessor products were evaluated under TCSEC (a.k.a. Orange Book) standards. Since then the product has evolved from proprietary hardware and interfaces to commodity hardware and Linux interfaces. SCOMP completed evaluation in 1984 at the highest functional and assurance level then in place: "A1". The XTS-200 was designed as a general-purpose operating system supporting a Unix-like application and user environment; XTS-200 completed evaluation in 1992 at the "B3" level. The XTS-300 transitioned from proprietary, mini-computer hardware to COTS, Intel x86 hardware; XTS-300 completed evaluation in 1994 at the B3 level. XTS-300 also went through several ratings maintenance cycles (a.k.a. RAMP), very similar to an "assurance continuity" cycle under CC, ultimately ending up with version 5.2.E being evaluated in 2000. Development of the XTS-400 began in June 2000. The main customer-visible change was specific conformance to the programming API of Linux. Some security features were added or improved as compared to earlier versions of the system, and performance was also improved.

As of July 2006, enhancements continue to be made to the XTS line of products. On September 5, 2006, the United States Patent Office granted BAE Systems Information Technology, LLC, United States Patent #7,103,914, "Trusted computer system".

Architecture
STOP is a monolithic kernel operating system (as is Linux). Though it provides a Linux-like API, STOP was not based on Unix or Linux source. STOP is highly layered, highly modularized, and relatively small and simple. These characteristics have historically facilitated high-assurance evaluations.

STOP is layered into four "rings", and each ring is further subdivided into layers. A "security kernel" occupies the innermost and most privileged ring and enforces all mandatory policies. It provides a virtual process environment, which isolates one process from another. It performs all low-level scheduling, memory management, and interrupt handling. The security kernel also provides I/O services and an IPC message mechanism. The security kernel's data is global to the system.

Trusted system services (TSS) software executes in ring 1. TSS implements file systems, implements TCP/IP, and enforces the discretionary access control policy on file system objects. TSS's data is local to the process within which it is executing.

Operating system services (OSS) executes in ring 2. OSS provides a Linux-like API to applications, as well as providing additional proprietary interfaces for using the security features of the system. OSS implements signals, process groups, and some memory devices. OSS's data is local to the process within which it is executing.

The inner three rings constitute the "kernel". The innermost ring has hardware privilege, and applications run in the outermost ring. Software in an outer ring is prevented from tampering with software in an inner ring. The kernel is part of every process's address space and is needed by both normal and privileged processes.

Software is considered trusted if it performs functions upon which the system depends to enforce the security policy (e.g., the establishment of user authorization). This determination is based on integrity level and privileges. Untrusted software runs at integrity level 3, with all integrity categories, or lower. Some processes require privileges to perform their functions; for example, the Secure Server needs to access the User Access Authentication database, kept at "system high", while establishing a session for a user at a lower sensitivity level.

Potential weaknesses
The XTS-400 can provide a high level of security in many application environments, but trade-offs are made to attain that security. Potential weaknesses for some customers may include:
• Slower performance, due to more rigid internal layering and modularity and to additional security checks.
• Fewer application-level features available out-of-the-box.
• Some source-level changes may be necessary to get complicated applications to run.
• The trusted user interface does not utilize a GUI and has weak command line features.
• Limited hardware choices.
• Not intended for embedded or real-time solutions.
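The ring rule described above (outer, less privileged software cannot tamper with inner, more privileged software) reduces to a simple numeric comparison. The toy Python sketch below is purely illustrative; the ring names match the STOP description, but the check itself is an invented simplification.

```python
# Toy illustration of the ring-protection rule: lower ring numbers are
# more privileged, and modification of data owned by a ring is allowed
# only to software running at that ring or an inner (lower-numbered) one.

RINGS = {0: "security kernel", 1: "TSS", 2: "OSS", 3: "applications"}

def may_modify(caller_ring, owner_ring):
    # Outer rings are less privileged: a caller may modify data only if
    # it is at least as privileged (numerically <=) as the data's owner.
    return caller_ring <= owner_ring

assert may_modify(0, 3)        # the kernel may touch application data
assert not may_modify(3, 0)    # an application cannot tamper with the kernel
assert not may_modify(2, 1)    # OSS cannot modify TSS-owned data
```

Real hardware ring mechanisms also mediate control transfers (calls inward go through defined gates), which this one-line check deliberately omits.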

References
[1] http://www.baesystems.com/ProductsServices/bae_prod_csit_xtsstop7.html
[2] http://www.niap-ccevs.org/cc-scheme/vpl/
[3] http://www.niap-ccevs.org/products/
[4] http://www.commoncriteriaportal.org/products_OS.html#OS

External links
• BAE's XTS-400 product page (http://www.baesystems.com/ProductsServices/bae_prod_csit_xts400.html)
• XTS-400 EAL5+ validated product page (http://www.niap-ccevs.org/cc-scheme/st/vid10293)
• XTS-400 EAL5+ validated product page (http://www.niap-ccevs.org/cc-scheme/st?vid=3012)
• XTS-400 EAL4+ validated product page (http://www.niap-ccevs.org/cc-scheme/st?vid=9503)
• United States Patent 7,103,914: Trusted computer system (http://patft.uspto.gov/netacgi/nph-Parser?TERM1=7103914&u=/netahtml/srchnum.htm&Sect1=PTO1&Sect2=HITOFF&p=1&r=0&l=50&f=S&d=PALL)
• Paper on the need for secure operating systems and mandatory security (http://www.nsa.gov/selinux/papers/inevitability/)
• Monterey Security Architecture (MYSEA) (http://cisr.nps.navy.mil/projects/mysea.html), a Naval Postgraduate School project which utilized the STOP OS
• XMPP & Cross Domain Collaborative Information Environment (CDCIE) Overview (http://www.afcea.gov/net_ready_workshop/Boyd_Fletcher_CDCIE_XMPP_Overview_for_NetReadySensors_Conf.pdf), on multinational information sharing in both single and cross domain environments (utilizes STOP OS)
• Trustifier TCB overview (http://www.sensornet.org/wiki/index.php?title=Trusted_Computing_Base)

201 CMR 17.00
The Massachusetts General Law Chapter 93H and its new regulations 201 CMR 17.00 require that any companies or persons who store or use personal information (PI) about a Massachusetts resident develop a written, regularly audited plan to protect personal information. Both electronic and paper records will need to comply with the new law.[1] Identity theft and fraud are the major concerns at the core of the implementation of 201 CMR 17.00. If a Massachusetts resident's information is leaked or captured, there could be serious consequences for the business that allowed the breach and for the individual whose information was leaked. Therefore, making changes to keep residents' information secure will be required to avoid security breaches and fines.

The regulations go into effect on March 1, 2010. The law was originally supposed to go into effect on January 1, 2009, but then was pushed to May 1, 2009, then to January 1, 2010, and then to March 1, 2010, due to the state of the economy and confusion about the law.

According to the regulations, companies will need a written security plan to safeguard their contacts' and/or employees' personal information. The plan will need to be written to meet industry standards. It will need to be illustrative of policies that demonstrate technical, physical, and administrative protection for residents' information. Companies will have to designate employees to oversee and manage security procedures in the workplace and to continuously monitor and address security hazards, as well as establish disciplinary measures for employees who do not conform to the new regulations. Policies addressing employee access to and transportation of personal information will need to be developed. Limiting the collection of data to the minimum that is needed for the purpose it will be used for is also part of the new regulations.

Further reading
• 201 CMR 17.00 statute [2]
• 201 CMR 17.00 Compliance Check List [3]
• Requirements for Security Breach Notifications under Chapter 93H [4]
• MA 201 Compliance Toolkit [5]
• Compliance with 201 CMR 17:00: Standards for the Protection of Personal Information of Residents of the Commonwealth [6]

References
[1] Why Mass. 201 CMR 17 Deadline Was Extended (http://www.csoonline.com/article/465629/Why_Mass._CMR_Deadline_Was_Extended) on CSO Online
[2] http://www.mass.gov/Eoca/docs/idtheft/201CMR1700reg.pdf
[3] http://www.mass.gov/Eoca/docs/idtheft/compliance_checklist.pdf
[4] http://www.mass.gov/?pageID=ocaterminal&L=3&L0=Home&L1=Business&L2=Identity+Theft&sid=Eoca&b=terminalcontent&f=idtheft_notification_reqs&csid=Eoca
[5] http://www.ma201.com
[6] http://www.mass.gov/?pageID=ocamodulechunk&L=4&L0=Home&L1=Government&L2=Our+Agencies+and+Divisions&L3=Division+of+Professional+Licensure&sid=Eoca&b=terminalcontent&f=dpl_cmr_201_compliance_letter&csid=Eoca

Asset (computer security)
In information security, computer security and network security, an asset is any data, device, or other component of the environment that supports information-related activities. Assets generally include hardware (e.g. servers and switches), software (e.g. mission critical applications and support systems) and confidential information.[1] [2] Assets should be protected from illicit access, use, disclosure, alteration, destruction, and/or theft, resulting in loss to the company.

The CIA Triad
The goal of Information Security is to ensure the Confidentiality, Integrity and Availability of assets from various threats. For example, a hacker might attack a system in order to steal credit card numbers by exploiting a vulnerability. Information Security experts must assess the likely impact of an attack and employ appropriate countermeasures.[3] In this case they might put up a firewall and encrypt their credit card numbers.

Risk analysis
When performing risk analysis it is important to weigh how much to spend protecting each asset against the cost of losing the asset. It is also important to take into account the chance of each loss occurring. Intangible costs must also be factored in. If a hacker makes a copy of all a company's credit card numbers, it does not cost the company anything directly, but the loss in fines and reputation can be enormous.

References
[1] ISO/IEC 13335-1:2004 Information technology -- Security techniques -- Management of information and communications technology security -- Part 1: Concepts and models for information and communications technology security management (http://www.iso.org/iso/catalogue_detail.htm?csnumber=39066)
[2] ENISA Glossary (http://www.enisa.europa.eu/act/rm/cr/risk-management-inventory/glossary#G3)
[3] "An Introduction to Factor Analysis of Information Risk (FAIR)", Risk Management Insight LLC, November 2006 (http://www.riskmanagementinsight.com/media/docs/FAIR_introduction.pdf)
[4] IETF RFC 2828
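The risk-analysis weighing described above is often quantified as annualized loss expectancy (ALE): the single-loss expectancy multiplied by the annual rate of occurrence, compared against the cost of countermeasures. The Python sketch below uses invented figures purely for illustration.

```python
# Illustrative risk-analysis arithmetic: annualized loss expectancy
# (ALE = single-loss expectancy x annual rate of occurrence).
# All figures below are invented for the example.

def ale(single_loss_expectancy, annual_rate_of_occurrence):
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical asset: a database of credit card numbers.
breach_cost = 2_000_000       # fines plus estimated intangible reputation loss
breach_probability = 0.05     # expected breaches per year
countermeasure_cost = 80_000  # annual cost of firewall plus encryption

expected_loss = ale(breach_cost, breach_probability)
assert expected_loss == 100_000
# Spending 80,000/year against a 100,000/year expected loss is justified;
# spending more than the expected loss generally would not be.
assert countermeasure_cost < expected_loss
```

The intangible components (reputation, customer trust) are the hardest part: they must be estimated and folded into the single-loss expectancy, as the text notes.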

External links
• FISMApedia TERM (http://fismapedia.org/index.php?title=Term:Asset)

Attack (computer)
In computer and computer networks an attack is any attempt to destroy, expose, alter, disable, steal or gain unauthorized access to or make unauthorized use of an asset.[1]

Definitions
The IETF (Internet Engineering Task Force) defines attack in RFC 2828 as:[2] an assault on system security that derives from an intelligent threat, i.e., an intelligent act that is a deliberate attempt (especially in the sense of a method or technique) to evade security services and violate the security policy of a system.

US Government CNSS Instruction No. 4009, dated 26 April 2010, by the Committee on National Security Systems of the United States of America,[3] defines an attack as: Any kind of malicious activity that attempts to collect, disrupt, deny, degrade, or destroy information system resources or the information itself.

The increasing dependency of modern society on information and computer networks (in both the private and public sectors, including the military)[4] [5] [6] has led to new terms like cyber attack and Cyberwarfare. CNSS Instruction No. 4009[3] defines a cyber attack as: An attack, via cyberspace, targeting an enterprise's use of cyberspace for the purpose of disrupting, disabling, destroying, or maliciously controlling a computing environment/infrastructure; or destroying the integrity of the data or stealing controlled information.

Phenomenology
An attack can be active or passive.[2] An "active attack" attempts to alter system resources or affect their operation. A "passive attack" attempts to learn or make use of information from the system but does not affect system resources. (E.g., see wiretapping.)

An attack can be perpetrated by an insider or from outside the organization.[2] An "inside attack" is an attack initiated by an entity inside the security perimeter (an "insider"), i.e., an entity that is authorized to access system resources but uses them in a way not approved by those who granted the authorization. An "outside attack" is initiated from outside the perimeter, by an unauthorized or illegitimate user of the system (an "outsider"). In the Internet, potential outside attackers range from amateur pranksters to organized criminals, international terrorists, and hostile governments.

The term "attack" relates to some other basic security terms as shown in the following diagram:[2]

  + - - - - - - - - - - - - +   + - - - - +   + - - - - - - - - - - -+
  | An Attack:              |   |Counter- |   | A System Resource:   |
  | i.e., A Threat Action   |   | measure |   | Target of the Attack |
  | +----------+            |   |         |   | +-----------------+  |
  | | Attacker |<==================||<=========|                  |  |
  | |   i.e.,  |   Passive  |   |         |   | |  Vulnerability  |  |
  | | A Threat |<=================>||<========>|                  |  |
  | |  Agent   |  or Active |   |         |   | +-------|||-------+  |
  | +----------+   Attack   |   |         |   |         VVV          |
  |                         |   |         |   | Threat Consequences  |
  + - - - - - - - - - - - - +   + - - - - +   + - - - - - - - - - - -+

A resource (either physical or logical), called an asset, can have one or more vulnerabilities that can be exploited by a threat agent in a threat action. The result can potentially compromise the Confidentiality, Integrity or Availability properties of resources (potentially different from the vulnerable one) of the organization and other involved parties (customers, suppliers). The so-called CIA triad is the basis of Information Security. The attack can be active, when it attempts to alter system resources or affect their operation, compromising Integrity or Availability. A "passive attack" attempts to learn or make use of information from the system but does not affect system resources, compromising Confidentiality.

A Threat is a potential for violation of security, which exists when there is a circumstance, capability, action, or event that could breach security and cause harm. That is, a threat is a possible danger that might exploit a vulnerability. A threat can be either "intentional" (i.e., intelligent; e.g., an individual cracker or a criminal organization) or "accidental" (e.g., the possibility of a computer malfunctioning, or the possibility of an "act of God" such as an earthquake, a fire, or a tornado).[2] The overall picture represents the risk factors of the risk scenario.

A set of policies concerned with information security management, the Information Security Management Systems (ISMS), has been developed to manage, according to Risk management principles, the countermeasures needed to accomplish a security strategy set up following the rules and regulations applicable in a country.[7] An attack should lead to a security incident, i.e. a security event that involves a security violation: a security-relevant system event in which the system's security policy is disobeyed or otherwise breached.[8] An organization should take steps to detect, classify and manage security incidents.[9] The first logical step is to set up an Incident response plan and eventually a Computer emergency response team. In order to detect attacks, a number of countermeasures can be set up at organizational, procedural and technical levels. Computer emergency response teams, Information technology security audits and Intrusion detection systems are examples of these.

Types of attacks
An attack usually is perpetrated by someone with bad intentions: black hat attacks fall in this category, while others perform penetration testing on an organization's information system to find out if all foreseen controls are in place. Attacks can be classified according to their origin, i.e. whether they are conducted using one or more computers: in the latter case they are called distributed attacks, and botnets are used to conduct them. Other classifications are according to the procedures used or the type of vulnerabilities exploited: attacks can be concentrated on network mechanisms or host features. Some attacks are physical, i.e. theft or damage of computers and other equipment. Others are logical, trying to force

changes in the logic used by computers or network protocols in order to achieve a result unforeseen by the original designer but useful for the attacker. The general term used to describe the category of software used to logically attack computers is malware. The following is a partial short list of attacks:

• Passive
  • Network
    • wiretapping
    • Port scanner
    • Idle scan
• Active
  • Denial-of-service attack
  • Spoofing
  • Network
    • Man in the middle
    • ARP poisoning
    • Ping flood
    • Ping of death
    • Smurf attack
  • Host
    • Buffer overflow
    • Heap overflow
    • Format string attack

Consequence of a potential attack
A whole industry is working to minimize the likelihood and the consequence of an information attack (for a partial list, see Category:Computer security software companies). These companies offer different products and services, aimed at:
• studying all possible attack categories
• publishing books and articles about the subject
• discovering vulnerabilities
• evaluating the risks
• fixing vulnerabilities
• inventing, designing and deploying countermeasures
• setting up contingency plans in order to be ready to respond

Many organizations are trying to classify vulnerabilities and their consequences: the most famous vulnerability database is the Common Vulnerabilities and Exposures. Computer emergency response teams were set up by governments and large organizations to handle computer security incidents.

References
[1] Free download of ISO/IEC 27000:2009 from ISO, via their ITTF web site (http://standards.iso.org/ittf/PubliclyAvailableStandards/c041933_ISO_IEC_27000_2009.zip)
[2] Internet Engineering Task Force RFC 2828 Internet Security Glossary
[3] CNSS Instruction No. 4009 (http://www.cnss.gov/Assets/pdf/cnssi_4009.pdf) dated 26 April 2010
[4] Cortada, James W. (2003-12-04). The Digital Hand: How Computers Changed the Work of American Manufacturing, Transportation, and Retail Industries. USA: Oxford University Press. p. 512. ISBN 0195165888.
[5] Cortada, James W. (2005-11-03). The Digital Hand: Volume II: How Computers Changed the Work of American Financial, Telecommunications, Media, and Entertainment Industries. USA: Oxford University Press. ISBN 978-0195165876.
[6] Cortada, James W. (2007-11-06). The Digital Hand, Vol 3: How Computers Changed the Work of American Public Sector Industries. USA: Oxford University Press. p. 496. ISBN 978-0195165869.
[7] Wright, Joe; Jim Harmening (2009). "15". Computer and Information Security Handbook. Morgan Kaufmann Publications, Elsevier Inc. p. 257. ISBN 978-0-12-374354-1.
[8] ISACA. THE RISK IT FRAMEWORK (registration required) (http://www.isaca.org/Knowledge-Center/Research/Documents/RiskIT-FW-18Nov09-Research.pdf)
[9] Caballero, Albert (2009). "14". Computer and Information Security Handbook. Morgan Kaufmann Publications, Elsevier Inc. p. 225. ISBN 978-0-12-374354-1.

External links
• Term in FISMApedia (http://fismapedia.org/index.php?title=Term:Attack)

Federal Information Security Management Act of 2002
The Federal Information Security Management Act of 2002 ("FISMA", 44 U.S.C. § 3541 [1], et seq.) is a United States federal law enacted in 2002 as Title III of the E-Government Act of 2002 (Pub.L. 107-347 [2], 116 Stat. 2899). The act recognized the importance of information security to the economic and national security interests of the United States.[3] The act requires each federal agency to develop, document, and implement an agency-wide program to provide information security for the information and information systems that support the operations and assets of the agency, including those provided or managed by another agency, contractor, or other source.[3]

FISMA has brought attention within the federal government to cybersecurity and explicitly emphasized a "risk-based policy for cost-effective security."[3] FISMA requires agency program officials, chief information officers, and inspectors general (IGs) to conduct annual reviews of the agency's information security program and report the results to the Office of Management and Budget (OMB). OMB uses this data to assist in its oversight responsibilities and to prepare its annual report to Congress on agency compliance with the act. In FY 2008, federal agencies spent $6.2 billion securing the government's total information technology investment of approximately $68 billion, or about 9.2 percent of the total information technology portfolio.[5]

Purpose of the act

FISMA assigns specific responsibilities to federal agencies, the National Institute of Standards and Technology (NIST) and the Office of Management and Budget (OMB) in order to strengthen information system security. In particular, FISMA requires the head of each agency to implement policies and procedures to cost-effectively reduce information technology security risks to an acceptable level. According to FISMA, the term information security means protecting information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction in order to provide integrity, confidentiality and availability.

Implementation of FISMA

In accordance with FISMA, NIST is responsible for developing standards, guidelines, and associated methods and techniques for providing adequate information security for all agency operations and assets, excluding national security systems. NIST works closely with federal agencies to improve their understanding and implementation of FISMA to protect their information and information systems, and publishes standards and guidelines which provide the foundation for strong information security programs at agencies. NIST performs its statutory responsibilities through the Computer Security Division of the Information Technology Laboratory.[6] NIST develops standards, metrics, tests, and validation programs to promote, measure, and validate the security in information systems and services. NIST hosts the following:

• FISMA implementation project[7]
• Information Security Automation Program (ISAP)
• National Vulnerability Database (NVD) – the U.S. government content repository for ISAP and SCAP. NVD is the U.S. government repository of standards-based vulnerability management data. This data enables automation of vulnerability management, security measurement, and compliance (e.g. FISMA).[8]

Compliance framework defined by FISMA and supporting standards

FISMA defines a framework for managing information security that must be followed for all information systems used or operated by a U.S. federal government agency, or by a contractor or other organization on behalf of a federal agency. This framework is further defined by the standards and guidelines developed by NIST.[9]

Inventory of information systems

FISMA requires that agencies have in place an information systems inventory. According to FISMA, the head of each agency shall develop and maintain an inventory of major information systems (including major national security systems) operated by or under the control of such agency.[9] The identification of information systems in an inventory under this subsection shall include an identification of the interfaces between each such system and all other systems or networks, including those not operated by or under the control of the agency.[9] The first step is to determine what constitutes the "information system" in question. There is not a direct mapping of computers to information systems; rather, an information system may be a collection of individual computers put to a common purpose and managed by the same system owner. NIST SP 800-18, Revision 1, Guide for Developing Security Plans for Federal Information Systems[10] provides guidance on determining system boundaries.
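The inventory requirement described above (each major information system recorded together with its interfaces to other systems and networks) can be sketched as a simple data structure. The system names, owner fields, and the "TreasuryNet" network below are hypothetical illustrations, not part of FISMA itself.

```python
from dataclasses import dataclass, field

@dataclass
class InformationSystem:
    # One entry in an agency's inventory of major information systems.
    name: str
    system_owner: str
    interfaces: list = field(default_factory=list)  # connected systems/networks

# Hypothetical inventory: an interface is recorded even when the connected
# network is not operated by or under the control of the agency.
inventory = [
    InformationSystem("Payroll", "CFO Office", ["GeneralLedger", "TreasuryNet"]),
    InformationSystem("GeneralLedger", "CFO Office", ["Payroll"]),
]

externally_connected = [s.name for s in inventory if "TreasuryNet" in s.interfaces]
print(externally_connected)  # ['Payroll']
```

Note that the interfaces are plain name references rather than nested objects, mirroring the act's requirement to identify interfaces to systems the agency does not control and therefore cannot fully describe.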

Categorize information and information systems according to risk level

All information and information systems should be categorized based on the objectives of providing appropriate levels of information security according to a range of risk levels.[9] The first mandatory security standard required by the FISMA legislation, FIPS PUB 199 "Standards for Security Categorization of Federal Information and Information Systems",[11] provides the definitions of security categories. The guidelines are provided by NIST SP 800-60 "Guide for Mapping Types of Information and Information Systems to Security Categories."[12]

The overall FIPS PUB 199 system categorization is the "high water mark" for the impact rating of any of the criteria for information types resident in a system. For example, if one information type in the system has a rating of "Low" for "confidentiality," "integrity," and "availability," and another type has a rating of "Low" for "confidentiality" and "availability" but a rating of "Moderate" for "integrity," then the entire system has a FIPS PUB 199 categorization of "Moderate."

Security controls

Federal information systems must meet the minimum security requirements.[9] These requirements are defined in the second mandatory security standard required by the FISMA legislation, namely FIPS 200 "Minimum Security Requirements for Federal Information and Information Systems".[11] Organizations must meet the minimum security requirements by selecting the appropriate security controls and assurance requirements as described in NIST Special Publication 800-53, "Recommended Security Controls for Federal Information Systems". The process of selecting the appropriate security controls and assurance requirements for organizational information systems to achieve adequate security is a multifaceted, risk-based activity involving management and operational personnel within the organization. Agencies have flexibility in applying the baseline security controls in accordance with the tailoring guidance provided in Special Publication 800-53. This allows agencies to adjust the security controls to more closely fit their mission requirements and operational environments. The controls selected or planned must be documented in the System Security Plan.

Risk assessment

The combination of FIPS 200 and NIST Special Publication 800-53 requires a foundational level of security for all federal information and information systems. The agency's risk assessment validates the security control set and determines if any additional controls are needed to protect agency operations (including mission, functions, image, or reputation), agency assets, individuals, other organizations, or the Nation. The resulting set of security controls establishes a level of "security due diligence" for the federal agency and its contractors.[13] A risk assessment starts by identifying potential threats and vulnerabilities and mapping implemented controls to individual vulnerabilities. One then determines risk by calculating the likelihood and impact that any given vulnerability could be exploited, taking into account existing controls. The culmination of the risk assessment shows the calculated risk for all vulnerabilities and describes whether the risk should be accepted or mitigated. If mitigated by the implementation of a control, one needs to describe what additional Security Controls will be added to the system. NIST also initiated the Information Security Automation Program (ISAP) and Security Content Automation Protocol (SCAP) that support and complement the approach for achieving consistent, cost-effective security control assessments.
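The FIPS PUB 199 "high water mark" rule described above can be sketched in a few lines: the system category is the highest impact level assigned to any security objective of any information type resident on the system.

```python
# Impact levels defined by FIPS PUB 199, in increasing order of severity.
LEVELS = {"Low": 1, "Moderate": 2, "High": 3}

def system_category(info_types):
    """Return the 'high water mark' category across all information types."""
    peak = max(LEVELS[rating] for ratings in info_types for rating in ratings.values())
    return [name for name, value in LEVELS.items() if value == peak][0]

# The example from the text: one type rated Low/Low/Low, another rated
# Low for confidentiality and availability but Moderate for integrity.
types = [
    {"confidentiality": "Low", "integrity": "Low", "availability": "Low"},
    {"confidentiality": "Low", "integrity": "Moderate", "availability": "Low"},
]
print(system_category(types))  # Moderate
```

A single "Moderate" rating on a single objective is enough to raise the whole system's categorization, which is exactly what the high-water-mark rule intends.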

System security plan

Agencies should develop policy on the system security planning process.[9] NIST SP 800-18 introduces the concept of a System Security Plan.[10] System security plans are living documents that require periodic review, modification, and plans of action and milestones for implementing security controls. Procedures should be in place outlining who reviews the plans, keeps the plan current, and follows up on planned security controls.[10]

The System Security Plan is the major input to the security certification and accreditation process for the system. During the security certification and accreditation process, the system security plan is analyzed, updated, and accepted. The certification agent confirms that the security controls described in the system security plan are consistent with the FIPS 199 security category determined for the information system, and that the threat and vulnerability identification and initial risk determination are identified and documented in the system security plan, risk assessment, or equivalent document.[10]

Certification and accreditation

Once the system documentation and risk assessment have been completed, the system's controls must be reviewed and certified to be functioning appropriately. Based on the results of the review, the information system is accredited. The certification and accreditation process is defined in NIST SP 800-37 "Guide for the Security Certification and Accreditation of Federal Information Systems".[14] Security accreditation is the official management decision given by a senior agency official to authorize operation of an information system and to explicitly accept the risk to agency operations, agency assets, or individuals based on the implementation of an agreed-upon set of security controls. Required by OMB Circular A-130, Appendix III, security accreditation provides a form of quality control and challenges managers and technical staffs at all levels to implement the most effective security controls possible in an information system, given mission requirements, technical constraints, operational constraints, and cost/schedule constraints. By accrediting an information system, an agency official accepts responsibility for the security of the system and is fully accountable for any adverse impacts to the agency if a breach of security occurs. Thus, responsibility and accountability are core principles that characterize security accreditation.

It is essential that agency officials have the most complete, accurate, and trustworthy information possible on the security status of their information systems in order to make timely, credible, risk-based decisions on whether to authorize operation of those systems.[14] The information and supporting evidence needed for security accreditation is developed during a detailed security review of an information system, typically referred to as security certification. Security certification is a comprehensive assessment of the management, operational, and technical security controls in an information system, made in support of security accreditation, to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the system. The results of a security certification are used to reassess the risks and update the system security plan, thus providing the factual basis for an authorizing official to render a security accreditation decision.[14]

Continuous monitoring

All accredited systems are required to monitor a selected set of security controls, and the system documentation is updated to reflect changes and modifications to the system. Large changes to the security profile of the system should trigger an updated risk assessment, and controls that are significantly modified may need to be re-certified. Continuous monitoring activities include configuration management and control of information system components, security impact analyses of changes to the system, ongoing assessment of security controls, and status reporting. The organization establishes the selection criteria and subsequently selects a subset of the security controls employed within the information system for assessment. The organization also establishes the schedule for control monitoring to ensure adequate coverage is achieved.

nist.nist.gov/groups/SMA/fisma/index. nist. html) [13] NIST SP 800-53A "Guide for Assessing the Security Controls in Federal Information Systems" [14] NIST SP 800-37 "Guide for the Security Certification and Accreditation of Federal Information Systems [15] http:/ / gcn. html) [12] Catalog of NIST SP-800 publications (http:/ / csrc. Revision 1. FISMA efficiency questioned.org) • (http://www. com/ Articles/ 2007/ 03/ 18/ FISMAs-effectiveness-questioned. com/ Articles/ 2009/ 06/ 15/ Interview-Keith-Rhodes-IT-security.fismacenter.[17] References [1] [2] [3] [4] [5] [6] http:/ / www. gov/ ) [9] The 2002 Federal Information Security Management Act (FISMA) [10] NIST SP 800-18.nist. html http:/ / www. gpo. former GAO CTO says [17] http:/ / gcn. nist. gov/ groups/ SMA/ fisma/ overview. html NIST: FISMA Overview (http:/ / csrc.[16] Status As of June 2010. html) [8] National Vulnerability Database (http:/ / nvd.pdf) Report on 2004 FISMA scores (http://searchsecurity.fismaresources. aspx?sc_lang=en& Page=2 Government Computer News.com/ rsam_fisma.gov/sec-cert/) Full text of FISMA (http://csrc. nist.com/default.289142. html) FY 2005 Report to Congress on Implementation of The Federal Information Security Management Act of 2002 FY 2008 Report to Congress on Implementation of The Federal Information Security Management Act of 2002 NIST Computer Security Division 2008 report (http:/ / csrc. nist.gov/drivers/documents/FISMA-final.rsam. com/ Articles/ 2010/ 06/ 03/ Cybersecurity-congressional-priority.sid14_gci1059656. cornell. 2007.html) NIST FISMA Implementation Project Home Page (http://csrc. gov/ fdsys/ pkg/ PLAW-107publ347/ content-detail. aspx?Page=2 Government Computer News.techtarget. and argued that the compliance and reporting methodology mandated by FISMA measures security planning rather than measuring information security.com) FISMA Guidance External links • • • • NIST SP 800 Series Special Publications Library (http://csrc. 
gov/ publications/ PubsSPs.[15] Past federal chief technology officer Keith Rhodes said that FISMA can and has helped government system security but that implementation is everything.html) • FISMA Resources (http://www. including shifting focus from periodic assessment to real-time assessment and increasing use of automation for reporting. "Guide for Developing Security Plans for Federal Information Systems" [11] Catalog of FIPS publications (http:/ / csrc.html) NIST: FISMA Implementation Project • FISMApedia project (http://www. nist.nist. law.fismapedia.htm) .asp?lnc=resources) • Rsam: Automated Platform for FISMA Compliance and Continuous Monitoring (http://www. edu/ uscode/ 44/ 3541.00. [16] http:/ / gcn. director of research for the SANS Institute – have described FISMA as a well-intentioned but fundamentally flawed tool. gov/ groups/ SMA/ fisma/ overview. nothing is going to get done. a former federal chief information security officer.nist. gov) [7] FISMA implementation (http:/ / csrc. Effective IT security starts with risk analysis. multiple bills in Congress are proposing changes to FISMA. aspx?sc_lang=en Cybersecurity moving up on Congress' to-do list • (http://csrc.com/originalContent/ 0. and if security people view FISMA as just a checklist.gov/publications/nistpubs/index.Federal Information Security Management Act of 2002 140 Critique Security experts Bruce Brody. and Alan Paller. gov/ publications/ PubsFIPS.

Health Insurance Portability and Accountability Act

The Health Insurance Portability and Accountability Act (HIPAA) of 1996 (P.L. 104-191) was enacted by the U.S. Congress in 1996. It was originally sponsored by Sen. Edward Kennedy (D-Mass.) and Sen. Nancy Kassebaum (R-Kan.). Title I of HIPAA protects health insurance coverage for workers and their families when they change or lose their jobs. Title II of HIPAA, known as the Administrative Simplification (AS) provisions, requires the establishment of national standards for electronic health care transactions and national identifiers for providers, health insurance plans, and employers.[1] The Administrative Simplification provisions also address the security and privacy of health data. The standards are meant to improve the efficiency and effectiveness of the nation's health care system by encouraging the widespread use of electronic data interchange in the U.S. health care system.

Title I: Health Care Access, Portability, and Renewability

Title I of HIPAA regulates the availability and breadth of group health plans and certain individual health insurance policies. It amended the Employee Retirement Income Security Act, the Public Health Service Act, and the Internal Revenue Code.

Title I also limits restrictions that a group health plan can place on benefits for preexisting conditions. Group health plans may refuse to provide benefits relating to preexisting conditions for a period of 12 months after enrollment in the plan, or 18 months in the case of late enrollment.[2] However, individuals may reduce this exclusion period if they had group health plan coverage or health insurance prior to enrolling in the plan. Title I allows individuals to reduce the exclusion period by the amount of time that they had "creditable coverage" prior to enrolling in the plan and after any "significant breaks" in coverage.[3] "Creditable coverage" is defined quite broadly and includes nearly all group and individual health plans, Medicare, and Medicaid.[4] A "significant break" in coverage is defined as any 63-day period without any creditable coverage.[5]

Some health care plans are exempted from Title I requirements, such as long-term health plans and limited-scope plans such as dental or vision plans that are offered separately from the general health plan. However, if such benefits are part of the general health plan, then HIPAA still applies to such benefits. For example, if the new plan offers dental benefits, then it must count creditable continuous coverage under the old health plan towards any of its exclusion periods for dental benefits.

An alternate method of calculating creditable continuous coverage is available to the health plan under Title I. That is, five categories of health coverage can be considered separately, including dental and vision coverage. Anything not under those five categories must use the general calculation (e.g., the beneficiary may be counted with 18 months of general coverage but only 6 months of dental coverage, because the beneficiary did not have a general health plan that covered dental until 6 months prior to the application date). Since limited-coverage plans are exempt from HIPAA requirements, the odd case exists in which the applicant to a general group health plan cannot obtain certificates of creditable continuous coverage for independent limited-scope plans, such as dental, to apply towards exclusion periods of the new plan that does include those coverages.

Hidden exclusion periods are not valid under Title I (e.g., "The accident, to be covered, must have occurred while the beneficiary was covered under this exact same health insurance contract"). Such clauses must not be acted upon by the health plan and also must be re-written so that they comply with HIPAA.

To illustrate, suppose someone enrolls in a group health plan on January 1, 2006. This person had previously been insured from January 1, 2004 until February 1, 2005 and from August 1, 2005 until December 31, 2005. To determine how much coverage can be credited against the exclusion period in the new plan, start at the enrollment date and count backwards until you reach a significant break in coverage.
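The Title I counting rule described above (walk backwards from the enrollment date, stop at the first gap longer than 63 days, and credit only coverage after that break) can be sketched as follows. This is a simplified illustration in days; actual Title I crediting involves further rules, so the function below is not a compliance calculator.

```python
from datetime import date

SIGNIFICANT_BREAK_DAYS = 63  # any longer gap is a "significant break"

def creditable_days(enrollment, periods):
    """Sum creditable coverage, newest period first, stopping at the
    first significant break in coverage."""
    credited = 0
    boundary = enrollment
    for start, end in sorted(periods, key=lambda p: p[1], reverse=True):
        if (boundary - end).days > SIGNIFICANT_BREAK_DAYS:
            break  # coverage before a significant break cannot be credited
        credited += (end - start).days
        boundary = start
    return credited

# The text's example: enrolled January 1, 2006; previously insured
# 2004-01-01 to 2005-02-01 and 2005-08-01 to 2005-12-31.
days = creditable_days(date(2006, 1, 1),
                       [(date(2004, 1, 1), date(2005, 2, 1)),
                        (date(2005, 8, 1), date(2005, 12, 31))])
print(days)  # 152 days: only the five months after the break are credited
```

The gap from February 1, 2005 to August 1, 2005 is 181 days, well over the 63-day threshold, so the loop stops there and the thirteen months of earlier coverage contribute nothing.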

So, the five months of coverage between August 1, 2005 and December 31, 2005 clearly counts against the exclusion period. But the period without insurance between February 1, 2005 and August 1, 2005 is greater than 63 days. Thus, this is a significant break in coverage, and any coverage prior to it cannot be deducted from the exclusion period. So, this person could deduct five months from his or her exclusion period, reducing the exclusion period to seven months. Hence, Title I requires that any preexisting condition begin to be covered on August 1, 2006.

Title II: Preventing Health Care Fraud and Abuse; Administrative Simplification; Medical Liability Reform

Title II of HIPAA defines numerous offenses relating to health care and sets civil and criminal penalties for them. It also creates several programs to control fraud and abuse within the health care system.[6][7][8] However, the most significant provisions of Title II are its Administrative Simplification rules. Title II requires the Department of Health and Human Services (HHS) to draft rules aimed at increasing the efficiency of the health care system by creating standards for the use and dissemination of health care information.

These rules apply to "covered entities" as defined by HIPAA and the HHS. Covered entities include health plans, health care clearinghouses (such as billing services and community health information systems), and health care providers that transmit health care data in a way that is regulated by HIPAA.[9][10] Per the requirements of Title II, the HHS has promulgated five rules regarding Administrative Simplification: the Privacy Rule, the Transactions and Code Sets Rule, the Security Rule, the Unique Identifiers Rule, and the Enforcement Rule.

Privacy Rule

The effective compliance date of the Privacy Rule was April 14, 2003, with a one-year extension for certain "small plans". The HIPAA Privacy Rule regulates the use and disclosure of certain information held by "covered entities" (generally, health care clearinghouses, employer-sponsored health plans, health insurers, and medical service providers that engage in certain transactions).[11] It establishes regulations for the use and disclosure of Protected Health Information (PHI). PHI is any information held by a covered entity which concerns health status, provision of health care, or payment for health care that can be linked to an individual.[12] This is interpreted rather broadly and includes any part of an individual's medical record or payment history.

Covered entities must disclose PHI to the individual within 30 days upon request.[13] They also must disclose PHI when required to do so by law, such as reporting suspected child abuse to state child welfare agencies.[14] A covered entity may disclose PHI to facilitate treatment, payment, or health care operations,[15] or if the covered entity has obtained authorization from the individual.[16] However, when a covered entity discloses any PHI, it must make a reasonable effort to disclose only the minimum necessary information required to achieve its purpose.[17]

The Privacy Rule gives individuals the right to request that a covered entity correct any inaccurate PHI.[18] It also requires covered entities to take reasonable steps to ensure the confidentiality of communications with individuals.[19] For example, an individual can ask to be called at his or her work number instead of a home or cell phone number. The Privacy Rule requires covered entities to notify individuals of uses of their PHI.[20] Covered entities must also keep track of disclosures of PHI and document privacy policies and procedures. They must appoint a Privacy Official and a contact person[21] responsible for receiving complaints, and train all members of their workforce in procedures regarding PHI.[22]

An individual who believes that the Privacy Rule is not being upheld can file a complaint with the Department of Health and Human Services Office for Civil Rights (OCR).[23][24] However, according to the Wall Street Journal, the OCR has a long backlog and ignores most complaints. "Complaints of privacy violations have been piling up at

the Department of Health and Human Services. Between April of 2003 and November of 2006, the agency fielded 23,896 complaints related to medical-privacy rules, but it has not yet taken any enforcement actions against hospitals, doctors, insurers or anyone else for rule violations. A spokesman for the agency says it has closed three-quarters of the complaints, typically because it found no violation or after it provided informal guidance to the parties involved."[25]

Transactions and Code Sets Rule

The HIPAA/EDI provision was scheduled to take effect from October 16, 2003, with a one-year extension for certain "small plans". However, due to widespread confusion and difficulty in implementing the rule, CMS granted a one-year extension to all parties. After July 1, 2005, most medical providers that file electronically did have to file their electronic claims using the HIPAA standards in order to be paid. On January 1, 2012 the newest version, 5010, becomes effective, replacing version 4010.[26] This allows for the larger field size of ICD-10-CM as well as other improvements.

Key EDI (X12) transactions used for HIPAA compliance are:

EDI Health Care Claim Transaction Set (837) is used to submit health care claim billing information, encounter information, or both, except for retail pharmacy claims (see EDI Retail Pharmacy Claim Transaction). It can be sent from providers of health care services to payers, either directly or via intermediary billers and claims clearinghouses. It can also be used to transmit health care claims and billing payment information between payers with different payment responsibilities where coordination of benefits is required, or between payers and regulatory agencies to monitor the rendering, billing, and/or payment of health care services within a specific health care/insurance industry segment. For example, a state mental health agency may mandate all healthcare claims; providers and health plans who trade professional (medical) health care claims electronically must use the 837 Health Care Claim: Professional standard to send in claims. As there are many different business applications for the health care claim, there can be slight derivations to cover claims involving unique circumstances, such as for institutions, professionals, chiropractors, and dentists.

EDI Retail Pharmacy Claim Transaction (NCPDP Telecommunications Standard version 5.1) is used to submit retail pharmacy claims to payers by health care professionals who dispense medications, either directly or via intermediary billers and claims clearinghouses. It can also be used to transmit claims for retail pharmacy services and billing payment information between payers with different payment responsibilities where coordination of benefits is required, or between payers and regulatory agencies to monitor the rendering, billing, and/or payment of retail pharmacy services within the pharmacy health care/insurance industry segment.

EDI Health Care Claim Payment/Advice Transaction Set (835) can be used to make a payment, send an Explanation of Benefits (EOB), send an Explanation of Payments (EOP) remittance advice, or make a payment and send an EOP remittance advice, only from a health insurer to a health care provider, either directly or via a financial institution.

EDI Benefit Enrollment and Maintenance Set (834) can be used by employers, unions, government agencies, associations or insurance agencies to enroll members with a payer. The payer is a healthcare organization that pays claims or administers insurance, benefits, or products. Examples of payers include an insurance company, health care professional (HMO), preferred provider organization (PPO), government agency (Medicaid, Medicare etc.), or any organization that may be contracted by one of these former groups.

EDI Payroll Deducted and Other Group Premium Payment for Insurance Products (820) is a transaction set which can be used to make a premium payment for insurance products. It can be used to order a financial institution to make a payment to a payee.
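The transaction sets above share the X12 surface syntax, in which segments are commonly terminated by "~" and elements separated by "*". The sketch below only splits a segment stream into segments and elements; real 837/835 interchanges carry ISA/GS/ST envelopes, composite elements, and configurable delimiters, and the sample segments here are invented for illustration.

```python
def parse_x12(stream, seg_term="~", elem_sep="*"):
    """Split an X12-style stream into segments, each a list of elements."""
    segments = [s.strip() for s in stream.split(seg_term) if s.strip()]
    return [seg.split(elem_sep) for seg in segments]

# Invented 837-style fragment (not a valid, complete interchange).
sample = "ST*837*0001~BHT*0019*00*0123*20100115*1023*CH~SE*3*0001~"
for segment in parse_x12(sample):
    print(segment[0], segment[1:])  # segment ID, then its elements
```

In production, delimiter characters are read from the ISA envelope of each interchange rather than hard-coded, which is why they are parameters here.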

EDI Health Care Eligibility/Benefit Inquiry (270) is used to inquire about the health care benefits and eligibility associated with a subscriber or dependent.

EDI Health Care Eligibility/Benefit Response (271) is used to respond to a request inquiry about the health care benefits and eligibility associated with a subscriber or dependent.

EDI Health Care Claim Status Request (276) can be used by a provider, recipient of health care products or services, or their authorized agent to request the status of a health care claim.

EDI Health Care Claim Status Notification (277) can be used by a health care payer or authorized agent to notify a provider, recipient, or authorized agent regarding the status of a health care claim or encounter, or to request additional information from the provider regarding a health care claim or encounter. The notification is at a summary or service-line detail level, and may be solicited or unsolicited. This transaction set is not intended to replace the Health Care Claim Payment/Advice Transaction Set (835) and therefore is not used for account payment posting.

EDI Health Care Service Review Information (278) can be used to transmit health care service information, such as subscriber, patient, demographic, diagnosis or treatment data, for the purpose of requesting review, certification, notification, or reporting the outcome of a health care services review.

EDI Functional Acknowledgement Transaction Set (997) can be used to define the control structures for a set of acknowledgments indicating the results of the syntactical analysis of the electronically encoded documents. Although it is not specifically named in the HIPAA legislation or Final Rule, it is necessary for X12 transaction set processing. The encoded documents are the transaction sets, which are grouped in functional groups, used in defining transactions for business data interchange. This standard does not cover the semantic meaning of the information encoded in the transaction sets.

Security Rule

The Final Rule on Security Standards was issued on February 20, 2003. It took effect on April 21, 2003, with a compliance date of April 21, 2005 for most covered entities and April 21, 2006 for "small plans". The Security Rule complements the Privacy Rule. While the Privacy Rule pertains to all Protected Health Information (PHI), including paper and electronic, the Security Rule deals specifically with Electronic Protected Health Information (EPHI). It lays out three types of security safeguards required for compliance: administrative, physical, and technical. For each of these types, the Rule identifies various security standards, and for each standard, it names both required and addressable implementation specifications. Required specifications must be adopted and administered as dictated by the Rule. Addressable specifications are more flexible: individual covered entities can evaluate their own situation and determine the best way to implement them. Some privacy advocates have argued that this "flexibility" may provide too much latitude to covered entities.[27] The standards and specifications are as follows:

• Administrative Safeguards – policies and procedures designed to clearly show how the entity will comply with the act
• Covered entities (entities that must comply with HIPAA requirements) must adopt a written set of privacy procedures and designate a privacy officer to be responsible for developing and implementing all required policies and procedures.
• The policies and procedures must reference management oversight and organizational buy-in to compliance with the documented security controls.
• Procedures should clearly identify employees or classes of employees who will have access to electronic protected health information (EPHI). Access to EPHI must be restricted to only those employees who have a need for it to complete their job function.
• The procedures must address access authorization, establishment, modification, and termination.
• Entities must show that an appropriate ongoing training program regarding the handling of PHI is provided to employees performing health plan administrative functions.

• In addition to policies and procedures and access records. Examples of corroboration include: password systems. Authentication consists of corroborating that an entity is who it claims to be. Care must be taken to determine if the vendor further out-sources any data handling functions to other vendors and monitor whether appropriate contracts and controls are in place. double-keying. and procedures of audits. • Physical Safeguards – controlling physical access to protect against inappropriate access to protected data • Controls must govern the introduction and removal of hardware and software from the network. • Documented risk analysis and risk management programs are required. • Information systems housing PHI must be protected from intrusion. and change control procedures.Health Insurance Portability and Accountability Act • Covered entities that out-source some of their business processes to a third party must ensure that their vendors also have a framework in place to comply with HIPAA requirements. some form of encryption must be utilized. frequency. • A contingency plan should be in place for responding to emergencies. • Internal audits play a key role in HIPAA compliance by reviewing operations with the goal of identifying potential security violations. • Procedures should document instructions for addressing and responding to security breaches that are identified either during the audit or the normal course of operations. • Technical Safeguards – controlling access to computer systems and enabling covered entities to protect communications containing PHI transmitted electronically over open networks from being intercepted by anyone other than the intended recipient. and always changing. maintenance records.) 145 . including the use of check sum. configurable. they too must be fully trained on their physical access responsibilities. • Data corroboration. 
Workstations should be removed from high traffic areas and monitor screens should not be in direct view of the public. Policies and procedures should specifically document the scope. (The requirement of risk analysis and risk management implies that the act’s security requirements are a minimum standard and places responsibility on covered entities to take all reasonable precautions necessary to prevent PHI from being used for non-health purposes. When information flows over open networks. Audits should be both routine and event-based. • Policies are required to address proper workstation use. Companies typically gain this assurance through clauses in the contracts stating that the vendor will meet the same data protection requirements that apply to the covered entity. The plan should document data priority and failure analysis. • Covered entities must make documentation of their HIPAA practices available to the government to determine compliance. • Covered entities must also authenticate entities with which they communicate. testing activities. message authentication. existing access controls are considered sufficient and encryption is optional.) • Access to equipment containing health information should be carefully controlled and monitored. information technology documentation should also include a written record of all configuration settings on the components of the network because these components are complex. Covered entities are responsible for backing up their data and having disaster recovery procedures in place. and digital signature may be used to ensure data integrity. If closed systems/networks are utilized. two or three-way handshakes. • Access to hardware and software must be limited to properly authorized individuals. • Required access controls consist of facility security plans. • If the covered entities utilize contractors or agents. (When equipment is retired it must be disposed of properly to ensure that PHI is not compromised. 
• Each covered entity is responsible for ensuring that the data within its systems has not been changed or erased in an unauthorized manner. and token systems. Covered entities must carefully consider the risks of their operations as they implement systems to comply with the act. telephone callback. and visitor sign-in and escorts.
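The data corroboration methods listed under the Technical Safeguards (check sums, message authentication, digital signatures) can be illustrated generically. A minimal sketch using Python's standard hmac and hashlib modules; the key and record contents are invented for illustration, and this is not HIPAA-specific guidance:

```python
import hashlib
import hmac

def tag_record(key: bytes, record: bytes) -> str:
    """Compute an HMAC-SHA256 tag over a record for later integrity checks."""
    return hmac.new(key, record, hashlib.sha256).hexdigest()

def verify_record(key: bytes, record: bytes, tag: str) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(tag_record(key, record), tag)

key = b"shared-secret"                    # illustrative key, not a deployment value
record = b"patient-id=123;dx=I21.9"       # illustrative record
tag = tag_record(key, record)

assert verify_record(key, record, tag)            # unchanged record verifies
assert not verify_record(key, record + b"x", tag) # any alteration is detected
```

Unlike a plain checksum, a keyed tag detects malicious as well as accidental modification, which is why message authentication appears alongside check sums in the list above.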

2011.[30] Both HHS and the Federal Trade Commission (FTC) were required under the HITECH Act to issue regulations associated with the new breach notification requirements. An institution may obtain multiple NPIs for different "subparts" such as a free-standing cancer center or rehab facility. the regulations associated with the new enhancements to HIPAA enforcement took effect. 2009. never re-used. It became effective on March 16.[31] and the FTC rule was published on August 25.Health Insurance Portability and Accountability Act 146 Unique Identifiers Rule (National Provider Identifier) HIPAA covered entities such as providers completing electronic transactions. However. 2006. This includes the extension of newly updated civil and criminal penalties to business associates. The NPI cannot contain any embedded intelligence. healthcare clearinghouses. health insurance companies. HHS issued the Final Rule regarding HIPAA enforcement. enacted as part of the American Recovery and Reinvestment Act of 2009. however. Small health plans must use only the NPI by May 23. business associates. and large health plans. payment and health care operations when an organization is using an electronic health record (EHR). implements new rules for the accounting of disclosures of a patient's health information. and January 1. and other government programs. The NPI is 10 digits (may be alphanumeric). 2009.g. This subtitle extends the complete Privacy and Security Provisions of HIPAA to business associates of covered entities. . all covered entities using electronic communications (e. Effective from May 2006 (May 2007 for small health plans). addresses the privacy and security concerns associated with the electronic transmission of health information.. and so forth) must use a single new NPI. 2013.[29] Another significant change brought about in Subtitle D of the HITECH Act. Medicare. 
must use only the National Provider Identifier (NPI) to identify covered healthcare providers in standard transactions by May 23. These changes won't take effect until January 1. for organizations who had implemented an EHR prior to January 1. 2009. The NPI is unique and national. Enforcement Rule On February 16. The Enforcement Rule sets civil money penalties for violating HIPAA rules and establishes procedures for investigations and hearings for HIPAA violations. It extends the current accounting for disclosure requirements to information that is used to carry out treatment.[28] HITECH Act: Privacy Requirements Subtitle D of the Health Information Technology for Economic and Clinical Health Act (HITECH Act). with the last digit being a checksum. The HHS rule was published in the Federal Register on August 24. the NPI is simply a number that does not itself have any additional meaning. 2008. 2011. 2009 and January 1. 2007. Medicaid. a provider usually can have only one. hospitals. physicians. This imposes new notification requirements on covered entities. the NPI does not replace a provider's DEA number. vendors of personal health records (PHR) and related entities if a breach of unsecured protected health information (PHI) occurs. or tax identification number.[32] The final significant change made in Subtitle D of the HITECH Act. state license number. These changes are also required to be included in any business associate agreements with covered entities. 2009. On April 27. in other words. On November 30. is the new breach notification requirements. for organizations implementing EHRs between January 1. The NPI replaces all other identifiers used by health plans. the Department of Health and Human Services (HHS) issued guidance on how to secure protected health information appropriately. This new requirement also limits the timeframe for the accounting to three years instead of six as it currently stands. and except for institutions. 2009. 
its deterrent effects seem to be negligible with few prosecutions for violations. 2006.
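The Unique Identifiers Rule above notes that the NPI's last digit is a checksum. Per the CMS check-digit specification, it is the Luhn check digit computed over the nine base digits prefixed with the constant 80840. A sketch; the sample base number is the commonly cited test value, not a real provider's NPI:

```python
def npi_check_digit(base9: str) -> int:
    """Luhn check digit for a 9-digit NPI base, computed over the digits
    prefixed with the constant 80840 (the CMS-documented scheme)."""
    payload = "80840" + base9
    total = 0
    for i, ch in enumerate(reversed(payload)):
        d = int(ch)
        if i % 2 == 0:   # rightmost payload digit is doubled first
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10

def is_valid_npi(npi: str) -> bool:
    """A 10-digit NPI is valid when its last digit matches the check digit."""
    return len(npi) == 10 and npi.isdigit() and npi_check_digit(npi[:9]) == int(npi[9])

assert npi_check_digit("123456789") == 3   # yields the example NPI 1234567893
assert is_valid_npi("1234567893")
```

The checksum lets a claims system reject mistyped identifiers before a transaction is submitted, which matters precisely because the NPI carries no embedded intelligence that could otherwise be cross-checked.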

Effects on research and clinical care
The enactment of the Privacy and Security Rules has caused major changes in the way physicians and medical centers operate. The complex legalities and potentially stiff penalties associated with HIPAA, as well as the increase in paperwork and the cost of its implementation, were causes for concern among physicians and medical centers. An August 2006 article in the journal Annals of Internal Medicine detailed some such concerns over the implementation and effects of HIPAA.[33]

Effects on research
HIPAA restrictions on researchers have affected their ability to perform retrospective, chart-based research as well as their ability to prospectively evaluate patients by contacting them for follow-up. A study from the University of Michigan demonstrated that implementation of the HIPAA Privacy rule resulted in a drop from 96% to 34% in the proportion of follow-up surveys completed by study patients being followed after a heart attack.[34] Another study, detailing the effects of HIPAA on recruitment for a study on cancer prevention, demonstrated that HIPAA-mandated changes led to a 73% decrease in patient accrual, a tripling of time spent recruiting patients, and a tripling of mean recruitment costs.[35]

In addition, informed consent forms for research studies now are required to include extensive detail on how the participant's protected health information will be kept private. While such information is important, the addition of a lengthy, legalistic section on privacy may make these already complex documents even less user-friendly for patients who are asked to read and sign them.

These data suggest that the HIPAA privacy rule, as currently implemented, may be having negative impacts on the cost and quality of medical research. Dr. Kim Eagle, professor of internal medicine at the University of Michigan, was quoted in the Annals article as saying, "Privacy is important, but research is also important for improving care. We hope that we will figure this out and do it right."[33]

Effects on clinical care
The complexity of HIPAA, combined with potentially stiff penalties for violators, can lead physicians and medical centers to withhold information from those who may have a right to it. A review of the implementation of the HIPAA Privacy Rule by the U.S. Government Accountability Office found that health care providers were "uncertain about their legal privacy responsibilities and often responded with an overly guarded approach to disclosing information...than necessary to ensure compliance with the Privacy rule".[36] Reports of this uncertainty continue.[33]

Costs of implementation
In the period immediately prior to the enactment of the HIPAA Privacy and Security Acts, medical centers and medical practices were charged with getting "into compliance". With an early emphasis on the potentially severe penalties associated with violation, many practices and centers turned to private, for-profit "HIPAA consultants" who were intimately familiar with the details of the legislation and offered their services to ensure that physicians and medical centers were fully "in compliance". In addition to the costs of developing and revamping systems and practices, the increase in paperwork and staff time necessary to meet the legal requirements of HIPAA may impact the finances of medical centers and practices at a time when insurance companies and Medicare reimbursement is also declining.[33]

HIPAA and drug and alcohol rehabilitation organizations
Special considerations for confidentiality are needed for health care organizations that offer federally-funded drug or alcohol rehabilitation services. Predating HIPAA by over a quarter century are the Comprehensive Alcohol Abuse and Alcoholism Prevention, Treatment and Rehabilitation Act of 1970[37] and language amended by the Drug Abuse Office and Treatment Act of 1972.[38]

Legislative information
• HHS Security Standards, Final Rule: 45 CFR Parts 160, 162, and 164
• HHS Standards for Privacy of Individually Identifiable Health Information, Final Rule: 45 CFR Parts 160 and 164
• Law: Pub. L. 104-191,[39] 110 Stat. 1936
• United States House of Representatives: 104 H.R. 3103, H. Rept. 104-469, Pt. 1, H. Rept. 104-736
• United States Senate: 104 S. 1028, S. 1698, S. Rept. 104-156

References
[1] Centers for Medicare and Medicaid Services (http://www.cms.hhs.gov/HIPAAGenInfo/)
[2] 29 U.S.C. § 1181(a)(2) (http://www.law.cornell.edu/uscode/29/1181.html)
[3] 29 U.S.C. § 1181(a)(3) (http://www.law.cornell.edu/uscode/29/1181.html)
[4] 29 U.S.C. § 1181(c)(1) (http://www.law.cornell.edu/uscode/29/1181.html)
[5] 29 U.S.C. § 1181(c)(2)(A) (http://www.law.cornell.edu/uscode/29/1181.html)
[6] 42 U.S.C. § 1320a-7c (http://www.law.cornell.edu/uscode/42/1320a-7c.html)
[7] 42 U.S.C. § 1395b-5 (http://www.law.cornell.edu/uscode/42/1395b-5.html)
[8] 42 U.S.C. § 1395ddd (http://www.law.cornell.edu/uscode/42/1395ddd.html)
[9] 45 C.F.R. 160.103 (http://frwebgate.access.gpo.gov/cgi-bin/get-cfr.cgi?TYPE=TEXT&YEAR=current&TITLE=45&PART=160&SECTION=103)
[10] Definitions of a Covered Entity (http://hipaa.ohio.gov/tools/CEDefinition.pdf)
[11] Terry, Ken. "Patient Privacy - The New Threats" (http://www.physicianspractice.com/index/fuseaction/articles.details/articleID/1299/page/1.htm). Physicians Practice journal, volume 19, number 3, year 2009. Access date July 2, 2009.
[12] 45 C.F.R. 164.501 (http://frwebgate.access.gpo.gov/cgi-bin/get-cfr.cgi?TYPE=TEXT&YEAR=current&TITLE=45&PART=164&SECTION=501)
[13] 45 C.F.R. 164.524(a)(1)(ii) (http://frwebgate.access.gpo.gov/cgi-bin/get-cfr.cgi?TYPE=TEXT&YEAR=current&TITLE=45&PART=164&SECTION=524)
[14] 45 C.F.R. 164.524(b) (http://frwebgate.access.gpo.gov/cgi-bin/get-cfr.cgi?TYPE=TEXT&YEAR=current&TITLE=45&PART=164&SECTION=524)
[15] 45 C.F.R. 164.512 (http://frwebgate.access.gpo.gov/cgi-bin/get-cfr.cgi?TYPE=TEXT&YEAR=current&TITLE=45&PART=164&SECTION=512)
[16] 45 C.F.R. 164.502(a)(1)(iv) (http://frwebgate.access.gpo.gov/cgi-bin/get-cfr.cgi?TYPE=TEXT&YEAR=current&TITLE=45&PART=164&SECTION=502)
[17] 45 C.F.R. 164.502(b) (http://frwebgate.access.gpo.gov/cgi-bin/get-cfr.cgi?TYPE=TEXT&YEAR=current&TITLE=45&PART=164&SECTION=502)
[18] 45 C.F.R. 164.526 (http://frwebgate.access.gpo.gov/cgi-bin/get-cfr.cgi?TYPE=TEXT&YEAR=current&TITLE=45&PART=164&SECTION=526)
[19] 45 C.F.R. 164.522(b) (http://frwebgate.access.gpo.gov/cgi-bin/get-cfr.cgi?TYPE=TEXT&YEAR=current&TITLE=45&PART=164&SECTION=522)
[20] 45 C.F.R. 164.528 (http://frwebgate.access.gpo.gov/cgi-bin/get-cfr.cgi?TYPE=TEXT&YEAR=current&TITLE=45&PART=164&SECTION=528)
[21] 45 C.F.R. 164.530(a) (http://frwebgate.access.gpo.gov/cgi-bin/get-cfr.cgi?TYPE=TEXT&YEAR=current&TITLE=45&PART=164&SECTION=530)
[22] 45 C.F.R. 164.530(b) (http://frwebgate.access.gpo.gov/cgi-bin/get-cfr.cgi?TYPE=TEXT&YEAR=current&TITLE=45&PART=164&SECTION=530)
[23] "How to File A Health Information Privacy Complaint with the Office for Civil Rights" (http://www.ihs.gov/AdminMngrResources/HIPAA/documents/OCR_HIPAA_ComplaintFormInstructions.pdf)
[24] 45 C.F.R. 160.306 (http://frwebgate.access.gpo.gov/cgi-bin/get-cfr.cgi?TYPE=TEXT&YEAR=current&TITLE=45&PART=160&SECTION=306)
[25] "Spread of records stirs fears of privacy erosion" (http://www.washingtonpost.com/wp-dyn/content/article/2006/06/04/AR2006060400672.html). Rob Stein, The Washington Post.
[26] CMS Transactions and Code Sets Regulations (http://www.cms.hhs.gov/TransactionCodeSetsStands/02_TransactionsandCodeSetsRegulations.asp#TopOfPage)
[27] Tim Wafa (J.D.). "How the Lack of Prescriptive Technical Granularity in HIPAA Has Compromised Patient Privacy" (http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1547425). Northern Illinois University Law Review, Volume 30, Number 3, Summer 2010.
[28] Medical Privacy Law Nets No Fines (http://www.post-gazette.com/pg/06362/749444-114.stm), by Theo Francis, The Wall Street Journal, December 28, 2006.
[29] HHS Strengthens HIPAA Enforcement (http://www.hhs.gov/news/press/2009pres/10/20091030a.html)
[30] Guidance for Securing Protected Health Information (http://www.hhs.gov/ocr/privacy/hipaa/understanding/coveredentities/hitechrfi.pdf)
[31] Health and Human Services Breach Notification Rule (http://edocket.access.gpo.gov/2009/pdf/E9-20169.pdf)
[32] Federal Trade Commission Breach Notification Rule (http://edocket.access.gpo.gov/2009/pdf/E9-20142.pdf)
[33] Wilson J (2006). "Health Insurance Portability and Accountability Act Privacy rule causes ongoing concerns among clinicians and researchers". Ann Intern Med 145 (4): 313–6. PMID 16908928.
[34] Armstrong D, Kline-Rogers E, Jani S, Goldman E, Fang J, Mukherjee D, Nallamothu B, Eagle K (2005). "Potential impact of the HIPAA privacy rule on data collection in a registry of patients with acute coronary syndrome". Arch Intern Med 165 (10): 1125–9. doi:10.1001/archinte.165.10.1125. PMID 15911725.
[35] Wolf M, Bennett C (2006). "Local perspective of the impact of the HIPAA privacy rule on research". Cancer 106 (2): 474–9. doi:10.1002/cncr.21599. PMID 16342254.
[36] "Keeping Patients' Details Private, Even From Kin" (http://www.nytimes.com/2007/07/03/health/policy/03hipaa.html?ex=1341115200&en=19160c75b9633d68&ei=5090&partner=rssuserland&emc=rss). New York Times, July 3, 2007.
[37] Pub. L. 91-616, 42 U.S.C. § 290dd-3 (1976), omitted and moved to 42 U.S.C. § 290dd-2 (2006 through Pub. L. 102-321).
[38] Pub. L. 92-255, 42 U.S.C. § 290ee-3 (1976), omitted and moved to 42 U.S.C. § 290dd-2 (2006 through Pub. L. 102-321).
[39] http://www.gpo.gov/fdsys/pkg/PLAW-104publ191/pdf/PLAW-104publ191.pdf

External links
• California Office of HIPAA Implementation (CalOHI) (http://www.ohi.ca.gov/calohi/)
• "HIPAA" (http://www.cms.hhs.gov/HIPAAGenInfo/), Centers for Medicare and Medicaid Services
• Congressional Research Service (CRS) reports regarding HIPAA (http://digital.library.unt.edu/govdocs/crs/search/?q=hipaa&t=fulltext), University of North Texas Libraries
• Full text of the Health Insurance Portability and Accountability Act (PDF/TXT) (http://www.gpo.gov/fdsys/search/pagedetails.action?granuleId=CRPT-104hrpt736&packageId=CRPT-104hrpt736), U.S. Government Printing Office
• Full text of the Health Insurance Portability and Accountability Act (HTM) (http://www.legalarchiver.org/hipaa.htm), Legal Archiver
• Office for Civil Rights page on HIPAA (http://www.hhs.gov/ocr/hipaa/)
• HIPAA documentation, resources and commentary (http://www.hipaa.com/)

Information Assurance Vulnerability Alert
An Information Assurance Vulnerability Alert (IAVA) is an announcement of a computer application software or operating system vulnerability notification in the form of alerts, bulletins, and technical advisories identified by DoD-CERT, a division of the United States Cyber Command. These selected vulnerabilities are the mandated baseline, or minimum configuration, of all hosts residing on the GIG. USCYBERCOM analyzes each vulnerability and determines if it is necessary or beneficial to the Department of Defense to release it as an IAVA. Implementation of IAVA policy will help ensure that DoD Components take appropriate mitigating actions against vulnerabilities to avoid serious compromises to DoD computer system assets that would potentially degrade mission performance.

Background
On February 15, 1998, the Deputy Secretary of Defense issued a classified memorandum on Information Assurance that instructed the DISA, with the assistance of the Military Departments, to develop an alert system that ensured positive control of information assurance. Current events of the time demonstrated that widely known vulnerabilities existed throughout DoD networks, with the potential to severely degrade mission performance. The Deputy Secretary of Defense issued an Information Assurance Vulnerability Alert (IAVA) policy memorandum on December 30, 1999. The policy memorandum instructs the DISA to develop and maintain an IAVA database system that would ensure a positive control mechanism for system administrators to receive, acknowledge, and comply with system vulnerability alert notifications. The IAVA policy requires the Component Commands, Services, and Agencies to register and report their acknowledgement of and compliance with the IAVA database. According to the memorandum, the compliance data to be reported should include the number of assets affected, the number of assets in compliance, and the number of assets with waivers.

Information Assurance Vulnerability Management Program
The Combatant Commands, Services, Agencies and field activities are required to implement vulnerability notifications in the form of alerts, bulletins, and technical advisories. USSTRATCOM, via its sub-unified command USCYBERCOM, has the authority to direct corrective actions, which may ultimately include disconnection of any enclave, or affected system on the enclave, not in compliance with the IAVA program directives and vulnerability response measures (i.e. communication tasking orders or messages). USSTRATCOM and USCYBERCOM will coordinate with all affected organizations to determine operational impact to the DoD before instituting a disconnection.

According to the policy memorandum, the alert system should:
• Identify a system administrator to be the point of contact for each relevant network system.
• Send alert notifications to each point of contact.
• Require confirmation by each point of contact acknowledging receipt of each alert notification.
• Establish a date for the corrective action to be implemented, and enable DISA to confirm whether the correction has been implemented.
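The compliance reporting the memorandum mandates (number of assets affected, in compliance, and with waivers) can be modeled as a simple record. A minimal sketch; the class and field names are illustrative, not taken from the policy itself:

```python
from dataclasses import dataclass

@dataclass
class IavaComplianceReport:
    """The three counts the policy memorandum requires each Component to report."""
    assets_affected: int
    assets_in_compliance: int
    assets_with_waivers: int

    def open_items(self) -> int:
        # Affected assets that are neither remediated nor formally waived.
        return self.assets_affected - self.assets_in_compliance - self.assets_with_waivers

report = IavaComplianceReport(assets_affected=120,
                              assets_in_compliance=100,
                              assets_with_waivers=15)
assert report.open_items() == 5   # 5 assets still require corrective action
```

Tracking the residual (affected minus compliant minus waived) is what lets an oversight body such as USCYBERCOM decide whether escalation, up to disconnection, is warranted.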

External links
• [1] Office of the Inspector General, DoD Compliance with the Information Assurance Vulnerability Alert Policy, Dec 2001.
• [2] Chairman of the Joint Chiefs of Staff Instruction 6510.01E, August 2007.
• [3] DoD IA Policy Chart

References
[1] http://www.dodig.mil/Audit/reports/fy01/01-013.pdf
[2] http://www.dtic.mil/cjcs_directives/cdata/unlimit/6510_01.pdf
[3] http://iac.dtic.mil/iatac/ia_policychart.html

IT risk
Information technology risk, IT-related risk, or IT risk is a risk related to information technology.[1] This relatively new term reflects an increasing awareness that information security is simply one facet of a multitude of risks that are relevant to IT and the real world processes it supports. Because risk is strictly tied to uncertainty, decision theory should be applied to manage risk as a science, i.e. rationally making choices under uncertainty. Generally speaking, risk is the product of likelihood times impact (Risk = Likelihood * Impact). The measure of an IT risk can be determined as a product of threat, vulnerability and asset values:[2]

Risk = Threat * Vulnerability * Asset

Definitions
Definitions of IT risk come from different but authoritative sources.

ISO
IT risk: the potential that a given threat will exploit vulnerabilities of an asset or group of assets and thereby cause harm to the organization. It is measured in terms of a combination of the probability of an event and its consequence.[3]

Committee on National Security Systems
The Committee on National Security Systems of the United States of America defined risk in different documents:
• From CNSS Instruction No. 4009 dated 26 April 2010,[4] the basic and more technically focused definition: Risk - Possibility that a particular threat will adversely impact an IS by exploiting a particular vulnerability.
• National Security Telecommunications and Information Systems Security Instruction (NSTISSI) No. 1000[5] introduces a probability aspect, quite similar to the NIST SP 800-30 one: Risk - A combination of the likelihood that a threat will occur, the likelihood that a threat occurrence will result in an adverse impact, and the severity of the resulting impact.

The National Information Assurance Training and Education Center defines risk in the IT field as:[6]
1. The loss potential that exists as the result of threat-vulnerability pairs. Reducing either the threat or the vulnerability reduces the risk.
2. The uncertainty of loss expressed in terms of probability of such loss.
3. The probability that a hostile entity will successfully exploit a particular telecommunications or COMSEC system for intelligence purposes; its factors are threat and vulnerability.
4. A combination of the likelihood that a threat shall occur, the likelihood that a threat occurrence shall result in an adverse impact, and the severity of the resulting adverse impact.

NIST
Many NIST publications define risk in an IT context; the FISMApedia[7] term[8] provides a list. Between them:
• According to NIST SP 800-30,[9] risk is a function of the likelihood of a given threat-source's exercising a particular potential vulnerability, and the resulting impact of that adverse event on the organization.
• NIST SP 800-30[9] defines IT-related risk as: The net mission impact considering 1. the probability that a particular threat-source will exercise (accidentally trigger or intentionally exploit) a particular information system vulnerability and 2. the resulting impact if this should occur. IT-related risks arise from legal liability or mission loss due to:
  1. Unauthorized (malicious or accidental) disclosure, modification, or destruction of information
  2. Unintentional errors and omissions
  3. IT disruptions due to natural or man-made disasters
  4. Failure to exercise due care and diligence in the implementation and operation of the IT system.
• From NIST FIPS 200:[10] Risk - The level of impact on organizational operations (including mission, functions, image, or reputation), organizational assets, or individuals resulting from the operation of an information system given the potential impact of a threat and the likelihood of that threat occurring.[11]

ISACA
ISACA published the Risk IT Framework in order to provide an end-to-end, comprehensive view of all risks related to the use of IT. There,[12] IT risk is defined as: The business risk associated with the use, ownership, operation, involvement, influence and adoption of IT within an enterprise. According to Risk IT,[12] IT risk has a broader meaning: it encompasses not only the negative impact on operations and service delivery, which can bring destruction or reduction of the value of the organization, but also the benefit\value enabling risk associated with missing opportunities to use technology to enable or enhance business, or with IT project management in aspects like overspending or late delivery with adverse business impact.

Risk management insight
IT risk is the probable frequency and probable magnitude of future loss.
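The product formula quoted above (Risk = Threat * Vulnerability * Asset) can be made concrete. A minimal sketch, assuming threat and vulnerability are expressed as probabilities and the asset value in monetary terms; these units are a common convention, not part of the source formula:

```python
def risk_product(threat: float, vulnerability: float, asset_value: float) -> float:
    """Risk = Threat * Vulnerability * Asset, with threat and vulnerability
    taken as probabilities in [0, 1] and the asset value in monetary terms."""
    return threat * vulnerability * asset_value

# A threat that materializes in 30% of years, a 50% chance it can exploit
# the weakness, and a $100,000 asset give an annualized exposure of $15,000.
assert risk_product(0.3, 0.5, 100_000) == 15_000.0
```

Reducing any single factor (deterring the threat, patching the vulnerability, or lowering the value at stake) reduces the product, which mirrors the NIATEC observation above that reducing either threat or vulnerability reduces the risk.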

Measuring IT risk
You can't effectively and consistently manage what you can't measure, and you can't measure what you haven't defined.[11] [13] It is useful to introduce related terms in order to properly measure IT risk.

Information security event
An identified occurrence of a system, service or network state indicating a possible breach of information security policy or failure of safeguards, or a previously unknown situation that may be security relevant. Occurrence of a particular set of circumstances.[14]
• The event can be certain or uncertain.
• The event can be a single occurrence or a series of occurrences. (ISO/IEC Guide 73)

Information security incident
Indicated by a single or a series of unwanted information security events that have a significant probability of compromising business operations and threatening information security.[3] An event [G.11] that has been assessed as having an actual or potentially adverse effect on the security or performance of a system.[15]

Impact[16]
The result of an unwanted incident [G.17]. (ISO/IEC PDTR 13335-1)

Consequence[17]
Outcome of an event [G.11].
• There can be more than one consequence from one event.
• Consequences can range from positive to negative.
• Consequences can be expressed qualitatively or quantitatively. (ISO/IEC Guide 73)

The risk R is the product of the likelihood L of a security incident occurring times the impact I that will be incurred by the organization due to the incident, that is:[18]

R = L * I

The likelihood of a security incident occurrence is a function of the likelihood that a threat appears and the likelihood that the threat can successfully exploit the relevant system vulnerabilities. The consequence of the occurrence of a security incident is a function of the likely impact that the incident will have on the organization as a result of the harm the organization's assets will sustain. Harm is related to the value of the assets to the organization; the same asset can have different values to different organizations.

So R can be a function of four factors:
• A = Value of the assets
• T = the likelihood of the threat
• V = the nature of the vulnerability, i.e. the likelihood that it can be exploited (proportional to the potential benefit for the attacker and inversely proportional to the cost of exploitation)
• I = the likely impact, the extent of the harm

If numerical values are used (money for impact and probabilities for the other factors), the risk can be expressed in monetary terms and compared to the cost of countermeasures and the residual risk after applying the security control. It is not always practical to express these values, so in the first step of risk evaluation, risks are graded dimensionless on three- or five-step scales.

OWASP proposes a practical risk measurement guideline[18] based on:
• Estimation of Likelihood as a mean between different factors on a 0 to 9 scale:

• Threat agent factors
  • Skill level: How technically skilled is this group of threat agents? No technical skills (1), some technical skills (3), advanced computer user (4), network and programming skills (6), security penetration skills (9)
  • Motive: How motivated is this group of threat agents to find and exploit this vulnerability? Low or no reward (1), possible reward (4), high reward (9)
  • Opportunity: What resources and opportunity are required for this group of threat agents to find and exploit this vulnerability? Full access or expensive resources required (0), special access or resources required (4), some access or resources required (7), no access or resources required (9)
  • Size: How large is this group of threat agents? Developers (2), system administrators (2), intranet users (4), partners (5), authenticated users (6), anonymous Internet users (9)
• Vulnerability factors: the next set of factors are related to the vulnerability involved. The goal here is to estimate the likelihood of the particular vulnerability involved being discovered and exploited. Assume the threat agent selected above.
  • Ease of discovery: How easy is it for this group of threat agents to discover this vulnerability? Practically impossible (1), difficult (3), easy (7), automated tools available (9)
  • Ease of exploit: How easy is it for this group of threat agents to actually exploit this vulnerability? Theoretical (1), difficult (3), easy (5), automated tools available (9)
  • Awareness: How well known is this vulnerability to this group of threat agents? Unknown (1), hidden (4), obvious (6), public knowledge (9)
  • Intrusion detection: How likely is an exploit to be detected? Active detection in application (1), logged and reviewed (3), logged without review (8), not logged (9)
• Estimation of Impact as a mean between different factors on a 0 to 9 scale
  • Technical impact factors: technical impact can be broken down into factors aligned with the traditional security areas of concern: confidentiality, integrity, availability, and accountability. The goal is to estimate the magnitude of the impact on the system if the vulnerability were to be exploited.
    • Loss of confidentiality: How much data could be disclosed and how sensitive is it? Minimal non-sensitive data disclosed (2), minimal critical data disclosed (6), extensive non-sensitive data disclosed (6), extensive critical data disclosed (7), all data disclosed (9)
    • Loss of integrity: How much data could be corrupted and how damaged is it? Minimal slightly corrupt data (1), minimal seriously corrupt data (3), extensive slightly corrupt data (5), extensive seriously corrupt data (7), all data totally corrupt (9)
    • Loss of availability: How much service could be lost and how vital is it? Minimal secondary services interrupted (1), minimal primary services interrupted (5), extensive secondary services interrupted (5), extensive primary services interrupted (7), all services completely lost (9)
    • Loss of accountability: Are the threat agents' actions traceable to an individual? Fully traceable (1), possibly traceable (7), completely anonymous (9)
  • Business impact factors: The business impact stems from the technical impact, but requires a deep understanding of what is important to the company running the application. In general, you should be aiming to support your risks with business impact, particularly if your audience is executive level. The business risk is what justifies investment in fixing security problems.
    • Financial damage: How much financial damage will result from an exploit? Less than the cost to fix the vulnerability (1), minor effect on annual profit (3), significant effect on annual profit (7), bankruptcy (9)
    • Reputation damage: Would an exploit result in reputation damage that would harm the business? Minimal damage (1), loss of major accounts (4), loss of goodwill (5), brand damage (9)
    • Non-compliance: How much exposure does non-compliance introduce? Minor violation (2), clear violation (5), high-profile violation (7)
    • Privacy violation: How much personally identifiable information could be disclosed? One individual (3), hundreds of people (5), thousands of people (7), millions of people (9)

• Privacy violation: How much personally identifiable information could be disclosed? One individual (3), hundreds of people (5), thousands of people (7), millions of people (9)
• Estimation of impact as a mean between the different factors on a 0 to 9 scale
• If the business impact is calculated accurately, use it in the following; otherwise use the technical impact
• Rate likelihood and impact on a LOW, MEDIUM, HIGH scale, where less than 3 is LOW, 3 to less than 6 is MEDIUM, and 6 to 9 is HIGH
• Calculate the risk using the following table:

Overall Risk Severity
                 Likelihood LOW   Likelihood MEDIUM   Likelihood HIGH
Impact HIGH      Medium           High                Critical
Impact MEDIUM    Low              Medium              High
Impact LOW       Note             Low                 Medium

IT risk management

IT risk management can be considered a component of a wider enterprise risk management system.[19] The establishment, maintenance and continuous update of an ISMS provide a strong indication that a company is using a systematic approach for the identification, assessment and management of information security risks.[20] Different methodologies have been proposed to manage IT risks, each of them divided into processes and steps.[21]

Risk Management Elements

The CISA Review Manual 2006 provides the following definition of risk management: "Risk management is the process of identifying vulnerabilities and threats to the information resources used by an organization in achieving business objectives, and deciding what countermeasures, if any, to take in reducing risk to an acceptable level, based on the value of the information resource to the organization."[22]

IT Risk Laws and Regulations

In the following, a brief description of applicable rules organized by source is given.[23]

United Nations

United Nations issued the following:
• UN Guidelines concerning computerized personal data files of 14 December 1990 [24]. Topic: Generic data processing activities using digital processing methods. Scope: Nonbinding guideline to UN nations calling for national regulation in this field.
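The OWASP-style rating procedure described earlier in this section — average each group of 0-to-9 factor scores, bucket the likelihood and impact means into LOW/MEDIUM/HIGH, and look up the overall severity table — can be sketched in Python. This is an illustrative sketch only: the function names are mine, while the thresholds and the severity matrix follow the text.

```python
# Sketch of the OWASP Risk Rating arithmetic described above.
# Thresholds and the severity matrix follow the text; names are illustrative.

def mean_score(scores):
    """Average a group of 0-9 factor scores, as the methodology prescribes."""
    return sum(scores) / len(scores)

def level(score):
    """Map a 0-9 score to the LOW/MEDIUM/HIGH scale from the text."""
    if score < 3:
        return "LOW"
    if score < 6:
        return "MEDIUM"
    return "HIGH"

# Overall Risk Severity table: SEVERITY[impact][likelihood]
SEVERITY = {
    "HIGH":   {"LOW": "Medium", "MEDIUM": "High",   "HIGH": "Critical"},
    "MEDIUM": {"LOW": "Low",    "MEDIUM": "Medium", "HIGH": "High"},
    "LOW":    {"LOW": "Note",   "MEDIUM": "Low",    "HIGH": "Medium"},
}

def overall_severity(likelihood_factors, impact_factors):
    """Combine threat-agent/vulnerability scores with impact scores."""
    likelihood = level(mean_score(likelihood_factors))
    impact = level(mean_score(impact_factors))
    return SEVERITY[impact][likelihood]

# Example: skilled, motivated, anonymous attackers against an easily found,
# poorly logged vulnerability whose exploitation discloses all data.
likelihood_factors = [6, 9, 9, 9,  # skill, motive, opportunity, size
                      7, 5, 6, 8]  # discovery, exploit, awareness, detection
impact_factors = [9, 7, 7, 9]      # confidentiality, integrity, availability, accountability
print(overall_severity(likelihood_factors, impact_factors))  # -> Critical
```

Whether the impact mean is taken over the technical or the business impact factors follows the rule given in the text: use the business impact when it can be calculated accurately, otherwise the technical impact.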

OECD

OECD issued the following:
• Organisation for Economic Co-operation and Development (OECD) Recommendation of the Council concerning guidelines governing the protection of privacy and trans-border flows of personal data [25] (23 September 1980)
• OECD Guidelines for the Security of Information Systems and Networks: Towards a Culture of Security [26] (25 July 2002). Topic: General information security. Scope: Non-binding guidelines to any OECD entities (governments, businesses, other organisations and individual users who develop, own, provide, manage, service, and use information systems and networks). The OECD Guidelines state the basic principles underpinning risk management and information security practices. While no part of the text is binding as such, non-compliance with any of the principles is indicative of a serious breach of RM/RA good practices that can potentially incur liability.

European Union

The European Union issued the following, divided by topic:
• Privacy
• Regulation (EC) No 45/2001 [27] on the protection of individuals with regard to the processing of personal data by the Community institutions and bodies and on the free movement of such data provides an internal regulation which is a practical application of the principles of the Privacy Directive described below. Furthermore, article 35 of the Regulation requires the Community institutions and bodies to take similar precautions with regard to their telecommunications infrastructure, and to properly inform the users of any specific risks of security breaches.
• Directive 95/46/EC [28] on the protection of individuals with regard to the processing of personal data and on the free movement of such data requires that any personal data processing activity undergoes a prior risk analysis in order to determine the privacy implications of the activity, and to determine the appropriate legal, technical and organisational measures to protect such activities; that any such activity is effectively protected by such measures, which must be state of the art, keeping into account the sensitivity and privacy implications of the activity (including when a third party is charged with the processing task); and that the activity is notified to a national data protection authority, including the measures taken to ensure the security of the activity. Furthermore, article 25 and following of the Directive requires Member States to ban the transfer of personal data to non-Member States, unless such countries have provided adequate legal protection for such personal data, or barring certain other exceptions.
• Commission Decision 2001/497/EC of 15 June 2001 [29] on standard contractual clauses for the transfer of personal data to third countries, under Directive 95/46/EC; and Commission Decision 2004/915/EC [30] of 27 December 2004 amending Decision 2001/497/EC as regards the introduction of an alternative set of standard contractual clauses for the transfer of personal data to third countries. Topic: Export of personal data to third countries, specifically non-E.U. countries which have not been recognised as having a data protection level that is adequate (i.e. equivalent to that of the E.U.). Both Commission Decisions provide a set of voluntary model clauses which can be used to export personal data from a data controller (who is subject to E.U. data protection rules) to a data processor outside the E.U. who is not subject to these rules or to a similar set of adequate rules.
• International Safe Harbor Privacy Principles (see below USA and International Safe Harbor Privacy Principles)
• Directive 2002/58/EC [31] of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector
• National Security
• Directive 2006/24/EC [32] of 15 March 2006 on the retention of data generated or processed in connection with the provision of publicly available electronic communications services or of public communications networks and amending Directive 2002/58/EC ('Data Retention Directive'). Topic: Requirement for providers of public electronic telecommunications services to retain certain information for the purposes of the investigation, detection and prosecution of serious crime
• Council Directive 2008/114/EC [33] of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Topic: Identification and protection of European Critical Infrastructures (ECIs). Scope: Applicable to Member States and to the operators of European Critical Infrastructure (defined by the draft directive as 'critical infrastructures the disruption or destruction of which would significantly affect two or more Member States, or a single Member State if the critical infrastructure is located in another Member State. This includes effects resulting from cross-sector dependencies on other types of infrastructure'). Requires Member States to identify critical infrastructures on their territories and to designate them as ECIs. Following this designation, the owners/operators of ECIs are required to create Operator Security Plans (OSPs), which should establish relevant security solutions for their protection.
• Civil and Penal law
• Council Framework Decision 2005/222/JHA [34] of 24 February 2005 on attacks against information systems. Topic: General decision aiming to harmonise national provisions in the field of cyber crime, encompassing material criminal law (i.e. definitions of specific crimes), procedural criminal law (including investigative measures and international cooperation) and liability issues. Scope: Requires Member States to implement the provisions of the Framework Decision in their national legal frameworks. The Framework Decision is relevant to RM/RA because it contains the conditions under which legal liability can be imposed on legal entities for the conduct of certain natural persons of authority within the legal entity. Thus, the Framework Decision requires that the conduct of such figures within an organisation is adequately monitored, also because the Decision states that a legal entity can be held liable for acts of omission in this regard.

Council of Europe

• Council of Europe Convention on Cybercrime, Budapest, 23.XI.2001, European Treaty Series No. 185 [35]. Topic: General treaty aiming to harmonise national provisions in the field of cyber crime, encompassing material criminal law (i.e. definitions of specific crimes), procedural criminal law (including investigative measures and international cooperation), liability issues and data retention. Apart from the definitions of a series of criminal offences in articles 2 to 10, the Convention is relevant to RM/RA because it states the conditions under which legal liability can be imposed on legal entities for the conduct of certain natural persons of authority within the legal entity. Thus, the Convention requires that the conduct of such figures within an organisation is adequately monitored, also because the Convention states that a legal entity can be held liable for acts of omission in this regard.

USA

United States issued the following, divided by topic:
• Civil and Penal law
• Amendments to the Federal Rules of Civil Procedure with regard to electronic discovery [36], which entered into force on 1 December 2006. Topic: U.S. Federal rules with regard to the production of electronic documents in civil proceedings. The discovery rules allow a party in civil proceedings to demand that the opposing party produce all relevant documentation (to be defined by the requesting party) in its possession, so as to allow the parties and the court to correctly assess the matter. Through the e-discovery amendment, such information may now include electronic information, which includes finalised reports, working documents, internal memos and e-mails with regard to a specific subject, which may or may not be specifically delineated. This implies that any party being brought before a U.S. court in civil proceedings can be asked to produce such documents. Any party whose activities imply a risk of being involved in such proceedings must therefore take adequate precautions for the management of such information, including its secure storage. Specifically: the party must be capable of initiating a 'litigation hold', a technical/organisational measure which must ensure that no relevant information can be modified any longer in any way. Storage policies must be responsible: while deletion of specific information of course remains allowed when this is a part of general information management policies ('routine, good-faith operation of the information system', Rule 37(f)), the wilful destruction of potentially relevant information can be punished by extremely high fines (in one specific case of 1.6 billion US$). Thus, in practice, any businesses who risk civil litigation before U.S. courts must implement adequate information management policies, and must implement the necessary measures to initiate a litigation hold.
• Privacy
• Gramm–Leach–Bliley Act (GLBA)
• USA PATRIOT Act, Title III
• Health Insurance Portability and Accountability Act (HIPAA). From an RM/RA perspective, the Act is particularly known for its provisions with regard to Administrative Simplification (Title II of HIPAA). This title required the U.S. Department of Health and Human Services (HHS) to draft specific rule sets, each of which would provide specific standards which would improve the efficiency of the health care system and prevent abuse. As a result, the HHS has adopted five principal rules: the Privacy Rule, the Transactions and Code Sets Rule, the Unique Identifiers Rule, the Enforcement Rule, and the Security Rule. The latter, published in the Federal Register on 20 February 2003 (see: http://www.cms.hhs.gov/SecurityStandard/Downloads/securityfinalrule.pdf), is specifically relevant, as it specifies a series of administrative, technical, and physical security procedures to assure the confidentiality of electronic protected health information. These aspects have been further outlined in a set of Security Standards on Administrative, Physical, Organisational and Technical Safeguards, all of which have been published, along with a guidance document on the basics of HIPAA risk management and risk assessment (see http://www.cms.hhs.gov/EducationMaterials/04_SecurityMaterials.asp). European or other countries' health care service providers will generally not be affected by HIPAA obligations if they are not active on the U.S. market. However, since their data processing activities are subject to similar obligations under general European law (including the Privacy Directive), and since the underlying trends of modernisation and evolution towards electronic health files are the same, the HHS safeguards can be useful as an initial yardstick for measuring RM/RA strategies put in place by European health care service providers, specifically with regard to the processing of electronic health information. HIPAA security standards include the following:
• Administrative safeguards:
• Security Management Process
• Assigned Security Responsibility
• Workforce Security
• Information Access Management
• Security Awareness and Training
• Security Incident Procedures
• Contingency Plan
• Evaluation
• Business Associate Contracts and Other Arrangements
• Physical safeguards:
• Facility Access Controls
• Workstation Use
• Workstation Security
• Device and Media Controls
• Technical safeguards:
• Access Control
• Audit Controls
• Integrity
• Person or Entity Authentication
• Transmission Security
• Organisational requirements:
• Business Associate Contracts & Other Arrangements
• Requirements for Group Health Plans
• International Safe Harbor Privacy Principles issued by the US Department of Commerce on July 21, 2000. Topic: Export of personal data from a data controller who is subject to E.U. privacy regulations to a U.S.-based destination. Before personal data may be exported from an entity subject to E.U. privacy regulations to a destination subject to U.S. law, the European entity must ensure that the receiving entity provides adequate safeguards to protect such data against a number of mishaps. One way of complying with this obligation is to require the receiving entity to join the Safe Harbor, by requiring that the entity self-certifies its compliance with the so-called Safe Harbor Principles. If this road is chosen, the data controller exporting the data must verify that the U.S. destination is indeed on the Safe Harbor list (see safe harbor list [37]).
• Sarbanes–Oxley Act
• FISMA

Standards Organizations and Standards

• International standard bodies:
• International Organization for Standardization - ISO
• Payment Card Industry Security Standards Council
• Information Security Forum
• The Open Group
• USA standard bodies:
• National Institute of Standards and Technology - NIST
• Federal Information Processing Standards - FIPS by NIST, devoted to Federal Government and Agencies
• UK standard bodies:
• British Standard Institute

Short description of standards

The list is chiefly based on [23]:

ISO

• ISO/IEC 13335-1:2004 - Information technology—Security techniques—Management of information and communications technology security—Part 1: Concepts and models for information and communications technology security management. Reference: http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=39066 (Note: this is a reference to the ISO page where the standard can be acquired. However, the standard is not free of charge, and its provisions are not publicly available. For this reason, specific provisions cannot be quoted). Topic: Standard containing generally accepted descriptions of concepts and models for information and communications technology security management. The standard is a commonly used code of practice, and serves as a resource for the implementation of security management practices and as a yardstick for auditing such practices. (See also http://csrc.nist.gov/publications/secpubs/otherpubs/reviso-faq.pdf)

• ISO/IEC TR 15443-1:2005 – Information technology—Security techniques—A framework for IT security assurance. Reference: http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=39733 (Note: this is a reference to the ISO page where the standard can be acquired. However, the standard is not free of charge, and its provisions are not publicly available. For this reason, specific provisions cannot be quoted). Topic: Security assurance – the Technical Report (TR) contains generally accepted guidelines which can be used to determine an appropriate assurance method for assessing a security service, product or environmental factor (a deliverable). The TR allows security professionals to determine a suitable methodology for assessing a deliverable. Following this TR, it can be determined which level of security assurance a deliverable is intended to meet, and if this threshold is actually met by the deliverable.
• ISO/IEC 15816:2002 - Information technology—Security techniques—Security information objects for access control. Reference: http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=29139 (Note: this is a reference to the ISO page where the standard can be acquired. However, the standard is not free of charge, and its provisions are not publicly available. For this reason, specific provisions cannot be quoted). Topic: Security management – Access control. The standard allows security professionals to rely on a specific set of syntactic definitions and explanations with regard to SIOs, thus avoiding duplication or divergence in other standardisation efforts.
• ISO/IEC TR 15947:2002 - Information technology—Security techniques—IT intrusion detection framework. Reference: http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=29580 (Note: this is a reference to the ISO page where the standard can be acquired. However, the standard is not free of charge, and its provisions are not publicly available. For this reason, specific provisions cannot be quoted). Topic: Security management – Intrusion detection in IT systems. The standard allows security professionals to rely on a specific set of concepts and methodologies for describing and assessing security risks with regard to potential intrusions in IT systems. It does not contain any RM/RA obligations as such, but it is rather a tool for facilitating RM/RA activities in the affected field.
• ISO/IEC 15408-1/2/3:2005 - Information technology—Security techniques—Evaluation criteria for IT security—Part 1: Introduction and general model (15408-1), Part 2: Security functional requirements (15408-2), Part 3: Security assurance requirements (15408-3). Reference: http://isotc.iso.org/livelink/livelink/fetch/2000/2489/Ittf_Home/PubliclyAvailableStandards.htm Topic: Standard containing a common set of requirements for the security functions of IT products and systems and for assurance measures applied to them during a security evaluation. Scope: Publicly available ISO standard, which can be voluntarily implemented. The text is a resource for the evaluation of the security of IT products and systems, including (if not specifically) for procurement decisions with regard to such products. The standard can thus be used as an RM/RA tool to determine the security of an IT product or system during its design, manufacturing or marketing, or before procuring it.
• ISO/IEC 17799:2005 - Information technology—Security techniques—Code of practice for information security management. Reference: http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=39612&ICS1=35&ICS2=40&ICS3= (Note: this is a reference to the ISO page where the standard can be acquired. However, the standard is not free of charge, and its provisions are not publicly available. For this reason, specific provisions cannot be quoted). Topic: Standard containing generally accepted guidelines and general principles for initiating, implementing, maintaining, and improving information security management in an organization, including business continuity management. The standard is a very commonly used code of practice, and serves as a resource for the implementation of information security management practices and as a yardstick for auditing such practices. (See also ISO/IEC 27001)
• ISO/IEC TR 15446:2004 – Information technology—Security techniques—Guide for the production of Protection Profiles and Security Targets. Reference: http://isotc.iso.org/livelink/livelink/fetch/2000/2489/Ittf_Home/PubliclyAvailableStandards.htm Topic: Technical Report (TR) containing guidelines for the construction of Protection Profiles (PPs) and Security Targets (STs) that are intended to be compliant with ISO/IEC 15408 (the "Common Criteria"). The standard is predominantly used as a tool for security professionals to develop PPs and STs, but can also be used to assess the validity of the same (by using the TR as a yardstick to determine if its standards have been obeyed).
• ISO/IEC 18028:2006 - Information technology—Security techniques—IT network security. Reference: http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=40008 (Note: this is a reference to the ISO page where the standard can be acquired. However, the standard is not free of charge, and its provisions are not publicly available. For this reason, specific provisions cannot be quoted). Topic: Five-part standard (ISO/IEC 18028-1 to 18028-5) containing generally accepted guidelines on the security aspects of the management, operation and use of information technology networks. The standard is considered an extension of the guidelines provided in ISO/IEC 13335 and ISO/IEC 17799, focusing specifically on network security risks. The standard is a commonly used code of practice, and serves as a resource for the implementation of security management practices and as a yardstick for auditing such practices.
• ISO/IEC 18045:2005 - Information technology—Security techniques—Methodology for IT security evaluation. Reference: http://isotc.iso.org/livelink/livelink/fetch/2000/2489/Ittf_Home/PubliclyAvailableStandards.htm Topic: Standard containing auditing guidelines for assessment of compliance with ISO/IEC 15408 (Information technology—Security techniques—Evaluation criteria for IT security). Scope: Publicly available ISO standard, to be followed when evaluating compliance with ISO/IEC 15408. The standard is a 'companion document', which is thus primarily of use for security professionals involved in evaluating compliance with ISO/IEC 15408. Since it describes minimum actions to be performed by such auditors, compliance with ISO/IEC 15408 is impossible if ISO/IEC 18045 has been disregarded.
• ISO/IEC 27001:2005 - Information technology—Security techniques—Information security management systems—Requirements. Reference: http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=42103 (Note: this is a reference to the ISO page where the standard can be acquired. However, the standard is not free of charge, and its provisions are not publicly available. For this reason, specific provisions cannot be quoted). Topic: Standard containing generally accepted guidelines for the implementation of an Information Security Management System within any given organisation. Scope: Not publicly available ISO standard, which can be voluntarily implemented. While not legally binding, the text contains direct guidelines for the creation of sound information security practices. The standard is a very commonly used code of practice, and serves as a resource for the implementation of information security management systems and as a yardstick for auditing such systems and/or the surrounding practices. Its application in practice is often combined with related standards, such as BS 7799-3:2006, which provides additional guidance to support the requirements given in ISO/IEC 27001:2005 (see http://www.bsi-global.com/en/Shop/Publication-Detail/?pid=000000000030125022&recid=2491). (See also ISO/IEC 17799)
• ISO/IEC TR 18044:2004 – Information technology—Security techniques—Information security incident management. Reference: http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=35396 (Note: this is a reference to the ISO page where the standard can be acquired. However, the standard is not free of charge, and its provisions are not publicly available. For this reason, specific provisions cannot be quoted). Topic: Technical Report (TR) containing generally accepted guidelines and general principles for information security incident management in an organization. Scope: Not publicly available ISO TR, which can be voluntarily used. While not legally binding, the text contains direct guidelines for incident management. The standard is a high-level resource introducing basic concepts and considerations in the field of incident response. As such, it is mostly useful as a catalyst to awareness-raising initiatives in this regard.
• ISO/TR 13569:2005 - Financial services—Information security guidelines. Reference: http://www.iso.org/iso/en/CatalogueDetailPage.CatalogueDetail?CSNUMBER=37245 (Note: this is a reference to the ISO page where the standard can be acquired. However, the standard is not free of charge, and its provisions are not publicly available. For this reason, specific provisions cannot be quoted). Topic: Standard containing guidelines for the implementation and assessment of information security policies in financial services institutions. The standard is a commonly referenced guideline, and serves as a resource for the implementation of information security management programmes in institutions of the financial sector, and as a yardstick for auditing such programmes. (See also http://csrc.nist.gov/publications/secpubs/otherpubs/reviso-faq.pdf)
• ISO/IEC 21827:2008 - Information technology—Security techniques—Systems Security Engineering—Capability Maturity Model® (SSE-CMM®). ISO/IEC 21827:2008 specifies the Systems Security Engineering—Capability Maturity Model® (SSE-CMM®), which describes the essential characteristics of an organization's security engineering process that must exist to ensure good security engineering. ISO/IEC 21827:2008 does not prescribe a particular process or sequence, but captures practices generally observed in industry. The model is a standard metric for security engineering practices.

BSI

• BS 25999-1:2006 - Business continuity management Part 1: Code of practice. Note: this is only part one of BS 25999, which was published in November 2006. Part two (which should contain more specific criteria with a view to possible accreditation) is yet to appear. Reference: http://www.bsi-global.com/en/Shop/Publication-Detail/?pid=000000000030157563. Topic: Standard containing a business continuity code of practice. The standard is intended as a code of practice for business continuity management, and will be extended by a second part that should permit accreditation for adherence with the standard. Given its relative newness, the potential impact of the standard is difficult to assess, although it could be very influential to RM/RA practices, given the general lack of universally applicable standards in this regard and the increasing attention to business continuity and contingency planning in regulatory initiatives. Application of this standard can be complemented by other norms, in particular PAS 77:2006 - IT Service Continuity Management Code of Practice (see http://www.bsi-global.com/en/Shop/Publication-Detail/?pid=000000000030141858).
• BS 7799-3:2006 - Information security management systems—Guidelines for information security risk management. Reference: http://www.bsi-global.com/en/Shop/Publication-Detail/?pid=000000000030125022&recid=2491 (Note: this is a reference to the BSI page where the standard can be acquired. However, the standard is not free of charge, and its provisions are not publicly available. For this reason, specific provisions cannot be quoted). Topic: Standard containing general guidelines for information security risk management. Scope: Not publicly available BSI standard, which can be voluntarily implemented. While not legally binding, the text contains direct guidelines for the creation of sound information security practices. The standard is mostly intended as a guiding complementary document to the application of the aforementioned ISO 27001:2005, and is therefore typically applied in conjunction with this standard in risk assessment practices.
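None of the standards above can be quoted directly, but the generic risk-treatment decision they frame — score each risk, compare it against the level the organization is willing to accept, and decide which countermeasures, if any, to take — can be illustrated with a small sketch. Everything here (the numbers, the countermeasure names, and the greedy selection by cost) is a hypothetical example, not taken from any of the standards.

```python
# Generic illustration of the risk-treatment decision framed by the risk
# management standards above: reduce each risk to the level the organization
# is willing to accept. All names and numbers are hypothetical examples.

def select_countermeasures(risk, acceptable, countermeasures):
    """Greedily apply the cheapest countermeasures until the residual risk
    falls to the acceptable level.

    countermeasures: list of (name, cost, risk_reduction) tuples.
    Returns (chosen countermeasure names, residual risk).
    """
    chosen, residual = [], risk
    for name, cost, reduction in sorted(countermeasures, key=lambda c: c[1]):
        if residual <= acceptable:
            break
        chosen.append(name)
        residual -= reduction
    return chosen, residual

chosen, residual = select_countermeasures(
    risk=8,          # assessed risk score for an information resource
    acceptable=3,    # risk level the organization accepts
    countermeasures=[("patching", 1, 3), ("encryption", 5, 2), ("WAF", 2, 2)],
)
print(chosen, residual)  # -> ['patching', 'WAF'] 3
```

In a real ISMS the "acceptable level" and the countermeasure catalogue come from the organization's own risk acceptance criteria, as the CISA definition earlier in this chapter emphasises.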

Information Security Forum

• Standard of Good Practice

Professionalism

Information security professionalism is the set of knowledge that people working in information security and similar fields (information assurance and computer security) should have and eventually demonstrate through certifications from well-respected organizations. It also encompasses the education process required to accomplish different tasks in these fields. Information technology adoption is constantly increasing and has spread to vital infrastructure for civil and military organizations, and everybody can get involved in cyberwar. It is therefore crucial that a nation have skilled professionals to defend its vital interests.

References

[1] "Risk is a combination of the likelihood of an occurrence of a hazardous event or exposure(s) and the severity of injury or ill health that can be caused by the event or exposure(s)" (OHSAS 18001:2007)
[2] Caballero, Albert (2009). "Chapter 14". Computer and Information Security Handbook. Morgan Kaufmann Publications, Elsevier Inc. p. 232. ISBN 978-0-12-374354-1
[3] ISO/IEC, "Information technology—Security techniques—Information security risk management" ISO/IEC FIDIS 27005:2008
[4] CNSS Instruction No. 4009 (http://www.cnss.gov/Assets/pdf/cnssi_4009.pdf) dated 26 April 2010
[5] National Information Assurance Certification and Accreditation Process (NIACAP) by National Security Telecommunications and Information Systems Security Committee (http://niatec.info/GetFile.aspx?pid=567)
[6] NIATEC Glossary of terms (http://niatec.info/Glossary.aspx?term=4253&alpha=R)
[7] a wiki project (http://fismapedia.org/index.php) devoted to FISMA
[8] FISMApedia Risk term (http://fismapedia.org/index.php?title=Term:Risk)
[9] NIST SP 800-30 Risk Management Guide for Information Technology Systems (http://csrc.nist.gov/publications/nistpubs/800-30/sp800-30.pdf)
[10] FIPS Publication 200 Minimum Security Requirements for Federal Information and Information Systems (http://csrc.nist.gov/publications/fips/fips200/FIPS-200-final-march.pdf)
[11] FAIR: Factor Analysis for Information Risks (http://www.riskmanagementinsight.com/media/docs/FAIR_introduction.pdf)
[12] ISACA THE RISK IT FRAMEWORK (http://www.isaca.org/Knowledge-Center/Research/Documents/RiskIT-FW-18Nov09-Research.pdf) ISBN 978-1-60420-111-6 (registration required)
[13] Technical Standard Risk Taxonomy ISBN 1-931624-77-1, Document Number: C081, published by The Open Group, January 2009
[14] ENISA Glossary Event (http://www.enisa.europa.eu/act/rm/cr/risk-management-inventory/glossary#G11)
[15] ENISA Glossary Incident (http://www.enisa.europa.eu/act/rm/cr/risk-management-inventory/glossary#G17)
[16] ENISA Glossary Impact (http://www.enisa.europa.eu/act/rm/cr/risk-management-inventory/glossary#G21)
[17] ENISA Glossary Consequence (http://www.enisa.europa.eu/act/rm/cr/risk-management-inventory/glossary#G4)
[18] OWASP Risk Rating Methodology (http://www.owasp.org/index.php/OWASP_Risk_Rating_Methodology)
[19] ISACA THE RISK IT FRAMEWORK (registration required) (http://www.isaca.org/Knowledge-Center/Research/Documents/RiskIT-FW-18Nov09-Research.pdf)
[20] ENISA Risk Management, Risk Assessment Inventory, page 46 (http://www.enisa.europa.eu/act/rm/cr/risk-management-inventory/files/deliverables/risk-management-principles-and-inventories-for-risk-management-risk-assessment-methods-and-tools/at_download/fullReport)
[21] Katsicas, Sokratis K. (2009). "Chapter 35". Computer and Information Security Handbook. Morgan Kaufmann Publications, Elsevier Inc. p. 605. ISBN 978-0-12-374354-1
[22] ISACA (2006). CISA Review Manual 2006. Information Systems Audit and Control Association. p. 85. ISBN 1-933284-15-3
[23] Risk Management / Risk Assessment in European regulation, international guidelines and codes of practice (http://www.enisa.europa.eu/act/rm/cr/laws-regulation/downloads/risk-management-risk-assessment-in-european-regulation-international-guidelines-and-codes-of-practice/at_download/fullReport), conducted by the Technical Department of ENISA Section Risk Management in cooperation with Prof. J. Dumortier and Hans Graux, www.lawfort.be, June 2007
[24] http://www.unhchr.ch/html/menu3/b/71.htm
[25] http://www.oecd.org/document/18/0,2340,en_2649_34255_1815186_1_1_1_1,00.html
[26] http://www.oecd.org/dataoecd/16/22/15582260.pdf
[27] http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32001R0045:EN:NOT
[28] http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:EN:NOT
[29] http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32001D0497:EN:NOT
[30] http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32004D0915:EN:NOT
[31] http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32002L0058:EN:NOT
[32] http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32006L0024:EN:NOT
[33] http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32008L0114:EN:NOT
[34] http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:32005F0222:EN:NOT
[35] http://conventions.coe.int/Treaty/EN/Treaties/Html/185.htm
[36] http://www.law.cornell.edu/rules/frcp/
[37] http://web.ita.doc.gov/safeharbor/shlist.nsf/webPages/safe+harbor+list

External links

• The Institute of Risk Management (IRM) (http://www.theirm.org/)
• Internet2 Information Security Guide (https://wiki.internet2.edu/confluence/display/itsg2/Home)
• Risk Management – Principles and Inventories for Risk Management / Risk Assessment methods and tools (http://www.enisa.europa.eu/act/rm/cr/risk-management-inventory/files/deliverables/risk-management-principles-and-inventories-for-risk-management-risk-assessment-methods-and-tools/at_download/fullReport). Publication date: Jun 01, 2006. Authors: Conducted by the Technical Department of ENISA Section Risk Management
• Clusif, Club de la Sécurité de l'Information Français (https://www.clusif.asso.fr/)
• 800-30 NIST Risk Management Guide (http://csrc.nist.gov/publications/nistpubs/800-30/sp800-30.pdf)
• 800-37 NIST Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach (http://csrc.nist.gov/publications/nistpubs/800-37-rev1/sp800-37-rev1-final.pdf)
• NIST SP 800-39 (draft) (http://csrc.nist.gov/publications/PubsDrafts.html#SP-800-39)
• FIPS Publication 199, Standards for Security Categorization of Federal Information and Information Systems (http://csrc.nist.gov/publications/fips/fips199/FIPS-PUB-199-final.pdf)
• FIPS Publication 200 Minimum Security Requirements for Federal Information and Information Systems (http://csrc.nist.gov/publications/fips/fips200/FIPS-200-final-march.pdf)
• FISMApedia, a collection of documents and discussions focused on USA Federal IT security (http://fismapedia.org/)
eu/ LexUriServ/ LexUriServ. eu/ LexUriServ/ LexUriServ.org/index. eu/ LexUriServ/ LexUriServ. europa.org/index.nist.pdf) • 800-39 NIST DRAFT Managing Risk from Information Systems: An Organizational Perspective (http://csrc. do?uri=CELEX:32006L0024:EN:NOT http:/ / eur-lex. europa. internet2. do?uri=CELEX:32008L0114:EN:NOT http:/ / eur-lex.nist. doc.php?title=Main_Page) .gov/publications/fips/fips199/FIPS-PUB-199-final.clusif.html) is risk management's leading international professional education and training body • Internet2 Information Security Guide: Effective Practices and Solutions for Higher Education (https://wiki. europa.theirm. cornell. nist. europa. nist.

IT risk management

IT risk management is the application of risk management to the information technology context in order to manage IT risk, i.e.:

    The business risk associated with the use, ownership, operation, involvement, influence and adoption of IT within an enterprise

IT risk management can be considered a component of a wider enterprise risk management system.[1] The establishment, maintenance and continuous update of an ISMS provide a strong indication that a company is using a systematic approach for the identification, assessment and management of information security risks.[2] Different methodologies have been proposed to manage IT risks, each of them divided into processes and steps.[3]

According to Risk IT, IT risk encompasses not only the negative impact of operations and service delivery, which can bring destruction or reduction of the value of the organization, but also the benefit- and value-enabling risk associated with missed opportunities to use technology to enable or enhance the business, and IT project management aspects such as overspending or late delivery with adverse business impact.

Risk Management Elements

[Figure: relationships between the IT security entities]

Because risk is strictly tied to uncertainty, decision theory should be applied to manage risk as a science, i.e. by rationally making choices under uncertainty. Generally speaking, risk is the product of likelihood times impact (Risk = Likelihood * Impact).[4] The measure of an IT risk can be determined as a product of threat, vulnerability and asset values:[5]

    Risk = Threat * Vulnerability * Asset

Definitions

The CISA Review Manual 2006 provides the following definition of risk management: "Risk management is the process of identifying vulnerabilities and threats to the information resources used by an organization in achieving business objectives, and deciding what countermeasures, if any, to take in reducing risk to an acceptable level, based on the value of the information resource to the organization."[6]

There are two things in this definition that may need clarification. First, the process of risk management is an ongoing, iterative process that must be repeated indefinitely: the business environment is constantly changing, and new threats and vulnerabilities emerge every day. Second, the choice of countermeasures (controls) used to manage risks must strike a balance between productivity, cost, effectiveness of the countermeasure, and the value of the informational asset being protected.

Risk management is the process that allows IT managers to balance the operational and economic costs of protective measures and achieve gains in mission capability by protecting the IT systems and data that support their organizations' missions. This process is not unique to the IT environment; indeed, it pervades decision-making in all areas of our daily lives.[7]

The head of an organizational unit must ensure that the organization has the capabilities needed to accomplish its mission. These mission owners must determine the security capabilities that their IT systems must have to provide the desired level of mission support in the face of real-world threats. Most organizations have tight budgets for IT security; IT security spending must therefore be reviewed as thoroughly as other management decisions.
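The two formulas above combine ordinal ratings into a single comparable score. A toy sketch, in which the 1-5 scales and the example values are illustrative assumptions rather than part of any standard:

```python
# Toy illustration of Risk = Threat * Vulnerability * Asset and
# Risk = Likelihood * Impact. The 1-5 ordinal scales below are an
# illustrative assumption, not mandated by any standard.

def risk_from_factors(threat: int, vulnerability: int, asset_value: int) -> int:
    """Composite risk score as the product of three 1-5 ratings."""
    return threat * vulnerability * asset_value

def risk_from_likelihood(likelihood: int, impact: int) -> int:
    """Risk as likelihood times impact, both rated 1-5."""
    return likelihood * impact

# A stolen laptop holding customer data: high threat, moderate
# vulnerability, high asset value.
print(risk_from_factors(4, 3, 5))    # 60 on a 1-125 scale
print(risk_from_likelihood(3, 5))    # 15 on a 1-25 scale
```

Such scores have no absolute meaning; they only rank scenarios against each other under a consistent rating scheme.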

A well-structured risk management methodology, when used effectively, can help management identify appropriate controls for providing the mission-essential security capabilities.[7]

Risk management in the IT world is a quite complex, multifaceted activity, with many relations to other complex activities. The picture shows the relationships between the different related terms.

The National Information Assurance Training and Education Center defines risk management in the IT field as:[8]

1. The total process of identifying, controlling, and eliminating or minimizing uncertain events that may affect system resources. It includes risk analysis, cost benefit analysis, selection, implementation and test, security evaluation of safeguards, and overall security review.
2. An element of managerial science concerned with the identification, measurement, control, and minimization of uncertain events. An effective risk management program encompasses the following four phases:
   1. a Risk assessment, as derived from an evaluation of threats and vulnerabilities.
   2. Management decision.
   3. Control implementation.
   4. Effectiveness review.
3. The total process of identifying, measuring, and minimizing uncertain events affecting AIS resources. It includes risk analysis, cost benefit analysis, safeguard selection, security test and evaluation, safeguard implementation, and systems review.
4. The total process of identifying, controlling, and minimizing the impact of uncertain events. The objective of the risk management program is to reduce risk and obtain and maintain DAA approval. The process facilitates the management of security risks by each level of management throughout the system life cycle. The approval process consists of three elements: risk analysis, certification, and approval.

Risk management as part of enterprise risk management

Some organizations have a comprehensive Enterprise risk management (ERM) system in place. The four objective categories addressed, according to COSO, are:

• Strategy - high-level goals, aligned with and supporting the organization's mission
• Operations - effective and efficient use of resources
• Financial Reporting - reliability of operational and financial reporting
• Compliance - compliance with applicable laws and regulations

According to the Risk IT framework by ISACA,[9] IT risk is transversal to all four categories. IT risk should be managed in the framework of enterprise risk management: the risk appetite and risk sensitivity of the whole enterprise should guide the IT risk management process, and ERM should provide the context and business objectives to IT risk management.

Risk management methodology

The term methodology means an organized set of principles and rules that drive action in a particular field of knowledge. A methodology does not describe specific methods; nevertheless, it does specify several processes that need to be followed. These processes constitute a generic framework. They may be broken down into sub-processes, they may be combined, or their sequence may change. However, any risk management exercise must carry out these processes in one form or another. The following table compares the processes foreseen by three leading standards.[3] The ISACA Risk IT framework is more recent; the Risk IT Practitioner-Guide[10] compares Risk IT and ISO 27005. Overall, the elements described in the ISO 27005 process are all included in Risk IT; however, some are structured and named differently.

[Figure: ENISA: The Risk Management Process, according to ISO Standard 13335]

Risk management constituent processes

| ISO/IEC 27005:2008 | BS 7799-3:2006 | SP 800-30 | Risk IT |
| Context establishment | Organizational context | - | RG and RE domains, more precisely: RG1.2 Propose IT risk tolerance; RG2.1 Establish and maintain accountability for IT risk management; RG2.3 Adapt IT risk practices to enterprise risk practices; RG2.4 Provide adequate resources for IT risk management; RG2.5 Provide independent assurance over IT risk management |
| Risk assessment | Risk assessment | Risk assessment | The RE2 process includes: RE2.1 Define IT risk analysis scope; RE2.2 Estimate IT risk; RE2.3 Identify risk response options; RE2.4 Perform a peer review of IT risk analysis |
| Risk treatment | Risk treatment and management decision making | Risk mitigation | RE2.3 Identify risk response options; RR2.3 Respond to discovered risk exposure and opportunity |
| Risk acceptance | - | - | RG3.4 Accept IT risk |
| Risk communication | Ongoing risk management activities | - | RG1.5 Promote IT risk-aware culture; RG1.6 Encourage effective communication of IT risk |
| Risk monitoring and review | - | Evaluation and assessment | RG2 Integrate with ERM; RE3.6 Develop IT risk indicators; RG2.5 Provide independent assurance over IT risk management |

Due to the probabilistic nature and the need for cost-benefit analysis, IT risks are managed following a process that, according to NIST SP 800-30, can be divided into the following steps:[7]

1. risk assessment,
2. risk mitigation, and
3. evaluation and assessment.

Effective risk management must be totally integrated into the Systems Development Life Cycle.[11] Information risk analysis conducted on applications, computer installations, networks and systems under development should be undertaken using structured methodologies.

Context establishment

This step is the first step in the ISO/IEC 27005 framework. Most of its elementary activities are foreseen as the first sub-process of risk assessment according to NIST SP 800-30. This step implies the acquisition of all relevant information about the organization and the determination of the basic criteria, the purpose, the scope and boundaries of risk management activities, and the organization in charge of risk management activities. The purpose is usually compliance with legal requirements and providing evidence of the due diligence supporting an ISMS that can be certified. The scope can be an incident reporting plan or a business continuity plan; another area of application can be the certification of a product.

Criteria include the risk evaluation, risk acceptance and impact evaluation criteria. These are conditioned by:[12]

• legal and regulatory requirements
• the strategic value of information processes for the business
• stakeholder expectations
• negative consequences for the reputation of the organization

In establishing the scope and boundaries, the organization should be studied: its mission, its values, its structure, its strategy, its locations and its cultural environment. The constraints (budgetary, cultural, political, technical) of the organization are to be collected and documented as a guide for the next steps.

Organization for security management

The set-up of the organization in charge of risk management is foreseen as partially fulfilling the requirement to provide the resources needed to establish, implement, operate, monitor, review, maintain and improve an ISMS.[13] The main roles inside this organization are:[7]

• Senior Management
• Chief information officer (CIO)
• System and Information owners
• the business and functional managers
• the Information System Security Officer (ISSO) or Chief information security officer (CISO)
• IT Security Practitioners
• Security Awareness Trainers
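The outputs of the context establishment step above (purpose, scope, criteria, constraints) can be captured in a simple record. A sketch, in which the field names are an illustrative assumption and not terminology prescribed by ISO/IEC 27005:

```python
from dataclasses import dataclass, field

# Illustrative container for context-establishment outputs; the field
# names are an assumption, not mandated by ISO/IEC 27005.
@dataclass
class RiskManagementContext:
    purpose: str                 # e.g. legal compliance, ISMS certification
    scope: str                   # e.g. incident reporting plan, BCP
    risk_evaluation_criteria: list = field(default_factory=list)
    risk_acceptance_criteria: list = field(default_factory=list)
    impact_criteria: list = field(default_factory=list)
    constraints: list = field(default_factory=list)  # budgetary, cultural, political, technical

ctx = RiskManagementContext(
    purpose="Support an ISMS that can be certified",
    scope="Business continuity plan",
    constraints=["budgetary", "technical"],
)
print(ctx.scope)  # Business continuity plan
```

Documenting these decisions up front is what makes the later assessment and treatment steps auditable.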

Risk assessment

Risk management is a recurrent activity that deals with the analysis, planning, implementation, control and monitoring of implemented measurements and the enforced security policy. On the contrary, risk assessment is executed at discrete time points (e.g. once a year, on demand, etc.) and, until the performance of the next assessment, provides a temporary view of assessed risks while parameterizing the entire risk management process. This view of the relationship of risk management to risk assessment is depicted in the figure, as adopted from OCTAVE.[2]

[Figure: ENISA: Risk assessment inside risk management]

Risk assessment is often conducted in more than one iteration, the first being a high-level assessment to identify high risks, while the further iterations detail the analysis of the major risks and of the other risks.

According to the National Information Assurance Training and Education Center, risk assessment in the IT field is:[8]

1. A study of the vulnerabilities, threats, likelihood, loss or impact, and theoretical effectiveness of security measures. Managers use the results of a risk assessment to develop security requirements and specifications.
2. The process of evaluating threats and vulnerabilities, known and postulated, to determine expected loss and establish the degree of acceptability to system operations.
3. An identification of a specific ADP facility's assets, the threats to these assets, and the ADP facility's vulnerability to those threats.
4. An analysis of system assets and vulnerabilities to establish an expected loss from certain events based on estimated probabilities of the occurrence of those events. The purpose of a risk assessment is to determine if countermeasures are adequate to reduce the probability of loss or the impact of loss to an acceptable level.
5. A management tool which provides a systematic approach for determining the relative value and sensitivity of computer installation assets, assessing vulnerabilities, assessing loss expectancy or perceived risk exposure levels, assessing existing protection features and additional protection alternatives or acceptance of risks, and documenting management decisions. Decisions for implementing additional protection features are normally based on the existence of a reasonable ratio between the cost/benefit of the safeguard and the sensitivity/value of the assets to be protected. Risk assessments may vary from an informal review of a small scale microcomputer installation to a more formal and fully documented analysis (i.e., risk analysis) of a large scale computer installation. Risk assessment methodologies may vary from qualitative or quantitative approaches to any combination of these two approaches.

ISO 27005 framework

Risk assessment receives as input the output of the previous step, context establishment; its output is the list of assessed risks prioritized according to the risk evaluation criteria. The process can be divided into the following steps:[12]

• Risk analysis, further divided into:
  • Risk identification
  • Risk estimation
• Risk evaluation

The following table compares these ISO 27005 processes with the Risk IT framework processes:[10]

Risk assessment constituent processes

| ISO 27005 | Risk IT |
| Risk analysis | RE2 Analyse risk comprises more than what is described by the ISO 27005 process step. RE2 has as its objective developing useful information to support risk decisions that take into account the business relevance of risk factors. RE1 Collect data serves as input to the analysis of risk (e.g. identifying risk factors, collecting data on the external environment). |
| - Risk identification | The identification of risk comprises the following elements: risk scenarios and risk factors. |
| - Risk estimation | RE2.2 Estimate IT risk. This process is included in RE2.2 Estimate IT risk. |
| Risk evaluation | RE2.2 Estimate IT risk |

The ISO/IEC 27002:2005 Code of practice for information security management recommends the following be examined during a risk assessment:

• security policy,
• organization of information security,
• asset management,
• human resources security,
• physical and environmental security,
• communications and operations management,
• access control,
• information systems acquisition, development and maintenance (see Systems Development Life Cycle),
• information security incident management,
• business continuity management, and
• regulatory compliance.

Risk identification

Risk identification states what could cause a potential loss; the following are to be identified:[12]

• assets, primary (i.e. business processes and related information) and supporting (i.e. hardware, software, personnel, site, organization structure)
• threats
• existing and planned security measures
• vulnerabilities
• consequences
• related business processes

The output of the sub-process is made up of:[12]

• a list of assets and related business processes to be risk managed, with an associated list of threats and of existing and planned security measures
• a list of vulnerabilities unrelated to any identified threats
• a list of incident scenarios with their consequences

[Figure: OWASP: relationship between threat agent and business impact]
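The identification lists above feed scenario building: each asset is paired with the threats and vulnerabilities that could affect it. A minimal sketch, with a purely illustrative inventory:

```python
from itertools import product

# Sketch of risk identification: pair identified assets with threats and
# vulnerabilities to enumerate candidate incident scenarios for analyst
# screening. The example inventory is purely illustrative.
assets = ["customer database", "laptop fleet"]
threats = ["theft", "malware"]
vulnerabilities = ["no disk encryption", "unpatched OS"]

scenarios = [
    {"asset": a, "threat": t, "vulnerability": v}
    for a, t, v in product(assets, threats, vulnerabilities)
]
print(len(scenarios))  # 8 candidate scenarios
```

In practice analysts prune the cross-product heavily, since most asset-threat-vulnerability combinations are not plausible incident scenarios.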

Risk estimation

There are two methods of risk assessment in the information security field: qualitative and quantitative.[14]

Purely quantitative risk assessment is a mathematical calculation based on security metrics on the asset (system or application). For each risk scenario, taking into consideration the different risk factors, a Single loss expectancy (SLE) is determined. Then, considering the probability of occurrence on a given period basis, for example the annual rate of occurrence (ARO), the Annualized Loss Expectancy is determined as the product of ARO and SLE.[15]

It is important to point out that the values of the assets to be considered are those of all the involved assets, not only the value of the directly affected resource. For example, if you consider the risk scenario of a laptop theft threat, you should consider the value of the data (a related asset) contained in the computer, and the reputation and liability of the company (other assets) deriving from the loss of availability and confidentiality of the data that could be involved. It is easy to understand that intangible assets (data, reputation, liability) can be worth much more than the physical resources at risk (the laptop hardware in the example).[16] Intangible asset value can be huge, but it is not easy to evaluate: this can be a consideration against a pure quantitative approach.[17]

Qualitative risk assessments are descriptive rather than measurable. Qualitative risk assessment (a three-to-five-step evaluation, from Very High to Low) is performed when the organization requires that a risk assessment be performed in a relatively short time or on a small budget, when a significant quantity of relevant data is not available, or when the persons performing the assessment do not have the sophisticated mathematical, financial, and risk assessment expertise required.[14] Qualitative risk assessment can be performed in a shorter period of time and with less data. Qualitative risk assessments are typically performed through interviews of a sample of personnel from all relevant groups within an organization charged with the security of the asset being assessed. Usually a qualitative classification is done first, followed by a quantitative evaluation of the highest risks, to be compared to the costs of security measures.

Risk estimation has as input the output of risk analysis and can be split into the following steps:

• assessment of the consequences, through the valuation of assets
• assessment of the likelihood of the incident (through threat and vulnerability valuation)
• assignment of values to the likelihood and consequence of the risks

The output is the list of risks with value levels assigned. It can be documented in a risk register. During risk estimation there are generally three values for a given asset, one for the loss of each of the CIA properties: Confidentiality, Integrity, Availability.[17]
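The quantitative quantities mentioned above combine as ALE = SLE x ARO, where SLE is commonly derived as asset value times an exposure factor (the derivation of SLE from an exposure factor is standard quantitative-risk practice rather than something this text specifies; the numbers are illustrative):

```python
# Quantitative risk estimation: Annualized Loss Expectancy (ALE) is the
# Single Loss Expectancy (SLE) times the Annual Rate of Occurrence (ARO).
# SLE is commonly taken as asset value times an exposure factor.
# All numbers below are illustrative assumptions.

def single_loss_expectancy(asset_value: float, exposure_factor: float) -> float:
    """Expected loss from one occurrence; exposure_factor in [0, 1]."""
    return asset_value * exposure_factor

def annualized_loss_expectancy(sle: float, aro: float) -> float:
    """Expected loss per year: SLE times the annual rate of occurrence."""
    return sle * aro

sle = single_loss_expectancy(asset_value=50_000, exposure_factor=0.4)  # 20000.0
ale = annualized_loss_expectancy(sle, aro=0.5)  # one incident every two years
print(ale)  # 10000.0
```

The resulting ALE gives a yearly figure that can be compared directly against the yearly cost of a proposed countermeasure.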

Risk evaluation

The risk evaluation process receives as input the output of the risk analysis process. It compares each risk level against the risk acceptance criteria and prioritizes the risk list with risk treatment indications.

NIST SP 800-30 framework

To determine the likelihood of a future adverse event, threats to an IT system must be analyzed in conjunction with the potential vulnerabilities and the controls in place for the IT system. Impact refers to the magnitude of harm that could be caused by a threat's exercise of a vulnerability. The level of impact is governed by the potential mission impacts and produces a relative value for the IT assets and resources affected (e.g., the criticality and sensitivity of the IT system components and data). The risk assessment methodology encompasses nine primary steps:[7]

• Step 1 System Characterization
• Step 2 Threat Identification
• Step 3 Vulnerability Identification
• Step 4 Control Analysis
• Step 5 Likelihood Determination
• Step 6 Impact Analysis
• Step 7 Risk Determination
• Step 8 Control Recommendations
• Step 9 Results Documentation

[Figure: Risk assessment according to NIST SP 800-30, Figure 3-1]

Risk mitigation

Risk mitigation, the second process according to SP 800-30 and the third process of risk management according to ISO 27005, involves prioritizing, evaluating, and implementing the appropriate risk-reducing controls recommended by the risk assessment process. Because the elimination of all risk is usually impractical or close to impossible, it is the responsibility of senior management and of the functional and business managers to use the least-cost approach and implement the most appropriate controls to decrease mission risk to an acceptable level, with minimal adverse impact on the organization's resources and mission.

ISO 27005 framework

The risk treatment process aims at selecting security measures to:

• reduce,
• retain,
• avoid, or
• transfer

risk, and at producing a risk treatment plan, that is, the output of the process, with the residual risks subject to the acceptance of management. There are lists from which to select appropriate security measures,[13] but it is up to the single organization to choose the most appropriate one according to its business strategy and the constraints of the environment and circumstances.
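The evaluation step described above, comparing each assessed risk against the acceptance criteria and prioritizing the remainder for treatment, can be sketched as follows (the threshold and scores are illustrative assumptions):

```python
# Sketch of risk evaluation: screen assessed risks against a risk
# acceptance threshold and rank the rest for treatment, highest first.
# The threshold and the risk levels are illustrative assumptions.
ACCEPTANCE_THRESHOLD = 6  # risks at or below this level are acceptable

assessed_risks = [
    {"name": "laptop theft", "level": 12},
    {"name": "server room flood", "level": 4},
    {"name": "phishing", "level": 15},
]

to_treat = sorted(
    (r for r in assessed_risks if r["level"] > ACCEPTANCE_THRESHOLD),
    key=lambda r: r["level"],
    reverse=True,  # highest risks first
)
print([r["name"] for r in to_treat])  # ['phishing', 'laptop theft']
```

The accepted risks are not discarded; they are documented and revisited at the next assessment cycle.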

The choice should be rational and documented. The importance of accepting a risk that is too costly to reduce is very high, and it has led to risk acceptance being considered a separate process.[12]

Risk transfer applies where the risk has a very high impact but it is not easy to reduce the likelihood significantly by means of security controls: the insurance premium should be compared against the mitigation costs, eventually evaluating some mixed strategy to partially treat the risk. Another option is to outsource the risk to somebody more efficient at managing it.[18]

Risk avoidance describes any action by which the ways of conducting business are changed to avoid any risk occurrence. For example, the choice of not storing sensitive information about customers can be an avoidance of the risk that customer data can be stolen.

The residual risks, i.e. the risk remaining after the risk treatment decisions have been taken, should be estimated to ensure that sufficient protection is achieved. If the residual risk is unacceptable, the risk treatment process should be iterated.

NIST SP 800-30 framework

Risk mitigation is a systematic methodology used by senior management to reduce mission risk.[7] Risk mitigation can be achieved through any of the following risk mitigation options:

• Risk Assumption. To accept the potential risk and continue operating the IT system, or to implement controls to lower the risk to an acceptable level
• Risk Avoidance. To avoid the risk by eliminating the risk cause and/or consequence (e.g., forgo certain functions of the system or shut down the system when risks are identified)
• Risk Limitation. To limit the risk by implementing controls that minimize the adverse impact of a threat's exercising a vulnerability (e.g., use of supporting, preventive, and detective controls)
• Risk Planning. To manage risk by developing a risk mitigation plan that prioritizes, implements, and maintains controls
• Research and Acknowledgement. To lower the risk of loss by acknowledging the vulnerability or flaw and researching controls to correct the vulnerability
• Risk Transference. To transfer the risk by using other options to compensate for the loss, such as purchasing insurance

[Figure: Risk mitigation methodology flow chart, NIST SP 800-30 Figure 4-2]

Address the greatest risks and strive for sufficient risk mitigation at the lowest cost, with minimal impact on other mission capabilities: this is the suggestion contained in NIST SP 800-30.[7]

Risk communication

Risk communication is a horizontal process that interacts bidirectionally with all other processes of risk management. Its purpose is to establish a common understanding of all aspects of risk among all the organization's stakeholders. Establishing a common understanding is important, since it influences the decisions to be taken.
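The transfer-versus-mitigation comparison described above reduces to comparing yearly costs. A hedged sketch of such a decision rule (the thresholds, costs, and the simple precedence of transfer over mitigation are illustrative assumptions, not a prescribed procedure):

```python
# Sketch of one treatment decision discussed above: accept small risks,
# transfer (insure) when the premium undercuts the mitigation cost,
# otherwise mitigate. All figures are illustrative assumptions.

def choose_treatment(annual_loss: float, mitigation_cost: float,
                     insurance_premium: float, acceptance_threshold: float) -> str:
    if annual_loss <= acceptance_threshold:
        return "accept"
    if insurance_premium < mitigation_cost:
        return "transfer"
    return "mitigate"

print(choose_treatment(annual_loss=10_000, mitigation_cost=8_000,
                       insurance_premium=3_000, acceptance_threshold=1_000))
# transfer
```

A real decision would also weigh mixed strategies (partial mitigation plus insurance) and the residual risk each option leaves behind.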

[Figure: Risk mitigation action points, NIST SP 800-30 Figure 4-1]

Risk monitoring and review

Risk management is an ongoing, never-ending process. Within this process, implemented security measures are regularly monitored and reviewed to ensure that they work as planned and that changes in the environment have not rendered them ineffective. Business requirements, vulnerabilities and threats can change over time.

Regular audits should be scheduled and should be conducted by an independent party, i.e. somebody not under the control of those responsible for the implementation or daily management of the ISMS.[5]

IT evaluation and assessment

Security controls should be validated. Technical controls are possibly complex systems that have to be tested and verified. The hardest part to validate is people's knowledge of procedural controls, and the effectiveness of the real application of the security procedures in daily business.[7]

Vulnerability assessment, both internal and external, and penetration tests are instruments for verifying the status of security controls. An information technology security audit is an organizational and procedural control with the aim of evaluating security. The IT systems of most organizations are evolving quite rapidly; risk management should cope with these changes through change authorization after risk re-evaluation of the affected systems and processes, and should periodically review the risks and mitigation actions.

Monitoring system events according to a security monitoring strategy, an incident response plan, and security validation and metrics are fundamental activities to assure that an optimal level of security is obtained. It is important to monitor new vulnerabilities, apply procedural and technical security controls like regularly updating software, and evaluate other kinds of controls to deal with zero-day attacks. The attitude of the involved people to benchmark against best practice and to follow the seminars of professional associations in the sector are factors to assure the state of the art of an organization's IT risk management practice.

IT risk management

Integrating risk management into the system development life cycle

Effective risk management must be totally integrated into the SDLC. An IT system's SDLC has five phases: initiation, development or acquisition, implementation, operation or maintenance, and disposal. The risk management methodology is the same regardless of the SDLC phase for which the assessment is being conducted. Risk management is an iterative process that can be performed during each major phase of the SDLC.[7]

Table 2-1 Integration of Risk Management into the SDLC[7]

• Phase 1: Initiation. Phase characteristics: the need for an IT system is expressed and the purpose and scope of the IT system is documented. Support from risk management activities: identified risks are used to support the development of the system requirements, including security requirements, and a security concept of operations (strategy).
• Phase 2: Development or Acquisition. Phase characteristics: the IT system is designed, purchased, programmed, developed, or otherwise constructed. Support from risk management activities: the risks identified during this phase can be used to support the security analyses of the IT system and may lead to architecture and design tradeoffs during system development.
• Phase 3: Implementation. Phase characteristics: the system security features should be configured, enabled, tested, and verified. Support from risk management activities: the risk management process supports the assessment of the system implementation against its requirements and within its modeled operational environment; decisions regarding identified risks must be made prior to system operation.
• Phase 4: Operation or Maintenance. Phase characteristics: the system performs its functions; typically the system is being modified on an ongoing basis through the addition of hardware and software and by changes to organizational processes, policies, and procedures. Support from risk management activities: risk management activities are performed for periodic system reauthorization (or reaccreditation) or whenever major changes are made to an IT system in its operational, production environment (e.g., new system interfaces).
• Phase 5: Disposal. Phase characteristics: this phase may involve the disposition of information, hardware, and software; activities may include moving, archiving, discarding, or destroying information and sanitizing the hardware and software. Support from risk management activities: risk management activities are performed for system components that will be disposed of or replaced, to ensure that the hardware and software are properly disposed of, that residual data is appropriately handled, and that system migration is conducted in a secure and systematic manner.

NIST SP 800-64[19] is devoted to this topic. It focuses on the information security components of the SDLC. First, descriptions of the key security roles and responsibilities that are needed in most information system developments are provided. Second, sufficient information about the SDLC is provided to allow a person who is unfamiliar with the SDLC process to understand the relationship between information security and the SDLC. The document integrates the security steps into the linear, sequential (a.k.a. waterfall) SDLC; the five-step SDLC cited in the document is an example of one method of development and is not intended to mandate this methodology. Lastly, SP 800-64 provides insight into IT projects and initiatives that are not as clearly defined as SDLC-based developments, such as service-oriented architectures, cross-organization projects, and IT facility developments.

Early integration of security in the SDLC enables agencies to maximize return on investment in their security programs, through:[19]
• early identification and mitigation of security vulnerabilities and misconfigurations, resulting in lower cost of security control implementation and vulnerability mitigation;
• awareness of potential engineering challenges caused by mandatory security controls;
• identification of shared security services and reuse of security strategies and tools, to reduce development cost and schedule while improving security posture through proven methods and techniques; and
• facilitation of informed executive decision making through comprehensive risk management in a timely manner.[20]

Security can be incorporated into information systems acquisition, development and maintenance by implementing effective security practices in the following areas:
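The five-phase mapping above is essentially a lookup table. As a minimal illustration only (the dictionary and helper name are ours, not from NIST SP 800-30), it could be encoded like this when tagging findings in a risk register by SDLC phase:

```python
# Hypothetical sketch: SDLC phase -> risk-management support activity,
# condensed from the Table 2-1 summary above. Not an official NIST artifact.
SDLC_RISK_SUPPORT = {
    "initiation": "Identified risks support system requirements, including "
                  "security requirements and a security concept of operations.",
    "development or acquisition": "Risks identified support security analyses "
                                  "and architecture/design tradeoffs.",
    "implementation": "Assess the implementation against requirements; decide "
                      "on identified risks before operation.",
    "operation or maintenance": "Perform risk activities at periodic "
                                "reauthorization or on major system changes.",
    "disposal": "Ensure secure disposal, residual-data handling, and "
                "systematic migration.",
}

def risk_support_for(phase: str) -> str:
    """Return the risk-management support activity for an SDLC phase."""
    return SDLC_RISK_SUPPORT[phase.strip().lower()]
```

A risk-register tool could call `risk_support_for("Implementation")` to remind reviewers what the assessment should cover at that phase.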

• Security requirements for information systems
• Correct processing in applications
• Cryptographic controls
• Security of system files
• Security in development and support processes
• Technical vulnerability management

Information systems security begins with incorporating security into the requirements process for any new application or system enhancement; security should be designed into the system from the beginning. Security requirements are presented to the vendor during the requirements phase of a product purchase, and formal testing should be done to determine whether the product meets the required security specifications prior to purchasing the product.

Correct processing in applications is essential in order to prevent errors and to mitigate loss, unauthorized modification or misuse of information. Effective coding techniques include validating input and output data, protecting message integrity using encryption, checking for processing errors, and creating activity logs.

Applied properly, cryptographic controls provide effective mechanisms for protecting the confidentiality, authenticity and integrity of information. An institution should develop policies on the use of encryption, including proper key management. Disk encryption is one way to protect data at rest. Data in transit can be protected from alteration and unauthorized viewing using SSL certificates issued through a Certificate Authority that has implemented a Public Key Infrastructure.

System files used by applications must be protected in order to ensure the integrity and stability of the application. Using source code repositories with version control, extensive testing, production back-off plans, and appropriate access to program code are some effective measures that can be used to protect an application's files.

Security in development and support processes is an essential part of a comprehensive quality assurance and production control process, and would usually involve training and continuous oversight by the most experienced staff.

Applications need to be monitored and patched for technical vulnerabilities. Procedures for applying patches should include evaluating the patches to determine their appropriateness, and whether or not they can be successfully removed in case of a negative impact.[3]

Critique of risk management as a methodology

Risk management as a scientific methodology has been criticized as being shallow.[3] Major programs that apply risk management to the IT systems of large organizations, such as FISMA, have been criticized. The risk management methodology is nominally based on the scientific foundations of statistical decision making; yet, by avoiding the complexity that accompanies a formal probabilistic model of risks and uncertainty, risk management looks more like a process that attempts to guess, rather than formally predict, the future on the basis of statistical evidence. It is highly subjective in assessing the value of assets, the likelihood of threat occurrence and the significance of the impact. A better way to deal with the subject has not yet emerged.

These criticisms notwithstanding, risk management is a very important instrument for designing, implementing and operating secure information systems, because it systematically classifies and drives the process of deciding how to treat risks. Its usage is foreseen by legislative rules in many countries.[3]
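The coding techniques listed above include protecting message integrity. As a minimal, illustrative sketch using only the Python standard library (the key-management step is elided here and would follow the institution's encryption policy), an HMAC tag makes tampering with data in transit detectable:

```python
import hashlib
import hmac
import os

def protect(message: bytes, key: bytes) -> tuple[bytes, bytes]:
    """Attach an HMAC-SHA256 tag so tampering is detectable on receipt."""
    tag = hmac.new(key, message, hashlib.sha256).digest()
    return message, tag

def verify(message: bytes, tag: bytes, key: bytes) -> bool:
    """Recompute the tag; compare in constant time to avoid timing leaks."""
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

# In practice the key would be provisioned and rotated per the key-management
# policy; os.urandom stands in for that machinery here.
key = os.urandom(32)
msg, tag = protect(b"amount=100", key)
assert verify(msg, tag, key)          # untouched message verifies
assert not verify(b"amount=999", tag, key)  # altered message is rejected
```

Note that an HMAC provides integrity and authenticity, not confidentiality; encryption (e.g., via TLS as described above) is still needed to prevent unauthorized viewing.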

Risk management methods

It is quite hard to list most of the methods that at least partially support the IT risk management process. Efforts in this direction were made by:
• NIST, Description of Automated Risk Management Packages That NIST/NCSC Risk Management Research Laboratory Has Examined, updated 1991
• ENISA[21] in 2006; a list of methods and tools is available on line with a comparison engine.[22]

Among them, the most widely used are:[3]
• CRAMM, developed by the British government, compliant with ISO/IEC 17799, the Gramm–Leach–Bliley Act (GLBA) and the Health Insurance Portability and Accountability Act (HIPAA)
• EBIOS, developed by the French government, compliant with major security standards: ISO/IEC 27001, ISO/IEC 13335, ISO/IEC 15408, ISO/IEC 17799 and ISO/IEC 21287
• Standard of Good Practice, developed by the Information Security Forum (ISF)
• Mehari, developed by Clusif, the Club de la Sécurité de l'Information Français[23]
• Octave, developed by Carnegie Mellon University's SEI (Software Engineering Institute); the Operationally Critical Threat, Asset, and Vulnerability Evaluation (OCTAVE) approach defines a risk-based strategic assessment and planning technique for security
• IT-Grundschutz (IT Baseline Protection Manual), developed by the Federal Office for Information Security (BSI) (Germany); IT-Grundschutz provides a method for an organization to establish an Information Security Management System (ISMS). It comprises both generic IT security recommendations for establishing an applicable IT security process and detailed technical recommendations to achieve the necessary IT security level for a specific domain.

An ENISA report[2] classified the different methods regarding completeness, free availability and tool support; the result is that:
• EBIOS, the ISF methods and IT-Grundschutz cover all the aspects deeply (risk identification, risk analysis, risk evaluation, risk assessment, risk treatment, risk acceptance, risk communication),
• EBIOS and IT-Grundschutz are the only ones freely available, and
• only EBIOS has an open source tool to support it.

The Factor Analysis of Information Risk (FAIR) main document, "An Introduction to Factor Analysis of Information Risk (FAIR)" (Risk Management Insight LLC, November 2006),[16] points out that most of the methods above lack a rigorous definition of risk and its factors. FAIR is not another methodology to deal with risk management; it complements existing methodologies.[24] FAIR has had good acceptance, mainly by The Open Group and ISACA.

ISACA developed a methodology, called Risk IT, to address various kinds of IT-related risk, chiefly security-related risks. It is integrated with COBIT, a general framework for managing IT. Risk IT has a broader concept of IT risk than other methodologies: it encompasses not only the negative impact of operations and service delivery, which can bring destruction or reduction of the value of the organization, but also the benefit/value-enabling risk associated with missing opportunities to use technology to enable or enhance business, and IT project management aspects such as overspending or late delivery with adverse business impact.

The "Build Security In" initiative of the U.S. Department of Homeland Security cites FAIR.[25] Build Security In is a collaborative effort that provides practices, tools, guidelines, rules, principles, and other resources that software developers, architects, and security practitioners can use to build security into software in every phase of its development. It chiefly addresses secure coding.
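FAIR factors risk into loss event frequency and loss magnitude. As a deliberately simplified, hypothetical sketch (FAIR proper works with calibrated ranges and distributions, not point values; the function name is ours), the core idea of annualizing loss exposure looks like this:

```python
def annualized_loss_exposure(loss_event_frequency: float,
                             probable_loss_magnitude: float) -> float:
    """Single-point estimate: expected loss events per year times the
    probable loss per event. FAIR itself derives these factors from
    calibrated estimates of threat event frequency, vulnerability, and
    loss forms; this collapses them to two numbers for illustration."""
    return loss_event_frequency * probable_loss_magnitude

# e.g., an estimated 0.5 qualifying breaches per year, each with a
# probable loss of $40,000, yields a $20,000/year exposure.
ale = annualized_loss_exposure(0.5, 40_000)
```

Such a figure is only as good as the subjective inputs, which is precisely the criticism of risk management quoted earlier; FAIR's contribution is to define those factors rigorously rather than to change the arithmetic.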

Standards

There are a number of standards about IT risk and IT risk management. For a description see the main article.

References
[1] ISACA, The Risk IT Framework (registration required) (http://www.isaca.org/Knowledge-Center/Research/Documents/RiskIT-FW-18Nov09-Research.pdf)
[2] ENISA, Risk Management: Principles and Inventories for Risk Management / Risk Assessment Methods and Tools, page 46 (http://www.enisa.europa.eu/act/rm/cr/risk-management-inventory/files/deliverables/risk-management-principles-and-inventories-for-risk-management-risk-assessment-methods-and-tools/at_download/fullReport)
[3] Katsicas, Sokratis K. (2009). "35". Computer and Information Security Handbook. Morgan Kaufmann Publications, Elsevier Inc. p. 605. ISBN 978-0-12-374354-1
[4] "Risk is a combination of the likelihood of an occurrence of a hazardous event or exposure(s) and the severity of injury or ill health that can be caused by the event or exposure(s)" (OHSAS 18001:2007)
[5] Caballero, Albert (2009). "14". Computer and Information Security Handbook. Morgan Kaufmann Publications, Elsevier Inc. p. 232. ISBN 978-0-12-374354-1
[6] ISACA (2006). CISA Review Manual 2006. Information Systems Audit and Control Association (http://www.isaca.org/). ISBN 1-933284-15-3
[7] NIST SP 800-30, Risk Management Guide for Information Technology Systems (http://csrc.nist.gov/publications/nistpubs/800-30/sp800-30.pdf)
[8] NIATEC Glossary of terms (http://niatec.info/Glossary.aspx?term=4253&alpha=R)
[9] The Risk IT Framework by ISACA. ISBN 978-1-60420-111-6
[10] The Risk IT Practitioner Guide, Appendix 3. ISACA. ISBN 978-1-60420-116-1 (registration required) (http://www.isaca.org/Knowledge-Center/Research/ResearchDeliverables/Pages/The-Risk-IT-Practitioner-Guide.aspx)
[11] Standard of Good Practice by Information Security Forum (ISF), Section SM3.4, Information risk analysis methodologies (https://www.isfsecuritystandard.com)
[12] ISO/IEC, "Information technology -- Security techniques -- Information security risk management", ISO/IEC FIDIS 27005:2008
[13] ISO/IEC 27001
[14] Official (ISC)2 Guide to CISSP CBK. Risk Management: Auerbach Publications, 2007. p. 1065
[15] CNN article about a class action settlement for a Veterans Affairs stolen laptop (http://articles.cnn.com/2009-01-27/politics/va.data.theft_1_laptop-personal-data-single-veteran?_s=PM:POLITICS)
[16] "An Introduction to Factor Analysis of Information Risk (FAIR)", Risk Management Insight LLC, November 2006 (http://www.riskmanagementinsight.com/media/docs/FAIR_introduction.pdf)
[17] British Standards Institution, "ISMSs - Part 3: Guidelines for information security risk management", BS 7799-3:2006
[18] Costas Lambrinoudakis, Stefanos Gritzalis, Petros Hatzopoulos, Athanasios N. Yannacopoulos, Sokratis Katsikas, "A formal model for pricing information systems insurance contracts", Computer Standards & Interfaces, Volume 27, Issue 5, June 2005, pp. 521-532. doi:10.1016/j.csi.2005.01.010
[19] NIST SP 800-64, Security Considerations in the Information System Development Life Cycle (http://csrc.nist.gov/publications/nistpubs/800-64-Rev2/SP800-64-Revision2.pdf)
[20] EDUCAUSE Dashboard ISO 12 (https://wiki.internet2.edu/confluence/display/itsg2/Information+Systems+Acquisition,+Development,+and+Maintenance+(ISO+12))
[21] ENISA, Inventory of Risk Management / Risk Assessment Methods (http://www.enisa.europa.eu/act/rm/cr/risk-management-inventory/rm-ra-methods)
[22] Inventory of Risk Management / Risk Assessment Methods (http://rm-inv.enisa.europa.eu/rm_ra_methods.html)
[23] https://www.clusif.asso.fr/
[24] Technical Standard Risk Taxonomy. ISBN 1-931624-77-1. Document Number C081. Published by The Open Group, January 2009
[25] https://buildsecurityin.us-cert.gov/bsi/articles/best-practices/deployment/583-BSI.html

External links
• The Institute of Risk Management (IRM) (http://www.theirm.org/index.html), risk management's leading international professional education and training body
• Internet2 Information Security Guide: Effective Practices and Solutions for Higher Education (https://wiki.internet2.edu/confluence/display/itsg2/Home)
• Risk Management - Principles and Inventories for Risk Management / Risk Assessment methods and tools (http://www.enisa.europa.eu/act/rm/cr/risk-management-inventory/files/deliverables/risk-management-principles-and-inventories-for-risk-management-risk-assessment-methods-and-tools/at_download/fullReport). Publication date: Jun 01, 2006. Authors: Conducted by the Technical Department of ENISA, Section Risk Management
• Clusif, Club de la Sécurité de l'Information Français (https://www.clusif.asso.fr/)
• 800-30 NIST Risk Management Guide (http://csrc.nist.gov/publications/nistpubs/800-30/sp800-30.pdf)
• 800-39 NIST DRAFT Managing Risk from Information Systems: An Organizational Perspective (http://csrc.nist.gov/publications/PubsDrafts.html#SP-800-39)
• FIPS Publication 199, Standards for Security Categorization of Federal Information and Information Systems (http://csrc.nist.gov/publications/fips/fips199/FIPS-PUB-199-final.pdf)
• FIPS Publication 200, Minimum Security Requirements for Federal Information and Information Systems (http://csrc.nist.gov/publications/fips/fips200/FIPS-200-final-march.pdf)
• 800-37 NIST Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach (http://csrc.nist.gov/publications/nistpubs/800-37-rev1/sp800-37-rev1-final.pdf)
• FISMApedia, a collection of documents and discussions focused on USA Federal IT security (http://fismapedia.org/index.php?title=Main_Page)
• Anderson, K. "Intelligence-Based Threat Assessments for Information Networks and Infrastructures: A White Paper" (http://www.aracnet.com/~kea/Papers/threat_white_paper.shtml), 2005
• Danny Lieberman, "Using a Practical Threat Modeling Quantitative Approach for data security" (http://www.software.co.il/case-studies/254-data-security-threat-assessment.html), 2009

Month of bugs

Month of Bugs is an increasingly popular strategy used by security researchers to draw attention to the lax security procedures of commercial software corporations. The tenet is that these corporations have shown themselves to be unresponsive and uncooperative to security alerts and that "responsible disclosure" isn't working properly where they're concerned. To that effect, researchers start a Month of Bugs project for a certain software product and disclose one security vulnerability each day for one month.

The original "Month of Bugs" was the Month of Browser Bugs (MoBB) run by security researcher HD Moore.[1] Subsequent projects include the Month of Kernel Bugs (MoKB), which published kernel bugs for Mac OS X, Linux, FreeBSD, Solaris and Windows, as well as four wireless driver bugs;[2][3][4] the Month of Apple Bugs (MoAB), conducted by researchers Kevin Finisterre and LMH, which published bugs related to OS X;[5][6][7] and the Month of PHP Bugs, sponsored by the Hardened PHP team, which published 44 PHP bugs.[8][9][10]

References
[1] Kerner, Sean Michael (5 July 2006). "The Month of The Browser Bugs Begins" (http://www.internetnews.com/security/article.php/3618126). InternetNews.com. QuinStreet Inc. Retrieved 22 October 2010.
[2] Mogull, Rich (6 November 2006). "Learn from 'Month of Kernel Bugs'" (http://www.gartner.com/DisplayDocument?doc_cd=144700&ref=g_homelink). Gartner archive. Gartner Inc. Retrieved 22 October 2010.
[3] Naraine, Ryan (1 November 2006). "Month of Kernel Bugs Launches with Apple Wi-Fi Exploit" (http://www.eweek.com/c/a/Security/Month-of-Kernel-Bugs-Launches-with-Apple-WiFi-Exploit/). eWeek. Ziff Davis Enterprise Holdings Inc. Retrieved 22 October 2010.
[4] Evers, Joris (2 November 2006). "Apple wireless flaw revealed" (http://www.zdnet.co.uk/news/security-threats/2006/11/02/apple-wireless-flaw-revealed-39284508/). ZDNet. CBS Interactive. Retrieved 22 October 2010.
[5] McMillan, Robert (20 December 2006). "Apple Bug-Hunt Begins" (http://www.pcworld.com/article/128282/apple_bughunt_begins.html). PC World. PCWorld Communications, Inc. Retrieved 22 October 2010.
[6] Leyden, John (20 December 2006). "Month of Apple bugs planned for January" (http://www.theregister.co.uk/2006/12/20/month_of_apple_bugs/). The Register. Retrieved 22 October 2010.
[7] Naraine, Ryan (19 December 2006). "Coming in January: Month of Apple Bugs" (http://securitywatch.eweek.com/apple/coming_in_january_month_of_apple_bugs.html). eWeek Security Watch. Ziff Davis Enterprise Holdings Inc. Retrieved 22 October 2010.
[8] Prince, Brian (3 March 2007). "Month of PHP Bugs Begins" (http://www.eweek.com/c/a/Security/Month-of-PHP-Bugs-Begins/). eWeek. Ziff Davis Enterprise Holdings Inc. Retrieved 22 October 2010.
[9] Naraine, Ryan (1 March 2007). "Flaw trifecta kicks off Month of PHP bugs" (http://www.zdnet.com/blog/security/flaw-trifecta-kicks-off-month-of-php-bugs/107). ZDNet. CBS Interactive. Retrieved 22 October 2010.
[10] Naraine, Ryan (4 May 2007). "Controversial 'month of bugs' getting security results" (http://www.zdnet.com/blog/security/controversial-month-of-bugs-getting-security-results/189?tag=mantle_skin;content). ZDNet. CBS Interactive. Retrieved 22 October 2010.

Further reading
• McMillan, Robert (17 March 2007). "Hackers Promise Month of MySpace Bugs" (http://www.pcworld.com/article/129933/hackers_promise_month_of_myspace_bugs.html). PC World. PCWorld Communications, Inc. Retrieved 22 October 2007.

External links
• Month of Kernel Bugs (MoKB) archive (http://projects.info-pull.com/mokb/)
• Kernel Fun (http://kernelfun.blogspot.com/): Month of the Kernel Bugs blog
• Month of Apple Bugs (MoAB) archive (http://projects.info-pull.com/moab/)
• Apple Fun (http://applefun.blogspot.com/): Month of the Apple Bugs blog
• Info-pull.com blog (http://blog.info-pull.com/): a complementary blog from the hosts of MoKB and MoAB
• the Month of PHP Security (http://php-security.org/)

Nikto Web Scanner

Development status: Active. Available in: English. License: GPL. Website: see [1].

Nikto Web Scanner is a Web server scanner that tests Web servers for dangerous files/CGIs, outdated server software and other problems. It performs generic and server type specific checks. It also captures and prints any cookies received. The Nikto code itself is Open Source (GPL); however, the data files used to drive it are not.[2] Chris Sullo, the CFO of the Open Security Foundation, wrote this scanner for vulnerability assessment.[3]

Functions

Nikto performs comprehensive tests against web servers for multiple items, including over 6100 potentially dangerous files/CGIs, checks for outdated versions of over 950 servers, and version specific problems on over 260 servers.

Variations

There are some variations of Nikto, one of which is MacNikto. MacNikto is an AppleScript GUI shell script wrapper built in Apple's Xcode and Interface Builder, released under the terms of the GPL. It provides easy access to a subset of the features available in the Open Source, command-line driven Nikto web security scanner, which is installed along with the MacNikto application.[4][5]

References
[1] http://www.cirt.net/code/nikto.shtml
[2] "Data file distributed with Nikto with non-Open Source licence notice at the top" (http://www.cirt.net/nikto/UPDATES/2.03/db_404_strings)
[3] "OSVDB Profile" (http://osvdb.org/about.php)
[4] "MacNikto" (http://www.informationgift.com/macnikto/)
[5] "Yet another Nikto GUI" (http://www.rbcafe.com/yang)

External links
• CIRT Nikto Page (http://www.cirt.net/code/nikto.shtml)
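The kind of per-path check a scanner such as Nikto performs can be sketched in a few lines. This is illustrative logic only, not Nikto's actual code; `fetch_status` is a hypothetical injected callable (e.g., a wrapper around an HTTP client) so the checking logic stays testable without a live server:

```python
# Hypothetical sketch of a dangerous-file check: probe a list of
# known-risky paths and report the ones the server actually serves.
from typing import Callable, Iterable, List

def find_exposed_paths(fetch_status: Callable[[str], int],
                       paths: Iterable[str]) -> List[str]:
    """Return the paths that respond with HTTP 200 (potentially exposed).

    Real scanners also fingerprint the server type first and compare
    response bodies, since some servers answer 200 for everything.
    """
    return [p for p in paths if fetch_status(p) == 200]

# A tiny stand-in for Nikto's large data files of known-risky paths.
RISKY_PATHS = ["/cgi-bin/test.cgi", "/.git/config", "/backup.zip"]
```

In use, `fetch_status` would issue a real request, e.g. `lambda p: urllib.request.urlopen(base_url + p).status`; Nikto itself drives thousands of such checks from its (non-open-source) data files.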

North American Electric Reliability Corporation

The North American Electric Reliability Corporation (NERC), a nonprofit corporation based in Atlanta, GA, was formed on March 28, 2006, as the successor to the North American Electric Reliability Council (also known as NERC). The original NERC was formed on June 1, 1968, by the electric utility industry to promote the reliability and adequacy of bulk power transmission in the electric utility systems of North America. NERC's mission states that it is to "ensure that the bulk power system in North America is reliable." NERC oversees eight regional reliability entities and encompasses all of the interconnected power systems of the contiguous United States, Canada and a portion of Baja California in Mexico. The contiguous United States power transmission grid consists of 300,000 km of lines operated by 500 companies.

NERC's major responsibilities include working with all stakeholders to develop standards for power system operation, monitoring and enforcing compliance with those standards, assessing resource adequacy, and providing educational and training resources as part of an accreditation program to ensure power system operators remain qualified and proficient. NERC also investigates and analyzes the causes of significant power system disturbances in order to help prevent future events, and provides for critical infrastructure protection (NERC CIP).

Origins of NERC

Early electric power systems, such as those installed by George Westinghouse and Thomas Edison prior to the turn of the century, were isolated central stations which served small pockets of customers independently of each other. As some of these power systems grew to cover larger geographic areas, it became possible to connect previously isolated systems, and by 1915 interconnections began to flourish and grow in size. This allowed neighboring systems to share generation and voltage stability resources, providing mutual benefit to each side. However, tying power systems together with these early interconnections also introduced the risk that a single significant disturbance could collapse all of the systems tied to the interconnection. Generally it was decided that the benefits outweighed the risks. By the end of the 1960s there were virtually no isolated power systems remaining in the lower forty-eight states and southern Canada; practically all power companies were attached to large interconnections.

In 1962, when the Eastern Interconnection was established in its current form, the Interconnected Systems Group (composed of Southern and Midwestern utility companies), the PJM Interconnection, and the Canada-United States Eastern Interconnection (CANUSE) formed the Interconnection Coordination Committee to recommend an informal operations structure.

On November 9, 1965, a relatively minor system disturbance triggered a power system protection component that was not properly configured. The interconnection was operating near peak capacity due to the extreme cold weather and high heating demand, and was therefore more vulnerable than usual. The small initial outage quickly cascaded into the Northeast Blackout of 1965. This disturbance revealed the extent to which interconnections had evolved without adequate high-level planning and operating oversight to try to prevent such events. Interconnected power systems frequently had varying operating standards and procedures, developed somewhat independently by each member on the interconnection, and power system protection schemes were often designed with only a local power system's design in mind, meaning that they might misoperate in response to protection schemes activating in neighboring systems. Restoration efforts were also partially hampered due to the lack of common practices and coordination procedures. The blackout thus revealed the necessity to develop common operating and protection standards as well as plans to effectively coordinate power system restoration efforts, and it led to the formation of the North American Power Systems Interconnection Committee (NAPSIC). NAPSIC eventually grew to also include the Texas Interconnection and most of the companies in what is today the Western Electricity Coordinating Council (WECC).

The Electric Reliability Act of 1967, passed due to the political pressure and fallout from the 1965 blackout, was a significant turning point in the arena of electric reliability in North America. Initially, ten regional reliability councils were created by groups of interconnected power systems, which collectively covered the entire footprints of the major North American interconnections. NERC was then formed as a more formalized successor to NAPSIC, to spearhead reliability efforts and assist the regional councils by developing common operating policies and procedures as well as training resources and requirements.

NERC today

Out of its long history, NERC developed a complex committee structure which brings together hundreds of industry expert volunteers in nearly 50 committees, sub-committees, task forces, and working groups considering issues from wind and renewable power integration to education to demand-side management and energy efficiency. Although significant disturbances continued to occasionally occur, such as the New York City blackout of 1977, NERC undoubtedly played a significant role in minimizing the impact and frequency of these events. It is difficult to quantify this success because it is impossible to know how many disturbances were prevented by the influence of NERC and the reliability councils. NERC's role in raising awareness of reliability issues and creating the impetus to address them is intended to improve reliability every day.

With the passage of the Energy Policy Act of 2005, an "Electric Reliability Organization" was created to develop and enforce compliance with mandatory reliability standards in the U.S. This non-governmental "self-regulatory organization" was created in recognition of the interconnected and international nature of the bulk power grid. In 2006, NERC applied for and was granted this designation. Today, NERC's standards are mandatory and enforceable throughout the 50 United States and several provinces in Canada. Entities in the U.S. found to be in violation of a standard can be subject to fines of up to $1 million per day per violation.

Interconnections and Reliability Councils

(Map: the two major and three minor NERC Interconnections, and the nine NERC Regional Reliability Councils.)

Major interconnections
• The Eastern Interconnection covers most of eastern North America, extending from the foot of the Rocky Mountains to the Atlantic seaboard, excluding most of Texas. It is tied to the Western Interconnection via high voltage DC transmission facilities and also has ties to non-NERC systems in northern Canada. The reliability councils within the Eastern Interconnection are:
  • Florida Reliability Coordinating Council (FRCC)
  • Midwest Reliability Organization (MRO)
  • Northeast Power Coordinating Council (NPCC)
  • ReliabilityFirst Corporation (RFC)
  • SERC Reliability Corporation (SERC)
  • Southwest Power Pool, Inc. (SPP)
• The Western Interconnection covers most of western North America, from the Rocky Mountains to the Pacific coast. It is tied to the Eastern Interconnection at six points, and also has ties to non-NERC systems in northern Canada and northwestern Mexico. The reliability council for the Western Interconnection is:
  • Western Electricity Coordinating Council (WECC)

Minor interconnections
• The Texas Interconnection covers most of the state of Texas. It is tied to the Eastern Interconnection at two points, and also has ties to non-NERC systems in Mexico. The reliability council for the Texas Interconnection is:
  • Electric Reliability Council of Texas (ERCOT)
• The Québec Interconnection covers the province of Québec and is tied to the Eastern Interconnection at two points. Despite being a functionally separate interconnection, the Québec Interconnection is often considered to be part of the Eastern Interconnection. The reliability council for the Québec Interconnection is:
  • Northeast Power Coordinating Council (NPCC)
• The Alaska Interconnection covers a portion of the state of Alaska and is not tied to any other interconnections. Due to its isolated nature, the Alaska Interconnection is not generally counted among North America's interconnections. The reliability council for the Alaska Interconnection is:
  • Alaska Systems Coordinating Council (ASCC), an affiliate member of NERC

NERC Authority

As part of the fallout of the Northeast Blackout of 2003, the Energy Policy Act of 2005 authorized the Federal Energy Regulatory Commission (FERC) to designate a national Electric Reliability Organization (ERO). On July 20, 2006, FERC issued an order certifying NERC as the ERO for the United States. Prior to becoming the national ERO, NERC's guidelines for power system operation and accreditation were referred to as Policies, for which compliance was strongly encouraged yet ultimately voluntary. NERC has worked with all stakeholders over the past several years to revise its Policies into Standards, and now has authority to enforce those standards on power system entities operating in the United States, by way of significant financial penalties for noncompliance. Efforts between NERC and the Canadian and Mexican governments are underway to obtain comparable authority for NERC to enforce its standards on the NERC member systems residing outside of the United States.

Cyber warfare

In April 2009, NERC issued a public notice warning that the electrical grid is not adequately protected from cyber warfare.[1]

References
[1] NERC Public Notice (http://online.wsj.com/public/resources/documents/CIP-002-Identification-Letter-040609.pdf)

External links
• Home Page (http://www.nerc.com)
• List of NERC Regional Reliability Councils (http://www.nerc.com/page.php?cid=3|23)
• Reliability Report Warns of Transmission Needs with Wind Power Booming (http://apps1.eere.energy.gov/news/news_detail.cfm/news_id=12112) (USDOE)
• Compliance Roadmap for the Power Industry (http://www.energypulse.net/centers/article/article_display.cfm?a_id=2033)
• Reporting and Compliance with FERC and NERC: Best Practices for Meeting the Challenge (http://insiderresearch.wispubs.com)

Payment Card Industry Data Security Standard

The Payment Card Industry Data Security Standard (PCI DSS) is an information security standard for organizations that handle cardholder information for the major debit, credit, prepaid, e-purse, ATM, and POS cards. Defined by the Payment Card Industry Security Standards Council, the standard was created to increase controls around cardholder data to reduce credit card fraud via its exposure. Validation of compliance is done annually: by an external Qualified Security Assessor (QSA) for organisations handling large volumes of transactions, or by Self-Assessment Questionnaire (SAQ) for companies handling smaller volumes.[1]

Requirements

The current version of the standard is version 2.0, released on 26 October 2010. PCI DSS version 2.0 must be adopted by all organisations with payment card data by 1 January 2011, and from 1 January 2012 all assessments must be under version 2.0. Version 2.0 has two (2) new or evolving requirements out of 132 changes; the remaining changes and enhancements fall under the category of clarification or additional guidelines, and the table below summarizes the differing points from version 1.2.[2] Version 1.2 of 1 October 2008[3] specifies the 12 requirements for compliance, organized into six logically-related groups, which are called "control objectives".

Control Objectives and PCI DSS Requirements

Build and Maintain a Secure Network
1. Install and maintain a firewall configuration to protect cardholder data
2. Do not use vendor-supplied defaults for system passwords and other security parameters

Protect Cardholder Data
3. Protect stored cardholder data
4. Encrypt transmission of cardholder data across open, public networks

Maintain a Vulnerability Management Program
5. Use and regularly update anti-virus software on all systems commonly affected by malware
6. Develop and maintain secure systems and applications

Implement Strong Access Control Measures
7. Restrict access to cardholder data by business need-to-know
8. Assign a unique ID to each person with computer access
9. Restrict physical access to cardholder data

Regularly Monitor and Test Networks
10. Track and monitor all access to network resources and cardholder data
11. Regularly test security systems and processes

Maintain an Information Security Policy
12. Maintain a policy that addresses information security

History

PCI DSS originally began as five different programs: Visa Card Information Security Program, MasterCard Site Data Protection, American Express Data Security Operating Policy, Discover Information and Compliance, and the JCB Data Security Program. Each company's intentions were roughly similar: to create an additional level of protection for card issuers by ensuring that merchants meet minimum levels of security when they store, process and transmit cardholder data. On 15 December 2004, these companies aligned their individual policies and released the Payment Card Industry Data Security Standard (PCI DSS). In September 2006, the Payment Card Industry Security Standards Council (PCI SSC) was formed and the PCI standard was updated to version 1.1 to provide clarification and minor revisions to version 1.0. Version 1.2 was released on October 1, 2008.[4] Version 1.1 "sunsetted" on December 31, 2008.[5] Version 1.2 did not change requirements, only enhanced clarity, improved flexibility, and addressed evolving risks and threats. In August 2009 the PCI SSC announced[6] the move from version 1.2 to version 1.2.1 for the purpose of making minor corrections designed to create more clarity and consistency among the standards and supporting documents.

Updates and supplemental information

The PCI SSC has released several supplemental pieces of information to clarify various requirements. These documents include the following:
• Information Supplement: Requirement 11.3 Penetration Testing[7]
• Information Supplement: Requirement 6.6 Code Reviews and Application Firewalls Clarified[8]
• Navigating the PCI DSS - Understanding the Intent of the Requirements[9]
• Information Supplement: PCI DSS Wireless Guidelines[10]
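For readers who track compliance programmatically, the twelve requirements above group naturally by control objective. The structure below is an illustrative sketch only, not an official machine-readable form of the standard:

```python
# Illustrative sketch: the six PCI DSS control objectives and their twelve
# requirements, as listed above, held in a structure that a simple
# compliance-checklist tool might iterate over.
PCI_DSS_V1_2 = {
    "Build and Maintain a Secure Network": [
        "Install and maintain a firewall configuration to protect cardholder data",
        "Do not use vendor-supplied defaults for system passwords and other security parameters",
    ],
    "Protect Cardholder Data": [
        "Protect stored cardholder data",
        "Encrypt transmission of cardholder data across open, public networks",
    ],
    "Maintain a Vulnerability Management Program": [
        "Use and regularly update anti-virus software on all systems commonly affected by malware",
        "Develop and maintain secure systems and applications",
    ],
    "Implement Strong Access Control Measures": [
        "Restrict access to cardholder data by business need-to-know",
        "Assign a unique ID to each person with computer access",
        "Restrict physical access to cardholder data",
    ],
    "Regularly Monitor and Test Networks": [
        "Track and monitor all access to network resources and cardholder data",
        "Regularly test security systems and processes",
    ],
    "Maintain an Information Security Policy": [
        "Maintain a policy that addresses information security",
    ],
}

# Flatten in order to recover the familiar 1-12 numbering for reporting.
numbered = [req for reqs in PCI_DSS_V1_2.values() for req in reqs]
assert len(numbered) == 12
```

This mirrors the point made in the criticisms section below the surface of the twelve headline requirements: each of these lines expands into many sub-requirements in the actual standard.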

Compliance versus validation of compliance

Although the PCI DSS must be implemented by all entities that process, store or transmit cardholder data, validation of compliance is not required for all entities. Currently both Visa and MasterCard require only merchants and service providers to be validated according to the PCI DSS. Issuing and acquiring banks are not required to go through PCI DSS validation, although in the event of a security breach a compromised bank or entity that is not PCI DSS certified (validated by a Qualified Security Assessor) will be subject to additional card scheme penalties.

Compliance and wireless LANs

In July 2009, the Payment Card Industry Security Standards Council published wireless guidelines[10] for PCI DSS recommending the use of a Wireless Intrusion Prevention System (WIPS) to automate wireless scanning for large organisations.[11] These guidelines apply to the deployment of wireless LAN (WLAN) technology in cardholder data environments, also known as CDEs. A CDE is defined as a network environment that possesses or transmits credit card data. The wireless guidelines clearly define how wireless security applies to PCI DSS 1.2 compliance; the key sections of PCI DSS 1.2 that are relevant for wireless security are classified and defined below.

Wireless LAN and CDE classification

PCI DSS wireless guidelines classify CDEs into three scenarios depending on how wireless LANs are deployed:
• No known WLAN AP inside or outside the CDE: The organisation has not deployed any WLAN AP. In this scenario, three minimum scanning requirements (Sections 11.1, 11.4 and 12.9) of the PCI DSS apply.
• Known WLAN AP outside the CDE: The organisation has deployed WLAN APs outside the CDE, segmented from the CDE by a firewall, and there are no known WLAN APs inside the CDE. In this scenario, three minimum scanning requirements (Sections 11.1, 11.4 and 12.9) of the PCI DSS apply.
• Known WLAN AP inside the CDE: The organisation has deployed WLAN APs inside the CDE. In this scenario, three minimum scanning requirements (Sections 11.1, 11.4 and 12.9) of the PCI DSS apply, as well as six secure deployment requirements (Sections 2.1.1, 4.1.1, 9.1.3, 10.5.4, 10.6 and 12.3).

Secure deployment requirements for wireless LANs

These secure deployment requirements apply only to those organisations that have a known WLAN AP inside the CDE. The purpose of these requirements is to deploy WLAN APs with proper safeguards.
• Section 2.1.1 Change Defaults: Change default passwords and SSIDs on wireless devices. Enable WPA or WPA2 security.
• Section 4.1.1 802.11i Security: Set up APs in WPA or WPA2 mode with 802.1X authentication and AES encryption. Use of WEP in the CDE is not allowed after June 30, 2010.
• Section 9.1.3 Physical Security: Restrict physical access to known wireless devices.
• Section 10.5.4 Wireless Logs: Archive wireless access logs centrally using a WIPS for one year.
• Section 10.6 Log Review: Review wireless access logs daily.
• Section 12.3 Usage Policies: Develop usage policies for the use of wireless devices, and list all wireless devices regularly.
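Read operationally, the classification above is a small decision rule: the three minimum scanning requirements always apply, and the six secure deployment requirements are added only when a known WLAN AP sits inside the CDE. A minimal sketch in Python (the function and parameter names are illustrative, not from the guidelines):

```python
# Illustrative mapping of the three CDE wireless scenarios to the PCI DSS
# sections relevant for wireless security, per the guidelines summarized above.
MINIMUM_SCANNING = ["11.1", "11.4", "12.9"]            # apply in every scenario
SECURE_DEPLOYMENT = ["2.1.1", "4.1.1", "9.1.3", "10.5.4", "10.6", "12.3"]

def applicable_sections(ap_inside_cde, ap_outside_cde):
    """Return the wireless-relevant PCI DSS sections for a given CDE.

    The minimum scanning requirements always apply. The six secure
    deployment requirements additionally apply only when a known WLAN AP
    is deployed inside the CDE; APs outside the CDE (segmented by a
    firewall) do not change which sections apply.
    """
    sections = list(MINIMUM_SCANNING)
    if ap_inside_cde:
        sections += SECURE_DEPLOYMENT
    return sections

# No APs anywhere, or APs only outside the CDE: scanning requirements only.
assert applicable_sections(False, False) == ["11.1", "11.4", "12.9"]
assert applicable_sections(False, True) == ["11.1", "11.4", "12.9"]
```

Note that `ap_outside_cde` deliberately has no effect on the result; the scenarios differ only in whether scanning must additionally verify the firewall segmentation.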

Minimum scanning requirements for wireless LANs

These minimum scanning requirements apply to all organisations regardless of the type of wireless LAN deployment in the CDE. The purpose of these requirements is to eliminate any rogue or unauthorized WLAN activity inside the CDE. Sampling of sites is not allowed.
• Section 11.1 Quarterly Wireless Scan: Scan all sites with CDEs, whether or not they have known WLAN APs in the CDE. A WIPS is recommended for large organisations, since it is not possible to manually scan or conduct a walk-around wireless security audit[12] of all sites on a quarterly basis.
• Section 11.4 Monitor Alerts: Enable automatic WIPS alerts to instantly notify personnel of rogue devices and unauthorized wireless connections into the CDE.
• Section 12.9 Eliminate Threats: Prepare an incident response plan to monitor and respond to alerts from the WIPS. Enable the automatic containment mechanism on the WIPS to block rogues and unauthorized wireless connections.

Wireless Intrusion Prevention System (WIPS) implementations

Wireless Intrusion Prevention Systems are a possible option for compliance with some PCI DSS requirements, and can be implemented in either an internally hosted or externally hosted Software as a Service (SaaS) model.[13][14] The network implementation is an on-site deployment of WIPS within a private network. Such a deployment is viable, but the significant costs have been thought to lead some companies to avoid WIPS deployments. The hosted implementation is offered in an on-demand, subscription-based SaaS model.[15] Hosted implementations are said to be particularly cost-effective[16] for organisations looking to fulfill only the minimum scanning requirements for PCI DSS compliance (AirMinder[17]).[18]

PCI compliance in call centers

While the PCI DSS standards are very explicit about the requirements for the back-end storage and access of PII (personally identifiable information), the Payment Card Industry Security Standards Council has said very little about the collection of that information on the front end, whether through websites, Interactive Voice Response systems or call center agents.[19] This is surprising, given the high threat potential for credit card fraud and data compromise that call centers pose.

In a call center, customers read their credit card information, CVV codes, and expiration dates to call center agents. There are few controls which prevent the agent from skimming (credit card fraud) this information with a recording device, a computer, or a physical note pad. Moreover, almost all call centers deploy some kind of call recording software, which captures and stores all of this sensitive consumer data. These recordings are accessible by a host of call center personnel, are often unencrypted, and generally do not fall under the PCI DSS standards outlined here.[20] Home-based telephone agents pose an additional level of challenges, requiring the company to secure the channel from the home-based agent through the call center hub to the retailer applications.[21]

To address some of these concerns, on January 22, 2010 the Payment Card Industry Security Standards Council issued a revised FAQ about call center recordings.[22] The bottom line is that companies can no longer store digital recordings that include CVV information if those recordings can be queried.

Technology solutions can completely prevent skimming (credit card fraud) by agents. At the point in the transaction where the agent needs to collect the credit card information, the call can be transferred to an Interactive Voice Response system, but this can create an awkward customer interaction. Newer solutions allow the agent to "collect" the credit card information without ever seeing or hearing it: the agent remains on the phone while customers enter their credit card information directly into the Customer Relationship Management software using their phones. The DTMF tones are converted to monotones so the agent cannot recognize them and so that they cannot be recorded.[23] This protects the sensitive information.[24] The benefits of increasing the security around the collection of personally identifiable information go beyond credit card fraud to include helping merchants win chargebacks due to friendly fraud.[25]
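The DTMF-masking approach above keeps card digits out of recordings entirely. A complementary, and much weaker, safeguard sometimes applied to already-stored transcripts is to redact anything that looks like a card number before the text is retained. The sketch below is purely illustrative and is not prescribed by PCI DSS; it uses the standard Luhn checksum (the check digit scheme used on payment card numbers) to reduce false positives, and keeps only the last four digits:

```python
import re

def luhn_valid(digits):
    """Standard Luhn checksum used on payment card numbers."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:   # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def redact_pans(text):
    """Replace Luhn-valid 13-19 digit runs with asterisks, keeping the last 4."""
    def mask(match):
        run = match.group(0)
        digits = re.sub(r"\D", "", run)
        if 13 <= len(digits) <= 19 and luhn_valid(digits):
            return "*" * (len(digits) - 4) + digits[-4:]
        return run  # not a plausible card number; leave untouched
    # Runs of 13-19 digits, optionally separated by spaces or dashes.
    return re.sub(r"(?:\d[ -]?){12,18}\d", mask, text)
```

Shorter digit runs such as phone numbers are never matched, and non-Luhn runs pass through unchanged, so the filter can run over full transcripts. Note this does nothing for CVV codes, which are too short to fingerprint this way; that is why the council's FAQ targets the recordings themselves.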

testifying before a U. Ellen Richey. Much of this confusion is a result of the 2008 Heartland Payment Processing Systems breach. wherein more than one hundred million card numbers were compromised."[30] However.[31] Around this same time Hannaford Brothers[32] and TJX Companies were similarly breached as a result of the alleged very same source of coordinated efforts of Albert "Segvec" Gonzalez and two unnamed Russian hackers. criticism lies in that compliance validation is required only for Level 1-3 merchants and may be optional for Level 4 depending on the card brand and acquirer. Michael Jones..Payment Card Industry Data Security Standard of increasing the security around the collection of personally identifiable information goes beyond credit card fraud to include helping merchants win chargebacks due to friendly fraud. both in their interpretation and in their enforcement." . more substantial.[25] 189 Controversies and criticisms It has been suggested by some IT security professionals that the PCI DSS does little more than provide a minimal baseline for security. Look at online application vulnerabilities. confusing to comply with.[34] Therefore.. in fact.[33] Assessments examine the compliance of merchants and services providers with the PCI DSS at a specific point in time and frequently utilize a sampling methodology to allow compliance to be demonstrated through representative systems and processes.[of] specificity and high-level concepts" that allows "stakeholders the opportunity and flexibility to work with Qualified Security Assessors (QSAs) to determine appropriate security controls within their environment that meet the intent of the PCI standards. the European Data Protection Act."[27] In contrast.." . In fact there are over 220 sub-requirements.no compromised entity has yet been found to be in compliance with PCI DSS at the time of a breach. Congress subcommittee regarding the PCI DSS. 
some of which can place an incredible burden on a retailer and many of which are subject to interpretation. Other.. Visa's compliance validation details for merchants . and sells more products and services. It is often stated that there are only twelve “Requirements” for PCI compliance." [29] Compliance and Compromises Per Visa Chief Enterprise Risk Officer. even if minimum standards are not enough to completely eradicate security problems. Regulation forces companies to take security more seriously. per PCI Council General Manager Bob Russo's response to the NRF: PCI is a structured "blend. received its PCI DSS compliance validation one day after it had been made aware of a two-month long compromise of its internal systems. It is the responsibility of the merchant and service provider to achieve. "The fact is you can be PCI-compliant and still be insecure. and maintain their compliance at all times both throughout the annual validation/assessment cycle and across all systems and processes in their entirety. albeit in this case having not been identified by the assessor. demonstrate. And it works...Greg Reber[26] Additionally.[35] fail to appropriately assign blame in their blasting of the standard itself as flawed as opposed to the more truthful breakdown in merchant and service provider compliance with the written standard. these frequently cited breaches and their pointed use as a tool for criticism even to the point of noting that Hannaford Brothers had.. CIO of Michaels' Stores. ". it has nevertheless become a common misconception that companies have had security breaches while also being PCI DSS compliant. HIPAA. the credit-card industry's PCI. the various disclosure laws. "Regulation--SOX. and for good reason — exposures in customer-facing applications pose a real danger of a security breach. whatever--has been the best stick the industry has found to beat companies over the head with.. They're arguably the fastest growing area of security. 
and ultimately subjective. GLBA. others have suggested that PCI DSS is a step toward making all businesses pay more attention to IT security.S. says "(.the PCI DSS requirements.Bruce Schneier[28] Further.) are very expensive to implement.

Visa's compliance validation details for merchants state that Level 4 merchants' compliance validation requirements are set by the acquirer.[36] Visa Level 4 merchants are "Merchants processing less than 20,000 Visa e-commerce transactions annually and all other merchants processing up to 1 million Visa transactions annually". At the same time, 80% of payment card compromises since 2005 have affected Level 4 merchants.[37]

Compliance as a Snapshot

The state of being PCI DSS compliant might appear to have some temporal persistence, at least from a merchant point of view. However, the PCI Standards Council General Manager Bob Russo has indicated that liabilities could change depending on the state of a given organisation at the point in time when an actual breach occurs.[38]

Costs

Similar to other industries, a secure state could be more costly to some organisations than accepting and managing the risk of confidentiality breaches. In contrast, many studies have shown that this cost is justifiable.[39]

References

[1] Sidel, Robin (2007-09-22). "In Data Leaks, Culprits Often Are Mom, Pop" (http://online.wsj.com/article/SB119042666704635941.html?mod=sphere_ts). The Wall Street Journal. Retrieved 2008-02-13.
[2] http://grc360.net/cms/2010/pci-dss-ver-2-0-quick/
[3] PCI DSS - PCI Security Standards Council (https://www.pcisecuritystandards.org/security_standards/pci_dss.shtml)
[4] PCI Security Standards Council Releases Version 1.2 of PCI Data Security Standard (https://www.pcisecuritystandards.org/pdfs/pr_080930_PCIDSSv1-2.pdf)
[5] Supporting Documents, PCI DSS (https://www.pcisecuritystandards.org/security_standards/supporting_documents_home.shtml)
[6] https://www.pcisecuritystandards.org/pdfs/statement_090810_minor_corrections_to_standards.pdf
[7] Information Supplement: Requirement 11.3 Penetration Testing (https://www.pcisecuritystandards.org/documents/information_supplement_11.3.pdf)
[8] Information Supplement: Requirement 6.6 Code Reviews and Application Firewalls Clarified (https://www.pcisecuritystandards.org/pdfs/infosupp_6_6_applicationfirewalls_codereviews.pdf)
[9] Navigating the PCI DSS - Understanding the Intent of the Requirements (https://www.pcisecuritystandards.org/pdfs/pci_dss_saq_navigating_dss.pdf)
[10] "PCI DSS Wireless Guidelines" (https://www.pcisecuritystandards.org/pdfs/PCI_DSS_Wireless_Guidelines.pdf). Retrieved 2009-07-16.
[11] "Don't Let Wireless Detour your PCI Compliance" (http://www.airtightnetworks.com/fileadmin/pdf/whitepaper/PCI_Wireless_Whitepaper.pdf). Retrieved 2009-07-22.
[12] "Walk Around Wireless Security Audits – The End Is Near" (http://www.airtightnetworks.com/fileadmin/pdf/whitepaper/WP_WalkAroundWireless.pdf). Retrieved 2009-07-22.
[13] "Webinar on Wireless Security as SaaS by Gartner Analyst John Pescatore" (http://www.airtightnetworks.com/fileadmin/content_images/news/webinars/SaaS/player.html).
[14] "Saas offerings for wireless pci compliance" (http://www.infosecurity-us.com/view/9661/comment-saas-offerings-for-wireless-pci-compliance/). Retrieved 2010-05-25.
[15] "Security SaaS hits WLAN community" (http://www.networkworld.com/newsletters/wireless/2008/040708wireless1.html). Retrieved 2008-04-07.
[16] "New Low-Cost Wireless PCI Scanning Services, New Offerings Satisfy PCI DSS Requirements" (http://newsblaze.com/story/2009072205011500038.mwir/topstory.html). Retrieved 2009-07-22.
[17] http://www. ... com
[18] "Big-Time Wireless Security - As a Service" (http://www.networkworld.com/community/node/26755). Retrieved 2009-07-22.
[19] "Overseas credit card scam exposed" (http://news.bbc.co.uk/2/hi/uk_news/7953401.stm). BBC News. Retrieved 2008-04-08.
[20] "PCI Compliance in the Call Center a Headache for Many" (http://searchcrm.techtarget.com/news/2240031378/PCI-compliance-in-the-call-center-a-headache-for-many). Retrieved 2011-01-28.
[21] "PCI Compliance: What it Means to the Call Center Industry" (http://callcenterinfo.tmcnet.com/analysis/articles/20732-pci-compliance-what-it-means-the-call-center.htm).
[22] "Call Center FAQ Significantly Changes" (http://pciguru.wordpress.com/2010/01/25/call-center-faq-significantly-changes/). Retrieved 2010-01-25.
[23] "Restructuring the Contact Center for PCI Compliance" (http://callcenterinfo.tmcnet.com/analysis/articles/45010-restructuring-contact-center-pci-compliance.htm).
[24] "PCI Compliance with CallGuard" (http://www.elitetele.com/PCI-Compliance/). Retrieved 2010-07-19.
[25] Adsit, Dennis (February 21, 2011). "Error-proofing strategies for managing call center fraud" (http://www.isixsigma.com/index.php?option=com_k2&view=item&id=1854&Itemid=1&Itemid=1). isixsigma.com.
[26] "PCI compliance falls short of assuring website security" (http://searchsoftwarequality.techtarget.com/news/column/0,294698,sid92_gci1335662,00.html).
[27] Jones, Michael (2009-03-31). "TESTIMONY OF MICHAEL JONES BEFORE THE EMERGING THREATS, CYBERSECURITY, AND SCIENCE AND TECHNOLOGY SUBCOMMITTEE" (http://homeland.house.gov/SiteDocuments/20090331142012-77196.pdf). Congress of the United States. Retrieved 2009-05-04.
[28] "Bruce Schneier reflects on a decade of security trends" (http://searchsecurity.techtarget.com.au/contents/21998-Bruce-Schneier-reflects-on-a-decade-of-security-trends). Retrieved 2010-10-19.
[29] Russo, Bob (2009-06-15). "Letter to NRF" (https://www.pcisecuritystandards.org/pdfs/statement090615_letter_to_nrf.pdf). PCI Council.
[30] Vijayan, Jaikumar (2009). "Visa: Post-breach criticism of PCI standard misplaced" (http://www.cso.com.au/article/296278/visa_post-breach_criticism_pci_standard_misplaced). Retrieved 2010-10-19.
[31] "Heartland data breach sparks security concerns in payment industry" (http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9126608). Computerworld. Retrieved 2009-02-15.
[32] McGlasson, Linda (2008-04-04). "Hannaford Data Breach May Be Top of Iceberg" (http://www.bankinfosecurity.com/articles.php?art_id=810). BankInfo Security. Retrieved 2009-01-28.
[33] Goodin, Dan (2009). "TJX suspect indicted in Heartland, Hannaford breaches" (http://www.theregister.co.uk/2009/08/17/heartland_payment_suspect/). The Register. Retrieved 2009-02-15.
[34] Spier, Peter (2010-03-22). "The QSA's Perspective: PCI Compliance Risk Abounds" (http://blogs.bankinfosecurity.com/posts.php?postID=492). BankInfo Security. Retrieved 2010-10-19.
[35] Vijayan, Jaikumar (2009). "PCI security standard gets ripped at House hearing" (http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9130901&intsrc=news_ts_head). Computerworld Security. Retrieved 2009-02-15.
[36] Visa Merchant levels: http://usa.visa.com/merchants/risk_management/cisp_merchants.html
[37] Pastor, Adrian (2009). "A Pentester's Guide to Credit Card Theft Techniques" (http://2009.confidence.org.pl/materialy/prezentacje/adrian_pastor_confidence_2009.pdf).
[38] "Q and A: Head of PCI council sees security standard as solid, despite breaches" (http://www.computerworld.com/action/article.do?command=viewArticleBasic&taxonomyName=Financial&articleId=9078059). Computerworld.
[39] "PCI Cost Analysis Report: A Justified Expense" (http://www.solidcore.com/assets/PCI_Cost_Analysis.pdf). Solidcore Systems.

Books on PCI DSS

• "PCI DSS Handbook" (ISBN 9780470260463)
• "PCI DSS: A Practical Guide to Implementation" (ISBN 9781849280235)
• "PCI Compliance: Understand and Implement Effective PCI Data Security Standard Compliance" (ISBN 9781597494991)

Updates on PCI DSS v1.2

• Summary of Changes (https://www.pcisecuritystandards.org/pdfs/pci_dss_summary_of_changes_v1-2.pdf)
• Summary of Changes FAQ (https://www.pcisecuritystandards.org/pdfs/pci_dss_summary_of_changes_faqs_v1-2.pdf)
• PCI DSS 1.2 Announcement, Oct. 1, 2008 (https://www.pcisecuritystandards.org/pdfs/pr_080930_PCIDSSv1-2.pdf)



Updates on PCI DSS v2.0
• Summary of Changes (https://www.pcisecuritystandards.org/pdfs/summary_of_changes_highlights.pdf)
• PCI DSS 2.0 Announcement, Aug. 12, 2010 (https://www.pcisecuritystandards.org/pdfs/pr_100810_summary_changes.pdf)

External links
• PCI DSS Standard (https://www.pcisecuritystandards.org/security_standards/pci_dss.shtml)
• PCI Quick Reference Guide (https://www.pcisecuritystandards.org/pdfs/pci_ssc_quick_guide.pdf)

Sarbanes–Oxley Act
The Sarbanes–Oxley Act of 2002 (Pub.L. 107-204 [1], 116 Stat. 745, enacted July 30, 2002), also known as the 'Public Company Accounting Reform and Investor Protection Act' (in the Senate) and the 'Corporate and Auditing Accountability and Responsibility Act' (in the House), and commonly called Sarbanes–Oxley, Sarbox or SOX, is a United States federal law that set new or enhanced standards for all U.S. public company boards, management and public accounting firms. It is named after sponsors U.S. Senator Paul Sarbanes (D-MD) and U.S. Representative Michael G. Oxley (R-OH).

Sen. Paul Sarbanes (D–MD) and Rep. Michael G. Oxley (R–OH-4), the co-sponsors of the Sarbanes–Oxley Act.

The bill was enacted as a reaction to a number of major corporate and accounting scandals including those affecting Enron, Tyco International, Adelphia, Peregrine Systems and WorldCom. These scandals, which cost investors billions of dollars when the share prices of affected companies collapsed, shook public confidence in the nation's securities markets. It does not apply to privately held companies. The act contains 11 titles, or sections, ranging from additional corporate board responsibilities to criminal penalties, and requires the Securities and Exchange Commission (SEC) to implement rulings on requirements to comply with the new law. Harvey Pitt, the 26th chairman of the SEC, led the SEC in the adoption of dozens of rules to implement the Sarbanes–Oxley Act. It created a new, quasi-public agency, the Public Company Accounting Oversight Board, or PCAOB, charged with overseeing, regulating, inspecting and disciplining accounting firms in their roles as auditors of public companies. The act also covers issues such as auditor independence, corporate governance, internal control assessment, and enhanced financial disclosure.
The act was approved by the House by a vote of 423 in favor, 3 opposed, and 8 abstaining,[2] and by the Senate with a vote of 99 in favor, 1 abstaining.[3] President George W. Bush signed it into law, stating it included "the most far-reaching reforms of American business practices since the time of Franklin D. Roosevelt."[4]

Debate continues over the perceived benefits and costs of SOX. Supporters contend the legislation was necessary and has played a useful role in restoring public confidence in the nation's capital markets by, among other things, strengthening corporate accounting controls. Opponents of the bill claim it has reduced America's international competitive edge against foreign financial service providers, saying SOX has introduced an overly complex regulatory environment into U.S. financial markets.[5] Proponents of the measure say that SOX has been a "godsend" for improving the confidence of fund managers and other investors with regard to the veracity of corporate financial statements.[6]



Sarbanes–Oxley contains 11 titles that describe specific mandates and requirements for financial reporting. Each title consists of several sections, summarized below.

1. Public Company Accounting Oversight Board (PCAOB)
Title I consists of nine sections and establishes the Public Company Accounting Oversight Board, to provide independent oversight of public accounting firms providing audit services ("auditors"). It also creates a central oversight board tasked with registering auditors, defining the specific processes and procedures for compliance audits, inspecting and policing conduct and quality control, and enforcing compliance with the specific mandates of SOX.

2. Auditor Independence
Title II consists of nine sections and establishes standards for external auditor independence, to limit conflicts of interest. It also addresses new auditor approval requirements, audit partner rotation, and auditor reporting requirements. It restricts auditing companies from providing non-audit services (e.g., consulting) for the same clients.

3. Corporate Responsibility
Title III consists of eight sections and mandates that senior executives take individual responsibility for the accuracy and completeness of corporate financial reports. It defines the interaction of external auditors and corporate audit committees, and specifies the responsibility of corporate officers for the accuracy and validity of corporate financial reports. It enumerates specific limits on the behaviors of corporate officers and describes specific forfeitures of benefits and civil penalties for non-compliance. For example, Section 302 requires that the company's "principal officers" (typically the Chief Executive Officer and Chief Financial Officer) certify and approve the integrity of their company financial reports quarterly.[7]

4. Enhanced Financial Disclosures
Title IV consists of nine sections. It describes enhanced reporting requirements for financial transactions, including off-balance-sheet transactions, pro-forma figures and stock transactions of corporate officers. It requires internal controls for assuring the accuracy of financial reports and disclosures, and mandates both audits and reports on those controls. It also requires timely reporting of material changes in financial condition and specific enhanced reviews by the SEC or its agents of corporate reports.

5. Analyst Conflicts of Interest
Title V consists of only one section, which includes measures designed to help restore investor confidence in the reporting of securities analysts. It defines the codes of conduct for securities analysts and requires disclosure of knowable conflicts of interest.

6. Commission Resources and Authority
Title VI consists of four sections and defines practices to restore investor confidence in securities analysts. It also defines the SEC's authority to censure or bar securities professionals from practice and defines conditions under which a person can be barred from practicing as a broker, advisor, or dealer.

7. Studies and Reports
Title VII consists of five sections and requires the Comptroller General and the SEC to perform various studies and report their findings. Studies and reports include the effects of consolidation of public accounting firms, the role of credit rating agencies in the operation of securities markets, securities violations and enforcement actions, and whether investment banks assisted Enron, Global Crossing and others to manipulate earnings and obfuscate true financial conditions.

8. Corporate and Criminal Fraud Accountability
Title VIII consists of seven sections and is also referred to as the "Corporate and Criminal Fraud Accountability Act of 2002". It describes specific criminal penalties for manipulation, destruction or alteration of financial records or other interference with investigations, while providing certain protections for whistle-blowers.

9. White Collar Crime Penalty Enhancement
Title IX consists of six sections and is also called the "White Collar Crime Penalty Enhancement Act of 2002". It increases the criminal penalties associated with white-collar crimes and conspiracies, recommends stronger sentencing guidelines, and specifically adds failure to certify corporate financial reports as a criminal offense.

10. Corporate Tax Returns
Title X consists of one section. Section 1001 states that the Chief Executive Officer should sign the company tax return.

11. Corporate Fraud Accountability
Title XI consists of seven sections. Section 1101 recommends a name for this title as the "Corporate Fraud Accountability Act of 2002". It identifies corporate fraud and records tampering as criminal offenses and joins those offenses to specific penalties. It also revises sentencing guidelines and strengthens their penalties. This enables the SEC to resort to temporarily freezing transactions or payments that have been deemed "large" or "unusual".


History and context: events contributing to the adoption of Sarbanes–Oxley
A variety of complex factors created the conditions and culture in which a series of large corporate frauds occurred between 2000 and 2002. The spectacular, highly publicized frauds at Enron, WorldCom, and Tyco exposed significant problems with conflicts of interest and incentive compensation practices. The analysis of their complex and contentious root causes contributed to the passage of SOX in 2002.[8] In a 2004 interview, Senator Paul Sarbanes stated:

The Senate Banking Committee undertook a series of hearings on the problems in the markets that had led to a loss of hundreds and hundreds of billions, indeed trillions of dollars in market value. The hearings set out to lay the foundation for legislation. We scheduled 10 hearings over a six-week period, during which we brought in some of the best people in the country to testify...The hearings produced remarkable consensus on the nature of the problems: inadequate oversight of accountants, lack of auditor independence, weak corporate governance procedures, stock analysts' conflict of interests, inadequate disclosure provisions, and grossly inadequate funding of the Securities and Exchange Commission.[9]

• Auditor conflicts of interest: Prior to SOX, auditing firms, the primary financial "watchdogs" for investors, were self-regulated. They also performed significant non-audit or consulting work for the companies they audited. Many of these consulting agreements were far more lucrative than the auditing engagement. This presented at least the appearance of a conflict of interest. For example, challenging the company's accounting approach might damage a client relationship, conceivably placing a significant consulting arrangement at risk and damaging the auditing firm's bottom line.
• Boardroom failures: Boards of Directors, specifically Audit Committees, are charged with establishing oversight mechanisms for financial reporting in U.S. corporations on behalf of investors. These scandals identified Board members who either did not exercise their responsibilities or did not have the expertise to understand the complexities of the businesses. In many cases, Audit Committee members were not truly independent of management.
• Securities analysts' conflicts of interest: The roles of securities analysts, who make buy and sell recommendations on company stocks and bonds, and investment bankers, who help provide companies loans or handle mergers and acquisitions, provide opportunities for conflicts. Similar to the auditor conflict, issuing a buy or sell recommendation on a stock while providing lucrative investment banking services creates at least the appearance of a conflict of interest.
• Inadequate funding of the SEC: The SEC budget has steadily increased to nearly double the pre-SOX level.[10] In the interview cited above, Sarbanes indicated that enforcement and rule-making are more effective post-SOX.
• Banking practices: Lending to a firm sends signals to investors regarding the firm's risk. In the case of Enron, several major banks provided large loans to the company without understanding, or while ignoring, the risks of the company. Investors of these banks and their clients were hurt by such bad loans, resulting in large settlement payments by the banks. Others interpreted the willingness of banks to lend money to the company as an indication of its health and integrity, and were led to invest in Enron as a result. These investors were hurt as well.
• Internet bubble: Investors had been stung in 2000 by the sharp declines in technology stocks and, to a lesser extent, by declines in the overall market. Certain mutual fund managers were alleged to have advocated the purchasing of particular technology stocks while quietly selling them. The losses sustained also helped create a general anger among investors.
• Executive compensation: Stock option and bonus practices, combined with volatility in stock prices for even small earnings "misses," resulted in pressures to manage earnings.[11] Stock options were not treated as compensation expense by companies, encouraging this form of compensation. With a large stock-based bonus at risk, managers were pressured to meet their targets.


Timeline and passage of Sarbanes–Oxley
The House passed Rep. Oxley's bill (H.R. 3763) on April 24, 2002, by a vote of 334 to 90. The House then referred the "Corporate and Auditing Accountability, Responsibility, and Transparency Act" or "CAARTA" to the Senate Banking Committee with the support of President George W. Bush and the SEC. At the time, however, the Chairman of that Committee, Senator Paul Sarbanes (D-MD), was preparing his own proposal, Senate Bill 2673. Senator Sarbanes’ bill passed the Senate Banking Committee on June 18, 2002, by a vote of 17 to 4. On June 25, 2002, WorldCom revealed it had overstated its earnings by more than $3.8 billion during the past five quarters (15 months), primarily by improperly accounting for its operating costs. Sen. Sarbanes introduced Senate Bill 2673 to the full Senate that same day, and it passed 97–0 less than three weeks later on July 15, 2002.
Before the signing ceremony of the Sarbanes–Oxley Act, President George W. Bush met with Senator Paul Sarbanes, Secretary of Labor Elaine Chao and other dignitaries in the Blue Room at the White House on July 30, 2002

The House and the Senate formed a Conference Committee to reconcile the differences between Sen. Sarbanes's bill (S. 2673) and Rep. Oxley's bill (H.R. 3763). The conference committee relied heavily on S. 2673, and "most changes made by the conference committee strengthened the prescriptions of S. 2673 or added new prescriptions." (John T. Bostelman, The Sarbanes–Oxley Deskbook § 2–31.) The Committee approved the final conference bill on July 24, 2002, and gave it the name "the Sarbanes–Oxley Act of 2002." The next day, both houses of Congress voted on it without change, producing an overwhelming margin of victory: 423 to 3 [12] in the House and 99 to 0 [3] in the Senate. On July 30, 2002, President George W. Bush signed it into law, stating it included "the most far-reaching reforms of American business practices since the time of Franklin D. Roosevelt." [4]



Analyzing the cost-benefits of Sarbanes–Oxley
A substantial body of academic research and opinion exists regarding the costs and benefits of SOX, with significant differences in conclusions. This is due in part to the difficulty of isolating the impact of SOX from other variables affecting the stock market and corporate earnings.[13] [14] Conclusions from several of these studies and related criticism are summarized below:

Compliance costs
• FEI Survey (Annual): Financial Executives International (FEI) provides an annual survey on SOX Section 404 costs. These costs have continued to decline relative to revenues since 2004. The 2007 study indicated that, for 168 companies with average revenues of $4.7 billion, the average compliance costs were $1.7 million (0.036% of revenue).[15] The 2006 study indicated that, for 200 companies with average revenues of $6.8 billion, the average compliance costs were $2.9 million (0.043% of revenue), down 23% from 2005. Costs for decentralized companies (i.e., those with multiple segments or divisions) were considerably higher than for centralized companies. Survey scores related to the positive effect of SOX on investor confidence, reliability of financial statements, and fraud prevention continue to rise. However, when asked in 2006 whether the benefits of compliance with Section 404 had exceeded its costs, only 22 percent agreed.[16]
• Foley & Lardner Survey (2007): This annual study focused on changes in the total costs of being a U.S. public company, which were significantly affected by SOX. Such costs include external auditor fees, directors and officers (D&O) insurance, board compensation, lost productivity, and legal costs. Each of these cost categories increased significantly between FY2001 and FY2006. Nearly 70% of survey respondents indicated public companies with revenues under $251 million should be exempt from SOX Section 404.[17]
• Zhang (2005): This research paper estimated SOX compliance costs as high as $1.4 trillion, by measuring changes in market value around key SOX legislative "events." This number is based on the assumption that SOX was the cause of the related short-duration market value changes, which the author acknowledges as a drawback of the study.[18]
• Butler/Ribstein (2006): Their book proposed a comprehensive overhaul or repeal of SOX and a variety of other reforms. For example, they indicate that investors could diversify their stock investments, efficiently managing the risk of a few catastrophic corporate failures, whether due to fraud or competition. However, if each company is required to spend a significant amount of money and resources on SOX compliance, this cost is borne across all publicly traded companies and therefore cannot be diversified away by the investor.[19]
• A 2011 SEC study found that Section 404(b) compliance costs have continued to decline, especially after the 2007 accounting guidance.[20]
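As a rough sanity check on the survey arithmetic above, the percentages reported by the FEI surveys can be reproduced from the cited dollar figures. The helper function below is purely illustrative (it is not part of any survey methodology):

```python
def cost_pct_of_revenue(cost_usd: float, revenue_usd: float) -> float:
    """Express a compliance cost as a percentage of revenue."""
    return 100.0 * cost_usd / revenue_usd

# 2007 FEI survey: $1.7M average cost on $4.7B average revenue -> ~0.036%
pct_2007 = cost_pct_of_revenue(1.7e6, 4.7e9)
# 2006 FEI survey: $2.9M average cost on $6.8B average revenue -> ~0.043%
pct_2006 = cost_pct_of_revenue(2.9e6, 6.8e9)

print(f"2007: {pct_2007:.3f}% of revenue")  # matches the survey's 0.036%
print(f"2006: {pct_2006:.3f}% of revenue")  # matches the survey's 0.043%
```

Both figures round to the percentages quoted in the surveys, confirming the costs are internally consistent.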

Benefits to firms and investors
• Arping/Sautner (2010): This research paper analyzes whether SOX enhanced corporate transparency.[21] Looking at foreign firms that are cross-listed in the US, the paper indicates that, relative to a control sample of comparable firms not subject to SOX, cross-listed firms became significantly more transparent following SOX. Corporate transparency is measured based on the dispersion and accuracy of analyst earnings forecasts.
• Iliev (2007): This research paper indicated that SOX 404 indeed led to conservative reported earnings, but also reduced—rightly or wrongly—stock valuations of small firms.[22] Lower earnings often cause the share price to decrease.
• Skaife/Collins/Kinney/LaFond (2006): This research paper indicates that borrowing costs are lower for companies that improved their internal control, by between 50 and 150 basis points (0.5 to 1.5 percentage points).[23]
• Lord & Benoit Report (2006): "Do the Benefits of 404 Exceed the Cost?" A study of nearly 2,500 companies indicated that those with no material weaknesses in their internal controls, or that corrected them in a timely manner, experienced much greater increases in share prices than companies that did not.[24] [25] The report indicated that the benefits to a compliant company in share price (10% above the Russell 3000 index) were greater than their SOX Section 404 costs.
• Institute of Internal Auditors (2005): The research paper indicates that corporations have improved their internal controls and that financial statements are perceived to be more reliable.[26]


Effects on exchange listing choice of non-US companies
Some have asserted that Sarbanes–Oxley legislation has helped displace business from New York to London, where the Financial Services Authority regulates the financial sector with a lighter touch. In the UK, the non-statutory Combined Code of Corporate Governance plays a somewhat similar role to SOX. See Howell E. Jackson & Mark J. Roe, "Public Enforcement of Securities Laws: Preliminary Evidence" (Working Paper January 16, 2007). The Alternative Investment Market claims that its spectacular growth in listings almost entirely coincided with the Sarbanes–Oxley legislation. In December 2006, Michael Bloomberg, New York's mayor, and Charles Schumer, a US senator from New York, expressed their concern.[27] According to Kate Litvak, the Sarbanes–Oxley Act's effect on non-US companies cross-listed in the US differs between firms from developed, well-regulated countries and firms from less-developed countries.[28] Companies from badly regulated countries see benefits (better credit ratings from complying with regulations in a highly regulated country, the USA) that exceed the costs, but companies from developed countries incur only the costs, since transparency is adequate in their home countries as well. On the other hand, the benefit of a better credit rating also comes with listing on other stock exchanges, such as the London Stock Exchange. Piotroski and Srinivasan (2008) examine a comprehensive sample of international companies that listed onto U.S. and U.K. stock exchanges before and after the enactment of the Act in 2002. Using a sample of all listing events onto U.S. and U.K. exchanges from 1995–2006, they find that the listing preferences of large foreign firms choosing between U.S. exchanges and the LSE's Main Market did not change following SOX. In contrast, they find that the likelihood of a U.S. listing among small foreign firms choosing between the Nasdaq and LSE's Alternative Investment Market decreased following SOX.
The negative effect among small firms is consistent with these companies being less able to absorb the incremental costs associated with SOX compliance. The screening of smaller firms with weaker governance attributes from U.S. exchanges is consistent with the heightened governance costs imposed by the Act increasing the bonding-related benefits of a U.S. listing.[29]

Implementation of key provisions
Sarbanes–Oxley Section 302: Disclosure controls
Under Sarbanes–Oxley, two separate sections came into effect—one civil and the other criminal: 15 U.S.C. § 7241 [30] (Section 302) (civil provision) and 18 U.S.C. § 1350 [31] (Section 906) (criminal provision). Section 302 of the Act mandates a set of internal procedures designed to ensure accurate financial disclosure. The signing officers must certify that they are "responsible for establishing and maintaining internal controls" and "have designed such internal controls to ensure that material information relating to the company and its consolidated subsidiaries is made known to such officers by others within those entities, particularly during the period in which the periodic reports are being prepared." 15 U.S.C. § 7241(a)(4) [32]. The officers must "have evaluated the effectiveness of the company's internal controls as of a date within 90 days prior to the report" and "have presented in the report their conclusions about the effectiveness of their internal controls based on their evaluation as of that date." Id.

The SEC interpreted the intention of Sec. 302 in Final Rule 33–8124. In it, the SEC defines the new term "disclosure controls and procedures", which are distinct from "internal controls over financial reporting".[33] Under both Section 302 and Section 404, Congress directed the SEC to promulgate regulations enforcing these provisions.[34]

External auditors are required to issue an opinion on whether effective internal control over financial reporting was maintained in all material respects by management. This is in addition to the financial statement opinion regarding the accuracy of the financial statements. The requirement to issue a third opinion regarding management's assessment was removed in 2007.


Sarbanes–Oxley Section 401: Disclosures in periodic reports (Off-balance sheet items)
The bankruptcy of Enron drew attention to off-balance sheet instruments that were used fraudulently. During 2010, the court examiner's review of the Lehman Brothers bankruptcy also brought these instruments back into focus, as Lehman had used an instrument called "Repo 105" to allegedly move assets and debt off-balance sheet to make its financial position look more favorable to investors. Sarbanes-Oxley required the disclosure of all material off-balance sheet items. It also required an SEC study and report to better understand the extent of usage of such instruments and whether accounting principles adequately addressed these instruments; the SEC report was issued June 15, 2005.[35] [36] Interim guidance was issued in May 2006, which was later finalized.[37] Critics argued the SEC did not take adequate steps to regulate and monitor this activity.[38]

Sarbanes–Oxley Section 404: Assessment of internal control
The most contentious aspect of SOX is Section 404, which requires management and the external auditor to report on the adequacy of the company's internal control over financial reporting (ICOFR). This is the most costly aspect of the legislation for companies to implement, as documenting and testing important financial manual and automated controls requires enormous effort.[39] Under Section 404 of the Act, management is required to produce an “internal control report” as part of each annual Exchange Act report. See 15 U.S.C. § 7262 [40]. The report must affirm “the responsibility of management for establishing and maintaining an adequate internal control structure and procedures for financial reporting.” 15 U.S.C. § 7262(a) [41]. The report must also “contain an assessment, as of the end of the most recent fiscal year of the Company, of the effectiveness of the internal control structure and procedures of the issuer for financial reporting.” To do this, managers are generally adopting an internal control framework such as that described in COSO. To help alleviate the high costs of compliance, guidance and practice have continued to evolve. The Public Company Accounting Oversight Board (PCAOB) approved Auditing Standard No. 5 for public accounting firms on July 25, 2007.[42] This standard superseded Auditing Standard No. 2, the initial guidance provided in 2004. The SEC also released its interpretive guidance [43] on June 27, 2007. It is generally consistent with the PCAOB's guidance, but intended to provide guidance for management. Both management and the external auditor are responsible for performing their assessment in the context of a top-down risk assessment, which requires management to base both the scope of its assessment and evidence gathered on risk. This gives management wider discretion in its assessment approach. 
These two standards together require management to:
• Assess both the design and operating effectiveness of selected internal controls related to significant accounts and relevant assertions, in the context of material misstatement risks;
• Understand the flow of transactions, including IT aspects, in sufficient detail to identify points at which a misstatement could arise;
• Evaluate company-level (entity-level) controls, which correspond to the components of the COSO framework;
• Perform a fraud risk assessment;
• Evaluate controls designed to prevent or detect fraud, including management override of controls;
• Evaluate controls over the period-end financial reporting process;
• Scale the assessment based on the size and complexity of the company;
• Rely on management's work based on factors such as competency, objectivity, and risk;
• Conclude on the adequacy of internal control over financial reporting.

After the SEC and PCAOB issued their guidance, the SEC required smaller public companies (non-accelerated filers) with fiscal years ending after December 15, 2007 to document a Management Assessment of their Internal Controls over Financial Reporting (ICFR). Outside auditors of non-accelerated filers, however, opine on or test internal controls under PCAOB (Public Company Accounting Oversight Board) Auditing Standards for years ending after December 15, 2008. Another extension was granted by the SEC for the outside auditor assessment until years ending after December 15, 2009. The reason for the timing disparity was to address the House Committee on Small Business concern that the cost of complying with Section 404 of the Sarbanes–Oxley Act of 2002 was still unknown and could therefore be disproportionately high for smaller publicly held companies.[47] The SEC stated in their release that the extension was granted so that the SEC's Office of Economic Analysis could complete a study of whether additional guidance provided to company managers and auditors in 2007 was effective in reducing the costs of compliance. On October 2, 2009, the SEC granted another extension for the outside auditor assessment until fiscal years ending after June 15, 2010, and stated that there would be no further extensions.[48] On September 15, 2010, the SEC issued final rule 33-9142, which permanently exempts registrants that are neither accelerated nor large accelerated filers, as defined by Rule 12b-2 of the Securities and Exchange Act of 1934, from the Section 404(b) internal control audit requirement.[49]

Sarbanes–Oxley 404 and smaller public companies
The cost of complying with SOX 404 impacts smaller companies disproportionately, as there is a significant fixed cost involved in completing the assessment. For example, during 2004, U.S. companies with revenues exceeding $5 billion spent 0.06% of revenue on SOX compliance, while companies with less than $100 million in revenue spent 2.55%.[45] This disparity was a focal point of 2007 SEC and U.S. Senate action.[46] The PCAOB intended to issue further guidance to help companies scale their assessment based on company size and complexity during 2007, and the SEC issued its guidance to management in June 2007.[43]

SOX 404 compliance costs represent a tax on inefficiency, encouraging companies to centralize and automate their financial reporting systems. This is apparent in the comparative costs of companies with decentralized operations and systems versus those with centralized, more efficient systems. For example, the 2007 FEI survey indicated average compliance costs for decentralized companies were $1.9 million, while centralized company costs were $1.3 million.[44] Costs of evaluating manual control procedures are dramatically reduced through automation.

Sarbanes–Oxley Section 802: Criminal penalties for violation of SOX
Section 802(a) of the SOX, 18 U.S.C. § 1519 [50], states:

"Whoever knowingly alters, destroys, mutilates, conceals, covers up, falsifies, or makes a false entry in any record, document, or tangible object with the intent to impede, obstruct, or influence the investigation or proper administration of any matter within the jurisdiction of any department or agency of the United States or any case filed under title 11, or in relation to or contemplation of any such matter or case, shall be fined under this title, imprisoned not more than 20 years, or both."
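The fixed-cost point made for smaller public companies above can be illustrated with a quick back-of-the-envelope calculation using the cited 2004 figures (0.06% of revenue for firms above $5 billion, 2.55% for firms below $100 million). The helper below is a hypothetical illustration, not part of any cited study:

```python
def sox_cost_usd(revenue_usd: float, pct_of_revenue: float) -> float:
    """Implied SOX 404 spend in dollars, given revenue and cost as % of revenue."""
    return revenue_usd * pct_of_revenue / 100.0

large = sox_cost_usd(5e9, 0.06)    # a $5B-revenue company: ~$3.0M
small = sox_cost_usd(100e6, 2.55)  # a $100M-revenue company: ~$2.55M

# Despite a 50x gap in revenue, the absolute outlays are of similar size,
# which is what a large fixed cost of assessment would predict.
print(f"large firm: ${large/1e6:.2f}M, small firm: ${small/1e6:.2f}M")
```

The similar dollar amounts at the two revenue extremes are consistent with the text's claim that the assessment carries a significant fixed cost that smaller filers cannot spread over a large revenue base.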

Sarbanes–Oxley Section 1107: Criminal penalties for retaliation against whistleblowers
Section 1107 of the SOX, 18 U.S.C. § 1513(e) [51], states:[52]

"Whoever knowingly, with the intent to retaliate, takes any action harmful to any person, including interference with the lawful employment or livelihood of any person, for providing to a law enforcement officer any truthful information relating to the commission or possible commission of any federal offense, shall be fined under this title, imprisoned not more than 10 years, or both."

Criticism
Congressman Ron Paul and others, such as former Arkansas governor Mike Huckabee, have contended that SOX was an unnecessary and costly government intrusion into corporate management that places U.S. corporations at a competitive disadvantage with foreign firms, driving businesses out of the United States. In an April 14, 2005 speech before the U.S. House of Representatives, Paul stated, "These regulations are damaging American capital markets by providing an incentive for small US firms and foreign firms to deregister from US stock exchanges. According to a study by a researcher at the Wharton Business School, the number of American companies deregistering from public stock exchanges nearly tripled during the year after Sarbanes–Oxley became law, while the New York Stock Exchange had only 10 new foreign listings in all of 2004. The reluctance of small businesses and foreign firms to register on American stock exchanges is easily understood when one considers the costs Sarbanes–Oxley imposes on businesses. According to a survey by Korn/Ferry International, Sarbanes–Oxley cost Fortune 500 companies an average of $5.1 million in compliance expenses in 2004, while a study by the law firm of Foley and Lardner found the Act increased costs associated with being a publicly held company by 130 percent."[53]

A research study published by Joseph Piotroski of Stanford University and Suraj Srinivasan of Harvard Business School, titled "Regulation and Bonding: Sarbanes Oxley Act and the Flow of International Listings," in the Journal of Accounting Research in 2008, found that following the act's passage, smaller international companies were more likely to list on stock exchanges in the U.K. rather than on U.S. stock exchanges.[29]

During the financial crisis of 2007-2010, critics blamed Sarbanes–Oxley for the low number of Initial Public Offerings (IPOs) on American stock exchanges during 2008. In November 2008, Newt Gingrich and co-author David W. Kralik called on Congress to repeal Sarbanes–Oxley.[54]

A December 21, 2008 Wall St. Journal editorial stated, "The new laws and regulations have neither prevented frauds nor instituted fairness. But they have managed to kill the creation of new public companies in the U.S., cripple the venture capital business, and damage entrepreneurship. According to the National Venture Capital Association, in all of 2008 there have been just six companies that have gone public. Compare that with 269 IPOs in 1999, 272 in 1996, and 365 in 1986." The editorial concludes: "For all of this, we can first thank Sarbanes–Oxley. Cooked up in the wake of accounting scandals earlier this decade, it has essentially killed the creation of new public companies in America, hamstrung the NYSE and Nasdaq (while making the London Stock Exchange rich), and cost U.S. industry more than $200 billion by some estimates."[56]

Hoover's IPO Scorecard notes 31 IPOs in 2008.[55] Previously, the number of IPOs had declined to 87 in 2001, well down from the highs.[57] In 2004, IPOs were up 195% from the previous year to 233.[58] There were 196 IPOs in 2005, 205 in 2006 (with a seven-fold increase in deals over $1 billion) and 209 in 2007.[59] [60]

Praise
Former Federal Reserve Chairman Alan Greenspan praised the Sarbanes–Oxley Act: "I am surprised that the Sarbanes–Oxley Act, so rapidly developed and enacted, has functioned as well as it has...the act importantly reinforced the principle that shareholders own our corporations and that corporate managers should be working on behalf of shareholders to allocate business resources to their optimum use."[61]

SOX has been praised by a cross-section of financial industry experts, citing improved investor confidence and more accurate, reliable financial statements. The CEO and CFO are now required to unequivocally take ownership for their financial statements under Section 302. Further, auditor conflicts of interest have been addressed by prohibiting auditors from also having lucrative consulting agreements with the firms they audit under Section 201. SEC Chairman Christopher Cox stated in 2007: "Sarbanes–Oxley helped restore trust in U.S. markets by increasing accountability, speeding up reporting, and making audits more independent."[62] The FEI 2007 study and research by the Institute of Internal Auditors (IIA) also indicate SOX has improved investor confidence in financial reporting, a primary objective of the legislation. The IIA study also indicated improvements in board, audit committee, and senior management engagement in financial reporting and improvements in financial controls.[63] [64]

Financial restatements increased significantly in the wake of the SOX legislation, as companies "cleaned up" their books, and have since dramatically declined. Glass, Lewis & Co. LLC is a San Francisco-based firm that tracks the volume of do-overs by public companies. Its March 2006 report, "Getting It Wrong the First Time," shows 1,295 restatements of financial earnings in 2005 for companies listed on U.S. securities markets, almost twice the number for 2004. "That's about one restatement for every 12 public companies—up from one for every 23 in 2004," says the report.[65]

One fraud uncovered by the Securities and Exchange Commission (SEC) in November 2009 [66] may be directly credited to Sarbanes-Oxley. The fraud, which spanned nearly 20 years and involved over $24 million, was committed by Value Line (NASDAQ: VALU [67]) against its mutual fund shareholders. The fraud was first reported to the SEC in 2004 by the Value Line Fund (NASDAQ: VLIFX [68]) portfolio manager, who was asked to sign a Code of Business Ethics as part of SOX. Restitution totalling $34 million will be placed in a fair fund and returned to the affected Value Line mutual fund investors.[69] [70] [71] No criminal charges have been filed.[72]

Benefits of Sarbanes Oxley Act on a long term basis
The Sarbanes Oxley Act provides a number of long-term benefits. SOX has had a significant impact on corporate governance.[73] Investors' confidence is increased as SOX ensures reliable financial reporting. SOX requires a standard data entry system, as companies are no longer able to manipulate inventories or stocks of products or sales, since there is a real-time reporting system in place, which was not the case prior to SOX.[74] SOX nurtures an ethical culture, as it forces top management to be transparent and employees to be responsible for their acts, and it also protects whistle blowers.[75] [76]

Legal challenges
A lawsuit (Free Enterprise Fund v. Public Company Accounting Oversight Board) was filed in 2006 challenging the constitutionality of the PCAOB. The complaint argues that because the PCAOB has regulatory powers over the accounting industry, its officers should be appointed by the President, rather than the SEC.[77] Further, because the law lacks a "severability clause," if part of the law is judged unconstitutional, so is the remainder. If the plaintiff prevails, Congress may have to devise a different method of officer appointment, and the other parts of the law may be open to revision.[78] [79] The lawsuit was dismissed from a District Court; the decision was upheld by the Court of Appeals on August 22, 2008.[80] Judge Kavanaugh, in his dissent, argued strongly against the constitutionality of the law.[81] On May 18, 2009, the United States Supreme Court agreed to hear this case.[82] On December 7, 2009, it heard the oral arguments.[83] On June 28, 2010, the United States Supreme Court unanimously turned away a broad challenge to the law, but ruled 5–4 that a section related to appointments violates the Constitution's separation of powers mandate. The act remains "fully operative as a law" pending a process correction.[84]

Legislative information
• House: H.R. 3763 [85]; H. Rept. 107–414; Conf. Rept. 107–610
• Senate: S. 2673 [86]; S. Rept. 107–205
• Law: Pub.L. 107–204 [1], 116 Stat. 745

References
[1] http://www.gpo.gov/fdsys/pkg/PLAW-107publ204/content-detail.html
[2] http://clerk.house.gov/evs/2002/roll348.xml
[3] http://www.senate.gov/legislative/LIS/roll_call_lists/roll_call_vote_cfm.cfm?congress=107&session=2&vote=00192
[4] Bumiller, Elisabeth (2002-07-31). "Bush Signs Bill Aimed at Fraud in Corporations" (http://query.nytimes.com/gst/fullpage.html?res=9C01E0D91E38F932A05754C0A9649C8B63). The New York Times.
[5] A McKinsey & Company study commissioned by NYC Mayor Michael Bloomberg and U.S. Sen. Charles Schumer (D-N.Y.) cites this as one reason America's financial sector is losing market share to other financial centers worldwide. http://www.senate.gov/~schumer/SchumerWebsite/pressroom/special_reports/2007/NY_REPORT%20_FINAL.pdf
[6] BusinessWeek article quoting fund managers at Eaton Vance and T. Rowe Price: http://www.businessweek.com/magazine/content/07_05/b4019053.htm
[7] Kuschnik, Bernhard. "The Sarbanes Oxley Act: 'Big Brother is watching' you or Adequate Measures of Corporate Governance Regulation?" 5 Rutgers Business Law Journal 64–95 (2008). Available at http://businesslaw.newark.rutgers.edu/RBLJ_vol5_no1_kuschnik.pdf
[8] Farrell, Greg. "America Robbed Blind." Wizard Academy Press: 2005.
[9] Lucas, Nance (2004). "Sarbanes Interview" (http://findarticles.com/p/articles/mi_m0NXD/is_1_11/ai_n25101748/print?tag=artBody.col1). Findarticles.com. Retrieved 2010-08-27.
[10] "SEC Annual Budget" (http://www.sec.gov/foia/docs/budgetact.htm). Sec.gov. Retrieved 2010-08-27.
[11] "SEC Levitt Speech The Numbers Game" (http://www.sec.gov/news/speech/speecharchive/1998/spch220.txt). Sec.gov. Retrieved 2010-08-27.
[12] http://clerk.house.gov/cgi-bin/vote.asp?year=2002&rollnumber=348
[13] "Five years of Sarbanes–Oxley" (http://www.economist.com/displaystory.cfm?story_id=9545905). The Economist. 2007-07-26. Retrieved 2010-08-27.
[14] Shakespeare, Catharine (2008). "Sarbanes–Oxley Act of 2002 Five Years On: What Have We Learned?". Journal of Business & Technology Law: 333.
[15] "FEI 2007 Survey of SOX 404 Costs" (http://fei.mediaroom.com/index.php?s=43&item=204). Fei.mediaroom.com. 2008-04-30. Retrieved 2010-08-27.
[16] FEI 2006 Survey of SOX 404 Costs (http://fei.mediaroom.com/index.php?s=press_releases&item=187).
[17] "Foley & Lardner 2007 Study" (http://www.foley.com/news/news_detail.aspx?newsid=3074). Foley.com. 2007-02-08. Retrieved 2010-08-27.
[18] "Economic Consequences of the Sarbanes-Oxley Act" (http://w4.stern.nyu.edu/accounting/docs/speaker_papers/spring2005/Zhang_Ivy_Economic_Consequences_of_S_O.pdf) (PDF). Retrieved 2010-08-27.
[19] Butler, Henry N. (2006-06-05). "The Sarbanes–Oxley Debacle" (http://www.aei.org/book/855). Aei.org. Retrieved 2010-08-27.
[20] "Study and Recommendations on Section 404(b)" (http://sec.gov/news/studies/2011/404bfloat-study.pdf) (PDF). U.S. Securities and Exchange Commission. April 2011.
[21] "The Effect of Corporate Governance Regulation on Transparency: Evidence from the Sarbanes-Oxley Act of 2002" (http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1561619). Papers.ssrn.com. Retrieved 2010-08-27.
[22] "The Effect of the Sarbanes–Oxley Act (Section 404) Management's Report on Audit Fees, Accruals and Stock Returns" (http://papers.ssrn.com/sol3/papers.cfm?abstract_id=983772). Papers.ssrn.com. 2009-06-23. Retrieved 2010-08-27.
[23] The Effect of Internal Control Deficiencies on Firm Risk and Cost of Capital (PDF).
[24] "Lord & Benoit Report: Do the Benefits of 404 Exceed the Cost?" (http://www.section404.org/pdf/Lord & Benoit Report Do the Benefits of 404 Exceed the Cost?.pdf). Section404.com. Retrieved 2010-08-27.
[25] "Benoit WSJ" (http://www.section404.org/pdf/09_wall_street_journal.pdf). Section404.com. Retrieved 2010-08-27.
[26] "IIA Research SOX Looking at the Benefits" (http://www.theiia.org/research/research-reports/chronological-listing-research-reports/downloadable-research-reports/?i=248). Theiia.org. Retrieved 2010-08-27.
[27] Bloomberg–Schumer report (http://www.senate.gov/~schumer/SchumerWebsite/pressroom/special_reports/2007/NY_REPORT%20_FINAL.pdf).
[28] "SSRN: The Effect of the Sarbanes–Oxley Act on Non-US Companies Cross-Listed in the US, by Kate Litvak" (http://papers.ssrn.com/sol3/papers.cfm?abstract_id=876624). Papers.ssrn.com. Retrieved 2010-08-27.
[29] Piotroski, Joseph D.; Srinivasan, Suraj. "Regulation and Bonding: The Sarbanes–Oxley Act and the Flow of International Listings" (January 2008). Available at SSRN: http://ssrn.com/abstract=956987
[30] http://www.law.cornell.edu/uscode/15/7241.html
[31] http://www.law.cornell.edu/uscode/18/1350.html
[32] http://www.law.cornell.edu/uscode/15/7241.html#a_4
[33] "Final Rule: Certification of Disclosure in Companies' Quarterly and Annual Reports" (http://www.sec.gov/rules/final/33-8124.htm). Sec.gov. Retrieved 2010-08-27.
[34] "SEC Final Rules 33–8238" (http://www.sec.gov/rules/final/33-8238.htm). Sec.gov. Retrieved 2010-08-27.
[35] "SEC Press Release on 401(c) Report, June 15, 2005" (http://www.sec.gov/news/press/2005-91.htm). Sec.gov. 2005-06-15. Retrieved 2010-08-27.
[36] "Report and Recommendations Pursuant to Section 401(c) of the Sarbanes-Oxley Act of 2002 On Arrangements with Off-Balance Sheet Implications, Special Purpose Entities, and Transparency of Filings by Issuers" (http://www.sec.gov/news/studies/soxoffbalancerpt.pdf) (PDF). Retrieved 2010-08-27.
[37] "Policy Statement: Interagency Statement on Sound Practices Concerning Elevated Risk Complex Structured Finance Activities" (http://www.sec.gov/rules/policy/2006/34-53773.pdf) (PDF). Sec.gov. Retrieved 2010-08-27.
[38] Koniak, Susan P.; Cohen, George M.; Dana, David A.; Ross, Thomas (April 3, 2010). "How Washington Abetted the Bank Job" (http://www.nytimes.com/2010/04/04/opinion/04koniak.html). New York Times.
[39] See New Center for Data Analysis Report (http://www.heritage.org/CDA/upload/SOX-CDA-edited-3.pdf) (PDF).
[40] http://www.law.cornell.edu/uscode/15/7262.html
[41] http://www.law.cornell.edu/uscode/15/7262.html#a
[42] PCAOB Auditing Standard No. 5 (http://pcaobus.org/Rules/Rulemaking/Pages/Docket021.aspx)
[43] "SEC Interpretive Release: Commission Guidance Regarding Management's Report on Internal Control Over Financial Reporting Under Section 13(a) or 15(d) of the Securities Exchange Act of 1934" (http://www.sec.gov/rules/interp/2007/33-8810.pdf) (PDF). Retrieved 2010-08-27.
[44] "FEI Survey 2007" (http://fei.mediaroom.com/index.php?s=43&item=204). Fei.mediaroom.com. 2008-04-30. Retrieved 2010-08-27.
[45] "Final Report: Advisory Committee on Smaller Public Companies" (http://www.sec.gov/info/smallbus/acspc/acspc-finalreport.pdf) (PDF). Sec.gov. Retrieved 2010-08-27.
[46] "Dodd-Shelby Amendment" (http://dodd.senate.gov/index.php?q=node/3852). Dodd.senate.gov. Retrieved 2010-08-27.
[47] "Sarbanes–Oxley: Progressive Punishment for Regressive Victimization", 44 Hous. L. Rev. 95 (2007) (http://papers.ssrn.com/sol3/Delivery.cfm/SSRN_ID978834_code607342.pdf?abstractid=978834&mirid=3). Retrieved 2010-08-27.
[48] "SEC Press Release: Final Stage of Section 404 of Sarbanes–Oxley to Begin in June" (http://www.sec.gov/news/press/2009/2009-213.htm). Sec.gov. Retrieved 2010-09-15.
[49] "Internal control over financial reporting in Exchange Act periodic reports of non-accelerated filers" (http://www.sec.gov/rules/final/2010/33-9142.pdf) (PDF). Sec.gov. Retrieved 2010-08-27.
[50] http://www.law.cornell.edu/uscode/18/1519.html
[51] http://www.law.cornell.edu/uscode/18/1513.html#e
[52] Kohn, Stephen M.; Kohn, Michael D.; Colapinto, David K. (2004). Whistleblower Law: A Guide to Legal Protections for Corporate Employees. Praeger Publishers. ISBN 0-275-98127-4.
[53] "Repeal Sarbanes-Oxley!" (http://paul.house.gov/index.php?option=com_content&task=view&id=209&Itemid=60). Ron Paul, April 14, 2005.
[54] Gingrich, Newt; Kralik, David W. (2008-11-05). "Gingrich" (http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2008/11/05/ED2813T8O9.DTL). Sfgate.com. Retrieved 2010-08-27.
[55] "Hoover's IPO Scorecard Reveals Major Decline In 2008" (http://www.hoovers.com/business-information/--pageid__16750--/global-corp-press-index.xhtml). Hoovers.com. 2009-01-14. Retrieved 2010-08-27.
[56] Washington Is Killing Silicon Valley (http://online.wsj.com/article/SB122990472028925207.html). Wall St. Journal, December 21, 2008.
[57] "Hoover's IPO Analysis For 2001 Shows Resurgence Of 'Not-Coms'" (http://www.hoovers.com/business-information/--pageid__4046--/global-corp-press-index.xhtml). Hoovers.com. 2002-01-03. Retrieved 2010-08-27.
[58] Number of IPOs in 2004 Increased by 195% (http://www.hoovers.com/business-information).
[59] "Hoover's IPO Scorecard Reveals Increase In Momentum In 2006, Along With Seven-Fold Increase In Number of $1 Billion-Plus Deals" (http://www.hoovers.com/business-information/--pageid__16356--/global-corp-press-index.xhtml). Hoovers.com. 2007-01-03. Retrieved 2010-08-27.
[60] "Hoover's IPO Scorecard Reveals Only Slight Growth in 2007" (http://www.hoovers.com/business-information/--pageid__15824--/global-corp-press-index.xhtml). Hoovers.com. 2008-01-04. Retrieved 2010-08-27.
[61] "Greenspan praises SOX" (http://www.federalreserve.gov/boarddocs/speeches/2005/20050515/default.htm). Federalreserve.gov. 2005-05-15. Retrieved 2010-08-27.

htm). loc. Retrieved 2010-08-27. wsj. [65] "Glass Lewis Survey of Restatements" (http:/ / www. html?dbk). com/ wp-dyn/ content/ article/ 2008/ 07/ 19/ AR2008071900106.pdf) U. . . Is Investigating Fee Practices at Value Line (http:/ / www. . 2003-06-26. asp?symbol=VLIFX& selected=VLIFX [69] 2:22 p. com/ whitepapers/ Compliance_to_Cash_060725. Floyd. Sarbanes Oxley Compliance journal.com. Today2:22 p. Retrieved 2010-08-27. [74] "Sarbanes-Oxley: Dragon or white knight?" (http:/ / www. marketwatch.access. USA Today.htm) . com/ news/ Supreme-Court-weighs-validity-apf-3124646921. oversightsystems.ucsb. [77] The Wall Street Journal. 2005-12-06. Gina (2009-11-04). . html) [72] Keating. [67] http:/ / quotes.Sarbanes–Oxley Act [62] Farrell.com. Retrieved 2010-08-27.com. loc. cadwalader. . Jean Bernhard Buttner. [85] http:/ / hdl. [63] "FEI Survey" (http:/ / fei.E. 107s2673 204 External links • The text of the law (PDF) (http://frwebgate. Retrieved 2010-08-27. [71] The S. Reuters. com/ editorials/ sell-sarbanes-oxley/ 84635/ ). . Aug. washingtonpost. Usatoday.org. html). 2010.mediaroom. The New York Times. 2010). glasslewis. com/ money/ companies/ regulation/ 2003-10-19-sarbanes_x. gov/ loc. Inc.cgi?dbname=107_cong_bills& docid=f:h3763enr. Retrieved 2010-08-27. Government Printing Office • President Bush – Signing Statement (http://www. mediaroom. [66] "Administrative Proceeding: Value Line.m. . Retrieved 2011-03-04. com/ story/ publisher-value-line-may-take-earnings-hit-from-sec-inquiry). nasdaq. "Value Line.gov/cgi-bin/getdoc. Greg (2007-07-30).org. com/ asp/ SummaryQuote. aspx) [81] "NY Sun Editorial" (http:/ / www. usatoday.gov/news/studies/ principlesbasedstand. bloomberg. htm).com.php?pid=64514) • Study Pursuant to Section 108(d) of the Sarbanes–Oxley Act of 2002 on the Adoption by the United States Financial Reporting System of a Principles-Based Accounting System (http://www. Fei.S. . . Liptak.m. Npr. 2008-04-30. Retrieved 2010-08-27. 
com/ article/ idUSN045378520091104). pdf) (PDF). org/ News/ Releases/ Pages/ 08222008_PCAOBStatement. Inc. Retrieved 2011-03-04. uscongress/ legislation. & Financial Markets" (http:/ / www. . nysun. com/ money/ companies/ regulation/ 2007-07-29-sarbanes-oxley_N. Retrieved 2010-08-27. Retrieved 2010-08-27. 107hr3763 [86] http:/ / hdl. pdf. Supremecourtus. Oversight. . [78] Post Store (2008-07-20). Washington Post. Retrieved 2011-03-04. "USA Today – SOX Law Has Been a Pretty Clean Sweep" (http:/ / www. pdf) (PDF). nytimes. com/ 2010/ 06/ 29/ business/ 29accounting. "Supreme Court Upholds Accounting Board" (http:/ / www. asp?symbol=VALU& selected=VALU [68] http:/ / quotes. php?s=43& item=204). Bloomberg. Retrieved 2010-08-27. [79] "NPR-Supreme Court Considers Sarbanes-Oxley Board" (http:/ / www.C. usatoday. . gov/ litigation/ admin/ 2009/ 33-9081. com/ 2008/ 08/ 02/ business/ 02fund. gov/ oral_arguments/ argument_transcripts/ 08-861.. . org/ templates/ story/ story. 2003-10-19. Theiia. Sree (2009-11-09). com/ public/ resources/ documents/ PCAOBcomplaint. execs to pay $45 mln in SEC case" (http:/ / www. php?storyId=121146830). [80] PCAOB News Release (http:/ / pcaobus. Value Line Securities.tst.com.gpo. com/ index. reuters. supremecourtus. Adam (June 28. [84] Norris. [70] Vidya. and David Henigson" (http:/ / www. Retrieved 2010-08-27. nytimes. org/ research/ research-reports/ chronological-listing-research-reports/ downloadable-research-reports/ ?i=248). com/ apps/ news?pid=20601103& sid=aPpxj3FdB0uM). [76] "Whistleblower Protection Under The Sarbanes-Oxley Act" (http:/ / www. pdf).edu/ws/index. [82] (http:/ / finance.presidency.gov. Retrieved 2010-08-27. com/ assets/ article/ BlockHoff062603. 27. . "Publisher Value Line may take earnings hit from SEC inquiry" (http:/ / www. [75] "A Sarbanes-Oxley Compliance Program that Saves Cash" (http:/ / www. Marketwatch. yahoo. com/ dsp_getFeaturesDetails. Nysun. theiia.com. Retrieved 2010-08-27. cfm?CID=1141). 
uscongress/ legislation. nasdaq. 2009-12-07. npr. Newyork law journal. "Value Line Settlement Marks End of Buttner Reign" (http:/ / www. s-ox. com/ downloads/ Restatements2005Summary. com/ asp/ SummaryQuote. pdf). "Washington Post" (http:/ / www. . .. [64] "IIA Study" (http:/ / www. [73] "The Impact Of Sarbanes Oxley On Companies. Investors. . pdf). Retrieved 2011-03-04.sec. sec. gov/ loc. html?x=0) [83] "12_7_09 Oral Argument Transcript" (http:/ / www. http:/ / online.

Security Content Automation Protocol

The Security Content Automation Protocol (SCAP) is a method for using specific standards to enable automated vulnerability management, measurement, and policy compliance evaluation (e.g., FISMA compliance). The National Vulnerability Database (NVD) is the U.S. government content repository for SCAP.

Purpose

The Security Content Automation Protocol (SCAP), pronounced "S-Cap", combines a number of open standards that are used to enumerate software flaws and configuration issues related to security. They measure systems to find vulnerabilities and offer methods to score those findings in order to evaluate the possible impact. It is a method for using those open standards for automated vulnerability management, measurement, and policy compliance evaluation. SCAP defines how the following standards (referred to as SCAP "Components") are combined:

SCAP Components
• Common Vulnerabilities and Exposures (CVE) [1]
• Common Configuration Enumeration (CCE) [2]
• Common Platform Enumeration (CPE) [3]
• Common Vulnerability Scoring System (CVSS) [4]
• Extensible Configuration Checklist Description Format (XCCDF) [5]
• Open Vulnerability and Assessment Language (OVAL)

Starting with SCAP version 1.1:
• Open Checklist Interactive Language (OCIL) Version 2.0

These components can be used to build products that have SCAP Capabilities:

SCAP Capabilities
• Federal Desktop Core Configuration (FDCC) Scanner
• Authenticated Configuration Scanner
• Authenticated Vulnerability [and Patch] Scanner
• Unauthenticated Vulnerability Scanner
• Intrusion Detection and Prevention
• Patch Remediation
• Mis-configuration Remediation
• Asset Management
• Asset Database
• Vulnerability Database
• Mis-configuration Database
• Malware Tool
• Ask Questions That are not Automatable

Security Content Automation Protocol (SCAP) checklists standardize and enable automation of the linkage between computer security configurations and the NIST Special Publication 800-53 Revision 1 (SP 800-53 Rev1) controls framework. The current version of SCAP is meant to perform initial measurement and continuous monitoring of security settings and corresponding SP 800-53 Rev1 controls. Future versions will likely standardize and enable automation for implementing and changing security settings of corresponding SP 800-53 Rev1 controls. In this way, SCAP contributes to the implementation, assessment, and monitoring steps of the NIST Risk Management Framework. Accordingly, SCAP is an integral part of the NIST FISMA [6] implementation project.
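Among the SCAP components, CVSS is a concrete scoring formula rather than an enumeration. As an illustration, here is a minimal sketch of the CVSS version 2 base-score equation; the metric weights are the published v2 constants, but treat the code as an unofficial sketch, not a validated reference implementation:

```python
# Sketch of the CVSS v2 base-score equation. The dictionaries hold the
# published v2 metric weights; function and variable names are this
# sketch's own, not part of the CVSS specification.

ACCESS_VECTOR = {"local": 0.395, "adjacent": 0.646, "network": 1.0}
ACCESS_COMPLEXITY = {"high": 0.35, "medium": 0.61, "low": 0.71}
AUTHENTICATION = {"multiple": 0.45, "single": 0.56, "none": 0.704}
IMPACT_WEIGHT = {"none": 0.0, "partial": 0.275, "complete": 0.660}

def cvss2_base_score(av, ac, au, conf, integ, avail):
    """Compute the CVSS v2 base score from the six base metrics."""
    impact = 10.41 * (1 - (1 - IMPACT_WEIGHT[conf])
                        * (1 - IMPACT_WEIGHT[integ])
                        * (1 - IMPACT_WEIGHT[avail]))
    exploitability = (20 * ACCESS_VECTOR[av]
                         * ACCESS_COMPLEXITY[ac]
                         * AUTHENTICATION[au])
    f_impact = 0.0 if impact == 0 else 1.176
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f_impact, 1)

# The classic worst-case vector AV:N/AC:L/Au:N/C:C/I:C/A:C scores 10.0:
assert cvss2_base_score("network", "low", "none",
                        "complete", "complete", "complete") == 10.0
```

Scanners with the "Authenticated Vulnerability Scanner" capability use scores like this one to rank findings by possible impact.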

SCAP Validation Program

Security programs overseen by NIST focus on working with government and industry to establish more secure systems and networks by developing, managing and promoting security assessment tools, techniques, services, and supporting programs for testing, evaluation and validation. These programs address such areas as: development and maintenance of security metrics; security evaluation criteria and evaluation methodologies; tests and test methods; security-specific criteria for laboratory accreditation; guidance on the use of evaluated and tested products; research to address assurance methods and system-wide security and assessment methodologies; security protocol validation activities; and appropriate coordination with assessment-related activities of voluntary industry standards bodies and other assessment regimes.

Independent third party testing assures the customer/user that the product meets the NIST specifications. The SCAP standards can be complex, and several configurations must be tested for each component and capability to ensure that the product meets the requirements. A third-party lab, accredited by the National Voluntary Laboratory Accreditation Program (NVLAP), provides assurance that the product has been thoroughly tested and has been found to meet all of the requirements.

A vendor seeking validation of a product that implements a SCAP component (CVE, CCE, CPE, CVSS, XCCDF or OVAL) or capability (Federal Desktop Core Configuration (FDCC) Scanner, Authenticated Configuration Scanner, Authenticated Vulnerability Scanner, Unauthenticated Vulnerability Scanner, Intrusion Detection and Prevention, Patch Remediation, Mis-configuration Remediation, Asset Management, Asset Database, Vulnerability Database, Mis-configuration Database or Malware Tool) should contact an NVLAP accredited SCAP validation laboratory for assistance in the validation process. A customer who is subject to the FISMA requirements, or who wants to use security products that have been tested and validated to the SCAP standard by an independent third party laboratory, should visit the SCAP validated products web page [7] to verify the status of the product(s) being considered.

External links
• Security Content Automation Protocol web site [8]
• National Vulnerability Database web site [9]

References
[1] http://cve.mitre.org/
[2] http://cce.mitre.org/
[3] http://cpe.mitre.org/
[4] http://nvd.nist.gov/cvss.cfm?version=2
[5] http://nvd.nist.gov/xccdf.cfm
[6] http://csrc.nist.gov/groups/SMA/fisma/index.html
[7] http://nvd.nist.gov/scapproducts.cfm
[8] http://scap.nist.gov
[9] http://nvd.nist.gov

Threat (computer)

In computer security, a threat is a possible danger that might exploit a vulnerability to breach security and thus cause possible harm. A threat can be either "intentional" (i.e. intelligent; e.g. an individual cracker or a criminal organization) or "accidental" (e.g. the possibility of a computer malfunctioning, or the possibility of an "act of God" such as an earthquake, a fire, or a tornado) or otherwise a circumstance, capability, action, or event.[1]

Definitions

ISO 27005 defines threat as:[2]
  A potential cause of an incident, that may result in harm to systems and organization.

A more comprehensive definition, tied to an information assurance point of view, can be found in "Federal Information Processing Standards (FIPS) 200, Minimum Security Requirements for Federal Information and Information Systems" by NIST of the United States of America:[3]
  Any circumstance or event with the potential to adversely impact organizational operations (including mission, functions, image, or reputation), organizational assets, or individuals through an information system via unauthorized access, destruction, disclosure, modification of information, and/or denial of service. Also, the potential for a threat-source to successfully exploit a particular information system vulnerability.

The National Information Assurance Glossary defines threat as:
  Any circumstance or event with the potential to adversely impact an IS through unauthorized access, destruction, disclosure, modification of data, and/or denial of service.

ENISA gives a similar definition:[4]
  Any circumstance or event with the potential to adversely impact an asset [G.3] through unauthorized access, destruction, disclosure, modification of data, and/or denial of service.

The Open Group defines threat as:[5]
  Anything that is capable of acting in a manner resulting in harm to an asset and/or organization; for example, acts of God (weather, geological events, etc.), malicious actors, errors, failures.

Factor Analysis of Information Risk defines threat as:[6]
  Threats are anything (e.g. object, substance, human, etc.) that are capable of acting against an asset in a manner that can result in harm. A tornado is a threat, as is a flood, as is a hacker. The key consideration is that threats apply the force (water, wind, exploit code, etc.) against an asset that can cause a loss event to occur.

A threat is a potential for harm. The presence of a threat does not mean that it will necessarily cause actual harm. Threats exist because of the very existence of the system or activity and not because of any specific weakness. For example, the threat of fire exists at all facilities regardless of the amount of fire protection available.

The National Information Assurance Training and Education Center gives a more articulated definition of threat:[7] [8]
  1. The means through which the ability or intent of a threat agent to adversely affect an automated system, facility, or operation can be manifest. Categorize and classify threats as follows:
       Categories        Classes
       Human             Intentional, Unintentional
       Environmental     Natural, Fabricated
  2. Any circumstance or event with the potential to cause harm to a system in the form of destruction, disclosure, modification of data, and/or denial of service.
  3. Any circumstance or event with the potential to cause harm to the ADP system or activity in the form of destruction, disclosure, and modification of data, and/or denial of service.
  4. Types of computer systems related adverse events (i.e. perils) that may result in losses. Examples are flooding, sabotage and fraud.

  5. An assertion primarily concerning entities of the external environment (agents); we say that an agent (or class of agents) poses a threat to one or more assets; we write: T(e;i), where e is an external entity and i is an internal entity or an empty set.
  6. An undesirable occurrence that might be anticipated but is not the result of a conscious act or decision. In threat analysis, a threat is defined as an ordered pair, <peril; asset category>, suggesting the nature of these occurrences but not the details (details are specific to events).
  7. A potential violation of security.
  8. A set of properties of a specific external entity (which may be either an individual or class of entities) that, in union with a set of properties of a specific internal entity, implies a risk (according to some body of knowledge).

Phenomenology

The term "threat" relates to some other basic security terms as shown in the following diagram:[1]

 + - - - - - - - - - - - - +  + - - - - +  + - - - - - - - - - - -+
 | An Attack:              |  |Counter- |  | A System Resource:   |
 | i.e., A Threat Action   |  | measure |  | Target of the Attack |
 | +----------+            |  |         |  | +-----------------+  |
 | | Attacker |<==================||<=========                 |  |
 | |  i.e.,   |   Passive  |  |         |  | |  Vulnerability  |  |
 | | A Threat |<=================>||<========>                 |  |
 | |  Agent   |  or Active |  |         |  | +-------|||-------+  |
 | +----------+   Attack   |  |         |  |         VVV          |
 |                         |  |         |  | Threat Consequences  |
 + - - - - - - - - - - - - +  + - - - - +  + - - - - - - - - - - -+

A resource (either physical or logical) can have one or more vulnerabilities that can be exploited by a threat agent in a threat action. The result can potentially compromise the Confidentiality, Integrity or Availability properties of resources (potentially different from the vulnerable one) of the organization and of other involved parties (customers, suppliers). The so-called CIA triad is the basis of Information Security. The attack can be active, when it attempts to alter system resources or affect their operation, compromising Integrity or Availability. A "passive attack" attempts to learn or make use of information from the system but does not affect system resources, compromising Confidentiality.[1]

OWASP (see figure) depicts the same phenomenon in slightly different terms: a threat agent, through an attack vector, exploits a weakness (vulnerability) of the system and the related security controls, causing a technical impact on an IT resource (asset) connected to a business impact.[9] The overall picture represents the risk factors of the risk scenario.[9]

[Figure: OWASP – relationship between threat agent and business impact]

Information Security Management Systems (ISMS), sets of policies concerned with information security management, have been developed to manage, according to risk management principles, the countermeasures needed to accomplish a security strategy set up following the rules and regulations applicable in a country. Countermeasures are also called security controls; when applied to the transmission of information, they are named security services.[10]

The widespread dependence on computer systems, and the consequent rise in the consequences of a successful attack, led to a new term: cyberwarfare.
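The ordered-pair notation T(e;i) above, together with the active/passive attack split over the CIA triad, can be sketched in code. This is a hypothetical illustration; the class and function names are invented here and are not part of any cited standard:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class CIA(Enum):
    CONFIDENTIALITY = "confidentiality"
    INTEGRITY = "integrity"
    AVAILABILITY = "availability"

@dataclass(frozen=True)
class Threat:
    """The ordered pair T(e; i): an external entity e posing the threat,
    and an internal entity i, where None stands for the empty set."""
    external_entity: str
    internal_entity: Optional[str] = None

def properties_at_risk(active_attack: bool) -> set:
    """Per the text: an active attack compromises Integrity or
    Availability; a passive attack compromises Confidentiality."""
    if active_attack:
        return {CIA.INTEGRITY, CIA.AVAILABILITY}
    return {CIA.CONFIDENTIALITY}

t = Threat(external_entity="criminal organization")  # T(e; empty set)
assert properties_at_risk(active_attack=False) == {CIA.CONFIDENTIALITY}
```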

It should be noted that nowadays many real attacks exploit psychology at least as much as technology. Phishing, pretexting and other methods are called social engineering techniques.[11] Web 2.0 applications, specifically social network services, can be a means to get in touch with people in charge of system administration or even system security, inducing them to reveal sensitive information.[12] One famous case is Robin Sage.[13]

The most widespread documentation on computer insecurity is about technical threats such as computer viruses, trojans and other malware, but a serious study to apply cost-effective countermeasures can only be conducted following a rigorous IT risk analysis in the framework of an ISMS: a pure technical approach will leave out the psychological attacks, which are increasing threats.

Threats classification

Threats can be classified according to their type and origin:[2]
• Type
  • Physical damage
    • fire
    • water
    • pollution
  • Natural events
    • climatic
    • seismic
    • volcanic
  • Loss of essential services
    • electrical power
    • air conditioning
    • telecommunication
  • Compromise of information
    • eavesdropping
    • theft of media
    • retrieval of discarded materials
  • Technical failures
    • equipment
    • software
    • capacity saturation
  • Compromise of functions
    • error in use
    • abuse of rights
    • denial of actions
• Origin
  • Deliberate: aiming at information asset
    • spying
    • illegal processing of data
  • Accidental
    • equipment failure
    • software failure
  • Environmental

    • natural event
    • loss of power supply

Note that a threat type can have multiple origins.

The spread of threats over a network can lead to dangerous situations. In military and civil fields, threat levels have been defined: for example, INFOCON is a threat level used by the USA. Leading antivirus software vendors publish global threat levels on their websites.[15] [16]

Threat model

People can be interested in studying all possible threats that can:
• affect an asset,
• affect a software system,
• be brought by a threat agent.

Threat classification

Microsoft has proposed a threat classification called STRIDE,[14] from the initials of the threat categories:
• Spoofing of user identity
• Tampering
• Repudiation
• Information disclosure (privacy breach or data leak)
• Denial of service (DoS)
• Elevation of privilege

Microsoft previously rated the risk of security threats using five categories in a classification called DREAD: Risk assessment model; the model is now considered obsolete by Microsoft. The categories were:
• Damage - how bad would an attack be?
• Reproducibility - how easy is it to reproduce the attack?
• Exploitability - how much work is it to launch the attack?
• Affected users - how many people will be impacted?
• Discoverability - how easy is it to discover the threat?
The DREAD name comes from the initials of the five categories listed.

Associated terms

Threat Agents

Threat Agents: Individuals within a threat population. Practically anyone and anything can, under the right circumstances, be a threat agent – the well-intentioned, but inept, computer operator who trashes a daily batch job by typing the wrong command, the regulator performing an audit, or the squirrel that chews through a data cable.[17]

The term Threat Agent is used to indicate an individual or group that can manifest a threat. It is fundamental to identify who would want to exploit the assets of a company, and how they might use them against the company.[17]

Threat Agent = Capabilities + Intentions + Past Activities

These individuals and groups can be classified as follows:[17]
• Non-Target Specific: Non-Target Specific Threat Agents are computer viruses, worms, trojans and logic bombs.
• Employees: Staff, contractors, operational/maintenance personnel, or security guards who are annoyed with the company.
• Organized Crime and Criminals: Criminals target information that is of value to them, such as bank accounts, credit cards or intellectual property that can be converted into money. Criminals will often make use of insiders to help them.
• Corporations: Corporations are engaged in offensive information warfare or competitive intelligence. Partners and competitors come under this category.
• Human, Unintentional: Accidents, carelessness.
• Human, Intentional: Insider, outsider.
• Natural: Flood, fire, lightning, meteor, earthquakes.

Threat agents can take one or more of the following actions against an asset:[6]
• Access – simple unauthorized access
• Misuse – unauthorized use of assets (e.g. identity theft, setting up a porn distribution service on a compromised server, etc.)
• Disclose – the threat agent illicitly discloses sensitive information
• Modify – unauthorized changes to an asset
• Deny access – includes destruction, theft of a non-data asset, etc.

It's important to recognize that each of these actions affects different assets differently, which drives the degree and nature of loss. For example, the potential for productivity loss resulting from a destroyed or stolen asset depends upon how critical that asset is to the organization's productivity. If a critical asset is simply illicitly accessed, there is no direct productivity loss. Similarly, the destruction of a highly sensitive asset that doesn't play a critical role in productivity won't directly result in a significant productivity loss. Yet that same asset, if disclosed, can result in significant loss of competitive advantage or reputation, and generate legal costs. The point is that it's the combination of the asset and the type of action against the asset that determines the fundamental nature and degree of loss. Which action(s) a threat agent takes will be driven primarily by that agent's motive (e.g. financial gain, revenge, recreation, etc.) and the nature of the asset. For example, a threat agent bent on financial gain is less likely to destroy a critical server than to steal an easily pawned asset like a laptop.[6]

It is important to separate the concept of the event in which a threat agent gets in contact with the asset (even virtually, i.e. through the network) from the event in which a threat agent acts against the asset.[6]

OWASP collects a list of potential threat agents in order to prevent system designers and programmers from inserting vulnerabilities in the software.[17]

Threat Communities

Threat Communities: Subsets of the overall threat agent population that share key characteristics. The notion of threat communities is a powerful tool for understanding who and what we're up against as we try to manage risk. For example, the probability that an organization would be subject to an attack from the terrorist threat community would depend in large part on the characteristics of your organization relative to the motives, intents, and capabilities of the terrorists. Is the organization closely affiliated with an ideology that conflicts with known, active terrorist groups? Does the organization represent a high-profile, high-impact target? Is the organization a soft target? How does the organization compare with other potential targets? If the organization were to come under attack, what components of the organization would be likely targets? For example, how likely is it that terrorists would target the company information or systems?[6]

The following threat communities are examples of the human malicious threat landscape many organizations face:
• Internal
  • Employees
  • Contractors (and vendors)
  • Partners
• External
  • Cyber-criminals (professional hackers)
  • Spies
  • Non-professional hackers
  • Activists
  • Nation-state intelligence services (e.g. counterparts to the CIA)
  • Malware (virus/worm/etc.) authors
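The DREAD categories described earlier were scored numerically and combined into a single risk value. As a sketch, the 1-10 range and the plain average below are common conventions, not something mandated by the (now obsolete) Microsoft model itself:

```python
from statistics import mean

# Hypothetical sketch of a DREAD-style rating. Each of the five
# categories gets a score (a 1-10 scale is assumed here), and the
# overall risk is taken as their mean.

def dread_score(damage, reproducibility, exploitability,
                affected_users, discoverability):
    ratings = [damage, reproducibility, exploitability,
               affected_users, discoverability]
    if not all(1 <= r <= 10 for r in ratings):
        raise ValueError("each DREAD rating must be between 1 and 10")
    return mean(ratings)

# e.g. a very damaging, easily reproduced and discovered threat:
risk = dread_score(damage=8, reproducibility=10, exploitability=7,
                   affected_users=9, discoverability=10)
assert risk == 8.8
```

A higher aggregate score ranks the threat higher for remediation; the weighting scheme is a design choice that teams varied in practice.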

Threat action

Threat action is an assault on system security. A complete security architecture deals with both intentional acts (i.e. attacks) and accidental events.[18] Various kinds of threat actions are defined as subentries under "threat consequence".

Threat analysis

Threat analysis is the analysis of the probability of occurrences and consequences of damaging actions to a system.[1] It is the basis of risk analysis.

Threat consequence

Threat consequence is a security violation that results from a threat action.[1] It includes disclosure, deception, disruption, and usurpation. The following subentries describe the four kinds of threat consequences, and also list and describe the kinds of threat actions that cause each consequence.[1] Threat actions that are accidental events are marked by "*".

"(Unauthorized) Disclosure" (a threat consequence)
A circumstance or event whereby an entity gains access to data for which the entity is not authorized. (See: data confidentiality.) The following threat actions can cause unauthorized disclosure:
  "Exposure"
  A threat action whereby sensitive data is directly released to an unauthorized entity. This includes:
    "Deliberate Exposure"
    Intentional release of sensitive data to an unauthorized entity.
    "Scavenging"
    Searching through data residue in a system to gain unauthorized knowledge of sensitive data.
    * "Human error"
    Human action or inaction that unintentionally results in an entity gaining unauthorized knowledge of sensitive data.
    * "Hardware/software error"
    System failure that results in an entity gaining unauthorized knowledge of sensitive data.
  "Interception"
  A threat action whereby an unauthorized entity directly accesses sensitive data travelling between authorized sources and destinations. This includes:
    "Theft"

    Gaining access to sensitive data by stealing a shipment of a physical medium, such as a magnetic tape or disk, that holds the data.
    "Wiretapping (passive)"
    Monitoring and recording data that is flowing between two points in a communication system. (See: wiretapping.)
    "Emanations analysis"
    Gaining direct knowledge of communicated data by monitoring and resolving a signal that is emitted by a system and that contains the data but is not intended to communicate the data. (See: emanation.)
  "Inference"
  A threat action whereby an unauthorized entity indirectly accesses sensitive data (but not necessarily the data contained in the communication) by reasoning from characteristics or byproducts of communications. This includes:
    "Traffic analysis"
    Gaining knowledge of data by observing the characteristics of communications that carry the data.
    "Signals analysis"
    Gaining indirect knowledge of communicated data by monitoring and analyzing a signal that is emitted by a system and that contains the data but is not intended to communicate the data. (See: emanation.)
  "Intrusion"
  A threat action whereby an unauthorized entity gains access to sensitive data by circumventing a system's security protections. This includes:
    "Trespass"
    Gaining unauthorized physical access to sensitive data by circumventing a system's protections.
    "Penetration"
    Gaining unauthorized logical access to sensitive data by circumventing a system's protections.
    "Reverse engineering"
    Acquiring sensitive data by disassembling and analyzing the design of a system component.
    "Cryptanalysis"
    Transforming encrypted data into plain text without having prior knowledge of encryption parameters or processes.

"Deception" (a threat consequence)
A circumstance or event that may result in an authorized entity receiving false data and believing it to be true. The following threat actions can cause deception:
  "Masquerade"
  A threat action whereby an unauthorized entity gains access to a system or performs a malicious act by posing as an authorized entity. This includes:
    "Spoof"
    Attempt by an unauthorized entity to gain access to a system by posing as an authorized user.
    "Malicious logic"
    In context of masquerade, any hardware, firmware, or software (e.g. Trojan horse) that appears to perform a useful or desirable function, but actually gains unauthorized access to system resources or tricks a user into executing other malicious logic.
  "Falsification"
  A threat action whereby false data deceives an authorized entity. (See: active wiretapping.) This includes:
    "Substitution"
    Altering or replacing valid data with false data that serves to deceive an authorized entity.
    "Insertion"
    Introducing false data that serves to deceive an authorized entity.
  "Repudiation"
  A threat action whereby an entity deceives another by falsely denying responsibility for an act. This includes:
    "False denial of origin"
    Action whereby the originator of data denies responsibility for its generation.
    "False denial of receipt"
    Action whereby the recipient of data denies receiving and possessing the data.

"Disruption" (a threat consequence)
A circumstance or event that interrupts or prevents the correct operation of system services and functions. (See: denial of service.) The following threat actions can cause disruption:
  "Incapacitation"
  A threat action that prevents or interrupts system operation by disabling a system component. This includes:
    "Malicious logic"
    In context of incapacitation, any hardware, firmware, or software (e.g. logic bomb) intentionally introduced into a system to destroy system functions or resources.
    "Physical destruction"
    Deliberate destruction of a system component to interrupt or prevent system operation.
    * "Human error"
    Action or inaction that unintentionally disables a system component.
    * "Hardware or software error"
    Error that causes failure of a system component and leads to disruption of system operation.
    * "Natural disaster"
    Any "act of God" (e.g. fire, flood, earthquake, lightning, or wind) that disables a system component.
  "Corruption"
  A threat action that undesirably alters system operation by adversely modifying system functions or data. This includes:
    "Tamper"
    In context of corruption, deliberate alteration of a system's logic, data, or control information to interrupt or prevent correct operation of system functions.
    "Malicious logic"
    In context of corruption, any hardware, firmware, or software (e.g. a computer virus) intentionally introduced into a system to modify system functions or data.
    * "Human error"

or firmware of a system component. * "Natural disaster" Any "act of God" (e. "Violation of permissions" Action by an entity that exceeds the entity's system privileges by executing an unauthorized function. "Overload" Hindrance of system operation by placing excess burden on the performance capabilities of a system component. "Interference" Disruption of system operations by blocking communications or user data or control information. software. "Theft of functionality" Unauthorized acquisition of actual hardware.g. The following threat actions can cause usurpation: "Misappropriation" A threat action whereby an entity assumes unauthorized logical or physical control of a system resource. "Malicious logic" In context of misuse. "Misuse" A threat action that causes a system component to perform a function or service that is detrimental to system security. (See: flooding. any hardware. power surge caused by lightning) that alters system functions or data.[18] "Obstruction" A threat action that interrupts delivery of system services by hindering system operations.) "Usurpation" (a threat consequence) A circumstance or event that results in control of system services or functions by an unauthorized entity. "Theft of service" Unauthorized use of service by an entity. "Tamper" In context of misuse.. or control information to cause the system to perform unauthorized functions or services. software. 215 . "Theft of data" Unauthorized acquisition and use of data. data. deliberate alteration of a system's logic. or firmware intentionally introduced into a system to perform or control execution of an unauthorized function or service. * "Hardware or software error" Error that results in the alteration of system functions or data.Threat (computer) Human action or inaction that unintentionally results in the alteration of system functions or data.

Version 2. intrusion detection system and anti-virus software. maintain and recover business-critical processes and systems. second edition. November 2006 (http:/ / www. References [1] Internet Engineering Task Force RFC 2828 Internet Security Glossary [2] ISO/IEC. Handbook of INFOSEC Terms. Very large organizations tend to adopt business continuity management plans in order to protect. riskmanagementinsight. 257 ISBN 978-0-12-374354-1 [10] ISACA THE RISK IT FRAMEWORK (registration required) (http:/ / www. isaca. eu/ act/ rm/ cr/ risk-management-inventory/ glossary#G51 ENISA Glossary threat [5] Technical Standard Risk Taxonomy ISBN 1-931624-77-1 Document Number: C081 Published by The Open Group. html) [14] Uncover Security Design Flaws Using The STRIDE Approach (http:/ / msdn. Information security awareness generates quite a large business: (see the category:Computer security companies). com/ en-us/ magazine/ cc163519. info/ Glossary.1040 pages ISBN 978-0-470. standards and methodologies. pdf) [11] Security engineering:a guide to building dependable distributed systems. europa. Minimum Security Requirements for Federal Information and Information Systems (http:/ / csrc. pdf). Ross Anderson. Wiley. org/ Knowledge-Center/ Research/ Documents/ RiskIT-FW-18Nov09-Research. eweek. CD-ROM (Idaho State University & Information Systems Security Organization) [8] NIATEC Glossary (http:/ / niatec. gov/ publications/ fips/ fips200/ FIPS-200-final-march. Physical Security measures. nist. Corey (1996). aspx?term=5652& alpha=T) [9] Wright. enisa. Jim Harmening (2009) "15" Computer and Information Security Handbook Morgan Kaufmann Pubblications Elsevier Inc p. page 17 [12] Eweek Using Facebook to Social Engineer Your Way Around Security (http:/ / www. Joe. 
com/ c/ a/ Security/ Social-Engineering-Your-Way-Around-Security-With-Facebook-277803/ ) [13] Networkworld Social engineering via Social networking (http:/ / www.Threat (computer) 216 Threat management Threats should be managed by operating an ISMS. [7] Schou. Chapter 2.Security tecniques-Information security risk management" ISO/IEC FIDIS 27005:2008 [3] Federal Information Processing Standards (FIPS) 200. Some of these plans foreseen to set up computer security incident response team (CSIRT) or computer emergency response team (CERT) There are some kind of verification of the threat management process: • Information security audit • Penetration test Most organizations perform a subset of these steps. A lot of software has been developed to deal with IT threats: • Open source software • see the category category:free security software • Proprietary • see the category category:computer security software companies for a partial list Threat literature Well respected authors have published books on threats or computer security (see category:computer security books: Hacking: The Art of Exploitation Second Edition is a good example. aspx) [15] McAfee lab page (http:/ / www. microsoft. html) . policies and procedures such as regular backups and configuration hardening. 2008 . networkworld. training such as security awareness education. adopting countermeasures based on a non systematic approach: Computer insecurity studies the battlefield of computer security exploits and defences that results.0. Countermeasures may include tools such as firewalls. [6] "An Introduction to Factor Analysis of Information Risk (FAIR)".06852-6. com/ us/ mcafee_labs/ gti. "Information technology -. com/ newsletters/ sec/ 2010/ 100410sec1. com/ media/ docs/ FAIR_introduction. January 2009. mcafee. performing all the IT risk management activities foreseen by laws. Risk Management Insight LLC. pdf) [4] http:/ / www.

[16] Symantec ThreatCon (http://www.symantec.com/security_response/threatconlearn.jsp)
[17] OWASP Threat agents categorization (http://www.owasp.org/index.php/Category:Threat_Agent)
[18] FIPS PUB 31, FEDERAL INFORMATION PROCESSING STANDARDS PUBLICATION, JUNE 1974 (http://www.tricare.mil/tmis_new/Policy\Federal\fips31.pdf)

External links
• Term in FISMApedia (http://fismapedia.org/index.php?title=Term:Threat)

Vulnerability (computing)

In computer security, a vulnerability is a weakness which allows an attacker to reduce a system's information assurance.[1] Vulnerability is the intersection of three elements: a system susceptibility or flaw, attacker access to the flaw, and attacker capability to exploit the flaw. To be vulnerable, an attacker must have at least one applicable tool or technique that can connect to a system weakness. In this frame, vulnerability is also known as the attack surface.

Vulnerability management is the cyclical practice of identifying, classifying, remediating, and mitigating vulnerabilities.[2] This practice generally refers to software vulnerabilities in computing systems.

A security risk may be classified as a vulnerability, but using vulnerability with the same meaning as risk can lead to confusion. The risk is tied to the potential of a significant loss, and there are vulnerabilities without risk: for example, when the affected asset has no value. A vulnerability with one or more known instances of working and fully-implemented attacks is classified as an exploitable vulnerability: a vulnerability for which an exploit exists. The window of vulnerability is the time from when the security hole was introduced or manifested in deployed software to when access was removed, a security fix was available/deployed, or the attacker was disabled.

Security bug is a narrower concept: there are vulnerabilities that are not related to software; hardware, site, and personnel vulnerabilities are examples of vulnerabilities that are not software security bugs. Constructs in programming languages that are difficult to use properly can be a large source of vulnerabilities.

Definitions

ISO 27005 defines vulnerability as:[3]
A weakness of an asset or group of assets that can be exploited by one or more threats
where an asset is anything that has value to the organization, its business operations and their continuity, including information resources that support the organization's mission.[4]

IETF RFC 2828 defines vulnerability as:[5]
A flaw or weakness in a system's design, implementation, or operation and management that could be exploited to violate the system's security policy

The Committee on National Security Systems of the United States of America defined vulnerability in CNSS Instruction No. 4009, dated 26 April 2010, the National Information Assurance Glossary:[6]
Vulnerability - Weakness in an IS, system security procedures, internal controls, or implementation that could be exploited

Many NIST publications define vulnerability in the IT context: FISMApedia[7] provides a list of the term's uses.[8] Among them, SP 800-30[9] gives a broader one:
A flaw or weakness in system security procedures, design, implementation, or internal controls that could be exercised (accidentally triggered or intentionally exploited) and result in a security breach or a violation of the system's security policy

ENISA defines vulnerability in[10] as:
The existence of a weakness, design, or implementation error that can lead to an unexpected, undesirable event [G.11] compromising the security of the computer system, network, application, or protocol involved. (ITSEC)

The Open Group defines vulnerability in[11] as:
The probability that threat capability exceeds the ability to resist the threat.

Factor Analysis of Information Risk (FAIR) defines vulnerability as:[12]
The probability that an asset will be unable to resist the actions of a threat agent
According to FAIR, vulnerability is related to Control Strength, i.e. the strength of a control as compared to a standard measure of force, and the threat Capabilities, i.e. the probable level of force that a threat agent is capable of applying against an asset.

ISACA defines vulnerability in the Risk It framework as:
A weakness in design, implementation, operation or internal control

Data and Computer Security: Dictionary of standards concepts and terms (authors Dennis Longley and Michael Shain, Stockton Press, ISBN 0-935859-17-9) defines vulnerability as:
1) In computer security, a weakness in automated systems security procedures, administrative controls, Internet controls, etc., that could be exploited by a threat to gain unauthorized access to information or to disrupt critical processing. 2) In computer security, a weakness in the physical layout, organization, procedures, personnel, management, administration, hardware or software that may be exploited to cause harm to the ADP system or activity. 3) In computer security, any weakness or flaw existing in a system; the attack or harmful event, or the opportunity available to a threat agent to mount that attack.

Matt Bishop and Dave Bailey[13] give the following definition of computer vulnerability:
A computer system is composed of states describing the current configuration of the entities that make up the computer system. The system computes through the application of state transitions that change the state of the system. All states reachable from a given initial state using a set of state transitions fall into the class of authorized or unauthorized, as defined by a security policy. In this paper, the definitions of these classes and transitions is considered axiomatic. A vulnerable state is an authorized state from which an unauthorized state can be reached using authorized state transitions. A compromised state is the state so reached. An attack is a sequence of authorized state transitions which end in a compromised state. By definition, an attack begins in a vulnerable state. A vulnerability is a characterization of a vulnerable state which distinguishes it from all non-vulnerable states. If generic, the vulnerability may characterize many vulnerable states; if specific, it may characterize only one.

The National Information Assurance Training and Education Center defines vulnerability as:[14] [15]
1. A weakness in automated system security procedures, administrative controls, internal controls, and so forth, that could be exploited by a threat to gain unauthorized access to information or disrupt critical processing. 2. A weakness in system security procedures, hardware design, internal controls, etc., which could be exploited to gain unauthorized access to classified or sensitive information. 3. A weakness in the physical layout, organization, procedures, personnel, management, administration, hardware, or software that may be exploited to cause harm to the ADP system or activity. The presence of a vulnerability does not in itself cause harm; a vulnerability is merely a condition or set of conditions that may allow the ADP system or activity to be harmed by an attack. 4. An assertion primarily concerning entities of the internal environment (assets): we say that an asset (or class of assets) is vulnerable (in some way, possibly involving an agent or collection of agents); we write: V(i,e) where: e may be an empty set. 5. Susceptibility to various threats. 6. A set of properties of a specific internal entity that, in union with a set of properties of a specific external entity, implies a risk. 7. The characteristics of a system which cause it to suffer a definite degradation (incapability to perform the designated mission) as a result of having been subjected to a certain level of effects in an unnatural (manmade) hostile environment.

Phenomenology

The term "vulnerability" relates to some other basic security terms as shown in the following diagram:[5]

      + - - - - - - - - - - - - +  + - - - - +  + - - - - - - - - - - -+
      | An Attack:              |  |Counter- |  | A System Resource:   |
      | i.e., A Threat Action   |  | measure |  | Target of the Attack |
      | +----------+            |  |         |  | +-----------------+  |
      | | Attacker |<==================||<=========                 |  |
      | |   i.e.,  |   Passive  |  |         |  | |  Vulnerability  |  |
      | | A Threat |<=================>||<========>                 |  |
      | |  Agent   |  or Active |  |         |  | +-------|||-------+  |
      | +----------+   Attack   |  |         |  |         VVV          |
      |                         |  |         |  | Threat Consequences  |
      + - - - - - - - - - - - - +  + - - - - +  + - - - - - - - - - - -+

A resource (either physical or logical) can have one or more vulnerabilities that can be exploited by a threat agent in a threat action. The result can potentially compromise the Confidentiality, Integrity or Availability properties of resources (potentially different than the vulnerable one) of the organization and of other involved parties (customers, suppliers). The so-called CIA triad is the basis of Information Security. An attack can be active, when it attempts to alter system resources or affect their operation, compromising Integrity or Availability; a "passive attack" attempts to learn or make use of information from the system but does not affect system resources, compromising Confidentiality.[5]

OWASP (see figure: relationship between threat agent and business impact) depicts the same phenomenon in slightly different terms: a threat agent through an attack vector exploits a weakness (vulnerability) of the system and the related security controls, causing a technical impact on an IT resource (asset) connected to a business impact.[16] The overall picture represents the risk factors of the risk scenario.[17]

Countermeasures are also called security controls; when applied to the transmission of information they are named security services. Information Security Management Systems (ISMS), a set of policies concerned with information security management, has been developed to manage, according to Risk management principles, the countermeasures in order to accomplish a security strategy set up following the rules and regulations applicable in a country.
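Bishop and Bailey's state-machine definition quoted above can be made concrete with a small sketch: given a set of authorized state transitions and a policy marking each state authorized or unauthorized, the vulnerable states are exactly the authorized states from which an unauthorized state is reachable using those transitions. The state names and transitions below are invented purely for illustration.

```python
from collections import deque

# Toy system: states and the authorized transitions between them.
# "reading_any_file" is reachable but not authorized by the policy,
# so every authorized state that can reach it is a vulnerable state.
transitions = {
    "logged_out": ["logged_in"],
    "logged_in": ["reading_own_files", "reading_any_file"],
    "reading_own_files": ["logged_in"],
    "reading_any_file": [],          # e.g. a path-traversal slip
}
authorized = {"logged_out", "logged_in", "reading_own_files"}  # the policy

def vulnerable_states(transitions, authorized):
    """Return the authorized states from which an unauthorized state is reachable."""
    result = set()
    for start in authorized:
        seen, queue = {start}, deque([start])
        while queue:                           # breadth-first reachability
            state = queue.popleft()
            for nxt in transitions.get(state, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        if any(s not in authorized for s in seen):
            result.add(start)
    return result

print(sorted(vulnerable_states(transitions, authorized)))
# ['logged_in', 'logged_out', 'reading_own_files']
```

In this toy policy every authorized state is vulnerable, because each can reach the unauthorized state through authorized transitions alone — which is exactly why the definition characterizes vulnerability as a property of states rather than of any single bad transition.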

Classification

Vulnerabilities are classified according to the asset class they are related to:[3]
• hardware
  • susceptibility to humidity
  • susceptibility to dust
  • susceptibility to soiling
  • susceptibility to unprotected storage
• software
  • insufficient testing
  • lack of audit trail
• network
  • unprotected communication lines
  • insecure network architecture
• personnel
  • inadequate recruiting process
  • inadequate security awareness
• site
  • area subject to flood
  • unreliable power source
• organizational
  • lack of regular audits
  • lack of continuity plans

Causes

• Complexity: Large, complex systems increase the probability of flaws and unintended access points.[18]
• Familiarity: Using common, well-known code, software, operating systems, and/or hardware increases the probability an attacker has or can find the knowledge and tools to exploit the flaw.[19]
• Connectivity: More physical connections, privileges, ports, protocols, and services, and the time each of those is accessible, increase vulnerability.[12]
• Password management flaws: The computer user uses weak passwords that could be discovered by brute force. The computer user stores the password on the computer where a program can access it. Users re-use passwords between many programs and websites.
• Fundamental operating system design flaws: The operating system designer chooses to enforce suboptimal policies on user/program management. For example, operating systems with policies such as default permit grant every program and every user full access to the entire computer.[20] This operating system flaw allows viruses and malware to execute commands on behalf of the administrator.[18]
• Internet website browsing: Some internet websites may contain harmful spyware or adware that can be installed automatically on the computer systems. After visiting those websites, the computer systems become infected and personal information will be collected and passed on to third party individuals.
• Software bugs: The programmer leaves an exploitable bug in a software program. The software bug may allow an attacker to misuse an application.[18]
• Unchecked user input: The program assumes that all user input is safe. Programs that do not check user input can allow unintended direct execution of commands or SQL statements (known as buffer overflows, SQL injection or other non-validated inputs).[18] [21]
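The "unchecked user input" cause above is easy to demonstrate with SQL: pasting user input into the query text lets the input rewrite the query, while a parameterized query makes the driver treat it strictly as data. A minimal sketch using Python's built-in sqlite3 module; the users table and its contents are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(name):
    # Vulnerable: untrusted input is spliced into the SQL text, so input like
    # "x' OR '1'='1" changes the query's meaning (SQL injection).
    query = "SELECT secret FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def lookup_safe(name):
    # Parameterized query: the '?' placeholder keeps the input as pure data.
    return conn.execute("SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

malicious = "x' OR '1'='1"
print(lookup_unsafe(malicious))  # [('s3cret',)] - the injected clause leaks every row
print(lookup_safe(malicious))    # [] - no user is literally named "x' OR '1'='1"
```

The same principle — validate or neutralize input before it reaches an interpreter — applies equally to shell commands, format strings and the other non-validated-input flaws listed above.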

• Too feeble a learning system from occurred accidents:[22] [23] for example, most vulnerabilities discovered in IPv4 protocol software were rediscovered in the new IPv6 implementations.[24]

The research has shown that the most vulnerable point in most information systems is the human user, operator, designer, or other human:[25] so humans should be considered in their different roles as asset, threat, and information resource. Social engineering is an increasing security concern.

Vulnerability consequences

The impact of a security breach can be very high. The fact that IT managers, or upper management, can (easily) know that IT systems and applications have vulnerabilities and do not perform any action to manage the IT risk is seen as misconduct in most legislations. Privacy law forces managers to act to reduce the impact or likelihood of that security risk. Information technology security audit is a way to let other independent people certify that the IT environment is managed properly, and lessen the responsibilities, at least by having demonstrated good faith. Penetration test is a form of verification of the weaknesses and countermeasures adopted by an organization: a White hat hacker tries to attack an organization's information technology assets, to find out how easy or difficult it is to compromise the IT security.[26] The proper way to professionally manage the IT risk is to adopt an Information Security Management System, such as ISO/IEC 27002 or Risk IT, and follow it, according to the security strategy set forth by the upper management.[26]

One of the key concepts of information security is the principle of defence in depth, i.e. to set up a multilayer defence system that can:
• prevent the exploit
• detect and intercept the attack
• find out the threat agents and prosecute them

Intrusion detection system is an example of a class of systems used to detect attacks. Physical security is a set of measures to protect the information asset physically: if somebody can get physical access to the information asset, it is quite easy to make resources unavailable to their legitimate users. Some sets of criteria to be satisfied by a computer, its operating system and applications in order to meet a good security level have been developed: ITSEC and Common criteria are two examples.

Vulnerability disclosure

Responsible disclosure of vulnerabilities is a topic of great debate. As reported by The Tech Herald in August 2010, "Google, Microsoft, TippingPoint, and Rapid7 have recently issued guidelines and statements addressing how they will deal with disclosure going forward."[27] A responsible disclosure first alerts the affected vendors confidentially before alerting CERT two weeks later, which grants the vendors another 45 day grace period before publishing a security advisory.[28] A full disclosure is done when all the details of a vulnerability are publicized, perhaps with the intent to put pressure on the software or procedure authors to find a fix urgently.[29] Security researchers catering to the needs of the cyberwarfare or cybercrime industry have stated that the responsible-disclosure approach does not provide them with adequate income for their efforts: instead, they offer their exploits privately to enable zero day attacks.

The never ending effort to find new vulnerabilities and to fix them is called Computer insecurity. Well respected authors have published books on vulnerabilities and how to exploit them: Hacking: The Art of Exploitation Second Edition is a good example.
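Disclosed vulnerabilities are commonly scored with the Common Vulnerability Scoring System (CVSS). As a rough illustration, the CVSS version 2 base score is computed from six metrics; the sketch below assumes the published v2 equations and metric weights, rounding to one decimal place as the standard prescribes.

```python
# CVSS v2 metric weights (standard published values).
AV = {"L": 0.395, "A": 0.646, "N": 1.0}    # Access Vector: Local/Adjacent/Network
AC = {"H": 0.35, "M": 0.61, "L": 0.71}     # Access Complexity: High/Medium/Low
AU = {"M": 0.45, "S": 0.56, "N": 0.704}    # Authentication: Multiple/Single/None
CIA = {"N": 0.0, "P": 0.275, "C": 0.660}   # C/I/A impact: None/Partial/Complete

def cvss2_base(av, ac, au, c, i, a):
    """CVSS v2 base score from the six base metrics."""
    impact = 10.41 * (1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a]))
    exploitability = 20 * AV[av] * AC[ac] * AU[au]
    f = 0.0 if impact == 0 else 1.176       # f(Impact) zeroes out no-impact vectors
    return round((0.6 * impact + 0.4 * exploitability - 1.5) * f, 1)

print(cvss2_base("N", "L", "N", "C", "C", "C"))  # 10.0 - the maximum severity
print(cvss2_base("N", "L", "N", "P", "P", "P"))  # 7.5 - a typical remote flaw
```

The base score is then adjusted by temporal and environmental metrics in the full standard; this sketch covers only the base equation.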

Vulnerability inventory

Mitre Corporation maintains a list of disclosed vulnerabilities in a system called Common Vulnerabilities and Exposures, where vulnerabilities are classified (scored) using the Common Vulnerability Scoring System (CVSS). OWASP collects a list of potential vulnerabilities in order to prevent system designers and programmers from inserting vulnerabilities in the software.[30]

Vulnerability disclosure date

The time of disclosure of a vulnerability is defined differently in the security community and industry. It is most commonly referred to as "a kind of public disclosure of security information by a certain party". Usually, vulnerability information is discussed on a mailing list or published on a security web site and results in a security advisory afterward. The time of disclosure is the first date a security vulnerability is described on a channel where the disclosed information on the vulnerability has to fulfill the following requirements:
• The information is freely available to the public
• The vulnerability information is published by a trusted and independent channel/source
• The vulnerability has undergone analysis by experts such that risk rating information is included upon disclosure

Identifying and removing vulnerabilities

Many software tools exist that can aid in the discovery (and sometimes removal) of vulnerabilities in a computer system. Though these tools can provide an auditor with a good overview of possible vulnerabilities present, they can not replace human judgment. Relying solely on scanners will yield false positives and a limited-scope view of the problems present in the system. Vulnerabilities have been found in every major operating system, including Windows, Mac OS, various forms of Unix and Linux, OpenVMS, and others. The only way to reduce the chance of a vulnerability being used against a system is through constant vigilance, including careful system maintenance (e.g. applying software patches), best practices in deployment (e.g. the use of firewalls and access controls) and auditing (both during development and throughout the deployment lifecycle).

Examples of vulnerabilities

Vulnerabilities are related to:
• physical environment of the system
• the personnel
• management
• administration procedures and security measures within the organization
• business operation and service delivery
• hardware
• software
• communication equipment and facilities
• and their combinations.

It is evident that a pure technical approach cannot even protect physical assets: one should have administrative procedures to let maintenance personnel enter the facilities, and people with adequate knowledge of the procedures, motivated to follow them with proper care; see Social engineering (security).

Four examples of vulnerability exploits:
• an attacker finds and uses an overflow weakness to install malware to export sensitive data

• an attacker convinces a user to open an email message with attached malware
• an insider copies a hardened, encrypted program onto a thumb drive and cracks it at home
• a flood damages your computer systems installed at the ground floor

Software vulnerabilities

Common types of software flaws that lead to vulnerabilities include:
• Memory safety violations, such as:
  • Buffer overflows
  • Dangling pointers
• Input validation errors, such as:
  • Format string attacks
  • Improperly handling shell metacharacters so they are interpreted
  • SQL injection
  • Code injection
  • E-mail injection
  • Directory traversal
  • Cross-site scripting in web applications
  • HTTP header injection
  • HTTP response splitting
• Race conditions, such as:
  • Time-of-check-to-time-of-use bugs
  • Symlink races
• Privilege-confusion bugs, such as:
  • Cross-site request forgery in web applications
  • Clickjacking
  • FTP bounce attack
• Privilege escalation
• User interface failures, such as:
  • Warning fatigue[31] or user conditioning[32]
  • Blaming the victim: prompting a user to make a security decision without giving the user enough information to answer it[33]
  • Race conditions[34] [35]

Some sets of coding guidelines have been developed, and a large number of static code analysers have been used to verify that the code follows the guidelines.
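Among the race conditions listed above, time-of-check-to-time-of-use (TOCTOU) bugs are the classic case: a program checks a file and then uses it, and the file can change between the two steps. The sketch below shows the vulnerable check-then-use pattern and the usual fix — attempt the single operation and handle its failure — without simulating a live race.

```python
import os
import tempfile

def read_checked(path):
    # Vulnerable pattern: between os.path.exists() and open(), an attacker
    # can swap the file (e.g. replace it with a symlink to a secret file).
    if os.path.exists(path):
        with open(path) as f:
            return f.read()
    return None

def read_atomic(path):
    # Safer: skip the separate check, so there is no check/use gap to race.
    try:
        with open(path) as f:
            return f.read()
    except OSError:
        return None

with tempfile.TemporaryDirectory() as d:
    p = os.path.join(d, "data.txt")
    with open(p, "w") as f:
        f.write("ok")
    print(read_atomic(p))                        # 'ok'
    print(read_atomic(os.path.join(d, "gone")))  # None - missing file handled
```

Both functions behave identically in the absence of a race; the difference is that read_atomic leaves no window in which the checked condition can stop being true.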

References
[1] "The Three Tenets of Cyber Security" (http://www.spi.dod.mil/tenets.htm). U.S. Air Force Software Protection Initiative. Retrieved 2009-12-15.
[2] Foreman, P.: Vulnerability Management, Taylor & Francis Group, 2010. ISBN 978-1-4398-0150-5
[3] ISO/IEC, "Information technology - Security techniques - Information security risk management" ISO/IEC FIDIS 27005:2008
[4] British Standard Institute, Information technology - Security techniques - Management of information and communications technology security - Part 1: Concepts and models for information and communications technology security management, BS ISO/IEC 13335-1-2004
[5] Internet Engineering Task Force RFC 2828 Internet Security Glossary
[6] CNSS Instruction No. 4009 (http://www.cnss.gov/Assets/pdf/cnssi_4009.pdf) dated 26 April 2010
[7] FISMApedia, a wiki project devoted to FISMA (http://fismapedia.org/index.php)
[8] FISMApedia Vulnerability term (http://fismapedia.org/index.php?title=Term:Vulnerability)
[9] NIST SP 800-30 Risk Management Guide for Information Technology Systems (http://csrc.nist.gov/publications/nistpubs/800-30/sp800-30.pdf)
[10] Risk Management Glossary: Vulnerability (http://www.enisa.europa.eu/act/rm/cr/risk-management-inventory/glossary#G52)
[11] Technical Standard Risk Taxonomy, ISBN 1-931624-77-1, Document Number: C081, published by The Open Group, January 2009
[12] "An Introduction to Factor Analysis of Information Risk (FAIR)", Risk Management Insight LLC, November 2006 (http://www.riskmanagementinsight.com/media/docs/FAIR_introduction.pdf)
[13] Matt Bishop and Dave Bailey. A Critical Analysis of Vulnerability Taxonomies. Technical Report CSE-96-11, Department of Computer Science at the University of California at Davis, September 1996
[14] Schou, Corey (1996). Handbook of INFOSEC Terms, Version 2.0. CD-ROM (Idaho State University & Information Systems Security Organization)
[15] NIATEC Glossary (http://niatec.info/Glossary.aspx?term=6018&alpha=V)
[16] Wright, Joe; Jim Harmening (2009) "15" Computer and Information Security Handbook, Morgan Kaufmann Publications, Elsevier Inc., p. 257, ISBN 978-0-12-374354-1
[17] ISACA, THE RISK IT FRAMEWORK (registration required) (http://www.isaca.org/Knowledge-Center/Research/Documents/RiskIT-FW-18Nov09-Research.pdf)
[18] Kakareka, Almantas (2009) "23" Computer and Information Security Handbook, Morgan Kaufmann Publications, Elsevier Inc., p. 393, ISBN 978-0-12-374354-1
[19] Krsul, Ivan. Technical Report CSD-TR-97-026, The COAST Laboratory, Department of Computer Sciences, Purdue University, April 15, 1997 (http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.5435&rep=rep1&type=pdf)
[20] http://www.ranum.com/security/computer_security/editorials/dumb/
[21] The Web Application Security Consortium Project, Web Application Security Statistics 2009 (http://projects.webappsec.org/w/page/13246989/Web-Application-Security-Statistics#APPENDIX2ADDITIONALVULNERABILITYCLASSIFICATION)
[22] Ross Anderson. Why Cryptosystems Fail. Technical report, University Computer Laboratory, Cambridge, January 1994
[23] Neil Schlager. When Technology Fails: Significant Technological Disasters, Accidents, and Failures of the Twentieth Century. Gale Research Inc., 1994.
[24] Hacking: The Art of Exploitation Second Edition
[25] Kiountouzis, E. A.; Kokolakis, S. A. Information systems security: facing the information society of the 21st century. London: Chapman & Hall, Ltd. ISBN 0-412-78120-4
[26] Bavisi, Sanjay (2009) "22" Computer and Information Security Handbook, Morgan Kaufmann Publications, Elsevier Inc., p. 375, ISBN 978-0-12-374354-1
[27] The Tech Herald: The new era of vulnerability disclosure - a brief chat with HD Moore (http://www.thetechherald.com/article.php/201033/6025/The-new-era-of-vulnerability-disclosure-a-brief-chat-with-HD-Moore)
[28] Rapid7 Vulnerability Disclosure Policy (http://www.rapid7.com/disclosure.jsp)
[29] Blog post about DLL hijacking vulnerability disclosure (http://blog.rapid7.com/?p=5325)
[30] OWASP vulnerability categorization (http://www.owasp.org/index.php/Category:Vulnerability)
[31] http://www.freedom-to-tinker.com/?p=459
[32] http://www.cs.auckland.ac.nz/~pgut001/pubs/phishing.pdf
[33] http://blog.mozilla.com/rob-sayre/2007/09/28/blaming-the-victim/
[34] http://www.squarefree.com/2004/07/01/race-conditions-in-security-dialogs/
[35] http://lcamtuf.blogspot.com/2010/08/on-designing-uis-for-non-robots.html

Vulnerability (computing) 225

External links
• Security advisories links from the Open Directory: http://www.dmoz.org/Computers/Security/Advisories_and_Patches/
• Languages Standard's group (http://www.aitcnet.org/isai/): Guidance for Avoiding Vulnerabilities through Language Selection and Use
• Microsoft Security Response Center (http://www.microsoft.com/technet/archive/community/columns/security/essays/vulnrbl.mspx): Definition of a Security Vulnerability
• NIST Software Assurance Metrics and Tool Evaluation (SAMATE) project (http://samate.nist.gov/)
• Open Source Vulnerability Database (OSVDB) homepage (http://www.osvdb.org/)
• Open Web Application Security Project (http://www.owasp.org/index.php/Category:Vulnerability)
• Common Vulnerabilities and Exposures (CVE) (http://www.cve.mitre.org/)

Network security

In the field of networking, the area of network security[1] consists of the provisions and policies adopted by the network administrator to prevent and monitor unauthorized access, misuse, modification, or denial of the computer network and network-accessible resources. Network security is the authorization of access to data in a network, which is controlled by the network administrator. Users are assigned an ID and password that allow them access to information and programs within their authority. Network security covers a variety of computer networks, both public and private, that are used in everyday jobs conducting transactions and communications among businesses, government agencies and individuals. Networks can be private, such as within a company, or open to public access. Network security is involved in organizations, enterprises, and all other types of institutions. It does as its title explains: it secures the network, and protects and oversees the operations being done on it.

Network security concepts
Network security starts from authenticating the user, commonly with a username and a password. Since this requires just one thing besides the user name, i.e. the password, which is something you 'know', this is sometimes termed one-factor authentication. With two-factor authentication, something you 'have' is also used (e.g. a security token or 'dongle', an ATM card, or your mobile phone); with three-factor authentication, something you 'are' is also used (e.g. a fingerprint or retinal scan).

Once authenticated, a firewall enforces access policies such as what services are allowed to be accessed by the network users.[2] Though effective in preventing unauthorized access, this component may fail to check potentially harmful content such as computer worms or Trojans being transmitted over the network. Anti-virus software or an intrusion prevention system (IPS)[3] help detect and inhibit the action of such malware. An anomaly-based intrusion detection system may also monitor the network and traffic for unexpected (i.e. suspicious) content or behavior and other anomalies to protect resources, e.g. from denial-of-service attacks or an employee accessing files at strange times. Individual events occurring on the network may be logged for audit purposes and for later high-level analysis.

Communication between two hosts using a network could be encrypted to maintain privacy.

Honeypots, essentially decoy network-accessible resources, could be deployed in a network as surveillance and early-warning tools, as a honeypot will not normally be accessed for legitimate purposes. Techniques used by attackers who attempt to compromise these decoy resources are studied during and after an attack to keep an eye on new exploitation techniques. Such analysis could be used to further tighten security of the actual network being protected by the honeypot.[4]
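The authentication factors described above can be illustrated with a short sketch. The user store, password, and token serial below are entirely hypothetical; the sketch only shows how requiring a second factor narrows what a stolen password alone can do:

```python
import hashlib
import hmac

# Hypothetical user record: a password hash (something the user knows)
# and the serial of an enrolled hardware token (something the user has).
USERS = {
    "alice": {
        "pw_hash": hashlib.sha256(b"correct horse").hexdigest(),
        "token_serial": "TOK-1234",
    }
}

def one_factor(user, password):
    """One-factor authentication: check only the password."""
    record = USERS.get(user)
    if record is None:
        return False
    supplied = hashlib.sha256(password.encode()).hexdigest()
    return hmac.compare_digest(supplied, record["pw_hash"])

def two_factor(user, password, token_serial):
    """Two-factor authentication: the password and the enrolled
    token must both match before access is granted."""
    if not one_factor(user, password):
        return False
    return hmac.compare_digest(token_serial, USERS[user]["token_serial"])
```

A third factor (something the user 'is', such as a fingerprint) would be checked the same way, against enrolled biometric data rather than a stored secret.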

Security management
Security management for networks is different for all kinds of situations. A home or small office would only require basic security, while large businesses will require high-maintenance, advanced software and hardware to prevent malicious attacks from hacking and spamming.

Homes & Small Businesses
• A basic firewall or a unified threat management system.
• For Windows users, basic antivirus software. An anti-spyware program would also be a good idea. There are many other types of antivirus or anti-spyware programs out there to be considered.
• When using a wireless connection, use a robust password. Also try to use the strongest security supported by your wireless devices, such as WPA2 with AES encryption.
• If using wireless: change the default SSID network name, and also disable SSID broadcast. (However, many security experts consider this to be relatively useless. http://blogs.zdnet.com/Ou/index.php?p=43 )
• Enable MAC address filtering to keep track of all home network MAC devices connecting to your router.
• Assign STATIC IP addresses to network devices.
• Disable ICMP ping on the router.
• Review router or firewall logs to help identify abnormal network connections or traffic to the Internet.
• Use passwords for all accounts.
• Have multiple accounts per family member, using non-administrative accounts for day-to-day activities. Disable the guest account (Control Panel > Administrative Tools > Computer Management > Users), as this function is unnecessary for home use.
• Raise awareness about information security to children.[5]

Medium businesses
• A fairly strong firewall or unified threat management system.
• Strong antivirus software and Internet security software.
• For authentication, use strong passwords and change them on a bi-weekly/monthly basis.
• When using a wireless connection, use a robust password.
• Raise awareness about physical security to employees.
• Use an optional network analyzer or network monitor.
• An enlightened administrator or manager.

Large businesses
• A strong firewall and proxy to keep unwanted people out.
• A strong antivirus software package and Internet security software package.
• For authentication, use strong passwords and change them on a weekly/bi-weekly basis.
• When using a wireless connection, use a robust password.
• Exercise physical security precautions for employees.
• Prepare a network analyzer or network monitor and use it when needed.
• Implement physical security management like closed-circuit television for entry areas and restricted zones.
• Security fencing to mark the company's perimeter.
• Fire extinguishers for fire-sensitive areas like server rooms and security rooms.
• Security guards can help to maximize security.
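One item above, reviewing router or firewall logs for abnormal connections, is easy to automate. The sketch below assumes a simplified log line format of "source-IP destination-IP port" separated by spaces; real routers each use their own format:

```python
from collections import Counter

def flag_chatty_hosts(log_lines, threshold=100):
    """Count connections per source host and return the hosts whose
    connection count exceeds the threshold, a crude sign of malware
    or other abnormal traffic on a home network."""
    counts = Counter()
    for line in log_lines:
        source, _dest, _port = line.split()
        counts[source] += 1
    return sorted(host for host, n in counts.items() if n > threshold)
```

The threshold is a judgment call: too low and ordinary browsing is flagged, too high and a slow-spreading worm goes unnoticed.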

School
• An adjustable firewall and proxy to allow authorized users access from the outside and inside.
• Strong antivirus software and Internet security software packages.
• Wireless connections that lead to firewalls.
• Children's Internet Protection Act compliance.
• Supervision of the network to guarantee updates and changes based on popular site usage.
• Constant supervision by teachers, librarians, and administrators to guarantee protection against attacks by both internet and sneakernet sources.

Large government
• A strong firewall and proxy to keep unwanted people out.
• Strong antivirus software and Internet security software suites.
• Strong encryption.
• Whitelist authorized wireless connections, block all else.
• All network hardware is in secure zones.
• All hosts should be on a private network that is invisible from the outside.
• Put web servers in a DMZ, or behind a firewall from the outside and from the inside.
• Security fencing to mark the perimeter, with wireless range set to stay within it.

Further reading
• Cisco (2011). What is network security? Retrieved from http://www.cisco.com/cisco/web/solutions/small_business/resource_center/articles/secure_my_business/what_is_network_security/index.html
• Security of the Internet (http://www.cert.org/encyc_article/tocencyc.html). The Froehlich/Kent Encyclopedia of Telecommunications vol. 15, Marcel Dekker, New York, 1997, pp. 231-255.
• Introduction to Network Security (http://www.interhack.net/pubs/network-security), Matt Curtin.
• Security Monitoring with Cisco Security MARS (http://www.ciscopress.com/bookstore/product.asp?isbn=1587052709), Gary Halleen/Greg Kellogg, Cisco Press, Jul. 2007.
• Self-Defending Networks: The Next Generation of Network Security (http://www.ciscopress.com/bookstore/product.asp?isbn=1587052539), Duane DeCapite, Cisco Press, Sep. 2006.
• Security Threat Mitigation and Response: Understanding CS-MARS (http://www.ciscopress.com/bookstore/product.asp?isbn=1587052601), Dale Tesch/Greg Abelar (http://www.ciscopress.com/authors/bio.asp?a=b411523c-c935-4708-a563-d24de8fcfc71), Cisco Press, Sep. 2006.
• Securing Your Business with Cisco ASA and PIX Firewalls (http://www.ciscopress.com/bookstore/product.asp?isbn=1587052148), Greg Abelar (http://www.ciscopress.com/authors/bio.asp?a=b411523c-c935-4708-a563-d24de8fcfc71), Cisco Press, May 2005.

References
[1] Simmonds, A; Sandilands, P; van Ekert, L (2004). "An Ontology for Network Security Attacks". Lecture Notes in Computer Science 3285: 317-323. doi:10.1007/978-3-540-30176-9_41.
[2] A Role-Based Trusted Network Provides Pervasive Security and Compliance (http://newsroom.cisco.com/dlls/2008/ts_010208b.html?sid=BAC-NewsWire) - interview with Jayshree Ullal, senior VP of Cisco
[3] Dave Dittrich, Network monitoring/Intrusion Detection Systems (IDS) (http://staff.washington.edu/dittrich/network.html), University of Washington.
[4] Honeypots, Honeynets (http://www.honeypots.net)
[5] Julian Fredin, Social software development program Wi-Tech

• Deploying Zone-Based Firewalls (http://www.ciscopress.com/bookstore/product.asp?isbn=1587053101), Ivan Pepelnjak, Cisco Press, Oct. 2006.
• Network Security: PRIVATE Communication in a PUBLIC World, Charlie Kaufman | Radia Perlman | Mike Speciner, Prentice-Hall, 2002.
• Network Infrastructure Security (http://www.springer.com/computer/communications/book/978-1-4419-0165-1), Angus Wong and Alan Yeung, Springer, 2009.

External links
• Cyber Security Network (http://www.cyberwarzone.com)
• Definition of Network Security (http://www.deepnines.com/secure-web-gateway/definition-of-network-security)
• Cisco IT Case Studies (http://www.cisco.com/web/about/ciscoitatwork/case_studies/security.html) about Security and VPN
• Definition of Network Security (http://www.pcmag.com/encyclopedia_term/0,2542,t=network+security&i=47911,00.asp)
• Debate: The data or the source - which is the real threat to network security? - Video (http://www.netevents.tv/docuplayer.asp?docid=102)
• OpenLearn - Network Security (http://openlearn.open.ac.uk/course/view.php?id=2587)

Administrative domain

Definition
An administrative domain is a service provider holding a security repository that permits it to easily authenticate and authorize clients with credentials. This concept is captured by the 'AdminDomain' class of the GLUE information model[1].

Implementation
It may be implemented as a collection of hosts and routers, and the interconnecting network(s), managed by a single administrative authority.

Interoperation between different administrative domains having different security repositories, different security software or different security policies is notoriously difficult. This particularly applies to computer network security. Therefore, administrative domains wishing ad hoc interoperation or full interoperability have to build a federation.

References
[1] http://www.ogf.org/documents/GFD.147.pdf GLUE Specification v. 2.0 (Open Grid Forum)

This article was originally based on material from the Free On-line Dictionary of Computing, which is licensed under the GFDL.

AEGIS SecureConnect 229

AEGIS SecureConnect
AEGIS SecureConnect (or simply 'AEGIS') is a network authentication system used in IEEE 802.1X networks, such as those that use WPA-PSK, WPA-Radius, or certificate-based authentication. It was developed by Meetinghouse Data Communications, Inc. (since acquired by Cisco Systems and renamed "Cisco Secure Services Client"). The AEGIS protocol is an 802.1X "Supplicant" (i.e., it handles authentication for wired and wireless networks) and is commonly installed along with a Network Interface Card's (NIC) drivers.[2]

External links
• Meetinghouse Data Communications [1]
• Cisco Secure Services Client Q&A (Cisco Systems, Inc.) [2]

References
[1] http://www.mtghouse.com/index_home.asp
[2] http://www.cisco.com/en/US/prod/collateral/wireless/ps6442/ps7034/prod_qas0900aecd80507fd8.html

Aladdin Knowledge Systems 230

Aladdin Knowledge Systems
Type: Private
Industry: Security Software & Services
Founded: 1985
Founder(s): Jacob (Yanki) Margalit and Tzvi Popowski
Headquarters: Tel Aviv, Israel
Key people: John Gunn, General Manager North America; Ludger Wilmer, General Manager Europe; Aviram Shemer, CFO
Products: eToken, eSafe, HASP, Hardlock
Revenue: US$105.911 million (2007)
Operating income: $14.888 million (2007)
Net income: $13.9 million (2007)
Employees: 464 [1]
Website: www.Aladdin.com [2]

Aladdin Knowledge Systems (formerly NASDAQ: ALDN [3] and TASE: ALDN [4]) is a company that has produced software for digital rights management and Internet security since 1985. Its corporate headquarters are located in Tel Aviv, Israel.

History
Aladdin Knowledge Systems was founded in 1985 by Jacob (Yanki) Margalit, when he was 23 years old; he was soon joined by his brother Dany Margalit, who took responsibility for product development at the age of 18 while at the same time completing a Mathematics and Computer Science degree at Tel Aviv University. Yanki raised just $10,000 as initial capital for the company. In its early years the company developed two product lines: an artificial intelligence package (which was dropped early on) and a hardware product to prevent unauthorized software copying, similar to digital rights management.[5] The digital rights management product became a success and by 1993 generated sales of $4,000,000. The same year the company had an initial public offering on NASDAQ, raising $7,900,000.[6] In 2004 the company's shares were also listed on the Tel Aviv Stock Exchange.[7] By 2007 the company's annual revenues reached over $105 million.[8]

In mid-2008, Vector Capital was attempting to purchase Aladdin. Vector initially offered $14.50 per share, but Aladdin's founder Margalit refused the offer, arguing that the company was worth more. Aladdin's shareholders agreed on the merger in February 2009 at $11.50 per share, in cash. In March 2009, Vector Capital acquired Aladdin and officially merged it with SafeNet.

Corporate Timeline[1]
• 1985 - Aladdin Knowledge Systems was established
• 1993 - Aladdin held an Initial Public Offering
• 1995 - Aladdin acquired the software protection business of EliaShim
• 1996 - Aladdin acquired the German company FAST
• 1998 - Aladdin patented USB smart card based authentication tokens
• 1999 - Aladdin acquired the eSafe content security business of EliaShim
• 2000 - Aladdin acquired 10% of Comsec
• 2001 - Aladdin acquired the ESD assets of Preview Systems
• 2005 - Aladdin completed a second offering - 2,000,000 shares with net proceeds of $39m
• 2009 - Aladdin was acquired by Vector Capital and merged with Vector's SafeNet

Products
DRM
Aladdin's HASP product line is a digital rights management (DRM) suite of protection and licensing software with 40% global market share, used by over 30,000 software publishers.[9] HASP, which stands for Hardware Against Software Piracy, was the company's first product and evolved into a complete digital rights management suite. It is used across many platforms (Windows, Linux, Mac) and includes a software-only option and a back-office management application.

Internet security
In the late 1990s the company started diversifying and began offering Internet security and network security products, in recent years also with software as a service capability, offering two product lines:

Digital identity management
eToken - a portable device for two-factor authentication, password and digital identity management, mainly deployed as a USB token.

Network security
eSafe - a line of integrated content security solutions protecting networks against malicious, inappropriate and non-productive Internet-borne content.

Vbox
Used by Adobe and Macromedia in their 30-day product trials.

External links
• Aladdin Knowledge Systems website [10]
• Aladdin Prize [11], granted by the Haifa Center of Law & Technology [12]

References
[1] Company Profile for Aladdin Knowledge Systems Ltd (ALDN) (http://www.zenobank.com/index.php?symbol=ALDN&page=quotesearch), zenobank.com, retrieved 2008-10-20.
[2] http://www.aladdin.com
[3] http://quotes.nasdaq.com/asp/SummaryQuote.asp?symbol=ALDN&selected=ALDN
[4] http://www.tase.co.il/TASEEng/Management/GeneralPages/SimpleSearchResult.htm?objectId=&objectType=&securityType=&searchTerm=ALDN
[5] Shohet, Dan. "March Stock of the Month: Aladdin Knowledge Systems (ALDN)" (http://israelnewsletter.com/2008/03/12/march-stock-of-the-month-aladdin-knowledge-systems-aldn/), Israel Opportunity Investor, retrieved 2009-11-16.
[6] "Aladdin completes flotation. (Aladdin Knowledge Systems finishes initial public offering on NASDAQ)" (http://www.highbeam.com/doc/1G1-14668135.html), Israel Business Today, October 29, 1993.
[7] "Aladdin Knowledge Systems to Dual-List on Tel Aviv Stock Exchange" (http://www.highbeam.com/doc/1G1-132339650.html), PR Newswire, July 22, 2004.
[8] Shuster, Shmuel (March 2009). "Aladdin's Yanki Margalit: It's not personal" (http://www.haaretz.com/hasen/spages/1073750.html), Haaretz.
[9] Miller, Zack. Vector completes Aladdin takeover (http://www.ivc-online.com/ivcWeeklyItem.asp?articleID=7672), Globes, 15-09-2008.
[10] http://www.aladdin.com
[11] http://techlaw.haifa.ac.il/techlaw_index.asp?a=1&lang=eng&pos=pages&fname=alladin&fType=htm&show=5&show2=4
[12] http://techlaw.haifa.ac.il/eng

Alert Logic 233

Alert Logic
Alert Logic is a provider of hosted IT network security, security and compliance solutions, founded in 2002 and based in Houston, Texas. Alert Logic is a privately-held corporation. In August 2006, Dallas-based Hunt Ventures LP led a $5 million round of Series B funding for Alert Logic, which included participation from three other investors: DFJ Mercury of Houston, OCA Venture Partners LLC of Chicago, and Denver-based Access Venture Partners.[2]

Alert Logic uses a Software as a Service (SaaS) platform to deliver IT network intrusion protection, vulnerability assessment and improved IT compliance for mid-sized businesses and institutions.[1] Alert Logic provides protection against Internet computer worms, trojan horse programs, botnets, and other IT security threats.[3] Research has shown that firewalls are not sufficient to detect or eliminate all network threats.

Alert Logic's security solution consists of three layers: 1) a security operations center staffed on a full-time basis by certified security analysts who analyze and respond to security incidents on customer networks; 2) a hosted expert system that analyzes, correlates, and automatically mitigates IT security incidents and vulnerabilities; and 3) an on-premise appliance that monitors network traffic and continuously scans the network for threats and vulnerabilities. In addition to protecting individual customer networks, Alert Logic also provides hosted network security for managed service providers and their customers.

The U.S. has experienced an increase in IT network security compliance laws. Since passage of the Health Insurance Portability and Accountability Act of 1996 and 2003, the Gramm-Leach-Bliley Act of 1999, and the Sarbanes-Oxley Act of 2002, penalties for inadequate network security have increased.[4] Alert Logic's reporting capabilities provide midsized companies with improved documentation and overall IT compliance to policies and regulations.[5]

References
[1] Matt Hines, Midmarket Could Push Adoption of Hosted Security Model, eWeek (August 2006). (http://www.eweek.com/article2/0,1759,2004422,00.asp)
[2] Mary Ann Azevedo, Security technology firms lock in fresh rounds of funding, Houston Business Journal, American City Business Journals (July 2006). (http://houston.bizjournals.com/houston/stories/2006/07/24/story8.html)
[3] CERT (July 2003), Malicious Code Propagation and Antivirus Software Updates. (http://www.cert.org/incident_notes/IN-2003-01.html)
[4] Kevin McLaughlin, Alert Logic Courts SMB Partners For On-Demand IPS, CRN (Aug. 2006). (http://www.crn.com/showArticle.jhtml?articleID=192202949)
[5] Bruce Schneier (July/Aug. 2004). (http://www.schneier.com/essay-047.html)

External links
• Alert Logic corporate website (http://www.alertlogic.com)
• Dr. Larry Ponemon, National Survey on the Detection and Prevention of Data Breaches, CSO (Oct. 2006). (http://www.csoonline.com/features/ponemon/ponemon102306.html)
• Linda Rosencrance, 'Brute force' attacks against SMBs on the rise, Computerworld (Aug. 2006). (http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9002162)

Anomaly-based intrusion detection system 234

Anomaly-based intrusion detection system
An anomaly-based intrusion detection system is a system for detecting computer intrusions and misuse by monitoring system activity and classifying it as either normal or anomalous. The classification is based on heuristics or rules, rather than patterns or signatures, and will detect any type of misuse that falls out of normal system operation. This is as opposed to signature-based systems, which can only detect attacks for which a signature has previously been created.[1]

In order to determine what is attack traffic, the system must be taught to recognize normal system activity. This can be accomplished in several ways, most often with artificial intelligence type techniques. Systems using neural networks have been used to great effect. Another method is to define what normal usage of the system comprises using a strict mathematical model, and flag any deviation from this as an attack. This is known as strict anomaly detection.[2]

Anomaly-based intrusion detection does have some short-comings, namely a high false positive rate and the ability to be fooled by a correctly delivered attack. Attempts have been made to address these issues through techniques used by PAYL[1] and McPAD.[3]

References
[1] Wang, Ke, "Anomalous Payload-Based Network Intrusion Detection" (http://dx.doi.org/10.1007/978-3-540-30143-1_11), Recent Advances in Intrusion Detection, Springer Berlin. Retrieved 4/22/2011.
[2] A strict anomaly detection model for IDS, Phrack 56 0x11, Sasha/Beetle (http://artofhacking.com/files/phrack/phrack56/P56-11.TXT)
[3] Perdisci, Roberto, Davide Ariu, Prahlad Fogla, Giorgio Giacinto, and Wenke Lee (2009). "McPAD: A Multiple Classifier System for Accurate Payload-based Anomaly Detection" (http://3407859467364186361-a-1802744773732722657-s-sites.googlegroups.com/site/robertoperdisci/publications/publication-files/McPAD-revision1.pdf?attachauth=ANoY7cracv9VJh1PrdgVG8tSdMRh7AImufRG9pxwHd4gHQuws7RxQXrD4duQNiRXFBPSRouloipzLOAWbDV16ZUkCyICr4RYRsiknMqLo attredirects=0), Computer Networks, Special Issue on Traffic Classification and Its Applications to Modern Networks 5 (6): 864-881.
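The "strict mathematical model" approach described above can be illustrated with a toy detector: learn the mean and standard deviation of some traffic metric from known-normal samples, then flag any observation more than k deviations away. This is only an illustrative sketch of the idea, not the PAYL or McPAD algorithm:

```python
import statistics

class StrictAnomalyDetector:
    """Toy anomaly-based detector: anything outside k standard
    deviations of the trained mean is flagged as an attack."""

    def __init__(self, k=3.0):
        self.k = k
        self.mean = 0.0
        self.std = 1.0

    def train(self, normal_samples):
        """Learn what 'normal' looks like from attack-free traffic."""
        self.mean = statistics.mean(normal_samples)
        self.std = statistics.pstdev(normal_samples) or 1.0

    def is_anomalous(self, value):
        return abs(value - self.mean) > self.k * self.std
```

The shortcomings noted above show up directly in such a model: a legitimate but rare burst of activity raises a false positive, while an attack crafted to stay within k deviations is missed.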

Anti-pharming 235

Anti-pharming
Anti-pharming techniques and technology are used to combat pharming.

Traditional methods for combating pharming include server-side software, DNS protection, and web browser add-ins such as toolbars. Server-side software is typically used by enterprises to protect their customers and employees who use internal or private web-based systems from being pharmed and phished, while browser add-ins allow individual users to protect themselves from phishing. DNS protection mechanisms help ensure that a specific DNS server cannot be hacked and thereby become a facilitator of pharming attacks. Spam filters typically do not provide users with protection against pharming.

Currently the most efficient way to prevent pharming is for end users to make sure they are using secure web connections (HTTPS) to access privacy-sensitive sites such as those for banking or taxing, and to accept only the valid public key certificates issued by trusted sources. A certificate from an unknown organisation or an expired certificate should not be accepted for crucial business. So-called active cookies[1] provide for a server-side detection tool.

Legislation also plays an essential role in anti-pharming. In March 2005, U.S. Senator Patrick Leahy (D-VT) introduced the Anti-Phishing Act of 2005, a bill that proposes a five-year prison sentence and/or fine for individuals who execute phishing attacks and use information garnered through online fraud such as phishing and pharming to commit crimes such as identity theft.

For home users of consumer-grade routers and wireless access points, perhaps the single most effective defense is to change the password on the router to something other than the default, replacing it with a password that is not susceptible to a dictionary attack.

References
[1] "Active Cookies for Browser Authentication" (http://www.ravenwhite.com/files/activecookies3.pdf) (PDF). 2006-03-31. Retrieved December 3, 2006.
• "How Can We Stop Phishing and Pharming Scams?" (http://www.csoonline.com/talkback/071905.html). CSO Magazine. 2005-07-20.
• "Pharming.org: A free resource for users and web sites" (http://www.pharming.org). www.pharming.org. 2005-06-22.
• "Security: Phishing and Pharming" (http://www.windowsitpro.com/Article/ArticleID/46789/46789.html?Ad=1). Windows IT Pro Magazine.
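The end-user advice above, accepting only valid certificates, includes rejecting expired ones. A sketch using Python's standard ssl module; the certificate dictionary below is a hypothetical example in the shape returned by SSLSocket.getpeercert():

```python
import ssl
import time

def cert_is_expired(cert, now=None):
    """Return True if the certificate's notAfter date has passed.
    `cert` is a dict shaped like SSLSocket.getpeercert()'s result."""
    if now is None:
        now = time.time()
    not_after = ssl.cert_time_to_seconds(cert["notAfter"])
    return now > not_after

# Hypothetical certificate record, long expired:
sample_cert = {
    "subject": ((("commonName", "bank.example"),),),
    "notAfter": "May  9 00:00:00 2009 GMT",
}
```

A browser performs this check, along with chain-of-trust and hostname checks, automatically on every HTTPS connection; the point of the advice above is not to click through when those checks fail.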

Anti-phishing software 236

Anti-phishing software
Anti-phishing software consists of computer programs that attempt to identify phishing content contained in websites and e-mail. It is often integrated with web browsers and email clients as a toolbar that displays the real domain name for the website the viewer is visiting, in an attempt to prevent fraudulent websites from masquerading as other legitimate web sites. Anti-phishing functionality may also be included as a built-in capability of some web browsers. Password managers can also be used to help defend against phishing.

Client-based anti-phishing programs
• Avira Premium Security Suite
• Earthlink ScamBlocker (recently discontinued)
• eBay Toolbar
• ESET Smart Security
• Firefox 3
• GeoTrust TrustWatch
• Google Safe Browsing (usable with Firefox)
• Windows Internet Explorer 8
• Kaspersky Internet Security
• McAfee SiteAdvisor
• Mozilla Thunderbird
• Netcraft Toolbar
• Netscape 8.1
• Norton 360
• Norton Internet Security
• Opera 9.10
• PhishTank SiteChecker
• PineApp Mail-SeCure
• Safari 3.2
• Windows Mail, an e-mail client that comes with Windows Vista, which warns users of e-mails which may be part of an e-mail scam

Anti-phishing effectiveness
A study[1] conducted by 3Sharp, released on September 27, 2006, tested the ability of eight anti-phishing solutions to block known phishing sites, warn about phishing sites, and allow good sites. The study, which was commissioned by Microsoft and titled "Gone Phishing: Evaluating Anti-Phishing Tools for Windows", concluded that Internet Explorer and Netcraft Toolbar were the most effective anti-phishing tools.

A later independent study,[2] conducted by Carnegie Mellon University CyLab and titled "Phinding Phish: An Evaluation of Anti-Phishing Toolbars", released November 13, 2006, tested the ability of ten anti-phishing solutions to block or warn about known phishing sites without incorrectly identifying legitimate sites as phishing, and included usability testing of each solution. Of the solutions tested, Netcraft Toolbar, EarthLink ScamBlocker and SpoofGuard were able to correctly identify over 75% of the sites tested, with Netcraft Toolbar receiving the highest score. Severe problems were however discovered using SpoofGuard: it incorrectly identified 38% of the tested legitimate sites as phishing, leading to the conclusion that "It would seem that such inaccuracies might nullify the benefits SpoofGuard offers in identifying phishing sites." Google Safe Browsing (which has since been built into Firefox) and Internet Explorer both performed well, but when testing

the ability to detect fresh phishes, Netcraft Toolbar scored as high as 96%, while Google Safe Browsing scored as low as 0%, possibly due to technical problems with Google Safe Browsing. The testing was performed using phishing data obtained from the Anti-Phishing Working Group, PhishTank, and an unnamed email filtering vendor.

The latest study,[3] conducted by SmartWare for Mozilla and released November 14, 2006, concluded that the anti-phishing filter in Firefox was more effective than Internet Explorer by more than 10%. The results of this study have been questioned by critics,[4] criticising that the testing data was sourced exclusively from PhishTank, itself an anti-phishing provider, and that the study only compared Internet Explorer and Firefox, leaving out, among others, Netcraft Toolbar and the Opera browser. This has led to speculations that, with the limited testing data, both Opera and Netcraft Toolbar would have gotten a perfect score had they been part of the study.[5]

While the two later reports were released only one day apart, Asa Dotzler, Director of Community Development at Mozilla, has responded to the criticism of the Mozilla-commissioned report by saying "...so you're agreeing that the most recent legitimate data puts Firefox ahead. Good enough for me."[6]

Since these studies were conducted, both Microsoft and Opera Software have started licensing Netcraft's anti-phishing data, bringing the effectiveness of their browsers' built-in anti-phishing on par with Netcraft Toolbar and beyond.

References
[1] "3Sharp Study finds Internet Explorer 7 Edges Out Netcraft (http://web.archive.org/web/20071209190958/http://www.3sharp.com/projects/antiphish/) As Most Accurate for Anti-Phishing Protection". Archived from the original (http://www.3sharp.com/projects/antiphish/) on 2007-12-09. Retrieved 2008-05-25.
[2] "Phinding Phish: An Evaluation of Anti-Phishing Toolbars" (http://www.cylab.cmu.edu/files/pdfs/tech_reports/cmucylab06018.pdf). Retrieved 2008-05-25.
[3] "Firefox 2 Phishing Protection Effectiveness Testing" (http://www.mozilla.org/security/phishing-test.html). Retrieved 2008-05-25.
[4] "Comment to Asa Dotzler blog post "safari unsafe? paypal thinks so."" (http://weblogs.mozillazine.org/asa/archives/2008/02/safari_unsafe_p.html#comment-2528657). Retrieved 2008-05-25.
[5] "Comment to Asa Dotzler blog post "safari unsafe? paypal thinks so."" (http://weblogs.mozillazine.org/asa/archives/2008/02/safari_unsafe_p.html#comment-2528666). Retrieved 2008-05-25.
[6] "Comment to Asa Dotzler blog post "safari unsafe? paypal thinks so."" (http://weblogs.mozillazine.org/asa/archives/2008/02/safari_unsafe_p.html#comment-2528717). Retrieved 2008-05-25.

External links
• The Open Phishing Database Project (http://opdb.berlios.de/)
• Anti-phishing software (http://www.dmoz.org/Society/Crime/Theft/Identity_Theft/Phishing//) at the Open Directory Project
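The toolbar behaviour this article describes, displaying the real domain name of the site being visited, counters URLs crafted to look legitimate. A sketch of the underlying check with Python's standard urllib (the URLs are made-up examples):

```python
from urllib.parse import urlsplit

def real_hostname(url):
    """Return the host a URL actually points at. Anything before an
    '@' in the authority part is userinfo, a classic phishing
    disguise for hiding the true destination."""
    return urlsplit(url).hostname
```

For example, real_hostname("http://www.mybank.example@203.0.113.9/") returns "203.0.113.9", not the bank-looking prefix, which is exactly the discrepancy an anti-phishing toolbar surfaces to the user.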

Anti-worm

Anti-worm has multiple meanings within the field of computer security. It can be a piece of software designed to protect against computer worms, combining the features of anti-virus software and a personal firewall. It can also mean a worm designed to do something that its author feels is helpful, even if not all system owners would agree.

Concept
The concept of "anti-worms" is a proactive method of dealing with virus and computer worm outbreaks. Just like malicious computer worms, anti-worms reach computers by scanning IP ranges and placing a copy of themselves on vulnerable hosts. The anti-worm then patches the computer's vulnerability and uses the affected computer to find other vulnerable hosts, utilizing the same "scan, infect, repeat" model that malicious computer worms use. Anti-worms thus have the ability to spread just as fast as regular computer worms.

Example
The Santy worm was released shortly before Christmas 2004 and spread quickly, using Google to search for vulnerable versions of phpBB. The worm exploited a bug in the phpBB software to infect the host, defacing the website and deleting all of the messages stored on the forums. The worm was poised to spread to hundreds of thousands of other websites running the phpBB forum. Within several hours of Santy's release, someone released another worm to combat the Santy worm and patch the vulnerable phpBB forum. The anti-Santy worm spread quickly, affecting thousands of servers running phpBB. However, the anti-Santy worm caused problems of its own: many site administrators reported that the anti-worm crashed their systems by flooding them with requests, and others reported that the patch did not work. Approximately 10 days after the worm's launch, Google blocked the search string the worm was using to find vulnerable hosts; thus, the worm could not find new hosts to infect. There is no way to determine whether Google's actions or the anti-Santy worm did more to protect hosts, so whether the anti-worm had a significant positive impact on the spread of the Santy worm is unknown. Anti-worms have also been used to combat the effects of the Code Red worm.[1]

Criticism
Many computer security experts have denounced the so-called "anti-worm". Their position is that no code should be run on a system without the system owner's consent. Worm code, even if its author has good intentions, can wreak havoc on a network: its author does not know the exact configuration of the system on which the code is running, the code could render that system useless for its intended purpose, and it can overflow the traffic capacity of the network, resulting in a denial-of-service attack. Most jurisdictions that have computer crime laws covering worms do not distinguish "worms" from "anti-worms," thus making the author(s) of such code liable to prosecution.

Notes
[1] 'Anti-worms' fight off Code Red threat (http://www.vnunet.com/news/1125206)

Application-level gateway

In the context of computer networking, an application-level gateway[1] (also known as ALG or application layer gateway) consists of a security component that augments a firewall or NAT employed in a computer network. It allows customized NAT traversal filters to be plugged into the gateway to support address and port translation for certain application layer "control/data" protocols such as FTP, BitTorrent, SIP, RTSP, file transfer in IM applications, etc. In order for these protocols to work through NAT or a firewall, either the application has to know about an address/port number combination that allows incoming packets, or the NAT has to monitor the control traffic and open up port mappings (firewall pinholes) dynamically as required. Legitimate application data can thus be passed through the security checks of the firewall or NAT that would have otherwise restricted the traffic for not meeting its limited filter criteria.

An ALG may offer the following functions:
• allowing client applications to use dynamic ephemeral TCP/UDP ports to communicate with the known ports used by the server applications, even though a firewall configuration may allow only a limited number of known ports. In the absence of an ALG, either the ports would get blocked or the network administrator would need to explicitly open up a large number of ports in the firewall, rendering the network vulnerable to attacks on those ports.
• converting the network layer address information found inside an application payload between the addresses acceptable by the hosts on either side of the firewall/NAT. This aspect introduces the term 'gateway' for an ALG.
• recognizing application-specific commands and offering granular security controls over them.
• synchronizing between multiple streams/sessions of data between two hosts exchanging data. For example, an FTP application may use separate connections for passing control commands and for exchanging data between the client and a remote server. During large file transfers, the control connection may remain idle; an ALG can prevent the control connection getting timed out by network devices before the lengthy file transfer completes.

Deep packet inspection of all the packets handled by ALGs over a given network makes this functionality possible.[2] An ALG understands the protocol used by the specific applications that it supports. For instance, for Session Initiation Protocol (SIP) Back-to-Back User Agent (B2BUA), an ALG can allow firewall traversal with SIP. If the firewall has its SIP traffic terminated on an ALG, then the responsibility for permitting SIP sessions passes to the ALG instead of the firewall. An ALG can solve another major SIP headache: NAT traversal. Basically, a NAT with a built-in ALG can rewrite information within the SIP messages and can hold address bindings until the session terminates.

An ALG is very similar to a proxy server, as it sits between the client and real server, facilitating the exchange. There seems to be an industry convention that an ALG does its job without the application being configured to use it, by intercepting the messages. A proxy, on the other hand, usually needs to be configured in the client application; the client is then explicitly aware of the proxy and connects to it, rather than the real server.

ALG service in Microsoft Windows
The Application Layer Gateway service in Microsoft Windows provides support for third-party plugins that allow network protocols to pass through the Windows Firewall and work behind it and Internet Connection Sharing. ALG plugins can open ports and change data that is embedded in packets, such as ports and IP addresses. Windows Server 2003 also includes an ALG FTP plugin. The ALG FTP plugin is designed to support active FTP sessions through the NAT engine in Windows. To do this, the ALG FTP plugin redirects all traffic that passes through the NAT and that is destined for port 21 (the FTP control port) to a private listening port in the 3000-5000 range on the Microsoft loopback adapter. The ALG FTP plugin then monitors/updates traffic on the FTP control channel so that the FTP plugin can plumb port mappings through the NAT for the FTP data channels.
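The control-channel monitoring described above hinges on parsing application payloads. In FTP, for example, an active-mode PORT command (RFC 959) embeds the client's data-channel IP address and port as six decimal byte values. A minimal sketch (illustrative only, not the code of any actual ALG plugin) of how a gateway might extract that endpoint before rewriting the address and opening a pinhole for the data connection:

```python
import re

def parse_ftp_port(line):
    """Parse an FTP PORT command (RFC 959) seen on the control channel.

    Returns the (ip, port) endpoint the client is advertising for the
    data connection -- the value an ALG would rewrite for NAT and open
    a firewall pinhole for. Returns None for any other command.
    """
    m = re.match(r"PORT (\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\s*$", line.strip())
    if not m:
        return None
    h1, h2, h3, h4, p_hi, p_lo = (int(g) for g in m.groups())
    ip = "{0}.{1}.{2}.{3}".format(h1, h2, h3, h4)
    port = p_hi * 256 + p_lo  # the port travels as two 8-bit fields
    return ip, port
```

For example, `parse_ftp_port("PORT 192,168,1,2,7,138")` yields `("192.168.1.2", 1930)`, the private endpoint the gateway would translate; a line such as `RETR file.txt` yields `None` and passes through untouched.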

References
• DNS Application Level Gateway (DNS_ALG)[3]
[1] RFC 2663 - ALG: official definition (refer to section 2.9)
[2] The File Transfer Protocol (FTP) and Your Firewall / Network Address Translation (NAT) Router / Load-Balancing Router (http://www.ncftp.com/ncftpd/doc/misc/ftp_and_firewalls.html)
[3] http://tools.ietf.org/html/rfc2694

ARP spoofing

ARP spoofing, also known as ARP flooding, ARP poisoning or ARP poison routing (APR), is a technique used to attack a local-area network (LAN). ARP spoofing may allow an attacker to intercept data frames on a LAN, modify the traffic, or stop the traffic altogether. The attack can only be used on networks that make use of the Address Resolution Protocol (ARP) and not another method of address resolution.

Principle
The principle of ARP spoofing is to send fake, or spoofed, ARP messages onto a LAN. Generally, the aim is to associate the attacker's MAC address with the IP address of another host (such as the default gateway). Any traffic meant for that IP address would be mistakenly sent to the attacker instead. The attacker could then choose to forward the traffic to the actual default gateway (interception) or modify the data before forwarding it (man-in-the-middle attack). A successful ARP spoofing attack thus allows an attacker to alter routing on a network, effectively allowing for a man-in-the-middle attack.

The attacker could also launch a denial-of-service attack against a victim by associating a nonexistent MAC address with the IP address of the victim's default gateway. Denied access to the gateway in this way, nothing outside the LAN will be reachable by hosts on the LAN. ARP spoofing attacks can be run from a compromised host on the LAN, or from an attacker's machine that is connected directly to the target LAN.

Legitimate usage
ARP spoofing can also be used for legitimate purposes. For instance, network registration tools may redirect unregistered hosts to a signup page before allowing them full access to the network. This technique is used in hotels and other semi-public networks to allow traveling laptop users to access the Internet through a device known as a head end processor (HEP). ARP spoofing can also be used to implement redundancy of network services: a backup server may use ARP spoofing to take over for a defective server and transparently offer redundancy.

Defenses
An open source solution is ArpON[1] ("ARP handler inspection"), a portable handler daemon that makes the ARP protocol secure in order to avoid man-in-the-middle (MITM) attacks through ARP spoofing. It also blocks attacks derived from ARP cache poisoning and ARP poison routing (APR), such as DNS spoofing, web spoofing, session hijacking and SSL/TLS hijacking, as well as more complex attacks built on sniffing, injection, filtering and hijacking.

Some switch vendors have devised a defense against this form of attack that imposes very strict control over what ARP packets are allowed into the network. The feature is known as ARP Security[2] or Dynamic ARP Inspection.[3][4]

Other defenses against ARP spoofing generally rely on some form of certification or cross-checking of ARP responses: uncertified ARP responses are blocked. These techniques may be integrated with the DHCP server so that both dynamic and static IP addresses are certified. This capability may be implemented in individual hosts or may be integrated into Ethernet switches or other network equipment. The simplest form of certification is the use of static, read-only entries for critical services in the ARP cache of a host. This only prevents simple attacks and does not scale on a large network, since the mapping has to be set for each pair of machines, resulting in n*n ARP caches that have to be configured.

In a more passive approach, a device listens for ARP replies on a network and sends a notification via email when an ARP entry changes. The existence of multiple IP addresses associated with a single MAC address may indicate an ARP spoof attack, although there are legitimate uses of such a configuration.

Tools
Defense
• ArpON - ARP handler inspection[5]
• ARPDefender appliance[6]
• Arpwatch
• XArp[7]
• anti-arpspoof[8]
• AntiARP[9]
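The passive approach above can be reduced to a simple bookkeeping rule: remember the last MAC address seen for each IP, and raise an alert when a binding changes. The following is a toy sketch in that spirit (not the code of arpwatch or any real tool; frame capture is omitted, and a real monitor would sniff ARP replies with libpcap):

```python
# Toy passive ARP monitor: tracks the last MAC observed for each IP
# address and flags a change, which may indicate ARP spoofing (though
# a change can also be legitimate, e.g. a replaced network card).

def check_arp_reply(table, ip, mac):
    """Record an observed ARP reply; return a warning string if the
    IP-to-MAC binding moved, or None if nothing suspicious happened."""
    previous = table.get(ip)
    table[ip] = mac
    if previous is not None and previous != mac:
        return "ALERT: {0} moved from {1} to {2}".format(ip, previous, mac)
    return None
```

A first sighting of `10.0.0.1` at one MAC returns `None`; a later reply claiming the same IP from a different MAC returns an alert string, which a real deployment would e-mail to the administrator as described above.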

Spoofing
Some of the tools that can be used to carry out ARP spoofing attacks:
• Arpspoof (part of the DSniff suite of tools)
• Arpoison
• Ettercap
• Cain&Abel
• Seringe[10]
• ARP-FILLUP -V0.1[11]
• arp-sk -v0.0.15[11]
• ARPOc -v1.13[11]
• arpalert -v0.3.2[11]
• arping -v2.04[11]
• arpmitm -v0.2[11]
• arpoison -v0.5[11]
• ArpSpyX -v1.1[11]
• ArpToXin -v1.0[11]
• SwitchSniffer[11]

References
[1] http://arpon.sourceforge.net
[2] How to Secure a L3 Network (http://www.alliedtelesis.com/media/datasheets/howto/secure_network_l3switches.pdf)
[3] Dynamic ARP Inspection configuration guide (http://www.cisco.com/en/US/docs/switches/lan/catalyst3560/software/release/12.2_20_se/configuration/guide/swdynarp.pdf)
[4] Brocade Fastiron LS Series datasheet (http://www.brocade.com/forms/getFile?p=documents/data_sheets/product_data_sheets/ds-fi-ls-series.pdf)
[5] "ArpON" (http://arpon.sourceforge.net). Retrieved 2011-05-03.
[6] ARPDefender (http://www.arpdefender.com)
[7] XArp (http://www.chrismc.de/development/xarp)
[8] anti-arpspoof (http://sync-io.net/Sec/anti-arpspoof.aspx)
[9] AntiARP - Professional defence ARP spoof/poison/attack (http://www.antiarp.com/English/e_index.asp)
[10] "Seringe - Statically Compiled ARP Poisoning Tool" (http://www.securiteam.com/tools/5QP0I2AC0I.html). Retrieved 2011-05-03.
[11] "ARP Vulnerabilities: The Complete Documentation" (http://www.l0t3k.org/security/tools/arp/). Retrieved 2011-05-03.

External links
• Introduction to APR (Arp Poison Routing) by MAO (http://www.oxid.it/downloads/apr-intro.swf)
• ARP Spoofing Simulation (http://www.osischool.com/protocol/arp/arp-spoofing)
• Steve Gibson (2005-12-11). "ARP Cache Poisoning" (http://www.grc.com/nat/arp.htm). GRC.
• GRC's Arp Poisoning Explanation (http://www.grc.com/nat/arp.htm)
• arptables, and ARP poisoning (http://abulmagd.blogspot.com/2008/08/arptables-and-arp-poisoningnetcut.html)

Asprox botnet

The Asprox botnet (discovered around 2008),[1][2] also known by its aliases Badsrc and Aseljo, is a botnet mostly involved in phishing scams and performing SQL injections into websites in order to spread malware.[3]

Operations
Since its discovery in 2008 the Asprox botnet has been involved in multiple high-profile attacks on various websites in order to spread malware. The botnet itself consists of roughly 15,000 infected computers as of May 2008,[4] although the size of the botnet itself is highly variable, as the controllers of the botnet have been known to deliberately shrink (and later regrow) their botnet in order to prevent more aggressive countermeasures from the IT community.[5]

The botnet propagates itself in a somewhat unusual way, as it actively searches for and infects vulnerable websites running Active Server Pages. Once it finds a potential target, the botnet performs a SQL injection on the website, inserting an IFrame which redirects users visiting the site to a site hosting malware.[6]

The botnet usually attacks in waves; the goal of each wave is to infect as many websites as possible, thus achieving the highest possible spread rate. Once a wave is completed, the botnet lies dormant for an extended amount of time, likely to prevent aggressive counterreactions from the security community. The initial wave took place in July 2008, which infected an estimated 1,000 pages.[2][7] An additional wave took place in October 2009, increasing the estimated total amount of infected domains from 2,000 to an estimated 10,000 within a day.[8][9] Another wave took place in June 2010, infecting an unknown amount of websites.[10]

Notable high-profile infections
While the infection targets of the Asprox botnet are randomly determined through Google searches, some high-profile websites have been infected in the past. Some of these infections have received individual coverage:
• Sony PlayStation U.S. website[11]
• Adobe's Serious Magic website[11]
• Several government, healthcare and business related websites[7]

References
[1] "Indian Computer Emergency Response Team" (http://www.cert-in.org.in/virus/Asprox_Botnet.htm). Cert-In. Retrieved 2010-07-30.
[2] Sue Marquette Poremba (2008-05-15). "Asprox botnet malware morphs" (http://www.scmagazineus.com/asprox-botnet-malware-morphs/article/110169/). SC Magazine US. Retrieved 2010-07-30.
[3] http://www.theregister.co.uk/2009/02/03/conficker_arbor_analysis/
[4] http://www.theregister.co.uk/2008/05/14/asprox_attacks_websites/
[5] Hines, Matthew (2009-10-06). "Asprox Botnet Attacks Come Back - eWeek Security Watch" (http://securitywatch.eweek.com/botnets/asprox_botnet_attacks_come_back.html). Securitywatch.eweek.com. Retrieved 2010-07-30.
[6] Michael Zino (2008-05-01). "ASCII Encoded/Binary String Automated SQL Injection Attack" (http://www.bloombit.com/Articles/2008/05/ASCII-Encoded-Binary-String-Automated-SQL-Injection.aspx). Bloombit.com.
[7] "Asprox Botnet Mass Attack Hits Governmental, Healthcare, and Top Business Websites" (http://cyberinsecure.com/asprox-botnet-mass-attack-hits-governmental-healthcare-and-top-business-websites/). CyberInsecure.com. 2008-07-18. Retrieved 2010-07-30.
[8] David Neal. "Asprox botnet causing serious concern" (http://www.v3.co.uk/v3/news/2265398/asprox-spambot-digging). V3.co.uk (formerly vnunet.com). Retrieved 2010-07-30.
[9] "Researchers: Asprox Botnet Is Resurging - Botnets/Attacks - DarkReading" (http://www.darkreading.com/security/attacks/showArticle.jhtml?articleID=225800197). DarkReading. Retrieved 2010-07-30.
[10] http://www.m86security.com/documents/pdfs/security_labs/m86_security_labs_report_1H2010.pdf
[11] "Sony PlayStation's site SQL injected, redirecting to rogue security software" (http://www.zdnet.com/blog/security/sony-playstations-site-sql-injected-redirecting-to-rogue-security-software/1394). ZDNet. 2008-07-02. Retrieved 2010-07-30.
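The SQL injection technique described under Operations works because attacker-controlled text is pasted directly into a SQL statement and is then interpreted as SQL rather than as data. A hedged, self-contained demonstration (the table, column names, and input are invented for the demo; this is not Asprox's actual payload, which was ASCII-encoded and far more elaborate):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attacker_input = "nobody' OR '1'='1"

# Vulnerable pattern: the input, concatenated into the SQL text,
# becomes SQL itself, so the injected OR clause matches every row.
leaked = db.execute(
    "SELECT secret FROM users WHERE name = '" + attacker_input + "'"
).fetchall()

# Safe pattern: a bound parameter is always treated as a plain value,
# never parsed as SQL, so the injection attempt matches nothing.
safe = db.execute(
    "SELECT secret FROM users WHERE name = ?", (attacker_input,)
).fetchall()

print(leaked)  # [('s3cret',)] - injection succeeded
print(safe)    # [] - no user is literally named "nobody' OR '1'='1"
```

The same distinction — string concatenation versus parameterized queries — is why the vulnerable Active Server Pages sites could be mass-infected by an automated crawler.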

Attack tree

Attack trees are conceptual diagrams of threats on computer systems and possible attacks to reach those threats. The concept was suggested by Bruce Schneier,[1] CTO of Counterpane Internet Security. Attack trees are similar to threat trees; threat trees have been discussed by Edward Amoroso.[2] Attack trees are also related to the established fault tree[3] formalism. Fault tree methodology employs boolean expressions to gate conditions when parent nodes are satisfied by leaf nodes.

(Figure: Attack tree for computer viruses. Here we assume a system such as Windows NT, where not all users have full system access.)

Basic
Attack trees are multi-leveled diagrams consisting of one root, leaves, and children. From the bottom up, child nodes are conditions which must be satisfied to make the direct parent node true; when the root is satisfied, the attack is complete. Each node may be satisfied only by its direct child nodes. A node may be the child of another node; in such a case, it becomes logical that multiple steps must be taken to carry out an attack.

For example, consider classroom computers which are secured to the desks. To steal one, the securing cable must be cut or the lock unlocked. The lock may be unlocked by picking or by obtaining the key. The key may be obtained by threatening a keyholder, bribing a keyholder, or taking it from where it is stored (e.g. under a mousemat). Thus a four-level attack tree can be drawn, of which one path is (Bribe Keyholder, Obtain Key, Unlock Lock, Steal Computer).

Note also that an attack described in a node may require one or more of many attacks described in child nodes to be satisfied. Our condition above shows only OR conditions; however, an AND condition can be created, for example, by assuming an electronic alarm which must be disabled if and only if the cable will be cut. Rather than making this task a child node of cutting the lock, both tasks can simply reach a summing junction. Thus the path ((Disable Alarm, Cut Cable), Steal Computer) is created.

By including a priori probabilities with each node, it is possible to calculate probabilities for higher nodes using Bayes' rule. However, in reality accurate probability estimates are either unavailable or too expensive to gather. With respect to computer security with active participants (i.e. attackers), the probability distribution of events is probably neither independent nor uniformly distributed; hence, naive Bayesian analysis is unsuitable.
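The AND/OR structure and the probability roll-up can be made concrete with a small sketch. The leaf probabilities below are illustrative guesses, and the combination rules assume independent events — exactly the simplification the text warns is unrealistic for active attackers:

```python
# Minimal attack-tree sketch: OR nodes succeed if any child succeeds,
# AND nodes only if all children do. Probabilities are toy values.

class Node:
    def __init__(self, name, kind="leaf", p=0.0, children=None):
        self.name, self.kind, self.p = name, kind, p
        self.children = children or []

    def probability(self):
        if self.kind == "leaf":
            return self.p
        child_ps = [c.probability() for c in self.children]
        if self.kind == "and":  # every child attack must succeed
            result = 1.0
            for p in child_ps:
                result *= p
            return result
        # "or": the attack fails only if every child attack fails
        fail_all = 1.0
        for p in child_ps:
            fail_all *= (1.0 - p)
        return 1.0 - fail_all

# The classroom-computer example: steal the computer by unlocking the
# lock, OR by disabling the alarm AND cutting the securing cable.
steal = Node("Steal Computer", "or", children=[
    Node("Unlock Lock", "or", children=[
        Node("Pick Lock", p=0.1),
        Node("Obtain Key", p=0.2),
    ]),
    Node("Cut Cable", "and", children=[
        Node("Disable Alarm", p=0.5),
        Node("Cut Securing Cable", p=0.9),
    ]),
])
print(round(steal.probability(), 3))  # 0.604
```

Walking the numbers: the lock falls with probability 1 - (0.9)(0.8) = 0.28, the alarm-and-cable branch with (0.5)(0.9) = 0.45, and the root with 1 - (0.72)(0.55) = 0.604.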

Examination
Attack trees can become largely complex, especially when dealing with specific attacks. A full attack tree may contain hundreds or thousands of different paths all leading to completion of the attack. Even so, these trees are very useful for determining what threats exist and how to deal with them.

Attack trees can lend themselves to defining an information assurance strategy. It is important to consider, however, that implementing policy to execute this strategy changes the attack tree. For example, computer viruses may be protected against by refusing the system administrator access to directly modify existing programs and program folders, instead requiring a package manager be used. This adds to the attack tree the possibility of design flaws or exploits in the package manager.

One could observe that the most effective way to mitigate a threat on the attack tree is to mitigate it as close to the root as possible. Although this is theoretically sound, it is not usually possible to simply mitigate a threat without other implications to the continued operation of the system. For example, the threat of viruses infecting a Windows system may be largely reduced by using NTFS instead of the FAT file system so that normal users are unable to modify installed programs. Implementing this negates any possible way, foreseen or unforeseen, that a normal user may come to infect the system with a virus; however, it also requires that users switch to an administrative account to carry out administrative tasks, thus creating a different set of threats on the tree and more operational overhead.

Systems using cooperative agents that dynamically examine and identify vulnerability chains, creating attack trees, have been built since 2000.[4]

References
[1] Schneier, Bruce (December 1999). "Attack Trees" (http://www.schneier.com/paper-attacktrees-ddj-ft.html). Dr Dobb's Journal, v.24, n.12. Retrieved 2007-08-16.
[2] Amoroso, Edward (1994). Fundamentals of Computer Security. Upper Saddle River: Prentice Hall. ISBN 0-13-108929-3.
[3] "Fault Tree Handbook with Aerospace Applications" (http://www.hq.nasa.gov/office/codeq/doctree/fthb.pdf). Retrieved 2010-04-21.
[4] "NOOSE - Networked Object-Oriented Security Examiner, 14th Systems Administration Conference (LISA 2000), New Orleans" (http://www.usenix.org/events/lisa00/full_papers/barnett/barnett_html/). Retrieved 2007-12-09.

Authentication server

Authentication servers are servers that provide authentication services to users or other systems via networking. Remotely placed users and other servers authenticate to such a server and receive cryptographic tickets; these tickets are then exchanged with one another to verify identity. Authentication is used as the basis for authorization (determining whether a privilege will be granted to a particular user or process), privacy (keeping information from becoming known to non-participants), and non-repudiation (not being able to deny having done something that was authorized to be done based on the authentication). The major authentication algorithms and protocols utilized are passwords, Kerberos, and public key encryption.
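The ticket exchange can be sketched in miniature. The scheme below is an assumption for illustration only (a shared-key HMAC tag standing in for a real ticket; actual protocols such as Kerberos add expiry times, session keys, and encryption): the authentication server checks a password once and issues a signed ticket, and any service sharing the key can verify the ticket without seeing the password.

```python
# Toy ticket-issuing authentication server (illustrative, not Kerberos).
# SECRET is shared between the auth server and the services it vouches to.
import hmac
import hashlib

SECRET = b"shared-server-key"          # assumed out-of-band key distribution
USERS = {"alice": "correct horse"}     # toy credential database

def issue_ticket(user, password):
    """Verify the password and mint a ticket of the form 'user:tag'."""
    if USERS.get(user) != password:
        return None
    tag = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    return user + ":" + tag

def verify_ticket(ticket):
    """A service re-computes the tag to check the ticket's authenticity."""
    user, tag = ticket.split(":", 1)
    expected = hmac.new(SECRET, user.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(tag, expected)
```

A wrong password yields no ticket, and a forged ticket fails verification because the forger lacks the shared key — the same property that lets services trust tickets they did not issue themselves.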

Avaya Secure Network Access

Secure Network Access 4050
• Rack space: 1 Rack Unit
• Height: 1.7 in. (44 mm)
• Width: 16.9 in. (44.9 cm)
• Depth: 22 in. (55.9 cm)
• Weight: 25 lb (11.34 kg)
• Rack mountable: Yes, 19-inch standard rack

Avaya Secure Network Access (Nortel-SNA or NSNA) in computer networking is a Network Access Control system designed by Nortel (now Avaya) to guarantee endpoint security policy compliance and remediation (also known as comply to connect,[1] or "comply to my policies and then you can connect. And if you can't comply, come into this safe zone where I'll remediate, I'll get you back to a trusted state"[2]) at the network endpoint. The overall objective is to make policy-based decisions about who gets access to a network segment and what they can do once they are admitted.

The Nortel Secure Network Access Switch (SNAS) 4050 is a device that centrally controls access policies and admission controls for integration with wired, wireless and mobile users and devices. The SNAS 4050 switch is the policy integration point between Nortel and/or 3rd-party switches/routers/VPN systems and the policy authentication back end (AD server/certificate server/802.1x). The health checking policy provides additional security protection by limiting or restricting access to endpoints identified to be "unhealthy" based upon an enterprise policy definition.

The NSNA 4050 complies with the Trusted Computing Group (TCG) Trusted Network Connect (TNC)[3] specifications, so customers will not be locked into a proprietary architecture. The TNC specification is not part of a single antivirus software, personal firewall, or security application. The TNC specification, as well as the NSNAS 4050, supports third-party and legacy Nortel switches to provide the best return-on-investment protection, and is closely integrated with Microsoft's Network Access Protection (NAP) technologies in the Vista desktops and Windows Servers.[4]

References
[1] NAC: More Is More (http://www.nwc.com/document.asp?doc_id=123813&page_number=1)
[2] Eliminating threats on the network (http://www.infoworld.com/article/04/04/29/HNdesantisint_1.html)
[3] Trusted Network Connect Work Group (https://www.trustedcomputinggroup.org/groups/network/)
[4] Addressing Enterprise Network Access Requirements (http://www.nortel.com/solutions/securenet/collateral/esg_nac_whitepaper_8_aug_2007.pdf)

External links
• Official Website (http://products.nortel.com/go/product_content.jsp?segId=0&parId=0&prod_id=55260&locale=en-US)
• Nortel Multimedia Presentation (http://www.nortel.com/multimedia/flash/demo/index_760.html?demo=sna)
• Secure Network Access Switch 4050 (http://products.nortel.com/go/product_content.jsp?segId=0&parId=0&prod_id=55260)
• A Fully Integrated Layered Security Defense (http://i.cmpnet.com/nacbattleground.nwc.com/pdf/Nortel_Q&A.pdf)

Avaya VPN Router

VPN Router 5000
• Rack space: 3 Rack Units
• Height: 5.25 in. (13.3 cm)
• Width: 17.0 in. (43.8 cm)
• Depth: 23.25 in. (58.4 cm)
• Weight: 43 lb (19.5 kg)
• Rack mountable: Yes, 19-inch standard rack

Avaya VPN Router, or VPN Router, is a device in computer networking made by Nortel (now Avaya), formerly called the Contivity VPN Router. The VPN Router is capable of providing security and connectivity at several levels. The systems can be deployed to provide dedicated encrypted VPN gateways with stateful firewall protection, or to perform as IP access routers.

VPN Router Scaling
• VPN Router 200 (5 concurrent tunnels) for remote office connections
• VPN Router 600 (50 concurrent tunnels)
• VPN Router 1000 (30 concurrent tunnels)
• VPN Router 1750 (500 concurrent tunnels)
• VPN Router 2700 (2000 concurrent tunnels)
• VPN Router 5000 (5000 concurrent tunnels)

References

Bagle (computer worm)

Bagle (also known as Beagle) is a mass-mailing computer worm affecting all versions of Microsoft Windows. The first strain, Bagle.A, did not propagate widely. A second variant, Bagle.B, was considerably more virulent.

Bagle uses its own SMTP engine to mass-mail itself as an attachment to recipients gathered from the infected computer. It copies itself to the Windows system directory (Bagle.A as bbeagle.exe, Bagle.B as au.exe) and opens a backdoor on TCP port 6777 (Bagle.A) or 8866 (Bagle.B). It does not mail itself to addresses containing certain strings such as "@hotmail.com", "@msn.com", "@microsoft" or "@avp".

The initial strain, Bagle.A, was first sighted on January 18, 2004. It was not widespread and stopped spreading after January 28, 2004.

The second strain, Bagle.B, was first sighted on February 17, 2004. It was much more widespread and appeared in large numbers; Network Associates rated it a "medium" threat. It was designed to stop spreading after February 25, 2004.

Subsequent variants have later been discovered. Although they have not all been successful, a number remain notable threats; however, the threat risk from these variants has been changed to "low" due to decreased prevalence. Windows users are warned to watch out for them. Some of these variants contain the text "Greetz to antivirus companies In a difficult world, In a nameless time, I want to survive, So, you will be mine!! -- Bagle Author, 29.04.04, Germany", which makes some people think the worm originated in Germany.

Botnet
The Bagle botnet (initial discovery early 2004),[1][2] also known by its aliases Beagle, Mitglieder and Lodeight,[3] is a botnet mostly involved in proxy-to-relay e-mail spam. The Bagle botnet consists of an estimated 150,000-230,000[4] computers infected with the Bagle computer worm. It was estimated that the botnet was responsible for about 10.39% of the worldwide spam volume on December 29, 2009, with a surge up to 14% on New Year's Day,[5] though the actual percentage seems to rise and drop rapidly.[6] As of April 2010 it is estimated that the botnet sends roughly 5.7 billion spam messages a day, or about 4.3% of the global spam volume.[4]

Trivia
Bagle and Netsky attempt to cancel each other out on infected machines, as if their authors were having a war.

References
[1] "The Bagle botnet" (http://www.securelist.com/en/analysis/162656090/The_Bagle_botnet). Securelist. Retrieved 2010-07-30.
[2] "A Little Spam With Your Bagle?" (http://www.m86security.com/labs/i/A-Little-Spam-With-Your-Bagle-.999~.asp). M86 Security. 2009-06-05. Retrieved 2010-07-30.
[3] "Bagle" (http://www.m86security.com/labs/spambotitem.asp?article=938). M86 Security. 2009-06-05. Retrieved 2010-07-30.
[4] http://www.messagelabs.com/mlireport/MLI_2010_04_Apr_FINAL_EN.pdf
[5] Dan Raywood. "New botnet threats emerge in the New Year from Lethic and Bagle" (http://www.scmagazineuk.com/new-botnet-threats-emerge-in-the-new-year-from-lethic-and-bagle/article/160999/). SC Magazine UK. Retrieved 2010-07-30.
[6] "New Spamming Botnet On The Rise" (http://www.darkreading.com/security/vulnerabilities/showArticle.jhtml?articleID=221600694). DarkReading. Retrieved 2010-07-30.

Barracuda Networks

Barracuda Networks, Inc.
• Type: Private
• Industry: Telecommunication
• Founded: 2003
• Headquarters: Campbell, California, United States
• Key people: Dean Drako (CEO), Michael Perone (CMO), Zach Levow (CTO)
• Products: Spam Firewalls, Web filters, Email Archivers, SSL VPNs, IM firewalls, Load balancers, Web Site Firewalls, NG Firewalls, Backup solutions, CudaTel PBX
• Employees: 500-1000
• Website: www.barracudanetworks.com[1]

Barracuda Networks, Inc. is a privately held company providing security, networking and storage solutions based on appliances and cloud services. The company's security products include solutions for protection against email, web surfing, web hackers and instant messaging threats such as spam, spyware, trojans, and viruses.[2] The company's networking and storage solutions include web filtering, load balancing, application delivery controllers, message archiving, NG firewalls, backup services and data protection.

Barracuda Networks was established in 2003 and introduced the Barracuda Spam and Virus Firewall.[3] In 2007 the company moved its headquarters to Campbell, California,[4] and opened an office in Ann Arbor, Michigan. In January 2006, it closed its first outside investment of $40 million from Sequoia Capital and Francisco Partners.[6] Sequoia Capital had previously provided financing to Cisco Systems, Netscreen, Google, and Yahoo!.

On January 29, 2008, Barracuda Networks was sued by Trend Micro over their use of the open source anti-virus software Clam AntiVirus, which Trend Micro claimed to be in violation of their patent on 'anti-virus detection on an SMTP or FTP gateway'.[5] In addition to providing samples of prior art in an effort to render Trend Micro's patent invalid,[7] in July 2008 Barracuda launched a countersuit against Trend Micro claiming Trend Micro violated several antivirus patents Barracuda Networks had acquired from IBM.[8]

In December 2008, the company launched the BRBL (Barracuda Reputation Block List), its proprietary and dynamic list of known spam servers, for free and public use in blocking spam at the gateway.[9] Soon after opening BRBL, many IP addresses got blacklisted without apparent reason and without any technical explanation.[10][11][12][13] As of October 2009, Barracuda had over 85,000 customers.[14]

Products
• Spam and virus firewall - In October 2003, Barracuda announced its spam and virus firewall plug-in appliance.[15] In June 2008, Barracuda launched a spam and virus firewall for large enterprises and ISPs.[16]
• Web filter - In April 2005, the company introduced its web filtering appliance to prevent spyware and viruses from gathering and transmitting user data, and to control web surfing.[17]
• IM Firewall - Launched in September 2005 to protect and archive instant messaging content.[18]
• Load balancer - In November 2006, the company introduced a load balancing appliance for high availability distribution of network traffic across multiple servers.[19]
• Message archiver - In July 2007, the company introduced message archiving to index and preserve emails, and to meet legal and regulatory compliance.[20]
• SSL VPN - Announced in February 2008, the company launched its secure sockets layer virtual private network product to provide secure, clientless, remote access.[21]
• Web Application Firewall - For securing Web applications for large enterprises and to address regulation compliance such as PCI DSS.[22]
• Link Balancer - Announced in September 2008, to optimize and aggregate internet connections from different providers.[23]
• Backup services - In November 2008, the company announced a service to back up data in the cloud, including on-site backup with data deduplication and off-site data replication for disaster recovery.[24] In January 2009, Barracuda added message-level backup for Microsoft Exchange and Novell GroupWise.[25]
• Purewire Web Security Service - In October 2009, in conjunction with its acquisition of Purewire, Barracuda Networks launched the Purewire Web Security Service, a software as a service offering for Web filtering and safe web surfing.[26]
• NG Firewall - In February 2010, Barracuda announced its NG Firewalls to protect enterprise network infrastructures. The firewalls integrate web and email filtering, intrusion prevention, layer 7 application profiling, content security, and network access control into one platform that is centrally managed across multiple distributed enterprise network locations. NG Firewalls are available both as hardware and as a virtual appliance, and include wide area network traffic optimization.[27]
• CudaTel Communication Server (PBX) - In August 2010, Barracuda announced the release of CudaTel, a VOIP Private branch exchange designed for IT administrators. CudaTel features FreeSWITCH, an open-source project sponsored by Barracuda Networks.[28]

Acquisitions
In September 2007, Barracuda Networks acquired NetContinuum, a company providing application controllers to secure and manage enterprise web applications.[29] In November 2008, Barracuda Networks acquired 3SP, allowing the company to introduce Secure Sockets Layer (SSL) Virtual Private Network (VPN) products to perform malware scans on files uploaded during a VPN session to network file shares or internal Web sites.[30] In November 2008, Barracuda Networks expanded into cloud-based backup services by acquiring BitLeap.[31] In January 2009, Barracuda Networks acquired Yosemite Technologies, integrating Barracuda Backup Service with Yosemite Backup (formerly Tapeware) to add software agents for incremental backups of applications such as Microsoft Exchange Server and SQL Server, and Windows system states.[32] In September 2009, Barracuda Networks acquired controlling interest in phion AG, an Austria-based public company delivering enterprise-class firewalls.[33] In October 2009, Barracuda Networks acquired Purewire Inc., a software as a service (SaaS) company offering cloud based web filtering and security.[34]

References
[1] http://www.barracudanetworks.com/
[2] Company Product Page (http://www.barracudanetworks.com/ns/products/). Barracuda Networks. Retrieved 2010-02-10.
[3] VentureBeat, "Barracuda swallows Purewire as it becomes a bigger fish in web-based security services" (http://venturebeat.com/2009/10/13/barracuda-swallows-purewire-as-it-becomes-a-bigger-fish-in-web-based-security-services/)
[4] The Campbell Reporter, "Barracuda Networks sinks its teeth into site on Winchester Boulevard" (http://www.community-newspapers.com/archives/campbellreporter/20070323/business2.shtml)
[5] Crain's Detroit Business, "Silicon Valley firm picks Ann Arbor for office"
[6] NetworkWorld, "Barracuda attracts $40 million in venture investment" (http://www.networkworld.com/news/2006/011106-barracuda.html?fsrc=rss-virusworms)
[7] Ars Technica, "Barracuda defends open-source antivirus from patent attack" (http://arstechnica.com/news.ars/post/20080129-barracuda-defends-open-source-antivirus-from-patent-attack.html)
[8] Ars Technica, "Barracuda bites back at Trend Micro in ClamAV patent lawsuit" (http://arstechnica.com/news.ars/post/20080702-barracuda-bites-back-at-trend-micro-in-clamav-patent-lawsuit.html)
[9] Linux.com, "Barracuda offers a new alternative to Spamhaus" (http://www.linux.com/archive/articles/155880)
[10] http://andrew.triumf.ca/barracuda-problems.html
[11] http://www. … /archive/?module=comments&func=display&cid=1204572
[12] http://steve.heyvan.com/2008/11/06/barracudacentral-another-blacklist-black-hole/
[13] http://community.spiceworks.com/topic/32502
[14] San Jose Business Journal, "Barracuda Networks buys Purewire" (http://sanjose.bizjournals.com/sanjose/stories/2009/10/12/daily19.html)
[15] ComputerWorld, "Barracuda Networks launches antispam appliance line" (http://www.computerworld.com/s/article/86007/Barracuda_Networks_launches_antispam_appliance_line?taxonomyId=086)
[16] eChannelline, "Barracuda launches Spam Firewall for large enterprises" (http://www.echannelline.com/usa/story.cfm?item=23340)
[17] InformationWeek, "Barracuda Rolls Out Spyware-Blocking Appliance" (http://www.informationweek.com/news/security/vulnerabilities/showArticle.jhtml?articleID=160902103)
[18] ComputerWorld, "Security Log" (http://www.computerworld.com/s/article/104909/Security_Log?taxonomyId=017)
[19] IT & Security Portal, "Barracuda Networks Launches Barracuda Load Balancer" (http://www.it-observer.com/barracuda-networks-launches-barracuda-load-balancer.html)
[20] ISP Planet, "Barracuda's Message Archiver" (http://www.isp-planet.com/equipment/2007/barracuda_message_archiver.html)
[21] Comms Express, "New SSL VPN Announced" (http://www.comms-express.com/news/networking-equipment/floor-boxes/new-ssl-vpn-announced-18881252/)
[22] eChannelline, "Barracuda puts bite on SMB Web application controller" (http://www.echannelline.com/usa/brief.cfm?item=15073)
[23] ISP Planet, "Barracuda Networks' Link Balancer" (http://www.isp-planet.com/equipment/2008/barracuda+link+balancer.html)
[24] InformationWeek, "Barracuda Swims Into The Cloud" (http://www.informationweek.com/blog/main/archives/2008/11/barracuda_swims.html)
[25] PCWorld, "Backup Merger Unites Barracuda, Yosemite" (http://www.pcworld.com/article/158462/backup_merger_unites_barracuda_yosemite.html)
[26] Channel Insider, "Barracuda Acquires Cloud Security Vendor Purewire" (http://www.channelinsider.com/c/a/Security/Barracuda-Acquires-Cloud-Security-Vendor-Purewire-559167/)
[27] InfoSecurity, "Barracuda moves into distributed firewall technology" (http://www.infosecurity-magazine.com/view/7138/barracuda-moves-into-distributed-firewall-technology/)
[28] "Barracuda Networks Launches CudaTel – New VoIP PBX Based on the Open Source FreeSWITCH Project" (http://www.barracudanetworks.com/ns/news_and_events/index.php?nid=368)
[29] SCMagazine, "Barracuda Networks buys NetContinuum" (http://www.scmagazineus.com/barracuda-networks-buys-netcontinuum/article/35669/)
[30] InformationWeek, "Barracuda Swims Into The Cloud" (http://www.informationweek.com/blog/main/archives/2008/11/barracuda_swims.html)
[31] eWeek, "Barracuda Networks Breaks Into SSL VPN Space" (http://www.eweek.com/c/a/Security/Barracuda-Networks-Breaks-Into-SSL-VPN-Space-for-Small-Business/)
[32] PC World, "Backup Merger Unites Barracuda, Yosemite" (http://www.pcworld.com/article/158462/backup_merger_unites_barracuda_yosemite.html)
[33] Silicon Valley Business Journal, "Barracuda Networks takes controlling interest in phion" (http://sanjose.bizjournals.com/sanjose/stories/2009/09/28/daily9.html)
[34] Atlanta Business Chronicle, "Barracuda buys Purewire Inc." (http://www.bizjournals.com/atlanta/stories/2009/10/19/story7.html?ana=from_rss)

External links
• Barracuda Networks corporate website (http://www.barracudanetworks.com)
• Barracuda Networks company Products page (http://www.barracudanetworks.com/ns/products/)
• CudaTel - Barracuda Networks PBX/Phone System (http://www.cudatel.com)

Bastion host

A bastion host is a special purpose computer on a network specifically designed and configured to withstand attacks. The computer generally hosts a single application, for example a proxy server, and all other services are removed or limited to reduce the threat to the computer. It is hardened in this manner primarily due to its location and purpose, which is either on the outside of the firewall or in the DMZ, and usually involves access from untrusted networks or computers.

Background
The term is generally attributed to Marcus J. Ranum in an article discussing firewalls. In it he defines bastion hosts as "...a system identified by the firewall administrator as a critical strong point in the network's security. Generally, bastion hosts will have some degree of extra attention paid to their security, may undergo regular audits, and may have modified software."
—Marcus J. Ranum, Thinking About Firewalls [1]

Definition
A bastion host is a computer that is fully exposed to attack. The system is on the public side of the demilitarized zone (DMZ), unprotected by a firewall or filtering router. Frequently the roles of these systems are critical to the network security system. Indeed, the firewalls and routers can be considered bastion hosts; other types of bastion hosts include web, mail, DNS, and FTP servers. Due to their exposure, a great deal of effort must be put into designing and configuring bastion hosts to minimize the chances of penetration.[2]

Bastion hosts are related to multi-homed hosts and screened hosts. While a dual-homed host often contains a firewall, it is also used to host other services as well. A screened host is a dual-homed host that is dedicated to running the firewall.[3] A bastion server can also be set up using ProxyCommand with OpenSSH.[4]

Placement
There are two common network configurations that include bastion hosts and their placement. The first requires two firewalls, with bastion hosts sitting between the first "outside world" firewall and an inside firewall, in a demilitarized zone (DMZ). Often smaller networks do not have multiple firewalls, so if only one firewall exists in a network, bastion hosts are commonly placed outside the firewall.
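The ProxyCommand setup mentioned above can be sketched as an OpenSSH client configuration. This is a hedged example, not taken from the article: the host names (`bastion.example.com`, the `internal-*` pattern) and the user name are hypothetical placeholders.

```
# ~/.ssh/config
Host bastion
    HostName bastion.example.com
    User alice

Host internal-*
    # "ssh -W host:port" forwards the connection to the target host
    # through the bastion, so internal hosts are never exposed directly.
    ProxyCommand ssh -W %h:%p bastion
```

With this in place, `ssh internal-db` transparently hops through the bastion; newer OpenSSH versions offer the equivalent shorthand `ssh -J bastion internal-db` (the ProxyJump option).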

Examples
These are several examples of bastion host systems/services:
• Web server
• DNS (Domain Name System) server
• Email server
• FTP (File Transfer Protocol) server
• Proxy server
• Honeypot
• VPN (Virtual Private Network) server
• Deep-Secure Bastion [5]

Best Practices
Because bastion hosts are particularly vulnerable to attack, due to the level of required access with the outside world to make them useful, there are several best practice suggestions to follow:[6]
• Disable or remove any unneeded services or daemons on the host.
• Disable or remove any unneeded user accounts.
• Lock down user accounts as much as possible, especially root or administrator accounts.
• Disable or remove any unneeded network protocols.
• Close all ports that are not needed or not used.
• Patch the operating system with the latest security updates.
• Use encryption for logging in to the server.
• Run an Intrusion Detection System on the host.
• Configure logging and check the logs for any possible attacks.

References
[1] Marcus J. Ranum, "Thinking About Firewalls" (http://www.vtcif.telstra.com.au/pub/docs/security/ThinkingFirewalls/ThinkingFirewalls.html)
[2] "Sans Institute, Intrusion Detection FAQ: What is a bastion host?" (http://www.sans.org/resources/idfaq/bastion.php)
[3] "How to build a Bastion host" (http://secinf.net/unix_security/Building_a_Bastion_Host_Using_HPUX_11.html)
[4] "How to have OpenSSH using transparently a bastion host" (http://www.chmouel.com/blog/2009/02/08/proxycommand-ssh-bastion-proxy)
[5] "Deep-Secure Bastion, a product example" (http://www.deep-secure.com/)
[6] "Best practice to use Bastion server" (http://www.ancient-rome.info/Bastion_host)

Black hole (networking)

In networking, black holes refer to places in the network where incoming traffic is silently discarded (or "dropped") without informing the source that the data did not reach its intended recipient. When examining the topology of the network, the black holes themselves are invisible, and can only be detected by monitoring the lost traffic; hence the name.

Dead addresses
The most common form of black hole is simply an IP address that specifies a host machine that is not running, or an address to which no host has been assigned. Even though TCP/IP provides means of communicating the delivery failure back to the sender via ICMP, traffic destined for such addresses is often just dropped.

Black hole filtering
Black hole filtering refers specifically to dropping packets at the routing level, usually using a routing protocol to implement the filtering on several routers at once, often dynamically to respond quickly to distributed denial-of-service attacks.

Firewalls and "stealth" ports
Most firewalls can be configured to silently discard packets addressed to forbidden hosts or ports, resulting in small or large "black holes" in the network.

PMTUD black holes
Some firewalls incorrectly discard all ICMP packets, including the ones needed for Path MTU discovery to work correctly. This causes TCP connections from/to hosts with a lower MTU to hang.

Black hole e-mail addresses
A black hole e-mail address is an e-mail address which is valid (messages sent to it will not generate errors), but to which all messages sent are automatically deleted, and never stored or seen by humans. These addresses are often used as return addresses for automated e-mails.

External links
• Remotely triggered black hole filtering (Cisco Systems) [1]
• University of Washington blackhole monitor/lookup system [2]
• Tools for detecting a blackhole attack in an ad hoc wireless network [3]
• Remote Triggered Black Hole Filtering [4]

References
[1] http://www.cisco.com/warp/public/732/Tech/security/docs/blackhole.pdf
[2] http://hubble.cs.washington.edu/
[3] http://safewireless.sourceforge.net/
[4] http://blog.ipexpert.com/2010/11/24/remote-triggered-black-hole-filtering/
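The routing-level black holes and silent firewall drops described above can be illustrated with null-route commands. This is a sketch under stated assumptions: the 203.0.113.0/24 prefix is an example documentation range, and the commands require administrative privileges on a Linux router or a Cisco IOS device respectively.

```
# Linux: silently drop all traffic destined for the prefix
ip route add blackhole 203.0.113.0/24

# Cisco IOS equivalent: route the prefix to the null interface
ip route 203.0.113.0 255.255.255.0 Null0

# Firewall-level "stealth" drop (Linux iptables): discard with no reply at all
iptables -A INPUT -s 203.0.113.0/24 -j DROP
```

In all three cases the sender receives no ICMP error, which is exactly what makes the black hole invisible to topology probing.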

BLACKER

BLACKER is a U.S. Department of Defense computer network security project designed to achieve A1 class ratings of the Trusted Computer System Evaluation Criteria (TCSEC). It was the first secure system with trusted end-to-end encryption on the United States' Defense Data Network.[1] [2] The project was implemented by SDC and Burroughs.[3]

References
[1] Weissman, Clark. "BLACKER: security for the DDN examples of A1 security engineering trades" (http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=213253). Retrieved 2007-12-02.
[2] Weissman, Clark (1995-01-24). "Handbook for the Computer Security Certification of Trusted Systems" (http://stinet.dtic.mil/oai/oai?verb=getRecord&metadataPrefix=html&identifier=ADA390673). Retrieved 2007-12-02.
[3] Pike, John (2000-02-11). "BLACKER, an article at the Intelligence Resource Program" (http://www.fas.org/irp/program/security/blacker.htm). Retrieved 2007-12-02.

Blue Cube Security

Blue Cube Security Ltd is an independent IT solutions provider delivering enterprise-wide IT security solutions. Blue Cube was founded by CEO Gary Haycock-West in 2000. Its headquarters are in Forest Row, East Sussex in the UK, with a further office in Wellington, New Zealand. Blue Cube provides solutions in enterprise-wide security, including authentication, digital certificates, email and file encryption, firewalls, intrusion detection, intrusion prevention, VPNs, and vulnerability scanning.

References
• Pro Security Zone - The Choice Between Managing Internet Access And Blocking It [1]
• Pro Security Zone - PINsafe Added To Security Tools Offered By Blue Cube [2]
• Info Security Magazine - Data Lost, Not Found [3]
• HR Magazine - Collaboration: Learn from an organisation in the same sector as your own [4]
• Post Magazine - Policyholder Security - The key to data protection [5]
• SC Magazine - Blue Cube and SC survey will map the effects of recession on IT security [6]
• Computer Weekly - Data leakage protection: how to secure your most vital assets [7]
• CRN - Mature Imperva opts for two tiers [8]
• Financial Times - How to Survive an IT Squeeze [9]
• Post Magazine - The ego and the ID [10]

[1] http://www.prosecurityzone.com/Customisation/News/IT_Security/Internet_Security_and_Content_Filtering/The_Choice_Between_Managing_Internet_Access_And_Blocking_It.asp
[2] http://www.prosecurityzone.com/Customisation/News/IT_Security/Data_Protection/PINsafe_Added_To_Security_Tools_Offered_By_Blue_Cube.asp
[3] http://www.infosecurity-magazine.com/view/2297/data-lost-not-found-why-data-loss-is-still-prevalent-in-many-organisations-/
[4] http://www.hrmagazine.co.uk/news/search/909971/Collaboration-Learn-organisation-sector-own/
[5] http://www.postonline.co.uk/post/analysis/1222314/policyholder-security-the-key-protection
[6] http://www.scmagazineuk.com/blue-cube-and-sc-survey-will-map-the-effects-of-recession-on-it-security/article/123513/
[7] http://www.computerweekly.com/Articles/2009/01/06/233890/Data-leakage-protection-how-to-secure-your-most-vital.htm
[8] http://www.channelweb.co.uk/crn/news/2231450/mature-imperva-opts-two-tiers-4366224
[9] http://www.ft.com/cms/s/0/31b34948-aadc-11dd-897c-000077b07658.html?nclick_check=1
[10] http://www.postonline.co.uk/post/analysis/1217189/the-ego-id
[11] http://www.bluecubesecurity.com

External links
• www.bluecubesecurity.com [11]

BNC (software)

A BNC (short for bouncer) is a piece of software that is used to relay traffic and connections in computer networks, much like a proxy. Using a BNC allows a user to hide the original source of the user's connection, providing privacy as well as the ability to route traffic through a specific location. A BNC can also be used to hide the true target to which a user connects.

IRC
One common usage is over Internet Relay Chat (IRC) via a BNC running on remote servers. In such an environment, where it is very easy to ascertain a user's IP address, a BNC may help to hide the original connection source, as well as providing the opportunity for fun "vhosts" or "virtual hosts". The use of a vhost does not conceal the connection any better, but merely adds a statement as the hostname.

Example:
User A logs onto IRC directly and appears as USER!user@users.dns
User A logs onto IRC indirectly through a BNC and appears as USER!user@bnc.reverse.net

Many BNCs remain connected to an IRC server in the event the client should disconnect from the Internet. Often state changes are tracked so that they may be relayed to the client upon reconnection. Some implementations opt to store all messages sent across the network that the client would have normally received, and send them upon the client's reconnection, though this is often considered to be much too resource dependent for commercial hosting services to provide. Other logging features and bot-like functions may be included with various implementations but are not standard.

Bouncer Software
Following is an (incomplete) list of bouncer software. Most elaborate bouncers can even bounce secure SSL/TLS connections:
• ezbounce supports SSL connections, IPv6 and logging.
• Irssi supports SSL connections, IPv6 and logging. It's extensible by modules and scripts in Perl.
• JBouncer is written in Java (programming language).
• muh bnc supports logging.
• psyBNC supports SSL connections, IPv6 and logging. It's extensible by modules and scripts in C++.
• shroudBNC supports SSL connections and offers a web interface. It's extensible by modules and scripts in Perl and Tcl.
• ZNC supports SSL connections, IPv6 and logging and offers a web interface. It's extensible by modules and scripts in Tcl and C++.

FTP
BNCs are also often used for FTP, again to either hide the user and server from each other and to route traffic through a specific location. FTP bouncers can be divided into two different categories: entry and traffic. An entry bouncer acts as a gateway to the server; when used, it appears as if the bouncer is actually the FTP server, thus hiding the real location of the server completely. This also removes the need to select which FTP server to log in to when trying to access a server farm, and bouncers like cubnc can be used in a multi-server setup for easy access to each server and load balancing. A traffic bouncer relays traffic through the host it is installed on, but it doesn't hide the existence of the actual server. Multiple traffic bouncers can be installed in parallel in order to balance traffic load across different links.

External links
• BNC (software) [1] at the Open Directory Project

References
[1] http://www.dmoz.org/Computers/Software/Internet/Clients/Chat/IRC/Bouncers/

Botnet

In malware, a botnet is a collection of infected computers or bots that have been taken over by hackers (also known as bot herders) and are used to perform malicious tasks or functions. A computer becomes a bot when it downloads a file (e.g. an email attachment) that has bot software embedded in it. A botnet is considered a botnet if it is taking action on the client itself via IRC channels without the hackers having to log in to the client's computer.[1] A botnet consists of many threats contained in one.

Background
Like many things on the Internet today, bots began as a useful tool without malicious overtones. Bots were originally developed as a virtual individual that could sit on an IRC channel and do things for its owner while the owner was busy elsewhere. Soon after the release of the first IRC bot, a few worms had exploited vulnerabilities in IRC clients and used the bots to steal passwords, log keystrokes, and hide their identity.[2]

Several botnets have been found and removed from the Internet. The Dutch police found a 1.5 million node botnet[3] and the Norwegian ISP Telenor disbanded a 10,000-node botnet.[4] In July 2010, the FBI arrested a 23-year old Slovenian held responsible for the malicious software that integrated an estimated 12 million computers into a botnet.[5] Large coordinated international efforts to shut down botnets have also been initiated.[6] It has been estimated that up to one quarter of all personal computers connected to the internet may be part of a botnet.[7] Conficker, which has infected an estimated 1 million to 10 million machines and attempts to sell fake antivirus to its victims, is one of the largest botnets.[8]

The main drivers for botnets are recognition and financial gain. The larger the botnet, the more ‘kudos’ the herder can claim to have among the underground community. The bot herder will also ‘rent out’ the services of the botnet to third parties, usually for sending out spam messages, or for performing a denial of service attack against a remote target. Due to the large numbers of compromised machines within the botnet, huge volumes of traffic (either email or denial of service) can be generated. However, in recent times, the volume of spam originating from a single compromised host has dropped in order to thwart anti-spam detection algorithms – a larger number of compromised hosts send a smaller number of messages in order to evade detection by anti-spam techniques.

Organization
While the term "botnet" can be used to refer to any group of bots, such as IRC bots, this word is generally used to refer to a collection of compromised computers (called zombie computers) running software, usually installed via drive-by downloads exploiting web browser vulnerabilities, worms, Trojan horses, or backdoors, under a common command-and-control infrastructure. A botnet's originator (aka "bot herder" or "bot master") can control the group remotely, usually through a means such as IRC, and usually for nefarious purposes. Individual programs manifest as IRC "bots". While botnets are often named after their malicious software name, there are typically multiple botnets in operation using the same malicious software families, but operated by different criminal entities.[9]

The typical botnet consists of a bot server (usually an IRC server) and one or more botclients. Often the command-and-control takes place via an IRC server or a specific channel on a public IRC network; this server is known as the command-and-control server ("C&C"). Due to most conventional IRC networks taking measures and blocking access to previously-hosted botnets, controllers must now find their own servers. Often, a botnet will include a variety of connections and network types. Sometimes a controller will hide an IRC server installation on an educational or corporate site where high-speed connections can support a large number of other bots. Exploitation of this method of using a bot to host other bots has proliferated only recently.

Though rare, more experienced botnet operators program their own commanding protocols from scratch. The constituents of these protocols include a server program, a client program for operation, and the program that embeds itself on the victim's machine (the bot). All three of these usually communicate with each other over a network using a unique encryption scheme for stealth and protection against detection or intrusion into the botnet network. A bot typically runs hidden and uses a covert channel (e.g. the RFC 1459 (IRC) standard, Twitter or IM) to communicate with its C&C server. Generally, the perpetrator of the botnet has compromised a series of systems using various tools (exploits, buffer overflows, as well as others; see also RPC). Newer bots can automatically scan their environment and propagate themselves using vulnerabilities and weak passwords. Generally, the more vulnerabilities a bot can scan and propagate through, the more valuable it becomes to a botnet controller community. The process of stealing computing resources as a result of a system being joined to a "botnet" is sometimes referred to as "scrumping."

Botnet servers will often liaise with other botnet servers, such that a group may contain 20 or more individual cracked high-speed connected machines as servers, linked together for purposes of greater redundancy. Actual botnet communities usually consist of one or several controllers that rarely have highly-developed command hierarchies between themselves; they rely on individual friend-to-friend relationships.[10]

The architecture of botnets has evolved over time, and not all botnets exhibit the same topology for command and control. Depending upon the topology implemented by the botnet, it may be more resilient to shutdown, enumeration, or command and control location discovery. However, some of these topologies limit the saleability and rental potential of the botnet to other third-party operators.[11] Typical botnet topologies are:
• Star
• Multi-server
• Hierarchical
• Random

To thwart detection, some botnets were scaling back in size. As of 2006, the average size of a network was estimated at 20,000 computers, although larger networks continued to operate.[12]

Formation and exploitation
This example illustrates how a botnet is created and used to send email spam.
1. A botnet operator sends out viruses or worms, infecting ordinary users' computers, whose payload is a malicious application – the bot.
2. The bot on the infected PC logs into a particular C&C server (often an IRC server, but, in some cases, a web server).
3. A spammer purchases the services of the botnet from the operator.
4. The spammer provides the spam messages to the operator, who instructs the compromised machines via the IRC server, causing them to send out spam messages.

Botnets are exploited for various purposes, including denial-of-service attacks, creation or misuse of SMTP mail relays for spam (see Spambot), click fraud, spamdexing, and the theft of application serial numbers, login IDs, and financial information such as credit card numbers.[13]

The botnet controller community features a constant and continuous struggle over who has the most bots, the highest overall bandwidth, and the most "high-quality" infected machines, like university, corporate, and even government machines.

Types of attacks

• Denial-of-service attacks, where multiple systems autonomously access a single Internet system or service in a way that appears legitimate, but much more frequently than normal use, causing the system to become busy.
• Adware exists to advertise some commercial entity actively and without the user's permission or awareness, for example by replacing banner ads on web pages with those of another content provider.
• Spyware is software which sends information to its creators about a user's activities - typically passwords, credit card numbers and other information that can be sold on the black market. Compromised machines that are located within a corporate network can be worth more to the bot herder, as they can often gain access to confidential information held within that company.[15] There have been several targeted attacks on large corporations with the aim of stealing sensitive information; one such example is the Aurora botnet.[14]
• E-mail spam: e-mail messages disguised as messages from people, but which are either advertising, annoying, or malicious in nature.
• Click fraud is the user's computer visiting websites without the user's awareness to create false web traffic for the purpose of personal or commercial gain.
• Fast flux is a DNS technique used by botnets to hide phishing and malware delivery sites behind an ever-changing network of compromised hosts acting as proxies.
• Access number replacements are where the botnet operator replaces the access numbers of a group of dial-up bots with a victim's phone number. Given enough bots taking part in this attack, the victim is consistently bombarded with phone calls attempting to connect to the internet. With very little to defend against this attack, most victims are forced into changing their phone numbers (land line, cell phone, etc.).

Preventive measures

If a machine receives a denial-of-service attack from a botnet, few choices exist. Given the general geographic dispersal of botnets, it becomes difficult to identify a pattern of offending machines, and the sheer volume of IP addresses does not lend itself to the filtering of individual cases. Passive OS fingerprinting can identify attacks originating from a botnet: network administrators can configure newer firewall equipment to take action on a botnet attack by using information obtained from passive OS fingerprinting. The most serious preventive measures utilize rate-based intrusion prevention systems implemented with specialized hardware.

A network-based intrusion detection system (NIDS) is an effective approach for detecting activity that resembles a botnet attack. A NIDS monitors a network rather than a single system, gets most of its results from network packet analysis, and sees protected hosts in terms of their external interfaces to the rest of the network.

Some botnets use free DNS hosting services such as DynDns.org, No-IP.com, and Afraid.org to point a subdomain towards an IRC server that will harbor the bots. While these free DNS services do not themselves host attacks, they provide reference points (often hard-coded into the botnet executable), and removing such services can cripple an entire botnet. Recently, these companies have undertaken efforts to purge their domains of these subdomains; the botnet community refers to such efforts as "nullrouting", because the DNS hosting services usually re-direct the offending subdomains to an inaccessible IP address.

Similarly, some botnets implement custom versions of well-known protocols, and the implementation differences can be used for fingerprint-based detection of botnets.[16] For example, Mega-D features a slightly modified SMTP protocol implementation for testing the spam capability, and bringing down Mega-D's SMTP server disables the entire pool of bots that rely upon the same SMTP server.
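Fingerprint-based detection of such custom protocol implementations can be illustrated with a small sketch. The deviation rules below are invented for illustration (the actual quirks of Mega-D's modified SMTP implementation are not documented here): a captured SMTP command stream from a client is checked against two conventions that well-behaved mail software follows, and any deviations become a fingerprint.

```python
import re

# Toy fingerprint rules (illustrative only -- not Mega-D's real deviations):
#  - a well-behaved client greets with "EHLO <host>" or "HELO <host>"
#    before issuing any other command
#  - command verbs are conventionally sent in upper case
GREETING = re.compile(r"^(EHLO|HELO)\s+\S+$")

def fingerprint_smtp(commands):
    """Return the list of protocol deviations seen in a captured command stream."""
    deviations = []
    if not commands or not GREETING.match(commands[0]):
        deviations.append("no-greeting")
    for cmd in commands:
        verb = cmd.split(None, 1)[0]
        if verb != verb.upper():
            deviations.append("lowercase-verb:" + verb)
    return deviations
```

A sensor would score each client by its deviations; hosts that consistently deviate in exactly the same way can then be grouped as likely members of a single botnet that shares one custom protocol implementation.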

However, the botnet server structure mentioned above has inherent vulnerabilities and problems. For example, if one was to find one server with one botnet channel, often all other servers, as well as other bots themselves, will be revealed. If a botnet server structure lacks redundancy, the disconnection of one server will cause the entire botnet to collapse, at least until the controller(s) decides on a new hosting space. However, more recent IRC server software includes features to mask other connected servers and bots, so that a discovery of one channel will not lead to disruption of the botnet.

Newer botnets are almost entirely P2P, with command-and-control embedded into the botnet itself. By being dynamically updateable and variable, they can evade having any single point of failure. Commanders can be identified solely through secure keys, and all data except the binary itself can be encrypted. For example, a spyware program may encrypt all suspected passwords with a public key hard-coded or distributed into the bot software; only with the private key, which only the commander has, can the data that the bot has captured be read. Newer botnets have even been capable of detecting and reacting to attempts to figure out how they work, and a large botnet that can detect that it is being studied can even DDoS those studying it off the internet. There is an effort by researchers at Sandia National Laboratories to analyze the behavior of these botnets by simultaneously running one million Linux kernels as virtual machines on a 4,480-node Dell high-performance computer cluster.[17]

Several security companies such as Afferent Security Labs, FireEye, Symantec, Trend Micro, Umbra Data and Damballa have announced offerings to stop botnets. While some, like Norton AntiBot (discontinued), are aimed at consumers, most are aimed at protecting enterprises and/or ISPs. Network-based approaches tend to use the techniques described above: shutting down C&C servers, nullrouting DNS entries, or completely shutting down IRC servers. Host-based techniques use heuristics to try to identify bot behavior that has bypassed conventional anti-virus software.

Historical list of botnets

Date created | Name | Estimated no. of bots | Spam capacity | Aliases
2009 (May) | BredoLab | 30,000,000 [18] | 3.6 billion/day | Oficla
2008 (around) | Mariposa | 12,000,000 [19] | ? | ?
? | Conficker | 10,500,000+ [20] | 10 billion/day | DownUp, DownAndUp, DownAdUp, Kido
? | Zeus | 3,600,000 (US only) [21] | n/a | Zbot, PRG, Wsnpoem, Gorhax, Kneber
2007 (around) | Cutwail | 1,500,000 [22] | 74 billion/day | Pandex, Mutant (related to: Wigon, Pushdo)
? | Grum | 560,000 [23] | 39.9 billion/day | Tedroo
? | Mega-D | 509,000 [24] | 10 billion/day | Ozdok
? | Kraken | 495,000 [25] | 9 billion/day | Kracken
2007 (March) | Srizbi | 450,000 [26] | 60 billion/day | Cbeplay, Exchanger
? | Lethic | 260,000 [27] | 2 billion/day | none
2004 (early) | Bagle | 230,000 [27] | 5.7 billion/day | Beagle, Mitglieder, Lodeight
? | Bobax | 185,000 [27] | 9 billion/day | Bobic, Oderoor, Cotmonger, Hacktool.Spammer, Kraken
? | Torpig | 180,000 [28] | n/a | Sinowal, Anserin

Historical list of botnets (continued)

Date created | Name | Estimated no. of bots | Spam capacity | Aliases
2006 (around) | Rustock | 150,000 [29] | 30 billion/day | RKRustok, Costrat
? | Storm | 160,000 [30] | 3 billion/day | Nuwar, Peacomm, Zhelatin
? | Donbot | 125,000 [31] | 0.8 billion/day | Buzus, Bachsoy
2008 (November) | Waledac | 80,000 [32] | 1.5 billion/day | Waled, Waledpak
? | Maazben | 50,000 [33] | 0.5 billion/day | None
? | Onewordsub | 40,000 [33] | 1.8 billion/day | ?
? | Gheg | 30,000 [33] | 0.24 billion/day | Tofsee, Mondera
? | Nucrypt | 20,000 [33] | 5 billion/day | Loosky, Locksky
? | Wopla | 20,000 [33] | 0.6 billion/day | Pokier, Slogger, Cryptic
2008 (around) | Asprox | 15,000 [34] | ? | Danmec, Hydraflux
? | Spamthru | 12,000 [27] | 0.35 billion/day | Spam-DComServ, Covesmer, Xmiler
? | Xarvester | 10,000 [27] | 0.15 billion/day | Rlsloup, Pixoliz
2009 (August) | Festi | ? | 2.25 billion/day | ?
2008 (around) | Gumblar | ? | ? | None
? | Akbot | ? | ? | None

References
[1] Schiller, Craig A.; et al. (2007). "1". Botnets: The Killer Web App. Rockland, MA: Syngress Publishing. ISBN 9781597491358.
[2] Schiller, Craig A.; et al. (2007). "2". Botnets: The Killer Web App. Rockland, MA: Syngress Publishing. p. 156. ISBN 9781597491358.
[3] Botnet operation controlled 1.5m PCs (http://www.vnunet.com/vnunet/news/2144375/botnet-operation-ruled-million) by Tom Sanders, vnunet.com.
[4] Telenor takes down 'massive' botnet (http://www.theregister.co.uk/2004/09/09/telenor_botnet_dismantled/) by John Leyden. The Register.
[5] "SOME INPORTANT INFORMATION WHEN YOU USE 3RD PARTY APPICATIONS AND GAMES BECAREFUL" (http://www.facebook.com/note.php?note_id=191466177550278). Facebook. Retrieved 6 April 2011.
[6] ISPs urged to throttle spam zombies (http://www.theregister.co.uk/2005/05/24/operation_spam_zombie/) by John Leyden. The Register.
[7] Criminals 'may overwhelm the web' (http://news.bbc.co.uk/1/hi/business/6298641.stm). BBC, 25 January 2007.
[8] Messmer, Ellen. "The botnet world is booming" (http://www.networkworld.com/news/2009/070909-botnets-increasing.html). Network World. Retrieved 2010-07-30. "The size of bot networks peaked in mid-2004, with many using more than 100,000 infected machines," according to Mark Sunner, chief technology officer at MessageLabs; "The average botnet size is now about 20,000 computers," he said.
[9] Many-to-Many Botnet Relationships (http://www.damballa.com/downloads/d_pubs/WP Many-to-Many Botnet Relationships (2009-05-21).pdf) (PDF). Damballa, 8 June 2009.
[10] "what is a Botnet trojan?" (http://www.dslreports.com/faq/14158). DSLReports.
[11] Botnet Communication Topologies (http://www.damballa.com/downloads/r_pubs/WP Botnet Communications Primer (2009-06-04).pdf) (PDF). Damballa, 10 June 2009. Retrieved 8 April 2011.
[12] "Hackers Strengthen Malicious Botnets by Shrinking Them" (http://csdl2.computer.org/comp/mags/co/2006/04/r4017.pdf) (PDF). Computer (IEEE Computer Society), April 2006.
[13] "Trojan horse, and Virus FAQ" (http://www.dslreports.com/faq/trojans/1.0_Trojan_horses). DSL Reports. Retrieved 2010-10-22.
[14] "Operation Aurora - The Command Structure" (http://www.damballa.com/research/aurora/). Damballa. Retrieved 2010-07-30.
[15] Schiller, Craig A.; et al. (2007). "5". Botnets: The Killer Web App. Rockland, MA: Syngress Publishing. p. 29. ISBN 9781597491358.

[16] C. Y. Cho, D. Babic, R. Shin, and D. Song. Inference and Analysis of Formal Models of Botnet Command and Control Protocols (http://www.domagoj-babic.com/index.php/Pubs/CCS10botnets). 2010 ACM Conference on Computer and Communications Security.
[17] "Researchers Boot Million Linux Kernels to Help Botnet Research" (http://www.eweek.com/c/a/Security/Researchers-Boot-Million-Linux-Kernels-to-Help-Botnet-Research-550216/?kc=EWKNLLIN08182009STR2). eWeek. 2009-08-12. Retrieved 2011-04-23.
[18] Infosecurity (UK) - BredoLab downed botnet linked with Spamit.com (http://www.infosecurity-magazine.com/view/13620/bredolab-downed-botnet-linked-with-spamitcom/)
[19] "Suspected 'Mariposa Botnet' creator arrested" (http://www2.canada.com/topics/technology/story.html?id=3333655). canada.com. Retrieved 2010-07-30.
[20] "Calculating the Size of the Downadup Outbreak - F-Secure Weblog : News from the Lab" (http://www.f-secure.com/weblog/archives/00001584.html). F-Secure. 2009-01-16. Retrieved 2010-04-24.
[21] America's 10 most wanted botnets (http://www.networkworld.com/news/2009/072209-botnets.html). Network World. Retrieved 7 April 2011.
[22] "Pushdo Botnet - New DDOS attacks on major web sites - Harry Waldron - IT Security" (http://msmvps.com/blogs/harrywaldron/archive/2010/02/02/pushdo-botnet-new-ddos-attacks-on-major-web-sites.aspx). Msmvps.com. 2010-02-02. Retrieved 2010-07-30.
[23] "Research: Small DIY botnets prevalent in enterprise networks" (http://www.zdnet.com/blog/security/research-small-diy-botnets-prevalent-in-enterprise-networks/4485). ZDNet. Retrieved 2010-07-30.
[24] Warner, Gary (2010-12-02). "Oleg Nikolaenko, Mega-D Botmaster to Stand Trial" (http://garwarner.blogspot.com/2010/12/oleg-nikolaenko-mega-d-botmaster-to.html). CyberCrime & Doing Time. Retrieved 2010-12-06.
[25] "New Massive Botnet Twice the Size of Storm - Security/Perimeter" (http://www.darkreading.com/security/perimeter/showArticle.jhtml?articleID=211201307). DarkReading. Retrieved 2011-04-23.
[26] "Technology | Spam on rise after brief reprieve" (http://news.bbc.co.uk/2/hi/technology/7749835.stm). BBC News. 2008-11-26. Retrieved 2010-04-24.
[27] http://www.messagelabs.com/mlireport/MLI_2010_04_Apr_FINAL_EN.pdf
[28] Researchers hijack control of Torpig botnet - SC Magazine US (http://www.scmagazineus.com/researchers-hijack-control-of-torpig-botnet/article/136207/)
[29] Chuck Miller (2008-07-25). "The Rustock botnet spams again" (http://www.scmagazineus.com/the-rustock-botnet-spams-again/article/112940/). SC Magazine US. Retrieved 2010-07-30.
[30] "Storm Worm network shrinks to about one-tenth of its former size" (http://tech.blorge.com/Structure:/2007/10/21/2483/). Tech.Blorge.Com. 2007-10-21. Retrieved 2010-07-30.
[31] Spam Botnets to Watch in 2009 - SecureWorks (http://www.secureworks.com/research/threats/botnets2009/)
[32] "Waledac botnet 'decimated' by MS takedown" (http://www.theregister.co.uk/2010/03/16/waledac_takedown_success/). The Register. 2010-03-16. Retrieved 2011-04-23.
[33] Gregg Keizer (2008-04-09). "Top botnets control 1M hijacked computers" (http://www.computerworld.com/s/article/9076278/Top_botnets_control_1M_hijacked_computers). Computerworld. Retrieved 2011-04-23.
[34] "Botnet sics zombie soldiers on gimpy websites" (http://www.theregister.co.uk/2008/05/14/asprox_attacks_websites/). The Register. 2008-05-14. Retrieved 2011-04-23.

External links
• Wired.com - Attack of the Bots (http://www.wired.com/wired/archive/14.11/botnet.html) at Wired
• Dark Reading - Botnets Battle Over Turf (http://www.darkreading.com/document.asp?doc_id=122116&WT.svl=news1_1)
• eWeek.com - Is the Botnet Battle Already Lost? (http://www.eweek.com/article2/0,1895,2029720,00.asp)
• The Honeynet Project & Research Alliance, "Know your Enemy: Tracking Botnets" (http://www.honeynet.org/papers/bots/)
• The Shadowserver Foundation (http://www.shadowserver.org) - An all-volunteer security watchdog group that gathers, tracks, and reports on malware, botnet activity, and electronic fraud.
• NANOG Abstract: Botnets (http://www.nanog.org/meetings/nanog32/presentations/kristoff.pdf) - John Kristoff's NANOG32 Botnets presentation.
• Mobile botnets (http://www.daemon.be/maarten/mobbot.html) - An economic and technological assessment of mobile botnets.
• Lowkeysoft - Intrusive analysis of a web-based proxy botnet (http://lowkeysoft.com/proxy/) (including administration screenshots).
• How-to: Build your own botnet with open source software (http://howto.wired.com/wiki/Build_your_own_botnet_with_open_source_software)

• List of dynamic (dsl, cable, modem, etc) addresses (http://luno.org/project/lred) - Filter SMTP mail for hosts likely to be in botnets.
• FBI LAX Press Release DOJ (http://losangeles.fbi.gov/dojpressrel/pressrel08/la041608usa.htm) - FBI, April 16, 2008
• Milcord Botnet Defense (http://wiki.milcord.com/wiki/Botnet_Defense) - DHS-sponsored R&D project that uses machine learning to adaptively detect botnet behavior at the network level
• A Botnet by Any Other Name (http://www.securityfocus.com/columnists/501) - SecurityFocus column by Gunter Ollmann on botnet naming.
• ATLAS Global Botnets Summary Report (http://atlas.arbor.net/summary/botnets) - Real-time database of malicious botnet command and control servers.

BredoLab botnet

The BredoLab Botnet, also known by its alias Oficla,[1] was a Russian-founded[2] botnet mostly involved in viral e-mail spam. Before the botnet was eventually dismantled in November 2010 through the seizure of 143 command and control servers, it was estimated to consist of around 30 million zombie computers.[3][4][5]

Operations

Though the earliest reports surrounding the BredoLab botnet originate from May 2009 (when the first malware samples of the Bredolab trojan horse were found), the botnet itself did not rise to prominence until August 2009, when there was a major surge in its size.[6][7] Bredolab's main form of propagation was through sending malicious e-mails that included malware attachments which would infect a computer when opened, effectively turning the computer into another zombie controlled by the botnet.[8] At its peak, the botnet was capable of sending 3.6 billion viral emails every day. The other main form of propagation was through the use of drive-by downloads - a method which exploits security vulnerabilities in software. This method allowed the botnet to bypass software protection in order to facilitate downloads without the user being aware of them.[9]

The main income of the botnet was generated through leasing parts of the botnet to third parties who could subsequently use these infected systems for their own purposes, and security researchers estimate that the owner of the botnet made up to $139,000 a month from botnet related activities.[4][10][11] Due to the rental business strategy, the payload of Bredolab has been very diverse, and ranged from scareware to malware and e-mail spam.[12]

Dismantling and aftermath

On 25 October 2010, a team of Dutch law enforcement agents seized control of 143 command and control servers rented from LeaseWeb,[8] effectively removing the botnet herder's ability to control the botnet centrally.[11][12] In an attempt to regain control over his botnet, the botnet herder utilized 220,000 computers which were still under his control to unleash a DDoS attack on LeaseWeb servers, though these attempts were ultimately in vain.[2][12][13] After taking control over the botnet, the law enforcement team utilized the botnet itself to send a message to owners of infected computers, stating that their computer was part of the botnet.[14]

Subsequently, Armenian law enforcement officers arrested an Armenian citizen[4][16] on the basis of being the suspected mastermind behind the botnet.[8][15] The suspect denied any such involvement in the botnet.[15] While the seizure of the command and control servers severely disrupted the botnet's ability to operate,[17] the botnet itself is still partly intact, with command and control servers still being present in Russia and Kazakhstan.[2][12] Security firm FireEye believes that a secondary group of botnet herders has taken over the remaining part of the botnet for their own purposes, possibly a previous client who reverse engineered parts of the original botnet creator's code. Even so, the group noted that the botnet's size and capacity has been severely reduced by the law enforcement intervention.[10][18]

References
[1] Search the malware encyclopedia: Bredolab (http://www.microsoft.com/security/portal/Threat/Encyclopedia/Search.aspx?query=Bredolab). Microsoft.com
[2] Bredolab botnet taken down after Dutch intervention - SC Magazine UK (http://www.scmagazineuk.com/bredolab-botnet-taken-down-after-dutch-intervention/article/181737/)
[3] Researchers: Bredolab still lurking, though severely injured (Update 3) (http://news. ... php/201043/6346/Researchers-Bredolab-still-lurking-though-severely-injured-Update-3)
[4] Infosecurity (UK) - BredoLab downed botnet linked with Spamit.com (http://www.infosecurity-magazine.com/view/13620/bredolab-downed-botnet-linked-with-spamitcom/)
[5] The aftermath of the Bredolab botnet shutdown (http://www.net-security.org/secworld.php?id=10089)
[6] http://us.trendmicro.com/imperia/md/content/us/trendwatch/researchandanalysis/bredolab_final.pdf
[7] Trojan.Bredolab | Symantec (http://www.symantec.com/security_response/writeup.jsp?docid=2009-052907-2436-99)
[8] Infosecurity (USA) - Dutch government shuts down Bredolab botnet (http://www.infosecurity-us.com/view/13461/dutch-government-shuts-down-bredolab-botnet)
[9] Trojan.Bredolab - Technical Details | Symantec (http://www.symantec.com/security_response/writeup.jsp?docid=2009-052907-2436-99&tabid=2)
[10] Bredolab Down but Far from Out After Botnet Takedown - Security (http://www.eweek.com/c/a/Security/Bredolab-Down-But-Far-From-Out-After-Botnet-Takedown-160657/). eWeek, 28 October 2010
[11] More Bredolab arrests may occur, say Dutch prosecutors - Techworld.com (http://www.techworld.com/security/3246311/more-bredolab-arrests-may-occur-say-dutch-prosecutors/)
[12] Bredolab Botnet Still Spewing Malware - InformationWeek (http://www.informationweek.com/news/security/vulnerabilities/showArticle.jhtml?articleID=228000344&subSection=News)
[13] Suspected Bredolab worm mastermind arrested in Armenia | Technology | guardian.co.uk (http://www.guardian.co.uk/technology/2010/oct/26/bredolab-worm-suspect-arrested-armenia)
[14] Suspected Bredolab Botnet Runner Arrested in Armenia - Softpedia (http://news.softpedia.com/news/Suspected-Bredolab-Runner-Arrested-in-Armenia-163068.shtml)
[15] Undead Bredolab zombie network lashes out from the grave (http://www.theregister.co.uk/2010/10/29/bredolab_botnet_death_throes/). The Register, 29 October 2010
[16] Bredolab Mastermind Was Key Spamit.com Affiliate - Krebs on Security (http://krebsonsecurity.com/2010/10/bredolab-mastermind-was-key-spamit-com-affiliate/)
[17] Bredolab: dead, dying or dormant? » CounterMeasures (http://countermeasures.trendmicro.eu/bredolab-dead-dying-or-dormant/)
[18] FireEye Malware Intelligence Lab: Bredolab - Severely Injured but not dead (http://blog.fireeye.com/research/2010/10/bredolab-severely-injured-but-not-dead.html)

Bro (software)

Bro
Original author(s): Vern Paxson
Stable release: 1.5.3 / March 3, 2011
Operating system: Linux
Type: Network intrusion detection system
License: BSD license
Website: www.bro-ids.org [1]

Bro is an open source Unix based network intrusion detection system (NIDS). Bro was originally written by Vern Paxson. It is released under the BSD license.

External links
• Bro IDS Home page [1]
• The original paper describing Bro [2]

References
[1] http://www.bro-ids.org/
[2] http://www.icir.org/vern/papers/bro-CN99.html
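A NIDS such as Bro works by turning observed network traffic into a stream of events that user-written policy scripts react to. The Python sketch below mimics that event-driven structure in spirit only; the event name, record fields, and the IRC-port heuristic are invented for illustration and are not Bro's actual scripting API.

```python
from collections import defaultdict

handlers = defaultdict(list)   # event name -> registered policy handlers
alerts = []                    # findings raised by policy handlers

def on(event):
    """Decorator that registers a policy handler for a named event."""
    def register(fn):
        handlers[event].append(fn)
        return fn
    return register

def dispatch(event, record):
    """Called by the (omitted) packet-analysis engine for each event."""
    for fn in handlers[event]:
        fn(record)

@on("connection_established")
def flag_irc(conn):
    # Crude heuristic: flag connections to port 6667, commonly used
    # by IRC command-and-control channels.
    if conn["dst_port"] == 6667:
        alerts.append(("possible-irc-bot", conn["src"]))

dispatch("connection_established", {"src": "192.0.2.7", "dst_port": 6667})
dispatch("connection_established", {"src": "192.0.2.8", "dst_port": 443})
```

Separating the traffic-to-event engine from the policy handlers is the design choice that lets analysts change detection policy without touching the packet-processing core.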

Byzantine Foothold

Byzantine Foothold is a classified United States Department of Defense emergency program within the larger Cyber Initiative framework, specifically aimed at curbing and preventing foreign intrusions into the computer networks of US federal agencies.[1] The program was established after a highly potent hacker attack was detected at Booz Allen Corp. The twenty largest American military-industrial contractors have also been invited to participate in the program.[1]

References
[1] Grow, Brian; Keith Epstein; Chi-Chu Tschang (10 April 2008). "The New E-spionage Threat" (http://www.businessweek.com/magazine/content/08_16/b4080032218430.htm). BusinessWeek. Retrieved 15 April 2008.

Captive portal

The captive portal technique forces an HTTP client on a network to see a special web page (usually for authentication purposes) before using the Internet normally. A captive portal turns a Web browser into an authentication device. This is done by intercepting all packets, regardless of address or port, until the user opens a browser and tries to access the Internet. At that time the browser is redirected to a web page which may require authentication and/or payment, or simply display an acceptable use policy and require the user to agree. Captive portals are used at most Wi-Fi hotspots, and they can be used to control wired access (e.g. apartment houses, hotel rooms, business centers, "open" Ethernet jacks) as well.

Implementation

There is more than one way to implement a captive portal.

Redirection by HTTP

If an unauthenticated client requests a website, DNS is queried by the browser and the appropriate IP resolved as usual. The browser then sends an HTTP request to that IP address. This request, however, is intercepted by a firewall and forwarded to a redirect server. This redirect server responds with a regular HTTP response which contains HTTP status code 302 to redirect the client to the captive portal. To the client, this process is totally transparent: the client assumes that the website actually responded to the initial request and sent the redirect.

Since the login page itself must be presented to the client, either that login page is locally stored in the gateway, or the web server hosting that page must be "whitelisted" via a walled garden to bypass the authentication process. In addition to whitelisting the URLs of web hosts, multiple web servers can be whitelisted (say for iframes or links within the login page). Depending on the feature set of the gateway, some gateways can whitelist TCP ports. The MAC address of attached clients can also be set to bypass the login process.
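The redirect step described above can be sketched as a minimal redirect server. The portal URL below is a made-up placeholder, and in a real deployment this handler would sit behind the firewall rule that diverts unauthenticated traffic to it; the sketch only shows the 302 response itself.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

PORTAL_URL = "https://portal.example.net/login"  # hypothetical login page

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Answer every intercepted request with a 302; the originally
        # requested path is passed along so the portal can send the
        # client back there after authentication.
        self.send_response(302)
        self.send_header("Location", PORTAL_URL + "?orig=" + self.path)
        self.send_header("Content-Length", "0")
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the sketch quiet
```

Running `HTTPServer(("0.0.0.0", 8080), RedirectHandler).serve_forever()` would serve the redirect on port 8080; the browser, seeing a normal 302, follows it to the portal exactly as the text describes.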

IP Redirect

Client traffic can also be redirected using IP redirect on the layer 3 level. This has the disadvantage that content served to the client does not match the URL.

Redirection by DNS

When a client requests a website, DNS is queried by the browser. The firewall will make sure that only the DNS server(s) provided by DHCP can be used by unauthenticated clients (or, alternatively, it will forward all DNS requests by unauthenticated clients to that DNS server). This DNS server will return the IP address of the captive portal page as a result of all DNS lookups. Implementing a firewall or ACL that ensures no inside clients can use an outside DNS server is critical; some naive implementations don't block outgoing DNS requests from clients, and are therefore very easy to bypass. The DNS poisoning technique used here, when not considering answers with a TTL of 0, may negatively affect post-authenticated internet use when the client machine references non-authentic data in its local resolver cache.

Captive portals are gaining increasing use on free open wireless networks where, instead of authenticating users, they often display a message from the provider along with the terms of use. Although the legal standing is still unclear (especially in the USA), common thinking is that by forcing users to click through a page that displays terms of use and explicitly releases the provider from any liability, any potential problems are mitigated. They also allow enforcement of payment structures.

Software captive portals
• Air Marshal, software based for Linux platform (commercial)
• Amazingports, Linux based software with integrated billing and payment implementing service-oriented provisioning, free basic functionality
• ChilliSpot, open source Linux daemon [abandoned]
• CoovaChilli, open source Linux daemon based on ChilliSpot
• DNS Redirector, Windows based software solution (commercial)
• FirstSpot, Windows based software solution (commercial)
• HotSpotPA, for any Chillispot or Mikrotik router
• Hotspot Engine
• LogiSense, Billing & OSS / Network Access Control
• m0n0wall, FreeBSD based firewall distribution
• NoCatAuth, open source Linux daemon
• PacketFence, Linux based Network Access Control software featuring a captive portal (open source)
• pfSense, FreeBSD based firewall software derived from m0n0wall
• pointHotspot, a web-based hotspot solution, paid or a 30 day trial by request
• SilverSplash, an open source ad serving captive portal for Linux platforms
• Sputnik, software as a service solution (commercial)
• Untangle Captive Portal, modified Linux OS
• WiFiDog Captive Portal Suite, small C based kernel solution (embeddable), based on OpenWRT
• WilmaGate, C++ based and executable both in Linux and Windows/Cygwin environments
• Zeroshell, Linux based network services distribution
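The DNS side of redirection can be sketched as a resolver that answers every A query with the portal's address. The portal IP below is a placeholder; note the TTL of 0 in the answer, matching the resolver-cache concern mentioned above.

```python
import struct

PORTAL_IP = "10.0.0.1"  # hypothetical captive-portal address

def poison_response(query: bytes) -> bytes:
    """Build a DNS response that resolves any A query to the portal."""
    tid = query[:2]                       # echo the transaction ID
    rd = query[2] & 0x01                  # preserve the recursion-desired bit
    flags = struct.pack(">H", 0x8400 | (rd << 8))   # QR=1 (response), AA=1
    header = tid + flags + struct.pack(">HHHH", 1, 1, 0, 0)
    # Copy the question section verbatim (name, QTYPE, QCLASS).
    q_end = query.index(b"\x00", 12) + 5
    question = query[12:q_end]
    # One answer: pointer (0xc00c) to the name in the question, type A,
    # class IN, TTL 0 so the fake record is not cached after login.
    answer = (b"\xc0\x0c" + struct.pack(">HHIH", 1, 1, 0, 4)
              + bytes(int(o) for o in PORTAL_IP.split(".")))
    return header + question + answer
```

A UDP server bound to port 53 on the gateway would call this for each datagram arriving from an unauthenticated client; after authentication, the firewall lets the client's DNS traffic reach real resolvers instead.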

Limitations

Some of these implementations merely require users to pass an SSL encrypted login page, after which their IP and MAC address are allowed to pass through the gateway. This has been shown to be exploitable with a simple packet sniffer: once the IP and MAC addresses of other connecting computers are found to be authenticated, any machine can spoof the MAC address and IP of the authenticated target and be allowed a route through the gateway. For this reason some captive portal solutions created extended authentication mechanisms to limit the risk of usurpation.

Captive portals require the use of a browser; this is usually the first application that users start, but users who first use an email client or another application will find the connection not working without explanation, and will need to open a browser to validate. Non-browser authentication is possible using WISPr, an XML-based authentication protocol for this purpose, or MAC-based authentication or authentications based on other protocols.

Platforms that have Wi-Fi and a TCP/IP stack but do not have a web browser that supports HTTPS cannot use many captive portals. Such platforms include the Nintendo DS running a game that uses Nintendo Wi-Fi Connection. There also exists the option of the platform vendor entering into a service contract with the operator of a large number of captive portal hotspots to allow free or discounted access to the platform vendor's servers via the hotspot's walled garden, such as the deal between Nintendo and Wayport. For example, VoIP SIP ports could be allowed to bypass the gateway to allow phones to work.

References
[1] CaptivePortal (http://wiki.personaltelco.net/index.cgi/CaptivePortal)
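The session-hijacking weakness described in the limitations above comes from keying authorization on identifiers that travel in cleartext on the LAN. The gateway model below is a deliberately minimal assumption, not any particular product's logic:

```python
# Toy gateway session table: authorization is keyed on (MAC, IP) alone.
authorized = set()

def portal_login(mac, ip):
    """Called after the user passes the (SSL) login page."""
    authorized.add((mac, ip))

def gateway_allows(mac, ip):
    """Per-packet check the gateway applies to client traffic."""
    return (mac, ip) in authorized
```

Because both values are sniffable, an attacker who spoofs an authenticated pair is indistinguishable from the legitimate user, which is why extended mechanisms (per-session keys, 802.1X, VPN tunnels) are used to limit usurpation.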

Capture the flag

Capture the Flag
Players: Large group, around 6-10 players
Random chance: Low
Skills required: stealth, endurance, observation

Capture the Flag (CTF) is a traditional outdoor sport often played by children or sometimes adults, where two teams each have a flag (or other marker) and the objective is to capture the other team's flag, located at the team's "base," and bring it safely back to their own base. Enemy players can be "tagged" by players in their home territory; these players are then, depending on the rules, out of the game, members of the opposite team, or "in jail." (One variation of the game includes a "jail" area in addition to the flag on each team's territory.)

Overview

Capture the Flag is played on some sort of playing field, whether indoor or outdoor. The field is divided into two halves, known as territories, and players form two teams, one for each territory. Each side has a "flag," which is most often a piece of fabric but can be any object small enough to be easily carried by a person (night-time games might use flashlights, glowsticks or lanterns as the flags). The object of the game is for players to make their way into the opposing team's territory, grab the flag and return with it to their own territory without being tagged. The flag is defended mainly by tagging opposing players who attempt to take it. Within their own territory players are "safe," meaning that they cannot be tagged by opposing players; once they cross into the opposing team's territory they are vulnerable. Eye black is suggested for team members to create team unity, and it is also suggested that teams wear dark colors at night to make it more difficult for opponents to see them.

Location

The flags are generally placed in a visibly obvious location at the rear of a team's territory (but in some variations the flag is hidden). In another, more difficult version, the flag is hidden in a place where it can only be seen from one angle, and the players have to see the flag, then knock it out and bring it to their base. It also might have some challenge involved; for example, the flag could be hidden in the leaves up in a tall tree.

Jail

Different versions of Capture the Flag have different rules, both for handling the flag and for what happens to tagged players. A player who is tagged may be eliminated from the game entirely, be forced to join the opposing team, or be placed temporarily in "jail," a predesignated area of the group's territory which exists for holding tagged players. The jail is usually located a good distance from the flag in order to minimize the possibility of simultaneous flag grabs and jailbreaks. While tagged players may be confined to jail for a limited, predetermined time, the most common form of the game involves the option for a "jailbreak": players who are tagged remain in jail indefinitely, but players from their own team may free them by means of a jailbreak, accomplished by a player running from their own territory into the enemy's jail. Such action may, depending on the agreed rules, free all jailed players or simply those who are physically touched by the one performing the jailbreak. In general, freed players are obligated to return directly to their own territory before attempting offensive action (i.e., attempting to grab the flag); freed players usually, although not always, acquire "free walk-backs," in which they are safe from tagging until they reach their home territory. The player performing the jailbreak, on the other hand, is neither safe nor restricted from performing other actions, such as attempting to grab the flag or generally moving about enemy territory.

Sometimes, players in jail form chains, so that if a teammate tags one person in the chain, everyone is freed. Simply leaving jail without being freed is considered poor sportsmanship and is severely frowned upon, oftentimes leading to expulsion from the game.

Capturing the Flag
The rules for the handling of the flag also vary from game to game and deal mostly with the disposition of the flag after a failed attempt at capturing it. In one variant, after a player is tagged while carrying the flag, it is returned to its original place. In another variant, the flag is left in the location where the player was tagged. This latter variant obviously makes offensive play easier, as the flag will tend, over the course of the game, to be moved closer to the dividing line between territories. When the flag is captured by one player, he is not safe from being tagged, even in his home territory. He has the option to return to his own side or hand the flag off to a teammate who will then carry it to the other side. In most versions he may not throw the flag; it is allowed for players to pass the flag, but only to hand it off while running. In some games, however, it is possible for the players to throw the flag to teammates, as long as the flag stays in play without hitting the ground. In some variants, the flag carrier may not attempt to free any of their teammates from jail. The game is won when a player returns to his own territory with the enemy flag. In games with more than one flag, one can only win when all flags are captured, not only one, and the flag holder may not be safe at all until he obtains both flags.

Video games
A common multiplayer gameplay mode (usually with team-based gameplay, as with the real-life game) called "Capture the Flag" is found in many first- and third-person shooters such as Team Fortress 2, the Call of Duty series, the Halo series, Tribes, Marathon, Quake, Unreal Tournament, the TimeSplitters series (renamed "Capture The Bag"), and Metroid Prime Hunters. Each team has a flag, and the players attempt to take the enemy's flag from their base and bring it back to their own flag to score. Note that in first-person shooters, unlike the children's game, the flag carrier is, as a general rule, neither safe nor restricted from performing other actions such as attempting to grab the flag or generally moving about enemy territory; players can be harmed irrespective of whether they are in their own base.

Possibly the first first-person shooter to feature CTF was Rise of the Triad, released in 1994, with the exception that the items to be captured and defended were triad symbols; one of the multiplayer modes was called Capture the Triad. The first real-time strategy game to feature CTF was Command & Conquer in 1995. CTF was popularized when it was first introduced as a modification to Quake by the company Threewave, and it is also a popular mode in the Team Fortress and Team Fortress Classic mods for Quake and Half-Life respectively, as well as the standalone Team Fortress 2. CTF mods are available for multiple first-person shooters, including Urban Terror and Wolfenstein: Enemy Territory, which is a free download using the game engine from the popular Return to Castle Wolfenstein. CTF is even in some sports games, such as the Tony Hawk Pro Skater series and the racing series Midnight Club.[1]

Software and games
In 1992, Richard Carr released an MS-DOS based game called Capture the Flag. It is a turn-based strategy game with real-time network / modem play (or play-by-mail) based around the traditional outdoor game. Because of difficulties implementing the artificial intelligence that the computer player would have needed to bring the enemy flag home and intercept opposing characters carrying the flag, the game required players to merely move one of their characters onto the same square as their opponent's flag, thus ending the game.

Variants
Alterations may include "one flag" CTF, in which there is a defensive team and an offensive team; this conforms to the objectives stated above for CTF games in first-person shooters.

There are also CTF variants for more than two teams (four teams most commonly), and variants with three or more flags; in that case, games may last a bit longer.

Compared to a deathmatch game, CTF scenarios often feature some sort of transportation tool that can be used to travel faster and to reach areas which the player would not normally be able to reach without this extra aid. Such tools might be a grappling hook or a portable teleporter. The usual reason for including such equipment is that it allows players to outmaneuver the flag carrier on his way home, as the flag carrier is often not able to use transportation tools unless the carrier willfully drops the flag. This gives the defenders a slight edge. In Battlefield 1942 CTF, the many vehicles available in the game serve this role, though in ETF the vehicles move slower than the players. Unreal Tournament 2004 introduces a Vehicle CTF mode, differentiated from normal CTF maps by the presence of vehicles and, in Unreal Tournament III, the replacement of the Translocator with a Hoverboard. Only players in ground vehicles can hold and thus capture the flag, whereas using air vehicles or the Translocator (a personal teleporter) will cause the player to drop the flag. The Halo series takes this concept a step further, preventing the flag carrier from using weapons at all, though the flag carrier can still board vehicles as a passenger (Halo 2 and 3) or as the driver (Halo 1 only).

In Gears of War 2, each team's "flag" is now an actual player that is armed and able to move. Opposing players must avoid being killed by the flag player, down them, and take them as a meatshield to forcibly drag back to their home base. In Starcraft, a flag object exists to facilitate the creation of CTF-style maps (as flags can be picked up and carried by worker units). This is demonstrated in the 8-player custom map StarCraft Fortress (included with the game, designed as an homage to Team Fortress). In the MMORPG RuneScape, paying members can play a mini-game called Castle Wars, which includes the basic principles of capture the flag. In another MMORPG, World of Warcraft, the Warsong Gulch and Twin Peaks battlegrounds implement CTF-style gameplay. In the MMORPG Silkroad Online, players can play capture the flag free every 2 hours in the game; it is the same as capture the flag except that players attack the other players on the opposing team with their obtained skills. In some games, to get the flag, players must first kill monsters on their side of the field until they obtain a key that one of the monsters is holding. In Pokémon Platinum, there is a hidden underground CTF area where players are able to construct their own secret bases to house their flags. Players are also able to place traps around (but not in) their bases to stop any potential thieves, because you cannot attack your opponents other than by retaking your flag.[2]

Computer security
In computer security, capture the flag is a computer security wargame. Each team is given a machine (or small network) to defend on an isolated network. Teams are scored on both their success in defending their assigned machine and on their success in attacking other teams' machines. Depending on the nature of the particular CTF game, teams may either be attempting to take an opponent's flag from their machine, or teams may be attempting to plant their own flag on their opponent's machine. The scoring system can vary greatly. CTF contests are usually designed to serve as an educational exercise to give participants experience in securing a machine, as well as conducting and reacting to the sort of attacks found in the real world.

CTF was popularized by the hacker conference DEF CON. Contests are generally executed in a hotel ballroom or meeting room. A typical contest will have an area for each team playing, arranged around a central area reserved for the contest administrators. Projectors will display a scoreboard on the wall, which will be intermittently interrupted by witty or humorous video clips. Music is usually provided by a PA system during the contest. Currently the DEF CON CTF competitions are run by Diutinus Defense Technologies Corp.[3]
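The attack/defense scoring described above can be sketched in a few lines. This is a minimal illustrative model, not the scoring engine of any real contest: the point values, the `Team` and `Scoreboard` names, and the per-round defense bonus are all assumptions made for the example.

```python
# Toy attack/defense CTF scoreboard: a team earns attack points for each
# enemy flag it submits, and defense points for each round in which its
# own service was not compromised. Values and class names are illustrative.
from dataclasses import dataclass


@dataclass
class Team:
    name: str
    attack: int = 0
    defense: int = 0

    @property
    def total(self) -> int:
        return self.attack + self.defense


class Scoreboard:
    def __init__(self, names):
        self.teams = {n: Team(n) for n in names}

    def submit_flag(self, attacker: str, victim: str) -> bool:
        # A team cannot score by submitting its own flag.
        if attacker == victim:
            return False
        self.teams[attacker].attack += 10
        return True

    def end_round(self, compromised: set) -> None:
        # Every team whose machine was NOT compromised this round
        # earns defense points.
        for name, team in self.teams.items():
            if name not in compromised:
                team.defense += 5

    def ranking(self):
        return sorted(self.teams.values(), key=lambda t: -t.total)
```

A round might then look like: each team submits the flags it captured, the organizers record which services were breached, and `ranking()` drives the projected scoreboard.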

Reverse-engineering, network sniffing, protocol analysis, system administration, programming, and cryptanalysis are all skills which have been required by prior CTF contests at DEF CON. CTF games often touch on many other aspects of information security, such as physical security, regulatory compliance, and software licensing. Successful teams generally have extensive industry experience and are capable of addressing these issues, even when raised by surprise in the middle of the contest.

An international, academic CTF was created by the University of California, Santa Barbara in 2004. There have been six iCTF exercises since then, one per year. The 2009 edition occurred on December 4, 2009 and involved 56 teams (up from 39 in the previous year) from across the world, making it the largest live security competition ever performed.[4]

Urban Gaming
Capture the Flag is among the games that have made a recent comeback among adults as part of the urban gaming trend (which includes games like Pac-Manhattan, Fugitive and Manhunt). Urban Capture the Flag has been played in cities throughout North America. The game is played on city streets, and players use cellphones to communicate. News about the games spreads virally through the use of blogs and mailing lists.

References
[1] "USSSP - Games - Capture the Flag" (http://usscouts.org/games/game_cf.asp). Usscouts.org. Retrieved 2010-03-16.
[2] The Official Pokémon Guide: Pokémon Platinum version, pp. 240-249.
[3] http://www.ddtek.biz/
[4] "UCSB International Capture The Flag" (http://ictf.cs.ucsb.edu/). Ictf.cs.ucsb.edu. 2009-12-04. Retrieved 2010-03-16.

External links
• Richard Carr's website (http://www.carrsoft.com/ctf/capture1.html)
• Capture the flag (http://usscouts.org/games/game_cf.asp) at usscouts.org

Check Point

Check Point Software Technologies Ltd.
Type: Public; NASDAQ-100 component
Traded as: NASDAQ: CHKP [1]
Industry: IT security, computer software, computer hardware
Founded: 1993
Headquarters: Tel Aviv, Israel (before May 2007: Ramat Gan)
Key people: Gil Shwed, Founder, Chairman & CEO; Marius Nacht, Founder
Products: FireWall-1, VPN-1, ZoneAlarm, Check Point Integrity, UTM-1, security appliances, intrusion prevention systems, endpoint security, web application security
Revenue: US$1,097.9 million (2010)
Net income: US$452.8 million (2010)
Employees: 2,200 (2010) [2]
Subsidiaries: ZoneAlarm, SofaWare
Website: www.checkpoint.com [3]

Check Point Software Technologies Ltd. (NASDAQ: CHKP [1]) is a global provider of IT security solutions, including network security, endpoint security, data security and security management. Best known for its firewall and VPN products, Check Point first pioneered the industry with FireWall-1 and its patented stateful inspection technology. Today the company develops, markets and supports a wide range of software and combined hardware and software products that cover all the aspects of IT security.

Founded in 1993 in Ramat Gan, Israel, Check Point today counts approximately 2,200 employees worldwide. The company's development centers are located in Israel, California (ZoneAlarm), Sweden (the former Protect Data development centre) and Belarus. The company also has offices in the United States, in Redwood City, California and in the Dallas, Texas area, as well as in Canada, in the Ottawa, Ontario area.

History
Check Point was established in 1993 by the company's current Chairman & CEO Gil Shwed, at the age of 25, together with two of his friends, Marius Nacht (currently serving as Vice Chairman) and Shlomo Kramer (who left Check Point in 2003 to set up a new company, Imperva, where he serves as President and CEO). Gil had the initial idea for the company's core technology known as stateful inspection, which became the foundation for the company's first product (simply called FireWall-1); soon afterwards they also developed one of the world's first VPN products (VPN-1).

Shwed developed the idea while serving in the Israel Defense Forces, where he worked on securing classified networks.[4] Initial funding of $600,000 was provided by BRM Group, a venture capital fund established by brothers Eli and Nir Barkat (Nir was elected mayor of Jerusalem on November 11, 2008).[5] [6]

The company's first commercial breakthrough came in 1994, when Check Point signed an OEM agreement with Sun Microsystems,[7] followed by a distribution agreement with HP in 1995.[8] The same year, the U.S. head office was established in Redwood City, California.[4] By February 1996, the company was named worldwide firewall market leader by IDC, with a market share of 40 percent.[9] In June 1996, Check Point raised $67 million from its initial public offering on NASDAQ.[10] In 1998, Check Point established a successful partnership with Nokia, which bundled Check Point's software with Nokia's computer network security appliances;[11] by 2000 the company had become the world's leading provider of VPN solutions in terms of market share.

During the 2000s, Check Point started acquiring other IT security companies, culminating in the acquisition of Nokia's network security business unit in 2009, just over 10 years after first establishing the partnership with Nokia.[12] In 2005, Check Point tried to acquire intrusion prevention system developer Sourcefire for $225 million,[19] but later withdrew its offer after it became clear US authorities would try to block the acquisition.[20]

Acquisitions
• SofaWare Technologies, in January 2002 (partial acquisition).[13]
• Zone Labs, makers of the ZoneAlarm personal firewall software, for $205 million in cash and shares, in 2003.[14]
• Protect Data, the holding company for PointSec Mobile Technologies, in a cash deal valued at $586m in late 2006.[15] Prior to their acquisition by Check Point, Protect Data acquired Reflex Software.[16]
• NFR Security, an intrusion prevention system developer, for $20 million in late 2006, following the failed plan to acquire the larger IPS vendor Sourcefire.[17]
• The Nokia Security Appliances division, acquired in April 2009.[12]
• Privately held Liquid Machines, a data security startup company based in Boston, acquired in June 2010.[18]

Products
Check Point's products fall into the following main categories:
• Security Gateways – Check Point's core business, with products such as Power-1 appliances, UTM-1 appliances, VSX-1, Connectra, IP Appliances and Safe@Office, and Software Blades such as Firewall, IPSEC VPN, IPS, Identity Awareness, Application Control, URL Filtering, Web Application Security and Mobile Security. Solutions are based on the Software Blade architecture.
• Security Management – Allows administrators to manage events, set policy and apply protections across the entire security infrastructure from a single interface, with a portfolio of 10 management software blades including event analysis, correlation and device provisioning.
• Endpoint Security – Check Point Integrity, a single security agent that combines firewall, network access control (NAC), program control, antivirus, anti-spyware, full disk encryption, media encryption with port protection, and VPN on the endpoint.

SofaWare Legal Battle
SofaWare Technologies was founded in 1999, as a cooperation between Check Point and SofaWare's founders, Adi Ruppin and Etay Bogner, with the purpose of extending Check Point's success in the enterprise market to the small business, consumer and branch office market. According to SofaWare's co-founder Adi Ruppin, "The Company's vision is to take this enterprise-strength technology and make it as simple to use and as affordable as possible without detracting from its quality."[21] One of the key aspects of this effort has been the creation of a management system designed to enable service providers or value added resellers to lift the burden of security management from the end users, while at the same time delivering additional services such as automatic security and software updates, content filtering, anti-virus and more.

In 2001, SofaWare began selling firewall appliances under the SofaWare S-Box brand.[22] In 2002, the company started selling the Safe@Office / Safe@Home line of security appliances, under the Check Point brand.[23] By the fourth quarter of 2002, sales of SofaWare's Safe@Office firewall/VPN appliances had skyrocketed, and SofaWare held the #1 revenue position in the worldwide firewall/VPN sub-$490 appliance market, with a 38% revenue market share.[23]

Relations between Check Point and the SofaWare founders went sour after the company acquisition in 2002.[13] In 2004, Etay Bogner, co-founder of SofaWare, sought court approval to file a shareholder derivative suit, claiming Check Point was not transferring funds to SofaWare as required for its use of SofaWare's products and technology.[13] In 2006, the Tel Aviv District Court ruled that Bogner could sue Check Point by proxy for $5.1 million in alleged damage to SofaWare.[24] Bogner claimed that Check Point, which owns 60% of SofaWare, had behaved belligerently and withheld monies due for use of SofaWare technology and products.[24] Check Point appealed the ruling, but later lost the appeal.[13] His derivative suit was ultimately approved, and Check Point was ordered to pay SofaWare NIS 13 million for breach of contract.[13]

In 2009, Bogner scored a further legal victory over Check Point when the Israeli Supreme Court ruled that a group of founders of SofaWare, which includes Bogner, has veto power to prevent SofaWare from taking any decision of which it disapproves.[25] The court ruled that the three founders could not individually exercise their veto power, but only as a group and by majority rule.[13]

Certification
Check Point has a long-running history of training and certification on their products, including the following:
• CPCS - Check Point Certified Specialist
• CCSA - Check Point Certified Security Administrator
• CCSE - Check Point Certified Security Expert
• CCSE+ - Check Point Certified Security Expert Plus
• CCMSE - Check Point Certified Managed Security Expert
• CCMA - Check Point Certified Master Architect

References
[1] http://quotes.nasdaq.com/asp/SummaryQuote.asp?symbol=CHKP&selected=CHKP
[2] "Check Point Software Facts @ A Glance" (http://www.checkpoint.com/corporate/facts.html). Retrieved 2008-10-12.
[3] http://www.checkpoint.com
[4] "CIO 20/20 Honorees - Innovator's Profile: Gil Schwed of Check Point Software Technologies Ltd." (http://www.cio.com/article/31405/CIO_20_20_Honorees_Innovator_s_Profile_Gil_Schwed_of_Check_Point_Software_Technologies_Ltd.). CIO Magazine, October 1, 2002.
[5] Savage, Marcia. "Gil Shwed, Chairman & CEO, Check Point Software Tech" (http://www.crn.com/news/channel-programs/18836954/gil-shwed-chairman-ceo-check-point-software-tech.htm). CRN.
[6] Gil Shwed, CIO (http://books.google.com/books?id=MA0AAAAAMBAJ&lpg=PA76&dq=Gil Shwed&pg=PA76#v=onepage&q=Gil Shwed&f=false), Oct 1, 2002.
[7] Wallace, David. "MOVERS & SHAKERS: Eli Barkat: Making Push More Polite -- and Ready for Prime Time" (http://www.businessweek.com/ebiz/9912/em1201.htm). Business Week, December 1, 1999.
[8] Company press release. "CheckPoint Software and HP sign distribution agreement; market-leading FireWall-1 solution now available through HP and its reseller channel" (http://www.encyclopedia.com/doc/1G1-17461605.html). 1995.
[9] Press release. "CheckPoint Software Named Firewall Market Share Leader by IDC; Worldwide Market Share of 40 Percent Represents Significant Lead in Providing Enterprise Network Security Solutions" (http://www.encyclopedia.com/doc/1G1-17461605.html). 1996.
[10] Breznitz, Dan. "Industrial R&D as a national policy: Horizontal technology policies and industry-state co-evolution in the growth of the Israeli software industry" (http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6V77-4PTN8PN-1&_user=10&_rdoc=1&_fmt=&_orig=search&_sort=d&_docanchor=&view=c&_searchStrId=1085105296&_rerunOrigin=google&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=3cf69059540492454dd362317b553b2f). Research Policy 36 (9). Retrieved 2009-11-09.
[11] Press release. "Check Point Software & Nokia Expand Partnership". 1999.
[12] "Check Point Completes Acquisition of Nokia Security Appliance Business" (http://www.checkpoint.com/press/2009/check-point-completes-nokia-acquisition-041309.html). 2009-04-13. Retrieved 2009-04-13.
[13] Roth, Nurit (26-11-09). "Etay Bogner bests Check Point in court once again" (http://www.haaretz.com/print-edition/business/etay-bogner-bests-check-point-in-court-once-again-1.186085). Haaretz.
[14] "Check Point Software Technologies to Acquire Zone Labs" (http://www.checkpoint.com/press/2003/zonelabs121503.html). 2003-12-15. Retrieved 2008-10-12.
[15] "Check Point Announces a Cash Tender Offer to Acquire Protect Data" (http://www.checkpoint.com/press/pointsec/2006/11-02.html). 2006-11-02.
[16] "Protect Data acquires Reflex Software Limited to extend product portfolio" (http://www.checkpoint.com/press/2006/pointsec112006.html). 2006-11-20. Retrieved 2008-10-12.
[17] "Check Point to Acquire NFR Security, Expands Intrusion Prevention Capabilities to Fortify Enterprise Networks" (http://www.checkpoint.com/press/2006/nfrsecurity121906.html). 2006-12-19. Retrieved 2008-10-12.
[18] "Check Point Acquires Data Security Startup Liquid Machines" (http://www.marketwire.com/press-release/Check-Point-Acquires-Data-Security-Startup-Liquid-Machines-NASDAQ-CHKP-1273411.htm). Marketwire. Retrieved 14 June 2010.
[19] "Check Point and Sourcefire to Explore Alternative Business Relationship" (http://www.checkpoint.com/press/2006/sourcefire032306.html). 2006-03-23.
[20] "Check Point calls off Sourcefire buy" (http://www.securityfocus.com/news/11382). SecurityFocus, 2006-03-24. Retrieved 2008-10-13.
[21] Kucan, Berislav. Interview with Adi Ruppin, Founder and Managing Director of SofaWare (http://www.net-security.org/article.php?id=361). Net-security.org.
[22] "Check Point bolsters new firewall appliance". Network World, Dec 17, 2001, p. 18.
[23] "Check Point Software Stakes Claim in Small Business Internet Security Space: Company Duplicates Market-leading Enterprise Success in Sub-$490 Appliance Segment" (http://www.checkpoint.com/press/2003/infonetics031203.html). 21 January 2003.
[24] Arbel, Oded (April 25, 2006). "Sofaware founder cleared to sue Check Point for $5.1 million" (http://www.haaretz.com/business/economy-finance/sofaware-founder-cleared-to-sue-check-point-for-5-1-million-1.3385). Haaretz.
[25] Israeli Supreme Court, Civil Appeal 2850/08, Check Point Software Technologies Ltd v. Etay Bogner.

External links
• Corporate website (http://www.checkpoint.com/)

Check Point Abra
Abra is a USB drive that combines an encrypted USB flash drive with virtualization, VPN and computer security technologies to turn a PC into a secure corporate desktop.[1] By plugging Abra into the USB port of a Microsoft Windows OS-based PC or laptop, users can launch a secure virtual workspace that is segregated from the host PC.[2] This allows users to securely access company files and applications from any remote location, including insecure host environments such as a hotel business center or Internet café.[3]

History
Abra was first introduced by Check Point Software Technologies Ltd.[4] and SanDisk Corporation in March 2010 to address security and compliance issues for companies with remote employees.[5] Workers have been increasingly demanding remote access to company applications and data,[6] creating a potential avenue for corporate data loss or allowing unsecure connections.[7] Personal equipment accessing the network can pose a risk to corporate networks.[8] The companies cite that the Abra system provides a solution for companies that want to let their employees purchase and manage their own PCs and laptops, for contractors and vendors who require access to the company network while working on site, and for those needing high security (including encryption) for computing devices outside the workplace. Since its release, the product has won industry awards including "IT Product of 2010" by Computerworld[9] and was named one of the "25 Hot Products to Watch" at the 2010 RSA Conference by CRN Magazine.[10]

Technology
Abra uses hardware and software encryption to protect user credentials, documents, and other sensitive data, so that data cannot be compromised in transit or in the event the device is lost. The system uses an authentication process that enforces minimum levels of password strength, as well as certificates and security tokens for multifactor authentication for remote connectivity. The device interfaces with software on a corporate server to support company policies and security updates through security gateways.

Architecture
When Abra is inserted into the USB port of any PC, the user is presented with a login screen. Upon successful login, a new explorer.exe instance is started in the Abra Secure Workspace, and all subsequent processes are started as child processes of this new explorer. All applications running on the Abra desktop (including the new explorer) operate in a virtual file system and registry: all file and registry input/output calls for a secure application inside Abra are redirected to the flash drive. The virtual files and registry data are instantly written to the flash drive and immediately encrypted. Abra uses the software installed on the host PC to run applications such as Microsoft Word and Microsoft Excel, but the user's documents remain secure in the Abra environment, a virtual workspace that runs parallel to the host environment. Currently Abra does not work on all Windows workstations, due to incompatibility with enterprise client settings, which prevents seamless use.
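The redirect-and-encrypt-on-write behavior described above can be illustrated with a small sketch. This is not Check Point's implementation: the function names are hypothetical, a plain directory stands in for the encrypted flash volume, and the single-byte XOR below is a deliberately trivial placeholder for the AES-256 hardware encryption Abra actually uses.

```python
# Illustrative sketch of Abra-style I/O redirection: writes that an
# application makes to host paths are diverted into a container directory
# (standing in for the encrypted USB drive) and scrambled on write.
# XOR with a fixed byte is a toy placeholder, NOT real cryptography.
import os

CONTAINER = "abra_container"   # stands in for the encrypted USB volume
KEY = 0x5A                     # toy key for the placeholder cipher


def _redirect(path: str) -> str:
    # Map a host path like C:/Users/me/doc.txt to a file in the container.
    safe = path.replace(":", "").replace("\\", "_").replace("/", "_")
    return os.path.join(CONTAINER, safe)


def secure_write(path: str, data: bytes) -> str:
    os.makedirs(CONTAINER, exist_ok=True)
    target = _redirect(path)
    with open(target, "wb") as f:
        f.write(bytes(b ^ KEY for b in data))   # "encrypt" on write
    return target


def secure_read(path: str) -> bytes:
    with open(_redirect(path), "rb") as f:
        return bytes(b ^ KEY for b in f.read())  # "decrypt" on read
```

The point of the sketch is the control flow, which mirrors the architecture above: the application never sees the container path, plaintext never reaches the backing store, and reads transparently reverse the transformation.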

Specifications[11]

Abra Host Platform Support
Operating systems:
• Windows 7 (32 & 64-bit; Home Premium, Enterprise, Ultimate)
• Windows Vista (32 & 64-bit; Home and Professional; SP2+)
• Windows XP (32-bit; Home and Professional; SP3+)

SmartCenter Management Server
Operating systems: Check Point SecurePlatform, Windows Server 2000/2003, Solaris 8/9/10, Red Hat Linux Enterprise 3.0

SmartCenter GUI
Operating systems: Windows 2000/2003, XP, ME, Vista

Encrypted USB Drive
• SanDisk USB drive; available capacities: 4, 8 GB
• High-speed USB 2.0 interface
• AES 256-bit hardware encryption
• FIPS 140-2 Level 2 certified drives available

Versions
Version R65:
• Security Gateway: R65 HFA60, installed with Abra Hotfix
• SmartCenter server: R65 HFA60, with R65.4 or R66 Connectra plug-in, installed with Abra Hotfix
• SmartDashboard: R65 HFA60 with Abra Hotfix
Version R70.x:
• Security Gateway: R70.20 or R70.40, no additions
• Security Management Server: no additions
• SmartDashboard: R70.1 update for versions with Abra
Version R71:
• Security Gateway: R71.1, no additions
• Security Management Server: no additions
• SmartDashboard: R71.1 update

Awards
• Abra was distinguished with the "Best International Innovation" award at the 2010 Information Security Day (ITBN) conference in Hungary.[12]
• It received Computerworld Czech Republic's annual "IT Product of the Year" award in 2010.[13]

References
[1] "Check Point unveils Abra mobile" (http://www.ciol.com/content/62594/check-point-unveils-abra-mobile.html). ciol.com.
[2] "A virtual Secure workspace" (http://www.info-safe.com/images/abra-white-paper.pdf). info-safe.com.
[3] "Check Point launches ABRA" (http://www.deccanherald.com/Technology/Security/News-Reports/Check-Point-launches-ABRA/134488/0/). Deccan Herald.
[4] "Check Point puts VPN in USB stick" (http://news.techworld.com/security/3214138/check-point-puts-vpn-in-usb-stick/). techworld.com.
[5] "Secure Virtual Workspace" (http://www.sandisk.com/business-solutions/enterprise/sandisk-and-check-point-abra). sandisk.com.
[6] "Check Point and SanDisk Deliver Secure Virtual Workspace" (http://www.checkpoint.com/press/2010/check-point-sandisk-abra-030210.html). Checkpoint.com press release, March 2, 2010.
[7] Telework Trendlines 2009. WorldatWork, February 26, 2009.
[8] "Company white paper: Check Point Abra – A Virtual Secure Workspace" (http://www.checkpoint.com/whitepapers/).
[9] "Check Point Abra Named 'IT Product of 2010' by Computerworld Magazine" (http://www.newswiretoday.com/news/78827/). Newswiretoday.com press release, July 15, 2010.
[10] "25 Hot Products To Watch At RSA" (http://www.crn.com/slide-shows/security/223100893/25-hot-products-to-watch-at-rsa.htm?pgno=7). CRN Magazine, April 7, 2010.
[11] "Abra Specifications" (http://www.swordshield.com/docs/abra_ds.pdf). swordshield.com. Retrieved 17 November 2010.
[12] "Check Point Abra Recognized Best International Innovation At Hungarian IT Security Day" (http://www.checkpoint.com/press/2010/AbraAwardWinITSecurityDayHungary.html). checkpoint.com, 10/11/2010.
[13] "Check Point Abra Recognized Best International Innovation At Hungarian IT Security Day" (http://www.newswiretoday.com/news/73943/). newswiretoday.com (Budapest, Hungary), 11 October 2010.

Check Point VPN-1

VPN-1 is a firewall and VPN product developed by Check Point Software Technologies Ltd. The product, previously known as FireWall-1, is now sold as an integrated firewall and VPN solution. VPN-1 is a stateful firewall which also filters traffic by inspecting the application layer. It was the first commercially available software firewall to use stateful inspection. Later (1997), Check Point registered U.S. Patent # 5,606,668 on their security technology that, among other features, included stateful inspection.[1]

VPN-1 functionality is currently bundled within all of Check Point's perimeter security products. VPN-1 is one of the few firewall products that is still owned by its creators (Check Point Software Technologies); most other commercial firewalls, such as Cisco PIX and Juniper NetScreen, were acquired by their present owners.

Platforms
The VPN-1 software is installed on a separate operating system, which provides the protocol stack, file system, process scheduling and other features needed by the product. This is different from most other commercial firewall products, like Cisco PIX and Juniper firewalls, where the firewall software is part of a proprietary operating system. VPN-1 supports the following operating systems:
• Windows Server 2003 and 2008
• Red Hat Enterprise Linux (RHEL)
• Check Point SecurePlatform (a Check Point Linux distribution based on Red Hat Enterprise Linux, often called SPLAT)
• Nokia IPSO
• Crossbeam XOS and COS
Previous versions of the Check Point firewall supported other operating systems, including Solaris, HP-UX and IBM AIX.

Although traditionally sold as software only, VPN-1 is also sold in appliance form as Check Point's UTM-1 (starting 2006) and Power-1 appliances. These appliances run the SecurePlatform operating system. VPN-1 running on the Nokia platform on IPSO is often called a Nokia Firewall, as if it were a different product, but in fact it runs the same VPN-1 software as other platforms.

On completing the acquisition of the Nokia Security Appliance Business in 2009, Check Point started a project named Gaia, aimed at merging the two different operating systems, SecurePlatform and IPSO, into one. This new, still to be named OS is positioned to finally replace both existing operating systems at some point in the future. The first production release is scheduled for 2010.[2]

Version history
The VPN-1 version naming can be rather confusing, because Check Point has changed the version numbering scheme several times through the product's history. Initially, the product used a traditional decimal version number such as 3.0, 4.0 and 4.1 (although 4.1 was also called Check Point 2000 on the packaging). Then the version changed to NG, meaning Next Generation, and minor revisions became known as Feature Packs. Then the name changed to NG AI, which meant NG with Application Intelligence, and the minor revisions became known as Rxx, e.g. NG AI R54. Most recently, the version name has changed to NGX. See the table in the Version History section below for details.

The product is licensed in several variants. In the decimal releases, the license determined what encryption strength was available for the VPN (DES or "Strong"). Since NG, the license always includes strong cryptographic capabilities and was instead split into VPN-1 Pro or VPN-1 Express. VPN-1 Express was intended for simplified deployment, while VPN-1 Pro provided more configurability.
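The core idea of stateful inspection, admitting packets based on the connections they belong to rather than judging each packet in isolation, can be sketched in a few lines. This is an illustrative toy, not Check Point's patented implementation: the `Packet` model, the class name, and the prefix-based notion of "inside" are all assumptions made for the example.

```python
# Minimal sketch of stateful inspection: the firewall records flows that
# inside hosts initiate, and admits inbound packets only when they are
# replies to a tracked flow. Everything else inbound is dropped.
from dataclasses import dataclass


@dataclass(frozen=True)
class Packet:
    src: str
    sport: int
    dst: str
    dport: int


class StatefulFirewall:
    def __init__(self, inside_prefix: str):
        self.inside = inside_prefix   # e.g. "10.0.0." marks internal hosts
        self.table = set()            # established (src, sport, dst, dport) flows

    def allow(self, pkt: Packet) -> bool:
        flow = (pkt.src, pkt.sport, pkt.dst, pkt.dport)
        reverse = (pkt.dst, pkt.dport, pkt.src, pkt.sport)
        if pkt.src.startswith(self.inside):
            # Outbound: permit and record the flow so replies can return.
            self.table.add(flow)
            return True
        # Inbound: permit only replies to a previously recorded flow.
        return reverse in self.table
```

A stateless packet filter would need a static rule opening port 40000 to let the reply in; the connection table makes that rule unnecessary, which is the advantage the text attributes to the technique.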

VPN-1 Express was essentially the same product as VPN-1 Pro, but with slightly different packaging and file system layout. In NGX R62, the branding was changed to VPN-1 Power (instead of Pro) and VPN-1 UTM (instead of Express).[6] VPN-1 UTM (Unified Threat Management) includes certain content inspection features, such as antivirus and, more recently, web filtering.

The table below shows the version history. The Platforms column shows the operating systems that are supported by the firewall product.

Version 1.0 - April 1994 - SunOS 4.1.3
Version 2.0 - Sep 1995 - SunOS, Solaris, HP-UX
Version 2.1 - Jun 1996 - SunOS, Solaris, HP-UX
Version 3.0 - Oct 1996 - Windows NT, SunOS, Solaris, HP-UX, AIX. Version 3.0 was also sold by Sun Microsystems as Solstice FireWall-1.
Version 3.0b - 1997 - Windows NT 3.5.1 and 4.0, Solaris 2.5.x, HP-UX 10.x, AIX
Version 4.0 - 1998 - Windows NT 4.0, Solaris 2.5.x and 2.6, HP-UX 10.x, AIX
Version 4.1 - 2000 - Windows NT 4.0 and 2000, Solaris 2.6, 7 and 8, Red Hat Linux 6.x (2.2 kernel), HP-UX 10.20 and 11.0, AIX 4.2 and 4.3, IPSO 3.x. Also known as Check Point 2000.
NG - Jun 2001 - Windows NT 4.0 and 2000, Solaris 7 (32-bit) and 8 (32 or 64-bit), Red Hat Linux 6.2 and 7.0 (2.2 kernel), IPSO 3.4 and 3.5. NG stands for Next Generation.
NG FP1 - Nov 2001 - Windows NT 4.0 and 2000, Solaris 7 and 8, Red Hat Linux 6.2 and 7.0, IPSO 3.4 and 3.5
NG FP2 - Apr 2002 - Windows NT 4.0 and 2000, Solaris 7 and 8, Red Hat Linux 7.0 and 7.2 (2.4 kernel), IPSO 3.5 and 3.6, SecurePlatform
NG FP3 - Aug 2002 - Windows NT 4.0 and 2000, Solaris 7, 8 and 9, Red Hat Linux 7.0 and 7.2, IPSO 3.5 and 3.6, HP-UX 10.20 and 11, AIX 4.3.3, SecurePlatform
NG AI (R54) - Jun 2003 - Windows NT 4.0 and 2000, Solaris 8 (32 or 64-bit) and 9 (64-bit), Red Hat Linux 7.0 and 7.2, IPSO 3.6 and 3.7, SecurePlatform. The full name is NG with Application Intelligence.
NG AI R55 - Nov 2003 - Windows NT 4.0, 2000 and 2003, Solaris 8 and 9, Red Hat Linux 7.x, IPSO, HP-UX 11, AIX 5.2, SecurePlatform. Version branches: NG AI R55P (for IPSO 3.7) and NG AI R55W (contains Web Intelligence).
NG AI R57 - April 2005 - SecurePlatform. For the product Check Point Express CI (Content Inspection).
NGX R60 - Aug 2005 - Windows 2000 and 2003, Solaris 8 (32 or 64-bit) and 9 (64-bit), RHEL 3.0, IPSO, SecurePlatform. Version branch: NGX R60A.
NGX R61 - Mar 2006 - Windows 2000 and 2003, Solaris 8 and 9, RHEL 3.0, IPSO, SecurePlatform
NGX R62 - Nov 2006 - Windows 2000 and 2003, Solaris 8, 9 and 10, RHEL 3.0, IPSO, SecurePlatform
NGX R65 - Mar 2007 - Windows 2000 and 2003, Solaris 8, 9 and 10, RHEL 3.0, IPSO, SecurePlatform, XOS. Minor versions include R65.4 (Feb 2009).[7]
R70 - 2009 - SecurePlatform, IPSO 6.x, XOS. Minor versions: R70.1 (2009), R70.20 (2009), R70.30 (March 2010), R70.40.[8][9][10]
R71 - April 2010 - Windows 2003 and 2008, Solaris, RHEL, SecurePlatform, IPSO 6.2, XOS. Minor versions: R71.10 (May 2011) and later updates.[11][12]
R75 - January 2011 - Windows 2003 and 2008, SecurePlatform, IPSO 6.2, XOS. Installation files were publicly available in December 2010.[13]

RHEL 3.10. IPSO 3.20. RHEL 3. It supports bandwidth guaranteeing or limiting per QOS rule or per connection.limiting access of internal to the firewall hosts to the Web resources using explicit URL specification or category rating. Also the priority queuing can be done (LLQ). Solaris 8. And while they are licensed separately.40 [13] Installation files were publicly available in December 2010. Minor versions: R71.4 kernel). XOS R75 Windows 2003 and 2008. later more features were added. SecurePlatform NGX Mar 2007 Windows 2000 and 2003.1.10 (May 2011) 285 R71 April 2010 January 2011 Windows 2003 and 2008. in preparation R71.4 (Feb 2009) SecurePlatform 2. IPSO 6.4 kernel).[7] R65.40 [11] Minor versions: R71. Nevertheless.2.9 and 4. External links • www.2. SecurePlatform. SecurePlatform. • VPN-1 UTM [16] — UTM product version for small and medium business • VPN-1 Power [17] — version for enterprise business • VPN-1 UTM Power [18] — UTM product version for enterprise business www.30 (March 2010). Starting NGX R70 this feature has been rebranded as IPS.scanning of the passing traffic for viruses • Web filtering . they have since began to be bundled in default installations of the VPN-1 as well. R70. R71. Quality of service (Floodgate-1) Checkpoint implementation of the Quality of service (QOS).20 (2009). R71. Solaris 8.2. IPSO 6.1.Check Point VPN-1 NGX R62 NGX R65 R70 Nov 2006 Windows 2000 and 2003.com [14] — Check Point Software Technologies web site • FireWall-1 [15] — information about the product although it is not being sold separately anymore. R70. Security (Dec 2007). XOS [10] [8] [9] Minor versions: R70.checkpoint. be it Differentiated services or Ip precedence. R70. 9 and 10.0 (2. XOS [12] Features While started as pure firewall and vpn only product.1 and 4. SecurePlatform.de [19] — information about VPN-1 Check Point Official Forums [20] CPUG: The Check Point User Group [21] Check Point IPsec IKE Implementation details [22] • • • • . 
9 and 10 (Ultra-SPARC Version branches: NGX R65 with Messaging architecture). RFC based QOS implementation.0 (2. are not supported Content Inspection Starting with NGX R65 this new feature has been introduced providing 2 services: • Antivirus scanning .fw-1. SmartDefense (IPS) This feature adds to the built-in stateful inspection and inherent TCP/IP protocols checks and normalization inspection of most common application protocols. IPSO 6. SecurePlatform. IPSO 4.7 and Feb 2009 Windows 2003 and 2008.

References

[1] Check Point Software Technologies Ltd. (1994-05-06). "Check Point Introduces Revolutionary Internet Firewall Product Providing Full Internet Connectivity with Security. Wins 'BEST OF SHOW' Award at Networld+Interop '94" (http://www.checkpoint.com/press/1994/interop_press.html). Press release. Retrieved 2007-03-14.
[2] http://www.checkpoint.com/products/gaia/index.html
[3] Check Point Software Technologies Ltd. (1997-03-17). "Check point software technologies Ltd. awarded patent for stateful inspection technology" (http://www.checkpoint.com/press/1997/patent2.html). Press release. Retrieved 2007-03-14.
[4] Management's Discussion and Analysis of Financial Condition and Results of Operations (http://www.checkpoint.com/corporate/CheckPoint_Financial.pdf) (PDF). 1998-07-24.
[5] Tolly, Kevin; Curtis, John; Passarge, Elke (June 17, 1996). "Firewall-1 2.0" (http://web.archive.org/web/19970109121559/www.lantimes.com/96jun/606s052b.html). LANTimes.
[6] Check Point Software Technologies Ltd. (2005-04-11). "Check Point Adds Antivirus Protection to Integrated Security Solution for Mid-sized Businesses" (http://www.checkpoint.com/press/2005/express_ci_041105.html). Press release. Retrieved 2007-03-14.
[7] Check Point Software Technologies Ltd. (2007-12-03). "Check Point Introduces Groundbreaking New UTM-1 Total Security Solutions, Now Including Best-In-Class Messaging Security" (http://www.checkpoint.com/press/2007/utm1120307.html). Press release. Retrieved 2008-01-25.
[8] XOS was supported in older releases already but was not mentioned in release notes.
[9] Check Point Software Technologies Ltd. (2009-02-24). "Check Point Introduces Latest Security Gateway and Management Release Based On New Software Blade Architecture" (http://www.checkpoint.com/press/2009/ips-software-blade-022409.html). Press release. Retrieved 2009-03-28.
[10] Solaris and Red Hat Enterprise Linux are supported for the management server only.
[11] Check Point Software Technologies Ltd. (2010-04-13). "Check Point Raises the Bar on Performance of Antivirus and URL Filtering Software Blades" (http://www.checkpoint.com/press/2010/041310-utm-1-enhancements.html). Press release. Retrieved 2010-06-04.
[12] Solaris, Red Hat Enterprise Linux and Windows XP/7 are supported for the management server only.
[13] http://www.checkpoint.com/services/lifecycle/support-periods.html
[14] http://www.checkpoint.com
[15] http://www.checkpoint.com/products/firewall-1/
[16] http://www.checkpoint.com/products/vpn-1_utm/
[17] http://www.checkpoint.com/products/vpn-1_power/
[18] http://www.checkpoint.com/products/vpn-1_utm_power/
[19] http://www.fw-1.de/aerasec/
[20] http://forums.checkpoint.com
[21] http://www.cpug.org/
[22] http://www.nta-monitor.com/wiki/index.php/Check_Point_Firewall-1

Christmas tree packet

In information technology, a Christmas tree packet is a packet with every single option set for whatever protocol is in use. The term derives from a fanciful image of each little option bit in a header being represented by a different-colored light bulb, all turned on, as in, "the packet was lit up like a Christmas tree." It can also be known as a kamikaze packet, a nastygram, or a lamp test segment.

When used as part of scanning a system, the TCP header of a Christmas tree packet has the flags FIN, URG and PSH set. Many operating systems implement their compliance with the Internet Protocol standard (RFC 791) in varying or incomplete ways. By observing how a host responds to an odd packet, such as a Christmas tree packet, assumptions can be made regarding the host's operating system. Versions of Microsoft Windows, BSD/OS, HP-UX, Cisco IOS, MVS, and IRIX display behaviors that differ from the RFC standard when queried with such packets, so Christmas tree packets can be used as a method of divining the underlying nature of a TCP/IP stack by sending the packets and awaiting and analyzing the responses.

A large number of Christmas tree packets can also be used to conduct a DoS attack by exploiting the fact that Christmas tree packets require much more processing by routers and end-hosts than the 'usual' packets do.

Christmas tree packets can be easily detected by intrusion-detection systems or more advanced firewalls. However, some stateless firewalls only check against security policy those packets which have the SYN flag set (that is, packets that initiate connection according to the standards). Since Christmas tree scan packets do not have the SYN flag turned on, they can pass through these simple systems and reach the target host.

From a network security point of view, Christmas tree packets are always suspicious and indicate a high probability of network reconnaissance activities.

External links
• Nmap documentation [1]

References
[1] http://insecure.org/nmap/man/man-port-scanning-techniques.html
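The flag combination described in this article, and the stateless-firewall weakness it exploits, can be sketched with simple bit arithmetic. The flag values below follow RFC 793; the function names are our own, for illustration only:

```python
# TCP flag bit values per RFC 793.
FIN, SYN, RST, PSH, ACK, URG = 0x01, 0x02, 0x04, 0x08, 0x10, 0x20

# The "lit up" Christmas tree combination used by scanners (e.g. nmap -sX).
XMAS_FLAGS = FIN | URG | PSH  # 0x29

def is_christmas_tree(tcp_flags: int) -> bool:
    """Detector sketch: FIN, URG and PSH all set at once, a combination
    that never occurs in a legitimate connection."""
    return tcp_flags & XMAS_FLAGS == XMAS_FLAGS

def passes_syn_only_filter(tcp_flags: int) -> bool:
    """Sketch of the stateless-firewall weakness described above:
    the security policy is only consulted for packets with SYN set,
    so non-SYN packets slip through unchecked."""
    return not (tcp_flags & SYN)

probe = FIN | URG | PSH          # a Christmas tree scan probe
print(hex(probe))                # 0x29
print(is_christmas_tree(probe))  # True
print(passes_syn_only_filter(probe))  # True: it reaches the target host
```

A stateful firewall, by contrast, would drop the probe simply because it matches no established connection, regardless of its flags.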

Cisco ASA

In computer networking, Cisco ASA 5500 Series Adaptive Security Appliances, or simply Cisco ASA 5500 Series, is Cisco's line of network security devices introduced in 2005,[1] that succeeded three existing lines of popular Cisco products:
• Cisco PIX, which provided firewall and network address translation (NAT) functions
• Cisco IPS 4200 Series, which worked as intrusion prevention systems (IPS)
• Cisco VPN 3000 Series Concentrators, which provided virtual private networking (VPN)

The ASA series of devices run PIX code 7.0 and later. Through PIX OS release 7.x, the PIX and the ASA use the same software images. Examples of emulators include PEMU and Dynagen [2], and NetworkSims (Networksims) offers a simulator.

Specifications of past and present models

Like the Cisco PIX firewalls, the ASAs are based on Intel x86 architecture.

Model   | Introduced | CPU                          | CPU speed | Default RAM | Default flash | Min OS version
5505    | 2006       | AMD Geode LX                 | 500 MHz   | 256 MB      | 64 MB         | 7.2
5510    | 2005       | Intel Celeron                | 1.6 GHz   | 256 MB      | 64 MB         | 7.0.1
5520    | 2005       | Intel Pentium 4 Celeron      | 2.0 GHz   | 512 MB      | 64 MB         | 7.0.1
5540    | 2006       | Intel Pentium 4              | 2.0 GHz   | 1 GB        | 64 MB         | 7.0.1
5550    | —          | Intel Pentium 4              | 3.0 GHz   | 4 GB        | 64 MB         | 7.1
5580-20 | 2008       | AMD Opteron (2 CPU, 4 cores) | 2.6 GHz   | 8 GB        | 1 GB          | 8.1
5580-40 | 2008       | AMD Opteron (4 CPU, 8 cores) | 2.6 GHz   | 12 GB       | 1 GB          | 8.1

All models boot from ATA CompactFlash. The 5505 uses the Geode CS5536 chipset with a Marvell 88E6095 switch chipset; the mid-range models use the Intel 875P Canterwood chipset. The 5505 supports 3 virtual interfaces with trunking disabled, or 20 with the Sec Plus license (trunking enabled); the 5510 supports 50, or 100 with the Sec Plus license; the larger models support 150 to 250. Supported expansion modules are the AIP-SSC and CSC-SSM on the 5505; the CSC-SSM, AIP-SSM and 4GE-SSM on the 5510, 5520 and 5540; none on the 5550; and six interface cards on the 5580s. All models support SSL VPN, with 2 sessions included and model-dependent maximums (25 on the 5505 up to 10,000 on the 5580s). All models support failover: stateless Active/Standby (with the Sec Plus license) on the 5505, and Active/Standby or Active/Active on the 5510 (with the Sec Plus license) and above.

Performance specifications

Model         | Cleartext throughput (Mbit/s) | Triple DES/AES throughput (Mbit/s) | Max simultaneous connections       | Max site-to-site and remote access VPN sessions | Max SSL VPN user sessions
5505          | 150                           | 100                                | 10,000 (25,000 with Sec Plus)      | 10 (25 with Sec Plus)                           | 25
5510          | 300                           | 170                                | 50,000 (130,000 with Sec Plus)     | 250                                             | 250
5520          | 450                           | 225                                | 280,000                            | 750                                             | 750
5540          | 650                           | 325                                | 400,000                            | 5,000                                           | 2,500
5550          | 1,200                         | 425                                | 650,000                            | 5,000                                           | 5,000
5580-20       | 5,000                         | 1,000                              | 1,000,000                          | 10,000                                          | 10,000
5580-40       | 10,000                        | 1,000                              | 2,000,000                          | 10,000                                          | 10,000
5585-X SSP10  | 3,000                         | 1,000                              | 1,000,000                          | 10,000                                          | 10,000
5585-X SSP20  | 7,000                         | 2,000                              | 2,000,000                          | 10,000                                          | 10,000
5585-X SSP40  | 12,000                        | 3,000                              | 4,000,000                          | 10,000                                          | 10,000
5585-X SSP60  | 20,000                        | 5,000                              | 10,000,000                         | 10,000                                          | 10,000

All figures are from Cisco's ASA model comparison page.[3]

References
[1] Cisco press release (http://newsroom.cisco.com/dlls/2005/prod_050305.html). Retrieved 2008-05-15. Quote: "Las Vegas (Interop) May 3, 2005 – Cisco Systems, Inc. today announced the availability of the Cisco ASA 5500 Series Adaptive Security Appliances."
[2] http://www.dynagen.org
[3] "Cisco ASA Model Comparison page" (http://www.cisco.com/en/US/products/ps6120/prod_models_comparison.html).

External links
• Cisco ASA 5500 Series Adaptive Security Appliances (http://www.cisco.com/go/asa)
• Cisco TAC Security Podcast - ASA troubleshooting information (http://www.cisco.com/en/US/solutions/ns170/tac/security_tac_podcasts.html)
• ASA Simulator (http://networksims.com/pix.html)
• Cisco ASA 5505 Basic Configuration (http://www.tech21century.com/cisco-asa-5505-basic-configuration-tutorial/)
• Cisco ASA 5510 Basic Configuration (http://www.networkstraining.com/how-to-configure-a-cisco-asa-5510-firewall-basic-configuration-tutorial/)

Cisco Global Exploiter

Cisco Global Exploiter is a hacking tool used to find and exploit vulnerabilities in Cisco network systems. It was created by the Italian group called "BlackAngels". CGE was discovered for the first time in March.

Cisco PIX

Cisco PIX (Private Internet eXchange) is a popular IP firewall and network address translation (NAT) appliance. It was one of the first products in this market segment. In 2005, Cisco introduced the newer Adaptive Security Appliance (ASA), that inherited much of the PIX's features, and in 2008 announced PIX end-of-sale. The PIX technology is still sold in a blade, the FireWall Services Module (FWSM), for the Cisco Catalyst 6500 switch series and the 7600 Router series.

PIX 535 Firewall

History

PIX was originally conceived in early 1994 by John Mayes of Redwood City, California, and designed and coded by Brantley Coile of Athens, Georgia. The PIX name is derived from its creators' aim of creating the functional equivalent of an IP PBX to solve the then-emerging registered IP address shortage. At a time when NAT was just being investigated as a viable approach, they wanted to conceal a block or blocks of IP addresses behind a single or multiple registered IP addresses, much as PBXs do for internal phone extensions. When they began, RFC 1597 and RFC 1631 were being discussed, but the now-familiar RFC 1918 had not yet been submitted.

The design and testing were carried out in 1994 by John Mayes, Brantley Coile and Johnson Wu of Network Translation, Inc., with Brantley Coile being the sole software developer. Beta testing of PIX serial number 000000 was completed, and first customer acceptance was on December 21, 1994 at KLA Instruments in San Jose, California. The PIX quickly became one of the leading enterprise firewall products and was awarded the Data Communications Magazine "Hot Product of the Year" award in January 1995.[1]

After Cisco acquired Network Translation in November 1995, Mayes and Coile hired four long-time associates: Jim Jordan, Tom Bohannon, Richard Howes and Pete Tenereillo (the latter two of whom worked for NTI prior to the acquisition). Together they continued development on Finesse OS and the original version of the Cisco PIX Firewall, now known as the PIX "Classic". During this time, the PIX shared most of its code with another Cisco product, the LocalDirector.
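The private address blocks that the PIX was built to conceal were eventually formalized in RFC 1918 (the successor to RFC 1597 mentioned above). A short sketch using Python's standard ipaddress module shows how membership in those blocks can be checked; the function name is our own:

```python
import ipaddress

# The three private blocks reserved by RFC 1918 -- the kind of
# addresses a NAT device like the PIX hides behind one or more
# registered (public) addresses.
PRIVATE_BLOCKS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_rfc1918(addr: str) -> bool:
    """Return True if addr falls inside any RFC 1918 private block."""
    ip = ipaddress.ip_address(addr)
    return any(ip in block for block in PRIVATE_BLOCKS)

print(is_rfc1918("192.168.1.10"))  # True
print(is_rfc1918("8.8.8.8"))       # False
```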

End-of-Life

On January 28, 2008, Cisco announced the end-of-sale and end-of-life dates for all Cisco PIX Security Appliances, software, accessories, and licenses. The last day for purchasing Cisco PIX Security Appliance platforms and bundles was July 28, 2008. The last day to purchase accessories and licenses was January 27, 2009. Cisco will continue to support Cisco PIX Security Appliance customers through July 27, 2013.[2]

Adaptive Security Appliance (ASA)

In May 2005, Cisco introduced the Adaptive Security Appliance (ASA), which combines functionality from the PIX, VPN 3000 series and IPS product lines. The ASA series of devices run PIX code 7.0 and later. Through PIX OS release 7.x, the PIX and the ASA use the same software images. Beginning with PIX OS version 8.x, the operating system code diverges, with the ASA using a Linux kernel and the PIX continuing to use the traditional Finesse/PIX OS combination.[3] Examples of emulators include PEMU and Dynagen [2], and NetworkSims ProfSIMs offers a simulator.[4] [5] [6] [7]

Description of operation

The PIX runs a custom-written proprietary operating system originally called Finesse (Fast InterNEt Server Executive), but now the software is known simply as PIX OS. It is classified as a network layer firewall with stateful inspection, although technically the PIX would more precisely be called a Layer 4, or Transport Layer, firewall, as its access is not restricted to Network Layer routing, but socket-based connections (a port and an IP address: port communications occur at Layer 4). By default it allows internal connections out (outbound traffic), and only allows inbound traffic that is a response to a valid request or is allowed by an Access Control List (ACL) or a conduit. The PIX can be configured to perform many functions, including network address translation (NAT) and port address translation (PAT), as well as being a virtual private network (VPN) endpoint appliance. The Cisco PIX was also one of the first commercially available security appliances to incorporate IPSec VPN gateway functionality.

The PIX was the first commercially available firewall product to introduce protocol-specific filtering with the introduction of the "fixup" command. The PIX "fixup" capability allows the firewall to apply additional security policies to connections identified as using specific protocols. Two protocols for which specific fixup behaviors were developed are DNS and SMTP. The DNS fixup originally implemented a very simple but effective security policy: it allowed just one DNS response from a DNS server on the Internet (known as the outside interface) for each DNS request from a client on the protected (known as the inside) interface. "Fixup" has been superseded by "Inspect" on later versions of PIX OS.

The PIX can be managed by a command line interface (CLI) or a graphical user interface (GUI). The CLI is accessible from the serial console, telnet and SSH. As the PIX is an acquired product, the CLI was originally not aligned with the Cisco IOS syntax. Starting with version 7.0, the configuration is much more IOS-like, and as the PIX only supports IP traffic (as opposed to IPX, DECnet, etc.), 'ip' is omitted in most configuration commands. The configuration is upwards compatible, but not downwards: when a 5.x or 6.x configuration is loaded on a 7.x platform, it is automatically converted to 7.x formatting, as long as the configuration was using ACLs, versus conduits and "outbounds". This allows for an easy migration from PIX to ASA. PIX OS v7.0 is only supported on models 515, 515(E), 525 and 535. Although the 501 and 506E are relatively recent models, the flash memory size of only 8 MB prevents official upgrading to version 7.0.

GUI administration was introduced with version 4.1, and it has been through several incarnations: PIX Firewall Manager (PFM) for PIX OS versions 4.x and 5.x, which runs locally on a Windows NT client; PIX Device Manager (PDM) for PIX OS version 6.x, which runs over https and requires Java; and Adaptive Security Device Manager (ASDM) for PIX OS version 7 and greater, which can run locally on a client or in reduced-functionality mode over HTTPS.
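The PAT function mentioned above can be sketched in a few lines. This is a toy model, not the PIX implementation; the class and names are hypothetical. Many inside (address, port) sockets share one outside address, distinguished by the translated source port, and inbound traffic is only deliverable if a translation entry already exists:

```python
import itertools

class PortAddressTranslator:
    """Toy sketch of PAT (not the PIX's code): many inside sockets are
    multiplexed onto one outside IP via translated source ports."""

    def __init__(self, outside_ip):
        self.outside_ip = outside_ip
        self.next_port = itertools.count(1024)  # next free translated port
        self.table = {}    # (inside_ip, inside_port) -> outside_port
        self.reverse = {}  # outside_port -> (inside_ip, inside_port)

    def translate_out(self, inside_ip, inside_port):
        """Map an inside socket to (outside_ip, translated_port)."""
        key = (inside_ip, inside_port)
        if key not in self.table:
            port = next(self.next_port)
            self.table[key] = port
            self.reverse[port] = key
        return self.outside_ip, self.table[key]

    def translate_in(self, outside_port):
        """Inbound traffic is deliverable only if a translation exists,
        which is why unsolicited inbound connections are dropped."""
        return self.reverse.get(outside_port)

pat = PortAddressTranslator("203.0.113.1")
print(pat.translate_out("10.0.0.5", 40000))  # ('203.0.113.1', 1024)
print(pat.translate_out("10.0.0.6", 40000))  # ('203.0.113.1', 1025)
print(pat.translate_in(1024))                # ('10.0.0.5', 40000)
print(pat.translate_in(9999))                # None
```

This is the PBX analogy from the History section in miniature: one registered number outside, many extensions inside.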
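The original DNS fixup policy described above (one response admitted per outstanding request) can also be sketched. Again this is an illustration of the policy, not Cisco's code, and the names are invented:

```python
# Sketch of the original PIX DNS fixup policy: track outstanding
# query IDs from inside clients and let exactly one matching response
# back in from the outside interface, then close the hole.

class DnsFixup:
    def __init__(self):
        self.pending = set()  # outstanding (client, query_id) pairs

    def outbound_query(self, client, query_id):
        """An inside client sends a DNS query to the outside."""
        self.pending.add((client, query_id))

    def inbound_response(self, client, query_id):
        """Accept only the first matching response; any further
        response for the same query (duplicate or spoofed) is dropped."""
        key = (client, query_id)
        if key in self.pending:
            self.pending.remove(key)
            return True
        return False

fixup = DnsFixup()
fixup.outbound_query("10.0.0.5", 0x1A2B)
print(fixup.inbound_response("10.0.0.5", 0x1A2B))  # True  (first reply)
print(fixup.inbound_response("10.0.0.5", 0x1A2B))  # False (extra reply dropped)
```

The one-response rule is what made the fixup effective against simple DNS response-flooding and spoofing attempts.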

Description of hardware

PIX 520, and PIX 515 with top cover removed

The original NTI PIX and the PIX Classic had cases that were sourced from OEM provider Appro. Later models had cases from Cisco OEM manufacturers.

The PIX was constructed using Intel-based/Intel-compatible motherboards; the PIX 501 used an AMD 5x86 processor, and all other standalone models used Intel 80486 through Pentium III processors. Both the PIX 510 and 520 share basic components, such as motherboard, chassis, flash cards, NICs, etc., with the Cisco LocalDirector 416/420/430, the Cisco Service Selector Gateway 6510 (SSG-6510), and the Cisco Cache Engine CE2050, though the latter two run VxWorks rather than a Finesse derivative.

The PIX boots off a proprietary ISA flash memory daughtercard in the case of the NTI PIX, PIX Classic, 10000, 510, 520, and 535, and it boots off integrated flash memory in the case of the PIX 501, 506/506e, 515/515e, 525, and WS-SVC-FWM-1-K9. All flash cards and the early encryption acceleration cards, the PIX-PL and PIX-PL2, were sourced from Productivity Enhancement Products (PEP).[8]

Nearly all PIXs used Ethernet NICs with Intel 82557, 82558, or 82559 network chipsets, but some older models are occasionally found with 3COM 3c590 and 3c595 Ethernet cards, Olicom-based Token-Ring cards, and Interphase-based FDDI cards. Some Intel-based Ethernet cards for the PIX are identified at boot with the designation "mcwa". This designation denotes a multicast receive bug in the card's firmware that the designers addressed with a feature they called Multi Cast Work Around.

The PIX technology implemented in the FWSM, for the Catalyst 6500 and the 7600 Router, has a part code of WS-SVC-FWM-1-K9.

As the lower Cisco ASA models use a PCI bus, the PIX 535 was faster for cleartext than its successor ASA until the introduction of the ASA 5580 (the PCI bus is 33 MHz and 32 bits, resulting in maximum throughput of 1.2 Gbit/s without overhead taken into account). The PIX 535 has a PCI-X 66 MHz/64-bit bus for its expansion slots, resulting in a much higher cleartext throughput, as the PCI bus is no longer the bottleneck.

For the PIX 515(E) to run version >7.0, a doubling of the memory size is required (32->64 MB for restricted and 64->128 MB for Unrestricted/Failover licenses). A 515(E) UR/FO can run 7.0 with 64 MB memory installed, but that is not recommended, as larger configuration and session/xlate tables can exceed the available memory.

Specifications of latest and older models

Latest models

1 port 1000baseSX4 No Yes No Yes 506e 515e 6(10)3 10/100baseT20 10/100baseT20 3 1 port FE.4 Minimum PIX OS version Maximum PIX OS version officially supported Max interfaces 6.3(x) FWSM 4.1(1) Latest 6. 4 port FE.0.0. 1 port 1000baseSX No Yes No Yes 535 No No 1 Yes7 Fixed internal interface 10/100baseT 10/100baseT Fixed external interface PCI slots Expansion cards supported 10/100baseT 10/100baseT 0 No 0 No Supports SSL VPN VPN accelerator supported Floppy drive Failover supported Model No No No No 501 No No No No No No9 No Yes FWSM .0(x) 22 2 3(6)3 10/100baseT 10/100baseT 2 1 port FE.1(x) Latest 6. 1 port 1000baseSX No Yes No Yes 525 8(14)3 No No 9 1 port FE.2(x) 8. 4 port FE.1(x) 8.3(x) 8.Cisco PIX 293 Model Introduced Discontinued CPU type 501 2001 2008 AMD SC520 5x86 11 506e 2002 2008 Intel Celeron (Mendocino SL36A)11 300 MHz Intel 440BX Seattle 32 MB 2002 2008 515e 2000 2008 525 2000 2008 535 2003 FWSM Intel Celeron (Mendocino SL3BA)11 433 MHz Intel 440BX Seattle 64 (128) MB 3 Onboard Intel Pentium III (Coppermine)22 600 MHz Intel 440BX Seattle 128 (256) MB 3 Onboard Intel Pentium III (Coppermine) One Intel Pentium III and three IBM 4GS3 PowerNP network processors CPU speed Chipset 133 MHz AMD SC520 1 GHz Broadcom Serverworks RCC 512 (1024) MB 3 1 GHz ? Default RAM 16 MB11 Onboard 1 GB Boot flash device Onboard ISA card & Onboard17 16 MB Onboard Default flash Boot flash chips PIX BIOS flash chips 8 MB11 1 x 28F640 28F640 8 MB11 1 x 28F640 AM29F400B 16 MB11 1 x E28F128J3 AM29F400B 16 MB11 128 MB ATA CompactFlash 1 x EF28F128J3 2 x i28F640J5 AM29F400B/ E28F400B5T15 5.3(x) 5. 4 port FE.0.3(x) 5.4 DA28F320J517 5.4 FWSM 2.

Older models

Model       | Introduced | Discontinued | CPU                           | CPU speed   | Chipset                       | Default RAM | Default flash
NTI PIX     | 1994       | 1995         | Intel 486DX2 / Intel Pentium  | 66 / 90 MHz | Intel 430FX                   | 4 MB        | 512 KB (ISA card)
PIX Classic | 1995       | 1998         | Intel Pentium                 | 100~133 MHz | Intel 430FX/TX                | 8 MB        | 512 KB / 2 MB (ISA card)
PIX 10000   | 1996       | 1998         | Intel Pentium Pro             | 200 MHz     | Intel 440FX Natoma            | 16 MB       | 2 MB (ISA card)
PIX 506     | 1999       | 2002         | Intel Pentium MMX             | 200 MHz     | Intel 430TX                   | 32 MB       | 8 MB (onboard)
PIX 510     | 1997       | 1999         | Intel Pentium                 | 166 MHz     | Intel 430TX                   | 16 MB       | 2 MB (ISA card)
PIX 515     | 1999       | 2002         | Intel Pentium MMX             | 200 MHz     | Intel 430TX                   | 32 (64) MB  | 16 MB (onboard)
PIX 520     | 1999       | 2001         | Intel Pentium II (Deschutes)  | 233~350 MHz | Intel 440LX/BX Balboa/Seattle | 128 MB      | 2 MB / 16 MB (ISA card)

The 506 has no PCI slots; the 510 and 520 have four, supporting one-port FE, Token Ring and FDDI cards (plus 1000baseSX on the 520). PIX OS support per model ranges from 1.x/2.x on the NTI PIX and Classic up to 6.3(x) on the 506 and 515. The 520 supports a VPN accelerator, has a floppy drive, and supports failover; the NTI PIX had a floppy drive and an encryption card option but no failover support.

Performance specifications

Selected figures recoverable from the original table:

Model     | Cleartext throughput (Mbit/s) | Max simultaneous connections
PIX 501   | 60                            | 7,500
PIX 506   | 20                            | 25,000
PIX 506e  | 100                           | 25,000
PIX 510   | 90                            | —
PIX 515   | 147                           | 100,000
PIX 515e  | 190                           | 130,000
PIX 520   | 240                           | 256,000
PIX 525   | 330                           | 280,000
PIX 535   | 1,655                         | 500,000
ASA 5520  | 450                           | 280,000
FWSM      | 5,500                         | 999,900 total / 280,000 per second

Triple DES throughput figures take the form software / with VAC (with VAC+ in parentheses): 20 / 63 (135) Mbit/s on the 515e, 30 / 72 (145) on the 525, 50 / 100 (425) on the 535, and 225 on the ASA 5520. Maximum simultaneous hosts (users) are 10 / 50 / Unlimited on the PIX 501 depending on license, 128 / Unlimited on the 506, and Unlimited on the larger models. Maximum simultaneous VPN peers are 10 on the 501, 25 on the 506 and 506e, and 0 / 2,000 depending on license on the mid-range models.

---Information on models supported as of 6/27/2005 verified from Cisco's PIX Brochure [9] (page 2) and the specific product pages [10]
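As a back-of-the-envelope check on the bus limits discussed in the hardware section, the peak (theoretical) throughput of a parallel bus is simply clock rate times bus width; real figures are lower, and published numbers vary with clock rounding and protocol overhead. A tiny illustrative calculation:

```python
# Theoretical peak throughput of a parallel bus: clock rate x width.
# Real-world throughput is lower once bus protocol overhead is counted.

def bus_gbit_per_s(clock_mhz, width_bits):
    """Peak bus throughput in Gbit/s for a given clock and width."""
    return clock_mhz * 1e6 * width_bits / 1e9

print(round(bus_gbit_per_s(33, 32), 2))  # 1.06 -- classic 33 MHz/32-bit PCI
print(round(bus_gbit_per_s(66, 64), 2))  # 4.22 -- PCI-X 66 MHz/64-bit (PIX 535)
```

This is why the PIX 535's PCI-X expansion bus removed the bottleneck that capped cleartext throughput on PCI-based models.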

List of part numbers for PCI, ISA, and EISA expansion cards

Since these are off-the-shelf PC components predating the creation of the PIX, there may not be PIX-specific part numbers for these at all.

PIX 512KB flash memory card; PIX-PL2 encryption card

• Flash cards
• ??? - 512 kB ISA flash card used in the original NTI PIX, PIX Classic and 10000. Mentioned in the version 4.1 install guide and supported through at least PIX OS 5. Aside from progressive manufacturing refinements, the 512KB and 2MB flash cards were identical aside from the chips that populated them: both booted from a 28F256 chip, but the 512KB card only populated two of the flash sockets, with 28F020 chips, while the 2MB card populated all four sockets with 29C040 chips.
• ??? - 2 MB ISA flash card used in the PIX Classic, 10000, 510, and 520.
• PIX-FLASH-16MB - 16 MB ISA flash card for the PIX 520 and 535.
• Ethernet cards
• ??? - 3COM 3c590 and 3c595 PCI NICs occasionally found in the NTI PIX, PIX Classic and 10000.
• PIX-1FE - 32 bit/33 MHz PCI single-port 10/100 Fast Ethernet card. Based on the Intel Pro/100+ family with the 82557, 82558 and 82559 chipsets.
• PIX-4FE - 32 bit/33 MHz PCI four-port 10/100 Fast Ethernet card. Based on the Intel 82558b chipset. Uses an Intel 21154AC or DEC 21154AB bridge chip. It is manufactured by Productivity Enhancement Products.
• PIX-4FE-66 - 64 bit/66 MHz PCI four-port 10/100 Fast Ethernet card. Based on the Intel 82559 chipset. Uses a DEC 21154BE bridge chip. It is manufactured by Productivity Enhancement Products. In the release notes for PIX OS 6.x, Cisco advises against installing this card in the 525 and 535 [14], referencing caveat CSCdu00850, although this caveat actually only lists the PIX 535.
• PIX-1GE - 32 bit/33 MHz PCI 1000baseSX card for the PIX 52x and 53x, as well as the SSG-6510 and many LocalDirectors. Based on the Intel PWLA8490 Pro/1000 fiber network card with the 82542 (Intel code name "Wiseman") chipset [11]. There is no 1000baseT variant of this card. Part number 74-3176-01.
• PIX-1GE-66 - 64 bit/66 MHz PCI 1000baseSX card for the PIX 53x, which is the only model with a 66 MHz PCI bus. Based on the Intel Pro/1000-F fiber network card using the Intel TL82543GC (Intel code name "Livengood") ASIC (PWLA8490sx [11]). The 1000baseT variant of this card, the Intel Pro/1000-T Server adapter (PWLA8490t [11]), is not supported by PIX OS, due to Carrier Extension [12] interoperability problems with early 1000baseT switch products [13].
• VPN/Encryption acceleration cards
• PIX-VAC-PLUS - 64 bit/66 MHz PCI IPSec Hardware VPN Accelerator Card, identified by PIX OS as a PIX-VAC+. Accelerates DES, Triple DES, and AES. Uses the Broadcom BCM5823KPB-5 chip. Supported by the 515, 515e, 525, and 535 running PIX OS 6.3(1) or higher [16]. It is manufactured by Productivity Enhancement Products.

• PIX-VPN-ACCEL - 32 bit/33 MHz PCI IPSec Hardware VPN Accelerator Card, identified by PIX OS as a PIX-VAC. Accelerates DES and Triple DES. This is a repackaged IRE SafeNet CryptPCI 413-10004 rev 2.3 card, using the Analog Devices ADSP-2141L chip. Its part number is 74-1908-01. It is manufactured by Productivity Enhancement Products.
• FDDI and Token Ring cards
• PIX-FDDI - 32 bit/33 MHz 100 Mbit/s SC duplex PCI FDDI card based on the Interphase 5511 FDDI card (PB05511-002). It was discontinued and unsupported from PIX OS 6.1 on.
• PIX-1TR - 32 bit/33 MHz 4/16 Mbit/s PCI Token Ring card based on the Olicom OC-3137/PE-67597 (discontinued and unsupported from PIX OS 6.1 on).
• PIX-PL - 32 bit/8 MHz EISA encryption card found in some early PIXs.
• PIX-PL2 - 32 bit/33 MHz PCI proprietary DES encryption card (discontinued and unsupported from PIX OS 6.1 on).

Footnotes

• Only the first few NTI PIXs came with the 486 processor; the rest came with a Pentium processor.
• PIX Classic firewalls with a serial number of 06002015 or lower came with a 512KB flash card [18]; newer models came with a 2MB flash card.
• Older 520s made before February 2000 and with a serial number less than 18025677 shipped with a 2 MB flash card; newer 520s shipped with a 16 MB flash card [17].
• The maximum OS version one can run with a 512KB card is 4.x, and with a 2MB card, 5.x.
• Restricted package / Unrestricted package limits (referred to by Cisco as R and UR/FO/FO-AA, respectively).
• VAC acceleration vs VAC+ acceleration (shown in parentheses; implies the Unrestricted package).
• The "inside" port is connected to an internal, unmanaged, auto-polarity 4-port switch.
• The WS-SVC-FWM-1-K9 blade has no fixed ports or internal expansion; it makes use of either VLAN interfaces (being used by physical interfaces on a remote switch) or the physical interfaces on the switch/router it is installed in. The blade only supports IPSec VPN for management; it doesn't have the ability to terminate a VPN connection for remote users.
• In early 2005, Cisco indicated that PIX OS 7.x would only support the 515, 525 and 535, while a "stripped-down" version would eventually be released for the 501 and 506e. While not officially supported, 7.0 can be installed on a 506E using monitor mode, and it is actually possible to update the 506E to 7.x code by removing all GUI management software; the 8 MB flash size only allows for installation of the PIX OS software, not the ASDM software (GUI).
• The PIX 520 received updated PII processors as they became available, starting with the PII 233 and ending with the PII 350. The PIX 520 rev A firewalls may use the Intel AL440LX [20] motherboard instead of the SE440BX-2, which is found in the 520 rev B. The AL440LX may be replaced by an SE440BX-2 motherboard; a BIOS upgrade to the latest level of the SE440BX-2 is required.
• The Intel-manufactured SE440BX-2 [19] ATX motherboard in the 520 can support any Slot 1 processor from the Celeron Covington, Celeron Mendocino, Pentium II Klamath, Pentium II Deschutes, and Pentium III Katmai families, as long as the CPU uses a 2.0V core voltage and can run on a 66 or 100 MHz FSB. A slotket can also be used to install the newer 500 MHz - 1 GHz Socket 370 Pentium III Coppermine CPUs, as long as the slotket provides a voltage regulator and a manual bus speed selector. One may also use 133 MHz FSB CPUs, but they will run at lower speeds; for example, a 933 MHz CPU for a 133 MHz FSB will only run at 700 MHz. Using the PowerLeap PL-iP3 converter, Tualatin processors can be used, and speeds of 1.6 GHz are possible (using the bus-speed settings on the PowerLeap). RAM configurations above 384MB are not supported by Cisco; however, up to 3x 256MB work, for a maximum of 768MB.
• Since both the 510 and 520 have standard ATX motherboards, one can upgrade the CPU in the PIX 520 to a 1 GHz Pentium III, putting it on a level with the 525 and 535. However, the 60 or 66 MHz bus (no 100 MHz bus) and 72-pin SIMM memory limitations of the workstation-style 440FX board limit the potential gains in performance to be had from such upgrades; it may arguably be more cost-efficient to upgrade to an SE440BX-2 motherboard with a slocket and Tualatin Celeron CPU. Coupled with the PowerLeap Neo S370 FC-to-PPG adapter, one can use a 533 MHz Mendocino Celeron PPGA processor; it would appear that all 1.65V to 1.75V 100 MHz FSB CPUs would work. OS version 5.2(4) and higher explicitly does not support the Intel 440FX chipset.
• According to Cisco, the 1000baseSX card is not officially supported by the 515/515e, but it will work.
• Cannot be easily upgraded, due to clearance issues with the top cover, unless one is using a PIX 535.
• For the PIX-525: according to a 2000 field notice, PIX 525s with serial numbers 44480380055 through 44480480044 were manufactured with erroneous or omitted EEPROM programming in their 82559 chips that caused the onboard FastEthernet ports to behave erratically when set to full-duplex. Most, if not all, 525s in use today within that range have likely been corrected, but an unused or unopened unit within that range would still need the corrective action to be taken. Viewing the field notice requires registration [22].
• Starting with PIX OS 5.x, the OS overrides the PIX BIOS on the flash card (version 3.x).

Citations

[1] "History of NTI and the PIX Firewall by John Mayes" (archived at web.archive.org, 2007-06-16).
[2] "End of Sale for Cisco PIX Products" (http://www.cisco.com/en/US/prod/collateral/vpndevc/ps5708/ps5709/ps2030/qa_eos_for_sale_for_cisco_pix_products_customer.pdf) (PDF). Retrieved 2008-02-20.
[3] http://www.cisco.com/en/US/docs/security/asa/asa80/license/opensrce.html
[4] PIX 506E overclocked specs
[5] "Documentation on Cisco PDM" (http://www.cisco.com/warp/public/110/41.shtml#nine).
jma.   The PIX 525 is known to come with a variety of processors including 1. [4] "FAQs for Cisco PFM" (http:/ / www. 2008-01-28. Only after this feature debuted with the LocalDirector did it come to be included in the later PIX Classics. . Retrieved 2007-06-19. com/ en/ US/ docs/ security/ pix/ pix63/ pdm30/ installation/ guide/ pdm_ig. [3] "Cisco open source license page" (http:/ / www. due to a "procedural error". although if the motherboard is to be replaced. The Powerleap adapter natively can allow use of a 300 .[25] Proof of successful overclocking of Cisco Pix 506E with mainboard. probably due to differing production runs. Retrieved 2007-08-21. Upgrading the motherboard to a compatible server-style 440FX board with DIMM slots may allow for the use of the 440FX chipset's theoretical limit of 1 GB of RAM.75V CPU. cisco.Cisco PIX   Shows flash chips on the 2 MB flash card versus the chips on the 16 MB flash card.   It is theoretically possible to upgrade the Socket 8 Pentium Pro processor in the PIX Classic and 10000 with either an Intel Pentium II Overdrive (300 or 333 MHz depending on the system bus speed)[23] or a Powerleap PL-Pro/II Celeron adapter [24]. html).   Shows flash chips on the 512KB flash card versus the chips on the 2 MB flash card. cisco. [6] "Documentation on Cisco ASDM" (http:/ / web.   The performance figures cited here are highly changeable. com/ en/ US/ products/ ps6121/ products_user_guide_book09186a00806aea58.6) at boot. cisco. which will considerably increase its throughput in all of the below categories. . com/ en/ US/ products/ ps6121/ . the PCI slot count can be higher or lower than the default if the motherboard is replaced with a different one. .75V 600 MHz (SL5BT). Retrieved 2007-06-19. cisco. cisco. Archived from the original (http:/ / www. socket and circuits modification for 1. It is also worthwhile to note that PIX OS later than 5.   The first PIX Classics did not support failover. 
the "eeprom update" command will reprogram the defective data and restore normal operation permanently.766 MHz FC-PGA Coppermine-128 Celeron processor. this has been substantiated to 1000 MHz with a SL5QV 1. com/ The_History_of_the_PIX_Firewall/ NTI_files/ DataComm_Jan_1995.   While the PIX 535 boots off of the same ISA flash card as some PIX 510s and 520s (the PIX-FLASH-16MB) its newer on-board PIX BIOS (version 4.65V 600 MHz (SL3VH) and 1. This mod was done by someone called i8.

cisco. shtml [23] http:/ / www.cisco.cisco. cisco.cisco. intel. roadrunner. com/en/US/customer/products/hw/switches/ps708/ products_module_configuration_guide_chapter09186a0080394e0a.com/cgi-bin/ message_more. html [19] http:/ / support. cisco. cisco. com/ support/ motherboards/ desktop/ SE440BX2/ [20] http:/ / www.2 release notes" (http:/ / www.com/univercd/cc/td/doc/product/iaabu/pix/ pix_sw/v_63/63qsg/501quick. cisco. jma.com/en/US/products/hw/modules/ps2706/ ps4452/index. com/ ~dealgroup/ pix/ pix_page_history. shtml [14] http:/ / www. com/ en/ US/ products/ hw/ vpndevc/ ps2030/ prod_eol_notice09186a008032d3af.cisco. • Cisco site detailing what PIX features are/aren't supported by the WS-SVC-FWM-1-K9 (http://www.cisco. html#wp11364). intel. .pl?message_no=7781&table_type=pix&template=content) • Cisco's website for the WS-SVC-FWM-1-K9 (http://www. htm [12] http:/ / www.html) • Version 7. com/ en/ US/ partner/ products/ hw/ vpndevc/ ps2030/ products_field_notice09186a00800949c4. com/ en/ US/ products/ hw/ vpndevc/ ps2030/ prod_eol_notice09186a008032d39e. ca/ Products/ PL-Pro-II. com/ en/ US/ products/ hw/ vpndevc/ ps2030/ prod_models_home. com/ support/ motherboards/ desktop/ al440lx/ [21] "Cisco PIX 4. com/ en/ US/ products/ sw/ secursw/ ps2120/ prod_release_note09186a008059f93b. com/ application/ pdf/ en/ us/ guest/ products/ ps2030/ c1031/ ccmigration_09186a008007d065. . html [11] http:/ / www.cisco. cisco. intel.com/en/US/ customer/products/hw/vpndevc/ps2030/products_installation_guide_chapter09186a00803d245f. com/ pressroom/ archive/ releases/ DP081098. html) on 2007-06-16. powerleap.html) • Cisco site detailing which hardware is supported by which PIXOS release (http://www. 299 External links • A basic configuration guide for the PIX (http://www.com/go/asa) • Cisco's website for the PIX series (http://www. HTM [24] http:/ / www. com [8] "Notes on PIX production" (http:/ / www.com/go/pix) • PIX Simulator (http://networksims. shtml [16] http:/ / www. 
com/ en/ US/ products/ sw/ secursw/ ps2120/ prod_release_note09186a008057bf29. [9] http:/ / www. . com/ en/ US/ docs/ security/ pix/ pix42/ release/ notes/ pixrn420. pdf [10] http:/ / www.html) The following links may require a free registration at Cisco's website to view.pdf) • Cisco's website for the ASA 5500 Series (http://www. jpg). com/ en/ US/ products/ hw/ vpndevc/ ps2030/ products_field_notice09186a00800940f4.html) . Retrieved 2007-06-19. [22] http:/ / www. cisco.com/pix. com/ PIX_History/ NTI_1994-1995_files/ Manufacturing_Plan.0 of Cisco's hardware install instructions for various PIX models (http://www. Retrieved 2008-07-10. com/ en/ US/ products/ hw/ switches/ ps700/ products_field_notice09186a0080174a72. html [17] http:/ / www. cisco. com/ support/ network/ sb/ cs-012904. . intel. html [13] http:/ / www. htm [25] "History of NTI and the PIX Firewall by Brantley Coile" (http:/ / home.cisco. htm). cisco.Cisco PIX products_user_guide_book09186a00806aea58. cisco. [7] http:/ / networksims. html [18] http:/ / www. com/ web/ about/ ac123/ ac147/ ac174/ ac199/ about_cisco_ipj_archive_article09186a00800c85a6. html [15] http:/ / www. cisco.

Cisco Security Agent

Developer(s): Okena/Cisco
Stable release: 6.0.2.130 / June 7th, 2010
Operating system: Cross-platform
Type: Security / IPS
License: Per-computer, through Cisco

Cisco Security Agent (CSA) is an endpoint intrusion prevention system made originally by Okena (where it was named StormWatch Agent), which was bought by Cisco Systems in 2003.

CSA was offered as a replacement for the Cisco IDS Host Sensor, which was announced end-of-life on 21 February 2003. This end-of-life action was the result of Cisco's acquisition of Okena, Inc.; the Cisco Security Agent product line based on the Okena technology would replace the Cisco IDS Host Sensor product line from Entercept. As a result, Cisco offered a no-cost, one-for-one product replacement/migration program for all Cisco IDS Host Sensor customers to the new Cisco Security Agent product line. The intent of this program was to support existing IDS Host Sensor customers who chose to migrate, and all Cisco IDS Host Sensor customers were eligible, whether or not they had purchased a Cisco Software Application Support (SAS) service contract for their Cisco IDS Host Sensor products.

CSA uses a two- or three-tier client-server architecture. The Management Center ('MC', or Management Console) contains the program logic; an MS SQL database backend is used to store alerts and configuration information (the MC and SQL database may be co-resident on the same system); and the Agent is installed on the desktops and/or servers to be protected. The Agent communicates with the Management Center, sending logged events to the Management Center and receiving updates in rules when they occur.

The software is rule-based and examines system activity and network traffic, determining which behaviors are normal and which may indicate an attack.

A Network World article dated 17 December 2009 stated "Cisco hinted that it will end-of-life both CSA and MARS" (full article linked below). On June 11, 2010, Cisco announced the end-of-life and end-of-sale of CSA. This time, Cisco did not offer any replacement product, leaving its customers to move to other third-party endpoint security products.

External links
• Alternative to CSA [2] - Cisco Recommended Alternative to CSA
• End-of-Life Announcement [3] - Cisco Press Release
• Cisco Security Agent [1] - Cisco's product page for the Agent software
• Cisco IT Case Study [4] about Cisco Security Agent
• Cisco IDS Host Sensor Migration Program [5]
• EOS and EOL for the Cisco IDS Host Sensor Product Line [6]
• Cisco hinted EOL for CSA [7] - Network World article
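The rule-based evaluation described above can be sketched in miniature: each rule pairs a predicate over an observed system event with a verdict, and the first matching rule wins. The rule contents, event fields, and default-allow policy below are invented for illustration; this is not CSA's actual rule language or engine.

```python
# Illustrative sketch of rule-based host-IPS evaluation. Each rule is a
# (description, predicate, verdict) triple; events are plain dicts with
# hypothetical fields ("process", "action", "path").
RULES = [
    ("shell spawned by web server",
     lambda e: e["process"] == "httpd" and e["action"] == "exec_shell",
     "deny"),
    ("write to system configuration directory",
     lambda e: e["action"] == "write" and e["path"].startswith("/etc"),
     "deny"),
]

def evaluate(event):
    """Return the verdict of the first matching rule, defaulting to allow."""
    for description, predicate, verdict in RULES:
        if predicate(event):
            return verdict, description
    return "allow", None

verdict, reason = evaluate(
    {"process": "httpd", "action": "exec_shell", "path": ""})
# verdict == "deny": a shell spawned by the web server matches the first rule
```

A real agent would evaluate such rules continuously against intercepted system calls and network events, and report matches back to the Management Center.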

References
[1] http://www.cisco.com/en/US/products/sw/secursw/ps5057/index.html
[2] http://us.trendmicro.com/us/solutions/enterprise/security-solutions/virtualization/cisco-csa/why-cisco/index.html
[3] http://www.cisco.com/en/US/prod/collateral/vpndevc/ps5739/ps2330/end_of_life_c51-602579.html
[4] http://www.cisco.com/web/about/ciscoitatwork/case_studies/security_dl1.html
[5] http://www.cisco.com/en/US/products/sw/secursw/ps976/prod_eol_notice09186a008018d90e.html
[6] http://www.cisco.com/en/US/products/sw/secursw/ps976/prod_eol_notice09186a008032d4e6.html
[7] http://www.networkworld.com/community/node/49121?source=NWWNLE_nlt_security_2009-12-18

Cisco Systems VPN Client

(Screenshot: Cisco VPN Client on Windows 7.)

Developer(s): Cisco Systems
Stable release: Windows - 5.0.07.0440 / March 15, 2011 [1]; Mac OS X - 4.9.01.0180 / February 5, 2009
Preview release: 4.9.00.0230 for Mac / July 27, 2010 [2]
Operating system: Windows, Mac OS X 10.4 and 10.5, Solaris UltraSPARC, Linux (Intel) [3]
Size: x86 - .78 MB; x64 - .63 MB
Available in: English
Type: VPN software
License: Proprietary
Website: cisco.com/en/US/products/sw/secursw/ps2308/ [4]

The Cisco Systems VPN Client enables computers to connect to a virtual private network. The client makes remote resources of another network available in a secure way, as if the user were connected directly to that "private" network.

Installation
The client is normally distributed with an executable installer and profile file(s), which contain all the necessary information to easily connect to a network. Cisco VPN Client profile files have a security vulnerability which can potentially put the virtual private network at risk.

Availability
The software is not free, but it is often installed on university and business computers in accordance with a site license. As with most corporate licenses, administrators are allowed to freely distribute the software to users within their network.

Compatibility
VPN Client 4.9.00.0050 for Mac OS X explicitly did not support versions of Mac OS X later than 10.5.[5] Stable version 4.9.01.0180 appears to lack that support as well,[6] while the 4.9.00.0230 Beta added support for Mac OS X 10.6.[7] Version 5.0.07.0290 added support for 64-bit versions of Windows Vista and Windows 7. VPN Client does not run on any Linux 64-bit dual-core systems that have SMP turned on.

Security
The client uses profile files (*.pcf), in which an encrypted password for the VPN network is usually stored.[8] A vulnerability has been identified, and those passwords can easily be decoded using software or online services.[9] To work around these issues, network administrators are advised to use the Mutual Group Authentication feature, or to use unique passwords (that aren't related to other important network passwords).[8]

References
[1] VPN Client release notes (http://www.cisco.com/web/software/282364316/35919/5.0.07.0440-rel-notes.txt)
[2] Cisco VPN Client v4.9.00.0230 Beta for Mac OS X (http://www.cisco.com/cisco/software/release.html?mdfid=281940729&flowid=4465&softwareid=282364316&os=Mac OS)
[3] "VPN Client Homepage" (http://www.cisco.com/en/US/products/sw/secursw/ps2308/)
[4] http://cisco.com/en/US/products/sw/secursw/ps2308/
[5] Release Notes for VPN Client, Release 4.9.00.0050 for Mac OS X (http://www.cisco.com/en/US/docs/security/vpn_client/cisco_vpn_client/vpn_client49/release/notes/49client.html)
[6] Release Notes for VPN Client, Release 4.9.00.0230 Beta for Mac OS (http://www.cisco.com/web/software/282364316/47352/4.9.00.0230-beta-rel-notes.txt)
[7] Release Notes for Cisco VPN Client, Release 5.0.07.0290. Revised: May 21, 2010. OL-11179-04 (http://www.cisco.com/en/US/docs/security/vpn_client/cisco_vpn_client/vpn_client5007/release/notes/vpnclient5007.html#wp84047)
[8] "Cisco Security Notice: Cisco IPsec VPN Implementation Group Password Usage Vulnerability" (http://www.cisco.com/warp/public/707/cisco-sn-20040415-grppass.shtml)
[9] "Cisco Systems VPN Client Group Password Decoder" (http://www.unix-ag.uni-kl.de/~massar/bin/cisco-decode)

Clarified Networks

Type: Privately held company
Founded: 2006
Headquarters: Oulu, Finland
Area served: worldwide
Products: Situation Awareness Tools
Services: Network Analysis Services
Owner(s): Codenomicon
Employees: ~10
Website: [1]
Operating system: Cross-platform
Type: Situation Awareness, Network Mapping [2]

Clarified Networks is a company headquartered in Oulu, Finland.[3] The company was acquired by Codenomicon in 2011, but continues to operate as a separate company under the Codenomicon Group. The company is best known for producing visualizations of security incidents, for example of the patching of DNS cache poisoning attacks[4] and of botnet[5] traffic. Since 2006, Clarified Networks has concentrated in particular on developing the collaborative focus of its products, and currently refers to itself as a provider of Collaborative Network Analysis tools.

Products
Clarified Networks provides a wide set of situation awareness tools,[6] including:

Virtual Situation Room (VSRoom) provides unified, real-time views of the information produced by monitoring systems, offering situation overviews of complex data for decision makers and first-line operation centers. With VSRoom, users can collect, visualize and share monitoring data collected from critical infrastructure.

Network Analyzer is a tool for collaborative analysis and visualization of complex networks. The analyzer supports collaborative troubleshooting, traffic audits and network documentation based on real traffic. Practical applications of Clarified Networks' tools include traffic auditing, network analysis, troubleshooting and malware analysis.

AbuseHelper is an open framework for collecting and sharing intelligence on suspected malicious activity. Clarified Networks is the lead developer of, and a community contributor to, AbuseHelper.

History
The research and development for Clarified Networks' tools began in 2002 and continued for four years in the Oulu University Secure Programming Group (OUSPG) before Clarified Networks spun off from the research group in 2006.[7] The company entered the Venture Cup competition that year and was one of the finalists. In 2007, the founders of Clarified Networks were awarded a 100,000 USD prize in the VMware Ultimate Virtual Appliance Challenge for their VMware appliance called HowNetWorks.[8][9] In 2011, the company was acquired by Codenomicon.

References
[1] https://www.clarifiednetworks.com/
[2] http://www.clarifiednetworks.com/
[3] Financial Tech Spotlight, Codenomicon Acquires Clarified Networks. http://financial.tmcnet.com//topics/mergers-acquisitions/articles/178377-codenomicon-acquires-clarified-networks.htm
[4] O'Reilly Radar, Kaminsky DNS Patch Visualization. http://radar.oreilly.com/2008/08/kaminsky-dns-patch-visualizati.html
[5] Clarified Networks Tia - Botnet analysis. http://www.youtube.com/watch?v=Eth58llJmNw
[6] https://www.clarifiednetworks.com/Products
[7] Venture Cup Finland. http://web.archive.org/web/20071110104327/http://www.venturecup.fi/index.php?id=102
[8] Greg Shields - HowNetWorks: An Interview with its designers at Clarified Networks. http://www.realtime-windowsserver.com/podcast/2007/08/hownetworks_an_interview_with.htm
[9] News in a Finnish newspaper on the 100,000 USD prize. http://www.hightechforum.fi/index.cfm?j=590265

External links
* https://www.clarifiednetworks.com/
* http://www.youtube.com/user/clarifiednetworks

Clear Channel Assessment attack

A Clear Channel Assessment attack, or Queensland attack, is a physical-layer DoS attack against Wi-Fi networks. The attack exploits the need of a wireless network to receive the "Clear Channel Assessment", a function within CSMA/CA used to determine whether the wireless medium is ready and able to receive data, so that the transmitter may start sending. The attack makes it appear that the airwaves are busy, which basically puts the entire system on hold.[1]

Discovery
The attack was discovered by researchers at Queensland University of Technology's Information Security Research Centre; the Queensland attack name thus comes from its original discoverers.[2]

In practice
The signal telling the system the airwaves are busy is sent through the attacker's NIC, by placing it in continuous transmit mode. The attack can be set up through the use of Intersil's Prism Test Utility (PrismTestUtil322.exe). To execute the attack properly, a high-power NIC and external antenna are required. The attack works only on 802.11b and is not effective against the OFDM-based protocols 802.11g and 802.11a. However, some hybrid 802.11b/g access points will hinder the 802.11g network when the 802.11b network is attacked.

References
[1] Bo Chen, Vallipuram Muthukkumarasamy. "Denial of Service Attacks Against 802.11 DCF" (http://www98.griffith.edu.au/dspace/bitstream/10072/12207/1/41331.pdf). Griffith University.
[2] "AusCERT Advisory: Denial of Service Vulnerability in IEEE 802.11 Wireless Devices" (http://www.auscert.org.au/render.html?it=4091). AusCERT.
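The effect of keeping the medium permanently "busy" can be illustrated with a toy discrete-time simulation of the clear channel assessment step in CSMA/CA. The slot model and the 30% background-traffic probability are invented for illustration; real 802.11 timing and backoff behavior are far more involved.

```python
import random

def frames_sent(slots, attacker_jamming):
    """Count frames a well-behaved station sends over `slots` time slots.
    Before each transmission the station performs a clear channel
    assessment (CCA) and defers whenever the medium appears busy."""
    sent = 0
    for _ in range(slots):
        # An attacker NIC held in continuous-transmit mode makes the medium
        # appear busy in every slot; otherwise only background traffic
        # (30% of slots in this toy model) occupies the channel.
        medium_busy = attacker_jamming or random.random() < 0.3
        if not medium_busy:
            sent += 1
    return sent

random.seed(1)
normal = frames_sent(1000, attacker_jamming=False)   # most slots usable
attacked = frames_sent(1000, attacker_jamming=True)  # CCA never reports idle
```

Because the deferral is mandated by the protocol itself, the attacker needs no authentication and very little bandwidth of its own; throughput for every compliant station drops to zero.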

Client Puzzle Protocol

Client Puzzle Protocol (CPP) is a computer algorithm for use in Internet communication whose goal is to make abuse of server resources infeasible. It is an implementation of a proof-of-work system (POW).

The idea of CPP is to require all clients connecting to a server to correctly solve a mathematical puzzle before establishing a connection, if the server is under attack. After solving the puzzle, the client returns the solution to the server, which the server quickly verifies, or else it rejects and drops the connection. The puzzle is made simple and easily solvable, but it requires at least a minimal amount of computation on the client side. Legitimate users would experience just a negligible computational cost, but abuse would be deterred: those clients that try to simultaneously establish large numbers of connections would be unable to do so because of the computational cost (time delay). This method holds promise in fighting some types of spam, as well as other attacks such as denial of service.

(Figure: Possible generation method of client puzzles.)

External links
• RSA press release about client puzzles [1]
• Client Puzzles: A Cryptographic Countermeasure Against Connection Depletion Attacks [2]
• New Client Puzzle Outsourcing Techniques for DoS Resistance [3]

References
Ari Juels and John Brainard. Client Puzzles: A Cryptographic Countermeasure Against Connection Depletion Attacks. In S. Kent, editor, Proceedings of NDSS '99 (Networks and Distributed Security Systems), pages 151-165, 1999.
[1] http://www.rsa.com/rsalabs/node.aspx?id=138
[2] http://www.rsa.com/rsalabs/node.asp?id=2050
[3] http://www.rsa.com/press_release.asp?id=2753
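The asymmetry between solving and verifying can be illustrated with a minimal hash-based puzzle. The use of SHA-256 partial preimages and the particular difficulty value are assumptions made for this sketch; they show the proof-of-work idea in general rather than the exact puzzle construction of the Juels-Brainard paper.

```python
import hashlib
import itertools
import os

DIFFICULTY_BITS = 16   # illustrative hardness: ~2**16 hashes per solution

def issue_puzzle():
    """Server side: generate a fresh random challenge per connection attempt."""
    return os.urandom(16)

def solve_puzzle(challenge):
    """Client side: brute-force a nonce so that SHA-256(challenge || nonce)
    begins with DIFFICULTY_BITS zero bits. This is the deliberate time cost."""
    for nonce in itertools.count():
        digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") >> (256 - DIFFICULTY_BITS) == 0:
            return nonce

def verify_solution(challenge, nonce):
    """Server side: verification costs a single hash, so it stays cheap
    even under a flood of connection attempts."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") >> (256 - DIFFICULTY_BITS) == 0

challenge = issue_puzzle()
nonce = solve_puzzle(challenge)            # costs the client many hashes
assert verify_solution(challenge, nonce)   # costs the server one hash
```

Raising DIFFICULTY_BITS doubles the client's expected work per bit while leaving the server's verification cost unchanged, which is exactly the lever a server under attack would turn.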

Cloudvpn

cloudvpn
Original author(s): Mirek Kratochvíl
Initial release: December 2008
Stable release: 1.99.8 / August 3, 2009
Platform: Cross-platform
Type: VPN
License: GNU GPLv3
Website: [1]

Cloudvpn is an open-source, mesh-networking-capable communication tool which provides an encrypted connection and mesh routing capabilities. It differentiates itself from other VPNs by serving as a generic transport layer for any packet-based data traffic; this can include the traditional VPN purpose, but it is easily extensible to any other kind of traffic.

Capabilities
Cloudvpn is designed as a set of tools. It tries to establish a decentralized transport network, routing through which is done using an optimized DVR-like (distance-vector routing) algorithm. This gives the "cloud" of nodes great potential, allowing simple failover of connections and traffic route optimization. Every node runs a 'cloud' program, which provides the encrypted connection and mesh routing capabilities, and then attaches various other 'mesh clients' that communicate using the pre-created network. For example, the 'ether' tool creates a virtual Ethernet interface and routes its traffic through the mesh, giving the transport Ethernet-VPN capabilities. Other mesh clients are still in development; WASTE-like features are planned.

All transported traffic is encrypted, giving the peers some security against eavesdropping and related attacks. Communication peers, on the other hand, aren't secured from attacks by one another: one node can usually see and read all traffic coming through it, if the traffic isn't secured or encrypted on some higher level. Peers are therefore required to trust each other.

External links
• how-to document [2]

References
[1] http://dev.e-x-a.org/projects/cloudvpn/wiki/
[2] http://e-x-a.org/?view=cloudvpn-howto
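A distance-vector route computation of the general kind mentioned above can be sketched as follows. The three-node topology and link costs are hypothetical, and this plain Bellman-Ford relaxation does not reproduce cloudvpn's actual optimized algorithm; it only shows how each node can learn cheapest routes from its neighbours' distance tables.

```python
# Toy distance-vector routing over a static mesh: every node repeatedly
# relaxes its distance table using its neighbours' tables until no entry
# improves (Bellman-Ford). Topology and costs below are hypothetical.
LINKS = {
    "a": {"b": 1, "c": 4},
    "b": {"a": 1, "c": 1},
    "c": {"a": 4, "b": 1},
}

def distance_vectors(links):
    inf = float("inf")
    nodes = list(links)
    dist = {n: {m: 0 if n == m else inf for m in nodes} for n in nodes}
    changed = True
    while changed:                       # iterate to convergence
        changed = False
        for node in nodes:
            for neighbour, cost in links[node].items():
                for dest in nodes:
                    via_neighbour = cost + dist[neighbour][dest]
                    if via_neighbour < dist[node][dest]:
                        dist[node][dest] = via_neighbour
                        changed = True
    return dist

dv = distance_vectors(LINKS)
# dv["a"]["c"] == 2: traffic from a to c is relayed via b, not sent direct
```

In a live mesh the same relaxation happens incrementally as nodes exchange their vectors, which is what lets traffic fail over automatically when a link drops.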

Codenomicon

Type: Privately held company
Founded: 2001
Headquarters: Oulu, Finland
Area served: worldwide
Products: Robustness Testing Tools, Situation Awareness Tools
Services: Security Testing Services, Network Analysis Services
Owner(s): Private (profitable since 2008)
Employees: 70
Website: http://www.codenomicon.com/
Operating system: Cross-platform
Type: Computer security, Robustness testing, Fuzzing, Network Analysis

Codenomicon is a private company founded in late 2001. It develops robustness testing tools (also called fuzzing tools) for manufacturers, service providers, government/defense and enterprise customers, and provides them under commercial licensing. The company raised venture money in the mid-2000s and has been profitable since 2008, with more than 40% growth in sales each year.[1] In 2011, the company acquired Clarified Networks, a situation awareness company.[2] Codenomicon is based in Oulu, Finland (Europe), and has offices in Saratoga, California (US), Hong Kong (Asia/Pacific) and Singapore (Asia/Pacific).[3]

Codenomicon is also known for having t-shirts that say "GO HACK YOURSELF", which it usually has at its booth during security conferences. The slogan reflects the goal of Codenomicon to enable testers and system administrators to find their own zero-day vulnerabilities, instead of depending on external security consultants and special hacker skills.

Products
The product line of Codenomicon consists of a suite of 200+ independent network protocol testing solutions called DEFENSICS.[4] These tools have roots in the research done at the University of Oulu in the Secure Programming Group (OUSPG).[5] Whereas the PROTOS project had, since 1999, produced free software for testing about 10 protocols,[6] Codenomicon has added support for much wider test coverage, about 200+ protocols, and provides those tools under commercial licensing.[7] Codenomicon has also built nearly 100 customer-proprietary fuzzers for special interfaces, such as device APIs and complex banking systems.

DEFENSICS includes test suites for 200+ industry-standard network protocols such as SMTP, SNMP, SSH and SIP, including wireless interfaces such as Bluetooth and WLAN. Each protocol fuzzer can be licensed separately, or as a suite of protocols related to a specific technology such as IPTV, VoIP, Routing, IPv6, Bluetooth, and several other communication domains. In addition, there are test suites for various Bluetooth profiles and Wireless LAN, and DEFENSICS for XML provides an added capability for testing common XML-based protocols and file formats more efficiently than before.

Robustness testing
Robustness testing is a model-based fuzzing technique and, overall, a black-box testing approach: in essence, fuzzing with some intelligence behind the generated test data. The PROTOS and Codenomicon testing approach, called robustness testing, is based on the idea of proactive protocol testing by injecting unexpected anomalies into protocol message sequences.[8] Fault injection and specification mutations were other names used for the same approach. The technique, an extension of syntax testing, was first described in a University of Oulu white paper on robustness testing published in 2000[9] and in a Licentiate Thesis by Kaksonen published in 2001.[10][11]

DEFENSICS tools systematically explore the input space defined by various communication interfaces or data formats, addressing all fields in the protocols with all effective combinations of anomalies, and generate intelligent test cases that find crash-level flaws and other failures in software. This enables fast test execution, extensive test documentation and better test coverage. Codenomicon's DEFENSICS product line is consequently also known as a "fuzzer that does not fuzz"[12]: it uses smart anomalies instead of random fuzzing. Traditional fuzzing lacks this capability, as random inputs would take too much time to be effective in fast-paced test cycles.

History
Codenomicon and its founders have been developing fuzzing tools since 1996. The first ideas for the engine were based on ideas the founders had while working at OUSPG, where systematic fuzzing was first used to break ASCII/MIME contents in email clients and web services.[13][14] Later, the same technique was applied to ASN.1 structures in such protocols as SNMP, LDAP and X.509.[15][16] After Codenomicon was founded in 2001, its DEFENSICS product line grew to cover over 200 industry-standard network protocols and file formats.[17] The research side spun out into PROTOS Genome.[18] PROTOS tools are still widely used, and the PROTOS test suites disclose that they run on top of the Codenomicon engine.[19]
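The contrast with random fuzzing can be sketched with a toy anomaly-injection generator: starting from a model of a valid message, each field is replaced in turn with known boundary-value anomalies while every other field stays valid, yielding a small, deterministic, documentable set of test cases. The message model and anomaly list below are invented for illustration and are unrelated to the actual DEFENSICS implementation.

```python
# Toy model-based test generation: inject one anomaly into one field at a
# time, keeping all other fields valid, so each failure is attributable.
MESSAGE_MODEL = {          # hypothetical protocol message: field -> valid value
    "version": b"1",
    "command": b"LOGIN",
    "username": b"alice",
}

ANOMALIES = [              # classic crash-provoking values, applied per field
    b"",                   # empty field
    b"A" * 65536,          # oversized string
    b"%s%s%n",             # format-string tokens
    b"\x00\x00\x00\x00",   # embedded NUL bytes
]

def generate_cases(model, anomalies):
    """Yield one serialized message per (field, anomaly) combination."""
    for field in model:
        for anomaly in anomalies:
            mutated = {**model, field: anomaly}
            yield b" ".join(mutated[name] for name in model)

cases = list(generate_cases(MESSAGE_MODEL, ANOMALIES))
# 3 fields x 4 anomalies = 12 repeatable test cases, no randomness involved
```

Because the generator is driven by a model rather than a random byte stream, the same suite can be rerun identically after a fix, which is what makes the "fuzzer that does not fuzz" style practical inside fast test cycles.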

References
[1] Codenomicon Newsletter 2010/12 (http://www.codenomicon.com/news/newsletter/archive/2010-12.html#1)
[2] Acquisition Expands Codenomicon's Offering of Proactive Defense Solutions. News on EON. (http://eon.businesswire.com/news/eon/20110523005695/en)
[3] Codenomicon history (http://www.codenomicon.com/company/history.shtml)
[4] Codenomicon Test Suite Catalogue. http://www.codenomicon.com/products/test-suites.shtml
[5] OUSPG (http://www.ee.oulu.fi/research/ouspg)
[6] PROTOS (http://www.ee.oulu.fi/research/ouspg/protos)
[7] Codenomicon DEFENSICS Test Suites (http://www.codenomicon.com/products/test-suites.shtml)
[8] LWN Security (http://lwn.net/Articles/228366/)
[9] Kaksonen R., Laakso M., Takanen A. "Software Security Assessment through Specification Mutations and Fault Injection". In Proceedings of Communications and Multimedia Security Issues of the New Century / IFIP TC6/TC11 Fifth Joint Working Conference on Communications and Multimedia Security (CMS'01), May 21-22, 2001, Darmstadt, Germany. Edited by Ralf Steinmetz, Jana Dittmann, Martin Steinebach. ISBN 0-7923-7365-0. (http://www.ee.oulu.fi/research/ouspg/PROTOS_WP2000-robustness)
[10] Kaksonen, Rauli. A Functional Method for Assessing Protocol Implementation Security (Licentiate thesis). Published in 2001 by Technical Research Centre of Finland, VTT Publications 447. 128 p. + app. 15 p. ISBN 951-38-5873-1 (soft back ed.), ISBN 951-38-5874-X (on-line ed.). (http://www.vtt.fi/inf/pdf/publications/2001/P448.pdf)
[11] Kaksonen R., Laakso M., Takanen A. Vulnerability Analysis of Software through Syntax Testing. White paper, OUSPG 2001. (http://www.ee.oulu.fi/research/ouspg/protos/analysis/CMS2001-spec-centered/)
[12] The Fuzzer That Does Not Fuzz. http://crashatatime.blogspot.com/2009/08/fuzzer-that-does-not-fuzz.html
[13] Mime bugs in Netscape, Greatest Hits. (http://sunsite.uakom.sk/sunworldonline/swol-08-1998/swol-08-emailbug.phtml#10)
[14] The buzz on the bug - How does the e-mail security bug affect Solaris users? By Stephanie Steenbergen, SunWorld staff. (http://sunsite.uakom.sk/sunworldonline/swol-08-1998/swol-08-emailbug.html)
[15] CERT Advisory CA-2001-18 Multiple Vulnerabilities in Several Implementations of the Lightweight Directory Access Protocol (LDAP). (http://www.cert.org/advisories/CA-2001-18.html)
[16] Edmund Whelan. SNMP and Potential ASN.1 Vulnerabilities. SANS Institute InfoSec Reading Room. (https://www.sans.org/reading_room/whitepapers/protocols/snmp_and_potential_asn_1_vulnerabilities_912)
[17] XML Security and Fuzzing. http://www.codenomicon.com/labs/xml/
[18] Bryan Burns, Dave Killion, Nicolas Beauchesne, Jennifer Granick, Steve Manzuik, Paul Guersch. Security Power Tools. Published by O'Reilly. (http://books.google.com/books?q="Created with Codenomicon Mini-Simulation Toolkit")
[19] Viide J., Helin A., Laakso M., Pietikäinen P., Seppänen M., Halunen K., Puuperä R., Röning J. "Experiences with Model Inference Assisted Fuzzing". In proceedings of the 2nd USENIX Workshop on Offensive Technologies (WOOT '08). (http://www.ee.oulu.fi/research/ouspg/protos/sota/woot08-experiences/)

External links
• Official site (http://www.codenomicon.com/)
• Jolt Productivity Award 2008 (http://www.joltawards.com/)
• AlwaysOn 100 Top Private Company Award Winner (http://www.prweb.com/releases/2007/8/prweb545287.htm)
• Dr. Dobb's interview with Ari Takanen: Fuzzing, Model-based Testing, and Security (http://www.drdobbs.com/security/224600546)
• Dr. Dobb's article on Automated Penetration Testing Toolkit Released (based on Codenomicon press release) (http://www.drdobbs.com/security/207000941)
• eSecurity DEFEND THEN DEPLOY (http://www.techguideonline.com/esecurity/datasheet.php?ds=21)
• Codenomicon Introduces DEFENSICS for WLAN (http://www.tmcnet.com/wifirevolution/articles/13638-codenomicon-introduces-defensics-wlan.htm)
• Codenomicon Offers Preemptive Security and Quality Testing (http://www.itcinstitute.com/display.aspx?id=4632)
• CODENOMICON DEFENDS AGAINST NETWORK DATA STORAGE THREATS (http://www.bapcojournal.com/news/fullstory.php/aid/361/CODENOMICON_DEFENDS_AGAINST_NETWORK_DATA_STORAGE_THREATS.html)

Security advisory links
• Codenomicon Advisories (http://www.codenomicon.com/labs/advisories/)
• CERT-FI Advisory on XML libraries (https://www.cert.fi/en/reports/2009/vulnerability2009085.html)
• CERT-FI Vulnerability Advisory on GnuTLS (http://www.cert.fi/haavoittuvuudet/advisory-gnutls.html)
• CVE-2004-0786 (http://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2004-0786)
• CVE-2004-0081 (http://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2004-0081)

Video links
• Heikki Kortti - Designing Inputs That Make Software Fail (http://video.google.com/videoplay?docid=6509883355867972121)
• Codenomicon - HS Startup competition Video (http://www.viddler.com/explore/antti/videos/9/)

Columbitech

Type: Private
Industry: Mobile Communication
Founded: 2000
Headquarters: Stockholm, Sweden
Products: Columbitech Mobile VPN [1]

Columbitech, founded in 2000, provides wireless security to secure mobile devices. The company is headquartered in Stockholm, Sweden, with offices in New York City.

Columbitech Mobile VPN

The Columbitech mobile VPN provides remote network access to field mobility users, corporate WLAN users and telecommuters – mobilizing the enterprise. The solution supports 3G, 4G and WiMAX,[2] along with WLAN and public networks.[3] The solution is encrypted with standards-based Wireless Transport Layer Security (WTLS) and holds a FIPS 140-2 certification. The technology is utilized in the retail industry to meet PCI DSS requirements,[4] and in other industries where mobile devices are used over wireless networks.

References
[1] http://www.columbitech.com/
[2] "Lost Connections". Wall Street Journal, December 11, 2007 (http://online.wsj.com/article/SB119717610996418467.html)
[3] "Mobile VPN for the Field". Mobile Enterprise, February 2008 (http://www.mobileenterprisemag.com/ME2/dirmod.asp?sid=&nm=&type=Publishing&mod=Publications::Article&mid=8F3A7027421841978F18BE895F87F791&tier=4&id=D7B7F1A7229743C1A55EFC1BD55DF05F)
[4] "LINUX-based VPN system has NEXCOM running a tight ship". Stores Magazine, May 2008 (http://www.stores.org/LPinformation_new/2008/05/LPiEdit2.asp)

Computer security

Computer security is a branch of computer technology known as information security as applied to computers and networks. The objective of computer security includes protection of information and property from theft, corruption, or natural disaster, while allowing the information and property to remain accessible and productive to its intended users. The term computer system security means the collective processes and mechanisms by which sensitive and valuable information and services are protected from publication, tampering or collapse by unauthorized activities or untrustworthy individuals and unplanned events respectively. The strategies and methodologies of computer security often differ from most other computer technologies because of its somewhat elusive objective of preventing unwanted computer behavior instead of enabling wanted computer behavior.

Security by design

The technologies of computer security are based on logic. As security is not necessarily the primary goal of most computer applications, designing a program with security in mind often imposes restrictions on that program's behavior.

There are four approaches to security in computing; sometimes a combination of approaches is valid:
1. Trust all the software to abide by a security policy but the software is not trustworthy (this is computer insecurity).
2. Trust all the software to abide by a security policy and the software is validated as trustworthy (by tedious branch and path analysis, for example).
3. Trust no software but enforce a security policy with mechanisms that are not trustworthy (again, this is computer insecurity).
4. Trust no software but enforce a security policy with trustworthy hardware mechanisms.

Many systems have unintentionally resulted in the first possibility. Approaches one and three lead to failure. Since approach two is expensive and non-deterministic, its use is very limited. Because approach four is often based on hardware mechanisms and avoids abstractions and a multiplicity of degrees of freedom, it is more practical. Combinations of approaches two and four are often used in a layered architecture, with thin layers of two and thick layers of four.

There are various strategies and techniques used to design security systems. However, there are few, if any, effective strategies to enhance security after design. One technique enforces the principle of least privilege to great extent, where an entity has only the privileges that are needed for its function. That way, even if an attacker gains access to one part of the system, fine-grained security ensures that it is just as difficult for them to access the rest. Furthermore, by breaking the system up into smaller components, the complexity of individual components is reduced, opening up the possibility of using techniques such as automated theorem proving to prove the correctness of crucial software subsystems. This enables a closed form solution to security that works well when only a single well-characterized property can be isolated as critical, and that property is also assessable to math. Not surprisingly, it is impractical for generalized correctness, which probably cannot even be defined, much less proven. Where formal correctness proofs are not possible, rigorous use of code review and unit testing represent a best-effort approach to make modules secure.

The design should use "defense in depth", where more than one subsystem needs to be violated to compromise the integrity of the system and the information it holds. Defense in depth works when the breaching of one security measure does not provide a platform to facilitate subverting another. Also, the cascading principle acknowledges that several low hurdles do not make a high hurdle, so cascading several weak mechanisms does not provide the safety of a single stronger mechanism.

Subsystems should default to secure settings, and wherever possible should be designed to "fail secure" rather than "fail insecure" (see fail-safe for the equivalent in safety engineering). Ideally, a secure system should require a deliberate, conscious, knowledgeable and free decision on the part of legitimate authorities in order to make it insecure.

In addition, security should not be an all-or-nothing issue. The designers and operators of systems should assume that security breaches are inevitable. Full audit trails should be kept of system activity, so that when a security breach occurs, the mechanism and extent of the breach can be determined. Storing audit trails remotely, where they can only be appended to, can keep intruders from covering their tracks. Finally, full disclosure helps to ensure that when bugs are found the "window of vulnerability" is kept as short as possible.

Security architecture

Security architecture can be defined as the design artifacts that describe how the security controls (security countermeasures) are positioned, and how they relate to the overall information technology architecture. These controls serve the purpose of maintaining the system's quality attributes, among them confidentiality, integrity, availability, accountability and assurance.

Hardware mechanisms that protect computers and data

Hardware-based or -assisted computer security offers an alternative to software-only computer security. Devices such as dongles may be considered more secure due to the physical access required in order to be compromised.[1]

Secure operating systems

One use of the term computer security refers to technology to implement a secure operating system. Much of this technology is based on science developed in the 1980s and used to produce what may be some of the most impenetrable operating systems ever. Though still valid, the technology is in limited use today, primarily because it imposes some changes to system management and also because it is not widely understood. Such ultra-strong secure operating systems are based on operating system kernel technology that can guarantee that certain security policies are absolutely enforced in an operating environment. An example of such a computer security policy is the Bell-LaPadula model. The strategy is based on a coupling of special microprocessor hardware features, often involving the memory management unit, to a special correctly implemented operating system kernel. This forms the

foundation for a secure operating system which, if certain critical parts are designed and implemented correctly, can ensure the absolute impossibility of penetration by hostile elements. This capability is enabled because the configuration not only imposes a security policy, but in theory completely protects itself from corruption. Ordinary operating systems, on the other hand, lack the features that assure this maximal level of security. The design methodology to produce such secure systems is precise, deterministic and logical.

Systems designed with such methodology represent the state of the art of computer security, although products using such security are not widely known. In sharp contrast to most kinds of software, they meet specifications with verifiable certainty comparable to specifications for size, weight and power. Secure operating systems designed this way are used primarily to protect national security information, military secrets, and the data of international financial institutions. These are very powerful security tools and very few secure operating systems have been certified at the highest level (Orange Book A-1) to operate over the range of "Top Secret" to "unclassified" (including Honeywell SCOMP, USAF SACDIN, NSA Blacker and Boeing MLS LAN). The assurance of security depends not only on the soundness of the design strategy, but also on the assurance of correctness of the implementation, and therefore there are degrees of security strength defined for COMPUSEC. The Common Criteria quantifies security strength of products in terms of two components, security functionality and assurance level (such as EAL levels), and these are specified in a Protection Profile for requirements and a Security Target for product descriptions. None of these ultra-high assurance secure general purpose operating systems have been produced for decades or certified under Common Criteria.

In USA parlance, the term High Assurance usually suggests the system has the right security functions that are implemented robustly enough to protect DoD and DoE classified information. Medium assurance suggests it can protect less valuable information, such as income tax information. Secure operating systems designed to meet medium robustness levels of security functionality and assurance have seen wider use within both government and commercial markets. Medium robust systems may provide the same security functions as high assurance secure operating systems, but do so at a lower assurance level (such as Common Criteria levels EAL4 or EAL5). Lower levels mean we can be less certain that the security functions are implemented flawlessly, and therefore less dependable. These systems are found in use on web servers, guards, database servers, and management hosts, and are used not only to protect the data stored on these systems but also to provide a high level of protection for network connections and routing services.

Secure coding

If the operating environment is not based on a secure operating system capable of maintaining a domain for its own execution, capable of protecting application code from malicious subversion, and capable of protecting the system from subverted code, then high degrees of security are understandably not possible. While such secure operating systems are possible and have been implemented, most commercial systems fall in a 'low security' category because they rely on features not supported by secure operating systems (like portability, and others). In low security operating environments, applications must be relied on to participate in their own protection. There are 'best effort' secure coding practices that can be followed to make an application more resistant to malicious subversion.

In commercial environments, the majority of software subversion vulnerabilities result from a few known kinds of coding defects. Common software defects include buffer overflows, format string vulnerabilities, integer overflow, and code/command injection. Some common languages such as C and C++ are vulnerable to all of these defects (see Seacord, "Secure Coding in C and C++" [2]). Other languages, such as Java, are more resistant to some of these defects, but are still prone to code/command injection and other software defects which facilitate subversion. It is to be immediately noted that all of the foregoing are specific instances of a general class of attacks, where situations in which putative "data" actually contains implicit or explicit executable instructions are cleverly exploited.
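The code/command injection defect named above is easy to demonstrate concretely. The following sketch (Python, with illustrative function names that are not from the text) contrasts splicing untrusted input into a shell command line with passing it as a literal argument in an argument vector, which keeps attacker-supplied text from ever being interpreted as shell syntax:

```python
import subprocess

def greet_unsafe(name: str) -> str:
    # DANGEROUS: `name` is spliced into a shell command line, so input
    # such as "x; rm -rf ~" would be executed as a second command.
    result = subprocess.run("echo Hello " + name, shell=True,
                            capture_output=True, text=True)
    return result.stdout

def greet_safe(name: str) -> str:
    # The argument-vector form passes the text as one literal argument;
    # shell metacharacters inside it are never interpreted.
    result = subprocess.run(["echo", "Hello " + name],
                            capture_output=True, text=True)
    return result.stdout
```

With the payload "x; echo INJECTED", the unsafe variant runs the injected second command, while the safe variant merely echoes the payload back as inert data.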

Recently another bad coding practice has come under scrutiny: dangling pointers. The first known exploit for this particular problem was presented in July 2007.[3] Before this publication the problem was known but considered to be academic and not practically exploitable.

Unfortunately, there is no theoretical model of "secure coding" practices, nor is one practically achievable, insofar as the variety of mechanisms is too wide and the manners in which they can be exploited are too variegated. It is interesting to note that such vulnerabilities often arise from archaic philosophies in which computers were assumed to be narrowly disseminated entities used by a chosen few, all of whom were likely highly educated, solidly trained academics with naught but the goodness of mankind in mind. Thus, it was considered quite harmless if, for (fictitious) example, a FORMAT string in a FORTRAN program could contain the J format specifier to mean "shut down system after printing." After all, who would use such a feature but a well-intentioned system programmer? It was simply beyond conception that software could be deployed in a destructive fashion.

It is worth noting that, in some languages, the distinction between code (ideally, read-only) and data (generally read/write) is blurred. In LISP, particularly, there is no distinction whatsoever between code and data, both taking the same form: an S-expression can be code, or data, or both, and the "user" of a LISP program who manages to insert an executable LAMBDA segment into putative "data" can achieve arbitrarily general and dangerous functionality. Even something as "modern" as Perl offers the eval() function, which enables one to generate Perl code and submit it to the interpreter, disguised as string data.

Capabilities and access control lists

Within computer systems, two security models capable of enforcing privilege separation are access control lists (ACLs) and capability-based security. The semantics of ACLs have been proven to be insecure in many situations, for example in the confused deputy problem. It has also been shown that the promise of ACLs of giving access to an object to only one person can never be guaranteed in practice. Both of these problems are resolved by capabilities. This does not mean practical flaws exist in all ACL-based systems, but only that the designers of certain utilities must take responsibility to ensure that they do not introduce flaws.

Capabilities have been mostly restricted to research operating systems, and commercial OSs still use ACLs. Capabilities can, however, also be implemented at the language level, leading to a style of programming that is essentially a refinement of standard object-oriented design. An open source project in the area is the E language. First the Plessey System 250 and then the Cambridge CAP computer demonstrated the use of capabilities, both in hardware and software, in the 1970s. A reason for the lack of adoption of capabilities may be that ACLs appeared to offer a 'quick fix' for security without pervasive redesign of the operating system and hardware.

The most secure computers are those not connected to the Internet and shielded from any interference. In the real world, the most security comes from operating systems where security is not an add-on.[4]

Cloud computing security

Security in the cloud is challenging, due to the varied degree of security features and management schemes within the cloud entities. In this connection, one logical protocol base needs to evolve so that the entire gamut of components operates synchronously and securely.

Applications

Computer security is critical in almost any technology-driven industry which operates on computer systems. Computer security can also be referred to as computer safety. The issues of computer-based systems and addressing their countless vulnerabilities are an integral part of maintaining an operational industry.
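The danger of treating strings as code, and a common mitigation, can be sketched in a few lines. Python's eval() is used here purely as a stand-in for Perl's: a literal-only parser such as ast.literal_eval accepts genuine data while rejecting code disguised as data.

```python
import ast

untrusted = "[1, 2, 3]"                    # looks like data, is data
malicious = "__import__('os').getpid()"    # code disguised as data

# eval() happily executes either string: the "data" runs code.
assert eval(untrusted) == [1, 2, 3]
assert isinstance(eval(malicious), int)

# ast.literal_eval() parses only literals (lists, numbers, strings...)
# and raises ValueError on anything with executable structure.
assert ast.literal_eval(untrusted) == [1, 2, 3]
try:
    ast.literal_eval(malicious)
    print("accepted")
except ValueError:
    print("rejected")
```

This is the same design move capability-style languages make at a larger scale: the interface offered to untrusted input is narrowed until it can express data but not authority.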

Aviation

The aviation industry is especially important when analyzing computer security because the involved risks include human life, expensive equipment, cargo, and transportation infrastructure. Security can be compromised by hardware and software malpractice, human error, and faulty operating environments. Threats that exploit computer vulnerabilities can stem from sabotage, espionage, industrial competition, terrorist attack, mechanical malfunction, and human error.

The consequences of a successful deliberate or inadvertent misuse of a computer system in the aviation industry range from loss of confidentiality to loss of system integrity, which may lead to more serious concerns such as data theft or loss, and network and air traffic control outages, which in turn can lead to airport closures, loss of aircraft, and loss of passenger life. Military systems that control munitions can pose an even greater risk.

A proper attack does not need to be very high tech or well funded: a power outage at an airport alone can cause repercussions worldwide.[5] One of the easiest, and arguably the most difficult to trace, security vulnerabilities is achievable by transmitting unauthorized communications over specific radio frequencies. These transmissions may spoof air traffic controllers or simply disrupt communications altogether.[6] These incidents are very common, having altered flight courses of commercial aircraft and caused panic and confusion in the past. Controlling aircraft over oceans is especially dangerous because radar surveillance only extends 175 to 225 miles offshore; beyond the radar's sight, controllers must rely on periodic radio communications with a third party.

Lightning, power fluctuations, surges, brown-outs, blown fuses, and various other power outages instantly disable all computer systems, since they are dependent on an electrical source. Other accidental and intentional faults have caused significant disruption of safety-critical systems throughout the last few decades, and dependence on reliable communication and electrical power only jeopardizes computer safety.[7]

Notable system accidents

In 1994, over a hundred intrusions were made by unidentified crackers into the Rome Laboratory, the US Air Force's main command and research facility. Using trojan horse viruses, hackers were able to obtain unrestricted access to Rome's networking systems and remove traces of their activities. The intruders were able to obtain classified files, such as air tasking order systems data, and, by posing as a trusted Rome center user, were furthermore able to penetrate connected networks of the National Aeronautics and Space Administration's Goddard Space Flight Center, Wright-Patterson Air Force Base, some Defense contractors, and other private sector organizations.

Computer security policy

United States

Cybersecurity Act of 2010

On April 1, 2009, Senator Jay Rockefeller (D-WV) introduced the "Cybersecurity Act of 2009 - S. 773" (full text [8]) in the Senate. The bill, co-written with Senators Evan Bayh (D-IN), Barbara Mikulski (D-MD), Bill Nelson (D-FL), and Olympia Snowe (R-ME), was referred to the Committee on Commerce, Science, and Transportation, which approved a revised version of the same bill (the "Cybersecurity Act of 2010") on March 24, 2010.[9] The bill seeks to increase collaboration between the public and the private sector on cybersecurity issues, especially those private entities that own infrastructures that are critical to national security interests (the bill quotes John Brennan, the Assistant to the President for Homeland Security and Counterterrorism: "our nation's security and economic prosperity depend on the security, stability, and integrity of communications and information infrastructure that are largely privately-owned and globally-operated" and talks about the country's response to a "cyber-Katrina").[10] The bill also aims to increase public awareness on cybersecurity issues, and to foster and fund cybersecurity research. Some of the most controversial parts of the bill include Paragraph 315, which grants the President the right to "order the limitation or

shutdown of Internet traffic to and from any compromised Federal Government or United States critical infrastructure information system or network."[10] The Electronic Frontier Foundation, an international non-profit digital rights advocacy and legal organization based in the United States, characterized the bill as promoting a "potentially dangerous approach that favors the dramatic over the sober response".[11]

International Cybercrime Reporting and Cooperation Act

On March 25, 2010, Representative Yvette Clarke (D-NY) introduced the "International Cybercrime Reporting and Cooperation Act - H.R.4962" (full text [12]) in the House of Representatives; the bill, co-sponsored by seven other representatives (among whom only one Republican), was referred to three House committees.[13] The bill seeks to make sure that the administration keeps Congress informed on information infrastructure, cybercrime, and end-user protection worldwide. It also "directs the President to give priority for assistance to improve legal, judicial, and enforcement capabilities with respect to cybercrime to countries with low information and communications technology levels of development or utilization in their critical infrastructure, telecommunications systems, and financial industries"[13] as well as to develop an action plan and an annual compliance assessment for countries of "cyber concern".[13]

Protecting Cyberspace as a National Asset Act of 2010

On June 19, 2010, United States Senator Joe Lieberman (I-CT) introduced a bill called "Protecting Cyberspace as a National Asset Act of 2010 - S.3480" (full text in pdf [14]), which he co-wrote with Senator Susan Collins (R-ME) and Senator Thomas Carper (D-DE). If signed into law, this controversial bill, which the American media dubbed the "Kill switch bill", would grant the President emergency powers over the Internet. However, all three co-authors of the bill issued a statement claiming that instead, the bill "[narrowed] existing broad Presidential authority to take over telecommunications networks".[15]

White House proposes cybersecurity legislation

On May 12, 2011, the White House sent Congress a proposed cybersecurity law designed to force companies to do more to fend off cyberattacks, a threat that has been reinforced by recent reports about vulnerabilities in systems used in power and water utilities.[16]

Terminology

The following terms used in engineering secure systems are explained below.
• Authentication techniques can be used to ensure that communication end-points are who they say they are.
• Automated theorem proving and other verification tools can enable critical algorithms and code used in secure systems to be mathematically proven to meet their specifications.
• Capability and access control list techniques can be used to ensure privilege separation and mandatory access control. This section discusses their use.
• Chain of trust techniques can be used to attempt to ensure that all software loaded has been certified as authentic by the system's designers.
• Cryptographic techniques can be used to defend data in transit between systems, reducing the probability that data exchanged between systems can be intercepted or modified.
• Firewalls can provide some protection from online intrusion.
• A microkernel is a carefully crafted, deliberately small corpus of software that underlies the operating system per se and is used solely to provide very low-level, very precisely defined primitives upon which an operating system can be developed. A simple example with considerable didactic value is the early '90s GEMSOS (Gemini Computers), which provided extremely low-level primitives, such as "segment" management, atop which an operating system could be built. The theory (in the case of "segments") was that—rather than have the operating

system itself worry about mandatory access separation by means of military-style labeling—it is safer if a low-level, independently scrutinized module can be charged solely with the management of individually labeled segments, be they memory "segments" or file system "segments" or executable text "segments." If software below the visibility of the operating system is (as in this case) charged with labeling, there is no theoretically viable means for a clever hacker to subvert the labeling scheme, since the operating system per se does not provide mechanisms for interfering with labeling: the operating system is, essentially, a client (an "application," arguably) atop the microkernel and, as such, subject to its restrictions.

Some of the following items may belong to the computer insecurity article:
• Access authorization restricts access to a computer to a group of users through the use of authentication systems. These systems can protect either the whole computer – such as through an interactive logon screen – or individual services, such as an FTP server. There are many methods for identifying and authenticating users, such as passwords, identification cards, smart cards, and biometric systems.
• Anti-virus software consists of computer programs that attempt to identify, thwart and eliminate computer viruses and other malicious software (malware).
• Applications with known security flaws should not be run. Either leave the application turned off until it can be patched or otherwise fixed, or delete it and replace it with some other application. Publicly known flaws are the main entry used by worms to automatically break into a system and then spread to other systems connected to it. The security website Secunia provides a search tool for unpatched known flaws in popular products.
• Backups are a way of securing information; they are another copy of all the important computer files kept in another location. These files are kept on hard disks, CD-Rs, CD-RWs, and tapes. Suggested locations for backups are a fireproof, waterproof, and heat-proof safe, or a separate, offsite location from that in which the original files are contained. Some individuals and companies also keep their backups in safe deposit boxes inside bank vaults. There is also a fourth option, which involves using one of the file hosting services that backs up files over the Internet for both business and individuals. Backups are also important for reasons other than security. Natural disasters, such as earthquakes, hurricanes, or tornadoes, may strike the building where the computer is located; the building can be on fire, or an explosion may occur. There needs to be a recent backup at an alternate secure location in case of such a disaster. Further, it is recommended that the alternate location be placed where the same disaster would not affect both locations. Examples of alternate disaster recovery sites being compromised by the same disaster that affected the primary site include having had a primary site in World Trade Center I and the recovery site in 7 World Trade Center, both of which were destroyed in the 9/11 attack, and having one's primary site and recovery site in the same coastal region, which leads to both being vulnerable to hurricane damage (for example, a primary site in New Orleans and a recovery site in Jefferson Parish, both of which were hit by Hurricane Katrina in 2005). The backup media should be moved between the geographic sites in a secure manner, in order to prevent them from being stolen.
• Encryption is used to protect the message from the eyes of others. Cryptographic techniques involve transforming information, scrambling it so it becomes unreadable during transmission. The intended recipient can unscramble the message, but eavesdroppers cannot. Cryptographically secure ciphers are designed to make any practical attempt of breaking them infeasible. Symmetric-key ciphers are suitable for bulk encryption using shared keys, and public-key encryption using digital certificates can provide a practical solution for the problem of securely communicating when no key is shared in advance.
• Endpoint security software helps networks to prevent data theft and virus infection through portable storage devices, such as USB drives.
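The access-authorization item above depends on not storing reusable secrets in the clear: a system should keep only salted, slowly hashed password verifiers, so a stolen credential database cannot be trivially reversed. A minimal sketch using Python's standard library (function names are illustrative, not taken from any particular system):

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None) -> tuple:
    # A fresh random salt per user defeats precomputed-table attacks;
    # many PBKDF2 iterations slow down offline brute-force guessing.
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison avoids leaking where a mismatch occurs.
    return hmac.compare_digest(candidate, digest)
```

A login service stores only the (salt, digest) pair; checking a submitted password recomputes the digest under the stored salt and compares the two.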

• Social engineering awareness - Keeping employees aware of the dangers of social engineering, and/or having a policy in place to prevent social engineering, can reduce successful breaches of the network and servers.
• Firewalls are systems which help protect computers and computer networks from attack and subsequent intrusion by restricting the network traffic which can pass through them, based on a set of system administrator defined rules.
• Honey pots are computers that are either intentionally or unintentionally left vulnerable to attack by crackers. They can be used to catch crackers or to fix vulnerabilities.
• Intrusion-detection systems can scan a network for people that are on the network but who should not be there or are doing things that they should not be doing, for example trying a lot of passwords to gain access to the network.
• Encryption using digital certificates can provide a practical solution for the problem of securely communicating when no key is shared in advance.

Pinging
The ping application can be used by potential crackers to find if an IP address is reachable. If a cracker finds a computer, they can try a port scan to detect and attack services on that computer.

Notes
[1] Definitions: IT Security Architecture (http://opensecurityarchitecture.org). SecurityArchitecture.com. Jan. 2006
[2] http://www.cert.org/books/secure-coding
[3] New hacking technique exploits common programming error (http://searchsecurity.techtarget.com/originalContent/0,289142,sid14_gci1265116,00.html). SearchSecurity.com. July 2007
[4] J. C. Willemssen, "FAA Computer Security". GAO/T-AIMD-00-330. Presented at Committee on Science, House of Representatives, 2000.
[5] P. G. Neumann, "Computer Security in Aviation," presented at International Conference on Aviation Safety and Security in the 21st Century, White House Commission on Safety and Security, 1997.
[6] J. Zellan, Aviation Security. Hauppauge, NY: Nova Science, 2003, pp. 65–70.
[7] Information Security (http://www.fas.org/irp/gao/aim96084.htm). United States Department of Defense, 1986
[8] http://www.opencongress.org/bill/111-s773/text
[9] Cybersecurity bill passes first hurdle (http://www.computerworld.com/s/article/9174065/Cybersecurity_bill_passes_first_hurdle). Computer World. March 24, 2010. Retrieved on June 26, 2010.
[10] Cybersecurity Act of 2009 (http://www.opencongress.org/bill/111-s773/text). OpenCongress.org. April 1, 2009. Retrieved on June 26, 2010.
[11] Federal Authority Over the Internet? The Cybersecurity Act of 2009 (http://www.eff.org/deeplinks/2009/04/cybersecurity-act). eff.org. April 10, 2009. Retrieved on June 25, 2010.
[12] http://www.opencongress.org/bill/111-h4962/text
[13] H.R.4962 - International Cybercrime Reporting and Cooperation Act (http://www.opencongress.org/bill/111-h4962/show). OpenCongress.org. Retrieved on June 26, 2010.
[14] http://hsgac.senate.gov/public/index.cfm?FuseAction=Files.View&FileStore_id=4ee63497-ca5b-4a4b-9bba-04b7f4cb0123
[15] Senators Say Cybersecurity Bill Has No 'Kill Switch' (http://www.informationweek.com/news/government/security/showArticle.jhtml?articleID=225701368&subSection=News). InformationWeek.com. June 24, 2010. Retrieved on June 26, 2010.
[16] Declan McCullagh, "White House proposes cybersecurity legislation" (http://news.cnet.com/8301-31921_3-20062277-281.html?part=rss&subj=news&tag=2547-1_3-0-20). CNET. May 12, 2011. Retrieved May 12, 2011.

References
• Ross J. Anderson: Security Engineering: A Guide to Building Dependable Distributed Systems (http://www.cl.cam.ac.uk/~rja14/book.html). ISBN 0-471-38922-6
• Morrie Gasser: Building a Secure Computer System (http://cs.unomaha.edu/~stanw/gasserbook.pdf). ISBN 0-442-23022-2. 1988
• Stephen Haag, Maeve Cummings, Donald McCubbrey, Alain Pinsonneault, Richard Donovan: Management Information Systems for the Information Age. ISBN 0-07-091120-7
• E. Stewart Lee: Essays about Computer Security (http://www.cl.cam.ac.uk/~mgk25/lee-essays.pdf). Cambridge, 1999
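The reachability check that ping provides, and the port scan that typically follows it, amount to repeated connection probes. A minimal sketch in Python (the function name, timeout, and overall shape are illustrative, not taken from any particular scanning tool):

```python
import socket

def tcp_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds.

    A port scanner simply repeats this probe across a range of
    ports to discover which services a reachable host exposes.
    """
    try:
        # create_connection performs the full TCP handshake; success
        # means something is listening on that port.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts and unreachable hosts.
        return False
```

Defenders use the same probe to verify that only intended services are exposed; scanning hosts you do not control may be illegal.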

• Peter G. Neumann: Principled Assuredly Trustworthy Composable Architectures (http://www.csl.sri.com/neumann/chats4.pdf). 2004
• Paul A. Karger, Roger R. Schell: Thirty Years Later: Lessons from the Multics Security Evaluation (http://www.acsac.org/2002/papers/classic-multics.pdf). IBM white paper.
• Bruce Schneier: Secrets & Lies: Digital Security in a Networked World. ISBN 0-471-25311-1
• Robert C. Seacord: Secure Coding in C and C++. Addison Wesley, September 2005. ISBN 0-321-33572-4
• Clifford Stoll: Cuckoo's Egg: Tracking a Spy Through the Maze of Computer Espionage. Pocket Books, 2000. ISBN 0-7434-1146-3
• Network Infrastructure Security (http://www.springer.com/computer/communications/book/978-1-4419-0165-1), Angus Wong and Alan Yeung, Springer, 2009.

External links
• Security advisories links (http://www.dmoz.org/Computers/Security/Advisories_and_Patches/) from the Open Directory Project
• Top 5 Security No Brainers for Businesses (http://www.networkworld.com/community/node/59971) from Network World
• The Repository of Industrial Security Incidents (http://www.securityincidents.org/)

Context-based access control
Context-based access control (CBAC) intelligently filters TCP and UDP packets based on application layer protocol session information and can be used for intranets, extranets and internets. CBAC can be configured to permit specified TCP and UDP traffic through a firewall only when the connection is initiated from within the network needing protection. However, while this example discusses inspecting traffic for sessions that originate from the external network, CBAC can inspect traffic for sessions that originate from either side of the firewall.

Without CBAC, traffic filtering is limited to access list implementations that examine packets at the network layer or, at most, the transport layer. CBAC, however, examines not only network layer and transport layer information but also the application-layer protocol information (such as FTP connection information) to learn about the state of the TCP or UDP session. This allows support of protocols that involve multiple channels created as a result of negotiations in the FTP control channel. Most of the multimedia protocols, as well as some other protocols (such as FTP, RPC, and SQL*Net), involve multiple control channels.

CBAC inspects traffic that travels through the firewall to discover and manage state information for TCP and UDP sessions. This state information is used to create temporary openings in the firewall's access lists to allow return traffic and additional data connections for permissible sessions (sessions that originated from within the protected internal network). This is the basic function of a stateful inspection firewall. Because CBAC performs this deep packet inspection, it is termed an IOS Firewall.

CBAC also provides the following benefits:
• Denial-of-service prevention and detection
• Real-time alerts and audit trails
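The temporary openings that stateful inspection punches in an access list can be modeled with a toy session table. The following Python sketch is purely illustrative — CBAC itself is a Cisco IOS feature configured with access lists and inspection rules, and every name below is invented — but it shows the core idea: traffic initiated from the protected network creates state, and only matching return traffic is admitted:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """One direction of a TCP/UDP session. Real CBAC state also
    tracks sequence numbers, timeouts and application-layer data."""
    src_ip: str
    src_port: int
    dst_ip: str
    dst_port: int

class StatefulFilter:
    """Toy model of stateful inspection: internally initiated
    sessions create temporary openings; inbound packets are
    admitted only if they match such an opening."""

    def __init__(self) -> None:
        self._sessions = set()

    def outbound(self, flow: Flow) -> None:
        # A connection initiated from inside punches a hole for its
        # return traffic, like a dynamically added access-list entry.
        self._sessions.add(flow)

    def inbound_allowed(self, flow: Flow) -> bool:
        # Permit the packet only if it is the mirror image of a
        # session that originated inside the protected network.
        mirror = Flow(flow.dst_ip, flow.dst_port, flow.src_ip, flow.src_port)
        return mirror in self._sessions
```

Recording an outbound session and then testing its mirror shows why unsolicited external connection attempts are dropped while return traffic for permissible sessions passes.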

ContraVirus
ContraVirus is a rogue spyware application that poses as a legitimate anti-spyware program.[1] The application uses a false scanner to force computer users to pay for the removal of non-existent spyware items.[2] [3] It may also be known as ExpertAntivirus.

Methods of infection
ContraVirus may be downloaded as a Trojan horse, along with possible other software. Typically, it may be installed by the SmitFraud trojan. It may also install the file wincom27.dll, located in C:\WINDOWS\, and ext32inc.dll, located in C:\WINDOWS\system\. Windows 95, 98, Me, NT, 2000, XP, Server 2003, Vista, Server 2008, 7 and Server 2008 R2 are operating systems capable of becoming infected.

Symptoms of infection
ContraVirus has been known to display fake messages stating that a user's computer is infected with spyware, in order to persuade the user to purchase the software. Typically, a user will see ContraVirus running a "scan" of their computer, at which time the user will be prompted to purchase the ContraVirus software in order to remove the threat. It may also hijack the user's browser and install a toolbar.[4]

Removal
The removal of ContraVirus is difficult and may require assistance from qualified IT support personnel. However, users have had success removing the program using the SmitFraudFix.zip program, as well as well-known programs such as Spybot Search & Destroy, Kaspersky Anti-Virus, and the Norton family of security products.[5]

References
[1] http://vil.nai.com/vil/content/v_122056.htm#threat-minimum-engine
[2] http://www.symantec.com/security_response/writeup.jsp?docid=2007-050111-3914-99&tabid=2
[3] http://www.ca.com/securityadvisor/pest/pest.aspx?id=453113271
[4] http://answers.yahoo.com/question/index?qid=20070612151517AAcGLlh
[5] http://www.bleepingcomputer.com/forums/topic95405.html

External links
• Symantec Security (http://www.symantec.com/security_response/writeup.jsp?docid=2007-050111-3914-99)
• F-Secure (http://www.f-secure.com/v-descs/trojan-downloader_w32_agent_btf.shtml)
• www.frsirt.com (http://www.frsirt.com/english/virus/2007/04029)
• www.xp-vista.com (http://www.xp-vista.com/spyware-removal/contravirus-removal-instructions)

Core Impact
CORE IMPACT Pro
Developer(s): Core Security Technologies
Stable release: 10.5 / April 2010
Operating system: Microsoft Windows
Type: Security
Website: www.coresecurity.com [1]

CORE IMPACT Pro is a commercial automated penetration testing software solution developed by Core Security Technologies which allows the user to probe for and exploit security vulnerabilities in computer networks, endpoints, web applications and wireless networks.[2] CORE IMPACT Pro is designed to attempt to evaluate the whole of the security in an office ecosystem, checking for known exploits, viability of current software and hardware security, vulnerability to psychological attack, as well as checking for compliance with government regulation.[3] The product's interface is designed to be usable by individuals both with and without specialized training in penetration testing and vulnerability assessment,[4] and includes functions for generating reports from the gathered information.[5] [6] Core claims usage by about 1000 entities worldwide.[7] IMPACT Pro allows the user to transparently pivot (as through a proxy) on an exploited target to continue probing and exploiting other elements of the network.

How it Works
Six basic steps:
1. Information gathering - collects data about targeted networks, web applications and/or email users.
2. Attack and penetration - selects and launches OS, services and application exploits, cracks encryption keys, and generates and launches web application attacks; deploys agents to enable users to interact with compromised systems.
3. Local information gathering - collects information about computers that have IMPACT agents deployed on them.
4. Privilege escalation - runs local exploits in an attempt to obtain administrative privileges on compromised systems.
5. Cleanup - uninstalls connected agents.
6. Report generation - generates reports that provide data about targeted systems and applications, audit trails of tests performed, and details about proven vulnerabilities.[8]

Pivoting
Once a system is compromised during a CORE IMPACT penetration test, the software allows the penetration tester to use that system as a beachhead from which to launch additional attacks against other systems on the same network, replicating an attacker's attempts at moving across vulnerable systems to gain deeper levels of access to network environments.[9]

Attack Categories
• Network Penetration Testing - replicates the actions of an attacker launching remote exploits on a network
• Client-Side Penetration Testing - replicates phishing, spear phishing and other social engineering attacks against end users
• Web Application Penetration Testing - replicates cross-site scripting, SQL injection, and remote file inclusion attacks against web applications
• Wireless Network Penetration Testing - replicates attempts at discovering Wi-Fi access points, cracking encryption keys, and joining exposed networks[10]

Development History and Updates
• 2000 - Core Impact development begins and the product is first presented at RSA 2001; first product sale made to NASA.
• 2002 - Core Impact versions 1 and 2 are released to market.
• 2003 - Impact version 3: client-side exploits introduced.
• 2004 - Version 4 is released (Rapid Penetration Test) as Core Security receives its first Product Award from eWeek.
• 2005 - Impact Version 5: Vulnerability Scanner Integration released.
• 2006 - Impact Version 6 is released, adding phishing assessment capabilities.
• 2007 - Impact Version 7 released, covering web application penetration testing.
• 2008 - Version 8 released, updated with multistaged testing between client-side and network assessments.
• 2009 - Both Version 9 and 10 are released, updated with visual attack path recording and wireless penetration testing.

References
[1] http://www.coresecurity.com/
[2] http://www.networkworld.com/slideshows/2010/042610-products-of-the-week.html#slide5
[3] http://nsslabs.com/test-equipment/core-impact.html
[4] http://www.infoworld.com/d/security-central/product-review-core-impact-penetration-tester-goes-phishing-390?page=0,0
[5] http://www.coresecurity.com/content/reports
[6] http://nsslabs.com/test-equipment/core-impact.html
[7] http://www.coresecurity.com/content/customers
[8] http://www.coresecurity.com/content/how-it-works
[9] http://www.coresecurity.com/content/how-it-works
[10] http://www.coresecurity.com/content/how-it-works

External links
1. Core Impact Tutorial (http://www.ethicalhacker.net/content/view/56/24/), Ethical Hacker Network
2. SC Magazine Review (http://www.scmagazineus.com/Core-Security-Technologies-Core-Impact-Pro-8/Review/2835/)
3. Insecure.Org Sploits (http://sectools.org/sploits.html)
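Pivoting ultimately reduces to relaying traffic through the compromised host so that machines reachable only from it can be probed. This generic Python sketch is not Core Impact's implementation — all names are invented — but it shows the proxy mechanism:

```python
import socket
import threading

def _copy(src: socket.socket, dst: socket.socket) -> None:
    # Shuttle bytes one direction until the sender closes.
    while data := src.recv(4096):
        dst.sendall(data)
    dst.shutdown(socket.SHUT_WR)

def pivot_once(listener: socket.socket, target: tuple) -> None:
    """Accept one connection and relay it to `target`, the way a
    pivot on a compromised host forwards an attacker's traffic
    deeper into the network. A real pivot loops and multiplexes
    many concurrent client connections."""
    client, _ = listener.accept()
    upstream = socket.create_connection(target)
    # Forward each direction independently.
    threading.Thread(target=_copy, args=(client, upstream), daemon=True).start()
    _copy(upstream, client)
```

Relaying a single connection is enough to illustrate why one compromised host extends an attacker's reach to everything that host can see.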

Core Security
Core Security Technologies is a computer and network security company that provides IT security testing and measurement software products and services. The company's research arm, CoreLabs, proactively identifies new IT security vulnerabilities, publishes public vulnerability advisories, and works with vendors to assist in eliminating the exposures they find.[1]

History
• 1996: Core Security was founded in Buenos Aires, Argentina.
• 1997: The CoreLabs research group was established and published their first advisory.
• 1998: Core conducted its first penetration test for a U.S. company.
• 1998: Core Security was recognized as an "Endeavor Entrepreneur" by the Endeavor Foundation, a foundation that supports entrepreneurial projects in emerging markets.
• 2000: The company's first U.S. office opened in New York, NY.
• 2002: Core released the first and second versions of their flagship penetration testing product, Core Impact Pro.[2]
• 2003: The company's U.S. headquarters was relocated from New York to Boston, MA.
• 2008: Mark Hatton becomes CEO of Core Security.[3]
• 2009: Core adds development sites in Boston and India.
• 2010: Core announces beta of a new security testing and measurement product, Core Insight Enterprise.

Company Management Team[4]
• Mark Hatton - President and Chief Executive Officer[5]
• John O´Brien - Executive Vice President of Corporate Operations and CFO
• Ivan Arce - Chief Technology Officer
• Milan Shah - Senior Vice President of Engineering
• Jeffrey Cassidy - Vice President of Engineering
• Ariel Waissbein - Director of Research & Development
• Alberto Soliño - Director of Security Consulting Services
• Stephen Pace - Vice President of Sales and Services
• Fred Pinkett - Vice President of Product Management
• Paula Varas - Vice President and General Manager of South American Operations
• Tom Kellermann - VP of Security Awareness and Government Affairs
• Kimberly Legelis - Vice President of Marketing

Board of Directors[6]
• Jeronimo Bosch - Pegasus Capital
• Peter Chung - Morgan Stanley Venture Partners
• Edward Hamburg - Morgan Stanley Venture Partners
• Mark Hatton - President and CEO, Core Security Technologies
• Robert Steinkrauss - CEO, ChosenSecurity, Inc.
• Shinya Akamine - CEO, BlueRoads Corp.

Advisory Board
• Jonatan Altszul - Co-Founder, Core Security Technologies and Managing Director of Aconcagua Ventures
• Roland Cloutier - Vice President, Cisco[9]

Products
Core Impact Pro: a penetration testing software solution that replicates cyber attacks to assess the security of web applications, endpoint systems and networks, including system vulnerabilities.[7]
LLC and Former Acting Senior Director for Cyberspace for the Nati