Secure Hardware Design

The Black Hat Briefings, July 26-27, 2000
Brian Oblivion, Kingpin
[oblivion, kingpin]@atstake.com

Why Secure Hardware?
- Embedded systems now common in the industry
  - Hardware tokens, smartcards, crypto accelerators, internet appliances
- Detailed analysis & reverse engineering techniques available to all
- Increase difficulty of attack
- The means exist

Solid Development Process 
Clearly identified design requirements  Identify risks in the life-cycle 
    Secure build environment Hardware/Software Revision control Verbose design documentation Secure assembly and initialization facility End-of-life recommendations 

Identify single points of failure  Security fault analysis  Third-party design review

Sources of Attack
- Attacker resources and methods vary greatly

Resource        Teenager    Academic     Org. Crime   Gov't
Time            Limited     Moderate     Large        Large
Budget ($)      <$1000      $10K-$100K   $100K+       Unknown
Creativity      Varies      High         Varies       Varies
Detectability   High        High         Low          Low
Target          Challenge   Publicity    Money        Varies
Number          Many        Moderate     Few          Unknown
Organized?      No          No           Yes          Yes
Spread info?    Yes         Yes          Varies       No

Source: Cryptography Research, Inc., 1999, "Crypto Due Diligence"

Accessibility to Product
- Purchase: all attacks possible
- Evaluation: most attacks possible, with risk of detection
- Active, in-service: most attacks possible
- Remote access: no physical access

Attack Scenarios
- System
- Enclosure
- Circuit
- Firmware

Attack Scenarios - System
- Initial experimentation & probing
- Viewed as a "black box"
- Can be performed remotely
- Bootstrapping attacks

Attack Scenarios - Enclosure
- Gaining access to product internals
- Probing (X-ray, thermal imaging, optical)
- Bypassing tamper-proofing mechanisms

Attack Scenarios - Circuit
- PCB design & parts placement analysis
- Component substitution
- Active bus and device probing
- Fault induction attacks [1]
- Timing attacks [2]
- Integrated circuit die analysis [3]

Attack Scenarios - Firmware
- Low-level understanding of the product
- Obtain & modify intellectual property
- Bypass system security mechanisms
- Ability to mask failure detection

Attack Scenarios - Strictly Firmware (no product needed!)
- Obtain firmware from vendor's public-facing web site
- Can be analyzed and disassembled without detection

What Needs To Be Protected?
- Firmware binaries
- Boot sequence
- Cryptographic functionality (offloaded to coprocessor)
- Secret storage and management
- Configuration and management communication channels

System
System | Firmware | Circuit | Enclosure

Trusted Base (System)
- Minimal functionality
  - Trusted base to verify the integrity of firmware and/or operating system
  - Secure store for secrets
  - Secrets never leave the base unencrypted
  - Security kernel
- Examples of a Trusted Base
  - A single IC (some provide secure store for secrets)
  - May be purchased or custom built (secure coprocessor)
  - All internals: components, circuit boards, etc.
  - Entire trusted base resides within tamper envelope

Security Kernel (System)
- Better when implemented in the Trusted Base, but can function in the OS
- Enforces the security policy
- Ability to decouple secrets from the OS
- Example: Cryptlib [4]

Trusted Base Example (System)
[Block diagram: host processor with main memory and host firmware on a memory-mapped bus; trusted base connected via system control bus, bulk transfer bus, and communication interface(s); external memory bus to data memory (may be dual-ported for bulk encrypt/decrypt)]

Failure Modes (System)
- Determine how the product handles failures
- Fail-open or fail-closed?
- Response depends on failure type
  - Halt system
  - Set failure flags and continue
  - Zeroization of critical areas
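
As a rough illustration of the fail-closed response above, here is a minimal C sketch, assuming hypothetical buffers holding the critical secrets and a placeholder halt routine; the volatile write loop keeps the compiler from optimizing the zeroization away.

    #include <stdint.h>
    #include <stddef.h>

    /* Hypothetical critical areas; in a real design these would map to key
     * registers or battery-backed SRAM inside the trusted base. */
    static uint8_t session_keys[64];
    static uint8_t config_secrets[32];

    /* Volatile writes so the zeroization cannot be optimized away. */
    static void zeroize(volatile uint8_t *buf, size_t len)
    {
        while (len--)
            *buf++ = 0;
    }

    /* Placeholder for a lockup-until-reset or hardware reset routine. */
    static void halt_system(void)
    {
        for (;;)
            ;
    }

    /* Fail-closed response: clear critical areas, then halt.  A less severe
     * failure type might only set a flag and continue. */
    void on_failure(void)
    {
        zeroize(session_keys, sizeof(session_keys));
        zeroize(config_secrets, sizeof(config_secrets));
        halt_system();
    }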

Management Interfaces (System)
- Do not include service backdoors!
- Utilize access control
- Encrypt all management sessions
  - SSH for shell administration
  - SSL for web administration

Firmware
System | Firmware | Circuit | Enclosure

Secure Programming Practice (Firmware)
- Code obfuscation & symbol stripping
  - Remove symbol tables, debug info
  - Use compiler optimizations
  - Remove functionality not needed in production
  - Two versions of firmware: Development, Production
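
One common way to keep the development and production images separate is conditional compilation; this is only a sketch, assuming a hypothetical DEVELOPMENT_BUILD define, showing how debug output can be compiled out of the production firmware entirely.

    #include <stdio.h>

    /* Hypothetical build flag: defined only for the development image. */
    #ifdef DEVELOPMENT_BUILD
    #define DEBUG_LOG(msg) printf("DEBUG: %s\n", (msg))
    #else
    #define DEBUG_LOG(msg) ((void)0)  /* stripped from the production image */
    #endif

    int main(void)
    {
        DEBUG_LOG("entering main loop");  /* absent from production binaries */
        return 0;
    }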

Secure Programming Practice (Firmware)
- Buffer overflows [5]
  - Highly publicized and attempted
  - If interfacing to a PC, driver code with an overflow could potentially lead to compromise
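
A minimal sketch of the kind of length check that prevents the classic overflow, assuming a hypothetical handle_command() entry point that receives data from a host interface.

    #include <stdint.h>
    #include <stddef.h>
    #include <string.h>

    #define CMD_MAX 64

    /* Reject oversized input before copying into the fixed-size buffer. */
    int handle_command(const uint8_t *pkt, size_t pkt_len)
    {
        uint8_t cmd[CMD_MAX];

        if (pkt == NULL || pkt_len > sizeof(cmd))
            return -1;              /* fail closed on malformed input */

        memcpy(cmd, pkt, pkt_len);  /* bounded copy, never past cmd[] */
        /* ... parse and dispatch on cmd[0] ... */
        return 0;
    }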

Boot Sequence (Firmware)

[Diagram: two boot models compared along a "hardware reset -> time" axis]
- Common boot model: host system flash (BIOS, may be ROM) -> flash disk or fixed disk (embedded OS or state machine) -> flash disk or fixed disk (applications); new or overloaded functionality may be introduced at each stage.
- Trusted boot sequence: CSOC bootrom & flash (POST, security kernel; verify bootrom and flash) -> host system flash (verify embedded OS) -> flash disk or fixed disk (embedded OS or state machine; verify applications) -> flash disk or fixed disk (applications).
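
A sketch of the verification step in the trusted boot sequence, assuming a digest routine (sha1 here) provided by the bootrom and a reference digest stored at manufacturing time; the next stage is only executed if the digests match.

    #include <stdint.h>
    #include <stddef.h>
    #include <string.h>

    /* Assumed to be supplied by the bootrom; any strong digest would do. */
    extern void sha1(const uint8_t *data, size_t len, uint8_t digest[20]);

    /* Reference digest of the next boot stage, stored in the CSOC/bootrom at
     * manufacturing time (placeholder value). */
    static const uint8_t expected_digest[20] = { 0 };

    /* Returns 0 if the image in flash matches the reference digest. */
    int verify_next_stage(const uint8_t *image, size_t image_len)
    {
        uint8_t digest[20];

        sha1(image, image_len, digest);
        if (memcmp(digest, expected_digest, sizeof(digest)) != 0)
            return -1;  /* tampered or corrupted image: do not boot it */
        return 0;
    }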

Run-Time Diagnostics (Firmware)
- Make sure device is 100% operational all the time
- Periodic system checks
- Failing device may result in compromise
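
A sketch of a periodic diagnostic hook, assuming hypothetical self-test and tamper-response routines; any failed check is treated as a potential compromise.

    #include <stdbool.h>

    /* Hypothetical self-tests: e.g., a crypto known-answer test and a memory
     * pattern test.  Implementations live elsewhere in the firmware. */
    extern bool crypto_known_answer_test(void);
    extern bool memory_pattern_test(void);
    extern void tamper_response(void);  /* e.g., zeroize and halt */

    /* Called periodically from a timer interrupt or the main loop. */
    void run_time_diagnostics(void)
    {
        if (!crypto_known_answer_test() || !memory_pattern_test())
            tamper_response();
    }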

Secret Management (Firmware)
- Never leak unencrypted secrets out
- Escrow mechanisms are a security hazard
- If required, perform at key generation, in the physical presence of humans
  - Physically export the Key Encryption Key and protect it
  - Export other keys encrypted with the Key Encryption Key
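
A sketch of the export rule above, assuming a hypothetical encrypt_with_kek() primitive inside the trusted base: only the wrapped (black) form of a working key ever crosses the boundary.

    #include <stdint.h>
    #include <stddef.h>

    /* Assumed wrapping primitive inside the trusted base; the KEK itself is
     * never exported in the clear. */
    extern int encrypt_with_kek(const uint8_t *working_key, size_t key_len,
                                uint8_t *wrapped_out, size_t out_len);

    /* Export a working key for backup: only the encrypted form leaves. */
    int export_key(const uint8_t *working_key, size_t key_len,
                   uint8_t *wrapped_out, size_t out_len)
    {
        if (working_key == NULL || wrapped_out == NULL)
            return -1;
        return encrypt_with_kek(working_key, key_len, wrapped_out, out_len);
    }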

Cryptographic Functions (Firmware)
- If possible, move out of firmware
- ...into an ASIC
  - Difficult to modify algorithm
  - Cannot be upgraded easily
  - Increased performance
- ...into a commercial CSOC or FPGA
  - Can reconfigure for other algorithms
  - May also provide key management
  - Increased performance
  - Reconfiguration via signed download procedure (CSOC only)

Field Programmability (Firmware)
- Is your firmware accessible to everyone from your product support web page?
- Encryption
  - Compressing the image is not secure
  - Encrypting code will limit exposure of intellectual property
- Code signing
  - Reduce possibility of loading unauthorized code
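
A sketch of the code-signing check for field updates, assuming a signature_valid() routine and a vendor public key embedded in the bootrom; an unsigned or tampered image is rejected before it ever reaches flash.

    #include <stdint.h>
    #include <stddef.h>
    #include <stdbool.h>

    /* Assumed primitives: a signature check (e.g., RSA or DSA) and the vendor
     * public key embedded in the bootrom. */
    extern bool signature_valid(const uint8_t *pubkey, size_t pubkey_len,
                                const uint8_t *image, size_t image_len,
                                const uint8_t *sig, size_t sig_len);
    extern const uint8_t vendor_pubkey[];
    extern const size_t vendor_pubkey_len;

    /* Accept a downloaded image only if its vendor signature verifies. */
    int accept_firmware_update(const uint8_t *image, size_t image_len,
                               const uint8_t *sig, size_t sig_len)
    {
        if (!signature_valid(vendor_pubkey, vendor_pubkey_len,
                             image, image_len, sig, sig_len))
            return -1;  /* unauthorized code: refuse to load it */
        /* ... decrypt (if distributed encrypted) and program flash ... */
        return 0;
    }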

Circuit
System | Firmware | Circuit | Enclosure

PCB Design (Circuit)
- Remove unnecessary test points
- Traces as short as possible
- Differential lines parallel (even if on separate layers)
- Separate analog, digital & power GND planes
- Alternate power and GND planes

Parts Placement (Circuit)
- Difficult access to critical components
- Proper power filtering circuit as close to input as possible
- Noisy circuitry (i.e., inductors) compartmentalized

Physical Access to Components (Circuit)
- Epoxy encapsulation of critical components
- Include detection mechanisms in and under epoxy boundary

Power Supply & Clock Protection (Circuit)
- Set min. & max. operating limits
- Protect against intentional voltage variation
  - Watchdogs (ex: Maxim, Dallas Semi.)
  - DC-DC converters, regulators, diodes
- Monitor clock signals to detect variations

I/O Port Properties (Circuit)
- Disable all unused I/O pins
- Digital honeypot: use unused pins to detect probing or tampering (esp. for FPGAs)

Programmable Logic & Memory (Circuit)
- Make use of on-chip security features
- FPGA design
  - Make sure all conditions are covered
  - State machines should have default states in place
- Be aware of what information is being stored in memory at all times [6] (i.e., passwords, private keys, etc.)
- Prevent back-powering of non-volatile memory devices

Advanced Memory Management (Circuit)
- Often implemented in a small FPGA
- Bounds checking in hardware
  - Execution, R/W restricted to defined memory
  - DMA restricted to specified areas only
- Trigger response based on detection of "code probing" or error condition
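
The range check itself is simple; here is a C rendering of the comparison the memory-management FPGA would perform in logic, with placeholder bounds for the allowed DMA window.

    #include <stdint.h>
    #include <stdbool.h>

    /* Placeholder window; in the real design these bounds sit in FPGA
     * registers and the comparison is done in hardware. */
    #define DMA_REGION_START 0x20000000u
    #define DMA_REGION_END   0x2000FFFFu

    /* Any access outside the defined window (or a wrapping length) is a
     * violation and should trigger the error response. */
    bool dma_access_allowed(uint32_t addr, uint32_t len)
    {
        uint32_t end = addr + len - 1u;

        if (len == 0u || end < addr)
            return false;
        return addr >= DMA_REGION_START && end <= DMA_REGION_END;
    }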

Bus Management (Circuit)
- COMSEC requirements
- Keep black (encrypted) and red (in-the-clear) buses separate
- Data leaving the device should always be black
- Be aware of data on shared buses

Enclosure
System | Firmware | Circuit | Enclosure

Tamper Proofing (Enclosure)
- Resistance, Evidence, Detection, Response
- Most effective when layered
- Possibly bypassed with knowledge of method

Tamper Proofing - Tamper Resistance (Enclosure)
- Hardened steel enclosures
- Locks
- Encapsulation, potting
- Security screws
- Tight airflow channels, 90° bends to prevent optical probing
- Side-effect is tamper evidence

Tamper Proofing - Tamper Evidence (Enclosure)
- Major deterrent for minimal risk takers
- Passive detectors: seals, tapes, cables
- Special enclosure finishes
- Most can be bypassed [7]

Tamper Proofing - Tamper Detection (Enclosure)
- Ex: temperature sensors, micro-switches, nichrome wire, flex circuit, radiation sensors, magnetic switches, pressure contacts, fiber optics

Tamper Proofing - Tamper Response (Enclosure)
- Result of tampering being detected
- Zeroization of critical memory areas
- Provide audit information

RF/ESD Emissions & Immunity (Enclosure)
- Clean, properly filtered power supply
- EMI shielding: coatings, sprays, housings
- Electrostatic discharge protection
  - Could be injected by attacker to cause failures
  - Diodes, Transient Voltage Suppressor devices (i.e., Semtech)

External Interfaces (Enclosure)
- Use caution if connecting to "outside world"
- Protect against malformed, intentionally bad packets
- Encrypt or (at least) obfuscate traffic
- Be aware if interfaces provide access to internal bus
  - Control bus activity through transceivers
  - Attenuate signals which leak through transceivers with exposed buses (token interfaces)
- Disable JTAG and diagnostic functionality in operational modes

In Conclusion...
As a designer:
- Think as an attacker would
- As design is in progress, allocate time to analyze and break the product
- Peer review
- Third-party analysis
- Be aware of latest attack methodologies & trends

References
1. Maher, David P., "Fault Induction Attacks, Tamper Resistance, and Hostile Reverse Engineering in Perspective," Financial Cryptography, February 1997, pp. 109-121
2. Timing Attacks, Cryptography Research, Inc., http://www.cryptography.com/timingattack/
3. Beck, F., "Integrated Circuit Failure Analysis: A Guide to Preparation Techniques," John Wiley & Sons, Ltd., 1998
4. Gutmann, P., "The Design of a Cryptographic Security Architecture," Usenix Security Symposium 1999; Cryptlib, http://www.cs.auckland.ac.nz/~pgut001/cryptlib.html
5. Mudge, "Compromised Buffer Overflows, from Intel to SPARC version 8," http://www.L0pht.com/advisories/bufitos.html
6. Gutmann, P., "Secure Deletion from Magnetic and Solid-State Memory Devices," http://www.cs.auckland.ac.nz/~pgut001/secure_del.html
7. "Physical Security and Tamper-Indicating Devices," http://www.asis.org/midyear-97/Proceedings/johnstons.html

Additional Reading
1. Chaum, D., "Design Concepts for Tamper Responding Systems," Crypto 1983, pp. 387-392
2. Clark, Andrew J., "Physical Protection of Cryptographic Devices," Eurocrypt: Advances in Cryptology, April 1987, pp. 83-93
3. Weingart, S.H., White, S.R., Arnold, W.C., Double, G.P., "An Evaluation System for the Physical Security of Computing Systems," Sixth Annual Computer Security Applications Conference 1990, pp. 232-243
4. Differential Power Analysis, Cryptography Research, Inc., http://www.cryptography.com/dpa/
5. DoD Trusted Computer System Evaluation Criteria (Orange Book), 5200.28-STD, December 1985, http://www.radium.ncsc.mil/tpep/library/rainbow/5200.28-STD.html
6. The Complete, Unofficial TEMPEST Information Page, http://www.eskimo.com/~joelm/tempest.html

Thanks! .