
IBM Systems and Technology Group

Sequential Equivalence Checking


Across Arbitrary Design Transformation:
Technologies and Applications

Viresh Paruthi, IBM Corporation

J. Baumgartner, H. Mony, R. L. Kanzelman

Formal Methods in Computer-Aided Design, 2006 | 11/16/2006 | © 2006 IBM Corporation



Outline

 Equivalence Checking Overview


– Combinational Equivalence Checking (CEC)
– Sequential Equivalence Checking (SEC)

 Use of SEC within IBM


 IBM’s SEC Solution
 SEC Applications
 SEC Challenges
 Conclusion


Equivalence Checking

 A technique to check equivalent behavior of two designs


[Figure: designs Logic 1 (with registers R1) and Logic 2 (with registers R2) are driven by the same input sequence {x0, x1, …}; corresponding outputs are XORed, and the check asks whether the XOR stream is {0, 0, …}.]

 Validates that certain design transforms preserve behavior


– Logic synthesis and manual redesign do not introduce bugs
 Often done formally to save resources, eliminate risk
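
As a toy illustration of the miter construction in the figure above (hypothetical two-bit adder fragments, not IBM code), the sketch below XORs corresponding outputs of two combinational implementations and exhaustively confirms the miter can never evaluate to 1:

```python
from itertools import product

# Two hypothetical implementations of a 2-bit adder (sum modulo 4).
def logic1(a0, a1, b0, b1):
    return (a0 ^ b0, a1 ^ b1 ^ (a0 & b0))          # ripple-carry style

def logic2(a0, a1, b0, b1):
    s = (a0 + 2 * a1 + b0 + 2 * b1) & 0b11          # arithmetic style
    return (s & 1, (s >> 1) & 1)

def miter_asserted(inputs):
    """True if any XOR of corresponding outputs evaluates to 1."""
    o1, o2 = logic1(*inputs), logic2(*inputs)
    return any(x ^ y for x, y in zip(o1, o2))

# Exhaustive check: the miter output must be 0 for every input valuation.
assert not any(miter_asserted(v) for v in product((0, 1), repeat=4))
print("designs are combinationally equivalent")
```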


Combinational Equivalence Checking (CEC)

 No sequential analysis: latches treated as cutpoints  


[Figure: latches R1 and R2 are cut into pseudo-inputs (S) and pseudo-outputs; the outputs and next-state functions of Logic 1 and Logic 2 are then compared combinationally, with each XOR checked to be 0.]

 Equivalence check over outputs + next-state functions


– Though NP-complete, CEC is a scalable, mature technology
 CEC is the most prevalent formal verification application
– Often mandated to validate synthesis
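
A minimal sketch of the cutpointing idea (hypothetical next-state and output functions with identical latch encodings; an illustration, not Verity's implementation): current-state latches become free inputs, and outputs plus next-state functions are compared purely combinationally.

```python
from itertools import product

# Hypothetical designs: a 2-bit counter with an "at maximum" output,
# written two different ways but sharing the same latch encoding.
def old_design(en, s0, s1):
    n0 = s0 ^ en
    n1 = s1 ^ (s0 & en)
    return {"next_s0": n0, "next_s1": n1, "out_max": s0 & s1}

def new_design(en, s0, s1):
    count = (s0 + 2 * s1 + en) & 0b11
    return {"next_s0": count & 1, "next_s1": (count >> 1) & 1,
            "out_max": int(s0 + 2 * s1 == 3)}

# CEC: latches are cutpoints, so (en, s0, s1) are all free inputs and we
# compare outputs and next-state functions over every valuation.
for en, s0, s1 in product((0, 1), repeat=3):
    assert old_design(en, s0, s1) == new_design(en, s0, s1)
print("outputs and next-state functions match: CEC passes")
```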

Sequential Equivalence Checking (SEC)

 Latch cutpointing requirement severely limits CEC applicability


– Cannot handle retimed designs, state machine re-encoding, ...
– Cutpointing may cause mismatches in unreachable states
• Often requires manual introduction of constraints over cutpoints

 SEC overcomes these CEC limitations


– Supports arbitrary design changes that do not impact I/O behavior
• Does not require a 1:1 latch or hierarchy correspondence
– Known mappings can be leveraged to reduce problem complexity
• Check restricted to reachable states

– Explores sequential behavior of design to assess I/O equivalence
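
To make the contrast with CEC concrete, here is a small sketch that checks I/O equivalence of two hypothetical FSMs with different state encodings by exhaustively exploring the product machine from the given initial states (brute-force reachability, not SixthSense's algorithms):

```python
# Two hypothetical parity trackers with different state encodings:
# design A uses a pair of one-hot latches, design B a single latch.
def step_a(state, inp):                 # state is (even_flag, odd_flag)
    even, odd = state
    nxt = (odd, even) if inp else (even, odd)
    return nxt, nxt[1]                  # output: "odd parity seen"

def step_b(state, inp):                 # state is a single parity bit
    nxt = state ^ inp
    return nxt, nxt

def sequentially_equivalent(init_a, init_b):
    """Full reachability over the product machine: equivalent iff no
    reachable product state produces mismatched outputs."""
    frontier, seen = {(init_a, init_b)}, set()
    while frontier:
        sa, sb = frontier.pop()
        if (sa, sb) in seen:
            continue
        seen.add((sa, sb))
        for inp in (0, 1):
            na, oa = step_a(sa, inp)
            nb, ob = step_b(sb, inp)
            if oa != ob:
                return False            # mismatch in a reachable state
            frontier.add((na, nb))
    return True

print(sequentially_equivalent(init_a=(1, 0), init_b=0))   # True
```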


SEC Is Computationally Expensive

 Sequential verification: more complex than combinational


– Higher complexity class: PSPACE vs. NP
– Model checking is thus less scalable than CEC

 SEC deals with twice the model size of model checking!


– Composite model built over both designs being equivalence-checked
– However, tuned algorithms exist to scale SEC better in practice


SEC Paradigms

 Various SEC paradigms exist

 Initialized approaches
– Check equivalent behavior from user-specified initial states
– Assumes that designs can be brought into known reset states

 Uninitialized approaches, e.g. alignability analysis


– Require designs to share a common reset mechanism
– Compute the reset mechanism concurrently with checking equivalence from a reset state


IBM’s Approach: Initialized SEC

 More flexible:
– Enables checking specific modes of operation
– Applicable even if initialization logic altered (or not yet implemented)
– Applicable even to designs that are not exactly equivalent
• Pipeline stage added? check equivalence modulo 1-clock delay
• data_out differs when data_valid=0? check equiv only when data_valid=1

 More scalable: 1,000s to even 100,000+ state elements


– Reset mechanism computation adds (needless) complexity

 Validation of reset mechanism can be done independently


– Functional verification performed w.r.t. power-on reset states
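
A hedged sketch of such "partial equivalence", assuming hypothetical OLD/NEW models in which NEW adds one pipeline stage: the bounded check compares NEW's data_out against OLD's data_out one cycle earlier, and only when data_valid is asserted.

```python
from itertools import product

def step_old(state, data_in, valid_in):
    # data_out is combinational; data_valid is valid_in delayed one cycle.
    out = {"data_out": (data_in + 1) & 0b11, "data_valid": state}
    return valid_in, out

def step_new(state, data_in, valid_in):
    # Same function, but the data path gains one pipeline stage.
    staged_data, staged_valid = state
    out = {"data_out": staged_data, "data_valid": staged_valid}
    return ((data_in + 1) & 0b11, valid_in), out

def check(depth=4):
    """Bounded check over all input sequences: whenever NEW asserts
    data_valid, its data_out must equal OLD's data_out one cycle earlier."""
    for seq in product(product(range(4), range(2)), repeat=depth):
        s_old, s_new, old_history = 0, (0, 0), []
        for data_in, valid_in in seq:
            s_old, o_old = step_old(s_old, data_in, valid_in)
            s_new, o_new = step_new(s_new, data_in, valid_in)
            if o_new["data_valid"] and old_history:
                if o_new["data_out"] != old_history[-1]["data_out"]:
                    return False
            old_history.append(o_old)
    return True

print(check())   # True: equivalent modulo one clock, when data_valid=1
```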


SEC Usage at IBM

 IBM’s SEC toolset: SixthSense


– Developed primarily for custom microprocessor designs
– Also used by ASIC teams, and for (semi-)formal functional verification

 Use CEC to validate combinational synthesis


– Verity is IBM’s CEC toolset
– Also used for other specific purposes, e.g. ECO verification

 Use SEC for pre-synthesis HDL comparisons


– Sequential optimizations manually reflected in HDL
– SEC efficiently eliminates the risk of such optimizations


SixthSense Horsepower

 SixthSense is a system of cooperating algorithms


– Transformation engines (simplification/reduction algorithms)
– Falsification engines
– Proof engines

 Unique Transformation-Based Verification (TBV) framework


– Exploits maximal synergy between various algorithms
– Retiming, redundancy removal, localization, induction...
– Incrementally chop problem into simpler sub-problems until solvable

 Transformations yield exponential speedups to bug-finding (semi-formal) as well as proof (formal) applications


Transformation-Based Verification (TBV)

[Figure: TBV engine chain. Design N feeds a Redundancy Removal Engine, producing Design N'; Design N' feeds a Retiming Engine, producing Design N''; Design N'' feeds a Target Enlargement Engine, producing Design N'''. Results flow back up the chain: Result N''' -> Result N'' -> Result N' -> Result N on the original design.]


Transformation-Based Verification Framework


[Figure: example SixthSense run. Input: Design + Driver + Checker, 140627 registers. A Combinational Optimization Engine reduces it to 119147 registers, a Retiming Engine to 100902 registers, and a Localization Engine to 132 registers; a Reachability Engine then solves the residual problem. Traces flow back up the chain (localized trace -> retimed trace -> optimized trace -> counterexample trace consistent with the original design). Problem decomposition happens via synergistic transformations; these transformations are completely transparent to the user, and all results are in terms of the original design.]

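The flow above can be read as a chain of engines, each of which simplifies the model and knows how to lift results found on its simplified model back to the model it was given. A schematic sketch of that plumbing (hypothetical engine interface, not SixthSense's actual API; the register counts mirror the figure):

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Engine:
    """One transformation engine: simplify a model, and lift any result
    found on the simplified model back to the model it was handed."""
    name: str
    simplify: Callable[[dict], dict]
    lift_result: Callable[[dict], dict]

def run_tbv(model: dict, engines: List[Engine], terminal_solver) -> dict:
    # Apply each transformation in turn, remembering how to undo it.
    lifts = []
    for engine in engines:
        model = engine.simplify(model)
        lifts.append(engine.lift_result)
    # Solve (e.g. reachability) on the fully reduced model...
    result = terminal_solver(model)
    # ...then lift the result back so it is stated in terms of the
    # original design, keeping the transformations transparent.
    for lift in reversed(lifts):
        result = lift(result)
    return result

# Tiny illustrative use: "models" are just register counts here.
engines = [
    Engine("combinational optimization",
           simplify=lambda m: {**m, "registers": m["registers"] - 21480},
           lift_result=lambda r: r),
    Engine("retiming",
           simplify=lambda m: {**m, "registers": m["registers"] - 18245},
           lift_result=lambda r: r),
]
print(run_tbv({"registers": 140627}, engines,
              terminal_solver=lambda m: {"status": "proved", "model": m}))
```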

Example SixthSense Engines


 Combinational rewriting
 Sequential redundancy removal
 Min-area retiming
 Sequential rewriting
 Input reparameterization
 Localization
 Target enlargement
 State-transition folding
 Isomorphic property decomposition
 Unfolding
 Semi-formal search
 Symbolic simulation: SAT+BDDs
 Symbolic reachability
 Induction
 Interpolation
 …
 Expert System Engine automates optimal engine sequence experimentation


Key to Scalability: Assume-then-prove framework

1. Guess redundancy candidates


 Equivalence classes of gates

2. Create speculatively-reduced model


 Add a miter (XOR) over each candidate and its equivalence-class representative
 Replace fanout references by representatives

3. Attempt to prove each miter unassertable


4. If all miters are proven unassertable, the corresponding gates can be merged
5. Else, refine to separate unproven candidates; go to Step 2
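
A skeleton of this loop is sketched below (the candidate-guessing, model-building, and proof routines are hypothetical callbacks supplied by the caller; SixthSense's actual engines are far more sophisticated):

```python
def assume_then_prove(design, guess_candidates, build_speculative_model,
                      prove_unassertable):
    """Skeleton of the assume-then-prove loop (hypothetical interfaces).
    Candidate classes: dict mapping a representative gate to the set of
    gates suspected equivalent to it."""
    classes = guess_candidates(design)                          # step 1
    while True:
        # Step 2: speculatively-reduced model: a miter per (rep, gate)
        # pair, with fanout references replaced by the representative.
        model, miters = build_speculative_model(design, classes)
        # Step 3: try to prove every miter unassertable.
        unproven = [m for m in miters if not prove_unassertable(model, m)]
        if not unproven:                                        # step 4
            return classes         # safe to merge each class onto its rep
        # Step 5: refinement: evict unproven gates; each becomes its own
        # representative with no remaining suspects, then retry.
        for rep, gate in unproven:
            classes[rep].discard(gate)
            classes[gate] = set()

# Toy run: two candidates against representative "r"; the prover only
# accepts gate "a", so "b" gets evicted and the loop converges.
result = assume_then_prove(
    design=None,
    guess_candidates=lambda d: {"r": {"a", "b"}},
    build_speculative_model=lambda d, c: (d, [(rep, g) for rep in c
                                              for g in c[rep]]),
    prove_unassertable=lambda m, miter: miter[1] != "b")
print(result)   # {'r': {'a'}, 'b': set()}
```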


Assume-then-prove Framework

 Speculative reduction greatly enhances scalability


– Generalizes CEC
• Sequential analysis only needed over sequentially redesigned logic

 Proof step is the most costly facet


– Most equivalences solved by lower-cost algorithms (e.g. induction)
• However, some equivalences can be very difficult to prove
– Failure to prove a cutpoint often degrades into an inconclusive SEC run

 Novel SixthSense technology: leverage synergistic algorithms to solve these harder proofs


Causes of refinement

 Asserted miter – incorrect candidate guessing


 Resource limitations preclude proof
– Induction becomes expensive with depth
– Approximation weakens power of reachability

 Refinement weakens induction hypothesis


– Immediate separation of candidate gates
– Avalanche of future resource-gated refinements
– End result? Suboptimal redundancy removal
• Inconclusive equivalence check


SixthSense: Enhanced Redundancy Proofs

 Use of a robust variety of synergistic transformation and verification algorithms
– Enables best proof strategy per miter
• Exponential run-time improvements
– Greater speed and scalability
– Greater degree of redundancy identified

 Powerful use of Transformation-based Verification


– Synergistically leverage transformations to simplify large problems
– Reduction in model size, number of distinct miters
• Transformation alone sufficient for many proofs


Benefits of Transformation-Based Verification

 Reduction in model size, number of distinct miters


– Useful regardless of proof technique

 Transformations alone sufficient for many proofs


– Sub-circuits differing by retiming and resynthesis solved using polynomial-resource transformations
– Scales to aggressive design modifications

 Leverage independent proof strategy on each miter


– Different algorithms suited for different problems
– Entails exponential difference in run-times


TBV on Reduced Model

 Methodology restrictions
– Retiming may render name- and structure-based candidate guessing ineffective

 Synergistic increase in reduction potential


– TBV flows more effective after merging
– Applying TBV before + after induction-based redundancy removal is insufficient

 Need to avoid resource-gated refinement

“Exploiting Suspected Redundancy without Proving it”, DAC 2005


Redundancy Removal Results


[Bar charts: number of registers for IFU, SMM, and S6669, comparing the original design, the design after merging via induction, and the design after merging via TBV.]
• Induction alone unable to solve all properties
• TBV => solves all properties, faster than induction


TBV on speculatively-reduced model

IFU          Initial      COM     LOC    CUT    COM
Registers      33231    30362      19     19     19
ANDs          304990   276795      86     76     71
Inputs          1371     1329      23     10     10

S6669        Initial      COM     CUT    RET    CUT    COM
Registers        325      186     138      0      0      0
ANDs            3992     3067    1747   2186   1833   1788
Inputs            83       61      40     40     24     24

Legend: COM = combinational reduction, LOC = localization, RET = retiming, CUT = reparameterization


Enhanced search without Proofs

 Use miters as filters


– No miter asserted => search remains within states for which speculative merging is correct
• i.e., search results are valid on the original model also
– Miters need not be proven unassertable
– Enables exploitation of redundancy that holds only for an initial bounded time-frame

 Faster and deeper bounded falsification

 Improved candidate guessing using spec-reduced model
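
A sketch of the filtering idea, assuming a hypothetical random-simulation harness: the search runs on the speculatively reduced model and abandons any trace on which a miter asserts, since only miter-free prefixes are guaranteed to be valid on the original model.

```python
import random

def bounded_search(step, init_state, targets, miters, depth=20, runs=200):
    """Random bounded search on the speculatively reduced model, using
    miters as filters: a run is cut short as soon as any miter asserts,
    because beyond that point the speculative merging may be unsound.
    (Hypothetical harness, not SixthSense's semi-formal engine.)"""
    hits = set()
    for _ in range(runs):
        state = init_state
        for t in range(depth):
            state, outs = step(state, random.getrandbits(1))
            if any(outs[m] for m in miters):
                break                     # discard the rest of this trace
            hits.update((tgt, t) for tgt in targets if outs[tgt])
    return hits

# Toy model: a saturating counter; "target" fires at 5; the (hypothetical)
# miter would fire if the count ever exceeded 6, which never happens here.
def step(count, inc):
    nxt = min(count + inc, 6)
    return nxt, {"target": nxt == 5, "miter0": nxt > 6}

print(bounded_search(step, init_state=0,
                     targets=["target"], miters=["miter0"]))
```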


Bounded Falsification Results (% improvement)

[Bar chart: bounded steps completed in 1 hour on FPU, IFU, SMM, S3384, and S6669, comparing the original design against the speculatively reduced model; per-design improvements range from 25% to 393% (25%, 43.75%, 50%, 189%, and 393%).]


Miter Validation Results (% improvement)

[Bar chart: bounded steps completed in 1 hour on FPU, IFU, SMM, S3384, and S6669, comparing the original design with miters against the speculatively reduced model with miters; per-design improvements range from 58% to 872% (58%, 92%, 100%, 785%, and 872%).]


SixthSense Sequential Equivalence Checking


[Figure: SixthSense SEC setup. Inputs: OLD design, NEW design, drivers (stimulus), black-box list, checkers, mapping file, and initialization data. SixthSense builds the initialized OLD and NEW designs, ties their inputs together, and compares their outputs (=?). The outcome is either a mismatch trace or a proof of equality.]

Running the Sequential Equivalence Check

 Little manual effort to use

 Produces a counterexample showing output mismatch


– With respect to specified initial state(s)
– Trace is short and has minimal activity to simply illustrate mismatch

 Or, proves that no such trace exists


– Proof of equivalence

 Mandatory inputs:
– The OLD and NEW versions of the design


Running the Seq Equiv Check: Optional Inputs

 Initialization data; equiv checked w.r.t. given initial values


 Mapping file
– Indicates I/O signal renaming/polarities, added cutpoints, omitted checks…

 Drivers: filter input stimuli to prevent spurious mismatches


 Black Box file: to easily delete components from the design
– Outputs correlated, driven randomly; Inputs correlated, made targets

 Checkers (check equivalence of internal events)


– Ensure that coverage obtained before the change is valid after
– "Audit" known mismatches to enable meaningful proofs
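
Purely to make the list above concrete, here is one hypothetical way such setup data could be organized. This is illustrative only and is not SixthSense's actual input format; all signal and component names are invented.

```python
# Hypothetical illustration only: NOT SixthSense's file format, just one
# way the optional setup information listed above could be organized.
sec_setup = {
    "initialization": {"old.reset_reg": 1, "new.reset_reg": 1},
    "mapping": [
        # old signal        new signal         polarity    action
        ("data_out(0:7)",   "dout_q(0:7)",     "same",     "check"),
        ("scan_out",        "scan_out",        "same",     "omit_check"),
        ("parity_err",      "parity_err_b",    "inverted", "check"),
    ],
    "drivers": {
        # Constrain stimuli to rule out spurious mismatches, e.g. only
        # compare outputs while data_valid is held high.
        "data_valid": "must_be_1",
    },
    "blackboxes": ["fpu_divider"],          # delete from both designs
    "checkers": ["fifo_overflow_event"],    # internal events to cross-check
}
print(sec_setup["mapping"][0])
```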


Sequential Equivalence Checking Applications

 Used at block/unit-level on multiple projects…


– To verify remaps, retiming, synthesis optimizations…
– CEC inadequate to deal with these changes

 Exposed 100s of unintended mismatches/design errors


– No need to run lengthy regression buckets for lesser coverage
– SixthSense often provides proofs/bugs in less time
– No need to debug lengthy, more cluttered traces
• SixthSense traces are short, with minimal activity to illustrate bug
– Quickly finds bugs before faulty logic is released


Example SEC Applications

 Timing optimizations: retiming, adding redundant logic,…


 Power optimizations: clock gating, logic minimization, …
 Check specific modes of design behavior
– Backward-compatibility modes of a redesign preserve functionality
– BIST change must not alter functionality

 Verifying RTL vs. higher-level models

 Quantifying late design fixes


– E.g., constrain SEC to disallow the operations affected by a fix


Example Applications: Clock-Gating Verification

 Clock-gating:
– Disables clocks to certain state elements when they are not required to update

 Approach: Equiv-check identical unit
– One with clock-gating enabled, one disabled
– Check design behavior does not change during care time-frames

[Figure: the same unit instantiated twice, once with Clock Gating Disabled and once with Clock Gating Enabled, compared by SEC.]

 Leveraged to converge upon an optimal clock-gating solution
– Iteratively apply SEC to ascertain if clock-gating a latch alters function

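A miniature version of this check, assuming a hypothetical one-register unit: the hold behavior is implemented with a feedback mux in one copy and with a gated clock in the other, and a bounded SEC-style sweep over all input sequences confirms the visible behavior is identical.

```python
from itertools import product

def step_ungated(q, d, en):
    # Free-running clock; the hold is implemented as an AND-OR feedback mux.
    return (d & en) | (q & (1 - en))

def step_gated(q, d, en):
    # Clock gated by en: the latch is only written when en is high,
    # otherwise it simply keeps its old value (behavioural model).
    return d if en else q

def equivalent(depth=6, init=0):
    """From the same initial state, all input sequences up to the bound
    must yield identical register (and hence output) values."""
    for seq in product(product((0, 1), repeat=2), repeat=depth):
        qa = qb = init
        for d, en in seq:
            qa, qb = step_ungated(qa, d, en), step_gated(qb, d, en)
            if qa != qb:
                return False
    return True

print(equivalent())   # True: gating never changes visible behaviour
```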

Example Applications: Quantifying a late design fix

 Late bug involving specific cmds on target memory node


– Fix made with backwards-compatible "disable" chicken-switch

 Wanted to validate:
– "disable" mode truly disabled fix
– Fix had no impact upon other commands, non-target nodes

 Several quick SixthSense equiv check runs performed:


– With a straightforward comparison, 192/217 outputs mismatched
– "Disabled" NEW design is equivalent to OLD
– If configured as non-target node, NEW equivalent to OLD
– If specific commands excluded (via a driver), NEW equiv to OLD
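
A sketch of the kind of constrained re-run described above, assuming a hypothetical command encoding in which the fix touches only command 2: the same comparison is reused, but a driver predicate restricts the commands over which OLD and NEW must agree.

```python
# Hypothetical memory-node model: the late fix changes the response to
# command 2 only; all other commands must be unaffected.
def respond_old(cmd):
    return {"ack": 1, "data": cmd}

def respond_new(cmd):
    return {"ack": 1, "data": 5 if cmd == 2 else cmd}   # the "fix"

def equivalent(driver=lambda cmd: True):
    """Compare OLD and NEW over all commands allowed by the driver."""
    return all(respond_old(c) == respond_new(c)
               for c in range(4) if driver(c))

print(equivalent())                         # False: outputs mismatch
print(equivalent(driver=lambda c: c != 2))  # True: fix affects cmd 2 only
```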


Example Applications: Hierarchical Design Flow

 FPU designed hierarchically
– Conventional latch-equivalent VHDL (yellow)
– Simple, abstract cycle-accurate VHDL (green)
– FPU spec (blue) is a behavioral model

[Figure: verification stack. FPU spec -> (SixthSense SEC) -> High-Level Design -> (SixthSense SEC) -> VHDL (Latch-Equivalent) -> (Verity CEC) -> Schematics.]

 Verification approach:
– First, the green box is formally verified equivalent to its spec using SixthSense (SEC)
– Next, the yellow box is verified equivalent to green, macro by macro (takes minutes)
– Finally, schematics are verified using Verity (CEC)
– FPU verification is done completely by Formal


SEC Future Directions: Hierarchical Design Flow

 Enables raising the level of abstraction (ESL)


– IBM methodology requires an RTL model that is CEC-equivalent to the circuit
• Allows for verifying self-test logic, asynchronous crossings, scan, …
– Specification of each macro precisely captured by high-level model
• Allows creativity in designing optimal circuit for the macro

 Verification can begin without having the entire design ready


– Verify the high-level macros, unit/core/chip compositions
– Verification done in parallel with circuit design; reduces the design+verification cycle

 Formal correctness eliminates risk of late design changes


– Efficient automated equivalence proof of high-level vs. circuit-accurate macros


SEC Future Directions: Sequential Optimizations

 SEC is an enabler for “safe” sequential synthesis


– E.g. retiming, addition/deletion of sequential redundancy
– Opens the door for automated (behavioral) synthesis
• Results in higher quality, more optimized designs
• Enabler for system-level design and verification

 SEC enables sequential optimizations


– Identify sequential redundancy, unreachable states…
– Validate user-specified don’t-care conditions
– Verify “global” optimizations, e.g. FSM re-encoding, clock-gating,…
– Leveraged in diverse areas such as power-gating, fencing, etc.


SEC Challenges: Scalability

 SEC has to scale to real world problems


– Large design slices, arbitrary transforms, low-level HDL spec,…
– Tighten induction to resolve miters in spec-reduced model
• TBV attempts to do just that, but further improvements welcome!
– Improved proof techniques critical to improving scalability
– Improved falsification methods to help with candidate guessing
• Helps distinguish false equivalences to converge faster

 Abstractions to reduce computational complexity


– Leverage techniques such as uninterpreted functions, blackboxing,…
– Hierarchical proof decomposition
• Bottom-up approach: blackboxes verified portions of the logic and captures constraints at the interfaces

SEC Challenges: Combined CEC and SEC

 Leverage mappings of state elements obtained from CEC


– Take advantage of the wealth of techniques to correspond latches
• Name-based, structural, functional, scan-based…
– Used as cutpoints to define a boundary between CEC and SEC
• Significantly simplifies the SEC problem via correlation hints
• Refining a cut if a false negative is obtained is a hard problem
– Automatically propagate constraints across mapped state elements

 Benefits to CEC
– Improved latch pair matching via functional analysis
• Latch-phase determination, functional correspondence,…
– Apply constraints derived from SEC to simplify problems
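
As a toy illustration of cheap latch correspondence, assuming hypothetical naming conventions: name-based normalization pairs up most latches, and only the leftover, unmatched latches need full sequential analysis.

```python
import re

def normalize(name):
    # Hypothetical normalization: strip hierarchy prefixes and common
    # synthesis suffixes so renamed-but-identical latches still match.
    name = name.split("/")[-1].lower()
    return re.sub(r"(_reg|_q|_lat)$", "", name)

def name_based_mapping(old_latches, new_latches):
    """Pair latches whose normalized names agree; everything left over
    must be handled by sequential analysis (SEC) rather than CEC."""
    old_by_name = {normalize(l): l for l in old_latches}
    new_by_name = {normalize(l): l for l in new_latches}
    mapped = {old_by_name[n]: new_by_name[n]
              for n in old_by_name.keys() & new_by_name.keys()}
    unmapped = (set(old_latches) - set(mapped)) | \
               (set(new_latches) - set(mapped.values()))
    return mapped, unmapped

old = ["core/ifu/fetch_addr_reg", "core/ifu/valid_lat", "core/ifu/cnt_reg"]
new = ["top/ifu/FETCH_ADDR_q",    "top/ifu/VALID_reg",  "top/ifu/retimed_cnt"]
print(name_based_mapping(old, new))
```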


Conclusion: Sequential Equivalence Checking

 Eliminates Risk:
– SEC is exhaustive, unlike sim regressions

 Improves design quality:


– Enables aggressive optimizations, even late in design flow

 Saves Resources:
– Obviates lengthy verification regressions

 Generalizes CEC, and improves productivity

 Opens the door to automated sequential synthesis


Conclusion: SEC at IBM

 SEC becoming part of standard methodology at IBM


– Pre-synthesis HDL-to-HDL applications
– CEC closes the gap with the combinational synthesis flow

 IBM’s SEC solution is driven by scalability across arbitrary design transforms
– Hooks for: initial values, interface constraints, “partial equivalence”…

 SixthSense: TBV-Powered SEC


– Leverage a rich set of synergistic algorithms for highly-scalable SEC


Conclusion: References/Links

 Website (lists SixthSense publications):


www.research.ibm.com/sixthsense

 Relevant Papers:
“Exploiting Suspected Redundancy without Proving it”, DAC 2005

“Scalable Sequential Equivalence Checking across Arbitrary Design Transformations”, ICCD 2006
