
Synthesis for Finite State Machines
FSM (Finite State Machine) Optimization

State tables
   |
State minimization: identify and remove equivalent states
   |
State assignment: assign a unique binary code to each state
   |
Combinational logic optimization: use unassigned state codes as don't-cares
   |
Net-list
FSM Optimization

[Figure: a four-state transition graph (S1-S4) with edges labeled by
input patterns such as 00, 01, 11, 0-, 1-, -0, -1, 10, next to the
standard FSM structure: a combinational-logic block with primary
inputs (PI) and primary outputs (PO), whose next-state lines (NS)
feed latches (v1/u1, v2/u2) that drive the present-state lines (PS)
back into the logic.]
State Minimization

Goal: identify and remove redundant states
(states whose presence cannot be observed from the
FSM I/O behavior).

Why: 1. Reduce the number of latches
        – assign minimum-length encoding
        – the number of latches grows only as the
          logarithm of the number of states
     2. Increase the number of unassigned state codes
        – a heuristic to improve state assignment
          and logic optimization
State Minimization Definition

• Completely-specified state machines
  – two states are equivalent if their outputs are
    identical for all input combinations and their
    next states are equivalent for all input
    combinations
  – equivalence of states is an equivalence relation,
    which partitions the states into disjoint
    equivalence classes
• Incompletely-specified state machines
  – compatibility of states is not an equivalence
    relation, so the partitioning argument above does
    not apply directly
Classical State Minimization

1. Partition the states based on the input/output values
   asserted in each state.
2. Refine the partitions so that all states in a
   partition transition into the same next-state
   partition (under corresponding inputs).
Example

input  present  next  output    partition refinement
0      A        B     0
1      A        C     0
0      B        D     0         (A,B,C,D,E,F,H)(G)
1      B        E     0
0      C        F     0         (A,B,C,E,F,H)(G)(D)
1      C        A     0
0      D        H     0         (A,C,E,H)(G)(D)(B,F)
1      D        G     0
0      E        B     0         (A,C,E)(G)(D)(B,F)(H)
1      E        C     0
0      F        D     0
1      F        E     0
0      G        F     1
1      G        A     0
0      H        H     0
1      H        A     0
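
The refinement above is easy to automate. Below is a minimal Python
sketch of the partition-refinement procedure, run on this 8-state
table; the function and variable names are illustrative, not from any
particular tool:

def minimize(states, inputs, next_state, output):
    # Initial partition: group states by the outputs they assert.
    def out_sig(s):
        return tuple(output[(i, s)] for i in inputs)
    blocks = {}
    for s in states:
        blocks.setdefault(out_sig(s), []).append(s)
    partition = list(blocks.values())
    # Refine until every block maps, per input, into a single block.
    changed = True
    while changed:
        changed = False
        block_of = {s: k for k, blk in enumerate(partition) for s in blk}
        refined = []
        for blk in partition:
            groups = {}
            for s in blk:
                sig = tuple(block_of[next_state[(i, s)]] for i in inputs)
                groups.setdefault(sig, []).append(s)
            refined.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = refined
    return partition

tbl = {  # (input, present): (next, output), from the table above
    ('0','A'):('B','0'), ('1','A'):('C','0'),
    ('0','B'):('D','0'), ('1','B'):('E','0'),
    ('0','C'):('F','0'), ('1','C'):('A','0'),
    ('0','D'):('H','0'), ('1','D'):('G','0'),
    ('0','E'):('B','0'), ('1','E'):('C','0'),
    ('0','F'):('D','0'), ('1','F'):('E','0'),
    ('0','G'):('F','1'), ('1','G'):('A','0'),
    ('0','H'):('H','0'), ('1','H'):('A','0'),
}
ns  = {k: v[0] for k, v in tbl.items()}
out = {k: v[1] for k, v in tbl.items()}
print(minimize('ABCDEFGH', '01', ns, out))
# [['A','C','E'], ['H'], ['B','F'], ['D'], ['G']]  (block order may vary)

This reproduces the final partition (A,C,E)(G)(D)(B,F)(H) from the
slide, reducing the machine from eight states to five.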
State Assignment

• Assign a unique code to each state to produce a
  logic-level description
  – utilize unassigned codes effectively as don't-cares
• Choices for an S-state machine
  – minimum-bit encoding: ceil(log2 S) bits
  – maximum-bit encoding: one-hot encoding,
    using one bit per state
  – something in between
• Modern techniques
  – hypercube embedding of face constraints
    derived for collections of states (Kiss, Nova)
  – adjacency embedding guided by weights
    derived between state pairs (Mustang)
Hypercube Embedding Technique

• Observation: one-hot encoding is the easiest to decode.
  Am I in state 2, 5, 12, or 17?
  binary:  x4'x3'x2'x1x0' (00010) +
           x4'x3'x2x1'x0  (00101) +
           x4'x3x2x1'x0'  (01100) +
           x4x3'x2'x1'x0  (10001)
  one-hot: x2 + x5 + x12 + x17
  But one-hot uses too many flip-flops.
• Exploit this observation:
  1. Two-level minimization after one-hot encoding
     identifies useful state groups for decoding.
  2. Assigning the states in each group to a single face
     of the hypercube allows a single product term to
     decode the group of states.
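
As a two-line Python illustration of the decode question above (the
state numbers are taken from the slide):

states = [2, 5, 12, 17]
print(' + '.join(format(s, '05b') for s in states))  # 00010 + 00101 + 01100 + 10001
print(' + '.join('x%d' % s for s in states))         # x2 + x5 + x12 + x17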
State Group Identification

Ex: state machine

input  current state  next state  output
0      start          S6          00
0      S2             S5          00
0      S3             S5          00
0      S4             S6          00
0      S5             start       10
0      S6             start       01
0      S7             S5          00
1      start          S4          01
1      S2             S3          10
1      S3             S7          10
1      S4             S6          10
1      S5             S2          00
1      S6             S2          00
1      S7             S6          00

Symbolic implicant: represents a transition from one or
more states to a next state under some input condition.
Representation of Symbolic Implicants

The symbolic cover representation is related to
multiple-valued logic.

Positional cube notation: a p-valued logic variable is
represented as p bits
(V1, V2, ..., Vp)

Ex: for 5-valued logic, the value V = 4 is written
    (00010)
    and a set of values is represented by one string:
    V = 2 or V = 4 is
    (01010)
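
A one-line Python sketch of this encoding (the helper name and the
1-based value numbering are assumptions):

def positional_cube(values, p):
    # One bit per value; bit v is 1 iff value v is in the set.
    return ''.join('1' if v in values else '0' for v in range(1, p + 1))

print(positional_cube({4}, 5))     # 00010
print(positional_cube({2, 4}, 5))  # 01010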
Minimization of Multi-valued Logic

Find a minimum multiple-valued-input cover
– espresso

Ex: a minimal multiple-valued-input cover

0 0110001 0000100 00
0 1001000 0000010 00
1 0001001 0000010 10
State Group

Consider the first symbolic implicant:

0 0110001 0000100 00

• This implicant shows that input "0" maps "state-2",
  "state-3", or "state-7" into "state-5" and asserts
  output "00".
• The effect of symbolic logic minimization is to group
  together the states that are mapped by some input into
  the same next state while asserting the same output.
• We call such a set of states a "state group". If we
  give the states in a state group adjacent binary
  encodings, and no other state's encoding lies in the
  group face, then the state group can be decoded by a
  single cube.
Group Face

• group face: the minimal-dimension subspace containing
  the encodings assigned to that group.

Ex: encodings 0010, 0100, 0110  =>  group face 0**0
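
A minimal Python sketch of computing a group face from a set of
encodings (the helper name is an assumption):

def group_face(codes):
    # A bit position is fixed if all codes agree there, else free ('*').
    return ''.join(bits[0] if len(set(bits)) == 1 else '*'
                   for bits in zip(*codes))

print(group_face(['0010', '0100', '0110']))  # 0**0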
Hyper-cube Embedding

state groups:  a = {2, 5, 12, 17}
               b = {2, 6, 17}

[Figure: two candidate 3-cube embeddings of states 2, 5, 6, 12, 17.
In the first, group a {2, 5, 12, 17} occupies a face of the cube; in
the second, marked "wrong!", its codes do not form a face.]
Hyper-cube Embedding

state groups:  a = {2, 6, 17}
               b = {2, 4, 5}

[Figure: two candidate 3-cube embeddings of states 2, 4, 5, 6, 17.
In the first, both state groups span faces of the cube; in the
second, marked "wrong!", they do not.]
How to Check if a State Assignment
Satisfies the Constraint Matrix?

Step 1: Find the group face of each encoded state group.

Step 2: For every state, check that if the state does not
belong to a state group, its code does not intersect that
group's face.
Example

Constraint matrix A, state encoding S, and group-face matrix F:

         0110001            010
    A =  1001000       S =  110
         0001001            101
                            000
                            001
                            011
                            100

Step 1: Group face matrix

                  1**
    F = A · S  =  0*0
                  *00
Step 2: Check the encoding of state-6 = [011].
Since state-6 does not belong to groups 1, 2, or 3, check:

    [011] ∩ [1**] = ∅
    [011] ∩ [0*0] = ∅
    [011] ∩ [*00] = ∅

All intersections are empty, so the encoding of state-6
satisfies the constraints.
Other State Encoding

If instead the encoding of state-6 = [111], check:

    [111] ∩ [1**] = [111]
    [111] ∩ [0*0] = ∅
    [111] ∩ [*00] = ∅

The first intersection is non-empty, so this encoding does
not satisfy the constraints.
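
The two-step check is short to write down. Below is a minimal Python
sketch (matrix rows given as 0/1 strings, reusing the group-face idea
from above; the function names are illustrative):

def face(codes):
    return [b[0] if len(set(b)) == 1 else '*' for b in zip(*codes)]

def intersects(code, f):
    # A code lies in a face iff every fixed face bit matches.
    return all(fb in ('*', cb) for cb, fb in zip(code, f))

def satisfies(A, S):
    for row in A:                                # one row per state group
        f = face([S[i] for i, m in enumerate(row) if m == '1'])
        for i, m in enumerate(row):
            if m == '0' and intersects(S[i], f):
                return False                     # outsider inside the face
    return True

A = ['0110001', '1001000', '0001001']
S = ['010', '110', '101', '000', '001', '011', '100']
print(satisfies(A, S))                           # True  (state-6 = 011)
print(satisfies(A, S[:5] + ['111'] + S[6:]))     # False (state-6 = 111)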


Algorithm for State Assignment

Step 1: Select an uncoded state (or a state subset).

Step 2: Determine the encodings for that state (or states)
satisfying the constraint relation.

Step 3: If no such encoding exists, increase the state
code dimension and go to Step 2.

Step 4: Assign an encoding to the selected state (or states).

Step 5: If all states have been encoded, stop. Else go to
Step 1.
Step 3

– We can always increase the coding length by one bit.
– New state assignment:
  1. For states already assigned, append 0 at the end.
  2. For the new state, ns:

     case 1: ns does not belong to any state group;
             encoding of ns = [c | 1],
             where c is any vector

     case 2: ns belongs to some state group;
             encoding of ns = [c | 1],
             where c is the encoding of any state
             that belongs to the state group
Example

         0101            00
    A =  1010       S =  10
         1100            01
                         11

To encode a new state (state-5), we have a new constraint
matrix:

          01011
    A' =  10100
          11000

For the states already assigned, we have a new encoding:

          000
          100
    S' =  010
          110

For the new state (state-5), we have the encodings

    ns = [101] or [111]
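
A tiny Python sketch of this lengthening step on the example above
(the variable names are illustrative):

S = ['00', '10', '01', '11']      # states 1..4, as assigned above
S_prime = [c + '0' for c in S]    # append 0: ['000','100','010','110']
group = [2, 4]                    # states sharing a group with state-5 in A'
candidates = [S[i - 1] + '1' for i in group]
print(S_prime, candidates)        # candidates for ns: ['101', '111']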
Hyper-cube Embedding Method

• Advantages:
  – uses a two-level logic minimizer to
    identify good state groups
  – almost all of the advantages of one-hot
    encoding, but with fewer state bits
Adjacency-Based State Assignment

Basic algorithm:
(1) Assign a weight w(s,t) to each pair of states
    – the weight reflects the desire to place the states
      adjacent on the hypercube
(2) Define a cost function for an assignment of codes
    to the states
    – penalize weights by the distance between the
      state codes,
      e.g. w(s,t) * distance(enc(s), enc(t))
(3) Find an assignment of codes which minimizes this
    cost function summed over all pairs of states.
    – heuristic to find an initial solution
    – pair-wise interchange (or simulated annealing)
      to improve the solution
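
For concreteness, a small Python sketch of the cost function in
step (2), with Hamming distance as distance(·,·) and made-up
weights:

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def cost(enc, w):
    # enc: state -> binary code; w: (s, t) -> weight
    return sum(wt * hamming(enc[s], enc[t]) for (s, t), wt in w.items())

w = {('S1', 'S3'): 3, ('S2', 'S4'): 2}   # hypothetical weights
print(cost({'S1': '00', 'S3': '01', 'S2': '10', 'S4': '11'}, w))  # 3*1 + 2*1 = 5
print(cost({'S1': '00', 'S3': '11', 'S2': '01', 'S4': '10'}, w))  # 3*2 + 2*2 = 10

The second encoding places each weighted pair farther apart, so it
costs more; the optimizer prefers the first.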
Adjacency-Based State Assignment

• Mustang: a weight-assignment technique based on
  loosely maximizing common cube factors
How to Assign Weights to State Pairs

• Assign weights to state pairs based on the
  ability to extract a common-cube factor if
  these two states are adjacent on the hypercube.
Fan-Out-Oriented
(examine present-state pairs)

• A present-state pair transitions to the same
  next state:

    S1 --> S2 <-- S3

    $$$ S1 S2 $$$$
    $$$ S3 S2 $$$$

  Add n to w(S1, S3) because of S2.
Fan-Out-Oriented

• A present-state pair asserts the same output:

    S1 --$/j--> S2      S3 --$/j--> S4

    $$$ S1 S2 $$$1$
    $$$ S3 S4 $$$1$

  Add 1 to w(S1, S3) because of output j.

Fanin-Oriented (examine next-state pairs)

• The same present state causes transitions to a
  next-state pair:

    S2 <-- S1 --> S4

    $$$ S1 S2 $$$$
    $$$ S1 S4 $$$$

  Add n/2 to w(S2, S4) because of S1.

Fanin-Oriented (examine next-state pairs)

• The same input causes transitions to a
  next-state pair:

    S1 --i--> S2      S3 --i--> S4

    $0$ S1 S2 $$$$
    $0$ S3 S4 $$$$

  Add 1 to w(S2, S4) because of input i.
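
A rough Python sketch of these four weight rules applied pairwise to
a symbolic state table. The row format, helper names, and treating n
as the number of state encoding bits are assumptions; the actual
Mustang tool does more careful bookkeeping:

from collections import defaultdict
from itertools import combinations

def mustang_weights(rows, n):
    # rows: (input, present, next, output) tuples; n: number of state bits.
    w = defaultdict(int)
    def bump(s, t, amount):
        if s != t and amount:
            w[tuple(sorted((s, t)))] += amount

    for (i1, p1, n1, o1), (i2, p2, n2, o2) in combinations(rows, 2):
        if n1 == n2:                 # present pair with a common next state
            bump(p1, p2, n)
        # present pair asserting the same output bit(s): 1 per shared bit
        bump(p1, p2, sum(a == b == '1' for a, b in zip(o1, o2)))
        if p1 == p2:                 # next pair reached from one present state
            bump(n1, n2, n // 2)
        if i1 == i2:                 # next pair reached under the same input
            bump(n1, n2, 1)
    return dict(w)

rows = [('0', 'S2', 'S5', '00'), ('0', 'S3', 'S5', '00')]
print(mustang_weights(rows, n=3))    # {('S2','S3'): 3}: common next state S5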


Which Method Is Better?

• FSMs with no useful two-level
  face constraints => adjacency embedding
• FSMs with many two-level
  face constraints => face embedding
