Lecture 04 Polar
Li-Wei Liu
National Chiao Tung University
Department of Electronic Engineering
Contents
1 Introduction
    1.1 Intro
        1.1.1 Channel Combining $W_N : \mathcal{X}^N \to \mathcal{Y}^N$
        1.1.2 Channel Splitting $W_N^{(i)} : \mathcal{X} \to \mathcal{Y}^N \times \mathcal{X}^{i-1}$
        1.1.3 Channel Polarization
2 Encoding
3 Polar Decoding
Chapter 1
Introduction
1.1 Intro
• Proposed by Erdal Arikan in 2008
• Provably achieves the channel capacity (capacity-achieving for any symmetric binary-input discrete memoryless channel)
Symbol Representation
In the Fig. 1.1c case ($N = 2$):
$$W_2(y_1^2 \mid u_1^2) = W(y_1 \mid u_1 \oplus u_2)\, W(y_2 \mid u_2) = W(y_1 \mid x_1)\, W(y_2 \mid x_2)$$
In the Fig. 1.1b case ($N = 4$):
$$W_4(y_1^4 \mid u_1^4) = W_2(y_1^2 \mid u_1 \oplus u_2,\, u_3 \oplus u_4)\, W_2(y_3^4 \mid u_2, u_4) = W_2(y_1^2 \mid u_{1,o}^4 \oplus u_{1,e}^4)\, W_2(y_3^4 \mid u_{1,e}^4) = W_2(y_1^2 \mid v_1^2)\, W_2(y_3^4 \mid x_3^4)$$
In general, the channel-combining recursion is
$$W_N(y_1^N \mid u_1^N) = W_{N/2}\bigl(y_1^{N/2} \mid u_{1,o}^N \oplus u_{1,e}^N\bigr) \times W_{N/2}\bigl(y_{N/2+1}^N \mid u_{1,e}^N\bigr)$$
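This recursion can be evaluated directly. Below is a minimal Python sketch (the function names and the BSC example are my own, not from the lecture) that computes $W_N(y_1^N \mid u_1^N)$ from a base channel $W(y \mid x)$:

```python
# Minimal sketch: evaluate the channel-combining recursion
# W_N(y | u) = W_{N/2}(y_first | u_odd XOR u_even) * W_{N/2}(y_second | u_even)
# for a base B-DMC given as a function W(y, x).

def combined_channel(W, y, u):
    """Return W_N(y_1^N | u_1^N) for N = len(u), a power of two."""
    N = len(u)
    if N == 1:
        return W(y[0], u[0])
    u_odd = u[0::2]                                # u_{1,o}^N (1-based odd indices)
    u_even = u[1::2]                               # u_{1,e}^N (1-based even indices)
    v = [a ^ b for a, b in zip(u_odd, u_even)]     # u_{1,o}^N XOR u_{1,e}^N
    return (combined_channel(W, y[:N // 2], v) *
            combined_channel(W, y[N // 2:], u_even))

# Example base channel: BSC with crossover probability 0.1 (illustrative only).
def bsc(y, x, p=0.1):
    return 1 - p if y == x else p

print(combined_channel(bsc, [0, 1, 1, 0], [1, 0, 1, 1]))   # W_4(y_1^4 | u_1^4)
```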
Channel Capacity of $W_N$
$$C(W_N) = I(U_1^N; Y_1^N) = \sum_{i=1}^{N} I(u_i; y_1^N \mid u_1^{i-1}) = \sum_{i=1}^{N} C\bigl(W_N^{(i)}\bigr)$$
The split channels are defined by
$$W_N^{(i)}(y_1^N, u_1^{i-1} \mid u_i) = \sum_{u_{i+1}^N} W_N(y_1^N, u_1^{i-1}, u_{i+1}^N \mid u_i) = \sum_{u_{i+1}^N} W_N(y_1^N \mid u_1^N)\, P(u_1^{i-1}, u_{i+1}^N) = \sum_{u_{i+1}^N} \frac{1}{2^{N-1}}\, W_N(y_1^N \mid u_1^N)$$
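For instance, for $N = 2$ and the combining of Fig. 1.1c, the definition specializes to the two standard split channels:
$$W_2^{(1)}(y_1^2 \mid u_1) = \sum_{u_2} \tfrac{1}{2}\, W(y_1 \mid u_1 \oplus u_2)\, W(y_2 \mid u_2), \qquad W_2^{(2)}(y_1^2, u_1 \mid u_2) = \tfrac{1}{2}\, W(y_1 \mid u_1 \oplus u_2)\, W(y_2 \mid u_2).$$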
Odd channel:
$$W_{2N}^{(2i-1)}(y_1^{2N}, u_1^{2i-2} \mid u_{2i-1}) = \sum_{u_{2i}} \frac{1}{2}\, W_N^{(i)}\bigl(y_1^N,\, u_{1,o}^{2i-2} \oplus u_{1,e}^{2i-2} \mid u_{2i-1} \oplus u_{2i}\bigr)\, W_N^{(i)}\bigl(y_{N+1}^{2N},\, u_{1,e}^{2i-2} \mid u_{2i}\bigr)$$
Even channel:
$$W_{2N}^{(2i)}(y_1^{2N}, u_1^{2i-1} \mid u_{2i}) = \frac{1}{2}\, W_N^{(i)}\bigl(y_1^N,\, u_{1,o}^{2i-2} \oplus u_{1,e}^{2i-2} \mid u_{2i-1} \oplus u_{2i}\bigr)\, W_N^{(i)}\bigl(y_{N+1}^{2N},\, u_{1,e}^{2i-2} \mid u_{2i}\bigr)$$
$$I(W) = \sum_{y \in \mathcal{Y}} \sum_{x \in \mathcal{X}} \frac{1}{2}\, W(y \mid x)\, \log_2 \frac{W(y \mid x)}{\frac{1}{2} W(y \mid 0) + \frac{1}{2} W(y \mid 1)}$$
$$I(W) \ge \log_2 \frac{2}{1 + Z(W)}$$
$$I(W) \le \sqrt{1 - Z(W)^2}$$
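As a quick check, the sketch below (my own; it assumes the standard definition $Z(W) = \sum_{y} \sqrt{W(y \mid 0)\, W(y \mid 1)}$ and a BSC example) computes $I(W)$ and $Z(W)$ from a transition table and verifies the two bounds:

```python
# Minimal sketch: symmetric capacity I(W) and Bhattacharyya parameter Z(W)
# from a transition table W[x][y], then check the two bounds above.
from math import log2, sqrt

def capacity_and_bhattacharyya(W):
    ys = W[0].keys()
    I = sum(0.5 * W[x][y] * log2(W[x][y] / (0.5 * W[0][y] + 0.5 * W[1][y]))
            for x in (0, 1) for y in ys if W[x][y] > 0)
    Z = sum(sqrt(W[0][y] * W[1][y]) for y in ys)
    return I, Z

# Example: BSC with crossover probability 0.1 (illustrative only).
p = 0.1
W = {0: {0: 1 - p, 1: p}, 1: {0: p, 1: 1 - p}}
I, Z = capacity_and_bhattacharyya(W)
assert log2(2 / (1 + Z)) <= I <= sqrt(1 - Z ** 2)
```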
Symmetric Capacity
For a binary erasure channel (BEC), the symmetric capacities of the split channels satisfy
$$I\bigl(W_N^{(2i-1)}\bigr) = I\bigl(W_{N/2}^{(i)}\bigr)^2 \qquad (1.1)$$
$$I\bigl(W_N^{(2i)}\bigr) = 2\, I\bigl(W_{N/2}^{(i)}\bigr) - I\bigl(W_{N/2}^{(i)}\bigr)^2 \qquad (1.2)$$
with $I\bigl(W_1^{(1)}\bigr) = 1 - \epsilon$.
These closed-form recursions are only valid for BECs; there is no exact algorithm for a general B-DMC $W$.
Bhattacharyya Parameter
$$Z\bigl(W_N^{(2i-1)}\bigr) = 2\, Z\bigl(W_{N/2}^{(i)}\bigr) - Z\bigl(W_{N/2}^{(i)}\bigr)^2 \qquad (1.3)$$
$$Z\bigl(W_N^{(2i)}\bigr) = Z\bigl(W_{N/2}^{(i)}\bigr)^2 \qquad (1.4)$$
with $Z\bigl(W_1^{(1)}\bigr) = \epsilon$.
These equalities are again only valid for BECs; there is no exact algorithm for a general B-DMC $W$.
The higher $Z(W)$, the higher the error probability → bad channel → frozen set $A^c$.
The lower $Z(W)$, the lower the error probability → good channel → information set $A$.
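For a BEC these recursions give a direct channel-selection procedure. A minimal Python sketch (the function names are my own) evolves $Z\bigl(W_N^{(i)}\bigr)$ by (1.3)–(1.4) and picks the $K$ indices with the smallest $Z$ as the information set $A$:

```python
# Minimal sketch: evolve Z(W_N^{(i)}) for a BEC by recursions (1.3)-(1.4),
# then choose the K most reliable indices as the information set A.

def bec_bhattacharyya(n, eps):
    """Return [Z(W_N^{(1)}), ..., Z(W_N^{(N)})] for N = 2^n, starting from Z = eps."""
    z = [eps]
    for _ in range(n):
        nxt = []
        for zi in z:
            nxt.append(2 * zi - zi * zi)   # odd index (1.3): degraded channel
            nxt.append(zi * zi)            # even index (1.4): upgraded channel
        z = nxt
    return z

def information_set(n, eps, K):
    z = bec_bhattacharyya(n, eps)
    return sorted(sorted(range(len(z)), key=lambda i: z[i])[:K])

# Example: N = 8, eps = 0.5, rate 1/2 -> information set (0-indexed) [3, 5, 6, 7]
print(information_set(3, 0.5, 4))
```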
Chapter 2
Encoding
2.1 Encoding
$$G_N = (I_{N/2} \otimes F)\, R_N\, (I_2 \otimes G_{N/2})$$
• $I_{N/2} \otimes F$: pairwise combining $U_{1,o} \oplus U_{1,e}$ (mod-2 addition of the odd and even subsequences)
• $R_N$: "even down, odd up" permutation (reverse shuffle)
• $I_2 \otimes G_{N/2}$: concatenation with two copies of the smaller generator $G_{N/2}$
Choose the indices whose $Z\bigl(W_N^{(i)}\bigr)$ is near 0 as the information set used to transmit data; the remaining indices form the frozen set.
A block of length $N$ is encoded as $x_1^N = u_1^N G_N$, where $N = 2^n$ for some $n \ge 0$.
$$(AC) \otimes (BD) = (A \otimes B)(C \otimes D) \qquad (2.1)$$
$$(I_{N/2} \otimes F)\, R_N = R_N\, (F \otimes I_{N/2}) \qquad (2.2)$$
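Identity (2.2) is easy to verify numerically. The sketch below is my own sanity check; it assumes $R_N$ acts on a row vector by moving the odd-indexed entries (1-based) to the first half and the even-indexed entries to the second half, i.e. the "even down, odd up" reverse shuffle:

```python
# Sanity check of (2.2): (I_{N/2} (x) F) R_N == R_N (F (x) I_{N/2}).
import numpy as np

N = 8
F = np.array([[1, 0], [1, 1]])

# Permutation matrix R_N such that (v @ R)[j] = v[perm[j]] (odds first, evens second).
perm = list(range(0, N, 2)) + list(range(1, N, 2))
R = np.zeros((N, N), dtype=int)
for j, src in enumerate(perm):
    R[src, j] = 1

lhs = np.kron(np.eye(N // 2, dtype=int), F) @ R
rhs = R @ np.kron(F, np.eye(N // 2, dtype=int))
assert (lhs == rhs).all()
```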
Applying these identities recursively:
$$G_N = \bigl\{(I_{N/2} \otimes F)\, R_N\bigr\}(I_2 \otimes G_{N/2}) = \ldots$$
Define $B_N = R_N\, (I_2 \otimes R_{N/2})(I_4 \otimes R_{N/4}) \cdots (I_{N/2} \otimes R_2)$; then
$$G_N = B_N F^{\otimes n}$$
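The factorization $G_N = B_N F^{\otimes n}$ suggests an $O(N \log N)$ encoder: apply the bit-reversal permutation $B_N$, then $n$ stages of butterflies $F = [[1,0],[1,1]]$ over GF(2). The Python sketch below is my own illustration of that structure (frozen-bit placement is assumed to be done before the call):

```python
# Minimal sketch of x_1^N = u_1^N G_N with G_N = B_N F^{(x)n}:
# bit-reversal permutation followed by n butterfly stages over GF(2).

def bit_reversal_permute(u):
    n = len(u).bit_length() - 1
    return [u[int(format(i, f'0{n}b')[::-1], 2)] for i in range(len(u))]

def polar_encode(u):
    """Encode one block; len(u) must be a power of two, frozen bits already set."""
    x = bit_reversal_permute(u)                  # apply B_N
    step = 1
    while step < len(x):                         # n = log2(N) butterfly stages
        for i in range(0, len(x), 2 * step):
            for j in range(i, i + step):
                x[j] ^= x[j + step]              # (a, b) -> (a XOR b, b)
        step *= 2
    return x

# Example: N = 8, u with zeros in the (assumed) frozen positions.
print(polar_encode([0, 0, 0, 1, 0, 1, 1, 0]))
```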
Chapter 3
Polar Decoding
1. Successive Cancellation (SC) Decoding
2. Belief Propagation (BP) Decoding
Successive Cancellation Decoding
• $y_1^N$: channel output (the decoder's input)
$$l_N^{(i)}(y_1^N, \hat{u}_1^{i-1}) = \frac{W_N^{(i)}(y_1^N, \hat{u}_1^{i-1} \mid 0)}{W_N^{(i)}(y_1^N, \hat{u}_1^{i-1} \mid 1)}$$
Decision:
$$\hat{u}_i = \begin{cases} 0, & \text{if } l_N^{(i)}(y_1^N, \hat{u}_1^{i-1}) \ge 1, \\ 1, & \text{otherwise.} \end{cases}$$
SC Decoding
• Once a wrong bit decision is made, there is no chance to correct it in the subsequent decoding steps.
Solution: store the partial sequences in a size-L list; at each bit estimation, fork every candidate into both possible values and compute the probability of each resulting sequence.
1. Initialization: set the null sequence as the only candidate in the initial list and set its probability to 1: $S_0 = \{\varnothing\}$, $P(\varnothing) = 1$.
2. Bit Estimation:
   (a) Expansion: $S_i = \{\hat{u}_1^i \mid \hat{u}_1^{i-1} \in S_{i-1},\, \hat{u}_i \in \{0, 1\}\}$, where for $b \in \{0, 1\}$
$$P(\hat{u}_i = b \mid u_1^{i-1} = \hat{u}_1^{i-1}) = \begin{cases} \dfrac{W_N^{(i)}(y_1^N, \hat{u}_1^{i-1} \mid \hat{u}_i = b)}{W_N^{(i)}(y_1^N, \hat{u}_1^{i-1} \mid \hat{u}_i = 0) + W_N^{(i)}(y_1^N, \hat{u}_1^{i-1} \mid \hat{u}_i = 1)}, & \text{if } \hat{u}_i \text{ is an information bit}, \\ 1 \text{ when } b = 0, & \text{if } \hat{u}_i \text{ is a frozen bit}. \end{cases}$$
   (b) Competition: if the number of candidates exceeds $L$, keep the $L$ candidates with the largest probabilities.
3. Sequence Determination: after all $N$ bits are decoded, re-encode every candidate in the list, compute its likelihood, and select the most likely one.
Without the competition step the number of candidate sequences would grow exponentially; in practice the list size $L$ is therefore kept small.
With CRC-aided list decoding, after the $N$ bits are decoded the paths in the list are examined one by one in order of decreasing reliability ($W_N(y_1^N, u_1^N)$). The decoder outputs the first candidate that passes the CRC check; if no candidate passes, a decoding failure is declared. A sketch of these steps follows.
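The sketch below (my own; the helpers `bit_metric(y, prefix, b)`, returning $W_N^{(i)}(y_1^N, \hat{u}_1^{i-1} \mid b)$, and `crc_ok` are assumed and not implemented here) shows the expansion, competition, and CRC-selection steps of list decoding:

```python
# Minimal sketch of CRC-aided successive cancellation list (SCL) decoding.
import heapq

def scl_decode(y, N, frozen, L, bit_metric, crc_ok):
    candidates = [((), 1.0)]                           # S_0 = {null}, P = 1
    for i in range(N):
        expanded = []
        for prefix, p in candidates:
            if i in frozen:
                expanded.append((prefix + (0,), p))    # frozen bit: value known, P = 1
            else:
                m0 = bit_metric(y, prefix, 0)
                m1 = bit_metric(y, prefix, 1)
                for b, m in ((0, m0), (1, m1)):        # fork on both hypotheses
                    expanded.append((prefix + (b,), p * m / (m0 + m1)))
        # competition: keep the L most probable partial sequences
        candidates = heapq.nlargest(L, expanded, key=lambda c: c[1])
    # examine paths in decreasing reliability, output the first CRC pass
    for u_hat, _ in sorted(candidates, key=lambda c: -c[1]):
        if crc_ok(u_hat):
            return u_hat
    return None                                        # decoding failure
```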
Belief Propagation
Node $(i, j)$: the $i$-th level counted from left to right, the $j$-th node in the level counted from top to bottom.
$$L_{n+1, j} = \log \frac{W(y_j \mid x_j = 0)}{W(y_j \mid x_j = 1)}, \qquad 1 \le j \le N$$
f-function: used as the check-node update.
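A minimal Python sketch of this check-node update, together with the companion variable-node style update used on the same factor graph (the function names and the min-sum variant are my own additions, not from the lecture):

```python
# Minimal sketch: LLR update rules on the polar code factor graph.
from math import tanh, atanh, copysign

def f_check(l1, l2):
    """Check-node (f-function) update: LLR of the XOR of the two incoming bits."""
    return 2 * atanh(tanh(l1 / 2) * tanh(l2 / 2))

def f_check_minsum(l1, l2):
    """Common hardware-friendly approximation of f_check."""
    return copysign(min(abs(l1), abs(l2)), l1 * l2)

def g_var(l1, l2, u1):
    """Companion (g-function) update given the decided/partial-sum bit u1."""
    return l2 + (1 - 2 * u1) * l1
```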
Generally, BP performs slightly better than successive cancellation decoding.