
Module 1: AI for Channel Decoding

• You will learn:
• How to use Colab
• Basics of channel coding
• Support vector machines
• Deep learning

• Grading:
• Syndrome decoding, ML decoding: 30%
• Classification with support vector machines: 30%
• Deep learning: 30%
• Report (no more than 10 pages): 10%

Error Correction Codes

Terminology
• Message: Sequence of bits representing your data (e.g., the link to a website)
• Codeword: Sequence of bits forming your QR code

• Example (the last three bits of the codeword are added for error correction):


• Message = 1 1 0 1
• Codeword = 1 1 0 1 0 1 0

It is everywhere!!! Even in string theory!!!

Sylvester James Gates, “Symbols of Power,” Physics World, vol. 23, no. 6, June 2010.
A digital communication system over a wireless channel
[Block diagram]
Transmitter: Binary source → Channel encoder → Modulation → Up-converter → wireless channel
Receiver: wireless channel → Down-converter → Preprocessing → Demodulation → Channel decoder → Decoded source

• Channel encoding/decoding enables error correction by adding redundancy

How does it work? How do we create those bits?


Index:    1 2 3 4 5 6 7
Codeword: 1 1 0 1 1 0 0

The last 3 bits are added such that inside every circle the number of 1s is even.

[Venn diagram: three overlapping circles over the seven bit positions, showing bit values 1, 1, 0, 1, 1, 0, 0 for positions 1–7]

Suppose 1 bit is erased; we can fix it by checking the parity of each circle.
Index:    1 2 3 4 5 6 7
Codeword: 0 1 0 1 ? ? ?

It's your turn: fill in the last 3 bits so that every circle contains an even number of 1s.

[Venn diagram: the same three circles with message bits 0, 1, 0, 1 in positions 1–4; the figure's filled-in parity bits are 0, 1, 0 for positions 5–7]

What if 2 bits are erased?
Index:    1 2 3 4 5 6 7
Codeword: 1 1 0 1 1 0 0

It can also correct 1 bit flip.

[Venn diagram: the same three circles with one bit flipped; the circles with odd parity pinpoint the flipped position]

Suppose 1 bit is flipped; we can fix it by flipping 1 bit to meet all constraints.
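The circle check is just a parity-check computation, so it is easy to script. Below is a minimal Python sketch; the circle membership (bit 5 checks bits 1, 2, 4; bit 6 checks bits 1, 2, 3; bit 7 checks bits 2, 3, 4) is reconstructed from the examples above, so treat H as an assumption rather than the exact figure.

```python
import numpy as np

# Parity-check matrix of the three circles (reconstructed from the examples
# above, so treat it as an assumption): circle 1 = bits {1,2,4,5},
# circle 2 = bits {1,2,3,6}, circle 3 = bits {2,3,4,7}.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 1, 1, 0, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def correct_single_flip(r):
    """Fix at most one flipped bit by checking the parity of each circle."""
    r = r.copy()
    syndrome = H @ r % 2                 # parity of each circle (1 = odd)
    if syndrome.any():
        # the flipped position is the unique column of H equal to the syndrome
        pos = np.flatnonzero((H.T == syndrome).all(axis=1))[0]
        r[pos] ^= 1
    return r

c = np.array([1, 1, 0, 1, 1, 0, 0])      # valid codeword from the slide
r = c.copy(); r[2] ^= 1                   # flip bit 3
print(correct_single_flip(r))             # -> [1 1 0 1 1 0 0]
```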
(n,k)-Linear Block Codes
• $k$-bit message $\mathbf{m}$, $n$-bit codeword $\mathbf{c}$
• Relationship: $\mathbf{c} = \mathbf{m}\mathbf{G}$ (arithmetic over GF(2))

• The code $C$ contains all ($2^k$ in total) such codewords

• $C$ is the row space of $\mathbf{G}$ ($k \times n$)
• Call it the generator matrix
• There exists $\mathbf{H}$ ($(n-k) \times n$) such that $\mathbf{c}\mathbf{H}^T = \mathbf{H}\mathbf{c}^T = \mathbf{0}$
• Rows of $\mathbf{H}$ span the null space of $\mathbf{G}$
• Call it the parity-check matrix

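For a concrete miniature, the $(3,1)$ repetition code has $\mathbf{G} = [1 \; 1 \; 1]$ and $\mathbf{H} = \begin{bmatrix} 1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix}$: the two codewords $000$ and $111$ both satisfy $\mathbf{c}\mathbf{H}^T = \mathbf{0}$ over GF(2).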
(7,4)-Hamming Codes
• 4-bit message $\mathbf{m}$, 7-bit codeword $\mathbf{c}$
• Relationship: $\mathbf{c} = \mathbf{m}\mathbf{G}$

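A generator matrix consistent with the circle examples above (parity bit 5 checks message bits 1, 2, 4; bit 6 checks 1, 2, 3; bit 7 checks 2, 3, 4), together with the matching parity-check matrix; the specific assignment is reconstructed from those examples rather than taken verbatim from the slide:

$$
\mathbf{G} = \begin{bmatrix} 1 & 0 & 0 & 0 & 1 & 1 & 0 \\ 0 & 1 & 0 & 0 & 1 & 1 & 1 \\ 0 & 0 & 1 & 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 1 & 1 & 0 & 1 \end{bmatrix}, \qquad
\mathbf{H} = \begin{bmatrix} 1 & 1 & 0 & 1 & 1 & 0 & 0 \\ 1 & 1 & 1 & 0 & 0 & 1 & 0 \\ 0 & 1 & 1 & 1 & 0 & 0 & 1 \end{bmatrix}
$$

With this $\mathbf{G}$, the message $1101$ encodes to $1101100$, matching the circle example, and $\mathbf{G}\mathbf{H}^T = \mathbf{0}$ over GF(2).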
System Model
• At the transmitter, we map coded bits to a baseband signal
• Binary phase shift keying (BPSK) for sending the bit $c_i$:

$$x_i = \sqrt{P}\,(2c_i - 1), \qquad \mathbf{x} = [x_1, \dots, x_n]$$

• Additive white Gaussian noise (AWGN) channel:

$$y_i = x_i + w_i, \quad w_i \sim \mathcal{N}(0, N_0/2), \qquad \mathbf{y} = [y_1, \dots, y_n]$$

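A minimal numpy sketch of this system model; the function name and default parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def bpsk_awgn(c, P=1.0, N0=1.0):
    """Map coded bits {0,1} to BPSK symbols and add white Gaussian noise."""
    x = np.sqrt(P) * (2 * c - 1)                        # bits -> {-sqrt(P), +sqrt(P)}
    w = rng.normal(0.0, np.sqrt(N0 / 2), size=c.shape)  # noise variance N0/2
    return x + w

c = np.array([1, 1, 0, 1, 1, 0, 0])
y = bpsk_awgn(c, P=1.0, N0=0.5)          # noisy channel output
```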
BER in Uncoded System
• Detection of $x_i$ from $y_i$:

$$\hat{x}_i = \mathrm{sign}(y_i) \quad \text{and} \quad \hat{r}_i = (\hat{x}_i + 1)/2$$

• Error if $\hat{x}_i \neq x_i$

• Bit error rate (BER): $p_e = \frac{1}{n}\sum_{i=1}^{n} \mathbf{1}(\hat{x}_i \neq x_i)$

• Plot BER as a function of $E_b/N_0$, where $E_b$ is the energy per bit

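A Monte Carlo sketch of this uncoded BER curve; the estimate should track the closed-form BPSK result $Q(\sqrt{2E_b/N_0})$, and the function name is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def uncoded_ber(eb_n0_db, n_bits=1_000_000):
    """Monte Carlo estimate of uncoded BPSK BER at a given Eb/N0 in dB."""
    eb_n0 = 10 ** (eb_n0_db / 10)
    # With Eb fixed to 1, N0 = 1 / (Eb/N0) and the noise variance is N0/2.
    bits = rng.integers(0, 2, n_bits)
    y = (2 * bits - 1) + rng.normal(0.0, np.sqrt(1 / (2 * eb_n0)), n_bits)
    bits_hat = (np.sign(y) + 1) / 2      # hard decision
    return np.mean(bits_hat != bits)

for snr_db in range(0, 9, 2):
    print(snr_db, "dB:", uncoded_ber(snr_db))
```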
Maximum Likelihood Decoding
MLD: $\hat{\mathbf{c}} = \arg\min_{\mathbf{c} \in C} \|\mathbf{y} - \mathbf{x}(\mathbf{c})\|^2$, where $\mathbf{x}(\mathbf{c})$ is the BPSK mapping of $\mathbf{c}$

• This is optimal in the AWGN channel with equiprobable inputs

• Complexity is very high, especially when $k$ is large
• Need to check $2^k$ codewords

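A brute-force MLD sketch in Python that precomputes the full codebook; G is the (7,4) generator reconstructed earlier and ml_decode is an illustrative name:

```python
import numpy as np
from itertools import product

# Reconstructed (7,4) generator matrix from the earlier slide (an assumption).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 1, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 0, 1]])

# Enumerate all 2^k codewords and their BPSK images once, up front.
msgs = np.array(list(product([0, 1], repeat=G.shape[0])))
codebook = msgs @ G % 2
X = 2 * codebook - 1                     # BPSK image of every codeword (P = 1)

def ml_decode(y):
    """Brute-force MLD: the codeword whose BPSK image is closest to y."""
    return codebook[np.argmin(np.sum((y - X) ** 2, axis=1))]
```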
Syndrome Decoding
• First make a hard decision:

$$\hat{x}_i = \mathrm{sign}(y_i) \quad \text{and} \quad \hat{r}_i = (\hat{x}_i + 1)/2$$

• Construct the standard array
• Compute the syndrome $\mathbf{s} = \hat{\mathbf{r}}\mathbf{H}^T$
• Decide $\hat{\mathbf{e}} = \mathrm{coset\ leader}(\mathbf{s})$
• Decode to $\hat{\mathbf{c}} = \hat{\mathbf{r}} + \hat{\mathbf{e}}$

Standard Array

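A sketch of how the first column of the standard array (the coset leaders) can be built by enumerating error patterns in order of weight, and then used for syndrome decoding; H is the reconstructed (7,4) parity-check matrix and the helper names are illustrative:

```python
import numpy as np
from itertools import product

# Reconstructed (7,4) parity-check matrix (an assumption).
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 1, 1, 0, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
n = H.shape[1]

# First column of the standard array: for each syndrome, keep the first
# (i.e., minimum-weight) error pattern that produces it. Enumerating all
# 2^n patterns is feasible only for small n.
leaders = {}
for e in sorted(product([0, 1], repeat=n), key=sum):
    s = tuple(np.array(e) @ H.T % 2)
    leaders.setdefault(s, np.array(e))

def syndrome_decode(r):
    """Hard-decision syndrome decoding: c_hat = r + coset_leader(s)."""
    s = tuple(r @ H.T % 2)
    return (r + leaders[s]) % 2

r = np.array([1, 1, 1, 1, 1, 0, 0])      # codeword 1101100 with bit 3 flipped
print(syndrome_decode(r))                 # -> [1 1 0 1 1 0 0]
```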
Decoding via Learning
• Decoding is nothing but classification

• An (n,k)-linear block code has $2^k$ classes

• We can generate a lot of training data $(\mathbf{y}, \mathbf{c})$

• This is supervised, batch, passive, and statistical learning

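A minimal sketch of generating labeled pairs $(\mathbf{y}, \mathbf{c})$ for a learned decoder, reusing the reconstructed (7,4) generator; make_dataset and its defaults are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Reconstructed (7,4) generator matrix (an assumption).
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 1, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 0, 1]])

def make_dataset(n_samples, eb_n0_db=4.0):
    """Random messages -> codewords -> BPSK -> AWGN, with class-index labels."""
    k, n = G.shape
    eb_n0 = 10 ** (eb_n0_db / 10)
    sigma = np.sqrt(n / (2 * k * eb_n0))          # noise std, accounting for rate k/n
    m = rng.integers(0, 2, size=(n_samples, k))
    c = (m @ G) % 2
    y = (2 * c - 1) + sigma * rng.normal(size=(n_samples, n))
    labels = m @ (1 << np.arange(k - 1, -1, -1))  # message bits -> integer class id
    return y, labels

y_train, labels_train = make_dataset(100_000)     # one (y, class) pair per row
```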
