
Melissa Frances Balmes Chariz Sandra Ramos


**Sequential Decoding**

- Introduction to Sequential Decoding
- Decoding Process
- Example of Sequential Decoding

**Soft-decision Decoding**

- Introduction to Soft-decision Decoding
- Hard-decision vs. Soft-decision
- Decoding Process
- Example of Soft-decision Decoding for Convolutional Codes

**Example of Convolutional Codes with Sequential Decoding and Soft Decision**

Introduction

• Recall… Convolutional code parameters (n, k, m):
  - n = number of output bits
  - k = number of input bits
  - m = number of memory registers
  - rate: r = k / n
• Idea behind decoding: a code fed L input bits can produce 2^L possible output sequences; the decoder's task is to determine which sequence was actually sent.

Introduction…

• How do we decode a received sequence such as 111100? By bit agreement: compare the received sequence with all 8 possible code sequences and choose the one with the smallest Hamming distance (fewest bit disagreements).
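This brute-force comparison can be sketched in a few lines of Python. The candidate list below is purely hypothetical, standing in for the 8 possible code sequences of whatever encoder produced them:

```python
def hamming(a: str, b: str) -> int:
    # number of positions in which the two words disagree
    return sum(x != y for x, y in zip(a, b))

# hypothetical stand-ins for the 8 possible 6-bit code sequences
candidates = ["000000", "001011", "010110", "011101",
              "100111", "101100", "110001", "111010"]

received = "111100"
# choose the candidate with the fewest bit disagreements
best = min(candidates, key=lambda c: hamming(c, received))
```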

• BUT!!!
  - this type of decoding shows ambiguity
  - as the number of input bits increases, the number of possible code sequences grows exponentially = longer calculations!!

Sequential Decoding

Introduction

• One of the first methods, proposed by Wozencraft, to decode a convolutionally coded bit stream
• The purpose is to search the nodes of the trellis diagram efficiently in an attempt to find the maximum-likelihood path
• The maximum-likelihood path depends on a measure of "closeness" of a path to the received sequence

Properties

• Allows both forward and backward movement through the trellis
• The decoder keeps track of its decisions; each time it makes an ambiguous decision, it tallies it. If the tally increases faster than some threshold value, the decoder gives up that path and retraces it back to the last fork where the tally was below the threshold.

Steps to decoding:

• Choose the all-zero state as the initial node
• Compare the outputs of the two paths leaving the node with the received sequence:
  – If one path produces the same output as the received sequence, choose that path.
  – If neither path's output is the same as the received sequence, tally the error.


• Continue the process of comparing and choosing paths until the end of the received sequence
• NOTE: the error tally must not exceed the threshold value!!
  - If a path's tally of errors exceeds the threshold value, GO BACK to where the tally was less than the threshold and find another path.

Example Problem

• The code is (2,1,4), for which the encoder diagram was presented in the previous report. Assume that the bit sequence 1011 000 was sent. (Remember: the last 3 bits are flush bits; their output is called the tail bits.)
• If no errors occurred, we would receive: 11 11 01 11 01 01 11
• But let's say we received instead: 01 11 01 11 01 01 11. One error has occurred: the first bit has been received incorrectly.
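The error-free sequence can be reproduced with a short encoder sketch. The generator taps g1 = 1111 and g2 = 1101 are inferred from the state table below rather than taken from the previous report, so treat them as an assumption:

```python
def encode(bits, m=3):
    # Sketch of a (2,1,4) encoder with assumed taps g1 = 1111, g2 = 1101;
    # m flush (zero) bits are appended to return the registers to all-zero.
    s1 = s2 = s3 = 0
    out = []
    for b in bits + [0] * m:
        out.append((b ^ s1 ^ s2 ^ s3, b ^ s1 ^ s3))  # (O1, O2)
        s1, s2, s3 = b, s1, s2                       # shift the registers
    return out

codeword = encode([1, 0, 1, 1])
# -> [(1,1), (1,1), (0,1), (1,1), (0,1), (0,1), (1,1)], i.e. 11 11 01 11 01 01 11
```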

| Input bit | Input state (S1 S2 S3) | Output (O1 O2) | Output state (S1 S2 S3) |
|---|---|---|---|
| 0 | 000 | 00 | 000 |
| 1 | 000 | 11 | 100 |
| 0 | 001 | 11 | 000 |
| 1 | 001 | 00 | 100 |
| 0 | 010 | 10 | 001 |
| 1 | 010 | 01 | 101 |
| 0 | 011 | 01 | 001 |
| 1 | 011 | 10 | 101 |
| 0 | 100 | 11 | 010 |
| 1 | 100 | 00 | 110 |
| 0 | 101 | 00 | 010 |
| 1 | 101 | 11 | 110 |
| 0 | 110 | 01 | 011 |
| 1 | 110 | 10 | 111 |
| 0 | 111 | 10 | 011 |
| 1 | 111 | 01 | 111 |


Comparing with the encoded trellis…
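The decoding steps can be sketched as a depth-first trellis search with an error tally and a backtracking threshold. This is a simplified sketch, not Wozencraft's algorithm itself, and it assumes the same inferred taps (g1 = 1111, g2 = 1101) for the (2,1,4) encoder:

```python
def step(state, bit):
    # one encoder transition: returns (output pair, next state)
    s1, s2, s3 = state
    return (bit ^ s1 ^ s2 ^ s3, bit ^ s1 ^ s3), (bit, s1, s2)

def sequential_decode(received, threshold):
    # Follow a branch while the error tally stays at or below the threshold;
    # when every branch from a node exceeds it, retrace to the last fork.
    def dfs(state, depth, tally, bits):
        if depth == len(received):
            return bits
        for b in (0, 1):
            out, nxt = step(state, b)
            t = tally + sum(o != r for o, r in zip(out, received[depth]))
            if t <= threshold:
                found = dfs(nxt, depth + 1, t, bits + [b])
                if found is not None:
                    return found
        return None          # give up this path, back to the previous fork
    return dfs((0, 0, 0), 0, 0, [])

# received 01 11 01 11 01 01 11 -- one bit error in the first pair
rx = [(0, 1), (1, 1), (0, 1), (1, 1), (0, 1), (0, 1), (1, 1)]
decoded = sequential_decode(rx, threshold=1)
# -> [1, 0, 1, 1, 0, 0, 0]: the message 1011 plus the three flush bits
```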

Soft Decision Decoding

Channel Model

y(t) = x(t) + n(t)

where:
- y(t) = received sequence
- x(t) = encoded sequence
- n(t) = noise


**Hard Decision vs. Soft Decision**

Hard Decision

• Find a codeword that minimizes the Hamming distance between the received symbol sequence and the codeword

Soft Decision

• Find a codeword that minimizes the Euclidean distance between the received signal and the modulated signal corresponding to the codeword

Given 2 points P = (p1, p2, …, pn) and Q = (q1, q2, …, qn), the Euclidean distance is

d(P, Q) = √[(p1 − q1)² + (p2 − q2)² + … + (pn − qn)²]

Hamming distance – the number of positions in which two words differ
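A minimal sketch of the two distance measures:

```python
import math

def hamming(a, b):
    # count of positions in which two equal-length words differ
    return sum(x != y for x, y in zip(a, b))

def euclidean(p, q):
    # square root of the summed squared coordinate differences
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

# hamming("101", "111") -> 1
# euclidean((0, 0), (3, 4)) -> 5.0
```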

**Example of Soft-Decision Decoding**

A transmitted codevector c = (101), which is (1 −1 1) in polar format, is sent over an additive Gaussian noise channel. The received vector is r = (−0.1072, −1.0504, 1.6875).


Calculate the Euclidean distance to each of the possible codevectors: c0 = (−1 −1 −1), c1 = (1 1 1), c2 = (1 −1 1), and c3 = (1 1 −1).

A soft-decision decoder decides in favor of the codevector with the smallest Euclidean distance, here d(r, c2) = 1.3043. Thus, the decoded word is c2 = 101!!
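The selection can be checked numerically; the dictionary keys below map c0–c3 to the bit patterns given above:

```python
import math

r = (-0.1072, -1.0504, 1.6875)
codevectors = {
    "000": (-1, -1, -1),   # c0
    "111": ( 1,  1,  1),   # c1
    "101": ( 1, -1,  1),   # c2
    "110": ( 1,  1, -1),   # c3
}
# Euclidean distance from r to each candidate codevector
dist = {word: math.sqrt(sum((ri - ci) ** 2 for ri, ci in zip(r, c)))
        for word, c in codevectors.items()}
best = min(dist, key=dist.get)
# best == "101" with d(r, c2) = 1.3043
```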

Convolutional Codes with Sequential Decoding Soft Decision

For a convolutional code with a rate of 1/n, there will be 2^n possible outputs for each transition of the trellis, and each output vector has n components. The metric for a transition is the squared Euclidean distance between the received sample for that transition and the possible output:

di² = (r1 − c1)² + (r2 − c2)² + … + (rn − cn)²

The cumulative squared Euclidean distance for a path of U transitions is:

dU² = d0² + d1² + … + d(U−1)²

Steps:

• Compute the cumulative squared Euclidean distance, dU², for each transition in the trellis diagram
• If 2 paths reach a certain node in the trellis, the path with the higher dU² is dropped
• At the last time instant, the correctly decoded sequence is the path that has the smallest dU²
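These steps amount to Viterbi-style decoding with a squared-Euclidean branch metric. A minimal sketch, assuming the (2,1,4) encoder from the sequential-decoding example (inferred taps g1 = 1111, g2 = 1101) rather than the unspecified rate-1/2 code of the example below:

```python
def step(state, bit):
    # one (2,1,4) encoder transition: (output pair, next state)
    s1, s2, s3 = state
    return (bit ^ s1 ^ s2 ^ s3, bit ^ s1 ^ s3), (bit, s1, s2)

def viterbi_soft(received):
    # survivors: per state, (cumulative squared Euclidean distance, bits so far)
    survivors = {(0, 0, 0): (0.0, [])}
    for r in received:
        best = {}
        for state, (metric, bits) in survivors.items():
            for b in (0, 1):
                out, nxt = step(state, b)
                polar = [2 * o - 1 for o in out]          # 0 -> -1, 1 -> +1
                m = metric + sum((ri - pi) ** 2 for ri, pi in zip(r, polar))
                if nxt not in best or m < best[nxt][0]:   # drop the higher dU^2
                    best[nxt] = (m, bits + [b])
        survivors = best
    # the flush bits drive the encoder back to the all-zero state
    return survivors[(0, 0, 0)][1]

# soft samples for 11 11 01 11 01 01 11 with the first sample degraded
rx = [(-0.1, 1.0), (1, 1), (-1, 1), (1, 1), (-1, 1), (-1, 1), (1, 1)]
decoded = viterbi_soft(rx)
# -> [1, 0, 1, 1, 0, 0, 0]
```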

Example

Message sequence: m = (10101)
Code sequence: c = (11, −1 1, 11, −1 −1, 11)
Code rate: 1/2
Received sequence: sr = (1.35 −0.15, −1.25 1.4, −0.85 −0.1, −0.95 −1.75, 0.5 1.3)

**1. Compute the cumulative SED for each transition**

sr = (1.35 −0.15, −1.25 1.4, −0.85 −0.1, −0.95 −1.75, 0.5 1.3)

For t1 → t2, Sa → Sa:
d0² = d²[(1.35, −0.15), (−1, −1)] = (1.35 + 1)² + (−0.15 + 1)² = 6.245 = d(U=1)²

For t2 → t3, Sa → Sa:
d1² = d²[(−1.25, 1.4), (−1, −1)] = (−1.25 + 1)² + (1.4 + 1)² = 5.8225
d(U=2)² = d0² + d1² = 12.0675
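The branch metrics above can be verified with a few lines:

```python
# received pairs for t1->t2 and t2->t3; candidate output (-1, -1) in polar form
r = [(1.35, -0.15), (-1.25, 1.40)]
c = (-1, -1)

d0 = sum((ri - ci) ** 2 for ri, ci in zip(r[0], c))   # (1.35+1)^2 + (-0.15+1)^2
d1 = sum((ri - ci) ** 2 for ri, ci in zip(r[1], c))   # (-1.25+1)^2 + (1.4+1)^2
# d0 -> 6.245, d1 -> 5.8225, cumulative d0 + d1 -> 12.0675
```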

2. If 2 paths reach a certain node, the path with the higher dU² is discarded

[Trellis diagram: cumulative squared Euclidean distances accumulated at each node for the received pairs (+1.35, −0.15), (−1.25, +1.40), (−0.85, −0.10), (−0.95, −1.75), (+0.50, +1.30); at each node the path with the higher cumulative distance is dropped, and the surviving path has the smallest final distance.]
Message = 10101 Encoded message = 11 01 11 00 11

References:

- D. G. Hoffman, D. A. Leonard, C. C. Lindner, K. T. Phelps, C. A. Rodger, and J. R. Wall. Coding Theory: The Essentials. Marcel Dekker, Inc., 1991.
- S. Lin and D. J. Costello, Jr. Error Control Coding: Fundamentals and Applications. New Jersey: Prentice Hall, Inc., 1983.
- J. G. Moreira and P. G. Farrell. Essentials of Error-Control Coding. England: John Wiley & Sons, Ltd., 2006.
- M. Purser. Introduction to Error-Correcting Codes. Boston: Artech House, Inc., 1995.
- Tutorial on Coding and Decoding with Convolutional Codes. http://www.complextoreal.com/chapters/convo.pdf. Date accessed: March 9, 2008.
- Soft-Decision Minimum-Distance Sequential Decoding Algorithm for Convolutional Codes (paper).

Report by Chariz Sandra Ramos and Melissa Balmes

