MAXIMUM LIKELIHOOD DECODING BASICS: THE EXAMPLE OF AN AWGN RECEIVER

Any set of M real energy signals S = {s_1(t), ..., s_M(t)} defined on [0, T) can be represented as a linear combination of N <= M real orthonormal basis functions {\phi_1(t), ..., \phi_N(t)}. Thus, we can write each signal as

    s_i(t) = \sum_{j=1}^{N} s_{ij} \phi_j(t),   0 <= t < T.

We say that these basis functions span the signal set. We denote the coefficients as a vector s_i = (s_{i1}, ..., s_{iN}), which is called the signal constellation point corresponding to the signal s_i(t). The signal constellation consists of all M constellation points {s_1, ..., s_M}.

Given the channel output r(t) = s_i(t) + n(t), we now investigate the receiver structure to determine which constellation point s_i or, equivalently, which message m_i, was sent over the time interval [0, T]. We convert the received signal r(t) over each time interval into a vector r = (r_1, ..., r_N), as this allows us to work in a finite-dimensional vector space to estimate the transmitted signal.
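As a sketch of how these coefficients can be computed numerically, the following projects a signal onto an orthonormal basis; the basis functions, interval, and sample rate here are illustrative assumptions, not part of the original notes:

```python
import numpy as np

# Illustrative setup: T = 1 s, two orthonormal basis functions on [0, T)
T, fs = 1.0, 10_000
t = np.arange(0, T, 1 / fs)

# Unit-energy basis: sqrt(2/T) * cos / sin at one cycle per interval
phi1 = np.sqrt(2 / T) * np.cos(2 * np.pi * t / T)
phi2 = np.sqrt(2 / T) * np.sin(2 * np.pi * t / T)

def project(s, basis, dt=1 / fs):
    """Coefficients s_ij = integral of s(t) * phi_j(t) dt (Riemann sum)."""
    return np.array([np.sum(s * phi) * dt for phi in basis])

# Example signal: s(t) = 3*phi1(t) - 2*phi2(t); recover its constellation point
s = 3 * phi1 - 2 * phi2
coeffs = project(s, [phi1, phi2])
print(np.round(coeffs, 3))  # -> [ 3. -2.]
```

The recovered vector (3, -2) is exactly the constellation point of the example signal, since the basis is orthonormal.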

Since n(t) is a Gaussian random process, if we condition on the transmitted signal s_i(t), then the channel output r(t) = s_i(t) + n(t) is also a Gaussian random process, and r = (r_1, ..., r_N) is a Gaussian random vector. Writing r_j = s_{ij} + n_j, the means are

    E(r_j | s_i) = E(s_{ij}) + E(n_j) = s_{ij},

since E(n_j) = 0 for AWGN and the expectation of a constant is the constant itself. Moreover, the components are uncorrelated:

    Cov(r_j, r_k | s_i) = E(n_j n_k) = (N_0 / 2) \delta_{jk}.

Therefore, conditioned on s_i, the r_j are independent Gaussian random variables with mean s_{ij} and variance N_0 / 2.
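These conditional statistics can be spot-checked with a short Monte Carlo sketch; the constellation point, N0, and trial count below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
N0, trials = 2.0, 200_000
s_i = np.array([1.5, -0.5, 2.0])   # one constellation point (illustrative)

# r_j = s_ij + n_j with n_j ~ N(0, N0/2), independent across components
n = rng.normal(0.0, np.sqrt(N0 / 2), size=(trials, 3))
r = s_i + n

print(r.mean(axis=0))   # close to s_i = [1.5, -0.5, 2.0]
print(r.var(axis=0))    # each close to N0/2 = 1.0
```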

Therefore the joint probability density function is given by

    p(r | s_i) = \left( \frac{1}{\pi N_0} \right)^{N/2} \exp\left( -\frac{1}{N_0} \sum_{j=1}^{N} (r_j - s_{ij})^2 \right).

For each value of i = 1, ..., M this function is computed. It gives the likelihood of receiving r when s_i is sent; therefore the index i for which this likelihood is maximum identifies the decoded message. The M signals form a constellation in the signal space.
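As an illustrative sketch of this computation (the constellation, N0, and received vector below are assumptions, not values from the notes):

```python
import numpy as np

N0 = 0.5                                  # noise variance per dimension is N0/2
S = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], dtype=float)  # 4 points

def likelihood(r, s_i, N0):
    """p(r | s_i) = (pi*N0)^(-N/2) * exp(-||r - s_i||^2 / N0)."""
    N = len(r)
    return (np.pi * N0) ** (-N / 2) * np.exp(-np.sum((r - s_i) ** 2) / N0)

r = np.array([0.9, 0.8])                  # received vector
likes = np.array([likelihood(r, s, N0) for s in S])
i_hat = int(np.argmax(likes))             # index of the most likely signal
print(i_hat)                              # -> 0, i.e. the point (1, 1)
```

The maximizing index picks the constellation point nearest the received vector, anticipating the Euclidean-distance view below.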
If a transmitter sends one of M signals s_i(t), for i = 1, 2, ..., M, an ML receiver selects the s_j that maximizes the conditional density

    f_{X|s_j}(x | s_j) = \frac{1}{(\pi N_0)^{N/2}} \exp\left( -\sum_{i=1}^{N} \frac{(x_i - s_{ji})^2}{N_0} \right).

Maximizing the likelihood function is equivalent to minimizing the quantity

    d^2(x, s_j) = \sum_{i=1}^{N} (x_i - s_{ji})^2,   i.e.,   \hat{s} = \arg\min_{s_j} d^2(x, s_j).

In other words, the maximum likelihood receiver picks the signal that is closest to the received signal in the signal space. This distance is called the Euclidean distance.
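This equivalence is easy to confirm numerically; in the sketch below the constellation and received vectors are random stand-ins, not data from the notes:

```python
import numpy as np

rng = np.random.default_rng(0)
N0 = 1.0
S = rng.normal(size=(8, 4))          # 8 arbitrary points in 4-D signal space

for _ in range(100):
    x = rng.normal(size=4)           # an arbitrary received vector
    d2 = np.sum((S - x) ** 2, axis=1)   # squared Euclidean distances
    loglike = -d2 / N0               # log-likelihood up to an additive constant
    assert np.argmax(loglike) == np.argmin(d2)   # identical decisions
print("likelihood maximization == distance minimization")
```

The exponential is monotone, so ordering by likelihood and ordering by (negative) squared distance always agree in the Gaussian case.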

EXTENDING THIS PROCEDURE TO MIDDLETON BIVARIATE CLASS A NOISE AND BIVARIATE GAUSSIAN STATISTICS

Here we are considering the case of a 2x2 MIMO system, so there are two transmitted symbol vectors, X1 and X2.

DERIVATION FOR BIVARIATE MIDDLETON CLASS A:

    Y = HX + N   ... (MIMO system model)

Taking expectations, E(Y) = E(HX) + E(N). Since E(HX) = HX and E(N) = 0, this implies E(Y) = HX, and therefore N = Y - HX.

If we replace X with S, where S is the codebook, i.e., all the possible symbols that may have been transmitted, and keep n = Y - HS, then the maximum likelihood decision rule is given by maximizing the joint spatial distribution of the noise,
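As a hedged sketch of such a joint spatial distribution: the canonical Middleton Class A model is a Poisson-weighted mixture of zero-mean Gaussians. The version below uses isotropic per-term covariances and a truncated sum; all parameter values (A, Gamma, sigma2, the channel H, the truncation depth, and the real-valued +-1 stand-in for 4-QAM) are illustrative assumptions, not values from the original derivation:

```python
import numpy as np
from math import factorial

def class_a_pdf(n, A=0.1, Gamma=0.01, sigma2=1.0, terms=20):
    """Truncated bivariate Middleton Class A density: a Poisson-weighted
    mixture of zero-mean Gaussians,
        f(n) = sum_m e^{-A} A^m / m! * N(n; 0, sigma_m^2 I),
        sigma_m^2 = sigma2 * (m/A + Gamma) / (1 + Gamma).
    A (impulsive index) and Gamma (Gaussian-to-impulsive power ratio)
    are illustrative values. n is the length-2 spatial noise vector."""
    n = np.asarray(n, dtype=float)
    f = 0.0
    for m in range(terms):
        w = np.exp(-A) * A**m / factorial(m)          # Poisson weight
        s2 = sigma2 * (m / A + Gamma) / (1 + Gamma)   # per-term variance
        f += w * np.exp(-(n @ n) / (2 * s2)) / (2 * np.pi * s2)
    return f

def ml_decode(Y, H, codebook):
    """Pick the codebook entry whose residual N = Y - H S is most likely."""
    likes = [class_a_pdf(Y - H @ s) for s in codebook]
    return int(np.argmax(likes))

# Illustrative check: real-valued +-1 stand-in for 4-QAM on two antennas
H = np.array([[1.0, 0.2], [0.1, 0.9]])
codebook = [np.array([a, b]) for a in (1.0, -1.0) for b in (1.0, -1.0)]
Y = H @ codebook[2]                      # noiseless receive of codeword 2
print(ml_decode(Y, H, codebook))         # -> 2
```

In the fully bivariate model each mixture term can carry its own covariance matrix K_m, which is why the likelihood cannot in general be collapsed into a single distance metric.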

where S is the codebook of candidate transmit vectors [x1, x2]. We have used 4-QAM, so with two transmit antennas the codebook is of size 16 x 2 (4^2 = 16 candidate symbol pairs). The value of the index at which this function is maximized gives the decoded symbol.

Why not distance computation? The computational complexity of the ML receiver for Middleton Class A noise is higher than its Gaussian counterpart, since the likelihood function given by this equation can no longer be expressed as a function of the minimum distance alone. For multivariate Gaussian noise, by contrast, we can make use of Mahalanobis distance minimization: D = min ( (N' inv(K) N)^(1/2) ), where N = Y - HS, K is the noise covariance matrix, and S ranges over the codebook.
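For the multivariate Gaussian case, the Mahalanobis minimization reduces ML detection to a distance computation. A minimal sketch, in which the channel H, covariance K, noise vector, and codebook are all illustrative assumptions:

```python
import numpy as np

def mahalanobis_decode(Y, H, codebook, K):
    """Minimum Mahalanobis distance rule: D = min sqrt(N' K^-1 N), N = Y - HS.
    sqrt is monotone, so it suffices to minimize the quadratic form."""
    Kinv = np.linalg.inv(K)
    d2 = [float((Y - H @ s) @ Kinv @ (Y - H @ s)) for s in codebook]
    return int(np.argmin(d2))

# Illustrative 2x2 channel, correlated noise covariance, and +-1 codebook
H = np.array([[0.8, 0.3], [0.2, 1.1]])
K = np.array([[1.0, 0.4], [0.4, 1.0]])
codebook = [np.array([a, b]) for a in (1.0, -1.0) for b in (1.0, -1.0)]

n = np.array([0.05, -0.03])              # small fixed stand-in for noise
Y = H @ codebook[1] + n
print(mahalanobis_decode(Y, H, codebook, K))   # -> 1
```

Because the Gaussian likelihood is a monotone function of this single quadratic form, the exhaustive likelihood evaluation required for Class A noise collapses here to a distance search.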