
University of Maisan

College of Engineering
Electrical Engineering

Fourth stage/ Communication Laboratory-M


Study : Morning

Name of Report : Homework Lab 1,2,3,4,5,6

Name of student : Reham Khaled Johe

2021
Discussion Lab 1

4. Write Matlab code to plot the function:

A: y=5*cos(100*pi*t)+2*sin(50*pi*t)

t=0:0.001:0.1;
y=5*cos(100*pi*t)+2*sin(50*pi*t);
plot(t,y,'b')
grid on
xlabel('time')
ylabel('The function value')
title('y')
B: y=sin(x)/x
x=-50:50;
y=sin(x)./x;
y(x==0)=1;   % sin(x)/x -> 1 as x -> 0; avoids the 0/0 NaN at x=0
plot(x,y,'b')
grid on
xlabel('x')
ylabel('The function value')
title('y')
C: y=exp(-10*t)
t=0:0.001:0.5;   % a fine time step is needed; t=0:20:100 gives only six points, all ~0
y=exp(-10*t);
plot(t,y,'r')
grid on
xlabel('time')
ylabel('The function value')
title('y')
Discussion Lab 2

A:

a. On the plot of the pdf, draw a second Gaussian pdf with mue = 0 and sigma = 2:
mue=0;
sigma=2;
x=-40:0.02:40;
f=(1/(sigma*sqrt(2*pi)))*exp(-((x-mue).^2)/(2*(sigma^2)));
plot(x,f,'b')
grid on
xlabel('x')
ylabel('The function value')
title('Gaussian')

B:

mue=0;
sigma=2;
x=mue;
f=(1/(sigma*sqrt(2*pi)))*exp(-((x-mue).^2)/(2*(sigma^2)))
plot(x,f,'bo')   % a single point, so a marker is needed for it to be visible
grid on
xlabel('x')
ylabel('The function value')
title('Gaussian')

>> f =
    0.1995
C: Use Matlab to find P(-1 <= x <= 1):

mue=3;
sigma=2;
syms x
x1=-1;
x2=1;
f=(1/(sigma*sqrt(2*pi)))*exp(-((x-mue).^2)/(2*(sigma^2)))
F=int(f,x,x1,x2)
>> f =
(7186705221432913*exp(-(x - 3)^2/8))/36028797018963968

F =
-(7186705221432913*2^(1/2)*pi^(1/2)*(erf(2^(1/2)/2) - erf(2^(1/2))))/36028797018963968

>> F=double(F)
F =
    0.1359
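The same probability can be cross-checked without the Symbolic Math Toolbox; a minimal sketch using the closed-form Gaussian CDF written with MATLAB's built-in erf:

% Cross-check of P(-1 <= x <= 1) for x ~ N(mue, sigma^2) via the erf-based CDF
mue=3; sigma=2;
x1=-1; x2=1;
Phi=@(x) 0.5*(1+erf((x-mue)/(sigma*sqrt(2))));  % Gaussian CDF
P=Phi(x2)-Phi(x1)   % 0.1359, matching the symbolic result above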

LAB 3

a. For a continuous random variable with a Gaussian pdf of
f(x) = (1/(sigma*sqrt(2*pi)))*exp(-(x-mue)^2/(2*sigma^2)):

b. Drawing the CDF of a single die tossing experiment: enter
and run the following Matlab code:
Discussion

a. Re-run the code in part (a) of (4) above using mue=3 and
show the resulting graph. Show that:

mue=3;
sigma=2;
syms x
f=(1/(sigma*sqrt(2*pi)))*exp(-((x-mue).^2)/(2*(sigma^2)));

1: F=int(f,x,mue-sigma,mue)
>> F =
(7186705221432913*2^(1/2)*pi^(1/2)*erf(2^(1/2)/2))/36028797018963968
double(F)*100 = 34.13%

2: F=int(f,x,mue+sigma,mue+2*sigma)
>> F =
-(7186705221432913*2^(1/2)*pi^(1/2)*(erf(2^(1/2)/2) - erf(2^(1/2))))/36028797018963968
double(F)*100 = 13.59%

3: F=int(f,x,-inf,mue)
>> F =
(7186705221432913*2^(1/2)*pi^(1/2))/36028797018963968
double(F)*100 = 50%

4: F=int(f,x,mue+sigma,inf)
>> F =
-(7186705221432913*2^(1/2)*pi^(1/2)*(erf(2^(1/2)/2) - 1))/36028797018963968
double(F)*100 = 15.87%

5: F=int(f,x,-inf,inf)
>> F =
(7186705221432913*2^(1/2)*pi^(1/2))/18014398509481984
double(F) = 1
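As a numerical cross-check (a sketch, again using the built-in erf so no symbolic toolbox is needed), the five areas can be recomputed directly; the percentages are the standard Gaussian areas and hold for any mue and sigma:

% Numeric check of the five areas above
mue=3; sigma=2;
Phi=@(x) 0.5*(1+erf((x-mue)/(sigma*sqrt(2))));  % Gaussian CDF
(Phi(mue)-Phi(mue-sigma))*100           % 34.13
(Phi(mue+2*sigma)-Phi(mue+sigma))*100   % 13.59
Phi(mue)*100                            % 50
(1-Phi(mue+sigma))*100                  % 15.87
(Phi(inf)-Phi(-inf))*100                % 100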
b. Re-run the code in part (b) of (4) above and show the
resulting graph, assuming the following probabilities of the
experiment outcomes:
prob=[1/6 1/12 1/18 1/12 1/9 1/2]; % prob of the experiment outcomes
cdf=zeros(1,6);
cdf(1,1)=prob(1,1);   % the first CDF value is the first outcome probability (1/6)
for k=2:6
    cdf(1,k)=cdf(1,k-1)+prob(1,k);
end
stairs(cdf,'LineWidth',2);
xlim([0,7]);
grid;
xlabel('\itx');
ylabel('CDF(\itx)');
title('Die tossing experiment');
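The accumulation loop above can also be written without a loop; a minimal equivalent sketch using cumsum, which computes the same running sum of the outcome probabilities:

prob=[1/6 1/12 1/18 1/12 1/9 1/2];
cdf=cumsum(prob);          % running sum = CDF at each outcome
stairs(cdf,'LineWidth',2);
xlim([0,7]); grid;
xlabel('\itx'); ylabel('CDF(\itx)');
title('Die tossing experiment');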
LAB 4: Information Theory & Coding

a. Enter and run the following Matlab code to plot the self-information I(X) vs. probability:

Prob=0:0.01:1;
log_base_2=(log10(1./Prob))/log10(2);
plot(Prob,log_base_2,'g','LineWidth',2);
xlabel('Probability(X)');
ylabel('I(X)');
title('The self information I(X) against probability P(X)');
grid on;

% Finding the same result of I(X) against probability using loops:
prob=0:0.01:1;
[no_of_rows,no_of_columns]=size(prob);
I=zeros(1,no_of_columns);
for k=1:no_of_columns
    I(1,k)=(log10(1/prob(1,k)))/log10(2);
end
hold on; % in order to plot on the same figure
plot(prob,I,'--r','LineWidth',2);
b. Enter and run the following Matlab code to plot the entropy H vs. the probability of
the bit zero:

% Plotting the Entropy H vs. the probability of the zero bit
prob_of_zero=0:0.01:1;       % probability of zero
prob_of_one=1-prob_of_zero;  % the corresponding probability of one
Entropy=prob_of_zero.*((log10(1./prob_of_zero))/log10(2))...
    +prob_of_one.*((log10(1./prob_of_one))/log10(2));
plot(prob_of_zero,Entropy,'k','LineWidth',2)
xlabel('Probability(zero)');
ylabel('Entropy H');
title('The Entropy H of a Binary Source');
grid on;

% Finding the same result of Entropy using loops
Prob_of_zero=0:0.01:1;       % probability of zero
Prob_of_one=1-Prob_of_zero;  % the corresponding probability of one
[no_of_rows,no_of_columns]=size(Prob_of_zero);
Entropy=zeros(1,no_of_columns);
for k=1:no_of_columns
    Entropy(1,k)=Prob_of_zero(1,k)*(log10(1/Prob_of_zero(1,k))/log10(2))...
        +Prob_of_one(1,k)*(log10(1/Prob_of_one(1,k))/log10(2));
end
hold on;
plot(Prob_of_zero,Entropy,'--r','LineWidth',2);
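A quick sanity check on both versions: the binary entropy should peak at exactly 1 bit when the two symbols are equally likely. A one-line check (using MATLAB's built-in log2, which is equivalent to the log10 ratio above):

p=0.5;
H_max=-(p*log2(p)+(1-p)*log2(1-p))   % returns 1, the maximum of the curve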
LAB 5

SNR_dB=30;                  % signal-to-noise ratio in dB
BW=3000;                    % bandwidth is 3000 Hz
SNR=10^(SNR_dB/10);         % convert dB to a linear power ratio (1000)
a=1+SNR;
log_base_2=log(a)/log(2);
ChannelCapacity=BW*log_base_2   % Shannon capacity C = BW*log2(1+SNR)
bar(BW,ChannelCapacity,'r');
title('Channel capacity vs bandwidth');
grid on;
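The bar chart above contains a single (bandwidth, capacity) pair. To actually see the capacity grow with bandwidth at the same 30 dB SNR, a sketch sweeping an assumed range of bandwidths:

% Sketch: C = BW*log2(1+SNR) over a range of bandwidths (the range is assumed)
SNR_dB=30;
SNR=10^(SNR_dB/10);          % 30 dB -> 1000 linear
BW=500:500:5000;             % assumed bandwidths in Hz
C=BW*log2(1+SNR);            % capacity in bits/s for each bandwidth
bar(BW,C,'r');
xlabel('Bandwidth (Hz)'); ylabel('Capacity (bits/s)');
title('Channel capacity vs bandwidth'); grid on;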
Channel Coding

p=[1 1 1;1 0 1;0 0 1];
Im=[1 0 0;0 1 0;0 0 1];
G=[Im p]

G =
     1     0     0     1     1     1
     0     1     0     1     0     1
     0     0     1     0     0     1

d=[1 1 1];
C=mod(d*G,2)   % codeword arithmetic is over GF(2), hence the mod 2

C =
     1     1     1     0     1     1
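Because G is in systematic form [I P], the corresponding parity-check matrix is H = [P' I], and every valid codeword gives a zero syndrome. A short verification sketch:

% Syndrome check for the systematic code above: mod(C*H',2) must be all zeros
p=[1 1 1;1 0 1;0 0 1];
G=[eye(3) p];
H=[p' eye(3)];               % parity-check matrix for G=[I P]
C=mod([1 1 1]*G,2);
syndrome=mod(C*H',2)         % returns [0 0 0], confirming C is a codeword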
Source Coding

x=1:5;
q=1;
m=5;
L=m^q;
N=(1/q)*log2(L)+1

N =
    3.3219

H=log2(m)

H =
    2.3219

Eff=H/N

Eff =
    0.6990
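The efficiency improves if source symbols are encoded in blocks. A sketch, assuming the per-symbol length N = (log2(L)+1)/q (which reproduces the q = 1 value above), showing H/N approach 1 as the block length q grows:

m=5;
H=log2(m);                   % source entropy, 2.3219 bits/symbol
for q=[1 2 4 8]
    L=m^q;                   % number of length-q blocks
    N=(log2(L)+1)/q;         % assumed bits per source symbol
    fprintf('q=%d  N=%.4f  Eff=%.4f\n',q,N,H/N);
end
% q=1 gives N=3.3219 and Eff=0.6990, matching the result above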
Lab 6

1:
f=[0.0625 0.125 0.25];
prob_of_zero=0:0.01:1;
prob_of_one=1-prob_of_zero;
hold on;
col=['b','g','r'];
for k=1:size(f,2)
    a=(1-f(1,k))*prob_of_zero+f(1,k)*prob_of_one;    % P(Y=0)
    b=f(1,k)*prob_of_zero+(1-f(1,k))*prob_of_one;    % P(Y=1)
    c=(1-f(1,k))*(log10(1-f(1,k))/log10(2))+f(1,k)*(log10(f(1,k))/log10(2)); % -H(f)
    I=-(a.*(log10(a)/log10(2))+b.*(log10(b)/log10(2)))+c;   % I = H(Y)-H(f)
    plot(prob_of_zero,I,col(1,k),'LineWidth',2.5);
end
xlabel('Probability(0)');
ylabel('I(X,Y)');
grid;
legend('f=0.0625','f=0.125','f=0.25');
title('Mutual Information I(X,Y) vs. P(0), Flipping Factor f as a parameter');
From the above figure, it is clear that the mutual information decreases as the flipping factor increases: the higher the value of the flipping factor, the lower the value of the mutual information.
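This can be checked at the peak of each curve: at P(0) = 0.5 the output symbols are equally likely for any f, so I(X,Y) = 1 - H(f), which falls as f rises toward 0.5. A one-line check:

f=[0.0625 0.125 0.25];
Hf=-(f.*log2(f)+(1-f).*log2(1-f));   % binary entropy of the flipping factor
I_at_half=1-Hf                       % 0.6627  0.4564  0.1887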
B:

f=0:0.001:1;
C=1+(1-f).*(log10(1-f)/log10(2))+f.*(log10(f)/log10(2));
plot(f,C,'LineWidth',2.5);
xlabel('The flipping factor f');
ylabel('The Channel Capacity C');
grid;
title('The BSC Capacity C vs. the Flipping Factor f')

From the above figure we notice that the highest and lowest values of the channel capacity depend on the flipping factor: the capacity is at its highest value when the flipping factor is zero or one, and at its lowest value when the flipping factor equals 0.5.
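A quick numeric confirmation of both extremes of the capacity curve (f = 0 and f = 1 themselves give NaN in floating point, so values just inside the interval are used):

f=[0.001 0.5 0.999];
C=1+(1-f).*log2(1-f)+f.*log2(f)   % ~0.9886, 0, ~0.9886: minimum at f=0.5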
2

From the results obtained after running the code, we note that the mutual information depends on the flipping factor, and so does the channel capacity: the capacity takes its greatest value when the flipping factor is one or zero, and is zero when the flipping factor is 0.5.
3

When the flipping factor is 0.5, the channel capacity is zero: the output is then statistically independent of the input, so no information gets through the channel.
4

The channel capacity is one when the flipping factor is 1 or 0, because the logarithm of one equals zero (and the term (0)log2(0) is taken as its limit, which is zero):

When f = 0:
C=1+(1-0)log2(1-0)+(0)log2(0)
C=1+0+0=1

When f = 1:
C=1+(1-1)log2(1-1)+(1)log2(1)
C=1+0+0=1
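The f = 0 (and f = 1) case rests on the limit of f*log2(f) at 0, not on evaluating log2(0) itself. A symbolic check of that limit (assumes the Symbolic Math Toolbox, already used in Lab 3):

syms f
limit(f*log2(f),f,0,'right')   % returns 0, so the (0)log2(0) term vanishes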
