
EXPERIMENT-11: Noise realization

VIGNESH NAGARAJAN [EDM18B055]


October 31, 2020

Date Performed: 26/10/2020
Faculty Name: Dr. Mandha D. Selvaraj

1 Aim/Practice Questions
1. (a) Generate random numbers from 0 to 10 using the rand function.

(b) Generate N random numbers from A to B using the rand function (A > 1 and A < B).

2. Generate normally distributed random numbers with mean 100 and variance 25.

3. (a) Plot the histogram for a uniformly distributed random variable.

(b) Plot the histogram for a normally distributed random variable for the following specifications:
µ = 0, σ = 1; µ = 2, σ = 2; µ = 10, σ = 100 (µ and σ are the mean and standard deviation respectively).

4. Plot the histogram for the Gaussian p.d.f. and validate it by using the "randn" function.

2 Apparatus
HARDWARE: PERSONAL COMPUTER
SOFTWARE: MATLAB

3 Theory
3.1 Random variables
A random variable is a variable whose value is unknown or a function that assigns values to each of an experiment’s
outcomes. Random variables are often designated by letters and can be classified as discrete, which are variables
that have specific values, or continuous, which are variables that can have any values within a continuous range.
Some types of random variables are:

• Uniform random variable


• Gaussian random variable
• Geometric random variable
• Poisson random variable

• Hypergeometric random variable

Of the variables listed above, we mainly discuss the first two, as they are the ones most relevant to noise modelling.

Uniform random variable
Suppose X is a random variable with probability density function

f_X(x) = \begin{cases} \frac{1}{b-a}, & a < x < b \\ 0, & \text{otherwise} \end{cases}    (1)

Figure 1: Uniform Random Variable

We see from the above probability density function (p.d.f.) that all outcomes in the interval (a, b) are equally probable.
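For this density the mean is (a + b)/2 and the variance is (b − a)²/12. The following quick check with rand is a minimal sketch (the bounds and sample size are assumed for illustration, not taken from the lab sheet):

% Quick check (illustrative, assumed values): the sample mean and variance of
% a Uniform(a,b) variable should approach (a+b)/2 and (b-a)^2/12.
a = 2; b = 8;                          % assumed bounds
x = a + (b - a)*rand(1e6, 1);          % one million uniform samples on (a,b)
fprintf("mean: %.4f (theory %.4f)\n", mean(x), (a + b)/2);
fprintf("var : %.4f (theory %.4f)\n", var(x), (b - a)^2/12);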

Uniform noise is not often encountered in real-world imaging systems, but provides a useful comparison with Gaussian noise. The linear average is a comparatively poor estimator for the mean of a uniform distribution. This implies that nonlinear filters should be better at removing uniform noise than Gaussian noise.
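As a rough illustration of this point, the following minimal sketch (not part of the lab procedure; sample sizes are assumed) compares the nonlinear midrange estimator (max + min)/2 with the linear sample mean as estimators of the centre of a uniform distribution:

% Minimal sketch (assumed parameters): for uniform samples the nonlinear
% midrange estimator (max+min)/2 typically beats the linear sample mean.
a = -1; b = 1; N = 1000; trials = 2000;
err_mean = zeros(trials, 1); err_mid = zeros(trials, 1);
for k = 1:trials
    x = a + (b - a)*rand(N, 1);
    err_mean(k) = mean(x) - (a + b)/2;              % linear average
    err_mid(k)  = (max(x) + min(x))/2 - (a + b)/2;  % nonlinear midrange
end
fprintf("RMS error, mean    : %g\n", sqrt(mean(err_mean.^2)));
fprintf("RMS error, midrange: %g\n", sqrt(mean(err_mid.^2)));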

Gaussian random variable


Gaussian noise provides a good model of noise in many imaging systems and in communication modelling. Suppose X is a random variable with parameters µ and σ²; its probability density function (pdf) is

f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \, e^{-\frac{(x-\mu)^2}{2\sigma^2}}

We denote this by X ~ N(µ, σ²).
The special case X ~ N(0, 1) is referred to as the Standard Normal random variable.

Figure 2: Gaussian Random Variable

The Gaussian distribution has an important property: to estimate the mean of a stationary Gaussian random variable,
one can’t do any better than the linear average. This makes Gaussian noise a worst-case scenario for nonlinear image
restoration filters, in the sense that the improvement over linear filters is least for Gaussian noise. To improve on
linear filtering results, nonlinear filters can exploit only the non-Gaussianity of the signal distribution.
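The same experiment can be repeated with Gaussian samples (again a minimal sketch with assumed parameters); here the linear average is not beaten by a nonlinear estimator such as the median:

% Minimal sketch (assumed parameters): for Gaussian samples the linear
% average is the best estimator of the mean; the median does worse.
mu = 5; sigma = 2; N = 1000; trials = 2000;
err_mean = zeros(trials, 1); err_med = zeros(trials, 1);
for k = 1:trials
    x = mu + sigma*randn(N, 1);       % X ~ N(mu, sigma^2)
    err_mean(k) = mean(x)   - mu;     % linear average
    err_med(k)  = median(x) - mu;     % nonlinear estimate
end
fprintf("RMS error, mean  : %g\n", sqrt(mean(err_mean.^2)));
fprintf("RMS error, median: %g\n", sqrt(mean(err_med.^2)));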

White Noise
First of all, "noise" is an unwanted signal that interferes with the original message signal and corrupts its parameters, so the received message differs from the transmitted one. Noise is most likely to enter the system at the channel or at the receiver.

In signal processing, white noise is a random signal having equal intensity at different frequencies, giving it a
constant power spectral density. The term is used, with this or similar meanings, in many scientific and technical
disciplines, including physics, acoustical engineering, telecommunications, and statistical forecasting. White noise
refers to a statistical model for signals and signal sources, rather than to any specific signal. White noise draws its
name from white light, although light that appears white generally does not have a flat power spectral density over
the visible band.

In discrete time, white noise is a discrete signal whose samples are regarded as a sequence of serially uncorrelated
random variables with zero mean and finite variance; a single realization of white noise is a random shock. Depending
on the context, one may also require that the samples be independent and have identical probability distribution
(in other words independent and identically distributed random variables are the simplest representation of white
noise). In particular, if each sample has a normal distribution with zero mean, the signal is said to be additive white
Gaussian noise.
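A minimal sketch of an additive white Gaussian noise channel follows; the sinusoid and the noise level are assumed purely for illustration and are not part of the original listing:

% Minimal sketch (assumed signal and noise level): additive white Gaussian
% noise adds an independent zero-mean Gaussian sample to every signal sample.
n = 0:999;
s = sin(2*pi*0.01*n);                 % assumed message signal
sigma_n = 0.2;                        % assumed noise standard deviation
r = s + sigma_n*randn(size(s));       % received signal = signal + AWGN
plot(n, r); hold on
plot(n, s, 'LineWidth', 2);
legend('signal + AWGN', 'clean signal');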

The samples of a white noise signal may be sequential in time, or arranged along one or more spatial dimensions.
In digital image processing, the pixels of a white noise image are typically arranged in a rectangular grid, and are
assumed to be independent random variables with uniform probability distribution over some interval. The concept
can be defined also for signals spread over more complicated domains, such as a sphere or a torus.
An infinite-bandwidth white noise signal is a purely theoretical construction. The bandwidth of white noise is limited
in practice by the mechanism of noise generation, by the transmission medium and by finite observation capabilities.
Thus, random signals are considered ”white noise” if they are observed to have a flat spectrum over the range of
frequencies that are relevant to the context. For an audio signal, the relevant range is the band of audible sound
frequencies (between 20 and 20,000 Hz). Such a signal is heard by the human ear as a hissing sound, resembling the
/h/ sound in a sustained aspiration. On the other hand, the /sh/ sound in ”ash” is a colored noise because it has
a formant structure. In music and acoustics, the term ”white noise” may be used for any signal that has a similar
hissing sound.
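This hiss can be heard directly in MATLAB; the sketch below is illustrative, with the sampling rate and amplitude assumed:

% Minimal sketch (assumed parameters): one second of low-amplitude white
% Gaussian noise at 8 kHz is heard as a steady hiss.
fs = 8000;                 % assumed sampling rate in Hz
y  = 0.1*randn(fs, 1);     % 1 s of white Gaussian noise
sound(y, fs);              % plays through the default audio device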

Statistical Properties of White Noise
Any distribution of values is possible (although it must have a zero DC component). Even a binary signal that can only take on the values 1 or −1 will be white if the sequence is statistically uncorrelated. Noise having a continuous distribution, such as a normal distribution, can of course be white.
It is often incorrectly assumed that Gaussian noise (i.e., noise with a Gaussian amplitude distribution; see normal distribution) necessarily refers to white noise, yet neither property implies the other. Gaussianity refers to the probability distribution with respect to the value, in this context the probability of the signal falling within any particular range of amplitudes, while the term 'white' refers to the way the signal power is distributed (i.e., independently) over time or among frequencies.
White noise is the generalized mean-square derivative of the Wiener process or Brownian motion.
A generalization to random elements on infinite dimensional spaces, such as random fields, is the white noise measure.
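To make the distinction between "Gaussian" and "white" concrete, the minimal sketch below (the moving-average filter length is assumed) low-pass filters white Gaussian noise: the amplitudes remain Gaussian, but the samples become correlated, i.e. the noise is now colored rather than white.

% Minimal sketch (assumed filter length): Gaussian does not imply white.
w = randn(1e5, 1);                     % white Gaussian noise
c = filter(ones(10,1)/10, 1, w);       % 10-tap moving average -> colored noise
subplot(2,1,1)
histogram(c, 'Normalization', 'pdf');  % amplitudes are still Gaussian
title('Amplitude distribution (still Gaussian)');
subplot(2,1,2)
[r, lags] = xcorr(c, 50, 'coeff');     % normalized autocorrelation
stem(lags, r);                         % correlated over about 10 lags
title('Autocorrelation (no longer white)');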

Figure 3: Gaussian white noise

• A white noise signal (process) is constituted by a set of independent and identically distributed (i.i.d) random
variables.
• A series of samples that are independent and generated from the same probability distribution.
• A white noise signal generated from a Uniform distribution is called Uniform White Noise.
• In system modelling, white noise can be generated using an appropriate random number generator (a short sketch follows this list).

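A minimal sketch (the sample size is assumed) of generating uniform white noise with rand and checking that its spectrum is roughly flat:

% Minimal sketch (assumed sample size): uniform white noise is a sequence of
% i.i.d. zero-mean uniform samples; its periodogram is roughly flat.
N = 1e5;
u = rand(N, 1) - 0.5;                  % i.i.d. Uniform(-0.5, 0.5) samples
P = abs(fft(u)).^2 / N;                % periodogram estimate of the PSD
plot(10*log10(P(2:N/2)));              % roughly flat apart from estimation noise
xlabel('frequency bin'); ylabel('power (dB)');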
4 MATLAB code for Practice 1, 2, 3, 4

Listing 1: Practice-1,2,3,4

%% Practice-1
clc
A1 = 10*rand(10,1);                    % 10 random numbers from 0 to 10
fprintf("Output (a)\n");
disp(A1);
N = input("\n(b) How many random numbers (second part of P-1)? ");
a = input("Lower bound (=a) ");        % a must be > 1
b = input("Upper bound (=b) ");        % b must be > a
A2 = a + (b - a)*rand(N,1);            % N random numbers from a to b
fprintf("Output of (b) is\n");
disp(num2str(A2));

%% Practice-2
clc
mean = 100;                            % note: this shadows the built-in mean function
variance = 25;
stddev = sqrt(variance);
X = mean + stddev*randn(20,1);         % N(100, 25) samples
fprintf("Output of Practice-2 is\n");
disp(num2str(X));

%% Practice-3
clc
sgtitle("PRACTICE-3 EDM18B055");
a = -2; b = 2;                         % lower and upper bounds for the uniform distribution
subplot(2,2,1)
histogram(a + (b - a)*rand(10000,1));  % histogram of the uniform distribution
title("Uniform distribution (a=" + num2str(a) + ", b=" + num2str(b) + ")");
subplot(2,2,2)
histogram(randn(10000,1));             % histogram, mu = 0, sigma = 1
title("N~(0,1)");
subplot(2,2,3)
histogram(2 + 2*randn(10000,1));       % histogram, mu = 2, sigma = 2
title("N~(2,4)");
subplot(2,2,4)
histogram(10 + 100*randn(10000,1));    % histogram, mu = 10, sigma = 100
title("N~(10,10000)");

%% Practice-4
figure();
mu = 0; sigma = 1;
x = linspace(-7*sigma, 7*sigma, 1000);
fxx = (1/sqrt(2*pi*sigma^2))*exp(-(x - mu).^2/(2*sigma^2));  % Gaussian pdf
plot(x, fxx, 'green', 'LineWidth', 3);
hold on
histogram(randn(1000,1), 'Normalization', 'pdf');            % validate with randn
legend('From Equation', 'randn');

%% Autocorrelation
Y = randn(10000,1);                    % N~(0,1)
[c1, lags1] = xcorr(Y);
subplot(2,2,1)
sgtitle("Autocorrelation-EDM18B055");
stem(lags1, c1);
title("Autocorrelation of N~(0,1)");
subplot(2,2,2)
plot(abs(fft(xcorr(Y))).^2);           % power spectrum via FFT of the autocorrelation
title('Power spectrum of Autocorrelation of N~(0,1)');
Z = rand(10000,1);                     % Uniform(0,1) samples
[c2, lags2] = xcorr(Z);
subplot(2,2,3)
stem(lags2, c2);
title("Autocorrelation of Uniform distribution");
subplot(2,2,4)
plot(abs(fft(xcorr(Z))).^2);
title('Power spectrum of Autocorrelation of Uniform');

4.1 Practice-1

Figure 4: Practice-1 simulation-1

Figure 5: Practice-1 simulation-2

4.2 Practice-2

Figure 6: Practice-2 simulation-1

Figure 7: Practice-2 simulation-2

4.3 Practice-3

Figure 8: Practice-3 simulation-1

Figure 9: Practice-3 simulation-2

4.4 Practice-4

Figure 10: Practice-4 simulation-1

Figure 11: Practice-4 simulation-2

4.5 Autocorrelation

Figure 12: Autocorrelation of Gaussian and Uniformly distributed noise

5 Conclusion
All practice questions were simulated and output histograms were shown. Related inferences were made under each
Practice section.

