
White noise

From Wikipedia, the free encyclopedia

For other uses, see White noise (disambiguation).


White noise is a random signal (or process) with a flat power spectral density. In other words, the signal contains equal power within any fixed bandwidth, regardless of the center frequency. White noise draws its name from white light, in which the power spectral density of the light is distributed over the visible band in such a way that the eye's three color receptors (cones) are approximately equally stimulated.

In a statistical sense, a time series r_t is characterized as weak white noise if {r_t} is a sequence of serially uncorrelated random variables with zero mean and finite variance. Strong white noise additionally has the quality of being independent and identically distributed, which implies no autocorrelation. In particular, if r_t is normally distributed with mean zero and standard deviation σ, the series is called Gaussian white noise.[1]

An infinite-bandwidth white noise signal is a purely theoretical construction. The bandwidth of white noise is limited in practice by the mechanism of noise generation, by the transmission medium and by finite observation capabilities. A random signal is considered "white noise" if it is observed to have a flat spectrum over the medium's widest possible bandwidth.

Contents

1 White noise in a spatial context
2 Statistical properties
3 Applications
4 Mathematical definition
    4.1 White random vector
    4.2 White random process (white noise)
5 Random vector transformations
    5.1 Simulating a random vector
    5.2 Whitening a random vector
6 Random signal transformations
    6.1 Simulating a continuous-time random signal
    6.2 Whitening a continuous-time random signal
7 In music
8 See also
9 References
10 External links

White noise in a spatial context


While it is usually applied in the context of frequency domain signals, the term white noise is also commonly applied to a noise signal in the spatial domain. In this case, it has an autocorrelation function which can be represented by a delta function over the relevant space dimensions. The signal is then "white" in the spatial frequency domain (this is equally true for signals in the angular frequency domain, e.g., the distribution of a signal across all angles in the night sky).

Statistical properties

Figure: An example realization of a Gaussian white noise process (a finite-length, discrete-time realization generated by a computer).

Being uncorrelated in time does not restrict the values a signal can take. Any distribution of values is possible (although it must have zero DC component). Even a binary signal which can only take on the values 1 or −1 will be white if the sequence is statistically uncorrelated. Noise having a continuous distribution, such as a normal distribution, can of course be white.

It is often incorrectly assumed that Gaussian noise (i.e., noise with a Gaussian amplitude distribution – see normal distribution) is necessarily white noise, yet neither property implies the other. Gaussianity refers to the probability distribution with respect to the value, i.e. the probability that the signal has a certain given value, while the term "white" refers to the way the signal power is distributed over time or among frequencies.
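As a brief numerical illustration (a minimal NumPy sketch; the sequence length and lags are arbitrary choices), both a Gaussian sequence and a ±1 binary sequence of uncorrelated samples are white, even though their amplitude distributions differ:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 4096

    gaussian_white = rng.standard_normal(n)           # Gaussian amplitude distribution
    binary_white = rng.choice([-1.0, 1.0], size=n)    # binary +/-1 amplitude distribution

    def normalized_autocorrelation(x, max_lag=5):
        x = x - x.mean()
        return np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(max_lag + 1)]) / x.var()

    # Both sequences are "white": the normalized autocorrelation is ~1 at lag 0
    # and ~0 at every other lag, despite the different amplitude distributions.
    print(normalized_autocorrelation(gaussian_white))
    print(normalized_autocorrelation(binary_white))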

Figure: FFT spectrogram of pink noise (left) and white noise (right), shown with a linear frequency axis (vertical).

We can therefore find Gaussian white noise, but also Poisson, Cauchy, etc. white noises. Thus, the two words "Gaussian" and "white" are often both specified in mathematical models of systems. Gaussian white noise is a good approximation of many real-world situations and generates mathematically tractable models. These models are used so frequently that the term additive white Gaussian noise has a standard abbreviation: AWGN. Gaussian white noise has the useful statistical property that its values are independent (see statistical independence). White noise is the generalized mean-square derivative of the Wiener process, or Brownian motion.
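To illustrate that last relationship (a minimal NumPy sketch; the step size and number of steps are arbitrary), the running sum of discrete white-noise increments approximates a sample path of the Wiener process:

    import numpy as np

    rng = np.random.default_rng(0)
    dt = 1e-3                               # time step (arbitrary)
    n = 1000                                # number of steps covering [0, 1)

    # White-noise increments with variance dt approximate dW(t); their running
    # sum approximates a Wiener-process (Brownian-motion) sample path.
    increments = rng.normal(0.0, np.sqrt(dt), size=n)
    wiener_path = np.cumsum(increments)

    print(wiener_path[-1])                  # value of the approximate path at t = 1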

Applications
White noise is used by some emergency vehicle sirens due to its ability to cut through background noise, which makes the siren easier to locate.[citation needed]

White noise is commonly used in the production of electronic music, usually either directly or as an input for a filter to create other types of noise signal. It is used extensively in audio synthesis, typically to recreate percussive instruments such as cymbals, which have high noise content in their frequency domain. It is also used to generate impulse responses.

To set up the equalization (EQ) for a concert or other performance in a venue, a short burst of white or pink noise is sent through the PA system and monitored from various points in the venue so that the engineer can tell if the acoustics of the building naturally boost or cut any frequencies. The engineer can then adjust the overall equalization to ensure a balanced mix.

[Audio sample: 10 seconds of white noise.]
White noise can be used for the frequency response testing of amplifiers and electronic filters. It is not used for testing loudspeakers, as its spectrum contains too much high-frequency content; pink noise is used instead for testing transducers such as loudspeakers and microphones.

White noise is a common synthetic noise source used for sound masking by a tinnitus masker.[2] White noise is a particularly good source signal for masking devices because it contains higher frequencies in equal volume to lower ones, and so is capable of more effectively masking the high-pitched ringing tones most commonly perceived by tinnitus sufferers.[citation needed]

White noise is used as the basis of some random number generators. For example, Random.org uses a system of atmospheric antennae to generate random digit patterns from white noise.

White noise machines and other white noise sources are sold as privacy enhancers and sleep aids and to mask tinnitus.[3] Some people claim that white noise, when used with headphones, can aid concentration by masking irritating or distracting noises in a person's environment.[4]
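As an illustration of response testing (a minimal sketch assuming NumPy and SciPy; the Butterworth low-pass filter here is an arbitrary stand-in for the device under test), feeding white noise through a filter and estimating the output spectrum recovers the shape of the filter's magnitude response:

    import numpy as np
    from scipy import signal

    rng = np.random.default_rng(0)
    white = rng.standard_normal(1 << 16)      # white-noise test signal

    # "Device under test": an arbitrary 4th-order Butterworth low-pass filter.
    b, a = signal.butter(4, 0.2)
    output = signal.lfilter(b, a, white)

    # Because the input spectrum is flat, the output power spectrum follows the
    # squared magnitude response of the filter (up to a constant).
    f, p_out = signal.welch(output, nperseg=1024)
    w, h = signal.freqz(b, a, worN=513)
    # p_out approximately tracks |h|**2 across frequency.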

Mathematical definition
White random vector
A random vector w is a white random vector if and only if its mean vector and autocorrelation matrix are the following:

\mu_w = E\{w\} = 0

R_{ww} = E\{w w^T\} = \sigma^2 I

That is, it is a zero-mean random vector, and its autocorrelation matrix is a multiple of the identity matrix. When the autocorrelation matrix is a multiple of the identity, we say that it has spherical correlation.
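For instance (a small NumPy check; the variance σ² = 2 and the three dimensions are arbitrary illustration values), independent zero-mean components of equal variance give a sample covariance close to σ²I:

    import numpy as np

    rng = np.random.default_rng(0)
    sigma2 = 2.0                                          # common component variance (arbitrary)

    # 100,000 samples of a 3-dimensional white random vector.
    w = rng.normal(0.0, np.sqrt(sigma2), size=(100_000, 3))

    print(w.mean(axis=0))                                 # ~ zero mean vector
    print(np.cov(w, rowvar=False))                        # ~ sigma2 * identity (spherical correlation)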

White random process (white noise)


A continuous-time random process w(t), where t \in \mathbb{R}, is a white noise process if and only if its mean function and autocorrelation function satisfy the following:

\mu_w(t) = E\{w(t)\} = 0

R_{ww}(t_1, t_2) = E\{w(t_1) w(t_2)\} = \frac{N_0}{2} \delta(t_1 - t_2)

That is, it is a zero-mean process for all time and has infinite power at zero time shift, since its autocorrelation function is the Dirac delta function. The above autocorrelation function implies the following power spectral density:

S_{ww}(\omega) = \frac{N_0}{2}

since the Fourier transform of the delta function is equal to 1. Since this power spectral density is the same at all frequencies, we call it white as an analogy to the frequency spectrum of white light. A generalization to random elements on infinite dimensional spaces, such as random fields, is the white noise measure.

Random vector transformations


Two theoretical applications using a white random vector are the simulation and whitening of another arbitrary random vector. To simulate an arbitrary random vector, we transform a white random vector with a carefully chosen matrix. We choose the transformation matrix so that the mean and covariance matrix of the transformed white random vector matches the mean and covariance matrix of the arbitrary random vector that we are simulating. To whiten an arbitrary random vector, we transform it by a different carefully chosen matrix so that the output random vector is a white random vector. These two ideas are crucial in applications such as channel estimation and channel equalization in communications and audio. These concepts are also used in data compression.

Simulating a random vector


Suppose that a random vector x has covariance matrix K_{xx}. Since this matrix is Hermitian symmetric and positive semidefinite, by the spectral theorem from linear algebra we can diagonalize or factor the matrix in the following way:

K_{xx} = E \Lambda E^T

where E is the orthogonal matrix of eigenvectors and \Lambda is the diagonal matrix of eigenvalues. We can simulate the 1st and 2nd moment properties of this random vector with mean \mu and covariance matrix K_{xx} via the following transformation of a white vector w of unit variance:

x = H w + \mu

where

H = E \Lambda^{1/2}

Thus, the output of this transformation has expectation

E\{x\} = H E\{w\} + \mu = \mu

and covariance matrix

E\{(x - \mu)(x - \mu)^T\} = H E\{w w^T\} H^T = H H^T = E \Lambda^{1/2} \Lambda^{1/2} E^T = K_{xx}
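A minimal NumPy sketch of this construction (the mean vector and covariance matrix are arbitrary illustration values): diagonalize K_xx, form H = EΛ^(1/2), and transform unit-variance white vectors:

    import numpy as np

    rng = np.random.default_rng(0)

    mu = np.array([1.0, -2.0])                       # desired mean (illustrative values)
    Kxx = np.array([[4.0, 1.5],
                    [1.5, 2.0]])                     # desired covariance matrix

    # Diagonalize Kxx = E diag(lam) E^T and build H = E diag(sqrt(lam)).
    lam, E = np.linalg.eigh(Kxx)
    H = E @ np.diag(np.sqrt(lam))

    # Transform unit-variance white vectors: x = H w + mu.
    w = rng.standard_normal((100_000, 2))
    x = w @ H.T + mu

    print(x.mean(axis=0))               # ~ mu
    print(np.cov(x, rowvar=False))      # ~ Kxx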

Whitening a random vector


The method for whitening a vector x with mean \mu and covariance matrix K_{xx} is to perform the following calculation:

w = \Lambda^{-1/2} E^T (x - \mu)

Thus, the output of this transformation has expectation

E\{w\} = \Lambda^{-1/2} E^T (E\{x\} - \mu) = 0

and covariance matrix

E\{w w^T\} = \Lambda^{-1/2} E^T K_{xx} E \Lambda^{-1/2}

By diagonalizing K_{xx} = E \Lambda E^T, we get the following:

\Lambda^{-1/2} E^T (E \Lambda E^T) E \Lambda^{-1/2} = \Lambda^{-1/2} \Lambda \Lambda^{-1/2} = I

Thus, with the above transformation, we can whiten the random vector to have zero mean and the identity covariance matrix.
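A corresponding NumPy sketch of the whitening transform (the correlated samples are generated here only so the example is self-contained; the mean and covariance are the same arbitrary values as above):

    import numpy as np

    rng = np.random.default_rng(0)

    mu = np.array([1.0, -2.0])
    Kxx = np.array([[4.0, 1.5],
                    [1.5, 2.0]])

    # Correlated samples to be whitened (generated here only for demonstration).
    x = rng.multivariate_normal(mu, Kxx, size=100_000)

    # Whitening transform: w = diag(lam)^(-1/2) E^T (x - mu).
    lam, E = np.linalg.eigh(Kxx)
    w = (x - mu) @ E @ np.diag(1.0 / np.sqrt(lam))

    print(w.mean(axis=0))               # ~ zero
    print(np.cov(w, rowvar=False))      # ~ identity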

Random signal transformations


We can extend the same two concepts of simulating and whitening to the case of continuous-time random signals or processes. For simulating, we create a filter into which we feed a white noise signal. We choose the filter so that the output signal simulates the 1st and 2nd moments of any arbitrary random process. For whitening, we feed any arbitrary random signal into a specially chosen filter so that the output of the filter is a white noise signal.

Simulating a continuous-time random signal

Figure: White noise fed into a linear, time-invariant filter to simulate the 1st and 2nd moments of an arbitrary random process.

We can simulate any wide-sense stationary, continuous-time random process x(t) with constant mean \mu and covariance function

K_x(\tau) = E\{(x(t_1) - \mu)(x(t_2) - \mu)\}, \qquad \tau = t_1 - t_2

and power spectral density

S_x(\omega) = \int_{-\infty}^{\infty} K_x(\tau)\, e^{-j\omega\tau}\, d\tau

We can simulate this signal using frequency domain techniques. Because K_x(\tau) is Hermitian symmetric and positive semi-definite, it follows that S_x(\omega) is real and can be factored as

S_x(\omega) = H(\omega)\, H^*(\omega) = |H(\omega)|^2

if and only if S_x(\omega) satisfies the Paley–Wiener criterion.

If S_x(\omega) is a rational function, it can then be factored into pole–zero form. Choosing a minimum-phase H(\omega) so that its poles and zeros lie inside the left half s-plane, we can then simulate x(t) with H(\omega) as the transfer function of the following linear, time-invariant filter:

\hat{x}(t) = \mu + \int_{-\infty}^{\infty} h(t - t')\, w(t')\, dt'

where h(t) is the impulse response corresponding to H(\omega) and w(t) is a continuous-time white-noise signal with the following 1st and 2nd moment properties:

E\{w(t)\} = 0

R_{ww}(t_1, t_2) = E\{w(t_1)\, w(t_2)\} = \delta(t_1 - t_2)

Thus, the resultant signal \hat{x}(t) has the same 2nd moment properties as the desired signal x(t).
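As a discrete-time illustration (a NumPy sketch under the assumption of an exponential covariance K_x(τ) = e^(−a|τ|), so that S_x(ω) = 2a/(a² + ω²) and the minimum-phase factor is H(s) = √(2a)/(s + a); the parameter values are arbitrary), white noise driven through a forward-Euler discretization of this shaping filter reproduces the target second-moment properties:

    import numpy as np

    rng = np.random.default_rng(0)

    # Target: K_x(tau) = exp(-a|tau|), i.e. S_x(w) = 2a/(a^2 + w^2),
    # which factors with minimum-phase H(s) = sqrt(2a)/(s + a).
    a = 2.0
    dt = 1e-3
    n = 200_000

    # Discrete-time approximation of w(t): zero mean, variance 1/dt,
    # so that its power spectral density is approximately 1.
    w = rng.normal(0.0, 1.0 / np.sqrt(dt), size=n)

    # Forward-Euler discretization of the shaping filter dx/dt = -a x + sqrt(2a) w(t).
    x = np.zeros(n)
    for k in range(n - 1):
        x[k + 1] = x[k] + dt * (-a * x[k] + np.sqrt(2 * a) * w[k])

    # Empirical variance should approach K_x(0) = 1.
    print(x[n // 2:].var())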

Whitening a continuous-time random signal

Figure: An arbitrary random process x(t) fed into a linear, time-invariant filter that whitens x(t) to create white noise at the output.

Suppose we have a wide-sense stationary, continuous-time random process defined with the same mean \mu, covariance function K_x(\tau), and power spectral density S_x(\omega) as above. We can whiten this signal using frequency domain techniques. We factor the power spectral density S_x(\omega) as described above. Choosing the minimum-phase H(\omega) so that its poles and zeros lie inside the left half s-plane, we can then whiten x(t) with the following inverse filter:

H_{inv}(\omega) = \frac{1}{H(\omega)}

We choose the minimum-phase filter so that the resulting inverse filter is stable. Additionally, we must be sure that H(\omega) is strictly positive for all \omega so that H_{inv}(\omega) does not have any singularities. The final form of the whitening procedure is as follows:

w(t) = \int_{-\infty}^{\infty} h_{inv}(t - t')\, (x(t') - \mu)\, dt'

so that w(t) is a white noise random process with zero mean and constant, unit power spectral density

S_w(\omega) = 1

Note that this power spectral density corresponds to a delta function for the covariance function of w(t).
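A matching sketch of the inverse (whitening) filter, again assuming the exponential-covariance process used above and approximating the derivative with a finite difference (the parameter values and the exact discrete-time recursion used to generate the test signal are illustration choices):

    import numpy as np

    rng = np.random.default_rng(0)

    # Process with K_x(tau) = exp(-a|tau|), generated here only for demonstration
    # (exact discrete-time recursion for the Ornstein-Uhlenbeck process).
    a, dt, n = 2.0, 1e-3, 200_000
    phi = np.exp(-a * dt)
    x = np.zeros(n)
    for k in range(n - 1):
        x[k + 1] = phi * x[k] + np.sqrt(1 - phi**2) * rng.standard_normal()

    # Inverse (whitening) filter H_inv(s) = (s + a) / sqrt(2a), with the
    # derivative approximated by a finite difference: w = (dx/dt + a x) / sqrt(2a).
    dxdt = np.diff(x) / dt
    w = (dxdt + a * x[:-1]) / np.sqrt(2 * a)

    # w should be approximately white: autocorrelation concentrated at lag 0.
    w0 = w - w.mean()
    print([np.mean(w0[:-k] * w0[k:]) / w0.var() for k in (1, 2, 3)])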

In music
White noise, pink noise, and brown noise are used as percussion in 8-bit (chiptune) music. They are also used in electronic music, such as trance and house, to create "sweeps".

Gaussian noise
From Wikipedia, the free encyclopedia

Gaussian noise is statistical noise that has a probability density function equal to that of the normal distribution, which is also known as the Gaussian distribution. In other words, the values that the noise can take on are Gaussian-distributed. A special case is white Gaussian noise, in which the values at any pair of times are statistically independent (and uncorrelated). In applications, Gaussian noise is most commonly used as additive white noise to yield additive white Gaussian noise.

Explanation
Gaussian noise is properly defined as the noise with a Gaussian amplitude distribution. This says nothing of the correlation of the noise in time or of the spectral density of the noise. Labeling Gaussian noise as 'white' describes the correlation of the noise. It is necessary to use the term "white Gaussian noise" to be precise. Gaussian noise is sometimes misunderstood to be white Gaussian noise, but this is not the case.
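A short NumPy illustration of the distinction (the 8-tap moving average is an arbitrary choice): averaging Gaussian noise keeps the amplitudes Gaussian, but introduces correlation, so the result is Gaussian yet no longer white:

    import numpy as np

    rng = np.random.default_rng(0)

    # White Gaussian noise: Gaussian amplitudes AND uncorrelated samples.
    white_gaussian = rng.standard_normal(100_000)

    # A moving average of Gaussian noise is still Gaussian (sums of Gaussians are
    # Gaussian) but no longer white: neighbouring samples are correlated.
    correlated_gaussian = np.convolve(white_gaussian, np.ones(8) / 8, mode="valid")

    def lag1_corr(x):
        x = x - x.mean()
        return np.mean(x[:-1] * x[1:]) / x.var()

    print(lag1_corr(white_gaussian))        # ~ 0   (white)
    print(lag1_corr(correlated_gaussian))   # ~ 7/8 (Gaussian but not white)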

Gaussian process
From Wikipedia, the free encyclopedia

In probability theory and statistics, a Gaussian process is a stochastic process whose realisations consist of random values associated with every point in a range of times (or of space) such that each such random variable has a normal distribution. Moreover, every finite collection of those random variables has a multivariate normal distribution. Gaussian processes are important in statistical modelling because of properties inherited from the normal distribution. For example, if a random process is modelled as a Gaussian process, the distributions of various derived quantities can be obtained explicitly. Such quantities include the average value of the process over a range of times, and the error in estimating the average using sample values at a small set of times.

Contents

1 Definition
2 History
3 Alternative definitions
4 Important Gaussian processes
5 Applications
6 See also
7 Notes
8 External links
    8.1 Video tutorials

Definition
A Gaussian process is a stochastic process \{X_t ; t \in T\} for which any finite linear combination of samples will be normally distributed (or, more generally, any linear functional applied to the sample function X_t will give a normally distributed result). Some authors[1] also assume the random variables X_t have mean zero.

History
The concept is named after Carl Friedrich Gauss simply because the normal distribution is sometimes called the Gaussian distribution, although Gauss was not the first to study that distribution: see history of the normal/Gaussian distribution.

Alternative definitions
Alternatively, a process is Gaussian if and only if for every finite set of indices t_1, \ldots, t_k in the index set T,

(X_{t_1}, \ldots, X_{t_k})

is a vector-valued Gaussian random variable. Using characteristic functions of random variables, the Gaussian property can be formulated as follows: \{X_t ; t \in T\} is Gaussian if and only if, for every finite set of indices t_1, \ldots, t_k, there are real numbers \sigma_{\ell j} with \sigma_{ii} > 0 and real numbers \mu_j such that

E\left[\exp\left(i \sum_{\ell} t_\ell X_{t_\ell}\right)\right] = \exp\left(-\tfrac{1}{2} \sum_{\ell, j} \sigma_{\ell j} t_\ell t_j + i \sum_{\ell} \mu_\ell t_\ell\right)

The numbers \sigma_{\ell j} and \mu_j can be shown to be the covariances and means of the variables in the process.[2]

Important Gaussian processes


The Wiener process is perhaps the most widely studied Gaussian process. It is not stationary, but it has stationary increments.
The Ornstein–Uhlenbeck process is a stationary Gaussian process.
The Brownian bridge is a Gaussian process whose increments are not independent.
The fractional Brownian motion is a Gaussian process whose covariance function is a generalisation of that of the Wiener process.

Applications
A Gaussian process can be used as a prior probability distribution over functions in Bayesian inference.[3][4] (Given any set of N points in the desired domain of your functions, take a multivariate Gaussian whose covariance matrix parameter is the Gram matrix of your N points with some desired kernel, and sample from that Gaussian.) Inference of continuous values with a Gaussian process prior is known as Gaussian process regression, or Kriging.[5] Gaussian processes are also useful as a powerful non-linear interpolation tool.
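As a sketch of the sampling recipe described above (assuming NumPy; the squared-exponential kernel and its length scale are illustrative choices), drawing from the multivariate Gaussian defined by the Gram matrix yields sample functions from the GP prior:

    import numpy as np

    rng = np.random.default_rng(0)

    # Points at which to sample the prior, and a squared-exponential (RBF) kernel.
    xs = np.linspace(0.0, 10.0, 200)
    length_scale = 1.5                                     # illustrative hyperparameter

    diff = xs[:, None] - xs[None, :]
    gram = np.exp(-0.5 * (diff / length_scale) ** 2)       # Gram (covariance) matrix

    # Draw three function samples from the zero-mean GP prior: each sample is one
    # draw from a multivariate Gaussian whose covariance is the Gram matrix.
    jitter = 1e-10 * np.eye(xs.size)                       # numerical stabilizer
    samples = rng.multivariate_normal(np.zeros(xs.size), gram + jitter, size=3)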
