
# Lecture 5

Dirac delta functions, characteristic functions, and the law of large numbers
## Dirac delta functions
Let's consider the following expression:

$$\int dx\, \delta(x-a)\, g(x) = g(a)$$

The center of the delta function is $a$. Now, let's define a function whose derivative is the delta function, i.e.

$$\frac{d\Theta(x-a)}{dx} = \delta(x-a)$$

The function $\Theta$ can be written as

$$\Theta(x-a) = \int_{-\infty}^{x} dx'\, \delta(x'-a)$$
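As a numerical sketch (my own illustration, not part of the lecture), we can represent $\delta(x-a)$ as a very narrow normalized Gaussian and check both properties above: the sifting property $\int dx\,\delta(x-a)g(x) = g(a)$, and that the running integral of the delta approximates the step function $\Theta(x-a)$:

```python
import numpy as np

# Represent delta(x - a) as a narrow normalized Gaussian (width eps -> 0).
eps = 1e-3
a = 2.0
x = np.linspace(-10.0, 10.0, 200_001)
dx = x[1] - x[0]
delta = np.exp(-(x - a) ** 2 / (2 * eps**2)) / (eps * np.sqrt(2 * np.pi))

# Sifting property: integral of delta(x - a) * g(x) dx ~= g(a).
g = np.cos  # any smooth test function
sift = np.sum(delta * g(x)) * dx
print(sift, g(a))  # both close to cos(2.0) ~= -0.416

# Step function: Theta(x - a) = cumulative integral of delta up to x.
theta = np.cumsum(delta) * dx
print(theta[x < a - 0.1].max(), theta[x > a + 0.1].min())  # ~0 and ~1
```

In the exact limit the Gaussian width goes to zero; any small `eps` much narrower than the scale on which `g` varies gives the same answer.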

The integral is zero at values of $x$ less than $a$ and one at values greater than or equal to $a$; $\Theta$ is called a step function. Recall that we were interested in $P_f(f)$, the probability density associated with a function $f(x_1,\dots,x_N)$ if we already know $P(x_1,\dots,x_N)$. The probability that the function $f(x_1,\dots,x_N)$ is less than some value $f_0$ is

$$\Pr(f < f_0) = \int dx_1 \cdots \int dx_N\, P(x_1,\dots,x_N)\, \Theta\!\left(f_0 - f(x_1,\dots,x_N)\right) = \left\langle \Theta\!\left(f_0 - f(x_1,\dots,x_N)\right) \right\rangle$$

The relationship between the probability density $P_f(f)$ and the cumulative probability $\Pr(f < f_0)$ is

$$P_f(f_0) = \frac{d}{df_0} \Pr(f < f_0) = \left\langle \delta\!\left(f_0 - f(x_1,\dots,x_N)\right) \right\rangle$$
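These averages can be estimated by sampling. In this hypothetical sketch (my own example, not from the lecture), $x_1$ and $x_2$ are drawn uniformly from $[0,1]$ and $f = x_1 + x_2$; the cumulative probability $\Pr(f < f_0) = \langle \Theta(f_0 - f) \rangle$ is just the sample average of the step function, and for $f_0 \le 1$ the exact answer is $f_0^2/2$:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 200_000

# Sample (x1, x2) from P(x1, x2) = uniform density on the unit square.
x1 = rng.random(n_samples)
x2 = rng.random(n_samples)
f = x1 + x2  # the function f(x1, x2)

# Pr(f < f0) = <Theta(f0 - f)>: average of the step function over samples.
f0 = 0.8
pr = np.mean(f < f0)
print(pr, f0**2 / 2)  # Monte Carlo estimate vs exact value 0.32
```

The density $P_f(f_0)$ could then be obtained by differentiating this estimate with respect to $f_0$, e.g. via a histogram of the sampled $f$ values.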

The last step is a general rule from calculus: differentiating the step function with respect to $f_0$ produces the delta function.

## The Gaussian (continuous) distribution

The normal distribution is

$$P(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$

where $x$ is a variable, and $\mu$ and $\sigma$ are parameters. The Gaussian distribution is the limiting case of the binomial distribution for $N \gg 1$ and $Np \gg 1$. The binomial distribution is

$$P(n) = \frac{N!}{n!\,(N-n)!}\, p^n (1-p)^{N-n}$$

For the Poisson distribution, $N \gg 1$ while $Np$ is finite.

## Characteristic functions

Let's insert the spectral representation of the Dirac delta function,

$$\delta(x) = \int_{-\infty}^{\infty} \frac{dk}{2\pi}\, e^{ikx}$$

in the expression above:

$$P_f(f_0) = \left\langle \int_{-\infty}^{\infty} \frac{dk}{2\pi}\, e^{ik\left(f_0 - f(x_1,\dots,x_N)\right)} \right\rangle = \int_{-\infty}^{\infty} \frac{dk}{2\pi}\, e^{ikf_0} \left\langle e^{-ik f(x_1,\dots,x_N)} \right\rangle$$

First, we pull a few terms out of the expectation value because they do not depend on $x_1,\dots,x_N$. The new expectation value on the right is called the characteristic function and can be viewed as the Fourier transform of the probability density. It is given by

$$\chi(k) = \left\langle e^{-ikf} \right\rangle = \int df\, e^{-ikf}\, P_f(f)$$

An important question to ask is what the expectation value is averaged over, i.e. which probability density the average is taken with respect to.

Fourier transforms connect many physical quantities in a complementary manner. In scattering experiments, for example, the scattering function or structure factor $S(k)$ depends on the wave vector $k$, whereas its Fourier transform, the density of particles $\rho(x)$, depends on position $x$. In spectroscopy, the spectral lineshape $I(\omega)$ depends on the angular frequency $\omega$, whereas its Fourier transform, the relaxation function $\phi(t)$, depends on time $t$ and tells us about the dynamics of a system.
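As a numerical check (my own sketch, not from the lecture), we can estimate the characteristic function $\langle e^{-ikx} \rangle$ of a Gaussian directly from samples and compare it with the known closed form $e^{-ik\mu - k^2\sigma^2/2}$:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 1.0, 2.0
x = rng.normal(mu, sigma, 500_000)

# Characteristic function: the expectation value <exp(-i k x)>,
# i.e. the Fourier transform of the probability density P(x).
k = 0.7
chi_sampled = np.mean(np.exp(-1j * k * x))

# For a Gaussian the characteristic function is known in closed form.
chi_exact = np.exp(-1j * k * mu - (k * sigma) ** 2 / 2)
print(abs(chi_sampled - chi_exact))  # small sampling error
```

The agreement illustrates that averaging $e^{-ikx}$ over samples of $P(x)$ is the same as Fourier-transforming the density itself.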

We also know that the Gaussian distribution is normalized:

$$\int_{-\infty}^{\infty} dx\, P(x) = 1$$

## The law of large numbers

Consider $N$ identically distributed random variables that are independent, $\{x_1, \dots, x_N\}$. Let's define $y$ to be the (normalized) sum of these variables,

$$y = \frac{1}{N} \sum_{i=1}^{N} x_i$$

Now consider $M$ sets of values of $x_1,\dots,x_N$. This produces $M$ values of $y$:

$$y^{(1)},\; y^{(2)},\; \dots,\; y^{(M)}$$

An important question to ask is: how is $y$ distributed across the $M$ different samples? For a set of independent variables the probability density can be factorized as follows:

$$P(x_1,\dots,x_N) = P_1(x_1)\, P_2(x_2) \cdots P_N(x_N)$$

The mean is

$$\langle x \rangle = \int dx\, x\, P(x)$$

The law of large numbers basically says the following: as $N$ approaches infinity, all $y^{(i)}$ will be equal. That is, for any realization of $N$ random numbers $\{x_1,\dots,x_N\}$,

$$y \to \langle x \rangle \quad \text{as } N \to \infty$$

We can use this law in situations where the system is identically distributed and independent, i.e. where the probability density is factorizable.
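As a quick numerical illustration (my own sketch, not part of the lecture), draw $M$ realizations of $y$ for increasing $N$, with each $x_i$ uniform on $[0,1]$ so that $\langle x \rangle = 1/2$; the spread of the $y^{(i)}$ shrinks and they cluster at $\langle x \rangle$:

```python
import numpy as np

rng = np.random.default_rng(2)
M = 500  # number of independent realizations of y

def y_values(N):
    """M realizations of y = (1/N) * sum of N iid uniform(0,1) variables."""
    return rng.random((M, N)).mean(axis=1)

# <x> = 0.5 for the uniform distribution on [0, 1].
for N in (10, 100, 10_000):
    y = y_values(N)
    print(N, y.mean(), y.std())  # spread shrinks like 1/sqrt(N)
```

For $N = 10$ the values $y^{(i)}$ still scatter noticeably around $0.5$; by $N = 10{,}000$ they are nearly all equal to $\langle x \rangle$, as the law of large numbers predicts.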