
CMA 2007

Applications of free probability and random matrix


theory

Øyvind Ryan

December 2007

Øyvind Ryan Applications of free probability and random matrix theory


CMA 2007 Applications of free probability and random matrix theory

Some important concepts from classical probability

Random variables are functions (i.e. they commute w.r.t. multiplication), with a given p.d.f. (denoted f)
Expectation (denoted E ) is integration
Independence
Additive convolution (∗) and the logarithm of the Fourier
transform
Multiplicative convolution
Central limit law, with special role of the Gaussian law
Poisson distribution Pc : the limit of ((1 − c/n)δ(0) + (c/n)δ(1))^∗n as n → ∞.
Divisibility: For a given a, find i.i.d. b1 , ..., bn such that
fa = fb1 +···+bn .
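As a quick sanity check of the Poisson limit above (my own illustration, not part of the slides): the n-fold additive convolution of (1 − c/n)δ(0) + (c/n)δ(1) is the Binomial(n, c/n) distribution, which approaches Poisson(c) as n grows. A minimal Python sketch:

```python
import math

# The n-fold additive convolution of (1 - c/n) delta(0) + (c/n) delta(1)
# is Binomial(n, c/n); as n grows it approaches Poisson(c).
def binomial_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, c):
    return math.exp(-c) * c**k / math.factorial(k)

c, n = 1.5, 10000
gaps = [abs(binomial_pmf(k, n, c / n) - poisson_pmf(k, c)) for k in range(6)]
print(max(gaps))  # small: the convolution powers approach Poisson(c)
```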


Can we find a more general theory, where the random variables are matrices (or, more generally, operators), with their eigenvalue distribution (or spectrum) taking the role of the p.d.f.?
What are the analogues to the above mentioned concepts for
this theory?
What are the applications of such a theory?


Free probability

Free probability was developed as a probability theory for random variables which do not commute, such as matrices.
The random variables are elements in a unital ∗-algebra
(denoted A), typically B(H), or Mn (C).
Expectation (denoted φ) is a normalized linear functional on A.
The pair (A, φ) is called a noncommutative probability space.
For matrices, φ will be the normalized trace trn , defined by

    trn (a) = (1/n) ∑_{i=1}^{n} a_ii .

For random matrices, we set φ(a) = τn (a) = E (trn (a)).
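A small NumPy sketch (an illustration, not from the slides) of the two functionals trn and τn = E ∘ trn:

```python
import numpy as np

rng = np.random.default_rng(0)

def trn(a):
    """Normalized trace tr_n(a) = (1/n) sum_i a_ii."""
    return np.trace(a) / a.shape[0]

def tau(samples):
    """tau_n(a) = E(tr_n(a)), estimated by averaging over realizations."""
    return float(np.mean([trn(a) for a in samples]))

n = 4
samples = [rng.standard_normal((n, n)) for _ in range(2000)]
print(trn(np.eye(n)))   # phi is normalized: phi(1) = 1
print(tau(samples))     # zero-mean entries, so tau_n is close to 0
```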


What is the "central limit" for large matrices?


We will attempt to make a connection with classical probability
through large random matrices. We would like to define random
matrices as "independent" if all entries in one are independent from
all entries in the other.
Assume that X1 , ..., Xm are n × n i.i.d. complex matrices, with τn (Xi ) = 0 and τn (Xi²) = 1. What is the limit of

    (X1 + · · · + Xm )/√m

as m → ∞?
If Xi = (1/√n) Yi , where Yi has i.i.d. complex standard Gaussian entries, then

    (X1 + · · · + Xm )/√m ∼ X,

where X = (1/√n) Y and Y has i.i.d. complex standard Gaussian entries. Therefore, matrices with complex standard Gaussian entries are central limit candidates.
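The claimed stability can be checked numerically. A hedged NumPy sketch (my own, assuming "standard complex Gaussian" means E|y|² = 1, i.e. y = (g1 + i·g2)/√2):

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 50, 200

def standard_complex_gaussian(n, rng):
    # entries y = (g1 + i*g2)/sqrt(2), so E|y|^2 = 1  (assumed convention)
    return (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)

# X_i = Y_i/sqrt(n); the rescaled sum is sum_i Y_i / sqrt(n*m)
S = sum(standard_complex_gaussian(n, rng) for _ in range(m)) / np.sqrt(n * m)
# S has the same distribution as a single X = Y/sqrt(n);
# in particular tau_n(S S^H) should be close to 1
second_moment = float(np.trace(S @ S.conj().T).real / n)
print(second_moment)
```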

The full circle law

What happens when n is large? The eigenvalues converge to what is called the full circle law, shown here for n = 500.

[Scatter plot of the 500 eigenvalues in the complex plane; the points fill the unit disk.]

plot(eig((1/sqrt(1000)) * (randn(500,500) + j*randn(500,500))), 'kx')

The semicircle law

[Histogram of the 1000 eigenvalues over 40 bins; the semicircle shape on [−2, 2] is clearly visible.]

A = (1/sqrt(2000)) * (randn(1000,1000) + j*randn(1000,1000));
A = (sqrt(2)/2)*(A+A');
hist(eig(A),40)

The Marchenko-Pastur law

What happens with the eigenvalues of (1/N)XX^H when X is an n × N random matrix with standard complex Gaussian entries?
The eigenvalue distribution converges to the Marchenko-Pastur law with parameter c = n/N, denoted µn/N .
Let f µc be the p.d.f. of µc . Then

    f µc (x) = (1 − 1/c)⁺ δ(x) + √((x − a)⁺ (b − x)⁺) / (2πcx),   (1)

where (z)⁺ = max(0, z), a = (1 − √c)² and b = (1 + √c)².
The matrices (1/N)XX^H occur most frequently as sample covariance matrices: N is the number of observations, and n is the number of parameters in the system.
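For illustration (a sketch of mine, not from the slides), the continuous part of the density (1) can be coded directly and checked to integrate to 1 when c ≤ 1 (no atom at 0 in that case):

```python
import numpy as np

def mp_density(x, c):
    """Continuous part of the Marchenko-Pastur density, parameter c = n/N."""
    a = (1 - np.sqrt(c)) ** 2
    b = (1 + np.sqrt(c)) ** 2
    num = np.sqrt(np.maximum(x - a, 0.0) * np.maximum(b - x, 0.0))
    return num / (2 * np.pi * c * x)

c = 0.5
x = np.linspace(1e-9, (1 + np.sqrt(c)) ** 2, 200001)
f = mp_density(x, c)
mass = float(np.sum(0.5 * (f[:-1] + f[1:]) * np.diff(x)))  # trapezoid rule
print(mass)  # no atom at 0 for c <= 1, so the mass is close to 1
```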


Four different Marchenko-Pastur laws µn/N are drawn.

[Plot of the densities for c = 0.5, 0.2, 0.1 and 0.05; as c decreases, the density concentrates around 1.]


Derivation of the limiting distribution for (1/N)XX^H

When x is standard complex Gaussian, we have that E|x|^{2p} = p!.
A more general statement concerns the random matrix (1/N)XX^H, where X is an n × N random matrix with independent standard complex Gaussian entries. It is known [HT] that

    τn ( ((1/N)XX^H)^p ) = (1/(N^p n)) ∑_{π∈Sp} N^{k(π̂)} n^{l(π̂)},

where π̂ is a permutation in S2p constructed in a certain way from π, and k(π̂), l(π̂) are functions taking values in {0, 1, 2, ...}.


One can show that the dominating terms in this sum are those for which π̂ is noncrossing, i.e. π̂ ∈ NC2p : the contribution from the remaining permutations can be organized as a power series in 1/N² (with coefficients ak ), and vanishes in the limit.
The convergence is "almost sure", which means that we have very accurate eigenvalue prediction when the matrices are large.
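One consequence worth verifying numerically (my own illustration): for n = N, i.e. c = 1, the limiting moments of (1/N)XX^H are the Catalan numbers, which count the noncrossing elements among the permutations. A NumPy sketch with a fixed seed:

```python
import math
import numpy as np

rng = np.random.default_rng(2)
n = N = 400                      # c = n/N = 1
X = (rng.standard_normal((n, N)) + 1j * rng.standard_normal((n, N))) / np.sqrt(2)
W = X @ X.conj().T / N           # (1/N) X X^H

def trn(a):
    return float(np.trace(a).real / a.shape[0])

def catalan(p):
    return math.comb(2 * p, p) // (p + 1)

moments, Wp = [], np.eye(n)
for p in range(1, 5):
    Wp = Wp @ W
    moments.append(trn(Wp))
print(moments)                   # close to the Catalan numbers 1, 2, 5, 14
```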


Motivation for free probability


One can show that, for the Gaussian random matrices we considered, the limits

    φ(A^{i1} B^{j1} · · · A^{il} B^{jl}) = lim_{n→∞} trn (An^{i1} Bn^{j1} · · · An^{il} Bn^{jl})

exist. If we extend φ linearly to all polynomials in A and B, the following can be shown:
Theorem
If Pi , Qi are polynomials in A and B respectively, with 1 ≤ i ≤ l ,
and φ(Pi (A)) = 0, φ(Qi (B)) = 0 for all i, then

φ (P1 (A)Q1 (B) · · · Pl (A)Ql (B)) = 0.

This motivates the definition of freeness, which is the analogue of independence.
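The theorem can be probed numerically with independent self-adjoint Gaussian (Wigner) matrices, which are asymptotically free. A NumPy sketch (an illustration under the normalizations assumed here, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300

def wigner(n, rng):
    # self-adjoint Gaussian matrix normalized so that tr_n(A^2) -> 1
    G = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
    return (G + G.conj().T) / np.sqrt(2 * n)

def trn(a):
    return float(np.trace(a).real / a.shape[0])

A, B = wigner(n, rng), wigner(n, rng)
P = A                                   # phi(P(A)) ~ tr_n(A) ~ 0
Q = B @ B - trn(B @ B) * np.eye(n)      # centered: tr_n(Q(B)) = 0
mixed = trn(P @ Q @ P @ Q)
print(mixed)  # close to 0, as the theorem predicts in the limit
```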

Definition of freeness

Definition
A family of unital ∗-subalgebras (Ai )i∈I is called a free family if

    aj ∈ Aij , i1 ≠ i2 , i2 ≠ i3 , . . . , in−1 ≠ in , and
    φ(a1 ) = φ(a2 ) = · · · = φ(an ) = 0
    imply that φ(a1 · · · an ) = 0.   (2)

A family of random variables ai is called a free family if the algebras


they generate form a free family.


The free central limit theorem

Theorem
If
a1 , ..., an are free and self-adjoint,
φ(ai ) = 0,
φ(ai²) = 1,
supi |φ(aik )| < ∞ for all k,

then the sequence (a1 + · · · + an )/√n converges in distribution to
the semicircle law.
In free probability, the semicircle law thus has the role of the Gaussian law. Its density is (1/2π)√(4 − x²).
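As a check (not in the slides), the even moments of this density are the Catalan numbers, mirroring the role the Gaussian moments play classically; a small Python sketch using the trapezoid rule:

```python
import math

def semicircle_moment(p, steps=200000):
    """Trapezoid-rule integral of x^p * (1/(2*pi)) * sqrt(4 - x^2) on [-2, 2]."""
    h = 4.0 / steps
    total = 0.0
    for i in range(steps + 1):
        x = -2.0 + i * h
        f = x**p * math.sqrt(max(4.0 - x * x, 0.0)) / (2.0 * math.pi)
        w = 0.5 if i in (0, steps) else 1.0
        total += w * f * h
    return total

def catalan(k):
    return math.comb(2 * k, k) // (k + 1)

# even moments m_{2k} against the Catalan numbers C_k
print([round(semicircle_moment(2 * k), 4) for k in range(4)])  # ≈ 1, 1, 2, 5
```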


Similarities between classical and free probability

1 Additive convolution ⊞: the p.d.f. of the sum of free random variables. The role of the logarithm of the Fourier transform is now taken by the R-transform, which satisfies Rµa ⊞µb (z) = Rµa (z) + Rµb (z).
2 The S-transform: a transform on probability distributions which satisfies Sµa ⊠µb (z) = Sµa (z)Sµb (z), playing the corresponding role for multiplicative convolution.
3 Poisson distributions have their analogue in the free Poisson distributions: these are given by the Marchenko-Pastur laws µc with parameter c, which can also be written as the limit of

    ((1 − c/n)δ(0) + (c/n)δ(1))^⊞n

as n → ∞.
4 Infinite divisibility: there exists an analogue of the Lévy-Hinčin formula for infinite divisibility in classical probability.


Main usage of free probability in my papers

Let A and B be random matrices. How can we make a good prediction of the eigenvalue distribution of A when one has the eigenvalue distributions of A + B and B? The simplest case is when one assumes that B is Gaussian (noise). What about the eigenvectors?
Assume that we have the eigenvalue distribution of (1/N)(R + X)(R + X)^H, where R and X are n × N random matrices, with X Gaussian. If the columns of R are realizations of some random vector r, what is the covariance matrix E(ri rj∗ )?
We make use of multiplicative free convolution with the Marchenko-Pastur law, which has an efficient implementation.


Channel capacity estimation

The following is a much used observation model in MIMO systems:

    Ĥi = (1/√n)(H + σXi ),   (3)

where
n is the number of receiving and transmitting antennas,
Ĥi is the n × n measured MIMO matrix,
H is the n × n MIMO channel and
Xi is the n × n noise matrix with i.i.d. zero mean, unit variance Gaussian entries.


Channel capacity estimation

With free probability we can estimate the eigenvalues of (1/n)HH^H based on a few observations Ĥi . This helps us estimate the channel capacity:
The capacity of a channel with channel matrix H and signal to noise ratio ρ = 1/σ² is given by

    C = (1/n) log det(I + (1/(nσ²)) HH^H)   (4)
      = (1/n) ∑_{l=1}^{n} log(1 + λl /σ²),   (5)

where λl are the eigenvalues of (1/n)HH^H.
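Formulas (4) and (5) compute the same number, since log det is the sum of the logarithms of the eigenvalues; a NumPy sketch (illustrative values for n and σ², my own):

```python
import numpy as np

rng = np.random.default_rng(4)
n, sigma2 = 8, 0.1               # illustrative size and noise level
H = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)
HHh = H @ H.conj().T / n         # (1/n) H H^H

# (4): C = (1/n) log det(I + (1/(n sigma^2)) H H^H)
sign, logdet = np.linalg.slogdet(np.eye(n) + HHh / sigma2)
C_det = float(logdet) / n

# (5): the same number from the eigenvalues of (1/n) H H^H
lam = np.linalg.eigvalsh(HHh)
C_eig = float(np.mean(np.log1p(lam / sigma2)))

print(C_det, C_eig)  # identical up to round-off
```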


Observation model

Form the compound observation matrix

    Ĥ1...L = (1/√L) [Ĥ1 , Ĥ2 , ..., ĤL ],

from the observations

    Ĥi = (1/√n)(H + σXi ).   (6)

Using free probability, one can with high accuracy estimate the eigenvalues of (1/n)HH^H from the eigenvalues of Ĥ1...L Ĥ1...L^H.
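A noise-free sanity check (σ = 0, my own illustration): the compound matrix then reproduces the eigenvalues of (1/n)HH^H exactly, by construction:

```python
import numpy as np

rng = np.random.default_rng(5)
n, L = 6, 4
H = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2)

# sigma = 0 in (6): each observation is Hhat_i = H / sqrt(n)
Hc = np.hstack([H / np.sqrt(n) for _ in range(L)]) / np.sqrt(L)  # compound matrix
lhs = np.sort(np.linalg.eigvalsh(Hc @ Hc.conj().T))
rhs = np.sort(np.linalg.eigvalsh(H @ H.conj().T / n))
gap = float(np.max(np.abs(lhs - rhs)))
print(gap)  # 0 up to round-off: the compound matrix reproduces (1/n) H H^H
```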


Free capacity estimation for channel matrices of various rank

[Plot of the true capacity and the free probability based estimate Cf versus the number of observations, for ranks 3, 5 and 6.]

Figure: The free probability based estimator for various numbers of observations. σ² = 0.1 and n = 10. The rank of H was 3, 5 and 6.


Application areas

digital communications,
nuclear physics,
mathematical finance
Situations in these fields can often be modelled with random
matrices. When the matrices get large, free probability theory is an
invaluable tool for describing the asymptotic behaviour of many
systems.
Other types of matrices which are of interest are random unitary
matrices and random Vandermonde matrices.


List of papers
Free Deconvolution for Signal Processing Applications.
Submitted to IEEE Trans. Inform. Theory.
arxiv.org/cs.IT/0701025.
Multiplicative free Convolution and Information-Plus-Noise
Type Matrices. Submitted to Ann. Appl. Probab.
arxiv.org/math.PR/0702342.
Channel Capacity Estimation using Free Probability Theory. Submitted to IEEE Trans. Signal Process. arxiv.org/abs/0707.3095.
Random Vandermonde Matrices, Part I: Fundamental Results. Work in progress.
Random Vandermonde Matrices, Part II: Applications to wireless communications. Work in progress.
Applications of free probability in finance. Estimation of the covariance matrix itself (not only its eigenvalue distribution). 2008.

References

[HT]: "Random Matrices and K-theory for Exact C∗-algebras". U. Haagerup and S. Thorbjørnsen. citeseer.ist.psu.edu/114210.html. 1998.
This talk is available at http://heim.ifi.uio.no/~oyvindry/talks.shtml.
My publications are listed at http://heim.ifi.uio.no/~oyvindry/publications.shtml.
