
May 28, 2015 15:20 Statistical Physics 9in x 6in b2143 page 3

Chapter 1

Basic Concepts and Tools in Statistical Physics

1.1 Introduction

Statistical mechanics provides general methods to study the properties of systems composed of a large number of particles. It establishes general formulas connecting physical parameters to various physical quantities. From these formulas, one can deduce the properties of a system if one knows the system's parameters. In general, the microscopic mechanisms leading to interaction parameters are provided by quantum mechanics. Statistical mechanics and quantum mechanics are at the heart of modern physics, which has allowed a microscopic understanding of the properties of matter as well as spectacular technological innovations that have radically changed our daily life over the past 50 years.
While quantum mechanics searches for microscopic structures and mechanisms which are not perceptible at the macroscopic scale, statistical mechanics studies the macroscopic properties of systems composed of a large number of particles, using information provided by quantum mechanics on interaction parameters. The progress made in the 20th century in the theory of condensed matter shows the power of methods borrowed from statistical physics.
Statistical physics of systems at equilibrium is based on a single postulate, called the fundamental postulate, introduced in the case of an isolated system at equilibrium. Using this postulate, one studies in chapter 2 the properties of isolated systems and recovers results from thermodynamics. Moreover, as one sees in chapters 3 and 4, one can also study the properties of non-isolated systems in some particular situations, such as systems maintained at a constant temperature, and systems maintained at a constant temperature and a constant chemical potential, where the fundamental postulate


4 Statistical Physics Fundamentals and Application to Condensed Matter

can be used in each case to calculate the probability of a microscopic state,


or microstate for short.
In this chapter one recalls basic mathematical tools and definitions which are used throughout this book.

1.2 Combinatory analysis

In the following, one recalls some useful definitions in combinatory analysis.

1.2.1 Number of permutations


Let N be the number of discernible objects. The number of permutations of these N objects is given by

\[ P = N! \quad (1.1) \]

To demonstrate this formula, let us consider an array of N boxes and N objects numbered from 1 to N:
- there are N ways to choose the first object to put into the first box
- there are N − 1 ways to choose the second object to put into the second box, etc.
The total number of different configurations of the objects in the boxes is thus N(N − 1)(N − 2)...1 = N!.
Example: The number of permutations of 3 objects numbered from 1 to 3 is 3! = 6. They are |1|2|3|, |1|3|2|, |2|1|3|, |2|3|1|, |3|1|2|, |3|2|1|.
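As a small illustration (a Python sketch added here for the reader, not part of the original text), the 3! = 6 permutations of the example can be enumerated directly:

```python
import itertools

# Enumerate all permutations of 3 numbered objects: there are 3! = 6 of them.
objects = [1, 2, 3]
perms = list(itertools.permutations(objects))
for p in perms:
    print(p)
print(len(perms))  # 6
```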

1.2.2 Number of arrangements


The number of arrangements of n objects taken among N objects is the number of ways to choose n objects among N objects, one by one, taking into account the order of sorting. This number is given by

\[ A_N^n = \frac{N!}{(N - n)!} \quad (1.2) \]

Example: Consider 4 objects numbered from 1 to 4. One chooses 2 objects among 4. The possible configurations are (1,2), (1,3), (1,4), (2,1), (2,3), (2,4), (3,1), (3,2), (3,4), (4,1), (4,2) and (4,3), where the first and second numbers in the parentheses correspond respectively to the first and second sorting. If one takes into account the sorting order, then (1,2) and

(2,1) are considered to be different. There are thus 12 arrangements, as given by A_4^2 = 4!/(4 − 2)! = 12.
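The count of this example can be checked with a short sketch (illustrative, not from the original text); ordered choices of 2 objects among 4 are exactly Python's `itertools.permutations` of length 2:

```python
import itertools
import math

# Ordered choices of n = 2 objects among N = 4: A_N^n = N!/(N - n)!
arrangements = list(itertools.permutations([1, 2, 3, 4], 2))
print(len(arrangements))                           # 12
print(math.factorial(4) // math.factorial(4 - 2))  # 12, from Eq. (1.2)
```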

1.2.3 Number of combinations


The number of combinations of n objects among N objects is the number of ways to choose n objects among N objects, without taking into account the sorting order. This number is given by

\[ C_N^n = \frac{N!}{(N - n)!\, n!} \quad (1.3) \]

One sees that C_N^n is equal to A_N^n divided by the number of permutations of n objects: C_N^n = A_N^n / n!. In the example given above, if (1,2) and (2,1) are counted only once, and the same for the other similar pairs, then one has 6 combinations, namely C_4^2 = 6.
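The same example can be verified numerically (an illustrative sketch, not part of the original text), this time with unordered choices:

```python
import itertools
import math

# Unordered choices of n = 2 objects among N = 4: C_N^n = N!/[(N - n)! n!]
combinations = list(itertools.combinations([1, 2, 3, 4], 2))
print(combinations)       # [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
print(len(combinations))  # 6
print(math.comb(4, 2))    # 6, from Eq. (1.3)
```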

1.3 Probability

Statistical mechanics is based on a probabilistic concept. As one sees in the following, the value of a physical quantity experimentally observed is interpreted as its most probable value. In terms of probabilities, this value corresponds to the maximum of the probability to find this quantity. One sees below that the observed quantity is nothing but the mean value resulting from the statistical average over microstates.
Hereafter, one recalls the main properties of probability and presents some frequently used probability laws.

1.3.1 Definition
One distinguishes two cases: discrete events and continuous events.

1.3.1.1 Ensemble of discrete events


Discrete events can be distinguished from each other without error or ambiguity. Examples include rolling a die or flipping a coin, in which the result of each event is independent of the previous one.
Let us realize N experiments. Each experiment gives a result. Let n_i be the number of times that an event of the type i occurs among the N results. The probability of that event is defined by

\[ P_i = \lim_{N \to \infty} \frac{n_i}{N} \quad (1.4) \]

Example: One flips a coin N times. One obtains n_1 heads and n_2 tails. If N is very large, one expects n_1 = n_2 = N/2. The ratios n_1/N and n_2/N are called the probabilities to obtain heads and tails, respectively.

Remark: For all i, 0 ≤ P_i ≤ 1.
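The frequency definition (1.4) can be illustrated by a simulation (a sketch added here, not part of the original text): the ratio n_1/N approaches 1/2 as N grows.

```python
import random

# Estimate P(heads) as n1/N for increasing N; the ratio approaches 1/2.
random.seed(42)
for N in (100, 10_000, 1_000_000):
    n1 = sum(random.random() < 0.5 for _ in range(N))
    print(N, n1 / N)
```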

1.3.1.2 Ensemble of continuous events – Density of probability


Variables in continuous events are always given with an uncertainty (error).
Example: The length of a table is measured with an uncertainty. Let ΔP(x) be the probability that the length of the table belongs to the interval [x, x + Δx]. Let Δn(x) be the number of times among N measurements where x belongs to [x, x + Δx]. One has by definition

\[ \Delta P(x) = \lim_{N \to \infty} \frac{\Delta n(x)}{N} \quad (1.5) \]

The density of probability is defined by

\[ W(x) = \lim_{\Delta x \to 0} \frac{\Delta P(x)}{\Delta x} = \frac{dP(x)}{dx} \quad (1.6) \]

The probability of finding x in the interval [x, x + dx] is thus dP(x) = W(x) dx.

Remarks:
1) W(x) ≥ 0
2) For more than one variable, one writes dP(x, y, z) = W(x, y, z) dx dy dz or, for a vector variable, dP(r) = W(r) dr.

1.3.2 Fundamental properties


1.3.2.1 Normalization of probabilities
From their definition, one has for the discrete and continuous cases the following normalization relations

\[ \sum_i P_i = \sum_i \frac{n_i}{N} = 1 \quad (1.7) \]

\[ \int W(x)\, dx = \int dP(x) = \frac{1}{N} \int dn(x) = 1 \quad (1.8) \]

1.3.2.2 Addition rule


Let e_1 and e_2 be two incompatible discrete events of probabilities P(e_1) and P(e_2), respectively. The probability of finding e_1 or e_2 is given by

\[ P(e_1 \text{ or } e_2) = P(e_1) + P(e_2) \quad (1.9) \]

For continuous variables, the probability to find x between a and b is

\[ P(a \leq x \leq b) = \int_a^b W(x)\, dx \quad (1.10) \]

If e_1 and e_2 are not incompatible in N experiments, then one should remove the number of times where e_1 and e_2 simultaneously take place. One has

\[ P(e_1 \text{ or } e_2) = P(e_1) + P(e_2) - P(e_1 \text{ and } e_2) \quad (1.11) \]

1.3.2.3 Multiplication rule


Let e_1 and e_2 be two independent events. The probability to find e_1 and e_2 is given by

\[ P(e_1 \text{ and } e_2) = P(e_1) P(e_2) \quad (1.12) \]

Example: One rolls a die twice. The probability to find face 1 or face 2 is 1/6 + 1/6, and that to find face 1 and face 2 is (1/6)(1/6).

1.3.3 Mean values


One defines the mean value of a quantity f by

\[ \overline{f} = \sum_m P_m f_m \quad (1.13) \]

where f_m is the value of f in the microstate m of probability P_m. The sum is performed over all states.
If the states (or events) are characterized by continuous variables, one has

\[ \overline{f} = \int W(x) f(x)\, dx \quad (1.14) \]

where f(x) is the value of f in the state x.



If a value A occurs many times with different events m (or different states), one can write

\[ \overline{A} = \sum_{A_i} P(A_i)\, A_i \quad (1.15) \]

where P(A_i) is the probability to find A_i, namely

\[ P(A_i) = \sum_{m,\, A_m = A_i} P_m \quad (1.16) \]

where the sum is made only on the events m having A_m = A_i. In the case of a continuous variable, one has

\[ \overline{A} = \int W(A)\, A\, dA \quad (1.17) \]

where W(A) is the density of probability to find A.
The most probable value of A_i (or A) corresponds to the maximum of P(A_i) (or W(A)).
The variance is defined by

\[ (\Delta f)^2 = \overline{\left( f - \overline{f} \right)^2} \quad (1.18) \]

One can also write

\[
(\Delta f)^2 = \sum_m P_m \left( f_m - \overline{f} \right)^2
= \sum_m P_m \left[ f_m^2 + (\overline{f})^2 - 2 f_m \overline{f} \right]
= \overline{f^2} + (\overline{f})^2 - 2\, \overline{f}\, \overline{f}
= \overline{f^2} - (\overline{f})^2 \quad (1.19)
\]

One calls \(\Delta f = \sqrt{(\Delta f)^2}\) the standard deviation. This quantity expresses the dispersion of the statistical distribution. In the same manner, one has for the continuous case

\[ (\Delta A)^2 = \int_{-\infty}^{+\infty} W(A) \left( A - \overline{A} \right)^2 dA = \overline{A^2} - (\overline{A})^2 \quad (1.20) \]
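The identity (1.19) can be checked numerically for a small discrete distribution (an illustrative sketch with arbitrary example values, not part of the original text):

```python
# Check Eq. (1.19): the variance equals the mean of squares minus the
# square of the mean, for a hypothetical discrete distribution.
P = [0.2, 0.5, 0.3]   # probabilities P_m, normalized
f = [1.0, 2.0, 4.0]   # values f_m

f_mean = sum(p * x for p, x in zip(P, f))
var_direct = sum(p * (x - f_mean) ** 2 for p, x in zip(P, f))
var_formula = sum(p * x * x for p, x in zip(P, f)) - f_mean ** 2

print(round(f_mean, 12))       # 2.4
print(round(var_direct, 12))   # 1.24
print(round(var_formula, 12))  # 1.24, same value
```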


1.4 Statistical distributions

1.4.1 Binomial distribution


When there are only two possible outcomes of an experiment, the distribution of the results in a series of experiments follows the binomial law.
Example: flipping a coin has two possible results, heads or tails.
Let P_A and P_B be the probabilities of the two types of outcome. The normalization of probability imposes P_A + P_B = 1. If after N experiments one obtains n times A and N − n times B, then the probability to find n times A and (N − n) times B is given by the multiplication rule

\[ P(N, n) = C_N^n P_A^n P_B^{N-n} \quad (1.21) \]

where the factor C_N^n = N!/[n!(N − n)!] is introduced to take into account the number of combinations in which n times A and (N − n) times B occur.
One shows that P(N, n) is normalized:

\[ \sum_{n=0}^{N} P(N, n) = \sum_{n=0}^{N} C_N^n P_A^n P_B^{N-n} = (P_A + P_B)^N = 1 \quad (1.22) \]

where one has used Newton's binomial formula in the second equality.
One shows in the following some main results:

- the mean value of n is \(\overline{n} = N P_A\):

\[
\overline{n} = \sum_{n=0}^{N} n P(N, n) = \sum_{n=0}^{N} n C_N^n P_A^n P_B^{N-n}
= P_A \frac{\partial}{\partial P_A} \sum_{n=0}^{N} C_N^n P_A^n P_B^{N-n}
= P_A \frac{\partial}{\partial P_A} (P_A + P_B)^N
= P_A N (P_A + P_B)^{N-1} = N P_A \quad (1.23)
\]

where one has used in the last line P_A + P_B = 1.

- the variance:
The variance is defined by \((\Delta n)^2 = \overline{n^2} - (\overline{n})^2\). One calculates \(\overline{n^2}\) as follows:

\[
\overline{n^2} = \sum_{n=0}^{N} n^2 P(N, n) = \sum_{n=0}^{N} n^2 C_N^n P_A^n P_B^{N-n}
= \left( P_A \frac{\partial}{\partial P_A} \right)^2 \sum_{n=0}^{N} C_N^n P_A^n P_B^{N-n}
= \left( P_A \frac{\partial}{\partial P_A} \right)^2 (P_A + P_B)^N
= P_A \frac{\partial}{\partial P_A} \left[ P_A N (P_A + P_B)^{N-1} \right]
= N P_A + N(N-1) P_A^2 \quad (1.24)
\]

where one has used in the last equality P_A + P_B = 1. One finds

\[
(\Delta n)^2 = \overline{n^2} - (\overline{n})^2 = N P_A + N(N-1) P_A^2 - (N P_A)^2
= N P_A (1 - P_A) = N P_A P_B \quad (1.25)
\]

from which the standard deviation is \(\Delta n = \sqrt{N P_A P_B}\).
- the relative uncertainty or relative error:
The relative error on n is given by

\[ \frac{\Delta n}{\overline{n}} = \frac{\sqrt{N P_A P_B}}{N P_A} \propto \frac{\sqrt{N}}{N} = \frac{1}{\sqrt{N}} \quad (1.26) \]

The relative error (or uncertainty) decreases with increasing N. For N = 1000 (which is the standard sample size for polls), \(\Delta n / \overline{n} \simeq 3\%\).
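The results (1.22), (1.23), (1.25) and (1.26) can be verified numerically for N = 1000 and P_A = 1/2 (an illustrative sketch, not part of the original text):

```python
from math import comb, sqrt

# Numerical check of Eqs. (1.22), (1.23), (1.25), (1.26) for a fair coin.
N, pA = 1000, 0.5
pB = 1 - pA
P = [comb(N, n) * pA**n * pB**(N - n) for n in range(N + 1)]

norm = sum(P)
mean = sum(n * p for n, p in zip(range(N + 1), P))
var = sum(n * n * p for n, p in zip(range(N + 1), P)) - mean**2

print(round(norm, 9))   # 1.0     : normalization (1.22)
print(round(mean, 6))   # 500.0   = N * P_A, Eq. (1.23)
print(round(var, 4))    # 250.0   = N * P_A * P_B, Eq. (1.25)
print(sqrt(var) / mean) # ~0.0316 : the ~3% relative error for N = 1000
```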

1.4.2 Gaussian distribution


The Gaussian distribution applies in the case of continuous events. The density of probability W(x) of the Gaussian distribution, or Gaussian law, is given by

\[ W(x) = A \exp\left[ -\frac{(x - x_0)^2}{2\sigma^2} \right] \quad (1.27) \]

where A is a constant determined by the normalization of W(x), x_0 the central value of the distribution, and σ a measure of the width of W(x) (see Fig. 1.1).

[Figure 1.1: bell-shaped curve of W(X) versus X]

Fig. 1.1 Gaussian density of probability W(x), shown with x_0 = 1, σ = 0.4.

To calculate A, one writes

\[ 1 = \int_{-\infty}^{+\infty} W(x)\, dx = A \int_{-\infty}^{+\infty} \exp\left( -\frac{u^2}{2\sigma^2} \right) du = A \sqrt{2\pi\sigma^2} \quad (1.28) \]

where u = x − x_0 and where one has used in the last equality the Gauss integral (see Appendix A):

\[ \int_{-\infty}^{+\infty} \exp\left( -a u^2 \right) du = \sqrt{\frac{\pi}{a}} \quad (1.29) \]

One then has \(A = 1/\sqrt{2\pi\sigma^2}\).
One shows some important results in the following:
- the mean value of x:
With (1.27), one has

\[
\overline{x} = \int_{-\infty}^{+\infty} x\, W(x)\, dx = A \int_{-\infty}^{+\infty} x \exp\left[ -\frac{(x - x_0)^2}{2\sigma^2} \right] dx
= A \int_{-\infty}^{+\infty} (x - x_0) \exp\left[ -\frac{(x - x_0)^2}{2\sigma^2} \right] dx
+ A \int_{-\infty}^{+\infty} x_0 \exp\left[ -\frac{(x - x_0)^2}{2\sigma^2} \right] dx
= 0 + x_0 A \int_{-\infty}^{+\infty} \exp\left[ -\frac{(x - x_0)^2}{2\sigma^2} \right] dx
= x_0 \quad (1.30)
\]

where the first integral of the second line is zero (integral of an odd function between two symmetric opposite bounds), and one has

used the normalization of the probability density W (x) in the line


before the last line.
- the variance:
Using \(\overline{x} = x_0\), the variance \((\Delta x)^2 = \overline{(x - \overline{x})^2}\) is

\[
(\Delta x)^2 = A \int_{-\infty}^{+\infty} (x - x_0)^2 \exp\left[ -\frac{(x - x_0)^2}{2\sigma^2} \right] dx
= A \int_{-\infty}^{+\infty} y^2 \exp\left( -\frac{y^2}{2\sigma^2} \right) dy
= A\, \frac{1}{2} \sqrt{\pi (2\sigma^2)^3}
\]

where one has used in the last line a formula of Appendix A. Replacing A by \(1/\sqrt{2\pi\sigma^2}\), one obtains

\[ (\Delta x)^2 = \sigma^2 \quad (1.31) \]

from which the standard deviation is Δx = σ.

One can also show (see Problem 1) that the binomial law becomes the Gaussian law when N ≫ n ≫ 1. This equivalence is known as the central limit theorem.
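This convergence can be seen numerically (an illustrative sketch, not part of the original text): for N = 1000 and P_A = 1/2, the binomial probabilities are compared with the Gaussian density of the same mean N P_A and variance N P_A P_B.

```python
from math import comb, exp, pi, sqrt

# Compare the binomial law P(N, n) with the Gaussian of the same mean
# and variance, for N = 1000, P_A = 1/2 (cf. Problem 1).
N, p = 1000, 0.5
x0, sigma = N * p, sqrt(N * p * (1 - p))

for n in (480, 500, 520):
    binom = comb(N, n) * p**n * (1 - p)**(N - n)
    gauss = exp(-(n - x0)**2 / (2 * sigma**2)) / sqrt(2 * pi * sigma**2)
    print(n, binom, gauss)  # the two values agree to better than a percent
```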

1.4.3 Poisson law


When the probability of an event is very small with respect to 1, namely for a rare event, the probability to find n events of the same kind is given by the Poisson law:

\[ P(n) = \frac{\lambda^n \exp(-\lambda)}{n!} \quad (1.32) \]

where λ is a constant which is the mean value of n (see Problem 2). One shows that P(n) is normalized:

\[ \sum_{n=0}^{\infty} P(n) = \sum_{n=0}^{\infty} \frac{\lambda^n \exp(-\lambda)}{n!} = \exp(-\lambda) \sum_{n=0}^{\infty} \frac{\lambda^n}{n!} = \exp(-\lambda) \exp(\lambda) = 1 \quad (1.33) \]

One can also show that the binomial law becomes the Poisson law when P_A ≪ P_B and N ≫ n ≫ 1 (see Problem 2).
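This limit can also be illustrated numerically (a sketch added for the reader, not part of the original text), with the hypothetical values N = 10000 and P_A = 2 × 10⁻⁴, so that λ = N P_A = 2:

```python
from math import comb, exp, factorial

# Rare events: the binomial law with P_A << 1 approaches the Poisson law
# with lambda = N * P_A (here lambda = 2).
N, pA = 10_000, 2e-4
lam = N * pA

for n in range(6):
    binom = comb(N, n) * pA**n * (1 - pA)**(N - n)
    poisson = lam**n * exp(-lam) / factorial(n)
    print(n, binom, poisson)  # the two columns nearly coincide
```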

1.5 Microstates – Macrostates

1.5.1 Microstates – Enumeration


A microstate of a system is defined in general by the individual states of the particles which constitute the system. These individual states are characterized by the physical parameters which define the system. Let us mention some examples. The microstates of an atom are given by the states of the electrons of the atom: each microstate is defined by four quantum numbers (n, l, m_l, m_s). The microstates of an isolated system of energy E are given by the different distributions of E over the system's particles. The microstates, at a given time, of a system of N classical particles of mass m are defined by the positions and the velocities of the particles at that time. The microstates of a system of independent quantum particles are given by the wave vectors of the particles.
It is obvious that for a given system there is a large number of microstates which satisfy the physical constraints imposed on the system (volume, energy, temperature, ...). These microstates are called realizable microstates. It is very important to know how to calculate the number of these states. Systems of classical and quantum independent particles are treated in detail in chapter 2. In the following, one gives some examples and formulas to enumerate the microstates of simple systems. In general, one distinguishes the cases of indiscernible and discernible particles. Discernible particles are those one can identify individually, such as particles with different colors or particles each bearing a number. Indiscernible particles are identical particles impossible to distinguish from one another, such as the atoms in a monoatomic gas or the conduction electrons in a metal.
Example: Consider an isolated system of total energy equal to 3 units. This system has 3 particles, supposed to be discernible and bearing the letters A, B and C. Each of the particles can occupy energy levels at 0, 1, 2, or 3 units. With the constraint that the sum of the energies of the particles is equal to 3, one has the 3 following categories of microstates (see Fig. 1.2):
- level 0: B, C; level 3: A (Fig. 1.2, left) → permutations of A with B and A with C give 3 microstates
- level 0: C; level 1: B; level 2: A (Fig. 1.2, middle) → permutations of A, B and C give 6 microstates
- level 1: A, B, C (Fig. 1.2, right) → 1 microstate.
Remark: Permutations between particles of the same level do not give rise to new microstates.

[Figure 1.2: energy levels 0 to 3 showing, from left to right, the configurations (B, C on level 0; A on level 3), (C on level 0; B on level 1; A on level 2) and (A, B, C on level 1)]

Fig. 1.2 Three categories of microstates (left, middle, right) according to the occupation numbers of the energy levels.

In the example given above, one can use the following formula to calculate the number of microstates of each category in Fig. 1.2:

\[ \Omega = \frac{N!}{n_0!\, n_1!\, n_2! \cdots} \quad (1.34) \]

where n_i (i = 0, 1, 2, ...) is the number of particles occupying the i-th level and N the total number of particles. One can obviously verify that this formula gives the number of microstates enumerated above in each category. Furthermore, one sees that the total number of microstates (all categories) is 10. This number can be calculated in the general case, with arbitrary N and E, by

\[ \Omega(E) = \frac{(N + E - 1)!}{(N - 1)!\, E!} \quad (1.35) \]

For E = 3 and N = 3, one recovers Ω = (3 + 3 − 1)!/[(3 − 1)! 3!] = 10.
The demonstrations of Eqs. (1.34) and (1.35) are done in Problem 3.

Remark: In the above example, if the particles are indiscernible, the number of microstates is reduced to 3, because permutations of particles on the different levels do not yield new microstates.

It should be emphasized that the knowledge of the number of microstates allows one to calculate the principal physical properties of isolated systems, as one will see in chapter 2.
The microstates having the same energy are called degenerate states. The number of degenerate states at a given energy is called the degeneracy. In the above example, the degeneracy is 10 for E = 3.
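The enumeration of the example above can be reproduced by brute force (an illustrative sketch, not part of the original text), and compared with Eq. (1.35):

```python
from itertools import product
from math import factorial

# Enumerate the microstates of N = 3 discernible particles sharing E = 3
# energy units: each particle carries 0..E units, the total must be E.
N, E = 3, 3
microstates = [c for c in product(range(E + 1), repeat=N) if sum(c) == E]
print(len(microstates))  # 10

# Eq. (1.35): Omega(E) = (N + E - 1)! / [(N - 1)! E!]
omega = factorial(N + E - 1) // (factorial(N - 1) * factorial(E))
print(omega)             # 10
```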

Each microstate l has a probability P_l. If one knows P_l, one can calculate the mean values of physical quantities, as will be seen in the following chapters. That is why the main objective of statistical mechanics is to find a way to determine P_l according to the conditions imposed on the system.

Remark: Quantum particles, by their nature, are indiscernible. One distinguishes two kinds of particles: bosons and fermions. Bosons are particles having integer spins (0, 1, 2, ...) and fermions are those having half-integer spins (1/2, 3/2, 5/2, ...). The symmetry postulate of quantum mechanics states that wave functions of bosons are invariant with respect to the permutation of two particles in their states, while wave functions of fermions change their sign at each permutation. A consequence of this postulate is that an individual quantum state can contain any number of bosons but only zero or one fermion. This obviously affects the enumeration of microstates, as one will see in chapters 5 and 6.

1.5.2 Macroscopic states


A macroscopic state, or macrostate for short, is an ensemble of microstates having a common macroscopic property. An observed macroscopic property is considered as the mean value of a statistical mixing of the microstates of the ensemble. Thus, its definition depends on which macroscopic property one wants to observe: there are many ways to define a macrostate. In the example shown in Fig. 1.2, one can define a macrostate by the occupation numbers of the energy levels: there are then 3 macrostates, corresponding to the following occupation numbers of the first 4 levels: (2,0,0,1), (1,1,1,0) and (0,3,0,0) (Fig. 1.2, from left to right). One can also define a macrostate as the one having an energy E = 3. In this case, the macrostate contains the ensemble of all 10 microstates.

1.5.3 Statistical averages – Ergodic hypothesis


When a system is at equilibrium, the mean values of its macroscopic variables do not vary anymore with time. However, the system often fluctuates between different microstates, giving rise to small variations around the mean values of the macroscopic variables. One defines the time average of a variable f by

\[ \overline{f} = \lim_{\tau \to \infty} \frac{1}{\tau} \int_{t_0}^{t_0 + \tau} f(t)\, dt \quad (1.36) \]

where τ is the averaging time.


On the other hand, as seen above, the mean value of a quantity can be calculated over the ensemble of microstates. Let P_l be the probability of the microstate l in which the variable f is equal to f_l. One defines the so-called ensemble average of f by

\[ \overline{f} = \sum_{(l)} P_l f_l \quad (1.37) \]

The ergodic hypothesis of statistical mechanics at equilibrium, introduced by Boltzmann in 1871, states that the time and ensemble averages are equal. The validity of this principle has been experimentally verified for systems at equilibrium. Of course, for systems with extremely long relaxation times, such as glasses, this equivalence is not easy to prove.

1.6 Statistical entropy

In statistical mechanics, the fundamental quantity which allows one to calculate the principal properties of a system is the statistical entropy, introduced by Boltzmann near the end of the 19th century. It is defined by

\[ S = -k_B \sum_l P_l \ln P_l \quad (1.38) \]

where P_l is the probability of the microstate l of the system and k_B the Boltzmann constant.
One will see in the following chapters that S coincides with the thermodynamic entropy defined in a macroscopic manner by the second principle of classical thermodynamics. The advantage of the definition (1.38) is that the physical consequences associated with the entropy are clear, as seen below:
- As 0 ≤ P_l ≤ 1, one has S ≥ 0.
- When one of the events is certain, namely one probability is equal to 1 (the other probabilities being zero by normalization), one sees that S is zero (minimum). When all probabilities are equal, namely when

the uncertainty on the outcome is maximum, one can show that S is maximum (see Problem 8). In other words, S represents the uncertainty or the lack of information on the system. With this definition, one easily understands why in thermodynamics the entropy S is said to express the disorder of the system.
- Statistical entropy is additive: consider two independent ensembles of events {e_m; m = 1, ..., M} and {e'_{m'}; m' = 1, ..., M'} of respective probabilities {P_m; m = 1, ..., M} and {P'_{m'}; m' = 1, ..., M'}. The probability to find e_m and e'_{m'} is thus P(m, m') = P_m P'_{m'}. The entropy of these double events is then

\[
S(e, e') = -k_B \sum_{m, m'} P_m P'_{m'} \ln\left( P_m P'_{m'} \right)
= -k_B \sum_{m, m'} P_m P'_{m'} \left[ \ln P_m + \ln P'_{m'} \right]
= -k_B \sum_m P_m \ln P_m \sum_{m'} P'_{m'} - k_B \sum_{m'} P'_{m'} \ln P'_{m'} \sum_m P_m
= S(e) + S(e') \quad (1.39)
\]

using \(\sum_m P_m = 1\) and \(\sum_{m'} P'_{m'} = 1\). The total entropy is the sum of the partial entropies.
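The properties above can be illustrated with a short sketch (hypothetical example distributions, not part of the original text), working in units where k_B = 1:

```python
from math import log

# Statistical entropy of Eq. (1.38), in units where k_B = 1.
def entropy(P):
    return sum(-p * log(p) for p in P if p > 0)

print(entropy([1.0, 0.0, 0.0]))    # 0.0: a certain outcome, minimum S
print(entropy([1/3, 1/3, 1/3]))    # ln 3 ~ 1.0986: equal probabilities, maximum S
print(entropy([0.5, 0.25, 0.25]))  # an intermediate value

# Additivity, Eq. (1.39): for two independent ensembles,
# the entropy of the joint distribution is S(e) + S(e').
Pa, Pb = [0.5, 0.5], [0.25, 0.75]
joint = [pa * pb for pa in Pa for pb in Pb]
print(abs(entropy(joint) - (entropy(Pa) + entropy(Pb))) < 1e-9)  # True
```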

1.7 Conclusion

Since statistical mechanics is based on a probabilistic concept, the enumeration of microstates and the use of probabilities are very important. In this chapter, basic definitions and fundamental concepts on these points have been presented. In particular, the most frequently used probability laws, such as the binomial, Gaussian and Poisson distributions, have been shown. Microstates and macrostates were defined, using examples for better illustration. Finally, the definition of the statistical entropy, which is the most important quantity in statistical mechanics, was given and discussed in connection with the thermodynamic entropy. The above definitions and tools will be used throughout this book.

1.8 Problems

Problem 1. Central limit theorem:
Show that the binomial law becomes the Gaussian law when N ≫ n ≫ 1.
Problem 2. Poisson law (1.32):
a) Calculate the mean value \(\overline{n}\).
b) Show that the binomial law becomes the Poisson law when P_A ≪ P_B and N ≫ n ≫ 1.
Problem 3. Demonstrate the formulas (1.34) and (1.35).
Problem 4. Application of the binomial law:
Calculate the probability to find the number of heads between 3 and 6 (boundaries included) when one flips a coin 10 times.
Problem 5. Random walk in one dimension:
Consider a particle moving on the x axis with a constant step length l, to the right or to the left with the same probability.
a) Calculate the probability for the particle to make n steps to the right and n' steps to the left after N steps.
b) Calculate the averages \(\overline{n}\) and \(\overline{n'}\). Calculate the variance and the relative uncertainty on n.
c) Calculate the probability to find the particle at the position x = ml from the origin. Calculate \(\overline{x}\) and \((\Delta x)^2\).
d) Suppose that the step length x_i is not constant. What is the density of probability to find the particle at X after N steps with N ≫ 1?
Problem 6. Random walk in three dimensions:
Consider a particle moving in three dimensions with variable discrete step lengths l and a density of probability independent of the direction of the step. Each step corresponds to a change of direction of the particle's motion, due to a collision with another particle. This is a simple model of Brownian motion.
a) Calculate the Cartesian components (X, Y, Z) of the particle position after N steps. Calculate the averages \((\overline{X}, \overline{Y}, \overline{Z})\) and the corresponding variances \((\Delta X)^2\), \((\Delta Y)^2\) and \((\Delta Z)^2\). Deduce the average \(\overline{l^2}\).
b) What is the density of probability to find the particle at (X, Y, Z) after N steps?
c) Putting \((\Delta X)^2 = 2Dt\), where D is the diffusion coefficient and t the lapse of time of the motion of the particle, demonstrate that \(D = \overline{l^2}/(6\tau)\), where τ is the average time between two particle collisions.

Problem 7. Exchange of energy:
Consider two isolated systems. The first one has N_1 = 2 particles and an energy equal to E_1 = 3 units. The second one has N_2 = 4 particles and E_2 = 5 units. Each particle can have 0, 1, 2, ... energy units.
a) Calculate the number of microstates of the total system composed of systems 1 and 2 separated by an insulating wall.
b) Remove the wall. The total system is kept isolated. Calculate the number of microstates of the total system in this new situation. Comment.
c) In the situation of the previous question, calculate the number of microstates in which there are at least 2 particles having 2 energy units each. Deduce the probability of this macrostate.
Problem 8. Statistical entropy:
Show that the statistical entropy \(S = -k_B \sum_l P_l \ln P_l\) is maximum when all probabilities P_l are equal.
