
Supplement – Statistical Thermodynamics
Contents

1. Introduction

2. Distribution of Molecular States

3. Interacting Systems – Gibbs Ensemble

4. Classical Statistical Mechanics


1. Introduction

 Mechanics : the study of position, velocity, force, and energy
 Classical Mechanics (Molecular Mechanics)
• Molecules (or molecular segments) are treated as rigid objects (point, sphere, cube, ...)
• Newton's laws of motion
 Quantum Mechanics
• Molecules are composed of electrons, nuclei, ...
• Schrödinger's equation → wave function
1. Introduction

 Methodology of Thermodynamics and Statistical Mechanics


 Thermodynamics
• study of the relationships between macroscopic properties
– Volume, pressure, compressibility, …
 Statistical Mechanics (Statistical Thermodynamics)
• how the various macroscopic properties arise as a consequence of the microscopic nature of
the system
– Position and momenta of individual molecules (mechanical variables)
 Statistical Thermodynamics (or Statistical Mechanics) is a link between microscopic
properties and bulk (macroscopic) properties
[Diagram] Methods of QM or MM, applied to a particular microscopic model, give pure mechanical variables; statistical mechanics links these to thermodynamic variables (P, T, V, U, ...).
1. Introduction

 Equilibrium Macroscopic Properties


 Properties are a consequence of averages over individual molecules
 Properties are invariant with time → time average

[Diagram] Mechanical properties of individual molecules (position, velocity, energy, ...) → average over molecules → average over time → thermodynamic properties (temperature, pressure, internal energy, enthalpy, ...); this chain is statistical thermodynamics.
1. Introduction

 Description of States
 Macrostates : T, P, V, … (fewer variables)
 Microstates : position, momentum of each particle (~10^23 variables)
 Fundamental methodology of statistical mechanics
 Probabilistic approach : statistical average
• Most probable value
 Is it reasonable ?
• As N approaches a very large number, fluctuations become negligible
• "Central Limit Theorem" (from statistics)
• Deviation ~ 1/N^{1/2}
2. Distribution of Molecular States

 Statistical Distribution
 n : number of occurrences
 b : a property

[Histogram: n_i vs. b for b = 1, 2, ..., 6] If we know the distribution, we can calculate the average value of the property b.
2. Distribution of Molecular States

 Normalized Distribution Function


 Probability Distribution Function

$P_i(b_i) = \frac{n_i(b_i)}{N} = \frac{n_i(b_i)}{\sum_i n_i(b_i)}$,  with  $\sum_i P_i(b_i) = 1$

$\langle b \rangle = \sum_i b_i P_i$

$\langle F(b) \rangle = \sum_i F(b_i) P_i$

[Histogram: P_i vs. b at b_1, b_2, ..., b_6]

Finding the probability (distribution) function is the main task in statistical thermodynamics.
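A minimal sketch (not part of the original slides) of the averaging recipe above: given occurrence counts n_i for property values b_i, normalize to probabilities P_i and form ⟨b⟩ = Σ_i b_i P_i. The counts and values are made up for illustration.

```python
# Hypothetical data: counts n_i of molecules observed with property value b_i
counts = [1, 4, 9, 16, 7, 3]          # n_i
values = [1, 2, 3, 4, 5, 6]           # b_i

N = sum(counts)
P = [n / N for n in counts]           # normalized distribution, sum(P) == 1

b_avg = sum(b * p for b, p in zip(values, P))        # <b>   = sum_i b_i P_i
b2_avg = sum(b**2 * p for b, p in zip(values, P))    # <b^2> = sum_i b_i^2 P_i

print(b_avg, b2_avg)
```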
2. Distribution of Molecular States

 Quantum theory says:
 Each molecule can have only discrete values of energy
 Evidence:
 Black-body radiation → Planck distribution
 Heat capacities
 Atomic and molecular spectra
 Wave–particle duality
[Diagram: discrete energy levels]
2. Distribution of Molecular States

 Configuration
 At any instant, there may be n_0 molecules at $\varepsilon_0$, n_1 molecules at $\varepsilon_1$, n_2 molecules at $\varepsilon_2$, ...
 {n_0, n_1, n_2, ...} : a configuration
 Example: {3, 2, 2, 1, 0, 0}



2. Distribution of Molecular States

 Weight
 Each configuration can be achieved in a number of different ways
 Example 1 : the {3,0} configuration → 1 way
 Example 2 : the {2,1} configuration → 3 ways
2. Distribution of Molecular States

 Calculation of Weight
 Weight (W) : the number of different ways in which a configuration can be achieved
 General formula for the weight of the {n_0, n_1, n_2, ...} configuration:

$W = \frac{N!}{n_1! \, n_2! \, n_3! \cdots} = \frac{N!}{\prod_i n_i!}$

Example 1 : {1, 0, 3, 5, 10, 1} of 20 objects → W = 9.31×10^8
Example 2 : {0, 1, 5, 0, 8, 0, 3, 2, 1} of 20 objects → W = 4.19×10^10
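A minimal sketch (not part of the original slides) that evaluates the multinomial weight formula above and reproduces the two example values:

```python
from math import factorial

def weight(config):
    """W = N! / (n0! n1! n2! ...) : number of distinct ways to realize a configuration."""
    N = sum(config)
    W = factorial(N)
    for n in config:
        W //= factorial(n)
    return W

print(weight([1, 0, 3, 5, 10, 1]))          # 931170240    (~9.31e8,  Example 1)
print(weight([0, 1, 5, 0, 8, 0, 3, 2, 1]))  # 41902660800  (~4.19e10, Example 2)
```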
Principle of Equal a Priori Probability

 All distributions of energy are equally probable


 If E = 5 and N = 5, then:

[Diagram: several ways of distributing 5 units of energy among 5 molecules]

All configurations have equal probability, but the possible number of ways (the weight) is different.
A Dominating Configuration

 For a large number of molecules and a large number of energy levels, there is a dominating configuration.
 The weight of the dominating configuration is much larger than that of the other configurations.

[Plot: weight W_i vs. configurations {n_i}, sharply peaked at the dominating configuration]
Dominating Configuration

[Diagram: three example configurations of 5 molecules with weights W = 1 (5!/5!), W = 20 (5!/3!), W = 5 (5!/4!)]

The difference in W becomes larger as N is increased!

In molecular systems (N ~ 10^23), considering only the most dominant configuration is enough for averages.
How to find the most dominant configuration?

 The Boltzmann Distribution


 Task : find the dominant configuration for given N and total energy E
 Method : find the maximum value of W which satisfies

$N = \sum_i n_i \;\Rightarrow\; \sum_i dn_i = 0$

$E = \sum_i \varepsilon_i n_i \;\Rightarrow\; \sum_i \varepsilon_i \, dn_i = 0$
Stirling's approximation

 A useful formula when dealing with factorials of


large numbers.

$\ln N! \approx N \ln N - N$

$\ln W = \ln \frac{N!}{n_1! \, n_2! \, n_3! \cdots} = \ln N! - \sum_i \ln n_i!$
$\qquad = N \ln N - N - \sum_i (n_i \ln n_i - n_i)$
$\qquad = N \ln N - \sum_i n_i \ln n_i$
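A quick numerical check (not from the slides) of Stirling's approximation, using the log-gamma function for the exact ln N!:

```python
from math import lgamma, log

for N in (10, 100, 1000, 10_000):
    exact = lgamma(N + 1)           # ln N!  (exact, via the gamma function)
    approx = N * log(N) - N         # Stirling's approximation
    print(N, round(exact, 2), round(approx, 2), f"rel. error {(exact - approx)/exact:.2%}")
```

The relative error shrinks as N grows, which is why the approximation is safe for N ~ 10^23.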
Method of Undetermined
Multipliers

 Maximum weight, W
Recall the method for finding the minimum or maximum of a function:

$d \ln W = 0, \qquad \left( \frac{\partial \ln W}{\partial n_i} \right) = 0$

 Method of undetermined multipliers:
 Each constraint is multiplied by a constant and added to the main variation equation.
Method of Undetermined
Multipliers
With undetermined multipliers $\alpha$ and $\beta$:

$d \ln W = \sum_i \left( \frac{\partial \ln W}{\partial n_i} \right) dn_i + \alpha \sum_i dn_i - \beta \sum_i \varepsilon_i \, dn_i$

$\qquad = \sum_i \left[ \left( \frac{\partial \ln W}{\partial n_i} \right) + \alpha - \beta \varepsilon_i \right] dn_i = 0$

$\left( \frac{\partial \ln W}{\partial n_i} \right) + \alpha - \beta \varepsilon_i = 0$
Method of Undetermined
Multipliers
$\ln W = N \ln N - \sum_i n_i \ln n_i$

$\frac{\partial \ln W}{\partial n_i} = \frac{\partial (N \ln N)}{\partial n_i} - \sum_j \frac{\partial (n_j \ln n_j)}{\partial n_i}$

$\frac{\partial (N \ln N)}{\partial n_i} = \frac{\partial N}{\partial n_i} \ln N + N \frac{1}{N} \frac{\partial N}{\partial n_i} = \ln N + 1$

$\sum_j \frac{\partial (n_j \ln n_j)}{\partial n_i} = \sum_j \left( \frac{\partial n_j}{\partial n_i} \ln n_j + n_j \frac{1}{n_j} \frac{\partial n_j}{\partial n_i} \right) = \ln n_i + 1$

$\frac{\partial \ln W}{\partial n_i} = -(\ln n_i + 1) + (\ln N + 1) = -\ln \frac{n_i}{N}$
Method of Undetermined
Multipliers

$-\ln \frac{n_i}{N} + \alpha - \beta \varepsilon_i = 0 \;\Rightarrow\; \frac{n_i}{N} = e^{\alpha - \beta \varepsilon_i}$

Normalization condition:

$N = \sum_j n_j = N e^{\alpha} \sum_j e^{-\beta \varepsilon_j} \;\Rightarrow\; e^{\alpha} = \frac{1}{\sum_j e^{-\beta \varepsilon_j}}$

$P_i = \frac{n_i}{N} = \frac{e^{-\beta \varepsilon_i}}{\sum_j e^{-\beta \varepsilon_j}}$    Boltzmann Distribution (probability function for the energy distribution)
The Molecular Partition Function

 Boltzmann Distribution

$p_i = \frac{n_i}{N} = \frac{e^{-\beta \varepsilon_i}}{\sum_j e^{-\beta \varepsilon_j}} = \frac{e^{-\beta \varepsilon_i}}{q}$

 Molecular Partition Function

$q = \sum_j e^{-\beta \varepsilon_j}$

 Degeneracies : same energy value but different states ($g_j$-fold degenerate)

$q = \sum_{\text{levels } j} g_j e^{-\beta \varepsilon_j}$
How to obtain the value of beta ?

 Assumption : $\beta = 1/kT$
 As T → 0, q → 1
 As T → ∞, q → ∞

 The molecular partition function gives an indication of the average number of states that are thermally accessible to a molecule at temperature T (see the numerical sketch below).
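A minimal sketch (not from the slides), using a hypothetical set of evenly spaced, non-degenerate levels, of how q and the Boltzmann populations behave with temperature:

```python
import math

k_B = 1.380649e-23                          # J/K
eps = [j * 1.0e-21 for j in range(10)]      # hypothetical level energies, J

def q(T):
    return sum(math.exp(-e / (k_B * T)) for e in eps)

def populations(T):
    qT = q(T)
    return [math.exp(-e / (k_B * T)) / qT for e in eps]

for T in (10.0, 300.0, 3000.0):
    print(T, round(q(T), 3), round(populations(T)[0], 3))
```

As T → 0 only the ground state is accessible (q → 1, its population → 1); as T rises, q grows and the population spreads over more states.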
3. Interacting Systems – Gibbs Ensemble

 Solution of the Schrödinger equation (an eigenvalue problem)
 Wave function $\Psi$
 Allowed energy levels $E_n$

$-\sum_i \frac{h^2}{8 \pi^2 m_i} \nabla_i^2 \Psi + U \Psi = E \Psi$

 Using the molecular partition function, we can calculate average values of properties for a given QUANTUM STATE.

 Quantum states change so rapidly that the observed dynamic properties are actually time averages over quantum states.
Fluctuation with Time

[Plot: the quantum state of the system fluctuating with time]

Although we know the most probable distribution of energies of individual molecules at given N and E (previous section – molecular partition function), it is almost impossible to obtain the time average for interacting molecules.
Thermodynamic Properties

 Entire set of possible quantum states

1 , 1 , 1 ,...i ,...

E1 , E2 , E3 ,..., Ei ,...

 Thermodynamic internal energy


1
U  lim  Ei ti
  
i
Difficulties

 Fluctuations are very small


 Fluctuations occur too rapidly

We have to use an alternative, abstract approach: the ensemble average method (proposed by Gibbs).


Alternative Procedure

 Canonical Ensemble
 Proposed by J. W. Gibbs (1839-1903)
 Alternative procedure to obtain average
 Ensemble : an infinite number of mental replicas of the system of interest

[Diagram: ensemble members immersed in a large reservoir at constant T]

All the ensemble members have the same (N, V, T).
Energy can be exchanged between members, but particles cannot.
Number of systems: Ñ → ∞
Two Postulates

First Postulate

The long-time average of a mechanical variable M is equal to the ensemble average in the limit Ñ → ∞.

[Diagram: a system visiting states with energies E_1, E_2, E_3, E_4, E_5 over time]

Second Postulate (Ergodic Hypothesis)

The systems of the ensemble are distributed uniformly over the accessible states for an (N, V, T) system; equivalently, a single isolated system spends an equal amount of time in each accessible state.
Averaging Method

 Probability of observing a particular quantum state i

$P_i = \frac{\tilde{n}_i}{\sum_i \tilde{n}_i}$

 Ensemble average of a dynamic property

$\langle E \rangle = \sum_i E_i P_i$

 Time average and ensemble average

$U = \lim_{\tau \to \infty} \frac{1}{\tau} \sum_i E_i \, t_i = \lim_{\tilde{N} \to \infty} \sum_i E_i P_i$
How to find the Most Probable Distribution?

 Calculation of probability in an ensemble
 Weight

$\tilde{W} = \frac{\tilde{N}!}{\tilde{n}_1! \, \tilde{n}_2! \, \tilde{n}_3! \cdots} = \frac{\tilde{N}!}{\prod_i \tilde{n}_i!}$

 Most probable distribution = configuration with maximum weight
 Task : find the dominating configuration for given $\tilde{N}$ and $E_t$
• Find the maximum of $\tilde{W}$ which satisfies

$\tilde{N} = \sum_i \tilde{n}_i \;\Rightarrow\; \sum_i d\tilde{n}_i = 0$

$E_t = \sum_i E_i \tilde{n}_i \;\Rightarrow\; \sum_i E_i \, d\tilde{n}_i = 0$
Canonical Partition Function

 A similar method (Section 2) can be used to obtain the most probable distribution

$P_i = \frac{\tilde{n}_i}{\tilde{N}} = \frac{e^{-\beta E_i}}{\sum_j e^{-\beta E_j}} = \frac{e^{-\beta E_i}}{Q}$

$Q = \sum_j e^{-\beta E_j}$    Canonical Partition Function
How to obtain beta ?
– Another interpretation

dU  d ( Ei Pi )   Ei dPi   Pi dEi
i i i

dU  qrev  wrev  TdS  pdV

 Ei 
 P dE   P  V 
i
i i
i
i dV   PdV  wrev
N

1 1
 Ei dPi  
i
( ln Pi dPi  ln Q  dPi )  
 i i 
 ln P dP  TdS  dq
i
i i rev

The only function that links heat (path integral) and


state property is TEMPERATURE.

  1 / kT
Properties from Canonical Partition
Function

 Internal Energy

$U = \langle E \rangle = \sum_i E_i P_i = \frac{1}{Q} \sum_{i(qs)} E_i e^{-\beta E_i}$

$\left( \frac{\partial Q}{\partial \beta} \right)_{N,V} = -\sum_{i(qs)} E_i e^{-\beta E_i}$

$U = -\frac{1}{Q} \left( \frac{\partial Q}{\partial \beta} \right)_{N,V} = -\left( \frac{\partial \ln Q}{\partial \beta} \right)_{N,V}$
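A minimal sketch (not from the slides) checking $U = -(\partial \ln Q / \partial \beta)$ against the direct average $\sum_i E_i P_i$ for a hypothetical two-level system:

```python
import math

E = [0.0, 1.0e-21]                      # two energy levels, J (illustrative)

def lnQ(beta):
    return math.log(sum(math.exp(-beta * Ei) for Ei in E))

def U_average(beta):
    Q = sum(math.exp(-beta * Ei) for Ei in E)
    return sum(Ei * math.exp(-beta * Ei) for Ei in E) / Q

beta = 1.0 / (1.380649e-23 * 300.0)     # 1/kT at 300 K
h = beta * 1e-6
U_derivative = -(lnQ(beta + h) - lnQ(beta - h)) / (2 * h)   # central difference
print(U_derivative, U_average(beta))    # the two values agree
```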
Properties from Canonical Partition
Function

 Pressure : consider a small adiabatic expansion of the system

$(\delta w_i)_N = -P_i \, dV = -F_i \, dx, \qquad (dE_i)_N = -F_i \, dx = -P_i \, dV$

$P_i = -\left( \frac{\partial E_i}{\partial V} \right)_N$

$P = \langle P \rangle = \sum_i P_i \frac{e^{-\beta E_i}}{Q} = -\frac{1}{Q} \sum_i \left( \frac{\partial E_i}{\partial V} \right)_N e^{-\beta E_i}$

$\left( \frac{\partial \ln Q}{\partial V} \right)_{\beta,N} = -\frac{\beta}{Q} \sum_i \left( \frac{\partial E_i}{\partial V} \right)_N e^{-\beta E_i}$

$P = \frac{1}{\beta} \left( \frac{\partial \ln Q}{\partial V} \right)_{\beta,N}$
Thermodynamic Properties from
Canonical Partition Function

$U = kT \left( \frac{\partial \ln Q}{\partial \ln T} \right)_{V,N}$

$S = k \left[ \ln Q + \left( \frac{\partial \ln Q}{\partial \ln T} \right)_{V,N} \right]$

$H = kT \left[ \left( \frac{\partial \ln Q}{\partial \ln T} \right)_{V,N} + \left( \frac{\partial \ln Q}{\partial \ln V} \right)_{T,N} \right]$

$A = -kT \ln Q$

$G = -kT \left[ \ln Q - \left( \frac{\partial \ln Q}{\partial \ln V} \right)_{T,N} \right]$

$\mu_i = -kT \left( \frac{\partial \ln Q}{\partial N_i} \right)_{T,V,N_{j \neq i}}$
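A minimal sketch (not from the slides) that evaluates a few of these relations numerically for a toy system of independent, evenly spaced levels (the formulas are applied to a single-particle partition function purely to illustrate the differentiation), and checks the identity A = U − TS:

```python
import math

k = 1.380649e-23
levels = [j * 2.0e-21 for j in range(50)]   # illustrative energy levels, J

def lnQ(T):
    return math.log(sum(math.exp(-e / (k * T)) for e in levels))

T = 300.0
dlnT = 1e-6
# d ln Q / d ln T by central difference in ln T
dlnQ_dlnT = (lnQ(T * math.exp(dlnT)) - lnQ(T * math.exp(-dlnT))) / (2 * dlnT)

U = k * T * dlnQ_dlnT
S = k * (lnQ(T) + dlnQ_dlnT)
A = -k * T * lnQ(T)
print(A, U - T * S)      # the two agree: A = U - TS
```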
Grand Canonical Ensemble

 Ensemble approach for open systems
 Useful for open systems and mixtures
 Walls are replaced by permeable walls

[Diagram: ensemble members immersed in a large reservoir at constant T]

All the ensemble members have the same (V, T, μ_i).
Energy and particles can be exchanged between members.
Number of systems: Ñ → ∞
Grand Canonical Ensemble

 A similar approach to the canonical ensemble
 We cannot use the second postulate directly because the systems are not isolated
 After equilibrium is reached, we place walls around the ensemble and treat each member by the same method used for the canonical ensemble

[Diagram: after equilibrium, each member becomes a (T, V, N_i) system with its own fixed number of molecules N_1, N_2, N_3, ...]

 Apply canonical-ensemble methods to each member
Grand Canonical Ensemble

 Weight and Constraints

$\tilde{W} = \frac{\left( \sum_{j,N} \tilde{n}_j(N) \right)!}{\prod_{j,N} \tilde{n}_j(N)!} = \frac{\tilde{N}!}{\prod_{j,N} \tilde{n}_j(N)!}$

Constraints (after the fixed walls have been placed):

$\tilde{N} = \sum_{j,N} \tilde{n}_j(N)$   (number of ensemble members)
$E_t = \sum_{j,N} \tilde{n}_j(N) \, E_j(V, N)$   (total energy)
$N_t = \sum_{j,N} \tilde{n}_j(N) \, N$   (total number of molecules)

Method of undetermined multipliers with $\alpha$, $\beta$, $\gamma$:

$\tilde{n}_j^*(N) = \tilde{N} e^{\alpha} e^{-\beta E_j(N,V)} e^{-\gamma N}$

$P_j(N) = \frac{\tilde{n}_j(N)}{\tilde{N}} = \frac{e^{-\beta E_j(N,V)} e^{-\gamma N}}{\sum_{j,N} e^{-\beta E_j(N,V)} e^{-\gamma N}}$
Grand Canonical Ensemble

 Determination of the Undetermined Multipliers

$U = \langle E \rangle = \sum_{j,N} P_j(N) \, E_j(N,V) = \frac{1}{\Xi} \sum_{j,N} E_j(N,V) \, e^{-\beta E_j(N,V)} e^{-\gamma N}$

$dU = \sum_{j,N} E_j(N,V) \, dP_j(N) + \sum_{j,N} P_j(N) \, dE_j(N,V)$

$dU = -\frac{1}{\beta} \sum_{j,N} \left[ \gamma N + \ln P_j(N) + \ln \Xi \right] dP_j(N) + \sum_{j,N} P_j(N) \frac{\partial E_j(N,V)}{\partial V} dV$

$dU = T\,dS - p\,dV + \mu\,dN$ ;  comparing the two equations gives

$\beta = \frac{1}{kT}, \qquad \gamma = -\frac{\mu}{kT}$

$\Xi = \sum_{j,N} e^{-E_j(N,V)/kT} \, e^{\mu N / kT}$    Grand Canonical Partition Function
4. Classical Statistical Mechanics

 The formalism of statistical mechanics relies very much on the microscopic states.
 Number of states, sum over states
 Convenient within the framework of quantum mechanics
 What about "classical states"?
 Classical states
• We know the position and velocity of all particles in the system
 Comparison between quantum mechanics and classical mechanics

QM problem :  $H\Psi = E\Psi$  → find the probabilities and the discrete energy states
CM problem :  $F = ma$  → find the positions and momenta of individual molecules


Newton’s Law of Motion

 Three formulations of Newton's second law of motion
 Newtonian formulation
 Lagrangian formulation
 Hamiltonian formulation

$H(\mathbf{r}^N, \mathbf{p}^N) = KE \text{ (kinetic energy)} + PE \text{ (potential energy)}$

$H(\mathbf{r}^N, \mathbf{p}^N) = \sum_i \frac{p_i^2}{2 m_i} + U(\mathbf{r}_1, \mathbf{r}_2, \ldots, \mathbf{r}_N)$

$\frac{\partial H}{\partial \mathbf{r}_i} = -\dot{\mathbf{p}}_i, \qquad \frac{\partial H}{\partial \mathbf{p}_i} = \dot{\mathbf{r}}_i = \frac{\mathbf{p}_i}{m_i}$

$\mathbf{r} = (r_x, r_y, r_z), \quad \mathbf{p} = (p_x, p_y, p_z), \qquad \mathbf{F}_i = \sum_{j \neq i} \mathbf{F}_{ij}$
Classical Statistical Mechanics
 Instead of taking replicas of the system, use the abstract "phase space" composed of momentum space and position space (6N-dimensional in total)

[Diagram: a trajectory Γ(t) moving through phase space (p^N, r^N)]

$\Gamma = \left( \mathbf{p}_1, \mathbf{p}_2, \mathbf{p}_3, \ldots, \mathbf{p}_N, \mathbf{r}_1, \mathbf{r}_2, \mathbf{r}_3, \ldots, \mathbf{r}_N \right)$
Classical Statistical Mechanics

 "Classical state" : defines a cell in phase space (a small volume of momenta and positions)

"Classical state" → $dp_x \, dp_y \, dp_z \, dr_x \, dr_y \, dr_z \equiv d^3p \, d^3r$  (for simplicity)

 Ensemble Average

$U = \lim_{\tau \to \infty} \frac{1}{\tau} \int_0^{\tau} E(\Gamma(t)) \, dt = \lim_{\tilde{N} \to \infty} \int P_N(\Gamma) \, E(\Gamma) \, d\Gamma$

$P_N(\Gamma) \, d\Gamma$ : fraction of ensemble members in the range ($\Gamma$ to $\Gamma + d\Gamma$)

Using a technique similar to that used for the Boltzmann distribution:

$P_N(\Gamma) \, d\Gamma = \frac{\exp(-H/kT) \, d\Gamma}{\int \cdots \int \exp(-H/kT) \, d\Gamma}$
Classical Statistical Mechanics

 Canonical Partition Function

Phase integral :  $\int \cdots \int \exp(-H/kT) \, d\Gamma$

Canonical partition function :  $Q = c \int \cdots \int \exp(-H/kT) \, d\Gamma$

Matching quantum and classical mechanics:

$c = \lim_{T \to \infty} \frac{\sum_i \exp(-E_i/kT)}{\int \cdots \int \exp(-H/kT) \, d\Gamma}$

$c = \frac{1}{N! \, h^{NF}}$   (F : number of degrees of freedom per molecule)

For a rigorous derivation see Hill, Chap. 6 ("Statistical Thermodynamics").
Classical Statistical Mechanics

 Canonical Partition Function in Classical Mechanics

$Q = \frac{1}{N! \, h^{NF}} \int \cdots \int \exp(-H/kT) \, d\Gamma$
Example : Translational Motion of an Ideal Gas

$H(\mathbf{r}^N, \mathbf{p}^N) = KE + PE = \sum_i \frac{p_i^2}{2 m_i} + U(\mathbf{r}_1, \mathbf{r}_2, \ldots, \mathbf{r}_N)$

No potential energy, 3-dimensional space:

$H = \sum_i^{3N} \frac{p_i^2}{2m}$

$Q = \frac{1}{N! \, h^{3N}} \int \cdots \int \exp\left( -\sum_i^{3N} \frac{p_i^2}{2mkT} \right) dp_1 \cdots dp_{3N} \, dr_1 \cdots dr_{3N}$

$\quad = \frac{1}{N! \, h^{3N}} \left[ \int_{-\infty}^{\infty} \exp\left( -\frac{p^2}{2mkT} \right) dp \right]^{3N} \left[ \int_0^{V} dr_1 dr_2 dr_3 \right]^{N}$

$\quad = \frac{1}{N!} \left( \frac{2 \pi m k T}{h^2} \right)^{3N/2} V^N$

From this Q we recover the ideal gas law, pV = nRT.
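A minimal sketch (not from the slides) that recovers the ideal-gas law numerically from ln Q for this translational partition function, via P = kT (∂ ln Q / ∂V); the mass and numbers below are illustrative (argon-like):

```python
import math

k = 1.380649e-23      # J/K
h = 6.62607015e-34    # J s
m = 6.63e-26          # kg, roughly one argon atom
N = 1.0e22            # number of molecules (illustrative)
T = 300.0             # K

def lnQ(V):
    # ln Q = N ln V + (3N/2) ln(2 pi m k T / h^2) - ln N!   (Stirling for ln N!)
    return (N * math.log(V)
            + 1.5 * N * math.log(2 * math.pi * m * k * T / h**2)
            - (N * math.log(N) - N))

V = 1.0e-3            # m^3
dV = V * 1e-6
P_stat = k * T * (lnQ(V + dV) - lnQ(V - dV)) / (2 * dV)   # kT * dlnQ/dV
print(P_stat, N * k * T / V)                              # both ≈ N k T / V
```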
Semi-Classical Partition Function

 The energy of a molecule is distributed in different modes


 Vibration, Rotation (Internal : depends only on T)
 Translation (External : depends on T and V)
 Assumption 1 : Partition Function (thus energy distribution) can be separated
into two parts (internal + center of mass motion)

$Q = \sum_i \exp\left( -\frac{E_i^{CM} + E_i^{int}}{kT} \right) = \sum_i \exp\left( -\frac{E_i^{CM}}{kT} \right) \exp\left( -\frac{E_i^{int}}{kT} \right)$

$Q = Q_{CM}(N, V, T) \, Q_{int}(N, T)$
Semi-Classical Partition Function

 The internal parts are density independent, and most of their contributions have the same values as for ideal gases.

$Q_{int}(N, \rho, T) = Q_{int}(N, 0, T)$

 For solids and polymeric molecules, this assumption is no longer valid.
Semi-Classical Partition Function

 Assumption 2 : for T > 50 K, the classical approximation can be used for the translational motion

$H^{CM} = \sum_i \frac{p_{ix}^2 + p_{iy}^2 + p_{iz}^2}{2m} + U(\mathbf{r}_1, \mathbf{r}_2, \ldots, \mathbf{r}_N)$

$Q_{CM} = \frac{1}{N! \, h^{3N}} \int \cdots \int \exp\left( -\sum_i \frac{p_{ix}^2 + p_{iy}^2 + p_{iz}^2}{2mkT} \right) dp^{3N} \int \cdots \int \exp(-U/kT) \, dr^{3N} = \frac{\Lambda^{-3N}}{N!} Z$

$\Lambda = \left( \frac{h^2}{2 \pi m k T} \right)^{1/2}$   (thermal de Broglie wavelength)

$Z = \int \cdots \int \exp(-U/kT) \, dr_1 dr_2 \cdots dr_{3N}$    Configurational Integral

$Q = Q_{int} \, \frac{\Lambda^{-3N}}{N!} Z$
Another, Different Treatment
Statistical Thermodynamics:
the basics
• Nature is quantum-mechanical
• Consequence:
– Systems have discrete quantum states.
– For finite “closed” systems, the number of states is
finite (but usually very large)
• Hypothesis: In a closed system, every state is
equally likely to be observed.
• Consequence: ALL of equilibrium Statistical
Mechanics and Thermodynamics
Basic assumption

Each individual microstate is equally probable …, but there are not many microstates that give these extreme results.

If the number of particles is large (>10), these functions are sharply peaked.
Does the basic assumption lead to something that is consistent with classical thermodynamics?

[Diagram: two subsystems with energies E_1 and E_2 = E − E_1]

Systems 1 and 2 are weakly coupled such that they can exchange energy.

What will E_1 be?

$\Omega(E_1, E - E_1) = \Omega_1(E_1) \times \Omega_2(E - E_1)$

BA: each configuration is equally probable; but the number of states that give an energy E_1 is not known.
$\Omega(E_1, E - E_1) = \Omega_1(E_1) \times \Omega_2(E - E_1)$
$\ln \Omega(E_1, E - E_1) = \ln \Omega_1(E_1) + \ln \Omega_2(E - E_1)$

$\left( \frac{\partial \ln \Omega(E_1, E - E_1)}{\partial E_1} \right)_{N_1,V_1} = 0$    Energy is conserved! $dE_1 = -dE_2$

$\left( \frac{\partial \ln \Omega_1(E_1)}{\partial E_1} \right)_{N_1,V_1} - \left( \frac{\partial \ln \Omega_2(E - E_1)}{\partial E_2} \right)_{N_2,V_2} = 0$

This can be seen as an equilibrium condition:

$\left( \frac{\partial \ln \Omega_1(E_1)}{\partial E_1} \right)_{N_1,V_1} = \left( \frac{\partial \ln \Omega_2(E - E_1)}{\partial E_2} \right)_{N_2,V_2}$

$\beta_1 = \beta_2, \qquad \beta \equiv \left( \frac{\partial \ln \Omega(E)}{\partial E} \right)_{N,V}$
Entropy and number of
configurations
Conjecture:  $S = \ln \Omega$

Almost right.
• Good features:
  • Extensivity
  • The third law of thermodynamics comes for free
• Bad feature:
  • It assumes that entropy is dimensionless, but (for unfortunate, historical reasons) it is not …

We have to live with the past, therefore

$S = k_B \ln \Omega(E)$

with $k_B$ = 1.380662 × 10⁻²³ J/K
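A minimal sketch (not from the slides) illustrating the extensivity of S = k_B ln Ω with a toy system of N independent two-state spins, for which Ω = 2^N and therefore S = N k_B ln 2:

```python
import math

k_B = 1.380662e-23          # J/K (value quoted in the slide)

def entropy(N):
    ln_omega = N * math.log(2)   # ln(2**N), computed without forming the huge number
    return k_B * ln_omega

print(entropy(10), entropy(20))  # doubling N exactly doubles S
```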
In thermodynamics, the absolute (Kelvin)
temperature scale was defined such that
$\left( \frac{\partial S}{\partial E} \right)_{N,V} = \frac{1}{T}, \qquad dE = T\,dS - p\,dV + \sum_{i=1}^{n} \mu_i \, dN_i$

But we found (defined):

$\beta = \left( \frac{\partial \ln \Omega(E)}{\partial E} \right)_{N,V}$

And this gives the "statistical" definition of temperature:

$\frac{1}{T} = k_B \left( \frac{\partial \ln \Omega(E)}{\partial E} \right)_{N,V}$
In short:
Entropy and temperature are both related to
the fact that we can COUNT states.

Basic assumption:
1. leads to an equilibrium condition: equal temperatures
2. leads to a maximum of entropy
3. leads to the third law of thermodynamics
Number of configurations
How large is Ω?
• For macroscopic systems, super-astronomically large.
• For instance, for a glass of water at room temperature:

$\Omega \approx 10^{\,2 \times 10^{25}}$
•Macroscopic deviations from the second law of
thermodynamics are not forbidden, but they are
extremely unlikely.
Canonical ensemble

Consider a small system that can exchange heat with a big reservoir:

[Diagram: a small system with energy E_i embedded in a reservoir with energy E − E_i]

$\ln \Omega(E - E_i) = \ln \Omega(E) - E_i \frac{\partial \ln \Omega}{\partial E} + \cdots, \qquad \frac{\partial \ln \Omega}{\partial E} = \frac{1}{k_B T}$

$\ln \frac{\Omega(E - E_i)}{\Omega(E)} = -\frac{E_i}{k_B T}$

Hence, the probability to find E_i:

$P(E_i) = \frac{\Omega(E - E_i)}{\sum_j \Omega(E - E_j)} = \frac{\exp(-E_i / k_B T)}{\sum_j \exp(-E_j / k_B T)}$

$P(E_i) \propto \exp(-E_i / k_B T)$    Boltzmann distribution
Example: ideal gas

$Q(N, V, T) = \frac{1}{\Lambda^{3N} N!} \int d\mathbf{r}^N \exp\left[ -\beta U(\mathbf{r}^N) \right] = \frac{V^N}{\Lambda^{3N} N!}$

Helmholtz free energy:

$\beta F = -\ln Q = -N \ln V + 3N \ln \Lambda + \ln N! \approx N \ln \frac{N}{V} + 3N \ln \Lambda - N$

Pressure:

$P = -\left( \frac{\partial F}{\partial V} \right)_{T,N} = \frac{N k_B T}{V}$

Energy:

$E = \left( \frac{\partial (\beta F)}{\partial \beta} \right)_{V,N} = \frac{\partial (3N \ln \Lambda)}{\partial \beta} = \frac{3}{2} N k_B T$

Thermo recall:  $dF = -S\,dT - p\,dV$, $\quad P = -\left( \frac{\partial F}{\partial V} \right)_T$, $\quad E = \left( \frac{\partial (F/T)}{\partial (1/T)} \right)$
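A minimal sketch (not from the slides) that recovers E = (3/2) N k_B T for the ideal gas from βF = −ln Q by differentiating with respect to β numerically (all input values are illustrative):

```python
import math

k = 1.380649e-23
h = 6.62607015e-34
m = 6.63e-26          # kg, argon-like
N = 1.0e4             # number of particles (illustrative)
V = 1.0e-6            # m^3

def betaF(beta):
    lam = h * math.sqrt(beta / (2 * math.pi * m))   # thermal wavelength Lambda
    # beta*F = -ln Q = -N ln V + 3N ln(Lambda) + ln N!   (Stirling for ln N!)
    return -N * math.log(V) + 3 * N * math.log(lam) + (N * math.log(N) - N)

T = 300.0
beta = 1.0 / (k * T)
db = beta * 1e-6
E = (betaF(beta + db) - betaF(beta - db)) / (2 * db)   # d(beta F)/d(beta)
print(E, 1.5 * N * k * T)                              # the two agree
```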
Ensembles
• Micro-canonical ensemble: E,V,N
• Canonical ensemble: T,V,N
• Constant pressure ensemble: T,P,N
• Grand-canonical ensemble: T,V,μ