
# Chapter 1: Introduction

## 1.1 The Purpose of Statistical Mechanics

Statistical mechanics is the mechanics developed to treat a collection of a large number of atoms or particles. Such a collection is, for example, a solid made up of N ≈ 10^23 atoms, or a liquid or a gas of 10^23 molecules.

Ordinary mechanics, classical or quantum, is suited to treating the behavior of one, two or at most a few bodies. There we set up an equation of motion for the body (Newton's equations in classical mechanics, for example) and solve this equation to find how the momentum and position of the body vary with time. With this equation of motion, and once we have specified the initial state of the body by giving its position and momentum at some initial time, we can predict the behavior of the body at all future times. If we attempt this approach for 10^23 atoms, we face not only an impossible task, but the result would not be useful: it does not help us much to know the individual motion of each atom, nor to describe the properties of an enormous collection of individual motions. What we want, rather, is the average or macroscopic properties of the 10^23 atoms and how these are related to the microscopic interatomic interactions. We want to know particularly how the atoms are distributed over the possible states available to them, from which we can construct average values. These average values constitute the macroscopic properties of the liquid, solid or gas.
With this in mind we may say that statistical mechanics has two purposes. Firstly, given an assembly of N identical systems, how are these systems distributed over the states available to them? This might be the distribution over the possible momentum values or, in the case of an external field, how the systems are distributed in position. Strictly, only the distribution over energy, the number of systems in a given energy interval, is required, since the other distributions may be obtained from the energy distribution. Once we have this distribution, which is often of physical interest in itself, we may evaluate average properties such as the energy, pressure and specific heat, which make up the thermodynamic properties of the assembly of systems.
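The last step, forming averages from a distribution over energy states, can be sketched numerically. This is a toy illustration; the state energies and occupation numbers below are invented values, not taken from the text:

```python
# Toy illustration: an average formed from a distribution over states.
# The energies E_s and occupation numbers N(s) are invented example values.
energies = [0.0, 1.0, 2.0, 3.0]    # energy of each state s (arbitrary units)
occupations = [50, 30, 15, 5]      # number of systems found in each state s

N = sum(occupations)               # total number of systems in the assembly
# Average energy: weight each state energy by the fraction of systems in it.
E_avg = sum(n * e for n, e in zip(occupations, energies)) / N
print(E_avg)                       # -> 0.75
```

The same weighting gives any other average property once the distribution is known.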

This leads to the second purpose of statistical mechanics. It provides the link between the microscopic properties of a many-body system and its macroscopic character; it is the link between the microscopic particle dynamics and interactions, on the one hand, and the thermodynamic properties of the system made up of these particles, on the other. This link is made through the averaging process via the distributions discussed above.

## 1.2 The Mathematical Model

We have been discussing an assembly of N identical systems. We have, however, not really said what these systems are. There are two basic interpretations, one due to Ludwig Boltzmann and the other due later to Willard Gibbs. The two interpretations are conceptually quite different, but the mathematical method of treating the assembly is identical. Hence the concept of an assembly of N identical systems should be viewed as a mathematical model. Fundamentally, the assembly of systems is a mathematical tool which we may interpret physically in different ways. Only the physical interpretation of the systems is different in the Gibbs and Boltzmann pictures of statistical mechanics.

We seek, mathematically, the distribution of the N systems over the possible energy states available to the systems, given that the total energy of the assembly is fixed. The number N of systems in the assembly is also fixed. There are a number of methods; we solve the model here using the method of the most probable distribution introduced by Boltzmann.

How do we specify microscopically the state of a large, many-particle system? If we had one or two particles, we know from classical mechanics that we need only specify the position $\vec{x}_i$ and momentum $\vec{p}_i$ of the particles, and this (with the equations of motion) completely determines the present and future behavior. We can regard $\vec{x}_i$ and $\vec{p}_i$ as vectors to points in a three-dimensional configuration space and a three-dimensional momentum space, respectively. If we combine the two spaces, we will have a single point in a six-dimensional space. The combined position and momentum space is referred to as PHASE SPACE. If the body we wish to specify contained only a single particle, this point in the six-dimensional phase space would specify the state of the system completely.

We specify the state of a many-body system in the same way, by specifying the $\vec{x}_i$ and $\vec{p}_i$ of each of the $i = 1$ to $N_S$ bodies. We denote the number of bodies in the system by $N_S$ to distinguish it from the number N of systems in the assembly. The dimension of the phase space is now $6N_S$, six dimensions for each of the $N_S$ particles. A point in this $6N_S$-dimensional space completely represents the state of the $N_S$-body system.

In statistical mechanics we seek the distribution of a collection of identical such systems over the possible energy states available to them. This distribution can be represented by a distribution of representative points in the $6N_S$-dimensional phase space, or Γ space. This concept of an assembly of N systems and their distribution over the elements ∆Γ of phase space is central to statistical mechanics.

## 1.3 The Gibbs Interpretation: Time and Space Averages

In the Gibbs interpretation, the system is the actual solid, liquid or gas containing many atoms that we wish to study. Consider a large block of solid, say of copper, containing ∼ 10^23 atoms. We could partition this block into 10^5 or 10^6 equal parts, with each part remaining large enough that it still represents the character of a block of copper; that is, each part has the same character as the initial system. Each part can then be taken as the system. The idea is to select one of these parts as our system and regard all the other parts as mental copies of it. In this way we may imagine our system immersed in a heat bath made up of identical mental copies of the initial system; here the assembly is the collection of mental copies. We assume loose mechanical contact but good thermal contact between the copies, so that the temperature will be uniform in the block. As energy fluctuations take place in the block, energy is exchanged between the one system selected and the mental copies.

This system may be in a number of states, and we naturally want to include and describe all possible states of the system. To do this we construct an assembly of mental copies of the system in its different states. We might picture constructing our assembly of mental copies in the following way. Clearly, to represent all possible states, each possible state of the system must appear at least once in the assembly; we must have enough mental copies that each possible state, however rare, is represented at least once. The number of copies appearing in a given state gives the probability of seeing the selected system in that state. This assembly is also often referred to as an ENSEMBLE in the Gibbs method. In constructing it we will assume that initially there is an equal probability of finding a system in any energy state or in any region of phase space; that is, there is no a priori reason to choose one region over another to begin our probability arguments. This assumption is discussed in more detail in section 1.5.

If we focus first on the selected system, we could watch it for a long time. During this time heat will be exchanged with the heat bath (the remainder of the block) and the system's energy will fluctuate. If we observe the system for a long enough time T, we should observe it in all its possible states. The last sentence is basically the statement of the ERGODIC HYPOTHESIS. If we counted the number of times, N(s), that we observed the system in state s during time T, we could then give the average value of a property A of the system as

$$\langle A \rangle = \frac{1}{T} \int_0^T dt_s \, A(t_s) \tag{1.1}$$

where $A(t_s)$ is the value of A at time $t_s$ when the system is in state s.

Since N(s) counts the number of times the system was observed in state s, we could also write this time average as

$$\langle A \rangle = \frac{\sum_s N(s) A(s)}{\sum_s N(s)} \tag{1.2}$$

where A(s) is the value of A in the state s(t) observed at time t.

We now turn to our N mental copies of the selected system and count, at one instant, the number of copies in state s. The latter average we can write as

$$\langle A \rangle = \frac{\sum_s N(s) A(s)}{\sum_s N(s)} = \sum_s \frac{N(s)}{N} A(s) = \sum_s \rho(s) A(s) \tag{1.3}$$

where ρ(s) is the fraction of copies in the ensemble observed in state s, and the sum is interpreted as a sum over all the states appearing among the N mental copies. If we regard the mental copies of the system as constructed by partitioning a large body (e.g. our block of copper) into identical systems, then this second counting is a space average over the larger body, or an ensemble average over the N mental copies. The last average may be regarded as a space average over the whole block, since we could convert it to an integral over the block,

$$\langle A \rangle = \frac{1}{V} \int d\vec{r} \, A(\vec{r}) \tag{1.4}$$

where V is the volume of the block.

The ergodic hypothesis states that the time and space (ensemble) averages in (1.1) and (1.4) are identical. Such a theorem cannot be, and has never been, "proved". However, it is intuitively reasonable that, provided T is long enough and provided the number of copies N is large enough, the time average (1.1) should equal the space average (1.4).

How do we specify the states s of our system, which is now composed of a large number M of atoms or molecules? To do this we go back to dynamics, as noted above, where we saw that we may specify the state of a single particle by its momentum $\vec{p}_i$ and position $\vec{x}_i$. Once the initial momentum and position of a single particle are given, then, classically at least, the complete motion is specified for all times by the equation of motion the particle satisfies. This is obviously true in classical mechanics, but in quantum mechanics $\vec{x}_i$ and $\vec{p}_i$ can be specified only within Heisenberg's uncertainty limit, $\Delta\vec{p}_i \, \Delta\vec{x}_i \ge h^3$, where $h = 6.626 \times 10^{-27}$ erg s. We will take this as the limit to which we attempt to specify $\vec{x}_i$ and $\vec{p}_i$; outside this limit $\vec{x}_i$ and $\vec{p}_i$ are regarded as statistically independent. We also take $\vec{x}_i$ and $\vec{p}_i$ as independent. In this way we specify the state of a particle by locating $\vec{p}_i$ and $\vec{x}_i$ within a six-dimensional "cube" of sides $h^{1/2}$.
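As a small numerical sketch (with invented counts, not values from the text), the counting form (1.2) and the fraction form (1.3) of the average agree by construction:

```python
# Toy check that the counting form (1.2) and the fraction form (1.3)
# of the ensemble average agree. N(s) and A(s) are invented example values.
counts = {"s1": 40, "s2": 35, "s3": 25}     # N(s): copies observed in state s
values = {"s1": 1.0, "s2": 2.0, "s3": 4.0}  # A(s): value of A in state s

N = sum(counts.values())
avg_12 = sum(counts[s] * values[s] for s in counts) / N   # Eq. (1.2)
rho = {s: counts[s] / N for s in counts}                  # fractions rho(s)
avg_13 = sum(rho[s] * values[s] for s in counts)          # Eq. (1.3)
print(avg_12, avg_13)
```

Whether the two forms also agree with the time average (1.1) is exactly the content of the ergodic hypothesis, which cannot be checked this way.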

The instantaneous state of our system of $N_S$ particles is then specified by giving the position $\vec{x}_i$ and momentum $\vec{p}_i$ of each constituent. If the constituents are atoms (which we regard as units having no internal degrees of freedom), then we have a total of $6N_S$ coordinates to specify. If the constituents making up the system are $N_S$ molecules having f degrees of freedom, then we need to fix 2f coordinates to locate each molecule, and a $2fN_S$-dimensional phase space to specify the system. Once this is done, the state of the system may be represented by a point in the $6N_S$-dimensional phase space, or Γ space. This state point is in the volume element

$$d\Gamma = \prod_{i=1}^{N_S} d\vec{p}_i \, d\vec{x}_i$$

of phase space. The motion of the particles in the system can then be described by the motion of this point through phase space. The motion of the point in Γ describes the redistribution of energy between kinetic and potential as the particles (e.g. atoms) interact, and as the system loses (or gains) total energy through contact and exchange of energy with the heat bath surrounding it. Finally, if the $N_S$ values of $\vec{x}_i$ and $\vec{p}_i$ are fixed, then the total energy of the system,

$$E = \sum_{i=1}^{N_S} \frac{p_i^2}{2m_i} + V(\vec{x}_1, \ldots, \vec{x}_{N_S}),$$

is fixed.

Thus, the great power of the Gibbs interpretation of statistical mechanics is that we have said nothing about the interaction between the atoms, molecules or more elementary particles that make up the system. This means that the averages we obtain will be valid for arbitrary interaction among the particles. The Boltzmann interpretation to follow, although simpler, is valid only for very weakly interacting particles, which means in practice that it is restricted to dilute gases.

## 1.4 Boltzmann's Statistical Mechanics

Ludwig Boltzmann (1844-1906) was interested primarily in the statistics and thermodynamics of gases. In a gas the particles are nearly independent, interacting only rarely with other particles. For example, in a gas at standard temperature and pressure (STP) the average interparticle spacing is ∼ 40 Å, which is more than 10 times a typical atomic or molecular radius. Collisions will therefore be rare, and particularly for a very dilute gas we may regard each atom as a non-interacting particle. The limit of an infinitely dilute gas, in which the particles do not interact at all, is called a perfect or ideal gas. In this case each particle is clearly independent. Boltzmann identifies the system with a single particle, and the assembly of the N identical systems (particles) with the gas under study. In this case we are seeking the distribution of the $N = N_S$ particles over the possible energy, momentum or velocity states.
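The quoted spacing of tens of ångströms can be checked with an order-of-magnitude estimate from the ideal gas law. The exact figure depends on the conditions assumed; this sketch uses 1 atm and 0 °C:

```python
import math

# Order-of-magnitude estimate of the mean interparticle spacing in a gas
# at standard temperature and pressure, using the ideal gas law n = P/(kT).
k_B = 1.380649e-23      # Boltzmann constant, J/K
P = 101325.0            # pressure, Pa (1 atm)
T = 273.15              # temperature, K (0 degrees C)

n = P / (k_B * T)                    # number density, particles per m^3
spacing = n ** (-1.0 / 3.0)          # mean interparticle spacing, m
spacing_angstrom = spacing * 1e10    # convert to angstroms
print(round(spacing_angstrom, 1))    # ~33 angstroms: tens of angstroms
```

This gives roughly 33 Å, consistent with the ∼ 40 Å order of magnitude in the text and indeed much larger than a typical atomic radius of a few ångströms.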

We are able to regard the particles as the systems in a gas because, as we shall see, they are statistically independent units. Since each system is a single particle in the gas, once we have one distribution (e.g. over energy) we have the others, since for a free particle the energy, momentum and velocity are simply related by

$$\epsilon_i = \frac{p_i^2}{2m} = \frac{1}{2} m v_i^2 .$$

The ∆Γ are called CELLS in phase space, and we select a cell size small enough that we can distinguish the different states; each cell of this size is taken to contain one state of the system. We did not, however, decide how large the cells in Γ space should actually be. Due to Heisenberg's uncertainty principle we cannot locate a state more accurately than $\Delta p_{i\alpha} \, \Delta x_{i\alpha} \ge h$, so we ought to choose each cell at least as big as $h^{3N_S}$. Equally, we could make the cell large. The upper limit would be reached if the properties of the system changed substantially as we moved the point within the cell; this would tell us that we must divide the cell so that different states are represented by different cells. In other words, if there is more than one distinguishable state within a cell, we must reduce the size of the cell to obtain a precise description of the system. In statistical mechanics we choose the size of the cell as $h^{3N_S}$, so that the number of states in the volume dΓ is

$$\frac{d\Gamma}{h^{3N_S}} .$$

This choice is really arbitrary, and most properties, particularly in classical cases, are independent of this size. The absolute value of the entropy does, however, depend upon this choice of cell size. We can also show that for the microscopic distributions we seek, the cell size is not too large.

In Boltzmann's statistical mechanics the cell size is $h^3$. In fact this cell is too small to contain a large number of atoms: for a gas at STP there is only one chance in ∼ 100,000 that a cell contains an atom. This means that we will have large statistical fluctuations in the occupations of the cells (from 1 to 0, with rarely more than 1 atom in a cell). Boltzmann's combinatorial method, however, depends mathematically on a uniform variation of occupation among adjacent cells. This apparent contradiction between the cell size and the mathematical requirements of Boltzmann's combinatorial method led many to criticize and dismiss his statistical mechanics. The problem can, however, be overcome by combining cells into groups large enough to contain a large number of particles. This is valid since the observable particle energy $\epsilon = p^2/2m$ varies slowly over the larger elements.

## 1.5 The Basic Assumption of Statistical Mechanics

In discussing Gibbs' statistical mechanics we saw that the state of an $N_S$-particle system can be represented or specified by a point in an element $\Delta\Gamma = \prod_i \Delta\vec{p}_i \, \Delta\vec{x}_i$ of the $6N_S$-dimensional phase space Γ. This identifies the momentum $\vec{p}_i$ and position $\vec{x}_i$ of each of the $N_S$ particles in the system. We saw also that, due to Heisenberg's uncertainty principle, we could not locate the point more precisely than $\Delta p_{i\alpha} \, \Delta x_{i\alpha} \ge h$ in Γ.
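The "one chance in ∼ 100,000" figure can be sketched with an order-of-magnitude estimate: the fraction of occupied $h^3$ cells is roughly the number density times the cube of the thermal de Broglie wavelength, $n\lambda^3$. The choice of helium as the gas, and the STP conditions, are my illustrative assumptions:

```python
import math

# Order-of-magnitude occupancy of an h^3 phase-space cell for a gas at STP.
# Fraction of occupied single-particle states ~ n * lambda^3, where lambda
# is the thermal de Broglie wavelength. Helium is an assumed example gas.
h = 6.62607015e-34       # Planck constant, J s
k_B = 1.380649e-23       # Boltzmann constant, J/K
m = 6.6464731e-27        # mass of a helium atom, kg (illustrative choice)
P, T = 101325.0, 273.15  # standard pressure (Pa) and temperature (K)

n = P / (k_B * T)                               # number density, m^-3
lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal de Broglie wavelength
occupancy = n * lam**3                          # fraction of occupied cells
print(occupancy)                                # ~1e-6 to 1e-5: mostly empty
```

The result is of order 10^-6 to 10^-5, the same order as the text's one-in-100,000 estimate: almost every cell is empty, which is the source of the large occupation fluctuations.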

Given that the state of a system is represented by a point in phase space, in what cell do we place this point? That is, how do we weight or choose the probability that a cell is occupied? The basic assumption of statistical mechanics is that each region or cell of phase space is intrinsically equally likely to be occupied. Equally, each energy state of the system is equally likely to be occupied. This assumption is based on the fact that phase space is everywhere the same, and we have no reason to choose or favor one region over another; it is a symmetry argument. By analogy, if we had symmetric dice, with each of the six sides identical, we would say that each side is equally likely to appear facing up when the dice is thrown. Only if we weighted the faces with some additional conditions would any faces be favored. We shall see that when we weight our statistical arguments with external conditions, certain regions of phase space become more heavily occupied than others. But before these conditions are imposed, we assume that each region is intrinsically equally likely to be occupied. This is often referred to as the "Equal A Priori Probability of all regions of Phase Space".

## 1.6 The Three Statistics: Classical, Fermi and Bose

To introduce the idea of particle statistics, we consider an assembly of non-interacting systems: an assembly of non-interacting atoms (the systems). A physical example to keep in mind is a perfect gas. Since we have taken the interactions to be negligible, there remains only the intrinsic character of the particles to distinguish one gas from another. We then have two cases:

1. The Classical Case: Here the particles are large enough, or the temperature of the gas is high enough, that classical mechanics accurately describes the particles. In this case the de Broglie wavelength (the spread of each particle in space) is much less than the inter-particle spacing (λ ≪ r), so that each particle can be clearly distinguished from another. With this distinguishability we can trace the path of a single particle through the gas and identify it at each point. In this case we can distinguish between states in which two particles have interchanged positions.

2. The Quantum Case: Here the particle de Broglie wavelength is comparable to or greater than the inter-particle spacing (λ ≳ r) and, due fundamentally to Heisenberg's uncertainty principle, we cannot trace the path of a particle in the gas. We thus cannot distinguish between particles. This means that in enumerating the states of a system we cannot distinguish between states in which two particles are interchanged; we must count them as the same state.
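The λ versus r criterion can be made quantitative with the thermal de Broglie wavelength $\lambda = h/\sqrt{2\pi m k T}$, a standard measure of the quantum "spread" of a particle. The choice of helium at room conditions in this sketch is an illustrative assumption:

```python
import math

# Classical vs quantum criterion: compare the thermal de Broglie wavelength
# to the mean interparticle spacing. Helium at room conditions is an assumed
# example; the gas is classical when lambda << spacing.
h = 6.62607015e-34       # Planck constant, J s
k_B = 1.380649e-23       # Boltzmann constant, J/K
m = 6.6464731e-27        # helium atom mass, kg
P, T = 101325.0, 300.0   # 1 atm, room temperature

n = P / (k_B * T)                               # number density, m^-3
spacing = n ** (-1.0 / 3.0)                     # mean interparticle spacing, m
lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal de Broglie wavelength
print(lam < 0.1 * spacing)                      # True: well in classical regime
```

Here λ is about 0.5 Å against a spacing of tens of ångströms, so such a gas is firmly classical; lowering T or raising the density pushes λ toward r and into the quantum case.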

In Quantum Mechanics the probability of observing a particular distribution of the N particles in the gas is given by

$$|\psi(x_1, x_2, \ldots, x_n)|^2 .$$

All observable properties depend upon the square of the wave function ψ. Since the particles are indistinguishable, this $|\psi|^2$ must be symmetric with respect to interchange of two particles; that is, we cannot tell when two particles are interchanged. This symmetry of $|\psi|^2$ will be maintained if ψ is symmetric with respect to interchange, or if ψ is anti-symmetric with respect to interchange. In the last case ψ changes sign when two particles are interchanged, so that $|\psi|^2$ remains unchanged.

To investigate the properties of this symmetry further, consider a system of two particles. If the particles are non-interacting, we could write the total wave function of the pair as $\psi(1,2) = \phi_a(1)\,\phi_b(2)$, which is a product of two single-particle functions. This ψ(1, 2), however, does not have the correct symmetry, since if we interchange particles 1 and 2 we obtain neither (a) ψ(2, 1) = ψ(1, 2) nor (b) ψ(2, 1) = −ψ(1, 2). We can, however, construct from $\phi_a$ and $\phi_b$ a pair function which is symmetric and one which is anti-symmetric,

$$\psi^{\rm sym}(1,2) = \phi_a(1)\,\phi_b(2) + \phi_a(2)\,\phi_b(1)$$

for which $\psi^{\rm sym}(2,1) = \psi^{\rm sym}(1,2)$, and

$$\psi^{\rm anti}(1,2) = \phi_a(1)\,\phi_b(2) - \phi_b(1)\,\phi_a(2)$$

for which $\psi^{\rm anti}(2,1) = -\psi^{\rm anti}(1,2)$.

The $\psi^{\rm sym}(1,2)$ and $\psi^{\rm anti}(1,2)$ have the correct symmetry, corresponding to the symmetric and anti-symmetric cases respectively, and each pair of particles must have a wave function of one of these forms. If we now try to put two particles in the same state, i.e. $\phi_a = \phi_b$, we see that $\psi^{\rm anti}(1,2)$ vanishes. Thus, for Fermi particles, which have anti-symmetric wave functions, we can put only one Fermion in each single-particle state; in placing particles in the possible states available, we will be able to assign at most one particle per state. This restriction on Fermions leads to Fermi-Dirac statistics. For the Boson case there is no such restriction, and we may assign any number of particles to a single-particle state.

It is an observed property that particles having integer spin (s = 0, 1, 2, ...) have symmetric wave functions, while particles having half-integral spin have anti-symmetric wave functions. The integer spin particles are called BOSONS and the half-integral spin particles are called FERMIONS. In most cases we will consider only spin-0 Bosons and spin-1/2 Fermions.

In summary, we have three cases:
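The symmetry properties and the vanishing of $\psi^{\rm anti}$ for $\phi_a = \phi_b$ can be checked with a toy numerical model; the single-particle functions below are invented examples, not physical wave functions:

```python
# Toy check of pair wave-function symmetry. The single-particle functions
# phi_a and phi_b are invented examples, evaluated at coordinates x1, x2.
def phi_a(x):
    return x          # made-up single-particle state a

def phi_b(x):
    return x * x      # made-up single-particle state b

def psi_sym(x1, x2):   # symmetric combination
    return phi_a(x1) * phi_b(x2) + phi_a(x2) * phi_b(x1)

def psi_anti(x1, x2):  # anti-symmetric combination
    return phi_a(x1) * phi_b(x2) - phi_b(x1) * phi_a(x2)

x1, x2 = 0.3, 0.7
print(psi_sym(x2, x1) == psi_sym(x1, x2))      # True: unchanged by interchange
print(psi_anti(x2, x1) == -psi_anti(x1, x2))   # True: changes sign

# Pauli exclusion: with phi_a = phi_b the anti-symmetric pair function vanishes.
def psi_anti_same(x1, x2):
    return phi_a(x1) * phi_a(x2) - phi_a(x1) * phi_a(x2)

print(psi_anti_same(x1, x2) == 0.0)            # True: no two Fermions per state
```

In either case $|\psi|^2$ is unchanged by the interchange, which is all that indistinguishability requires.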

1. Classical Particles
   • distinguishable
   • leads to Maxwell-Boltzmann statistics

2. Bosons (integer spin particles)
   • indistinguishable; can have any number in one state
   • leads to Bose-Einstein statistics

3. Fermions (half-integer spin particles)
   • indistinguishable; can have only one particle per state
   • leads to Fermi-Dirac statistics

Finally, we note that we have considered a system of non-interacting particles in order to introduce the ideas of statistics. Yet a key idea is that of statistical independence. What constitutes a statistically independent system? Firstly, for classical particles, the de Broglie wavelengths λ are much less than the inter-particle spacing. Thus the particles are distinguishable and well isolated from one another; classical particles can be independent because their wave functions are well localized compared to the inter-particle spacing. If they also interact weakly or rarely, then they can be regarded as statistically independent. We will find that the probability of observing such a system in energy state E is proportional to $e^{-E/kT}$, the Boltzmann factor.

On the other hand, quantum particles have widely spread wave functions which overlap with the wave functions of other particles. They interact effectively via the overlap of their wave functions. In this case the quantum particles can never be statistically independent, even if the inter-particle force is weak. In the quantum case the state of the whole system is unchanged if we interchange two particles in their states, and the wave function describing the whole system must again be either symmetric or anti-symmetric with respect to particle interchange. The probability of observing the quantum particles having energy ε is then not given simply by the Boltzmann factor. Rather, this probability is given by the Fermi-Dirac distribution for Fermions and by the Bose-Einstein distribution for Bosons.

Secondly, for cases in which the particles of a body are not statistically independent, either because they are classical and strongly interacting or because they are quantum, we seek modes or excitations of the body which are independent. Examples are phonons in solids or in liquid ⁴He, or the "quasi-particles" of liquid ³He. These are then described by the Boltzmann factor. The mathematical model we will treat is an assembly of N weakly interacting and statistically independent systems.

The three statistics hold, as we shall see, for interacting particles as well; classical particles, for example, remain distinguishable when they interact. The only change is that the states a particle can occupy are modified by the inter-particle interaction: they are more complicated to work out and are certainly not independent single-particle states.
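The three distributions named above can be compared directly. This sketch evaluates the standard occupation-number formulas at the same reduced energy $x = (\epsilon - \mu)/kT$; for x > 0 the Bose-Einstein occupation exceeds the Maxwell-Boltzmann one, which exceeds the Fermi-Dirac one, and all three agree in the dilute (large x) limit:

```python
import math

# Mean occupation number of a single-particle state at reduced energy
# x = (eps - mu) / kT for the three statistics.
def maxwell_boltzmann(x):
    return math.exp(-x)

def bose_einstein(x):          # valid for x > 0
    return 1.0 / (math.exp(x) - 1.0)

def fermi_dirac(x):
    return 1.0 / (math.exp(x) + 1.0)

x = 1.0
n_mb, n_be, n_fd = maxwell_boltzmann(x), bose_einstein(x), fermi_dirac(x)
print(n_fd < n_mb < n_be)      # True: FD < MB < BE at the same energy

# In the dilute limit (large x) both quantum statistics reduce to the
# classical Boltzmann factor, as the chapter's discussion suggests.
x_big = 20.0
print(abs(bose_einstein(x_big) - maxwell_boltzmann(x_big)) < 1e-10)   # True
print(abs(fermi_dirac(x_big) - maxwell_boltzmann(x_big)) < 1e-10)     # True
```

The +1 and −1 in the denominators are exactly the one-per-state restriction on Fermions and the absence of any restriction on Bosons, made quantitative.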