Lecture 10

I. INTRODUCTION

I have to confess at this time to having performed a "sleight of hand" earlier on when deriving the formulae for the Bose-Einstein and Fermi-Dirac distributions. You will recall that the final formulae included a term, μ, the so-called chemical potential. Now, this appeared in a rather arbitrary way, without full justification. The astute amongst you might also have noticed that we never derived a partition function for Fermi-Dirac or Bose-Einstein systems. Furthermore, for the systems where we did derive a partition function, we assumed that we had a fixed number of particles, N, and a fixed energy U.

A moment's thought should convince you that there might be systems where such an approach will not work. For example, even if we have a fixed number of monatomic ideal gas particles in a box, surely there is some small probability that the particles in the box will have a little more, or a little less, energy than 3Nk_BT/2, as there is some chance that the atoms making up the walls of the box might give up a little more or less of the energy that they possess to the gas atoms. Or consider an even more complex case: consider a cubic centimetre of air in the room, and draw an imaginary box around it. Although you know the density and total energy of all of the air in the room, there will be some small fluctuations in the exact number of particles, and the exact total energy of the particles, within the given cubic centimetre. Evidently we are not going to be able to learn anything about such fluctuations merely by taking an approach that assumes that the total energy and total particle number are always fixed: a more sophisticated approach is necessary.

The more sophisticated approach is the method of ensembles. Those of you who go on to take the theory option will study the following concepts in much more detail, and my intention here is simply to give you a taster. Up until now we have been considering boxes, or sub-systems, of N atoms with total energy U. Consider many, many such sub-systems - the whole lot taken together is called an ensemble. There are three types of ensembles that we can imagine:

1. An ensemble of sub-systems, with each sub-system containing a fixed number of atoms and having fixed energy. This is called the microcanonical ensemble, and is in fact the ensemble that we have considered up to now in these notes, although we have not hitherto referred to it explicitly.

2. Imagine a sub-system in thermal contact with a very large reservoir, so energy can flow between each sub-system and its reservoir. The total energy of the sub-system+reservoir is kept fixed, but the energy of the particular sub-system itself is not fixed, as it can exchange energy with its reservoir. However, the sub-system still has a fixed number of particles within it. Now consider replicating many, many such sub-systems and their associated reservoirs - such an ensemble is known as the canonical ensemble.

3. Imagine a sub-system again in contact with a very large reservoir, but now such that both energy and particles can flow between the sub-system and the reservoir. The total energy and the total number of particles of the sub-system+reservoir are fixed, but the energy and number of particles in the sub-system itself can vary, as it can exchange energy and particles with the reservoir.
Now consider replicating many, many such sub-systems and their associated reservoirs - such an ensemble is known as the grand canonical ensemble - which sounds like a great name for a physicist's jazz band.

By considering the grand canonical approach we shall see that we arrive at a much more complete, and more satisfying, derivation of the Bose-Einstein and Fermi-Dirac formulae, in which the chemical potential is explicitly introduced at the beginning. Thus the present lecture can be thought of as a bridge between the simplified form of statistical mechanics used up to now and the more complete theory needed at a higher level.

II. THE CHEMICAL POTENTIAL

Before we set about redefining the statistical arguments needed to solve this problem, we wish to consider for the moment a particular bit of macroscopic classical thermodynamics. Normally we are accustomed to thinking of the internal energy U as being a function of heat and work, expressed, for instance, by a relationship involving the functions of state S and V, thus

U = U(S, V) \qquad (1)

The most common expression of this relationship is

dU = T\,dS - P\,dV \qquad (2)

Now this equation is only valid if we define a thermodynamic system in which the amount of material contained is held constant. But suppose now we allow for the possibility that N_i particles of a substance i are added to the system. Clearly this will affect the internal energy, and so we can rewrite equation (1) as

U = U(S, V, N_i) \qquad (3)

This functional relationship can then be written as

dU = \left(\frac{\partial U}{\partial S}\right)_{V,N_i} dS + \left(\frac{\partial U}{\partial V}\right)_{S,N_i} dV + \sum_i \left(\frac{\partial U}{\partial N_i}\right)_{S,V,N_{j\neq i}} dN_i \qquad (4)

By comparison with equation (2), we can then write

dU = T\,dS - P\,dV + \sum_i \mu_i\, dN_i \qquad (5)

where the chemical potential is given by

\mu_i = \left(\frac{\partial U}{\partial N_i}\right)_{S,V,N_{j\neq i}} \qquad (6)

The same procedure can be used for the other standard thermodynamic potential equations for the enthalpy H, Gibbs free energy G and Helmholtz free energy F, to give the following equivalent definitions of the chemical potential

\mu_i = \left(\frac{\partial U}{\partial N_i}\right)_{S,V,N_{j\neq i}} = \left(\frac{\partial H}{\partial N_i}\right)_{S,P,N_{j\neq i}} = \left(\frac{\partial G}{\partial N_i}\right)_{T,P,N_{j\neq i}} = \left(\frac{\partial F}{\partial N_i}\right)_{T,V,N_{j\neq i}} \qquad (7)

This shows that the chemical potential is a quantity that describes how the various thermodynamic potentials change when particles are added to the system under consideration. This is an important result for thermodynamics because it leads further to a clean definition of what is meant by equilibrium. Thus suppose there is an equilibrium set up between two phases α and β of a material; we expect to find a balance between the effects of particles of the α phase moving to the β phase and vice versa, so that the equilibrium is defined simply by

\mu_\alpha = \mu_\beta \qquad (8)

III. ENSEMBLES AND PROBABILITIES

We now leave the world of macroscopic thermodynamics and return to the micro-world of statistical assemblies of particles, in order to try to link the two by using the grand canonical ensemble - that is, an ensemble of assemblies of sub-systems and reservoirs, where both energy and particles can flow between the two. We denote the sub-system in which we are interested by the symbol Γ and the larger reservoir by R, rather like in Fig. 1. Taken together they form a very large system O. For example, the sub-system could be a crystal growing out of a solution, whilst the reservoir could be an enormous volume of the solution. It should be re-emphasised that the sub-system Γ and the reservoir R allow for exchange of thermal energy as well as of numbers of particles. Our sub-system is a quantum entity.
Up to now we have thought of it as having n_1 particles in the single-particle energy level ε_1, n_2 in the single-particle energy level ε_2, and so on. However, as alluded to in lecture 3, we can also describe the sub-system in terms of its energy levels as a whole. The total energy of the sub-system, E, is related to the single-particle energies by

E = n_1 \varepsilon_1 + n_2 \varepsilon_2 + \ldots = \sum_i n_i \varepsilon_i \qquad (9)

and the total number of particles in the sub-system is

N = \sum_i n_i \qquad (10)

Fig. 1. One of the systems, O, in our grand canonical ensemble. Here it comprises a sub-system Γ, a crystal, in contact with a very large reservoir R, the solution.

It is important to recognise that this is an equally valid way of representing the quantum state of the sub-system - there is a quantum state of the whole sub-system with energy E_s, containing N_s particles (if there is another state with the same overall energy, it is clearly degenerate).

We now ask ourselves: what is the probability of finding our sub-system in a particular quantum state? Well, we know overall that the total large system, comprising our sub-system and the reservoir, does have fixed energy, U, and particle number, N. If our sub-system is in a particular quantum state with energy E_s and particle number N_s, then the energy of the reservoir must be (U - E_s), and it must contain (N - N_s) particles.

If we look at the whole ensemble of such sub-systems, what is the probability of finding a sub-system in a particular quantum state? Well, it must be proportional to the number of ways that the reservoir can have energy (U - E_s) and particle number (N - N_s). In order to determine this number of ways we return to Boltzmann's original postulate that one can express macroscopic entropy in terms of numbers of microstates, thus

S = k_B \ln W \quad \text{or} \quad W = \exp(S/k_B) \qquad (11)

The number of arrangements W is in turn proportional to the probability P of being in a particular state. Referring to equation (5) and considering now finite changes, we can write that the change in entropy of the reservoir is given by

\Delta S = \frac{1}{T}\left(\Delta U + P\Delta V - \mu\Delta N\right) \qquad (12)

We will consider changes at constant volume, so will drop the term in ΔV. Thus, by allowing energy ΔU and particles ΔN to transfer from the reservoir R to the particular sub-system Γ, the entropy of the reservoir is reduced by the amount

\Delta S = \frac{1}{T}\left(\Delta U - \mu\Delta N\right) \qquad (13)

This means that the entropy S_R of the reservoir after this process is given by

S_R = S_{\mathrm{max}} - \frac{1}{T}\left(\Delta U - \mu\Delta N\right) \qquad (14)

and so, using equation (11), we then get the probability distribution

P \propto \exp\left\{\left[S_{\mathrm{max}} - \frac{1}{T}(\Delta U - \mu\Delta N)\right]/k_B\right\} \qquad (15)

or

P \propto A \exp\left[-\frac{1}{k_B T}(\Delta U - \mu\Delta N)\right] \qquad (16)

where A is a constant. Recalling that our sub-system has energy E_s and particle number N_s, we can say that the relative probability of finding such a sub-system in this particular quantum state within the whole ensemble is thus

P \propto A \exp\left[-\frac{1}{k_B T}(E_s - \mu N_s)\right] \qquad (17)

To find the absolute probability we need a normalising factor - we normalise by summing over all possible states s and all possible particle numbers - thus

P = \frac{\exp\left[-\frac{1}{k_B T}(E_s - \mu N_s)\right]}{\sum_{N=0}^{\infty}\sum_{s(N)} \exp\left[-\frac{1}{k_B T}(E_{s,N} - \mu N)\right]} \qquad (18)

This normalising factor is the Grand Partition Function

\sum_{N=0}^{\infty}\sum_{s(N)} \exp\left[-\frac{1}{k_B T}(E_{s,N} - \mu N)\right] \qquad (19)

IV. THE FERMI-DIRAC AND BOSE-EINSTEIN DISTRIBUTIONS

We now come to the real magic resulting from adopting the Grand Canonical formalism - a beautiful and subtle piece of physics which will allow us to derive the quantum distribution functions.
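Before doing so, it may help to see equations (17)-(19) in action numerically. The short Python sketch below is an illustration added here, not part of the original derivation: it enumerates the many-body states of a toy sub-system with three made-up single-particle levels (the level energies, μ and T are arbitrary demonstration values), assumes fermionic occupancy of 0 or 1 per level so that the list of states stays finite, weights each state by exp[-(E_s - μN_s)/k_BT], and accumulates the grand partition function together with the mean particle number and energy.

```python
import itertools
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K


def grand_canonical_averages(levels, mu, T):
    """Enumerate every many-body state of a few single-particle levels
    (occupancy 0 or 1 per level, i.e. fermions, so the list is finite),
    weight each state by exp[-(E_s - mu*N_s)/k_B T] as in equation (17),
    and return the grand partition function (19) plus <N> and <E>."""
    beta = 1.0 / (k_B * T)
    Z_grand = 0.0   # the double sum of equation (19)
    N_sum = 0.0
    E_sum = 0.0
    # each state s is a tuple of occupation numbers (n_1, n_2, ...)
    for occupation in itertools.product((0, 1), repeat=len(levels)):
        E_s = sum(n * eps for n, eps in zip(occupation, levels))
        N_s = sum(occupation)
        weight = math.exp(-beta * (E_s - mu * N_s))
        Z_grand += weight
        N_sum += N_s * weight
        E_sum += E_s * weight
    return Z_grand, N_sum / Z_grand, E_sum / Z_grand


# Illustrative level energies and parameters, chosen purely for the demo.
levels = [0.0, 1.0e-21, 2.0e-21]   # single-particle energies, J
Z, N_mean, E_mean = grand_canonical_averages(levels, mu=0.5e-21, T=100.0)
print(Z, N_mean, E_mean)
```

Because the levels are occupied independently, the mean occupation of each level obtained from this brute-force sum coincides with the Fermi-Dirac formula derived in the next section, which makes a handy consistency check.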
Up to now in our discussion of the grand canonical ensemble, we have talked in terms of a sub-system in contact with a large reservoir: we gave as an example a crystal in a very large volume of solution. That is to say, we thought in terms of a real piece of 'stuff' (the crystal) in thermal and particle-flow contact with another piece of 'stuff' (the solution). However, let us consider something like a gas of particles (Bosons or Fermions). We know that a box of such particles has particular quantum states and energies (the solutions to Schrödinger's equation for a 3-D infinite square well). Why don't we denote one of these quantum states to be the sub-system, and call all the other quantum states, with their associated particles and thus energy, the reservoir? This is the subtle thought which is the stroke of genius. There is nothing stopping us doing this, as particles and energy can certainly flow into our quantum state (when it becomes occupied) or flow out of it when a particle transfers to one of the other quantum states.

By thinking along these lines we can make use of the probability distribution (17) to calculate the mean occupation numbers of quantum states containing either fermions or bosons. If the quantum state we are interested in (the one we call the sub-system) is associated with the energy level ε, then if it has n particles in it, the energy of this 'sub-system' is clearly nε. Thus we can write the mean occupation number of the quantum state as

\bar{n} = \frac{\sum_n n \exp[-n(\varepsilon - \mu)/k_B T]}{\sum_n \exp[-n(\varepsilon - \mu)/k_B T]} \qquad (20)

where the sum goes over all the possible integer values of n. For convenience, we shall write this as

\bar{n} = \frac{\sum_n n x^n}{\sum_n x^n} \qquad (21)

where

x = \exp[-(\varepsilon - \mu)/k_B T] \qquad (22)

V. FERMI-DIRAC DISTRIBUTION

Consider the application of equation (21) to the case of Fermi-Dirac statistics. As we saw earlier, particles obeying such statistics conform to the Pauli exclusion principle, i.e. it is not possible to have more than one particle in the same state at the same time. This means that the values of n possible in equation (21) can only be either 0 or 1. Thus

\bar{n} = \frac{0 + x}{1 + x} = \frac{1}{x^{-1} + 1} = \frac{1}{\exp[(\varepsilon - \mu)/k_B T] + 1} \qquad (23)

VI. BOSE-EINSTEIN DISTRIBUTION

In this case, there is no limit to the number of particles that we can place in a particular quantum state with energy ε, and so we now have to sum the complete series in equation (21). Thus we have

\bar{n} = \frac{0 + x + 2x^2 + 3x^3 + 4x^4 + \ldots}{1 + x + x^2 + x^3 + x^4 + \ldots} \qquad (24)

The denominator is merely a geometric series whose sum is

\frac{1}{1 - x} \qquad (25)

and taking the factor x out of the top line we get

\bar{n} = (1 - x)\, x\, (1 + 2x + 3x^2 + 4x^3 + \ldots) \qquad (26)

The remaining series to sum on the r.h.s. is just the first derivative of the geometric series, and is given by

1 + 2x + 3x^2 + 4x^3 + \ldots = \frac{1}{(1 - x)^2} \qquad (27)

so that

\bar{n} = \frac{x}{1 - x} = \frac{1}{x^{-1} - 1} = \frac{1}{\exp[(\varepsilon - \mu)/k_B T] - 1} \qquad (28)

You should now appreciate that both these distribution formulae have appeared with the chemical potential in place naturally, and without the need to insert it artificially at the end. This is a very impressive demonstration of the power of using the full treatment afforded by the Grand Canonical formalism.
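As a quick numerical check, the closed forms (23) and (28) can be compared against a direct evaluation of the sum in equation (21). The sketch below is an addition for illustration; the values of ε, μ and T are arbitrary. Restricting n to {0, 1} reproduces the Fermi-Dirac result exactly, while truncating the Bose-Einstein sum at a large n reproduces 1/(exp[(ε-μ)/k_BT] - 1), provided x < 1, i.e. μ < ε.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K


def mean_occupation(eps, mu, T, n_values):
    """Evaluate eq. (21) directly: n_bar = sum(n x^n) / sum(x^n),
    with x = exp[-(eps - mu)/k_B T], over the allowed occupation numbers."""
    x = math.exp(-(eps - mu) / (k_B * T))
    numerator = sum(n * x**n for n in n_values)
    denominator = sum(x**n for n in n_values)
    return numerator / denominator


# Illustrative values chosen for the demonstration, not taken from the notes.
eps, mu, T = 2.0e-21, 1.0e-21, 300.0

fd_direct = mean_occupation(eps, mu, T, n_values=[0, 1])          # fermions
be_direct = mean_occupation(eps, mu, T, n_values=range(0, 500))   # bosons, truncated

fd_closed = 1.0 / (math.exp((eps - mu) / (k_B * T)) + 1.0)  # eq. (23)
be_closed = 1.0 / (math.exp((eps - mu) / (k_B * T)) - 1.0)  # eq. (28)

print(fd_direct, fd_closed)  # agree exactly
print(be_direct, be_closed)  # agree once the truncated series has converged
```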
VII. BOSE-EINSTEIN CONDENSATION

In lecture 6 and in lecture 9 we studied blackbody radiation - that is, a thermal distribution of photons, which we recall are bosons. However, we noted that a collection of photons need not conserve particle number. We now turn to the question of the statistics of Bosons when particle number is conserved. For example, we could consider a collection of Helium-4 atoms. As the atoms contain an even number of fermions, their overall spin is integer, and thus they act as Bosons, but we certainly can't create or destroy them in the way we can photons, and particle number is conserved.

Bose-Einstein condensation is a phenomenon that occurs at low temperatures in a collection of a fixed number of identical Bosons. Unlike fermions, we recall that we can have as many Bosons as we like in a particular quantum state, and thus if it were possible to cool a system of Bosons down to absolute zero, we would be unsurprised to find them all sitting in the ground state. Such a collection of Bosons would be of considerable interest, as we would have a host of particles all in the same quantum state - such a system would surely exhibit quite unique physical properties, as it would act 'as one'.

The pertinent question to ask is how cold do we need to make the system before an appreciable fraction of the particles are in this lowest energy quantum state? Now, if the particles were acting classically (which they most certainly are not), we would apply the Boltzmann distribution,

n_i \propto \exp(-\varepsilon_i/k_B T) \qquad (29)

and conclude that a goodly fraction of the particles will be in the ground state at a temperature θ, where k_Bθ ≈ Δε, and where Δε is the energy difference between the first excited state and the ground state. It is illuminating to calculate what sort of temperature this (erroneous) classical approach predicts for a real system.

Consider a millimetre-scale droplet of liquid He. We ignore the interactions between the particles, and treat it as a gas (we comment upon this assumption later on). As we have done on several occasions in these notes, we work out the energies of the quantum states by treating the gas particles as being confined in a 3-D infinite square well, for which the solution to Schrödinger's equation yields

\varepsilon = \frac{h^2}{8ma^2}\left(l^2 + m^2 + n^2\right) \qquad (30)

where a is the length of the side of the box (i.e. 1 millimetre in this case). The values of (l, m, n) for the ground state are (1,1,1), and for one of the degenerate first excited states (2,1,1). Thus the gap in energy, Δε, between the ground and first excited states is

\Delta\varepsilon = \frac{h^2}{8ma^2}\left[(2^2 + 1^2 + 1^2) - (1^2 + 1^2 + 1^2)\right] = \frac{3h^2}{8ma^2} \qquad (31)

We know the value of a (we said we would consider a millimetre-scale droplet), and we know the mass of a helium atom, so we can calculate this energy difference - the result is of order 2.5 × 10^-35 J, which corresponds to a temperature θ of just 1.8 × 10^-12 K. Thus the classical Boltzmann formula would predict that we need an incredibly low temperature to get a large fraction of the atoms into the ground state.
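These numbers are easy to verify. The short sketch below is an added numerical check: it evaluates equation (31) for a 1 mm box of helium-4 atoms, using standard values for the constants, and converts the gap to a temperature.

```python
h = 6.626e-34      # Planck constant, J s
k_B = 1.381e-23    # Boltzmann constant, J/K
m_He = 6.646e-27   # mass of a helium-4 atom, kg
a = 1.0e-3         # side of the box, m (a millimetre-scale droplet)

# Gap between the (2,1,1) and (1,1,1) states of a 3-D infinite square well,
# equation (31): delta_eps = 3 h^2 / (8 m a^2)
delta_eps = 3 * h**2 / (8 * m_He * a**2)
theta = delta_eps / k_B   # classical estimate of the required temperature

print(delta_eps)  # ~ 2.5e-35 J
print(theta)      # ~ 1.8e-12 K
```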
However, we know that Bosons don't obey the Boltzmann distribution at low temperature, and we need to take into account the full Bose-Einstein distribution:

\bar{n}_i = \frac{1}{\exp[(\varepsilon_i - \mu)/k_B T] - 1} \qquad (32)

What does this distribution tell us about the ground state populations? First of all we ask ourselves what we expect μ to be. The first thing we note is that we expect μ to be less than the ground state energy of the system. If it were not, then we can see from equation (32) that we would end up with a negative occupation number for the ground state, which is clearly unphysical. Let us study how we would expect μ to vary as we reduce the temperature. We know that as we reduce the temperature, the particles are going to start to occupy states of lower energy. In particular, we will get more and more particles in the ground state as the system gets colder. From equation (32) we see that for the occupation number to increase, μ must itself increase as the temperature drops; that is to say, μ is always less than ε_G, but must increase and tend towards ε_G as the temperature falls.

Recall that the occupation number of the ground state, n_G, is given by

n_G = \frac{1}{\exp[(\varepsilon_G - \mu)/k_B T] - 1} \qquad (33)

At very low temperatures we expect many, many particles to be in the ground state, thus the argument of the exponential must be close to zero, and we can use an expansion:

n_G = \frac{1}{\exp[(\varepsilon_G - \mu)/k_B T] - 1} \approx \frac{k_B T}{\varepsilon_G - \mu} = -\frac{k_B T}{\mu'} \qquad (34)

where μ' = μ - ε_G.

We now ask ourselves what the population of the excited states looks like once we have reached a temperature such that a good fraction of the particles are in the ground state. For example, let us assume that we have a total of N particles, and we are at a low temperature such that half of them are in the ground state. Under such conditions μ' ≈ -2k_BT/N. Therefore the number of particles in one of the first excited states (note that the energy level ε_211 is threefold degenerate) is given by

n_{211} = \frac{1}{\exp[(\varepsilon_{211} - \mu)/k_B T] - 1} \qquad (35)

= \frac{1}{\exp[(\Delta\varepsilon - \mu')/k_B T] - 1} \qquad (36)

= \frac{1}{\exp[\Delta\varepsilon/k_B T]\exp(2/N) - 1} \qquad (37)

\approx \frac{1}{\exp[\Delta\varepsilon/k_B T] - 1} \qquad (38)

where we note that exp(2/N) ≈ 1 as N is such a large number.

At this stage it is a good idea to stand back and remind ourselves where we have got to in the argument - because it is certainly a little complicated. Firstly, we know that μ' is negative, and increases towards zero as the temperature drops. At some point we might expect there to be a good chance of getting some of the Bosons into the ground state. At this temperature a good approximation is μ' = μ - ε_G ≈ -k_BT/n_G, where n_G is the number in the ground state; if n_G is large compared with 1, the next excited state has a population

n_{211} \approx \frac{1}{\exp[\Delta\varepsilon/k_B T] - 1} \qquad (39)

where Δε = ε_211 - ε_111. Importantly, the equations for all of the other excited states will look similar. All this will happen when |μ'| is small compared with Δε. We evidently need to find out at what sort of temperature this occurs. Well, if μ' does become small, the number of atoms in the excited states (i.e. not in the ground state) will be

N_{\mathrm{ex}} = \sum_{\mathrm{excited\ states}} \frac{1}{\exp[\varepsilon'/k_B T] - 1} \qquad (40)

where ε' = ε - ε_G. We know the form for the density of states, and thus

N_{\mathrm{ex}} = \int_0^\infty \frac{g(\varepsilon')\, d\varepsilon'}{\exp[\varepsilon'/k_B T] - 1} = 2\pi V \left(\frac{2m}{h^2}\right)^{3/2} \int_0^\infty \frac{\sqrt{\varepsilon'}\, d\varepsilon'}{\exp[\varepsilon'/k_B T] - 1} \qquad (41)

Notice that the √ε' factor in the density of states ensures that the ground state is not counted in the integral, as required. The integral can be found in tables,

\int_0^\infty \frac{\sqrt{y}\, dy}{e^y - 1} = 1.306\sqrt{\pi} \qquad (42)

and thus we find

N_{\mathrm{ex}} = 2.612\, V \left(\frac{2\pi m k_B T}{h^2}\right)^{3/2} \qquad (43)

We know that as soon as this gets less than the original total number of atoms, N, then (N - N_ex) of the atoms start to go into the ground state. This is the phenomenon of Bose-Einstein condensation - the fact that at a certain transition temperature many, many of the particles start to occupy the lowest energy state, even though the upper energy states are still quite sparsely populated, and the temperature is relatively high. The transition temperature can be found by considering the temperature at which N_ex, as given by equation (43), equals N:

T_B = \frac{h^2}{2\pi m k_B}\left(\frac{N}{2.612\, V}\right)^{2/3} \qquad (44)

For liquid helium, one mole occupies 27.6 cm^3, and the transition temperature is about 3 K, which is extremely large compared with the splitting between the ground and first excited states. Below the transition temperature the number of atoms in the ground state is

n_G = N - N_{\mathrm{ex}} = N\left[1 - (T/T_B)^{3/2}\right] \qquad (45)

The calculated condensation temperature of 3 K for Helium is very close to the actual temperature of 2.17 K at which a transition to a novel state of matter is observed. Below this temperature Helium-4 becomes a 'superfluid' with vanishingly small viscosity.
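Equation (44) is straightforward to evaluate. The sketch below is an added check: it computes the transition temperature for liquid helium-4 from the molar volume of 27.6 cm^3 quoted above, and then uses equation (45) to give the condensate fraction at a couple of illustrative temperatures.

```python
import math

h = 6.626e-34      # Planck constant, J s
k_B = 1.381e-23    # Boltzmann constant, J/K
m_He = 6.646e-27   # mass of a helium-4 atom, kg
N_A = 6.022e23     # Avogadro's number

V = 27.6e-6        # molar volume of liquid helium, m^3 (27.6 cm^3)
N = N_A            # number of atoms in one mole

# Transition temperature, equation (44)
T_B = (h**2 / (2 * math.pi * m_He * k_B)) * (N / (2.612 * V)) ** (2.0 / 3.0)
print(T_B)  # roughly 3.1 K

# Condensate fraction below T_B, equation (45): n_G / N = 1 - (T/T_B)^(3/2)
for T in (0.5 * T_B, 0.9 * T_B):
    print(T, 1.0 - (T / T_B) ** 1.5)
```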
However, despite this close agreement, many physicists are reluctant to refer to this as Bose-Einstein condensation, because in a fluid there are strong forces between the atoms, whereas the theory we have been considering above is for a set of non-interacting particles.

Thus, for many, many years, indeed since Einstein first predicted the phenomenon in 1925, physicists have tried to get a gas of non-interacting particles to undergo Bose-Einstein condensation. The goal was finally achieved just over 5 years ago, on June 5th 1995. A team of physicists, led by Eric Cornell and Carl Wieman of JILA, Boulder, Colorado, USA, finally achieved BEC in a gas of Rubidium atoms, as shown in Fig. 2. The density of the atoms was much less than that of a liquid, and thus (as you can see from equation (44)) the temperature required to achieve BEC was correspondingly lower. The atoms were cooled by sophisticated laser-cooling methods, and 'evaporative' cooling in magnetic traps, with a BEC temperature of 1.7 × 10^-7 K. It is not my intention to go into the details here: the interested reader is referred to the web site at JILA: http://jilawww.colorado.edu/bec/ for details of that amazing experiment. Since that time, many groups around the world have made a condensate. BEC has been achieved at Oxford (see http://www-matterwave.physics.ox.ac.uk/bec/bec.html) and the many atoms in a single quantum state have given rise to a whole host of interesting 'macroscopic' quantum mechanical experiments.

Fig. 2. When a gas of Rubidium atoms is cooled below the transition temperature, they all attain the same quantum state - seen here in velocity space.

VIII. THE QUANTUM-CLASSICAL TRANSITION

Before we close, let us take a look once more at equation (44), which tells us the transition temperature for BEC. Recall that in lecture 7 we considered the temperature and density at which the quantum to classical transition took place in a Fermi gas. Note that the transition temperature occurs at exactly the same place once more, that is to say when the de Broglie wavelength of the particle is roughly equal to the inter-particle separation. Equation (44) corresponds to λ_dB ≈ d, where d is the inter-particle separation (a quick numerical check of this claim is given after the summary).

IX. SUMMARY OF LECTURE 10

In a more sophisticated approach to statistical mechanics, we use the method of ensembles. In particular, the grand canonical ensemble shows us that if both energy and particles can flow, the probability of finding a sub-system with energy E_s and particle number N_s is

P \propto A \exp\left[-\frac{1}{k_B T}(E_s - \mu N_s)\right] \qquad (46)

If, in a collection of Bosons or Fermions, we call a particular quantum state the 'sub-system', and the rest of the states the 'reservoir', we can quickly derive both the Bose-Einstein and Fermi-Dirac distributions.

In a system of a fixed number of Bosons, below a certain temperature many, many of the particles condense into the same ground quantum state. This is known as Bose-Einstein condensation. The critical temperature is given by

T_B = \frac{h^2}{2\pi m k_B}\left(\frac{N}{2.612\, V}\right)^{2/3} \qquad (47)

Below this temperature the ground state atoms are all in the same quantum state, and a whole host of interesting physical effects can be investigated.
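Finally, the claim in section VIII can be checked numerically. The sketch below is an added illustration: it evaluates the thermal de Broglie wavelength, λ_dB = h/√(2πmk_BT), at the helium transition temperature of roughly 3 K, and compares it with the mean inter-particle separation d = (V/N)^(1/3) for the molar volume used above; the two indeed come out to be of the same order.

```python
import math

h = 6.626e-34      # Planck constant, J s
k_B = 1.381e-23    # Boltzmann constant, J/K
m_He = 6.646e-27   # mass of a helium-4 atom, kg
N_A = 6.022e23     # Avogadro's number

V = 27.6e-6        # molar volume of liquid helium, m^3
N = N_A            # atoms per mole
T_B = 3.1          # approximate transition temperature from equation (44), K

# Thermal de Broglie wavelength at the transition temperature
lambda_dB = h / math.sqrt(2 * math.pi * m_He * k_B * T_B)

# Mean inter-particle separation
d = (V / N) ** (1.0 / 3.0)

print(lambda_dB)  # ~ 5e-10 m
print(d)          # ~ 3.6e-10 m: the same order of magnitude, as claimed
```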
