
2.1 Introducing entropy

2.1.1 Boltzmann’s formula

A very important thermodynamic concept is that of entropy S. Entropy is a function of state, like the internal energy. It measures the relative degree of order (as opposed to disorder) of the system when in this state. An understanding of the meaning of entropy thus requires some appreciation of the way systems can be described microscopically. This connection between thermodynamics and statistical mechanics is enshrined in the formula due to Boltzmann and Planck:

S = k ln Ω

where Ω is the number of microstates accessible to the system (the meaning of that phrase to be explained). We will first try to explain what a microstate is and how we count them. This leads to a statement of the second law of thermodynamics, which as we shall see has to do with maximising the entropy of an isolated system.

As a thermodynamic function of state, entropy is easy to understand. Entropy changes of a system are intimately connected with heat flow into it. In an infinitesimal reversible process dQ = T dS; the heat flowing into the system is the product of the increment in entropy and the temperature. Thus while heat is not a function of state, entropy is.
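The relation dQ = T dS is easy to illustrate numerically. The sketch below is mine, not part of the notes, and the numbers are invented for the example: for reversible heat flow at constant temperature, dQ = T dS integrates to ΔS = Q/T.

```python
# Reversible heat flow at constant temperature: dQ = T dS integrates to
# Delta S = Q / T.  (Illustrative numbers only, not from the notes.)
T = 300.0    # temperature in kelvin
Q = 600.0    # heat flowing reversibly into the system, in joules

delta_S = Q / T
print(delta_S)   # 2.0 (J/K)
```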

2.2 The spin 1/2 paramagnet as a model system

Here we introduce the concepts of microstate, distribution, average distribution and the relationship to entropy using a model system. This treatment is intended to complement that of Guenault (chapter 1), which is very clear and which you must read. We choose as our main example a system that is very simple: one in which each particle has only two available quantum states. We use it to illustrate many of the principles of statistical mechanics. (Precisely the same arguments apply when considering the number of atoms in two halves of a box.)

2.2.1 Quantum states of a spin 1/2 paramagnet

A spin S = 1/2 has two possible orientations. (These are the two possible values of the projection of the spin on the z axis: ms = ±1/2.) Associated with each spin is a magnetic moment which has the two possible values ±µ. An example of such a system is the nucleus of 3He, which has S = 1/2 (this is due to an unpaired neutron; the other isotope of helium, 4He, is non-magnetic). Another example is the electron. In a magnetic field the two states are of different energy, because magnetic moments prefer to line up parallel (rather than antiparallel) with the field. Thus since E = −µ·B, the energy of the two states is

E↑ = −µB,  E↓ = µB.

(Here the arrows refer to the direction of the nuclear magnetic moment: ↑ means that the moment is parallel to the field and ↓ means antiparallel. There is potential confusion because for 3He the moment points opposite to the spin!)

PH261 – BPC/JS – 1997 Page 2.1

2.2.2 The notion of a microstate

So much for a single particle. But we are interested in a system consisting of a large number of such particles, N. A microscopic description would necessitate specifying the state of each particle. In a localised assembly of such particles, each particle has only two possible quantum states, ↑ or ↓. (By localised we mean the particles are fixed in position, like in a solid. We can label them if we like by giving a number to each position. In this way we can tell them apart; they are distinguishable.) In a magnetic field the energy of each particle has only two possible values. (This is an example of a two level system.) You can't get much simpler than that!

Now we have set up the system, let's explain what we mean by a microstate and enumerate them. First consider the system in zero field. Then the states ↑ and ↓ have the same energy. We specify a microstate by giving the quantum state of each particle, whether it is ↑ or ↓. For N = 10 spins

↑↑↑↑↑↑↑↑↑↑  ↓↑↑↑↑↑↑↑↑↑  ↑↓↓↑↓↑↓↑↑↑

are possible microstates.

That’s it!

2.2.3 Counting the microstates

What is the total number of such microstates (accessible to the system)? This is called Ω. Well, for each particle the spin can be ↑ or ↓ (there is no restriction on this in zero field). Two possibilities for each particle gives 2^10 = 1024 arrangements for N = 10 (we have merely given three examples above). For N particles, Ω = 2^N.

2.2.4 Distribution of particles among states

This list of possible microstates is far too detailed to handle. What's more, we don’t need all this detail to calculate the properties of the system. For example the total magnetic moment of all the particles is

M = (N↑ − N↓)µ;

it just depends on the relative number of up and down moments and not on the detail of which are up or down. So we collect together all those microstates with the same number of up moments and down moments. Since the relative number of ups and downs is constrained by N↑ + N↓ = N (the moments must be either up or down), such a state can be characterised by a single number:

m = N↑ − N↓.

This number m tells us the distribution of the N particles among the possible states:

N↑ = (N + m)/2,  N↓ = (N − m)/2.

Now we ask the question: “how many microstates are there consistent with a given distribution (given value of m; given values of N↑, N↓)?” Call this t(m). (This is Guenault’s notation.)

Look at N = 10. For m = 10 (all spins up) and m = −10 (all spins down), that's easy! — t = 1. Now for m = 8 (one moment down), t(m = 8) = 10: there are ten ways of reversing one moment.
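These counts are easy to verify directly. The following small Python sketch is mine, not part of the original notes; it counts Ω = 2^N and t(m) for the N = 10 example:

```python
from math import comb

N = 10
omega = 2 ** N                      # Ω = 2^N microstates in total

def t(m):
    """Number of microstates with m = N_up - N_down (N + m must be even)."""
    return comb(N, (N + m) // 2)    # t(m) = N! / (N_up! N_down!)

print(omega)     # 1024
print(t(10))     # 1   (all spins up)
print(t(8))      # 10  (ten ways of reversing one moment)

# summing t(m) over all allowed m recovers the total number of microstates
assert sum(t(m) for m in range(-N, N + 1, 2)) == omega
```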

The general result is t(m) = N!/(N↑! N↓!). This may be familiar to you as a binomial coefficient: it is the number of ways of dividing N objects into one set of N↑ identical objects and another set of N↓ identical objects (red balls and blue balls, or tossing an unbiassed coin and getting heads or tails). In terms of m the function t(m) may be written

t(m) = N! / { [(N + m)/2]! [(N − m)/2]! }.

If you plot the function t(m) it is peaked at m = 0, (N↑ = N↓ = N/2). Here we illustrate that for N = 10:

[Figure: plot of the function t(m) for 10 spins, m = N↑ − N↓; the values are t = 1, 10, 45, 120, 210, 252, 210, 120, 45, 10, 1 for m = −10, −8, …, 8, 10.]

When N = 100 the function is already much narrower:

[Figure: plot of the function t(m) for 100 spins; the peak value is about 1×10^29.]

The sharpness of the distribution increases with the size of the system N, since the standard deviation goes as √N while the range of m goes as N. Note that the RMS width of the function is √N. When N is large, t(m) approaches the normal (Gaussian) function

t(m) ≈ t(0) exp(−m²/2N),

where t(0) ≈ 2^N √(2/πN) is the peak value. This may be shown using Stirling’s approximation (Guenault, Appendix 2).
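The Gaussian form and the √N width can be checked numerically. This Python sketch (my own, not from the notes) compares the exact t(m) with t(0) exp(−m²/2N) for N = 100, and computes the mean-square width of the normalised distribution:

```python
from math import comb, exp

N = 100
t0 = comb(N, N // 2)                      # exact peak value t(0)

def t_exact(m):
    return comb(N, (N + m) // 2)

def t_gauss(m):
    return t0 * exp(-m * m / (2 * N))     # t(m) ≈ t(0) exp(-m²/2N)

for m in (0, 10, 20):
    print(m, round(t_exact(m) / t_gauss(m), 3))   # ratios all close to 1

# mean-square value of m over the normalised distribution is exactly N
omega = 2 ** N
var = sum(m * m * t_exact(m) for m in range(-N, N + 1, 2)) / omega
print(var)    # 100.0, so the RMS width is sqrt(N) = 10
```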

2.2.5 The average distribution and the most probable distribution

The physical significance of this result derives from the fundamental assumption of statistical physics that each of these microstates is equally likely. It follows that t(m) is the statistical weight of the distribution m (recall m determines N↑ and N↓), that is, the relative probability of that distribution occurring. Hence we can work out the average distribution; in this example this is just the average value of m. The probability of a particular value of m is just t(m)/Ω, i.e. the number of microstates with that value of m divided by the total number of microstates (Ω = ∑m t(m)). So the average value of m is ∑m m t(m)/Ω.

(Note: According to the Gibbs method of ensembles we represent such a system by an ensemble of Ω systems, each in a definite microstate and one for every microstate. The thermodynamic properties are obtained by an average over the ensemble. The equivalence of the ensemble average to the time average for the system is a subtle point and is the subject of the ergodic hypothesis.)

In this example, because the distribution function t(m) is symmetrical, it is clear that the average value is mav = 0. Also the value of m for which t(m) is maximum is m = 0. So on average, for a system of spins in zero field, m = 0: there are equal numbers of up and down spins. This average value is also the most probable value. For large N the function t(m) is very strongly peaked at m = 0: there are far more microstates with m = 0 than any other value, so it is far the most probable state. If we observe a system of spins as a function of time, for most of the time we will find m to be at or near m = 0. The observed m will fluctuate about m = 0 and, as we have seen, the (relative) extent of these fluctuations decreases with N. States far away from m = 0, such as m = N (all spins up), are highly unlikely; the probability of observing that state is 1/Ω = 1/2^N, since there is only one such state out of a total of 2^N.

It is worthwhile to note that the model treated here is applicable to a range of different phenomena. For example one can consider the number of particles in two halves of a container, and examine the density fluctuations. Each particle can be on either side of the imaginary division, so that the distribution of density in either half would follow the same distribution as derived above.

______________________________________________________________ End of lecture 4

2.3 Entropy and the second law of thermodynamics

2.3.1 Order and entropy

Suppose now that somehow or other we could set up the spins, in zero magnetic field, such that they were all in an up state: m = N. This is not a state of internal thermodynamic equilibrium; it is a highly ordered state. We know the equilibrium state is random, with equal numbers of up and down spins. It also has a large number of microstates associated with it, so it is far the most probable state. The initial state of the system will therefore spontaneously evolve towards the more likely, more disordered m = 0 equilibrium state. Once there, the fluctuations about m = 0 will be small, and the probability of the system returning fleetingly to the initial ordered state is infinitesimally small. The tendency of isolated systems to evolve in the direction of increasing disorder thus follows from (i) disordered distributions having a larger number of microstates (mathematical fact), and (ii) all microstates being equally likely (physical hypothesis).

The thermodynamic quantity intimately linked to this disorder is the entropy. Entropy is a function of state and is defined for a system in thermodynamic equilibrium. (It is possible to introduce a

generalised entropy for systems not in equilibrium, but let's not complicate the issue.)

Entropy was defined by Boltzmann and Planck as S = k ln Ω, where Ω is the total number of microstates accessible to a system. Here k is Boltzmann’s constant. Thus for our example of spins, Ω = 2^N so that S = Nk ln 2. For a composite system Ω = Ω1Ω2, so that S = S1 + S2, as we have seen.

2.3.2 The second law of thermodynamics

The second law of thermodynamics is the law of increasing entropy: during any real process the entropy of an isolated system always increases. In the state of equilibrium the entropy attains its maximum value. We will discuss all the various formulations of this law a bit later on.

This law may be illustrated by asking what happens when a constraint is removed on an isolated composite system. Is the number of microstates of the final equilibrium state smaller, the same, or bigger? We expect the system to evolve towards a more probable state. Hence the number of accessible microstates of the final state must be greater or the same, and the entropy increases or stays the same. If the two systems are allowed to exchange energy, then in the final equilibrium state the total number of accessible microstates Ω∗ is always greater. A nice simple example is given by Guenault on p12-13. The mixing of two different gases is another.

2.3.3 Thermal interaction between systems and temperature

Consider two isolated systems of given volume, number of particles and total energy. When separated and in equilibrium they will individually have Ω1 = Ω1(E1, V1, N1) and Ω2 = Ω2(E2, V2, N2) microstates. Now suppose the two systems are brought into contact through a diathermal wall, so they can now exchange energy:

[Diagram: E1, V1, N1 | E2, V2, N2, separated by a fixed diathermal wall; thermal interaction.]

The composite system is isolated, so its total energy is constant. So while the systems exchange energy (E1 and E2 can vary) we must keep E1 + E2 = E0 = const. And since V1, V2, N1, N2 all remain fixed, they can be ignored (and kept constant in any differentiation). For the composite system Ω = Ω1Ω2.

Our problem is this: after the two systems are brought together, what will be the equilibrium state? We know that they will exchange energy, and they will do this so as to maximise the total number of microstates for the composite system. The macroscopic description of this is that heat flows until the temperature of the two systems is the same. This leads us to a definition of statistical temperature. Let us prove this. Writing Ω = Ω1(E)Ω2(E0 − E), we allow the systems to vary E so that Ω is a maximum:

∂Ω/∂E = (∂Ω1/∂E) Ω2 − Ω1 (∂Ω2/∂E) = 0

or

(1/Ω1) ∂Ω1/∂E = (1/Ω2) ∂Ω2/∂E,

that is,

∂ ln Ω1/∂E = ∂ ln Ω2/∂E.

But from the definition of entropy, S = k ln Ω, we see this means that the equilibrium state is characterised by

∂S1/∂E = ∂S2/∂E.

In other words, when the systems have reached equilibrium the quantity ∂S/∂E of system 1 is equal to ∂S/∂E of system 2. Since we have defined temperature to be that quantity which is the same for two systems in thermal equilibrium, it is clear that ∂S/∂E (or ∂ ln Ω/∂E) must be related (somehow) to the temperature. We define statistical temperature as

1/T = ∂S/∂E |V,N

(recall that V and N are kept constant in the process). With this definition it turns out that statistical temperature is identical to absolute temperature.

It follows from this definition and the law of increasing entropy that heat must flow from high temperature to low temperature (this is actually the Clausius statement of the second law of thermodynamics). According to the law of increasing entropy, ∆S ≥ 0, so that

∆S = (∂S1/∂E − ∂S2/∂E) ∆E1 ≥ 0

or, using our definition of statistical temperature,

∆S = (1/T1 − 1/T2) ∆E1 ≥ 0;

this means that E1 increases if T2 > T1, and E1 decreases if T2 < T1, so energy flows from systems at higher temperatures to systems at lower temperatures.

______________________________________________________________ End of lecture 5

2.4 More on the S=1/2 paramagnet

2.4.1 Energy distribution of the S=1/2 paramagnet

Having motivated this definition of temperature we can now return to our simple model system and solve it to find N↑ and N↓ as a function of magnetic field B and temperature T. Previously we had considered the behaviour in zero magnetic field only and we had not discussed temperature. As discussed already, in a magnetic field the energies of the two possible spin states are E↑ = −µB and E↓ = µB. We shall write these energies as ∓ε.

Now for an isolated system, with total energy E, number N, and volume V fixed, it turns out that N↑ and N↓ are uniquely determined (since the energy depends on N↑ − N↓). This follows since we must have both

E = (N↓ − N↑)ε  and  N = N↓ + N↑.

These two equations are solved to give

N↑ = (N − E/ε)/2,  N↓ = (N + E/ε)/2.

Thus m = N↑ − N↓ is fixed, and therefore it exhibits no fluctuations (as it did in zero field about the average value m = 0). All microstates have the same distribution of particles between the two energy levels N↑ and N↓. But there are still things we would like to know about this system; in particular we know nothing about the temperature. We can approach this from the entropy in our now-familiar way.

The total number of microstates Ω is given by

Ω = N! / (N↑! N↓!),

so we can calculate the entropy via S = k ln Ω:

S = k {ln N! − ln N↑! − ln N↓!}.

Now we use Stirling’s approximation (Guenault Appendix 2), which says ln x! ≈ x ln x − x; this is important, you should remember this.

Hence

S = k {N ln N − N↑ ln N↑ − N↓ ln N↓},

where we have used N↓ + N↑ = N. Into this we substitute for N↑ and N↓, since we need S in terms of E for differentiation to get the temperature:

S = k { N ln N − (1/2)(N + E/ε) ln [(1/2)(N + E/ε)] − (1/2)(N − E/ε) ln [(1/2)(N − E/ε)] }.

Now we use the definition of statistical temperature, 1/T = ∂S/∂E |N,V, to obtain the temperature:

1/T = (k/2ε) ln [(N − E/ε) / (N + E/ε)].

(You can check this differentiation with Maple or Mathematica if you wish!) Recalling the expressions for N↑ and N↓, the temperature expression can be written

1/T = (k/2ε) ln (N↑/N↓),

which can be inverted to give the ratio of the populations as

N↓/N↑ = exp(−2ε/kT).

Since N↓ + N↑ = N, we find

N↑/N = exp(+ε/kT) / [exp(+ε/kT) + exp(−ε/kT)],
N↓/N = exp(−ε/kT) / [exp(+ε/kT) + exp(−ε/kT)].

[Figure: temperature variation of the up and down populations, N↑/N and N↓/N, plotted against kT/ε; N↑/N falls from 1 towards 1/2 and N↓/N rises from 0 towards 1/2.]

This is our final result. This is the distribution function: it tells us the fraction of particles in each of the two energy states as a function of temperature. On inspection you see that it can be written rather concisely as

n(ε) = N exp(−ε/kT) / z

6 0. The fluctuations in n (ε) about this average value are small for a sufficiently large number of particles. In the general case n (ε) is the average number of particles in the state of energy ε (In the present special case of only two levels n (ε1) and n (ε2) are uniquely determined).0 µB / kT magnetisation of paramagnet PH261 – BPC/JS – 1997 Page 2.2 Magnetisation of S = 1 / 2 magnet — Curie’s law We can now obtain an expression for the magnetisation of the spin 1/2 paramagnet in terms of temperature and applied magnetic field. M / Nµ 1. This distribution among energy levels is an example of the Boltzmann distribution. (The 1/ N factor) 2. This problem we treat in the next section.0 0.0 2. In the present example there are only two energy levels. The magnetisation (total magnetic moment) is given by: M = (N ↑ − N ↓) µ.4.where the quantity z z = exp −ε / kT + exp +ε / kT is called the (single particle) partition function. To remind you. where the sum is taken over all the possible energies of a single particle.0 1. It is the sum over the possible states of the factor exp −ε / kT . We have expressions for N ↑ and N ↓.0 µ tanh ε / kT kT saturation 0. giving the expression for M as exp −ε / kT − exp +ε / kT M = Nµ exp +ε / kT + exp −ε / kT = −N or. This distribution applies generally(to distinguishable particles) where there is a whole set of possible energies available and not merely two as we have considered thus far. since ε = − µB. the magnetisation in terms of the magnetic field is µB M = Nµ tanh ( ) . In general z = states ∑ exp −ε / kT .0 3.8 0.2 0.4 linear region Nµ2B M ∼ kT 0. an example of counting the microstates and evaluating the average distribution for a system of a few particles with a large number of available energies is given in Guenault chapter 1.9 .

The general behaviour of the magnetisation with magnetic field is nonlinear. At low fields the magnetisation starts linearly, but at higher fields it saturates as all the moments become aligned. The low field, linear behaviour may be found by expanding the tanh:

M ≈ Nµ²B / kT.

The magnetisation thus has the general form M = C B/T: proportional to B and inversely proportional to T. This behaviour is referred to as Curie’s law, and the constant C = Nµ²/k is called the Curie constant.

2.4.3 Entropy of S = 1/2 magnet

We can see immediately that for the spin system the entropy is a function of the magnetic field. In particular, at large B/T all the moments are parallel to the field. There is then only one possible microstate and the entropy is zero. But it is quite easy to determine the entropy at all B/T from S = k ln Ω. As we have seen already,

S/k = N ln N − N↑ ln N↑ − N↓ ln N↓.

Since N = N↓ + N↑, this can be written

S/k = −N↑ ln(N↑/N) − N↓ ln(N↓/N),

and substituting for N↑ and N↓ we then get

S = Nk ln [exp(−2µB/kT) + 1] / [exp(−2µB/kT) + 1] + Nk ln [exp(2µB/kT) + 1] / [exp(2µB/kT) + 1].

[Figure: entropy S of a spin 1/2 paramagnet against T, rising from 0 towards the saturation value Nk ln 2; the higher the field B, the higher the temperature at which S rises.]
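The closed form for S can be evaluated directly. A small Python check (my own sketch, not part of the notes), writing x = µB/kT:

```python
from math import exp, log

def s_per_spin(x):
    """S/Nk for the spin-1/2 paramagnet, with x = mu*B/(k*T)."""
    a = exp(-2 * x)
    b = exp(2 * x)
    return log(a + 1) / (a + 1) + log(b + 1) / (b + 1)

print(round(s_per_spin(1e-9), 4))   # 0.6931: S/Nk -> ln 2 in the high-T / low-B limit
print(s_per_spin(20.0) < 1e-12)     # True:   S -> 0 at low T / high B
```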

1 At low temperatures, kT ≪ 2µB, the spins tend to line up. Then the first term is negligible and the 1s may be ignored, so that S ≈ Nk (2µB/kT) exp(−2µB/kT). Clearly S → 0 as T → 0, in agreement with our earlier argument.

2 At high temperatures, kT ≫ 2µB. Then both terms are equal and we find S = Nk ln 2. There are equal numbers of up and down spins: maximum disorder.

3 The entropy is a function of B/T, and increasing B at constant T reduces S: the system becomes more ordered.

______________________________________________________________ End of lecture 6

2.5 The Boltzmann distribution

2.5.1 Thermal interaction with the rest of the world − using a Gibbs ensemble

For an isolated system all microstates are equally likely; this is our Fundamental Postulate. But what about a non-isolated system? What can we say about the probabilities of the microstates of such a system? Here the probability of a microstate will depend on its energy and on properties of the surroundings. A real system will have its temperature determined by its environment; it is not isolated. Its energy is not fixed: it will fluctuate (but the relative fluctuations are very small because of the 1/√N rule).

All we really know about is isolated systems. So to examine the properties of a non-isolated system we shall adopt a trick. We will take many copies of our system. These will be allowed to exchange (heat) energy with each other, but the entire collection will be isolated from the outside world. This collection of systems is called an ensemble.

[Figure: a Gibbs ensemble, a collection of identical systems used to calculate {ni}; one system is shown in the energy state εj.]

The point now is that we know that all microstates of the composite ensemble are equally likely. Of the N individual elements of the ensemble, there will be ni in the microstate of energy εi, so the probability of an element being in this microstate will be ni/N. There are many distributions {ni} which are possible for this ensemble. In particular, we know that because the whole ensemble is isolated, the number of elements and the total energy are fixed. In other words any distribution {ni} must satisfy the requirements

∑i ni = N,  ∑i ni εi = E.

2.5.2 Most likely distribution

By analogy with the case of the spin 1/2 paramagnet, we will denote the number of microstates of the ensemble corresponding to a given distribution {ni} by t({ni}). Then the most likely distribution is that for which t({ni}) is maximised. The value of t({ni}) is given by

t({ni}) = N! / ∏i ni!,

since there are N elements all together and there are ni in the ith state. So the most probable distribution is found by varying the various ni to give the maximum value for t. It is actually more convenient (and mathematically equivalent) to maximise the logarithm of t. The maximum in ln t corresponds to the place where its differential is zero:

d ln t({ni}) = (∂ ln t/∂n1) dn1 + (∂ ln t/∂n2) dn2 + … + (∂ ln t/∂ni) dni + … = 0.

If the ni could be varied independently then we would have the following set of equations:

∂ ln t({ni})/∂ni = 0,  i = 1, 2, 3, … .

Unfortunately the ni can not all be varied independently, because all possible distributions {ni} must satisfy the two requirements

∑i ni = N,  ∑i ni εi = E;

there are two constraints on the distribution {ni}. These may be incorporated in the following way. We consider the differential of the expression

ln t({ni}) − α ∑i ni − β ∑i ni εi,

where α and β are, at this stage, undetermined. This gives us two extra degrees of freedom, so that we can maximise this by varying all ni independently. Since the constraints imply that d ∑i ni = 0 and d ∑i ni εi = 0, the maximum in ln t is also

specified by

d [ln t({ni}) − α ∑i ni − β ∑i ni εi] = 0.

Now we have recovered the two lost degrees of freedom, so that the ni can be varied independently. But then the multipliers α and β are no longer free variables, and their values must be found from the constraints fixing N and E. (This is called the method of Lagrange’s undetermined multipliers.) The maximisation can now be performed by setting the N partial derivatives to zero:

∂/∂ni [ln t({ni}) − α ∑i ni − β ∑i ni εi] = 0,  i = 1, 2, 3, … .

We use Stirling’s approximation for ln t, given by

ln t({ni}) = (∑i ni) ln(∑i ni) − ∑i ni ln ni,

so that upon evaluating the N partial derivatives we obtain

ln ∑i ni − ln nj − α − βεj = 0,

which has solution

nj = N e^(−α) e^(−βεj).

These are the values of nj which maximise t subject to N and E being fixed. This gives us the most probable distribution {ni}. The probability that a system will be in the state j is then found by dividing by N:

pj = e^(−α) e^(−βεj).

So the remaining step, then, is to find the constants α and β.

2.5.3 What are α and β?

For the present we shall sidestep the question of α by appealing to the normalisation of the probability distribution. Since we must have

∑j pj = 1,

it follows that we can express the probabilities as

pj = e^(−βεj) / Z,

where the normalisation constant Z is given by

Z = ∑j e^(−βεj).

The constant Z will turn out to be of central importance: from it all thermodynamic properties can be found. It is called the partition function, and it is given the symbol Z because of its name in German: Zustandssumme (sum over states).
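The Lagrange-multiplier result, nj ∝ exp(−βεj), can be checked by brute force for a small ensemble. In the sketch below (mine, with invented toy numbers) we enumerate every distribution {n0, n1, n2} over three equally spaced levels consistent with fixed N and E, and find the one that maximises t = N!/(n0! n1! n2!). For equally spaced levels the exponential form predicts n1/n0 = n2/n1, i.e. n1² ≈ n0·n2:

```python
from math import lgamma

# Three equally spaced levels with energies 0, 1, 2 (toy spectrum).
# Enumerate all distributions with fixed ensemble size N and energy E,
# and locate the one maximising t = N!/(n0! n1! n2!).
N, E = 60, 40

def log_t(ns):
    return lgamma(N + 1) - sum(lgamma(n + 1) for n in ns)

best_w, best = None, None
for n2 in range(N + 1):
    n1 = E - 2 * n2              # from the energy constraint n1 + 2*n2 = E
    n0 = N - n1 - n2             # from the number constraint
    if n1 < 0 or n0 < 0:
        continue
    w = log_t((n0, n1, n2))
    if best_w is None or w > best_w:
        best_w, best = w, (n0, n1, n2)

n0, n1, n2 = best
print(best)              # (31, 18, 11)
print(n1 * n1, n0 * n2)  # 324 vs 341: geometric (Boltzmann) to within a few per cent
```

Even for N = 60 the most probable distribution is already close to the geometric progression the Boltzmann form demands; the agreement sharpens as N grows.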

But what is this β? For the spin 1/2 paramagnet we had many similar expressions with 1/kT there, where we have β here. We shall now show that this identification is correct. We will approach this from our definition of temperature:

1/T = ∂S/∂E |V,N.

Now the fundamental expression for entropy is S = k ln Ω, where Ω is the total number of microstates of the ensemble. This is a little difficult to obtain. However we know t({ni}), the number of microstates corresponding to a given distribution {ni}, and this t is a very sharply peaked function. Therefore we make negligible error in approximating Ω by the value of t corresponding to the most probable distribution. In other words, we take

S = k ln [N! / ∏i ni!],

where

nj = N e^(−α) e^(−βεj).

Using Stirling’s approximation we have

S = k {N ln N − ∑i ni ln ni} = k {αN + βE}.

From this we immediately identify

1/T = ∂S/∂E |V,N = kβ,

so that

β = 1/kT,

as we asserted. The probability of a (non-isolated) system being found in the ith microstate is then

pi ∝ e^(−εi/kT),

or

pi = e^(−εi/kT) / Z.

This is known as the Boltzmann distribution, or the Boltzmann factor. It is a key result. Feynman says “This fundamental law is the summit of statistical mechanics, and the entire subject is either a slide-down from the summit, as the principle is applied to various cases, or the climb-up to where the fundamental law is derived and the concepts of thermal equilibrium and temperature clarified”.

______________________________________________________________ End of lecture 7
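The Boltzmann distribution is easy to tabulate. A short Python sketch (my own; the three-level spectrum is invented for illustration):

```python
from math import exp

def boltzmann_probs(energies, kT):
    """p_i = exp(-eps_i/kT) / Z for a list of level energies."""
    weights = [exp(-e / kT) for e in energies]
    Z = sum(weights)                      # the partition function
    return [w / Z for w in weights]

p = boltzmann_probs([0.0, 1.0, 2.0], kT=1.0)
print([round(x, 3) for x in p])    # [0.665, 0.245, 0.09]

# probabilities are normalised and decrease with increasing energy
assert abs(sum(p) - 1.0) < 1e-12
assert p[0] > p[1] > p[2]
```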

2.5.4 Link between the partition function and thermodynamic variables

All the important thermodynamic variables for a system can be derived from Z and its derivatives. We can see this from the expression for entropy. We have

S = k {αN + βE},

and since normalisation requires e^(−α) = 1/Z, i.e. α = ln Z, we have, upon rearrangement,

S = Nk ln Z + E/T.

Here both S and E refer to the ensemble of N elements. So for the entropy and internal energy of a single system,

S = k ln Z + E/T,

which, upon rearrangement, can be written

E − TS = −kT ln Z.

Now the thermodynamic function F = E − TS is known as the Helmholtz free energy, or the Helmholtz potential. We then have the memorable result

F = −kT ln Z.

This is seen from the differential of the free energy. Since

dE = T dS − p dV,

it follows that

dF = −S dT − p dV.

We can then identify the various partial derivatives:

S = −∂F/∂T |V = kT (∂ ln Z/∂T) |V + k ln Z,
p = −∂F/∂V |T = kT (∂ ln Z/∂V) |T.

Since E = F + TS we can then express the internal energy as

E = kT² (∂ ln Z/∂T) |V.

Thus we see that once the partition function is evaluated by summing over the states, all relevant thermodynamic variables can be obtained by differentiating Z.

2.5.5 Finding Thermodynamic Variables

A host of thermodynamic variables can be obtained from the partition function. It is instructive to examine the expression for the internal energy further. This will also confirm the identification of this function of state as the actual energy content of the system. If pj is the probability of the system being in the eigenstate corresponding to energy εj, then the mean energy of the system may be expressed as

E = ∑j εj pj = (1/Z) ∑j εj e^(−βεj),

where Z is the previously-defined partition function. It is convenient here to work in terms of β rather than converting to 1/kT. In examining the sum we note that

εj e^(−βεj) = −(∂/∂β) e^(−βεj),

so that the expression for E may be written, after interchanging the differentiation and the summation,

E = −(1/Z) (∂/∂β) ∑j e^(−βεj).

But the sum here is just the partition function, so that

E = −(1/Z) ∂Z/∂β = −∂ ln Z/∂β.

And since β = 1/kT, this is equivalent to our previous expression

E = kT² (∂ ln Z/∂T) |V;

however the mathematical manipulations are often more convenient in terms of β.

2.5.6 Summary of methodology

We have seen that the partition function provides a means of calculating all thermodynamic information from the microscopic description of a system. We summarise this procedure as follows:

1 Write down the possible energies for the system.
2 Evaluate the partition function for the system.
3 The Helmholtz free energy then follows from this.
4 All thermodynamic variables follow from differentiation of the Helmholtz free energy.

2.6 Localised systems

2.6.1 Working in terms of the partition function of a single particle

The partition function methodology described in the previous sections is general and very powerful. However a considerable simplification follows if one is considering systems made up of non-interacting distinguishable or localised particles. The point about localised particles is that even though they may be indistinguishable, perhaps a solid made up of argon atoms, the particles can be distinguished by their positions.

This allows an important simplification to the general methodology outlined above. Since the partition function for distinguishable systems is the product of the partition functions for each system, if we have an assembly of N localised identical particles, and if the partition function for a single such particle is z, then the partition function for the assembly is

Z = z^N.

It then follows that the Helmholtz free energy for the assembly is

F = −kT ln Z = −NkT ln z;

in other words the free energy is N times the free energy contribution of a single particle. As expected, the free energy is an extensive quantity. For localised systems we need only consider the energy levels of a single particle. We then evaluate the partition function z of a single particle and use the relation F = −NkT ln z in order to find the thermodynamic properties of the system. We will consider two examples of this, one familiar and one new.

______________________________________________________________ End of lecture 8

2.6.2 Using the partition function I — the S = 1/2 paramagnet (again)

Step 1) Write down the possible energies for the system

The assembly of magnetic moments µ are placed in a magnetic field B. The spin 1/2 has two quantum states, which we label by ↑ and ↓. The two energy levels are then

ε↑ = −µB,  ε↓ = µB.

Step 2) Evaluate the partition function for the system

We require to find the partition function for a single spin. This is

z = ∑states e^(−εi/kT) = e^(µB/kT) + e^(−µB/kT).

This time we shall obtain the results in terms of hyperbolic functions rather than exponentials, for variety. The partition function is expressed as the hyperbolic cosine:

z = 2 cosh(µB/kT).

Step 3) The Helmholtz free energy then follows from this

Here we use the relation

F = −NkT ln z = −NkT ln {2 cosh(µB/kT)}.

Note that here we don’t have pressure and volume as our work variables; we have magnetisation M and magnetic field B.
Step 4) All thermodynamic variables follow from differentiation of the Helmholtz free energy Before proceeding with this step we must pause to consider the performance of magnetic work.

you should check that the exponential expressions there are equivalent to the hyperbolic expressions here. from the microcanonical approach. At low temperatures the magnetisation saturates as all the moments become aligned against the applied field. so comparing this with our familiar −pdV . the Helmholtz free energy E dF = −SdT We can then identify the various partial derivatives: ∂F = NkT S = − ∂T B ∂F M = − = NkT ∂B T Upon differentiation we then obtain N µB tanh µB / kT + S = − T − − TS has the differential MdB. M = Nµ tanh µB / kT .18 . Some of these expressions were derived before. ∂ B |T Nk ln {2 cosh µB / kT } . giving = −N µB tanh µB / kT . of more immediate importance.d ⁄ W = −MdB. The internal energy is best obtained now from E = F + TS ln z + = −NkT = −NkT TS + ln {2 cosh µB / kT } E T − N µB tanh µB / kT T + Nk ln {2 cosh µB / kT } . By differentiating the internal energy with respect to temperature we obtain the thermal capacity at constant field CB = Nk ( kT ) µB 2 sech 2 µB / kT . we note that the expression M PH261 – BPC/JS – 1997 = Nµ tanh µB / kT Page 2. = −MB We note that the internal energy can be written as E as expected. The internal energy differential of a magnetic system is then dE = T dS − MdB and. We plot the magnetisation as a function of inverse temperature. we see that when dealing with magnetic systems we must make the identification p ↔ M V ↔ B. where Curie’s law holds. Incidentally. ∂ T |B ∂ ln z . | | ∂ ln z + Nk ln z. Recall that at high temperatures we have a linear region.

Observe that they all go to zero as T → 0. This is known as a Schottky anomaly. internal energy and the thermal capacity as a function of temperature.83µB / k .0 2.0 0.6 0. kT just as it is in the high temperature region that the ideal gas equation of state p M = = NkT / V is valid.0 1.44Nk at a temperature of T ∼ 0.is the equation of state for the paramagnet. as expected. as may be seen by expanding the expression for CB: CB CB PH261 – BPC/JS – 1997 ∼ 4Nk ( 2Nk ( kT ) µB 2 e− kT 2 2µB T T → 0 ∼ kT ) µB → ∞. kT ) The thermal capacity is particularly important as it is readily measurable.0 3.4 linear region N µ2B M ∼ kT 0.19 . M / Nµ 1. But recall that we found linear behaviour in the high T / B region where N µ2B . It exhibits a maximum of CB ∼ 0. Spin systems are unusual in that the energy states of a spin are finite and therefore bounded from above. Ordinarily the thermal capacity of a substance increases with temperature. This is a nonlinear equation. At low temperatures the entropy goes to zero. Page 2. The system then has a maximum entropy.8 0.2 0.0 µB / kT magnetisation of spin 1/2 paramagnet against inverse temperature Next we plot the entropy. And at high temperatures the entropy goes to the classical two-state value of k ln 2 per particle S S ∼ 2Nk ( e kT ) − µB − kT 2µB T → 0 T → ∞. As the entropy increases towards this maximum value it becomes increasingly difficult to pump in more heat energy. saturating at a constant value at high temperatures. ∼ Nk ln 2 4Nk ( µB 2 At both high and low temperatures the thermal capacity goes to zero.0 saturation 0.
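These results lend themselves to a quick numerical check. The sketch below (Python, in reduced units k = µ = N = 1; the field, temperature and grid values are illustrative choices, not from the notes) recovers M from a numerical derivative of F = −NkT ln{2 cosh(µB/kT)}, then locates the Schottky maximum of CB by a simple grid scan.

```python
import math

# Reduced units: k = mu = N = 1; sample field and temperature are illustrative.
def F(B, T):
    # Helmholtz free energy F = -NkT ln(2 cosh(mu*B/kT))
    return -T * math.log(2.0 * math.cosh(B / T))

B, T = 0.7, 1.3
M_analytic = math.tanh(B / T)                       # M = N*mu*tanh(mu*B/kT)
h = 1e-6
M_numeric = -(F(B + h, T) - F(B - h, T)) / (2 * h)  # M = -dF/dB at constant T
print(abs(M_analytic - M_numeric) < 1e-9)           # True: equation of state recovered

# Schottky anomaly: scan C_B/Nk = x^2 sech^2(x), x = mu*B/kT, for its maximum.
def c_reduced(x):
    return (x / math.cosh(x)) ** 2

x_peak = max((0.001 * i for i in range(1, 5000)), key=c_reduced)
print(round(c_reduced(x_peak), 2), round(1.0 / x_peak, 2))  # 0.44 0.83
```

The peak values agree with the quoted CB ∼ 0.44Nk at T ∼ 0.83µB/k.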

[figure: entropy of spin ½ paramagnet, S/Nk against kT/µB, rising to Nk ln 2; curves shown for lower B]
[figure: internal energy of spin ½ paramagnet, E/NµB against kT/µB]
[figure: thermal capacity of spin ½ paramagnet, CB/Nk against kT/µB]

______________________________________________________________
End of lecture 9

2.6.3 Using the partition function II — the Einstein model of a solid

One of the challenges faced by Einstein was the explanation of why the thermal capacity of solids tended to zero at low temperatures. He was concerned with nonmagnetic insulators, where the thermal excitations in the solid are due to the vibrations of the atoms, and he had an inkling that the explanation was something to do with quantum mechanics. Einstein constructed a simple model of this which was partially successful in explaining the thermal capacity. In Einstein's model each atom of the solid was regarded as a simple harmonic oscillator vibrating in the potential energy minimum produced by its neighbours. Each atom sees a similar potential, so they all oscillate at the same frequency; let us call this ω/2π. And since each atom can vibrate in three independent directions, the solid of N atoms is modelled as a collection of 3N identical harmonic oscillators. We shall follow the procedures outlined above.

Step 1) Write down the possible energies for the system
The energy states of the harmonic oscillator form an infinite 'ladder' of equally spaced levels; you should be familiar with this from your quantum mechanics course. The single particle energies are

ε_j = (j + ½) ħω,  j = 0, 1, 2, 3, …

(the lowest levels being ħω/2, 3ħω/2, 5ħω/2, …).

Step 2) Evaluate the partition function for the system
We require to find the partition function for a single harmonic oscillator. This is

z = ∑_{j=0}^∞ e^{−ε_j/kT} = ∑_{j=0}^∞ exp{−(j + ½) ħω/kT}.

To proceed, let us define a characteristic temperature θ, related to the oscillator frequency by

θ = ħω/k.

Then the partition function may be written as

z = ∑_{j=0}^∞ exp{−(j + ½) θ/T} = e^{−θ/2T} ∑_{j=0}^∞ (e^{−θ/T})^j

and we observe the sum here to be a (convergent) geometric progression. You should recall the result

∑_{n=0}^∞ x^n = 1 + x + x² + x³ + … = 1/(1 − x).

If you don't remember this result then multiply the power series by 1 − x and check the sum. The harmonic oscillator partition function is then given by

z = e^{−θ/2T} / (1 − e^{−θ/T}).

Step 3) The Helmholtz free energy then follows from this
Here we use the relation

F = −3NkT ln z = (3/2)Nkθ + 3NkT ln{1 − e^{−θ/T}}.

Step 4) All thermodynamic variables follow from differentiation of the Helmholtz free energy
Here we have no explicit volume dependence. If an external pressure were applied then the vibration frequency ω/2π would be a function of the interparticle spacing or, equivalently, the volume. This would give a volume dependence to the free energy from which the pressure could be found. However we shall ignore this. The thermodynamic variables we are interested in are then entropy, internal energy and thermal capacity.

The entropy is

S = −∂F/∂T|V = 3Nk [ ln( e^{θ/T} / (e^{θ/T} − 1) ) + (θ/T)/(e^{θ/T} − 1) ].

In the low temperature limit S goes to zero, as one would expect. The limiting low temperature behaviour is

S ∼ 3Nk (θ/T) e^{−θ/T},  T → 0.

At high temperatures the entropy tends towards a logarithmic increase

S ∼ 3Nk ln(T/θ),  T → ∞.

The internal energy is found from

E = −3N ∂ln z/∂β,

where we write ln z as

ln z = −βħω/2 − ln{1 − e^{−βħω}}.

Then on differentiation we find

E = 3N [ ħω/2 + ħω/(e^{βħω} − 1) ].

The first term represents the contribution to the internal energy from the zero point oscillations; it is present even at T = 0. At high temperatures the variation of the internal energy is

E ∼ 3NkT { 1 + (1/12)(θ/T)² − … },  T ≫ θ.

The first term is the classical equipartition part.

Turning now to the thermal capacity, we have

CV = ∂E/∂T|V = 3Nk ∂/∂T { θ / (e^{θ/T} − 1) }.

Upon differentiation this then gives

CV = 3Nk (θ/T)² e^{θ/T} / (e^{θ/T} − 1)².

At high temperatures the variation of the thermal capacity is

CV ∼ 3Nk − (1/4)Nk (θ/T)² + …,  T ≫ θ,

where the first term is the constant equipartition value, which is independent of the vibration frequency. At low temperatures we find

CV ∼ 3Nk (θ/T)² e^{−θ/T},  T ≪ θ.

We see that this model does indeed predict the decrease of CV to zero as T → 0, which was Einstein's challenge. The decrease in heat capacity below the classical temperature-independent value is seen to arise from the quantisation of the energy levels.

The Einstein model introduces a single new parameter: the frequency ω/2π of the oscillations or, equivalently, the characteristic temperature θ = ħω/k. To the extent that the model has some validity, each substance should have its own value of θ; the value for diamond is ∼1300K. And the prediction is that CV is a universal function of T/θ. This is an example of a law of corresponding states: when the temperatures are scaled appropriately all substances should exhibit the same behaviour, as the CV figure shows.

However the observed low temperature behaviour of such thermal capacities is a simple T³ variation rather than the more complicated variation predicted from this model. The explanation for this is that the atoms do not all oscillate independently. The coupling of their motion leads to normal mode waves propagating in the solid with a continuous range of frequencies. This was explained by the Debye model of solids.
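As a numerical sanity check on these results, the sketch below (Python, with k = 1 and θ = 1 as arbitrary illustrative units, not from the notes) sums the oscillator ladder directly against the closed form for z, then probes the two quoted limits of CV; the tolerances and sample temperatures are ad hoc choices.

```python
import math

theta = 1.0  # characteristic temperature, k = 1 (illustrative units)

def z_closed(T):
    # z = e^{-theta/2T} / (1 - e^{-theta/T})
    return math.exp(-theta / (2 * T)) / (1.0 - math.exp(-theta / T))

def z_series(T, terms=500):
    # direct sum over the oscillator ladder eps_j = (j + 1/2) theta
    return sum(math.exp(-(j + 0.5) * theta / T) for j in range(terms))

def cv_per_3Nk(T):
    # C_V / 3Nk = (theta/T)^2 e^{theta/T} / (e^{theta/T} - 1)^2
    x = theta / T
    return x * x * math.exp(x) / (math.exp(x) - 1.0) ** 2

print(abs(z_closed(0.7) - z_series(0.7)) < 1e-12)  # geometric series checks out
print(cv_per_3Nk(100.0) > 0.999)                   # high T: classical value 3Nk
print(cv_per_3Nk(0.05) < 1e-5)                     # low T: exponential freeze-out
```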

[figure: entropy of Einstein solid, S/Nk against T/θ]
[figure: internal energy of Einstein solid, E/Nkθ against T/θ, showing the classical limit E_cl = 3NkT and the zero point motion E_0 = (3/2)Nkθ]
[figure: thermal capacity of Einstein solid, CV/Nk against T/θ, with experimental points for diamond fit for θ = 1300K]
Finally we give the calculated properties of the Einstein model in terms of hyperbolic functions; you should check these. The partition function for the simple harmonic oscillator can be written as

z = (1/2) cosech(θ/2T),

so that the Helmholtz free energy for the solid comprising 3N such oscillators is

F = 3NkT ln{2 sinh(θ/2T)}.

And from this the various properties follow:

S = 3Nk [ (θ/2T) coth(θ/2T) − ln{2 sinh(θ/2T)} ]
E = (3/2)Nkθ coth(θ/2T)
CV = 3Nk (θ/2T)² cosech²(θ/2T).

______________________________________________________________
End of lecture 10

Equation of state for the Einstein solid
We have no equation of state for this model so far; no p–V relation has been found, since there was no volume-dependence in the energy levels, and thus none in the partition function and everything which followed from that. This deficiency can be rectified by recognising that the Einstein frequency ω/2π, or equivalently the temperature θ, may vary with volume. Then the pressure may be found from

p = −∂F/∂V|T = −(3Nk/2) coth(θ/2T) dθ/dV.

We note that the equilibrium state of the solid when no pressure is applied corresponds to the vanishing of dθ/dV. In fact θ will be a minimum at the equilibrium volume V0 (why?). For small changes of volume from this equilibrium we may then write

θ = θ0 + α ((V − V0)/V0)²

so that

dθ/dV = 2α (V − V0)/V0².

Then the equation of state for the solid is

p = (3Nkα/V0²) (V0 − V) coth(θ0/2T).

The high-temperature limit of this is

p = (6Nkα T/θ0) (V0 − V)/V0²

and the low-temperature, temperature-independent limit is

p = (3Nkα/V0²) (V0 − V).
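The hyperbolic forms quoted here should agree identically with the exponential forms derived earlier. A short spot check (Python, reduced units k = N = 1 and θ = 1, with arbitrary sample temperatures) confirms this for z and E.

```python
import math

# Reduced units: k = 1, N = 1, theta = 1; sample temperatures are arbitrary.
theta = 1.0

def z_hyp(T):   # z = (1/2) cosech(theta/2T)
    return 0.5 / math.sinh(theta / (2 * T))

def z_exp(T):   # z = e^{-theta/2T} / (1 - e^{-theta/T})
    return math.exp(-theta / (2 * T)) / (1.0 - math.exp(-theta / T))

def E_hyp(T):   # E = (3/2) N k theta coth(theta/2T)
    return 1.5 * theta / math.tanh(theta / (2 * T))

def E_exp(T):   # E = 3N [ theta/2 + theta/(e^{theta/T} - 1) ]
    return 3.0 * (theta / 2 + theta / (math.exp(theta / T) - 1.0))

ok = all(abs(z_hyp(T) - z_exp(T)) < 1e-12 and abs(E_hyp(T) - E_exp(T)) < 1e-12
         for T in (0.2, 1.0, 5.0))
print(ok)  # True: hyperbolic and exponential forms coincide
```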

