


Die Energie der Welt ist konstant, die Entropie der Welt strebt einem Maximum zu. (The energy of the world is constant; the entropy of the world strives toward a maximum.) (Clausius)

Clausius's couplet summarizes the vast range of experience that is canonized in thermodynamics, a subject that lies at the heart of chemistry, biochemistry, and indeed of all of nature. The power of thermodynamics lies in its ability to provide a quantitative measure of the energy changes that occur in physical, chemical, and biological processes. The chemical reactions of metabolism, the transport of material across cell membranes, the assembly of membranes, and the assembly of other types of macromolecular complexes all obey these laws. Nothing overrides the laws of thermodynamics: they are the decrees of fate. Since an energy change accompanies virtually every process, the importance of thermodynamics cannot easily be exaggerated. Thermodynamics provides us with a way to determine how much energy is released or absorbed in a process. It also leads us from the concentrations of reactants and products in a chemical reaction to the equilibrium constant for the reaction, and from the equilibrium constant we can determine the magnitude of the force that drives a process to equilibrium. Thermodynamics also reveals how this force (the free energy) is partitioned between a potential energy change (the enthalpy) and a change in the molecular order (the entropy) of the system. In this discussion we shall consider the major thermodynamic laws and the equations that relate these laws to each other. Once we have the laws in hand, we shall apply them to various examples as a prelude to their more extensive application in subsequent chapters.

Thermodynamic Systems and Thermodynamic State Functions

The universe is bigger than our laboratories, and it is convenient to divide the world into thermodynamic systems and their surroundings. A thermodynamic system is that part of the universe (a test tube, a tissue culture, or whatever) in which we are studying a process. Everything else constitutes its surroundings. If matter cannot cross the boundaries between the system and its surroundings, the system is closed. If, however, matter can be exchanged between the system and the surroundings, as in all living systems, the system is open. If neither matter nor energy can be exchanged with the surroundings, the system is isolated. Once we have defined the thermodynamic system, we must specify its state. The state of the system depends upon thermodynamic variables, called state functions, such as temperature, pressure, and the number of moles of material in the system. We observe only changes in thermodynamic variables. When the state of the system changes, say by increasing the temperature, the change in every other state function depends only upon the difference between the initial and final states and is independent of the pathway by which the process is carried out. The first law of thermodynamics states that the total energy of the universe is conserved in every process. We shall modestly limit our discussion of this grand statement to chemical reactions. In terms of our arbitrary division of the universe, we can say that the total energy of a system and its surroundings is a constant. Consider the reaction

energy + aA + bB = cC + dD (1.1)

The energy absorbed in the forward direction of reaction 1.1 is exactly equal to the energy released in the reverse reaction. Energy is conserved in the process. The total energy change in reaction 1.1 is


ΔU = Q - W (1.2)


where ΔU is the change in the internal energy for the process, Q is the heat absorbed by the reaction, and W is the work done by the system. The type of work done depends on how the reaction occurs. It may involve moving a piston, or it may involve reproducing an organism. The quantity ΔU is the same for the conversion of A and B to C and D whether the process requires many steps with many intermediates or occurs in a single step. In essence, the first law of thermodynamics tells us that different forms of energy (chemical, electrical, mechanical, and so forth) can be interconverted. Whenever energy is lost in one form, it reappears in exactly the same amount in another form.

Enthalpy Changes Under Standard Conditions

Many chemical processes, and almost all biochemical ones, occur at constant pressure. The work done by a system at constant pressure equals the product of the pressure and the change in volume that occurs during the process:

W = PΔV (1.3)

Substituting the term PΔV of equation 1.3 for W in equation 1.2 and rearranging, we obtain

ΔU = Qp - PΔV (1.4)

Note that we have changed Q to Qp to denote that the change in question takes place at constant pressure. Thus, Qp is the heat released or absorbed in the process at constant pressure. Since ΔU, P, and ΔV are state functions, Qp is also a state function. The heat released or absorbed in the constant-pressure process is called the enthalpy change for the reaction. Thus, equation 1.4 can be rewritten as

ΔH = ΔU + PΔV (1.5)

where ΔH is the symbol for the enthalpy change. Heat is measured in either calories or joules. One calorie (cal), or 4.184 joules (J), is the amount of heat required to raise the temperature of 1 gram of water from 14.5 to 15.5 °C at a pressure of 1 atmosphere (atm). One kilojoule (kJ) is equal to 1000 J, and 1 kcal is equal to 4.184 kJ. The following conventions are used for enthalpy changes.

ΔH < 0 The reaction is exothermic: heat is evolved by the system and enters the surroundings.
ΔH > 0 The reaction is endothermic: heat is absorbed by the system.

The specific heat of a substance is the amount of heat required to raise the temperature of 1 gram of the substance by 1 °C. The heat required to raise the temperature of 1 mole of the substance by 1 °C is the molar heat capacity, equal to the specific heat times the molecular weight of the substance. The units of heat capacity are joules per degree (expressed in kelvins) per mole (J K-1 mol-1). The heat capacity at constant pressure (Cp) is defined in terms of the enthalpy (equation 1.6).

ΔH = Cp ΔT (1.6)
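As a quick numerical sketch of equation 1.6 (the molar heat capacity of liquid water, about 75.3 J K-1 mol-1, is an assumed literature value, not given in the text):

```python
# Heat absorbed at constant pressure: deltaH = n * Cp * deltaT (equation 1.6,
# written for n moles). Cp of liquid water ~75.3 J K^-1 mol^-1 (assumed value).
def heat_at_constant_pressure(n_moles, cp_molar, delta_t):
    """Return Qp = deltaH in joules for n moles heated by delta_t kelvins."""
    return n_moles * cp_molar * delta_t

CP_WATER = 75.3  # J K^-1 mol^-1, molar heat capacity of liquid water (assumed)

# Heating 1 mole (18 g) of water by 1 K takes about 75 J, i.e. about 18 cal,
# consistent with the definition of the calorie (1 cal per gram-degree).
q = heat_at_constant_pressure(1.0, CP_WATER, 1.0)
print(round(q, 1))          # 75.3 (joules)
print(round(q / 4.184, 1))  # 18.0 (calories)
```

The division by 4.184 simply converts joules back to calories, recovering the specific heat of water (1 cal per gram) times its molecular weight.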

The enthalpy change for a chemical reaction depends upon the number of moles undergoing chemical change, and ΔH is expressed in kJ/mol. To compare reactions under the same conditions, the following conventions are used.

1. The standard state of any element or compound is the most stable form of the element or compound at 298 K and 1 atm pressure.
2. In solution, the solute standard state is 1.0 molar (M).
3. The standard enthalpy of formation (ΔH°f) of any element in its standard state is zero kJ/mol. The superscript ° indicates that the reaction occurs under standard conditions.
4. The standard enthalpy of formation of a 1.0 M solution of hydronium ions is zero kJ/mol.
5. The standard enthalpy of formation of a compound is the enthalpy change when 1 mole of the compound is formed in its standard state from its elements in their standard states.

The magnitude of the standard enthalpy of formation says nothing about the path or reaction mechanism by which the compound is formed. It depends only on the difference in enthalpy between the final state (the compound) and the initial state (the elements). This path independence is true for the enthalpy change of every chemical reaction. Let's consider the following general reaction again.

aA + bB -> cC + dD (1.7)

The standard enthalpy change for reaction 1.7 is given by
ΔH°reaction = Σ ΔH°f(products) - Σ ΔH°f(reactants) (1.8)



That is,
ΔH°reaction = {c ΔH°f[C] + d ΔH°f[D]} - {a ΔH°f[A] + b ΔH°f[B]} (1.9)


The standard heats of formation in equation 1.9 are molar quantities, that is, the values for 1 mole of each substance.

Heats of Combustion
The heat content, or in nutritional terms the caloric content, of metabolites is obtained by measuring the heat released when the compound is completely burned in oxygen. The heat of combustion (ΔH°c) is the heat released upon the complete oxidation of the metabolite to its oxidation products. The heats of combustion for carbohydrates, lipids, and proteins are 4.1, 9.3, and 4.1 kcal g-1, respectively. The heat of combustion of a substance can be determined from its standard enthalpy of formation and from the standard enthalpies of formation of the oxidation products, such as carbon dioxide and water. Let us consider the oxidation of glucose (reaction 1.10). Glucose has a rather complicated structure, but for the moment we'll just use its empirical formula.

C6H12O6(s) + 6 O2(g) -> 6 CO2(g) + 6 H2O(g) (1.10)


The standard enthalpy change for this reaction is identical to the heat of combustion and is calculated as shown in equations 1.11 and 1.12. Note that the physical state of each reactant is indicated. The heat of combustion depends upon the physical states of all reactants and products.
ΔH°reaction(1.10) = ΔH°c = {6 ΔH°f[CO2(g)] + 6 ΔH°f[H2O(g)]} - ΔH°f[glucose(s)] (1.11)


When we look up the data for equation 1.11, we find that the heat of combustion is -610 kcal/mol.

ΔH°c = [6(-94.2) kcal/mol + 6(-57.8) kcal/mol] - (-303) kcal/mol (1.12)

ΔH°c = -610 kcal/mol
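The arithmetic of equations 1.11 and 1.12 can be checked with a short script; the formation enthalpies are the kcal/mol values quoted in equation 1.12:

```python
# Standard enthalpies of formation in kcal/mol (values from equation 1.12).
DH_F = {
    "CO2(g)": -94.2,
    "H2O(g)": -57.8,
    "glucose(s)": -303.0,
}

def enthalpy_of_combustion():
    """Equation 1.11: sum over products minus sum over reactants."""
    products = 6 * DH_F["CO2(g)"] + 6 * DH_F["H2O(g)"]
    reactants = DH_F["glucose(s)"]  # O2 is an element, so its deltaH_f is 0
    return products - reactants

dh_c = enthalpy_of_combustion()
print(round(dh_c))  # -609, i.e. about -610 kcal/mol as quoted in the text
```

Note that the elemental oxygen drops out of the sum, since the standard enthalpy of formation of any element in its standard state is zero by convention 3 above.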


The Second Law of Thermodynamics

The second law of thermodynamics states that the entropy of the universe is increasing. The elusive concept of entropy is related to the order, or structure, of the system. In a physical, chemical, or biochemical change, if the final state is more ordered than the initial state, the entropy change is negative. On the other hand, if the final state is less ordered than the initial state, the change in state occurs with a positive entropy change. Although positive entropy changes are dictated by the second law of thermodynamics, processes with negative entropy changes are permitted, since it is the entropy of the universe, including both the thermodynamic system and the surroundings, that must increase for any process (equation 1.14).

ΔSprocess = ΔSsystem + ΔSsurroundings > 0 (1.14)


Therefore, ΔSsystem can be negative if ΔSsurroundings is positive and |ΔSsurroundings| > |ΔSsystem|. In a system such as the flask in which a chemical reaction occurs, order in the system may increase provided that disorder increases in the environment. For example, a living cell maintains its low-entropy, highly structured state at the expense of increasing the entropy of the environment. Like enthalpy changes, the change in entropy for a change in state is independent of the path by which the process occurs and depends only on the initial and final states of the system. For the reaction

aA + bB -> cC + dD

the standard entropy change is given by equation 1.15.
ΔS°reaction = {c S°f[C] + d S°f[D]} - {a S°f[A] + b S°f[B]} (1.15)


The standard entropies in equation 1.15 are molar quantities, that is, the values for 1 mole of each substance.

Entropy, Probability, and Information

What, never? No, never. What, never? Well, hardly ever! W. S. Gilbert and A. S. Sullivan, H.M.S. Pinafore

There are several ways of formulating the entropy change for a process. One classical definition relates entropy changes to the heat absorbed in a reversible process and the temperature of the system (equation 1.16).

ΔS = Qreversible/T (1.16)

The term ΔS is the entropy change, Qreversible is the heat absorbed in a reversible process, and T is the Kelvin temperature. A second formulation, introduced by Boltzmann, defines the entropy in terms of the most probable state of the system. The most probable state is the most random, or least structured, state. Entropies are additive (as are enthalpies) because the entropy change for a process depends only upon the initial and final states of the system and is independent of the process by which the change in state is brought about. Let us now examine the relationship between entropy and probability. Consider a deck of 52 playing cards. The probability of drawing a spade is 1/4, and the probability of drawing an ace is 1/13. The probability of drawing the ace of spades is (1/4 x 1/13) = 1/52. Probabilities are multiplicative. Additive entropies and multiplicative probabilities are related by a logarithmic function (equation 1.17).

S = k ln W (1.17)


where S is the entropy, k is the Boltzmann constant (1.38 x 10-23 J K-1, or the ideal gas constant per molecule), and W is the probability that an event will occur. Since entropy in this formulation is a purely statistical quantity, it can be applied only to large numbers of particles or events. James Clerk Maxwell (1831-1879), whose monumental achievement in physics was the unification of electricity and magnetism, invented, in a letter to P. G. Tait, one of the most significant fantasies in the history of science. Suppose that we appoint a microscopic being, named Maxwell's demon, to guard a gate between two flasks containing equal numbers of molecules at the same temperature (Figure 1.1). By letting only fast molecules pass through the gate, the demon (who deserves his appellation) can cause one flask to heat up and the other one to cool down with no expenditure of energy. The entropy of the system would thus decrease (in a spontaneous change to a less probable state) in violation of the second law of thermodynamics.
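Equation 1.17 can be exercised directly. This sketch computes S = k ln W for a perfectly ordered deck (W = 1) and for a fully shuffled one (W = 52!), anticipating the card example discussed below:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(w):
    """S = k ln W (equation 1.17) for W equally probable arrangements."""
    return K_B * math.log(w)

# An ordered deck has only one arrangement, so its entropy is defined as zero;
# a shuffled deck may be in any of 52! equally probable arrangements.
s_ordered = boltzmann_entropy(1)
s_shuffled = boltzmann_entropy(math.factorial(52))

print(s_ordered)                   # 0.0
print(round(s_shuffled / K_B, 1))  # ln(52!) is about 156.4
```

The absolute entropy is tiny (of order 10^-21 J/K) because a single deck contains only 52 "particles"; the statistical formulation becomes thermodynamically significant only for mole-sized numbers of molecules.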

Figure 1.1 Maxwell's demon is able to separate fast- and slow-moving molecules, but not without expending energy in the form of information.

Over fifty years elapsed before the demon was deprived of his paradoxical and magical powers. In 1929 L. Szilard pointed out that the demon must be endowed with memory to separate hot and cold molecules, an idea that was altered slightly by L. Brillouin, who showed that Maxwell's demon must possess information to carry out his duties. The demon has two choices: he must either open or close the gate as a molecule approaches. Each such binary decision carries an entropy cost of k ln 2; this fundamental quantity of information is called a bit (from binary digit). The cost of making each decision exactly balances the gain to be had in separating hot and cold molecules. Maxwell's demon was vanquished because of an explicit connection between information and entropy. Let us return to our deck of playing cards. When the cards are arranged by suit in ascending order from deuce through ace, the entropy of the system is defined as zero. The second law of thermodynamics says that shuffling the cards will abolish this order and eventually produce a random distribution of cards corresponding to the state of maximum entropy. Conversely, information is required to restore order to the chaos of the randomly shuffled deck. The information required to restore order is equal in magnitude and opposite in sign to the increase in entropy produced by shuffling the deck in the first place. Let us consider a biochemical reaction that is driven by an entropy change, one in which the statistical concept of entropy comes directly into play. The bacterium Pseudomonas putida produces an enzyme, alanine racemase, that interconverts L- and D-alanine. From either pure enantiomer the enzyme produces a racemic mixture, that is, a mixture containing equal amounts of the D- and L-stereoisomers, as shown below.
L-alanine ⇌ D-alanine (catalyzed by alanine racemase)

The enthalpy change for racemization is zero and the equilibrium constant is 1.0, confirming the notion that enantiomers have the same thermodynamic stability. Since the enthalpy change for the reaction is zero, it must be driven by an entropy change. What is the origin of this entropy change? If the reaction begins with either pure isomer, converting the starting material into equal amounts of D- and L-alanine produces a more random state than the reactants. The probability that a Maxwell demon can pick out an L-alanine decreases from 1.0 to 0.5; the final state is therefore more disordered than the initial state consisting of pure enantiomer. The entropy change for the reaction is given by equation 1.18.

ΔSracemization = S(products) - S(reactants) (1.18)


We recall from equation 1.17 that the entropy of a state can be expressed as S = k ln W. For the products, a racemic mixture, W = 2 because there are two possible microscopic states: D-alanine and L-alanine. For the reactants, W = 1 because there is only a single microscopic state, consisting of a pure enantiomer. The Boltzmann constant can be converted to the gas constant R by multiplying it by Avogadro's number, R = kN. Therefore, the entropy change for racemization of 1 mole of D-alanine at room temperature (298 K) is given by equation 1.19.

ΔSracemization = R ln 2 - R ln 1 = 5.76 J K-1 mol-1 (1.19)
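Equation 1.19 is a one-line computation; the script below also shows the (small but negative) free-energy change that results when ΔH = 0, using the G = H - TS relation developed in the next section:

```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1

# Equation 1.19: pure enantiomer (W = 1) -> racemic mixture (W = 2),
# so deltaS = R ln 2 - R ln 1 per mole.
ds_racemization = R * math.log(2) - R * math.log(1)
print(round(ds_racemization, 2))  # 5.76 J K^-1 mol^-1

# With deltaH = 0, the driving force is entirely entropic: deltaG = -T*deltaS.
T = 298.0  # K
dg = 0.0 - T * ds_racemization
print(round(dg))  # -1717 J/mol, so racemization of a pure enantiomer is spontaneous
```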

Free Energy
All physical processes occur with an increase in entropy when changes in both the system and the surroundings are considered. When no further spontaneous change is possible, the total entropy has increased to a maximum, and the system is at equilibrium (Figure 1.2). The ability of the system to do work decreases as equilibrium is approached, and at equilibrium there is no free energy available to do work. When an organism is at equilibrium with its surroundings, it is dead. The Gibbs free energy (G) is a thermodynamic state function that defines the equilibrium condition at constant temperature and pressure (equation 1.20).

Figure 1.2 Relationship between entropy and the composition of a system. When the entropy of the universe is a maximum, no spontaneous change is possible, and the system is at equilibrium.

G = H - TS (1.20)


The Gibbs free energy determines both the direction and the magnitude of spontaneous change in systems held at constant temperature and pressure. Because most chemical reactions are carried out under these conditions, and since biological systems function under these conditions, the Gibbs free energy is of enormous importance. The change in free energy is the force that drives a process to equilibrium. By convention, if ΔG is negative, the process is spontaneous in the direction written and is called exergonic (from the Greek ergon, meaning work). When ΔG is positive, the process is not spontaneous in the direction written and is said to be endergonic. If the free-energy change for a process is zero, the system is at equilibrium. These conventions are summarized below.

Conventions for the algebraic sign of ΔG and the direction of spontaneous change:

ΔG < 0 The change in state is exergonic and spontaneous in the direction written.
ΔG = 0 The reaction is at equilibrium; the system cannot undergo any spontaneous change, and there is no free energy available to do work.
ΔG > 0 The change in state is endergonic and is not spontaneous in the direction written. (The reverse reaction is spontaneous.)

The sign of ΔG is controlled by the balance between ΔH and TΔS. For processes in which ΔH is negative and TΔS is positive, the enthalpy and the entropy act in concert, and both terms favor the spontaneous change. We have now defined two conditions for equilibrium: (1) the entropy of the universe is a maximum at equilibrium, and (2) the Gibbs free energy of the system is a minimum at equilibrium. Since the Gibbs free energy is a property of the system, it provides us with a measurable criterion of equilibrium in which enthalpy and entropy changes are balanced. Since the enthalpy change is a consequence of the first law of thermodynamics, and the entropy change is described by the second law of thermodynamics, the Gibbs free energy is a tremendous unifying principle. The relationships between the spontaneity of a given change in state and the enthalpy and entropy changes for that change in state are summarized below.

Effect of changes in ΔH and ΔS on ΔG for a reaction:

ΔH > 0, ΔS > 0  The reaction is endothermic but favored entropically. It may occur if the temperature is high enough.
ΔH > 0, ΔS < 0  The reaction is endothermic and not favored entropically. It will not be spontaneous at any temperature.
ΔH < 0, ΔS > 0  The reaction is spontaneous at all temperatures.
ΔH < 0, ΔS < 0  The reaction is exothermic but not favored entropically. The process may be spontaneous at temperatures low enough for the enthalpy term to outweigh the unfavorable entropic contribution to the free-energy change.
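The four cases in the summary above follow mechanically from ΔG = ΔH - TΔS. A minimal sketch (the numeric values are illustrative, chosen so that the crossover temperature ΔH/ΔS is 100 K):

```python
def reaction_spontaneous(delta_h, delta_s, temperature):
    """Return True if deltaG = deltaH - T*deltaS is negative (exergonic).

    delta_h in J/mol, delta_s in J K^-1 mol^-1, temperature in K.
    """
    return (delta_h - temperature * delta_s) < 0

# deltaH < 0, deltaS > 0: spontaneous at all temperatures.
assert reaction_spontaneous(-1000.0, +10.0, 298.0)
# deltaH > 0, deltaS < 0: never spontaneous.
assert not reaction_spontaneous(+1000.0, -10.0, 298.0)
# deltaH > 0, deltaS > 0: spontaneous only above T = deltaH/deltaS = 100 K.
assert not reaction_spontaneous(+1000.0, +10.0, 50.0)
assert reaction_spontaneous(+1000.0, +10.0, 150.0)
# deltaH < 0, deltaS < 0: spontaneous only below 100 K.
assert reaction_spontaneous(-1000.0, -10.0, 50.0)
assert not reaction_spontaneous(-1000.0, -10.0, 150.0)
print("all four cases behave as the summary predicts")
```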

Standard Free Energy Changes

The standard free energy of a compound is the free-energy change for formation of 1 mole in its standard state (298 K and 1.0 atm) from its elements in their standard states. The standard free energy of formation of any element in its standard state is zero; the standard state for a solute in solution is 1.0 molar; and the standard free energy of formation of a 1.0 M solution of hydronium ions is assigned a value of zero. The free-energy change for a given process is independent of the pathway by which the change in state is brought about. Consider the general reaction shown below.

aA + bB ⇌ cC + dD (1.21)


The standard free-energy change for reaction 1.21 is given by equation 1.22.
ΔG°reaction = {c ΔG°f[C] + d ΔG°f[D]} - {a ΔG°f[A] + b ΔG°f[B]} (1.22)


The standard free energies of formation in equation 1.22 are molar quantities, that is, the values for 1 mole of each substance.

Standard Free Energy Changes and the Equilibrium Constant

We have now discussed three related ideas: free-energy changes, standard free-energy changes, and the equilibrium condition. What is the relation among them? The free-energy content of a compound depends upon the number of moles present and is thus an extensive property of the system. The standard free energy of a compound is defined for 1 mole of the compound under specified conditions of constant temperature and pressure. It can be shown that the two are related by equation 1.23.
GA = G°A + 2.303RT log[A] (1.23)


When the concentration of A is 1.0 M, GA simply equals G°A, but under all other conditions the standard free energy and the free energy have different values. The second term on the right side of equation 1.23 is a correction factor that relates the actual free energy of the compound to the standard free energy. Let us consider our standard reaction again.
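The size of that correction term is easy to get a feel for numerically. A sketch, using a hypothetical solute at an illustrative concentration of 1 mM:

```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1
T = 298.0  # K

def free_energy_correction(concentration_molar):
    """Second term of equation 1.23: 2.303 R T log10[A], in J/mol.

    This is the amount by which the actual free energy G_A differs
    from the standard free energy G°_A at the given concentration.
    """
    return 2.303 * R * T * math.log10(concentration_molar)

print(free_energy_correction(1.0))  # 0.0: at 1.0 M, G_A equals G°_A exactly
# At 1 mM the correction is roughly -17 kJ/mol, a substantial shift:
print(round(free_energy_correction(1e-3) / 1000, 1))
```

Dilution lowers the free energy of a solute, which is why concentrations, not just standard free energies, decide the direction of a reaction in the cell.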

aA + bB ⇌ cC + dD (1.24)


The free-energy change (note: not the standard free-energy change) is given by equation 1.25.

ΔGreaction = (cGC + dGD) - (aGA + bGB) (1.25)

Substituting a term of the form of equation 1.23 for each of the reactants and products into equation 1.25 gives the result shown in equation 1.26.
ΔGreaction = (cG°C + dG°D - aG°A - bG°B) + 2.303RT log([C]c[D]d / [A]a[B]b) (1.26)


The term in parentheses in equation 1.26 is simply the standard free-energy change for the reaction. Let us call the argument of the log term Q. Thus,

ΔGreaction = ΔG°reaction + 2.303RT log Q, where Q = [C]c[D]d / [A]a[B]b (1.27)


At equilibrium, Q equals the equilibrium constant for the reaction, and the free-energy change for the reaction is zero, so equation 1.27 reduces to equation 1.28.
0 = ΔG°reaction + 2.303RT log Keq (1.28)


This can easily be rearranged as equation 1.29.

ΔG°reaction = -2.303RT log Keq (1.29)


Equation 1.29 is one of the most useful in all of thermodynamics. If we take the antilog of equation 1.29, we obtain equation 1.30.

Keq = e^(-ΔG°reaction/RT) (1.30)


Referring to equation 1.26, we see why ΔG°reaction is related to the equilibrium constant rather than ΔGreaction. The log term represents the free-energy change that occurs when the reactants and products are brought from standard-state concentrations to equilibrium concentrations. At equilibrium, the log term exactly balances the standard free-energy change, making ΔGreaction equal to zero. Thus, ΔG°reaction is not the criterion of spontaneity for chemical reactions. In many reactions there are steps whose standard free-energy changes are positive but which are nevertheless spontaneous. Referring again to equation 1.26, this means that the value of Q determines the spontaneity. If the value of Q is less than 1.0, the log term is negative; if it is sufficiently negative, the process is spontaneous under the prevailing conditions even though ΔG°reaction is positive. The numerical relationship between equilibrium constants and standard free-energy changes is shown in Table 1.1.

Table 1.1 Relationship between ΔG° and the equilibrium constant for the equilibrium between two species, A and B, at 298 K.

%B       K        -ΔG° (J/mol)
50       1.00     0
55       1.22     500
60       1.50     1000
65       1.86     1500
70       2.33     2100
75       3.00     2700
80       4.00     3400
85       5.67     4300
90       9.00     5450
95       19.0     7300
98       49.0     9650
99       99.0     1.1 x 10^4
99.9     999.0    1.7 x 10^4
99.99    9999.0   2.2 x 10^4
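Equations 1.29 and 1.30 can be spot-checked against Table 1.1 with a few lines of Python; the small discrepancies from the tabulated values reflect the rounding in the table:

```python
import math

R = 8.314   # gas constant, J K^-1 mol^-1
T = 298.0   # K

def standard_free_energy(k_eq):
    """Equation 1.29: deltaG° = -2.303 R T log10(Keq) = -R T ln(Keq), in J/mol."""
    return -R * T * math.log(k_eq)

def equilibrium_constant(dg_standard):
    """Equation 1.30: Keq = exp(-deltaG°/RT)."""
    return math.exp(-dg_standard / (R * T))

# Spot-check two rows of Table 1.1 for A <-> B, where K = %B / %A.
print(round(standard_free_energy(1.00)))  # 0 (50% B: no driving force)
print(round(standard_free_energy(99.0)))  # -11385, i.e. -deltaG° ~ 1.1 x 10^4

# The two functions are inverses of one another.
k = equilibrium_constant(standard_free_energy(19.0))
print(round(k, 1))  # 19.0
```

Note how modest the free energies are: shifting an equilibrium from 50:50 to 99:1 costs only about 11 kJ/mol, a fraction of the energy of a single hydrogen bond network or ATP hydrolysis.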