
Gibbs Free Energy

Avalanches!
Gibbs Free Energy: Equilibrium

Unstable equilibrium: small distortions can perturb the system toward other
(local and global) stable equilibria.

“The equilibrium state is one in which no further macroscopic change takes place because
all forces acting on the system are balanced.”
Gibbs Free Energy: Equilibrium
The general definition of the Gibbs free energy (constant T and p): G = H − TS
Consider a small change in G: ∆G = ∆H − T∆S − S∆T
If T is constant, the last term on the right-hand side vanishes, and we have the case for an
isothermal process:
∆G = ∆H − T∆S
Notice that ∆G measures the useful work. So the useful work is always less than the enthalpy
change. The difference T∆S is the isothermally unavailable energy (wasted).
Let's now use ∆H = ∆U + p∆V + V∆p and ∆U = T∆S − p∆V from previous weeks to write

∆G = V∆p − S∆T
So, at constant p and T, the above expression gives ∆G = 0. That is, when the (reversible)
system is in equilibrium, pressure and temperature do not change (example: ice in equilibrium
with water). In other words, the magnitude of ∆G measures the extent of displacement of the
system from equilibrium, and ∆G = 0 at equilibrium.
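A minimal numeric sketch of this equilibrium criterion for the ice-water example, assuming
the textbook values ∆H_fus ≈ 6010 J/mol and ∆S_fus ≈ 22.0 J/(mol·K), so that ∆G = ∆H − T∆S
crosses zero near 273 K:

```python
# Sketch: sign of ∆G = ∆H − T∆S for melting ice at constant T and p.
# Assumed illustrative values: ∆H_fus ≈ 6010 J/mol, ∆S_fus ≈ 22.0 J/(mol·K).
def delta_g(t, dh=6010.0, ds=22.0):
    """Gibbs free energy change (J/mol) for melting at temperature t (K)."""
    return dh - t * ds

for t in (263.0, 273.18, 283.0):
    print(f"T = {t:6.2f} K   dG = {delta_g(t):+8.1f} J/mol")
```

Below the melting point ∆G > 0 (ice is stable), above it ∆G < 0 (water is stable), and ∆G ≈ 0
right at the transition, exactly the displacement-from-equilibrium picture above.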

Gibbs Free Energy: Phase transition
A phase is a system or part of a system that is homogeneous and has definite boundaries.
A phase need not be a chemically pure substance.
A phase transition is ordinarily caused by heat uptake or release, and when a phase
change does occur it is at a definite temperature and involves a definite amount of heat.
Transitions of this type are “all-or-none” transitions; the material is in one phase or in
another phase. The two phases can coexist at the transition temperature. Such
transitions are known as first-order phase transitions.

∆" = ∆$ − &∆'
discontinuity in
heat capacity, C
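Because the two phases coexist in equilibrium at the transition temperature, ∆G = 0 there, so
the entropy of transition follows as ∆S_trans = ∆H_trans / T_trans. A minimal sketch, assuming
the textbook latent heat of vaporization of water (about 40.7 kJ/mol at 373.15 K):

```python
# Sketch: at a first-order transition dG = 0, so dS_trans = dH_trans / T_trans.
# Assumed value: dH_vap(water) ~ 40700 J/mol at the boiling point 373.15 K.
def transition_entropy(dh, t):
    """Entropy of transition (J/(mol*K)) from latent heat dh (J/mol) at t (K)."""
    return dh / t

ds_vap = transition_entropy(40700.0, 373.15)
print(f"dS_vap = {ds_vap:.1f} J/(mol*K)")  # about 109 J/(mol*K)
```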

The counterpart of the 1st-order phase transition is the 2nd-order phase transition, for which
the discontinuity disappears and the phase change is smooth (examples: polymers, magnetic
systems).
Statistical thermodynamics/physics/mechanics
Classical thermodynamics provides a phenomenological description of nature. The
mathematical relationships of thermodynamics are precise, but they do not tell us the
molecular origin of the properties of matter. Now, we will discuss a molecular interpretation
of thermodynamic quantities.
Statistical mechanics links the behavior of individual particles or parts of
macromolecules to classical thermodynamic quantities like work, heat, and entropy.

Recall our example with water or car tires…

Molecular interpretation of the three states of matter. In the solid phase (A), the molecules
are in a regular array. Interactions between molecules occur, but overall there is very little
translational motion. (B) depicts the liquid state. Molecules are free to translate in any
direction. The volume, however, is not much different from that in the solid state. In the gas
phase (C), the volume occupied by the molecules is much larger than in the liquid phase or
the solid phase.
Statistical mechanics
How then does one go about providing a detailed description of molecular behavior? A
macroscopic system might have on the order of 10^23 particles, and on a practical level the
complete description of each particle and each particle’s motion seems impossible.

The Maxwell-Boltzmann distribution of particle velocities, named after James Clerk Maxwell
and Ludwig Boltzmann.

Relation to hot bodies in contact?

Two systems are in thermal equilibrium when the distribution of the kinetic energies of the
molecules of one system is identical to that of the other system.
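A minimal sketch of one feature of the Maxwell-Boltzmann speed distribution: its peak (the
most probable speed) is v_p = sqrt(2 kB T / m). The choice of N2 at 300 K is an assumed
example, not from the slide:

```python
import math

# Sketch: most probable speed of the Maxwell-Boltzmann distribution,
# v_p = sqrt(2 * k_B * T / m). Assumed example: N2 (m ~ 28 u) at 300 K.
K_B = 1.380649e-23   # Boltzmann constant, J/K
U = 1.66053907e-27   # atomic mass unit, kg

def most_probable_speed(mass_kg, temp_k):
    """Peak of the Maxwell-Boltzmann speed distribution (m/s)."""
    return math.sqrt(2.0 * K_B * temp_k / mass_kg)

m_n2 = 28.0 * U
print(f"v_p(N2, 300 K) = {most_probable_speed(m_n2, 300.0):.0f} m/s")  # ~422 m/s
```

Because v_p depends only on T and m, two gases in thermal contact end up with kinetic-energy
distributions characterized by the same temperature, which is the equilibrium statement above.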
Statistical mechanics
Now let's discuss the spread of gas molecules in a room. This is an entropy-driven process:
the number of ways the gas particles can be arranged (and hence the entropy) increases as
they spread.

Expansion of a gas
throughout a room.

The gas pressure is high at the beginning. By contrast, on the right side, the 25 gas molecules
are spread throughout the entire 272 volume elements. There are many more ways of placing
25 molecules in 272 elements than in 25 elements! As we shall see below, counting the
arrangements of molecules over volume elements leads to the famous Boltzmann equation of
statistical mechanics.
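The counting claim can be checked directly. A minimal sketch using the figure's numbers
(25 molecules, 25 vs. 272 volume elements, at most one molecule per element):

```python
import math

# Sketch: number of ways to place 25 indistinguishable molecules into
# volume elements, one molecule per element: C(n_elements, 25).
# The element counts 25 and 272 are taken from the figure.
ways_small = math.comb(25, 25)    # every element occupied: exactly 1 way
ways_large = math.comb(272, 25)   # astronomically many arrangements

print(f"25 elements:  {ways_small} way")
print(f"272 elements: {ways_large:.3e} ways")
```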
Let’s choose as our system an ensemble of indistinguishable particles. The particles are
indistinguishable because there is no way of telling them apart.

The equilibrium distribution will be the most probable distribution. There will be
fluctuations of the system at equilibrium, but unless the system is particularly small, all
probable fluctuations will be essentially negligible in magnitude.
Statistical mechanics
Recall that there are many ways in which a given quantity of energy can be distributed even
though the total energy is constant (1st law of TD). Our system will choose the most
probable distribution of energy at equilibrium.
In general the number of ways, W, of arranging N identical particles in configuration
{n1, n2, . . . }, with n1 in one group, n2 in another, n3 in another, and so on, is

W = N! / (n1! · n2! · … · nk!)

Example: Suppose we have N = 20 with configuration {1, 0, 3, 5, 10, 1}; the objects are
gathered into six different piles. In this case W = 20!/(1!·0!·3!·5!·10!·1!) ≈ 10^9, a large number.
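The formula above is easy to evaluate exactly. A minimal sketch for the N = 20 example:

```python
import math

# Sketch: W = N! / (n1! * n2! * ... * nk!) for the configuration {1,0,3,5,10,1}.
def multiplicity(config):
    """Number of ways to arrange sum(config) particles into the given groups."""
    w = math.factorial(sum(config))
    for ni in config:
        w //= math.factorial(ni)
    return w

w = multiplicity([1, 0, 3, 5, 10, 1])
print(f"W = {w:,}")  # 931,170,240, i.e. ~10^9 as stated
```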
The most probable distribution is the one having the largest number of arrangements of
particles; identifying the most probable distribution is therefore the same as maximizing W
under the constraints that the number of particles and the total energy are constant
(maximum entropy!)

Guess what? The connection between S and W, known as the Boltzmann equation, is

S = kB ln W

where kB = 1.38×10^-23 J/K is the Boltzmann constant.
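A minimal sketch applying the Boltzmann equation to the W ≈ 10^9 value from the N = 20
example above (the exact W used here, 931,170,240, is that example's multiplicity):

```python
import math

# Sketch: Boltzmann entropy S = k_B * ln(W) for a microstate count W.
K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(w):
    """Entropy (J/K) of a system with w equally probable microstates."""
    return K_B * math.log(w)

s = boltzmann_entropy(931_170_240)  # W from the N = 20 example
print(f"S = {s:.3e} J/K")
```

Even W ~ 10^9 gives a minuscule entropy; thermodynamic values of S arise because W for
~10^23 particles is unimaginably larger.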


Statistical mechanics
Expansion of a gas
throughout a room, from state A to state C.

S_A = kB ln W_A,   S_C = kB ln W_C

W_A ∝ V_A,   W_C ∝ V_C

∆S = S_C − S_A = kB ln W_C − kB ln W_A = kB ln (W_C / W_A)

∆S = kB ln (V_C / V_A) > 0

Entropy increases as the gas expands!
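A minimal numeric sketch of this result, using the figure's 25 → 272 volume elements as the
volume ratio. The generalization to N molecules (W ∝ V^N, so ∆S = N kB ln(V_C/V_A)) is an
added assumption beyond the single-molecule relation on the slide:

```python
import math

# Sketch: dS = N * k_B * ln(V_C / V_A) for expansion from volume V_A to V_C.
# Volume ratio 272/25 taken from the figure; N-molecule form assumes W ~ V^N.
K_B = 1.380649e-23  # Boltzmann constant, J/K

def delta_s(n, v_ratio):
    """Entropy change (J/K) for n molecules whose volume grows by v_ratio."""
    return n * K_B * math.log(v_ratio)

print(f"per molecule:  dS = {delta_s(1, 272 / 25):.3e} J/K")
print(f"25 molecules:  dS = {delta_s(25, 272 / 25):.3e} J/K")
```

Both values are positive, confirming that the expansion is entropically favorable.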
