
ENTROPY VARIATION IN MIXING

Since thermodynamic entropy can be related to statistical mechanics or to information theory, it is possible to calculate the entropy of mixing using these two approaches. Assume that the molecules of two different substances are approximately the same size, and regard space as subdivided into a square lattice whose cells are the size of the molecules. This is a crystal-like conceptual model to identify the molecular centers of mass. If the two phases are liquids, there is no spatial uncertainty in each one individually. Everywhere we look in component 1, there is a molecule present, and likewise for component 2. After the two different substances are intermingled, the liquid is still dense with molecules, but now there is uncertainty about what kind of molecule is in which location. Of course, any idea of identifying molecules in given locations is a thought experiment, not something one could do, but the calculation of the uncertainty is well-defined. We can use Boltzmann's equation for the entropy change as applied to the mixing process

\Delta S_{mix} = k_B \ln \Omega

where k_B is the Boltzmann constant. We then calculate the number of ways Ω of arranging N_1 molecules of component 1 and N_2 molecules of component 2 on a lattice, where N = N_1 + N_2 is the total number of molecules, and therefore the number of lattice sites. Calculating the number of permutations of N objects, correcting for the fact that N_1 of them are identical to one another, and likewise for N_2, gives

\Omega = \frac{N!}{N_1! \, N_2!}

After applying Stirling's approximation for the factorial of a large integer m,

\ln m! \approx m \ln m - m

the result is

\Delta S_{mix} = -k_B \left[ N_1 \ln \frac{N_1}{N} + N_2 \ln \frac{N_2}{N} \right] = -k_B N \left[ x_1 \ln x_1 + x_2 \ln x_2 \right]

where x_1 = N_1/N and x_2 = N_2/N are the mole fractions, which are also the probabilities of finding a molecule of the corresponding component at any given lattice site. Since the Boltzmann constant k_B = R/N_A, where N_A is Avogadro's number, and the number of molecules N = n N_A for n total moles, this expression may be generalized to a mixture of c components, with n_i moles of each:

\Delta S_{mix} = -R \sum_{i=1}^{c} n_i \ln x_i = -nR \sum_{i=1}^{c} x_i \ln x_i
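
As a small numerical illustration of this generalized expression, the sketch below assumes ideal mixing; the function name entropy_of_mixing and the example mole amounts are hypothetical choices for demonstration only.

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

def entropy_of_mixing(moles):
    """Return Delta S_mix = -R * sum(n_i * ln(x_i)) for an ideal mixture,
    where x_i = n_i / n_total is the mole fraction of component i."""
    n_total = sum(moles)
    return -R * sum(n * math.log(n / n_total) for n in moles if n > 0)

# Equimolar binary mixture, 1 mol of each component:
# Delta S_mix = 2 * R * ln 2, about 11.5 J/K
print(entropy_of_mixing([1.0, 1.0]))
```

For the equimolar binary case this reproduces the familiar result ΔS_mix = nR ln 2, i.e. R ln 2 per mole of mixture.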
