Entropy is one of the important concepts that students need to understand clearly while studying Chemistry and Physics. Significantly, entropy can be defined in several ways and can therefore be applied in various contexts, such as thermodynamics, cosmology and even economics. The concept of entropy essentially describes the spontaneous changes that occur in everyday phenomena, or the tendency of the universe towards disorder.
Apart from being a scientific concept, entropy is often described as a measurable physical property most commonly associated with disorder, randomness or uncertainty. The term is used in fields ranging from classical thermodynamics and statistical physics to information theory. In addition to its applications in Physics and Chemistry, it is applied to biological systems and their relation to life, to sociology, to weather science and climate change, and to quantifying the transmission of information in telecommunications.
What Is Entropy?
Generally, entropy is defined as a measure of the randomness or disorder of a system. The concept was introduced by the German physicist Rudolf Clausius in 1850.
Apart from the general definition, there are several definitions that one can find for this concept. The two definitions
of entropy that we will look at on this page are the thermodynamic definition and the statistical definition.
From the thermodynamic viewpoint, we do not consider the microscopic details of a system. Instead, entropy is used to describe the behaviour of a system in terms of thermodynamic properties such as temperature, pressure and heat capacity. This thermodynamic description takes into consideration the state of equilibrium of the system.
In contrast, the statistical definition, which was developed later, defines thermodynamic properties in terms of the statistics of the molecular motions of a system. In this view, entropy is a measure of molecular disorder.
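The statistical definition is usually made quantitative through Boltzmann's relation S = k_B ln W, which ties entropy to the number of microstates W compatible with a macrostate (the relation itself is standard, though not stated above); a minimal sketch:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K


def boltzmann_entropy(microstates: int) -> float:
    """Entropy from Boltzmann's relation S = k_B * ln(W)."""
    return K_B * math.log(microstates)


# Doubling the number of accessible microstates adds k_B * ln(2) of entropy,
# so more molecular disorder means higher entropy.
print(boltzmann_entropy(2) - boltzmann_entropy(1))  # ~9.57e-24 J/K
```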
● In quantum statistical mechanics, von Neumann extended the notion of entropy to the quantum domain by means of the density matrix.
● In information theory, entropy is a measure of the efficiency of a system in transmitting a signal, or of the loss of information in a transmitted signal (see the short sketch below).
● In dynamical systems, entropy quantifies the growing complexity of a dynamical system and the average flow of information per unit of time.
● In sociology, entropy is the social decline or natural decay of structure (such as law, organisation and convention) in a social system.
● In cosmology, entropy describes a hypothetical tendency of the universe to attain a state of maximum homogeneity, in which matter is at a uniform temperature.
In any case, today the term entropy is used in many other sciences quite distant from Physics and Mathematics, and in those settings it no longer maintains its rigorous quantitative character.
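For the information-theoretic usage listed above, entropy is Shannon's H = -Σ p_i log2(p_i), the average information carried per symbol; a minimal sketch (the function name is illustrative):

```python
import math


def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)


# A fair coin carries 1 bit of information per toss; a biased coin carries less,
# because its outcome is more predictable (less uncertain).
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```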
Properties of Entropy
It is a thermodynamic function.
It is a state function. It depends on the state of the system and not the path that is followed.
It is represented by S, but in the standard state, it is represented by S°.
Its SI unit is J/(K·mol).
Its CGS unit is cal/(K·mol).
Entropy is an extensive property which means that it scales with the size or extent of a system.
Note: Greater disorder develops in an isolated system; hence, its entropy increases. When chemical reactions take place, if the reactants break into more products, entropy also increases. A system at a higher temperature has greater randomness than a system at a lower temperature. From these examples, it is clear that entropy increases as regularity decreases.
∆S = q_rev,iso / T
If we add the same quantity of heat at a higher temperature and at a lower temperature, the gain in randomness is greater at the lower temperature. This suggests that, for a given quantity of heat, the entropy change is inversely proportional to the temperature.
Total entropy change is equal to the sum of the entropy change of the system and surroundings.
If the system loses an amount of heat q at a temperature T1, which is received by the surroundings at a temperature T2, then:
∆S_system = -q/T1
∆S_surroundings = q/T2
∆S_total = -q/T1 + q/T2
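A quick numerical check of the expression above (the heat quantity and temperatures are illustrative):

```python
def total_entropy_change(q: float, t1: float, t2: float) -> float:
    """dS_total = -q/T1 + q/T2 for heat q leaving a system at T1
    and entering the surroundings at T2 (temperatures in kelvin)."""
    return -q / t1 + q / t2


# Heat flowing from hot (500 K) to cold (300 K) raises the total entropy,
# so the process is spontaneous.
print(total_entropy_change(1000.0, 500.0, 300.0))  # ~+1.33 J/K
```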
Points to Remember
∆S = q_rev,iso / T
From the first law, ∆U = q + w.
For the reversible isothermal expansion of an ideal gas, ∆U = 0, so q_rev = -w = nRT ln(V2/V1).
Therefore,
∆S = nR ln(V2/V1)
For a reversible adiabatic process, q = 0.
Therefore,
∆S = 0
Even though the reversible adiabatic expansion is isentropic, the irreversible adiabatic expansion is not isentropic.
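A worked illustration of the isothermal result ∆S = nR ln(V2/V1); the amount of gas and the volumes are illustrative:

```python
import math

R = 8.314  # gas constant, J/(K*mol)


def isothermal_entropy_change(n: float, v1: float, v2: float) -> float:
    """dS = n * R * ln(V2/V1) for isothermal expansion of an ideal gas."""
    return n * R * math.log(v2 / v1)


# Doubling the volume of 1 mol of ideal gas at constant temperature:
print(isothermal_entropy_change(1.0, 1.0, 2.0))  # ~+5.76 J/K
```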
Note:
1. Entropy increases when solid changes to a liquid and a liquid changes into gases.
2. Entropy also increases when the number of moles of gaseous products increases more than the reactants.
Some things are contrary to expectations about entropy.
A hard-boiled egg has greater entropy than an unboiled egg. This is due to the denaturation of the secondary structure of the protein (albumin): the protein changes from a helical structure into a randomly coiled form.
If we stretch a rubber band, entropy decreases because the macromolecules uncoil and become arranged in a more ordered manner; therefore, randomness decreases.
A limitation of the third law of thermodynamics is that many solids, such as glassy solids, do not have zero entropy at absolute zero.
Entropy of Fusion
It is the increase in entropy when a solid melts into a liquid. The entropy increases because the freedom of movement of the molecules increases with the phase change.
The entropy of fusion is equal to the enthalpy of fusion divided by the melting point (fusion temperature):
∆_fus S = ∆_fus H / T_f
A natural process such as a phase transition (for example, fusion) will occur when the associated change in the
Gibbs free energy is negative.
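A short worked example of ∆_fus S = ∆_fus H / T_f, using the standard textbook values for ice (roughly 6.01 kJ/mol at 273.15 K):

```python
def entropy_of_fusion(dH_fus: float, t_fus: float) -> float:
    """dS_fus = dH_fus / T_f, with dH_fus in J/mol and T_f in kelvin."""
    return dH_fus / t_fus


# Ice at its melting point: dH_fus ~ 6010 J/mol, T_f = 273.15 K
print(entropy_of_fusion(6010.0, 273.15))  # ~22.0 J/(K*mol)
```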
Exception
Helium-3 has a negative entropy of fusion at temperatures below 0.3 K. Helium-4 also has a very slightly negative
entropy of fusion below 0.8 K.
Entropy of Vaporisation
The entropy of vaporisation is the increase in entropy when a liquid changes into a vapour. This is due to an increase in molecular movement, which creates randomness of motion.
The entropy of vaporisation is equal to the enthalpy of vaporisation divided by the boiling point. It can be represented as:
∆_vap S = ∆_vap H / T_b
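A matching worked example of ∆_vap S = ∆_vap H / T_b, using the standard textbook values for water (roughly 40.7 kJ/mol at 373.15 K):

```python
def entropy_of_vaporisation(dH_vap: float, t_b: float) -> float:
    """dS_vap = dH_vap / T_b, with dH_vap in J/mol and T_b in kelvin."""
    return dH_vap / t_b


# Water at its normal boiling point: dH_vap ~ 40700 J/mol, T_b = 373.15 K
print(entropy_of_vaporisation(40700.0, 373.15))  # ~109 J/(K*mol)
```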
Spontaneity
● Exothermic reactions are usually spontaneous because ∆S_surroundings is positive, which makes ∆S_total positive.
● Endothermic reactions can be spontaneous when ∆S_system is positive and ∆S_surroundings is negative, but the overall ∆S_total is still positive.
● The free-energy-change criterion for predicting spontaneity is more convenient than the entropy-change criterion because the former requires only the free energy change of the system, whereas the latter needs the entropy changes of both the system and the surroundings.
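A minimal sketch of the free-energy criterion described above; at constant temperature and pressure, ∆G = ∆H - T∆S_system, and ∆S_surroundings = -∆H/T makes the two criteria agree (the numbers are illustrative):

```python
def is_spontaneous(dH: float, dS_sys: float, t: float) -> bool:
    """At constant T and P, dG = dH - T*dS_sys; spontaneous if dG < 0.
    Equivalently, dS_total = dS_sys - dH/T must be positive."""
    dG = dH - t * dS_sys
    return dG < 0


# Ice melting at 300 K: endothermic (dH > 0) but spontaneous, because the
# system's entropy gain outweighs the enthalpy cost at this temperature.
print(is_spontaneous(6010.0, 22.0, 300.0))  # True:  6010 - 6600 < 0
print(is_spontaneous(6010.0, 22.0, 250.0))  # False below the melting point
```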
Negentropy
Negentropy is the reverse of entropy: it means things becoming more ordered. Here, 'order' means organisation, structure and function; it is the opposite of randomness or chaos.
Solved Questions
1. The entropy of an isolated system can never ____.
Answer: decrease
Explanation: The entropy of an isolated system always increases, and it remains constant only when the process is reversible.
2. According to the entropy principle, the entropy of an isolated system can never decrease and remains constant
only when the process is reversible.
a) true b) false
Answer: a
3. Entropy may decrease locally in some regions within the isolated system. How can this statement be justified?
a) This cannot be possible. b) This is possible because the entropy of an isolated system can decrease.
c) It must be compensated by a greater increase of entropy somewhere within the system. d) None of the above.
Answer: c
Explanation: The net effect of an irreversible process is an entropy increase in the whole system.
4. Which of the following statements holds according to Clausius?
a) the energy of the world is constant b) the entropy of the world tends towards a maximum
c) both a and b
Answer: c
5. The entropy of an isolated system always ____ and becomes a ____ at the state of equilibrium.
a) decreases, minimum b) increases, maximum c) increases, minimum d) decreases, maximum
Answer: b
Explanation: If the entropy of an isolated system varies with some parameter, then there is a certain value of that
parameter that maximises the entropy.
Entropy as Disorder
Even though there are many new and different interpretations of entropy, in general, it is defined as a measure of
disorder and chaos. Chaos, as such, is the state of a physical or dynamic system wherein elements of all types are
mixed evenly throughout the space, and so it becomes homogeneous.
Putting more energy into a system excites the molecules and increases the amount of random activity.
As the gas expands in a system, entropy increases.