
Entropy

Made by Ravi Paswan


What is entropy?

The word entropy is sometimes confused with energy.


Although they are related quantities, they are distinct.
Energy measures the capability of an object or system
to do work.
Entropy, on the other hand, is a measure of the "disorder" of a
system. What "disorder" refers to is really the number of
different microscopic states a system can be in, given that the
system has a particular fixed composition, volume, energy,
pressure, and temperature. By "microscopic states", we mean
the exact states of all the molecules making up the system.
Entropy = (Boltzmann's constant k) × (natural logarithm of the number of
possible microstates)
S = k ln(N)
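As a rough numerical illustration, the Boltzmann relation can be evaluated directly. This is a minimal sketch in Python; the microstate counts are made-up numbers chosen only to show the scaling:

```python
import math

# Boltzmann's constant in J/K
k_B = 1.380649e-23

def boltzmann_entropy(num_microstates):
    """S = k ln(N): entropy from the number of accessible microstates."""
    return k_B * math.log(num_microstates)

# Doubling the number of accessible microstates adds k ln(2) of entropy,
# regardless of the starting count (the counts here are invented):
s1 = boltzmann_entropy(1_000_000)
s2 = boltzmann_entropy(2_000_000)
print(s2 - s1)
```

Note that the increase depends only on the ratio of microstate counts, which is why entropy is logarithmic in N.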
Entropy is a thermodynamic property: a quantitative measure of disorder.
Entropy traces its origin to the molecular-movement interpretation of
Rudolf Clausius in 1850.
The concept of entropy follows from the thermodynamic laws
(in particular, the 2nd law of thermodynamics).
It can be visualised through processes of expansion, heating,
mixing, and reaction.
Entropy is associated with heat and temperature.
Various types of disorder

Entropy reflects the degree of disorder.


Disorder can appear in three different types. They are:
1. Positional disorder
whether the atoms are free to move or not
2. Vibrational disorder (thermal disorder)
whether the atoms vibrate about an average position
3. Configurational disorder
the distribution of different atoms over sites in a lattice
Definition and expression of
entropy
Entropy may be defined as the property of a system
which measures the degree of disorder or randomness
in the system.
The word comes from a Greek word meaning
"transformation".
It is denoted by the symbol 'S'.
Clausius was convinced of the significance of the ratio
of the heat delivered and the temperature at which it is
delivered:
ΔS = Q/T
Entropy is the sum total of the entropy due to
positional disorder, vibrational disorder and
configurational disorder, i.e. randomness due to
change of state:
S = Sp + St + Sc
When a system undergoes a change, the entropy
change is equal to the heat absorbed by the system
divided by the temperature at which the change takes
place:
ΔS = S2 − S1 = ∫ dq / T
or, in differential form, TdS = dq.
This is the second-law expression.
Suppose the process takes place at constant
temperature. From the first law we know that
ΔE = q − w, or dE = dq − dw = dq − PdV.
At constant temperature ΔE = 0 (for an ideal gas), therefore
dq = PdV.
From the second law we know that dq = TdS.
Substituting this in the above we get
TdS = PdV, so
dS = PdV / T.
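For an ideal gas at constant temperature, P/T = nR/V, so integrating dS = P dV / T gives ΔS = nR ln(V2/V1). A small numerical sketch; the amount of gas and the volumes are assumed values:

```python
import math

R = 8.314  # J/mol·K, universal gas constant

def ds_isothermal_ideal_gas(n, V1, V2):
    """ΔS = ∫ P dV / T = n R ln(V2/V1) for an isothermal ideal-gas
    process, obtained by substituting P/T = nR/V into dS = P dV / T."""
    return n * R * math.log(V2 / V1)

# Doubling the volume of 1 mol at constant temperature (assumed numbers):
dS = ds_isothermal_ideal_gas(1.0, 0.010, 0.020)
print(dS)
```

The result is R ln(2) per mole, positive for an expansion, as the disorder picture suggests.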
Suppose the process takes place at constant pressure.
Then TdS = (dq)p, and we know that (dq)p = Cp dT, so
TdS = Cp dT, or dS = Cp dT / T.
Integrating from state 1 to state 2,
∫₁² dS = ∫₁² Cp dT / T
i.e. S2 − S1 = Cp ln(T2/T1) for constant Cp.
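Integrating Cp dT / T with Cp held constant gives Cp ln(T2/T1). A quick numerical check; the cp value (air-like) and temperatures are assumed illustrative numbers:

```python
import math

def entropy_change_const_pressure(cp, T1, T2):
    """ΔS = ∫ Cp dT / T = Cp ln(T2/T1), constant Cp, constant pressure."""
    return cp * math.log(T2 / T1)

# Heating 1 kg of an air-like gas (cp ≈ 1.005 kJ/kg·K, assumed)
# from 300 K to 600 K at constant pressure:
dS = entropy_change_const_pressure(1.005, 300.0, 600.0)
print(dS)  # cp * ln(2) in kJ/kg·K
```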
Entropy Change

The entropy change during a reversible process is defined as

dS = (dq / T)rev, i.e. ΔS = ∫₁² dq / T along the reversible path.

For a reversible, adiabatic process dq = 0, so

dS = 0
S2 = S1
The reversible, adiabatic process is called an isentropic
process.
Entropy Change and
Isentropic Processes
The entropy-change and isentropic relations for a process can be
summarized as follows:
i. Pure substances:
Any process: Δs = s2 – s1 (kJ/kg·K)
Isentropic process: s2 = s1
ii. Incompressible substances (liquids and solids):
Any process: s2 – s1 = c_avg ln(T2/T1) (kJ/kg·K)
Isentropic process: T2 = T1
iii. Ideal gases:
a) constant specific heats (approximate treatment):
for all processes:
s2 – s1 = cv ln(T2/T1) + R ln(v2/v1)
        = cp ln(T2/T1) – R ln(P2/P1)
for an isentropic process:
T2/T1 = (v1/v2)^(k−1) = (P2/P1)^((k−1)/k)
Isentropic Efficiency for Turbine
ηT = actual turbine work / isentropic turbine work = (h1 − h2a) / (h1 − h2s)
Isentropic Efficiency for
Compressor
ηC = isentropic compressor work / actual compressor work = (h2s − h1) / (h2a − h1)
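A minimal sketch of the two efficiency definitions. All enthalpy values below are invented for illustration, not taken from steam or air tables:

```python
def turbine_isentropic_efficiency(h_in, h_out_actual, h_out_isentropic):
    """η_T = actual work / isentropic work = (h1 - h2a) / (h1 - h2s)."""
    return (h_in - h_out_actual) / (h_in - h_out_isentropic)

def compressor_isentropic_efficiency(h_in, h_out_actual, h_out_isentropic):
    """η_C = isentropic work / actual work = (h2s - h1) / (h2a - h1)."""
    return (h_out_isentropic - h_in) / (h_out_actual - h_in)

# Illustrative enthalpies in kJ/kg (assumed numbers):
print(turbine_isentropic_efficiency(3400.0, 2600.0, 2500.0))    # 800/900
print(compressor_isentropic_efficiency(300.0, 520.0, 500.0))    # 200/220
```

In both definitions the irreversible device comes out below 100%: a real turbine extracts less work than the isentropic ideal, and a real compressor consumes more.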
Increase of Entropy Principle
dS ≥ δQ/T ------ Eq. 1
ΔSsys = S2 − S1 = ∫₁² δQ/T + Sgen ------ Eq. 2
where the equality holds for an internally reversible process and the
inequality for an irreversible process. We may conclude from these
equations that the entropy change of a closed system during an
irreversible process is greater than the integral of δQ/T evaluated for
that process. In the limiting case of a reversible process, these two
quantities become equal. We again emphasize that T in these relations is
the thermodynamic temperature at the boundary where the differential
heat δQ is transferred between the system and the surroundings.
Note that the entropy generation Sgen is always a positive quantity or
zero. Its value depends on the process, and thus it is not a property of
the system. Also, in the absence of any entropy transfer, the entropy
change of a system is equal to the entropy generation.
Equation 2 has far-reaching implications in thermodynamics. For an
isolated system (or simply an adiabatic closed system), the heat transfer
is zero, and Eq. 2 reduces to
ΔSisolated ≥ 0
This equation can be expressed as the entropy of an isolated system
during a process always increases or, in the limiting case of a reversible
process, remains constant. In other words, it never decreases. This is
known as the increase of entropy principle. Note that in the absence of
any heat transfer, entropy change is due to irreversibilities only, and their effect is always to
increase entropy.
Entropy is an extensive property, and thus the total entropy of a system is equal to the sum
of the entropies of the parts of the system. An isolated system may consist of any number of
subsystems. A system and its surroundings, for example, constitute an isolated system since
both can be enclosed by a sufficiently large arbitrary boundary across which there is no heat,
work, or mass transfer. Therefore, a system and its surroundings can be viewed as the two
subsystems of an isolated system, and the entropy change of this isolated system during a
process is the sum of the entropy changes of the system and its surroundings, which is equal
to the entropy generation since an isolated system involves no entropy transfer. That is,
Sgen = ΔStotal = ΔSsys + ΔSsurr ≥ 0 ------ Eq. 3
where the equality holds for reversible processes and the inequality for irreversible
ones. Note that Ssurr refers to the change in the entropy of the surroundings as a
result of the occurrence of the process under consideration.
Since no actual process is truly reversible, we can conclude that some entropy is
generated during a process, and therefore the entropy of the universe, which can be
considered to be an isolated system, is continuously increasing. The more
irreversible a process, the larger the entropy generated during that process. No
entropy is generated during reversible processes (Sgen = 0).
Entropy increase of the universe is a major concern not only to
engineers but also to philosophers, theologians, economists, and
environmentalists since entropy is viewed as a measure of the disorder
(or “mixed-up-ness”) in the universe.
The increase of entropy principle does not imply that the entropy of a
system cannot decrease. The entropy change of a system can be
negative during a process (Fig. 3), but entropy generation cannot. The
increase of entropy principle can be summarized as follows:
Sgen > 0 Irreversible process
Sgen = 0 Reversible process
Sgen < 0 Impossible process
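The three cases can be checked mechanically. The sketch below classifies a process by the sign of Sgen and applies it to heat flowing across a finite temperature difference; the heat amount and reservoir temperatures are assumed numbers, and since such heat flow is irreversible, Sgen comes out positive:

```python
def classify_process(s_gen, tol=1e-12):
    """Classify a process by the sign of its entropy generation."""
    if s_gen > tol:
        return "irreversible"
    if abs(s_gen) <= tol:
        return "reversible"
    return "impossible"

# Heat Q flowing from a hot reservoir to a cold one (assumed values):
Q, T_hot, T_cold = 100.0, 800.0, 300.0   # kJ, K, K
s_gen = -Q / T_hot + Q / T_cold          # ΔS_hot + ΔS_cold of the pair
print(s_gen, classify_process(s_gen))
```

The hot reservoir loses Q/T_hot of entropy while the cold one gains the larger amount Q/T_cold, so the total always increases when T_hot > T_cold.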
Entropy of ideal gas
The specific heats of ideal gases, with the exception of monatomic gases,
depend on temperature, and the integrals in Eqs. 3 and 4 cannot be
performed unless the dependence of cv and cp on temperature is known.
Even when the cv(T ) and cp(T ) functions are available, performing long
integrations every time entropy change is calculated is not practical. Then
two reasonable choices are left: either perform these integrations by simply
assuming constant specific heats or evaluate those integrals once and
tabulate the results. Here we present the variable-specific-heats approach
(exact analysis).
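In the exact analysis, the temperature integral of cp(T)/T is tabulated once as a function s°(T), and the entropy change is Δs = s°(T2) − s°(T1) − R ln(P2/P1). A sketch of that table-lookup approach; the few s° entries below are illustrative air-table-style values and are not a substitute for real gas tables:

```python
import math

R_air = 0.287  # kJ/kg·K, gas constant for air (assumed working fluid)

# Illustrative s°(T) entries for air (T in K, s° in kJ/kg·K):
s0_table = {300.0: 1.70203, 400.0: 1.99194, 500.0: 2.21952, 600.0: 2.40902}

def s0(T):
    """Linear interpolation of s° between tabulated temperatures."""
    Ts = sorted(s0_table)
    for lo, hi in zip(Ts, Ts[1:]):
        if lo <= T <= hi:
            f = (T - lo) / (hi - lo)
            return s0_table[lo] + f * (s0_table[hi] - s0_table[lo])
    raise ValueError("T outside table range")

def ds_exact(T1, T2, P1, P2):
    """Δs = s°(T2) - s°(T1) - R ln(P2/P1): variable-specific-heat form."""
    return s0(T2) - s0(T1) - R_air * math.log(P2 / P1)

print(ds_exact(300.0, 600.0, 100.0, 400.0))
```

The s° term absorbs the temperature dependence of cp, so only the pressure correction R ln(P2/P1) remains to be computed.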
Clausius Theorem
Entropy is a thermodynamic property; it can be viewed as a
measure of disorder, i.e. the more disorganized a system, the
higher its entropy. It is defined using the Clausius inequality

∮ δQ/T ≤ 0

where δQ is the differential heat transfer and T is the absolute temperature at the
boundary where the heat transfer occurs.
The Clausius inequality is valid for all cycles,
reversible and irreversible.
Consider a reversible Carnot cycle:
∮ δQ/T = QH/TH − QL/TL = 0
Since entropy is a thermodynamic property, it has
fixed values at fixed thermodynamic states.
Hence, the change, ΔS, is determined by the initial
and final states. BUT..
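The cyclic integral ∮ δQ/T can be approximated as a sum of Q/T over the boundary interactions of a cycle. For a reversible Carnot cycle with QL/QH = TL/TH the sum is zero; rejecting more heat than the Carnot minimum drives it negative, consistent with the inequality. The temperatures and heats below are assumed values:

```python
def cyclic_integral(heat_transfers):
    """Sum of Q/T over each boundary interaction (discrete ∮ δQ/T);
    Q > 0 for heat added to the cycle, Q < 0 for heat rejected."""
    return sum(Q / T for Q, T in heat_transfers)

T_H, T_L, Q_H = 600.0, 300.0, 100.0   # K, K, kJ (assumed values)

# Reversible Carnot cycle: Q_L/Q_H = T_L/T_H, so the integral vanishes.
Q_L = Q_H * T_L / T_H
rev = cyclic_integral([(Q_H, T_H), (-Q_L, T_L)])

# Irreversible cycle rejecting more heat than the Carnot minimum:
irrev = cyclic_integral([(Q_H, T_H), (-60.0, T_L)])
print(rev, irrev)
```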
Second Law & Entropy Balance
Increase of Entropy Principle is another way of stating the Second Law of Thermodynamics:
Second Law : Entropy can be created but NOT destroyed

(In contrast, the first law states: Energy is always conserved)

Note that this does not mean that the entropy of a system cannot be reduced; it can.
However, the total entropy of the system + surroundings cannot be reduced.

Entropy Balance is used to determine the Change in entropy of a system as follows:


Entropy change = Entropy transfer + Entropy generation, where

Entropy change: ΔS = S2 − S1

Entropy transfer = transfer due to heat (Q/T) + entropy flow due to mass flow (Σ mi si − Σ me se)

Entropy generation: Sgen ≥ 0

For a closed system: S2 − S1 = Σ Qk/Tk + Sgen

In rate form: dS/dt = Σ Q̇k/Tk + Ṡgen
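Rearranged, the closed-system balance gives the generation term directly: Sgen = (S2 − S1) − Σ Qk/Tk. A minimal sketch with assumed numbers:

```python
def closed_system_entropy_generation(dS, boundary_heat):
    """S_gen = (S2 - S1) - Σ Q_k / T_k for a closed system, where
    boundary_heat is a list of (Q, T) pairs at the system boundary."""
    transfer = sum(Q / T for Q, T in boundary_heat)
    return dS - transfer

# A system whose entropy rises by 0.5 kJ/K while absorbing 120 kJ
# across a 400 K boundary (assumed numbers):
s_gen = closed_system_entropy_generation(0.5, [(120.0, 400.0)])
print(s_gen)  # 0.5 - 0.3 = 0.2 kJ/K > 0: the process is irreversible
```

A negative result from this balance would flag an impossible process, which is how the entropy balance is used as a feasibility check.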

You might also like