Energy Dissipation
Luca Palmeri, Alberto Barausse, Sven Erik Jørgensen
Published online on: 28 Aug 2013
From: Ecological Processes Handbook, CRC Press
https://www.routledgehandbooks.com/doi/10.1201/b15380-5
3
Energy Dissipation
3.1 Introduction
Dissipation in physics is closely related to the concept of dynamics: friction
or turbulence degrades energy over time, typically converting it into heat,
which raises the temperature of the system. Such systems are called
dissipative systems. Formally, dissipative forces are those that cannot be
described by a Hamiltonian. They include all forces that convert a coherent
or directed flow of energy into an undirected, more isotropic distribution
of energy (e.g., the conversion of light into heat).
As discussed in Chapters 1 and 2, this is exactly what happens in ecosystems.
At first glance, however, this seems to contradict the observation that
living objects embody order and information. In the 1940s, in his famous book
What is Life? (Schrödinger, 1944), Schrödinger threw new light on the
relationship between order, disorder, entropy, and living organisms. The
notion of "order from order" that he introduced was a startling premonition
of the research that led Watson and Crick, in 1953, to the discovery of the
structure of DNA. The other fundamental idea expressed in Schrödinger's book,
the notion of "order from disorder," was a first attempt to apply the
fundamental theorems of thermodynamics to biology. Schrödinger recognized
that living systems, being embedded in a network of flows of matter and
energy, are far-from-equilibrium systems. He realized that studying such
systems from the perspective of nonequilibrium thermodynamics would make it
possible to reconcile biological self-organization with thermodynamics.
Every biological process, event, or phenomenon is inherently irreversible,
leading to an increase of the entropy of that part of the universe in which
the event occurs. So, every living organism produces positive entropy, thus
coming dangerously close to the state of maximum entropy, the heat death.
It manages to stay away from this state only through the continuous
absorption of negative entropy, or Helmholtz free energy, from the
surrounding environment in the form of matter and energy. This follows
immediately from the decomposition of the entropy change into an internal
and an exchange term, dS = d_iS + d_eS (Equation 3.1). For an organism in a
stationary state, the condition of stationarity implies that the net flow of
entropy is null:
$$\frac{dS}{dt} = \frac{dS_{in}}{dt} + \frac{dS_{out}}{dt} = 0$$

and, hence,

$$\frac{dS_{in}}{dt} = -\frac{dS_{out}}{dt}$$
Along with Schrödinger, we can say that "… essential for metabolism to
work is that the body is able to get rid of all the entropy that it produces
in its lifetime." Thus, the organization of living systems is maintained by
absorbing order from the surrounding environment and emitting disorder in
the form of thermal radiation in the infrared. In this context, Schrödinger,
without naming them, refers to the biological necessity of dissipative
processes. He states, "… the higher body temperature of warm-blooded animals
includes the advantage of enabling them to get rid of their entropy at a
faster rate, so that they can afford a more intense process of life."
In general, the integrals on the right-hand side depend on the trajectory
followed by the transformation between states a and b (i.e., $d_iS$ and
$d_eS$ are not state functions), except when the transformations along which
we measure ΔS are reversible or quasistatic. In this case, we have

$$\Delta S_{a,b} = \int_a^b dS = \int_a^b d_e S = \int_a^b \frac{\partial q}{T} \quad (3.2)$$
Note that the last integral can be written only when it makes sense: we
know that if the transformation is not reversible, the intermediate states
are not equilibrium states, and then, in general, the state variables are not
well defined. From point 5, it follows immediately that for irreversible
transformations between a and b, it is always

$$\int_a^b \left(\frac{\partial q}{T}\right)_{irrev} < \int_a^b \left(\frac{\partial q}{T}\right)_{rev} = \Delta S_{a,b} \quad (3.3)$$
* The heat differential is denoted by ∂q rather than dq to emphasize the fact that it is not an exact differential.
entropy can only increase or, at most, in the case of reversible transformations, remain constant.
From the conservation of the Hamiltonian flow, it follows that all the
trajectories of the system states in phase space are constrained to lie
on constant-energy surfaces. These are states that are indistinguishable
from the energetic, or macroscopic, point of view. A measure of this
indistinguishability, for a system in equilibrium at temperature T, is
provided by the area of the surface of constant energy, that is, by the
number of possible microscopic states, or complexions. If we denote this
number by Π, from the energy distribution function, we get

$$\Pi = \int_{E=E_0} e^{-\frac{E-F}{k_B T}}\, dE$$
These considerations led to the demonstration of the renowned relation, valid
for isolated systems:

$$S = k_B \log \Pi$$
By using the energy distribution function f(E) with the expression for the
kinetic energy, $E_c = \frac{1}{2}mv^2$, the Maxwell–Boltzmann distribution
is obtained. This function describes the distribution of the particle
velocity moduli for n moles of gas at temperature T:

$$f(v) = \frac{n}{(2\pi m k_B T)^{3/2}}\, e^{-\frac{m v^2}{2 k_B T}}$$
which is represented in Figure 3.1 for three different values of the
temperature. It is readily understood here how the existence of a most
probable velocity, corresponding to the maximum of the Maxwell–Boltzmann
curve, is associated with the natural tendency of the system to reach the
most probable state. The more likely macrostates are those that are
achievable with a greater number of microstates Π; that is, they are more
indistinguishable from a macroscopic point of view and, therefore, more
disordered.
An isolated system increases its entropy and, more or less rapidly, reaches
the inert state of maximum entropy, which is equilibrium. Thermodynamic
systems tend to evolve toward states of greater disorder. The tendency to
depart from an ordered state toward a less ordered one, to move from a less
probable state to a more probable one, and to increase entropy are all
different manifestations of the same natural law expressed by the second
principle. For an isolated system, the state of equilibrium is the most
likely, and it is characterized by a maximum of the entropy.
Boltzmann's order principle can be immediately extended to nonisolated
systems. In fact, if the system is closed, that is, it can exchange
energy but not matter, and it is at a constant temperature T, its behavior
is described by the Helmholtz free energy:

$$F = U - TS$$
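For such a closed system, spontaneous evolution at constant temperature minimizes F. This can be sketched on a hypothetical two-level system in reduced units (all parameter values below are chosen purely for illustration): minimizing F = U − TS over the occupation fraction of the excited level recovers the Boltzmann occupation.

```python
import math

kB, T = 1.0, 1.0   # reduced units (illustrative assumption)
eps = 1.0          # energy of the excited level

def F(x):
    """Helmholtz free energy per particle, F = U - T S, for a two-level
    system with a fraction x of particles in the excited level."""
    U = x * eps
    S = -kB * (x * math.log(x) + (1 - x) * math.log(1 - x))
    return U - T * S

# Numerical minimum of F over the occupation fraction x
xs = [i / 10000 for i in range(1, 10000)]
x_min = min(xs, key=F)

# Analytic minimum: Boltzmann occupation 1 / (1 + exp(eps / (kB T)))
x_star = 1 / (1 + math.exp(eps / (kB * T)))
print(x_min, x_star)  # both ≈ 0.269
```

The energy term alone would favor x = 0 (perfect order), while the entropy term alone would favor x = 1/2 (maximum disorder); the free energy balances the two at the Boltzmann value.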
FIGURE 3.1
Maxwell–Boltzmann distribution of particle velocity moduli for n moles of a gas at three
different temperatures T < T′ < T″.
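The shape of this distribution can be checked by direct sampling: in equilibrium each Cartesian velocity component is Gaussian, so drawing components and taking moduli reproduces the Maxwell–Boltzmann speed curve. A sketch in reduced units (all parameter values are illustrative, not from the text):

```python
import math
import random

random.seed(0)
kB, m, T = 1.0, 1.0, 2.0        # reduced units (illustrative assumption)
sigma = math.sqrt(kB * T / m)   # std. dev. of each velocity component

# Sample velocity vectors: each component Gaussian, so the modulus |v|
# follows the Maxwell-Boltzmann distribution of velocity moduli
speeds = []
for _ in range(200_000):
    vx = random.gauss(0.0, sigma)
    vy = random.gauss(0.0, sigma)
    vz = random.gauss(0.0, sigma)
    speeds.append(math.sqrt(vx * vx + vy * vy + vz * vz))

mean_speed = sum(speeds) / len(speeds)
theory = math.sqrt(8 * kB * T / (math.pi * m))  # mean modulus of MB distribution
print(mean_speed, theory)  # close agreement
```

Raising T widens the distribution and shifts its maximum to higher speeds, which is exactly the trend shown for the three temperatures in Figure 3.1.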
* The concept of dissipation is thus bound to the second principle through the impossibility of building thermal machines with 100% efficiency.
* For example, by applying a temperature gradient to a box containing a mixture of two gases,
we observe an increase in the concentration of one of the two gases in the vicinity of the
warmest wall and a decrease in the vicinity of the coldest one (Nicolis and Prigogine, 1977).
In general, any extensive quantity can vary through two broad categories of
causes: variations due to internal processes and changes due to external
ones. For example, we have seen in Equation 3.1 that the entropy change can
be written as dS = diS + deS. Similarly, the k variables nj (number of moles
of the jth element) appearing in Equation 3.4 are extensive quantities whose
variations can be separated into two contributions:
$$dn_j = d_i n_j + d_e n_j; \quad 1 \le j \le k$$

$d_e n_j$ represents the change due to flows of matter through the surface of
the system, whereas $d_i n_j$ represents the variation of matter due to
internal processes such as chemical reactions or phase changes.

As we did for $dn_j$, the variation of heat $dq$ can also be decomposed into
two contributions:

$$dq = d_i q + d_e q$$
$$\frac{dS}{dt} = \frac{1}{T}\frac{dq}{dt} - \frac{1}{T}\sum_{j=1}^{k} \mu_j \frac{dn_j}{dt}$$
According to what has been stated in point 5 of Section 3.2, and under the
LTE (local thermodynamic equilibrium) hypothesis, the total entropy
production is given by the sum of the entropy productions of the single
irreversible processes. Definition 3.5 can thus be generalized. In fact, we
observe that each of the terms appearing in Equation 3.5 is formally the
product of a function of state and a flow. In a very general way, we can,
therefore, redefine the entropy production of a thermodynamic system in the
form
$$p = \frac{d_i S}{dt} = \sum_k X_k J_k \quad (3.6)$$
$$J_k = J_k(X_1, X_2, \ldots, X_n) \quad (3.8)$$
The second law of thermodynamics says that forces and fluxes must have the
same sign. For example, if we think of a system consisting of two
compartments that can interact with each other only by exchanging heat, we
can write

$$X = \frac{1}{T_1} - \frac{1}{T_2}; \quad J = \frac{d_i q_1}{dt}$$
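The sign agreement between X and J can be watched directly by integrating this two-compartment system numerically. In the sketch below the flux is taken proportional to the force (a linear law, J = LX; the conductance, heat capacity, and temperatures are all chosen purely for illustration): the entropy production XJ stays non-negative at every step while the temperatures relax toward equilibrium.

```python
L_cond = 1.0e5   # hypothetical linear kinetic coefficient, J = L_cond * X
C = 1.0          # heat capacity of each compartment (illustrative)
T1, T2 = 300.0, 400.0
dt = 0.01

for _ in range(5000):
    X = 1.0 / T1 - 1.0 / T2   # thermodynamic force
    J = L_cond * X            # linear flux law: heat flowing into body 1
    assert X * J >= 0         # entropy production p = X J is never negative
    T1 += J * dt / C          # the colder body warms up ...
    T2 -= J * dt / C          # ... the warmer one cools down

print(round(T1, 1), round(T2, 1))  # both approach 350.0
```

Heat flows from the hotter to the colder compartment until X = 0 and J = 0, where the entropy production vanishes, consistent with equilibrium being the absolute minimum of p.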
Heat always flows from the body at a higher temperature to the colder one.
As long as a difference or gradient in temperature exists, there will be a
flux J ≠ 0. The temperature difference is progressively reduced, and the
system will move to a state of equilibrium with X = 0 and J = 0. At
equilibrium, entropy production is null, and we conclude that thermodynamic
equilibrium is a state of absolute minimum for p. The existence of a gradient
is necessary for the flow to persist. It is therefore crucial to act on the
system from the outside by imposing external constraints or forcings.
We may define a steady state as a condition in which some state variables
are time independent. An equilibrium state is trivially a steady state. The
existence of external constraints is a necessary condition for the existence
of nonequilibrium stationary states. In the condition of the nonequilibrium
steady state, the following equations hold:
$$p = \frac{d_i S}{dt} > 0, \quad \frac{d_e S}{dt} + \frac{d_i S}{dt} = 0$$
From these, it necessarily follows that
$$\frac{d_e S}{dt} < 0$$
This condition may be interpreted by saying that, for nonequilibrium
stationary states, the entropy of the matter/energy entering the system must
be lower than that released by the system to the outside. Along with
Prigogine, we can say that, from the thermodynamic point of view, open
systems degrade the material that comes in, and it is this degradation that
maintains the steady state (Prigogine, 1967).
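These sign conditions can be made concrete with the simplest driven system: a steady heat current passing through the system from a hot to a cold reservoir. The numbers below are purely illustrative; the point is the pattern of signs, p > 0, d_eS/dt < 0, and dS/dt = 0 at the steady state.

```python
# A steady heat current q flows through the system: it enters from a
# reservoir at T_hot and leaves to a reservoir at T_cold (illustrative values)
T_hot, T_cold = 400.0, 300.0
q = 10.0  # steady heat current

dSe = q / T_hot - q / T_cold   # exchange term: entropy in minus entropy out
dSi = q / T_cold - q / T_hot   # internal entropy production p
dS = dSe + dSi                 # total entropy change: zero at steady state

print(dSi > 0, dSe < 0, dS == 0.0)  # True True True
```

The system exports more entropy than it imports (dSe < 0), and exactly the amount it produces internally, which is what keeps its own entropy constant.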
For a system that is subject to external constraints, from Equation 3.7 and
the assumption of linearity of the relation expressed by Equation 3.8, it can
be shown that
$$\frac{dp}{dt} \le 0 \quad (3.10)$$
Equation 3.10 says that, under the linearity condition, the entropy
production of a system decreases over time. The linearity condition was
demonstrated by Onsager (1931). From Equation 3.7, for steady-state systems
that are subject to external constraints, p > 0 as long as the constraints
persist. Together with Equation 3.10, this leads to the conclusion that for
constrained systems the stationary states are local minima of the entropy
production p. In summary, we will have dp/dt < 0 far from the steady state
and dp/dt = 0 at the steady state. The following theorem can be proved: a
thermodynamic system in the linear regime and subject to outside constraints
tends to evolve toward states of minimum entropy production. This statement
is a first example of an evolutional criterion.
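The minimum-entropy-production theorem can be watched in action on a minimal model (a sketch under illustrative parameter values, not an example from the text): one interior node coupled linearly (J = LX) to two reservoirs held at fixed temperatures. The total production p = Σ J_k X_k decreases monotonically as the node relaxes, and its minimum coincides with the steady state.

```python
L_c, C, dt = 1.0e4, 1.0, 0.001   # kinetic coefficient, heat capacity, step
T0, T1 = 300.0, 400.0            # fixed boundary temperatures (constraints)
Tm = 390.0                       # interior node, started off the steady state

def production(Tm):
    """Total entropy production p = J1*X1 + J2*X2 (as in Equation 3.6)."""
    X1, X2 = 1 / Tm - 1 / T0, 1 / T1 - 1 / Tm  # forces on the two heat flows
    return L_c * X1 * X1 + L_c * X2 * X2        # with linear fluxes J = L_c X

ps = [production(Tm)]
for _ in range(100_000):
    J1 = L_c * (1 / Tm - 1 / T0)   # heat from reservoir 0 into the node
    J2 = L_c * (1 / T1 - 1 / Tm)   # heat from the node to reservoir 1
    Tm += (J1 - J2) * dt / C
    ps.append(production(Tm))

# p never increases along the trajectory: dp/dt <= 0 (Equation 3.10),
# and the steady state satisfies 1/Tm = (1/T0 + 1/T1) / 2
assert all(b <= a + 1e-12 for a, b in zip(ps, ps[1:]))
T_steady = 2 * T0 * T1 / (T0 + T1)
print(round(Tm, 2), round(T_steady, 2))  # 342.86 342.86
```

As long as the boundary temperatures differ, p stays strictly positive at the steady state: the constraints keep the system away from equilibrium, at the local minimum of p rather than at p = 0.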
In the regime of linear irreversible processes, the entropy production p
plays a role similar to that of the thermodynamic potentials in the theory
of equilibrium states. However, this decrease in entropy production does not
reflect the emergence of macroscopic order, such as is observed when the
constraints that drive the system away from equilibrium are continuously
increased. The stability of the not-too-far-from-equilibrium states implies
that, in a system which obeys linear laws, the spontaneous emergence of
highly ordered structures is impossible.
All the results presented so far are based on the assumption that the
fundamental Gibbs Equation 3.4 is valid. This equation was first demonstrated
for equilibrium states. The physical interpretation of this assumption is
that, under nonequilibrium conditions as well, entropy depends on the same
independent variables as in equilibrium states. This can hold in
far-from-equilibrium situations too. The assumption of linearity is rather
more restrictive. Even
in the range of validity of Gibbs equation, the relationship between gener-
alized flows and forces may be nonlinear. Since the theorem of minimum
entropy production is valid only for the linear regime, it is not very inter-
esting for biological systems. In 1971, Prigogine and Glansdorff proposed
a method to extend the results of the thermodynamics of stationary states
to nonlinear situations (Glansdorff and Prigogine, 1971). Indeed, it can be
shown that there is always a part of the variation of the entropy production
in time that keeps a definite sign:
$$\frac{d_X p}{dt} \le 0 \quad (3.11)$$
This part identifies a local extremum for that portion of p that has been
defined, as in Equation 3.6, by some external constraints. An in-depth
analysis of this theory would lead us very far. The remarkable point,
however, is that these developments may provide the theoretical basis for
the definition of evolutional criteria.