
Foundations of Science

https://doi.org/10.1007/s10699-019-09630-7

The Third Construct of the Universe: Information

C. Barreiro1 · Jose M. Barreiro2 · J. A. Lara3 · D. Lizcano3 · M. A. Martínez3   · J. Pazos3

© Springer Nature B.V. 2019

Abstract
Very few scientists today question the fact that information, together with matter and
energy, is one of the three constructs forming the ontology of the universe. However, there
is still a long way to go before we can establish the interrelations between information
and energy and between information and matter, as Einstein did between matter and energy.
In this paper, after introducing the energy, matter, information (IME) model, which covers
the three constructs and their relationships, we illustrate real examples—two qualitative
and two quantitative—of the interrelations between energy and information. This settles
the open question regarding the interrelationship between energy and information.

Keywords  Energy · IME model · Information science · Information theory · Matter

1 Introduction

As the history of science has repeatedly shown, world perception is the product of the
human being’s historical experience. The concept of time did not develop until there was
relevant experience with machines for measuring time, clocks. Until then, time tended to
be viewed rather fuzzily and as something cyclical rather than linear. Not until Huygens
invented the pendulum clock, which was able to define time in terms of small, equal and
repetitive units, did time exhibit properties of homogeneity and continuity, whereby it is
perceived to pass minute by minute or second by second. Similarly, the concept of matter
did not start to be abstracted until there was relevant and significant experience of
creating energy-generating devices. Until then, both concepts, matter and energy, were fuzzy.
An object could be hot or cold in the same manner that it could be hard or soft. Both were
specific properties of matter. Not until energy machines appeared did it become necessary
to clarify the concept and definition of energy.

Author C. Barreiro passed away after finishing the article.

* M. A. Martínez
mariaaurora.martinez@udima.es

1 Universidad de Vigo, Pontevedra, Spain
2 Technical University of Madrid, Madrid, Spain
3 UDIMA-Madrid Open University, Madrid, Spain


We are now at a similar historical crossroads. Until the 1940s, there was almost no
experience with information machines or artefacts. It was with the advent of electronic
computers and their widespread and ubiquitous use that the need to precisely establish and
define the concept of information and create and develop a theoretical framework was first felt. As
Scarrott (1986) noted, there is a need for an information science that should research the
natural properties of information, such as function, structure, dynamic behaviour, features,
statistics, etc.
In response to a worldwide survey with a single question, What is the universe made
of?, most people would say, in different, albeit more or less equivalent ways, matter. A
comparatively small number would state, and this time more accurately, matter and energy.
And, finally, a tiny fraction, perhaps close to 1/10,000, would give the following, currently
more accurate response of energy, matter and information. These would be scientists and
engineers. The reason is that both groups are aware that factory robots, for instance, are
built of some material, matter, work using some sort of motor, energy, and will only do
useful things if they have plenty of instructions, information, that tell them what they
should do. Biologists, on the other hand, are well aware that a ribosome of a cell is built
from amino acids, matter, and is fed with energy generated by the conversion of adenosine
triphosphate (ATP) into adenosine diphosphate (ADP), but would be unable to synthesize
proteins without the information supplied by deoxyribonucleic acid (DNA).
The result of all this is that, until not very long ago, everyone agreed that the physical
world, the universe or The Fabric of Reality (Deutsch 1997), referred to in these three
ways, was composed of matter and energy, which, according to Einstein's famous equation
(Einstein 1905, 1907), E = mc², were interchangeable. Today a third element has to
be added to these two components: information. Information is neither matter nor energy,
but it is a real-world property that is just as essential as matter and energy. A century of
research has authoritatively shown that information plays a key role in physical systems
and processes.
In this paper we present a world model (IME) based on its three "constructs": information,
matter and energy, building on papers by Stonier (1990) and Wang (2008) and on our own
research. Besides, we present the equivalence between information, energy and matter and,
as an important part of this research, two qualitative and two quantitative examples of
the relations between energy and information. To this end, Sect. 2 presents the IME model,
and Sect. 3 the equivalence between information, matter and energy. Section 4 explains
the qualitative and quantitative examples that prove the validity of the model. As will
be seen in Fig. 2, the interrelations between the three "constructs" of the universe
(energy, matter and information) are determined by six functions f1, f2, f3, f4, f5, f6,
of which so far only two, those corresponding to matter and energy, f5 and f6, have been
established quantitatively. In this work, the functions f1 and f2, corresponding to the
relationship between information and energy, are established both qualitatively and
quantitatively. Finally, in the last section, we present the results and conclusions of
this investigation.

2 State of the Art: The IME Model

Today the line of thought introduced by Princeton University’s John Archibald Wheeler
considers that the physical world is mainly composed of information; energy and matter
would be accessory. Indeed, Wheeler, as a consequence of the paradoxes of quantum physics,
proposed a completely revolutionary paradigm shift. According to Wheeler, the profoundest
lesson of quantum mechanics is that physical phenomena are somehow defined
by the questions that they pose. In a sense, he claimed, it is a participatory universe. And,
he added, the basis of reality may not be the quantum, which, despite its elusiveness, is still
a physical phenomenon, but the bit, that is, the response to a yes-or-no question. Wheeler
refers to this idea as the it from bit (Wheeler 1990). Scientists have picked up this baton
and are trying to reformulate quantum physics in terms of information theory. They have
already found that Heisenberg’s uncertainty principle, wave-particle duality and non-
locality can be formulated more powerfully in this context. In the meantime, theorists from
other fields are reasoning out thought experiments designed to unveil the key to the enigma
once and for all.
There are two reasons for considering that information is more than something that is
manipulated within the human head. The first is the physical transmission and processing
of information: telephone, telegraph, radio waves, etc. The second is the indisputable dem-
onstration that DNA and RNA (ribonucleic acid) convey genetic information, that is, how a
physical substance, DNA, conveys information. And this has been going on for thousands
of millions of years, whereas the human brain has only been processing information for
five million years.
The question now is, Can information exist outside the human brain? Considering DNA,
the answer is obviously yes. However, this question refers more specifically to information
created or gathered by human beings. And, again, the answer is yes. Libraries, museums,
etc. store information outside the human brain. Human information, that is, information
produced by human beings can exist as forms of matter and energy whose physical reality
is independent of human beings (Dompere 2018).
But, like electricity, information is ethereal. This is why it is not reasonable to compare
information, like a thought written down on a piece of paper, with any material product
whatsoever. However, both electricity and information exist physically in reality (Khatam
and Shafiee 2014). Just as there exist different forms of energy—mechanical, chemical,
electrical, heat, light, sound, nuclear, etc.—, so do there exist different forms of informa-
tion. Human information is one form of information. Human information itself may be
stored and communicated in a wide variety of ways and represented in very different forms.
Gleick (2011) describes the "state of the art" of the classical theory of information in
a meticulous, though not quite complete, way, since it ignores the issues discussed in
this work. From that account, the following points are worth emphasizing:

• During World War II, around 1943, Turing travelled from Bletchley Park to the Bell
Laboratories within the framework of the UK–USA collaboration agreement on
cryptography. There he had the opportunity to engage in numerous conversations with
Shannon (Anguera et al. 2019; Lara and Aljawarneh 2019), and although they did not
speak directly of their respective work in cryptography, they did discuss Turing's
idea of how to measure all that "material" that constituted the messages they
processed. Turing was aware that there was something in it that should be measured
mathematically. It was not about probability. What rather concerned Turing was the
information that "changed" the probability: a probability factor, something like the
weight of evidence. And he invented a unit he called the "ban". It seemed
appropriate to use a logarithmic scale so that bans were added instead of
multiplied. In base ten, a "ban" was the weight of evidence needed to make a fact
ten times more likely. For finer measurements there were submultiples: the deciban,
by analogy with the decibel, and the centiban (Good 1979). It is noteworthy that at
this time there was no formal information theory.
• Shannon (1948) wrote an article in which, almost in passing, he used the expression
"information theory" for the first time. For him, information is uncertainty, it is
entropy, and it can be measured; he thereby created the unit of measurement of
information that John W. Tukey called the "bit".
• Rosenblueth et al. (1943) published a work in which three fundamental concepts
appeared. One, "feedback" as an organizational and control principle. Two,
"teleological computing", or computing with purpose, suggesting different ways of
conferring objectives and purposes on machines. Three, and the most relevant for
this work, information as pure form or abstraction, separated from the physical
signal that carries it.
• And in the 21st century, Seth Lloyd wrote: "The more energy, the faster the bits
flip. Earth, air, fire and water in the end are all made of energy, but the different
forms they take are determined by information. To do anything requires energy. To
specify what is done requires information" (Lloyd 2006).
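Turing's and Shannon's logarithmic units translate directly into code: a ban is the base-10 logarithm of a likelihood ratio, a bit the base-2 version, and on either scale independent pieces of evidence add instead of multiply. A minimal sketch (the helper names `bans` and `bits` are ours, for illustration):

```python
import math

def bans(likelihood_ratio):
    """Weight of evidence in bans: log base 10 of the likelihood ratio."""
    return math.log10(likelihood_ratio)

def bits(likelihood_ratio):
    """The same weight of evidence measured in bits: log base 2."""
    return math.log2(likelihood_ratio)

# One ban: evidence that makes a hypothesis ten times more likely.
assert abs(bans(10) - 1.0) < 1e-12

# On a logarithmic scale, evidence combines additively, not multiplicatively.
assert abs(bans(10 * 100) - (bans(10) + bans(100))) < 1e-12

# 1 ban = 10 decibans = log2(10) ≈ 3.32 bits.
print(bits(10))
```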

Note importantly that information is usually propagated by means of wave impulses:
light, sound, radio or electrons, or even impulses affecting matter and its organization. The
fact that information can be divided into small and discrete packets is used in telecom-
munications for packet switching enabling several users to use the same facility simultane-
ously. Information has been naturally conceived as an independent entity since Nyquist
(1924, 1928), Hartley (1928), Shannon (1948), etc. Thus, for example, Bell (1968) stated
that Information… is a measurable quantity that is independent of the physical medium by
which it is conveyed. However, this does not mean that it has a physical reality—and Bell
compares information with a more abstract term, pattern—, but it does imply that it actu-
ally exists. In other words, information is dealt with as an abstract entity, but this idea is not
pursued to its logical conclusion: information exists.
Information does exist. It does not have to be perceived to exist. It does not have to
be understood to exist. It can be interpreted without intelligence. It does not have to have
meaning to exist. It exists. This is an essential premise for understanding the physical uni-
verse and developing a theory of information. And, without this theory, it is impossible, for
example, not only to convert software development into a genuine branch of engineering
with a scientific groundwork, but also to understand the behaviour of advanced and com-
plex information systems, be these biological, social or even quantum physics phenomena,
etc. Moreover, without information it will be hard to understand the universe. And it will
be hard to tackle the brain-mind problem with any chance of success.
Stonier (1990) qualitatively identified relationships between information, matter and
energy. Heat is pure energy interacting with matter. The application of heat does not in
itself provide any information input, quite the contrary. A big enough increase in heat
causes molecules or other particles to move more randomly. This leads a crystal to dis-
solve, ice to liquefy, liquid to evaporate. The system is less organized at each stage. The
application of heat to a system randomizes its components and causes disorder or increased
entropy in the universe. Reciprocally, withdrawing heat from a system, such as condensing
a gas or freezing a liquid, increases its organization. Such cooling processes are conse-
quently associated with an increase in information. Therefore, heat can be said to be the
result of energy interacting with matter, whereas structure is the result of information
interacting with matter. The application of energy expresses itself as heat, which causes
particles (molecules, photons, plasmons, etc.) to vibrate and to move at random. In contrast,
the application of information causes particles to be bound into fixed patterns and ordered
motion. In that sense, heat may be considered as the antithesis of organization.
Heat is the antithesis of organization, and, by implication, energy is the antithesis of
information. This does not, however, preclude the possibility of energy and information
interacting to produce a mix which might be viewed as energized information or, alterna-
tively, as structured energy. Information and energy must not be viewed as the opposites
of a bipolar system, rather they must be considered, according to the IME model shown in
Fig. 1, as two vertices of a triangle, with matter as the third vertex. Figure 1 illustrates a
conceptual model that should define the limits of the physical world, according to the three
sides of a triangle, the edges, in terms of the following phenomena:

• An informationless combination of pure energy and matter should contain the plasma
of the fundamental particles.
• An energyless combination of matter and pure information should be exemplified by a
crystal at 0 K.
• A matterless combination of information and energy would consist of massless parti-
cles, for instance, photons travelling through matterless space.

On the other hand, the measures of space and time establish information about the
organization and distribution of matter and energy. The information that a physical system
contains is directly proportional to the space that it occupies, whereas time is inversely
proportional to information. The idea of a system containing less information as a result of
the time between two events increasing should not be confused with a system’s content of
structural information, which is persistent. A system that survives the ravages of time cer-
tainly contains more information than one that disintegrates.
Cognitive informatics considers information as abstract generic artefacts that can be
shaped, processed and stored by human brains. Cognitive informatics theories provide a
new perception of information and informatics (information science) issues, as follows:

• Information is the third essential construct for modelling the world.

[Figure 1: a triangle whose vertices are Information, Matter and Energy; the labels "hot
matter" and "plasma of fundamental particles" mark combinations along the Matter–Energy side.]

Fig. 1  Relations between matter, energy and information, according to Stonier


• Any product and/or process of human mental activities generates information.


• Information, matter and energy can be transformed into each other.
• Software conforms to the laws of new informatics and cognitive informatics.

This poses several fundamental questions. For example, When will the information
storage capacity of memory devices stop increasing? What is the ultimate capacity of a
device weighing, for instance, less than a gram and having a size of one cubic centime-
tre, which is approximately the size of a chip? How much information is necessary to
describe the universe? When was the first information produced? Or, more interestingly
as far as we are concerned, what relations are there between information and energy?
According to Wang (2008), the information-matter-energy model (IME) states that
the natural world (NW), which forms the context of human intelligence and science and
technology, is a dual world. One aspect of it is the physical world (PW), and the other
is the abstract world (AW), where matter (M) and energy (E) are used to model the for-
mer, and information (I) the latter, that is:
NW =̂ PW ∥ AW = f (M, E) ∥ g(I) = h(I, M, E), (1)
where ∥ denotes a parallel relation, and f , g and h are functions that determine a certain
PW, AW, or NW, respectively, as illustrated in Fig. 2.
According to the IME model, information plays a vital role in connecting the physi-
cal world with the abstract world. Models of the natural world have been well studied
in physics and other natural sciences. However, the modelling of the abstract world is
still a fundamental issue yet to be explored in cognitive informatics, computing, software
science, cognitive science, brain sciences, and knowledge engineering. The relationships
between I, M and E, and especially their transformations, are deemed one of the
fundamental questions in science and engineering.
The natural world NW (I, M, E), and particularly part of the abstract world, AW(I),
is viewed differently by individuals because people have different perceptions and men-
tal contexts. This indicates that although the physical world PW (M, E) is the same to
everybody, the natural world NW (I, M, E) is unique to different individuals because it

[Figure 2: nested regions in which information (I) models the Abstract World (AW), and
matter (M) and energy (E) model the Physical World (PW), together constituting the
Natural World (NW).]

Fig. 2  The IME model of the world view


contains the abstract world, which is subjective and depends on the information an indi-
vidual receives and perceives.
As the model illustrated in Fig.  2 shows, information models the abstract world and
its interactions with the physical world, making it, alongside matter and energy, the third
construct for modelling the natural world. According to the cognitive theory of informat-
ics, information is any property or attribute of entities in the natural world that can be
abstracted, represented digitally and processed mentally. Information science or informat-
ics studies the nature of information, information processing and, together with physics,
ways to transform information, matter and energy.
The transformability of I, M and E can be illustrated as in Fig. 2, where all the generic
functions f1 to f6 conform to the following equations:
I = f1(E),  E = f2(I) ≟ f1⁻¹(I);  (2)

I = f3(M),  M = f4(I) ≟ f3⁻¹(I);  (3)

E = f5(M),  M = f6(E) = f5⁻¹(E);  (4)

where the question mark over the equals sign denotes uncertainty about whether such a
reverse function exists.
To some extent, Wang views contemporary information theory as the science that sets
out to find possible solutions f1, f2, f3 and f4. Some of the above question marks, namely
those referring to f1 and f2, have already been resolved, as demonstrated in Sect. 4.

3 Information–Energy–Matter Equivalence

Let us examine the interconvertibility of energy and information, using a well-known phys-
ical phenomenon: the collision of two billiard balls (Fig. 3). Suppose that two billiard balls,
a black and a white one, are rolling over the billiard table at the same velocity. The black
ball is moving in a north-easterly direction and the white one in a south-easterly direction,
and they switch direction when they collide with each other. The question then is whether
they exchanged energy or information. They did not in fact exchange much energy as they
continued to travel at more or less the same velocity after the collision. However, their tra-
jectory did alter substantially, and this raises the following question, Is the conservation of
momentum a reflection of the fact that two bodies merely exchanged information?

Fig. 3  Information content of bodies in motion
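The collision just described can be checked numerically. In an elastic collision of two equal-mass balls, the velocity components along the line of impact are exchanged while speeds, and hence kinetic energies, are preserved; only the directional content of the motion changes. A minimal 2-D sketch with illustrative values (the function name is ours, not from the paper):

```python
import math

def elastic_collision_equal_masses(v1, v2, normal):
    """Elastic collision of two equal-mass balls: the velocity components
    along the (unit) collision normal are exchanged; tangential components persist."""
    nx, ny = normal
    # Project each velocity onto the collision normal.
    p1 = v1[0] * nx + v1[1] * ny
    p2 = v2[0] * nx + v2[1] * ny
    # Swap the normal components, keep the tangential ones.
    u1 = (v1[0] + (p2 - p1) * nx, v1[1] + (p2 - p1) * ny)
    u2 = (v2[0] + (p1 - p2) * nx, v2[1] + (p1 - p2) * ny)
    return u1, u2

# Black ball heading north-east, white ball heading south-east, equal speed.
black, white = (1.0, 1.0), (1.0, -1.0)
u_black, u_white = elastic_collision_equal_masses(black, white, (0.0, 1.0))

speed = lambda v: math.hypot(*v)
# Kinetic energy (hence speed) is conserved ...
assert math.isclose(speed(black), speed(u_black))
# ... but the trajectories are swapped: direction, not energy, is what changed.
assert u_black == (1.0, -1.0) and u_white == (1.0, 1.0)
```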


Let us start by exploring the interconvertibility of energy and information. To do this,
let us take the equation that changed the shape of the world, formulated by Einstein (1905)
as a consequence of his special theory of relativity:

E = m0·c² / ±√(1 − v²/c²).  (5)
If a particle is massless, that is, m0 = 0, and it is not moving at the speed of light, c,
then the above equation implies that its energy must be zero; that is, if m0 = 0 and v ≠ c,

E = 0·c² / ±√(1 − v²/c²) = 0.  (6)

However, if the particle is moving at the speed of light, that is, v = c, then v²/c² = 1,
and the above equation is indeterminate, that is,

E = 0·c² / ±√(1 − 1) = 0/0 = indeterminate.  (7)

This means that, although E may have a value, this value cannot be determined by means
of Eq. (7). A similar argument applies to the relativistic momentum, p, given by:

p = m0·v / ±√(1 − v²/c²).  (8)
Now consider a massless particle moving at a velocity other than c . It should have nei-
ther energy nor momentum. However, such a particle could exist at least in theory. Like
a photon, it would not have rest mass; unlike a photon, however, it should not move at
velocity c and, therefore, should not have momentum. Even so, it could have velocity and,
consequently, it could represent a unit in motion composed of pure information. Such a
hypothetical particle would have certain properties.
The linear momentum of a photon, given by Eq. (8), can also be expressed as follows:

p = hν/c,  (9)

where h is the Planck constant and ν is the frequency. On the other hand, the relation
between frequency ν and wavelength λ is known to be given by

ν = c/λ.  (10)

Substituting the ν from (10) into (9), we get the de Broglie wavelength (de Broglie 1925):

p = h/λ ⇒ λ = h/p.  (11)

Now, substituting (8) into (11) yields

λ = h / (m0·v / ±√(1 − v²/c²)) = h·±√(1 − v²/c²) / (m0·v).  (12)
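Equation (12) can be explored numerically: at fixed v < c, the wavelength grows without bound as the rest mass shrinks, which is the limiting behaviour the argument below relies on. A quick sketch using standard CODATA constants (the helper name is ours, for illustration):

```python
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light in vacuum, m/s

def de_broglie_wavelength(m0, v):
    """Relativistic de Broglie wavelength, Eq. (12):
    lambda = h * sqrt(1 - v^2/c^2) / (m0 * v)."""
    return H * math.sqrt(1.0 - (v / C) ** 2) / (m0 * v)

# An electron (m0 ≈ 9.109e-31 kg) at 1% of c: a few hundred picometres.
lam_electron = de_broglie_wavelength(9.1093837015e-31, 0.01 * C)

# As the rest mass tends to zero at fixed v < c, the wavelength diverges.
lam_tiny_mass = de_broglie_wavelength(1e-45, 0.01 * C)
assert lam_tiny_mass > lam_electron * 1e10

print(lam_electron)   # of the order of 2.4e-10 m
```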

We can argue similarly for Eqs. (6), (7) and (8): for particles whose rest mass is
zero, provided that v = c, the equation is indeterminate and λ can have a value. Other-
wise, λ is infinite, leading to the following two interesting postulates proposed by Stonier
(1990):


1. An informon is a photon whose wavelength has been extended to infinity.
2. A photon is an informon travelling at the speed of light.

Therefore, an informon can be considered as a photon that appears to have stopped
oscillating: at velocities other than c, its wavelength appears to be infinite and,
consequently, its frequency is zero. If an informon accelerates to the speed of light, it crosses
a threshold where it can be perceived as having energy. When this happens, energy E is a
function of its frequency ν, that is,

E = hν.  (13)
Reciprocally, at velocities other than c, the particle exhibits neither energy nor momen-
tum; however, it could retain at least two properties of information, namely, its velocity and
direction. In other words, at velocities other than the speed of light, a quantum of energy
becomes a quantum of information, that is, an informon.
This suggests the possibility of the universe being full of informons. However, current
techniques are not designed to determine the presence of such particles because they move
at velocities other than c and fail to interact with matter, that is, do not have momentum.
An informon possessing information that equates it to a green light should not be visible as
a green light until it reaches a velocity of c. The naked human eye does not perceive elec-
tromagnetic radiation outside a certain frequency range and is therefore unable to detect
infrared light without special equipment; likewise, it does not perceive massless particles
at velocities of less than c.
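As a numerical illustration of Eqs. (9)–(13), consider a green photon with λ ≈ 532 nm (an illustrative value): its energy E = hν and momentum p = h/λ are definite and satisfy E = pc, the relation that ties Eqs. (9), (11) and (13) together.

```python
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light in vacuum, m/s

wavelength = 532e-9           # green light, metres (illustrative value)
frequency = C / wavelength    # Eq. (10): nu = c / lambda
energy = H * frequency        # Eq. (13): E = h * nu
momentum = H / wavelength     # Eq. (11): p = h / lambda

# E = p*c must hold for any photon, linking Eqs. (9), (11) and (13).
assert abs(energy - momentum * C) < 1e-30

print(energy)    # of the order of 3.7e-19 J (about 2.3 eV)
```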

4 Examples of the Information–Energy Relations

Since Einstein wrote his famous matter–energy equivalence formula (5) in 1905, best
known as E = mc², energy and matter have been known to be interconvertible (Einstein 1905).
Indeed, atom bombs constituted powerful proof of the passage of matter to energy. The
reverse step, the conversion of energy to matter, has been observed experimentally in parti-
cle accelerators. Bombarding the nucleus of an atom with high-energy γ rays has produced
positron–electron particle pairs. For example, a high-energy photon can be converted into
an electron and a positron under certain circumstances.
Now, as we will see later, energy can also be converted into information and vice versa.
This occurs whenever the system exhibits an entropy reduction or an increase in potential
energy, such as in Bénard instability and something as commonplace as lifting up an object
off the floor. And, more importantly, information can be converted into energy.

4.1 Qualitative Examples

4.1.1 Bénard Instability

Bénard instability is a striking example of the above (Bénard 1900), which John William
Strutt, Lord Rayleigh, studied theoretically (Rayleigh 1916). Bénard instability materializes under
certain circumstances when the lower surface of a horizontal liquid layer is heated. Under
these circumstances, a vertical temperature gradient is set up with a permanent flux of heat
from bottom to top. If the gradient is not very large, the heat is conveyed by conduction
alone. As gradually more heat is applied, however, a threshold value is reached beyond which
convection becomes important for transferring heat from bottom to top. Bénard instability
can involve a highly organized process of molecule movement. As Prigogine and Stengers
(1984) stated, Bénard instability is a spectacular phenomenon. The convection motion
produced actually consists of the complex spatial organization of the system. Millions of
molecules move coherently, forming hexagonal convection cells of characteristic size. Instead of
the continued application of heat causing further disorganization, Bénard instability creates
structures that help to transfer heat through the layers of the liquid. Prigogine and Stengers
also make the following important observation: heat transfer was considered a source of
waste in classical thermodynamics; in the Bénard cell it becomes a source of order. This is
a clear case, then, of the application of energy leading to an increase in system organiza-
tion and, therefore, information. This particular form of organization is only conserved for
as long as there is a large enough heat flux through the horizontal liquid layer. Once the
energy is removed, the structure collapses, the information disappears.

4.1.2 Potential Energy

Every time we pick up an object off the floor and place it on a table, we expend energy on
altering the organization of the universe. Work has to be done to perform this operation,
but this also creates a thermodynamically less probable state; therefore, we have increased
the information content of the universe. What actually happens to the energy expended
when an object is lifted has always been a mystery (Esposito et al. 2018). To explain it,
physicists were obliged to invent an accounting device: potential energy. As the origin of
the potential is open to choice, potential energy has always been an anomaly. The object on the
table has no energy. It will not move unless a force is applied. But there is another simpler
explanation: the energy expended is converted into information. The loss of energy is equal
to the increase of information in the system. To highlight this point, work is a transitory
phenomenon, whereas the product of work implies a change in the status of system infor-
mation, a change that is constant until it is subject to another force or more work. Along
these lines, we can define potential energy as a term that describes a state in which the
expended energy results in an increase in the information content of the system.
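Under the paper's premise that the energy expended in lifting is converted into information, a back-of-the-envelope estimate can price each bit at kT·ln 2, the Szilard figure used in Sect. 4.2. The sketch below is purely illustrative (the masses, heights and the function name are ours), not a calculation from the source:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
G = 9.81             # gravitational acceleration, m/s^2

def lift_energy_as_bits(mass_kg, height_m, temperature_k=293.0):
    """Work done against gravity, expressed as kT*ln2 'bit equivalents'.
    Illustrative only, following the paper's energy-to-information premise."""
    work = mass_kg * G * height_m                 # potential energy gained, joules
    per_bit = K_B * temperature_k * math.log(2)   # minimal energy per bit at T
    return work, work / per_bit

work, bit_equivalents = lift_energy_as_bits(1.0, 1.0)
assert abs(work - 9.81) < 1e-9

# Roughly 3.5e21 bit-equivalents for lifting 1 kg by 1 m at room temperature.
print(f"{bit_equivalents:.2e}")
```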
These are two qualitative examples. Let us now look at two quantitative examples.

4.2 Quantitative Examples

In microscopic systems, thermodynamic quantities such as work, heat and internal energy
do not remain constant but fluctuate. In fact, random violations of the second law have
been observed; nonetheless, in thermal equilibrium the second law still holds on average:
⟨ΔE − W⟩ ≤ 0, where ΔE is the free-energy difference between states, W is the work done
on the system, and ⟨⋅⟩ is the ensemble average. However, feedback control enables us to
selectively manipulate only those fluctuations that violate the second law, such as upward
jumps, using information about the system. In fact, Szilard (1929) developed a model that
converts one bit of information about the system into kB·T·ln 2 of free energy or work. In
other words, the second law of thermodynamics is generalized as ⟨ΔE − W⟩ ≤ kB·T·I.
Here I is the mutual information content obtained by the measurements taken. So far, the
idea of a simple thermal rectification by feedback control has found applications such as the
reduction of thermal noise and the rectification of an atomic current at low temperature. On
the other hand, the Szilard-type Maxwell’s demon enables us to evaluate both the input (used
information content), and the output (obtained energy) of the feedback control and relate

them operationally. Therefore, it provides an ideal test ground for information–energy
conversion and plays a crucial role in the foundations of thermodynamics. However, its
experimental realization long remained elusive. In the following, we consider these two cases:

4.2.1 Thermal Noise

The maximum amount of information that can be transmitted over a communication chan-
nel is called the channel capacity and is represented by C. Mathematically, it is given by

C = max over {p(Ai)}, i = 1, 2, …, n, of Im(I, O),

where Im(I, O) is the mutual information, I represents the input and O the output, that is,

Im(I, O) = H(I) − H(I|O) = H(O) − H(O|I),

which represents the average quantity of information received about the input once the
channel output is known. The value C is given in bits/input symbol.
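The capacity definition above can be made concrete with the simplest textbook case, the binary symmetric channel with crossover probability p, whose capacity (achieved by uniform inputs) is the standard result C = 1 − H(p) bits per symbol. A minimal sketch (function names are ours):

```python
import math

def h2(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel: C = 1 - H(p) bits/symbol."""
    return 1.0 - h2(p)

assert bsc_capacity(0.0) == 1.0   # noiseless channel: one full bit per use
assert bsc_capacity(0.5) == 0.0   # pure noise: output says nothing about input

print(bsc_capacity(0.11))         # roughly 0.5 bits/symbol
```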
Telecommunications engineers are well acquainted with the impact that signal power
has on information quality, which, they add, is self-evident. However, the role of the
amplitude and the bandwidth W, they insist, is no less so. Now, the Hartley–Tuller–Shannon
formula (Rifá and Huguet 1991) states that, for Gaussian discrete-time and continuous-
amplitude channels, the channel capacity is given by C = W·log(1 + γ) bits/s.
Imagine thermal noise, that is, noise caused by temperature. In this case, as is well
known, the noise power is proportional to the temperature and to the bandwidth W of the
channel. Let γ = β²/σ² be the ratio of the average power constraint associated
with the channel to the noise-induced signal variance, that is, the signal-to-noise ratio. The
value σ² for a purely thermal noise source is given by σ² = kTW, where T is the absolute
temperature and k is a constant. On the other hand, β² = M/n is the energy dissipated per
issued symbol, where M is the average energy constraint for the channel input and
n = 2WT is the length of the input sequences. Then we have that
C = W · log2(1 + β²/(kTW)) bits/s. And if W is assumed to be very large,
lim(W→∞) C = (β²/(kT)) log2 e = 1.44 β²/(kT). Of course, the value of k depends on the system of measure-
ment used; it turns out that obtaining one bit requires β² ≈ T · 10⁻²³ if T is expressed in kelvins and β² in joules.
This expresses a fundamental limitation, namely that we need T · 10⁻²³ joules of energy
to obtain one bit of information at absolute temperature T. We have, for example, that
1 bit ≈ 3 · 10⁻²¹ joules at room temperature (20 degrees centigrade). Clearly, there is a pre-
cise and clearly defined relation between information and energy.
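These limits can be checked numerically with a short sketch (the per-symbol energy β² and the bandwidths below are illustrative values of our own choosing): as W grows, C = W log2(1 + β²/(kTW)) approaches 1.44 β²/(kT), and the energy needed per bit tends to kT ln 2, roughly 3 · 10⁻²¹ J at room temperature.

```python
from math import log2, log

k = 1.380649e-23        # Boltzmann constant, J/K
T = 293.0               # room temperature (20 degrees centigrade) in kelvins
beta2 = 1e-20           # energy dissipated per symbol, joules (illustrative)

def capacity(W):
    """C = W * log2(1 + beta^2 / (k*T*W)) bits/s for a thermal-noise channel."""
    return W * log2(1 + beta2 / (k * T * W))

wide = capacity(1e12)               # very large bandwidth: close to the limit
limit = 1.44 * beta2 / (k * T)      # lim C = 1.44 * beta^2 / (kT)
energy_per_bit = k * T * log(2)     # ~2.8e-21 J per bit at 293 K
print(f"{wide:.3e} {limit:.3e} {energy_per_bit:.2e}")
```

Note that the capacity grows with W only up to the finite limit 1.44 β²/(kT): widening the band does not buy unlimited bits once thermal noise fills it.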

4.2.2 Maxwell’s Demon

Szilard (1929) devised a research protocol in which a hypothetical intelligence, called
Maxwell's demon in honour of its creator (Maxwell 1871), pumps heat from an isother-
mal environment and transforms it into work. After intense controversy lasting some
80 years, the conclusion was finally reached that the demon's role did not contradict,
much less violate, the omnipresent second law of thermodynamics of increasing entropy. But
that would mean, in principle, that information can be converted into free energy (Wein-
stein 2003). An experimental demonstration of this information-to-energy conversion,
however, had been elusive until Toyabe et al. (2010) demonstrated that a non-equilibrium
feedback manipulation of a Brownian particle based on information about its location
achieves a Szilard-type information-to-energy conversion. In other words, Toyabe et  al.
conducted an experiment in which they developed and demonstrated a new method to eval-
uate the information content and the thermodynamic quantities of feedback systems, and
demonstrated for the first time the Szilard-type information-to-energy conversion using a
colloidal particle on a spiral-staircase-like potential.
In the real world, such demons are at work in membrane-based biological systems. For
example, the green alga Valonia is composed of a hollow sphere containing liquid. This liq-
uid is known to contain potassium concentrations thousands of times greater than those of
the sea water in which it lives. Similarly, the human kidney purifies blood by
excreting potentially harmful molecules, including excess water. A kidney requires energy
in order to do its job. As such, it is one of the many examples of biological machines that
convert energy to information; that is, biological demons do the work of Maxwell's
demon, selecting molecules and, of course, locally reducing entropy. But they do so only as
a result of an energy input.
Consider now, as shown in Fig.  4a, a microscopic particle on a spiral-staircase-like
potential. The step height is proportional to thermal energy kB T  , where kB is the Boltzmann
constant and T is the temperature. The particle stochastically jumps between steps owing
to thermal fluctuations. Although the particle sometimes jumps up a step, the downward
jumps along the gradient are more frequent than the upward ones. Accordingly, the par-
ticle falls down the stairs on average unless it is pushed upwards externally, as shown in
Fig. 4a. Now consider the following feedback control. The position of the particle is measured
at regular intervals and, when an upward jump is observed, a wall is positioned behind the
particle to prevent downward jumps, as shown in Fig. 4b. By repeating this procedure, the
particle is expected to climb up the stairs. Note that the energy expended on positioning the
wall should be negligible; this means that the particle can obtain free energy without direct
energy injection. If this is the case, what makes the particle climb the stairs? This appar-
ent contradiction of the second law of thermodynamics, represented by Maxwell’s demon,
inspired many physicists to generalize the principles of thermodynamics. The particle is
now understood to be driven by the information gained by measuring particle position.
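The feedback protocol just described can be illustrated with a minimal Monte Carlo sketch (the jump probability and step count are illustrative assumptions of ours, not parameters of the actual experiment): without feedback the particle drifts down the staircase on average, while measuring its position and placing a wall behind it after each upward jump makes it climb.

```python
import random

def simulate(steps, p_up, feedback, seed=0):
    """Random walk on a staircase: each tick the particle attempts a jump.

    p_up < 0.5 models the downward bias of the potential gradient.
    With feedback, a wall placed behind the particle after every
    measurement blocks downward jumps, so only upward moves succeed.
    """
    rng = random.Random(seed)
    position = 0
    for _ in range(steps):
        if rng.random() < p_up:
            position += 1          # thermal fluctuation carries it up a step
        elif not feedback:
            position -= 1          # downward jump; suppressed by the wall
    return position

# Downward jumps are more frequent (p_up = 0.4), as in Fig. 4a.
free = simulate(10_000, 0.4, feedback=False)
demon = simulate(10_000, 0.4, feedback=True)
print(free, demon)   # free drifts downward on average; demon only climbs
```

The wall itself does no work on the particle in this sketch; the upward ratcheting comes entirely from acting on the measured position, mirroring the role of information in the experiment.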
In sum, Toyabe et al. demonstrate that free energy is obtained by feedback control using
information about the system, that is, converting the information into free energy, through
the first implementation of a Szilard-type Maxwell's demon. And given that the resulting
free energy or work is offset by the cost of the energy used by the demon to manipulate
the information, the whole system, including both the particle and the demon, does not
violate the second law of thermodynamics.

Fig. 4  Example of spiralling microscopic particle

Fig. 5  Information–energy conversion

In practice, the system consists of macroscopic
devices, such as computers, and the microscopic device, which gains energy at the
expense of energy consumption by the macroscopic device, as shown in Fig. 5. In other
words, if information is used as the means to transfer energy, this information–energy con-
version can be used to transport energy to nanomachines, even if they cannot be handled
directly. In this schema, information–energy conversion is represented by a macroscopic
demon and a microscopic system, exemplified by the Szilard engine. The information-to-
energy conversion ratio of the Szilard engine is potentially 100%: 1 bit (ln 2 nat in natural
logarithms) of information is converted to free energy or work amounting to kB T ln 2 in the
microscopic system, at the expense of an energy consumption of at least kB T ln 2 in the
macroscopic demon.
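This bookkeeping can be sketched numerically (the temperature and the demon's overhead factor below are illustrative assumptions, not measured values): each measured bit yields at most kB T ln 2 of work in the microscopic system, while the macroscopic demon spends at least as much, so the combined system never beats the second law.

```python
from math import log

kB = 1.380649e-23   # Boltzmann constant, J/K

def szilard_work(bits, T):
    """Maximum extractable work from `bits` of information at temperature T."""
    return bits * kB * T * log(2)

def demon_cost(bits, T, overhead=1.0):
    """Demon's minimum energy bill: at least kB*T*ln2 per bit (overhead >= 1)."""
    return overhead * bits * kB * T * log(2)

T = 300.0
work = szilard_work(1, T)      # ~2.87e-21 J per bit at 300 K
cost = demon_cost(1, T)
print(work, cost, work - cost <= 0)   # net gain is never positive
```

With overhead = 1.0 the engine attains its ideal 100% conversion ratio; any real demon has overhead > 1, making the net balance strictly negative.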

5 Results and Conclusions

In Sect. 2, among other issues, the question of the interconversion between the three constructs
of the Universe was raised: matter, energy and information. The interrelations between the
first two had already been established; it remained to establish those that might hold between
matter and information and between energy and information. The latter are the ones that
have been established in this work, leaving those between matter and information as an
open question. Indeed, the interrelations between information and energy are clearly not
only the matter of Gedankenexperiments, like the Szilard-type Maxwell's demon, but are
materially realizable, as shown in the examples illustrated in this paper.
This solves the problem stated by Wang concerning the interrelations between two of
the constructs, information and energy, of the IME model (Wang 2008). However, two other
important open questions remain. First, we know that matter is composed of fermions and
energy of photons, but the component of information, the informon (de la Peña et al. 2019),
has yet to be identified. Second, the interrelations between information and matter, which
would close the triangle formed by the elementary constructs of the ontology of the uni-
verse, have yet to be established.
We are working on this second open question stated by Wang and expect to report
the results in the near future. The authors conjecture, in this regard, that since there is a
vicarious relationship between matter and information, namely through Eq. (1), a
direct interrelation between the two is feasible, indeed plausible, and, of course, desir-
able. Hence the dedication to that equation. All that we can say about the trickier question
of identifying the informon at present is that we are devoting all our energy and powers to
advancing a solution.

Acknowledgements  The authors would like to thank R.E. for translating this manuscript.

Compliance with Ethical Standards 


Conflict of interest  On behalf of all authors, the corresponding author states that there is no conflict of inter-
est.

References
Anguera, A., Lara, J. A., Lizcano, D., Martínez, M. A., Pazos, J., & de la Peña, F. D. (2019). Turing: The
great unknown. Foundations of Science. https://doi.org/10.1007/s10699-019-09596-6.
Bell, D. A. (1968). Information theory and its engineering application. London: Sir Isaac Pittman & Sons.
Bénard, M. H. (1900). Les tourbillons dans une nappe liquide. Revue Générale des Sciences Pures et
Appliquées, 11, 1261–1309.
de Broglie, L. (1925). Recherches sur la théorie des quanta. Doctoral thesis, Faculty of Science, University
of Paris, November 29, 1924. Also in Annales de Physique (Vol. 3, pp. 22–128).
de la Peña, F. D., Lara, J. A., Lizcano, D., Martinez, M. A., & Pazos, J. (2019). A new approach to comput-
ing using informons and holons: Towards a theory of computing science. Foundations of Science.
https://doi.org/10.1007/s10699-019-09597-5.
Deutsch, D. (1997). The fabric of reality. London: Penguin Books.
Dompere, K. K. (2018). The theory of info-statics: An epistemic unity in defining information from matter
and energy. In The theory of info-statics: Conceptual foundations of information and knowledge. Stud-
ies in systems, decision and control (Vol. 112). Cham: Springer. https://doi.org/10.1007/978-3-319-61639-1.
Einstein, A. (1905). Ist die Trägheit eines Körpers von seinem Energieinhalt abhängig? Annalen der Physik,
18, 639–641.
Einstein, A. (1907). Über das Relativitätsprinzip und die aus demselben gezogenen Folgerungen. Jahrbuch
der Radioaktivität und Elektronik, 4, 411–462.
Esposito, C., Su, X., Aljawarneh, S. A., & Choi, C. (2018). Securing collaborative deep learning in indus-
trial applications within adversarial scenarios. IEEE Transactions on Industrial Informatics, 14(11),
4972–4981. https://doi.org/10.1109/tii.2018.2853676.
Gleick, J. (2011). The information: A history, a theory, a flood. New York City: Pantheon Books.
Good, I. J. (1979). Studies in the history of probability and statistics XXXVIII. A. M. Turing’s statistical
work in world war II. Biometrika, 66(2), 393–396.
Hartley, R. V. L. (1928). Transmission of information. Bell Systems Technical Journal, 7, 535.
Khatam, I., & Shafiee, A. (2014). Objective information in the empiricist view of von Weizsäcker. Founda-
tions of Science, 19, 241–255.
Lara, J. A., & Aljawarneh, S. (2019). Special issue on the foundations of software science and computation
structures. Foundations of Science. https://doi.org/10.1007/s10699-019-09587-7.
Lloyd, S. (2006). Programming the universe. New York: Knopf.
Maxwell, J. C. (1871). Theory of heat. London: Longmans, Green and Co.
Nyquist, H. (1924). Certain factors affecting telegraph speed. Bell System Technical Journal, 3, 324–346.
Nyquist, H. (1928). Certain topics in telegraph transmission theory. Transaction of the AIEE, 47(Apr),
617–644.
Prigogine, I., & Stengers, I. (1984). Order out of chaos: Man’s new dialogue with nature. New York: Ban-
tam Books.
Rayleigh, L. (1916). On convection currents in a horizontal layer of fluid, when the higher temperature is on
the under side. Philosophical Magazine and Journal of Science, Series 6, 32(192), 529–546.
Rifá, J., & Huguet, L. (1991). Comunicación digital. Teoría Matemática de la Información. Codificación
Algebraica. Criptología. Barcelona: Masson, S.A.
Rosenblueth, A., Wiener, N., & Bigelow, J. (1943). Behavior, purpose and teleology. Philosophy of Sci-
ence, 10, 18–24.
Scarrott, G. (1986). The need for a “science” of information. Journal of Information Technology, 1(2), 33–38.


Shannon, C. E. (1948). A mathematical theory of communication. The Bell System Technical Journal,
27, 379–423.
Stonier, T. (1990). Information and the internal structure of the universe. London: Springer.
Szilard, L. Z. (1929). Über die Entropieverminderung in einem Thermodynamischen System bei Eing-
riffen Intelligenter Wesen. Zeitschrift für Physik, 53, 840–856.
Toyabe, S., Sagawa, T., Ueda, M., Muneyuki, E., & Sano, M. (2010). Information heat engine: Convert-
ing information to energy by feedback control internet. Nature Physics, 6(12), 988–992.
Wang, Y. (2008). Software engineering foundations: A software science perspective. Boca Ratón: Taylor
& Francis Group, LLC.
Weinstein, S. (2003). Objectivity, information, and Maxwell’s Demon. Philosophy of Science, 70(5),
1245–1255.
Wheeler, J. A. (1990). Information, physics, quantum: The search for links. In Proceedings of the 3rd
international symposium on foundations of quantum mechanics in the light of technology. 1989 (pp.
354–368). Also in, W. H. Zure (Ed.), Complexity, entropy, and the physics of information. Addison-
Wesley. Reading, Mass (pp. 354–368).

Publisher’s Note  Springer Nature remains neutral with regard to jurisdictional claims in published maps and
institutional affiliations.

Cesareo Barreiro  is Professor and Research Scientist of the Nursing School at University of Vigo, Spain.
He is Clinical Psychologist and member of Organizational Structure of Integrated Management of Vigo
(EOXI). He holds a PhD in Computer Science from University of A Coruña, Spain. He is author of articles
and books published in the area of Cognitive Science and Clinical psychology. His research interests in
computer science include artificial intelligence (neuroscientific and computational methods) and Cognitive
Science (mind-brain problem).

Jose M. Barreiro  is Associate Professor and Research Scientist at Technical University of Madrid (UPM),
Spain. He is currently Member of Department of Artificial Intelligence and Member of the Group of
Research Biomedical Informatics (GIB). He holds a Ph.D. in Computer Science from Technical University
of Madrid, Spain. He is author of a significant number of articles published in international impact journals.
He has taken part in national and international research projects, and published some book chapters and
papers on several international conferences. His research interests in computer science include e-learning,
machine learning, artificial intelligence and biomedical informatics.

Juan A. Lara  is Associate Professor and Research Scientist at Madrid Open University, UDIMA, Spain. He
is currently Director of the Group of Research in Knowledge Management and Engineering. He is author of
more than five online education books. He holds a Ph.D. in Computer Science and two Post Graduate Mas-
ters in Information Technologies and Emerging Technologies to Develop Complex Software Systems from
Technical University of Madrid, Spain. He has published some book chapters and papers on several interna-
tional conferences, and taken part in national and international research projects. He is author of more than
25 papers published in international impact journals. His research interests in computer science include data
mining, knowledge discovery in databases, data fusion, artificial intelligence and e-learning.

David Lizcano  holds a Ph.D. in Computer Science from the UPM (2010), and a M.Sc. degree in Research
in Complex Software Development (2008) also from UPM. He held a research grant from the European
Social Fund under their Research Personnel Training program, the Extraordinary Graduation Prize for best
academic record UPM and the National Accenture Prize for the Best Final-Year Computing Project. He is
Vicerrector of R&D and Ph.D., Professor and Senior Researcher at the Madrid Open University (UDIMA).
He is currently involved in several national and European funded projects related to EUD, Web Engineer-
ing, Paradigms of Programming and HCI. He has published more than 35 papers in prestigious international
journals and attended more than 70 international conferences.

María Aurora Martínez  is an Associate Professor at the Madrid Open University (UDIMA), Spain. She
holds a Ph.D. in Computer Science from the University of A Coruña. She has worked on several projects in several
organizations. She has published a few books, chapters and papers on several journals and international con-
ferences. Her research interests in Computer Science include artificial intelligence, knowledge management
and e-learning.


Juan Pazos  is a member of the IEEE and the IEEE Computer Society. He received the first Spanish doctor-
ate in Computer Science from the Universidad Politécnica de Madrid, where he is currently Full Professor
at the Department of Artificial Intelligence. He set up the first Spanish Artificial Intelligence Laboratory and
was a visiting professor at Carnegie Mellon University and Sunderland University, among others. He has
been/is a member of the editorial board of the following journals: AI Magazine, Heuristics, Expert Systems
with Applications and Failure and Lessons Learned in Information Technology Management, among oth-
ers. He is author and co-author of 10 books on Computer Science and of over 100 publications. His current
research is on the construction of an Information Theory that integrates Computing Science, DNA and the
brain.
