Procedia Computer Science 28 (2014) 389 – 397

Conference on Systems Engineering Research (CSER 2014)

Eds.: Azad M. Madni, University of Southern California; Barry Boehm, University of Southern California;
Michael Sievers, Jet Propulsion Laboratory; Marilee Wheaton, The Aerospace Corporation
Redondo Beach, CA, March 21-22, 2014

Developing a Method to Evaluate Entropy in Organizational Systems
Héctor A. Martínez-Berumen a,b,*, Gabriela C. López-Torres a, Laura Romo-Rojas a
a Universidad Autónoma de Aguascalientes, Av. Universidad 940, Aguascalientes, 20131, México
b C3 Innovación y Desarrollo, A.C., Almena 106, Aguascalientes, 20020, México

Abstract

Entropy is an abstract concept. It refers to a measure of disorder or uncertainty in a system. Since its origin from the study of
thermodynamics, various scientific disciplines have applied this property in their respective fields of knowledge, which
represents a major challenge, since entropy is difficult to interpret, understand or visualize, as it lacks a direct interpretation or
physical measurement. This article discusses the origins of this concept, its implementation in different knowledge areas, and
describes the research currently conducted by the authors of this work to evaluate entropy as a measure of disorder in
organizational systems, which are conceived of as dissipative systems, i.e., systems that are able to remain in a state of
imbalance, challenging the inherent tendency towards an equilibrium state, which acts as an attractor of the system. While this
paper focuses mainly on the case of private enterprises, the concepts addressed in this work may be applied to any type of
organizational system.
© 2014 The Authors. Published by Elsevier B.V.
Selection and peer-review under responsibility of the University of Southern California.
Keywords: Entropy; Organizational Systems; Dissipative Systems; Complex Systems.

* Corresponding author. Tel.: +52 1 449-494-6453; fax: +52 449-914-3289. E-mail address: hamb@c3-id.com

doi:10.1016/j.procs.2014.03.048

1. Introduction

Since the foundations of systems thinking were established, the idea that different disciplines influence and
enrich each other has been pursued, "through the transfer of principles, concepts, methods, techniques and values"1
in the context of a single systems meta-theory. In this regard, Von Bertalanffy2, who created the "General Systems
Theory (GST)", proposed to: 1. investigate the isomorphism of concepts, laws and models in various fields, and propose transfers among them; 2. promote the development of appropriate theoretical models where they are missing; 3. reduce duplication of theoretical effort; and 4. promote the unity of science. Thus, according to his description,
General Systems Theory should become an integration mechanism between the natural and social sciences and a
basic tool for scientific training and preparation.
In this paper we propose to apply the concept of entropy, originated in the study of thermodynamic systems, as a
measure of disorder in organizational systems, which are conceived of as dissipative systems.
The use of the concept of entropy by other disciplines often gives rise to intense discussions, due to concerns
about the indiscriminate use of the term outside the context of physics. However, the investigation of the concept of
entropy in other disciplines responds to the fact that "there may be a general process of entropy and several specific
subtypes, which should be recognized and identified to distinguish the context to which it refers".3

1.1. Origin and meaning of the concept of entropy

The term entropy was coined in 1865 by Rudolf Clausius, who created this name from the Greek ἐντροπία, which
means evolution or transformation.4 Clausius developed the concept based on the formulation of the second law of
thermodynamics, which states that, without outside intervention, heat always flows from a warm body to a body
with lower temperature.5 This description coincides with the expression of the second law of thermodynamics
proposed by William Thomson (Lord Kelvin), whereby there cannot be a thermal machine which, through a cyclical
process, continuously and independently pumps energy from a heat reservoir, transforming it into work.6 Then,
while the first law of thermodynamics states that energy can be neither created nor destroyed but only transformed, the second law of thermodynamics states the direction in which this transformation occurs. It is important to note that, even though the second law of thermodynamics can be formulated in many different ways, all formulations lead to the concept of irreversibility of spontaneous processes, since the law establishes that a closed system always evolves towards its most probable macrostate, i.e., one in which its entropy is maximized7,8, which means that entropy acts as an attractor of the system.8 Clausius' contribution9 was to abstract the principle that states that
these processes are governed by the same law, which is expressed in a quantity that determines the direction of
events and that always changes in the same direction along the axis of time: entropy, usually depicted with the letter
S.
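For reference, the quantity Clausius introduced can be stated compactly: for a reversible exchange of heat dQ_rev at absolute temperature T, the entropy change is

S = ∫ dQ_rev / T,

and for an isolated system the second law takes the form ΔS ≥ 0, i.e., entropy never decreases spontaneously.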

1.2. Statistical Mechanics description of the concept of Entropy and its application in Information Theory

Based on the above rationale, Ludwig Boltzmann proposed in 1877 a statistical interpretation of the entropy
concept, linking it to the total number of microstates characterized macroscopically by a certain level of energy,
volume and number of particles. Boltzmann based his proposal on the atomistic description of matter, stating that
entropy would be determined by the number of states available to the system in question, so that entropy would
equal the logarithm of the total number of arrangements (states) of a system.10
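In modern notation, this statistical definition is usually written as Boltzmann's relation

S = k_B ln W,

where W is the total number of microstates compatible with the observed macrostate and k_B is Boltzmann's constant; the "logarithm of the total number of arrangements" mentioned above corresponds to this expression.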
The above concepts have been applied in the field of information theory, a branch of mathematical probability and statistics that studies information and everything related to it: channels, data compression, cryptography and related topics. Information theory was developed as a formal discipline from the work of Shannon11, turning a hitherto subjective concept into an accurate, objective and useful quantitative mathematical body. It introduces the concept of information as a measurable quantity, through an expression isomorphic to negative entropy (negentropy).12 Shannon11 concluded that the formula of information is exactly equal to the entropy formula, only with the sign changed, from which it follows that:

Information = -entropy, or
Information = negentropy

Norbert Wiener, the pioneer of cybernetics, also established the equivalence between information and the “level of organization” (or negative entropy) of a cybernetic system.13 Similarly, O’Connor14 notes that speaking of a system’s “organization” necessarily implies the existence of information, since information is a necessary condition for the definition and characterization of a system. In that sense, it is emphasized that entropy changes are intrinsically related to changes in structure, i.e., changes in the organization of the system.
Gell-Mann15 notes that “entropy and information are closely related. In fact, entropy can be regarded as a
measure of ignorance. When we only know that a system is in a certain macrostate, the entropy of that macrostate
measures the degree of ignorance of the microstate in which it is located”. Also, Ben-Naim4 notes that information
can always be associated with entropy, and supports Gell-Mann’s idea that entropy is synonymous with “lost data”.
When studying the effect that information has in different systems, some research lines suggest distinguishing
between different types of information. For example, Ryan16 suggests three ways in which information may be considered in different systems. Two of them refer to the system itself: structural information, referring to the macrostate of the system, and link information, which relates to the microstates of the system. The third type, functional information, is related to what happens in the system’s environment when the system operates.
J. G. Miller17 points out that the more complex the system (understanding complexity as the number of possible states that each element of the system may assume and the number of interactions among these elements), the more energy such systems invest to obtain, process, analyze, store and/or communicate information.
While information in this context usually describes the functional properties of many types of mechanisms, the
amount is much less important than its “power”, i.e. its ability to execute cybernetic control over matter or energy. It
is also important to consider that information becomes more relevant in open teleological systems, which act to
achieve their purposes.18 For such a system, the acquisition and application of useful knowledge is closely related to its self-determination and its future development, which is clearly the case of an organizational system.
Bailey3 states that thermodynamic entropy is defined only in terms of heat and temperature, and Boltzmann
entropy is defined in terms of the behavior of gas molecules, while entropy, according to information theory
proposed by Shannon, is a more open or generic measure, which means that it can be applied to any set of categories
for which information exists.
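Bailey's point that Shannon entropy applies to any set of categories for which information exists can be made concrete with a minimal sketch (in Python); the category labels and probabilities below are purely illustrative assumptions:

import math

def shannon_entropy(probabilities):
    # Shannon entropy S = -sum(p_i * ln(p_i)) in natural units (nats);
    # by convention, terms with p_i = 0 contribute nothing.
    return -sum(p * math.log(p) for p in probabilities if p > 0)

# Illustrative categories: hypothetical answers to an organizational survey question.
distribution = {"low": 0.2, "medium": 0.5, "high": 0.3}
S = shannon_entropy(distribution.values())
max_S = math.log(len(distribution))  # maximum entropy: all categories equally likely
print(f"S = {S:.3f} nats, maximum possible = {max_S:.3f} nats")

The same function applies unchanged to thermodynamic microstates, message symbols or, as in Section 3, organizational scenarios; only the probability distribution changes.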

2. Organizational Systems

As mentioned before, an organization is conceived of as a system, described as a combination of assembled and interconnected elements that form an organized set, have a defined goal, and are immersed in an environment with which they interact. Since organizational design and management integrates various factors and perspectives, both internal and external, including technical and technological, cultural, social, political and economic ones, it is possible to apply a systems approach, and systems engineering concepts, to study this kind of system.
There are various definitions of System. INCOSE19 defines a System as “a combination of interacting elements organized to achieve one or more stated purposes. An integrated set of elements, subsystems, or assemblies that accomplish a defined objective. These elements include products (hardware, software, firmware), processes, people, information, techniques, facilities, services, and other support elements”. Lara-Rosano20 proposes that a set of elements constitutes a system if it meets three conditions: 1. the elements are related; 2. the behavior of each element affects the behavior of the whole; and 3. the way in which the behavior of each element affects the behavior of the whole depends on at least one of the other elements. These definitions may be applied to an organizational
context.
Systems may be classified as open or closed. Closed systems are those in which there is no interaction between the system and its environment21 (the system is closed to certain types of transfers in or out of the system), while open systems interact strongly with their environment, as they obtain supplies from it, perform their functions and deliver their products to it. Open systems are extremely complex; this classification includes living beings, as well as economic, technological, social or organizational systems. If we also consider that these systems have one or more purposes, their complexity increases considerably.21
It is also possible to identify hierarchies in the structure of systems, so that a macro-system is composed of a
number of lower-level systems, which in turn are composed of certain subsystems in an even lower hierarchy. In this
line of thought, the universe is the highest-ranking system, which is generally regarded as a closed system.22

2.1. Organizations as Complex Systems

A complex system is a system composed of hierarchically organized, interacting components whose interrelations are non-linear and dynamic.23 These systems are extremely sensitive to initial conditions, so that very small changes in the causes can produce large differences in the effects, which means that “Complexity can lead to unexpected and unpredictable behavior of systems”.19 Because of this, one of the objectives in the
analysis and development of systems is to minimize such undesirable consequences.
According to Gershenson24, a system is self-organizing if its elements interact in such a way that the system's behavior is mainly a product of these interactions, and not of a single element or an external source. Also, according to Gershenson, it is important to consider that achieving perfect control of a self-organizing open system is an unattainable utopia.
Simon25 noted that the evolution of complex systems is faster if the system in question is composed of different
levels of hierarchically organized systems, instead of being composed solely of a set of interacting elements. That is,
hierarchies promote continuous evolution and adaptation of the system. Holland26 described adaptation as a condition that gives rise to a kind of complexity that greatly hinders our attempts to solve the most important problems facing our world today, where the consistency and persistence of a system depend on a large number of interactions, the aggregation of various elements, and adaptation or learning.
Simon25 is emphatic in stating that complex systems are everywhere: "in human administrative organizations,
corporate businesses, governments, universities, churches... all are excellent examples of complex systems". By
studying different types of complex systems, it was found that all of them, whether physical, chemical, biological, social or artificial, share common properties.25
Several authors27,28,29 have pointed out that organizations can be modeled as systems of information processing
agents, considering an agent as an entity that acts on its environment.24 It is important to note that the ultimate goal
when designing and analyzing complex artificial systems, including organizational systems, is not to remove
complexity, but to manage it in order to avoid its negative impact on the system's performance.

3. Organizational Systems’ Entropy

According to Testa and Kier30, a system can be described considering three aspects:

• First, the system has a structure (also called form), which can be formally described.
• A system exhibits behavior patterns called functions, characterized by their properties.

These two aspects constitute the static description of a system, which does not consider the time dimension, thus leading to the third aspect that describes a complex system:

• The form and function of a system are not static but change over time, fluctuating within a range of probability. This is described as a complex system fluctuation, a quality that was identified and described by Prigogine31.

The fluctuation of the form and function generates different formal and functional states, which tend towards a
probability space, determined by the set of all possible states of the system. This space is considered an attractor of
the system.30 This is important in order to analyze the possible states of the system and identify the attractors related to the system's performance.

3.1. Dissipative Systems and Entropy

It must be noted that the classical postulates of the second law of thermodynamics relate only to equilibrium
states. In this regard, Ilya Prigogine said: “150 years after its formulation, the second law of thermodynamics still
seems to be more of a program than a well-defined theory in the usual sense, since nothing precise (except the sign)
is said about entropy production. Even the range of validity of this inequality remains unspecified. This is a major
reason why applications were essentially limited to thermodynamic equilibrium processes”.31

Based on the classic principles of the second law of thermodynamics and its statistical description, and
considering the developments in information theory, Prigogine, along with others, developed the concept of
dissipative systems, based on the book by Erwin Schrödinger entitled “What is Life?”32, which presents a theory according to which the order of a macro-system would be determined by the disorder of its components, a principle Schrödinger called “order from disorder”. The book also presents the “Schrödinger paradox”, which states that, according to the laws of thermodynamics, the disorder of a closed system, such as the universe, must constantly increase; living beings, however, seem to maintain a state of increasing order, leading Schrödinger to conclude that life is an open system. According to Prigogine31, dissipative systems are able to remain in a state of imbalance, challenging the inherent tendency towards a state of thermodynamic equilibrium, determined by entropy, which acts as an attractor of the system. In thermodynamic terms, an open complex system, such as a human organism, is “a giant fluctuation stabilized by energy exchanges”.31
Prigogine states that such systems are “self-organized” and can evolve to higher levels of complexity, which
occurs when an open system is brought to a condition beyond equilibrium, creating linear discontinuities or
instabilities, which tilt the system to a level of greater complexity and greater structural stability.18 This coherent
structure then becomes a source of order.31 It is interesting to note that Prigogine presents “order” as an effect of the evolution of dissipative structures. Heylighen8 points out that, since these phenomena are
dynamic, it is convenient to replace the word “structure”, which has a static connotation, with the term
“organization”, which refers to the dynamic characteristic of the system.
Therefore, entropy involves the natural tendency of a system towards an equilibrium state. One way of stating the principle of the second law of thermodynamics is that “everything differentiated or heterogeneous tends to disappear when the effectors that produced it cease, i.e. it falls into the homogeneous or undifferentiated”20; in other words, the system moves from a state of imbalance (order) to an equilibrium state (disorder), the latter being an attractor of the system, which means that a steady state is more likely to occur than a state of imbalance. It is important to note that
the term “disorder” should not be used in a subjective sense, since a self-regulating complex system could develop a
structure that may seem “disordered” at first glance, but is really the way in which the system faces and adapts to the
demands imposed by its environment, which may not be apparent to the observer.
It has already been mentioned that certain self-regulating complex systems can avoid the effect of increasing
entropy. This is accomplished through the action of a number of effectors, which counteract the tendency exerted by
the attractor (entropy). In other words: If a system is isolated or cut off from all supplies of matter, energy and
information, its entropy will tend to a maximum.3 In that sense, Ebeling and Volkenstein33 define self-organization
as “the spontaneous formation of order in open systems, which export entropy”. Thus: the sustainability of a system
would be determined by its ability to decrease entropy.
Heylighen8 states that, for a dissipative system to be stable, it requires internal control elements that maintain
internal heterogeneities, eliminating or counteracting fluctuations or disturbances that could destroy them.
Importantly, these actions are deliberate and require use of energy and resources to ensure that the structure of the
system evolves towards higher order levels. When such effectors are canceled, the system “succumbs” to the effect
of entropy and its structure begins to degrade, to homogenize with its environment. The system tends to its most
likely state, which is contrary to the order established by the structure, i.e. disorder. In that sense, entropy can be used as an indicator of the sustainability of the system, understanding sustainability as the ability of an open system to survive and thrive even under changing conditions.

3.2. Assessment of Organizational Systems Entropy

The assessment of organizational entropy is done considering the capacity of the organization to stay in a
differentiated state, enabling it to ensure its sustainability in the long term.
In this work, we define "success" as the survival and sustained growth of an organizational system over time. We
have considered the concepts of Viable System Model (VSM) proposed by Beer34,35,36, which describes an
organizational structure model of a viable or autonomous system, organized so that it can meet the demands of its
environment and survive in a changing context. We selected this approach, as we believe that organizational success
can be managed only through a holistic and multidisciplinary approach, and, as has been recently reported: “the

VSM is the best available guide for structuring and managing a business enterprise for success in turbulent times”37, since “the VSM supports autonomy and adaptability”.38
To this end, and also considering the concepts proposed by Nohria et al.39, we have identified the following
factors for the success and sustainability of an organizational system:

• Innovation. Defined as “the implementation of new or significantly improved products (goods or services) or processes, new marketing methods, or new organizational methods in business practices, workplace organization or external relations”40.
• Talent. Capacity of the organization to attract, retain and develop staff with the skills and capabilities that enable the organization to achieve superior performance.
• Culture. Ability to develop and maintain a set of group norms of behavior and the underlying shared values, with the purpose of achieving higher organizational performance, in alignment with the organizational strategy.
• Leadership. Capacity of managers to create and foster the development of strong relationships between all levels of the organization, and to foresee opportunities or threats and act accordingly, not only at the managerial level, but throughout the organization.
• Structure. Degree to which the organizational arrangement or configuration reduces internal bureaucracy and simplifies work, enabling collaboration and information sharing across the organization.
• Information and internal communication. Practices and infrastructure that enable the information and communication functions and facilitate monitoring and control.
• Management and internal control. Structures and controls established with the aim of generating optimal performance and organizational synergy, by implementing rules, resources, rights and responsibilities. A related sub-function is to assess the information and interactions through audits.
• Competitive monitoring. Monitoring of the environment, in order to determine how the organization must adapt to remain viable in the long term.
• Execution. Functions that implement the key transformations, through which the organization creates value, ensuring compliance with and exceeding customer expectations, eliminating waste at all levels and continuously improving productivity.
• Strategy. Definition and clear communication, at all levels, of the strategic direction, goals and objectives and the actions needed to achieve them.
• Governance. Definition and implementation of policies that affect the entire organization, in order to balance present and future demands, internal and external, to moderate the interactions between organizational systems, and to lead the organization as a whole.

Each of the factors described is assessed considering the following criteria:

• Maturity. Extent to which each of the factors is developed and applied systematically.
• Organizational integration. Degree to which each factor is applied and shared across the organization and interacts with other organizational functions.
• Results orientation. Alignment of each factor with the overall performance of the organization, either by increasing the value offered by the organization, or by improving its general performance or productivity.

Each factor is evaluated against these criteria, as shown in Table 1:

Table 1. Determinants of organizational sustainability.

Determinants of sustainability              Maturity    Organizational integration    Results orientation
Innovation                                  (0 – 10)    (0 – 10)                      (0 – 10)
Talent                                      (0 – 10)    (0 – 10)                      (0 – 10)
Culture                                     (0 – 10)    (0 – 10)                      (0 – 10)
Leadership                                  (0 – 10)    (0 – 10)                      (0 – 10)
Structure                                   (0 – 10)    (0 – 10)                      (0 – 10)
Information and internal communication      (0 – 10)    (0 – 10)                      (0 – 10)
Management and internal control             (0 – 10)    (0 – 10)                      (0 – 10)
Competitive monitoring                      (0 – 10)    (0 – 10)                      (0 – 10)
Execution                                   (0 – 10)    (0 – 10)                      (0 – 10)
Strategy                                    (0 – 10)    (0 – 10)                      (0 – 10)
Governance                                  (0 – 10)    (0 – 10)                      (0 – 10)

It has been mentioned that, according to the description of statistical mechanics, the entropy of a system is determined by the number of "states" accessible to the system, and the probability of occurrence of each of those states. In order to assess organizational sustainability, these states are determined by the scenarios that may occur as a result of the evaluation of the distinct functions and processes. In principle, the number of scenarios can be infinite. However, Ashby41 points out that working with an infinite number of states involves a number of practical difficulties, so he suggests taking a finite number of states, considering that the differences between them are also finite.
We propose that the aforementioned states correspond to the probability that the organization presents a certain
level of performance, and that these probabilities are determined by the condition that the organization exhibits for
the factors listed in Table 1. Thus, the organizational system's entropy is an indicator of the risk faced by an
organization regarding its sustainability in the long term, since a higher entropy index implies less knowledge about
the status of the organizational system, which means a higher risk due to uncertainty regarding the variation of the
critical variables listed in Table 1.
The following method is proposed in order to assess the organizational system's entropy:

i. Determine (or identify) the organizational system to evaluate.


ii. Specify the number of “states” (K) accessible to the organizational system.42
- The number of states corresponds to the scenarios (organizational sustainability levels), which in principle can be infinite. It is recommended, however, to consider a finite number of scenarios, which describe the possible risk levels with respect to organizational sustainability in the long term. We propose to consider eleven scenarios, from Scenario 0, which implies a high risk, to Scenario 10, characterized by a low risk.
iii. Obtain the probability distribution for each of the possible states (risk levels) mentioned in (ii), considering the result of the evaluation of the factors listed in Table 1. This probability is obtained by calculating the cumulative probability of occurrence for each of the scenarios (pi) through the cumulative distribution function of the normal distribution (a computational sketch of steps ii-iv is given at the end of this section).
iv. Calculate the entropy (S) of the organizational system, according to the following formula proposed by Shannon11:
S = - Σ_{i=1}^{N} p_i ln(p_i)        (1)

v. The obtained result is interpreted according to the scheme proposed in Table 2:

Table 2. Interpretation of the Entropy value (S) of the Organizational System.43

Value of S | Interpretation | Recommendation
(3/4) ln K < S ≤ ln K | The organizational system has a high level of disorder. The results of the evaluation of organizational sustainability are highly variable. | Strengthen the lowest ranking functions.
(1/2) ln K < S ≤ (3/4) ln K | The organizational system is ordered. The results of the evaluation of organizational sustainability are mainly concentrated around a value. | Check the trend of “organizational sustainability”. Determine whether any of the factors should be strengthened.
0 < S ≤ (1/2) ln K | The organizational system is highly ordered. The results of the evaluation of organizational sustainability are grouped around a common point. | Check the data distribution. Determine whether the data are distributed around a high or low value.

In Table 2, K represents the number of accessible states and ln(K) represents the maximum entropy that the
system could have.41 As noted before, in this case K=11.
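As a quick numerical check of the bands in Table 2 with K = 11: ln 11 ≈ 2.398, so the boundaries are (1/2) ln 11 ≈ 1.199 and (3/4) ln 11 ≈ 1.798; an entropy value above roughly 1.8 therefore falls in the high-disorder band, while a value below roughly 1.2 indicates a highly ordered system.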
As mentioned before, if the organizational system has high entropy, all scenarios have nearly the same probability of occurrence, i.e. there is not enough information to determine the level of risk that the organizational system assumes at any given time. Conversely, a low entropy level means that the probability distribution gives more information about the state of the system; in this case, one scenario has a higher probability of occurrence than the rest. Thus, the entropy of the organizational system can be employed to assess the risk faced by
an organization, regarding its sustainability in the long term.
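A minimal computational sketch of steps ii-v (in Python) is shown below. Since the method above does not fully specify how the Table 1 scores are aggregated or how the normal distribution is parameterized, the aggregation rule, the distribution parameters and the function names used here are illustrative assumptions only:

import math
from statistics import NormalDist, mean, stdev

def scenario_probabilities(scores, k=11):
    # Step iii (assumed interpretation): fit a normal distribution to the factor
    # scores (0-10 scale) and assign each of the K scenarios the probability mass
    # of its unit interval on that scale.
    dist = NormalDist(mu=mean(scores), sigma=max(stdev(scores), 1e-6))
    edges = [i - 0.5 for i in range(k + 1)]   # interval edges around scenarios 0..K-1
    probs = [dist.cdf(edges[i + 1]) - dist.cdf(edges[i]) for i in range(k)]
    total = sum(probs)
    return [p / total for p in probs]         # renormalize so the probabilities sum to 1

def organizational_entropy(scores, k=11):
    # Step iv: Shannon entropy S = -sum(p_i ln p_i) over the K scenarios (equation 1).
    return -sum(p * math.log(p) for p in scenario_probabilities(scores, k) if p > 0)

def interpret(s, k=11):
    # Step v: map S onto the bands of Table 2.
    max_s = math.log(k)
    if s <= 0.5 * max_s:
        return "highly ordered: check whether results cluster around a high or low value"
    if s <= 0.75 * max_s:
        return "ordered: check the trend and strengthen factors if needed"
    return "high disorder: strengthen the lowest-ranking functions"

# Illustrative (hypothetical) scores, one 0-10 value per determinant of Table 1.
scores = [7, 6, 8, 5, 6, 7, 6, 5, 7, 8, 6]
s = organizational_entropy(scores)
print(f"S = {s:.3f} (max = {math.log(11):.3f}) -> {interpret(s)}")

Under these assumptions, a tight cluster of factor scores yields a narrow distribution over the scenarios and therefore a low S, while widely dispersed scores spread the probability mass over many scenarios and push S towards ln 11.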

4. Conclusions and Future Work

As we have mentioned in this paper, open systems, such as organizational systems, can consistently lower their entropy. The concept of entropy is by itself an indicator of the state of the system, which is related to the state differences (heterogeneity) in the system. Thus, just as the amount of information is a measure of organization, entropy is a measure of a system's disorganization. In this regard, entropy can be employed to assess the sustainability of
open systems, including organizational systems. If the level of disorder (which, as mentioned before, is determined
by the homogeneity of the system, and therefore by a low level of information about its state) is high, then the
system lacks sustainability. Conversely, if the entropy is low (determined by the heterogeneity of the system and
hence a high level of information about the state of the system) the system has greater sustainability. If entropy
increases, the future sustainability of the system is at risk.
A sustainable system must, by definition, ensure that its level of entropy does not increase to maximum levels, since maximum entropy (characterized by absolute homogeneity) means the death of the system. Open systems can develop complex dissipative structures in order to achieve a state of dynamic equilibrium and adapt to the demands imposed by their environment. Such structures impose order on the system. When the effectors that produce the dynamic stability of the system are canceled, the system “succumbs” to the effect of entropy and its structure begins to degrade and homogenize with its environment. So, in order to be sustainable, the system must have subsystems that
ensure that the system has information to maintain its entropy at controlled levels. In the case of an organization, the
action of these sub-systems is reflected in the status of the determinants of organizational sustainability proposed in
this paper.
As mentioned before, this project is currently under development. This paper is primarily concerned with laying
the foundation for the described approach. The assessment methods presented in this paper will be fully developed, in order to apply them in a case study and demonstrate how the measurement will be carried out, and what impact the measured information could have on the operation of the organizational system. The effectiveness of the proposed
approach will be assessed as part of the ongoing project and will be reported in future works.

Acknowledgements

The authors would like to thank the support received from CONACYT and from the Autonomous University of
Aguascalientes (UAA) for the realization of this project.

References

1. Ruiz, Rosaura, Martínez, Rina and Valladares, Liliana. (2010) Innovación en la educación superior. FCE-UNAM. México. 212 p.
2. Von Bertalanffy, Ludwig, (1976) Teoría General de Sistemas; México, Fondo de Cultura Económica.
3. Bailey, Kenneth, 2001, Entropy Systems Theory, in Systems Science and Cybernetics, in Encyclopedia of Life Support Systems (EOLSS), Developed under the Auspices of UNESCO, Eolss Publishers, Oxford, UK, [http://www.eolss.net]
4. Ben-Naim, Arieh. (2008). Entropy Demystified. World Scientific, Singapore, 223 p.
5. Clausius, R. (1850). "Über die bewegende Kraft der Wärme". Annalen der Physik und Chemie. 155:3 pp.368-397.
6. Thomson, W. (1851) "On the dynamical theory of heat; with numerical results deduced from Mr. Joule's equivalent of a thermal unit and M.
Regnault's observations on steam" Math. and Phys. Papers vol.1, pp 175–183.
7. Bazua Rueda, E. (1992). Bases termodinámicas para el uso eficiente de energía. Universidad Nacional Autónoma de México.
8. Heylighen F. (1990): Representation and Change. A Metarepresentational Framework for the Foundations of Physical and Cognitive Science.
9. Clausius, R., (1865). Ueber verschieden fuer die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Waermetheorie.
Annalen der Physik und Chemie. 200:3. pp. 353-400.
10. Boltzmann, L., (1877). Bermerkungen über einige Probleme der mechanische Wärmetheorie, Wiener Berichte, 75: 62–100.
11. Shannon, C.E. (1948). A Mathematical Theory of Communication. The Bell System Technical Journal, Vol. 27, pp. 379–423, 623–656.
12. Johansen, Oscar, Introducción a la teoría general de sistemas, México, Ed. Limusa-Noriega, 2002, p. 30
13. Wiener, N. (1948). Cybernetics or Control and Communication in the Animal and the Machine. MIT Press, USA.
14. O'Connor, M., (1991). Entropy structure and organisational change. Ecological Economics, 3, 95-122
15. Gell-Mann, M. (1995). El quark y el jaguar. Ed. Océano/Tusquets. Spain.
16. Ryan, J.P.F.J., (1972) Information, Entropy and Various Systems. J. theor. Biol. 36, 139-146.
17. Miller, J.G. (1978). Living Systems. McGraw-Hill, USA. 1102 p.
18. Corning, P.A. (1995). Synergy and self organization in the evolution of complex systems. Systems Research 12(2):89-121.
19. INCOSE (International Council on Systems Engineering), 2011. Systems Engineering Handbook v. 3.2.2, October 2011
20. Lara Rosano, Felipe, (2002), Cibernética y sistemas cognitivos. In “Ingeniería de Sistemas”, Alfaomega-UNAM, México. pp. 44-70.
21. Moreno Bonett, Alberto, (2002), El enfoque sistémico y la ingeniería de sistemas. In: “Ingeniería de Sistemas: un enfoque interdisciplinario”,
Acosta Flores, Jesús (coordinador), Alfaomega-UNAM, México. pp. 1-28.
22. Hawking, Stephen, (2011). “El universo en una cáscara de nuez”. Ed. Drakontos. Spain. 246 p.
23. Lara Rosano, Felipe, (2009). Apuntes de clase de Dinámica de Sistemas Complejos. Universidad Nacional Autónoma de México.
24. Gershenson, Carlos, (2007) Design and Control of Self-organizing Systems, PhD Dissertation, Vrije Universiteit Brussel, Belgium, 2007.
25. Simon, H.A. The organization of complex systems. In: “Hierarchy Theory. The Challenge of Complex Systems”; Pattee, H.H., Ed.; Braziller:
New York, 1973; pp. 1-27.
26. Holland, John H, “El orden oculto”, Fondo de Cultura Económica, México, 2004.
27. Radner, R. (1993). The organization of decentralized information processing. Econometrica 61 (5) (September): 1109–1146.
28. Van Zandt, T. (1998). Organizations that process information with an endogenous number of agents. In Organizations with Incomplete
Information, M. Majumdar, (Ed.). Cambridge University Press, Cambridge, Chapter 7, pp. 239–305.
29. Decanio, S. J. and Watkins, W. E. (1998). Information processing and organizational structure. Journal of Economic Behavior and
Organization 36 (3): 275–294.
30. Testa B., Kier L.B. (2000). Emergence and Dissolvence in the Self-organisation of Complex Systems. Entropy. 2000; 2(1):1-25
31. Prigogine, I. (1978). Time, structure and fluctuations. Science, 201, pp. 777-84.
32. Schrödinger, Erwin. (1944). “What is Life?”. Ed. Macmillan.
33. Ebeling, Werner and Volkenstein, Michail. (1990). Entropy and the evolution of biological information. Physica A 163, 398-402.
34. Beer, S. (1972). Brain of the Firm; Allen Lane, The Penguin Press, London, Herder and Herder, USA.
35. Beer, S. (1979). The Heart of Enterprise; John Wiley, London and New York. (Discussion of VSM applied)
36. Beer, S. (1985). Diagnosing the System for Organizations; John Wiley, London and New York.
37. Christopher, W.F. (2011). A new management for enduring company success. Kybernetes, 40, 3/4, pp. 369-393.
38. Hoverstadt, P. (2010). The Viable System Model. In “Systems Approaches to Managing Change: A Practical Guide”; Springer, U.K.
39. Nohria, N., Joyce, W., and Roberson, B. (2003). What Really Works. Harvard Business Review. 81(7).
40. OECD - Eurostat, 2005. Oslo Manual. European Communities.
41. Ashby, W. R., (1956). An Introduction to Cybernetics, (Chapman & Hall, London).
42. García-Colín, Leopoldo. (1989) El concepto de entropía. Cuadernos del seminario de problemas científicos y filosóficos. UNAM. México.
43. Martínez-Berumen, Héctor A. (2012). La Planeación Estratégica-Tecnológica para la toma de decisiones en nuevas tecnologías en Centros
Publicos de Investigación y Desarrollo: Una propuesta metodológica con Enfoque de Sistemas. Unpublished Doctoral Thesis. Universidad
Nacional Autónoma de México (UNAM). México.
