
Systems Practice, Vol. 7, No. 6, 1994

Research Note

Some Whys and Wherefores in the Failure of Social Systems Design

Wojciech W. Gasparski¹
Mankind has eventually gone right after trying all possible ways of
going wrong; the wrong-doings being habitually checked by disaster
and pain, and death.
H. Spencer (1880)

1. INTRODUCTION
A couple of years ago, when I received an invitation to contribute to a conference
on "Strategies of Socio-Economical Changes in Poland," organized for the first
time since the previous political regime had been ousted, I thought it would be
useful to review some of the main causes of social systems failures.²
The intention of the accompanying study (Gasparski, 1992a) was motivated
not only, and not predominantly, by the willingness to offer an explanation of
why the old system collapsed, but to forewarn planners and designers of a new
social system--or systems, rather, because of pluralism reintroduced in the
country--of the Scyllas and Charybdas they would face while drawing maps for
navigating the system (systems) toward the future.
The results achieved appeared to be of general rather than local importance,
probably because of the praxiological-systemic background of the study. This
encouraged me to enlarge the scope of the study³ and present the outcome of
the research to the broader audience of the "Addressing Global Issues" UKSS
International Conference.⁴
This paper is composed of the following parts: (2) Traditional Thinking,
(3) Assumptions and Presuppositions, (4) What Makes the Application of
Scientific Knowledge So Difficult?, (5) The Methodological Efficacy: A
Metaphor, and (6) Some of the Traps.
¹Learned Society of Praxiology, Nowy Swiat 72, 00-330 Warsaw, Poland.


²For the proceedings of the conference, see Kubin, J., and Zekoński, Z. (eds.), In Search for a
Strategy of Change, Polish Academy of Sciences, Warsaw, 1992.
³It was possible thanks to the stay at the City University, London, UK, granted by the Commission
of the European Community under the name "Praxiological-Systemic Approach to Practical
Problem Solving in Design and Management," No. ERB3510PL21223.
⁴Thanks to the kind invitation extended by Professor Frank A. Stowell of the University of Paisley,
President of the UKSS and Chairman of the Conference.






2. TRADITIONAL THINKING
"In contrast to most related endeavors," writes John Dupr6 (1993, p. 1)
in his recently published book The Disorder of Things, " I shall draw primarily
not on physics, but on biology. Biology is surely the science that addresses
much of what is of greatest concern to us biological beings, and if it cannot
serve as a paradigm for science, then science is a far less interesting undertaking
than is generally supposed."
Let us--the followers of the biologist Ludwig von Bertalanffy's idea of
general systems--use the above quotation as a motto to this paper on the fatal
consequences of traditional thinking dominating beyond the scope of its relevance: so much so that "the impact of systems thinking has seemed to be on the
wane," as was noticed, not without sorrow, by one of the most successful
systems scientists of this decade (Jackson, 1991, p. vii).
Why is it so? Why, despite the intensive propagation of multifaceted "systems
thinking, systems practice," is the real practice of social systems still
dominated by traditional thinking? Although social systems, also called systems
of human action, have been designed for a long time, technical/engineering
design is continuously used as a paragon of good design. The situation is almost
analogous to the situation in the philosophy of science,⁵ which explores physics
pars pro toto as the best-grounded science.
"The success of this mode of thought and its paradigms," writes Harold
A. Linstone, "has led very naturally to increasing pressure to extend its use
beyond science and technology, i.e., [to] society and all its systems. The attitude
is typified by the planning, programming, and budgeting drive in the 1960s and
the popularity of econometric models. Why, then, is there deep trouble in relying
on it in ill-structured areas . . . ?" (Linstone, 1981, p. 278). Here is the answer
offered by the same author given in a recently published article: "Science and
technology represent the most successful 'religion' of modern times. From Gal-
ileo to the Apollo lunar landing, from Darwin to recombinant DNA, its methods
have yielded dazzling triumphs and they also serve as paradigms for systems
analysis" (Linstone, 1989, p. 309).
The characteristic features of traditional, "technocratic" thinking are set
down after Linstone in Table I. The collection of characteristics relates to what
was named "the new utopia" by Robert Boguslaw (1965) some time ago. This

⁵We understand science here more broadly than limited to the so-called exact disciplines; our
understanding is as broad as German Wissenschaft or Polish nauka.

Table I. The Features of Technocratic Thinking

| Name | Characteristics | Consequences |
|---|---|---|
| All problems are solvable | It is assumed that any problem, like those from school textbooks, can be solved | Lack of knowledge that each new solution provided by technology creates, and even accelerates, the emergence of new problems |
| Optimization | Tendency to search for the best solution | Ignoring that complex living systems seek to diminish the cost of failure and strive to increase their options, i.e., the degree of freedom to make a choice |
| Reductionism | Atomization of a system to such a degree that one is able to use techniques he/she is familiar with | Ignoring that history is strewn with events which had a powerful impact but were calculated (reduced) to have a very low probability (because of lack of relevant knowledge) |
| Data and models | Reliance on averaged data, schematic outlines, and their combinations | Modeling becomes an end rather than a means, so that dedicated modelers are ready to fall in love with their models (the Pygmalion complex) |
| Quantification | The belief that everything should be expressed in numbers | Measure what can be easily measured; disregard that which can't be measured or give it an arbitrary quantitative value; presume that what can't be measured isn't very important or really doesn't exist |
| Objectivity | Assumption that scientists are unbiased observers outside of the systems under investigation | Ignoring that an observer's faculty of describing enters his/her descriptions |
| Depersonalization | Tendency to avoid the individual | Scientists, technocrats, bureaucrats, etc., tend to deal not with a person but with an aggregate, types, averages, etc. |
| Time linearization | The belief that time is moving linearly and is independent of society and individuals | Ignoring different time perceptions by different individuals and in different cultures and circumstances |

utopia was and still is--using a contemporary catchword--a kind of a quasi
"virtual reality" set up to produce images of nonexistent social systems. But
adapting Plato's (The Republic) metaphor, "We need to be aware that, in our
modern electronic networked caves, the flickering shadows are not just projections from the outside world--they are also projections, fictions, from an inside
world" (Mallen, 1993, p. 273).
Linstone proposed the concept of multiple perspectives (MP) as a remedy
to overcome domination of the technical perspective, for MP also includes two
other perspectives: the personal perspectives o f the individuals involved and the
organizational perspective o f the social system in question. The approach using
three (types of) perspectives (in fact more than three, because of the many
individuals and more than one organization) enables us to take into account
individual and social values and to introduce design participation of the actors
and agents affected.
Another proposal, even earlier than Linstone's, for coping with social systems
design in a better way was Kilmann's (1977) multivariate analysis, participation,
and structure (MAPS). Although this 12-step design technology was
welcomed by Chris Argyris (who wrote a Foreword to the Ralph H. Kilmann
book), he raised a question:
What are the limits of designing organizations based on people's preferences? My
concern ranges from how realistic is this to the implicit assumption that individuals
may choose their preference over organizational health. Also, what if people change
their choices but these cannot be altered? Will that not bring us back to the original
problem of organizations that Kilmann is trying to overcome, namely their apparent
unchangeability and unilateral control?
Ralph Kilmann is aware of these questions and he continues the program of
research necessary to illuminate them. In the meantime we must not forget that people
are frequently making choices related to preferences. Since such choices are at best
quasi-legal, they are usually camouflaged. One consequence is that they become
undiscussable. Undiscussable choices are not consciously and planfully alterable. At
least with Kilmann's methodology, preferences may become discussable and given
a methodology that makes them manageable, they may be alterable. [Argyris; in
Kilmann (1977, p. xv)]

If it is really so, then Boguslaw's doubts would be reduced dramatically,
but not removed. They were not eliminated even by DeGreene (1973),
the author of a comprehensive book on sociotechnical systems.
Nothing is better to conclude this section than a (paraphrased) statement
by the author of the Treatise on Basic Philosophy (Bunge, 1985, p. 290): The
hope of scientific management is a combination of technocracy with democracy;
sociotechnology without democracy can only help support tyranny; and democracy without sociotechnology is blind, hence inefficient, ergo fragile.

3. ASSUMPTIONS AND PRESUPPOSITIONS

Solutions to problems related to large-scale systems--and social systems
are large-scale systems indeed--for which plans and strategies for change are
formulated, are burdened by underconceptualization (Warfield, 1990, p. 213).

The solutions are based not only on consciously and openly accepted assump-
tions, i.e., suppositions, but also on presuppositions, i.e., hidden assumptions.
Presuppositions are imprinted in problem-solvers' unconscious; they enter the
intellectual creative repertoire as a result of informal and formal education.
Presuppositions are potentially more dangerous than openly accepted
assumptions. This is because a designer does not give an account of them, either
to others or to himself. The more invasive the systems under design, the more
dangerous the presuppositions are. And systems such as education, military
systems, telecommunication, etc., i.e., systems imposed upon people, com-
munities, and societies, are really invasive systems. Most social strategies
and plans are of such a nature.
The question of presuppositions is related not only to designers of invasive
systems, but also to other specialists engaged in their creation, development,
testing, accepting, implementing, and managing. All those specialists behave in
ways determined by the presuppositions acquired through the social and school
teaching/learning processes designed for them by their educators, as well as
through the continuous pressure of the systems of punishments and rewards they
have experienced.
The importance of the subconscious for human behavior is well known to
experts in advertising, election campaigns, etc. This is why in some countries
broadcasts, especially on TV, that act upon viewers' subconscious are
considered illegal.
It is the presuppositions that make correcting errors so difficult. This is,
as nicely expressed by Chris Argyris (1982), because (1) people do not
know how to do it, (2) they do not even know that they don't know how
to do it, and, finally, (3) they do not know that there are programs encoded in
their minds making it impossible to learn how to do it.

4. WHAT MAKES THE APPLICATION OF SCIENTIFIC KNOWLEDGE SO DIFFICULT?
"The belief that better use should be made of scientific information in
public policy formation appears to be widely shared by policy makers, scientists,
and the public alike. The pervasiveness of this belief is apparent from such
diverse evidence as the establishment of the Office of Technology Assessment
by [the U.S.] Congress . . . . Results of even the most dedicated attempts to
improve such use . . . are widely regarded as having been disappointing . . . ."
(Hammond, 1983, p. 287). Let us ask, in connection with the above, where on
Earth is or should be an office to assess social systems plans and designs?
According to Hammond and his colleagues, obstacles to improving the use
of scientific information in any public policy process are of a fundamental
character. The constraints are (a) situational (situational context of designing
policy-making processes), (b) cognitive (cognitive limitations of policy-makers), and
(c) scientific (the nature of scientific information).
The situational difficulties are as follows.
• Ambiguity with regard to scientific information and the relevant social
values upon which the assessments and policy decisions are based makes
it difficult to analyze them explicitly.
• Scientific information is probabilistic, while policy decisions require
discrete choices among mutually exclusive alternatives.
• The political context of policy decisions provides disincentives to use
scientifically based assessments in a "disinterested" manner, without
regard to the decision makers' own careers.
The cognitive difficulties are as follows.
• The use of scientific information is not analytical, which makes the
process of policy formation implicit and, then, difficult to criticize and
improve.
• Methods of presenting scientific information are person-dependent,
creating an attitude that "victory is more important than truth."
• The volume of scientific information is much greater than the human
abilities to understand and use it effectively.
The scientific difficulties are as follows.
• Experimentation is required to provide definitive answers to
policy-relevant questions, but the critical experiments cannot be conducted.
• Since the experiments cannot be conducted, attempts must be made to
generalize the available scientific information obtained in different
settings, which is far from what would be perfectly reliable.
• A consequence of the incompleteness of the process of producing
scientific information is that the information is uncertain.

5. THE METHODOLOGICAL EFFICACY: A METAPHOR
Let me recall, as a metaphor of methodological efficacy of solving practical
problems, the one I presented some time ago (Gasparski, 1986).
The basic structure of a process of solving practical problems, e.g., of
designing a plan or a strategy for social change, is shown in Fig. 1.
Suppose, for the sake of simplicity, that the process of finding a design
solution is composed of three procedures:
(1) inquiring, i.e., gathering knowledge about the existing practical
situation (i.e., knowledge on facts and values comprising the situation);

[Fig. 1. The structure of a practical problem-solving process: Enquiring leads from the PRACTICAL SITUATION (Facts F / Values V) to THEORY; Concretization leads from THEORY to DESIGN; Implementation leads from DESIGN to the SITUATION CHANGED (Facts F' / Values V').]

(2) solving the problem on the basis of background (theoretical) knowledge, and
(3) implementing, i.e., introducing the formulated design to change the
given situation.

All of the above procedures are performed in one or another way (usually many
specific techniques are used).
Let us assume that a special measure Em, called "methodological efficacy,"
is ascribed to each of the ways (methods) and that the overall efficacy is equal
to the product of the methodological efficacies of the procedures:

Em = Em1 × Em2 × Em3

In the most optimistic case, the methodological efficacy of all three
procedures is very high, say 90%; therefore,

Em = 0.9³ = 0.729

only. If the methodological efficacies of the procedures differ from each other
by only one-tenth, i.e., 0.9, 0.8, and 0.7, respectively, then overall,

Em = 0.9 × 0.8 × 0.7 = 0.504

which is but a little over 50%; and if the methodological efficacy is equal to
0.5 for all the procedures, then overall,

Em = 0.5³ = 0.125

only.
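To see the arithmetic of the metaphor at a glance, here is a minimal sketch in Python (the language and the function name are mine, purely for illustration; the efficacies are the figures used above):

```python
from functools import reduce

def overall_efficacy(procedure_efficacies):
    """Overall methodological efficacy as the product of the
    efficacies of the individual procedures: Em = Em1 * Em2 * Em3."""
    return reduce(lambda acc, e: acc * e, procedure_efficacies, 1.0)

# The three procedures: inquiring, solving, implementing.
print(overall_efficacy([0.9, 0.9, 0.9]))  # ~0.729, the optimistic case
print(overall_efficacy([0.9, 0.8, 0.7]))  # ~0.504, a little over one-half
print(overall_efficacy([0.5, 0.5, 0.5]))  # 0.125, one-eighth only
```

The product of several sub-unity factors falls quickly, which is the whole point: even uniformly "good" procedures compose into a mediocre whole.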
This metaphorical calculation could explain the causes of many unsuccessful
design undertakings, not just social ones.

6. SOME OF THE TRAPS

Life is brutal and full of traps, as an old proverb teaches us. The traps are
responsible for systems failures, so let us review the most frequent and important
ones.
What the authors (Bignell and Fortune, 1984) of a comprehensive book,
Understanding Systems Failures, say is true and important. It comes after exam-
ining a good collection of real systems (like the Titanic, the Three Mile Island
nuclear power station, the Humber Bridge project, etc.) and their failures. Their
message is that what is really needed to understand the failures is a
praxiological-systemic analysis of the failures in question. Actually they do not use the
term "praxiological" while calling for a "double E" evaluation of systems,
which--the evaluation--is peculiar to praxiological analysis (see Gasparski,
1987). But, as I wrote in another place, "Praxiology is like the prose of
Monsieur Jourdain, the hero of Molière's play: everybody uses it but only a few of
us know what we speak, write, or do" (Gasparski, 1992b, p. 3). And this is
just the first trap responsible for failures, the so-called practical failures, in social
systems. It is a lack of knowledge of the logic (praxiologic) and values (prax-
iology) behind practical activity.
The authors of the above book write, "Perhaps one of the most useful
ways of describing failure using systems terminology is as the production of
undesirable outputs of the system. Like other systems 'insights,' this description
has greater depth and significance than may at first be apparent. There are two
major ways to measure the performance of a system by using the system's
outputs. One way is to compare the inputs to the system with its outputs and
thus to measure its efficiency; the other is to compare the outputs with the
objectives of the system and thus to measure its effectiveness" (Bignell and
Fortune, 1984, p. 161). And this is just what "the science of efficient action,"
i.e., praxiology is about (Kotarbinski, 1965; Gasparski and Pszczolowski, 1983).
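The "double E" evaluation can be stated compactly in code. The following sketch, again in Python, is a hypothetical illustration only: the function names, the toy system, and the particular ratio definitions are my assumptions, not Bignell and Fortune's notation.

```python
def efficiency(inputs, outputs):
    """First E: compare the inputs to the system with its outputs
    (here, naively, as a ratio of their totals)."""
    return sum(outputs.values()) / sum(inputs.values())

def effectiveness(outputs, objectives):
    """Second E: compare the outputs with the objectives of the
    system -- the fraction of stated objectives met by some output."""
    met = sum(1 for goal, target in objectives.items()
              if outputs.get(goal, 0.0) >= target)
    return met / len(objectives)

# A toy system: a training programme. Note that "dropouts" is an
# undesirable output; an evaluation that looked at "graduates" alone
# would fall into the trap of watching only the intended outputs.
inputs = {"budget_units": 100.0, "staff_hour_units": 50.0}
outputs = {"graduates": 120.0, "dropouts": 30.0}
objectives = {"graduates": 100.0}

print(efficiency(inputs, outputs))         # 1.0
print(effectiveness(outputs, objectives))  # 1.0
```

The sketch only makes the distinction tangible: a system may be efficient (much output per unit of input) yet ineffective (outputs that miss the objectives), and vice versa.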
The quoted authors point out that looking at all of the outputs of the whole
system (desirable and undesirable), as well as for reasons for the absence of
some outputs that should be present, and vice versa, helps to avoid "falling into
the trap of concentrating solely upon those outputs that the system was, on the face
of it, or in the eyes of its designers, intended to produce. The trap--the authors
continue--can be even more seductive when changes are taking place over a
considerable period of time" (Bignell and Fortune, 1984, p. 163).
Other traps, this time of a paired character, are generally known under the
names of error of omission and error of commission. The essence of the errors
consists in not taking into account what ought to be included and including what
should not have been taken into account at all. The pairs are close to what has
already been mentioned after Argyris, i.e., that choices related to preferences
are usually camouflaged. Let us add that, in many cases, they are camouflaged

with the assistance of misused figures (Huff, 1991), and not only on an individual
or small-scale system but also on the scale of large and invasive systems.
" E c o n o m i c s has developed very sophisticated econometric methods for
handling aggregate data that, unfortunately, are usually extremely n o i s y , " says
Herbert A. Simon (1992, p. 41). "Econometrics on the whole has been a rather
inappropriate technology for economics. It has relied far too much on Newton-
ian-type models, searching for stable parameters that do not really exist . . . . "
added the late Kenneth E. Boulding (1992, p. 57). And finally, Donald
McCloskey concludes the story with full openness:
. . . Otto Eckstein, a superb economist with much common sense . . . built Data
Resources, Inc., into a company with revenues in 1984 of $84 million. But Data
Resources did not use its own predictions of prices and interest rates to speculate. It
sold them to others. [They] become rich by selling advice, in the form of models
and statistical equations and other charming talks, not by using it. . . . Forecasting
is very difficult, especially if it is about the future; an economist is an expert who
can tell you tomorrow why the things he predicted yesterday didn't happen today;
the best I can hope in a forecast is to be intelligently wrong or fortunately right.
[McCloskey (1992)]

7. CONCLUSION
The lesson for scientists in general, and for systems scientists in particular,
that follows from the above is that overestimating the potential of one's science
is a trap: in the best case it may affect only the reputation of a scientist and/or
of a science; at worst it may cause a disaster in the outside world.
Systems science, like any science, is done for better understanding and not
for the immediate improvement of a course of action. Are economists richer than
other people? ("If you're so smart why ain't you rich?" asks McCloskey.) Are
medical doctors healthier than the others? ("If you're so smart why ain't you
better at 'X'?" one may paraphrase the American question.) Sciences offer
intellectual tools that may be used or misused even by their originators. Then
whether a tool for better understanding would really contribute to better
action depends not only on the tools but also, and predominantly, on the whole
system of action--the social system--in question.
If the above message is painful to us systems scientists, it would be good
to keep in mind not only the motto by Herbert Spencer at the beginning of this
paper,⁶ but also what was said over a century ago by Charles Arthur Mercier
(1888, pp. 364-365), one of the very first praxiologists in the United Kingdom:
The significance of pain is that it is the equivalent in consciousness of some action
which is tending to destroy the organism, and it is difficult to evade the impression
that its origin and persistence are due to its giving a readier and more effectual warning
to the whole organism of destructive agents affecting any part than any material

⁶Quoted after Wiltshire (1978, p. 199).



process could. Until it had attained intelligence enough to foresee the effects of acts,
no organism would make any effort to avoid disintegration, if disintegration were
unattended with pain.

REFERENCES
Argyris, C. (1982). Reasoning, Learning, and Action, Jossey-Bass, San Francisco.
Bignell, V., and Fortune, J. (1984). Understanding Systems Failures, Manchester University Press,
Manchester.
Boguslaw, R. (1965). The New Utopians: A Study of System Design and Social Change, Prentice
Hall, Englewood Cliffs, NJ.
Boulding, K. E. (1992). Appropriate methodologies for the study of the economy. In Auspitz, J. L., Gasparski, W. W., et al. (eds.), Praxiologies and the Philosophy of Economics, Transaction, New Brunswick, NJ, pp. 43-59.
Bunge, M. (1985). Treatise on Basic Philosophy, Vol. 7 (Philosophy of Science and Technology,
Part II), D. Reidel, Dordrecht.
DeGreene, K. B. (1973). Sociotechnical Systems: Factors in Analysis, Design, and Management,
Prentice-Hall, Englewood Cliffs, NJ.
Dupré, J. (1993). The Disorder of Things: Metaphysical Foundations of the Disunity of Science,
Harvard University Press, Cambridge, MA.
Gasparski, W. (1986). What does it mean to be an expert? In Trappl, R. (ed.), Cybernetics and Systems '88, Vienna, pp. 195-201.
Gasparski, W. (1987). Praxiology. In Systems & Control Encyclopedia, Pergamon Press, Oxford.
Gasparski, W. (1992a). Strategies and plans--The causes of failures. In Kubin, J., and Zekoński, Z. (eds.), In Search for a Strategy of Change, Polish Academy of Sciences, Warsaw (in Polish).
Gasparski, W. (1992b). The prose of action. In Auspitz, J. L., et al. (eds.), Praxiologies and the
Philosophy of Economics, Transaction, New Brunswick, NJ, pp. 3-8.
Gasparski, W., and Pszczolowski, T. (eds.) (1983). Praxiological Studies: Polish Contributions to the Science of Efficient Action, D. Reidel, Dordrecht.
Hammond, K. R., et al. (1983). Fundamental obstacles to the use of scientific information in public policy making. Technol. Forecast. Soc. Change 24, 287-297.
Huff, D. (1991). How to Lie with Statistics, Penguin Books, London.
Jackson, M. C. (1991). Systems Methodology for the Management Sciences, Plenum Press, New
York and London.
Kilmann, R. H. (1977). Social Systems Design: Normative Theory and the MAPS Design Technol-
ogy, North-Holland, New York, Oxford, and Amsterdam.
Kotarbinski, T. (1965). Praxiology: An Introduction to the Sciences of Efficient Action, Pergamon
Press, Oxford.
Linstone, H. A., et al. (1981). The multiple perspective concept. Technol. Forecast. Soc. Change 20, 275-325.
Linstone, H. A. (1989). Multiple perspectives: Concept, applications, and user guidelines. Syst.
Pract. 2, 307-331.
Mallen, G. (1993). Back to the cave--cultural perspectives on virtual reality. In Earnshaw, R., et al. (eds.), Virtual Reality Systems, Academic Press, London.
McCloskey, D. N. (1992). If you're so smart . . . . In Auspitz, J. L., et al. (eds.), Praxiologies
and the Philosophy of Economics, Transaction, New Brunswick, NJ, pp. 93-111.
Mercier, Ch. A. (1888). The Nervous System and the Mind: A Treatise on the Dynamics of the
Human Organism, Macmillan, London.

Simon, H. A. (1992). Methodological foundations of economics. In Auspitz, J. L., et al. (eds.), Praxiologies and the Philosophy of Economics, Transaction, New Brunswick, NJ, pp. 25-42.
Spencer, H. (1880). The Study of Sociology, London.
Warfield, J. N. (1990). Presuppositions. In Trappl, R. (ed.), Proceedings of the 10th European Meeting on Cybernetics & Systems Research, Vienna, pp. 213-219.
Wiltshire, D. (1978). The Social and Political Thought of Herbert Spencer, Oxford University
Press, Oxford.
