1.1 The Evolution of Computers


A glance at the picture in Figure 1.1 of Charles Babbage's Analytical Engine of 1834 reveals that computers have
come a long way since then. Their story has been one of ever-increasing processing power, complexity, and miniaturization,
but also of increasing usefulness, which has driven ever greater levels of adoption and ubiquity.

Figure 1.1 A portion of Babbage's Analytical Engine, as drawn in Harper's New Monthly
Magazine, Vol. 30, Issue 175, p. 34, 1864. The original engine documents, and a working
reconstruction, can be seen today at the Science Museum, London.

Despite looking nothing like a modern smartphone, quite a few of the techniques used in Babbage's machine, as well as in the
early electrical computers of the 1940s, can still be found in today's computer systems. This stands as testament to the amazing
foresight of those early pioneers, but it also demonstrates that certain basic operations and structures are common to
computers of almost all types, across all ages. With the considerable benefit of hindsight, we have the opportunity to look back
through computing history and identify the discovery of those truths, or the emergence of those techniques. We will also see
many short-lived evolutionary branches that seemed, at the time, to be promising paths to future progress, but that quickly
disappeared.

What seems likely, then, is that the computers of tomorrow will build on many techniques found in those of today. A snapshot
of current techniques (which is what any computing textbook necessarily is) needs to recognize this fact, rather than presenting
the technology as set in stone.

This book will loosely follow this evolutionary trend. Early chapters will focus on computer fundamentals; mastery of these will
allow a student to understand the basic workings of a computer on paper, however slow and inefficient that might be. These
early chapters will be followed by consideration of the architectural speedups and advanced techniques in use today. These are
separated from the fundamentals because some of them may turn out to be the current "evolutionary blind alleys," but
nevertheless they are among the many techniques currently driving Moore's law[1] forward so quickly.

Every now and then something completely revolutionary happens in computer architecture; such breakthroughs break the
evolutionary trend and consign to oblivion many past techniques that gave incremental performance increases. Without a
crystal ball, this book will not attempt to identify future disruptive technologies (although that will not prevent us from making
an informed guess about some advanced technologies; see Chapter 12), but we are free to examine the performance
accelerators of past and present computer systems. Above all, we can learn to understand and appreciate the driving factors
and limitations within which computer architects, computer system designers, software developers, and networking engineers
have to work.

[1] Gordon Moore, then of Fairchild Semiconductor and later a co-founder of Intel, remarked in 1965 that transistor density (and
hence computer complexity) was increasing at an exponential rate. He predicted that this rate of improvement would continue in
the future, and the idea of this year-on-year improvement subsequently became known as Moore's law.
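To make the footnote's exponential claim concrete, here is a minimal illustrative sketch in Python. It assumes a doubling period of two years and takes the 2,300 transistors of the 1971 Intel 4004 as a baseline; both figures are assumptions chosen for illustration, not values given in the text.

    # Illustrative sketch of Moore's law as exponential growth.
    # Assumed parameters (for illustration only): a two-year doubling
    # period and a 1971 baseline of 2,300 transistors (the Intel 4004).

    def projected_transistors(year, base_year=1971, base_count=2300,
                              doubling_years=2):
        """Project a transistor count under a fixed doubling period."""
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for year in (1971, 1981, 1991, 2001, 2011, 2021):
        print(f"{year}: ~{projected_transistors(year):,.0f} transistors")

By 2021 this simple model projects tens of billions of transistors per chip, which is broadly the order of magnitude seen in recent high-end processors, although the exact doubling period has varied over the decades.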
