
Computer History

Computer Developments and History

Contents

1 History of personal computers
   1.1 Etymology
   1.2 Introduction
      1.2.1 Mainframes, minicomputers, and microcomputers
      1.2.2 Microprocessor and cost reduction
   1.3 The beginnings of the personal computer industry
      1.3.1 Simon
      1.3.2 IBM 610
      1.3.3 Olivetti Programma 101
      1.3.4 MIR
      1.3.5 Kenbak-1
      1.3.6 Datapoint 2200
      1.3.7 Micral N
      1.3.8 Xerox Alto and Star
      1.3.9 IBM 5100
      1.3.10 Altair 8800
      1.3.11 Homebrew Computer Club
      1.3.12 Other machines of the era
   1.4 1977 and the emergence of the Trinity
      1.4.1 PET
      1.4.2 Apple II
      1.4.3 TRS-80
   1.5 Home computers
      1.5.1 Atari 400/800
      1.5.2 Sinclair
      1.5.3 TI-99
      1.5.4 VIC-20 and Commodore 64
      1.5.5 BBC Micro
      1.5.6 Commodore/Atari price war and crash
      1.5.7 Japanese computers
   1.6 The IBM PC
      1.6.1 IBM PC clones
   1.7 Apple Lisa and Macintosh
      1.7.1 GUIs spread
   1.8 PC clones dominate
   1.9 1990s and 2000s
      1.9.1 NeXT
      1.9.2 CD-ROM
      1.9.3 ThinkPad
      1.9.4 Dell
      1.9.5 Power Macintosh, PowerPC
      1.9.6 Risc PC
      1.9.7 BeBox
      1.9.8 IBM clones, Apple back into profitability
      1.9.9 Writable CDs, MP3, P2P file sharing
      1.9.10 USB, DVD player
      1.9.11 Hewlett-Packard
      1.9.12 64 bits
      1.9.13 Lenovo
      1.9.14 Wi-Fi, LCD monitor, multi-core processor, flash memory
      1.9.15 Local area networks
   1.10 Market
   1.11 See also
   1.12 References
   1.13 Further reading
   1.14 External links

2 History of computing
   2.1 Concrete devices
   2.2 Numbers
   2.3 Early computation
   2.4 Navigation and astronomy
   2.5 Weather prediction
   2.6 Symbolic computations
   2.7 See also
   2.8 References
   2.9 External links
      2.9.1 British history links

3 History of computing hardware
   3.1 Early devices
      3.1.1 Ancient era
      3.1.2 Medieval calculating tools
      3.1.3 Mechanical calculators
      3.1.4 Punched card data processing
      3.1.5 Calculators
   3.2 First general-purpose computing device
   3.3 Analog computers
   3.4 Advent of the digital computer
      3.4.1 Electromechanical computers
      3.4.2 Digital computation
      3.4.3 Electronic data processing
      3.4.4 The electronic programmable computer
   3.5 The stored-program computer
      3.5.1 Theory
      3.5.2 Manchester baby
      3.5.3 Manchester Mark 1
      3.5.4 EDSAC
      3.5.5 EDVAC
      3.5.6 Commercial computers
      3.5.7 Microprogramming
      3.5.8 Magnetic storage
   3.6 Early computer characteristics
   3.7 Transistor computers
      3.7.1 Transistorized peripherals
      3.7.2 Supercomputers
   3.8 The integrated circuit
   3.9 Post-1960 (integrated circuit based)
   3.10 Future
   3.11 See also
   3.12 Notes
   3.13 References
   3.14 Further reading
   3.15 External links

4 Software
   4.1 History
   4.2 Types of software
      4.2.1 Purpose, or domain of use
      4.2.2 Nature, or domain of execution
      4.2.3 Programming tools
   4.3 Software topics
      4.3.1 Architecture
      4.3.2 Execution
      4.3.3 Quality and reliability
      4.3.4 License
      4.3.5 Patents
   4.4 Design and implementation
   4.5 Industry and organizations
   4.6 See also
   4.7 References
   4.8 External links

5 Computer science
   5.1 History
      5.1.1 History
   5.2 Philosophy
      5.2.1 Name of the field
   5.3 Areas of computer science
      5.3.1 Theoretical computer science
      5.3.2 Applied computer science
   5.4 The great insights of computer science
   5.5 Academia
      5.5.1 Conferences
      5.5.2 Journals
   5.6 Education
   5.7 See also
   5.8 Notes
   5.9 References
   5.10 Further reading
   5.11 External links

6 History of artificial intelligence
   6.1 Precursors
      6.1.1 AI in myth, fiction and speculation
      6.1.2 Automatons
      6.1.3 Formal reasoning
      6.1.4 Computer science
   6.2 The birth of artificial intelligence 1943–1956
      6.2.1 Cybernetics and early neural networks
      6.2.2 Turing's test
      6.2.3 Game AI
      6.2.4 Symbolic reasoning and the Logic Theorist
      6.2.5 Dartmouth Conference 1956: the birth of AI
   6.3 The golden years 1956–1974
      6.3.1 The work
      6.3.2 The optimism
      6.3.3 The money
   6.4 The first AI winter 1974–1980
      6.4.1 The problems
      6.4.2 The end of funding
      6.4.3 Critiques from across campus
      6.4.4 Perceptrons and the dark age of connectionism
      6.4.5 The neats: logic, Prolog and expert systems
      6.4.6 The scruffies: frames and scripts
   6.5 Boom 1980–1987
      6.5.1 The rise of expert systems
      6.5.2 The knowledge revolution
      6.5.3 The money returns: the fifth generation project
      6.5.4 The revival of connectionism
   6.6 Bust: the second AI winter 1987–1993
      6.6.1 AI winter
      6.6.2 The importance of having a body: Nouvelle AI and embodied reason
   6.7 AI 1993–present
      6.7.1 Milestones and Moore's Law
      6.7.2 Intelligent agents
      6.7.3 Victory of the neats
      6.7.4 AI behind the scenes
      6.7.5 Where is HAL 9000?
      6.7.6 2010s
   6.8 See also
   6.9 Notes
   6.10 References

7 History of computer science
   7.1 Binary logic
   7.2 Birth of computer
   7.3 Emergence of a discipline
      7.3.1 Charles Babbage and Ada Lovelace
      7.3.2 Alan Turing and the Turing Machine
      7.3.3 Shannon and information theory
      7.3.4 Wiener and cybernetics
      7.3.5 John von Neumann and the von Neumann architecture
   7.4 See also
   7.5 Notes
   7.6 Sources
   7.7 Further reading
   7.8 External links

8 History of operating systems
   8.1 Background
   8.2 Mainframes
      8.2.1 Systems on IBM hardware
      8.2.2 Other mainframe operating systems
   8.3 Minicomputers and the rise of Unix
   8.4 Microcomputers: 8-bit home computers and game consoles
      8.4.1 Home computers
      8.4.2 Rise of OS in video games and consoles
   8.5 Personal computer era
   8.6 Rise of virtualization
   8.7 See also
   8.8 Notes
   8.9 References
   8.10 Further reading

9 History of programming languages
   9.1 Early history
   9.2 First programming languages
   9.3 Establishing fundamental paradigms
   9.4 1980s: consolidation, modules, performance
   9.5 1990s: the Internet age
   9.6 Current trends
   9.7 Prominent people
   9.8 See also
   9.9 References
   9.10 Further reading
   9.11 External links

10 History of software engineering
   10.1 Overview
   10.2 The Pioneering Era
   10.3 1945 to 1965: The Origins
   10.4 1965 to 1985: The Software Crisis
   10.5 1985 to 1989: No Silver Bullet
      10.5.1 Software projects
   10.6 1990 to 1999: Prominence of the Internet
   10.7 2000 to Present: Lightweight Methodologies
      10.7.1 Current Trends in Software Engineering
      10.7.2 Software engineering today
   10.8 Prominent Figures in the History of Software Engineering
   10.9 See also
   10.10 References
   10.11 External links

11 History of the graphical user interface
   11.1 Initial developments
      11.1.1 Augmentation of Human Intellect (NLS)
      11.1.2 Xerox PARC
   11.2 Early developments
      11.2.1 Xerox Alto and Xerox Star
      11.2.2 SGI 1000 series and MEX
      11.2.3 Apple Lisa and Macintosh (and later, the Apple IIgs)
      11.2.4 Graphical Environment Manager (GEM)
      11.2.5 DeskMate
      11.2.6 MSX-View
      11.2.7 Amiga Intuition and the Workbench
      11.2.8 Acorn BBC Master Compact
      11.2.9 Arthur / RISC OS
      11.2.10 MS-DOS file managers and utility suites
      11.2.11 Applications under MS-DOS with proprietary GUIs
      11.2.12 Microsoft Windows (16-bit versions)
      11.2.13 GEOS
      11.2.14 The X Window System
      11.2.15 NeWS
   11.3 The 1990s: Mainstream usage of the desktop
      11.3.1 Windows 95 and "a computer in every home"
      11.3.2 Mac OS
      11.3.3 GUIs built on the X Window System
      11.3.4 Amiga
      11.3.5 OS/2
      11.3.6 NeXTSTEP
      11.3.7 BeOS
   11.4 Current trends
      11.4.1 Mobile devices
      11.4.2 3D user interface
      11.4.3 Virtual reality and presence
   11.5 See also
   11.6 References
   11.7 External links

12 History of the Internet
   12.1 Precursors
   12.2 Three terminals and an ARPA
   12.3 Packet switching
   12.4 Networks that led to the Internet
      12.4.1 ARPANET
      12.4.2 NPL
      12.4.3 Merit Network
      12.4.4 CYCLADES
      12.4.5 X.25 and public data networks
      12.4.6 UUCP and Usenet
   12.5 Merging the networks and creating the Internet (1973–90)
      12.5.1 TCP/IP
      12.5.2 From ARPANET to NSFNET
      12.5.3 Transition towards the Internet
   12.6 TCP/IP goes global (1989–2010)
      12.6.1 CERN, the European Internet, the link to the Pacific and beyond
      12.6.2 Global digital divide
      12.6.3 Opening the network to commerce
   12.7 Networking in outer space
   12.8 Internet governance
      12.8.1 NIC, InterNIC, IANA and ICANN
      12.8.2 Internet Engineering Task Force
      12.8.3 The Internet Society
      12.8.4 Globalization and Internet governance in the 21st century
   12.9 Net neutrality
   12.10 Use and culture
      12.10.1 Demographics
      12.10.2 Email and Usenet
      12.10.3 From Gopher to the WWW
      12.10.4 Search engines
      12.10.5 File sharing
      12.10.6 Dot-com bubble
      12.10.7 Mobile phones and the Internet
   12.11 Historiography
   12.12 See also
   12.13 Notes
   12.14 References
   12.15 External links

13 History of laptops
   13.1 Osborne 1
   13.2 Bondwell 2
   13.3 Other CP/M laptops
   13.4 Compaq Portable
   13.5 Epson HX-20
   13.6 GRiD Compass
   13.7 Dulmont Magnum/Kookaburra
   13.8 Ampere
   13.9 Tandy Model 100
   13.10 Sharp and Gavilan
   13.11 Kyotronic 85
   13.12 Commodore SX-64
   13.13 Kaypro 2000
   13.14 IBM PC Convertible
   13.15 Toshiba T1100, T1000, and T1200
   13.16 US Air Force
   13.17 Hewlett-Packard Vectra Portable CS
   13.18 Cambridge Z88
   13.19 Compaq SLT/286
   13.20 NEC UltraLite
   13.21 Apple
      13.21.1 Macintosh Portable
      13.21.2 PowerBook
   13.22 IBM RS/6000 N40
   13.23 Windows 95 operating system
   13.24 Intel Pentium processor
   13.25 Improved technology
   13.26 Netbooks
   13.27 Smartbooks
   13.28 See also
   13.29 References
   13.30 Further reading

14 History of the World Wide Web
   14.1 Precursors
   14.2 1980–1991: Invention of the Web
   14.3 1992–1995: Growth of the Web
      14.3.1 Early browsers
      14.3.2 Web governance
   14.4 1996–1998: Commercialization of the Web
   14.5 1999–2001: Dot-com boom and bust
   14.6 2002–present: The Web becomes ubiquitous
      14.6.1 Web 2.0
      14.6.2 The semantic web
   14.7 See also
   14.8 References
   14.9 External links

15 Timeline of computing hardware 2400 BC–1949
   15.1 Prehistory–1640
   15.2 1641–1850
   15.3 1851–1930
   15.4 1931–1940
   15.5 1941–1949
   15.6 Computing timeline
   15.7 Notes
   15.8 References
   15.9 External links

16 Timeline of computing 1950–79
   16.1 1950s
   16.2 1960s
   16.3 1970s
   16.4 See also
   16.5 References
   16.6 External links


17 Timeline of computing 1980–89
   17.1 1980
   17.2 1981
   17.3 1982
   17.4 1983
   17.5 1984
   17.6 1985
   17.7 1986
   17.8 1987
   17.9 1988
   17.10 1989
   17.11 References
   17.12 External links

18 Timeline of computing 1990–99
   18.1 1990
   18.2 1991
   18.3 1992
   18.4 1993
   18.5 1994
   18.6 1995
   18.7 1996
   18.8 1997
   18.9 1998
   18.10 1999
   18.11 References
   18.12 External links

19 Timeline of computing 2000–09
   19.1 2000
   19.2 2001
   19.3 2002
   19.4 2003
   19.5 2004
   19.6 2005
   19.7 2006
   19.8 2007
   19.9 2008
   19.10 2009
   19.11 See also
   19.12 References
   19.13 External links

20 Timeline of computing 2010–19
   20.1 2010
   20.2 2011
   20.3 2012
   20.4 2013
   20.5 2014
   20.6 References


21 Timeline of computing
   21.1 Graphical timeline
   21.2 See also
   21.3 Resources
   21.4 External links

22 Microsoft
   22.1 History
      22.1.1 1972–83: Founding and company beginnings
      22.1.2 1984–94: Windows and Office
      22.1.3 1995–2005: Internet and the 32-bit era
      22.1.4 2006–10: Windows Vista, mobile, and Windows 7
      22.1.5 2011–present: Rebranding, Windows 8, Surface and Nokia devices
   22.2 Businesses
      22.2.1 Windows Division, Server and Tools, Online Services Division
      22.2.2 Business Division
      22.2.3 Entertainment and Devices Division
   22.3 Culture
   22.4 Criticism
   22.5 Corporate affairs
      22.5.1 Financial
      22.5.2 Environment
      22.5.3 Marketing
      22.5.4 Lay off
      22.5.5 Cooperation with the United States Government
      22.5.6 Logo
   22.6 See also
   22.7 References
   22.8 External links

23 IBM
   23.1 History
      23.1.1 1930–1979
      23.1.2 1980–Present
   23.2 Rank
   23.3 Corporate affairs
   23.4 Facilities
   23.5 Work environment
   23.6 Research and inventions
   23.7 Selected current projects
   23.8 Environmental record
   23.9 Company logo and nickname
   23.10 See also
   23.11 References
   23.12 Further reading
   23.13 External links

24 Apple Inc.
   24.1 History
      24.1.1 1976–80: Founding and incorporation
      24.1.2 1981–89: Success with Macintosh
      24.1.3 1990–99: Decline, restructuring, acquisitions
      24.1.4 2000–06: Return to profitability
      24.1.5 2007–10: Success with mobile devices
      24.1.6 2011–12: Steve Jobs's death
      24.1.7 2013–present: Acquisitions and expansion
   24.2 Products
      24.2.1 Mac
      24.2.2 iPad
      24.2.3 iPod
      24.2.4 iPhone
      24.2.5 Apple TV
      24.2.6 Apple Watch
      24.2.7 Software
   24.3 Corporate identity
      24.3.1 Logo
      24.3.2 Advertising
      24.3.3 Brand loyalty
      24.3.4 Home page
      24.3.5 Headquarters
   24.4 Corporate affairs
      24.4.1 Corporate culture
      24.4.2 Customer service
      24.4.3 Manufacturing
      24.4.4 Finance
      24.4.5 Litigation
      24.4.6 Charitable causes
   24.5 See also
   24.6 References
   24.7 Further reading
   24.8 External links

25 Operating system
   25.1 Types of operating systems
      25.1.1 Real-time
      25.1.2 Multi-user
      25.1.3 Multi-tasking vs. single-tasking
      25.1.4 Distributed
      25.1.5 Templated
      25.1.6 Embedded
   25.2 History
      25.2.1 Mainframes
      25.2.2 Microcomputers
   25.3 Examples of operating systems
      25.3.1 Unix and Unix-like operating systems
      25.3.2 Microsoft Windows
      25.3.3 Other
   25.4 Components
      25.4.1 Kernel
      25.4.2 Networking
      25.4.3 Security
      25.4.4 User interface
   25.5 Real-time operating systems
   25.6 Operating system development as a hobby
   25.7 Diversity of operating systems and portability
   25.8 Market share
   25.9 See also
   25.10 References
   25.11 Further reading
   25.12 External links

26 Unix
   26.1 Overview
   26.2 History
   26.3 Standards
   26.4 Components
   26.5 Impact
      26.5.1 Free Unix and Unix-like operating systems
      26.5.2 ARPANET
   26.6 Branding
   26.7 See also
   26.8 References
   26.9 Further reading
   26.10 External links
27 Intel 228
27.1 Corporate history . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228
27.1.1 Origins . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228
27.1.2 Early history . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
27.1.3 Slowing demand and challenges to dominance . . . . . . . . . . . . . . . . . . . . . . . . 230
27.1.4 Regaining of momentum . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 230
27.1.5 Sale of XScale processor business . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 230
27.1.6 Acquisitions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 230
27.2 Acquisition table . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
27.2.1 Expansions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
27.2.2 Opening up the foundries . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
27.3 Product and market history . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
27.3.1 SRAMS and the microprocessor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
27.3.2 From DRAM to microprocessors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
27.3.3 Intel, x86 processors, and the IBM PC . . . . . . . . . . . . . . . . . . . . . . . . . . . . 232
27.3.4 Solid-state drives (SSD) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 233

27.3.5 Supercomputers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 234


27.3.6 Competition, antitrust and espionage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 234
27.3.7 Partnership with Apple . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 234
27.3.8 Core 2 Duo advertisement controversy . . . . . . . . . . . . . . . . . . . . . . . . . . . . 234
27.3.9 Classmate PC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 235
27.3.10 Mobile processor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 235
27.3.11 Server chips . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 235
27.3.12 22 nm processors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 235
27.3.13 Personal Office Energy Monitor (POEM) . . . . . . . . . . . . . . . . . . . 235
27.3.14 IT Manager 3: Unseen Forces . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 235
27.3.15 Car Security System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 235
27.3.16 High-Bandwidth Digital Content Protection . . . . . . . . . . . . . . . . . . . . . . . . . 235
27.3.17 Move from Wintel desktop to open mobile platforms . . . . . . . . . . . . . . . . . . . . . 235
27.3.18 Wearable fashion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 235
27.4 Corporate affairs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 236
27.4.1 Leadership and corporate structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 236
27.4.2 Employment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 236
27.4.3 Economic Impacts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 237
27.4.4 Funding of a school . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 237
27.4.5 Ultrabook Fund . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 237
27.4.6 Finances . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 237
27.4.7 Advertising and brand management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 238
27.4.8 Open source support . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 239
27.4.9 Corporate responsibility record . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 239
27.4.10 Religious controversy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 239
27.4.11 Age discrimination . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 239
27.5 Competition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 240
27.5.1 Lawsuits . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 240
27.5.2 Anti-competitive allegations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 240
27.6 Market share . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241
27.7 Boycott, Divestment and Sanctions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241
27.8 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241
27.9 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 242
27.10 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 249

28 Microsoft Windows 250
28.1 Genealogy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 250
28.1.1 By marketing role . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 250
28.2 Version history . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
28.2.1 Early versions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
28.2.2 Windows 3.0 and 3.1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
28.2.3 Windows 9x . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 252

28.2.4 Windows NT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 252
28.2.5 Windows CE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 254
28.2.6 Xbox OS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 255
28.3 Timeline of releases . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 255
28.4 Usage share . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 255
28.4.1 Usage share as a general platform . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 255
28.5 Security . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 256
28.5.1 File permissions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 256
28.5.2 Windows Defender . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 256
28.5.3 Third-party analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257
28.6 Alternative implementations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257
28.7 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257
28.8 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257
28.9 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 259

29 Linux 260
29.1 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 260
29.1.1 Antecedents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 260
29.1.2 Creation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 261
29.1.3 Naming . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 261
29.1.4 Commercial and popular uptake . . . . . . . . . . . . . . . . . . . . . . . . 262
29.1.5 Current development . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 262
29.2 Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 262
29.2.1 User interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 263
29.2.2 Video input infrastructure . . . . . . . . . . . . . . . . . . . . . . . . . . . 264
29.3 Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 264
29.3.1 Community . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 264
29.3.2 Programming on Linux . . . . . . . . . . . . . . . . . . . . . . . . . . . . 265
29.4 Uses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 265
29.4.1 Desktop . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 266
29.4.2 Netbooks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 267
29.4.3 Servers, mainframes and supercomputers . . . . . . . . . . . . . . . . . . . 267
29.4.4 Smart devices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 268
29.4.5 Embedded devices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 268
29.4.6 Gaming . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 269
29.4.7 Specialized uses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 269
29.5 Market share and uptake . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 270
29.6 Copyright, trademark, and naming . . . . . . . . . . . . . . . . . . . . . . . . . 271
29.7 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 272
29.8 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 272
29.9 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 276

30 MS-DOS 277
30.1 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 277
30.2 Versions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 278
30.3 Competition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 279
30.4 Legal issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 281
30.5 Use of undocumented APIs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 281
30.6 End of MS-DOS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 282
30.7 Windows command-line interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 282
30.8 Legacy compatibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 283
30.9 Related systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 283
30.10 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 283
30.11 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 284
30.12 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 284
31 Google 286
31.1 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 286
31.1.1 Financing, 1998 and initial public offering, 2004 . . . . . . . . . . . . . . . 287
31.1.2 Growth . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 288
31.1.3 2013 onward . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 288
31.1.4 Acquisitions and partnerships . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 288
31.1.5 Google data centers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 290
31.2 Products and services . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 291
31.2.1 Advertising . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 291
31.2.2 Search engine . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 291
31.2.3 Productivity tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 292
31.2.4 Enterprise products . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 292
31.2.5 Other products . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 293
31.3 Corporate affairs and culture . . . . . . . . . . . . . . . . . . . . . . . . . . . . 294
31.3.1 Employees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 294
31.3.2 Googleplex . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 294
31.3.3 Doodles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 296
31.3.4 Easter eggs and April Fools' Day jokes . . . . . . . . . . . . . . . . . . . . 296
31.3.5 Philanthropy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 297
31.3.6 Tax avoidance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 297
31.3.7 Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 297
31.3.8 Lobbying . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 297
31.4 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 297
31.5 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 298
31.6 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 306
32 IBM Personal Computer 307
32.1 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 307
32.1.1 Rumors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 307
32.1.2 Too late? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 307
32.1.3 Predecessors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 308
32.1.4 Project Chess . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 308
32.1.5 Open standards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 309
32.1.6 Debut . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 310
32.1.7 Reaction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 311
32.1.8 Success . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 311
32.1.9 Domination . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 312

32.2 IBM PC as standard . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 312


32.3 Third-party distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 312
32.4 Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 312
32.5 PC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 313
32.5.1 XT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 313
32.5.2 XT/370 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
32.5.3 PCjr . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
32.5.4 Portable . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
32.5.5 AT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
32.5.6 AT/370 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315
32.5.7 Convertible . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315
32.5.8 Next-generation IBM PS/2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315
32.6 Technology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315
32.6.1 Electronics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315
32.6.2 Peripheral integrated circuits . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 316
32.6.3 Joystick port . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 316
32.6.4 Keyboard . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 316
32.6.5 Character set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 317
32.6.6 Storage media . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 317
32.6.7 BIOS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 319
32.6.8 Video output . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 319
32.6.9 Serial port addresses and interrupts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 320
32.6.10 Printer port . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 320
32.7 Reception . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 320
32.8 Longevity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 320
32.9 Collectability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 321
32.10 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 321
32.11 Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 321
32.12 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 321
32.13 Further reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 324
32.14 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 325
33 Digital Equipment Corporation 326
33.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 326
33.2 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 326
33.2.1 Origins . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 326
33.2.2 Digital modules . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 327
33.2.3 PDP-1 family . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 328
33.2.4 PDP-8 family . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 329
33.2.5 PDP-10 family . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 330
33.2.6 DECtape . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 330
33.2.7 PDP-11 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 330
33.2.8 VAX . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 331
33.2.9 Early microcomputers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 332
33.2.10 Networking and clusters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 333
33.2.11 Diversification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 333
33.2.12 Faltering in the market . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 333
33.2.13 32-bit MIPS and 64-bit Alpha systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . 334
33.2.14 StrongARM . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 334
33.2.15 Designing solutions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 334
33.2.16 Final years . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 334
33.3 Research . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 336
33.4 Accomplishments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 337
33.5 User organizations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 338
33.6 Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 338
33.7 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 340
33.8 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 340
34 Hewlett-Packard 341
34.1 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 341
34.1.1 Founding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 341
34.1.2 Early years . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 342
34.1.3 1960s . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 342
34.1.4 1970s . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 342
34.1.5 1980s . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 343
34.1.6 1990s . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 343
34.1.7 2000s . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 344
34.1.8 2010s . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 345
34.2 Facilities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 347
34.3 Products and organizational structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 347
34.4 Culture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 349
34.5 Corporate social responsibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 349
34.6 Brand . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 350
34.7 HP DISCOVER customer event . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 351
34.8 Controversies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 351

34.8.1 Restatement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 351
34.8.2 Spying scandal . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 351
34.8.3 Hardware . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 351
34.8.4 Lawsuit against Oracle . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 351
34.8.5 Takeover of Autonomy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 352
34.8.6 Bribery . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 352
34.8.7 Divestment from HP regarding involvement in Israeli occupation and blockade of Palestinian territories . . . 353
34.9 Notable people . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 353
34.10 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 353
34.11 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 353
34.12 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 357

35 Mainframe computer 358
35.1 Description . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 358
35.2 Characteristics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 359
35.3 Market . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 360
35.4 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 360
35.5 Differences from supercomputers . . . . . . . . . . . . . . . . . . . . . . . . . 361
35.6 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 361
35.7 Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 362
35.8 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 362
35.9 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 362
36 Macintosh 363
36.1 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 364
36.1.1 Development and introduction . . . . . . . . . . . . . . . . . . . . . . . . . 364
36.1.2 Initial phase . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 364
36.1.3 Team members . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 364
36.1.4 Production . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 364
36.1.5 Debut . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 365
36.1.6 Desktop publishing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 366
36.1.7 Decline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 368
36.1.8 Transition to PowerPC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 369
36.1.9 Revival . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 369
36.1.10 Transition to Intel x86 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 370
36.2 Timeline of Macintosh models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 371
36.3 Product line . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 371
36.4 Hardware . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 371
36.5 Software . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 372
36.6 Advertising . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 372
36.7 Market share and user demographics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 373

36.7.1 1980s and early 1990s . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 373


36.7.2 Late 1990s and early 2000s . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 373
36.7.3 Late 2000s . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 374
36.7.4 Post-PC era . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 375
36.8 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 375
36.9 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 375
36.10Further reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 380
36.11External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 381
37 IBM PC compatible 382
37.1 Origins . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 382
37.2 Compatibility issues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 383
37.3 The decreasing influence of IBM . . . . . . . . . . . . . . . . . . . . . . . . . 384
37.4 Expandability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 385
37.5 IBM PC compatible becomes "Wintel" . . . . . . . . . . . . . . . . . . . . . . 385
37.6 Design limitations and more compatibility issues . . . . . . . . . . . . . . . . . . . . . . . . . . . 386
37.7 Challenges to Wintel domination . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 386
37.8 The IBM PC compatible today . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 387
37.9 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 387
37.10 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 388
37.11 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 388

38 X86 389
38.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 389
38.2 Chronology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 390
38.3 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 390
38.3.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 390
38.3.2 Other manufacturers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 391
38.3.3 Extensions of word size . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 391
38.4 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 391
38.4.1 Basic properties of the architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 392
38.4.2 Current implementations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 392
38.5 Segmentation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 393
38.6 Addressing modes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 393
38.7 x86 registers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 394
38.7.1 16-bit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 394
38.7.2 32-bit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 394
38.7.3 64-bit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 395
38.7.4 128-bit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 395
38.7.5 256-bit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 395
38.7.6 512-bit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 395
38.7.7 Miscellaneous/special purpose . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 395

38.7.8 Purpose . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 395
38.7.9 Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 396

38.8 Operating modes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 396


38.8.1 Real mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 396
38.8.2 Protected mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 396
38.8.3 Long mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 397
38.9 Extensions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 397
38.9.1 Floating point unit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 397
38.9.2 MMX . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 398
38.9.3 3DNow! . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 398
38.9.4 SSE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 398
38.9.5 Physical Address Extension (PAE) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 399
38.9.6 x86-64 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 399
38.9.7 Virtualization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 400
38.10 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 400
38.11 Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 400
38.12 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 401
38.13 Further reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 402
38.14 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 402
39 Computer hardware 403
39.1 Von Neumann architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 403
39.2 Sales . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 403
39.3 Different systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 404
39.3.1 Personal computer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 404
39.3.2 Mainframe computer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 405
39.3.3 Departmental computing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 406
39.3.4 Supercomputer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 406
39.4 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 406
39.5 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 406
39.6 External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 406
40 Personal computer 407
40.1 History . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 407
40.1.1 Market and sales . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 410
40.1.2 Average selling price . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 411
40.2 Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 411
40.3 Types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 411
40.3.1 Stationary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 411
40.3.2 Portable . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 412
40.4 Hardware . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 415
40.4.1 Computer case . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 416

40.4.2 Power supply unit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 416


40.4.3 Processor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 416
40.4.4 Motherboard . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 417
40.4.5 Main memory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 417
40.4.6 Hard disk . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 417
40.4.7 Visual display unit . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 418
40.4.8 Video card . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 418
40.4.9 Keyboard . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 418
40.4.10 Mouse . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 419
40.4.11 Other components . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 419
40.5 Software . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 420
40.5.1 Operating system . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 420
40.5.2 Applications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 421
40.5.3 Gaming . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 422


40.6 Toxicity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 422
40.6.1 Electronic waste regulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 422
40.7 See also . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 422
40.8 Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 422
40.9 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 423
40.10Further reading . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 424
40.11External links . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 425
40.12 Text and image sources, contributors, and licenses . . . . . . . . . . . . . . . . 426
40.12.1 Text . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 426
40.12.2 Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 463
40.12.3 Content license . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 479

Chapter 1

History of personal computers


Main article: Personal computer
Main article: History of computing hardware (1960s-present)

The history of the personal computer as mass-market consumer electronic devices effectively began in 1977 with the introduction of microcomputers, although some mainframe and minicomputers had been applied as single-user systems much earlier. A personal computer is one intended for interactive individual use, as opposed to a mainframe computer where the end user's requests are filtered through operating staff, or a time sharing system in which one large processor is shared by many individuals. After the development of the microprocessor, individual personal computers were low enough in cost that they eventually became affordable consumer goods. Early personal computers, generally called microcomputers, were sold often in electronic kit form and in limited numbers, and were of interest mostly to hobbyists and technicians.

1.1 Etymology

An early use of the term "personal computer" appeared in a November 3, 1962, New York Times article reporting John W. Mauchly's vision of future computing as detailed at a recent meeting of the American Institute of Industrial Engineers. Mauchly stated, "There is no reason to suppose the average boy or girl cannot be master of a personal computer".[1]

Six years later a manufacturer took the risk of referring to their product this way, when Hewlett-Packard advertised their "Powerful Computing Genie" as "The New Hewlett-Packard 9100A personal computer".[2] This advertisement was deemed too extreme for the target audience and replaced with a much drier ad for the HP 9100A programmable calculator.[3][4]

Over the next seven years the phrase had gained enough recognition that when Byte magazine published its first edition, it referred to its readers as "[in] the personal computing field",[5] and Creative Computing defined the personal computer as a "non-(time)shared system containing sufficient processing power and storage capabilities to satisfy the needs of an individual user".[6] Two years later, when what Byte was to call the "1977 Trinity" of preassembled small computers hit the markets,[7] the Apple II and the PET 2001 were advertised as personal computers,[8][9] while the TRS-80 was described as a microcomputer used for household tasks including "personal financial management". By 1979 over half a million microcomputers were sold and the youth of the day had a new concept of the personal computer.[10]

1.2 Introduction

1.2.1 Mainframes, minicomputers, and microcomputers

Computer terminals were used for time sharing access to central computers. Before the introduction of the microprocessor in the early 1970s, computers were generally large, costly systems owned by large corporations, universities, government agencies, and similar-sized institutions. End users generally did not directly interact with the machine, but instead would prepare tasks for the computer on off-line equipment, such as card punches. A number of assignments for the computer would be gathered up and processed in batch mode. After the job had completed, users could collect the results. In some cases it could take hours or days between submitting a job to the computing center and receiving the output.

A more interactive form of computer use developed commercially by the middle 1960s. In a time-sharing system, multiple computer terminals let many people share the use of one mainframe computer processor. This was common in business applications and in science and engineering.

A different model of computer use was foreshadowed by the way in which early, pre-commercial, experimental computers were used, where one user had exclusive use of a processor.[11] In places such as MIT, students with access to some of the first computers experimented with applications that would today be typical of a personal computer; for example, computer aided drafting
was foreshadowed by T-square, a program written in 1961, and an ancestor of today's computer games was found in Spacewar! in 1962. Some of the first computers that might be called "personal" were early minicomputers such as the LINC and PDP-8, and later on VAX and larger minicomputers from Digital Equipment Corporation (DEC), Data General, Prime Computer, and others. By today's standards they were very large (about the size of a refrigerator) and cost prohibitive (typically tens of thousands of US dollars). However, they were much smaller, less expensive, and generally simpler to operate than many of the mainframe computers of the time. Therefore, they were accessible for individual laboratories and research projects. Minicomputers largely freed these organizations from the batch processing and bureaucracy of a commercial or university computing center.

In addition, minicomputers were relatively interactive and soon had their own operating systems. The minicomputer Xerox Alto (1973) was a landmark step in the development of personal computers, because of its graphical user interface, bit-mapped high resolution screen, large internal and external memory storage, mouse, and special software.[12]

As early as 1945, Vannevar Bush, in an essay called "As We May Think", outlined a possible solution to the growing problem of information storage and retrieval. In what was later to be called The Mother of All Demos, SRI researcher Douglas Engelbart in 1968 gave a preview of what would become the staples of daily working life in the 21st century: e-mail, hypertext, word processing, video conferencing, and the mouse. The demo was the culmination of research in Engelbart's Augmentation Research Center laboratory, which concentrated on applying computer technology to facilitate creative human thought.

1.2.2 Microprocessor and cost reduction

The minicomputer ancestors of the modern personal computer used early integrated circuit (microchip) technology, which reduced size and cost, but they contained no microprocessor. This meant that they were still large and difficult to manufacture just like their mainframe predecessors. After the "computer-on-a-chip" was commercialized, the cost to manufacture a computer system dropped dramatically. The arithmetic, logic, and control functions that previously occupied several costly circuit boards were now available in one integrated circuit, making it possible to produce them in high volume. Concurrently, advances in the development of solid state memory eliminated the bulky, costly, and power-hungry magnetic core memory used in prior generations of computers.

A few researchers at places such as SRI and Xerox PARC were working on computers that a single person could use and that could be connected by fast, versatile networks: not home computers, but personal ones.

After the 1972 introduction of the Intel 4004, microprocessor costs declined rapidly. In 1974 the American electronics magazine Radio-Electronics described the Mark-8 computer kit, based on the Intel 8008 processor. In January of the following year, Popular Electronics magazine published an article describing a kit based on the Intel 8080, a somewhat more powerful and easier to use processor. The Altair 8800 sold remarkably well even though initial memory size was limited to a few hundred bytes and there was no software available. However, the Altair kit was much less costly than an Intel development system of the time and so was purchased by companies interested in developing microprocessor control for their own products. Expansion memory boards and peripherals were soon listed by the original manufacturer, and later by plug compatible manufacturers. The very first Microsoft product was a 4 kilobyte paper tape BASIC interpreter, which allowed users to develop programs in a higher-level language. The alternative was to hand-assemble machine code that could be directly loaded into the microcomputer's memory using a front panel of toggle switches, pushbuttons and LED displays. While the hardware front panel emulated those used by early mainframe and minicomputers, after a very short time I/O through a terminal was the preferred human/machine interface, and front panels became extinct.

1.3 The beginnings of the personal computer industry

1.3.1 Simon

Main article: Simon (computer)

Simon[13] was a project developed by Edmund Berkeley and presented in a thirteen-article series issued in Radio-Electronics magazine from October 1950. Although there were far more advanced machines at the time of its construction, the Simon represented the first experience of building an automatic simple digital computer for educational purposes. In 1950, it was sold for US$600.

1.3.2 IBM 610

Main article: IBM 610

The IBM 610 was designed between 1948 and 1957 by John Lentz at the Watson Lab at Columbia University as the Personal Automatic Computer (PAC) and announced by IBM as the 610 Auto-Point in 1957. Although it was faulted for its speed, the IBM 610 handled floating-point arithmetic naturally. With a price tag of $55,000, only 180 units were produced.[14]

1.3.3 Olivetti Programma 101

Main article: Programma 101

The Programma 101 was the first commercially produced desktop computer,[15][16] designed and produced by the Italian company Olivetti and presented at the 1965 New York World's Fair. Over 44,000 units were sold worldwide; in the US its cost at launch was $3,200. The Programma 101 had many of the features incorporated in modern personal computers, such as memory, keyboard, printing unit, magnetic card reader/recorder, control and arithmetic unit,[17] and is considered by many as the first commercially produced desktop computer, showing the world that it was possible to create a desktop computer[18] (HP later copied the Programma 101 architecture for its HP 9100 series).[19][20]

1.3.4 MIR

Main article: MIR (computer)

The Soviet MIR series of computers was developed from 1965 to 1969 in a group headed by Victor Glushkov. It was designed as a relatively small-scale computer for use in engineering and scientific applications and contained a hardware implementation of a high-level programming language. Another innovative feature for that time was the user interface combining a keyboard with a monitor and light pen for correcting texts and drawing on screen.[21]

1.3.5 Kenbak-1

Main article: Kenbak-1

The Kenbak-1 is considered by the Computer History Museum to be the world's first personal computer. It was designed and invented by John Blankenbaker of Kenbak Corporation in 1970, and was first sold in early 1971. Unlike a modern personal computer, the Kenbak-1 was built of small-scale integrated circuits, and did not use a microprocessor. The system first sold for US$750. Only around 40 machines were ever built and sold. In 1973, production of the Kenbak-1 stopped as Kenbak Corporation folded.

With only 256 bytes of memory, an 8-bit word size, and input and output restricted to lights and switches, the Kenbak-1 was most useful for learning the principles of programming but not capable of running application programs.

1.3.6 Datapoint 2200

Main article: Datapoint 2200

[Figure: 1970: Datapoint 2200.]

A programmable terminal called the Datapoint 2200 is the earliest known device that bears some significant resemblance to the modern personal computer, with a screen, keyboard, and program storage.[22] It was made by CTC (now known as Datapoint) in 1970 and was a complete system in a small case bearing the approximate footprint of an IBM Selectric typewriter. The system's CPU was constructed from a variety of discrete components, although the company had commissioned Intel to develop a single-chip processing unit; there was a falling out between CTC and Intel, and the chip Intel had developed wasn't used. Intel soon released a modified version of that chip as the Intel 8008, the world's first 8-bit microprocessor.[23] The needs and requirements of the Datapoint 2200 therefore determined the nature of the 8008, upon which all successive processors used in IBM-compatible PCs were based. Additionally, the design of the Datapoint 2200's multi-chip CPU and the final design of the Intel 8008 were so similar that the two are largely software-compatible; therefore, the Datapoint 2200, from a practical perspective, can be regarded as if it were indeed powered by an 8008, which makes it a strong candidate for the title of "first microcomputer" as well.

1.3.7 Micral N

Main article: Micral

The French company R2E was formed by two former engineers of the Intertechnique company to sell their Intel 8008-based microcomputer design. The system was developed at the Institut National de la Recherche Agronomique to automate hygrometric measurements. The system ran at 500 kHz and included 16 kB of memory, and sold for 8,500 Francs, about US$1,300.

A bus, called Pluribus, was introduced that allowed connection of up to 14 boards. Boards for digital I/O, analog I/O, memory, and floppy disk were available from R2E. The Micral operating system was initially called Sysmic, and was later renamed Prologue.

R2E was absorbed by Groupe Bull in 1978. Although Groupe Bull continued the production of Micral computers, it was not interested in the personal computer market, and Micral computers were mostly confined to highway toll gates (where they remained in service until 1992) and similar niche markets.

1.3.8 Xerox Alto and Star

[Figure: 1973: Xerox Alto]

The Xerox Alto, developed at Xerox PARC in 1973, was the first computer to use a mouse, the desktop metaphor, and a graphical user interface (GUI), concepts first introduced by Douglas Engelbart while at SRI International. It was the first example of what would today be recognized as a complete personal computer.

In 1981, Xerox Corporation introduced the Xerox Star workstation, officially known as the "8010 Star Information System". Drawing upon its predecessor, the Xerox Alto, it was the first commercial system to incorporate various technologies that today have become commonplace in personal computers, including a bit-mapped display, a windows-based graphical user interface, icons, folders, mouse, Ethernet networking, file servers, print servers and e-mail. It also included a programming language system called Smalltalk.

While its use was limited to the engineers at Xerox PARC, the Alto had features years ahead of its time. Both the Xerox Alto and the Xerox Star would inspire the Apple Lisa and the Apple Macintosh.

1.3.9 IBM 5100

The IBM 5100 was a desktop computer introduced in September 1975, six years before the IBM PC. It was the evolution of a prototype called the SCAMP (Special Computer APL Machine Portable) that IBM demonstrated in 1973. In January 1978 IBM announced the IBM 5110, its larger cousin. The 5100 was withdrawn in March 1982.

When the PC was introduced in 1981, it was originally designated as the IBM 5150, putting it in the "5100" series, though its architecture wasn't directly descended from the IBM 5100.

1.3.10 Altair 8800

Main article: Altair 8800

[Figure: 1975: Altair 8800]

Development of the single-chip microprocessor was the gateway to the popularization of cheap, easy to use, and truly personal computers. It was only a matter of time before one such design was able to hit a sweet spot in terms of pricing and performance, and that machine is generally considered to be the Altair 8800, from MITS, a small company that produced electronics kits for hobbyists.

The Altair was introduced in a Popular Electronics magazine article in the January 1975 issue. In keeping with MITS's earlier projects, the Altair was sold in kit form, although a relatively complex one consisting of four circuit boards and many parts. Priced at only $400, the Altair tapped into pent-up demand and surprised its creators when it generated thousands of orders in the first month. Unable to keep up with demand, MITS sold the design after about 10,000 kits had shipped.

The introduction of the Altair spawned an entire industry based on the basic layout and internal design. New companies like Cromemco started up to supply add-on kits, while Microsoft was founded to supply a BASIC interpreter for the systems. Soon after, a number of complete clone designs, typified by the IMSAI 8080, appeared on the market. This led to a wide variety of systems based on the S-100 bus introduced with the Altair, machines of generally improved performance, quality and ease-of-use.

The Altair, and early clones, were relatively difficult to use. The machines contained no operating system in ROM, so starting it up required a machine language program to be entered by hand via front-panel switches, one location at a time. The program was typically a small driver for an attached paper tape reader, which would then be used to read in another "real" program. Later systems added bootstrapping code to improve this process, and the machines became almost universally associated with the CP/M operating system, loaded from floppy disk.

The Altair created a new industry of microcomputers and computer kits, with many others following, such as a wave of small business computers in the late 1970s based on the Intel 8080, Zilog Z80 and Intel 8085 microprocessor chips. Most ran the CP/M-80 operating system developed by Gary Kildall at Digital Research. CP/M-80 was the first popular microcomputer operating system to be used by many different hardware vendors, and many software packages were written for it, such as WordStar and dBase II.

1.3.11 Homebrew Computer Club

Although the Altair spawned an entire business, another side effect it had was to demonstrate that the microprocessor had so reduced the cost and complexity of building a microcomputer that anyone with an interest could build their own. Many such hobbyists met and traded notes at the meetings of the Homebrew Computer Club (HCC) in Silicon Valley. Although the HCC was relatively short-lived, its influence on the development of the modern PC was enormous.

Members of the group complained that microcomputers would never become commonplace if they still had to be built up, from parts like the original Altair, or even in terms of assembling the various add-ons that turned the machine into a useful system. What they felt was needed was an all-in-one system. Out of this desire came the Sol-20 computer, which placed an entire S-100 system (QWERTY keyboard, CPU, display card, memory and ports) into an attractive single box. The systems were packaged with a cassette tape interface for storage and a 12" monochrome monitor. Complete with a copy of BASIC, the system sold for US$2,100. About 10,000 Sol-20 systems were sold.

Although the Sol-20 was the first all-in-one system that we would recognize today, the basic concept was already rippling through other members of the group, and interested external companies.

1.3.12 Other machines of the era

Other 1977 machines that were important within the hobbyist community at the time included the Exidy Sorcerer, the NorthStar Horizon, the Cromemco Z-2, and the Heathkit H8.

1.4 1977 and the emergence of the Trinity

See also: Microcomputer revolution

By 1976 there were several firms racing to introduce the first truly successful commercial personal computers. Three machines, the Apple II, PET 2001 and TRS-80, were all released in 1977,[24] eventually selling millions of machines. Byte magazine later referred to their launch as the "1977 Trinity".

1.4.1 PET

Main article: Commodore PET

[Figure: Oct. 1977: Commodore PET.]

Chuck Peddle designed the Commodore PET (short for Personal Electronic Transactor) around his MOS 6502 processor. It was essentially a single-board computer with a new display chip (the MOS 6545) driving a small built-in monochrome monitor with 40×25 character graphics. The processor card, keyboard, monitor and cassette drive were all mounted in a single metal case. In 1982, Byte referred to the PET design as "the world's first personal computer".[25]

The PET shipped in two models; the 2001-4 with 4 kB of RAM, or the 2001-8 with 8 kB. The machine also included a built-in Datassette for data storage located on the front of the case, which left little room for the keyboard.

The 2001 was announced in June 1977 and the rst 100
units were shipped in mid October 1977.[26] However
they remained back-ordered for months, and to ease deliveries they eventually canceled the 4 kB version early
the next year.

Although the machine was fairly successful, there were frequent complaints about the tiny calculator-like keyboard, often referred to as a "Chiclet keyboard" due to the keys' resemblance to the popular gum candy. This was addressed in the upgraded "dash N" and "dash B" versions of the 2001, which put the cassette outside the case and included a much larger keyboard with a full-stroke, non-click motion. Internally a newer and simpler motherboard was used, along with an upgrade in memory to 8, 16, or 32 KB, known as the 2001-N-8, 2001-N-16 or 2001-N-32, respectively.

The PET was the least successful of the 1977 Trinity machines, with under 1 million sales.[27]

1.4.2 Apple II

Main article: Apple II

Apr. 1977: Apple II

Steve Wozniak (known as "Woz"), a regular visitor to Homebrew Computer Club meetings, designed the single-board Apple I computer and first demonstrated it there. With specifications in hand and an order for 100 machines at US$500 each from the Byte Shop, Woz and his friend Steve Jobs founded Apple Computer.

About 200 of the machines sold before the company announced the Apple II as a complete computer. It had color graphics, a full QWERTY keyboard, and internal slots for expansion, which were mounted in a high quality streamlined plastic case. The monitor and I/O devices were sold separately. The original Apple II operating system was only the built-in BASIC interpreter contained in ROM. Apple DOS was added to support the diskette drive; the last version was Apple DOS 3.3.

Its higher price and lack of floating point BASIC, along with a lack of retail distribution sites, caused it to lag in sales behind the other Trinity machines until 1979, when it surpassed the PET. It was again pushed into 4th place when Atari introduced its popular Atari 8-bit systems.[28] Despite slow initial sales, the Apple II's lifetime was about eight years longer than other machines, and so it accumulated the highest total sales. By 1985, 2.1 million had sold, and more than 4 million Apple IIs were shipped by the end of its production in 1993.[27]

1.4.3 TRS-80

Main article: TRS-80

Nov. 1977: TRS-80 Model I

Tandy Corporation (Radio Shack) introduced the TRS-80, retroactively known as the Model I as improved models were introduced. The Model I combined the motherboard and keyboard into one unit with a separate monitor and power supply. Although the PET and the Apple II offered certain features that were greatly advanced in comparison, Tandy's 3000+ Radio Shack storefronts ensured that it would have widespread distribution that neither Apple nor Commodore could touch.
The Model I used a Zilog Z80 processor clocked at 1.77 MHz (the later models were shipped with a Z80A processor). The basic model originally shipped with 4 kB of RAM, and later 16 kB, in the main computer. The expansion unit allowed for RAM expansion to a total of 48K. Its other strong features were its full-stroke QWERTY keyboard, small size, well-written Microsoft floating-point BASIC, and inclusion of a monitor and tape deck for approximately half the cost of the Apple II. Eventually, 5.25 inch floppy drives were made available by Tandy and several third-party manufacturers. The expansion unit allowed up to four floppy drives to be connected, provided a slot for the RS-232 option and a parallel port for printers.

The Model I ran into some trouble meeting FCC regulations on radio interference due to its plastic case and exterior cables. Apple had resolved this issue with an interior metallic foil, but this patch wouldn't work on the Model I.[29] Since the Model II and Model III were already in production, Tandy decided to stop manufacturing the Model I. Radio Shack had sold 1.5 million Model I's by its cancellation in 1981.[27]

1.5 Home computers

See also: Home computer

Although the success of the Trinity machines was relatively limited in overall terms, as component prices continued to fall, many companies entered the computer business. This led to an explosion of low-cost machines known as home computers that sold millions of units before the market imploded in a price war in the early 1980s.

1.5.1 Atari 400/800

Main article: Atari 8-bit family

Atari was a well-known brand in the late 1970s, both due to their hit arcade games like Pong, as well as the hugely successful Atari VCS game console. Realizing that the VCS would have a limited lifetime in the market before a technically advanced competitor came along, Atari decided they would be that competitor, and started work on a new console design that was much more advanced. While these designs were being developed, the Trinity machines hit the market with considerable fanfare. Atari's management decided to change their work to a home computer system instead. Their knowledge of the home market through the VCS resulted in machines that were almost indestructible and just as easy to use as a games machine: simply plug in a cartridge and go. The new machines were first introduced as the 400 and 800 in 1978, but production problems meant widespread sales did not start until the next year.

At the time, the machines offered what was then much higher performance than contemporary designs and a number of graphics and sound features that no other microcomputer could match. They became very popular as a result, quickly eclipsing the Trinity machines in sales. In spite of a promising start with about 600,000 sold by 1981, the looming price war left Atari in a bad position. They were unable to compete effectively with Commodore, and only about 2 million machines were produced by the end of their production run.[27]

1.5.2 Sinclair

Sinclair Research Ltd is a British consumer electronics company founded by Sir Clive Sinclair in Cambridge. It was incorporated in 1973 as Ablesdeal Ltd., renamed Westminster Mail Order Ltd and then Sinclair Instrument Ltd. in 1975. The company remained dormant until 1976, when it was activated with the intention of continuing Sinclair's commercial work from his earlier company Sinclair Radionics; it adopted the name Sinclair Research in 1981. In 1980, Clive Sinclair entered the home computer market with the ZX80 at £99.95, at the time the cheapest personal computer for sale in the UK. In 1982 the ZX Spectrum was released, later becoming Britain's best selling computer, competing aggressively against Commodore and Amstrad. At the height of its success, and largely inspired by the Japanese Fifth Generation Computer programme, the company established the MetaLab research centre at Milton Hall (near Cambridge) in order to pursue artificial intelligence, wafer-scale integration, formal verification and other advanced projects. The combination of the failures of the Sinclair QL computer and the TV80 led to financial difficulties in 1985, and a year later Sinclair sold the rights to their computer products and brand name to Amstrad. Sinclair Research Ltd exists today as a one-man company, continuing to market Sir Clive Sinclair's newest inventions.

ZX80

Main article: Sinclair ZX80

The ZX80 home computer was launched in February 1980 at £79.95 in kit form and £99.95 ready-built. In November of the same year Science of Cambridge was renamed Sinclair Computers Ltd.

ZX81

Main article: Sinclair ZX81

The ZX81 (known as the TS 1000 in the United States) was priced at £49.95 in kit form and £69.95 ready-built, by mail order.

ZX Spectrum

Main article: Sinclair Spectrum

1985: Sinclair ZX Spectrum+ 128

The ZX Spectrum was launched on 23 April 1982, priced at £125 for the 16 KB RAM version and £175 for the 48 KB version.

Sinclair QL

Main article: Sinclair QL

The Sinclair QL was announced in January 1984, priced at £399. Marketed as a more sophisticated 32-bit microcomputer for professional users, it used a Motorola 68008 processor. Production was delayed by several months, due to unfinished development of hardware and software at the time of the QL's launch.

ZX Spectrum+

The ZX Spectrum+ was a repackaged ZX Spectrum 48K launched in October 1984.

ZX Spectrum 128

The ZX Spectrum 128, with RAM expanded to 128 kB, a sound chip and other enhancements, was launched in Spain in September 1985 and in the UK in January 1986, priced at £179.95.

1.5.3 TI-99

Main article: TI-99

Texas Instruments (TI), at the time the world's largest chip manufacturer, decided to enter the home computer market with the Texas Instruments TI-99/4A. Announced long before its arrival, most industry observers expected the machine to wipe out all competition: on paper its performance was untouchable, and TI had enormous cash reserves and development capability. When it was released in late 1979, TI took a somewhat slow approach to introducing it, initially focusing on schools. Contrary to earlier predictions, the TI-99's limitations meant it was not the giant-killer everyone expected, and a number of its design features were highly controversial. A total of 2.8 million units were shipped before the TI-99/4A was discontinued in March 1984.

1.5.4 VIC-20 and Commodore 64

1982: Commodore 64.

Realizing that the PET could not easily compete with color machines like the Apple II and Atari, Commodore introduced the VIC-20 to address the home market. Limitations due to its tiny 4 kB memory and its relatively limited display in comparison to those machines were offset by a low and ever-falling price. Millions of VIC-20s were sold.

The best-selling personal computer of all time was released by Commodore International in 1982: the Commodore 64 (C64) sold over 17 million units before its end.[27][30] The C64 name derived from its 64 kB of RAM, and it also came with a side-mount ROM cartridge slot. It used the 6510 microprocessor CPU; MOS Technology, Inc. was then owned by Commodore.

1.5.5 BBC Micro

The BBC became interested in running a computer literacy series, and sent out a tender for a standardized small computer to be used with the show. After examining several entrants, they selected what was then known as the Acorn Proton and made a number of minor changes to produce the BBC Micro. The Micro was relatively expensive, which limited its commercial appeal, but with widespread marketing, BBC support and a wide variety of programs, the system eventually sold as many as 1.5 million units. Acorn was rescued from obscurity, and went on to develop the ARM processor (Acorn RISC Machine) to power follow-on designs. The ARM is widely used to this day, powering a wide variety of products like the iPhone.


1.5.6 Commodore/Atari price war and crash

TI had forced Commodore from the calculator market by dropping the price of its own-brand calculators to less than the cost of the chipsets it sold to third parties to make the same design. Commodore's CEO, Jack Tramiel, vowed that this would not happen again, and purchased MOS Technology to ensure a supply of chips. With his supply guaranteed, and good control over the component pricing, Tramiel launched a war against TI soon after the introduction of the Commodore 64.

Now vertically integrated,[32] Commodore lowered the retail price of the 64 to $300 at the June 1983 Consumer Electronics Show, and stores sold it for as little as $199. At one point the company was selling as many computers as the rest of the industry combined.[33] TI responded by lowering the list price of the 99/4A from $400 in the fall of 1982 to $99 by June, causing a loss of hundreds of millions of dollars; a Service Merchandise executive stated, "I've been in retailing 30 years and I have never seen any category of goods get on a self-destruct pattern like this."[32] While Tramiel's target was TI, everyone in the home computer market was hurt by the process; many companies went bankrupt or exited the business. In the end even Commodore's own finances were crippled by the demands of financing the massive building expansion needed to deliver the machines, and Tramiel was forced from the company.

1.5.7 Japanese computers

From the late 1970s to the early 1990s, Japan's personal computer market was largely dominated by domestic computer products. NEC's PC-88 and PC-98 were the market leaders, though with some competition from the Sharp X1 and X68000, the FM-7 and FM Towns, and the MSX and MSX2, the latter also gaining some popularity in Europe. A key difference between Western and Japanese systems at the time was the latter's higher display resolutions (640x400) in order to accommodate Japanese text. Japanese computers also employed Yamaha FM synthesis sound boards from the early 1980s, allowing the production of higher quality sound. Japanese computers were widely used to produce video games, though only a small portion of Japanese PC games were released outside of the country.[34] The most successful Japanese personal computer was NEC's PC-98, which sold more than 18 million units by 1999.[35]

1.6 The IBM PC

Main article: IBM PC

1981: IBM 5150

The IBM PC was the first PC that justified widespread use. IBM responded to the success of the Apple II with the IBM PC, released in August 1981. Like the Apple II and S-100 systems, it was based on an open, card-based architecture, which allowed third parties to develop for it. It used the Intel 8088 CPU running at 4.77 MHz, containing 29,000 transistors. The first model used an audio cassette for external storage, though there was an expensive floppy disk option. The cassette option was never popular and was removed in the PC XT of 1983.[36] The XT added a 10 MB hard drive in place of one of the two floppy disks and increased the number of expansion slots from 5 to 8. While the original PC design could accommodate only up to 64K of RAM on the main board, the architecture was able to accommodate up to 640 KB of RAM, with the rest on cards. Later revisions of the design increased the limit to 256K on the main board.

The IBM PC typically came with PC DOS, an operating system based upon Gary Kildall's CP/M-80 operating system. In 1980, IBM approached Digital Research, Kildall's company, for a version of CP/M for its upcoming IBM PC. Kildall's wife and business partner, Dorothy McEwen, met with the IBM representatives, who were unable to negotiate a standard non-disclosure agreement with her. IBM turned to Bill Gates, who was already providing the ROM BASIC interpreter for the PC. Gates offered to provide 86-DOS, developed by Tim Paterson of Seattle Computer Products. IBM rebranded it as PC DOS, while Microsoft sold variations and upgrades as MS-DOS.

The impact of the Apple II and the IBM PC was fully demonstrated when Time named the home computer the "Machine of the Year", or Person of the Year, for 1982 (January 3, 1983, "The Computer Moves In"). It was the first time in the history of the magazine that an inanimate object was given this award.

1.6.1 IBM PC clones

The original PC design was followed up in 1983 by the IBM XT, which was an incrementally improved design; it omitted support for the cassette, had more card slots, and was available with a 10 MB hard drive. Although mandatory at first, the hard drive was later made an option and a two floppy disk XT was sold. While the architectural memory limit of 640K was the same, later versions were more readily expandable.

In 1984, IBM introduced the IBM Personal Computer/AT (more often called the PC/AT or AT) built around the Intel 80286 microprocessor. This chip was much faster, and could address up to 16 MB of RAM, but only in a mode that largely broke compatibility with the earlier 8086 and 8088. In particular, the MS-DOS operating system was not able to take advantage of this capability.

Although the PC and XT included a version of the BASIC language in read-only memory, most were purchased with disk drives and run with an operating system; three operating systems were initially announced with the PC. One was CP/M-86 from Digital Research, the second was PC DOS from IBM, and the third was the UCSD p-System (from the University of California at San Diego). PC DOS was the IBM-branded version of an operating system from Microsoft, previously best known for supplying BASIC language systems to computer hardware companies. When sold by Microsoft, PC DOS was called MS-DOS. The UCSD p-System OS was built around the Pascal programming language and was not marketed to the same niche as IBM's customers. Neither the p-System nor CP/M-86 was a commercial success.

Because MS-DOS was available as a separate product, some companies attempted to make computers available which could run MS-DOS and programs. These early machines, including the ACT Apricot, the DEC Rainbow 100, the Hewlett-Packard HP-150, the Seequa Chameleon and many others, were not especially successful, as they required a customized version of MS-DOS and could not run programs designed specifically for IBM's hardware. (See List of early non-IBM-PC-compatible PCs.) The first truly IBM PC compatible machines came from Compaq, although others soon followed.

Because the IBM PC was based on relatively standard integrated circuits, and the basic card-slot design was not patented, the key portion of that hardware was actually the BIOS software embedded in read-only memory. This critical element got reverse engineered, and that opened the floodgates to the market for IBM PC imitators, which were dubbed "PC clones". At the time that IBM had decided to enter the personal computer market in response to Apple's early success, IBM was the giant of the computer industry and was expected to crush Apple's market share. But because of these shortcuts that IBM took to enter the market quickly, they ended up releasing a product that was easily copied by other manufacturers using off-the-shelf, non-proprietary parts. So in the long run, IBM's biggest role in the evolution of the personal computer was to establish the de facto standard for hardware architecture amongst a wide range of manufacturers. IBM's pricing was undercut to the point where IBM was no longer the significant force in development, leaving only the PC standard they had established. Emerging as the dominant force from this battle amongst hardware manufacturers who were vying for market share was the software company Microsoft, which provided the operating system and utilities to all PCs across the board, whether authentic IBM machines or the PC clones.

1.7 Apple Lisa and Macintosh

1984: Apple Macintosh.

In 1983 Apple Computer introduced the first mass-marketed microcomputer with a graphical user interface, the Lisa. The Lisa ran on a Motorola 68000 microprocessor and came equipped with 1 megabyte of RAM, a 12-inch (300 mm) black-and-white monitor, dual 5.25-inch floppy disk drives and a 5 megabyte Profile hard drive. The Lisa's slow operating speed and high price (US$10,000), however, led to its commercial failure.

Drawing upon its experience with the Lisa, Apple launched the Macintosh in 1984, with an advertisement during the Super Bowl. The Macintosh was the first successful mass-market mouse-driven computer with a graphical user interface or "WIMP" (Windows, Icons, Menus, and Pointers). Based on the Motorola 68000 microprocessor, the Macintosh included many of the Lisa's features at a price of US$2,495. The Macintosh was introduced with 128 kB of RAM and later that year a 512 kB RAM model became available. To reduce costs compared to the Lisa, the year-younger Macintosh had a simplified motherboard design, no internal hard drive, and a single 3.5-inch floppy drive. Applications that came with the Macintosh included MacPaint, a bit-mapped graphics program, and MacWrite, which demonstrated WYSIWYG word processing.


While not a success upon its release, the Macintosh was a successful personal computer for years to come. This is particularly due to the introduction of desktop publishing in 1985 through Apple's partnership with Adobe. This partnership introduced the LaserWriter printer and Aldus PageMaker (now Adobe PageMaker) to users of the personal computer. During Steve Jobs' hiatus from Apple, a number of different models of Macintosh, including the Macintosh Plus and Macintosh II, were released to a great degree of success. The entire Macintosh line of computers was IBM's major competition up until the early 1990s.

1.7.1 GUIs spread

In the Commodore world, GEOS was available on the Commodore 64 and Commodore 128. Later, a version was available for PCs running DOS. It could be used with a mouse or a joystick as a pointing device, and came with a suite of GUI applications. Commodore's later product line, the Amiga platform, ran a GUI operating system by default. The Amiga laid the blueprint for future development of personal computers with its groundbreaking graphics and sound capabilities. Byte called it "the first multimedia computer... so far ahead of its time that almost nobody could fully articulate what it was all about."[37]

1985: Atari ST.

In 1985, the Atari ST, also based on the Motorola 68000 microprocessor, was introduced with the first color GUI in the Atari TOS. It could be modified to emulate the Macintosh using the third-party Spectre GCR device.

In 1987, Acorn launched the Archimedes range of high-performance home computers in Europe and Australasia. Based on their own 32-bit ARM RISC processor, the systems were shipped with a GUI OS called Arthur. In 1989, Arthur was superseded by a multi-tasking GUI-based operating system called RISC OS. By default, the mice used on these computers had three buttons.

1.8 PC clones dominate

The transition from a PC-compatible market being driven by IBM to one driven primarily by a broader market began to become clear in 1986 and 1987; in 1986, the 32-bit Intel 80386 microprocessor was released, and the first '386-based PC-compatible was the Compaq Deskpro 386. IBM's response came nearly a year later with the initial release of the IBM Personal System/2 series of computers, which had a closed architecture and were a significant departure from the emerging standard PC. These models were largely unsuccessful, and the PC clone style machines outpaced sales of all other machines through the rest of this period.[38] Toward the end of the 1980s PC XT clones began to take over the home computer market segment from the specialty manufacturers such as Commodore International and Atari that had previously dominated. These systems typically sold for just under the "magic" $1000 price point (typically $999) and were sold via mail order rather than a traditional dealer network. This price was achieved by using the older 8/16-bit technology, such as the 8088 CPU, instead of the 32 bits of the latest Intel CPUs. These CPUs were usually made by a third party such as Cyrix or AMD. Dell started out as one of these manufacturers, under its original name PC's Limited.

1.9 1990s and 2000s

1.9.1 NeXT

In 1990, the NeXTstation workstation computer went on sale, for "interpersonal" computing as Steve Jobs described it. The NeXTstation was meant to be a new computer for the 1990s, and was a cheaper version of the previous NeXT Computer. Despite its pioneering use of object-oriented programming concepts, the NeXTstation was somewhat of a commercial failure, and NeXT shut down hardware operations in 1993.

The CD-ROM and CD-RW drives became standards for most personal computers.

1.9.2 CD-ROM

In the early 1990s, the CD-ROM became an industry standard, and by the mid-1990s one was built into almost all desktop computers, and towards the end of the 1990s, in laptops as well. Although introduced in 1982, the CD-ROM was mostly used for audio during the 1980s, and then for computer data such as operating systems and applications into the 1990s. Another popular use of CD-ROMs in the 1990s was multimedia, as many desktop computers started to come with built-in stereo speakers capable of playing CD-quality music and sounds with the Sound Blaster sound card on PCs.

1.9.3 ThinkPad

IBM introduced its successful ThinkPad range at COMDEX 1992 using the series designators 300, 500 and 700 (allegedly analogous to the BMW car range and used to indicate market), the 300 series being the budget, the 500 series midrange and the 700 series high end. This designation continued until the late 1990s, when IBM introduced the T series as 600/700 series replacements, and the 3, 5 and 7 series model designations were phased out for the A (3 and 7) and X (5) series. The A series was later partially replaced by the R series.

1.9.4 Dell

By the mid-1990s, Amiga, Commodore and Atari systems were no longer on the market, pushed out by strong IBM PC clone competition and low prices. Other previous competition such as Sinclair and Amstrad were no longer in the computer market. With less competition than ever before, Dell rose to high profits and success, introducing low-cost systems targeted at consumers and business markets using a direct-sales model. Dell surpassed Compaq as the world's largest computer manufacturer, and held that position until October 2006.

1.9.5 Power Macintosh, PowerPC

In 1994, Apple introduced the Power Macintosh series of high-end professional desktop computers for desktop publishing and graphic designers. These new computers made use of new Motorola PowerPC processors as part of the AIM alliance, to replace the previous Motorola 68k architecture used for the Macintosh line. During the 1990s, the Macintosh remained with a low market share, but as the primary choice for creative professionals, particularly those in the graphics and publishing industries.

1.9.6 Risc PC

Also in 1994, Acorn Computers launched its Risc PC series of high-end desktop computers. The Risc PC (code-named Medusa) was Acorn's next generation ARM-based RISC OS computer, which superseded the Acorn Archimedes.

1.9.7 BeBox

In 1995, Be Inc. released the BeBox computer, which used dual PowerPC 603 processors running at 66 MHz, and later 133 MHz, with the Be operating system. The BeBox was largely a failure, with fewer than 2,000 units produced between October 1995 and January 1997, when production was ceased.

1.9.8 IBM clones, Apple back into profitability

The 1998 iMac, the iMac G3, brought Apple back into profit.

Due to the sales growth of IBM clones in the '90s, they became the industry standard for business and home use. This growth was augmented by the introduction of Microsoft's Windows 3.0 operating environment in 1990, followed by Windows 3.1 in 1992 and the Windows 95 operating system in 1995. The Macintosh was sent into a period of decline by these developments coupled with Apple's own inability to come up with a successor to the Macintosh operating system, and by 1996 Apple was almost bankrupt. In December 1996 Apple bought NeXT, and in what has been described as a "reverse takeover", Steve Jobs returned to Apple in 1997. The NeXT purchase and Jobs' return brought Apple back to profitability, first with the release of Mac OS 8, a major new version of the operating system for Macintosh computers, and then with the PowerMac G3 and iMac computers for the professional and home markets. The iMac was notable for its transparent bondi blue casing in an ergonomic shape, as well as its discarding of legacy devices such as a floppy drive and serial ports in favor of Ethernet and USB connectivity. The iMac sold several million units and a subsequent model using a different form factor remains in production as of January 2012. In 2001 Mac OS X, the long-awaited next generation Mac OS based on the NeXT technologies, was finally introduced by Apple, cementing its comeback.

1.9.9 Writable CDs, MP3, P2P file sharing

The ROM in CD-ROM stands for Read Only Memory. In the late 1990s CD-R and, later, rewritable CD-RW drives were included instead of standard CD-ROM drives. This gave the personal computer user the capability to copy and "burn" standard Audio CDs which were playable in any CD player. As computer hardware grew more powerful and the MP3 format became pervasive, "ripping" CDs into small, compressed files on a computer's hard drive became popular. "Peer to peer" file sharing networks such as Napster, Kazaa and Gnutella arose to be used almost exclusively for sharing music files and became a primary computer activity for many individuals.

1.9.10 USB, DVD player

Since the late 1990s, many more personal computers started shipping that included USB (Universal Serial Bus) ports for easy plug and play connectivity to devices such as digital cameras, video cameras, personal digital assistants, printers, scanners, USB flash drives and other peripheral devices. By the early 21st century, all shipping computers for the consumer market included at least two USB ports. Also during the late 1990s DVD players started appearing on high-end, usually more expensive, desktop and laptop computers, and eventually on consumer computers into the first decade of the 21st century.

1.9.11 Hewlett-Packard

In 2002, Hewlett-Packard (HP) purchased Compaq. Compaq itself had bought Tandem Computers in 1997 (which had been started by ex-HP employees), and Digital Equipment Corporation in 1998. Following this strategy HP became a major player in desktops, laptops, and servers for many different markets. The buyout made HP the world's largest manufacturer of personal computers, until Dell later surpassed HP.

1.9.12 64 bits

In 2003, AMD shipped its 64-bit based microprocessor line for desktop computers, Opteron and Athlon 64. Also in 2003, IBM released the 64-bit based PowerPC 970 for Apple's high-end Power Mac G5 systems. Intel, in 2004, reacted to AMD's success with 64-bit based processors, releasing updated versions of their Xeon and Pentium 4 lines. 64-bit processors were first common in high end systems, servers and workstations, and then gradually replaced 32-bit processors in consumer desktop and laptop systems since about 2005.

1.9.13 Lenovo

In 2004, IBM announced the proposed sale of its PC business to Chinese computer maker Lenovo Group, which is partially owned by the Chinese government, for US$650 million in cash and $600 million US in Lenovo stock. The deal was approved by the Committee on Foreign Investment in the United States in March 2005, and completed in May 2005. IBM will have a 19% stake in Lenovo, which will move its headquarters to New York State and appoint an IBM executive as its chief executive officer. The company will retain the right to use certain IBM brand names for an initial period of five years. As a result of the purchase, Lenovo inherited a product line that featured the ThinkPad, a line of laptops that had been one of IBM's most successful products.

1.9.14 Wi-Fi, LCD monitor, multi-core processor, flash memory

In the early 21st century, Wi-Fi began to become increasingly popular as many consumers started installing their own wireless home networks. Many of today's laptops and desktop computers are sold pre-installed with wireless cards and antennas. Also in the early 21st century, LCD monitors became the most popular technology for computer monitors, with CRT production being slowed down. LCD monitors are typically sharper, brighter, and more economical than CRT monitors. The first decade of the 21st century also saw the rise of multi-core processors and flash memory. Once limited to high-end industrial use due to expense, these technologies are now mainstream and available to consumers. In 2008 the MacBook Air and Asus Eee PC were released, laptops that dispense with an optical drive and hard drive entirely, relying on flash memory for storage.

1.9.15 Local area networks

The invention in the late 1970s of local area networks (LANs), notably Ethernet, allowed PCs to communicate with each other (peer-to-peer) and with shared printers. As the microcomputer revolution continued, more robust versions of the same technology were used to produce microprocessor-based servers that could also be linked to the LAN. This was facilitated by the development of server operating systems to run on the Intel architecture, including several versions of both Unix and Microsoft Windows.

1.10 Market

In 2001, 125 million personal computers were shipped in comparison to 48,000 in 1977. More than 500 million PCs were in use in 2002, and one billion personal computers had been sold worldwide from the mid-1970s up to that time. Of the latter figure, 75 percent were professional or work related, while the rest were sold for personal or home use. About 81.5 percent of PCs shipped had been desktop computers, 16.4 percent laptops and 2.1 percent servers. The United States had received 38.8 percent (394 million) of the computers shipped, Europe 25 percent, and 11.7 percent had gone to the Asia-Pacific region, the fastest-growing market as of 2002.[39] Almost half of all the households in Western Europe had a personal computer, and a computer could be found in 40 percent of homes in the United Kingdom, compared with only 13 percent in 1985.[40] The third quarter of 2008 marked the first time laptops outsold desktop PCs in the United States.[41]

As of June 2008, the number of personal computers in use worldwide hit one billion. Mature markets like the United States, Western Europe and Japan accounted for 58 percent of the worldwide installed PCs. About 180 million PCs (16 percent of the existing installed base) were expected to be replaced and 35 million to be dumped into landfill in 2008. The whole installed base grew 12 percent annually.[42][43]

1.11 See also

Timeline of electrical and electronic engineering
Computer museum and Personal Computer Museum
Expensive Desk Calculator
MIT Computer Science and Artificial Intelligence Laboratory
Educ-8, a 1974 pre-microprocessor microcomputer
Mark-8, a 1974 microprocessor-based microcomputer
Programma 101, a 1965 programmable calculator with some attributes of a personal computer
SCELBI, another 1974 microcomputer
Simon (computer), a 1949 demonstration of computing principles

1.12 References

[1] "Pocket Computer May Replace Shopping List". The New York Times. November 3, 1962.

[2] "9100A desktop calculator, 1968" (PDF). Hewlett-Packard. Retrieved 2008-02-13.

[3] Hewlett-Packard (October 25, 1966). "Restoring the Balance between Analysis and Computation" (PDF). Science Magazine 169 (3852): 409. Retrieved 2008-02-13.

[4] Shapiro, F.R.; Shapiro, F.R. (December 2000). "Annals of the History of Computing". IEEE Annals of the History of Computing (IEEE Journal) 22 (4): 70-71. doi:10.1109/MAHC.2000.887997.

[5] Helmers, Carl (October 1975). "What is BYTE". BYTE. pp. 4, col 3, para 2. Retrieved 2008-02-13.

[6] Horn, B.; Winston, P. (May 1975). "Personal Computers". Datamation. p. 11. Retrieved 2008-02-13.

[7] "Most Important Companies". Byte. September 1995. Retrieved 2008-06-10.

[8] "Birth of an Industry 1976-77. Apple Computer Inc. advertisements". Kelley Advertising and Marketing. Retrieved 2008-06-14. "Introducing Apple II. You've just run out of excuses for not owning a personal computer."

[9] "Oldest Known Commodore PET Brochure". Retrieved 2008-06-14.

[10] Reimer, Jeremy (December 14, 2005). "Total share: 30 years of personal computer market share figures; The 8-bit era (1980-1984)". Ars Technica. p. 4. Retrieved 2008-02-13.

[11] Anthony Ralston and Edwin D. Reilly (ed.), Encyclopedia of Computer Science, 3rd Edition, Van Nostrand Reinhold, 1993, ISBN 0-442-27679-6, article "Digital Computers History".

[12] Rheingold, H. (2000). Tools for thought: the history and future of mind-expanding technology (New ed.). Cambridge, MA etc.: The MIT Press.

[13] "What was the first personal computer?" at Blinkenlights Archaeological Institute. Accessed: March 15, 2008.

[14] "The IBM 610 Auto-Point Computer". Columbia University.

[15] "'Desk-top' computer is typewriter size". Business Week. October 23, 1965.

[16] "Desk-Top Size Computer Is Being Sold by Olivetti For First Time in U.S.". Wall Street Journal. October 15, 1965.

[17]

[18] "Documentary highlights how Programma 101 put people first". February 15, 2011.

[19] "Olivetti Programma P101/P102". old-computers.com. Retrieved 10:12 PM 11/8/2010. "The P101, and particularly the magnetic card, was covered by a US patent (3,495,222, Perotto et al.) and this gave to Olivetti over $900,000 in royalties by HP alone, for the re-use of this technology in the HP9100 series."

[20] Perotto, Pier Giorgio; et al. (Feb 10, 1970). "3,495,222 PROGRAM CONTROLLED ELECTRONIC COMPUTER" (multiple). United States Patent Office. Google patents. Retrieved Nov 8, 2010.

[21] Pospelov, Dmitry. "[MIR series of computers. The first personal computers]". Glushkov Foundation (in Russian). Institute of Applied Informatics. Retrieved 19 Nov 2012.

[22]

[23] A History of Modern Computing, (MIT Press), pp. 220-21.

[24] Alfred Dupont Chandler, Takashi Hikino, Andrew Von Nordenflycht, Inventing the Electronic Century: The Epic Story of the Consumer Electronics and Computer Industries, Harvard University Press, 2009, page 134.

[25] Lemmons, Phil (November 1982). "Chuck Peddle: Chief Designer of the Victor 9000" (PDF). Byte Magazine. Retrieved 2008-06-14.

[26] What's New (February 1978). "Commodore Ships First PET Computers". BYTE (Byte Publications) 3 (2): 190. Commodore press release. "The PET computer made its debut recently as the first 100 units were shipped to waiting customers in mid October 1977."

[27] Reimer, Jeremy (December 2005). "Personal Computer Market Share: 1975-2004". Ars Technica. Retrieved 2008-02-13.

[28] Reimer, Jeremy (December 14, 2005). "Total share: 30 years of personal computer market share figures; The new era (2001- )". Ars Technica. p. 9. Retrieved 2008-02-13.

[29] Veit, Stan. "TRS-80 - the 'Trash-80'". pc-history.org. Retrieved 2010-0-0.

[30] Kahney, Leander (2003-09-09). "Grandiose Price for a Modest PC". Wired (Lycos). Retrieved 2006-10-25.

[31] Williams, Gregg; Welch, Mark; Avis, Paul (September 1985). "A Microcomputing Timeline". BYTE. p. 198. Retrieved 27 October 2013.

[32] Pollack, Andrew (1983-06-19). "The Coming Crisis in Home Computers". The New York Times. Retrieved 19 January 2015.

[33] Mitchell, Peter W. (1983-09-06). "A summer-CES report". Boston Phoenix. p. 4. Retrieved 10 January 2015.

[34] Szczepaniak, John. "Retro Japanese Computers: Gaming's Final Frontier". Hardcore Gaming 101. Retrieved 2011-03-29. Reprinted from Retro Gamer (67), 2009.

[35] "Computing Japan". Computing Japan (LINC Japan) 54-59: 18. 1999. Retrieved 6 February 2012. "...its venerable PC 9800 series, which has sold more than 18 million units over the years, and is the reason why NEC has been the number one PC vendor in Japan for as long as anyone can remember."

[36] The Old Computer Hut - Intel family microcomputers (1)

[37]

[38] Reimer, Jeremy (December 14, 2005). "Total share: 30 years of personal computer market share figures; The rise of the PC (1987-1990)". Ars Technica. p. 6. Retrieved 2008-02-13.

[39] PCs: More than 1 billion served

[40] Computers reach one billion mark

[41] Notebook sales surpass PCs for first time in US

[42] Gartner Says More than 1 Billion PCs In Use Worldwide and Headed to 2 Billion Units by 2014

[43] Computers in use pass 1 billion mark: Gartner

1.13 Further reading

Veit, Stan (1993). Stan Veit's History of the Personal Computer. WorldComm. p. 304. ISBN 978-1-56664-030-5.
Smith, Douglas K.; Alexander, Robert C. (1999). Fumbling the Future: How Xerox Invented, then Ignored, the First Personal Computer. Authors Choice Press. p. 276. ISBN 978-1-58348-266-7.
Freiberger, Paul; Swaine, Michael (2000). Fire in the Valley: The Making of The Personal Computer. McGraw-Hill Companies. p. 463. ISBN 978-0-07-135892-7.
Allan, Roy A. (2001). A History of the Personal Computer: The People and the Technology. Allan Publishing. p. 528. ISBN 978-0-9689108-0-1.
Sherman, Josepha (2003). The History of the Personal Computer. Franklin Watts. p. 64. ISBN 978-0-531-16213-2.
Laing, Gordon (2004). Digital Retro: The Evolution and Design of the Personal Computer. Sybex. p. 192. ISBN 978-0-7821-4330-0.


1.14 External links

A history of the personal computer: the people and the technology (PDF)
BlinkenLights Archaeological Institute - Personal Computer Milestones
Personal Computer Museum - a publicly viewable museum in Brantford, Ontario, Canada
Old Computers Museum - displaying over 100 historic machines
Chronology of Personal Computers - a chronology of computers from 1947 on
Total share: 30 years of personal computer market share figures
Obsolete Technology - Old Computers


Chapter 2

History of computing
The history of computing is longer than the history of
computing hardware and modern computing technology
and includes the history of methods intended for pen and
paper or for chalk and slate, with or without the aid of
tables. The timeline of computing presents a summary
list of major developments in computing by date.

2.1 Concrete devices

Computing is intimately tied to the representation of numbers. But long before abstractions like "the number" arose, there were mathematical concepts to serve the purposes of civilization. These concepts are implicit in concrete practices such as:

one-to-one correspondence, a rule to count how many items, say on a tally stick, eventually abstracted into numbers;

comparison to a standard, a method for assuming reproducibility in a measurement, for example, the number of coins;

the 3-4-5 right triangle, a device for assuring a right angle, using ropes with 12 evenly spaced knots, for example.

2.2 Numbers

Eventually, the concept of numbers became concrete and familiar enough for counting to arise, at times with sing-song mnemonics to teach sequences to others. All known languages have words for at least "one" and "two" (although this is disputed: see Piraha language), and even some animals like the blackbird can distinguish a surprising number of items.[1]

Advances in the numeral system and mathematical notation eventually led to the discovery of mathematical operations such as addition, subtraction, multiplication, division, squaring, square root, and so forth. Eventually the operations were formalized, and concepts about the operations became understood well enough to be stated formally, and even proven. See, for example, Euclid's algorithm for finding the greatest common divisor of two numbers.
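As a small modern illustration (not part of the original text), Euclid's algorithm can be written as a short Python sketch; the function name and the sample numbers are this example's own choices:

from typing import Tuple  # only the standard library is used

def gcd(a: int, b: int) -> int:
    # Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b);
    # when the remainder reaches zero, the other value is the greatest common divisor.
    while b:
        a, b = b, a % b
    return a

print(gcd(252, 105))  # prints 21, since 252 = 2*2*3*3*7 and 105 = 3*5*7

The same idea, dividing and keeping only remainders, is what makes the algorithm practical by hand as well as by machine.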
By the High Middle Ages, the positional Hindu-Arabic numeral system had reached Europe, which allowed for systematic computation of numbers. During this period, the representation of a calculation on paper actually allowed calculation of mathematical expressions, and the tabulation of mathematical functions such as the square root and the common logarithm (for use in multiplication and division) and the trigonometric functions. By the time of Isaac Newton's research, paper or vellum was an important computing resource, and even in our present time, researchers like Enrico Fermi would cover random scraps of paper with calculation, to satisfy their curiosity about an equation.[2] Even into the period of programmable calculators, Richard Feynman would unhesitatingly compute any steps which overflowed the memory of the calculators, by hand, just to learn the answer.

2.3 Early computation

Main article: Timeline of computing hardware 2400 BC-1949

The earliest known tool for use in computation was the abacus, and it was thought to have been invented in Babylon circa 2400 BC. Its original style of usage was by lines drawn in sand with pebbles. Abaci, of a more modern design, are still used as calculation tools today. This was the first known computer and most advanced system of calculation known to date, preceding Greek methods by 2,000 years.

In 1110 BC, the south-pointing chariot was invented in ancient China. It was the first known geared mechanism to use a differential gear, which was later used in analog computers. The Chinese also invented a more sophisticated abacus from around the 2nd century BC, known as the Chinese abacus.

In the 5th century BC in ancient India, the grammarian Panini formulated the grammar of Sanskrit in 3959 rules known as the Ashtadhyayi, which was highly systematized and technical. Panini used metarules, transformations and recursions.[3]

In the 3rd century BC, Archimedes used the mechanical principle of balance (see Archimedes Palimpsest#Mathematical content) to calculate mathematical problems, such as the number of grains of sand in the universe (The Sand Reckoner), which also required a recursive notation for numbers, the myriad myriad.

The Antikythera mechanism is believed to be the earliest known mechanical analog computer.[4] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC.

Mechanical analog computer devices appeared again a thousand years later in the medieval Islamic world and were developed by Muslim astronomers, such as the mechanical geared astrolabe by Abū Rayhān al-Bīrūnī,[5] and the torquetum by Jabir ibn Aflah.[6] According to Simon Singh, Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.[7][8] Programmable machines were also invented by Muslim engineers, such as the automatic flute player by the Banū Mūsā brothers,[9] and Al-Jazari's humanoid robots and castle clock, which is considered to be the first programmable analog computer.[10]

During the Middle Ages, several European philosophers made attempts to produce analog computer devices. Influenced by the Arabs and Scholasticism, Majorcan philosopher Ramon Llull (1232-1315) devoted a great part of his life to defining and designing several logical machines that, by combining simple and undeniable philosophical truths, could produce all possible knowledge. These machines were never actually built, as they were more of a thought experiment to produce new knowledge in systematic ways; although they could make simple logical operations, they still needed a human being for the interpretation of results. Moreover, they lacked a versatile architecture, each machine serving only very concrete purposes. In spite of this, Llull's work had a strong influence on Gottfried Leibniz (early 18th century), who developed his ideas further, and built several calculating tools using them.

Indeed, when John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools. The apex of this early era of formal computing can be seen in the difference engine and its successor the analytical engine (which was never completely constructed but was designed in detail), both by Charles Babbage. The analytical engine combined concepts from his work and that of others to create a device that, if constructed as designed, would have possessed many properties of a modern electronic computer. These properties include such features as an internal scratch memory equivalent to RAM, multiple forms of output including a bell, a graph-plotter, and simple printer, and a programmable input-output hard memory of punch cards which it could modify as well as read. The key advancement which Babbage's devices possessed beyond those created before his was that each component of the device was independent of the rest of the machine, much like the components of a modern electronic computer. This was a fundamental shift in thought; previous computational devices served only a single purpose, but had to be at best disassembled and reconfigured to solve a new problem. Babbage's devices could be reprogrammed to solve new problems by the entry of new data, and act upon previous calculations within the same series of instructions. Ada Lovelace took this concept one step further, by creating a program for the analytical engine to calculate Bernoulli numbers, a complex calculation requiring a recursive algorithm. This is considered to be the first example of a true computer program, a series of instructions that act upon data not known in full until the program is run.
Simon Singh, Muslim mathematicians also made impor- until the program is run.
tant advances in cryptography, such as the development Several examples of analog computation survived into reof cryptanalysis and frequency analysis by Alkindus.[7][8] cent times. A planimeter is a device which does integrals,
Programmable machines were also invented by Muslim using distance as the analog quantity. Until the 1980s,
engineers, such as the automatic ute player by the HVAC systems used air both as the analog quantity and
Ban Ms brothers,[9] and Al-Jazari's humanoid robots the controlling element. Unlike modern digital computand castle clock, which is considered to be the rst ers, analog computers are not very exible, and need to
programmable analog computer.[10]
be recongured (i.e., reprogrammed) manually to switch
During the Middle Ages, several European philoso- them from working on one problem to another. Analog
phers made attempts to produce analog computer de- computers had an advantage over early digital computvices. Inuenced by the Arabs and Scholasticism, Ma- ers in that they could be used to solve complex problems
jorcan philosopher Ramon Llull (12321315) devoted a using behavioral analogues while the earliest attempts at
great part of his life to dening and designing several digital computers were quite limited.
logical machines that, by combining simple and undeniable philosophical truths, could produce all possible
knowledge. These machines were never actually built,
as they were more of a thought experiment to produce
new knowledge in systematic ways; although they could
make simple logical operations, they still needed a human being for the interpretation of results. Moreover,
they lacked a versatile architecture, each machine serving only very concrete purposes. In spite of this, Llulls
work had a strong inuence on Gottfried Leibniz (early
18th century), who developed his ideas further, and built
several calculating tools using them.
Indeed, when John Napier discovered logarithms for
computational purposes in the early 17th century, there
followed a period of considerable progress by inventors
and scientists in making calculating tools. The apex of
this early era of formal computing can be seen in the
dierence engine and its successor the analytical engine
(which was never completely constructed but was designed in detail), both by Charles Babbage. The analytical engine combined concepts from his work and that of
others to create a device that if constructed as designed
would have possessed many properties of a modern elec- A Smith Chart is a well-known nomogram.
tronic computer. These properties include such features
Since computers were rare in this era, the solu-

2.8. REFERENCES
tions were often hard-coded into paper forms such as
nomograms,[11] which could then produce analog solutions to these problems, such as the distribution of pressures and temperatures in a heating system.
None of the early computational devices were really computers in the modern sense, and it took considerable advancement in mathematics and theory before the rst
modern computers could be designed.
The Z3 computer from 1941, by German inventor
Konrad Zuse was the rst working programmable, fully
automatic computing machine.
The ENIAC (Electronic Numerical Integrator And Computer) was the rst electronic general-purpose computer,
announced to the public in 1946. It was Turing-complete,
digital, and capable of being reprogrammed to solve a full
range of computing problems.

2.4 Navigation and astronomy

Starting with known special cases, the calculation of logarithms and trigonometric functions can be performed by looking up numbers in a mathematical table, and interpolating between known cases. For small enough differences, this linear operation was accurate enough for use in navigation and astronomy in the Age of Exploration. The uses of interpolation have thrived in the past 500 years: by the twentieth century Leslie Comrie and W.J. Eckert systematized the use of interpolation in tables of numbers for punch card calculation.

2.5 Weather prediction

The numerical solution of differential equations, notably the Navier-Stokes equations, was an important stimulus to computing, with Lewis Fry Richardson's numerical approach to solving differential equations. To this day, some of the most powerful computer systems on Earth are used for weather forecasts.

2.6 Symbolic computations

By the late 1960s, computer systems could perform symbolic algebraic manipulations well enough to pass college-level calculus courses.

2.7 See also

Algorithm
Charles Babbage Institute - research center for history of computing at University of Minnesota
Computing timelines category
History of software
Index of history of computing articles
IT History Society
List of mathematicians
Timeline of quantum computing

2.8 References

[1] Konrad Lorenz, King Solomon's Ring

[2] DIY: Enrico Fermi's Back of the Envelope Calculations.

[3] Sinha, A. C. (1978). "On the status of recursive rules in transformational grammar". Lingua 44 (2-3): 169. doi:10.1016/0024-3841(78)90076-1.

[4] The Antikythera Mechanism Research Project, The Antikythera Mechanism Research Project. Retrieved 2007-07-01.

[5] "Islam, Knowledge, and Science". University of Southern California. Retrieved 2008-01-22.

[6] Lorch, R. P. (1976), "The Astronomical Instruments of Jabir ibn Aflah and the Torquetum", Centaurus 20 (1): 11-34, Bibcode:1976Cent...20...11L, doi:10.1111/j.1600-0498.1976.tb00214.x

[7] Simon Singh, The Code Book, pp. 14-20

[8] "Al-Kindi, Cryptography, Codebreaking and Ciphers". Retrieved 2007-01-12.

[9] Koetsier, Teun (2001), "On the prehistory of programmable machines: musical automata, looms, calculators", Mechanism and Machine Theory (Elsevier) 36 (5): 589-603, doi:10.1016/S0094-114X(01)00005-2.

[10] Ancient Discoveries, Episode 11: Ancient Robots, History Channel, retrieved 2008-09-06.

[11] Steinhaus, H. (1999). Mathematical Snapshots (3rd ed.). New York: Dover. pp. 92-95, p. 301.

2.9 External links


The History of Computing by J.A.N. Lee
Things that Count: the rise and fall of calculators
The History of Computing Project
SIG on Computers, Information and Society of the
Society for the History of Technology
The Modern History of Computing
Cringelys Triumph of the Nerds

20
Top 25 Days in Computing History
A Chronology of Digital Computing Machines (to
1952) by Mark Brader
Bitsavers, an eort to capture, salvage, and archive
historical computer software and manuals from
minicomputers and mainframes of the 50s, 60s, 70s,
and 80s
Cyberhistory (2002) by Keith Falloon. UWA digital
thesis repository.
Arithmometre.org, The reference about Thomas de
Colmars arithmometers
Yahoo Computers and History
All-Magnetic Logic Computer. Timeline of Innovations. SRI International. Developed at SRI International in 1961
Famous Names in the History of Computing. Free
source for history of computing biographies.
Stephen Whites excellent computer history site (the
above article is a modied version of his work, used
with Permission)
Soviet Calculators Collection - a big collection of
Soviet calculators, computers, computer mices and
other devices
Logarithmic timeline of greatest breakthroughs
since start of computing era in 1623 by Jrgen
Schmidhuber, from The New AI: General & Sound
& Relevant for Physics, In B. Goertzel and C. Pennachin, eds.: Articial General Intelligence, p. 175198, 2006.
IEEE computer history timeline
Konrad Zuse, inventor of rst working programmable digital computer by Jrgen Schmidhuber
The Moore School Lectures and the British Lead
in Stored Program Computer Development (1946
1953), article from Virtual Travelog
Technology
and-Society/STS035Spring2004/CourseHome/index.htm
MIT
STS.035 History of Computing from MIT
OpenCourseWare for undergraduate level
Key Resources in the History of Computing
Italian computer database of brands
Computer History - a collection of articles by Bob
Bemer

CHAPTER 2. HISTORY OF COMPUTING


YouTube video comparing 1980s home computers
to 2010s technology
A visual timeline of the development of computers
since COLOSSUS' inception in 1943
History of Computing Visualisation

2.9.1 British history links


Resurrection Bulletin of the Computer Conservation
Society (UK) 1990-2006
The story of the Manchester Mark I, 50th Anniversary website at the University of Manchester
Richmond Arabian History of Computing Group
Linking the Gulf and Europe

Chapter 3

History of computing hardware

Computing hardware is a platform for information processing (input, processor, storage, output).

The history of computing hardware covers the developments from early simple devices to aid calculation to modern day computers.

Before the 20th century, most calculations were done by humans. Early mechanical tools to help humans with digital calculations were called "calculating machines", by proprietary names, or even as they are now, calculators. The machine operator was called the computer.

The first aids to computation were purely mechanical devices which required the operator to set up the initial values of an elementary arithmetic operation, then manipulate the device to obtain the result. Later, computers represented numbers in a continuous form, for instance distance along a scale, rotation of a shaft, or a voltage. Numbers could also be represented in the form of digits, automatically manipulated by a mechanical mechanism. Although this approach generally required more complex mechanisms, it greatly increased the precision of results. The invention of the transistor and then integrated circuits made a breakthrough in computers. As a result, digital computers largely replaced analog computers. The price of computers gradually became so low that first personal computers and later mobile computers (smartphones and tablets) became ubiquitous.

3.1 Early devices

3.1.1 Ancient era

The Ishango bone

Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was probably a form of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented counts of items, probably livestock or grains, sealed in hollow unbaked clay containers.[1][2] The use of counting rods is one example.

Suanpan (the number represented on this abacus is 6,302,715,408)

The abacus was early used for arithmetic tasks. What we now call the Roman abacus was used in Babylonia as early as 2400 BC. Since then, many other forms of reckoning boards or tables have been invented. In a medieval European counting house, a checkered cloth would be placed on a table, and markers moved around on it according to certain rules, as an aid to calculating sums of money.

Several analog computers were constructed in ancient and medieval times to perform astronomical calculations. These include the Antikythera mechanism and the astrolabe from ancient Greece (c. 150-100 BC), which are generally regarded as the earliest known mechanical analog computers.[3] Hero of Alexandria (c. 10-70 AD) made many complex mechanical devices including automata and a programmable cart.[4] Other early versions of mechanical devices used to perform one or another type of calculations include the planisphere and other mechanical computing devices invented by Abu Rayhan al-Biruni (c. AD 1000); the equatorium and universal latitude-independent astrolabe by Abu Ishaq Ibrahim al-Zarqali (c. AD 1015); the astronomical analog computers of other medieval Muslim astronomers and engineers; and the astronomical clock tower of Su Song (c. AD 1090) during the Song Dynasty.

3.1.2 Medieval calculating tools

A set of John Napier's calculating tables from around 1680.

Scottish mathematician and physicist John Napier discovered that the multiplication and division of numbers could be performed by the addition and subtraction, respectively, of the logarithms of those numbers. While producing the first logarithmic tables, Napier needed to perform many tedious multiplications. It was at this point that he designed his "Napier's bones", an abacus-like device that greatly simplified calculations that involved multiplication and division.[5]

A slide rule

Since real numbers can be represented as distances or intervals on a line, the slide rule was invented in the 1620s, shortly after Napier's work, to allow multiplication and division operations to be carried out significantly faster than was previously possible.[6] Edmund Gunter built a calculating device with a single logarithmic scale at the University of Oxford. His device greatly simplified arithmetic calculations, including multiplication and division. William Oughtred greatly improved this in 1630 with his circular slide rule. He followed this up with the modern slide rule in 1632, essentially a combination of two Gunter rules, held together with the hands. Slide rules were used by generations of engineers and other mathematically involved professional workers, until the invention of the pocket calculator.[7]

3.1.3 Mechanical calculators

Wilhelm Schickard, a German polymath, designed a calculating machine in 1623 which combined a mechanised form of Napier's rods with the world's first mechanical adding machine built into the base. Because it made use of a single-tooth gear there were circumstances in which its carry mechanism would jam.[8] A fire destroyed at least one of the machines in 1624 and it is believed Schickard was too disheartened to build another.

View through the back of Pascal's calculator. Pascal invented his machine in 1642.

In 1642, while still a teenager, Blaise Pascal started some pioneering work on calculating machines and after three years of effort and 50 prototypes[9] he invented a mechanical calculator.[10][11] He built twenty of these machines (called Pascal's calculator or Pascaline) in the following ten years.[12] Nine Pascalines have survived, most of which are on display in European museums.[13] A continuing debate exists over whether Schickard or Pascal should be regarded as the inventor of the mechanical calculator and the range of issues to be considered is discussed elsewhere.[14]


Gottfried Wilhelm von Leibniz invented the Stepped
Reckoner and his famous stepped drum mechanism
around 1672. He attempted to create a machine that
could be used not only for addition and subtraction but
would utilise a moveable carriage to enable long multiplication and division. Leibniz once said It is unworthy
of excellent men to lose hours like slaves in the labour
of calculation which could safely be relegated to anyone
else if machines were used.[15] However, Leibniz did not
incorporate a fully successful carry mechanism. Leibniz
also described the binary numeral system,[16] a central ingredient of all modern computers. However, up to the
1940s, many subsequent designs (including Charles Babbage's machines of the 1822 and even ENIAC of 1945)
were based on the decimal system.[17]
Around 1820, Charles Xavier Thomas de Colmar created what would over the rest of the century become
the rst successful, mass-produced mechanical calculator, the Thomas Arithmometer. It could be used to add
and subtract, and with a moveable carriage the operator
IBM punched card Accounting Machines, pictured in 1936.
could also multiply, and divide by a process of long multi[18]
plication and long division. It utilised a stepped drum
similar in conception to that invented by Leibniz. Me- workers.[22] Punch cards became ubiquitous in industry
chanical calculators remained in use until the 1970s.
and government for accounting and administration.

3.1.4

Punched card data processing

In 1801, Joseph-Marie Jacquard developed a loom in


which the pattern being woven was controlled by punched
cards. The series of cards could be changed without
changing the mechanical design of the loom. This was a
landmark achievement in programmability. His machine
was an improvement over similar weaving looms. Punch
cards were preceded by punch bands, as in the machine
proposed by Basile Bouchon. These bands would inspire
information recording for automatic pianos and more recently numerical control machine tools.
In the late 1880s, the American Herman Hollerith invented data storage on punched cards that could then be
read by a machine.[19] To process these punched cards
he invented the tabulator, and the key punch machine.
His machines used mechanical relays (and solenoids) to
increment mechanical counters. Holleriths method was
used in the 1890 United States Census and the completed
results were "... nished months ahead of schedule and
far under budget.[20] Indeed, the census was processed
years faster than the prior census had been. Holleriths
company eventually became the core of IBM.

Leslie Comrie's articles on punched card methods and


W.J. Eckert's publication of Punched Card Methods in
Scientic Computation in 1940, described punch card
techniques suciently advanced to solve some dierential equations[23] or perform multiplication and division
using oating point representations, all on punched cards
and unit record machines. Such machines were used during World War II for cryptographic statistical processing,
as well as a vast number of administrative uses. The Astronomical Computing Bureau, Columbia University performed astronomical calculations representing the state
of the art in computing.[24][25]

3.1.5 Calculators
Main article: Calculator
By the 20th century, earlier mechanical calculators,
cash registers, accounting machines, and so on were redesigned to use electric motors, with gear position as the
representation for the state of a variable. The word computer was a job title assigned to people who used these
calculators to perform mathematical calculations. By the
1920s, British scientist Lewis Fry Richardson's interest
in weather prediction led him to propose human computers and numerical analysis to model the weather; to this
day, the most powerful computers on Earth are needed
to adequately model its weather using the NavierStokes
equations.[26]

By 1920, electro-mechanical tabulating machines could


add, subtract and print accumulated totals.[21] Machines
were programmed by inserting dozens of wire jumpers
into removable control panels. When the United States
instituted Social Security in 1935, IBM punched card Companies like Friden, Marchant Calculator and Monroe
systems were used to process records of 26 million made desktop mechanical calculators from the 1930s that

24

The Curta calculator could also do multiplication and division.

CHAPTER 3. HISTORY OF COMPUTING HARDWARE

A portion of Babbages Dierence engine.

could add, subtract, multiply and divide.[27] In 1948, the


Curta was introduced by Austrian inventor, Curt Herzstark. It was a small, hand-cranked mechanical calculator
and as such, a descendant of Gottfried Leibniz's Stepped
Reckoner and Thomas's Arithmometer.

chine via punched cards, a method being used at the time


to direct mechanical looms such as the Jacquard loom.
For output, the machine would have a printer, a curve
plotter and a bell. The machine would also be able to
The worlds rst all-electronic desktop calculator was the punch numbers onto cards to be read in later. It employed
British Bell Punch ANITA, released in 1961.[28][29] It ordinary base-10 xed-point arithmetic.
used vacuum tubes, cold-cathode tubes and Dekatrons in The Engine incorporated an arithmetic logic unit, control
its circuits, with 12 cold-cathode Nixie tubes for its dis- ow in the form of conditional branching and loops,
play. The ANITA sold well since it was the only elec- and integrated memory, making it the rst design for
tronic desktop calculator available, and was silent and a general-purpose computer that could be described in
quick. The tube technology was superseded in June 1963 modern terms as Turing-complete.[31][32]
by the U.S. manufactured Friden EC-130, which had an
all-transistor design, a stack of four 13-digit numbers dis- There was to be a store, or memory, capable of holdplayed on a 5-inch (13 cm) CRT, and introduced reverse ing 1,000 numbers of 40 decimal digits each (ca. 16.7
kB). An arithmetical unit, called the mill, would be
Polish notation (RPN).
able to perform all four arithmetic operations, plus comparisons and optionally square roots. Initially it was conceived as a dierence engine curved back upon itself, in
3.2 First general-purpose comput- a generally circular layout,[33] with the long store exiting o to one side. (Later drawings depict a regularized
ing device
grid layout.)[34] Like the central processing unit (CPU)
in a modern computer, the mill would rely upon its own
Main article: Analytical Engine
Charles Babbage, an English mechanical engineer and internal procedures, roughly equivalent to microcode in
polymath, originated the concept of a programmable modern CPUs, to be stored in the form of pegs inserted
computer. Considered the "father of the computer",[30] into rotating drums called barrels, to carry out some of
he conceptualized and invented the rst mechanical com- the more[35]complex instructions the users program might
puter in the early 19th century. After working on his rev- specify.
olutionary dierence engine, designed to aid in navigational calculations, in 1833 he realized that a much more
general design, an Analytical Engine, was possible. The
input of programs and data was to be provided to the ma-

The programming language to be employed by users was


akin to modern day assembly languages. Loops and conditional branching were possible, and so the language as
conceived would have been Turing-complete as later de-

3.3. ANALOG COMPUTERS

Reconstruction of Babbages Analytical Engine, the rst generalpurpose programmable computer.

25

Sir William Thomson's third tide-predicting machine design,


1879-81

ned by Alan Turing. Three dierent types of punch


cards were used: one for arithmetical operations, one for
numerical constants, and one for load and store operations, transferring numbers from the store to the arithmetical unit or back. There were three separate readers
for the three types of cards.

of physical phenomena such as electrical, mechanical, or


hydraulic quantities to model the problem being solved,
in contrast to digital computers that represented varying
quantities symbolically, as their numerical values change.
As an analog computer does not use discrete values, but
rather continuous values, processes cannot be reliably reThe machine was about a century ahead of its time. How- peated with exact equivalence, as they can with Turing
ever, the project was slowed by various problems includ- machines.[37]
ing disputes with the chief machinist building parts for it.
All the parts for his machine had to be made by hand The rst modern analog computer was a tide-predicting
- this was a major problem for a machine with thou- machine, invented by Sir William Thomson, later Lord
sands of parts. Eventually, the project was dissolved with Kelvin, in 1872. It used a system of pulleys and wires
the decision of the British Government to cease fund- to automatically calculate predicted tide levels for a set
ing. Babbages failure to complete the analytical engine period at a particular location and was of great utility to
can be chiey attributed to diculties not only of pol- navigation in shallow waters. His device was the founda[38]
itics and nancing, but also to his desire to develop an tion for further developments in analog computing.
increasingly sophisticated computer and to move ahead The dierential analyser, a mechanical analog computer
faster than anyone else could follow. Ada Lovelace, designed to solve dierential equations by integration usLord Byron's daughter, translated and added notes to ing wheel-and-disc mechanisms, was conceptualized in
the "Sketch of the Analytical Engine" by Federico Luigi, 1876 by James Thomson, the brother of the more famous
Conte Menabrea. This appears to be the rst published Lord Kelvin. He explored the possible construction of
description of programming.[36]
such calculators, but was stymied by the limited output
[39]
In a dierenFollowing Babbage, although unaware of his earlier work, torque of the ball-and-disk integrators.
tial
analyzer,
the
output
of
one
integrator
drove
the input
was Percy Ludgate, an accountant from Dublin, Ireland.
of
the
next
integrator,
or
a
graphing
output.
He independently designed a programmable mechanical
computer, which he described in a work that was pub- An important advance in analog computing was the delished in 1909.
velopment of the rst re-control systems for long range
ship gunlaying. When gunnery ranges increased dramatically in the late 19th century it was no longer a simple matter of calculating the proper aim point, given the
3.3 Analog computers
ight times of the shells. Various spotters on board the
ship would relay distance measures and observations to
Main article: Analog computer
a central plotting station. There the re direction teams
In the rst half of the 20th century, analog computers fed in the location, speed and direction of the ship and
were considered by many to be the future of computing. its target, as well as various adjustments for Coriolis efThese devices used the continuously changeable aspects fect, weather eects on the air, and other adjustments;

26

CHAPTER 3. HISTORY OF COMPUTING HARDWARE

3.4 Advent of the digital computer


The principle of the modern computer was rst described
by computer scientist Alan Turing, who set out the idea
in his seminal 1936 paper,[43] On Computable Numbers.
Turing reformulated Kurt Gdel's 1931 results on the
limits of proof and computation, replacing Gdels universal arithmetic-based formal language with the formal and simple hypothetical devices that became known
as Turing machines. He proved that some such machine would be capable of performing any conceivable
mathematical computation if it were representable as an
algorithm. He went on to prove that there was no solution to the Entscheidungsproblem by rst showing that
the halting problem for Turing machines is undecidable:
in general, it is not possible to decide algorithmically
whether a given Turing machine will ever halt.
A Mk. I Drift Sight. The lever just in front of the bomb aimers
ngertips sets the altitude, the wheels near his knuckles set the
wind and airspeed.

the computer would then output a ring solution, which


would be fed to the turrets for laying. In 1912, British engineer Arthur Pollen developed the rst electrically powered mechanical analogue computer (called at the time
the Argo Clock).[40] It was used by the Imperial Russian
Navy in World War I. The alternative Dreyer Table re
control system was tted to British capital ships by mid1916.
Mechanical devices were also used to aid the accuracy
of aerial bombing. Drift Sight was the rst such aid, developed by Harry Wimperis in 1916 for the Royal Naval
Air Service; it measured the wind speed from the air, and
used that measurement to calculate the winds eects on
the trajectory of the bombs. The system was later improved with the Course Setting Bomb Sight, and reached
a climax with World War II bomb sights, Mark XIV
bomb sight (RAF Bomber Command) and the Norden[41]
(United States Army Air Forces).
The art of mechanical analog computing reached its
zenith with the dierential analyzer,[42] built by H. L.
Hazen and Vannevar Bush at MIT starting in 1927, which
built on the mechanical integrators of James Thomson
and the torque ampliers invented by H. W. Nieman.
A dozen of these devices were built before their obsolescence became obvious; the most powerful was constructed at the University of Pennsylvania's Moore School
of Electrical Engineering, where the ENIAC was built.
By the 1950s the success of digital electronic computers
had spelled the end for most analog computing machines,
but hybrid analog computers, controlled by digital electronics, remained in substantial use into the 1950s and
1960s, and later in some specialized applications.

He also introduced the notion of a 'Universal Machine'


(now known as a Universal Turing machine), with the
idea that such a machine could perform the tasks of any
other machine, or in other words, it is provably capable of computing anything that is computable by executing a program stored on tape, allowing the machine to
be programmable. Von Neumann acknowledged that the
central concept of the modern computer was due to this
paper.[44] Turing machines are to this day a central object of study in theory of computation. Except for the
limitations imposed by their finite memory stores, modern computers are said to be Turing-complete, which is to
say, they have algorithm execution capability equivalent
to a universal Turing machine.
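
To make the device Turing described a little more concrete, here is a minimal sketch of a single-tape Turing machine simulator in Python. The rule-table format, the state names and the example machine (which simply flips the bits of a binary string and halts) are invented here purely for illustration; they are not Turing's own notation or any historical machine.

    # Minimal single-tape Turing machine: a finite rule table reading and
    # writing symbols on an unbounded tape. The example rules below are a
    # hypothetical illustration, not a historical program.

    def run_turing_machine(rules, tape, state="scan", blank=" "):
        """rules: {(state, symbol): (new_symbol, move, new_state)}, move is +1 or -1."""
        cells = dict(enumerate(tape))
        head = 0
        while state != "halt":
            symbol = cells.get(head, blank)
            new_symbol, move, state = rules[(state, symbol)]
            cells[head] = new_symbol
            head += move
        return "".join(cells[i] for i in sorted(cells)).strip()

    flip_bits = {
        ("scan", "0"): ("1", +1, "scan"),
        ("scan", "1"): ("0", +1, "scan"),
        ("scan", " "): (" ", +1, "halt"),   # blank cell reached: stop
    }

    print(run_turing_machine(flip_bits, "100110"))  # prints 011001

A universal machine, in these terms, is just such a simulator whose rule table and input are themselves supplied as data on the tape.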

3.4.1 Electromechanical computers


The era of modern computing began with a urry of development before and during World War II. Most digital computers built in this period were electromechanical - electric switches drove mechanical relays to perform the calculation. These devices had a low operating
speed and were eventually superseded by much faster allelectric computers, originally using vacuum tubes.
The Z2 was one of the earliest examples of an electromechanical relay computer, and was created by German engineer Konrad Zuse in 1939. It was an improvement
on his earlier Z1; although it used the same mechanical
memory, it replaced the arithmetic and control logic with
electrical relay circuits.[45]
In the same year, the electro-mechanical bombes
were built by British cryptologists to help decipher
German Enigma-machine-encrypted secret messages
during World War II. The initial design of the bombe
was produced in 1939 at the UK Government Code and
Cypher School (GC&CS) at Bletchley Park by Alan Turing,[46] with an important refinement devised in 1940 by
Gordon Welchman.[47] The engineering design and construction was the work of Harold Keen of the British Tabulating Machine Company. It was a substantial development from a device that had been designed in 1938 by Polish Cipher Bureau cryptologist Marian Rejewski, and known as the "cryptologic bomb" (Polish: bomba kryptologiczna).

Replica of Zuse's Z3, the first fully automatic, digital (electromechanical) computer.

In 1941, Zuse followed his earlier machine up with the Z3,[48] the world's first working electromechanical programmable, fully automatic digital computer.[49] The Z3 was built with 2000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5-10 Hz.[50] Program code and data were stored on punched film. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating point numbers. Replacement of the hard-to-implement decimal system (used in Charles Babbage's earlier design) by the simpler binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time.[51] The Z3 was probably a complete Turing machine. In two 1936 patent applications, Zuse also anticipated that machine instructions could be stored in the same storage used for data - the key insight of what became known as the von Neumann architecture, first implemented in the British SSEM of 1948.[52]

Zuse suffered setbacks during World War II when some of his machines were destroyed in the course of Allied bombing campaigns. Apparently his work remained largely unknown to engineers in the UK and US until much later, although at least IBM was aware of it as it financed his post-war startup company in 1946 in return for an option on Zuse's patents.

In 1944, the Harvard Mark I was constructed at IBM's Endicott laboratories;[53] it was a similar general purpose electro-mechanical computer to the Z3 and was not quite Turing-complete.

3.4.2 Digital computation

The mathematical basis of digital computing was established by the British mathematician George Boole, in his work The Laws of Thought, published in 1854. His Boolean algebra was further refined in the 1860s by William Jevons and Charles Sanders Peirce, and was first presented systematically by Ernst Schröder and A. N. Whitehead.[54]

In the 1930s and working independently, American electronic engineer Claude Shannon and Soviet logician Victor Shestakov both showed a one-to-one correspondence between the concepts of Boolean logic and certain electrical circuits, now called logic gates, which are now ubiquitous in digital computers.[55] They showed[56] that electronic relays and switches can realize the expressions of Boolean algebra. This thesis essentially founded practical digital circuit design.

3.4.3 Electronic data processing

Atanasoff-Berry Computer replica at 1st floor of Durham Center, Iowa State University.

Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the same time that digital calculation replaced analog. Machines such as the Z3, the Atanasoff-Berry Computer, the Colossus computers, and the ENIAC were built by hand, using circuits containing relays or valves (vacuum tubes), and often used punched cards or punched paper tape for input and as the main (non-volatile) storage medium.

The engineer Tommy Flowers joined the telecommunications branch of the General Post Office in 1926. While working at the research station in Dollis Hill in the 1930s, he began to explore the possible use of electronics for the telephone exchange. Experimental equipment that he built in 1934 went into operation 5 years later, converting a portion of the telephone exchange network into an electronic data processing system, using thousands of vacuum tubes.[38]

In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State University developed and tested the Atanasoff-Berry Computer (ABC) in 1942,[57] the first electronic digital calculating device.[58] This design was also all-electronic, and used about 300 vacuum tubes, with capacitors fixed in a mechanically rotating drum for memory. However, its paper card writer/reader was unreliable, and work on the machine was discontinued. The machine's special-purpose nature and lack of a changeable, stored program distinguish it from modern computers.[59]

3.4.4 The electronic programmable computer

Main articles: Colossus computer and ENIAC

Colossus was the first electronic digital programmable computing device, and was used to break German ciphers during World War II.

During World War II, the British at Bletchley Park (40 miles north of London) achieved a number of successes at breaking encrypted German military communications. The German encryption machine, Enigma, was first attacked with the help of the electro-mechanical bombes.[60] They ruled out possible Enigma settings by performing chains of logical deductions implemented electrically. Most possibilities led to a contradiction, and the few remaining could be tested by hand.

The Germans also developed a series of teleprinter encryption systems, quite different from Enigma. The Lorenz SZ 40/42 machine was used for high-level Army communications, termed "Tunny" by the British. The first intercepts of Lorenz messages began in 1941. As part of an attack on Tunny, Max Newman and his colleagues helped specify the Colossus.[61]

Tommy Flowers, still a senior engineer at the Post Office Research Station,[62] was recommended to Max Newman by Alan Turing[63] and spent eleven months from early February 1943 designing and building the first Colossus.[64][65] After a functional test in December 1943, Colossus was shipped to Bletchley Park, where it was delivered on 18 January 1944[66] and attacked its first message on 5 February.[59]

Colossus rebuild seen from the rear.

Colossus was the world's first electronic digital programmable computer.[38] It used a large number of valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (the Mk I was converted to a Mk II making ten machines in total). Colossus Mark I contained 1500 thermionic valves (tubes), but Mark II, with 2400 valves, was both 5 times faster and simpler to operate than Mark 1, greatly speeding the decoding process. Mark 2 was designed while Mark 1 was being constructed. Allen Coombs took over leadership of the Colossus Mark 2 project when Tommy Flowers moved on to other projects.[67]

Colossus was able to process 5,000 characters per second with the paper tape moving at 40 ft/s (12.2 m/s; 27.3 mph). Sometimes, two or more Colossus computers tried different possibilities simultaneously in what now is called parallel computing, speeding the decoding process by perhaps as much as double the rate of comparison.

Colossus included the first ever use of shift registers and systolic arrays, enabling five simultaneous tests, each involving up to 100 Boolean calculations, on each of the five channels on the punched tape (although in normal operation only one or two channels were examined in any run). Initially Colossus was only used to determine the initial wheel positions used for a particular message (termed wheel setting). The Mark 2 included mechanisms intended to help determine pin patterns (wheel breaking). Both models were programmable using switches and plug panels in a way the Robinsons had not been.

Without the use of these machines, the Allies would have been deprived of the very valuable intelligence that was obtained from reading the vast quantity of encrypted high-level telegraphic messages between the German High Command (OKW) and their army commands throughout occupied Europe. Details of their existence, design, and use were kept secret well into the 1970s. Winston Churchill personally issued an order for their destruction into pieces no larger than a man's hand, to keep secret that the British were capable of cracking Lorenz SZ cyphers (from German rotor stream cipher machines) during the oncoming Cold War. Two of the machines were transferred to the newly formed GCHQ and the others were destroyed. As a result the machines were not included in many histories of computing.[69] A reconstructed working copy of one of the Colossus machines is now on display at Bletchley Park.

ENIAC was the first Turing-complete electronic device, and performed ballistics trajectory calculations for the United States Army.[68]

The US-built ENIAC (Electronic Numerical Integrator and Computer) was the first electronic programmable computer built in the US. Although the ENIAC was similar to the Colossus it was much faster and more flexible. It was unambiguously a Turing-complete device and could compute any problem that would fit into its memory. Like the Colossus, a program on the ENIAC was defined by the states of its patch cables and switches, a far cry from the stored program electronic machines that came later. Once a program was written, it had to be mechanically set into the machine with manual resetting of plugs and switches.

It combined the high speed of electronics with the ability to be programmed for many complex problems. It could add or subtract 5000 times a second, a thousand times faster than any other machine. It also had modules to multiply, divide, and square root. High speed memory was limited to 20 words (about 80 bytes). Built under the direction of John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30 tons, using 200 kilowatts of electric power and contained over 18,000 vacuum tubes, 1,500 relays, and hundreds of thousands of resistors, capacitors, and inductors.[70] One of its major engineering feats was to minimize the effects of tube burnout, which was a common problem in machine reliability at that time. The machine was in almost constant use for the next ten years.

3.5 The stored-program computer

Further information: List of vacuum tube computers

Early computing machines had fixed programs. For example, a desk calculator is a fixed program computer. It can do basic mathematics, but it cannot be used as a word processor or a gaming console. Changing the program of a fixed-program machine requires re-wiring, re-structuring, or re-designing the machine. The earliest computers were not so much programmed as they were designed. Reprogramming, when it was possible at all, was a laborious process, starting with flowcharts and paper notes, followed by detailed engineering designs, and then the often-arduous process of physically re-wiring and re-building the machine.[71]
With the proposal of the stored-program computer this
changed. A stored-program computer includes by design
an instruction set and can store in memory a set of instructions (a program) that details the computation.
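
As a rough illustration of the stored-program idea, the following Python sketch runs a fetch-decode-execute loop over a memory that holds both instructions and data. The toy instruction set (LOAD, ADD, STORE, HALT) and the memory layout are invented for illustration only; they do not correspond to any particular historical machine.

    # Toy stored-program machine: the program is just data in memory,
    # executed by a fetch-decode-execute loop. Illustrative only.

    def run(memory):
        acc = 0                         # a single accumulator register
        pc = 0                          # program counter: address of next instruction
        while True:
            op, operand = memory[pc]    # fetch and decode
            pc += 1
            if op == "LOAD":            # acc <- memory[operand]
                acc = memory[operand]
            elif op == "ADD":           # acc <- acc + memory[operand]
                acc += memory[operand]
            elif op == "STORE":         # memory[operand] <- acc
                memory[operand] = acc
            elif op == "HALT":
                return acc

    # Instructions and data share one memory.
    memory = {
        0: ("LOAD", 10),
        1: ("ADD", 11),
        2: ("STORE", 12),
        3: ("HALT", 0),
        10: 2, 11: 3, 12: 0,
    }
    print(run(memory))   # prints 5

Changing what such a machine computes means changing the contents of memory, not rewiring it, which is precisely the shift the stored-program proposal made.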

3.5.1 Theory

Design of the von Neumann architecture (1947): memory, a control unit, and an arithmetic logic unit with accumulator, connected to input and output.

The theoretical basis for the stored-program computer had been laid by Alan Turing in his 1936 paper. In 1945 Turing joined the National Physical Laboratory and began work on developing an electronic stored-program digital computer. His 1945 report "Proposed Electronic Calculator" was the first specification for such a device.

Meanwhile, John von Neumann at the Moore School of
Electrical Engineering, University of Pennsylvania, circulated his First Draft of a Report on the EDVAC in 1945.
Although substantially similar to Turing's design and containing comparatively little engineering detail, the computer architecture it outlined became known as the "von Neumann architecture". Turing presented a more detailed paper to the National Physical Laboratory (NPL) Executive Committee in 1946, giving the first reasonably complete design of a stored-program computer, a device he called the Automatic Computing Engine (ACE). However, the better-known EDVAC design of John von Neumann, who knew of Turing's theoretical work, received
more publicity, despite its incomplete nature and questionable lack of attribution of the sources of some of the
ideas.[38]
Turing felt that speed and size of memory were crucial
and he proposed a high-speed memory of what would today be called 25 KB, accessed at a speed of 1 MHz. The
ACE implemented subroutine calls, whereas the EDVAC
did not, and the ACE also used Abbreviated Computer Instructions, an early form of programming language.

3.5.2 Manchester baby

Main article: Manchester Small-Scale Experimental Machine

A section of the Manchester Small-Scale Experimental Machine, the first stored-program computer

The Manchester Small-Scale Experimental Machine, nicknamed Baby, was the world's first stored-program computer. It was built at the Victoria University of Manchester by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.[72]

The machine was not intended to be a practical computer but was instead designed as a testbed for the Williams tube, the first random-access digital storage device.[73] Invented by Freddie Williams and Tom Kilburn[74][75] at the University of Manchester in 1946 and 1947, it was a cathode ray tube that used an effect called secondary emission to temporarily store electronic binary data, and was used successfully in several early computers.

Although the computer was considered small and primitive by the standards of its time, it was the first working machine to contain all of the elements essential to a modern electronic computer.[76] As soon as the SSEM had demonstrated the feasibility of its design, a project was initiated at the university to develop it into a more usable computer, the Manchester Mark 1. The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially available general-purpose computer.[77]

The SSEM had a 32-bit word length and a memory of 32 words. As it was designed to be the simplest possible stored-program computer, the only arithmetic operations implemented in hardware were subtraction and negation; other arithmetic operations were implemented in software. The first of three programs written for the machine found the highest proper divisor of 2^18 (262,144), a calculation that was known would take a long time to run (and so prove the computer's reliability) by testing every integer from 2^18 - 1 downwards, as division was implemented by repeated subtraction of the divisor. The program consisted of 17 instructions and ran for 52 minutes before reaching the correct answer of 131,072, after the SSEM had performed 3.5 million operations (for an effective CPU speed of 1.1 kIPS).
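
The method of that first program can be re-created in a few lines of modern Python. This is only an illustrative sketch of the approach described above (trial divisors tested downwards from 2^18 - 1, with division performed by repeated subtraction), not the original 17-instruction SSEM code.

    # Re-creation of the SSEM's first program's method: find the highest
    # proper divisor of 2**18, dividing only by repeated subtraction.

    N = 2 ** 18  # 262,144

    def divides_by_repeated_subtraction(n, d):
        """Return True if d divides n, using only subtraction."""
        remainder = n
        while remainder >= d:
            remainder -= d
        return remainder == 0

    candidate = N - 1
    while not divides_by_repeated_subtraction(N, candidate):
        candidate -= 1

    print(candidate)  # 131072, the answer the SSEM reached after 52 minutes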

3.5.3 Manchester Mark 1

The Experimental machine led on to the development of the Manchester Mark 1 at the University of Manchester.[78] Work began in August 1948, and the first version was operational by April 1949; a program written to search for Mersenne primes ran error-free for nine hours on the night of 16/17 June 1949. The machine's successful operation was widely reported in the British press, which used the phrase "electronic brain" in describing it to their readers.

The computer is especially historically significant because of its pioneering inclusion of index registers, an innovation which made it easier for a program to read sequentially through an array of words in memory. Thirty-four patents resulted from the machine's development, and many of the ideas behind its design were incorporated in subsequent commercial products such as the IBM 701 and 702 as well as the Ferranti Mark 1. The chief designers, Frederic C. Williams and Tom Kilburn, concluded from their experiences with the Mark 1 that computers would be used more in scientific roles than in pure mathematics. In 1951 they started development work on Meg, the Mark 1's successor, which would include a floating point unit.
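
A small sketch may help show what index registers bought programmers. In the toy loop below (illustrative Python, not Mark 1 code), the effective address is formed as a base address plus an index register, so stepping through an array only requires changing the index; without such a register, early machines typically had to modify the address field of an instruction in memory on each pass.

    # Stepping through consecutive memory words using an index register.
    memory = {100: 4, 101: 8, 102: 15, 103: 16}   # an array of words at 100..103

    accumulator = 0
    index = 0                                # the index register
    while index < 4:
        accumulator += memory[100 + index]   # effective address = base + index
        index += 1                           # advance the index, not the code itself

    print(accumulator)   # prints 43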


3.5.4 EDSAC

The other contender for being the first recognizably modern digital stored-program computer[79] was the EDSAC,[80] designed and constructed by Maurice Wilkes and his team at the University of Cambridge Mathematical Laboratory in England in 1949. The machine was inspired by John von Neumann's seminal First Draft of a Report on the EDVAC and was one of the first usefully operational electronic digital stored-program computers.[81]

EDSAC ran its first programs on 6 May 1949, when it calculated a table of squares[82] and a list of prime numbers. The EDSAC also served as the basis for the first commercially applied computer, the LEO I, used by food manufacturing company J. Lyons & Co. Ltd. EDSAC 1 was finally shut down on 11 July 1958, having been superseded by EDSAC 2, which stayed in use until 1965.[83]

3.5.5 EDVAC

ENIAC inventors John Mauchly and J. Presper Eckert proposed the EDVAC's construction in August 1944, and design work for the EDVAC commenced at the University of Pennsylvania's Moore School of Electrical Engineering, before the ENIAC was fully operational. The design would implement a number of important architectural and logical improvements conceived during the ENIAC's construction and would incorporate a high speed serial access memory.[84] However, Eckert and Mauchly left the project and its construction floundered. It was finally delivered to the U.S. Army's Ballistics Research Laboratory at the Aberdeen Proving Ground in August 1949, but due to a number of problems, the computer only began operation in 1951, and then only on a limited basis.

3.5.6 Commercial computers

The first commercial computer was the Ferranti Mark 1, built by Ferranti and delivered to the University of Manchester in February 1951. It was based on the Manchester Mark 1. The main improvements over the Manchester Mark 1 were in the size of the primary storage (using random access Williams tubes), secondary storage (using a magnetic drum), a faster multiplier, and additional instructions. The basic cycle time was 1.2 milliseconds, and a multiplication could be completed in about 2.16 milliseconds. The multiplier used almost a quarter of the machine's 4,050 vacuum tubes (valves).[85] A second machine was purchased by the University of Toronto, before the design was revised into the Mark 1 Star. At least seven of these later machines were delivered between 1953 and 1957, one of them to Shell labs in Amsterdam.[86]

In October 1947, the directors of J. Lyons & Company, a British catering company famous for its teashops but with strong interests in new office management techniques, decided to take an active role in promoting the commercial development of computers. The LEO I computer became operational in April 1951[87] and ran the world's first regular routine office computer job. On 17 November 1951, the J. Lyons company began weekly operation of a bakery valuations job on the LEO (Lyons Electronic Office). This was the first business application to go live on a stored program computer.[88]

In June 1951, the UNIVAC I (Universal Automatic Computer) was delivered to the U.S. Census Bureau. Remington Rand eventually sold 46 machines at more than US$1 million each ($9.09 million as of 2015).[89] UNIVAC was the first mass produced computer. It used 5,200 vacuum tubes and consumed 125 kW of power. Its primary storage was serial-access mercury delay lines capable of storing 1,000 words of 11 decimal digits plus sign (72-bit words).

Front panel of the IBM 650.

IBM introduced a smaller, more affordable computer in 1954 that proved very popular.[90] The IBM 650 weighed over 900 kg, the attached power supply weighed around 1350 kg and both were held in separate cabinets of roughly 1.5 meters by 0.9 meters by 1.8 meters. It cost US$500,000[91] ($4.39 million as of 2015) or could be leased for US$3,500 a month ($30 thousand as of 2015).[89] Its drum memory was originally 2,000 ten-digit words, later expanded to 4,000 words. Memory limitations such as this were to dominate programming for decades afterward. The program instructions were fetched from the spinning drum as the code ran. Efficient execution using drum memory was provided by a combination of hardware architecture (the instruction format included the address of the next instruction) and software: the Symbolic Optimal Assembly Program, SOAP,[92] assigned instructions to the optimal addresses (to the extent possible by static analysis of the source program). Thus many instructions were, when needed, located in the next row of the drum to be read and additional wait time for drum rotation was not required.
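
The following sketch illustrates the kind of placement arithmetic involved. The drum size and execution time below are made-up illustrative numbers, not IBM 650 specifications, and the code is not SOAP itself; it only shows why putting the next instruction "just ahead" of the read head avoids waiting for a full revolution.

    # Toy model of rotational latency on a drum memory.
    N = 50   # word positions around one drum band (illustrative)
    E = 3    # word-times needed to execute an instruction after reading it (illustrative)

    def wait_time(current_addr, next_addr):
        """Word-times spent waiting for next_addr to pass under the head
        after the instruction read at current_addr finishes executing."""
        ready = (current_addr + 1 + E) % N      # head position when execution ends
        return (next_addr - ready) % N          # extra rotation needed; 0 is optimal

    optimal = (10 + 1 + E) % N
    print(wait_time(10, optimal))   # 0: no rotational delay
    print(wait_time(10, 10))        # 46: nearly a full revolution wasted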

3.5.7 Microprogramming

In 1951, British scientist Maurice Wilkes developed the concept of microprogramming from the realisation that the Central Processing Unit of a computer could be controlled by a miniature, highly specialised computer program in high-speed ROM. Microprogramming allows the base instruction set to be defined or extended by built-in programs (now called firmware or microcode).[93] This concept greatly simplified CPU development. He first described this at the University of Manchester Computer Inaugural Conference in 1951, then published in expanded form in IEEE Spectrum in 1955.

It was widely used in the CPUs and floating-point units of mainframe and other computers; it was implemented for the first time in EDSAC 2,[94] which also used multiple identical "bit slices" to simplify design. Interchangeable, replaceable tube assemblies were used for each bit of the processor.[95]

3.5.8 Magnetic storage

Magnetic core memory. Each core is one bit.

By 1954, magnetic core memory was rapidly displacing most other forms of temporary storage, including the Williams tube. It went on to dominate the field through the mid-1970s.[96]

A key feature of the American UNIVAC I system of 1951 was the implementation of a newly invented type of metal magnetic tape, and a high-speed tape unit, for non-volatile storage. Magnetic tape is still used in many computers.[97] In 1952, IBM publicly announced the IBM 701 Electronic Data Processing Machine, the first in its successful 700/7000 series and its first IBM mainframe computer. The IBM 704, introduced in 1954, used magnetic core memory, which became the standard for large machines.

IBM introduced the first disk storage unit, the IBM 350 RAMAC (Random Access Method of Accounting and Control) in 1956. Using fifty 24-inch (610 mm) metal disks, with 100 tracks per side, it was able to store 5 megabytes of data at a cost of US$10,000 per megabyte ($90 thousand as of 2015).[89][98]

3.6 Early computer characteristics

3.7 Transistor computers

Main article: Transistor computer
Further information: List of transistorized computers

A bipolar junction transistor

The bipolar transistor was invented in 1947. From 1955 onwards transistors replaced vacuum tubes in computer designs,[99] giving rise to the "second generation" of computers. Initially the only devices available were germanium point-contact transistors.[100]

Compared to vacuum tubes, transistors have many advantages: they are smaller, and require less power than vacuum tubes, so give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had longer, indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. Transistors greatly reduced computers' size, initial cost, and operating cost. Typically, second-generation computers were composed of large numbers of printed circuit boards such as the IBM Standard Modular System,[101] each carrying one to four logic gates or flip-flops.

At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine using the newly developed transistors instead of valves. Initially the only devices available were germanium point-contact transistors, less reliable than the valves they replaced but which consumed far less power.[102] Their first transistorised computer, and the first in the world, was operational by 1953,[103] and a second version was completed there in April 1955.[104] The 1955 version used 200 transistors, 1,300 solid-state diodes, and had a power consumption of 150 watts. However, the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read and write on its magnetic drum memory, so it was not the first completely transistorized computer.

That distinction goes to the Harwell CADET of 1955,[105] built by the electronics division of the Atomic Energy Research Establishment at Harwell. The design featured a 64-kilobyte magnetic drum memory store with multiple moving heads that had been designed at the National Physical Laboratory, UK. By 1953 his team had transistor circuits operating to read and write on a smaller magnetic drum from the Royal Radar Establishment. The machine used a low clock speed of only 58 kHz to avoid having to use any valves to generate the clock waveforms.[106][107]

CADET used 324 point-contact transistors provided by the UK company Standard Telephones and Cables; 76 junction transistors were used for the first stage amplifiers for data read from the drum, since point-contact transistors were too noisy. From August 1956 CADET was offering a regular computing service, during which it often executed continuous computing runs of 80 hours or more.[108][109] Problems with the reliability of early batches of point contact and alloyed junction transistors meant that the machine's mean time between failures was about 90 minutes, but this improved once the more reliable bipolar junction transistors became available.[110]

The Transistor Computer's design was adopted by the local engineering firm of Metropolitan-Vickers in their Metrovick 950, the first commercial transistor computer anywhere.[111] Six Metrovick 950s were built, the first completed in 1956. They were successfully deployed within various departments of the company and were in use for about five years.[104]

A second generation computer, the IBM 1401, captured about one third of the world market. IBM installed more than ten thousand 1401s between 1960 and 1964.

3.7.1 Transistorized peripherals

Transistorized electronics improved not only the CPU (Central Processing Unit), but also the peripheral devices. The second generation disk data storage units were able to store tens of millions of letters and digits. Next to the fixed disk storage units, connected to the CPU via high-speed data transmission, were removable disk data storage units. A removable disk pack can be easily exchanged with another pack in a few seconds. Even if the removable disks' capacity is smaller than that of fixed disks, their interchangeability guarantees a nearly unlimited quantity of data close at hand. Magnetic tape provided archival capability for this data, at a lower cost than disk.

Many second-generation CPUs delegated peripheral device communications to a secondary processor. For example, while the communication processor controlled card reading and punching, the main CPU executed calculations and binary branch instructions. One databus would bear data between the main CPU and core memory at the CPU's fetch-execute cycle rate, and other databusses would typically serve the peripheral devices. On the PDP-1, the core memory's cycle time was 5 microseconds; consequently most arithmetic instructions took 10 microseconds (100,000 operations per second) because most operations took at least two memory cycles: one for the instruction, one for the operand data fetch.

During the second generation, remote terminal units (often in the form of Teleprinters like a Friden Flexowriter) saw greatly increased use.[112] Telephone connections provided sufficient speed for early remote terminals and allowed hundreds of kilometers separation between remote terminals and the computing center. Eventually these stand-alone computer networks would be generalized into an interconnected network of networks: the Internet.[113]

3.7.2 Supercomputers

The University of Manchester Atlas in January 1963

The early 1960s saw the advent of supercomputing. The Atlas Computer was a joint development between the University of Manchester, Ferranti, and Plessey, and was first installed at Manchester University and officially commissioned in 1962 as one of the world's first supercomputers, considered to be the most powerful computer in the world at that time.[114] It was said that whenever Atlas went offline half of the United Kingdom's computer capacity was lost.[115] It was a second-generation machine, using discrete germanium transistors. Atlas also pioneered the Atlas Supervisor, "considered by many to be the first recognisable modern operating system".[116]



In the US, a series of computers at Control Data Corporation (CDC) were designed by Seymour Cray to use innovative designs and parallelism to achieve superior computational peak performance.[117] The CDC 6600, released in 1964, is generally considered the first supercomputer.[118][119] The CDC 6600 outperformed its predecessor, the IBM 7030 Stretch, by about a factor of three. With performance of about 1 megaFLOPS,[120] the CDC 6600 was the world's fastest computer from 1964 to 1969, when it relinquished that status to its successor, the CDC 7600.

3.8 The integrated circuit


Intel 8742 eight-bit microcontroller IC

The next great advance in computing power came with


the advent of the integrated circuit. The idea of the integrated circuit was conceived by a radar scientist working
for the Royal Radar Establishment of the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an integrated circuit at the Symposium on Progress in Quality Electronic Components in
Washington, D.C. on 7 May 1952:[121]
With the advent of the transistor and the work
on semi-conductors generally, it now seems
possible to envisage electronic equipment in a
solid block with no connecting wires.[122] The
block may consist of layers of insulating, conducting, rectifying and amplifying materials,
the electronic functions being connected directly by cutting out areas of the various layers.

The first practical ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor.[123] Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958.[124] In his patent application of 6 February 1959, Kilby described his new device as "a body of semiconductor material ... wherein all the components of the electronic circuit are completely integrated".[125] The first customer for the invention was the US Air Force.[126]

Noyce also came up with his own idea of an integrated circuit half a year later than Kilby.[127] His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium.

3.9 Post-1960 (integrated circuit based)

Main articles: History of computing hardware (1960s-present) and History of general purpose CPUs

The explosion in the use of computers began with third-generation computers, making use of Jack St. Clair Kilby's and Robert Noyce's independent invention of the integrated circuit (or microchip). This led to the invention of the microprocessor. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor", it is largely undisputed that the first single-chip microprocessor was the Intel 4004,[128] designed and realized by Ted Hoff, Federico Faggin, and Stanley Mazor at Intel.[129]

While the earliest microprocessor ICs literally contained only the processor, i.e. the central processing unit, of a computer, their progressive development naturally led to chips containing most or all of the internal electronic parts of a computer. The integrated circuit in the image on the right, for example, an Intel 8742, is an 8-bit microcontroller that includes a CPU running at 12 MHz, 128 bytes of RAM, 2048 bytes of EPROM, and I/O in the same chip.

During the 1960s there was considerable overlap between second and third generation technologies.[130] IBM implemented its IBM Solid Logic Technology modules in hybrid circuits for the IBM System/360 in 1964. As late as 1975, Sperry Univac continued the manufacture of second-generation machines such as the UNIVAC 494.

The Burroughs large systems such as the B5000 were stack machines, which allowed for simpler programming. These pushdown automatons were also implemented in minicomputers and microprocessors later, which influenced programming language design. Minicomputers served as low-cost computer centers for industry, business and universities.[131] It became possible to simulate analog circuits with the simulation program with integrated circuit emphasis, or SPICE (1971), on minicomputers, one of the programs for electronic design automation (EDA). The microprocessor led to the development of the microcomputer, small, low-cost computers that could be owned by individuals and small businesses. Microcomputers, the first of which appeared in the 1970s, became ubiquitous in the 1980s and beyond.
In April 1975 at the Hannover Fair, Olivetti presented the P6060, the world's first personal computer with built-in floppy disk: a central processing unit on two cards, code named PUCE1 and PUCE2, with TTL components. It had one or two 8-inch floppy disk drives, a 32-character plasma display, 80-column graphical thermal printer, 48 Kbytes of RAM, and BASIC language. It weighed 40 kg (88 lb). It was in competition with a similar product by IBM that had an external floppy disk drive.

The MOS Technology KIM-1 and Altair 8800 were sold as kits for do-it-yourselfers, as was the Apple I, soon afterward. The first Apple computer with graphic and sound capabilities came out well after the Commodore PET. Computing has evolved with microcomputer architectures, with features added from their larger brethren, now dominant in most market segments.
Systems as complicated as computers require very high
reliability. ENIAC remained on, in continuous operation from 1947 to 1955, for eight years before being shut down. Although a vacuum tube might fail, it
would be replaced without bringing down the system. By
the simple strategy of never shutting down ENIAC, the
failures were dramatically reduced. The vacuum-tube
SAGE air-defense computers became remarkably reliable: installed in pairs, one off-line, tubes likely to fail did so when the computer was intentionally run at reduced power to find them. Hot-pluggable hard disks, like the
hot-pluggable vacuum tubes of yesteryear, continue the
tradition of repair during continuous operation. Semiconductor memories routinely have no errors when they
operate, although operating systems like Unix have employed memory tests on start-up to detect failing hardware. Today, the requirement of reliable performance
is made even more stringent when server farms are the
delivery platform.[132] Google has managed this by using
fault-tolerant software to recover from hardware failures,
and is even working on the concept of replacing entire
server farms on-the-y, during a service event.[133][134]
In the 21st century, multi-core CPUs became commercially available.[135] Content-addressable memory (CAM)[136] has become inexpensive enough to be used in networking, although no computer system has yet implemented hardware CAMs for use in programming languages. Currently, CAMs (or associative arrays) in software are programming-language-specific. Semiconductor memory cell arrays are very regular structures, and manufacturers prove their processes on them; this allows price reductions on memory products. During the 1980s, CMOS logic gates developed into devices that could be made as fast as other circuit types; computer power consumption could therefore be decreased dramatically. Unlike the continuous current draw of a gate based on other logic types, a CMOS gate only draws significant current during the transition between logic states, except for leakage.

This has allowed computing to become a commodity which is now ubiquitous, embedded in many forms, from greeting cards and telephones to satellites. The thermal design power dissipated during operation has become as important a constraint as computing speed. In 2006 servers consumed 1.5% of the total energy budget of the U.S.[137] The energy consumption of computer data centers was expected to double to 3% of world consumption by 2011. The SoC (system on a chip) has compressed even more of the integrated circuitry into a single chip; SoCs are enabling phones and PCs to converge into single hand-held wireless mobile devices.[138] Computing hardware and its software have even become a metaphor for the operation of the universe.[139]

3.10 Future
Although DNA-based computing and quantum computing are years or decades in the future, the infrastructure is being laid today, for example with DNA origami on photolithography[140] and with quantum antennae for transferring information between ion traps.[141] By 2011, researchers had entangled 14 qubits.[142] Fast digital circuits, including those based on Josephson junctions and rapid single flux quantum technology, are becoming more nearly realizable with the discovery of nanoscale superconductors.[143]
Fiber-optic and photonic devices, which have already been used to transport data over long distances, are now entering the data center, side by side with CPU and semiconductor memory components. This allows the separation of RAM from CPU by optical interconnects.[144] IBM has created an integrated circuit with both electronic and optical (photonic) information processing in one chip; this is denoted CMOS-integrated nanophotonics (CINP).[145] One benefit of optical interconnects is that motherboards which formerly required a certain kind of system on a chip (SoC) can now move formerly dedicated memory and network controllers off the motherboards, spreading the controllers out onto the rack. This allows standardization of backplane interconnects and motherboards for multiple types of SoCs, which allows more timely upgrades of CPUs.[146]
An indication of the rapidity of development of this field can be inferred from the history of the seminal article by Burks, Goldstine and von Neumann:[147] by the time anyone had time to write anything down, it was obsolete. After 1945, others read John von Neumann's First Draft of a Report on the EDVAC and immediately started implementing their own systems. To this day, the pace of development has continued, worldwide.[148][149][150]

3.11 See also

Antikythera mechanism
History of computing
Information Age
IT History Society
Timeline of computing

3.12 Notes

[14] See in particular, http://things-that-count.net

[15] As quoted in Smith 1929, pp. 180–181

[16] Leibniz 1703

[17] Binary-coded decimal (BCD) is a numeric representation, or character encoding, which is still widely used.

[18] Discovering the Arithmometer, Cornell University
[1] According to Schmandt-Besserat 1981, these clay containers contained tokens, the total of which were the count
of objects being transferred. The containers thus served
as something of a bill of lading or an accounts book. In
order to avoid breaking open the containers, rst, clay
impressions of the tokens were placed on the outside of
the containers, for the count; the shapes of the impressions were abstracted into stylized marks; nally, the abstract marks were systematically used as numerals; these
numerals were nally formalized as numbers. Eventually (Schmandt-Besserat estimates it took 4000 years)
the marks on the outside of the containers were all that
were needed to convey the count, and the clay containers
evolved into clay tablets with marks for the count.
[2] Robson, Eleanor (2008), Mathematics in Ancient Iraq,
ISBN 978-0-691-09182-2. p.5: calculi were in use in
Iraq for primitive accounting systems as early as 3200
3000 BCE, with commodity-specic counting representation systems. Balanced accounting was in use by 3000
2350 BCE, and a sexagesimal number system was in use
23502000 BCE.
[3] Lazos 1994
[4] Noel Sharkey (July 4, 2007), A programmable robot from
60 AD 2611, New Scientist
[5] A Spanish implementation of Napiers bones (1617), is
documented in Montaner & Simon 1887, pp. 1920.
[6] Kells, Kern & Bland 1943, p. 92
[7] Kells, Kern & Bland 1943, p. 82
[8] "...the single-tooth gear, like that used by Schickard,
would not do for a general carry mechanism. The singletooth gear works ne if the carry is only going to be propagated a few places but, if the carry has to be propagated
several places along the accumulator, the force needed to
operate the machine would be of such magnitude that it
would do damage to the delicate gear works. Williams
1997, p. 128
[9] (fr) La Machine darithmtique, Blaise Pascal, Wikisource

[19] Columbia University Computing History Herman


Hollerith. Columbia.edu. Retrieved 2010-01-30.
[20] U.S. Census Bureau: Tabulation and Processing
[21] http://www-03.ibm.com/ibm/history/history/year_1920.
html
[22] http://www-03.ibm.com/ibm/history/history/decade_
1930.html
[23] Eckert 1935
[24] Thomas J. Watson Astronomical Computing Bureau
[25] Eckert 1940, pp. 101=114. Chapter XII is The Computation of Planetary Perturbations.
[26] Hunt 1998, pp. xiiixxxvi
[27] Old Calculator Museum
[28] Simple and Silent, Oce Magazine, December 1961,
p1244
[29] "'Anita' der erste tragbare elektonische Rechenautomat
[trans: the rst portable electronic computer"], Buromaschinen Mechaniker, November 1961, p207
[30] Halacy, Daniel Stephen (1970). Charles Babbage, Father of the Computer. Crowell-Collier Press. ISBN 0-02741370-5.
[31] Babbage. Online stu. Science Museum. 2007-01-19.
Retrieved 2012-08-01.
[32] Lets build Babbages ultimate mechanical computer.
opinion. New Scientist. 23 December 2010. Retrieved
2012-08-01.
[33] http://cse.stanford.edu/classes/sophomore-college/
projects-98/babbage/ana-mech.htm
[34] The Babbage Pages:
Calculating Engines.
Projects.ex.ac.uk. 1997-01-08. Retrieved 2012-0801.
[35] Tim Robinson (2007-05-28). Dierence Engines.
Meccano.us. Retrieved 2012-08-01.
[36] Menabrea & Lovelace 1843

[10] Marguin 1994, p. 48

[37] Chua 1971, pp. 507519

[11] Maurice d'Ocagne (1893), p. 245 Copy of this book found


on the CNAM site

[38] The Modern History of Computing. Stanford Encyclopedia of Philosophy.

[12] Mourlevat 1988, p. 12

[39] Ray Girvan, The revealed grace of the mechanism:


computing after Babbage, Scientic Computing World,
May/June 2003

[13] All nine machines are described in Vidal & Vogt 2011.


[40] Singer 1946


[41] Norden


[64] Bletchleys code-cracking Colossus, BBC News, 2


February 2010, retrieved 19 October 2012

[42] Coriolis 1836, pp. 59

[65] Fensom, Jim (8 November 2010), Harry Fensom obituary,


retrieved 17 October 2012

[43] Turing 1937, pp. 230265. Online versions: Proceedings


of the London Mathematical Society Another version online.

[66] The
Colossus
Rebuild
colossus-rebuild-story

http://www.tnmoc.org/

[44] von Neumann ... rmly emphasized to me, and to others I am sure, that the fundamental conception is owing to
Turinginsofar as not anticipated by Babbage, Lovelace
and others. Letter by Stanley Frankel to Brian Randell,
1972, quoted in Jack Copeland (2004) The Essential Turing, p22.

[67] Randell, Brian; Fensom, Harry; Milne, Frank A. (15


March 1995), Obituary: Allen Coombs, The Independent, retrieved 18 October 2012

[45] Zuse, Horst. Part 4: Konrad Zuses Z1 and Z3 Computers. The Life and Work of Konrad Zuse. EPE Online.
Archived from the original on 2008-06-01. Retrieved
2008-06-17.

[69] The existence of Colossus was not known to American


computer scientists, such as Gordon Bell and Allen Newell
(1971) in their Computing Structures, a standard reference
work in the 1970s

[46] Smith 2007, p. 60

[70] Generations of Computers

[47] Welchman 1984, p. 77

[71] Copeland 2006, p. 104

[48] Zuse

[72] Enticknap, Nicholas (Summer 1998), Computings


Golden Jubilee, Resurrection (The Computer Conservation Society) (20), ISSN 0958-7403, retrieved 19 April
2008

[49] A Computer Pioneer Rediscovered, 50 Years On. The


New York Times. April 20, 1994.
[50] Zuse, Konrad (1993). Der Computer. Mein Lebenswerk.
(in German) (3rd ed.). Berlin: Springer-Verlag. p. 55.
ISBN 978-3-540-56292-4.
[51] Crash! The Story of IT: Zuse at the Wayback Machine
(archived March 18, 2008)
[52] Electronic Digital Computers, Nature 162, 25
September 1948: 487, Bibcode:1948Natur.162..487W,
doi:10.1038/162487a0, retrieved 2009-04-10
[53] Da Cruz 2008
[54] J. Michael Dunn; Gary M. Hardegree (2001). Algebraic
methods in philosophical logic. Oxford University Press
US. p. 2. ISBN 978-0-19-853192-0.
[55] Shannon, Claude (1938). "A Symbolic Analysis of Relay
and Switching Circuits". Transactions of the American Institute of Electrical Engineers 57: 713723. doi:10.1109/taiee.1938.5057767.
[56] Shannon 1940
[57] January 15, 1941 notice in the Des Moines Register
[58] Arthur W. Burks. The First Electronic Computer.
[59] Copeland, Jack (2006), Colossus: The Secrets of Bletchley
Parks Codebreaking Computers, Oxford: Oxford University Press, pp. 101115, ISBN 0-19-284055-X
[60] Welchman 1984, pp. 138145, 295309
[61] Copeland 2006
[62] Randell 1980, p. 9
[63] Budiansky 2000, p. 314

[68] Brendan I. Loerner, The Worlds First Computer Has Finally Been Resurrected accessdate=2014-11-25

[73] Early computers at Manchester University, Resurrection (The Computer Conservation Society) 1 (4), Summer
1992, ISSN 0958-7403, retrieved 7 July 2010
[74] http://www.computer50.org/mark1/notes.html#
acousticdelay Why Williams-Kilburn Tube is a Better
Name for the Williams Tube
[75] Kilburn, Tom (1990), From Cathode Ray Tube to Ferranti Mark I, Resurrection (The Computer Conservation
Society) 1 (2), ISSN 0958-7403, retrieved 15 March 2012
[76] Early Electronic Computers (194651), University of
Manchester, retrieved 16 November 2008
[77] Napper, R. B. E., Introduction to the Mark 1, The University of Manchester, retrieved 4 November 2008
[78] Lavington 1998, p. 20
[79] Pioneering Edsac computer to be built at Bletchley Park
http://www.bbc.co.uk/news/technology-12181153
[80] Wilkes, W. V.; Renwick, W. (1950). The EDSAC (Electronic delay storage automatic calculator)". Math. Comp.
4: 6165. doi:10.1090/s0025-5718-1950-0037589-7.
[81] The Manchester Small-Scale Experimental Machine,
nicknamed Baby, predated EDSAC as a stored-program
computer, but was built as a test bed for the Williams
tube and not as a machine for practical use. http://www.
cl.cam.ac.uk/conference/EDSAC99/history.html. However, the Manchester Mark 1 of 1949 (not to be confused
with the 1948 SSEM prototype) was available for university research in April 1949 http://www.computer50.org/
mark1/MM1.html despite being still under development.


[82] Pioneer computer to be rebuilt. Cam 62: 5. Lent 2011.


Check date values in: |date= (help) To be precise, EDSACs rst program printed a list of the squares of the
integers from 0 to 99 inclusive.

[98] IBM 1956


[99] Feynman, Leighton & Sands 1966, pp. 14-11 to 1412
[100] Lavington 1998, pp. 3435

[83] EDSAC 99: 15-16 April 1999, University of Cambridge


[101] IBM_SMS 1960
Computer Laboratory, 1999-05-06, pp. 68, 69, retrieved
2013-06-29
[102] Lavington (1998), pp. 3435
[84] Wilkes, M. V. (1956). Automatic Digital Computers. New [103] Lavington 1998, p. 37
York: John Wiley & Sons. pp. 305 pages. QA76.W5
[104] Lavington (1998), p. 37
1956.
[105] Cooke-Yarborough, E. H. (June 1998), Some early
transistor applications in the UK, Engineering and
[86] Computer Conservation Society, Our Computer Heritage
Science Education Journal (IEE) 7 (3): 100106,
Pilot Study: Deliveries of Ferranti Mark I and Mark I Star
doi:10.1049/esej:19980301, ISSN 0963-7346, retrieved
computers., retrieved 9 January 2010
7 June 2009 (subscription required)
[85] Lavington 1998, p. 25

[87] Lavington, Simon. A brief history of British computers: [106] Cooke-Yarborough, E.H. (1957). Introduction to Transisthe rst 25 years (19481973).. British Computer Socitor Circuits. Edinburgh: Oliver and Boyd. p. 139.
ety. Retrieved 10 January 2010.
[107] Cooke-Yarborough, E.H. (June 1998). Some early tran[88] Martin 2008, p. 24 notes that David Caminer (1915
sistor applications in the UK. Engineering and Sci2008) served as the rst corporate electronic systems anence Education Journal (London, UK: IEE) 7 (3): 100
alyst, for this rst business computer system, a Leo com106. doi:10.1049/esej:19980301. ISSN 0963-7346. Reputer, part of J. Lyons & Company. LEO would calcutrieved 2009-06-07.
late an employees pay, handle billing, and other oce
[108] Lavington, Simon (1980). Early British Computers.
automation tasks.
Manchester University Press. p. 139. ISBN 0-7190[89] Consumer Price Index (estimate) 18002014. Federal
0803-4.
Reserve Bank of Minneapolis. Retrieved February 27,
[109] Cooke-Yarborough, E. H. (1956). transistor digital com2014.
puter. Proceedings of the IEE (London, UK: IEE) 103B
(Supp 13): 36470. ISSN 0956-3776.
[90] For example, Kara Platonis article on Donald Knuth
stated that there was something special about the IBM
[110] Lavington 1998, pp. 3637
650", Stanford Magazine, May/June 2006

[91]

[92]
[93]
[94]

[95]
[96]
[97]

Dr. V. M. Wolontis (August 18, 1955) A [111] Metrovick.


Complete Floating-Decimal Interpretive System
[112] Allen Newell used remote terminals to communicate
for the I.B.M. 650 Magnetic Drum Calculator
cross-country with the RAND computers, as noted in
Case 20878 Bell Telephone Laboratories TechniSimon 1991
cal Memorandum MM-114-37, Reported in IBM
Technical Newsletter No. 11, March 1956, as ref- [113] Bob Taylor conceived of a generalized protocol to link toerenced in IEEE Annals of the History of Computing
gether multiple networks to be viewed as a single session
(Volume:8 , Issue: 1 ) January 1986
regardless of the specic network: Wait a minute. Why
not just have one terminal, and it connects to anything you
Dudley, Leonard (2008), Information Revolution in the
want it to be connected to? And, hence, the Arpanet was
History of the West, Edward Elgar Publishing, p. 266,
born.Mayo & Newcomb 2008
ISBN 978-1-84720-790-6
[114] Lavington 1998, p. 41
IBM (1957), SOAP II for the IBM 650 (PDF), C24-4000-0
[115] Lavington 1998, pp. 4445
Horowitz & Hill 1989, p. 743
[116] Lavington 1980, pp. 5052
Wilkes, M. V. (1992).
Edsac 2.
IEEE Annals of the History of Computing 14 (4): 4956. [117] Hardware software co-design of a multimedia SOC platdoi:10.1109/85.194055.
form by Sao-Jie Chen, Guang-Huei Lin, Pao-Ann Hsiung,
Yu-Hen Hu 2009 ISBN pages 70-72
The microcode was implemented as extracode on Atlas
accessdate=20100209
[118] John Impagliazzo, John A. N. Lee (2004). History of computing in education. p. 172. ISBN 1-4020-8135-9.
An Wang led October 1949, US patent 2708722, Pulse
transfer controlling devices, issued 1955-05-17
[119] Richard Sisson, Christian K. Zacher (2006). The American Midwest: an interpretive encyclopedia. p. 1489. ISBN
Magnetic tape will be the primary data storage mecha0-253-34886-2.
nism when CERN's Large Hadron Collider comes online
in 2008.
[120]


[121] The Hapless Tale of Georey Dummer, (n.d.), [140] Ryan J. Kershner, Luisa D. Bozano, Christine M.
(HTML), Electronic Product News, accessed 8 July 2008.
Micheel, Albert M. Hung, Ann R. Fornof, Jennifer N.
Cha, Charles T. Rettner, Marco Bersani, Jane From[122] Lott, Sara. 1958 All semiconductor Solid Circuit is
mer, Paul W. K. Rothemund & Gregory M. Walldemonstrated. A Timeline of Semiconductors in Computra (16 August 2009) Placement and orientation of
ers. Computer History Museum. Retrieved 4 September
individual DNA shapes on lithographically patterned
2011.
surfaces Nature Nanotechnology publication information, supplementary information: DNA origami on pho[123] Kilby 2000
tolithography doi:10.1038/nnano.2009.220
[124] The Chip that Jack Built, (c. 2008), (HTML), Texas In[141] M. Harlander, R. Lechner, M. Brownnutt, R. Blatt,
struments, Retrieved 29 May 2008.
W. Hnsel.
Trapped-ion antennae for the transmission of quantum information.
Nature, 2011;
[125] Winston, Brian (1998). Media Technology and Society: A
doi:10.1038/nature09800
History : From the Telegraph to the Internet. Routledge.
p. 221. ISBN 978-0-415-14230-4.
[142] Thomas Monz, Philipp Schindler, Julio T. Barreiro, Michael Chwalla, Daniel Nigg, William A.
[126] Texas Instruments 1961 First IC-based computer.
Coish, Maximilian Harlander, Wolfgang Hnse,
Ti.com. Retrieved 2012-08-13.
Markus Hennrich, and Rainer Blatt, (31 March
[127] Robert Noyce's Unitary circuit, US patent 2981877,
2011) 14-Qubit Entanglement:
Creation and
Semiconductor device-and-lead structure, issued 1961Rev.
Lett.
106 13 http:
Coherence Phys.
04-25, assigned to Fairchild Semiconductor Corporation
//link.aps.org/doi/10.1103/PhysRevLett.106.130506
doi:10.1103/PhysRevLett.106.130506
[128] Intel_4004 1971
[129] The Intel 4004 (1971) die was 12 mm2 , composed of
2300 transistors; by comparison, the Pentium Pro was 306
mm2 , composed of 5.5 million transistors, according to
Patterson & Hennessy 1998, pp. 2739.

[143] Saw-Wai Hla et al., Nature Nanotechnology March 31,


2010 Worlds smallest superconductor discovered. Four
pairs of certain molecules have been shown to form
a nanoscale superconductor, at a dimension of 0.87
nanometers. Access date 2010-03-31

[130] In the defense eld, considerable work was done in


the computerized implementation of equations such as [144] Tom Simonite, Computing at the speed of light, Technology Review Wed., August 4, 2010 MIT
Kalman 1960, pp. 3545
[145] Sebastian Anthony (Dec 10,2012), IBM creates
rst commercially viable silicon nanophotonic chip,
accessdate=2012-12-10
[132] Since 2005, its [Googles] data centers have been composed of standard shipping containerseach with 1,160
servers and a power consumption that can reach 250 kilo- [146] Open Compute: Does the data center have an open future?
accessdate=2013-08-11
watts. Ben Jai of Google, as quoted in Shankland 2009
[131] Eckhouse & Morris 1979, pp. 12

[133] If you're running 10,000 machines, something is going [147] Burks, Goldstine & von Neumann 1947, pp. 1464
reprinted in Datamation, SeptemberOctober 1962. Note
to die every day. Je Dean of Google, as quoted in
that preliminary discussion/design was the term later called
Shankland 2008.
system analysis/design, and even later, called system architecture.
[134] However, when an entire server farm fails today, the recovery procedures are currently still manual procedures,
with the need for training the recovery team, even for [148] IEEE_Annals 1979 (IEEE Annals of the History of Computing online access)
the most advanced facilities. The initial failure was a
power failure; the recovery procedure cited an inconsis[149] DBLP summarizes the Annals of the History of Computing
tent backup site, and the inconsistent backup site was outyear by year, back to 1995, so far.
dated. Accessdate=2010-03-08
[135] Intel has unveiled a single-chip version of a 48-core
CPU for software and circuit research in cloud computing: accessdate=2009-12-02. Intel has loaded Linux
on each core; each core has an X86 architecture:
accessdate=2009-12-3
[136] Kohonen 1980, pp. 1368

[150] The fastest supercomputer of the top 500 was announced


to be Tianhe-2 at the 2014 Supercomputing Conference,
two years in a row, previously topping 2012s Titan, as of
Monday, June 23, 2014.

3.13 References

[137] 2007 Energystar report, p. 4 accessdate=2013-08-18


[138] Walt Mossberg (9 July 2014) How the PC is merging with
the smartphone accessdate=2014-07-09
[139] Smolin 2001, pp. 5357.Pages 220226 are annotated
references and guide for further reading.

Backus, John (August 1978), Can Programming be Liberated from the von Neumann
Style?", Communications of the ACM 21 (8): 613,
doi:10.1145/359576.359579, 1977 ACM Turing
Award Lecture.


Bell, Gordon; Newell, Allen (1971), Computer


Structures: Readings and Examples, New York:
McGraw-Hill, ISBN 0-07-004357-4.

Davenport, Wilbur B., Jr; Root, William L. (1958),


An Introduction to the theory of Random Signals and
Noise, McGraw-Hill, pp. 112364, OCLC 573270.

Bergin, Thomas J. (ed.) (November 1314, 1996),


Fifty Years of Army Computing: from ENIAC to
MSRC, A record of a symposium and celebration,
Aberdeen Proving Ground.: Army Research Laboratory and the U.S.Army Ordnance Center and
School., retrieved 2008-05-17.

Eckert, Wallace (1935), The Computation


of Special Perturbations by the Punched
Card Method., Astronomical Journal 44
(1034):
177, Bibcode:1935AJ.....44..177E,
doi:10.1086/105298.

Bowden, B. V. (1970), The Language of


4353,
Computers, American Scientist 58:
Bibcode:1970AmSci..58...43B, retrieved 2008-0517.
Burks, Arthur W.; Goldstine, Herman; von Neumann, John (1947), Preliminary discussion of the
Logical Design of an Electronic Computing Instrument, Princeton, NJ: Institute for Advanced Study,
retrieved 2008-05-18.
Chua, Leon O (September 1971), Memristor
The Missing Circuit Element, IEEE Transactions on Circuit Theory, CT-18 (5): 507519,
doi:10.1109/TCT.1971.1083337.
Cleary, J. F. (1964), GE Transistor Manual (7th
ed.), General Electric, Semiconductor Products Department, Syracuse, NY, pp. 139204, OCLC
223686427.
Copeland, B. Jack (ed.) (2006), Colossus: The Secrets of Bletchley Parks Codebreaking Computers,
Oxford, England: Oxford University Press, ISBN 019-284055-X.
Coriolis, Gaspard-Gustave (1836), Note sur un
moyen de tracer des courbes donnes par des quations direntielles, Journal de Mathmatiques
Pures et Appliques, series I (in French) 1: 59, retrieved 2008-07-06.
Cortada, James W. (2009), Public Policies and
the Development of National Computer Industries in Britain, France, and the Soviet Union,
194080, Journal of Contemporary History 44
(3): 493512, doi:10.1177/0022009409104120,
JSTOR 40543045.
CSIRAC: Australias rst computer ( Scholar search ),
Commonwealth Scientic and Industrial Research
Organisation (CSIRAC), June 3, 2005, retrieved
2007-12-21.
Da Cruz, Frank (February 28, 2008), The IBM Automatic Sequence Controlled Calculator (ASCC)",
Columbia University Computing History: A Chronology of Computing at Columbia University (Columbia
University ACIS), retrieved 2008-05-17.

Eckert, Wallace (1940), XII: The Computation of


Planetary Perturbations"", Punched Card Methods
in Scientic Computation, Thomas J. Watson Astronomical Computing Bureau, Columbia University,
pp. 101114, OCLC 2275308.
Eckhouse, Richard H., Jr.; Morris, L. Robert
(1979), Minicomputer Systems: organization, programming, and applications (PDP-11), PrenticeHall, pp. 12, ISBN 0-13-583914-9.
Enticknap, Nicholas (Summer 1998), Computings
Golden Jubilee, Resurrection (The Computer Conservation Society) (20), ISSN 0958-7403, retrieved
2008-04-19.
Feynman, R. P.; Leighton, Robert; Sands, Matthew
(1965), Feynman Lectures on Physics: Mainly Mechanics, Radiation and Heat I, Reading, Mass:
Addison-Wesley, ISBN 0-201-02010-6, OCLC
531535.
Feynman, R. P.; Leighton, Robert; Sands, Matthew
(1966), Feynman Lectures on Physics: Quantum Mechanics III, Reading, Mass: Addison-Wesley, ASIN
B007BNG4E0.
Fisk, Dale (2005), Punch cards, Columbia University ACIS, retrieved 2008-05-19.
Flamm, Kenneth (1987), Targeting the Computer:
Government Support and International Competition,
Washington, DC: Brookings Institution Press, ISBN
978-0-815-72852-8.
Flamm, Kenneth (1988), Creating the Computer:
Government, Industry, and High Technology, Washington, DC: Brookings Institution Press, ISBN 9780-815-72850-4.
Hollerith, Herman (1890), In connection with the
electric tabulation system which has been adopted by
U.S. government for the work of the census bureau
(Ph.D. dissertation), Columbia University School of
Mines.
Horowitz, Paul; Hill, Wineld (1989), The Art of
Electronics (2nd ed.), Cambridge University Press,
ISBN 0-521-37095-7.
Hunt, J. c. r. (1998), Lewis Fry Richardson and
his contributions to Mathematics, Meteorology and


Models of Conict, Ann. Rev. Fluid Mech. 30 (1):


XIIIXXXVI, Bibcode:1998AnRFM..30D..13H,
doi:10.1146/annurev.uid.30.1.0, retrieved 200806-15.
IBM_SMS (1960), IBM Standard Modular System
SMS Cards, IBM, retrieved 2008-03-06.
IBM (September 1956), IBM 350 disk storage unit,
IBM, retrieved 2008-07-01.
IEEE_Annals (Series dates from 1979), Annals of
the History of Computing, IEEE, retrieved 2008-0519 Check date values in: |date= (help).
Ifrah, Georges (2000), The Universal History of
Numbers: From prehistory to the invention of the
computer., John Wiley and Sons, p. 48, ISBN
0-471-39340-1. Translated from the French by
David Bellos, E.F. Harding, Sophie Wood and Ian
Monk. Ifrah supports his thesis by quoting idiomatic
phrases from languages across the entire world.
Intel_4004 (November 1971),
Microprocessorthe Intel 4004,
retrieved 2008-05-17.

Intels First
Intel Corp.,

Manchester (1999), Mark 1, Computer History


Museum, The University of Manchester, retrieved
2008-04-19
Marguin, Jean (1994), Histoire des instruments et
machines calculer, trois sicles de mcanique pensante 1642-1942 (in French), Hermann, ISBN 9782-7056-6166-3
Martin, Douglas (June 29, 2008), David Caminer,
92 Dies; A Pioneer in Computers, New York Times:
24
Mayo, Keenan; Newcomb, Peter (July 2008), How
the web was won: an oral history of the internet,
Vanity Fair (Conde Nast): 96117, retrieved 201205-05
Mead, Carver; Conway, Lynn (1980), Introduction
to VLSI Systems, Reading, Mass.: Addison-Wesley,
ISBN 0-201-04358-0.
Menabrea, Luigi Federico; Lovelace, Ada (1843),
Sketch of the Analytical Engine Invented by
Charles Babbage, Scientic Memoirs 3. With notes
upon the Memoir by the Translator.

Jones, Douglas W, Punched Cards: A brief illustrated technical history, The University of Iowa, retrieved 2008-05-15.

Menninger, Karl (1992), Number Words and Number Symbols: A Cultural History of Numbers, Dover
Publications. German to English translation, M.I.T.,
1969.

Kalman, R.E. (1960), A new approach to linear ltering and prediction problems, Journal of Basic
Engineering 82 (1): 3545, doi:10.1115/1.3662552,
retrieved 2008-05-03.

Montaner; Simon (1887), Diccionario Enciclopdico


Hispano-Americano (Hispano-American Encyclopedic Dictionary).

Kells; Kern; Bland (1943), The Log-Log Duplex


Decitrig Slide Rule No. 4081: A Manual, Keuel
& Esser, p. 92.
Kilby, Jack (2000), Nobel lecture, Stockholm: Nobel Foundation, retrieved 2008-05-15.
Kohonen, Teuvo (1980), Content-addressable memories, Springer-Verlag, p. 368, ISBN 0-387-098232.
Lavington, Simon (1998), A History of Manchester
Computers (2 ed.), Swindon: The British Computer
Society
Lazos, Christos (1994),
[The Antikythera Computer],
PUBLICATIONS GR.
Leibniz, Gottfried (1703),
l'Arithmtique Binaire.

Explication

de

Lubar, Steven (May 1991), Do not fold, spindle or


mutilate": A cultural history of the punched card,
archived from the original on 2006-10-25, retrieved
2006-10-31

Moye, William T. (January 1996), ENIAC: The


Army-Sponsored Revolution, retrieved 2008-05-17.
Norden, M9 Bombsight, National Museum of the
USAF, retrieved 2008-05-17.
Noyce, Robert US patent 2981877, Robert Noyce,
Semiconductor device-and-lead structure, issued
1961-04-25, assigned to Fairchild Semiconductor
Corporation.
Patterson, David; Hennessy, John (1998), Computer
Organization and Design, San Francisco: Morgan
Kaufmann, ISBN 1-55860-428-6.
Mourlevat, Guy (1988), Les machines arithmtiques
de Blaise Pascal (in French), Clermont-Ferrand: La
Franaise d'Edition et d'Imprimerie
Pellerin, David; Thibault, Scott (April 22, 2005),
Practical FPGA Programming in C, Prentice Hall
Modern Semiconductor Design Series Sub Series:
PH Signal Integrity Library, pp. 1464, ISBN 013-154318-0.
Phillips, A.W.H., The MONIAC, Reserve Bank Museum, retrieved 2006-05-17.


Reynolds, David (2010), Science, technology, and


the Cold War, in Leer, Melvyn P.; Westad, Odd
Arne, The Cambridge History of the Cold War, Volume III: Endings, Cambridge: Cambridge University
Press, pp. 378399, ISBN 978-0-521-83721-7.

Entscheidungsproblem: A correction, Proceedings


of the London Mathematical Society, 2 (1937) 43 (6):
5446, doi:10.1112/plms/s2-43.6.544)Other online
versions: Proceedings of the London Mathematical
Society Another link online.

Rojas, Raul; Hashagen, Ulf (eds., 2000). The First


Computers: History and Architectures. Cambridge:
MIT Press. ISBN 0-262-68137-4.

Ulam, Stanislaw (1976), Adventures of a Mathematician, New York: Charles Scribners Sons, (autobiography).

Schmandt-Besserat, Denise (1981), Decipherment of the earliest tablets, Science 211


(4479): 283285, Bibcode:1981Sci...211..283S,
doi:10.1126/science.211.4479.283,
PMID
17748027.

Vidal, Nathalie; Vogt, Dominique (2011), Les Machines Arithmtiques de Blaise Pascal (in French),
Clermont-Ferrand: Musum Henri-Lecoq, ISBN
978-2-9528068-4-8

Shankland, Stephen (May 30, 2008), Google spotlights data center inner workings, Cnet, retrieved
2008-05-31.
Shankland, Stephen (April 1, 2009), Google uncloaks once-secret server, Cnet, retrieved 2009-0401.
Shannon, Claude E. (1940), A symbolic analysis of
relay and switching circuits, Massachusetts Institute
of Technology, Dept. of Electrical Engineering.
Simon, Herbert A. (1991), Models of My Life, Basic
Books, Sloan Foundation Series.
Singer (1946), Singer in World War II, 19391945
the M5 Director, Singer Manufacturing Co., retrieved 2008-05-17.
Smith, David Eugene (1929), A Source Book in
Mathematics, New York: McGraw-Hill, pp. 180
181.
Smolin, Lee (2001), Three roads to quantum gravity, Basic Books, pp. 5357, ISBN 0-465-07835-4.
Pages 220226 are annotated references and guide
for further reading.
Steinhaus, H. (1999), Mathematical Snapshots (3rd
ed.), New York: Dover, pp. 9295, p. 301.
Stern, Nancy (1981), From ENIAC to UNIVAC: An
Appraisal of the Eckert-Mauchly Computers, Digital
Press, ISBN 0-932376-14-2.
Stibitz, George US patent 2668661, George Stibitz,
Complex Computer, issued 1954-02-09, assigned
to American Telephone & Telegraph Company.
Taton, Ren (1969), Histoire du calcul. Que sais-je ?
n 198 (in French), Presses universitaires de France
Turing, A.M. (1936), On Computable Numbers,
with an Application to the Entscheidungsproblem,
Proceedings of the London Mathematical Society,
2 (1937) 42 (1): 23065, doi:10.1112/plms/s242.1.230 (and Turing, A.M. (1938), On Computable Numbers, with an Application to the

von Neumann, John (June 30, 1945), First Draft of


a Report on the EDVAC, Moore School of Electrical
Engineering: University of Pennsylvania.
Wang, An US patent 2708722, An Wang, Pulse
transfer controlling devices, issued 1955-05-17.
Welchman, Gordon (1984), The Hut Six Story:
Breaking the Enigma Codes, Harmondsworth, England: Penguin Books, pp. 138145, 295309.
Wilkes, Maurice (1986), The Genesis of Microprogramming, Ann. Hist. Comp. 8 (2): 115126.
Williams, Michael R. (1997), History of Computing
Technology, Los Alamitos, California: IEEE Computer Society, ISBN 0-8186-7739-2
Ziemer, Roger E.; Tranter, William H.; Fannin, D.
Ronald (1993), Signals and Systems: Continuous and
Discrete, Macmillan, p. 370, ISBN 0-02-431641-5.
Zuse, Z3 Computer (19381941), retrieved 200806-01.
Zuse, Konrad (2010) [1984], The Computer
My Life Translated by McKenna, Patricia and
Ross, J. Andrew from: Der Computer, mein
Lebenswerk (1984) (in English translated from German), Berlin/Heidelberg: Springer-Verlag, ISBN
978-3-642-08151-4

3.14 Further reading


Ceruzzi, Paul E., A History of Modern Computing,
MIT Press, 1998

3.15 External links


Obsolete Technology Old Computers
History of calculating technology
Historic Computers in Japan



The History of Japanese Mechanical Calculating
Machines
Computer History a collection of articles by Bob
Bemer
25 Microchips that shook the world a collection
of articles by the Institute of Electrical and Electronics Engineers


Chapter 4

Software
For other uses, see Software (disambiguation).

Computer software, or simply software, is any set of machine-readable instructions that directs a computer's processor to perform specific operations. Computer software contrasts with computer hardware, which is the physical component of computers. Computer hardware and software require each other; neither can realistically be used without the other. In a musical analogy, hardware is like a musical instrument and software is like the notes played on that instrument.

Computer software includes computer programs, libraries and their associated documentation. The word software is also sometimes used in a narrower sense, meaning application software only. Software is stored in computer memory and is intangible: it cannot be touched.[1]

At the lowest level, executable code consists of machine language instructions specific to an individual processor, typically a central processing unit (CPU). A machine language consists of groups of binary values signifying processor instructions that change the state of the computer from its preceding state. For example, an instruction may change the value stored in a particular storage location inside the computer, an effect that is not directly observable to the user. An instruction may also (indirectly) cause something to appear on a display of the computer system, a state change which should be visible to the user. The processor carries out the instructions in the order they are provided, unless it is instructed to jump to a different instruction or is interrupted.

Software written in a machine language is known as machine code. In practice, however, software is usually written in high-level programming languages that are easier and more efficient for humans to use (closer to natural language) than machine language.[2] High-level languages are translated, using compilation or interpretation or a combination of the two, into machine language. Software may also be written in a low-level assembly language, essentially a vaguely mnemonic representation of a machine language using a natural-language alphabet. Assembly language is translated into machine code using an assembler.
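As a small illustration of this translation, CPython's standard dis module displays the lower-level instructions that a single high-level statement is compiled into. The bytecode shown targets the interpreter's virtual machine rather than a hardware CPU, and the exact opcodes vary between Python versions, but the principle mirrors compiling to machine language.

    import dis

    def add_one(x):
        # One high-level statement...
        return x + 1

    # ...is compiled into several lower-level bytecode instructions,
    # e.g. LOAD_FAST, LOAD_CONST, an add instruction, RETURN_VALUE
    # (the exact opcode names depend on the CPython version).
    dis.dis(add_one)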
4.1 History

Main article: History of software

An outline (algorithm) for what would have been the first piece of software was written by Ada Lovelace in the 19th century, for the planned analytical engine. However, neither the analytical engine nor any software for it were ever created.

The first theory about software, prior to the creation of computers as we know them today, was proposed by Alan Turing in his 1935 essay Computable numbers with an application to the Entscheidungsproblem (decision problem).

This eventually led to the creation of the twin academic fields of computer science and software engineering, which both study software and its creation. Computer science is more theoretical (Turing's essay is an example of computer science), whereas software engineering is focused on more practical concerns.

However, prior to 1946, software as we now understand it, programs stored in the memory of stored-program digital computers, did not yet exist. The very first electronic computing devices were instead rewired in order to reprogram them.

4.2 Types of software

See also: List of software categories

On a typical desktop computer, the operating system software and application software are layered, with information flowing between the layers. On virtually all computer platforms, software can be grouped into a few broad categories.

4.2.1 Purpose, or domain of use

Based on the goal, computer software can be divided into:

Application software, which uses the computer system to perform special functions or provide entertainment functions beyond the basic operation of the computer itself. There are many different types of application software, because the range of tasks that can be performed with a modern computer is so large; see list of software.

System software, which is designed to directly operate the computer hardware, to provide basic functionality needed by users and other software, and to provide a platform for running application software.[3] System software includes:

Operating systems, which are essential collections of software that manage resources and provide common services for other software that runs on top of them. Supervisory programs, boot loaders, shells and window systems are core parts of operating systems. In practice, an operating system comes bundled with additional software (including application software) so that a user can potentially do some work with a computer that only has an operating system.

Device drivers, which operate or control a particular type of device that is attached to a computer. Each device needs at least one corresponding device driver; because a computer typically has at least one input device and at least one output device, a computer typically needs more than one device driver.

Utilities, which are computer programs designed to assist users in the maintenance and care of their computers.

Malicious software, or malware, which comprises computer programs developed to harm and disrupt computers. As such, malware is undesirable. Malware is closely associated with computer-related crimes, though some malicious programs may have been designed as practical jokes.

4.2.2 Nature, or domain of execution

Desktop applications such as web browsers and Microsoft Office, as well as smartphone and tablet applications (called "apps"). (There is a push in some parts of the software industry to merge desktop applications with mobile apps, to some extent. Windows 8, and later Ubuntu Touch, tried to allow the same style of application user interface to be used on desktops and laptops, mobile devices, and hybrid tablets.)

JavaScript scripts are pieces of software traditionally embedded in web pages that are run directly inside the web browser when a web page is loaded, without the need for a web browser plugin. Software written in other programming languages can also be run within the web browser if the software is either translated into JavaScript or if a web browser plugin that supports that language is installed; the most common example of the latter is ActionScript scripts, which are supported by the Adobe Flash plugin.

Server software, including:

Web applications, which usually run on the web server and output dynamically generated web pages to web browsers, using e.g. PHP, Java or ASP.NET, or even JavaScript that runs on the server. In modern times these commonly include some JavaScript to be run in the web browser as well, in which case they typically run partly on the server and partly in the web browser.

Plugins and extensions are software that extends or modifies the functionality of another piece of software, and require that software be used in order to function.

Embedded software resides as firmware within embedded systems, devices dedicated to a single use or a few uses such as cars and televisions (although some embedded devices such as wireless chipsets can themselves be part of an ordinary, non-embedded computer system such as a PC or smartphone).[4] In the embedded system context there is sometimes no clear distinction between the system software and the application software. However, some embedded systems run embedded operating systems, and these systems do retain the distinction between system software and application software (although typically there will only be one fixed application, which is always run).

Microcode is a special, relatively obscure type of embedded software which tells the processor itself how to execute machine code, so it is actually at a lower level than machine code.[5] It is typically proprietary to the processor manufacturer, and any necessary corrective microcode updates are supplied by the manufacturer to users (which is much cheaper than shipping replacement processor hardware). Thus an ordinary programmer would not expect to ever have to deal with it.

4.2.3 Programming tools

Main article: Programming tool

Programming tools are also software in the form of programs or applications that software developers (also


known as programmers, coders, hackers or software engineers) use to create, debug, maintain (i.e. improve or fix), or otherwise support software. Software is written in one or more programming languages; there are many programming languages in existence, and each has at least one implementation, each of which consists of its own set of programming tools. These tools may be relatively self-contained programs such as compilers, debuggers, interpreters, linkers, and text editors, which can be combined to accomplish a task; or they may form an integrated development environment (IDE), which combines much or all of the functionality of such self-contained tools. IDEs may do this either by invoking the relevant individual tools or by re-implementing their functionality in a new way. An IDE can make it easier to do specific tasks, such as searching in files in a particular project. Many programming language implementations provide the option of using either individual tools or an IDE.

4.3 Software topics


4.3.1 Architecture

See also: Software architecture


Users often see things differently from programmers. People who use modern general-purpose computers (as opposed to embedded systems, analog computers and supercomputers) usually see three layers of software performing a variety of tasks: platform, application, and user software.

Platform software: the platform includes the firmware, device drivers, an operating system, and typically a graphical user interface which, in total, allow a user to interact with the computer and its peripherals (associated equipment). Platform software often comes bundled with the computer. On a PC one will usually have the ability to change the platform software.

Application software: application software, or applications, are what most people think of when they think of software. Typical examples include office suites and video games. Application software is often purchased separately from computer hardware. Sometimes applications are bundled with the computer, but that does not change the fact that they run as independent applications. Applications are usually independent programs from the operating system, though they are often tailored for specific platforms. Most users think of compilers, databases, and other system software as applications.

User-written software: End-user development tailors systems to meet users' specific needs. User-written software includes spreadsheet templates and word processor templates; even email filters are a kind of user software (a minimal example of such a filter appears below). Users create this software themselves and often overlook how important it is. Depending on how competently the user-written software has been integrated into default application packages, many users may not be aware of the distinction between the original packages and what has been added by co-workers.
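As a purely illustrative sketch of such user-written software, the few lines of Python below implement a toy email filter; the message fields, addresses and the rule itself are hypothetical and not drawn from any particular mail program.

    # A toy user-written email filter: move messages whose subject mentions an
    # invoice into a separate folder. Data and rule are made up for illustration.
    messages = [
        {"from": "alice@example.com", "subject": "Invoice 0042", "folder": "Inbox"},
        {"from": "bob@example.com", "subject": "Lunch on Friday?", "folder": "Inbox"},
    ]

    def apply_filter(msgs):
        for msg in msgs:
            if "invoice" in msg["subject"].lower():
                msg["folder"] = "Accounting"  # the user's own rule
        return msgs

    for m in apply_filter(messages):
        print(m["subject"], "->", m["folder"])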

4.3.2 Execution
Main article: Execution (computing)
Computer software has to be loaded into the computer's storage (such as the hard drive or memory). Once the software has loaded, the computer is able to execute it. This involves passing instructions from the application software, through the system software, to the hardware, which ultimately receives the instruction as machine code. Each instruction causes the computer to carry out an operation: moving data, carrying out a computation, or altering the control flow of instructions. Data movement is typically from one place in memory to another. Sometimes it involves moving data between memory and registers, which enable high-speed data access in the CPU. Moving data, especially large amounts of it, can be costly, so this is sometimes avoided by using pointers to the data instead. Computations include simple operations such as incrementing the value of a variable data element. More complex computations may involve many operations and data elements together.
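To make the fetch-and-execute idea concrete, the following minimal Python sketch imitates a processor's instruction loop; the tiny instruction set (LOAD, ADD, JUMP_IF_LESS, PRINT) is invented purely for illustration and does not correspond to any real machine language.

    def run(program):
        registers = {"A": 0}          # a single general-purpose register
        pc = 0                        # program counter
        while pc < len(program):
            op, arg = program[pc]     # fetch and decode
            if op == "LOAD":          # data movement: put a value in the register
                registers["A"] = arg
            elif op == "ADD":         # a simple computation
                registers["A"] += arg
            elif op == "JUMP_IF_LESS":  # alter the control flow
                target, limit = arg
                if registers["A"] < limit:
                    pc = target
                    continue          # skip the normal advance to the next instruction
            elif op == "PRINT":
                print(registers["A"])
            pc += 1                   # default: execute instructions in order

    # Keep adding 1 to register A until it reaches 5, then print it.
    run([("LOAD", 0), ("ADD", 1), ("JUMP_IF_LESS", (1, 5)), ("PRINT", None)])

A real processor does the same thing in hardware: the program counter selects the next machine-code instruction unless a jump or an interrupt redirects it.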

4.3.3 Quality and reliability


Main articles: Software quality, Software testing and
Software reliability
Software quality is very important, especially for commercial and system software like Microsoft Office, Microsoft Windows and Linux. If software is faulty (buggy), it can delete a person's work, crash the computer and do other unexpected things. Faults and errors are called "bugs". Software is often also a victim of what is known as software aging, the progressive performance degradation resulting from a combination of unseen bugs.

Many bugs are discovered and eliminated (debugged) through software testing. However, software testing rarely if ever eliminates every bug; some programmers say that every program has at least one more bug
(Lubarsky's Law).[6] In the waterfall method of software development, separate testing teams are typically employed, but in newer approaches, collectively termed agile software development, developers often do all their own testing and demonstrate the software to users/clients regularly to obtain feedback. Software can be tested through
unit testing, regression testing and other methods, which
are done manually, or most commonly, automatically,
since the amount of code to be tested can be quite large.
For instance, NASA has extremely rigorous software testing procedures for many operating systems and communication functions. Many NASA-based operations interact with and identify each other through command programs. This enables many people who work at NASA to check and evaluate functional systems overall. Programs containing command software enable hardware engineering and system operations to function together much more easily.
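As a small illustration of the unit testing mentioned above (the function and tests are invented for this example, not drawn from any project discussed here), Python's standard unittest module lets tests be run automatically every time the code changes:

    import unittest

    def mean(values):
        # The unit under test: average of a non-empty list of numbers.
        return sum(values) / len(values)

    class TestMean(unittest.TestCase):
        def test_typical_values(self):
            self.assertEqual(mean([2, 4, 6]), 4)

        def test_single_value(self):
            self.assertEqual(mean([7]), 7)

    if __name__ == "__main__":
        unittest.main()   # running the file executes every test and reports failures

Collections of such tests can be re-run after every change (regression testing), which is one reason testing is usually automated when the amount of code is large.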

software patents are supposed to cover the middle area,
between requirements and concrete implementation. In
some countries, a requirement for the claimed invention
to have an eect on the physical world may also be part of
the requirements for a software patent to be held valid although since all useful software has eects on the physical world, this requirement may be open to debate.

Software patents are controversial in the software industry with many people holding dierent views about them.
One of the sources of controversy is that the aforementioned split between initial ideas and patent does not seem
to be honored in practice by patent lawyers - for example the patent for Aspect-Oriented Programming (AOP),
which purported to claim rights over any programming
4.3.4 License
tool implementing the idea of AOP, howsoever implemented. Another source of controversy is the eect on
Main article: Software license
innovation, with many distinguished experts and companies arguing that software is such a fast-moving eld that
The softwares license gives the user the right to use the software patents merely create vast additional litigation
software in the licensed environment, and in the case of costs and risks, and actually retard innovation. In the case
free software licenses, also grants other rights such as the of debates about software patents outside the US, the arright to make copies.
gument has been made that large American corporations
and patent lawyers are likely to be the primary beneciaProprietary software can be divided into two types:
ries of allowing or continue to allow software patents.
freeware, which includes the category of free
trial software or "freemium" software (in the
past, the term shareware was often used for free 4.4 Design and implementation
trial/freemium software). As the name suggests,
freeware can be used for free, although in the case of
free trials or freemium software, this is sometimes Main articles: Software development, Computer proonly true for a limited period of time or with limited gramming and Software engineering
functionality.
Design and implementation of software varies depend software available for a fee, often inaccurately ing on the complexity of the software. For instance, the
termed "commercial software", which can only be design and creation of Microsoft Word took much more
legally used on purchase of a license.
time than designing and developing Microsoft Notepad
because the latter has much more basic functionality.
Open source software, on the other hand, comes with a Software is usually designed and created (a.k.a.
free software license, granting the recipient the rights to coded/written/programmed) in integrated development
modify and redistribute the software.
environments (IDE) like Eclipse, IntelliJ and Microsoft

4.3.5

Patents

Main articles: Software patent and Software patent


debate
Software patents, like other types of patents, are theoretically supposed to give an inventor an exclusive, timelimited license for a detailed idea (e.g. an algorithm) on
how to implement a piece of software, or a component
of a piece of software. Ideas for useful things that software could do, and user requirements, are not supposed
to be patentable, and concrete implementations (i.e. the
actual software packages implementing the patent) are
not supposed to be patentable either - the latter are already covered by copyright, generally automatically. So

Visual Studio that can simplify the process and compile


the software (if applicable). As noted in a dierent
section, software is usually created on top of existing
software and the application programming interface
(API) that the underlying software provides like GTK+,
JavaBeans or Swing. Libraries (APIs) can be categorized
by their purpose. For instance, the Spring Framework
is used for implementing enterprise applications, the
Windows Forms library is used for designing graphical
user interface (GUI) applications like Microsoft Word,
and Windows Communication Foundation is used for
designing web services. When a program is designed, it
relies upon the API. For instance, if a user is designing a
Microsoft Windows desktop application, he or she might
use the .NET Windows Forms library to design the
desktop application and call its APIs like Form1.Close()
and Form1.Show()[7] to close or open the application,

48
and write the additional operations him/herself that it
needs to have. Without these APIs, the programmer
needs to write these APIs him/herself. Companies
like Oracle and Microsoft provide their own APIs so
that many applications are written using their software
libraries that usually have numerous APIs in them.
Data structures such as hash tables, arrays, and binary
trees, and algorithms such as quicksort, can be useful for
creating software.
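Since the passage above names quicksort as a typical example, here is a short, generic Python version of the algorithm (a textbook formulation, not code from any particular product):

    def quicksort(items):
        # Recursive quicksort: pick a pivot, partition, sort the parts.
        if len(items) <= 1:
            return items
        pivot = items[len(items) // 2]
        smaller = [x for x in items if x < pivot]
        equal = [x for x in items if x == pivot]
        larger = [x for x in items if x > pivot]
        return quicksort(smaller) + equal + quicksort(larger)

    print(quicksort([33, 4, 15, 8, 4, 42]))  # -> [4, 4, 8, 15, 33, 42]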
Computer software has special economic characteristics
that make its design, creation, and distribution dierent
from most other economic goods.[8][9]
A person who creates software is called a programmer,
software engineer or software developer, terms that all
have a similar meaning. More informal terms for programmer also exist such as coder and "hacker" although use of the latter word may cause confusion, because it is more often used to mean someone who illegally
breaks into computer systems.

4.5 Industry and organizations


Main article: Software industry
A great variety of software companies and programmers in the world comprise a software industry. Software can be quite a protable industry: Bill Gates, the
founder of Microsoft was the richest person in the world
in 2009, largely due to his ownership of a signicant number of shares in Microsoft, the company responsible for
Microsoft Windows and Microsoft Oce software products.
Non-prot software organizations include the Free Software Foundation, GNU Project and Mozilla Foundation.
Software standard organizations like the W3C, IETF develop recommended software standards such as XML,
HTTP and HTML, so that software can interoperate
through these standards.
Other well-known large software companies include
Oracle, Novell, SAP, Symantec, Adobe Systems, and
Corel, while small companies often provide innovation.

4.6 See also


Software release life cycle
List of software

4.7 References
[1] "'Software' from Collins Concise English Dictionary.
Wordreference.com. Princeton, NJ: Princeton University.


Retrieved 2007-08-19.
[2] Compiler construction.
[3] System Software. The University of Mississippi.
[4] Embedded SoftwareTechnologies and Trends. IEEE
Computer Society. Retrieved MayJune 2009.
[5] Microcode. Princeton University.
[6] scripting intelligence book examples.
[7] MSDN Library. Retrieved 2010-06-14.
[8] v. Engelhardt, Sebastian (2008). The Economic Properties of Software. Jena Economic Research Papers 2
(2008045.).
[9] Kaminsky, Dan (1999). Why Open Source Is The Optimum Economic Paradigm for Software.

4.8 External links


Software Wikia
Software in Open Directory Project
Software glitches are sometimes deadly

Chapter 5

Computer science
Computer science is the scientic and practical approach to computation and its applications. It is the
systematic study of the feasibility, structure, expression, and mechanization of the methodical procedures
(or algorithms) that underlie the acquisition, representation, processing, storage, communication of, and access to information, whether such information is encoded
as bits in a computer memory or transcribed in genes
and protein structures in a biological cell.[1] An alternate, more succinct denition of computer science is the
study of automating algorithmic processes that scale. A
computer scientist specializes in the theory of computation and the design of computational systems.[2]
Its subelds can be divided into a variety of theoretical and practical disciplines. Some elds, such as
computational complexity theory (which explores the
fundamental properties of computational and intractable
problems), are highly abstract, while elds such as
computer graphics emphasize real-world visual applications. Still other elds focus on the challenges in implementing computation. For example, programming
language theory considers various approaches to the description of computation, while the study of computer
programming itself investigates various aspects of the
use of programming language and complex systems.
Humancomputer interaction considers the challenges in
making computers and computations useful, usable, and
universally accessible to humans.

Charles Babbage is credited with inventing the first mechanical computer.

Computer science deals with the theoretical foundations of information and computation, together with practical techniques for the implementation and application of these foundations.

5.1 History

Main article: History of computer science

The earliest foundations of what would become computer science predate the invention of the modern digital computer. Machines for calculating fixed numerical tasks such as the abacus have existed since antiquity, aiding in computations such as multiplication and division. Further, algorithms for performing computations have existed since antiquity, even before sophisticated computing equipment were created. The ancient Sanskrit treatise Shulba Sutras, or "Rules of the Chord", is a book of algorithms written in 800 BCE for constructing geometric objects like altars using a peg and chord, an early precursor of the modern field of computational geometry.

Blaise Pascal designed and constructed the first working mechanical calculator, Pascal's calculator, in 1642.[3] In 1673 Gottfried Leibniz demonstrated a digital mechanical calculator, called the Stepped Reckoner.[4] He may be considered the first computer scientist and information theorist, for, among other reasons, documenting the binary number system. In 1820, Thomas de Colmar launched the mechanical calculator industry[5] when he released his simplified arithmometer, which was the first calculating machine strong enough and reliable enough to be used daily in an office environment. Charles Babbage started the design of the first automatic mechanical calculator, his Difference Engine, in 1822, which eventually gave him the idea of the first programmable mechanical calculator, his Analytical Engine.[6]

Ada Lovelace is credited with writing the first algorithm intended for processing on a computer.

He started developing this machine in 1834 and in less than two years he had sketched out many of the salient features of the modern computer. A crucial step was the adoption of a punched card system derived from the Jacquard loom,[7] making it infinitely programmable.[8] In 1843, during the translation of a French article on the Analytical Engine, Ada Lovelace wrote, in one of the many notes she included, an algorithm to compute the Bernoulli numbers, which is considered to be the first computer program.[9] Around 1885, Herman Hollerith invented the tabulator, which used punched cards to process statistical information; eventually his company became part of IBM. In 1937, one hundred years after Babbage's impossible dream, Howard Aiken convinced IBM, which was making all kinds of punched card equipment and was also in the calculator business,[10] to develop his giant programmable calculator, the ASCC/Harvard Mark I, based on Babbage's Analytical Engine, which itself used cards and a central computing unit. When the machine was finished, some hailed it as "Babbage's dream come true".[11]

During the 1940s, as new and more powerful computing machines were developed, the term computer came to refer to the machines rather than their human predecessors.[12] As it became clear that computers could be used for more than just mathematical calculations, the field of computer science broadened to study computation in general. Computer science began to be established as a distinct academic discipline in the 1950s and early 1960s.[13][14] The world's first computer science degree program, the Cambridge Diploma in Computer Science, began at the University of Cambridge Computer Laboratory in 1953. The first computer science degree program in the United States was formed at Purdue University in 1962.[15] Since practical computers became available, many applications of computing have become distinct areas of study in their own rights.

Although many initially believed it was impossible that computers themselves could actually be a scientific field of study, in the late fifties it gradually became accepted among the greater academic population.[16] It is the now well-known IBM brand that formed part of the computer science revolution during this time. IBM (short for International Business Machines) released the IBM 704[17] and later the IBM 709[18] computers, which were widely used during the exploration period of such devices. "Still, working with the IBM [computer] was frustrating ... if you had misplaced as much as one letter in one instruction, the program would crash, and you would have to start the whole process over again."[16] During the late 1950s, the computer science discipline was very much in its developmental stages, and such issues were commonplace.

Time has seen significant improvements in the usability and effectiveness of computing technology. Modern society has seen a significant shift in the users of computer technology, from usage only by experts and professionals, to a near-ubiquitous user base. Initially, computers were quite costly, and some degree of human aid was needed for efficient use, in part from professional computer operators. As computer adoption became more widespread and affordable, less human assistance was needed for common usage.

5.1.1 Contributions

Despite its short history as a formal academic discipline, computer science has made a number of fundamental contributions to science and society; in fact, along with electronics, it is a founding science of the current epoch of human history called the Information Age and a driver of the Information Revolution, seen as the third major leap in human technological progress after the Industrial Revolution (1750-1850 CE) and the Agricultural Revolution (8000-5000 BCE).

These contributions include:

The start of the "digital revolution", which includes the current Information Age and the Internet.[20]

A formal definition of computation and computability, and proof that there are computationally unsolvable and intractable problems.[21]

The concept of a programming language, a tool for the precise expression of methodological information at various levels of abstraction.[22]

In cryptography, breaking the Enigma code was an important factor contributing to the Allied victory in World War II.[19]

The German military used the Enigma machine (shown here) during World War II for communication they thought to be secret. The large-scale decryption of Enigma traffic at Bletchley Park was an important factor that contributed to Allied victory in WWII.[19]

Scientific computing enabled practical evaluation of processes and situations of great complexity, as well as experimentation entirely by software. It also enabled advanced study of the mind, and mapping of the human genome became possible with the Human Genome Project.[20] Distributed computing projects such as Folding@home explore protein folding.

Algorithmic trading has increased the efficiency and liquidity of financial markets by using artificial intelligence, machine learning, and other statistical and numerical techniques on a large scale.[23] High frequency algorithmic trading can also exacerbate volatility.[24]

Computer graphics and computer-generated imagery have become ubiquitous in modern entertainment, particularly in television, cinema, advertising, animation and video games. Even films that feature no explicit CGI are usually "filmed" now on digital cameras, or edited or postprocessed using a digital video editor.

Simulation of various processes, including computational fluid dynamics, physical, electrical, and electronic systems and circuits, as well as societies and social situations (notably war games) along with their habitats, among many others. Modern computers enable optimization of such designs as complete aircraft. Notable in electrical and electronic circuit design are SPICE, as well as software for physical realization of new (or modified) designs. The latter includes essential design software for integrated circuits.

Artificial intelligence is becoming increasingly important as it gets more efficient and complex. There are many applications of AI, some of which can be seen at home, such as robotic vacuum cleaners. It is also present in video games and on the modern battlefield in drones, anti-missile systems, and squad support robots.

5.2 Philosophy

Main article: Philosophy of computer science

A number of computer scientists have argued for the distinction of three separate paradigms in computer science. Peter Wegner argued that those paradigms are science, technology, and mathematics.[25] Peter Denning's working group argued that they are theory, abstraction (modeling), and design.[26] Amnon H. Eden described them as the "rationalist paradigm" (which treats computer science as a branch of mathematics, which is prevalent in theoretical computer science, and mainly employs deductive reasoning), the "technocratic paradigm" (which might be found in engineering approaches, most prominently in software engineering), and the "scientific paradigm" (which approaches computer-related artifacts from the empirical perspective of natural sciences, identifiable in some branches of artificial intelligence).[27]

5.2.1 Name of the field

The term "computer science" appears in a 1959 article in Communications of the ACM,[28] in which Louis Fein argues for the creation of a Graduate School in Computer Sciences analogous to the creation of Harvard Business School in 1921,[29] justifying the name by arguing that, like management science, the subject is applied and interdisciplinary in nature, while having the characteristics typical of an academic discipline.[30] His efforts, and those of others such as numerical analyst George Forsythe, were rewarded: universities went on to create such programs, starting with Purdue in 1962.[31] Despite its name, a significant amount of computer science does not involve the study of computers themselves. Because of this, several alternative names have been proposed.[32] Certain departments of major universities prefer the term computing science, to emphasize precisely that difference. Danish scientist Peter Naur suggested the term datalogy,[33] to reflect the fact that the scientific discipline revolves around data and data treatment, while not necessarily involving computers. The first scientific institution to use the term was the Department of Datalogy at the University of Copenhagen, founded in 1969, with Peter Naur being the first professor in datalogy. The term is used mainly in the Scandinavian countries. Also, in the early days of computing, a number of terms for the practitioners of the field of computing were suggested in the Communications of the ACM: turingineer, turologist, flow-charts-man, applied meta-mathematician, and applied epistemologist.[34] Three months later in the same journal, comptologist was suggested, followed next year by hypologist.[35] The term computics has also been suggested.[36] In Europe, terms derived from contracted translations of the expression "automatic information" (e.g. "informazione automatica" in Italian) or "information and mathematics" are often used, e.g. informatique (French), Informatik (German), informatica (Italy, The Netherlands), informática (Spain, Portugal), informatika (Slavic languages and Hungarian) or pliroforiki (which means informatics) in Greek. Similar words have also been adopted in the UK (as in the School of Informatics of the University of Edinburgh).[37]

CHAPTER 5. COMPUTER SCIENCE


that the principal focus of computer science is studying
the properties of computation in general, while the principal focus of software engineering is the design of specic
computations to achieve practical goals, making the two
separate but complementary disciplines.[39]
The academic, political, and funding aspects of computer
science tend to depend on whether a department formed
with a mathematical emphasis or with an engineering emphasis. Computer science departments with a mathematics emphasis and with a numerical orientation consider
alignment with computational science. Both types of departments tend to make eorts to bridge the eld educationally if not across all research.

5.3 Areas of computer science


As a discipline, computer science spans a range of topics from theoretical studies of algorithms and the limits of
computation to the practical issues of implementing computing systems in hardware and software.[40][41] CSAB,
formerly called Computing Sciences Accreditation Board
which is made up of representatives of the Association for
Computing Machinery (ACM), and the IEEE Computer
Society (IEEE-CS)[42] identies four areas that it considers crucial to the discipline of computer science: theory of computation, algorithms and data structures, programming methodology and languages, and computer elements and architecture. In addition to these four areas, CSAB also identies elds such as software engineering, articial intelligence, computer networking and
telecommunications, database systems, parallel computation, distributed computation, computer-human interaction, computer graphics, operating systems, and numerical and symbolic computation as being important areas
of computer science.[40]

A folkloric quotation, often attributed tobut almost


certainly not rst formulated byEdsger Dijkstra, states
that computer science is no more about computers than
astronomy is about telescopes.[note 1] The design and deployment of computers and computer systems is generally considered the province of disciplines other than
computer science. For example, the study of computer
hardware is usually considered part of computer engineering, while the study of commercial computer systems and their deployment is often called information
technology or information systems. However, there has
been much cross-fertilization of ideas between the various computer-related disciplines. Computer science re5.3.1 Theoretical computer science
search also often intersects other disciplines, such as
philosophy, cognitive science, linguistics, mathematics,
Main article: Theoretical computer science
physics, biology, statistics, and logic.
Computer science is considered by some to have a much
closer relationship with mathematics than many scientic
disciplines, with some observers saying that computing is
a mathematical science.[13] Early computer science was
strongly inuenced by the work of mathematicians such
as Kurt Gdel and Alan Turing, and there continues to be
a useful interchange of ideas between the two elds in areas such as mathematical logic, category theory, domain
theory, and algebra.
The relationship between computer science and software
engineering is a contentious issue, which is further muddied by disputes over what the term software engineering means, and how computer science is dened.[38]
David Parnas, taking a cue from the relationship between
other engineering and science disciplines, has claimed

The broader eld of theoretical computer science encompasses both the classical theory of computation and a
wide range of other topics that focus on the more abstract,
logical, and mathematical aspects of computing.
Theory of computation
Main article: Theory of computation
According to Peter J. Denning, the fundamental question
underlying computer science is, What can be (eciently)
automated?" [13] The study of the theory of computation is
focused on answering fundamental questions about what
can be computed and what amount of resources are re-

In an effort to answer the first question, computability theory examines which computational problems are solvable on various theoretical models of computation. The second question is addressed by computational complexity theory, which studies the time and space costs associated with different approaches to solving a multitude of computational problems.

The famous "P = NP?" problem, one of the Millennium Prize Problems,[43] is an open problem in the theory of computation.
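To make the distinction between solving and verifying concrete, here is a minimal Python sketch (not from any cited source; the function names are invented for illustration) using subset-sum, an NP-complete problem: finding a solution by brute force examines exponentially many candidates, while checking a proposed solution takes only linear time.

    from collections import Counter
    from itertools import combinations

    def verify_certificate(numbers, target, subset):
        """Checking a proposed answer is cheap: the subset must come from the input and sum to the target."""
        return not (Counter(subset) - Counter(numbers)) and sum(subset) == target

    def solve_by_search(numbers, target):
        """Finding an answer by exhaustive search examines up to 2**n subsets (exponential time)."""
        for r in range(len(numbers) + 1):
            for subset in combinations(numbers, r):
                if sum(subset) == target:
                    return list(subset)
        return None

    nums = [3, 34, 4, 12, 5, 2]
    print(solve_by_search(nums, 9))               # [4, 5]
    print(verify_certificate(nums, 9, [4, 5]))    # True

Whether every problem whose solutions are this easy to verify can also be solved this quickly is exactly the open P versus NP question mentioned above.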
Information and coding theory

Main articles: Information theory and Coding theory

Information theory is related to the quantification of information. This was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data.[44] Coding theory is the study of the properties of codes (systems for converting information from one form to another) and their fitness for a specific application. Codes are used for data compression, cryptography, error detection and correction, and more recently also for network coding. Codes are studied for the purpose of designing efficient and reliable data transmission methods.
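As a small illustration of "quantification of information", the sketch below computes the Shannon entropy of a stream of symbols in bits per symbol, the quantity Shannon's theory uses to bound lossless compression. It is a generic textbook formula, not code from any source cited here.

    import math
    from collections import Counter

    def shannon_entropy(message: str) -> float:
        """Average information content per symbol, in bits: H = -sum(p * log2(p))."""
        counts = Counter(message)
        total = len(message)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    print(shannon_entropy("abca" * 4))    # 1.5 bits per symbol (probabilities 1/2, 1/4, 1/4)
    print(shannon_entropy("abcdabcd"))    # 2.0 bits per symbol (uniform over four symbols)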

Algorithms and data structures

Algorithms and data structures is the study of commonly used computational methods and their computational efficiency.

Programming language theory

Main article: Programming language theory

Programming language theory is a branch of computer science that deals with the design, implementation, analysis, characterization, and classification of programming languages and their individual features. It falls within the discipline of computer science, both depending on and affecting mathematics, software engineering and linguistics. It is an active research area, with numerous dedicated academic journals.

Formal methods

Main article: Formal methods

Formal methods are a particular kind of mathematically based technique for the specification, development and verification of software and hardware systems. The use of formal methods for software and hardware design is motivated by the expectation that, as in other engineering disciplines, performing appropriate mathematical analysis can contribute to the reliability and robustness of a design. They form an important theoretical underpinning for software engineering, especially where safety or security is involved. Formal methods are a useful adjunct to software testing since they help avoid errors and can also give a framework for testing. For industrial use, tool support is required. However, the high cost of using formal methods means that they are usually only used in the development of high-integrity and life-critical systems, where safety or security is of utmost importance. Formal methods are best described as the application of a fairly broad variety of theoretical computer science fundamentals, in particular logic calculi, formal languages, automata theory, and program semantics, but also type systems and algebraic data types, to problems in software and hardware specification and verification.

5.3.2 Applied computer science

Applied computer science aims at identifying certain computer science concepts that can be used directly in solving real world problems.

Artificial intelligence

Main article: Artificial intelligence

This branch of computer science aims to or is required to synthesise goal-orientated processes such as problem-solving, decision-making, environmental adaptation, learning and communication found in humans and animals. From its origins in cybernetics and in the Dartmouth Conference (1956), artificial intelligence (AI) research has been necessarily cross-disciplinary, drawing on areas of expertise such as applied mathematics, symbolic logic, semiotics, electrical engineering, philosophy of mind, neurophysiology, and social intelligence. AI is associated in the popular mind with robotic development, but the main field of practical application has been as an embedded component in areas of software development which require computational understanding and modeling, such as finance and economics, data mining and the physical sciences. The starting-point in the late 1940s was Alan Turing's question "Can computers think?", and the question remains effectively unanswered, although the "Turing Test" is still used to assess computer output on the scale of human intelligence. But the automation of evaluative and predictive tasks has been increasingly successful as a substitute for human monitoring and intervention in domains of computer application involving complex real-world data.


Computer architecture and engineering

Main articles: Computer architecture and Computer engineering

Computer architecture, or digital computer organization, is the conceptual design and fundamental operational structure of a computer system. It focuses largely on the way by which the central processing unit performs internally and accesses addresses in memory.[45] The field often involves disciplines of computer engineering and electrical engineering, selecting and interconnecting hardware components to create computers that meet functional, performance, and cost goals.

Computer performance analysis

Main article: Computer performance

Computer performance analysis is the study of work flowing through computers with the general goals of improving throughput, controlling response time, using resources efficiently, eliminating bottlenecks, and predicting performance under anticipated peak loads.[46]

Computer graphics and visualization

Main article: Computer graphics (computer science)

Computer graphics is the study of digital visual contents, and involves synthesis and manipulation of image data. The study is connected to many other fields in computer science, including computer vision, image processing, and computational geometry, and is heavily applied in the fields of special effects and video games.

Computer security and cryptography

Main articles: Computer security and Cryptography

Computer security is a branch of computer technology whose objective includes protection of information from unauthorized access, disruption, or modification while maintaining the accessibility and usability of the system for its intended users. Cryptography is the practice and study of hiding (encryption) and therefore deciphering (decryption) information. Modern cryptography is largely related to computer science, for many encryption and decryption algorithms are based on their computational complexity.

Computational science

Computational science (or scientific computing) is the field of study concerned with constructing mathematical models and quantitative analysis techniques and using computers to analyze and solve scientific problems. In practical use, it is typically the application of computer simulation and other forms of computation to problems in various scientific disciplines.

Computer networks

Main article: Computer network

This branch of computer science aims to manage networks between computers worldwide.

Concurrent, parallel and distributed systems

Main articles: Concurrency (computer science) and Distributed computing

Concurrency is a property of systems in which several computations are executing simultaneously, and potentially interacting with each other. A number of mathematical models have been developed for general concurrent computation, including Petri nets, process calculi and the Parallel Random Access Machine model. A distributed system extends the idea of concurrency onto multiple computers connected through a network. Computers within the same distributed system have their own private memory, and information is often exchanged among themselves to achieve a common goal.
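To ground the idea of several computations executing at once, here is a minimal sketch using Python's standard concurrent.futures module to run independent tasks in a thread pool. It illustrates concurrency only (overlapping waits), not the Petri net or PRAM models named above; the task itself is invented for the example.

    import time
    from concurrent.futures import ThreadPoolExecutor

    def worker(task_id: int) -> str:
        """Simulate an independent task (for example, waiting on I/O) that can overlap with others."""
        time.sleep(0.5)
        return f"task {task_id} done"

    start = time.time()
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(worker, range(4)))       # the four waits overlap in time
    print(results)
    print(f"elapsed: {time.time() - start:.2f}s")        # roughly 0.5 s instead of 2 s run one after another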
Databases

Main articles: Database and Database management systems

A database is intended to organize, store, and retrieve large amounts of data easily. Digital databases are managed using database management systems to store, create, maintain, and search data, through database models and query languages.
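As a concrete sketch of storing and searching data "through database models and query languages", the snippet below uses Python's built-in sqlite3 module with an in-memory relational database and SQL; the table and column names are made up for illustration.

    import sqlite3

    conn = sqlite3.connect(":memory:")          # throwaway in-memory relational database
    conn.execute("CREATE TABLE books (title TEXT, author TEXT, year INTEGER)")
    conn.executemany(
        "INSERT INTO books VALUES (?, ?, ?)",
        [("Principia Mathematica", "Russell and Whitehead", 1913),
         ("The Laws of Thought", "George Boole", 1854)],
    )
    # Declarative query: the database management system decides how to search and retrieve the rows.
    for title, year in conn.execute("SELECT title, year FROM books WHERE year < 1900"):
        print(title, year)
    conn.close()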
Health informatics

Main article: Health informatics

Health informatics in computer science deals with computational techniques for solving problems in health care.

Information science

Main article: Information science

Software engineering

Main article: Software engineering

Software engineering is the study of designing, implementing, and modifying software in order to ensure it is of high quality, affordable, maintainable, and fast to build. It is a systematic approach to software design, involving the application of engineering practices to software. Software engineering deals with the organizing and analyzing of software; it doesn't just deal with the creation or manufacture of new software, but its internal maintenance and arrangement. Both computer applications software engineers and computer systems software engineers are projected to be among the fastest growing occupations from 2008 to 2018.

See also: computer programming

5.4 The great insights of computer science

The philosopher of computing Bill Rapaport noted three Great Insights of Computer Science:[47]

Leibniz's, Boole's, Alan Turing's, Shannon's, and Morse's insight: there are only 2 objects that a computer has to deal with in order to represent "anything". All the information about any computable problem can be represented using only 0 and 1 (or any other bistable pair that can flip-flop between two easily distinguishable states, such as "on"/"off", magnetized/de-magnetized, high-voltage/low-voltage, etc.).
See also: digital physics

Alan Turing's insight: there are only 5 actions that a computer has to perform in order to do "anything". Every algorithm can be expressed in a language for a computer consisting of only 5 basic instructions: move left one location; move right one location; read symbol at current location; print 0 at current location; print 1 at current location.
See also: Turing machine

Böhm and Jacopini's insight: there are only 3 ways of combining these actions (into more complex ones) that are needed in order for a computer to do "anything". Only 3 rules are needed to combine any set of basic instructions into more complex ones: sequence (first do this, then do that); selection (IF such-and-such is the case, THEN do this, ELSE do that); and repetition (WHILE such-and-such is the case, DO this). Note that the 3 rules of Böhm and Jacopini's insight can be further simplified with the use of goto (which means it is more elementary than structured programming).
See also: Elementary function arithmetic, Friedman's grand conjecture
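A minimal Python sketch of these insights, assuming nothing beyond the text above: a table-driven machine over a binary alphabet whose only primitive actions are read, print, and move one cell left or right, combined with a single WHILE loop (repetition), a rule-table lookup (selection), and straight-line steps (sequence). Writing the blank marker back is a small convenience beyond the literal "print 0 / print 1" list, and the example program is purely illustrative.

    def run_turing_machine(tape, rules, state="start", head=0, halt="halt", blank="B"):
        """rules maps (state, symbol) -> (symbol_to_write, move, next_state); move is -1 or +1."""
        cells = dict(enumerate(tape))
        while state != halt:                             # repetition (WHILE ... DO)
            symbol = cells.get(head, blank)              # action: read the symbol at the current location
            write, move, state = rules[(state, symbol)]  # selection (IF ... THEN ... ELSE, via the rule table)
            cells[head] = write                          # action: print a symbol at the current location
            head += move                                 # action: move left or right one location
        return [cells[i] for i in sorted(cells)]

    # Example program: scan right, flipping 0s and 1s, and stop at the first blank cell.
    flip_bits = {
        ("start", 0): (1, +1, "start"),
        ("start", 1): (0, +1, "start"),
        ("start", "B"): ("B", -1, "halt"),
    }
    print(run_turing_machine([1, 0, 1, 1], flip_bits))   # [0, 1, 0, 0, 'B']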
5.5 Academia

5.5.1 Conferences

Further information: List of computer science conferences

Conferences are strategic events of academic research in computer science. During those conferences, researchers from the public and private sectors present their recent work and meet. Proceedings of these conferences are an important part of the computer science literature.

5.5.2 Journals

Further information: Category:Computer science journals

5.6 Education
Some universities teach computer science as a theoretical study of computation and algorithmic reasoning.
These programs often feature the theory of computation,

analysis of algorithms, formal methods, concurrency theory, databases, computer graphics, and systems analysis, among others. They typically also teach computer programming, but treat it as a vessel for the support of other fields of computer science rather than a central focus of high-level study. The ACM/IEEE-CS Joint Curriculum Task Force "Computing Curriculum 2005" (and 2008 update)[48] gives a guideline for university curriculum.

Other colleges and universities, as well as secondary schools and vocational programs that teach computer science, emphasize the practice of advanced programming rather than the theory of algorithms and computation in their computer science curricula. Such curricula tend to focus on those skills that are important to workers entering the software industry. The process aspects of computer programming are often referred to as software engineering.
While computer science professions increasingly drive the U.S. economy, computer science education is absent in most American K-12 curricula. A report entitled "Running on Empty: The Failure to Teach K-12 Computer Science in the Digital Age" was released in October 2010 by the Association for Computing Machinery (ACM) and the Computer Science Teachers Association (CSTA), and revealed that only 14 states have adopted significant education standards for high school computer science. The report also found that only nine states count high school computer science courses as a core academic subject in their graduation requirements. In tandem with "Running on Empty", a new non-partisan advocacy coalition, Computing in the Core (CinC), was founded to influence federal and state policy, such as the Computer Science Education Act, which calls for grants to states to develop plans for improving computer science education and supporting computer science teachers.

Within the United States a gender gap in computer science education has been observed as well. Research conducted by the WGBH Educational Foundation and the Association for Computing Machinery (ACM) revealed that more than twice as many high school boys considered computer science to be a "very good" or "good" college major than high school girls.[49] In addition, the high school Advanced Placement (AP) exam for computer science has displayed a disparity in gender. Compared to other AP subjects it has the lowest number of female participants, with a composition of about 15 percent women.[50] This gender gap in computer science is further witnessed at the college level, where 31 percent of undergraduate computer science degrees are earned by women and only 8 percent of computer science faculty consists of women.[51] According to an article published by the Epistemic Games Group in August 2012, the number of women graduates in the computer science field has declined to 13 percent.[52]

A 2014 Mother Jones article, "We Can Code It", advocates for adding computer literacy and coding to the K-12 curriculum in the United States, and notes that computer science is not incorporated into the requirements for the Common Core State Standards Initiative.[53]

5.7 See also


Main article: Outline of computer science

Academic genealogy of computer scientists


Informatics (academic field)
List of academic computer science departments
List of computer science conferences
List of computer scientists
List of publications in computer science
List of pioneers in computer science
Technology transfer in computer science
List of software engineering topics
List of unsolved problems in computer science
Women in computing
Computer science Wikipedia book

5.8 Notes
[1] See the entry "Computer science" on Wikiquote for the
history of this quotation.

5.9 References
[1] What is Computer Science?". Boston University Department of Computer Science. Spring 2003. Retrieved December 12, 2014.
[2] WordNet Search - 3.1. Wordnetweb.princeton.edu. Retrieved 2012-05-14.
[3] Blaise Pascal. School of Mathematics and Statistics
University of St Andrews, Scotland.
[4] A Brief History of Computing.
[5] In 1851
[6] Science Museum - Introduction to Babbage. Archived
from the original on 2006-09-08. Retrieved 2006-09-24.
[7] Anthony Hyman (1982). Charles Babbage, pioneer of the
computer.


[8] The introduction of punched cards into the new engine


was important not only as a more convenient form of control than the drums, or because programs could now be of
unlimited extent, and could be stored and repeated without the danger of introducing errors in setting the machine
by hand; it was important also because it served to crystallize Babbages feeling that he had invented something
really new, something much more than a sophisticated calculating machine. Bruce Collier, 1970
[9] A Selection and Adaptation From Adas Notes found in
Ada, The Enchantress of Numbers, by Betty Alexandra
Toole Ed.D. Strawberry Press, Mill Valley, CA. Retrieved 2006-05-04.
[10] In this sense Aiken needed IBM, whose technology included the use of punched cards, the accumulation of numerical data, and the transfer of numerical data from one
register to another, Bernard Cohen, p.44 (2000)

[23] Black box traders are on the march. The Telegraph. August 26, 2006.
[24] The Impact of High Frequency Trading on an Electronic
Market. Papers.ssrn.com. doi:10.2139/ssrn.1686004.
Retrieved 2012-05-14.
[25] Wegner, P. (October 1315, 1976). Proceedings of the
2nd international Conference on Software Engineering.
San Francisco, California, United States: IEEE Computer
Society Press, Los Alamitos, CA.
[26] Denning, P. J.; Comer, D. E.; Gries, D.; Mulder, M.
C.; Tucker, A.; Turner, A. J.; Young, P. R. (Jan 1989).
Computing as a discipline. Communications of the ACM
32: 9-23. doi:10.1145/63238.63239.
[27] Eden, A. H. (2007). Three Paradigms of Computer
Science. Minds and Machines 17 (2): 135-167.
doi:10.1007/s11023-007-9060-8.

[11] Brian Randell, p. 187, 1975


[12] The Association for Computing Machinery (ACM) was
founded in 1947.
[13] Denning, P.J. (2000).
Computer Science: The
Discipline (PDF). Encyclopedia of Computer Science.
Archived from the original on 2006-05-25.
[14] Some EDSAC statistics. Cl.cam.ac.uk. Retrieved 2011-11-19.

[15] Computer science pioneer Samuel D. Conte dies at 85.


Purdue Computer Science. July 1, 2002. Retrieved December 12, 2014.
[16] Levy, Steven (1984). Hackers: Heroes of the Computer
Revolution. Doubleday. ISBN 0-385-19195-2.
[17] IBM 704 Electronic Data Processing System - CHM Revolution. Computerhistory.org. Retrieved 2013-07-07.

[28] Louis Fein (1959). The Role of the University in Computers, Data Processing, and Related Fields. Communications of the ACM 2 (9): 7-14. doi:10.1145/368424.368427.
[29] Stanford University Oral History. Stanford University.
Retrieved May 30, 2013.
[30] id., p. 11
[31] Donald Knuth (1972). George Forsythe and the Development of Computer Science. Comms. ACM.
[32] Matti Tedre (2006). The Development of Computer Science: A Sociocultural Perspective. p. 260. Retrieved
December 12, 2014.
[33] Peter Naur (1966).
The science of datalogy.
Communications of the ACM 9 (7): 485.
doi:10.1145/365719.366510.
[34] Communications of the ACM 1 (4): 6.

[18] IBM 709: a powerful new data processing system.


Computer History Museum. Retrieved December 12,
2014.

[35] Communications of the ACM 2(1):p.4


[36] IEEE Computer 28(12):p.136

[19] David Kahn, The Codebreakers, 1967, ISBN 0-684-83130-9.


[20] http://www.cis.cornell.edu/Dean/Presentations/Slides/bgu.pdf
[21] Constable, R. L. (March 2000). Computer Science:
Achievements and Challenges circa 2000 (PDF).
[22] Abelson, H.; G.J. Sussman with J. Sussman (1996). Structure and Interpretation of Computer Programs (2nd ed.).
MIT Press. ISBN 0-262-01153-0. The computer revolution is a revolution in the way we think and in the way
we express what we think. The essence of this change
is the emergence of what might best be called procedural
epistemology the study of the structure of knowledge
from an imperative point of view, as opposed to the more
declarative point of view taken by classical mathematical
subjects.

[37] P. Mounier-Kuhn, L'Informatique en France, de la seconde guerre mondiale au Plan Calcul. L'émergence d'une science, Paris, PUPS, 2010, ch. 3 & 4.
[38] Tedre, M. (2011). Computing as a Science: A Survey of Competing Viewpoints. Minds and Machines 21 (3): 361-387. doi:10.1007/s11023-011-9240-4.
[39] Parnas, D. L. (1998). Annals of Software Engineering 6: 19-37. doi:10.1023/A:1018949113292, p. 19: "Rather
than treat software engineering as a subeld of computer
science, I treat it as an element of the set, Civil Engineering, Mechanical Engineering, Chemical Engineering,
Electrical Engineering, [...]"
[40] Computing Sciences Accreditation Board (May 28,
1997). Computer Science as a Profession. Archived
from the original on 2008-06-17. Retrieved 2010-05-23.


[41] Committee on the Fundamentals of Computer Science:


Challenges and Opportunities, National Research Council (2004). Computer Science: Reections on the Field, Reections from the Field. National Academies Press. ISBN
978-0-309-09301-9.
[42] CSAB Leading Computer Education. CSAB. 2011-08-03. Retrieved 2011-11-19.
[43] Clay Mathematics Institute P=NP
[44] P. Collins, Graham (October 14, 2002). Claude E. Shannon: Founder of Information Theory. Scientic American. Retrieved December 12, 2014.
[45] A. Thisted, Ronald. Computer Architecture. The University of Chicago. Retrieved April 7, 1997.
[46] Wescott, Bob (2013). The Every Computer Performance
Book, Chapter 3: Useful laws. CreateSpace. ISBN
1482657759.
[47] What Is Computation?". buffalo.edu.
[48] ACM Curricula Recommendations. Retrieved 201211-18.
[49] New Image for Computing Report on Market Research.
WGBH Educational Foundation and the Association for
Computing Machinery (ACM). April 2009. Retrieved
December 12, 2014.
[50] Gilbert, Alorie. Newsmaker: Computer sciences gender
gap. CNET News.
[51] Dovzan, Nicole. Examining the Gender Gap in Technology. University of Michigan.
[52] Encouraging the next generation of women in computing. Microsoft Research Connections Team. Retrieved
September 3, 2013.
[53] Raja, Tasneem (August 2014). Is Coding the New Literacy?". Mother Jones. Retrieved 2014-06-21.

5.10 Further reading

Overview

Tucker, Allen B. (2004). Computer Science Handbook (2nd ed.). Chapman and Hall/CRC. ISBN 1-58488-360-X.
"Within more than 70 chapters, every one new or significantly revised, one can find any kind of information and references about computer science one can imagine. [...] all in all, there is absolute nothing about Computer Science that can not be found in the 2.5 kilogram-encyclopaedia with its 110 survey articles [...]." (Christoph Meinel, Zentralblatt MATH)

van Leeuwen, Jan (1994). Handbook of Theoretical Computer Science. The MIT Press. ISBN 0-262-72020-5.
"[...] this set is the most unique and possibly the most useful to the [theoretical computer science] community, in support both of teaching and research [...]. The books can be used by anyone wanting simply to gain an understanding of one of these areas, or by someone desiring to be in research in a topic, or by instructors wishing to find timely information on a subject they are teaching outside their major areas of expertise." (Rocky Ross, SIGACT News)

Ralston, Anthony; Reilly, Edwin D.; Hemmendinger, David (2000). Encyclopedia of Computer Science (4th ed.). Groves Dictionaries. ISBN 1-56159-248-X.
"Since 1976, this has been the definitive reference work on computer, computing, and computer science. [...] Alphabetically arranged and classified into broad subject areas, the entries cover hardware, computer systems, information and data, software, the mathematics of computing, theory of computation, methodologies, applications, and computing milieu. The editors have done a commendable job of blending historical perspective and practical reference information. The encyclopedia remains essential for most public and academic library reference collections." (Joe Accardin, Northeastern Illinois Univ., Chicago)

Edwin D. Reilly (2003). Milestones in Computer Science and Information Technology. Greenwood Publishing Group. ISBN 978-1-57356-521-9.

Computer Software Engineer. U.S. Bureau of Labor Statistics, n.d. Web. February 5, 2013.

Selected papers

Knuth, Donald E. (1996). Selected Papers on Computer Science. CSLI Publications, Cambridge University Press.

Collier, Bruce. The little engine that could've: The calculating machines of Charles Babbage. Garland Publishing Inc. ISBN 0-8240-0043-9.

Cohen, Bernard (2000). Howard Aiken, Portrait of a computer pioneer. The MIT Press. ISBN 978-0-262-53179-5.

Randell, Brian (1973). The Origins of Digital Computers, Selected Papers. Springer-Verlag. ISBN 3-540-06169-X.
"Covering a period from 1966 to 1993, its interest lies not only in the content of each of these papers, still timely today, but also in their being put together so that ideas expressed at different times complement each other nicely." (N. Bernard, Zentralblatt MATH)

Articles

Peter J. Denning. Is computer science science?, Communications of the ACM, April 2005.

Peter J. Denning, Great principles in computing curricula, Technical Symposium on Computer Science Education, 2004.

Research evaluation for computer science, Informatics Europe report. Shorter journal version: Bertrand Meyer, Christine Choppy, Jan van Leeuwen and Jorgen Staunstrup, Research evaluation for computer science, in Communications of the ACM, vol. 52, no. 4, pp. 31-34, April 2009.

Curriculum and classification

Association for Computing Machinery. 1998 ACM Computing Classification System. 1998.

Joint Task Force of Association for Computing Machinery (ACM), Association for Information Systems (AIS) and IEEE Computer Society (IEEE-CS). Computing Curricula 2005: The Overview Report. September 30, 2005.

Norman Gibbs, Allen Tucker. A model curriculum for a liberal arts degree in computer science. Communications of the ACM, Volume 29, Issue 3, March 1986.

5.11 External links

Computer science at DMOZ
Scholarly Societies in Computer Science
Best Papers Awards in Computer Science since 1996
Photographs of computer scientists by Bertrand Meyer
EECS.berkeley.edu

Bibliography and academic search engines

CiteSeerx (article): search engine, digital library and repository for scientific and academic papers with a focus on computer and information science.
DBLP Computer Science Bibliography (article): computer science bibliography website hosted at Universität Trier, in Germany.
The Collection of Computer Science Bibliographies (article)

Professional organizations

Association for Computing Machinery
IEEE Computer Society
Informatics Europe
AAAI
AAAS Computer Science

Misc

Computer Science - Stack Exchange: a community-run question and answer site for computer science
What is computer science
Is computer science science?

Chapter 6

History of artificial intelligence


See also: Timeline of artificial intelligence

The history of artificial intelligence (AI) began in antiquity, with myths, stories and rumors of artificial beings endowed with intelligence or consciousness by master craftsmen; as Pamela McCorduck writes, AI began with "an ancient wish to forge the gods."[1]

The seeds of modern AI were planted by classical philosophers who attempted to describe the process of human thinking as the mechanical manipulation of symbols. This work culminated in the invention of the programmable digital computer in the 1940s, a machine based on the abstract essence of mathematical reasoning. This device and the ideas behind it inspired a handful of scientists to begin seriously discussing the possibility of building an electronic brain.

The field of AI research was founded at a conference on the campus of Dartmouth College in the summer of 1956. Those who attended would become the leaders of AI research for decades. Many of them predicted that a machine as intelligent as a human being would exist in no more than a generation and they were given millions of dollars to make this vision come true. Eventually it became obvious that they had grossly underestimated the difficulty of the project. In 1973, in response to the criticism of James Lighthill and ongoing pressure from congress, the U.S. and British Governments stopped funding undirected research into artificial intelligence. Seven years later, a visionary initiative by the Japanese Government inspired governments and industry to provide AI with billions of dollars, but by the late 80s the investors became disillusioned and withdrew funding again. This cycle of boom and bust, of "AI winters" and summers, continues to haunt the field. Undaunted, there are those who make extraordinary predictions even now.[2]

Progress in AI has continued, despite the rise and fall of its reputation in the eyes of government bureaucrats and venture capitalists. Problems that had begun to seem impossible in 1970 have been solved and the solutions are now used in successful commercial products. However, no machine has been built with a human level of intelligence, contrary to the optimistic predictions of the first generation of AI researchers. "We can only see a short distance ahead," admitted Alan Turing, in a famous 1950 paper that catalyzed the modern search for machines that think. "But," he added, "we can see much that must be done."[3]

6.1 Precursors

McCorduck (2004) writes "artificial intelligence in one form or another is an idea that has pervaded Western intellectual history, a dream in urgent need of being realized," expressed in humanity's myths, legends, stories, speculation and clockwork automatons.[4]

6.1.1 AI in myth, fiction and speculation

Main article: Artificial intelligence in fiction

Mechanical men and artificial beings appear in Greek myths, such as the golden robots of Hephaestus and Pygmalion's Galatea.[5] In the Middle Ages, there were rumors of secret mystical or alchemical means of placing mind into matter, such as Jabir ibn Hayyan's Takwin, Paracelsus' homunculus and Rabbi Judah Loew's Golem.[6] By the 19th century, ideas about artificial men and thinking machines were developed in fiction, as in Mary Shelley's Frankenstein or Karel Capek's R.U.R. (Rossum's Universal Robots),[7] and speculation, such as Samuel Butler's "Darwin among the Machines."[8] AI has continued to be an important element of science fiction into the present.

6.1.2 Automatons

Main article: Automaton

Realistic humanoid automatons were built by craftsmen from every civilization, including Yan Shi,[9] Hero of Alexandria,[10] Al-Jazari[11] and Wolfgang von Kempelen.[12] The oldest known automatons were the sacred statues of ancient Egypt and Greece. The faithful believed that craftsmen had imbued these figures with very real minds, capable of wisdom and emotion.

Hermes Trismegistus wrote that "by discovering the true nature of the gods, man has been able to reproduce it".[13][14]

Al-Jazari's programmable automata (1206 CE)

6.1.3 Formal reasoning

Artificial intelligence is based on the assumption that the process of human thought can be mechanized. The study of mechanical (or "formal") reasoning has a long history. Chinese, Indian and Greek philosophers all developed structured methods of formal deduction in the first millennium BCE. Their ideas were developed over the centuries by philosophers such as Aristotle (who gave a formal analysis of the syllogism), Euclid (whose Elements was a model of formal reasoning), al-Khwarizmi (who developed algebra and gave his name to "algorithm") and European scholastic philosophers such as William of Ockham and Duns Scotus.[15]

Majorcan philosopher Ramon Llull (1232-1315) developed several logical machines devoted to the production of knowledge by logical means;[16] Llull described his machines as mechanical entities that could combine basic and undeniable truths by simple logical operations, produced by the machine by mechanical meanings, in such ways as to produce all the possible knowledge.[17] Llull's work had a great influence on Gottfried Leibniz, who redeveloped his ideas.[18]

Gottfried Leibniz, who speculated that human reason could be reduced to mechanical calculation.

In the 17th century, Leibniz, Thomas Hobbes and Rene Descartes explored the possibility that all rational thought could be made as systematic as algebra or geometry.[19] Hobbes famously wrote in Leviathan: "reason is nothing but reckoning".[20] Leibniz envisioned a universal language of reasoning (his characteristica universalis) which would reduce argumentation to calculation, so that "there would be no more need of disputation between two philosophers than between two accountants. For it would suffice to take their pencils in hand, down to their slates, and to say each other (with a friend as witness, if they liked): Let us calculate."[21] These philosophers had begun to articulate the physical symbol system hypothesis that would become the guiding faith of AI research.

The ENIAC, at the Moore School of Electrical Engineering. This photo has been artificially darkened, obscuring details such as the women who were present and the IBM equipment in use.[23]

In the 20th century, the study of mathematical logic provided the essential breakthrough that made artificial intelligence seem plausible. The foundations had been set by such works as Boole's The Laws of Thought and Frege's Begriffsschrift. Building on Frege's system, Russell and Whitehead presented a formal treatment of the foundations of mathematics in their masterpiece, the Principia Mathematica in 1913. Inspired by Russell's success, David Hilbert challenged mathematicians of the 1920s and 30s to answer this fundamental question: "can all of mathematical reasoning be formalized?"[15] His question was answered by Godel's incompleteness proof, Turing's machine and Church's Lambda calculus.[15][22] Their answer was surprising in two ways.


First, they proved that there were, in fact, limits to what mathematical logic could accomplish. But second (and more important for AI) their work suggested that, within these limits, any form of mathematical reasoning could be mechanized. The Church-Turing thesis implied that a mechanical device, shuffling symbols as simple as 0 and 1, could imitate any conceivable process of mathematical deduction. The key insight was the Turing machine, a simple theoretical construct that captured the essence of abstract symbol manipulation. This invention would inspire a handful of scientists to begin discussing the possibility of thinking machines.[15][24]

6.1.4 Computer science

Main articles: History of computer hardware and History of computer science

Calculating machines were built in antiquity and improved throughout history by many mathematicians, including (once again) philosopher Gottfried Leibniz. In the early 19th century, Charles Babbage designed a programmable computer (the Analytical Engine), although it was never built. Ada Lovelace speculated that the machine "might compose elaborate and scientific pieces of music of any degree of complexity or extent".[25] (She is often credited as the first programmer because of a set of notes she wrote that completely detail a method for calculating Bernoulli numbers with the Engine.)

The first modern computers were the massive code breaking machines of the Second World War (such as Z3, ENIAC and Colossus).[26] The latter two of these machines were based on the theoretical foundation laid by Alan Turing and developed by John von Neumann.[27]

6.2 The birth of artificial intelligence 1943-1956

A note on the sections in this article.[28]

The IBM 702: a computer used by the first generation of AI researchers.

In the 1940s and 50s, a handful of scientists from a variety of fields (mathematics, psychology, engineering, economics and political science) began to discuss the possibility of creating an artificial brain. The field of artificial intelligence research was founded as an academic discipline in 1956.

6.2.1 Cybernetics and early neural networks

The earliest research into thinking machines was inspired by a confluence of ideas that became prevalent in the late 30s, 40s and early 50s. Recent research in neurology had shown that the brain was an electrical network of neurons that fired in all-or-nothing pulses. Norbert Wiener's cybernetics described control and stability in electrical networks. Claude Shannon's information theory described digital signals (i.e., all-or-nothing signals). Alan Turing's theory of computation showed that any form of computation could be described digitally. The close relationship between these ideas suggested that it might be possible to construct an electronic brain.[29]

Examples of work in this vein includes robots such as W. Grey Walter's turtles and the Johns Hopkins Beast. These machines did not use computers, digital electronics or symbolic reasoning; they were controlled entirely by analog circuitry.[30]

Walter Pitts and Warren McCulloch analyzed networks of idealized artificial neurons and showed how they might perform simple logical functions. They were the first to describe what later researchers would call a neural network.[31] One of the students inspired by Pitts and McCulloch was a young Marvin Minsky, then a 24-year old graduate student. In 1951 (with Dean Edmonds) he built the first neural net machine, the SNARC.[32] Minsky was to become one of the most important leaders and innovators in AI for the next 50 years.

6.2.2 Turing's test

In 1950 Alan Turing published a landmark paper in which he speculated about the possibility of creating machines that think.[33] He noted that "thinking" is difficult to define and devised his famous Turing Test. If a machine could carry on a conversation (over a teleprinter) that was indistinguishable from a conversation with a human being, then it was reasonable to say that the machine was "thinking". This simplified version of the problem allowed Turing to argue convincingly that a "thinking machine" was at least plausible and the paper answered all the most common objections to the proposition.[34] The Turing Test was the first serious proposal in the philosophy of artificial intelligence.

6.2.3 Game AI

In 1951, using the Ferranti Mark 1 machine of the University of Manchester, Christopher Strachey wrote a checkers program and Dietrich Prinz wrote one for chess.[35]


chess.[35] Arthur Samuel's checkers program, developed


in the middle 50s and early 60s, eventually achieved sufcient skill to challenge a respectable amateur.[36] Game
AI would continue to be used as a measure of progress in
AI throughout its history.

6.2.4

gebra word problems, proving theorems in geometry and


learning to speak English. Few at the time would have
believed that such intelligent behavior by machines was
possible at all.[47] Researchers expressed an intense optimism in private and in print, predicting that a fully intelligent machine would be built in less than 20 years.[48]
Government agencies like ARPA poured money into the
Symbolic reasoning and the Logic new eld.[49]

Theorist
When access to digital computers became possible in the
middle fties, a few scientists instinctively recognized
that a machine that could manipulate numbers could also
manipulate symbols and that the manipulation of symbols
could well be the essence of human thought. This was a
new approach to creating thinking machines.[37]
In 1955, Allen Newell and (future Nobel Laureate)
Herbert A. Simon created the "Logic Theorist" (with help
from J. C. Shaw). The program would eventually prove
38 of the rst 52 theorems in Russell and Whiteheads
Principia Mathematica, and nd new and more elegant
proofs for some.[38] Simon said that they had solved the
venerable mind/body problem, explaining how a system
composed of matter can have the properties of mind.[39]
(This was an early statement of the philosophical position
John Searle would later call "Strong AI": that machines
can contain minds just as human bodies do.)[40]

6.3.1 The work


There were many successful programs and new directions
in the late 50s and 1960s. Among the most inuential
were these:

Reasoning as search
Many early AI programs used the same basic algorithm.
To achieve some goal (like winning a game or proving
a theorem), they proceeded step by step towards it (by
making a move or a deduction) as if searching through
a maze, backtracking whenever they reached a dead end.
This paradigm was called "reasoning as search".[50]

The principal diculty was that, for many problems, the


number of possible paths through the maze was simply
astronomical (a situation known as a "combinatorial ex6.2.5 Dartmouth Conference 1956: the plosion"). Researchers would reduce the search space by
birth of AI
using heuristics or rules of thumb that would eliminate
those paths that were unlikely to lead to a solution.[51]
[41]
The Dartmouth Conference of 1956
was organized
Newell and Simon tried to capture a general version of
by Marvin Minsky, John McCarthy and two senior scithis algorithm in a program called the "General Probentists: Claude Shannon and Nathan Rochester of IBM.
lem Solver".[52] Other searching programs were able
The proposal for the conference included this assertion:
to accomplish impressive tasks like solving problems
every aspect of learning or any other feature of intelliin geometry and algebra, such as Herbert Gelernter's
gence can be so precisely described that a machine can be
Geometry Theorem Prover (1958) and SAINT, written
made to simulate it.[42] The participants included Ray
by Minskys student James Slagle (1961).[53] Other proSolomono, Oliver Selfridge, Trenchard More, Arthur
grams searched through goals and subgoals to plan acSamuel, Allen Newell and Herbert A. Simon, all of whom
tions, like the STRIPS system developed at Stanford to
would create important programs during the rst decades
control the behavior of their robot Shakey.[54]
of AI research.[43] At the conference Newell and Simon
debuted the "Logic Theorist" and McCarthy persuaded
the attendees to accept Articial Intelligence as the
name of the eld.[44] The 1956 Dartmouth conference
was the moment that AI gained its name, its mission, its
rst success and its major players, and is widely considered the birth of AI.[45]

6.3 The golden years 19561974


The years after the Dartmouth conference were an era of
discovery, of sprinting across new ground. The programs
that were developed during this time were, to most peo- An example of a semantic network
ple, simply astonishing":[46] computers were solving al-
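A minimal Python sketch of the "reasoning as search" pattern described above: depth-first search through a made-up state graph, stepping toward a goal one move at a time and backtracking from dead ends. The graph, labels and goal test are invented for illustration; the real systems of the era added heuristics to prune the combinatorial explosion.

    def search(state, goal, moves, path=None, visited=None):
        """Step toward the goal one move at a time, backtracking on dead ends (depth-first search)."""
        path = (path or []) + [state]
        visited = visited or set()
        visited.add(state)
        if state == goal:
            return path                                  # goal reached
        for nxt in moves.get(state, []):                 # try each legal move in turn
            if nxt not in visited:
                found = search(nxt, goal, moves, path, visited)
                if found:
                    return found                         # this branch succeeded
        return None                                      # dead end: backtrack

    # A toy state graph (hypothetical): states are labels, edges are legal moves.
    maze = {"A": ["B", "C"], "B": ["D"], "C": ["E"], "D": [], "E": ["GOAL"]}
    print(search("A", "GOAL", maze))                     # ['A', 'C', 'E', 'GOAL']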

Natural language

An important goal of AI research is to allow computers to communicate in natural languages like English. An early success was Daniel Bobrow's program STUDENT, which could solve high school algebra word problems.[55]


An example of a semantic network

A semantic net represents concepts (e.g. "house", "door") as nodes and relations among concepts (e.g. has-a) as links between the nodes. The first AI program to use a semantic net was written by Ross Quillian[56] and the most successful (and controversial) version was Roger Schank's Conceptual dependency theory.[57]
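In modern terms, a semantic net of this kind reduces to a set of labeled edges between concept nodes. The fragment below is a rough sketch of that representation using invented facts; it is not Quillian's program or Schank's conceptual dependency notation.

```python
# A toy semantic network: concepts are nodes, labeled relations are edges.
# The facts are illustrative only.
edges = [
    ("house", "has-a", "door"),
    ("door", "has-a", "handle"),
    ("house", "is-a", "building"),
]

def related(concept, relation):
    """Return every node reachable from `concept` over edges labeled `relation`."""
    return [dst for src, rel, dst in edges if src == concept and rel == relation]

print(related("house", "has-a"))   # ['door']
```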
Joseph Weizenbaum's ELIZA could carry out conversations that were so realistic that users occasionally were fooled into thinking they were communicating with a human being and not a program. But in fact, ELIZA had no idea what she was talking about. She simply gave a canned response or repeated back what was said to her, rephrasing her response with a few grammar rules. ELIZA was the first chatterbot.[58]
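The flavor of ELIZA's technique, a keyword pattern, a couple of pronoun-swapping "grammar rules" and a canned fallback, can be conveyed in a short sketch. The patterns below are made up for illustration and are far simpler than Weizenbaum's actual script language.

```python
import re

# Crude pronoun-swapping rules and a canned fallback, in the spirit of ELIZA.
SWAPS = {"i": "you", "my": "your", "am": "are", "you": "I"}

def reflect(text):
    return " ".join(SWAPS.get(w, w) for w in text.lower().split())

def respond(sentence):
    m = re.match(r"i feel (.*)", sentence, re.IGNORECASE)
    if m:
        return f"Why do you feel {reflect(m.group(1))}?"
    m = re.match(r"i am (.*)", sentence, re.IGNORECASE)
    if m:
        return f"How long have you been {reflect(m.group(1))}?"
    return "Please tell me more."        # canned response

print(respond("I am worried about my exam"))
# How long have you been worried about your exam?
```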
Micro-worlds

In the late 60s, Marvin Minsky and Seymour Papert of the MIT AI Laboratory proposed that AI research should focus on artificially simple situations known as micro-worlds. They pointed out that in successful sciences like physics, basic principles were often best understood using simplified models like frictionless planes or perfectly rigid bodies. Much of the research focused on a "blocks world," which consists of colored blocks of various shapes and sizes arrayed on a flat surface.[59]

This paradigm led to innovative work in machine vision by Gerald Sussman (who led the team), Adolfo Guzman, David Waltz (who invented "constraint propagation"), and especially Patrick Winston. At the same time, Minsky and Papert built a robot arm that could stack blocks, bringing the blocks world to life. The crowning achievement of the micro-world program was Terry Winograd's SHRDLU. It could communicate in ordinary English sentences, plan operations and execute them.[60]

6.3.2 The optimism

The first generation of AI researchers made these predictions about their work:

1958, H. A. Simon and Allen Newell: "within ten years a digital computer will be the world's chess champion" and "within ten years a digital computer will discover and prove an important new mathematical theorem."[61]

1965, H. A. Simon: "machines will be capable, within twenty years, of doing any work a man can do."[62]

1967, Marvin Minsky: "Within a generation ... the problem of creating 'artificial intelligence' will substantially be solved."[63]

1970, Marvin Minsky (in Life Magazine): "In from three to eight years we will have a machine with the general intelligence of an average human being."[64]

6.3.3 The money

In June 1963, MIT received a $2.2 million grant from the newly created Advanced Research Projects Agency (later known as DARPA). The money was used to fund project MAC which subsumed the "AI Group" founded by Minsky and McCarthy five years earlier. ARPA continued to provide three million dollars a year until the 70s.[65] ARPA made similar grants to Newell and Simon's program at CMU and to the Stanford AI Project (founded by John McCarthy in 1963).[66] Another important AI laboratory was established at Edinburgh University by Donald Michie in 1965.[67] These four institutions would continue to be the main centers of AI research (and funding) in academia for many years.[68]

The money was proffered with few strings attached: J. C. R. Licklider, then the director of ARPA, believed that his organization should "fund people, not projects!" and allowed researchers to pursue whatever directions might interest them.[69] This created a freewheeling atmosphere at MIT that gave birth to the hacker culture,[70] but this "hands off" approach would not last.

6.4 The first AI winter 1974–1980

In the 70s, AI was subject to critiques and financial setbacks. AI researchers had failed to appreciate the difficulty of the problems they faced. Their tremendous optimism had raised expectations impossibly high, and when the promised results failed to materialize, funding for AI disappeared.[71] At the same time, the field of connectionism (or neural nets) was shut down almost completely for 10 years by Marvin Minsky's devastating criticism of perceptrons.[72] Despite the difficulties with public perception of AI in the late 70s, new ideas were explored in logic programming, commonsense reasoning and many other areas.[73]

6.4.1 The problems

In the early seventies, the capabilities of AI programs were limited. Even the most impressive could only handle trivial versions of the problems they were supposed to solve; all the programs were, in some sense, "toys".[74] AI researchers had begun to run into several fundamental limits that could not be overcome in the 1970s. Although some of these limits would be conquered in later decades, others still stymie the field to this day.[75]
Limited computer power: There was not enough memory or processing speed to accomplish anything truly useful. For example, Ross Quillian's successful work on natural language was demonstrated with a vocabulary of only twenty words, because that was all that would fit in memory.[76] Hans Moravec argued in 1976 that computers were still millions of times too weak to exhibit intelligence. He suggested an analogy: artificial intelligence requires computer power in the same way that aircraft require horsepower. Below a certain threshold, it's impossible, but, as power increases, eventually it could become easy.[77] With regard to computer vision, Moravec estimated that simply matching the edge and motion detection capabilities of the human retina in real time would require a general-purpose computer capable of 10^9 operations/second (1000 MIPS).[78] As of 2011, practical computer vision applications require 10,000 to 1,000,000 MIPS. By comparison, the fastest supercomputer in 1976, Cray-1 (retailing at $5 million to $8 million), was only capable of around 80 to 130 MIPS, and a typical desktop computer at the time achieved less than 1 MIPS.

Intractability and the combinatorial explosion. In 1972 Richard Karp (building on Stephen Cook's 1971 theorem) showed there are many problems that can probably only be solved in exponential time (in the size of the inputs). Finding optimal solutions to these problems requires unimaginable amounts of computer time except when the problems are trivial. This almost certainly meant that many of the "toy" solutions used by AI would probably never scale up into useful systems.[79]

Commonsense knowledge and reasoning. Many important artificial intelligence applications like vision or natural language require simply enormous amounts of information about the world: the program needs to have some idea of what it might be looking at or what it is talking about. This requires that the program know most of the same things about the world that a child does. Researchers soon discovered that this was a truly vast amount of information. No one in 1970 could build a database so large and no one knew how a program might learn so much information.[80]

Moravec's paradox: Proving theorems and solving geometry problems is comparatively easy for computers, but a supposedly simple task like recognizing a face or crossing a room without bumping into anything is extremely difficult. This helps explain why research into vision and robotics had made so little progress by the middle 1970s.[81]

The frame and qualification problems. AI researchers (like John McCarthy) who used logic discovered that they could not represent ordinary deductions that involved planning or default reasoning without making changes to the structure of logic itself. They developed new logics (like non-monotonic logics and modal logics) to try to solve the problems.[82]

6.4.2 The end of funding

See also: AI Winter

The agencies which funded AI research (such as the British government, DARPA and NRC) became frustrated with the lack of progress and eventually cut off almost all funding for undirected research into AI. The pattern began as early as 1966 when the ALPAC report appeared criticizing machine translation efforts. After spending 20 million dollars, the NRC ended all support.[83] In 1973, the Lighthill report on the state of AI research in England criticized the "utter failure" of AI to achieve its grandiose objectives and led to the dismantling of AI research in that country.[84] (The report specifically mentioned the combinatorial explosion problem as a reason for AI's failings.)[85] DARPA was deeply disappointed with researchers working on the Speech Understanding Research program at CMU and canceled an annual grant of three million dollars.[86] By 1974, funding for AI projects was hard to find.

Hans Moravec blamed the crisis on the unrealistic predictions of his colleagues. "Many researchers were caught up in a web of increasing exaggeration."[87] However, there was another issue: since the passage of the Mansfield Amendment in 1969, DARPA had been under increasing pressure to fund "mission-oriented direct research, rather than basic undirected research". Funding for the creative, freewheeling exploration that had gone on in the 60s would not come from DARPA. Instead, the money was directed at specific projects with clear objectives, such as autonomous tanks and battle management systems.[88]

6.4.3 Critiques from across campus

See also: Philosophy of artificial intelligence

Several philosophers had strong objections to the claims being made by AI researchers. One of the earliest was John Lucas, who argued that Gödel's incompleteness theorem showed that a formal system (such as a computer program) could never see the truth of certain statements, while a human being could.[89] Hubert Dreyfus ridiculed the broken promises of the 60s and critiqued the assumptions of AI, arguing that human reasoning actually involved very little "symbol processing" and a great deal of embodied, instinctive, unconscious "know how".[90][91] John Searle's Chinese Room argument, presented in 1980, attempted to show that a program could not be said to "understand" the symbols that it uses (a quality called "intentionality"). If the symbols have no meaning for the machine, Searle argued, then the machine can not be described as "thinking".[92]

These critiques were not taken seriously by AI researchers, often because they seemed so far off the point. Problems like intractability and commonsense knowledge seemed much more immediate and serious. It was unclear what difference "know how" or "intentionality" made to an actual computer program. Minsky said of Dreyfus and Searle "they misunderstand, and should be ignored."[93] Dreyfus, who taught at MIT, was given a cold shoulder: he later said that AI researchers "dared not be seen having lunch with me."[94] Joseph Weizenbaum, the author of ELIZA, felt his colleagues' treatment of Dreyfus was unprofessional and childish. Although he was an outspoken critic of Dreyfus' positions, he "deliberately made it plain that theirs was not the way to treat a human being."[95]

Weizenbaum began to have serious ethical doubts about AI when Kenneth Colby wrote DOCTOR, a chatterbot therapist. Weizenbaum was disturbed that Colby saw his mindless program as a serious therapeutic tool. A feud began, and the situation was not helped when Colby did not credit Weizenbaum for his contribution to the program. In 1976, Weizenbaum published Computer Power and Human Reason which argued that the misuse of artificial intelligence has the potential to devalue human life.[96]

6.4.4 Perceptrons and the dark age of connectionism

A perceptron was a form of neural network introduced in 1958 by Frank Rosenblatt, who had been a schoolmate of Marvin Minsky at the Bronx High School of Science. Like most AI researchers, he was optimistic about their power, predicting that the perceptron "may eventually be able to learn, make decisions, and translate languages." An active research program into the paradigm was carried out throughout the 60s but came to a sudden halt with the publication of Minsky and Papert's 1969 book Perceptrons. It suggested that there were severe limitations to what perceptrons could do and that Frank Rosenblatt's predictions had been grossly exaggerated. The effect of the book was devastating: virtually no research at all was done in connectionism for 10 years. Eventually, a new generation of researchers would revive the field and thereafter it would become a vital and useful part of artificial intelligence. Rosenblatt would not live to see this, as he died in a boating accident shortly after the book was published.[72]
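Rosenblatt's perceptron itself is simple enough to sketch: a weighted sum followed by a threshold, with the weights nudged after every mistake. The toy training set below (logical OR) is an invented example of a linearly separable problem that a single perceptron can learn; functions such as XOR, as Minsky and Papert emphasized, cannot be learned by one layer.

```python
# A single perceptron trained with Rosenblatt's error-correction rule.
# The data (logical OR) is a toy, linearly separable example.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w = [0.0, 0.0]
bias = 0.0
rate = 0.1

def predict(x):
    s = w[0] * x[0] + w[1] * x[1] + bias
    return 1 if s > 0 else 0

for _ in range(10):                      # a few passes over the data
    for x, target in data:
        error = target - predict(x)      # -1, 0 or +1
        w[0] += rate * error * x[0]      # nudge weights toward the right answer
        w[1] += rate * error * x[1]
        bias += rate * error

print([predict(x) for x, _ in data])     # [0, 1, 1, 1]
```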

6.4.5 The neats: logic, Prolog and expert systems

Logic was introduced into AI research as early as 1958, by John McCarthy in his Advice Taker proposal.[97] In 1963, J. Alan Robinson had discovered a simple method to implement deduction on computers, the resolution and unification algorithm. However, straightforward implementations, like those attempted by McCarthy and his students in the late 60s, were especially intractable: the programs required astronomical numbers of steps to prove simple theorems.[98] A more fruitful approach to logic was developed in the 70s by Robert Kowalski at the University of Edinburgh, and soon this led to the collaboration with French researchers Alain Colmerauer and Philippe Roussel who created the successful logic programming language Prolog.[99] Prolog uses a subset of logic (Horn clauses, closely related to "rules" and "production rules") that permits tractable computation. Rules would continue to be influential, providing a foundation for Edward Feigenbaum's expert systems and the continuing work by Allen Newell and Herbert A. Simon that would lead to Soar and their unified theories of cognition.[100]
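Why Horn clauses permit tractable computation can be suggested with a small sketch. Each rule below is read as "if every condition in the body holds, conclude the head"; repeatedly applying the rules until nothing new can be derived always terminates on a finite rule set. The facts and rules are invented, and this propositional forward-chaining loop is only a stand-in: Prolog itself works backward from a query and unifies variables.

```python
# Forward chaining over ground Horn clauses: each rule is (body, head),
# read as "if every fact in body is known, conclude head". Invented facts.
rules = [
    ({"parent(tom, bob)"}, "ancestor(tom, bob)"),
    ({"parent(bob, ann)"}, "ancestor(bob, ann)"),
    ({"ancestor(tom, bob)", "ancestor(bob, ann)"}, "ancestor(tom, ann)"),
]
facts = {"parent(tom, bob)", "parent(bob, ann)"}

changed = True
while changed:                      # keep applying rules until nothing new appears
    changed = False
    for body, head in rules:
        if body <= facts and head not in facts:
            facts.add(head)
            changed = True

print("ancestor(tom, ann)" in facts)   # True
```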

Critics of the logical approach noted, as Dreyfus had, that human beings rarely used logic when they solved problems. Experiments by psychologists like Peter Wason, Eleanor Rosch, Amos Tversky, Daniel Kahneman and others provided proof.[101] McCarthy responded that what people do is irrelevant. He argued that what is really needed are machines that can solve problems, not machines that think as people do.[102]

6.4.6 The scruffies: frames and scripts

Among the critics of McCarthy's approach were his colleagues across the country at MIT. Marvin Minsky, Seymour Papert and Roger Schank were trying to solve problems like "story understanding" and "object recognition" that required a machine to think like a person. In order to use ordinary concepts like "chair" or "restaurant" they had to make all the same illogical assumptions that people normally made. Unfortunately, imprecise concepts like these are hard to represent in logic. Gerald Sussman observed that "using precise language to describe essentially imprecise concepts doesn't make them any more precise."[103] Schank described their "anti-logic" approaches as "scruffy", as opposed to the "neat" paradigms used by McCarthy, Kowalski, Feigenbaum, Newell and Simon.[104]

In 1975, in a seminal paper, Minsky noted that many of his fellow "scruffy" researchers were using the same kind of tool: a framework that captures all our common sense assumptions about something. For example, if we use the concept of a bird, there is a constellation of facts that immediately come to mind: we might assume that it flies, eats worms and so on. We know these facts are not always true and that deductions using these facts will not be "logical", but these structured sets of assumptions are part of the context of everything we say and think. He called these structures "frames". Schank used a version of frames he called "scripts" to successfully answer questions about short stories in English.[105] Many years later object-oriented programming would adopt the essential idea of "inheritance" from AI research on frames.
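A frame, in this sense, is a bundle of default assumptions attached to a concept, and a more specific frame can inherit and override the defaults of a more general one, which is essentially the inheritance idea that object-oriented programming later adopted. The sketch below uses plain dictionaries with invented slots; it illustrates the idea rather than Minsky's notation.

```python
# Frames as dictionaries of default assumptions, with simple inheritance.
# The slots and values are invented for illustration.
frames = {
    "bird":    {"is_a": None,   "flies": True, "eats": "worms"},
    "penguin": {"is_a": "bird", "flies": False},   # overrides a default
}

def slot(frame, name):
    """Look up a slot, falling back to the parent frame's defaults."""
    while frame is not None:
        if name in frames[frame]:
            return frames[frame][name]
        frame = frames[frame].get("is_a")
    return None

print(slot("penguin", "flies"), slot("penguin", "eats"))   # False worms
```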

6.5 Boom 1980–1987


In the 1980s a form of AI program called "expert systems" was adopted by corporations around the world and knowledge became the focus of mainstream AI research. In those same years, the Japanese government aggressively funded AI with its fifth generation computer project. Another encouraging event in the early 1980s was the revival of connectionism in the work of John Hopfield and David Rumelhart. Once again, AI had achieved success.

6.5.1 The rise of expert systems

An expert system is a program that answers questions or solves problems about a specific domain of knowledge, using logical rules that are derived from the knowledge of experts. The earliest examples were developed by Edward Feigenbaum and his students. Dendral, begun in 1965, identified compounds from spectrometer readings. MYCIN, developed in 1972, diagnosed infectious blood diseases. They demonstrated the feasibility of the approach.[106]

Expert systems restricted themselves to a small domain of specific knowledge (thus avoiding the commonsense knowledge problem) and their simple design made it relatively easy for programs to be built and then modified once they were in place. All in all, the programs proved to be useful: something that AI had not been able to achieve up to this point.[107]
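The general shape of such a system can be sketched as a handful of if-then rules and a procedure that works backward from a conclusion to the findings that would support it. The rules below are invented placeholders, not MYCIN's actual medical knowledge, and real systems also attached certainty factors and explanations to each step.

```python
# A toy backward-chaining "expert system": to establish a conclusion, find a
# rule that concludes it and try to establish each of its conditions in turn.
# The rules are invented placeholders, not a real knowledge base.
RULES = [
    ("suspect_meningitis", {"fever", "stiff_neck"}),
    ("recommend_lumbar_puncture", {"suspect_meningitis"}),
]

def established(goal, findings):
    if goal in findings:                      # directly observed
        return True
    for conclusion, conditions in RULES:
        if conclusion == goal and all(established(c, findings) for c in conditions):
            return True
    return False

print(established("recommend_lumbar_puncture", {"fever", "stiff_neck"}))  # True
```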
In 1980, an expert system called XCON was completed
at CMU for the Digital Equipment Corporation. It was
an enormous success: it was saving the company 40 million dollars annually by 1986.[108] Corporations around
the world began to develop and deploy expert systems and
by 1985 they were spending over a billion dollars on AI,
most of it to in-house AI departments. An industry grew
up to support them, including hardware companies like
Symbolics and Lisp Machines and software companies
such as IntelliCorp and Aion.[109]


6.5.2 The knowledge revolution


The power of expert systems came from the expert knowledge they contained. They were part of a new direction in AI research that had been gaining ground throughout the 70s. "AI researchers were beginning to suspect (reluctantly, for it violated the scientific canon of parsimony) that intelligence might very well be based on the ability to use large amounts of diverse knowledge in different ways,"[110] writes Pamela McCorduck. "[T]he great lesson from the 1970s was that intelligent behavior depended very much on dealing with knowledge, sometimes quite detailed knowledge, of a domain where a given task lay."[111] Knowledge based systems and knowledge engineering became a major focus of AI research in the 1980s.[112]
The 1980s also saw the birth of Cyc, the first attempt to attack the commonsense knowledge problem directly, by creating a massive database that would contain all the mundane facts that the average person knows. Douglas Lenat, who started and led the project, argued that there is no shortcut: the only way for machines to know the meaning of human concepts is to teach them, one concept at a time, by hand. The project was not expected to be completed for many decades.[113]

6.5.3 The money returns: the fifth generation project

In 1981, the Japanese Ministry of International Trade and Industry set aside $850 million for the Fifth generation computer project. Their objectives were to write programs and build machines that could carry on conversations, translate languages, interpret pictures, and reason like human beings.[114] Much to the chagrin of scruffies, they chose Prolog as the primary computer language for the project.[115]
Other countries responded with new programs of their own. The UK began the £350 million Alvey project. A consortium of American companies formed the Microelectronics and Computer Technology Corporation (or "MCC") to fund large scale projects in AI and information technology.[116][117] DARPA responded as well, founding the Strategic Computing Initiative and tripling its investment in AI between 1984 and 1988.[118]

6.5.4 The revival of connectionism

In 1982, physicist John Hopfield was able to prove that a form of neural network (now called a "Hopfield net") could learn and process information in a completely new way. Around the same time, David Rumelhart popularized a new method for training neural networks called "backpropagation" (discovered years earlier by Paul Werbos). These two discoveries revived the field of connectionism which had been largely abandoned since 1970.[117][119]

A Hopfield net with four nodes.

The new field was unified and inspired by the appearance of Parallel Distributed Processing in 1986, a two volume collection of papers edited by Rumelhart and psychologist James McClelland. Neural networks would become commercially successful in the 1990s, when they began to be used as the engines driving programs like optical character recognition and speech recognition.[117][120]
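A Hopfield net like the four-node one pictured above can be written down directly: a pattern is stored in Hebbian weights, and repeatedly updating the units until the state stops changing recalls the stored pattern from a corrupted cue. The sketch below stores a single invented pattern and uses synchronous updates for brevity; Hopfield's own analysis used asynchronous updates and an energy argument.

```python
import numpy as np

# A tiny Hopfield network: store one pattern with a Hebbian outer product,
# then recover it from a corrupted version by repeated threshold updates.
pattern = np.array([1, -1, 1, -1])            # the stored memory (4 units)
W = np.outer(pattern, pattern).astype(float)  # Hebbian weights
np.fill_diagonal(W, 0)                        # no self-connections

state = np.array([1, 1, 1, -1])               # corrupted cue (one unit flipped)
for _ in range(5):                            # update until the state is stable
    state = np.where(W @ state >= 0, 1, -1)

print(state)                                  # [ 1 -1  1 -1]
```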

6.6 Bust: the second AI winter 1987–1993

The business community's fascination with AI rose and fell in the 80s in the classic pattern of an economic bubble. The collapse was in the perception of AI by government agencies and investors; the field continued to make advances despite the criticism. Rodney Brooks and Hans Moravec, researchers from the related field of robotics, argued for an entirely new approach to artificial intelligence.

6.6.1 AI winter

The term "AI winter" was coined by researchers who had survived the funding cuts of 1974 when they became concerned that enthusiasm for expert systems had spiraled out of control and that disappointment would certainly follow.[121] Their fears were well founded: in the late 80s and early 90s, AI suffered a series of financial setbacks.

The first indication of a change in weather was the sudden collapse of the market for specialized AI hardware in 1987. Desktop computers from Apple and IBM had been steadily gaining speed and power and in 1987 they became more powerful than the more expensive Lisp machines made by Symbolics and others. There was no longer a good reason to buy them. An entire industry worth half a billion dollars was demolished overnight.[122]

Eventually the earliest successful expert systems, such as XCON, proved too expensive to maintain. They were difficult to update, they could not learn, they were "brittle" (i.e., they could make grotesque mistakes when given unusual inputs), and they fell prey to problems (such as the qualification problem) that had been identified years earlier. Expert systems proved useful, but only in a few special contexts.[123]

In the late 80s, the Strategic Computing Initiative cut funding to AI deeply and brutally. New leadership at DARPA had decided that AI was not "the next wave" and directed funds towards projects that seemed more likely to produce immediate results.[124]

By 1991, the impressive list of goals penned in 1981 for Japan's Fifth Generation Project had not been met. Indeed, some of them, like "carry on a casual conversation", had not been met by 2010.[125] As with other AI projects, expectations had run much higher than what was actually possible.[125]

6.6.2 The importance of having a body: Nouvelle AI and embodied reason

Main articles: Nouvelle AI, behavior-based AI, situated and embodied cognitive science

In the late 80s, several researchers advocated a completely new approach to artificial intelligence, based on robotics.[126] They believed that, to show real intelligence, a machine needs to have a body: it needs to perceive, move, survive and deal with the world. They argued that these sensorimotor skills are essential to higher level skills like commonsense reasoning and that abstract reasoning was actually the least interesting or important human skill (see Moravec's paradox). They advocated building intelligence "from the bottom up."[127]

The approach revived ideas from cybernetics and control theory that had been unpopular since the sixties. Another precursor was David Marr, who had come to MIT in the late 70s from a successful background in theoretical neuroscience to lead the group studying vision. He rejected all symbolic approaches (both McCarthy's logic and Minsky's frames), arguing that AI needed to understand the physical machinery of vision from the bottom up before any symbolic processing took place. (Marr's work would be cut short by leukemia in 1980.)[128]

In a 1990 paper, "Elephants Don't Play Chess", robotics researcher Rodney Brooks took direct aim at the physical symbol system hypothesis, arguing that symbols are not always necessary since "the world is its own best model. It is always exactly up to date. It always has every detail there is to be known. The trick is to sense it appropriately and often enough."[129] In the 80s and 90s, many cognitive scientists also rejected the symbol processing model of the mind and argued that the body was essential for reasoning, a theory called the embodied mind thesis.[130]

6.7 AI 1993–present

The field of AI, now more than half a century old, finally achieved some of its oldest goals. It began to be used successfully throughout the technology industry, although somewhat behind the scenes. Some of the success was due to increasing computer power and some was achieved by focusing on specific isolated problems and pursuing them with the highest standards of scientific accountability. Still, the reputation of AI, in the business world at least, was less than pristine. Inside the field there was little agreement on the reasons for AI's failure to fulfill the dream of human level intelligence that had captured the imagination of the world in the 1960s. Together, all these factors helped to fragment AI into competing subfields focused on particular problems or approaches, sometimes even under new names that disguised the tarnished pedigree of "artificial intelligence".[131] AI was both more cautious and more successful than it had ever been.

6.7.1 Milestones and Moore's Law

On 11 May 1997, Deep Blue became the first computer chess-playing system to beat a reigning world chess champion, Garry Kasparov.[132] In February 2011, in a Jeopardy! quiz show exhibition match, IBM's question answering system, Watson, defeated the two greatest Jeopardy! champions, Brad Rutter and Ken Jennings, by a significant margin.[133]

These successes were not due to some revolutionary new paradigm, but mostly to the tedious application of engineering skill and to the tremendous power of computers today.[134] In fact, Deep Blue's computer was 10 million times faster than the Ferranti Mark 1 that Christopher Strachey taught to play chess in 1951.[135] This dramatic increase is measured by Moore's law, which predicts that the speed and memory capacity of computers doubles every two years. The fundamental problem of "raw computer power" was slowly being overcome.
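The arithmetic behind that comparison is easy to check: doubling every two years over the 46 years separating the Ferranti Mark 1 (1951) from Deep Blue (1997) compounds to roughly the ten-million-fold increase quoted above.

```python
# Doubling every two years between 1951 (Ferranti Mark 1) and 1997 (Deep Blue).
years = 1997 - 1951
doublings = years / 2
print(2 ** doublings)   # about 8.4 million, i.e. on the order of 10^7
```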

6.7.2 Intelligent agents

A new paradigm called "intelligent agents" became widely accepted during the 90s.[136] Although earlier researchers had proposed modular "divide and conquer" approaches to AI,[137] the intelligent agent did not reach its modern form until Judea Pearl, Allen Newell and others brought concepts from decision theory and economics into the study of AI.[138] When the economist's definition of a rational agent was married to computer science's definition of an object or module, the intelligent agent paradigm was complete.

An intelligent agent is a system that perceives its environment and takes actions which maximize its chances of success. By this definition, simple programs that solve specific problems are "intelligent agents", as are human beings and organizations of human beings, such as firms. The intelligent agent paradigm defines AI research as "the study of intelligent agents". This is a generalization of some earlier definitions of AI: it goes beyond studying human intelligence; it studies all kinds of intelligence.[139]

The paradigm gave researchers license to study isolated problems and find solutions that were both verifiable and useful. It provided a common language to describe problems and share their solutions with each other, and with other fields that also used concepts of abstract agents, like economics and control theory. It was hoped that a complete agent architecture (like Newell's SOAR) would one day allow researchers to build more versatile and intelligent systems out of interacting intelligent agents.[138][140]
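The definition of an intelligent agent given above translates almost directly into code: an agent maps what it perceives to the action with the best expected outcome. The thermostat-like example below is invented and deliberately trivial; it is meant only to show the shape of the abstraction, not any particular agent architecture.

```python
# The intelligent-agent abstraction: perceive the environment, then pick the
# action with the best estimated outcome. The toy "environment" is hypothetical.
class ReflexAgent:
    def __init__(self, actions, utility):
        self.actions = actions          # what the agent can do
        self.utility = utility          # utility(percept, action) -> score

    def act(self, percept):
        # choose the action that maximizes estimated success
        return max(self.actions, key=lambda a: self.utility(percept, a))

# A made-up utility: keep the temperature near 20 degrees.
def comfort(temperature, action):
    change = {"heat": +2, "cool": -2, "idle": 0}[action]
    return -abs((temperature + change) - 20)

agent = ReflexAgent(["heat", "cool", "idle"], comfort)
print(agent.act(16))   # 'heat'
```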

6.7.3 Victory of the neats

AI researchers began to develop and use sophisticated mathematical tools more than they ever had in the past.[141] There was a widespread realization that many of the problems that AI needed to solve were already being worked on by researchers in fields like mathematics, economics or operations research. The shared mathematical language allowed both a higher level of collaboration with more established and successful fields and the achievement of results which were measurable and provable; AI had become a more rigorous scientific discipline. Russell & Norvig (2003) describe this as nothing less than a "revolution" and "the victory of the neats".[142][143]

Judea Pearl's highly influential 1988 book[144] brought probability and decision theory into AI. Among the many new tools in use were Bayesian networks, hidden Markov models, information theory, stochastic modeling and classical optimization. Precise mathematical descriptions were also developed for "computational intelligence" paradigms like neural networks and evolutionary algorithms.[142]

6.7.4 AI behind the scenes

Algorithms originally developed by AI researchers began to appear as parts of larger systems. AI had solved a lot of very difficult problems[145] and their solutions proved to be useful throughout the technology industry,[146] such as data mining, industrial robotics, logistics,[147] speech recognition,[148] banking software,[149] medical diagnosis[149] and Google's search engine.[150]


The field of AI receives little or no credit for these successes. Many of AI's greatest innovations have been reduced to the status of just another item in the tool chest of computer science.[151] Nick Bostrom explains "A lot of cutting edge AI has filtered into general applications, often without being called AI because once something becomes useful enough and common enough it's not labeled AI anymore."[152]

Many researchers in AI in the 1990s deliberately called their work by other names, such as informatics, knowledge-based systems, cognitive systems or computational intelligence. In part, this may be because they considered their field to be fundamentally different from AI, but also the new names help to procure funding. In the commercial world at least, the failed promises of the AI Winter continue to haunt AI research, as the New York Times reported in 2005: "Computer scientists and software engineers avoided the term artificial intelligence for fear of being viewed as wild-eyed dreamers."[153][154][155]

6.7.5 Where is HAL 9000?

In 1968, Arthur C. Clarke and Stanley Kubrick had imagined that by the year 2001, a machine would exist with an intelligence that matched or exceeded the capability of human beings. The character they created, HAL 9000, was based on a belief shared by many leading AI researchers that such a machine would exist by the year 2001.[156]

Marvin Minsky asks "So the question is why didn't we get HAL in 2001?"[157] Minsky believes that the answer is that the central problems, like commonsense reasoning, were being neglected, while most researchers pursued things like commercial applications of neural nets or genetic algorithms. John McCarthy, on the other hand, still blames the qualification problem.[158] For Ray Kurzweil, the issue is computer power and, using Moore's Law, he predicts that machines with human-level intelligence will appear by 2029.[159] Jeff Hawkins argues that neural net research ignores the essential properties of the human cortex, preferring simple models that have been successful at solving simple problems.[160] There are many other explanations and for each there is a corresponding research program underway.

6.7.6 2010s

In February 2011, in a Jeopardy! quiz show exhibition match, IBM's question answering system, Watson, defeated the two greatest Jeopardy! champions, Brad Rutter and Ken Jennings, by a significant margin.[161]

The Kinect, which provides a 3D body-motion interface for the Xbox 360 and the Xbox One, uses algorithms that emerged from lengthy AI research,[162] as do intelligent personal assistants in smartphones.[163] In 2015 researchers constructed a robot which learned how to use some tools by watching YouTube videos.[164]

6.8 See also

Outline of artificial intelligence
Progress in artificial intelligence
Timeline of artificial intelligence
History of natural language processing

6.9 Notes

[1] McCorduck 2004.
[2] For example Kurzweil (2005) argues that machines with human level intelligence will exist by 2029.
[3] Turing 1950, p. 460
[4] McCorduck 2004, pp. 535.
[5] McCorduck 2004, p. 5; Russell & Norvig 2003, p. 939
[6] McCorduck 2004, pp. 1516; Buchanan 2005, p. 50 (Judah Loew's Golem); McCorduck 2004, pp. 1314 (Paracelsus); O'Connor 1994 (Geber's Takwin)
[7] McCorduck 2004, pp. 1725.
[8] Butler 1863.
[9] Needham 1986, p. 53
[10] McCorduck 2004, p. 6
[11] Nick 2005.
[12] McCorduck 2004, p. 17 and see also Levitt 2000
[13] Quoted in McCorduck 2004, p. 8. Crevier 1993, p. 1 and McCorduck 2004, pp. 69 discusses sacred statues.
[14] Other important automatons were built by Haroun al-Rashid (McCorduck 2004, p. 10), Jacques de Vaucanson (McCorduck 2004, p. 16) and Leonardo Torres y Quevedo (McCorduck 2004, pp. 5962)
[15] Berlinski 2000
[16] Cfr. Carreras Artau, Tomás y Joaquín. Historia de la filosofía española. Filosofía cristiana de los siglos XIII al XV. Madrid, 1939, Volume I
[17] Bonner, Anthony, The Art and Logic of Ramon Llull: A User's Guide, Brill, 2007.
[18] Anthony Bonner (ed.), Doctor Illuminatus. A Ramon Llull Reader (Princeton University 1985). Vid. "Llull's Influence: The History of Lullism" at 5771
[19] 17th century mechanism and AI:
McCorduck 2004, pp. 3746
Russell & Norvig 2003, p. 6

Haugeland 1986, chpt. 2
Buchanan 2005, p. 53
[20] Hobbes and AI:
McCorduck 2004, p. 42
Hobbes 1651, chapter 5
[21] Leibniz and AI:
McCorduck 2004, p. 41
Russell & Norvig 2003, p. 6
Berlinski 2000, p. 12
Buchanan 2005, p. 53
[22] The Lambda calculus was especially important to AI,
since it was an inspiration for Lisp (the most important
programming language used in AI). (Crevier 1993, pp.
190 196,61)
[23] The original photo can be seen in the article: Rose, Allen
(April 1946). Lightning Strikes Mathematics. Popular
Science: 8386. Retrieved 15 April 2012.
[24] The Turing machine: McCorduck 2004, pp. 6364,
Crevier 1993, pp. 2224, Russell & Norvig 2003, p. 8
and see Turing 1936
[25] Menabrea 1843
[26] McCorduck 2004, pp. 6162, 6466, Russell & Norvig
2003, pp. 1415


[36] Schaeer, Jonathan. One Jump Ahead:: Challenging Human Supremacy in Checkers, 1997,2009, Springer, ISBN
978-0-387-76575-4. Chapter 6.
[37] McCorduck 2004, pp. 137170, Crevier, pp. 4447
[38] McCorduck 2004, pp. 123125, Crevier 1993, pp. 44
46 and Russell & Norvig 2003, p. 17
[39] Quoted in Crevier 1993, p. 46 and Russell & Norvig
2003, p. 17
[40] Russell & Norvig 2003, p. 947,952
[41] McCorduck 2004, pp. 111136, Crevier 1993, pp. 49
51 and Russell & Norvig 2003, p. 17
[42] See McCarthy et al. 1955. Also see Crevier 1993, p. 48
where Crevier states "[the proposal] later became known
as the 'physical symbol systems hypothesis". The physical
symbol system hypothesis was articulated and named by
Newell and Simon in their paper on GPS. (Newell & Simon 1963) It includes a more specic denition of a machine as an agent that manipulates symbols. See the
philosophy of articial intelligence.
[43] McCorduck (2004, pp. 129130) discusses how the Dartmouth conference alumni dominated the rst two decades
of AI research, calling them the invisible college.
[44] I won't swear and I hadn't seen it before, McCarthy told
Pamela McCorduck in 1979. (McCorduck 2004, p. 114)
However, McCarthy also stated unequivocally I came up
with the term in a CNET interview. (Skillings 2006)

[27] Von Neumann: McCorduck (2004, pp. 7680)

[45] Crevier (1993, pp. 49) writes the conference is generally


recognized as the ocial birthdate of the new science.

[28] The starting and ending dates of the sections in this article are adopted from Crevier 1993 and Russell & Norvig
2003, p. 1627. Themes, trends and projects are treated
in the period that the most important work was done.

[46] Russell and Norvig write it was astonishing whenever a


computer did anything remotely clever. Russell & Norvig
2003, p. 18

[29] McCorduck 2004, pp. 5157, 80107, Crevier 1993, pp.


2732, Russell & Norvig 2003, pp. 15, 940, Moravec
1988, p. 3, Cordeschi & 2002 Chap. 5.
[30] McCorduck 2004, p. 98, Crevier 1993, pp. 2728,
Russell & Norvig 2003, pp. 15, 940, Moravec 1988, p.
3, Cordeschi & 2002 Chap. 5.
[31] McCorduck 2004, pp. 5157, 8894, Crevier 1993, p.
30, Russell & Norvig 2003, p. 1516, Cordeschi & 2002
Chap. 5 and see also Pitts & McCullough 1943
[32] McCorduck 2004, p. 102, Crevier 1993, pp. 3435 and
Russell & Norvig 2003, p. 17
[33] McCorduck 2004, pp. 7072, Crevier 1993, p. 2225,
Russell & Norvig 2003, pp. 23 and 948, Haugeland
1985, pp. 69, Cordeschi 2002, pp. 170176. See also
Turing 1950
[34] Norvig & Russell (2003, p. 948) claim that Turing answered all the major objections to AI that have been offered in the years since the paper appeared.
[35] See A Brief History of Computing at AlanTuring.net.

[47] Crevier 1993, pp. 52107, Moravec 1988, p. 9 and


Russell & Norvig 2003, p. 1821
[48] McCorduck 2004, p. 218, Crevier 1993, pp. 108109
and Russell & Norvig 2003, p. 21
[49] Crevier 1993, pp. 52107, Moravec 1988, p. 9
[50] Means-ends analysis, reasoning as search: McCorduck
2004, pp. 247248. Russell & Norvig 2003, pp. 5961
[51] Heuristic: McCorduck 2004, p. 246, Russell & Norvig
2003, pp. 2122
[52] GPS: McCorduck 2004, pp. 245250, Crevier 1993, p.
GPS?, Russell & Norvig 2003, p. GPS?
[53] Crevier 1993, pp. 5158,6566 and Russell & Norvig
2003, pp. 1819
[54] McCorduck 2004, pp. 268271, Crevier 1993, pp. 95
96, Moravec 1988, pp. 1415
[55] McCorduck 2004, p. 286, Crevier 1993, pp. 7679,
Russell & Norvig 2003, p. 19
[56] Crevier 1993, pp. 7983


[57] Crevier 1993, pp. 164172


[58] McCorduck 2004, pp. 291296, Crevier 1993, pp. 134
139
[59] McCorduck 2004, pp. 299305, Crevier 1993, pp. 83
102, Russell & Norvig 2003, p. 19 and Copeland 2000
[60] McCorduck 2004, pp. 300305, Crevier 1993, pp. 84
102, Russell & Norvig 2003, p. 19
[61] Simon & Newell 1958, p. 78 quoted in Crevier 1993, p.
108. See also Russell & Norvig 2003, p. 21
[62] Simon 1965, p. 96 quoted in Crevier 1993, p. 109
[63] Minsky 1967, p. 2 quoted in Crevier 1993, p. 109
[64] Minsky strongly believes he was misquoted.
See
McCorduck 2004, pp. 272274, Crevier 1993, p. 96 and
Darrach 1970.
[65] Crevier 1993, pp. 6465
[66] Crevier 1993, p. 94
[67] Howe 1994
[68] McCorduck 2004, p. 131, Crevier 1993, p. 51. McCorduck also notes that funding was mostly under the direction of alumni of the Dartmouth conference of 1956.
[69] Crevier 1993, p. 65
[70] Crevier 1993, pp. 6871 and Turkle 1984
[71] Crevier 1993, pp. 100144 and Russell & Norvig 2003,
pp. 2122
[72] McCorduck 2004, pp. 104107, Crevier 1993, pp. 102
105, Russell & Norvig 2003, p. 22
[73] Crevier 1993, pp. 163196
[74] Crevier 1993, p. 146
[75] Russell & Norvig 2003, pp. 2021
[76] Crevier 1993, pp. 146148, see also Buchanan 2005, p.
56: Early programs were necessarily limited in scope by
the size and speed of memory


[83] McCorduck 2004, pp. 280281, Crevier 1993, p. 110,


Russell & Norvig 2003, p. 21 and NRC 1999 under Success in Speech Recognition.
[84] Crevier 1993, p. 117, Russell & Norvig 2003, p. 22,
Howe 1994 and see also Lighthill 1973.
[85] Russell & Norvig 2003, p. 22, Lighthill 1973, John McCarthy wrote in response that the combinatorial explosion problem has been recognized in AI from the beginning in Review of Lighthill report
[86] Crevier 1993, pp. 115116 (on whom this account is
based). Other views include McCorduck 2004, pp. 306
313 and NRC 1999 under Success in Speech Recognition.
[87] Crevier 1993, p. 115. Moravec explains, Their initial
promises to DARPA had been much too optimistic. Of
course, what they delivered stopped considerably short of
that. But they felt they couldn't in their next proposal
promise less than in the rst one, so they promised more.
[88] NRC 1999 under Shift to Applied Research Increases
Investment. While the autonomous tank was a failure,
the battle management system (called "DART") proved to
be enormously successful, saving billions in the rst Gulf
War, repaying the investment and justifying the DARPA's
pragmatic policy, at least as far as DARPA was concerned.
[89] Lucas and Penrose' critique of AI: Crevier 1993, p. 22,
Russell & Norvig 2003, pp. 949950, Hofstadter 1980,
pp. 471477 and see Lucas 1961
[90] Know-how is Dreyfus term. (Dreyfus makes a distinction between knowing how and knowing that, a modern version of Heidegger's distinction of ready-to-hand
and present-at-hand.) (Dreyfus & Dreyfus 1986)
[91] Dreyfus critique of articial intelligence: McCorduck
2004, pp. 211239, Crevier 1993, pp. 120132, Russell
& Norvig 2003, pp. 950952 and see Dreyfus 1965,
Dreyfus 1972, Dreyfus & Dreyfus 1986
[92] Searles critique of AI: McCorduck 2004, pp. 443445,
Crevier 1993, pp. 269271, Russell & Norvig 2004, pp.
958960 and see Searle 1980
[93] Quoted in Crevier 1993, p. 143

[77] Moravec 1976. McCarthy has always disagreed with


Moravec, back to their early days together at SAIL. He
states I would say that 50 years ago, the machine capability was much too small, but by 30 years ago, machine
capability wasn't the real problem. in a CNET interview.
(Skillings 2006)

[94] Quoted in Crevier 1993, p. 122

[78] Hans Moravec, ROBOT: Mere Machine to Transcendent


Mind

[96] Weizenbaums critique of AI: McCorduck 2004, pp. 356


373, Crevier 1993, pp. 132144, Russell & Norvig 2003,
p. 961 and see Weizenbaum 1976

[79] Russell & Norvig 2003, pp. 9,2122 and Lighthill 1973
[80] McCorduck 2004, pp. 300 & 421; Crevier 1993, pp.
113114; Moravec 1988, p. 13; Lenat & Guha 1989, (Introduction); Russell & Norvig 2003, p. 21
[81] McCorduck 2004, p. 456, Moravec 1988, pp. 1516
[82] McCarthy & Hayes 1969, Crevier 1993, pp. 117119

[95] I became the only member of the AI community to be


seen eating lunch with Dreyfus. And I deliberately made
it plain that theirs was not the way to treat a human being.
Joseph Weizenbaum, quoted in Crevier 1993, p. 123.

[97] McCorduck 2004, p. 51, Russell & Norvig 2003, pp. 19,
23
[98] McCorduck 2004, p. 51, Crevier 1993, pp. 190192
[99] Crevier 1993, pp. 193196
[100] Crevier 1993, pp. 145149,25863


[101] Wason (1966) showed that people do poorly on completely abstract problems, but if the problem is restated to allow the use of intuitive social intelligence, performance dramatically improves. (See Wason selection task) Tversky, Slovic & Kahnemann (1982) have shown that people are terrible at elementary problems that involve uncertain reasoning. (See list of cognitive biases for several examples). Eleanor Rosch's work is described in Lakoff 1987
[102] An early example of McCarthy's position was in the journal Science where he said "This is AI, so we don't care if it's psychologically real" (Kolata 1982), and he recently reiterated his position at the AI@50 conference where he said "Artificial intelligence is not, by definition, simulation of human intelligence" (Maker 2006).
[103] Crevier 1993, pp. 175
[104] Neat vs. scruffy: McCorduck 2004, pp. 421424 (who picks up the state of the debate in 1984). Crevier 1993, pp. 168 (who documents Schank's original use of the term). Another aspect of the conflict was called the "procedural/declarative distinction" but did not prove to be influential in later AI research.
[105] McCorduck 2004, pp. 305306, Crevier 1993, pp. 170173, 246 and Russell & Norvig 2003, p. 24. Minsky's frame paper: Minsky 1974.
[106] McCorduck 2004, pp. 327335 (Dendral), Crevier 1993, pp. 148159, Russell & Norvig 2003, pp. 2223
[107] Crevier 1993, pp. 158159 and Russell & Norvig 2003, p. 2324
[108] Crevier 1993, p. 198
[109] McCorduck 2004, pp. 434435, Crevier 1993, pp. 161162, 197203 and Russell & Norvig 2003, p. 24
[110] McCorduck 2004, p. 299
[111] McCorduck 2004, pp. 421
[112] Knowledge revolution: McCorduck 2004, pp. 266276, 298300, 314, 421, Russell & Norvig, pp. 2223
[113] Cyc: McCorduck 2004, p. 489, Crevier 1993, pp. 239243, Russell & Norvig 2003, p. 363365 and Lenat & Guha 1989
[114] McCorduck 2004, pp. 436441, Crevier 1993, pp. 211, Russell & Norvig 2003, p. 24 and see also Feigenbaum & McCorduck 1983
[115] Crevier 1993, pp. 195
[116] Crevier 1993, pp. 240.
[117] Russell & Norvig 2003, p. 25
[118] McCorduck 2004, pp. 426432, NRC 1999 under "Shift to Applied Research Increases Investment"
[119] Crevier 1993, pp. 214215.
[120] Crevier 1993, pp. 215216.
[121] Crevier 1993, pp. 203. "AI winter" was first used as the title of a seminar on the subject for the Association for the Advancement of Artificial Intelligence.
[122] McCorduck 2004, p. 435, Crevier 1993, pp. 209210
[123] McCorduck 2004, p. 435 (who cites institutional reasons for their ultimate failure), Crevier 1993, pp. 204208 (who cites the difficulty of truth maintenance, i.e., learning and updating), Lenat & Guha 1989, Introduction (who emphasizes the brittleness and the inability to handle excessive qualification.)
[124] McCorduck 2004, pp. 430431
[125] McCorduck 2004, p. 441, Crevier 1993, p. 212. McCorduck writes "Two and a half decades later, we can see that the Japanese didn't quite meet all of those ambitious goals."
[126] McCorduck 2004, pp. 454462
[127] Moravec (1988, p. 20) writes: "I am confident that this bottom-up route to artificial intelligence will one day meet the traditional top-down route more than half way, ready to provide the real world competence and the commonsense knowledge that has been so frustratingly elusive in reasoning programs. Fully intelligent machines will result when the metaphorical golden spike is driven uniting the two efforts."
[128] Crevier 1993, pp. 183190.
[129] Brooks 1990, p. 3
[130] See, for example, Lakoff & Turner 1999
[131] McCorduck (2004, p. 424) discusses the fragmentation and the abandonment of AI's original goals.
[132] McCorduck 2004, pp. 480483
[133] Markoff, John (16 February 2011). "On 'Jeopardy!' Watson Win Is All but Trivial". The New York Times.
[134] Kurzweil 2005, p. 274 writes that the improvement in computer chess, "according to common wisdom, is governed only by the brute force expansion of computer hardware."
[135] Cycle time of Ferranti Mark 1 was 1.2 milliseconds, which is arguably equivalent to about 833 flops. Deep Blue ran at 11.38 gigaflops (and this does not even take into account Deep Blue's special-purpose hardware for chess). Very approximately, these differ by a factor of 10^7.

[136] McCorduck 2004, pp. 471478, Russell & Norvig 2003,


p. 55, where they write: The whole-agent view is
now widely accepted in the eld. The intelligent agent
paradigm is discussed in major AI textbooks, such as:
Russell & Norvig 2003, pp. 3258, 968972, Poole,
Mackworth & Goebel 1998, pp. 721, Luger & Stubbleeld 2004, pp. 235240


[137] Carl Hewitt's Actor model anticipated the modern definition of intelligent agents. (Hewitt, Bishop & Steiger
1973) Both John Doyle (Doyle 1983) and Marvin Minsky's popular classic The Society of Mind (Minsky 1986)
used the word agent. Other modular proposals included Rodney Brooks subsumption architecture, objectoriented programming and others.
[138] Russell & Norvig 2003, pp. 27, 55
[139] This is how the most widely accepted textbooks of the 21st
century dene articial intelligence. See Russell & Norvig
2003, p. 32 and Poole, Mackworth & Goebel 1998, p. 1
[140] McCorduck 2004, p. 478
[141] McCorduck 2004, pp. 486487, Russell & Norvig 2003,
pp. 2526
[142] Russell & Norvig 2003, p. 2526
[143] McCorduck (2004, p. 487): As I write, AI enjoys a Neat
hegemony.
[144] Pearl 1988
[145] See Computer science (in Applications of articial intelligence)
[146] NRC 1999 under Articial Intelligence in the 90s, and
Kurzweil 2005, p. 264
[147] Russell & Norvig 2003, p. 28
[148] For the new state of the art in AI based speech recognition,
see The Economist (2007)
[149] AI-inspired systems were already integral to many everyday technologies such as internet search engines, bank
software for processing transactions and in medical diagnosis. Nick Bostrom, quoted in CNN 2006
[150] Olsen (2004),Olsen (2006)
[151] McCorduck 2004, p. 423, Kurzweil 2005, p.
Hofstadter 1979, p. 601

265,

[152] CNN 2006


[153] Marko 2005
[154] The Economist 2007
[155] Tascarella 2006
[156] Crevier 1993, pp. 108109
[157] He goes on to say: The answer is, I believe we could have
... I once went to an international conference on neural
net[s]. There were 40 thousand registrants ... but ... if
you had an international conference, for example, on using
multiple representations for common sense reasoning, I've
only been able to nd 6 or 7 people in the whole world.
Minsky 2001
[158] Maker 2006
[159] Kurzweil 2005
[160] Hawkins & Blakeslee 2004

[161] Marko 2011.


[162] Kinects AI breakthrough explained
[163] http://readwrite.com/2013/01/15/
virtual-personal-assistants-the-future-of-your-smartphone-infographic
[164] http://www.kurzweilai.net/
robot-learns-to-use-tools-by-watching-youtube-videos

6.10 References
Berlinski, David (2000), The Advent of the Algorithm, Harcourt Books, ISBN 0-15-601391-6,
OCLC 46890682.
Buchanan, Bruce G. (Winter 2005), A (Very) Brief
History of Articial Intelligence, AI Magazine: 53
60, retrieved 2007-08-30.
Brooks, Rodney (1990), Elephants Don't Play
Chess, Robotics and Autonomous Systems 6:
315, doi:10.1016/S0921-8890(05)80025-9, retrieved 2007-08-30.
Butler, Samuel (13 June 1863), Darwin Among the
Machines, the Press, Christchurch, New Zealand,
retrieved 10 October 2008.
CNN (26 July 2006), AI set to exceed human brain
power, CNN.com, retrieved 16 October 2007.
Copeland, Jack (2000), Micro-World AI, retrieved 8
October 2008.
Cordeschi, Roberto (2002), The Discovery of the
Articial, Dordrecht: Kluwer..
Crevier, Daniel (1993), AI: The Tumultuous Search
for Articial Intelligence, New York, NY: BasicBooks, ISBN 0-465-02997-3
Darrach, Brad (20 November 1970), Meet Shakey,
the First Electronic Person, Life Magazine: 5868.
Doyle, J. (1983), What is rational psychology? Toward a modern mental philosophy, AI Magazine 4
(3): 5053.
Dreyfus, Hubert (1965), Alchemy and AI, RAND
Corporation Memo.
Dreyfus, Hubert (1972), What Computers Can't
Do, New York: MIT Press, ISBN 0-06-090613-8,
OCLC 5056816.
The Economist (7 June 2007), Are You Talking to
Me?", The Economist, retrieved 16 October 2008.
Feigenbaum, Edward A.; McCorduck, Pamela
(1983), The Fifth Generation: Articial Intelligence and Japans Computer Challenge to the World,
Michael Joseph, ISBN 0-7181-2401-4.

Hawkins, Je; Blakeslee, Sandra (2004), On Intelligence, New York, NY: Owl Books, ISBN 0-80507853-3, OCLC 61273290.
Hebb, D.O. (1949), The Organization of Behavior,
New York: Wiley, ISBN 0-8058-4300-0, OCLC
48871099.
Hewitt, Carl; Bishop, Peter; Steiger, Richard
(1973), A Universal Modular Actor Formalism for
Articial Intelligence (PDF), IJCAI
Hobbes, Thomas (1651), Leviathan.
Hofstadter, Douglas (1999 (1979)), Gödel, Escher, Bach: an Eternal Golden Braid, Basic Books, ISBN 0-465-02656-7, OCLC 225590743.
Howe, J. (November 1994), Articial Intelligence
at Edinburgh University: a Perspective, retrieved 30
August 2007.
Kolata, G. (1982), How can computers
get common sense?", Science 217 (4566):
12371238,
Bibcode:1982Sci...217.1237K,
doi:10.1126/science.217.4566.1237,
PMID
17837639.
Kurzweil, Ray (2005), The Singularity is Near,
Viking Press, ISBN 0-14-303788-9, OCLC
71826177.
Lako, George (1987), Women, Fire, and Dangerous Things: What Categories Reveal About the Mind,
University of Chicago Press., ISBN 0-226-46804-6.
Lenat, Douglas; Guha, R. V. (1989), Building Large
Knowledge-Based Systems, Addison-Wesley, ISBN
0-201-51752-3, OCLC 19981533.
Levitt, Gerald M. (2000), The Turk, Chess Automaton, Jeerson, N.C.: McFarland, ISBN 0-78640778-6.
Lighthill, Professor Sir James (1973), "Articial
Intelligence: A General Survey", Articial Intelligence: a paper symposium, Science Research Council
Lucas, John (1961), Minds, Machines and
Gödel, Philosophy 36 (XXXVI): 112127,
doi:10.1017/S0031819100057983, retrieved 15
October 2008
Maker, Meg Houston (2006), AI@50: AI Past,
Present, Future, Dartmouth College, retrieved 16
October 2008
Marko, John (14 October 2005), Behind Articial Intelligence, a Squadron of Bright Real People,
The New York Times, retrieved 16 October 2008

McCarthy, John; Minsky, Marvin; Rochester,
Nathan; Shannon, Claude (August 35, 1955), A Proposal for the Dartmouth Summer Research Project
on Articial Intelligence, retrieved 16 October 2008
McCarthy, John; Hayes, P. J. (1969), Some philosophical problems from the standpoint of articial
intelligence, in Meltzer, B. J.; Mitchie, Donald,
Machine Intelligence 4, Edinburgh University Press,
pp. 463502, retrieved 16 October 2008
McCorduck, Pamela (2004), Machines Who Think
(2nd ed.), Natick, MA: A. K. Peters, Ltd., ISBN 156881-205-1, OCLC 52197627.
McCullough, W. S.; Pitts, W. (1943), A logical
calculus of the ideas immanent in nervous activity,
Bulletin of Mathematical Biophysics 5 (4): 115127,
doi:10.1007/BF02478259
Menabrea, Luigi Federico; Lovelace, Ada (1843),
Sketch of the Analytical Engine Invented by
Charles Babbage, Scientic Memoirs 3, retrieved
2008-08-29 With notes upon the Memoir by the
Translator
Minsky, Marvin (1967), Computation: Finite and
Innite Machines, Englewood Clis, N.J.: PrenticeHall
Minsky, Marvin; Papert, Seymour (1969), Perceptrons: An Introduction to Computational Geometry, The MIT Press, ISBN 0-262-63111-3, OCLC
16924756
Minsky, Marvin (1974), A Framework for Representing Knowledge, retrieved 16 October 2008
Minsky, Marvin (1986), The Society of Mind, Simon and Schuster, ISBN 0-671-65713-5, OCLC
223353010
Minsky, Marvin (2001), Its 2001. Where Is HAL?,
Dr. Dobbs Technetcast, retrieved 8 August 2009
Moravec, Hans (1976), The Role of Raw Power in
Intelligence, retrieved 16 October 2008
Moravec, Hans (1988), Mind Children, Harvard
University Press, ISBN 0-674-57618-7, OCLC
245755104
NRC (1999), Developments in Articial Intelligence, Funding a Revolution: Government Support
for Computing Research, National Academy Press,
ISBN 0-309-06278-0, OCLC 246584055
Newell, Allen; Simon, H. A. (1963), GPS: A Program that Simulates Human Thought, in Feigenbaum, E.A.; Feldman, J., Computers and Thought,
New York: McGraw-Hill, ISBN 0-262-56092-5,
OCLC 246968117


Nick, Martin (2005), Al Jazari: The Ingenious 13th Century Muslim Mechanic, Al Shindagah, retrieved 16 October 2008.
O'Connor, Kathleen Malone (1994), The alchemical creation of life (takwin) and other concepts of Genesis in medieval Islam, University of Pennsylvania, retrieved 2007-01-10.
Olsen, Stefanie (10 May 2004), Newsmaker: Google's man behind the curtain, CNET, retrieved 17 October 2008.
Olsen, Stefanie (18 August 2006), Spying an intelligent search engine, CNET, retrieved 17 October 2008.
Pearl, J. (1988), Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference, San Mateo, California: Morgan Kaufmann, ISBN 1-55860-479-0, OCLC 249625842.
Russell, Stuart J.; Norvig, Peter (2003), Artificial Intelligence: A Modern Approach (2nd ed.), Upper Saddle River, New Jersey: Prentice Hall, ISBN 0-13-790395-2.
Poole, David; Mackworth, Alan; Goebel, Randy (1998), Computational Intelligence: A Logical Approach, Oxford University Press, ISBN 0-19-510270-3.
Samuel, Arthur L. (July 1959), Some studies in machine learning using the game of checkers, IBM Journal of Research and Development 3 (3): 210219, doi:10.1147/rd.33.0210, retrieved 2007-08-20.
Searle, John (1980), Minds, Brains and Programs, Behavioral and Brain Sciences 3 (3): 417457, doi:10.1017/S0140525X00005756, retrieved May 13, 2009.
Simon, H. A.; Newell, Allen (1958), Heuristic Problem Solving: The Next Advance in Operations Research, Operations Research 6: 1, doi:10.1287/opre.6.1.1.
Simon, H. A. (1965), The Shape of Automation for Men and Management, New York: Harper & Row.
Skillings, Jonathan (2006), Newsmaker: Getting machines to think like us, CNET, retrieved 8 October 2008.
Tascarella, Patty (11 August 2006), Robotics firms find fundraising struggle, with venture capital shy, Pittsburgh Business Times, retrieved 8 October 2008.
Turing, Alan (1936-37), On Computable Numbers, with an Application to the Entscheidungsproblem, Proceedings of the London Mathematical Society, 2 (42): 230265, doi:10.1112/plms/s2-42.1.230, retrieved 8 October 2008.
Turing, Alan (October 1950), Computing Machinery and Intelligence, Mind LIX (236): 433460, doi:10.1093/mind/LIX.236.433, ISSN 0026-4423, retrieved 2008-08-18.
Weizenbaum, Joseph (1976), Computer Power and Human Reason, W.H. Freeman & Company, ISBN 0-14-022535-8, OCLC 10952283.

Chapter 7

History of computer science


The history of computer science began long before the modern discipline of computer science that emerged in the 20th century, and was hinted at in the centuries prior. The progression, from mechanical inventions and mathematical theories towards the modern computer concepts and machines, formed a major academic field and the basis of a massive worldwide industry.[1]

The earliest known tool for use in computation was the abacus, developed in the period 2700-2300 BCE in Sumer. The Sumerian abacus consisted of a table of successive columns which delimited the successive orders of magnitude of their sexagesimal number system.[2] Its original style of usage was by lines drawn in sand with pebbles. Abaci of a more modern design are still used as calculation tools today.[3]

The Antikythera mechanism is believed to be the earliest known mechanical analog computer.[4] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to c. 100 BCE. Technological artifacts of similar complexity did not reappear until the 14th century, when mechanical astronomical clocks appeared in Europe.[5]

When John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools. In 1623 Wilhelm Schickard designed a calculating machine, but abandoned the project when the prototype he had started building was destroyed by a fire in 1624. Around 1640, Blaise Pascal, a leading French mathematician, constructed a mechanical adding device based on a design described by Greek mathematician Hero of Alexandria.[6] Then in 1672 Gottfried Wilhelm Leibniz invented the Stepped Reckoner, which he completed in 1694.[7]

In 1837 Charles Babbage first described his Analytical Engine, which is accepted as the first design for a modern computer. The Analytical Engine had expandable memory, an arithmetic unit, and logic processing capabilities able to interpret a programming language with loops and conditional branching. Although never built, the design has been studied extensively and is understood to be Turing equivalent. The Analytical Engine would have had a memory capacity of less than 1 kilobyte and a clock speed of less than 10 hertz.

7.1 Binary logic

Considerable advancement in mathematics and electronics theory was required before the first modern computers could be designed.

In 1703, Gottfried Wilhelm Leibniz developed logic in a formal, mathematical sense with his writings on the binary numeral system. In his system, the ones and zeros also represent true and false values or on and off states. But it took more than a century before George Boole published his Boolean algebra in 1854, with a complete system that allowed computational processes to be mathematically modeled.

By this time, the first mechanical devices driven by a binary pattern had been invented. The industrial revolution had driven forward the mechanization of many tasks, and this included weaving. Punched cards controlled Joseph Marie Jacquard's loom in 1801, where a hole punched in the card indicated a binary one and an unpunched spot indicated a binary zero. Jacquard's loom was far from being a computer, but it did illustrate that machines could be driven by binary systems.

7.2 Birth of computer

Before the 1920s, computers (sometimes "computors") were human clerks that performed computations. They were usually under the lead of a physicist. Many thousands of computers were employed in commerce, government, and research establishments. Most of these computers were women, and they were known to have a degree in calculus. Some performed astronomical calculations for calendars, others ballistic tables for the military.

After the 1920s, the expression "computing machine" referred to any machine that performed the work of a human computer, especially those in accordance with effective methods of the Church-Turing thesis. The thesis states that a mathematical method is effective if it could be set out as a list of instructions able to be followed by a human clerk with paper and pencil, for as long as necessary, and without ingenuity or insight.

Machines that computed with continuous values became known as the analog kind. They used machinery that represented continuous numeric quantities, like the angle of a shaft rotation or the difference in electrical potential.

Digital machinery, in contrast to analog, were able to render a state of a numeric value and store each individual digit. Digital machinery used difference engines or relays before the invention of faster memory devices.

The phrase "computing machine" gradually gave way, after the late 1940s, to just "computer" as the onset of electronic digital machinery became common. These computers were able to perform the calculations that were performed by the previous human clerks.

Since the values stored by digital machines were not bound to physical properties like analog devices, a logical computer, based on digital equipment, was able to do anything that could be described as purely mechanical. The theoretical Turing Machine, created by Alan Turing, is a hypothetical device theorized in order to study the properties of such hardware.

See also: Philosophy of physics, Philosophy of biology, Philosophy of mathematics, Philosophy of language and Philosophy of mind

7.3 Emergence of a discipline

7.3.1 Charles Babbage and Ada Lovelace

Charles Babbage is often regarded as one of the first pioneers of computing. Beginning in the 1810s, Babbage had a vision of mechanically computing numbers and tables. Putting this into reality, Babbage designed a calculator to compute numbers up to 8 decimal points long. Continuing with the success of this idea, Babbage worked to develop a machine that could compute numbers with up to 20 decimal places. By the 1830s, Babbage had devised a plan to develop a machine that could use punched cards to perform arithmetical operations. The machine would store numbers in memory units, and there would be a form of sequential control. This means that one operation would be carried out before another in such a way that the machine would produce an answer and not fail. This machine was to be known as the "Analytical Engine", which was the first true representation of what is the modern computer.[8]

Ada Lovelace (Augusta Ada Byron) is credited as the pioneer of computer programming and is regarded as a mathematical genius, a result of the mathematically heavy tutoring regimen her mother assigned to her as a young girl. Lovelace began working with Charles Babbage as an assistant while Babbage was working on his "Analytical Engine", the first mechanical computer. During her work with Babbage, Ada Lovelace became the designer of the first computer algorithm, which had the ability to compute Bernoulli numbers. Moreover, Lovelace's work with Babbage resulted in her prediction that future computers would not only perform mathematical calculations, but also manipulate symbols, mathematical or not. While she was never able to see the results of her work, as the "Analytical Engine" was not created in her lifetime, her efforts in later years, beginning in the 1940s, did not go unnoticed.[9]

7.3.2 Alan Turing and the Turing Machine

The mathematical foundations of modern computer science began to be laid by Kurt Gödel with his incompleteness theorem (1931). In this theorem, he showed that there were limits to what could be proved and disproved within a formal system. This led to work by Gödel and others to define and describe these formal systems, including concepts such as mu-recursive functions and lambda-definable functions.

1936 was a key year for computer science. Alan Turing and Alonzo Church independently, and also together, introduced the formalization of an algorithm, with limits on what can be computed, and a purely mechanical model for computing.

These topics are covered by what is now called the Church-Turing thesis, a hypothesis about the nature of mechanical calculation devices, such as electronic computers. The thesis claims that any calculation that is possible can be performed by an algorithm running on a computer, provided that sufficient time and storage space are available.

In 1937, Alan Turing introduced his idea of what are now referred to as Turing Machines, and, anticipating the modern stored program computer, described what became known as the Universal Turing machine. These Turing machines were designed to formally determine, mathematically, what can be computed, taking into account limitations on computing ability. If a Turing machine can complete the task, it is considered Turing computable.[10]

Turing machines are not physical objects, but mathematical ones. They show if and how any given algorithm can be computed. Turing machines are state machines, where a state represents a position in a graph. State machines use various states, or graph positions, to determine the outcome of the algorithm. To accomplish this, a theoretical one-dimensional tape is said to be divided into an infinite number of cells. Each cell contains a binary digit, 1 or 0. As the read/write head of the machine scans in the subsequent value in the cell, it uses this value to determine what state to transition to next. To accomplish this, the machine follows an input of rules, usually in the form of tables, that contain logic similar to: if the machine is in state A and a 0 is read in, the machine is going to go to the next state, say, state B. The rules that the machines must follow are considered the program. These Turing machines helped define the logic behind modern computer science. Memory in modern computers is represented by the infinite tape, and the bus of the machine is represented by the read/write head.[10]

Turing focused heavily on designing a machine that could determine what can be computed. Turing concluded that as long as a Turing machine exists that could compute a precise approximation of a number, that value was computable. This does include constants such as pi. Furthermore, functions can be computable when determining TRUE or FALSE for any given parameters. One example of this would be a function IsEven. If this function were passed a number, the computation would produce TRUE if the number were even and FALSE if the number were odd. Using these specifications, Turing machines can determine if a function is computable and terminate if said function is computable. Furthermore, Turing machines can interpret logic operators, such as AND, OR, XOR, NOT, and IF-THEN-ELSE,[10] to determine if a function is computable.
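To make the state-machine description above concrete, here is a minimal sketch of a table-driven Turing machine in Python. The rule table, the unary tape encoding and the IsEven-style example are illustrative assumptions for this chapter's explanation, not material from the cited sources.

```python
# A minimal Turing machine sketch: a finite rule table drives a read/write head
# over a tape of cells, as described above. The machine and its rules are
# illustrative only.

def run_turing_machine(rules, tape, state="A", accept=("ACCEPT",), reject=("REJECT",), steps=1000):
    """rules: {(state, symbol): (new_state, write_symbol, move)} with move in {-1, +1}."""
    cells = dict(enumerate(tape))      # sparse tape; unwritten cells read as '0'
    head = 0
    for _ in range(steps):
        if state in accept:
            return True
        if state in reject:
            return False
        symbol = cells.get(head, "0")
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += move
    raise RuntimeError("machine did not halt within the step limit")

# An IsEven-style check on a unary input (a run of '1's followed by blank '0's):
# state A means "seen an even number of 1s so far", state B means "odd".
IS_EVEN_RULES = {
    ("A", "1"): ("B", "1", +1),
    ("B", "1"): ("A", "1", +1),
    ("A", "0"): ("ACCEPT", "0", +1),   # even count of 1s -> TRUE
    ("B", "0"): ("REJECT", "0", +1),   # odd count of 1s  -> FALSE
}

print(run_turing_machine(IS_EVEN_RULES, list("1111")))  # True  (four is even)
print(run_turing_machine(IS_EVEN_RULES, list("111")))   # False (three is odd)
```

The dictionary of rules plays the role of the "input of rules, usually in the form of tables" mentioned above; the tape stands in for memory and the moving head for the machine's read/write mechanism.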

Turing is so important to computer science that his name is also featured on the Turing Award and the Turing test. He contributed greatly to British code-breaking successes in the Second World War, and continued to design computers and software through the 1940s until his untimely death in 1954.

At a symposium on large-scale digital machinery in Cambridge, Turing said, "We are trying to build a machine to do all kinds of different things simply by programming rather than by the addition of extra apparatus."

In 1941, Konrad Zuse developed the world's first functional program-controlled computer, the Z3; in 1998, it was shown to be Turing-complete in principle.[11][12] Zuse was also noted for the S2 computing machine, considered the first process-controlled computer. He founded one of the earliest computer businesses in 1941, producing the Z4, which became the world's first commercial computer. In 1946, he designed the first high-level programming language, Plankalkül.[13] In 1969, Zuse suggested the concept of a computation-based universe in his book Rechnender Raum (Calculating Space).

In 1948, the first practical computer that could run stored programs, based on the Turing machine model, had been built: the Manchester Baby.

In 1950, Britain's National Physical Laboratory completed Pilot ACE, a small-scale programmable computer based on Turing's philosophy.[14]

7.3.3 Shannon and information theory

Up to and during the 1930s, electrical engineers were able to build electronic circuits to solve mathematical and logic problems, but most did so in an ad hoc manner, lacking any theoretical rigor. This changed with Claude Elwood Shannon's publication of his 1937 master's thesis, "A Symbolic Analysis of Relay and Switching Circuits". While taking an undergraduate philosophy class, Shannon had been exposed to Boole's work, and recognized that it could be used to arrange electromechanical relays (then used in telephone routing switches) to solve logic problems. This concept, of utilizing the properties of electrical switches to do logic, is the basic concept that underlies all electronic digital computers, and his thesis became the foundation of practical digital circuit design when it became widely known among the electrical engineering community during and after World War II.

Shannon went on to found the field of information theory with his 1948 paper titled "A Mathematical Theory of Communication", which applied probability theory to the problem of how to best encode the information a sender wants to transmit. This work is one of the theoretical foundations for many areas of study, including data compression and cryptography.
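The observation at the heart of Shannon's thesis is that switching circuits obey Boolean algebra: contacts in series behave like AND, contacts in parallel like OR. The short sketch below illustrates that correspondence; the two-switch lamp circuit is a generic textbook example, not one drawn from Shannon's thesis.

```python
# Switches in series conduct only if both are closed (Boolean AND);
# switches in parallel conduct if either is closed (Boolean OR).
def series(a, b):
    return a and b

def parallel(a, b):
    return a or b

# A toy two-way light circuit: the lamp is lit when the two wall switches are
# in the same position -- an XNOR built only from series, parallel and NOT.
def lamp(s1, s2):
    return parallel(series(s1, s2), series(not s1, not s2))

for s1 in (False, True):
    for s2 in (False, True):
        print(s1, s2, "->", lamp(s1, s2))
```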

7.3.4 Wiener and cybernetics

From experiments with anti-aircraft systems that interpreted radar images to detect enemy planes, Norbert Wiener coined the term cybernetics from the Greek word for "steersman". He published Cybernetics in 1948, which influenced artificial intelligence. Wiener also compared computation, computing machinery, memory devices, and other cognitive similarities with his analysis of brain waves.

The first actual computer bug was a moth. It was stuck in between the relays on the Harvard Mark II. While the invention of the term 'bug' is often but erroneously attributed to Grace Hopper, a future rear admiral in the U.S. Navy, who supposedly logged the "bug" on September 9, 1945, most other accounts conflict at least with these details. According to these accounts, the actual date was September 9, 1947, when operators filed this 'incident' along with the insect and the notation "First actual case of bug being found" (see software bug for details).

7.3.5 John von Neumann and the von Neumann architecture
In 1946, a model for computer architecture was introduced and became known as von Neumann architecture. Since 1950, the von Neumann model has provided uniformity in subsequent computer designs. The von Neumann architecture was considered innovative as it introduced the idea of allowing machine instructions and data to share memory space. The von Neumann model is composed of three major parts: the arithmetic logic unit (ALU), the memory, and the instruction processing unit (IPU). In von Neumann machine design, the IPU passes addresses to memory, and memory, in turn, is routed either back to the IPU if an instruction is being fetched or to the ALU if data is being fetched.[15]

Von Neumann's machine design uses a RISC (reduced instruction set computing) architecture, which means the instruction set uses a total of 21 instructions to perform all tasks. (This is in contrast to CISC, complex instruction set computing, instruction sets which have more instructions from which to choose.) With von Neumann architecture, main memory along with the accumulator (the register that holds the result of logical operations)[16] are the two memories that are addressed. Operations can be carried out as simple arithmetic (these are performed by the ALU and include addition, subtraction, multiplication and division), conditional branches (these are more commonly seen now as if statements or while loops; the branches serve as go to statements), and logical moves between the different components of the machine, i.e., a move from the accumulator to memory or vice versa. Von Neumann architecture accepts fractions and instructions as data types. Finally, as the von Neumann architecture is a simple one, its register management is also simple. The architecture uses a set of seven registers to manipulate and interpret fetched data and instructions. These registers include the IR (instruction register), IBR (instruction buffer register), MQ (multiplier quotient register), MAR (memory address register), and MDR (memory data register).[15] The architecture also uses a program counter (PC) to keep track of where in the program the machine is.[15]
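As a rough illustration of the fetch-decode-execute cycle implied by the IPU/memory/ALU description above, here is a minimal sketch of a von Neumann-style machine in which instructions and data share one memory. The three-instruction toy set, the encoding and the sample program are assumptions made for illustration; they are not the actual 21-instruction set or register file described in the sources.

```python
# A toy von Neumann-style machine: instructions and data share one memory,
# a program counter (PC) walks through it, and an accumulator (AC) holds
# ALU results. Instruction set, encoding and program are illustrative only.
LOAD, ADD, HALT = "LOAD", "ADD", "HALT"

def run(memory):
    pc, ac = 0, 0                      # program counter and accumulator
    while True:
        op, operand = memory[pc]       # fetch: the IPU passes the address to memory
        pc += 1
        if op == LOAD:                 # decode + execute
            ac = memory[operand]       # data fetched from shared memory to the ALU path
        elif op == ADD:
            ac = ac + memory[operand]  # ALU operation on accumulator and memory operand
        elif op == HALT:
            return ac

# Addresses 0-2 hold instructions, addresses 3-4 hold data: compute 2 + 40.
memory = {
    0: (LOAD, 3),
    1: (ADD, 4),
    2: (HALT, None),
    3: 2,
    4: 40,
}
print(run(memory))   # 42
```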

7.4 See also

Computer Museum
History of computing
History of computing hardware
History of software
List of computer term etymologies, the origins of computer science words
List of prominent pioneers in computer science
Timeline of algorithms
History of personal computers
Bugbook Historical Computer Museum

7.5 Notes

[1] "History of Computer Science"
[2] Ifrah 2001:11
[3] Bellos, Alex (2012-10-25). "Abacus adds up to number joy in Japan". The Guardian (London). Retrieved 2013-06-25.
[4] The Antikythera Mechanism Research Project, The Antikythera Mechanism Research Project. Retrieved 2007-07-01.
[5] "In search of lost time", Jo Marchant, Nature 444, #7119 (November 30, 2006), pp. 534-538, doi:10.1038/444534a, PMID 17136067.
[6] "History of Computing Science: The First Mechanical Calculator"
[7] Kidwell, Peggy Aldritch; Williams, Michael R. (1992). The Calculating Machines: Their History and Development. USA: Massachusetts Institute of Technology and Tomash Publishers, pp. 38-42, translated and edited from Martin, Ernst (1925). Die Rechenmaschinen und ihre Entwicklungsgeschichte. Germany: Pappenheim.
[8] "Charles Babbage". Encyclopedia Britannica Online Academic Edition. Encyclopedia Britannica Inc. Retrieved 2013-02-20.
[9] Isaacson, Betsy (2012-12-10). "Ada Lovelace, World's First Computer Programmer, Celebrated With Google Doodle". The Huffington Post (http://www.huffingtonpost.com/2012/12/10/google-doodle-ada-lovelace_n_2270668.html). Retrieved 2013-02-20.
[10] Barker-Plummer, David. "Turing Machines" (http://plato.stanford.edu/archives/win2012/entries/turing-machine/). The Stanford Encyclopedia of Philosophy. Retrieved 2013-02-20.
[11] Rojas, R. (1998). "How to make Zuse's Z3 a universal computer". IEEE Annals of the History of Computing 20 (3): 51-54. doi:10.1109/85.707574.
[12] Rojas, Raúl. "How to Make Zuse's Z3 a Universal Computer".
[13] Talk given by Horst Zuse to the Computer Conservation Society at the Science Museum (London) on 18 November 2010.
[14] "BBC News - How Alan Turing's Pilot ACE changed computing". BBC News. May 15, 2010.
[15] Cragon, Harvey G. (2000). Computer Architecture and Implementation. Cambridge: Cambridge University Press. pp. 113. ISBN 0521651689.
[16] "Accumulator" Def. 3. Oxford Dictionaries.


7.6 Sources
Ifrah, Georges (2001), The Universal History of Computing: From the Abacus to the Quantum Computer, New York: John Wiley & Sons, ISBN 0-471-39671-0

7.7 Further reading


Alan Turing
A Very Brief History of Computer Science
Computer History Museum
Computers: From the Past to the Present
The First Computer Bug at the Online Library of
the Naval Historical Center, retrieved February 28,
2006
Bitsavers, an effort to capture, salvage, and archive
historical computer software and manuals from
minicomputers and mainframes of the 1950s,
1960s, 1970s, and 1980s
Matti Tedre (2006). The Development of Computer
Science: A Sociocultural Perspective. Doctoral thesis
for University of Joensuu.

7.8 External links


Oral history interview with Albert H. Bowker at
Charles Babbage Institute, University of Minnesota.
Bowker discusses his role in the formation of the
Stanford University computer science department,
and his vision, as early as 1956, of computer science
as an academic discipline.
Oral history interview with Joseph F. Traub at
Charles Babbage Institute, University of Minnesota.
Traub discusses why computer science has developed as a discipline at institutions including Stanford, Berkeley, University of Pennsylvania, MIT,
and Carnegie-Mellon.
Oral history interview with Gene H. Golub at
Charles Babbage Institute, University of Minnesota.
Golub discusses his career in computer science at
Stanford University.
Oral history interview with John Herriot at Charles
Babbage Institute, University of Minnesota. Herriot
describes the early years of computing at Stanford
University, including formation of the computer science department, centering on the role of George
Forsythe.

Oral history interview with William F. Miller at
Charles Babbage Institute, University of Minnesota.
Miller contrasts the emergence of computer science
at Stanford with developments at Harvard and the
University of Pennsylvania.
Oral history interview with Alexandra Forsythe
at Charles Babbage Institute, University of Minnesota. Forsythe discusses the career of her husband, George Forsythe, who established Stanford
University's program in computer science.
Oral history interview with Allen Newell at Charles
Babbage Institute, University of Minnesota. Newell
discusses his entry into computer science, funding
for computer science departments and research, the
development of the Computer Science Department
at Carnegie Mellon University, including the work
of Alan J. Perlis and Raj Reddy, and the growth of
the computer science and artificial intelligence research communities. Compares computer science
programs at Stanford, MIT, and Carnegie Mellon.
Oral history interview with Louis Fein at Charles
Babbage Institute, University of Minnesota. Fein
discusses establishing computer science as an academic discipline at Stanford Research Institute
(SRI) as well as contacts with the University of
California, Berkeley, the University of North Carolina, Purdue, the International Federation for Information Processing, and other institutions.
Oral history interview with W. Richards Adrion at
Charles Babbage Institute, University of Minnesota.
Adrion gives a brief history of theoretical computer
science in the United States and NSF's role in funding that area during the 1970s and 1980s.
Oral history interview with Bernard A. Galler at
Charles Babbage Institute, University of Minnesota.
Galler describes the development of computer science at the University of Michigan from the 1950s
through the 1980s and discusses his own work in
computer science.
Michael S. Mahoney Papers at Charles Babbage Institute, University of Minnesota. Mahoney was the
preeminent historian of computer science as a distinct academic discipline. Papers contain 38 boxes
of books, serials, notes, and manuscripts related to
the history of computing, mathematics, and related
fields.
The Modern History of Computing entry by B. Jack
Copeland in the Stanford Encyclopedia of Philosophy

Chapter 8

History of operating systems


Computer operating systems (OSes) provide a set of functions needed and used by most application programs on a computer, and the linkages needed to control and synchronize computer hardware. On the first computers, with no operating system, every program needed the full hardware specification to run correctly and perform standard tasks, and its own drivers for peripheral devices like printers and punched paper card readers. The growing complexity of hardware and application programs eventually made operating systems a necessity for everyday use.

8.1 Background

The earliest computers were mainframes that lacked any form of operating system. Each user had sole use of the machine for a scheduled period of time and would arrive at the computer with program and data, often on punched paper cards and magnetic or paper tape. The program would be loaded into the machine, and the machine would be set to work until the program completed or crashed. Programs could generally be debugged via a control panel using toggle switches and panel lights. It is said that Alan Turing was a master of this on the early Manchester Mark 1 machine, and he was already deriving the primitive conception of an operating system from the principles of the Universal Turing machine.

Symbolic languages, assemblers, and compilers were developed for programmers to translate symbolic program code into machine code that previously would have been hand-encoded. Later machines came with libraries of support code on punched cards or magnetic tape, which would be linked to the user's program to assist in operations such as input and output. This was the genesis of the modern-day operating system. However, machines still ran a single job at a time. At Cambridge University in England the job queue was at one time a washing line from which tapes were hung with different colored clothes-pegs to indicate job priority.

As machines became more powerful, the time to run programs diminished and the time to hand off the equipment to the next user became very large by comparison.

Accounting for and paying for machine usage moved on from checking the wall clock to automatic logging by the computer. Run queues evolved from a literal queue of people at the door, to a heap of media on a jobs-waiting table, or batches of punch-cards stacked one on top of the other in the reader, until the machine itself was able to select and sequence which magnetic tape drives processed which tapes. Where program developers had originally had access to run their own jobs on the machine, they were supplanted by dedicated machine operators who looked after the well-being and maintenance of the machine and were less and less concerned with implementing tasks manually. When commercially available computer centers were faced with the implications of data lost through tampering or operational errors, equipment vendors were put under pressure to enhance the runtime libraries to prevent misuse of system resources. Automated monitoring was needed not just for CPU usage but for counting pages printed, cards punched, cards read, disk storage used and for signaling when operator intervention was required by jobs such as changing magnetic tapes and paper forms. Security features were added to operating systems to record audit trails of which programs were accessing which files and to prevent access to a production payroll file by an engineering program, for example.

All these features were building up towards the repertoire of a fully capable operating system. Eventually the runtime libraries became an amalgamated program that was started before the first customer job and could read in the customer job, control its execution, record its usage, reassign hardware resources after the job ended, and immediately go on to process the next job. These resident background programs, capable of managing multistep processes, were often called monitors or monitor-programs before the term OS established itself.

An underlying program offering basic hardware-management, software-scheduling and resource-monitoring may seem a remote ancestor to the user-oriented OSes of the personal computing era. But there has been a shift in the meaning of OS. Just as early automobiles lacked speedometers, radios, and air-conditioners which later became standard, more and more optional software features became standard features in every OS package, although some applications such as database management systems and spreadsheets remain optional and separately priced. This has led to the perception of an OS as a complete user-system with an integrated graphical user interface, utilities, some applications such as text editors and file managers, and configuration tools.

The true descendant of the early operating systems is what is now called the "kernel". In technical and development circles the old restricted sense of an OS persists because of the continued active development of embedded operating systems for all kinds of devices with a data-processing component, from hand-held gadgets up to industrial robots and real-time control systems, which do not run user applications at the front end. An embedded OS in a device today is not so far removed as one might think from its ancestors of the 1950s to the 1990s.

The broader categories of systems and application software are discussed in the computer software article.

8.2 Mainframes

The first operating system used for real work was GM-NAA I/O, produced in 1956 by General Motors' Research division[1] for its IBM 704.[2] Most other early operating systems for IBM mainframes were also produced by customers.[3]

Early operating systems were very diverse, with each vendor or customer producing one or more operating systems specific to their particular mainframe computer. Every operating system, even from the same vendor, could have radically different models of commands, operating procedures, and such facilities as debugging aids. Typically, each time the manufacturer brought out a new machine, there would be a new operating system, and most applications would have to be manually adjusted, recompiled, and retested.

8.2.1 Systems on IBM hardware

Main article: History of IBM mainframe operating systems

The state of affairs continued until the 1960s when IBM, already a leading hardware vendor, stopped work on existing systems and put all their effort into developing the System/360 series of machines, all of which used the same instruction and input/output architecture. IBM intended to develop a single operating system for the new hardware, the OS/360. The problems encountered in the development of the OS/360 are legendary, and are described by Fred Brooks in The Mythical Man-Month, a book that has become a classic of software engineering. Because of performance differences across the hardware range and delays with software development, a whole family of operating systems was introduced instead of a single OS/360.[4][5]

IBM wound up releasing a series of stop-gaps followed by two longer-lived operating systems:

OS/360 for mid-range and large systems. This was available in three system generation options:
  PCP for early users and for those without the resources for multiprogramming.
  MFT for mid-range systems, replaced by MFT-II in OS/360 Release 15/16. This had one successor, OS/VS1, which was discontinued in the 1980s.
  MVT for large systems. This was similar in most ways to PCP and MFT (most programs could be ported among the three without being re-compiled), but has more sophisticated memory management and a time-sharing facility, TSO. MVT had several successors including the current z/OS.

DOS/360 for small System/360 models had several successors including the current z/VSE. It was significantly different from OS/360.

IBM maintained full compatibility with the past, so that programs developed in the sixties can still run under z/VSE (if developed for DOS/360) or z/OS (if developed for MFT or MVT) with no change.

IBM also developed, but never officially released, TSS/360, a time-sharing system for the System/360 Model 67.

Several operating systems for the IBM S/360 and S/370 architectures were developed by third parties, including the Michigan Terminal System (MTS) and MUSIC/SP.

8.2.2 Other mainframe operating systems

Control Data Corporation developed the SCOPE operating systems[NB 1] in the 1960s for batch processing, and later developed the MACE operating system for time sharing, which was the basis for the later Kronos. In cooperation with the University of Minnesota, the Kronos and later the NOS operating systems were developed during the 1970s, which supported simultaneous batch and timesharing use. Like many commercial timesharing systems, its interface was an extension of the DTSS time sharing system, one of the pioneering efforts in timesharing and programming languages.

In the late 1970s, Control Data and the University of Illinois developed the PLATO system, which used plasma panel displays and long-distance time sharing networks. PLATO was remarkably innovative for its time; the shared memory model of PLATO's TUTOR programming language allowed applications such as real-time chat and multi-user graphical games.

For the UNIVAC 1107, UNIVAC, the first commercial computer manufacturer, produced the EXEC I operating system, and Computer Sciences Corporation developed the EXEC II operating system and delivered it to UNIVAC. EXEC II was ported to the UNIVAC 1108. Later, UNIVAC developed the EXEC 8 operating system for the 1108; it was the basis for operating systems for later members of the family. Like all early mainframe systems, EXEC I and EXEC II were batch-oriented systems that managed magnetic drums, disks, card readers and line printers; EXEC 8 supported both batch processing and on-line transaction processing. In the 1970s, UNIVAC produced the Real-Time Basic (RTB) system to support large-scale time sharing, also patterned after the Dartmouth BASIC system.

Burroughs Corporation introduced the B5000 in 1961 with the MCP (Master Control Program) operating system. The B5000 was a stack machine designed to exclusively support high-level languages, with no software, not even at the lowest level of the operating system, being written directly in machine language or assembly language; the MCP was the first OS to be written entirely in a high-level language, ESPOL, a dialect of ALGOL 60, although ESPOL had specialized statements for each syllable[6] in the B5000 instruction set. MCP also introduced many other ground-breaking innovations, such as being one of[NB 2] the first commercial implementations of virtual memory. The rewrite of MCP for the B6500 is still in use today in the Unisys ClearPath/MCP line of computers.

GE introduced the GE-600 series with the General Electric Comprehensive Operating Supervisor (GECOS) operating system in 1962. After Honeywell acquired GE's computer business, it was renamed to General Comprehensive Operating System (GCOS). Honeywell expanded the use of the GCOS name to cover all its operating systems in the 1970s, though many of its computers had nothing in common with the earlier GE 600 series and their operating systems were not derived from the original GECOS.

Project MAC at MIT, working with GE and Bell Labs, developed Multics, which introduced the concept of ringed security privilege levels.

Digital Equipment Corporation developed TOPS-10 for its PDP-10 line of 36-bit computers in 1967. Before the widespread use of Unix, TOPS-10 was a particularly popular system in universities, and in the early ARPANET community. Bolt, Beranek, and Newman developed TENEX for a modified PDP-10 that supported demand paging; this was another popular system in the research and ARPANET communities, and was later developed by DEC into TOPS-20.

Scientific Data Systems/Xerox Data Systems developed several operating systems for the Sigma series of computers, such as the Basic Control Monitor (BCM), Batch Processing Monitor (BPM), and Basic Time-Sharing Monitor (BTM). Later, BPM and BTM were succeeded by the Universal Time-Sharing System (UTS); it was designed to provide multi-programming services for online (interactive) user programs in addition to batch-mode production jobs. It was succeeded by the CP-V operating system, which combined UTS with the heavily batch-oriented Xerox Operating System (XOS).

8.3 Minicomputers and the rise of Unix

Digital Equipment Corporation created several operating systems for its 16-bit PDP-11 machines, including the simple RT-11 system, the time-sharing RSTS operating systems, and the RSX-11 family of real-time operating systems, as well as the VMS system for the 32-bit VAX machines.

Several competitors of Digital Equipment Corporation such as Data General, Hewlett-Packard, and Computer Automation created their own operating systems. One such, MAX III, was developed for Modular Computer Systems Modcomp II and Modcomp III computers. It was characterised by its target market being the industrial control market. The Fortran libraries included one that enabled access to measurement and control devices.

The Unix operating system was developed at AT&T Bell Laboratories in the late 1960s, originally for the PDP-7, and later for the PDP-11. Because it was essentially free in early editions, easily obtainable, and easily modified, it achieved wide acceptance. It also became a requirement within the Bell systems operating companies. Since it was written in the C language, when that language was ported to a new machine architecture, Unix was also able to be ported. This portability permitted it to become the choice for a second generation of minicomputers and the first generation of workstations. By widespread use it exemplified the idea of an operating system that was conceptually the same across various hardware platforms. It still was owned by AT&T Corporation, and that limited its use to groups or corporations who could afford to license it. It became one of the roots of the free software and open source movements. In 1991 Linus Torvalds began development of Linux, an open source kernel; a kernel was the only piece missing in the GNU operating system (developed by Richard Stallman). The combination of the Linux kernel and userland code from GNU and elsewhere produced the GNU/Linux operating system (see GNU/Linux naming controversy).

Another system which evolved in this time frame was the Pick operating system. The Pick system was developed and sold by Microdata Corporation, who created the precursors of the system. The system is an example of a system which started as a database application support program and graduated to system work.

8.4 Microcomputers: 8-bit home computers and game consoles

Beginning in the mid-1970s, a new class of small computers came onto the marketplace. Featuring 8-bit processors, typically the MOS Technology 6502, Intel 8080 or the Zilog Z-80, along with rudimentary input and output interfaces and as much RAM as practical, these systems started out as kit-based hobbyist computers but soon evolved into an essential business tool.

8.4.1 Home computers

While many 8-bit home computers of the 1980s, such as the Commodore 64, Apple II series, the Atari 8-bit, the Amstrad CPC, ZX Spectrum series and others could load a third-party disk-loading operating system, such as CP/M or GEOS, they were generally used without one. Their built-in operating systems were designed in an era when floppy disk drives were very expensive and not expected to be used by most users, so the standard storage device on most was a tape drive using standard compact cassettes. Most, if not all, of these computers shipped with a built-in BASIC interpreter on ROM, which also served as a crude command line interface, allowing the user to load a separate disk operating system to perform file management commands and load and save to disk. The most popular home computer, the Commodore 64, was a notable exception, as its DOS was on ROM in the disk drive hardware, and the drive was addressed identically to printers, modems, and other external devices.

More elaborate operating systems were not needed in part because most such machines were used for entertainment and education, and seldom used for more serious business or science purposes.

Another reason is that the hardware they evolved on initially shipped with minimal amounts of computer memory (4-8 kilobytes was standard on early home computers) as well as 8-bit processors without specialized support circuitry like an MMU or even a dedicated real-time clock. On this hardware, a complex operating system's overhead supporting multiple tasks and users would likely compromise the performance of the machine without really being needed.

Video games and even the available spreadsheet, database and word processors for home computers were mostly self-contained programs that took over the machine completely. Although integrated software existed for these computers, they usually lacked features compared to their standalone equivalents, largely due to memory limitations. Data exchange was mostly performed through standard formats like ASCII text or CSV, or through specialized file conversion programs.

8.4.2 Rise of OS in video games and consoles

Since virtually all video game consoles and arcade cabinets designed and built after 1980 were true digital machines (unlike the analog Pong clones and derivatives), some of them carried a minimal form of BIOS or built-in game, such as the ColecoVision, the Sega Master System and the SNK Neo Geo.

Modern-day game consoles and videogames, starting with the PC-Engine, all have a minimal BIOS that also provides some interactive utilities such as memory card management, audio or video CD playback, copy protection and sometimes carry libraries for developers to use, etc. Few of these cases, however, would qualify as a true operating system.

The most notable exceptions are probably the Dreamcast game console, which includes a minimal BIOS, like the PlayStation, but can load the Windows CE operating system from the game disk, allowing easy porting of games from the PC world, and the Xbox game console, which is little more than a disguised Intel-based PC running a secret, modified version of Microsoft Windows in the background. Furthermore, there are Linux versions that will run on a Dreamcast and later game consoles as well.

Long before that, Sony had released a kind of development kit called the Net Yaroze for its first PlayStation platform, which provided a series of programming and developing tools to be used with a normal PC and a specially modified "Black PlayStation" that could be interfaced with a PC and download programs from it. These operations require in general a functional OS on both platforms involved.

In general, it can be said that videogame consoles and arcade coin-operated machines used at most a built-in BIOS during the 1970s, 1980s and most of the 1990s, while from the PlayStation era and beyond they started getting more and more sophisticated, to the point of requiring a generic or custom-built OS for aiding in development and expandability.

8.5 Personal computer era

The development of microprocessors made inexpensive computing available for the small business and hobbyist, which in turn led to the widespread use of interchangeable hardware components using a common interconnection (such as the S-100, SS-50, Apple II, ISA, and PCI buses), and an increasing need for standard operating systems to control them. The most important of the early OSes on these machines was Digital Research's CP/M-80 for the 8080 / 8085 / Z-80 CPUs. It was based on several Digital Equipment Corporation operating systems, mostly for the PDP-11 architecture. Microsoft's first operating system, MDOS/MIDAS, was designed along many of the PDP-11 features, but for microprocessor based systems. MS-DOS, or PC DOS when supplied by IBM, was based originally on CP/M-80. Each of these machines had a small boot program in ROM which loaded the OS itself from disk. The BIOS on the IBM-PC class machines was an extension of this idea and has accreted more features and functions in the 20 years since the first IBM-PC was introduced in 1981.

The decreasing cost of display equipment and processors made it practical to provide graphical user interfaces for many operating systems, such as the generic X Window System that is provided with many Unix systems, or other graphical systems such as Microsoft Windows, the RadioShack Color Computer's OS-9 Level II/MultiVue, Commodore's AmigaOS, Atari TOS, Apple's Mac OS, or even IBM's OS/2. The original GUI was developed on the Xerox Alto computer system at Xerox Palo Alto Research Center in the early '70s and commercialized by many vendors throughout the 1980s and 1990s.

Since the late 1990s there have been three operating systems in widespread use on personal computers: Microsoft Windows, Apple Inc.'s OS X, and the open source Linux. Since 2005 and Apple's transition to Intel processors, all have been developed mainly on the x86 platform, although OS X retained PowerPC support until 2009 and Linux remains ported to a multitude of architectures including ones such as 68k, PA-RISC, and DEC Alpha, which have been long superseded and out of production, and SPARC and MIPS, which are used in servers or embedded systems but no longer used in desktop computers. Other operating systems such as AmigaOS and OS/2 remain in use, if at all, mainly by retrocomputing enthusiasts or for specialized embedded applications.

8.6 Rise of virtualization

Operating systems originally ran directly on the hardware itself and provided services to applications. With CP-67 on the IBM System/360 Model 67 and Virtual Machine Facility/370 (VM/370) on System/370, IBM introduced the notion of a virtual machine, where the operating system itself runs under the control of a hypervisor, instead of being in direct control of the hardware. VMware popularized this technology on personal computers. Over time, the line between virtual machines, monitors, and operating systems was blurred:

Hypervisors grew more complex, gaining their own application programming interface,[7] memory management or file system.[8]

Virtualization became a key feature of operating systems, as exemplified by Hyper-V in Windows Server 2008 or HP Integrity Virtual Machines in HP-UX.

In some systems, such as POWER5 and POWER6-based servers from IBM, the hypervisor is no longer optional.[9]

Applications have been re-designed to run directly on a virtual machine monitor.[10]

In many ways, virtual machine software today plays the role formerly held by the operating system, including managing the hardware resources (processor, memory, I/O devices), applying scheduling policies, or allowing system administrators to manage systems.

8.7 See also

Charles Babbage Institute
IT History Society
List of operating systems
Timeline of operating systems
History of computer icons

8.8 Notes

[NB 1] CDC used the SCOPE name for disparate operating systems on the upper 3000 series, the lower 3000 series, the 6000 series and the 7600.
[NB 2] The B5000 was contemporaneous with the Ferranti Atlas.

8.9 References

[1] See Rand Corporation publication by Robert Patrick.
[2] "Timeline of Computer History: 1956: Software". Computer History Museum. Retrieved 2008-05-25.
[3] "A Brief History of Linux"
[4] Johnston (April 1, 2005). "VSE: A Look at the Past 40 Years". z/Journal (Thomas Communications, Inc.) (April/May 2005).
[5] Chuck Boyer, The 360 Revolution
[6] A syllable in the B5000 could contain a 10-bit literal, an operand call, a descriptor call or a 10-bit opcode.
[7] "VMware API". VMware. Retrieved 26 November 2008.
[8] "VMware file system". Retrieved 26 November 2008.
[9] "PowerVM Virtualization on IBM System p: Introduction and Configuration". Retrieved 26 November 2008.
[10] "JRockit's Liquid VM could be the first real Java OS". Retrieved 26 November 2008.


8.10 Further reading


Neal Stephenson (1999). In the Beginning... Was
the Command Line. Harper Perennial. ISBN 0-38081593-1.


Chapter 9

History of programming languages


This article discusses the major developments in the history of programming languages. For a detailed timeline of events, see: Timeline of programming languages.

9.1 Early history

During a nine-month period in 1840-1843, Ada Lovelace translated the memoir of Italian mathematician Luigi Menabrea about Charles Babbage's newest proposed machine, the Analytical Engine. With the article she appended a set of notes which specified in complete detail a method for calculating Bernoulli numbers with the Analytical Engine, recognized by some historians as the world's first computer program.[1]

Herman Hollerith realized that he could encode information on punch cards when he observed that train conductors encode the appearance of the ticket holders on the train tickets using the position of punched holes on the tickets. Hollerith then encoded the 1890 census data on punch cards.

The first computer codes were specialized for their applications. In the first decades of the 20th century, numerical calculations were based on decimal numbers. Eventually it was realized that logic could be represented with numbers, not only with words. For example, Alonzo Church was able to express the lambda calculus in a formulaic way. The Turing machine was an abstraction of the operation of a tape-marking machine, for example, in use at the telephone companies. Turing machines set the basis for storage of programs as data in the von Neumann architecture of computers by representing a machine through a finite number. However, unlike the lambda calculus, Turing's code does not serve well as a basis for higher-level languages; its principal use is in rigorous analyses of algorithmic complexity.

Like many "firsts" in history, the first modern programming language is hard to identify. From the start, the restrictions of the hardware defined the language. Punch cards allowed 80 columns, but some of the columns had to be used for a sorting number on each card. FORTRAN included some keywords which were the same as English words, such as IF, GOTO (go to) and CONTINUE. The use of a magnetic drum for memory meant that computer programs also had to be interleaved with the rotations of the drum. Thus the programs were more hardware-dependent.

To some people, what was the first modern programming language depends on how much power and human-readability is required before the status of "programming language" is granted. Jacquard looms and Charles Babbage's Difference Engine both had simple, extremely limited languages for describing the actions that these machines should perform. One can even regard the punch holes on a player piano scroll as a limited domain-specific language, albeit not designed for human consumption.

9.2 First programming languages

In the 1940s, the first recognizably modern electrically powered computers were created. The limited speed and memory capacity forced programmers to write hand-tuned assembly language programs. It was eventually realized that programming in assembly language required a great deal of intellectual effort and was error-prone.

The first programming languages designed to communicate instructions to a computer were written in the 1950s. An early high-level programming language to be designed for a computer was Plankalkül, developed for the German Z3 by Konrad Zuse between 1943 and 1945.[2] However, it was not implemented until 1998 and 2000.

John Mauchly's Short Code, proposed in 1949, was one of the first high-level languages ever developed for an electronic computer.[3] Unlike machine code, Short Code statements represented mathematical expressions in understandable form. However, the program had to be translated into machine code every time it ran, making the process much slower than running the equivalent machine code.

At the University of Manchester, Alick Glennie developed Autocode in the early 1950s. A programming language, it used a compiler to automatically convert the language into machine code. The first code and compiler was developed in 1952 for the Mark 1 computer at the University of Manchester and is considered to be the first compiled high-level programming language.[4][5]
[Image: The Manchester Mark 1 ran programs written in Autocode from 1952.]

The second autocode was developed for the Mark 1 by R. A. Brooker in 1954 and was called the "Mark 1 Autocode". Brooker also developed an autocode for the Ferranti Mercury in the 1950s in conjunction with the University of Manchester. The version for the EDSAC 2 was devised by D. F. Hartley of the University of Cambridge Mathematical Laboratory in 1961. Known as EDSAC 2 Autocode, it was a straight development from Mercury Autocode adapted for local circumstances, and was noted for its object code optimisation and source-language diagnostics which were advanced for the time. A contemporary but separate thread of development, Atlas Autocode was developed for the University of Manchester Atlas 1 machine.

Another early programming language was devised by Grace Hopper in the US, called FLOW-MATIC. It was developed for the UNIVAC I at Remington Rand during the period from 1955 until 1959. Hopper found that business data processing customers were uncomfortable with mathematical notation, and in early 1955, she and her team wrote a specification for an English programming language and implemented a prototype.[6] The FLOW-MATIC compiler became publicly available in early 1958 and was substantially complete in 1959.[7] Flow-Matic was a major influence in the design of COBOL, since only it and its direct descendent AIMACO were in actual use at the time.[8] The language Fortran was developed at IBM in the mid 1950s, and became the first widely used high-level general purpose programming language.

Other languages still in use today include LISP (1958), invented by John McCarthy, and COBOL (1959), created by the Short Range Committee. Another milestone in the late 1950s was the publication, by a committee of American and European computer scientists, of "a new language for algorithms": the ALGOL 60 Report (the "ALGOrithmic Language"). This report consolidated many ideas circulating at the time and featured three key language innovations:

nested block structure: code sequences and associated declarations could be grouped into blocks without having to be turned into separate, explicitly named procedures;

lexical scoping: a block could have its own private variables, procedures and functions, invisible to code outside that block, that is, information hiding.

Another innovation, related to this, was in how the language was described:

a mathematically exact notation, Backus-Naur Form (BNF), was used to describe the language's syntax. Nearly all subsequent programming languages have used a variant of BNF to describe the context-free portion of their syntax.
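To illustrate what a BNF description provides, the sketch below pairs a tiny BNF grammar for arithmetic expressions (shown in the comments) with a recursive-descent recognizer that follows it rule for rule. The grammar is a generic textbook example, not one taken from the ALGOL 60 Report.

```python
# A tiny BNF grammar (illustrative, not from the ALGOL 60 Report):
#   <expr>   ::= <term> | <term> "+" <expr>
#   <term>   ::= <factor> | <factor> "*" <term>
#   <factor> ::= <digit> | "(" <expr> ")"
#   <digit>  ::= "0" | "1" | ... | "9"
# Each nonterminal becomes one parsing function; each "|" becomes a branch.

def parse(text):
    pos = 0

    def peek():
        return text[pos] if pos < len(text) else None

    def eat(ch):
        nonlocal pos
        if peek() != ch:
            raise SyntaxError(f"expected {ch!r} at position {pos}")
        pos += 1

    def expr():
        term()
        if peek() == "+":
            eat("+")
            expr()

    def term():
        factor()
        if peek() == "*":
            eat("*")
            term()

    def factor():
        if peek() == "(":
            eat("(")
            expr()
            eat(")")
        elif peek() is not None and peek().isdigit():
            eat(peek())
        else:
            raise SyntaxError(f"unexpected {peek()!r} at position {pos}")

    expr()
    if pos != len(text):
        raise SyntaxError("trailing input")

parse("1+2*(3+4)")   # accepted: conforms to the grammar
# parse("1+*2")      # would raise SyntaxError: not derivable from <expr>
```

Each nonterminal of the grammar becomes one function, which is essentially how BNF-described syntax was, and still is, turned into parsers.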
Algol 60 was particularly influential in the design of later languages, some of which soon became more popular. The Burroughs large systems were designed to be programmed in an extended subset of Algol.

Algol's key ideas were continued, producing ALGOL 68:

syntax and semantics became even more orthogonal, with anonymous routines, a recursive typing system with higher-order functions, etc.;

not only the context-free part, but the full language syntax and semantics were defined formally, in terms of Van Wijngaarden grammar, a formalism designed specifically for this purpose.

Algol 68's many little-used language features (for example, concurrent and parallel blocks) and its complex system of syntactic shortcuts and automatic type coercions made it unpopular with implementers and gained it a reputation of being difficult. Niklaus Wirth actually walked out of the design committee to create the simpler Pascal language.

Some important languages that were developed in this period include:

9.3 Establishing fundamental paradigms

The period from the late 1960s to the late 1970s brought a major flowering of programming languages. Most of the major language paradigms now in use were invented in this period:

Simula, invented in the late 1960s by Nygaard and Dahl as a superset of Algol 60, was the first language designed to support object-oriented programming.

90

CHAPTER 9. HISTORY OF PROGRAMMING LANGUAGES

C, an early systems programming language, was de- One important new trend in language design was an inveloped by Dennis Ritchie and Ken Thompson at creased focus on programming for large-scale systems
Bell Labs between 1969 and 1973.
through the use of modules, or large-scale organizational
units of code. Modula, Ada, and ML all developed no Smalltalk (mid-1970s) provided a complete table module systems in the 1980s. Module systems
ground-up design of an object-oriented language.
were often wedded to generic programming constructs--generics being, in essence, parametrized modules (see
Prolog, designed in 1972 by Colmerauer, Roussel, also polymorphism in object-oriented programming).
and Kowalski, was the rst logic programming lanAlthough major new paradigms for imperative programguage.
ming languages did not appear, many researchers ex ML built a polymorphic type system (invented by panded on the ideas of prior languages and adapted them
Robin Milner in 1973) on top of Lisp , pioneering to new contexts. For example, the languages of the Argus
statically typed functional programming languages. and Emerald systems adapted object-oriented programming to distributed systems.
Each of these languages spawned an entire family of de- The 1980s also brought advances in programming lanscendants, and most modern languages count at least one guage implementation. The RISC movement in computer
architecture postulated that hardware should be designed
of them in their ancestry.
The 1960s and 1970s also saw considerable debate over the merits of "structured programming", which essentially meant programming without the use of Goto. This debate was closely related to language design: some languages did not include GOTO, which forced structured programming on the programmer. Although the debate raged hotly at the time, nearly all programmers now agree that, even in languages that provide GOTO, it is bad programming style to use it except in rare circumstances. As a result, later generations of language designers have found the structured programming debate tedious and even bewildering.
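As a brief illustration of what the debate was about (this example is not from the original text, and C is used only because it offers both styles), the same loop can be written with explicit jumps or with a structured control construct:

    #include <stdio.h>

    /* Unstructured style: control flow expressed with jumps. */
    static int sum_with_goto(const int *a, int n) {
        int i = 0, total = 0;
    loop:
        if (i >= n) goto done;
        total += a[i];
        i++;
        goto loop;
    done:
        return total;
    }

    /* Structured style: the same computation with a loop construct and no jumps. */
    static int sum_structured(const int *a, int n) {
        int total = 0;
        for (int i = 0; i < n; i++)
            total += a[i];
        return total;
    }

    int main(void) {
        int data[] = {1, 2, 3, 4};
        printf("%d %d\n", sum_with_goto(data, 4), sum_structured(data, 4));
        return 0;
    }

Structured languages encourage the second form; languages that omit GOTO make it the only option.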

To provide even faster compile times, some languages were structured for "one-pass compilers" which expect subordinate routines to be defined first, as with Pascal, where the main routine, or driver function, is the final section of the program listing.
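To make the idea concrete, here is a small sketch (not from the original text, and in C rather than Pascal) laid out the way a one-pass compiler prefers: every routine is defined before it is called, and the driver comes last, so a single top-to-bottom read resolves every name.

    #include <stdio.h>

    /* Subordinate routine defined first, so the compiler already
       knows it when it reaches the caller below. */
    static int square(int x) {
        return x * x;
    }

    /* The "driver" is the final section of the listing, mirroring
       the layout of a classic Pascal program. */
    int main(void) {
        printf("%d\n", square(7));
        return 0;
    }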
Some important languages that were developed in this period include:

9.4 1980s: consolidation, modules, performance

The 1980s were years of relative consolidation in imperative languages. Rather than inventing new paradigms, all of these movements elaborated upon the ideas invented in the previous decade. C++ combined object-oriented and systems programming. The United States government standardized Ada, a systems programming language intended for use by defense contractors. In Japan and elsewhere, vast sums were spent investigating so-called fifth-generation programming languages that incorporated logic programming constructs. The functional languages community moved to standardize ML and Lisp. Research in Miranda, a functional language with lazy evaluation, began to take hold in this decade.
One important new trend in language design was an increased focus on programming for large-scale systems through the use of modules, or large-scale organizational units of code. Modula, Ada, and ML all developed notable module systems in the 1980s. Module systems were often wedded to generic programming constructs, generics being, in essence, parametrized modules (see also polymorphism in object-oriented programming).
Although major new paradigms for imperative programming languages did not appear, many researchers expanded on the ideas of prior languages and adapted them to new contexts. For example, the languages of the Argus and Emerald systems adapted object-oriented programming to distributed systems.
The 1980s also brought advances in programming language implementation. The RISC movement in computer architecture postulated that hardware should be designed for compilers rather than for human assembly programmers. Aided by processor speed improvements that enabled increasingly aggressive compilation techniques, the RISC movement sparked greater interest in compilation technology for high-level languages.
Language technology continued along these lines well into the 1990s.
Some important languages that were developed in this period include:

9.5 1990s: the Internet age

The rapid growth of the Internet in the mid-1990s was the next major historic event in programming languages. By opening up a radically new platform for computer systems, the Internet created an opportunity for new languages to be adopted. In particular, the JavaScript programming language rose to popularity because of its early integration with the Netscape Navigator web browser. Various other scripting languages, such as PHP, achieved widespread use in developing customized applications for web servers. The 1990s saw no fundamental novelty in imperative languages, but much recombination and maturation of old ideas. This era began the spread of functional languages. A big driving philosophy was programmer productivity. Many rapid application development (RAD) languages emerged, which usually came with an IDE and garbage collection, and were descendants of older languages. All such languages were object-oriented. These included Object Pascal, Visual Basic, and Java. Java in particular received much attention. More radical and innovative than the RAD languages were the new scripting languages. These did not directly descend from other languages and featured new syntaxes and more liberal incorporation of features. Many consider these scripting languages to be more productive than even the RAD languages, but often because of choices that make small programs simpler but large programs more difficult to write and maintain. Nevertheless, scripting languages came to be the most prominent ones used in connection with the Web.
Some important languages that were developed in this period include:

9.6 Current trends

Programming language evolution continues, in both industry and research. Some of the current trends include:

Increasing support for functional programming in mainstream languages used commercially, including pure functional programming for making code easier to reason about and easier to parallelise (at both micro- and macro-levels).
Constructs to support concurrent and distributed programming.
Mechanisms for adding security and reliability verification to the language: extended static checking, dependent typing, information flow control, static thread safety.
Alternative mechanisms for modularity: mixins, delegates, aspects.
Component-oriented software development.
Metaprogramming, reflection or access to the abstract syntax tree.
Increased emphasis on distribution and mobility.
Integration with databases, including XML and relational databases.
Support for Unicode so that source code (program text) is not restricted to those characters contained in the ASCII character set, allowing, for example, use of non-Latin-based scripts or extended punctuation.
XML for graphical interfaces (XUL, XAML).
Open source as a developmental philosophy for languages, including the GNU Compiler Collection and recent languages such as Python, Ruby, and Squeak.
AOP, or aspect-oriented programming, allowing developers to attach extended behaviors to designated places in the code.
Massively parallel languages for coding GPUs (graphics processing units with thousands of processors) and supercomputer arrays, including OpenCL.

Some important languages developed during this period include:

9.7 Prominent people

Some key people who helped develop programming languages (in alphabetical order):

Joe Armstrong, creator of Erlang.
John Backus, inventor of Fortran.
Alan Cooper, developer of Visual Basic.
Edsger W. Dijkstra, developed the framework for structured programming.
Jean-Yves Girard, co-inventor of the polymorphic lambda calculus (System F).
James Gosling, developer of Oak, the precursor of Java.
Anders Hejlsberg, developer of Turbo Pascal, Delphi and C#.
Rich Hickey, creator of Clojure.
Grace Hopper, developer of Flow-Matic, influencing COBOL.
Jean Ichbiah, chief designer of Ada (Ada 83).
Kenneth E. Iverson, developer of APL, and co-developer of J along with Roger Hui.
Alan Kay, pioneering work on object-oriented programming, and originator of Smalltalk.
Brian Kernighan, co-author of the first book on the C programming language with Dennis Ritchie, and co-author of the AWK and AMPL programming languages.
Yukihiro Matsumoto, creator of Ruby.
John McCarthy, inventor of LISP.
Bertrand Meyer, inventor of Eiffel.
Robin Milner, inventor of ML, and sharing credit for Hindley-Milner polymorphic type inference.
John von Neumann, originator of the operating system concept.
Martin Odersky, creator of Scala, and previously a contributor to the design of Java.
John C. Reynolds, co-inventor of the polymorphic lambda calculus (System F).
Dennis Ritchie, inventor of C, and co-creator of the Unix and Plan 9 operating systems.
Nathaniel Rochester, inventor of the first assembler (IBM 701).
Guido van Rossum, creator of Python.


Bjarne Stroustrup, developer of C++.


Ken Thompson, inventor of B, co-creator of the Go programming language and the Inferno operating system, and co-author of the Unix operating system.
Larry Wall, creator of the Perl programming language (see Perl and Perl 6).
Niklaus Wirth, inventor of Pascal, Modula and
Oberon.
Stephen Wolfram, creator of Mathematica.

9.8 See also


9.9 References
[1] J. Fuegi and J. Francis (October-December 2003), "Lovelace & Babbage and the creation of the 1843 'notes'", Annals of the History of Computing 25 (4): 16, 19, 25, doi:10.1109/MAHC.2003.1253887
[2] Rojas, Raúl, et al. (2000). "Plankalkül: The First High-Level Programming Language and its Implementation". Institut für Informatik, Freie Universität Berlin, Technical Report B-3/2000. (full text)
[3] Sebesta, W. S. Concepts of Programming Languages. 2006, p. 44. ISBN 0-321-33025-0
[4] Knuth, Donald E.; Pardo, Luis Trabb. "Early development of programming languages". Encyclopedia of Computer Science and Technology (Marcel Dekker) 7: 419-493.
[5] Peter J. Bentley (2012). Digitized: The Science of Computers and How It Shapes Our World. Oxford University Press. p. 87.
[6] Hopper (1978) p. 16.
[7] Sammet (1969) p. 316.
[8] Sammet (1978) p. 204.

9.10 Further reading


Rosen, Saul (editor), Programming Systems and Languages, McGraw-Hill, 1967.
Sammet, Jean E., Programming Languages: History and Fundamentals, Prentice-Hall, 1969.
Sammet, Jean E. (July 1972). "Programming Languages: History and Future". Communications of the ACM 15 (7): 601-610. doi:10.1145/361454.361485.
Richard L. Wexelblat (ed.): History of Programming Languages, Academic Press, 1981.
Thomas J. Bergin and Richard G. Gibson (eds.): History of Programming Languages, Addison-Wesley, 1996.

9.11 External links


History and evolution of programming languages.
Graph of programming language history

Chapter 10

History of software engineering


From its beginnings in the 1940s, writing software has evolved into a profession concerned with how best to maximize the quality of software and how to create it. Quality can refer to how maintainable software is, to its stability, speed, usability, testability, readability, size, cost, security, and number of flaws or bugs, as well as to less measurable qualities like elegance, conciseness, and customer satisfaction, among many other attributes. How best to create high quality software is a separate and controversial problem covering software design principles, so-called "best practices" for writing code, as well as broader management issues such as optimal team size, process, how best to deliver software on time and as quickly as possible, work-place culture, hiring practices, and so forth. All this falls under the broad rubric of software engineering.

10.1 Overview

There are a number of areas where the evolution of software engineering is notable:

Emergence as a profession: By the early 1980s,[1] software engineering had already emerged as a bona fide profession, to stand beside computer science and traditional engineering. See also software engineering professionalism.

Role of women: In the 1940s, 1950s, and 1960s, men often filled the more prestigious and better paying hardware engineering roles, but often delegated the writing of software to women. Grace Hopper, Jamie Fenton and many other unsung women filled many computer programming jobs during the first several decades of software engineering. Today, fewer women work in software engineering than in other professions, a situation whose cause is not clearly identified. It is often attributed to sexual discrimination, cyberculture or bias in education. Many academic and professional organizations consider this situation unbalanced and are trying hard to solve it.

Processes: Processes have become a big part of software engineering and are hailed for their potential to improve software and sharply criticized for their potential to constrict programmers.

Cost of hardware: The relative cost of software versus hardware has changed substantially over the last 50 years. When mainframes were expensive and required large support staffs, the few organizations buying them also had the resources to fund large, expensive custom software engineering projects. Computers are now much more numerous and much more powerful, which has several effects on software. The larger market can support large projects to create commercial off-the-shelf software, as done by companies such as Microsoft. The cheap machines allow each programmer to have a terminal capable of fairly rapid compilation. The programs in question can use techniques such as garbage collection, which make them easier and faster for the programmer to write. On the other hand, many fewer organizations are interested in employing programmers for large custom software projects, instead using commercial off-the-shelf software as much as possible.

10.2 The Pioneering Era

The most important development was that new computers were coming out almost every year or two, rendering existing ones obsolete. Software people had to rewrite all their programs to run on these new machines. Programmers did not have computers on their desks and had to go to the machine room. Jobs were run by signing up for machine time or by operational staff. Jobs were run by putting punched cards for input into the machine's card reader and waiting for results to come back on the printer.
The field was so new that the idea of management by schedule was non-existent. Making predictions of a project's completion date was almost impossible. Computer hardware was application-specific. Scientific and business tasks needed different machines. Due to the need to frequently translate old software to meet the needs of new machines, high-order languages like FORTRAN, COBOL, and ALGOL were developed. Hardware vendors gave away systems software for free, as hardware could not be sold without software. A few companies sold the service of building custom software, but no software companies were selling packaged software.
The notion of reuse flourished. As software was free, user organizations commonly gave it away. Groups like IBM's scientific user group SHARE offered catalogs of reusable components. Academia did not yet teach the principles of computer science. Modular programming and data abstraction were already being used in programming.

10.3 1945 to 1965: The Origins

The term software engineering, coined by Margaret Hamilton,[2] first appeared in the late 1950s and early 1960s. Programmers have always known about civil, electrical, and computer engineering and debated what engineering might mean for software.
The NATO Science Committee sponsored two conferences[3] on software engineering in 1968 (Garmisch, Germany; see conference report) and 1969, which gave the field its initial boost. Many believe these conferences marked the official start of the profession of software engineering.

10.4 1965 to 1985: The Software Crisis

Software engineering was spurred by the so-called software crisis of the 1960s, 1970s, and 1980s, which identified many of the problems of software development. Many software projects ran over budget and schedule. Some projects caused property damage. A few projects caused loss of life.[4] The software crisis was originally defined in terms of productivity, but evolved to emphasize quality. Some used the term software crisis to refer to their inability to hire enough qualified programmers.

Cost and Budget Overruns: The OS/360 operating system was a classic example. This decade-long project from the 1960s eventually produced one of the most complex software systems at the time. OS/360 was one of the first large (1000 programmers) software projects. Fred Brooks claims in The Mythical Man-Month that he made a multimillion dollar mistake of not developing a coherent architecture before starting development.

Property Damage: Software defects can cause property damage. Poor software security allows hackers to steal identities, costing time, money, and reputations.

Life and Death: Software defects can kill. Some embedded systems used in radiotherapy machines failed so catastrophically that they administered lethal doses of radiation to patients. The most famous of these failures is the Therac-25 incident.

Peter G. Neumann has kept a contemporary list of software problems and disasters.[5] The software crisis has been fading from view, because it is psychologically extremely difficult to remain in crisis mode for a protracted period (more than 20 years). Nevertheless, software, especially real-time embedded software, remains risky and is pervasive, and it is crucial not to give in to complacency. Over the last 10 to 15 years Michael A. Jackson has written extensively about the nature of software engineering, has identified the main source of its difficulties as lack of specialization, and has suggested that his problem frames provide the basis for a "normal practice" of software engineering, a prerequisite if software engineering is to become an engineering science. (Michael Jackson, "Engineering and Software Engineering" in S. Nanz (ed.), The Future of Software Engineering, Springer Verlag, 2010; Michael Jackson, Problem Frames: Analysing and Structuring Software Development Problems, Addison-Wesley, 2001.)

10.5 1985 to 1989: No Silver Bullet

For decades, solving the software crisis was paramount to researchers and companies producing software tools. The cost of owning and maintaining software in the 1980s was twice as expensive as developing the software. During the 1990s, the cost of ownership and maintenance increased by 30% over the 1980s. In 1995, statistics showed that half of surveyed development projects were operational, but were not considered successful. The average software project overshoots its schedule by half. Three-quarters of all large software products delivered to the customer are failures that are either not used at all, or do not meet the customer's requirements.

10.5.1 Software projects

Seemingly, every new technology and practice from the 1970s to the 1990s was trumpeted as a silver bullet to solve the software crisis. Tools, discipline, formal methods, process, and professionalism were touted as silver bullets:

Tools: Especially emphasized were tools: structured programming, object-oriented programming, CASE tools such as ICL's CADES CASE system, Ada, documentation, and standards were touted as silver bullets.


Discipline: Some pundits argued that the software crisis was due to the lack of discipline of programmers.

Formal methods: Some believed that if formal engineering methodologies would be applied to software development, then production of software would become as predictable an industry as other branches of engineering. They advocated proving all programs correct.

Process: Many advocated the use of defined processes and methodologies like the Capability Maturity Model.

Professionalism: This led to work on a code of ethics, licenses, and professionalism.

In 1986, Fred Brooks published his No Silver Bullet article, arguing that no individual technology or practice would ever make a 10-fold improvement in productivity within 10 years.
Debate about silver bullets raged over the following decade. Advocates for Ada, components, and processes continued arguing for years that their favorite technology would be a silver bullet. Skeptics disagreed. Eventually, almost everyone accepted that no silver bullet would ever be found. Yet, claims about silver bullets pop up now and again, even today.
Some interpret no silver bullet to mean that software engineering failed. However, with further reading, Brooks goes on to say, "We will surely make substantial progress over the next 40 years; an order of magnitude over 40 years is hardly magical ...."
The search for a single key to success never worked. All known technologies and practices have only made incremental improvements to productivity and quality. Yet, there are no silver bullets for any other profession, either. Others interpret no silver bullet as proof that software engineering has finally matured and recognized that projects succeed due to hard work.
However, it could also be said that there are, in fact, a range of silver bullets today, including lightweight methodologies (see "Project management"), spreadsheet calculators, customized browsers, in-site search engines, database report generators, integrated design-test coding-editors with memory/differences/undo, and specialty shops that generate niche software, such as information websites, at a fraction of the cost of totally customized website development. Nevertheless, the field of software engineering appears too complex and diverse for a single silver bullet to improve most issues, and each issue accounts for only a small portion of all software problems.

10.6 1990 to 1999: Prominence of the Internet

The rise of the Internet led to very rapid growth in the demand for international information display/e-mail systems on the World Wide Web. Programmers were required to handle illustrations, maps, photographs, and other images, plus simple animation, at a rate never before seen, with few well-known methods to optimize image display/storage (such as the use of thumbnail images).
The growth of browser usage, running on the HTML language, changed the way in which information-display and retrieval was organized. The widespread network connections led to the growth and prevention of international computer viruses on MS Windows computers, and the vast proliferation of spam e-mail became a major design issue in e-mail systems, flooding communication channels and requiring semi-automated pre-screening. Keyword-search systems evolved into web-based search engines, and many software systems had to be re-designed, for international searching, depending on search engine optimization (SEO) techniques. Human natural-language translation systems were needed to attempt to translate the information flow in multiple foreign languages, with many software systems being designed for multi-language usage, based on design concepts from human translators. Typical computer-user bases went from hundreds, or thousands, of users to, often, many millions of international users.

10.7 2000 to Present: Lightweight Methodologies

With the expanding demand for software in many smaller organizations, the need for inexpensive software solutions led to the growth of simpler, faster methodologies that developed running software, from requirements to deployment, quicker and easier. The use of rapid-prototyping evolved to entire lightweight methodologies, such as Extreme Programming (XP), which attempted to simplify many areas of software engineering, including requirements gathering and reliability testing for the growing, vast number of small software systems. Very large software systems still used heavily-documented methodologies, with many volumes in the documentation set; however, smaller systems had a simpler, faster alternative approach to managing the development and maintenance of software calculations and algorithms, information storage/retrieval and display.

10.7.1 Current Trends in Software Engineering

Software engineering is a young discipline, and is still developing. The directions in which software engineering is developing include:
Aspects: aspects help software engineers deal with quality attributes by providing tools to add or remove boilerplate code from many areas in the source code. Aspects describe how all objects or functions should behave in particular circumstances. For example, aspects can add debugging, logging, or locking control into all objects of particular types. Researchers are currently working to understand how to use aspects to design general-purpose code. Related concepts include generative programming and templates.

Agile: agile software development guides software development projects that evolve rapidly with changing expectations and competitive markets. Proponents of this method believe that heavy, document-driven processes (like TickIT, CMM and ISO 9000) are fading in importance. Some people believe that companies and agencies export many of the jobs that can be guided by heavy-weight processes. Related concepts include extreme programming, scrum, and lean software development.

Experimental: experimental software engineering is a branch of software engineering interested in devising experiments on software, in collecting data from the experiments, and in devising laws and theories from this data. Proponents of this method advocate that the nature of software is such that we can advance the knowledge on software through experiments only.

Model-driven: model-driven design develops textual and graphical models as primary design artifacts. Development tools are available that use model transformation and code generation to generate well-organized code fragments that serve as a basis for producing complete applications.

Software product lines: software product lines are a systematic way to produce families of software systems, instead of creating a succession of completely individual products. This method emphasizes extensive, systematic, formal code reuse, to try to industrialize the software development process.
The Future of Software Engineering conference (FOSE),[6] held at ICSE 2000, documented the state of the art of SE in 2000 and listed many problems to be solved over the next decade. The FOSE tracks at the ICSE 2000[7] and the ICSE 2007[8] conferences also help identify the state of the art in software engineering.

10.7.2 Software engineering today

The profession is trying to define its boundary and content. The Software Engineering Body of Knowledge SWEBOK has been tabled as an ISO standard during 2006 (ISO/IEC TR 19759).
In 2006, Money Magazine and Salary.com rated software engineering as the best job in America in terms of growth, pay, stress levels, flexibility in hours and working environment, creativity, and how easy it is to enter and advance in the field.[9]

10.8 Prominent Figures in the History of Software Engineering

Charles Bachman (born 1924) is particularly known for his work in the area of databases.
Laszlo Belady (born 1928), the editor-in-chief of the IEEE Transactions on Software Engineering in the 1980s.
Fred Brooks (born 1931), best known for managing the development of OS/360.
Peter Chen, known for the development of entity-relationship modeling.
Edsger Dijkstra (1930-2002), developed the framework for proper programming.
David Parnas (born 1941), developed the concept of information hiding in modular programming.
Michael A. Jackson (born 1936), software engineering methodologist responsible for the JSP method of program design, the JSD method of system development (with John Cameron), and the Problem Frames method for analysing and structuring software development problems.

10.9 See also


History of software
History of computer science
History of programming languages

10.10 References

[1] "Software engineering ... has recently emerged as a discipline in its own right." Sommerville, Ian (1985) [1982]. Software Engineering. Addison-Wesley. ISBN 0-201-14229-5.


[2] Rayl, A.J.S. (October 16, 2008). "NASA Engineers and Scientists - Transforming Dreams Into Reality". http://www.nasa.gov/index.html. NASA. Retrieved December 27, 2014.
[3] The NATO Software Engineering Conferences
[4] Therac-25
[5] Computer Risks
[6] Future of Software Engineering
[7] ICSE 2000
[8] ICSE 2007
[9] Kalwarski, Tara; Daphne Mosher, Janet Paskin and Donna Rosato (2006). "Best Jobs in America". MONEY Magazine. CNN. Retrieved 2006-04-20. "MONEY Magazine and Salary.com researched hundreds of jobs, considering their growth, pay, stress-levels and other factors. These careers ranked highest. 1. Software Engineer..."

10.11 External links


Oral history interview with Bruce H. Barnes, Charles Babbage Institute, University of Minnesota. Barnes describes the National Science Foundation (NSF) and its support of research in theoretical computer science, computer architecture, numerical methods, and software engineering, and the development of networking.
Oral history interview with Laszlo A. Belady, Charles Babbage Institute, University of Minnesota.
Brian Randell: The NATO Software Engineering Conferences (The site includes the original two NATO reports, from 1968 and 1969, as well as photographs of the participants and some of the sessions at Garmisch.)


Chapter 11

History of the graphical user interface

Ivan Sutherland demonstrating Sketchpad (UVC via IA: video and thumbnails).
The first prototype of a computer mouse, as designed by Bill English from Engelbart's sketches[1]

The history of the graphical user interface, understood as the use of graphic icons and a pointing device to control a computer, covers a five-decade span of incremental refinements, built on some constant core principles. Several vendors have created their own windowing systems based on independent code, but with basic elements in common that define the WIMP (window, icon, menu and pointing device) paradigm.
There have been important technological achievements, and enhancements to the general interaction in small steps over previous systems. There have been a few significant breakthroughs in terms of use, but the same organizational metaphors and interaction idioms are still in use. Although many GUI operating systems are controlled by using a mouse, the keyboard can also be used with keyboard shortcuts or arrow keys. The interface developments described below have been summarized and omit many details in the interest of brevity. The influence of game computers and joystick operation has been omitted.

Videoconferencing on NLS (1968)

11.1 Initial developments


Early dynamic information devices such as radar displays, where input devices were used for direct control of computer-created data, set the basis for later improvements of graphical interfaces.[2] Some early cathode-ray-tube (CRT) screens used a lightpen, rather than a mouse, as the pointing device.
The concept of a multi-panel windowing system was introduced by the first real-time graphic display systems for computers: the SAGE Project and Ivan Sutherland's Sketchpad.

11.1.1 Augmentation of Human Intellect (NLS)

In the 1960s, Doug Engelbart's Augmentation of Human Intellect project at the Augmentation Research Center at SRI International in Menlo Park, California developed the oN-Line System (NLS). This computer incorporated a mouse-driven cursor and multiple windows used to work on hypertext. Engelbart had been inspired, in part, by the memex desk-based information machine suggested by Vannevar Bush in 1945.
Much of the early research was based on how young children learn. So, the design was based on the childlike primitives of hand-eye coordination, rather than use of command languages, user-defined macro procedures, or automated transformations of data as later used by adult professionals.

11.1.2 Xerox PARC

Engelbart's work directly led to the advances at Xerox PARC. Several people went from SRI to Xerox PARC in the early 1970s. In 1973, Xerox PARC developed the Alto personal computer. It had a bitmapped screen, and was the first computer to demonstrate the desktop metaphor and graphical user interface (GUI). It was not a commercial product, but several thousand units were built and were heavily used at PARC, as well as other Xerox offices, and at several universities for many years. The Alto greatly influenced the design of personal computers during the late 1970s and early 1980s, notably the Three Rivers PERQ, the Apple Lisa and Macintosh, and the first Sun workstations.
The GUI was first developed at Xerox PARC by Alan Kay, Larry Tesler, Dan Ingalls, David Smith and a number of other researchers. It used windows, icons, and menus (including the first fixed drop-down menu) to support commands such as opening files, deleting files, moving files, etc. In 1974, work began at PARC on Gypsy, the first bitmap What-You-See-Is-What-You-Get (WYSIWYG) cut & paste editor. In 1975, Xerox engineers demonstrated a Graphical User Interface including icons and the first use of pop-up menus.[3]
In 1981 Xerox introduced a pioneering product, Star, a workstation incorporating many of PARC's innovations. Although not commercially successful, Star greatly influenced future developments, for example at Apple, Microsoft and Sun Microsystems.[4]

11.2 Early developments

11.2.1 Xerox Alto and Xerox Star

Xerox Star workstation introduced the first commercial GUI operating system
The Xerox Alto had an early graphical user interface.

The Xerox Alto (and later Xerox Star) was an early personal computer developed at Xerox PARC in 1973. It was the first computer to use the desktop metaphor and mouse-driven graphical user interface (GUI).
It was not a commercial product, but several thousand units were built and were heavily used at PARC, other Xerox facilities, at least one government facility and at several universities for many years. The Alto greatly influenced the design of some personal computers in the following decades, notably the Apple Macintosh and the first Sun workstations.

11.2.2 SGI 1000 series and MEX

Founded in 1982, SGI introduced the IRIS 1000 Series[5] in 1983.[6] The first graphical terminals (IRIS 1000) shipped in late 1983, and the corresponding workstation model (IRIS 1400) was released in mid-1984. The machines used an early version of the MEX windowing system on top of the GL2 Release 1 operating environment.[7] Examples of the MEX user interface can be seen in a 1988 article in the journal Computer Graphics,[8] while earlier screenshots cannot be found. The first commercial GUI-based systems, these did not find widespread use owing to their (discounted) academic list prices of $22,500 and $35,700 for the IRIS 1000 and IRIS 1400, respectively.[6] However, these systems were commercially successful enough to start SGI's business as one of the main graphical workstation vendors. In later revisions of graphical workstations, SGI switched to the X Window System, which had been under development at MIT since 1984 and which became the standard for UNIX workstations.

11.2.3 Apple Lisa and Macintosh (and later, the Apple IIgs)

Main article: Mac OS history

Macintosh Desktop (1984).
The Apple GS/OS desktop (1986).

Beginning in 1979, started by Steve Jobs and led by Jef Raskin, the Apple Lisa and Macintosh teams at Apple Computer (which included former members of the Xerox PARC group) continued to develop such ideas. The Lisa, released in 1983, featured a high-resolution stationery-based (document-centric) graphical interface atop an advanced hard disk based OS that featured such things as multitasking and office programs that users could paste content, such as graphs, from other programs into. The comparatively simplified Macintosh, released in 1984 and designed to be lower in cost, was the first commercially successful product to use a multi-panel window interface. A desktop metaphor was used, in which files looked like pieces of paper. File directories looked like file folders. There were a set of desk accessories like a calculator, notepad, and alarm clock that the user could place around the screen as desired; and the user could delete files and folders by dragging them to a trash-can icon on the screen. The Macintosh, in contrast to the Lisa, used a program-centric rather than document-centric design. Apple revisited the document-centric design, in a limited manner, much later with OpenDoc.
There is still some controversy over the amount of influence that Xerox's PARC work, as opposed to previous academic research, had on the GUIs of the Apple Lisa and Macintosh, but it is clear that the influence was extensive, because first versions of Lisa GUIs even lacked icons. These prototype GUIs are at least mouse-driven, but completely ignored the WIMP (window, icon, menu, pointing device) concept. Screenshots of the first GUIs of Apple Lisa prototypes show the early designs. Note also that Apple engineers visited the PARC facilities (Apple secured the rights for the visit by compensating Xerox with a pre-IPO purchase of Apple stock) and a number of PARC employees subsequently moved to Apple to work on the Lisa and Macintosh GUI. However, the Apple work extended PARC's considerably, adding manipulatable icons, and drag & drop manipulation of objects in the file system (see Macintosh Finder), for example. A list of the improvements made by Apple, beyond the PARC interface, can be read at Folklore.org.[9] Jef Raskin warns that many of the reported facts in the history of the PARC and Macintosh development are inaccurate, distorted or even fabricated, due to the lack of usage by historians of direct primary sources.[10]
In 1984, Apple released a television commercial which introduced the Apple Macintosh during the telecast of Super Bowl XVIII by CBS,[11] with allusions to George Orwell's noted novel, Nineteen Eighty-Four. The commercial was aimed at making people think about computers, identifying the user-friendly interface as a personal computer which departed from previous business-oriented systems,[12] and becoming a signature representation of Apple products.[13]
In 1986 the Apple IIgs was launched, a very advanced model of the successful Apple II series, based on 16-bit technology (in fact, virtually two machines into one). It came with a new operating system, the Apple GS/OS, which features a Finder-like GUI, very similar to that of the Macintosh series, able to deal with the advanced graphic abilities of its Video Graphics Chip (VGC).

11.2.4 Graphical Environment Manager (GEM)

Main article: Graphical Environment Manager

Digital Research (DRI) created the Graphical Environment Manager (GEM) as an add-on program for personal computers. GEM was developed to work with existing CP/M and MS-DOS operating systems on business computers such as IBM-compatibles. It was developed from DRI software, known as GSX, designed by a former PARC employee. The similarity to the Macintosh desktop led to a copyright lawsuit from Apple Computer, and a settlement which involved some changes to GEM. This was to be the first of a series of 'look and feel' lawsuits related to GUI design in the 1980s.
GEM on the Atari ST (1985)

GEM received widespread use in the consumer market from 1985, when it was made the default user interface built into the Atari TOS operating system of the Atari ST line of personal computers. It was also bundled by other computer manufacturers and distributors, such as Amstrad. Later, it was distributed with the best-selling Digital Research version of DOS for IBM PC compatibles, DR-DOS 6.0. The GEM desktop faded from the market with the withdrawal of the Atari ST line in 1992 and with the popularity of Microsoft Windows 3.0 on the PC front around the same period of time.

11.2.5 DeskMate

Main article: DeskMate

DeskMate 3.02 running in VGA mode

Tandy's DeskMate appeared in the early 1980s on its TRS-80 machines and was ported to its Tandy 1000 range in 1984. Like most PC GUIs of the time, it depended on a disk operating system such as TRS-DOS or MS-DOS. The application was popular at the time and included a number of programs like Draw, Text and Calendar, as well as attracting outside investment such as Lotus 1-2-3 for DeskMate.

11.2.6 MSX-View

Main article: MSX-View

MSX-View running VShell

MSX-View was developed for MSX computers by ASCII Corporation and HAL Laboratory. MSX-View contains software such as Page Edit, Page View, Page Link, VShell, VTed, VPaint and VDraw. An external version of the built-in MSX-View of the Panasonic FS-A1GT was released as an add-on for the Panasonic FS-A1ST on disk instead of 512 kB ROM DISK.



11.2.7 Amiga Intuition and the Workbench

Amiga Workbench (1985)

The Amiga computer was launched by Commodore in 1985 with a GUI called Workbench. Workbench was based on an internal engine developed mostly by RJ Mical, called Intuition, which drove all the input events. The first versions used a blue/orange/white/black default palette, which was selected for high contrast on televisions and composite monitors. Workbench presented directories as drawers to fit in with the "workbench" theme. Intuition was the widget and graphics library that made the GUI work. It was driven by user events through the mouse, keyboard, and other input devices.
Due to a mistake made by the Commodore sales department, the first floppies of AmigaOS (released with the Amiga 1000) named the whole OS "Workbench". Since then, users and CBM itself referred to Workbench as the nickname for the whole AmigaOS (including AmigaDOS, Extras, etc.). This common consent ended with the release of version 2.0 of AmigaOS, which re-introduced proper names to the installation floppies of AmigaDOS, Workbench, Extras, etc.
Starting with Workbench 1.0, AmigaOS treated the Workbench as a backdrop, borderless window sitting atop a blank screen. With the introduction of AmigaOS 2.0, however, the user was free to select whether the main Workbench window appeared as a normally layered window, complete with a border and scrollbars, through a menu item.
Amiga users were able to boot their computer into a command line interface (also known as the CLI or Amiga Shell). This was a keyboard-based environment without the Workbench GUI. Later they could invoke it with the CLI/Shell command LoadWB, which loaded the Workbench GUI.
One major difference between other OSs of the time (and for some time after) was the Amiga's fully multi-tasking operating system, a powerful built-in animation system using a hardware blitter and copper, and 4 channels of 26 kHz 8-bit sampled sound. This made the Amiga the first multi-media computer years before other OSs.
Like most GUIs of the day, Amiga's Intuition followed Xerox's, and sometimes Apple's, lead. But a CLI was included which dramatically extended the functionality of the platform. However, the CLI/Shell of Amiga is not just a simple text-based interface like in MS-DOS, but another graphic process driven by Intuition, and with the same gadgets included in Amiga's graphics.library. The CLI/Shell interface integrates itself with the Workbench, sharing privileges with the GUI.
The Amiga Workbench evolved over the 1990s, even after Commodore's 1994 bankruptcy.

11.2.8 Acorn BBC Master Compact

The Master Compact GUI

Main article: BBC Master

Acorn's 8-bit BBC Master Compact shipped with Acorn's first public GUI interface in 1986.[14] Little commercial software, beyond that included on the Welcome disk, was ever made available for the system, despite the claim by Acorn at the time that "the major software houses have worked with Acorn to make over 100 titles available on compilation discs at launch".[15] The most avid supporter of the Master Compact appeared to be Superior Software, who produced and specifically labelled their games as 'Master Compact' compatible.

11.2.9 Arthur / RISC OS

Main article: RISC OS

RISC OS[16] is a series of graphical user interface-based computer operating systems (OSes) designed for ARM architecture systems. It takes its name from the RISC (Reduced Instruction Set Computing) architecture supported. The OS was originally developed by Acorn Computers for use with their 1987 range of Archimedes personal computers using the Acorn RISC Machine processors. It comprises a command-line interface and desktop environment with a windowing system. Originally branded as the Arthur 1.20, the subsequent Arthur 2 release was shipped under the name RISC OS 2.

Arthur Desktop
A typical RISC OS 3.7 session

From 1988 to 1998, the OS was bundled with nearly every ARM-based Acorn computer model, including the Archimedes range, RiscPC, NewsPad and A7000. A version of the OS (called NCOS) was used in Oracle's Network Computer and compatible systems. After the breakup of Acorn in 1998, development of the OS was forked and separately continued by several companies, including RISCOS Ltd, Pace Micro Technology and Castle Technology. Since 1998 it has been bundled with a number of ARM-based desktop computers such as the Iyonix[17] and A9home. As of 2012, the OS remains forked and is independently developed by RISCOS Ltd and the RISC OS Open community.
Most recent stable versions run on the ARMv3/ARMv4 RiscPC[18] (or under emulation via VirtualAcorn or RPCEmu), the ARMv5 Iyonix,[19] the Raspberry Pi[20][21][22] and ARMv7 Cortex-A8 processors[23][24] (such as that used in the BeagleBoard and Touch Book). In 2011, a port for the Cortex-A9 PandaBoard was announced.[25]

Desktop

The WIMP interface incorporates three mouse buttons (named Select, Menu and Adjust), context-sensitive menus, window order control (i.e. send to back) and dynamic window focus (a window can have input focus at any position on the stack). The Icon bar (Dock) holds icons which represent mounted disc drives, RAM discs, running applications, system utilities and docked Files, Directories or inactive Applications. These icons have context-sensitive menus and support drag-and-drop behaviour. They represent the running application as a whole, irrespective of whether it has open windows.
The GUI is centred around the concept of files. The Filer displays the contents of a disc. Applications are run from the Filer view and files can be dragged to the Filer view from applications to perform saves. Application directories are used to store applications. The OS differentiates them from normal directories through the use of a pling (exclamation mark, also called shriek) prefix. Double-clicking on such a directory launches the application rather than opening the directory. The application's executable files and resources are contained within the directory, but normally they remain hidden from the user. Because applications are self-contained, this allows drag-and-drop installation and removal.
The RISC OS Style Guide encourages a consistent look and feel across applications. This was introduced in RISC OS 3 and specifies application appearance and behaviour. Acorn's own main bundled applications were not updated to comply with the guide until RISCOS Ltd's Select release in 2001.[26]

Font manager

The outline fonts manager provides spatial anti-aliasing of fonts, the OS being the first operating system to include such a feature,[27][28][29][30] having included it since before January 1989.[31] Since 1994, in RISC OS 3.5, it has been possible to use an outline anti-aliased font in the WindowManager for UI elements, rather than the bitmap system font from previous versions.[32]

11.2.10 MS-DOS file managers and utility suites

Norton Utilities 6.01 (1991). Note the graphical widgets and the arrow pointer in text mode.

Because most of the very early IBM PC and compatibles lacked any common true graphical capability (they used the 80-column basic text mode compatible with the original MDA display adapter), a series of file managers arose, including Microsoft's DOS Shell, which features typical GUI elements such as menus, push buttons, lists with scrollbars and a mouse pointer. The name "text-based user interface" was later invented to name this kind of interface. Many MS-DOS text mode applications, like the default text editor for MS-DOS 5.0 (and related tools, like QBasic), also used the same philosophy. The IBM DOS Shell included with IBM DOS 5.0 (circa 1992) supported both text display modes and actual graphics display modes, making it both a TUI and a GUI, depending on the chosen mode.
Advanced file managers for MS-DOS were able to redefine character shapes with EGA and better display adapters, giving some basic low-resolution icons and graphical interface elements, including an arrow (instead of a coloured cell block) for the mouse pointer. When the display adapter lacks the ability to change the characters' shapes, they default to the CP437 character set found in the adapter's ROM. Some popular utility suites for MS-DOS, such as Norton Utilities (pictured) and PC Tools, used these techniques as well.
DESQview was a text mode multitasking program introduced in July 1985. Running on top of MS-DOS, it allowed users to run multiple DOS programs concurrently in windows. It was the first program to bring multitasking and windowing capabilities to a DOS environment in which existing DOS programs could be used. DESQview was not a true GUI but offered certain components of one, such as resizable, overlapping windows and mouse pointing.

11.2.11 Applications under MS-DOS with proprietary GUIs

DeluxePaint II for MS-DOS (1989)

Before the MS-Windows age, and with the lack of a true common GUI under MS-DOS, most graphical applications which worked with EGA, VGA and better graphics cards had proprietary built-in GUIs. One of the best known such graphical applications was Deluxe Paint, a popular painting software with a typical WIMP interface.
The original Adobe Acrobat Reader executable file for MS-DOS was able to run on both the standard Windows 3.x GUI and the standard DOS command prompt. When it was launched from the command prompt, on a machine with a VGA graphics card, it provided its own GUI.

11.2.12 Microsoft Windows (16-bit versions)

Windows 1.01 (1985)

See also: History of Microsoft Windows

Windows 1.0, a GUI for the MS-DOS operating system, was released in 1985.[33] The market's response was less than stellar.[34] Windows 2.0 followed, but it wasn't until the 1990 launch of Windows 3.0, based on Common User Access, that its popularity truly exploded. The GUI has seen minor redesigns since, mainly the networking-enabled Windows 3.11 and its Win32s 32-bit patch. The 16-bit line of MS Windows was discontinued with the introduction of Windows 95 and the Windows NT 32-bit based architecture in the 1990s. See the next section.
The main window of a given application can occupy the full screen in maximized status. The user must then switch between maximized applications using the Alt+Tab keyboard shortcut; there is no alternative with the mouse except to de-maximize. When none of the running application windows are maximized, switching can be done by clicking on a partially visible window, as is the common way in other GUIs.

Windows 3.11 (1993)

In 1988, Apple sued Microsoft for copyright infringement of the LISA and Apple Macintosh GUI. The court case lasted 4 years before almost all of Apple's claims were denied on a contractual technicality. Subsequent appeals by Apple were also denied. Microsoft and Apple apparently entered a final, private settlement of the matter in 1997.

11.2.13 GEOS

GEOS for the Commodore 64 (1986).

Main article: GEOS (8-bit operating system)

GEOS was launched in 1986. It was originally written for the 8-bit home computer Commodore 64 and, shortly after, the Apple II series. The name was later used by the company as PC/Geos for IBM PC systems, then Geoworks Ensemble. It came with several application programs like a calendar and word processor, and a cut-down version served as the basis for America Online's DOS client. Compared to the competing Windows 3.0 GUI it could run reasonably well on simpler hardware, but its developer had a restrictive policy towards third-party developers that prevented it from becoming a serious competitor. And it was targeted at 8-bit machines while the 16-bit computer age was dawning.

11.2.14 The X Window System

Main article: X Window System

A Unix based X Window System desktop (circa 1990).

The standard windowing system in the Unix world is the X Window System (commonly X11 or X), first released in the mid-1980s. The W Window System (1983) was the precursor to X; X was developed at MIT as Project Athena. Its original purpose was to allow users of the newly emerging graphic terminals to access remote graphics workstations without regard to the workstation's operating system or the hardware. Due largely to the availability of the source code used to write X, it has become the standard layer for management of graphical and input/output devices and for the building of both local and remote graphical interfaces on virtually all Unix, Linux and other Unix-like operating systems, with the notable exceptions of Mac OS X and Android.
X allows a graphical terminal user to make use of remote resources on the network as if they were all located locally to the user by running a single module of software called the X server. The software running on the remote machine is called the client application. X's network transparency protocols allow the display and input portions of any application to be separated from the remainder of the application and 'served up' to any of a large number of remote users. X is available today as free software.

11.2.15 NeWS
Main article: NeWS
The PostScript-based NeWS (Network extensible Window System) was developed by Sun Microsystems in the
mid-1980s. For several years SunOS included a window system combining NeWS and the X Window System. Although NeWS was considered technically elegant
by some commentators, Sun eventually dropped the product. Unlike X, NeWS was always proprietary software.

HyperTIES authoring tool under NeWS window system.

11.3 The 1990s: Mainstream usage of the desktop

The widespread adoption of the PC platform in homes and small businesses popularized computers among people with no formal training. This created a fast-growing market, opening an opportunity for commercial exploitation and for easy-to-use interfaces, and making economically viable the incremental refinement of the existing GUIs for home systems.
Also, the spread of Highcolor and True Color capabilities of display adapters providing thousands and millions of colors, along with faster CPUs and accelerated graphics cards, cheaper RAM, storage devices up to an order of magnitude larger (from megabytes to gigabytes), and larger bandwidth for telecom networking at lower cost, helped to create an environment in which the common user was able to run complicated GUIs which began to favor aesthetics.

11.3.1 Windows 95 and a computer in every home

Main article: Windows 95. See also: Windows NT.

Windows 95 desktop (1995).

After Windows 3.11, Microsoft began to develop a new consumer-oriented version of the operating system. Windows 95 was intended to integrate Microsoft's formerly separate MS-DOS and Windows products and included an enhanced version of DOS, often referred to as MS-DOS 7.0. It also featured a significant redesign of the GUI, dubbed Cairo. While Cairo never really materialized, parts of Cairo found their way into subsequent versions of the operating system starting with Windows 95. Both Win95 and WinNT could run 32-bit applications and could exploit the abilities of the Intel 80386 CPU, such as preemptive multitasking and up to 4 GiB of linear address memory space. Windows 95 was touted as a 32-bit based operating system, but it was actually based on a hybrid kernel (VWIN32.VXD) with the 16-bit user interface (USER.EXE) and graphic device interface (GDI.EXE) of Windows for Workgroups (3.11), which had 16-bit kernel components with a 32-bit subsystem (USER32.DLL and GDI32.DLL) that allowed it to run native 16-bit applications as well as 32-bit applications. In the marketplace, Windows 95 was an unqualified success, promoting a general upgrade to 32-bit technology, and within a year or two of its release had become the most successful operating system ever produced.
Accompanied by an extensive marketing campaign,[35] Windows 95 was a major success in the marketplace at launch and shortly became the most popular desktop operating system.
Windows 95 saw the beginning of the browser wars, when the World Wide Web began receiving a great deal of attention in the popular culture and mass media. Microsoft at first did not see potential in the Web, and Windows 95 was shipped with Microsoft's own online service called The Microsoft Network, which was dial-up only and was used primarily for its own content, not internet access. As versions of Netscape Navigator and Internet Explorer were released at a rapid pace over the following few years, Microsoft used its desktop dominance to push its browser and shape the ecology of the web mainly as a monoculture.
Windows 95 evolved through the years into Windows 98 and Windows ME. Windows ME was the last in the line of the Windows 3.x-based operating systems from Microsoft. Windows underwent a parallel 32-bit evolutionary path, where Windows NT 3.1 was released in 1993. Windows NT (for New Technology)[36] was a native 32-bit operating system with a new driver model, was unicode-based, and provided for true separation between applications. Windows NT also supported 16-bit applications in an NTVDM, but it did not support VXD based drivers. Windows 95 was supposed to be released before 1993 as the predecessor to Windows NT. The idea was



to promote the development of 32-bit applications with backward compatibility, leading the way for a more successful NT release. After multiple delays, Windows 95 was released without unicode and used the VXD driver model. Windows NT 3.1 evolved into Windows NT 3.5, 3.51 and then 4.0, when it finally shared a similar interface with its Windows 9x desktop counterpart and included a START button. The evolution continued with Windows 2000, Windows XP, Windows Vista, then Windows 7. Windows XP and higher were also made available in 64-bit modes. Windows server products branched off with the introduction of Windows Server 2003 (available in 32-bit and 64-bit IA64 or x64), then Windows Server 2008 and then Windows Server 2008 R2. Windows 2000 and XP shared the same basic GUI, although XP introduced Visual Styles. With Windows 98, the Active Desktop theme was introduced, allowing an HTML approach for the desktop, but this feature was coldly received by customers, who frequently disabled it. In the end, Windows Vista definitively discontinued it, but put a new SideBar on the desktop.


11.3.2 Mac OS

Screenshot of System 7.5.3

The Macintosh's GUI has been revised multiple times since 1984, with major updates including System 7 and Mac OS 8. It underwent its largest revision to date with the introduction of the "Aqua" interface in 2001's Mac OS X. It was a new operating system built primarily on technology from NeXTSTEP, with UI elements of the original Mac OS grafted on. Mac OS X uses a technology known as Quartz (a graphics layer) for graphics rendering and on-screen drawing. Some interface features of Mac OS X are inherited from NeXTSTEP (such as the Dock, the automatic wait cursor, and double-buffered windows, which give a solid appearance and flicker-free window redraws), while others are inherited from the old Mac OS operating system (such as the single system-wide menu bar). Mac OS X v10.3 introduced features to improve usability, including Exposé, which is designed to make finding open windows easier.

With Mac OS X v10.4, new features were added, including Dashboard (a virtual alternate desktop for small special-purpose applications) and a search tool called Spotlight, which gives users the option of searching through files instead of browsing through folders.

11.3.3 GUIs built on the X Window System

KDE Plasma 4.4 desktop (2010)

A GNOME 2.28 desktop (2010)

In the early days of X Window development, Sun Microsystems and AT&T attempted to push for a GUI standard called OPEN LOOK in competition with Motif. OPEN LOOK was a well-designed standard developed from scratch in conjunction with Xerox, while Motif was a collective effort that fell into place, with a look and feel patterned after Windows 3.11. Many who worked on OPEN LOOK at the time appreciated its design coherence. Motif nevertheless prevailed in the UNIX GUI battles and became the basis for the Common Desktop Environment (CDE). CDE was based on the Visual User Environment (VUE), a proprietary desktop from Hewlett-Packard that was in turn based on the Motif look and feel.

In the late 1990s, there was significant growth in the Unix world, especially among the free software community. New graphical desktop movements grew up around Linux and similar operating systems, based on the X Window System. A new emphasis on providing an integrated and uniform interface to the user brought about new desktop environments, such as KDE Plasma Desktop, GNOME and Xfce, which have supplanted CDE in popularity on both Unix and Unix-like operating systems. The Xfce, KDE and GNOME look and feel each tend to undergo more rapid change and less codification than the earlier OPEN LOOK and Motif environments.

11.3.4 Amiga

Amiga Workbench 2.0 (1990)

Amiga Workbench 4.1 (2009)

Later releases added improvements over the original Workbench, such as support for high-color Workbench screens, context menus, and embossed 2D icons with a pseudo-3D aspect. Some Amiga users preferred alternative interfaces to the standard Workbench, such as Directory Opus Magellan.

The use of improved, third-party GUI engines became common amongst users who preferred more attractive interfaces, such as Magic User Interface (MUI) and ReAction. These object-oriented graphic engines, driven by user interface classes and methods, were then standardized into the Amiga environment and turned Amiga Workbench into a complete and modern guided interface, with new standard gadgets, animated buttons, true 24-bit color icons, increased use of wallpapers for screens and windows, alpha channels, and the transparencies and shadows that any modern GUI provides.

Use of object-oriented graphic engines dramatically changes the look and feel of a GUI to match actual style guides.

Modern derivatives of Workbench are Ambient for MorphOS, Scalos, Workbench for AmigaOS 4, and Wanderer for AROS. There is a brief article on Ambient, with descriptions of MUI icons, menus and gadgets, at aps.fr, and images of Zune are available at the main AROS site.

11.3.5 OS/2

OS/2 Workplace Shell

Originally developed collaboratively by Microsoft and IBM to replace DOS, OS/2 version 1.0 (released in 1987) had no GUI at all. Version 1.1 (released in 1988) included Presentation Manager (PM), an implementation of IBM Common User Access, which looked a lot like the later Windows 3.1 UI. After the split with Microsoft, IBM developed the Workplace Shell (WPS) for version 2.0 (released in 1992), a quite radical, object-oriented approach to GUIs. Microsoft later imitated much of this look in Windows 95.

11.3.6 NeXTSTEP

The NeXTSTEP user interface was used in the NeXT line of computers. NeXTSTEP's first major version was released in 1989. It used Display PostScript for its graphical underpinning. The NeXTSTEP interface's most significant feature was the Dock, carried with some modification into Mac OS X, and it had other minor interface details that some found made it easier and more intuitive to use than previous GUIs. NeXTSTEP's GUI was the first to feature opaque dragging of windows in its user interface, on a machine that was comparatively weak by today's standards, ideally aided by high-performance graphics hardware.

NeXTStep 3.x running NetHack, help and more apps.

11.3.7 BeOS

BeOS Desktop

BeOS was developed on custom AT&T Hobbit-based computers before switching to PowerPC hardware, by a team led by former Apple executive Jean-Louis Gassée, as an alternative to Mac OS. BeOS was later ported to Intel hardware. It used an object-oriented kernel written by Be, and did not use the X Window System but a different GUI written from scratch. Much effort was spent by the developers to make it an efficient platform for multimedia applications. Be Inc. was acquired by PalmSource, Inc. (Palm Inc. at the time) in 2001. The BeOS GUI still lives on in Haiku, an open-source software reimplementation of the BeOS.

11.4 Current trends

11.4.1 Mobile devices

In 2007 with the iPhone[37] and later in 2010 with the introduction of the iPad,[38] Apple popularized the post-WIMP style of interaction for multi-touch screens, and those devices are considered milestones in the development of mobile devices.[39][40]

Other portable devices such as MP3 players and cell phones have been a burgeoning area of deployment for GUIs in recent years. Since the mid-2000s, a vast majority of portable devices have advanced to high screen resolutions and sizes (the iPhone 5's 1,136 × 640 pixel display is an example). Because of this, these devices have their own famed user interfaces and operating systems, with large homebrew communities dedicated to creating their own visual elements, such as icons, menus, and wallpapers. Post-WIMP interfaces are often used in these mobile devices, where the traditional pointing devices required by the desktop metaphor are not practical.

As high-powered graphics hardware draws considerable power and generates significant heat, many of the 3D effects developed between 2000 and 2010 are not practical on this class of device. This has led to the development of simpler interfaces that make a design feature of two-dimensionality, such as those exhibited by Metro and the 2012 Gmail redesign.

11.4.2 3D user interface

Main article: 3D interaction

Compiz running on Fedora Core 6 with AIGLX.

In the first decade of the 21st century, the rapid development of GPUs led to a trend for the inclusion of 3D effects in window management. It is based on experimental research in user interface design that tries to expand the expressive power of existing toolkits in order to enhance the physical cues that allow for direct manipulation. Effects common to several projects are scaled resizing and zooming, window transformations and animations (wobbly windows, smooth minimization to the system tray, and so on), composition of images (used for window drop shadows and transparency), and enhancements to the global organization of open windows (zooming to virtual desktops, the desktop cube, Exposé, etc.). The proof-of-concept BumpTop desktop combines a physical representation of documents with tools for document classification that are possible only in the simulated environment, like instant reordering and automated grouping of related documents.

These effects were popularized thanks to the widespread use of 3D video cards (mainly due to gaming), which allow for complex visual processing with low CPU use, using the 3D acceleration in most modern graphics cards to render the application clients in a 3D scene. The application window is drawn off-screen in a pixel buffer, and the graphics card renders it into the 3D scene.

This can have the advantage of moving some of the window rendering to the GPU on the graphics card, thus reducing the load on the main CPU, but the facilities that allow this must be available on the graphics card in order to take advantage of them.

Examples of 3D user-interface software include XGL and Compiz from Novell, and AIGLX bundled with Red Hat Fedora. Quartz Extreme for Mac OS X and the Aero interface of Windows 7 and Vista use 3D rendering for shading and transparency effects, as well as for Exposé and Windows Flip and Flip 3D, respectively. Windows Vista uses Direct3D to accomplish this, whereas the other interfaces use OpenGL.
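The off-screen rendering described above can be sketched in a few lines. The toy compositor below is only an illustration of the idea: each window is drawn into its own pixel buffer, and a separate pass blends those buffers, with per-window transparency, into a final frame. The buffer sizes, alpha values, and function names are assumptions made for the example and do not correspond to any particular window system or GPU API.

# A minimal, purely illustrative compositor sketch (not any real system's API).
WIDTH, HEIGHT = 16, 8

def blank(width, height, colour=0.0):
    """Create an empty framebuffer filled with a background colour."""
    return [[colour for _ in range(width)] for _ in range(height)]

def draw_window(colour, w, h):
    """Render a window's client area into its own off-screen pixel buffer."""
    return [[colour for _ in range(w)] for _ in range(h)]

def composite(frame, window, x, y, alpha):
    """Blend an off-screen window buffer into the frame at position (x, y)."""
    for row, line in enumerate(window):
        for col, pixel in enumerate(line):
            fx, fy = x + col, y + row
            if 0 <= fx < WIDTH and 0 <= fy < HEIGHT:
                frame[fy][fx] = alpha * pixel + (1.0 - alpha) * frame[fy][fx]
    return frame

if __name__ == "__main__":
    frame = blank(WIDTH, HEIGHT)                 # the on-screen framebuffer
    terminal = draw_window(0.8, 10, 5)           # two off-screen window buffers
    editor = draw_window(0.4, 8, 4)
    composite(frame, terminal, 1, 1, alpha=1.0)  # opaque window
    composite(frame, editor, 6, 3, alpha=0.6)    # translucent window on top
    for line in frame:
        print(" ".join(f"{p:.1f}" for p in line))

A compositing window manager performs this same blending on the GPU, which is what allows effects such as shadows and transparency to run with little CPU cost.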

At the IEEE 7th Symposium on 3D User Interfaces, Ph.D. student Mengu Sukan, M.S. student Semih Energin and Prof. Steve Feiner won best poster for research and development of augmented reality, titled "Manipulating Virtual Objects in Hand-Held Augmented Reality using Stored Snapshots". The poster presents a set of interaction techniques that allow a user to first take snapshots of a scene using a tablet computer, and then jump back and forth between the snapshots, to revisit them virtually for interaction. By storing for each snapshot a still image of the scene, along with the camera position and orientation determined by computer vision software, this approach allows the overlaid 3D graphics to be dynamic and interactive. This makes it possible for the user to move and rotate virtual 3D objects from the vantage points of different locations, without the overhead of physically traveling between those locations. 3DUI attendees tried a real-time demo in which they laid out virtual furniture. They could rapidly transition between the live view and the viewpoints of multiple snapshots as they moved and rotated items of virtual furniture, iteratively designing a desired layout.[41]

11.4.3 Virtual reality and presence

See also: Head-up display

Virtual reality devices such as the Oculus Rift and Sony's Project Morpheus[42] aim to provide users with presence, a perception of full immersion into a virtual environment.

11.5 See also

Windowing system
Bill Atkinson
The Blit (graphics terminal by Rob Pike, 1982)
Direct manipulation interface
Doug Engelbart's On-Line System
History of computing hardware
XGL
Graphical user interface
Text-based user interface
History of computer icons
Ivan Sutherland's Sketchpad
Jef Raskin
Office of the future
AIGLX
DirectFB
Mezzo
Tiling window manager
Macro command language
Texting
Skeuomorph
Apple v. Microsoft
IBM Common User Access

11.6 References

[1] "The computer mouse turns 40". Retrieved June 12, 2012.
[2] Clive Akass. "The men who really invented the GUI".
[3] "History of PARC".
[4] Mike Tuck. "The Real History of the GUI".
[5] "Sgi Iris Faq". Futuretech.blinkenlights.nl. Retrieved 2014-03-07.
[6] "Hardware : Systems : IRIS 1000". sgistuff.net. Retrieved 2014-03-07.
[7] "History of IRIX". Ryan.tliquest.net. Retrieved 2014-03-07.
[8] "ConMan: a visual programming language for interactive graphics". Computer Graphics 22 (4): 103-111. 1988. doi:10.1145/378456.378494. Retrieved 2014-03-07.
[9] "On Xerox, Apple and Progress" (1996). Folklore.org.
[10] Jef Raskin. "Holes in Histories".
[11] Friedman, Ted (October 1997). "Apple's 1984: The Introduction of the Macintosh in the Cultural History of Personal Computers". Archived from the original on October 5, 1999.
[12] Friedman, Ted (2005). "Chapter 5: 1984". Electric Dreams: Computers in American Culture. New York University Press. ISBN 0-8147-2740-9.
[13] Grote, Patrick (October 29, 2006). "Review of Pirates of Silicon Valley Movie". DotJournal.com. Archived from the original on November 7, 2006. Retrieved January 24, 2014.
[14] Chris's Acorns: Master Compact.
[15] Acorn User, October 1986, News, p. 9.
[16] "About us: RISC OS Open Limited FAQ". RISC OS Open. Retrieved June 13, 2011.
[17] "Acorn announces distribution deal with Castle Technology for RISC based products". Press release (Acorn Computers Ltd). October 12, 1998. Archived from the original on May 6, 1999. Retrieved January 6, 2011. "(October 12th 1998), Cambridge, UK - Acorn announced today that it has completed negotiations with Castle Technology for them to distribute Acorn products."
[18] "Risc os 6 general faq". RISCOS Ltd. Retrieved January 31, 2011. "[RISC OS 6 is] suitable for Risc PC, A7000 and Virtual Acorn products."
[19] "RISC OS 5 features". Iyonix Ltd. Retrieved January 31, 2011. "All IYONIX pcs ship with RISC OS 5 in flash ROM."
[20] Lee, Jeffrey. "Newsround". The Icon Bar. Retrieved October 17, 2011.
[21] Holwerda, Thom (October 31, 2011). "Raspberry Pi To Embrace RISC OS". OSNews. Retrieved November 1, 2011.
[22] Dewhurst, Christopher (December 2011). "The London show 2011". Archive (magazine) 23 (3). p. 3.
[23] Farrell, Nick (April 27, 2009). "Snaps leak of RISC OS5 on Beagleboard". The Inquirer. Retrieved June 28, 2011. "A snap of an RISC OS 5, running on a Beagleboard device powered by a 600MHz ARM Cortex-A8 processor with a built-in graphics chip, has tipped up on the world wide wibble. The port developed by Jeffrey Lee is a breakthrough for the shared-source project because it has ported the OS without an army of engineers."
[24] "Cortex-A8 port status". RISC OS Open. Retrieved January 31, 2011. "[The port includes] a modified version of the RISC OS kernel containing support for (all) Cortex-A8 CPU cores."
[25] Lee, Jeffrey (August 2, 2011). "Have I Got Old News For You". The Icon Bar. Retrieved September 28, 2011. "[...] Willi Theiss has recently announced that he's been working on a port of RISC OS to the PandaBoard [...]"
[26] Mellor, Phil (March 23, 2007). "An arbitrary number of possibly influential RISC OS things". The Icon Bar. Retrieved September 27, 2011. "Admittedly it wasn't until RISC OS Select was released, almost 10 years later, that the standard Acorn applications (Draw, Edit, and Paint) implemented the style guide's clipboard recommendations, but most products followed it with care."
[27] Round, Mark (February 26, 2004). "Emulating RISC OS under Windows". OSNews. Retrieved May 12, 2011. "Many of the UI concepts that we take for granted were first pioneered in RISC OS, for instance: scalable anti-aliased fonts and an operating system extendable by 'modules', while most of the PC world was still on Windows 3.0."
[28] Ghiraddje (December 22, 2009). "The RISC OS GUI". Telcontar.net. Retrieved May 12, 2011. "Only with Mac OS X did any mainstream graphical interface provide the smoothly rendered, fractionally spaced type that Acorn accomplished in 1992 or earlier."
[29] Reimer, Jeremy (May 2005). "A History of the GUI". Ars Technica. Retrieved May 25, 2011. "[...] in 1987, the UK-based company Acorn Computers introduced their [...] GUI, called Arthur, also was the first to feature anti-aliased display of on-screen fonts, even in 16-color mode!"
[30] Holwerda, Thom (June 23, 2005). "Screen Fonts: Shape Accuracy or On-Screen Readability?". OSNews. Retrieved June 13, 2011. "[...] it was RISC OS that had the first system-wide, intricate [...] font rendering in operating systems."
[31] Pountain, Dick (December 1988). "Screentest: Archie RISC OS". Personal Computer World. p. 154. Retrieved January 14, 2011. "[ArcDraw] can also add text in multiple sizes and fonts to a drawing (including anti-aliased fonts)."
[32] Acorn Computers Support Group Application Notice 253 - New features of RISC OS version 3.5.
[33] "how-windows-came-to-be-windows-1". sbp-romania.com. Retrieved October 3, 2011.
[34] "history-computer.com". http://history-computer.com. Retrieved October 3, 2011.
[35] Washington Post (August 24, 1995). "With Windows 95's Debut, Microsoft Scales Heights of Hype". Washington Post. Retrieved November 8, 2013.
[36] Gates, Bill (1998-06-05). "Q&A: Protecting children from information on the Internet". Archived from the original on 2001-05-26. Retrieved 2005-06-26.
[37] Mather, John. "iMania". Ryerson Review of Journalism (February 19, 2007). Retrieved February 19, 2007.
[38] Eaton, Nick (2010). "The iPad/tablet PC market defined?". Seattle Post-Intelligencer. "the iPad could finally spark demand for the hitherto unsuccessful tablet PC".
[39] Bright, Peter (2010). "Ballmer (and Microsoft) still doesn't get the iPad". Ars Technica.
[40] "The iPad's victory in defining the tablet: What it means". InfoWorld.
[41] Dedual, Nicolas (March 8, 2012). "Sukan, Feiner, and Energin receive Best Poster Award at IEEE 3DUI 2012" (announcement). Columbia University. Retrieved April 3, 2013.
[42]

11.7 External links


Raj Lal, "User Interface evolution in last 50 years", Digital Design and Innovation Summit, San Francisco, September 20, 2013
Jeremy Reimer, "A History of the GUI", Ars Technica, May 5, 2005
User Interface Timeline, George Mason University
Nathan Lineback, "The Graphical User Interface Gallery", Nathan's Toasty Technology Page
Oral history interview with Marvin L. Minsky, Charles Babbage Institute, University of Minnesota. Minsky describes artificial intelligence (AI) research at the Massachusetts Institute of Technology (MIT), including research in the areas of graphics, word processing, and time-sharing.
Oral history interview with Ivan Sutherland, Charles Babbage Institute, University of Minnesota. Sutherland describes his tenure as head of ARPA's Information Processing Techniques Office (IPTO) from 1963 to 1965, including new projects in graphics and networking.
Oral history interview with Charles A. Csuri, Charles Babbage Institute, University of Minnesota. Csuri recounts his art education and explains his transition to computer graphics in the mid-1960s, after receiving a National Science Foundation grant for research in graphics.
GUIdebook: Graphical User Interface gallery
VisiOn history - the first GUI for the PC
mprove: Historical Overview of Graphical User Interfaces
Anecdotes about the development of the Macintosh Hardware & GUI

Chapter 12

History of the Internet


The history of the Internet begins with the development of electronic computers in the 1950s. Initial concepts of packet networking originated in several computer science laboratories in the United States, Great Britain, and France. The US Department of Defense awarded contracts as early as the 1960s for packet network systems, including the development of the ARPANET (which would become the first network to use the Internet Protocol). The first message was sent over the ARPANET from computer science professor Leonard Kleinrock's laboratory at the University of California, Los Angeles (UCLA) to the second network node at the Stanford Research Institute (SRI).

Packet switching networks such as ARPANET, Mark I at NPL in the UK, CYCLADES, Merit Network, Tymnet, and Telenet were developed in the late 1960s and early 1970s using a variety of communications protocols. The ARPANET in particular led to the development of protocols for internetworking, in which multiple separate networks could be joined into a network of networks.

Access to the ARPANET was expanded in 1981 when the National Science Foundation (NSF) funded the Computer Science Network (CSNET). In 1982, the Internet protocol suite (TCP/IP) was introduced as the standard networking protocol on the ARPANET. In the early 1980s the NSF funded the establishment of national supercomputing centers at several universities, and provided interconnectivity in 1986 with the NSFNET project, which also created network access to the supercomputer sites in the United States for research and education organizations. Commercial Internet service providers (ISPs) began to emerge in the late 1980s. The ARPANET was decommissioned in 1990. Private connections to the Internet by commercial entities became widespread quickly, and the NSFNET was decommissioned in 1995, removing the last restrictions on the use of the Internet to carry commercial traffic.

Since the mid-1990s, the Internet has had a revolutionary impact on culture and commerce, including the rise of near-instant communication by electronic mail, instant messaging, voice over Internet Protocol (VoIP) telephone calls, two-way interactive video calls, and the World Wide Web with its discussion forums, blogs, social networking, and online shopping sites. The research and education community continues to develop and use advanced networks such as NSF's very high speed Backbone Network Service (vBNS), Internet2, and National LambdaRail. Increasing amounts of data are transmitted at higher and higher speeds over fiber-optic networks operating at 1 Gbit/s, 10 Gbit/s, or more. The Internet's takeover of the global communication landscape was almost instant in historical terms: it carried only 1% of the information flowing through two-way telecommunications networks in the year 1993, already 51% by 2000, and more than 97% of the telecommunicated information by 2007.[1] Today the Internet continues to grow, driven by ever greater amounts of online information, commerce, entertainment, and social networking.

12.1 Precursors

See also: Victorian Internet

The telegraph system is the first fully digital communication system. Thus the Internet has precursors, such as the telegraph system, that date back to the 19th century, more than a century before the digital Internet became widely used in the second half of the 1990s. The concept of data communication, transmitting data between two different places connected via some kind of electromagnetic medium such as radio or an electrical wire, predates the introduction of the first computers. Such communication systems were typically limited to point-to-point communication between two end devices. Telegraph systems and telex machines can be considered early precursors of this kind of communication.

Fundamental theoretical work in data transmission and information theory was developed by Claude Shannon, Harry Nyquist, and Ralph Hartley during the early 20th century.

Early computers used the technology available at the time to allow communication between the central processing unit and remote terminals. As the technology evolved, new systems were devised to allow communication over longer distances (for terminals) or with higher speed (for interconnection of local devices), which were necessary for the mainframe computer model. Using these technologies made it possible to exchange data (such as files) between remote computers. However, the point-to-point communication model was limited, as it did not allow for direct communication between any two arbitrary systems; a physical link was necessary. The technology was also deemed inherently unsafe for strategic and military use, because there were no alternative paths for the communication in case of an enemy attack.

12.2 Three terminals and an ARPA

A pioneer in the call for a global network, J. C. R. Licklider proposed, in his January 1960 paper "Man-Computer Symbiosis", "a network of such [computers], connected to one another by wide-band communication lines [which provided] the functions of present-day libraries together with anticipated advances in information storage and retrieval and [other] symbiotic functions."[2] In August 1962, Licklider and Welden Clark published the paper "On-Line Man-Computer Communication",[3] which was one of the first descriptions of a networked future.

In October 1962, Licklider was hired by Jack Ruina as director of the newly established Information Processing Techniques Office (IPTO) within DARPA, with a mandate to interconnect the United States Department of Defense's main computers at Cheyenne Mountain, the Pentagon, and SAC HQ. There he formed an informal group within DARPA to further computer research. He began by writing memos describing a distributed network to the IPTO staff, whom he called "Members and Affiliates of the Intergalactic Computer Network".[4] As part of the information processing office's role, three network terminals had been installed: one for System Development Corporation in Santa Monica, one for Project Genie at the University of California, Berkeley, and one for the Compatible Time-Sharing System project at the Massachusetts Institute of Technology (MIT). Licklider's identified need for inter-networking would be made obvious by the apparent waste of resources this caused.

"For each of these three terminals, I had three different sets of user commands. So if I was talking online with someone at S.D.C. and I wanted to talk to someone I knew at Berkeley or M.I.T. about this, I had to get up from the S.D.C. terminal, go over and log into the other terminal and get in touch with them.... I said, oh man, it's obvious what to do: If you have these three terminals, there ought to be one terminal that goes anywhere you want to go where you have interactive computing. That idea is the ARPAnet."[5]

Although he left the IPTO in 1964, five years before the ARPANET went live, it was his vision of universal networking that provided the impetus that led his successors, such as Lawrence Roberts and Robert Taylor, to further the ARPANET's development. Licklider later returned to lead the IPTO in 1973 for two years.[6]

12.3 Packet switching

Main articles: Packet switching, RAND Corporation and ARPANET

Len Kleinrock and the first Interface Message Processor.[7]

At the tip of the problem lay the issue of connecting separate physical networks to form one logical network. In the 1960s, Paul Baran of the RAND Corporation produced a study of survivable networks for the U.S. military in the event of nuclear war.[8] Information transmitted across Baran's network would be divided into what he called "message-blocks". Independently, Donald Davies of the National Physical Laboratory (UK) proposed and developed a similar network based on what he called packet switching, the term that would ultimately be adopted. Leonard Kleinrock (MIT) developed a mathematical theory behind this technology. Packet switching provides better bandwidth utilization and response times than the traditional circuit-switching technology used for telephony, particularly on resource-limited interconnection links.[9]

Packet switching is a rapid store-and-forward networking design that divides messages up into arbitrary packets, with routing decisions made per packet. Early networks used message-switched systems that required rigid routing structures prone to single points of failure. This led Tommy Krash and Paul Baran's U.S. military-funded research to focus on using message-blocks to include network redundancy.[10]
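The store-and-forward idea can be illustrated with a short sketch. The Python fragment below is not drawn from any of the historical systems discussed here; the node names, packet fields, and the 32-byte payload size are illustrative assumptions. It simply shows a message being cut into independently addressed packets, each routed on its own, and reassembled in order at the destination.

# A toy illustration of packet switching (assumed names and sizes, not a
# description of any historical network).

PAYLOAD_SIZE = 32  # bytes of user data per packet (assumed for the example)

def packetize(message, src, dst):
    """Split a message into independently routable packets with small headers."""
    chunks = [message[i:i + PAYLOAD_SIZE]
              for i in range(0, len(message), PAYLOAD_SIZE)]
    return [{"src": src, "dst": dst, "seq": n, "total": len(chunks),
             "payload": chunk}
            for n, chunk in enumerate(chunks)]

# Two possible store-and-forward paths between the end hosts; a real network
# would compute these hop by hop from routing tables.
PATHS = [["UCLA", "UCSB", "SRI"], ["UCLA", "UTAH", "SRI"]]

def route(packet):
    """Choose a path for this packet alone; other packets may go another way."""
    return PATHS[packet["seq"] % len(PATHS)]

def reassemble(packets):
    """The destination host reorders the payloads by sequence number."""
    return b"".join(p["payload"] for p in sorted(packets, key=lambda p: p["seq"]))

if __name__ == "__main__":
    message = b"LOGIN " * 20
    packets = packetize(message, "UCLA", "SRI")
    for p in packets:
        print("packet", p["seq"], "travels", " -> ".join(route(p)))
    assert reassemble(packets) == message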

12.4 Networks that led to the Internet

12.4.1 ARPANET

Promoted to the head of the information processing office at the Defense Advanced Research Projects Agency (DARPA), Robert Taylor intended to realize Licklider's ideas of an interconnected networking system. Bringing in Larry Roberts from MIT, he initiated a project to build such a network. The first ARPANET link was established between the University of California, Los Angeles (UCLA) and the Stanford Research Institute at 22:30 hours on October 29, 1969.[11]

"We set up a telephone connection between us and the guys at SRI ...", Kleinrock said in an interview. "We typed the L and we asked on the phone, 'Do you see the L?' 'Yes, we see the L', came the response. We typed the O, and we asked, 'Do you see the O?' 'Yes, we see the O.' Then we typed the G, and the system crashed ... Yet a revolution had begun ...."[12]

35 Years of the Internet, 1969-2004. Stamp of Azerbaijan, 2004.

By December 5, 1969, a four-node network was connected by adding the University of Utah and the University of California, Santa Barbara. Building on ideas developed in ALOHAnet, the ARPANET grew rapidly. By 1981, the number of hosts had grown to 213, with a new host being added approximately every twenty days.[13][14]

ARPANET development was centered around the Request for Comments (RFC) process, still used today for proposing and distributing Internet protocols and systems. RFC 1, entitled "Host Software", was written by Steve Crocker from the University of California, Los Angeles, and published on April 7, 1969. These early years were documented in the 1972 film Computer Networks: The Heralds of Resource Sharing.

ARPANET became the technical core of what would become the Internet, and a primary tool in developing the technologies used. The early ARPANET used the Network Control Program (NCP, sometimes Network Control Protocol) rather than TCP/IP. On January 1, 1983, known as flag day, NCP on the ARPANET was replaced by the more flexible and powerful family of TCP/IP protocols, marking the start of the modern Internet.[15]

International collaborations on ARPANET were sparse. For various political reasons, European developers were concerned with developing the X.25 networks. Notable exceptions were the Norwegian Seismic Array (NORSAR) in 1972, followed in 1973 by Sweden with satellite links to the Tanum Earth Station, and Peter Kirstein's research group in the UK, initially at the Institute of Computer Science, London University and later at University College London.[16]

12.4.2 NPL

In 1965, Donald Davies of the National Physical Laboratory (United Kingdom) proposed a national data network based on packet switching. The proposal was not taken up nationally, but by 1970 he had designed and built the Mark I packet-switched network to meet the needs of the multidisciplinary laboratory and prove the technology under operational conditions.[17] By 1976, 12 computers and 75 terminal devices were attached, and more were added until the network was replaced in 1986.

12.4.3 Merit Network

The Merit Network[18] was formed in 1966 as the Michigan Educational Research Information Triad to explore computer networking between three of Michigan's public universities as a means to help the state's educational and economic development.[19] With initial support from the State of Michigan and the National Science Foundation (NSF), the packet-switched network was first demonstrated in December 1971 when an interactive host-to-host connection was made between the IBM mainframe computer systems at the University of Michigan in Ann Arbor and Wayne State University in Detroit.[20] In October 1972, connections to the CDC mainframe at Michigan State University in East Lansing completed the triad.


Over the next several years, in addition to host-to-host interactive connections, the network was enhanced to support terminal-to-host connections, host-to-host batch connections (remote job submission, remote printing, batch file transfer), interactive file transfer, gateways to the Tymnet and Telenet public data networks, X.25 host attachments, gateways to X.25 data networks, Ethernet-attached hosts, and eventually TCP/IP, and additional public universities in Michigan joined the network.[20][21] All of this set the stage for Merit's role in the NSFNET project starting in the mid-1980s.

12.4.4 CYCLADES

The CYCLADES packet switching network was a French research network designed and directed by Louis Pouzin. First demonstrated in 1973, it was developed to explore alternatives to the initial ARPANET design and to support network research generally. It was the first network to make the hosts responsible for the reliable delivery of data, rather than the network itself, using unreliable datagrams and associated end-to-end protocol mechanisms.[22][23]

12.4.5 X.25 and public data networks

Main articles: X.25, Bulletin board system and FidoNet

1974 ABC interview with Arthur C. Clarke, in which he describes a future of ubiquitous networked personal computers.

Based on ARPA's research, packet switching network standards were developed by the International Telecommunication Union (ITU) in the form of X.25 and related standards. While using packet switching, X.25 is built on the concept of virtual circuits emulating traditional telephone connections. In 1974, X.25 formed the basis for the SERCnet network between British academic and research sites, which later became JANET. The initial ITU standard on X.25 was approved in March 1976.[24]

The British Post Office, Western Union International and Tymnet collaborated to create the first international packet switched network, referred to as the International Packet Switched Service (IPSS), in 1978. This network grew from Europe and the US to cover Canada, Hong Kong, and Australia by 1981. By the 1990s it provided a worldwide networking infrastructure.[25]

Unlike ARPANET, X.25 was commonly available for business use. Telenet offered its Telemail electronic mail service, which was also targeted to enterprise use rather than the general email system of the ARPANET.

The first public dial-in networks used asynchronous TTY terminal protocols to reach a concentrator operated in the public network. Some networks, such as CompuServe, used X.25 to multiplex the terminal sessions into their packet-switched backbones, while others, such as Tymnet, used proprietary protocols. In 1979, CompuServe became the first service to offer electronic mail capabilities and technical support to personal computer users. The company broke new ground again in 1980 as the first to offer real-time chat with its CB Simulator. Other major dial-in networks were America Online (AOL) and Prodigy, which also provided communications, content, and entertainment features. Many bulletin board system (BBS) networks also provided on-line access, such as FidoNet, which was popular amongst hobbyist computer users, many of them hackers and amateur radio operators.

12.4.6 UUCP and Usenet

Main articles: UUCP and Usenet

In 1979, two students at Duke University, Tom Truscott and Jim Ellis, originated the idea of using Bourne shell scripts to transfer news and messages on a serial-line UUCP connection with the nearby University of North Carolina at Chapel Hill. Following public release of the software, the mesh of UUCP hosts forwarding Usenet news rapidly expanded. UUCPnet, as it would later be named, also created gateways and links between FidoNet and dial-up BBS hosts. UUCP networks spread quickly due to the lower costs involved, the ability to use existing leased lines, X.25 links or even ARPANET connections, and the lack of strict use policies compared to later networks like CSNET and Bitnet (commercial organizations, who might provide bug fixes, could take part). All connects were local. By 1981 the number of UUCP hosts had grown to 550, nearly doubling to 940 in 1984. Sublink Network, operating since 1987 and officially founded in Italy in 1989, based its interconnectivity upon UUCP to redistribute mail and news group messages throughout its Italian nodes (about 100 at the time), owned both by private individuals and small companies. Sublink Network represented possibly one of the first examples of the Internet technology advancing through popular diffusion.[26]


12.5 Merging the networks and creating the Internet (1973-90)

12.5.1 TCP/IP

Main article: Internet Protocol Suite

Map of the TCP/IP test network in February 1982

With so many different network methods, something was needed to unify them. Robert E. Kahn of DARPA and ARPANET recruited Vinton Cerf of Stanford University to work with him on the problem. By 1973, they had worked out a fundamental reformulation, where the differences between network protocols were hidden by using a common internetwork protocol, and instead of the network being responsible for reliability, as in the ARPANET, the hosts became responsible. Cerf credits Hubert Zimmermann, Gerard LeLann and Louis Pouzin (designer of the CYCLADES network) with important work on this design.[27]

The specification of the resulting protocol, RFC 675 (Specification of Internet Transmission Control Program, by Vinton Cerf, Yogen Dalal and Carl Sunshine, Network Working Group, December 1974), contains the first attested use of the term internet, as a shorthand for internetworking; later RFCs repeat this use, so the word started out as an adjective rather than the noun it is today.

With the role of the network reduced to the bare minimum, it became possible to join almost any networks together, no matter what their characteristics were, thereby solving Kahn's initial problem. DARPA agreed to fund development of prototype software, and after several years of work, the first demonstration of a gateway between the Packet Radio network in the SF Bay area and the ARPANET was conducted by the Stanford Research Institute. On November 22, 1977, a three-network demonstration was conducted, including the ARPANET, the SRI's Packet Radio Van on the Packet Radio Network, and the Atlantic Packet Satellite network.[28][29]

A Stanford Research Institute Packet Radio Van, site of the first three-way internetworked transmission.

Stemming from the first specifications of TCP in 1974, TCP/IP emerged in mid-to-late 1978 in nearly final form. By 1981, the associated standards were published as RFCs 791, 792 and 793 and adopted for use. DARPA sponsored or encouraged the development of TCP/IP implementations for many operating systems and then scheduled a migration of all hosts on all of its packet networks to TCP/IP. On January 1, 1983, known as flag day, TCP/IP protocols became the only approved protocol on the ARPANET, replacing the earlier NCP protocol.[30]

12.5.2 From ARPANET to NSFNET

Main articles: ARPANET and NSFNET

BBN Technologies TCP/IP internet map, early 1986

After the ARPANET had been up and running for several years, ARPA looked for another agency to hand off the network to; ARPA's primary mission was funding cutting-edge research and development, not running a communications utility. Eventually, in July 1975, the network was turned over to the Defense Communications Agency, also part of the Department of Defense. In 1983, the U.S. military portion of the ARPANET was broken off as a separate network, the MILNET. MILNET subsequently became the unclassified but military-only NIPRNET, in parallel with the SECRET-level SIPRNET and JWICS for TOP SECRET and above. NIPRNET does have controlled security gateways to the public Internet.

The networks based on the ARPANET were government funded and therefore restricted to noncommercial uses such as research; unrelated commercial use was strictly forbidden. This initially restricted connections to military sites and universities. During the 1980s, the connections expanded to more educational institutions, and even to a growing number of companies such as Digital Equipment Corporation and Hewlett-Packard, which were participating in research projects or providing services to those who were.

Several other branches of the U.S. government, the National Aeronautics and Space Administration (NASA), the National Science Foundation (NSF), and the Department of Energy (DOE), became heavily involved in Internet research and started development of a successor to ARPANET. In the mid-1980s, all three of these branches developed the first wide area networks based on TCP/IP. NASA developed the NASA Science Network, NSF developed CSNET, and DOE evolved the Energy Sciences Network, or ESNet.

T3 NSFNET Backbone, c. 1992

NASA developed the TCP/IP-based NASA Science Network (NSN) in the mid-1980s, connecting space scientists to data and information stored anywhere in the world. In 1989, the DECnet-based Space Physics Analysis Network (SPAN) and the TCP/IP-based NASA Science Network (NSN) were brought together at NASA Ames Research Center, creating the first multiprotocol wide area network, called the NASA Science Internet, or NSI. NSI was established to provide a totally integrated communications infrastructure to the NASA scientific community for the advancement of earth, space and life sciences. As a high-speed, multiprotocol, international network, NSI provided connectivity to over 20,000 scientists across all seven continents.

In 1981 NSF supported the development of the Computer Science Network (CSNET). CSNET connected with ARPANET using TCP/IP, and ran TCP/IP over X.25, but it also supported departments without sophisticated network connections, using automated dial-up mail exchange.

Its experience with CSNET led NSF to use TCP/IP when it created NSFNET, a 56 kbit/s backbone established in 1986 to support the NSF-sponsored supercomputing centers. The NSFNET project also provided support for the creation of regional research and education networks in the United States, and for the connection of university and college campus networks to the regional networks.[31] The use of NSFNET and the regional networks was not limited to supercomputer users, and the 56 kbit/s network quickly became overloaded. NSFNET was upgraded to 1.5 Mbit/s in 1988 under a cooperative agreement with the Merit Network in partnership with IBM, MCI, and the State of Michigan. The existence of NSFNET and the creation of the Federal Internet Exchanges (FIXes) allowed the ARPANET to be decommissioned in 1990. NSFNET was expanded and upgraded to 45 Mbit/s in 1991, and was decommissioned in 1995 when it was replaced by backbones operated by several commercial Internet service providers.

12.5.3 Transition towards the Internet

The term internet was adopted in the first RFC published on the TCP protocol (RFC 675:[32] Internet Transmission Control Program, December 1974) as an abbreviation of the term internetworking, and the two terms were used interchangeably. In general, an internet was any network using TCP/IP. It was around the time when ARPANET was interlinked with NSFNET in the late 1980s that the term was used as the name of the network, Internet, being the large and global TCP/IP network.[33]

As interest in networking grew and new applications for it were developed, the Internet's technologies spread throughout the rest of the world. The network-agnostic approach in TCP/IP meant that it was easy to use any existing network infrastructure, such as the IPSS X.25 network, to carry Internet traffic. In 1984, University College London replaced its transatlantic satellite links with TCP/IP over IPSS.[34]

Many sites unable to link directly to the Internet created simple gateways for the transfer of electronic mail, the most important application of the time. Sites with only intermittent connections used UUCP or FidoNet and relied on the gateways between these networks and the Internet. Some gateway services went beyond simple mail peering, such as allowing access to File Transfer Protocol (FTP) sites via UUCP or mail.[35]

Finally, routing technologies were developed for the Internet to remove the remaining centralized routing aspects. The Exterior Gateway Protocol (EGP) was replaced by a new protocol, the Border Gateway Protocol (BGP). This provided a meshed topology for the Internet and reduced the centric architecture which ARPANET had emphasized. In 1994, Classless Inter-Domain Routing (CIDR) was introduced to support better conservation of address space, which allowed use of route aggregation to decrease the size of routing tables.[36]
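As a small illustration of the route aggregation that CIDR enables, the sketch below uses Python's standard ipaddress module; the prefixes are documentation addresses chosen for the example, not routes from any real table.

# A sketch of CIDR route aggregation with the standard-library ipaddress module.
import ipaddress

# Four contiguous /24 routes that would otherwise occupy four table entries.
routes = [
    ipaddress.ip_network("198.51.100.0/24"),
    ipaddress.ip_network("198.51.101.0/24"),
    ipaddress.ip_network("198.51.102.0/24"),
    ipaddress.ip_network("198.51.103.0/24"),
]

# collapse_addresses() merges adjacent prefixes into the shortest covering set,
# here a single /22, shrinking the routing table from four entries to one.
aggregated = list(ipaddress.collapse_addresses(routes))
print(aggregated)  # [IPv4Network('198.51.100.0/22')]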

12.6 TCP/IP goes global (1989-2010)

12.6.1 CERN, the European Internet, the link to the Pacific and beyond

Between 1984 and 1988 CERN began installation and operation of TCP/IP to interconnect its major internal computer systems, workstations, PCs and an accelerator control system. CERN continued to operate a limited self-developed system (CERNET) internally and several incompatible (typically proprietary) network protocols externally. There was considerable resistance in Europe towards more widespread use of TCP/IP, and the CERN TCP/IP intranets remained isolated from the Internet until 1989.

In 1988, Daniel Karrenberg, from Centrum Wiskunde & Informatica (CWI) in Amsterdam, visited Ben Segal, CERN's TCP/IP coordinator, looking for advice about the transition of the European side of the UUCP Usenet network (much of which ran over X.25 links) over to TCP/IP. In 1987, Ben Segal had met with Len Bosack from the then still small company Cisco about purchasing some TCP/IP routers for CERN, and he was able to give Karrenberg advice and forward him on to Cisco for the appropriate hardware. This expanded the European portion of the Internet across the existing UUCP networks, and in 1989 CERN opened its first external TCP/IP connections.[37] This coincided with the creation of Réseaux IP Européens (RIPE), initially a group of IP network administrators who met regularly to carry out coordination work together. Later, in 1992, RIPE was formally registered as a cooperative in Amsterdam.

At the same time as the rise of internetworking in Europe, ad hoc networking to ARPA and between Australian universities formed, based on various technologies such as X.25 and UUCPNet. These were limited in their connection to the global networks, due to the cost of making individual international UUCP dial-up or X.25 connections. In 1989, Australian universities joined the push towards using IP protocols to unify their networking infrastructures. AARNet was formed in 1989 by the Australian Vice-Chancellors' Committee and provided a dedicated IP-based network for Australia.

The Internet began to penetrate Asia in the late 1980s. Japan, which had built the UUCP-based network JUNET in 1984, connected to NSFNET in 1989. It hosted the annual meeting of the Internet Society, INET'92, in Kobe. Singapore developed TECHNET in 1990, and Thailand gained a global Internet connection between Chulalongkorn University and UUNET in 1992.[38]

12.6.2 Global digital divide

Main articles: Global digital divide and Digital divide

Internet users in 2012 as a percentage of a country's population. Source: International Telecommunications Union.[39]

Fixed broadband Internet subscriptions in 2012 as a percentage of a country's population. Source: International Telecommunications Union.[40]

Mobile broadband Internet subscriptions in 2012 as a percentage of a country's population. Source: International Telecommunications Union.[41]

While developed countries with technological infrastructures were joining the Internet, developing countries began to experience a digital divide separating them from the Internet. On an essentially continental basis, they are building organizations for Internet resource administration and sharing operational experience, as more and more transmission facilities go into place.

Africa

At the beginning of the 1990s, African countries relied upon X.25 IPSS and 2400 baud modem UUCP links for international and internetwork computer communications.

In August 1995, InfoMail Uganda, Ltd., a privately held firm in Kampala now known as InfoCom, and NSN Network Services of Avon, Colorado, sold in 1997 and now known as Clear Channel Satellite, established Africa's first native TCP/IP high-speed satellite Internet services. The data connection was originally carried by a C-band RSCC Russian satellite which connected InfoMail's Kampala offices directly to NSN's MAE-West point of presence using a private network from NSN's leased ground station in New Jersey. InfoCom's first satellite connection was just 64 kbit/s, serving a Sun host computer and twelve US Robotics dial-up modems.

In 1996, a USAID-funded project, the Leland Initiative, started work on developing full Internet connectivity for the continent. Guinea, Mozambique, Madagascar and Rwanda gained satellite earth stations in 1997, followed by Côte d'Ivoire and Benin in 1998.

Africa is building an Internet infrastructure. AfriNIC, headquartered in Mauritius, manages IP address allocation for the continent. As in the other Internet regions, there is an operational forum, the Internet Community of Operational Networking Specialists.[42]

There are many programs to provide high-performance transmission plant, and the western and southern coasts have undersea optical cable. High-speed cables join North Africa and the Horn of Africa to intercontinental cable systems. Undersea cable development is slower for East Africa; the original joint effort between the New Partnership for Africa's Development (NEPAD) and the East Africa Submarine System (Eassy) has broken off and may become two efforts.[43]

Asia and Oceania

The Asia Pacific Network Information Centre (APNIC), headquartered in Australia, manages IP address allocation for the region. APNIC sponsors an operational forum, the Asia-Pacific Regional Internet Conference on Operational Technologies (APRICOT).[44]

In 1991, the People's Republic of China saw its first TCP/IP college network, Tsinghua University's TUNET. The PRC went on to make its first global Internet connection in 1994, between the Beijing Electro-Spectrometer Collaboration and Stanford University's Linear Accelerator Center. However, China went on to implement its own digital divide by implementing a country-wide content filter.[45]

Latin America

As with the other regions, the Latin American and Caribbean Internet Addresses Registry (LACNIC) manages the IP address space and other resources for its area. LACNIC, headquartered in Uruguay, operates DNS root, reverse DNS, and other key services.

12.6.3 Opening the network to commerce

Number of Internet hosts worldwide, 1981-2012. Source: Internet Systems Consortium.[46]

The interest in commercial use of the Internet became a hotly debated topic. Although commercial use was forbidden, the exact definition of commercial use could be unclear and subjective. UUCPNet and the X.25 IPSS had no such restrictions, which would eventually see the official barring of UUCPNet use of ARPANET and NSFNET connections. Some UUCP links still remained connecting to these networks, however, as administrators turned a blind eye to their operation.

During the late 1980s, the first Internet service provider (ISP) companies were formed. Companies like PSINet, UUNET, Netcom, and Portal Software were formed to provide service to the regional research networks and to provide alternate network access, UUCP-based email and Usenet News to the public. The first commercial dialup ISP in the United States was The World, which opened in 1989.[47]


In 1992, the U.S. Congress passed the Scientific and Advanced-Technology Act, 42 U.S.C. § 1862(g), which allowed NSF to support access by the research and education communities to computer networks which were not used exclusively for research and education purposes, thus permitting NSFNET to interconnect with commercial networks.[48][49] This caused controversy within the research and education community, who were concerned commercial use of the network might lead to an Internet that was less responsive to their needs, and within the community of commercial network providers, who felt that government subsidies were giving an unfair advantage to some organizations.[50]

By 1990, ARPANET had been overtaken and replaced by newer networking technologies and the project came to a close. New network service providers, including PSINet, Alternet, CERFNet, ANS CO+RE, and many others, were offering network access to commercial customers. NSFNET was no longer the de facto backbone and exchange point of the Internet. The Commercial Internet eXchange (CIX), Metropolitan Area Exchanges (MAEs), and later Network Access Points (NAPs) were becoming the primary interconnections between many networks. The final restrictions on carrying commercial traffic ended on April 30, 1995, when the National Science Foundation ended its sponsorship of the NSFNET Backbone Service and the service ended.[51][52] NSF provided initial support for the NAPs and interim support to help the regional research and education networks transition to commercial ISPs. NSF also sponsored the very high speed Backbone Network Service (vBNS), which continued to provide support for the supercomputing centers and for research and education in the United States.[53]

12.7 Networking in outer space

Main article: Interplanetary Internet

The first live Internet link into low Earth orbit was established on January 22, 2010, when astronaut T. J. Creamer posted the first unassisted update to his Twitter account from the International Space Station, marking the extension of the Internet into space.[54] (Astronauts at the ISS had used email and Twitter before, but these messages had been relayed to the ground through a NASA data link before being posted by a human proxy.) This personal Web access, which NASA calls the Crew Support LAN, uses the space station's high-speed Ku-band microwave link. To surf the Web, astronauts can use a station laptop computer to control a desktop computer on Earth, and they can talk to their families and friends on Earth using Voice over IP equipment.[55]

Communication with spacecraft beyond Earth orbit has traditionally been over point-to-point links through the Deep Space Network. Each such data link must be manually scheduled and configured. In the late 1990s NASA and Google began working on a new network protocol, Delay-tolerant networking (DTN), which automates this process, allows networking of spaceborne transmission nodes, and takes into account that spacecraft can temporarily lose contact because they move behind the Moon or planets, or because space weather disrupts the connection. Under such conditions, DTN retransmits data packages instead of dropping them, as the standard TCP/IP Internet protocol does. NASA conducted the first field test of what it calls the "deep space internet" in November 2008.[56] Testing of DTN-based communications between the International Space Station and Earth (now termed Disruption-Tolerant Networking) has been ongoing since March 2009, and is scheduled to continue until March 2014.[57]

This network technology is supposed to ultimately enable missions that involve multiple spacecraft, where reliable inter-vessel communication might take precedence over vessel-to-Earth downlinks. According to a February 2011 statement by Google's Vint Cerf, the so-called "Bundle protocols" have been uploaded to NASA's EPOXI mission spacecraft (which is in orbit around the Sun), and communication with Earth has been tested at a distance of approximately 80 light seconds.[58]

12.8 Internet governance

Main article: Internet governance

As a globally distributed network of voluntarily interconnected autonomous networks, the Internet operates without a central governing body. It has no centralized governance for either technology or policies, and each constituent network chooses what technologies and protocols it will deploy from the voluntary technical standards that are developed by the Internet Engineering Task Force (IETF).[59] However, throughout its entire history, the Internet system has had an Internet Assigned Numbers Authority (IANA) for the allocation and assignment of various technical identifiers needed for the operation of the Internet.[60] The Internet Corporation for Assigned Names and Numbers (ICANN) provides oversight and coordination for two principal name spaces in the Internet: the Internet Protocol address space and the Domain Name System.

12.8.1 NIC, InterNIC, IANA and ICANN

Main articles: InterNIC, Internet Assigned Numbers Authority and ICANN

The IANA function was originally performed by USC Information Sciences Institute, and it delegated portions of
this responsibility with respect to numeric network and
autonomous system identiers to the Network Information Center (NIC) at Stanford Research Institute (SRI International) in Menlo Park, California. In addition to his
role as the RFC Editor, Jon Postel worked as the manager
of IANA until his death in 1998.

122
As the early ARPANET grew, hosts were referred to by
names, and a HOSTS.TXT le would be distributed from
SRI International to each host on the network. As the network grew, this became cumbersome. A technical solution came in the form of the Domain Name System, created by Paul Mockapetris. The Defense Data Network
Network Information Center (DDN-NIC) at SRI handled all registration services, including the top-level domains (TLDs) of .mil, .gov, .edu, .org, .net, .com and .us,
root nameserver administration and Internet number assignments under a United States Department of Defense
contract.[60] In 1991, the Defense Information Systems
Agency (DISA) awarded the administration and maintenance of DDN-NIC (managed by SRI up until this point)
to Government Systems, Inc., who subcontracted it to the
small private-sector Network Solutions, Inc.[61][62]
The increasing cultural diversity of the Internet also
posed administrative challenges for centralized management of the IP addresses. In October 1992, the Internet
Engineering Task Force (IETF) published RFC 1366,[63]
which described the growth of the Internet and its increasing globalization and set out the basis for an evolution of the IP registry process, based on a regionally distributed registry model. This document stressed the need
for a single Internet number registry to exist in each geographical region of the world (which would be of continental dimensions). Registries would be unbiased and
widely recognized by network providers and subscribers
within their region. The RIPE Network Coordination
Centre (RIPE NCC) was established as the first RIR in May 1992. The second RIR, the Asia Pacific Network Information Centre (APNIC), was established in Tokyo in 1993, as a pilot project of the Asia Pacific Networking
Group.[64]



Since at this point in history most of the growth on the Internet was coming from non-military sources, it was decided that the Department of Defense would no longer fund registration services outside of the .mil TLD. In 1993 the U.S. National Science Foundation, after a competitive bidding process in 1992, created the InterNIC to manage the allocations of addresses and management of the address databases, and awarded the contract to three organizations. Registration Services would be provided by Network Solutions; Directory and Database Services would be provided by AT&T; and Information Services would be provided by General Atomics.[65]

Over time, after consultation with the IANA, the IETF, RIPE NCC, APNIC, and the Federal Networking Council (FNC), the decision was made to separate the management of domain names from the management of IP numbers.[64] Following the examples of RIPE NCC and APNIC, it was recommended that management of IP address space then administered by the InterNIC should be under the control of those that use it, specifically the ISPs, end-user organizations, corporate entities, universities, and individuals. As a result, the American Registry for Internet Numbers (ARIN) was established in December 1997 as an independent, not-for-profit corporation by direction of the National Science Foundation and became the third Regional Internet Registry.[66]

In 1998, both the IANA and remaining DNS-related InterNIC functions were reorganized under the control of ICANN, a California non-profit corporation contracted by the United States Department of Commerce to manage a number of Internet-related tasks. As these tasks involved technical coordination for two principal Internet name spaces (DNS names and IP addresses) created by the IETF, ICANN also signed a memorandum of understanding with the IAB to define the technical work to be carried out by the Internet Assigned Numbers Authority.[67] The management of Internet address space remained with the regional Internet registries, which collectively were defined as a supporting organization within the ICANN structure.[68] ICANN provides central coordination for the DNS system, including policy coordination for the split registry/registrar system, with competition among registry service providers to serve each top-level domain and multiple competing registrars offering DNS services to end-users.

12.8.2 Internet Engineering Task Force


The Internet Engineering Task Force (IETF) is the largest
and most visible of several loosely related ad-hoc groups
that provide technical direction for the Internet, including the Internet Architecture Board (IAB), the Internet
Engineering Steering Group (IESG), and the Internet Research Task Force (IRTF).
The IETF is a loosely self-organized group of international volunteers who contribute to the engineering and
evolution of Internet technologies. It is the principal
body engaged in the development of new Internet standard specifications. Much of the IETF's work is done in
Working Groups. It does not run the Internet, despite
what some people might mistakenly say. The IETF does
make voluntary standards that are often adopted by Internet users, but it does not control, or even patrol, the
Internet.[69][70]

The IETF started in January 1986 as a quarterly meeting of U.S. government funded researchers. Non-government representatives were invited starting with the fourth IETF meeting in October 1986. The concept of Working Groups was introduced at the fifth IETF meeting in February 1987. The seventh IETF meeting in July 1987 was the first meeting with more than 100 attendees. In 1992, the Internet Society, a professional membership society, was formed and the IETF began to operate under it as an independent international standards body. The first IETF meeting outside of the United States was held in Amsterdam, The Netherlands, in July 1993. Today the IETF meets three times a year and attendance is often about 1,300 people, but has been as high as 2,000 upon occasion. Typically one in three IETF meetings are


held in Europe or Asia. The number of non-US attendees is roughly 50%, even at meetings held in the United States.[69]

The IETF is unusual in that it exists as a collection of happenings, but is not a corporation and has no board of directors, no members, and no dues. The closest thing there is to being an IETF member is being on the IETF or a Working Group mailing list. IETF volunteers come from all over the world and from many different parts of the Internet community. The IETF works closely with and under the supervision of the Internet Engineering Steering Group (IESG)[71] and the Internet Architecture Board (IAB).[72] The Internet Research Task Force (IRTF) and the Internet Research Steering Group (IRSG), peer activities to the IETF and IESG under the general supervision of the IAB, focus on longer term research issues.[69][73]
Request for Comments
Request for Comments (RFCs) are the main documentation for the work of the IAB, IESG, IETF, and IRTF.
RFC 1, Host Software, was written by Steve Crocker at
UCLA in April 1969, well before the IETF was created.
Originally they were technical memos documenting aspects of ARPANET development and were edited by Jon
Postel, the first RFC Editor.[69][74]
RFCs cover a wide range of information, from proposed standards, draft standards, full standards, and best practices to experimental protocols, history, and other informational
topics.[75] RFCs can be written by individuals or informal groups of individuals, but many are the product of
a more formal Working Group. Drafts are submitted to
the IESG either by individuals or by the Working Group
Chair. An RFC Editor, appointed by the IAB, separate
from IANA, and working in conjunction with the IESG,
receives drafts from the IESG and edits, formats, and
publishes them. Once an RFC is published, it is never revised. If the standard it describes changes or its information becomes obsolete, the revised standard or updated
information will be re-published as a new RFC that obsoletes the original.[69][74]

12.8.3 The Internet Society

The Internet Society (ISOC) is an international, nonprofit organization founded during 1992 to assure the open development, evolution and use of the Internet for the benefit of all people throughout the world. With offices near Washington, DC, USA, and in Geneva, Switzerland, ISOC has a membership base comprising more than 80 organizational and more than 50,000 individual members. Members also form chapters based on either common geographical location or special interests. There are currently more than 90 chapters around the world.[76]

ISOC provides financial and organizational support to and promotes the work of the standards-setting bodies for which it is the organizational home: the Internet Engineering Task Force (IETF), the Internet Architecture Board (IAB), the Internet Engineering Steering Group (IESG), and the Internet Research Task Force (IRTF). ISOC also promotes understanding and appreciation of the Internet model of open, transparent processes and consensus-based decision making.[77]

12.8.4 Globalization and Internet governance in the 21st century


Since the 1990s, the Internet's governance and organization has been of global importance to governments, commerce, civil society, and individuals. The organizations which held control of certain technical aspects of the Internet were the successors of the old ARPANET oversight and the current decision-makers in the day-to-day technical aspects of the network. While recognized as the administrators of certain aspects of the Internet, their roles and their decision-making authority are limited and subject to increasing international scrutiny and increasing objections. These objections have led to ICANN removing itself from relationships with first the University of Southern California in 2000,[78] and finally in September 2009, gaining autonomy from the US government by the ending of its longstanding agreements, although some contractual obligations with the U.S. Department of Commerce continued.[79][80][81]
The IETF, with financial and organizational support from the Internet Society, continues to serve as the Internet's ad-hoc standards body and issues Request for Comments.
In November 2005, the World Summit on the Information Society, held in Tunis, called for an Internet Governance Forum (IGF) to be convened by the United Nations Secretary General. The IGF opened an ongoing, non-binding conversation among stakeholders representing governments, the private sector, civil society, and the technical and academic communities about the future of Internet governance. The first IGF meeting was held in October/November 2006 with follow-up meetings annually thereafter.[82] Since WSIS, the term Internet governance has been broadened beyond narrow technical concerns to include a wider range of Internet-related policy
issues.[83][84]

12.9 Net neutrality


Main article: Net neutrality

On April 23, 2014, the Federal Communications Commission (FCC) was reported to be considering a new rule that would permit Internet service providers to offer content providers a faster track to send content, thus reversing their earlier net neutrality position.[85][86][87]


A possible solution to net neutrality concerns may be municipal broadband, according to Professor Susan Crawford, a legal and technology expert at Harvard Law School.[88] On May 15, 2014, the FCC decided to consider two options regarding Internet services: first,
permit fast and slow broadband lanes, thereby compromising net neutrality; and second, reclassify broadband as a telecommunication service, thereby preserving
net neutrality.[89][90] On November 10, 2014, President
Obama recommended the FCC reclassify broadband Internet service as a telecommunications service in order to
preserve net neutrality.[91][92][93] On January 16, 2015,
Republicans presented legislation, in the form of a U.
S. Congress H. R. discussion draft bill, that makes concessions to net neutrality but prohibits the FCC from
accomplishing the goal or enacting any further regulation affecting Internet service providers (ISPs).[94][95]
On January 31, 2015, AP News reported that the FCC would present the notion of applying (with some caveats)
Title II (common carrier) of the Communications Act of
1934 to the internet in a vote expected on February 26,
2015.[96][97][98][99][100] Adoption of this notion would reclassify internet service from one of information to one of
telecommunications[101] and, according to Tom Wheeler,
chairman of the FCC, ensure net neutrality.[102][103]

12.10 Use and culture


Main article: Sociology of the Internet

12.10.1 Demographics

[Figure: Internet users per 100 inhabitants, 1996-2014 (* estimate). Source: International Telecommunications Union.[104][105]]

See also: Global Internet usage

12.10.2 Email and Usenet

Main articles: e-mail, Simple Mail Transfer Protocol and Usenet

Email has often been called the killer application of the Internet. It actually predates the Internet and was a crucial tool in creating it. Email started in 1965 as a way
for multiple users of a time-sharing mainframe computer
to communicate. Although the history is undocumented,
among the first systems to have such a facility were the
System Development Corporation (SDC) Q32 and the
Compatible Time-Sharing System (CTSS) at MIT.[106]
The ARPANET computer network made a large contribution to the evolution of electronic mail. An experimental inter-system transfer of mail occurred on the ARPANET
shortly after its creation.[107] In 1971 Ray Tomlinson created what was to become the standard Internet electronic
mail addressing format, using the @ sign to separate mailbox names from host names.[108]
A number of protocols were developed to deliver messages among groups of time-sharing computers over alternative transmission systems, such as UUCP and IBM's
VNET email system. Email could be passed this way
between a number of networks, including ARPANET,
BITNET and NSFNET, as well as to hosts connected directly to other sites via UUCP. See the history of the SMTP protocol.
In addition, UUCP allowed the publication of text files
that could be read by many others. The News software
developed by Steve Daniel and Tom Truscott in 1979
was used to distribute news and bulletin board-like messages. This quickly grew into discussion groups, known
as newsgroups, on a wide range of topics. On ARPANET
and NSFNET similar discussion groups would form via
mailing lists, discussing both technical issues and more
culturally focused topics (such as science fiction, discussed on the sflovers mailing list).
During the early years of the Internet, email and similar mechanisms were also fundamental to allow people
to access resources that were not available due to the absence of online connectivity. UUCP was often used to
distribute files using the 'alt.binary' groups. Also, FTP e-mail gateways allowed people who lived outside the US and Europe to download files using ftp commands written inside email messages. The file was encoded, broken in
pieces and sent by email; the receiver had to reassemble
and decode it later, and it was the only way for people living overseas to download items such as the earlier Linux
versions using the slow dial-up connections available at
the time. After the popularization of the Web and the
HTTP protocol such tools were slowly abandoned.
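To illustrate the encode, split, and reassemble workflow described above, here is a minimal, hypothetical Python sketch; the chunk size and the use of base64 are assumptions chosen to keep the example runnable (historical gateways typically relied on uuencode), not a description of any particular gateway.

import base64

CHUNK_CHARS = 60_000  # assumed per-message limit; real mail systems varied

def split_for_email(path):
    # Encode the binary file as plain text, then cut it into email-sized pieces.
    encoded = base64.b64encode(open(path, "rb").read()).decode("ascii")
    return [encoded[i:i + CHUNK_CHARS] for i in range(0, len(encoded), CHUNK_CHARS)]

def reassemble(pieces, out_path):
    # Receiver side: join the pieces in order and decode back to the original bytes.
    with open(out_path, "wb") as f:
        f.write(base64.b64decode("".join(pieces)))

# Usage sketch: send each piece in a separate email, then rebuild the file on arrival.
# pieces = split_for_email("archive.tar"); reassemble(pieces, "archive_copy.tar")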

12.10.3 From Gopher to the WWW

Main articles: History of the World Wide Web and World Wide Web
As the Internet grew through the 1980s and early 1990s, many people realized the increasing need to be able to find and organize files and information. Projects such as Archie, Gopher, WAIS, and the FTP Archive list attempted to create ways to organize distributed data. In the early 1990s, Gopher, invented by Mark P. McCahill, offered a viable alternative to the World Wide Web. However, in 1993 the World Wide Web saw many advances to indexing and ease of access through search engines, which often neglected Gopher and Gopherspace. As popularity increased through ease of use, investment incentives also grew until in the middle of 1994 the WWW's popularity gained the upper hand. Then it became clear that Gopher and the other projects were doomed to fall short.[109]

One of the most promising user interface paradigms during this period was hypertext. The technology had been inspired by Vannevar Bush's "Memex"[110] and developed through Ted Nelson's research on Project Xanadu and Douglas Engelbart's research on NLS.[111] Many small self-contained hypertext systems had been created before, such as Apple Computer's HyperCard (1987). Gopher became the first commonly used hypertext interface to the Internet. While Gopher menu items were examples of hypertext, they were not commonly perceived in that way.

[Image: This NeXT Computer was used by Sir Tim Berners-Lee at CERN and became the world's first Web server.]

In 1989, while working at CERN, Tim Berners-Lee invented a network-based implementation of the hypertext concept. By releasing his invention to public use, he ensured the technology would become widespread.[112] For his work in developing the World Wide Web, Berners-Lee received the Millennium technology prize in 2004.[113] One early popular web browser, modeled after HyperCard, was ViolaWWW.

A turning point for the World Wide Web began with the introduction[114] of the Mosaic web browser[115] in 1993, a graphical browser developed by a team at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign (NCSA-UIUC), led by Marc Andreessen. Funding for Mosaic came from the High-Performance Computing and Communications Initiative, a funding program initiated by the High Performance Computing and Communication Act of 1991, also known as the Gore Bill.[116] Mosaic's graphical interface soon became more popular than Gopher, which at the time was primarily text-based, and the WWW became the preferred interface for accessing the Internet. (Gore's reference to his role in creating the Internet, however, was ridiculed in his presidential election campaign. See the full article Al Gore and information technology.)

Mosaic was eventually superseded in 1994 by Andreessen's Netscape Navigator, which replaced Mosaic as the world's most popular browser. While it held this title for some time, eventually competition from Internet Explorer and a variety of other browsers almost completely displaced it. Another important event held on January 11, 1994, was The Superhighway Summit at UCLA's Royce Hall. This was the "first public conference bringing together all of the major industry, government and academic leaders in the field [and] also began the national dialogue about the Information Superhighway and its implications".[117]

24 Hours in Cyberspace, the largest one-day online event (February 8, 1996) up to that date, took place on the then-active website, cyber24.com.[118][119] It was headed by photographer Rick Smolan.[120] A photographic exhibition was unveiled at the Smithsonian Institution's National Museum of American History on January 23, 1997, featuring 70 photos from the project.[121]

12.10.4 Search engines

Main article: Search engine (computing)

[Image: Search engines can be considered to be the last layer of technology that turned the Internet into the extremely useful tool that it is today.]

Even before the World Wide Web, there were search engines that attempted to organize the Internet. The first of these was the Archie search engine from McGill University in 1990, followed in 1991 by WAIS and Gopher. All three of those systems predated the invention of the World Wide Web but all continued to index the Web and the rest of the Internet for several years after the Web


appeared. There are still Gopher servers as of 2006, although there are a great many more web servers.

As the Web grew, search engines and Web directories were created to track pages on the Web and allow people to find things. The first full-text Web search engine
was WebCrawler in 1994. Before WebCrawler, only Web
page titles were searched. Another early search engine,
Lycos, was created in 1993 as a university project, and
was the first to achieve commercial success. During the late 1990s, both Web directories and Web search engines were popular; Yahoo! (founded 1994) and Altavista (founded 1995) were the respective industry leaders. By
August 2001, the directory model had begun to give way
to search engines, tracking the rise of Google (founded
1998), which had developed new approaches to relevancy
ranking. Directory features, while still commonly available, became after-thoughts to search engines.
Database size, which had been a significant marketing feature through the early 2000s, was similarly displaced by emphasis on relevancy ranking, the methods by which search engines attempt to sort the best results first. Relevancy ranking first became a major issue circa 1996, when it became apparent that it was impractical to review full lists of results. Consequently, algorithms for relevancy ranking have continuously improved. Google's PageRank method for ordering the results has received the most press, but all major search engines continually refine their ranking methodologies with a view toward improving the ordering of results. As of 2006, search engine rankings are more important than ever, so much so that an industry has developed ("search engine optimizers", or SEO) to help web-developers improve their search ranking, and an entire body of case law has developed around matters that affect search engine rankings, such as use of trademarks in metatags. The sale of search rankings by some search engines has also created controversy among librarians and consumer advocates.[122]
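For context, the idea behind the PageRank method mentioned above can be summarized by its commonly cited formula, shown here in LaTeX notation purely as an illustration (the damping value is the conventional choice, not something stated in this article):

PR(p_i) = \frac{1 - d}{N} + d \sum_{p_j \in M(p_i)} \frac{PR(p_j)}{L(p_j)}

where N is the total number of pages, M(p_i) is the set of pages linking to p_i, L(p_j) is the number of outbound links on page p_j, and d is a damping factor, conventionally around 0.85. In effect, a page ranks highly when it is linked to by other highly ranked pages.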
On June 3, 2009, Microsoft launched its new search
engine, Bing.[123] The following month Microsoft and
Yahoo! announced a deal in which Bing would power
Yahoo! Search.[124]

12.10.5 File sharing

Main articles: File sharing, Peer-to-peer file sharing and Timeline of file sharing

Resource or file sharing has been an important activity on computer networks from well before the Internet was established and was supported in a variety of ways including bulletin board systems (1978), Usenet (1980), Kermit (1981), and many others. The File Transfer Protocol (FTP) for use on the Internet was standardized in 1985 and is still in use today.[125] A variety of tools were developed to aid the use of FTP by helping users discover files they might want to transfer, including the Wide Area Information Server (WAIS) in 1991, Gopher in 1991, Archie in 1991, Veronica in 1992, Jughead in 1993, Internet Relay Chat (IRC) in 1988, and eventually the World Wide Web (WWW) in 1991 with Web directories and Web search engines.

In 1999, Napster became the first peer-to-peer file sharing system.[126] Napster used a central server for indexing and peer discovery, but the storage and transfer of files was decentralized. A variety of peer-to-peer file sharing programs and services with different levels
of decentralization and anonymity followed, including:
Gnutella, eDonkey2000, and Freenet in 2000, FastTrack,
Kazaa, Limewire, and BitTorrent in 2001, and Poisoned
in 2003.[127]
All of these tools are general purpose and can be used
to share a wide variety of content, but sharing of music files, software, and later movies and videos are major uses.[128] And while some of this sharing is legal, large portions are not. Lawsuits and other legal actions caused Napster in 2001, eDonkey2000 in 2005, Kazaa in 2006, and Limewire in 2010 to shut down or refocus their efforts.[129][130] The Pirate Bay, founded in Sweden in 2003, continues despite a trial and appeal in 2009 and 2010 that resulted in jail terms and large fines for several
of its founders.[131] File sharing remains contentious and
controversial with charges of theft of intellectual property on the one hand and charges of censorship on the
other.[132][133]

12.10.6 Dot-com bubble


Main article: Dot-com bubble
Suddenly the low price of reaching millions worldwide,
and the possibility of selling to or hearing from those
people at the same moment when they were reached,
promised to overturn established business dogma in advertising, mail-order sales, customer relationship management, and many more areas. The web was a new killer
appit could bring together unrelated buyers and sellers
in seamless and low-cost ways. Entrepreneurs around the
world developed new business models, and ran to their
nearest venture capitalist. While some of the new entrepreneurs had experience in business and economics,
the majority were simply people with ideas, and did not
manage the capital influx prudently. Additionally, many
dot-com business plans were predicated on the assumption that by using the Internet, they would bypass the distribution channels of existing businesses and therefore not
have to compete with them; when the established businesses with strong existing brands developed their own Internet presence, these hopes were shattered, and the newcomers were left attempting to break into markets dominated by larger, more established businesses. Many did
not have the ability to do so.

The dot-com bubble burst in March 2000, with the technology-heavy NASDAQ Composite index peaking at
5,048.62 on March 10[134] (5,132.52 intraday), more than
double its value just a year before. By 2001, the bubble's deflation was running full speed. A majority of the dot-coms had ceased trading, after having burnt through their venture capital and IPO capital, often without ever making a profit. But despite this, the Internet continues to
grow, driven by commerce, ever greater amounts of online information and knowledge and social networking.

12.10.7 Mobile phones and the Internet

See also: Mobile Web


The rst mobile phone with Internet connectivity was
the Nokia 9000 Communicator, launched in Finland in
1996. The viability of Internet services access on mobile phones was limited until prices came down from that
model and network providers started to develop systems
and services conveniently accessible on phones. NTT DoCoMo in Japan launched the first mobile Internet service, i-mode, in 1999 and this is considered the birth
of the mobile phone Internet services. In 2001, the mobile phone email system by Research in Motion for their
BlackBerry product was launched in America. To make efficient use of the small screen and tiny keypad and one-handed operation typical of mobile phones, a specific document and networking model was created for mobile
devices, the Wireless Application Protocol (WAP). Most
mobile device Internet services operate using WAP. The
growth of mobile phone services was initially a primarily
Asian phenomenon with Japan, South Korea and Taiwan
all soon finding the majority of their Internet users accessing resources by phone rather than by PC. Developing countries followed, with India, South Africa, Kenya,
Philippines, and Pakistan all reporting that the majority
of their domestic users accessed the Internet from a mobile phone rather than a PC. The European and North
American use of the Internet was influenced by a large
installed base of personal computers, and the growth of
mobile phone Internet access was more gradual, but had
reached national penetration levels of 20-30% in most
Western countries.[135] The cross-over occurred in 2008,
when more Internet access devices were mobile phones
than personal computers. In many parts of the developing world, the ratio is as much as 10 mobile phone users
to one PC user.[136]

12.11 Historiography
Some concerns have been raised over the historiography
of the Internet's development. The process of digitization represents a twofold challenge both for historiography in general and, in particular, for historical communication research.[137] Specifically, it is hard to find documentation of much of the Internet's development, for several
reasons, including a lack of centralized documentation
for much of the early developments that led to the Internet.
The Arpanet period is somewhat well documented because the corporation in charge, BBN, left a physical record. Moving into the NSFNET era, it became an extraordinarily decentralized process. The record exists in people's basements, in closets. [...] So much of what happened was done verbally and on the basis of individual trust.
Doug Gale (2007)[138]

12.12 See also


Index of Internet-related articles
Outline of the Internet
History of hypertext
History of the Internet in Sweden
History of the web browser
Net neutrality
On the Internet, nobody knows you're a dog

12.13 Notes
[1] The Worlds Technological Capacity to Store, Communicate, and Compute Information, Martin Hilbert and
Priscila Lpez (2011), Science (journal), 332(6025), 60
65; free access to the article through here: martinhilbert.
net/WorldInfoCapacity.html
[2] J. C. R. Licklider (March 1960). Man-Computer Symbiosis. IRE Transactions on Human Factors in Electronics. HFE-1: 411.
[3] J. C. R. Licklider and Welden Clark (August 1962). OnLine Man-Computer Communication. AIEE-IRE '62
(Spring): 113128.
[4] Licklider, J. C. R. (23 April 1963). Topics for Discussion
at the Forthcoming Meeting, Memorandum For: Members and Aliates of the Intergalactic Computer Network. Washington, D.C.: Advanced Research Projects
Agency. Retrieved 2013-01-26.
[5] Robert Taylor in an interview with John Marko (December 20, 1999). An Internet Pioneer Ponders the Next
Revolution. The New York Times. Retrieved November
25, 2005.
[6] J.C.R. Licklider and the Universal Network. The Internet. 2000.


[7] Leonard Kleinrock (2005). The history of the Internet.


Retrieved May 28, 2009.
[8] Baran, Paul (May 27, 1960). Reliable Digital Communications Using Unreliable Network Repeater Nodes
(PDF). The RAND Corporation. p. 1. Retrieved July 25,
2012.
[9] Rutheld, Scott (September 1995).
The Internets History and Development From Wartime Tool
to the Fish-Cam. Crossroads 2 (1). pp. 24.
doi:10.1145/332198.332202. Archived from the original
on October 18, 2007. Retrieved July 25, 2012.
[10] About Rand. Paul Baran and the Origins of the Internet.
Retrieved July 25, 2012.
[11] Strickland, Jonathan (n.d.). How ARPANET Works.
[12] Gromov, Gregory (1995). Roads and Crossroads of Internet History.
[13] Hafner, Katie (1998). Where Wizards Stay Up Late: The
Origins Of The Internet. Simon & Schuster. ISBN 0-68483267-4.
[14] Ronda Hauben (2001). From the ARPANET to the Internet. Retrieved May 28, 2009.
[15] Postel, J. (November 1981). The General Plan.
NCP/TCP transition plan. IETF. p. 2. RFC 801. https:
//tools.ietf.org/html/rfc801#page-2. Retrieved February
1, 2011.
[16] NORSAR and the Internet. NORSAR. Retrieved June
5, 2009.
[17] Ward, Mark (October 29, 2009). Celebrating 40 years
of the net. BBC News.
[18] The Merit Network, Inc. is an independent non-prot
501(c)(3) corporation governed by Michigans public universities. Merit receives administrative services under an
agreement with the University of Michigan.
[19] A Chronicle of Merits Early History, John Mulcahy, 1989,
Merit Network, Ann Arbor, Michigan
[20] Merit Network Timeline: 19701979, Merit Network,
Ann Arbor, Michigan
[21] Merit Network Timeline: 19801989, Merit Network,
Ann Arbor, Michigan
[22] A Technical History of CYCLADES. Technical Histories of the Internet & other Network Protocols. Computer
Science Department, University of Texas Austin.
[23] The Cyclades Experience: Results and Impacts, Zimmermann, H., Proc. IFIP'77 Congress, Toronto, August
1977, pp. 465469


[26] UUCP Internals Frequently Asked Questions


[27] Barry M. Leiner, Vinton G. Cerf, David D. Clark, Robert
E. Kahn, Leonard Kleinrock, Daniel C. Lynch, Jon Postel,
Larry G. Roberts, Stephen Wol (2003). A Brief History
of Internet. Retrieved May 28, 2009.
[28] Computer History Museum and Web History Center
Celebrate 30th Anniversary of Internet Milestone. Retrieved November 22, 2007.
[29] Ogg, Erica (2007-11-08). "'Internet van' helped drive
evolution of the Web. CNET. Retrieved 2011-11-12.
[30] Jon Postel, NCP/TCP Transition Plan, RFC 801
[31] David Roessner, Barry Bozeman, Irwin Feller, Christopher Hill, Nils Newman (1997). The Role of NSFs
Support of Engineering in Enabling Technological Innovation. Retrieved May 28, 2009.
[32] RFC 675 Specication of internet transmission control
program. Tools.ietf.org. Retrieved May 28, 2009.
[33] Tanenbaum, Andrew S. (1996). Computer Networks.
Prentice Hall. ISBN 0-13-394248-1.
[34] Hauben, Ronda (2004). The Internet: On its International Origins and Collaborative Vision. Amateur Computerist 12 (2). Retrieved May 29, 2009.
[35] Internet Access Provider Lists.
2012.

Retrieved May 10,

[36] RFC 1871 CIDR and Classful Routing. Tools.ietf.org.


Retrieved May 28, 2009.
[37] Ben Segal (1995). A Short History of Internet Protocols
at CERN.
[38] Internet History in Asia.
16th APAN Meetings/Advanced Network Conference in Busan. Retrieved
December 25, 2005.
[39] Percentage of Individuals using the Internet 2000-2012,
International Telecommunications Union (Geneva), June
2013, retrieved 22 June 2013
[40] Fixed (wired)-broadband subscriptions per 100 inhabitants 2012, Dynamic Report, ITU ITC EYE,
International Telecommunication Union. Retrieved on 29
June 2013.
[41] Active mobile-broadband subscriptions per 100 inhabitants 2012, Dynamic Report, ITU ITC EYE,
International Telecommunication Union. Retrieved on 29
June 2013.
[42] ICONS webpage. Icons.afrinic.net. Retrieved May 28,
2009.
[43] Nepad, Eassy partnership ends in divorce,(South African)
Financial Times FMTech, 2007

[24] tsbedh. History of X.25, CCITT Plenary Assemblies and


Book Colors. Itu.int. Retrieved June 5, 2009.

[44] APRICOT webpage. Apricot.net. May 4, 2009. Retrieved May 28, 2009.

[25] Events in British Telecomms History. Events in British


TelecommsHistory. Archived from the original on April 5,
2003. Retrieved November 25, 2005.

[45] A brief history of the Internet in China. China celebrates


10 years of being connected to the Internet. Retrieved December 25, 2005.


[46] Internet host count history. Internet Systems Consortium. Retrieved May 16, 2012.


[62] Thomas v. NSI, Civ. No. 97-2412 (TFH), Sec. I.A.


(DCDC April 6, 1998)". Lw.bna.com. Retrieved May
28, 2009.

[47] The World internet provider. Retrieved May 28, 2009.


[48] OGC-00-33R Department of Commerce: Relationship with
the Internet Corporation for Assigned Names and Numbers.
Government Accountability Oce. July 7, 2000. p. 6.
[49] Even after the appropriations act was amended in 1992 to
give NSF more exibility with regard to commercial trafc, NSF never felt that it could entirely do away with its
Acceptable Use Policy and its restrictions on commercial
trac, see the response to Recommendation 5 in NSFs
response to the Inspector Generals review (a April 19,
1993 memo from Frederick Bernthal, Acting Director, to
Linda Sundro, Inspector General, that is included at the
end of Review of NSFNET, Oce of the Inspector General, National Science Foundation, March 23, 1993)
[50] Management of NSFNET, a transcript of the March
12, 1992 hearing before the Subcommittee on Science
of the Committee on Science, Space, and Technology,
U.S. House of Representatives, One Hundred Second
Congress, Second Session, Hon. Rick Boucher, subcommittee chairman, presiding
[51] Retiring the NSFNET Backbone Service: Chronicling
the End of an Era, Susan R. Harris, Ph.D., and Elise
Gerich, ConneXions, Vol. 10, No. 4, April 1996
[52] A Brief History of the Internet.
[53] NSF Solicitation 93-52 Network Access Point Manager, Routing Arbiter, Regional Network Providers, and
Very High Speed Backbone Network Services Provider
for NSFNET and the NREN(SM) Program, May 6, 1993
[54] Twitter post. 2010-01-22. Archived from the original
on 2013-03-10. Retrieved 2013-03-10.
[55] NASA Extends the World Wide Web Out Into Space.
NASA media advisory M10-012, January 22, 2010.
Archived

[63] RFC 1366. Guidelines for Management of IP Address


Space. Retrieved April 10, 2012.
[64] Development of the Regional Internet Registry System.
Cisco. Retrieved April 10, 2012.
[65] NIS Manager Award Announced. NSF Network information services awards. Retrieved December 25, 2005.
[66] Internet Moves Toward Privatization. http://www.nsf.
gov''. 24 June 1997.
[67] RFC 2860. Memorandum of Understanding Concerning the Technical Work of the Internet Assigned Numbers
Authority. Retrieved December 26, 2005.
[68] ICANN Bylaws. Retrieved April 10, 2012.
[69] The Tao of IETF: A Novices Guide to the Internet Engineering Task Force, FYI 17 and RFC 4677, P. Homan
and S. Harris, Internet Society, September 2006
[70] A Mission Statement for the IETF, H. Alvestrand, Internet Society, BCP 95 and RFC 3935, October 2004
[71] An IESG charter, H. Alvestrand, RFC 3710, Internet
Society, February 2004
[72] Charter of the Internet Architecture Board (IAB)", B.
Carpenter, BCP 39 and RFC 2850, Internet Society, May
2000
[73] IAB Thoughts on the Role of the Internet Research Task
Force (IRTF)", S. Floyd, V. Paxson, A. Falk (eds), RFC
4440, Internet Society, March 2006
[74] The RFC Series and RFC Editor, L. Daigle, RFC 4844,
Internet Society, July 2007
[75] Not All RFCs are Standards, C. Huitema, J. Postel, S.
Crocker, RFC 1796, Internet Society, April 1995

[56] NASA Successfully Tests First Deep Space Internet.


NASA media release 08-298, November 18, 2008
Archived

[76] Internet Society (ISOC) Introduction to ISOC

[57] Disruption Tolerant Networking for Space Operations


(DTN). July 31, 2012

[78] USC/ICANN Transition Agreement

[58] Cerf: 2011 will be proving point for 'InterPlanetary Internet'". Network World interview with Vint Cerf. February 18, 2011. Archived from the original on December 9,
2012.
[59] Internet Architecture. IAB Architectural Principles of
the Internet. Retrieved April 10, 2012.
[60] DDN NIC. IAB Recommended Policy on Distributing
Internet Identier Assignment. Retrieved December 26,
2005.
[61] GSI-Network Solutions. TRANSITION OF NIC SERVICES. Retrieved December 26, 2005.

[77] Internet Society (ISOC) ISOCs Standards Activities

[79] ICANN cuts cord to US government, gets broader oversight: ICANN, which oversees the Internets domain name
system, is a private nonprot that reports to the US Department of Commerce. Under a new agreement, that relationship will change, and ICANNs accountability goes
global Nate Anderson, September 30, 2009
[80] Rhoads, Christopher (October 2, 2009). U.S. Eases Grip
Over Web Body: Move Addresses Criticisms as Internet
Usage Becomes More Global. The Wall Street Journal.
[81] Rabkin, Jeremy; Eisenach, Jerey (October 2, 2009).
The U.S. Abandons the Internet: Multilateral governance
of the domain name system risks censorship and repression. The Wall Street Journal.


[82] Mueller, Milton L. (2010). Networks and States: The


Global Politics of Internet Governance. MIT Press. p. 67.
ISBN 978-0-262-01459-5.

[99] Fung, Brian (January 2, 2015). Get ready: The FCC says
it will vote on net neutrality in February. Washington
Post. Retrieved January 2, 2015.

[83] Mueller, Milton L. (2010). Networks and States: The [100] Sta (January 2, 2015). FCC to vote next month on net
neutrality rules. AP News. Retrieved January 2, 2015.
Global Politics of Internet Governance. MIT Press. pp.
7980. ISBN 978-0-262-01459-5.
[101] Lohr, Steve (February 4, 2015). F.C.C. Plans Strong
Hand to Regulate the Internet. New York Times. Re[84] DeNardis, Laura, The Emerging Field of Internet Govertrieved February 5, 2015.
nance (September 17, 2010). Yale Information Society
Project Working Paper Series.
[102] Wheeler, Tom (February 4, 2015). FCC Chairman Tom
Wheeler: This Is How We Will Ensure Net Neutrality.
[85] Wyatt, Edward (April 23, 2014). F.C.C., in Net NeuWired (magazine). Retrieved February 5, 2015.
trality Turnaround, Plans to Allow Fast Lane. New York
Times. Retrieved 2014-04-23.
[86] Sta (April 24, 2014). Creating a Two-Speed Internet.
New York Times. Retrieved 2014-04-25.
[87] Carr, David (May 11, 2014). Warnings Along F.C.C.s
Fast Lane. New York Times. Retrieved 2014-05-11.

[103] The Editorial Board (February 6, 2015). Courage and


Good Sense at the F.C.C. - Net Neutralitys Wise New
Rules. New York Times. Retrieved February 6, 2015.
[104] Internet users per 100 inhabitants 2001-2011, International Telecommunications Union, Geneva, accessed 4
April 2012

[88] Crawford, Susan (April 28, 2014). The Wire Next


[105] Internet users per 100 inhabitants 2006-2013, InterTime. New York Times. Retrieved 2014-04-28.
national Telecommunications Union, Geneva, accessed 3
June 2013
[89] Sta (May 15, 2014). Searching for Fairness on the Internet. New York Times. Retrieved 2014-05-15.

[106] The Risks Digest. Great moments in e-mail history. Retrieved April 27, 2006.

[90] Wyatt, Edward (May 15, 2014). F.C.C. Backs Opening


Net Rules for Debate. New York Times. Retrieved 2014- [107] The History of Electronic Mail. The History of Elec05-15.
tronic Mail. Retrieved December 23, 2005.

[91] Wyatt, Edward (November 10, 2014). Obama Asks [108] The First Network Email. The First Network Email. ReF.C.C. to Adopt Tough Net Neutrality Rules. New York
trieved December 23, 2005.
Times. Retrieved November 15, 2014.
[109] http://ils.unc.edu/callee/gopherpaper.htm
[92] NYT Editorial Board (November 14, 2014). Why the
F.C.C. Should Heed President Obama on Internet Regu- [110] Bush, Vannevar (1945). As We May Think. Retrieved
May 28, 2009.
lation. New York Times. Retrieved November 15, 2014.
[93] Sepulveda, Ambassador Daniel A. (January 21, 2015). [111] Douglas Engelbart (1962). Augmenting Human Intellect: A Conceptual Framework.
The World Is Watching Our Net Neutrality Debate, So
Lets Get It Right. Wired (website). Retrieved January
[112] The Early World Wide Web at SLAC. The Early World
20, 2015.
Wide Web at SLAC: Documentation of the Early Web at
SLAC. Retrieved November 25, 2005.
[94] Weisman, Jonathan (January 19, 2015). Shifting Politics of Net Neutrality Debate Ahead of F.C.C.Vote. New
[113] Millennium Technology Prize 2004 awarded to invenYork Times. Retrieved January 20, 2015.
tor of World Wide Web. Millennium Technology Prize.
Archived from the original on August 30, 2007. Retrieved
[95] Sta (January 16, 2015). H. R. _ 114th Congress, 1st
May 25, 2008.
Session [Discussion Draft] - To amend the Communications Act of 1934 to ensure Internet openness... (PDF).
[114] Mosaic Web Browser History NCSA, Marc AnU. S. Congress. Retrieved January 20, 2015.
dreessen, Eric Bina. Livinginternet.com. Retrieved May
28, 2009.
[96] Lohr, Steve (February 2, 2015). In Net Neutrality Push,
F.C.C. Is Expected to Propose Regulating Internet Ser- [115] NCSA Mosaic September 10, 1993 Demo. Totic.org.
vice as a Utility. New York Times. Retrieved February 2,
Retrieved May 28, 2009.
2015.
[116] Vice President Al Gores ENIAC Anniversary Speech.
[97] Lohr, Steve (February 2, 2015). F.C.C. Chief Wants to
Cs.washington.edu. February 14, 1996. Retrieved May
Override State Laws Curbing Community Net Services.
28, 2009.
New York Times. Retrieved February 2, 2015.
[117] UCLA Center for Communication Policy. Digitalcen[98] Flaherty, Anne (January 31, 2015). Just whose Internet
ter.org. Retrieved May 28, 2009.
is it? New federal rules may answer that. AP News. Retrieved January 31, 2015.
[118] Mirror of Ocial site map


[119] Mirror of Ocial Site

[137] Christoph Classen, Susanne Kinnebrock & Maria Lblich


(Eds.): Towards Web History: Sources, Methods, and
[120] 24 Hours in Cyberspace (and more)". Baychi.org. ReChallenges in the Digital Age. In Historical Social Retrieved May 28, 2009.
search 37 (4): 97188. 2012.
[121] The human face of cyberspace, painted in random im- [138] Barras, Colin (August 23, 2007). An Internet Pioneer
ages. Archive.southcoasttoday.com. Retrieved May 28,
Ponders the Next Revolution. Illuminating the nets Dark
2009.
Ages. Retrieved February 26, 2008.
[122] Stross, Randall (22 September 2009). Planet Google:
One Companys Audacious Plan to Organize Everything We
Know. Simon and Schuster. ISBN 978-1-4165-4696-2.
Retrieved 9 December 2012.
[123] Microsofts New Search at Bing.com Helps People Make
Better Decisions: Decision Engine goes beyond search
to help customers deal with information overload (Press
Release)". Microsoft News Center. May 28, 2009. Retrieved May 29, 2009.
[124] Microsoft and Yahoo seal web deal, BBC Mobile News,
July 29, 2009.
[125] RFC 765: File Transfer Protocol (FTP), J. Postel and J.
Reynolds, ISI, October 1985
[126] Kenneth P. Birman (2005-03-25). Reliable Distributed
Systems: Technologies, Web Services, and Applications. Springer-Verlag New York Incorporated. ISBN
9780387215099. Retrieved 2012-01-20.

12.14 References
Abbate, Janet. Inventing the Internet, Cambridge:
MIT Press, 1999.
Bemer, Bob, A History of Source Concepts for the
Internet/Web
Campbell-Kelly, Martin;
Aspray, William.
Computer: A History of the Information Machine.
New York: BasicBooks, 1996.
Clark, D. (1988). The Design Philosophy of
the DARPA Internet Protocols.
SIGCOMM
'88 Symposium proceedings on Communications
architectures and protocols (ACM): 106114.
doi:10.1145/52324.52336.
ISBN 0897912799.
Retrieved 2011-10-16.

[127] Menta, Richard (July 20, 2001). Napster Clones Crush


Napster. Take 6 out of the Top 10 Downloads on CNet.
MP3 Newswire.

Graham, Ian S. The HTML Sourcebook: The Complete Guide to HTML. New York: John Wiley and
Sons, 1995.

[128] Movie File-Sharing Booming: Study, Solutions Research


Group, Toronto, 24 January 2006

Krol, Ed. Hitchhikers Guide to the Internet, 1987.

[129] Menta, Richard (December 9, 1999). RIAA Sues Music


Startup Napster for $20 Billion. MP3 Newswire.

Krol, Ed. Whole Internet Users Guide and Catalog.


O'Reilly & Associates, 1992.

[130] EFF: What Peer-to-Peer Developers Need to Know


about Copyright Law. W2.e.org. Retrieved 2012-0120.
[131] Kobie, Nicole (November 26, 2010). Pirate Bay trio lose
appeal against jail sentences. pcpro.co.uk (PCPRO). Retrieved November 26, 2010.
[132] Poll: Young Say File Sharing OK, Bootie CosgroveMather, CBS News, 11 February 2009
[133] Green, Stuart P. (29 March 2012). OP-ED CONTRIBUTOR; When Stealing Isn't Stealing. The New York
Times. p. 27.
[134] Nasdaq peak of 5,048.62
[135] Susmita Dasgupta; Somik V. Lall; David Wheeler (2001).
Policy Reform, Economic Growth, and the Digital Divide:
An Econometric Analysis. World Bank Publications. pp.
13. GGKEY:YLS5GEUEBAR. Retrieved 11 February
2013.
[136] Hillebrand, Friedhelm (2002). Hillebrand, Friedhelm, ed.
GSM and UMTS, The Creation of Global Mobile Communications. John Wiley & Sons. ISBN 0-470-84322-5.

Scientic American Special Issue on Communications, Computers, and Networks, September 1991.

12.15 External links


Thomas Greene, Larry James Landweber, George
Strawn (2003). A Brief History of NSF and the
Internet. National Science Foundation. Retrieved
May 28, 2009.
Robert H Zakon. Hobbes Internet Timeline
v10.1. Retrieved July 23, 2010.
Principal Figures in the Development of the Internet and the World Wide Web. University of North
Carolina. Retrieved July 3, 2006.
Internet History Timeline. Computer History
Museum. Retrieved November 25, 2005.
Marcus Kazmierczak (September 24, 1997).
Internet History. Archived from the original on
October 31, 2005. Retrieved November 25, 2005.

Harri K. Salminen. History of the Internet.
Heureka Science Center, Finland. Retrieved June
11, 2008.
Histories of the Internet. Internet Society. Retrieved December 1, 2007.
Living Internet. Retrieved January 1, 2009. Internet History with input from many of the people
who helped invent the Internet
Voice of America: Overhearing the Internet ,
Robert Wright, The New Republic, September 13,
1993
How the Internet Came to Be, by Vinton Cerf,
1993
Cybertelecom :: Internet History, focusing on the
governmental, legal, and policy history of the Internet
History of the Internet, an animated documentary from 2009 explaining the inventions from timesharing to lesharing, from Arpanet to Internet
The Roads and Crossroads of Internet History, by
Gregory R. Gromov
The History of the Internet According to Itself: A
Synthesis of Online Internet Histories Available at the
Turn of the Century, Steven E. Opfer, 1999
Fool Us Once Shame on YouFool Us Twice
Shame on Us: What We Can Learn from the Privatizations of the Internet Backbone Network and the
Domain Name System, Jay P. Kesan and Rajiv C.
Shah, Washington University Law Review, Volume
79, Issue 1 (2001)
How It All Started (slides), Tim Berners-Lee,
W3C, December 2004
A Little History of the World Wide Web: from
1945 to 1995, Dan Connolly, W3C, 2000
The World Wide Web: Past, Present and Future,
Tim Berners-Lee, August 1996
The History of the Internet 1969 - 2012. AVG
Technologies. Retrieved January 4, 2013.
The History of the Internet in a Nutshell.
Cameron Chapman, Six Revisions. November 15,
2009. Retrieved October 29, 2013.
The History of the Internet. YouTube. Melih Bilgil. January 4, 2009. Retrieved October 29, 2013.
(video)


Chapter 13

History of laptops
Before laptop/notebook computers were technically feasible, similar ideas had been proposed, most notably Alan
Kay's Dynabook concept, developed at Xerox PARC in
the early 1970s. What was probably the first portable
computer was the Xerox NoteTaker, again developed at
Xerox PARC, in 1976. However, only 10 prototypes
were built.

13.1 Osborne 1

[Image: An opened Osborne 1 computer, ready for use. The keyboard sits on the inside of the lid.]

The first mass-produced microprocessor-based portable computer was the Osborne 1 in 1981, which used the CP/M operating system. Although it was large and heavy compared to today's laptops, with a tiny 5-inch CRT monitor, it had a near-revolutionary impact on business, as professionals were able to take their computer and data with them for the first time. This and other "luggables" were inspired by what was probably the first portable computer, the Xerox NoteTaker. The Osborne was about the size of a portable sewing machine, and more importantly, could be carried on commercial aircraft.

13.2 Bondwell 2
Although it wasn't released until 1985, well after the decline of CP/M as a major operating system, the Bondwell
2 is one of only a handful of CP/M laptops. It used a Z80 CPU running at 4 MHz, had 64 K RAM and, unusual
for a CP/M machine, a 3.5-inch floppy disk drive built in. It had an 80×25 character-based LCD mounted on a hinge similar to modern laptops, one of the first computers to
use this form factor.

13.3 Other CP/M laptops


The other CP/M laptops were the Epson PX-4 (or HX40) and PX-8 (Geneva), the NEC PC-8401A, and the
NEC PC-8500. These four units, however, utilized modified CP/M systems in ROM, and did not come standard with any floppy or hard disks.

13.4 Compaq Portable


A more enduring success was the Compaq Portable, the first product from Compaq, introduced in 1983, by which time the IBM Personal Computer had become the standard platform. Although scarcely more portable than the Osborne machines, and also requiring AC power to run, it ran MS-DOS and was the first true legal IBM clone. (IBM's own later Portable Computer, which arrived in 1984, was notably less IBM PC-compatible than the Compaq.) The third model of this development, the Compaq Portable II, featured high resolution graphics on its tube display. It was the first portable computer ready to be used on the shop floor, and for CAD and diagram display. It established Compaq as a major brand on the market.

13.5 Epson HX-20

Another significant machine announced in 1981, although first sold widely in 1983, was the Epson HX-20. A simple handheld computer, it featured a full-transit 68-key keyboard, rechargeable nickel-cadmium batteries, a small (120×32-pixel) dot-matrix LCD display with 4 lines of text, 20 characters per line text mode, a 24 column dot matrix printer, a Microsoft BASIC interpreter, and 16 KB of RAM (expandable to 32 KB).

13.6 GRiD Compass

However, arguably the first true laptop was the GRiD Compass 1101, designed by Bill Moggridge in 1979-1980, and released in 1982. Enclosed in a magnesium case, it introduced the now familiar clamshell design, in which the flat display folded shut against the keyboard. The computer could be run from batteries, and was equipped with a 320×200-pixel electroluminescent display and 384 kilobyte bubble memory. It was not IBM-compatible, and its high price (US$8,000-10,000) limited it to specialized applications. However, it was used heavily by the U.S. military, and by NASA on the Space Shuttle during the 1980s. The GRiD's manufacturer subsequently earned significant returns on its patent rights as its innovations became commonplace. GRiD Systems Corp. was later bought by the Tandy (now RadioShack) Corporation.

13.7 Dulmont Magnum/Kookaburra

Another contender for the first true laptop was the Dulmont Magnum, designed by Barry Wilkinson and Terry Crews, Engineering Manager at Dulmison, in 1982 and released in Australia in 1983.[2] It included an 8x80 display in a lid that closed against the keyboard. It was based on the MS-DOS operating system and applications stored in ROM (A:) and also supported removable modules in expansion slots (B: and C:) that could be custom programmed EPROM or standard word processing and spreadsheet applications. However, the Magnum had no nonvolatile memory, but could suspend and retain memory in RAM, including a RAM Disk (D:). A separate expansion box provided dual 5.25-inch floppy or 10 MB hard disk storage. Dulmont was eventually taken over by Time Office Computers, who marketed the Magnum internationally in 16 and 25 line LCD versions, and also introduced the brand name Kookaburra to emphasize its Australian origins.

13.8 Ampere

The Ampere,[1] a sleek clamshell design by Ryu Oosake, was also made in 1983. It offered an MC68008 microprocessor dedicated to running an APL interpreter residing in ROM.

13.9 Tandy Model 100

The TRS-80 Model 100 was an early portable computer introduced in 1983. It was one of the first notebook-style computers, featuring a keyboard and LCD display, battery powered, in a package roughly the size and shape of a notepad or large book.

It was made by Kyocera, and originally sold in Japan as the Kyotronic 85. Although a slow seller for Kyocera, the rights to the machine were purchased by Tandy Corporation, and the computer was sold through Radio Shack stores in the United States and Canada as well as affiliated dealers in other countries, becoming one of the company's most popular models, with over 6,000,000 units sold worldwide. The Olivetti M-10 and the NEC PC-8201 and PC-8300 were also built on the same Kyocera platform.

13.10 Sharp and Gavilan

Two other noteworthy early laptops were the Sharp PC-5000 and the Gavilan SC, announced in 1983 but first sold in 1984. The Gavilan was notably the first computer to be marketed as a laptop. It was also equipped with a pioneering touchpad-like pointing device, installed on a panel above the keyboard. Like the GRiD Compass, the Gavilan and the Sharp were housed in clamshell cases, but they were partly IBM-compatible, although primarily running their own system software. Both had LCD displays, and could connect to optional external printers. The Dulmont Magnum, launched internationally in 1984, was an Australian portable similar in layout to the Gavilan, which used the Intel 80186 processor.

13.11 Kyotronic 85

The year 1983 also saw the launch of what was probably the biggest-selling early laptop, the Kyocera Kyotronic 85. Owing much to the design of the previous Epson HX-20, and although at first a slow seller in Japan, it was quickly licensed by Tandy Corporation, Olivetti, and NEC, who recognised its potential and marketed it respectively as the TRS-80 Model 100 line (or Tandy 100),[3] Olivetti M-10, and NEC PC-8201. The machines ran on standard AA batteries. The Tandy's built-in programs, including a BASIC interpreter, a text editor, and a terminal program, were supplied by Microsoft, and were written in part by Bill Gates himself. The computer was not a clamshell, but provided a tiltable 8 line 40-character LCD screen above a full-travel keyboard. With its internal modem, it was a highly portable communications terminal. Due to its portability, good battery life (and ease of replacement), reliability (it had no moving parts), and low price (as little as US$300), the model was highly regarded, becoming a favorite among journalists. It weighed less than 2 kg with dimensions of 30×21.5×4.5 centimeters (12×8×1 in). Initial specifications included 8 kilobytes of RAM (expandable to 24 KB) and a 3 MHz processor. The machine was in fact about the size of a paper notebook, but the term had yet to come into use and it was generally described as a portable computer.

13.12 Commodore SX-64


The Commodore SX-64, also known as the Executive 64, or VIP-64 in Europe, was a portable, briefcase- or suitcase-size "luggable" version of the popular Commodore 64 home computer, and was the first full-color portable computer.[4]

The SX-64 featured a built-in five-inch composite monitor and a built-in 1541 floppy drive. It weighed 20 pounds. The machine was carried by its sturdy handle, which doubled as an adjustable stand. It was announced in January 1983 and released a year later, at $995 USD.[5]

13.13 Kaypro 2000


Possibly the first commercial IBM-compatible laptop was the Kaypro 2000, introduced in 1985. With its brushed aluminum clamshell case, it was remarkably similar in design to modern laptops. It featured a 25-line by 80-character LCD display, a detachable keyboard, and a pop-up 90 mm (3.5-inch) floppy drive.

13.14 IBM PC Convertible

Also among the first commercial IBM-compatible laptops was the IBM PC Convertible, introduced in 1986. It had a CGA-compatible LCD display and two floppy drives. It weighed 13 lbs.

13.15 Toshiba T1100, T1000, and T1200

Toshiba launched the Toshiba T1100 in 1985, and has subsequently described it as "the world's first mass-market laptop computer".[6] It did not have a hard drive, and ran entirely from floppy disks. The CPU was a 4.77 MHz Intel 80C88, a variation of the popular Intel 8088, and the display was a monochrome, text-only 640x200 LCD. It was followed in 1987 by the T1000 and T1200. Although limited floppy-based DOS machines, with the operating system stored in ROM, the Toshiba models were small and light enough to be carried in a backpack, and could be run from lead-acid batteries. They also introduced the now-standard "resume" feature to DOS-based machines: the computer could be paused between sessions without having to be restarted each time.

13.16 US Air Force

The first laptops successful on a large scale came in large part due to a Request For Proposal (RFP) by the U.S. Air Force in 1987. This contract would eventually lead to the purchase of over 200,000 laptops. Competition to supply this contract was fierce, and the major PC companies of the time, IBM Corporation, Toshiba, Compaq, NEC, and Zenith Data Systems (ZDS), rushed to develop laptops in an attempt to win this deal. ZDS, which had earlier won a landmark deal with the IRS for its Z-171, was awarded this contract for its SupersPort series. The SupersPort series was originally launched with an Intel 8086 processor, dual floppy disk drives, a backlit, blue and white STN LCD screen, and a NiCd battery pack. Later models featured an Intel 80286 processor and a 20 MB hard disk drive. On the strength of this deal, ZDS became the world's largest laptop supplier in 1987 and 1988. ZDS partnered with Tottori Sanyo in the design and manufacturing of these laptops. This relationship is notable because it was the first deal between a major brand and an Asian original equipment manufacturer.

13.17 Hewlett-Packard Vectra Portable CS

In 1987, HP released a portable version of their Vectra CS computer.[7] It had the classic laptop configuration (the keyboard and monitor close up clamshell-style for carrying); however, it was very heavy and fairly large. It had a full-size keyboard (with separate numeric keypad) and a large amber LCD screen. While it was offered with dual 3.5-inch floppy disk drives, the most common configuration was a 20 MB hard drive and a single floppy drive. It was one of the first machines with a 1.44 MB 3.5-inch floppy disk drive.

13.18 Cambridge Z88

Another notable computer was the Cambridge Z88, designed by Clive Sinclair and introduced in 1988. About the size of an A4 sheet of paper, it ran on standard batteries and contained basic spreadsheet, word processing, and communications programs. It anticipated the future miniaturization of the portable computer, and, as a ROM-based machine with a small display, it can, like the TRS-80 Model 100, also be seen as a forerunner of the personal digital assistant.

13.19 Compaq SLT/286

By the end of the 1980s, laptop computers were becoming popular among business people. The COMPAQ SLT/286 debuted in October 1988, being the first battery-powered laptop to support an internal hard disk drive and a VGA-compatible LCD screen. It weighed 14 lbs.[8]

13.20 NEC UltraLite

The NEC UltraLite, released in mid-1989, was perhaps the first notebook computer, weighing just 2 kg (4.4 lbs). In lieu of a floppy or hard drive, it contained a 2-megabyte RAM drive, but this reduced its utility as well as its size. Although portable computers with clamshell LCD screens already existed before it, the UltraLite was the first computer in a notebook form factor. It was significantly smaller than all portable computers that came before it: users could carry it like a notebook and fold its clamshell LCD like a book cover over the rest of its body.

13.21 Apple

13.21.1 Macintosh Portable

The Macintosh Portable, Apple's first attempt at a battery-powered computer

The first Apple Computer machine designed to be used on the go was the 1989 Macintosh Portable (although an LCD screen had been an option for the transportable Apple IIc in 1984). Unlike the Compaq LTE laptop released earlier in the year, the Macintosh Portable was actually a "luggable", not a laptop, but the Mac Portable was praised for its clear active matrix display and long battery life. It was nevertheless a poor seller due to its bulk. In the absence of a true Apple laptop, several compatible machines such as the Outbound Laptop were available for Mac users; however, for copyright reasons, the user had to supply a set of Mac ROMs, which usually meant having to buy a new or used Macintosh as well.

13.21.2 PowerBook

The Apple PowerBook series, introduced in October 1991, pioneered changes that are now de facto standards on laptops, such as room for a palm rest and the inclusion of a pointing device (a trackball). The following year, IBM released its ThinkPad 700C, featuring a similar design (though with a distinctive red TrackPoint pointing device).

Later PowerBooks featured optional color displays (PowerBook 165c, 1993), the first true touchpad (PowerBook 500 series, 1994), the first 16-bit stereo audio, and the first built-in Ethernet network adapter (PowerBook 500, 1994).

13.22 IBM RS/6000 N40

In 1994, IBM released the RS/6000 N40 laptop based on a PowerPC microprocessor running the AIX operating system, a variant of UNIX. It was manufactured by Tadpole Technology (now Tadpole Computer), which also manufactured laptops based on SPARC and Alpha microprocessors, the SPARCbook and ALPHAbook lines, respectively.

13.23 Windows 95 operating system

The summer of 1995 was a significant turning point in the history of notebook computing. In August of that year Microsoft introduced Windows 95. It was the first time that Microsoft had implemented the Advanced Power Management (APM) specification with control in the operating system. Prior to this point, each brand used custom BIOS, drivers and, in some cases, ASICs to optimize the battery life of its machines. This move by Microsoft was controversial in the eyes of notebook designers because it greatly reduced their ability to innovate; however, it did serve its role in simplifying and stabilizing certain aspects of notebook design.


13.24 Intel Pentium processor


Windows 95 also ushered in the importance of the CD-ROM drive in mobile computing, and initiated the shift to the Intel Pentium processor as the base platform for notebooks. The Gateway Solo was the first notebook introduced with a Pentium processor and a CD-ROM. Also featuring a removable hard disk drive and floppy drive, the Solo was the first three-spindle (optical, floppy, and hard disk drive) notebook computer, and was extremely successful within the consumer segment of the market. In roughly the same time period the Dell Latitude, Toshiba Satellite, and IBM ThinkPad were reaching great success with Pentium-based two-spindle (hard disk and floppy disk drive) systems directed toward the corporate market.

13.25 Improved technology

A 1997 Micron laptop

Early laptop displays were so primitive that PC Magazine in 1986 published an article discussing them with the headline "Is It On Yet?". It said of the accompanying montage of nine portable computers, "Pictured at the right are two screens and seven elongated smudges." The article stated that LCD screens "still look to many observers like Etch-a-Sketch toys, or gray chalk on a dirty blackboard", and predicted that until displays improved, "laptops will continue to be a niche rather than a mainstream direction".[9] As technology improved during the 1990s, the usefulness and popularity of laptops increased, and prices correspondingly went down. Several developments specific to laptops were quickly implemented, improving usability and performance. Among them were:

Improved battery technology. The heavy lead-acid batteries were replaced with lighter and more efficient technologies, first nickel cadmium (NiCd), then nickel metal hydride (NiMH), and then lithium-ion and lithium-polymer batteries.

Power-saving processors. While laptops in 1991 were limited to the 80286 processor because of the energy demands of the more powerful 80386, the introduction of the Intel 386SL processor, designed for the specific power needs of laptops, marked the point at which laptop needs were included in CPU design. The 386SL integrated a 386SX core with a memory controller, and this was paired with an I/O chip to create the SL chipset. It was more integrated than any previous solution, although its cost was higher. It was heavily adopted by the major notebook brands of the time. Intel followed this with the 486SL chipset, which used the same architecture. However, Intel had to abandon this design approach as it introduced its Pentium series. Early versions of the mobile Pentium required TAB mounting (also used in LCD manufacturing), and this initially limited the number of companies capable of supplying notebooks. However, Intel did eventually migrate to more standard chip packaging. One limitation of notebooks has always been the difficulty of upgrading the processor, which is a common attribute of desktops. Intel did try to solve this problem with the introduction of the MMC for mobile computing. The MMC was a standard module upon which the CPU and external cache memory could sit. It gave the notebook buyer the potential to upgrade the CPU at a later date, eased the manufacturing process somewhat, and was also used in some cases to skirt U.S. import duties, as the CPU could be added to the chassis after it arrived in the U.S. Intel stuck with MMC for a few generations but ultimately could not maintain the appropriate speed and data integrity to the memory subsystem through the MMC connector. A more specialized power-saving CPU variant for laptops is the PowerPC 603 family.[10] Derived from IBM's 601 series for laptops (while the 604 branch was for desktops), it found itself used on many low-end Apple desktops before it was widely used in laptops, starting with PowerBook models 5300, 2400, and 500 upgrades. Ironically, what started out as a laptop processor was eventually used across all platforms in its follow-up, the PPC 750.
Improved liquid crystal displays, in particular active-matrix TFT (Thin-Film Transistor) LCD technology. Early laptop screens were black and white, blue and white, or grayscale STN (Super Twist Nematic) passive-matrix LCDs prone to heavy shadows, ghosting and blurry movement (some portable computer screens were sharper monochrome plasma displays, but these drew too much current to be powered by batteries). Color STN screens were used for some time, although their viewing quality was poor. By about 1991, two new color LCD technologies hit the mainstream market in a big way: Dual STN and TFT. The Dual STN screens solved many of the viewing problems of STN at a very affordable price, and the TFT screens offered excellent viewing quality, although initially at a steep price. DSTN continued to offer a significant cost advantage over TFT until the mid-90s, when the cost delta dropped to the point that DSTN was no longer used in notebooks. Improvements in production technology meant displays became larger and sharper, had higher native resolutions and faster response times, and could display color with great accuracy, making them an acceptable substitute for a traditional CRT monitor.

OLPC XO-1 laptop in Ebook-Mode.

Improved storage technology. Early laptops and portables had only floppy disk drives. As thin, high-capacity hard disk drives with higher reliability and shock resistance and lower power consumption became available, users could store their work on laptop computers and take it with them. The 3.5-inch HDD was created initially as a response to the needs of notebook designers who needed smaller, lower-power-consumption products. With continuing pressure to shrink the notebook size even further, the 2.5-inch HDD was introduced. One Laptop Per Child (OLPC) and other new laptops use Flash RAM (a non-volatile, non-mechanical memory device) instead of a mechanical hard disk.

Improved connectivity. Internal modems and standard serial, parallel, and PS/2 ports on IBM PC-compatible laptops made it easier to work away from home; the addition of network adapters and, from 1997, USB, as well as, from 1999, Wi-Fi, made laptops as easy to use with peripherals as a desktop computer. Many newer laptops are also available with built-in 3G broadband wireless modems. Other peripherals may include:

an integrated video camera for video communication

a fingerprint sensor for restricting access to sensitive data or to the computer itself.

13.26 Netbooks

Main article: Netbook

In June 2007 Asus announced the Eee PC 701, to be released in October: a small, lightweight laptop powered by an x86 Celeron-M ULV 353, with a 4 GB SDHC disk and a 7-inch screen.[11] Despite previous attempts to launch small lightweight computers such as ultra-portable PCs, the Eee was the first success story, largely due to its low cost, small size, low weight and versatility. The term "Netbook" was later coined by Intel. Asus then extended the Eee line with models featuring, for example, a 9-inch screen, and other brands, including Acer, MSI and Dell, followed suit with similar devices, often built on the fledgling low-power Intel Atom processor architecture.

13.27 Smartbooks

Main article: Smartbook

In 2009, Qualcomm introduced a new term, "smartbook", which stands for a hybrid device between a smartphone and a laptop.[12]

13.28 See also

Timeline of portable computers

13.29 References

[1] Bob Armstrong, http://cosy.com/language/cosyhard/cosyhard.htm

[2] OLD-COMPUTERS.COM : The Museum

[3] See TRS-80 Model 100 / 102 at old-computers.com

[4] Commodore SX-64 Portable

[5] Commodore SX-64 portable computer

[6] Toshiba-Asia.com, May 2005

[7] HP Computer Museum. http://www.hpmuseum.net/display_item.php?hw=219. Retrieved 7 April 2014.

[8] Lewis, Peter H. (October 23, 1988). "THE EXECUTIVE COMPUTER; Compaq Finally Makes a Laptop". The New York Times.

[9] Somerson, Paul (July 1986). "Is It On Yet?". PC. pp. 122-123. Retrieved 9 January 2015.

[10] http://www-03.ibm.com/systems/p/hardware/whitepapers/power/ppc_603.html

[11] http://www.engadget.com/2007/06/05/asus-new-eee-pc-701-joins-the-laptop-lite-fray-with-a-bang/

[12] Hachman, Mark (June 1, 2009). "Qualcomm Shows Off Snapdragon Smartbooks". PC Magazine.

13.30 Further reading


Wilson, James E. (2006). Vintage Laptop Computers: First Decade: 1980-89. Outskirts Press. p. 132.
ISBN 978-1-59800-489-2.


Chapter 14

History of the World Wide Web


The World Wide Web ("WWW" or simply the "Web")
is a global information medium which users can read and
write via computers connected to the Internet. The term
is often mistakenly used as a synonym for the Internet
itself, but the Web is a service that operates over the Internet, just as e-mail also does. The history of the Internet dates back signicantly further than that of the World
Wide Web.

14.1 Precursors

The NeXTcube used by Tim Berners-Lee at CERN became the first Web server.

The hypertext portion of the Web in particular has an intricate intellectual history; notable influences and precursors include Vannevar Bush's Memex,[3] IBM's Generalized Markup Language,[4] and Ted Nelson's Project Xanadu.[3]

Paul Otlet's Mundaneum project has also been named as an early 20th century precursor of the Web.[5]

The concept of a global information system connecting homes is prefigured in "A Logic Named Joe", a 1946 short story by Murray Leinster, in which computer terminals, called "logics", are present in every home. Although the computer system in the story is centralized, the story anticipates a ubiquitous information environment similar to the Web.

14.2 1980-1991: Invention of the Web

In 1980, Tim Berners-Lee, an independent contractor at the European Organization for Nuclear Research (CERN), Switzerland, built ENQUIRE, as a personal database of people and software models, but also as a way to play with hypertext; each new page of information in ENQUIRE had to be linked to an existing page.[3]

In 1984 Berners-Lee returned to CERN, and considered its problems of information management: physicists from around the world needed to share data, yet they lacked common machines and any shared presentation software. Shortly after Berners-Lee's return to CERN, TCP/IP protocols were installed on some key non-Unix machines at the institution, turning it into the largest Internet site in Europe within a few years. As a result, CERN's infrastructure was ready for Berners-Lee to create the Web.[6]

Berners-Lee wrote a proposal in March 1989 for "a large hypertext database with typed links".[7] Although the proposal attracted little interest, Berners-Lee was encouraged by his boss, Mike Sendall, to begin implementing his system on a newly acquired NeXT workstation.[8] He considered several names, including Information Mesh,[7] The Information Mine or Mine of Information, but settled on World Wide Web.[9]

Berners-Lee found an enthusiastic collaborator in Robert Cailliau, who rewrote the proposal (published on November 12, 1990) and sought resources within CERN. Berners-Lee and Cailliau pitched their ideas to the European Conference on Hypertext Technology in September 1990, but found no vendors who could appreciate their vision of marrying hypertext with the Internet.[10]

By Christmas 1990, Berners-Lee had built all the tools necessary for a working Web: the HyperText Transfer Protocol (HTTP) 0.9,[11] the HyperText Markup Language (HTML), the first Web browser (named WorldWideWeb, which was also a Web editor), the first HTTP server software (later known as CERN httpd), the first web server (http://info.cern.ch), and the first Web pages that described the project itself. The browser could access Usenet newsgroups and FTP files as well. However, it could run only on the NeXT; Nicola Pellow therefore created a simple text browser, called the Line Mode Browser, that could run on almost any computer.[12]
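To give a concrete sense of how minimal that first protocol version was, the sketch below is a hedged, modern illustration in Python (not code from the original CERN software; the host and path are placeholders, and most present-day servers no longer answer 0.9 requests). The client sends a single GET line with no headers or version string, and the server replies with the bare HTML document and then closes the connection.

import socket

# Hypothetical HTTP/0.9-style exchange; HOST and the path are placeholders.
# The entire request is one line, and the reply is the raw document with no
# status line or headers; the server closing the socket ends the response.
HOST, PORT = "example.org", 80

with socket.create_connection((HOST, PORT), timeout=10) as sock:
    sock.sendall(b"GET /index.html\r\n")      # the whole HTTP/0.9 request
    chunks = []
    while True:
        data = sock.recv(4096)
        if not data:                          # connection close = end of reply
            break
        chunks.append(data)

print(b"".join(chunks).decode("latin-1", errors="replace"))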
To encourage use within CERN, Bernd Pollermann put the CERN telephone directory on the web; previously users had to log onto the mainframe in order to look up phone numbers.[12]

While inventing the Web, Berners-Lee spent most of his working hours in Building 31 at CERN (46°13′57″N 6°02′42″E / 46.2325°N 6.0450°E), but also at his two homes, one in France, one in Switzerland.[13] In January 1991 the first Web servers outside CERN itself were switched on.[14]

The first web page may be lost, but Paul Jones of UNC-Chapel Hill in North Carolina revealed in May 2013 that he has a copy of a page sent to him in 1991 by Berners-Lee which is the oldest known web page. Jones stored the plain-text page, with hyperlinks, on a floppy disk and on his NeXT computer.[15] CERN put the oldest known web page back online in 2014, complete with hyperlinks that helped users get started and helped them navigate what was then a very small web.[16][17]

On August 6, 1991,[18] Berners-Lee posted a short summary of the World Wide Web project on the alt.hypertext newsgroup, inviting collaborators.[19] This date also marked the debut of the Web as a publicly available service on the Internet, although new users could only access it after August 23.

Paul Kunz from the Stanford Linear Accelerator Center visited CERN in September 1991, and was captivated by the Web. He brought the NeXT software back to SLAC, where librarian Louise Addis adapted it for the VM/CMS operating system on the IBM mainframe as a way to display SLAC's catalog of online documents;[12] this was the first web server outside of Europe and the first in North America.[20] The www-talk mailing list was started in the same month.[14]

An early CERN-related contribution to the Web was the parody band Les Horribles Cernettes, whose promotional image is believed to be among the Web's first five pictures.[21]

14.3 1992-1995: Growth of the Web

Robert Cailliau, Jean-François Abramatic and Tim Berners-Lee at the 10th anniversary of the WWW Consortium.

In keeping with its birth at CERN, early adopters of the World Wide Web were primarily university-based scientific departments or physics laboratories such as Fermilab and SLAC. By January 1993 there were fifty Web servers across the world; by October 1993 there were over five hundred.[14]

Early websites intermingled links for both the HTTP web protocol and the then-popular Gopher protocol, which provided access to content through hypertext menus presented as a file system rather than through HTML files. Early Web users would navigate either by bookmarking popular directory pages, such as Berners-Lee's first site at http://info.cern.ch/, or by consulting updated lists such as the NCSA "What's New" page. Some sites were also indexed by WAIS, enabling users to submit full-text searches similar to the capability later provided by search engines.

By the end of 1994, the total number of websites was still minute compared to present figures, but quite a number of notable websites were already active, many of which are the precursors or inspiring examples of today's most popular services.

14.3.1 Early browsers

The advent of the Mosaic browser in 1993 was a turning point in the utility of the World Wide Web.

Initially, a web browser was available only for the NeXT operating system. This shortcoming was discussed in January 1992,[14] and alleviated in April 1992 by the release of Erwise, an application developed at the Helsinki University of Technology, and in May by ViolaWWW, created by Pei-Yuan Wei, which included advanced features such as embedded graphics, scripting, and animation.[12] ViolaWWW was originally an application for HyperCard. Both programs ran on the X Window System for Unix.[12]

Students at the University of Kansas adapted an existing text-only hypertext browser, Lynx, to access the web. Lynx was available on Unix and DOS, and some web designers, unimpressed with glossy graphical websites, held that a website not accessible through Lynx wasn't worth visiting.

The first Microsoft Windows browser was Cello, written by Thomas R. Bruce for the Legal Information Institute at Cornell Law School to provide legal information, since access to Windows was more widespread amongst lawyers than access to Unix. Cello was released in June 1993.[12]

The Web was first popularized by Mosaic,[22] a graphical browser launched in 1993 by Marc Andreessen's team at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign (UIUC).[23] The origins of Mosaic date to 1992. In November 1992, the NCSA at the University of Illinois (UIUC) established a website. In December 1992, Andreessen and Eric Bina, students attending UIUC and working at the NCSA, began work on Mosaic with funding from the High-Performance Computing and Communications Initiative, a US-federal research and development program.[24] Andreessen and Bina released a Unix version of the browser in February 1993; Mac and Windows versions followed in August 1993.[14] The browser gained popularity due to its strong support of integrated multimedia, and the authors' rapid response to user bug reports and recommendations for new features.

After graduation from UIUC, Andreessen and James H. Clark, former CEO of Silicon Graphics, met and formed Mosaic Communications Corporation to develop the Mosaic browser commercially. The company changed its name to Netscape in April 1994, and the browser was developed further as Netscape Navigator.

14.3.2 Web governance

In May 1994, the first International WWW Conference, organized by Robert Cailliau,[25][10] was held at CERN;[26] the conference has been held every year since.

In April 1993, CERN had agreed that anyone could use the Web protocol and code royalty-free; this was in part a reaction to the perturbation caused by the University of Minnesota's announcement that it would begin charging license fees for its implementation of the Gopher protocol.

In September 1994, Berners-Lee founded the World Wide Web Consortium (W3C) at the Massachusetts Institute of Technology with support from the Defense Advanced Research Projects Agency (DARPA) and the European Commission. It comprised various companies that were willing to create standards and recommendations to improve the quality of the Web. Berners-Lee made the Web available freely, with no patent and no royalties due. The W3C decided that its standards must be based on royalty-free technology, so they can be easily adopted by anyone.

14.4 1996-1998: Commercialization of the Web

Main article: Web marketing

By 1996 it became obvious to most publicly traded companies that a public Web presence was no longer optional. Though at first people saw mainly the possibilities of free publishing and instant worldwide information, increasing familiarity with two-way communication over the Web led to the possibility of direct Web-based commerce (e-commerce) and instantaneous group communications worldwide. More dotcoms, displaying products on hypertext webpages, were added into the Web.

14.5 1999-2001: Dot-com boom and bust

Low interest rates in 1998-99 facilitated an increase in start-up companies. Although a number of these new entrepreneurs had realistic plans and administrative ability, most of them lacked these characteristics but were able to sell their ideas to investors because of the novelty of the dot-com concept.

Historically, the dot-com boom can be seen as similar to a number of other technology-inspired booms of the past, including railroads in the 1840s, automobiles in the early 20th century, radio in the 1920s, television in the 1940s, transistor electronics in the 1950s, computer time-sharing in the 1960s, and home computers and biotechnology in the early 1980s.

In 2001 the bubble burst, and many dot-com startups went out of business after burning through their venture capital and failing to become profitable. Many others, however, did survive and thrive in the early 21st century. Many companies which began as online retailers blossomed and became highly profitable. More conventional retailers found online merchandising to be a profitable additional source of revenue. While some online entertainment and news outlets failed when their seed capital ran out, others persisted and eventually became economically self-sufficient. Traditional media outlets (newspaper publishers, broadcasters and cablecasters in particular) also found the Web to be a useful and profitable

additional channel for content distribution, and an additional means to generate advertising revenue. The sites that survived and eventually prospered after the bubble burst had two things in common: a sound business plan, and a niche in the marketplace that was, if not unique, particularly well-defined and well-served.

14.6 2002-present: The Web becomes ubiquitous

In the aftermath of the dot-com bubble, telecommunications companies had a great deal of overcapacity as many Internet business clients went bust. That, plus ongoing investment in local cell infrastructure, kept connectivity charges low and helped to make high-speed Internet connectivity more affordable. During this time, a handful of companies found success developing business models that helped make the World Wide Web a more compelling experience. These include airline booking sites, Google's search engine and its profitable approach to keyword-based advertising, as well as eBay's auction site and Amazon.com's online department store.

This new era also begot social networking websites, such as MySpace and Facebook, which gained acceptance rapidly and became a central part of youth culture.

14.6.1 Web 2.0

Beginning in 2002, new ideas for sharing and exchanging content ad hoc, such as Weblogs and RSS, rapidly gained acceptance on the Web. This new model for information exchange, primarily featuring user-generated and user-edited websites, was dubbed Web 2.0. The Web 2.0 boom saw many new service-oriented startups catering to a newly democratized Web.

As the Web became easier to query, it attained a greater ease of use overall and gained a sense of organization, which ushered in a period of rapid popularization. New sites such as Wikipedia and its sister projects are based on the concept of user-edited content. In 2005, three former PayPal employees created a video viewing website called YouTube, which became popular quickly and introduced a new concept of user-submitted content in major events, as in the CNN-YouTube Presidential Debates.

The popularity of YouTube, Facebook, etc., combined with the increasing availability and affordability of high-speed connections, has made video content far more common on all kinds of websites. Many video-content hosting and creation sites provide an easy means for their videos to be embedded on third-party websites without payment or permission.

This combination of more user-created or edited content, and easy means of sharing content, such as via RSS widgets and video embedding, has led to many sites with a typical Web 2.0 feel. They have articles with embedded video, user-submitted comments below the article, and RSS boxes to the side, listing some of the latest articles from other sites.

Continued extension of the Web has focused on connecting devices to the Internet, coined Intelligent Device Management. As Internet connectivity becomes ubiquitous, manufacturers have started to leverage the expanded computing power of their devices to enhance their usability and capability. Through Internet connectivity, manufacturers are now able to interact with the devices they have sold and shipped to their customers, and customers are able to interact with the manufacturer (and other providers) to access new content.

Web 2.0 has found a place in the English lexicon.[27]

14.6.2 The semantic web

Popularized by Berners-Lee's book Weaving the Web[28] and a Scientific American article by Berners-Lee, James Hendler, and Ora Lassila,[29] the term Semantic Web describes an evolution of the existing Web in which the network of hyperlinked human-readable web pages is extended by machine-readable metadata about documents and how they are related to each other, enabling automated agents to access the Web more intelligently and perform tasks on behalf of users. This has yet to happen. In 2006, Berners-Lee and colleagues stated that the idea remains largely unrealized.[30]
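As a rough, hypothetical illustration of what "machine-readable metadata about documents and how they are related" can look like, the short Python sketch below stores a few made-up statements as subject-predicate-object triples and queries them; real Semantic Web deployments use standardized RDF serializations and shared vocabularies rather than ad-hoc tuples like these.

# Toy illustration (not W3C RDF syntax): metadata as machine-readable
# (subject, predicate, object) statements that a program can query directly.
# The URLs and property names below are invented for this example.
triples = [
    ("http://example.org/report.html", "creator", "Alice"),
    ("http://example.org/report.html", "topic", "particle physics"),
    ("http://example.org/report.html", "cites", "http://example.org/dataset.csv"),
    ("http://example.org/dataset.csv", "creator", "Bob"),
]

def objects(subject, predicate):
    # Return every object linked to `subject` by `predicate`.
    return [o for s, p, o in triples if s == subject and p == predicate]

# An automated agent can now answer a question that would otherwise require
# a human to read the pages, e.g. who authored whatever the report cites.
for cited in objects("http://example.org/report.html", "cites"):
    print(cited, "was created by", objects(cited, "creator"))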

14.7 See also


Hypermedia
Linked Data
Computer Lib / Dream Machines
History of hypertext
History of the web browser
History of web syndication technology

14.8 References
[1] Quittner, Joshua (March 29, 1999). Network designer.
Time.
[2] Tim Berners-Lee. Frequently asked questions. World
Wide Web Consortium. Retrieved 22 July 2010.
[3] Berners-Lee, Tim. "Frequently asked questions - Start of the web: Influences". World Wide Web Consortium. Retrieved 22 July 2010.


[4] Berners-Lee, Tim. Frequently asked questions - Why the


//, #, etc?". World Wide Web Consortium. Retrieved 22
July 2010.
[5] Wright, Alex (2014-07-10). Cataloging the World: Paul
Otlet and the Birth of the Information Age. Oxford ; New
York: OUP USA. pp. 815. ISBN 9780199931415.
[6] Segal, Ben (1995). A Short History of Internet Protocols
at CERN. W3C.org.
[7] Berners-Lee, Tim (March 1989). Information Management: A Proposal. World Wide Web Consortium. Retrieved 24 August 2010.
[8] Gromov, Gregory (2011). The Next Crossroad of Web
History. Net Valley.
[9] Berners-Lee, Tim (2000-11-07). Weaving the Web:
The Original Design and Ultimate Destiny of the World
Wide Web. San Francisco: Harper. p. 23. ISBN
9780062515872.
[10] Tim Berners-Lee. Frequently asked questions - Robert
Cailliaus role. World Wide Web Consortium. Retrieved
22 July 2010.
[11] Berners-Lee, Tim. "The Original HTTP as defined in 1991". W3C.org.
[12] Berners-Lee, Tim (ca. 1993/1994). "A Brief History of the Web". World Wide Web Consortium. Retrieved 17 August 2010.

[13] Galbraith, David (July 8, 2010). "Tim Berners-Lee: Confirming the exact location of the invention of the web". DavidGalbraith.org.

[14] Raggett, Dave; Jenny Lam; Ian Alexander (April 1996). HTML 3: Electronic Publishing on the World Wide Web. Harlow, England; Reading, Mass.: Addison-Wesley. p. 21. ISBN 9780201876932.
[15] Murawski, John (24 May 2013). Hunt for worlds oldest
WWW page leads to UNC Chapel Hill. News & Observer.

[16] Shubber, Khadim (April 13, 2013). "First ever web page put back online by CERN". Wired.

[17] Brodkin, John (April 30, 2013). "First website ever goes back online on the open Web's 20th birthday". Ars Technica.

[18] Ward, Mark (3 August 2006). "How the web went world wide". BBC News. Retrieved 24 January 2011.

[19] Berners-Lee, Tim. "Qualifiers on Hypertext links...". alt.hypertext. Retrieved 11 July 2012.

[20] Berners-Lee, Tim (2000-11-07). Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web. San Francisco: Harper. p. 46. ISBN 9780062515872.

[21] Heather McCabe (1999-02-09). "Grrl Geeks Rock Out". Wired magazine.

[22] Stewart, William. "Mosaic: The First Global Web Browser". The Living Internet.

[23] NCSA Mosaic September 10, 1993 Demo

[24] Gore, Al (February 14, 1996). "The Technology Challenge: How Can America Spark Private Innovation?".

[25] Robert Cailliau (November 2, 1995). "A Short History of the Web: Text of a speech delivered at the launching of the European branch of the W3 Consortium". Net Valley. Retrieved 21 July 2010.

[26] "IW3C2 - Past and Future Conferences". International World Wide Web Conferences Steering Committee. 2010-05-02. Retrieved 16 May 2010.

[27] "'Millionth English Word' declared". BBC News. June 19, 2009.

[28] Berners-Lee, Tim (2000-11-07). Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web. San Francisco: Harper. pp. 177-198. ISBN 9780062515872.

[29] Berners-Lee, Tim; James Hendler; Ora Lassila (May 1, 2001). "The Semantic Web". Scientific American. Retrieved March 13, 2008.

[30] Shadbolt, Nigel; Wendy Hall; Tim Berners-Lee (2006). "The Semantic Web Revisited". IEEE Intelligent Systems. Retrieved April 13, 2007.

14.9 External links

First World Web site

Bemer, Bob, A History of Source Concepts for the Internet/Web

The World Wide Web History Project

Important Events in the History of the World Wide Web

Principal Figures in the Development of the Internet and the World Wide Web. University of North Carolina. Retrieved July 3, 2006.

How It All Started (slides), Tim Berners-Lee, W3C, December 2004

A Little History of the World Wide Web: from 1945 to 1995, Dan Connolly, W3C, 2000

The World Wide Web: Past, Present and Future, Tim Berners-Lee, August 1996

Internet History, Computer History Museum

25 Years of the Internet

Chapter 15

Timeline of computing hardware 2400 BC-1949
This article presents a detailed timeline of events in the
history of computing hardware: from prehistory until 1949. For narratives explaining the overall developments, see the navigation box History of computing.

15.1 Prehistory-1640

15.2 1641-1850

15.3 1851-1930

15.4 1931-1940

15.5 1941-1949

15.6 Computing timeline

Timeline of computing

1950-1979
1980-1989
1990-1999
2000-2009
2010-present

15.9 External links


A Brief History of Computing, by Stephen White. An excellent computer history site; the present article is a modified version of his timeline, used with permission.

The Evolution of the Modern Computer (1934 to 1950): An Open Source Graphical History, article from Virtual Travelog

Timeline: exponential speedup since first automatic calculator in 1623, by Jürgen Schmidhuber, from "The New AI: General & Sound & Relevant for Physics", in B. Goertzel and C. Pennachin, eds.: Artificial General Intelligence, p. 175-198, 2006.

Computing History Timeline, a photographic gallery on computing history


Chapter 16

Timeline of computing 1950-79


This article presents a detailed timeline of events in the
history of computing from 1950 to 1979. For narratives explaining the overall developments, see the History
of computing.

16.1 1950s
16.2 1960s
16.3 1970s
16.4 See also
Information revolution


16.6 External links


A Brief History of Computing, by Stephen White.
An excellent computer history site; the present article is a modified version of his timeline, used with permission.


Chapter 17

Timeline of computing 1980-89


This article presents a detailed timeline of events in the
history of computing from 1980 to 1989. For narratives explaining the overall developments, see the History
of computing.


17.1 1980
17.2 1981
17.3 1982
17.4 1983
17.5 1984
17.6 1985
17.7 1986
17.8 1987
17.9 1988
17.10 1989
17.12 External links

A Brief History of Computing, by Stephen White.


An excellent computer history site; the present article is a modified version of his timeline, used with permission.

Chapter 18

Timeline of computing 1990-99


This article presents a detailed timeline of events in the
history of computing from 1990 to 1999. For narratives explaining the overall developments, see the History
of computing.

18.1 1990

18.2 1991

18.3 1992

18.4 1993

18.5 1994

18.6 1995

18.7 1996

18.8 1997

18.9 1998

18.10 1999

18.12 External links

A Brief History of Computing, by Stephen White. An excellent computer history site; the present article is a modified version of his timeline, used with permission.


Chapter 19

Timeline of computing 2000-09


This article presents a detailed timeline of events in the
history of computing from 2000 to 2009. For narratives explaining the overall developments, see the History
of computing.

19.1 2000

19.2 2001

19.3 2002

19.4 2003

19.5 2004

19.6 2005

19.7 2006

19.8 2007

19.9 2008

19.10 2009

19.11 See also

Information revolution

19.13 External links

A Brief History of Computing, by Stephen White. An excellent computer history site; the present article is a modified version of his timeline, used with permission.


Chapter 20

Timeline of computing 2010-19


This article presents a detailed timeline of events in the
history of computing from 2010 to 2019. For narratives explaining the overall developments, see the History
of computing.

20.1 2010

20.2 2011

20.3 2012

20.4 2013

20.5 2014

Chapter 21

Timeline of computing
Timeline of computing presents events in the history of
computing organized by year and grouped into six topic
areas: predictions and concepts, first use and inventions,
hardware systems and processors, operating systems, programming languages, and new application areas. More
detailed timelines are listed toward the end of the article.

21.1 Graphical timeline


21.2 See also
History of compiler construction
History of computing hardware up to third generation (1960s)
History of computing hardware (1960s-present): third generation and later
History of the graphical user interface
History of the Internet

21.3 Resources
Stephen White, A Brief History of Computing, an excellent computer history site; the above is a modified version of his timeline, used with permission.
The Computer History in time and space, Graphing
Project, an attempt to build a graphical image of
computer history, in particular operating systems.
The Computer Revolution/Timeline at Wikibooks

21.4 External links


Visual History of Computing
Computing History Timeline


Chapter 22

Microsoft
Microsoft Corporation (/makrsft/ or /-sft/[4] ) is 22.1 History
an American multinational corporation headquartered in
Redmond, Washington, that develops, manufactures, li- Main articles: History of Microsoft and History of
censes, supports and sells computer software, consumer Microsoft Windows
electronics and personal computers and services. Its best
known software products are the Microsoft Windows line
of operating systems, Microsoft Oce oce suite, and
Internet Explorer web browser. Its agship hardware 22.1.1 197283: Founding and company
products are the Xbox game consoles and the Microsoft
beginnings
Surface tablet lineup. It is the worlds largest software
maker measured by revenues.[5] It is also one of the
worlds most valuable companies.[6]
Microsoft was founded by Bill Gates and Paul Allen on
April 4, 1975, to develop and sell BASIC interpreters
for Altair 8800. It rose to dominate the personal computer operating system market with MS-DOS in the mid1980s, followed by Microsoft Windows. The companys
1986 initial public oering, and subsequent rise in its
share price, created three billionaires and an estimated
12,000 millionaires from Microsoft employees. Since the
1990s, it has increasingly diversied from the operating
system market and has made a number of corporate acquisitions. In May 2011, Microsoft acquired Skype Technologies for $8.5 billion in its largest acquisition to date.[7]
As of 2013, Microsoft is market dominant in both the
IBM PC-compatible operating system and oce software
suite markets (the latter with Microsoft Oce). The
company also produces a wide range of other software
for desktops and servers, and is active in areas including
Internet search (with Bing), the video game industry (with
the Xbox, Xbox 360 and Xbox One consoles), the digital services market (through MSN), and mobile phones
(via the Windows Phone OS). In June 2012, Microsoft
entered the personal computer production market for the
rst time, with the launch of the Microsoft Surface, a line
of tablet computers.
With the acquisition of Nokias devices and services division to form Microsoft Mobile Oy, the company reentered the smartphone hardware market, after its previous attempt, Microsoft Kin, which resulted from their
acquisition of Danger Inc.[8]

Paul Allen (l.) and Bill Gates (r.) on October 19, 1981, in a sea
of PCs after signing a pivotal contract. IBM called Microsoft in
July 1980 inquiring about programming languages for its upcoming PC line;[9]:228 after failed negotiations with another company,
IBM gave Microsoft a contract to develop the OS for the new line
of PCs.[10]

Paul Allen and Bill Gates, childhood friends with a passion in computer programming, were seeking to make a
successful business utilizing their shared skills. In 1972
they founded their rst company named Traf-O-Data,
which oered a rudimentary computer that tracked and
analyzed automobile trac data. Allen went on to pursue
a degree in computer science at the University of Washington, later dropping out of school to work at Honeywell. Gates began studies at Harvard.[11] The January
1975 issue of Popular Electronics featured Micro Instrumentation and Telemetry Systems's (MITS) Altair 8800
microcomputer. Allen suggested that they could program a BASIC interpreter for the device; after a call from
Gates claiming to have a working interpreter, MITS requested a demonstration. Since they didn't actually have

154

22.1. HISTORY
one, Allen worked on a simulator for the Altair while
Gates developed the interpreter. Although they developed the interpreter on a simulator and not the actual device, the interpreter worked awlessly when they demonstrated the interpreter to MITS in Albuquerque, New
Mexico in March 1975; MITS agreed to distribute it,
marketing it as Altair BASIC.[9]:108, 112114 They ocially
established Microsoft on April 4, 1975, with Gates as
the CEO.[12] Allen came up with the original name of
Micro-Soft, the combination of the words microprocessor and software, as recounted in a 1995 Fortune magazine article.[13][14] In August 1977 the company formed
an agreement with ASCII Magazine in Japan, resulting in
its rst international oce, "ASCII Microsoft".[15] The
company moved to a new home in Bellevue, Washington
in January 1979.[12]
Microsoft entered the OS business in 1980 with its own version of Unix, called Xenix.[16] However, it was MS-DOS that solidified the company's dominance. After negotiations with Digital Research failed, IBM awarded a contract to Microsoft in November 1980 to provide a version of the CP/M OS, which was set to be used in the upcoming IBM Personal Computer (IBM PC).[17] For this deal, Microsoft purchased a CP/M clone called 86-DOS from Seattle Computer Products, branding it as MS-DOS, which IBM rebranded to PC DOS. Following the release of the IBM PC in August 1981, Microsoft retained ownership of MS-DOS. Since IBM copyrighted the IBM PC BIOS, other companies had to reverse engineer it in order for non-IBM hardware to run as IBM PC compatibles, but no such restriction applied to the operating systems. Due to various factors, such as MS-DOS's available software selection, Microsoft eventually became the leading PC operating systems vendor.[10][18]:210 The company expanded into new markets with the release of the Microsoft Mouse in 1983, as well as a publishing division named Microsoft Press.[9]:232 Paul Allen resigned from Microsoft in February after developing Hodgkin's disease.[9]:231

22.1.2 1984–94: Windows and Office

While jointly developing a new OS with IBM in 1984, OS/2, Microsoft released Microsoft Windows, a graphical extension for MS-DOS, on November 20, 1985.[9]:242–243, 246 Microsoft moved its headquarters to Redmond on February 26, 1986, and on March 13 the company went public;[19] the ensuing rise in the stock would make an estimated four billionaires and 12,000 millionaires from Microsoft employees.[20] Due to the partnership with IBM, in 1990 the Federal Trade Commission set its eye on Microsoft for possible collusion; it marked the beginning of over a decade of legal clashes with the U.S. Government.[21] Microsoft released its version of OS/2 to original equipment manufacturers (OEMs) on April 2, 1987;[9]:243–244 meanwhile, the company was at work on a 32-bit OS, Microsoft Windows NT, using ideas from OS/2; it shipped on July 21, 1993, with a new modular kernel and the Win32 application programming interface (API), making porting from 16-bit (MS-DOS-based) Windows easier. Once Microsoft informed IBM of NT, the OS/2 partnership deteriorated.[22]

In 1990, Microsoft introduced its office suite, Microsoft Office. The software bundled separate office productivity applications, such as Microsoft Word and Microsoft Excel.[9]:301 On May 22 Microsoft launched Windows 3.0, with streamlined user-interface graphics and improved protected mode capability for the Intel 386 processor.[23] Both Office and Windows became dominant in their respective areas.[24][25] Novell, a Word competitor from 1984–1986, filed a lawsuit years later claiming that Microsoft left part of its APIs undocumented in order to gain a competitive advantage.[26]

On July 27, 1994, the U.S. Department of Justice, Antitrust Division filed a Competitive Impact Statement that said, in part: "Beginning in 1988, and continuing until July 15, 1994, Microsoft induced many OEMs to execute anti-competitive 'per processor' licenses. Under a per processor license, an OEM pays Microsoft a royalty for each computer it sells containing a particular microprocessor, whether the OEM sells the computer with a Microsoft operating system or a non-Microsoft operating system. In effect, the royalty payment to Microsoft when no Microsoft product is being used acts as a penalty, or tax, on the OEM's use of a competing PC operating system. Since 1988, Microsoft's use of per processor licenses has increased."[27]

22.1.3 1995–2005: Internet and the 32-bit era

Bill Gates giving his deposition in 1998 for the United States v. Microsoft trial. Once the U.S. Department of Justice took over from the Federal Trade Commission in 1993, a protracted legal wrangling between Microsoft and the department ensued, resulting in various settlements and possible blocked mergers. Microsoft would point to companies such as AOL-Time Warner in its defense.[21]

Following Bill Gates's internal "Internet Tidal Wave"
memo on May 26, 1995, Microsoft began to redefine its offerings and expand its product line into computer networking and the World Wide Web.[28] The company released Windows 95 on August 24, 1995, featuring preemptive multitasking, a completely new user interface with a novel start button, and 32-bit compatibility; similar to NT, it provided the Win32 API.[29][30]:20 Windows 95 came bundled with the online service MSN (which was originally planned to be a competitor to the Internet), and, for OEMs, Internet Explorer, a web browser. Internet Explorer was not bundled with the retail Windows 95 boxes, because the boxes were printed before the team finished the web browser, and was instead included in the Windows 95 Plus! pack.[31] Branching out into new markets in 1996, Microsoft and NBC Universal created a new 24/7 cable news station, MSNBC.[32] Microsoft created Windows CE 1.0, a new OS designed for devices with low memory and other constraints, such as personal digital assistants.[33] In October 1997, the Justice Department filed a motion in the Federal District Court, stating that Microsoft violated an agreement signed in 1994 and asked the court to stop the bundling of Internet Explorer with Windows.[9]:323–324
Bill Gates handed over the CEO position on January 13, 2000, to Steve Ballmer, an old college friend of Gates and an employee of the company since 1980, creating a new position for himself as Chief Software Architect.[9]:111, 228[12] Various companies, including Microsoft, formed the Trusted Computing Platform Alliance in October 1999 to, among other things, increase security and protect intellectual property through identifying changes in hardware and software. Critics decry the alliance as a way to enforce indiscriminate restrictions over how consumers use software and over how computers behave, a form of digital rights management; for example, the scenario where a computer is not only secured for its owner, but also secured against its owner.[34][35] On April 3, 2000, a judgment was handed down in the case of United States v. Microsoft,[36] calling the company an "abusive monopoly";[37] it settled with the U.S. Department of Justice in 2004.[19] On October 25, 2001, Microsoft released Windows XP, unifying the mainstream and NT lines under the NT codebase.[38] The company released the Xbox later that year, entering the game console market dominated by Sony and Nintendo.[39] In March 2004 the European Union brought antitrust legal action against the company, citing that it had abused its dominance with the Windows OS, resulting in a judgment of €497 million ($613 million) and an order to produce new versions of Windows XP without Windows Media Player, Windows XP Home Edition N and Windows XP Professional N.[40][41]


CEO Steve Ballmer at the MIX event in 2008. In an interview about his management style in 2005, he mentioned that his first priority was to get the people he delegates to in order. Ballmer also emphasized the need to continue pursuing new technologies even if initial attempts fail, citing the original attempts with Windows as an example.[42]

22.1.4 2006–10: Windows Vista, mobile, and Windows 7

Released in January 2007, the next version of Windows, Windows Vista, focused on features, security, and a redesigned user interface dubbed Aero.[43][44] Microsoft Office 2007, released at the same time, featured a "Ribbon" user interface which was a significant departure from its predecessors. Relatively strong sales of both titles helped to produce a record profit in 2007.[45] The European Union imposed another fine of €899 million ($1.4 billion) for Microsoft's lack of compliance with the March 2004 judgment on February 27, 2008, saying that the company charged rivals unreasonable prices for key information about its workgroup and backoffice servers. Microsoft stated that it was in compliance and that "these fines are about the past issues that have been resolved".[46] 2007 also saw the creation of a multi-core unit at Microsoft, as it followed in the steps of server companies such as Sun and IBM.[47]

Bill Gates retired from his role as Chief Software Architect on June 27, 2008, while retaining other positions related to the company in addition to being an advisor for the company on key projects.[48] Azure Services Platform, the company's entry into the cloud computing market for Windows, launched on October 27, 2008.[49] On February 12, 2009, Microsoft announced its intent to open a chain of Microsoft-branded retail stores, and on October 22, 2009, the first retail Microsoft Store opened in Scottsdale, Arizona; the same day the first store opened, Windows 7 was officially released to the public. Windows 7's focus was on refining Vista with ease of use features and performance enhancements, rather than a large reworking of Windows.[50][51][52]

As the smartphone industry boomed beginning in 2007, Microsoft struggled to keep up with its rivals Apple and Google in providing a modern smartphone operating system. As a result, in 2010, Microsoft revamped its aging flagship mobile operating system, Windows Mobile, replacing it with the new Windows Phone OS, along with a new strategy in the smartphone industry that has Microsoft working more closely with smartphone manufacturers, such as Nokia, to provide a consistent user experience across all smartphones using Microsoft's Windows Phone OS. It used a new user interface design language, codenamed Metro, which prominently used simple shapes, typography and iconography, and the concept of minimalism.

22.1.5 2011–present: Rebranding, Windows 8, Surface and Nokia devices

Microsoft Surface tablet

Following the release of Windows Phone, Microsoft underwent a gradual rebranding of its product range throughout 2011 and 2012: the corporation's logos, products, services, and websites adopted the principles and concepts of the Metro design language.[55] Microsoft previewed Windows 8, an operating system designed to power both personal computers and tablet computers, in Taipei in June 2011.[56] A developer preview was released on September 13, and was replaced by a consumer preview on February 29, 2012.[57] On May 31, 2012, the Release Preview version was released.

Microsoft is a founding member of the Open Networking Foundation, started on March 23, 2011. Other founding companies include Google, HP Networking, Yahoo, Verizon, Deutsche Telekom and 17 other companies. The nonprofit organization is focused on providing support for a new cloud computing initiative called Software-Defined Networking.[53] The initiative is meant to speed innovation through simple software changes in telecommunications networks, wireless networks, data centers and other networking areas.[54]

On June 18, 2012, Microsoft unveiled the Surface, the first computer in the company's history to have its hardware made by Microsoft.[58][59] On June 25, Microsoft paid US$1.2 billion to buy the social network Yammer.[60] On July 31, 2012, Microsoft launched the Outlook.com webmail service to compete with Gmail.[61] On September 4, 2012, Microsoft released Windows Server 2012.[62] On October 1, Microsoft announced its intention to launch a news operation, part of a new-look MSN, at the time of the Windows 8 launch later in the month.[63] On October 26, 2012, Microsoft launched Windows 8 and the Microsoft Surface.[59][64] Three days later, Windows Phone 8 was launched.[65] To cope with the potential for an increase in demand for products and services, Microsoft opened a number of holiday stores across the U.S. to complement the increasing number of bricks-and-mortar Microsoft Stores that opened in 2012.[66]

General design principle behind the Start screen in Windows 8.1, Windows Phone and Xbox One

On March 29, 2013, Microsoft launched a Patent Tracker.[67] The Kinect, the motion-sensing input device by Microsoft, which was first introduced in November 2010, was upgraded for the 2013 release of the eighth-generation Xbox One. Its capabilities were revealed in May 2013: the new Kinect uses an ultra-wide 1080p camera, can function in the dark due to an infrared sensor, employs higher-end processing power and new software, can distinguish between fine movements (such as a thumb movement), and can determine a user's heart rate by looking at his or her face.[68] Microsoft filed a patent application in 2011 that suggests that the corporation may use the Kinect camera system to monitor the behavior of television viewers as part of a plan to make the viewing experience more active. On July 19, 2013, Microsoft stock suffered its biggest one-day percentage sell-off since the year 2000 after its fourth-quarter report raised concerns among investors over the poor showings of both Windows 8 and the Surface tablet; the stock declined by more than 11 percentage points and Microsoft suffered a loss of more than US$32 billion.[69] For the 2010 fiscal year, Microsoft had five product divisions: Windows Division, Server and Tools, Online Services Division, Microsoft Business Division, and Entertainment and Devices Division.

Gallery: Xbox One console; Xbox 360 Kinect sensor

On September 3, 2013, Microsoft agreed to buy Nokia's mobile unit for $7 billion.[70] Also in 2013, Amy Hood became the CFO of Microsoft.[71]

The Alliance for Affordable Internet (A4AI) was launched in October 2013, and Microsoft is part of the coalition of public and private organizations that also includes Facebook, Intel and Google. Led by Tim Berners-Lee, the A4AI seeks to make Internet access more affordable so that access is broadened in the developing world, where only 31% of people are online. Google will help to decrease internet access prices so that they fall below the UN Broadband Commission's worldwide target of 5% of monthly income.[72]

In line with the maturing PC business, in July 2013 Microsoft announced that it would reorganize the business into four new business divisions by function: Operating System, Apps, Cloud and Devices. All previous divisions will be diluted into new divisions without any workforce cuts.[73]

John W. Thompson has been appointed the chairman of Microsoft, taking over from Bill Gates.

On February 4, 2014, Steve Ballmer stepped down as CEO of Microsoft and was succeeded by Satya Nadella, who previously led Microsoft's Cloud and Enterprise division.[74] On the same day, John W. Thompson took on the role of chairman, with Bill Gates stepping down from the position to become more active within the company as Technology Advisor.[75]

On April 25, 2014, Microsoft acquired Nokia Devices and Services and formed a new subsidiary, Microsoft Mobile Oy.

On September 15, 2014, Microsoft acquired the video game development company Mojang, best known for its wildly popular flagship game Minecraft, for $2.5 billion.[76]

22.2 Businesses

22.2.1 Windows Division, Server and Tools, Online Services Division

The company's Client division produces the flagship Windows OS line such as Windows 8; it also produces the Windows Live family of products and services. Server and Tools produces the server versions of Windows, such as Windows Server 2008 R2, as well as a set of development tools called Microsoft Visual Studio; Microsoft Silverlight, a web application framework; and System Center Configuration Manager, a collection of tools providing remote-control abilities, patch management, software distribution and a hardware/software inventory. Other server products include: Microsoft SQL Server, a relational database management system; Microsoft Exchange Server, for certain business-oriented e-mail and scheduling features; Small Business Server, for messaging and other small business-oriented features; and Microsoft BizTalk Server, for business process management.

Microsoft provides IT consulting (Microsoft Consulting Services) and produces a set of certification programs handled by the Server and Tools division designed to recognize individuals who have a minimal set of proficiencies in a specific role; this includes developers (Microsoft Certified Solution Developer), system/network analysts (Microsoft Certified Systems Engineer), trainers (Microsoft Certified Trainers) and administrators (Microsoft Certified Systems Administrator and Microsoft Certified Database Administrator). Microsoft Press, which publishes books, is also managed by the division. The Online Services Business division handles the online service MSN and the search engine Bing. As of December 2009, the company also possesses an 18% ownership of the cable news channel MSNBC without any editorial control; however, the division develops the channel's website, msnbc.com, in a joint venture with the channel's co-owner, NBC Universal.[77]

22.2.2 Business Division

The Microsoft Business Division produces Microsoft Office, including Microsoft Office 2010, the company's line of office software. The software product includes Word (a word processor), Access (a relational database program), Excel (a spreadsheet program), Outlook (groupware, frequently used with Exchange Server), PowerPoint (presentation software), Publisher (desktop publishing software) and SharePoint. A number of other products were added later with the release of Office 2003, including Visio, Project, MapPoint, InfoPath and OneNote. The division also develops enterprise resource planning (ERP) software for companies under the Microsoft Dynamics brand. These include: Microsoft Dynamics AX, Microsoft Dynamics NAV, Microsoft Dynamics GP, and Microsoft Dynamics SL. They are targeted at varying company types and countries, and limited to organizations with under 7,500 employees.[78] Also included under the Dynamics brand is the customer relationship management software Microsoft Dynamics CRM, part of the Azure Services Platform.

22.2.3 Entertainment and Devices Division

See also: Microsoft Mobile Oy

The Entertainment and Devices Division produces the Windows CE OS for embedded systems and Windows Phone for smartphones.[79] Microsoft initially entered the mobile market through Windows CE for handheld devices, eventually developing into the Windows Mobile OS and now, Windows Phone. Windows CE is designed for devices where the OS may not directly be visible to the end user, in particular, appliances and cars. The division also produces computer games that run on Windows PCs and other systems, via its in-house game publisher Microsoft Studios, including titles such as Age of Empires, Halo and the Microsoft Flight Simulator series, and houses the Macintosh Business Unit, which produces Mac OS software including Microsoft Office 2011 for Mac. Microsoft's Entertainment and Devices Division designs, markets, and manufactures consumer electronics including the Xbox 360 game console, the handheld Zune media player, and the television-based Internet appliance MSN TV. Microsoft also markets personal computer hardware including mice, keyboards, and various game controllers such as joysticks and gamepads.

22.3 Culture

The Commons, located on the campus of the company's headquarters in Redmond

Technical reference for developers and articles for various Microsoft magazines such as Microsoft Systems Journal (MSJ) are available through the Microsoft Developer Network (MSDN). MSDN also offers subscriptions for companies and individuals, and the more expensive subscriptions usually offer access to pre-release beta versions of Microsoft software.[80][81] In April 2004 Microsoft launched a community site for developers and users, titled Channel 9, that provides a wiki and an Internet forum.[82] Another community site that provides daily videocasts and other services, On10.net, launched on March 3, 2006.[83] Free technical support is traditionally provided through online Usenet newsgroups, and CompuServe in the past, monitored by Microsoft employees; there can be several newsgroups for a single product. Helpful people can be elected by peers or Microsoft employees for Microsoft Most Valuable Professional (MVP) status, which entitles them to a sort of special social status and possibilities for awards and other benefits.[84]

Noted for its internal lexicon, the expression "eating our own dog food" is used to describe the policy of using pre-release and beta versions of products inside Microsoft in an effort to test them in "real-world" situations.[85] This is usually shortened to just "dog food" and is used as noun, verb, and adjective. Another bit of jargon, FYIFV or FYIV ("Fuck You, I'm [Fully] Vested"), is used by an employee to indicate they are financially independent and can avoid work anytime they wish.[86] The company is also known for its hiring process, mimicked in other organizations and dubbed the "Microsoft interview", which is notorious for off-the-wall questions such as "Why is a manhole cover round?".[87]

Microsoft is an outspoken opponent of the cap on H1B visas, which allow companies in the U.S. to employ certain foreign workers. Bill Gates claims the cap on H1B visas makes it difficult to hire employees for the company, stating in 2005, "I'd certainly get rid of the H1B cap".[88] Critics of H1B visas argue that relaxing the limits would result in increased unemployment for U.S. citizens due to H1B workers working for lower salaries.[89]

The Human Rights Campaign Corporate Equality Index, a report of how progressive the organization deems company policies towards LGBT (lesbian, gay, bisexual and transsexual) employees, rated Microsoft as 87% from 2002 to 2004 and as 100% from 2005 to 2010 after they allowed gender expression.[90]

22.4 Criticism

Main article: Criticism of Microsoft

BadVista and Defective by Design groups protest against Windows Vista

Criticism of Microsoft has followed the company's existence because of various aspects of its products and business practices. Ease of use, stability, and security of the company's software are common targets for critics. More recently, Trojan horses and other exploits have plagued numerous users due to faults in the security of Microsoft Windows and other programs. Microsoft is also accused of locking vendors into their products, and of not following and complying with existing standards in its software.[91] Total cost of ownership comparisons of Linux as well as OS X to Windows are a continuous point of debate.

The company has been in numerous lawsuits by several governments and other companies for unlawful monopolistic practices. In 2004, the European Union found Microsoft guilty in a highly publicized anti-trust case. Additionally, Microsoft's EULA for some of its programs is often criticized as being too restrictive as well as being against open source software.

Microsoft has been criticized (along with Yahoo, AOL, Google and others) for its involvement in censorship in the People's Republic of China.[92] Microsoft has also come under criticism for outsourcing jobs to China and India.[93][94][95] There were reports of poor working conditions at a factory in southern China that makes some of Microsoft's products.[96]

22.5 Corporate affairs

The company is run by a board of directors made up of mostly company outsiders, as is customary for publicly traded companies. Members of the board of directors as of September 2014 are: John W. Thompson, Dina Dublon, Bill Gates, Maria Klawe, David Marquardt, Mason Morfit,[97] Satya Nadella, Charles Noski, Helmut Panke and John W. Stanton.[98] Board members are elected every year at the annual shareholders' meeting using a majority vote system. There are five committees within the board which oversee more specific matters. These committees include the Audit Committee, which handles accounting issues with the company including auditing and reporting; the Compensation Committee, which approves compensation for the CEO and other employees of the company; the Finance Committee, which handles financial matters such as proposing mergers and acquisitions; the Governance and Nominating Committee, which handles various corporate matters including nomination of the board; and the Antitrust Compliance Committee, which attempts to prevent company practices from violating antitrust laws.[99]

Five-year history graph of NASDAQ: MSFT stock on July 17, 2013[100]

When Microsoft went public and launched its initial public offering (IPO) in 1986, the opening stock price was $21; after the trading day, the price closed at $27.75. As of July 2010, with the company's nine stock splits, any IPO shares would have been multiplied by 288; adjusted for the splits and other factors, a share bought at the IPO would cost about 9 cents today.[9]:235–236[101][102] The stock price peaked in 1999 at around $119 ($60.928, adjusting for splits).[103]

The company began to offer a dividend on January 16, 2003, starting at eight cents per share for the fiscal year, followed by a dividend of sixteen cents per share the subsequent year, switching from yearly to quarterly dividends in 2005 with eight cents a share per quarter and a special one-time payout of three dollars per share for the second quarter of the fiscal year.[103][104] Though the company had subsequent increases in dividend payouts, the price of Microsoft's stock remained steady for years.[104][105]
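The 288× multiplier and the roughly nine-cent split-adjusted price mentioned above follow from compounding the individual split ratios. A minimal sketch of that arithmetic, assuming the nine splits were seven 2-for-1 splits and two 3-for-2 splits (the combination that yields a factor of 288):

```python
# Sketch of the split-adjustment arithmetic described above.
# Assumption (not stated in the text): the nine splits consisted of
# seven 2-for-1 splits and two 3-for-2 splits, which multiply to 288.
from functools import reduce

split_ratios = [2, 2, 1.5, 1.5, 2, 2, 2, 2, 2]  # assumed ratios of the nine splits

multiplier = reduce(lambda acc, r: acc * r, split_ratios, 1.0)
ipo_close = 27.75  # closing price on the first trading day, per the text

print(f"share multiplier after nine splits: {multiplier:.0f}")          # 288
print(f"split-adjusted first-day close: ${ipo_close / multiplier:.4f}")  # ~$0.096, i.e. about 9 cents
```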
One of Microsoft's business tactics, described by an executive as "embrace, extend and extinguish", initially embraces a competing standard or product, then extends it to produce their own version which is then incompatible with the standard, which in time extinguishes competition that does not or cannot use Microsoft's new version.[106] Various companies and governments have sued Microsoft over this set of tactics, resulting in billions of dollars in rulings against the company.[107][36][41] Microsoft claims that the original strategy is not anti-competitive, but rather an exercise of its discretion to implement features it believes customers want.[108]

22.5.1 Financial

Standard & Poor's and Moody's have both given an AAA rating to Microsoft, whose assets were valued at $41 billion as compared to only $8.5 billion in unsecured debt. Consequently, in February 2011 Microsoft released a corporate bond amounting to $2.25 billion with relatively low borrowing rates compared to government bonds.[109]

For the first time in 20 years, Apple Inc. surpassed Microsoft in Q1 2011 quarterly profits and revenues, due to a slowdown in PC sales and continuing huge losses in Microsoft's Online Services Division (which contains its search engine Bing). Microsoft profits were $5.2 billion, while Apple Inc. profits were $6 billion, on revenues of $14.5 billion and $24.7 billion respectively.[110]

Microsoft's Online Services Division has been continuously loss-making since 2006, and in Q1 2011 it lost $726 million. This follows a loss of $2.5 billion for the year 2010.[111]

On July 20, 2012, Microsoft posted its first quarterly loss ever, despite earning record revenues for the quarter and fiscal year, with a net loss of $492 million due to a writedown related to the advertising company aQuantive, which had been acquired for $6.2 billion back in 2007.[112]

As of January 2014, Microsoft's market capitalization stood at $314B,[113] making it the 8th largest company in the world by market capitalization.[114]

On November 14, 2014, Microsoft overtook Exxon Mobil to become the 2nd most valuable company by market capitalization, behind only Apple Inc. Its total market value was over $410B, with the stock price hitting $50.04 a share, the highest since early 2000.[115]

22.5.2 Environment

In 2011, Greenpeace released a report rating the top ten big brands in cloud computing on their sources of electricity for their data centers. At the time, data centers consumed up to 2% of all global electricity and this amount was projected to increase. Phil Radford of Greenpeace said "we are concerned that this new explosion in electricity use could lock us into old, polluting energy sources instead of the clean energy available today",[116] and called on "Amazon, Microsoft and other leaders of the information-technology industry" to "embrace clean energy to power their cloud-based data centers".[117] In 2013, Microsoft agreed to buy power generated by a Texas wind project to power one of its data centers.[118]

Microsoft is ranked 17th in Greenpeace's Guide to Greener Electronics (16th Edition), which ranks 18 electronics manufacturers according to their policies on toxic chemicals, recycling and climate change.[119] Microsoft's timeline for phasing out BFRs and phthalates in all products is 2012, but its commitment to phasing out PVC is not clear. As yet (January 2011) it has no products that are completely free from PVC and BFRs.[120]

Microsoft's main U.S. campus received a silver certification from the Leadership in Energy and Environmental Design (LEED) program in 2008, and it installed over 2,000 solar panels on top of its buildings in its Silicon Valley campus, generating approximately 15 percent of the total energy needed by the facilities in April 2005.[121]

Microsoft makes use of alternative forms of transit. It created one of the world's largest private bus systems, the "Connector", to transport people from outside the company; for on-campus transportation, the "Shuttle Connect" uses a large fleet of hybrid cars to save fuel. The company also subsidises regional public transport as an incentive.[121][122] In February 2010, however, Microsoft took a stance against adding additional public transport and high-occupancy vehicle (HOV) lanes to a bridge connecting Redmond to Seattle; the company did not want to delay the construction any further.[123]

Microsoft was ranked number 1 in the list of the World's Best Multinational Workplaces by the Great Place to Work Institute in 2011.[124]

22.5.3 Marketing

In 2004, Microsoft commissioned research firms to do independent studies comparing the total cost of ownership (TCO) of Windows Server 2003 to Linux; the firms concluded that companies found Windows easier to administrate than Linux, thus those using Windows would administrate faster, resulting in lower costs for their company (i.e. lower TCO).[125] This spurred a wave of related studies; a study by the Yankee Group concluded that upgrading from one version of Windows Server to another costs a fraction of the switching costs from Windows Server to Linux, although companies surveyed noted the increased security and reliability of Linux servers and concern about being locked into using Microsoft products.[126] Another study, released by the Open Source Development Labs, claimed that the Microsoft studies were simply outdated and one-sided, and their survey concluded that the TCO of Linux was lower due to Linux administrators managing more servers on average, among other reasons.[127]

As part of the "Get the Facts" campaign, Microsoft highlighted the .NET trading platform that it had developed in partnership with Accenture for the London Stock Exchange, claiming that it provided "five nines" reliability. After suffering extended downtime and unreliability,[128][129] the LSE announced in 2009 that it was planning to drop its Microsoft solution and switch to a Linux-based one in 2010.[130][131]
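Both sets of TCO claims above hinge on the same simple ratio: administration cost per server falls as the number of servers each administrator manages rises. A toy sketch with invented figures (none of the numbers come from the cited studies) shows how the "more servers per Linux administrator" argument works:

```python
# Toy total-cost-of-ownership comparison. All figures are invented for
# illustration and do not come from the studies cited above.

def admin_cost_per_server(annual_admin_salary: float, servers_per_admin: float) -> float:
    """Administration component of TCO for a single server."""
    return annual_admin_salary / servers_per_admin

salary = 90_000.0  # hypothetical annual administrator salary

# Even if each Windows task is "faster", a higher servers-per-administrator
# figure on the Linux side can still yield a lower per-server cost.
windows_cost = admin_cost_per_server(salary, servers_per_admin=30)
linux_cost = admin_cost_per_server(salary, servers_per_admin=45)

print(f"Windows admin cost per server: ${windows_cost:,.0f}")  # $3,000
print(f"Linux admin cost per server:   ${linux_cost:,.0f}")    # $2,000
```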


In 2012, Microsoft hired a political pollster named Mark Penn, whom The New York Times called "famous for bulldozing" his political opponents,[132] as Executive Vice-President, Advertising and Strategy. Penn created a series of negative ads targeting one of Microsoft's chief competitors, Google. The ads, called "Scroogled", attempt to make the case that Google is "screwing" consumers with search results rigged to favor Google's paid advertisers, that Gmail violates the privacy of its users to place ad results related to the content of their emails, and that its shopping results favor Google products. Tech publications like TechCrunch have been highly critical of the ad campaign,[133] while Google employees have embraced it.[134]

22.5.4 Lay off

In July 2014, Microsoft announced plans to lay off 18,000 employees. Microsoft employed 127,104 people as of June 5, 2014, making this about a 14 percent reduction of its workforce and the biggest Microsoft layoff ever. It will include 12,500 professional and factory personnel. Previously, Microsoft had laid off 5,800 jobs in 2009 in line with the US financial crisis.[135][136]

In September 2014, Microsoft laid off 2,100 people, including 747 people in the Seattle-Redmond area, where the company is headquartered. The firings came as a second wave of the layoffs that were previously announced. This brings the total number to over 15,000 out of the 18,000 expected cuts.[137]

In October 2014, Microsoft revealed that it was almost done with the elimination of 18,000 employees, its largest-ever layoff sweep.

22.5.5 Cooperation with the United States Government

Microsoft provides information about reported bugs in their software to intelligence agencies of the United States government, prior to the public release of the fix. A Microsoft spokesperson has stated that the corporation runs several programs that facilitate the sharing of such information with the U.S. government.[138]

Following media reports about PRISM, NSA's massive electronic surveillance program, in May 2013, several technology companies were identified as participants, including Microsoft.[139] According to leaks of said program, Microsoft joined the PRISM program in 2007.[140] However, in June 2013, an official statement from Microsoft flatly denied their participation in the program:

"We provide customer data only when we receive a legally binding order or subpoena to do so, and never on a voluntary basis. In addition we only ever comply with orders for requests about specific accounts or identifiers. If the government has a broader voluntary national security program to gather customer data, we don't participate in it."[141]

During the first six months in 2013, Microsoft had received requests that affected between 15,000 and 15,999 accounts.[142] In December 2013, the company made a statement to further emphasize that it takes its customers' privacy and data protection very seriously, even saying that government snooping potentially now constitutes an "advanced persistent threat", alongside sophisticated malware and cyber attacks.[143] The statement also marked the beginning of a three-part program to enhance Microsoft's encryption and transparency efforts. On July 1, 2014, as part of this program, they opened the first (of many) Microsoft Transparency Center, which provides participating governments with the ability to review source code for its key products, assure themselves of their software integrity, and confirm there are no back doors.[144]

Microsoft has also argued that the United States Congress should enact strong privacy regulations to protect consumer data.[145]

22.5.6 Logo

Microsoft adopted the so-called "Pac-Man Logo", designed by Scott Baker, in 1987. Baker stated, "The new logo, in Helvetica italic typeface, has a slash between the o and s to emphasize the 'soft' part of the name and convey motion and speed."[146] Dave Norris ran an internal joke campaign to save the old logo, which was green, in all uppercase, and featured a fanciful letter O, nicknamed the blibbet, but it was discarded.[147] Microsoft's logo with the "Your potential. Our passion." tagline below the main corporate name is based on a slogan Microsoft used in 2008. In 2002, the company started using the logo in the United States and eventually started a TV campaign with the slogan, changed from the previous tagline of "Where do you want to go today?".[148][149][150] During the private MGX (Microsoft Global Exchange) conference in 2010, Microsoft unveiled the company's next tagline, "Be What's Next.".[151]

On August 23, 2012, Microsoft unveiled a new corporate logo at the opening of its 23rd Microsoft store in Boston, indicating the company's shift of focus from the classic style to the tile-centric modern interface which it uses/will use on the Windows Phone platform, Xbox 360, Windows 8 and the upcoming Office Suites.[152] The new logo also includes four squares with the colors of the then-current Windows logo, which have been used to represent Microsoft's four major products: Windows (blue), Office (red), Xbox (green), and Bing (yellow).[153] However, this logo is not completely new; it was featured in Windows 95 commercials from the mid-1990s.[154][155]

Logo gallery timeline: 1985–1987; 1987–2006; 2006–2011; 2011–2012; 2012–present


[12] Bill Gates: A Timeline. BBC News (BBC). July 15,


2006. Retrieved July 17, 2010.
[13] Schlender, Brent (2 October 1995). BILL GATES &
PAUL ALLEN TALK CHECK OUT THE ULTIMATE
BUDDY ACT IN BUSINESS HISTORY. Fortune Magazine.
[14] Allen, Paul (2011). Paul Allen: Idea Man. Penguin
Group. p. 91. ISBN 0141969385.

1987: Microsoft "Pac-Man" logo, designed by Scott Baker and used from 1987 to 2012, with the 1994–2002 slogan "Where do you want to go today?".[148][149]

[15] Staples, Betsy (August 1984). Kay Nishi bridges the cultural gap. Creative Computing 10 (8): 192. Retrieved
July 15, 2010.

2006–2011: Microsoft logo as of 2006–2011, with the slogan "Your potential. Our passion."[149]

[16] Dyar, Dafydd Neal (November 4, 2002). Under The


Hood: Part 8. Computer Source. Archived from the original on September 11, 2006. Retrieved July 14, 2010.

2011–2012: Logo by Microsoft with the slogan "Be what's next."[151]

[17] Engines that move markets. Books.google.co.uk. 2002.


ISBN 9780471205951. Retrieved May 29, 2011.

2012–present: Introduced on August 23, 2012, to symbolize the world of digital motion and Microsoft's diverse portfolio of products.[156]

[18] Blaxill, Mark; Eckardt, Ralph (March 5, 2009). The Invisible Edge: Taking Your Strategy to the Next Level Using Intellectual Property. Portfolio Hardcover. ISBN 159184-237-9. Retrieved July 7, 2010.

22.6 See also

[19] Microsoft Chronology. CBS News (CBS Interactive).


Retrieved August 5, 2010.

Bill Gates

[20] Bick, Julie (May 29, 2005). The Microsoft Millionaires


Come of Age. The New York Times. Retrieved July 3,
2006.

22.7 References
[1] Earnings Release FY14 Q4. Microsoft. July 22, 2014.
Retrieved September 15, 2014.
[2] Revenue and Headcount. Microsoft. Retrieved 201409-15.
[3] Microsoft Corporation Annual Reports. Microsoft Corporation. July 28, 2014. Retrieved August 23, 2014.

[21] U.S. v. Microsoft: Timeline. Wired. November 4,


2002. Retrieved July 17, 2010.
[22] Thurrott, Paul (January 24, 2003). Windows Server
2003: The Road To Gold. winsupersite.com. Penton
Media. Archived from the original on June 4, 2010. Retrieved July 15, 2010.
[23] Athow, Desire (May 22, 2010). Microsoft Windows 3.0
Is 20 Years Old Today!!!". ITProPortal. Retrieved April
4, 2012.

[4] See lot-cloth split


[5] Global Software Top 100 Edition 2011. Softwaretop100.Org. 23 August 2011.

[24] Miller, Michael (August 1, 1998). Windows 98 Put to


the Test (OS Market Share 19932001)". PC Magazine.
Retrieved July 3, 2010.

[6] Market Cap Rankings. Ycharts. Zacks Investment Research. April 8, 2012. Retrieved April 9, 2012.

[25] McCracken, Harry (September 13, 2000). A Peek at Ofce Upgrade. PCWorld. Retrieved July 4, 2006.

[7] Microsoft buys Skype for $8.5 billion. The Search Ofce Space Blog. May 10, 2011. Retrieved April 4, 2011.
[8] Notify The Next Of Kin. InformationWeek. 30 June
2010.
[9] Allan, Roy A. (2001). A History of the Personal Computer. Allan Publishing. ISBN 0-9689108-0-7. Retrieved
July 17, 2010.
[10] Microsoft to Microsoft disk operating system (MSDOS)". Smart Computing (Sandhills Publishing Company) 6 (3). March 2002. Retrieved August 18, 2008.
[11] Microsoft Company History.

[26] Waner, Jim (November 12, 2004). Novell Files WordPerfect Suit Against Microsoft. internetnews.com. Retrieved July 15, 2010.
[27] Competitive Impact Statement : U.S. v. Microsoft Corporation. Justice.gov. Retrieved May 11, 2011.
[28] Borland, John (April 15, 2003). Victor: Software empire
pays high price. CNET (CBS Interactive). Retrieved July
16, 2010.
[29] Cope, Jim (March 1996). New And Improved. Smart
Computing (Sandhills Publishing Company) 4 (3). Retrieved July 16, 2010.


[30] Pietrek, Matt (March 1996). Windows 95 Programming


Secrets (PDF). IDG. ISBN 1-56884-318-6. Retrieved
July 17, 2010.

[46] AFP:EU hits Microsoft with record 899 million euro


antitrust ne. Google News (Google). Agence FrancePresse. February 27, 2008. Retrieved June 1, 2008.

[31] Thurrott, Paul (May 31, 2005). MSN: The Inside Story.
winsupersite.com (Penton Media). Retrieved July 17,
2010.

[47] Microsoft, Multi-core and the Data Center.

[32] Marketplace: News Archives. Marketplace. American


Public Media. July 15, 1996. Archived from the original
on August 23, 2004.
[33] Tilly, Chris. The History of Microsoft Windows CE.
HPC:Factor. Retrieved August 18, 2008.
[34] Marko, John (June 20, 2002). Fears of Misuse of Encryption System Are Voiced. The New York Times. Retrieved July 7, 2010.
[35] Stajano, Frank (2003). Security for whom? The shifting
security assumptions of pervasive computing. Software
SecurityTheories and Systems. Lecture notes in computer science (Springer-Verlag Berlin Heidelberg) 2609:
1627. doi:10.1007/3-540-36532-X_2. ISBN 978-3540-00708-1. Retrieved July 6, 2010.
[36] United States v. Microsoft. U.S. Department of Justice.
Retrieved August 5, 2005.
[37] Jackson, Thomas Peneld (November 5, 1999). U.S. vs.
Microsoft ndings of fact. U.S. Department of Justice.
Retrieved August 18, 2008.
[38] Thurrott, Paul (October 26, 2001). WinInfo Short Takes:
Windows XP Launch Special Edition. Windows IT Pro
(Penton Media). Retrieved July 16, 2010.
[39] NPD REPORTS ANNUAL 2001 U.S. INTERACTIVE
ENTERTAINMENT SALES SHATTER INDUSTRY
RECORD (Press release). Port Washington, New York:
NPD Group. February 7, 2002. Archived from the original on August 14, 2004. Retrieved January 28, 2015.
[40] Microsoft hit by record EU ne. CNN. March 25, 2004.
Archived from the original on April 13, 2006. Retrieved
August 14, 2010.
[41] Commission Decision of 24.03.2004 relating to a proceeding under Article 82 of the EC Treaty (Case
COMP/C-3/37.792 Microsoft)" (PDF). Commission of
the European Communities. April 21, 2004. Retrieved
August 5, 2005.
[42] Wee, Gerald (November 10, 2005). Steve Ballmer on
management style. ITWorld (IDG). CIO Asia. Retrieved
January 29, 2011.
[43] Vamosi, Robert (January 23, 2007). Windows Vista Ultimate review. CNET. CBS Interactive. Retrieved April
4, 2012.
[44] Ricadela, Aaron (February 14, 2006). Gates Says Security Is Job One For Vista. InformationWeek. UBM
TechWeb. Retrieved April 4, 2012.
[45] Vista gives Microsoft view of record prot. Edinburgh
Evening News. Johnston Press. April 27, 2007. Retrieved
February 1, 2009.

[48] Conte, Natali Del (June 15, 2006). Bill Gates Announces
Resignation. PC Magazine (Zi Davis). Retrieved July
17, 2010.
[49] Fried, Ina (October 27, 2008). Microsoft launches Windows Azure. CNET. CBS Interactive. Retrieved July 6,
2010.
[50] Fried, Ina (February 12, 2009). Microsoft follows Apple into the retail business. CNET. CBS Interactive. Retrieved July 17, 2010.
[51] Gaynor, Tim (October 22, 2009). Long lines as Microsoft opens retail store. Reuters (Thomson Reuters).
Retrieved July 3, 2010.
[52] Mintz, Jessica (October 22, 2009). Windows 7 operating
system makes its debut. NBCNews.com (NBCUniversal).
Associated Press. Retrieved April 4, 2012.
[53] Erickson, David (March 21, 2011). Open Networking
Foundation News Release. Openow.org. Retrieved May
29, 2011.
[54] ""Google and other titans form Open Networking Foundation. Noyes, March 23, 2011. Computerworld. IDG.
March 23, 2011. Retrieved May 29, 2011.
[55] Windows Phone 7 Series UI Design & Interaction
Guide. March 18, 2010. Retrieved 2010-10-09.
[56] Microsoft releases nal test version of Windows 8.
Business Line (Kasturi & Sons). June 1, 2012. Retrieved
August 4, 2012.
[57] Roso, Matt (January 5, 2011). OK, So Windows 8 Is
Coming To ARM Tablets...Someday (MSFT)". San Francisco Chronicle. Retrieved January 5, 2011.
[58] Sullivan, Mark. Microsoft Announces New 'Surface'
Tablet PC. PCWorld. Retrieved June 19, 2012.
[59] Eichenwald, Kurt, Microsofts Lost Decade: How Microsoft Lost Its Mojo, Vanity Fair, August 2012
[60] Acohido, Byron (June 25, 2012). Microsoft buys Internet startup Yammer for $1.2 billion. USA Today
(Gannett Company). Retrieved 25 June 2012.
[61] Thurrott, Paul (31 July 2012). Outlook.com Mail: Microsoft Reimagines Webmail. Supersite for Windows.
Penton Media. Retrieved 1 August 2012.
[62] Microsoft Corp. (8 August 2012). Windows Server 2012
Save the Date Announcement.
[63] Rigby, Bill (October 1, 2012). Microsoft launching news
operation, new MSN. Reuters. Retrieved October 1,
2012.
[64] Windows 8s delivery date: October 26. ZDNet. July
18, 2012. Retrieved September 17, 2012.


[65] Mary Jo Foley: Windows Phone 8 launch date revealed.


LiveSide.net. 2012-08-30. Retrieved 2012-11-27.
[66] Microsoft prepping for complete brand and product line
relaunch, New York store coming the 26th. wpcentral.com. Retrieved November 3, 2012.
[67] Microsoft launches 'Patent Tracker' to help you search
its library of intellectual property. The Next Web. March
28, 2013. Retrieved March 29, 2013.
[68] David Pierce (21 May 2013). The all-seeing Kinect:
tracking my face, arms, body, and heart on the Xbox
One. The Verge. Vox Media, Inc. Retrieved 28 May
2013.
[69] Funky Friday: More than $32 billion in Microsoft
stock value wiped out | Microsoft CNET News.
News.cnet.com. Retrieved 2013-07-21.
[70] Microsoft buying Nokias phone business in a $7.2 billion
bid for its mobile future.
[71] Microsoft names insider Amy Hood as CFO.
Reuters.com. Retrieved 2014-04-18.
[72] Samuel Gibbs (7 October 2013). Sir Tim Berners-Lee
and Google lead coalition for cheaper internet. The
Guardian. Retrieved 8 October 2013.
[73] Microsofts sweeping reorganization shifts focus to services, devices. July 11, 2013.
[74] Microsoft CEO Steve Ballmer to retire within 12
months.
[75]
[76] Hutchinson, Lee. Its ocial: Microsoft acquires Mojang
and Minecraft for $2.5 billion. Ars Technica. Retrieved
19 September 2014.
[77] Carter, Bill (December 24, 2005). Microsoft Quits
MSNBC TV, but Web Partnership Remains. The New
York Times. Retrieved July 6, 2010.
[78] Four Products Advance on Dynamics ERP Roadmap.
Directions on Microsoft. April 27, 2009. Retrieved July 3,
2010.


[84] Bray, Hiawatha (June 13, 2005). Somehow, Usenet lumbers on. The Boston Globe. Retrieved July 3, 2006.
* Microsoft MVP Frequently Asked Questions. Microsoft. Retrieved July 1, 2006.
[85] CNET News.com Sta (July 21, 2003). Microsoft tests
its own dog food. ZDNet (CNET Networks, Inc.).
Archived from the original on January 8, 2007. Retrieved
October 9, 2005.
[86] Heileman, John (November 2000). The Truth, The
Whole Truth, and Nothing But The Truth. Wired. Retrieved September 30, 2007.
[87] Poundstone, William (May 21, 2003). Square Manhole
Covers and Crazy Questions. G4TV.com. Retrieved July
1, 2006.
[88] Mark, Roy (April 27, 2005). Gates Rakes Congress on
H1B Visa Cap. internetnews.com. Retrieved August 18,
2008.
[89] Bill Gates Targets Visa Rules for Tech Workers. NPR.
March 12, 2008. Retrieved July 6, 2010.
[90] Corporate Equality Index Archive. Human Rights
Campaign Foundation. Retrieved July 17, 2010.
[91] Writing history with Microsofts Oce lock-in.
[92] Corporate Complicity in Chinese Internet Censorship.
Retrieved 2006-11-23.
[93] "Whos buying Microsofts outsourcing excuses?".
InfoWorld. April 22, 2010.
[94] "Microsoft plans to outsource more, says ex-worker". The
Seattle Times. September 3, 2005.
[95] "High-end tech jobs outsourced by Microsoft". Taipei
Times. June 17, 2004.
[96] "Microsoft Investigates 'Mass Suicide Threat'". Sky News.
January 11, 2012
[97] Microsoft appoints activist investor Mason Mort to
board. Reuters. March 11, 2014.
[98] Microsoft Board of Directors. PressPass (Press release).
Microsoft. Retrieved September 6, 2014.
[99] Microsoft Corporation Corporate Governance Guidelines. Microsoft. July 1, 2009. Retrieved July 18, 2010.

[79] Cha, Bonnie (September 1, 2010). Microsoft releases


Windows Phone 7 to manufacturers. CNET. CBS Inter- [100] Five year history graph of (NASDAQ:MSFT) stock.
active. Retrieved September 7, 2010.
ZenoBank. AlphaTrade. September 29, 2009. Retrieved
September 29, 2009.
[80] MSDN Subscription FAQ. Microsoft. Retrieved July
3, 2006.
[101] Monkman, Carol Smith (March 14, 1986). Microsoft
stock is red hot on rst trading day. Seattle Post[81] Microsoft Systems Journal Homepage. Microsoft.
Intelligencer (Hearst Seattle Media, LLC). p. B9. ReApril 15, 2004. Retrieved August 18, 2008.
trieved July 18, 2010.
[82] Hobson, Neville (April 11, 2005). Microsofts Channel [102] MSFT stock performance and split info. Morningstar,
9 And Cultural Rules. WebProNews (iEntry Inc). ReInc. Retrieved July 17, 2010.
trieved July 3, 2006.
[103] Microsoft stock price spreadsheet from Microsoft in[83] On10.net homepage. Microsoft. Retrieved May 4,
vestor relations (xls). Microsoft. Retrieved August 18,
2006.
2008.


[104] Dividend Frequently Asked Questions. Microsoft. Re- [120] Ranking tables October 2010 Greenpeace Internatrieved August 18, 2008.
tional. Greenpeace International. Retrieved January 24,
2011.
[105] Yahoo MSFT stock chart. Yahoo Finance. Retrieved
December 13, 2008.
[121] Mills, Elinor (June 6, 2008). Microsoft vs. Google:
* MSN Money MSFT chart with dividend and split info.
Whos greener?". CNET (CBS Interactive). Retrieved
MSN Money. Microsoft. Retrieved December 13, 2008.
July 3, 2010.
* Fried, Ina; Ard, Scott (June 15, 2006). Gates stepping
down from full-time Microsoft role, page 2. ZDNet. Re- [122] Fostering Alternative Ways to Commute at Microsoft.
Microsoft. Archived from the original on May 1, 2008.
trieved August 18, 2008.
[106] Rodger, Will (November 8, 1998). Intel exec: MS [123] Seattle hires consultant to look at 520 bridge plan. King5
wanted to 'extend, embrace and extinguish' competition.
Television News. February 23, 2010. Retrieved July 3,
ZDNet. Retrieved August 18, 2008.
2010.
[107] Microsoft Corp. Licenses Burst.com Patents & Settles
Suit (Press release). Burst.com Inc. March 11, 2005.
Retrieved August 18, 2008.
* Orlowski, Andrew (March 5, 2004). Eolas web patent
nullied. The Register. Situation Publishing Ltd. Retrieved May 18, 2006.
* Dennis, Tony (December 24, 2002). Sendo & Microsoft it all ends in tears. TheInquirer.net. Archived
from the original on May 29, 2008. Retrieved May 18,
2006.
* Nystedt, Dan (December 7, 2005). Update: Microsoft
ned $32M by South Korea. IDG News Service. Retrieved August 18, 2008.

[124] Tu, Janet I. (October 28, 2011). Microsoft Pri0 | Microsoft named best multinational workplace. Seattle
Times Newspaper. Retrieved November 3, 2011.
[125] Bishop, Todd (January 27, 2004). Studies on Linux help
their patron: Microsoft. Seattle Post-Intelligencer (Hearst
Seattle Media, LLC). Retrieved July 16, 2010.
[126] Foley, Mary Jo (March 24, 2004). Yankee Independently
Pits Windows TCO vs. Linux TCO. eWeek. Retrieved
July 14, 2010.

[127] Jaques, Robert (February 13, 2006). Linux fans hit back
at Microsoft TCO claims. vnunet.com. Retrieved August
[108] U.S. v. Microsoft: We're Defending Our Right to Inno18, 2008.
vate. The Wall Street Journal. May 20, 1998. Archived
from the original on November 17, 2007. Retrieved [128] Rowena Mason (September 10, 2008). Seven-hour LSE
blackout caused by double glitch. London: The TeleMarch 31, 2006.
graph.
[109] Microsoft sells $2.25 billion of debt at low rates.
Reuters. February 4, 2011.
[129] London Stock Exchange trading hit by technical glitch.
BBC News. November 26, 2009.
[110] Charles Arthur (April 28, 2011). Microsoft falls behind
Apple for rst time in 20 years | Technology. London: [130] David M. Williams (October 8, 2009). London Stock
The Guardian. Retrieved May 11, 2011.
Exchange gets the facts and dumps Windows for Linux.
ITWire.
[111] MG Siegler Apr 29, 2011 (April 29, 2011). When Will
Microsofts Internet Bloodbath End?". Techcrunch.com.
[131] London Stock Exchange Rejects .NET For Open
Retrieved May 11, 2011.
Source. Slashdot. October 6, 2009.
[112] White, Martha. Microsoft reports rst quarterly loss
[132] Wingeld, Nick (December 14, 2012). Microsoft Battles
ever. Retrieved 20 July 2012.
Google by Hiring Political Brawler Mark Penn. The New
York Times.
[113] Microsoft Overview. Marketwatch. Retrieved 2 February 2014.
[133] Scroogled: Why So Negative, Microsoft?". TechCrunch.
[114] Global Top 100 Companies. PWC. Retrieved 2 Febru2013-02-10. Retrieved 2014-04-18.
ary 2014.
[134] Kashmir Hill (November 21, 2013). Googlers Love Mi[115] Microsoft Surpasses Exxon as 2nd Most Valuable Co..
crosofts 'Scroogled' Gear. Mug and Shirts Sell Out..
AssociatedPress. Retrieved 14 November 2014.
Forbes.
[116] Dirty Data Report Card. Greenpeace. Retrieved Au[135] Microsoft to cut up to 18,000 jobs over next year. July
gust 22, 2013.
17, 2014.
[117] Amazon, Microsoft: Lets keep 'the cloud' clean, Phil
[136] Microsoft Layos Greater Than Expected: Up to 18,000
Radford
Jobs Being Cut. Gamespot. Retrieved 10 August 2014.
[118] Microsoft looks to boost eco credentials with wind[137] By Alex Wilhelm, TechCrunch. Lays O 2,100 More
powered data centre, Suzanne Goldenberg
Employees. September 18, 2014. September 18, 2014.
[119] Guide to Greener Electronics Greenpeace International
(16th Edition)". Greenpeace International. Retrieved [138] U.S. Agencies Said to Swap Data With Thousands of
April 3, 2012.
Firms. Bloomberg.


[139] Ryan W. Neal (July 11, 2013). Snowden Reveals


Microsoft PRISM Cooperation: Helped NSA Decrypt
Emails, Chats, Skype Conversations. International Business Times.
[140] Greenwald, Glenn; MacAskill, Ewen (June 7, 2013).
NSA Prism program taps in to user data of Apple,
Google and others. The Guardian. Guardian News and
Media Limited. Retrieved April 26, 2014.
[141] Johnson, Kevin; Martin, Scott; O'Donnell, Jayne; Winter,
Michael (June 15, 2013). Reports: NSA Siphons Data
from 9 Major Net Firms. USA Today. Retrieved June 6,
2013.
[142] Microsoft, Facebook, Google and Yahoo release US
surveillance requests. The Guardian. February 3, 2014.
[143] Smith, Brad (December 4, 2013). Protecting customer
data from government snooping. The Ocial Microsoft
Blog. Retrieved 1 January 2015.
[144] Thomlinson, Matt (July 1, 2014). Advancing our encryption and transparency eorts. Microsoft on the Issues.
Retrieved 1 January 2015.
[145] Heiner, David. Request for Comment: Big Data and
Consumer Privacy in the Internet Economy. National
Telecommunications and Information Administration. Microsoft. Retrieved 12 August 2014.
[146] Computer Reseller News Magazine. March 1987.
[147] Osterman, Larry (July 14, 2005). Remember the blibbet. Larry Ostermans WebLog. Microsoft. Retrieved
August 18, 2008.
[148] The Rise and Rise of the Redmond Empire. Wired. December 1998. Retrieved August 18, 2008.
[149] Schmelzer, Randi (January 9, 2006). McCann Thinks
Local for Global Microsoft. Adweek. Retrieved August
18, 2008.
[150] Reimer, Jeremy (January 23, 2006). Microsoft set to
launch new marketing campaign. Ars Technica (Cond
Nast Digital). Retrieved August 18, 2008.
[151] Topolsky, Joshua (July 22, 2010). New Microsoft brand
logos, company tagline revealed at MGX event? (update:
no new logos, tagline is a go)". Engadget. AOL. Retrieved
August 2, 2012.
[152] Meisner, Jerey (August 23, 2012). Microsoft Unveils
a New Look. The Ocial Microsoft Blog. Retrieved
August 23, 2012.
[153] Eric, Steven H. (August 23, 2012). NEW MICROSOFT
LOGO REVEALED. Flapship.com. Retrieved August
23, 2012.
[154] Microsofts new logo has ties to the past.
[155] Microsofts logo is not new, its from 1995.
[156] Microsoft Unveils a New Look. Microsoft. August
2012. Retrieved August 23, 2012.


22.8 External links


Official website
Official blog

Business data for Microsoft Corporation:

Hoovers
Reuters
SEC filings

Microsoft companies grouped at OpenCorporates

Coordinates: 47°38′23″N 122°07′42″W / 47.63972°N 122.12833°W

Chapter 23

IBM
This article is about the technology company sometimes
referred to as Big Blue. For other uses of these terms,
see IBM (disambiguation) and Big Blue (disambiguation).

The International Business Machines Corporation (IBM) is an American multinational technology and consulting corporation, with headquarters in Armonk, New York, United States. IBM manufactures and markets computer hardware and software, and offers infrastructure, hosting and consulting services in areas ranging from mainframe computers to nanotechnology.[3]

The company was founded in 1911 as the Computing-Tabulating-Recording Company (CTR) through a merger of the Tabulating Machine Company, the International Time Recording Company, and the Computing Scale Company.[4][5] CTR was changed to International Business Machines in 1924, using a name which had originated with CTR's Canadian subsidiary. The acronym IBM followed. Securities analysts nicknamed the company Big Blue for its size and common use of the color in products, packaging, and logo.[6]

In 2012, Fortune ranked IBM the No. 2 largest U.S. firm in terms of number of employees (435,000 worldwide),[7] the No. 4 largest in terms of market capitalization,[8] the No. 9 most profitable,[9] and the No. 19 largest firm in terms of revenue.[10] Globally, the company was ranked the No. 31 largest in terms of revenue by Forbes for 2011.[11][12] Other rankings for 2011/2012 include No. 1 company for leaders (Fortune), No. 1 green company in the U.S. (Newsweek), No. 2 best global brand (Interbrand), No. 2 most respected company (Barron's), No. 5 most admired company (Fortune), and No. 18 most innovative company (Fast Company).[13]

IBM has 12 research laboratories worldwide, bundled into IBM Research. As of 2013 the company held the record for most patents generated by a business for 22 consecutive years.[14] Its employees have garnered five Nobel Prizes, six Turing Awards, ten National Medals of Technology, and five National Medals of Science.[15] Notable company inventions include the automated teller machine (ATM), the floppy disk, the hard disk drive, the magnetic stripe card, the relational database, the Universal Product Code (UPC), the financial swap, the Fortran programming language, SABRE airline reservation system, DRAM, copper wiring in semiconductors, the silicon-on-insulator (SOI) semiconductor manufacturing process, and Watson artificial intelligence.

IBM has constantly evolved since its inception, acquiring properties such as Kenexa (2012) and SPSS (2009) and organizations such as PwC's consulting business (2002), spinning off companies like printer manufacturer Lexmark (1991), and selling off product lines like its personal computer and server businesses to Lenovo (2005, 2014). In 2014 IBM announced that it would offload IBM Micro Electronics semiconductor manufacturing to Global Foundries. This transition is in progress as of early 2015.

23.1 History

Main article: History of IBM

In the 1880s, three technologies emerged that would form the core of what would become International Business Machines (IBM). Julius E. Pitrat patented the computing scale in 1885;[16] Alexander Dey invented the dial recorder (1888);[17] Herman Hollerith patented the Electric Tabulating Machine;[18] and Willard Bundy invented a time clock to record a worker's arrival and departure time on a paper tape in 1889.[19]

On June 16, 1911, these technologies and their respective companies were merged by Charles Ranlett Flint
to form the Computing-Tabulating-Recording Company
(C-T-R).[20] The New York City-based company had
1,300 employees and oces and plants in Endicott and
Binghamton, New York; Dayton, Ohio; Detroit, Michigan; Washington, D.C.; and Toronto, Ontario. It manufactured and sold machinery ranging from commercial
scales and industrial time recorders to meat and cheese
slicers, along with tabulators and punched cards.
Flint recruited Thomas J. Watson, Sr., formerly of the
National Cash Register Company, to help lead the company in 1914.[20] Watson implemented generous sales
incentives, a focus on customer service, an insistence
on well-groomed, dark-suited salesmen and an evangel-

168

23.1. HISTORY
ical fervor for instilling company pride and loyalty in every worker.[21] His favorite slogan, THINK, became
a mantra for C-T-Rs employees, and within 11 months
of joining C-T-R, Watson became its president.[21] The
company focused on providing large-scale, custom-built
tabulating solutions for businesses, leaving the market for
small oce products to others. During Watsons rst four
years, revenues more than doubled to $9 million and the
companys operations expanded to Europe, South America, Asia, and Australia.[21] On February 14, 1924, C-T-R
was renamed the International Business Machines Corporation (IBM),[13] citing the need to align its name with the
growth and extension of [its] activities.[22]

23.1.1 1930–1979

NACA researchers using an IBM type 704 electronic data processing machine in 1957

In 1937, IBM's tabulating equipment enabled organizations to process unprecedented amounts of data, its clients including the U.S. Government, during its first effort to maintain the employment records for 26 million people pursuant to the Social Security Act,[23] and the Third Reich,[24] largely through the German subsidiary Dehomag. During the Second World War the company produced small arms for the American war effort (M1 Carbine and Browning Automatic Rifle). IBM provided translation services for the Nuremberg Trials. In 1947, IBM opened its first office in Bahrain,[25] as well as an office in Saudi Arabia to service the needs of the Arabian-American Oil Company that would grow to become Saudi Business Machines (SBM).[26]

In 1952, Thomas Watson, Sr., stepped down after almost 40 years at the company helm; his son, Thomas Watson, Jr., was named president. In 1956, the company demonstrated the first practical example of artificial intelligence when Arthur L. Samuel of IBM's Poughkeepsie, New York, laboratory programmed an IBM 704 not merely to play checkers but to learn from its own experience. In 1957, the FORTRAN (FORmula TRANslation) scientific programming language was developed.

In 1961, Thomas J. Watson, Jr., was elected chairman of the board and Albert L. Williams became company president. The same year IBM developed the SABRE (Semi-Automatic Business-Related Environment) reservation system for American Airlines and introduced the highly successful Selectric typewriter.

In 1963, IBM employees and computers helped NASA track the orbital flight of the Mercury astronauts. A year later it moved its corporate headquarters from New York City to Armonk, New York. The latter half of the 1960s saw IBM continue its support of space exploration, participating in the 1965 Gemini flights, 1966 Saturn flights, and 1969 lunar mission.

On April 7, 1964, IBM announced the first computer system family, the revolutionary IBM System/360. Sold between 1964 and 1978, it spanned the complete range of commercial and scientific applications from large to small, allowing companies for the first time to upgrade to models with greater computing capability without having to rewrite their applications.

In 1974, IBM engineer George J. Laurer developed the Universal Product Code.[27] On October 11, 1973, IBM introduced the IBM 3666, a laser-scanning point-of-sale barcode reader which would become the backbone of retail checkouts. On June 26, 1974, at Marsh's supermarket in Troy, Ohio, a pack of Wrigley's Juicy Fruit chewing gum was the first-ever product scanned. It is now on display at the Smithsonian Institution's National Museum of American History in Washington, D.C.

In the late 1970s, IBM underwent a wave of internal convulsions between a management faction wanting to concentrate on its bread-and-butter mainframe business and one desiring to expand into the emerging personal computer industry.

23.1.2 1980–Present

IBM's Blue Gene supercomputers were awarded the National Medal of Technology and Innovation by U.S. President Barack Obama on September 18, 2009.

Financial swaps were first introduced to the public in 1981, when IBM and the World Bank entered into a swap agreement.[28] The IBM PC, originally designated IBM 5150, was introduced in 1981, and it soon became the industry standard. In 1991, IBM sold printer manufacturer Lexmark. In 1993, IBM posted a US$8 billion loss, at the time the biggest in American corporate history.[29]

In 2002, IBM acquired PwC consulting. In 2003, it initiated a project to redefine company values. Using its Jam technology, it hosted a three-day Internet-based online discussion of key business issues with 50,000 employees. Results were data mined with sophisticated text analysis software (eClassifier) for common themes. Three emerged, expressed as: "Dedication to every client's success", "Innovation that matters, for our company and for the world", and "Trust and personal responsibility in all relationships".[30] Another three-day Jam was conducted in 2004, with 52,000 employees discussing ways to implement company values in practice.[31]

IBM showing their various innovations at CeBIT 2010 in Hanover, Germany

In 2005, the company sold its personal computer business to Lenovo and, in the same year, agreed to acquire Micromuse.[32] A year later, IBM launched Secure Blue, a low-cost hardware design for data encryption that can be built into a microprocessor.[33] In 2009, it acquired software company SPSS Inc. Later in 2009, IBM's Blue Gene supercomputing program was awarded the National Medal of Technology and Innovation by U.S. President Barack Obama. In 2011, IBM gained worldwide attention for its artificial intelligence program Watson, which was exhibited on Jeopardy!, where it won against game show champions Ken Jennings and Brad Rutter. As of 2012, IBM had been the top annual recipient of U.S. patents for 20 consecutive years.[34]

IBM's closing value of $214 billion on September 29, 2011 surpassed Microsoft's $213.2 billion valuation; it was the first time since 1996 that IBM's closing price exceeded its software rival's. On August 16, 2012, IBM announced it had entered an agreement to buy Texas Memory Systems.[35] Later that month, IBM announced it had agreed to buy Kenexa. In June 2013, IBM acquired SoftLayer Technologies, a web hosting service, in a deal of around $2 billion;[36] and in July 2014, the company announced a partnership with Apple Inc. in mobile enterprise.[37][38]

In September 2014, it was announced that IBM would sell its x86 server division to Lenovo for a fee of $2.1 billion.[40] That same year, Reuters referred to IBM as largely a computer services supplier.[41]

On August 11, 2014, IBM announced it had acquired the business operations of Lighthouse Security Group, LLC, a cloud security services provider. Financial terms were not disclosed.[39]

In November 2014, IBM and Twitter announced a global landmark partnership which they claim will change how institutions and businesses understand their customers, markets and trends. With Twitter's data on people and IBM's cloud-based analytics and customer engagement platforms, they plan to help enterprises make better, more informed decisions. The partnership will give enterprises and institutions a way to make sense of Twitter's mountain of data using IBM's Watson supercomputer.[42]

23.2 Rank

In 2012, Fortune ranked IBM the No. 2 largest U.S. firm in terms of number of employees,[7] the No. 4 largest in terms of market capitalization,[8] the No. 9 most profitable,[9] and the No. 19 largest firm in terms of revenue.[10] Globally, the company was ranked the No. 31 largest firm in terms of revenue by Forbes for 2011.[11] Other rankings for 2011/2012 include the following:[13]

No. 1 company for leaders (Fortune)
No. 1 green company in the U.S. (Newsweek)[43]
No. 2 best global brand (Interbrand)
No. 2 most respected company (Barron's)[44]
No. 5 most admired company (Fortune)
No. 18 most innovative company (Fast Company)

For 2012, IBM's brand was valued by Interbrand at $75.5 billion.[45]

For 2012, Vault ranked IBM Global Technology Services No. 1 in tech consulting for cyber security, operations and implementation, and public sector; and No. 2 in outsourcing.[46]

23.3 Corporate affairs

IBM is headquartered in Armonk, New York.[47] The 283,000-square-foot (26,300 m2) glass and stone building sits on a 25-acre (10 ha) parcel amid a 432-acre former apple orchard the company purchased in the mid-1950s.[48]



The company's 14-member Board of Directors is responsible for overall corporate management. As of Cathie Black's resignation in November 2010, its membership (by affiliation and year of joining) included: Alain J. P. Belda '08 (Alcoa), William R. Brody '07 (Salk Institute / Johns Hopkins University), Kenneth Chenault '98 (American Express), Michael L. Eskew '05 (UPS), Shirley Ann Jackson '05 (Rensselaer Polytechnic Institute), Andrew N. Liveris '10 (Dow Chemical), W. James McNerney, Jr. '09 (Boeing), James W. Owens '06 (Caterpillar), Samuel J. Palmisano '00 (IBM), Joan Spero '04 (Doris Duke Charitable Foundation), Sidney Taurel '01 (Eli Lilly), and Lorenzo Zambrano '03 (Cemex).[49]

On January 21, 2014, IBM announced that company executives would forgo bonuses for fiscal year 2013. The move came as the firm reported a 5% drop in sales and 1% decline in net profit over 2012. It also committed to a $1.2bn-plus expansion of its data center and cloud-storage business, including the development of 15 new data centers.[50] After ten successive quarters of flat or sliding sales under Chief Executive Virginia Rometty, IBM is being forced to look at new approaches. Said Rometty, "We've got to reinvent ourselves like we've done in prior generations."[51]

23.4 Facilities
The company has twelve research labs worldwide, bundled under IBM Research and headquartered at the Thomas J. Watson Research Center in New York. Others include the Almaden lab in California, the Austin lab in Texas, the Australia lab in Melbourne, the Brazil lab in São Paulo and Rio de Janeiro, the China lab in Beijing and Shanghai, the Ireland lab in Dublin, the Haifa lab in Israel, the India lab in Delhi and Bangalore, the Tokyo lab, the Zurich lab, and the Africa lab in Nairobi.

Other major campus installations include towers in Montreal, Paris, and Atlanta; software labs in Raleigh-Durham, Rome, Cracow, Toronto, Johannesburg, and Seattle; and facilities in Hakozaki and Yamato. The company also operates the IBM Scientific Center, Hursley House, the Canada Head Office Building, IBM Rochester, and the Somers Office Complex. The company's contributions to architecture and design, which include works by Eero Saarinen, Ludwig Mies van der Rohe, and I.M. Pei, have been recognized. Van der Rohe's 330 North Wabash building in Chicago, the original center of the company's research division post-World War II, was recognized with the 1990 Honor Award from the National Building Museum.[52]

IBM Rochester (Minnesota), nicknamed the Big Blue Zoo
IBM Avenida de América Building in Madrid, Spain
Thomas J. Watson Research Center in Yorktown Heights, New York, designed by Eero Saarinen
Somers (New York) Office Complex, designed by I.M. Pei
IBM Japan Makuhari Technical Center, designed by Yoshio Taniguchi
IBM Haifa Research Lab, Israel

23.5 Work environment


IBM's employee management practices can be traced back to its roots. In 1914, CEO Thomas J. Watson boosted company spirit by creating employee sports teams, hosting family outings, and furnishing a company band. In 1924 the Quarter Century Club, which recognizes employees with 25 years of service, was organized and the first issue of Business Machines, IBM's internal publication, was published. In 1925, the first meeting of the Hundred Percent Club, composed of IBM salesmen who meet their quotas, convened in Atlantic City, New Jersey.

IBM was among the first corporations to provide group life insurance (1934), survivor benefits (1935) and paid vacations (1937). In 1932 IBM created an Education Department to oversee training for employees, which oversaw the completion of the IBM Schoolhouse at Endicott in 1933. In 1935, the employee magazine Think was created. Also that year, IBM held its first training class for female systems service professionals. In 1942, IBM launched a program to train and employ disabled people in Topeka, Kansas. The next year classes began in New York City, and soon the company was asked to join the President's Committee for Employment of the Handicapped. In 1946, the company hired its first black salesman, 18 years before the Civil Rights Act of 1964. In 1947, IBM announced a Total and Permanent Disability Income Plan for employees. A vested rights pension was added to the IBM retirement plan.

In 1952, Thomas J. Watson, Jr., published the company's first written equal opportunity policy letter, one year before the U.S. Supreme Court decision in Brown v. Board of Education and 11 years before the Civil Rights Act of 1964. In 1961, IBM's nondiscrimination policy was expanded to include sex, national origin, and age. The following year, IBM hosted its first Invention Award Dinner honoring 34 outstanding IBM inventors; and in 1963, the company named the first eight IBM Fellows in a new Fellowship Program that recognizes senior IBM scientists, engineers and other professionals for outstanding technical achievements.

IBM Building in West Boca Raton, Florida. The Boca Corporate Center and Campus was originally one of IBM's research labs, where the PC was created.



An IBM delivery tricycle in Johannesburg, South Africa in 1965

On September 21, 1953, Thomas Watson, Jr., the company's president at the time, sent out a controversial letter to all IBM employees stating that IBM needed to hire the best people, regardless of their race, ethnic origin, or gender. He also publicized the policy so that in his negotiations to build new manufacturing plants with the governors of two states in the U.S. South, he could be clear that IBM would not build "separate-but-equal" workplaces.[53] In 1984, IBM added sexual orientation to its nondiscrimination policy. The company stated that this would give IBM a competitive advantage because IBM would then be able to hire talented people its competitors would turn down.[54]

IBM was the only technology company ranked in Working Mother magazine's Top 10 for 2004, and one of two technology companies in 2005.[55][56] On October 10, 2005, IBM became the first major company in the world to commit formally to not use genetic information in employment decisions. The announcement was made shortly after IBM began working with the National Geographic Society on its Genographic Project.

IBM provides same-sex partners of its employees with health benefits and provides an anti-discrimination clause. The Human Rights Campaign has consistently rated IBM 100% on its index of gay-friendliness since 2003 (in 2002, the year it began compiling its report on major companies, IBM scored 86%).[57] In 2007 and again in 2010, IBM UK was ranked first in Stonewall's annual Workplace Equality Index for UK employers.[58]

The company has traditionally resisted labor union organizing,[59] although unions represent some IBM workers outside the United States.[60] In 2009, the Unite union stated that several hundred employees joined following the announcement in the UK of pension cuts that left many employees facing a shortfall in projected pensions.[61]

A dark (or gray) suit, white shirt, and a "sincere" tie[62] was the public uniform for IBM employees for most of the 20th century. During IBM's management transformation in the 1990s, CEO Louis V. Gerstner, Jr. relaxed these codes, normalizing the dress and behavior of IBM employees to resemble their counterparts in other large technology companies. Since then IBM's dress code has been business casual, although employees often wear business suits during client meetings.[63]

On June 16, 2011, as part of its centenary celebrations,[64] the company announced IBM100, a year-long grants program to fund employee participation in volunteer projects.

23.6 Research and inventions

An anechoic chamber inside IBM's Yamato research facility

In 1945, the Watson Scientific Computing Laboratory was founded at Columbia University in New York, New York. The renovated fraternity house on Manhattan's West Side was used as IBM's first laboratory devoted to pure science. It was the forerunner of IBM Research, the largest industrial research organization in the world, with twelve labs on six continents.[65]

In 1966, IBM researcher Robert H. Dennard invented Dynamic Random Access Memory (DRAM) cells, one-transistor memory cells that store each single bit of information as an electrical charge in an electronic circuit. The technology permits major increases in memory density and is widely adopted throughout the industry, where it remains in widespread use today.

IBM has been a leading proponent of the Open Source Initiative, and began supporting Linux in 1998.[66] The company invests billions of dollars in services and software based on Linux through the IBM Linux Technology Center, which includes over 300 Linux kernel developers.[67] IBM has also released code under different open source licenses, such as the platform-independent software framework Eclipse (worth approximately US$40 million at the time of the donation),[68]
the three-sentence International Components for Unicode (ICU) license, and the Java-based relational database management system (RDBMS) Apache Derby. IBM's open source involvement has not been trouble-free, however (see SCO v. IBM).

In 2013, Booz and Company placed IBM sixteenth among the 20 most innovative companies in the world. The company spends 6% of its revenue ($6.3 billion) in research and development.[69]

Famous inventions by IBM include the following:

Automated teller machine (ATM)
Floppy disk
Hard disk drive
Electronic keypunch
Magnetic stripe card
Virtual machine
Scanning tunneling microscope
Reduced instruction set computing
Relational database
Universal Product Code (UPC)
Financial swap
SABRE airline reservation system
Dynamic Random Access Memory (DRAM)
Watson artificial intelligence

23.7 Selected current projects


developerWorks is a website run by IBM for software developers and IT professionals. It contains how-to articles and tutorials, as well as software downloads and code samples, discussion forums, podcasts, blogs, wikis, and other resources for developers and technical professionals. Subjects range from open, industry-standard technologies like Java, Linux, SOA and web services, web development, Ajax, PHP, and XML to IBM's products (WebSphere, Rational, Lotus, Tivoli and Information Management). In 2007, developerWorks was inducted into the Jolt Hall of Fame.[70]

Watson, an IBM artificial intelligence computer, is capable of learning as it operates.

alphaWorks is IBM's source for emerging software technologies. These technologies include:

Flexible Internet Evaluation Report Architecture: a highly flexible architecture for the design, display, and reporting of Internet surveys.
IBM History Flow Visualization Application: a tool for visualizing dynamic, evolving documents and the interactions of multiple collaborating authors.
IBM Linux on POWER Performance Simulator: a tool that provides users of Linux on Power a set of performance models for IBM's POWER processors.
Database File Archive And Restoration Management: an application for archiving and restoring hard disk drive files using file references stored in a database.
Policy Management for Autonomic Computing: a policy-based autonomic management infrastructure that simplifies the automation of IT and business processes.
FairUCE: a spam filter that verifies sender identity instead of filtering content.
Unstructured Information Management Architecture (UIMA) SDK: a Java SDK that supports the implementation, composition, and deployment of applications working with unstructured data.
Accessibility Browser: a web browser specifically designed to assist people with visual impairments, to be released as open source software. Also known as the "A-Browser", the technology will aim to eliminate the need for a mouse, relying instead completely on voice controls, buttons and predefined shortcut keys.

Virtually all console gaming systems of the previous generation used microprocessors developed by IBM. The Xbox 360 contains a PowerPC tri-core processor, which was designed and produced by IBM in less than 24 months.[71] Sony's PlayStation 3 features the Cell BE microprocessor designed jointly by IBM, Toshiba, and Sony. IBM also provided the microprocessor that serves as the heart of Nintendo's new Wii U system, which debuted in 2012.[72] The new Power Architecture-based microprocessor includes IBM's latest technology in an energy-saving silicon package.[73] Nintendo's seventh-generation console, Wii, features an IBM chip codenamed Broadway. The older Nintendo GameCube utilizes the Gekko processor, also designed by IBM.

In May 2002, IBM and Butterfly.net, Inc. announced the Butterfly Grid, a commercial grid for the online video gaming market.[74] In March 2006, IBM announced separate agreements with Hoplon Infotainment, Online Game Services Incorporated (OGSI), and RenderRocket to provide on-demand content management and blade server computing resources.[75]

IBM announced it will launch its new software, called "Open Client Offering", which is to run on Linux, Microsoft Windows and Apple's Mac OS X. The company states that its new product allows businesses to offer employees a choice of using the same software on Windows and its alternatives. This means that Open Client Offering is to cut the costs of managing whether to use Linux or Apple relative to Windows. There will be no necessity for companies to pay Microsoft for licenses for operating systems, since the operating systems will no longer rely on software which is Windows-based. One alternative to Microsoft's office document formats is the Open Document Format software, whose development IBM supports. It is going to be used for several tasks such as word processing and presentations, along with collaboration with Lotus Notes, instant messaging and blog tools, as well as an Internet Explorer competitor, the Mozilla Firefox web browser. IBM plans to install Open Client on 5% of its desktop PCs. The Linux offering has been made available as the IBM Client for Smart Work product on the Ubuntu and Red Hat Enterprise Linux platforms.[76]

The UC2 (Unified Communications and Collaboration) Client Platform is an IBM and Cisco Systems joint project based on Eclipse and OSGi. It will offer the numerous Eclipse application developers a unified platform for an easier work environment. The software based on the UC2 platform will provide major enterprises with easy-to-use communication solutions, such as the Lotus-based Sametime. In the future Sametime users will benefit from such additional functions as click-to-call and voice mailing.[77]

Redbooks are publicly available online books about best practices with IBM products. They describe the product's features, field experience and dos and don'ts, while leaving aside marketing buzz. Available formats are Redbooks, Redpapers and Redpieces.

Extreme Blue is a company initiative that uses experienced IBM engineers, talented interns, and business managers to develop high-value technology. The project is designed to analyze emerging business needs and the technologies that can solve them. These projects mostly involve rapid prototyping of high-profile software and hardware projects.[78]

In 2006, IBM launched Secure Blue, encryption hardware that can be built into microprocessors. A year later, IBM unveiled Project Big Green, a re-direction of $1 billion per year across its businesses to increase energy efficiency. In November 2008, IBM's CEO, Sam Palmisano, during a speech at the Council on Foreign Relations, outlined a new agenda for building a "Smarter Planet".[79] On March 1, 2011, IBM announced the Smarter Computing framework to support Smarter Planet.[80] On August 18, 2011, as part of its effort in cognitive computing, IBM produced chips that imitate neurons and synapses. These microprocessors do not use the von Neumann architecture, and they consume less memory and power.[81]

IBM also holds the SmartCamp program globally. The program searches for fresh start-up companies that IBM can partner with to solve world problems. IBM holds 17 SmartCamp events around the world.[82] Since July 2011, IBM has partnered with Pennies, the electronic charity box, and produced a software solution for IBM retail customers that provides an easy way to donate money when paying in-store by credit or debit card. Customers donate just a few pence (1p-99p) at a time and every donation goes to UK charities.

In January 2014, IBM announced plans to invest more than $1.2bn (£735m) in its data centers and cloud storage business. It plans to build 15 new centers around the world, bringing the total number up to 40 during 2014.[83]

In July 2014, the company revealed it was investing $3 billion over the following five years to create computer functionality that resembles how the human brain thinks. A spokesman said that basic computer architecture had not altered since the 1940s. IBM says its goal is to design a neural chip that mimics the human brain, with 10 billion neurons and 100 trillion synapses, but that uses just 1 kilowatt of power.[84]

23.8 Environmental record

IBM was recognized as one of the Top 20 Best Workplaces for Commuters by the United States Environmental Protection Agency (EPA) in 2005. The award was to recognize Fortune 500 companies which provided employees with excellent commuter benefits to help reduce traffic and air pollution.[85]

The birthplace of IBM, Endicott, suffered pollution for decades, however. IBM used liquid cleaning agents in its circuit board assembly operation for more than two decades, and six spills and leaks were recorded, including one leak in 1979 of 4,100 gallons from an underground tank. These left behind volatile organic compounds in the town's soil and aquifer. Traces of volatile organic compounds have been identified in Endicott's drinking water, but the levels are within regulatory limits. Also, from 1980, IBM has pumped out 78,000 gallons of chemicals, including trichloroethane, freon, benzene and perchloroethene, to the air and allegedly caused several cancer cases among the townspeople. IBM Endicott has been identified by the Department of Environmental Conservation as the major source of pollution, though traces of contaminants from a local dry cleaner and other polluters were also found. Remediation and testing are ongoing,[86] however according to city officials, tests show that the water is safe to drink.[87]

Tokyo Ohka Kogyo Co., Ltd. (TOK) and IBM are collaborating to establish new, low-cost methods for bringing the next generation of solar energy products, called CIGS (Copper-Indium-Gallium-Selenide) solar cell modules, to market. Use of thin film technology, such as CIGS, has great promise in reducing the overall cost of solar cells and further enabling their widespread adoption.[88][89] IBM is exploring four main areas of photovoltaic research: using current technologies to develop cheaper and more efficient silicon solar cells, developing new solution-processed thin film photovoltaic devices, concentrator photovoltaics, and future generation photovoltaic architectures based upon nanostructures such as semiconductor quantum dots and nanowires.[90]

23.9 Company logo and nickname

The company used the globe logo until 1947, when it began using an acronym-based logo.

IBM spelled out using 35 xenon atoms

IBM's current 8-bar logo was designed in 1972 by graphic designer Paul Rand.[91] It was a general replacement for a 13-bar logo that first appeared in public on the 1966 release of the TSS/360. Logos designed in the 1970s tended to reflect the inability of period photocopiers to render large areas well, hence the discrete horizontal bars. Early dot matrix printers also had difficulty rendering either large solids or narrow bars in resolutions as low as 240 dots per inch. In 1990 company scientists used a scanning tunneling microscope to arrange 35 individual xenon atoms to spell out the company acronym. It was the first structure assembled one atom at a time.[92]

Big Blue is a nickname for IBM derived in the 1960s from the company's blue logo and color scheme, originally adopted in 1947. "True Blue" referred to a loyal IBM customer, and business writers later picked up the term.[93][94] IBM once had a de facto dress code that saw many IBM employees wear white shirts with blue suits.[93][95]

23.10 See also


Institute of Electrical and Electronics Engineers
List of computer system manufacturers
Top 100 US Federal Contractors
List of semiconductor fabrication plants

23.11 References
[1] IBM Corporation Financials Statements. United States Securities and Exchange Commission.
[2] 2013 IBM Annual Report (PDF). IBM.com.
[3] Nanotechnology & Nanoscience.
[4] IBM Archives: Frequently Asked Questions (PDF).
[5] Madrigal, Alexis (16 June 2011). IBM's First 100 Years. The Atlantic. Retrieved 26 June 2013.
[6] Simmons, William W. (1988). Inside IBM: The Watson Years, A Personal Memoir. Dorrance & Co. p. 137.
[7] Fortune 500: IBM employees. Fortune. 2012. Retrieved 7 May 2012.
[8] Fortune 500: IBM employees. Fortune. 2012. Retrieved 7 May 2012.


[9] Fortune 20 most profitable companies: IBM. Fortune. 2012. Retrieved 7 May 2012.
[10] Fortune 500: IBM. Fortune. 2012. Retrieved 7 May 2012.
[11] The World's Biggest Public Companies. Forbes. Retrieved June 7, 2011.
[12] IBM. Forbes. Retrieved June 7, 2011.
[13] IBM rankings. Ranking the Brands. Retrieved 17 December 2010.
[14] IBM Tops Patent List for 22nd Year as It Looks for Growth. Bloomberg. 2015-01-12.
[15] Awards & Achievements. IBM. Retrieved 2012-05-23.
[16] Aswad, Ed; Meredith, Suzanne (2005). Images of America: IBM in Endicott. Arcadia Publishing. ISBN 0-7385-3700-4.
[17] Dey dial recorder, early 20th century. UK Science Museum. Retrieved 30 December 2010.
[18] Hollerith 1890 Census Tabulator. Columbia University. Retrieved 30 December 2010.
[19] Employee Punch Clocks. Florida Time Clock. Retrieved 30 December 2010.
[20] Lee, Kenneth (1998). Trouncing the Dow: A value-based method for making huge profits. McGraw-Hill. p. 123. ISBN 0-07-136834-5. Retrieved 1 January 2011.
[21] Mathews, Ryan; Watts Wacker (2008). What's your story?: Storytelling to move markets, audiences, people, and brands. Pearson Education. p. 138. ISBN 0-13-227742-5. Retrieved 1 January 2011.
[22] 1920s. IBM. Retrieved 30 December 2010.
[23] DeWitt, Larry (April 2000). Early Automation Challenges for SSA. Retrieved March 2011.
[24] IBM Statement on Nazi-era Book and Lawsuit. IBM Press room. February 14, 2001.
[25] IBM Middle East - Bahrain. Ibm.com. Retrieved 2013-06-14.
[26] Corporate Timeline. SBM. Retrieved 17 March 2014.
[27] The history of the UPC bar code and how the bar code symbol and system became a world standard. Cummingsdesign. Retrieved 17 May 2011.
[28] Ross; Westerfield; Jordan (2010). Fundamentals of Corporate Finance (9th, alternate ed.). McGraw Hill. p. 746.
[29] Lefever, Guy; Pesanello, Michele; Fraser, Heather; Taurman, Lee (2011). Life science: Fade or flourish? (PDF). p. 2: IBM Institute for Business Value. Retrieved 6 July 2013.
[30] Speeches. IBM. 2004-04-27.
[31] Leading Change When Business Is Good: The HBR Interview--Samuel J. Palmisano. Harvard Business Review (Harvard University Press). December 2004.
[32] IBM to Acquire Micromuse Inc. IBM.
[33] IBM Extends Enhanced Data Security to Consumer Electronics Products. April 10, 2006.
[34] IBM Breaks U.S. Patent Record. Scientific Computing (Advantage Business Media), January 12, 2012, scientificcomputing.com, retrieved January 15, 2012.
[35] IBM Plans to Acquire Texas Memory Systems. R & D Magazine. August 19, 2012. Retrieved August 27, 2012.
[36] Jennifer Saba (5 June 2013). IBM to buy website hosting service SoftLayer. Reuters.
[37] Apple + IBM. IBM. Retrieved 18 July 2014.
[38] Etherington, Darrell (15 July 2014). Apple Teams Up With IBM For Huge, Expansive Enterprise Push. TechCrunch. Retrieved 18 July 2014.
[39] IBM Acquires Cloud Security Services Provider Lighthouse Security Group. insurancenewsnet. 12 August 2014. Retrieved 11 August 2014.
[40] Lenovo says $2.1 billion IBM x86 server deal to close on Wednesday (Press release). Reuters. 29 September 2014.
[41] UPDATE 1-Lufthansa close to deal with IBM for IT infrastructure unit. Reuters. October 22, 2014.
[42] Landmark IBM Twitter partnership to help businesses make decisions. Market Business News. November 2, 2014.
[43] IBM #1 in Green Rankings for 2012. thedailybeast.com.
[44] Santoli, Michael (23 June 2012). The World's Most Respected Companies. Barron's. Retrieved 23 June 2012.
[45] Best Global Brands Ranking for 2012. Interbrand. Retrieved 6 June 2013.
[46] Tech Consulting Firm Rankings 2012: Best Firms in Each Practice Area. Vault. Retrieved 29 December 2011.
[47] Contact Us. IBM. Retrieved October 20, 2009.
[48] IBM's New Headquarters Reflects A Change in Corporate Style. http://partners.nytimes.com/library/cyber/week/091797ibm.html
[49] Board of Directors. IBM. Retrieved 17 December 2010.
[50] IBM top executives to forgo bonuses as profits fall. BBC News. January 21, 2014.
[51] http://www.dividendstocksresearch.com/how-to-find-a-good-dividend-stock-in-uncertain-times
[52] Benjamin Forgey (1990-03-24). In the IBM Honoring the Corporation's Buildings. Washington Post.
[53] IBM's EO Policy letter is IBM's foundation for diversity. IBM.

[54] IBM Valuing Diversity: Heritage - 1980s. IBM.
[55] 100 best companies for working mothers 2004. Working Mother Media, Inc. Archived from the original on 2004-10-17.
[56] 100 best companies 2005. Working Mother Media, Inc. Retrieved 2006-06-26.
[57] International Business Machines Corp. (IBM) profile. HRC Corporate Equality Index Score.
[58] IBM Valuing Diversity - Awards and Recognition. IBM. Retrieved 2009-05-27.
[59] Logan, John (December 2006). The Union Avoidance Industry in the United States (PDF). British Journal of Industrial Relations: 651-675.
[60] IBM Global Unions Links. EndicottAlliance.org.
[61] IBM workers up in arms at pension cuts. v3.co.uk.
[62] Smith, Paul Russell (1999). Strategic Marketing Communications: New Ways to Build and Integrate Communications. Kogan Page. p. 24. ISBN 0-7494-2918-6.
[63] IBM Attire. IBM Archives. IBM Corp. Retrieved 31 May 2012.
[64] IBM celebrates 100th anniversary. London: Telegraph. 16 June 2011.
[65] http://www.research.ibm.com/labs/
[66] IBM launches biggest Linux lineup ever. IBM. 1999-03-02. Archived from the original on 1999-11-10.
[67] Farrah Hamid (2006-05-24). IBM invests in Brazil Linux Tech Center. LWN.net.
[68] Interview: The Eclipse code donation. IBM. 2001-11-01.
[69] Le top 20 des entreprises les plus innovantes du monde. Challenges. 22 October 2013.
[70] developerWorks blogs: Michael O'Connell: dW wins Jolt Hall of Fame award; Booch, Ambler, dW authors also honored. IBM. 2007-03-27. Retrieved 2007-04-23.
[71] IBM delivers Power-based chip for Microsoft Xbox 360 worldwide launch. IBM. 2005-10-25.
[72] Staff Writer, mybroadband (Jun 8, 2011). IBM microprocessors drive the new Nintendo WiiU console. mybroadband.co.za. Retrieved June 17, 2011.
[73] Leung, Isaac; Electronics News (June 8, 2011). IBM's 45nm SOI microprocessors at core of Nintendo Wii U. electronicsnews.com.au. Retrieved June 17, 2011.
[74] Butterfly and IBM introduce first video game industry computing grid. IBM. 2002-05-09.
[75] IBM joins forces with game companies around the world to accelerate innovation. IBM. 2006-03-21.
[76] IBM Client for Smart Work. 01.ibm.com. Retrieved 2010-05-23.
[77] IBM and Cisco Unveil Platform for Developing Unified Communications and Collaboration Solutions. Orlando, Florida: IBM. 2007-03-07.
[78] Extreme Blue web page. 01.ibm.com. 2007-09-07. Retrieved 2010-05-23.
[79] Building a smarter planet. Asmarterplanet.com. Retrieved 2010-05-23.
[80] Launch of IBM Smarter Computing. Retrieved 1 March 2011.
[81] lholm, Mads (August 18, 2011). Major breakthrough in cognitive computing. SemiAccurate. Retrieved August 24, 2011.
[82] Barak, Sylvie (February 3, 2012). IBM SmartCamp startups attempt to solve world problems. EE Times. Retrieved February 6, 2012.
[83] IBM commits $1.2bn to cloud data centre expansion. BBC News. 17 January 2014.
[84] New research initiative sees IBM commit $3 bn. San Francisco News.Net. Retrieved 10 July 2014.
[85] Environmental Protection. IBM. 3 May 2008.
[86] Village of Endicott Environmental Investigations. Retrieved 28 January 2015.
[87] Chittum, Samme (15 March 2004). In an I.B.M. Village, Pollution Fears Taint Relations With Neighbors. New York Times Online. Retrieved 1 May 2008.
[88] IBM and Tokyo Ohka Kogyo Turn Up Watts on Solar Energy Production (PDF). tok.co.jp.
[89] Energy, the environment and IBM. IBM. 2008-04-01. Retrieved 2009-05-27.
[90] IBM Press room - 2008-05-15 IBM Research Unveils Breakthrough In Solar Farm Technology - United States. IBM. 2008-05-15. Retrieved 2009-05-27.
[91] IBM Archives. IBM.
[92] IBM Archives: IBM atoms. IBM.
[93] Selinger, Evan, ed. (2006). Postphenomenology: A Critical Companion to Ihde. State University of New York Press. p. 228. ISBN 0-7914-6787-2.
[94] Conway Lloyd Morgan and Chris Foges (2004). Logos, Letterheads & Business Cards: Design for Profit. Rotovision. p. 15. ISBN 2-88046-750-0.
[95] E. Garrison Walters (2001). The Essential Guide to Computing: The Story of Information Technology. Prentice Hall PTR. p. 55. ISBN 0-13-019469-7.


23.12 Further reading


For additional books about IBM (biographies, memoirs, technology, and more), see History of IBM#Further reading.

John Harwood (2011). The Interface: IBM and the Transformation of Corporate Design, 1945-1976. ISBN 978-0-8166-7039-0.
Edwin Black (2008). IBM and the Holocaust: The Strategic Alliance Between Nazi Germany and America's Most Powerful Corporation. ISBN 0-914153-10-2.
Ulrich Steinhilper (2006). Don't Talk Do It! From Flying To Word Processing. ISBN 1-872836-75-5.
Samme Chittum (2004-03-15). In an I.B.M. Village, Pollution Fears Taint Relations With Neighbors. New York Times.
Louis V. Gerstner, Jr. (2002). Who Says Elephants Can't Dance?. HarperCollins. ISBN 0-00-715448-8.
Doug Garr (1999). IBM Redux: Lou Gerstner & The Business Turnaround of the Decade. Harper Business.
Robert Slater (1999). Saving Big Blue: IBM's Lou Gerstner. McGraw Hill.
Emerson W. Pugh (1996). Building IBM: Shaping an Industry. MIT Press.
Robert Heller (1994). The Fate of IBM. Little Brown.
Paul Carroll (1993). Big Blues: The Unmaking of IBM. Crown Publishers.
Roy A. Bauer et al. (1992). The Silverlake Project: Transformation at IBM (AS/400). Oxford University Press.
Thomas Watson, Jr. (1990). Father, Son & Co: My Life at IBM and Beyond. ISBN 0-553-29023-1.
David Mercer (1988). The Global IBM: Leadership in Multinational Management. Dodd, Mead. p. 374.
David Mercer (1987). IBM: How the World's Most Successful Corporation is Managed. Kogan Page.
Richard Thomas DeLamarter (1986). Big Blue: IBM's Use and Abuse of Power. ISBN 0-396-08515-6.
Buck Rodgers (1986). The IBM Way. Harper & Row.
Robert Sobel (1986). IBM vs. Japan: The Struggle for the Future. ISBN 0-8128-3071-7.
Robert Sobel (1981). IBM: Colossus in Transition. ISBN 0-8129-1000-1.
Robert Sobel (1981). Thomas Watson, Sr.: IBM and the Computer Revolution (biography of Thomas J. Watson). ISBN 1-893122-82-4.
William Rodgers (1969). THINK: A Biography of the Watsons and IBM. ISBN 0-8128-1226-3.

23.13 External links


Official website
IBM Systems Magazine

Business data for IBM Corp.:

Hoover's
Reuters
SEC filings

IBM companies grouped at OpenCorporates

Chapter 24

Apple Inc.
This article is about the technology company. For other companies named Apple, see Apple (disambiguation). Not to be confused with Apple Corps.

Coordinates: 37°19′55″N 122°01′52″W / 37.33182°N 122.03118°W

Apple Inc. is an American multinational corporation headquartered in Cupertino, California, that designs, develops, and sells consumer electronics, computer software, online services, and personal computers. Its best-known hardware products are the Mac line of computers, the iPod media player, the iPhone smartphone, and the iPad tablet computer. Its online services include iCloud, iTunes Store, and App Store. Apple's consumer software includes the OS X and iOS operating systems, the iTunes media browser, the Safari web browser, and the iLife and iWork creativity and productivity suites.

Apple was founded by Steve Jobs, Steve Wozniak, and Ronald Wayne on April 1, 1976, to develop and sell personal computers. It was incorporated as Apple Computer, Inc. on January 3, 1977, and was renamed as Apple Inc. on January 9, 2007, to reflect its shifted focus towards consumer electronics.

Apple is the world's second-largest information technology company by revenue after Samsung Electronics, and the world's third-largest mobile phone maker. On November 25, 2014, in addition to being the largest publicly traded corporation in the world by market capitalization, Apple became the first U.S. company to be valued at over $700B.[4] As of 2014, Apple employs 72,800 permanent full-time employees, maintains 437 retail stores in fifteen countries,[5] and operates the online Apple Store and iTunes Store, the latter of which is the world's largest music retailer.

Apple's worldwide annual revenue in 2014 totaled US$182 billion (FY end October 2014[6]). Apple enjoys a high level of brand loyalty and, according to the 2014 edition of the Interbrand Best Global Brands report, is the world's most valuable brand with a valuation of $118.9 billion.[7] By the end of 2014, the corporation continued to manage significant criticism regarding the labor practices of its contractors, as well as for its environmental and business practices, including the origins of source materials.

Apple Inc. quarterly results surpassed Wall Street expectations with record sales of big-screen iPhones in the holiday shopping season and a 70 percent rise in China sales, obtaining the largest profit in corporate history to date. The company sold 74.5 million iPhones in its fiscal first quarter ended December 27, while many analysts had expected fewer than 70 million. Revenue rose to $74.6 billion from $57.6 billion a year earlier. Profit of $18 billion was the biggest ever reported by a public company, worldwide, according to S&P analyst Howard Silverblatt. Apple's cash pile is now $178 billion, enough to buy IBM or the equivalent to $556 for every American.[8]

24.1 History

Main article: History of Apple Inc.

24.1.1 1976–80: Founding and incorporation

The Apple I, Apple's first product, was sold as an assembled circuit board and lacked basic features such as a keyboard, monitor, and case. The owner of this unit added a keyboard and a wooden case.

Apple was established on April 1, 1976, by Steve Jobs, Steve Wozniak and Ronald Wayne[9][10] to sell the Apple I personal computer kit. The Apple I kits were computers single-handedly designed and hand-built by Wozniak[11][12] and first shown to the public at the Homebrew Computer Club.[13] The Apple I was sold as a motherboard (with CPU, RAM, and basic textual-video chips), which is less than what is now considered a complete personal computer.[14] The Apple I went on sale in July 1976 and was market-priced at $666.66 ($2,763 in 2015 dollars, adjusted for inflation).[15][16][17][18][19][20]

Apple was incorporated January 3, 1977,[21] without Wayne, who sold his share of the company back to Jobs and Wozniak for $800.[10] Multimillionaire Mike Markkula provided essential business expertise and funding of $250,000 during the incorporation of Apple.[22][23] During the first five years of operations, revenues doubled every four months, an average growth rate of 700%.
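As a quick, illustrative check of that growth figure (an arithmetic sketch, not a claim from the source): revenue that doubles every four months doubles three times in a year, and a factor of eight per year corresponds exactly to an average annual growth rate of 700%.

\[
2^{\,12/4} = 2^{3} = 8, \qquad (8 - 1) \times 100\% = 700\% \text{ per year.}
\]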
The Apple II, also invented by Wozniak, was introduced on April 16, 1977, at the first West Coast Computer Faire. It differed from its major rivals, the TRS-80 and Commodore PET, because of its character cell-based color graphics and open architecture. While early Apple II models used ordinary cassette tapes as storage devices, they were superseded by the introduction of a 5 1/4 inch floppy disk drive and interface called the Disk II.[24] The Apple II was chosen to be the desktop platform for the first "killer app" of the business world: VisiCalc, a spreadsheet program.[25] VisiCalc created a business market for the Apple II and gave home users an additional reason to buy an Apple II: compatibility with the office.[25] Before VisiCalc, Apple had been a distant third place competitor to Commodore and Tandy.[26][27]

By the end of the 1970s, Apple had a staff of computer designers and a production line. The company introduced the Apple III in May 1980 in an attempt to compete with IBM and Microsoft in the business and corporate computing market.[28] Jobs and several Apple employees, including Jef Raskin, visited Xerox PARC in December 1979 to see the Xerox Alto. Xerox granted Apple engineers three days of access to the PARC facilities in return for the option to buy 100,000 shares (800,000 split-adjusted shares) of Apple at the pre-IPO price of $10 a share.[29] Jobs was immediately convinced that all future computers would use a graphical user interface (GUI), and development of a GUI began for the Apple Lisa.[30]

On December 12, 1980, Apple went public at $22 per share,[31] generating more capital than any IPO since Ford Motor Company in 1956 and instantly creating more millionaires (about 300) than any company in history.[32]

24.1.2 1981–89: Success with Macintosh

See also: Timeline of Macintosh models

Apple's 1984 television ad, set in a dystopian future modeled after the George Orwell novel Nineteen Eighty-Four, set the tone for the introduction of the Macintosh.

Apple began working on the Apple Lisa in 1978. In 1982, Jobs was pushed from the Lisa team due to infighting. Jobs took over Jef Raskin's low-cost-computer project, the Macintosh. A race broke out between the Lisa team and the Macintosh team over which product would ship first. Lisa won the race in 1983 and became the first personal computer sold to the public with a GUI, but was a commercial failure due to its high price tag and limited software titles.[33]

The first Macintosh, released in 1984

In 1984, Apple launched the Macintosh. It was the first personal computer to be sold without a programming language at all.[34] Its debut was announced by the now famous $1.5 million television commercial "1984". It was directed by Ridley Scott and first aired during the third quarter of Super Bowl XVIII on January 22, 1984.[35] The commercial is now hailed as a watershed event for Apple's success[36] and a masterpiece.[37][38]

The Macintosh initially sold well, but follow-up sales
were not strong[39] due to its high price and limited range of software titles. The machine's fortunes changed with the introduction of the LaserWriter, the first PostScript laser printer to be sold at a reasonable price, and PageMaker, an early desktop publishing package. It has been suggested that the combination of these three products was responsible for the creation of the desktop publishing market.[40] The Macintosh was particularly powerful in the desktop publishing market due to its advanced graphics capabilities, which had necessarily been built in to create the intuitive Macintosh GUI.

In 1985 a power struggle developed between Jobs and CEO John Sculley, who had been hired two years earlier.[41] The Apple board of directors instructed Sculley to "contain" Jobs and limit his ability to launch expensive forays into untested products. Rather than submit to Sculley's direction, Jobs attempted to oust him from his leadership role at Apple. Sculley found out that Jobs had been attempting to organize a coup and called a board meeting at which Apple's board of directors sided with Sculley and removed Jobs from his managerial duties.[39] Jobs resigned from Apple and founded NeXT Inc. the same year.[42]

The Macintosh Portable was Apple's first portable Macintosh computer, released in 1989.

The Macintosh Portable was introduced in 1989 and was designed to be just as powerful as a desktop Macintosh, but weighed 7.5 kilograms (17 lb) with a 12-hour battery life. After the Macintosh Portable, Apple introduced the PowerBook in 1991. The same year, Apple introduced System 7, a major upgrade to the operating system which added color to the interface and introduced new networking capabilities. It remained the architectural basis for Mac OS until 2001. The success of the PowerBook and other products brought increasing revenue.[41] For some time, Apple was doing incredibly well, introducing fresh new products and generating increasing profits in the process. The magazine MacAddict named the period between 1989 and 1991 as the "first golden age" of the Macintosh.

24.1.3 1990–99: Decline, restructuring, acquisitions

See also: Timeline of Apple II family

Apple believed the Apple II series was too expensive to produce and took away sales from the low-end Macintosh.[43] In 1990, Apple released the Macintosh LC, which featured a single expansion slot for the Apple IIe Card to help migrate Apple II users to the Macintosh platform.[43] Apple stopped selling the Apple IIe in 1993. Following the success of the Macintosh LC, Apple introduced the Centris line, a low-end Quadra, and the ill-fated Performa line. Consumers ended up confused because they did not understand the difference between models.[44] Apple experimented with a number of other unsuccessful consumer-targeted products during the 1990s, including digital cameras, portable CD audio players, speakers, video consoles, and TV appliances. Enormous resources were also invested in the problem-plagued Newton division based on John Sculley's unrealistic market forecasts. Ultimately, none of these products helped, and Apple's market share and stock prices continued to slide.

The Newton was Apple's first foray into the PDA markets, as well as one of the first in the industry. Despite being a financial flop at the time of its release, it helped pave the way for the Palm Pilot and Apple's own iPhone and iPad in the future.

Microsoft continued to gain market share with Windows by focusing on delivering software to cheap commodity personal computers while Apple was delivering a richly engineered but expensive experience.[45] Apple relied on high profit margins and never developed a clear response. Instead, they sued Microsoft for using a graphical user interface similar to the Apple Lisa in Apple Computer, Inc. v. Microsoft Corporation.[46] The lawsuit dragged on for years before it was finally dismissed. At the same time, a series of major product flops and missed deadlines sullied Apple's reputation, and Sculley was replaced as CEO by Michael Spindler.[47]

By the early 1990s, Apple was developing alternative platforms to the Macintosh, such as A/UX. Apple had also begun to experiment with providing a Mac-only online portal, which it called eWorld, developed in collaboration with America Online and designed as a Mac-friendly alternative to other online services such as CompuServe. The Macintosh platform itself was becoming outdated because it was not built for multitasking and because several important software routines were programmed directly into the hardware. In addition, Apple was facing competition from OS/2 and UNIX vendors such as Sun Microsystems. The Macintosh would need to be replaced by a new platform or reworked to run on more powerful hardware.[48]

In 1994, Apple allied with IBM and Motorola in the AIM alliance with the goal of creating a new computing platform (the PowerPC Reference Platform), which would use IBM and Motorola hardware coupled with Apple software. The AIM alliance hoped that PReP's performance and Apple's software would leave the PC far behind and thus counter Microsoft. The same year, Apple introduced the Power Macintosh, the first of many Apple computers to use Motorola's PowerPC processor.[49]



sion to solely focus upon web development software. The
product, still unnished at the time of the sale, was renamed "Final Cut Pro" when it was launched on the retail market in April 1999.[59][60] The development of Key
Grip also led to Apples release of the consumer videoediting product iMovie in October 1999.[61] Next, Apple successfully acquired the German company Astarte,
which had developed DVD authoring technology, as well
as Astartes corresponding products and engineering team
in April 2000. Astartes digital tool DVDirector was
subsequently transformed into the professional-oriented
DVD Studio Pro software product. Apple then employed
the same technology to create iDVD for the consumer
market.[61] In 2002, Apple purchased Nothing Real for
their advanced digital compositing application Shake,[62]
as well as Emagic for the music productivity application
Logic. The purchase of Emagic made Apple the rst
computer manufacturer to own a music software company. The acquisition was followed by the development
of Apples consumer-level GarageBand application.[63]
The release of iPhoto in the same year completed the
iLife suite.[64]

In 1996, Michael Spindler was replaced by Gil Amelio as 24.1.4 200006: Return to protability
CEO. Gil Amelio made many changes at Apple, including extensive layos.[50] After numerous failed attempts Main article: Apples transition to Intel processors
to improve Mac OS, rst with the Taligent project and Mac OS X, based on NeXTs OPENSTEP and BSD
later with Copland and Gershwin, Amelio chose to purchase NeXT and its NeXTSTEP operating system and
bring Steve Jobs back to Apple as an advisor.[51] On July
9, 1997, Amelio was ousted by the board of directors
after overseeing a three-year record-low stock price and
crippling nancial losses. Jobs acted as the interim CEO
and began restructuring the companys product line; it
was during this period that Jobs identied Jonathan Ive's
design talent, and the pair worked collaboratively to rebuild Apples status.[52]
At the 1997 Macworld Expo, Jobs announced that Apple
would join Microsoft to release new versions of Microsoft
Oce for the Macintosh, and that Microsoft had made a
$150 million investment in non-voting Apple stock.[53]
On November 10, 1997, Apple introduced the Apple
Online Store, which was tied to a new build-to-order
manufacturing strategy.[54][55] On August 15, 1998, Apple introduced a new all-in-one computer reminiscent of
the Macintosh 128K: the iMac. The iMac design team
was led by Ive, who would later design the iPod and the
iPhone.[56][57] The iMac featured modern technology and
a unique design, and sold almost 800,000 units in its rst
ve months.[58]

Apple retail stores allow potential customers to use oor models


without making a purchase.
(Apple Store, North Michigan Avenue, Chicago, Illinois in 2005)

Unix, was released on March 24, 2001 after several years


of development. Aimed at consumers and professionals
alike, Mac OS X aimed to combine the stability, reliability and security of Unix with the ease of use aorded by
an overhauled user interface. To aid users in migrating
from Mac OS 9, the new operating system allowed the
use of OS 9 applications within Mac OS X as the Classic
environment. This meant that users were able to continue
running their old applications.[65]

During this period, Apple completed numerous acquisitions to create a portfolio of digital production software
for both professionals and consumers. In 1998, Apple
purchased Macromedia's Key Grip software project, signaling an expansion into the digital video editing market. The sale was an outcome of Macromedias deci- On May 19, 2001, Apple opened the rst ocial Apple
Retail Stores in Virginia and California.[66] On October

24.1. HISTORY
23 of the same year, Apple debuted the iPod portable
digital audio player. The product, which was rst sold
on November 10, 2001, was phenomenally successful
with over 100 million units sold within six years.[67][68] In
2003, Apples iTunes Store was introduced. The service
oered online music downloads for $0.99 a song and integration with the iPod. The iTunes store quickly became
the market leader in online music services, with over 5
billion downloads by June 19, 2008.[69]

The MacBook Pro, Apples rst laptop with an Intel microprocessor, announced in January 2006.

At the Worldwide Developers Conference keynote address on June 6, 2005, Jobs announced that Apple would
begin producing Intel-based Mac computers in 2006.[70]
On January 10, 2006, the new MacBook Pro and iMac
became the rst Apple computers to use Intels Core Duo
CPU. By August 7, 2006, Apple made the transition to
Intel chips for the entire Mac product lineover one
year sooner than announced.[70] The Power Mac, iBook
and PowerBook brands were retired during the transition; the Mac Pro, MacBook, and MacBook Pro became
their respective successors.[71][72] On April 29, 2009, The
Wall Street Journal reported that Apple was building its
own team of engineers to design microchips.[73] Apple
also introduced Boot Camp in 2006 to help users install
Windows XP or Windows Vista on their Intel Macs alongside Mac OS X.[74]

183
titanium-made PowerBook and was followed by the
iBook's white polycarbonate structure and the at-panel
iMac.[77][78]

24.1.5 200710: Success with mobile devices


During his keynote speech at the Macworld Expo on
January 9, 2007, Jobs announced that Apple Computer,
Inc. would from that point on be known as Apple Inc., because the company had shifted its emphasis from computers to mobile electronic devices. This
event also saw the announcement of the iPhone and
the Apple TV.[79][80][81][82] The following day, Apple
shares hit $97.80, an all-time high at that point. In
May, Apples share price passed the $100 mark.[83] Apple
would achieve widespread success with its iPhone, iPod
Touch and iPad products, which introduced innovations
in mobile phones, portable music players and personal
computers respectively.[84] Furthermore, by early 2007,
800,000 Final Cut Pro users were registered.[85]
In an article posted on Apples website on February 6,
2007, Jobs wrote that Apple would be willing to sell music on the iTunes Store without digital rights management
(DRM), thereby allowing tracks to be played on thirdparty players, if record labels would agree to drop the
technology.[86] On April 2, 2007, Apple and EMI jointly
announced the removal of DRM technology from EMIs
catalog in the iTunes Store, eective in May 2007.[87]
Other record labels eventually followed suit and Apple
published a press release in January 2009 to announce
the corresponding changes to the iTunes Store.[88]
In July 2008, Apple launched the App Store to sell thirdparty applications for the iPhone and iPod Touch.[89]
Within a month, the store sold 60 million applications and
registered an average daily revenue of $1 million, with
Jobs speculating in August 2008 that the App Store could
become a billion-dollar business for Apple.[90] By October 2008, Apple was the third-largest mobile handset supplier in the world due to the popularity of the iPhone.[91]
On December 16, 2008, Apple announced that 2009
would be the last year the corporation would attend the
Macworld Expo, after more than 20 years of attendance, and that senior vice president of Worldwide Product Marketing Philip Schiller would deliver the 2009
keynote address in lieu of the expected Jobs. The ocial press release explained that Apple was scaling back
on trade shows in general, including Macworld Tokyo
and the Apple Expo in Paris, France, primarily because
the enormous successes of the Apple Retail Stores and
website had rendered trade shows a minor promotional
channel.[92][93]

Apples success during this period was evident in its stock


price. Between early 2003 and 2006, the price of Apples stock increased more than tenfold, from around $6
per share (split-adjusted) to over $80. In January 2006,
Apples market cap surpassed that of Dell.[75] Nine years
prior, Dells CEO Michael Dell said that if he ran Apple he would shut it down and give the money back to
the shareholders.[76] Although Apples market share in
computers had grown, it remained far behind competitors using Microsoft Windows, accounting for about 8%
of desktops and laptops in the US.
On January 14, 2009, an internal memo from Jobs anSince 2001, Apples design team has progressively aban- nounced that he would be taking a six-month medical
doned the use of translucent colored plastics rst used leave of absence from Apple until the end of June 2009
in the iMac G3. This design change began with the and would spend the time focusing on his health. In the

184
email, Jobs stated that the curiosity over my personal
health continues to be a distraction not only for me and
my family, but everyone else at Apple as well, and explained that the break would allow the company to focus
on delivering extraordinary products.[94] Despite Jobss
absence, Apple recorded its best non-holiday quarter (Q1
FY 2009) during the recession with a revenue of $8.16
billion and a prot of $1.21 billion.[95][96]
After years of speculation and multiple rumored "leaks", Apple announced a large-screen, tablet-like media device known as the iPad on January 27, 2010. The iPad ran the same touch-based operating system as the iPhone, and many iPhone apps were compatible with the iPad. This gave the iPad a large app catalog on launch, despite very little development time before the release. Later that year, on April 3, 2010, the iPad was launched in the US. It sold more than 300,000 units on its first day, and 500,000 by the end of the first week.[97] In May of the same year, Apple's market cap exceeded that of competitor Microsoft for the first time since 1989.[98]

Apple also released the iPhone 4, which introduced video calling, multitasking, and a new uninsulated stainless steel design that acted as the phone's antenna. Later that year, Apple again refreshed its iPod line of MP3 players by introducing a multi-touch iPod Nano, an iPod Touch with FaceTime, and an iPod Shuffle that brought back the buttons of earlier generations.[99][100][101] Additionally, on October 20, Apple updated its MacBook Air laptop and iLife suite of applications, and unveiled Mac OS X Lion, the last version with the name Mac OS X.[102]

In October 2010, Apple shares hit an all-time high, eclipsing $300.[103]

24.1.6 2011–12: Steve Jobs's death

Apple store in Yonkers, New York

On January 6, 2011, the company opened their Mac App Store, a digital software distribution platform similar to the existing iOS App Store.[104] Alongside peer entities such as Atari and Cisco Systems, Apple was featured in the documentary Something Ventured, which premiered in 2011 and explored the three-decade era that led to the establishment and dominance of Silicon Valley.[105]

On January 17, 2011, Jobs announced in an internal Apple memo that he would take another medical leave of absence, for an indefinite period, to allow him to focus on his health. Chief operating officer Tim Cook assumed Jobs's day-to-day operations at Apple, although Jobs would still remain involved in major strategic decisions for the company.[106] Apple became the most valuable consumer-facing brand in the world.[107] In June 2011, Jobs surprisingly took the stage and unveiled iCloud, an online storage and syncing service for music, photos, files and software, which replaced MobileMe, Apple's previous attempt at content syncing.[108]

This would be the last product launch Jobs would attend before his death. It has been argued that Apple has achieved such efficiency in its supply chain that the company operates as a monopsony (one buyer, many sellers) and can dictate terms to its suppliers.[109][110][111] In July 2011, due to the American debt-ceiling crisis, Apple's financial reserves were briefly larger than those of the U.S. Government.[112]

On August 24, 2011, Jobs resigned his position as CEO of Apple.[113] He was replaced by Tim Cook, and Jobs became Apple's chairman. Prior to this, Apple did not have a chairman and instead had two co-lead directors, Andrea Jung and Arthur D. Levinson, who continued with those titles until Levinson became Chairman of the Board in November.[114] On October 5, 2011, Apple announced that Jobs had died, marking the end of an era for Apple Inc. The first major announcement by Apple following Jobs's passing occurred on January 19, 2012, when Apple's Phil Schiller introduced iBooks Textbooks for iOS and iBooks Author for Mac OS X in New York City.[117] Jobs had stated in his biography that he wanted to reinvent the textbook industry and education.

From 2011 to 2012, Apple released the iPhone 4S and iPhone 5, which featured improved cameras, an "intelligent software assistant" named Siri, and cloud-sourced data with iCloud;[118][119][120] the third and fourth generation iPads, which featured Retina displays;[121][122][123] and the iPad Mini, which featured a 7.9-inch screen in contrast to the iPad's 9.7-inch screen.[124] These launches were successful, with the iPhone 5 (released September 21, 2012) becoming Apple's biggest iPhone launch, with over 2 million pre-orders,[125] and sales of 3 million iPads in three days following the launch of the iPad Mini and fourth generation iPad (released November 3, 2012).[126] Apple also released a third-generation 13-inch MacBook Pro with a Retina display and new iMac and Mac Mini computers.[123][124][127]

On October 29, 2011, Apple purchased C3 Technologies, a mapping company, for $240 million, becoming the third mapping company Apple had purchased.[128] On January 10, 2012, Apple paid $500 million to acquire Anobit, an Israeli hardware company that developed and supplied a proprietary memory signal processing technology that improved the performance of the flash memory used in iPhones and iPads.[129][130] On July 24, 2012, during a conference call with investors, Tim Cook said that he loved India, but that Apple saw larger opportunities outside of India, citing India's 30% local sourcing requirement as the reason.[131][132][133][134]

On August 20, 2012, Apple's rising stock price raised the company's value to a world-record $624 billion, beating the non-inflation-adjusted record for market capitalization set by Microsoft in 1999.[135] On August 24, 2012, a US jury ruled that Samsung should pay Apple $1.05 billion (£665m) in damages in an intellectual property lawsuit.[136] Samsung appealed the damages award, which the Court reduced by $450 million.[137] The Court further granted Samsung's request for a new trial.[137] On November 10, 2012, Apple confirmed a global settlement that would dismiss all lawsuits between Apple and HTC up to that date, in favor of a ten-year license agreement for current and future patents between the two companies.[138] It is predicted that Apple will make $280 million a year from this deal with HTC.[139]

24.1.7 2013–present: Acquisitions and expansion

See also: List of mergers and acquisitions by Apple

A previously confidential email, written by Jobs a year before his death, was presented during the proceedings of the Apple Inc. v. Samsung Electronics Co., Ltd. lawsuits and became publicly available in early April 2014. With a subject line that reads "Top 100 A", the email was sent only to the company's 100 most senior employees and outlines Jobs's vision of Apple Inc.'s future under 10 subheadings. Notably, Jobs declares a "Holy War with Google" for 2011 and schedules a new campus for 2015.[140]

In March 2013, Apple filed a patent for an augmented reality (AR) system that can identify objects in a live video stream and present information corresponding to these objects through a computer-generated information layer overlaid on top of the real-world image.[141] Later in 2013, Apple acquired Embark Inc., a small Silicon Valley-based mapping company that builds free transit apps to help smartphone users navigate public transportation in U.S. cities,[142] and PrimeSense, an Israeli 3D sensing company based in Tel Aviv.[143] In December 2013, Apple Inc. purchased the social analytics firm Topsy, one of a small number of firms with real-time access to the messages that appear on Twitter and the ability to do real-time analysis of the trends and discussions happening there.[144] The company also made several high-profile hiring decisions in 2013. On July 2, 2013, Apple recruited Paul Deneve, the Belgian president and CEO of Yves Saint Laurent, as a vice president reporting directly to Tim Cook.[145] A mid-October 2013 announcement revealed that Burberry executive Angela Ahrendts would commence as a senior vice president at Apple in mid-2014. Ahrendts oversaw Burberry's digital strategy for almost eight years and, during her tenure, sales increased to about US$3.2 billion and shares gained more than threefold.[146]

At the Worldwide Developers Conference on June 10, 2013, Apple announced the seventh iOS operating system alongside OS X Mavericks, the tenth version of Mac OS X, and a new Internet radio service called iTunes Radio.[147][148][149] iTunes Radio, iOS 7 and OS X Mavericks were released in fall 2013.[147][148][150] On December 6, 2013, Apple Inc. launched iBeacon across its 254 U.S. retail stores. Using Bluetooth wireless technology, iBeacon senses the user's exact location within the Apple store and sends the user messages about products, events and other information, tailored to the user's location.[151]

Alongside Google vice-president Vint Cerf and AT&T CEO Randall Stephenson, Cook attended a closed-door summit held by President Obama on August 8, 2013, in regard to government surveillance and the Internet in the wake of the Edward Snowden NSA incident.[152][153] On February 4, 2014, Cook met with Abdullah Gül, the President of Turkey, in Ankara to discuss the company's involvement in the Fatih project.[154] Cook also confirmed that Turkey's first Apple Retail Store would be opened in Istanbul in April 2014.[155]

An anonymous Apple employee revealed to the Bloomberg media publication that the opening of a store in Tokyo, Japan was planned for 2014, with construction to be completed in February 2014, although as of August 29, 2013, Apple's Tokyo-based spokesman had not made any comments to the media. A Japanese analyst stated, "For Apple, the Japanese market is appealing in terms of quantity and price. There is room to expand tablet sales and a possibility the Japanese market expands if Apple's mobile carrier partners increase."[156]

On October 1, 2013, Apple India executives unveiled a plan to expand further into the Indian market, following Cook's acknowledgment of the country in July 2013, when sales results showed that iPhone sales in India grew 400% during the second quarter of 2013.[157]

Apple Inc. reported that it sold 51 million iPhones in the first quarter of 2014, an all-time quarterly record, compared to 47.8 million in the year-ago quarter. Apple also sold 26 million iPads during the quarter, also an all-time quarterly record, compared to 22.9 million in the year-ago quarter. The company sold 4.8 million Macs, compared to 4.1 million in the year-ago quarter.[158] On May 28, 2014, Apple confirmed its intent to acquire Dr. Dre and Jimmy Iovine's audio company Beats Electronics (producer of the Beats by Dr. Dre line of headphones and speaker products, and operator of the music streaming service Beats Music) for $3 billion, and to sell
their products through Apple's retail outlets and resellers. Iovine felt that Beats had always "belonged" with Apple, as the company modeled itself after Apple's "unmatched ability to marry culture and technology".[159][160][161] In August 2014, an Apple representative confirmed to the media that Anand Lal Shimpi, editor and publisher of the AnandTech website, had been recruited by Apple, without elaborating on Lal Shimpi's role.[162]

Apple announced the Apple Watch on September 9, 2014.[163] It features a digital crown that enables efficient scroll, zoom and navigation functionality in a very small form factor. The device is a communication portal to a nearby iPhone for messaging, telephone calls, and engaging Siri, Apple's personal assistant. The watch incorporates a Retina display for ultra-high clarity, force touch technology to sense the difference between a tap and a press, medical sensors to monitor the health of the wearer, a Taptic Engine to discreetly get the wearer's attention, and supports Apple Pay. The product will arrive in the spring of 2015 in three models: standard, sport, and an elegant 18-karat gold special edition.

24.2 Products

See also: Timeline of Apple products and List of products discontinued by Apple Inc.

24.2.1 Mac

Main article: Macintosh
See also: Timeline of Macintosh models, List of Macintosh models grouped by CPU type and List of Macintosh models by case type

MacBook Air: Consumer ultra-thin, ultra-portable notebook, introduced in 2008.
MacBook Pro: Professional notebook, introduced in 2006.
Mac Pro: Workstation desktop computer, introduced in 2006.
Mac Mini: Consumer sub-desktop computer and server, introduced in 2005.
iMac: Consumer all-in-one desktop computer, introduced in 1998.

Apple sells a variety of computer accessories for Macs, including the Thunderbolt Display, Magic Mouse, Magic Trackpad, Wireless Keyboard, Battery Charger, the AirPort wireless networking products, and Time Capsule.

24.2.2 iPad

Main article: iPad

On January 27, 2010, Apple introduced their much-anticipated media tablet, the iPad, which runs a modified version of iOS. It offers multi-touch interaction with multimedia formats including newspapers, ebooks, photos, videos, music, word processing documents, video games, and most existing iPhone apps.[164] It also includes a mobile version of Safari for web browsing, as well as access to the App Store, iTunes Library, iBookstore, Contacts, and Notes. Content is downloadable via Wi-Fi and optional 3G service or synced through the user's computer.[165] AT&T was initially the sole U.S. provider of 3G wireless access for the iPad.[166]

On March 2, 2011, Apple introduced the iPad 2, which had a faster processor and a camera on the front and back. It also added support for optional 3G service provided by Verizon in addition to AT&T.[167] The availability of the iPad 2 was initially limited as a result of a devastating earthquake and tsunami in Japan in March 2011.[168] The third-generation iPad was released on March 7, 2012 and marketed as "the new iPad". It added LTE service from AT&T or Verizon, an upgraded A5X processor, and a Retina display. The dimensions and form factor remained relatively unchanged, with the new iPad being a fraction thicker and heavier than the previous version and featuring minor positioning changes.[169]

On October 23, 2012, Apple's fourth-generation iPad came out, marketed as the "iPad with Retina display". It added the upgraded A6X processor and replaced the traditional 30-pin dock connector with the all-digital Lightning connector.[170] The iPad Mini was also introduced. It featured a reduced 7.9-inch display and much of the same internal specifications as the iPad 2.[171] On October 22, 2013, Apple introduced the iPad Air and the iPad Mini with Retina Display, both featuring a new 64-bit Apple A7 processor.[172] The iPad Air 2 was unveiled on October 16, 2014. It added better graphics and central processing and a camera burst mode, as well as minor updates. The iPad Mini 3 was unveiled at the same time.[172]

Since its launch, iPad users have downloaded three billion apps. The total number of App Store downloads is over 25 billion.[173][174]

24.2.3 iPod

Main article: iPod

iPod line as of 2014. From left to right: iPod Shuffle, iPod Nano, iPod Touch.

On October 23, 2001, Apple introduced the iPod digital music player. Several updated models have since been introduced, and the iPod brand is now the market leader in portable music players by a significant margin. More than 350 million units have shipped as of September 2012.[175] Apple has partnered with Nike to offer the Nike+iPod Sports Kit, enabling runners to synchronize and monitor their runs with iTunes and the Nike+ website.

Apple currently sells three variants of the iPod:

iPod Shuffle: Ultra-portable digital audio player, currently available in a 2 GB model, introduced in 2005.
iPod Nano: Portable media player, currently available in a 16 GB model, introduced in 2005. Earlier models featured the traditional iPod click wheel, but the current generation features a multi-touch interface and includes an FM radio and a pedometer.
iPod Touch: Portable media player that runs iOS, released on September 12, 2012 and currently available in 16, 32 and 64 GB models. The current generation features the Apple A5 processor, a Retina display, Siri and dual cameras on the front (1.2 megapixel sensor) and back (5 megapixel iSight). The latter camera supports HD video recording at 1080p.[176]

24.2.4 iPhone

Main article: iPhone

The first-generation iPhone, 3G, 4, 5, 5C and 5S to scale.

At the Macworld Conference & Expo in January 2007, Steve Jobs introduced the long-anticipated[177] iPhone, a convergence of an Internet-enabled smartphone and iPod.[178] The first-generation iPhone was released on June 29, 2007 for $499 (4 GB) and $599 (8 GB) with an AT&T contract.[179] On February 5, 2008, it was updated to have 16 GB of memory, in addition to the 8 GB and 4 GB models.[180] It combined a 2.5G quad-band GSM and EDGE cellular phone with features found in handheld devices, running scaled-down versions of Apple's Mac OS X (dubbed iPhone OS, later renamed iOS), with various Mac OS X applications such as Safari and Mail. It also includes web-based and Dashboard apps such as Google Maps and Weather. The iPhone features a 3.5-inch (89 mm) touchscreen display, Bluetooth, and Wi-Fi (both b and g).[178]

A second version, the iPhone 3G, was released on July 11, 2008 with a reduced price of $199 for the 8 GB version and $299 for the 16 GB version.[181] This version added support for 3G networking and assisted-GPS navigation. The flat silver back and large antenna square of the original model were eliminated in favor of a glossy, curved black or white back. Software capabilities were improved with the release of the App Store, which provided iPhone-compatible applications to download. On April 24, 2009, the App Store[182] surpassed one billion downloads.[183]

On June 8, 2009, Apple announced the iPhone 3GS. It provided an incremental update to the device, including faster internal components, support for faster 3G speeds, video recording capability, and voice control.

At the Worldwide Developers Conference (WWDC) on June 7, 2010, Apple announced the redesigned iPhone 4.[184] It featured a 960×640 display, the Apple A4 processor, a gyroscope for enhanced gaming, a 5 MP camera with LED flash, a front-facing VGA camera and FaceTime video calling. Shortly after its release, reception issues were discovered by consumers, due to the stainless steel band around the edge of the device, which also serves as the phone's cellular signal and Wi-Fi antenna. The issue was corrected by a "Bumper Case" distributed by Apple for free to all owners for a few months. In June 2011, Apple overtook Nokia to become the world's biggest smartphone maker by volume.[185] On October 4, 2011, Apple unveiled the iPhone 4S, which was first released on October 14, 2011.[186] It features the Apple A5 processor and Siri voice assistant technology, the latter of which Apple had acquired in 2010.[187] It also features an updated 8 MP camera with new optics. Apple sold 4 million iPhone 4S phones in the first three days of availability.[188]

On September 12, 2012, Apple introduced the iPhone 5.[189] It added a 4-inch display, 4G LTE connectivity, and the upgraded Apple A6 chip, among several other improvements.[190] Two million iPhones were sold in the first twenty-four hours of pre-ordering,[191] and over five million handsets were sold in the first three days of its launch.[192] Upon the launch of the iPhone 5S and iPhone 5C, Apple set a new record for first-weekend smartphone sales by selling over nine million devices in the first three days of their launch.[193] The release of the iPhone 5S and 5C was the first time that Apple simultaneously launched two models.[194]

A patent filed in July 2013 revealed the development of a new iPhone battery system that uses location data in combination with data on the user's habits to moderate the handset's power settings accordingly. Apple is working towards a power management system that will provide features such as the ability of the iPhone to estimate the length of time a user will be away from a power source to modify energy usage, and a detection function that adjusts the charging rate to best suit the type of power source being used.[195]

In a March 2014 interview, Apple designer Jonathan Ive used the iPhone as an example of Apple's ethos of creating high-quality, life-changing products. He explained that the phones are comparatively expensive due to the intensive effort that is used to make them:

"We don't take so long and make the way we make for fiscal reasons ... Quite the reverse. The body is made from a single piece of machined aluminium ... The whole thing is polished first to a mirror finish and then is very finely textured, except for the Apple logo. The chamfers [smoothed-off edges] are cut with diamond-tipped cutters. The cutters don't usually last very long, so we had to figure out a way of mass-manufacturing long-lasting ones. The camera cover is sapphire crystal. Look at the details around the SIM-card slot. It's extraordinary!"[52]

24.2.5 Apple TV

Main article: Apple TV

The current generation Apple TV.

At the 2007 Macworld conference, Jobs demonstrated the Apple TV (previously known as the iTV),[196] a set-top video device intended to bridge the sale of content from iTunes with high-definition televisions. The device links up to a user's TV and syncs, either via Wi-Fi or a wired network, with one computer's iTunes library and streams content from an additional four. The Apple TV originally incorporated a 40 GB hard drive for storage, included outputs for HDMI and component video, and played video at a maximum resolution of 720p.[197] On May 31, 2007, a 160 GB drive was released alongside the existing 40 GB model.[198] A software update released on January 15, 2008 allowed media to be purchased directly from the Apple TV.[199]

In September 2009, Apple discontinued the original 40 GB Apple TV and now continues to produce and sell the 160 GB Apple TV. On September 1, 2010, Apple released a completely redesigned Apple TV. The new device is a quarter of the size, runs quieter, and replaces the need for a hard drive with media streaming from any iTunes library on the network, along with 8 GB of flash memory to cache downloaded media. Like the iPad and the iPhone, the Apple TV runs on an A4 processor. The memory included in the device is half of that in the iPhone 4 at 256 MB; the same as the iPad, iPhone 3GS, and third and fourth-generation iPod Touch.[200]

It has HDMI as its only video output. Features include access to the iTunes Store to rent movies and TV shows (purchasing has been discontinued), streaming from internet video sources, including YouTube and Netflix, and media streaming from an iTunes library. Apple also reduced the price of the device to $99. A third generation of the device was introduced at an Apple event on March 7, 2012, with new features such as higher resolution (1080p) and a new user interface.

24.2.6 Apple Watch

Main article: Apple Watch

The Apple Watch smartwatch was introduced by Cook on September 9, 2014, and is scheduled to be released in early 2015. The wearable device offers fitness-tracking capabilities similar to those of Fitbit, and must be used in combination with an iPhone to work (only the iPhone 5 or later models are compatible with the Apple Watch).[201][202][203]

24.2.7 Software

See also: List of Macintosh software

Apple develops its own operating system to run on Macs, OS X, the latest version being OS X Yosemite (version 10.10). Apple also independently develops computer software titles for its OS X operating system. Much of the software Apple develops is bundled with its computers. An example of this is the consumer-oriented iLife software package, which bundles iMovie, iPhoto and GarageBand. For presentation, page layout and word processing, iWork is available, which includes Keynote, Pages, and Numbers. iTunes, the QuickTime media player, and Software Update are available as free downloads for both OS X and Windows.

Apple also offers a range of professional software titles. Its range of server software includes the operating system OS X Server; Apple Remote Desktop, a remote systems management application; and Xsan, a Storage Area Network file system. For the professional creative market, there is Aperture for professional RAW-format photo processing; Final Cut Pro, a video production suite; Logic Pro, a comprehensive music toolkit; and Motion, an advanced effects composition program.

Apple also offers online services with iCloud, which provides cloud storage and syncing for a wide range of data, including email, contacts, calendars, photos and documents. It also offers iOS device backup, and is able to integrate directly with third-party apps for even greater functionality. iCloud is the fourth generation of online services provided by Apple, and was preceded by MobileMe, .Mac and iTools, all of which met varying degrees of success.

24.3 Corporate identity

24.3.1 Logo

See also: Typography of Apple Inc.
"Apple logo" redirects here. For the programming language, see Apple Logo.

First Apple logo (April 1, 1976, Prototype)

First official Apple logo, used from May 17, 1976 to August 26, 1999.

Current Apple logo, used since August 27, 1999.[204]

According to Steve Jobs, the company's name was inspired by his visit to an apple farm while on a fruitarian diet. Jobs thought the name "Apple" was "fun, spirited and not intimidating".[205]

Apple's first logo, designed by Ron Wayne, depicts Sir Isaac Newton sitting under an apple tree. It was almost immediately replaced by Rob Janoff's "rainbow Apple", the now-familiar rainbow-colored silhouette of an apple with a bite taken out of it. Janoff presented Jobs with several different monochromatic themes for the "bitten" logo, and Jobs immediately took a liking to it. However, Jobs insisted that the logo be colorized to humanize the company.[206][207] The logo was designed with a bite so that it would not be confused with a cherry.[208] The colored stripes were conceived to make the logo more accessible, and to represent the fact that the Apple II could generate graphics in color.[208] This logo is often erroneously referred to as a tribute to Alan Turing, with the bite mark a reference to his method of suicide.[209][210] Both Janoff and Apple deny any homage to Turing in the design of the logo.[208][211]

On August 27, 1999[204] (the year following the introduction of the iMac G3), Apple officially dropped the rainbow scheme and began to use monochromatic logos nearly identical in shape to the previous rainbow incarnation. An Aqua-themed version of the monochrome logo was used from 1999 to 2003, and a glass-themed version was used from 2007 to 2013.

Steve Jobs and Steve Wozniak were Beatles fans,[212][213] but Apple Inc. had name and logo trademark issues with Apple Corps Ltd., a multimedia company started by the Beatles in 1967. This resulted in a series of lawsuits and tension between the two companies. These issues ended with the settling of their most recent lawsuit in 2007.

24.3.2 Advertising

Main articles: Apple Inc. advertising and List of Apple Inc. slogans

Apple's first slogan, "Byte into an Apple", was coined in the late 1970s.[214] From 1997 to 2002, the slogan "Think Different" was used in advertising campaigns, and is still closely associated with Apple.[215] Apple also has slogans for specific product lines; for example, "iThink, therefore iMac" was used in 1998 to promote the iMac,[216] and "Say hello to iPhone" has been used in iPhone advertisements.[217] "Hello" was also used to introduce the original Macintosh, Newton, iMac ("hello (again)"), and iPod.[218]

From the introduction of the Macintosh in 1984 with the 1984 Super Bowl commercial to the more modern "Get a Mac" adverts, Apple has been recognized for its efforts towards effective advertising and marketing for its products. However, claims made by later campaigns were criticized, particularly the 2005 Power Mac ads.[219][220][221] Apple's product commercials gained a lot of attention as a result of their eye-popping graphics and catchy tunes.[222] Musicians who benefited from an improved profile as a result of their songs being included in Apple commercials include Canadian singer Feist with the song "1234" and Yael Naïm with the song "New Soul".[222]

24.3.3 Brand loyalty

See also: Criticism of Apple Inc. § Comparison with a cult/religion

Apple aficionados wait in line around the Apple Store on Fifth Avenue in New York City in anticipation of a new product.

"The scenes I witnessed at the opening of the new Apple store in London's Covent Garden were more like an evangelical prayer meeting than a chance to buy a phone or a laptop."
– Alex Riley, writing for the BBC[223]

Apple's high level of brand loyalty is considered unusual for any product. Apple evangelists were actively engaged by the company at one time, but this was after the phenomenon had already been firmly established. Apple evangelist Guy Kawasaki has called the brand fanaticism "something that was stumbled upon",[224] while Ive explained in 2014 that "People have an incredibly personal relationship with Apple's products".[52] Apple Store openings can draw crowds of thousands, with some waiting in line as much as a day before the opening or flying
in from other countries for the event.[225] The opening of the New York City Fifth Avenue "Cube" store had a line half a mile long; a few Mac fans used the setting to propose marriage.[226] The line for the Ginza opening in Tokyo was estimated to include thousands of people and exceeded eight city blocks.[227]

Fortune magazine named Apple the most admired company in the United States in 2008, and in the world from 2008 to 2012.[228][229][230][231][232] On September 30, 2013, Apple surpassed Coca-Cola to become the world's most valuable brand in the Omnicom Group's "Best Global Brands" report.[233] Boston Consulting Group has ranked Apple as the world's most innovative brand every year since 2005.[234]

John Sculley told The Guardian newspaper in 1997: "People talk about technology, but Apple was a marketing company. It was the marketing company of the decade."[235] Research in 2002 by NetRatings indicated that the average Apple consumer was usually more affluent and better educated than other PC company consumers. The research indicated that this correlation could stem from the fact that, on average, Apple Inc. products were more expensive than other PC products.[236][237]

In response to a query about the devotion of loyal Apple consumers, Jonathan Ive responded:

"What people are responding to is much bigger than the object. They are responding to something rare: a group of people who do more than simply make something work, they make the very best products they possibly can. It's a demonstration against thoughtlessness and carelessness."[52]

24.3.4 Home page

The Apple website home page has been used to commemorate, or pay tribute to, milestones and events outside of Apple's product offerings:

2014: Robin Williams[238]
2013: Nelson Mandela[239]
2011: Steve Jobs[240]
2010: Jerome B. York (board member)[241]
2005: Rosa Parks[242]
2003: Gregory Hines[243]
2001: George Harrison[244]

24.3.5 Headquarters

Main article: Apple Campus

Apple Inc.'s world corporate headquarters are located in the middle of Silicon Valley, at 1–6 Infinite Loop, Cupertino, California. This Apple campus has six buildings that total 850,000 square feet (79,000 m²) and was built in 1993 by Sobrato Development Cos.[245]

In 2006, Apple announced its intention to build a second campus in Cupertino, about 1 mile (1.6 km) east of the current campus and next to Interstate 280.[246] The new campus building will be designed by Norman Foster.[247] The Cupertino City Council approved the proposed "spaceship" design campus on October 15, 2013, after a 2011 presentation by Jobs detailing the architectural design of the new building and its environs. The new campus is planned to house up to 13,000 employees in one central, four-storied, circular building surrounded by extensive landscaping. It will feature a café with seating for 3,000 people and parking underground as well as in a parking structure. The 2.8 million square foot facility will also include Jobs's original designs for a fitness center and a corporate auditorium.[248]

Apple's headquarters for Europe, the Middle East and Africa (EMEA) are located in Cork in the south of Ireland.[123][249][250][251][252][253][254] The facility, which opened in 1980, was Apple's first location outside of the United States.[255] Apple Sales International, which deals with all of Apple's international sales outside of the USA, is located at Apple's campus in Cork,[256] along with Apple Distribution International, which similarly deals with Apple's international distribution network.[257] On April 20, 2012, Apple added 500 new jobs at its European headquarters, increasing the total workforce from around 2,800 to 3,300 employees.[248][249][258] The company will build a new office block on its Hollyhill Campus to accommodate the additional staff.[259]

24.4 Corporate affairs

See also: List of mergers and acquisitions by Apple, Braeburn Capital and FileMaker Inc.

24.4.1 Corporate culture

Apple was one of several highly successful companies founded in the 1970s that bucked the traditional notions of corporate culture. Jobs often walked around the office barefoot even after Apple became a Fortune 500 company. By the time of the 1984 television commercial, Apple's informal culture had become a key trait that differentiated it from its competitors.[260] According to a 2011 report in Fortune, this has resulted in a corporate culture more akin to a startup than to a multinational corporation.[261]

As the company has grown and been led by a series of differently opinionated chief executives, it has arguably
lost some of its original character. Nonetheless, it has maintained a reputation for fostering individuality and excellence that reliably attracts talented workers, particularly after Jobs returned to the company. Numerous Apple employees have stated that projects without Jobs's involvement often take longer than projects with it.[262]

To recognize the best of its employees, Apple created the Apple Fellows program, which awards individuals who make extraordinary technical or leadership contributions to personal computing while at the company. The Apple Fellowship has so far been awarded to individuals including Bill Atkinson,[263] Steve Capps,[264] Rod Holt,[263] Alan Kay,[265][266] Guy Kawasaki,[265][267] Al Alcorn,[268] Don Norman,[265] Rich Page,[263] and Steve Wozniak.[263]

At Apple, employees are specialists who are not exposed to functions outside their area of expertise. Jobs saw this as a means of having "best-in-class" employees in every role. For instance, Ron Johnson, Senior Vice President of Retail Operations until November 1, 2011, was responsible for site selection, in-store service, and store layout, yet had no control of the inventory in his stores (this was done by Cook, who had a background in supply-chain management).[269] Apple is also known for strictly enforcing accountability. Each project has a "directly responsible individual", or "DRI" in Apple jargon.[261] As an example, when iOS senior vice president Scott Forstall refused to sign Apple's official apology for numerous errors in the redesigned Maps app, he was forced to resign.[270] Unlike other major U.S. companies, Apple provides a relatively simple compensation policy for executives that does not include perks enjoyed by other CEOs like country club fees or private use of company aircraft. The company typically grants stock options to executives every other year.[271]

24.4.2 Customer service

In 1999, Apple retained Eight Inc. as a strategic retail design partner and began creating the Apple retail stores. Tim Kobe of Eight Inc. prepared an "Apple Retail" white paper for Jobs, outlining the ability of separate Apple retail stores to directly drive the Apple brand experience. Kobe used their recently completed work with The North Face and Nike as a basis for the white paper. The first two Apple Stores opened on May 19, 2001 in Tysons Corner, Virginia, and Glendale, California. More than 7,700 people visited Apple's first two stores in the opening weekend, spending a total of US$599,000.[272] As of June 2014, Apple maintains 425 retail stores in fourteen countries.[273][274] In addition to Apple products, the stores sell third-party products like software titles, digital cameras, camcorders and handheld organizers.[275]

A media article published in July 2013 provided details about Apple's "At-Home Apple Advisors" customer support program that serves as the corporation's call center. The advisors are employed within the U.S. and work remotely after undergoing a four-week training program and testing period. The advisors earn between US$9 and $12 per hour and receive intensive management to ensure a high quality of customer support.[276]

24.4.3 Manufacturing

The company's manufacturing, procurement and logistics enable it to execute massive product launches without having to maintain large, profit-sapping inventories. In 2011, Apple's profit margins were 40 percent, compared with between 10 and 20 percent for most other hardware companies. Cook's catchphrase to describe his focus on the company's operational arm is: "Nobody wants to buy sour milk."[111][277]

During the Mac's early history, Apple generally refused to adopt prevailing industry standards for hardware, instead creating their own.[278] This trend was largely reversed in the late 1990s, beginning with Apple's adoption of the PCI bus in the 7500/8500/9500 Power Macs. Apple has since adopted USB, AGP, HyperTransport, Wi-Fi, and other industry standards in its computers. FireWire is an Apple-originated standard that was widely adopted across the industry after it was standardized as IEEE 1394.[279]

Labor practices

Further information: Criticism of Apple Inc. § Labor practices

The company advertised its products as being made in America until the late 1990s; however, as a result of outsourcing initiatives in the 2000s, almost all of its manufacturing is now handled abroad. According to a report by the New York Times, Apple insiders "believe the vast scale of overseas factories as well as the flexibility, diligence and industrial skills of foreign workers have so outpaced their American counterparts that 'Made in the U.S.A.' is no longer a viable option for most Apple products".[280]

In 2006, the Mail on Sunday reported on the working conditions of the Chinese factories where contract manufacturers Foxconn and Inventec produced the iPod.[281] The article stated that one complex of factories that assembled the iPod and other items had over 200,000 workers living and working within it. Employees regularly worked more than 60 hours per week and made around $100 per month. A little over half of the workers' earnings was required to pay for rent and food from the company.[282][283][284]

Apple immediately launched an investigation after the 2006 media report, and worked with their manufacturers to ensure acceptable working conditions.[285] In 2007, Apple started yearly audits of all its suppliers regarding workers' rights, slowly raising standards and pruning suppliers that did not comply. Yearly progress reports have
been published since 2008.[286] In 2011, Apple admitted that its suppliers' child labor practices in China had worsened.[287]

The Foxconn suicides occurred between January and November 2010, when 18[288] Foxconn employees attempted suicide, resulting in 14 deaths; the company was the world's largest contract electronics manufacturer, for clients including Apple, at the time.[288][289][290] The suicides drew media attention, and

... are aware of no other company doing as much as Apple to ensure fair and safe working conditions.[293]

In December 2014, the Institute for Global Labour and Human Rights published a report which documented inhumane conditions for the 15,000 workers at a Zhen Ding Technology factory in Shenzhen, China, which serves as a major supplier of circuit boards for Apple's iPhone and iPad. According to the report, workers are pressured into 65-hour work weeks which leaves them so ex