
COMPUTER ARCHITECTURE

CHAPTER 1: INTRODUCTION
Course aims

Understand the internal organization of a computer, the principles of its
operation, and the main problems to be solved.

Why?

▪ Required for a good understanding of other courses,
▪ from Operating Systems to Algorithms.
▪ Study of general problems and solutions in IT,
▪ from modularity to performance issues.
Some courses and related issues

▪ Operating systems
▪ How does an operating system manage to run several programs "all at once"?

▪ Concurrent programming
▪ My two threads each increment the grandTotal variable 1000 times, but I only
get 1,534 in the end. Why?

▪ Parallel programming
▪ I've gone from a single-processor to a quad-processor machine, and my prime
number calculation doesn't go any faster!?
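The lost-update puzzle above can be made deterministic by modelling each thread's increment as separate load / add / store steps and interleaving them by hand. This is only a sketch: the generator-based "threads" and the fixed schedule are illustrative, whereas a real scheduler interleaves unpredictably.

```python
# Deterministic sketch of the lost-update race: each "thread" increments a
# shared counter in three machine-level steps (load, add, store). When the
# scheduler switches threads between load and store, one update is lost.
grand_total = 0

def increment_steps():
    """Yield between the three machine-level steps of grand_total += 1."""
    global grand_total
    local = grand_total          # load
    yield
    local = local + 1            # add
    yield
    grand_total = local          # store
    yield

# Worst-case schedule: both threads load the old value before either stores.
t1, t2 = increment_steps(), increment_steps()
for step in (t1, t2, t1, t2, t1, t2):   # strict alternation
    next(step, None)

print(grand_total)   # 1, not 2: one increment was lost
```

With 1000 increments per thread and random interleavings, any total between 2 and 2000 can come out, which is exactly the 1,534 surprise.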

Some courses and related issues

▪ Network programming
▪ I copy my measurement table from the Jupiter machine to the Venus machine
and the values are no longer the same!?

▪ Compilation
▪ Which machine instruction sequence should be generated to copy a character
string?

▪ Algorithms
▪ Why is quicksort generally faster than heapsort when they have the same
average complexity O(n log₂ n)?
Some general problems and solutions

▪ Reduce access latency:
▪ to memory, hard disk, server, etc.
▪ solution: use a cache, i.e. a local memory with quick access.

[figure: CPU ↔ cache memory (quick) ↔ RAM memory (slow)]
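The cache idea above can be sketched as a small dictionary placed in front of a deliberately slow access function. The function names and the 1 ms delay are illustrative, not part of the course material.

```python
import time

def slow_read(address):
    """Stand-in for a slow RAM/disk/server access."""
    time.sleep(0.001)            # pretend latency
    return address * 2           # dummy stored value

cache = {}                       # fast local memory

def cached_read(address):
    if address not in cache:     # miss: go to slow memory, keep a local copy
        cache[address] = slow_read(address)
    return cache[address]        # hit: answered from the fast local copy

cached_read(7)                   # first access: miss, pays the full latency
print(cached_read(7))            # second access: hit, fast -> 14
```

The same pattern (check local copy first, fall back to the slow source) underlies CPU caches, disk caches, and web caches alike.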

▪ Increase processing speed:
▪ instructions, program calculations, etc.
▪ solution: parallelize, or pipeline the processing.

▪ Etc.
References

▪ Books:
▪ Structured Computer Organization, Andrew S. Tanenbaum
▪ Computer Organization and Design, John Hennessy, David Patterson

▪ Manufacturer's documentation:
▪ Intel® 64 and IA-32 processor architecture manuals
▪ Motorola 68000 Developer's Manual
Evaluation methods

▪ 2 or 3 MCQs taken at the end of the course sessions (automatic correction).
▪ Continuous assessment: 40% average MCQ mark + 60% practical work (TP) mark.
▪ Synthesis TP, to be delivered at the end of the session (corrected by the
TD/TP tutors).
▪ Final exam (DE) (corrected by the TD/TP tutors).
CHAPTER PLAN

3. Von Neumann's machine
Before our era

▪ c. 3000 BC: period of the Chinese emperor Fou-Hi, whose magic symbol, the
octagon of trigrams, contains the first 8 numbers represented in binary form
by broken or unbroken lines: 000, 001, 010, 011, etc.

▪ c. 2000 BC: appearance in the Middle East of the first calculating "tool":
the abacus.

▪ c. 1000 BC: invention of the Chinese abacus.

▪ c. 300 BC: the Greek philosopher Aristotle defines in his work (the
Organon) what logic is.
First mechanical calculators

▪ 1623: Wilhelm Schickard invents a calculating clock.
▪ 1632: the Englishman Oughtred invents the slide rule.
▪ 1642: in order to help his father, a tax collector in Rouen, Pascal
develops the Pascaline.
▪ 1679: Gottfried Wilhelm von Leibniz discovers and develops binary
arithmetic.
▪ 1694: Leibniz invents a calculating machine derived from the Pascaline but
capable of processing multiplications and divisions.
▪ 1820: Charles-Xavier Thomas de Colmar invents the arithmometer, based on
the Leibniz machine.
The big names

▪ 1854: George Boole publishes a book in which he demonstrates that any
logical process can be decomposed into a sequence of logical operations
applied to two states.
▪ 1904: invention of the first vacuum tube, the diode, by John Ambrose
Fleming.
▪ 1937: Alan M. Turing publishes a paper on computable numbers and invents
the Turing Machine (followed by the Turing Test in 1950).
▪ 1938: thesis by Claude E. Shannon, who first draws a parallel between
electrical circuits and Boolean algebra. He defines the binary digit: bit
(BInary digiT).
The first computers

▪ 1941: creation of the ABC binary calculator by John Atanasoff and Clifford
Berry, the first calculator to use Boolean algebra.

▪ 1941: Konrad Zuse develops the Z3, the first working programmable,
automatic computer.

▪ 1945: John Von Neumann describes the EDVAC (Electronic Discrete Variable
Automatic Computer) ⇒ Von Neumann architecture.

▪ 1946: creation of the ENIAC (Electronic Numerical Integrator and Computer)
by P. Eckert and J. Mauchly.
The first programming languages

▪ 1950: invention of the assembler by Maurice V. Wilkes of Cambridge
University. Previously, programming was done directly in binary.
▪ 1955: IBM launches the IBM 704, developed by Gene Amdahl, the machine on
which the FORTRAN language will be developed.
▪ 1957: creation of the first universal programming language, FORTRAN
(FORmula TRANslator), by John Backus of IBM.
▪ 1964: Thomas Kurtz and John Kemeny create the BASIC (Beginner's All-purpose
Symbolic Instruction Code) language at Dartmouth College for their students.
▪ 1968: creation of the PASCAL language by Niklaus Wirth.
Other important dates and people

▪ 1964: creation of the ASCII code (American Standard Code for Information
Interchange), standardized in 1966 by ISO.

▪ 1965: Gordon Moore states the first "Moore's Law", saying that the
complexity of integrated circuits will double every year.

▪ 1969: Ken Thompson and Dennis Ritchie develop UNIX on a DEC PDP-7.
▪ ...

More information on http://histoire.info.online.fr
Types of computers

▪ Visible computers:
▪ Personal computers and workstations (laptops and desktops),
▪ Server computers,
▪ Process control computers,
▪ Supercomputers.

▪ Invisible computers (by far the most numerous!):
▪ Vehicles (cars, trains, boats, planes, rockets, ...),
▪ Telecommunications,
▪ Household appliances,
▪ Watchmaking,
▪ …
The machine as perceived by the user

Interaction with the machine through peripherals:
- keyboard,
- mouse,
- screen,
- printer,
- DVD player,
- etc.
Services rendered by a computer

Basic functions, coordinated by a control function:
▪ data storage,
▪ data processing,
▪ data transfer (data exchange with the outside world),
▪ control.
Hardware decomposition of a computer

▪ A computer consists of several parts:
▪ mouse,
▪ screen,
▪ keyboard,
▪ central unit,
▪ floppy drive ...

▪ Inside the central unit:
▪ a motherboard,
▪ a video card,
▪ hard disk drives ...

▪ On the motherboard:
▪ a microprocessor,
▪ memory (ROM, RAM) ...

▪ In the microprocessor: the chip itself.
Architecture and organisation

▪ Architecture: the attributes of a machine as seen by the programmer
(instruction set, data formats, ...); organisation: how those attributes are
implemented.
▪ The entire Intel x86 family shares the same basic architecture.
▪ The entire IBM System/370 family shares the same basic architecture.
▪ But the organization is different from one version to another.
Five generations of computers

Generation  Period       Technology    Speed (operations/s)
1           1946 - 1957  Vacuum tube        40 000
2           1958 - 1964  Transistor        200 000
3           1965 - 1971  SSI/MSI         1 000 000
4           1972 - 1977  LSI            10 000 000
5           Post 1978    VLSI          100 000 000
1945-1958 Period

▪ Specialized computers, unique copies.
▪ Large and unreliable machines.
▪ Tube technology, relays, resistors.
▪ ~10⁴ logic elements; programming by punched cards.

▪ Representative: ENIAC (1946), used for the feasibility study of the H-bomb.
1958-1964 Period

▪ General-purpose, reliable machines.
▪ Transistor technology.
▪ ~10⁵ logic elements.
▪ First high-level programming languages (COBOL, FORTRAN, LISP).

▪ Representative: DEC PDP-1 minicomputer (1961).
1965-1971 Period

▪ Integrated circuit technology (SSI/MSI, small/medium scale integration).
▪ ~10⁶ logic elements.
▪ The advent of complex operating systems.
▪ UNIX, Pascal, Basic, ...
▪ CISC (Complex Instruction Set Computer).

▪ Moore's Law: the number of transistors that can be integrated on a single
chip doubles each year.
Moore's Law

[figure: transistor count per chip doubling over time]
Period 1972 to present day

▪ LSI technology (Large Scale Integration).
▪ ~10⁷ logic elements.
▪ The advent of machine networks.
▪ Distributed processing.

▪ 1971: Intel's 4004, the first microprocessor: all the components of the CPU
combined on a single chip.
▪ 1981: IBM PC, capable of executing 240,000 additions per second.
▪ 1984: first personal computer running an OS different from Windows (Apple
Macintosh).
▪ 1996: Intel processor capable of executing 400 million additions per second.
Functional decomposition of a computer

[figure: Memory, Input/Output, and the Central Processing Unit (ALU, CU,
registers), linked by interconnections]
Von Neumann's machine

Universal machine model with:
▪ a memory: contains instructions and data,
▪ an arithmetic and logic unit (ALU): performs the calculations,
▪ an input/output unit: information exchange (I/O) with devices,
▪ a control unit (CU): directs the operation.
Schematic operation

▪ The Control Unit:
1. extracts an instruction from the memory,
2. analyzes the instruction,
3. searches the memory for the relevant data,
4. triggers the appropriate operation on the ALU or I/O,
5. stores the result in memory.
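The five steps above can be sketched as a loop over a toy memory that holds both instructions and data. The tiny instruction set (LOAD/ADD/STORE/HALT) and the addresses are invented for illustration, not taken from the course.

```python
# Toy Von Neumann cycle: memory holds both instructions and data; the control
# unit repeatedly fetches, decodes, executes, and stores results back.
memory = {
    0: ("LOAD", 100),    # ACC <- memory[100]
    1: ("ADD", 101),     # ACC <- ACC + memory[101]
    2: ("STORE", 102),   # memory[102] <- ACC
    3: ("HALT", None),
    100: 40, 101: 2, 102: 0,     # data words
}

pc, acc = 0, 0
while True:
    op, addr = memory[pc]        # 1. extract an instruction from memory
    pc += 1                      # 2. analyze it (here the tuple is pre-decoded)
    if op == "LOAD":             # 3. search memory for the relevant data
        acc = memory[addr]
    elif op == "ADD":            # 4. trigger the operation on the ALU
        acc += memory[addr]
    elif op == "STORE":          # 5. store the result in memory
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[102])   # 42
```

Note how the same dictionary serves for code and data: that shared memory is the defining trait of the Von Neumann model.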

Von Neumann's machine

[figure: diagram of the Von Neumann machine]
Overview

▪ Basic features: signal and chronogram, clock, registers, bus.
▪ Functional units: memory, ALU, I/O, CU.
Signal and chronogram

▪ A signal is a discrete quantity taking its values in {0, 1}.
▪ A chronogram is the graphical representation of a signal evolving over
time.

[figure: chronogram, signal state plotted against time]
Clock

▪ Signal alternating regularly between 0 and 1.
▪ One repetition is called a cycle.
▪ Example: a clock frequency of 500 MHz gives cycles of 2 nanoseconds.
▪ Synchronizes all the devices.
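The example rests on the relation T = 1/f between cycle time and frequency, which can be checked directly (the helper name is illustrative):

```python
# Clock period from frequency: T = 1 / f, expressed here in nanoseconds.
def cycle_ns(freq_hz):
    return 1e9 / freq_hz

print(cycle_ns(500e6))   # 2.0 ns, as in the example
print(cycle_ns(2e9))     # 0.5 ns for a 2 GHz clock
```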

Registers

▪ Fast memory elements internal to the CPU (as opposed to the RAM memory
cells).
▪ A signal commands storage into the register:
▪ level-triggered loading,
▪ edge-triggered loading.
Bus

▪ Set of electrical wires over which the signals travel.
▪ Connects the units to each other.
▪ Bus width:
▪ number of wires constituting the path,
▪ number of signals that can be sent at the same time.
Memory

▪ Vector in which each component is accessible through an address.
▪ Permitted operations:
▪ read,
▪ write.
▪ Word = unit of information accessible in a single read operation.

[figure: memory as an address/value table, e.g. the cell at address
3 448 765 900 (and change) holding the value 126]
How it works

1. The CPU writes the address of a cell into an address register (AR).
2. The CU requests an operation.
3. Exchanges are made via a word register (WR).

Read:              Write:
AR ← address       WR ← value
WR ← memory[AR]    AR ← address
                   memory[AR] ← WR
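The read/write sequences above can be traced with AR and WR modelled as plain variables. The register names follow the slide; the memory size and the function names are illustrative.

```python
# Memory access through an address register (AR) and a word register (WR),
# mirroring the read and write sequences above.
memory = [0] * 8
AR = WR = 0

def write(address, value):
    global AR, WR
    WR = value               # WR <- value
    AR = address             # AR <- address
    memory[AR] = WR          # memory[AR] <- WR

def read(address):
    global AR, WR
    AR = address             # AR <- address
    WR = memory[AR]          # WR <- memory[AR]
    return WR

write(3, 126)
print(read(3))   # 126
```

The CPU never touches a memory cell directly: everything funnels through the two registers, which is why the bus width and the word size matter.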

ALU

▪ A function of 3 parameters:
▪ 1 operation,
▪ 2 arguments,
▪ 1 result.

▪ Requires one or more storage registers for:
▪ inputs, outputs,
▪ intermediate results.
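The "function of 3 parameters" view can be written down directly; the operation names in the table are illustrative, not a real ALU's repertoire.

```python
# ALU as a function: (operation, argument1, argument2) -> result.
import operator

ALU_OPS = {"add": operator.add, "sub": operator.sub,
           "mul": operator.mul, "and": operator.and_}

def alu(op, a, b):
    return ALU_OPS[op](a, b)

print(alu("add", 40, 2))             # 42
print(bin(alu("and", 0b1100, 0b1010)))   # 0b1000
```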

Input/Output unit

▪ Serves as an interface with the peripherals.
▪ The associated operations depend on the device.
▪ Operation similar to memory:
▪ a register storing the device address (Device Selection Register),
▪ a Data Exchange Register.
Control unit

Its operation is that of the Von Neumann model, associated with the following
registers:
▪ Program counter (PC): memory address of the instruction to be executed,
▪ Instruction register (IR): holds the instruction, divided into different
parts (function, address).

[figure: CU connected to the data and address buses, with instruction
register, program counter, function decoder, address register and clock]
The whole machine

[figure: complete diagram of the machine]
Binary information

▪ Elementary information is encoded in binary.

Physical support        1        0
Electrical voltage      5V       0V
Logical proposition     true     false
Gate                    open     closed
Magnetization           ↑        ↓

▪ This coding is purely conventional (it can be reversed!).
Why binary?

▪ Because it's easier to do!
▪ Because it's enough!
▪ Electronic machines:
▪ coding of electrical voltages:
▪ 5V = 1
▪ 0V = 0

▪ Note: there have been ternary and quaternary machines.
Size measurement units

▪ The information is encoded in binary:
▪ A bit can take 2 values: 0 or 1.
▪ One kilobit (noted 1 Kb) is equal to 2¹⁰ bits.
▪ Exactly how many bits is 1 Kb?

▪ One megabit (noted 1 Mb) is worth 2¹⁰ kilobits.
▪ Exactly how many bits is 1 Mb?

▪ One gigabit (noted 1 Gb) is worth 2¹⁰ megabits.
▪ Exactly how many bits is 1 Gb?
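The questions above can be checked directly, since each unit multiplies the previous one by 2¹⁰ = 1024 (the variable names are just for the sketch):

```python
# Powers of two behind the binary units: each step is a factor of 2**10.
kilobit = 2 ** 10                 # 1 Kb in bits
megabit = kilobit * 2 ** 10       # 1 Mb in bits
gigabit = megabit * 2 ** 10       # 1 Gb in bits

print(kilobit)   # 1024
print(megabit)   # 1048576
print(gigabit)   # 1073741824
```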

Size measurement units

▪ In most computers, each character is encoded on 8 bits.
▪ How many different characters can be encoded with 8 bits?

▪ One byte = 8 bits.

▪ Caution:
▪ 1 KB = 1 kilobyte ≠ 1 Kb = 1 kilobit
▪ 1 MB = 1 megabyte ≠ 1 Mb = 1 megabit
▪ 1 GB = 1 gigabyte ≠ 1 Gb = 1 gigabit
Instruction set

▪ The set of all instructions that can be executed on a machine.
▪ Different instruction formats depending on the number of fields reserved
for operands (or addresses):

| operation | operand |                 (1-address format)
| operation | operand 1 | operand 2 |   (2-address format)
...
Example of instructions in "1 address" format

readPeriph PeriphName
writePeriph PeriphName
loadAcc adMemory
memoAcc adMemory
add adMemory
subtract adMemory
multiply adMemory
divide adMemory
testZero adMemory
stop
Another example

4-phase clock (1 phase per CPU cycle). The program:
▪ keyboard acquisition:                        readPeriph keyboard
▪ addition of the read value to a data word
  in memory:                                   add 163
▪ display of the result on the screen:         writePeriph screen

readPeriph keyboard:
H0: AR ← PC
H1: IR ← MR, PC ← PC + 1
H2: SPR ← IRp
H3: ACC ← ER

add 163:
H0: AR ← PC
H1: IR ← MR, PC ← PC + 1
H2: AR ← IRp
H3: ACC ← ACC + MR

writePeriph screen:
H0: AR ← PC
H1: IR ← MR, PC ← PC + 1
H2: SPR ← IRp
H3: ER ← ACC
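The three instructions can be traced with each H-phase written as a plain assignment. The register names (PC, AR, IR, ACC, SPR) follow the slides; the memory contents and the keyboard value are invented for the trace.

```python
# Tracing the 4-phase execution of:
#   readPeriph keyboard / add 163 / writePeriph screen
# MR stands in for the memory word at address AR; ER is the exchange register.
memory = {0: ("readPeriph", "keyboard"),
          1: ("add", 163),
          2: ("writePeriph", "screen"),
          163: 30}                       # data word at address 163 (invented)
keyboard_value = 12                      # what the user typed (invented)
PC = ACC = 0
screen = None

for _ in range(3):                       # one instruction per pass
    AR = PC                              # H0: AR <- PC
    IR = memory[AR]; PC += 1             # H1: IR <- MR, PC <- PC + 1
    op, IRp = IR
    if op == "readPeriph":
        SPR = IRp                        # H2: SPR <- IRp (select the device)
        ACC = keyboard_value             # H3: ACC <- ER
    elif op == "add":
        AR = IRp                         # H2: AR <- IRp
        ACC = ACC + memory[AR]           # H3: ACC <- ACC + MR
    elif op == "writePeriph":
        SPR = IRp                        # H2: SPR <- IRp
        screen = ACC                     # H3: ER <- ACC

print(screen)   # 42
```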

Loading

▪ Loader = program that reads a program and writes it into memory (loading
into memory: addresses and pointers).
▪ The loader itself is loaded by a hardware device called a micro-loader.

[figure: program transferred from hard disk to RAM]
Physically …

[figures: the machine's physical components]
A perpetual race for performance

▪ The evolution of computers is linked to:
▪ the evolution of needs (i.e., applications),
▪ the evolution of the performance of each component.

▪ The design principles of a machine are driven by performance.
The solutions

▪ Increase the number of bits handled simultaneously.
▪ Change the memory structure.
▪ Reduce the frequency of memory accesses.
▪ Increase bandwidth.
Performance measures

▪ Performance depends on the architecture:

Measure                                    Relevance
Clock frequency                            Objective only within the same architecture
Number of instructions/s (MIPS, MFLOPS)    Objective only for the same instruction set
Program execution time (benchmarks)        The most objective
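The table's caveat about instruction rates can be illustrated with a small (entirely invented) comparison: a higher MIPS figure does not guarantee a shorter execution time when the instruction sets differ.

```python
# Why raw instruction rates mislead across instruction sets: the same task
# may need a different number of instructions on different machines.
def exec_time_s(instructions, mips):
    """Execution time in seconds for a given instruction count and MIPS rate."""
    return instructions / (mips * 1e6)

# Machine A has the higher MIPS, but its instruction set needs more
# instructions for this program; Machine B wins on execution time,
# the only measure that matters to the user. (Numbers are invented.)
t_a = exec_time_s(3_000_000, mips=300)
t_b = exec_time_s(1_000_000, mips=200)
print(t_a > t_b)   # True: more MIPS, yet slower on this program
```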
