
PARALLELISM AND CONCURRENCY

INTRODUCTION
Introduction to the introduction: the beginnings...
I/O OVERLAPPING

• If an I/O operation may take a long time to complete, do not idle-wait; instead, use the CPU for something else and resume the previous computation once the I/O has completed.
• DMA support is required for this kind of I/O overlapping.
A long time ago in a galaxy far, far away...
Yes, it all started with I/O overlapping. What is I/O overlapping? To cut a long story short: in programs that require I/O operations, I/O devices can interrupt the processor to notify it of the completion of an operation. The processor may then branch to a specific subroutine, perform some tasks, and later branch back to the point where the interruption happened. Also, when a program required an I/O operation that could take a long time to complete, instead of making the processor idle-wait until the end of the operation, control could be transferred to another program, minimizing wasted CPU time.
DMA...
... And then came TIME-SHARING BASED MULTI-USER SYSTEMS (TIME-SLICING)
... And eventually came parallelism:
USE SEVERAL PROCESSORS AND DO MORE THAN ONE THING IN PARALLEL!

VERY ROUGH DEFINITION OF PARALLELISM: DO MORE THAN ONE THING AT THE SAME TIME... Whatever that may mean...

Biggest issue → SYNCHRONIZATION
NOW, LET'S GO FOR THE TERMINOLOGY AND SOME IMPORTANT DEFINITIONS
"Ordinary" programs are sequential
• An "ordinary" program consists of
  • Data declarations
  • Assignments and control-flow statements (if, while, switch, ... and procedure and method invocations)
• When compiled, all this is translated into elementary expressions (machine instructions) that
  • Compute expressions (arithmetic, ...)
  • Move data (load and store from/into memory and registers)
  • Change control flow (jump, ...)
• Machine instructions are executed SEQUENTIALLY: one after the other.
• If the same program is executed twice with the same data, its instructions will be executed in precisely the same order.
Sequential programs are deterministic
• Machine instructions are executed SEQUENTIALLY: one after the other.
• If the same program is executed twice with the same data, its instructions will be executed in precisely the same order.

• "Ordinary" programs are sequential in nature!
  ... and sequential translates into DETERMINISTIC!!!

Deterministic?
Execute the same "ordinary" program twice with the same data and you'll get exactly the same results.
CONCURRENT program
• A PROCESS is a sequential program in execution
  • Totally or partially loaded into memory
  • Has (may have) resources allocated (files, ...)

• A CONCURRENT PROGRAM is a set of sequential programs that can be executed in PARALLEL (SIMULTANEOUSLY)

• The execution of a concurrent program results in multiple processes running "simultaneously".
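As a minimal illustration of this definition, here is a small Java sketch (class and variable names are invented for the example): each Runnable below is an ordinary sequential program, and starting them as threads produces two processes that run "simultaneously".

// Minimal sketch (illustrative only): a concurrent program as a set of
// sequential programs. Each Runnable is an ordinary sequential program;
// starting it in a Thread turns it into a process in execution.
public class TwoProcesses {
    public static void main(String[] args) throws InterruptedException {
        Runnable p = () -> {                          // sequential program P
            for (int i = 1; i <= 3; i++) System.out.println("P: step " + i);
        };
        Runnable q = () -> {                          // sequential program Q
            for (int i = 1; i <= 3; i++) System.out.println("Q: step " + i);
        };

        Thread tp = new Thread(p);
        Thread tq = new Thread(q);
        tp.start(); tq.start();                       // both processes are now running "simultaneously"
        tp.join();  tq.join();                        // wait for both to terminate
    }
}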
PARALLEL? CONCURRENT?
• Traditionally, the term PARALLEL refers to systems in which the execution of several programs (their resulting processes) OVERLAPS in time by running them on SEPARATE PROCESSORS

• The term CONCURRENT refers to POTENTIAL PARALLELISM: execution may, but need not, overlap.

• CONCURRENT EXECUTION does not require multiple processors. INTERLEAVING the instructions from multiple processes on a single processor can be used to simulate parallelism, giving the illusion of parallel execution.

• PARALLELISM = ONE PROCESS → ONE PROCESSOR
• CONCURRENCY = ONE PROCESSOR → SEVERAL INTERLEAVED PROCESSES
PARALLELISM: one process / one processor
Process 1, Process 2, ..., Process N, each running on its own processor.

CONCURRENCY: one processor / several processes
Process 1, Process 2, ..., Process N, all interleaved on a single processor.
CONCURRENCY is an abstraction
• PARALLELISM = ONE PROCESS → ONE PROCESSOR
• CONCURRENCY = ONE PROCESSOR → SEVERAL INTERLEAVED PROCESSES

• CONCURRENCY is a (very useful) ABSTRACTION of PARALLELISM:
  • One can pretend (imagine) that all the processes of a concurrent program are executed in parallel (on separate processors)
  • Conversely, even if each process is executed on a separate processor, one can pretend that they are interleaved on a single processor

• Thus, concurrency and parallelism are equivalent from a theoretical point of view
concurrency and parallelism
• Thus, concurrency and parallelism are equivalent from a theoretical point of view. REGARDING CORRECTNESS this means that:

• If a program is correct under concurrent execution then it is also correct under parallel execution
• If a program is correct under parallel execution then it is also correct under concurrent execution
INTERLEAVING vs. OVERLAPPING
• CONCURRENT EXECUTION does not require multiple processors. INTERLEAVING the instructions from multiple processes on a single processor can be used to simulate parallelism, giving the illusion of parallel execution.

• In parallel execution processes OVERLAP: their instructions overlap over time
• In concurrent execution processes INTERLEAVE: their instructions interleave over time
Process P executes the atomic instructions 1 2 3 4 5 6 7 8 9; process Q executes the atomic instructions a b c d e f g.

Overlapping (parallel execution on separate processors): the instructions of P and Q run during the same interval of time.

Interleaving (concurrent execution on a single processor), for example:
1 2 a b 3 c d e 4 5 6 7 f 8 g 9
1 a 2 b 3 c 4 d e 5 6 f 7 8 9 g
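To see one concrete interleaving, here is a small Java sketch (class and step labels are invented for the example): two threads append their step labels to a shared queue; each append is atomic, so the printed sequence is one possible trace of the two processes, and it may differ from run to run.

import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

// Sketch (illustrative only): record the interleaving of two processes.
// Each add() is atomic, so the queue contents form one possible trace.
public class InterleavingTrace {
    public static void main(String[] args) throws InterruptedException {
        Queue<String> trace = new ConcurrentLinkedQueue<>();

        Thread p = new Thread(() -> {
            for (String s : new String[]{"1", "2", "3", "4", "5"}) trace.add(s);
        });
        Thread q = new Thread(() -> {
            for (String s : new String[]{"a", "b", "c", "d", "e"}) trace.add(s);
        });

        p.start(); q.start();
        p.join();  q.join();

        // e.g. "1 2 a 3 b c 4 d 5 e"; a different interleaving may appear on the next run
        System.out.println(String.join(" ", trace));
    }
}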


ATOMIC STATEMENTS, COMPUTATIONS AND TRACES
DEFINITION:
A concurrent program consists of a finite set of sequential
programs that are written using a finite set of atomic
statements (actions, instructions).
The execution of a concurrent program proceeds by
executing a sequence of atomic statements obtained by
arbitrarily interleaving the atomic statements from the
processes.
A computation is an execution sequence that can occur as
a result of the interleaving. Computations are also
called scenarios and are represented by traces.
Example (traces)
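For instance, if process P consists of the two atomic statements p1; p2 and process Q of the single atomic statement q1, there are exactly three computations, represented by the traces p1 → p2 → q1, p1 → q1 → p2 and q1 → p1 → p2. A fuller example appears with the concurrent update problem below.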
ATOMIC STATEMENTS?
...
A concurrent program consists of a finite set of sequential programs
that are written using a finite set of atomic statements (actions,
instructions).

• An atomic statement (action, instruction) is a statement that cannot be further subdivided into simpler sub-statements. When the execution of an atomic statement starts, it always proceeds, non-stop, until it reaches its end. Atomic statements are said to be non-interruptible.
GRANULARITY
QUESTION
Which statements are atomic?
Is a simple Java block like
while (a <= 10) { a = a + 1; b = b * b; }
atomic?
Is a simple Java instruction like a = a + 1 atomic?
Is a simple machine-level instruction like ADD EAX,1 atomic?

GRANULARITY refers to the "size" (complexity) of the atomic statements. Correctness may depend on granularity!!! (more about this in the future...)
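To see why the granularity of a = a + 1 matters in practice, here is a minimal Java sketch (class and variable names are invented for the example): the statement is really a read, an addition and a write back, so two threads incrementing the same variable without synchronization can interleave between the read and the write and lose updates.

// Sketch (illustrative only): "count = count + 1" is not atomic, so
// concurrent increments may be lost when two threads interleave between
// the read of count and the write of the new value.
public class LostUpdateDemo {
    static int count = 0;                               // shared, unsynchronized

    public static void main(String[] args) throws InterruptedException {
        Runnable inc = () -> {
            for (int i = 0; i < 100_000; i++) {
                count = count + 1;                      // read count, add 1, write back
            }
        };
        Thread t1 = new Thread(inc);
        Thread t2 = new Thread(inc);
        t1.start(); t2.start();
        t1.join();  t2.join();

        // Expected 200000, but lost updates usually leave a smaller value.
        System.out.println("count = " + count);
    }
}

Replacing the shared int with a java.util.concurrent.atomic.AtomicInteger and calling incrementAndGet() makes each increment an atomic statement, and the final value is then always 200000.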
Granularity and the concurrent update problem
Let's assume that n ← n + 1 is an atomic statement, and that each of the two processes P and Q executes it once.

• Only two traces are possible:
p1 → q1 and q1 → p1
Now let's assume that n ← n + 1 cannot be performed atomically (for instance, because each process must first read n into an intermediate temporary variable and then write the incremented value back).
6 different traces (13 possible states) and 2 possible outcomes.
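As a sanity check of these counts, here is a small Java sketch (names invented for the example, starting from n = 0) that replays all six interleavings of P's two steps (p1: read n into tempP, p2: write tempP + 1) with Q's two steps and collects the possible final values of n.

import java.util.Set;
import java.util.TreeSet;

// Sketch (illustrative only): enumerate the six interleavings of
// P = p1;p2 and Q = q1;q2 for a non-atomic n <- n + 1 and collect outcomes.
public class IncrementTraces {
    public static void main(String[] args) {
        String[][] traces = {
            {"p1","p2","q1","q2"}, {"p1","q1","p2","q2"}, {"p1","q1","q2","p2"},
            {"q1","p1","p2","q2"}, {"q1","p1","q2","p2"}, {"q1","q2","p1","p2"}
        };
        Set<Integer> outcomes = new TreeSet<>();
        for (String[] trace : traces) {
            int n = 0, tempP = 0, tempQ = 0;
            for (String step : trace) {
                switch (step) {
                    case "p1": tempP = n;     break;    // P reads n
                    case "p2": n = tempP + 1; break;    // P writes n
                    case "q1": tempQ = n;     break;    // Q reads n
                    case "q2": n = tempQ + 1; break;    // Q writes n
                }
            }
            System.out.println(String.join(" -> ", trace) + "  gives n = " + n);
            outcomes.add(n);
        }
        System.out.println("Possible outcomes: " + outcomes);   // prints [1, 2]
    }
}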
YES, CONCURRENCY IS DIFFICULT!!!
The concurrent update problem
