PARALLEL PROGRAMMING

Overview

Parallel Programming
» How does one program a parallel computing system?
» Traditionally, programs are defined serially.
» Parallel programming involves separating independent sections of code into tasks.
How to Describe Parallelism?
» Data-level (SIMD)
  > Lightweight - programmer/compiler handle this; no OS support needed
  > EXAMPLE = forAll()
» Thread/Task-level (MIMD)
  > Fairly lightweight - little OS support
  > EXAMPLE = thread_create()
» Process-level (MIMD)
  > Heavyweight - a lot of OS support
  > EXAMPLE = fork()
Serial Programs
» Program is decomposed into a series of tasks.
» Tasks can be fine-grained or coarse-grained.
» Tasks are made up of instructions.
» Tasks must be executed sequentially.
» Total execution time = Σ Execution time(Task)
» What if tasks are independent?
  > Why don't we execute them in parallel?
Parallel Programs
» Total execution time can be reduced if tasks run in parallel.
» Problem: user is responsible for defining tasks.
  > Dividing a program into tasks.
  > What each task must do.
  > How each task:
    > Communicates.
    > Synchronizes.

Parallel Programming Models
» Serial programs can be hard to design and debug.
» Parallel programs are even harder.
» Models are needed so programmers can create and understand parallel programs.
» A model is needed that allows:
  a) A single application to be defined.
  b) The application to take advantage of parallel computing resources.
  c) The programmer to reason about how the parallel program will execute, communicate, and synchronize.
  d) The application to be portable to different architectures and platforms.
Parallel Programming Paradigms
» What is a "Programming Paradigm"?
  > AKA Programming Model.
  > Defines the abstractions that a programmer can use when designing a solution to a problem.
» Parallel programming implies that there are concurrent operations.
» So what are typical concurrency abstractions?
  > Tasks.

Shared-Memory Model
» Global address space for all tasks.
» A variable x can be shared by multiple tasks.
» Synchronization is needed in order to keep data consistent.
  > Task B shouldn't read x until Task A has finished writing it.
» Synchronization is done with synchronization operations (e.g., mutexes).
Message-Passing Model
» Tasks have their own address space.
» Communication must be done through the passing of messages.
  > Copies data from one task to another.
» Synchronization is handled automatically for the programmer.
» Example - Task A gives Task B some data:
  > Task B listens for a message from Task A.
  > Task B then operates on the data once it receives the message from Task A.
  > After receiving the message, Task B and Task A have independent copies of the data.

Comparing the Models
» Shared-Memory (global address space)
  > Intertask communication is IMPLICIT.
  > Every task communicates through shared data.
  > User is responsible for correctly using synchronization operations.
» Message-Passing (independent address spaces)
  > Intertask communication is EXPLICIT.
  > Messages require that data is copied.
    > This is slow → overhead.
  > User is not responsible for separate synchronization operations; synchronization is implicit in the sending and receiving of messages.
Shared-Memory Example
» Communicating through shared data.
» Protection of critical regions.
  > Interference can occur if protection is done incorrectly (e.g., two tasks modifying the same data at the same time).
» Task A
  > mutex_lock(mutex)
  > Do Task A's job - modify data protected by mutex
  > mutex_unlock(mutex)
» Task B
  > mutex_lock(mutex)
  > Do Task B's job - modify data protected by mutex
  > mutex_unlock(mutex)
Shared-Memory Diagram
Message-Passing Example
» Communication through messages.
» Interference cannot occur b/c each task has its own copy of the data.
» Task A
  > Receive_message(TaskB, dataInput)
  > Do Task A's job - dataOutput = f(dataInput)
  > Send_message(TaskB, dataOutput)
» Task B
  > Receive_message(TaskA, dataInput)
  > Do Task B's job - dataOutput = f(dataInput)
  > Send_message(TaskA, dataOutput)

Message-Passing Diagram
[Diagram: tasks with independent address spaces connected by a global message network (interconnect). No data is stored in the network itself; all of a task's data is stored locally. Communication REQUIRES a message to be sent and received.]
Comparing the Models (Again)
» Shared-Memory
  > (+) Shared-memory doesn't require copying.
  > (-) Overhead from contention for a single memory.

Comparing the Models (Again)
» Message-Passing
  > Passing of data is explicit.
    > (-) Makes programs harder to develop.
  > Message-passing requires copying of data.
    > (+) Each task "owns" its own copy of the data.
    > (+) Scales fairly well.
    > (-) Message passing may be too "heavyweight" for some applications.
Which Model Is Better?
» Neither model has a significant advantage over the other.
» However, implementations can be better than one another.
» Implementations of each of the models can use underlying hardware of a different model.
  > e.g., a shared-memory interface on a machine with distributed memory.
  > e.g., a message-passing interface on a machine that uses a shared-memory model.

Using a Programming Model
» Most implementations of programming models are in the form of libraries.
  > Why? C is popular, but has no built-in parallel support.
» Application Programmer Interfaces (APIs)
  > The interface to the functionality of the library.
  > Keeps the library's underlying mechanisms abstract.
  > Allows applications to be portable.
A parallel programming story:
» "The architecture used for the design, development, and implementation of these statistical algorithms is one of the most challenging for a parallel programmer. Not only is the process of creating a parallel program for use on a cluster a tasking exercise, but the battle to overcome the latency problems of the architecture makes program design a challenge."

1. http://en.wikipedia.org/wiki/Parallel_programming_model
2. http://www.mpi-forum.org/docs/
3. http://www.co-array.org/
4. http://upc.lbl.gov
5. http://www.emsl.pnl.gov/docs/global/
6. http://x10-lang.org/
7. http://chapel.cray.com/