
Parallel Computing

Understanding Parallelism
• Definition of Parallelism in Computing
• Rationale for Adopting Parallelism
• Key Advantages of Parallel Computing
Types of Parallelism
• Understanding Data Parallelism
• Understanding Task Parallelism
• Comparative Analysis of Data and Task Parallelism
Data Parallelism
• Basic Principles Behind Data Parallelism (see the sketch below)
• Partitioning
• Synchronization
• Typical Applications Using Data Parallelism
• Array and Vector Operations
• Image Processing
• Machine Learning
• Strengths and Limitations of Data Parallelism
• Scalability – near-linear speed-up when the data partitions evenly
• Simplicity
• Overhead
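
To make the data-parallel pattern above concrete, here is a minimal Python sketch (function and variable names such as scale_chunk are illustrative, not part of the slides): the same operation runs on every partition of one array, and the partial results are recombined at a synchronization point.

from multiprocessing import Pool

def scale_chunk(chunk):
    # The same operation is applied to every element of this partition
    return [x * 2.0 for x in chunk]

if __name__ == "__main__":
    data = list(range(1_000_000))
    n_workers = 4
    size = len(data) // n_workers
    # Partitioning: split the array into equal chunks, one per worker
    chunks = [data[i * size:(i + 1) * size] for i in range(n_workers)]
    with Pool(n_workers) as pool:
        # Every worker runs the identical function on its own chunk
        partial = pool.map(scale_chunk, chunks)
    # Synchronization: gather the partial results and recombine them
    result = [x for chunk in partial for x in chunk]
    print(len(result))

Because each chunk does the same amount of work, the speed-up stays close to the worker count, minus the cost of splitting and recombining (the overhead noted above).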
Task Parallelism
• Basic Principles Behind Task Parallelism (see the sketch below)
• Independence of tasks
• Coordination and communication
• Dynamic task scheduling
• Typical Applications Using Task Parallelism
• Simulations
• Web servers
• Graphics Rendering
• Strengths and Limitations of Task Parallelism
Task Parallelism
• Strengths and Limitations of Task Parallelism
• Maximized CPU utilization
• Better load balancing
• Overhead
• Complexity
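
A minimal sketch of the task-parallel pattern described above, using Python's concurrent.futures (the three task functions are illustrative placeholders): independent, dissimilar tasks are submitted to a pool, run concurrently, and are collected at a single coordination point.

from concurrent.futures import ThreadPoolExecutor

def fetch_orders():
    # Independent task 1: e.g. I/O-bound work
    return "orders fetched"

def render_report():
    # Independent task 2: different work from the other tasks
    return "report rendered"

def update_index():
    # Independent task 3
    return "index updated"

if __name__ == "__main__":
    tasks = [fetch_orders, render_report, update_index]
    with ThreadPoolExecutor(max_workers=3) as pool:
        # Dynamic scheduling: each submitted task is picked up by a free worker
        futures = [pool.submit(task) for task in tasks]
        # Coordination point: wait for every task and collect its result
        results = [f.result() for f in futures]
    print(results)

Unlike the data-parallel example, each worker here does different work, so load balancing and coordination, rather than partitioning, become the main concerns.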
Amdahl's Law
• Mathematical Formulation of Amdahl's Law (see the formula after this slide)
• It quantifies the maximum speedup achievable by parallelizing a program.
• Implications of Amdahl's Law on Parallel Computing
• Real-world Examples and Limitations
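
The formulation referenced above, in the usual notation, where p is the fraction of the program that can be parallelized and s is the speedup of that portion:

S = \frac{1}{(1 - p) + \frac{p}{s}}, \qquad \lim_{s \to \infty} S = \frac{1}{1 - p}

Even with an unlimited number of processors, the overall speedup is capped by the serial fraction 1 - p.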
Amdahl's Law
• Implications of Amdahl's Law on Parallel Computing
• Limit to speedup
• Diminishing returns
• Focus on bottlenecks
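
A small sketch to make "limit to speedup" and "diminishing returns" concrete, assuming for illustration a program that is 95% parallelizable (the 95% figure is not from the slides):

def amdahl_speedup(p, n):
    # Overall speedup when a fraction p of the work is spread over n processors
    return 1.0 / ((1.0 - p) + p / n)

if __name__ == "__main__":
    p = 0.95  # assumed parallel fraction, chosen only for illustration
    for n in (2, 4, 8, 64, 1024):
        print(f"{n:5d} processors -> {amdahl_speedup(p, n):5.2f}x")
    # The speedup approaches 1 / (1 - p) = 20x no matter how many processors
    # are added, so effort is better spent on the serial bottleneck.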
Amdahl's Law
• Scenario 1: Database Query Optimization
• Background: A software company has a database system that processes large datasets. Investigations reveal that 60% of the query time is spent sorting data, a task they believe can be significantly parallelized using better algorithms or more processors. The company is considering an upgrade that promises to make the sorting 4 times faster.
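
A worked calculation for this scenario using Amdahl's Law, with p = 0.6 and s = 4:

S = \frac{1}{(1 - 0.6) + \frac{0.6}{4}} = \frac{1}{0.40 + 0.15} = \frac{1}{0.55} \approx 1.82

A 4x faster sort therefore improves total query time by only about 82%, because the remaining 40% of the work is untouched.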
Amdahl's Law
• Scenario 2: Image Processing in a Graphics Application
• Background: A graphics software company has an application where 90% of the processing time is spent applying filters to images. With the emergence of GPUs and parallel processing techniques, they believe they can speed up this filter application process by 20 times with certain hardware upgrades.
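
Applying the same formula with p = 0.9 and s = 20:

S = \frac{1}{(1 - 0.9) + \frac{0.9}{20}} = \frac{1}{0.10 + 0.045} = \frac{1}{0.145} \approx 6.9

A 20x faster filter stage speeds up the whole application by only about 6.9x; the 10% serial portion caps the benefit at 10x even with an infinitely fast filter stage.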
Gustafson's Law
• Mathematical Formulation of Gustafson's Law (see the formula after this slide)
• Implications and Advantages of Gustafson's Law
• Scalability Optimism
• Reframing Parallelism
• Real-world Examples and Applications
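
The formulation referenced above, in the usual notation, where N is the number of processors and p the parallelizable fraction of the scaled workload:

S_{\text{scaled}} = N - (1 - p)(N - 1) = (1 - p) + pN

Unlike Amdahl's Law, which fixes the problem size and asks how much faster it finishes, Gustafson's Law fixes the time budget and asks how much more work can be done, so the speedup keeps growing with N.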
Gustafson's Law
• Scenario 1: Large-Scale Image Processing for Satellite Imagery
• Background: A space agency collects vast amounts of satellite images
daily. Initially, their computational system could process 1000 images
a day using a single processor. They believe that 90% of the
processing task can be parallelized, and they're contemplating using
10 processors to handle the increasing volume of data.
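
A worked calculation under the slide's assumptions, with p = 0.9 and N = 10:

S_{\text{scaled}} = 10 - (1 - 0.9)(10 - 1) = 10 - 0.9 = 9.1

In the same daily processing window, the agency could therefore handle roughly 9.1 times the workload, about 9,100 images per day instead of 1,000.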
Gustafson's Law
• Scenario 2: Small Business Inventory Management System
• Background: A local bookstore uses an inventory management
system to track stock, handle sales, and generate reports. They use a
single-processor server for their operations, and due to the increasing
volume of books and sales, they're considering an upgrade to
enhance performance. Currently, their system updates the inventory
and produces reports within 30 minutes. The IT team assessed that
only 40% of the task can be parallelized due to the sequential nature
of many operations like transaction handling and report generation.
• Proposed Upgrade: Introducing a server with 4 processors to speed
up the process.
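
A worked calculation under the stated assumptions, with p = 0.4 and N = 4:

S_{\text{scaled}} = 4 - (1 - 0.4)(4 - 1) = 4 - 1.8 = 2.2

With only 40% of the work parallelizable, four processors give a scaled speedup of just 2.2x; for the fixed 30-minute job, Amdahl's Law gives 1 / (0.6 + 0.4/4) ≈ 1.43x, roughly 21 minutes, so the benefit of the upgrade is modest either way.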
Conclusion & Takeaways
• Summary of Key Points Discussed
• Practical Implications of Parallel Computing
• Questions & Answers
