

Computer science

Student's name

Institutional affiliation

Professor's name

Course

Date

An Analytical Study of Amdahl's and Gustafson's Law

This article, "An Analytical Study of Amdahl's and Gustafson's Law," outlines parallel computing and highlights the importance of Amdahl's and Gustafson's Laws within it. The two laws are compared and contrasted using several examples from parallel computing. The conclusion offers suggestions for further work that could improve the performance metric.

Parallel computing is a computing model in which multiple processes are carried out simultaneously. Its main objectives are to increase calculation speed, reduce costs, and overcome the limitations of serial computing. Although the two laws are distinct, both describe the limits on how much parallelism can speed up a processor's work. One performance metric used in parallel computing is speedup: the ratio of the time required by a single-processor system to the time required by a parallel processing system, which indicates how much a sequential program can benefit from parallelization (Murray, 2022).
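The speedup metric described above is simply the single-processor time divided by the parallel time. A minimal sketch (the timings below are made-up illustrative values, not measurements):

```python
def speedup(serial_time, parallel_time):
    """Speedup: time on one processor divided by time on P processors."""
    return serial_time / parallel_time

# Hypothetical timings: 120 s on one processor, 20 s on a parallel system.
print(speedup(120.0, 20.0))  # 6.0
```

A speedup of 6.0 means the parallel system finished the same job six times faster than the single processor.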

In general, parallel processing refers to the division of a job between at least two processors. A computer scientist uses specialized software created for the job to break a complicated problem down into component parts and then assigns each part to a specific processor. Each processor completes its portion of the overall computing task, and the program reassembles the partial results to solve the original problem. Put simply, splitting the workload makes the job more manageable. The load might be distributed among many processors housed in the same computer, or among separate computers connected through a network (HowStuffWorks, 2022).
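The split-assign-reassemble pattern described above can be sketched with Python's standard library. This is an illustrative sketch only: the chunking scheme, the worker count of four, and the use of threads (rather than separate processes or networked machines, which true CPU-bound parallelism would require) are all simplifying assumptions.

```python
from concurrent.futures import ThreadPoolExecutor

def part_sum(chunk):
    # Each worker completes its assigned portion of the task.
    return sum(chunk)

def split_and_sum(data, workers=4):
    """Break a job into parts, hand each part to a worker, then
    reassemble the partial results into the final answer."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(part_sum, chunks))

print(split_and_sum(list(range(1000))))  # 499500
```

The same structure carries over to process pools or networked clusters; only the mechanism for dispatching chunks changes.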



Amdahl's Law states that in a program with parallel processing, the relatively few instructions that must be completed in sequence limit the program's speedup, so adding more processors may not make the program run faster. This is an argument against parallel processing for specific applications and, more broadly, against exaggerated claims for parallel computing (Scott, 2022). Gustafson's Law, in computer architecture, gives the potential speedup in the execution time of a job that benefits from parallel computing, using a hypothetical run of the same work on a single-core machine as the baseline. In other words, it measures the theoretical "slowdown" of an already-parallelized task if it were carried out on a serial system (Stoker, 2022).
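The two laws can be written as formulas. With serial fraction s (so the parallel fraction is 1 − s) and N processors, Amdahl's Law gives S(N) = 1 / (s + (1 − s)/N), while Gustafson's Law gives S(N) = s + (1 − s)·N. A small sketch (the 10% serial fraction and 8 processors are illustrative values):

```python
def amdahl_speedup(serial_fraction, n):
    """Amdahl: fixed workload; the serial part caps the speedup."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n)

def gustafson_speedup(serial_fraction, n):
    """Gustafson: workload scales with n; speedup grows nearly linearly."""
    return serial_fraction + (1.0 - serial_fraction) * n

# With 10% serial work on 8 processors:
print(round(amdahl_speedup(0.1, 8), 2))  # ≈ 4.71
print(round(gustafson_speedup(0.1, 8), 2))  # ≈ 7.3
```

The gap between the two results for the same inputs reflects their different baselines: Amdahl holds the workload fixed, while Gustafson lets it grow with the processor count.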

Amdahl's Law applies only when the problem size is fixed. In practice, as computing power increases, larger problems (larger datasets) tend to be tackled, so the time spent on the parallelizable portion grows much more quickly than the inherently serial work. In this situation, Gustafson's Law provides a less pessimistic and more realistic evaluation of parallel performance (Murray, 2022).
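The fixed-size assumption behind Amdahl's Law implies a hard ceiling: as the processor count grows without bound, the speedup approaches 1/s, where s is the serial fraction. A quick numerical check (s = 0.1 is an illustrative value):

```python
def amdahl(s, n):
    """Amdahl's Law speedup for serial fraction s on n processors."""
    return 1.0 / (s + (1.0 - s) / n)

# With a 10% serial fraction, even huge processor counts only approach
# the ceiling of 1/0.1 = 10.
for n in (10, 100, 1000, 10_000):
    print(n, round(amdahl(0.1, n), 2))
```

The printed speedups climb from about 5.3 toward 10 but never reach it, which is exactly the pessimism the paragraph above attributes to Amdahl's Law for fixed-size problems.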

Gustafson's formulation specifies a new serial percentage of the overall processing time with P processors; it is therefore dependent on P. This P-dependent serial percentage is easier to measure in computer experiments than the fraction in Amdahl's formulation. On the other hand, because it contains a P-dependent variable, Gustafson's formulation is harder to use to directly quantify the impact of P on speedup (Murray, 2022).
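Because Gustafson's serial fraction is measured on the P-processor run, it can be related to the fixed-workload fraction that Amdahl's formulation uses: if α is the serial share of the parallel run, then the serial share of the equivalent single-processor run works out to α / (α + (1 − α)·P). A sketch of that conversion (α = 0.1 and P = 8 are illustrative values, not measurements):

```python
def amdahl_fraction_from_gustafson(alpha, p):
    """Convert the serial fraction measured on a p-processor run (alpha)
    into the serial fraction of the equivalent single-processor run."""
    return alpha / (alpha + (1.0 - alpha) * p)

# A 10% serial share observed on 8 processors corresponds to a much
# smaller share of the total single-processor work:
print(round(amdahl_fraction_from_gustafson(0.1, 8), 4))  # ≈ 0.0137
```

This illustrates why the P-dependent fraction is easier to measure: it is observed directly on the machine at hand, while Amdahl's fraction refers to a single-processor run that may never actually be executed.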

In conclusion, Gustafson's Law is more applicable when parallelizing an algorithm whose workload can grow to match the amount of parallelism available. Amdahl's Law is more suitable when the quantity of computation is fixed and cannot be altered by parallelization; it is most informative when there are few processors or when the task is nearly perfectly parallel. Depending on how the workload scales, Gustafson's Law and Amdahl's Law can be used as upper and lower bounds on the expected speedup.
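The bounding claim can be checked numerically: for the same serial fraction and processor count, Amdahl's formula never exceeds Gustafson's. A brief sketch (the sampled fractions and processor counts are arbitrary illustrative values):

```python
def amdahl(s, n):
    return 1.0 / (s + (1.0 - s) / n)

def gustafson(s, n):
    return s + (1.0 - s) * n

# Amdahl acts as the lower bound and Gustafson as the upper bound
# on the expected speedup for every sampled case.
for s in (0.05, 0.25, 0.5):
    for n in (2, 16, 128):
        assert amdahl(s, n) <= gustafson(s, n)
print("Amdahl <= Gustafson for all sampled cases")
```

The real speedup of a given workload typically falls between the two, depending on how much the problem size grows with the processor count.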



References

HowStuffWorks. (2022). How parallel processing works. Retrieved 26 August 2022, from https://computer.howstuffworks.com/parallel-processing.htm

Murray, B. (2022). What is parallel processing? Definition, types, and examples. Spiceworks. Retrieved 26 August 2022, from https://www.spiceworks.com/tech/iot/articles/what-is-parallel-processing/

Scott, M. (2022). What is Amdahl's law? WhatIs.com. Retrieved 26 August 2022, from https://www.techtarget.com/whatis/definition/Amdahls-law

Stoker, B. (2022). Gustafson's law. DBpedia. Retrieved 26 August 2022, from https://dbpedia.org/page/Gustafson%27s_law
