
Algorithm design develops a mathematical process to solve a problem.


Algorithm analysis predicts the performance of an algorithm.
Big O notation gives a function that measures an algorithm's time
complexity based on the input size.
It is hard to compare algorithms by how fast they execute, because raw
speed depends on the computer and on the specific input. To make it
easier to compare algorithms, theories were developed to analyze
algorithms independently of any particular computer or input.
This approach showed that the input size affects the algorithm's
execution time: as the input size increases, the execution time
increases. This relationship is the algorithm's growth rate. How much
more does the execution time increase when the input size increases?
For example, when searching for a key value in an array, the time
depends on the size of the array; the larger the array, the longer the
search takes. Such an algorithm grows at a linear rate. The growth
rate is a magnitude of n, where n represents the size of the array.
Magnitude means multiples or divisions of n, for example n/2 or 2n:
n/2 can mean half the array size and 2n can mean double the array
size.
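The linear search described above can be sketched in a few lines of Python (the function name and sample values are just for illustration). Each element is compared to the key in turn, so the number of comparisons grows in step with the array size:

```python
def linear_search(arr, key):
    """Scan the array from front to back, comparing each element to key."""
    for i, value in enumerate(arr):
        if value == key:
            return i  # index where the key was found
    return -1  # key is not in the array

# Doubling the array size roughly doubles the comparisons needed
# when the key is near the end or missing.
print(linear_search([4, 8, 15, 16, 23, 42], 23))  # → 4
print(linear_search([4, 8, 15, 16, 23, 42], 99))  # → -1
```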
Big O notation represents this magnitude; the O stands for "order."
The notation for the complexity of the linear search algorithm is
O(n), read "order of n." O(n) represents a linear growth rate.
Algorithm execution time can still depend on the particular input, not
just its size. An input that yields a short execution time is a
best-case input; an input that yields a long execution time is a
worst-case input. Worst-case analysis is very useful because no input
can cause a slower execution time than the worst case. Average-case
analysis finds the average time the algorithm takes over a range of
different inputs. Average-case analysis is ideal but hard to perform,
because it is hard to find the relative probabilities and
distributions of the various inputs. Worst-case analysis is easier to
perform, so it is the analysis usually used.
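Best and worst cases are easy to see by counting comparisons in a linear search. This sketch (the function name and data are illustrative) shows the key at the front (best case, one comparison) versus a missing key (worst case, n comparisons):

```python
def linear_search_count(arr, key):
    """Linear search that also reports how many comparisons it made."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == key:
            return i, comparisons
    return -1, comparisons

data = list(range(1000))
# Best case: the key sits at index 0, so one comparison is enough.
print(linear_search_count(data, 0))    # → (0, 1)
# Worst case: the key is missing, so all n elements are compared.
print(linear_search_count(data, -5))   # → (-1, 1000)
```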
Algorithm analysis focuses on growth rates. The growth rates of n/2
and 100n both equal that of n, so O(n) = O(n/2) = O(100n). Algorithm
analysis is aimed at large input sizes. When the execution time is not
related to the input size, the execution time is called constant time.
This means that when looking up a value at a given index, or by a
known key, the amount of time it takes is constant: it is the same for
every value being searched for, and the time doesn't grow if the input
size increases.
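Constant time is easy to see with indexed and keyed lookups in Python (the variable names here are just for illustration). Accessing one slot touches the same amount of work whether the collection is tiny or huge:

```python
small_list = list(range(10))
big_list = list(range(1_000_000))

# Accessing a given index is a single step either way: O(1).
print(small_list[7])  # → 7
print(big_list[7])    # → 7

# The same holds for looking up a value by a known key in a dictionary.
ages = {"ada": 36, "alan": 41}
print(ages["ada"])    # → 36
```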
Time complexity is the use of Big O notation to measure execution
time.
Multiplicative constants have no effect on the Big O notation.

Non-dominating terms, like the -1 in n - 1, are ignored because as n
starts to grow, that -1 becomes a smaller and smaller factor in the
expression. The n term is the head honcho.
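A quick numeric check shows why the -1 stops mattering: the ratio of n - 1 to n approaches 1 as n grows, so both expressions have the same growth rate.

```python
# As n grows, (n - 1) / n gets arbitrarily close to 1,
# so the -1 term has a vanishing effect on the growth rate.
for n in [10, 1_000, 1_000_000]:
    print(n, (n - 1) / n)  # → 0.9, then 0.999, then 0.999999
```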
