Short Notes: Analysis of Algorithms

ANALYSIS OF ALGORITHMS
As mentioned before, different algorithms can be devised to solve a single problem, and the most efficient algorithm is always desired. This efficiency can be in terms of the space required or the time required to solve the problem.
Analysis of algorithms is a technique to compare the relative efficiency of different algorithms. Since the same algorithm runs at different speeds on different computers, the time required to solve a problem cannot be expressed unambiguously as a number of seconds. Instead, it is measured by the number of elementary operations required to solve the problem.
In practice, even the exact number of operations is not used. Rather, the time required is expressed as a mathematical function of the input size, and two algorithms are compared on the basis of the rate of growth of that function. A higher rate of growth means the algorithm will take more time as the input size increases.
Representation of analysis
The mathematical function is generally described using big-O notation, which gives an asymptotic upper bound of the function.
Let the function be a polynomial
F(n) = a n^k + b n^(k-1) + ... + z
where n is the variable and a, b, ..., z are coefficients.
For large n, this function grows faster than any polynomial of degree less than k, but slower than any polynomial of degree greater than k. So, for sufficiently large n,
F(n) < n^(k+1), F(n) < n^(k+2), and so on.
Moreover, we can always find a constant c such that
F(n) <= c n^k
(for example, c can be taken as the sum of the absolute values of the coefficients, which works for all n >= 1). So n^k, n^(k+1) and so on all represent upper bounds on the function. In symbolic notation this is written as
F(n) = O(n^k)
F(n) = O(n^(k+1)) and so on.
Definition of Big-O
For two functions F(n) and G(n), if there exist positive constants c and n0 such that
0 <= F(n) <= c G(n) for all n >= n0,
then F(n) = O(G(n)).
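As a minimal sketch (not part of the original note), the following C program checks this definition numerically for the hypothetical polynomial F(n) = 3n^2 + 5n + 7, using the witnesses c = 15 and n0 = 1 obtained by summing the coefficients as described above. Every tested n should print OK, supporting F(n) = O(n^2).

#include <stdio.h>

/* Hypothetical example function F(n) = 3n^2 + 5n + 7, chosen only for illustration. */
static long long F(long long n) { return 3*n*n + 5*n + 7; }

int main(void) {
    const long long c = 15, n0 = 1;   /* assumed witnesses: 3n^2 + 5n + 7 <= 15n^2 for n >= 1 */
    for (long long n = n0; n <= 1000000; n *= 10) {
        long long bound = c * n * n;
        printf("n=%-8lld F(n)=%-16lld c*n^2=%-16lld %s\n",
               n, F(n), bound, F(n) <= bound ? "OK" : "VIOLATED");
    }
    return 0;
}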
Other representations
> Big-Omega (Ω):
o This gives an asymptotic lower bound of a function.
o Informally, it is the minimum time an algorithm will take to solve a problem.
o For two functions F(n) and G(n), if there exist positive constants c and n0 such that
0 <= c G(n) <= F(n) for all n >= n0,
then F(n) = Ω(G(n)).
> Big-Theta (Θ):
o This gives an asymptotically tight bound of a function (both an upper and a lower bound).
o For two functions F(n) and G(n), if there exist two positive constants c1 and c2, and a constant n0, such that
0 <= c1 G(n) <= F(n) <= c2 G(n) for all n >= n0,
then F(n) = Θ(G(n)). A small numeric check of this condition is sketched just after this list.
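As a rough numeric sketch (the constants c1 = 0.25, c2 = 0.5 and n0 = 2 are assumptions chosen for this illustration), the C program below checks that T(n) = n(n-1)/2, the comparison count that appears in the sorting example at the end of these notes, stays between c1*n^2 and c2*n^2, which is exactly the Θ(n^2) condition.

#include <stdio.h>

int main(void) {
    /* Claim: 0.25*n^2 <= n(n-1)/2 <= 0.5*n^2 for all n >= 2, i.e. T(n) = Theta(n^2). */
    const double c1 = 0.25, c2 = 0.5;
    for (long long n = 2; n <= 200000; n *= 10) {
        double T  = (double)n * (double)(n - 1) / 2.0;
        double lo = c1 * (double)n * (double)n;
        double hi = c2 * (double)n * (double)n;
        printf("n=%-7lld c1*n^2=%-14.0f T(n)=%-14.0f c2*n^2=%-14.0f %s\n",
               n, lo, T, hi, (lo <= T && T <= hi) ? "OK" : "VIOLATED");
    }
    return 0;
}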
Comparison of speed of two algorithms (an example)
Let there be two computers, A and B. Computer A can execute 10^10 instructions per second, while B can execute only 10^7 instructions per second. Let there be two algorithms to solve a given problem, one of time complexity 2n^2 and the other of 50 n log n (log taken to base 2). Let the first algorithm run on the faster computer A and the other on computer B.
When the input size is n = 10^7,
Time required by computer A = 2 x (10^7)^2 / 10^10
= 20,000 seconds (about 5.5 hours)
Time required by computer B = 50 x 10^7 x log(10^7) / 10^7
≈ 1163 seconds (less than 20 minutes)
Even though computer A is about 1000 times faster than computer B, it takes far more time because it runs the algorithm of higher complexity.
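The arithmetic above can be reproduced with a short C sketch (it assumes, as the 1163-second figure implies, that log means log base 2; link with the math library, e.g. -lm).

#include <stdio.h>
#include <math.h>

int main(void) {
    double n = 1e7;                               /* input size                           */
    double speedA = 1e10, speedB = 1e7;           /* instructions per second for A and B  */
    double timeA = 2.0 * n * n / speedA;          /* 2n^2 algorithm on the fast machine   */
    double timeB = 50.0 * n * log2(n) / speedB;   /* 50 n log n on the slow machine       */
    printf("Computer A: about %.0f seconds (%.1f hours)\n",  timeA, timeA / 3600.0);
    printf("Computer B: about %.0f seconds (%.1f minutes)\n", timeB, timeB / 60.0);
    return 0;
}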
Cases considered in analysis
> Best case:
o This is the analysis for the input that is most favourable to the algorithm.
o For example, in the sorting problem, the analysis done when the list is already sorted is called best-case analysis.
o This is usually not of much interest.
> Worst case:
o Worst-case analysis gives the longest running time of an algorithm over all inputs of a particular size.
o For many simple sorting algorithms, the worst case is when the numbers are sorted in reverse order.
o This gives an upper bound on the running time of the algorithm (see the comparison-counting sketch after this list).
> Average case:
o It is the general case (neither best nor worst), typically taken over a random input.
o In practice, it is often roughly as bad as the worst case.
o It also occurs fairly often, since typical inputs are neither best-case nor worst-case.
o In the sorting example, sorting a random list falls under this category.
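To make the best-case/worst-case distinction concrete, here is a small C sketch that counts key comparisons made by insertion sort; insertion sort is not defined in these notes and is used only as an illustration. On an already-sorted list it performs about n comparisons (best case), while on a reverse-sorted list it performs n(n-1)/2 comparisons (worst case).

#include <stdio.h>

/* Insertion sort that also counts key comparisons (illustrative only). */
static long long insertion_sort(int arr[], int n) {
    long long comparisons = 0;
    for (int i = 1; i < n; i++) {
        int key = arr[i], j = i - 1;
        while (j >= 0) {
            comparisons++;                 /* one comparison of key with arr[j] */
            if (arr[j] <= key) break;
            arr[j + 1] = arr[j];
            j--;
        }
        arr[j + 1] = key;
    }
    return comparisons;
}

int main(void) {
    enum { N = 1000 };
    int sorted[N], reversed[N];
    for (int i = 0; i < N; i++) {
        sorted[i] = i;                     /* best case: already sorted   */
        reversed[i] = N - i;               /* worst case: reverse sorted  */
    }
    printf("best case  (sorted input):  %lld comparisons\n", insertion_sort(sorted, N));
    printf("worst case (reverse input): %lld comparisons\n", insertion_sort(reversed, N));
    return 0;
}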
An example of finding time complexity
Consider the following code snippet for sorting an array (swap exchanges the two array elements):
for (i = 0; i < n; i++)
    for (j = i + 1; j < n; j++)
        if (arr[i] > arr[j])          /* compare arr[i] with every later element  */
            swap(arr[i], arr[j]);     /* put the smaller of the two at position i */
Here the first element is compared with the remaining n-1 elements, the second element with the remaining n-2 elements, and so on.
So the total number of comparisons is
(n-1) + (n-2) + ... + 1 = n(n-1)/2
This is a polynomial of degree 2, so the algorithm is said to be O(n^2).
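As a quick check (a sketch; the array contents are irrelevant here because the number of comparisons in this snippet does not depend on the data), the following C program counts how many times the if-condition executes and compares it with n(n-1)/2.

#include <stdio.h>

int main(void) {
    int n = 100;
    long long comparisons = 0;
    for (int i = 0; i < n; i++)            /* same loop structure as the snippet above */
        for (int j = i + 1; j < n; j++)
            comparisons++;                 /* one comparison per inner iteration       */
    printf("counted = %lld, n(n-1)/2 = %lld\n",
           comparisons, (long long)n * (n - 1) / 2);
    return 0;
}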