Short Notes: Space and Time Complexity

Space and Time Complexity  
I. Asymptotic Notations:
The notations we use to describe the asymptotic (approximate) running time of an algorithm are defined in terms of functions whose domains are the set of natural numbers N = {0, 1, 2, ...}. The most useful asymptotic notations are described below.
Big Oh (O):
• If we write f(n) = O(g(n)), then there exist a positive constant c and a positive integer n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀.
OR
• f(n) = O(g(n)) means that g(n) is an asymptotic upper bound for f(n).
• Example-1:
Let f(n) = n² + n + 5. Then f(n) is O(n²) (spot-checked in the sketch below), and f(n) is O(n³), but f(n) is not O(n).
• Example-2:
Let f(n) = 3ⁿ. Then f(n) is O(4ⁿ), but f(n) is not O(2ⁿ).
Note: O(1) refers to constant time, O(n) indicates linear time, O(nᵏ) (k fixed) refers to polynomial time, O(log n) is called logarithmic time, O(2ⁿ) refers to exponential time, and so on. These classes grow in the order
O(1) < O(log n) < O(n) < O(n²) < O(2ⁿ)
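As a quick sanity check of Example-1, here is a minimal Python sketch that spot-checks f(n) ≤ c·g(n) for one assumed choice of witnesses, c = 7 and n₀ = 1 (any constants satisfying the definition would do):

# Spot-check the Big-Oh witnesses c = 7, n0 = 1 for f(n) = n^2 + n + 5 = O(n^2).
# Algebraically: for n >= 1, n <= n^2 and 5 <= 5*n^2, so f(n) <= 7*n^2.
def f(n):
    return n * n + n + 5

def g(n):
    return n * n

c, n0 = 7, 1
assert all(f(n) <= c * g(n) for n in range(n0, 100000))

A finite check like this can only falsify a bound, never prove it; the proof is the algebraic observation in the comment.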
Omega (Ω):
• If we write f(n) = Ω(g(n)), then there exist a positive constant c and a positive integer n₀ such that f(n) ≥ c·g(n) for all n ≥ n₀. Or
• f(n) = Ω(g(n)) means that g(n) is an asymptotic lower bound for f(n).
• Example-1: Let f(n) = 2n² + 4n + 10. Then f(n) is Ω(n²).
Theta (Θ):
• If we write f(n) = Θ(g(n)), then there exist positive constants c₁ and c₂ and a positive integer n₀ such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀. Or
• f(n) = Θ(g(n)) means that g(n) is an asymptotically tight bound for f(n).
• Example-1:
Let f(n) = 2n² + 4n + 10. Then f(n) is Θ(n²).
Small Oh (o):
• If we write f(n) = o(g(n)), then for every positive constant c there exists a positive integer n₀ such that f(n) < c·g(n) for all n ≥ n₀. Or
• f(n) = o(g(n)) means that g(n) is an upper bound for f(n) that is not asymptotically tight.
• Example: n^1.99 = o(n²)
Small Omega (ω):
• If we write f(n) = ω(g(n)), then for every positive constant c there exists a positive integer n₀ such that f(n) > c·g(n) for all n ≥ n₀. Or
• f(n) = ω(g(n)) means that g(n) is a lower bound for f(n) that is not asymptotically tight.
• Example: n^2.00001 = ω(n²), but n² ≠ ω(n²).
II. The relationship between asymptotic notations:
f(n) ∈ O(g(n)) if and only if g(n) ∈ Ω(f(n))
f(n) ∈ o(g(n)) if and only if g(n) ∈ ω(f(n))
f(n) ∈ Θ(g(n)) if and only if f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n))
f(n) ∈ o(g(n)) implies f(n) ∈ O(g(n))
f(n) ∈ ω(g(n)) implies f(n) ∈ Ω(g(n))
f(n) ∈ O(g(n)) implies f(n) ∉ ω(g(n))
f(n) ∈ Ω(g(n)) implies f(n) ∉ o(g(n))
f(n) ~ g(n) implies f(n) ∈ Θ(g(n))
f(n) ~ g(n) is equivalent to f(n) ∈ g(n) + o(g(n))
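When the limit of f(n)/g(n) as n → ∞ exists, these relationships can be checked mechanically. The sketch below assumes the SymPy library is available and uses its limit function to classify f against g:

# Classify f vs. g by the limit of f/g as n -> infinity (valid when the limit exists):
#   limit = 0             -> f in o(g), hence also O(g)
#   0 < limit < infinity  -> f in Theta(g)
#   limit = infinity      -> f in omega(g), hence also Omega(g)
from sympy import symbols, limit, oo

n = symbols('n', positive=True)
f = 2*n**2 + 4*n + 10
g = n**2

print(limit(f / g, n, oo))  # prints 2: finite and nonzero, so f is Theta(n^2)

The limit test is sufficient but not necessary: f(n) = (2 + sin n)·n is Θ(n) even though f(n)/n has no limit.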
III. Properties of Asymptotic Notations
1. Reflexive Property:
f(n) ∈ O(f(n))
f(n) ∈ Ω(f(n))
f(n) ∈ Θ(f(n))
f(n) ~ f(n)
2. Symmetric Property:
f(n) ∈ Θ(g(n)) is equivalent to g(n) ∈ Θ(f(n))
f(n) ~ g(n) is equivalent to g(n) ~ f(n)
f(n) ∈ o(g(n)) is equivalent to g(n) ∈ ω(f(n)) (transpose symmetry)
f(n) ∈ ω(g(n)) is equivalent to g(n) ∈ o(f(n)) (transpose symmetry)
3. Transitive Property:
If f(n) ∈ O(g(n)) and g(n) ∈ O(h(n)), then f(n) ∈ O(h(n))
If f(n) ∈ Ω(g(n)) and g(n) ∈ Ω(h(n)), then f(n) ∈ Ω(h(n))
If f(n) ∈ Θ(g(n)) and g(n) ∈ Θ(h(n)), then f(n) ∈ Θ(h(n))
If f(n) ∈ o(g(n)) and g(n) ∈ o(h(n)), then f(n) ∈ o(h(n))
If f(n) ∈ ω(g(n)) and g(n) ∈ ω(h(n)), then f(n) ∈ ω(h(n))
If f(n) ~ g(n) and g(n) ~ h(n), then f(n) ~ h(n)
4. Mixed Asymptotic Transitive Properties:
If f(n) ∈ O(g(n)) and g(n) ∈ Θ(h(n)), then f(n) ∈ O(h(n))
If f(n) ∈ Ω(g(n)) and g(n) ∈ Θ(h(n)), then f(n) ∈ Ω(h(n))
If f(n) ∈ Θ(g(n)) and g(n) ∈ o(h(n)), then f(n) ∈ o(h(n))
If f(n) ∈ Θ(g(n)) and g(n) ∈ ω(h(n)), then f(n) ∈ ω(h(n))
If f(n) ∈ O(g(n)) and g(n) ∈ o(h(n)), then f(n) ∈ o(h(n))
If f(n) ∈ Ω(g(n)) and g(n) ∈ ω(h(n)), then f(n) ∈ ω(h(n))
IV. Analysis of Algorithms
The complexity of an algorithm is a function describing the efficiency of the 
algorithm in terms of the amount of data the algorithm must process. Usually there 
are natural units for the domain and range of this function.
• Algorithms can be classified by the amount of time they need to complete relative to their input size.
• The analysis of an algorithm focuses on its complexity, which is measured in terms of time or space.
There are two main complexity measures of the efficiency of an algorithm:
1. Time Complexity:
The time complexity is a function that gives the amount of time required by an 
algorithm to run to completion.
• Worst case time complexity: The function defined by the maximum amount of time needed by an algorithm for an input of size n (see the linear-search sketch after this list).
• Average case time complexity: The average-case running time of an algorithm 
is an estimate of the running time for an "average" input. Computation of 
average-case running time entails knowing all possible input sequences, the 
probability distribution of occurrence of these sequences, and the running 
times for the individual sequences.
• Amortized running time: The time required to perform a sequence of (related) operations, averaged over all the operations performed. Amortized analysis guarantees the average performance of each operation in the worst case.
• Best case time complexity: It is the minimum amount of time that an 
algorithm requires for an input of size n.
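As a small illustration of best versus worst case, consider linear search; the sketch below is a minimal example written for these notes:

# Linear search: best case O(1) (target at index 0),
# worst case O(n) (target absent or at the last index).
def linear_search(items, target):
    for i, x in enumerate(items):
        if x == target:
            return i   # found after i + 1 comparisons
    return -1          # not found after len(items) comparisons

data = [7, 3, 9, 1, 4]
print(linear_search(data, 7))  # best case: 1 comparison
print(linear_search(data, 8))  # worst case: 5 comparisons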
2. Space Complexity:
The space complexity is a function that gives the amount of space required by an 
algorithm to run to completion.
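For example, the two hypothetical functions below compute the same sum with different space costs: the iterative version uses O(1) auxiliary space, while the recursive version uses O(n) space for its call stack:

# Same result, different auxiliary space.
def sum_iterative(n):
    total = 0                 # O(1) extra space: one accumulator
    for i in range(1, n + 1):
        total += i
    return total

def sum_recursive(n):
    if n == 0:                # O(n) extra space: one stack frame per level
        return 0
    return n + sum_recursive(n - 1)

For large n the recursive version hits Python's default recursion limit, which is itself a symptom of its O(n) space cost.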
V. What is a Recurrence Relation?
A recurrence is a function defined in terms of one or more base cases and of itself with smaller arguments.
Example:
T(n) = 1            if n = 1,
T(n) = T(n-1) + 1   if n > 1.
This recurrence solves to T(n) = n, so asymptotically T(n) = Θ(n).
In algorithm analysis, we usually express both the recurrence and its solution using 
asymptotic notation.
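A quick way to gain confidence in such a closed form is to evaluate the recurrence directly; the minimal sketch below compares T(n) with the claimed solution n:

# Evaluate T(n) = T(n-1) + 1 with T(1) = 1, unrolled iteratively,
# and compare with the closed form T(n) = n.
def T(n):
    t = 1                      # base case T(1) = 1
    for _ in range(2, n + 1):  # each level adds 1
        t += 1
    return t

assert all(T(n) == n for n in range(1, 200))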
Methods to Solve the Recurrence Relation
There are three main methods for solving a recurrence relation: the master method, the substitution method, and the recursion-tree method.
1. Master Method:
The master method gives us a quick way to find solutions to recurrence relations of the form T(n) = aT(n/b) + f(n), where a and b are constants with a ≥ 1 and b > 1.
• T(n) = aT(n/b) + f(n), where f(n) ∈ Θ(nᵈ), d ≥ 0
° Case-1: If a < bᵈ, then T(n) ∈ Θ(nᵈ)
° Case-2: If a = bᵈ, then T(n) ∈ Θ(nᵈ log n)
° Case-3: If a > bᵈ, then T(n) ∈ Θ(n^(log_b a))
• Examples:
° T(n) = 4T(n/2) + n => T(n) ∈ Θ(n²)
° T(n) = 4T(n/2) + n² => T(n) ∈ Θ(n² log n)
° T(n) = 4T(n/2) + n³ => T(n) ∈ Θ(n³)
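The case analysis above is mechanical enough to script. The sketch below (the helper name solve_master is an assumption of this illustration) compares a with bᵈ and prints the resulting bound:

import math

# Simplified master theorem for T(n) = a*T(n/b) + Theta(n^d), with a >= 1, b > 1, d >= 0.
def solve_master(a, b, d):
    if a < b ** d:
        return f"Theta(n^{d})"
    if a == b ** d:
        return f"Theta(n^{d} log n)"
    return f"Theta(n^{math.log(a, b):g})"  # exponent is log_b(a)

print(solve_master(4, 2, 1))  # Theta(n^2)
print(solve_master(4, 2, 2))  # Theta(n^2 log n)
print(solve_master(4, 2, 3))  # Theta(n^3)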
2. Iterative Substitution Method:
The recurrence equation is repeatedly substituted into itself to find its final, generalized form.
T(N) = 2T(N/2) + bN
Here T(N/2) is substituted with 2T((N/2)/2) + b(N/2):
T(N) = 4T(N/4) + 2bN (similarly substitute for T(N/4))
T(N) = 8T(N/8) + 3bN
After (i-1) substitutions,
T(N) = 2ⁱT(N/2ⁱ) + ibN
When i = log₂N, N/2ⁱ = 1 and we reach the base case:
T(N) = 2^(log₂N)·T(1) + bN·log₂N = N·T(1) + bN·log₂N, so T(N) ∈ Θ(N log N).
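Assuming T(1) = 1 and b = 1 for concreteness, the sketch below evaluates the recurrence for powers of two and checks it against the closed form N·T(1) + bN·log₂N derived above:

# T(N) = 2*T(N//2) + N for N > 1, with T(1) = 1 (taking b = 1).
def T(N):
    if N == 1:
        return 1
    return 2 * T(N // 2) + N

for k in range(1, 11):        # N = 2, 4, ..., 1024
    N = 2 ** k
    assert T(N) == N + N * k  # closed form with log2(N) = k

This agrees with the master method: a = 2, b = 2, d = 1 gives a = bᵈ, hence T(N) ∈ Θ(N log N).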