DSA Module1
• Multiple algorithms are available for solving the same problem (for
example, a sorting problem can be solved with many algorithms, like
insertion sort, selection sort, quick sort, and many more).
Notation    Bound                       Time/Space                          Scenario
Omega (Ω)   Lower bound (best case)     The minimum time/space complexity   Best-case scenario
Theta (Θ)   Tight bound (exact bound)   The exact time/space complexity     Average behavior (tight bound)
Big Oh (O)  Upper bound (worst case)    The maximum time/space complexity   Worst-case scenario
Why is analyzing the time complexity so important?
Asymptotic Notations
Best case: defines the input for which the algorithm takes the
least time, i.e., the input for which the algorithm runs the
fastest.
[Figure: linear search over the array 4 5 1 7 2 9 3 8.
Best case: we want to search for num = 4, which sits at the very first index.
Worst case: we want to search for num = 6, which is not present, so every element is examined.]
Best case analysis of an algorithm
• The best-case time or space complexity is often represented in
terms of the Big Omega (Ω) notation.
• It provides the lower bound on the algorithm's running time for
any given input: any program will need at least (greater than or
equal to) that time to run.
• Example: suppose we have an algorithm whose best-case time
complexity is Ω(N). Then we can say that the program will take a
minimum of Ω(N) time to run; it can never take less than that.

• Searching an element in the array:

int linear_search(int arr[], int l, int target) {
    int i;
    for (i = 0; i < l; i++) {
        if (arr[i] == target) {
            return arr[i];
        }
    }
    return -1;
}

If the number you are searching for is present at the very first
index of the array, the algorithm takes Ω(1), i.e., constant time,
to find the number: this is the best case.
Worst case analysis
• The worst-case time or space complexity is often represented in
terms of the Big Oh (O) notation.
• It provides the upper bound on the running time of the algorithm
for any given input.
• It states that any program will need at most (less than or equal
to) that time to run.
• Example: searching for an element at the end of the array.
(Checking the guess T(n) ≤ c·n·log n for T(n) = 2T(⌊n/2⌋) + n:)

T(n) ≤ 2c⌊n/2⌋ log ⌊n/2⌋ + n
     ≤ cn log(n/2) + n
     = cn log n − cn + n
     ≤ cn log n          (whenever c ≥ 1)

so the induction case is true.
Substitution Method
(if we know the answer)
Q. How did we know the values of c and n0?
A. If the induction is to work, the induction case must be
correct ⇒ c ≥ 1.
Then, we find that by setting c = 2, our guess is correct as
soon as n0 = 2.
Alternatively, we can also use c = 1.3; then we just need a
larger n0 = 4.
Substitution Method
(New Challenge 1)
How to solve this?
T(n) = 2T(√n) + log n
Substitution Method
(New Challenge 1)
Set m = log n (so n = 2^m); we get
T(2^m) = 2T(2^(m/2)) + m
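The slides break off at this point; as a hedged completion under the standard treatment of this recurrence, writing S(m) = T(2^m) reduces it to a form we have already solved:

```latex
S(m) := T(2^{m}) = 2\,S(m/2) + m
\ \Longrightarrow\ S(m) = O(m \log m)
\ \Longrightarrow\ T(n) = S(\log n) = O(\log n \cdot \log\log n).
```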
Recursion Tree Method
( Nothing Special… Very Useful ! )
Expanding the terms, we get:
T(n) = n^2 + 2T(n/2)
     = n^2 + 2n^2/4 + 4T(n/4)
     = n^2 + 2n^2/4 + 4n^2/16 + 8T(n/8)
     = …
     = Σ_{k=0}^{log n − 1} (1/2)^k n^2 + 2^{log n} T(1)
Recursion Tree Method
( Recursion Tree View )
[Figure: recursion tree for T(n) = n^2 + 2T(n/2). Further expressing
gives the per-level costs; the n^2/2 term at the second level
is the one that comes from T(n/2).]
Recursion Tree Method
( New Challenge )
How to solve this?
T(n) = T(n/3) + T(2n/3) + 1, with T(1) = 1
What will the recursion tree view be then?

The corresponding recursion tree view is:
[Figure: recursion tree for T(n) = T(n/3) + T(2n/3) + 1, with child
subproblems of sizes n/3 and 2n/3.]
Master Method
( Save our effort )
Master Theorem
Let T(n) = aT(n/b) + f(n),
where a ≥ 1 and b > 1 are constants.
Theorem: (Case 1: Very small f(n))
If f(n) = O(n^(log_b a − ε)) for some constant ε > 0,
then T(n) = Θ(n^(log_b a)).

Theorem: (Case 2: Moderate f(n))
If f(n) = Θ(n^(log_b a)),
then T(n) = Θ(n^(log_b a) · log n).

Theorem: (Case 3: Very large f(n))
If (i) f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0,
and (ii) a·f(n/b) ≤ c·f(n) for some constant c < 1 and all
sufficiently large n,
then T(n) = Θ(f(n)).
Master Theorem (Exercises)
1. Solve T(n) = 9T(n/3) + n
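As a worked sketch for Exercise 1 (using the very-small-f(n) case of the standard Master Theorem statement):

```latex
T(n) = 9\,T(n/3) + n:\quad a = 9,\ b = 3,\ f(n) = n,\qquad
n^{\log_b a} = n^{\log_3 9} = n^{2}.\\
f(n) = n = O\!\left(n^{2-\varepsilon}\right)\ \text{for}\ \varepsilon = 1
\ \Longrightarrow\ T(n) = \Theta(n^{2}).
```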