Algorithm Lec 2: Asymptotic Notations

This document discusses asymptotic notations and how they are used to analyze algorithms. It defines O, Ω, and Θ notation and provides examples of using these notations to determine the asymptotic running time of algorithms. Examples include analyzing basic algorithms and mathematical functions to determine upper and lower bounds.

Algorithm Analysis & Design

Asymptotic Notations
Presented by: Dr. Marwa Mamdouh Emam
Lec 2, FCI, Minia University

Agenda
- Asymptotic Notations
- Examples
Running time vs. Asymptotic performance

- The exact running time can be computed for small problem sizes (small n).
- Asymptotic performance can be found for large n, where the multiplicative constants and lower-order terms of the exact running time are dominated.
- Asymptotic performance of an algorithm: how the running time of an algorithm increases with the size of the input in the limit, as the size of the input increases without bound.
Growth of Functions and Asymptotic Notation

- When we study algorithms, we are interested in characterizing them according to their efficiency.
- We are usually interested in the order of growth of the running time of an algorithm, not in the exact running time. This is also referred to as the asymptotic running time.
- We need a way to talk about the rate of growth of functions so that we can compare algorithms.
- Asymptotic notation gives us a method for classifying functions according to their rate of growth.
Classifying Functions by Their Asymptotic Growth

- Asymptotic growth: the rate of growth of a function.
- Given a particular function f(n), all other functions fall into three classes:
  - growing with the same rate
  - growing faster
  - growing slower
Bounds (Asymptotic Notations)

Bounds describe the limiting behavior of algorithm complexity at large n. They are:
- Upper bound (Big-O complexity) (worst case)
- Lower bound (Big-Ω complexity) (best case)
- Exact (Big-Θ complexity) (average case)
O-NOTATION: DEFINITION

- Big-O defines an upper bound of an algorithm. It bounds a function only from above.
- For a function f(n), we define O(g(n)), big-Oh of g(n), as:

  f(n) is O(g(n)) if
  ∃ positive constants c and n0 such that
  f(n) ≤ c × g(n) ∀ n ≥ n0

- For example:
  - T(n) = O(n^100)
  - T(n) will never grow asymptotically faster than n^100
O-NOTATION: DEFINITION

[Figure: trend of running time, with f(n) bounded above by c × g(n) for n ≥ n0]

O-NOTATION: Example

Prove that 5n² is O(n³).

Proof:
According to the definition of O(), we should find constants c and n0 s.t.
  f(n) ≤ c × g(n) ∀ n ≥ n0
  5 × n² ≤ c × n³ ∀ n ≥ n0; divide by n³:
  5/n ≤ c ∀ n ≥ n0
Substitute n0 = 5, for example; then any c ≥ 1 satisfies the inequality ∀ n ≥ 5, so the statement is true.
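The chosen constants can be sanity-checked numerically. A minimal sketch (the sample range 10 000 is arbitrary, a spot check rather than a proof):

```python
# Numeric sanity check of the Big-O witness chosen above:
# f(n) = 5n^2, g(n) = n^3, with constants c = 1 and n0 = 5.
c, n0 = 1, 5
holds = all(5 * n**2 <= c * n**3 for n in range(n0, 10_000))
print(holds)  # True: the bound holds for every sampled n >= n0
```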
Ω-NOTATION: DEFINITION

- Ω notation provides an asymptotic lower bound of an algorithm.
- For a function f(n), we define Ω(g(n)), big-Omega of g(n), as:

  f(n) is Ω(g(n)) if
  ∃ positive constants c and n0 such that
  c × g(n) ≤ f(n) ∀ n ≥ n0

- For example:
  - T(n) = Ω(n³)
  - T(n) will never grow asymptotically slower than n³
Ω-NOTATION: ASYMPTOTIC LOWER BOUND

[Figure: f(n) bounded below by c × g(n) for n ≥ n0]
Ω-NOTATION: Example

Prove that 5n² is Ω(n).

Proof:
According to the definition of Ω(), we should find constants c and n0 s.t.
  c × g(n) ≤ f(n) ∀ n ≥ n0
  c × n ≤ 5 × n² ∀ n ≥ n0; divide by n:
  c ≤ 5 × n ∀ n ≥ n0
Substitute n0 = 1, for example; then any c ≤ 5 satisfies the inequality ∀ n ≥ 1, so the statement is true.
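As with the Big-O example, the witnesses can be spot-checked numerically (a sketch, not a proof):

```python
# Numeric sanity check of the Omega witness: c * n <= 5 * n^2
# with the constants c = 5 and n0 = 1 chosen in the proof above.
c, n0 = 5, 1
holds = all(c * n <= 5 * n**2 for n in range(n0, 10_000))
print(holds)  # True
```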
Θ-NOTATION: DEFINITION

- Θ notation provides an asymptotically tight bound of an algorithm.
- For a function f(n), we define Θ(g(n)), big-Theta of g(n), as:

  f(n) is Θ(g(n)) if
  ∃ positive constants c1, c2, and n0 such that
  c1 × g(n) ≤ f(n) ≤ c2 × g(n) ∀ n ≥ n0

- Θ(): exact order (the most difficult to compute for some algorithms)
- For example:
  - T(n) = Θ(n³)
  - T(n) grows asymptotically as fast as n³.
Θ-NOTATION: DEFINITION

[Figure: f(n) sandwiched between c1 × g(n) and c2 × g(n) for n ≥ n0]
Θ-NOTATION: Example

Prove that 5n² is Θ(n²).

Proof:
According to the definition of Θ(), we should find two constants c1 and c2 s.t.
  c1 × g(n) ≤ f(n) ≤ c2 × g(n) ∀ n ≥ n0
  c1 × n² ≤ 5 × n² ≤ c2 × n² ∀ n ≥ n0; divide by n²:
  c1 ≤ 5 ≤ c2 ∀ n ≥ n0
So there exist c1 = 4 and c2 = 6 that satisfy the inequality for all n ≥ 1 (n0 = 1), and the statement is true.
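A quick numeric spot check of the two Θ witnesses (again a sketch over a finite sample, not a proof):

```python
# Numeric sanity check of the Theta witnesses: c1 = 4, c2 = 6, n0 = 1,
# for f(n) = 5n^2 and g(n) = n^2 as in the proof above.
c1, c2, n0 = 4, 6, 1
holds = all(c1 * n**2 <= 5 * n**2 <= c2 * n**2 for n in range(n0, 10_000))
print(holds)  # True
```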
Relations Between Bounds (Asymptotic Notations)

Θ(g(n)): c1 × g(n) ≤ f(n) ≤ c2 × g(n) for all n ≥ n0
Ω(g(n)): f(n) ≥ c × g(n) for all n ≥ n0
O(g(n)): f(n) ≤ c × g(n) for all n ≥ n0
Relations Between Bounds (Asymptotic Notations)

Theorem:
For any two functions g(n) and f(n),
f(n) = Θ(g(n)) iff
f(n) = O(g(n)) and f(n) = Ω(g(n)).

I.e., f(n) is Θ(g(n)) iff it is both O(g(n)) and Ω(g(n)) (by their definitions).
Relations Between Bounds: Example

Prove that 5n² is Θ(n²), i.e. both O(n²) and Ω(n²).

Proof:
According to the definition of Θ(), we should find two constants c1 and c2 s.t.
  c1 × g(n) ≤ f(n) ≤ c2 × g(n) ∀ n ≥ n0
  c1 × n² ≤ 5 × n² ≤ c2 × n² ∀ n ≥ n0; divide by n²:
  c1 ≤ 5 ≤ c2 ∀ n ≥ n0
So there exist c1 = 4 and c2 = 6 that satisfy the inequality for all n ≥ 1 (n0 = 1).
The left side of the inequality is the proof of the definition of Ω(n²), and the right side is the proof of O(n²), so the statement is true.
Examples

Θ(n³):
  n³
  5n³ + 4n
  105n³ + 4n² + 6n

Θ(n²):
  n²
  5n² + 4n + 6
  n² + 5
Examples

True or false?
- N² = O(N²): true
- 2N = O(N²): true
- N = O(N²): true
- N² = O(N): false
- 2N = O(N): true
- N = O(N): true
Examples

- N² = Θ(N²): true
- 2N = Θ(N²): false
- N = Θ(N²): false
- N² = Θ(N): false
- 2N = Θ(N): true
- N = Θ(N): true

- 4 + 3n = O(n): true
- n + 2 log n = O(log n): false
- log n + 2 = O(1): false
What is the difference between worst case and Big-O?

- Worst, best, and average cases describe distinct runtime functions (different types of analysis): one for the inputs of highest runtime at any given n, one for the inputs of lowest runtime, and so on.
- Asymptotic notations are just ways of representing any such time complexity of an algorithm.
What is the difference between worst case and Big-O?

- Worst case: a runtime that actually occurs for some input.
- Upper bound: may never occur; the bound can be loose.
Example 1

Compute the complexity of this code:

For k = 1 to N
    Statements that take exactly k operations
End for

Solution:
- By applying rules of order: complexity = O(N²)
- By exact iteration:
  Complexity = 1 + 2 + ⋯ + N = Σ_{k=1}^{N} k = N(N+1)/2 = Θ(N²)
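The exact-iteration count above can be checked by simulating the loop and comparing with the closed form:

```python
# Count the unit operations of "for k = 1 to N: do exactly k operations"
# and compare against the closed form N*(N+1)/2 from the slide.
def ops(N):
    count = 0
    for k in range(1, N + 1):
        count += k  # the body of iteration k costs exactly k operations
    return count

print(ops(100))  # 5050 == 100*101/2
```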
Example 2

Indicate, for each pair of expressions (A, B) in the table below, whether A is O, Ω, or Θ of B. Logs are base 2. Your answer should be in the form of the table, with "yes" or "no" written in each box:

A             B          O     Ω     Θ
2^(N+1)       2^N
N (log N)²    N log N
(N-1)!        N!
Example 2

A             B          O     Ω     Θ
2^(N+1)       2^N        yes   yes   yes
N (log N)²    N log N    no    yes   no
(N-1)!        N!         yes   no    no
Example 3

Let f(n) and g(n) be asymptotically positive functions. Prove or disprove each of the following:
- f(n) = O(g(n)) implies g(n) = O(f(n)).
- h(n) + k(n) = Θ(min(h(n), k(n))).
Example 3

Let f(n) and g(n) be asymptotically positive functions. Prove or disprove each of the following:
- f(n) = O(g(n)) implies g(n) = O(f(n)). Disproved: e.g., n = O(n²), but n² is not O(n).
- h(n) + k(n) = Θ(min(h(n), k(n))). Disproved: e.g., h(n) = n, k(n) = 1; then h(n) + k(n) = n + 1, which is not Θ(1).
Example 4

Write the big-Θ expression (tight bound) describing the number of operations required by the following algorithms:

1)
z = 0
for (w = 0; w < K; w++)
    for (x = 0; x < N; x++)
        z = z + x
    End for
    for (y = 0; y < M; y++)
        z = z + y
    End for
End for

Solution:
By applying rules of order: complexity = Θ(K × (N + M)) = Θ(K × N) = Θ(N²) if K is Θ(N).
Note: Θ(N + M) = Θ(N), taking N as the larger of the two.
Example 4

2)
F1(N)
    For i = 1 to N Do
        j = N
        While j > i Do
            j = j - 1
        End while
    End for
EndF1

Solution:
- By applying rules of order: complexity = O(N²)
- By exact iteration: the inner loop runs N - i times, so the total is
  0 + 1 + 2 + ⋯ + (N-1) = Σ_{i=0}^{N-1} i = (N-1)N/2 = Θ(N²)
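The exact count can be verified by simulation. This sketch assumes, as the stated solution implies, that the inner loop condition compares j > i:

```python
# Simulate F1 and count inner-loop iterations; with the condition j > i the
# inner loop runs N - i times, totalling (N-1)*N/2, i.e. Theta(N^2).
def f1_iterations(N):
    count = 0
    for i in range(1, N + 1):
        j = N
        while j > i:
            j -= 1
            count += 1
    return count

print(f1_iterations(50))  # 1225 == 49*50/2
```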
Example 4

3)
F2(N)
    I = 1
    While I < N Do
        For J = 1 to I
            K = 1
        End for
        I = 2 × I
    End while
EndF2

Solution:
- By applying rules of order: complexity = O(N log N)
- By summing the number of iterations of the inner loop: 1 + 2 + 4 + 8 + ⋯ + 2^L, where L is the number of passes of the outer loop.
- Termination: when 2^L = N, so log 2^L = log N and L = log₂ N.
- Hence 1 + 2 + 4 + ⋯ + 2^L = Σ_{i=0}^{log N} 2^i = 2^{log N + 1} - 1 = 2 × 2^{log N} - 1 = 2N - 1 = Θ(N)
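A simulation confirms the linear growth. Note a boundary detail: with the loop condition I < N, the last doubling stops at N/2, so for N a power of two the simulated count is N - 1 rather than 2N - 1; either way the total is Θ(N), not N log N.

```python
# Simulate F2 and count inner-loop iterations: i doubles each pass
# (1, 2, 4, ...), so the total work is a geometric series summing to Theta(N).
def f2_iterations(N):
    count, i = 0, 1
    while i < N:
        count += i  # the inner "For J = 1 to I" loop does i iterations
        i *= 2
    return count

print(f2_iterations(1024))  # 1023: linear in N, as the exact analysis shows
```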
Example 4

4)
I = N
While (I > 1)
    I = I - 10
End while

Solution:
- Iterator values: N, N - 10, N - 2×10, N - 3×10, …, N - L×10, where L is the number of iterations.
- Termination: when N - L×10 = 1, i.e. L = (N-1)/10.
- Complexity = Θ(N)
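The iteration count L = (N-1)/10 can be confirmed by running the loop:

```python
# Simulate the countdown loop: I starts at N and drops by 10 per iteration,
# so the number of iterations is about N/10, i.e. Theta(N).
def countdown_steps(N):
    count, i = 0, N
    while i > 1:
        i -= 10
        count += 1
    return count

print(countdown_steps(1001))  # 100 iterations, matching (N-1)/10
```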
Example 4

5)
For i = 1 to N Do
    A[i] = i
    B[i] = 1
End for
InsertionSort(A, N)

Solution: Θ(N + N) = Θ(N), since the array is already sorted (insertion sort best case).
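The best-case claim can be demonstrated by counting comparisons. A sketch of a standard insertion sort instrumented with a comparison counter (the helper name is illustrative, not from the slides):

```python
# Insertion sort on an already-sorted array does one comparison per element,
# so part 5 (fill the arrays, then sort A) is Theta(N) overall.
def insertion_sort_comparisons(a):
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]  # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

print(insertion_sort_comparisons(range(1, 101)))  # 99: best case is linear
```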
Eight growth functions

Eight growth functions occur frequently in the analysis of algorithms (in order of increasing rate of growth relative to n):
- Constant ≈ 1
- Logarithmic ≈ log n
- Linear ≈ n
- Log-linear ≈ n log n
- Quadratic ≈ n²
- Cubic ≈ n³
- Exponential ≈ 2ⁿ
- Factorial ≈ n!
Growth rates compared

           n=1    n=2    n=4    n=8      n=16     n=32
1          1      1      1      1        1        1
log n      0      1      2      3        4        5
n          1      2      4      8        16       32
n log n    0      2      8      24       64       160
n²         1      4      16     64       256      1024
n³         1      8      64     512      4096     32768
2ⁿ         2      4      16     256      65536    4294967296
n!         1      2      24     40320    20.9T    Don't ask!
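The table's rows can be regenerated programmatically, which is a handy way to check the entries (the 20.9T in the n! column is 16! ≈ 20.9 trillion):

```python
import math

# Recompute the rows of the growth-rate table for n = 1, 2, 4, 8, 16, 32.
def growth_row(n):
    return {
        "1": 1,
        "log n": int(math.log2(n)),
        "n": n,
        "n log n": int(n * math.log2(n)),
        "n^2": n**2,
        "n^3": n**3,
        "2^n": 2**n,
        "n!": math.factorial(n),
    }

for n in (1, 2, 4, 8, 16, 32):
    print(n, growth_row(n))
```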
