
Basics of Algorithm Analysis

• We measure running time as a function of n, the size of the input (in bytes, assuming a reasonable encoding).
• We work in the RAM model of computation: all “reasonable” operations take “1” unit of time (e.g. +, *, -, /, array access, pointer following, writing a value, one byte of I/O, ...).

What is the running time of an algorithm?


• Best case (seldom used)
• Average case (used when we understand the input distribution)
• Worst case (used most often)

We measure as a function of n, and ignore low order terms.

• 5n^3 + n − 6 becomes n^3
• 8n log n − 60n becomes n log n
• 2^n + 3n^4 becomes 2^n
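To see why the low-order terms wash out, note that the ratio of the full expression to its dominant term approaches a constant. A minimal Python sketch (the sample values of n are arbitrary):

    # The ratio (5n^3 + n - 6) / n^3 tends to the constant 5,
    # so only the n^3 term matters for the growth rate.
    for n in [10, 100, 1000, 10000]:
        print(n, (5 * n**3 + n - 6) / n**3)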
Asymptotic notation

big-O

O(g(n)) = {f(n) : there exist positive constants c and n0 such that
0 ≤ f(n) ≤ cg(n) for all n ≥ n0}.

Alternatively, we say

f(n) = O(g(n)) if there exist positive constants c and n0 such that
0 ≤ f(n) ≤ cg(n) for all n ≥ n0.

Informally, f(n) = O(g(n)) means that f(n) is asymptotically less than or
equal to g(n).
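One way to internalize the definition is to fix candidate witnesses c and n0 and test the inequality over a range of n. A minimal Python sketch, assuming f(n) = 5n^3 + n and g(n) = n^3 as illustrative examples (a finite check like this can suggest witnesses but does not prove the bound):

    # Test the big-O witness condition 0 <= f(n) <= c*g(n) for n0 <= n <= n_max.
    def witnesses_hold(f, g, c, n0, n_max=10_000):
        return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

    f = lambda n: 5 * n**3 + n
    g = lambda n: n**3
    # 5n^3 + n <= 6n^3 whenever n <= n^3, i.e. for all n >= 1.
    print(witnesses_hold(f, g, c=6, n0=1))  # True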

big-Ω

Ω(g(n)) = {f(n) : there exist positive constants c and n0 such that
0 ≤ cg(n) ≤ f(n) for all n ≥ n0}.

Alternatively, we say

f(n) = Ω(g(n)) if there exist positive constants c and n0 such that
0 ≤ cg(n) ≤ f(n) for all n ≥ n0.

Informally, f(n) = Ω(g(n)) means that f(n) is asymptotically greater than
or equal to g(n).
big-Θ

f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

Informally, f(n) = Θ(g(n)) means that f(n) is asymptotically equal to g(n).

INFORMAL summary

• f(n) = O(g(n)) roughly means f(n) ≤ g(n)
• f(n) = Ω(g(n)) roughly means f(n) ≥ g(n)
• f(n) = Θ(g(n)) roughly means f(n) = g(n)
• f(n) = o(g(n)) roughly means f(n) < g(n)
• f(n) = ω(g(n)) roughly means f(n) > g(n)

We use these to classify algorithms into classes, e.g. n, n^2, n log n, 2^n.

See chart for justification
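The chart itself is not reproduced here, but a short table makes the ordering of the classes concrete. A minimal Python sketch:

    import math

    # Tabulate representative growth functions; the gaps widen rapidly,
    # which is exactly what the asymptotic classification captures.
    print(f"{'n':>6} {'n log n':>10} {'n^2':>10} {'2^n':>22}")
    for n in [8, 16, 32, 64]:
        print(f"{n:>6} {n * math.log2(n):>10.0f} {n**2:>10} {2**n:>22}")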


3 useful formulas

Arithmetic series
∑_{i=1}^{n} i = n(n + 1)/2

Geometric series
∑_{i=0}^{∞} a^i = 1/(1 − a) for 0 < a < 1

Harmonic series
∑_{i=1}^{n} 1/i = ln n + O(1) = Θ(ln n)
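A quick numeric sanity check of all three identities (a minimal Python sketch; the choices n = 1000 and a = 1/2 are arbitrary):

    import math

    n = 1000
    # Arithmetic series: the sum of 1..n equals n(n+1)/2 exactly.
    assert sum(range(1, n + 1)) == n * (n + 1) // 2

    # Geometric series: partial sums approach 1/(1 - a) for 0 < a < 1.
    a = 0.5
    assert abs(sum(a**i for i in range(100)) - 1 / (1 - a)) < 1e-9

    # Harmonic series: the sum of 1/i minus ln n stays bounded
    # (it approaches the Euler-Mascheroni constant, about 0.5772).
    print(sum(1 / i for i in range(1, n + 1)) - math.log(n))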
Algorithmic Correctness

• Very important, but we won’t typically prove correctness from first principles.
• We will use loop invariants (see the sketch after this list).
• We will use other problem-specific methods.
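As a small example of the loop-invariant style, here is a maximum-finding loop with its invariant stated as comments (a minimal sketch; the function is illustrative, not from the text):

    def maximum(xs):
        """Return the largest element of a nonempty list xs."""
        best = xs[0]
        for i in range(1, len(xs)):
            # Invariant: best is the maximum of xs[0..i-1].
            if xs[i] > best:
                best = xs[i]
            # Invariant restored: best is the maximum of xs[0..i].
        # At termination the invariant has held for every index,
        # so best is the maximum of the whole list.
        return best

    print(maximum([3, 1, 4, 1, 5, 9, 2, 6]))  # 9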
Master Theorem

Master Theorem for Recurrences. Let a ≥ 1 and b > 1 be constants, let f(n)
be a function, and let T(n) be defined on the nonnegative integers by the
recurrence
T(n) = aT(n/b) + f(n),
where we interpret n/b to mean either ⌊n/b⌋ or ⌈n/b⌉. Then T(n) can be
bounded asymptotically as follows.
1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then
   T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if af(n/b) ≤ cf(n)
   for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
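To see the three cases in action, a minimal Python sketch that classifies recurrences of the form T(n) = aT(n/b) + n^d; restricting f(n) to a polynomial n^d (an assumption, not part of the theorem) reduces each case to comparing d with log_b a:

    import math

    def master_case(a, b, d):
        """Classify T(n) = a*T(n/b) + n^d. Only covers f(n) = n^d;
        for such f the case-3 regularity condition holds automatically,
        since a*f(n/b) = (a/b^d)*f(n) with a/b^d < 1 when d > log_b a."""
        crit = math.log(a, b)  # the critical exponent log_b a
        if math.isclose(d, crit):
            return f"case 2: Theta(n^{d} lg n)"
        if d < crit:
            return f"case 1: Theta(n^{crit:g})"
        return f"case 3: Theta(n^{d})"

    print(master_case(2, 2, 1))  # merge sort, T(n) = 2T(n/2) + n: case 2, Theta(n lg n)
    print(master_case(1, 2, 0))  # binary search, T(n) = T(n/2) + 1: case 2, Theta(lg n)
    print(master_case(8, 2, 2))  # T(n) = 8T(n/2) + n^2: case 1, Theta(n^3)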
