Chapter 1 Part 2
INTRODUCTION
Data Structure and Algorithm
DATA STRUCTURE AND ALGORITHM 1
PSEUDOCODE
Pretty close to English but precise enough for a computing agent to carry out.
Algorithm arrayMax(A, n)
    Input: array A of n integers
    Output: maximum element of A
    currentMax ← A[0]
    for i ← 1 to n − 1 do
        if A[i] > currentMax then
            currentMax ← A[i]
    return currentMax
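The pseudocode above translates almost line for line into a real language; a minimal sketch in Python:

```python
def array_max(A, n):
    """Return the maximum of the first n elements of array A."""
    current_max = A[0]          # assume the first element is the largest so far
    for i in range(1, n):       # the pseudocode's "for i <- 1 to n - 1 do"
        if A[i] > current_max:  # a larger element was found
            current_max = A[i]
    return current_max
```

Note how each pseudocode statement maps to exactly one Python statement, which is what makes operation counting on pseudocode meaningful.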
PSEUDOCODE DETAILS
[Figure: a single Problem can be solved by several different algorithms (Algorithm 4, 5, 6, 7); which one should we choose?]
Time complexity: the amount of time taken by an algorithm to run, as a function of the length of the input.
Space complexity: the amount of space or memory taken by an algorithm to run, as a function of the length of the input.
Time/space trade-off: we may have to sacrifice one at the cost of the other.
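One classic illustration of the trade-off (my own example, not from the slides) is computing Fibonacci numbers: plain recursion uses almost no extra memory but exponential time, while memoization spends O(n) extra memory to bring the time down to linear.

```python
from functools import lru_cache

def fib_slow(n):
    """Plain recursion: minimal memory, exponential time."""
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

@lru_cache(maxsize=None)
def fib_fast(n):
    """Memoized: O(n) extra memory buys linear time."""
    return n if n < 2 else fib_fast(n - 1) + fib_fast(n - 2)
```

For small n both are instant, but fib_slow(40) already takes seconds while fib_fast(40) is immediate: we paid memory to save time.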
Efficiency of an algorithm can be analyzed at two different stages: before implementation (a priori analysis) and after implementation (a posteriori analysis).
A priori analysis: efficiency is measured by assuming that all other factors (for example, processor speed) are constant and have no effect on the implementation.
A posteriori analysis: actual statistics, such as running time and space required, are collected.
SPACE COMPLEXITY
We may be interested to know in advance whether sufficient memory is available to run the program.
Space complexity can be used to estimate the size of the largest problem that a program can solve.
TIME COMPLEXITY
The time complexity of an algorithm or a program is the amount of time it needs to run to
completion.
The exact time depends on the implementation of the algorithm, the programming language, the optimizing capabilities of the compiler used, and the speed of the underlying hardware.
To measure the time complexity accurately, we have to count all sorts of operations performed in an
algorithm.
If we know the time for each of the primitive operations performed in a given computer, we can compute the total running time of the algorithm.
Worst-case analysis: the maximum amount of time that an algorithm requires to solve a problem of size n. This gives an upper bound for the time complexity of an algorithm and represents a guarantee for performance on any possible input.
Best-case analysis: the minimum amount of time that an algorithm requires to solve a problem of size n. This gives a lower bound for the time complexity of an algorithm.
Average-case analysis: the average amount of time that an algorithm requires to solve a problem of size n, averaged over all possible inputs.
Suppose you are given an array A and an integer x, and you have to find whether x exists in A.
Example array A: 5 20 2 23 8 10 1
for i ← 1 to length of A
    if A[i] is equal to x
        return TRUE
return FALSE
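This linear search can be sketched in Python; note how the best and worst cases differ even though the code is the same:

```python
def linear_search(A, x):
    """Return True if x occurs in A, scanning left to right."""
    for value in A:
        if value == x:
            return True   # best case: x is the very first element (1 comparison)
    return False          # worst case: x is absent (len(A) comparisons)

A = [5, 20, 2, 23, 8, 10, 1]
```

Searching for 5 hits the best case (one comparison), searching for 1 or for a missing value hits the worst case (seven comparisons), which is exactly the distinction worst/best/average-case analysis captures.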
HOW TO ESTIMATE THE RUN-TIME COMPLEXITY OF AN ALGORITHM?
When we analyze algorithms, we should employ mathematical techniques (a theoretical model) that do not depend on any particular machine or programming language.
Before we can analyze an algorithm, we must have a model of the implementation technology that
will be used, including a model for the resources of that technology and their costs.
We shall assume a generic one processor, random-access machine (RAM) model of computation
as our implementation technology and understand that our algorithms will be implemented as
computer programs.
RAM MODEL
Under the RAM model, we measure the run time of an algorithm by counting up the number of primitive operations (steps) it executes.
Each memory access takes exactly one time step, and we have as much memory as we need.
Primitive operations are basic computations performed by an algorithm, identifiable in pseudocode.
Examples: evaluating an expression, assigning a value to a variable, indexing into an array, calling a method, returning from a method.
Running time of a selection statement (if, switch) is the time for the condition evaluation +
the maximum of the running times for the individual clauses in the selection.
The running time of a for loop is at most the running time of the statements inside the for
loop (including tests) times the number of iterations.
Always assume that the loop executes the maximum number of iterations possible
Running time of a function call is 1 for setup + the time for any parameter calculations +
the time required for the execution of the function body.
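Applying these rules to the arrayMax loop gives a concrete count. The per-operation costs below are illustrative assumptions (2 ops for an indexed assignment, 2 for loop bookkeeping, 2 for the comparison, and the worst case where the assignment runs every iteration); the point is that the for-loop rule yields a function that is linear in n.

```python
def worst_case_ops(n):
    """Worst-case primitive-operation count for arrayMax on n elements,
    using the loop rule: body cost x iterations + setup + return.
    The individual costs are assumptions chosen for illustration."""
    init = 2                        # currentMax <- A[0]: one index, one assign
    per_iteration = 2 + 2 + 2       # loop bookkeeping + comparison + assignment
    iterations = n - 1              # i runs from 1 to n - 1
    ret = 1                         # the final return statement
    return init + per_iteration * iterations + ret
```

Simplifying, T(n) = 6n - 3: a different accounting of the costs would change the constants but not the linear shape, which is why asymptotic analysis discards them.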
Using asymptotic analysis, we can very well conclude the best-case, average-case, and worst-case scenarios of an algorithm.
Asymptotic analysis is input bound, i.e., if there is no input to the algorithm, it is concluded to work in a constant time.
Other than the input, all other factors are considered constant.
In asymptotic notation, we use only the most significant terms to represent the time complexity of an
algorithm.
Asymptotic Notation identifies the behavior of an algorithm as the input size changes.
Following are the commonly used asymptotic notations to calculate the running time complexity of an
algorithm.
Ο Notation (Big O) – Upper bound
Ω Notation (Big Omega) – Lower bound
Θ Notation (Big Theta) – Tight bound
These are some basic function growth classifications used in various notations.
Logarithmic Function - log n
Linear Function - an + b
Quadratic Function - an^2 + bn + c
Polynomial Function - a_z n^z + ... + a_2 n^2 + a_1 n + a_0, where z is some constant
Exponential Function - a^n, where a is some constant
The list starts at the slowest growing function (logarithmic, fastest execution time) and
goes on to the fastest growing (exponential, slowest execution time).
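To see the ordering concretely, the growth classes can be tabulated at a few sample sizes (the sizes chosen here are arbitrary):

```python
import math

# Evaluate each growth class at a few input sizes to see the ordering.
sizes = (2, 8, 32)
rows = [(n, math.log2(n), n, n ** 2, 2 ** n) for n in sizes]

for n, lg, lin, quad, expo in rows:
    print(f"n={n:>2}  log n={lg:4.1f}  n={lin:>2}  n^2={quad:>4}  2^n={expo}")
```

Already at n = 32 the exponential term (over four billion) dwarfs the quadratic (1024), while log n has only reached 5.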
It provides us with an asymptotic upper bound for the growth rate of runtime of an
algorithm.
Say f(n) is your algorithm's runtime and g(n) is an arbitrary time complexity you are trying to relate to your algorithm. f(n) is O(g(n)) if, for some real constants c (c > 0) and n0 (n0 > 0), f(n) <= c g(n) for every input size n (n >= n0).
To show that f(n) is O(g(n)), we must find constants c and n0 such that f(n) <= c g(n) for all n >= n0.
It provides us with an asymptotic lower bound for the growth rate of runtime of an
algorithm.
f(n) is Ω(g(n)) if, for some real constants c (c > 0) and n0 (n0 > 0), f(n) >= c g(n) for every input size n (n >= n0).
f(n) is Θ(g(n)) if, for some real constants c1, c2 and n0 (c1 > 0, c2 > 0, n0 > 0), c1 g(n) <= f(n) <= c2 g(n) for every input size n (n >= n0).