Algorithm - Asymptotic Notations - ADT PPT Part1
Table of Contents
• Algorithm
– Efficiency of an Algorithm
– Time and Space Complexity
• Asymptotic notations
– Big Oh
– Big Theta
– Big Omega
• Time-Space trade-off
• Abstract Data Types (ADT)
Algorithm Definition
• An algorithm is a step-by-step procedure to solve a problem.
• In plain language, an algorithm is a sequence of statements used to perform a task.
Algorithm Definition…
• In computer science, an algorithm can be defined as a finite set of instructions that, when followed, accomplishes a particular task and satisfies the specifications listed on the next slide.
Specifications of Algorithms
• Input - Every algorithm must take zero or more input values from an external source.
• Output - Every algorithm must produce at least one output as a result.
• Definiteness - Every statement / instruction in an algorithm must be clear and unambiguous (only one interpretation).
• Finiteness - For all cases, the algorithm must produce a result within a finite number of steps.
• Effectiveness - Every instruction must be basic enough to be carried out, and it must also be feasible.
Example
• Let us consider the problem of finding the largest value in a given list of values.
Code using C Programming
/* Returns the largest value in list L of length listSize */
int findMax(int L[], int listSize)
{
    int max = L[0];    /* start from the first element, so negative values are handled correctly */
    int i;

    for (i = 1; i < listSize; i++)
    {
        if (L[i] > max)
            max = L[i];
    }
    return max;
}
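• As a quick illustration, here is a minimal sketch of how findMax might be called; the main function and the sample values are assumptions added for illustration, not part of the original slides.

#include <stdio.h>

int findMax(int L[], int listSize);   /* the function defined above */

int main(void)
{
    int values[] = {12, 45, 7, 23};                      /* hypothetical sample data */
    int n = (int)(sizeof(values) / sizeof(values[0]));   /* number of elements in the list */

    printf("Largest value: %d\n", findMax(values, n));
    return 0;
}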
Efficiency of an Algorithm
• A measure of the average execution time necessary for
an algorithm to complete work on a set of data.
Time and Space Complexity
• Time complexity is a function describing the amount
of time an algorithm takes in terms of the amount of
input to the algorithm.
Time and Space Complexity...
• Space complexity is a function describing the
amount of memory (space) an algorithm takes in
terms of the amount of input to the algorithm.
• We often speak of "extra" memory needed, not
counting the memory needed to store the input
itself.
• Space complexity is sometimes ignored because the
space used is minimal and/or obvious, but
sometimes it becomes as important an issue as time.
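• For instance, the following sketch (an assumed example with a hypothetical function name, not from the original slides) sums a list of n values: its running time grows in proportion to n, while the extra memory it needs (only the variables sum and i) stays the same no matter how large n is.

/* Time: the loop body runs n times, so the running time grows in proportion to n.
 * Extra space: only two variables are used, regardless of n.                      */
int sumList(int L[], int n)
{
    int sum = 0;
    int i;
    for (i = 0; i < n; i++)
        sum += L[i];
    return sum;
}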
Asymptotic Notations
• Whenever we want to analyse an algorithm, we need to calculate the complexity of that algorithm.
• However, calculating the complexity of an algorithm does not give the exact amount of resources required.
• So instead of stating the exact amount of resources, we represent the complexity in a general form (a notation) that captures the basic nature of the algorithm's growth.
Big - Oh Notation (O)
• Big - Oh notation is used to define the upper bound of an algorithm in terms of Time Complexity.
• That means Big - Oh notation always indicates the maximum time required by an algorithm for all input values.
• That means Big - Oh notation describes the worst case of an algorithm's time complexity.
• Big - Oh notation can be defined as follows:
“Consider a function f(n) as the time complexity of an algorithm and g(n) as its most significant term. If f(n) <= C g(n) for all n >= n0, for some constants C > 0 and n0 >= 1, then we can represent f(n) as O(g(n)).”
Big - Oh Notation (O)...
• Consider a graph of f(n) and C g(n), with the input value n on the X-axis and the time required on the Y-axis.
• In this graph, after a particular input value n0, C g(n) is always greater than f(n), which indicates the algorithm's upper bound.
Example
• Consider the following f(n) and g(n):
f(n) = 3n + 2
g(n) = n
• If we want to represent f(n) as O(g(n)), then we must find constants C > 0 and n0 >= 1 such that f(n) <= C g(n) for all n >= n0:
⇒ f(n) <= C g(n)
⇒ 3n + 2 <= C n
• The above condition holds for C = 4 and all n >= 2, since 3n + 2 <= 3n + n = 4n whenever n >= 2.
• Using Big - Oh notation, we can therefore represent the time complexity as follows...
3n + 2 = O(n)
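• As a small sanity check (an added sketch, not part of the original slides), the following program prints f(n) = 3n + 2 next to C g(n) = 4n and confirms that f(n) <= 4n for every n from 2 to 10.

#include <stdio.h>

/* Verify numerically that f(n) = 3n + 2 <= 4n (= C*g(n) with C = 4) for n >= 2 */
int main(void)
{
    int n;
    for (n = 2; n <= 10; n++)
        printf("n = %2d   f(n) = %3d   4*g(n) = %3d   %s\n",
               n, 3 * n + 2, 4 * n,
               (3 * n + 2 <= 4 * n) ? "f(n) <= 4*g(n)" : "violated");
    return 0;
}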
Big - Omega Notation (Ω)
• Big - Omega notation is used to define the lower bound of an algorithm in terms of Time Complexity.
• That means Big - Omega notation always indicates the minimum time required by an algorithm for all input values.
• That means Big - Omega notation describes the best case of an algorithm's time complexity.
• Big - Omega notation can be defined as follows:
“Consider a function f(n) as the time complexity of an algorithm and g(n) as its most significant term. If f(n) >= C g(n) for all n >= n0, for some constants C > 0 and n0 >= 1, then we can represent f(n) as Ω(g(n)).”
Example
• Consider the following f(n) and g(n):
f(n) = 3n + 2
g(n) = n
• If we want to represent f(n) as Ω(g(n)), then we must find constants C > 0 and n0 >= 1 such that f(n) >= C g(n) for all n >= n0:
⇒ f(n) >= C g(n)
⇒ 3n + 2 >= C n
• The above condition holds for C = 1 and all n >= 1, since 3n + 2 >= n for every n >= 1.
• Using Big - Omega notation, we can therefore represent the time complexity as follows...
3n + 2 = Ω(n)
Big - Theta Notation (Θ)
• Big - Theta notation is used to define a tight bound on an algorithm in terms of Time Complexity, i.e., an upper and a lower bound at the same time.
• That means Big - Theta notation always indicates, up to constant factors, the exact order of the time required by an algorithm for all input values.
• That means Big - Theta notation describes the exact order of growth of an algorithm's time complexity.
• Big - Theta notation can be defined as follows:
“Consider a function f(n) as the time complexity of an algorithm and g(n) as its most significant term. If C1 g(n) <= f(n) <= C2 g(n) for all n >= n0, for some constants C1 > 0, C2 > 0 and n0 >= 1, then we can represent f(n) as Θ(g(n)), i.e., f(n) = Θ(g(n)).”
Big - Theta Notation (Θ)...
• Consider a graph of f(n), C1 g(n) and C2 g(n), with the input value n on the X-axis and the time required on the Y-axis.
• In this graph, after a particular input value n0, C1 g(n) is always less than f(n) and C2 g(n) is always greater than f(n), which indicates the algorithm's tight bound.
Example
• Consider the following f(n) and g(n):
f(n) = 3n + 2
g(n) = n
• If we want to represent f(n) as Θ(g(n)), then we must find constants C1 > 0, C2 > 0 and n0 >= 1 such that C1 g(n) <= f(n) <= C2 g(n) for all n >= n0:
⇒ C1 g(n) <= f(n) <= C2 g(n)
⇒ C1 n <= 3n + 2 <= C2 n
• The above condition holds for C1 = 1, C2 = 4 and all n >= 2, since n <= 3n + 2 and 3n + 2 <= 3n + n = 4n whenever n >= 2.
• Using Big - Theta notation, we can therefore represent the time complexity as follows...
3n + 2 = Θ(n)
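• As a small sanity check (an added sketch, not part of the original slides), the following program confirms that 1·n <= 3n + 2 <= 4n holds for every n from 2 to 10, which covers both the Big - Omega choice C = 1 and the Big - Oh choice C = 4 at once.

#include <stdio.h>

/* Verify numerically that C1*g(n) <= f(n) <= C2*g(n)
 * with f(n) = 3n + 2, g(n) = n, C1 = 1 and C2 = 4, for n >= 2. */
int main(void)
{
    int n;
    for (n = 2; n <= 10; n++)
        printf("n = %2d   1*g(n) = %2d   f(n) = %3d   4*g(n) = %3d   %s\n",
               n, n, 3 * n + 2, 4 * n,
               (n <= 3 * n + 2 && 3 * n + 2 <= 4 * n) ? "bound holds" : "violated");
    return 0;
}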
Intuition about the notations
Notation         Intuition
O (Big-Oh)       f(n) <= g(n)
Ω (Big-Omega)    f(n) >= g(n)
Θ (Big-Theta)    f(n) = g(n)
Common Asymptotic Notations
Constant       O(1)
Logarithmic    O(log n)
Linear         O(n)
n log n        O(n log n)
Quadratic      O(n²)
Cubic          O(n³)
Polynomial     n^O(1)
Exponential    2^O(n)
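• To make these growth rates concrete, here is a small sketch (added for illustration, not from the original slides; the function names are assumptions) of loops whose running times are constant, linear, and quadratic in n.

int constantWork(int L[])              /* Constant: one array access, independent of the list size */
{
    return L[0];                       /* assumes the list has at least one element */
}

long linearWork(int L[], int n)        /* Linear: the loop body runs n times */
{
    long sum = 0;
    for (int i = 0; i < n; i++)
        sum += L[i];
    return sum;
}

long quadraticWork(int L[], int n)     /* Quadratic: the inner body runs n * n times */
{
    long pairs = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            if (L[i] <= L[j])
                pairs++;
    return pairs;
}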
Time-Space Trade-off
• The best algorithm to solve a given problem is one that
requires less space in memory and takes less time to complete
its execution.
• But in practice, it is not always possible to achieve both these
objectives.
• As we know, there may be more than one approach to solving a particular problem.
• One approach may take more space but less time to complete its execution, while another approach may take less space but more time.
• We may have to sacrifice one at the cost of the other.
Time-Space Trade-off...
• If space is our constraint, then we have to choose a program
that requires less space at the cost of more execution time.
• On the other hand, if time is our constraint then we have to
choose a program that takes less time to complete its
execution at the cost of more space.
• That is why we say that there exists a time-space trade-off among algorithms.
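• As a concrete sketch (an added example, not from the original slides; the function names and the assumption that values lie in the range 0..999 are hypothetical), the two functions below decide whether a list contains a duplicate value: the first uses almost no extra memory but compares every pair of elements, while the second makes only one pass over the list at the cost of a lookup table.

#include <stdbool.h>
#include <string.h>

/* Less space, more time: no extra memory beyond a few variables,
 * but the nested loops compare every pair of elements.            */
bool hasDuplicateSlow(int L[], int n)
{
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if (L[i] == L[j])
                return true;
    return false;
}

/* More space, less time: a "seen" table with one entry per possible
 * value (values assumed to lie in 0..999), but only a single pass.  */
bool hasDuplicateFast(int L[], int n)
{
    bool seen[1000];
    memset(seen, 0, sizeof(seen));
    for (int i = 0; i < n; i++) {
        if (seen[L[i]])
            return true;
        seen[L[i]] = true;
    }
    return false;
}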
Abstract Data Types (ADT)
• An abstract data type (ADT) is a mathematical model together with a collection of operations defined on that model.
• The abstract data type encapsulates a data type in the sense that the definition of the type and all of its operations can be localized and are not visible to the user of the ADT.
• To the user, only the declaration of the ADT and its operations is important, not the underlying implementation.
Abstract Data Types (ADT)...
• The implementation of an abstract data type is its translation into the statements of a programming language, which involves choosing a data structure to represent the abstract data type.
• For example, classes, structures, unions and enumerated data types can be used to implement abstract data types.
• ADTs (Abstract Data Types) are data types made up of, or composed of, primitive data types. Since ADTs can be application or implementation specific, they can be created on a need basis.
Abstract Data Types (ADT)...
• Some Examples:
• stack: operations are "push an item onto the stack", "pop an item from the stack", "ask if the stack is empty"; an implementation may be as an array, linked list, etc. (a minimal array-based sketch follows this list).
• queue: operations are "add to the end of the queue", "delete
from the beginning of the queue", "ask if the queue is
empty"; an implementation may be as an array or linked list
or heap.
• search structure: operations are "insert an item", "ask if an item is in the structure", and "delete an item"; an implementation may be as an array, linked list, tree, or hash table.
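• To connect the stack example above with the idea that the user sees only the operations while the representation stays hidden, here is a minimal array-based sketch (an added illustration, not from the original slides; the fixed capacity of 100 and the function names are assumptions).

#include <stdbool.h>

/* Minimal array-based stack ADT sketch.
 * The user works only through push, pop and isEmpty;
 * the array and the top index are the hidden representation. */
#define STACK_CAPACITY 100

typedef struct {
    int items[STACK_CAPACITY];
    int top;                        /* index of the next free slot */
} Stack;

void initStack(Stack *s)     { s->top = 0; }

bool isEmpty(const Stack *s) { return s->top == 0; }

bool push(Stack *s, int value)      /* returns false if the stack is full */
{
    if (s->top == STACK_CAPACITY)
        return false;
    s->items[s->top++] = value;
    return true;
}

bool pop(Stack *s, int *value)      /* returns false if the stack is empty */
{
    if (isEmpty(s))
        return false;
    *value = s->items[--s->top];
    return true;
}

• The same three operations could equally be provided over a linked list without changing the code that uses the stack, which is exactly the point of an abstract data type.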
Thank You!