
Chapter 2: Algorithm/Complexity Analysis

Analysis concepts
Asymptotic complexity
Big-O, Ω, Θ notations
Common complexity classes
Best, average and worst case complexity
REVIEW
An algorithm is a well-defined computational procedure that takes
some values as input and produces some values as output.
Like a cooking recipe, an algorithm provides a step-by-step method
for solving a computational problem.
Algorithms are mathematical entities, which can be thought of as running on
some sort of idealized computer with an infinite random access
memory and an unlimited word size.
Algorithm design is all about the mathematical theory behind the
design of good programs.
An algorithm is a sequence of steps.
One problem may have many algorithms.
The algorithm that takes the least amount of space (and time) is the best one.
Input --------- Algorithm --------- Output
Analysis of Algorithms
Algorithm analysis refers to the process of determining how
much computing time and storage an algorithm will
require.
In other words, it is a process of predicting the resource
requirements of algorithms in a given environment.
After designing an algorithm, it has to be checked and its
correctness needs to be verified; this is done by analyzing
the algorithm.
Writing a syntax-error-free program is not enough. We need
to know whether the algorithm is correct, i.e. whether
it gives correct answers for all inputs.
If the program is run on a large data set, the running time
becomes an issue.
Cont..
We want to know how the algorithm performs when the
input size is large.
The program may be computationally inefficient, or may
need lots of memory.
We analyze the resources that the algorithm requires:
memory, and computation time.
We analyze algorithms, rather than problems.
A problem can be solved with several algorithms, some
more efficient than others.
Analysis of an algorithm refers to calculating or estimating the
resources needed by the algorithm.
Resources here mean computer memory, processing time, etc.
Of these factors, time is the most important, because the
program developed should be fast enough.
• The choice of a particular algorithm depends on the
following performance analyses and measurements:
Space complexity
Time complexity
Note: given two algorithms, one complicated and one clear,
we tend to prefer the clear one. In other words, there are
many ways to solve a problem,
but which way (i.e. which algorithm) is better? We check by
measuring efficiency (performance analysis).
Space complexity
Space complexity analysis determines the amount of memory
a given algorithm needs to run to completion.
Some of the reasons for studying space complexity are:
There may be several possible solutions with different
space requirements.
We may be interested to know in advance whether
sufficient memory is available to run the program.
It can be used to estimate the size of the largest problem that
a program can solve.
Memory Usage while Execution
While executing, algorithm uses memory space for three reasons:
Instruction Space
It's the amount of memory used to save the compiled version of
instructions.
Environmental Stack
Sometimes an algorithm(function) may be called inside another
algorithm(function). In such a situation, the current variables are
pushed onto the system stack, where they wait for further execution
and then the call to the inside algorithm(function) is made.
• For example, if a function A() calls function B() inside it, then all the
variables of function A() will get stored on the system stack
temporarily, while function B() is called and executed inside
function A().
Cont..
• Data Space
Amount of space used by the variables and constants.
Calculating space complexity
For calculating the space complexity, we need to know how much
memory each datatype uses, which generally
varies between operating systems, but the method for calculating
the space complexity remains the same.
Data Type Size
bool, char, unsigned char, signed char 1 byte
short int, unsigned short int 2 bytes
float, int, long int 4 bytes
double, long double, long long 8 bytes
Now let's learn how to compute space complexity by taking a few
examples
int sum(int a, int b, int c)
{
int z = a + b + c;
return(z);
}
In the above function, variables a, b, c and z are all
integer types, hence they will take up 4 bytes each, so the total
memory requirement will be (4(4) + 4) = 20 bytes; the
additional 4 bytes are for the return value. Because this
space requirement is fixed for the above example, it
is called Constant Space Complexity.
To know the size of a datatype you can write in a C++
program:
cout << "the size of integer\t" << sizeof(int) << endl;
cout << "the size of character\t" << sizeof(char) << endl;
Cont..
Let's have another example, this time a bit more complex:
// n is the length of array a[]
int sum(int a[], int n)
{
int x = 0; // 4 bytes for x
for(int i = 0; i < n; i++) // 4 bytes for i
{
x = x + a[i]; // 4*n bytes for the n array elements
}
return(x); // 4 bytes for the return value
}
Cont..
In the above code, 4*n bytes of space are required for the
array a[] elements.
Hence the total memory requirement will be (4n + 12),
which increases linearly with the input size n;
hence it is called Linear Space Complexity.
Similarly, we can have quadratic and other complex space
complexity as well, as the complexity of an algorithm
increases.
But we should always focus on writing algorithm code in
such a way that we keep the space complexity minimum.
Cont..
• Algorithm swap(a, b)
{                         Space complexity
temp = a;                 variable a = 1
a = b;                    variable b = 1
b = temp;                 variable temp = 1
}                         --------+-------- 3
Space complexity S(n) = 3
Since we do not know the datatypes, each variable is simply
counted as one word, so
S(n) = 3 words
Time complexity
The amount of time that an algorithm needs to run to
completion.
There are several factors affecting the running time:
 Type of computer
 Quality of compiler
 The algorithm itself
 Input to the algorithm
The content of the input affects the running time; typically,
the input size (number of items in the input) is the main
consideration.
E.g. for the sorting problem ⇒ the number of items to be sorted.
Running time of an Algorithm
• Running time of an algorithm depends upon the input size
and nature of the input
• The running time of an algorithm typically grows with the
input size
• Even on inputs of the same size, running time can be very
different
• Example: an algorithm that finds the first prime number in
an array by scanning it left to right
• Idea: analyze running time in the best case, worst case, and
average case
Example of algorithm

• Swapping algorithm == exchanging the values of 2 variables
using a temporary variable
• Algorithm swap(a, b)
{                         Time complexity
temp = a;                 1 assignment op unit
a = b;                    1 assignment op unit
b = temp;                 1 assignment op unit
}                         --------+-------- 3
All are constant, so the time complexity is T(n) = 3
Complexity analysis
Is the systematic study of the cost of a computation,
measured either in time units or in operations performed,
or in the amount of storage space required.
Complexity analysis involves two distinct phases:
Algorithm analysis: analysis of the algorithm or data
structure to produce a function T(n) measuring the
complexity
Order-of-magnitude (asymptotic) analysis: analysis of
the function T(n) to determine the general complexity
category to which it belongs.
Algorithm analysis requires a set of rules to determine
how operations are to be counted.
Algorithm Analysis Rules
1. Execution of one of the following operations takes time 1:
Assignment Operation
Single Input/Output Operation
Single Boolean Operations
Single Arithmetic Operations
Function return / function call, e.g. return sum(a,b); = 1 unit each
Indexing plus assignment, e.g. A[i]=5, is 2 units: 1 for indexing, 1 for assignment
 Assignment Operation
e.g. i=0;
 Single Input/Output Operation
e.g. cin>>a; cout<<“hello”;
 Single Boolean Operations e.g. i>=10;
 Single Arithmetic Operations e.g. a+b;
 Function Return e.g. return sum;
int main()
{
int sum=0, a, b;
cout << "Enter the two numbers";
cin >> a;
cin >> b;
sum = a + b;
cout << "sum" << sum;
return 0;
}                         Time Units to Compute
1 for the assignment statement: int sum=0
2 for the output statements.
2 for the input statements.
2 for the assignment and addition operation.
1 for the return statement.    T(n) = 1+2+2+2+1 = 8
-------------------------------------------------------------------
Algorithm analysis: T(n) = 8. An algorithm is said to run in
constant time if it requires the same amount of time
regardless of the input.
Asymptotic analysis: O(1)
2. Running time of a selection statement (if, switch) is the
time for the condition evaluation + the maximum of the
running times for the individual clauses in the selection.
int x;
if(condition)
{
statement 1;
}
else
{
statement 2;
}
int main()
{
int a, b, c;
cout << "enter two numbers\n";    1
cin >> a;                         1
cin >> b;                         1
if(a > b)                         1
{
c = a + b;                        1
cout << "sum of " << c;           1   (statement 1: 2 units)
}
else {
cout << "b" << endl;              1   (statement 2: 1 unit)
}
system("pause"); }                T(n) = 1+1+1+1+MAX(2,1) = 6
algorithm analysis=T(n)=6
asymptotic analysis=O(1)
3. Loop statements:
The running time of a loop is: time for initialization (1) + time for
checking (number of iterations + 1) + time for update (number of
iterations) + time for the body. For a nested loop, the full cost of
the inner loop is paid once per outer iteration.
for(int i=1 ; i<=n; i++)          1 + (n+1) + n
{
for( int j=1; j<=n; j++)          n * (1 + (n+1) + n)
{
k++;                              n * n
}
}
T(n) = 1+(n+1)+n + n(1+(n+1)+n) + n*n
T(n) = (2n+2) + (2n^2+2n) + n^2 = 3n^2+4n+2
The time complexity of a nested loop is O(n^2) when the
loop variables are incremented or decremented by a
constant amount.
Cont..
int k=0, n;                       1
cout << "Enter an integer";       1
cin >> n;                         1
for(int i=0; i<=n; i++)           1 + (n+1) + n
k++;                              n
T(n) = 3+1+(n+1)+n+n = 3n+5
Algorithm analysis: T(n) = 3n+5
Asymptotic analysis: O(n)
The time complexity of a single loop is O(n) when the loop
variable is incremented or decremented by a constant amount.
eg2
int count()
{
int k=0;                          1 for the assignment statement
cout << "Enter an integer";       1 for the output statement
cin >> n;                         1 for the input statement
for (i=0; i<=n; i++)
k = k+1;
return 0;
}                                 Time Units to Compute
In the for loop:
1 assignment, n+1 tests, and n increments.
n loops of 2 units for an assignment and an addition.
1 for the return statement.
T(n) = 1+1+1+(1+n+1+n)+2n+1 = 4n+6
Algorithm analysis: T(n) = 4n+6 (worst case)
Asymptotic analysis: O(n)
Loop variable incremented by * or decremented by /:
For (int i=1; i<=n; i=i*2)
{
Statement;
}
How? Initially i=1, then
1*2 = 2
2*2 = 2^2
2^2 * 2 = 2^3
:
2^k
Assume the loop terminates when i >= n, i.e. i = 2^k and 2^k >= n.
Taking 2^k = n (it becomes true exactly once), we get k = log2(n).
So the statement executes about log2(n) times, and the loop is O(log n).
The time complexity of a loop is O(log n) when the loop variable is
multiplied or divided by a constant amount.
Substitute n=8 and n=10 as an example:
For(int i=1; i<=n; i=i*2)         n=8          n=10
{ Statement; }                    i=1          i=1
                                  i=2          i=2
                                  i=4          i=4
                                  i=8          i=8
                                  (i=16 fails) (i=16 fails)
The body runs 4 times in each case; the number of executions of the
statement is determined by n and grows as log n.
Cont..
For(int i=n; i>=1; i=i/2)         first i=n
{                                 then n/2
Statement;                        n/2^2
}                                 n/2^3
                                  n/2^4
                                  :
                                  n/2^k
Similarly, assume the loop terminates when i < 1,
therefore n/2^k < 1. Taking n/2^k = 1 gives
n = 2^k --> k = log2(n), so the loop is O(log n).
Determine running time of program
int total(int n)
{
int sum=0;
for (int i=1;i<=n;i++)
sum=sum+1;
return sum;
}
Time Units to Compute
-------------------------------------------------
Cont…
• 1 for the assignment statement:
int sum=0
In the for loop:
1 assignment, n+1 tests, and n increments.
n loops of 2 units for an assignment, and an addition.
1 for the return statement.
• ----------------------------------------------------
T (n)= 1+ (1+n+1+n)+2n+1 = 4n+4
algorithm analysis 4n+4
• asymptotic analysis: O(n)
Cont…
• int sum (int n)
{
int partial_sum = 0;
for (int i = 1; i <= n; i++)
partial_sum = partial_sum +(i * i * i);
return partial_sum;
}
• Time Units to Compute
-------------------------------------------------
Cont…

1 for the assignment.
In the for loop:
1 assignment, n+1 tests, and n increments.
n loops of 4 units for an assignment, an addition, and
two multiplications.
1 for the return statement.
-------------------------------------------------
T (n)= 1+(1+n+1+n)+4n+1 = 6n+4
Algorithm analysis= 6n+4
An algorithm is said to run in linear time if its
execution time is directly proportional to the input size.
(Asymptotic) analysis: O(n)
Eg
Calculate the maximum computational time of the following nested
loop:                             Times
i = 1;                            1
sum = 0;                          1
while (i <= n) {                  n+1
j = 1;                            n
while (j <= n) {                  n*(n+1)
sum = sum + 1;                    2*n*n
j = j + 1;                        2*n*n
}
i = i + 1;                        2*n
}
T(n) = 1+1+(n+1)+n+n*(n+1)+2n^2+2n^2+2n = 5n^2+5n+3
(Asymptotic) analysis: O(n^2)
EXERCISE
1. int main()
{ int a, b, c, sum=0;
cout << "enter 3 numbers" << endl;
cin >> a >> b >> c;
sum = a + b + c;
cout << "sum==" << sum << endl; }
A. Find the space complexity. Ans. S(n) = 4*4 = 16 bytes
B. Find the time complexity. Ans. T(n) = 1+1+3+3+1 = 9
C. Compare with question number 2.
2. int a, b, c;
cout << "enter 3 numbers" << endl;
cin >> a >> b >> c;               Ans. S(n) = 4*3 = 12 bytes
return (a + b + c);               Ans. T(n) = 1+3+1 = 5
system("pause");
ASYMPTOTIC NOTATIONS
A way to describe behavior of functions/algorithm in the
limit
Abstracts away low-order terms and constant factors
How we indicate running times of algorithms
Describe the running time of an algorithm as n grows
large.
3 asymptotic notations:
Big O notation: asymptotic "less than": f(n) "≤" g(n)
Omega (Ω) notation: asymptotic "greater than": f(n) "≥" g(n)
Theta (Θ) notation: asymptotic "equality": f(n) "=" g(n)
Big O-notation
• Simplified analysis of an algorithm's efficiency
• Big-Oh notation is a way of comparing algorithms and is
used for computing the complexity of algorithms, i.e. the
amount of time that it takes for a computer program to run
(it gives an upper bound: the worst case)
• Complexity in terms of input size N
• Counts basic computer steps
• It is only concerned with what happens for very large
values of n.
• Therefore only the largest term in the expression
(function) is needed.
Cont..
• For example, if the number of operations in an algorithm is
n^2 - n, then n is insignificant compared to n^2 for large values of
n. Hence the n term is ignored. Of course, for small values
of n, it may be important. However, Big-Oh is mainly
concerned with large values of n.
Cont..
Formal definition: f(n) = O(g(n)) if and only if there exist
positive constants c and n0 (read "n zero" or "n naught")
such that f(n) <= c.g(n) for all n > n0.
It is read as "f(n) is big-O of g(n)".
Eg f(n) = 2n+3
To prove f(n) = O(n): we need f(n) <= c.g(n), i.e. 2n+3 <= c*n.
Take a constant c greater than the sum of the coefficients of the
given function, e.g. c = 5, 6, 10, ..., and check from some n
(n = 1, n = 2, ...) on.
F(n) = O(n)
F(n) is also O(n^2)
F(n) is also O(2^n)
But f(n) is not O(log n), because no constant c makes
2n+3 <= c.log n hold for all large n.
Cont..

5 <= 10, 7 <= 20, ... depending on the input n
2n+3 <= 10n holds for all n >= 1 with c = 10,
therefore f(n) = O(n)
Alternatively, to find the constant c: since the highest
exponent of f(n) is 1, replace every term of f(n) by a
multiple of n and add up the coefficients:
2n+3 <= 2n+3n
2n+3 <= 5n; for all n >= 1 the inequality is true
eg
 f(n) = n^2+2n+1 is O(n^2): show f(n) <= c*g(n)
n^2+2n+1 <= n^2+2n^2+n^2
n^2+2n+1 <= 4n^2, so c = 4; to get the range of n0 take an arbitrary point n >= 1
{c = 4, n0 >= 1}, so f(n) = O(n^2)
Let f(n) = 7n + 8 and g(n) = n. Is f(n) ∈ O(g(n))?
Another approach for such a function is to guess a constant
c first; the remaining issue is then determining the n0 that
makes the inequality true.
For 7n + 8 ∈ O(n), we have to find c and n0 such that 7n + 8 ≤ c · n,
∀n ≥ n0. By inspection, it is clear that c must be larger than 7. Let c =
8.
Cont..
Now we need a suitable n0. In this case, f(8) = 8 · g(8).
Because the definition of O() requires that f(n) ≤ c · g(n),
we can select n0 = 8, or any integer above 8 – they will all
work.
We have identified values for the constants c and n0 such
that 7n + 8 is ≤ c · n for every n ≥ n0, so we can say that 7n
+ 8 is O(n).
(But how do we know that this will work for every n ≥ 8?
We can prove by induction that 7n+8 ≤ 8n for all n ≥ 8. Be
sure that you can write such proofs if asked!)
Cont.
• f(n)=10n+5 and g(n)=n. Show that f(n) is O(g(n)).
To show that f(n) is O(g(n)) we must show that there exist
constants c and n0 such that
f(n) <= c.g(n) for all n >= n0.
Or 10n+5 <= c.n for all n >= n0.
Try c=15. Then we need to show that 10n+5 <= 15n.
Solving for n we get: 5 <= 5n, or 1 <= n.
So f(n) = 10n+5 <= 15.g(n) for all n >= 1.
(c=15, n0=1).
Cont..
Example: f(n) = 7n-2 is O(n)
We need c > 0 and n0 >= 1 such that 7n-2 <= c*n for
n >= n0;
this is true for c = 7 and n0 = 1.
Theorem 1 (Big-Oh rules): any constant k is O(1); ignore
constant factors, e.g. 5n is O(n).
Theorem 2: certain terms dominate others as n grows:
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n) < ...
i.e. ignore lower-order terms.
Cont..
Theorem 3: a polynomial is O(the term containing
the highest power of n).
A polynomial's growth rate is determined by the
leading term (pick the highest order); the coefficient may
be ignored (drop constant factors),
i.e. k*f(n) is O(f(n)).
If f(n) is a polynomial of degree d, then f(n) is
O(n^d).
E.g. f(n) = 7n^4+3n^2+5n+1000 is O(n^4)
Example:
1. T(n) = 3n+5  →  O(n)
2. T(n) = 3n^2+4n+2  →  O(n^2)
If f(n) is a polynomial of degree d, then f(n) is O(n^d).
Big O in constant time
Constant factors
X = 5+(15*20);
It is independent of N
It is O(1)
What will be the cost of a sequence of constant statements?
X = 5+(15*20);
Y = 6-2;
Print Y+X;
Total time = O(1)+O(1)+O(1) = 3*O(1), but drop the constant
= O(1)
Linear time
We have discussed this before indirectly, but to summarize:
Eg
For X in the range(0,n)           // runs N times
Print X;                          // O(1); O(1)*N = O(N)
eg2
Y = 3+(6*9)                       // O(1)
For X in the range(0,n)           // O(N)
Print X;                          // O(1)
Total time = O(N)+O(1) = O(N)
Quadratic time // O(N^2): the nested loop
dominates the function
For x in range(0,n)
For y in range(0,n)
Print (x+y);                      // O(1)
O(N^2)
Growth order so far:
• O(1)
• O(N)
• O(N^2)
Big-Omega Notation(best case)
Just as O-notation provides an asymptotic upper bound on
a function,  notation provides an asymptotic lower bound.
Formal Definition: A function f(n) is ( g (n)) if there exist
constants c and k ∊ ℛ+ such that
f(n) >=c. g(n) for all n>=k.
f(n)= ( g (n)) means that f(n) is greater than or equal to
some constant multiple of g(n) for all values of n greater
than or equal to some k.
Example: If f(n) = n^2, then f(n) = Ω(n).
In simple terms, f(n) = Ω(g(n)) means that the growth rate
of f(n) is greater than or equal to that of g(n).
Cont..
F(n) = 2n+3: show that f(n) is Ω(g(n))
We need the function to satisfy 2n+3 >= c.g(n):
2n+3 >= 1.n for all n >= 1
  |      |  |
f(n)     c  g(n)
Therefore f(n) = Ω(g(n)) is true.
Also 2n+3 >= 1*log n, so f(n) = Ω(log n) is true as well.
Theta Notation(average case)
A function f (n) belongs to the set of  (g(n)) if there exist
positive constants c1 and c2 such that it can be sandwiched
between c1.g(n) and c2.g(n), for sufficiently large values of
n.
Formal Definition: A function f (n) is  (g(n)) if it is both
O( g(n) ) and  ( g(n) ). In other words, there exist
constants c1, c2, and k >0 such that c1.g (n)<=f(n)<=c2.
g(n) for all n >= k
If f(n)=  (g(n)), then g(n) is an asymptotically tight bound
for f(n).
In simple terms, f(n)=  (g(n)) means that f(n) and g(n)
have the same rate of growth.
Show where f(n) = 2n+3 points to:
eg
F(n) = 2n+3: show that f(n) is Θ(g(n))
c1.g(n) <= 2n+3 <= c2.g(n)
    |       |        |
 Ω(g(n))  f(n)    O(g(n))
1.n <= 2n+3 <= 5n for all n >= 1
Satisfying both Ω(g(n)) and O(g(n)) gives the average case, or
Theta notation; the left side is the lower bound (Big Omega)
and the right side the upper bound (Big Oh).
CATEGORIES OF ALGORITHM
ANALYSIS / types of measurement
• In order to determine the running time of an algorithm it is
possible to define three functions:
Tworst(n): worst case / Big O
Tbest(n): best case / Omega
Tavg(n): average case / Theta notation
• Algorithms may be examined under different situations to
correctly determine their efficiency for accurate comparison.
Best Case Analysis
• Takes the smallest possible set of inputs.
• The minimum amount of time that an algorithm requires to
solve a problem of size n.
• We must know the case that causes minimum number of
operations to be executed.
• In the linear search problem, the best case occurs when x is
present at the first location.
• The number of operations in the best case is constant (not
dependent on n).
• So the time complexity in the best case would be O(1).
Cont..
• Causes the fewest number of executions.
• It computes the lower bound of T(n) (Big Omega).
Examples: For sorting algorithms:
data are already arranged in the required order.
For searching algorithms:
 the required item is found at the first position.
Found at 1st place: one comparison.
Worst Case Analysis:
• The maximum amount of time that an algorithm requires to solve
a problem of size n.
• The worst case of the algorithm is the function defined by the
maximum number of steps taken on any instance of size n.
• This gives an upper bound for the time complexity of an
algorithm.
• Normally, we try to find worst-case behavior of an algorithm.
• It computes the upper bound of T(n)
• Examples: For sorting algorithms
 data are arranged in the opposite order.
For searching algorithms
 required item is found at the end of list or
the item is missing.
Cont..
Example problem/input:
Search(array, x)   case 1: x is the first element (best case),
number of checks = 1
case 2: x is not present (worst case),
number of checks = N
We usually concentrate on the worst case of an algorithm,
because we want to reduce the worst-case time,
and it depends on the input size (n).
Average Case Analysis: Tavg(n):
The average amount of time that an algorithm requires to
solve a problem of size n.
Sometimes, it is difficult to find the average-case behavior
of an algorithm.
We have to look at all possible data organizations of a given
size n, and the distribution probabilities of these
organizations.
Takes an average set of inputs.
It causes an average number of executions.
• We take all possible inputs and calculate the computing time
for each of them, sum all the calculated values, and
divide the sum by the total number of inputs.
Cont.…
• Shown graphically.
Assignment (5%; copying will lead to a zero mark)
Question 1. Read more about asymptotic analysis: its rules,
definitions, and theorems for linear and polynomial functions,
with detailed examples.
A. Omega Notation (Ω)
B. Theta Notation (Θ)
Question 2. Read and write some clarification of the terms below:
A. Skip lists
B. Self-organizing lists
C. Sparse tables
