Algo Chapter 2
Analysis concepts
Asymptotic complexity
Big-O, Ω, Θ notations
Common complexity classes
Best, average and worst case complexity
REVIEW
An algorithm is a well-defined computational procedure that takes
some values as input and produces some values as output.
Like a cooking recipe, an algorithm provides a step-by-step method
for solving a computational problem.
They are mathematical entities, which can be thought of as running on
some sort of idealized computer with an infinite random access
memory and an unlimited word size.
Algorithm design is all about the mathematical theory behind the
design of good programs.
An algorithm is a sequence of steps.
A problem may be solved by many different algorithms.
The algorithm that takes the least amount of space is the best algorithm.
Input ----> Algorithm ----> Output
Analysis of Algorithms
Algorithm analysis refers to the process of determining how
much computing time and storage an algorithm will require.
In other words, it’s a process of predicting the resource
requirement of algorithms in a given environment.
After designing an algorithm, it has to be checked and its
correctness needs to be verified; this is done by analyzing
the algorithm.
Writing a syntax-error-free program is not enough. We also need
to know whether the algorithm is correct, i.e. whether
it gives correct answers for the inputs.
If the program is run on a large data set, the running time
becomes an issue.
Cont..
We want to know how the algorithm performs when the
input size is large.
The program may be computationally inefficient, or may
need lots of memory.
We analyze the resources that the algorithm requires:
memory, and computation time.
We analyze algorithms, rather than problems.
A problem can be solved with several algorithms, some are
more efficient than others.
Analysis of an algorithm refers to calculating or estimating the
resources needed by the algorithm.
Resources include computer memory, processing time, etc.
Of all these factors, time is the most important, because the
program developed should be fast enough.
• The choice of a particular algorithm depends on the
following performance analyses and measurements:
Space complexity
Time complexity
Note: Complexity also relates to how easy an algorithm is to understand;
given two algorithms, one complicated and one clear, we tend to
prefer the clear one. In other words, there are many ways to solve a
problem,
but which way (i.e. which algorithm) is better? We check by
measuring efficiency (performance analysis).
Space complexity
Space complexity analysis determines the amount of memory a given
algorithm needs to run to completion.
Some of the reasons for studying space complexity are:
There may be several possible solutions with different
space requirements.
We may be interested to know in advance whether
sufficient memory is available to run the program.
It can be used to estimate the size of the largest problem that
a program can solve.
Memory Usage while Execution
While executing, an algorithm uses memory space for three reasons:
Instruction Space
It's the amount of memory used to save the compiled version of
instructions.
Environmental Stack
Sometimes an algorithm (function) may be called inside another
algorithm (function). In such a situation, the current variables are
pushed onto the system stack, where they wait for further execution,
and then the call to the inner algorithm (function) is made.
• For example, if a function A() calls function B() inside it, then all the
variables of the function A() will be stored on the system stack
temporarily while the function B() is called and executed inside the
function A().
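As a small illustrative sketch of this (the functions A and B match the names in the example above, but their parameters and bodies are made up here):

#include <iostream>
using namespace std;

// B() runs while the local variables of A() wait on the system stack.
int B(int y)
{
    return y * 2;
}

// A() calls B(); the variable x stays on the stack until B() returns.
int A(int x)
{
    int result = B(x) + 1;
    return result;
}

int main()
{
    cout << A(5) << endl;   // prints 11
    return 0;
}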
Cont..
• Data Space
Amount of space used by the variables and constants.
Calculating space complexity
For calculating the space complexity, we need to know the amount of
memory used by different types of variables, which generally
varies for different operating systems and compilers, but the method for
calculating the space complexity remains the same.
Data Type                                    Size
bool, char, unsigned char, signed char       1 byte
short int, unsigned short int                2 bytes
float, int, long int                         4 bytes
double, long double, long long               8 bytes
Now let's learn how to compute space complexity by taking a few
examples
// signature assumed: a function taking three int parameters
int sum(int a, int b, int c)
{
    int z = a + b + c;
    return (z);
}

In the above function, the variables a, b, c and z are all of
integer type, hence they will take up 4 bytes each, so the total
memory requirement will be (4(4) + 4) = 20 bytes; the
additional 4 bytes are for the return value. Because this
space requirement is fixed regardless of the input, it
is called Constant Space Complexity.
To know the size of a datatype, you can write the following in a C++
program:
cout << "the size of integer\t" << sizeof(int) << endl;
cout << "the size of character\t" << sizeof(char) << endl;
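A complete little program along these lines (a minimal sketch; the exact messages and the extra types shown are just examples):

#include <iostream>
using namespace std;

int main()
{
    // Print the size, in bytes, of a few built-in datatypes.
    cout << "the size of integer\t" << sizeof(int) << endl;
    cout << "the size of character\t" << sizeof(char) << endl;
    cout << "the size of float\t" << sizeof(float) << endl;
    cout << "the size of double\t" << sizeof(double) << endl;
    return 0;
}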
Cont..
Let's have another example, this time a bit more complex one. // n
is the length of array a[]
int sum(int a[], int n)
{
    int x = 0;                      // 4 bytes for x
    for (int i = 0; i < n; i++)     // 4 bytes for i
    {
        x = x + a[i];               // 4n bytes for the n elements of a[]
    }
    return (x);                     // 4 bytes for the return value
}
Cont..
In the above code, 4*n bytes of space are required for the
array a[] elements.
Hence the total memory requirement will be (4n + 12) bytes
(4 bytes each for x, i, and the return value), which increases
linearly with the input value n; hence it is called Linear Space Complexity.
Similarly, we can have quadratic and other complex space
complexity as well, as the complexity of an algorithm
increases.
But we should always focus on writing algorithm code in
such a way that we keep the space complexity minimum.
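As a rough sketch of the quadratic-space case mentioned above (the function makeTable and the use of a dynamically allocated n x n table are made up for illustration):

// Builds an n x n table of ints, so it needs roughly 4 * n * n bytes
// for the cells (plus pointer overhead): quadratic space, O(n^2).
int **makeTable(int n)
{
    int **table = new int *[n];     // n row pointers
    for (int i = 0; i < n; i++)
    {
        table[i] = new int[n];      // n ints per row
        for (int j = 0; j < n; j++)
            table[i][j] = 0;
    }
    return table;
}

(Freeing the table with delete[] is omitted for brevity.)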
Cont..
• Algorithm swap(a, b)
{                                   space complexity
    temp = a;                       variable a    = 1
    a = b;                          variable b    = 1
    b = temp;                       variable temp = 1
}                                   -------+------- 3
Space complexity: S(n) = 3
But we do not know the datatype of these variables, so each one is
simply counted as one word:
S(n) = 3 words
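A C++ version of this pseudocode might look like the sketch below (the int type and the name swapValues are assumptions; the pseudocode does not fix a datatype):

// Swaps a and b using one extra variable (temp),
// so the extra space is constant: three words in total.
void swapValues(int &a, int &b)
{
    int temp = a;
    a = b;
    b = temp;
}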
Time complexity
The amount of time that an algorithm needs to run to
completion.
There are several factors affecting the running time
Type of computer
Quality of compiler
Algorithm
input to the algorithm
The content of the input affects the running time. Typically,
the input size (the number of items in the input) is the main
consideration.
E.g. sorting problem ⇒ the number of items to be sorted.
Running time of an Algorithm
• Running time of an algorithm depends upon the input size
and the nature of the input.
if (condition)
{
    Statement 1
}
else
{
    Statement 2
}
For an if/else, the running time is the time to evaluate the condition
plus the larger (MAX) of the running times of the two branches, as in
the example below.
#include <iostream>
#include <cstdlib>   // for system("pause")
using namespace std;

int main()
{
    int a, b, c;
    cout << "enter two numbers\n";   // 1
    cin >> a;                        // 1
    cin >> b;                        // 1
    if (a > b)                       // 1
    {
        c = a + b;                   // 1
        cout << "sum of " << c;      // 1   (if branch: 2 statements)
    }
    else
    {
        cout << "b" << endl;         // 1   (else branch: 1 statement)
    }
    system("pause");
}
T(n) = 1 + 1 + 1 + 1 + MAX(2, 1) = 6
Algorithm analysis: T(n) = 6
Asymptotic analysis: O(1)
3. Loop statements:
The running time of a loop is: the time for the initialization (1) +
the time for the test (number of iterations + 1) + the time for the
update (number of iterations) + the time for the loop body. For a
nested loop, the whole cost of the inner loop is multiplied by the
number of iterations of the outer loop.
for (int i = 1; i <= n; i++)         // 1 + (n + 1) + n
{
    for (int j = 1; j <= n; j++)     // n * (1 + (n + 1) + n)
    {
        k++;                         // n * n = n^2
    }
}
T(n) = 1 + (n + 1) + n + n(1 + (n + 1) + n + n)
T(n) = 2 + n + n + n + n^2 + n + n^2 + n^2 = 3n^2 + 4n + 2
The time complexity of such a nested loop is considered O(n^2)
when the loop variables are incremented or decremented by some
constant amount.
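A self-contained version of this nested loop (main, the sample value of n, and the printout are added here only to make the fragment runnable):

#include <iostream>
using namespace std;

int main()
{
    int n = 5;      // example input size
    int k = 0;
    // The body k++ runs n * n times, which is why T(n) grows as n^2.
    for (int i = 1; i <= n; i++)
    {
        for (int j = 1; j <= n; j++)
        {
            k++;
        }
    }
    cout << "k = " << k << endl;    // prints 25 (= n * n) for n = 5
    return 0;
}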
Cont..
The time complexity of a single loop like the one below is considered
O(n) when the loop variable is incremented or decremented by some
constant amount:
int k = 0, n;
cout << "Enter an integer";
cin >> n;
for (int i = 0; i <= n; i++)    // the body runs about n times: 1 + 1 + 1 + ... = n
    k++;
If instead the loop variable is divided by a constant (say 2) on every
iteration, then after k iterations its value is n / 2^k. Assume the loop
terminates when this value drops below 1:
n / 2^k < 1, i.e. at termination n / 2^k = 1
n = 2^k  -->  k = log2(n)
Therefore the time complexity of such a loop is O(log n).
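A small sketch of such a logarithmic loop (the halving loop is implied by the derivation above but not shown on the slide, so this code is an illustration):

#include <iostream>
using namespace std;

int main()
{
    int n = 32;     // example input
    int count = 0;
    // i takes the values n, n/2, n/4, ... and the loop stops when i < 1,
    // so the body runs about log2(n) times: O(log n).
    for (int i = n; i >= 1; i = i / 2)
    {
        count++;
    }
    cout << "iterations = " << count << endl;   // prints 6 for n = 32
    return 0;
}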
Determine running time of program
int total(int n)
{
    int sum = 0;
    for (int i = 1; i <= n; i++)
        sum = sum + 1;
    return sum;
}
Time Units to Compute
-------------------------------------------------
Cont…
• 1 for the assignment statement:
int sum=0
In the for loop:
1 assignment, n+1 tests, and n increments.
n loops of 2 units for an assignment, and an addition.
1 for the return statement.
• ----------------------------------------------------
T(n) = 1 + (1 + n + 1 + n) + 2n + 1 = 4n + 4
Algorithm analysis: T(n) = 4n + 4
• Asymptotic analysis: O(n)
Cont…
• int sum(int n)
{
    int partial_sum = 0;
    for (int i = 1; i <= n; i++)
        partial_sum = partial_sum + (i * i * i);
    return partial_sum;
}
• Time Units to Compute
-------------------------------------------------
Cont…
• O(N)
• O(N^2)
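A possible count, assuming the same unit-counting convention as the previous example (counting each multiplication, the addition, and the assignment in the loop body as one unit each is an assumption):
1 for the assignment statement int partial_sum = 0.
In the for loop: 1 assignment, n+1 tests, and n increments.
n loops of 4 units each: two multiplications, one addition, and one assignment.
1 for the return statement.
T(n) = 1 + (1 + n + 1 + n) + 4n + 1 = 6n + 4
Asymptotic analysis: O(n)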
Big-Omega Notation (best case)
Just as O-notation provides an asymptotic upper bound on
a function, Ω-notation provides an asymptotic lower bound.
Formal Definition: A function f(n) is Ω(g(n)) if there exist
constants c and k ∊ ℛ+ such that
f(n) >= c.g(n) for all n >= k.
f(n) = Ω(g(n)) means that f(n) is greater than or equal to
some constant multiple of g(n) for all values of n greater
than or equal to some k.
Example: If f(n) = n^2, then f(n) = Ω(n).
In simple terms, f(n) = Ω(g(n)) means that the growth rate
of f(n) is greater than or equal to that of g(n).
Cont..
Example: f(n) = 2n + 3. Show that f(n) is Ω(g(n)) for g(n) = n.
We need 2n + 3 >= c.g(n) for some constant c and all large enough n:
2n + 3 >= 1.n for all n >= 1
  |         |  |
 f(n)       c  g(n)
Therefore f(n) = Ω(n) is true (with c = 1 and k = 1).
Also, 2n + 3 >= 1 * log n for all n >= 1, so f(n) = Ω(log n) is true as well.
Theta Notation (average case)
A function f(n) belongs to the set Θ(g(n)) if there exist
positive constants c1 and c2 such that it can be sandwiched
between c1.g(n) and c2.g(n) for sufficiently large values of
n.
Formal Definition: A function f(n) is Θ(g(n)) if it is both
O(g(n)) and Ω(g(n)). In other words, there exist
constants c1, c2, and k > 0 such that c1.g(n) <= f(n) <= c2.g(n)
for all n >= k.
If f(n) = Θ(g(n)), then g(n) is an asymptotically tight bound
for f(n).
In simple terms, f(n) = Θ(g(n)) means that f(n) and g(n)
have the same rate of growth.
Example: f(n) = 2n + 3. Which g(n) satisfies f(n) = Θ(g(n))?
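A worked answer in the style of the Ω example above (the particular constants are one choice among many):
Take g(n) = n. For all n >= 1:
1.n <= 2n + 3 <= 5n
so with c1 = 1, c2 = 5, and k = 1 we have c1.g(n) <= f(n) <= c2.g(n),
hence f(n) = 2n + 3 is Θ(n).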