Session 0 - Algorithms Efficiency Analysis and Order (3)
Algorithms - Session 0
Matthew Huang
Jan 7, 2025
Self introduction
Midterm (30%)
This will be an in-class exam
Project (20%)
This will be individual or group project(s)
Final (30%)
• Algorithms
• The Importance of Developing Efficient Algorithms
• Analysis of Algorithms
• Order
• Outline of This Book
Problem, Solution, Algorithm
• A problem is a question to which we seek an answer.
• Sort a list S of n numbers in nondecreasing order.
• Answer: the numbers in sorted sequence.
• e.g., S = [10, 7, 11, 5, 13, 8] and n = 6; the answer is [5, 7, 8, 10, 11, 13].
• Determine whether the number x is in the list S of n numbers.
• Answer: yes if x is in S and no if it is not.
• e.g., S = [10, 7, 11, 5, 13, 8], n = 6, and x = 5; the answer is "yes, x is in S."
• f1 = 1
• f2 = 1
• f3 = f2 + f1 = 1 + 1 = 2
• f4 = f3 + f2 = 2 + 1 = 3
• f5 = f4 + f3 = 3 + 2 = 5; etc.
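The recurrence above can be sketched as a naive recursive function (a minimal sketch, assuming f(1) = f(2) = 1):

```python
def fib(n):
    """Return the nth Fibonacci number via the recurrence f(n) = f(n-1) + f(n-2)."""
    if n <= 2:                      # base cases: f(1) = f(2) = 1
        return 1
    return fib(n - 1) + fib(n - 2)  # each call spawns two smaller calls
```

Each call spawns two more calls, which is exactly the recursion tree whose size is analyzed next.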
• The number of terms in the recursion tree more than doubles every time n
increases by 2. For example, there are 9 terms in the tree when n = 4 and
25 terms when n = 6. If we let T(n) be the number of terms in the
recursion tree for n, and the number of terms more than doubled
every time n increased by 2, we would have the following for n
even:

T(n) > 2 · T(n − 2) > 2^2 · T(n − 4) > · · · > 2^(n/2) · T(0)

Because T(0) = 1, this would mean T(n) > 2^(n/2). We use induction to show
that this is true for n >= 2 even if n is not even.
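As a sanity check, the terms in the recursion tree can be counted directly (an illustrative sketch that treats f(0) and f(1) as the leaf terms):

```python
def count_terms(n):
    """Count the terms (nodes) in the recursion tree for computing fib(n)."""
    if n <= 1:
        return 1  # a leaf term: f(0) or f(1)
    # the root plus the two subtrees for f(n-1) and f(n-2)
    return 1 + count_terms(n - 1) + count_terms(n - 2)

# count_terms(4) == 9 and count_terms(6) == 25, matching the slide,
# and both exceed the lower bound 2^(n/2).
```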
Theorem 1.1
An iterative approach
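An iterative version avoids the exponential recursion tree by computing each term exactly once; a minimal sketch, again assuming f(1) = f(2) = 1:

```python
def fib_iter(n):
    """Compute the nth Fibonacci number iteratively, using O(n) additions."""
    if n <= 2:
        return 1                          # f(1) = f(2) = 1
    prev, curr = 1, 1
    for _ in range(3, n + 1):
        prev, curr = curr, prev + curr    # slide the two-term window forward
    return curr
```

Only n − 2 additions are performed, versus the more than 2^(n/2) terms of the recursive version.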
Analysis of Algorithms
• When evaluating an algorithm's time efficiency, we don't focus on the specific
number of CPU cycles or instructions executed, as these vary based on the
computer, programming language, and programmer. Instead, we use a measure that
is independent of these variables.
• Generally, the running time of an algorithm increases with the input size and is
roughly proportional to the number of times a basic operation (like a comparison) is
performed. Therefore, we analyze an algorithm's efficiency by counting the number
of basic operations it performs relative to the input size.
• After figuring out the input size, we identify an instruction (or group of instructions)
that represents the bulk of the algorithm's work. This is called the basic operation.
• For example, in the problem “Determine whether the number x is in the list S of
n numbers”, comparing a value x with an item in S is a suitable basic operation.
By counting how many times this comparison happens for different input sizes,
we can understand how efficient the algorithms are compared to each other.
• In general, a time complexity analysis of an algorithm is the determination of
how many times the basic operation is done for each value of the input size.
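For the search problem above, counting the basic operation (comparing x with an item of S) could be instrumented like this (an illustrative sketch, not from the slides):

```python
def sequential_search(S, x):
    """Return (found, comparisons): whether x is in S, and how many times
    the basic operation (comparing x with an item of S) was performed."""
    comparisons = 0
    for item in S:
        comparisons += 1        # one basic operation
        if item == x:
            return True, comparisons
    return False, comparisons
```

For S = [10, 7, 11, 5, 13, 8] and x = 5, the comparison is done 4 times; when x is absent, it is done n times.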
Complexity Analysis
• Every-Case Time Complexity: T(n) is defined as the number of times
the algorithm does the basic operation for an instance of size n. T(n) is
called the every-case time complexity of the algorithm, and the
determination of T(n) is called an every-case time complexity analysis.
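A classic every-case example is summing the n items of a list: the basic operation (an addition) is done exactly n times on every instance of size n, so T(n) = n. A small sketch:

```python
def sum_with_count(S):
    """Sum the items of S, counting the basic operation (the addition).
    The addition runs exactly len(S) times on every instance, so T(n) = n."""
    total, additions = 0, 0
    for item in S:
        total += item
        additions += 1
    return total, additions
```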
Complexity Analysis - Cont.
In many scenarios, the basic operation is not done the same number of times for all
instances of size n, and such an algorithm does not have an every-case time complexity.
However, this does not mean that we cannot analyze such algorithms, because there are
other analysis techniques that can be tried.
Consider the maximum number of times the basic operation is done. For a given
algorithm, W(n) is defined as the maximum number of times the algorithm will ever do
its basic operation for an input of size n. W(n) is called the worst-case time
complexity of the algorithm, and the determination of W(n) is called a worst-case time
complexity analysis.
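For sequential search, the worst case occurs when x is not in S (or is the last item), so all n items are compared and W(n) = n. A quick check under these assumptions:

```python
def worst_case_comparisons(n):
    """W(n) for sequential search: when x is absent, all n items are compared."""
    S = list(range(n))
    x = -1                      # a value guaranteed not to be in S
    comparisons = 0
    for item in S:
        comparisons += 1        # basic operation: compare x with an item of S
        if item == x:
            break
    return comparisons
```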
Worst-Case Time Complexity