EEB 435 Python LECTURE 3
Python Basics
By
Dr. Ebenezer Esenogho
• Algorithm Analysis
• Complexity analysis
• Big-O notation
• Amortized Cost
• Algorithm Analysis
• What is an Algorithm?
• An algorithm is a finite, well-defined sequence of steps that takes an input and produces a desired output.
Simple Examples
• Example 1: Making a Cup of Tea
• Imagine you're teaching an alien who has never seen a kitchen
before. Here's an algorithm for making a cup of tea:
• Input: Tea bag, water, cup, kettle, and optionally, sugar and milk.
• Steps:
– Fill the kettle with water.
– Turn on the kettle.
– Wait until the kettle boils.
– Place the tea bag in the cup.
– Pour the boiling water into the cup.
– Wait 3-5 minutes.
– Remove the tea bag.
– (Optional) Add sugar and/or milk to taste.
• Output: A cup of tea.
• Example 2: Finding the Maximum Number in a List
– Input: A list of numbers, e.g., [3, 1, 4, 1, 5, 9, 2, 6, 5].
– Steps:
– Assume the first number is the largest.
– For each number in the list, compare it to the current largest.
– If the current number is larger, it becomes the new largest.
– Continue until the end of the list.
– Output: The largest number (in this case, 9).
Pseudocode
Function FindMaximum(List)
    Set Max to List[0]
    For each Number in List
        If Number > Max
            Set Max to Number
    Return Max
End Function
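A direct Python translation of this pseudocode (a minimal sketch; the name find_maximum is assumed):

def find_maximum(nums):
    """Return the largest number in a non-empty list."""
    max_val = nums[0]              # Assume the first number is the largest
    for number in nums:            # Compare each number to the current largest
        if number > max_val:
            max_val = number       # A larger number becomes the new largest
    return max_val

print(find_maximum([3, 1, 4, 1, 5, 9, 2, 6, 5]))  # 9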
• 3. Real-World Algorithmic Examples
Binary Search: Given a sorted list of numbers and a target number,
binary search is an algorithm that finds if the target exists in the list.
It does this by repeatedly dividing the list in half until it either finds
the target or the range has no numbers left. Its efficiency is O(log n).
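As an illustration, a minimal iterative sketch of binary search (the name binary_search is assumed):

def binary_search(sorted_nums, target):
    """Return True if target is in sorted_nums, halving the search range each step."""
    low, high = 0, len(sorted_nums) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_nums[mid] == target:
            return True
        elif sorted_nums[mid] < target:
            low = mid + 1          # Target can only be in the upper half
        else:
            high = mid - 1         # Target can only be in the lower half
    return False                   # Range is empty: target is not in the list

print(binary_search([1, 2, 3, 5, 8, 13], 5))  # True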
Best case: the input for which the algorithm takes the minimum time; the best case gives a lower bound on an algorithm's running time. Example: in linear search, the best case occurs when the search key x is at the first position of the list. The number of operations in the best case is then constant (not dependent on n), so the best-case time complexity is Ω(1).
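To make this concrete, a minimal sketch of linear search (the name linear_search is assumed):

def linear_search(nums, x):
    """Return the index of x in nums, scanning left to right."""
    for i, value in enumerate(nums):
        if value == x:
            return i               # Best case: x at index 0, one comparison, Ω(1)
    return -1                      # Worst case: x absent, n comparisons, O(n)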
Average case: take all possible inputs, calculate the computation time for each, and divide the total by the number of inputs.
Example 3: for insertion sort, the worst case occurs when the array is reverse sorted, and the best case occurs when the array is already sorted in the same order as the output.
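A sketch of insertion sort showing both cases (the name insertion_sort is assumed): on an already-sorted array the inner loop never runs, giving O(n); on a reverse-sorted array every element is shifted all the way left, giving O(n^2).

def insertion_sort(nums):
    """Sort nums in place by inserting each element into the sorted prefix."""
    for i in range(1, len(nums)):
        key = nums[i]
        j = i - 1
        while j >= 0 and nums[j] > key:   # Shift larger elements one slot right
            nums[j + 1] = nums[j]
            j -= 1
        nums[j + 1] = key                 # Drop the key into its correct slot
    return nums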
Complexity analysis
Complexity analysis provides a high-level understanding of the efficiency
of an algorithm.
There are two primary types of complexity:
Time Complexity:
Time complexity is a measure of the amount of time an algorithm takes to run as a function of the size of its input. The common notations used to describe time complexity are:
•O(1) – Constant Time:
–The running time of the algorithm is constant regardless of the input size.
–Example: Accessing an array element by its index.
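For instance (a tiny sketch):

nums = [3, 1, 4, 1, 5, 9]
print(nums[2])   # O(1): indexing jumps straight to the element, whatever the list size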
O(n log n) – Linearithmic (Linear Logarithmic) Time:
–The running time grows in proportion to n log n; roughly, the algorithm does about log n passes of O(n) work each.
–Common in efficient sorting algorithms. Example: merge sort, quick sort (on
average).
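As an illustration of the trade-off, a minimal sketch of two ways to find the maximum of a list (the function names are assumed; the paragraph below compares them):

def find_max_linear(nums):
    """O(n) time, O(1) extra space: a single pass, one extra variable."""
    max_val = nums[0]
    for x in nums:
        if x > max_val:
            max_val = x
    return max_val

def find_max_by_sorting(nums):
    """O(n log n) time: sort a copy, then take the last element."""
    return sorted(nums)[-1]    # sorted() builds a new list, so O(n) extra space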
For the first example, we only iterate through the list once,
making it linear time. For the second, we sort the list first (which
can take O(n log n) time with good algorithms) and then simply
select the last element. The space complexity is O(1) for the first
because we're using a constant amount of additional space (just
for the max_val variable), while for the second, if the sorting
isn't done in-place, it might require additional O(n) space.
Space Complexity
Space complexity is a measure of the amount of memory an algorithm uses as a function of the size of its input. It is described with the same notations as time complexity: O(1) means constant extra memory, O(n) means memory proportional to the input size, and so on.
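As a sketch of the difference (function names assumed):

def total(nums):
    """O(1) extra space: a single accumulator, regardless of input size."""
    result = 0
    for x in nums:
        result += x
    return result

def squares(nums):
    """O(n) extra space: builds a new list as large as the input."""
    return [x * x for x in nums]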
Linked-list vs. Arrays
• Indexed access: O(1) in an array, O(n) in a linked list (you must walk node by node).
• Insert/delete at a known position: O(n) in an array (elements must shift), O(1) in a linked list (just relink pointers).
• Memory layout: arrays are contiguous and cache-friendly; linked lists store an extra pointer per node.
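A minimal sketch of the contrast, assuming a bare-bones singly linked list:

class Node:
    """One cell of a singly linked list: a value plus a pointer to the next cell."""
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

head = None
for value in [3, 2, 1]:
    head = Node(value, head)   # Prepend to a linked list is O(1): just relink

nums = []
for value in [3, 2, 1]:
    nums.insert(0, value)      # Prepend to an array-backed list is O(n): every element shifts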
Design Techniques
Several foundational techniques are employed for crafting algorithms:
Divide and Conquer: Break the problem into smaller sub-problems, solve each sub-problem, and combine the results. Examples
include Merge Sort and Quick Sort.
Dynamic Programming: Break the problem down into smaller sub-problems, solve each sub-problem, and store the result of each so that duplicate work is avoided. Example: the Fibonacci sequence (see the sketch after this list).
Greedy Algorithms: Make the locally optimal choice at each step, without looking ahead to the future. Example: the coin change problem for certain specific denominations.
Backtracking: Solve the problem incrementally, backtrack upon hitting an obstacle until a solution is found. Example: The N-Queens
problem.
Brute Force: Try every possible solution until a satisfactory solution is found. It's often the most time-consuming. Example: Finding
the shortest path in a small graph by examining all possible paths.
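As an illustration of dynamic programming on the Fibonacci example above, a minimal sketch using memoisation (one possible approach, not the only one):

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Return the n-th Fibonacci number; the cache stores each sub-problem once."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(50))   # 12586269025; the uncached recursion would take exponential time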
Big-O Notation
Big O notation is one of the most fundamental tools for
computer scientists to analyse the cost of an algorithm.
Now, imagine you have a task, like searching for a friend in a line of people:
• If you have to look at every single person in the line to find your friend, and the
time it takes grows with the number of people, we might say this takes O(n)
time, where n is the number of people.
• If you knew your friend was in the middle of the line and went straight to them
every time, no matter how long the line is, we would say it takes O(1) time. This
is constant time because it doesn't change based on the size of the line.
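The same contrast in Python (a sketch; the names and data are assumed): scanning a list is O(n), while a dictionary keyed by name goes straight to the answer in O(1) on average:

line = ["Ann", "Ben", "Carl", "Dee"]                 # The line of people
print(line.index("Carl"))                            # O(n): walks the line person by person

positions = {name: i for i, name in enumerate(line)}
print(positions["Carl"])                             # O(1) on average: a hash lookup, no walking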
Why Is Big-O Important?
• It lets us compare algorithms independently of the hardware, language, or constant factors involved.
• It predicts how running time and memory grow as the input gets large, which is where efficiency really matters.
• It guides the choice of algorithms and data structures before any code is written.
Amortized Cost
What is Amortized Cost?
Amortized cost analysis is a method used to find the average time required per operation over a worst-case sequence of operations. Instead of considering each operation's worst case individually, we look at the overall performance across the entire sequence.
Simple Analogy:
Imagine you have a piggy bank. Most days, you put in just one coin, which takes a
second. But once a year, you count and wrap all the coins, which might take hours.
Even though this one event is time-consuming, on average (or "amortized" over the
year), the time you spend on the piggy bank is relatively low.
"Most days, you put in just one coin, which takes a second."
Meaning: Every day, you have a small, consistent task. Adding one coin to the
piggy bank is quick and simple, analogous to a basic operation in an algorithm or
data structure that typically runs fast.
"But once a year, you count and wrap all the coins, which might take hours."
Meaning: Even though there's a long task once a year, if you spread out that "cost"
over the whole year, the average time you spend per day is low. Similarly, even if
an algorithm occasionally takes a long time for a particular operation, its "average"
or "amortized" cost might still be low if those expensive operations are rare.
Python Example: Dynamic Array
One of the classic examples of amortized analysis is the dynamic array
in Python, commonly known as the list.
When you append an element to a Python list, most of the time it's a fast
O(1) operation. However, occasionally, the underlying array runs out of
space, and Python needs to create a larger array, copy all the elements
from the old array to the new one, and then finally append the new
element. This resizing operation is costly compared to a regular append.
import time

def append_to_list(n):
    """
    Append n items to a list and record the time taken for each append.
    """
    my_list = []   # Create an empty list
    times = []     # One timing per append
    for i in range(n):
        start_time = time.time()
        my_list.append(i)                              # The operation being timed
        end_time = time.time()
        times.append((end_time - start_time) * 1000)   # Time in milliseconds
    return times

# Let's append 10,000 items to the list and see the times.
times_taken = append_to_list(10000)
When you run this code, you will observe that the vast majority of
append operations are extremely quick, but a few stand out as
significantly longer. Those longer times are the moments when
Python is resizing the list behind the scenes.
However, when averaged out across all operations, the time for each
append remains consistently low, demonstrating the concept of an
amortized O(1) time complexity for the append operation on a Python
list.
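To check this numerically (a small follow-up sketch building on times_taken from the code above):

average_ms = sum(times_taken) / len(times_taken)
slowest_ms = max(times_taken)
print(f"average append: {average_ms:.6f} ms, slowest append: {slowest_ms:.6f} ms")
# The slowest append (a resize) is far above the average,
# yet the average stays tiny: amortized O(1).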
Kea leboga (Thank you)