
Algorithms and Number Theory

Complexity
Complexity measures the efficiency of an algorithm, mainly in terms of time (how fast it runs)
and space (how much memory it uses). It helps us predict an algorithm's performance as the
input size grows.

●​ Time Complexity: Describes the amount of time an algorithm takes based on the size of
its input.​

●​ Space Complexity: Describes the amount of memory an algorithm uses relative to the
input size.​

We generally focus on the worst-case complexity unless otherwise mentioned.

Asymptotic Notation
To express complexity, we use asymptotic notations, which describe the behavior of
algorithms for large input sizes:

●​ Big O Notation (O): Upper bound; describes the worst-case scenario.​

○ Example: O(n^2) means the running time grows at most proportionally to the square of
the input size.

●​ Omega Notation (Ω): Lower bound; describes the best-case scenario.​

○ Example: Ω(n) means the running time is at least linear in the input size.

●​ Theta Notation (Θ): Tight bound; describes both upper and lower bounds.​

○ Example: Θ(n log n) means the running time grows proportionally to n log n, within constant factors.

Common Complexities
Complexity     Name                Example

O(1)           Constant time       Accessing an array element
O(log n)       Logarithmic time    Binary search
O(n)           Linear time         Simple for loop
O(n log n)     Linearithmic time   Merge sort
O(n²)          Quadratic time      Bubble sort
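
To make the table concrete, here is a small illustrative Python sketch (not part of the original notes) showing typical code shapes for O(1), O(n), and O(n²); the function names are made up for illustration:

# Illustrative only: typical code shapes for some of the complexities above.

def first_element(arr):          # O(1): constant time, independent of len(arr)
    return arr[0]

def contains(arr, target):       # O(n): one pass over the input
    for x in arr:
        if x == target:
            return True
    return False

def has_duplicate(arr):          # O(n²): nested loops compare all pairs
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):
            if arr[i] == arr[j]:
                return True
    return False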

3. Division and Applications

a. Modular Arithmetic

●​ Definition:​
An arithmetic system for integers where numbers "wrap around" after a certain value, n.​

●​ Notation:​
a ≡ b (mod n) means (a-b) is divisible by n.​

● Properties (illustrated in the sketch after this list):

○​ (a + b) mod n = (a mod n + b mod n) mod n​

○​ (a × b) mod n = (a mod n × b mod n) mod n​
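
As a quick illustration (not from the notes), both properties can be checked in Python, where % gives the remainder; the sample values a = 48, b = 35, n = 7 are arbitrary:

# Checking the modular-arithmetic properties for sample values.
a, b, n = 48, 35, 7

print((a + b) % n == ((a % n) + (b % n)) % n)   # True: addition property
print((a * b) % n == ((a % n) * (b % n)) % n)   # True: multiplication property
print(a % n)                                     # 48 mod 7 = 6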

b. Greatest Common Divisor (GCD)

● Definition:
The largest positive integer that divides both numbers without a remainder.

● Euclidean Algorithm:

GCD(a, b) = GCD(b, a mod b), with GCD(a, 0) = a

Example:​

GCD(48, 18)
→ GCD(18, 48 mod 18) = GCD(18, 12)
→ GCD(12, 18 mod 12) = GCD(12, 6)
→ GCD(6, 12 mod 6) = GCD(6, 0)
→ GCD = 6

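A minimal Python sketch of the Euclidean algorithm, added here for illustration (the function name gcd is our own, not from the notes):

def gcd(a, b):
    # Repeatedly replace (a, b) with (b, a mod b) until b is 0.
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))   # 6, matching the worked example above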

Combinatorial Analysis
1. Permutation and Combination

a. Basic Counting Principles

●​ Addition Rule:​
If event A can happen in m ways and B in n ways, and they can't co-occur:​
Total = m + n​

● Multiplication Rule:
If event A can happen in m ways and, for each of those, event B can happen in n ways:
Total = m × n

b. Permutations:

●​ Order matters.​

○​ Without repetition: P(n, r) = n! / (n-r)!


○​ With repetition: n^r​

c. Combinations:
●​ Order doesn't matter.​
C(n, r) = n! / [r!(n - r)!]​

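As an illustrative sketch (assuming Python 3.8+, where math.perm and math.comb are available), the formulas above can be evaluated directly:

import math

n, r = 5, 3

print(math.perm(n, r))    # P(5, 3) = 5!/2! = 60 (permutations without repetition)
print(n ** r)             # 5^3 = 125 (permutations with repetition)
print(math.comb(n, r))    # C(5, 3) = 5!/(3!*2!) = 10 (combinations)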

2. Divide and Conquer Algorithms

a. Quick Sort

●​ Strategy: Pick a pivot, partition the array around the pivot, and sort the subarrays
recursively.​

●​ Time Complexity:​

○​ Best & Average: O(n log n)


○ Worst: O(n^2) (when the pivot is repeatedly the smallest or largest element; a sketch follows below)
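
A minimal, out-of-place Python sketch of this strategy, added for illustration (it picks the middle element as pivot; an in-place partition is also common):

def quick_sort(arr):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]                  # pivot choice: middle element
    left = [x for x in arr if x < pivot]        # elements smaller than the pivot
    middle = [x for x in arr if x == pivot]     # elements equal to the pivot
    right = [x for x in arr if x > pivot]       # elements larger than the pivot
    # A consistently unbalanced split (bad pivot) is what causes the O(n^2) worst case.
    return quick_sort(left) + middle + quick_sort(right)

print(quick_sort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]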

b. Binary Search

●​ Strategy:​
Repeatedly divide the sorted array in half to find the target.​

●​ Time Complexity:​
O(log n)​

●​ Precondition:​
The array must be sorted (a sketch follows below).
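
A hedged Python sketch of iterative binary search, added for illustration:

def binary_search(arr, target):
    # Precondition: arr is sorted in ascending order.
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2          # index of the middle element
        if arr[mid] == target:
            return mid                   # found: return its index
        elif arr[mid] < target:
            low = mid + 1                # discard the left half
        else:
            high = mid - 1               # discard the right half
    return -1                            # not found

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3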

Bubble Sort Algorithm


Bubble Sort is a simple comparison-based sorting algorithm.​
It works by repeatedly swapping adjacent elements if they are in the wrong order.

●​ After each full pass through the array, the largest unsorted element "bubbles" to its
correct position at the end.​

Pseudocode
BubbleSort(arr):
    n = length of arr
    for i from 0 to n-1:
        for j from 0 to n-i-2:
            if arr[j] > arr[j+1]:
                swap arr[j] with arr[j+1]
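
A runnable Python version of this pseudocode, added for illustration; the swapped flag is the early-exit optimization referred to in the time-complexity table below, which gives the O(n) best case:

def bubble_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # After pass i, the last i elements are already in their final positions.
        for j in range(n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:          # no swaps in a full pass: the array is sorted
            break                # this early exit gives the O(n) best case
    return arr

print(bubble_sort([5, 2, 9, 1, 5, 6]))   # [1, 2, 5, 5, 6, 9]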

Example

Given array:​
arr = [5, 2, 9, 1, 5, 6]

Pass 1:​
[2, 5, 1, 5, 6, 9]

Pass 2:​
[2, 1, 5, 5, 6, 9]

Pass 3:​
[1, 2, 5, 5, 6, 9]

Pass 4 and 5:
No swaps – array is sorted.

Final sorted array:
[1, 2, 5, 5, 6, 9]

Time Complexity
Case            Time

Best case       O(n) (already sorted; with the early-exit optimization shown above)
Average case    O(n²)
Worst case      O(n²)
