
CS2040 Note

Quicksort has an average time complexity of O(n log n) but a worst case of O(n^2). It partitions elements around a pivot element and recursively sorts the subarrays. AVL trees are height-balanced binary search trees that use rotations to ensure the height difference between left and right subtrees is at most 1. Union-find structures maintain disjoint sets of elements efficiently through the find and union operations.
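As a quick illustration of the partition-and-recurse scheme described above, here is a minimal quicksort sketch in Python (the random-pivot choice and function names are illustrative, not from the note):

```python
import random

def quicksort(a):
    """In-place quicksort: expected O(n log n) with a random pivot,
    worst case O(n^2)."""
    def partition(lo, hi):
        # Move a random pivot to the end, then do a Lomuto partition:
        # everything < pivot ends up left of index i.
        r = random.randint(lo, hi)
        a[r], a[hi] = a[hi], a[r]
        pivot = a[hi]
        i = lo
        for j in range(lo, hi):
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]
        return i

    def sort(lo, hi):
        if lo >= hi:
            return
        p = partition(lo, hi)      # pivot now at its final position p
        sort(lo, p - 1)            # recursively sort both subarrays
        sort(p + 1, hi)

    sort(0, len(a) - 1)
    return a
```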


Sorting (best case, worst case, stability):
  insertion sort:  n,       n^2,     stable: yes
  selection sort:  n^2,     n^2,     stable: no
  bubble sort:     n,       n^2,     stable: yes
  merge sort:      n log n, n log n, stable: yes
  quicksort:       n log n, n^2,     stable: no

QuickSort:
There is a pivot; all smaller elements are placed to its left, all bigger
elements to its right. Bigger elements on the left (scanning from the start)
are swapped with smaller elements on the right (scanning from the end).
- Use 3-way partitioning for duplicates.
Time complexity: in general, the time taken can be written as
  T(n) = T(k) + T(n - k - 1) + Θ(n)
Worst case: always picks the greatest or smallest element as pivot: O(n^2)
Best case: always pick the median element as pivot / split the array
(1/10):(9/10) / pick a random pivot: O(n log n)
Space: O(log n)
Stable: no, since in-place partitioning is not stable; stable with extra
memory (only if the partitioning is stable).
In-place sorting: yes; the extra space stores recursive calls, not the input.
*Let p be the probability of event x obtaining a good partition; the expected
number of random partitions needed to get a good partition is E(x) = 1/p.*

QuickSelect:
Run partition on a random pivot and recurse on the side your target is on.
Time: O(n) (expected), Space: O(1)
When to use?
- Finding the kth smallest/largest element in an array, e.g. finding the median
- Finding all k smallest/largest elements
NOTE: if k is a constant, just use linear search.

AVL Trees: height-balanced trees (|v.left.height - v.right.height| <= 1)
- A height-balanced tree with height h has at least n > 2^(h/2) nodes.
Tree rotations:
  if the tree is right heavy:
    if the tree's right subtree is left heavy: Right-Left Rotation
    else: Left Rotation
  else if the tree is left heavy:
    if the tree's left subtree is right heavy: Left-Right Rotation
    else: Right Rotation

Order Statistics Tree:
- AVL tree where each node also stores the count of nodes in its subtree.
Select(k) pseudocode:
  rank = m_left.weight + 1;
  if (k == rank) then return v;
  else if (k < rank) then return m_left.select(k);
  else if (k > rank) then return m_right.select(k - rank);
Rank(k) pseudocode:
  rank = node.left.weight + 1;
  while (node != null) do:
    if node is a left child then do nothing;
    else if node is a right child then rank += node.parent.left.weight + 1;
    node = node.parent;
  return rank;

Interval Tree:
- AVL tree where each node stores an interval (start, end), ordered by start.
  Each node also stores the maximum end in its subtree.
- Search for an interval containing a point: O(log n)
- List all intervals that overlap a point: O(k log n), k = number of
  overlapping intervals
Interval-search(x) pseudocode:
  c = root;
  while (c != null and x is not in c.interval) do
    if (c.left == null) then c = c.right;
    else if (x > c.left.max) then c = c.right;
    else c = c.left;
  return c.interval;

Orthogonal Range Tree:
- AVL tree where all points are stored in the leaves; each internal node is a
  copy that stores the max of any leaf in its left subtree.
- For > 1 dimensions, each node stores a nested tree for the next dimension
  containing all nodes in the subtree rooted at it.
Range query: O(k + log^d n) time, k = number of points in range, d = dimensions
Build tree: O(n log^(d-1) n) time; Space: O(n log^(d-1) n)

KD-Tree:
- Each level divides the points in the plane in half.
- Point query or range count: O(√n) for 2 dimensions, O(n^(1 - 1/k)) for k
  dimensions
- Range query: O(√n + k) for 2D, O(n^(1 - 1/k) + k) for kD
When to use?
- Range queries in multiple dimensions
- When space matters! (better space usage than range trees)

HashTable (assume simple uniform hashing):
- n items, m buckets => load = n/m = average items per bucket
- Expected search time = hash function + array access + n/m
- Inserting n elements into m buckets where n = m: expected maximum bucket
  size is O(log n), Θ(log n / log log n)
- Open addressing: on a collision, probe a sequence of buckets till one is empty.
  Linear probing: h(k, i) = f(k) + i mod m
  Double hashing: h(k, i) = f(k) + i*g(k) mod m
  Quadratic probing, fixed size m, load a: terminates if m is prime and a < 0.5
- Load a = n/m; assuming a < 1, the expected cost of an operation is <= 1/(1 - a).
  *Performance degrades badly as a -> 1.*
Amortized: an operation has amortized cost T(n) if, for every integer k, the
cost of k operations is <= k*T(n).

Graphs:
Clique: complete graph.
Diameter: length of the longest shortest path in a graph (can be used to
calculate the number of moves to solve a puzzle; the diameter of an
n x n x n Rubik's cube is Θ(n^2 / log n)).
Connected components (unweighted graphs): two nodes are in the same component
as long as they are connected by edges.
Number of connected components, assuming no cycles, given V and E: all
permutations of edges give the same number.
Strongly Connected Components.
Adjacency list: nodes (array) + edges (linked lists)
Adjacency matrix A: n x n array; edges = pairs of nodes (number of paths of
length n is given by A^n)

Union Find:
  Algo                                    Find       Union
  quick-find                              O(1)       O(n)
  quick-union                             O(n)       O(n)
  weighted-union                          O(log n)   O(log n)
  path compression                        O(log n)   O(log n)
  weighted-union with path compression    α(m, n)    α(m, n)
  (amortised α(m, n) with α <= 5 in practice; worst case O(log n))

Quick Find: (use array int[] componentId / hash table + open addressing)
  find(int p, int q)
    return (componentId[p] == componentId[q]);
  union(int p, int q)
    updateComponent = componentId[q];
    for (int i = 0; i < componentId.length; i++)
      if (componentId[i] == updateComponent)
        componentId[i] = componentId[p];

Quick Union: (use array int[] parent; 2 objects are connected if they have
the same root)
  find(int p, int q)
    while (parent[p] != p) p = parent[p];
    while (parent[q] != q) q = parent[q];
    return (p == q);
  union(int p, int q)
    while (parent[p] != p) p = parent[p];
    while (parent[q] != q) q = parent[q];
    parent[p] = q;
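As a runnable sketch of the quick-union routines (Python; the class name is illustrative, and elements are assumed to be numbered 0..n-1):

```python
class QuickUnion:
    """Quick-union: each element points to a parent; two elements are
    connected iff they share the same root. Find and union both walk
    to the root, so each is O(n) in the worst case (tall trees)."""

    def __init__(self, n):
        self.parent = list(range(n))  # every element starts as its own root

    def root(self, p):
        # Follow parent pointers until reaching a self-loop (the root).
        while self.parent[p] != p:
            p = self.parent[p]
        return p

    def find(self, p, q):
        return self.root(p) == self.root(q)

    def union(self, p, q):
        # Link p's root under q's root.
        self.parent[self.root(p)] = self.root(q)
```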
Weighted Union: (maintain size[]; link the root of the smaller tree under the
root of the larger tree, keeping tree height O(log n))
  union(int p, int q)
    while (parent[p] != p) p = parent[p];
    while (parent[q] != q) q = parent[q];
    if (size[p] > size[q]) {
      parent[q] = p; // link q to p
      size[p] = size[p] + size[q];
    } else {
      parent[p] = q; // link p to q
      size[q] = size[p] + size[q];
    }
Path Compression: (while finding the root, repoint traversed nodes closer to it)
Two-pass variant (points every node on the path directly at the root):
  findRoot(int p) {
    root = p;
    while (parent[root] != root) root = parent[root];
    while (parent[p] != p) {
      temp = parent[p];
      parent[p] = root;
      p = temp;
    }
    return root;
  }
One-pass variant (path halving: point each node at its grandparent):
  findRoot(int p) {
    root = p;
    while (parent[root] != root) {
      parent[root] = parent[parent[root]];
      root = parent[root];
    }
    return root;
  }
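Weighted union and path compression combine into a single structure with near-constant amortised operations. A minimal runnable sketch in Python (class and method names are illustrative, not from the note):

```python
class WeightedUF:
    """Weighted quick-union with two-pass path compression.
    Amortised cost per operation is alpha(m, n), the inverse Ackermann
    function (effectively <= 5 for any practical input)."""

    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n  # subtree sizes, used to link small under large

    def find_root(self, p):
        root = p
        while self.parent[root] != root:
            root = self.parent[root]
        # Second pass: point every node on the path directly at the root.
        while self.parent[p] != p:
            self.parent[p], p = root, self.parent[p]
        return root

    def union(self, p, q):
        rp, rq = self.find_root(p), self.find_root(q)
        if rp == rq:
            return
        if self.size[rp] > self.size[rq]:
            rp, rq = rq, rp          # make rp the root of the smaller tree
        self.parent[rp] = rq         # link smaller root under larger root
        self.size[rq] += self.size[rp]

    def connected(self, p, q):
        return self.find_root(p) == self.find_root(q)
```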

Creating a binary heap from n elements (bottom-up heapify) is O(n).
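A minimal sketch of the bottom-up heapify that gives the O(n) build (Python; in practice `heapq.heapify` does the same thing, and the function names here are illustrative):

```python
def sift_down(a, i, n):
    """Restore the min-heap property at index i, assuming both subtrees
    below i are already heaps."""
    while True:
        smallest = i
        left, right = 2 * i + 1, 2 * i + 2
        if left < n and a[left] < a[smallest]:
            smallest = left
        if right < n and a[right] < a[smallest]:
            smallest = right
        if smallest == i:
            return
        a[i], a[smallest] = a[smallest], a[i]
        i = smallest

def build_min_heap(a):
    """Sift down every internal node, from the last internal node up to
    the root. Most nodes are near the leaves and sift a short distance,
    so the total work is O(n), not O(n log n)."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):
        sift_down(a, i, n)
    return a
```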
