
Data Structures and Algorithms Midterm Report

P.Vamsodhar
Abstract

This report delves into the fundamental concepts of data structures and algorithms,
crucial components in the field of computer science. The primary objective is to provide a
comprehensive overview of various data structures, including arrays, linked lists, stacks,
queues, trees, and graphs, along with their respective use-cases and performance
implications. Furthermore, the report explores essential algorithms such as sorting, searching,
and traversal techniques, highlighting their efficiency and practical applications.
Ultimately, this report serves as a resource for understanding how to effectively implement
and leverage data structures and algorithms to solve complex computational problems.
Contents

1 An Intro to Data Structures and Algorithms
1.1 Importance of Data Structures and Algorithms
1.2 Common Data Structures
1.3 Algorithm Analysis
1.3.1 Time Complexity
1.3.2 Space Complexity
1.3.3 Types of Space Complexity
1.4 Algorithm Design Techniques
1.4.1 Brute Force
1.4.2 Divide and Conquer
1.4.3 Dynamic Programming
1.4.4 Greedy Algorithms
1.5 Conclusion

2 Arrays and Linked Lists
2.1 Introduction to Arrays and Linked Lists
2.1.1 Arrays
2.1.2 Linked Lists
2.2 Operations on Arrays and Linked Lists
2.2.1 Array Operations
2.2.2 Linked List Operations
2.3 Implementation and Complexity Analysis
2.3.1 Array Implementation
2.3.2 Linked List Implementation
2.3.3 Complexity Analysis
2.4 Applications of Arrays and Linked Lists
2.4.1 Applications of Arrays
2.4.2 Applications of Linked Lists
2.5 Conclusion

3 Trees and Graphs
3.1 Introduction to Trees and Graphs
3.1.1 Trees
3.1.2 Graphs
3.2 Types of Trees and Graphs
3.2.1 Types of Trees
3.2.2 Types of Graphs
3.3 Operations on Trees and Graphs
3.3.1 Tree Operations
3.3.2 Graph Operations
3.4 Implementation and Complexity Analysis
3.4.1 Tree Implementation
3.4.2 Graph Implementation
3.4.3 Complexity Analysis
3.5 Applications of Trees and Graphs
3.5.1 Applications of Trees
3.5.2 Applications of Graphs
3.6 Conclusion

4 Weekly Progress Report
4.1 Week 1: Basic Data Structures
4.2 Week 2: Algorithm Analysis and Algorithm Design Techniques
4.3 Week 3: Sorting Algorithms and Searching Algorithms
4.4 Week 4: Advanced Topics

1 An Intro to Data Structures and Algorithms

Data Structures and Algorithms (DSA) form the backbone of computer science and
software engineering. They are essential tools that every programmer needs to master to
develop efficient, scalable, and maintainable software. This introduction provides an
overview of the fundamental concepts of DSA, explaining their importance and
applications in solving complex computational problems.

1.1 Importance of Data Structures and Algorithms


Data structures are specialized formats for organizing, processing, retrieving, and storing
data. They arrange data for a specific purpose so that it can be accessed and manipulated
efficiently. Algorithms are step-by-step procedures or formulas for solving problems.
When combined, data structures and algorithms enable the creation of efficient and
optimized software.

1.2 Common Data Structures


Arrays: Collections of elements identified by index or key. They provide quick access to
elements but have a fixed size.
Linked Lists: Collections of elements, called nodes, where each node points to the
next node. They allow dynamic memory allocation and efficient insertions and deletions.
Stacks: Collections of elements that follow the Last-In-First-Out (LIFO) principle.
Used for tasks like reversing strings and parsing expressions.
Queues: Collections of elements that follow the First-In-First-Out (FIFO) principle.
Commonly used in scheduling and buffering tasks.
Trees: Hierarchical structures with a root element and child elements. Binary trees,
AVL trees, and binary search trees are examples, used in databases and file systems.
Graphs: Collections of nodes connected by edges. Graphs are used to represent
networks, such as social networks and transportation systems.

1.3 Algorithm Analysis


1.3.1 Time Complexity
Time complexity is a computational concept that describes the amount of time it takes
to run an algorithm as a function of the length of the input. It provides a way to discuss
and compare the efficiency of different algorithms.

Asymptotic Notation
Time complexity is often expressed using Big O notation, which describes the upper
bound of the running time. This notation helps in comparing the worst-case scenarios of
algorithms as the input size grows.

• O(1): Constant time – The algorithm takes the same amount of time regardless of
the input size.

• O(log n): Logarithmic time – The time complexity grows logarithmically with the
input size.

• O(n): Linear time – The time complexity grows linearly with the input size.

• O(n log n): Linearithmic time – The time complexity grows in proportion to n
multiplied by log n.

• O(n²): Quadratic time – The time complexity grows quadratically with the input
size.

• O(2ⁿ): Exponential time – The time complexity grows exponentially with the input
size.

Analyzing Time Complexity


To determine the time complexity of an algorithm, we analyze the loops, recursive calls,
and nested operations to count the number of basic operations performed relative to the
input size.
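
To make this concrete, the following Python sketch contrasts a quadratic and a linear
solution to the same duplicate-detection task; the task and function names are
illustrative choices, not taken from any particular source.

# Counting basic operations to estimate time complexity.
def has_duplicate_quadratic(items):
    """O(n^2): the nested loops perform roughly n*(n-1)/2 comparisons."""
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    """O(n) expected time: one pass, each set lookup is O(1) on average."""
    seen = set()
    for x in items:        # n iterations
        if x in seen:      # average O(1) membership test
            return True
        seen.add(x)
    return False

print(has_duplicate_quadratic([3, 1, 4, 1, 5]))  # True
print(has_duplicate_linear([2, 7, 1, 8]))        # False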

1.3.2 Space Complexity


Space complexity refers to the amount of memory an algorithm uses in relation to the
input size. This includes the memory needed for the input, auxiliary variables, and any
data structures used in the algorithm.

Components of Space Complexity


Space complexity is usually broken down into two parts:
Fixed Part: Memory required by constants, simple variables, and fixed-size variables.
Variable Part: Memory required by dynamic variables, including those used in
recursive calls or dynamically allocated structures like arrays and linked lists.

1.3.3 Types of Space Complexity


O(1): Constant space – The algorithm uses a fixed amount of memory regardless of the
input size.
O(n): Linear space – The memory usage grows linearly with the input size.
O(n²): Quadratic space – The memory usage grows quadratically with the input size.
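
As a brief illustration (the functions below are example sketches, not taken from the
report's sources), the first routine uses O(1) auxiliary space while the second uses O(n):

def sum_constant_space(values):
    """O(1) auxiliary space: a single accumulator, regardless of input size."""
    total = 0
    for v in values:
        total += v
    return total

def prefix_sums_linear_space(values):
    """O(n) auxiliary space: the output list grows with the input."""
    sums = []
    running = 0
    for v in values:
        running += v
        sums.append(running)
    return sums

print(sum_constant_space([1, 2, 3, 4]))        # 10
print(prefix_sums_linear_space([1, 2, 3, 4]))  # [1, 3, 6, 10]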

1.4 Algorithm Design Techniques


Algorithm Design Techniques are systematic approaches to problem-solving that help in
developing efficient algorithms. These techniques provide structured methods to tackle
various types of computational problems, optimizing both time and space complexities.
Here are some key algorithm design techniques commonly used in computer science:

1.4.1 Brute Force


Brute Force is a straightforward approach where the algorithm systematically enumerates
all possible solutions and checks each one until a satisfactory solution is found.
Characteristics:
Exhaustive Search: It considers all possible solutions, regardless of efficiency.
Simple Implementation: Often easier to implement but can be computationally
expensive.
Applicability: Suitable for smaller problem instances or when more efficient algorithms
are not feasible.
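
A minimal Python sketch of the brute-force idea, using a two-sum task chosen purely
for illustration, checks every candidate pair until one satisfies the condition:

def two_sum_brute_force(nums, target):
    """Try every pair of indices: O(n^2) time, O(1) extra space."""
    n = len(nums)
    for i in range(n):
        for j in range(i + 1, n):
            if nums[i] + nums[j] == target:
                return (i, j)   # first satisfying pair found
    return None                 # exhaustive search found no solution

print(two_sum_brute_force([2, 7, 11, 15], 9))  # (0, 1)
print(two_sum_brute_force([1, 2, 3], 100))     # None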

1.4.2 Divide and Conquer


Divide and Conquer is a recursive algorithmic technique where a problem is divided
into smaller sub-problems of the same type, solved independently, and then combined to
obtain the solution of the original problem.
Characteristics:
Recursive Structure: Problems are divided into smaller, more manageable parts.
Efficiency: Can significantly reduce the time complexity compared to brute force.
Examples: Binary search, merge sort, and quicksort are classic examples that employ
divide and conquer.
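
The sketch below outlines merge sort, one of the examples named above; it is a
simplified Python illustration rather than a production implementation.

def merge_sort(arr):
    """Divide the list in half, sort each half recursively, then merge: O(n log n)."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Combine step: merge the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]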

1.4.3 Dynamic Programming


Dynamic Programming (DP) is a technique used to solve problems by breaking them
down into overlapping sub-problems and storing the results of sub-problems to avoid
redundant computations.
Characteristics:
Memoization or Tabulation: Techniques to store and reuse solutions to sub-problems.
Optimal Substructure: Solutions to larger problems can be constructed from optimal
solutions to smaller sub-problems.
Examples: Algorithms like Fibonacci sequence computation, shortest path problems
in graphs (e.g., Floyd-Warshall algorithm), and the knapsack problem.
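
As a small Python sketch of the Fibonacci example mentioned above, the memoized
version caches the results of overlapping sub-problems; the tabulated variant is shown
for comparison.

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Top-down memoization: O(n), versus O(2^n) for the naive recursion."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

def fib_tab(n):
    """Bottom-up tabulation: O(n) time, O(1) extra space."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib(10), fib_tab(10))  # 55 55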

1.4.4 Greedy Algorithms


Greedy Algorithms make decisions locally at each step with the hope of finding a globally
optimal solution. They choose the best option available at the moment, without considering
future consequences.
Characteristics:
Local Optimization: Makes the best choice at each step with the hope of finding
the overall optimal solution.
No Backtracking: Decisions are irrevocable and do not reconsider previous choices.
Examples: Algorithms for minimum spanning trees (e.g., Prim’s and Kruskal’s al-
gorithms), Dijkstra’s shortest path algorithm, and Huffman coding.
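
The Python sketch below illustrates the greedy pattern with interval scheduling
(activity selection), repeatedly taking the activity that finishes earliest; this task is an
illustrative choice and is not one of the examples listed above.

def select_activities(intervals):
    """intervals: list of (start, finish) pairs. Greedy choice: earliest finish time."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:      # compatible with everything chosen so far
            chosen.append((start, finish))
            last_finish = finish      # decision is never reconsidered
    return chosen

print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (8, 9)]))
# [(1, 4), (5, 7), (8, 9)]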

1.5 Conclusion
This chapter has provided an introductory overview of data structures and algorithms,
emphasizing their significance in computer science. Understanding these foundational
concepts is crucial for developing efficient and scalable software solutions. The subsequent
chapters will delve deeper into specific data structures, algorithms, and their
implementations, offering practical insights and examples.
2 Arrays and Linked Lists


2.1 Introduction to Arrays and Linked Lists


This chapter explores two fundamental data structures: arrays and linked lists. Both
structures are used to store collections of data, but they differ significantly in their
implementations and characteristics.

2.1.1 Arrays
An array is a collection of elements stored at contiguous memory locations. It allows
for efficient random access to elements using their indices. Arrays have a fixed size,
determined at the time of declaration, and can store elements of the same data type.
Characteristics:
Random Access: Elements can be accessed directly using their indices, making
array operations efficient.
Fixed Size: Arrays have a predetermined size that cannot be changed during runtime.
Memory Efficiency: Arrays allocate contiguous memory blocks, which can be
advantageous for caching and memory locality.


2.1.2 Linked Lists


A linked list is a linear collection of elements, called nodes, where each node contains
data and a reference (link) to the next node in the sequence. Unlike arrays, linked lists
do not require contiguous memory allocation and can dynamically grow and shrink.
Characteristics:
Dynamic Size: Linked lists can grow or shrink dynamically by allocating memory
as needed.
Pointer-based Navigation: Each node contains a reference (pointer) to the next
node, enabling traversal and manipulation.
Insertions and Deletions: Linked lists allow for efficient insertions and deletions
of elements, as they do not require shifting elements.

2.2 Operations on Arrays and Linked Lists


2.2.1 Array Operations
Arrays support several fundamental operations:
Access: Accessing elements by index (e.g., accessing the i-th element).
Insertion: Inserting elements at a specific index (e.g., inserting a new element at the
beginning or end of the array).
Deletion: Deleting elements from a specific index (e.g., removing an element from
the middle of the array).
Traversal: Iterating through all elements of the array to perform operations (e.g.,
searching or sorting).
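
The short Python sketch below runs through these operations on a list used as a
dynamic array; the values are illustrative.

arr = [10, 20, 30, 40]

print(arr[2])       # Access by index: O(1), prints 30
arr.insert(0, 5)    # Insertion at the beginning: O(n), existing elements shift right
arr.append(50)      # Insertion at the end: amortized O(1)
del arr[3]          # Deletion at an index: O(n), later elements shift left

for x in arr:       # Traversal: O(n) overall
    print(x)        # 5 10 20 40 50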

2.2.2 Linked List Operations


Linked lists support basic operations:
Traversal: Iterating through the linked list to access or modify elements.
Insertion: Inserting a new node at the beginning, end, or middle of the linked list.
Deletion: Removing a node from the linked list, based on the node’s value or position.
Search: Searching for a specific element within the linked list.

2.3 Implementation and Complexity Analysis


2.3.1 Array Implementation
Arrays can be implemented using static or dynamic memory allocation, depending on
the programming language and requirements.
Static Arrays: Declared with a fixed size that cannot be changed during runtime.
Memory is allocated at compile time.
Dynamic Arrays: Also known as resizable arrays, they allow for dynamic memory
allocation during runtime. Examples include ArrayList in Java and vector in C++.

2.3.2 Linked List Implementation


Linked lists are implemented using nodes, where each node contains data and a pointer/reference
to the next node in the sequence.
Singly Linked Lists: Each node points to the next node in the sequence.
Doubly Linked Lists: Each node points to both the next and previous nodes in the
sequence, enabling bidirectional traversal.
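
A minimal Python sketch of a singly linked list, matching the description above; the
class and method names are illustrative.

class Node:
    def __init__(self, data):
        self.data = data
        self.next = None            # reference to the next node

class SinglyLinkedList:
    def __init__(self):
        self.head = None

    def insert_at_head(self, data):
        """O(1): only the head reference is updated."""
        node = Node(data)
        node.next = self.head
        self.head = node

    def traverse(self):
        """O(n): follow the next references from the head."""
        values, current = [], self.head
        while current is not None:
            values.append(current.data)
            current = current.next
        return values

lst = SinglyLinkedList()
for x in (3, 2, 1):
    lst.insert_at_head(x)
print(lst.traverse())  # [1, 2, 3]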

2.3.3 Complexity Analysis


The time complexity of operations on arrays and linked lists varies based on the specific
operation and data structure:
Array Operations:
Access: O(1) – Directly accessing elements by index.
Insertion/Deletion (at the end): Amortized O(1) for dynamic arrays; an occasional
resize requires copying all elements, which costs O(n).
Insertion/Deletion (at the beginning): O(n) – Shifting elements to make space
or close gaps.
Linked List Operations:
Access: O(n) – Sequential traversal required to access elements.
Insertion/Deletion (at the beginning): O(1) – Updating pointers.
Insertion/Deletion (at the end): O(n) for a singly linked list, since the list must be
traversed to reach the last node; O(1) if a tail pointer is maintained.

2.4 Applications of Arrays and Linked Lists


2.4.1 Applications of Arrays
Arrays are used in various applications, including:
Database Management: Storing records and fields in tables.
Matrices: Representing and manipulating matrices in linear algebra.
Dynamic Programming: Caching solutions to sub-problems.

2.4.2 Applications of Linked Lists


Linked lists find applications in:
Dynamic Memory Allocation: Allocating memory for data structures like stacks
and queues.
Implementation of Graphs: Representing graphs using adjacency lists.
Undo Functionality: Implementing undo functionality in text editors.

2.5 Conclusion
This chapter has explored the fundamental concepts, operations, implementation details,
and complexity analysis of arrays and linked lists. Understanding these data structures
is essential for developing efficient and scalable software solutions. The next chapter will
delve deeper into more advanced data structures and their applications.
3 Trees and Graphs


3.1 Introduction to Trees and Graphs


This chapter explores two hierarchical data structures: trees and graphs. Both structures
are used to represent relationships and connections between data elements.

3.1.1 Trees
A tree is a hierarchical data structure consisting of nodes connected by edges. It has a
root node at the top and each node has zero or more child nodes. Trees are widely used in
computer science for representing hierarchical data, such as file systems, organizational
charts, and HTML/XML documents.
Characteristics:
Root Node: The topmost node in the tree structure.
Parent and Child Nodes: Nodes connected by directed edges.
Leaf Nodes: Nodes with no children.
Height of a Tree: The number of edges on the longest path from the root to a leaf.


3.1.2 Graphs
A graph is a non-linear data structure consisting of nodes (vertices) and edges connecting
them. Unlike trees, graphs can have cycles and multiple edges between nodes. Graphs are
used to represent relationships between objects, such as social networks, transportation
networks, and dependency graphs.
Characteristics:
Vertices and Edges: Nodes (vertices) connected by edges.
Directed vs. Undirected Graphs: Edges may have a direction (directed graph)
or no direction (undirected graph).
Weighted vs. Unweighted Graphs: Edges may have weights (weighted graph) or
no weights (unweighted graph).

3.2 Types of Trees and Graphs


3.2.1 Types of Trees
Various types of trees include:
Binary Tree: A tree in which each node has at most two children.
Binary Search Tree (BST): A binary tree in which for each node, the left subtree
contains only nodes with values less than the node’s value, and the right subtree contains
only nodes with values greater than the node’s value.
AVL Tree: A self-balancing binary search tree where the heights of the two child
subtrees of any node differ by at most one.

3.2.2 Types of Graphs


Various types of graphs include:
Undirected Graph: A graph in which edges have no direction.
Directed Graph (Digraph): A graph in which edges have a direction from one
vertex to another.
Weighted Graph: A graph in which edges have weights or costs associated with
them.

3.3 Operations on Trees and Graphs


3.3.1 Tree Operations
Common operations on trees include:
Traversal: Visiting all nodes in a specific order (e.g., inorder, preorder, postorder).
Insertion: Adding a new node to the tree.
Deletion: Removing a node from the tree.

3.3.2 Graph Operations


Common operations on graphs include:
Traversal: Visiting all vertices and edges in a specific order (e.g., depth-first search,
breadth-first search).

Insertion: Adding a new vertex or edge to the graph.


Deletion: Removing a vertex or edge from the graph.
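
The Python sketch below shows both traversals on a small dictionary-based adjacency
list; the graph itself is an illustrative example.

from collections import deque

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}

def bfs(start):
    """Breadth-first search: visit vertices level by level using a queue."""
    visited, order, queue = {start}, [], deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph[v]:
            if w not in visited:
                visited.add(w)
                queue.append(w)
    return order

def dfs(v, order=None):
    """Depth-first search: follow one path as deep as possible, then backtrack."""
    if order is None:
        order = []
    order.append(v)
    for w in graph[v]:
        if w not in order:
            dfs(w, order)
    return order

print(bfs("A"))  # ['A', 'B', 'C', 'D']
print(dfs("A"))  # ['A', 'B', 'D', 'C']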

3.4 Implementation and Complexity Analysis


3.4.1 Tree Implementation
Trees can be implemented using various approaches:
Binary Tree Implementation: Using nodes with left and right child pointers.
Binary Search Tree Implementation: Ensuring the left subtree is less than the
node, and the right subtree is greater than the node.
AVL Tree Implementation: Maintaining balance factors to ensure the tree remains
balanced.
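
A compact Python sketch of binary search tree insertion and inorder traversal,
consistent with the BST property described above; the names are illustrative.

class TreeNode:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def bst_insert(root, key):
    """Average O(log n) when the tree stays balanced, O(n) for a skewed tree."""
    if root is None:
        return TreeNode(key)
    if key < root.key:
        root.left = bst_insert(root.left, key)
    else:
        root.right = bst_insert(root.right, key)
    return root

def inorder(root):
    """Inorder traversal of a BST yields the keys in sorted order."""
    if root is None:
        return []
    return inorder(root.left) + [root.key] + inorder(root.right)

root = None
for k in (8, 3, 10, 1, 6):
    root = bst_insert(root, k)
print(inorder(root))  # [1, 3, 6, 8, 10]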

3.4.2 Graph Implementation


Graphs can be implemented using adjacency matrices or adjacency lists:
Adjacency Matrix: A 2D array where the presence/absence of an edge between
vertices is represented.
Adjacency List: A collection of linked lists or arrays that lists all the adjacent
vertices for each vertex.
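
For a small undirected graph with vertices 0 to 3 (the edge set is an illustrative
example), the two representations can be built as follows in Python:

edges = [(0, 1), (0, 2), (1, 3), (2, 3)]
n = 4

# Adjacency matrix: O(V^2) space, O(1) edge lookup.
matrix = [[0] * n for _ in range(n)]
for u, v in edges:
    matrix[u][v] = 1
    matrix[v][u] = 1

# Adjacency list: O(V + E) space, efficient iteration over neighbors.
adj = [[] for _ in range(n)]
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)

print(matrix[1][3])  # 1, the edge (1, 3) exists
print(adj[0])        # [1, 2], the neighbors of vertex 0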

3.4.3 Complexity Analysis


The time complexity of operations on trees and graphs varies based on the specific
operation and data structure:
Tree Operations:
Traversal: O(n) – Visiting all nodes in the tree.
Insertion/Deletion: O(log n) for balanced trees like AVL trees; O(n) for unbalanced
trees.
Graph Operations:
Traversal: O(V + E) – Visiting all vertices (V) and edges (E) in the graph.
Insertion/Deletion: Adding an edge is O(1) for an adjacency list; adding or removing
a vertex in an adjacency matrix costs O(V²), since the matrix must be rebuilt.

3.5 Applications of Trees and Graphs


3.5.1 Applications of Trees
Trees find applications in:
File Systems: Organizing files and directories in a hierarchical structure.
XML/HTML Parsing: Representing and navigating through structured documents.
Decision Trees: Making decisions based on hierarchical rules.

3.5.2 Applications of Graphs


Graphs find applications in:
Social Networks: Representing relationships between individuals.
Transportation Networks: Modeling routes between locations.
Network Flows: Optimizing flows through networks (e.g., water flow in pipes, data
flow in communication networks).

3.6 Conclusion
This chapter has explored trees and graphs, their operations, implementation details,
complexity analysis, and applications. Understanding these structures is crucial for
solving complex problems and developing efficient algorithms. The next chapter
summarizes the weekly progress made while studying these topics.
4 Weekly Progress Report

4.1 Week 1: Basic Data Structures


During the first week, we covered the fundamental concepts of data structures. The topics
included:

• Arrays

• Linked Lists

• Stacks

• Queues

• Trees

• Graphs

Students were introduced to the basics of each data structure, including their operations,
use cases, and performance characteristics.

4.2 Week 2: Algorithm Analysis and Algorithm Design Techniques

The second week focused on understanding how to analyze and design algorithms. The
key points covered were:

• Big O Notation

• Time Complexity

• Space Complexity

• Recurrence Relations

• Divide and Conquer

• Dynamic Programming

• Greedy Algorithms

Students learned how to evaluate the efficiency of algorithms and were introduced to
various design techniques used to develop efficient solutions.


4.3 Week 3: Sorting Algorithms and Searching Algorithms

In the third week, we explored various sorting and searching algorithms. The topics
included:

• Bubble Sort

• Insertion Sort

• Selection Sort

• Merge Sort

• Quick Sort

• Binary Search

• Linear Search

Students practiced implementing these algorithms and analyzed their performance in
different scenarios.

4.4 Week 4: Advanced Topics


The final week was dedicated to advanced topics in data structures and algorithms. The
areas covered were:

• Hash Tables

• Heaps

• Graph Algorithms (BFS, DFS, Dijkstra’s Algorithm)

• Balanced Trees (AVL Trees, Red-Black Trees)

• Amortized Analysis

• NP-Completeness and Computational Intractability

Students gained insight into more complex data structures and algorithms, preparing
them for solving sophisticated computational problems.