Created by: Talib Sir
Question Bank 1: Searching, Sorting, Hashing, Binary Search Trees, Divide and Conquer, and Greedy Algorithms
(Format: question description; answer choices 1-4; correct answer choice.)

1. What is the time complexity of binary search in a sorted array?
   Choices: (1) O(n); (2) O(log n); (3) O(n log n); (4) O(n^2). Correct answer: O(log n)
2. Which data structure is most commonly used to implement a hash table?
   Choices: (1) Array; (2) Linked list; (3) Binary search tree; (4) Stack. Correct answer: Array
3. In hash tables, what is the term for multiple keys hashing to the same index?
   Choices: (1) Clustering; (2) Collision; (3) Overflow; (4) Hashing conflict. Correct answer: Collision
4. Which collision avoidance strategy involves probing sequentially for the next available slot?
   Choices: (1) Separate chaining; (2) Linear probing; (3) Quadratic probing; (4) Double hashing. Correct answer: Linear probing
5. In quadratic probing, how is the next index to be probed calculated?
   Choices: (1) Adding a constant value; (2) Incrementing by a fixed amount; (3) Using the square of the index; (4) Using a prime number. Correct answer: Using the square of the index
6. Which strategy involves maintaining a linked list of entries at each index?
   Choices: (1) Linear probing; (2) Separate chaining; (3) Quadratic probing; (4) Rehashing. Correct answer: Separate chaining
7. What is the time complexity of inserting an element into a binary search tree?
   Choices: (1) O(log n); (2) O(n); (3) O(n log n); (4) O(1). Correct answer: O(n)
8. What is the worst-case time complexity of deleting a node in a binary search tree?
   Choices: (1) O(log n); (2) O(n); (3) O(n log n); (4) O(1). Correct answer: O(n)
9. Which traversal method visits the left subtree, then the root, then the right subtree?
   Choices: (1) Preorder; (2) Postorder; (3) Inorder; (4) Level-order. Correct answer: Inorder
10. What is the time complexity of traversing a binary search tree (BST)?
    Choices: (1) O(log n); (2) O(n); (3) O(n log n); (4) O(1). Correct answer: O(n)
11. In a divide and conquer strategy, the problem is:
    Choices: (1) Divided into multiple subproblems; (2) Solved recursively; (3) Both A and B; (4) Solved using iteration. Correct answer: Both A and B
12. Which of the following is an example of a divide and conquer algorithm?
    Choices: (1) QuickSort; (2) Insertion Sort; (3) Selection Sort; (4) Linear Search. Correct answer: QuickSort
13. What is the time complexity of Merge Sort?
    Choices: (1) O(n^2); (2) O(n log n); (3) O(n); (4) O(log n). Correct answer: O(n log n)
14. Which algorithm follows the greedy approach?
    Choices: (1) Dijkstra's Algorithm; (2) Merge Sort; (3) QuickSort; (4) Binary Search. Correct answer: Dijkstra's Algorithm
15. In greedy algorithms, the decision-making process is based on:
    Choices: (1) Optimizing future decisions; (2) Making the locally optimal choice; (3) Minimizing the number of steps; (4) Maximizing efficiency. Correct answer: Making the locally optimal choice
16. Which data structure is used to implement priority queues?
    Choices: (1) Hash table; (2) Stack; (3) Heap; (4) Queue. Correct answer: Heap
17. What is the advantage of linear probing in hash tables?
    Choices: (1) Uses less memory; (2) Reduces clustering; (3) Simpler to implement; (4) Ensures no collisions. Correct answer: Simpler to implement
18. Which of the following is a collision avoidance strategy used in hash tables?
    Choices: (1) Binary Search; (2) Quadratic Probing; (3) BFS; (4) DFS. Correct answer: Quadratic Probing
19. What is the average-case time complexity for searching in a binary search tree?
    Choices: (1) O(log n); (2) O(n log n); (3) O(n^2); (4) O(1). Correct answer: O(log n)
20. In hashing, what is the primary purpose of a hash function?
    Choices: (1) To resolve collisions; (2) To compute the load factor; (3) To map keys to an index; (4) To determine table size. Correct answer: To map keys to an index
21. Which of the following is not a traversal technique for binary search trees?
    Choices: (1) Inorder; (2) Preorder; (3) Postorder; (4) Reverse order. Correct answer: Reverse order
22. What is the time complexity of inserting an element into a balanced binary search tree?
    Choices: (1) O(n log n); (2) O(log n); (3) O(n^2); (4) O(n). Correct answer: O(log n)
23. Which hash function method involves computing the remainder of the key divided by the table size?
    Choices: (1) Division Method; (2) Multiplication Method; (3) Mid-square Method; (4) Folding Method. Correct answer: Division Method
24. What is the time complexity of deleting an element from a hash table?
    Choices: (1) O(1); (2) O(log n); (3) O(n); (4) O(n log n). Correct answer: O(n)
25. Which algorithm is commonly used for searching in a sorted array?
    Choices: (1) Linear Search; (2) Binary Search; (3) QuickSort; (4) Merge Sort. Correct answer: Binary Search
26. What is the load factor in a hash table?
    Choices: (1) Ratio of keys to table size; (2) Number of collisions; (3) Number of empty slots; (4) Ratio of collisions to keys. Correct answer: Ratio of keys to table size
27. What happens when the load factor in a hash table exceeds a certain threshold?
    Choices: (1) Table is resized; (2) Elements are rehashed; (3) Collisions increase; (4) All of the above. Correct answer: All of the above
28. What is the primary disadvantage of linear probing in hash tables?
    Choices: (1) Increases memory usage; (2) Leads to clustering; (3) Difficult to implement; (4) Slower insertion time. Correct answer: Leads to clustering
29. Which of the following operations is faster in a hash table compared to ...?
    Choices: (1) Insertion; (2) Deletion; (3) Searching; (4) All of the above. Correct answer: All of the above
30. Which divide and conquer algorithm is used to solve the closest pair of points problem?
    Choices: (1) Binary Search; (2) QuickSort; (3) Merge Sort; (4) Divide Sort. Correct answer: Divide Sort
31. In greedy algorithms, the "greedy choice property" means:
    Choices: (1) Making the optimal decision globally; (2) Making the optimal decision locally; (3) Making the fastest decision; (4) None of the above. Correct answer: Making the optimal decision locally
32. What is the time complexity for inserting an element into a hash table?
    Choices: (1) O(1); (2) O(n); (3) O(log n); (4) O(n^2). Correct answer: O(1)
33. In a binary search tree (BST), what happens if the inserted elements are already in sorted order?
    Choices: (1) The tree becomes balanced; (2) The tree becomes skewed; (3) The tree has no change; (4) The tree becomes a heap. Correct answer: The tree becomes skewed
34. Which of the following hashing methods can lead to fewer collisions?
    Choices: (1) Open Addressing; (2) Separate Chaining; (3) Linear Probing; (4) Quadratic Probing. Correct answer: Separate Chaining
35. What is the main drawback of quadratic probing in hash tables?
    Choices: (1) Increased memory usage; (2) Higher time complexity; (3) Secondary clustering; (4) Difficult to implement. Correct answer: Secondary clustering
36. In binary search trees, what traversal method is commonly used to retrieve elements in sorted order?
    Choices: (1) Preorder; (2) Postorder; (3) Inorder; (4) Level-order. Correct answer: Inorder
37. Which collision resolution technique involves placing all elements that hash to the same index into a linked list?
    Choices: (1) Linear Probing; (2) Separate Chaining; (3) Double Hashing; (4) Quadratic Probing. Correct answer: Separate Chaining
38. Which algorithm is NOT an example of divide and conquer?
    Choices: (1) QuickSort; (2) Merge Sort; (3) Binary Search; (4) Dijkstra's Algorithm. Correct answer: Dijkstra's Algorithm
39. In dynamic programming, what is "memoization"?
    Choices: (1) Caching intermediate results; (2) Solving subproblems optimally; (3) Dividing a problem into subproblems; (4) Iterating through the solution space. Correct answer: Caching intermediate results
40. Which of the following algorithms is an example of a greedy algorithm?
    Choices: (1) Huffman Coding; (2) Merge Sort; (3) Dynamic Programming; (4) Bubble Sort. Correct answer: Huffman Coding
41. What is the main principle behind greedy algorithms?
    Choices: (1) Choose the global best choice; (2) Choose the fastest option; (3) Choose the locally optimal choice; (4) Choose the simplest option. Correct answer: Choose the locally optimal choice
42. What is the best-case time complexity of binary search?
    Choices: (1) O(log n); (2) O(n log n); (3) O(1); (4) O(n). Correct answer: O(1)
43. What is the worst-case time complexity of linear probing in hash tables?
    Choices: (1) O(1); (2) O(n); (3) O(log n); (4) O(n^2). Correct answer: O(n)
44. In separate chaining, what happens when the load factor exceeds a certain threshold?
    Choices: (1) The hash table becomes inefficient; (2) No changes occur; (3) Clustering occurs; (4) Rehashing is needed. Correct answer: The hash table becomes inefficient
45. Which algorithm is used to find the shortest path in an unweighted graph?
    Choices: (1) Dijkstra's Algorithm; (2) Bellman-Ford Algorithm; (3) Breadth-First Search (BFS); (4) Floyd-Warshall Algorithm. Correct answer: Breadth-First Search (BFS)
46. In a balanced binary search tree (BST), what is the height of the tree?
    Choices: (1) O(log n); (2) O(n); (3) O(n^2); (4) O(1). Correct answer: O(log n)
47. What is the time complexity of QuickSort in the best case?
    Choices: (1) O(n log n); (2) O(n^2); (3) O(n); (4) O(log n). Correct answer: O(n log n)
48. Which of the following is a characteristic of divide and conquer algorithms?
    Choices: (1) Bottom-up approach; (2) Top-down approach; (3) Subproblems overlap; (4) No recursive calls. Correct answer: Top-down approach
49. What is the average time complexity for searching in a hash table?
    Choices: (1) O(1); (2) O(n); (3) O(log n); (4) O(n log n). Correct answer: O(1)
50. What type of traversal in a binary tree visits nodes level by level?
    Choices: (1) Preorder; (2) Postorder; (3) Inorder; (4) Level-order. Correct answer: Level-order
51. In hashing, what is a "load factor"?
    Choices: (1) The ratio of keys to the table size; (2) The number of collisions; (3) The ratio of empty slots to filled slots; (4) The length of the hash table. Correct answer: The ratio of keys to the table size
52. What is the main advantage of quadratic probing over linear probing?
    Choices: (1) Eliminates clustering; (2) Requires less memory; (3) Avoids secondary clustering; (4) Faster insertion time. Correct answer: Avoids secondary clustering
53. What is the time complexity for searching in a perfectly balanced binary search tree?
    Choices: (1) O(n); (2) O(log n); (3) O(n log n); (4) O(1). Correct answer: O(log n)
54. In divide and conquer algorithms, how are subproblems combined?
    Choices: (1) By recursive calls; (2) Through dynamic programming; (3) Through a merging process; (4) Iteratively. Correct answer: Through a merging process
55. What is the time complexity for inserting an element into a binary search tree?
    Choices: (1) O(log n); (2) O(n); (3) O(1); (4) O(n log n). Correct answer: O(log n)
56. Which collision resolution technique uses a second hash function?
    Choices: (1) Linear Probing; (2) Quadratic Probing; (3) Double Hashing; (4) Separate Chaining. Correct answer: Double Hashing
57. What is the main disadvantage of separate chaining in hash tables?
    Choices: (1) Increased memory usage; (2) Clustering; (3) Slower search time; (4) Higher collision rate. Correct answer: Increased memory usage
58. What is the purpose of a hash function in a hash table?
    Choices: (1) To resolve collisions; (2) To compute the load factor; (3) To map keys to an index; (4) To determine the table size. Correct answer: To map keys to an index
59. In a binary search tree, how is the left subtree of a node defined?
    Choices: (1) Contains nodes with larger values; (2) Contains nodes with smaller values; (3) Contains the root of the tree; (4) Contains all the leaf nodes. Correct answer: Contains nodes with smaller values
60. What is the advantage of separate chaining in hash tables?
    Choices: (1) Reduces memory usage; (2) Handles collisions effectively; (3) Faster insertion time; (4) Easier to implement. Correct answer: Handles collisions effectively
61. What happens in hash tables when two keys hash to the same index?
    Choices: (1) The second key is ignored; (2) The second key is stored in ...; (3) A collision occurs; (4) The first key is deleted. Correct answer: A collision occurs
62. In divide and conquer algorithms, what is done after dividing the problem?
    Choices: (1) The subproblems are merged; (2) The subproblems are solved independently; (3) The problem is recursively called; (4) The optimal solution is found. Correct answer: The subproblems are solved independently
63. What is the time complexity of binary search on a sorted array of n elements?
    Choices: (1) O(n); (2) O(log n); (3) O(n log n); (4) O(1). Correct answer: O(log n)
64. In a hash table, what is the process of resolving collisions using linear probing?
    Choices: (1) Using linked lists; (2) Incrementing indices; (3) Rehashing; (4) Doubling the table size. Correct answer: Incrementing indices
65. What is a collision in a hash table?
    Choices: (1) Two keys generate the same index; (2) Key not found; (3) Value not hashed properly; (4) Table is full. Correct answer: Two keys generate the same index
66. Which of the following algorithms follows the divide and conquer approach?
    Choices: (1) Merge Sort; (2) Bubble Sort; (3) Insertion Sort; (4) Selection Sort. Correct answer: Merge Sort
67. Which of the following is an example of a greedy algorithm?
    Choices: (1) Dijkstra's Algorithm; (2) Merge Sort; (3) Bubble Sort; (4) Binary Search. Correct answer: Dijkstra's Algorithm
68. What is the worst-case time complexity of a linear search algorithm?
    Choices: (1) O(n log n); (2) O(1); (3) O(n); (4) O(n^2). Correct answer: O(n)
69. What happens when we keep inserting into a hash table using linear probing once the table is full?
    Choices: (1) Rehashing occurs; (2) Insertion fails; (3) Key values are overwritten; (4) Linear probing continues. Correct answer: Insertion fails
70. Which of the following conditions must hold true for binary search?
    Choices: (1) The array must be sorted; (2) The array can be unsorted; (3) The array must contain distinct elements; (4) The array must have even length. Correct answer: The array must be sorted
71. Which strategy resolves collisions by linking the collided elements?
    Choices: (1) Separate Chaining; (2) Linear Probing; (3) Quadratic Probing; (4) Rehashing. Correct answer: Separate Chaining
72. What is the time complexity of Merge Sort?
    Choices: (1) O(n); (2) O(n log n); (3) O(log n); (4) O(n^2). Correct answer: O(n log n)
73. What is the probing interval in quadratic probing?
    Choices: (1) Constant interval; (2) Increases quadratically; (3) Increases linearly; (4) Decreases quadratically. Correct answer: Increases quadratically
74. When deleting a node with two children in a binary search tree, which node replaces it?
    Choices: (1) Root node; (2) Random node; (3) Inorder predecessor; (4) Inorder successor. Correct answer: Inorder successor
75. Which of the following does NOT follow the divide and conquer approach?
    Choices: (1) Quick Sort; (2) Merge Sort; (3) Heap Sort; (4) Binary Search. Correct answer: Heap Sort
76. Which greedy algorithm is used to find the minimum spanning tree?
    Choices: (1) Floyd-Warshall Algorithm; (2) Prim's Algorithm; (3) Dijkstra's Algorithm; (4) Bellman-Ford Algorithm. Correct answer: Prim's Algorithm
77. In a binary search tree, how are new nodes inserted?
    Choices: (1) Always at the root; (2) Based on value comparison; (3) Randomly; (4) Using hashing. Correct answer: Based on value comparison
78. What is the time complexity of inserting an element in a hash table?
    Choices: (1) O(1); (2) O(n); (3) O(log n); (4) O(n^2). Correct answer: O(1)
79. Which of the following is a method for resolving collisions in hash tables?
    Choices: (1) Linear Probing; (2) Quadratic Probing; (3) Separate Chaining; (4) All of the above. Correct answer: All of the above
80. What is the best-case time complexity for searching in a binary search tree?
    Choices: (1) O(log n); (2) O(n); (3) O(n^2); (4) O(1). Correct answer: O(log n)
81. In separate chaining, what happens if many elements hash to the same index?
    Choices: (1) The table becomes full; (2) A linked list is created; (3) The hash function is recomputed; (4) The index is skipped. Correct answer: A linked list is created
82. Which of the following is NOT a greedy algorithm?
    Choices: (1) Kruskal's Algorithm; (2) Prim's Algorithm; (3) Dijkstra's Algorithm; (4) Merge Sort. Correct answer: Merge Sort
83. Which technique involves solving the problem by recursively breaking it into smaller subproblems?
    Choices: (1) Dynamic Programming; (2) Divide and Conquer; (3) Greedy Algorithms; (4) Brute Force. Correct answer: Divide and Conquer
84. In quadratic probing, how is the next index to be probed calculated?
    Choices: (1) Linearly; (2) Using a second hash function; (3) Quadratically; (4) Randomly. Correct answer: Quadratically
85. What is the time complexity of deleting a node from a binary search tree?
    Choices: (1) O(1); (2) O(log n); (3) O(n); (4) O(n log n). Correct answer: O(log n)
86. What is the worst-case time complexity of binary search?
    Choices: (1) O(n); (2) O(log n); (3) O(1); (4) O(n^2). Correct answer: O(log n)
87. Which of the following is a characteristic of divide and conquer algorithms?
    Choices: (1) Overlapping subproblems; (2) No recursive calls; (3) Bottom-up approach; (4) Dividing the problem into smaller parts. Correct answer: Dividing the problem into smaller parts
88. Which collision resolution method does NOT store all elements within the table itself?
    Choices: (1) Linear Probing; (2) Separate Chaining; (3) Quadratic Probing; (4) Double Hashing. Correct answer: Separate Chaining
89. What traversal method visits the root node first in a binary tree?
    Choices: (1) Inorder; (2) Preorder; (3) Postorder; (4) Level-order. Correct answer: Preorder
90. Which of the following algorithms is used to find the minimum spanning tree?
    Choices: (1) Dijkstra's Algorithm; (2) Bellman-Ford Algorithm; (3) Prim's Algorithm; (4) Floyd-Warshall Algorithm. Correct answer: Prim's Algorithm
91. What is the time complexity of merge sort in the worst case?
    Choices: (1) O(log n); (2) O(n log n); (3) O(n); (4) O(n^2). Correct answer: O(n log n)
92. In linear probing, how is a collision resolved?
    Choices: (1) Using the next empty slot; (2) Storing all keys in a linked list; (3) Rehashing; (4) Skipping to the next probe. Correct answer: Using the next empty slot
93. What is the advantage of greedy algorithms?
    Choices: (1) Always finds the best solution; (2) Fast and easy to implement; (3) Requires less memory; (4) Handles all types of problems. Correct answer: Fast and easy to implement
94. What is the best time complexity for inserting an element into a ...?
    Choices: (1) O(log n); (2) O(n); (3) O(1); (4) O(n log n). Correct answer: O(log n)
95. Which of the following algorithms follows the divide and conquer approach?
    Choices: (1) Quick Sort; (2) Prim's Algorithm; (3) Dijkstra's Algorithm; (4) Kruskal's Algorithm. Correct answer: Quick Sort
96. What is the purpose of the load factor in a hash table?
    Choices: (1) Measure the number of collisions; (2) To determine the next probe; (3) Measure the fullness of the table; (4) Resize the hash table. Correct answer: Measure the fullness of the table
97. Which algorithm is used to find the shortest path in a weighted graph?
    Choices: (1) Floyd-Warshall Algorithm; (2) Prim's Algorithm; (3) Bellman-Ford Algorithm; (4) Dijkstra's Algorithm. Correct answer: Dijkstra's Algorithm
98. What is the time complexity of searching in a perfectly balanced binary search tree?
    Choices: (1) O(n); (2) O(log n); (3) O(1); (4) O(n^2). Correct answer: O(log n)
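Two short Python reference sketches follow; they are study aids added alongside the bank rather than items from it, and the function and class names are illustrative. The first is the classic iterative binary search on a sorted array, the pattern behind the O(log n) answers above.

def binary_search(sorted_items, target):
    """Classic binary search on a sorted list: O(log n) worst case, O(1) best case.

    Returns the index of target, or -1 if it is absent.
    """
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1                 # discard the left half
        else:
            hi = mid - 1                 # discard the right half
    return -1


print(binary_search([2, 5, 8, 12, 16, 23], 16))   # 4

The second is a minimal separate-chaining hash table, assuming Python's built-in hash() as the hash function: each slot holds a chain (a list), keys that collide are appended to the same chain, and the load factor is the ratio of stored keys to the table size.

class ChainedHashTable:
    """Minimal separate-chaining hash table (illustrative sketch)."""

    def __init__(self, size=8):
        self.size = size
        self.buckets = [[] for _ in range(size)]   # one chain (list) per slot
        self.count = 0                             # number of stored keys

    def _index(self, key):
        # Division method: map the hashed key to an index via the remainder.
        return hash(key) % self.size

    def put(self, key, value):
        chain = self.buckets[self._index(key)]
        for i, (k, _) in enumerate(chain):
            if k == key:                # key already present: update in place
                chain[i] = (key, value)
                return
        chain.append((key, value))      # collision or empty slot: extend the chain
        self.count += 1

    def get(self, key):
        for k, v in self.buckets[self._index(key)]:
            if k == key:
                return v
        raise KeyError(key)

    def load_factor(self):
        # Ratio of stored keys to table size; resizing is typically triggered
        # once this exceeds a chosen threshold.
        return self.count / self.size


table = ChainedHashTable()
table.put("apple", 1)
table.put("banana", 2)
print(table.get("banana"), round(table.load_factor(), 2))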
Question Bank 2: Graph Representations, Graph Traversal, Shortest Path Algorithms, Minimum Spanning Trees, Dynamic Programming, and the KMP Algorithm
(Format: topic title; question description; answer choices 1-4; correct answer choice.)

1. [Adjacency List] What is the space complexity of an adjacency list for a graph with V vertices and E edges?
   Choices: (1) O(V^2); (2) O(V+E); (3) O(E log V); (4) O(V+E log V). Correct answer: O(V+E)
2. [Adjacency List] What is the space complexity of an adjacency list in a graph with V vertices and E edges?
   Choices: (1) O(V^2); (2) O(V+E); (3) O(E log V); (4) O(V+E log V). Correct answer: O(V+E)
3. [Adjacency List (Time Complexity)] What is the time complexity to find all neighbors of a node in an adjacency list?
   Choices: (1) O(1); (2) O(V); (3) O(E); (4) O(degree of the node). Correct answer: O(degree of the node)
4. [Adjacency Matrix] What is the space complexity of an adjacency matrix in a graph with 'n' vertices?
   Choices: (1) O(n); (2) O(n^2); (3) O(n log n); (4) O(2^n). Correct answer: O(n^2)
5. [Adjacency Matrix] What is the space complexity of an adjacency matrix for a graph with n vertices?
   Choices: (1) O(n); (2) O(n^2); (3) O(n log n); (4) O(2^n). Correct answer: O(n^2)
6. [Adjacency Matrix (Undirected Graph)] For an undirected graph, how is the adjacency matrix represented?
   Choices: (1) Symmetrical; (2) Asymmetrical; (3) Diagonal; (4) Random. Correct answer: Symmetrical
7. [Adjacency Matrix (Undirected Graph)] How is the adjacency matrix represented for an undirected graph?
   Choices: (1) Symmetrical; (2) Asymmetrical; (3) Diagonal; (4) Random. Correct answer: Symmetrical
8. [Bellman-Ford (Edge Relaxation Process)] What happens when you relax an edge in the Bellman-Ford algorithm?
   Choices: (1) The edge is removed; (2) The distance to the vertex is updated; (3) The vertex is marked as visited; (4) The edge weight is modified. Correct answer: The distance to the vertex is updated
9. [Bellman-Ford (Edge Relaxation)] How many times are the edges relaxed in the Bellman-Ford algorithm in a graph with V vertices?
   Choices: (1) V; (2) V-1; (3) V+1; (4) E. Correct answer: V-1
10. [Bellman-Ford (Edge Relaxation)] In a graph with 'V' vertices, how many times are the edges relaxed in the Bellman-Ford algorithm?
    Choices: (1) V; (2) V-1; (3) V+1; (4) E. Correct answer: V-1
11. [Bellman-Ford (Negative Weights)] What is the key advantage of Bellman-Ford over Dijkstra's algorithm?
    Choices: (1) Simpler implementation; (2) Works with negative weights; (3) Requires fewer iterations; (4) Faster for dense graphs. Correct answer: Works with negative weights
12. [Bellman-Ford (Negative Weights)] What is the main advantage of the Bellman-Ford algorithm over Dijkstra's algorithm?
    Choices: (1) Simpler implementation; (2) Works with negative weights; (3) Requires fewer iterations; (4) Faster for dense graphs. Correct answer: Works with negative weights
13. [Bellman-Ford Algorithm] What is the time complexity of the Bellman-Ford algorithm for a graph with 'V' vertices and 'E' edges?
    Choices: (1) O(V+E); (2) O(VE); (3) O(E log V); (4) O(V^2). Correct answer: O(VE)
14. [Bellman-Ford Algorithm] What is the time complexity of the Bellman-Ford algorithm in a graph with V vertices and E edges?
    Choices: (1) O(V+E); (2) O(VE); (3) O(E log V); (4) O(V^2). Correct answer: O(VE)
15. [Bellman-Ford Edge Relaxation] In the Bellman-Ford algorithm, how many times are the edges relaxed for a graph with V vertices?
    Choices: (1) V; (2) V-1; (3) V+1; (4) E. Correct answer: V-1
16. [Bellman-Ford Time Complexity] What is the time complexity of the Bellman-Ford algorithm for a graph with V vertices and E edges?
    Choices: (1) O(VE); (2) O(V^2); (3) O(E log V); (4) O(V + E). Correct answer: O(VE)
17. [Bellman-Ford vs Dijkstra] What is the main advantage of Bellman-Ford over Dijkstra's algorithm?
    Choices: (1) Handles negative weights; (2) Faster; (3) Simpler; (4) Uses less memory. Correct answer: Handles negative weights
18. [Dynamic Programming] Which of the following is/are property/properties of a dynamic programming problem?
    Choices: (1) Optimal substructure; (2) Overlapping subproblems; (3) Both optimal substructure and overlapping subproblems; (4) Greedy approach. Correct answer: Both optimal substructure and overlapping subproblems
19. [Dynamic Programming (Fibonacci)] The Fibonacci sequence is a classic example of a problem that can be solved using:
    Choices: (1) Greedy algorithms; (2) Dynamic programming; (3) Backtracking; (4) Branch and bound. Correct answer: Dynamic programming
20. [Dynamic Programming (Knapsack Problem)] The 0/1 Knapsack problem is best solved using which technique?
    Choices: (1) Greedy Algorithms; (2) Divide and Conquer; (3) Dynamic Programming; (4) Backtracking. Correct answer: Dynamic Programming
21. [Dynamic Programming (Knapsack Problem)] Which technique is most suitable for solving the 0/1 Knapsack problem?
    Choices: (1) Greedy Algorithms; (2) Divide and Conquer; (3) Dynamic Programming; (4) Backtracking. Correct answer: Dynamic Programming
22. [Dynamic Programming (Optimal Substructure)] What is the key property that allows dynamic programming solutions to work efficiently?
    Choices: (1) Optimal Substructure; (2) Greedy Choice Property; (3) No overlapping subproblems; (4) Independent decisions. Correct answer: Optimal Substructure
23. [Dynamic Programming (Optimal Substructure)] What key feature makes dynamic programming algorithms perform efficiently?
    Choices: (1) Optimal Substructure; (2) Greedy Choice Property; (3) No overlapping subproblems; (4) Independent decisions. Correct answer: Optimal Substructure
24. [Dynamic Programming (Subproblem Overlap)] If an optimal solution can be created for a problem by constructing optimal solutions for its subproblems, the problem possesses ____________ property.
    Choices: (1) Overlapping subproblems; (2) Optimal substructure; (3) Memoization; (4) Greedy. Correct answer: Optimal substructure
25. [Dynamic Programming (Subproblem Overlap)] Which of the following problems is typically solved using dynamic programming due to subproblem overlap?
    Choices: (1) Longest common subsequence; (2) Binary search; (3) Quick Sort; (4) Depth-first search. Correct answer: Longest common subsequence
26. [Dynamic Programming Core Feature] What is the core feature of problems solved by dynamic programming?
    Choices: (1) Greedy property; (2) Subproblem overlap; (3) Divide and conquer; (4) Random selection. Correct answer: Subproblem overlap
27. [Dynamic Programming Example - Knapsack] Which of the following problems is typically solved using dynamic programming?
    Choices: (1) Knapsack; (2) Sorting; (3) Binary Search; (4) Maximum Flow. Correct answer: Knapsack
28. [Dynamic Programming Fibonacci] The Fibonacci sequence is an example of a problem solved using:
    Choices: (1) Dynamic programming; (2) Greedy algorithms; (3) Backtracking; (4) Divide and Conquer. Correct answer: Dynamic programming
29. [Dynamic Programming Introduction] What core feature do problems suitable for dynamic programming share?
    Choices: (1) Problems with overlapping subproblems; (2) Problems with no dependencies; (3) Problems with simple recursion; (4) Problems with greedy choices. Correct answer: Problems with overlapping subproblems
30. [Dynamic Programming Introduction] What is the key characteristic of problems solved by dynamic programming?
    Choices: (1) Problems with overlapping subproblems; (2) Problems with no dependencies; (3) Problems with simple recursion; (4) Problems with greedy choices. Correct answer: Problems with overlapping subproblems
31. [Dynamic Programming Memoization] In dynamic programming, what does memoization refer to?
    Choices: (1) Storing intermediate results; (2) Caching subproblem solutions; (3) Iterative calculation; (4) Recursive division. Correct answer: Caching subproblem solutions
32. [Floyd-Warshall (All-Pairs Shortest Path)] The Floyd-Warshall algorithm is typically used for:
    Choices: (1) Single source shortest path; (2) All-pairs shortest path; (3) Minimum spanning tree; (4) Longest path in DAG. Correct answer: All-pairs shortest path
33. [Floyd-Warshall (All-Pairs Shortest Path)] What is the primary application of the Floyd-Warshall algorithm?
    Choices: (1) Single source shortest path; (2) All-pairs shortest path; (3) Minimum spanning tree; (4) Longest path in DAG. Correct answer: All-pairs shortest path
34. [Floyd-Warshall (Negative Weight Cycles)] How does the Floyd-Warshall algorithm detect negative weight cycles?
    Choices: (1) By detecting zero-weight edges; (2) By comparing distances; (3) By looking at the diagonal of the distance matrix; (4) By checking edge weights. Correct answer: By looking at the diagonal of the distance matrix
35. [Floyd-Warshall (Time Complexity)] What is the time complexity of the Floyd-Warshall algorithm for a graph with V vertices?
    Choices: (1) O(VE); (2) O(V^2); (3) O(V^3); (4) O(E log V). Correct answer: O(V^3)
36. [Floyd-Warshall (Time Complexity)] What is the time complexity of the Floyd-Warshall algorithm for a graph with 'V' vertices?
    Choices: (1) O(VE); (2) O(V^2); (3) O(V^3); (4) O(E log V). Correct answer: O(V^3)
37. [Floyd-Warshall Advantage] What is the main advantage of the Floyd-Warshall algorithm over Dijkstra's algorithm?
    Choices: (1) Works for all pair shortest paths; (2) Faster in all cases; (3) Handles negative weights; (4) Simpler to implement. Correct answer: Works for all pair shortest paths
38. [Floyd-Warshall Algorithm] What is the main advantage of the Floyd-Warshall algorithm over Dijkstra's algorithm?
    Choices: (1) Faster execution; (2) Works on undirected graphs; (3) Works for negative weight cycles; (4) Simpler implementation. Correct answer: Works for negative weight cycles
39. [Floyd-Warshall Algorithm] What is the primary advantage of the Floyd-Warshall algorithm over Dijkstra's algorithm?
    Choices: (1) Faster execution; (2) Works on undirected graphs; (3) Works for negative weight cycles; (4) Simpler implementation. Correct answer: Works for negative weight cycles
40. [Floyd-Warshall and Negative Cycles] How does Floyd-Warshall detect negative weight cycles?
    Choices: (1) Detects zero-weight edges; (2) Looks for negative distances; (3) Checks adjacency matrix values; (4) Compares diagonal values. Correct answer: Compares diagonal values
41. [Floyd-Warshall Complexity] What is the time complexity of the Floyd-Warshall algorithm for a graph with V vertices?
    Choices: (1) O(VE); (2) O(V^2); (3) O(V^3); (4) O(E log V). Correct answer: O(V^3)
42. [Graph Representation - Adjacency List] What is the space complexity of an adjacency list for a graph with V vertices and E edges?
    Choices: (1) O(V + E); (2) O(V^2); (3) O(V log E); (4) O(VE). Correct answer: O(V + E)
43. [Graph Representation - Adjacency Matrix] What is the space complexity of an adjacency matrix for a graph with n vertices?
    Choices: (1) O(n); (2) O(n^2); (3) O(n log n); (4) O(n^3). Correct answer: O(n^2)
44. [Graph Representation - Edge List] Which graph representation is more suitable for sparse graphs?
    Choices: (1) Adjacency matrix; (2) Edge list; (3) Adjacency list; (4) Distance matrix. Correct answer: Edge list
45. [Graphs Introduction] Which of the following is NOT a valid representation of graphs?
    Choices: (1) Edge List; (2) Adjacency Matrix; (3) Matrix Tree Representation; (4) Adjacency List. Correct answer: Matrix Tree Representation
46. [Graphs Introduction] Which of the following is not a valid way to represent a graph?
    Choices: (1) Edge List; (2) Adjacency Matrix; (3) Matrix Tree Representation; (4) Adjacency List. Correct answer: Matrix Tree Representation
47. [KMP Algorithm] What is the purpose of the prefix table (pi-table) in the KMP algorithm?
    Choices: (1) To store the matches found; (2) To count the number of occurrences; (3) To reduce redundant comparisons; (4) To avoid backtracking. Correct answer: To reduce redundant comparisons
48. [KMP Algorithm] What role does the prefix table (pi-table) play in the KMP algorithm?
    Choices: (1) To store the matches found; (2) To count the number of occurrences; (3) To reduce redundant comparisons; (4) To avoid backtracking. Correct answer: To reduce redundant comparisons
49. [KMP Algorithm (Pattern Search)] What is the primary advantage of KMP over brute force pattern matching?
    Choices: (1) Faster matching; (2) Works with negative weights; (3) Simpler implementation; (4) Avoids redundant comparisons. Correct answer: Avoids redundant comparisons
50. [KMP Algorithm (Time Complexity)] What is the time complexity of the KMP string matching algorithm?
    Choices: (1) O(n^2); (2) O(n log n); (3) O(n + m); (4) O(nm). Correct answer: O(n + m)
51. [KMP Algorithm (Time Complexity)] What is the time complexity of the KMP string matching algorithm?
    Choices: (1) O(n^2); (2) O(n log n); (3) O(n + m); (4) O(nm). Correct answer: O(n + m)
52. [KMP Algorithm Advantage] What is the key advantage of the KMP algorithm over brute force pattern matching?
    Choices: (1) Reduces redundant comparisons; (2) Faster on average; (3) Works with negative weights; (4) Uses less space. Correct answer: Reduces redundant comparisons
53. [KMP Algorithm Prefix Table] In the KMP algorithm, what is the purpose of the prefix table (pi-table)?
    Choices: (1) Track mismatches; (2) Store pattern frequencies; (3) Record shifts in pattern; (4) Store longest prefix-suffix match. Correct answer: Store longest prefix-suffix match
54. [Kruskal's Algorithm] What data structure is essential for implementing Kruskal's algorithm efficiently?
    Choices: (1) Binary Heap; (2) Union-Find; (3) Stack; (4) Queue. Correct answer: Union-Find
55. [Kruskal's Algorithm] Which data structure is crucial for efficiently implementing Kruskal's algorithm?
    Choices: (1) Binary Heap; (2) Union-Find; (3) Stack; (4) Queue. Correct answer: Union-Find
56. [Dynamic Programming Definition] What is the core idea behind dynamic programming?
    Choices: (1) Divide and Conquer; (2) Memoization and Tabulation; (3) Recursive Backtracking; (4) Optimal Substructure and Overlapping Subproblems. Correct answer: Optimal Substructure and Overlapping Subproblems
57. [Memoization in DP] What is the purpose of memoization in dynamic programming?
    Choices: (1) To store solutions to subproblems; (2) To avoid recalculating solved subproblems; (3) To reduce space complexity; (4) Both 1 and 2. Correct answer: Both 1 and 2
58. [Tabulation in DP] How does tabulation differ from memoization?
    Choices: (1) Tabulation uses recursion; (2) Tabulation solves subproblems iteratively; (3) Tabulation does not require problem overlap; (4) Tabulation uses additional memory. Correct answer: Tabulation solves subproblems iteratively
59. [Adjacency List Representation] How is an adjacency list represented in a graph?
    Choices: (1) A matrix of size VxV; (2) A dictionary of lists; (3) A list of edges; (4) A boolean array. Correct answer: A dictionary of lists
60. [Edge List vs. Adjacency Matrix] What is the difference between an edge list and an adjacency matrix?
    Choices: (1) Edge list uses less space than adjacency matrix; (2) Adjacency matrix is more efficient in searches; (3) Edge list supports weighted graphs; (4) Both 1 and 2. Correct answer: Edge list uses less space than adjacency matrix
61. [BFS Traversal] Which data structure is commonly used for BFS traversal?
    Choices: (1) Stack; (2) Queue; (3) Priority Queue; (4) Set. Correct answer: Queue
62. [DFS Traversal] Which data structure is commonly used for DFS traversal?
    Choices: (1) Stack; (2) Queue; (3) Priority Queue; (4) Hash Map. Correct answer: Stack
63. [Shortest Path Algorithm Efficiency] Why is Dijkstra's algorithm efficient for shortest paths in weighted graphs?
    Choices: (1) It only works for undirected graphs; (2) It uses a priority queue; (3) It can handle negative weights; (4) It relaxes edges at each iteration. Correct answer: It uses a priority queue
64. [Bellman-Ford Negative Weights] How does Bellman-Ford handle negative weights?
    Choices: (1) It uses backtracking; (2) It iteratively relaxes edges V-1 times; (3) It calculates all-pairs shortest paths; (4) It discards negative cycles. Correct answer: It iteratively relaxes edges V-1 times
65. [Floyd-Warshall Algorithm] What does the Floyd-Warshall algorithm compute?
    Choices: (1) Single source shortest path; (2) Minimum spanning tree; (3) All pairs shortest paths; (4) Topological order. Correct answer: All pairs shortest paths
66. [Binary Maze Shortest Path] Which algorithm is best suited to find the shortest path in a binary maze?
    Choices: (1) BFS; (2) DFS; (3) Dijkstra; (4) Bellman-Ford. Correct answer: BFS
67. [Prim's Algorithm] What is the key concept of Prim's algorithm for finding an MST?
    Choices: (1) Start from an arbitrary vertex; (2) Select minimum weight edge; (3) Relax all edges at once; (4) Traverse all vertices. Correct answer: Start from an arbitrary vertex
68. [Kruskal's Algorithm Efficiency] What is the role of a disjoint set in Kruskal's algorithm?
    Choices: (1) Sort edges by weight; (2) Track connected components; (3) Find minimum edge; (4) Avoid negative cycles. Correct answer: Track connected components
69. [Bellman-Ford Time Complexity] What is the time complexity of the Bellman-Ford algorithm for V vertices and E edges?
    Choices: (1) O(V^2); (2) O(V * E); (3) O(V + E); (4) O(V^3). Correct answer: O(V * E)
70. [Dijkstra with Binary Heap] What is the time complexity of Dijkstra's algorithm using a binary heap and adjacency list?
    Choices: (1) O(V + E); (2) O((V + E) * log V); (3) O(V^2); (4) O(V^3). Correct answer: O((V + E) * log V)
71. [Floyd-Warshall Space Complexity] What is the space complexity of the Floyd-Warshall algorithm for a graph with V vertices?
    Choices: (1) O(V); (2) O(V^2); (3) O(V + E); (4) O(V^3). Correct answer: O(V^2)
72. [BFS vs DFS in Graph Traversal] Why is BFS preferred over DFS for shortest paths in an unweighted graph?
    Choices: (1) BFS explores deeper nodes first; (2) BFS uses a priority queue; (3) BFS ensures minimum depth exploration; (4) BFS avoids cycles. Correct answer: BFS ensures minimum depth exploration
73. [Detecting Negative Cycle] How does the Bellman-Ford algorithm detect a negative weight cycle?
    Choices: (1) By checking negative weights in the graph; (2) By comparing weights after V iterations; (3) By checking for relaxed edges in the V-th iteration; (4) By calculating the minimum spanning tree. Correct answer: By checking for relaxed edges in the V-th iteration
74. [Tabulation Example] Which approach to solving the Fibonacci problem uses tabulation?
    Choices: (1) Recursion; (2) Storing solutions in a table iteratively; (3) Backtracking; (4) Greedy approach. Correct answer: Storing solutions in a table iteratively
75. [Prim's Algorithm Time Complexity] What is the time complexity of Prim's algorithm using an adjacency matrix for V vertices?
    Choices: (1) O(V^2); (2) O(V + E); (3) O((V + E) * log V); (4) O(V^3). Correct answer: O(V^2)
76. [Kruskal's Algorithm Sorting] What is the key preprocessing step in Kruskal's algorithm?
    Choices: (1) Relaxing edges; (2) Sorting edges by weight; (3) Calculating shortest path; (4) Traversing vertices. Correct answer: Sorting edges by weight
77. [Recursive vs Iterative DP] What is the key advantage of iterative DP (tabulation) over recursive DP (memoization)?
    Choices: (1) Uses recursion; (2) Uses additional memory; (3) Reduces time complexity; (4) Avoids stack overflow. Correct answer: Avoids stack overflow
78. [Shortest Path in Weighted Graph] Which algorithm is preferred for shortest path in weighted graphs with non-negative weights?
    Choices: (1) BFS; (2) Bellman-Ford; (3) Dijkstra; (4) Floyd-Warshall. Correct answer: Dijkstra
79. [Binary Maze Distance Check] What is the condition to move to the next cell in a binary maze shortest path problem?
    Choices: (1) The cell is visited; (2) The cell is unvisited and open; (3) The cell is visited and closed; (4) The cell is blocked. Correct answer: The cell is unvisited and open
80. [Floyd-Warshall Application] What is the primary application of the Floyd-Warshall algorithm?
    Choices: (1) Finding MST; (2) Finding all-pairs shortest paths; (3) Determining graph connectivity; (4) Detecting negative cycles. Correct answer: Finding all-pairs shortest paths
81. [Greedy Algorithms vs DP] What is the key difference between Greedy Algorithms and Dynamic Programming?
    Choices: (1) Greedy algorithms always guarantee optimal solutions; (2) DP considers all possibilities; (3) Greedy doesn't consider future consequences; (4) Both 2 and 3. Correct answer: Both 2 and 3
82. [Minimum Spanning Tree Definition] What is the minimum spanning tree of a graph?
    Choices: (1) A tree with minimum depth; (2) A tree with minimum weight; (3) A tree with maximum edges; (4) A spanning tree with negative cycles. Correct answer: A tree with minimum weight
83. [Applications of BFS] Which of the following is NOT a valid application of BFS?
    Choices: (1) Shortest path in unweighted graph; (2) Finding connected components; (3) Checking bipartiteness; (4) Topological sorting. Correct answer: Topological sorting
84. [Adjacency Matrix vs List] When is an adjacency matrix more efficient than an adjacency list?
    Choices: (1) In sparse graphs; (2) In dense graphs; (3) In unweighted graphs; (4) In directed graphs. Correct answer: In dense graphs
85. [Topological Sorting] Which graph traversal algorithm is used for topological sorting?
    Choices: (1) BFS; (2) DFS; (3) Dijkstra; (4) Prim. Correct answer: DFS
86. [Graph Representation Efficiency] What is the space complexity of an adjacency matrix for a graph with V vertices?
    Choices: (1) O(V); (2) O(V^2); (3) O(E); (4) O(V + E). Correct answer: O(V^2)
87. [Graph Traversal in DFS] What happens when a node is visited in DFS traversal?
    Choices: (1) It is marked as visited; (2) It is pushed into the queue; (3) It is revisited in cycles; (4) All its neighbors are explored simultaneously. Correct answer: It is marked as visited
88. [Use of Disjoint Sets] Which data structure is used to detect cycles in Kruskal's algorithm?
    Choices: (1) Queue; (2) Disjoint Sets; (3) Priority Queue; (4) Stack. Correct answer: Disjoint Sets
89. [Dijkstra's Algorithm Assumption] What assumption is made in Dijkstra's algorithm for edge weights?
    Choices: (1) Negative weights are allowed; (2) All weights must be non-negative; (3) All weights must be integers; (4) Edge weights must form a cycle-free graph. Correct answer: All weights must be non-negative
90. [Edge Relaxation in Bellman-Ford] What does edge relaxation mean in the Bellman-Ford algorithm?
    Choices: (1) Adjusting edge weights; (2) Updating distances for adjacent nodes; (3) Finding cycles in a graph; (4) Adding new edges. Correct answer: Updating distances for adjacent nodes
91. [Floyd-Warshall and Negative Cycles] How does Floyd-Warshall detect negative weight cycles?
    Choices: (1) By checking the main diagonal of the matrix; (2) By examining shortest paths; (3) By checking for unvisited vertices; (4) By running DFS after all iterations. Correct answer: By checking the main diagonal of the matrix
92. [BFS Queue Usage] What is the purpose of a queue in BFS?
    Choices: (1) To explore deeper levels; (2) To maintain the order of node traversal; (3) To detect cycles; (4) To optimize space usage. Correct answer: To maintain the order of node traversal
93. [Binary Maze Path Finding] Which of the following cannot be solved using BFS in a binary maze?
    Choices: (1) Shortest path; (2) Counting number of paths; (3) Detecting unreachable areas; (4) Solving weighted path problems. Correct answer: Solving weighted path problems
94. [Prim's Algorithm Efficiency] What is the primary difference between Prim's and Kruskal's algorithms?
    Choices: (1) Prim's uses adjacency list; (2) Kruskal's uses adjacency matrix; (3) Kruskal's grows a forest; (4) Prim's grows a tree. Correct answer: Prim's grows a tree
95. [Kruskal's Algorithm Time Complexity] What is the time complexity of Kruskal's algorithm using a union-find data structure?
    Choices: (1) O(E log V); (2) O(V log V); (3) O(E log E); (4) O(V^2). Correct answer: O(E log V)
96. [DP Fibonacci Tabulation] What is the time complexity of solving Fibonacci using tabulation?
    Choices: (1) O(n); (2) O(log n); (3) O(2^n); (4) O(1). Correct answer: O(n)
97. [Dijkstra's Priority Queue] What does the priority queue store in Dijkstra's algorithm?
    Choices: (1) Only the source vertex; (2) The current shortest distances; (3) All edges; (4) All vertices. Correct answer: The current shortest distances
98. [Bellman-Ford Use Case] Why is Bellman-Ford preferred over Dijkstra in certain cases?
    Choices: (1) It is faster; (2) It works with graphs having cycles; (3) It works with negative weights; (4) It does not use edge relaxation. Correct answer: It works with negative weights
99. [Adjacency List Storage] How is an adjacency list stored in memory?
    Choices: (1) As a matrix; (2) As a hash map; (3) As an array of linked lists; (4) As a binary tree. Correct answer: As an array of linked lists
100. [BFS Applications] Which of the following real-world problems can BFS solve?
     Choices: (1) Social network connectivity; (2) Network flow problems; (3) Detecting deadlocks; (4) All of the above. Correct answer: All of the above
101. [Dynamic Programming Conditions] Which two properties are essential for applying dynamic programming?
     Choices: (1) Greedy and Divide and Conquer; (2) Overlapping Subproblems and Optimal Substructure; (3) Optimal Substructure and Greedy; (4) Divide and Conquer and Memoization. Correct answer: Overlapping Subproblems and Optimal Substructure
102. [Graph Traversal Efficiency] What is the worst-case time complexity of BFS on a graph with V vertices and E edges?
     Choices: (1) O(V); (2) O(V + E); (3) O(E^2); (4) O(V^2). Correct answer: O(V + E)
103. [Edge List Representation] How is an edge list stored in memory?
     Choices: (1) As a list of (u, v) pairs; (2) As a matrix of size VxV; (3) As a binary tree; (4) As a dictionary of lists. Correct answer: As a list of (u, v) pairs
104. [DP Problem Solving Strategy] What is the correct order to solve a DP problem using tabulation?
     Choices: (1) Bottom-up approach; (2) Top-down approach; (3) Randomized approach; (4) Brute force. Correct answer: Bottom-up approach
105. [MST Application in Networks] What is a real-world application of Minimum Spanning Trees?
     Choices: (1) Routing network connections; (2) Detecting deadlocks; (3) Solving shortest paths; (4) Pathfinding in mazes. Correct answer: Routing network connections
106. [BFS Layer-wise Traversal] What is the characteristic of BFS traversal?
     Choices: (1) Explores nodes layer by layer; (2) Explores deeper nodes first; (3) Avoids cycles; (4) Uses a stack. Correct answer: Explores nodes layer by layer
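As with the first bank, two illustrative Python sketches follow; they are study aids rather than original items, and the function names are illustrative. The first stores a graph as a dictionary-of-lists adjacency list and runs BFS with a FIFO queue, which is why it explores nodes level by level and finds shortest paths in an unweighted graph in O(V + E) time.

from collections import deque

def bfs_shortest_distances(adj, source):
    """Unweighted single-source shortest paths.

    adj: adjacency list as a dictionary of lists, e.g. {0: [1, 2], 1: [2], 2: []}.
    Returns a dict mapping each reachable vertex to its distance from source.
    Runs in O(V + E) time using a FIFO queue (level-by-level exploration).
    """
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v not in dist:           # first time v is reached -> shortest distance
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist


graph = {0: [1, 2], 1: [3], 2: [3], 3: []}
print(bfs_shortest_distances(graph, 0))   # {0: 0, 1: 1, 2: 1, 3: 2}

The second is a minimal Bellman-Ford sketch: every edge is relaxed V-1 times, and one extra pass over the edges flags a negative-weight cycle reachable from the source.

def bellman_ford(n, edges, source):
    """Single-source shortest paths with negative edge weights allowed.

    n: number of vertices (0 .. n-1); edges: list of (u, v, weight) tuples.
    Relaxes every edge n-1 times (O(V*E)); one extra pass detects a
    negative-weight cycle reachable from the source.
    """
    INF = float("inf")
    dist = [INF] * n
    dist[source] = 0
    for _ in range(n - 1):                      # V-1 rounds of edge relaxation
        for u, v, w in edges:
            if dist[u] + w < dist[v]:           # relax: a shorter path to v was found
                dist[v] = dist[u] + w
    for u, v, w in edges:                       # V-th pass: any further improvement
        if dist[u] + w < dist[v]:               # means a negative cycle exists
            raise ValueError("negative-weight cycle detected")
    return dist


print(bellman_ford(3, [(0, 1, 4), (0, 2, 5), (1, 2, -2)], 0))   # [0, 4, 2]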