UNIT - 2

Greedy Method: General Method

- The Greedy Method is a general algorithmic paradigm that builds a solution through a sequence of choices.
- At each step, it makes the locally optimal choice, hoping that these local choices together lead to a globally optimal solution.
- Greedy algorithms are often easy to implement and computationally efficient.

Knapsack Problem:

- Knapsack Problem involves choosing items with specific weights and values to maximize
the total value within a weight limit.
- Two common variants:
1. 0/1 Knapsack: Items cannot be divided.
2. Fractional Knapsack: Items can be divided.
- Greedy Strategy for Fractional Knapsack: Sort items by value-to-weight ratio and choose
items in descending order of the ratio until the knapsack is full.
- Time Complexity: O(n log n) for sorting in the fractional knapsack problem.
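A minimal Python sketch of this greedy strategy; the item values, weights, and capacity in the example are made-up assumptions for illustration.

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs; returns the maximum total value."""
    # Sort by value-to-weight ratio, highest ratio first.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)        # take the whole item, or the fraction that fits
        total += value * (take / weight)
        capacity -= take
    return total

# Example items as (value, weight), knapsack capacity 50
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0
```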

Huffman Codes:

- Huffman Codes are used for data compression, where frequently used characters are
represented with shorter codes.
- The algorithm builds a binary tree with the characters as leaves and encodes each character by the path from the root to its leaf (for example, 0 for a left branch and 1 for a right branch).
- Greedy Strategy: Repeatedly merge the two lowest-frequency nodes into a new internal node whose frequency is their sum, until only one node (the root) remains.
- Time Complexity: O(n log n) for constructing the Huffman tree.
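A short sketch of the tree-building step using Python's heapq module as the min-priority queue; the frequency table in the example is an illustrative assumption.

```python
import heapq

def huffman_codes(freq):
    """freq: dict mapping character -> frequency; returns dict of character -> bit string."""
    # Heap entries: (frequency, tie-breaker, node); a node is a character or a (left, right) pair.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # the two lowest-frequency nodes
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
        count += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):          # internal node: 0 = left branch, 1 = right branch
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix or "0"      # single-character edge case
    walk(heap[0][2], "")
    return codes

print(huffman_codes({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5}))
```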

Job Sequencing with Deadlines:

- Job Sequencing with Deadlines is a scheduling problem where jobs have associated
deadlines and profits.
- The goal is to maximize total profit by scheduling jobs within their deadlines.
- Greedy Strategy: Sort jobs by profit in non-increasing order and, for each job, schedule it in the latest free slot on or before its deadline.
- Time Complexity: O(n log n) for sorting jobs by profit.
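A minimal Python sketch of this strategy, assuming each job takes one unit of time and is given as a (profit, deadline) pair; the example jobs are made up.

```python
def job_sequencing(jobs):
    """jobs: list of (profit, deadline) pairs; returns (total profit, scheduled jobs in time order)."""
    jobs = sorted(jobs, reverse=True)                  # highest profit first
    max_deadline = max(d for _, d in jobs)
    slots = [None] * (max_deadline + 1)                # slots[1..max_deadline]; None = free
    total = 0
    for profit, deadline in jobs:
        # Place the job in the latest free slot on or before its deadline.
        for t in range(deadline, 0, -1):
            if slots[t] is None:
                slots[t] = (profit, deadline)
                total += profit
                break
    return total, [s for s in slots[1:] if s is not None]

# Example jobs as (profit, deadline)
print(job_sequencing([(100, 2), (19, 1), (27, 2), (25, 1), (15, 3)]))  # total profit 142
```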

Minimum Spanning Trees:

- Minimum Spanning Trees (MST) are used in network design problems to find the minimum
total edge weight to connect all nodes.
- Prim's and Kruskal's algorithms are common for finding MSTs.
- Greedy Strategy (Kruskal's): Sort edges by weight, select the lowest-weight edge, and add
it to the MST if it doesn't create a cycle.
- Time Complexity: O(E log E) for sorting edges using Kruskal's algorithm.
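A compact sketch of Kruskal's algorithm with a simple union-find (disjoint-set) structure; the vertex count and edge list in the example are illustrative assumptions.

```python
def kruskal(n, edges):
    """n: number of vertices (0..n-1); edges: list of (weight, u, v); returns (MST weight, MST edges)."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]     # path halving keeps trees shallow
            x = parent[x]
        return x
    mst, total = [], 0
    for w, u, v in sorted(edges):             # lowest-weight edge first
        ru, rv = find(u), find(v)
        if ru != rv:                          # the edge does not create a cycle
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return total, mst

# Example: 4 vertices, edges given as (weight, u, v)
print(kruskal(4, [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]))  # (6, [...])
```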

Single-Source Shortest Paths:

- Single-Source Shortest Paths problems involve finding the shortest path from one source
node to all other nodes in a weighted graph.
- Dijkstra's and Bellman-Ford algorithms are common for solving such problems.
- Greedy Strategy (Dijkstra's): Maintain a set of vertices with the shortest known distance
and iteratively select the vertex with the smallest distance and update its neighbors.
- Time Complexity: O(V^2) for Dijkstra's with an adjacency matrix, or O((E + V) log V) with a binary-heap min-priority queue (O(E + V log V) with a Fibonacci heap).
- Bellman-Ford has a time complexity of O(V*E).
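A minimal sketch of Dijkstra's algorithm using Python's heapq as the min-priority queue; the adjacency-list graph in the example is a made-up assumption.

```python
import heapq

def dijkstra(graph, source):
    """graph: dict mapping vertex -> list of (neighbor, weight); returns dict of shortest distances."""
    dist = {source: 0}
    heap = [(0, source)]                      # (distance, vertex) min-priority queue
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                          # stale entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd                  # relax the edge and re-queue the neighbor
                heapq.heappush(heap, (nd, v))
    return dist

# Example: directed weighted graph as an adjacency list
graph = {"A": [("B", 4), ("C", 1)], "C": [("B", 2), ("D", 5)], "B": [("D", 1)]}
print(dijkstra(graph, "A"))  # {'A': 0, 'C': 1, 'B': 3, 'D': 4}
```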

Analysis of These Problems:

1. Knapsack Problem:
- Time Complexity: O(n log n) for fractional knapsack (sorting).
- Space Complexity: O(1) for the fractional knapsack.

2. Huffman Codes:
- Time Complexity: O(n log n) for constructing the Huffman tree.
- Space Complexity: O(n) for the Huffman tree data structure.

3. Job Sequencing with Deadlines:
- Time Complexity: O(n log n) for sorting jobs by profit.
- Space Complexity: O(n) for storing the scheduled jobs.

4. Minimum Spanning Trees:
- Kruskal's algorithm has a time complexity of O(E log E) for sorting edges.
- Prim's algorithm runs in O(V^2) with an adjacency matrix, or O(E log V) with a binary-heap priority queue (O(E + V log V) with a Fibonacci heap).
- Space Complexity: O(V) for Prim's and O(E) for Kruskal's.

5. Single-Source Shortest Paths:
- Dijkstra's algorithm's time complexity depends on the implementation (with or without a min-priority queue, as noted above).
- Bellman-Ford has a time complexity of O(V*E).
- Space Complexity: O(V) for the distance and predecessor arrays in both Dijkstra's and Bellman-Ford.

Backtracking: General Method

- Backtracking is a recursive algorithmic paradigm used to solve problems where you need
to make a sequence of decisions to find a solution.
- It systematically explores the solution space by making choices at each step, and if a
choice leads to an infeasible solution, the algorithm backtracks to explore other options.
- Backtracking is often used in combination with pruning techniques to reduce the search
space and improve efficiency.

8-Queens Problem:

- In the 8-Queens problem, the goal is to place eight queens on an 8x8 chessboard so that no two queens threaten each other (i.e., no two queens share the same row, column, or diagonal).
- Backtracking Strategy:
1. Start from the first column and place a queen in the first row.
2. Move to the next column and place a queen in a row where it does not threaten the
queens already placed.
3. Continue this process until either all eight queens are placed or a conflict is detected.
4. If a conflict is detected, backtrack to the previous column and explore other positions.
- The algorithm explores all possible combinations until a valid solution is found or all
possibilities are exhausted.
- Time Complexity: exponential; placing one queen per column bounds the search tree by O(n!) (a naive bound is O(n^n)), where n is the size of the board, and space complexity is O(n) for the partial placement.
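A minimal sketch of the column-by-column backtracking described above, generalized to an n x n board; the function name and the default board size of 8 are illustrative choices.

```python
def solve_n_queens(n=8):
    """Return one placement as a list where positions[c] = row of the queen in column c, or None."""
    positions = [-1] * n

    def safe(col, row):
        for c in range(col):
            r = positions[c]
            if r == row or abs(r - row) == abs(c - col):   # same row or same diagonal
                return False
        return True

    def place(col):
        if col == n:
            return True                      # all queens placed
        for row in range(n):
            if safe(col, row):
                positions[col] = row
                if place(col + 1):
                    return True
                positions[col] = -1          # conflict downstream: backtrack
        return False

    return positions if place(0) else None

print(solve_n_queens(8))  # e.g. [0, 4, 7, 5, 2, 6, 1, 3]
```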

Graph Coloring:

- Graph coloring involves assigning colors to the vertices of a graph in such a way that no
adjacent vertices have the same color.
- The minimum number of colors required to color a graph is called the chromatic number.
- Backtracking Strategy:
1. Start with an empty coloring of the graph.
2. Pick an uncolored vertex and assign it the smallest available color that doesn't conflict
with adjacent vertices.
3. Repeat this process until all vertices are colored or a conflict is detected.
4. If a conflict is detected, backtrack to the previous vertex and explore other colors.
- The algorithm explores all possible colorings until a valid solution is found or all possibilities
are exhausted.
- Time Complexity: exponential, O(m^n) for the decision problem with m colors and n vertices, and space complexity is O(n) for storing color assignments.
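A short sketch of backtracking for the m-coloring decision problem (try the smallest legal color first, backtrack on conflict); the adjacency list and color count in the example are illustrative assumptions.

```python
def graph_coloring(adj, m):
    """adj: adjacency list (list of neighbor lists); m: number of colors; returns a color list or None."""
    n = len(adj)
    colors = [0] * n                         # 0 = uncolored; valid colors are 1..m

    def safe(v, c):
        return all(colors[u] != c for u in adj[v])   # no neighbor already uses color c

    def color(v):
        if v == n:
            return True                      # all vertices colored
        for c in range(1, m + 1):            # smallest available color first
            if safe(v, c):
                colors[v] = c
                if color(v + 1):
                    return True
                colors[v] = 0                # conflict downstream: backtrack
        return False

    return colors if color(0) else None

# Example: a 4-cycle, which is 2-colorable
print(graph_coloring([[1, 3], [0, 2], [1, 3], [0, 2]], 2))  # [1, 2, 1, 2]
```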

Hamiltonian Cycles:

- A Hamiltonian cycle is a cycle that visits every vertex exactly once in a graph.
- A Hamiltonian path visits all vertices, but it does not necessarily return to the starting
vertex.
- Backtracking Strategy:
1. Start with an empty path or cycle.
2. Try to extend the path/cycle by adding an unvisited vertex.
3. Continue this process until either all vertices are visited and a valid Hamiltonian
path/cycle is found or a dead end is reached.
4. If a dead end is reached, backtrack to the previous vertex and explore other unvisited
vertices.
- The algorithm explores all possible permutations of vertices until a valid Hamiltonian path
or cycle is found or all possibilities are exhausted.
- Time Complexity: Exponential, O(n!), where n is the number of vertices, and space
complexity is O(n) for storing path or cycle configuration.
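A minimal sketch of the backtracking search for a Hamiltonian cycle, starting the path at vertex 0; the example graph (a 4-cycle) is a made-up assumption.

```python
def hamiltonian_cycle(adj):
    """adj: adjacency list (list of neighbor lists); returns a Hamiltonian cycle as a vertex list, or None."""
    n = len(adj)
    path = [0]                               # fix vertex 0 as the starting point
    visited = [False] * n
    visited[0] = True

    def extend():
        if len(path) == n:                   # all vertices visited: check the closing edge
            return 0 in adj[path[-1]]
        for v in adj[path[-1]]:
            if not visited[v]:
                visited[v] = True
                path.append(v)
                if extend():
                    return True
                path.pop()                   # dead end: backtrack
                visited[v] = False
        return False

    return path + [0] if extend() else None

# Example: the cycle 0-1-2-3-0
print(hamiltonian_cycle([[1, 3], [0, 2], [1, 3], [0, 2]]))  # [0, 1, 2, 3, 0]
```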
