
Local Search Algorithms

Limitations of Search Tree

• A search tree explores the state space
systematically
• It keeps many paths in memory at once
• Hence it is not space-efficient
Local Search
• Moves from the current state to a neighboring
state, guided by an objective function
• The objective function defines a landscape over
the state space
• Each state has a value of the objective function
• The aim is to reach the global minimum or
maximum
Landscape
Hill Climbing Search
(Steepest ascent / Greedy local search)

8-queens problem – Complete-state formulation
State: 1 queen per column
Successor states: move a single queen to another
square in the same column
Number of successors: 8 × 7 = 56
Heuristic cost function h: number of pairs of
queens attacking each other
Global minimum: h = 0
(Figure: an 8-queens state with h = 17; each empty square is labeled
with the h value of the successor obtained by moving that column's
queen there, and the possible next moves are those with minimum h.)
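The steepest-ascent search on the 8-queens formulation above can be sketched as follows. This is a minimal illustration, not code from the slides; names such as `attacking_pairs` and `hill_climb` are my own.

```python
# Steepest-ascent hill climbing for 8-queens (complete-state formulation).
import random

N = 8

def attacking_pairs(state):
    """h: number of pairs of queens attacking each other.
    state[c] = row of the queen in column c."""
    h = 0
    for c1 in range(N):
        for c2 in range(c1 + 1, N):
            same_row = state[c1] == state[c2]
            same_diag = abs(state[c1] - state[c2]) == c2 - c1
            if same_row or same_diag:
                h += 1
    return h

def hill_climb(state):
    """Move to the best successor until no successor improves h."""
    while True:
        h = attacking_pairs(state)
        best, best_h = None, h
        # 8 * 7 = 56 successors: move one queen within its own column
        for col in range(N):
            for row in range(N):
                if row == state[col]:
                    continue
                succ = list(state)
                succ[col] = row
                sh = attacking_pairs(succ)
                if sh < best_h:
                    best, best_h = succ, sh
        if best is None:            # no improving move: a local minimum
            return state, h
        state = best

random.seed(0)
state, h = hill_climb([random.randrange(N) for _ in range(N)])
print(h)  # frequently ends at a local minimum with h > 0
```

Running it from many random starts shows the key weakness discussed next: the search often halts at a state with h > 0.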
Problems with hill climbing

Getting stuck at
1. Local maxima – peaks lower than the global maximum
2. Ridges – a sequence of local maxima
3. Plateaux – flat areas of the landscape
Variants
Stochastic hill climbing
- Randomly choose among the uphill moves
E.g., first-choice hill climbing: generate successors
randomly until one better than the current state is found

Random-restart hill climbing
- Run hill climbing from randomly generated initial
states until a goal is reached
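The two variants above can be combined in a short sketch: a first-choice step that samples random successors, wrapped in random restarts. This is illustrative only; the sampling bound and helper names are my choices, not from the slides.

```python
# First-choice hill climbing with random restarts for 8-queens.
import random

N = 8

def h(state):
    """Pairs of attacking queens (state[c] = row of queen in column c)."""
    return sum(
        state[a] == state[b] or abs(state[a] - state[b]) == b - a
        for a in range(N) for b in range(a + 1, N)
    )

def first_choice_step(state):
    """Sample random successors until one improves h (bounded effort)."""
    cur = h(state)
    for _ in range(200):
        col, row = random.randrange(N), random.randrange(N)
        succ = state[:col] + [row] + state[col + 1:]
        if h(succ) < cur:
            return succ
    return None                      # treat as a local minimum

def random_restart(max_restarts=50):
    """Restart from random initial states until a goal (h = 0) is found."""
    for _ in range(max_restarts):
        state = [random.randrange(N) for _ in range(N)]
        while True:
            nxt = first_choice_step(state)
            if nxt is None:
                break
            state = nxt
        if h(state) == 0:
            return state
    return None                      # no goal within the restart budget

random.seed(1)
solution = random_restart()
```

Each restart is independent, so with enough restarts the probability of missing a solution shrinks geometrically.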
Simulated Annealing
Hill climbing
• Never makes a downhill move
• Efficient, but the algorithm is incomplete – it can get stuck
Random walk
• Takes a random successor at each step
• Complete, but very inefficient

Simulated annealing in metallurgy
• Harden metal or glass by heating it to a high
temperature and cooling it gradually
Simulated Annealing - Algorithm
1. Pick a random move
2. Compute the cost difference
3. Accept the move if it is an improvement
4. Otherwise accept it with a probability that depends
on the cost difference and a temperature T
– Bad moves are initially accepted with high
probability (high T)
– Bad moves are largely not accepted later (as T
decreases)
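The four steps above can be sketched as follows. The exponential cooling schedule and the acceptance probability e^(−ΔE/T) are standard choices, assumed here rather than fixed by the slides.

```python
# Simulated annealing for 8-queens (cost = attacking pairs, minimized).
import math
import random

N = 8

def cost(state):
    """Pairs of attacking queens (state[c] = row of queen in column c)."""
    return sum(
        state[a] == state[b] or abs(state[a] - state[b]) == b - a
        for a in range(N) for b in range(a + 1, N)
    )

def simulated_annealing(t0=10.0, cooling=0.995, steps=20000):
    state = [random.randrange(N) for _ in range(N)]
    t = t0
    for _ in range(steps):
        if cost(state) == 0:
            break
        # 1. Pick a random move (one queen to a new square in its column)
        col, row = random.randrange(N), random.randrange(N)
        succ = state[:col] + [row] + state[col + 1:]
        # 2. Compute the cost difference
        dE = cost(succ) - cost(state)
        # 3. Accept if it is an improvement;
        # 4. otherwise accept with probability e^(-dE/T)
        if dE <= 0 or random.random() < math.exp(-dE / t):
            state = succ
        t *= cooling          # bad moves become rarer as T falls
    return state

random.seed(2)
final = simulated_annealing()
```

At high T the exponent is near 0 and bad moves are accepted often; as T decays, e^(−ΔE/T) approaches 0 and the search behaves like hill climbing.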
Simulated Annealing
Main Idea

Escape local maxima by allowing some BAD
moves, but gradually decrease their size and
frequency.
Local Beam Search

1. Initially select k randomly generated states
2. Generate the successors of all k states
3. Successor generation goes on in parallel
"threads", keeping the k best successors at
each step
4. The algorithm halts if any successor is a goal

Comparison with random restart:
Local beam search passes useful information
among the parallel search threads
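The steps above can be sketched for 8-queens as follows; k, the iteration cap, and the function names are illustrative assumptions. Pooling all successors and keeping the k best is exactly where the "threads" share information: a thread whose neighborhood is poor contributes nothing to the next round.

```python
# Local beam search for 8-queens: k states advance together.
import random

N = 8

def h(state):
    """Pairs of attacking queens (state[c] = row of queen in column c)."""
    return sum(
        state[a] == state[b] or abs(state[a] - state[b]) == b - a
        for a in range(N) for b in range(a + 1, N)
    )

def successors(state):
    """All 56 moves: one queen to a different square in its column."""
    for col in range(N):
        for row in range(N):
            if row != state[col]:
                yield state[:col] + (row,) + state[col + 1:]

def local_beam_search(k=10, iters=200):
    states = [tuple(random.randrange(N) for _ in range(N)) for _ in range(k)]
    for _ in range(iters):
        if any(h(s) == 0 for s in states):
            return min(states, key=h)
        # pool the successors of ALL k states, then keep the k best
        pool = {succ for s in states for succ in successors(s)}
        states = sorted(pool, key=h)[:k]
    return min(states, key=h)

random.seed(3)
best = local_beam_search()
```

In contrast, k independent random-restart runs would each keep only their own best successor, with no exchange between runs.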
Genetic Algorithms
• A successor state is generated by combining 2 parent states

Initial population:
k randomly generated states

Fitness function:
number of non-attacking pairs of queens
Genetic Algorithms

(Figure: each state is encoded as a string of 8 digits giving the
position of the queen in each column.)
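The scheme above can be sketched for 8-queens as follows. Fitness-proportionate selection, one-point crossover, and the mutation rate are standard choices assumed for illustration, not prescribed by the slides.

```python
# A genetic algorithm for 8-queens; an individual is a tuple of 8 digits,
# the row of the queen in each column.
import random

N = 8
MAX_PAIRS = N * (N - 1) // 2      # 28 pairs of queens in total

def fitness(ind):
    """Number of NON-attacking pairs of queens (higher is better)."""
    attacking = sum(
        ind[a] == ind[b] or abs(ind[a] - ind[b]) == b - a
        for a in range(N) for b in range(a + 1, N)
    )
    return MAX_PAIRS - attacking

def reproduce(x, y):
    """One-point crossover: child takes a prefix of x and a suffix of y."""
    c = random.randrange(1, N)
    return x[:c] + y[c:]

def mutate(ind, rate=0.1):
    """With small probability, move one queen to a random row."""
    if random.random() < rate:
        col = random.randrange(N)
        ind = ind[:col] + (random.randrange(N),) + ind[col + 1:]
    return ind

def genetic_algorithm(pop_size=100, generations=1000):
    pop = [tuple(random.randrange(N) for _ in range(N))
           for _ in range(pop_size)]
    for _ in range(generations):
        best = max(pop, key=fitness)
        if fitness(best) == MAX_PAIRS:
            return best
        # fitness-proportionate selection (+1 avoids all-zero weights)
        weights = [fitness(p) + 1 for p in pop]
        pop = [
            mutate(reproduce(*random.choices(pop, weights=weights, k=2)))
            for _ in range(pop_size)
        ]
    return max(pop, key=fitness)

random.seed(4)
best = genetic_algorithm()
```

A solution scores 28 (all pairs non-attacking); crossover lets a child combine good column arrangements from two different parents.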
