Faculty of Information Technology
Local Search
(ARTIFICIAL INTELLIGENCE)
Instructor: Assoc. Prof. PhD. Hoàng Văn Dũng
Email: dunghv@hcmute.edu.vn
(slides adapted and revised from Dan Klein, Pieter Abbeel, Anca Dragan, et al)
Local Search
ISLab- Intelligent Systems Laboratory
Local Search
Example: 4-queens problem
• h = number of pairs of queens that are attacking each other, either directly or indirectly
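As a sketch (function and variable names are mine, not from the slides), h can be computed by checking every pair of queens for a shared row or diagonal; storing one queen per column rules out column attacks by construction:

```python
def attacking_pairs(rows):
    """h for n-queens: number of pairs of queens attacking each other.

    rows[c] is the row of the queen in column c, so no two queens
    share a column by construction; only rows and diagonals can clash.
    """
    h = 0
    n = len(rows)
    for c1 in range(n):
        for c2 in range(c1 + 1, n):
            same_row = rows[c1] == rows[c2]
            same_diag = abs(rows[c1] - rows[c2]) == abs(c1 - c2)
            if same_row or same_diag:
                h += 1
    return h

print(attacking_pairs([0, 1, 2, 3]))  # all on one diagonal -> 6
print(attacking_pairs([1, 3, 0, 2]))  # a 4-queens solution  -> 0
```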
Local Search
• Tree search keeps unexplored alternatives on the fringe (ensures completeness)
• Local search: improve a single option until you can’t make it better (no fringe!)
• New successor function: local changes
• Generally much faster and more memory efficient (but incomplete and suboptimal)
Hill Climbing
• Simple, general idea:
– Start wherever
– Repeat: move to the best neighboring state
– If no neighbor is better than the current state, quit
• What’s bad about this approach?
• What’s good about it?
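The loop above can be sketched as follows (a minimal illustration, assuming a maximization problem; the `neighbors` and `score` functions are hypothetical placeholders):

```python
def hill_climb(state, neighbors, score):
    """Steepest-ascent hill climbing: repeatedly move to the best
    neighboring state; stop when no neighbor improves on the current one."""
    while True:
        best = max(neighbors(state), key=score, default=None)
        if best is None or score(best) <= score(state):
            return state  # local maximum (or plateau edge)
        state = best

# Toy example: maximize -(x - 3)^2 over the integers, stepping by +/-1
result = hill_climb(0,
                    neighbors=lambda x: [x - 1, x + 1],
                    score=lambda x: -(x - 3) ** 2)
print(result)  # climbs 0 -> 1 -> 2 -> 3, then stops -> 3
```

Note that on this convex toy landscape the climb always succeeds; the slides' point is that on landscapes with several peaks it halts at whichever local maximum is nearest the start.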
Hill-climbing search
• "Like climbing Everest in thick fog with amnesia"
Hill Climbing Search
Hill Climbing Quiz
Starting from X, where do you end up?
Starting from Y, where do you end up?
Starting from Z, where do you end up?
Hill Climbing
• Example: 8-puzzle
Hill-climbing search
Example: 8-queens problem
• h = number of pairs of queens that are attacking each other, either directly or indirectly
• h = 17 for the above state
Hill-climbing search
Example: 8-queens problem
• A local minimum with h = 1
Hill Climbing Diagram
• Problem: depending on the initial state, hill climbing can get stuck in local maxima
Beam search
• Keep track of k states rather than just one
• Start with k randomly generated states
• At each iteration, all the successors of all k states are generated
• If any one is a goal state, stop; else select the k best successors from the complete list and repeat
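The steps above can be sketched as follows (a minimal illustration, assuming a maximization problem; all function names are hypothetical):

```python
def beam_search(successors, score, is_goal, k, start_states, iters=100):
    """Local beam search: keep the k best states, expand all of them,
    and keep only the k best of the pooled successors each round."""
    states = start_states
    for _ in range(iters):
        if any(is_goal(s) for s in states):
            return next(s for s in states if is_goal(s))
        pool = [s2 for s in states for s2 in successors(s)]
        if not pool:
            break
        states = sorted(pool, key=score, reverse=True)[:k]
    return max(states, key=score)  # best state found if no goal reached

# Toy example: walk integers toward the goal x = 10
result = beam_search(
    successors=lambda x: [x - 1, x + 1],
    score=lambda x: -(x - 10) ** 2,
    is_goal=lambda x: x == 10,
    k=3,
    start_states=[0, 5, 20])
print(result)  # -> 10
```

Unlike running k independent hill climbs, the shared pool lets good states crowd out bad ones, which is beam search's strength (and, per the slides, a source of diversity loss).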
Simulated Annealing
• Idea: Escape local maxima by allowing downhill moves
– But make them rarer as time goes on
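A sketch of the idea (a minimal maximization version with geometric cooling; the schedule and toy landscape are my choices, not from the slides):

```python
import math
import random

def simulated_annealing(state, neighbor, score, t0=1.0, cooling=0.99,
                        steps=10000):
    """Always accept uphill moves; accept a downhill move with
    probability e^(dE/T), where the temperature T shrinks over time."""
    t = t0
    best = state
    for _ in range(steps):
        nxt = neighbor(state)
        dE = score(nxt) - score(state)
        if dE > 0 or random.random() < math.exp(dE / t):
            state = nxt  # downhill moves become rarer as t falls
        if score(state) > score(best):
            best = state
        t *= cooling
        if t < 1e-9:
            break
    return best

random.seed(0)
# Toy example: maximize -(x - 7)^2 over the integers with +/-1 moves
result = simulated_annealing(5,
                             neighbor=lambda x: x + random.choice([-1, 1]),
                             score=lambda x: -(x - 7) ** 2)
print(result)
```

Early on (large T) the walk moves almost freely; as T decays it behaves ever more like pure hill climbing, which is exactly the "make downhill moves rarer over time" idea.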
Simulated Annealing
• Theoretical guarantee:
– Stationary distribution (Boltzmann): p(x) ∝ e^(E(x)/(kT))
– If T is decreased slowly enough, p(x) will converge to the optimal state!
• Is this an interesting guarantee?
• Sounds like magic, but reality is reality:
– The more downhill steps you need to escape a local optimum, the
less likely you are to ever make them all in a row
– People think hard about ridge operators which let you jump around
the space in better ways
Genetic algorithms
• A successor state is generated by combining two parent states
• Start with k randomly generated states (population)
• A state is represented as a string over a finite alphabet (often a string of 0s and 1s)
• Evaluation function (fitness function): higher values for better states
• Produce the next generation of states by selection, crossover, and mutation
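The generation loop above can be sketched on bit strings (a minimal illustration; the OneMax fitness, parameter values, and the +1 weight offset to avoid all-zero selection weights are my choices):

```python
import random

def genetic_algorithm(fitness, length=12, k=20, generations=100, p_mut=0.02):
    """GA sketch: fitness-proportional selection, one-point crossover,
    and per-bit mutation over a population of k bit strings."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(k)]
    for _ in range(generations):
        # +1 keeps the weights valid even if every fitness is 0
        weights = [fitness(ind) + 1 for ind in pop]
        nxt = []
        for _ in range(k):
            a, b = random.choices(pop, weights=weights, k=2)  # selection
            cut = random.randrange(1, length)                 # crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut)          # mutation
                     for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

random.seed(1)
best = genetic_algorithm(fitness=sum)  # OneMax: count the 1 bits
```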
Genetic Algorithms
• Genetic algorithms use a natural selection metaphor
– Keep the best N hypotheses at each step (selection), based on a fitness function
– Also have pairwise crossover operators, with optional mutation to give variety
• Possibly the most misunderstood, misapplied (and even maligned) technique around
• Example (8-queens): fitness function = number of non-attacking pairs of queens (min = 0, max = 8 × 7/2 = 28)
• Selection probability proportional to fitness: 24/(24+23+20+11) = 31%, 23/(24+23+20+11) = 29%, etc.
Genetic Algorithms
3 Operators:
➢ Reproduction
➢ Crossover
➢ Mutation
Genetic Algorithms
Example: N-Queens
• Why does crossover make sense here?
• When wouldn’t it make sense?
• What would mutation be?
• What would a good fitness function be?
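One plausible set of answers, sketched in Python (the representation and operator choices are mine): a state is a list where entry c is the row of the queen in column c, crossover splices two such lists, mutation moves one queen to a random row, and fitness counts non-attacking pairs:

```python
import random

def fitness(rows):
    """Number of non-attacking pairs of queens (8-queens max = 28)."""
    n = len(rows)
    attacking = sum(1 for i in range(n) for j in range(i + 1, n)
                    if rows[i] == rows[j]
                    or abs(rows[i] - rows[j]) == j - i)
    return n * (n - 1) // 2 - attacking

def crossover(a, b):
    """One-point crossover: columns 0..cut-1 from a, the rest from b."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(rows):
    """Move one randomly chosen queen to a random row in its column."""
    rows = list(rows)
    rows[random.randrange(len(rows))] = random.randrange(len(rows))
    return rows

print(fitness([0, 4, 7, 5, 2, 6, 1, 3]))  # a solved board -> 28
print(fitness(list(range(8))))            # all on one diagonal -> 0
```

Crossover makes sense here because each column's queen position is a semi-independent piece of the solution; it makes less sense for representations where good traits span the whole string.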
Genetic Algorithms
Thanks for your attention!
Q&A
Local Search in Continuous Spaces
• Put 3 airports in Romania to minimize the sum of squared distances of each city to its nearest airport
• Variables: x1, y1, x2, y2, x3, y3 (airport i at (xi, yi))
• Ci = set of cities nearest to airport i
• Cost: f(x1, y1, x2, y2, x3, y3) = Σi Σ(c in Ci) (xi − xc)² + (yi − yc)²
Local Search in Continuous Spaces
• Cost: f(x1, y1, x2, y2, x3, y3) = Σi Σ(c in Ci) (xi − xc)² + (yi − yc)²
• Method 1: discretize, compute the empirical gradient: compare f(x1 + dx, y1, x2, y2, x3, y3), etc.
• Method 2: stochastic descent: generate a small random vector dx and accept it if f(x + dx) < f(x)
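Method 2 can be sketched like this (the city and airport coordinates are made-up test data, not Romania):

```python
import random

def airport_cost(pos, cities):
    """Sum over cities of the squared distance to the nearest airport."""
    return sum(min((x - cx) ** 2 + (y - cy) ** 2 for x, y in pos)
               for cx, cy in cities)

def stochastic_descent(pos, cities, step=0.1, iters=20000):
    """Perturb one airport by a small random vector; keep the move
    only if it lowers the total cost."""
    pos = list(pos)
    for _ in range(iters):
        i = random.randrange(len(pos))
        cand = list(pos)
        cand[i] = (pos[i][0] + random.uniform(-step, step),
                   pos[i][1] + random.uniform(-step, step))
        if airport_cost(cand, cities) < airport_cost(pos, cities):
            pos = cand
    return pos

random.seed(0)
cities = [(0, 0), (0, 1), (5, 5), (5, 6), (10, 0), (10, 1)]
airports = stochastic_descent([(1, 1), (6, 5), (9, 1)], cities)
# each airport drifts toward the centroid of its pair of cities
```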
Local Search in Continuous Spaces
• Cost: f(x1, y1, x2, y2, x3, y3) = Σi Σ(c in Ci) (xi − xc)² + (yi − yc)²
• Method 3: take a small step along the gradient vector: x ← x − α∇f(x)
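Since ∂f/∂xi = 2 Σ(c in Ci) (xi − xc) (and similarly for yi), Method 3 can be sketched with the analytic gradient (coordinates are made-up test data):

```python
def gradient_step(pos, cities, alpha=0.05):
    """One gradient-descent step on the airport-placement cost:
    df/dxi = 2 * sum over cities in Ci of (xi - cx)."""
    grads = [[0.0, 0.0] for _ in pos]
    for cx, cy in cities:
        # Ci: assign each city to its nearest airport
        i = min(range(len(pos)),
                key=lambda j: (pos[j][0] - cx) ** 2 + (pos[j][1] - cy) ** 2)
        grads[i][0] += 2 * (pos[i][0] - cx)
        grads[i][1] += 2 * (pos[i][1] - cy)
    return [(x - alpha * gx, y - alpha * gy)
            for (x, y), (gx, gy) in zip(pos, grads)]

cities = [(0, 0), (0, 1), (5, 5), (5, 6), (10, 0), (10, 1)]
pos = [(1.0, 1.0), (6.0, 5.0), (9.0, 1.0)]
for _ in range(200):
    pos = gradient_step(pos, cities)
# each airport converges to the centroid of its set of nearest cities
```

Note that the sets Ci change as the airports move, so f is only piecewise smooth; here the assignment happens to stay fixed and each airport settles at its cities' centroid.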