An Interactive Guide to Local Search Algorithms

Exploring Smart Ways to Solve Complex Problems in Artificial Intelligence


Created by Sajjad Ahmed Niloy

Hill-Climbing Search

How it Works

  1. Start Anywhere: Begin with a random solution.
  2. Look Around: Examine all neighboring solutions.
  3. Take the Best Step: Move to the best neighbor (see the sketch after this list).
  4. Repeat: Continue until no neighbor is better.
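
Steps 2 and 3 are the heart of the algorithm: evaluate every neighbor and move only when the best of them improves on the current solution. Below is a minimal sketch of that steepest-ascent step; the get_neighbors helper is a hypothetical stand-in for whatever neighborhood your problem defines, and the objective is the same toy parabola used in the code example further down.

def objective_function(x):
    return -(x - 2)**2 + 10  # Peak is at x=2

def get_neighbors(x, step_size=0.1):
    # Hypothetical neighborhood: one step left or right in 1D.
    return [x - step_size, x + step_size]

def hill_climb_step(current):
    best = max(get_neighbors(current), key=objective_function)
    # Move only if the best neighbor is strictly better (step 3).
    if objective_function(best) > objective_function(current):
        return best, True
    return current, False  # No neighbor is better: stop (step 4).

x, improved = 5.0, True
while improved:
    x, improved = hill_climb_step(x)
print(f"Stopped near x = {x:.2f}")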

Analogy: The Lost Hiker

A hiker lost in thick fog always walks in whichever direction leads uphill, stopping only when every direction leads downhill. They end up on top of a hill, but not necessarily the highest one.

Pros

  • Simple & easy to implement.
  • Memory efficient.

Cons

  • Gets stuck on local maxima.
  • Can't navigate plateaus.

Interactive Simulation

A climber (blue dot) tries to find the highest peak. The true highest peak is marked with a green flag. Watch how it often gets stuck on a smaller peak (local maximum), marked by a red flag. Click "Reset" to try again from a new random spot.

Python Code Example

# Objective function (a simple parabola)
def objective_function(x):
    return -(x - 2)**2 + 10  # Peak is at x=2

# Simple Hill-Climbing implementation
current_solution = 5.0
step_size = 0.1

for _ in range(100):
    current_value = objective_function(current_solution)
    # Check neighbors
    if objective_function(current_solution + step_size) > current_value:
        current_solution += step_size
    elif objective_function(current_solution - step_size) > current_value:
        current_solution -= step_size
    else:
        break  # Found a peak

print(f"Found peak near x = {current_solution:.2f}")

Simulated Annealing

How it Works

  1. Start Hot: Begin with a random solution and high "temperature."
  2. Random Move: Move to a random neighbor.
  3. Acceptance Probability: If the neighbor is better, accept it. If worse, accept it with a probability that shrinks the worse the move is and the cooler the temperature gets (see the sketch after this list).
  4. Cool Down: Gradually decrease the temperature.
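
Step 3 is the classic Metropolis acceptance rule: a worse neighbor is accepted with probability exp(-ΔE / T), where ΔE is how much worse the neighbor is and T is the current temperature. A minimal sketch (the energies and temperatures here are made-up numbers for illustration):

import math

def acceptance_probability(current_energy, neighbor_energy, temp):
    # Better (lower-energy) neighbors are always accepted.
    if neighbor_energy < current_energy:
        return 1.0
    # Worse neighbors are accepted with probability exp(-delta_E / T).
    return math.exp((current_energy - neighbor_energy) / temp)

# The same uphill move is almost always taken when hot, rarely when cold:
print(acceptance_probability(10.0, 12.0, temp=100.0))  # ~0.98
print(acceptance_probability(10.0, 12.0, temp=1.0))    # ~0.14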

Analogy: The Bouncing Ball

Imagine a ball bouncing on a bumpy surface. While it has high energy (hot), it can bounce out of small valleys. As it cools, it loses energy and settles into the deepest valley it has found.

Pros

  • Good at escaping local optima.
  • More likely than hill-climbing to find the global optimum, given a slow enough cooling schedule.

Cons

  • Can be slower to converge.
  • Requires careful tuning of the temperature and cooling schedule.

Interactive Simulation

This "bouncing ball" (orange dot) is trying to find the lowest valley (global minimum, green flag). When "hot" (early in the simulation), it can jump out of shallow valleys (local minima). As it "cools," it settles into the best solution it has found. Click "Reset" to start over.

Python Code Example

import random
import math

# A more complex function to minimize
def objective_function(x):
    return (x**2 - 4*x + 5) * math.sin(3*x)

current_solution = random.uniform(-5, 5)
temp = 100.0
cooling_rate = 0.99

for _ in range(1000):
    current_energy = objective_function(current_solution)
    neighbor = current_solution + random.uniform(-1, 1)
    neighbor_energy = objective_function(neighbor)
    # Decide to move to neighbor
    if neighbor_energy < current_energy or \
       random.random() < math.exp((current_energy - neighbor_energy) / temp):
        current_solution = neighbor
    temp *= cooling_rate  # Cool down

print(f"Found minimum near x = {current_solution:.2f}")

Genetic Algorithms

How it Works

  1. Initial Population: Start with many random solutions.
  2. Fitness Evaluation: Score how good each solution is.
  3. Selection: The best solutions are chosen to be "parents".
  4. Crossover & Mutation: Parents create "offspring" by combining their genes and randomly changing some of them (sketched after this list).
  5. New Generation: The new population replaces the old, and the cycle repeats.
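
Crossover and mutation are easiest to see in isolation. The full code example below uses uniform crossover (each gene drawn from either parent at random); the single-point variant sketched here is an equally common choice. The parent strings are made up for illustration.

import random

GENES = " ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def single_point_crossover(p1, p2):
    # Cut both parents at the same random point and splice the halves.
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:]

def mutate(child, rate=0.05):
    # Each gene independently has a small chance of being replaced.
    return "".join(random.choice(GENES) if random.random() < rate else c
                   for c in child)

child = single_point_crossover("HELXO", "QELLO")
print(mutate(child))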

Analogy: Breeding Racehorses

Start with many horses. Let the fastest ones breed. Their offspring are likely to be fast. Over many generations, you evolve an even faster horse.

Pros

  • Excellent for global optimization.
  • Handles very complex problems.

Cons

  • Computationally expensive.
  • Many parameters to configure.

Interactive Simulation

A population of random phrases evolves to match the target phrase. Watch how the "Best Phrase" gets closer each generation through selection, crossover, and mutation. Click "Reset" to start over.

Python Code Example

# --- Conceptual Example ---
# Goal: Evolve a string to match a target
import random

TARGET = "HELLO"
POP_SIZE = 100
VALID_GENES = " ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def random_char():
    return random.choice(VALID_GENES)

def fitness(individual):
    # Higher score is better
    return sum(1 for i, c in enumerate(individual) if c == TARGET[i])

# 1. Initialization
population = ["".join(random_char() for _ in range(len(TARGET)))
              for _ in range(POP_SIZE)]

for generation in range(500):
    # 2. Selection
    parents = sorted(population, key=fitness, reverse=True)[:10]
    # 3. Crossover & 4. Mutation
    offspring = []
    for _ in range(POP_SIZE):
        p1, p2 = random.sample(parents, 2)
        child = "".join(p1[i] if random.random() < 0.5 else p2[i]
                        for i in range(len(TARGET)))
        if random.random() < 0.2:  # Mutate
            child_list = list(child)
            child_list[random.randint(0, len(TARGET) - 1)] = random_char()
            child = "".join(child_list)
        offspring.append(child)
    # 5. New generation replaces the old
    population = offspring
    if TARGET in population:
        break

print(f"Best solution found: {max(population, key=fitness)}")