Simulated Annealing Algorithm

by Sajjad Ahmed Niloy

Simulated Annealing

How it Works

  1. Start Hot: Begin with a random solution and a high "temperature."
  2. Random Move: Consider a random neighboring solution.
  3. Acceptance Probability: If the new solution is better, always move to it. If it's worse, move to it anyway with a probability that depends on the current temperature and on how much worse it is.
  4. Cool Down: Gradually decrease the temperature, making it less likely to accept worse solutions over time.
  5. Repeat: Continue until the system is "cool" and the solution has stabilized.
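The five steps above can be sketched as a generic loop. This is a minimal illustration (the function and parameter names are ours, not part of any library); a fuller worked example appears in the Python Code Example section below.

```python
import math
import random

def simulated_annealing(objective, initial, neighbor_fn,
                        temp=1.0, cooling_rate=0.95, min_temp=1e-4):
    """Generic simulated annealing loop (illustrative sketch)."""
    current = initial                            # 1. start with a random solution, hot T
    current_energy = objective(current)
    while temp > min_temp:                       # 5. repeat until the system is "cool"
        candidate = neighbor_fn(current)         # 2. consider a random neighbor
        candidate_energy = objective(candidate)
        delta = candidate_energy - current_energy
        # 3. always accept improvements; accept worse moves with prob exp(-delta/T)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current, current_energy = candidate, candidate_energy
        temp *= cooling_rate                     # 4. cool down gradually
    return current

# Example: minimize f(x) = x^2 starting from a random point
best = simulated_annealing(lambda x: x * x,
                           random.uniform(-10, 10),
                           lambda x: x + random.uniform(-1, 1))
```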

Analogy: Annealing Metal

Heating metal allows its atoms to move freely out of stressed positions (local optima). As it cools slowly, the atoms settle into a strong, stable structure (the global optimum).

Pros

  • Good at escaping local optima.
  • More likely than greedy hill-climbing to find the global optimum, given a slow enough cooling schedule.

Cons

  • Can be slow to converge, since many candidate moves must be evaluated.
  • Requires careful tuning of parameters.
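The parameter that usually needs the most tuning is the cooling schedule. A small sketch comparing two common choices, geometric decay and linear decay (the rates here are arbitrary illustrative values), shows how differently they drive the temperature toward zero:

```python
# Compare two common cooling schedules (illustrative parameters)
t_geometric = 10.0   # T <- alpha * T each step
t_linear = 10.0      # T <- T0 - c * step
for step in range(1, 101):
    t_geometric *= 0.95
    t_linear = max(10.0 - 0.1 * step, 0.0)

# Geometric decay never quite reaches zero; linear decay hits zero and stops.
print(f"after 100 steps: geometric={t_geometric:.4f}, linear={t_linear:.4f}")
```

Geometric schedules keep a small but nonzero chance of uphill moves for longer, while a linear schedule shuts exploration off at a fixed step.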

Simulation

A particle (orange dot) explores a landscape to find the lowest point (global minimum, green marker). Watch as it uses high "temperature" to "jump" out of shallow valleys (local minima). As it cools, it settles into the best solution. Click "Run Simulation" to start over.

Deconstructing the Simulation:

1. The Landscape & Particle: The landscape represents all possible solutions, where height is the "cost" (we want to find the lowest point). The orange particle is the current solution being tested.

2. Energy (Cost): The "energy" of the particle is its Y-position on the graph. A lower energy (a deeper valley) is a better solution.

3. Temperature: This is a crucial parameter that starts high and slowly decreases. When the temperature is high, the particle has more "energy" and is more likely to accept a "bad" move (jumping uphill to a worse solution) to escape a local minimum.

4. Acceptance Probability: The decision to move to a worse solution uses the formula `P = exp(-ΔE / T)`, where ΔE is the increase in energy (cost). If a uniform random number is less than P, the bad move is accepted. A high temperature T makes this probability larger, and as T approaches zero, the probability of accepting a bad move also approaches zero.
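The effect of temperature on the acceptance probability can be checked directly. Here ΔE is fixed at 1.0 (an arbitrary "worse by one" move, chosen only for illustration) while T is swept downward:

```python
import math

delta_e = 1.0  # a fixed cost increase for a "bad" move (illustrative)
for temp in [10.0, 1.0, 0.1, 0.01]:
    p = math.exp(-delta_e / temp)  # probability of accepting the worse move
    print(f"T={temp:5.2f}  P(accept worse)={p:.6f}")
```

At T=10 the bad move is accepted about 90% of the time; by T=0.01 the probability is effectively zero, so the search becomes purely downhill.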

This process allows the algorithm to explore widely at the start and then focus on the best area as it "cools down," preventing it from getting stuck too early.

Python Code Example

```python
# --- Conceptual Example ---
# Goal: Find the minimum of a complex function
import random
import math

# A function with multiple local minima
def objective_function(x):
    return (x**2) * math.sin(5 * math.pi * x)**6.0

current_solution = random.uniform(0, 1)
temp = 1.0
cooling_rate = 0.99
min_temp = 0.00001

while temp > min_temp:
    current_energy = objective_function(current_solution)

    # Get a random neighbor
    neighbor = current_solution + random.uniform(-0.1, 0.1)
    neighbor = max(0.0, min(1.0, neighbor))  # Clamp to [0,1]
    neighbor_energy = objective_function(neighbor)

    # Decide to move to neighbor
    delta_energy = neighbor_energy - current_energy
    if delta_energy < 0 or \
       random.random() < math.exp(-delta_energy / temp):
        current_solution = neighbor

    temp *= cooling_rate  # Cool down

print(f"Found minimum near x = {current_solution:.4f}")
```