Master Simulated Annealing, Genetic Algorithm, and PSO with Python
Explore three powerful optimization techniques: Simulated Annealing, Genetic Algorithm, and Particle Swarm Optimization. Clear explanations, step-by-step procedures, and complete Python implementations demonstrate how to find the global minimum of a complex function in practice and compare the algorithms' behavior.
Simulated Annealing Algorithm
Simulated Annealing (SA) is a global optimization method that probabilistically accepts worse solutions to escape local minima and search for the global optimum.
Steps
The SA procedure consists of:
Initialize parameters: choose an initial temperature, set cooling rate, and define termination criteria.
Generate an initial solution randomly.
Select a new candidate solution by perturbing the current one.
Compute acceptance probability using the energy difference ΔE and current temperature T.
Update the solution according to the acceptance probability.
Cool down the temperature according to the cooling rate and check termination.
Repeat steps 3–6 until the stopping condition is met.
During each cooling cycle the probability of accepting worse solutions decreases, allowing convergence toward the global optimum. Performance depends heavily on the choice of initial temperature, cooling rate, and termination condition.
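The Metropolis acceptance rule at the heart of step 4 can be sketched as a small standalone function (the names here are illustrative, not part of the implementation below). At high temperature a worse move is accepted almost always; at low temperature almost never:

```python
import math
import random

def accept(delta_e, T):
    """Metropolis criterion: accept improvements outright; accept a worse
    move (delta_e > 0) with probability exp(-delta_e / T)."""
    if delta_e < 0:
        return True
    return random.random() < math.exp(-delta_e / T)

# The same worsening of 1.0 at two temperatures:
p_hot = math.exp(-1.0 / 100.0)   # close to 1: almost always accepted
p_cold = math.exp(-1.0 / 0.1)    # close to 0: almost never accepted
```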
Python Implementation
Below is a Python example that finds the global minimum of a one‑dimensional function.
Define the objective function:
<code>import math
import random
def func(x):
    return x ** 2 + math.sin(5 * x)
</code>Plot the function (optional):
<code>import matplotlib.pyplot as plt
import numpy as np
xarray = np.linspace(-5, 5, 10000)
plt.plot(xarray, [func(x) for x in xarray])
</code>Simulated annealing implementation:
<code>def simulated_annealing(func, x0, T0, r, iter_max, tol):
    '''
    func: objective function
    x0: initial solution
    T0: initial temperature
    r: cooling rate
    iter_max: maximum iterations
    tol: temperature lower bound
    '''
    x_current = x0
    f_current = func(x0)
    x_best, f_best = x_current, f_current
    T = T0
    n_iter = 0
    while T > tol and n_iter < iter_max:
        # Perturb the current solution; the step size shrinks with T
        x_new = x_current + random.uniform(-1, 1) * T
        f_new = func(x_new)
        delta_f = f_new - f_current
        # Metropolis criterion: always accept improvements,
        # accept worse moves with probability exp(-delta_f / T)
        if delta_f < 0 or random.uniform(0, 1) < math.exp(-delta_f / T):
            x_current, f_current = x_new, f_new
            # Track the best solution seen separately, so accepting a
            # worse move cannot overwrite it
            if f_current < f_best:
                x_best, f_best = x_current, f_current
        T *= r
        n_iter += 1
    return x_best, f_best
</code>Run the algorithm:
<code>x0 = 2
T0 = 100
r = 0.95
iter_max = 10000
tol = 0.0001
x_best, f_best = simulated_annealing(func, x0, T0, r, iter_max, tol)
print("x_best = {:.4f}, f_best = {:.4f}".format(x_best, f_best))
</code>Result:
x_best = -0.2906, f_best = -0.9086
The outcome is reasonably close to the true global minimum.
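With these parameters the temperature bound, not iter_max, ends the run: geometric cooling T0 * r**n drops below tol after roughly log(tol/T0)/log(r) steps, which is easy to verify:

```python
import math

T0, r, tol = 100, 0.95, 0.0001
# Smallest n with T0 * r**n < tol
n = math.ceil(math.log(tol / T0) / math.log(r))
print(n)  # 270 cooling steps, well below iter_max = 10000
```

So the loop performs about 270 iterations here; iter_max only matters with slower cooling rates.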
Genetic Algorithm
Genetic Algorithm (GA) mimics biological evolution to iteratively improve solutions through selection, crossover, and mutation. It offers strong global search capability and robustness but may converge slowly or become trapped in local optima.
Steps
Initialize a population of random solutions.
Evaluate fitness of each individual.
Select parents based on fitness.
Apply crossover and mutation to generate offspring.
Replace the old population with the new one and repeat until a stopping condition is satisfied.
Example: minimize func(x) = x² + sin(5x) on [-5, 5] using a 16-bit binary encoding. Since a GA maximizes fitness, we instead maximize fitness(x) = 30 - func(x).
Python Implementation
<code>import math
import random

def func(x):
    return x**2 + math.sin(5*x)

def fitness(x):
    # GA maximizes fitness, so invert the objective;
    # the constant 30 keeps fitness positive on [-5, 5]
    return 30 - (x**2 + math.sin(5*x))

POPULATION_SIZE = 50
GENE_LENGTH = 16

def generate_population(population_size, gene_length):
    population = []
    for i in range(population_size):
        individual = [random.randint(0, 1) for _ in range(gene_length)]
        population.append(individual)
    return population

def crossover(parent1, parent2):
    crossover_point = random.randint(0, GENE_LENGTH - 1)
    child1 = parent1[:crossover_point] + parent2[crossover_point:]
    child2 = parent2[:crossover_point] + parent1[crossover_point:]
    return child1, child2

def mutation(individual, mutation_probability):
    for i in range(GENE_LENGTH):
        if random.random() < mutation_probability:
            individual[i] = 1 - individual[i]  # flip the bit
    return individual

def select_parents(population):
    # Assumes the population retains at least two distinct individuals
    total_fitness = sum(fitness(decode(ind)) for ind in population)
    parent1 = None
    parent2 = None
    while parent1 == parent2:
        parent1 = select_individual(population, total_fitness)
        parent2 = select_individual(population, total_fitness)
    return parent1, parent2

def select_individual(population, total_fitness):
    # Roulette-wheel selection: probability proportional to fitness
    r = random.uniform(0, total_fitness)
    fitness_sum = 0
    for individual in population:
        fitness_sum += fitness(decode(individual))
        if fitness_sum > r:
            return individual
    return population[-1]

def decode(individual):
    # Read the bit list as an integer (LSB first), then map onto [-5, 5]
    x = sum(gene * 2**i for i, gene in enumerate(individual))
    return -5 + 10 * x / (2**GENE_LENGTH - 1)

GENERATIONS = 100
CROSSOVER_PROBABILITY = 0.8
MUTATION_PROBABILITY = 0.05

def genetic_algorithm():
    population = generate_population(POPULATION_SIZE, GENE_LENGTH)
    for i in range(GENERATIONS):
        new_population = []
        for j in range(POPULATION_SIZE // 2):
            parent1, parent2 = select_parents(population)
            if random.random() < CROSSOVER_PROBABILITY:
                child1, child2 = crossover(parent1, parent2)
            else:
                # Copy so later in-place mutation cannot alter the parents
                child1, child2 = parent1[:], parent2[:]
            child1 = mutation(child1, MUTATION_PROBABILITY)
            child2 = mutation(child2, MUTATION_PROBABILITY)
            new_population.append(child1)
            new_population.append(child2)
        population = new_population
    best_individual = max(population, key=lambda ind: fitness(decode(ind)))
    best_fitness = fitness(decode(best_individual))
    best_x = decode(best_individual)
    best_func = func(best_x)
    return best_x, best_fitness, best_func

best_x, best_fitness, best_func = genetic_algorithm()
print("x =", best_x)
print("Maximum fitness =", best_fitness)
print("Function value =", best_func)
</code>The fitness function is defined inversely because the algorithm maximizes fitness while we seek a minimum of the original function.
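As a quick sanity check on the encoding, decode maps the all-zeros chromosome to the lower bound -5 and the all-ones chromosome to the upper bound 5. A minimal standalone sketch of the same decode logic:

```python
GENE_LENGTH = 16

def decode(individual):
    # Bits are read LSB-first and mapped linearly onto [-5, 5]
    x = sum(gene * 2**i for i, gene in enumerate(individual))
    return -5 + 10 * x / (2**GENE_LENGTH - 1)

print(decode([0] * GENE_LENGTH))  # -5.0
print(decode([1] * GENE_LENGTH))  # 5.0
```

The resolution of this encoding is 10 / (2**16 - 1), about 1.5e-4, which bounds how precisely the GA can place x.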
Particle Swarm Optimization (PSO)
Particle Swarm Optimization is an evolutionary computation technique inspired by the collective behavior of bird flocks. Particles explore the search space, adjusting their velocities and positions based on personal and global best experiences to converge toward optimal solutions.
Steps
Initialize random positions and velocities for all particles.
Evaluate each particle's fitness.
Update each particle's personal best if the current fitness is better.
Update the global best based on all personal bests.
Update velocities and positions using the PSO update formulas.
Repeat until a stopping condition (e.g., sufficient fitness or maximum iterations) is reached.
Return the global best position.
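The updates in step 5 follow the standard PSO formulas: v' = w*v + c1*r1*(p_best - x) + c2*r2*(g_best - x) and x' = x + v'. A single-particle sketch (the function name and variables here are illustrative, not part of the class below):

```python
import random

def pso_step(x, v, p_best, g_best, w=0.7, c1=1.4, c2=1.4):
    """One PSO update for a scalar particle: inertia term plus a pull
    toward the personal best and a pull toward the global best."""
    r1, r2 = random.random(), random.random()
    v_new = w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)
    return x + v_new, v_new

# With zero velocity and p_best == g_best, the particle is pulled
# toward that point (never pushed away from it)
x_new, v_new = pso_step(x=2.0, v=0.0, p_best=-0.29, g_best=-0.29)
```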
Python Implementation
<code>import numpy as np

def evaluate_fitness(x):
    return x ** 2 + np.sin(5 * x)

class PSO:
    def __init__(self, n_particles, n_iterations, w, c1, c2, bounds):
        self.n_particles = n_particles
        self.n_iterations = n_iterations
        self.w = w    # inertia weight
        self.c1 = c1  # cognitive (personal-best) coefficient
        self.c2 = c2  # social (global-best) coefficient
        self.bounds = bounds
        self.particles_x = np.random.uniform(bounds[0], bounds[1], size=(n_particles,))
        self.particles_v = np.zeros_like(self.particles_x)
        self.particles_fitness = evaluate_fitness(self.particles_x)
        self.particles_best_x = self.particles_x.copy()
        self.particles_best_fitness = self.particles_fitness.copy()
        self.global_best_x = self.particles_x[self.particles_fitness.argmin()]

    def update_particle_velocity(self):
        r1 = np.random.uniform(size=self.n_particles)
        r2 = np.random.uniform(size=self.n_particles)
        self.particles_v = (self.w * self.particles_v +
                            self.c1 * r1 * (self.particles_best_x - self.particles_x) +
                            self.c2 * r2 * (self.global_best_x - self.particles_x))
        self.particles_v = np.clip(self.particles_v, -1, 1)  # cap the step size

    def update_particle_position(self):
        self.particles_x = self.particles_x + self.particles_v
        self.particles_x = np.clip(self.particles_x, self.bounds[0], self.bounds[1])
        self.particles_fitness = evaluate_fitness(self.particles_x)
        # Update personal bests where the new position improved
        better_mask = self.particles_fitness < self.particles_best_fitness
        self.particles_best_x[better_mask] = self.particles_x[better_mask]
        self.particles_best_fitness[better_mask] = self.particles_fitness[better_mask]
        # Update the global best
        best_particle = self.particles_fitness.argmin()
        if self.particles_fitness[best_particle] < evaluate_fitness(self.global_best_x):
            self.global_best_x = self.particles_x[best_particle]

    def run(self):
        for i in range(self.n_iterations):
            self.update_particle_velocity()
            self.update_particle_position()
        return self.global_best_x

pso = PSO(n_particles=20, n_iterations=50, w=0.7, c1=1.4, c2=1.4, bounds=(-5, 5))
global_best_x = pso.run()
</code>The optimal solution found is approximately -0.2908.
These three optimization algorithms provide versatile tools for tackling complex, non‑convex problems.
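As a cross-check on all three results, a brute-force grid search over [-5, 5] (an assumption-free baseline, feasible here only because the problem is one-dimensional) also locates the global minimum near x ≈ -0.29:

```python
import numpy as np

# Dense grid over the search interval, step 5e-5
xs = np.linspace(-5, 5, 200001)
ys = xs ** 2 + np.sin(5 * xs)
i = ys.argmin()
print(f"x* = {xs[i]:.4f}, f(x*) = {ys[i]:.4f}")  # x* ≈ -0.2908, f(x*) ≈ -0.9086
```

All three heuristics agree with this reference to within their respective precisions.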
Model Perspective
Insights, knowledge, and enjoyment from a mathematical modeling researcher and educator. Hosted by Haihua Wang, a modeling instructor and author of "Clever Use of Chat for Mathematical Modeling", "Modeling: The Mathematics of Thinking", "Mathematical Modeling Practice: A Hands‑On Guide to Competitions", and co‑author of "Mathematical Modeling: Teaching Design and Cases".