Understanding the Butterfly Optimization Algorithm: Theory and Python Code
This article explains the Butterfly Optimization Algorithm (BOA): its biological inspiration, the scent-based global and local search at its core, the key parameters, the iteration process, and a complete Python implementation with example usage and a plot of the convergence curve.
Basic Principles of the Butterfly Optimization Algorithm
The Butterfly Optimization Algorithm (BOA) is an intelligent optimization method, proposed by Arora and Singh in 2019, that mimics the foraging behavior of butterflies to solve optimization problems; it offers fast convergence and strong search capability.
In BOA each butterfly represents a search agent in the solution space; it emits a scent whose intensity is related to its fitness. When a butterfly senses a stronger scent from another, it moves toward that butterfly (global search). If no stronger scent is detected, it performs random exploration around its current position (local search).
Butterfly Scent
Each butterfly releases a scent whose intensity decays with distance. The scent value is calculated using three important parameters: sensory modality, stimulus factor, and power exponent.
Sensory modality (c) denotes the way scent is perceived, analogous to the butterfly's olfactory sense; it is a constant set during initialization.
Stimulus factor (I) is derived from the current fitness value; butterflies with higher fitness emit stronger scents, attracting others during global search.
Power exponent (a) is a constant that determines the response type: linear, compressed, or expanded, affecting how scent intensity changes with distance.
The scent formula combines these parameters as f = c·I^a, where f is the perceived scent magnitude. The ranges of c and a influence the algorithm's performance; extreme values can lead to premature convergence or excessive exploration.
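As a minimal sketch of the formula above (the function name `fragrance` and the sample values are illustrative, not part of the original text):

```python
def fragrance(c, I, a):
    """Perceived scent magnitude f = c * I**a.

    c: sensory modality, I: stimulus intensity (derived from fitness),
    a: power exponent; a < 1 compresses differences in stimulus."""
    return c * I ** a

# With c = 0.01 and a = 0.1, a 100x larger stimulus still yields a
# stronger scent, but the sub-linear exponent compresses the gap.
weak = fragrance(0.01, 1.0, 0.1)
strong = fragrance(0.01, 100.0, 0.1)
print(weak, strong)
```

Because a is typically well below 1, even large fitness differences translate into modest scent differences, which tempers how aggressively butterflies chase the current best.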
Butterfly Movement and Iterations
The algorithm assumes that all butterflies emit scent, each either performs random local search or moves toward the strongest scent, and the stimulus factor depends solely on the objective function.
BOA operates in three phases: initialization, iteration, and termination. During initialization the objective function, search space, and constants (c, a, switching probability p) are defined, and a population of butterflies is randomly distributed.
In each iteration, butterflies update their positions, recompute fitness and scent, and execute either global search (moving toward the best‑scented butterfly) or local search (random movement within a neighborhood). A switching probability p determines which search mode is used.
The algorithm also includes boundary checking to keep positions within the defined limits.
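Boundary checking amounts to clamping each coordinate back into the box [lb, ub]; a minimal NumPy sketch (the function and array names here are illustrative):

```python
import numpy as np

def clamp_to_bounds(X, lb, ub):
    """Clamp every butterfly position to the box [lb, ub], per dimension."""
    return np.clip(X, lb, ub)

lb = np.array([-10.0, -10.0])
ub = np.array([10.0, 10.0])
X = np.array([[12.0, -3.0],    # first coordinate above the upper bound
              [-15.0, 4.0]])   # first coordinate below the lower bound
print(clamp_to_bounds(X, lb, ub))
```

`np.clip` broadcasts the bounds across the population, so one call handles all butterflies at once.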
Algorithm steps:
Set basic parameters.
Randomly initialize butterfly positions.
Compute stimulus factor using the fitness function.
Calculate scent values and record the best global butterfly.
Generate a random number and compare it with the switching probability p; if the random number is less than p, perform global search, otherwise perform local search.
Apply boundary checks, update the global best, and repeat from step 3 until the maximum number of iterations is reached.
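The two search modes in the steps above can be sketched as the update rules commonly given for BOA (variable names are illustrative; this sketch uses r² from the usual formulation, whereas the full implementation below multiplies two independent random draws):

```python
import numpy as np

rng = np.random.default_rng(0)

def global_move(x, g_best, f):
    """Global search: step toward the best-scented butterfly, scaled by scent f."""
    r = rng.random()
    return x + (r ** 2 * g_best - x) * f

def local_move(x, x_j, x_k, f):
    """Local search: a random walk driven by two randomly chosen butterflies."""
    r = rng.random()
    return x + (r ** 2 * x_j - x_k) * f
```

In both rules the scent f acts as a step-size: fitter butterflies (stronger scent) take larger steps.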
Code
<code>import numpy as np
import random
import copy
from matplotlib import pyplot as plt


def initialization(pop, ub, lb, dim):
    '''Randomly initialize the population within [lb, ub]'''
    X = np.zeros([pop, dim])
    for i in range(pop):
        for j in range(dim):
            X[i, j] = (ub[j] - lb[j]) * np.random.random() + lb[j]
    return X


def BorderCheck(X, ub, lb, pop, dim):
    '''Clamp positions that left the search space back to the bounds'''
    for i in range(pop):
        for j in range(dim):
            if X[i, j] > ub[j]:
                X[i, j] = ub[j]
            elif X[i, j] < lb[j]:
                X[i, j] = lb[j]
    return X


def CaculateFitness(X, fun):
    '''Compute the fitness of every individual in the population'''
    pop = X.shape[0]
    fitness = np.zeros([pop, 1])
    for i in range(pop):
        fitness[i] = fun(X[i, :])
    return fitness


def SortFitness(Fit):
    '''Sort fitness values and return the sorting index'''
    fitness = np.sort(Fit, axis=0)
    index = np.argsort(Fit, axis=0)
    return fitness, index


def SortPosition(X, index):
    '''Reorder positions according to the fitness ranking'''
    Xnew = np.zeros(X.shape)
    for i in range(X.shape[0]):
        Xnew[i, :] = X[index[i], :]
    return Xnew


def BOA(pop, dim, lb, ub, MaxIter, fun):
    '''Butterfly Optimization Algorithm'''
    p = 0.8                  # switching probability
    power_exponent = 0.1     # power exponent a
    sensory_modality = 0.1   # sensory modality c
    X = initialization(pop, ub, lb, dim)    # initialize the population
    fitness = CaculateFitness(X, fun)       # evaluate fitness
    indexBest = np.argmin(fitness)          # index of the best butterfly
    GbestScore = fitness[indexBest]         # best fitness so far
    GbestPositon = np.zeros([1, dim])
    GbestPositon[0, :] = X[indexBest, :]
    X_new = copy.copy(X)
    Curve = np.zeros([MaxIter, 1])
    for t in range(MaxIter):
        print("Iteration " + str(t))
        for i in range(pop):
            # scent f = c * I**a, with the stimulus I taken from the fitness
            FP = sensory_modality * (fitness[i, 0] ** power_exponent)
            if random.random() < p:   # global search: move toward the best butterfly
                dis = random.random() * random.random() * GbestPositon[0, :] - X[i, :]
                X_new[i, :] = X[i, :] + dis * FP
            else:                     # local search: random walk via two random butterflies
                JK = random.sample(range(pop), pop)
                dis = random.random() * random.random() * X[JK[0], :] - X[JK[1], :]
                X_new[i, :] = X[i, :] + dis * FP
            for j in range(dim):      # keep the candidate inside the bounds
                if X_new[i, j] > ub[j]:
                    X_new[i, j] = ub[j]
                if X_new[i, j] < lb[j]:
                    X_new[i, j] = lb[j]
            newFitness = fun(X_new[i, :])
            if newFitness < fitness[i, 0]:   # greedy update: accept only improvements
                X[i, :] = copy.copy(X_new[i, :])
                fitness[i] = newFitness
        X = BorderCheck(X, ub, lb, pop, dim)   # boundary check
        fitness = CaculateFitness(X, fun)      # re-evaluate fitness
        indexBest = np.argmin(fitness)
        if fitness[indexBest] <= GbestScore:   # update the global best
            GbestScore = copy.copy(fitness[indexBest])
            GbestPositon[0, :] = copy.copy(X[indexBest, :])
        Curve[t] = GbestScore
    return GbestScore, GbestPositon, Curve


'''Fitness function: the 2-D sphere function x1**2 + x2**2'''
def fun(X):
    O = X[0] ** 2 + X[1] ** 2
    return O


# Parameters
pop = 50                  # population size
MaxIter = 100             # maximum number of iterations
dim = 2                   # dimensionality
lb = -10 * np.ones(dim)   # lower bounds
ub = 10 * np.ones(dim)    # upper bounds
fobj = fun
GbestScore, GbestPositon, Curve = BOA(pop, dim, lb, ub, MaxIter, fobj)
print('Best fitness:', GbestScore)
print('Best solution [x1,x2]:', GbestPositon)

# Plot the convergence curve
plt.figure(1)
plt.plot(Curve, 'r-', linewidth=2)
plt.xlabel('Iteration', fontsize='medium')
plt.ylabel('Fitness', fontsize='medium')
plt.grid()
plt.title('BOA', fontsize='large')
plt.show()
</code>

Reference: Fan Xu, "Python Intelligent Optimization Algorithms: From Principles to Code Implementation and Applications".
Model Perspective
Insights, knowledge, and enjoyment from a mathematical modeling researcher and educator. Hosted by Haihua Wang, a modeling instructor and author of "Clever Use of Chat for Mathematical Modeling", "Modeling: The Mathematics of Thinking", "Mathematical Modeling Practice: A Hands‑On Guide to Competitions", and co‑author of "Mathematical Modeling: Teaching Design and Cases".