
Unlocking the Seagull Optimization Algorithm: Principles, Migration & Attack Mechanics

This article introduces the Seagull Optimization Algorithm (SOA), explains how it mimics seagull migration and attack behaviors to solve optimization problems, details its migration and prey‑attack phases, outlines the step‑by‑step workflow, and provides a full Python implementation with code examples.

Model Perspective

Fundamental Principles of the Seagull Optimization Algorithm

The Seagull Optimization Algorithm (SOA), proposed by Gaurav Dhiman et al. in 2019, is a meta‑heuristic optimization method that simulates the migration and attack behaviors of seagulls to explore the search space and find optimal solutions.

Seagull Migration

During the migration phase, the algorithm moves each virtual seagull toward the best known position while ensuring that no two seagulls occupy the same location, thereby avoiding collisions. New positions are calculated using a set of control variables that gradually decrease over iterations, balancing global and local search.
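The migration update described above can be sketched numerically for a single seagull. The names `A`, `B`, `CS`, `MS`, and `DS` mirror the full implementation later in the article; the current iteration `t`, the position `Ps`, and the best position `Pbest` used here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

fc = 2.0                             # initial value of the control variable A
MaxIter = 100
t = 10                               # current iteration (hypothetical)
A = fc - t * (fc / MaxIter)          # decreases linearly from fc toward 0

Ps = rng.uniform(-10, 10, size=2)    # current seagull position (assumed)
Pbest = np.zeros(2)                  # best-known position (assumed)

CS = A * Ps                          # collision-avoided position
B = 2 * (A ** 2) * rng.random()      # balances global vs. local search
MS = B * (Pbest - Ps)                # movement toward the best seagull
DS = np.abs(CS + MS)                 # distance to the best seagull
print(DS)
```

Because `A` shrinks with each iteration, early iterations take large, exploratory steps while later ones contract around the best-known position.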

Seagull Attack on Prey

In the attack phase, seagulls adjust their attack angle and speed, performing a spiral motion in the air. The spiral trajectory is modeled mathematically using a radius, random angular values, and constants that shape the spiral, allowing the algorithm to intensify the search around promising regions.
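A minimal numeric sketch of the spiral update, using the same `u`, `v`, `theta`, and `r` definitions as the implementation below; the `DS` and `Pbest` values here are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

u, v = 1.0, 1.0                      # constants shaping the spiral
theta = rng.random()                 # random angle parameter
r = u * np.exp(theta * v)            # spiral radius grows with theta
x = r * np.cos(2 * np.pi * theta)    # spiral coordinates in 3D
y = r * np.sin(2 * np.pi * theta)
z = r * theta

DS = np.array([1.5, 0.5])            # distance from the migration phase (assumed)
Pbest = np.zeros(2)                  # best-known position (assumed)
new_pos = x * y * z * DS + Pbest     # intensified search around Pbest
print(new_pos)
```

The product `x * y * z` perturbs the distance vector along the spiral, so candidate positions cluster around `Pbest` while retaining randomness.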

Algorithm Workflow

The SOA workflow consists of the following steps:

1. Initialize algorithm parameters and population positions.

2. Compute fitness values and retain the global best position.

3. Perform the migration phase.

4. Execute the attack phase.

5. Check the stopping criterion; if it is not met, repeat from step 2.

Two illustrative diagrams show the seagull’s movement trajectory and the overall algorithm flowchart.

Python Code Implementation

<code>import numpy as np
import copy
import matplotlib.pyplot as plt

def initialization(pop, ub, lb, dim):
    '''Initialize the population uniformly at random within [lb, ub]'''
    X = np.zeros([pop, dim])
    for i in range(pop):
        for j in range(dim):
            X[i, j] = (ub[j] - lb[j]) * np.random.random() + lb[j]
    return X

def BorderCheck(X, ub, lb, pop, dim):
    '''Clip out-of-range positions back to the search-space boundaries'''
    for i in range(pop):
        for j in range(dim):
            if X[i, j] > ub[j]:
                X[i, j] = ub[j]
            elif X[i, j] < lb[j]:
                X[i, j] = lb[j]
    return X

def CalculateFitness(X, fun):
    '''Evaluate the objective function for every individual'''
    pop = X.shape[0]
    fitness = np.zeros([pop, 1])
    for i in range(pop):
        fitness[i] = fun(X[i, :])
    return fitness

def SortFitness(Fit):
    '''Sort fitness values ascending; return sorted values and their indices'''
    fitness = np.sort(Fit, axis=0)
    index = np.argsort(Fit, axis=0).flatten()
    return fitness, index

def SortPosition(X, index):
    '''Reorder the population to match the sorted fitness values'''
    Xnew = np.zeros(X.shape)
    for i in range(X.shape[0]):
        Xnew[i, :] = X[index[i], :]
    return Xnew

def SOA(pop, dim, lb, ub, MaxIter, fun):
    '''Seagull Optimization Algorithm (minimization)'''
    fc = 2  # initial value of the control variable A
    X = initialization(pop, ub, lb, dim)
    fitness = CalculateFitness(X, fun)
    fitness, sortIndex = SortFitness(fitness)
    X = SortPosition(X, sortIndex)
    GbestScore = copy.copy(fitness[0])
    GbestPosition = np.zeros([1, dim])
    GbestPosition[0, :] = copy.copy(X[0, :])
    Curve = np.zeros([MaxIter, 1])
    MS = np.zeros([pop, dim])  # movement toward the best seagull
    CS = np.zeros([pop, dim])  # collision-avoided positions
    DS = np.zeros([pop, dim])  # distance to the best seagull
    X_new = copy.copy(X)
    for i in range(MaxIter):
        Pbest = X[0, :]  # best position in the current (sorted) population
        for j in range(pop):
            # Migration phase
            A = fc - (i * (fc / MaxIter))      # decreases linearly from fc to 0
            CS[j, :] = X[j, :] * A             # avoid collisions between seagulls
            rd = np.random.random()
            B = 2 * (A ** 2) * rd              # balances exploration and exploitation
            MS[j, :] = B * (Pbest - X[j, :])   # move toward the best seagull
            DS[j, :] = np.abs(CS[j, :] + MS[j, :])
            # Attack phase: spiral motion around the best position
            u = 1
            v = 1
            theta = np.random.random()
            r = u * np.exp(theta * v)          # spiral radius
            x = r * np.cos(theta * 2 * np.pi)
            y = r * np.sin(theta * 2 * np.pi)
            z = r * theta
            X_new[j, :] = x * y * z * DS[j, :] + Pbest
        X = BorderCheck(X_new, ub, lb, pop, dim)
        fitness = CalculateFitness(X, fun)
        fitness, sortIndex = SortFitness(fitness)
        X = SortPosition(X, sortIndex)
        if fitness[0] <= GbestScore:  # keep the best solution found so far
            GbestScore = copy.copy(fitness[0])
            GbestPosition[0, :] = copy.copy(X[0, :])
        Curve[i] = GbestScore
    return GbestScore, GbestPosition, Curve

def fun(X):
    '''Sphere benchmark: f(x) = x1^2 + x2^2, minimum 0 at the origin'''
    return X[0]**2 + X[1]**2

# Parameters
pop = 50
MaxIter = 100
dim = 2
lb = -10 * np.ones(dim)
ub = 10 * np.ones(dim)
# Run SOA
GbestScore, GbestPosition, Curve = SOA(pop, dim, lb, ub, MaxIter, fun)
print('Best fitness:', GbestScore)
print('Best solution [x1, x2]:', GbestPosition)
# Plot convergence curve
plt.figure(1)
plt.plot(Curve, 'r-', linewidth=2)
plt.xlabel('Iteration')
plt.ylabel('Fitness')
plt.grid()
plt.title('SOA')
plt.show()
</code>
Tags: Optimization, artificial intelligence, Python, swarm intelligence, metaheuristic, Seagull Optimization Algorithm
Written by Model Perspective

Insights, knowledge, and enjoyment from a mathematical modeling researcher and educator. Hosted by Haihua Wang, a modeling instructor and author of "Clever Use of Chat for Mathematical Modeling", "Modeling: The Mathematics of Thinking", "Mathematical Modeling Practice: A Hands‑On Guide to Competitions", and co‑author of "Mathematical Modeling: Teaching Design and Cases".
