Mastering SciPy Optimize: From Root Finding to Global Optimization
This guide introduces SciPy's optimize module, covering scalar and multivariate minimization, global optimization algorithms, root finding, linear programming, and assignment problems, complete with clear Python code examples and explanations of each method's usage and output.
Optimization and Root Finding (scipy.optimize)
Common functionalities include:
Minimization (or maximization) of objective functions, with support for constrained and unconstrained problems and both local and global algorithms.
Linear programming algorithms.
Constrained and nonlinear least‑squares.
Root finding.
Curve fitting.
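Curve fitting, listed above, is handled by scipy.optimize.curve_fit. A minimal sketch fitting an exponential decay to synthetic noisy data (the model and parameter values here are illustrative, not from the original article):

```python
import numpy as np
from scipy.optimize import curve_fit

def model(x, a, b):
    # Exponential decay: a * exp(-b * x)
    return a * np.exp(-b * x)

# Generate synthetic data from known parameters (a=2.5, b=1.3) plus noise
rng = np.random.default_rng(0)
xdata = np.linspace(0, 4, 50)
ydata = model(xdata, 2.5, 1.3) + 0.05 * rng.normal(size=xdata.size)

popt, pcov = curve_fit(model, xdata, ydata)
print(popt)  # estimates close to the true parameters (2.5, 1.3)
```

curve_fit returns the best-fit parameters and their covariance matrix, so the diagonal of pcov gives a quick uncertainty estimate for each parameter.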
Scalar Function Optimization
<code>from scipy.optimize import minimize_scalar

def f(x):
    return (x - 2) * x * (x + 2)**2

res = minimize_scalar(f)
print(res.x)  # unconstrained local minimum near x = 1.28

res = minimize_scalar(f, bounds=(-3, -1), method='bounded')
print(res.x)  # minimum within the bounds, at x = -2.0
</code>Local Multivariate Optimization
<code>from scipy.optimize import minimize

def f2(x):
    return (x[0] - 2) * x[1] * (x[2] + 2)**2

x0 = [1.3, 0.7, 0.8]
res = minimize(f2, x0, method='Nelder-Mead', tol=1e-6)
print(res.x)
</code>Result: array([-1.90682839e+53, 5.63973935e+52, 3.40181690e+52]). These enormous values are not a solver bug: this objective is unbounded below (fix x[0] > 2 and x[2], then let x[1] → -∞), so Nelder-Mead chases ever-smaller function values until it stops, and the reported "minimum" is meaningless. Local minimization only makes sense when the objective is bounded below near the starting point.
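As a sanity check that Nelder-Mead works well on a well-posed problem, SciPy ships the Rosenbrock test function (scipy.optimize.rosen), whose minimum is 0 at x = (1, …, 1):

```python
from scipy.optimize import minimize, rosen

x0 = [1.3, 0.7, 0.8, 1.9, 1.2]
res = minimize(rosen, x0, method='Nelder-Mead', tol=1e-6)
print(res.x)  # all entries close to 1.0
```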
Constrained Optimization Example
<code>fun = lambda x: (x[0] - 1)**2 + (x[1] - 2.5)**2
cons = ({'type': 'ineq', 'fun': lambda x: x[0] - 2*x[1] + 2},
{'type': 'ineq', 'fun': lambda x: -x[0] - 2*x[1] + 6},
{'type': 'ineq', 'fun': lambda x: -x[0] + 2*x[1] + 2})
bnds = ((0, None), (0, None))
res = minimize(fun, (2, 0), method='SLSQP', bounds=bnds, constraints=cons)
print(res)
</code>Optimization terminated successfully with solution x = [1.4, 1.7].
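The same problem can also be posed with method='trust-constr' and the LinearConstraint class instead of constraint dictionaries; a sketch under the same bounds and inequalities:

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint

fun = lambda x: (x[0] - 1)**2 + (x[1] - 2.5)**2
# The three inequalities x0 - 2*x1 >= -2, -x0 - 2*x1 >= -6, -x0 + 2*x1 >= -2
# expressed in matrix form as lb <= A @ x
A = np.array([[1, -2], [-1, -2], [-1, 2]])
lin_con = LinearConstraint(A, [-2, -6, -2], np.inf)
res = minimize(fun, (2, 0), method='trust-constr',
               bounds=[(0, None), (0, None)], constraints=[lin_con])
print(res.x)  # approximately [1.4, 1.7]
```

For problems with many linear constraints, the matrix form is both less error-prone and faster to evaluate than a tuple of lambda dictionaries.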
Global Optimization Methods
basinhopping – basin‑hopping algorithm.
brute – exhaustive search within given bounds.
differential_evolution – differential evolution for global minima.
shgo – simplicial homology global optimization.
dual_annealing – dual annealing algorithm.
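Basin-Hopping Example
Of the methods above, basinhopping is not demonstrated later, so here is a minimal sketch on a one-dimensional function with several local minima (the test function and iteration count are illustrative choices):

```python
import numpy as np
from scipy.optimize import basinhopping

# A wavy function with many local minima; global minimum near x = -0.195
func = lambda x: np.cos(14.5 * x - 0.3) + (x + 0.2) * x
x0 = [1.0]
# Basin-hopping alternates random jumps with local BFGS minimizations
ret = basinhopping(func, x0, minimizer_kwargs={"method": "BFGS"}, niter=200)
print(ret.x, ret.fun)  # x close to -0.195, f close to -1.001
```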
Dual Annealing Example
<code>import numpy as np
from scipy.optimize import dual_annealing

# 10-dimensional Rastrigin function: global minimum 0 at x = 0
func = lambda x: np.sum(x*x - 10*np.cos(2*np.pi*x)) + 10*np.size(x)
lw = [-5.12] * 10
up = [5.12] * 10
ret = dual_annealing(func, bounds=list(zip(lw, up)), seed=1234)
print(ret.x)
</code>Result: an array of values close to zero.
Differential Evolution Example (Ackley Function)
<code>import numpy as np
from scipy.optimize import differential_evolution

def ackley(x):
    arg1 = -0.2 * np.sqrt(0.5 * (x[0]**2 + x[1]**2))
    arg2 = 0.5 * (np.cos(2.*np.pi*x[0]) + np.cos(2.*np.pi*x[1]))
    return -20.*np.exp(arg1) - np.exp(arg2) + 20. + np.e

bounds = [(-5, 5), (-5, 5)]
result = differential_evolution(ackley, bounds)
print(result.x, result.fun)
</code>Result: (array([0., 0.]), 4.44e-16).
Root Finding
<code>import numpy as np
from scipy import optimize

def fun(x):
    return [x[0] + 0.5*(x[0] - x[1])**3 - 1.0,
            0.5*(x[1] - x[0])**3 + x[1]]

def jac(x):
    return np.array([[1 + 1.5*(x[0] - x[1])**2, -1.5*(x[0] - x[1])**2],
                     [-1.5*(x[1] - x[0])**2, 1 + 1.5*(x[1] - x[0])**2]])

sol = optimize.root(fun, [0, 0], jac=jac, method='hybr')
print(sol.x)  # [0.8411639, 0.1588361]
</code>Linear Programming
<code>from scipy.optimize import linprog

c = [-1, 4]
A = [[-3, 1], [1, 2]]
b = [6, 4]
x0_bounds = (None, None)
x1_bounds = (-3, None)
res = linprog(c, A_ub=A, b_ub=b, bounds=[x0_bounds, x1_bounds])
print(res)
</code>Solution: x = [9.99999989, -2.99999999] with optimal objective -22.0.
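If the variables in the problem above had to be integers, scipy.optimize.milp (available in SciPy ≥ 1.9) solves the mixed-integer version; a sketch of the same problem with integrality constraints (for this particular instance the LP optimum happens to be integral already):

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

c = [-1, 4]
A = [[-3, 1], [1, 2]]
b = [6, 4]
constraints = LinearConstraint(A, -np.inf, b)  # A @ x <= b
bounds = Bounds(lb=[-np.inf, -3])              # x0 free, x1 >= -3
res = milp(c, constraints=constraints, bounds=bounds,
           integrality=[1, 1])                 # both variables integer
print(res.x, res.fun)  # [10., -3.] and -22.0
```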
Assignment Problem (Hungarian Algorithm)
<code>import numpy as np
from scipy.optimize import linear_sum_assignment

goodAt = np.array([[7, 3, 7, 4, 5, 5],
                   [7, 3, 7, 4, 5, 5],
                   [4, 9, 2, 6, 8, 3],
                   [4, 9, 2, 6, 8, 3],
                   [8, 3, 5, 7, 6, 4],
                   [8, 3, 5, 7, 6, 4]])
weakAt = 10 - goodAt  # turn the "skill" matrix into a cost matrix
row_ind, col_ind = linear_sum_assignment(weakAt)
print(row_ind)
print(col_ind)
</code>The printed indices give the optimal assignment of rows to columns that minimizes total cost.
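To see the cost of the matching, index the cost matrix with the returned indices; a small follow-up sketch (the matrix is repeated so the snippet runs on its own):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

goodAt = np.array([[7, 3, 7, 4, 5, 5],
                   [7, 3, 7, 4, 5, 5],
                   [4, 9, 2, 6, 8, 3],
                   [4, 9, 2, 6, 8, 3],
                   [8, 3, 5, 7, 6, 4],
                   [8, 3, 5, 7, 6, 4]])
weakAt = 10 - goodAt
row_ind, col_ind = linear_sum_assignment(weakAt)
# Fancy-indexing the cost matrix with the assignment gives the total cost
total_cost = weakAt[row_ind, col_ind].sum()
print(list(zip(row_ind, col_ind)), total_cost)
```

Because weakAt = 10 - goodAt, minimizing total weakness is the same as maximizing total skill across the assignment.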
Model Perspective
Insights, knowledge, and enjoyment from a mathematical modeling researcher and educator. Hosted by Haihua Wang, a modeling instructor and author of "Clever Use of Chat for Mathematical Modeling", "Modeling: The Mathematics of Thinking", "Mathematical Modeling Practice: A Hands‑On Guide to Competitions", and co‑author of "Mathematical Modeling: Teaching Design and Cases".