Some local minima are missing even if setting a very large `n` #28
At first glance it does appear that shgo is detecting the sub-domains of all (~121?) local minima, but that the local minimisation ends at the wrong point. For example, we can see this when we raise the tolerance in the local minimisation routine:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import shgo

def rastrigin(X):
    return 20. + sum([(x**2 - 10. * np.cos(2. * np.pi * x)) for x in X])

func = rastrigin
xmin, xmax = -5.12, 5.12
x = np.linspace(xmin, xmax, 100)
y = np.linspace(xmin, xmax, 100)
xx, yy = np.meshgrid(x, y)
zz = func((xx, yy))
bounds = [(xmin, xmax)] * 2

# Change the tolerance in the local minimization routine:
minimizer_kwargs = {'options': {'ftol': 1e5}}

ret = shgo(func, bounds, n=4000,  # (roughly needed for finding all minima)
           iters=1, sampling_method='sobol', minimizer_kwargs=minimizer_kwargs)
print('nfev: {:5d}'.format(ret.nfev))
print('nxl: {:5d}'.format(len(ret.xl)))  # 121

plt.figure()
plt.pcolormesh(x, y, zz)
for x1, y1 in ret.xl:
    plt.plot(x1, y1, 'ro')
plt.show()
```

and we can see solutions near all the local minima in the output plot.

A quick solution would be to use a more powerful local minimisation routine, for example a trust-region algorithm. Add the following (the `trust-krylov` method with Jacobian and Hessian matrices computed with sympy):
```python
def jac(x):
    x = np.array(x)
    return np.array([2 * x[0] + 20.0 * np.pi * np.sin(2.0 * np.pi * x[0]),
                     2 * x[1] + 20.0 * np.pi * np.sin(2.0 * np.pi * x[1])])

def hess(x):
    x = np.array(x)
    H = np.array([[40.0 * np.pi**2 * np.cos(2.0 * np.pi * x[0]) + 2, 0],
                  [0, 40.0 * np.pi**2 * np.cos(2.0 * np.pi * x[1]) + 2]])
    return H

minimizer_kwargs = {'method': 'trust-krylov',
                    'jac': jac,
                    'hess': hess}
```

After running the code we find the exact local minima.

However, this is likely a defect in the current shgo implementation, because the local minimisations should be constrained to sub-domains near the starting point (so that even if a local minimisation is not successful, it should not "escape" outside its sub-domain to another minimum). My gut feeling is that the local minimisation routine is "escaping" the sub-domain through the global bounds/edges of the domain somehow (since the defective points are all near the boundary), then returning with an infeasible step near another local minimum and finally terminating at the infeasible point. I think it would be more correct for shgo to use the feasible starting point if the local routine returns an infeasible point, and then to warn the user. I am not intimately familiar with the implementation details of SLSQP; currently shgo passes global bound arguments to SLSQP, but I will also investigate passing bounds as linear constraints instead.

(I apologise for the late reply; annoyingly, I have not been getting any of my feed notifications from GitHub lately.)
Some local minima are missing even if setting a very large `n`. Is this the method's limit, or improper settings of the parameters? Is it possible to find all local minima at some scale with this method? E.g. the Rastrigin function.
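As background on the question: on the standard domain [-5.12, 5.12]² the 2-D Rastrigin function has one local minimum near every integer pair (11 integers per axis, hence 121 basins in total, consistent with the ~121 count above). A quick sketch, using plain `scipy.optimize.minimize` rather than shgo, verifying that a local search started at an integer pair stays in that basin:

```python
import numpy as np
from scipy.optimize import minimize

def rastrigin(X):
    return 20. + sum((x**2 - 10. * np.cos(2. * np.pi * x)) for x in X)

# Integers in [-5.12, 5.12]: -5, ..., 5 -> 11 values per axis,
# hence 11 * 11 = 121 candidate basins in 2-D.
ints = np.arange(-5, 6)
print(len(ints) ** 2)  # 121

# A local minimisation started at an integer pair converges to the
# nearby local minimum (slightly shifted toward the origin):
res = minimize(rastrigin, [1.0, 1.0], method='L-BFGS-B')
print(res.x)
```

So the 121 points reported in `ret.xl` above are the expected count; the issue is only that some of them land at the wrong locations.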