This article looks at Scipy.optimize: how to restrict argument values, and should be a useful reference for anyone facing the same problem.

Problem description

I'm trying to use scipy.optimize functions to find the global minimum of a complicated function with several arguments. scipy.optimize.minimize seems to do the job best of all, namely with the 'Nelder-Mead' method. However, it tends to wander into regions outside the arguments' domain (assigning negative values to arguments that can only be positive) and then returns an error in such cases. Is there a way to restrict the arguments' bounds within the scipy.optimize.minimize function itself? Or maybe within other scipy.optimize functions?

I found the following suggestion:

Return a very large number (far from the data to be fitted) when the arguments fall outside the admissible range. This will (hopefully) penalize that choice of parameters so heavily that curve_fit settles on some other admissible set of parameters as optimal.

given in this previous answer, but the procedure will take a lot of computational time in my case.
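
Applied to minimize with the Nelder-Mead method, that penalty idea amounts to something like the sketch below; the objective is a made-up stand-in for the asker's function, and the penalty value of 1e10 is just an arbitrary "very large number".

import numpy as np
from scipy.optimize import minimize

# Stand-in objective that is only defined for positive arguments.
def objective(x):
    # Penalty trick: return a very large number whenever the solver
    # steps outside the admissible region, so Nelder-Mead moves away.
    if np.any(x <= 0):
        return 1e10
    a, b = x
    return (np.log(a) - 1.0) ** 2 + (np.log(b) + 1.0) ** 2

res = minimize(objective, x0=[1.0, 1.0], method='Nelder-Mead')
print(res.x)  # roughly [e, 1/e] for this toy function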

Recommended answer

The Nelder-Mead solver doesn't support constrained optimization, but there are several others that do.

TNC and L-BFGS-B both support only bound constraints (e.g. x[0] >= 0), which should be fine for your case. COBYLA and SLSQP are more flexible, supporting any combination of bounds, equality and inequality-based constraints.
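
As a minimal sketch of what that looks like (reusing the toy objective from above as a stand-in for the real function), bounds are passed as one (min, max) pair per variable, with None meaning unbounded on that side:

import numpy as np
from scipy.optimize import minimize

# Same toy objective as above; only defined for positive arguments.
def objective(x):
    a, b = x
    return (np.log(a) - 1.0) ** 2 + (np.log(b) + 1.0) ** 2

# One (lower, upper) pair per variable; None means no bound on that side.
# A tiny positive lower bound keeps log() well defined.
bnds = [(1e-8, None), (1e-8, None)]

res = minimize(objective, x0=[1.0, 1.0], method='L-BFGS-B', bounds=bnds)
print(res.x, res.fun)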

You can find more detailed info about the solvers by looking at the docs for the standalone functions, e.g. scipy.optimize.fmin_slsqp for method='SLSQP'.

You can see my previous answer here for an example of constrained optimization using SLSQP.
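
The linked answer itself is not reproduced here, but a generic sketch of a constrained SLSQP call, with hypothetical bounds and an inequality constraint, looks roughly like this:

from scipy.optimize import minimize

# Hypothetical objective, not from the question.
def objective(x):
    a, b = x
    return (a - 1.0) ** 2 + (b - 2.5) ** 2

# SLSQP expects inequality constraints in the form fun(x) >= 0.
cons = ({'type': 'ineq', 'fun': lambda x: 2.0 - x[0] - x[1]},)  # a + b <= 2

bnds = [(0, None), (0, None)]  # both arguments must stay non-negative

res = minimize(objective, x0=[0.5, 0.5], method='SLSQP',
               bounds=bnds, constraints=cons)
print(res.x)  # the constraint is active here, so the result lies on a + b = 2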

This concludes the article on Scipy.optimize: how to restrict argument values; hopefully the recommended answer above is helpful.
