This article looks at how to track the objective function during a scipy.optimize.minimize run; hopefully it is a useful reference for anyone facing the same problem.

Problem description

I'm working with scipy.optimize.minimize, and I'm optimizing 3 parameters with a function like this:

def foo(A, x, y, z):
    test = my_function(A[0], A[1], A[2], x, y, z)
    return test

In this answer I found some insight: How to display progress of scipy.optimize function? So I came up with this function:

def callbackF(Xi, x, y, z):
    global Nfeval
    print('{0:4d}   {1: 3.6f}   {2: 3.6f}   {3: 3.6f}   {4: 3.6f}'.format(
        Nfeval, Xi[0], Xi[1], Xi[2], foo(Xi, x, y, z)))
    Nfeval += 1

So my code looks like this:

Optimal = minimize(fun=foo, x0=[fi, alfa, Ks], args=(x, y, z),
                   method='BFGS', callback=callbackF, tol=1e-2)

But I get this error:

TypeError: callbackF() takes exactly 4 arguments (1 given)

I understand the error, but how should I avoid it?
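The root cause is that minimize invokes the callback with a single argument, the current parameter vector, so the extra x, y, z have to be bound some other way. A minimal sketch of one workaround, wrapping the callback in a lambda (foo here is a hypothetical quadratic stand-in, since my_function from the question is not shown):

```python
import numpy as np
from scipy.optimize import minimize

Nfeval = 1

def foo(A, x, y, z):
    # Hypothetical stand-in for my_function from the question.
    return (A[0] - x) ** 2 + (A[1] - y) ** 2 + (A[2] - z) ** 2

def callbackF(Xi, x, y, z):
    global Nfeval
    print('{0:4d}   {1: 3.6f}   {2: 3.6f}   {3: 3.6f}   {4: 3.6f}'.format(
        Nfeval, Xi[0], Xi[1], Xi[2], foo(Xi, x, y, z)))
    Nfeval += 1

x, y, z = 1.0, 2.0, 3.0
res = minimize(fun=foo, x0=[0.0, 0.0, 0.0], args=(x, y, z),
               method='BFGS',
               # Bind the extra arguments here, so scipy can call the
               # callback with just the current parameter vector Xi.
               callback=lambda Xi: callbackF(Xi, x, y, z),
               tol=1e-2)
```

functools.partial(callbackF, x=x, y=y, z=z) would work the same way if you prefer to avoid lambdas.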

Recommended answer

You can always DIY it if you can instrument the function itself. The only tricky bit is the iteration count. For that you can either use a global, or (IMO, better) attach the counter to the function itself:

>>> import numpy as np
>>> from scipy.optimize import minimize
>>>
>>> def f(x):
...     res = np.sum(x**2)
...     f.count += 1
...     print('x = ', x, ' res = ', res, '  j = ', f.count)
...     return res
...
>>> f.count = 0
>>> minimize(f, x0=5)
x =  [ 5.]  res =  25.0   j =  1
x =  [ 5.00000001]  res =  25.000000149   j =  2
x =  [ 5.]  res =  25.0   j =  3
x =  [-5.]  res =  25.0   j =  4
x =  [-5.]  res =  25.0   j =  5
x =  [-4.99999999]  res =  24.999999851   j =  6
x =  [ 0.0005]  res =  2.5e-07   j =  7
x =  [ 0.0005]  res =  2.5e-07   j =  8
x =  [ 0.00050001]  res =  2.50014901383e-07   j =  9
x =  [ -7.45132485e-09]  res =  5.55222420558e-17   j =  10
x =  [ -7.45132485e-09]  res =  5.55222420558e-17   j =  11
x =  [  7.44983634e-09]  res =  5.55000615146e-17   j =  12
      fun: 5.552224205575604e-17
 hess_inv: array([[ 0.5]])
      jac: array([ -1.48851092e-12])
  message: 'Optimization terminated successfully.'
     nfev: 12
      nit: 2
     njev: 4
   status: 0
  success: True
        x: array([ -7.45132485e-09])
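The same counter trick carries over to the question's multi-argument setup, since the attribute lives on the function object regardless of how many arguments the function takes. A sketch under the same assumption of a hypothetical quadratic in place of my_function:

```python
import numpy as np
from scipy.optimize import minimize

def foo(A, x, y, z):
    # Hypothetical stand-in for my_function from the question.
    res = (A[0] - x) ** 2 + (A[1] - y) ** 2 + (A[2] - z) ** 2
    foo.count += 1  # counter attached to the function object itself
    print('call', foo.count, ' A =', A, ' res =', res)
    return res

foo.count = 0
result = minimize(foo, x0=[0.0, 0.0, 0.0], args=(1.0, 2.0, 3.0),
                  method='BFGS')
```

Note that this logs every function evaluation (including the ones BFGS spends on numerical gradients), whereas a callback fires only once per iteration; pick whichever granularity you need.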

