How to use scipy.optimize.minimize with Keras model.compile?

Problem description

I have a custom loss f = x + y, with the constraint that while optimising f, x should stay within the range (0.10, 0.2) and y within the range (0.6, 0.1). Here y is the mean squared difference between the actual and predicted labels, and x represents different types of jobs. The model is not trained on x; however, it needs to be optimized so that the predictions contain different types of jobs.

I came across Scipy.optimize: how to restrict argument values, which covers how scipy.optimize can be used with bounds on a function's parameters. However, my main problem is that I have a custom loss function total_loss(y_pred, y_true), which works with Keras as a loss function using "SGD" as the optimizer. Now, to incorporate the bounded range of the parameter, I would like to use scipy.optimize.minimize with Keras. Any direction on how to use scipy.optimize with model.compile in Keras?

Recommended answer

You can use a custom training loop. The training loop can collect the gradients with respect to your Keras loss function, and you can then choose to optimize the weights subject to a given constraint.
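Such a loop might be sketched as follows, assuming TensorFlow/Keras. The model, data, and the [-0.5, 0.5] box constraint are all illustrative (the question does not specify them); the constraint is enforced by projecting the weights back into range after each gradient step.

```python
import numpy as np
import tensorflow as tf

# Hypothetical one-layer model and an MSE-style custom loss
# (the "y" part of f = x + y from the question).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(1),
])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

def total_loss(y_true, y_pred):
    return tf.reduce_mean(tf.square(y_true - y_pred))

X = np.random.rand(32, 3).astype("float32")
y = np.random.rand(32, 1).astype("float32")

for step in range(10):
    # Collect gradients of the custom loss with respect to the weights.
    with tf.GradientTape() as tape:
        loss = total_loss(y, model(X, training=True))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    # Enforce a box constraint by clipping each weight back into range
    # (projected gradient step; the bounds here are illustrative).
    for v in model.trainable_variables:
        v.assign(tf.clip_by_value(v, -0.5, 0.5))
```

Projecting after each step is the simplest way to keep weights inside a box; Keras also supports per-layer `kernel_constraint` callables for the same purpose.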

I've implemented something similar; this code calls SciPy's minimize to optimize a model's variables.
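A minimal sketch of that idea, using a tiny NumPy linear model in place of a full Keras model (with a real model you would flatten model.get_weights() into one vector and reshape it back inside the objective). The data, true weights, and bounds are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Tiny linear "model": predictions = X @ w.
rng = np.random.default_rng(0)
X = rng.random((20, 3))
y_true = X @ np.array([0.5, -0.2, 0.1])

def f(w):
    # The y-part of the custom loss: mean squared difference
    # between actual and predicted labels.
    y_pred = X @ w
    return np.mean((y_true - y_pred) ** 2)

# Box constraint on each weight, analogous to keeping a
# parameter within a fixed range (bounds are illustrative).
bounds = [(-1.0, 1.0)] * 3

res = minimize(f, x0=np.zeros(3), method="L-BFGS-B", bounds=bounds)
print(res.x)  # optimized weights, each within the bounds
```

L-BFGS-B is one of the SciPy methods that accepts `bounds` directly; with constraints of the more general `g(w) <= 0` form, `method="SLSQP"` with a `constraints` argument would be the analogous choice.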
