This article looks at the question of a list of differentiable ops in TensorFlow; it may be a useful reference if you are facing the same problem.

Problem Description

Is there a master list of Tensorflow ops that are differentiable (i.e., will auto-differentiate)?

Two other ways to phrase this:

  • List of ops that do not have ops.NoGradient set.
  • List of ops that will not trigger a LookupError.

For example, I'd assume that all the Control Flow ops are not differentiable (e.g., tf.where). How would I find this other than by manually running them all through tf.gradients to see whether they throw a LookupError?
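As a side note, that manual probe can be scripted. Below is a minimal sketch, assuming the TF1-style graph API that the question is written against; note that, in my understanding, ops explicitly registered as NotDifferentiable tend to produce None gradients rather than raising a LookupError, so checking for the exception alone may miss them.

```python
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[3])
# The op being probed; swap in whatever op you want to test.
y = tf.where(x > 0.0, x, -x)

try:
    grads = tf.gradients(y, x)
    # Ops registered with ops.NotDifferentiable usually do not raise here;
    # they simply show up as None gradients in the result.
    print("gradients:", grads)
except LookupError as e:
    print("no gradient registered for an op on this path:", e)
```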

"Common sense" is not a valid answer.

Thanks.

tf.where is differentiable, so my intuitions were wrong. Perhaps the correct question here is which ops in Tensorflow are not differentiable.

Thanks.

Recommended Answer

No, there is no such list (you could be the first one to create it). Also, as far as I am aware, the documentation for each function does not mention it either (tf.size is non-differentiable, but its documentation does not say so).

Apart from the way you suggested, you can also extract this data from the source code. For example, all the ops that have a gradient implemented have @ops.RegisterGradient in front of the gradient function's declaration. For ops which do not have a gradient, you will instead find a call like ops.NotDifferentiable(...).
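As a rough sketch of that idea without grepping the source, the gradient registry can also be queried through the internal (and therefore unstable) module tensorflow.python.framework.ops: get_gradient_function returns None for ops registered with ops.NotDifferentiable, and raises LookupError for ops that have no registration at all. The op choices below are just illustrative.

```python
import tensorflow as tf
from tensorflow.python.framework import ops

x = tf.placeholder(tf.float32, shape=[3])
candidates = {
    "Select (tf.where)": tf.where(x > 0.0, x, -x).op,
    "Size (tf.size)": tf.size(x).op,
}

for name, op in candidates.items():
    try:
        grad_fn = ops.get_gradient_function(op)
        if grad_fn is None:
            print(name, "-> registered as NotDifferentiable")
        else:
            print(name, "-> has a registered gradient")
    except LookupError:
        print(name, "-> no gradient registered at all")
```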

Unrelated, but it might be helpful.

That concludes this article on the list of differentiable ops in TensorFlow; hopefully the recommended answer above is of some help.
