Image mean subtraction vs. BatchNormalization in Caffe

Problem Description

I have a question regarding image preprocessing in Caffe. When I use a BatchNormalization layer in my caffemodel, do I still need the preprocessing step "image mean subtraction" on all my training images before the training phase starts? Or is this done in the BatchNormalization layer?

Thanks a lot =)

Answer

Image mean subtraction does something different from BatchNormalization and is used for a different purpose.

BatchNormalization normalizes a whole batch, not every single image; it is used to keep the data well distributed and to combat high activations and therefore overfitting. Afterwards, not every image has zero mean, but the batch as a whole does. The two would only coincide if the batch size were 1.
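To make this concrete, here is a minimal numpy sketch of what batch normalization computes (not Caffe's actual implementation; the array shapes and epsilon are illustrative):

```python
import numpy as np

# A toy batch of 4 single-channel 3x3 "images" (illustrative values).
batch = np.random.rand(4, 3, 3).astype(np.float32) * 255.0

# Batch normalization: zero mean / unit variance computed across the
# batch (here per pixel position; Caffe's BatchNorm layer normalizes
# per channel across batch, height and width).
mean = batch.mean(axis=0)
var = batch.var(axis=0)
normalized = (batch - mean) / np.sqrt(var + 1e-5)

print(normalized.mean())     # ~0: the batch as a whole is centered
print(normalized[0].mean())  # generally != 0: a single image is not
```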

Image mean subtraction is mostly used to combat illumination changes in the input space: http://ufldl.stanford.edu/wiki/index.php/Data_Preprocessing
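For comparison, a minimal sketch of image mean subtraction: the mean image is computed once over the whole training set and then subtracted from every image. (In Caffe this mean would typically come from the compute_image_mean tool and be supplied via a mean_file; here it is computed directly with numpy on stand-in data.)

```python
import numpy as np

# Stand-in training set: 100 RGB images of size 32x32 (random data,
# purely for illustration).
train_images = np.random.rand(100, 3, 32, 32).astype(np.float32) * 255.0

# The mean image is computed once over the whole training set and then
# subtracted from every image (train and test alike).
mean_image = train_images.mean(axis=0)
centered = train_images - mean_image

print(centered.mean())  # ~0: each pixel is centered over the whole dataset
```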

Depending on your specific case, you may get good results by applying batch normalization directly after the input instead of using mean subtraction, but you will need to test this.
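If you want to try this in Caffe, one possible setup is sketched below using pycaffe's NetSpec: a BatchNorm layer, paired with the usual Scale layer, placed directly after the data layer, with no mean_file set. The layer names, the LMDB source, and the hyperparameters are placeholders, not part of the original answer.

```python
import caffe
from caffe import layers as L, params as P

# Sketch of a net that normalizes the raw input with BatchNorm instead
# of subtracting a precomputed mean image. Note that no
# transform_param/mean_file is set on the data layer.
n = caffe.NetSpec()
n.data, n.label = L.Data(batch_size=64, source='train_lmdb',
                         backend=P.Data.LMDB, ntop=2)
# BatchNorm right after the input; in Caffe it is usually followed by
# a Scale layer that learns the affine parameters (gamma, beta).
n.bn = L.BatchNorm(n.data, use_global_stats=False)
n.scaled = L.Scale(n.bn, bias_term=True)
# ... rest of the network, e.g. a first convolution:
n.conv1 = L.Convolution(n.scaled, num_output=32, kernel_size=3)

with open('train.prototxt', 'w') as f:
    f.write(str(n.to_proto()))
```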

