How can the number of parameters associated with the BatchNormalization layer be 2048?

Question

I have the following code:

import keras

x = keras.layers.Input(batch_shape = (None, 4096))
hidden = keras.layers.Dense(512, activation = 'relu')(x)
hidden = keras.layers.BatchNormalization()(hidden)
hidden = keras.layers.Dropout(0.5)(hidden)
predictions = keras.layers.Dense(80, activation = 'sigmoid')(hidden)
mlp_model = keras.models.Model(input = [x], output = [predictions])
mlp_model.summary()

This is the model summary:

____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
input_3 (InputLayer)             (None, 4096)          0                                            
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 512)           2097664     input_3[0][0]                    
____________________________________________________________________________________________________
batchnormalization_1 (BatchNorma (None, 512)           2048        dense_1[0][0]                    
____________________________________________________________________________________________________
dropout_1 (Dropout)              (None, 512)           0           batchnormalization_1[0][0]       
____________________________________________________________________________________________________
dense_2 (Dense)                  (None, 80)            41040       dropout_1[0][0]                  
====================================================================================================
Total params: 2,140,752
Trainable params: 2,139,728
Non-trainable params: 1,024
____________________________________________________________________________________________________

The size of the input to the BatchNormalization (BN) layer is 512. According to the Keras documentation, the output shape of the BN layer is the same as its input, i.e. 512.

Then how is the number of parameters associated with the BN layer 2048?

Answer

Batch normalization in Keras implements the Batch Normalization paper (Ioffe & Szegedy, 2015).

As you can read there, in order to make batch normalization work during training, the layer needs to keep track of the distribution of each normalized dimension. To do so, since you are in mode=0 by default, it keeps 4 parameters per feature of the previous layer: a learned scale (gamma) and offset (beta), plus a moving mean and a moving variance used at inference time. Gamma and beta are updated by backpropagation, while the moving statistics are updated during training but not by the optimizer, which is why the summary reports 1,024 non-trainable parameters. Together these parameters make sure the information is properly propagated and backpropagated.

So 4 * 512 = 2048, which should answer your question.
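
To make the count concrete, here is a minimal verification sketch. It assumes TensorFlow 2.x with tf.keras (rather than the Keras 1.x API used in the question, where the parameter count is the same) and simply rebuilds the model and prints the BN layer's four weight tensors, each of shape (512,).

# Minimal verification sketch -- assumes TensorFlow 2.x (tf.keras), not the
# Keras 1.x API shown in the question; the BN parameter count is identical.
import tensorflow as tf
from tensorflow import keras

x = keras.layers.Input(shape=(4096,))
hidden = keras.layers.Dense(512, activation='relu')(x)
bn = keras.layers.BatchNormalization()
hidden = bn(hidden)
hidden = keras.layers.Dropout(0.5)(hidden)
predictions = keras.layers.Dense(80, activation='sigmoid')(hidden)
model = keras.models.Model(inputs=[x], outputs=[predictions])

# Four weight tensors, each of shape (512,):
# gamma (scale), beta (offset), moving_mean, moving_variance
for w in bn.weights:
    print(w.name, tuple(w.shape))

# gamma and beta are updated by the optimizer; the moving statistics are
# updated during training but not by backprop, hence 1,024 non-trainable params.
print("trainable:", sum(int(tf.size(w)) for w in bn.trainable_weights))          # 1024
print("non-trainable:", sum(int(tf.size(w)) for w in bn.non_trainable_weights))  # 1024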
