This article looks at Deeplearning4j: Iterations, Epochs, and the ScoreIterationListener; it should be a useful reference for anyone running into the same question.

Problem Description

Good afternoon, everyone.

I'm quite new to the Deeplearning4j library, and there are a couple of things that are still unclear to me. The concept of an "epoch" is not new, so it clearly represents a full pass over the training set. My first doubt is related to the concept of an "iteration". What is an iteration over the training set? Does it correspond to processing one mini-batch of training instances, or to something else?

In my code I set ".iterations(1)"; however, when I run it I see many lines like:

... ScoreIterationListener - Score at iteration XX is yy.yyyyyy

So, if I set ".iterations(1)", why do I keep seeing values of XX greater than 1? Is there perhaps a difference between what "iteration" means as a network configuration parameter and what it means to the ScoreIterationListener class?
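For reference, here is a minimal sketch of the kind of setup the question describes. This is an assumption on my part rather than the asker's actual code: it targets an older DL4J release (the 0.x line), where NeuralNetConfiguration.Builder still exposes .iterations(...), and the class name and layer sizes are placeholders.

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.deeplearning4j.optimize.listeners.ScoreIterationListener;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

// Hypothetical setup matching the question: .iterations(1) plus a ScoreIterationListener.
public class IterationListenerExample {

    public static MultiLayerNetwork buildModel() {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(12345)
                .iterations(1) // one parameter update per mini-batch passed to fit()
                .list()
                .layer(0, new DenseLayer.Builder().nIn(784).nOut(100)
                        .activation(Activation.RELU).build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
                        .nIn(100).nOut(10).activation(Activation.SOFTMAX).build())
                .build();

        MultiLayerNetwork model = new MultiLayerNetwork(conf);
        model.init();
        // Print the score after every parameter update; this is what produces the
        // "Score at iteration XX is yy.yyyyyy" lines quoted above.
        model.setListeners(new ScoreIterationListener(1));
        return model;
    }
}

The constructor argument of ScoreIterationListener is the print frequency, so ScoreIterationListener(1) logs the score on every single update.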

Thanks, everybody, for any answer or link to useful information.

Best,
Mauro

Recommended Answer

The DeepLearning4J documentation has some good insight, especially with respect to the difference between an epoch and an iteration.

According to the DL4J documentation:

"An iteration is simply one update of the neural net model's parameters. Not to be confused with an epoch, which is one complete pass through the dataset. Many iterations can occur before an epoch is over. Epoch and iteration are only synonymous if you update your parameters once for each pass through the whole dataset; if you update using mini-batches, they mean different things. Say your data has 2 minibatches: A and B. .numIterations(3) performs training like AAABBB, while 3 epochs looks like ABABAB."

With respect to your question and as the quoted excerpt indicates, if you set .iterations(1) and had only one batch, an iteration would be synonymous with one epoch, that is, one pass through the complete dataset. However, if you update using mini-batches, an epoch and an iteration differ: in the example above, three iterations per batch over one epoch gives AAABBB, whereas three epochs with one iteration each gives ABABAB.
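To tie this back to the "XX greater than 1" observation, below is a hedged sketch of a training loop. It assumes the hypothetical model built in the earlier sketch and DL4J's bundled MnistDataSetIterator; the batch size and epoch count are arbitrary placeholders. Because .iterations(1) means one parameter update per mini-batch, the listener's iteration counter keeps increasing across mini-batches and epochs, which is exactly why XX grows past 1.

import java.io.IOException;

import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.dataset.api.iterator.DataSetIterator;

// Hypothetical training loop: with .iterations(1), every mini-batch triggers one update,
// and ScoreIterationListener(1) logs one "Score at iteration XX" line per update.
public class TrainingLoopExample {

    public static void train(MultiLayerNetwork model) throws IOException {
        DataSetIterator trainIter = new MnistDataSetIterator(64, true, 12345); // batch size 64
        int numEpochs = 3;
        for (int epoch = 0; epoch < numEpochs; epoch++) {
            trainIter.reset();
            model.fit(trainIter); // one full pass over the data = one epoch, many iterations
        }
        // MNIST has 60,000 training examples, so each epoch contains roughly 938 mini-batches
        // of 64; after 3 epochs the listener reports iteration numbers in the thousands.
    }
}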

Hopefully this answer and the linked documentation answer your question!

P.S. I apologize for the late reply; I only stumbled on this question very recently!

That concludes this article on Deeplearning4j: Iterations, Epochs, and the ScoreIterationListener. Hopefully the recommended answer above is helpful.
