What is train_on_batch() in Keras used for?

Question

How is train_on_batch() different from fit()? In what cases should we use train_on_batch()?

Answer

Here is a simple answer from a primary author of the library:

train_on_batch allows you to expressly update weights based on a collection of samples you provide, without regard to any fixed batch size. You would use this in cases when that is what you want: to train on an explicit collection of samples. You could use that approach to maintain your own iteration over multiple batches of a traditional training set, but allowing fit or fit_generator to iterate batches for you is likely simpler.
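As a minimal sketch of "maintaining your own iteration", the loop below slices a toy in-memory dataset into batches and calls train_on_batch once per slice; the model, data shapes, and hyperparameters are all made up for illustration (assumes TensorFlow 2.x):

```python
import numpy as np
from tensorflow import keras

# Toy regression data (shapes are illustrative only)
x = np.random.rand(64, 4).astype("float32")
y = np.random.rand(64, 1).astype("float32")

model = keras.Sequential([keras.Input(shape=(4,)), keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")

batch_size = 16
for epoch in range(2):
    for start in range(0, len(x), batch_size):
        xb = x[start:start + batch_size]
        yb = y[start:start + batch_size]
        # One gradient update on exactly this slice of samples
        loss = model.train_on_batch(xb, yb)
```

Note that this hand-rolled loop does no shuffling, callbacks, or progress reporting; those are exactly the conveniences fit would otherwise provide.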

One case when it might be nice to use train_on_batch is for updating a pre-trained model on a single new batch of samples. Suppose you've already trained and deployed a model, and sometime later you've received a new set of training samples previously never used. You could use train_on_batch to directly update the existing model only on those samples. Other methods can do this too, but it is rather explicit to use train_on_batch for this case.
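A hedged sketch of that scenario, assuming TensorFlow 2.x; here the "deployed" model is just a freshly built stand-in, and the new samples are random placeholders:

```python
import numpy as np
from tensorflow import keras

# Stand-in for a model that was trained and deployed earlier
model = keras.Sequential([keras.Input(shape=(4,)), keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")

# A new set of previously unseen samples arrives later
x_new = np.random.rand(8, 4).astype("float32")
y_new = np.random.rand(8, 1).astype("float32")

# Apply a single, explicit weight update using only these samples
loss_after = model.train_on_batch(x_new, y_new)
```

In a real deployment you would load the saved model (e.g. with keras.models.load_model) instead of building a new one, and you may want a smaller learning rate so one batch of new data does not overwrite what the model already learned.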

Apart from special cases like this (either where you have some pedagogical reason to maintain your own cursor across different training batches, or else for some type of semi-online training update on a special batch), it is probably better to just always use fit (for data that fits in memory) or fit_generator (for streaming batches of data as a generator).
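For comparison, the common path looks like the sketch below, where fit handles batching and epoch iteration itself (same illustrative toy data as above; note that in current TensorFlow 2.x, fit also accepts generators and tf.data.Dataset objects directly, and fit_generator is deprecated):

```python
import numpy as np
from tensorflow import keras

x = np.random.rand(64, 4).astype("float32")
y = np.random.rand(64, 1).astype("float32")

model = keras.Sequential([keras.Input(shape=(4,)), keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss="mse")

# fit slices the data into batches, shuffles, and loops over epochs for you
history = model.fit(x, y, batch_size=16, epochs=2, verbose=0)
```

The returned History object records the per-epoch loss, which you would otherwise have to track by hand in a train_on_batch loop.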

