This article describes an approach to multi-task deep learning with TensorFlow. It should be a useful reference for anyone working on a similar problem.

Problem description

Has anyone tried doing multi-task deep learning with TensorFlow? That is, sharing the bottom layers while not sharing the top layers. An example with a simple illustration would help a lot.

Solution

There is a similar question here; its answer uses Keras.

The approach is much the same with plain TensorFlow. The idea is this: we define multiple outputs for one network, and therefore multiple loss functions (objectives). We then tell the optimizer to minimize a combined loss function, usually a linear combination of the individual losses.
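
As a minimal sketch of that idea (the layer sizes, the names x, shared, loss_a, loss_b, and the 0.5 weight are illustrative assumptions, not part of the original answer):

import tensorflow as tf  # TF 1.x style, matching the code below

x = tf.placeholder(tf.float32, [None, 784])   # shared input
y_a = tf.placeholder(tf.float32, [None, 10])  # task A targets (one-hot)
y_b = tf.placeholder(tf.float32, [None, 2])   # task B targets (one-hot)

# Bottom layer shared by both tasks
shared = tf.layers.dense(x, 128, activation=tf.nn.relu)

# Task-specific top layers (not shared)
logits_a = tf.layers.dense(shared, 10)
logits_b = tf.layers.dense(shared, 2)

loss_a = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=y_a, logits=logits_a))
loss_b = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=y_b, logits=logits_b))

# Linear combination of the objectives; the weights are a tuning choice
total_loss = 1.0 * loss_a + 0.5 * loss_b
train_op = tf.train.AdamOptimizer(1e-3).minimize(total_loss)

Each minimize step backpropagates the combined loss, so the shared layer receives gradient signal from both tasks while each head is trained only by its own objective.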

A concept diagram

This diagram is drawn according to this paper.

Let's say we are training a classifier that predicts the digits in an image, with at most 5 digits per image. Here we define 6 output layers: digit1, digit2, digit3, digit4, digit5, and length. Each digit layer should output 0~9 if there is a digit at its position, or X (substitute a real number in practice) if there isn't. The same goes for length: it should output 0~5 if the image contains 0~5 digits, or X if it contains more than 5.
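
For concreteness, the six heads might be wired up as below. This is a sketch under assumptions: features stands in for the output of the shared bottom layers, and the 4096-dim size and the placeholder targets are illustrative, not from the original answer.

# Stand-in for the output of the shared bottom layers (e.g. a conv trunk)
features = tf.placeholder(tf.float32, [None, 4096])

# One softmax head per digit position: 11 classes = digits 0-9 plus "no digit" (X)
digit1_logits = tf.layers.dense(features, 11)
digit2_logits = tf.layers.dense(features, 11)
digit3_logits = tf.layers.dense(features, 11)
digit4_logits = tf.layers.dense(features, 11)
digit5_logits = tf.layers.dense(features, 11)

# Length head: 7 classes = lengths 0-5 plus "more than 5 digits" (X)
length_logits = tf.layers.dense(features, 7)

# One-hot targets for each head
true_length = tf.placeholder(tf.float32, [None, 7])
true_digit1, true_digit2, true_digit3, true_digit4, true_digit5 = (
    tf.placeholder(tf.float32, [None, 11]) for _ in range(5))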

Now, to train it, we simply add up the cross-entropy losses of all the softmax outputs:

# Define loss and optimizer: each head's mean softmax cross-entropy is
# clipped before the log so that tf.log stays finite
lossLength = tf.log(tf.clip_by_value(tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=true_length, logits=length_logits)), 1e-37, 1e+37))
lossDigit1 = tf.log(tf.clip_by_value(tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=true_digit1, logits=digit1_logits)), 1e-37, 1e+37))
lossDigit2 = tf.log(tf.clip_by_value(tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=true_digit2, logits=digit2_logits)), 1e-37, 1e+37))
lossDigit3 = tf.log(tf.clip_by_value(tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=true_digit3, logits=digit3_logits)), 1e-37, 1e+37))
lossDigit4 = tf.log(tf.clip_by_value(tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=true_digit4, logits=digit4_logits)), 1e-37, 1e+37))
lossDigit5 = tf.log(tf.clip_by_value(tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=true_digit5, logits=digit5_logits)), 1e-37, 1e+37))

# Combined multi-task objective: the sum of all six per-head losses
cost = tf.add_n([lossLength, lossDigit1, lossDigit2,
                 lossDigit3, lossDigit4, lossDigit5])

optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)
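
A training step then just feeds a batch and runs the optimizer. A minimal sketch, assuming the hypothetical features placeholder from above and batch arrays (batch_x, batch_len, batch_d1, ...) coming from your own input pipeline:

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # One gradient step updates the shared layers and all six heads at once
    _, c = sess.run([optimizer, cost],
                    feed_dict={features: batch_x,
                               true_length: batch_len,
                               true_digit1: batch_d1,
                               true_digit2: batch_d2,
                               true_digit3: batch_d3,
                               true_digit4: batch_d4,
                               true_digit5: batch_d5})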

That concludes this article on multi-task deep learning with TensorFlow. We hope the answer above is helpful.
