How to implement this deep learning model in Keras?

Problem description

See the source code in replit.

I have 3 classes (A, B, and C).

I have 6 features:

train_x = [[ 6.442  6.338  7.027  8.789 10.009 12.566]
           [ 6.338  7.027  5.338 10.009  8.122 11.217]
           [ 7.027  5.338  5.335  8.122  5.537  6.408]
           [ 5.338  5.335  5.659  5.537  5.241  7.043]]

These features represent a 5-character string pattern composed of the 3 classes (e.g. AABBC, etc.).

Say a 5-character string pattern is one-hot encoded as follows:

train_z = [[0. 0. 1. 0. 0. 1. 0. 0. 1. 0. 0. 1. 1. 0. 0.]
           [0. 0. 1. 0. 0. 1. 0. 0. 1. 1. 0. 0. 1. 0. 0.]
           [0. 0. 1. 0. 0. 1. 1. 0. 0. 1. 0. 0. 1. 0. 0.]
           [0. 0. 1. 1. 0. 0. 1. 0. 0. 1. 0. 0. 0. 0. 1.]]
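For clarity, the encoding above maps each character to a 3-bit one-hot block (A → [1,0,0], B → [0,1,0], C → [0,0,1]) and concatenates the 5 blocks into a flat 15-bit vector. A minimal sketch of this encoding (the helper `encode_pattern` is hypothetical, not from the original code):

```python
import numpy as np

ALPHABET = "ABC"

def encode_pattern(pattern):
    # Hypothetical helper: flatten a 5-char string over {A, B, C}
    # into a 15-bit one-hot vector, matching the train_z rows above.
    vec = np.zeros(len(pattern) * len(ALPHABET))
    for i, ch in enumerate(pattern):
        vec[i * len(ALPHABET) + ALPHABET.index(ch)] = 1.0
    return vec

print(encode_pattern("CCCCA"))
# first train_z row: [0. 0. 1. 0. 0. 1. 0. 0. 1. 0. 0. 1. 1. 0. 0.]
```

Under this reading, the first row of train_z corresponds to the string "CCCCA".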

I think this is a multi-task learning problem.

So, I wrote the following source code:

# there would be 6 inputs for 6 features
inputs_tensor = keras.Input(shape=(FEATURES_COUNT,))

# there would be 2 hidden layers
hidden_layer_1 = keras.layers.Dense(LAYER_1_NEURON_COUNT, activation="relu")
hidden_layer_2 = keras.layers.Dense(LAYER_2_NEURON_COUNT, activation="relu")

# there would be 15 outputs for 15 bits
# each output layer will have 1 neuron for binary data
output_layer_1 = keras.layers.Dense(1, activation='sigmoid')  # 1 neuron for 1 output
output_layer_2 = keras.layers.Dense(1, activation='sigmoid')  # -do-
output_layer_3 = keras.layers.Dense(1, activation='sigmoid')  # -do-
output_layer_4 = keras.layers.Dense(1, activation='sigmoid')  # -do-
output_layer_5 = keras.layers.Dense(1, activation='sigmoid')  # -do-
output_layer_6 = keras.layers.Dense(1, activation='sigmoid')  # -do-
output_layer_7 = keras.layers.Dense(1, activation='sigmoid')  # -do-
output_layer_8 = keras.layers.Dense(1, activation='sigmoid')  # -do-
output_layer_9 = keras.layers.Dense(1, activation='sigmoid')  # -do-
output_layer_10 = keras.layers.Dense(1, activation='sigmoid')  # -do-
output_layer_11 = keras.layers.Dense(1, activation='sigmoid')  # -do-
output_layer_12 = keras.layers.Dense(1, activation='sigmoid')  # -do-
output_layer_13 = keras.layers.Dense(1, activation='sigmoid')  # -do-
output_layer_14 = keras.layers.Dense(1, activation='sigmoid')  # -do-
output_layer_15 = keras.layers.Dense(1, activation='sigmoid')  # -do-

# assembling the layers.
x = hidden_layer_1(inputs_tensor)
x = hidden_layer_2(x)
# configuring the output
output1 = output_layer_1(x)
output2 = output_layer_2(x)
output3 = output_layer_3(x)
output4 = output_layer_4(x)
output5 = output_layer_5(x)
output6 = output_layer_6(x)
output7 = output_layer_7(x)
output8 = output_layer_8(x)
output9 = output_layer_9(x)
output10 = output_layer_10(x)
output11 = output_layer_11(x)
output12 = output_layer_12(x)
output13 = output_layer_13(x)
output14 = output_layer_14(x)
output15 = output_layer_15(x)

model = keras.Model(inputs=[inputs_tensor],
                    outputs=[output1, output2, output3, output4, output5,
                             output6, output7, output8, output9, output10,
                             output11, output12, output13, output14, output15],
                    name="functional_model")

model.summary()
print("Inputs count : ", model.inputs)
print("Outputs count : ", len(model.outputs))

opt_function = keras.optimizers.SGD(lr=0.01, decay=1e-1, momentum=0.9, nesterov=True)
#
model.compile(loss='binary_crossentropy',
              optimizer=opt_function,
              metrics=['accuracy'])
#

print(train_x,"\n",train_z)

model.fit(
    train_x, train_z,
    epochs=EPOCHS,
    batch_size=BATCH_SIZE
)

It produces the error:

Traceback (most recent call last):
  File "C:/Users/pc/source/repos/OneHotEncodingLayer__test/ny_nn___k_15_outputs.py", line 117, in <module>
    model.fit(
  File "C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\keras\engine\training.py", line 108, in _method_wrapper
    return method(self, *args, **kwargs)
  File "C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\keras\engine\training.py", line 1098, in fit
    tmp_logs = train_function(iterator)
  File "C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\eager\def_function.py", line 780, in __call__
    result = self._call(*args, **kwds)
  File "C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\eager\def_function.py", line 823, in _call
    self._initialize(args, kwds, add_initializers_to=initializers)
  File "C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\eager\def_function.py", line 696, in _initialize
    self._stateful_fn._get_concrete_function_internal_garbage_collected(  # pylint: disable=protected-access
  File "C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\eager\function.py", line 2855, in _get_concrete_function_internal_garbage_collected
    graph_function, _, _ = self._maybe_define_function(args, kwargs)
  File "C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\eager\function.py", line 3213, in _maybe_define_function
    graph_function = self._create_graph_function(args, kwargs)
  File "C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\eager\function.py", line 3065, in _create_graph_function
    func_graph_module.func_graph_from_py_func(
  File "C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\framework\func_graph.py", line 986, in func_graph_from_py_func
    func_outputs = python_func(*func_args, **func_kwargs)
  File "C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\eager\def_function.py", line 600, in wrapped_fn
    return weak_wrapped_fn().__wrapped__(*args, **kwds)
  File "C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\framework\func_graph.py", line 973, in wrapper
    raise e.ag_error_metadata.to_exception(e)
ValueError: in user code:

    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\keras\engine\training.py:806 train_function  *
        return step_function(self, iterator)
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\keras\engine\training.py:796 step_function  **
        outputs = model.distribute_strategy.run(run_step, args=(data,))
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:1211 run
        return self._extended.call_for_each_replica(fn, args=args, kwargs=kwargs)
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:2585 call_for_each_replica
        return self._call_for_each_replica(fn, args, kwargs)
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\distribute\distribute_lib.py:2945 _call_for_each_replica
        return fn(*args, **kwargs)
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\keras\engine\training.py:789 run_step  **
        outputs = model.train_step(data)
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\keras\engine\training.py:748 train_step
        loss = self.compiled_loss(
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\keras\engine\compile_utils.py:204 __call__
        loss_value = loss_obj(y_t, y_p, sample_weight=sw)
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\keras\losses.py:149 __call__
        losses = ag_call(y_true, y_pred)
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\keras\losses.py:253 call  **
        return ag_fn(y_true, y_pred, **self._fn_kwargs)
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\util\dispatch.py:201 wrapper
        return target(*args, **kwargs)
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\keras\losses.py:1605 binary_crossentropy
        K.binary_crossentropy(y_true, y_pred, from_logits=from_logits), axis=-1)
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\util\dispatch.py:201 wrapper
        return target(*args, **kwargs)
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\keras\backend.py:4823 binary_crossentropy
        return nn.sigmoid_cross_entropy_with_logits(labels=target, logits=output)
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\util\dispatch.py:201 wrapper
        return target(*args, **kwargs)
    C:\ProgramData\Miniconda3\envs\by_nn\lib\site-packages\tensorflow\python\ops\nn_impl.py:173 sigmoid_cross_entropy_with_logits
        raise ValueError("logits and labels must have the same shape (%s vs %s)" %

    ValueError: logits and labels must have the same shape ((1, 1) vs (1, 15))


Process finished with exit code 1
  1. Is my implementation correct? If not, how can I correct it?
  2. How can I fix the error?

Recommended answer

The error appears because the model expects a single output layer with 15 units:

output_layer = keras.layers.Dense(15, activation='sigmoid')
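As a minimal sketch of that shape fix only (the hidden-layer sizes and FEATURES_COUNT value below are assumed for a self-contained example, not taken from the question):

```python
from tensorflow import keras

FEATURES_COUNT = 6        # assumed: 6 input features
LAYER_1_NEURON_COUNT = 32  # assumed hidden sizes
LAYER_2_NEURON_COUNT = 32

inputs_tensor = keras.Input(shape=(FEATURES_COUNT,))
x = keras.layers.Dense(LAYER_1_NEURON_COUNT, activation="relu")(inputs_tensor)
x = keras.layers.Dense(LAYER_2_NEURON_COUNT, activation="relu")(x)
# one 15-unit sigmoid head, so predictions match the (N, 15) labels
outputs = keras.layers.Dense(15, activation="sigmoid")(x)

model = keras.Model(inputs=inputs_tensor, outputs=outputs)
model.compile(loss="binary_crossentropy", optimizer="adam",
              metrics=["accuracy"])
print(model.output_shape)  # → (None, 15)
```

This makes the shapes agree, but as explained next it is still not the right design for this problem.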

Anyhow, this is not the correct way to implement the model, because your current model disregards the fact that all consecutive groups of 3 output values are linked: the labels are mutually exclusive. For example, if the model predicts an output [1, 1, 1, ..., 1, 1, 1] composed of 15 ones, it is impossible to decide whether to assign class A, B, or C to each character.

Your outputs should be 5 layers with softmax activation, trained using categorical cross-entropy losses (one for every output). The labels should be adapted to this form of output by splitting train_z into a list of 5 (N, 3)-shaped matrices.
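The splitting step can be sketched with NumPy alone (the two train_z rows below are copied from the question data):

```python
import numpy as np

# (N, 15) label matrix: 5 one-hot blocks of 3 bits per row
train_z = np.array([[0,0,1, 0,0,1, 0,0,1, 0,0,1, 1,0,0],
                    [0,0,1, 0,0,1, 0,0,1, 1,0,0, 1,0,0]], dtype=float)

# split into 5 per-character label matrices, one per output head
labels = np.split(train_z, 5, axis=1)

print([a.shape for a in labels])  # → [(2, 3), (2, 3), (2, 3), (2, 3), (2, 3)]
```

Each of the 5 arrays is then a valid categorical target for one softmax output.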

The code below should work better than your current model.

import tensorflow as tf
from tensorflow import keras

inputs_tensor = keras.Input(shape=(FEATURES_COUNT,))

# there would be 2 hidden layers
hidden_layer_1 = keras.layers.Dense(LAYER_1_NEURON_COUNT, activation="relu")
hidden_layer_2 = keras.layers.Dense(LAYER_2_NEURON_COUNT, activation="relu")

output_layer_char1 = keras.layers.Dense(3, activation='softmax')
output_layer_char2 = keras.layers.Dense(3, activation='softmax')
output_layer_char3 = keras.layers.Dense(3, activation='softmax')
output_layer_char4 = keras.layers.Dense(3, activation='softmax')
output_layer_char5 = keras.layers.Dense(3, activation='softmax')

x = hidden_layer_1(inputs_tensor)
x = hidden_layer_2(x)
# configuring the output
output1 = output_layer_char1(x)
output2 = output_layer_char2(x)
output3 = output_layer_char3(x)
output4 = output_layer_char4(x)
output5 = output_layer_char5(x)

model = keras.Model(inputs=[inputs_tensor],
                    outputs=[output1, output2, output3, output4, output5],
                    name="functional_model")

opt_function = keras.optimizers.SGD(learning_rate=0.01, decay=1e-1, momentum=0.9, nesterov=True)

model.compile(loss=['categorical_crossentropy']*5,
              optimizer=opt_function,
              metrics=[['accuracy']]*5)
model.fit(
    train_x, tf.split(train_z, 5, axis=1),
    epochs=EPOCHS,
    batch_size=BATCH_SIZE
)
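At prediction time, model.predict on this 5-head model returns a list of 5 (N, 3) probability arrays; a character string is recovered by taking the argmax of each block. A sketch with fabricated probabilities standing in for real predictions (the helper `decode_outputs` is illustrative, not part of the answer code):

```python
import numpy as np

ALPHABET = "ABC"

def decode_outputs(outputs, sample=0):
    # outputs: list of 5 arrays shaped (N, 3), as model.predict would return
    return "".join(ALPHABET[int(np.argmax(o[sample]))] for o in outputs)

# fake per-character probabilities for a single sample
fake = [np.array([[0.1, 0.2, 0.7]]),    # argmax 2 -> C
        np.array([[0.8, 0.1, 0.1]]),    # argmax 0 -> A
        np.array([[0.2, 0.6, 0.2]]),    # argmax 1 -> B
        np.array([[0.9, 0.05, 0.05]]),  # argmax 0 -> A
        np.array([[0.1, 0.1, 0.8]])]    # argmax 2 -> C

print(decode_outputs(fake))  # → CABAC
```

Because each head is a softmax over exactly {A, B, C}, the argmax is always a well-defined class, which is precisely what the 15-sigmoid design could not guarantee.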

