I am trying to quantize a TensorFlow SavedModel with the following command line:

tflite_convert \
  --output_file=/tmp/foo.tflite \
  --saved_model_dir=/tmp/saved_model

but I get the following error:
ValueError: No 'serving_default' in the SavedModel's SignatureDefs. Possible values are 'my model name'

I have already checked that a signature map was defined when the model was exported.
The command:
saved_model_cli show --dir /tmp/mobilenet/1 --tag_set serve

returns:
The given SavedModel MetaGraphDef contains SignatureDefs with the following keys:
SignatureDef key: 'name_of_my_model'

as well as:
The given SavedModel SignatureDef contains the following input(s):
  inputs['is_training'] tensor_info:
      dtype: DT_BOOL
      shape: ()
      name: is_training:0
  inputs['question1_embedding'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 35, 300)
      name: question1_embedding:0
  inputs['question2_embedding'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 35, 300)
      name: question2_embedding:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['prediction'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 1)
      name: prediction:0
Method name is: tensorflow/serving/predict
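
For completeness, the SignatureDef keys can also be listed from Python instead of the CLI; the following is a minimal sketch assuming TF 1.x and the same export directory as above:

import tensorflow as tf

with tf.Session(graph=tf.Graph()) as sess:
    # Load the MetaGraphDef tagged 'serve' and print its SignatureDef keys.
    meta_graph_def = tf.saved_model.loader.load(sess, ['serve'], '/tmp/mobilenet/1')
    print(list(meta_graph_def.signature_def.keys()))  # e.g. ['name_of_my_model']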

Best answer

When converting, you should be able to specify the signature name with --saved_model_signature_key:

tflite_convert \
         --output_file=/tmp/foo.tflite \
         --saved_model_dir=/tmp/saved_model \
         --saved_model_signature_key='my model name'
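
The same thing can be done through the Python API; here is a minimal sketch assuming the TF 1.x converter (in TF 2.x the argument is signature_keys=[...] instead) and the paths from the question:

import tensorflow as tf

# Point the converter at the SavedModel and at the non-default signature key
# reported by saved_model_cli.
converter = tf.lite.TFLiteConverter.from_saved_model(
    '/tmp/saved_model',
    signature_key='my model name')
tflite_model = converter.convert()

with open('/tmp/foo.tflite', 'wb') as f:
    f.write(tflite_model)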
