How to Do a Simple CLI Query for a Saved Estimator Model?
The ServingInputReceiver you're creating for the model export is telling the saved model to expect serialized tf.Example protos instead of the raw strings you wish to classify.
From the Save and Restore documentation:
A typical pattern is that inference requests arrive in the form of serialized tf.Examples, so the serving_input_receiver_fn() creates a single string placeholder to receive them. The serving_input_receiver_fn() is then also responsible for parsing the tf.Examples by adding a tf.parse_example op to the graph.
...
The tf.estimator.export.build_parsing_serving_input_receiver_fn utility function provides that input receiver for the common case.
So your exported model contains a tf.parse_example op that expects to receive serialized tf.Example protos satisfying the feature specification you passed to build_parsing_serving_input_receiver_fn, i.e. in your case it expects serialized examples that have the sentence feature. To predict with the model, you have to provide those serialized protos.
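For reference, the export side of this setup typically looks something like the sketch below (the feature spec here is an assumption based on your sentence feature; your actual export code may differ):

import tensorflow as tf

# Assumed feature spec: one string feature named "sentence" per example.
feature_spec = {"sentence": tf.FixedLenFeature([], tf.string)}

# This builds a serving graph with a single string placeholder that receives
# serialized tf.Example protos, plus a tf.parse_example op that parses them
# according to feature_spec.
serving_input_receiver_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)

# estimator.export_savedmodel("/path/to/export", serving_input_receiver_fn)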
Fortunately, TensorFlow makes it fairly easy to construct these protos. Here's one possible function that returns an expression mapping the examples input key to a batch of serialized strings, which you can then pass to the CLI:
import tensorflow as tf

def serialize_example_string(strings):
    serialized_examples = []
    for s in strings:
        try:
            value = [bytes(s, "utf-8")]
        except TypeError:  # Python 2: bytes() takes a single argument
            value = [bytes(s)]
        # Wrap each string in a tf.Example proto with a single "sentence" feature.
        example = tf.train.Example(
            features=tf.train.Features(
                feature={
                    "sentence": tf.train.Feature(bytes_list=tf.train.BytesList(value=value))
                }
            )
        )
        serialized_examples.append(example.SerializeToString())
    # saved_model_cli's --input_exprs expects double-quoted byte literals.
    return "examples=" + repr(serialized_examples).replace("'", "\"")
So using some strings pulled from your examples:

strings = ["klassifiziere mich bitte",
           "Das Paket „S Line Competition“ umfasst unter anderem optische Details, eine neue Farbe (Turboblau), 19-Zöller und LED-Lampen.",
           "(pro Stimme geht 1 Euro Spende von Pfuscher ans Forum) ah du sack, also so gehts ja net :D:D:D"]

print(serialize_example_string(strings))
The CLI command would then be:
saved_model_cli run --dir /path/to/model --tag_set serve --signature_def predict --input_exprs='examples=[b"\n*\n(\n\x08sentence\x12\x1c\n\x1a\n\x18klassifiziere mich bitte", b"\n\x98\x01\n\x95\x01\n\x08sentence\x12\x88\x01\n\x85\x01\n\x82\x01Das Paket \xe2\x80\x9eS Line Competition\xe2\x80\x9c umfasst unter anderem optische Details, eine neue Farbe (Turboblau), 19-Z\xc3\xb6ller und LED-Lampen.", b"\np\nn\n\x08sentence\x12b\n`\n^(pro Stimme geht 1 Euro Spende von Pfuscher ans Forum) ah du sack, also so gehts ja net :D:D:D"]'
which should give you the desired results:
Result for output key class_ids:
[[0]
[1]
[0]]
Result for output key classes:
[[b'0']
[b'1']
[b'0']]
Result for output key logistic:
[[0.05852016]
[0.88453305]
[0.04373989]]
Result for output key logits:
[[-2.7780817]
[ 2.0360758]
[-3.0847695]]
Result for output key probabilities:
[[0.94147986 0.05852016]
[0.11546692 0.88453305]
[0.9562601 0.04373989]]
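As a quick sanity check on these numbers: for a binary classification head like this one, the logistic output is just the sigmoid of the logit, and probabilities is [1 - p, p] (a sketch; the exact output keys depend on your estimator's head):

import math

# For a binary head: p = sigmoid(logit), probabilities = [1 - p, p],
# and class_ids thresholds p at 0.5.
logit = 2.0360758  # logit of the second example above
p = 1.0 / (1.0 + math.exp(-logit))
print(p)             # ~0.88453305, matching the "logistic" output
print([1.0 - p, p])  # ~[0.11546695, 0.88453305], matching "probabilities"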
Alternatively, saved_model_cli provides another option, --input_examples, instead of --input_exprs, so that you can pass the tf.Example data directly on the command line, without the manual serialization.
For example:
--input_examples 'examples=[{"sentence":["this is a sentence"]}]'
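The full command for the first example string would then look something like this (assuming the same model directory and signature as above):

saved_model_cli run --dir /path/to/model --tag_set serve --signature_def predict --input_examples 'examples=[{"sentence":["klassifiziere mich bitte"]}]'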
See https://www.tensorflow.org/guide/saved_model#--input_examples for details.