"Feature: targets is required but could not be found" on exported Text2ClassProblem #868
Description
I created a Text2ClassProblem with 7 labels. When I serve the exported model with tensorflow_model_server and send a request via t2t-query-server, the client crashes with the error below.
I used the following setup (a rough sketch of the problem definition follows the list):
- model: transformer
- hparams_set: transformer_base_single_gpu
- vocab: 32k subwords
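The problem definition looks roughly like this sketch; the class name, label names, and the `read_my_data()` helper are placeholders rather than my actual code:

```python
# Rough sketch of the problem definition (class name, label names, and
# read_my_data() are placeholders, not the real code).
from tensor2tensor.data_generators import text_problems
from tensor2tensor.utils import registry


def read_my_data():
  # Placeholder for the real corpus reader: yields (text, integer label) pairs.
  return [("example document", 0)]


@registry.register_problem
class MyDocClassification(text_problems.Text2ClassProblem):
  """Classify a document into one of 7 classes."""

  @property
  def approx_vocab_size(self):
    return 2**15  # ~32k subwords

  @property
  def num_classes(self):
    return 7

  def class_labels(self, data_dir):
    del data_dir
    return ["class_%d" % i for i in range(self.num_classes)]

  @property
  def is_generate_per_split(self):
    return False  # let T2T make the train/eval split itself

  def generate_samples(self, data_dir, tmp_dir, dataset_split):
    del data_dir, tmp_dir, dataset_split
    # Yields dicts with the raw text and an integer label in [0, 6].
    for text, label in read_my_data():
      yield {"inputs": text, "label": label}
```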
Environment information
OS: Ubuntu 16.04
$ pip freeze | grep tensor
tensor2tensor==1.6.3
tensorboard==1.7.0
tensorflow-gpu==1.7.0
$ python -V
Python 2.7.12
Steps to reproduce:
1. Create a Text2ClassProblem and assign label ids 0-6.
2. t2t-datagen
3. t2t-trainer
4. t2t-exporter
5. tensorflow_model_server
6. t2t-query-server (sending a request triggers the error; see the client sketch below)
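The client in the traceback does roughly the following; the server address, servable name, paths, and problem name are placeholders, and I am assuming serving_utils.make_grpc_request_fn is the same request-fn constructor that t2t-query-server uses:

```python
# Roughly what the client in the traceback does; server address, servable
# name, paths, and the problem name below are placeholders.
import tensorflow as tf

from tensor2tensor.serving import serving_utils
from tensor2tensor.utils import registry
from tensor2tensor.utils import usr_dir

usr_dir.import_usr_dir("/path/to/usr_dir")   # registers the custom problem
problem = registry.problem("my_doc_classification")
hparams = tf.contrib.training.HParams(data_dir="/path/to/data_dir")
problem.get_hparams(hparams)                 # builds the feature encoders

request_fn = serving_utils.make_grpc_request_fn(
    servable_name="my_doc_classification", server="localhost:9000",
    timeout_secs=10)

# This is the call that fails with the AbortionError below.
outputs = serving_utils.predict(["some input text"], problem, request_fn)
print(outputs)
```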
Error logs:
Traceback (most recent call last):
File "/home//dev/git/prinvision/nets/test_doc_to_language.py", line 22, in
outputs = service.process(inputs)
File "/home//dev/git/prinvision/nets/public_service_api_v3.py", line 50, in process
outputs = serving_utils.predict([input_string], self.problem, self.request_fn)
File "/home//.local/lib/python2.7/site-packages/tensor2tensor/serving/serving_utils.py", line 118, in predict
predictions = request_fn(examples)
File "/home//.local/lib/python2.7/site-packages/tensor2tensor/serving/serving_utils.py", line 75, in _make_grpc_request
response = stub.Predict(request, timeout_secs)
File "/usr/local/lib/python2.7/dist-packages/grpc/beta/_client_adaptations.py", line 309, in call
self._request_serializer, self._response_deserializer)
File "/usr/local/lib/python2.7/dist-packages/grpc/beta/_client_adaptations.py", line 195, in _blocking_unary_unary
raise _abortion_error(rpc_error_call)
grpc.framework.interfaces.face.face.AbortionError: AbortionError(code=StatusCode.INVALID_ARGUMENT, details="Feature: targets (data type: int64) is required but could not be found.
[[Node: ParseSingleExample/ParseSingleExample = ParseSingleExample[Tdense=[DT_INT64, DT_INT64], dense_keys=["batch_prediction_key", "targets"], dense_shapes=[[1], [1]], num_sparse=1, sparse_keys=["inputs"], sparse_types=[DT_INT64]](arg0, ParseSingleExample/Reshape, ParseSingleExample/Const)]]
[[Node: DatasetToSingleElement = DatasetToSingleElement[output_shapes=[[?,1], [?,?,1,1], [?,1,1,1]], output_types=[DT_INT32, DT_INT32, DT_INT32], _device="/job:localhost/replica:0/task:0/device:CPU:0"]]")