Closed
Labels
:ml (Machine learning), >bug, Team:ML (Meta label for the ML team)
Description
Elasticsearch Version
8.9.0
Installed Plugins
No response
Java Version
bundled
OS Version
any
Problem Description
If the context text supplied to the question answering task is longer than the model's `max_sequence_length` and `truncate` is set to `none`, then inference on the model fails with the message:
`status_exception: question answering result has invalid dimension, expected 2 found [6]`
Steps to Reproduce
Run the question answering task with input text longer than the model's `max_sequence_length` and `truncate` set to `none`.
Workaround
The workaround is to truncate the context, which unfortunately may remove the part of the document containing the answer. For reference, the call is documented here:
```
POST _ml/trained_models/distilbert-base-cased-distilled-squad/deployment/_infer
{
  "docs": [{ "text_field": "question answering context" }],
  "inference_config": {
    "question_answering": {
      "question": "The question to ask",
      "tokenization": {
        "bert": {
          "truncate": "second",
          "span": -1
        }
      }
    }
  }
}
```
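If server-side truncation discards too much, the context can also be shortened client-side before it is sent in `text_field`. The sketch below is a rough illustration, not the model's actual tokenization: it approximates token count by whitespace splitting (real BERT wordpiece counts will be higher), so `max_tokens` should be set well below the model's `max_sequence_length` to leave headroom for the question and special tokens. The function name and default are hypothetical, not part of any API.

```python
def truncate_context(context: str, max_tokens: int = 384) -> str:
    """Crudely cap the context by whitespace-separated words.

    This only approximates BERT wordpiece tokenization, so keep
    max_tokens well under max_sequence_length (512 for
    distilbert-base-cased-distilled-squad).
    """
    words = context.split()
    if len(words) <= max_tokens:
        return context
    return " ".join(words[:max_tokens])
```

The truncated string would then be supplied as the `text_field` value in the `_infer` request above; like the `truncate: second` workaround, this can still drop the part of the document that contains the answer.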