Delete an inference endpoint (Generally available)

DELETE /_inference/{inference_id}
DELETE /_inference/{task_type}/{inference_id}

Path parameters

  • task_type string

    The type of inference task the endpoint performs, for example sparse_embedding. Used only with the /_inference/{task_type}/{inference_id} form of the path, as in the examples below.

  • inference_id string Required

    The inference identifier.

Query parameters

  • dry_run boolean

    When true, the endpoint is not deleted and a list of ingest processors which reference this endpoint is returned.

  • force boolean

    When true, the inference endpoint is forcefully deleted even if it is still being used by ingest processors or semantic text fields. Both query parameters are illustrated in the sketch after this list.
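As a sketch of how these two parameters are typically used together, the following Python example first checks which ingest pipelines still reference the endpoint, then force-deletes it. It assumes the Python client exposes dry_run and force as keyword arguments mirroring the query parameters above; verify this against your client version.

from elasticsearch import Elasticsearch

client = Elasticsearch("https://localhost:9200", api_key="...")  # placeholder connection details

# Dry run: nothing is deleted; the response lists the ingest
# pipelines that still reference the endpoint.
resp = client.inference.delete(
    task_type="sparse_embedding",
    inference_id="my-elser-model",
    dry_run=True,  # assumed keyword mapping of the dry_run query parameter
)
print(resp["pipelines"])

# Force the deletion even if the endpoint is still in use.
client.inference.delete(
    task_type="sparse_embedding",
    inference_id="my-elser-model",
    force=True,  # assumed keyword mapping of the force query parameter
)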

Responses

  • 200 application/json

    Acknowledged response. For dry_run, contains the list of pipelines that reference the inference endpoint.

    • acknowledged boolean Required

      For a successful response, this value is always true. On failure, an exception is returned instead.

    • pipelines array[string] Required

      For dry_run requests, the ingest pipelines that reference the inference endpoint; see the sample response below.
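For orientation, a successful dry_run response could look like the following sketch; the pipeline name my-ingest-pipeline is hypothetical:

{
  "acknowledged": true,
  "pipelines": [
    "my-ingest-pipeline"
  ]
}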
Example request

Console:

DELETE /_inference/sparse_embedding/my-elser-model

Python:

resp = client.inference.delete(
    task_type="sparse_embedding",
    inference_id="my-elser-model",
)

JavaScript:

const response = await client.inference.delete({
  task_type: "sparse_embedding",
  inference_id: "my-elser-model",
});

Ruby:

response = client.inference.delete(
  task_type: "sparse_embedding",
  inference_id: "my-elser-model"
)

PHP:

$resp = $client->inference()->delete([
    "task_type" => "sparse_embedding",
    "inference_id" => "my-elser-model",
]);

curl:

curl -X DELETE -H "Authorization: ApiKey $ELASTIC_API_KEY" "$ELASTICSEARCH_URL/_inference/sparse_embedding/my-elser-model"