Create a Mistral inference endpoint
Generally available; Added in 8.15.0
Path parameters
- `task_type`: The type of the inference task that the model will perform. Values are `text_embedding`, `completion`, or `chat_completion`.
- `mistral_inference_id`: The unique identifier of the inference endpoint.
PUT /_inference/{task_type}/{mistral_inference_id}
Console
PUT _inference/text_embedding/mistral-embeddings-test
{
  "service": "mistral",
  "service_settings": {
    "api_key": "Mistral-API-Key",
    "model": "mistral-embed"
  }
}

Python
resp = client.inference.put(
    task_type="text_embedding",
    inference_id="mistral-embeddings-test",
    inference_config={
        "service": "mistral",
        "service_settings": {
            "api_key": "Mistral-API-Key",
            "model": "mistral-embed"
        }
    },
)

JavaScript
const response = await client.inference.put({
  task_type: "text_embedding",
  inference_id: "mistral-embeddings-test",
  inference_config: {
    service: "mistral",
    service_settings: {
      api_key: "Mistral-API-Key",
      model: "mistral-embed",
    },
  },
});

Ruby
response = client.inference.put(
  task_type: "text_embedding",
  inference_id: "mistral-embeddings-test",
  body: {
    "service": "mistral",
    "service_settings": {
      "api_key": "Mistral-API-Key",
      "model": "mistral-embed"
    }
  }
)

PHP
$resp = $client->inference()->put([
    "task_type" => "text_embedding",
    "inference_id" => "mistral-embeddings-test",
    "body" => [
        "service" => "mistral",
        "service_settings" => [
            "api_key" => "Mistral-API-Key",
            "model" => "mistral-embed",
        ],
    ],
]);

curl
curl -X PUT \
  -H "Authorization: ApiKey $ELASTIC_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"service":"mistral","service_settings":{"api_key":"Mistral-API-Key","model":"mistral-embed"}}' \
  "$ELASTICSEARCH_URL/_inference/text_embedding/mistral-embeddings-test"
Request example
Run `PUT _inference/text_embedding/mistral-embeddings-test` to create a Mistral inference endpoint that performs a text embedding task.
{ "service": "mistral", "service_settings": { "api_key": "Mistral-API-Key", "model": "mistral-embed" } }