
Commit adf43c7

Docs(OCI Cohere): fix property prefix to spring.ai.oci.genai.cohere.chat (Fixes #4436)
Signed-off-by: Albert Attard <albertattard@gmail.com>
1 parent 5be4330 commit adf43c7

File tree (1 file changed, +15 -15 lines changed):

  • spring-ai-docs/src/main/antora/modules/ROOT/pages/api/chat/oci-genai

spring-ai-docs/src/main/antora/modules/ROOT/pages/api/chat/oci-genai/cohere-chat.adoc

Lines changed: 15 additions & 15 deletions
@@ -78,34 +78,34 @@ To disable, spring.ai.model.chat=none (or any value which doesn't match oci-gena
 This change is done to allow configuration of multiple models.
 ====
 
-The prefix `spring.ai.oci.genai.chat.cohere` is the property prefix that configures the `ChatModel` implementation for OCI GenAI Cohere Chat.
+The prefix `spring.ai.oci.genai.cohere.chat` is the property prefix that configures the `ChatModel` implementation for OCI GenAI Cohere Chat.
 
 [cols="3,5,1", stripes=even]
 |====
 | Property | Description | Default
 
 | spring.ai.model.chat | Enable OCI GenAI Cohere chat model. | oci-genai
-| spring.ai.oci.genai.chat.cohere.enabled (no longer valid) | Enable OCI GenAI Cohere chat model. | true
-| spring.ai.oci.genai.chat.cohere.options.model | Model OCID or endpoint | -
-| spring.ai.oci.genai.chat.cohere.options.compartment | Model compartment OCID. | -
-| spring.ai.oci.genai.chat.cohere.options.servingMode | The model serving mode to be used. May be `on-demand`, or `dedicated`. | on-demand
-| spring.ai.oci.genai.chat.cohere.options.preambleOverride | Override the chat model's prompt preamble | -
-| spring.ai.oci.genai.chat.cohere.options.temperature | Inference temperature | -
-| spring.ai.oci.genai.chat.cohere.options.topP | Top P parameter | -
-| spring.ai.oci.genai.chat.cohere.options.topK | Top K parameter | -
-| spring.ai.oci.genai.chat.cohere.options.frequencyPenalty | Higher values will reduce repeated tokens and outputs will be more random. | -
-| spring.ai.oci.genai.chat.cohere.options.presencePenalty | Higher values encourage generating outputs with tokens that haven't been used. | -
-| spring.ai.oci.genai.chat.cohere.options.stop | List of textual sequences that will end completions generation. | -
-| spring.ai.oci.genai.chat.cohere.options.documents | List of documents used in chat context. | -
+| spring.ai.oci.genai.cohere.chat.enabled (no longer valid) | Enable OCI GenAI Cohere chat model. | true
+| spring.ai.oci.genai.cohere.chat.options.model | Model OCID or endpoint | -
+| spring.ai.oci.genai.cohere.chat.options.compartment | Model compartment OCID. | -
+| spring.ai.oci.genai.cohere.chat.options.servingMode | The model serving mode to be used. May be `on-demand`, or `dedicated`. | on-demand
+| spring.ai.oci.genai.cohere.chat.options.preambleOverride | Override the chat model's prompt preamble | -
+| spring.ai.oci.genai.cohere.chat.options.temperature | Inference temperature | -
+| spring.ai.oci.genai.cohere.chat.options.topP | Top P parameter | -
+| spring.ai.oci.genai.cohere.chat.options.topK | Top K parameter | -
+| spring.ai.oci.genai.cohere.chat.options.frequencyPenalty | Higher values will reduce repeated tokens and outputs will be more random. | -
+| spring.ai.oci.genai.cohere.chat.options.presencePenalty | Higher values encourage generating outputs with tokens that haven't been used. | -
+| spring.ai.oci.genai.cohere.chat.options.stop | List of textual sequences that will end completions generation. | -
+| spring.ai.oci.genai.cohere.chat.options.documents | List of documents used in chat context. | -
 |====
 
-TIP: All properties prefixed with `spring.ai.oci.genai.chat.cohere.options` can be overridden at runtime by adding a request specific <<chat-options>> to the `Prompt` call.
+TIP: All properties prefixed with `spring.ai.oci.genai.cohere.chat.options` can be overridden at runtime by adding a request specific <<chat-options>> to the `Prompt` call.
 
 == Runtime Options [[chat-options]]
 
 The link:https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-oci-genai/src/main/java/org/springframework/ai/oci/cohere/OCICohereChatOptions.java[OCICohereChatOptions.java] provides model configurations, such as the model to use, the temperature, the frequency penalty, etc.
 
-On start-up, the default options can be configured with the `OCICohereChatModel(api, options)` constructor or the `spring.ai.oci.genai.chat.cohere.options.*` properties.
+On start-up, the default options can be configured with the `OCICohereChatModel(api, options)` constructor or the `spring.ai.oci.genai.cohere.chat.options.*` properties.
 
 At run-time you can override the default options by adding new, request specific, options to the `Prompt` call.
 For example to override the default model and temperature for a specific request:

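With this fix, configuration uses the corrected `spring.ai.oci.genai.cohere.chat` prefix. A minimal application.properties sketch built from the property table above (the OCID values are placeholders, not real identifiers):

    spring.ai.model.chat=oci-genai
    spring.ai.oci.genai.cohere.chat.options.model=ocid1.generativeaimodel.oc1..example
    spring.ai.oci.genai.cohere.chat.options.compartment=ocid1.compartment.oc1..example
    spring.ai.oci.genai.cohere.chat.options.servingMode=on-demand
    spring.ai.oci.genai.cohere.chat.options.temperature=0.7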
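The hunk ends before the file's actual runtime-override example. A minimal sketch of such a request-specific override, assuming `OCICohereChatOptions` exposes a builder with `model(...)` and `temperature(...)` setters (method names are not confirmed by this commit) and using a placeholder model OCID:

import org.springframework.ai.chat.model.ChatModel;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.oci.cohere.OCICohereChatOptions;

class RuntimeOverrideSketch {

    // Request-specific options take precedence over the
    // spring.ai.oci.genai.cohere.chat.options.* defaults for this call only.
    ChatResponse callWithOverrides(ChatModel chatModel) {
        OCICohereChatOptions options = OCICohereChatOptions.builder()
                // Placeholder model OCID; builder method names are assumed, not taken from this commit.
                .model("ocid1.generativeaimodel.oc1..example")
                .temperature(0.4)
                .build();

        return chatModel.call(new Prompt("Generate the names of 5 famous pirates.", options));
    }
}

Options passed on the `Prompt` apply only to that request; the `spring.ai.oci.genai.cohere.chat.options.*` defaults continue to apply to all other calls.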