Chat API
The Chat API provides methods to interact with OpenAI's chat completion endpoints.
```ruby
client = ChatGPT::Client.new

response = client.chat([
  { role: "user", content: "Hello!" }
])
```
Messages should be formatted as an array of hashes with `role` and `content` keys:
```ruby
messages = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "What is Ruby?" },
  { role: "assistant", content: "Ruby is a programming language..." },
  { role: "user", content: "Tell me more!" }
]
```
- `system`: Set behavior and context
- `user`: User messages
- `assistant`: AI responses
- `function`: Function call responses (future implementation)
```ruby
client.chat(messages, {
  model: 'gpt-3.5-turbo', # Optional, defaults to configured value
  temperature: 0.7,       # Optional (0.0 to 1.0)
  max_tokens: 100,        # Optional
  top_p: 1.0,             # Optional
  n: 1                    # Optional, number of responses
})
```
{ "choices" => [{ "message" => { "role" => "assistant", "content" => "Response content here" }, "finish_reason" => "stop", "index" => 0 }], "usage" => { "prompt_tokens" => 10, "completion_tokens" => 20, "total_tokens" => 30 } }
```ruby
begin
  response = client.chat(messages)
rescue ChatGPT::AuthenticationError => e
  # Handle authentication errors
rescue ChatGPT::RateLimitError => e
  # Handle rate limit errors
rescue ChatGPT::APIError => e
  # Handle other API errors
end
```
- Message Context
```ruby
# Keep conversation context
messages = []
messages << { role: "user", content: "Hello!" }

response = client.chat(messages)
messages << response["choices"][0]["message"]
```
- System Messages
messages = [ { role: "system", content: "You are a helpful assistant. Be concise." } ]
- Error Recovery
```ruby
def chat_with_retry(messages, max_attempts = 3)
  attempts = 0
  begin
    attempts += 1
    client.chat(messages)
  rescue ChatGPT::RateLimitError => e
    retry if attempts < max_attempts
    raise
  end
end
```
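As a rough usage sketch (assuming `client` resolves inside the method body, for example as a helper method or instance variable returning the configured `ChatGPT::Client`):

```ruby
messages = [{ role: "user", content: "Hello!" }]

# Retries up to three times on rate limiting, then re-raises the error
response = chat_with_retry(messages)
puts response["choices"][0]["message"]["content"]
```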
```ruby
response = client.chat([
  { role: "user", content: "Hello!" }
])

puts response["choices"][0]["message"]["content"]
```
messages = [ { role: "system", content: "You are a helpful assistant." } ] # First interaction messages << { role: "user", content: "What is Ruby?" } response = client.chat(messages) messages << response["choices"][0]["message"] # Follow-up messages << { role: "user", content: "Tell me more!" } response = client.chat(messages)
```ruby
response = client.chat(
  messages,
  {
    temperature: 0.7,
    max_tokens: 150
  }
)
```
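The `n` option shown earlier requests multiple completions; each one comes back as its own entry in `choices`. A hypothetical sketch, reusing only fields already shown above:

```ruby
response = client.chat(
  messages,
  { n: 2, temperature: 0.9 }
)

# Each requested completion is a separate choice in the response
response["choices"].each_with_index do |choice, i|
  puts "Option #{i + 1}: #{choice['message']['content']}"
end
```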