One beautiful Ruby API for GPT, Claude, Gemini, and more. Easily build chatbots, AI agents, RAG applications, and content generators.





Battle tested at Claude Code for your documents

Using RubyLLM in production? Share your story! Takes 5 minutes.


Why RubyLLM?

Every AI provider ships its own bloated client. Different APIs. Different response formats. Different conventions. It’s exhausting.

RubyLLM gives you one beautiful API for all of them. Same interface whether you’re using GPT, Claude, or your local Ollama. Just three dependencies: Faraday, Zeitwerk, and Marcel. That’s it.

Show me the code

```ruby
# Just ask questions
chat = RubyLLM.chat
chat.ask "What's the best way to learn Ruby?"

# Analyze any file type
chat.ask "What's in this image?", with: "ruby_conf.jpg"
chat.ask "What's happening in this video?", with: "video.mp4"
chat.ask "Describe this meeting", with: "meeting.wav"
chat.ask "Summarize this document", with: "contract.pdf"
chat.ask "Explain this code", with: "app.rb"

# Multiple files at once
chat.ask "Analyze these files", with: ["diagram.png", "report.pdf", "notes.txt"]

# Stream responses
chat.ask "Tell me a story about Ruby" do |chunk|
  print chunk.content
end

# Generate images
RubyLLM.paint "a sunset over mountains in watercolor style"

# Create embeddings
RubyLLM.embed "Ruby is elegant and expressive"

# Moderate content for safety
RubyLLM.moderate("Check if this text is safe").flagged? # => false

# Let AI use your code
class Weather < RubyLLM::Tool
  description "Get current weather"
  param :latitude
  param :longitude

  def execute(latitude:, longitude:)
    url = "https://api.open-meteo.com/v1/forecast?latitude=#{latitude}&longitude=#{longitude}&current=temperature_2m,wind_speed_10m"
    JSON.parse(Faraday.get(url).body)
  end
end

chat.with_tool(Weather).ask "What's the weather in Berlin?"

# Get structured output
class ProductSchema < RubyLLM::Schema
  string :name
  number :price
  array :features do
    string
  end
end

response = chat.with_schema(ProductSchema).ask "Analyze this product", with: "product.txt"
```

Features

  • Chat: Conversational AI with RubyLLM.chat
  • Vision: Analyze images and screenshots
  • Audio: Transcribe and understand speech
  • Documents: Extract text and data from PDFs, CSVs, JSON, and other file types
  • Image generation: Create images with RubyLLM.paint
  • Embeddings: Vector search with RubyLLM.embed
  • Moderation: Content safety with RubyLLM.moderate
  • Tools: Let AI call your Ruby methods
  • Structured output: JSON schemas that just work
  • Streaming: Real-time responses with blocks
  • Rails: ActiveRecord integration with acts_as_chat
  • Async: Fiber-based concurrency
  • Model registry: 500+ models with capability detection and pricing
  • Providers: OpenAI, Anthropic, Gemini, VertexAI, Bedrock, DeepSeek, Mistral, Ollama, OpenRouter, Perplexity, GPUStack, and any OpenAI-compatible API

Installation

Add to your Gemfile:

```ruby
gem 'ruby_llm'
```

Then run bundle install.

Configure your API keys:

```ruby
# config/initializers/ruby_llm.rb
RubyLLM.configure do |config|
  config.openai_api_key = ENV['OPENAI_API_KEY']
end
```
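You can hold keys for several providers at once; requests are routed by the model you pick. A sketch, assuming the other key attributes follow the same openai_api_key naming pattern and that a default_model setting exists (both are assumptions about the config API):

```ruby
# config/initializers/ruby_llm.rb
RubyLLM.configure do |config|
  config.openai_api_key    = ENV['OPENAI_API_KEY']
  config.anthropic_api_key = ENV['ANTHROPIC_API_KEY']
  config.gemini_api_key    = ENV['GEMINI_API_KEY']

  # Assumed setting: the model used when RubyLLM.chat is called without one.
  config.default_model = 'claude-sonnet-4'
end
```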

Rails

```bash
# Install database models
rails generate ruby_llm:install

# Add chat UI (optional)
rails generate ruby_llm:chat_ui
```
```ruby
class Chat < ApplicationRecord
  acts_as_chat
end

chat = Chat.create! model: "claude-sonnet-4"
chat.ask "What's in this file?", with: "report.pdf"
```

Visit http://localhost:3000/chats for a ready-to-use chat interface!


Brought to you by Carmine Paolino, maker of — Claude Code for your documents