Multi-Model AI Agent

A Python AI agent built with LangChain that supports multiple model providers including Anthropic Claude, OpenAI GPT, and Ollama models. The agent provides a unified interface for tool calling across different model providers.

Features

  • Multi-provider support: Anthropic, OpenAI-compatible APIs, and Ollama
  • Unified tool interface: Same tool definitions work across all model providers
  • Predefined presets: Quick setup with common model configurations
  • Flexible configuration: Custom model parameters and endpoints
  • Interactive CLI: Real-time conversation with tool calling capabilities

Installation

  1. Install dependencies:

```shell
pip install -r requirements.txt
```

  2. Set up API keys (as needed):

```shell
export ANTHROPIC_API_KEY="your-anthropic-key"
export OPENAI_API_KEY="your-openai-key"
```

Usage

Using Predefined Presets

The easiest way to get started is with predefined model presets:

```shell
# Anthropic Claude models
python main.py --preset claude-3.5-sonnet
python main.py --preset claude-3-haiku

# OpenAI models
python main.py --preset gpt-4-turbo
python main.py --preset gpt-3.5-turbo

# Ollama models (requires Ollama running locally)
python main.py --preset llama3.1-8b
python main.py --preset llama3.1-70b
python main.py --preset mistral-7b
python main.py --preset codellama-7b
```

Custom Configuration

For more control, use custom provider configurations:

```shell
# Anthropic with custom parameters
python main.py --provider anthropic --model claude-3-5-sonnet-20241022 --temperature 0.7

# OpenAI with custom API key
python main.py --provider openai --model gpt-4-turbo-preview --api-key your-key

# Custom OpenAI-compatible endpoint
python main.py --provider openai --model custom-model --base-url https://api.custom.com/v1

# Ollama with custom endpoint
python main.py --provider ollama --model llama3.1:8b --base-url http://localhost:11434

# With additional parameters
python main.py --provider anthropic --model claude-3-5-sonnet-20241022 --temperature 0.7 --max-tokens 2048

# Disable SSL verification (useful for self-signed certificates)
python main.py --preset llama3.1-8b --no-verify-ssl
python main.py --provider ollama --model llama3.1:8b --base-url https://my-ollama:11434 --no-verify-ssl
```

CLI Parameters

Model Selection (exactly one required)

  • --preset: Use a predefined model preset (mutually exclusive with --provider)
  • --provider: Model provider (anthropic, openai, or ollama); requires --model

Optional Parameters

  • --model: Model name (required when using --provider)
  • --api-key: API key for the model provider
  • --base-url: Base URL for the API (useful for custom OpenAI endpoints or Ollama)
  • --temperature: Temperature for text generation (default: 0.0)
  • --max-tokens: Maximum tokens in response (default: 1024)
  • --no-verify-ssl: Disable SSL certificate verification (useful for self-signed certificates)
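The flag behavior above (one of `--preset`/`--provider` required, `--model` mandatory with `--provider`) can be sketched with `argparse`. This is a hypothetical reconstruction for illustration; the actual parser in `main.py` may be organized differently:

```python
import argparse


def parse_args(argv=None) -> argparse.Namespace:
    """Parse CLI flags matching the README's documented interface."""
    parser = argparse.ArgumentParser(description="Multi-Model AI Agent")

    # Exactly one way to pick a model: a preset or an explicit provider.
    group = parser.add_mutually_exclusive_group(required=True)
    group.add_argument("--preset", help="Predefined model preset")
    group.add_argument("--provider", choices=["anthropic", "openai", "ollama"])

    parser.add_argument("--model", help="Model name (required with --provider)")
    parser.add_argument("--api-key", help="API key for the model provider")
    parser.add_argument("--base-url", help="Base URL for the API")
    parser.add_argument("--temperature", type=float, default=0.0)
    parser.add_argument("--max-tokens", type=int, default=1024)
    parser.add_argument("--no-verify-ssl", action="store_true")

    args = parser.parse_args(argv)
    if args.provider and not args.model:
        parser.error("--model is required when using --provider")
    return args
```

`add_mutually_exclusive_group(required=True)` makes `argparse` itself reject invocations that pass both flags or neither.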

Environment Variables

  • ANTHROPIC_API_KEY: Your Anthropic API key
  • OPENAI_API_KEY: Your OpenAI API key
  • OPENAI_BASE_URL: Custom OpenAI-compatible endpoint (optional)
  • OLLAMA_BASE_URL: Ollama server URL (default: http://localhost:11434)

Model Provider Setup

Anthropic Claude

  1. Get an API key from Anthropic Console
  2. Set the environment variable: export ANTHROPIC_API_KEY="your-key"
  3. Available models: claude-3-5-sonnet-20241022, claude-3-haiku-20240307, etc.

OpenAI

  1. Get an API key from OpenAI Platform
  2. Set the environment variable: export OPENAI_API_KEY="your-key"
  3. Available models: gpt-4-turbo-preview, gpt-3.5-turbo, etc.

Ollama

  1. Install Ollama from ollama.com
  2. Start the Ollama server: ollama serve
  3. Pull a model: ollama pull llama3.1:8b
  4. Available models: Any model supported by your Ollama installation
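Before launching the agent against Ollama, you can check that the server is reachable. The sketch below probes Ollama's `/api/tags` endpoint (which lists locally pulled models); it is a convenience check, not part of this repository:

```python
import json
import urllib.request


def ollama_is_up(base_url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers at base_url."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            json.load(resp)  # response lists locally pulled models
            return True
    except OSError:
        # Connection refused, DNS failure, timeout, etc.
        return False
```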

Available Tools

The agent comes with these built-in tools:

  • read_file: Read the contents of a file
  • list_files: List files and directories in a path

Adding Custom Tools

To add custom tools, implement the ToolDefinition interface:

```python
from agent import ToolDefinition


class MyCustomTool(ToolDefinition):
    def name(self) -> str:
        return "my_tool"

    def description(self) -> str:
        return "Description of what my tool does"

    def input_schema(self) -> dict:
        return {
            "type": "object",
            "properties": {
                "param": {"type": "string", "description": "Parameter description"}
            },
            "required": ["param"],
        }

    def execute(self, **kwargs) -> str:
        # Tool implementation
        return "Tool result"
```

Then add it to your agent:

```python
from agent import MultiModelAgent, ModelFactory, ModelPresets, ModelProvider
from tools import ReadFileTool, ListFilesTool

# Create model
model = ModelFactory.create_model(ModelPresets.CLAUDE_3_5_SONNET)

# Add tools including your custom tool
tools = [ReadFileTool(), ListFilesTool(), MyCustomTool()]

# Create agent
agent = MultiModelAgent(model, ModelProvider.ANTHROPIC, get_user_message, tools)
```

Examples

Simple file operations:

```
You: List the files in the current directory
Agent: [uses list_files tool]

You: Read the contents of main.py
Agent: [uses read_file tool]
```

Code analysis:

```
You: Analyze the structure of this Python project
Agent: [uses list_files to explore, then read_file to examine key files]
```

Architecture

  • ModelFactory: Creates model instances for different providers
  • MultiModelAgent: Main agent class handling conversation and tool execution
  • ToolDefinition: Interface for implementing custom tools
  • Provider-specific handling: Different tool calling formats for each provider
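To illustrate what "provider-specific handling" can involve: Anthropic and OpenAI emit tool calls in different shapes, so an agent typically normalizes them before dispatching to a tool. The field names below follow the public Anthropic and OpenAI APIs; the repository's internal representation may differ:

```python
import json


def normalize_tool_call(provider: str, raw: dict) -> dict:
    """Map a provider-specific tool-call payload to {name, arguments}."""
    if provider == "anthropic":
        # Anthropic Messages API content block:
        # {"type": "tool_use", "name": ..., "input": {...}}
        return {"name": raw["name"], "arguments": raw["input"]}
    if provider == "openai":
        # OpenAI chat completions tool call:
        # {"function": {"name": ..., "arguments": "<JSON string>"}}
        fn = raw["function"]
        return {"name": fn["name"], "arguments": json.loads(fn["arguments"])}
    raise ValueError(f"unknown provider: {provider}")
```

Note the asymmetry this hides: Anthropic delivers arguments as a parsed object, while OpenAI delivers them as a JSON string that must be decoded.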

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Add your changes with tests
  4. Submit a pull request

License

MIT License - see LICENSE file for details.
