A simple, easy-to-use API server that sits in front of your local Ollama instance to add an extra layer of security when making requests to Ollama.

- Features
- Example Flow
- Quick Start
- Configuration
- API Endpoints
- Authentication
- API Key Authentication
- CORS Support
- Connection Pooling
- Streaming Support
- Easy Docker Setup
- All Ollama API Endpoints Supported
## Example Flow

Example of connecting a local Ollama instance to an Open WebUI Docker container on the same Docker network:

```mermaid
flowchart TD
    user([External User]) --> webui[Open WebUI]
    webui -->|Request with API Key| api[Ollama API Server]
    api --> auth{API Key Valid?}
    auth -->|No| reject[Reject Connection]
    auth -->|Yes| ollama[Ollama LLM Service]
    ollama -->|Response| api
    api -->|Response| webui
    webui -->|Response| user

    subgraph "Docker: ollama-network"
        webui
        api
        auth
        ollama
    end

    classDef green fill:#d1e7dd,stroke:#0f5132,stroke-width:1px,color:#0f5132;
    classDef blue fill:#cfe2ff,stroke:#084298,stroke-width:1px,color:#084298;
    classDef red fill:#f8d7da,stroke:#842029,stroke-width:1px,color:#842029;
    classDef yellow fill:#fff3cd,stroke:#664d03,stroke-width:1px,color:#664d03;
    classDef gray fill:#f8f9fa,stroke:#343a40,stroke-width:1px,color:#343a40;

    class user gray
    class webui blue
    class api blue
    class auth yellow
    class ollama green
    class reject red
```
## Quick Start

The official Docker image is available on Docker Hub and GitHub Container Registry:
```bash
# Docker Hub
docker pull gitmotion/ollama-api-server:latest

# GitHub Container Registry
docker pull ghcr.io/gitmotion/ollama-api-server:latest
```
```bash
docker run -d \
  --name ollama-api-server \
  --restart unless-stopped \
  -p 7777:7777 \
  -e PORT=7777 \
  -e OLLAMA_BASE_URL=http://internal-ip-where-ollama-installed:11434 \
  -e CORS_ORIGIN=* \
  -e API_KEYS=default-key-1,default-key-2 \
  gitmotion/ollama-api-server:latest
```
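Once the container is up, a quick way to confirm the proxy is reachable and enforcing authentication is to hit it from the host. This is a minimal check, assuming you kept the default port and `default-key-1` from the command above:

```bash
# Without a key, the request should be rejected by the proxy
curl -i http://localhost:7777/api/tags

# With one of the default keys, the model list should come back
curl -H "x-api-key: default-key-1" http://localhost:7777/api/tags
```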
The server uses the following `docker-compose.yml` configuration:
```yaml
services:
  api:
    image: gitmotion/ollama-api-server:latest
    container_name: ollama-api-server
    restart: unless-stopped
    ports:
      - "${PORT_EXTERNAL:-7777}:7777"
    environment:
      - PORT=7777
      - OLLAMA_BASE_URL=http://internal-ip-where-ollama-installed:11434 # must serve your ollama server with 0.0.0.0
      - CORS_ORIGIN=*
      - API_KEYS=${API_KEYS:-default-key-1,default-key-2}
```
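With the snippet above saved as `docker-compose.yml` (and, optionally, a `.env` next to it for `PORT_EXTERNAL` and `API_KEYS`), starting the service is the standard Compose workflow:

```bash
docker compose up -d

# Optional: follow the logs to confirm the server started
docker logs -f ollama-api-server
```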
For the Open WebUI example shown in the flow above, a combined compose file puts both containers on the same Docker network:

```yaml
services:
  ollama-api-server:
    image: gitmotion/ollama-api-server:latest
    container_name: ollama-api-server
    restart: unless-stopped
    ports:
      - "${PORT_EXTERNAL:-7777}:7777"
    environment:
      - PORT=7777
      - OLLAMA_BASE_URL=http://internal-ip-where-ollama-installed:11434 # must serve your ollama server with 0.0.0.0
      - CORS_ORIGIN=*
      - API_KEYS=${API_KEYS:-secure-api-key-1,secure-api-key-2} # UPDATE THESE KEYS - comma separated
    networks:
      - ollama-network

  open-webui:
    image: openwebui/open-webui:latest
    container_name: open-webui
    restart: unless-stopped
    depends_on:
      - ollama-api-server
    ports:
      - "3000:3000"
    environment:
      - OLLAMA_BASE_URL=http://ollama-api-server:7777 # Configure the api key via UI
      - WEBUI_SECRET_KEY=${WEBUI_SECRET_KEY}
    volumes:
      - ./open-webui-data:/app/backend/data
    networks:
      - ollama-network
      - your-external-reverse-proxy

networks:
  ollama-network:
    driver: bridge
  your-external-reverse-proxy:
    external: true
```
This configuration:
- Uses the official Docker image
- Maps the container's port 7777 to your host's port (configurable via PORT_EXTERNAL)
- Sets up the required environment variables
- Provides default API keys if none are specified
You can customize the configuration (an example `.env` is sketched after this list) by:
- Changing the external port (PORT_EXTERNAL in .env)
- Setting your API keys (API_KEYS in .env)
- Modifying the Ollama base URL if needed
- Adjusting CORS settings for your environment
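A minimal `.env` to go with the compose files above might look like the following; the variable names come from the snippets in this README, and the values are placeholders you should replace:

```bash
# .env - placeholder values, replace with your own
PORT_EXTERNAL=7777
API_KEYS=your-long-random-key-1,your-long-random-key-2

# Only needed if you also run the Open WebUI example
WEBUI_SECRET_KEY=your-random-secret
```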
To run the server from source instead of using Docker:

- Clone the repository:

  ```bash
  git clone https://github.com/gitmotion/ollama-api-server.git
  cd ollama-api-server
  ```

- Install dependencies:

  ```bash
  npm install
  ```

- Create and configure the `.env` file:

  ```bash
  cp .env.example .env
  # Edit .env with your settings
  ```

- Build and start the server:

  ```bash
  npm run build
  npm start
  ```
## Configuration

The server can be configured using environment variables:

| Variable | Description | Default |
| --- | --- | --- |
| `PORT` | Server port | `7777` |
| `OLLAMA_BASE_URL` | URL of your Ollama instance | `http://localhost:11434` |
| `CORS_ORIGIN` | CORS origin setting | `*` |
| `API_KEYS` | Comma-separated list of valid API keys | `default-key-1,default-key-2` |
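As an example of tightening these settings, the run below restricts `CORS_ORIGIN` to a single origin instead of `*` and supplies a custom key; the origin and key shown are placeholders:

```bash
docker run -d \
  --name ollama-api-server \
  -p 7777:7777 \
  -e OLLAMA_BASE_URL=http://internal-ip-where-ollama-installed:11434 \
  -e CORS_ORIGIN=https://webui.example.com \
  -e API_KEYS=your-secure-key \
  gitmotion/ollama-api-server:latest
```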
## API Endpoints

All Ollama API endpoints are supported with authentication (a worked `/api/chat` example follows the list):

- `POST /api/chat` - Chat completion
- `POST /api/generate` - Text generation
- `POST /api/embeddings` - Generate embeddings
- `GET /api/tags` - List available models
- `POST /api/show` - Show model details
- `POST /api/pull` - Pull a model
- `DELETE /api/delete` - Delete a model
- `POST /api/copy` - Copy a model
- `GET /api/version` - Get Ollama version
- `GET /health` - Health check endpoint
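For instance, an authenticated chat completion through the proxy could look like the sketch below. The model name `llama3` is just an example and assumes that model is already available in your Ollama instance:

```bash
curl -X POST http://localhost:7777/api/chat \
  -H "Content-Type: application/json" \
  -H "x-api-key: your-api-key" \
  -d '{
    "model": "llama3",
    "messages": [
      { "role": "user", "content": "In one sentence, what does an API gateway do?" }
    ],
    "stream": false
  }'
```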
## Authentication

Include your API key in requests using one of these methods (a streaming example follows the list):

- **X-API-Key header:**

  ```bash
  curl -H "x-api-key: your-api-key" http://localhost:7777/api/tags
  ```

- **Authorization header:**

  ```bash
  curl -H "Authorization: Bearer your-api-key" http://localhost:7777/api/tags
  ```

- **Request body:**

  ```bash
  curl -X POST -H "Content-Type: application/json" \
    -d '{"apiKey": "your-api-key"}' \
    http://localhost:7777/api/tags
  ```
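Streaming responses are proxied through as well, so the same keys work for streamed output. A minimal sketch against `/api/generate`, again assuming a locally available `llama3` model:

```bash
curl -N -X POST http://localhost:7777/api/generate \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-api-key" \
  -d '{
    "model": "llama3",
    "prompt": "Write a haiku about API gateways.",
    "stream": true
  }'
```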
Made with ❤️ by gitmotion