TensorZero is an open-source stack for industrial-grade LLM applications. It unifies an LLM gateway, observability, optimization, evaluation, and experimentation.
⚙️🦀 Build modular and scalable LLM applications in Rust
Ship agents faster. Arch is delivery infrastructure for agentic apps — a models-native proxy server & dataplane that offloads the plumbing work, so you stay focused on product logic.
Bionic is an on-premise replacement for ChatGPT, offering the advantages of generative AI while maintaining strict data confidentiality.
AICI: Prompts as (Wasm) Programs
Scalable, fast, and disk-friendly vector search in Postgres, the successor of pgvecto.rs.
Open-source LLM load balancer and serving platform for self-hosting LLMs at scale 🏓🦙
Govern & Secure your AI
High-scale LLM gateway, written in Rust. OpenTelemetry-based observability included
Simple, Composable, High-Performance, Safe and Web3 Friendly AI Agents and LazAI Gateway for Everyone
🧬 The adaptive model routing system for exploration and exploitation.
🧭🧭 An intelligent load balancer with smart scheduling that unifies diverse LLMs.
Robot VLM and VLA (Vision-Language-Action) inference API helping you manage multimodal prompts, RAG, and location metadata
Burgonet Gateway is an enterprise LLM gateway that provides secure access and compliance controls for AI systems
A lazy, high-throughput, and blazing-fast structured text generation backend.
[WIP] Sorai is a lightweight, high-performance, and open-source LLM proxy gateway.