🪢 Open source LLM engineering platform: LLM Observability, metrics, evals, prompt management, playground, datasets. Integrates with OpenTelemetry, Langchain, OpenAI SDK, LiteLLM, and more. 🍊YC W23
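Since the platform advertises OpenTelemetry and OpenAI SDK integrations, here is a minimal sketch of what such instrumentation can look like. It uses only the `@opentelemetry/api` and `openai` packages; the attribute names, and the assumption that an OTLP exporter pointing at a Langfuse-compatible endpoint is registered elsewhere, are illustrative rather than a prescribed setup.

```typescript
// Sketch: wrap an OpenAI SDK call in an OpenTelemetry span so an OTLP-capable
// backend (such as Langfuse) can ingest it. Assumes an OpenTelemetry SDK with
// an OTLP exporter is registered elsewhere; attribute names are illustrative.
import { trace, SpanStatusCode } from "@opentelemetry/api";
import OpenAI from "openai";

const tracer = trace.getTracer("llm-app");
const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

export async function answer(question: string): Promise<string> {
  return tracer.startActiveSpan("chat-completion", async (span) => {
    try {
      span.setAttribute("llm.model", "gpt-4o-mini");
      span.setAttribute("llm.prompt", question);

      const res = await openai.chat.completions.create({
        model: "gpt-4o-mini",
        messages: [{ role: "user", content: question }],
      });

      const text = res.choices[0].message.content ?? "";
      span.setAttribute("llm.completion", text);
      return text;
    } catch (err) {
      span.setStatus({ code: SpanStatusCode.ERROR });
      throw err;
    } finally {
      span.end();
    }
  });
}
```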
Next-generation AI agent optimization platform: Cozeloop addresses the challenges of AI agent development with full-lifecycle management, from development and debugging through evaluation and monitoring.
🧊 Open source LLM observability platform. One line of code to monitor, evaluate, and experiment. YC W23 🍓
The open-source LLMOps platform: prompt playground, prompt management, LLM evaluation, and LLM observability all in one place.
🕹️ Open-source, developer-first LLMOps platform designed to streamline prompt design, version management, instant delivery, collaboration, troubleshooting, observability and more.
A playground of highly experimental prompts, Jinja2 templates & scripts for machine intelligence models from OpenAI, Anthropic, DeepSeek, Meta, Mistral, Google, xAI & others. Alex Bilzerian (2022-2025).
Markdown for the AI era
Prompt & Conversation Management Middleware for Conversational AI APIs such as OpenAI ChatGPT, Hugging Face, Anthropic Claude, Google Gemini, Ollama, and Jlama. Lean, RESTful, scalable, and cloud-native. Developed in Java, powered by Quarkus, shipped with Docker, and orchestrated with Kubernetes or OpenShift.
Open-source versioning, tracing, and annotation tooling.
An Automated AI-Powered Prompt Optimization Framework
Model Context Protocol (MCP) Server for Langfuse Prompt Management. This server allows you to access and manage your Langfuse prompts through the Model Context Protocol.
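To illustrate the consumer side, a hedged sketch of fetching a prompt from such a server over stdio with the official MCP TypeScript SDK follows. The server command, prompt name, and argument are assumptions for illustration; the actual prompt names come from your Langfuse project.

```typescript
// Sketch: an MCP client fetching a prompt over stdio via @modelcontextprotocol/sdk.
// The server command, prompt name, and argument below are assumptions.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  const transport = new StdioClientTransport({
    command: "node",
    args: ["path/to/langfuse-mcp-server.js"], // hypothetical server entry point
  });

  const client = new Client(
    { name: "prompt-consumer", version: "1.0.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // List the prompts the server exposes, then fetch one with arguments.
  const { prompts } = await client.listPrompts();
  console.log(prompts.map((p) => p.name));

  const prompt = await client.getPrompt({
    name: "summarize-ticket",         // hypothetical prompt name
    arguments: { ticketId: "T-123" }, // hypothetical argument
  });
  console.log(prompt.messages);
}

main();
```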
🪢 Langfuse documentation: Langfuse is the open source LLM Engineering Platform. Observability, evals, prompt management, playground, and metrics to debug and improve LLM apps
Modular, open source LLMOps stack that separates concerns: LiteLLM unifies LLM APIs, manages routing and cost controls, and ensures high availability, while Langfuse focuses on detailed observability, prompt versioning, and performance evaluations.
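A minimal sketch of the application side of that split: the app talks only to LiteLLM's OpenAI-compatible proxy endpoint, while the proxy (configured separately) handles routing, cost controls, and forwarding traces to Langfuse. The base URL, key variable, and model alias below are placeholders, not a prescribed configuration.

```typescript
// Sketch: the application only sees the LiteLLM proxy's OpenAI-compatible API;
// routing, cost controls, and Langfuse callbacks live in the proxy config.
// The base URL, env var, and model alias are placeholders.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:4000", // LiteLLM proxy, not the upstream provider
  apiKey: process.env.LITELLM_PROXY_KEY ?? "sk-placeholder",
});

async function main() {
  const res = await client.chat.completions.create({
    model: "gpt-4o-mini", // resolved by the proxy's model list
    messages: [{ role: "user", content: "Summarize our release notes." }],
  });
  console.log(res.choices[0].message.content);
}

main();
```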
MCP prompt template server: hot-reload, thinking frameworks, quality gates
Open source LLM proxy that transparently captures and logs all interactions with LLM APIs
Prompt branching and version management tool
These guides are designed to help teams and individuals leverage AI tools like GitHub Copilot, OpenAI, and Claude to build software projects efficiently and effectively.
PromptRose 🌹 is your AI prompt companion, blooming at your fingertips.
Managed Prompt Engineering
An easy-to-use structured prompt builder for LLMs in TypeScript.
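As an illustration of the pattern only (not this package's actual API), a tiny structured prompt builder might look like the following, with hypothetical class and method names:

```typescript
// Illustrative only: a minimal structured prompt builder of this general kind.
// Class and method names are hypothetical, not the package's real API.
type Role = "system" | "user" | "assistant";
interface Message { role: Role; content: string; }

class PromptBuilder {
  private messages: Message[] = [];
  private vars: Record<string, string> = {};

  system(content: string): this { this.messages.push({ role: "system", content }); return this; }
  user(content: string): this { this.messages.push({ role: "user", content }); return this; }
  set(name: string, value: string): this { this.vars[name] = value; return this; }

  // Substitute {{placeholders}} and return OpenAI-style chat messages.
  build(): Message[] {
    return this.messages.map((m) => ({
      ...m,
      content: m.content.replace(/\{\{(\w+)\}\}/g, (_match: string, key: string) => this.vars[key] ?? ""),
    }));
  }
}

const messages = new PromptBuilder()
  .system("You are a helpful {{domain}} assistant.")
  .user("Explain {{topic}} in two sentences.")
  .set("domain", "DevOps")
  .set("topic", "blue-green deployments")
  .build();

console.log(messages);
```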