Next-generation AI Agent Optimization Platform: Cozeloop addresses challenges in AI agent development by providing full-lifecycle management capabilities from development, debugging, and evaluation to monitoring.
Updated Oct 17, 2025 · Go
🏕️ Reproducible development environment
Declarative Intent Driven Platform Orchestrator for Internal Developer Platform (IDP).
Self-hosted MCP Gateway and Registry for AI agents
🕵️‍♂️ Library designed for developers eager to explore the potential of Large Language Models (LLMs) and other generative AI through a clean, effective, and Go-idiomatic approach.
Autoscale LLM inference (vLLM, SGLang, LMDeploy) on Kubernetes and other platforms
Finetune LLMs on K8s by using Runbooks
A diverse, simple, and secure all-in-one LLMOps platform
🧯 Kubernetes coverage for fault awareness and recovery, works for any LLMOps, MLOps, AI workloads.
Sister project to OpenLLMetry, but in Go. Open-source observability for your LLM application, based on OpenTelemetry
The reliability layer between your code and LLM providers.
Focused on intelligent operations (AIOps) and automation across many areas, including Zabbix, Prometheus, Grafana, Nagios, the ELK Stack (Elasticsearch, Logstash, Kibana), Graylog, Ansible, SaltStack, Puppet, Chef, Terraform, Docker, Kubernetes, OpenShift, Jenkins, MySQL, PostgreSQL, MariaDB, Redis, MongoDB, InfluxDB, Ceph, MinIO, RabbitMQ, Kafka, NATS, Apache Pulsar, Nginx, Apache HTTP Server, HAProxy, Traefik, Caddy, OpenStack, OpenLDAP, FreeRDP, and more.
Aegis is an LLM-powered AI cluster autonomous operations system, focused on intelligent capabilities such as Fault Diagnosis, Self-healing, Root Cause Analysis, Cluster Inspection, and Alert Optimization.
XScopeHub (Observability Suite): Bridges exporters, OpenTelemetry, and OpenObserve with ETL pipelines for metrics, logs, and traces.