Unified Go interface for Language Model (LLM) providers. Simplifies LLM integration with flexible prompt management and common task functions.
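The core idea behind such a library is a provider-agnostic interface that hides each vendor's SDK behind one set of methods. A minimal sketch of what that could look like in Go (the names Provider, Option, and the functional options are illustrative assumptions, not this project's actual API):

```go
package llm

import "context"

// Provider is a hypothetical provider-agnostic interface; it only
// illustrates the general pattern of unifying different LLM backends.
type Provider interface {
	// Generate sends a prompt to the underlying model and returns its reply.
	Generate(ctx context.Context, prompt string, opts ...Option) (string, error)
}

// Option tweaks a single request (temperature, max tokens, etc.).
type Option func(*request)

type request struct {
	temperature float64
	maxTokens   int
}

// WithTemperature sets the sampling temperature for one call.
func WithTemperature(t float64) Option { return func(r *request) { r.temperature = t } }

// WithMaxTokens caps the length of the completion.
func WithMaxTokens(n int) Option { return func(r *request) { r.maxTokens = n } }
```

Application code then depends only on Provider, so swapping an OpenAI-backed implementation for a local-model one is a one-line change at construction time.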
First AI Journey for DevOps - with comprehensive learning paths, practical tips, and enterprise guidelines
Find buried AI prompts in any codebase. Instantly⚡
MCP OAuth Proxy including dynamic client registration (DCR), MCP prompt analytics, and an MCP firewall for building enterprise-grade MCP servers.
Count Tokens of Code (forked from gocloc)
Create the prompts you need to write your novel using AI
The easiest way to write, build, run, and share your OpenAI prompts. Do prompt engineering with confidence.
MuseWeb is a prompt-driven AI web server
Terminal-based context management for AI-driven development
Your naming pal, written in Go 🐶
Type-safe AI agents for Go. Suricata combines LLM intelligence with Go’s strong typing, declarative YAML specs, and code generation to build safe, maintainable, and production-ready AI agents.
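The general pattern behind type-safe agents is binding model output to a concrete Go type so malformed replies fail at parse time rather than deep in application logic. A small sketch of that idea, not Suricata's actual API (TicketTriage and parseTyped are hypothetical names):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// TicketTriage is a hypothetical typed result an agent must produce.
// The struct doubles as the contract: the LLM is asked for JSON that
// unmarshals into it, and anything that doesn't parse is rejected.
type TicketTriage struct {
	Severity string   `json:"severity"` // e.g. "low", "medium", "high"
	Labels   []string `json:"labels"`
	Summary  string   `json:"summary"`
}

// parseTyped turns a raw model reply into the typed result, failing
// loudly instead of passing malformed output downstream.
func parseTyped(raw string) (TicketTriage, error) {
	var out TicketTriage
	if err := json.Unmarshal([]byte(raw), &out); err != nil {
		return TicketTriage{}, fmt.Errorf("model output is not valid %T: %w", out, err)
	}
	return out, nil
}

func main() {
	reply := `{"severity":"high","labels":["crash","login"],"summary":"Login panics on empty password"}`
	triage, err := parseTyped(reply)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", triage)
}
```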
LLM Prompt Injection Detection API Service PoC.
The LLM guardian kernel
Guardrails for LLMs: detect and block hallucinated tool calls to improve safety and reliability.
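One common way to implement this kind of guardrail is to validate every tool call the model emits against a registry of tools the application actually exposes, and block anything else before it executes. A hedged sketch of that check (ToolCall, registry, and guard are assumed names for illustration only):

```go
package main

import (
	"errors"
	"fmt"
)

// ToolCall is a hypothetical representation of what the model asked to run.
type ToolCall struct {
	Name string
	Args map[string]any
}

// registry lists the tools the application actually exposes; anything
// outside this set is treated as a hallucinated call and blocked.
var registry = map[string]bool{
	"search_docs":   true,
	"create_ticket": true,
}

var errHallucinatedTool = errors.New("model requested a tool that does not exist")

// guard rejects calls to unknown tools before they reach the executor.
func guard(call ToolCall) error {
	if !registry[call.Name] {
		return fmt.Errorf("%w: %q", errHallucinatedTool, call.Name)
	}
	return nil
}

func main() {
	// "delete_database" was never registered, so the guard blocks it.
	err := guard(ToolCall{Name: "delete_database"})
	fmt.Println(err)
}
```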
Converts your codebase into an LLM prompt
A modern prompt engineering tool that applies real engineering practices to transform messy, unstructured thoughts into clean, effective prompts for AI models. Built with Go and React, it runs as a single binary with zero dependencies.
Generate structured context from your codebase for LLMs
Taco makes it easy to get your source code ready for LLMs. With just one command, it collects all your text files and merges them into a single file, so you can quickly create a complete prompt with all your source code.
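The underlying approach in tools like this is a recursive walk that concatenates text files under per-file headers so the model can tell where each file begins. A rough Go sketch of that idea under assumed heuristics (this is not Taco's code; looksLikeText and the header format are illustrative):

```go
package main

import (
	"fmt"
	"io/fs"
	"os"
	"path/filepath"
	"strings"
)

// looksLikeText is a crude filter standing in for whatever heuristics a
// real tool uses to skip binaries and vendored files.
func looksLikeText(path string) bool {
	switch strings.ToLower(filepath.Ext(path)) {
	case ".go", ".md", ".txt", ".yaml", ".yml", ".json":
		return true
	}
	return false
}

func main() {
	var b strings.Builder
	root := "."
	// Walk the tree and append each text file under a path header, so the
	// result can be pasted into a prompt as one self-describing document.
	err := filepath.WalkDir(root, func(path string, d fs.DirEntry, err error) error {
		if err != nil || d.IsDir() || !looksLikeText(path) {
			return err
		}
		data, readErr := os.ReadFile(path)
		if readErr != nil {
			return readErr
		}
		fmt.Fprintf(&b, "===== %s =====\n%s\n", path, data)
		return nil
	})
	if err != nil {
		panic(err)
	}
	fmt.Print(b.String())
}
```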