Have you ever wished your AI coding assistant actually read your code instead of pretending it did?
That was exactly my frustration.
I often found myself repeating the same things to Claude:
“Here’s that chunk of code I wrote before — please reuse it,”
“This is how I used that library in another project,”
“Follow this coding style, not the default one.”
Sound familiar? 😅
That’s why I hacked together something like a local version of Context7 — but one that actually digs into your source code. I call it Snippets, and I’d love for you to try it out and give me feedback.
🤔 Motivation
Context7 is a neat tool, but I hit some walls with it:
- It mainly focuses on docs (`.md` files), while I often need the actual code.
- Yes, it supports private repos, but only as a team feature, and your data still gets stored on external servers.
- I wanted something local-first: no uploads, no external storage, just me and my machine.
- Since Snippets relies on Claude Code itself to extract snippets, I suspect it’s both cheaper and more accurate for real-world usage.
The vision is simple:
👉 Give your AI a searchable memory of code snippets — your repos, your patterns, your style.
🔍 Context7 vs Snippets (at a glance)
| Feature | Context7 | Snippets (this project) |
|---|---|---|
| Data focus | Reads docs (`.md` files) | Reads actual source code |
| Private repo support | ✅ Yes, but looks like it may soon become a paid feature | ✅ Yes, works locally out of the box |
| Data storage | Stored on external servers | Stays entirely on your machine |
| Deployment | Cloud-based | Local Docker setup |
| Open source? | Closed source | Yes, it is! |
💡 What Snippets Can Do
Here are a few practical ways I’m already using Snippets:
- **AI memory for your projects:** When Claude struggles with a task, feed it snippets from your past repos to give it context.
- **Dealing with bad documentation:** Context7 can read `.md` docs, but when docs are missing or messy, Snippets lets you just add an example repo and search real code instead.
- **Style transfer:** Index your repos and tell Claude: “follow this coding style.”
- **Reusable snippet library:** Collect helpers, utilities, and boilerplate patterns into a semantic, searchable database.
- **Expanding AI’s reach:** Feed in Rust code, private company libraries, or niche frameworks that your AI doesn’t natively know well, and make it smarter.
⚡ Quick Start
Getting Snippets running locally is straightforward.
1. Clone & Setup
```bash
git clone https://github.com/cheolwanpark/snippets
cd snippets
```
2. Configure Environment
```bash
cp docker/.env.example docker/.env
```
Add your Claude OAuth token and Gemini API key to `docker/.env`:
- **Claude OAuth token:** you can get one from the `claude setup-token` command.
- **Gemini API key:** used for embeddings; you can get one from Google AI Studio.
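For reference, a filled-in `docker/.env` might look roughly like this. The variable names below are placeholders for illustration; the authoritative names are in `docker/.env.example`:

```bash
# docker/.env -- illustrative only; follow docker/.env.example for the real variable names
CLAUDE_CODE_OAUTH_TOKEN=<token from `claude setup-token`>
GEMINI_API_KEY=<key from Google AI Studio>   # used for embeddings
```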
3. Launch with Docker
```bash
cd docker
docker-compose up -d
```
This spins up:
- Frontend → http://localhost:3000
- API & MCP Server → http://localhost:8000 / http://localhost:8080/mcp
- Qdrant & Redis → ports 6333 and 6379
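If you want a quick sanity check that everything came up, a couple of commands help. This is just a sketch: the Qdrant call uses its standard REST endpoint, and the other two only confirm that the containers and the frontend respond.

```bash
docker-compose ps                           # every service should report "Up"
curl -sI http://localhost:3000 | head -n 1  # frontend should answer with an HTTP status line
curl -s http://localhost:6333/collections   # Qdrant REST API; lists the collections created so far
```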
4. Add Your First Repo
- Open http://localhost:3000
- Paste a GitHub repo URL
- Hit Embed
- Once processing completes, you can search snippets in plain English!
🔗 Claude MCP Integration
Hook it up to Claude Code via MCP:
```bash
claude mcp add --transport http snippets http://localhost:8080/mcp
```
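If you want to confirm the registration took, Claude Code’s built-in MCP commands cover it (`snippets` is just the name chosen in the command above):

```bash
claude mcp list             # the "snippets" server should show up here
claude mcp remove snippets  # removes it again if you ever need to
```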
Now you can literally ask Claude things like:
```
search error handling patterns in Python. use snippets.
search JWT authentication middleware. use snippets.
search async database queries in Rust. use snippets.
```
Instead of vague memory, Claude pulls real snippets directly from your repos.
🧭 Why Bother?
At the end of the day, here’s what I want:
- Not just `.md` docs, but real code.
- Not just cloud tools, but local-first and private by design.
- Not just “AI magic,” but a searchable library of my own snippets.
Snippets is still early, but it’s a step toward “AI that actually understands your context.”
🙌 Feedback Wanted
I’d love your thoughts:
- Does this help in your workflow?
- What features would you like to see added?
- Any setup/usage pain points?
Let’s make our coding assistants a little smarter — together.