Code2prompt Alternatives
Similar projects and alternatives to code2prompt
- txtai: 💡 All-in-one AI framework for semantic search, LLM orchestration, and language model workflows
- repomix: 📦 Repomix is a powerful tool that packs your entire repository into a single, AI-friendly file. Perfect for when you need to feed your codebase to Large Language Models (LLMs) or other AI tools like Claude, ChatGPT, DeepSeek, Perplexity, Gemini, Gemma, Llama, Grok, and more.
- gitingest: Replace 'hub' with 'ingest' in any GitHub URL to get a prompt-friendly extract of a codebase
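The URL trick is a plain string substitution, sketched here in shell; the repository URL is just the code2prompt repo mentioned elsewhere on this page:

```shell
# Swap 'github.com' for 'gitingest.com' in a repo URL to get its
# prompt-friendly gitingest view (bash parameter substitution).
url="https://github.com/mufeedvh/code2prompt"
echo "${url/github.com/gitingest.com}"
# -> https://gitingest.com/mufeedvh/code2prompt
```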
- onefilellm: Discontinued Specify a GitHub or local repo, a GitHub pull request, an arXiv or Sci-Hub paper, a YouTube transcript, or a documentation URL on the web, and scrape it into a text file and the clipboard for easier LLM ingestion [Moved to: https://github.com/jimmc414/onefilellm]
- Awesome-Prompt-Engineering: This repository contains hand-curated resources for Prompt Engineering, with a focus on Generative Pre-trained Transformer (GPT), ChatGPT, PaLM, etc.
- ingest: Parse files (e.g. code repos) and websites to clipboard or a file for ingestion by AI / LLMs
- your-source-to-prompt.html: Quickly and securely turn your code projects into LLM prompts, all locally on your own machine!
- GPT-4-Prompt-Library: Discontinued Advanced Code and Text Manipulation Prompts for Various LLMs. Suitable for GPT-4, Claude, Llama2, Falcon, Bard, and other high-performing open-source LLMs. [Moved to: https://github.com/abilzerian/LLM-Prompt-Library]
- repopack: Discontinued 📦 Repomix (formerly Repopack) is a powerful tool that packs your entire repository into a single, AI-friendly file. Perfect for when you need to feed your codebase to Large Language Models (LLMs) or other AI tools like Claude, ChatGPT, and Gemini. [Moved to: https://github.com/yamadashy/repomix]
code2prompt discussion
code2prompt reviews and mentions
- Revolutionizing LLM Interactions: Code2Prompt – Your Code's New AI Assistant
- Gemini 2.5 Pro vs. Claude 3.7 Sonnet: Coding Comparison
- All Data and AI Weekly #177 - 17-Feb-2025
🚀 https://github.com/mufeedvh/code2prompt
- Show HN: Transform Your Codebase into a Single Markdown Doc for Feeding into AI
From a rough glance it looks pretty similar to another tool that I've been using https://github.com/mufeedvh/code2prompt
- Exploring AI for Refactoring at Scale
Prompt engineering is another beast. You can check some open-source options. But I would suggest using your 🧠 to think through and explain what you want as a result, and what specific context you know about the project. AI can do a lot, but it cannot read your mind.
- Mufeedvh/code2prompt: A CLI tool to convert your codebase into a single LLM prompt
- Ask HN: How do you load your code base as context window in ChatGPT?
I've read o1 has a 200k-token context window limit [1]; my code base is about 177k tokens, so I could generate a prompt with the code2prompt [2] tool and load my code base as a prompt. But that's not the case:
- It doesn't allow me to attach text files when o1 is selected.
- Pasting the whole prompt into the text area freezes the browser for a while, and when it's done, I can't submit it because the send button is disabled.
- When creating a project I can attach files, but then I can't select the o1 model.
I'm on the fence about buying a Pro subscription, if only I could use o1 with my code base as the context window.
My guess is that when they say 200k tokens they mean through the API? But then I'd incur extra costs, which seems odd since I'm already paying for the subscription, and it's not an automation use case.
I would appreciate it if anyone could share their experience working with a large context window for the specific use case of a code base.
Thanks!
- [1] https://platform.openai.com/docs/models#o1
- [2] https://github.com/mufeedvh/code2prompt
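Whether a generated prompt fits a model's context window can be sanity-checked before pasting. A rough sketch, assuming the common ~4-characters-per-token heuristic for English text and code (an exact count requires the model's own tokenizer, and the file path here is made up):

```shell
# Estimate the token count of a generated prompt file from its byte size,
# using the rough heuristic of ~4 characters per token.
printf 'fn main() { println!("hello"); }\n' > /tmp/prompt.md
chars=$(wc -c < /tmp/prompt.md)
echo "approx tokens: $((chars / 4))"
```

For a real 177k-token prompt, comparing this estimate against the advertised 200k limit gives a quick first check before spending time in the UI.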
- Code2prompt
- Source to Prompt: Turn your code into an LLM prompt, but with more features
- Show HN: Dump entire Git repos into a single file for LLM prompts
There's no option to specify files to include; you must work backwards with the ignore option.
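One way around a tool that only supports ignore patterns is to pre-select the files yourself and hand the tool that list. A minimal sketch, where the directory layout is invented purely for illustration:

```shell
# Build an explicit include list with find, instead of enumerating
# everything you want ignored.
mkdir -p /tmp/demo-repo/src
touch /tmp/demo-repo/src/main.rs /tmp/demo-repo/README.md /tmp/demo-repo/notes.txt
find /tmp/demo-repo -name '*.rs' -o -name '*.md' | sort
# -> /tmp/demo-repo/README.md
# -> /tmp/demo-repo/src/main.rs
```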
- [code2prompt](https://github.com/mufeedvh/code2prompt)
Stats
mufeedvh/code2prompt is an open source project licensed under the MIT License, which is an OSI-approved license.
The primary programming language of code2prompt is Rust.