Honeystax

code2prompt

Model

A CLI tool to convert your codebase into a single LLM prompt with source tree, prompt templating, and token counting.

Copy the install, test the workflow, then decide if it earns a permanent slot.

7,236
Why now: Moving now

Fresh repo activity plus visible builder pull. This is the kind of tool people test before it turns obvious.

Decision: High-conviction move


Trial cost: Medium lift

Not hard to test, not trivial to unwind. Worth trying if it closes a sharp gap.

Risk: 35/100

GitHub health 42/100: no security policy and 21 open issues make this testable, but not something to trust blindly.

What You Are Adopting

  • AI Agent (Universal)
  • Model: Multiple
  • Build Time: Hours
  • Move Fast


No direct local install flow.

Open the project page, steal the pattern, and decide fast if it deserves a deeper test.

About

A CLI tool to convert your codebase into a single LLM prompt with source tree, prompt templating, and token counting. An open-source model for the AI coding ecosystem.

README

Code2prompt

Convert your codebase into a single LLM prompt.

Website • Documentation • Discord



code2prompt demo

Flow Diagram

Code2Prompt is a powerful context engineering tool designed to ingest codebases and format them for Large Language Models. Whether you are manually copying context for ChatGPT, building AI agents via Python, or running an MCP server, Code2Prompt streamlines the context preparation process.

⚡ Quick Install

Cargo

cargo install code2prompt 

To enable optional Wayland support (e.g., for clipboard integration on Wayland-based systems), use the wayland feature flag:

cargo install --features wayland code2prompt

Homebrew

brew install code2prompt

SDK with pip 🐍

pip install code2prompt-rs

🚀 Quick Start

Once installed, generating a prompt from your codebase is as simple as pointing the tool to your directory.

Basic Usage: Generate a prompt from the current directory and copy it to the clipboard.

code2prompt .

Save to file:

code2prompt path/to/project --output prompt.txt
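Beyond the basics, a hedged sketch of combining output with filtering. The `--include` and `--exclude` flag names are assumptions inferred from the Smart Filtering feature described further down; verify the exact spelling with `code2prompt --help`.

```shell
# Hypothetical flag names (--include/--exclude are assumptions based on
# the Smart Filtering feature; verify with `code2prompt --help`).
CMD='code2prompt . --include "*.rs" --exclude "target/*" --output prompt.txt'
echo "$CMD"
# Guarded run, so the sketch is safe to paste where the tool is absent.
if command -v code2prompt >/dev/null 2>&1; then
  eval "$CMD"
fi
```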

🌐 Ecosystem

Code2Prompt is more than just a CLI tool. It is a complete ecosystem for codebase context.

🧱 Core Library: The internal, high-speed library responsible for secure file traversal, respecting .gitignore rules, and structuring Git metadata.

💻 CLI Tool: Designed for humans, featuring both a minimal CLI and an interactive TUI. Generates formatted prompts, tracks token usage, and outputs the result to your clipboard or stdout.

🐍 Python SDK: Provides fast Python bindings to the Rust Core. Ideal for AI agents, automation scripts, or deep integration into RAG pipelines. Available on PyPI.

🤖 MCP Server: Runs Code2Prompt as a local service, enabling agentic applications to read your local codebase efficiently without bloating your context window.

📚 Documentation

Check our online documentation for detailed instructions

✨ Features

Code2Prompt transforms your entire codebase into a well-structured prompt for large language models. Key features include:

  • Terminal User Interface (TUI): Interactive terminal interface for configuring and generating prompts
  • Smart Filtering: Include/exclude files using glob patterns and respect .gitignore rules
  • Flexible Templating: Customize prompts with Handlebars templates for different use cases
  • Automatic Code Processing: Convert codebases of any size into readable, formatted prompts
  • Token Tracking: Track token usage to stay within LLM context limits
  • Smart File Reading: Simplify reading various file formats for LLMs (CSV, Notebooks, JSONL, etc.)
  • Git Integration: Include diffs, logs, and branch comparisons in your prompts
  • Blazing Fast: Built in Rust for high performance and low resource usage
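The Flexible Templating bullet above can be sketched with a minimal Handlebars file. The variable names (`source_tree`, `files`, `path`, `code`) and the `--template` flag are assumptions about code2prompt's template context, not names verified against its documentation:

```shell
# Write a minimal custom template. The {{...}} variable names are
# assumptions about code2prompt's template context, not verified.
cat > review.hbs <<'EOF'
Project source tree:
{{source_tree}}

{{#each files}}
File: {{path}}
{{code}}
{{/each}}
EOF
# Hypothetical usage (flag name assumed): code2prompt . --template review.hbs
grep -c '{{' review.hbs
```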

Stop manually copying files and formatting code for LLMs. Code2Prompt handles the tedious work so you can focus on getting insights and solutions from AI models.

Alternative Installation

Refer to the documentation for detailed installation instructions.

Binary releases

Download the latest binary for your OS from Releases.

Source build

Requires:

  • Git, Rust and Cargo

git clone https://github.com/mufeedvh/code2prompt.git
cd code2prompt/
cargo install --path crates/code2prompt
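After a source build, a quick guarded sanity check. The `--version` flag is an assumption here; Rust CLIs built on clap usually expose it, but confirm against the real binary:

```shell
# Guarded check: report whether the freshly built binary is on PATH.
if command -v code2prompt >/dev/null 2>&1; then
  code2prompt --version   # assumed flag; clap-based CLIs usually provide it
  STATUS=installed
else
  STATUS=missing
fi
echo "code2prompt binary: $STATUS"
```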

⭐ Star Gazing

Star History Chart

📜 License

Licensed under the MIT License, see LICENSE for more information.

Liked the project?

If you liked the project and found it useful, please give it a ⭐ !

👥 Contribution

Ways to contribute:

  • Suggest a feature
  • Report a bug
  • Fix something and open a pull request
  • Help me document the code
  • Spread the word

Tech Stack

Rust · Python · Go · LLM

Reviews: 0

Active · Last commit 15 days ago
21 open issues
Submitted March 9, 2024
