A command-line tool that uses AI to streamline your git workflow - from generating commit messages to explaining complex changes, all without requiring an API key.
- Smart Commit Messages: Generate conventional commit messages for your staged changes
- Git History Insights: Understand what changed in any commit, branch, or your current work
- Interactive Search: Find and explore commits using fuzzy search
- Change Analysis: Ask questions about specific changes and their impact
- Zero Config: Works instantly without an API key, using Phind by default
- Flexible: Works with any git workflow and supports multiple AI providers
- Rich Output: Markdown support for readable explanations and diffs (requires: mdcat)
Before you begin, ensure you have:
- git installed on your system
- fzf (optional) - required for the `lumen list` command
- mdcat (optional) - required for pretty output formatting
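If you want to verify these prerequisites in one go, a small shell loop like the following works — this is just `command -v` in a loop, nothing lumen-specific:

```shell
# Report which of lumen's dependencies are present on this machine
for cmd in git fzf mdcat; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found"
  else
    echo "$cmd: missing"
  fi
done
```

Only git is strictly required; fzf and mdcat unlock the optional features noted above.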
```shell
brew install jnsahaj/lumen/lumen
```

Important: Cargo is the package manager for Rust and is installed automatically when you install Rust. See the installation guide.
```shell
cargo install lumen
```

Create meaningful commit messages for your staged changes:
```shell
# Basic usage - generates a commit message based on staged changes
lumen draft
# Output: "feat(button.tsx): Update button color to blue"

# Add context for more meaningful messages
lumen draft --context "match brand guidelines"
# Output: "feat(button.tsx): Update button color to align with brand identity guidelines"
```

Ask Lumen to generate Git commands based on a natural language query:
```shell
lumen operate "squash the last 3 commits into 1 with the message 'squashed commit'"
# Output: git reset --soft HEAD~3 && git commit -m "squashed commit" [y/N]
```

Understand what changed and why:
```shell
# Explain current changes in your working directory
lumen explain --diff           # All changes
lumen explain --diff --staged  # Only staged changes

# Explain specific commits
lumen explain HEAD             # Latest commit
lumen explain abc123f          # Specific commit
lumen explain HEAD~3..HEAD     # Last 3 commits
lumen explain main..feature/A  # Branch comparison
lumen explain main...feature/A # Branch comparison (merge base)

# Ask specific questions about changes
lumen explain --diff --query "What's the performance impact of these changes?"
lumen explain HEAD --query "What are the potential side effects?"
```

```shell
# Launch interactive fuzzy finder to search through commits (requires: fzf)
lumen list
```

```shell
# Copy commit message to clipboard
lumen draft | pbcopy             # macOS
lumen draft | xclip -selection c # Linux

# View the commit message and copy it
lumen draft | tee >(pbcopy)

# Open in your favorite editor
lumen draft | code -

# Directly commit using the generated message
lumen draft | git commit -F -
```

If you are using lazygit, you can add this to the user config:
```yaml
customCommands:
  - key: '<c-l>'
    context: 'files'
    command: 'lumen draft | tee >(pbcopy)'
    loadingText: 'Generating message...'
    showOutput: true
  - key: '<c-k>'
    context: 'files'
    command: 'lumen draft -c {{.Form.Context | quote}} | tee >(pbcopy)'
    loadingText: 'Generating message...'
    showOutput: true
    prompts:
      - type: 'input'
        title: 'Context'
        key: 'Context'
```

Configure your preferred AI provider:
```shell
# Using CLI arguments
lumen -p openai -k "your-api-key" -m "gpt-4o" draft

# Using environment variables
export LUMEN_AI_PROVIDER="openai"
export LUMEN_API_KEY="your-api-key"
export LUMEN_AI_MODEL="gpt-4o"
```

| Provider | API Key Required | Models |
|---|---|---|
| Phind `phind` (default) | No | Phind-70B |
| Groq `groq` | Yes (free) | llama2-70b-4096, mixtral-8x7b-32768 (default: mixtral-8x7b-32768) |
| OpenAI `openai` | Yes | gpt-4o, gpt-4o-mini, gpt-4, gpt-3.5-turbo (default: gpt-4o-mini) |
| Claude `claude` | Yes | see list (default: claude-3-5-sonnet-20241022) |
| Ollama `ollama` | No (local) | see list (model required) |
| OpenRouter `openrouter` | Yes | see list (default: anthropic/claude-3.5-sonnet) |
| DeepSeek `deepseek` | Yes | deepseek-chat, deepseek-reasoner (default: deepseek-reasoner) |
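As a concrete example, the Groq row above maps onto the environment-variable configuration like this (the API key value is a placeholder — substitute your own free key from Groq):

```shell
# Select Groq and its default model via environment variables
export LUMEN_AI_PROVIDER="groq"
export LUMEN_AI_MODEL="mixtral-8x7b-32768"  # the groq default, per the table
export LUMEN_API_KEY="YOUR_GROQ_KEY"        # placeholder
```

After these exports, any subsequent `lumen` invocation in the shell session uses Groq unless overridden by a config file or CLI flags.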
Lumen supports configuration through a JSON file. You can place the configuration file in one of the following locations:
- Project Root: Create a lumen.config.json file in your project's root directory.
- Custom Path: Specify a custom path using the --config CLI option.
- Global Configuration (Optional): Place a lumen.config.json file in your system's default configuration directory:
  - Linux/macOS: `~/.config/lumen/lumen.config.json`
  - Windows: `%USERPROFILE%\.config\lumen\lumen.config.json`
Lumen will load configurations in the following order of priority:
- CLI arguments (highest priority)
- Configuration file specified by --config
- Project root lumen.config.json
- Global configuration file (lowest priority)
```json
{
  "provider": "ollama",
  "model": "qwen2.5-coder:7b",
  "ollama_api_base_url": "http://localhost:11434",
  "draft": {
    "commit_types": {
      "docs": "Documentation only changes",
      "style": "Changes that do not affect the meaning of the code",
      "refactor": "A code change that neither fixes a bug nor adds a feature",
      "perf": "A code change that improves performance",
      "test": "Adding missing tests or correcting existing tests",
      "build": "Changes that affect the build system or external dependencies",
      "ci": "Changes to our CI configuration files and scripts",
      "chore": "Other changes that don't modify src or test files",
      "revert": "Reverts a previous commit",
      "feat": "A new feature",
      "fix": "A bug fix"
    }
  }
}
```

Options are applied in the following order (highest to lowest priority):
- CLI Flags
- Configuration File
- Environment Variables
- Default options
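The precedence above can be pictured as a chain of shell fallbacks. This is only an illustration of the resolution order — `cli_provider` and `config_provider` are hypothetical placeholders, not variables lumen actually reads:

```shell
# Illustration of precedence: CLI flag > config file > env var > default
cli_provider=""             # no -p flag passed
config_provider=""          # no provider set in the config file
LUMEN_AI_PROVIDER="openai"  # set in the environment
provider="${cli_provider:-${config_provider:-${LUMEN_AI_PROVIDER:-phind}}}"
echo "$provider"            # prints "openai": the env var wins here
```

Had `cli_provider` or `config_provider` been non-empty, that value would have taken precedence; with all three empty, the `phind` default applies.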
Example: Using different providers for different projects:
```shell
# Set global defaults in .zshrc/.bashrc
export LUMEN_AI_PROVIDER="openai"
export LUMEN_AI_MODEL="gpt-4o"
export LUMEN_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxx"
```

Override per project using a config file:

```json
{
  "provider": "ollama",
  "model": "llama3.2"
}
```

Or override using CLI flags:

```shell
lumen -p "ollama" -m "llama3.2" draft
```
