lumen 2.4.0

lumen is a command-line tool that uses AI to generate commit messages, summarise git diffs or past commits, and more.

Features 🔅

  • Beautiful & Ergonomic Diff Viewer: Review your code with minimal effort
  • Smart Commit Messages: Generate conventional commit messages for your staged changes
  • Git History Insights: Understand what changed in any commit, branch, or your current work
  • Interactive Search: Find and explore commits using fuzzy search
  • Change Analysis: Ask questions about specific changes and their impact
  • Multiple AI Providers: Supports OpenAI, Claude, Groq, Ollama, and more
  • Flexible: Works with any git workflow
  • Rich Output: Markdown support for readable explanations and diffs (requires: mdcat)

Getting Started 🔅

Prerequisites

Before you begin, ensure you have:

  1. git installed on your system
  2. fzf (optional) - required for the lumen list command
  3. mdcat (optional) - required for pretty markdown output formatting

Installation

Using Homebrew (macOS and Linux)

brew install jnsahaj/lumen/lumen

Using Cargo

[!IMPORTANT] cargo is the package manager for Rust and is installed automatically when you install Rust. See the installation guide

cargo install lumen

Configuration (for AI features)

If you want to use AI-powered features (explain, draft, list, operate), run the interactive setup:

lumen configure

This will guide you through selecting an AI provider and entering your API key. The configuration is saved to ~/.config/lumen/lumen.config.json.
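The saved file is plain JSON. A minimal example of what the setup might write (the key shown is a placeholder):

```json
{
  "provider": "openai",
  "model": "gpt-5-mini",
  "api_key": "sk-..."
}
```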

[!NOTE] The diff command works without any configuration - it's a standalone visual diff viewer.

Usage 🔅

Generate Commit Messages

Create meaningful commit messages for your staged changes:

# Basic usage - generates a commit message based on staged changes
lumen draft
# Output: "feat(button.tsx): Update button color to blue"

# Add context for more meaningful messages
lumen draft --context "match brand guidelines"
# Output: "feat(button.tsx): Update button color to align with brand identity guidelines"

Generate Git Commands

Ask Lumen to generate Git commands based on a natural language query:

lumen operate "squash the last 3 commits into 1 with the message 'squashed commit'"
# Output: git reset --soft HEAD~3 && git commit -m "squashed commit" [y/N]

The command will display an explanation of what the generated command does, show any warnings for potentially dangerous operations, and prompt for confirmation before execution.
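The confirm-before-execute flow can be sketched roughly as follows (an illustrative Python sketch, not lumen's actual implementation; the `response` argument stands in for interactive keyboard input):

```python
import subprocess

def confirm_and_run(command: str, response: str) -> bool:
    """Execute `command` only on an explicit yes, mirroring a [y/N] prompt.

    Illustrative sketch only; `response` stands in for interactive input.
    """
    # Anything other than "y"/"Y" aborts: No is the default, as in "[y/N]".
    if response.strip().lower() != "y":
        return False
    subprocess.run(command, shell=True, check=True)
    return True
```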

Visual Diff Viewer

Launch an interactive side-by-side diff viewer in your terminal:

# View uncommitted changes
lumen diff

# View changes for a specific commit
lumen diff HEAD~1

# View changes between branches
lumen diff main..feature/A

# Filter to specific files
lumen diff --file src/main.rs --file src/lib.rs

# Watch mode - auto-refresh on file changes
lumen diff --watch

Keybindings in the diff viewer:

  • j/k or arrow keys: Navigate
  • {/}: Jump between hunks
  • tab: Toggle sidebar
  • space: Mark file as viewed
  • e: Open file in editor
  • ?: Show all keybindings

Explain Changes

Understand what changed and why:

# Explain current changes in your working directory
lumen explain                         # All changes
lumen explain --staged                # Only staged changes

# Explain specific commits
lumen explain HEAD                    # Latest commit
lumen explain abc123f                 # Specific commit
lumen explain HEAD~3..HEAD            # Last 3 commits
lumen explain main..feature/A         # Branch comparison
lumen explain main...feature/A        # Changes on feature/A since its merge base with main

# Ask specific questions about changes
lumen explain --query "What's the performance impact of these changes?"
lumen explain HEAD --query "What are the potential side effects?"

Interactive Mode

# Launch interactive fuzzy finder to search through commits (requires: fzf)
lumen list

Tips & Tricks

# Copy commit message to clipboard
lumen draft | pbcopy                  # macOS
lumen draft | xclip -selection c      # Linux

# View the commit message and copy it
lumen draft | tee >(pbcopy)

# Open in your favorite editor
lumen draft | code -      

# Directly commit using the generated message
lumen draft | git commit -F -           

If you are using lazygit, you can add this to its user config:

customCommands:
  - key: '<c-l>'
    context: 'files'
    command: 'lumen draft | tee >(pbcopy)'
    loadingText: 'Generating message...'
    showOutput: true
  - key: '<c-k>'
    context: 'files'
    command: 'lumen draft -c {{.Form.Context | quote}} | tee >(pbcopy)'
    loadingText: 'Generating message...'
    showOutput: true
    prompts:
      - type: 'input'
        title: 'Context'
        key: 'Context'

AI Providers 🔅

Configure your preferred AI provider:

# Using CLI arguments
lumen -p openai -k "your-api-key" -m "gpt-5-mini" draft

# Using environment variables
export LUMEN_AI_PROVIDER="openai"
export LUMEN_API_KEY="your-api-key"
export LUMEN_AI_MODEL="gpt-5-mini"

Supported Providers

Provider API Key Required Models
OpenAI openai (Default) Yes gpt-5.2, gpt-5, gpt-5-mini, gpt-5-nano, gpt-4.1, gpt-4.1-mini, o4-mini (default: gpt-5-mini)
Claude claude Yes claude-sonnet-4-5-20250930, claude-opus-4-5-20251115, claude-haiku-4-5-20251015 (default: claude-sonnet-4-5-20250930)
Gemini gemini Yes (free tier) gemini-3-pro, gemini-3-flash, gemini-2.5-pro, gemini-2.5-flash, gemini-2.5-flash-lite (default: gemini-3-flash)
Groq groq Yes (free) llama-3.3-70b-versatile, llama-3.1-8b-instant, meta-llama/llama-4-maverick-17b-128e-instruct, openai/gpt-oss-120b (default: llama-3.3-70b-versatile)
DeepSeek deepseek Yes deepseek-chat (V3.2), deepseek-reasoner (default: deepseek-chat)
xAI xai Yes grok-4, grok-4-mini, grok-4-mini-fast (default: grok-4-mini-fast)
Ollama ollama No (local) see list (default: llama3.2)
OpenRouter openrouter Yes see list (default: anthropic/claude-sonnet-4.5)
Vercel AI Gateway vercel Yes see list (default: anthropic/claude-sonnet-4.5)

Advanced Configuration 🔅

Configuration File

Lumen supports configuration through a JSON file. You can place the configuration file in one of the following locations:

  1. Project Root: Create a lumen.config.json file in your project's root directory.
  2. Custom Path: Specify a custom path using the --config CLI option.
  3. Global Configuration (Optional): Place a lumen.config.json file in your system's default configuration directory:
    • Linux/macOS: ~/.config/lumen/lumen.config.json
    • Windows: %USERPROFILE%\.config\lumen\lumen.config.json

Lumen will load configurations in the following order of priority:

  1. CLI arguments (highest priority)
  2. Configuration file specified by --config
  3. Project root lumen.config.json
  4. Global configuration file (lowest priority)

Example lumen.config.json:

{
  "provider": "openai",
  "model": "gpt-5-mini",
  "api_key": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "draft": {
    "commit_types": {
      "docs": "Documentation only changes",
      "style": "Changes that do not affect the meaning of the code",
      "refactor": "A code change that neither fixes a bug nor adds a feature",
      "perf": "A code change that improves performance",
      "test": "Adding missing tests or correcting existing tests",
      "build": "Changes that affect the build system or external dependencies",
      "ci": "Changes to our CI configuration files and scripts",
      "chore": "Other changes that don't modify src or test files",
      "revert": "Reverts a previous commit",
      "feat": "A new feature",
      "fix": "A bug fix"
    }
  }
}
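The file lookup order above can be sketched as follows (illustrative Python, not lumen's actual code; CLI arguments outrank any file and are merged separately):

```python
from pathlib import Path
from typing import Optional

def resolve_config_path(cli_config: Optional[str], project_root: Path) -> Optional[Path]:
    """Return the first existing config file, following the documented priority.

    Sketch of the lookup order only, not lumen's actual implementation.
    """
    candidates = []
    if cli_config:
        # 2. Configuration file specified by --config
        candidates.append(Path(cli_config))
    # 3. Project root lumen.config.json
    candidates.append(project_root / "lumen.config.json")
    # 4. Global configuration file (lowest priority)
    candidates.append(Path.home() / ".config" / "lumen" / "lumen.config.json")
    return next((p for p in candidates if p.is_file()), None)
```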

Configuration Precedence

Options are applied in the following order (highest to lowest priority):

  1. CLI Flags
  2. Configuration File
  3. Environment Variables
  4. Default options
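For a single option, that precedence amounts to taking the first source that defines a value (an illustrative Python sketch, not lumen's actual code):

```python
def resolve_option(name, cli_flags, config_file, env_vars, defaults):
    """Return the effective value for one option.

    Checks sources in the documented precedence order, highest first.
    Each source is modeled as a plain dict for illustration.
    """
    for source in (cli_flags, config_file, env_vars, defaults):
        if source.get(name) is not None:
            return source[name]
    return None
```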

Example: Using different providers for different projects:

# Set global defaults in .zshrc/.bashrc
export LUMEN_AI_PROVIDER="openai"
export LUMEN_AI_MODEL="gpt-5-mini"
export LUMEN_API_KEY="sk-xxxxxxxxxxxxxxxxxxxxxxxx"

# Override per project with a lumen.config.json file
{
  "provider": "ollama",
  "model": "llama3.2"
}

# Or override using CLI flags
lumen -p "ollama" -m "llama3.2" draft