aicommit

📚 Website & Documentation

A CLI tool that generates concise and descriptive git commit messages using LLMs (Large Language Models).

Features

Implemented Features

  • ✅ Uses LLMs to generate meaningful commit messages from your changes
  • ✅ Supports multiple LLM providers (OpenRouter, Ollama)
  • ✅ Custom API keys for third-party services via the OpenRouter integrations page (Google AI Studio and others): go to https://openrouter.ai/settings/integrations and paste a key from any of these providers: AI21, Amazon Bedrock, Anthropic, AnyScale, Avian.io, Cloudflare, Cohere, DeepInfra, DeepSeek, Fireworks, Google AI Studio, Google Vertex, Hyperbolic, Infermatic, Inflection, Lambda, Lepton, Mancer, Mistral, NovitaAI, OpenAI, Perplexity, Recursal, SambaNova, SF Compute, Together, xAI
  • ✅ Fast and efficient - works directly from your terminal
  • ✅ Easy configuration and customization
  • ✅ Transparent token usage and cost tracking
  • ✅ Version management with automatic incrementation
  • ✅ Version synchronization with Cargo.toml
  • ✅ Provider management (add, list, set active)
  • ✅ Interactive configuration setup
  • ✅ Configuration file editing

Planned Features

  • 🚧 Tests for each feature to prevent breaking changes
  • 🚧 Split commits by file (aicommit --by-file)
  • 🚧 Split commits by feature (aicommit --by-feature)
  • 🚧 Basic .gitignore file checks and management
  • 🚧 Watch mode (aicommit --watch 1m)
  • 🚧 Watch with edit delay (aicommit --watch 1m --wait-for-edit 30s)
  • 🚧 Version management for multiple languages (package.json, requirements.txt, etc.)
  • 🚧 Interactive commit message generation (aicommit --generate)
  • 🚧 Auto push functionality (aicommit --push)
  • 🚧 Branch safety checks for push operations
  • 🚧 Auto pull functionality
  • 🚧 Support for conventional commits format
  • 🚧 Project icon
  • 🚧 OpenRouter project listing

Legend:

  • ✅ Implemented
  • 🚧 Planned
  • 🧪 Has tests

Installation

Install via cargo:

cargo install aicommit

Or build from source:

git clone https://github.com/suenot/aicommit
cd aicommit
cargo install --path .

Quick Start

  1. Add a provider:

aicommit --add

  2. Make some changes to your code

  3. Create a commit:

aicommit

Provider Management

List all configured providers:

aicommit --list

Set active provider:

aicommit --set <provider-id>
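
For example, using the id from the OpenRouter configuration shown below (each provider's id is the UUID stored for it in ~/.aicommit.json):

aicommit --set 550e8400-e29b-41d4-a716-446655440000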

Version Management

Automatically increment version in a file before commit:

aicommit --version-file "./version" --version-iterate

Synchronize version with Cargo.toml:

aicommit --version-file "./version" --version-cargo

Both operations can be combined:

aicommit --version-file "./version" --version-cargo --version-iterate
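
A minimal sketch of how this fits into a release flow, assuming ./version holds a plain version string (the exact increment rule applied by --version-iterate is not shown here):

# ./version contains e.g. 0.1.11 before the run (illustrative value)
aicommit --version-file "./version" --version-cargo --version-iterate
# the incremented version is written back to ./version, synchronized into
# Cargo.toml, and the resulting changes are part of the generated commit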

Configuration

The configuration file is stored at ~/.aicommit.json. You can edit it directly with:

aicommit --config

Provider Configuration

Each provider can be configured with the following settings:

  • max_tokens: Maximum number of tokens in the response (default: 50)
  • temperature: Controls randomness in the response (0.0-1.0, default: 0.3)

For OpenRouter, token costs are automatically fetched from their API. For Ollama, you can specify your own costs if you want to track usage.

Supported LLM Providers

OpenRouter

{
  "providers": [{
    "id": "550e8400-e29b-41d4-a716-446655440000",
    "provider": "openrouter",
    "api_key": "sk-or-v1-...",
    "model": "mistralai/mistral-tiny",
    "max_tokens": 50,
    "temperature": 0.3,
    "input_cost_per_1k_tokens": 0.25,
    "output_cost_per_1k_tokens": 0.25
  }],
  "active_provider": "550e8400-e29b-41d4-a716-446655440000"
}

Ollama

{
  "providers": [{
    "id": "67e55044-10b1-426f-9247-bb680e5fe0c8",
    "provider": "ollama",
    "url": "http://localhost:11434",
    "model": "llama2",
    "max_tokens": 50,
    "temperature": 0.3,
    "input_cost_per_1k_tokens": 0.0,
    "output_cost_per_1k_tokens": 0.0
  }],
  "active_provider": "67e55044-10b1-426f-9247-bb680e5fe0c8"
}
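
This assumes an Ollama server is running locally at the configured url and the model has already been pulled, e.g.:

ollama pull llama2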

Recommended Providers through OpenRouter

  • 🌟 Google AI Studio - 1,000,000 tokens for free
    • "google/gemini-2.0-flash-exp:free"
  • 🌟 DeepSeek
    • "deepseek/deepseek-chat"

Usage Information

When generating a commit message, the tool will display:

  • Number of tokens used (input and output)
  • Total API cost (calculated separately for input and output tokens)

Example output:

Generated commit message: Add support for multiple LLM providers
Tokens: 8↑ 32↓
API Cost: $0.0100
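
For example, with both input and output priced at $0.25 per 1K tokens (as in the OpenRouter configuration above), the cost works out to 8 × 0.25/1000 + 32 × 0.25/1000 = $0.0100.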

You can have multiple providers configured and switch between them by changing the active_provider field to match the desired provider's id.

License

This project is licensed under the MIT License - see the LICENSE file for details.