# Git Intelligence Message (GIM) 🚀
A command-line utility that automatically generates high-quality Git commit messages.
## Features
- 🤖 AI-powered commit message generation
- ⚡ Lightning fast Rust implementation
- 🔧 Easy configuration for various AI providers
- 🌍 Multi-language support
- 🔄 Automatic git staging (optional)
- ✏️ Amend previous commits
## Installation
### Using Homebrew (macOS/Linux)
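A typical Homebrew installation might look like the following; the formula name (and any tap it may live in) is an assumption, so check the project's repository for the exact command:

```shell
# Assuming a published Homebrew formula named `gim`
brew install gim
```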
### Using Cargo
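Assuming the crate is published on crates.io under the name `gim`:

```shell
cargo install gim
```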
### Build from source
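A conventional source build for a Rust project; the repository URL below is a placeholder, not the actual project URL:

```shell
git clone <repository-url>   # replace with the GIM repository URL
cd gim
cargo build --release
# The binary will be at target/release/gim
```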
## Command Line Interface

### Basic Usage
- Generate a commit message automatically
- Specify the commit title
- Stage unstaged changes automatically
- Amend the most recent commit
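The operations above can be sketched as follows; the flags are the ones documented under Command Options, and the example title string is illustrative:

```shell
gim                      # generate a commit message for staged changes
gim -t "Fix login bug"   # specify the commit title yourself
gim -a                   # stage unstaged changes automatically, then generate
gim -p                   # amend the most recent commit
```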
### Recommended Usage
- Basic usage: generate a commit message for staged changes
- Auto-stage changes and generate a commit message
- Amend the most recent commit
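A sketch of this workflow, assuming the binary is invoked as `gim` and using the long-form flags documented under Command Options:

```shell
git add .        # stage your changes, then
gim              # generate a commit message for the staged changes

gim --auto-add   # or let gim stage everything and commit in one step

gim --update     # amend the most recent commit
```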
### Command Options
- `-t, --title <STRING>`: Specify the commit message title
- `-a, --auto-add`: Automatically stage all modifications
- `-p, --update`: Amend the most recent commit
## Prompt Management
View and edit the AI prompt templates used for generating commit messages:
- View the current prompt templates
- Open the prompt files in the default file manager for editing
- Edit a specific prompt file with the default editor
- Edit a specific prompt file with a custom editor
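The exact subcommand and flag names for prompt management are not documented above, so the following is only an illustrative sketch (verify against `gim --help`); the prompt type names `diff` and `subject` are the documented ones:

```shell
# Subcommand and flag names below are assumptions
gim prompt                              # view the current prompt templates
gim prompt --open                       # open the prompt files in the file manager
gim prompt --edit diff                  # edit the diff prompt with the default editor
gim prompt --edit subject --editor vim  # edit with a custom editor (vim here)
```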
Prompt types:
- `d`, `diff`, `diff_prompt`: Diff analysis prompt template
- `s`, `subject`, `subject_prompt`: Commit subject generation prompt template
## AI Configuration

### Configuration
Use the `gim ai` command to configure AI-related parameters:
- Configure the AI model
- Set the API key
- Define the API endpoint
- Set the output language
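Putting these together with the documented `gim ai` flags — the model name, key, endpoint, and language values are examples only:

```shell
gim ai -m gpt-4o                                      # configure the AI model
gim ai -k sk-xxxxxxxx                                 # set the API key
gim ai -u https://api.openai.com/v1/chat/completions  # define the API endpoint
gim ai -l en                                          # set the output language
```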
Important: The `--url` parameter only supports OpenAI-compatible API endpoints, such as the official OpenAI API or third-party services compatible with the OpenAI protocol.
### AI Configuration Options
- `-m, --model <STRING>`: Specify the AI model to use
- `-k, --apikey <STRING>`: Configure the API key for the AI service
- `-u, --url <STRING>`: Set the API endpoint for the AI service
- `-l, --language <STRING>`: Define the language for generated commit messages
- `-v, --verbose`: Show verbose output, including AI chat content
### Built-in Model Support
The following model prefixes are supported with their respective default endpoints:
| Model Prefix | Service Provider | Default Endpoint |
|---|---|---|
| `gpt-*` | OpenAI | https://api.openai.com/v1/chat/completions |
| `moonshot-*` | Moonshot AI | https://api.moonshot.cn/v1/chat/completions |
| `qwen-*` | Alibaba Qwen | https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions |
| `gemini-*` | Google Gemini | https://generativelanguage.googleapis.com/v1beta/openai/ |
| `doubao-*` | ByteDance Doubao | https://ark.cn-beijing.volces.com/api/v3/chat/completions |
| `glm-*` | THUDM GLM | https://open.bigmodel.cn/api/paas/v4/chat/completions |
| `deepseek-*` | DeepSeek | https://api.deepseek.com/chat/completions |
| `qianfan-*` | Baidu Qianfan | https://qianfan.baidubce.com/v2/chat/completions |
You can use any model name starting with these prefixes, and the corresponding endpoint will be used automatically (so you don't need to set `--url`).
## Update
- Check for updates and install the latest version
- Force an update even if you're already on the latest version
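The update subcommand and its force flag are not spelled out above, so the names below are assumptions (verify with `gim --help`):

```shell
gim update           # check for updates and install the latest version
gim update --force   # force an update even if already on the latest version
```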
The application will automatically check for updates when you run it.