# git-semantic

Search your git history using natural language - find commits by what they mean, not just what they say.
Stop scrolling through hundreds of commits with `git log --grep`. Just describe what you're looking for in plain English.

## Why?
Traditional git search is keyword-based: you have to guess the exact words the author used.
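To see the limitation concretely, here is a throwaway-repo sketch (the commit message is hypothetical, chosen just to show the miss):

```sh
# Throwaway demo repo
cd "$(mktemp -d)" && git init -q .
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "fix concurrent access in session cache"

git log --grep="race condition" --oneline   # no output: the keyword doesn't match
git log --grep="concurrent" --oneline       # finds the commit
```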
git-semantic understands meaning. Search for "race condition" and find commits about "concurrent access" or "synchronization bugs" - even if those exact words aren't in the message.
## Features
- 🔍 Natural language search - "fix memory leak" finds more than just those exact words
- 🚀 Fast - Results in < 100ms
- 🔒 Private - Everything runs locally with ONNX, no API keys or cloud services
- 📦 Zero config - Works out of the box
- 🎯 Smart filtering - By author, date, file, and more
## Installation

### Using Cargo (Recommended)
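Assuming the crate is published on crates.io under the same name, installation via Cargo would be:

```sh
cargo install git-semantic
```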
Alternatively, install a prebuilt binary for your OS from the releases page.
## Quick Start
1. One-time setup (downloads AI model, ~130MB)
2. Index your repository
3. Search!
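The commands for these steps were lost from this snippet. Assuming hypothetical subcommand names `setup`, `index`, and `search` (check `git-semantic --help` for the real ones), the flow might look like:

```sh
# 1. One-time setup (downloads AI model, ~130MB)
git-semantic setup

# 2. Index your repository
git-semantic index

# 3. Search!
git-semantic search "fix race condition in login"
```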
## Usage

### Basic Search
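A basic query (again assuming a hypothetical `search` subcommand) is just a plain-English description in quotes:

```sh
git-semantic search "memory leak in request handler"
```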
### Filters
- By author
- By date
- By file
- Limit results
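The actual flag names were lost from this section; the ones below are hypothetical (see `git-semantic search --help` for the real ones):

```sh
# By author
git-semantic search "refactor auth" --author alice

# By date
git-semantic search "refactor auth" --since 2024-01-01

# By file
git-semantic search "refactor auth" --file src/auth.rs

# Limit results
git-semantic search "refactor auth" --limit 5
```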
### Index Management
- Update index with new commits
- Show index statistics
- Quick index (messages only, faster)
- Full index (messages + diffs, more context)
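As above, the exact commands were lost; with hypothetical flag names, these operations might look like:

```sh
# Update index with new commits
git-semantic index --update

# Show index statistics
git-semantic index --stats

# Quick index (messages only, faster)
git-semantic index --quick

# Full index (messages + diffs, more context)
git-semantic index --full
```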
## How It Works
- Downloads BGE-small-en-v1.5 - A compact AI model (130MB) for semantic embeddings
- Indexes your repo - Converts each commit into a 384-dimensional vector
- Stores locally - Binary index saved in `.git/semantic-index` (ignored by git)
- Searches by meaning - Your query becomes a vector; similar commit vectors are found using cosine similarity
- ONNX Runtime - Fast local inference, no cloud services needed
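The search step above - cosine similarity over L2-normalized vectors - can be sketched in a few lines of Rust. This is a minimal illustration, not the project's actual code, and it uses toy 4-dimensional vectors in place of the model's 384-dimensional embeddings:

```rust
/// L2-normalize a vector in place, so its length becomes 1.
fn l2_normalize(v: &mut [f32]) {
    let norm = v.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm > 0.0 {
        for x in v.iter_mut() {
            *x /= norm;
        }
    }
}

/// For L2-normalized vectors, cosine similarity reduces to a dot product.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    a.iter().zip(b).map(|(x, y)| x * y).sum()
}

fn main() {
    // Toy "embeddings" for a query and a commit (real ones are 384-dim).
    let mut query = vec![1.0, 2.0, 0.0, 1.0];
    let mut commit = vec![1.0, 2.1, 0.1, 0.9];
    l2_normalize(&mut query);
    l2_normalize(&mut commit);
    println!("similarity = {:.3}", cosine_similarity(&query, &commit));
}
```

At search time, every indexed commit vector is scored this way against the query vector and the top matches are returned.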
Stored locations:
- Model: `~/Library/Application Support/com.git-semantic.git-semantic/models/` (macOS)
- Index: `.git/semantic-index` (per repository)
## Technical Details
- Model: BGE-small-en-v1.5 (BAAI)
- Runtime: ONNX Runtime for fast local inference
- Storage: Bincode serialization (~0.04MB per 7 commits, i.e. roughly 6KB per commit)
- Search: Cosine similarity with L2 normalization
- Inference: < 100ms per query
## Contributing

Contributions welcome! Please use the Conventional Commits format for commit messages.
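For example, the `type(scope): description` shape defined by the Conventional Commits specification (the messages below are illustrative):

```
feat(search): add date range filtering
fix(index): handle repositories with no commits
docs: clarify model download size
```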
## Requirements
- Git repository (obviously!)
- ~130MB disk space for the AI model
- Rust 1.70+ (if building from source)
## License
MIT
Built with: Rust 🦀 and ❤️