Shelf: AI-based command-line tools for developers
Shelf is a command-line tool for managing dotfiles, generating git commit messages, and reviewing code with AI. It provides a simple interface for tracking configuration files across your system, and it integrates with multiple AI providers to generate meaningful commit messages through git hooks and to perform comprehensive code reviews. With support for both local and cloud-based AI models, Shelf makes configuration-file management, git commits, and code reviews effortless.
Features
- Track dotfiles from anywhere in your file system recursively
- List all tracked dotfiles
- Remove dotfiles from the database recursively
- AI-powered git commit message generation with multiple providers:
  - Groq
  - Google Gemini
  - Anthropic Claude
  - OpenAI
  - xAI Grok
  - Ollama (local)
- Git hooks integration for automatic commit message generation
Installation
To install Shelf, you need to have Rust and Cargo installed on your system. If you don't have them, you can install them from rustup.rs.
Once you have Rust and Cargo installed, you can build and install Shelf using the following command:
cargo install --path .
Usage
Shelf provides commands for both dotfile management and git integration:
Dotfile Management
# Add a new dotfile to track
# List all tracked dotfiles
# Remove a dotfile from tracking
# Interactive selection of dotfiles to track
# Show help
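As a sketch, these invocations might look like the following, assuming a `dotfile` subcommand (mentioned under Development below) with `add`, `list`, and `remove` actions; the exact names and flags are assumptions, so check `shelf -h` for the real interface:

```shell
# Hypothetical invocations; subcommand and action names are assumptions.
shelf dotfile add ~/.vimrc      # add a new dotfile to track
shelf dotfile list              # list all tracked dotfiles
shelf dotfile remove ~/.vimrc   # remove a dotfile from tracking
shelf -h                        # show help
```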
Each command can be run with -h or --help for more information.
Git AI Integration
The ai subcommand provides AI-powered features:
# Generate commit message for staged changes
# Install git hook for automatic message generation
# Remove git hook
# Configure AI provider
# Use specific provider for one commit
# List current configuration
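A hedged sketch of what the `ai` invocations might look like; the action names, flags, and values below are assumptions, so consult `shelf ai -h` for the actual interface:

```shell
# Hypothetical invocations; action and flag names are assumptions.
shelf ai commit                      # generate a commit message for staged changes
shelf ai hook install                # install the git hook
shelf ai hook uninstall              # remove the git hook
shelf ai config set provider groq    # configure the AI provider
shelf ai commit --provider ollama    # use a specific provider for one commit
shelf ai config list                 # list the current configuration
```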
The AI-powered features support several different AI providers:
- Groq (default): GroqCloud-based models
- Google Gemini: Cloud-based using Gemini models
- OpenAI: Cloud-based using GPT models
- Anthropic Claude: Cloud-based using Claude models
- xAI Grok: Cloud-based using Grok models
- Ollama: Local, privacy-friendly AI using models like Qwen
The git hook integrates seamlessly with your normal git workflow:
# Hook will automatically generate message if none provided
# Your message takes precedence
# AI helps with amending
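With the hook installed, the workflow below is ordinary git; only the hook's behavior (filling in a message when none is provided) is Shelf-specific:

```shell
git commit                      # hook generates a message automatically
git commit -m "my own message"  # your message takes precedence
git commit --amend              # AI helps with amending
```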
Code Review with AI
Shelf can assist in code review by analyzing pull requests and providing AI-powered feedback:
# Review the current staged branch's changes
# Review with specific provider
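A sketch of review invocations, assuming a `review` action under the `ai` subcommand and a `--provider` flag (both are assumptions; check `shelf ai -h`):

```shell
# Hypothetical invocations; action and flag names are assumptions.
shelf ai review                       # review the currently staged changes
shelf ai review --provider anthropic  # review with a specific provider
```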
The AI review provides:
- Code quality analysis
- Potential bug detection
- Style guide compliance checks
- Security vulnerability scanning
- Performance improvement suggestions
- Best practice recommendations
Migration from v0.8.7 to newer versions
If you're upgrading from Shelf v0.8.7, here are the key changes and migration steps:
Migration Steps
- Convert your existing config:
# Migration hints
# Apply changes
Prompts
Prompt templates for commit messages and code reviews are stored in the user's configuration directory. You can customize these templates to tailor the AI's output to your specific needs.
Shell Completion
Shelf supports generating shell completion scripts for various shells. You can generate these
scripts using the completion subcommand:
# Generate completion script for Bash
# Generate completion script for Zsh
# Generate completion script for Fish
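Assuming the `completion` subcommand takes the shell name as an argument (the exact syntax may differ), generating and installing the scripts might look like:

```shell
# Hypothetical invocations; the argument syntax is an assumption.
shelf completion bash > ~/.shelf-completion.bash               # Bash
shelf completion zsh  > ~/.zfunc/_shelf                        # Zsh
shelf completion fish > ~/.config/fish/completions/shelf.fish  # Fish
```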
To use the completion scripts:
- For Bash, source the generated completion script from your `~/.bashrc`.
- For Zsh, place the `_shelf` file in `~/.zfunc`, then add `source ~/.zfunc/_shelf` to `~/.zshrc`.
- For Fish, place the `shelf.fish` file in `~/.config/fish/completions`.
After setting up the completion script, restart your shell or source the respective configuration file to enable completions for the shelf command.
Configuration
AI settings are stored in ~/.config/shelf/ai.json (or $XDG_CONFIG_HOME/shelf/ai.json if set). You can configure:
- `provider`: AI provider to use (`openai`, `anthropic`, `gemini`, `groq`, `xai`, or `ollama`)
- `model`: Ollama model to use (default: `qwen2.5-coder`)
- `openai_api_key`: OpenAI API key for GPT models
- `ollama_host`: Ollama server URL (default: `http://localhost:11434`)
Example configuration:
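Based on the keys listed above, a configuration might look like the following (the exact schema is an assumption, and the API key is a placeholder):

```json
{
  "provider": "ollama",
  "model": "qwen2.5-coder",
  "openai_api_key": "sk-...",
  "ollama_host": "http://localhost:11434"
}
```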
Development
To build the project locally:
cargo build
To run tests:
cargo test
To run the project directly without installing:
cargo run --bin shelf -- [SUBCOMMAND]
Replace `[SUBCOMMAND]` with the command you want to run, such as `dotfile` or `ai`.
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the MIT License - see the LICENSE file for details.