# ask

AI-powered CLI assistant — modern Unix meets AI

`ask` is a fast, ergonomic command-line tool that bridges traditional Unix workflows with AI capabilities. Get instant command suggestions, query system information naturally, and tap into AI for complex questions — all from your terminal.
## Features

- Multi-provider support — Works with Anthropic, OpenAI, Gemini, Ollama, Perplexity, Groq, Mistral, Cohere, Together, or custom endpoints
- Command suggestions — Get the right command without leaving the terminal
- Natural language system queries — Ask "what is using port 8080" instead of remembering `lsof` flags
- Streaming AI responses — Real-time output, no waiting for complete responses
- Zero friction — No quotes needed for queries, works with pipes, respects `NO_COLOR`
- Fast — Native Rust binary, ~3MB, starts instantly
- Secure — Security warnings for untrusted providers, no accidental data leaks
## Installation
### From crates.io (recommended)
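Assuming the crate is published under the name `ask` (verify the actual crate name on crates.io):

```sh
cargo install ask
```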
### Pre-built binaries
Download from GitHub Releases.
### From source
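The standard Cargo workflow should apply (repository URL omitted here):

```sh
# inside a clone of the repository
cargo build --release

# or install the binary straight from the checkout
cargo install --path .
```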
## Quick Start
```sh
# Set your API key (Anthropic is the default provider)
export ANTHROPIC_API_KEY="sk-ant-..."
# Or save it to config (secure input, not saved in shell history);
# the exact subcommand is not shown here — see `ask --help`

# Start asking (query shown is illustrative)
ask how do I list open ports
```
### Using Other Providers
```sh
# (queries shown are examples; ASK_PROVIDER selects the provider per invocation)

# Use OpenAI
ASK_PROVIDER=openai ask "explain rust lifetimes"

# Use local Ollama (no API key needed)
ASK_PROVIDER=ollama ask what is a symlink

# Use Groq for fast inference
ASK_PROVIDER=groq ask "summarize TCP slow start"
```
## Usage
### Command Suggestions
Ask how to do something and get the command:
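For example (the suggested command in the output is illustrative):

```sh
$ ask how do I compress a folder
tar -czf archive.tar.gz folder/
```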
### System Information
Query system resources directly:
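For instance (only `ask system disk` appears elsewhere in this README; other subcommand names are assumptions):

```sh
ask system disk      # disk usage
ask system memory    # memory usage (subcommand name is an assumption)
```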
### Natural Language System Queries
Ask about your system in plain English:

```sh
# (example queries; exact phrasing is flexible)
ask what is using all my memory   # Shows top memory consumers
ask what is hogging my cpu        # Shows top CPU consumers
```
### Command Explanation
Get help for any command:
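For example (phrasing is illustrative):

```sh
ask what does chmod 755 mean
```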
### AI Queries
For anything else, ask Claude:
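For example (the second query shows the no-quotes-needed form):

```sh
ask "explain kubernetes pods"
ask why is my docker build slow
```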
Responses stream in real-time.
### Interactive Prompts
For use in scripts:
```sh
# (the flag name is illustrative; check `ask --help` for the exact option)
if ask --confirm "Continue with cleanup?"; then
  echo "proceeding"
fi
```
Returns exit code 0 for yes, 1 for no.
### Input Methods
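Since `ask` works with pipes, you can feed it input on stdin (example is illustrative):

```sh
# pipe a log in as context for the question
tail -n 50 error.log | ask what went wrong
```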
## Configuration
Config file location: ~/.config/ask/config.json
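A sketch of what the file might contain (field names are assumptions inferred from the provider and model options documented below, not a verified schema):

```json
{
  "provider": "anthropic",
  "model": "claude-sonnet-4-20250514"
}
```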
### Supported Providers

| Provider | Default Model | API Key Env Var |
|---|---|---|
| `anthropic` (default) | `claude-sonnet-4-20250514` | `ANTHROPIC_API_KEY` |
| `openai` | `gpt-4o` | `OPENAI_API_KEY` |
| `gemini` | `gemini-1.5-flash` | `GEMINI_API_KEY` |
| `ollama` | `llama3.2` | (none required) |
| `perplexity` | `llama-3.1-sonar-small-128k-online` | `PERPLEXITY_API_KEY` |
| `groq` | `llama-3.3-70b-versatile` | `GROQ_API_KEY` |
| `mistral` | `mistral-small-latest` | `MISTRAL_API_KEY` |
| `cohere` | `command-r-plus` | `COHERE_API_KEY` |
| `together` | `meta-llama/Llama-3.3-70B-Instruct-Turbo` | `TOGETHER_API_KEY` |
### Environment Variables

| Variable | Description |
|---|---|
| `ASK_PROVIDER` | Override provider (e.g., `openai`, `ollama`) |
| `ASK_API_URL` | Override API endpoint URL |
| `ASK_MODEL` | Override model |
| `ASK_API_KEY` | Fallback API key for any provider |
| `ANTHROPIC_API_KEY` | Anthropic API key |
| `OPENAI_API_KEY` | OpenAI API key |
| `GEMINI_API_KEY` | Google Gemini API key |
| `ASK_NO_COLOR` | Disable colored output |
| `NO_COLOR` | Disable colored output (standard) |
### Custom Providers
You can use custom API endpoints, but ask will show a security warning:
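For example, pointing at a self-hosted endpoint via the documented `ASK_API_URL` variable (the URL is a placeholder, and the exact warning text may differ):

```sh
ASK_API_URL=https://llm.internal.example/v1 ask "hello"
# ask warns before sending data to an endpoint it doesn't recognize
```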
This prevents accidental data leaks to untrusted endpoints.
## Why `ask`?

| Instead of... | Use... |
|---|---|
| Googling "how to tar a directory" | `ask how do I compress a folder` |
| `man lsof`, scrolling for flags | `ask what is using port 8080` |
| Opening ChatGPT in browser | `ask "explain kubernetes pods"` |
| `df -h` (if you remember it) | `ask system disk` |

`ask` keeps you in the terminal and in flow.
## Requirements
- macOS or Linux
- Rust 1.70+ (for building from source)
- API key from a supported provider, or local Ollama (only needed for AI queries — system info, command suggestions, and explanations work without any API key)
## Contributing
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
This project follows the Contributor Covenant Code of Conduct.
## License
Built with Rust. Inspired by the wave of modern Unix tools like ripgrep, fd, bat, and eza.