tl – streaming, cached translation CLI
tl is a small CLI that streams translations through any OpenAI-compatible endpoint (local or remote). Configure multiple providers with their own endpoints, API keys, and models, then switch between them as needed.
Install
Using cargo (recommended)
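Assuming the crate is published under the name used by the release artifacts (`tl-cli` — an assumption, since the exact crate name is not shown here):

```sh
cargo install tl-cli
```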
Using installer scripts
macOS/Linux:
```sh
curl --proto '=https' --tlsv1.2 -LsSf https://github.com/d2verb/tl/releases/latest/download/tl-cli-installer.sh | sh
```
Windows (PowerShell):
```powershell
irm https://github.com/d2verb/tl/releases/latest/download/tl-cli-installer.ps1 | iex
```
From source
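A sketch of a source build, using the repository URL from the installer links and the standard cargo workflow:

```sh
git clone https://github.com/d2verb/tl
cd tl
cargo install --path .
```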
Getting Started
1. Add a provider
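The subcommand name here is an assumption (the exact command is not shown in this section); an interactive setup along these lines:

```sh
tl providers add
```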
Follow the prompts to configure your first provider. Example for local Ollama:
Provider name: ollama
Endpoint URL: http://localhost:11434
API key method: None (no auth required)
Models: gemma3:12b, llama3.2
2. Set defaults
Select your default provider, model, and target language.
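As a sketch, the resulting defaults in `~/.config/tl/config.toml` might look like this (key names assumed):

```toml
[defaults]
provider = "ollama"
model = "gemma3:12b"
to = "ja"
```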
3. Translate
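Assuming text is passed as a positional argument (the exact invocation is elided above):

```sh
tl "Hello, world!"
```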
You should see the translation stream in real time.
Usage
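The flags below are the ones documented in this README (`--style`, `--no-cache`); the positional-argument form, `--to`, and stdin support are assumptions:

```sh
tl --style formal "See you tomorrow"   # choose a preset style
tl --no-cache "Hello"                  # bypass the translation cache
echo "Hello" | tl --to ja              # (assumed) read from stdin, set target language
```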
Translations are cached (keyed on input, language, model, endpoint, and prompt) so rerunning the same source is fast and cheap.
Managing Providers
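The exact subcommands are not shown in this section; a typical layout (all names assumed) might be:

```sh
tl providers list            # show configured providers (assumed)
tl providers add             # interactively add a provider (assumed)
tl providers remove ollama   # remove a provider (assumed)
```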
Translation Styles
Styles control the tone and manner of translations. Four preset styles are available:
| Style | Description |
|---|---|
| `casual` | Casual, conversational tone |
| `formal` | Formal, business-appropriate |
| `literal` | Literal, close to source |
| `natural` | Natural, idiomatic expressions |
Use styles with the --style option:
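`--style` is documented above; the positional-argument form is an assumption:

```sh
tl --style formal "Can you send me the report?"
```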
Chat Mode
For interactive translation sessions:
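Assuming the subcommand is `tl chat` (the exact command is not shown here):

```sh
tl chat
```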
Type text and press Enter to translate. Available commands:
| Command | Description |
|---|---|
| `/help` | Show available commands |
| `/config` | Show current configuration |
| `/set style <name>` | Set translation style (or clear with `/set style`) |
| `/set to <lang>` | Change target language |
| `/set model <name>` | Change model |
| `/quit` | Exit chat mode |
Configuration Reference
Settings are stored in ~/.config/tl/config.toml:
```toml
[defaults]
provider = "ollama"
model = "gemma3:12b"
to = "ja"
style = "casual" # optional default style

[providers.ollama]
endpoint = "http://localhost:11434"
models = ["gemma3:12b", "llama3.2"]

[providers.openrouter]
endpoint = "https://openrouter.ai/api"
api_key_env = "OPENROUTER_API_KEY"
models = ["anthropic/claude-3.5-sonnet", "openai/gpt-4o"]

# Custom style table names below are illustrative; only their
# description/prompt fields appear in the original.
[styles.ojisan]
description = "Middle-aged man texting style"
prompt = "Translate with excessive emoji, overly familiar tone, and random punctuation."

[styles.keigo]
description = "Polite Japanese honorifics"
prompt = "Translate using polite Japanese with appropriate keigo (honorific language)."
```
Provider options
- `endpoint` (required) – OpenAI-compatible API endpoint
- `api_key_env` (optional) – environment variable name for the API key
- `api_key` (optional) – API key stored directly in the config (not recommended)
- `models` (optional) – available models for this provider
Custom style options
- `description` (required) – short description shown in `tl styles list`
- `prompt` (required) – instruction appended to the system prompt for the LLM
CLI options always override config file values.
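For example, a one-off override (the `--model` and `--to` flag names are assumptions):

```sh
tl --model llama3.2 --to fr "Good morning"
```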
Troubleshooting
- Run `tl languages` to see supported ISO 639-1 language codes.
- Pressing `Ctrl+C` while streaming aborts without polluting the cache.
- Use `--no-cache` to force a fresh API request.
- API key issues? Ensure the environment variable specified in `api_key_env` is set.