pub async fn pick_default_client() -> Result<Box<dyn LlmClient>, LlmError>
Select an LLM client at runtime based on the environment:

- Probe Ollama at `OLLAMA_BASE_URL` (default `http://localhost:11434`) via `GET /api/version`; if it responds within 500 ms, use it.
- Otherwise, if `ANTHROPIC_API_KEY` is set, use `AnthropicHaikuClient`.
- Otherwise, return `LlmError::NoLlmAvailable`.
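
A minimal sketch of this selection logic is shown below. The `OllamaClient` type, the constructor signatures, and the use of `reqwest` for the probe are assumptions made for illustration; only `LlmClient`, `AnthropicHaikuClient`, and `LlmError::NoLlmAvailable` are named in the documented API.

```rust
use std::time::Duration;

pub async fn pick_default_client() -> Result<Box<dyn LlmClient>, LlmError> {
    // 1. Probe Ollama at OLLAMA_BASE_URL (default http://localhost:11434)
    //    via GET /api/version, giving up after 500 ms.
    let base_url = std::env::var("OLLAMA_BASE_URL")
        .unwrap_or_else(|_| "http://localhost:11434".to_string());
    let probe = reqwest::Client::new()
        .get(format!("{base_url}/api/version"))
        .timeout(Duration::from_millis(500))
        .send()
        .await;
    if matches!(probe, Ok(resp) if resp.status().is_success()) {
        // Hypothetical constructor; the real crate may configure the client differently.
        return Ok(Box::new(OllamaClient::new(base_url)));
    }

    // 2. Fall back to Anthropic if an API key is present.
    if let Ok(key) = std::env::var("ANTHROPIC_API_KEY") {
        // Hypothetical constructor signature.
        return Ok(Box::new(AnthropicHaikuClient::new(key)));
    }

    // 3. No usable backend was found.
    Err(LlmError::NoLlmAvailable)
}
```

The Ollama probe runs first so that a local model is preferred over a paid API when both are available; the 500 ms timeout keeps startup fast when no local server is running.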