Module llm

Structs

ChatMessage
Chat message for /api/chat (multi-turn with tool support).
ChatToolResponse
Result of chat_with_tools call.
LlmCallStats
Stats captured from an LLM call for pipeline reports.
LlmClient
Unified LLM client — routes to Claude API, Grok API, or Ollama based on model name. Supports both blocking and streaming generation; a usage sketch follows this list.
OllamaTool
Ollama tool definition (JSON Schema); the wire shape is sketched after this list.
OllamaToolCall
Tool call returned by the model.
OllamaToolCallFunction
Function invoked by a tool call (name and arguments).
OllamaToolFunction
Function definition nested inside an OllamaTool.

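A minimal usage sketch for LlmClient. Only the routing-by-model-name and blocking/streaming behavior comes from the descriptions above; the constructor and method names (new, generate) are assumptions for illustration, not this crate's confirmed API.

    use llm::LlmClient;

    fn main() {
        // Routing is by model name (Claude API, Grok API, or Ollama);
        // "llama3.1" here would be served by Ollama.
        let client = LlmClient::new("llama3.1");
        // Blocking generation; the method name is an assumption.
        match client.generate("Explain borrowing in one sentence") {
            Ok(reply) => println!("{reply}"),
            Err(e) => eprintln!("llm error: {e}"),
        }
    }
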
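The Ollama tool structs mirror the JSON Schema tool format of Ollama's /api/chat endpoint. The sketch below builds that wire shape with serde_json; the field layout follows Ollama's public API, while the exact mapping onto OllamaTool / OllamaToolFunction fields is an assumption.

    use serde_json::json;

    fn main() {
        // Likely wire shape behind OllamaTool / OllamaToolFunction:
        // a "function" tool described by JSON Schema parameters.
        let tool = json!({
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Look up current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": { "city": { "type": "string" } },
                    "required": ["city"]
                }
            }
        });
        println!("{}", serde_json::to_string_pretty(&tool).unwrap());
    }
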
Enums

StreamEvent
Events emitted during streaming generation; a consumer sketch follows below.

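This page does not list StreamEvent's variants, so the sketch below uses a stand-in enum with assumed variant names (Token, Done, Error) to show the typical shape of a streaming consumer.

    // Stand-in enum; the real StreamEvent's variants are not shown here.
    enum StreamEvent {
        Token(String),
        Done,
        Error(String),
    }

    fn handle(event: StreamEvent) {
        match event {
            // Incremental text as the model generates.
            StreamEvent::Token(t) => print!("{t}"),
            // Generation finished.
            StreamEvent::Done => println!(),
            // Something went wrong mid-stream.
            StreamEvent::Error(e) => eprintln!("stream error: {e}"),
        }
    }

    fn main() {
        for ev in [
            StreamEvent::Token("Hello".into()),
            StreamEvent::Token(", world".into()),
            StreamEvent::Done,
        ] {
            handle(ev);
        }
    }
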
Functions

extract_code
Extract clean code from an LLM response that may contain markdown fences; a stand-alone sketch follows this list.
ollama_url
Get the Ollama base URL from the OLLAMA_HOST env var, or default to localhost. Supports: “host:port”, “http://host:port”, or just “host”. A sketch of this parsing follows this list.
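
extract_code's implementation is not shown on this page; the following is a minimal stand-alone sketch of the fence-stripping it presumably performs.

    // Stand-alone sketch of markdown fence stripping; not the crate's
    // actual extract_code, whose signature is not shown on this page.
    fn strip_fences(response: &str) -> String {
        let trimmed = response.trim();
        if let Some(rest) = trimmed.strip_prefix("```") {
            // Drop the language tag on the opening fence line, if any.
            let body = rest.split_once('\n').map(|(_, b)| b).unwrap_or("");
            // Drop the closing fence and trailing whitespace.
            body.strip_suffix("```").unwrap_or(body).trim_end().to_string()
        } else {
            trimmed.to_string()
        }
    }

    fn main() {
        let raw = "```rust\nfn main() {}\n```";
        assert_eq!(strip_fences(raw), "fn main() {}");
    }
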
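And a sketch of the documented ollama_url behavior: read OLLAMA_HOST, accept “host:port”, “http://host:port”, or a bare “host”, and fall back to localhost. The port 11434 used below is Ollama's standard port, assumed here since the page does not state the default.

    use std::env;

    // Sketch of the documented OLLAMA_HOST handling; the real function's
    // exact defaults are not shown on this page.
    fn ollama_url_sketch() -> String {
        let host = env::var("OLLAMA_HOST").unwrap_or_else(|_| "localhost:11434".into());
        // Prepend a scheme if the value is "host:port" or a bare "host".
        let url = if host.starts_with("http://") || host.starts_with("https://") {
            host
        } else {
            format!("http://{host}")
        };
        // A bare "host" gets Ollama's standard port (11434, assumed default).
        let after_scheme = &url[url.find("://").map(|i| i + 3).unwrap_or(0)..];
        if after_scheme.contains(':') {
            url
        } else {
            format!("{url}:11434")
        }
    }

    fn main() {
        println!("{}", ollama_url_sketch());
    }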