Module llm

LLM HTTP client for chat completions with tool calling.

Supports two API formats:

  • OpenAI-compatible: /chat/completions (GPT-4, DeepSeek, Qwen, etc.)
  • Claude Native: /v1/messages (Anthropic Claude)

Auto-detects which API to use based on model name or API base URL.

Ported from Python AgenticLoop._call_openai / _call_claude.
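The auto-detection described above can be sketched as a small heuristic. This is a minimal illustration, not the module's actual implementation: the `ToolFormat` enum name, the variant names, and the matching substrings (`claude`, `anthropic`) are assumptions.

```rust
/// Which wire format to use when calling the endpoint (sketch).
#[derive(Debug, PartialEq)]
enum ToolFormat {
    /// OpenAI-compatible `/chat/completions` request shape.
    OpenAi,
    /// Claude native `/v1/messages` request shape.
    Claude,
}

/// Detect the API format from the model name or API base URL.
/// Heuristic sketch: Claude model names and Anthropic endpoints map to
/// the native Messages API; everything else is treated as
/// OpenAI-compatible.
fn detect_tool_format(model: &str, api_base: &str) -> ToolFormat {
    let model = model.to_lowercase();
    let api_base = api_base.to_lowercase();
    if model.contains("claude") || api_base.contains("anthropic") {
        ToolFormat::Claude
    } else {
        ToolFormat::OpenAi
    }
}

fn main() {
    // Claude model name or Anthropic base URL -> native API.
    assert_eq!(
        detect_tool_format("claude-sonnet", "https://api.anthropic.com"),
        ToolFormat::Claude
    );
    // Anything else -> OpenAI-compatible `/chat/completions`.
    assert_eq!(
        detect_tool_format("deepseek-chat", "https://api.deepseek.com"),
        ToolFormat::OpenAi
    );
}
```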

Structs§

ChatCompletionResponse
Choice
ChoiceMessage
LlmClient
LLM client supporting both OpenAI and Claude API formats.
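The response types listed above can be sketched as plain Rust structs. The field names here follow the OpenAI chat-completions wire format and are assumptions, not the module's actual definitions; the real types would likely derive serde's `Deserialize` for parsing the HTTP response body.

```rust
/// One message returned by the model (sketch of the wire shape).
struct ChoiceMessage {
    role: String,
    /// `None` when the model responds with tool calls instead of text.
    content: Option<String>,
}

/// One completion choice; the API returns a list of these.
struct Choice {
    message: ChoiceMessage,
}

/// Top-level chat-completion response (sketch).
struct ChatCompletionResponse {
    choices: Vec<Choice>,
}

fn main() {
    // Extract the assistant text from the first choice, if any.
    let resp = ChatCompletionResponse {
        choices: vec![Choice {
            message: ChoiceMessage {
                role: "assistant".to_string(),
                content: Some("hello".to_string()),
            },
        }],
    };
    let text = resp.choices[0].message.content.as_deref().unwrap_or("");
    assert_eq!(text, "hello");
}
```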

Functions§

detect_tool_format
Detect API format from model name or API base.
is_context_overflow_error
Check if an error is a context overflow (token limit exceeded). Ported from Python _is_context_overflow_error.
truncate_tool_messages
Truncate all tool result messages in place to reduce context size. Ported from Python _truncate_tool_messages_in_place.
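The two recovery helpers above can be sketched together: detect a token-limit error from the provider's error text, then shrink the conversation by clipping tool results in place. Everything here is an assumption for illustration: the matched error phrases, the `Message` struct, the `"tool"` role check, and the truncation marker are not taken from the module.

```rust
/// Sketch of the overflow check: scan the error text for phrases that
/// providers commonly use for token-limit errors. The phrase list is
/// an assumption, not the module's actual list.
fn is_context_overflow_error(err: &str) -> bool {
    let err = err.to_lowercase();
    ["context length", "maximum context", "token limit", "too many tokens"]
        .iter()
        .any(|needle| err.contains(needle))
}

/// Hypothetical chat message type for this sketch.
struct Message {
    role: String,
    content: String,
}

/// Sketch of in-place truncation: clip every tool-result message to
/// `max_chars` characters and append a marker, leaving other roles
/// untouched.
fn truncate_tool_messages(messages: &mut [Message], max_chars: usize) {
    for msg in messages.iter_mut().filter(|m| m.role == "tool") {
        if msg.content.chars().count() > max_chars {
            let clipped: String = msg.content.chars().take(max_chars).collect();
            msg.content = format!("{clipped}... [truncated]");
        }
    }
}

fn main() {
    assert!(is_context_overflow_error("400: maximum context length exceeded"));
    assert!(!is_context_overflow_error("connection refused"));

    let mut msgs = vec![
        Message { role: "tool".to_string(), content: "x".repeat(100) },
        Message { role: "user".to_string(), content: "y".repeat(100) },
    ];
    truncate_tool_messages(&mut msgs, 10);
    // Tool result is clipped; the user message is left alone.
    assert!(msgs[0].content.ends_with("[truncated]"));
    assert_eq!(msgs[1].content.len(), 100);
}
```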