LLM HTTP client for chat completions with tool calling.

Supports two API formats:

- OpenAI-compatible: `/chat/completions` (GPT-4, DeepSeek, Qwen, etc.)
- Claude Native: `/v1/messages` (Anthropic Claude)

Auto-detects which API to use based on the model name or API base URL.

Ported from Python `AgenticLoop._call_openai` / `_call_claude`.
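The auto-detection described above can be sketched roughly as follows. This is an illustrative guess, not the crate's actual code: the `ToolFormat` enum name and the specific heuristics (prefix match on `claude`, substring match on `anthropic` in the base URL) are assumptions.

```rust
/// Assumed format enum for illustration; the crate's actual type may differ.
#[derive(Debug, PartialEq)]
enum ToolFormat {
    /// OpenAI-compatible /chat/completions
    OpenAi,
    /// Anthropic Claude native /v1/messages
    Claude,
}

/// Rough sketch of format auto-detection from model name or API base.
fn detect_tool_format(model: &str, api_base: &str) -> ToolFormat {
    let model = model.to_lowercase();
    let base = api_base.to_lowercase();
    // Claude model names and Anthropic endpoints imply the native Messages API.
    if model.starts_with("claude") || base.contains("anthropic") {
        ToolFormat::Claude
    } else {
        // GPT-4, DeepSeek, Qwen, etc. are served via /chat/completions.
        ToolFormat::OpenAi
    }
}
```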
Structs§

- `ChatCompletionResponse`
- `Choice`
- `ChoiceMessage`
- `LlmClient`: LLM client supporting both OpenAI and Claude API formats.
- `Usage`
Functions§

- `detect_tool_format`: Detect API format from model name or API base.
- `is_context_overflow_error`: Check if an error is a context overflow (token limit exceeded). Ported from Python `_is_context_overflow_error`.
- `truncate_tool_messages`: Truncate all tool result messages in place to reduce context size. Ported from Python `_truncate_tool_messages_in_place`.
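A hedged sketch of what the two context-recovery helpers might look like. The error substrings, the minimal `Message` struct, and the truncation marker are all assumptions for illustration; the crate's real signatures and heuristics may differ.

```rust
/// Assumed minimal message type for illustration only.
struct Message {
    role: String,
    content: String,
}

/// Sketch: treat an error as a context overflow if its text matches
/// common provider phrasings (the substring list is an assumption).
fn is_context_overflow_error(err: &str) -> bool {
    let err = err.to_lowercase();
    ["context length", "context_length_exceeded", "too many tokens", "maximum context"]
        .iter()
        .any(|needle| err.contains(needle))
}

/// Sketch: shrink every tool-result message in place by keeping the
/// first `keep_chars` characters and appending a truncation marker.
fn truncate_tool_messages(messages: &mut [Message], keep_chars: usize) {
    for msg in messages.iter_mut() {
        // Only tool results are truncated; user/assistant turns stay intact.
        if msg.role == "tool" && msg.content.chars().count() > keep_chars {
            let mut kept: String = msg.content.chars().take(keep_chars).collect();
            kept.push_str("... [truncated]");
            msg.content = kept;
        }
    }
}
```

Truncating by `chars()` rather than byte index avoids panicking on multi-byte UTF-8 boundaries, which `String::truncate` would otherwise do.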