§sgr-agent — LLM client + agent framework
Pure Rust. No dlopen, no external binaries. Works on iOS, Android, WASM — anywhere reqwest+rustls compiles.
§LLM Client (default)
- Structured output — response conforms to JSON Schema (SGR envelope)
- Function calling — tools as typed structs, model picks & fills params
- Flexible parser — extract JSON from markdown, broken JSON, streaming chunks
- Backends: Gemini (Google AI + Vertex AI), OpenAI (+ OpenRouter, Ollama)
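The flexible parser's internals are not shown on this page; as a rough, self-contained sketch of what "extract JSON from markdown" involves, here is a balanced-brace scanner that pulls the first JSON object out of surrounding prose. The function name is illustrative and is not the crate's `parse_flexible` API.

```rust
/// Illustrative only: scan messy LLM output for the first balanced
/// `{ ... }` span, ignoring braces that appear inside string literals.
fn extract_json(raw: &str) -> Option<String> {
    let start = raw.find('{')?;
    let mut depth = 0usize;
    let mut in_str = false;
    let mut escaped = false;
    for (i, c) in raw[start..].char_indices() {
        match c {
            '"' if !escaped => in_str = !in_str,
            '{' if !in_str => depth += 1,
            '}' if !in_str => {
                depth -= 1;
                if depth == 0 {
                    return Some(raw[start..start + i + c.len_utf8()].to_string());
                }
            }
            _ => {}
        }
        escaped = c == '\\' && !escaped;
    }
    None // no object, or braces never balanced
}

fn main() {
    let reply = "Sure! Here is the call: {\"tool\": \"search\", \"args\": {\"q\": \"rust\"}} Done.";
    println!("{:?}", extract_json(reply));
}
```

A real implementation would also repair truncated or malformed JSON; this sketch only handles the common "object buried in prose" case.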
§Agent Framework (feature = "agent")
- Tool trait — define tools with typed args + async execute
- ToolRegistry — ordered collection, case-insensitive lookup, fuzzy resolve
- Agent trait — decides what tools to call given conversation history
- 3 agent variants: SgrAgent (structured output), ToolCallingAgent (native FC), FlexibleAgent (text parse)
- Agent loop — decide → execute → feed back, with 3-tier loop detection
- Progressive discovery — filter tools by relevance (TF-IDF scoring)
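The ToolRegistry's exact resolution rules are not documented on this page; the following hedged sketch shows insertion-ordered storage with case-insensitive lookup, using a unique-prefix fallback to stand in for fuzzy resolve. The type name and matching strategy are assumptions for illustration, not the crate's API.

```rust
/// Illustrative sketch, not the crate's ToolRegistry: insertion-ordered
/// names with case-insensitive exact lookup and a unique-prefix fallback.
struct Registry {
    tools: Vec<String>,
}

impl Registry {
    fn resolve(&self, query: &str) -> Option<&str> {
        let q = query.to_ascii_lowercase();
        // 1) exact match, ignoring case
        if let Some(t) = self.tools.iter().find(|t| t.to_ascii_lowercase() == q) {
            return Some(t.as_str());
        }
        // 2) "fuzzy" fallback: accept a prefix only if it is unambiguous
        let hits: Vec<&String> = self
            .tools
            .iter()
            .filter(|t| t.to_ascii_lowercase().starts_with(&q))
            .collect();
        match hits.as_slice() {
            [only] => Some(only.as_str()),
            _ => None, // no match, or ambiguous prefix
        }
    }
}

fn main() {
    let reg = Registry { tools: vec!["web_search".into(), "write_file".into()] };
    println!("{:?}", reg.resolve("Web_Search")); // case-insensitive hit
    println!("{:?}", reg.resolve("wri"));        // unique prefix
}
```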
Re-exports§
pub use oxide_client::OxideClient;
pub use oxide_chat_client::OxideChatClient;
pub use llm::Llm;
pub use coerce::coerce_value;
pub use flexible_parser::parse_flexible;
pub use flexible_parser::parse_flexible_coerced;
pub use schema::json_schema_for;
pub use schema::response_schema_for;
pub use tool::ToolDef;
pub use tool::tool;
pub use types::*;
Modules§
- agent
- Agent trait — decides what to do next given conversation history and tools.
- agent_loop
- Generic agent loop — drives agent + tools until completion or limit.
- agent_tool
- Tool trait — the core abstraction for agent tools.
- agents
- Agent variants — different strategies for LLM ↔ tool interaction.
- baml_parser
- Lightweight BAML parser — extracts classes, enums, functions from .baml files.
- benchmark
- Benchmark suite: 5 fixed tasks to measure agent quality.
- client
- LlmClient trait — abstract LLM backend for agent use.
- codegen
- Code generator: BAML AST → Rust source code with schemars + serde derives.
- coerce
- Fuzzy type coercion for LLM outputs.
- compaction
- LLM-based context compaction — summarize old messages to stay within limits.
- context
- Agent execution context — state and domain-specific data.
- discovery
- Progressive discovery — filter tools by relevance to current query.
- evolution
- Self-evolution: agent evaluates its own runs and proposes improvements.
- factory
- AgentFactory — create agents from configuration.
- flexible_parser
- Flexible JSON parser — extracts structured data from messy LLM output.
- gemini
- Gemini API client — structured output + function calling.
- llm
- Llm — provider-agnostic LLM client.
- model_info
- Model metadata — context window sizes and compaction budgets.
- openai
- OpenAI-compatible API client — works with OpenAI, OpenRouter, Ollama.
- openapi
- OpenAPI → Agent Tool: convert any API spec into a searchable, callable tool.
- oxide_chat_client
- OxideChatClient — LlmClient via Chat Completions API (not Responses).
- oxide_client
- OxideClient — LlmClient adapter for the openai-oxide crate.
- prompt_loader
- Load, cache, and merge prompt files from disk.
- registry
- Tool registry — ordered collection of tools with lookup and fuzzy resolve.
- retry
- RetryClient — wraps LlmClient with exponential backoff for transient errors.
- router
- Model router — routes requests to smart or fast model based on complexity.
- schema
- JSON Schema generation from Rust types via schemars.
- schema_simplifier
- SchemaSimplifier — converts JSON Schema to human-readable text.
- streaming
- Streaming abstraction — channel-based streaming for agent output.
- swarm
- Multi-agent swarm — spawn, manage, and coordinate sub-agents.
- swarm_tools
- Swarm tools — tools for the parent agent to manage sub-agents.
- tool
- Tool definitions — typed Rust structs → function declarations for LLM APIs.
- types
- union_schema
- Dynamic flat action schema builder — generates a single object JSON Schema from tool definitions at runtime. Used by SgrAgent for structured output.
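The discovery module's TF-IDF scoring is not spelled out on this page. As a rough illustration of the idea behind progressive discovery, here is a self-contained sketch that ranks tool descriptions against a query; the function name and weighting details (raw term counts, smoothed `ln(N/df) + 1` inverse document frequency) are assumptions, not the crate's implementation.

```rust
use std::collections::HashMap;

/// Illustrative TF-IDF relevance scoring: for each tool description,
/// sum tf * idf over the query terms. Higher score = more relevant.
fn tfidf_scores(query: &str, docs: &[&str]) -> Vec<f64> {
    let n = docs.len() as f64;
    let tokenize = |s: &str| -> Vec<String> {
        s.to_ascii_lowercase()
            .split(|c: char| !c.is_alphanumeric())
            .filter(|w| !w.is_empty())
            .map(str::to_string)
            .collect()
    };
    let doc_tokens: Vec<Vec<String>> = docs.iter().map(|d| tokenize(d)).collect();
    // document frequency: how many descriptions contain each term
    let mut df: HashMap<&str, f64> = HashMap::new();
    for toks in &doc_tokens {
        let mut seen: Vec<&str> = Vec::new();
        for t in toks {
            if !seen.contains(&t.as_str()) {
                seen.push(t.as_str());
                *df.entry(t.as_str()).or_insert(0.0) += 1.0;
            }
        }
    }
    tokenize(query)
        .iter()
        .map(|q| {
            doc_tokens
                .iter()
                .map(|toks| {
                    let tf = toks.iter().filter(|t| *t == q).count() as f64;
                    let idf = (n / df.get(q.as_str()).copied().unwrap_or(n)).ln() + 1.0;
                    tf * idf
                })
                .collect::<Vec<f64>>()
        })
        .fold(vec![0.0; docs.len()], |mut acc, scores| {
            for (a, s) in acc.iter_mut().zip(scores) {
                *a += s;
            }
            acc
        })
}

fn main() {
    let docs = ["search the web for pages", "write a file to disk"];
    println!("{:?}", tfidf_scores("search web", &docs));
}
```

With scores like these, a discovery layer can expose only the top-k tools to the model, keeping the prompt small as the tool set grows.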