Crate astrid_llm

Astrid LLM - LLM provider abstraction with streaming support.

This crate provides:

  • LLM provider trait for backend abstraction
  • Claude (Anthropic) implementation
  • OpenAI-compatible implementation (LM Studio, OpenAI, vLLM, etc.)
  • Z.AI (Zhipu AI) implementation
  • Streaming response support
  • Tool use support

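To illustrate the abstraction idea, here is a minimal, hypothetical sketch of a provider trait. The real `LlmProvider` trait is async and richer than this; the names `Provider` and `EchoProvider` below are illustrative assumptions, not the crate's actual API.

```rust
// Hypothetical, simplified sketch of provider abstraction: callers depend
// on one trait and backends are swapped behind it. The real trait is async.
trait Provider {
    fn complete_simple(&self, prompt: &str) -> Result<String, String>;
}

// A trivial stand-in backend that echoes the prompt.
struct EchoProvider;

impl Provider for EchoProvider {
    fn complete_simple(&self, prompt: &str) -> Result<String, String> {
        Ok(format!("echo: {prompt}"))
    }
}

fn main() {
    // Trait objects let code accept any backend uniformly.
    let provider: Box<dyn Provider> = Box::new(EchoProvider);
    println!("{}", provider.complete_simple("hi").unwrap()); // prints "echo: hi"
}
```

This is the same shape the examples below rely on: swapping `ClaudeProvider` for `OpenAiCompatProvider` leaves calling code unchanged.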
§Example with Claude

use astrid_llm::{ClaudeProvider, LlmProvider, Message, ProviderConfig};

// Create provider
let config = ProviderConfig::new("your-api-key", "claude-sonnet-4-20250514");
let provider = ClaudeProvider::new(config);

// Simple completion
let response = provider.complete_simple("What is 2+2?").await?;
println!("Response: {}", response);

§Example with LM Studio

use astrid_llm::{OpenAiCompatProvider, LlmProvider};

// Connect to LM Studio running locally
let provider = OpenAiCompatProvider::lm_studio();

// Or with a specific model
let provider = OpenAiCompatProvider::lm_studio_with_model("llama-3.1-8b");

let response = provider.complete_simple("Hello!").await?;
println!("Response: {}", response);

§Streaming

use astrid_llm::{ClaudeProvider, LlmProvider, Message, ProviderConfig, StreamEvent};
use futures::StreamExt;

let provider = ClaudeProvider::new(ProviderConfig::new("api-key", "claude-sonnet-4-20250514"));
let messages = vec![Message::user("Tell me a story")];

let mut stream = provider.stream(&messages, &[], "").await?;

while let Some(event) = stream.next().await {
    match event? {
        StreamEvent::TextDelta(text) => print!("{}", text),
        StreamEvent::Done => println!("\n[Done]"),
        _ => {}
    }
}
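
A common pattern with the loop above is accumulating the deltas into a full response string. The sketch below shows that logic with a plain iterator standing in for the async stream, and its own local `StreamEvent` enum (a simplification of the crate's, which has more variants):

```rust
// Simplified stand-in for the crate's StreamEvent (the real enum has more
// variants, e.g. for tool calls).
#[derive(Debug)]
enum StreamEvent {
    TextDelta(String),
    Done,
}

// Accumulate text deltas into one String, stopping at Done.
fn collect_text(events: impl IntoIterator<Item = StreamEvent>) -> String {
    let mut out = String::new();
    for event in events {
        match event {
            StreamEvent::TextDelta(text) => out.push_str(&text),
            StreamEvent::Done => break,
        }
    }
    out
}

fn main() {
    let events = vec![
        StreamEvent::TextDelta("Hello".into()),
        StreamEvent::TextDelta(", world".into()),
        StreamEvent::Done,
    ];
    println!("{}", collect_text(events)); // prints "Hello, world"
}
```

With the real async stream, the same accumulation happens inside the `while let Some(event) = stream.next().await` loop.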

Modules§

prelude
Prelude module - commonly used types for convenient import.

Structs§

ClaudeProvider
Claude LLM provider.
LlmResponse
LLM response (non-streaming).
LlmToolDefinition
Tool definition for the LLM.
Message
A message in the conversation.
OpenAiCompatProvider
OpenAI-compatible LLM provider.
ProviderConfig
Configuration for LLM providers.
ToolCall
A tool call from the assistant.
ToolCallResult
Result of a tool call.
Usage
Token usage information.
ZaiProvider
Z.AI (Zhipu AI) LLM provider.

Enums§

ContentPart
A part of multi-part content.
LlmError
Errors that can occur with LLM operations.
MessageContent
Message content.
MessageRole
Message role.
StopReason
Reason the model stopped generating.
StreamEvent
Streaming event from the LLM.
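
The `Message` and `MessageRole` items above are not shown in this overview; a minimal sketch of how such conversation types are commonly shaped follows. The field names and constructors here are assumptions for illustration, not the crate's actual definitions.

```rust
// Hypothetical sketch of conversation types; names mirror the crate's
// public items but fields and constructors are assumptions.
#[derive(Debug, Clone, PartialEq)]
enum MessageRole {
    System,
    User,
    Assistant,
}

#[derive(Debug, Clone)]
struct Message {
    role: MessageRole,
    content: String,
}

impl Message {
    fn user(content: impl Into<String>) -> Self {
        Message { role: MessageRole::User, content: content.into() }
    }
    fn assistant(content: impl Into<String>) -> Self {
        Message { role: MessageRole::Assistant, content: content.into() }
    }
}

fn main() {
    // Build a short conversation, as in the examples above.
    let conversation = vec![
        Message { role: MessageRole::System, content: "Be terse.".into() },
        Message::user("What is 2+2?"),
        Message::assistant("4"),
    ];
    println!("{}", conversation.len()); // prints "3"
}
```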

Traits§

LlmProvider
LLM provider trait.

Type Aliases§

LlmResult
Result type for LLM operations.
StreamBox
Type alias for boxed streams.