pub struct Client { /* private fields */ }

Unified LLM client that wraps `llm-connector` for all providers.
Implementations

impl Client
pub async fn chat_stream_with_format(
    &self,
    model: &str,
    messages: Vec<Message>,
    format: StreamFormat,
) -> Result<UnboundedReceiverStream<String>>
Send a streaming chat request with the specified format (Ollama-style response).
This method returns streaming responses in Ollama API format, which is used by Ollama-compatible clients like Zed.dev.
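A minimal usage sketch. `Client` construction, a `Message::user` helper, and a `StreamFormat::Ollama` variant are assumptions not shown on this page; only the `chat_stream_with_format` signature above is taken from the docs.

```rust
use tokio_stream::StreamExt;

// Hypothetical usage sketch: `Message::user` and `StreamFormat::Ollama`
// are assumed names; adjust to the crate's actual API.
async fn stream_ollama(client: &Client) -> anyhow::Result<()> {
    let messages = vec![Message::user("Why is the sky blue?")];

    // Returns an UnboundedReceiverStream<String> of Ollama-format chunks.
    let mut stream = client
        .chat_stream_with_format("llama3", messages, StreamFormat::Ollama)
        .await?;

    // UnboundedReceiverStream implements Stream, so StreamExt::next works;
    // print each chunk as it arrives.
    while let Some(chunk) = stream.next().await {
        print!("{chunk}");
    }
    Ok(())
}
```

Because the receiver is unbounded, backpressure is the caller's responsibility: drain the stream promptly or chunks accumulate in memory.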
pub async fn chat_stream_openai(
    &self,
    model: &str,
    messages: Vec<Message>,
    tools: Option<Vec<Tool>>,
    format: StreamFormat,
) -> Result<UnboundedReceiverStream<String>>
Send a streaming chat request for the OpenAI API (OpenAI-style response).
This method returns streaming responses in OpenAI API format, which is used by OpenAI-compatible clients like Codex CLI.
Key feature: automatically corrects `finish_reason` from `"stop"` to `"tool_calls"` when tool calls are detected in the stream.
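A sketch of calling this method with tools. The model name, `Message::user`, and a `StreamFormat::OpenAi` variant are assumptions for illustration; the signature itself matches the one documented above.

```rust
use tokio_stream::StreamExt;

// Hypothetical sketch: how `tools` are built and which StreamFormat
// variant applies are assumptions; only chat_stream_openai's signature
// is taken from the docs.
async fn stream_openai(client: &Client, tools: Vec<Tool>) -> anyhow::Result<()> {
    let messages = vec![Message::user("What's the weather in Paris?")];

    let mut stream = client
        .chat_stream_openai("gpt-4o", messages, Some(tools), StreamFormat::OpenAi)
        .await?;

    // Chunks arrive as OpenAI-style streaming payloads. Per the note above,
    // if the stream contains tool_calls, the final chunk's finish_reason is
    // rewritten from "stop" to "tool_calls" before it reaches the consumer.
    while let Some(chunk) = stream.next().await {
        print!("{chunk}");
    }
    Ok(())
}
```

The `finish_reason` correction matters for OpenAI-compatible clients: they decide whether to execute tools based on that field, so a spurious `"stop"` would silently drop tool invocations.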
Auto Trait Implementations
impl Freeze for Client
impl !RefUnwindSafe for Client
impl Send for Client
impl Sync for Client
impl Unpin for Client
impl !UnwindSafe for Client
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.