pub struct OxideClient { /* private fields */ }
LlmClient backed by openai-oxide (Responses API).
With the oxide-ws feature enabled, call connect_ws() to upgrade to WebSocket mode.
All subsequent calls then go over a persistent wss:// connection, cutting latency by roughly 20-25%.
Implementations

impl OxideClient
pub fn from_config(config: &LlmConfig) -> Result<Self, SgrError>
Create from LlmConfig.
pub fn set_response_id(&self, id: Option<&str>)
Set the response_id externally (for stateful session coordination with the coach).
pub fn response_id(&self) -> Option<String>
Get current response_id.
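Note that set_response_id takes &self yet mutates state, which implies the client uses interior mutability internally. A self-contained sketch of that pattern — the Mutex<Option<String>> field and the ResponseIdHolder stand-in are assumptions for illustration, not the crate's actual layout:

```rust
use std::sync::Mutex;

// Stand-in for OxideClient's private response_id storage (hypothetical).
struct ResponseIdHolder {
    response_id: Mutex<Option<String>>,
}

impl ResponseIdHolder {
    // Same shape as OxideClient::set_response_id: &self, not &mut self.
    fn set_response_id(&self, id: Option<&str>) {
        *self.response_id.lock().unwrap() = id.map(String::from);
    }

    // Same shape as OxideClient::response_id: returns an owned clone.
    fn response_id(&self) -> Option<String> {
        self.response_id.lock().unwrap().clone()
    }
}

fn main() {
    let holder = ResponseIdHolder { response_id: Mutex::new(None) };
    holder.set_response_id(Some("resp_123"));
    assert_eq!(holder.response_id(), Some("resp_123".to_string()));
    holder.set_response_id(None);
    assert_eq!(holder.response_id(), None);
}
```

Taking &self here is what lets an external coordinator (such as the coach) inject a response_id without requiring exclusive mutable access to the client.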
pub async fn tools_call_stateful(
    &self,
    messages: &[Message],
    tools: &[ToolDef],
    previous_response_id: Option<&str>,
) -> Result<(Vec<ToolCall>, Option<String>), SgrError>
Function calling with an explicit previous_response_id. Returns the tool calls plus the new response_id for chaining.
Always sets store(true) so responses can be referenced by subsequent calls.
When previous_response_id is provided, only delta messages need to be sent
(server has full history from previous stored response).
Tool messages (role=Tool with tool_call_id) are converted to Responses API
function_call_output items — required for chaining with previous_response_id.
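The chaining contract above amounts to a loop that threads each returned response_id into the next call, sending only delta messages once an id exists. A runnable sketch of that pattern using a synchronous stub — the stub body and its id scheme are invented for illustration and do not reflect the real tools_call_stateful implementation:

```rust
// Hypothetical stand-ins for the crate's Message and ToolCall types.
type Message = String;
type ToolCall = String;

// Stub with the same shape as tools_call_stateful: takes the delta
// messages and the previous response id, returns tool calls plus a
// new response id to pass into the next call.
fn tools_call_stateful_stub(
    _messages: &[Message],
    previous_response_id: Option<&str>,
) -> (Vec<ToolCall>, Option<String>) {
    let n: u32 = previous_response_id
        .and_then(|id| id.strip_prefix("resp_"))
        .and_then(|s| s.parse().ok())
        .unwrap_or(0);
    (vec![format!("tool_call_{n}")], Some(format!("resp_{}", n + 1)))
}

fn main() {
    let mut prev: Option<String> = None;
    for turn in 0..3 {
        // With a previous_response_id, only this turn's delta messages
        // are sent; the server holds the full stored history.
        let delta: Vec<Message> = vec![format!("turn {turn} delta")];
        let (calls, new_id) = tools_call_stateful_stub(&delta, prev.as_deref());
        println!("{:?} -> {:?}", calls, new_id);
        prev = new_id; // thread the id into the next call
    }
    assert_eq!(prev, Some("resp_3".to_string()));
}
```

In the real client the call is async and also takes tools: &[ToolDef]; the loop structure (feed the returned id back as previous_response_id, send only deltas) is the part this sketch demonstrates.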