# Open Responses
A Rust client library for the Open Responses API specification.
Open Responses is an open-source specification for building multi-provider, interoperable LLM interfaces based on the OpenAI Responses API. This library provides a unified experience for calling language models, streaming results, and composing agentic workflows across different providers.
## Key Features
- Schema Compliance: Strictly follows the latest OpenAI Responses API spec.
- Auto URL Normalization: Just provide the base domain; we'll handle the `/v1/responses` path for you.
- MCP Tool Support: Compatible with Model Context Protocol (MCP) tools (e.g., in LM Studio).
- Stateful & Stateless: Support for both standard chat and stateful follow-ups using `previous_response_id`.
- Rich Streaming: Comprehensive SSE event handling for real-time applications.
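To make the URL-normalization behavior concrete, here is a standalone sketch of the rule described above (illustrative only, not the library's actual code): a trailing slash is tolerated and the `/v1/responses` path is appended to whatever base domain you supply.

```rust
// Illustrative sketch of the normalization rule, NOT the library's internals:
// strip any trailing slash, then append the fixed Responses endpoint path.
fn normalize(base: &str) -> String {
    let trimmed = base.trim_end_matches('/');
    format!("{}/v1/responses", trimmed)
}

fn main() {
    // Both spellings of the base domain resolve to the same endpoint.
    assert_eq!(normalize("http://localhost:1234"), "http://localhost:1234/v1/responses");
    assert_eq!(normalize("http://localhost:1234/"), "http://localhost:1234/v1/responses");
    println!("ok");
}
```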
## Installation
Add this to your `Cargo.toml`:

```toml
[dependencies]
open-responses = "0.2.0" # crate name assumed from the project title; check crates.io for the exact name
tokio = { version = "1", features = ["full"] }
```
## How to Configure Your API
The library provides a flexible ClientBuilder to suit your development or production environment.
### 1. Direct Input (Simple & Quick)
Best for local testing or when using local LLM servers like LM Studio. You can pass string literals directly.
```rust
// Module path and builder method names are assumed from the crate name
// and the surrounding text; adjust to match the published API.
use open_responses::Client;

let client = Client::builder()
    .base_url("http://localhost:1234") // No need to add /v1/responses
    .build()?;
```
### 2. Environment Variables (Recommended for Production)
Best for keeping secrets and configurations out of your source code.
```rust
use open_responses::Client;
use std::env;

// The variable names below are illustrative; use whatever your deployment defines.
let api_key = env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY must be set");
let api_url = env::var("OPENAI_BASE_URL")
    .unwrap_or_else(|_| "http://localhost:1234".to_string());

let client = Client::builder()
    .api_key(api_key) // builder method names assumed; adjust to the published API
    .base_url(api_url)
    .build()?;
```
## Quick Start
### Basic Chat
A minimal request/response round trip. Method and field names here are assumptions based on the Responses API spec; see the `examples/` directory for runnable code.

```rust
use open_responses::{Client, CreateResponseBody};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::builder()
        .base_url("http://localhost:1234")
        .build()?;

    // Model name and input are placeholders.
    let request = CreateResponseBody {
        model: "gpt-4o-mini".to_string(),
        input: "Write a haiku about Rust.".into(),
        ..Default::default()
    };

    let response = client.create_response(&request).await?;
    println!("{:?}", response);
    Ok(())
}
```
### Streaming Responses
Consume SSE events as they arrive. The streaming method and event types are assumptions; see `examples/streaming.rs` for the exact names.

```rust
use open_responses::{Client, CreateResponseBody};
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::builder()
        .base_url("http://localhost:1234")
        .build()?;

    let request = CreateResponseBody {
        model: "gpt-4o-mini".to_string(),
        input: "Stream a short story.".into(),
        ..Default::default()
    };

    let mut stream = client.create_response_stream(&request).await?;
    while let Some(event) = stream.next().await {
        // Each item is one SSE event from the server.
        println!("{:?}", event?);
    }
    Ok(())
}
```
### Stateful Follow-up
Continue a conversation by referencing a previous response ID (if supported by your provider).
```rust
// `first_response` is the result of an earlier request; field names
// follow the Responses API spec.
let request = CreateResponseBody {
    model: "gpt-4o-mini".to_string(),
    input: "And what about tomorrow?".into(),
    previous_response_id: Some(first_response.id.clone()),
    ..Default::default()
};
```
## Examples
Check the `examples/` directory for ready-to-run code:
- `direct_input.rs`: Simplified connection to local servers.
- `env_config.rs`: Using environment variables.
- `stateful_follow_up.rs`: Chaining conversations.
- `streaming.rs`: SSE streaming implementation.
- `function_calling.rs`: Using tools and functions.
Run any example:
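For instance, to run the streaming example (standard Cargo syntax; substitute any example name from the list above):

```shell
cargo run --example streaming
```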
## License
This project is licensed under the Apache 2.0 License - see the LICENSE file for details.