# OpenAI Interface
A low-level Rust interface for interacting with OpenAI's API. Both streaming and non-streaming APIs are supported.
Currently, only chat completion is supported. FIM completion, image generation, and other features are still in development.
Repository:
You are welcome to contribute to this project through any of the links above.
## Features
- Chat Completions: Full support for OpenAI's chat completion API
- Streaming and Non-streaming: Support for both streaming and non-streaming responses
- Strong Typing: Complete type definitions for all API requests and responses, utilizing Rust's powerful type system
- Error Handling: Comprehensive error handling with detailed error types defined in the `errors` module
- Async/Await: Built with async/await support
- Musl Support: Designed to work with musl libc out-of-the-box
- Multiple Provider Support: Works with OpenAI, DeepSeek, Qwen, and other compatible APIs
## Installation
> [!WARNING]
> Versions prior to 0.3.0 have serious issues with SSE streaming response processing: each iteration of `chat::request::ChatCompletion::get_streaming_response_string` may return multiple chunks instead of a single one.
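The warning above can be made concrete: when one iteration yields a buffer holding several SSE events, the individual `data:` payloads can be recovered by splitting on the blank-line delimiter. The helper below is a hypothetical sketch for illustration, not part of this crate's API:

```rust
// Hypothetical helper, not part of the crate: recover individual SSE
// `data:` payloads from a buffer that may contain several events.
fn split_sse_events(buffer: &str) -> Vec<String> {
    buffer
        .split("\n\n") // SSE events are separated by a blank line
        .filter_map(|event| {
            let payload = event.trim();
            payload.strip_prefix("data: ").map(str::to_string)
        })
        .collect()
}

fn main() {
    // One buffer carrying three events, as pre-0.3.0 versions may yield.
    let buffer = "data: {\"id\":1}\n\ndata: {\"id\":2}\n\ndata: [DONE]\n\n";
    for event in split_sse_events(buffer) {
        println!("{event}");
    }
}
```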
Add this to your `Cargo.toml`:

```toml
[dependencies]
openai-interface = "0.4.0-alpha.2"
```
## Usage

### Chat Completion
This crate provides methods for both streaming and non-streaming chat completions. The following examples demonstrate how to use these features.
#### Non-streaming Chat Completion
```rust
use std::str::FromStr;
use std::sync::LazyLock;

use openai_interface::chat::request::ChatCompletion;

// You need to provide your own DeepSeek API key at /keys/deepseek_domestic_key
static DEEPSEEK_API_KEY: LazyLock<String> = LazyLock::new(|| {
    std::fs::read_to_string("keys/deepseek_domestic_key")
        .expect("failed to read the API key file")
        .trim()
        .to_owned()
});
const DEEPSEEK_CHAT_URL: &str = "https://api.deepseek.com/chat/completions";
const DEEPSEEK_MODEL: &str = "deepseek-chat";

#[tokio::main]
async fn main() {
    // Build the request and await the full (non-streaming) response.
    // The exact builder and method names depend on the crate version;
    // see the `chat` module documentation.
    // let request = ChatCompletion { /* model, messages, url, key, ... */ };
}
```
#### Streaming Chat Completion
This example demonstrates how to handle streaming responses from the API.
```rust
use std::str::FromStr;
use std::sync::LazyLock;

use futures_util::StreamExt;
use openai_interface::chat::request::ChatCompletion;

// You need to provide your own DeepSeek API key at /keys/deepseek_domestic_key
static DEEPSEEK_API_KEY: LazyLock<String> = LazyLock::new(|| {
    std::fs::read_to_string("keys/deepseek_domestic_key")
        .expect("failed to read the API key file")
        .trim()
        .to_owned()
});
const DEEPSEEK_CHAT_URL: &str = "https://api.deepseek.com/chat/completions";
const DEEPSEEK_MODEL: &str = "deepseek-chat";

#[tokio::main]
async fn main() {
    // Request a streaming response and consume chunks as they arrive.
    // `get_streaming_response_string` is named in this README; the
    // surrounding details are illustrative and version-dependent.
    // let mut stream = request.get_streaming_response_string().await.unwrap();
    // while let Some(chunk) = stream.next().await { /* print chunk */ }
}
```
### Custom Request Parameters
You can customize request parameters as needed. If you require provider-specific fields, you can add them to `extra_body` or `extra_body_map`.
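As a hypothetical sketch (the `extra_body` field and the `serde_json::json!` payload are illustrative; only the field name comes from this README, so consult the `chat` module docs for the real shape), a provider-specific parameter might be attached like this:

```rust
// Illustrative only: field names besides `extra_body` are placeholders.
let request = ChatCompletion {
    model: DEEPSEEK_MODEL.to_string(),
    // Fields outside the standard OpenAI schema go into `extra_body`:
    extra_body: Some(serde_json::json!({
        "repetition_penalty": 1.05,
    })),
    // ...remaining fields...
    ..Default::default()
};
```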
## Modules

- `chat`: Contains all chat-completion-related structs, enums, and methods.
- `completion`: Contains all completion-related structs, enums, and methods. Note that this API is being deprecated in favour of `chat` and is only available for outdated LLM models.
- `rest`: Provides all REST-related traits and methods.
- `errors`: Defines the error types used throughout the crate.
## Error Handling

All errors are converted into either `crate::error::OapiError` or `crate::error::ResponseError`.
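A hedged sketch of handling a request result (the method name and `Err` handling below are placeholders; only the two error type names come from this README):

```rust
// Illustrative only: `get_response` is a placeholder method name.
match request.get_response().await {
    Ok(reply) => println!("{reply}"),
    // A `ResponseError` carries an unsuccessful reply from the server,
    // while an `OapiError` covers client-side failures.
    Err(err) => eprintln!("request failed: {err}"),
}
```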
## Musl Build
This crate is designed to work with musl libc, making it suitable for lightweight deployments in containerized environments. Longer compile times may be required as OpenSSL needs to be built from source.
To build for musl:
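Assuming the standard x86_64 musl target triple (adjust the triple for other architectures):

```shell
# Install the musl target once, then build against it.
rustup target add x86_64-unknown-linux-musl
cargo build --release --target x86_64-unknown-linux-musl
```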
## Supported Providers
This crate aims to support standard OpenAI-compatible API endpoints. Unfortunately, OpenAI aggressively restricts access from the People's Republic of China, so the implementation has been tested primarily against DeepSeek and Qwen. Please open an issue if you find any mistakes or inaccuracies in the implementation.
## Contributing
Contributions are welcome! Please feel free to submit pull requests or open issues for bugs and feature requests.
## License
This project is licensed under the AGPL-3.0 License - see the LICENSE file for details.