# AISDK
[CI](https://github.com/lazy-hq/ai-sdk-rs/actions/workflows/ci.yml) · [License: MIT](https://opensource.org/licenses/MIT) · [Issues](https://github.com/lazy-hq/ai-sdk-rs/issues) · [Pull Requests](https://github.com/lazy-hq/ai-sdk-rs/pulls)
An open-source Rust library for building AI-powered applications, inspired by the Vercel AI SDK. It provides a type-safe interface for interacting with Large Language Models (LLMs).
> **⚠️ Early Stage Warning**: This project is in very early development and not ready for production use. APIs may change significantly, and features are limited. Use at your own risk.
## Key Features
- **OpenAI Provider Support**: Initial support for OpenAI models with text generation and streaming.
- **Type-Safe API**: Built with Rust's type system for reliability.
- **Asynchronous**: Uses Tokio for async operations.
- **Prompt Templating**: Filesystem-based prompts using Tera templates (coming soon).
## Installation
Add `aisdk` to your `Cargo.toml`:
```toml
[dependencies]
aisdk = "0.1.0"
```
To use the OpenAI provider, enable the `openai` feature:
```toml
aisdk = { version = "0.1.0", features = ["openai"] }
```
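Alternatively, `cargo add` (available since Cargo 1.62) can write the dependency entry for you:

```shell
cargo add aisdk --features openai
```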
## Usage
### Basic Text Generation
```rust
use aisdk::{
    core::{GenerateTextCallOptions, generate_text},
    providers::openai::{OpenAI, OpenAIProviderSettings},
};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Configure the OpenAI provider.
    let settings = OpenAIProviderSettings::builder()
        .api_key("your-api-key".to_string())
        .model_name("gpt-4o".to_string())
        .build()?;
    let openai = OpenAI::new(settings);

    // Build the call options and generate a completion.
    let options = GenerateTextCallOptions::builder()
        .prompt("Say hello.")
        .build()?;
    let result = generate_text(openai, options).await?;

    println!("{}", result.text);
    Ok(())
}
```
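The hardcoded API key above is for brevity only. A common pattern (not specific to this crate) is to read the key from an environment variable such as `OPENAI_API_KEY` and pass it to the provider settings; a minimal standard-library sketch:

```rust
use std::env;

fn main() {
    // Normally OPENAI_API_KEY would be exported in your shell; it is set here
    // only so the example is self-contained. (`set_var` requires an unsafe
    // block on the 2024 edition.)
    unsafe { env::set_var("OPENAI_API_KEY", "sk-example") };

    // Fail with a clear message if the variable is missing, instead of
    // shipping a key in source code.
    let api_key = env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY is not set");
    println!("loaded key of length {}", api_key.len());
}
```

The resulting `api_key` string can then be passed to `OpenAIProviderSettings::builder().api_key(...)` in place of the literal.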
### Streaming Text Generation
```rust
use aisdk::{
    core::{GenerateTextCallOptions, generate_stream},
    providers::openai::{OpenAI, OpenAIProviderSettings},
};
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Configure the OpenAI provider.
    let settings = OpenAIProviderSettings::builder()
        .api_key("your-api-key".to_string())
        .model_name("gpt-4o".to_string())
        .build()?;
    let openai = OpenAI::new(settings);

    let options = GenerateTextCallOptions::builder()
        .prompt("Count from 1 to 10.")
        .build()?;

    // Print each chunk as it arrives from the model.
    let mut stream = generate_stream(openai, options).await?;
    while let Some(chunk) = stream.stream.next().await {
        print!("{}", chunk.text);
    }
    Ok(())
}
```
## Technologies Used
- **Rust**: Core language.
- **Tokio**: Async runtime.
- **Tera**: Template engine for prompts.
- **async-openai**: OpenAI API client.
## Contributing
We welcome contributions! Please see [CONTRIBUTING.md](./CONTRIBUTING.md) for guidelines.
## License
Licensed under the MIT License. See [LICENSE](./LICENSE) for details.