Runtime library, providing the core functionality for the ai_transform macro, including
the OpenAI client, error types, and the main transformation logic.
§Usage
Typically used indirectly through the ai_transform macro, but can
also be used directly:
use ai_transform_runtime::{transform, error::TransformError};
use serde::{Deserialize, Serialize};

#[derive(Serialize, Deserialize, Default)]
struct User { name: String, age: u32 }

#[derive(Serialize, Deserialize, Default, Debug)]
struct Profile { full_name: String, years_old: u32, is_adult: bool }

#[tokio::main]
async fn main() -> Result<(), TransformError> {
    let user = User { name: "Alice".to_string(), age: 28 };
    let profile: Profile = transform(user).await?;
    println!("{:?}", profile);
    Ok(())
}

§How It Works
- Serialization: Converts the source value to JSON
- Schema Generation: Creates example JSON for both source and target types
- AI Request: Sends a transformation prompt to OpenAI’s API
- Response Processing: Extracts and cleans the JSON response
- Deserialization: Converts the result into the target type
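For orientation, the sketch below shows what such a pipeline can look like end to end. It is not the crate’s actual implementation: the prompt wording, the SketchError type, the response-cleaning steps, and the use of reqwest and serde_json are all assumptions made purely for illustration.

use serde::{de::DeserializeOwned, Serialize};
use serde_json::{json, Value};

// Hypothetical error type standing in for the crate's TransformError.
#[derive(Debug)]
struct SketchError(String);

// An assumed, simplified version of the pipeline described above.
async fn transform_sketch<S, T>(source: S, api_key: &str) -> Result<T, SketchError>
where
    S: Serialize,
    T: DeserializeOwned + Default + Serialize,
{
    // 1. Serialization: turn the source value into JSON.
    let source_json = serde_json::to_string_pretty(&source)
        .map_err(|e| SketchError(e.to_string()))?;

    // 2. Schema generation: an example of the target shape, built from Default.
    let target_example = serde_json::to_string_pretty(&T::default())
        .map_err(|e| SketchError(e.to_string()))?;

    // 3. AI request: ask the model to map the source JSON onto the target shape.
    //    (The real runtime also honors OPENAI_BASE_URL; this sketch hard-codes the default.)
    let prompt = format!(
        "Transform this JSON:\n{source_json}\ninto JSON with this shape:\n{target_example}\nReturn only JSON."
    );
    let body = json!({
        "model": std::env::var("OPENAI_MODEL").unwrap_or_else(|_| "gpt-4o".into()),
        "messages": [{ "role": "user", "content": prompt }],
    });
    let response: Value = reqwest::Client::new()
        .post("https://api.openai.com/v1/chat/completions")
        .bearer_auth(api_key)
        .json(&body)
        .send()
        .await
        .map_err(|e| SketchError(e.to_string()))?
        .json()
        .await
        .map_err(|e| SketchError(e.to_string()))?;

    // 4. Response processing: pull out the message text and strip code fences.
    let content = response["choices"][0]["message"]["content"]
        .as_str()
        .unwrap_or_default()
        .trim()
        .trim_start_matches("```json")
        .trim_start_matches("```")
        .trim_end_matches("```")
        .trim();

    // 5. Deserialization: parse the cleaned JSON into the target type.
    serde_json::from_str(content).map_err(|e| SketchError(e.to_string()))
}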
§Configuration
Environment variables:
- OPENAI_API_KEY: Your OpenAI API key (required)
- OPENAI_MODEL: Model to use (default: "gpt-4o")
- OPENAI_BASE_URL: API base URL (default: "https://api.openai.com/v1")
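The runtime reads these variables internally, but it can be useful to check them up front and fail early. The snippet below only mirrors the documented variables and defaults; it is not an API exposed by the crate.

use std::env;

fn main() {
    // OPENAI_API_KEY has no default; transform calls cannot succeed without it.
    let api_key = env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY must be set");

    // Optional overrides, falling back to the documented defaults.
    let model = env::var("OPENAI_MODEL").unwrap_or_else(|_| "gpt-4o".to_string());
    let base_url =
        env::var("OPENAI_BASE_URL").unwrap_or_else(|_| "https://api.openai.com/v1".to_string());

    println!("model = {model}, base_url = {base_url}, key set = {}", !api_key.is_empty());
}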
§Considerations
- Each call makes a network request to OpenAI’s API
- Response time depends on data complexity and OpenAI’s response time
- API usage incurs costs based on OpenAI’s pricing
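Because every call goes over the network, callers may want to bound latency themselves. transform is an ordinary future, so it can be wrapped in tokio’s timeout; the pattern below is illustrative and not an API provided by this crate, and the 30-second budget is an arbitrary example value.

use std::time::Duration;
use ai_transform_runtime::transform;
use tokio::time::timeout;

// `User` and `Profile` are the example types defined in the usage section above.
async fn transform_with_timeout(user: User) -> Option<Profile> {
    match timeout(Duration::from_secs(30), transform(user)).await {
        Ok(Ok(profile)) => Some(profile), // finished within the time budget
        Ok(Err(_err)) => None,            // the transformation itself failed
        Err(_elapsed) => None,            // no response within 30 seconds
    }
}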
§Requirements
Both source and target types must implement:
serde::Serialize + serde::Deserialize + Default
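In practice this means any type passed into or out of transform carries all three derives (or manual implementations of the traits). The Invoice type below is a made-up example; Default is presumably what lets the runtime build the example JSON used in the schema-generation step.

use serde::{Deserialize, Serialize};

// Hypothetical type meeting the documented requirements.
#[derive(Serialize, Deserialize, Default)]
struct Invoice {
    customer: String,
    total_cents: u64,
    paid: bool,
}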
Modules§
Functions§
- transform: Transforms data from one type to another using AI-powered transformation.