# Inference Gateway Rust SDK
An SDK written in Rust for the Inference Gateway.
## Installation

Run `cargo add inference-gateway-sdk` to add the SDK to your project.
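Alternatively, you can declare the dependency manually in `Cargo.toml`. The version below is a placeholder, so check crates.io for the latest published release:

```toml
[dependencies]
# Replace "0.1" with the latest published version of the SDK.
inference-gateway-sdk = "0.1"
```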
## Usage

### Creating a Client
```rust
use inference_gateway_sdk::{InferenceGatewayClient, Provider};
use std::error::Error;

fn main() -> Result<(), Box<dyn Error>> {
    // Point the client at a running Inference Gateway instance.
    let client = InferenceGatewayClient::new("http://localhost:8080");

    // List every model exposed by each configured provider.
    let models = client.list_models()?;
    for provider_models in models {
        println!("Provider: {}", provider_models.provider);
        for model in provider_models.models {
            println!(" Model: {}", model.id);
        }
    }

    // Generate content with a specific provider and model.
    let response = client.generate_content(
        Provider::Ollama,
        "llama2",
        "Tell me a joke",
    )?;
    println!("Response: {}", response.response);

    Ok(())
}
```
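The URL passed to `InferenceGatewayClient::new` does not have to be hard-coded. Below is a minimal sketch that reads it from an environment variable; the variable name `INFERENCE_GATEWAY_URL` is illustrative and not something the SDK defines:

```rust
use inference_gateway_sdk::InferenceGatewayClient;
use std::env;

fn main() {
    // Illustrative: fall back to the local default used throughout this README
    // when INFERENCE_GATEWAY_URL (an assumed variable name) is not set.
    let url = env::var("INFERENCE_GATEWAY_URL")
        .unwrap_or_else(|_| "http://localhost:8080".to_string());
    let client = InferenceGatewayClient::new(url.as_str());
    // ... use `client` as in the examples below
    let _ = client;
}
```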
### Listing Models

To list available models, use the `list_models` method:
```rust
let models = client.list_models()?;
for provider_models in models {
    println!("Provider: {}", provider_models.provider);
    for model in provider_models.models {
        println!(" Model: {}", model.id);
    }
}
```
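A small sketch of how the listing might be used in practice, checking whether a particular model is available on any provider before requesting it. The helper name is illustrative and not part of the SDK; it only relies on the `models` and `id` fields shown above:

```rust
use inference_gateway_sdk::InferenceGatewayClient;
use std::error::Error;

// Illustrative helper: returns true if any provider exposes `model_id`.
fn model_is_available(
    client: &InferenceGatewayClient,
    model_id: &str,
) -> Result<bool, Box<dyn Error>> {
    let models = client.list_models()?;
    Ok(models
        .iter()
        .any(|provider| provider.models.iter().any(|m| m.id == model_id)))
}
```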
### Generating Content

To generate content using a model, use the `generate_content` method:
```rust
let response = client.generate_content(
    Provider::Ollama,
    "llama2",
    "Tell me a joke",
)?;
println!("Provider: {}", response.provider);
println!("Response: {}", response.response);
```
### Health Check

To check whether the Inference Gateway is running, use the `health_check` method:
```rust
let is_healthy = client.health_check()?;
println!("API is healthy: {}", is_healthy);
```
## License

This SDK is distributed under the MIT License. See LICENSE for more information.