# Mistral AI Rust Client

Rust client for the Mistral AI API.

## Supported APIs

- Chat without streaming
- Chat with streaming (in progress)
- Embeddings (in progress)
- List models

## Installation

You can install the library in your project using:

```sh
cargo add mistralai-client
```

## Mistral API Key

You can get your Mistral API Key here: https://docs.mistral.ai/#api-access.

### As an environment variable

Just set the `MISTRAL_API_KEY` environment variable.
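
For example, once the variable is set, the client can be created without passing a key explicitly. This is a minimal sketch reusing the same constructor call as the examples below:

```rust
use mistralai_client::v1::client::Client;

fn main() {
    // With `MISTRAL_API_KEY` set in the environment, no key needs to be
    // passed to the constructor.
    let client = Client::new(None, None, None, None);
}
```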

### As a client argument

```rust
use mistralai_client::v1::client::Client;

fn main() {
    let api_key = "your_api_key";

    // Pass the key explicitly instead of relying on the environment variable.
    let client = Client::new(Some(api_key), None, None, None);
}
```

## Usage

### Chat without streaming

```rust
use mistralai_client::v1::{
    chat_completion::{
        ChatCompletionMessage, ChatCompletionMessageRole, ChatCompletionRequestOptions,
    },
    client::Client,
    constants::OPEN_MISTRAL_7B,
};

fn main() {
    // This example assumes the `MISTRAL_API_KEY` environment variable is set.
    let client = Client::new(None, None, None, None);

    let model = OPEN_MISTRAL_7B.to_string();
    let messages = vec![ChatCompletionMessage {
        role: ChatCompletionMessageRole::user,
        content: "Just guess the next word: \"Eiffel ...\"?".to_string(),
    }];
    let options = ChatCompletionRequestOptions {
        temperature: Some(0.0),
        random_seed: Some(42),
        ..Default::default()
    };

    let result = client.chat(model, messages, Some(options)).unwrap();
    println!("Assistant: {}", result.choices[0].message.content);
}
```

### Chat with streaming

In progress.

### Embeddings

In progress.

### List models

```rust
use mistralai_client::v1::client::Client;

fn main() {
    // This example assumes the `MISTRAL_API_KEY` environment variable is set.
    let client = Client::new(None, None, None, None);

    // Fetch the list of available models.
    let result = client.list_models().unwrap();
    println!("First Model ID: {:?}", result.data[0].id);
}
```