# Sermo
A Rust client library for interacting with various Large Language Model (LLM) provider APIs.
## Features
- Supports multiple LLM providers (Ollama, OpenAI, Anthropic, Google, X.ai, Mistral, Deepseek, Groq, TogetherAI, and more)
- Simple API for sending chat messages and receiving responses
- Configurable model settings (temperature, max tokens)
- Flexible JSON extraction from responses
## Installation
Add this to your `Cargo.toml`:

```toml
[dependencies]
sermo = "0.1.0"
```
## Usage

```rust
use sermo::{LlmProfile, LlmProvider};

fn main() -> Result<(), std::io::Error> {
    // Field names below are illustrative; check the crate docs for the exact struct layout.
    let profile = LlmProfile {
        provider: LlmProvider::ollama,
        api_key: String::new(),
        model: "llama2".to_string(),
        temperature: Some(0.7),
        max_tokens: Some(100),
        url: "http://localhost:11434/api/chat".to_string(),
    };

    let response = profile.send_single("Hello! Tell me something about Rust.")?;
    println!("Response: {}", response);

    Ok(())
}
```
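One of the listed features is flexible JSON extraction from responses. LLM replies often wrap a JSON object in surrounding prose, so a common approach is to scan for the first balanced top-level object. The sketch below shows one way to do that using only the standard library; `extract_json` is a hypothetical helper for illustration, not Sermo's actual API.

```rust
// Sketch: pull the first balanced top-level JSON object out of an LLM
// response string. `extract_json` is a hypothetical helper, not Sermo's API.
fn extract_json(text: &str) -> Option<&str> {
    let start = text.find('{')?;
    let mut depth = 0usize;
    let mut in_string = false;
    let mut escaped = false;
    for (i, c) in text[start..].char_indices() {
        if in_string {
            // Inside a JSON string: track escapes so an escaped quote
            // doesn't end the string early.
            if escaped {
                escaped = false;
            } else if c == '\\' {
                escaped = true;
            } else if c == '"' {
                in_string = false;
            }
        } else {
            match c {
                '"' => in_string = true,
                '{' => depth += 1,
                '}' => {
                    depth -= 1;
                    if depth == 0 {
                        // Matching close brace found: return the object slice.
                        return Some(&text[start..=start + i]);
                    }
                }
                _ => {}
            }
        }
    }
    None // No balanced object found.
}

fn main() {
    let reply = "Sure! Here is the data:\n{\"lang\": \"Rust\", \"year\": 2015}\nHope that helps.";
    println!("{:?}", extract_json(reply));
}
```

In practice you would hand the extracted slice to `serde_json` for parsing; the scanner above only locates the object boundaries.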
## Contributing

Contributions are welcome! Please open an issue or submit a pull request on GitHub.
## Contact

[matt@cicero.sh](mailto:matt@cicero.sh)