A simple and lightweight Ollama API wrapper for Rust.
Interact with your local Ollama models effortlessly without dealing with raw HTTP requests.
## Features
- Easy-to-use API
- Async support
- Minimal dependencies
## Quick Start
```rust
use ollarust::{
    Ollama,
    structs::GeneratePayload,
};

// `generate` is async, so an executor is required; tokio is assumed here.
#[tokio::main]
async fn main() {
    // `None` connects to the default local Ollama server.
    let ollama = Ollama::new(None).unwrap();
    println!("{}", ollama.version);

    let payload = GeneratePayload {
        model: "llama3.2".to_string(),
        prompt: "Hello?".to_string(),
        stream: false,
    };

    println!("{:?}", ollama.generate(payload).await);
}
```
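To build the example, the crate and an async runtime need to be declared in `Cargo.toml`. The version numbers below are illustrative; check crates.io for the current ones:

```toml
[dependencies]
ollarust = "0.1"                                  # illustrative version
tokio = { version = "1", features = ["full"] }    # async runtime for #[tokio::main]
```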
## Requirements
- [Ollama](https://ollama.ai) installed and running locally
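If the server is not already running, a typical setup with the Ollama CLI looks like this (the model name matches the Quick Start example):

```shell
# Start the Ollama server (desktop installs usually run it automatically)
ollama serve

# Download the model used in the Quick Start example
ollama pull llama3.2
```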
## License
MIT