# ollama-oxide
## Features
- Low-level primitives for direct Ollama API interaction
- High-level conveniences (optional) for common use cases
- Async/await support with Tokio runtime
- Type-safe API bindings generated from OpenAPI specs
- Comprehensive error handling
- HTTP/2 support via reqwest
- Feature flags for modular dependencies
## Architecture

Single-crate design with modular structure and feature flags:

```
ollama-oxide/
└── src/
    ├── lib.rs           # Main library entry point
    ├── inference/       # Inference types: chat, generate, embed (default)
    ├── http/            # HTTP client layer (default)
    ├── tools/           # Ergonomic function calling (optional)
    ├── model/           # Model management (optional)
    └── conveniences/    # High-level APIs (optional)
```
## Feature Flags
The library uses feature flags to let you include only what you need:
| Feature | Dependencies | Purpose |
|---|---|---|
| `default` | `http`, `inference` | Standard usage: HTTP client + all inference types |
| `inference` | - | Standalone inference types (chat, generate, embed) |
| `http` | - | HTTP client implementation (async/sync) |
| `tools` | `schemars`, `futures` | Ergonomic function calling with auto-generated JSON schemas |
| `model` | `http`, `inference` | Model management API (list, show, copy, create, delete) |
| `conveniences` | `http`, `inference` | High-level ergonomic APIs |
## Installation

Add this to your `Cargo.toml`:
```toml
# Default features (inference + http)
[dependencies]
ollama-oxide = "0.1.0"

# With function calling support
[dependencies]
ollama-oxide = { version = "0.1.0", features = ["tools"] }

# With model management
[dependencies]
ollama-oxide = { version = "0.1.0", features = ["model"] }

# Full featured
[dependencies]
ollama-oxide = { version = "0.1.0", features = ["tools", "model"] }

# Inference types only (no HTTP client)
[dependencies]
ollama-oxide = { version = "0.1.0", default-features = false, features = ["inference"] }
```
## Quick Start

### Requirements

- Rust 1.85+ (required by edition 2024)
- Ollama running locally or accessible via network
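The README does not show the crate's public API, so the following is an illustrative sketch only: `Client`, `ChatRequest`, `Message`, and their methods are hypothetical names assumed to mirror Ollama's `/api/chat` endpoint, not the crate's confirmed interface.

```rust
// Hypothetical sketch — type and method names are illustrative assumptions.
use ollama_oxide::{ChatRequest, Client, Message};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Connect to a locally running Ollama instance (default port 11434).
    let client = Client::new("http://localhost:11434");

    // Build a chat request against a locally pulled model.
    let request = ChatRequest {
        model: "llama3.2".into(),
        messages: vec![Message::user("Why is the sky blue?")],
        ..Default::default()
    };

    // Send the request and print the assistant's reply.
    let response = client.chat(request).await?;
    println!("{}", response.message.content);
    Ok(())
}
```

Consult the crate's generated docs for the actual type names and signatures.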
## Development
### Building
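A typical Cargo workflow, assuming a checkout of the repository:

```shell
# Build with default features (http + inference)
cargo build

# Build with every optional feature enabled
cargo build --all-features

# Build the inference types only, without the HTTP client
cargo build --no-default-features --features inference
```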
### Running Tests
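Standard Cargo test invocations; since behavior is gated behind feature flags, it is worth testing feature combinations explicitly:

```shell
# Run the test suite with default features
cargo test

# Exercise feature-gated code paths
cargo test --features tools
cargo test --all-features
```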
### Running Examples
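Assuming the repository ships examples under `examples/` (the example names below are placeholders, not confirmed by this README):

```shell
# Run an example by name; requires a running Ollama instance
cargo run --example chat

# Examples that use optional features need those features enabled
cargo run --example tools --features tools
```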
## API Documentation

The library follows Ollama's OpenAPI specifications (see `spec/primitives/`).
**12 total endpoints:**
- 5 Simple endpoints (version, tags, ps, copy, delete)
- 2 Medium complexity (show, embed)
- 5 Complex with streaming (generate, chat, create, pull, push)
See `spec/api-analysis.md` for detailed endpoint documentation.
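For orientation, two of the simple endpoints can be exercised directly against a running Ollama instance with curl; these are Ollama's documented REST routes, independent of this crate:

```shell
# Version endpoint — returns the Ollama server version as JSON
curl http://localhost:11434/api/version

# Tags endpoint — lists locally available models
curl http://localhost:11434/api/tags
```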
## Contributing

Contributions are welcome! Please read `CONTRIBUTING.md` for guidelines.
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments
Based on Ollama's official libraries and API specifications.