mini_ollama_client 0.1.0

A simple Ollama client for Rust with minimal dependencies
Ollama Client Library

This is a simple Rust library for interacting with an Ollama server. It depends on only a minimal set of crates and exposes a small API for sending requests to the server and receiving responses.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Features

  • Minimal dependencies
  • Simple API to send requests to the Ollama server
  • Falls back to the "phi3" model when none is specified
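The default-model fallback described above can be sketched with plain `Option` handling. This is only an illustration of the documented behavior, not the crate's actual internals, and it assumes the model is passed as an `Option<&str>`:

```rust
fn main() {
    // No model requested: the documented default, "phi3", is used.
    let requested: Option<&str> = None;
    let model = requested.unwrap_or("phi3");
    assert_eq!(model, "phi3");

    // An explicitly requested model takes precedence over the default.
    let requested = Some("llama3");
    assert_eq!(requested.unwrap_or("phi3"), "llama3");

    println!("default resolved to: {}", model);
}
```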

Usage

Add the following to your Cargo.toml:

[dependencies]
mini_ollama_client = "0.1.0"

Example

Here is an example of how to use the mini_ollama_client library:

use mini_ollama_client::send_request;
use std::error::Error;

fn main() -> Result<(), Box<dyn Error>> {
    println!("Starting the Ollama client...");

    let server = "localhost:11434";
    let prompt = "Hello ollama.";
    let model = None; // You can specify a model here or leave None to use "phi3" by default

    match send_request(server, prompt, model) {
        Ok(response) => println!("Response: {}", response),
        Err(e) => eprintln!("Error: {}", e),
    }

    Ok(())
}

Contributing

Contributions are welcome! Please open an issue or submit a pull request.