# mini_ollama_client

A simple Ollama client for Rust with minimal dependencies.

This library provides a minimal set of dependencies and just enough functionality to send requests to an Ollama server and receive responses.

## License

This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.

## Features

- Minimal dependencies
- Simple API for sending requests to an Ollama server
- Defaults to the "phi3" model if none is specified

## Usage

Add the following to your `Cargo.toml`:

```toml
[dependencies]
mini_ollama_client = "0.1.0"
```

## Example

Here is an example of how to use the `mini_ollama_client` library:

```rust
use mini_ollama_client::send_request;
use std::error::Error;

fn main() -> Result<(), Box<dyn Error>> {
    println!("Starting the Ollama client...");

    let server = "localhost:11434";
    let prompt = "Hello ollama.";
    let model = None; // Specify a model here, or leave as None to use the default "phi3"

    match send_request(server, prompt, model) {
        Ok(response) => println!("Response: {}", response),
        Err(e) => eprintln!("Error: {}", e),
    }

    Ok(())
}
```
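Since `send_request` accepts the model as an `Option`, you can also select a model explicitly and propagate errors with `?` instead of matching. A minimal sketch, assuming the parameter is an `Option<&str>` and that `"llama3"` is a placeholder for a model you have already pulled on your Ollama server:

```rust
use mini_ollama_client::send_request;
use std::error::Error;

fn main() -> Result<(), Box<dyn Error>> {
    // Pass Some(model) to override the "phi3" default.
    // "llama3" is a placeholder; substitute any model available on your server.
    let response = send_request("localhost:11434", "Hello ollama.", Some("llama3"))?;
    println!("Response: {}", response);
    Ok(())
}
```

Note that this requires a running Ollama server reachable at the given address; otherwise `send_request` returns an error, which `?` propagates out of `main`.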

## Contributing

Contributions are welcome! Please open an issue or submit a pull request.