# Ollama-rs

A simple and easy-to-use library for interacting with the Ollama API, written following the Ollama API documentation.
## Installation

Add ollama-rs to your Cargo.toml:

```toml
[dependencies]
ollama-rs = "0.2.0"
```
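Some of the examples below require the crate's `stream` feature. Assuming standard Cargo feature syntax, enabling it looks like:

```toml
[dependencies]
ollama-rs = { version = "0.2.0", features = ["stream"] }
```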
## Initialize Ollama

```rust
use ollama_rs::Ollama;

// By default it will connect to localhost:11434
let ollama = Ollama::default();

// For custom values:
let ollama = Ollama::new("http://localhost".to_string(), 11434);
```
## Usage

Feel free to check the Chatbot example, which shows how to use the library to create a simple chatbot in less than 50 lines of code. You can also check some other examples.

These examples use poor error handling for simplicity, but you should handle errors properly in your own code.
## Completion generation

```rust
use ollama_rs::generation::completion::request::GenerationRequest;

let model = "llama2:latest".to_string();
let prompt = "Why is the sky blue?".to_string();

let res = ollama.generate(GenerationRequest::new(model, prompt)).await;

if let Ok(res) = res {
    println!("{}", res.response);
}
```

**OUTPUTS:** *The sky appears blue because of a phenomenon called Rayleigh scattering...*
## Completion generation (streaming)

Requires the `stream` feature.

```rust
use ollama_rs::generation::completion::request::GenerationRequest;
use tokio::io::AsyncWriteExt;
use tokio_stream::StreamExt;

let model = "llama2:latest".to_string();
let prompt = "Why is the sky blue?".to_string();

let mut stream = ollama.generate_stream(GenerationRequest::new(model, prompt)).await.unwrap();

let mut stdout = tokio::io::stdout();
while let Some(res) = stream.next().await {
    let responses = res.unwrap();
    for resp in responses {
        stdout.write_all(resp.response.as_bytes()).await.unwrap();
        stdout.flush().await.unwrap();
    }
}
```

Same output as above, but streamed.
## Completion generation (passing options to the model)

```rust
use ollama_rs::generation::completion::request::GenerationRequest;
use ollama_rs::generation::options::GenerationOptions;

let model = "llama2:latest".to_string();
let prompt = "Why is the sky blue?".to_string();

let options = GenerationOptions::default()
    .temperature(0.2)
    .repeat_penalty(1.5)
    .top_k(25)
    .top_p(0.25);

let res = ollama.generate(GenerationRequest::new(model, prompt).options(options)).await;

if let Ok(res) = res {
    println!("{}", res.response);
}
```

**OUTPUTS:** *1. Sun emits white sunlight: The sun consists primarily ...*
## List local models

```rust
let res = ollama.list_local_models().await.unwrap();
```

Returns a vector of `Model` structs.
## Show model information

```rust
let res = ollama.show_model_info("llama2:latest".to_string()).await.unwrap();
```

Returns a `ModelInfo` struct.
## Create a model

```rust
let res = ollama.create_model(CreateModelRequest::path("model".into(), "/tmp/Modelfile.example".into())).await.unwrap();
```

Returns a `CreateModelStatus` struct representing the final status of the model creation.
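The request above points at a Modelfile on disk. As an illustration only, a minimal Ollama Modelfile (the `FROM` and `SYSTEM` instructions belong to Ollama itself, not this crate) could look like:

```
FROM llama2:latest
SYSTEM You are a helpful assistant.
```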
## Create a model (streaming)

Requires the `stream` feature.

```rust
use tokio_stream::StreamExt;

let mut res = ollama.create_model_stream(CreateModelRequest::path("model".into(), "/tmp/Modelfile.example".into())).await.unwrap();

while let Some(res) = res.next().await {
    // Handle the status
}
```

Returns a `CreateModelStatusStream` that will stream every status update of the model creation.
## Copy a model

```rust
let _ = ollama.copy_model("mario".into(), "mario_copy".into()).await.unwrap();
```
## Delete a model

```rust
let _ = ollama.delete_model("mario_copy".into()).await.unwrap();
```
## Generate embeddings

```rust
let prompt = "Why is the sky blue?".to_string();
let res = ollama.generate_embeddings(GenerateEmbeddingsRequest::new("llama2:latest".to_string(), prompt.into())).await.unwrap();
```

Returns a `GenerateEmbeddingsResponse` struct containing the embeddings (a vector of floats).
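Embedding vectors are typically compared with cosine similarity. A minimal self-contained sketch in plain Rust (no ollama-rs types involved; the vectors below are hypothetical stand-ins for real embedding output):

```rust
// Cosine similarity: dot(a, b) / (|a| * |b|).
// 1.0 means the vectors point the same way; 0.0 means they are orthogonal.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (norm_a * norm_b)
}

fn main() {
    // Hypothetical embeddings; real ones come from generate_embeddings.
    let a = vec![0.1_f32, 0.9, 0.2];
    let b = vec![0.1_f32, 0.9, 0.2];
    let c = vec![0.9_f32, 0.0, 0.0];
    println!("{:.2}", cosine_similarity(&a, &b)); // identical direction
    println!("{:.2}", cosine_similarity(&a, &c)); // mostly dissimilar
}
```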
## Make a function call

```rust
let tools = vec![Arc::new(Scraper::new())];
let parser = Arc::new(NousFunctionCall::new());
let message = ChatMessage::user("What is the current oil price?".to_string());
let res = ollama
    .send_function_call(
        FunctionCallRequest::new(
            "adrienbrault/nous-hermes2theta-llama3-8b:q8_0".to_string(),
            tools,
            vec![message],
        ),
        parser,
    )
    .await
    .unwrap();
```

Uses the given tools (such as searching the web) to find an answer, and returns a `ChatMessageResponse` with the answer to the question.