§cargo-ai Library
This library provides API clients for interacting with AI services. It includes modules for communicating with both the Ollama and OpenAI APIs.
§Usage
The functions provided by this library are asynchronous and must be awaited within an async context.
Each returns a `Result<String, Error>`, where the `String` carries the successful API response.
```rust
use cargo_ai::{ollama_send_request, openai_send_request};

async fn example() {
    // For the Ollama API: provide the API URL, model name, prompt, and a timeout (in seconds).
    let ollama_response = ollama_send_request("url", "model_name", "Your prompt here", 60).await;
    // For the OpenAI API: provide the API URL, model name, prompt, timeout (in seconds), and your API token.
    let openai_response = openai_send_request("url", "model_name", "Your prompt here", 60, "your_token_here").await;
}
```
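Since each call yields a `Result`, the response is typically unpacked with `match` or the `?` operator. Below is a minimal sketch of handling the result, assuming a Tokio runtime; the endpoint is Ollama's default local address and the model name is a placeholder:

```rust
use cargo_ai::ollama_send_request;

#[tokio::main]
async fn main() {
    // "http://localhost:11434" is Ollama's default local endpoint;
    // "llama3" is a placeholder model name.
    match ollama_send_request("http://localhost:11434", "llama3", "Say hello", 30).await {
        Ok(text) => println!("Model replied: {text}"),
        // The crate's Error type is assumed here to implement Debug.
        Err(e) => eprintln!("Request failed: {e:?}"),
    }
}
```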
§Modules
- `ollama_api_client`: Functions for interacting with the Ollama API.
- `openai_api_client`: Functions for interacting with the OpenAI API.
§Structs
§Constants
- `DEFAULT_TEMPERATURE` - Default temperature used for model requests when not specified. Kept low (0.0) for consistent, deterministic outputs in schema-bound agents.
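The constant is exposed at the crate root, so it can be read directly, for example to log the effective default. A trivial sketch, assuming it is a standard floating-point type:

```rust
use cargo_ai::DEFAULT_TEMPERATURE;

fn main() {
    // 0.0 favors reproducible, deterministic completions.
    println!("default temperature: {DEFAULT_TEMPERATURE}");
}
```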
§Functions
- `ollama_send_request` - Re-exports the `send_request` function from the `ollama_api_client` module. This function sends a request to the Ollama API and returns the response.
- `openai_send_request` - Re-exports the `send_request` function from the `openai_api_client` module. This function sends a request to the OpenAI API and returns the response.
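Because these are plain re-exports, the same functions can also be called through their modules. A sketch assuming a Tokio runtime, with placeholder argument values as in the usage example above:

```rust
use cargo_ai::openai_api_client;

#[tokio::main]
async fn main() {
    // Identical to calling the `openai_send_request` re-export.
    let response = openai_api_client::send_request(
        "url",
        "model_name",
        "Your prompt here",
        60,
        "your_token_here",
    )
    .await;
    if let Ok(text) = response {
        println!("{text}");
    }
}
```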