pub struct Ollama { /* private fields */ }
Implementations
impl Ollama
pub async fn send_chat_messages_stream(
    &self,
    request: ChatMessageRequest,
) -> Result<ChatMessageResponseStream>
Available on crate feature stream only.
Chat message generation with streaming.
Returns a stream of ChatMessageResponse objects.
pub async fn send_chat_messages(
    &self,
    request: ChatMessageRequest,
) -> Result<ChatMessageResponse>
Chat message generation.
Returns a ChatMessageResponse object.
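A minimal non-streaming chat call might look like the sketch below. The module paths, the `ChatMessageRequest::new` and `ChatMessage::user` constructors, the `Ollama::default()` localhost:11434 default, and the shape of the response's `message` field are all assumptions about typical versions of the crate, not taken from this page; verify them against the request/response types' own docs.

```rust
use ollama_rs::generation::chat::{request::ChatMessageRequest, ChatMessage};
use ollama_rs::Ollama;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Assumes an Ollama server reachable at the default localhost:11434.
    let ollama = Ollama::default();

    let request = ChatMessageRequest::new(
        "llama3".to_string(),
        vec![ChatMessage::user("Why is the sky blue?".to_string())],
    );

    // send_chat_messages returns a single ChatMessageResponse.
    let response = ollama.send_chat_messages(request).await?;
    println!("{}", response.message.content);
    Ok(())
}
```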
impl Ollama
pub async fn send_chat_messages_with_history_stream<C: ChatHistory + Send + 'static>(
    &self,
    history: Arc<Mutex<C>>,
    request: ChatMessageRequest,
) -> Result<ChatMessageResponseStream>
Available on crate feature stream only.
Chat message generation with streaming, using a shared chat history.
pub async fn send_chat_messages_with_history<C: ChatHistory>(
    &mut self,
    history: &mut C,
    request: ChatMessageRequest,
) -> Result<ChatMessageResponse>
Chat message generation using a chat history.
Returns a ChatMessageResponse object.
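The history-aware variant can be sketched as follows. It assumes (beyond what this page states) that `Vec<ChatMessage>` implements `ChatHistory`, that the call appends both the user turn and the model's reply to the history, and the module paths shown; treat those as hypotheses to check against the crate docs.

```rust
use ollama_rs::generation::chat::{request::ChatMessageRequest, ChatMessage};
use ollama_rs::Ollama;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut ollama = Ollama::default();

    // Any type implementing ChatHistory works; a plain Vec<ChatMessage>
    // is assumed to implement it here.
    let mut history: Vec<ChatMessage> = vec![];

    let request = ChatMessageRequest::new(
        "llama3".to_string(),
        vec![ChatMessage::user("First question".to_string())],
    );

    // The earlier turns stored in `history` are sent along with the
    // new message, so follow-up requests keep conversational context.
    let response = ollama
        .send_chat_messages_with_history(&mut history, request)
        .await?;
    println!("{}", response.message.content);
    Ok(())
}
```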
impl Ollama
pub async fn generate_stream(
    &self,
    request: GenerationRequest<'_>,
) -> Result<GenerationResponseStream>
Available on crate feature stream only.
Completion generation with streaming.
Returns a stream of GenerationResponse objects.
pub async fn generate(
    &self,
    request: GenerationRequest<'_>,
) -> Result<GenerationResponse>
Completion generation with a single response.
Returns a single GenerationResponse object.
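A single-shot completion might be sketched like this; the `GenerationRequest::new(model, prompt)` constructor, the module path, and the `response` field on `GenerationResponse` are assumptions about the crate's usual API surface rather than facts from this page.

```rust
use ollama_rs::generation::completion::request::GenerationRequest;
use ollama_rs::Ollama;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let ollama = Ollama::default();

    // GenerationRequest is assumed to take a model name and a prompt.
    let request = GenerationRequest::new("llama3".to_string(), "Tell me a joke.");

    // generate blocks until the full completion is available.
    let response = ollama.generate(request).await?;
    println!("{}", response.response);
    Ok(())
}
```

For incremental output, `generate_stream` takes the same request and yields `GenerationResponse` chunks as they arrive (behind the stream feature).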
impl Ollama
pub async fn generate_embeddings(
    &self,
    request: GenerateEmbeddingsRequest,
) -> Result<GenerateEmbeddingsResponse>
Generate embeddings from a model.
§Arguments
model_name - Name of the model to generate embeddings from.
prompt - Prompt to generate embeddings for.
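An embeddings call could be sketched as below. The `GenerateEmbeddingsRequest::new` constructor, the `EmbeddingsInput::Single` variant, the module path, and the `embeddings` field on the response are all assumed from typical crate versions, not stated on this page.

```rust
use ollama_rs::generation::embeddings::request::{
    EmbeddingsInput, GenerateEmbeddingsRequest,
};
use ollama_rs::Ollama;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let ollama = Ollama::default();

    // The request bundles the model name and the text to embed.
    let request = GenerateEmbeddingsRequest::new(
        "nomic-embed-text".to_string(),
        EmbeddingsInput::Single("The sky is blue".to_string()),
    );

    let response = ollama.generate_embeddings(request).await?;
    // `embeddings` is assumed to hold one vector per input.
    println!("{} embedding(s) returned", response.embeddings.len());
    Ok(())
}
```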
impl Ollama
pub fn new_with_request_headers(
    host: impl IntoUrl,
    port: u16,
    headers: HeaderMap,
) -> Self
Available on crate feature headers only.
Creates a new Ollama instance with the specified host, port, and request headers.
§Arguments
host - The host of the Ollama service.
port - The port of the Ollama service.
headers - The request headers to be used.
§Returns
A new Ollama instance with the specified request headers.
§Panics
Panics if the host is not a valid URL or if the URL cannot have a port.
pub fn set_headers(&mut self, headers: Option<HeaderMap>)
Available on crate feature headers only.
Sets the request headers for the Ollama instance.
§Arguments
headers - An optional HeaderMap containing the request headers. If None is provided, the headers will be reset to an empty HeaderMap.
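Putting the two header methods together (with the headers feature enabled), a client that authenticates against a proxy in front of Ollama might be sketched as follows; the header name and token are illustrative only.

```rust
use reqwest::header::{HeaderMap, AUTHORIZATION};
use ollama_rs::Ollama;

fn main() {
    let mut headers = HeaderMap::new();
    // Hypothetical bearer token; any valid header values work here.
    headers.insert(AUTHORIZATION, "Bearer my-token".parse().unwrap());

    // Every request sent by this instance will carry these headers.
    let mut ollama = Ollama::new_with_request_headers("http://localhost", 11434, headers);

    // Passing None later resets the headers to an empty HeaderMap.
    ollama.set_headers(None);
}
```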
impl Ollama
pub async fn create_model_stream(
    &self,
    request: CreateModelRequest,
) -> Result<CreateModelStatusStream>
Available on crate feature stream only.
Create a model with streaming, meaning that each new status will be streamed.
pub async fn create_model(
    &self,
    request: CreateModelRequest,
) -> Result<CreateModelStatus>
Create a model with a single response; only the final status will be returned.
impl Ollama
pub async fn delete_model(&self, model_name: String) -> Result<()>
Delete a model and its data.
impl Ollama
pub async fn list_local_models(&self) -> Result<Vec<LocalModel>>
impl Ollama
pub async fn pull_model_stream(
    &self,
    model_name: String,
    allow_insecure: bool,
) -> Result<PullModelStatusStream>
Available on crate feature stream only.
Pull a model with streaming, meaning that each new status will be streamed.
§Arguments
model_name - The name of the model to pull.
allow_insecure - Allow insecure connections to the library. Only use this if you are pulling from your own library during development.
pub async fn pull_model(
    &self,
    model_name: String,
    allow_insecure: bool,
) -> Result<PullModelStatus>
Pull a model with a single response; only the final status will be returned.
§Arguments
model_name - The name of the model to pull.
allow_insecure - Allow insecure connections to the library. Only use this if you are pulling from your own library during development.
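A non-streaming pull might be sketched as below; it assumes `Ollama::default()` and that `PullModelStatus` implements `Debug`, neither of which this page states.

```rust
use ollama_rs::Ollama;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let ollama = Ollama::default();

    // Blocks until the pull finishes; only the final status comes back.
    // `false` keeps insecure registry connections disallowed.
    let status = ollama.pull_model("llama3".to_string(), false).await?;
    println!("{status:?}");
    Ok(())
}
```

For progress reporting during a long download, `pull_model_stream` takes the same arguments and yields a status per update instead (behind the stream feature).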
impl Ollama
pub async fn push_model_stream(
    &self,
    model_name: String,
    allow_insecure: bool,
) -> Result<PushModelStatusStream>
Available on crate feature stream only.
Upload a model to a model library, streaming each new status as it arrives. Requires registering for ollama.ai and adding a public key first.
§Arguments
model_name - The name of the model to push, in the form <namespace>/<model>:<tag>.
allow_insecure - Allow insecure connections to the library. Only use this if you are pushing to your own library during development.
pub async fn push_model(
    &self,
    model_name: String,
    allow_insecure: bool,
) -> Result<PushModelStatus>
Upload a model to a model library with a single response; only the final status will be returned. Requires registering for ollama.ai and adding a public key first.
§Arguments
model_name - The name of the model to push, in the form <namespace>/<model>:<tag>.
allow_insecure - Allow insecure connections to the library. Only use this if you are pushing to your own library during development.
impl Ollama
pub async fn show_model_info(&self, model_name: String) -> Result<ModelInfo>
Show details about a model, including its modelfile, template, parameters, license, and system prompt.
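The listing and inspection methods combine naturally; the sketch below assumes `Ollama::default()`, a `name` field on `LocalModel`, and a `Debug` impl on `ModelInfo`, none of which this page confirms.

```rust
use ollama_rs::Ollama;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let ollama = Ollama::default();

    // Enumerate the models already present locally, then inspect each one.
    for model in ollama.list_local_models().await? {
        let info = ollama.show_model_info(model.name.clone()).await?;
        println!("{}: {info:?}", model.name);
    }
    Ok(())
}
```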
impl Ollama
The main struct representing an Ollama client.
This struct is used to interact with the Ollama service.
§Fields
url - The base URL of the Ollama service.
reqwest_client - The HTTP client used for requests.
request_headers - Optional headers for requests (enabled with the headers feature).
pub fn new_with_client(
    host: impl IntoUrl,
    port: u16,
    reqwest_client: Client,
) -> Self
Creates a new Ollama instance with the specified host, port, and reqwest client.
§Arguments
host - The host of the Ollama service.
port - The port of the Ollama service.
reqwest_client - The reqwest client instance.
§Returns
A new Ollama instance with the specified reqwest client.
§Panics
Panics if the host is not a valid URL or if the URL cannot have a port.
pub fn try_new(url: impl IntoUrl) -> Result<Self, ParseError>
pub fn url_str(&self) -> &str
Returns the URL of the Ollama service as a &str.
Syntax in pseudo-BNF:
url = scheme ":" [ hierarchical | non-hierarchical ] [ "?" query ]? [ "#" fragment ]?
non-hierarchical = non-hierarchical-path
non-hierarchical-path = /* Does not start with "/" */
hierarchical = authority? hierarchical-path
authority = "//" userinfo? host [ ":" port ]?
userinfo = username [ ":" password ]? "@"
hierarchical-path = [ "/" path-segment ]+
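Unlike the host/port constructors above, which panic on an invalid host, try_new returns a ParseError, and url_str echoes back the configured base URL. A small sketch (construction only; no request is sent):

```rust
use ollama_rs::Ollama;

fn main() {
    // try_new accepts a full URL and reports failure as a ParseError
    // instead of panicking like new_with_client or new_with_request_headers.
    match Ollama::try_new("http://localhost:11434") {
        Ok(ollama) => println!("client configured for {}", ollama.url_str()),
        Err(e) => eprintln!("invalid URL: {e}"),
    }
}
```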
Trait Implementations
Auto Trait Implementations
impl Freeze for Ollama
impl !RefUnwindSafe for Ollama
impl Send for Ollama
impl Sync for Ollama
impl Unpin for Ollama
impl !UnwindSafe for Ollama
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
impl<T> CloneToUninit for T
where
    T: Clone,
impl<T> Instrument for T
fn instrument(self, span: Span) -> Instrumented<Self>
fn in_current_span(self) -> Instrumented<Self>
impl<T> IntoEither for T
fn into_either(self, into_left: bool) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left is true; otherwise converts self into a Right variant. Read more
fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left(&self) returns true; otherwise converts self into a Right variant. Read more