pub struct Gemini { /* private fields */ }
The main client for interacting with the Gemini API.
Use Gemini::new or Gemini::new_with_timeout to create an instance.
You can configure aspects of the request such as the model, system instructions,
generation config, safety settings, and tools using the provided builder-style methods.
§Implementations
impl Gemini
pub fn new(
    api_key: impl Into<String>,
    model: impl Into<String>,
    sys_prompt: Option<SystemInstruction>,
) -> Self
Creates a new Gemini client.
§Arguments
api_key - Your Gemini API key. Get one from Google AI Studio.
model - The model variation to use (e.g., “gemini-1.5-flash”). See model variations.
sys_prompt - Optional system instructions. See system instructions.
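§Example
A minimal sketch; the model name and the environment variable used for the key are placeholders:
use std::env;

let gemini = Gemini::new(
    env::var("GEMINI_API_KEY").expect("GEMINI_API_KEY is not set"),
    "gemini-1.5-flash",
    None, // no system instructions
);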
pub fn new_with_timeout(
    api_key: impl Into<String>,
    model: impl Into<String>,
    sys_prompt: Option<SystemInstruction>,
    api_timeout: Duration,
) -> Self
Creates a new Gemini client with a custom API timeout.
§Arguments
api_key - Your Gemini API key.
model - The model variation to use.
sys_prompt - Optional system instructions.
api_timeout - Custom duration for request timeouts.
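§Example
A minimal sketch with a 30-second request timeout; the key source and model name are placeholders:
use std::time::Duration;

let gemini = Gemini::new_with_timeout(
    std::env::var("GEMINI_API_KEY").expect("GEMINI_API_KEY is not set"),
    "gemini-1.5-flash",
    None,
    Duration::from_secs(30), // requests taking longer than this will fail
);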
pub fn set_generation_config(&mut self) -> &mut Value
Returns a mutable reference to the generation configuration. If not already set, initializes it to an empty object.
See Gemini docs for schema details.
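§Example
A sketch that overwrites the whole config, assuming api_key holds your key; the field names (“temperature”, “maxOutputTokens”) follow the public Gemini generationConfig schema and are assumptions here:
use serde_json::json;

let mut gemini = Gemini::new(api_key, "gemini-1.5-flash", None);
// Replace the generation config wholesale with a new JSON object.
*gemini.set_generation_config() = json!({
    "temperature": 0.2,
    "maxOutputTokens": 1024
});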
pub fn set_tool_config(self, config: ToolConfig) -> Self
pub fn set_thinking_config(self, config: ThinkingConfig) -> Self
pub fn set_model(self, model: impl Into<String>) -> Self
pub fn set_sys_prompt(self, sys_prompt: Option<SystemInstruction>) -> Self
pub fn set_safety_settings(self, settings: Option<Vec<SafetySetting>>) -> Self
pub fn set_api_key(self, api_key: impl Into<String>) -> Self
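§Example
A sketch of chaining the consuming builder methods above; the model names are placeholders:
let gemini = Gemini::new(api_key, "gemini-1.5-flash", None)
    .set_model("gemini-1.5-pro")       // switch to a different model variation
    .set_safety_settings(None);        // clear any safety settings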
pub fn set_json_mode(self, schema: Value) -> Self
Sets the response format to JSON mode with a specific schema.
To use a Rust struct as a schema, decorate it with #[gemini_schema] and pass
StructName::gemini_schema().
§Arguments
schema - The JSON schema for the response. See Gemini Schema docs.
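§Example
A sketch using a hand-written schema built with serde_json::json!; alternatively, derive one with #[gemini_schema] and pass StructName::gemini_schema() as described above. The schema fields here are illustrative only:
use serde_json::json;

let gemini = Gemini::new(api_key, "gemini-1.5-flash", None)
    .set_json_mode(json!({
        "type": "object",
        "properties": {
            "title":  { "type": "string" },
            "rating": { "type": "integer" }
        },
        "required": ["title", "rating"]
    }));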
pub fn unset_json_mode(self) -> Self
pub fn set_tools(self, tools: Vec<Tool>) -> Self
Sets the tools (functions) available to the model.
pub fn unset_tools(self) -> Self
Removes all tools.
pub async fn ask(
    &self,
    session: &mut Session,
) -> Result<GeminiResponse, GeminiResponseError>
Sends a prompt to the model and waits for the full response.
Updates the session history with the model’s reply.
§Errors
Returns GeminiResponseError::NothingToRespond if the last message in history is from the model.
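§Example
A sketch assuming gemini is a configured client and session is a mutable Session whose last history entry is a user message (how messages are added belongs to the Session API and is not shown here):
// Send the pending user message and wait for the complete reply.
let reply = gemini.ask(&mut session).await.unwrap();
println!("{}", reply.get_text(""));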
pub async fn ask_as_stream_with_extractor<F, StreamType>(
    &self,
    session: Session,
    data_extractor: F,
) -> Result<ResponseStream<F, StreamType>, (Session, GeminiResponseError)>
Sends a prompt to the model and returns a stream of responses, with each chunk passed through data_extractor.
§Warning
You must exhaust the response stream to ensure the model’s reply is stored in the session history.
The data_extractor closure is applied to each received chunk and determines the items the stream yields.
§Example
use futures::StreamExt;

let mut response_stream = gemini.ask_as_stream_with_extractor(session,
    // Use _gemini_response.get_text("") to get just the text received in each chunk.
    |session, _gemini_response| session.get_last_message_text("").unwrap())
    .await.unwrap();
while let Some(response) = response_stream.next().await {
    if let Ok(response) = response {
        println!("{}", response);
    }
}
pub async fn ask_as_stream(
    &self,
    session: Session,
) -> Result<GeminiResponseStream, (Session, GeminiResponseError)>
Sends a prompt to the model and returns a stream of responses.
§Warning
You must exhaust the response stream to ensure the session history is correctly updated.
§Example
use futures::StreamExt;

let mut response_stream = gemini.ask_as_stream(session).await.unwrap();
while let Some(response) = response_stream.next().await {
    if let Ok(response) = response {
        println!("{}", response.get_text(""));
    }
}