Struct openai_safe::OpenAI

pub struct OpenAI { /* private fields */ }
Chat models take a list of messages as input and return a model-generated message as output. Although the chat format is designed to make multi-turn conversations easy, it’s just as useful for single-turn tasks without any conversation.
Implementations

impl OpenAI
pub fn new(
    open_ai_key: &str,
    model: OpenAIModels,
    max_tokens: Option<usize>,
    temperature: Option<u32>
) -> Self
Examples found in repository: examples/use_openai.rs (line 21)
async fn main() {
    env_logger::init();
    let api_key: String = std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY not set");
    let model = OpenAIModels::Gpt3_5Turbo; // Choose the model
    let open_ai = OpenAI::new(&api_key, model, None, None);

    // Example context and instructions
    let instructions =
        "Translate the following English text to all the languages in the response type";

    match open_ai
        .get_answer::<TranslationResponse>(instructions)
        .await
    {
        Ok(response) => println!("Response: {:?}", response),
        Err(e) => eprintln!("Error: {:?}", e),
    }
}

pub fn debug(self) -> Self
pub fn function_calling(self, function_call: bool) -> Self
pub fn set_context<T: Serialize>(self, input_name: &str, input_data: &T) -> Result<Self>
pub fn check_prompt_tokens<T: JsonSchema + DeserializeOwned>(&self, instructions: &str) -> Result<usize>
pub async fn get_answer<T: JsonSchema + DeserializeOwned>(
    self,
    instructions: &str
) -> Result<T>
Examples found in repository: examples/use_openai.rs (line 28)
async fn main() {
    env_logger::init();
    let api_key: String = std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY not set");
    let model = OpenAIModels::Gpt3_5Turbo; // Choose the model
    let open_ai = OpenAI::new(&api_key, model, None, None);

    // Example context and instructions
    let instructions =
        "Translate the following English text to all the languages in the response type";

    match open_ai
        .get_answer::<TranslationResponse>(instructions)
        .await
    {
        Ok(response) => println!("Response: {:?}", response),
        Err(e) => eprintln!("Error: {:?}", e),
    }
}

Auto Trait Implementations
impl RefUnwindSafe for OpenAI
impl Send for OpenAI
impl Sync for OpenAI
impl Unpin for OpenAI
impl UnwindSafe for OpenAI
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value.