Enum chat_gpt_lib_rs::models::Model
pub enum Model {
Gpt3_5Turbo,
Gpt_4,
Gpt_4_32k,
Gpt_4Turbo,
Gpt_4o,
Gpt_4Turbo_Vision,
}
The Model enum represents the available OpenAI models.
This enum provides an easy way to specify the model to be used in API calls. The supported models are:
- Gpt3_5Turbo
- Gpt_4
- Gpt_4_32k
- Gpt_4Turbo
- Gpt_4o
- Gpt_4Turbo_Vision
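To make the variants above concrete, here is a minimal, self-contained sketch of how a caller might map them to the model-name strings the OpenAI API expects. The enum shape mirrors the declaration above, but the `as_str` helper and the exact string values are assumptions for illustration, not taken from the crate source.

```rust
// Sketch only: the string mapping below is an assumption, not the
// crate's actual implementation. Check the crate source for the
// real model-name strings.
#[allow(non_camel_case_types)]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Model {
    Gpt3_5Turbo,
    Gpt_4,
    Gpt_4_32k,
    Gpt_4Turbo,
    Gpt_4o,
    Gpt_4Turbo_Vision,
}

impl Model {
    // Hypothetical helper mapping each variant to an API model name.
    fn as_str(&self) -> &'static str {
        match self {
            Model::Gpt3_5Turbo => "gpt-3.5-turbo",
            Model::Gpt_4 => "gpt-4",
            Model::Gpt_4_32k => "gpt-4-32k",
            Model::Gpt_4Turbo => "gpt-4-turbo",
            Model::Gpt_4o => "gpt-4o",
            Model::Gpt_4Turbo_Vision => "gpt-4-turbo-vision",
        }
    }
}

fn main() {
    let model = Model::Gpt_4o;
    println!("{}", model.as_str()); // prints "gpt-4o"
}
```

Because the enum derives Copy and Eq (as the trait listing below confirms), values can be passed by value and compared directly.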
Variants
Implementations
Trait Implementations
impl<'de> Deserialize<'de> for Model
fn deserialize<__D>(__deserializer: __D) -> Result<Self, __D::Error> where __D: Deserializer<'de>
Deserialize this value from the given Serde deserializer.
impl FromStr for Model
Implements FromStr to enable parsing the enum from a string representation.
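The FromStr implementation lets callers turn a model-name string into a Model via str::parse. The following is a hedged sketch of what such an impl might look like for a cut-down two-variant version of the enum; the accepted strings and the String error type are assumptions for illustration, not the crate's actual code.

```rust
use std::str::FromStr;

#[allow(non_camel_case_types)]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Model {
    Gpt3_5Turbo,
    Gpt_4,
}

// A plausible FromStr shape: map known model-name strings to variants
// and report anything else as an error. The accepted strings and the
// error type are assumptions, not the crate's real implementation.
impl FromStr for Model {
    type Err = String;

    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s {
            "gpt-3.5-turbo" => Ok(Model::Gpt3_5Turbo),
            "gpt-4" => Ok(Model::Gpt_4),
            other => Err(format!("unknown model: {other}")),
        }
    }
}

fn main() {
    // str::parse works for any type implementing FromStr.
    let model: Model = "gpt-4".parse().expect("known model name");
    assert_eq!(model, Model::Gpt_4);
    assert!("gpt-5".parse::<Model>().is_err());
}
```

Returning a Result from from_str is what makes `"...".parse::<Model>()` fallible, so unrecognized model names surface as errors rather than panics.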
impl PartialEq for Model
impl Copy for Model
impl Eq for Model
impl StructuralPartialEq for Model
Auto Trait Implementations
impl Freeze for Model
impl RefUnwindSafe for Model
impl Send for Model
impl Sync for Model
impl Unpin for Model
impl UnwindSafe for Model
Blanket Implementations
impl<T> BorrowMut<T> for T where T: ?Sized
fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.