pub struct MistralLlmProvider { /* private fields */ }
Vertex AI LLM provider for Mistral models (mistral-small-2503, mistral-medium-3, codestral-2).
Mistral uses the OpenAI-compatible rawPredict endpoint and is only available in
us-central1 and europe-west4 (not global). Models must be enabled from the
Model Garden console before use.
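As an illustration of the regional constraint above, here is a minimal sketch of how a caller might validate the location and build a rawPredict URL. The helper names are hypothetical (not items of this crate), and the URL shape is an assumption based on Vertex AI's publisher-model endpoint pattern:

```rust
// Hypothetical helpers illustrating the regional constraint described above.
// Neither function exists in this crate; they only sketch the rules.

/// Regions where Mistral models are available on Vertex AI, per the docs above.
const SUPPORTED_REGIONS: [&str; 2] = ["us-central1", "europe-west4"];

fn is_supported_region(region: &str) -> bool {
    SUPPORTED_REGIONS.contains(&region)
}

/// Builds a rawPredict endpoint for a Mistral model. The URL shape is an
/// assumption based on Vertex AI's publisher-model endpoint pattern, not
/// something this crate guarantees.
fn raw_predict_url(project: &str, region: &str, model: &str) -> String {
    format!(
        "https://{region}-aiplatform.googleapis.com/v1/projects/{project}/locations/{region}/publishers/mistralai/models/{model}:rawPredict"
    )
}

fn main() {
    assert!(is_supported_region("us-central1"));
    assert!(!is_supported_region("global")); // "global" is explicitly unsupported
    println!("{}", raw_predict_url("my-project", "europe-west4", "mistral-small-2503"));
}
```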
Model name detection: model names containing “mistral” or “codestral” route here.
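The routing rule above can be sketched as a simple predicate (a hypothetical free function for illustration, not an item of this crate):

```rust
/// Returns true when a model name should route to MistralLlmProvider,
/// per the rule above: the name contains "mistral" or "codestral".
fn routes_to_mistral(model: &str) -> bool {
    let name = model.to_ascii_lowercase();
    name.contains("mistral") || name.contains("codestral")
}

fn main() {
    assert!(routes_to_mistral("mistral-medium-3"));
    assert!(routes_to_mistral("codestral-2"));
    assert!(!routes_to_mistral("gemini-2.0-flash"));
}
```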
Implementations§
impl MistralLlmProvider
pub fn new(config: VertexAiConfig) -> Self
Trait Implementations§
Auto Trait Implementations§
impl !Freeze for MistralLlmProvider
impl !RefUnwindSafe for MistralLlmProvider
impl Send for MistralLlmProvider
impl Sync for MistralLlmProvider
impl Unpin for MistralLlmProvider
impl UnsafeUnpin for MistralLlmProvider
impl !UnwindSafe for MistralLlmProvider
Blanket Implementations§
impl<T> BorrowMut<T> for T
where
    T: ?Sized,

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value.