Struct rust_bert::pipelines::conversation::ConversationModel
pub struct ConversationModel { /* private fields */ }
Conversation model
Processes a ConversationManager and generates system responses for active conversations.
Implementations
impl ConversationModel
pub fn new(
    conversation_config: ConversationConfig,
) -> Result<ConversationModel, RustBertError>
Build a new ConversationModel
Arguments
conversation_config - ConversationConfig object containing the resource references (model, vocabulary, configuration), conversation options and device placement (CPU/GPU)
Example
use rust_bert::pipelines::conversation::ConversationModel;
let conversation_model = ConversationModel::new(Default::default())?;
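Default::default() uses the crate's standard conversation resources. The sketch below shows one way to override the device placement while keeping the remaining defaults; it assumes ConversationConfig exposes a public device field (as suggested by the device placement note above) and implements Default, so treat the field name as an assumption to verify against the ConversationConfig documentation.
use rust_bert::pipelines::conversation::{ConversationConfig, ConversationModel};
use tch::Device;
// Sketch: keep the default resources and generation options, but force CPU
// execution. The `device` field is assumed from the device placement note above.
let config = ConversationConfig {
    device: Device::Cpu,
    ..Default::default()
};
let conversation_model = ConversationModel::new(config)?;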
pub fn new_with_tokenizer(
    conversation_config: ConversationConfig,
    tokenizer: TokenizerOption,
) -> Result<ConversationModel, RustBertError>
Build a new ConversationModel with a provided tokenizer.
Arguments
conversation_config - ConversationConfig object containing the resource references (model, vocabulary, configuration), conversation options and device placement (CPU/GPU)
tokenizer - TokenizerOption tokenizer to use for the conversation
Example
use rust_bert::pipelines::common::{ModelType, TokenizerOption};
use rust_bert::pipelines::conversation::ConversationModel;
let tokenizer = TokenizerOption::from_file(
    ModelType::GPT2,
    "path/to/vocab.json",
    Some("path/to/merges.txt"),
    false,
    None,
    None,
)?;
let conversation_model = ConversationModel::new_with_tokenizer(Default::default(), tokenizer)?;
pub fn generate_responses<'a>(
    &self,
    conversation_manager: &'a mut ConversationManager,
) -> Result<HashMap<&'a Uuid, &'a str>, RustBertError>
Perform a multi-turn conversation based on user input
Arguments
conversation_manager - &mut ConversationManager keeping track of active conversations
Returns
HashMap<&Uuid, &str> - Responses from the model for each active conversation, referenced by Uuid
Example
use rust_bert::pipelines::conversation::{ConversationManager, ConversationModel};
use rust_bert::pipelines::generation_utils::LanguageGenerator;
let model = ConversationModel::new(Default::default())?;
let mut conversation_manager = ConversationManager::new();
conversation_manager.create("Hello, how are you?");
let output = model.generate_responses(&mut conversation_manager);
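To keep the exchange going, the returned map can be read by conversation Uuid and a new user turn appended before generating again. A minimal sketch follows, assuming ConversationManager::get returns a mutable reference to the conversation for a given Uuid and Conversation::add_user_input queues the next user turn; both names are assumptions to verify against the crate's documentation.
use rust_bert::pipelines::conversation::{ConversationManager, ConversationModel};
let model = ConversationModel::new(Default::default())?;
let mut conversation_manager = ConversationManager::new();
let conversation_id = conversation_manager.create("Hello, how are you?");
// First turn: generate responses for all active conversations.
let output = model.generate_responses(&mut conversation_manager)?;
println!("{:?}", output.get(&conversation_id));
// Next turn (assumed API): look the conversation up by its Uuid and queue
// the next user input before generating again.
if let Some(conversation) = conversation_manager.get(&conversation_id) {
    conversation.add_user_input("Any plans for the weekend?")?;
}
let output = model.generate_responses(&mut conversation_manager)?;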
pub fn encode_prompts(&self, texts: &[&str]) -> Vec<Vec<i64>>
Encodes prompts into vectors of token indices to be processed by the model. This method may be used to initialize the history of a conversation with a prior state.
Example
use rust_bert::pipelines::conversation::{ConversationManager, ConversationModel};
use rust_bert::pipelines::generation_utils::LanguageGenerator;
let model = ConversationModel::new(Default::default())?;
let history = [
    "Going to the movies tonight - any suggestions?",
    "The Big Lebowski",
    "Is it an action movie?",
];
let encoded_history = model.encode_prompts(&history);
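The encoded prompts can then be attached to a conversation as its prior state. A minimal sketch follows, assuming Conversation::new_empty, Conversation::load_from_history (taking the texts together with their encoded ids), and ConversationManager::add exist with roughly these shapes; the exact signatures should be checked against the Conversation and ConversationManager documentation.
use rust_bert::pipelines::conversation::{Conversation, ConversationManager, ConversationModel};
let model = ConversationModel::new(Default::default())?;
let history = [
    "Going to the movies tonight - any suggestions?",
    "The Big Lebowski",
    "Is it an action movie?",
];
let encoded_history = model.encode_prompts(&history);
// Assumed API: seed an empty conversation with the prior turns and their
// encodings, then register it with a manager so later calls to
// generate_responses continue from this state.
let mut conversation = Conversation::new_empty();
conversation.load_from_history(&history, &encoded_history);
let mut conversation_manager = ConversationManager::new();
conversation_manager.add(conversation);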
Auto Trait Implementations
impl !RefUnwindSafe for ConversationModel
impl Send for ConversationModel
impl !Sync for ConversationModel
impl Unpin for ConversationModel
impl !UnwindSafe for ConversationModel
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
Mutably borrows from an owned value.