Struct rust_bert::pipelines::conversation::ConversationModel
pub struct ConversationModel { /* private fields */ }
Conversation model
Processes a ConversationManager and generates system responses for active conversations.
Implementations
impl ConversationModel
pub fn new(
    conversation_config: ConversationConfig
) -> Result<ConversationModel, RustBertError>
Build a new ConversationModel
Arguments
conversation_config - ConversationConfig object containing the resource references (model, vocabulary, configuration), conversation options and device placement (CPU/GPU)
Example
use rust_bert::pipelines::conversation::ConversationModel;
let conversation_model = ConversationModel::new(Default::default())?;

pub fn generate_responses<'a>(
    &self,
    conversation_manager: &'a mut ConversationManager
) -> HashMap<&'a Uuid, &'a str>
Perform a multi-turn conversation based on user input
Arguments
conversation_manager - &mut ConversationManager keeping track of active conversations
Returns
HashMap<&Uuid, &str> containing the responses from the model for each active conversation, referenced by Uuid
Example
use rust_bert::pipelines::conversation::{ConversationManager, ConversationModel};
use rust_bert::pipelines::generation_utils::LanguageGenerator;
let model = ConversationModel::new(Default::default())?;
let mut conversation_manager = ConversationManager::new();
conversation_manager.create("Hello, how are you?");
let output = model.generate_responses(&mut conversation_manager);

pub fn encode_prompts(&self, texts: &[&str]) -> Vec<Vec<i64>>
Encodes prompts into vectors of token indices to be processed by the model. This method may be used to initialize the history of a conversation with a prior state.
Example:
use rust_bert::pipelines::conversation::{ConversationManager, ConversationModel};
use rust_bert::pipelines::generation_utils::LanguageGenerator;
let model = ConversationModel::new(Default::default())?;
let history = [
"Going to the movies tonight - any suggestions?",
"The Big Lebowski",
"Is it an action movie?",
];
let encoded_history = model.encode_prompts(&history);
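The encoded history can then be loaded into a conversation to seed it with a prior state. The following is a minimal sketch of that follow-up step; it assumes the `ConversationManager::create_empty`, `ConversationManager::get` and `Conversation::load_from_history` methods available in recent rust-bert versions, and (as in the examples above) an enclosing function returning a `Result` so that `?` can be used:

```rust
use rust_bert::pipelines::conversation::{ConversationManager, ConversationModel};

let model = ConversationModel::new(Default::default())?;
let mut conversation_manager = ConversationManager::new();

let history = [
    "Going to the movies tonight - any suggestions?",
    "The Big Lebowski",
    "Is it an action movie?",
];
// Encode the past turns into vectors of token indices.
let encoded_history = model.encode_prompts(&history);

// Create an empty conversation and seed it with the prior turns, so that
// subsequently generated responses take this context into account.
let conversation_id = conversation_manager.create_empty();
if let Some(conversation) = conversation_manager.get(&conversation_id) {
    conversation.load_from_history(&history, &encoded_history);
}

// The next call to generate_responses continues from the loaded history.
let output = model.generate_responses(&mut conversation_manager);
```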