#[non_exhaustive]
pub struct StreamingAnalyzeContentResponse {
    pub recognition_result: Option<StreamingRecognitionResult>,
    pub reply_text: String,
    pub reply_audio: Option<OutputAudio>,
    pub automated_agent_reply: Option<AutomatedAgentReply>,
    pub message: Option<Message>,
    pub human_agent_suggestion_results: Vec<SuggestionResult>,
    pub end_user_suggestion_results: Vec<SuggestionResult>,
    pub dtmf_parameters: Option<DtmfParameters>,
    pub debugging_info: Option<CloudConversationDebuggingInfo>,
    pub speech_model: String,
    /* private fields */
}
The top-level message returned from the StreamingAnalyzeContent method.
Multiple response messages can be returned in order:
- If the input was set to streaming audio, the first one or more messages contain recognition_result. Each recognition_result represents a more complete transcript of what the user said. The last recognition_result has is_final set to true.
- In virtual agent stage: if enable_partial_automated_agent_reply is true, the following N (currently 1 <= N <= 4) messages contain automated_agent_reply and optionally reply_audio returned by the virtual agent. The first (N-1) automated_agent_reply messages have automated_agent_reply_type set to PARTIAL. The last automated_agent_reply has automated_agent_reply_type set to FINAL. If enable_partial_automated_agent_reply is not enabled, the response stream only contains the final reply.
  In human assist stage: the following N (N >= 1) messages contain human_agent_suggestion_results, end_user_suggestion_results or message.
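For orientation, here is a minimal sketch of walking already-received responses in that order and inspecting the public fields documented below. It is not part of the generated API; the transcript and is_final accesses assume StreamingRecognitionResult mirrors the proto fields of the same names, and the streaming-client plumbing is omitted.

use google_cloud_dialogflow_v2::model::StreamingAnalyzeContentResponse;

// Walk responses in arrival order: speech transcripts first, then the
// virtual agent reply or human-assist suggestions.
fn inspect(responses: &[StreamingAnalyzeContentResponse]) {
    for response in responses {
        if let Some(recognition) = &response.recognition_result {
            // Each recognition_result is a progressively more complete
            // transcript; the last one has is_final set to true.
            println!("transcript: {} (final: {})", recognition.transcript, recognition.is_final);
        }
        if response.automated_agent_reply.is_some() {
            // Virtual agent stage: partial replies may precede the final one.
            println!("agent reply text: {}", response.reply_text);
        }
        // Human assist stage: suggestion results for either participant.
        let for_agent = response.human_agent_suggestion_results.len();
        let for_user = response.end_user_suggestion_results.len();
        if for_agent + for_user > 0 {
            println!("suggestions: {for_agent} for human agent, {for_user} for end user");
        }
    }
}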
Fields (Non-exhaustive)§
This struct is marked as non-exhaustive: it cannot be constructed in external crates using the Struct { .. } syntax; it cannot be matched against without a wildcard ..; and struct update syntax will not work.
recognition_result: Option<StreamingRecognitionResult>
The result of speech recognition.
reply_text: String
The output text content. This field is set if an automated agent responded with text for the user.
reply_audio: Option<OutputAudio>
The audio data bytes encoded as specified in the request. This field is set if:
- The reply_audio_config field is specified in the request.
- The automated agent, which this output comes from, responded with audio.
In that case, the reply_audio.config field contains the settings used to synthesize the speech (a short sketch of saving this audio follows the field list).
In some scenarios, multiple output audio fields may be present in the response structure. In these cases, only the top-most-level audio output has content.
automated_agent_reply: Option<AutomatedAgentReply>
Note that in [AutomatedAgentReply.DetectIntentResponse][], [Sessions.DetectIntentResponse.output_audio][] and [Sessions.DetectIntentResponse.output_audio_config][] are always empty; use reply_audio instead.
message: Option<Message>
Message analyzed by CCAI.
human_agent_suggestion_results: Vec<SuggestionResult>
The suggestions for the most recent human agent. The order is the same as HumanAgentAssistantConfig.SuggestionConfig.feature_configs of HumanAgentAssistantConfig.human_agent_suggestion_config.
end_user_suggestion_results: Vec<SuggestionResult>
The suggestions for the end user. The order is the same as HumanAgentAssistantConfig.SuggestionConfig.feature_configs of HumanAgentAssistantConfig.end_user_suggestion_config.
dtmf_parameters: Option<DtmfParameters>
Indicates the parameters of DTMF.
debugging_info: Option<CloudConversationDebuggingInfo>
Debugging info that is populated when StreamingAnalyzeContentRequest.enable_debugging_info is set to true.
speech_model: String
The name of the actual Cloud speech model used for speech recognition.
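As referenced above, a minimal sketch of handling reply_audio. It assumes OutputAudio exposes the proto's audio bytes and config fields under those names; the file name is an arbitrary placeholder.

use google_cloud_dialogflow_v2::model::StreamingAnalyzeContentResponse;

// Persist synthesized agent audio, if any was returned.
fn save_reply_audio(response: &StreamingAnalyzeContentResponse) -> std::io::Result<()> {
    if let Some(output) = &response.reply_audio {
        // output.config (assumed to mirror the proto) records the synthesis
        // settings used; the audio bytes are written out unchanged.
        std::fs::write("reply_audio.bin", &output.audio[..])?;
    }
    Ok(())
}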
Implementations§
impl StreamingAnalyzeContentResponse
pub fn new() -> Self
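new() is the entry point for builder-style construction; because the struct is non-exhaustive and has private fields, it cannot be built with struct literal syntax outside the crate. A minimal sketch chaining setters documented below (the literal values are placeholders):

use google_cloud_dialogflow_v2::model::StreamingAnalyzeContentResponse;
let x = StreamingAnalyzeContentResponse::new()
    .set_reply_text("example reply")
    .set_speech_model("example-model");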
pub fn set_recognition_result<T>(self, v: T) -> Self
where
    T: Into<StreamingRecognitionResult>,
Sets the value of recognition_result.
§Example
use google_cloud_dialogflow_v2::model::StreamingRecognitionResult;
let x = StreamingAnalyzeContentResponse::new().set_recognition_result(StreamingRecognitionResult::default()/* use setters */);
pub fn set_or_clear_recognition_result<T>(self, v: Option<T>) -> Self
where
    T: Into<StreamingRecognitionResult>,
Sets or clears the value of recognition_result.
§Example
use google_cloud_dialogflow_v2::model::StreamingRecognitionResult;
let x = StreamingAnalyzeContentResponse::new().set_or_clear_recognition_result(Some(StreamingRecognitionResult::default()/* use setters */));
let x = StreamingAnalyzeContentResponse::new().set_or_clear_recognition_result(None::<StreamingRecognitionResult>);
pub fn set_reply_text<T: Into<String>>(self, v: T) -> Self
Sets the value of reply_text.
§Example
let x = StreamingAnalyzeContentResponse::new().set_reply_text("example");
pub fn set_reply_audio<T>(self, v: T) -> Self
where
    T: Into<OutputAudio>,
Sets the value of reply_audio.
§Example
use google_cloud_dialogflow_v2::model::OutputAudio;
let x = StreamingAnalyzeContentResponse::new().set_reply_audio(OutputAudio::default()/* use setters */);
pub fn set_or_clear_reply_audio<T>(self, v: Option<T>) -> Self
where
    T: Into<OutputAudio>,
Sets or clears the value of reply_audio.
§Example
use google_cloud_dialogflow_v2::model::OutputAudio;
let x = StreamingAnalyzeContentResponse::new().set_or_clear_reply_audio(Some(OutputAudio::default()/* use setters */));
let x = StreamingAnalyzeContentResponse::new().set_or_clear_reply_audio(None::<OutputAudio>);
pub fn set_automated_agent_reply<T>(self, v: T) -> Self
where
    T: Into<AutomatedAgentReply>,
Sets the value of automated_agent_reply.
§Example
use google_cloud_dialogflow_v2::model::AutomatedAgentReply;
let x = StreamingAnalyzeContentResponse::new().set_automated_agent_reply(AutomatedAgentReply::default()/* use setters */);
pub fn set_or_clear_automated_agent_reply<T>(self, v: Option<T>) -> Self
where
    T: Into<AutomatedAgentReply>,
Sets or clears the value of automated_agent_reply.
§Example
use google_cloud_dialogflow_v2::model::AutomatedAgentReply;
let x = StreamingAnalyzeContentResponse::new().set_or_clear_automated_agent_reply(Some(AutomatedAgentReply::default()/* use setters */));
let x = StreamingAnalyzeContentResponse::new().set_or_clear_automated_agent_reply(None::<AutomatedAgentReply>);
pub fn set_message<T>(self, v: T) -> Self
Sets the value of message.
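The generated page gives no example for this setter; the sketch below mirrors the pattern of the other setters and assumes the usual Into<Message> bound applies:

use google_cloud_dialogflow_v2::model::Message;
let x = StreamingAnalyzeContentResponse::new().set_message(Message::default()/* use setters */);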
pub fn set_or_clear_message<T>(self, v: Option<T>) -> Self
Sets or clears the value of message.
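Likewise, a sketch for set_or_clear_message under the same assumption:

use google_cloud_dialogflow_v2::model::Message;
let x = StreamingAnalyzeContentResponse::new().set_or_clear_message(Some(Message::default()/* use setters */));
let x = StreamingAnalyzeContentResponse::new().set_or_clear_message(None::<Message>);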
pub fn set_human_agent_suggestion_results<T, V>(self, v: T) -> Self
Sets the value of human_agent_suggestion_results.
§Example
use google_cloud_dialogflow_v2::model::SuggestionResult;
let x = StreamingAnalyzeContentResponse::new()
.set_human_agent_suggestion_results([
SuggestionResult::default()/* use setters */,
SuggestionResult::default()/* use (different) setters */,
]);
pub fn set_end_user_suggestion_results<T, V>(self, v: T) -> Self
Sets the value of end_user_suggestion_results.
§Example
use google_cloud_dialogflow_v2::model::SuggestionResult;
let x = StreamingAnalyzeContentResponse::new()
.set_end_user_suggestion_results([
SuggestionResult::default()/* use setters */,
SuggestionResult::default()/* use (different) setters */,
]);
pub fn set_dtmf_parameters<T>(self, v: T) -> Self
where
    T: Into<DtmfParameters>,
Sets the value of dtmf_parameters.
§Example
use google_cloud_dialogflow_v2::model::DtmfParameters;
let x = StreamingAnalyzeContentResponse::new().set_dtmf_parameters(DtmfParameters::default()/* use setters */);
pub fn set_or_clear_dtmf_parameters<T>(self, v: Option<T>) -> Self
where
    T: Into<DtmfParameters>,
Sets or clears the value of dtmf_parameters.
§Example
use google_cloud_dialogflow_v2::model::DtmfParameters;
let x = StreamingAnalyzeContentResponse::new().set_or_clear_dtmf_parameters(Some(DtmfParameters::default()/* use setters */));
let x = StreamingAnalyzeContentResponse::new().set_or_clear_dtmf_parameters(None::<DtmfParameters>);
pub fn set_debugging_info<T>(self, v: T) -> Self
where
    T: Into<CloudConversationDebuggingInfo>,
Sets the value of debugging_info.
§Example
use google_cloud_dialogflow_v2::model::CloudConversationDebuggingInfo;
let x = StreamingAnalyzeContentResponse::new().set_debugging_info(CloudConversationDebuggingInfo::default()/* use setters */);
pub fn set_or_clear_debugging_info<T>(self, v: Option<T>) -> Self
where
    T: Into<CloudConversationDebuggingInfo>,
Sets or clears the value of debugging_info.
§Example
use google_cloud_dialogflow_v2::model::CloudConversationDebuggingInfo;
let x = StreamingAnalyzeContentResponse::new().set_or_clear_debugging_info(Some(CloudConversationDebuggingInfo::default()/* use setters */));
let x = StreamingAnalyzeContentResponse::new().set_or_clear_debugging_info(None::<CloudConversationDebuggingInfo>);
pub fn set_speech_model<T: Into<String>>(self, v: T) -> Self
Sets the value of speech_model.
§Example
let x = StreamingAnalyzeContentResponse::new().set_speech_model("example");
Trait Implementations§
impl Clone for StreamingAnalyzeContentResponse
fn clone(&self) -> StreamingAnalyzeContentResponse
fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.
impl Default for StreamingAnalyzeContentResponse
fn default() -> StreamingAnalyzeContentResponse
impl PartialEq for StreamingAnalyzeContentResponse
fn eq(&self, other: &StreamingAnalyzeContentResponse) -> bool
Tests for self and other values to be equal, and is used by ==.