pub struct StartLabelDetectionFluentBuilder { /* private fields */ }
Fluent builder constructing a request to `StartLabelDetection`.
Starts asynchronous detection of labels in a stored video.
Amazon Rekognition Video can detect labels in a video. Labels are instances of real-world entities. This includes objects like flower, tree, and table; events like wedding, graduation, and birthday party; concepts like landscape, evening, and nature; and activities like a person getting out of a car or a person skiing.
The video must be stored in an Amazon S3 bucket. Use `Video` to specify the bucket name and the filename of the video. `StartLabelDetection` returns a job identifier (`JobId`) which you use to get the results of the operation. When label detection is finished, Amazon Rekognition Video publishes a completion status to the Amazon Simple Notification Service topic that you specify in `NotificationChannel`.
To get the results of the label detection operation, first check that the status value published to the Amazon SNS topic is `SUCCEEDED`. If so, call `GetLabelDetection` and pass the job identifier (`JobId`) from the initial call to `StartLabelDetection`.
Optional Parameters
`StartLabelDetection` has the `GENERAL_LABELS` feature applied by default. This feature allows you to provide filtering criteria to the `Settings` parameter. You can filter with sets of individual labels or with label categories. You can specify inclusive filters, exclusive filters, or a combination of inclusive and exclusive filters. For more information on filtering, see Detecting labels in a video.
You can specify `MinConfidence` to control the confidence threshold for the labels returned. The default is 50.
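Below is a minimal sketch of the fluent call, assuming an already-configured `aws_sdk_rekognition::Client`; the bucket, object key, token, and tag values are illustrative.

```rust
use aws_sdk_rekognition::types::{S3Object, Video};
use aws_sdk_rekognition::{Client, Error};

// Hypothetical helper: starts label detection on a stored video and prints
// the JobId needed to fetch results later with GetLabelDetection.
async fn start_detection(client: &Client) -> Result<(), Error> {
    let output = client
        .start_label_detection()
        .video(
            Video::builder()
                .s3_object(
                    S3Object::builder()
                        .bucket("amzn-s3-demo-bucket") // placeholder bucket
                        .name("videos/graduation.mp4") // placeholder object key
                        .build(),
                )
                .build(),
        )
        .min_confidence(75.0)                          // override the default of 50
        .client_request_token("graduation-labels-001") // idempotency token
        .job_tag("graduation")                         // echoed in the SNS notification
        .send()
        .await?;

    println!("JobId: {:?}", output.job_id());
    Ok(())
}
```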
Implementations
impl StartLabelDetectionFluentBuilder
pub fn as_input(&self) -> &StartLabelDetectionInputBuilder
Access the StartLabelDetection input as a reference.
pub async fn send(
    self,
) -> Result<StartLabelDetectionOutput, SdkError<StartLabelDetectionError, HttpResponse>>
Sends the request and returns the response.
If an error occurs, an `SdkError` will be returned with additional details that can be matched against.
By default, any retryable failures will be retried twice. Retry behavior is configurable with the `RetryConfig`, which can be set when configuring the client.
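A sketch of handling the result of `send()`; the module path for the builder type follows the crate's usual layout and should be verified against your SDK version.

```rust
use aws_sdk_rekognition::operation::start_label_detection::builders::StartLabelDetectionFluentBuilder;

// Hypothetical helper: sends a prepared request and logs either the JobId or
// the service error extracted from the SdkError.
async fn start_and_log(builder: StartLabelDetectionFluentBuilder) {
    match builder.send().await {
        Ok(output) => {
            println!("started label detection, JobId: {:?}", output.job_id());
        }
        Err(sdk_err) => {
            // into_service_error() converts the SdkError into the operation's
            // StartLabelDetectionError, which implements Display.
            eprintln!("StartLabelDetection failed: {}", sdk_err.into_service_error());
        }
    }
}
```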
pub fn customize(
    self,
) -> CustomizableOperation<StartLabelDetectionOutput, StartLabelDetectionError, Self>
Consumes this builder, creating a customizable operation that can be modified before being sent.
pub fn video(self, input: Video) -> Self
The video in which you want to detect labels. The video must be stored in an Amazon S3 bucket.
pub fn set_video(self, input: Option<Video>) -> Self
The video in which you want to detect labels. The video must be stored in an Amazon S3 bucket.
pub fn get_video(&self) -> &Option<Video>
The video in which you want to detect labels. The video must be stored in an Amazon S3 bucket.
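A sketch of building the `Video` input from an S3 location; the bucket and key are placeholders, and the infallible `build()` calls are an assumption based on these types having no required fields.

```rust
use aws_sdk_rekognition::types::{S3Object, Video};

// Hypothetical helper: describes a stored video to analyze.
fn stored_video() -> Video {
    Video::builder()
        .s3_object(
            S3Object::builder()
                .bucket("amzn-s3-demo-bucket") // placeholder bucket
                .name("videos/wedding.mp4")    // placeholder object key
                .build(),
        )
        .build()
}
```

Pass the result to `video()`, or wrap it in `Some` for `set_video()`.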
pub fn client_request_token(self, input: impl Into<String>) -> Self
Idempotent token used to identify the start request. If you use the same token with multiple `StartLabelDetection` requests, the same `JobId` is returned. Use `ClientRequestToken` to prevent the same job from being accidentally started more than once.
pub fn set_client_request_token(self, input: Option<String>) -> Self
Idempotent token used to identify the start request. If you use the same token with multiple `StartLabelDetection` requests, the same `JobId` is returned. Use `ClientRequestToken` to prevent the same job from being accidentally started more than once.
pub fn get_client_request_token(&self) -> &Option<String>
Idempotent token used to identify the start request. If you use the same token with multiple `StartLabelDetection` requests, the same `JobId` is returned. Use `ClientRequestToken` to prevent the same job from being accidentally started more than once.
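A sketch of the idempotency this token provides: reusing the same token on a retried call returns the original `JobId` instead of starting a second job. The token and video values are illustrative.

```rust
use aws_sdk_rekognition::types::Video;
use aws_sdk_rekognition::{Client, Error};

// Hypothetical helper: issues the same start request twice with one token.
async fn start_exactly_once(client: &Client, video: Video) -> Result<(), Error> {
    let token = "wedding-video-2024-06-01"; // illustrative token

    let first = client
        .start_label_detection()
        .client_request_token(token)
        .video(video.clone())
        .send()
        .await?;

    // The retry is deduplicated by the service and reports the same job.
    let retried = client
        .start_label_detection()
        .client_request_token(token)
        .video(video)
        .send()
        .await?;

    assert_eq!(first.job_id(), retried.job_id());
    Ok(())
}
```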
pub fn min_confidence(self, input: f32) -> Self
Specifies the minimum confidence that Amazon Rekognition Video must have in order to return a detected label. Confidence represents how certain Amazon Rekognition is that a label is correctly identified. 0 is the lowest confidence. 100 is the highest confidence. Amazon Rekognition Video doesn't return any labels with a confidence level lower than this specified value.
If you don't specify `MinConfidence`, the operation returns labels and bounding boxes (if detected) with confidence values greater than or equal to 50 percent.
pub fn set_min_confidence(self, input: Option<f32>) -> Self
Specifies the minimum confidence that Amazon Rekognition Video must have in order to return a detected label. Confidence represents how certain Amazon Rekognition is that a label is correctly identified. 0 is the lowest confidence. 100 is the highest confidence. Amazon Rekognition Video doesn't return any labels with a confidence level lower than this specified value.
If you don't specify `MinConfidence`, the operation returns labels and bounding boxes (if detected) with confidence values greater than or equal to 50 percent.
pub fn get_min_confidence(&self) -> &Option<f32>
Specifies the minimum confidence that Amazon Rekognition Video must have in order to return a detected label. Confidence represents how certain Amazon Rekognition is that a label is correctly identified. 0 is the lowest confidence. 100 is the highest confidence. Amazon Rekognition Video doesn't return any labels with a confidence level lower than this specified value.
If you don't specify `MinConfidence`, the operation returns labels and bounding boxes (if detected) with confidence values greater than or equal to 50 percent.
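A short sketch of the `Option`-taking variant, which is convenient when the threshold comes from configuration; it assumes a `client: &aws_sdk_rekognition::Client` is in scope.

```rust
// Passing None keeps the service default of 50 percent.
let threshold: Option<f32> = Some(80.0);
let request = client
    .start_label_detection()
    .set_min_confidence(threshold);
```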
pub fn notification_channel(self, input: NotificationChannel) -> Self
The Amazon SNS topic ARN you want Amazon Rekognition Video to publish the completion status of the label detection operation to. The Amazon SNS topic must have a topic name that begins with AmazonRekognition if you are using the AmazonRekognitionServiceRole permissions policy.
pub fn set_notification_channel(self, input: Option<NotificationChannel>) -> Self
The Amazon SNS topic ARN you want Amazon Rekognition Video to publish the completion status of the label detection operation to. The Amazon SNS topic must have a topic name that begins with AmazonRekognition if you are using the AmazonRekognitionServiceRole permissions policy.
pub fn get_notification_channel(&self) -> &Option<NotificationChannel>
The Amazon SNS topic ARN you want Amazon Rekognition Video to publish the completion status of the label detection operation to. The Amazon SNS topic must have a topic name that begins with AmazonRekognition if you are using the AmazonRekognitionServiceRole permissions policy.
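A sketch of building the notification channel; the ARNs are placeholders, and the fallible `build()` is an assumption based on both fields being required by the service in recent SDK versions.

```rust
use aws_sdk_rekognition::error::BuildError;
use aws_sdk_rekognition::types::NotificationChannel;

// Hypothetical helper: the topic name starts with "AmazonRekognition" to match
// the AmazonRekognitionServiceRole policy mentioned above.
fn labels_channel() -> Result<NotificationChannel, BuildError> {
    NotificationChannel::builder()
        .sns_topic_arn("arn:aws:sns:us-east-1:111122223333:AmazonRekognitionVideoLabels")
        .role_arn("arn:aws:iam::111122223333:role/RekognitionVideoSnsRole")
        .build()
}
```

Pass the result to `notification_channel()` on the builder.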
pub fn job_tag(self, input: impl Into<String>) -> Self
An identifier you specify that's returned in the completion notification that's published to your Amazon Simple Notification Service topic. For example, you can use `JobTag` to group related jobs and identify them in the completion notification.
pub fn set_job_tag(self, input: Option<String>) -> Self
An identifier you specify that's returned in the completion notification that's published to your Amazon Simple Notification Service topic. For example, you can use `JobTag` to group related jobs and identify them in the completion notification.
pub fn get_job_tag(&self) -> &Option<String>
An identifier you specify that's returned in the completion notification that's published to your Amazon Simple Notification Service topic. For example, you can use `JobTag` to group related jobs and identify them in the completion notification.
pub fn features(self, input: LabelDetectionFeatureName) -> Self
Appends an item to `Features`. To override the contents of this collection use `set_features`.
The features to return after video analysis. You can specify that GENERAL_LABELS are returned.
pub fn set_features(self, input: Option<Vec<LabelDetectionFeatureName>>) -> Self
The features to return after video analysis. You can specify that GENERAL_LABELS are returned.
pub fn get_features(&self) -> &Option<Vec<LabelDetectionFeatureName>>
The features to return after video analysis. You can specify that GENERAL_LABELS are returned.
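A short sketch of appending the feature explicitly (it is applied by default per the description above); it assumes a `client: &aws_sdk_rekognition::Client` is in scope.

```rust
use aws_sdk_rekognition::types::LabelDetectionFeatureName;

// Each call to features() appends one value to the Features collection.
let request = client
    .start_label_detection()
    .features(LabelDetectionFeatureName::GeneralLabels);
```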
pub fn settings(self, input: LabelDetectionSettings) -> Self
The settings for a StartLabelDetection request. Contains the specified parameters for the label detection request of an asynchronous label analysis operation. Settings can include filters for GENERAL_LABELS.
pub fn set_settings(self, input: Option<LabelDetectionSettings>) -> Self
The settings for a StartLabelDetection request. Contains the specified parameters for the label detection request of an asynchronous label analysis operation. Settings can include filters for GENERAL_LABELS.
pub fn get_settings(&self) -> &Option<LabelDetectionSettings>
The settings for a StartLabelDetection request. Contains the specified parameters for the label detection request of an asynchronous label analysis operation. Settings can include filters for GENERAL_LABELS.
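A sketch of filtering the GENERAL_LABELS results through `Settings`; the filter values are illustrative and the nested `GeneralLabelsSettings` builder shape is an assumption based on the crate's generated types.

```rust
use aws_sdk_rekognition::types::{GeneralLabelsSettings, LabelDetectionSettings};

// Hypothetical filter: keep one label category and drop one individual label.
fn vehicle_filter() -> LabelDetectionSettings {
    LabelDetectionSettings::builder()
        .general_labels(
            GeneralLabelsSettings::builder()
                .label_category_inclusion_filters("Vehicles and Automotive")
                .label_exclusion_filters("Bicycle")
                .build(),
        )
        .build()
}
```

Pass the result to `settings()`, or wrap it in `Some` for `set_settings()`.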
Trait Implementations
impl Clone for StartLabelDetectionFluentBuilder
fn clone(&self) -> StartLabelDetectionFluentBuilder
fn clone_from(&mut self, source: &Self)
Performs copy-assignment from `source`.
Auto Trait Implementations
impl Freeze for StartLabelDetectionFluentBuilder
impl !RefUnwindSafe for StartLabelDetectionFluentBuilder
impl Send for StartLabelDetectionFluentBuilder
impl Sync for StartLabelDetectionFluentBuilder
impl Unpin for StartLabelDetectionFluentBuilder
impl !UnwindSafe for StartLabelDetectionFluentBuilder
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
impl<T> CloneToUninit for T
where
    T: Clone,
impl<T> Instrument for T
fn instrument(self, span: Span) -> Instrumented<Self>
fn in_current_span(self) -> Instrumented<Self>
impl<T> IntoEither for T
fn into_either(self, into_left: bool) -> Either<Self, Self>
Converts `self` into a `Left` variant of `Either<Self, Self>` if `into_left` is `true`. Converts `self` into a `Right` variant of `Either<Self, Self>` otherwise.
fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>
Converts `self` into a `Left` variant of `Either<Self, Self>` if `into_left(&self)` returns `true`. Converts `self` into a `Right` variant of `Either<Self, Self>` otherwise.
impl<T> Paint for T
where
    T: ?Sized,
fn fg(&self, value: Color) -> Painted<&T>
Returns a styled value derived from `self` with the foreground set to `value`.
This method should be used rarely. Instead, prefer to use color-specific builder methods like `red()` and `green()`, which have the same functionality but are pithier.
Example
Set foreground color to white using `fg()`:
use yansi::{Paint, Color};
painted.fg(Color::White);
Set foreground color to white using `white()`.
use yansi::Paint;
painted.white();
fn bright_black(&self) -> Painted<&T>
fn bright_red(&self) -> Painted<&T>
fn bright_green(&self) -> Painted<&T>
fn bright_yellow(&self) -> Painted<&T>
fn bright_blue(&self) -> Painted<&T>
fn bright_magenta(&self) -> Painted<&T>
fn bright_cyan(&self) -> Painted<&T>
fn bright_white(&self) -> Painted<&T>
fn bg(&self, value: Color) -> Painted<&T>
Returns a styled value derived from `self` with the background set to `value`.
This method should be used rarely. Instead, prefer to use color-specific builder methods like `on_red()` and `on_green()`, which have the same functionality but are pithier.
Example
Set background color to red using `bg()`:
use yansi::{Paint, Color};
painted.bg(Color::Red);
Set background color to red using `on_red()`.
use yansi::Paint;
painted.on_red();
fn on_primary(&self) -> Painted<&T>
fn on_magenta(&self) -> Painted<&T>
fn on_bright_black(&self) -> Painted<&T>
fn on_bright_red(&self) -> Painted<&T>
fn on_bright_green(&self) -> Painted<&T>
fn on_bright_yellow(&self) -> Painted<&T>
fn on_bright_blue(&self) -> Painted<&T>
fn on_bright_magenta(&self) -> Painted<&T>
fn on_bright_cyan(&self) -> Painted<&T>
fn on_bright_white(&self) -> Painted<&T>
fn attr(&self, value: Attribute) -> Painted<&T>
Enables the styling `Attribute` `value`.
This method should be used rarely. Instead, prefer to use attribute-specific builder methods like `bold()` and `underline()`, which have the same functionality but are pithier.
Example
Make text bold using `attr()`:
use yansi::{Paint, Attribute};
painted.attr(Attribute::Bold);
Make text bold using `bold()`.
use yansi::Paint;
painted.bold();
fn rapid_blink(&self) -> Painted<&T>
fn quirk(&self, value: Quirk) -> Painted<&T>
Enables the yansi `Quirk` `value`.
This method should be used rarely. Instead, prefer to use quirk-specific builder methods like `mask()` and `wrap()`, which have the same functionality but are pithier.
Example
Enable wrapping using `quirk()`:
use yansi::{Paint, Quirk};
painted.quirk(Quirk::Wrap);
Enable wrapping using `wrap()`.
use yansi::Paint;
painted.wrap();
fn clear(&self) -> Painted<&T>
Deprecated since 1.0.1: renamed to `resetting()` due to conflicts with `Vec::clear()`. The `clear()` method will be removed in a future release.
fn whenever(&self, value: Condition) -> Painted<&T>
Conditionally enable styling based on whether the `Condition` `value` applies. Replaces any previous condition.
See the crate level docs for more details.
Example
Enable styling `painted` only when both `stdout` and `stderr` are TTYs:
use yansi::{Paint, Condition};
painted.red().on_yellow().whenever(Condition::STDOUTERR_ARE_TTY);