#[non_exhaustive]
pub struct IndexFacesInputBuilder { /* private fields */ }
A builder for IndexFacesInput.
Implementations
impl IndexFacesInputBuilder
pub fn collection_id(self, input: impl Into<String>) -> Self
The ID of an existing collection to which you want to add the faces that are detected in the input images.
This field is required.
pub fn set_collection_id(self, input: Option<String>) -> Self
The ID of an existing collection to which you want to add the faces that are detected in the input images.
pub fn get_collection_id(&self) -> &Option<String>
The ID of an existing collection to which you want to add the faces that are detected in the input images.
pub fn image(self, input: Image) -> Self
The input image as base64-encoded bytes or an S3 object. If you use the AWS CLI to call Amazon Rekognition operations, passing base64-encoded image bytes isn't supported.
If you are using an AWS SDK to call Amazon Rekognition, you might not need to base64-encode image bytes passed using the Bytes field. For more information, see Images in the Amazon Rekognition developer guide.
pub fn set_image(self, input: Option<Image>) -> Self
The input image as base64-encoded bytes or an S3 object. If you use the AWS CLI to call Amazon Rekognition operations, passing base64-encoded image bytes isn't supported.
If you are using an AWS SDK to call Amazon Rekognition, you might not need to base64-encode image bytes passed using the Bytes field. For more information, see Images in the Amazon Rekognition developer guide.
pub fn get_image(&self) -> &Option<Image>
The input image as base64-encoded bytes or an S3 object. If you use the AWS CLI to call Amazon Rekognition operations, passing base64-encoded image bytes isn't supported.
If you are using an AWS SDK to call Amazon Rekognition, you might not need to base64-encode image bytes passed using the Bytes field. For more information, see Images in the Amazon Rekognition developer guide.
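A minimal sketch of the two ways the image can be supplied. The module paths are assumptions based on the current aws-sdk-rekognition layout (aws_sdk_rekognition::types for Image and S3Object, aws_sdk_rekognition::primitives::Blob for raw bytes) and may differ in older releases; the bucket, object key, and local file path are placeholders.

use aws_sdk_rekognition::primitives::Blob;
use aws_sdk_rekognition::types::{Image, S3Object};

fn example_images() -> Result<(), std::io::Error> {
    // Reference an image stored in S3.
    let _from_s3 = Image::builder()
        .s3_object(
            S3Object::builder()
                .bucket("example-bucket")   // placeholder bucket
                .name("faces/photo-1.jpg")  // placeholder object key
                .build(),
        )
        .build();

    // Or pass raw image bytes; when using an AWS SDK you might not need to
    // base64-encode them yourself.
    let _from_bytes = Image::builder()
        .bytes(Blob::new(std::fs::read("photo-1.jpg")?)) // placeholder local path
        .build();

    Ok(())
}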
pub fn external_image_id(self, input: impl Into<String>) -> Self
The ID you want to assign to all the faces detected in the image.
pub fn set_external_image_id(self, input: Option<String>) -> Self
The ID you want to assign to all the faces detected in the image.
pub fn get_external_image_id(&self) -> &Option<String>
The ID you want to assign to all the faces detected in the image.
pub fn detection_attributes(self, input: Attribute) -> Self
Appends an item to detection_attributes.
To override the contents of this collection, use set_detection_attributes.
An array of facial attributes you want to be returned. A DEFAULT subset of facial attributes (BoundingBox, Confidence, Pose, Quality, and Landmarks) is always returned. You can request specific facial attributes, in addition to the default list, by using ["DEFAULT", "FACE_OCCLUDED"] or just ["FACE_OCCLUDED"]. You can request all facial attributes by using ["ALL"]. Requesting more attributes may increase response time.
If you provide both ["ALL", "DEFAULT"], the service uses a logical AND operator to determine which attributes to return (in this case, all attributes).
pub fn set_detection_attributes(self, input: Option<Vec<Attribute>>) -> Self
An array of facial attributes you want to be returned. A DEFAULT subset of facial attributes (BoundingBox, Confidence, Pose, Quality, and Landmarks) is always returned. You can request specific facial attributes, in addition to the default list, by using ["DEFAULT", "FACE_OCCLUDED"] or just ["FACE_OCCLUDED"]. You can request all facial attributes by using ["ALL"]. Requesting more attributes may increase response time.
If you provide both ["ALL", "DEFAULT"], the service uses a logical AND operator to determine which attributes to return (in this case, all attributes).
pub fn get_detection_attributes(&self) -> &Option<Vec<Attribute>>
An array of facial attributes you want to be returned. A DEFAULT subset of facial attributes (BoundingBox, Confidence, Pose, Quality, and Landmarks) is always returned. You can request specific facial attributes, in addition to the default list, by using ["DEFAULT", "FACE_OCCLUDED"] or just ["FACE_OCCLUDED"]. You can request all facial attributes by using ["ALL"]. Requesting more attributes may increase response time.
If you provide both ["ALL", "DEFAULT"], the service uses a logical AND operator to determine which attributes to return (in this case, all attributes).
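The append-versus-override behavior described above, as a minimal sketch. The Attribute variant names used here (Attribute::Default, Attribute::FaceOccluded, Attribute::All) are assumed to follow the SDK's usual UpperCamelCase mapping of the DEFAULT, FACE_OCCLUDED, and ALL values.

use aws_sdk_rekognition::operation::index_faces::IndexFacesInput;
use aws_sdk_rekognition::types::Attribute;

fn choose_attributes() {
    // detection_attributes() appends one Attribute per call.
    let appended = IndexFacesInput::builder()
        .detection_attributes(Attribute::Default)
        .detection_attributes(Attribute::FaceOccluded);

    // set_detection_attributes() replaces the whole collection at once.
    let replaced = IndexFacesInput::builder()
        .set_detection_attributes(Some(vec![Attribute::All]));

    let _ = (appended, replaced);
}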
pub fn max_faces(self, input: i32) -> Self
The maximum number of faces to index. The value of MaxFaces must be greater than or equal to 1. IndexFaces returns no more than 100 detected faces in an image, even if you specify a larger value for MaxFaces.
If IndexFaces detects more faces than the value of MaxFaces, the faces with the lowest quality are filtered out first. If there are still more faces than the value of MaxFaces, the faces with the smallest bounding boxes are filtered out (up to the number that's needed to satisfy the value of MaxFaces). Information about the unindexed faces is available in the UnindexedFaces array.
The faces returned by IndexFaces are sorted by face bounding box size, from the largest to the smallest.
MaxFaces can be used with a collection associated with any version of the face model.
pub fn set_max_faces(self, input: Option<i32>) -> Self
The maximum number of faces to index. The value of MaxFaces must be greater than or equal to 1. IndexFaces returns no more than 100 detected faces in an image, even if you specify a larger value for MaxFaces.
If IndexFaces detects more faces than the value of MaxFaces, the faces with the lowest quality are filtered out first. If there are still more faces than the value of MaxFaces, the faces with the smallest bounding boxes are filtered out (up to the number that's needed to satisfy the value of MaxFaces). Information about the unindexed faces is available in the UnindexedFaces array.
The faces returned by IndexFaces are sorted by face bounding box size, from the largest to the smallest.
MaxFaces can be used with a collection associated with any version of the face model.
pub fn get_max_faces(&self) -> &Option<i32>
The maximum number of faces to index. The value of MaxFaces must be greater than or equal to 1. IndexFaces returns no more than 100 detected faces in an image, even if you specify a larger value for MaxFaces.
If IndexFaces detects more faces than the value of MaxFaces, the faces with the lowest quality are filtered out first. If there are still more faces than the value of MaxFaces, the faces with the smallest bounding boxes are filtered out (up to the number that's needed to satisfy the value of MaxFaces). Information about the unindexed faces is available in the UnindexedFaces array.
The faces returned by IndexFaces are sorted by face bounding box size, from the largest to the smallest.
MaxFaces can be used with a collection associated with any version of the face model.
pub fn quality_filter(self, input: QualityFilter) -> Self
A filter that specifies a quality bar for how much filtering is done to identify faces. Filtered faces aren't indexed. If you specify AUTO, Amazon Rekognition chooses the quality bar. If you specify LOW, MEDIUM, or HIGH, filtering removes all faces that don’t meet the chosen quality bar. The default value is AUTO. The quality bar is based on a variety of common use cases. Low-quality detections can occur for a number of reasons. Some examples are an object that's misidentified as a face, a face that's too blurry, or a face with a pose that's too extreme to use. If you specify NONE, no filtering is performed.
To use quality filtering, the collection you are using must be associated with version 3 of the face model or higher.
pub fn set_quality_filter(self, input: Option<QualityFilter>) -> Self
A filter that specifies a quality bar for how much filtering is done to identify faces. Filtered faces aren't indexed. If you specify AUTO, Amazon Rekognition chooses the quality bar. If you specify LOW, MEDIUM, or HIGH, filtering removes all faces that don’t meet the chosen quality bar. The default value is AUTO. The quality bar is based on a variety of common use cases. Low-quality detections can occur for a number of reasons. Some examples are an object that's misidentified as a face, a face that's too blurry, or a face with a pose that's too extreme to use. If you specify NONE, no filtering is performed.
To use quality filtering, the collection you are using must be associated with version 3 of the face model or higher.
pub fn get_quality_filter(&self) -> &Option<QualityFilter>
A filter that specifies a quality bar for how much filtering is done to identify faces. Filtered faces aren't indexed. If you specify AUTO, Amazon Rekognition chooses the quality bar. If you specify LOW, MEDIUM, or HIGH, filtering removes all faces that don’t meet the chosen quality bar. The default value is AUTO. The quality bar is based on a variety of common use cases. Low-quality detections can occur for a number of reasons. Some examples are an object that's misidentified as a face, a face that's too blurry, or a face with a pose that's too extreme to use. If you specify NONE, no filtering is performed.
To use quality filtering, the collection you are using must be associated with version 3 of the face model or higher.
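A minimal sketch combining MaxFaces and QualityFilter on one builder. The QualityFilter variant names (Auto, None) are assumed to follow the SDK's UpperCamelCase mapping of the AUTO and NONE values.

use aws_sdk_rekognition::operation::index_faces::IndexFacesInput;
use aws_sdk_rekognition::types::QualityFilter;

fn tune_indexing() {
    // Index at most five faces and let the service choose the quality bar.
    let _capped = IndexFacesInput::builder()
        .max_faces(5)
        .quality_filter(QualityFilter::Auto);

    // Passing the NONE value disables quality filtering entirely.
    let _unfiltered = IndexFacesInput::builder()
        .quality_filter(QualityFilter::None);
}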
pub fn build(self) -> Result<IndexFacesInput, BuildError>
Consumes the builder and constructs an IndexFacesInput.
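A minimal sketch of a full builder chain ending in build(). The collection ID and external image ID are placeholders, and the Image argument can be constructed as shown for image() above.

use aws_sdk_rekognition::operation::index_faces::IndexFacesInput;
use aws_sdk_rekognition::types::{Attribute, Image};

fn build_index_faces_input(image: Image) -> Result<IndexFacesInput, Box<dyn std::error::Error>> {
    // Chain the setters, then consume the builder with build().
    let input = IndexFacesInput::builder()
        .collection_id("example-collection")      // placeholder collection ID
        .image(image)
        .external_image_id("photo-1")             // placeholder external image ID
        .detection_attributes(Attribute::Default)
        .max_faces(10)
        .build()?;                                // propagate any BuildError
    Ok(input)
}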
impl IndexFacesInputBuilder
pub async fn send_with(
    self,
    client: &Client,
) -> Result<IndexFacesOutput, SdkError<IndexFacesError, HttpResponse>>
Sends a request with this input using the given client.
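A minimal async sketch of send_with, assuming the aws-config crate is used to load credentials and region from the environment; the collection ID is a placeholder and the Image can be built as shown for image() above.

use aws_sdk_rekognition::operation::index_faces::IndexFacesInput;
use aws_sdk_rekognition::types::Image;
use aws_sdk_rekognition::Client;

async fn index_faces_with(image: Image) -> Result<(), Box<dyn std::error::Error>> {
    // Build a Rekognition client from the environment's default configuration.
    let config = aws_config::load_from_env().await;
    let client = Client::new(&config);

    // Build the input with this builder and send it using the client.
    let _output = IndexFacesInput::builder()
        .collection_id("example-collection") // placeholder collection ID
        .image(image)
        .send_with(&client)
        .await?; // on success, _output is the IndexFacesOutput

    Ok(())
}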
Trait Implementations
impl Clone for IndexFacesInputBuilder
fn clone(&self) -> IndexFacesInputBuilder
fn clone_from(&mut self, source: &Self)
Performs copy-assignment from source.
impl Debug for IndexFacesInputBuilder
impl Default for IndexFacesInputBuilder
fn default() -> IndexFacesInputBuilder
impl PartialEq for IndexFacesInputBuilder
impl StructuralPartialEq for IndexFacesInputBuilder
Auto Trait Implementations
impl Freeze for IndexFacesInputBuilder
impl RefUnwindSafe for IndexFacesInputBuilder
impl Send for IndexFacesInputBuilder
impl Sync for IndexFacesInputBuilder
impl Unpin for IndexFacesInputBuilder
impl UnwindSafe for IndexFacesInputBuilder
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
impl<T> CloneToUninit for T
where
    T: Clone,
impl<T> Instrument for T
fn instrument(self, span: Span) -> Instrumented<Self>
fn in_current_span(self) -> Instrumented<Self>
impl<T> IntoEither for T
fn into_either(self, into_left: bool) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left is true. Converts self into a Right variant of Either<Self, Self> otherwise.
fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left(&self) returns true. Converts self into a Right variant of Either<Self, Self> otherwise.
impl<T> Paint for T
where
    T: ?Sized,
fn fg(&self, value: Color) -> Painted<&T>
Returns a styled value derived from self with the foreground set to value.
This method should be used rarely. Instead, prefer to use color-specific builder methods like red() and green(), which have the same functionality but are pithier.
§Example
Set foreground color to white using fg():
use yansi::{Paint, Color};
painted.fg(Color::White);
Set foreground color to white using white():
use yansi::Paint;
painted.white();
fn bright_black(&self) -> Painted<&T>
fn bright_red(&self) -> Painted<&T>
fn bright_green(&self) -> Painted<&T>
fn bright_yellow(&self) -> Painted<&T>
fn bright_blue(&self) -> Painted<&T>
fn bright_magenta(&self) -> Painted<&T>
fn bright_cyan(&self) -> Painted<&T>
fn bright_white(&self) -> Painted<&T>
fn bg(&self, value: Color) -> Painted<&T>
Returns a styled value derived from self with the background set to value.
This method should be used rarely. Instead, prefer to use color-specific builder methods like on_red() and on_green(), which have the same functionality but are pithier.
§Example
Set background color to red using bg():
use yansi::{Paint, Color};
painted.bg(Color::Red);
Set background color to red using on_red():
use yansi::Paint;
painted.on_red();
fn on_primary(&self) -> Painted<&T>
fn on_magenta(&self) -> Painted<&T>
fn on_bright_black(&self) -> Painted<&T>
fn on_bright_red(&self) -> Painted<&T>
fn on_bright_green(&self) -> Painted<&T>
fn on_bright_yellow(&self) -> Painted<&T>
fn on_bright_blue(&self) -> Painted<&T>
fn on_bright_magenta(&self) -> Painted<&T>
fn on_bright_cyan(&self) -> Painted<&T>
fn on_bright_white(&self) -> Painted<&T>
fn attr(&self, value: Attribute) -> Painted<&T>
Enables the styling Attribute value.
This method should be used rarely. Instead, prefer to use attribute-specific builder methods like bold() and underline(), which have the same functionality but are pithier.
§Example
Make text bold using attr():
use yansi::{Paint, Attribute};
painted.attr(Attribute::Bold);
Make text bold using bold():
use yansi::Paint;
painted.bold();
fn rapid_blink(&self) -> Painted<&T>
fn quirk(&self, value: Quirk) -> Painted<&T>
Enables the yansi Quirk value.
This method should be used rarely. Instead, prefer to use quirk-specific builder methods like mask() and wrap(), which have the same functionality but are pithier.
§Example
Enable wrapping using .quirk():
use yansi::{Paint, Quirk};
painted.quirk(Quirk::Wrap);
Enable wrapping using wrap():
use yansi::Paint;
painted.wrap();
fn clear(&self) -> Painted<&T>
👎 Deprecated since 1.0.1: renamed to resetting() due to conflicts with Vec::clear(). The clear() method will be removed in a future release.
fn whenever(&self, value: Condition) -> Painted<&T>
Conditionally enable styling based on whether the Condition value applies. Replaces any previous condition.
See the crate level docs for more details.
§Example
Enable styling painted only when both stdout and stderr are TTYs:
use yansi::{Paint, Condition};
painted.red().on_yellow().whenever(Condition::STDOUTERR_ARE_TTY);