pub struct Client { /* private fields */ }
Client for AWS IoT Analytics
Client for invoking operations on AWS IoT Analytics. Each operation on AWS IoT Analytics is a method on this struct. .send() MUST be invoked on the generated operations to dispatch the request to the service.
§Constructing a Client
A Config is required to construct a client. For most use cases, the aws-config crate should be used to automatically resolve this config using aws_config::load_from_env(), since this will resolve an SdkConfig which can be shared across multiple different AWS SDK clients. This config resolution process can be customized by calling aws_config::from_env() instead, which returns a ConfigLoader that uses the builder pattern to customize the default config.
In the simplest case, creating a client looks as follows:
let config = aws_config::load_from_env().await;
let client = aws_sdk_iotanalytics::Client::new(&config);
Occasionally, SDKs may have additional service-specific values that can be set on the Config that are absent from SdkConfig, or slightly different settings for a specific client may be desired. The Builder struct implements From<&SdkConfig>, so setting these specific settings can be done as follows:
let sdk_config = ::aws_config::load_from_env().await;
let config = aws_sdk_iotanalytics::config::Builder::from(&sdk_config)
.some_service_specific_setting("value")
.build();
See the aws-config docs and Config for more information on customizing configuration.
Note: Client construction is expensive due to connection thread pool initialization, and should be done once at application start-up.
§Using the Client
A client has a function for every operation that can be performed by the service.
For example, the BatchPutMessage operation has a Client::batch_put_message function, which returns a builder for that operation. The fluent builder ultimately has a send() function that returns an async future that returns a result, as illustrated below:
let result = client.batch_put_message()
.channel_name("example")
.send()
.await;
The underlying HTTP requests that get made by this can be modified with the customize_operation function on the fluent builder. See the customize module for more information.
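As a slightly fuller sketch of the pattern above, the following assembles a message and inspects both the success and failure paths. This is a hedged sketch, not a definitive implementation: it assumes the aws-config and aws-sdk-iotanalytics crates, valid credentials in the environment, and a hypothetical channel named "example"; the Message builder shape and the Blob re-export under primitives are assumptions about the generated API and may differ between SDK versions.

```rust
// Hedged sketch: builder details (e.g. a fallible Message::build) and
// module paths may vary by aws-sdk-iotanalytics version.
use aws_sdk_iotanalytics::{primitives::Blob, types::Message, Client};

#[tokio::main]
async fn main() {
    let config = aws_config::load_from_env().await;
    let client = Client::new(&config);

    // "example" is a hypothetical channel; the payload field name follows
    // the rules documented for BatchPutMessage.
    let message = Message::builder()
        .message_id("m-0001")
        .payload(Blob::new(r#"{"temp_01": 29}"#))
        .build()
        .expect("messageId and payload are both set");

    match client
        .batch_put_message()
        .channel_name("example")
        .messages(message)
        .send()
        .await
    {
        // Even a successful call can report per-message errors.
        Ok(output) => println!(
            "per-message errors: {}",
            output.batch_put_message_error_entries().len()
        ),
        // SdkError covers service errors as well as build/dispatch failures.
        Err(err) => eprintln!("request failed: {err}"),
    }
}
```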
§Implementations
impl Client
pub fn batch_put_message(&self) -> BatchPutMessageFluentBuilder
Constructs a fluent builder for the BatchPutMessage operation.
- The fluent builder is configurable:
  - channel_name(impl Into<String>) / set_channel_name(Option<String>) (required: true): The name of the channel where the messages are sent.
  - messages(Message) / set_messages(Option<Vec::<Message>>) (required: true): The list of messages to be sent. Each message has the format: { “messageId”: “string”, “payload”: “string”}.
    The field names of message payloads (data) that you send to IoT Analytics:
    - Must contain only alphanumeric characters and underscores (_). No other special characters are allowed.
    - Must begin with an alphabetic character or single underscore (_).
    - Cannot contain hyphens (-).
    - In regular expression terms: “^[A-Za-z_]([A-Za-z0-9]*|[A-Za-z0-9][A-Za-z0-9_]*)$”.
    - Cannot be more than 255 characters.
    - Are case insensitive. (Fields named foo and FOO in the same payload are considered duplicates.)
    For example, {“temp_01”: 29} or {“_temp_01”: 29} are valid, but {“temp-01”: 29}, {“01_temp”: 29} or {“__temp_01”: 29} are invalid in message payloads.
- On success, responds with BatchPutMessageOutput with field(s):
  - batch_put_message_error_entries(Option<Vec::<BatchPutMessageErrorEntry>>): A list of any errors encountered when sending the messages to the channel.
- On failure, responds with SdkError<BatchPutMessageError>
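As a local sanity check, the field-name rules above can be expressed as a small standalone helper. This is an illustrative sketch, not part of the SDK: it mirrors the documented regular expression ^[A-Za-z_]([A-Za-z0-9]*|[A-Za-z0-9][A-Za-z0-9_]*)$ and the 255-character limit (the case-insensitive duplicate rule applies across a whole payload, so it is out of scope for a per-name check).

```rust
/// Illustrative local check of the documented IoT Analytics payload
/// field-name rules; this helper is NOT part of the SDK.
fn is_valid_field_name(name: &str) -> bool {
    let bytes = name.as_bytes();
    // Cannot be empty or more than 255 characters.
    if bytes.is_empty() || bytes.len() > 255 {
        return false;
    }
    // ^[A-Za-z_] : must begin with a letter or a single underscore.
    let (&first, rest) = bytes.split_first().expect("non-empty");
    if !(first.is_ascii_alphabetic() || first == b'_') {
        return false;
    }
    // ([A-Za-z0-9]*|[A-Za-z0-9][A-Za-z0-9_]*)$ : if anything follows the
    // first character, the second character must be alphanumeric, and the
    // remainder may mix alphanumerics and underscores (hyphens rejected).
    match rest.split_first() {
        None => true,
        Some((&second, tail)) => {
            second.is_ascii_alphanumeric()
                && tail
                    .iter()
                    .all(|&c| c.is_ascii_alphanumeric() || c == b'_')
        }
    }
}
```

This accepts the documented valid examples (temp_01, _temp_01) and rejects the invalid ones (temp-01, 01_temp, __temp_01).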
impl Client
pub fn cancel_pipeline_reprocessing(&self) -> CancelPipelineReprocessingFluentBuilder
Constructs a fluent builder for the CancelPipelineReprocessing operation.
- The fluent builder is configurable:
  - pipeline_name(impl Into<String>) / set_pipeline_name(Option<String>) (required: true): The name of the pipeline for which data reprocessing is canceled.
  - reprocessing_id(impl Into<String>) / set_reprocessing_id(Option<String>) (required: true): The ID of the reprocessing task (returned by StartPipelineReprocessing).
- On success, responds with CancelPipelineReprocessingOutput
- On failure, responds with SdkError<CancelPipelineReprocessingError>
impl Client
pub fn create_channel(&self) -> CreateChannelFluentBuilder
Constructs a fluent builder for the CreateChannel operation.
- The fluent builder is configurable:
  - channel_name(impl Into<String>) / set_channel_name(Option<String>) (required: true): The name of the channel.
  - channel_storage(ChannelStorage) / set_channel_storage(Option<ChannelStorage>) (required: false): Where channel data is stored. You can choose one of serviceManagedS3 or customerManagedS3 storage. If not specified, the default is serviceManagedS3. You can’t change this storage option after the channel is created.
  - retention_period(RetentionPeriod) / set_retention_period(Option<RetentionPeriod>) (required: false): How long, in days, message data is kept for the channel. When customerManagedS3 storage is selected, this parameter is ignored.
  - tags(Tag) / set_tags(Option<Vec::<Tag>>) (required: false): Metadata which can be used to manage the channel.
- On success, responds with CreateChannelOutput with field(s):
  - channel_name(Option<String>): The name of the channel.
  - channel_arn(Option<String>): The ARN of the channel.
  - retention_period(Option<RetentionPeriod>): How long, in days, message data is kept for the channel.
- On failure, responds with SdkError<CreateChannelError>
impl Client
pub fn create_dataset(&self) -> CreateDatasetFluentBuilder
Constructs a fluent builder for the CreateDataset operation.
- The fluent builder is configurable:
  - dataset_name(impl Into<String>) / set_dataset_name(Option<String>) (required: true): The name of the dataset.
  - actions(DatasetAction) / set_actions(Option<Vec::<DatasetAction>>) (required: true): A list of actions that create the dataset contents.
  - triggers(DatasetTrigger) / set_triggers(Option<Vec::<DatasetTrigger>>) (required: false): A list of triggers. A trigger causes dataset contents to be populated at a specified time interval or when another dataset’s contents are created. The list of triggers can be empty or contain up to five DataSetTrigger objects.
  - content_delivery_rules(DatasetContentDeliveryRule) / set_content_delivery_rules(Option<Vec::<DatasetContentDeliveryRule>>) (required: false): When dataset contents are created, they are delivered to destinations specified here.
  - retention_period(RetentionPeriod) / set_retention_period(Option<RetentionPeriod>) (required: false): Optional. How long, in days, versions of dataset contents are kept for the dataset. If not specified or set to null, versions of dataset contents are retained for at most 90 days. The number of versions of dataset contents retained is determined by the versioningConfiguration parameter. For more information, see Keeping Multiple Versions of IoT Analytics datasets in the IoT Analytics User Guide.
  - versioning_configuration(VersioningConfiguration) / set_versioning_configuration(Option<VersioningConfiguration>) (required: false): Optional. How many versions of dataset contents are kept. If not specified or set to null, only the latest version plus the latest succeeded version (if they are different) are kept for the time period specified by the retentionPeriod parameter. For more information, see Keeping Multiple Versions of IoT Analytics datasets in the IoT Analytics User Guide.
  - tags(Tag) / set_tags(Option<Vec::<Tag>>) (required: false): Metadata which can be used to manage the dataset.
  - late_data_rules(LateDataRule) / set_late_data_rules(Option<Vec::<LateDataRule>>) (required: false): A list of data rules that send notifications to CloudWatch, when data arrives late. To specify lateDataRules, the dataset must use a DeltaTime filter.
- On success, responds with CreateDatasetOutput with field(s):
  - dataset_name(Option<String>): The name of the dataset.
  - dataset_arn(Option<String>): The ARN of the dataset.
  - retention_period(Option<RetentionPeriod>): How long, in days, dataset contents are kept for the dataset.
- On failure, responds with SdkError<CreateDatasetError>
impl Client
pub fn create_dataset_content(&self) -> CreateDatasetContentFluentBuilder
Constructs a fluent builder for the CreateDatasetContent operation.
- The fluent builder is configurable:
  - dataset_name(impl Into<String>) / set_dataset_name(Option<String>) (required: true): The name of the dataset.
  - version_id(impl Into<String>) / set_version_id(Option<String>) (required: false): The version ID of the dataset content. To specify versionId for a dataset content, the dataset must use a DeltaTime filter.
- On success, responds with CreateDatasetContentOutput with field(s):
  - version_id(Option<String>): The version ID of the dataset contents that are being created.
- On failure, responds with SdkError<CreateDatasetContentError>
impl Client
pub fn create_datastore(&self) -> CreateDatastoreFluentBuilder
Constructs a fluent builder for the CreateDatastore operation.
- The fluent builder is configurable:
  - datastore_name(impl Into<String>) / set_datastore_name(Option<String>) (required: true): The name of the data store.
  - datastore_storage(DatastoreStorage) / set_datastore_storage(Option<DatastoreStorage>) (required: false): Where data in a data store is stored. You can choose serviceManagedS3 storage, customerManagedS3 storage, or iotSiteWiseMultiLayerStorage storage. The default is serviceManagedS3. You can’t change the choice of Amazon S3 storage after your data store is created.
  - retention_period(RetentionPeriod) / set_retention_period(Option<RetentionPeriod>) (required: false): How long, in days, message data is kept for the data store. When customerManagedS3 storage is selected, this parameter is ignored.
  - tags(Tag) / set_tags(Option<Vec::<Tag>>) (required: false): Metadata which can be used to manage the data store.
  - file_format_configuration(FileFormatConfiguration) / set_file_format_configuration(Option<FileFormatConfiguration>) (required: false): Contains the configuration information of file formats. IoT Analytics data stores support JSON and Parquet. The default file format is JSON. You can specify only one format. You can’t change the file format after you create the data store.
  - datastore_partitions(DatastorePartitions) / set_datastore_partitions(Option<DatastorePartitions>) (required: false): Contains information about the partition dimensions in a data store.
- On success, responds with CreateDatastoreOutput with field(s):
  - datastore_name(Option<String>): The name of the data store.
  - datastore_arn(Option<String>): The ARN of the data store.
  - retention_period(Option<RetentionPeriod>): How long, in days, message data is kept for the data store.
- On failure, responds with SdkError<CreateDatastoreError>
impl Client
pub fn create_pipeline(&self) -> CreatePipelineFluentBuilder
Constructs a fluent builder for the CreatePipeline operation.
- The fluent builder is configurable:
  - pipeline_name(impl Into<String>) / set_pipeline_name(Option<String>) (required: true): The name of the pipeline.
  - pipeline_activities(PipelineActivity) / set_pipeline_activities(Option<Vec::<PipelineActivity>>) (required: true): A list of PipelineActivity objects. Activities perform transformations on your messages, such as removing, renaming or adding message attributes; filtering messages based on attribute values; invoking your Lambda functions on messages for advanced processing; or performing mathematical transformations to normalize device data. The list can be 2-25 PipelineActivity objects and must contain both a channel and a datastore activity. Each entry in the list must contain only one activity. For example: pipelineActivities = [ { “channel”: { … } }, { “lambda”: { … } }, … ]
  - tags(Tag) / set_tags(Option<Vec::<Tag>>) (required: false): Metadata which can be used to manage the pipeline.
- On success, responds with CreatePipelineOutput with field(s):
  - pipeline_name(Option<String>): The name of the pipeline.
  - pipeline_arn(Option<String>): The ARN of the pipeline.
- On failure, responds with SdkError<CreatePipelineError>
impl Client
pub fn delete_channel(&self) -> DeleteChannelFluentBuilder
Constructs a fluent builder for the DeleteChannel operation.
- The fluent builder is configurable:
  - channel_name(impl Into<String>) / set_channel_name(Option<String>) (required: true): The name of the channel to delete.
- On success, responds with DeleteChannelOutput
- On failure, responds with SdkError<DeleteChannelError>
impl Client
pub fn delete_dataset(&self) -> DeleteDatasetFluentBuilder
Constructs a fluent builder for the DeleteDataset operation.
- The fluent builder is configurable:
  - dataset_name(impl Into<String>) / set_dataset_name(Option<String>) (required: true): The name of the dataset to delete.
- On success, responds with DeleteDatasetOutput
- On failure, responds with SdkError<DeleteDatasetError>
impl Client
pub fn delete_dataset_content(&self) -> DeleteDatasetContentFluentBuilder
Constructs a fluent builder for the DeleteDatasetContent operation.
- The fluent builder is configurable:
  - dataset_name(impl Into<String>) / set_dataset_name(Option<String>) (required: true): The name of the dataset whose content is deleted.
  - version_id(impl Into<String>) / set_version_id(Option<String>) (required: false): The version of the dataset whose content is deleted. You can also use the strings “$LATEST” or “$LATEST_SUCCEEDED” to delete the latest or latest successfully completed data set. If not specified, “$LATEST_SUCCEEDED” is the default.
- On success, responds with DeleteDatasetContentOutput
- On failure, responds with SdkError<DeleteDatasetContentError>
impl Client
pub fn delete_datastore(&self) -> DeleteDatastoreFluentBuilder
Constructs a fluent builder for the DeleteDatastore operation.
- The fluent builder is configurable:
  - datastore_name(impl Into<String>) / set_datastore_name(Option<String>) (required: true): The name of the data store to delete.
- On success, responds with DeleteDatastoreOutput
- On failure, responds with SdkError<DeleteDatastoreError>
impl Client
pub fn delete_pipeline(&self) -> DeletePipelineFluentBuilder
Constructs a fluent builder for the DeletePipeline operation.
- The fluent builder is configurable:
  - pipeline_name(impl Into<String>) / set_pipeline_name(Option<String>) (required: true): The name of the pipeline to delete.
- On success, responds with DeletePipelineOutput
- On failure, responds with SdkError<DeletePipelineError>
impl Client
pub fn describe_channel(&self) -> DescribeChannelFluentBuilder
Constructs a fluent builder for the DescribeChannel operation.
- The fluent builder is configurable:
  - channel_name(impl Into<String>) / set_channel_name(Option<String>) (required: true): The name of the channel whose information is retrieved.
  - include_statistics(bool) / set_include_statistics(Option<bool>) (required: false): If true, additional statistical information about the channel is included in the response. This feature can’t be used with a channel whose S3 storage is customer-managed.
- On success, responds with DescribeChannelOutput with field(s):
  - channel(Option<Channel>): An object that contains information about the channel.
  - statistics(Option<ChannelStatistics>): Statistics about the channel. Included if the includeStatistics parameter is set to true in the request.
- On failure, responds with SdkError<DescribeChannelError>
impl Client
pub fn describe_dataset(&self) -> DescribeDatasetFluentBuilder
Constructs a fluent builder for the DescribeDataset operation.
- The fluent builder is configurable:
  - dataset_name(impl Into<String>) / set_dataset_name(Option<String>) (required: true): The name of the dataset whose information is retrieved.
- On success, responds with DescribeDatasetOutput with field(s):
  - dataset(Option<Dataset>): An object that contains information about the dataset.
- On failure, responds with SdkError<DescribeDatasetError>
impl Client
pub fn describe_datastore(&self) -> DescribeDatastoreFluentBuilder
Constructs a fluent builder for the DescribeDatastore operation.
- The fluent builder is configurable:
  - datastore_name(impl Into<String>) / set_datastore_name(Option<String>) (required: true): The name of the data store.
  - include_statistics(bool) / set_include_statistics(Option<bool>) (required: false): If true, additional statistical information about the data store is included in the response. This feature can’t be used with a data store whose S3 storage is customer-managed.
- On success, responds with DescribeDatastoreOutput with field(s):
  - datastore(Option<Datastore>): Information about the data store.
  - statistics(Option<DatastoreStatistics>): Additional statistical information about the data store. Included if the includeStatistics parameter is set to true in the request.
- On failure, responds with SdkError<DescribeDatastoreError>
impl Client
pub fn describe_logging_options(&self) -> DescribeLoggingOptionsFluentBuilder
Constructs a fluent builder for the DescribeLoggingOptions operation.
- The fluent builder takes no input; just send it.
- On success, responds with DescribeLoggingOptionsOutput with field(s):
  - logging_options(Option<LoggingOptions>): The current settings of the IoT Analytics logging options.
- On failure, responds with SdkError<DescribeLoggingOptionsError>
impl Client
pub fn describe_pipeline(&self) -> DescribePipelineFluentBuilder
Constructs a fluent builder for the DescribePipeline operation.
- The fluent builder is configurable:
  - pipeline_name(impl Into<String>) / set_pipeline_name(Option<String>) (required: true): The name of the pipeline whose information is retrieved.
- On success, responds with DescribePipelineOutput with field(s):
  - pipeline(Option<Pipeline>): A Pipeline object that contains information about the pipeline.
- On failure, responds with SdkError<DescribePipelineError>
impl Client
pub fn get_dataset_content(&self) -> GetDatasetContentFluentBuilder
Constructs a fluent builder for the GetDatasetContent operation.
- The fluent builder is configurable:
  - dataset_name(impl Into<String>) / set_dataset_name(Option<String>) (required: true): The name of the dataset whose contents are retrieved.
  - version_id(impl Into<String>) / set_version_id(Option<String>) (required: false): The version of the dataset whose contents are retrieved. You can also use the strings “$LATEST” or “$LATEST_SUCCEEDED” to retrieve the contents of the latest or latest successfully completed dataset. If not specified, “$LATEST_SUCCEEDED” is the default.
- On success, responds with GetDatasetContentOutput with field(s):
  - entries(Option<Vec::<DatasetEntry>>): A list of DatasetEntry objects.
  - timestamp(Option<DateTime>): The time when the request was made.
  - status(Option<DatasetContentStatus>): The status of the dataset content.
- On failure, responds with SdkError<GetDatasetContentError>
impl Client
pub fn list_channels(&self) -> ListChannelsFluentBuilder
Constructs a fluent builder for the ListChannels operation. This operation supports pagination; see into_paginator().
- The fluent builder is configurable:
  - next_token(impl Into<String>) / set_next_token(Option<String>) (required: false): The token for the next set of results.
  - max_results(i32) / set_max_results(Option<i32>) (required: false): The maximum number of results to return in this request. The default value is 100.
- On success, responds with ListChannelsOutput with field(s):
  - channel_summaries(Option<Vec::<ChannelSummary>>): A list of ChannelSummary objects.
  - next_token(Option<String>): The token to retrieve the next set of results, or null if there are no more results.
- On failure, responds with SdkError<ListChannelsError>
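Rather than threading next_token by hand, the generated paginator can drive the page loop. The following is a hedged sketch assuming an already-constructed client and the paginator surface of recent aws-sdk-rust versions (into_paginator(), items(), and the PaginationStream's inherent next()); exact names may differ between SDK versions.

```rust
// Hedged sketch: collects every channel name across all pages.
async fn all_channel_names(
    client: &aws_sdk_iotanalytics::Client,
) -> Result<Vec<String>, aws_sdk_iotanalytics::Error> {
    let mut names = Vec::new();
    // items() yields individual ChannelSummary values; send() starts the
    // lazy page stream (pages are fetched as the stream is polled).
    let mut stream = client.list_channels().into_paginator().items().send();
    while let Some(summary) = stream.next().await {
        // Each page fetch can fail independently, hence the per-item Result.
        if let Some(name) = summary?.channel_name {
            names.push(name);
        }
    }
    Ok(names)
}
```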
impl Client
pub fn list_dataset_contents(&self) -> ListDatasetContentsFluentBuilder
Constructs a fluent builder for the ListDatasetContents operation. This operation supports pagination; see into_paginator().
- The fluent builder is configurable:
  - dataset_name(impl Into<String>) / set_dataset_name(Option<String>) (required: true): The name of the dataset whose contents information you want to list.
  - next_token(impl Into<String>) / set_next_token(Option<String>) (required: false): The token for the next set of results.
  - max_results(i32) / set_max_results(Option<i32>) (required: false): The maximum number of results to return in this request.
  - scheduled_on_or_after(DateTime) / set_scheduled_on_or_after(Option<DateTime>) (required: false): A filter to limit results to those dataset contents whose creation is scheduled on or after the given time. See the field triggers.schedule in the CreateDataset request. (timestamp)
  - scheduled_before(DateTime) / set_scheduled_before(Option<DateTime>) (required: false): A filter to limit results to those dataset contents whose creation is scheduled before the given time. See the field triggers.schedule in the CreateDataset request. (timestamp)
- On success, responds with ListDatasetContentsOutput with field(s):
  - dataset_content_summaries(Option<Vec::<DatasetContentSummary>>): Summary information about dataset contents that have been created.
  - next_token(Option<String>): The token to retrieve the next set of results, or null if there are no more results.
- On failure, responds with SdkError<ListDatasetContentsError>
impl Client
pub fn list_datasets(&self) -> ListDatasetsFluentBuilder
Constructs a fluent builder for the ListDatasets operation. This operation supports pagination; see into_paginator().
- The fluent builder is configurable:
  - next_token(impl Into<String>) / set_next_token(Option<String>) (required: false): The token for the next set of results.
  - max_results(i32) / set_max_results(Option<i32>) (required: false): The maximum number of results to return in this request. The default value is 100.
- On success, responds with ListDatasetsOutput with field(s):
  - dataset_summaries(Option<Vec::<DatasetSummary>>): A list of DatasetSummary objects.
  - next_token(Option<String>): The token to retrieve the next set of results, or null if there are no more results.
- On failure, responds with SdkError<ListDatasetsError>
impl Client
pub fn list_datastores(&self) -> ListDatastoresFluentBuilder
Constructs a fluent builder for the ListDatastores operation. This operation supports pagination; see into_paginator().
- The fluent builder is configurable:
  - next_token(impl Into<String>) / set_next_token(Option<String>) (required: false): The token for the next set of results.
  - max_results(i32) / set_max_results(Option<i32>) (required: false): The maximum number of results to return in this request. The default value is 100.
- On success, responds with ListDatastoresOutput with field(s):
  - datastore_summaries(Option<Vec::<DatastoreSummary>>): A list of DatastoreSummary objects.
  - next_token(Option<String>): The token to retrieve the next set of results, or null if there are no more results.
- On failure, responds with SdkError<ListDatastoresError>
impl Client
pub fn list_pipelines(&self) -> ListPipelinesFluentBuilder
Constructs a fluent builder for the ListPipelines operation. This operation supports pagination; see into_paginator().
- The fluent builder is configurable:
  - next_token(impl Into<String>) / set_next_token(Option<String>) (required: false): The token for the next set of results.
  - max_results(i32) / set_max_results(Option<i32>) (required: false): The maximum number of results to return in this request. The default value is 100.
- On success, responds with ListPipelinesOutput with field(s):
  - pipeline_summaries(Option<Vec::<PipelineSummary>>): A list of PipelineSummary objects.
  - next_token(Option<String>): The token to retrieve the next set of results, or null if there are no more results.
- On failure, responds with SdkError<ListPipelinesError>
impl Client
pub fn list_tags_for_resource(&self) -> ListTagsForResourceFluentBuilder
Constructs a fluent builder for the ListTagsForResource operation.
- The fluent builder is configurable:
  - resource_arn(impl Into<String>) / set_resource_arn(Option<String>) (required: true): The ARN of the resource whose tags you want to list.
- On success, responds with ListTagsForResourceOutput with field(s):
  - tags(Option<Vec::<Tag>>): The tags (metadata) that you have assigned to the resource.
- On failure, responds with SdkError<ListTagsForResourceError>
impl Client
pub fn put_logging_options(&self) -> PutLoggingOptionsFluentBuilder
Constructs a fluent builder for the PutLoggingOptions operation.
- The fluent builder is configurable:
  - logging_options(LoggingOptions) / set_logging_options(Option<LoggingOptions>) (required: true): The new values of the IoT Analytics logging options.
- On success, responds with PutLoggingOptionsOutput
- On failure, responds with SdkError<PutLoggingOptionsError>
impl Client
pub fn run_pipeline_activity(&self) -> RunPipelineActivityFluentBuilder
Constructs a fluent builder for the RunPipelineActivity operation.
- The fluent builder is configurable:
  - pipeline_activity(PipelineActivity) / set_pipeline_activity(Option<PipelineActivity>) (required: true): The pipeline activity that is run. This must not be a channel activity or a data store activity because these activities are used in a pipeline only to load the original message and to store the (possibly) transformed message. If a Lambda activity is specified, only short-running Lambda functions (those with a timeout of 30 seconds or less) can be used.
  - payloads(Blob) / set_payloads(Option<Vec::<Blob>>) (required: true): The sample message payloads on which the pipeline activity is run.
- On success, responds with RunPipelineActivityOutput with field(s):
  - payloads(Option<Vec::<Blob>>): The enriched or transformed sample message payloads as base64-encoded strings. (The results of running the pipeline activity on each input sample message payload, encoded in base64.)
  - log_result(Option<String>): In case the pipeline activity fails, the log message that is generated.
- On failure, responds with SdkError<RunPipelineActivityError>
impl Client
pub fn sample_channel_data(&self) -> SampleChannelDataFluentBuilder
Constructs a fluent builder for the SampleChannelData operation.
- The fluent builder is configurable:
  - channel_name(impl Into<String>) / set_channel_name(Option<String>) (required: true): The name of the channel whose message samples are retrieved.
  - max_messages(i32) / set_max_messages(Option<i32>) (required: false): The number of sample messages to be retrieved. The limit is 10. The default is also 10.
  - start_time(DateTime) / set_start_time(Option<DateTime>) (required: false): The start of the time window from which sample messages are retrieved.
  - end_time(DateTime) / set_end_time(Option<DateTime>) (required: false): The end of the time window from which sample messages are retrieved.
- On success, responds with SampleChannelDataOutput with field(s):
  - payloads(Option<Vec::<Blob>>): The list of message samples. Each sample message is returned as a base64-encoded string.
- On failure, responds with SdkError<SampleChannelDataError>
impl Client
pub fn start_pipeline_reprocessing(&self) -> StartPipelineReprocessingFluentBuilder
Constructs a fluent builder for the StartPipelineReprocessing operation.
- The fluent builder is configurable:
  - pipeline_name(impl Into<String>) / set_pipeline_name(Option<String>) (required: true): The name of the pipeline on which to start reprocessing.
  - start_time(DateTime) / set_start_time(Option<DateTime>) (required: false): The start time (inclusive) of raw message data that is reprocessed. If you specify a value for the startTime parameter, you must not use the channelMessages object.
  - end_time(DateTime) / set_end_time(Option<DateTime>) (required: false): The end time (exclusive) of raw message data that is reprocessed. If you specify a value for the endTime parameter, you must not use the channelMessages object.
  - channel_messages(ChannelMessages) / set_channel_messages(Option<ChannelMessages>) (required: false): Specifies one or more sets of channel messages that you want to reprocess. If you use the channelMessages object, you must not specify a value for startTime and endTime.
- On success, responds with StartPipelineReprocessingOutput with field(s):
  - reprocessing_id(Option<String>): The ID of the pipeline reprocessing activity that was started.
- On failure, responds with SdkError<StartPipelineReprocessingError>
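The startTime/endTime constraints above can be illustrated with a short sketch. This is a hedged example, not a definitive implementation: it assumes an already-constructed client, a hypothetical pipeline named "example_pipeline", arbitrary illustrative epoch timestamps, and the DateTime re-export under primitives found in recent SDK versions.

```rust
// Hedged sketch: reprocess one day of raw messages by time window.
use aws_sdk_iotanalytics::primitives::DateTime;

async fn reprocess_one_day(
    client: &aws_sdk_iotanalytics::Client,
) -> Result<Option<String>, aws_sdk_iotanalytics::Error> {
    let out = client
        .start_pipeline_reprocessing()
        .pipeline_name("example_pipeline")
        // startTime is inclusive and endTime exclusive; both are mutually
        // exclusive with the channelMessages object, so it is not set here.
        .start_time(DateTime::from_secs(1_700_000_000))
        .end_time(DateTime::from_secs(1_700_086_400)) // +86400 s = one day
        .send()
        .await?;
    // The returned ID can later be passed to cancel_pipeline_reprocessing.
    Ok(out.reprocessing_id)
}
```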
impl Client
pub fn tag_resource(&self) -> TagResourceFluentBuilder
Constructs a fluent builder for the TagResource operation.
- The fluent builder is configurable:
  - resource_arn(impl Into<String>) / set_resource_arn(Option<String>) (required: true): The ARN of the resource whose tags you want to modify.
  - tags(Tag) / set_tags(Option<Vec::<Tag>>) (required: true): The new or modified tags for the resource.
- On success, responds with TagResourceOutput
- On failure, responds with SdkError<TagResourceError>
impl Client
pub fn untag_resource(&self) -> UntagResourceFluentBuilder
Constructs a fluent builder for the UntagResource operation.
- The fluent builder is configurable:
  - resource_arn(impl Into<String>) / set_resource_arn(Option<String>) (required: true): The ARN of the resource whose tags you want to remove.
  - tag_keys(impl Into<String>) / set_tag_keys(Option<Vec::<String>>) (required: true): The keys of those tags which you want to remove.
- On success, responds with UntagResourceOutput
- On failure, responds with SdkError<UntagResourceError>
impl Client

pub fn update_channel(&self) -> UpdateChannelFluentBuilder

Constructs a fluent builder for the UpdateChannel operation.
- The fluent builder is configurable:
channel_name(impl Into<String>) / set_channel_name(Option<String>): required: true. The name of the channel to be updated.
channel_storage(ChannelStorage) / set_channel_storage(Option<ChannelStorage>): required: false. Where channel data is stored. You can choose one of serviceManagedS3 or customerManagedS3 storage. If not specified, the default is serviceManagedS3. You can't change this storage option after the channel is created.
retention_period(RetentionPeriod) / set_retention_period(Option<RetentionPeriod>): required: false. How long, in days, message data is kept for the channel. The retention period can't be updated if the channel's Amazon S3 storage is customer-managed.
- On success, responds with UpdateChannelOutput
- On failure, responds with SdkError<UpdateChannelError>
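For instance, extending a service-managed channel's retention period might look like this sketch (the channel name is a placeholder; `RetentionPeriod` has no required fields, so its `build()` is infallible):

```rust
use aws_sdk_iotanalytics::types::RetentionPeriod;
use aws_sdk_iotanalytics::Client;

async fn extend_retention(client: &Client) -> Result<(), aws_sdk_iotanalytics::Error> {
    // Keep channel messages for 30 days.
    let retention = RetentionPeriod::builder().number_of_days(30).build();
    client
        .update_channel()
        .channel_name("my_channel") // placeholder name
        .retention_period(retention)
        .send()
        .await?;
    Ok(())
}
```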
impl Client

pub fn update_dataset(&self) -> UpdateDatasetFluentBuilder

Constructs a fluent builder for the UpdateDataset operation.
- The fluent builder is configurable:
dataset_name(impl Into<String>) / set_dataset_name(Option<String>): required: true. The name of the dataset to update.
actions(DatasetAction) / set_actions(Option<Vec<DatasetAction>>): required: true. A list of DatasetAction objects.
triggers(DatasetTrigger) / set_triggers(Option<Vec<DatasetTrigger>>): required: false. A list of DatasetTrigger objects. The list can be empty or can contain up to five DatasetTrigger objects.
content_delivery_rules(DatasetContentDeliveryRule) / set_content_delivery_rules(Option<Vec<DatasetContentDeliveryRule>>): required: false. When dataset contents are created, they are delivered to destinations specified here.
retention_period(RetentionPeriod) / set_retention_period(Option<RetentionPeriod>): required: false. How long, in days, dataset contents are kept for the dataset.
versioning_configuration(VersioningConfiguration) / set_versioning_configuration(Option<VersioningConfiguration>): required: false. Optional. How many versions of dataset contents are kept. If not specified or set to null, only the latest version plus the latest succeeded version (if they are different) are kept for the time period specified by the retentionPeriod parameter. For more information, see Keeping Multiple Versions of IoT Analytics datasets in the IoT Analytics User Guide.
late_data_rules(LateDataRule) / set_late_data_rules(Option<Vec<LateDataRule>>): required: false. A list of data rules that send notifications to CloudWatch when data arrives late. To specify lateDataRules, the dataset must use a DeltaTime filter.
- On success, responds with UpdateDatasetOutput
- On failure, responds with SdkError<UpdateDatasetError>
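A sketch of replacing a dataset's SQL action and daily schedule (the dataset, action, and datastore names are placeholders; `SqlQueryDatasetAction::build()` is fallible because `sql_query` is required):

```rust
use aws_sdk_iotanalytics::types::{DatasetAction, DatasetTrigger, Schedule, SqlQueryDatasetAction};
use aws_sdk_iotanalytics::Client;

async fn update(client: &Client) -> Result<(), aws_sdk_iotanalytics::Error> {
    let action = DatasetAction::builder()
        .action_name("daily_query") // placeholder name
        .query_action(
            SqlQueryDatasetAction::builder()
                .sql_query("SELECT * FROM my_datastore") // placeholder datastore
                .build()
                .expect("sql_query is set"),
        )
        .build();
    // Trigger the action once a day at noon UTC (at most five triggers allowed).
    let trigger = DatasetTrigger::builder()
        .schedule(Schedule::builder().expression("cron(0 12 * * ? *)").build())
        .build();
    client
        .update_dataset()
        .dataset_name("my_dataset") // placeholder name
        .actions(action)
        .triggers(trigger)
        .send()
        .await?;
    Ok(())
}
```

Because `UpdateDataset` replaces the dataset definition, the `actions` list must always be supplied even when only the triggers change.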
impl Client

pub fn update_datastore(&self) -> UpdateDatastoreFluentBuilder

Constructs a fluent builder for the UpdateDatastore operation.
- The fluent builder is configurable:
datastore_name(impl Into<String>) / set_datastore_name(Option<String>): required: true. The name of the data store to be updated.
retention_period(RetentionPeriod) / set_retention_period(Option<RetentionPeriod>): required: false. How long, in days, message data is kept for the data store. The retention period can't be updated if the data store's Amazon S3 storage is customer-managed.
datastore_storage(DatastoreStorage) / set_datastore_storage(Option<DatastoreStorage>): required: false. Where data in a data store is stored. You can choose serviceManagedS3 storage, customerManagedS3 storage, or iotSiteWiseMultiLayerStorage storage. The default is serviceManagedS3. You can't change the choice of Amazon S3 storage after your data store is created.
file_format_configuration(FileFormatConfiguration) / set_file_format_configuration(Option<FileFormatConfiguration>): required: false. Contains the configuration information of file formats. IoT Analytics data stores support JSON and Parquet. The default file format is JSON. You can specify only one format. You can't change the file format after you create the data store.
- On success, responds with UpdateDatastoreOutput
- On failure, responds with SdkError<UpdateDatastoreError>
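Since storage choice and file format are immutable after creation, retention is the typical field to update. A sketch with a placeholder data store name:

```rust
use aws_sdk_iotanalytics::types::RetentionPeriod;
use aws_sdk_iotanalytics::Client;

async fn keep_forever(client: &Client) -> Result<(), aws_sdk_iotanalytics::Error> {
    // Switch the data store to unlimited retention
    // (only valid for service-managed S3 storage).
    client
        .update_datastore()
        .datastore_name("my_datastore") // placeholder name
        .retention_period(RetentionPeriod::builder().unlimited(true).build())
        .send()
        .await?;
    Ok(())
}
```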
impl Client

pub fn update_pipeline(&self) -> UpdatePipelineFluentBuilder

Constructs a fluent builder for the UpdatePipeline operation.
- The fluent builder is configurable:
pipeline_name(impl Into<String>) / set_pipeline_name(Option<String>): required: true. The name of the pipeline to update.
pipeline_activities(PipelineActivity) / set_pipeline_activities(Option<Vec<PipelineActivity>>): required: true. A list of PipelineActivity objects. Activities perform transformations on your messages, such as removing, renaming or adding message attributes; filtering messages based on attribute values; invoking your Lambda functions on messages for advanced processing; or performing mathematical transformations to normalize device data. The list can be 2-25 PipelineActivity objects and must contain both a channel and a datastore activity. Each entry in the list must contain only one activity. For example: pipelineActivities = [ { "channel": { ... } }, { "lambda": { ... } }, ... ]
- On success, responds with UpdatePipelineOutput
- On failure, responds with SdkError<UpdatePipelineError>
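The activity-list rules above (2-25 entries, each holding exactly one activity, with both a channel and a datastore activity present) can be checked client-side before calling the API. A minimal sketch of such a check, using a hypothetical helper where `kinds` stands in for the variant set on each `PipelineActivity`:

```rust
/// Hypothetical pre-flight check of UpdatePipeline's activity-list rules:
/// 2-25 entries, including at least one "channel" and one "datastore" activity.
fn validate_activity_kinds(kinds: &[&str]) -> Result<(), String> {
    if kinds.len() < 2 || kinds.len() > 25 {
        return Err(format!("expected 2-25 activities, got {}", kinds.len()));
    }
    if !kinds.contains(&"channel") {
        return Err("missing required channel activity".to_string());
    }
    if !kinds.contains(&"datastore") {
        return Err("missing required datastore activity".to_string());
    }
    Ok(())
}
```

For example, `["channel", "lambda", "datastore"]` passes, while `["channel", "lambda"]` fails because no datastore activity terminates the pipeline.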
impl Client

pub fn from_conf(conf: Config) -> Self

Creates a new client from the service Config.
§Panics
This method will panic in the following cases:
- Retries or timeouts are enabled without a sleep_impl configured.
- Identity caching is enabled without a sleep_impl and time_source configured.
- No behavior_version is provided.
The panic message for each of these will have instructions on how to resolve them.
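A sketch of building a service-specific Config directly and constructing the client from it (the region is a placeholder; `behavior_version_latest()` is set explicitly to avoid the missing-behavior_version panic described above):

```rust
use aws_sdk_iotanalytics::config::Region;
use aws_sdk_iotanalytics::{Client, Config};

fn build_client() -> Client {
    let config = Config::builder()
        .behavior_version_latest() // avoids the missing behavior_version panic
        .region(Region::new("us-east-1")) // placeholder region
        .build();
    Client::from_conf(config)
}
```

Credentials and an HTTP client still resolve from the Config's other settings; for most applications, deriving the Config from a shared SdkConfig (as shown at the top of this page) is simpler.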
impl Client

pub fn new(sdk_config: &SdkConfig) -> Self

Creates a new client from an SdkConfig.
§Panics
- This method will panic if the sdk_config is missing an async sleep implementation. If you experience this panic, set the sleep_impl on the Config passed into this function to fix it.
- This method will panic if the sdk_config is missing an HTTP connector. If you experience this panic, set the http_connector on the Config passed into this function to fix it.
- This method will panic if no BehaviorVersion is provided. If you experience this panic, set behavior_version on the Config or enable the behavior-version-latest Cargo feature.
Trait Implementations
Auto Trait Implementations
impl Freeze for Client
impl !RefUnwindSafe for Client
impl Send for Client
impl Sync for Client
impl Unpin for Client
impl !UnwindSafe for Client
Blanket Implementations

impl<T> BorrowMut<T> for T where T: ?Sized

fn borrow_mut(&mut self) -> &mut T
impl<T> CloneToUninit for T where T: Clone
impl<T> Instrument for T

fn instrument(self, span: Span) -> Instrumented<Self>
fn in_current_span(self) -> Instrumented<Self>
impl<T> IntoEither for T

fn into_either(self, into_left: bool) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left is true. Converts self into a Right variant of Either<Self, Self> otherwise. Read more

fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left(&self) returns true. Converts self into a Right variant of Either<Self, Self> otherwise. Read more

impl<T> Paint for T where T: ?Sized
fn fg(&self, value: Color) -> Painted<&T>

Returns a styled value derived from self with the foreground set to value.
This method should be used rarely. Instead, prefer to use color-specific builder methods like red() and green(), which have the same functionality but are pithier.
§Example
Set foreground color to white using fg():
use yansi::{Paint, Color};
painted.fg(Color::White);
Set foreground color to white using white():
use yansi::Paint;
painted.white();
fn bright_black(&self) -> Painted<&T>
fn bright_red(&self) -> Painted<&T>
fn bright_green(&self) -> Painted<&T>
fn bright_yellow(&self) -> Painted<&T>
fn bright_blue(&self) -> Painted<&T>
fn bright_magenta(&self) -> Painted<&T>
fn bright_cyan(&self) -> Painted<&T>
fn bright_white(&self) -> Painted<&T>
fn bg(&self, value: Color) -> Painted<&T>

Returns a styled value derived from self with the background set to value.
This method should be used rarely. Instead, prefer to use color-specific builder methods like on_red() and on_green(), which have the same functionality but are pithier.
§Example
Set background color to red using bg():
use yansi::{Paint, Color};
painted.bg(Color::Red);
Set background color to red using on_red():
use yansi::Paint;
painted.on_red();
fn on_primary(&self) -> Painted<&T>
fn on_magenta(&self) -> Painted<&T>
fn on_bright_black(&self) -> Painted<&T>
fn on_bright_red(&self) -> Painted<&T>
fn on_bright_green(&self) -> Painted<&T>
fn on_bright_yellow(&self) -> Painted<&T>
fn on_bright_blue(&self) -> Painted<&T>
fn on_bright_magenta(&self) -> Painted<&T>
fn on_bright_cyan(&self) -> Painted<&T>
fn on_bright_white(&self) -> Painted<&T>
fn attr(&self, value: Attribute) -> Painted<&T>

Enables the styling Attribute value.
This method should be used rarely. Instead, prefer to use attribute-specific builder methods like bold() and underline(), which have the same functionality but are pithier.
§Example
Make text bold using attr():
use yansi::{Paint, Attribute};
painted.attr(Attribute::Bold);
Make text bold using bold():
use yansi::Paint;
painted.bold();
fn rapid_blink(&self) -> Painted<&T>
fn quirk(&self, value: Quirk) -> Painted<&T>

Enables the yansi Quirk value.
This method should be used rarely. Instead, prefer to use quirk-specific builder methods like mask() and wrap(), which have the same functionality but are pithier.
§Example
Enable wrapping using quirk():
use yansi::{Paint, Quirk};
painted.quirk(Quirk::Wrap);
Enable wrapping using wrap():
use yansi::Paint;
painted.wrap();
fn clear(&self) -> Painted<&T>

👎 Deprecated since 1.0.1: renamed to resetting() due to conflicts with Vec::clear(). The clear() method will be removed in a future release.
fn whenever(&self, value: Condition) -> Painted<&T>
Conditionally enable styling based on whether the Condition
value
applies. Replaces any previous condition.
See the crate level docs for more details.
§Example
Enable styling painted
only when both stdout
and stderr
are TTYs:
use yansi::{Paint, Condition};
painted.red().on_yellow().whenever(Condition::STDOUTERR_ARE_TTY);