Struct aws_sdk_iotanalytics::Client
pub struct Client { /* private fields */ }
Client for AWS IoT Analytics
Client for invoking operations on AWS IoT Analytics. Each operation on AWS IoT Analytics is a method on
this struct. .send() MUST be invoked on the generated operations to dispatch the request to the service.
Examples
Constructing a client and invoking an operation
// create a shared configuration. This can be used & shared between multiple service clients.
let shared_config = aws_config::load_from_env().await;
let client = aws_sdk_iotanalytics::Client::new(&shared_config);
// invoke an operation
/* let rsp = client
    .<operation_name>()
    .<param>("some value")
    .send().await; */
Constructing a client with custom configuration
use aws_config::retry::RetryConfig;
let shared_config = aws_config::load_from_env().await;
let config = aws_sdk_iotanalytics::config::Builder::from(&shared_config)
.retry_config(RetryConfig::disabled())
.build();
let client = aws_sdk_iotanalytics::Client::from_conf(config);
Implementations
impl Client
pub fn with_config(
    client: Client<DynConnector, DynMiddleware<DynConnector>>,
    conf: Config
) -> Self
Creates a client with the given service configuration.
impl Client
pub fn batch_put_message(&self) -> BatchPutMessage
Constructs a fluent builder for the BatchPutMessage operation.
- The fluent builder is configurable:
channel_name(impl Into<String>)/set_channel_name(Option<String>):The name of the channel where the messages are sent.
messages(Vec<Message>)/set_messages(Option<Vec<Message>>):The list of messages to be sent. Each message has the format: { “messageId”: “string”, “payload”: “string”}.
The field names of message payloads (data) that you send to IoT Analytics:
- Must contain only alphanumeric characters and underscores (_). No other special characters are allowed.
- Must begin with an alphabetic character or single underscore (_).
- Cannot contain hyphens (-).
- In regular expression terms: "^[A-Za-z_]([A-Za-z0-9]*|[A-Za-z0-9][A-Za-z0-9_]*)$".
- Cannot be more than 255 characters.
- Are case insensitive. (Fields named foo and FOO in the same payload are considered duplicates.)
For example, {"temp_01": 29} or {"_temp_01": 29} are valid, but {"temp-01": 29}, {"01_temp": 29} or {"__temp_01": 29} are invalid in message payloads.
- On success, responds with BatchPutMessageOutput with field(s):
batch_put_message_error_entries(Option<Vec<BatchPutMessageErrorEntry>>): A list of any errors encountered when sending the messages to the channel.
- On failure, responds with
SdkError<BatchPutMessageError>
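The field-name rules above can be sketched as a small standalone validator. This is a hypothetical helper, not part of the SDK; it approximates the listed rules (length, first character, allowed characters, no double leading underscore) and accepts exactly the documented examples:

```rust
// Hypothetical helper (not part of the SDK): checks a payload field name
// against the documented IoT Analytics rules before calling batch_put_message.
fn is_valid_field_name(name: &str) -> bool {
    let bytes = name.as_bytes();
    // Cannot be empty or longer than 255 characters.
    if bytes.is_empty() || bytes.len() > 255 {
        return false;
    }
    // Must begin with an alphabetic character or a single underscore.
    match bytes[0] {
        b'A'..=b'Z' | b'a'..=b'z' => {}
        b'_' => {
            // A leading underscore must be single, i.e. not followed by another underscore.
            if bytes.len() > 1 && bytes[1] == b'_' {
                return false;
            }
        }
        _ => return false,
    }
    // Remaining characters: alphanumerics and underscores only (so hyphens are rejected).
    bytes[1..].iter().all(|&b| b.is_ascii_alphanumeric() || b == b'_')
}

fn main() {
    assert!(is_valid_field_name("temp_01"));
    assert!(is_valid_field_name("_temp_01"));
    assert!(!is_valid_field_name("temp-01")); // contains a hyphen
    assert!(!is_valid_field_name("01_temp")); // starts with a digit
    assert!(!is_valid_field_name("__temp_01")); // double leading underscore
}
```

Validating names client-side like this avoids a round trip that would only come back as a BatchPutMessageErrorEntry.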
pub fn cancel_pipeline_reprocessing(&self) -> CancelPipelineReprocessing
Constructs a fluent builder for the CancelPipelineReprocessing operation.
- The fluent builder is configurable:
pipeline_name(impl Into<String>)/set_pipeline_name(Option<String>): The name of the pipeline for which data reprocessing is canceled.
reprocessing_id(impl Into<String>)/set_reprocessing_id(Option<String>): The ID of the reprocessing task (returned by StartPipelineReprocessing).
- On success, responds with CancelPipelineReprocessingOutput
- On failure, responds with SdkError<CancelPipelineReprocessingError>
pub fn create_channel(&self) -> CreateChannel
Constructs a fluent builder for the CreateChannel operation.
- The fluent builder is configurable:
channel_name(impl Into<String>)/set_channel_name(Option<String>):The name of the channel.
channel_storage(ChannelStorage)/set_channel_storage(Option<ChannelStorage>): Where channel data is stored. You can choose one of serviceManagedS3 or customerManagedS3 storage. If not specified, the default is serviceManagedS3. You can't change this storage option after the channel is created.
retention_period(RetentionPeriod)/set_retention_period(Option<RetentionPeriod>): How long, in days, message data is kept for the channel. When customerManagedS3 storage is selected, this parameter is ignored.
tags(Vec<Tag>)/set_tags(Option<Vec<Tag>>): Metadata which can be used to manage the channel.
- On success, responds with CreateChannelOutput with field(s):
channel_name(Option<String>): The name of the channel.
channel_arn(Option<String>):The ARN of the channel.
retention_period(Option<RetentionPeriod>):How long, in days, message data is kept for the channel.
- On failure, responds with
SdkError<CreateChannelError>
pub fn create_dataset(&self) -> CreateDataset
Constructs a fluent builder for the CreateDataset operation.
- The fluent builder is configurable:
dataset_name(impl Into<String>)/set_dataset_name(Option<String>):The name of the dataset.
actions(Vec<DatasetAction>)/set_actions(Option<Vec<DatasetAction>>):A list of actions that create the dataset contents.
triggers(Vec<DatasetTrigger>)/set_triggers(Option<Vec<DatasetTrigger>>): A list of triggers. A trigger causes dataset contents to be populated at a specified time interval or when another dataset's contents are created. The list of triggers can be empty or contain up to five DatasetTrigger objects.
content_delivery_rules(Vec<DatasetContentDeliveryRule>)/set_content_delivery_rules(Option<Vec<DatasetContentDeliveryRule>>): When dataset contents are created, they are delivered to destinations specified here.
retention_period(RetentionPeriod)/set_retention_period(Option<RetentionPeriod>): Optional. How long, in days, versions of dataset contents are kept for the dataset. If not specified or set to null, versions of dataset contents are retained for at most 90 days. The number of versions of dataset contents retained is determined by the versioningConfiguration parameter. For more information, see Keeping Multiple Versions of IoT Analytics datasets in the IoT Analytics User Guide.
versioning_configuration(VersioningConfiguration)/set_versioning_configuration(Option<VersioningConfiguration>): Optional. How many versions of dataset contents are kept. If not specified or set to null, only the latest version plus the latest succeeded version (if they are different) are kept for the time period specified by the retentionPeriod parameter. For more information, see Keeping Multiple Versions of IoT Analytics datasets in the IoT Analytics User Guide.
tags(Vec<Tag>)/set_tags(Option<Vec<Tag>>): Metadata which can be used to manage the dataset.
late_data_rules(Vec<LateDataRule>)/set_late_data_rules(Option<Vec<LateDataRule>>): A list of data rules that send notifications to CloudWatch, when data arrives late. To specify lateDataRules, the dataset must use a DeltaTime filter.
- On success, responds with CreateDatasetOutput with field(s):
dataset_name(Option<String>): The name of the dataset.
dataset_arn(Option<String>):The ARN of the dataset.
retention_period(Option<RetentionPeriod>):How long, in days, dataset contents are kept for the dataset.
- On failure, responds with
SdkError<CreateDatasetError>
pub fn create_dataset_content(&self) -> CreateDatasetContent
Constructs a fluent builder for the CreateDatasetContent operation.
- The fluent builder is configurable:
dataset_name(impl Into<String>)/set_dataset_name(Option<String>):The name of the dataset.
version_id(impl Into<String>)/set_version_id(Option<String>): The version ID of the dataset content. To specify versionId for a dataset content, the dataset must use a DeltaTime filter.
- On success, responds with CreateDatasetContentOutput with field(s):
version_id(Option<String>): The version ID of the dataset contents that are being created.
- On failure, responds with
SdkError<CreateDatasetContentError>
pub fn create_datastore(&self) -> CreateDatastore
Constructs a fluent builder for the CreateDatastore operation.
- The fluent builder is configurable:
datastore_name(impl Into<String>)/set_datastore_name(Option<String>):The name of the data store.
datastore_storage(DatastoreStorage)/set_datastore_storage(Option<DatastoreStorage>): Where data in a data store is stored. You can choose serviceManagedS3 storage, customerManagedS3 storage, or iotSiteWiseMultiLayerStorage storage. The default is serviceManagedS3. You can't change the choice of Amazon S3 storage after your data store is created.
retention_period(RetentionPeriod)/set_retention_period(Option<RetentionPeriod>): How long, in days, message data is kept for the data store. When customerManagedS3 storage is selected, this parameter is ignored.
tags(Vec<Tag>)/set_tags(Option<Vec<Tag>>): Metadata which can be used to manage the data store.
file_format_configuration(FileFormatConfiguration)/set_file_format_configuration(Option<FileFormatConfiguration>):Contains the configuration information of file formats. IoT Analytics data stores support JSON and Parquet.
The default file format is JSON. You can specify only one format.
You can’t change the file format after you create the data store.
datastore_partitions(DatastorePartitions)/set_datastore_partitions(Option<DatastorePartitions>):Contains information about the partition dimensions in a data store.
- On success, responds with CreateDatastoreOutput with field(s):
datastore_name(Option<String>): The name of the data store.
datastore_arn(Option<String>):The ARN of the data store.
retention_period(Option<RetentionPeriod>):How long, in days, message data is kept for the data store.
- On failure, responds with
SdkError<CreateDatastoreError>
pub fn create_pipeline(&self) -> CreatePipeline
Constructs a fluent builder for the CreatePipeline operation.
- The fluent builder is configurable:
pipeline_name(impl Into<String>)/set_pipeline_name(Option<String>):The name of the pipeline.
pipeline_activities(Vec<PipelineActivity>)/set_pipeline_activities(Option<Vec<PipelineActivity>>): A list of PipelineActivity objects. Activities perform transformations on your messages, such as removing, renaming or adding message attributes; filtering messages based on attribute values; invoking your Lambda functions on messages for advanced processing; or performing mathematical transformations to normalize device data. The list can be 2-25 PipelineActivity objects and must contain both a channel and a datastore activity. Each entry in the list must contain only one activity. For example: pipelineActivities = [ { "channel": { ... } }, { "lambda": { ... } }, ... ]
tags(Vec<Tag>)/set_tags(Option<Vec<Tag>>): Metadata which can be used to manage the pipeline.
- On success, responds with CreatePipelineOutput with field(s):
pipeline_name(Option<String>): The name of the pipeline.
pipeline_arn(Option<String>):The ARN of the pipeline.
- On failure, responds with
SdkError<CreatePipelineError>
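The constraints on pipelineActivities (2-25 entries, containing both a channel and a datastore activity) can be sketched as a standalone check. ActivityKind here is a hypothetical stand-in for the SDK's PipelineActivity, used only to illustrate the documented rule:

```rust
// Hypothetical stand-in for the SDK's PipelineActivity variants.
#[derive(PartialEq)]
enum ActivityKind {
    Channel,
    Datastore,
    Lambda,
}

// Mirrors the documented CreatePipeline constraint: the list must have
// 2-25 entries and contain both a channel and a datastore activity.
fn is_valid_activity_list(activities: &[ActivityKind]) -> bool {
    (2..=25).contains(&activities.len())
        && activities.contains(&ActivityKind::Channel)
        && activities.contains(&ActivityKind::Datastore)
}

fn main() {
    use ActivityKind::*;
    assert!(is_valid_activity_list(&[Channel, Lambda, Datastore]));
    assert!(!is_valid_activity_list(&[Channel])); // too short, missing datastore
    assert!(!is_valid_activity_list(&[Lambda, Lambda])); // missing channel and datastore
}
```

Checking the shape of the list before calling .send() turns a service-side validation error into an immediate local one.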
pub fn delete_channel(&self) -> DeleteChannel
Constructs a fluent builder for the DeleteChannel operation.
- The fluent builder is configurable:
channel_name(impl Into<String>)/set_channel_name(Option<String>):The name of the channel to delete.
- On success, responds with DeleteChannelOutput
- On failure, responds with SdkError<DeleteChannelError>
pub fn delete_dataset(&self) -> DeleteDataset
Constructs a fluent builder for the DeleteDataset operation.
- The fluent builder is configurable:
dataset_name(impl Into<String>)/set_dataset_name(Option<String>):The name of the dataset to delete.
- On success, responds with DeleteDatasetOutput
- On failure, responds with SdkError<DeleteDatasetError>
pub fn delete_dataset_content(&self) -> DeleteDatasetContent
Constructs a fluent builder for the DeleteDatasetContent operation.
- The fluent builder is configurable:
dataset_name(impl Into<String>)/set_dataset_name(Option<String>):The name of the dataset whose content is deleted.
version_id(impl Into<String>)/set_version_id(Option<String>): The version of the dataset whose content is deleted. You can also use the strings "$LATEST" or "$LATEST_SUCCEEDED" to delete the latest or latest successfully completed dataset. If not specified, "$LATEST_SUCCEEDED" is the default.
- On success, responds with DeleteDatasetContentOutput
- On failure, responds with SdkError<DeleteDatasetContentError>
pub fn delete_datastore(&self) -> DeleteDatastore
Constructs a fluent builder for the DeleteDatastore operation.
- The fluent builder is configurable:
datastore_name(impl Into<String>)/set_datastore_name(Option<String>):The name of the data store to delete.
- On success, responds with DeleteDatastoreOutput
- On failure, responds with SdkError<DeleteDatastoreError>
pub fn delete_pipeline(&self) -> DeletePipeline
Constructs a fluent builder for the DeletePipeline operation.
- The fluent builder is configurable:
pipeline_name(impl Into<String>)/set_pipeline_name(Option<String>):The name of the pipeline to delete.
- On success, responds with DeletePipelineOutput
- On failure, responds with SdkError<DeletePipelineError>
pub fn describe_channel(&self) -> DescribeChannel
Constructs a fluent builder for the DescribeChannel operation.
- The fluent builder is configurable:
channel_name(impl Into<String>)/set_channel_name(Option<String>):The name of the channel whose information is retrieved.
include_statistics(bool)/set_include_statistics(bool):If true, additional statistical information about the channel is included in the response. This feature can’t be used with a channel whose S3 storage is customer-managed.
- On success, responds with DescribeChannelOutput with field(s):
channel(Option<Channel>): An object that contains information about the channel.
statistics(Option<ChannelStatistics>): Statistics about the channel. Included if the includeStatistics parameter is set to true in the request.
- On failure, responds with
SdkError<DescribeChannelError>
pub fn describe_dataset(&self) -> DescribeDataset
Constructs a fluent builder for the DescribeDataset operation.
- The fluent builder is configurable:
dataset_name(impl Into<String>)/set_dataset_name(Option<String>):The name of the dataset whose information is retrieved.
- On success, responds with DescribeDatasetOutput with field(s):
dataset(Option<Dataset>): An object that contains information about the dataset.
- On failure, responds with
SdkError<DescribeDatasetError>
pub fn describe_datastore(&self) -> DescribeDatastore
Constructs a fluent builder for the DescribeDatastore operation.
- The fluent builder is configurable:
datastore_name(impl Into<String>)/set_datastore_name(Option<String>): The name of the data store.
include_statistics(bool)/set_include_statistics(bool):If true, additional statistical information about the data store is included in the response. This feature can’t be used with a data store whose S3 storage is customer-managed.
- On success, responds with DescribeDatastoreOutput with field(s):
datastore(Option<Datastore>): Information about the data store.
statistics(Option<DatastoreStatistics>): Additional statistical information about the data store. Included if the includeStatistics parameter is set to true in the request.
- On failure, responds with
SdkError<DescribeDatastoreError>
pub fn describe_logging_options(&self) -> DescribeLoggingOptions
Constructs a fluent builder for the DescribeLoggingOptions operation.
- The fluent builder takes no input, just send it.
- On success, responds with DescribeLoggingOptionsOutput with field(s):
logging_options(Option<LoggingOptions>): The current settings of the IoT Analytics logging options.
- On failure, responds with
SdkError<DescribeLoggingOptionsError>
pub fn describe_pipeline(&self) -> DescribePipeline
Constructs a fluent builder for the DescribePipeline operation.
- The fluent builder is configurable:
pipeline_name(impl Into<String>)/set_pipeline_name(Option<String>):The name of the pipeline whose information is retrieved.
- On success, responds with DescribePipelineOutput with field(s):
pipeline(Option<Pipeline>): A Pipeline object that contains information about the pipeline.
- On failure, responds with
SdkError<DescribePipelineError>
pub fn get_dataset_content(&self) -> GetDatasetContent
Constructs a fluent builder for the GetDatasetContent operation.
- The fluent builder is configurable:
dataset_name(impl Into<String>)/set_dataset_name(Option<String>):The name of the dataset whose contents are retrieved.
version_id(impl Into<String>)/set_version_id(Option<String>):The version of the dataset whose contents are retrieved. You can also use the strings “$LATEST” or “$LATEST_SUCCEEDED” to retrieve the contents of the latest or latest successfully completed dataset. If not specified, “$LATEST_SUCCEEDED” is the default.
- On success, responds with GetDatasetContentOutput with field(s):
entries(Option<Vec<DatasetEntry>>): A list of DatasetEntry objects.
timestamp(Option<DateTime>): The time when the request was made.
status(Option<DatasetContentStatus>):The status of the dataset content.
- On failure, responds with
SdkError<GetDatasetContentError>
pub fn list_channels(&self) -> ListChannels
Constructs a fluent builder for the ListChannels operation.
This operation supports pagination; see into_paginator().
- The fluent builder is configurable:
next_token(impl Into<String>)/set_next_token(Option<String>):The token for the next set of results.
max_results(i32)/set_max_results(Option<i32>):The maximum number of results to return in this request.
The default value is 100.
- On success, responds with ListChannelsOutput with field(s):
channel_summaries(Option<Vec<ChannelSummary>>): A list of ChannelSummary objects.
next_token(Option<String>): The token to retrieve the next set of results, or null if there are no more results.
- On failure, responds with
SdkError<ListChannelsError>
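The next_token protocol that into_paginator() automates can be illustrated with a self-contained mock. fetch_page below is a stand-in for a call like client.list_channels() (not SDK code): you pass the token from the previous response back in until the service returns no token:

```rust
// Mock of a paginated List* call: returns one page of items plus an
// optional next token, the same shape as ListChannelsOutput.
fn fetch_page(all: &[&str], token: Option<usize>, page_size: usize) -> (Vec<String>, Option<usize>) {
    let start = token.unwrap_or(0);
    let end = (start + page_size).min(all.len());
    let items = all[start..end].iter().map(|s| s.to_string()).collect();
    // A null/None next token signals there are no more results.
    let next = if end < all.len() { Some(end) } else { None };
    (items, next)
}

// Drive the loop the way hand-rolled pagination over next_token works.
fn collect_all(all: &[&str], page_size: usize) -> Vec<String> {
    let mut out = Vec::new();
    let mut token = None;
    loop {
        let (items, next) = fetch_page(all, token, page_size);
        out.extend(items);
        match next {
            Some(t) => token = Some(t), // feed the token into the next request
            None => break,              // no more results
        }
    }
    out
}

fn main() {
    let channels = ["ch_a", "ch_b", "ch_c", "ch_d", "ch_e"];
    assert_eq!(collect_all(&channels, 2).len(), 5);
}
```

into_paginator() hides exactly this loop behind a stream, which is why the fluent builder also exposes next_token and max_results for callers who drive it manually.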
pub fn list_dataset_contents(&self) -> ListDatasetContents
Constructs a fluent builder for the ListDatasetContents operation.
This operation supports pagination; see into_paginator().
- The fluent builder is configurable:
dataset_name(impl Into<String>)/set_dataset_name(Option<String>):The name of the dataset whose contents information you want to list.
next_token(impl Into<String>)/set_next_token(Option<String>):The token for the next set of results.
max_results(i32)/set_max_results(Option<i32>):The maximum number of results to return in this request.
scheduled_on_or_after(DateTime)/set_scheduled_on_or_after(Option<DateTime>): A filter to limit results to those dataset contents whose creation is scheduled on or after the given time. See the field triggers.schedule in the CreateDataset request. (timestamp)
scheduled_before(DateTime)/set_scheduled_before(Option<DateTime>): A filter to limit results to those dataset contents whose creation is scheduled before the given time. See the field triggers.schedule in the CreateDataset request. (timestamp)
- On success, responds with ListDatasetContentsOutput with field(s):
dataset_content_summaries(Option<Vec<DatasetContentSummary>>): Summary information about dataset contents that have been created.
next_token(Option<String>): The token to retrieve the next set of results, or null if there are no more results.
- On failure, responds with
SdkError<ListDatasetContentsError>
pub fn list_datasets(&self) -> ListDatasets
Constructs a fluent builder for the ListDatasets operation.
This operation supports pagination; see into_paginator().
- The fluent builder is configurable:
next_token(impl Into<String>)/set_next_token(Option<String>):The token for the next set of results.
max_results(i32)/set_max_results(Option<i32>):The maximum number of results to return in this request.
The default value is 100.
- On success, responds with ListDatasetsOutput with field(s):
dataset_summaries(Option<Vec<DatasetSummary>>): A list of DatasetSummary objects.
next_token(Option<String>): The token to retrieve the next set of results, or null if there are no more results.
- On failure, responds with
SdkError<ListDatasetsError>
pub fn list_datastores(&self) -> ListDatastores
Constructs a fluent builder for the ListDatastores operation.
This operation supports pagination; see into_paginator().
- The fluent builder is configurable:
next_token(impl Into<String>)/set_next_token(Option<String>):The token for the next set of results.
max_results(i32)/set_max_results(Option<i32>):The maximum number of results to return in this request.
The default value is 100.
- On success, responds with ListDatastoresOutput with field(s):
datastore_summaries(Option<Vec<DatastoreSummary>>): A list of DatastoreSummary objects.
next_token(Option<String>): The token to retrieve the next set of results, or null if there are no more results.
- On failure, responds with
SdkError<ListDatastoresError>
pub fn list_pipelines(&self) -> ListPipelines
Constructs a fluent builder for the ListPipelines operation.
This operation supports pagination; see into_paginator().
- The fluent builder is configurable:
next_token(impl Into<String>)/set_next_token(Option<String>):The token for the next set of results.
max_results(i32)/set_max_results(Option<i32>):The maximum number of results to return in this request.
The default value is 100.
- On success, responds with ListPipelinesOutput with field(s):
pipeline_summaries(Option<Vec<PipelineSummary>>): A list of PipelineSummary objects.
next_token(Option<String>): The token to retrieve the next set of results, or null if there are no more results.
- On failure, responds with
SdkError<ListPipelinesError>
pub fn list_tags_for_resource(&self) -> ListTagsForResource
Constructs a fluent builder for the ListTagsForResource operation.
- The fluent builder is configurable:
resource_arn(impl Into<String>)/set_resource_arn(Option<String>):The ARN of the resource whose tags you want to list.
- On success, responds with ListTagsForResourceOutput with field(s):
tags(Option<Vec<Tag>>): The tags (metadata) that you have assigned to the resource.
- On failure, responds with
SdkError<ListTagsForResourceError>
pub fn put_logging_options(&self) -> PutLoggingOptions
Constructs a fluent builder for the PutLoggingOptions operation.
- The fluent builder is configurable:
logging_options(LoggingOptions)/set_logging_options(Option<LoggingOptions>):The new values of the IoT Analytics logging options.
- On success, responds with PutLoggingOptionsOutput
- On failure, responds with SdkError<PutLoggingOptionsError>
pub fn run_pipeline_activity(&self) -> RunPipelineActivity
Constructs a fluent builder for the RunPipelineActivity operation.
- The fluent builder is configurable:
pipeline_activity(PipelineActivity)/set_pipeline_activity(Option<PipelineActivity>): The pipeline activity that is run. This must not be a channel activity or a data store activity because these activities are used in a pipeline only to load the original message and to store the (possibly) transformed message. If a Lambda activity is specified, only short-running Lambda functions (those with a timeout of 30 seconds or less) can be used.
payloads(Vec<Blob>)/set_payloads(Option<Vec<Blob>>):The sample message payloads on which the pipeline activity is run.
- On success, responds with RunPipelineActivityOutput with field(s):
payloads(Option<Vec<Blob>>): The enriched or transformed sample message payloads as base64-encoded strings. (The results of running the pipeline activity on each input sample message payload, encoded in base64.)
log_result(Option<String>):In case the pipeline activity fails, the log message that is generated.
- On failure, responds with
SdkError<RunPipelineActivityError>
pub fn sample_channel_data(&self) -> SampleChannelData
Constructs a fluent builder for the SampleChannelData operation.
- The fluent builder is configurable:
channel_name(impl Into<String>)/set_channel_name(Option<String>):The name of the channel whose message samples are retrieved.
max_messages(i32)/set_max_messages(Option<i32>):The number of sample messages to be retrieved. The limit is 10. The default is also 10.
start_time(DateTime)/set_start_time(Option<DateTime>):The start of the time window from which sample messages are retrieved.
end_time(DateTime)/set_end_time(Option<DateTime>):The end of the time window from which sample messages are retrieved.
- On success, responds with SampleChannelDataOutput with field(s):
payloads(Option<Vec<Blob>>): The list of message samples. Each sample message is returned as a base64-encoded string.
- On failure, responds with
SdkError<SampleChannelDataError>
pub fn start_pipeline_reprocessing(&self) -> StartPipelineReprocessing
Constructs a fluent builder for the StartPipelineReprocessing operation.
- The fluent builder is configurable:
pipeline_name(impl Into<String>)/set_pipeline_name(Option<String>):The name of the pipeline on which to start reprocessing.
start_time(DateTime)/set_start_time(Option<DateTime>): The start time (inclusive) of raw message data that is reprocessed. If you specify a value for the startTime parameter, you must not use the channelMessages object.
end_time(DateTime)/set_end_time(Option<DateTime>): The end time (exclusive) of raw message data that is reprocessed. If you specify a value for the endTime parameter, you must not use the channelMessages object.
channel_messages(ChannelMessages)/set_channel_messages(Option<ChannelMessages>): Specifies one or more sets of channel messages that you want to reprocess. If you use the channelMessages object, you must not specify a value for startTime and endTime.
- On success, responds with StartPipelineReprocessingOutput with field(s):
reprocessing_id(Option<String>): The ID of the pipeline reprocessing activity that was started.
- On failure, responds with
SdkError<StartPipelineReprocessingError>
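The mutual exclusivity above (either a startTime/endTime window or a channelMessages object, never both) is naturally modeled as a sum type, so an invalid combination cannot even be constructed. ReprocessingSelection is a hypothetical illustration of that design, not an SDK type:

```rust
// Hypothetical illustration: encoding the StartPipelineReprocessing rule
// "startTime/endTime XOR channelMessages" so invalid combinations are unrepresentable.
enum ReprocessingSelection {
    // Reprocess raw data in [start_time, end_time); epoch seconds for simplicity.
    TimeWindow { start_time: u64, end_time: u64 },
    // Reprocess specific sets of channel messages instead of a time window.
    ChannelMessages(Vec<String>),
}

fn describe(sel: &ReprocessingSelection) -> String {
    match sel {
        ReprocessingSelection::TimeWindow { start_time, end_time } => {
            format!("window {}..{}", start_time, end_time)
        }
        ReprocessingSelection::ChannelMessages(sets) => {
            format!("{} message set(s)", sets.len())
        }
    }
}

fn main() {
    let window = ReprocessingSelection::TimeWindow { start_time: 0, end_time: 3600 };
    assert_eq!(describe(&window), "window 0..3600");
    let msgs = ReprocessingSelection::ChannelMessages(vec!["set-1".into()]);
    assert_eq!(describe(&msgs), "1 message set(s)");
}
```

With the SDK's flat builder, the caller must enforce this rule manually; matching on a type like this before populating the builder is one way to keep the two modes from being mixed.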
pub fn tag_resource(&self) -> TagResource
Constructs a fluent builder for the TagResource operation.
- The fluent builder is configurable:
resource_arn(impl Into<String>)/set_resource_arn(Option<String>):The ARN of the resource whose tags you want to modify.
tags(Vec<Tag>)/set_tags(Option<Vec<Tag>>):The new or modified tags for the resource.
- On success, responds with TagResourceOutput
- On failure, responds with SdkError<TagResourceError>
pub fn untag_resource(&self) -> UntagResource
Constructs a fluent builder for the UntagResource operation.
- The fluent builder is configurable:
resource_arn(impl Into<String>)/set_resource_arn(Option<String>):The ARN of the resource whose tags you want to remove.
tag_keys(Vec<String>)/set_tag_keys(Option<Vec<String>>): The keys of the tags you want to remove.
- On success, responds with UntagResourceOutput
- On failure, responds with SdkError<UntagResourceError>
pub fn update_channel(&self) -> UpdateChannel
Constructs a fluent builder for the UpdateChannel operation.
- The fluent builder is configurable:
channel_name(impl Into<String>)/set_channel_name(Option<String>):The name of the channel to be updated.
channel_storage(ChannelStorage)/set_channel_storage(Option<ChannelStorage>): Where channel data is stored. You can choose one of serviceManagedS3 or customerManagedS3 storage. If not specified, the default is serviceManagedS3. You can't change this storage option after the channel is created.
retention_period(RetentionPeriod)/set_retention_period(Option<RetentionPeriod>): How long, in days, message data is kept for the channel. The retention period can't be updated if the channel's Amazon S3 storage is customer-managed.
- On success, responds with UpdateChannelOutput
- On failure, responds with SdkError<UpdateChannelError>
pub fn update_dataset(&self) -> UpdateDataset
Constructs a fluent builder for the UpdateDataset operation.
- The fluent builder is configurable:
dataset_name(impl Into<String>)/set_dataset_name(Option<String>):The name of the dataset to update.
actions(Vec<DatasetAction>)/set_actions(Option<Vec<DatasetAction>>): A list of DatasetAction objects.
triggers(Vec<DatasetTrigger>)/set_triggers(Option<Vec<DatasetTrigger>>): A list of DatasetTrigger objects. The list can be empty or can contain up to five DatasetTrigger objects.
content_delivery_rules(Vec<DatasetContentDeliveryRule>)/set_content_delivery_rules(Option<Vec<DatasetContentDeliveryRule>>): When dataset contents are created, they are delivered to destinations specified here.
retention_period(RetentionPeriod)/set_retention_period(Option<RetentionPeriod>): How long, in days, dataset contents are kept for the dataset.
versioning_configuration(VersioningConfiguration)/set_versioning_configuration(Option<VersioningConfiguration>): Optional. How many versions of dataset contents are kept. If not specified or set to null, only the latest version plus the latest succeeded version (if they are different) are kept for the time period specified by the retentionPeriod parameter. For more information, see Keeping Multiple Versions of IoT Analytics datasets in the IoT Analytics User Guide.
late_data_rules(Vec<LateDataRule>)/set_late_data_rules(Option<Vec<LateDataRule>>): A list of data rules that send notifications to CloudWatch, when data arrives late. To specify lateDataRules, the dataset must use a DeltaTime filter.
- On success, responds with UpdateDatasetOutput
- On failure, responds with SdkError<UpdateDatasetError>
pub fn update_datastore(&self) -> UpdateDatastore
Constructs a fluent builder for the UpdateDatastore operation.
- The fluent builder is configurable:
datastore_name(impl Into<String>)/set_datastore_name(Option<String>):The name of the data store to be updated.
retention_period(RetentionPeriod)/set_retention_period(Option<RetentionPeriod>):How long, in days, message data is kept for the data store. The retention period can’t be updated if the data store’s Amazon S3 storage is customer-managed.
datastore_storage(DatastoreStorage)/set_datastore_storage(Option<DatastoreStorage>): Where data in a data store is stored. You can choose serviceManagedS3 storage, customerManagedS3 storage, or iotSiteWiseMultiLayerStorage storage. The default is serviceManagedS3. You can't change the choice of Amazon S3 storage after your data store is created.
file_format_configuration(FileFormatConfiguration)/set_file_format_configuration(Option<FileFormatConfiguration>): Contains the configuration information of file formats. IoT Analytics data stores support JSON and Parquet.
The default file format is JSON. You can specify only one format.
You can’t change the file format after you create the data store.
- On success, responds with UpdateDatastoreOutput
- On failure, responds with SdkError<UpdateDatastoreError>
pub fn update_pipeline(&self) -> UpdatePipeline
Constructs a fluent builder for the UpdatePipeline operation.
- The fluent builder is configurable:
pipeline_name(impl Into<String>)/set_pipeline_name(Option<String>):The name of the pipeline to update.
pipeline_activities(Vec<PipelineActivity>)/set_pipeline_activities(Option<Vec<PipelineActivity>>): A list of PipelineActivity objects. Activities perform transformations on your messages, such as removing, renaming or adding message attributes; filtering messages based on attribute values; invoking your Lambda functions on messages for advanced processing; or performing mathematical transformations to normalize device data. The list can be 2-25 PipelineActivity objects and must contain both a channel and a datastore activity. Each entry in the list must contain only one activity. For example: pipelineActivities = [ { "channel": { ... } }, { "lambda": { ... } }, ... ]
- On success, responds with UpdatePipelineOutput
- On failure, responds with SdkError<UpdatePipelineError>
impl Client
pub fn new(sdk_config: &SdkConfig) -> Self
Creates a new client from an SDK Config.
Panics
- This method will panic if the sdk_config is missing an async sleep implementation. If you experience this panic, set the sleep_impl on the Config passed into this function to fix it.
- This method will panic if the sdk_config is missing an HTTP connector. If you experience this panic, set the http_connector on the Config passed into this function to fix it.
pub fn from_conf(conf: Config) -> Self
Creates a new client from the service Config.
Panics
- This method will panic if the conf is missing an async sleep implementation. If you experience this panic, set the sleep_impl on the Config passed into this function to fix it.
- This method will panic if the conf is missing an HTTP connector. If you experience this panic, set the http_connector on the Config passed into this function to fix it.