Data structures used by operation inputs/outputs.
Structs§
- AggregationConfig: The aggregation settings that you can use to customize the output format of your flow data.
- AmplitudeConnectorProfileCredentials: The connector-specific credentials required when using Amplitude.
- AmplitudeConnectorProfileProperties: The connector-specific profile properties required when using Amplitude.
- AmplitudeMetadata: The connector metadata specific to Amplitude.
- AmplitudeSourceProperties: The properties that are applied when Amplitude is being used as a source.
- ApiKeyCredentials: The API key credentials required for API key authentication.
- AuthParameter: Information about required authentication parameters.
- AuthenticationConfig: Contains information about the authentication config that the connector supports.
- BasicAuthCredentials: The basic auth credentials required for basic authentication.
- ConnectorConfiguration: The configuration settings related to a given connector.
- ConnectorDetail: Information about the registered connector.
- ConnectorEntity: The high-level entity that can be queried in Amazon AppFlow. For example, a Salesforce entity might be an Account or Opportunity, whereas a ServiceNow entity might be an Incident.
- ConnectorEntityField: Describes the data model of a connector field. For example, for an account entity, the fields would be account name, account ID, and so on.
- ConnectorMetadata: A structure to specify connector-specific metadata such as oAuthScopes, supportedRegions, privateLinkServiceUrl, and so on.
- ConnectorOAuthRequest: Used by select connectors for which the OAuth workflow is supported, such as Salesforce, Google Analytics, Marketo, Zendesk, and Slack.
- ConnectorOperator: The operation to be performed on the provided source fields.
- ConnectorProfile: Describes an instance of a connector. This includes the provided name, credentials ARN, connection mode, and so on. To keep the API intuitive and extensible, the fields that are common to all types of connector profiles are explicitly specified at the top level. The rest of the connector-specific properties are available via the connectorProfileProperties field.
- ConnectorProfileConfig: Defines the connector-specific configuration and credentials for the connector profile.
- ConnectorProfileCredentials: The connector-specific credentials required by a connector.
- ConnectorProfileProperties: The connector-specific profile properties required by each connector.
- ConnectorProvisioningConfig: Contains information about the configuration of the connector being registered.
- ConnectorRuntimeSetting: Contains information about the connector runtime settings that are required for flow execution.
- CustomAuthConfig: Configuration information required for custom authentication.
- CustomAuthCredentials: The custom credentials required for custom authentication.
- CustomConnectorDestinationProperties: The properties that are applied when the custom connector is being used as a destination.
- CustomConnectorProfileCredentials: The connector-specific profile credentials that are required when using the custom connector.
- CustomConnectorProfileProperties: The profile properties required by the custom connector.
- CustomConnectorSourceProperties: The properties that are applied when the custom connector is being used as a source.
- CustomerProfilesDestinationProperties: The properties that are applied when Amazon Connect Customer Profiles is used as a destination.
- CustomerProfilesMetadata: The connector metadata specific to Amazon Connect Customer Profiles.
- DataTransferApi: The API of the connector application that Amazon AppFlow uses to transfer your data.
- DatadogConnectorProfileCredentials: The connector-specific credentials required by Datadog.
- DatadogConnectorProfileProperties: The connector-specific profile properties required by Datadog.
- DatadogMetadata: The connector metadata specific to Datadog.
- DatadogSourceProperties: The properties that are applied when Datadog is being used as a source.
- DestinationConnectorProperties: This stores the information that is required to query a particular connector.
- DestinationFieldProperties: The properties that can be applied to a field when the connector is being used as a destination.
- DestinationFlowConfig: Contains information about the configuration of destination connectors present in the flow.
- DynatraceConnectorProfileCredentials: The connector-specific profile credentials required by Dynatrace.
- DynatraceConnectorProfileProperties: The connector-specific profile properties required by Dynatrace.
- DynatraceMetadata: The connector metadata specific to Dynatrace.
- DynatraceSourceProperties: The properties that are applied when Dynatrace is being used as a source.
- ErrorHandlingConfig: The settings that determine how Amazon AppFlow handles an error when placing data in the destination. For example, this setting would determine if the flow should fail after one insertion error, or continue and attempt to insert every record regardless of the initial failure. ErrorHandlingConfig is a part of the destination connector details. (A hedged builder sketch appears after this list.)
- ErrorInfo: Provides details in the event of a failed flow, including the failure count and the related error messages.
- EventBridgeDestinationProperties: The properties that are applied when Amazon EventBridge is being used as a destination.
- EventBridgeMetadata: The connector metadata specific to Amazon EventBridge.
- ExecutionDetails: Describes the details of the flow run, including the timestamp, status, and message.
- ExecutionRecord: Specifies information about the past flow run instances for a given flow.
- ExecutionResult: Specifies the end result of the flow run.
- FieldTypeDetails: Contains details regarding the supported field type and the operators that can be applied for filtering.
- FlowDefinition: The properties of the flow, such as its source, destination, trigger type, and so on.
- GlueDataCatalogConfig: Specifies the configuration that Amazon AppFlow uses when it catalogs your data with the Glue Data Catalog. When Amazon AppFlow catalogs your data, it stores metadata in Data Catalog tables. This metadata represents the data that's transferred by the flow that you configure with these settings. You can configure a flow with these settings only when the flow destination is Amazon S3. (A hedged builder sketch appears after this list.)
- GoogleAnalyticsConnectorProfileCredentials: The connector-specific profile credentials required by Google Analytics.
- GoogleAnalyticsConnectorProfileProperties: The connector-specific profile properties required by Google Analytics.
- GoogleAnalyticsMetadata: The connector metadata specific to Google Analytics.
- GoogleAnalyticsSourceProperties: The properties that are applied when Google Analytics is being used as a source.
- HoneycodeConnectorProfileCredentials: The connector-specific credentials required when using Amazon Honeycode.
- HoneycodeConnectorProfileProperties: The connector-specific properties required when using Amazon Honeycode.
- HoneycodeDestinationProperties: The properties that are applied when Amazon Honeycode is used as a destination.
- HoneycodeMetadata: The connector metadata specific to Amazon Honeycode.
- IncrementalPullConfig: Specifies the configuration used when importing incremental records from the source.
- InforNexusConnectorProfileCredentials: The connector-specific profile credentials required by Infor Nexus.
- InforNexusConnectorProfileProperties: The connector-specific profile properties required by Infor Nexus.
- InforNexusMetadata: The connector metadata specific to Infor Nexus.
- InforNexusSourceProperties: The properties that are applied when Infor Nexus is being used as a source.
- LambdaConnectorProvisioningConfig: Contains information about the configuration of the Lambda function that is being registered as the connector.
- LookoutMetricsDestinationProperties: The properties that are applied when Amazon Lookout for Metrics is used as a destination.
- MarketoConnectorProfileCredentials: The connector-specific profile credentials required by Marketo.
- MarketoConnectorProfileProperties: The connector-specific profile properties required when using Marketo.
- MarketoDestinationProperties: The properties that Amazon AppFlow applies when you use Marketo as a flow destination.
- MarketoMetadata: The connector metadata specific to Marketo.
- MarketoSourceProperties: The properties that are applied when Marketo is being used as a source.
- MetadataCatalogConfig: Specifies the configuration that Amazon AppFlow uses when it catalogs your data. When Amazon AppFlow catalogs your data, it stores metadata in a data catalog.
- MetadataCatalogDetail: Describes the metadata catalog, metadata table, and data partitions that Amazon AppFlow used for the associated flow run.
- OAuth2Credentials: The OAuth 2.0 credentials required for OAuth 2.0 authentication.
- OAuth2CustomParameter: A custom parameter required for OAuth 2.0 authentication.
- OAuth2Defaults: Contains the default values required for OAuth 2.0 authentication.
- OAuth2Properties: The OAuth 2.0 properties required for OAuth 2.0 authentication.
- OAuthCredentials: The OAuth credentials required for OAuth type authentication.
- OAuthProperties: The OAuth properties required for OAuth type authentication.
- PardotConnectorProfileCredentials: The connector-specific profile credentials required when using Salesforce Pardot.
- PardotConnectorProfileProperties: The connector-specific profile properties required when using Salesforce Pardot.
- PardotMetadata: The connector metadata specific to Salesforce Pardot.
- PardotSourceProperties: The properties that are applied when Salesforce Pardot is being used as a source.
- PrefixConfig: Specifies elements that Amazon AppFlow includes in the file and folder names in the flow destination.
- PrivateConnectionProvisioningState: Specifies the private connection provisioning state.
- Range: The range of values that the property supports.
- RedshiftConnectorProfileCredentials: The connector-specific profile credentials required when using Amazon Redshift.
- RedshiftConnectorProfileProperties: The connector-specific profile properties when using Amazon Redshift.
- RedshiftDestinationProperties: The properties that are applied when Amazon Redshift is being used as a destination.
- RedshiftMetadata: The connector metadata specific to Amazon Redshift.
- RegistrationOutput: Describes the status of an attempt from Amazon AppFlow to register a resource. When you run a flow that you've configured to use a metadata catalog, Amazon AppFlow registers a metadata table and data partitions with that catalog. This operation provides the status of that registration attempt. The operation also indicates how many related resources Amazon AppFlow created or updated.
- S3DestinationProperties: The properties that are applied when Amazon S3 is used as a destination.
- S3InputFormatConfig: When you use Amazon S3 as the source, the format in which you provide the flow input data.
- S3Metadata: The connector metadata specific to Amazon S3.
- S3OutputFormatConfig: The configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination.
- S3SourceProperties: The properties that are applied when Amazon S3 is being used as the flow source.
- SalesforceConnectorProfileCredentials: The connector-specific profile credentials required when using Salesforce.
- SalesforceConnectorProfileProperties: The connector-specific profile properties required when using Salesforce.
- SalesforceDestinationProperties: The properties that are applied when Salesforce is being used as a destination.
- SalesforceMetadata: The connector metadata specific to Salesforce.
- SalesforceSourceProperties: The properties that are applied when Salesforce is being used as a source.
- SapoDataConnectorProfileCredentials: The connector-specific profile credentials required when using SAPOData.
- SapoDataConnectorProfileProperties: The connector-specific profile properties required when using SAPOData.
- SapoDataDestinationProperties: The properties that are applied when using SAPOData as a flow destination.
- SapoDataMetadata: The connector metadata specific to SAPOData.
- SapoDataPaginationConfig: Sets the page size for each concurrent process that transfers OData records from your SAP instance. A concurrent process is a query that retrieves a batch of records as part of a flow run. Amazon AppFlow can run multiple concurrent processes in parallel to transfer data faster. (A hedged tuning sketch appears after this list.)
- SapoDataParallelismConfig: Sets the number of concurrent processes that transfer OData records from your SAP instance. A concurrent process is a query that retrieves a batch of records as part of a flow run. Amazon AppFlow can run multiple concurrent processes in parallel to transfer data faster.
- SapoDataSourceProperties: The properties that are applied when using SAPOData as a flow source.
- ScheduledTriggerProperties: Specifies the configuration details of a schedule-triggered flow as defined by the user. Currently, these settings only apply to the Scheduled trigger type. (A hedged trigger sketch appears after this list.)
- ServiceNowConnectorProfileCredentials: The connector-specific profile credentials required when using ServiceNow.
- ServiceNowConnectorProfileProperties: The connector-specific profile properties required when using ServiceNow.
- ServiceNowMetadata: The connector metadata specific to ServiceNow.
- ServiceNowSourceProperties: The properties that are applied when ServiceNow is being used as a source.
- SingularConnectorProfileCredentials: The connector-specific profile credentials required when using Singular.
- SingularConnectorProfileProperties: The connector-specific profile properties required when using Singular.
- SingularMetadata: The connector metadata specific to Singular.
- SingularSourceProperties: The properties that are applied when Singular is being used as a source.
- SlackConnectorProfileCredentials: The connector-specific profile credentials required when using Slack.
- SlackConnectorProfileProperties: The connector-specific profile properties required when using Slack.
- SlackMetadata: The connector metadata specific to Slack.
- SlackSourceProperties: The properties that are applied when Slack is being used as a source.
- SnowflakeConnectorProfileCredentials: The connector-specific profile credentials required when using Snowflake.
- SnowflakeConnectorProfileProperties: The connector-specific profile properties required when using Snowflake.
- SnowflakeDestinationProperties: The properties that are applied when Snowflake is being used as a destination.
- SnowflakeMetadata: The connector metadata specific to Snowflake.
- SourceConnectorProperties: Specifies the information that is required to query a particular connector.
- SourceFieldProperties: The properties that can be applied to a field when the connector is being used as a source.
- SourceFlowConfig: Contains information about the configuration of the source connector used in the flow.
- SuccessResponseHandlingConfig: Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. For example, this setting would determine where to write the response from the destination connector upon a successful insert operation.
- SupportedFieldTypeDetails: Contains details regarding all the supported FieldTypes and their corresponding filterOperators and supportedValues.
- Task: A class for modeling different types of tasks. Task implementation varies based on the TaskType.
- TrendmicroConnectorProfileCredentials: The connector-specific profile credentials required when using Trend Micro.
- TrendmicroConnectorProfileProperties: The connector-specific profile properties required when using Trend Micro.
- TrendmicroMetadata: The connector metadata specific to Trend Micro.
- TrendmicroSourceProperties: The properties that are applied when using Trend Micro as a flow source.
- TriggerConfig: The trigger settings that determine how and when Amazon AppFlow runs the specified flow.
- TriggerProperties: Specifies the configuration details that control the trigger for a flow. Currently, these settings only apply to the Scheduled trigger type.
- UpsolverDestinationProperties: The properties that are applied when Upsolver is used as a destination.
- UpsolverMetadata: The connector metadata specific to Upsolver.
- UpsolverS3OutputFormatConfig: The configuration that determines how Amazon AppFlow formats the flow output data when Upsolver is used as the destination.
- VeevaConnectorProfileCredentials: The connector-specific profile credentials required when using Veeva.
- VeevaConnectorProfileProperties: The connector-specific profile properties required when using Veeva.
- VeevaMetadata: The connector metadata specific to Veeva.
- VeevaSourceProperties: The properties that are applied when using Veeva as a flow source.
- ZendeskConnectorProfileCredentials: The connector-specific profile credentials required when using Zendesk.
- ZendeskConnectorProfileProperties: The connector-specific profile properties required when using Zendesk.
- ZendeskDestinationProperties: The properties that are applied when Zendesk is used as a destination.
- ZendeskMetadata: The connector metadata specific to Zendesk.
- ZendeskSourceProperties: The properties that are applied when using Zendesk as a flow source.
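As a rough illustration of how ErrorHandlingConfig might be assembled with this crate's fluent builders, the sketch below enables fail-fast behavior and points error records at an S3 location. The setter names mirror the documented fields (failOnFirstDestinationError, bucketName, bucketPrefix) but are assumptions about the generated builder, and the bucket name and prefix are hypothetical.

```rust
use aws_sdk_appflow::types::ErrorHandlingConfig;

// Minimal sketch, assuming build() returns the struct directly because
// none of these fields are required by the service.
fn error_handling_sketch() -> ErrorHandlingConfig {
    ErrorHandlingConfig::builder()
        .fail_on_first_destination_error(true) // stop the flow run on the first insertion error
        .bucket_name("my-error-bucket")        // hypothetical S3 bucket for error records
        .bucket_prefix("appflow/errors")       // hypothetical prefix within that bucket
        .build()
}
```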
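Along the same lines, a GlueDataCatalogConfig could be sketched as follows. The role ARN, database name, and table prefix are hypothetical values, and build() is assumed to be fallible here because the service requires all three fields.

```rust
use aws_sdk_appflow::types::GlueDataCatalogConfig;

fn glue_catalog_sketch() -> Result<GlueDataCatalogConfig, Box<dyn std::error::Error>> {
    let config = GlueDataCatalogConfig::builder()
        .role_arn("arn:aws:iam::111122223333:role/appflow-glue-access") // hypothetical IAM role AppFlow assumes
        .database_name("appflow_catalog")                               // hypothetical Glue database
        .table_prefix("sales_flow")                                     // hypothetical prefix for catalog tables
        .build()?;
    Ok(config)
}
```

As noted in the struct description above, this catalog configuration only takes effect when the flow destination is Amazon S3.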
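For ScheduledTriggerProperties and TriggerConfig, a schedule-triggered flow might be wired up roughly as below. The schedule expression, pull mode, and builder setters are assumptions based on the documented field names; consult the generated docs for the exact signatures and the accepted expression formats.

```rust
use aws_sdk_appflow::types::{
    DataPullMode, ScheduledTriggerProperties, TriggerConfig, TriggerProperties, TriggerType,
};

fn scheduled_trigger_sketch() -> Result<TriggerConfig, Box<dyn std::error::Error>> {
    // Run the flow on a schedule, pulling only records that changed since the last run.
    let scheduled = ScheduledTriggerProperties::builder()
        .schedule_expression("rate(1hours)") // hypothetical expression; see the AppFlow docs for valid formats
        .data_pull_mode(DataPullMode::Incremental)
        .build()?; // scheduleExpression is required, so a fallible build() is assumed

    let trigger = TriggerConfig::builder()
        .trigger_type(TriggerType::Scheduled)
        .trigger_properties(TriggerProperties::builder().scheduled(scheduled).build())
        .build()?; // triggerType is required, so a fallible build() is assumed
    Ok(trigger)
}
```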
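Finally, the SAPOData pagination and parallelism settings described above might be tuned like this. The numbers are illustrative only, and the setters (max_page_size, max_parallelism) are assumed from the documented field names.

```rust
use aws_sdk_appflow::types::{SapoDataPaginationConfig, SapoDataParallelismConfig};

fn sapodata_tuning_sketch() -> Result<(), Box<dyn std::error::Error>> {
    // Each concurrent process fetches up to 1,000 OData records per page.
    let pagination = SapoDataPaginationConfig::builder().max_page_size(1000).build()?;
    // Up to five concurrent processes pull record batches in parallel.
    let parallelism = SapoDataParallelismConfig::builder().max_parallelism(5).build()?;
    let _ = (pagination, parallelism); // attach these to SapoDataSourceProperties in a real flow
    Ok(())
}
```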
Enums§
When writing a match expression against any of the enums below, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not yet been represented as a variant in the current version of the SDK, your code should continue to work when you upgrade to a future SDK version in which the enum does include a variant for that feature. (A minimal match sketch appears after this list.)

- AggregationType
- AmplitudeConnectorOperator
- AuthenticationType
- CatalogType
- ConnectionMode
- ConnectorProvisioningType
- ConnectorType
- DataPullMode
- DataTransferApiType
- DatadogConnectorOperator
- DynatraceConnectorOperator
- ExecutionStatus
- FileType
- FlowStatus
- GoogleAnalyticsConnectorOperator
- InforNexusConnectorOperator
- MarketoConnectorOperator
- OAuth2CustomPropType
- OAuth2GrantType
- Operator
- OperatorPropertiesKeys
- Operators
- PardotConnectorOperator
- PathPrefix
- PrefixFormat
- PrefixType
- PrivateConnectionProvisioningFailureCause
- PrivateConnectionProvisioningStatus
- S3ConnectorOperator
- S3InputFileType
- SalesforceConnectorOperator
- SalesforceDataTransferApi
- SapoDataConnectorOperator
- ScheduleFrequencyType
- ServiceNowConnectorOperator
- SingularConnectorOperator
- SlackConnectorOperator
- SupportedDataTransferType
- TaskType
- TrendmicroConnectorOperator
- TriggerType
- VeevaConnectorOperator
- WriteOperationType
- ZendeskConnectorOperator
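As a small sketch of the forward-compatibility advice above, using ConnectorType as an example (the named variants are assumptions about the generated enum), the wildcard arm is what keeps the match working when the SDK later adds variants:

```rust
use aws_sdk_appflow::types::ConnectorType;

fn describe(connector_type: &ConnectorType) -> String {
    match connector_type {
        ConnectorType::Salesforce => "Salesforce connector".to_string(),
        ConnectorType::S3 => "Amazon S3 connector".to_string(),
        // The wildcard arm keeps this match forward-compatible: values the
        // service returns that this SDK version does not yet model, and
        // variants added in future SDK releases, land here instead of
        // breaking the build.
        other => format!("other connector: {}", other.as_str()),
    }
}
```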