Data structures used by operation inputs/outputs.
Structs§
- Action: Defines an action to be initiated by a trigger.
- Aggregate: Specifies a transform that groups rows by chosen fields and computes the aggregated value by specified function.
- AggregateOperation: Specifies the set of parameters needed to perform aggregation in the aggregate transform.
- AllowedValue: An object representing a value allowed for a property.
- AmazonRedshiftAdvancedOption: Specifies an optional value when connecting to the Redshift cluster.
- AmazonRedshiftNodeData: Specifies an Amazon Redshift node.
- AmazonRedshiftSource: Specifies an Amazon Redshift source.
- AmazonRedshiftTarget: Specifies an Amazon Redshift target.
- AnnotationError: A failed annotation.
- ApplyMapping: Specifies a transform that maps data property keys in the data source to data property keys in the data target. You can rename keys, modify the data types for keys, and choose which keys to drop from the dataset.
- AthenaConnectorSource: Specifies a connector to an Amazon Athena data source.
- AuditContext: A structure containing the Lake Formation audit context.
- AuthConfiguration: The authentication configuration for a connection returned by the DescribeConnectionType API.
- AuthenticationConfiguration: A structure containing the authentication configuration.
- AuthenticationConfigurationInput: A structure containing the authentication configuration in the CreateConnection request.
- AuthorizationCodeProperties: The set of properties required for the OAuth2 AUTHORIZATION_CODE grant type workflow.
- BackfillError: A list of errors that can occur when registering partition indexes for an existing table. These errors give the details about why an index registration failed and provide a limited number of partitions in the response, so that you can fix the partitions at fault and try registering the index again. The most common set of errors that can occur are categorized as follows:
  - EncryptedPartitionError: The partitions are encrypted.
  - InvalidPartitionTypeDataError: The partition value doesn't match the data type for that partition column.
  - MissingPartitionValueError: The partition value is missing.
  - UnsupportedPartitionCharacterError: Characters inside the partition value are not supported. For example: U+0000, U+0001, U+0002.
  - InternalError: Any error which does not belong to other error codes.
- BasicAuthenticationCredentials: For supplying basic auth credentials when not providing a SecretArn value.
- BasicCatalogTarget: Specifies a target that uses a Glue Data Catalog table.
- BatchGetTableOptimizerEntry: Represents a table optimizer to retrieve in the BatchGetTableOptimizer operation.
- BatchGetTableOptimizerError: Contains details on one of the errors in the error list returned by the BatchGetTableOptimizer operation.
- BatchStopJobRunError: Records an error that occurred when attempting to stop a specified job run.
- BatchStopJobRunSuccessfulSubmission: Records a successful request to stop a specified JobRun.
- BatchTableOptimizer: Contains details for one of the table optimizers returned by the BatchGetTableOptimizer operation.
- BatchUpdatePartitionFailureEntry: Contains information about a batch update partition error.
- BatchUpdatePartitionRequestEntry: A structure that contains the values and structure used to update a partition.
- BinaryColumnStatisticsData: Defines column statistics supported for bit sequence data values.
- Blueprint: The details of a blueprint.
- BlueprintDetails: The details of a blueprint.
- BlueprintRun: The details of a blueprint run.
- BooleanColumnStatisticsData: Defines column statistics supported for Boolean data columns.
- Capabilities: Specifies the supported authentication types returned by the DescribeConnectionType API.
- Catalog: The catalog object represents a logical grouping of databases in the Glue Data Catalog or a federated source. You can now create a Redshift-federated catalog or a catalog containing resource links to Redshift databases in another account or region.
- CatalogDeltaSource: Specifies a Delta Lake data source that is registered in the Glue Data Catalog.
- CatalogEntry: Specifies a table definition in the Glue Data Catalog.
- CatalogHudiSource: Specifies a Hudi data source that is registered in the Glue Data Catalog.
- CatalogImportStatus: A structure containing migration status information.
- CatalogInput: A structure that describes catalog properties.
- CatalogKafkaSource: Specifies an Apache Kafka data store in the Data Catalog.
- CatalogKinesisSource: Specifies a Kinesis data source in the Glue Data Catalog.
- CatalogProperties: A structure that specifies data lake access properties and other custom properties.
- CatalogPropertiesOutput: Property attributes that include configuration properties for the catalog resource.
- CatalogSchemaChangePolicy: A policy that specifies update behavior for the crawler.
- CatalogSource: Specifies a data store in the Glue Data Catalog.
- CatalogTarget: Specifies a Glue Data Catalog target.
- Classifier: Classifiers are triggered during a crawl task. A classifier checks whether a given file is in a format it can handle. If it is, the classifier creates a schema in the form of a StructType object that matches that data format. You can use the standard classifiers that Glue provides, or you can write your own classifiers to best categorize your data sources and specify the appropriate schemas to use for them. A classifier can be a grok classifier, an XML classifier, a JSON classifier, or a custom CSV classifier, as specified in one of the fields in the Classifier object.
- CloudWatchEncryption: Specifies how Amazon CloudWatch data should be encrypted.
- CodeGenConfigurationNode: CodeGenConfigurationNode enumerates all valid Node types. One and only one of its member variables can be populated.
- CodeGenEdge: Represents a directional edge in a directed acyclic graph (DAG).
- CodeGenNode: Represents a node in a directed acyclic graph (DAG).
- CodeGenNodeArg: An argument or property of a node.
- Column: A column in a Table.
- ColumnError: Encapsulates a column name that failed and the reason for failure.
- ColumnImportance: A structure containing the column name and column importance score for a column. Column importance helps you understand how columns contribute to your model, by identifying which columns in your records are more important than others.
- ColumnRowFilter: A filter that uses both column-level and row-level filtering.
- ColumnStatistics: Represents the generated column-level statistics for a table or partition.
- ColumnStatisticsData: Contains the individual types of column statistics data. Only one data object should be set and indicated by the Type attribute.
- ColumnStatisticsError: Encapsulates a ColumnStatistics object that failed and the reason for failure.
- ColumnStatisticsTaskRun: The object that shows the details of the column stats run.
- ColumnStatisticsTaskSettings: The settings for a column statistics task.
- CompactionMetrics: A structure that contains compaction metrics for the optimizer run.
- ComputeEnvironmentConfiguration: An object containing configuration for a compute environment (such as Spark, Python or Athena) returned by the DescribeConnectionType API.
- Condition: Defines a condition under which a trigger fires.
- ConditionExpression: Condition expression defined in the Glue Studio data preparation recipe node.
- ConfigurationObject: Specifies the values that an admin sets for each job or session parameter configured in a Glue usage profile.
- ConfusionMatrix: The confusion matrix shows you what your transform is predicting accurately and what types of errors it is making. For more information, see Confusion matrix in Wikipedia.
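The confusion matrix counts can be turned into the usual quality metrics directly from a fetched transform. A minimal sketch, assuming the generated ConfusionMatrix accessors num_true_positives, num_false_positives, and num_false_negatives return Option<i64> as in recent SDK versions:

```rust
use aws_sdk_glue::types::ConfusionMatrix;

// Derive precision and recall from a transform's confusion matrix.
// Returns None if any count is missing or a denominator would be zero.
fn precision_recall(m: &ConfusionMatrix) -> Option<(f64, f64)> {
    let tp = m.num_true_positives()? as f64;
    let fp = m.num_false_positives()? as f64;
    let fn_count = m.num_false_negatives()? as f64;
    if tp + fp == 0.0 || tp + fn_count == 0.0 {
        return None;
    }
    Some((tp / (tp + fp), tp / (tp + fn_count)))
}
```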
- Connection: Defines a connection to a data source.
- ConnectionInput: A structure that is used to specify a connection to create or update.
- ConnectionPasswordEncryption: The data structure used by the Data Catalog to encrypt the password as part of CreateConnection or UpdateConnection and store it in the ENCRYPTED_PASSWORD field in the connection properties. You can enable catalog encryption or only password encryption. When a CreateConnection request arrives containing a password, the Data Catalog first encrypts the password using your KMS key. It then encrypts the whole connection object again if catalog encryption is also enabled. This encryption requires that you set KMS key permissions to enable or restrict access on the password key according to your security requirements. For example, you might want only administrators to have decrypt permission on the password key.
- ConnectionTypeBrief: Brief information about a supported connection type returned by the ListConnectionTypes API.
- ConnectionsList: Specifies the connections used by a job.
- ConnectorDataSource: Specifies a source generated with standard connection options.
- ConnectorDataTarget: Specifies a target generated with standard connection options.
- Crawl: The details of a crawl in the workflow.
- Crawler: Specifies a crawler program that examines a data source and uses classifiers to try to determine its schema. If successful, the crawler records metadata concerning the data source in the Glue Data Catalog.
- CrawlerHistory: Contains the information for a run of a crawler.
- CrawlerMetrics: Metrics for a specified crawler.
- CrawlerNodeDetails: The details of a Crawler node present in the workflow.
- CrawlerTargets: Specifies data stores to crawl.
- CrawlsFilter: A list of fields, comparators and value that you can use to filter the crawler runs for a specified crawler.
- CreateCsvClassifierRequest: Specifies a custom CSV classifier for CreateClassifier to create.
- CreateGrokClassifierRequest: Specifies a grok classifier for CreateClassifier to create.
- CreateJsonClassifierRequest: Specifies a JSON classifier for CreateClassifier to create.
- CreateXmlClassifierRequest: Specifies an XML classifier for CreateClassifier to create.
- CsvClassifier: A classifier for custom CSV content.
- CustomCode: Specifies a transform that uses custom code you provide to perform the data transformation. The output is a collection of DynamicFrames.
- CustomEntityType: An object representing a custom pattern for detecting sensitive data across the columns and rows of your structured data.
- DataCatalogEncryptionSettings: Contains configuration information for maintaining Data Catalog security.
- DataLakeAccessProperties: Input properties to configure data lake access for your catalog resource in the Glue Data Catalog.
- DataLakeAccessPropertiesOutput: The output properties of the data lake access configuration for your catalog resource in the Glue Data Catalog.
- DataLakePrincipal: The Lake Formation principal.
- DataQualityAnalyzerResult: Describes the result of the evaluation of a data quality analyzer.
- DataQualityEncryption: Specifies how Data Quality assets in your account should be encrypted.
- DataQualityEvaluationRunAdditionalRunOptions: Additional run options you can specify for an evaluation run.
- DataQualityMetricValues: Describes the data quality metric value according to the analysis of historical data.
- DataQualityObservation: Describes the observation generated after evaluating the rules and analyzers.
- DataQualityResult: Describes a data quality result.
- DataQualityResultDescription: Describes a data quality result.
- DataQualityResultFilterCriteria: Criteria used to return data quality results.
- DataQualityRuleRecommendationRunDescription: Describes the result of a data quality rule recommendation run.
- DataQualityRuleRecommendationRunFilter: A filter for listing data quality recommendation runs.
- DataQualityRuleResult: Describes the result of the evaluation of a data quality rule.
- DataQualityRulesetEvaluationRunDescription: Describes the result of a data quality ruleset evaluation run.
- DataQualityRulesetEvaluationRunFilter: The filter criteria.
- DataQualityRulesetFilterCriteria: The criteria used to filter data quality rulesets.
- DataQualityRulesetListDetails: Describes a data quality ruleset returned by GetDataQualityRuleset.
- DataQualityTargetTable: An object representing a Glue table.
- DataSource: A data source (a Glue table) for which you want data quality results.
- Database: The Database object represents a logical grouping of tables that might reside in a Hive metastore or an RDBMS.
- DatabaseIdentifier: A structure that describes a target database for resource linking.
- DatabaseInput: The structure used to create or update a database.
- DatapointInclusionAnnotation: An Inclusion Annotation.
- Datatype: A structure representing the datatype of the value.
- DateColumnStatisticsData: Defines column statistics supported for timestamp data columns.
- DecimalColumnStatisticsData: Defines column statistics supported for fixed-point number data columns.
- DecimalNumber: Contains a numeric value in decimal format.
- DeltaTarget: Specifies a Delta data store to crawl one or more Delta tables.
- DevEndpoint: A development endpoint where a developer can remotely debug extract, transform, and load (ETL) scripts.
- DevEndpointCustomLibraries: Custom libraries to be loaded into a development endpoint.
- DirectJdbcSource: Specifies the direct JDBC source connection.
- DirectKafkaSource: Specifies an Apache Kafka data store.
- DirectKinesisSource: Specifies a direct Amazon Kinesis data source.
- DirectSchemaChangePolicy: A policy that specifies update behavior for the crawler.
- DoubleColumnStatisticsData: Defines column statistics supported for floating-point number data columns.
- DqResultsPublishingOptions: Options to configure how your data quality evaluation results are published.
- DqStopJobOnFailureOptions: Options to configure how your job will stop if your data quality evaluation fails.
- DropDuplicates: Specifies a transform that removes rows of repeating data from a data set.
- DropFields: Specifies a transform that chooses the data property keys that you want to drop.
- DropNullFields: Specifies a transform that removes columns from the dataset if all values in the column are 'null'. By default, Glue Studio will recognize null objects, but some values such as empty strings, strings that are "null", -1 integers or other placeholders such as zeros, are not automatically recognized as nulls.
- DynamicTransform: Specifies the set of parameters needed to perform the dynamic transform.
- DynamoDbCatalogSource: Specifies a DynamoDB data source in the Glue Data Catalog.
- DynamoDbTarget: Specifies an Amazon DynamoDB table to crawl.
- Edge: An edge represents a directed connection between two Glue components that are part of the workflow the edge belongs to.
- EncryptionAtRest: Specifies the encryption-at-rest configuration for the Data Catalog.
- EncryptionConfiguration: Specifies an encryption configuration.
- Entity: An entity supported by a given ConnectionType.
- ErrorDetail: Contains details about an error.
- ErrorDetails: An object containing error details.
- EvaluateDataQuality: Specifies your data quality evaluation criteria.
- EvaluateDataQualityMultiFrame: Specifies your data quality evaluation criteria.
- EvaluationMetrics: Evaluation metrics provide an estimate of the quality of your machine learning transform.
- EventBatchingCondition: Batch condition that must be met (specified number of events received or batch time window expired) before EventBridge event trigger fires.
- ExecutionAttempt: A run attempt for a column statistics task run.
- ExecutionProperty: An execution property of a job.
- ExportLabelsTaskRunProperties: Specifies configuration properties for an exporting labels task run.
- FederatedCatalog: A catalog that points to an entity outside the Glue Data Catalog.
- FederatedDatabase: A database that points to an entity outside the Glue Data Catalog.
- FederatedTable: A table that points to an entity outside the Glue Data Catalog.
- Field: The Field object has information about the different properties associated with a field in the connector.
- FillMissingValues: Specifies a transform that locates records in the dataset that have missing values and adds a new field with a value determined by imputation. The input data set is used to train the machine learning model that determines what the missing value should be.
- Filter: Specifies a transform that splits a dataset into two, based on a filter condition.
- FilterExpression: Specifies a filter expression.
- FilterValue: Represents a single entry in the list of values for a FilterExpression.
- FindMatchesMetrics: The evaluation metrics for the find matches algorithm. The quality of your machine learning transform is measured by getting your transform to predict some matches and comparing the results to known matches from the same dataset. The quality metrics are based on a subset of your data, so they are not precise.
- FindMatchesParameters: The parameters to configure the find matches transform.
- FindMatchesTaskRunProperties: Specifies configuration properties for a Find Matches task run.
- GetConnectionsFilter: Filters the connection definitions that are returned by the GetConnections API operation.
- GluePolicy: A structure for returning a resource policy.
- GlueSchema: Specifies a user-defined schema when a schema cannot be determined by Glue.
- GlueStudioSchemaColumn: Specifies a single column in a Glue schema definition.
- GlueTable: The database and table in the Glue Data Catalog that is used for input or output data.
- GovernedCatalogSource: Specifies the data store in the governed Glue Data Catalog.
- GovernedCatalogTarget: Specifies a data target that writes to Amazon S3 using the Glue Data Catalog.
- GrokClassifier: A classifier that uses grok patterns.
- HudiTarget: Specifies an Apache Hudi data source.
- IcebergCompactionMetrics: Compaction metrics for Iceberg for the optimizer run.
- IcebergInput: A structure that defines an Apache Iceberg metadata table to create in the catalog.
- IcebergOrphanFileDeletionConfiguration: The configuration for an Iceberg orphan file deletion optimizer.
- IcebergOrphanFileDeletionMetrics: Orphan file deletion metrics for Iceberg for the optimizer run.
- IcebergRetentionConfiguration: The configuration for an Iceberg snapshot retention optimizer.
- IcebergRetentionMetrics: Snapshot retention metrics for Iceberg for the optimizer run.
- IcebergTarget: Specifies an Apache Iceberg data source where Iceberg tables are stored in Amazon S3.
- ImportLabelsTaskRunProperties: Specifies configuration properties for an importing labels task run.
- InboundIntegration: A structure for an integration that writes data into a resource.
- Integration: Describes a zero-ETL integration.
- IntegrationError: An error associated with a zero-ETL integration.
- IntegrationFilter: A filter that can be used when invoking a DescribeIntegrations request.
- IntegrationPartition: A structure that describes how data is partitioned on the target.
- JdbcConnectorOptions: Additional connection options for the connector.
- JdbcConnectorSource: Specifies a connector to a JDBC data source.
- JdbcConnectorTarget: Specifies a data target that writes to Amazon S3 in Apache Parquet columnar storage.
- JdbcTarget: Specifies a JDBC data store to crawl.
- Job: Specifies a job definition.
- JobBookmarkEntry: Defines a point that a job can resume processing.
- JobBookmarksEncryption: Specifies how job bookmark data should be encrypted.
- JobCommand: Specifies code that runs when a job is run.
- JobNodeDetails: The details of a Job node present in the workflow.
- JobRun: Contains information about a job run.
- JobUpdate: Specifies information used to update an existing job definition. The previous job definition is completely overwritten by this information.
- Join: Specifies a transform that joins two datasets into one dataset using a comparison phrase on the specified data property keys. You can use inner, outer, left, right, left semi, and left anti joins.
- JoinColumn: Specifies a column to be joined.
- JsonClassifier: A classifier for JSON content.
- KafkaStreamingSourceOptions: Additional options for streaming.
- KeySchemaElement: A partition key pair consisting of a name and a type.
- KinesisStreamingSourceOptions: Additional options for the Amazon Kinesis streaming data source.
- LabelingSetGenerationTaskRunProperties: Specifies configuration properties for a labeling set generation task run.
- LakeFormationConfiguration: Specifies Lake Formation configuration settings for the crawler.
- LastActiveDefinition: When there are multiple versions of a blueprint and the latest version has some errors, this attribute indicates the last successful blueprint definition that is available with the service.
- LastCrawlInfo: Status and error information about the most recent crawl.
- LineageConfiguration: Specifies data lineage configuration settings for the crawler.
- Location: The location of resources.
- LongColumnStatisticsData: Defines column statistics supported for integer data columns.
- Mapping: Specifies the mapping of data property keys.
- MappingEntry: Defines a mapping.
- Merge: Specifies a transform that merges a DynamicFrame with a staging DynamicFrame based on the specified primary keys to identify records. Duplicate records (records with the same primary keys) are not de-duplicated.
- MetadataInfo: A structure containing metadata information for a schema version.
- MetadataKeyValuePair: A structure containing a key value pair for metadata.
- MetricBasedObservation: Describes the metric based observation generated based on evaluated data quality metrics.
- MicrosoftSqlServerCatalogSource: Specifies a Microsoft SQL server data source in the Glue Data Catalog.
- MicrosoftSqlServerCatalogTarget: Specifies a target that uses Microsoft SQL.
- MlTransform: A structure for a machine learning transform.
- MlUserDataEncryption: The encryption-at-rest settings of the transform that apply to accessing user data.
- MongoDbTarget: Specifies an Amazon DocumentDB or MongoDB data store to crawl.
- MySqlCatalogSource: Specifies a MySQL data source in the Glue Data Catalog.
- MySqlCatalogTarget: Specifies a target that uses MySQL.
- Node: A node represents a Glue component (trigger, crawler, or job) on a workflow graph.
- NotificationProperty: Specifies configuration properties of a notification.
- NullCheckBoxList: Represents whether certain values are recognized as null values for removal.
- NullValueField: Represents a custom null value such as zeros or another value being used as a null placeholder unique to the dataset.
- OAuth2ClientApplication: The OAuth2 client app used for the connection.
- OAuth2Credentials: The credentials used when the authentication type is OAuth2 authentication.
- OAuth2Properties: A structure containing properties for OAuth2 authentication.
- OAuth2PropertiesInput: A structure containing properties for OAuth2 in the CreateConnection request.
- OpenTableFormatInput: A structure representing an open format table.
- Option: Specifies an option value.
- OracleSqlCatalogSource: Specifies an Oracle data source in the Glue Data Catalog.
- OracleSqlCatalogTarget: Specifies a target that uses Oracle SQL.
- Order: Specifies the sort order of a sorted column.
- OrphanFileDeletionConfiguration: The configuration for an orphan file deletion optimizer.
- OrphanFileDeletionMetrics: A structure that contains orphan file deletion metrics for the optimizer run.
- OtherMetadataValueListItem: A structure containing other metadata for a schema version belonging to the same metadata key.
- Partition: Represents a slice of table data.
- PartitionError: Contains information about a partition error.
- PartitionIndex: A structure for a partition index.
- PartitionIndexDescriptor: A descriptor for a partition index in a table.
- PartitionInput: The structure used to create and update a partition.
- PartitionValueList: Contains a list of values defining partitions.
- PhysicalConnectionRequirements: The physical network requirements, such as the subnet ID and security group IDs, that are needed to make this connection successfully.
- PiiDetection: Specifies a transform that identifies, removes or masks PII data.
- PostgreSqlCatalogSource: Specifies a PostgreSQL data source in the Glue Data Catalog.
- PostgreSqlCatalogTarget: Specifies a target that uses PostgreSQL.
- Predecessor: A job run that was used in the predicate of a conditional trigger that triggered this job run.
- Predicate: Defines the predicate of the trigger, which determines when it fires.
- PrincipalPermissions: Permissions granted to a principal.
- ProfileConfiguration: Specifies the job and session values that an admin configures in a Glue usage profile.
- Property: An object that defines a connection type for a compute environment.
- PropertyPredicate: Defines a property predicate.
- QuerySessionContext: A structure used as a protocol between query engines and Lake Formation or Glue. Contains both a Lake Formation generated authorization identifier and information from the request's authorization context.
- Recipe: A Glue Studio node that uses a Glue DataBrew recipe in Glue jobs.
- RecipeAction: Actions defined in the Glue Studio data preparation recipe node.
- RecipeReference: A reference to a Glue DataBrew recipe.
- RecipeStep: A recipe step used in a Glue Studio data preparation recipe node.
- RecrawlPolicy: When crawling an Amazon S3 data source after the first crawl is complete, specifies whether to crawl the entire dataset again or to crawl only folders that were added since the last crawler run. For more information, see Incremental Crawls in Glue in the developer guide.
- RedshiftSource: Specifies an Amazon Redshift data store.
- RedshiftTarget: Specifies a target that uses Amazon Redshift.
- RegistryId: A wrapper structure that may contain the registry name and Amazon Resource Name (ARN).
- RegistryListItem: A structure containing the details for a registry.
- RelationalCatalogSource: Specifies a Relational database data source in the Glue Data Catalog.
- RenameField: Specifies a transform that renames a single data property key.
- ResourceUri: The URIs for function resources.
- RetentionConfiguration: The configuration for a snapshot retention optimizer.
- RetentionMetrics: A structure that contains retention metrics for the optimizer run.
- RunIdentifier: A run identifier.
- RunMetrics: Metrics for the optimizer run. This structure is deprecated. See the individual metric members for compaction, retention, and orphan file deletion.
- S3CatalogDeltaSource: Specifies a Delta Lake data source that is registered in the Glue Data Catalog. The data source must be stored in Amazon S3.
- S3CatalogHudiSource: Specifies a Hudi data source that is registered in the Glue Data Catalog. The Hudi data source must be stored in Amazon S3.
- S3CatalogSource: Specifies an Amazon S3 data store in the Glue Data Catalog.
- S3CatalogTarget: Specifies a data target that writes to Amazon S3 using the Glue Data Catalog.
- S3CsvSource: Specifies a comma-separated value (CSV) data store stored in Amazon S3.
- S3DeltaCatalogTarget: Specifies a target that writes to a Delta Lake data source in the Glue Data Catalog.
- S3DeltaDirectTarget: Specifies a target that writes to a Delta Lake data source in Amazon S3.
- S3DeltaSource: Specifies a Delta Lake data source stored in Amazon S3.
- S3DirectSourceAdditionalOptions: Specifies additional connection options for the Amazon S3 data store.
- S3DirectTarget: Specifies a data target that writes to Amazon S3.
- S3Encryption: Specifies how Amazon Simple Storage Service (Amazon S3) data should be encrypted.
- S3GlueParquetTarget: Specifies a data target that writes to Amazon S3 in Apache Parquet columnar storage.
- S3HudiCatalogTarget: Specifies a target that writes to a Hudi data source in the Glue Data Catalog.
- S3HudiDirectTarget: Specifies a target that writes to a Hudi data source in Amazon S3.
- S3HudiSource: Specifies a Hudi data source stored in Amazon S3.
- S3JsonSource: Specifies a JSON data store stored in Amazon S3.
- S3ParquetSource: Specifies an Apache Parquet data store stored in Amazon S3.
- S3SourceAdditionalOptions: Specifies additional connection options for the Amazon S3 data store.
- S3Target: Specifies a data store in Amazon Simple Storage Service (Amazon S3).
- Schedule: A scheduling object using a cron statement to schedule an event.
- SchemaChangePolicy: A policy that specifies update and deletion behaviors for the crawler.
- SchemaColumn: A key-value pair representing a column and data type that this transform can run against. The Schema parameter of the MLTransform may contain up to 100 of these structures.
- SchemaId: The unique ID of the schema in the Glue schema registry.
- SchemaListItem: An object that contains minimal details for a schema.
- SchemaReference: An object that references a schema stored in the Glue Schema Registry.
- SchemaVersionErrorItem: An object that contains the error details for an operation on a schema version.
- SchemaVersionListItem: An object containing the details about a schema version.
- SchemaVersionNumber: A structure containing the schema version information.
- SecurityConfiguration: Specifies a security configuration.
- Segment: Defines a non-overlapping region of a table's partitions, allowing multiple requests to be run in parallel.
- SelectFields: Specifies a transform that chooses the data property keys that you want to keep.
- SelectFromCollection: Specifies a transform that chooses one DynamicFrame from a collection of DynamicFrames. The output is the selected DynamicFrame.
- SerDeInfo: Information about a serialization/deserialization program (SerDe) that serves as an extractor and loader.
- Session: The period in which a remote Spark runtime environment is running.
- SessionCommand: The SessionCommand that runs the job.
- SkewedInfo: Specifies skewed values in a table. Skewed values are those that occur with very high frequency.
- SnowflakeNodeData: Specifies configuration for Snowflake nodes in Glue Studio.
- SnowflakeSource: Specifies a Snowflake data source.
- SnowflakeTarget: Specifies a Snowflake target.
- SortCriterion: Specifies a field to sort by and a sort order.
- SourceControlDetails: The details for a source control configuration for a job, allowing synchronization of job artifacts to or from a remote repository.
- SourceProcessingProperties: The resource properties associated with the integration source.
- SourceTableConfig: Properties used by the source leg to process data from the source.
- SparkConnectorSource: Specifies a connector to an Apache Spark data source.
- SparkConnectorTarget: Specifies a target that uses an Apache Spark connector.
- SparkSql: Specifies a transform where you enter a SQL query using Spark SQL syntax to transform the data. The output is a single DynamicFrame.
- Spigot: Specifies a transform that writes samples of the data to an Amazon S3 bucket.
- SplitFields: Specifies a transform that splits data property keys into two DynamicFrames. The output is a collection of DynamicFrames: one with selected data property keys, and one with the remaining data property keys.
- SqlAlias: Represents a single entry in the list of values for SqlAliases.
- StartingEventBatchCondition: The batch condition that started the workflow run. Either the number of events in the batch size arrived, in which case the BatchSize member is non-zero, or the batch window expired, in which case the BatchWindow member is non-zero.
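That either/or convention can be checked directly on a fetched workflow run. A minimal sketch, assuming the generated batch_size and batch_window accessors return Option<i32> as in recent SDK versions:

```rust
use aws_sdk_glue::types::StartingEventBatchCondition;

// Explain why an event-triggered workflow run started: either the event-count
// threshold was reached (BatchSize non-zero) or the batch window expired
// (BatchWindow non-zero).
fn describe_start(cond: &StartingEventBatchCondition) -> String {
    match (cond.batch_size(), cond.batch_window()) {
        (Some(n), _) if n > 0 => format!("started after receiving {n} events"),
        (_, Some(w)) if w > 0 => format!("started after the {w}-second batch window expired"),
        _ => "start condition not reported".to_string(),
    }
}
```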
- Statement: The statement or request for a particular action to occur in a session.
- StatementOutput: The code execution output in JSON format.
- StatementOutputData: The code execution output in JSON format.
- StatisticAnnotation: A Statistic Annotation.
- StatisticModelResult: The statistic model result.
- StatisticSummary: Summary information about a statistic.
- StatusDetails: A structure containing information about an asynchronous change to a table.
- StorageDescriptor: Describes the physical storage of table data.
- StreamingDataPreviewOptions: Specifies options related to data preview for viewing a sample of your data.
- StringColumnStatisticsData: Defines column statistics supported for character sequence data values.
- SupportedDialect: A structure specifying the dialect and dialect version used by the query engine.
- Table: Represents a collection of related data organized in columns and rows.
- TableError: An error record for table operations.
- TableIdentifier: A structure that describes a target table for resource linking.
- TableInput: A structure used to define a table.
- TableOptimizer: Contains details about an optimizer associated with a table.
- TableOptimizerConfiguration: Contains details on the configuration of a table optimizer. You pass this configuration when creating or updating a table optimizer.
- TableOptimizerRun: Contains details for a table optimizer run.
- TableStatus: A structure containing information about the state of an asynchronous change to a table.
- TableVersion: Specifies a version of a table.
- TableVersionError: An error record for table-version operations.
- Tag: The Tag object represents a label that you can assign to an Amazon Web Services resource. Each tag consists of a key and an optional value, both of which you define. For more information about tags, and controlling access to resources in Glue, see Amazon Web Services Tags in Glue and Specifying Glue Resource ARNs in the developer guide.
- TargetProcessingProperties: The resource properties associated with the integration target.
- TargetRedshiftCatalog: A structure that describes a target catalog for resource linking.
- TargetTableConfig: Properties used by the target leg to partition the data on the target.
- TaskRun: The sampling parameters that are associated with the machine learning transform.
- TaskRunFilterCriteria: The criteria that are used to filter the task runs for the machine learning transform.
- TaskRunProperties: The configuration properties for the task run.
- TaskRunSortCriteria: The sorting criteria that are used to sort the list of task runs for the machine learning transform.
- TestConnectionInput: A structure that is used to specify testing a connection to a service.
- TimestampFilter: A timestamp filter.
- TimestampedInclusionAnnotation: A timestamped inclusion annotation.
- TransformConfigParameter: Specifies the parameters in the config file of the dynamic transform.
- TransformEncryption: The encryption-at-rest settings of the transform that apply to accessing user data. Machine learning transforms can access user data encrypted in Amazon S3 using KMS. Additionally, imported labels and trained transforms can now be encrypted using a customer provided KMS key.
- TransformFilterCriteria: The criteria used to filter the machine learning transforms.
- TransformParameters: The algorithm-specific parameters that are associated with the machine learning transform.
- TransformSortCriteria: The sorting criteria that are associated with the machine learning transform.
- Trigger: Information about a specific trigger.
- TriggerNodeDetails: The details of a Trigger node present in the workflow.
- TriggerUpdate: A structure used to provide information used to update a trigger. This object updates the previous trigger definition by overwriting it completely.
- UnfilteredPartition: A partition that contains unfiltered metadata.
- Union: Specifies a transform that combines the rows from two or more datasets into a single result.
- UpdateCsvClassifierRequest: Specifies a custom CSV classifier to be updated.
- UpdateGrokClassifierRequest: Specifies a grok classifier to update when passed to UpdateClassifier.
- UpdateJsonClassifierRequest: Specifies a JSON classifier to be updated.
- UpdateXmlClassifierRequest: Specifies an XML classifier to be updated.
- UpsertRedshiftTargetOptions: The options to configure an upsert operation when writing to a Redshift target.
- UsageProfileDefinition: Describes a Glue usage profile.
- UserDefinedFunction: Represents the equivalent of a Hive user-defined function (UDF) definition.
- UserDefinedFunctionInput: A structure used to create or update a user-defined function.
- ViewDefinition: A structure containing details for representations.
- ViewDefinitionInput: A structure containing details for creating or updating a Glue view.
- ViewRepresentation: A structure that contains the dialect of the view, and the query that defines the view.
- ViewRepresentationInput: A structure containing details of a representation to update or create a Lake Formation view.
- ViewValidation: A structure that contains information for an analytical engine to validate a view, prior to persisting the view metadata. Used in the case of direct UpdateTable or CreateTable API calls.
- Workflow: A workflow is a collection of multiple dependent Glue jobs and crawlers that are run to complete a complex ETL task. A workflow manages the execution and monitoring of all its jobs and crawlers.
- WorkflowGraph: A workflow graph represents the complete workflow containing all the Glue components present in the workflow and all the directed connections between them.
- WorkflowRun: A workflow run is an execution of a workflow providing all the runtime information.
- WorkflowRunStatistics: Workflow run statistics provides statistics about the workflow run.
- XmlClassifier: A classifier for XML content.
Enums§
When writing a match expression against any of the enums listed below, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in the current version of the SDK, your code should continue to work when you upgrade to a future SDK version in which the enum does include a variant for that feature. A sketch of this pattern follows the list.
- AdditionalOptionKeys
- AggFunction
- AllowFullTableExternalDataAccessEnum
- AuthenticationType
- BackfillErrorCode
- BlueprintRunState
- BlueprintStatus
- CatalogEncryptionMode
- CloudWatchEncryptionMode
- ColumnStatisticsState
- ColumnStatisticsType
- Comparator
- Compatibility
- CompressionType
- ComputationType
- ComputeEnvironment
- ConnectionPropertyKey
- ConnectionStatus
- ConnectionType
- CrawlState
- CrawlerHistoryState
- CrawlerLineageSettings
- CrawlerState
- CsvHeaderOption
- CsvSerdeOption
- DataFormat
- DataOperation
- DataQualityEncryptionMode
- DataQualityModelStatus
- DataQualityRuleResultStatus
- DatabaseAttributes
- DeleteBehavior
- DeltaTargetCompressionType
- DqCompositeRuleEvaluationMethod
- DqStopJobOnFailureTiming
- DqTransformOutput
- EnableHybridValues
- ExecutionClass
- ExecutionStatus
- ExistCondition
- FederationSourceErrorCode
- FieldDataType
- FieldFilterOperator
- FieldName
- FilterLogicalOperator
- FilterOperation
- FilterOperator
- FilterValueType
- GlueRecordType
- HudiTargetCompressionType
- InclusionAnnotationValue
- IntegrationStatus
- JdbcConnectionType
JdbcConnectionType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Jdbc
Data Type - When writing a match expression against
JdbcDataType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Jdbc
Metadata Entry - When writing a match expression against
JdbcMetadataEntry
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - JobBookmarks
Encryption Mode - When writing a match expression against
JobBookmarksEncryptionMode
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - JobMode
- When writing a match expression against
JobMode
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - JobRun
State - When writing a match expression against
JobRunState
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Join
Type - When writing a match expression against
JoinType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Language
- When writing a match expression against
Language
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Last
Crawl Status - When writing a match expression against
LastCrawlStatus
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Logical
- When writing a match expression against
Logical
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Logical
Operator - When writing a match expression against
LogicalOperator
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Metadata
Operation - When writing a match expression against
MetadataOperation
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - MlUser
Data Encryption Mode String - When writing a match expression against
MlUserDataEncryptionModeString
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Node
Type - When writing a match expression against
NodeType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - OAuth2
Grant Type - When writing a match expression against
OAuth2GrantType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Param
Type - When writing a match expression against
ParamType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Parquet
Compression Type - When writing a match expression against
ParquetCompressionType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Partition
Index Status - When writing a match expression against
PartitionIndexStatus
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Permission
- When writing a match expression against
Permission
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Permission
Type - When writing a match expression against
PermissionType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - PiiType
- When writing a match expression against
PiiType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Principal
Type - When writing a match expression against
PrincipalType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Property
Type - When writing a match expression against
PropertyType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Quote
Char - When writing a match expression against
QuoteChar
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Recrawl
Behavior - When writing a match expression against
RecrawlBehavior
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Registry
Status - When writing a match expression against
RegistryStatus
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Resource
Action - When writing a match expression against
ResourceAction
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Resource
Share Type - When writing a match expression against
ResourceShareType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Resource
State - When writing a match expression against
ResourceState
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Resource
Type - When writing a match expression against
ResourceType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - S3Encryption
Mode - When writing a match expression against
S3EncryptionMode
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Schedule
State - When writing a match expression against
ScheduleState
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Schedule
Type - When writing a match expression against
ScheduleType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Schema
Diff Type - When writing a match expression against
SchemaDiffType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Schema
Status - When writing a match expression against
SchemaStatus
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Schema
Version Status - When writing a match expression against
SchemaVersionStatus
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Separator
- When writing a match expression against
Separator
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Session
Status - When writing a match expression against
SessionStatus
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Setting
Source - When writing a match expression against
SettingSource
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Sort
- When writing a match expression against
Sort
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Sort
Direction Type - When writing a match expression against
SortDirectionType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Source
Control Auth Strategy - When writing a match expression against
SourceControlAuthStrategy
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Source
Control Provider - When writing a match expression against
SourceControlProvider
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Starting
Position - When writing a match expression against
StartingPosition
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Statement
State - When writing a match expression against
StatementState
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Statistic
Evaluation Level - When writing a match expression against
StatisticEvaluationLevel
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Table
Attributes - When writing a match expression against
TableAttributes
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Table
Optimizer Event Type - When writing a match expression against
TableOptimizerEventType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Table
Optimizer Type - When writing a match expression against
TableOptimizerType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Table
Optimizer VpcConfiguration An object that describes the VPC configuration for a table optimizer.
This configuration is necessary to perform optimization on tables that are in a customer VPC.
- Target
Format - When writing a match expression against
TargetFormat
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Task
RunSort Column Type - When writing a match expression against
TaskRunSortColumnType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Task
Status Type - When writing a match expression against
TaskStatusType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Task
Type - When writing a match expression against
TaskType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Transform
Sort Column Type - When writing a match expression against
TransformSortColumnType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Transform
Status Type - When writing a match expression against
TransformStatusType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Transform
Type - When writing a match expression against
TransformType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Trigger
State - When writing a match expression against
TriggerState
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Trigger
Type - When writing a match expression against
TriggerType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Union
Type - When writing a match expression against
UnionType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Unnest
Spec - When writing a match expression against
UnnestSpec
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Update
Behavior - When writing a match expression against
UpdateBehavior
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Update
Catalog Behavior - When writing a match expression against
UpdateCatalogBehavior
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - View
Dialect - When writing a match expression against
ViewDialect
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - View
Update Action - When writing a match expression against
ViewUpdateAction
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Worker
Type - When writing a match expression against
WorkerType
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature. - Workflow
RunStatus - When writing a match expression against
WorkflowRunStatus
, it is important to ensure your code is forward-compatible. That is, if a match arm handles a case for a feature that is supported by the service but has not been represented as an enum variant in a current version of SDK, your code should continue to work when you upgrade SDK to a future version in which the enum does include a variant for that feature.
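The sketch below shows one way to write such a forward-compatible match, using JobRunState as a representative enum from this list. It is a minimal illustration, not the definitive pattern: the variant names shown (Running, Succeeded, Failed), the as_str() helper, and the "NewState" string are assumptions based on how the AWS SDK for Rust typically generates these enums, so verify them against the generated docs for the enum you are matching.

```rust
// A minimal, hedged sketch of a forward-compatible match against JobRunState.
// Assumptions: JobRunState lives in aws_sdk_glue::types, exposes Running /
// Succeeded / Failed variants and an as_str() accessor, and "NewState" stands
// in for a hypothetical value the service might return before the SDK models it.
use aws_sdk_glue::types::JobRunState;

fn describe(state: &JobRunState) -> &'static str {
    match state {
        JobRunState::Running => "job is still running",
        JobRunState::Succeeded => "job finished successfully",
        JobRunState::Failed => "job failed",
        // Matching on the raw string lets the code react to a value the service
        // already returns but this SDK version does not yet expose as a variant.
        other if other.as_str() == "NewState" => "job is in the new state",
        // The catch-all arm keeps the match exhaustive, so upgrading to an SDK
        // version that adds more variants will not break compilation.
        _ => "job is in some other state",
    }
}
```

The key design point is the trailing catch-all arm: because these enums are generated to absorb values the SDK does not yet know about, omitting it would either fail to compile or force a rewrite whenever a new variant appears.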