Struct aws_sdk_glue::types::SparkSql
#[non_exhaustive]
pub struct SparkSql {
pub name: String,
pub inputs: Vec<String>,
pub sql_query: String,
pub sql_aliases: Vec<SqlAlias>,
pub output_schemas: Option<Vec<GlueSchema>>,
}
Specifies a transform where you enter a SQL query using Spark SQL syntax to transform the data. The output is a single DynamicFrame.
Fields (Non-exhaustive)
This struct is marked as non-exhaustive: it cannot be constructed outside this crate with the Struct { .. } syntax, cannot be matched against without a wildcard .., and struct update syntax will not work.
name: String
The name of the transform node.
inputs: Vec<String>
The data inputs identified by their node names. You can associate a table name with each input node to use in the SQL query. The name you choose must meet the Spark SQL naming restrictions.
sql_query: String
A SQL query that must use Spark SQL syntax and return a single data set.
sql_aliases: Vec<SqlAlias>
A list of aliases. An alias lets you specify the name to use in the SQL query for a given input. For example, suppose you have a data source named "MyDataSource". If you specify From as MyDataSource and Alias as SqlName, then in your SQL you can write:
select * from SqlName
and that gets data from MyDataSource. A construction sketch showing this aliasing follows the field list.
output_schemas: Option<Vec<GlueSchema>>
Specifies the data schema for the SparkSQL transform.
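The following is a minimal construction sketch, not taken from this page: it assumes the crate's generated builder API (SparkSql::builder(), SqlAlias::builder(), per-item appenders for the list fields, and a fallible build() that reports missing required members); the node names and query are illustrative only.

use aws_sdk_glue::types::{SparkSql, SqlAlias};

fn build_spark_sql_node() -> Result<SparkSql, Box<dyn std::error::Error>> {
    // Alias the input node "source_node" as "SqlName" so the query can refer to it.
    let alias = SqlAlias::builder()
        .from("source_node")
        .alias("SqlName")
        .build()?;

    // Assemble the transform node; the list setters inputs() and sql_aliases()
    // append one element per call.
    let node = SparkSql::builder()
        .name("sql_transform")
        .inputs("source_node")
        .sql_query("select * from SqlName")
        .sql_aliases(alias)
        .build()?;

    Ok(node)
}

The alias here mirrors the field documentation above: the query references SqlName, and the data comes from the input node source_node.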
Implementations
impl SparkSql
pub fn inputs(&self) -> &[String]
The data inputs identified by their node names. You can associate a table name with each input node to use in the SQL query. The name you choose must meet the Spark SQL naming restrictions.
pub fn sql_query(&self) -> &str
A SQL query that must use Spark SQL syntax and return a single data set.
pub fn sql_aliases(&self) -> &[SqlAlias]
A list of aliases. An alias lets you specify the name to use in the SQL query for a given input. For example, suppose you have a data source named "MyDataSource". If you specify From as MyDataSource and Alias as SqlName, then in your SQL you can write:
select * from SqlName
and that gets data from MyDataSource.
pub fn output_schemas(&self) -> &[GlueSchema]
Specifies the data schema for the SparkSQL transform.
If no value was sent for this field, a default will be set. If you want to determine if no value was sent, use .output_schemas.is_none().
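A short sketch of that distinction, assuming the accessor and the public field behave as documented above; the helper function is hypothetical and not part of the SDK.

use aws_sdk_glue::types::{GlueSchema, SparkSql};

// Illustrative helper: the accessor flattens a missing output_schemas into an
// empty slice, while the raw field keeps the Option.
fn describe_schemas(node: &SparkSql) {
    // Accessor: always yields a slice, defaulting to empty when nothing was sent.
    let schemas: &[GlueSchema] = node.output_schemas();
    // Field: lets you distinguish "not sent" from "sent but empty".
    if node.output_schemas.is_none() {
        println!("no schema information was provided");
    } else {
        println!("{} schema(s) provided", schemas.len());
    }
}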
Trait Implementations
impl PartialEq for SparkSql
impl StructuralPartialEq for SparkSql
Auto Trait Implementations
impl Freeze for SparkSql
impl RefUnwindSafe for SparkSql
impl Send for SparkSql
impl Sync for SparkSql
impl Unpin for SparkSql
impl UnwindSafe for SparkSql
Blanket Implementations
impl<T> BorrowMut<T> for T
where
    T: ?Sized,
fn borrow_mut(&mut self) -> &mut T
impl<T> Instrument for T
fn instrument(self, span: Span) -> Instrumented<Self>
fn in_current_span(self) -> Instrumented<Self>
impl<T> IntoEither for T
fn into_either(self, into_left: bool) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left is true. Converts self into a Right variant of Either<Self, Self> otherwise.
fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>
Converts self into a Left variant of Either<Self, Self> if into_left(&self) returns true. Converts self into a Right variant of Either<Self, Self> otherwise.
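A minimal sketch of the into_either behavior, assuming the either crate (which provides this blanket impl) is added as a direct dependency and reusing the assumed builder API from the construction sketch above; the names and query are illustrative.

use aws_sdk_glue::types::{SparkSql, SqlAlias};
use either::{Either, IntoEither};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Build a SparkSql node (same assumptions as the earlier sketch).
    let alias = SqlAlias::builder().from("source_node").alias("SqlName").build()?;
    let node = SparkSql::builder()
        .name("sql_transform")
        .inputs("source_node")
        .sql_query("select * from SqlName")
        .sql_aliases(alias)
        .build()?;

    // into_either(true) wraps the value in Either::Left;
    // into_either(false) would wrap it in Either::Right.
    let routed: Either<SparkSql, SparkSql> = node.into_either(true);
    assert!(routed.is_left());
    Ok(())
}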