Struct google_dataproc1::SparkJob

pub struct SparkJob {
    pub jar_file_uris: Option<Vec<String>>,
    pub logging_config: Option<LoggingConfig>,
    pub args: Option<Vec<String>>,
    pub file_uris: Option<Vec<String>>,
    pub main_class: Option<String>,
    pub archive_uris: Option<Vec<String>>,
    pub main_jar_file_uri: Option<String>,
    pub properties: Option<HashMap<String, String>>,
}

A Cloud Dataproc job for running Apache Spark applications on YARN.

This type is not used directly in any activity (method call); it is only used as part of another schema.

Fields

jar_file_uris: [Optional] HCFS URIs of jar files to add to the CLASSPATHs of the Spark driver and tasks.

logging_config: [Optional] The runtime log config for job execution.

args: [Optional] The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.

file_uris: [Optional] HCFS URIs of files to be copied to the working directory of Spark drivers and distributed tasks. Useful for naively parallel tasks.

main_class: The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris.

archive_uris: [Optional] HCFS URIs of archives to be extracted into the working directory of Spark drivers and tasks. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

main_jar_file_uri: The HCFS URI of the jar file that contains the main class.

properties: [Optional] A mapping of property names to values, used to configure Spark. Properties that conflict with values set by the Cloud Dataproc API may be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.
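Since every field is an Option and the struct implements Default, a job is typically built by starting from Default::default() and setting only the fields that matter. The sketch below is self-contained for illustration: it mirrors the SparkJob definition shown above and stubs out LoggingConfig, rather than depending on the google_dataproc1 crate; the bucket paths and class name are invented placeholders.

```rust
use std::collections::HashMap;

// Stub standing in for the crate's LoggingConfig type, so this example
// compiles on its own (assumption: we don't need its fields here).
#[derive(Default, Clone, Debug)]
pub struct LoggingConfig;

// Mirror of the SparkJob struct shown above.
#[derive(Default, Clone, Debug)]
pub struct SparkJob {
    pub jar_file_uris: Option<Vec<String>>,
    pub logging_config: Option<LoggingConfig>,
    pub args: Option<Vec<String>>,
    pub file_uris: Option<Vec<String>>,
    pub main_class: Option<String>,
    pub archive_uris: Option<Vec<String>>,
    pub main_jar_file_uri: Option<String>,
    pub properties: Option<HashMap<String, String>>,
}

fn main() {
    // Start from the default (all fields None) and fill in only what the
    // job needs. Note main_class requires the jar to be on the CLASSPATH
    // or listed in jar_file_uris.
    let job = SparkJob {
        main_class: Some("com.example.WordCount".to_string()),
        jar_file_uris: Some(vec!["gs://my-bucket/wordcount.jar".to_string()]),
        args: Some(vec!["gs://my-bucket/input.txt".to_string()]),
        ..Default::default()
    };

    // Fields that were not set remain None.
    assert!(job.main_jar_file_uri.is_none());
    println!("{:?}", job.main_class);
}
```

Using struct-update syntax (`..Default::default()`) keeps the call site robust if the generated struct gains fields in a later crate version.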

Trait Implementations

impl Default for SparkJob

    fn default() -> SparkJob
    Returns the "default value" for a type.

impl Clone for SparkJob

    fn clone(&self) -> SparkJob
    Returns a copy of the value.

    fn clone_from(&mut self, source: &Self)
    Performs copy-assignment from source.

impl Debug for SparkJob

    fn fmt(&self, f: &mut Formatter) -> fmt::Result
    Formats the value using the given formatter.

impl Part for SparkJob