pub struct DataFrameWriter<'a> { /* private fields */ }

Builder for writing a DataFrame to a path or table, modeled on PySpark's DataFrameWriter.
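A minimal usage sketch. It assumes the writer is obtained from a DataFrame elsewhere in the crate (for example via a PySpark-style write() method, which is not documented on this page) and that WriteMode and WriteFormat expose the variants implied by the method docs below:

fn write_parquet_overwrite(
    writer: DataFrameWriter<'_>,
    path: &std::path::Path,
) -> Result<(), PolarsError> {
    writer
        .mode(WriteMode::Overwrite)   // variant name assumed; overwrite/append behavior is described under save()
        .format(WriteFormat::Parquet) // variant implied by the parquet() shorthand below
        .save(path)
}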
Implementations

impl<'a> DataFrameWriter<'a>
pub fn mode(self, mode: WriteMode) -> Self

Set the write mode used by save() (PySpark: mode(saveMode)). Returns self for chaining.

pub fn format(self, format: WriteFormat) -> Self

Set the output format (PySpark: format(source)). Returns self for chaining.
pub fn option(self, key: impl Into<String>, value: impl Into<String>) -> Self

Add a single option (PySpark: option(key, value)). Returns self for chaining.
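For example (the key/value pair is illustrative only; which option keys are actually honored depends on the chosen format backend):

fn with_delimiter(writer: DataFrameWriter<'_>) -> DataFrameWriter<'_> {
    // Both key and value accept anything convertible into String, so &str literals work.
    writer.option("delimiter", ";")
}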
pub fn options(self, opts: impl IntoIterator<Item = (String, String)>) -> Self

Add multiple options (PySpark: options(**kwargs)). Returns self for chaining.
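Note that the item type is (String, String), so string literals need an explicit conversion. A sketch (the keys are illustrative, not documented options):

fn with_csv_options(writer: DataFrameWriter<'_>) -> DataFrameWriter<'_> {
    let opts = vec![
        ("header".to_string(), "true".to_string()),
        ("delimiter".to_string(), ";".to_string()),
    ];
    writer.options(opts)
}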
pub fn partition_by(self, cols: impl IntoIterator<Item = impl Into<String>>) -> Self

Partition output by the given columns (PySpark: partitionBy(cols)).
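A sketch of partitioned output (the column names are placeholders; see save() below for the resulting directory layout):

fn write_partitioned(
    writer: DataFrameWriter<'_>,
    dir: &std::path::Path,
) -> Result<(), PolarsError> {
    // Accepts any iterator of values convertible into String, e.g. a &str array.
    writer
        .format(WriteFormat::Parquet) // variant implied by the parquet() shorthand below
        .partition_by(["year", "month"])
        .save(dir) // with partition_by, `dir` is treated as a directory
}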
pub fn save_as_table(&self, session: &SparkSession, name: &str, mode: SaveMode) -> Result<(), PolarsError>

Save the DataFrame as a table (PySpark: saveAsTable). The table is held in memory by default; when spark.sql.warehouse.dir is set, it is also persisted to disk for cross-session access.
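A sketch (the table name is a placeholder, and SaveMode::Overwrite is an assumed variant name; this page does not list SaveMode's variants):

fn register_events_table(
    writer: &DataFrameWriter<'_>,
    session: &SparkSession,
) -> Result<(), PolarsError> {
    // Persists to disk only if spark.sql.warehouse.dir is configured on the session.
    writer.save_as_table(session, "events", SaveMode::Overwrite)
}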
pub fn parquet(&self, path: impl AsRef<Path>) -> Result<(), PolarsError>

Write as Parquet (PySpark: parquet(path)). Equivalent to format(Parquet).save(path).
pub fn csv(&self, path: impl AsRef<Path>) -> Result<(), PolarsError>

Write as CSV (PySpark: csv(path)). Equivalent to format(Csv).save(path).
pub fn json(&self, path: impl AsRef<Path>) -> Result<(), PolarsError>

Write as JSON lines (PySpark: json(path)). Equivalent to format(Json).save(path).
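The three shorthands above can be called directly on a configured writer; a sketch writing the same frame in each format (file names are placeholders):

fn write_all_formats(
    writer: &DataFrameWriter<'_>,
    dir: &std::path::Path,
) -> Result<(), PolarsError> {
    // Each call is documented as shorthand for format(...).save(path).
    writer.parquet(dir.join("out.parquet"))?;
    writer.csv(dir.join("out.csv"))?;
    writer.json(dir.join("out.jsonl")) // JSON lines output
}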
pub fn save(&self, path: impl AsRef<Path>) -> Result<(), PolarsError>

Write to path. Overwrite replaces any existing data; Append reads the existing data (if any), concatenates the new rows, and writes the result. With partition_by, path is treated as a directory: each partition is written to path/col1=val1/col2=val2/…, with the partition columns omitted from the data files (Spark/Hive style).
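A sketch of append semantics (WriteMode::Append is an assumed variant name, matching the behavior described above):

fn append_csv(
    writer: DataFrameWriter<'_>,
    path: &std::path::Path,
) -> Result<(), PolarsError> {
    // Per the description above, append reads any existing data at `path`,
    // concatenates the new rows, and rewrites the result.
    writer
        .mode(WriteMode::Append)
        .format(WriteFormat::Csv)
        .save(path)
}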
Auto Trait Implementations
impl<'a> Freeze for DataFrameWriter<'a>
impl<'a> !RefUnwindSafe for DataFrameWriter<'a>
impl<'a> Send for DataFrameWriter<'a>
impl<'a> Sync for DataFrameWriter<'a>
impl<'a> Unpin for DataFrameWriter<'a>
impl<'a> !UnwindSafe for DataFrameWriter<'a>
Blanket Implementations

impl<T> BorrowMut<T> for T
where
    T: ?Sized,

fn borrow_mut(&mut self) -> &mut T

Mutably borrows from an owned value.
impl<T> IntoEither for T

fn into_either(self, into_left: bool) -> Either<Self, Self>

Converts self into a Left variant of Either<Self, Self> if into_left is true. Converts self into a Right variant of Either<Self, Self> otherwise.

fn into_either_with<F>(self, into_left: F) -> Either<Self, Self>

Converts self into a Left variant of Either<Self, Self> if into_left(&self) returns true. Converts self into a Right variant of Either<Self, Self> otherwise.