# rust-data-processing

Rust library: schema-first ingestion (CSV, JSON, Parquet, Excel with Cargo features) into an in-memory DataSet, plus Polars-backed pipelines, optional SQL, profiling, validation, and map/reduce-style processing.
Infographic: Phase 1 — single-node, library-first flow (ingest → DataSet, pipelines, SQL, profile, validate, outliers, transforms, parallel execution, PyO3 bindings, optional chatbot / notebook story).
Limits (masking / “PII”): UTF-8 transforms and validation checks are mechanical helpers only; callers supply policy and must not treat outputs as legal guarantees. See Planning/P2_E6_PRIVACY_POLICY.md in the repository.
This file is the crate README shown on crates.io and at the top of docs.rs (Rust-only). The repository’s README.md is the full monorepo overview (including Python).
## Documentation
| Resource | Where to find it |
|---|---|
| Rust API (module tree) | Use the crate index on this docs.rs page (left sidebar). |
| Repository | github.com/vihangdesai2018-png/rust-data-processing |
| Markdown API overview | API.md (shipped in this crate) |
| Rust examples & cookbook | docs/rust/README.md |
| HTML site (Rust + Python pages) | GitHub Pages — use Rust (rustdoc) for this crate; setup if the site is empty. |
## Quick start (Rust)
```rust
// Sketch only: the item paths and signatures below are assumptions
// reconstructed from context; see docs/rust/README.md for working examples.
use rust_data_processing::Schema;            // path is an assumption
use rust_data_processing::ingest_from_path;  // path is an assumption

let schema = Schema::new(/* column definitions */);
let _ds = ingest_from_path("data.csv", &schema)
    .expect("ingestion failed");
```
More patterns: docs/rust/README.md.
## Features (Cargo)
- `default`: includes `sql` (Polars-backed SQL via `polars-sql`).
- `excel`: Excel workbook ingestion (`calamine`).
- `sql`: Polars SQL (on by default; use `default-features = false` to drop).
- `db_connectorx`: optional DB → Arrow → `DataSet`.
- `arrow` / `serde_arrow`: Arrow interop helpers.
Full list: Cargo.toml [features].
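As a sketch, a consumer that wants Excel ingestion but not the default SQL feature might declare the dependency like this (the version is a placeholder, not a real release number):

```toml
[dependencies]
# default-features = false drops the `sql` feature; `excel` re-enables workbook ingestion.
rust-data-processing = { version = "*", default-features = false, features = ["excel"] }
```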
## License
MIT OR Apache-2.0; see LICENSE-MIT and LICENSE-APACHE.