LOWESS Project
The fastest, most robust, and most feature-complete language-agnostic LOWESS (Locally Weighted Scatterplot Smoothing) implementation for Rust, Python, and R.
[!IMPORTANT]
The lowess-project contains a complete ecosystem for LOWESS smoothing:
- `lowess` - Core single-threaded Rust implementation with `no_std` support
- `fastLowess` - Parallel CPU- and GPU-accelerated Rust wrapper with ndarray integration
- Python bindings - PyO3-based Python package
- R bindings - extendr-based R package
LOESS vs. LOWESS
| Feature | LOESS | LOWESS (This Crate) |
|---|---|---|
| Polynomial Degree | Linear, Quadratic, Cubic, Quartic | Linear (Degree 1) |
| Dimensions | Multivariate (n-D support) | Univariate (1-D only) |
| Flexibility | High (Distance metrics) | Standard |
| Complexity | Higher (Matrix inversion) | Lower (Weighted average/slope) |
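The complexity row can be made concrete: a LOWESS fit at a point `x0` reduces to a weighted average/slope over the nearest neighbours, with no matrix inversion. A minimal sketch in plain Python (illustrative only, not this crate's code; `tricube` and `lowess_point` are hypothetical helpers):

```python
def tricube(u):
    """Tricube kernel: (1 - |u|^3)^3 for |u| < 1, else 0."""
    u = abs(u)
    return (1 - u**3) ** 3 if u < 1 else 0.0

def lowess_point(x, y, x0, frac=0.5):
    """Estimate y at x0 with one locally weighted linear (degree-1) fit."""
    n = len(x)
    k = max(2, int(frac * n))                      # neighbourhood size
    idx = sorted(range(n), key=lambda i: abs(x[i] - x0))[:k]
    h = max(abs(x[i] - x0) for i in idx) or 1.0    # local bandwidth
    w = [tricube((x[i] - x0) / h) for i in idx]
    sw = sum(w)
    mx = sum(wi * x[i] for wi, i in zip(w, idx)) / sw
    my = sum(wi * y[i] for wi, i in zip(w, idx)) / sw
    # weighted least-squares slope: closed form, no matrices needed
    num = sum(wi * (x[i] - mx) * (y[i] - my) for wi, i in zip(w, idx))
    den = sum(wi * (x[i] - mx) ** 2 for wi, i in zip(w, idx))
    b = num / den if den else 0.0
    return my + b * (x0 - mx)
```

On exactly linear data the local fit is exact, which is a handy sanity check.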
[!TIP]
For a LOESS implementation, use loess-project.
Documentation
[!NOTE]
📚 View the full documentation
Why this package?
Speed
The lowess project crushes the competition in terms of speed, whether in single-threaded or multi-threaded parallel execution.
Speedup relative to Python's statsmodels.lowess (higher is better):
| Category | statsmodels (time) | R (stats) | Serial | Parallel | GPU |
|---|---|---|---|---|---|
| Clustered | 163ms | 83× | 203× | 433× | 32× |
| Constant Y | 134ms | 92× | 212× | 410× | 18× |
| Delta (large–none) | 105ms | 2× | 4× | 6× | 16× |
| Extreme Outliers | 489ms | 106× | 201× | 388× | 29× |
| Financial (500–10K) | 106ms | 105× | 252× | 293× | 12× |
| Fraction (0.05–0.67) | 221ms | 104× | 228× | 391× | 22× |
| Genomic (1K–50K) | 1833ms | 7× | 9× | 20× | 95× |
| High Noise | 435ms | 133× | 134× | 375× | 32× |
| Iterations (0–10) | 204ms | 115× | 224× | 386× | 18× |
| Scale (1K–50K) | 1841ms | 264× | 487× | 581× | 98× |
| Scientific (500–10K) | 167ms | 109× | 205× | 314× | 15× |
| Scale Large* (100K–2M) | — | — | 1× | 1.4× | 0.3× |
*Scale Large benchmarks are relative to Serial (statsmodels cannot handle these sizes)
Each number is the average over that category's scenarios (e.g., Delta spans none, small, medium, and large).
Robustness
This implementation is more robust than R's lowess and Python's statsmodels due to two key design choices:
MAD-Based Scale Estimation:
For robustness weight calculations, this crate uses Median Absolute Deviation (MAD) for scale estimation:
s = median(|r_i - median(r)|)
In contrast, statsmodels and R's lowess use the median of absolute residuals (MAR):
s = median(|r_i|)
- MAD is a breakdown-point-optimal estimator: it remains valid even when up to 50% of the data are outliers.
- The median-centering step removes asymmetric bias from residual distributions.
- MAD provides consistent outlier detection regardless of whether residuals are centered around zero.
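The difference is easy to see on residuals with a common positive offset. A quick sketch in plain Python (illustrative only; `mar` and `mad` are hypothetical helpers, not this crate's code):

```python
import statistics

def mar(residuals):
    """Median of absolute residuals, as used by statsmodels and R's lowess."""
    return statistics.median(abs(r) for r in residuals)

def mad(residuals):
    """Median absolute deviation about the median (median-centered scale)."""
    m = statistics.median(residuals)
    return statistics.median(abs(r - m) for r in residuals)

# All residuals shifted positive, plus one extreme outlier:
residuals = [0.8, 0.9, 1.0, 1.1, 1.2, 50.0]
# MAR folds the common offset into the "scale" (1.05); MAD measures
# spread about the median (0.15), so outlier detection stays sharp.
```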
Boundary Padding:
This crate offers a choice of boundary policies at dataset edges:
- Extend: Repeats edge values to maintain local neighborhood size.
- Reflect: Mirrors data symmetrically around boundaries.
- Zero: Pads with zeros (useful for signal processing).
- NoBoundary: Original Cleveland behavior (no padding).
statsmodels and R's lowess do not apply boundary padding, which can lead to:
- Biased estimates near boundaries due to asymmetric local neighborhoods.
- Increased variance at the edges of the smoothed curve.
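The padding policies above can be sketched in a few lines of plain Python (a hypothetical `pad` helper, not this crate's API):

```python
def pad(y, k, policy="extend"):
    """Pad a 1-D series with k values on each side per a boundary policy."""
    if policy == "extend":        # repeat edge values
        left, right = [y[0]] * k, [y[-1]] * k
    elif policy == "reflect":     # mirror around each boundary point
        left, right = y[1:k + 1][::-1], y[-k - 1:-1][::-1]
    elif policy == "zero":        # zeros, as in signal processing
        left, right = [0.0] * k, [0.0] * k
    else:                         # "none": original Cleveland behavior
        return list(y)
    return left + list(y) + right
```

With padding in place, every point near an edge still sees a full, roughly symmetric neighborhood, which is what removes the boundary bias described above.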
Features
The package offers a variety of features supporting a wide range of use cases:
| Feature | This package | statsmodels | R (stats) |
|---|---|---|---|
| Kernel | 7 options | only Tricube | only Tricube |
| Robustness Weighting | 3 options | only Bisquare | only Bisquare |
| Scale Estimation | 2 options | only MAR | only MAR |
| Boundary Padding | 4 options | no padding | no padding |
| Zero Weight Fallback | 3 options | no | no |
| Auto Convergence | yes | no | no |
| Online Mode | yes | no | no |
| Streaming Mode | yes | no | no |
| Confidence Intervals | yes | no | no |
| Prediction Intervals | yes | no | no |
| Cross-Validation | 2 options | no | no |
| Parallel Execution | yes | no | no |
| GPU Acceleration | yes* | no | no |
| no_std Support | yes | no | no |
* GPU acceleration is currently in beta and may not be available on all platforms.
Validation
All implementations are numerical twins of R's lowess:
| Aspect | Status | Details |
|---|---|---|
| Accuracy | ✅ EXACT MATCH | Max diff < 1e-12 across all scenarios |
| Consistency | ✅ PERFECT | Multiple scenarios pass with strict tolerance |
| Robustness | ✅ VERIFIED | Robust smoothing matches R exactly |
Installation
Currently available for R, Python, and Rust:
R (from R-universe, recommended):
Python (from PyPI):
Or from conda-forge:
Rust (lowess, no_std compatible):
[dependencies]
lowess = "0.99"
Rust (fastLowess, parallel + GPU):
[dependencies]
fastLowess = { version = "0.99", features = ["cpu"] }
Quick Example
R:
# Illustrative sketch; exact binding names may differ.
x <- seq(0, 10, length.out = 100)
y <- sin(x) + rnorm(100, sd = 0.2)
result <- lowess_fit(x, y, fraction = 0.5)  # hypothetical function name
Python:
# Illustrative sketch; exact binding names may differ.
import lowess  # hypothetical module name
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]
result = lowess.smooth(x, y, fraction=0.5)  # hypothetical function name
Rust:
use lowess::prelude::*;  // module path is illustrative

let x = vec![1.0, 2.0, 3.0, 4.0, 5.0];
let y = vec![2.1, 3.9, 6.2, 8.1, 9.8];
let model = Lowess::new()  // builder name is illustrative
    .fraction(0.5)
    .iterations(3)
    .adapter(Batch)
    .build()?;
let result = model.fit(&x, &y)?;
println!("{:?}", result.y);
API Reference
Rust builder (the R and Python bindings expose equivalent options):
new
.fraction // Smoothing span (0, 1]
.iterations // Robustness iterations
.delta // Interpolation threshold
.weight_function // Kernel selection
.robustness_method
.zero_weight_fallback
.boundary_policy
.confidence_intervals
.prediction_intervals
.return_diagnostics
.return_residuals
.return_robustness_weights
.cross_validate
.auto_converge
.adapter // or Streaming, Online
.parallel // fastLowess only
.backend // fastLowess only: CPU or GPU
.build?;
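The `.delta` knob above corresponds to Cleveland's interpolation shortcut: the local regression is only computed at anchor points spaced more than `delta` apart, and the points in between are filled by linear interpolation. A rough sketch of the idea in plain Python (`apply_delta` is a hypothetical helper, not this crate's internals):

```python
def apply_delta(xs, fit_at, delta):
    """Fit only at anchors spaced more than `delta` apart; interpolate
    the skipped points linearly. `xs` must be sorted ascending."""
    ys = [None] * len(xs)
    last = 0
    ys[0] = fit_at(xs[0])
    for i in range(1, len(xs)):
        if xs[i] - xs[last] > delta or i == len(xs) - 1:
            ys[i] = fit_at(xs[i])       # full local fit at this anchor
            for j in range(last + 1, i):
                t = (xs[j] - xs[last]) / (xs[i] - xs[last])
                ys[j] = ys[last] + t * (ys[i] - ys[last])
            last = i
    return ys
```

Because the smoothed curve is locally near-linear, interpolation over short spans loses little accuracy while skipping most of the expensive fits.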
Result Structure
R:
result$x, result$y, result$standard_errors
result$confidence_lower, result$confidence_upper
result$prediction_lower, result$prediction_upper
result$residuals, result$robustness_weights
result$diagnostics, result$iterations_used
result$fraction_used, result$cv_scores
Python:
result.x, result.y, result.standard_errors
result.confidence_lower, result.confidence_upper
result.prediction_lower, result.prediction_upper
result.residuals, result.robustness_weights
result.diagnostics, result.iterations_used
result.fraction_used, result.cv_scores
Rust: the result struct exposes the same fields as listed above for R.
Contributing
Contributions are welcome! Please see the CONTRIBUTING.md file for more information.
License
Licensed under either of:
- Apache License, Version 2.0 (LICENSE-APACHE or https://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or https://opensource.org/licenses/MIT)
at your option.
References
- Cleveland, W.S. (1979). "Robust Locally Weighted Regression and Smoothing Scatterplots". JASA.
- Cleveland, W.S. (1981). "LOWESS: A Program for Smoothing Scatterplots". The American Statistician.
Citation
If you use this software in your research, please cite it using the CITATION.cff file or the BibTeX entry below: