Raw trait implementations for rug::Float and rug::Complex.
§Arbitrary-Precision Floating-Point Raw Implementations
This module provides raw trait implementations for the rug backend,
enabling arbitrary-precision floating-point arithmetic with configurable precision.
§Purpose and Role
The primary role of this module is to implement the core raw traits (RawScalarTrait, RawRealTrait, etc.) for rug::Float and rug::Complex. These implementations provide the computational foundations that validated wrappers build upon.
§Precision Model
Unlike the native f64 backend, which has a fixed 53-bit precision, the rug backend uses const-generic precision:
- Precision is specified in bits (e.g., 100, 200, 500)
- All operations preserve the precision of their operands
- Precision mismatches are detected during validation
§MPFR Constant Optimization
For mathematical constants, this module uses MPFR’s precomputed constants when available, providing significant performance improvements:
- rug::float::Constant::Pi: ~10x faster than computing acos(-1)
- rug::float::Constant::Log2: ~10x faster than computing ln(2)
§Memory and Performance Characteristics
- rug::Float and rug::Complex are heap-allocated, non-Copy types
- All operations support reference-based variants to minimize cloning
- Memory usage scales with precision (approximately precision/8 bytes per value)