Module bf16

Available on crate feature f16 only.

Brain Floating Point implementation, a 16-bit type used in machine learning.

bf16 is meant as an interchange format, so using it in fast-path algorithms may introduce rounding error. Since there are no native operations using bf16, this is of minimal concern.
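To illustrate why bf16 works well as an interchange format, the sketch below converts between f32 and raw bfloat16 bits using only the standard library: bfloat16 is simply the top 16 bits of an IEEE 754 f32 (1 sign, 8 exponent, 7 mantissa bits), so conversion is a truncation with rounding. This is a minimal illustration of the format itself, not the crate's bf16 type; the function names are hypothetical.

```rust
// Sketch of bfloat16 as a storage format: keep the high 16 bits of an
// f32, rounding to nearest (ties to even) on the discarded low bits.
// Simplified: NaN payloads are not specially preserved here.
fn f32_to_bf16_bits(x: f32) -> u16 {
    let bits = x.to_bits();
    let round_bit = (bits >> 16) & 1; // low bit of the kept mantissa
    let rounded = bits.wrapping_add(0x7FFF + round_bit);
    (rounded >> 16) as u16
}

// Widening back to f32 is exact: just restore the zeroed low bits.
fn bf16_bits_to_f32(h: u16) -> f32 {
    f32::from_bits((h as u32) << 16)
}

fn main() {
    let x = 1.0f32 / 3.0;
    let h = f32_to_bf16_bits(x);
    let y = bf16_bits_to_f32(h);
    // y keeps f32's full exponent range but only ~7 mantissa bits
    println!("f32: {x}, via bf16: {y}");
}
```

Because bf16 shares f32's exponent width, round-tripping never overflows or underflows; only mantissa precision is lost, which is the rounding error the paragraph above refers to.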

Structs

bf16
A 16-bit floating point type implementing the bfloat16 format.