# native_neural_network_std

## Purpose

native_neural_network_std is a std-oriented, ergonomic wrapper around the native_neural_network crate. It provides owned types and helper utilities for users of the Rust standard library while fully leveraging the underlying no_std core.
## Prerequisites

- Rust stable (latest recommended)
- A `.rnnmodel` file to load
- No UI included; no pre-trained models bundled
## Installation

Add via cargo:
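Assuming the crate is published on crates.io under the same name as this repository, the dependency entry would look like the following (the version number is illustrative, not confirmed):

```toml
[dependencies]
# Version is illustrative; check crates.io for the current release.
native_neural_network_std = "0.1"
```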
Import in code:

```rust
use native_neural_network_std::ModelStd;
```
## Minimal Workflow

- Load a `.rnnmodel` via `ModelStd::from_file()`.
- Inspect model metadata for buffer requirements.
- Allocate caller-owned input/output buffers.
- Prepare input tensors using `TensorStd`.
- Execute inference with `NeuralNetworkStd` or `RnnStd`.
- Retrieve outputs from the output buffer.
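The steps above can be sketched as pseudocode in Rust syntax. The type and method names (`ModelStd::from_file`, `TensorStd`, `NeuralNetworkStd::new`, `forward`) come from this document, but the metadata accessors, constructor arguments, and buffer shapes shown here are assumptions, not the crate's confirmed API:

```rust
// Pseudocode sketch -- signatures are illustrative, not the actual API.
use native_neural_network_std::{ModelStd, NeuralNetworkStd, TensorStd};

fn run_inference() -> Result<(), Box<dyn std::error::Error>> {
    // 1. Load the model from disk.
    let model = ModelStd::from_file("model.rnnmodel")?;

    // 2. Inspect metadata to size the caller-owned buffers.
    //    `input_len`/`output_len` are hypothetical accessor names.
    let input_buf = vec![0.0f32; model.input_len()];
    let mut output_buf = vec![0.0f32; model.output_len()];

    // 3. Wrap the input buffer in a tensor (hypothetical constructor)
    //    and run the forward pass into the caller-owned output buffer.
    let input = TensorStd::from_slice(&input_buf);
    let mut nn = NeuralNetworkStd::new(&model);
    nn.forward(&input, &mut output_buf);

    // 4. Read results from `output_buf`.
    Ok(())
}
```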
## Accessing Metadata
- Inspect layer counts, weight/bias arrays.
- Retrieve kernel-level execution plans.
- Export internal structures for analysis or integration.
## What this Quickstart Does Not Cover
- Training loops and optimizer tuning
- Low-level profiling or quantization pipelines
- FFI integration details (see main README)
- Advanced visualization or external tooling
## Philosophy
- Modular, deterministic building blocks for inference
- Predictable memory usage, fully owned APIs
- Designed for easy wrapping and multi-language integration
- 100% Clippy clean
## Example Usage (abridged)

```rust
// Names follow the workflow above; exact arguments and signatures may differ.
use native_neural_network_std::{ModelStd, NeuralNetworkStd};

let model = ModelStd::from_file("model.rnnmodel")?;
let mut nn = NeuralNetworkStd::new(&model);
nn.forward(&input, &mut output);
```
For detailed examples, see the `examples/` folder.
## Documentation

For full low-level details, refer to the native_neural_network documentation: https://docs.rs/native_neural_network
## License
MIT