native_neural_network_std 0.1.5

Ergonomic std wrapper for the `native_neural_network` crate (no_std) — std-friendly re-exports and utilities.

Purpose

native_neural_network_std is a std-oriented ergonomic wrapper around the native_neural_network crate. It provides owned types and helper utilities for applications built on the Rust standard library, while delegating all computation to the underlying no_std core.

Prerequisites

  • Rust stable (latest recommended)
  • A `.rnn` model file
  • No UI included; no pre-trained models bundled

Installation

Add via cargo:

cargo add native_neural_network_std

Import in code:

use native_neural_network_std::ModelStd;

Minimal Workflow

  1. Load a .rnn model via ModelStd::from_file().
  2. Inspect model metadata for buffer requirements.
  3. Allocate caller-owned input/output buffers.
  4. Prepare input tensors using TensorStd.
  5. Execute inference with NeuralNetworkStd or RnnStd.
  6. Retrieve outputs from the output buffer.
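Steps 2 and 3 can be sketched with plain std types. Note that the `ModelMeta` struct and its field names below are illustrative stand-ins for this sketch, not the crate's real metadata API:

```rust
// Illustrative stand-in for the metadata a loaded .rnn model exposes;
// the real type and accessors live in native_neural_network_std.
struct ModelMeta {
    input_len: usize,
    output_len: usize,
}

// Step 3 of the workflow: allocate caller-owned input/output buffers
// sized exactly as the model metadata requires.
fn alloc_buffers(meta: &ModelMeta) -> (Vec<f32>, Vec<f32>) {
    (vec![0.0; meta.input_len], vec![0.0; meta.output_len])
}

fn main() {
    let meta = ModelMeta { input_len: 4, output_len: 2 };
    let (input_buffer, mut output_buffer) = alloc_buffers(&meta);

    // Steps 4-6 would fill `input_buffer`, run the forward pass, and read
    // results back out of `output_buffer`.
    output_buffer.fill(0.0);
    println!("in={} out={}", input_buffer.len(), output_buffer.len());
}
```

Because the buffers are owned by the caller, memory usage stays predictable and no allocation happens inside the inference path.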

Accessing Metadata

  • Inspect layer counts, weight/bias arrays.
  • Retrieve kernel-level execution plans.
  • Export internal structures for analysis or integration.
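As an example of what exported structures enable, a total parameter count is a simple fold over the per-layer weight and bias arrays. The nested-`Vec` layout here is an assumption for illustration; the crate's actual export types may differ:

```rust
// Count total parameters given per-layer weight and bias arrays.
// The Vec<Vec<f32>> layout is illustrative, not the crate's real export type.
fn param_count(weights: &[Vec<f32>], biases: &[Vec<f32>]) -> usize {
    weights.iter().map(|w| w.len()).sum::<usize>()
        + biases.iter().map(|b| b.len()).sum::<usize>()
}

fn main() {
    // A toy two-layer network: 2x3 and 3x1 weight matrices, flattened.
    let weights = vec![vec![0.0f32; 6], vec![0.0f32; 3]];
    let biases = vec![vec![0.0f32; 3], vec![0.0f32; 1]];
    println!("{}", param_count(&weights, &biases)); // prints 13
}
```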

What this Quickstart Does Not Cover

  • Training loops and optimizer tuning
  • Low-level profiling or quantization pipelines
  • FFI integration details (see main README)
  • Advanced visualization or external tooling

Philosophy

  • Modular, deterministic building blocks for inference
  • Predictable memory usage, fully owned APIs
  • Designed for easy wrapping and multi-language integration
  • 100% Clippy clean

Example Usage (abridged)

use native_neural_network_std::{ModelStd, NeuralNetworkStd};

// Load the model from disk.
let model = ModelStd::from_file("/tmp/sample.rnn");
let mut nn = NeuralNetworkStd::new(&model);

// `input_buffer` and `output_buffer` are caller-owned buffers, sized from
// the model's metadata as described in the Minimal Workflow section.
nn.forward(&input_buffer, &mut output_buffer);

For detailed examples, see the examples/ folder.

Documentation

For full low-level details, refer to native_neural_network documentation: https://docs.rs/native_neural_network

License

MIT