§MNN Rust Bindings
Safe Rust bindings for Alibaba’s MNN (Mobile Neural Network) inference engine.
MNN is a highly efficient and lightweight deep learning inference framework. This crate provides idiomatic Rust bindings for running inference with MNN.
§Features
- Safe API: All MNN operations are wrapped in safe Rust types
- Multiple Backends: CPU, CUDA, OpenCL, Vulkan, Metal support
- Async Support: Optional async API using tokio
- Cross-Platform: Windows, Linux, macOS, Android, iOS support
§Quick Start
use mnn_rs::{Interpreter, ScheduleConfig, BackendType};

fn main() -> Result<(), mnn_rs::MnnError> {
    // Load a model
    let interpreter = Interpreter::from_file("model.mnn")?;

    // Create a session
    let config = ScheduleConfig::new()
        .backend(BackendType::CPU)
        .num_threads(4);
    let mut session = interpreter.create_session(config)?;

    // Get the input tensor
    let input = session.get_input(None)?;
    // Fill the input with data (example)
    // input.write(&my_data)?;

    // Run inference
    session.run()?;

    // Get the output tensor
    let output = session.get_output(None)?;
    Ok(())
}
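The commented-out `input.write(&my_data)` step is where preprocessing happens. For image models this usually means converting an HWC byte image into a channel-major (NCHW) f32 buffer. A minimal sketch of that layout conversion in plain Rust — `hwc_to_nchw_f32` is a hypothetical helper, not a crate API, and the resulting buffer is what you might pass to something like `input.write(&buf)`:

```rust
/// Flatten an HWC (height, width, channels) u8 image into a CHW f32
/// buffer, scaling each value (e.g. scale = 1.0 / 255.0 to normalize).
fn hwc_to_nchw_f32(pixels: &[u8], h: usize, w: usize, c: usize, scale: f32) -> Vec<f32> {
    assert_eq!(pixels.len(), h * w * c);
    let mut out = vec![0.0f32; c * h * w];
    for y in 0..h {
        for x in 0..w {
            for ch in 0..c {
                // source index in HWC layout, destination index in CHW layout
                out[ch * h * w + y * w + x] = pixels[(y * w + x) * c + ch] as f32 * scale;
            }
        }
    }
    out
}

fn main() {
    // 1x2 image, 3 channels: pixel0 = (10, 20, 30), pixel1 = (40, 50, 60)
    let img = [10u8, 20, 30, 40, 50, 60];
    let buf = hwc_to_nchw_f32(&img, 1, 2, 3, 1.0);
    // Channel-major order: all R values, then all G, then all B.
    assert_eq!(buf, vec![10.0, 40.0, 20.0, 50.0, 30.0, 60.0]);
}
```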
§Backend Configuration
use mnn_rs::{ScheduleConfig, BackendType, MemoryMode, PrecisionMode};

// CPU with custom settings
let cpu_config = ScheduleConfig::new()
    .backend(BackendType::CPU)
    .num_threads(8)
    .memory_mode(MemoryMode::Low);

// GPU (auto-detect the best available backend)
let gpu_config = ScheduleConfig::new()
    .backend(BackendType::Auto)
    .precision_mode(PrecisionMode::Low);
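When you want an explicit fallback order rather than `BackendType::Auto`, one option is to probe what the system offers (the crate exposes `available_backends` / `is_backend_available` for this) and take the first match. A sketch of that selection logic in plain Rust — the enum here only mirrors a few of the documented `BackendType` variants, and `pick_backend` is a hypothetical helper, not a crate API:

```rust
// Local stand-in mirroring a subset of the documented BackendType variants.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum Backend {
    Cpu,
    Cuda,
    Vulkan,
    Metal,
}

/// Return the first preferred backend that is actually available,
/// falling back to CPU (which is always present).
fn pick_backend(preferred: &[Backend], available: &[Backend]) -> Backend {
    preferred
        .iter()
        .copied()
        .find(|b| available.contains(b))
        .unwrap_or(Backend::Cpu)
}

fn main() {
    let available = [Backend::Cpu, Backend::Vulkan];
    // CUDA is preferred but unavailable, so Vulkan wins.
    let choice = pick_backend(&[Backend::Cuda, Backend::Vulkan], &available);
    assert_eq!(choice, Backend::Vulkan);
}
```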
§Async API (requires “async” feature)
use mnn_rs::{AsyncInterpreter, ScheduleConfig};

#[tokio::main]
async fn main() -> Result<(), mnn_rs::MnnError> {
    let interpreter = AsyncInterpreter::from_file("model.mnn").await?;
    let mut session = interpreter.create_session(ScheduleConfig::default()).await?;
    session.run_async().await?;
    Ok(())
}

Re-exports§
pub use mnn_rs_sys;
Modules§
- prelude - Prelude for common MNN types.
Structs§
- BackendCapabilities - Backend capabilities.
- BackendConfig - Configuration for a compute backend.
- Interpreter - A model interpreter that holds a loaded neural network model.
- ScheduleConfig - Schedule configuration for creating sessions.
- ScheduleConfigBuilder - Builder for creating schedule configurations.
- Session - An inference session.
- SessionGuard - A guard for ensuring session resources are properly managed.
- Tensor - A multi-dimensional array for neural network operations.
- TensorInfo - Information about a tensor’s shape and type.
- TensorView - A view into a tensor’s data without ownership.
Enums§
- BackendType - Compute backend type.
- DataFormat - Data format for tensors.
- DataType - Data type for tensor elements.
- Error - Main error type for MNN operations.
- MemoryMode - Memory usage mode.
- MnnError - Main error type for MNN operations.
- PowerMode - Power usage mode.
- PrecisionMode - Precision mode for inference.
- SessionMode - Session mode for controlling interpreter behavior.
Traits§
- TensorData - Trait for types that can be stored in a tensor.
Functions§
- available_backends - Get the list of backends available on this system.
- calculate_element_count - Calculate the total number of elements in a tensor.
- calculate_tensor_size - Calculate the size of a tensor given its shape and element size.
- convert_format - Convert between NHWC and NCHW formats.
- is_backend_available - Check if a specific backend is available.
- version - Get the MNN version string.
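The two shape helpers presumably follow the usual arithmetic: the element count is the product of the dimensions, and the byte size multiplies that by the element width. A sketch of that semantics in plain Rust — these are local illustrative functions, not the crate's implementations of `calculate_element_count` / `calculate_tensor_size`:

```rust
/// Total number of elements in a tensor: the product of its dimensions.
fn element_count(shape: &[usize]) -> usize {
    shape.iter().product()
}

/// Tensor size in bytes: element count times the size of one element.
fn tensor_size(shape: &[usize], elem_size: usize) -> usize {
    element_count(shape) * elem_size
}

fn main() {
    let shape = [1, 3, 224, 224]; // a typical NCHW image batch
    assert_eq!(element_count(&shape), 150_528);
    // f32 elements are 4 bytes each.
    assert_eq!(tensor_size(&shape, std::mem::size_of::<f32>()), 602_112);
}
```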
Type Aliases§
- MnnResult
- Result type alias for MNN operations.