Crate nnl


§NNL - Neural Network Library

A high-performance neural network library for Rust with comprehensive GPU and CPU support.

§Features

  • Multi-backend Support: Vulkan and optimized CPU execution
  • Automatic Hardware Detection: Seamlessly selects the best available compute backend (see the sketch after this list)
  • Multiple Training Methods: Backpropagation, Newton’s method, and advanced optimizers
  • Flexible Architecture: Support for both linear and convolutional networks
  • Model Persistence: Import/export trained models to disk
  • Production Ready: Zero-copy operations, SIMD optimizations, and batched processing
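
Backend selection is automatic by default, and the device module provides the abstraction layer behind it. The sketch below shows how an explicit choice might look; the Device constructors and the .device() builder hook are illustrative assumptions, not the confirmed API.

use nnl::prelude::*;

// Hypothetical explicit backend selection (constructor names are assumptions);
// normally the best available backend is detected automatically.
let device = Device::cpu()?; // or a GPU backend such as Vulkan, if available

let network = NetworkBuilder::new()
    .add_layer(LayerConfig::Dense {
        input_size: 2,
        output_size: 1,
        activation: Activation::Sigmoid,
    })
    .loss(LossFunction::MeanSquaredError)
    .optimizer(OptimizerConfig::Adam { learning_rate: 0.001 })
    .device(device) // assumed builder hook; the real method name may differ
    .build()?;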

§Quick Start

use nnl::prelude::*;

// Create a simple network
let mut network = NetworkBuilder::new()
    .add_layer(LayerConfig::Dense {
        input_size: 2,
        output_size: 4,
        activation: Activation::ReLU,
    })
    .add_layer(LayerConfig::Dense {
        input_size: 4,
        output_size: 1,
        activation: Activation::Sigmoid,
    })
    .loss(LossFunction::MeanSquaredError)
    .optimizer(OptimizerConfig::Adam { learning_rate: 0.001 })
    .build()?;

// Train the network on the XOR truth table
let inputs = Tensor::from_slice(&[0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 1.0, 1.0], &[4, 2])?;
let targets = Tensor::from_slice(&[0.0, 1.0, 1.0, 0.0], &[4, 1])?;

network.train(&inputs, &targets, 1000)?;

// Make predictions
let prediction = network.forward(&Tensor::from_slice(&[1.0, 0.0], &[1, 2])?)?;
println!("Prediction: {}", prediction);

Re-exports§

pub use error::NnlError;
pub use error::Result;
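
Most fallible operations return the re-exported Result alias, so failures surface as NnlError values. A brief sketch, assuming Result<T> defaults its error type to NnlError and that NnlError implements Display (both conventional, but not verified here):

use nnl::prelude::*;

// A shape of [3, 1] requires 3 elements but only 2 are supplied,
// so this call is expected to fail with an NnlError.
if let Err(e) = Tensor::from_slice(&[1.0, 2.0], &[3, 1]) {
    eprintln!("tensor creation failed: {e}");
}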

Modules§

activations: Activation functions for neural networks
device: Device and backend abstraction layer
error: Error handling for the NNL library
io: I/O module for model serialization and persistence
layers: Neural network layers module
losses: Loss functions for neural networks
network: Neural Network module with builder pattern
optimizers: Optimizers for neural network training
prelude: Prelude module for convenient imports
tensor: Tensor module providing device-agnostic tensor operations
utils: Utility functions for the NNL library