A minimal CNN framework in Rust with INT8 and INT4 quantization support.
This crate provides building blocks for constructing and running convolutional neural networks in FP32, INT8, and INT4 precision. It includes a reference LeNet-5 implementation for MNIST digit classification.
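The INT8 path rests on standard affine quantization (map a float to an 8-bit integer via a scale and zero-point). As a minimal sketch of that idea, with hypothetical helper names that are not this crate's API:

```rust
/// Quantize a single f32 value to i8 given a scale and zero-point
/// (hypothetical illustration of affine quantization, not microcnn's API).
fn quantize_i8(x: f32, scale: f32, zero_point: i32) -> i8 {
    // Round to the nearest integer step, shift by the zero-point,
    // then clamp into the representable i8 range.
    let q = (x / scale).round() as i32 + zero_point;
    q.clamp(i8::MIN as i32, i8::MAX as i32) as i8
}

/// Map a quantized i8 value back to f32.
fn dequantize_i8(q: i8, scale: f32, zero_point: i32) -> f32 {
    (q as i32 - zero_point) as f32 * scale
}

fn main() {
    let scale = 0.05;
    let q = quantize_i8(1.0, scale, 0);
    // Values outside the representable range saturate at the i8 bounds.
    let saturated = quantize_i8(10.0, scale, 0);
    println!("q = {q}, roundtrip = {}, saturated = {saturated}",
             dequantize_i8(q, scale, 0));
}
```

The INT4 path is the same idea with a narrower range (and typically two values packed per byte).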
§Example
use microcnn::lenet::lenet;

let mut net = lenet(false);
net.load("data/lenet.raw");

Re-exports§
Modules§
- arc: Quantization utilities and model architecture (LeNet).
- benchmark
- conv: Convolution algorithm implementations (Naive, Im2col, Winograd, FFT).
- lenet
- loader: MNIST dataset loaders.
- metrics: Benchmarking utilities for comparing FP32/INT8/INT4 performance.
- mnist
- network: Neural network layers and network types (FP32, INT8, INT4).
- quantization: Re-export at legacy paths.
- tensor: FP32, INT8, and INT4 tensors.
- tensor_i4
- tensor_i8: Alias modules so crate::tensor_i8::TensorI8 etc. still resolve.
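The naive algorithm listed under conv is the usual baseline the other variants (Im2col, Winograd, FFT) are measured against: slide the kernel over the input and accumulate products. A self-contained sketch of that baseline for one channel, valid padding, stride 1 (this is an illustration, not the crate's implementation):

```rust
/// Naive 2D convolution over a single channel stored row-major:
/// valid padding, stride 1, so the output is (h - k + 1) x (w - k + 1).
/// Hypothetical illustration of the baseline, not microcnn's conv module.
fn conv2d_naive(input: &[f32], h: usize, w: usize, kernel: &[f32], k: usize) -> Vec<f32> {
    let oh = h - k + 1;
    let ow = w - k + 1;
    let mut out = vec![0.0f32; oh * ow];
    for oy in 0..oh {
        for ox in 0..ow {
            // Dot product of the kernel with the window at (oy, ox).
            let mut acc = 0.0f32;
            for ky in 0..k {
                for kx in 0..k {
                    acc += input[(oy + ky) * w + (ox + kx)] * kernel[ky * k + kx];
                }
            }
            out[oy * ow + ox] = acc;
        }
    }
    out
}

fn main() {
    // 3x3 input, 2x2 kernel -> 2x2 output.
    let input = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0];
    let kernel = [1.0, 0.0, 0.0, 1.0]; // top-left + bottom-right of each window
    let out = conv2d_naive(&input, 3, 3, &kernel, 2);
    println!("{out:?}"); // [6.0, 8.0, 12.0, 14.0]
}
```

Im2col trades memory for speed by unrolling each window into a matrix row so the whole convolution becomes one matrix multiply; Winograd and FFT reduce the multiplication count further for suitable kernel sizes.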