Enhanced scirs2-neural implementation providing core activation functions, layers, loss functions, and training utilities
Re-exports§
pub use activations_minimal::Activation;
pub use activations_minimal::ReLU;
pub use activations_minimal::Sigmoid;
pub use activations_minimal::Softmax;
pub use activations_minimal::Tanh;
pub use activations_minimal::GELU;
pub use error::Error;
pub use error::NeuralError;
pub use error::Result;
pub use layers::BatchNorm;
pub use layers::Conv2D;
pub use layers::Dense;
pub use layers::Dropout;
pub use layers::Layer;
pub use layers::LayerNorm;
pub use layers::Sequential;
pub use layers::LSTM;
pub use losses::ContrastiveLoss;
pub use losses::CrossEntropyLoss;
pub use losses::FocalLoss;
pub use losses::Loss;
pub use losses::MeanSquaredError;
pub use losses::TripletLoss;
pub use training::TrainingConfig;
pub use training::TrainingSession;
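The activation types re-exported above (ReLU, Sigmoid, Tanh, GELU, Softmax) correspond to standard functions whose math can be sketched in plain Rust. The snippet below is a self-contained illustration of those formulas only; it is not the crate's API, and the free-function names (`relu`, `sigmoid`, etc.) are chosen here for illustration.

```rust
/// Rectified Linear Unit: max(0, x).
fn relu(x: f64) -> f64 {
    x.max(0.0)
}

/// Logistic sigmoid: 1 / (1 + e^{-x}).
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

/// GELU, tanh approximation:
/// 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3))).
fn gelu(x: f64) -> f64 {
    let inner = (2.0 / std::f64::consts::PI).sqrt() * (x + 0.044715 * x.powi(3));
    0.5 * x * (1.0 + inner.tanh())
}

/// Numerically stable softmax: subtract the max before exponentiating
/// so large inputs cannot overflow, then normalize to sum to 1.
fn softmax(xs: &[f64]) -> Vec<f64> {
    let max = xs.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = xs.iter().map(|&x| (x - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|e| e / sum).collect()
}

fn main() {
    assert_eq!(relu(-2.0), 0.0);
    assert_eq!(relu(3.0), 3.0);
    assert!((sigmoid(0.0) - 0.5).abs() < 1e-12);
    assert_eq!(gelu(0.0), 0.0);
    let p = softmax(&[1.0, 2.0, 3.0]);
    assert!((p.iter().sum::<f64>() - 1.0).abs() < 1e-12);
    println!("activation sanity checks passed");
}
```

Softmax is normalized across a slice rather than applied element-wise, which is why it takes `&[f64]`; the max-subtraction trick leaves the result unchanged mathematically but avoids `exp` overflow.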
Modules§
- activations_minimal - Minimal activation functions without Layer trait dependencies
- autograd - Automatic differentiation module for neural networks
- error - Error types for the neural network module
- layers - Neural network layers implementation
- losses - Loss functions for neural networks
- prelude - Working prelude with core functionality
- training - Training utilities and infrastructure
- utils - Utility functions for neural networks