//! Utility functions and types for the library.
//!
//! This module contains various utility functions and types that are used throughout the library.
//! These include activation functions, cost functions, metrics calculators, and optimizers.
//!
//! ## Activation functions
//!
//! The library provides a set of predefined activation functions that can be used in neural networks.
//! These functions are represented by the [`Act`] enum and can be used to apply specific
//! activation functions to the input data during the forward pass of a neural network.
//!
//! | Activation function | Definition |
//! |----------------------------|---------------------------------------------|
//! | [`Act::Step`] | `step(x) = 1 if x > 0 else 0` |
//! | [`Act::Sigmoid`] | `sigmoid(x) = 1 / (1 + exp(-x))` |
//! | [`Act::ReLU`] | `ReLU(x) = x if x > 0 else 0` |
//! | [`Act::Tanh`] | `tanh(x) = (1 - exp(-2x)) / (1 + exp(-2x))` |
//! | [`Act::Softmax`] | `softmax(x) = exp(x) / sum(exp(x))` |
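//!
//! As a hedged illustration of the definitions above (plain Rust written out from the table's
//! formulas, not this library's API — the real entry point is the [`Act`] enum):
//!
//! ```rust
//! // Illustrative stand-ins for the formulas in the table; the names are ours, not the library's.
//! fn sigmoid(x: f64) -> f64 { 1.0 / (1.0 + (-x).exp()) }
//! fn relu(x: f64) -> f64 { if x > 0.0 { x } else { 0.0 } }
//! fn softmax(xs: &[f64]) -> Vec<f64> {
//!     let sum: f64 = xs.iter().map(|x| x.exp()).sum();
//!     xs.iter().map(|x| x.exp() / sum).collect()
//! }
//!
//! assert_eq!(sigmoid(0.0), 0.5);       // 1 / (1 + e^0) = 1/2
//! assert_eq!(relu(-2.0), 0.0);         // negative inputs clamp to zero
//! let p = softmax(&[0.0, 0.0]);
//! assert!((p[0] - 0.5).abs() < 1e-12); // uniform inputs give uniform probabilities
//! ```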
//!
//! ## Cost functions
//!
//! The library provides a set of predefined cost functions that can be used in the training process.
//! These functions are represented by the [`Cost`] enum and can be used to measure the difference between
//! the predicted and actual values during the training process.
//!
//! | Cost function | Description |
//! |---------------|------------------------------------------------------------------------------------------------------------------|
//! | [`Cost::MSE`] | Mean Squared Error. This cost function measures the average squared difference between the predicted and actual values. |
//! | [`Cost::MAE`] | Mean Absolute Error. This cost function measures the average absolute difference between the predicted and actual values. |
//! | [`Cost::BCE`] | Binary Cross-Entropy. This cost function measures the cross-entropy between the predicted probabilities and binary (0/1) target labels. |
//! | [`Cost::CCE`] | Categorical Cross-Entropy. This cost function measures the cross-entropy between a predicted probability distribution and one-hot encoded class labels. |
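//!
//! For instance, MSE and MAE over a small prediction vector, written out from their definitions
//! (illustrative plain Rust, not the library's [`Cost`] API):
//!
//! ```rust
//! // Mean Squared Error and Mean Absolute Error by hand; names here are ours, not the library's.
//! fn mse(pred: &[f64], actual: &[f64]) -> f64 {
//!     pred.iter().zip(actual).map(|(p, a)| (p - a).powi(2)).sum::<f64>() / pred.len() as f64
//! }
//! fn mae(pred: &[f64], actual: &[f64]) -> f64 {
//!     pred.iter().zip(actual).map(|(p, a)| (p - a).abs()).sum::<f64>() / pred.len() as f64
//! }
//!
//! let pred = [1.0, 2.0, 3.0];
//! let actual = [1.0, 2.0, 5.0];
//! assert!((mse(&pred, &actual) - 4.0 / 3.0).abs() < 1e-12); // (0 + 0 + 4) / 3
//! assert!((mae(&pred, &actual) - 2.0 / 3.0).abs() < 1e-12); // (0 + 0 + 2) / 3
//! ```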
//!
//! ## Optimizers
//!
//! The library provides a set of predefined optimizers that can be used in the training process.
//! These optimizers are represented by the [`Optimizer`] enum and can be used to update the weights and biases
//! of the neural network during the training process.
//!
//! | Optimizer | Description |
//! |-----------|------------------------------------------------------------------------------------------------------------------|
//! | `GD` | Gradient Descent. This optimizer updates each weight and bias by stepping against the gradient of the loss function, scaled by the learning rate. |
//! | `Momentum`| Momentum. This optimizer accumulates an exponentially decaying velocity of past gradients and applies it to each update, smoothing and accelerating convergence. |
//! | `Adam` | Adam. This optimizer combines momentum with per-parameter adaptive learning rates derived from running estimates of the first and second moments of the gradients. |
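//!
//! A single plain gradient-descent step, written out by hand (illustrative only; in practice
//! updates are driven through the [`Optimizer`] enum):
//!
//! ```rust
//! // w ← w − lr · ∇w: step each weight against its gradient, scaled by the learning rate.
//! let lr = 0.1;
//! let mut w = vec![0.5_f64, -0.3];
//! let grad = vec![1.0, -2.0];
//! for (wi, gi) in w.iter_mut().zip(&grad) {
//!     *wi -= lr * gi;
//! }
//! assert!((w[0] - 0.4).abs() < 1e-12);    // 0.5 − 0.1·1.0
//! assert!((w[1] - (-0.1)).abs() < 1e-12); // −0.3 − 0.1·(−2.0)
//! ```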
//!
pub use OptimizerType;
pub use MSGPackFormatting;
pub use MetricsCalculator;
pub use NNUtil;
pub use *;