//! Functions which are meant to be used as activation functions by neural
//! networks' layers. See [super::neuralnet::NeuralLayer].
/// The ReLU activation function; returns `x` unless it is negative, in which
/// case it returns 0 instead.
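A minimal sketch of the function this doc comment describes, assuming the name `relu` and an `f64 -> f64` signature (both are assumptions, not taken from the source):

```rust
// Hypothetical sketch: returns `x` for non-negative input, 0.0 otherwise.
// The name `relu` and the `f64` signature are assumed.
pub fn relu(x: f64) -> f64 {
    if x < 0.0 { 0.0 } else { x }
}
```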
/// The identity, or linear, activation function; a dummy function. Not
/// recommended; use for debugging only!
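A one-line sketch of the identity function described above; the name `identity` and the `f64` signature are assumptions:

```rust
// Hypothetical sketch: the identity activation passes its input through
// unchanged. Name and signature are assumed, not from the source.
pub fn identity(x: f64) -> f64 {
    x
}
```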
/// The 'fast sigmoid' activation function. A sigmoidally shaped function that
/// should be less expensive to compute than the actual logistic function. Made
/// in China.
///
/// "Signed" version (outputs range from -1 to 1), unlike the original
/// logistic function.
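A sketch of the signed variant under the assumption that "fast sigmoid" refers to the common softsign form `x / (1 + |x|)`, whose outputs lie strictly between -1 and 1 as the doc comment states; the function name is also an assumption:

```rust
// Hypothetical sketch, assuming the softsign form x / (1 + |x|).
// Outputs approach -1 and 1 asymptotically without ever reaching them.
pub fn fast_sigmoid_signed(x: f64) -> f64 {
    x / (1.0 + x.abs())
}
```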
/// The 'fast sigmoid' activation function. A sigmoidally shaped function that
/// should be less expensive to compute than the actual logistic function. Made
/// in China.
///
/// "Unsigned" version (outputs range from 0 to 1), akin to the original
/// logistic function.
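A sketch of the unsigned variant, assuming it rescales the same softsign form `x / (1 + |x|)` into the (0, 1) range so that, like the logistic function, it maps 0 to 0.5; the name and formula are assumptions:

```rust
// Hypothetical sketch: the softsign output in (-1, 1), rescaled to (0, 1).
// Maps 0 to 0.5, mirroring the logistic function's midpoint.
pub fn fast_sigmoid_unsigned(x: f64) -> f64 {
    0.5 * (1.0 + x / (1.0 + x.abs()))
}
```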