//! Provides nonlinear activation methods.
//!
//! Activation layers take an input tensor, apply the activation operation and
//! produce an output tensor.
//! Thanks to the nonlinearity of the activation methods, we can 'learn' and
//! detect nonlinearities in our (complex) datasets.
//!
//! The activation operation used should depend on the task at hand. For binary
//! classification a step function might be very useful. For more complex tasks,
//! continuous activation functions such as [Sigmoid][mod_sigmoid], TanH or
//! [ReLU][mod_relu] should be used. In most cases, ReLU might provide the best
//! results.
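//!
//! As a minimal sketch (plain `f32` math for illustration, not this crate's
//! layer API), the continuous activations mentioned above can be written as:
//!
//! ```
//! fn sigmoid(x: f32) -> f32 { 1.0 / (1.0 + (-x).exp()) }
//! fn relu(x: f32) -> f32 { x.max(0.0) }
//! // TanH is available directly on the float types as `f32::tanh`.
//!
//! // ReLU clamps negative inputs to zero and passes positives through.
//! assert_eq!(relu(-2.0), 0.0);
//! assert_eq!(relu(3.0), 3.0);
//! // Sigmoid squashes inputs into (0, 1), centered at 0.5.
//! assert!((sigmoid(0.0) - 0.5).abs() < 1e-6);
//! ```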
//!
//! If you supply the same blob as input and output to a layer via the [LayerConfig][struct_layerconfig],
//! computations will be done in-place, requiring less memory.
//!
//! An activation function is also sometimes called a transfer function.
//!
//! [mod_sigmoid]: ./sigmoid/index.html
//! [mod_relu]: ./relu/index.html
//! [struct_layerconfig]: ../../layer/struct.LayerConfig.html
// Macro helper to implement the activation trait.
// TODO: see common
pub use self::relu::ReLU;
pub use self::sigmoid::Sigmoid;
pub use self::tanh::TanH;

pub mod relu;
pub mod sigmoid;
pub mod tanh;