Recurrent layer implementations
Re-exports
pub use bidirectional::Bidirectional;
pub use gru::GRU;
pub use lstm::LSTM;
pub use rnn::RNN;
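Because of these re-exports, all four layer types can be imported from this module directly instead of from their individual submodules. A minimal sketch, where my_crate and the recurrent path are placeholders for the actual crate layout:

```rust
// Placeholder paths: `my_crate` and `recurrent` stand in for the real crate
// and module names; the flat imports work because of the re-exports above.
use my_crate::recurrent::{Bidirectional, GRU, LSTM, RNN};

// Equivalent, without the re-exports, via the full submodule path:
use my_crate::recurrent::lstm::LSTM as _;
```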
Modules
- bidirectional: Bidirectional wrapper for recurrent layers
- gru: Gated Recurrent Unit (GRU) implementation
- lstm: Long Short-Term Memory (LSTM) implementation
- rnn: Basic Recurrent Neural Network (RNN) implementation
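For orientation, the basic recurrent step implemented by the rnn module computes h' = tanh(W_ih x + W_hh h + b). A self-contained sketch of that update using plain vectors (not this crate's tensor types):

```rust
/// One vanilla RNN step: h' = tanh(W_ih * x + W_hh * h + b).
/// Plain-Vec illustration only; the crate's layers use their own tensor
/// types and handle batching.
fn rnn_step(
    w_ih: &[Vec<f32>], // hidden_size x input_size
    w_hh: &[Vec<f32>], // hidden_size x hidden_size
    b: &[f32],         // hidden_size
    x: &[f32],         // input_size
    h: &[f32],         // hidden_size
) -> Vec<f32> {
    (0..b.len())
        .map(|i| {
            let ih: f32 = w_ih[i].iter().zip(x).map(|(w, v)| w * v).sum();
            let hh: f32 = w_hh[i].iter().zip(h).map(|(w, v)| w * v).sum();
            (ih + hh + b[i]).tanh()
        })
        .collect()
}

fn main() {
    let w_ih = vec![vec![0.1, 0.2], vec![0.3, 0.4]];
    let w_hh = vec![vec![0.0, 0.1], vec![0.1, 0.0]];
    let b = vec![0.0; 2];
    let h1 = rnn_step(&w_ih, &w_hh, &b, &[1.0, -1.0], &[0.0, 0.0]);
    println!("{h1:?}");
}
```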
Type Aliases
- GruForwardOutput: Type alias for GRU forward output (new_h, (reset_gate, update_gate, new_gate))
- GruGateCache: Type alias for GRU gate cache (reset, update, new gates)
- GruStateCache: Type alias for GRU state cache
- LstmGateCache: Type alias for LSTM gate cache (input, forget, output, cell gates)
- LstmStateCache: Type alias for LSTM state cache (hidden, cell)
- LstmStepOutput: Type alias for LSTM step output (new_h, new_c, (input_gate, forget_gate, cell_gate, output_gate))
- RnnStateCache: Type alias for RNN state cache
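The tuple structures spelled out in these descriptions can be written down directly. A compile-checked sketch of three of the aliases, with Tensor as a placeholder for whatever tensor type the crate actually uses:

```rust
// `Tensor` is a placeholder; only the tuple shapes are taken from the
// alias descriptions above.
type Tensor = Vec<f32>;

/// (new_h, (reset_gate, update_gate, new_gate))
type GruForwardOutput = (Tensor, (Tensor, Tensor, Tensor));

/// (hidden, cell)
type LstmStateCache = (Tensor, Tensor);

/// (new_h, new_c, (input_gate, forget_gate, cell_gate, output_gate))
type LstmStepOutput = (Tensor, Tensor, (Tensor, Tensor, Tensor, Tensor));

fn main() {
    let gru: GruForwardOutput = (vec![0.5], (vec![0.1], vec![0.2], vec![0.3]));
    let lstm: LstmStepOutput = (vec![0.5], vec![0.4], (vec![], vec![], vec![], vec![]));
    let state: LstmStateCache = (lstm.0.clone(), lstm.1.clone());
    println!("{:?} {:?} {:?}", gru.0, lstm.2, state);
}
```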