pub trait RNNLayer: LayerTrait + RNNLayerConst {
    fn as_raw_mut_RNNLayer(&mut self) -> *mut c_void;
    fn set_weights(
        &mut self,
        wxh: &Mat,
        bh: &Mat,
        whh: &Mat,
        who: &Mat,
        bo: &Mat
    ) -> Result<()> { ... }
    fn set_produce_hidden_output(&mut self, produce: bool) -> Result<()> { ... }
}
Required Methods
fn as_raw_mut_RNNLayer(&mut self) -> *mut c_void
Provided Methods
fn set_weights(
    &mut self,
    wxh: &Mat,
    bh: &Mat,
    whh: &Mat,
    who: &Mat,
    bo: &Mat
) -> Result<()>
Sets the learned weights.

Recurrent-layer behavior on each step is defined by the current input x_t, the previous state h_{t-1}, and the learned weights as follows:

h_t = \tanh(W_{hh} h_{t-1} + W_{xh} x_t + b_h)
o_t = \tanh(W_{ho} h_t + b_o)

A usage sketch follows the parameter list below.
Parameters
- wxh: the W_{xh} matrix
- bh: the b_h vector
- whh: the W_{hh} matrix
- who: the W_{ho} matrix
- bo: the b_o vector
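Below is a minimal, hedged sketch of calling set_weights with zero-initialized matrices. The helper name configure_rnn and the sizes (input 64, hidden 128, output 32) are assumptions made for illustration, not part of the opencv crate; the matrix shapes follow the recurrence above.

use opencv::{core::{Mat, CV_32F}, dnn::RNNLayer, prelude::*, Result};

// Hypothetical helper: fills an already-obtained RNN layer with zero weights
// of mutually consistent shapes (the sizes are illustrative assumptions).
fn configure_rnn(rnn: &mut impl RNNLayer) -> Result<()> {
    let input = 64;   // assumed length of x_t
    let hidden = 128; // assumed length of h_t
    let output = 32;  // assumed length of o_t

    // W_xh: hidden x input, W_hh: hidden x hidden, W_ho: output x hidden,
    // b_h: hidden x 1, b_o: output x 1.
    let wxh = Mat::zeros(hidden, input, CV_32F)?.to_mat()?;
    let bh = Mat::zeros(hidden, 1, CV_32F)?.to_mat()?;
    let whh = Mat::zeros(hidden, hidden, CV_32F)?.to_mat()?;
    let who = Mat::zeros(output, hidden, CV_32F)?.to_mat()?;
    let bo = Mat::zeros(output, 1, CV_32F)?.to_mat()?;

    rnn.set_weights(&wxh, &bh, &whh, &who, &bo)
}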
fn set_produce_hidden_output(&mut self, produce: bool) -> Result<()>

If this flag is set to true, the layer will produce h_t as a second output. The shape of the second output is the same as that of the first output.
C++ default parameters
- produce: false
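A minimal sketch of enabling the hidden-state output, again assuming a value implementing RNNLayer is already at hand; the helper name enable_hidden_output is hypothetical.

use opencv::{dnn::RNNLayer, Result};

// After this call the layer also produces h_t as a second output alongside
// o_t; by default (produce: false) only o_t is produced.
fn enable_hidden_output(rnn: &mut impl RNNLayer) -> Result<()> {
    rnn.set_produce_hidden_output(true)
}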