Trait opencv::dnn::prelude::RNNLayer
Classical recurrent layer
Accepts two inputs @f$x_t@f$ and @f$h_{t-1}@f$ and computes two outputs @f$o_t@f$ and @f$h_t@f$.
- input: should contain packed input @f$x_t@f$.
- output: should contain output @f$o_t@f$ (and @f$h_t@f$ if setProduceHiddenOutput() is set to true).
input[0] should have shape [T, N, data_dims], where T and N are the number of timestamps and the number of independent samples of @f$x_t@f$ respectively.
output[0] will have shape [T, N, @f$N_o@f$], where @f$N_o@f$ is the number of rows in the @f$ W_{ho} @f$ matrix.
If setProduceHiddenOutput() is set to true then @p output[1] will contain a Mat with shape [T, N, @f$N_h@f$], where @f$N_h@f$ is the number of rows in the @f$ W_{hh} @f$ matrix.
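The shape bookkeeping above can be sketched in plain Rust (a minimal illustration independent of the bindings; `rnn_shapes` and the concrete sizes are hypothetical, not part of the opencv API):

```rust
// Compute the I/O shapes of a classical RNN layer as described above.
// t = number of timestamps, n = number of independent samples,
// data_dims = input feature size, no = rows of W_ho, nh = rows of W_hh.
fn rnn_shapes(
    t: usize,
    n: usize,
    data_dims: usize,
    no: usize,
    nh: usize,
) -> ([usize; 3], [usize; 3], [usize; 3]) {
    let input = [t, n, data_dims]; // shape of input[0]
    let output = [t, n, no];       // shape of output[0]
    let hidden = [t, n, nh];       // shape of output[1], if produce_hidden_output is set
    (input, output, hidden)
}

fn main() {
    // Hypothetical sizes: 5 timestamps, batch of 2, 16 input features,
    // 8 output rows, 4 hidden rows.
    let (i, o, h) = rnn_shapes(5, 2, 16, 8, 4);
    println!("input {:?}, output {:?}, hidden {:?}", i, o, h);
}
```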
Required methods
pub fn as_raw_RNNLayer(&self) -> *const c_void
pub fn as_raw_mut_RNNLayer(&mut self) -> *mut c_void
Provided methods
pub fn set_weights(
&mut self,
wxh: &Mat,
bh: &Mat,
whh: &Mat,
who: &Mat,
bo: &Mat
) -> Result<()>
Sets up the learned weights.
Recurrent-layer behavior on each step is defined by the current input @f$ x_t @f$, the previous state @f$ h_{t-1} @f$ and the learned weights as follows: @f{eqnarray*}{ h_t &= tanh&(W_{hh} h_{t-1} + W_{xh} x_t + b_h), \\ o_t &= tanh&(W_{ho} h_t + b_o), @f}
Parameters
- Wxh: is @f$ W_{xh} @f$ matrix
- bh: is @f$ b_{h} @f$ vector
- Whh: is @f$ W_{hh} @f$ matrix
- Who: is @f$ W_{ho} @f$ matrix
- bo: is @f$ b_{o} @f$ vector
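The two formulas above can be sketched as one recurrent step in plain Rust (a minimal illustration of the math, not the opencv API; `matvec`, `add` and `rnn_step` are hypothetical helpers using row-major `Vec`-based matrices):

```rust
// One recurrent step, following the formulas above:
//   h_t = tanh(W_hh * h_{t-1} + W_xh * x_t + b_h)
//   o_t = tanh(W_ho * h_t + b_o)

// Matrix-vector product for a row-major matrix.
fn matvec(w: &[Vec<f32>], v: &[f32]) -> Vec<f32> {
    w.iter()
        .map(|row| row.iter().zip(v).map(|(a, b)| a * b).sum())
        .collect()
}

// Element-wise vector addition.
fn add(a: &[f32], b: &[f32]) -> Vec<f32> {
    a.iter().zip(b).map(|(x, y)| x + y).collect()
}

// Returns (o_t, h_t) given the weights, the input x_t and the previous state h_{t-1}.
fn rnn_step(
    wxh: &[Vec<f32>],
    bh: &[f32],
    whh: &[Vec<f32>],
    who: &[Vec<f32>],
    bo: &[f32],
    x: &[f32],
    h_prev: &[f32],
) -> (Vec<f32>, Vec<f32>) {
    let h: Vec<f32> = add(&add(&matvec(whh, h_prev), &matvec(wxh, x)), bh)
        .iter()
        .map(|v| v.tanh())
        .collect();
    let o: Vec<f32> = add(&matvec(who, &h), bo)
        .iter()
        .map(|v| v.tanh())
        .collect();
    (o, h)
}

fn main() {
    // Tiny 1x1 example: with zero input, zero state and zero biases,
    // both outputs are tanh(0) = 0.
    let wxh = vec![vec![1.0_f32]];
    let whh = vec![vec![0.5_f32]];
    let who = vec![vec![1.0_f32]];
    let (o, h) = rnn_step(&wxh, &[0.0], &whh, &who, &[0.0], &[0.0], &[0.0]);
    println!("o = {:?}, h = {:?}", o, h); // o = [0.0], h = [0.0]
}
```

Note how the hidden state `h` computed in the first formula feeds directly into the second; this is why `produce_hidden_output` can expose it as a second output at no extra cost.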
pub fn set_produce_hidden_output(&mut self, produce: bool) -> Result<()>
If this flag is set to true then the layer will produce @f$ h_t @f$ as its second output. The shape of the second output is the same as that of the first output.
C++ default parameters
- produce: false