pub type PtrOfFlattenLayer = Ptr<FlattenLayer>;
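In the opencv Rust bindings, `Ptr<T>` wraps OpenCV's reference-counted C++ smart pointer (`cv::Ptr`, a `shared_ptr`). As a rough analogy only — this is *not* the crate's actual implementation, and `FlattenLayerStub` is a hypothetical stand-in — its sharing semantics resemble `std::sync::Arc`:

```rust
use std::sync::Arc;

// Hypothetical stand-in for the layer object; the real FlattenLayer
// lives on the C++ side and is reachable only through FFI.
struct FlattenLayerStub {
    name: String,
}

// Clone the pointer and report how many owners share the one object.
fn shared_owner_count(layer: &Arc<FlattenLayerStub>) -> usize {
    let _alias = Arc::clone(layer); // bumps the refcount, no deep copy
    Arc::strong_count(layer)        // 2 while _alias is alive
}

fn main() {
    let layer = Arc::new(FlattenLayerStub { name: "flatten_1".into() });
    assert_eq!(shared_owner_count(&layer), 2);
    assert_eq!(layer.name, "flatten_1");
}
```

The point of the analogy: cloning a `Ptr<FlattenLayer>` shares one underlying layer object rather than copying it.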

Implementations

Trait Implementations

Clears the algorithm state
Reads algorithm parameters from a file storage
Stores algorithm parameters in a file storage
Simplified API for language bindings: stores algorithm parameters in a file storage.
Returns true if the Algorithm is empty (e.g. at the very beginning or after an unsuccessful read).
Saves the algorithm to a file. In order to make this method work, the derived class must implement Algorithm::write(FileStorage& fs).
Returns the algorithm string identifier. This string is used as the top-level xml/yml node tag when the object is saved to a file or string.
List of learned parameters; they must be stored here so they can be read via Net::getParam().
Name of the layer instance; can be used for logging or other internal purposes.
Type name that was used to create the layer via the layer factory.
Preferred target for layer forwarding.
Computes and sets internal parameters according to inputs, outputs and blobs.
Deprecated: use Layer::forward(InputArrayOfArrays, OutputArrayOfArrays, OutputArrayOfArrays) instead.
Given the input blobs, computes the output blobs.
Given the input blobs, computes the output blobs.
Tries to quantize the given layer and compute the quantization parameters required for a fixed-point implementation.
Given the input blobs, computes the output blobs.
Deprecated: use Layer::finalize(InputArrayOfArrays, OutputArrayOfArrays) instead.
Computes and sets internal parameters according to inputs, outputs and blobs.
Deprecated: use Layer::finalize(InputArrayOfArrays, OutputArrayOfArrays) instead.
Computes and sets internal parameters according to inputs, outputs and blobs.
Deprecated: this method will be removed in a future release.
Allocates the layer and computes its output.
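The summaries above describe a two-step layer lifecycle: finalize (compute internal parameters and output shapes from the inputs) and then forward (fill the output blobs). A minimal sketch of that call order, using illustrative stub types rather than the crate's real API — `Blob` and `FlattenStub` are assumptions for the example, though the flatten behavior (reshape without changing data) matches what a flatten layer does:

```rust
// Illustrative stub, not the opencv crate's API: a "blob" is a shape
// plus flat data.
#[derive(Debug, Clone, PartialEq)]
struct Blob {
    shape: Vec<usize>,
    data: Vec<f32>,
}

struct FlattenStub {
    out_shape: Option<Vec<usize>>,
}

impl FlattenStub {
    // finalize: compute internal parameters (here, the output shape)
    // according to the inputs.
    fn finalize(&mut self, input: &Blob) {
        let total: usize = input.shape.iter().product();
        self.out_shape = Some(vec![1, total]);
    }

    // forward: given the input blob, compute the output blob.
    fn forward(&self, input: &Blob) -> Blob {
        Blob {
            shape: self.out_shape.clone().expect("finalize must run first"),
            data: input.data.clone(), // flatten reshapes, data is unchanged
        }
    }
}

fn main() {
    let input = Blob { shape: vec![2, 3], data: vec![1.0, 2.0, 3.0, 4.0, 5.0, 6.0] };
    let mut layer = FlattenStub { out_shape: None };
    layer.finalize(&input);
    let out = layer.forward(&input);
    assert_eq!(out.shape, vec![1, 6]);
}
```

The deprecated overloads listed above collapse these two steps differently, but the finalize-before-forward ordering is the invariant the listing documents.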
Returns the index of an input blob in the input array.
Returns the index of an output blob in the output array.
Asks the layer whether it supports a specific backend for computations.
Returns a Halide backend node.
Returns a CUDA backend node.
Returns a TimVX backend node.
Implements layer fusion.
Tries to attach the subsequent activation layer to this layer, i.e. performs layer fusion in a partial case.
Tries to fuse the current layer with the next one.
“Detaches” all layers attached to a particular layer.
Automatic Halide scheduling based on layer hyper-parameters.
Returns parameters of layers with channel-wise multiplication and addition.
Returns the scale and zero-point of layers.
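The quantization entries above (tryQuantize, scale and zero-point) refer to the standard affine scheme `real = scale * (q - zero_point)`. A minimal sketch of how such parameters are commonly derived from a value range for uint8 — this is the textbook formula, not the crate's implementation, and `quant_params` is a hypothetical helper name:

```rust
// Affine uint8 quantization: real = scale * (q - zero_point).
// Illustrative only; the actual parameters come from the layer itself.
fn quant_params(min: f32, max: f32) -> (f32, i32) {
    // The representable range must include 0.0 so that zero is exact.
    let (min, max) = (min.min(0.0), max.max(0.0));
    let scale = (max - min) / 255.0;
    let zero_point = (-min / scale).round() as i32;
    (scale, zero_point)
}

fn main() {
    // A symmetric-ish range of [-1.0, 1.55] spans 2.55, giving scale 0.01
    // and a zero-point that maps 0.0 onto quantized value 100.
    let (scale, zp) = quant_params(-1.0, 1.55);
    assert!((scale - 0.01).abs() < 1e-4);
    assert_eq!(zp, 100);
}
```

Fixed-point kernels then operate on the integer `q` values and apply `scale`/`zero_point` only at the boundaries.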