Module tch::nn

A small neural-network library based on Torch.

This library tries to stay as close as possible to the original Python and C++ implementations.

Structs

Adam

Parameters for the Adam optimizer.

BatchNorm

A batch-normalization layer.

BatchNormConfig

Batch-normalization config.

Conv

An N-dimensional convolution layer.

ConvConfigND

Generic convolution config.

ConvTransposeConfigND

A generic transposed convolution configuration.

ConvTransposeND

A generic transposed convolution layer.

Embedding

An embedding layer.

EmbeddingConfig

Configuration for an embedding layer.

Func

A layer defined by a simple closure.

FuncT

A layer defined by a closure with an additional training parameter.

GRU

A Gated Recurrent Unit (GRU) layer.

GRUState

A GRU state; this contains a single tensor.

Id

An identity layer. This just propagates its tensor input as output.

LSTM

A Long Short-Term Memory (LSTM) layer.

LSTMState

The state for an LSTM network; this contains two tensors.

LayerNorm

A layer-normalization layer.

LayerNormConfig

Layer-normalization config.

Linear

A linear fully-connected layer.

LinearConfig

Configuration for a linear layer.

Optimizer

An optimizer to run gradient descent.

Path

A variable store with an associated path for variable naming.

RNNConfig

Configuration for the GRU and LSTM layers.

RmsProp

Parameters for the RmsProp optimizer.

Sequential

A sequential layer combining multiple other layers.

SequentialT

A sequential layer combining multiple other layers, with support for a training mode.

Sgd

Parameters for the SGD optimizer.

VarStore

A VarStore is used to store variables used by one or multiple layers. It specifies a single device where all variables are stored.

Variables

Enums

Init

Variable initializations.

Traits

Module

The simplest module trait, defining a forward function.

ModuleT

Module trait with an additional train parameter.

OptimizerConfig

Optimizer configurations. These configs can be used to build an optimizer.

RNN

Trait for Recurrent Neural Networks.

Functions

adam

Creates the configuration for the Adam optimizer.

batch_norm1d

Applies Batch Normalization over a three-dimensional input.

batch_norm2d

Applies Batch Normalization over a four-dimensional input.

batch_norm3d

Applies Batch Normalization over a five-dimensional input.

conv

Creates a new convolution layer for any number of dimensions.

conv1d

Creates a new one-dimensional convolution layer.

conv2d

Creates a new two-dimensional convolution layer.

conv3d

Creates a new three-dimensional convolution layer.

conv_transpose1d

Creates a one-dimensional transposed convolution layer.

conv_transpose2d

Creates a two-dimensional transposed convolution layer.

conv_transpose3d

Creates a three-dimensional transposed convolution layer.

embedding

Creates a new embedding layer.

func

Creates a layer from a simple closure.

func_t

Creates a layer from a closure with an additional training parameter.
gru

Creates a new GRU layer.

init

Creates a new float tensor with the specified shape, device, and initialization.

layer_norm

Creates a layer-normalization layer.
linear

Creates a new linear layer.

lstm

Creates an LSTM layer.

no_bias

The default convolution config without bias.

rms_prop

Creates the configuration for the RmsProp optimizer.

seq

Creates a new empty sequential layer.

seq_t

Creates a new empty sequential layer with support for a training mode.

sgd

Creates the configuration for a Stochastic Gradient Descent (SGD) optimizer.

Type Definitions

Conv1D

A one-dimensional convolution layer.

Conv2D

A two-dimensional convolution layer.

Conv3D

A three-dimensional convolution layer.

ConvConfig

Convolution config using the same parameters on all dimensions.

ConvTranspose1D

A one-dimensional transposed convolution layer.

ConvTranspose2D

A two-dimensional transposed convolution layer.

ConvTranspose3D

A three-dimensional transposed convolution layer.

ConvTransposeConfig

A transposed convolution configuration using the same values on each dimension.