Crate corgi

Machine learning and dynamic automatic differentiation implementation.
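
As a rough sketch of the dynamic automatic differentiation workflow, the example below multiplies two tracked arrays and runs a backward pass. The macro name `arr!` and the `tracked()`, `backward(None)`, and `gradient()` calls are assumptions about the crate's API for illustration and may differ from the actual interface.

```rust
use corgi::arr;
use corgi::array::*;

fn main() {
    // Assumed API: `arr!` builds a row-major Array, and `tracked()` opts the
    // array into gradient tracking.
    let a = arr![5.0].tracked();
    let b = arr![2.0].tracked();

    // Operations on tracked arrays build the computation graph dynamically.
    let mut product = &a * &b;

    // Backward pass from the output; `None` is assumed to seed the gradient
    // with ones.
    product.backward(None);

    // d(a * b)/da = b and d(a * b)/db = a, so the gradients should be 2 and 5
    // (accessor name assumed).
    println!("{:?}", a.gradient());
    println!("{:?}", b.gradient());
}
```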

Modules

Activation functions are differentiable non-linearities applied to the output of layers.

An n-dimensional array with automatic differentiation.

Cost functions compute the loss given a target, and are used for the backward pass.

Initializers initialize the parameters of a model.

Implementations of neural network layers.

A supervised neural network model, which computes a forward pass and updates parameters based on a target (see the sketch after this module list).

Floating point type wrapper, which may be changed to f32 when the feature “f32” is active.

Implementations of gradient descent optimizers for optimizing the parameters of a model.
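
As a sketch of how the modules above could fit together, the example below wires an initializer, an activation, a cost function, and an optimizer into a single dense layer wrapped in a model, then runs one forward/backward/update step. Every module path, constructor, and method name here (`initializer::he`, `activation::relu`, `cost::mse`, `GradientDescent::new`, `Dense::new`, `Model::new`, `forward`, `backward`, `update`) is an assumption for illustration and may not match the crate's actual API.

```rust
use corgi::arr;
use corgi::array::*;
use corgi::layer::dense::Dense;
use corgi::model::Model;
use corgi::optimizer::gd::GradientDescent;
use corgi::{activation, cost, initializer};

fn main() {
    // Assumed factory functions for an initializer, an activation, and a cost.
    let init = initializer::he();
    let relu = activation::relu();
    let mse = cost::mse();

    // Assumed gradient descent optimizer with a fixed learning rate.
    let mut gd = GradientDescent::new(0.01);

    // One dense layer: 2 inputs -> 1 output, with a ReLU activation.
    let mut layer = Dense::new(2, 1, init, Some(relu));

    // The model runs the forward pass, the backward pass against a target,
    // and parameter updates through the optimizer.
    let mut model = Model::new(vec![&mut layer], &mut gd, &mse);

    let input = arr![0.5, -1.0];
    let target = arr![1.0];

    let _output = model.forward(input);
    let _loss = model.backward(target);
    model.update();
}
```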

Macros

Creates an Array, which is row-major, from either nested arrays or values.
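
As an illustration of the macro, the sketch below builds a one-dimensional array from values and a two-by-two array from nested invocations. The macro name `arr!` is an assumption; only the row-major layout and the two input forms are taken from the description above.

```rust
use corgi::arr;
use corgi::array::*;

fn main() {
    // From values: a one-dimensional, row-major Array.
    let v = arr![1.0, 2.0, 3.0];

    // From nested arrays: a 2x2 Array whose rows are the inner arrays.
    let m = arr![arr![1.0, 2.0], arr![3.0, 4.0]];

    // Call `tracked()` on either if gradients are needed later (assumed API).
    let _ = (v, m);
}
```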