# Crate neuronika

The `neuronika` crate provides autodifferentiation and dynamic neural networks.

Neuronika is a machine learning framework written in pure Rust, built with a focus on ease of use, fast experimentation and performance.

# Highlights

- Define-by-run computational graphs
- Reverse-mode automatic differentiation
- Dynamic neural networks

# Variables

The main building blocks of neuronika are *variables* and *differentiable variables*.
This means that when you use this crate you are handling and manipulating instances of `Var` and `VarDiff`.

Variables are lean and powerful abstractions over the computational graph’s nodes. Neuronika empowers you with the ability of imperatively building and differentiating such graphs with minimal amount of code and effort.

Both differentiable and non-differentiable variables can be understood as *tensors*: you can perform all the basic arithmetic operations on them, such as `+`, `-`, `*` and `/`. Refer to `Var` and `VarDiff` for a complete list of the available operations.

It is important to note that cloning variables is extremely memory efficient as only a shallow copy is returned. Cloning a variable is thus the way to go if you need to use it several times.

The provided API is linear in thought and minimal as it is carefully tailored around you, the user.
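The shallow-clone behaviour described above can be sketched with plain Rust. This is an illustration, not neuronika's actual implementation: the hypothetical `Variable` and `Node` types below stand in for a handle over shared node state, so cloning copies a reference, not the underlying data.

```rust
use std::rc::Rc;

// Hypothetical stand-ins, not neuronika types: a variable is a cheap
// handle to shared node state.
struct Node {
    data: Vec<f32>,
}

#[derive(Clone)]
struct Variable {
    node: Rc<Node>,
}

fn main() {
    let x = Variable { node: Rc::new(Node { data: vec![0.0; 1_000_000] }) };
    let y = x.clone(); // shallow: only the reference count is bumped
    assert_eq!(Rc::strong_count(&x.node), 2);
    assert!(Rc::ptr_eq(&x.node, &y.node)); // both handles share one buffer
}
```

Because only a pointer is copied, cloning stays cheap no matter how large the underlying data is.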

## Leaf Variables

You can create leaf variables by using one of the many provided functions, such as `zeros()`, `ones()`, `full()` and `rand()`. Feel free to refer to the complete list.

Leaf variables are so called because they form the *leaves* of the computational graph, as they are not the result of any computation.

Every leaf variable is by default created as non-differentiable; to promote it to a *differentiable* leaf, i.e. a variable for which you can compute the gradient, use `.requires_grad()`.

Differentiable leaf variables are leaves that have been promoted. You will encounter them very often in your journey through neuronika, as they are the main components of the neural networks' building blocks. To learn more about them, check the `nn` module.

Differentiable leaves hold a gradient; you can access it with `.grad()`.

## Differentiability Arithmetic

As stated before, you can manipulate variables by performing operations on them; the results of those computations will also be variables, although not leaf ones.

The result of an operation between two differentiable variables will also be a differentiable variable, and the same holds for non-differentiable variables. However, things behave slightly differently when an operation is performed between a non-differentiable variable and a differentiable one: the resulting variable will be differentiable.

You can think of differentiability as a *sticky* property. The table that follows is a summary
of how differentiability is broadcasted through variables.

| Operands | Var | VarDiff |
|----------|-----|---------|
| **Var** | Var | VarDiff |
| **VarDiff** | VarDiff | VarDiff |
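The propagation rule in the table can be sketched as a tiny piece of self-contained Rust. The `Kind` enum and `combine` function below are hypothetical stand-ins for illustration, not neuronika types: the result of a binary operation is differentiable whenever either operand is.

```rust
// Hypothetical stand-in for Var/VarDiff, used only to illustrate the
// "sticky" differentiability rule.
#[derive(Debug, PartialEq, Clone, Copy)]
enum Kind {
    Var,     // non-differentiable
    VarDiff, // differentiable
}

// The result of a binary operation is differentiable if either operand is.
fn combine(a: Kind, b: Kind) -> Kind {
    if a == Kind::VarDiff || b == Kind::VarDiff {
        Kind::VarDiff
    } else {
        Kind::Var
    }
}

fn main() {
    assert_eq!(combine(Kind::Var, Kind::Var), Kind::Var);
    assert_eq!(combine(Kind::Var, Kind::VarDiff), Kind::VarDiff);
    assert_eq!(combine(Kind::VarDiff, Kind::VarDiff), Kind::VarDiff);
}
```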

## Differentiable Ancestors

The differentiable ancestors of a variable are the differentiable leaves of the graph involved
in its computation. Obviously, only `VarDiff` can have a set of ancestors.

You can gain access, via mutable views, to all the ancestors of a variable by iterating through the vector of `Param` returned by `.parameters()`. To gain more insight about the role that such components fulfil in neuronika, check the `optim` module.

# Computational Graph

A computational graph is implicitly created as you write your program. You can differentiate it with respect to some of the differentiable leaves, thus populating their gradients, by using `.backward()`.

It is important to note that the computational graph is *lazily* evaluated; this means that neuronika decouples the construction of the graph from the actual computation of the nodes' values. You must use `.forward()` in order to obtain the actual result of the computation.

```rust
use neuronika;

let x = neuronika::rand(5);        // ----+
let q = neuronika::rand((5, 5));   //     |- Those lines build the graph.
                                   //     |
let mut y = x.clone().vm(q).vv(x); // ----+

y.forward(); // After .forward() is called y
             // contains the result.
```
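To make the forward/backward mechanics concrete, here is a minimal, self-contained sketch of scalar reverse-mode autodifferentiation on a tape. It is an illustration of the general technique, not neuronika's implementation: the `Tape` type is hypothetical, values are computed eagerly rather than lazily, and everything is scalar rather than n-dimensional.

```rust
// A minimal scalar reverse-mode autodiff tape (hypothetical, for
// illustration only; neuronika's graph is lazy and n-dimensional).
#[derive(Clone, Copy)]
enum Op {
    Leaf,
    Add(usize, usize),
    Mul(usize, usize),
}

struct Tape {
    ops: Vec<Op>,
    vals: Vec<f64>,
}

impl Tape {
    fn new() -> Self {
        Tape { ops: Vec::new(), vals: Vec::new() }
    }

    fn leaf(&mut self, v: f64) -> usize {
        self.ops.push(Op::Leaf);
        self.vals.push(v);
        self.ops.len() - 1
    }

    fn add(&mut self, a: usize, b: usize) -> usize {
        self.ops.push(Op::Add(a, b));
        self.vals.push(self.vals[a] + self.vals[b]);
        self.ops.len() - 1
    }

    fn mul(&mut self, a: usize, b: usize) -> usize {
        self.ops.push(Op::Mul(a, b));
        self.vals.push(self.vals[a] * self.vals[b]);
        self.ops.len() - 1
    }

    // Walk the tape backwards, accumulating gradients into every node.
    fn backward(&self, node: usize) -> Vec<f64> {
        let mut grads = vec![0.0; self.ops.len()];
        grads[node] = 1.0;
        for i in (0..=node).rev() {
            match self.ops[i] {
                Op::Leaf => {}
                Op::Add(a, b) => {
                    grads[a] += grads[i];
                    grads[b] += grads[i];
                }
                Op::Mul(a, b) => {
                    grads[a] += grads[i] * self.vals[b];
                    grads[b] += grads[i] * self.vals[a];
                }
            }
        }
        grads
    }
}

fn main() {
    let mut t = Tape::new();
    let x = t.leaf(3.0);
    let w = t.leaf(2.0);
    let y = t.mul(x, w); // y = x * w
    let z = t.add(y, x); // z = x * w + x
    let grads = t.backward(z);
    // dz/dx = w + 1 = 3, dz/dw = x = 3
    assert_eq!(grads[x], 3.0);
    assert_eq!(grads[w], 3.0);
}
```

Each intermediate result records which nodes produced it, which is exactly the information `.backward()` needs to propagate gradients from the output back to the differentiable leaves.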

## Freeing and keeping the graph

By default, computational graphs will persist in the program’s memory. If you want to be more conservative about this aspect you can place any arbitrary subset of the computations in an inner scope. This allows for the corresponding portion of the graph to be freed when the end of the scope is reached by your program.

```rust
use neuronika;

let w = neuronika::rand((3, 3)).requires_grad(); // -----------------+
let b = neuronika::rand(3).requires_grad();      //                  |
let x = neuronika::rand((10, 3));                // -----------------+- Leaves are created
                                                 //
{                                                // ---+
    let mut h = x.mm(w.t()) + b;                 //    | w's and b's
    h.forward();                                 //    | grads are
    h.backward(1.0);                             //    | accumulated
}                                                // ---+  |- Graph is freed and
                                                 // ------+  only leaves remain
```
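The freeing behaviour can be sketched with ordinary Rust scoping. This is an illustration, not neuronika code: the hypothetical `Leaf` and `GraphNode` types below model a leaf as shared state and a graph node as a holder of handles to its inputs, so when the inner scope ends the node is dropped and only the leaf remains alive.

```rust
use std::rc::Rc;

// Hypothetical stand-ins for illustration: a leaf is shared state, a
// graph node keeps a handle to each of its inputs.
struct Leaf;
struct GraphNode {
    _input: Rc<Leaf>,
}

fn main() {
    let w = Rc::new(Leaf);
    {
        let _h = GraphNode { _input: Rc::clone(&w) };
        assert_eq!(Rc::strong_count(&w), 2); // leaf + graph node
    } // `_h` is dropped here: its portion of the graph is freed
    assert_eq!(Rc::strong_count(&w), 1); // only the leaf remains
}
```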

## Modules

| Module | Description |
|--------|-------------|
| `data` | Data loading and manipulation utilities. |
| `nn` | Basic building blocks for neural networks. |
| `optim` | Implementations of various optimization algorithms and penalty regularizations. |

## Structs

| Struct | Description |
|--------|-------------|
| `Param` | A builder of mutable views over a differentiable variable's data and gradient. |
| `Var` | A non-differentiable variable. |
| `VarDiff` | A differentiable variable. |

## Traits

| Trait | Description |
|-------|-------------|
| `Backward` | Back-propagation behaviour. |
| `Cat` | Concatenation. |
| `Convolve` | Convolution. |
| `ConvolveWithGroups` | Grouped convolution. |
| `Data` | Data representation. |
| `Eval` | Eval mode behaviour. |
| `Forward` | Forward-propagation behaviour. |
| `Gradient` | Gradient representation. |
| `MatMatMul` | Matrix-matrix multiplication. |
| `MatMatMulT` | Matrix-matrix multiplication with transposed right hand side operand. |
| `Overwrite` | Gradient accumulation's mode. |
| `Stack` | Stacking. |
| `VecMatMul` | Vector-matrix multiplication. |
| `VecVecMul` | Vector-vector multiplication. |

## Functions

| Function | Description |
|----------|-------------|
| `eye` | Creates a variable holding an identity matrix of the given size. |
| `from_ndarray` | Creates a variable from an ndarray array. |
| `full` | Creates a variable with data filled with a constant value. |
| `geomspace` | Creates a one-dimensional variable with geometrically spaced elements. |
| `linspace` | Creates a one-dimensional variable with evenly spaced elements. |
| `logspace` | Creates a one-dimensional variable with logarithmically spaced elements. |
| `ones` | Creates a variable with data filled with ones. |
| `rand` | Creates a variable with values sampled from a uniform distribution on the interval [0, 1). |
| `zeros` | Creates a variable with zeroed data. |
| `range` | Creates a one-dimensional variable with elements from a given range. |