## Design
- Array operations are never in-place, meaning array values are never modified.
- Eager execution.
- Dynamic-as-possible computational graph.
- The `Array` is responsible for differentiating operations done on it for the backward pass.
- No explicit graph structure, for ergonomics: an `Array` contains only its children.
- Arrays do not store consumers (at the moment); they store consumer counts instead.
## BLAS
- The `openblas`, or `netlib` features can be enabled.
- Versions of Corgi prior to 0.9.7 did not prioritise optimisation, and will be slow.
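A BLAS backend is selected through Cargo feature flags. A sketch of the dependency line, assuming the `openblas` feature named above (the version number here is illustrative):

```toml
[dependencies]
corgi = { version = "0.9.7", features = ["openblas"] }
```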
## Tracked Arrays
- Arrays are untracked by default, so if gradients are required, `tracked()`, or `start_tracking()` must be used (see the documentation for details).
- Tracked arrays are arrays which require gradients to be computed, and stored.
- For more information, see the documentation for `tracked()`, and `untracked()` in `array.rs`.
## Examples
- Fully-connected neural network (full version):
```rust
let initializer = initializer::he();
let relu = activation::relu();
let softmax = activation::softmax();
let ce = cost::cross_entropy();
let gd = GradientDescent::new(learning_rate);

let l1 = Dense::new(input_size, hidden_size, initializer.clone(), Some(relu));
let l2 = Dense::new(hidden_size, output_size, initializer, Some(softmax));
let mut model = Model::new(vec![Box::new(l1), Box::new(l2)], Box::new(gd), ce);

for _ in 0..iterations {
    // prepare the `input`, and `target` batches here, then:
    let _output = model.forward(input.clone());
    let loss = model.backward(target.clone());
    model.update();
}
```
- Dynamic computational graph:
```rust
let a = arr![5.0].tracked();
let b = arr![2.0].tracked();
let mut c = arr![0.0].tracked();

for _ in 0..10 {
    c = &c + &(&a * &b);
    if c[0] > 50.0 {
        c = &c * &a;
    }
}

assert_eq!(c, arr![195300.0]);

c.backward(None);
assert_eq!(c.gradient(), arr![1.0]);
assert_eq!(b.gradient(), arr![97650.0]);
assert_eq!(a.gradient(), arr![232420.0]);
```
- Custom operation (still needs some work).
## Resources
- Shields are from shields.io.
- MIT 6.034 on OpenCourseWare for a primer on Backward Propagation.
- CS231n YouTube recordings for a primer on Convolutional Neural Networks.
A lot of the library was built around being as dynamic as possible, so some design choices may resemble those of other dynamic computational graph libraries.
Third-party libraries were used, and can be found in Cargo.toml.
## Licence
- MIT