Examples
- For fully-connected examples, remember to call `model.update()`.
- Fully-connected MNIST (convolutional neural networks are in progress).
- Fully-connected neural network (full version):

```rust
// Abridged sketch: `use` imports and data loading are omitted, and the
// hyperparameters (layer sizes, learning_rate, iterations) are placeholders.
let initializer = make_he();
let relu = make_relu();
let softmax = make_softmax();
let ce = make_cross_entropy();
let gd = GradientDescent::new(learning_rate);

let mut l1 = Dense::new(input_size, hidden_size, initializer.clone(), Some(relu));
let mut l2 = Dense::new(hidden_size, output_size, initializer.clone(), Some(softmax));
let mut model = Model::new(vec![&mut l1, &mut l2], &gd, ce);

for _ in 0..iterations {
    // run the forward and backward passes on a batch, then apply the updates
    let _output = model.forward(input.clone());
    let loss = model.backward(target.clone());
    model.update();
}
```
- Dynamic computational graph:

```rust
// Abridged: exact accessor calls may differ slightly between versions; the
// initial values (5, 2, 0) and the asserted results below are consistent.
let a = arr![5.0].tracked();
let b = arr![2.0].tracked();
let mut c = arr![0.0].tracked();

for _ in 0..10 {
    c = &c + &(&a * &b);
    if c[0] > 50.0 {
        c = &c * &a;
    }
}

assert_eq!(c, arr![195300.0]);

c.backward(None);
assert_eq!(c.gradient(), arr![1.0]);
assert_eq!(b.gradient(), arr![97650.0]);
assert_eq!(a.gradient(), arr![232420.0]);
```
- Custom operation (still needs some work).
Important Design Notes
- Array values should never be modified by operations; instead, new arrays should be created.
- Arrays are untracked by default, so if gradients are required, `tracked()` or `start_tracking()` must be used (see the documentation for details).
- Versions of Corgi prior to 0.9.7 did not prioritise optimisation, and will be slow.
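The immutability rule above can be sketched with a plain Rust function (a standalone illustration, not Corgi's actual operation API): an operation borrows its inputs immutably and allocates a fresh output, which keeps the original values available for the backward pass.

```rust
// An element-wise multiply that never mutates its inputs: it borrows them
// immutably and collects the result into a newly allocated vector.
fn mul(a: &[f64], b: &[f64]) -> Vec<f64> {
    a.iter().zip(b).map(|(x, y)| x * y).collect()
}

fn main() {
    let a = vec![1.0, 2.0, 3.0];
    let b = vec![4.0, 5.0, 6.0];
    let c = mul(&a, &b);
    assert_eq!(c, vec![4.0, 10.0, 18.0]);
    // the inputs are untouched and remain usable, e.g. by a backward pass
    assert_eq!(a, vec![1.0, 2.0, 3.0]);
}
```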
Design
- Eager execution.
- Dynamic-as-possible computational graph.
- Originally worked around the ergonomics of the `arr!` macro (which, however, currently still needs more work).
- Did not want to have to manage any 'graph' structures when using Corgi (the `Array`s should represent the graph alone).
- Graphs do not store consumers (at the moment); they store consumer counts instead.
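The consumer-count idea can be illustrated with a self-contained scalar sketch (an illustration of the technique, not Corgi's code): each node records only how many consumers it has, and propagates to its inputs once every consumer has delivered its gradient contribution, so no list of consumers is needed and each edge is traversed exactly once.

```rust
use std::cell::RefCell;
use std::rc::Rc;

// A scalar autodiff node: it stores how many consumer contributions it still
// expects, but not the consumers themselves.
#[derive(Clone)]
struct Node(Rc<RefCell<Inner>>);

struct Inner {
    value: f64,
    grad: f64,
    pending: usize,           // consumer contributions not yet received
    inputs: Vec<(Node, f64)>, // (input node, local partial derivative)
}

impl Node {
    fn leaf(value: f64) -> Node {
        Node(Rc::new(RefCell::new(Inner { value, grad: 0.0, pending: 0, inputs: Vec::new() })))
    }

    fn value(&self) -> f64 { self.0.borrow().value }
    fn grad(&self) -> f64 { self.0.borrow().grad }

    // Build a node from two inputs and the local partials d(out)/da, d(out)/db,
    // incrementing each input's consumer count.
    fn from_op(a: &Node, b: &Node, value: f64, da: f64, db: f64) -> Node {
        a.0.borrow_mut().pending += 1;
        b.0.borrow_mut().pending += 1;
        Node(Rc::new(RefCell::new(Inner {
            value, grad: 0.0, pending: 0,
            inputs: vec![(a.clone(), da), (b.clone(), db)],
        })))
    }

    fn add(a: &Node, b: &Node) -> Node {
        Node::from_op(a, b, a.value() + b.value(), 1.0, 1.0)
    }

    fn mul(a: &Node, b: &Node) -> Node {
        Node::from_op(a, b, a.value() * b.value(), b.value(), a.value())
    }

    // Accumulate one consumer's contribution; recurse into the inputs only when
    // every consumer has reported, so gradients are complete when propagated.
    fn accumulate(&self, delta: f64) {
        let ready = {
            let mut inner = self.0.borrow_mut();
            inner.grad += delta;
            if inner.pending > 0 {
                inner.pending -= 1;
            }
            inner.pending == 0
        };
        if ready {
            let (grad, inputs) = {
                let inner = self.0.borrow();
                (inner.grad, inner.inputs.clone())
            };
            for (input, partial) in inputs {
                input.accumulate(grad * partial);
            }
        }
    }

    fn backward(&self) {
        self.accumulate(1.0);
    }
}

fn main() {
    // y = a * a + a * b, so dy/da = 2a + b and dy/db = a.
    let a = Node::leaf(3.0);
    let b = Node::leaf(4.0);
    let y = Node::add(&Node::mul(&a, &a), &Node::mul(&a, &b));
    y.backward();
    assert_eq!(y.value(), 21.0);
    assert_eq!(a.grad(), 10.0);
    assert_eq!(b.grad(), 3.0);
}
```

Note how `a` is consumed three times here, yet its gradient is propagated only after all three contributions have arrived, which is exactly what the count makes possible.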
BLAS
- The `openblas` or `netlib` features can be enabled, and require CBLAS if used.
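As a sketch of how a backend would be selected from a downstream crate's `Cargo.toml` (the version number here is illustrative):

```toml
[dependencies]
corgi = { version = "0.9", features = ["openblas"] }
```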
Tracked Arrays
- Tracked arrays are arrays for which gradients are computed and stored.
- For more information, see the documentation for `tracked()` and `untracked()` in `array.rs`.
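As a standalone sketch of the pattern (not Corgi's actual `array.rs`), tracking can be thought of as a flag plus a lazily allocated gradient buffer, with `tracked()` as a consuming builder and `start_tracking()` as its in-place counterpart:

```rust
// Illustrative only: a minimal array type where a gradient buffer exists
// only once the array has been marked as tracked.
struct Array {
    values: Vec<f64>,
    tracked: bool,
    gradient: Option<Vec<f64>>,
}

impl Array {
    fn new(values: Vec<f64>) -> Array {
        // untracked by default: no gradient buffer is allocated
        Array { values, tracked: false, gradient: None }
    }

    // consuming builder, in the style of `tracked()`
    fn tracked(mut self) -> Array {
        self.start_tracking();
        self
    }

    // in-place variant, in the style of `start_tracking()`
    fn start_tracking(&mut self) {
        self.tracked = true;
        let n = self.values.len();
        self.gradient.get_or_insert_with(|| vec![0.0; n]);
    }
}

fn main() {
    let a = Array::new(vec![1.0, 2.0, 3.0]);
    assert!(!a.tracked && a.gradient.is_none());

    let b = a.tracked();
    assert!(b.tracked);
    assert_eq!(b.gradient.as_ref().map(|g| g.len()), Some(3));
}
```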
Name
- The original name was going to be 'cog-(something)', since Rust's logo is a cog, and since cognition (get it?). But many AI libraries are already named 'cog-(something)', and permutations of 'cog' with other words sounded awkward, such as 'cogi' for 'cog-intelligence', so the name Corgi was chosen.
Resources
- Shields are from shields.io.
- MIT 6.034 on OpenCourseWare for a primer on Backward Propagation.
- CS231n YouTube recordings for a primer on Convolutional Neural Networks.
Much of the library was built around being as dynamic as possible, so some of its design choices may be similar to those of other dynamic computational graph libraries.
Licence
- MIT