# autograd
A library for building and running computation graphs, with rust-ndarray as the backend.
Documentation: https://docs.rs/autograd/
## Overview
- Automatic differentiation
- Pure Rust
- Neural net first APIs
- Dynamic/static graph construction with shared variables
## Examples
Here we are computing partial derivatives of `z = 2x^2 + 3y + 1`.
```rust
extern crate ndarray;
extern crate autograd as ag;

let mut graph = ag::Graph::new();
let ref x = graph.placeholder();
let ref y = graph.variable(ndarray::arr1(&[0.]));
let ref z = 2*x*x + 3*y + 1;

// dz/dy
let ref g1 = ag::gradients(z, &[y], None)[0];

// dz/dx
let ref g2 = ag::gradients(z, &[x], None)[0];

// ddz/dx (differentiates `z` again)
let ref gg = ag::gradients(g2, &[x], None)[0];

// evaluation of symbolic gradients
assert_eq!(3., g1.eval(&mut graph)[0]);
assert_eq!(4., gg.eval(&mut graph)[0]);

// dz/dx requires to fill the placeholder `x`
graph.feed(x, ndarray::arr1(&[2.]));
assert_eq!(8., g2.eval(&mut graph)[0]);
```
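As a quick check, the asserted values agree with the calculus: from `z = 2x^2 + 3y + 1` we get `dz/dy = 3` and `dz/dx = 4x`, which is `8` at `x = 2`; differentiating once more gives `ddz/dx = 4`.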
Another example: a multilayer perceptron for MNIST.
```rust
// -- graph def --
let mut graph = ag::Graph::new();
let ref x = graph.placeholder();
let ref y = graph.placeholder();
let ref w = graph.variable(ag::ndarray_ext::glorot_uniform(&[28*28, 10]));
let ref b = graph.variable(ag::ndarray_ext::zeros(&[1, 10]));
let ref z = ag::matmul(x, w) + b;
let ref loss = ag::sparse_softmax_cross_entropy(z, y);
let ref grads = ag::gradients(loss, &[w, b], None);
let ref predictions = ag::argmax(z, -1, true);
let ref accuracy = ag::reduce_mean(&ag::equal(predictions, y), 0, false);

// -- dataset --
let ((x_train, y_train), (x_test, y_test)) = dataset::load();

// -- training method --
let mut optimizer = ag::sgd::optimizers::Adam { ..Default::default() };

// -- training loop --
for epoch in 0..max_epoch {
    // (a sketch of one epoch is given below)
}
```
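The loop body is omitted above; a minimal sketch of one epoch follows. Note that `minibatches`, `batch_size`, and the `optimizer.update` call are illustrative stand-ins rather than the crate's actual API, so see the MNIST example in the repository for the real training step.

```rust
for epoch in 0..max_epoch {
    // `minibatches` is a hypothetical helper that yields
    // (images, labels) pairs of `batch_size` rows each.
    for (x_batch, y_batch) in minibatches(&x_train, &y_train, batch_size) {
        // Fill the input placeholders with the current batch.
        graph.feed(x, x_batch);
        graph.feed(y, y_batch);
        // Hypothetical update call: evaluate the symbolic `grads` for this
        // batch and apply an Adam step to the shared variables `w` and `b`.
        optimizer.update(&[w, b], grads, &mut graph);
    }
}
```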
Available operations in rust-autograd are listed in the documentation.
For more, see the examples or the tests.
## License
MIT