# autograd
This library provides differentiable operations and tensors. The current backend is rust-ndarray.
## Examples
Here we compute the partial derivatives of `z = 2x^2 + 3y + 1`.
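Analytically, `dz/dx = 4x`, `dz/dy = 3`, and `ddz/dx = 4`; with `x = 2`, the gradients below therefore evaluate to `8.`, `3.`, and `4.`.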
```rust
extern crate ndarray;
extern crate autograd as ag;

// scalar placeholders, fed at evaluation time
let ref x = ag::placeholder(&[]);
let ref y = ag::placeholder(&[]);
let ref z = 2*x*x + 3*y + 1;

// dz/dy
let ref gy = ag::grad(&[z], &[y])[0];

// dz/dx
let ref gx = ag::grad(&[z], &[x])[0];

// ddz/dx (differentiates `z` again)
let ref ggx = ag::grad(&[gx], &[x])[0];

// evaluation of symbolic gradients
println!("{}", gy.eval(&[]));   // => 3.
println!("{}", ggx.eval(&[]));  // => 4.

// dz/dx requires filling the placeholder `x`
println!("{}", gx.eval(&[(x, &ndarray::arr0(2.).into_dyn())]));  // => 8.
```
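Everything above is symbolic until `eval` is called; a gradient node is an ordinary tensor, which is why `z` could be differentiated twice. As a minimal sketch (assuming the same tuple-style feed shown above), evaluating `z` itself requires feeding both placeholders:

```rust
// z = 2*2^2 + 3*1 + 1 = 12 at (x, y) = (2, 1)
let x_val = ndarray::arr0(2.).into_dyn();
let y_val = ndarray::arr0(1.).into_dyn();
println!("{}", z.eval(&[(x, &x_val), (y, &y_val)]));  // => 12.
```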
Another example: a multi-layer perceptron for MNIST digits classification.
```rust
extern crate ndarray;
extern crate autograd as ag;

use ag::gradient_descent_ops::Optimizer;

// -- graph def --
let ref x = ag::placeholder(&[-1, 28*28]);
let ref y = ag::placeholder(&[-1]);
let ref w = ag::variable(ag::ndarray_ext::glorot_uniform(&[28*28, 10]));
let ref b = ag::variable(ag::ndarray_ext::zeros(&[1, 10]));
let ref z = ag::matmul(x, w) + b;
let ref loss = ag::reduce_mean(&ag::sparse_softmax_cross_entropy(z, y), &[0], false);
let ref params = [w, b];
let ref grads = ag::grad(&[loss], params);
let ref predictions = ag::argmax(z, -1, true);
let ref accuracy = ag::reduce_mean(&ag::equal(predictions, y), &[0], false);
let mut adam = ag::gradient_descent_ops::Adam::default();
let ref update_ops = adam.compute_updates(params, grads);

// -- dataset --
let ((x_train, y_train), (x_test, y_test)) = dataset::load();

// -- training loop --
for epoch in 0..max_epoch {
    // feed mini-batches and evaluate `update_ops`; see the sketch below
}
```
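The loop body is elided above. Below is a minimal sketch of what one epoch might look like: mini-batch slices of the training set are fed to the placeholders, and the update ops are evaluated once per batch. The `batch_size` value, the slicing code, and the `ag::eval` feed format are assumptions here and may differ across crate versions.

```rust
use ndarray::s;

let batch_size = 200;
let num_batches = x_train.shape()[0] / batch_size;

for epoch in 0..max_epoch {
    for i in 0..num_batches {
        let (from, to) = (i * batch_size, (i + 1) * batch_size);
        // copy mini-batch views into dynamic-dim arrays for feeding
        let x_batch = x_train.slice(s![from..to, ..]).to_owned().into_dyn();
        let y_batch = y_train.slice(s![from..to]).to_owned().into_dyn();
        // evaluating the update ops applies one Adam step to `w` and `b`
        ag::eval(update_ops, &[(x, &x_batch), (y, &y_batch)]);
    }
}
```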
For more, see the [documentation](https://docs.rs/autograd) or the [examples](https://github.com/raskr/rust-autograd/tree/master/examples) directory.