Crate drug
∂rug: Differentiable Rust Graph
This crate is a collection of utilities for building neural networks (differentiable programs). See the examples for implementations of canonical neural networks; you may need to download the datasets yourself to run them. Examples include:
 Mnist with dense networks
 Mnist with convolutional neural networks (though embarrassingly slowly)
 Penn TreeBank character prediction with RNN and GRU
Planned Future Features
 Higher level API
 Building complexes of nodes (conv + bias + relu) / RNN cells, with parameter reuse
 Subgraphs / updating subsets of graphs (e.g. for GAN) with separate optimizers
 Parallel backprop through multiple arguments of one node
 ndarray-parallel or OpenMPI for graph replication and parallelization
 Link to an optimized OpenCL math backend for GPU utilization
Reinforcement learning applications may also challenge the architecture, but I don't understand the process well enough yet to consider adding it to the library.
Wish list
 Operator overloading API + Taking advantage of the type system and const generics
 May require a total overhaul, or may be possible with a "Graph Cursor" trait and more sophisticated handles beyond the current Idxs
 Automatic differentiation of operations defined only from loops (proc macros?)
 Taking advantage of just in time compilation and fusion of operations / kernels
 Other kinds of derivatives e.g. jacobian
Reexports
pub extern crate ndarray; 
pub use nodes::Operation; 
Modules
nodes 
This module holds the different types of nodes that exist in a computation graph. Nodes that represent a differentiable computation are implemented by a struct with the Operation trait. Use Graph methods to create and register nodes inside a graph. See Node for the types of node available. This module may eventually be made private...
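As a rough sketch of the shape such a trait takes, consider a forward/backward pair for a single operation. The names `eval` and `grad` below are illustrative, not drug's actual signatures, and scalar f64 stands in for the ndarray tensors the real trait operates on:

```rust
// Illustrative sketch only: a differentiable operation exposes a forward
// pass and a backward pass that maps the upstream gradient to per-input
// gradients. Names and types here are invented for the example.
trait Op {
    // Forward pass: compute the node's output from its inputs.
    fn eval(&self, inputs: &[f64]) -> f64;
    // Backward pass: given d(loss)/d(output), return d(loss)/d(input)
    // for each input, in order.
    fn grad(&self, inputs: &[f64], upstream: f64) -> Vec<f64>;
}

struct Mul;

impl Op for Mul {
    fn eval(&self, inputs: &[f64]) -> f64 {
        inputs[0] * inputs[1]
    }
    fn grad(&self, inputs: &[f64], upstream: f64) -> Vec<f64> {
        // d(a*b)/da = b and d(a*b)/db = a, each scaled by the upstream gradient.
        vec![upstream * inputs[1], upstream * inputs[0]]
    }
}

fn main() {
    let op = Mul;
    let inputs = [3.0, 4.0];
    assert_eq!(op.eval(&inputs), 12.0);
    assert_eq!(op.grad(&inputs, 1.0), vec![4.0, 3.0]);
}
```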
Structs
Graph 
A differentiable computation graph. Use this struct to hold your differentiable program: a directed acyclic graph of Nodes, their associated values, and losses (gradients). The graph computes values moving forward in insertion order.
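Insertion-order evaluation can be sketched in plain Rust. This is a conceptual illustration, not drug's internal code; the node and graph types here are invented for the example. Because each node may only reference nodes inserted before it, a single left-to-right pass computes every value:

```rust
// Conceptual sketch of forward evaluation in insertion order.
// A node is either an input value or a sum of previously inserted nodes.
enum Node {
    Input(f64),
    Add(usize, usize), // indices of earlier nodes
}

fn forward(nodes: &[Node]) -> Vec<f64> {
    let mut values = Vec::with_capacity(nodes.len());
    for node in nodes {
        // Every index a node references is already in `values`,
        // so one pass in insertion order suffices.
        let v = match node {
            Node::Input(x) => *x,
            Node::Add(a, b) => values[*a] + values[*b],
        };
        values.push(v);
    }
    values
}

fn main() {
    // Graph: n0 = 1.0, n1 = 2.0, n2 = n0 + n1, n3 = n2 + n2
    let graph = vec![
        Node::Input(1.0),
        Node::Input(2.0),
        Node::Add(0, 1),
        Node::Add(2, 2),
    ];
    assert_eq!(forward(&graph), vec![1.0, 2.0, 3.0, 6.0]);
}
```

Backpropagation then walks the same list in reverse, which is why the acyclicity and ordering constraints matter.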

Idx 
A placeholder to help index into a graph. These should not be interchanged between graphs. 
Optimizer 
An optimizer for updating the parameters of a Graph. Currently only SGD, RMSProp, Adam, and SGD with momentum are implemented. Here is a good blog that explains various optimizers.
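For reference, the SGD-with-momentum update (one of the listed optimizers) follows the textbook rule v ← μv − η∇, w ← w + v. The sketch below shows that rule in plain Rust; it is not drug's exact implementation:

```rust
// Textbook SGD-with-momentum step: the velocity accumulates a decaying
// sum of past gradients, and the weights move along the velocity.
fn sgd_momentum_step(weights: &mut [f64], velocity: &mut [f64], grads: &[f64], lr: f64, mu: f64) {
    for i in 0..weights.len() {
        velocity[i] = mu * velocity[i] - lr * grads[i];
        weights[i] += velocity[i];
    }
}

fn main() {
    let mut w = [1.0, -2.0];
    let mut v = [0.0, 0.0];
    sgd_momentum_step(&mut w, &mut v, &[0.5, -0.5], 0.1, 0.9);
    // First step has no accumulated velocity, so it is plain SGD:
    // w = [1.0 - 0.1*0.5, -2.0 + 0.1*0.5] = [0.95, -1.95]
    assert!((w[0] - 0.95).abs() < 1e-12);
    assert!((w[1] + 1.95).abs() < 1e-12);
}
```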
Enums
GlobalPool 
Type of pooling operation (currently only average pooling is provided; max, sum, and min pooling are TODO). Implements Operation. See the Node constructor for a full description.
Padding 
Type of padding to use in a Conv node.
Functions
softmax 
Take the softmax of an array.
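The softmax itself is standard; a numerically stable version subtracts the maximum logit before exponentiating so exp never overflows. This sketch works on a plain slice rather than the ndarray arrays drug's version takes:

```rust
// Numerically stable softmax: shifting by the max logit leaves the
// result unchanged (the shift cancels in the ratio) but bounds exp's input.
fn softmax(logits: &[f64]) -> Vec<f64> {
    let max = logits.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = logits.iter().map(|&x| (x - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.iter().map(|&e| e / sum).collect()
}

fn main() {
    let probs = softmax(&[1.0, 2.0, 3.0]);
    // Probabilities sum to 1 and preserve the ordering of the logits.
    assert!((probs.iter().sum::<f64>() - 1.0).abs() < 1e-12);
    assert!(probs[2] > probs[1] && probs[1] > probs[0]);
}
```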
softmax_cross_entropy_loss 
A loss function used for classification. 
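The textbook form of this loss is the negative log-probability the softmax assigns to the true class. Folding softmax and log together via log-sum-exp is both stable and cheap; the sketch below shows that definition (not necessarily drug's exact signature, which operates on ndarray arrays):

```rust
// Softmax cross-entropy against a one-hot target:
// loss = -log(softmax(logits)[target]) = logsumexp(logits) - logits[target].
fn softmax_cross_entropy(logits: &[f64], target: usize) -> f64 {
    let max = logits.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    // log-sum-exp computed stably by shifting out the max logit.
    let log_sum_exp = max + logits.iter().map(|&x| (x - max).exp()).sum::<f64>().ln();
    log_sum_exp - logits[target]
}

fn main() {
    // A confident, correct prediction gives a small positive loss...
    let low = softmax_cross_entropy(&[5.0, 0.0, 0.0], 0);
    // ...while the same logits scored against a wrong target give a large one.
    let high = softmax_cross_entropy(&[5.0, 0.0, 0.0], 1);
    assert!(low > 0.0 && low < high);
}
```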
xavier_initialize 
The default (and only provided) initializer. Only works with convolution kernels and matrices. 
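Xavier (Glorot) initialization draws weights from a uniform distribution U(−a, a) with a = sqrt(6 / (fan_in + fan_out)), which keeps activation variance roughly constant across layers. The sketch below illustrates the formula; it uses a tiny deterministic LCG as a stand-in RNG purely to stay dependency-free, which a real implementation would not do:

```rust
// Xavier/Glorot uniform initialization: weights ~ U(-a, a) with
// a = sqrt(6 / (fan_in + fan_out)).
fn xavier_uniform(fan_in: usize, fan_out: usize, seed: u64) -> Vec<f64> {
    let a = (6.0 / (fan_in + fan_out) as f64).sqrt();
    // Tiny LCG stand-in for a real RNG, to keep the sketch self-contained.
    let mut state = seed;
    (0..fan_in * fan_out)
        .map(|_| {
            state = state
                .wrapping_mul(6364136223846793005)
                .wrapping_add(1442695040888963407);
            // Top 53 bits give a uniform sample in [0, 1).
            let u = (state >> 11) as f64 / (1u64 << 53) as f64;
            (2.0 * u - 1.0) * a // rescale to (-a, a)
        })
        .collect()
}

fn main() {
    let w = xavier_uniform(64, 32, 42);
    let bound = (6.0 / 96.0_f64).sqrt();
    // Every weight lies within the Xavier bound, one per matrix entry.
    assert!(w.iter().all(|&x| x.abs() <= bound));
    assert_eq!(w.len(), 64 * 32);
}
```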