auto-diff 0.5.9 — MIT — owner: pipehappy1 (docs.rs)

Crate auto_diff

An auto-difference library

Introduction

This is yet another automatic differentiation library for deep neural networks. The focus is on ease of use and dynamic computation-graph building.

Install

Add auto-diff = "0.5" to the [dependencies] section of your project's Cargo.toml file.
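For instance, the dependency entry looks like the following. Note that serde support is an optional dependency of the crate; the exact feature name to enable it should be checked against the crate's feature flags.

```toml
[dependencies]
# Pull in auto-diff from crates.io; "0.5" accepts any compatible 0.5.x release.
auto-diff = "0.5"
```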

Features

The forward operators cover a commonly used set, including:

  1. getters/setters,
  2. indexing and slicing,
  3. +, -, *, / and matmul,
  4. special functions,
  5. statistics,
  6. linear algebra,
  7. random number generation.

The corresponding gradient support is a work in progress.

One feature of auto-diff is that automatic differentiation stays in the background and does not get in your way when only forward calculation is needed. It can therefore be used without placeholder-variable syntax.

Example
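This section is empty in the source, so the following is a minimal sketch rather than an official example. It assumes Var::new takes a flat f64 slice plus a shape slice, and that binary operators such as matmul return a Result; the exact signatures live in the var module docs.

```rust
use auto_diff::Var;

fn main() {
    // Assumed constructor: flat row-major data plus a shape slice
    // (check the `var` module docs for the exact signature).
    let a = Var::new(&[1., 2., 3., 4.], &[2, 2]);
    let b = Var::new(&[5., 6., 7., 8.], &[2, 2]);

    // Forward ops compose directly; the computation graph is built
    // behind the scenes, with no placeholder variables required.
    let c = a.matmul(&b).expect("shape mismatch");
    println!("{}", c);
}
```

Because the tape is implicit, the same code path serves both forward-only evaluation and later gradient computation.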

Re-exports

  • pub use var::Var;
  • pub use err::AutoDiffError;

Modules

  • collection
  • compute_graph
  • err
  • op
  • optim: Gradient based optimization.
  • serde
  • var
  • var_inner

Macros

  • var_f64
