Crate tfdeploy

Tensorflow Deploy

Tiny, no-nonsense, self-contained, portable TensorFlow inference.

Example

// load a simple model that just adds 3 to each input component
let graph = tfdeploy::for_path("tests/plus3.pb").unwrap();

// "input" and "output" are TensorFlow graph node names.
// We need to map these names to ids.
let input_id = graph.node_id_by_name("input").unwrap();
let output_id = graph.node_id_by_name("output").unwrap();

// run the computation.
let input = ndarray::arr1(&[1.0f32, 2.5, 5.0]);
let mut outputs = graph.run(vec![(input_id,input.into())], output_id).unwrap();

// grab the first (and only) tensor of the result, and unwrap it as an array of f32
let output = outputs.remove(0).take_f32s().unwrap();
assert_eq!(output, ndarray::arr1(&[4.0, 5.5, 8.0]).into_dyn());

For a more serious use case, see the inception v3 example.
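In real code, the `unwrap` calls above would usually give way to `?`-based error propagation. The sketch below is a hypothetical rewrite of the same example, assuming the `error_chain`-generated `Result` type re-exported from the `errors` module and the same `tests/plus3.pb` model:

```rust
extern crate ndarray;
extern crate tfdeploy;

// Same computation as the example above, but propagating failures with `?`
// instead of panicking. `tfdeploy::errors::Result` is assumed to be the
// error_chain-generated result alias from the `errors` module.
fn plus3() -> tfdeploy::errors::Result<ndarray::ArrayD<f32>> {
    let graph = tfdeploy::for_path("tests/plus3.pb")?;

    // map graph node names to ids
    let input_id = graph.node_id_by_name("input")?;
    let output_id = graph.node_id_by_name("output")?;

    // run the computation
    let input = ndarray::arr1(&[1.0f32, 2.5, 5.0]);
    let mut outputs = graph.run(vec![(input_id, input.into())], output_id)?;

    // take_f32s() is assumed to yield an Option, hence the explicit unwrap here
    Ok(outputs.remove(0).take_f32s().unwrap())
}
```

The caller can then match on the returned `Result` and report a readable error chain instead of a panic backtrace.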

Re-exports

pub use errors::*;
pub use matrix::Matrix;

Modules

errors

error_chain generated types

matrix

Matrix is the equivalent of a TensorFlow Tensor.

ops

TensorFlow Ops

tfpb

Generated protobuf codec for TensorFlow models, plus a handful of helpers for writing tests.

Structs

Model

Model is Tfdeploy's workhorse. It wraps a protobuf TensorFlow model and runs the inference interpreter.

ModelState
Node
Plan

Functions

for_path

Load a TensorFlow protobuf model from a file.