tract-core 0.2.0


Tract

Tiny, no-nonsense, self-contained, portable TensorFlow and ONNX inference.

Example

# extern crate tract_core;
# extern crate ndarray;
# fn main() {
use tract_core::*;
use tract_core::model::*;
use tract_core::model::dsl::*;

// build a simple model that just adds 3 to each input component
let mut model = Model::default();

let input = model.add_source("input").unwrap();
let three = model.add_const("three".to_string(), 3f32.into()).unwrap();
let add = model.add_node("add".to_string(),
    Box::new(tract_core::ops::math::Add::default())).unwrap();

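// connect the source and the constant to the two inputs of the add node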
model.add_edge(OutletId::new(input, 0), InletId::new(add, 0)).unwrap();
model.add_edge(OutletId::new(three, 0), InletId::new(add, 1)).unwrap();

// build an execution plan; the default inputs and outputs are inferred
// from the model graph
let plan = SimplePlan::new(&model).unwrap();

// run the computation.
let input = ndarray::arr1(&[1.0f32, 2.5, 5.0]);
let mut outputs = plan.run(tvec![input.into()]).unwrap();

// take the first and only output tensor
let mut tensor = outputs.pop().unwrap();

// view it as an ndarray of f32
let tensor = tensor.to_array_view::<f32>().unwrap();
assert_eq!(tensor, ndarray::arr1(&[4.0, 5.5, 8.0]).into_dyn());
# }

While creating a model from Rust code is useful for testing the library, real-life use cases will usually load a TensorFlow or ONNX model using the tract-tf or tract-onnx crates.
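A minimal sketch of that workflow, assuming tract-onnx exposes a for_path loader and that "model.onnx" stands in for a real model file (both the function name and the path are assumptions; check the tract-onnx documentation for this release):

# extern crate tract_core;
# extern crate tract_onnx;
# extern crate ndarray;
# fn main() {
use tract_core::*;

// load a model from an ONNX file on disk; "model.onnx" is a placeholder
// and for_path is an assumed entry point, not verified for this release
let model = tract_onnx::for_path("model.onnx").unwrap();

// from here on, everything works like the hand-built model above
let plan = SimplePlan::new(&model).unwrap();
let input = ndarray::arr1(&[1.0f32, 2.5, 5.0]);
let outputs = plan.run(tvec![input.into()]).unwrap();
# }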