§Tract
Tiny, no-nonsense, self-contained, portable TensorFlow and ONNX inference.
§Example
```rust
use tract_core::internal::*;

// build a simple model that just adds 3 to each input component
let mut model = TypedModel::default();
let input_fact = f32::fact(&[3]);
let input = model.add_source("input", input_fact).unwrap();
let three = model.add_const("three".to_string(), tensor1(&[3f32])).unwrap();
let add = model.wire_node("add".to_string(),
    tract_core::ops::math::add(),
    [input, three].as_ref(),
).unwrap();
model.auto_outputs().unwrap();

// We build an execution plan. Default inputs and outputs are inferred from
// the model graph.
let plan = SimplePlan::new(&model).unwrap();

// run the computation.
let input = tensor1(&[1.0f32, 2.5, 5.0]);
let mut outputs = plan.run(tvec![input.into()]).unwrap();

// take the first and only output tensor
let tensor = outputs.pop().unwrap();
assert_eq!(tensor, tensor1(&[4.0f32, 5.5, 8.0]).into());
```
While creating a model from Rust code is useful for testing the library, real-life use cases will usually load a TensorFlow or ONNX model using the tract-tensorflow or tract-onnx crates.
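For instance, loading and running an ONNX model with tract-onnx typically follows the pattern below. This is a hedged sketch, not part of this crate's example: the file name `model.onnx`, the `[1, 3, 224, 224]` input shape, and the all-zero dummy input are placeholders you would replace with your own model and data.

```rust
use tract_onnx::prelude::*;

fn main() -> TractResult<()> {
    let model = tract_onnx::onnx()
        // load the model from disk (placeholder path)
        .model_for_path("model.onnx")?
        // declare the input type and shape (placeholder shape)
        .with_input_fact(0, f32::fact([1, 3, 224, 224]).into())?
        // optimize the graph, then freeze inputs/outputs for execution
        .into_optimized()?
        .into_runnable()?;

    // run on a dummy all-zero input of the declared shape
    let input = Tensor::zero::<f32>(&[1, 3, 224, 224])?;
    let outputs = model.run(tvec!(input.into_tvalue()))?;
    println!("{:?}", outputs[0]);
    Ok(())
}
```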
§Re-exports
pub extern crate downcast_rs;
pub extern crate ndarray;
pub extern crate num_traits;
pub extern crate tract_data;
pub extern crate tract_linalg;
pub use dyn_clone;
§Modules
- axes
- broadcast: N-way tensor broadcast
- floats
- framework: Enforces a consistent API across the implemented framework importers.
- internal: This prelude is meant for code extending tract (like implementing new ops).
- macros
- model: Models and their lifecycle
- ops: Ops
- optim
- plan
- prelude: This prelude is meant for code using tract.
- runtime
- transform
- value
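The N-way tensor broadcast provided by the broadcast module follows the usual multidirectional (NumPy-style) semantics: shapes are aligned on their trailing dimensions, and a dimension of 1 stretches to match. A minimal sketch of the shape rule in plain Rust (the `multi_broadcast` helper is illustrative, not tract's actual API):

```rust
// Compute the broadcast shape of several shapes, NumPy-style,
// or None if they are incompatible. Hypothetical helper for illustration.
fn multi_broadcast(shapes: &[Vec<usize>]) -> Option<Vec<usize>> {
    let rank = shapes.iter().map(|s| s.len()).max()?;
    let mut out = vec![1usize; rank];
    for shape in shapes {
        // align each shape on its trailing dimensions
        let pad = rank - shape.len();
        for (i, &d) in shape.iter().enumerate() {
            let o = &mut out[pad + i];
            if *o == 1 {
                *o = d; // a 1 stretches to match the other dimension
            } else if d != 1 && d != *o {
                return None; // incompatible dimensions
            }
        }
    }
    Some(out)
}

fn main() {
    // [3] and [1] broadcast to [3], as in the add-3 example above
    assert_eq!(multi_broadcast(&[vec![3], vec![1]]), Some(vec![3]));
    // trailing-dimension alignment: [2, 1, 4] and [3, 1] give [2, 3, 4]
    assert_eq!(multi_broadcast(&[vec![2, 1, 4], vec![3, 1]]), Some(vec![2, 3, 4]));
    // [2] and [3] are incompatible
    assert_eq!(multi_broadcast(&[vec![2], vec![3]]), None);
}
```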