// mcai_onnxruntime/tensor.rs

//! Module containing tensor types.
//!
//! Two main types of tensors are available.
//!
//! The first one, [`Tensor`](struct.Tensor.html),
//! is an _owned_ tensor backed by [`ndarray`](https://crates.io/crates/ndarray).
//! This kind of tensor is used to pass input data to the inference.
//!
//! The second one, [`OrtOwnedTensor`](struct.OrtOwnedTensor.html), is used
//! internally by the ONNX Runtime inference execution to place
//! its output values. It is built using an [`OrtOwnedTensorExtractor`](struct.OrtOwnedTensorExtractor.html)
//! following the builder pattern.
//!
//! Once "extracted" from the runtime environment, this tensor contains an
//! [`ndarray::ArrayView`](https://docs.rs/ndarray/latest/ndarray/type.ArrayView.html)
//! holding _a view_ of the data. When it goes out of scope, this tensor frees the
//! backing memory on the C side.
//!
//! **NOTE**: Tensors are not meant to be built directly. When performing inference,
//! the [`Session::run()`](../session/struct.Session.html#method.run) method takes
//! an `ndarray::Array` as input (taking ownership of it) and converts it internally
//! into a [`Tensor`](struct.Tensor.html). After inference, the method returns an
//! [`OrtOwnedTensor`](struct.OrtOwnedTensor.html), which can be dereferenced into its internal
//! [`ndarray::ArrayView`](https://docs.rs/ndarray/latest/ndarray/type.ArrayView.html).

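//!
//! The flow above can be sketched as a doc example. This is illustrative only:
//! the environment/session construction is elided, and the exact generic
//! parameters of `OrtOwnedTensor` are assumptions here, not this crate's
//! verified signatures.
//!
//! ```no_run
//! # fn run_sketch() -> Result<(), Box<dyn std::error::Error>> {
//! // Build the input as an owned `ndarray::Array`; `Session::run()` takes
//! // ownership of it and converts it internally into a `Tensor`.
//! let input = ndarray::Array::from_shape_vec((1, 3), vec![1.0_f32, 2.0, 3.0])?;
//! // Assuming `session` was built elsewhere:
//! // let outputs: Vec<OrtOwnedTensor<'_, f32, _>> = session.run(vec![input])?;
//! // Each output derefs into its internal `ndarray::ArrayView`:
//! // let view = &*outputs[0];
//! # Ok(())
//! # }
//! ```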
pub mod ndarray_tensor;
pub mod ort_owned_tensor;
pub mod ort_tensor;
pub mod type_dynamic_tensor;

pub use ort_owned_tensor::OrtOwnedTensor;
pub use ort_tensor::OrtTensor;
pub use type_dynamic_tensor::FromArray;
pub use type_dynamic_tensor::InputTensor;