Module runtime

Generic inference runtime for edge computing

This module provides a generic interface for running inference on various deep learning models using different backends.
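To make the shape of this interface concrete, here is a minimal, self-contained sketch of how the items listed below (`InferenceInput`, `InferenceOutput`, `RuntimeBackend`, `InferenceRuntime`) could fit together. The field layouts, method names, and the `DoublingBackend` stand-in are assumptions for illustration; the actual signatures in this crate may differ.

```rust
/// Generic input for inference operations.
/// Assumed layout: a flat f32 tensor plus its shape (an assumption).
pub struct InferenceInput {
    pub data: Vec<f32>,
    pub shape: Vec<usize>,
}

/// Generic output from inference operations (same assumed layout).
pub struct InferenceOutput {
    pub data: Vec<f32>,
    pub shape: Vec<usize>,
}

/// Generic inference runtime trait: each backend (e.g. ONNX Runtime)
/// would implement this. The `run` signature is an assumption.
pub trait RuntimeBackend {
    fn run(&self, input: &InferenceInput) -> Result<InferenceOutput, String>;
}

/// Main inference runtime that dispatches to a chosen backend.
pub struct InferenceRuntime {
    backend: Box<dyn RuntimeBackend>,
}

impl InferenceRuntime {
    pub fn new(backend: Box<dyn RuntimeBackend>) -> Self {
        Self { backend }
    }

    pub fn infer(&self, input: &InferenceInput) -> Result<InferenceOutput, String> {
        self.backend.run(input)
    }
}

/// Toy backend that doubles every element, standing in for a real
/// model session so the example runs without any model file.
struct DoublingBackend;

impl RuntimeBackend for DoublingBackend {
    fn run(&self, input: &InferenceInput) -> Result<InferenceOutput, String> {
        Ok(InferenceOutput {
            data: input.data.iter().map(|x| x * 2.0).collect(),
            shape: input.shape.clone(),
        })
    }
}

fn main() {
    let runtime = InferenceRuntime::new(Box::new(DoublingBackend));
    let input = InferenceInput { data: vec![1.0, 2.0, 3.0], shape: vec![3] };
    let output = runtime.infer(&input).expect("inference failed");
    assert_eq!(output.data, vec![2.0, 4.0, 6.0]);
    println!("{:?}", output.data);
}
```

The trait-object indirection (`Box<dyn RuntimeBackend>`) is one common way to let `InferenceRuntime` swap backends at run time; a generic parameter (`InferenceRuntime<B: RuntimeBackend>`) would be the static-dispatch alternative.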

Modules

onnx
ONNX Runtime backend for liquid-edge inference

Structs

InferenceInput
Generic input for inference operations
InferenceOutput
Generic output from inference operations
InferenceRuntime
Main inference runtime that manages different backends

Traits

RuntimeBackend
Trait implemented by each inference backend (e.g. the ONNX Runtime backend)