# bevy_ort 🪨

a bevy plugin for the [ort](https://docs.rs/ort) library
> modnet inference example
## capabilities

- load ONNX models as ORT session assets
- initialize ORT with default execution providers
- modnet bevy image <-> ort tensor IO (with feature `modnet`)
- batched modnet preprocessing
- compute task pool inference scheduling
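the modnet preprocessing above boils down to a fixed pixel transform. the helper below is a hypothetical sketch (not the crate's actual API) of what that step involves: repacking interleaved 8-bit RGB (HWC) into the planar NCHW float layout MODNet expects, with each channel normalized to `[-1, 1]`.

```rust
// hypothetical helper, not part of bevy_ort: convert interleaved 8-bit RGB
// (HWC order) into planar NCHW f32 values normalized to [-1, 1], the input
// layout MODNet expects.
fn rgb_to_nchw(pixels: &[u8], width: usize, height: usize) -> Vec<f32> {
    let plane = width * height;
    let mut out = vec![0.0f32; 3 * plane];
    for y in 0..height {
        for x in 0..width {
            for c in 0..3 {
                let v = pixels[(y * width + x) * 3 + c] as f32;
                // MODNet normalizes each channel as (v - 127.5) / 127.5
                out[c * plane + y * width + x] = (v - 127.5) / 127.5;
            }
        }
    }
    out
}
```

batching then just stacks several such `[3, H, W]` planes into an `[N, 3, H, W]` tensor before handing it to the session.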
## models
- lightglue (feature matching)
- modnet (photographic portrait matting)
- yolo_v8 (object detection)
- flame (parametric head model)
## library usage

a minimal sketch of registering the plugin (see the modnet example for end-to-end usage):

```rust
use bevy::prelude::*;
use bevy_ort::BevyOrtPlugin;

fn main() {
    App::new()
        .add_plugins((DefaultPlugins, BevyOrtPlugin))
        .run();
}
```
## run the example person segmentation model (modnet)
use an accelerated execution provider:

- windows - `cargo run --features ort/cuda` or `cargo run --features ort/openvino`
- macos - `cargo run --features ort/coreml`
- linux - `cargo run --features ort/tensorrt` or `cargo run --features ort/openvino`
see the complete list of ort features here: https://github.com/pykeio/ort/blob/0aec4030a5f3470e4ee6c6f4e7e52d4e495ec27a/Cargo.toml#L54
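instead of passing `--features` on the command line, the execution-provider features can also be enabled in your own project's `Cargo.toml`. the snippet below is illustrative (version numbers are assumptions; match them to the compatibility table below):

```toml
[dependencies]
bevy = "0.13"
bevy_ort = { version = "0.1", features = ["modnet"] }

# depend on ort directly to select an execution provider, e.g. CUDA
ort = { version = "*", features = ["cuda"] }
```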
note: if you use `pip install onnxruntime`, you may need to run `ORT_STRATEGY=system cargo run`, see: https://docs.rs/ort/latest/ort/#how-to-get-binaries
## compatible bevy versions

| bevy_ort | bevy |
| --- | --- |
| 0.1.0 | 0.13 |
## credits

## license
This software is dual-licensed under the MIT License and the GNU General Public License version 3 (GPL-3.0).
You may choose to use this software under the terms of the MIT License OR the GNU General Public License version 3 (GPL-3.0), except as stipulated below:
The use of the `yolo_v8` feature within this software is specifically governed by the GNU General Public License version 3 (GPL-3.0). By using the `yolo_v8` feature, you agree to comply with the terms and conditions of the GPL-3.0.
For more details on the licenses, please refer to the LICENSE.MIT and LICENSE.GPL-3.0 files included with this software.