docs.rs failed to build cortenforge-inference-0.1.4
Please check the build logs for more information.
See Builds for ideas on how to fix a failed build, or Metadata for how to configure docs.rs builds.
If you believe this is docs.rs' fault, open an issue.
inference crate
- Purpose: provide the Burn-backed detector factory and inference plugin used by Bevy apps (`sim_view`/`inference_view`).
- Backend: defaults to `backend-ndarray`; enable `--features backend-wgpu` for WGPU. Needs `burn` features enabled in the root build if you want GPU.
- Model: loads `TinyDet` (default) or `BigDet` from the shared `models` crate via `BinFileRecorder` (full precision). Pass a weights path to the factory to load a checkpoint; otherwise it falls back to a heuristic detector.
- Use: app orchestrators insert the detector built by `inference::InferenceFactory` when `mode == Inference`. Ensure the checkpoint exists and matches the model config.
- Smoke: unit test ensures fallback when no weights are provided. Add an integration test pointing at a real checkpoint once available.
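The checkpoint-or-heuristic fallback described under Model can be sketched as follows. This is an illustrative stand-in, not the crate's actual API: `Detector` and `build_detector` are hypothetical names, and the real `InferenceFactory` also loads and validates the Burn record rather than just checking the path.

```rust
use std::path::PathBuf;

/// Hypothetical stand-in for the detector the factory produces; in the
/// real crate this would wrap a loaded TinyDet/BigDet model.
#[derive(Debug, PartialEq)]
enum Detector {
    /// Checkpoint-backed model, identified here only by its weights path.
    Model(PathBuf),
    /// Heuristic fallback used when no valid weights are supplied.
    Heuristic,
}

/// Build a detector, falling back to the heuristic when the weights path
/// is absent or does not exist on disk (the unit-tested behavior above).
fn build_detector(weights: Option<PathBuf>) -> Detector {
    match weights {
        Some(path) if path.exists() => Detector::Model(path),
        _ => Detector::Heuristic,
    }
}
```

The real factory must additionally verify that the checkpoint matches the model config; a path that exists but holds mismatched weights would still fail at load time.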
License
Apache-2.0 (see LICENSE in the repo root).