cortenforge-inference 0.1.4

Detector factory and inference helpers (Burn-backed or heuristic) for the CortenForge stack.

inference crate


  • Purpose: provides the Burn-backed detector factory and inference plugin used by Bevy apps (sim_view/inference_view).
  • Backend: defaults to backend-ndarray; enable `--features backend-wgpu` for WGPU. GPU use also requires the corresponding burn features to be enabled in the root build.
  • Model: loads TinyDet (the default) or BigDet from the shared models crate via `BinFileRecorder` (full precision). Pass a weights path to the factory to load a checkpoint; otherwise it falls back to a heuristic detector.
  • Use: app orchestrators insert the detector built by `inference::InferenceFactory` when `mode == Inference`. Ensure the checkpoint exists and matches the model config.
  • Smoke: a unit test verifies the heuristic fallback when no weights are provided. Add an integration test pointing at a real checkpoint once one is available.
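The weights-path fallback described above can be sketched as below. This is a minimal, self-contained illustration of the logic, not the crate's actual API: `Detector`, `build_detector`, and the variant names are hypothetical stand-ins for whatever `inference::InferenceFactory` returns.

```rust
use std::path::{Path, PathBuf};

// Hypothetical stand-in for the detector the factory produces.
#[derive(Debug, PartialEq)]
enum Detector {
    /// Burn-backed model loaded from a checkpoint on disk.
    Model(PathBuf),
    /// Heuristic fallback used when no (valid) weights are provided.
    Heuristic,
}

/// Sketch of the factory's fallback behavior: load a checkpoint when a
/// usable weights path is given, otherwise return the heuristic detector.
fn build_detector(weights: Option<&Path>) -> Detector {
    match weights {
        Some(p) if p.exists() => Detector::Model(p.to_path_buf()),
        _ => Detector::Heuristic,
    }
}

fn main() {
    // No weights path supplied: the factory falls back to the heuristic,
    // which is exactly what the smoke test above checks.
    assert_eq!(build_detector(None), Detector::Heuristic);
    println!("fallback ok");
}
```

An integration test against a real checkpoint would exercise the `Detector::Model` arm instead, verifying the loaded weights match the model config.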

License

Apache-2.0 (see LICENSE in the repo root).