#detector #inference #computer-vision #burn

cortenforge-inference

Detector factory and inference helpers (Burn-backed or heuristic) for the CortenForge stack

14 releases (5 breaking)

0.6.0 Jan 14, 2026
0.5.1 Jan 13, 2026
0.4.1 Jan 12, 2026
0.3.1 Jan 11, 2026
0.1.5 Jan 10, 2026

#1249 in Machine learning


Used in 3 crates

Apache-2.0

155KB
1K SLoC

inference crate


Burn-backed (or heuristic) detector factory and inference plugin for Bevy apps.

Details

  • Backend: defaults to backend-ndarray; enable --features backend-wgpu for WGPU. GPU support also requires the matching burn backend features to be enabled in the root build.
  • Model: loads TinyDet (default) or BigDet from the shared models crate via BinFileRecorder (full precision). Pass a weights path to the factory to load a checkpoint; otherwise the factory falls back to a heuristic detector (see the first sketch after this list).
  • Use: app orchestrators insert the detector built by inference::InferenceFactory when mode == Inference (see the second sketch below). Ensure the checkpoint exists and matches the model config.
  • Smoke: a unit test ensures the heuristic fallback is used when no weights are provided. Add an integration test pointing at a real checkpoint once one is available.
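
A minimal sketch of the checkpoint-loading and fallback behaviour described above, assuming the default ndarray backend. Only TinyDet, BigDet, and the use of BinFileRecorder with full precision are named by this crate; TinyDetConfig, Detector, HeuristicDetector, and build_detector are illustrative stand-ins.

```rust
use std::path::PathBuf;

use burn::backend::NdArray; // default backend-ndarray; burn::backend::Wgpu with --features backend-wgpu
use burn::record::{BinFileRecorder, FullPrecisionSettings, Recorder};
use burn::tensor::backend::Backend;

// TinyDet / TinyDetConfig live in the shared models crate (imports omitted);
// Detector and HeuristicDetector are hypothetical names for this sketch only.

type B = NdArray<f32>;

/// Either a Burn-backed model or the heuristic fallback.
enum Detector {
    Burn(TinyDet<B>),
    Heuristic(HeuristicDetector),
}

/// Load a full-precision checkpoint when a path is given, otherwise fall back.
fn build_detector(weights: Option<PathBuf>, device: &<B as Backend>::Device) -> Detector {
    match weights {
        Some(path) => {
            // Same recorder the crate uses: binary file, full precision.
            let recorder = BinFileRecorder::<FullPrecisionSettings>::new();
            let record = recorder
                .load(path, device)
                .expect("checkpoint must exist and match the model config");
            // `TinyDetConfig::new().init(..)` is an assumed constructor pattern.
            let model = TinyDetConfig::new().init::<B>(device).load_record(record);
            Detector::Burn(model)
        }
        None => Detector::Heuristic(HeuristicDetector::default()),
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    // Mirrors the smoke test described above: no weights means heuristic fallback.
    #[test]
    fn falls_back_without_weights() {
        let device = Default::default();
        assert!(matches!(build_detector(None, &device), Detector::Heuristic(_)));
    }
}
```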

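A second sketch covers the orchestrator wiring. The Bevy calls are real, but AppMode and the InferenceFactory::new(..)/build() shape are assumptions; only inference::InferenceFactory and the mode == Inference condition come from this crate.

```rust
use bevy::prelude::*;
use std::path::PathBuf;

/// Hypothetical mode switch kept by the app orchestrator.
#[allow(dead_code)]
#[derive(Resource, Clone, Copy, PartialEq, Eq)]
enum AppMode {
    Training,
    Inference,
}

fn main() {
    let mode = AppMode::Inference;

    let mut app = App::new();
    app.add_plugins(MinimalPlugins).insert_resource(mode);

    // Only build and insert the detector when running in inference mode.
    if mode == AppMode::Inference {
        // Assumed factory API shape; pass None to get the heuristic fallback.
        let detector =
            inference::InferenceFactory::new(Some(PathBuf::from("weights/tinydet.bin"))).build();
        // Assumes the built detector type implements Bevy's Resource trait.
        app.insert_resource(detector);
    }

    app.run();
}
```
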
License

Apache-2.0 (see LICENSE in the repo root).

Dependencies

~120–170MB
~3M SLoC