FAQ

ONNX Runtime Issues

First check the upstream issue trackers for the ort crate (pykeio/ort) and ONNX Runtime (microsoft/onnxruntime); many linking problems are already reported there.

Linking errors with __isoc23_strtoll?

Set the ORT_DYLIB_PATH environment variable to the full path of the ONNX Runtime shared library (libonnxruntime.so on Linux, onnxruntime.dll on Windows):

export ORT_DYLIB_PATH=/path/to/onnxruntime/lib/libonnxruntime.so

Then build with the dynamic loading feature enabled:

cargo run -F ort-load-dynamic --example <example_name>
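The two steps above can be combined into a small script. This is a minimal sketch: the install prefix /path/to/onnxruntime is a placeholder for wherever you extracted the ONNX Runtime release, and the example name is whatever example from this repository you want to run.

```shell
# Assumed install prefix; adjust to your actual ONNX Runtime location.
ORT_DIR=/path/to/onnxruntime
export ORT_DYLIB_PATH="$ORT_DIR/lib/libonnxruntime.so"

# Fail early with a clear message if the library is not where we expect it,
# instead of a confusing load error later.
if [ ! -f "$ORT_DYLIB_PATH" ]; then
    echo "ONNX Runtime library not found at $ORT_DYLIB_PATH" >&2
fi

# With the variable set, run any example with the dynamic-loading feature:
# cargo run -F ort-load-dynamic --example <example_name>
```

On Windows, point ORT_DYLIB_PATH at onnxruntime.dll instead of the .so file.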

Other Linking Errors?

See ORT Linking for more information.

Why are there no LLM models?
  • Focus: Vision and VLM models under 1B parameters
  • LLM inference engines like vLLM already exist
  • Pure text embedding models may be added in the future
How fast is it?
  • YOLO benchmarks: see Performance
  • Optimizations: multi-threading, SIMD, CUDA acceleration
  • YOLO and RFDETR are well-optimized; other models may need more work

Still have questions?

Open a GitHub Issue.