Arm NN Execution Provider

Accelerate the performance of ONNX model workloads across Arm®-based devices with the Arm NN execution provider. Arm NN is an open-source inference engine maintained by Arm and Linaro.

Build

For build instructions, please see the BUILD page.

Usage

C/C++

To use Arm NN as an execution provider for inference, register it with the session options as shown below.

#include <onnxruntime_cxx_api.h>

Ort::Env env = Ort::Env{ORT_LOGGING_LEVEL_ERROR, "Default"};
Ort::SessionOptions so;
bool enable_cpu_mem_arena = true;  // let the provider use the arena-based CPU allocator
Ort::ThrowOnError(OrtSessionOptionsAppendExecutionProvider_ArmNN(so, enable_cpu_mem_arena));

For details, see the C API documentation.
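Registration must happen before the session is created from the same session options. The end-to-end sketch below shows the remaining steps; the model path, input/output names, and tensor shape are placeholders for illustration, and it assumes a build of ONNX Runtime configured with Arm NN support.

#include <onnxruntime_cxx_api.h>
#include <vector>

int main() {
  Ort::Env env{ORT_LOGGING_LEVEL_ERROR, "Default"};
  Ort::SessionOptions so;
  // Register Arm NN before creating the session (use_arena = 1).
  Ort::ThrowOnError(OrtSessionOptionsAppendExecutionProvider_ArmNN(so, 1));

  // "model.onnx", the input/output names, and the shape are placeholders;
  // substitute the values for your own model.
  Ort::Session session{env, "model.onnx", so};

  std::vector<int64_t> shape{1, 3, 224, 224};
  std::vector<float> data(1 * 3 * 224 * 224, 0.0f);
  Ort::MemoryInfo mem = Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
  Ort::Value input = Ort::Value::CreateTensor<float>(
      mem, data.data(), data.size(), shape.data(), shape.size());

  const char* input_names[] = {"input"};
  const char* output_names[] = {"output"};
  // Run returns one Ort::Value per requested output.
  auto outputs = session.Run(Ort::RunOptions{nullptr},
                             input_names, &input, 1,
                             output_names, 1);
  return 0;
}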

Performance Tuning

When using onnxruntime_perf_test, add the flag -e armnn to run the benchmark with the Arm NN execution provider, as in the example below.
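For example, with a placeholder model path (the -r flag sets the repetition count; run the tool without arguments to list all options, and note that some builds also expect a result file path as a final argument):

./onnxruntime_perf_test -e armnn -r 100 model.onnx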