Machine Learning Platform

Arm NN, Compute Library and Arm MLIA.

What is the machine learning platform?

The machine learning platform is part of the Linaro Artificial Intelligence Initiative and is the home of Arm NN and the Arm Compute Library, open-source software libraries that optimise the execution of machine learning (ML) workloads on Arm-based processors. It also hosts the Arm ML Inference Advisor (Arm MLIA), a tool that makes Arm's ML optimisation work accessible to developers with varying levels of ML experience.
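
Arm MLIA is driven from the command line. A minimal, illustrative invocation that checks a TensorFlow Lite model against a Cortex-A target profile might look like the line below; the model file name is a placeholder, and subcommand and option names can differ between MLIA releases, so consult mlia --help for your installed version:

mlia check my_model.tflite --target-profile cortex-a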

The platform enables advanced, ultra-efficient inference at the edge. Designed specifically for machine learning and neural network (NN) workloads, the architecture is versatile enough to scale across devices, from Internet of Things (IoT) endpoints to connected cars and servers.

[Figure: NN frameworks flow chart]

Linaro's Artificial Intelligence Initiative brings companies together to develop best-in-class deep learning performance by leveraging neural network acceleration in IP and SoCs from the Arm ecosystem. To find out more about joining the initiative as a member, go to https://www.linaro.org/engineering/artificial-intelligence/.

Why choose the machine learning platform?

The machine learning platform libraries – Arm NN and Arm Compute Library – bridge the gap between existing neural network (NN) frameworks, such as TensorFlow, TensorFlow Lite, Caffe and ONNX, and the underlying IP.

They enable efficient translation of models from these NN frameworks, allowing them to run, without modification, across Arm Cortex-A CPUs, Arm Mali GPUs and the Arm ML processor. In addition, Arm MLIA provides insight early in the model development cycle into how ML models will perform on Arm hardware.
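
To illustrate the "without modification" point, the sketch below runs a standard TensorFlow Lite model through the Arm NN TFLite delegate from Python. The delegate library name, model file and option values are assumptions for the example; the delegate options actually available depend on your Arm NN build, so treat this as a sketch rather than a reference.

# Minimal sketch: running an unmodified TensorFlow Lite model through the
# Arm NN TFLite delegate. Library path, model name and option values are
# assumptions; check your Arm NN release for the exact delegate options.
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

# Load the Arm NN delegate, preferring the GPU backend and falling back to
# the accelerated CPU backend.
armnn_delegate = load_delegate(
    "libarmnnDelegate.so",  # built from the Arm NN repository
    options={"backends": "GpuAcc,CpuAcc", "logging-severity": "warning"},
)

interpreter = Interpreter(
    model_path="mobilenet_v2.tflite",  # any TFLite model, unmodified
    experimental_delegates=[armnn_delegate],
)
interpreter.allocate_tensors()

# Feed a dummy input and run inference as usual; the delegate decides which
# operators run on Arm NN backends and which fall back to the TFLite runtime.
input_info = interpreter.get_input_details()[0]
interpreter.set_tensor(
    input_info["index"],
    np.zeros(input_info["shape"], dtype=input_info["dtype"]),
)
interpreter.invoke()
output = interpreter.get_tensor(interpreter.get_output_details()[0]["index"])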

[Figure: ML platform flow chart]

Technical Overview

Find more information on the machine learning platform projects on the Arm Developer website: