How to Run Your GenAI, AI, and ML Workloads on Arm CPUs
These educational materials are aimed at beginner to advanced app developers, while Topic 3 is for developers of AI and ML tools and frameworks. The resources cover coding best practices, optimized AI and ML libraries and tools, and how to optimize AI and ML workloads on Arm CPUs.
Build AI/ML Android Apps
Get started running your ML and AI workloads on Arm Cortex CPUs.
ExecuTorch for On-Device AI
- Meta announced ExecuTorch in October 2023 to enable on-device AI for PyTorch. Learn what ExecuTorch is and how it works.
- Read our guide on how to build an Android chat app with Llama, ExecuTorch, and XNNPACK.
Get Started with AI and ML on Unity with Unity Sentis and ML Agents
- A series of videos, learning paths, blogs and tutorials on how to bring AI and ML into your Unity project on Arm-based mobile devices.
Google AI Edge’s MediaPipe Acceleration on Arm-Based Android
- Google AI Edge’s MediaPipe is accelerated with Arm KleidiAI for Arm Cortex CPUs.
- Get started and run LLM inference on Android with KleidiAI, MediaPipe, and XNNPACK.
Get Started with OpenCV on Android
- Learn how to create a computer vision (CV) application with OpenCV on Android devices.
- Learn how to use OpenCV for face detection; a minimal sketch follows this list.
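To give a flavor of what the OpenCV guides cover, here is a minimal, hedged sketch of face detection in C++ using OpenCV's CascadeClassifier. The cascade file and image paths are placeholders (OpenCV ships haarcascade_frontalface_default.xml in its data/haarcascades directory), and the learning paths show the full Android integration.

```cpp
// Minimal face-detection sketch with OpenCV's CascadeClassifier.
// Paths are placeholders; see the learning path for the Android setup.
#include <opencv2/objdetect.hpp>
#include <opencv2/imgproc.hpp>
#include <opencv2/imgcodecs.hpp>
#include <iostream>
#include <vector>

int main() {
    cv::CascadeClassifier face_cascade;
    if (!face_cascade.load("haarcascade_frontalface_default.xml")) {
        std::cerr << "Could not load the cascade file\n";
        return 1;
    }

    cv::Mat image = cv::imread("photo.jpg");
    if (image.empty()) {
        std::cerr << "Could not read the input image\n";
        return 1;
    }

    // Detection runs on a grayscale, contrast-equalized copy of the image.
    cv::Mat gray;
    cv::cvtColor(image, gray, cv::COLOR_BGR2GRAY);
    cv::equalizeHist(gray, gray);

    // detectMultiScale returns one bounding rectangle per detected face.
    std::vector<cv::Rect> faces;
    face_cascade.detectMultiScale(gray, faces, 1.1, 3);

    std::cout << "Detected " << faces.size() << " face(s)\n";
    for (const cv::Rect& face : faces) {
        cv::rectangle(image, face, cv::Scalar(0, 255, 0), 2);
    }
    cv::imwrite("faces.jpg", image);
    return 0;
}
```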
Build GenAI Android Apps
Learn how to quantize neural network (NN) models and run large language models (LLMs) on mobile.
Build an Android Chat App with Llama and ExecuTorch
- A get-started guide on setting up ExecuTorch and quantizing models without significantly sacrificing model accuracy.
- A blog covering a virtual assistant demo that first ran Meta’s Llama 2 7B on mobile via a chat-based app and has since expanded to the Llama 3 and Phi-3 3.8B models.
- A blog covering the use of quantization for neural network (NN) models, which is critical for deploying GenAI models on mobile devices and edge platforms; a minimal sketch of the idea follows this list.
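To make the idea behind quantization concrete, here is a framework-agnostic sketch of symmetric int8 quantization of a float weight tensor. Production toolchains such as ExecuTorch quantize per tensor or per channel using calibration data, so treat this only as an illustration of the concept, not as the method any particular framework uses.

```cpp
// Illustrative symmetric int8 quantization: map each value to
// round(value / scale), where scale = max|value| / 127.
// Real frameworks add calibration and per-channel scales; this is a sketch.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <iostream>
#include <vector>

struct QuantizedTensor {
    std::vector<int8_t> values;
    float scale;  // dequantized value = int8 value * scale
};

QuantizedTensor quantize_int8(const std::vector<float>& weights) {
    float max_abs = 0.0f;
    for (float w : weights) max_abs = std::max(max_abs, std::fabs(w));

    QuantizedTensor q;
    q.scale = (max_abs > 0.0f) ? max_abs / 127.0f : 1.0f;
    q.values.reserve(weights.size());
    for (float w : weights) {
        int v = static_cast<int>(std::lround(w / q.scale));
        q.values.push_back(static_cast<int8_t>(std::clamp(v, -127, 127)));
    }
    return q;
}

int main() {
    std::vector<float> weights = {0.02f, -1.3f, 0.75f, 2.4f, -0.5f};
    QuantizedTensor q = quantize_int8(weights);

    // Show the quantize/dequantize round trip and the resulting error.
    for (size_t i = 0; i < weights.size(); ++i) {
        float restored = q.values[i] * q.scale;
        std::cout << weights[i] << " -> " << int(q.values[i])
                  << " -> " << restored << '\n';
    }
    return 0;
}
```

The round-trip output makes the accuracy trade-off visible: 8-bit storage quarters the memory footprint of 32-bit weights at the cost of a small reconstruction error per value.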
Accelerate GenAI, AI, and ML
Accelerate your AI/ML framework, tools, and cloud services with open-source Arm libraries and optimized Arm SIMD code.
- The Arm Kleidi open-source libraries are lighter-weight performance libraries than the Arm Compute Library (ACL) for accelerating AI and ML workloads and frameworks.
- Explore KleidiAI’s initial features and follow the step-by-step guide to running one of its critical functions for accelerating the Gemma LLM.
- The KleidiCV library is designed for image processing and can be integrated into CV frameworks to deliver optimal performance for CV workloads on Arm.
- Optimize your AI/ML workloads with Arm SIMD code, written either in assembly or with Arm intrinsics in C/C++, to unlock significant performance gains; a minimal intrinsics sketch follows this list.
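As a taste of what Arm intrinsics look like in C/C++, here is a minimal sketch of a dot product vectorized with NEON. It assumes an AArch64 target (vfmaq_f32 and vaddvq_f32 are AArch64 intrinsics); the learning paths cover profiling and more advanced SIMD techniques.

```cpp
// Dot product vectorized with Arm NEON intrinsics (AArch64).
// Processes four floats per iteration with fused multiply-accumulate,
// then handles any leftover elements with a scalar tail loop.
#include <arm_neon.h>
#include <cstddef>

float dot_product_neon(const float* a, const float* b, size_t n) {
    float32x4_t acc = vdupq_n_f32(0.0f);
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        float32x4_t va = vld1q_f32(a + i);   // load 4 floats from a
        float32x4_t vb = vld1q_f32(b + i);   // load 4 floats from b
        acc = vfmaq_f32(acc, va, vb);        // acc += va * vb (fused)
    }
    float sum = vaddvq_f32(acc);             // horizontal add of the 4 lanes
    for (; i < n; ++i) {                     // scalar tail for n % 4 elements
        sum += a[i] * b[i];
    }
    return sum;
}
```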
Join the Arm Developer Program
Join the Arm Developer Program to build your future on Arm. Get fresh insights directly from Arm experts,
connect with like-minded peers for advice, or build on your expertise and become an Arm Ambassador.
Community Support
Ben Clark
Ben Clark is an Arm Staff Software Engineer and developer advocate, researching and publishing the best use of Arm technologies in consumer devices. He has a graphics background and a keen interest in machine learning.