What devices will support the on-device LLM?
Asked on 06/12/2025
The on-device large language model (LLM) runs on Apple devices powered by Apple silicon that support Apple Intelligence, including recent iPhones, iPads, and Macs. The model is optimized to take advantage of the CPU, GPU, and Neural Engine on these devices, enabling efficient, low-latency inference while keeping user data private. It is a device-scale model with around 3 billion parameters, well suited to tasks like summarization, extraction, and classification, but not to advanced reasoning or tasks that require extensive world knowledge.
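Because availability varies by device and user settings, apps are expected to check for the model before using it. A minimal sketch with the Foundation Models framework's `SystemLanguageModel` availability API (the exact unavailability reasons surfaced at runtime may vary by OS version):

```swift
import FoundationModels

// Check whether the on-device model can be used right now.
// Availability depends on the device supporting Apple Intelligence,
// the user having it enabled, and model assets being downloaded.
let model = SystemLanguageModel.default

switch model.availability {
case .available:
    print("On-device model is ready")
case .unavailable(let reason):
    // Fall back gracefully, e.g. hide the feature or use a server model.
    print("Model unavailable: \(reason)")
}
```

Checking availability up front lets you hide or degrade generative features on unsupported devices rather than failing at request time.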

Explore prompt design & safety for on-device foundation models
Design generative AI experiences that leverage the strengths of the Foundation Models framework. We’ll start by showing how to design prompts for the on-device large language model at the core of Apple Intelligence. Then, we’ll introduce key ideas around AI safety, and offer concrete strategies to make your generative AI features safe, reliable, and delightful.

Meet the Foundation Models framework
Learn how to tap into the on-device large language model behind Apple Intelligence! This high-level overview covers everything from guided generation for generating Swift data structures and streaming for responsive experiences, to tool calling for integrating data sources and sessions for context management. This session has no prerequisites.
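To make the session's concepts concrete, here is a hedged sketch of guided generation with a `LanguageModelSession`: a `@Generable` type describes the Swift structure you want back, and the session fills it in. The `TripSummary` type, its properties, and the prompt text are illustrative, not part of the framework:

```swift
import FoundationModels

// A hypothetical output type. @Generable lets the framework
// generate an instance of this struct directly, and @Guide
// describes each field to the model.
@Generable
struct TripSummary {
    @Guide(description: "A one-sentence summary of the trip")
    var headline: String

    @Guide(description: "A few short highlights of the trip")
    var highlights: [String]
}

func summarize(notes: String) async throws -> TripSummary {
    // A session holds context across turns; instructions steer its behavior.
    let session = LanguageModelSession(
        instructions: "You summarize travel notes concisely."
    )

    // Guided generation: ask for a typed TripSummary instead of raw text.
    let response = try await session.respond(
        to: "Summarize these notes: \(notes)",
        generating: TripSummary.self
    )
    return response.content
}
```

Guided generation avoids hand-parsing free-form model output: the framework constrains decoding to the declared structure, so the result arrives as a ready-to-use Swift value.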

Platforms State of the Union
Discover the newest advancements on Apple platforms.