What is the context size of Apple's local foundation model?
Asked on 06/11/2025
Apple's on-device foundation model has a context window of approximately 4,096 tokens, which is the limit on how much text (prompt plus response) a single session can hold. The model itself has roughly 3 billion parameters, far smaller than server-based large language models with hundreds of billions of parameters. It is optimized for tasks like summarization, extraction, and classification, not for tasks requiring extensive world knowledge or advanced reasoning.
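
A minimal Swift sketch of how this limit surfaces in practice, assuming the Foundation Models framework API names as presented at WWDC25 (LanguageModelSession, respond(to:), and the exceededContextWindowSize generation error); verify against current documentation:

import FoundationModels

// Ask the on-device model for a summary and handle the error thrown when the
// session transcript grows past the model's ~4,096-token context window.
func summarize(_ text: String) async {
    let session = LanguageModelSession()
    do {
        let response = try await session.respond(to: "Summarize briefly: \(text)")
        print(response.content)
    } catch LanguageModelSession.GenerationError.exceededContextWindowSize {
        // The transcript no longer fits in the context window: start a fresh
        // session, optionally seeding it with a condensed summary, then retry.
        print("Context window exceeded; start a new session.")
    } catch {
        print("Generation failed: \(error)")
    }
}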

Explore prompt design & safety for on-device foundation models
Design generative AI experiences that leverage the strengths of the Foundation Models framework. We’ll start by showing how to design prompts for the on-device large language model at the core of Apple Intelligence. Then, we’ll introduce key ideas around AI safety, and offer concrete strategies to make your generative AI features safe, reliable, and delightful.
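
As a hedged illustration of the prompt-design pattern this session covers (the instructions initializer and respond(to:) call are assumed from the Foundation Models framework and should be checked against current documentation), developer-authored guidance goes in the session's instructions while untrusted user input stays in the prompt:

import FoundationModels

// Keep developer-written guidance in the instructions, which the model treats
// as higher priority than the prompt, and confine user input to the prompt.
let recipeSession = LanguageModelSession(instructions: """
    You are an assistant for a recipe app.
    Keep answers short, family-friendly, and focused on cooking.
    """)

func answer(userQuestion: String) async throws -> String {
    // Untrusted user content is passed only as the prompt, never appended
    // to the instructions.
    let response = try await recipeSession.respond(to: userQuestion)
    return response.content
}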

Meet the Foundation Models framework
Learn how to tap into the on-device large language model behind Apple Intelligence! This high-level overview covers everything from guided generation for generating Swift data structures and streaming for responsive experiences, to tool calling for integrating data sources and sessions for context management. This session has no prerequisites.
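
A short sketch of the guided generation idea mentioned above, assuming the framework's @Generable and @Guide macros and the respond(to:generating:) overload; the TripIdea type is hypothetical and only for illustration:

import FoundationModels

// Guided generation: the model fills in a typed Swift value directly instead
// of returning free-form text that would need to be parsed.
@Generable
struct TripIdea {
    @Guide(description: "A short, catchy title for the trip")
    var title: String
    @Guide(description: "Three activities to do on the trip")
    var activities: [String]
}

func suggestTrip(to city: String) async throws -> TripIdea {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Suggest a weekend trip to \(city).",
        generating: TripIdea.self
    )
    return response.content  // Already a TripIdea, not raw text
}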

Platforms State of the Union
Discover the newest advancements on Apple platforms.