Can I run an LLM on my phone?
Asked on 06/09/2025
Yes, you can run a large language model (LLM) on your phone using Apple's Core ML framework. Apple provides tools to import and run on-device AI models, including large language models, across its devices. You can start with a model developed in PyTorch, convert it to the Core ML format, and then optimize it for performance with Core ML Tools. Core ML can then run the model across the CPU, GPU, and Neural Engine for efficient execution. For more details, see the Platforms State of the Union session.
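
As a rough illustration of that workflow, the sketch below traces a small PyTorch model and converts it to an .mlpackage with Core ML Tools (the coremltools Python package). The model, input shape, and file names are placeholders, not a specific Apple sample.

```python
import torch
import coremltools as ct

# Placeholder PyTorch model standing in for a real network.
class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(128, 128)

    def forward(self, x):
        return torch.relu(self.linear(x))

model = TinyModel().eval()
example_input = torch.rand(1, 128)

# Trace the model so Core ML Tools can capture its computation graph.
traced = torch.jit.trace(model, example_input)

# Convert to the Core ML format; on-device, Core ML schedules execution
# across the CPU, GPU, and Neural Engine.
mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(name="input", shape=example_input.shape)],
    minimum_deployment_target=ct.target.iOS17,
)
mlmodel.save("TinyModel.mlpackage")
```

The saved .mlpackage can then be added to an Xcode project and loaded on-device like any other Core ML model.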

Platforms State of the Union
Discover the newest advancements on Apple platforms.

Explore machine learning on Apple platforms
Get started with an overview of machine learning frameworks on Apple platforms. Whether you're implementing your first ML model or you're an ML expert, we'll offer guidance to help you select the right framework for your app's needs.

Deploy machine learning and AI models on-device with Core ML
Learn new ways to optimize speed and memory performance when you convert and run machine learning and AI models through Core ML. We'll cover new options for model representations, performance insights, execution, and model stitching, which can be used together to create compelling and private on-device experiences.
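
As a hypothetical example of the kind of optimization that session covers, the sketch below applies linear weight quantization to an already-converted model using the coremltools.optimize.coreml APIs. The file paths are illustrative, and the right compression settings will depend on your model and accuracy targets.

```python
import coremltools as ct
import coremltools.optimize.coreml as cto

# Load a previously converted Core ML model package (path is illustrative).
mlmodel = ct.models.MLModel("TinyModel.mlpackage")

# Quantize weights to 8 bits to shrink the model and reduce memory use.
op_config = cto.OpLinearQuantizerConfig(mode="linear_symmetric")
config = cto.OptimizationConfig(global_config=op_config)
compressed = cto.linear_quantize_weights(mlmodel, config=config)

compressed.save("TinyModel_quantized.mlpackage")
```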