Can you run an LLM on visionOS?

Asked on 08/05/2024


Yes, you can run a large language model (LLM) on visionOS. Apple Vision Pro, which runs visionOS, can access the Apple Neural Engine, so machine learning workloads can execute entirely on-device. That includes models that require, or are accelerated by, the Neural Engine, such as large language models.
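
To make this concrete, here is a minimal sketch of the general pattern for loading a Core ML model on visionOS with the Neural Engine enabled. It assumes you have already converted a model to Core ML (for example with coremltools) and bundled it in your app; the "LanguageModel" file name is a placeholder, not something from the sessions cited below.

```swift
import CoreML

// Allow Core ML to schedule work on the CPU, GPU, and Apple Neural Engine.
let configuration = MLModelConfiguration()
configuration.computeUnits = .all

// "LanguageModel.mlmodelc" is a hypothetical compiled model bundled with the app.
if let url = Bundle.main.url(forResource: "LanguageModel", withExtension: "mlmodelc") {
    do {
        let model = try MLModel(contentsOf: url, configuration: configuration)
        // Inspect the model's expected inputs before running predictions.
        print("Model inputs: \(model.modelDescription.inputDescriptionsByName.keys)")
    } catch {
        print("Failed to load model: \(error)")
    }
}
```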

For more details, you can refer to the session Introducing enterprise APIs for visionOS (11:44).

Additionally, the session Explore machine learning on Apple platforms (07:17) covers running models such as Whisper, Stable Diffusion, and Mistral on Apple devices, including visionOS.