Can you run an LLM on visionOS?
Asked on 08/05/2024
Yes, you can run a large language model (LLM) on visionOS. Apple Vision Pro, which runs visionOS, has access to the Apple Neural Engine, so machine learning tasks can execute on-device. This includes models that require or benefit from the Neural Engine, such as large language models.
For more details, you can refer to the session Introducing enterprise APIs for visionOS (11:44).
Additionally, the session Explore machine learning on Apple platforms (07:17) discusses running various models on Apple devices, including visionOS, which supports models like Whisper, Stable Diffusion, and Mistral.
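As a rough sketch of what on-device setup could look like, the Swift snippet below loads a Core ML model with all compute units enabled so the Apple Neural Engine can be used. The model name "Mistral7B" is an assumption for illustration: it presumes you have already converted a model to Core ML (for example with coremltools) and bundled the compiled `.mlmodelc` in your app; no such model ships with visionOS itself.

```swift
import CoreML

// Allow Core ML to schedule work on the CPU, GPU, and Apple Neural Engine.
let configuration = MLModelConfiguration()
configuration.computeUnits = .all

// "Mistral7B" is a placeholder resource name: it assumes a model converted
// to Core ML and compiled into the app bundle as Mistral7B.mlmodelc.
guard let modelURL = Bundle.main.url(forResource: "Mistral7B",
                                     withExtension: "mlmodelc") else {
    fatalError("Compiled Core ML model not found in the app bundle.")
}

do {
    let model = try MLModel(contentsOf: modelURL, configuration: configuration)
    print("Loaded model: \(model.modelDescription)")
} catch {
    print("Failed to load model: \(error)")
}
```

Autoregressive generation would then call `model.prediction(from:)` in a loop, feeding each predicted token back in as input; the sessions above cover which model classes are practical on Apple silicon.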

Bring your iOS or iPadOS game to visionOS
Discover how to transform your iOS or iPadOS game into a uniquely visionOS experience. Increase the immersion (and fun factor!) with a 3D frame or an immersive background. And invite players further into your world by adding depth to the window with stereoscopy or head tracking.

What’s new in device management
Learn about the latest management capabilities for iOS, iPadOS, macOS, and visionOS, then discover the latest changes to Apple Business Manager and Apple School Manager. We’ll also share updates to Activation Lock, SoftwareUpdate, and Safari management.

Optimize for the spatial web
Discover how to make the most of visionOS capabilities on the web. Explore recent updates like improvements to selection highlighting, and the ability to present spatial photos and panorama images in fullscreen. Learn to take advantage of existing web standards for dictation and text-to-speech with WebSpeech, spatial soundscapes with WebAudio, and immersive experiences with WebXR.