How does the updated Vision Pro SDK work for creating AR experiences?

Generated on 8/6/2024


The updated Vision Pro SDK offers several new features and improvements for creating AR experiences. Here are some key highlights:

  1. Hand Interaction: You can now decide if you want the user's hands to appear in front of or behind the content, providing more creative control over the app experience. This is detailed in the Platforms State of the Union session.
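The hand-visibility control described above can be sketched with SwiftUI's `upperLimbVisibility` scene modifier. This is a minimal illustration, not code from the session; the space `id` and app name are placeholders:

```swift
import SwiftUI
import RealityKit

// Minimal sketch: controlling whether the user's hands render in front
// of or behind immersive content via the upperLimbVisibility modifier.
struct HandsDemoApp: App {
    var body: some Scene {
        ImmersiveSpace(id: "demo") {  // "demo" is a placeholder id
            RealityView { content in
                // Add your RealityKit entities here.
            }
        }
        // .hidden lets virtual content draw over the user's hands;
        // .visible keeps the hands in front of the content.
        .upperLimbVisibility(.hidden)
    }
}
```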

  2. Enhanced Scene Understanding: The SDK significantly improves the fidelity of its scene-understanding capabilities. Planes can now be detected in all orientations, allowing objects to be anchored on a wider range of surfaces. Room anchors take the user's surroundings into account on a per-room basis, and you can detect when a user moves between rooms. This is also covered in the Platforms State of the Union session.
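A hedged sketch of the plane-detection flow using ARKit's `PlaneDetectionProvider` on visionOS; the alignment values shown are assumptions for illustration, not an exhaustive list of what the updated SDK supports:

```swift
import ARKit

// Sketch: run plane detection and observe anchors for detected surfaces.
func runPlaneDetection() async {
    let session = ARKitSession()
    let planeDetection = PlaneDetectionProvider(alignments: [.horizontal, .vertical])
    do {
        try await session.run([planeDetection])
        for await update in planeDetection.anchorUpdates {
            // Each PlaneAnchor reports its alignment and geometry,
            // which you can use to place objects on that surface.
            print(update.anchor.alignment, update.anchor.id)
        }
    } catch {
        print("Plane detection is unavailable: \(error)")
    }
}
```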

  3. Object Tracking API: A new object tracking API for visionOS allows you to attach content to individual objects found around the user. This can be used to attach virtual content like instructions to physical objects, adding new dimensions of interactivity. More details can be found in the Explore object tracking for visionOS session.
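A sketch of the object-tracking flow under the assumption that a reference object has already been trained in Create ML and exported to a file; the function name and URL parameter are hypothetical:

```swift
import ARKit

// Sketch: load a reference object and track instances of it in the
// user's surroundings via ObjectTrackingProvider.
func trackObject(referenceObjectURL: URL) async throws {
    let referenceObject = try await ReferenceObject(from: referenceObjectURL)
    let objectTracking = ObjectTrackingProvider(referenceObjects: [referenceObject])
    let session = ARKitSession()
    try await session.run([objectTracking])
    for await update in objectTracking.anchorUpdates {
        // The ObjectAnchor's transform tells you where the physical
        // object is, so virtual content such as instructions can be
        // positioned relative to it.
        print(update.anchor.originFromAnchorTransform)
    }
}
```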

  4. ARKit Enhancements: The ARKit framework has received several updates, including improvements in room tracking, plane detection, and hand tracking. These updates help customize experiences based on the room and enhance interaction with the surroundings. This is discussed in the Create enhanced spatial computing experiences with ARKit session.
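The room-tracking update can be sketched with ARKit's `RoomTrackingProvider`; this is an assumption-laden outline (the `isCurrentRoom` check reflects my understanding of the API, not code from the session):

```swift
import ARKit

// Sketch: observe room anchors and react when the user changes rooms.
func observeRooms() async throws {
    let session = ARKitSession()
    let roomTracking = RoomTrackingProvider()
    try await session.run([roomTracking])
    for await update in roomTracking.anchorUpdates {
        // isCurrentRoom indicates whether this anchor describes the
        // room the user currently occupies, letting the app customize
        // the experience per room.
        if update.anchor.isCurrentRoom {
            print("User is now in room \(update.anchor.id)")
        }
    }
}
```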

  5. Integration with Existing Frameworks: visionOS is built on the foundation of macOS, iOS, and iPadOS, so many tools and frameworks are shared across these platforms. SwiftUI, RealityKit, and ARKit are core to developing spatial apps for visionOS; if your app already uses these frameworks, you are well on your way to building a great spatial computing app. This is highlighted in the Platforms State of the Union session.
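The cross-platform pattern described above can be sketched with a SwiftUI view hosting RealityKit content through `RealityView`; the view name and entity are illustrative only:

```swift
import SwiftUI
import RealityKit

// Sketch: the same SwiftUI + RealityKit pattern used on other Apple
// platforms carries over to visionOS via RealityView.
struct GlobeView: View {
    var body: some View {
        RealityView { content in
            // Build RealityKit entities just as you would elsewhere.
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)]
            )
            content.add(sphere)
        }
    }
}
```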

For more detailed information, refer to the following sessions, which provide comprehensive insights into the new capabilities and how to leverage them for creating immersive AR experiences.

Explore object tracking for visionOS

Find out how you can use object tracking to turn real-world objects into virtual anchors in your visionOS app. Learn how you can build spatial experiences with object tracking from start to finish. Find out how to create a reference object using machine learning in Create ML and attach content relative to your target object in Reality Composer Pro, RealityKit or ARKit APIs.

Optimize for the spatial web

Discover how to make the most of visionOS capabilities on the web. Explore recent updates like improvements to selection highlighting, and the ability to present spatial photos and panorama images in fullscreen. Learn to take advantage of existing web standards for dictation and text-to-speech with WebSpeech, spatial soundscapes with WebAudio, and immersive experiences with WebXR.

Design great visionOS apps

Find out how to create compelling spatial computing apps by embracing immersion, designing for eyes and hands, and taking advantage of depth, scale, and space. We’ll share several examples of great visionOS apps and explore how their designers approached creating new experiences for the platform.

Design interactive experiences for visionOS

Learn how you can design a compelling interactive narrative experience for Apple Vision Pro from the designers of Encounter Dinosaurs. Discover how these types of experiences differ from existing apps, media, and games, and explore how to design narratives that bring audiences into new worlds. Find out how you can create stories that adapt to any space and size, provide multiple levels of interaction to make them accessible to all, and use animation, spatial audio, and custom gestures to further immerse people in your experience.

Platforms State of the Union

Discover the newest advancements on Apple platforms.

Platforms State of the Union 5-Minute Recap

Watch a quick recap of the newest advancements on Apple platforms.

Create enhanced spatial computing experiences with ARKit

Learn how to create captivating immersive experiences with ARKit’s latest features. Explore ways to use room tracking and object tracking to further engage with your surroundings. We’ll also share how your app can react to changes in your environment’s lighting on this platform. Discover improvements in hand tracking and plane detection which can make your spatial experiences more intuitive.

Build compelling spatial photo and video experiences

Learn how to adopt spatial photos and videos in your apps. Explore the different types of stereoscopic media and find out how to capture spatial videos in your iOS app on iPhone 15 Pro. Discover the various ways to detect and present spatial media, including the new QuickLook Preview Application API in visionOS. And take a deep dive into the metadata and stereo concepts that make a photo or video spatial.