How does the newly updated Vision Pro SDK work for creating AR experiences?
Generated on 8/6/2024
The newly updated Vision Pro SDK offers several enhancements for creating AR experiences. Here are some key features and improvements:
- Hand Interaction: You can now decide whether the user's hands appear in front of or behind virtual content, giving you more creative control over the app experience. This is part of the extended fidelity of scene-understanding capabilities (Platforms State of the Union).
- Scene Understanding: Scene-understanding capabilities have been significantly improved. Planes can now be detected in all orientations, allowing objects to be anchored on a wider variety of surfaces. In addition, room anchors can consider the user's surroundings on a per-room basis, and you can detect when the user moves between rooms (Platforms State of the Union).
- Object Tracking API: A new object tracking API for visionOS lets you attach content to individual objects found around the user, such as virtual instructions pinned to a physical object, adding new dimensions of interactivity (Platforms State of the Union).
- ARKit Enhancements: ARKit, the framework at the core of spatial apps on visionOS, has received several updates (Create enhanced spatial computing experiences with ARKit):
  - Room Tracking: Customize experiences based on the room the user is in.
  - Plane Detection: Improved to detect planes in all orientations.
  - Object Tracking: Attach content to physical objects.
  - Hand Tracking: Enhanced for more responsive, interactive experiences.
- Framework Integration: visionOS is built on the foundation of macOS, iOS, and iPadOS, so many tools and frameworks are shared across these platforms. This lets developers write code once and deploy it on multiple platforms. Core frameworks such as SwiftUI, RealityKit, and ARKit are essential for developing spatial apps (Platforms State of the Union).
- Developer Resources: Apple has published more samples and documentation to help developers get started with spatial computing on visionOS (Platforms State of the Union).
These updates make it easier for developers to create rich, interactive AR experiences on the Vision Pro platform.
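As a sketch of how these pieces fit together, the plane detection and hand tracking described above can be combined in a single `ARKitSession` using visionOS data providers. This is a hedged sketch, not a complete app: the provider configuration and the way you consume anchor updates will depend on your own scene.

```swift
import ARKit

// Sketch only: run plane detection and hand tracking together in one
// ARKitSession, following the visionOS data-provider model.
let session = ARKitSession()
let planes = PlaneDetectionProvider(alignments: [.horizontal, .vertical])
let hands = HandTrackingProvider()

func runTracking() async {
    do {
        try await session.run([planes, hands])
        for await update in planes.anchorUpdates {
            // Each PlaneAnchor can serve as an anchor point for virtual
            // content, now including vertical surfaces such as walls.
            print("Plane update:", update.event)
        }
    } catch {
        print("ARKit session failed:", error)
    }
}
```

In a real app you would typically attach RealityKit entities at the detected anchors rather than printing, and handle the session's authorization and error states.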
Optimize for the spatial web
Discover how to make the most of visionOS capabilities on the web. Explore recent updates like improvements to selection highlighting, and the ability to present spatial photos and panorama images in fullscreen. Learn to take advantage of existing web standards for dictation and text-to-speech with WebSpeech, spatial soundscapes with WebAudio, and immersive experiences with WebXR.
Design great visionOS apps
Find out how to create compelling spatial computing apps by embracing immersion, designing for eyes and hands, and taking advantage of depth, scale, and space. We’ll share several examples of great visionOS apps and explore how their designers approached creating new experiences for the platform.
Design interactive experiences for visionOS
Learn how you can design a compelling interactive narrative experience for Apple Vision Pro from the designers of Encounter Dinosaurs. Discover how these types of experiences differ from existing apps, media, and games, and explore how to design narratives that bring audiences into new worlds. Find out how you can create stories that adapt to any space and size, provide multiple levels of interaction to make them accessible to all, and use animation, spatial audio, and custom gestures to further immerse people in your experience.
Platforms State of the Union
Discover the newest advancements on Apple platforms.
Platforms State of the Union 5-Minute Recap
Watch a quick recap of the newest advancements on Apple platforms.
Create enhanced spatial computing experiences with ARKit
Learn how to create captivating immersive experiences with ARKit’s latest features. Explore ways to use room tracking and object tracking to further engage with your surroundings. We’ll also share how your app can react to changes in your environment’s lighting on this platform. Discover improvements in hand tracking and plane detection which can make your spatial experiences more intuitive.
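As an illustrative sketch of the object tracking mentioned in this session, a reference object is loaded and handed to an `ObjectTrackingProvider`; the asset name below is a hypothetical placeholder, and the exact setup may differ in your app.

```swift
import ARKit

// Sketch only: attach virtual content to a physical object.
// "Lamp.referenceobject" is a hypothetical bundled asset.
func trackObject() async throws {
    guard let url = Bundle.main.url(forResource: "Lamp",
                                    withExtension: "referenceobject") else { return }
    let referenceObject = try await ReferenceObject(from: url)
    let objectTracking = ObjectTrackingProvider(referenceObjects: [referenceObject])

    let session = ARKitSession()
    try await session.run([objectTracking])

    for await update in objectTracking.anchorUpdates {
        // The anchor's transform gives the physical object's pose, which you
        // can use to position virtual instructions alongside it in RealityKit.
        print("Object anchor event:", update.event)
    }
}
```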
Build compelling spatial photo and video experiences
Learn how to adopt spatial photos and videos in your apps. Explore the different types of stereoscopic media and find out how to capture spatial videos in your iOS app on iPhone 15 Pro. Discover the various ways to detect and present spatial media, including the new QuickLook Preview Application API in visionOS. And take a deep dive into the metadata and stereo concepts that make a photo or video spatial.
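As a minimal sketch of the Quick Look preview path this session describes, a spatial photo can be presented with the visionOS `PreviewApplication` API. The file name here is a hypothetical placeholder, and this assumes the photo carries the spatial metadata discussed in the session.

```swift
import QuickLook

// Sketch only: present a spatial photo with the visionOS Quick Look
// PreviewApplication API. "garden.heic" is a hypothetical spatial photo.
if let url = Bundle.main.url(forResource: "garden", withExtension: "heic") {
    let previewSession = PreviewApplication.open(urls: [url])
    // Keep the returned session handle if you need to dismiss the preview later.
    _ = previewSession
}
```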