What new features are in the visionOS 2 SDK?

Generated on 7/31/2024

The visionOS 2 SDK introduces several new features to enhance the development of spatial computing applications. Here are some of the key updates:

  1. Hand Positioning: Developers can now decide if the user's hands appear in front of or behind the content, offering more creative control over the app experience.
  2. Enhanced Scene Understanding: Scene understanding has been significantly improved. Planes can now be detected in all orientations, allowing objects to be anchored on a wider variety of surfaces. Additionally, room anchors let apps consider the user's surroundings on a per-room basis, and movement between rooms can be detected.
  3. Object Tracking API: A new object tracking API allows developers to attach content to individual objects found around the user. This enables attaching virtual content, like instructions, to physical objects for new dimensions of interactivity.
  4. Known Object Tracking: Object tracking of known reference objects can now be tuned through configurable parameters, letting apps optimize tracking for different use cases. For example, you can change the maximum number of objects tracked and adjust the tracking rate separately for static and dynamic objects.
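
The scene-understanding features above are exposed through ARKit data providers on visionOS. Here is a minimal sketch, assuming a visionOS 2 app with an open immersive space; the provider names come from ARKit for visionOS, but the `.slanted` alignment and the exact parameters shown are assumptions, not confirmed by this document:

```swift
import ARKit

// Hedged sketch: starts ARKit providers for the features described above.
// Requires an open ImmersiveSpace on visionOS 2.
func startSpatialTracking() async throws {
    let session = ARKitSession()

    // Plane detection in all orientations
    // (.slanted is assumed to cover non-horizontal/vertical surfaces).
    let planes = PlaneDetectionProvider(alignments: [.horizontal, .vertical, .slanted])

    // Per-room scene understanding and room-change detection.
    let rooms = RoomTrackingProvider()

    try await session.run([planes, rooms])

    for await update in planes.anchorUpdates {
        // Anchor virtual content to any detected surface here.
        print("Plane \(update.anchor.id), alignment: \(update.anchor.alignment)")
    }
}
```

In a real app you would run this from a task tied to the immersive space's lifecycle and handle authorization and provider-support checks before calling `run`.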

For more details, you can refer to the Platforms State of the Union session.

Relevant Sessions

  1. Platforms State of the Union
  2. Introducing enterprise APIs for visionOS
  3. Create enhanced spatial computing experiences with ARKit

Introducing enterprise APIs for visionOS

Find out how you can use new enterprise APIs for visionOS to create spatial experiences that enhance employee and customer productivity on Apple Vision Pro.

Platforms State of the Union

Discover the newest advancements on Apple platforms.

Create enhanced spatial computing experiences with ARKit

Learn how to create captivating immersive experiences with ARKit’s latest features. Explore ways to use room tracking and object tracking to further engage with your surroundings. We’ll also share how your app can react to changes in your environment’s lighting on this platform. Discover improvements in hand tracking and plane detection which can make your spatial experiences more intuitive.

Get started with HealthKit in visionOS

Discover how to use HealthKit to create experiences that take full advantage of the spatial canvas. Learn the capabilities of HealthKit on the platform, find out how to bring an existing iPadOS app to visionOS, and explore the special considerations governing HealthKit during a Guest User session. You’ll also learn ways to use SwiftUI, Swift Charts, and Swift concurrency to craft innovative experiences with HealthKit.

Platforms State of the Union 5-Minute Recap

Watch a quick recap of the newest advancements on Apple platforms.

Explore machine learning on Apple platforms

Get started with an overview of machine learning frameworks on Apple platforms. Whether you're implementing your first ML model or you're an ML expert, we'll offer guidance to help you select the right framework for your app's needs.

Bring your iOS or iPadOS game to visionOS

Discover how to transform your iOS or iPadOS game into a uniquely visionOS experience. Increase the immersion (and fun factor!) with a 3D frame or an immersive background. And invite players further into your world by adding depth to the window with stereoscopy or head tracking.

Discover Swift enhancements in the Vision framework

The Vision Framework API has been redesigned to leverage modern Swift features like concurrency, making it easier and faster to integrate a wide array of Vision algorithms into your app. We’ll tour the updated API and share sample code, along with best practices, to help you get the benefits of this framework with less coding effort. We’ll also demonstrate two new features: image aesthetics and holistic body pose.