Is there a session that explains how to use ARKit in visionOS to detect the floor and place things on it?

Generated on 8/27/2024

Yes. The session "Create enhanced spatial computing experiences with ARKit" covers updates to plane detection, including detection of horizontal surfaces such as floors. It explains how plane detection data is delivered in the form of plane anchors, which hold information about surfaces detected in the real world. You can find more details in the chapter on Plane detection.
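As a rough sketch of the approach described above (not code from the session itself — entity names like `rootEntity` and the placed box are placeholder assumptions), you can run a `PlaneDetectionProvider` in an `ARKitSession`, filter incoming plane anchors for the `.floor` classification, and position content with the anchor's transform:

```swift
import ARKit
import RealityKit

// Minimal sketch: detect floor planes and place a box on each one.
// `rootEntity` is a placeholder for an entity already added to your RealityView.
func detectFloor(rootEntity: Entity) async {
    let session = ARKitSession()
    let planeDetection = PlaneDetectionProvider(alignments: [.horizontal])

    do {
        try await session.run([planeDetection])
    } catch {
        print("Failed to start ARKit session: \(error)")
        return
    }

    for await update in planeDetection.anchorUpdates {
        let anchor = update.anchor
        // Plane anchors carry a classification; keep only floors.
        guard anchor.classification == .floor else { continue }

        if update.event == .added {
            // Place content using the anchor's transform relative to the world origin.
            let box = ModelEntity(
                mesh: .generateBox(size: 0.2),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)]
            )
            box.transform = Transform(matrix: anchor.originFromAnchorTransform)
            rootEntity.addChild(box)
        }
    }
}
```

Note that plane detection requires an immersive space and world-sensing authorization; a production version would also handle `.updated` and `.removed` events to keep placed content in sync.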

Additionally, the session "Dive deep into volumes and immersive spaces" discusses placing objects relative to a floor anchor using the new spatial tracking session API in RealityKit. This is covered in the section on Immersive spaces: Anchored UI interactions.
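The RealityKit route mentioned above can be sketched roughly as follows (a minimal, hedged example — the view name, marker entity, and minimum bounds are illustrative assumptions, not code from the session): run a `SpatialTrackingSession` with plane tracking, then attach content to an `AnchorEntity` targeting a floor-classified horizontal plane:

```swift
import SwiftUI
import RealityKit

// Minimal sketch: anchor a marker to the floor using RealityKit's
// spatial tracking session API.
struct FloorAnchoredView: View {
    var body: some View {
        RealityView { content in
            // Request plane tracking so floor anchors resolve.
            let session = SpatialTrackingSession()
            let configuration = SpatialTrackingSession.Configuration(tracking: [.plane])
            _ = await session.run(configuration)

            // Target a horizontal plane classified as a floor.
            let floorAnchor = AnchorEntity(
                .plane(.horizontal, classification: .floor, minimumBounds: [0.5, 0.5])
            )
            let marker = ModelEntity(
                mesh: .generateCylinder(height: 0.01, radius: 0.1),
                materials: [SimpleMaterial(color: .green, isMetallic: false)]
            )
            floorAnchor.addChild(marker)
            content.add(floorAnchor)
        }
    }
}
```

Running the spatial tracking session is what allows your app to read the anchor's transform and place other content relative to it, as the session's Anchored UI interactions section describes.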

Dive deep into volumes and immersive spaces

Discover powerful new ways to customize volumes and immersive spaces in visionOS. Learn to fine-tune how volumes resize and respond to people moving around them. Make volumes and immersive spaces interact through the power of coordinate conversions. Find out how to make your app react when people adjust immersion with the Digital Crown, and use a surrounding effect to dynamically customize the passthrough tint in your immersive space experience.

Create enhanced spatial computing experiences with ARKit

Learn how to create captivating immersive experiences with ARKit’s latest features. Explore ways to use room tracking and object tracking to further engage with your surroundings. We’ll also share how your app can react to changes in your environment’s lighting on this platform. Discover improvements in hand tracking and plane detection which can make your spatial experiences more intuitive.

Explore object tracking for visionOS

Find out how you can use object tracking to turn real-world objects into virtual anchors in your visionOS app. Learn how you can build spatial experiences with object tracking from start to finish. Find out how to create a reference object using machine learning in Create ML and attach content relative to your target object with Reality Composer Pro, RealityKit, or ARKit APIs.

Platforms State of the Union

Discover the newest advancements on Apple platforms.

Build a spatial drawing app with RealityKit

Harness the power of RealityKit through the process of building a spatial drawing app. As you create an eye-catching spatial experience that integrates RealityKit with ARKit and SwiftUI, you'll explore how resources work in RealityKit and how to use features like low-level mesh and texture APIs to achieve fast updates of the user's brush strokes.