Which visionOS WWDC session explains how to place entities at the tip of hands to touch things in space?

Generated on 8/1/2024

The visionOS WWDC session that explains how to place entities at the tip of hands to touch things in space is "Build a spatial drawing app with RealityKit." This session covers setting up spatial hand tracking and using anchor entities to affix RealityKit entities to hand anchors, specifically the thumb tip and index fingertip.

You can find this information in the chapter titled "Set up spatial tracking" of the session "Build a spatial drawing app with RealityKit."
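For orientation, here is a minimal sketch of the approach the session describes: run a spatial tracking session with hand tracking, then attach content to hand-location anchor entities. It assumes visionOS 2's SpatialTrackingSession API; the view and helper names (HandTipMarkersView, makeMarker) are illustrative and not taken from the session's sample project.

```swift
import SwiftUI
import RealityKit

// Minimal sketch: anchor small spheres to the thumb tip and index finger tip.
// Assumes visionOS 2's SpatialTrackingSession; HandTipMarkersView and
// makeMarker are illustrative names, not the session's sample code.
struct HandTipMarkersView: View {
    var body: some View {
        RealityView { content in
            // Request hand tracking so hand anchor transforms are available
            // to the app. (In a real app, keep the session alive for the
            // lifetime of the view rather than as a local value.)
            let session = SpatialTrackingSession()
            let configuration = SpatialTrackingSession.Configuration(tracking: [.hand])
            _ = await session.run(configuration)

            // Affix an entity to the right thumb tip.
            let thumbTip = AnchorEntity(.hand(.right, location: .thumbTip))
            thumbTip.addChild(makeMarker())
            content.add(thumbTip)

            // Affix another entity to the right index finger tip.
            let indexTip = AnchorEntity(.hand(.right, location: .indexFingerTip))
            indexTip.addChild(makeMarker())
            content.add(indexTip)
        }
    }

    // A 5 mm sphere used as a visible marker at each fingertip.
    private func makeMarker() -> ModelEntity {
        ModelEntity(
            mesh: .generateSphere(radius: 0.005),
            materials: [SimpleMaterial(color: .white, isMetallic: false)]
        )
    }
}
```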

Here is the ordered list of relevant sessions:

  1. Build a spatial drawing app with RealityKit
  2. Platforms State of the Union
  3. Discover RealityKit APIs for iOS, macOS and visionOS
  4. Create enhanced spatial computing experiences with ARKit
  5. Explore object tracking for visionOS

Create enhanced spatial computing experiences with ARKit

Learn how to create captivating immersive experiences with ARKit’s latest features. Explore ways to use room tracking and object tracking to further engage with your surroundings. We’ll also share how your app can react to changes in your environment’s lighting on this platform. Discover improvements in hand tracking and plane detection which can make your spatial experiences more intuitive.

Platforms State of the Union

Discover the newest advancements on Apple platforms.

Optimize for the spatial web

Discover how to make the most of visionOS capabilities on the web. Explore recent updates like improvements to selection highlighting, and the ability to present spatial photos and panorama images in fullscreen. Learn to take advantage of existing web standards for dictation and text-to-speech with WebSpeech, spatial soundscapes with WebAudio, and immersive experiences with WebXR.

Build a spatial drawing app with RealityKit

Harness the power of RealityKit through the process of building a spatial drawing app. As you create an eye-catching spatial experience that integrates RealityKit with ARKit and SwiftUI, you’ll explore how resources work in RealityKit and how to use features like low-level mesh and texture APIs to achieve fast updates of the users’ brush strokes.

Discover RealityKit APIs for iOS, macOS and visionOS

Learn how new cross-platform APIs in RealityKit can help you build immersive apps for iOS, macOS, and visionOS. Check out the new hover effects, lights and shadows, and portal crossing features, and view them in action through real examples.

Explore object tracking for visionOS

Find out how you can use object tracking to turn real-world objects into virtual anchors in your visionOS app. Learn how you can build spatial experiences with object tracking from start to finish. Find out how to create a reference object using machine learning in Create ML and attach content relative to your target object in Reality Composer Pro, RealityKit or ARKit APIs.