how do I get real time hand tracking data from VisionOS

Asked on 07/29/2024


To get real-time hand tracking data on visionOS, use ARKit's hand tracking provider. Here are the steps and relevant details:

  1. Hand Tracking Provider: visionOS lets you track a person's hands and fingers through `HandTrackingProvider`. You can either poll the provider for the latest hand anchors or receive anchor updates asynchronously as they become available. As of visionOS 2, the provider delivers data at display rate, which minimizes latency.

  2. Hand Anchors: Use hand anchors to detect gestures and to anchor virtual content to a person's hands. This can be done with either ARKit or RealityKit.

  3. Hands Prediction API: To minimize the delay in hand tracking results, ARKit can predict hand anchors expected to exist in the near future. This is particularly useful if your app renders with Compositor Services or RealityKit.

  4. Setting Up in RealityKit: In RealityKit, you can use anchor entities (`AnchorEntity`) to affix an entity to a hand location without running an ARKit session yourself. For example, you can create anchor entities for joints such as the thumb tip and index fingertip.
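To make step 1 concrete, here is a minimal sketch of receiving hand anchors asynchronously with `ARKitSession` and `HandTrackingProvider`. It assumes you call `startHandTracking()` from an async context in an immersive space, and that your app has the hand tracking usage description in its Info.plist; error handling is kept minimal.

```swift
import ARKit

let session = ARKitSession()
let handTracking = HandTrackingProvider()

func startHandTracking() async {
    do {
        try await session.run([handTracking])
    } catch {
        print("Failed to start hand tracking: \(error)")
        return
    }

    // Receive hand anchor updates asynchronously as they become available.
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked, let skeleton = anchor.handSkeleton else { continue }

        // Position of the index fingertip joint, expressed in world space:
        // world transform = anchor transform * joint transform.
        let indexTip = skeleton.joint(.indexFingerTip)
        let worldTransform = anchor.originFromAnchorTransform
            * indexTip.anchorFromJointTransform
        _ = worldTransform // use for gesture detection, content placement, etc.
    }
}
```

If you prefer polling instead of the async sequence, `handTracking.latestAnchors` returns the most recent `(leftHand:, rightHand:)` pair of optional hand anchors.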
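For step 3, the sketch below assumes the visionOS 2 hands prediction call `HandTrackingProvider.handAnchors(at:)`, which returns hand anchors predicted for a given timestamp. The `presentationTime` parameter is a placeholder: when rendering with Compositor Services you would derive it from the frame's trackable-anchor time rather than pick it yourself.

```swift
import ARKit

// Query predicted hand anchors for a near-future timestamp so rendered
// content lines up with where the hands will be when the frame is shown.
func predictedHands(from provider: HandTrackingProvider,
                    presentationTime: TimeInterval) {
    let (leftHand, rightHand) = provider.handAnchors(at: presentationTime)
    if let left = leftHand, left.isTracked {
        _ = left.originFromAnchorTransform // predicted left-hand pose
    }
    if let right = rightHand, right.isTracked {
        _ = right.originFromAnchorTransform // predicted right-hand pose
    }
}
```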
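And for step 4, a sketch of the RealityKit route: hand-targeted `AnchorEntity` instances attach content to hand locations with no ARKit session or authorization prompt (the trade-off is that you don't get the raw anchor transforms back). The view name and sphere content here are illustrative.

```swift
import SwiftUI
import RealityKit

// Attach a small sphere to the left hand's thumb tip and index fingertip.
struct HandAnchoredContentView: View {
    var body: some View {
        RealityView { content in
            let locations: [AnchoringComponent.Target.HandLocation] =
                [.thumbTip, .indexFingerTip]
            for location in locations {
                let anchor = AnchorEntity(.hand(.left, location: location))
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.005),
                    materials: [SimpleMaterial(color: .cyan, isMetallic: false)]
                )
                anchor.addChild(sphere)
                content.add(anchor)
            }
        }
    }
}
```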

For more detailed information, refer to Apple's WWDC sessions on hand tracking, which provide comprehensive guidance on implementing and using hand tracking in visionOS.