Gesture recognizers

Generated on 8/12/2024

Gesture recognizers were discussed in several sessions at WWDC 2024, with a particular focus on their use in visionOS and UIKit.

The session "Explore game input in visionOS" explores system gestures such as tap, double tap, pinch and hold, and pinch and drag. These gestures can be used to interact with objects in a game, enabling actions like zooming, rotating, and moving objects with hand movements. The session also covers how to combine system gestures for more complex interactions and how to create custom gestures using full hand-skeleton tracking through ARKit. For more details, refer to the Explore game input in visionOS session.
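As a rough illustration of how those system gestures can target RealityKit content, the sketch below attaches a SpatialTapGesture to a RealityView and scales whatever entity the player taps. The sphere content and the scale response are placeholder assumptions, not code from the session.

```swift
import SwiftUI
import RealityKit

struct GameInputView: View {
    var body: some View {
        RealityView { content in
            // Placeholder content: a small tappable sphere.
            let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
            sphere.components.set(InputTargetComponent())    // opt the entity in to input
            sphere.generateCollisionShapes(recursive: false) // collision shape for hit testing
            content.add(sphere)
        }
        // System tap gesture routed to whichever entity the player looks at and pinches.
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Placeholder response: grow the tapped entity slightly.
                    value.entity.scale *= 1.1
                }
        )
    }
}
```

For custom gestures, the session points to full hand-skeleton tracking through ARKit. The following rough sketch (assuming an immersive space is open and hand-tracking authorization has been granted) reads hand anchor updates and compares the thumb and index fingertip positions to detect a pinch-like pose; the 1.5 cm threshold is an arbitrary illustrative value.

```swift
import ARKit
import simd

let session = ARKitSession()
let handTracking = HandTrackingProvider()

func detectCustomPinch() async throws {
    try await session.run([handTracking])
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked, let skeleton = anchor.handSkeleton else { continue }

        // Both joints are expressed relative to the hand anchor, so their
        // column-3 translations can be compared directly.
        let thumbTip = skeleton.joint(.thumbTip).anchorFromJointTransform.columns.3
        let indexTip = skeleton.joint(.indexFingerTip).anchorFromJointTransform.columns.3
        let distance = simd_distance(SIMD3(thumbTip.x, thumbTip.y, thumbTip.z),
                                     SIMD3(indexTip.x, indexTip.y, indexTip.z))
        if distance < 0.015 {
            // Fingertips closer than ~1.5 cm: treat as a custom pinch.
            print("Custom pinch detected")
        }
    }
}
```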

In the session "What’s new in UIKit," the integration of gesture recognizers between UIKit and SwiftUI is discussed. iOS 18 introduces the ability to add existing UIKit gesture recognizers directly to SwiftUI hierarchies using the new UIJs recognizer representable protocol. This session also covers how to coordinate gestures across both frameworks, ensuring seamless interaction between single and double tap gestures. For more information, see the What’s new in UIKit session.

These sessions provide a comprehensive overview of how gesture recognizers can be utilized and coordinated across different platforms and frameworks in the Apple ecosystem.