How to make haptics in SwiftUI

Generated on 7/31/2024


To add haptics in SwiftUI, you can use the sensory feedback APIs, including the new feedback styles introduced for alignment, snapping, and completing a path. These APIs are available in both SwiftUI and UIKit. Here is a brief overview of how to implement haptics in SwiftUI:

  1. Use Sensory Feedback Modifiers: SwiftUI offers sensory feedback modifiers that you can add to your views. These modifiers will request feedback whenever the trigger value changes.

  2. Example Implementation:

    • Define state variables that will trigger the feedback.
    • Use sensory feedback modifiers on the specific views where you want the feedback to occur.

Here is a code snippet to illustrate this:

import SwiftUI

struct ContentView: View {
    @State private var alignmentTrigger: Int = 0
    @State private var pathCompleteTrigger: Int = 0

    var body: some View {
        VStack {
            // Your custom canvas or view
            CanvasView()
                // Hypothetical custom modifiers you would implement yourself
                // to report alignment and path-completion events; they are
                // not built-in SwiftUI APIs.
                .onAlignmentChange {
                    alignmentTrigger += 1
                }
                .onPathComplete {
                    pathCompleteTrigger += 1
                }
                // sensoryFeedback takes its trigger by value, not as a
                // binding, and fires whenever that value changes.
                .sensoryFeedback(.alignment, trigger: alignmentTrigger)
                .sensoryFeedback(.pathComplete, trigger: pathCompleteTrigger)
        }
    }
}

In this example:

  • CanvasView is a placeholder for the custom view in which you want to provide haptic feedback.
  • onAlignmentChange and onPathComplete are hypothetical custom modifiers you would implement yourself to increment the state variables; they are not built-in SwiftUI APIs.
  • The sensoryFeedback modifiers request haptic feedback whenever their trigger values change. Note that the trigger is passed by value, not as a binding.
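For the common case of standard feedback, the built-in sensoryFeedback(_:trigger:) modifier (iOS 17 and later) can be used directly with system-provided values such as .success or .impact, with no custom modifiers required. A minimal sketch, where FeedbackButton is an illustrative name:

```swift
import SwiftUI

// Minimal sketch of the built-in sensoryFeedback(_:trigger:) modifier.
// The feedback plays each time the trigger value changes; the trigger
// is any Equatable value, passed by value rather than as a binding.
struct FeedbackButton: View {
    @State private var tapCount = 0

    var body: some View {
        Button("Tap me") {
            tapCount += 1
        }
        // Plays the system success haptic whenever tapCount changes.
        .sensoryFeedback(.success, trigger: tapCount)
    }
}
```

Because the modifier is driven by value changes, any Equatable state (a counter, an enum, a Bool) can act as the trigger, which keeps the feedback logic declarative rather than imperative.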

For more details, you can refer to the session Squeeze the most out of Apple Pencil at WWDC 2024.

Relevant Sessions

Catch up on accessibility in SwiftUI

SwiftUI makes it easy to build amazing experiences that are accessible to everyone. We’ll discover how assistive technologies understand and navigate your app through the rich accessibility elements provided by SwiftUI. We’ll also discuss how you can further customize these experiences by providing more information about your app’s content and interactions by using accessibility modifiers.

What’s new in UIKit

Explore everything new in UIKit, including tab and document launch experiences, transitions, and text and input changes. We’ll also discuss better-than-ever interoperability between UIKit and SwiftUI animations and gestures, as well as general improvements throughout UIKit.

Port advanced games to Apple platforms

Discover how simple it can be to reach players on Apple platforms worldwide. We’ll show you how to evaluate your Windows executable on Apple silicon, start your game port with code samples, convert your shader code to Metal, and bring your game to Mac, iPhone, and iPad. Explore enhanced Metal tools that understand HLSL shaders to validate, debug, and profile your ported shaders on Metal.

Discover RealityKit APIs for iOS, macOS and visionOS

Learn how new cross-platform APIs in RealityKit can help you build immersive apps for iOS, macOS, and visionOS. Check out the new hover effects, lights and shadows, and portal crossing features, and view them in action through real examples.

Squeeze the most out of Apple Pencil

New in iOS 18, iPadOS 18, and visionOS 2, the PencilKit tool picker gains the ability to have completely custom tools, with custom attributes. Learn how to express your custom drawing experience in the tool picker using the same great tool picking experience available across the system. Discover how to access the new features of the Apple Pencil Pro, including roll angle, the squeeze gesture, and haptic feedback.