DEV Community

garry


Haptic Feedback Design for Workout Apps

Why Haptics Matter More Than Sound in the Gym

When I built BoxTime, I assumed the bell sound would be the primary way users would know a round had ended. Then I tested it at an actual boxing gym. Between the music, the bag noise, and other people training, you cannot hear your phone. Haptics became the real signal.

The Haptic Engine on iOS

Apple gives you three levels of haptic control, from simple to granular:

UIImpactFeedbackGenerator

The simplest option. Predefined impact styles.

let impact = UIImpactFeedbackGenerator(style: .heavy)
impact.prepare()
impact.impactOccurred()
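For reference, there are five predefined impact styles (`.soft` and `.rigid` arrived in iOS 13); a quick sketch of how you might compare them on a device -- purely illustrative, you would not normally fire them in a loop:

```swift
import UIKit

// The five predefined impact styles, roughly ordered by weight.
// .soft and .rigid (iOS 13+) vary sharpness rather than raw intensity.
let styles: [UIImpactFeedbackGenerator.FeedbackStyle] = [
    .light, .medium, .heavy, .soft, .rigid
]

for style in styles {
    let generator = UIImpactFeedbackGenerator(style: style)
    generator.prepare()        // warm up the Taptic Engine
    generator.impactOccurred() // fire the impact
}
```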

UINotificationFeedbackGenerator

Semantic feedback for success, warning, and error states.

let notification = UINotificationFeedbackGenerator()
notification.prepare()
notification.notificationOccurred(.success)

Core Haptics (CHHapticEngine)

Full control over haptic patterns. This is where it gets interesting for a timer app.

import CoreHaptics

class HapticManager {
    private var engine: CHHapticEngine?

    func prepareEngine() {
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

        do {
            engine = try CHHapticEngine()
            try engine?.start()
        } catch {
            print("Haptic engine failed: \(error)")
        }
    }

    func playRoundEndPattern() throws {
        // Three strong taps with decreasing intervals - feels like a boxing bell
        let events: [CHHapticEvent] = [
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
                ],
                relativeTime: 0
            ),
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
                ],
                relativeTime: 0.15
            ),
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.6),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.4)
                ],
                relativeTime: 0.30
            ),
        ]

        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    }
}
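One caveat with `CHHapticEngine`: the system can stop it at any time (audio session interruption, app backgrounding), so it pays to install the engine's `stoppedHandler` and `resetHandler`. A sketch, assuming an engine like the one in `HapticManager` above:

```swift
import CoreHaptics

// Sketch: react when the system stops or resets the haptic engine.
func installEngineHandlers(on engine: CHHapticEngine) {
    engine.stoppedHandler = { reason in
        // The engine stopped (e.g. audio interruption, backgrounding).
        print("Haptic engine stopped: \(reason)")
    }
    engine.resetHandler = {
        // The haptic server recovered; the engine must be restarted manually.
        do {
            try engine.start()
        } catch {
            print("Failed to restart haptic engine: \(error)")
        }
    }
}
```

Without the reset handler, a pattern played after an interruption can silently do nothing.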

Designing a Haptic Language

In BoxTime, I use different haptic patterns for different events, so the user can feel what is happening without looking at the screen:

  • Round start: Two sharp taps (get ready, fight)
  • Round end: Three descending taps (imitates a bell resonating)
  • 10-second warning: Single soft pulse (heads up)
  • Workout complete: Long success pattern (celebration feel)

The key insight is that haptic patterns should be distinct enough to be recognized without visual context. If your round-start and round-end patterns feel similar, the user has to look at the screen to know what happened, which defeats the purpose.
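One way to keep such a vocabulary organized is a single enum that maps each timer event to its tap sequence. A hypothetical sketch -- the event names and timing values here are illustrative, not BoxTime's actual numbers:

```swift
import CoreHaptics

// Hypothetical haptic vocabulary: one case per timer event.
enum TimerHaptic {
    case roundStart, roundEnd, tenSecondWarning, workoutComplete

    // Each tap: (intensity, sharpness, relativeTime in seconds).
    var taps: [(Float, Float, TimeInterval)] {
        switch self {
        case .roundStart:       return [(1.0, 1.0, 0), (1.0, 1.0, 0.12)]
        case .roundEnd:         return [(1.0, 0.8, 0), (0.8, 0.6, 0.15), (0.6, 0.4, 0.30)]
        case .tenSecondWarning: return [(0.5, 0.3, 0)]
        case .workoutComplete:  return [(1.0, 0.7, 0), (0.8, 0.5, 0.2), (1.0, 0.9, 0.5)]
        }
    }

    // Translate the tuples into Core Haptics transient events.
    var events: [CHHapticEvent] {
        taps.map { intensity, sharpness, time in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: intensity),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: sharpness)
                ],
                relativeTime: time
            )
        }
    }
}
```

Centralizing the patterns makes it easy to audition them side by side and verify they feel distinct.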

The .prepare() Call Matters

Haptic engines have a spin-up latency. If you create a generator and immediately fire it, there may be a perceptible delay. Always call .prepare() ahead of time.

// Bad: latency on first fire
func roundEnding() {
    let impact = UIImpactFeedbackGenerator(style: .heavy)
    impact.impactOccurred() // might be delayed
}

// Good: pre-warm a long-lived generator
let upcomingHaptic = UIImpactFeedbackGenerator(style: .heavy)

func roundStarted() {
    upcomingHaptic.prepare() // warm up for the end-of-round haptic
}

func roundEnding() {
    upcomingHaptic.impactOccurred() // fires immediately
}

In BoxTime, I prepare the next haptic event as soon as the current phase starts. By the time the round ends, the engine is ready.

Testing Realities

The iOS Simulator does not support haptics. You must test on a real device. And not all devices support Core Haptics -- the iPhone 7 supports basic UIFeedbackGenerator haptics but not the full CHHapticEngine. Always check CHHapticEngine.capabilitiesForHardware().supportsHaptics and fall back gracefully.
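That fallback can live in one small function; a sketch, reusing the `HapticManager` from earlier:

```swift
import UIKit
import CoreHaptics

// Sketch: use the richest haptic API the device supports,
// falling back from Core Haptics to UIImpactFeedbackGenerator.
func playRoundEndHaptic(using manager: HapticManager) {
    if CHHapticEngine.capabilitiesForHardware().supportsHaptics {
        // Full Core Haptics pattern (iPhone 8 and later).
        try? manager.playRoundEndPattern()
    } else {
        // Basic impact haptic for older hardware like the iPhone 7.
        let impact = UIImpactFeedbackGenerator(style: .heavy)
        impact.prepare()
        impact.impactOccurred()
    }
}
```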

What I Learned

Haptics are not a nice-to-have in fitness apps. They are essential. A boxer with gloves on cannot easily tap their phone screen. They need to feel the transitions. Investing time in distinct, well-timed haptic patterns was one of the highest-impact improvements I made to BoxTime.
