Maksim Ponomarev
Haptic Feedback and AVAudioSession Conflicts in iOS: Troubleshooting Recording Issues

In my previous article, Haptic Feedback in iOS: A Comprehensive Guide, we explored the four primary methods for implementing haptic feedback in iOS applications. We covered UIImpactFeedbackGenerator, CHHapticEngine, SwiftUI's .sensoryFeedback() modifier, and the low-level AudioServicesPlaySystemSound() API. Each approach has its strengths and appropriate use cases for delivering tactile feedback to users.

However, there's a critical challenge that wasn't fully addressed: what happens when you try to use haptics in apps that record audio or video? Many developers encounter a frustrating issue where haptics work perfectly in testing, but mysteriously fail once video or audio recording begins. This problem stems from complex interactions between the AVAudioSession API and iOS's haptic feedback systems.

In this article, we'll dive deep into these conflicts, understand why they occur, and explore practical solutions to ensure your haptics remain reliable even during active recording sessions.

The AVAudioSession Challenge

AVAudioSession is iOS's central mechanism for managing audio behavior in your app. It controls how your app's audio interacts with other audio on the device, handles audio routing (speaker vs. receiver vs. headphones), and manages microphone access. When you start recording video or audio, you must configure the audio session appropriately:

try AVAudioSession.sharedInstance().setCategory(
    .playAndRecord,
    mode: .videoRecording,
    options: [.defaultToSpeaker, .allowBluetooth]
)
try AVAudioSession.sharedInstance().setActive(true)

The problem arises when calling setActive(true). This seemingly innocuous method can interfere with haptic delivery, causing UIImpactFeedbackGenerator and CHHapticEngine to fail silently or produce inconsistent results. The root cause lies in how iOS prioritizes audio resources and manages the Taptic Engine's interaction with the audio subsystem.

Why Haptics Fail During Recording

1. Audio Session Category Restrictions

Certain AVAudioSession categories are designed to take exclusive control over audio hardware, which can inadvertently affect haptic delivery. The .record category, for example, prioritizes audio input and may deprioritize or block haptic feedback to prevent interference.

Even the .playAndRecord category—commonly used for video recording—can create conflicts. When the audio session activates, it reconfigures the audio hardware routing, and this reconfiguration can disrupt the pathways that haptic generators rely on.

// This configuration can interfere with haptics
try AVAudioSession.sharedInstance().setCategory(
    .playAndRecord,
    mode: .videoRecording
)
try AVAudioSession.sharedInstance().setActive(true)

// Haptics may now fail or be inconsistent
impactGenerator.impactOccurred() // May not work

2. Audio Session Mode Impact

The mode parameter further refines audio behavior, and .videoRecording mode is particularly problematic for haptics. This mode optimizes audio input for video capture, applying noise cancellation and adjusting gain levels. These optimizations can create timing conflicts or resource contention that affects the Taptic Engine.

Other modes like .voiceChat, .videoChat, and .measurement have similar effects, each prioritizing specific audio characteristics that may not coexist well with haptic delivery.
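If your app can tolerate less aggressive input processing, one mitigation worth testing is to avoid the more restrictive modes entirely. The sketch below assumes that tradeoff is acceptable (the function name is mine, not an API); it is a starting point for experimentation, not a guaranteed fix:

```swift
import AVFoundation

func configureSessionPreferringHaptics() throws {
    let session = AVAudioSession.sharedInstance()
    // .default applies fewer input optimizations than .videoRecording,
    // which may reduce contention with the Taptic Engine
    try session.setCategory(.playAndRecord,
                            mode: .default,
                            options: [.defaultToSpeaker])
    try session.setActive(true)
}
```

The cost is losing the video-tuned noise cancellation and gain adjustments, so compare recorded audio quality before committing to this.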

3. The Lifecycle Problem with UIImpactFeedbackGenerator

UIImpactFeedbackGenerator requires calling prepare() to prime the Taptic Engine for low-latency response. However, when you activate an audio session, the internal state that prepare() establishes can be invalidated:

let generator = UIImpactFeedbackGenerator(style: .heavy)
generator.prepare() // Taptic Engine is ready

// Start recording
try AVAudioSession.sharedInstance().setActive(true)

// The prepare() state may now be invalid
generator.impactOccurred() // May fail or have high latency

The generator doesn't automatically detect or recover from this state change. You must manually re-prepare the generator after audio session changes, but even this doesn't guarantee success if the audio session maintains exclusive control over shared resources.
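One defensive pattern is to re-prime immediately before every impact: prepare() is inexpensive, so calling it right before impactOccurred() makes a stale priming state less likely to matter. A minimal sketch (the wrapper class is mine, not a UIKit type):

```swift
import UIKit

// Illustrative wrapper: re-prepares right before each impact so that
// priming state invalidated by an audio session change gets refreshed
final class ResilientImpactGenerator {
    private let generator: UIImpactFeedbackGenerator

    init(style: UIImpactFeedbackGenerator.FeedbackStyle) {
        generator = UIImpactFeedbackGenerator(style: style)
        generator.prepare()
    }

    func impact() {
        generator.prepare()        // cheap; restores low-latency priming
        generator.impactOccurred()
    }
}
```

This does not help when the audio session blocks Taptic Engine access outright, but it removes stale priming as a variable.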

4. CHHapticEngine Session Conflicts

CHHapticEngine can experience similar issues. When an AVAudioSession activates, particularly with recording categories, it may trigger the haptic engine's stoppedHandler or cause the engine to reset. The engine's internal audio rendering pipeline conflicts with the active audio session's configuration:

hapticEngine?.stoppedHandler = { reason in
    print("Engine stopped: \(reason.rawValue)")
    // Often triggered when AVAudioSession activates
}

hapticEngine?.resetHandler = {
    print("Engine reset - attempting restart")
    try? self.hapticEngine?.start()
    // May fail to restart properly during active recording
}

Even if you successfully restart the engine, pattern playback may be inconsistent or fail entirely while recording is active.

Common Issues and Their Symptoms

Silent Haptic Failures

The most insidious issue is when haptic methods execute without errors but produce no tactile feedback. The API calls succeed, but the Taptic Engine never activates:

// No error thrown, but no haptic feedback occurs
impactGenerator.impactOccurred()
try? player?.start(atTime: 0)

This happens because the audio session has deprioritized haptic output, and the haptic APIs don't expose this state clearly.

Delayed or Inconsistent Haptics

Sometimes haptics work intermittently—firing correctly on the first tap but failing on subsequent attempts, or experiencing multi-second delays. This indicates resource contention where the audio session and haptic engine are competing for shared hardware access.

Haptics That Work Before Recording, Fail During, and Recover After

A telltale pattern: haptics function perfectly until you call movieOutput.startRecording(), then fail completely during recording, and resume working after movieOutput.stopRecording(). This clearly points to AVAudioSession activation as the culprit.

The Missing Configuration: setAllowHapticsAndSystemSoundsDuringRecording

Apple provides a solution, but it's not widely known and often overlooked: setAllowHapticsAndSystemSoundsDuringRecording(true). This method explicitly tells the audio session to permit haptic feedback even when recording is active:

try AVAudioSession.sharedInstance().setCategory(
    .playAndRecord,
    mode: .videoRecording,
    options: [.defaultToSpeaker, .allowBluetooth]
)

// Critical: Allow haptics during recording
try AVAudioSession.sharedInstance().setAllowHapticsAndSystemSoundsDuringRecording(true)

try AVAudioSession.sharedInstance().setActive(true)

Why this helps:

  • Explicitly signals to iOS that haptic feedback is intentional and should coexist with recording
  • Prevents the audio session from deprioritizing or blocking Taptic Engine access
  • Works with UIImpactFeedbackGenerator, CHHapticEngine, and .sensoryFeedback()

However, even this configuration doesn't guarantee success in all scenarios. Some device states, audio routing configurations, or iOS versions may still exhibit conflicts.
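Note that this API is only available from iOS 13. If your deployment target is earlier, guard the call (the helper name here is mine):

```swift
import AVFoundation

func enableHapticsDuringRecordingIfPossible() {
    if #available(iOS 13.0, *) {
        try? AVAudioSession.sharedInstance()
            .setAllowHapticsAndSystemSoundsDuringRecording(true)
    }
    // Pre-iOS 13 there is no equivalent switch; fall back to the
    // workarounds described below
}
```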

Workarounds and Solutions

Solution 1: Deactivate and Reactivate Audio Session Around Haptics

One approach is to briefly deactivate the audio session, trigger the haptic, then reactivate:

func triggerHapticDuringRecording() {
    // Temporarily deactivate audio session
    try? AVAudioSession.sharedInstance().setActive(false)

    // Trigger haptic
    impactGenerator.impactOccurred()

    // Reactivate audio session
    try? AVAudioSession.sharedInstance().setCategory(
        .playAndRecord,
        mode: .videoRecording,
        options: [.defaultToSpeaker, .allowBluetooth]
    )
    try? AVAudioSession.sharedInstance().setAllowHapticsAndSystemSoundsDuringRecording(true)
    try? AVAudioSession.sharedInstance().setActive(true)
}

Pros:

  • Can restore haptic functionality when other methods fail
  • Works with existing haptic generator objects

Cons:

  • Introduces audio artifacts or brief interruptions in recording
  • Adds significant latency (50-200ms) to haptic feedback
  • May cause the audio session to glitch or require full reconfiguration
  • Not suitable for real-time or frequent haptics

Solution 2: Use AudioServicesPlaySystemSound() for Reliability

As mentioned in my previous article, AudioServicesPlaySystemSound() bypasses the higher-level haptic APIs entirely and directly accesses the Taptic Engine:

import AudioToolbox

// Reliable haptic even during recording
AudioServicesPlaySystemSound(1519) // Strong haptic

Why this works:

  • Independent of AVAudioSession: Operates at the AudioToolbox framework level, below where audio session conflicts occur
  • Used by Apple's Camera app: The native iOS Camera app uses this exact API for shutter and zoom haptics
  • No lifecycle management: No preparation, no engine state, no generator objects to manage
  • Immediate execution: Synchronous call with minimal overhead

Limitations:

  • No customization of intensity, duration, or pattern
  • Limited to predefined system haptic IDs (1519, 1520, 1521, 4095)
  • Less sophisticated than CHHapticEngine patterns
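To keep the magic numbers readable, it can help to wrap them in a small enum. The case names below are mine, and because these IDs are undocumented, their exact effects can vary by device and iOS version:

```swift
import AudioToolbox

// Illustrative wrapper around the commonly cited system haptic IDs
enum SystemHaptic: SystemSoundID {
    case peek = 1519     // short tap ("peek")
    case pop = 1520      // stronger tap ("pop")
    case nope = 1521     // three quick pulses ("nope")
    case vibrate = 4095  // long vibration (kSystemSoundID_Vibrate)

    func play() {
        AudioServicesPlaySystemSound(rawValue)
    }
}

// Usage: SystemHaptic.pop.play()
```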

Solution 3: Configure Audio Session Before Creating Generators

Initialize your audio session with the correct configuration before creating haptic generators:

// Step 1: Configure audio session FIRST
func setupAudioSession() {
    try? AVAudioSession.sharedInstance().setCategory(
        .playAndRecord,
        mode: .videoRecording,
        options: [.defaultToSpeaker, .allowBluetooth]
    )
    try? AVAudioSession.sharedInstance().setAllowHapticsAndSystemSoundsDuringRecording(true)
    try? AVAudioSession.sharedInstance().setActive(true)
}

// Step 2: Then create generators
func setupHaptics() {
    impactGenerator.prepare()
    try? hapticEngine?.start()
}

// App initialization order matters
func application(
    _ application: UIApplication,
    didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil
) -> Bool {
    setupAudioSession()  // First
    setupHaptics()       // Second
    return true
}

This approach ensures generators are created in an environment where the audio session is already configured, reducing the likelihood of state conflicts.

Solution 4: Monitor and Respond to Audio Session Interruptions

Set up observers to detect audio session changes and re-prepare your haptic generators:

NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: nil,
    queue: .main
) { notification in
    guard let userInfo = notification.userInfo,
          let typeValue = userInfo[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: typeValue) else {
        return
    }

    if type == .ended {
        // Re-prepare haptics after interruption ends
        self.impactGenerator.prepare()
        try? self.hapticEngine?.start()
    }
}

Solution 5: Test with Different Audio Session Options

The options parameter in setCategory() can affect haptic behavior. Experiment with different combinations:

// Option 1: Minimal options
try AVAudioSession.sharedInstance().setCategory(
    .playAndRecord,
    mode: .videoRecording,
    options: [.defaultToSpeaker]
)

// Option 2: With mixing (may help)
try AVAudioSession.sharedInstance().setCategory(
    .playAndRecord,
    mode: .videoRecording,
    options: [.defaultToSpeaker, .mixWithOthers]
)

// Option 3: Allow Bluetooth but avoid mixing
try AVAudioSession.sharedInstance().setCategory(
    .playAndRecord,
    mode: .videoRecording,
    options: [.defaultToSpeaker, .allowBluetooth]
)

The .mixWithOthers option allows your audio to play alongside other apps' audio, which may reduce resource contention for haptics. However, it can also introduce unwanted audio mixing behavior.

Best Practices for Haptics in Recording Apps

1. Always Call setAllowHapticsAndSystemSoundsDuringRecording(true)

This should be standard practice for any app that records audio or video and uses haptics:

try AVAudioSession.sharedInstance().setAllowHapticsAndSystemSoundsDuringRecording(true)

2. Initialize Audio Session Early

Configure your audio session in AppDelegate or @main struct initialization, before any haptic setup:

@main
struct CameraApp: App {
    init() {
        setupAudioSession()
    }

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

3. Use AudioServicesPlaySystemSound() for Critical Haptics

For essential user feedback like shutter button taps or recording start/stop, use the low-level API to ensure reliability:

func shutterButtonTapped() {
    // Bypasses audio session conflicts, so it keeps working mid-recording
    AudioServicesPlaySystemSound(1519)
    capturePhoto()
}

4. Implement Fallback Strategies

Have a fallback hierarchy:

func triggerFeedback() {
    // Try UIImpactFeedbackGenerator first
    impactGenerator.impactOccurred()

    // iOS provides no success callback for haptics, so `hapticSucceeded`
    // must be an app-maintained heuristic (e.g. cleared while recording)
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.05) {
        if !self.hapticSucceeded {
            AudioServicesPlaySystemSound(1519)
        }
    }
}

5. Test on Physical Devices

Haptic issues often don't manifest in the simulator. Always test on real hardware, ideally multiple device models and iOS versions, as the behavior can vary significantly.
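As part of device testing, it is also worth confirming the hardware supports haptics at all — older iPhones and all iPads lack a Taptic Engine:

```swift
import CoreHaptics

var supportsHaptics = false
if #available(iOS 13.0, *) {
    supportsHaptics = CHHapticEngine.capabilitiesForHardware().supportsHaptics
}
// On unsupported hardware, skip haptic setup entirely
```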

6. Consider User Preferences

Provide a setting to disable haptics during recording if conflicts persist:

if !isRecording || userPreferences.allowHapticsWhileRecording {
    triggerHaptic()
}

Debugging Checklist

When haptics fail during recording, systematically check:

  1. Is setAllowHapticsAndSystemSoundsDuringRecording(true) called?
  2. Is the audio session activated before haptic generator creation?
  3. Are you testing on a physical device (not simulator)?
  4. Are you calling prepare() on generators after audio session changes?
  5. Is the haptic engine's stoppedHandler being triggered?
  6. Does AudioServicesPlaySystemSound(1519) work as a test?
  7. Are you using .videoRecording mode (which may be more restrictive)?
  8. Have you tried different audio session options?

Conclusion

Building on the haptic implementation techniques covered in my previous article, we've now explored the specific challenges that arise when combining haptics with audio and video recording. The interaction between AVAudioSession and haptic feedback APIs is one of iOS development's more frustrating hidden complexities.

While setAllowHapticsAndSystemSoundsDuringRecording(true) solves many cases, some scenarios require more robust solutions like AudioServicesPlaySystemSound() or careful lifecycle management. For camera and recording apps, the most reliable approach is a hybrid strategy: use AudioServicesPlaySystemSound() for critical feedback that must always work, reserve UIImpactFeedbackGenerator and CHHapticEngine for non-recording contexts, and always configure your audio session before initializing haptic systems.

By understanding these conflicts and implementing appropriate workarounds, you can deliver consistent, reliable haptic feedback throughout your app's recording experience—creating the polished, professional feel that users expect from modern iOS applications.

Example on GitHub:
https://github.com/Maxnxi/TestHaptic
