There’s a pattern in QA that plays out more often than anyone likes to admit. A team runs a full regression sprint. Automation is green across the board. Every form, every flow, every edge case, all covered. Then an accessibility auditor walks in and flags something nobody considered: What happens when a user with motor disabilities makes a typo?
On iOS, the answer is supposed to be simple. Shake the device, tap “Undo,” move on. It’s one of those small Apple conveniences most people forget exists. But here’s the thing: if someone physically cannot shake a phone, that convenience becomes a wall. Users with tremors, limited dexterity, paralysis, or muscle weakness can’t just give their iPhone a wiggle. They depend on Assistive Touch to trigger that same shake action digitally.
Most QA teams have never tested for it. Not once.
The shake gesture sits in a blind spot between “native OS behavior” and “app-level functionality,” so it falls through the cracks of every test plan out there. Meanwhile, over 2.5 billion people globally need assistive technology, accessibility lawsuits jumped 37% in 2025, and there’s an ADA Title II deadline staring everyone down in April 2026.
That gap needs closing, fast. And it needs to happen without shipping 15 physical iPhones to a distributed testing team.
Testing the Shake Gesture (Without Shaking Anything)
TestMu AI’s Real Device Cloud solves this cleanly: the Shake gesture can be simulated directly from the testing toolbar during a live device session. No physical device in hand, no workarounds, no hacks. The shake triggers via Assistive Touch on a real iPhone running in the cloud, and the OS responds exactly the way it would for an actual user.
Here’s the exact flow, step by step.
Step 1: Spin Up a Real Device Session
From the TestMu AI dashboard, head to Real Device > Browser Testing. Enter the application URL, pick an iOS device (iPhone or iPad running iOS 14 or later), and hit Start. The session connects to an actual device in the cloud, opens the browser, and loads the URL. No emulators, no simulators. Real hardware.
Step 2: Get Some Text on Screen
Navigate to any form in the app. Login page, search bar, registration form, anything with a text input. Type something in. A username, a search query, a few words of dummy text. There needs to be content in the field so the undo action has something to reverse.
The best approach is to test against the most form-heavy flow in the app, because that’s where real users are most likely to make mistakes and need to undo.
Step 3: Turn On Assistive Touch
In the left sidebar of the testing toolbar, expand iOS Settings and toggle Assistive Touch to ON. A small circular floating button appears on the device screen. That’s the Assistive Touch trigger, the exact interface motor-impaired users rely on to interact with iOS.
Step 4: Open the Assistive Touch Menu
Tap the floating Assistive Touch button. A dark overlay menu pops up with five options: Notification Center, Home, App Switcher, Shake, and Screenshot. This is what a user with motor disabilities sees every time they need to perform an action that would normally require a physical gesture.
Step 5: Hit “Shake”
Tap Shake. iOS responds instantly, as if the device were physically shaken. Because there’s text sitting in a form field, the system fires the “Undo” prompt: “Undo: Double-tap with three fingers.” Then the Undo Typing dialog appears, with Cancel and Undo buttons.
This is the critical moment. If this dialog doesn’t show up, or if it shows up but the buttons aren’t tappable, that’s an accessibility failure.
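If the app under test is a native iOS app rather than a web page, one common reason the dialog never appears is that shake-to-undo has been switched off at the application level. UIKit exposes this as a single flag; a minimal sketch, with a hypothetical app delegate standing in for real app code:

```swift
import UIKit

@main
class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions
                     launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // This property defaults to true. If any code in the app sets it
        // to false, the shake gesture -- physical or via Assistive Touch --
        // stops presenting the Undo Typing dialog everywhere in the app,
        // which is exactly the failure Step 5 is designed to catch.
        application.applicationSupportsShakeToEdit = true
        return true
    }
}
```

Worth grepping a native codebase for `applicationSupportsShakeToEdit` before filing the bug elsewhere.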
Step 6: Confirm the Undo
Tap Undo. The text disappears from the form field. That’s the confirmation: shake-to-undo works correctly through Assistive Touch, and a motor-impaired user can recover from a typing error without any physical gesture.
That’s the whole test. Six steps. Under two minutes. And it validates something most teams have never checked.
What This Actually Validates
This isn’t just a “does the shake work?” test. Running through this flow confirms several things at once:
- Undo functionality works end-to-end. Users can reverse text input without needing keyboard shortcuts, multi-finger gestures, or physical motion.
- Assistive Touch integration is intact. The shake action triggered through Assistive Touch produces the exact same result as a physical shake. No difference in behavior, no missing dialogs, no broken flows.
- The Undo dialog is actually accessible. The dialog appears, renders correctly, and the buttons are tappable. System dialogs getting partially obscured by custom UI elements is more common than anyone expects.
- Motor-impaired users have a clear error-recovery path. This is the big one. If someone with a motor disability fills out a form wrong, can they fix it? This test answers that question definitively.
These checks map directly to WCAG compliance requirements around operable interfaces and input assistance, both of which auditors scrutinize closely.
Don’t Stop at Undo: Test Custom Motion Features Too
Here’s where it gets more interesting. Apps that use custom shake-triggered features (shake-to-refresh, shake-to-shuffle a playlist, shake-to-report a bug) need the same treatment. Trigger the shake through Assistive Touch and see if the custom feature responds.
If it doesn’t? That’s an accessibility gap that needs an alternative input method. And it’s far better to catch it in a cloud session than in a lawsuit or an app store review rejection.
A common finding: native undo works fine through Assistive Touch, but app-level shake features listen only to raw accelerometer data, so they completely ignore the Assistive Touch shake. These are the kinds of bugs that slip past every automated accessibility scan and only surface when someone actually tests the assistive path on a real device.
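On the native side, the usual fix is to detect shakes through UIKit’s motion-event pipeline instead of CoreMotion thresholds: Assistive Touch delivers its simulated shake as a `motionShake` event, but it produces no accelerometer readings for `CMMotionManager` to see. A minimal sketch, where the view controller and the shake-triggered feature are illustrative stand-ins:

```swift
import UIKit

final class PlaylistViewController: UIViewController {

    // A responder must be first responder to receive motion events.
    override var canBecomeFirstResponder: Bool { true }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        becomeFirstResponder()
    }

    // UIKit routes both a physical shake and the Assistive Touch
    // simulated shake through this callback. A detector built on
    // CMMotionManager accelerometer data would miss the simulated
    // shake entirely -- the accessibility gap described above.
    override func motionEnded(_ motion: UIEvent.EventSubtype,
                              with event: UIEvent?) {
        guard motion == .motionShake else { return }
        shufflePlaylist() // hypothetical shake-to-shuffle feature
    }

    private func shufflePlaylist() {
        // ... feature logic ...
    }
}
```

Even with this in place, WCAG’s motion-actuation guidance still expects a non-motion alternative (a button or menu item) for the same feature.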
For teams looking to build a more comprehensive mobile accessibility testing workflow, the shake gesture test is a strong starting point because it validates both native OS behavior and app-level feature integration in a single pass.
Why This Belongs in Every iOS Regression Cycle
Motion-based accessibility is the kind of thing teams “plan to get to eventually.” But eventually doesn’t hold up against tightening regulations and a user base that expects inclusive design by default.
Testing the Shake gesture through TestMu AI’s Assistive Touch simulation ensures that users with motor disabilities aren’t locked out of basic functionality in an iOS app. It validates a critical error-recovery path, covers compliance requirements that are tightening by the quarter, and runs entirely from a cloud-based real device session.
Pair this with the right set of accessibility testing tools and WCAG scanning integrated into the CI/CD pipeline, and motion-based accessibility stops being a blind spot.
One test. Two minutes. And a meaningful difference for millions of users who can’t just shake it off.