Automating the Automation?
If you've ever written UI tests with XCUI, you've probably struggled with locating elements using accessibility identifiers, labels, or even element paths. It's tedious, repetitive, and easy to break.
Well, good news: with Xcode 26, Apple is making this process significantly easier.
UI tests are now so tightly integrated into Xcode that you can practically use it as a no-code test builder. Just hit Record, interact with your app, and Xcode automatically writes the test code for you.
Of course, you still have full control to modify the test as needed - because at the end of the day, it's still updating your XCTestCase file.
First Things First - Accessibility Is Key
⚠️ Before we go any further, let this sink in: ⚠️
Accessibility isn't just for users - it's essential for automation.
If you're serious about automation, investing in a proper accessibility setup is not optional.
Xcode relies on accessibility APIs to identify and interact with UI elements during tests. And according to Apple (not just me 😄), this is the most reliable way to interact with elements in your UI tests.
A key tool here is `accessibilityIdentifier`, which acts as a unique key for automation.
For example, in SwiftUI you can simply add:

```swift
.accessibilityIdentifier("usernameTextField")
```
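In context, a small form might tag each interactive element so tests can find them reliably. Here's a minimal sketch - the view and identifier names are illustrative, not from any particular app:

```swift
import SwiftUI

// Hypothetical login form; the identifier strings are illustrative.
struct LoginView: View {
    @State private var username = ""
    @State private var password = ""

    var body: some View {
        VStack(spacing: 16) {
            TextField("Username", text: $username)
                .accessibilityIdentifier("usernameTextField")

            SecureField("Password", text: $password)
                .accessibilityIdentifier("passwordSecureField")

            Button("Log In") { /* perform login */ }
                .accessibilityIdentifier("loginButton")
        }
        .padding()
    }
}
```

With identifiers in place, UI tests can query elements by a stable key instead of relying on display text or index paths, both of which break easily when the UI changes.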
💡 Tip: Use the Accessibility Inspector in Xcode to audit and debug your UI elements.
So What's New? We're Recording Test Flows Now!
The demo from WWDC25 showcasing the recording tool was super slick.
You start by creating a new UI Test target, then hit Record and perform your actions in the app - just like a user would. Xcode then auto-generates the corresponding XCTest code in real time: tapping buttons, entering text, navigating screens, and so on.
When you open your UI test case file, you'll see a Record button in the sidebar. Click it, start interacting with your app, and watch the test code get written automatically as you go.
The output is standard XCTest code, so you can always refine it manually. The example from the session showed editing a trip plan title and navigating to another screen. It captured everything - tap gestures, keyboard input, navigation.
This approach is much faster than writing tests from scratch and ensures the automation reflects real user behavior.
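Because the output is plain XCUITest code, anything the recorder produces you could also have written by hand. A hand-written sketch of that kind of test might look like this - the app flow and identifiers here are assumptions for illustration:

```swift
import XCTest

final class LoginFlowTests: XCTestCase {
    // A sketch of the kind of test the recorder generates.
    // Element identifiers and the "homeTitle" assertion are assumed examples.
    func testLoginFlow() throws {
        let app = XCUIApplication()
        app.launch()

        // Type into the username field.
        let username = app.textFields["usernameTextField"]
        username.tap()
        username.typeText("demo@example.com")

        // Type into the password field.
        let password = app.secureTextFields["passwordSecureField"]
        password.tap()
        password.typeText("secret-password")

        // Tap the login button and verify navigation succeeded.
        app.buttons["loginButton"].tap()
        XCTAssertTrue(app.staticTexts["homeTitle"].waitForExistence(timeout: 5))
    }
}
```

The recorder's real value is producing this boilerplate for you; the review step is where you tighten queries and add assertions like the `waitForExistence` check above.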
Reviewing Failures Visually
One of the best parts of this update is the test result viewer. If a UI test fails, you can:
- Watch a video recording of the test run, with touch interactions shown.
- Jump to the exact step where it failed using the timeline.
- See element overlays, showing what Xcode was trying to tap or assert.
- Compare actual vs expected metadata.
This kind of context-rich feedback makes debugging UI tests a lot more approachable. No more guesswork - you can see what went wrong.
Takeaways & Tips
Having tried this in Xcode 26, my first impressions are really positive:
- It works great - if you've defined clear accessibility identifiers for all interactive elements
- Add unique `accessibilityIdentifier`s to make elements reliably testable
- Experience it yourself: use the Recorder to jumpstart test creation, especially for complex flows
- Always review the generated code - it's not always perfect
- Some limitations remain - for example, I tested selecting time in a picker, and the generated code didn't work out of the box
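For cases like the picker, the usual fix is to replace the recorded gesture with a direct value selection. XCUITest's `adjust(toPickerWheelValue:)` sets a picker wheel to a value explicitly, which is far more reliable than replaying recorded swipes. A sketch, assuming a hypothetical "timePicker" identifier:

```swift
import XCTest

// Manually fixing a recorded picker interaction.
// The "timePicker" identifier and the wheel values are assumptions
// for illustration - adapt them to your own UI.
func selectTime(in app: XCUIApplication) {
    let picker = app.pickers["timePicker"]

    // Set the hour wheel (index 0) and minute wheel (index 1) directly.
    picker.pickerWheels.element(boundBy: 0).adjust(toPickerWheelValue: "9")
    picker.pickerWheels.element(boundBy: 1).adjust(toPickerWheelValue: "30")
}
```

Treat the recorder's output as a starting point: keep the taps and navigation it captures, and swap in explicit APIs like this where gestures don't replay cleanly.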
If you've been avoiding UI automation because it felt too fragile or time-consuming, now's a great time to revisit it. The improvements in Xcode 26 - especially around recording and reviewing tests - make it far more approachable.
I'm definitely planning to use this more in my own apps, especially to validate critical flows across locales and devices. If you haven't seen the session yet, I highly recommend watching it here.
Thanks for reading - and happy automating!