
Aditya (Ruben) Prasad

Automated Usability Testing: Maze vs Lookback

What is Automated Usability Testing?

Automation is the ability to carry out a task with little human assistance. User testing software can be used to automate research, especially when recruiting remote participants. Usability testing software facilitates communication with users, collects data automatically, and shows where users succeeded or failed, enabling you to develop better products.

In order for designers and developers alike to produce beautiful, user-friendly software, they must test the usability of their designs and implementations. This can be achieved with the use of various usability testing tools, whether through eye tracking or automated tools. In this blog, I will be comparing two automated tools, Maze and Lookback.

To successfully draw a conclusion, these are the aspects that will be compared between the two tools:

  1. Usability Metrics
  2. Ease of Use
  3. Remote Testing Capabilities
  4. Integration with Design Tools

This blog is based on my personal experience with these two tools in a test I carried out. The test was set up so that the user had to create an account on the website used to create the tests.


Justification for the Choice of Tools

  • Maze was chosen for its ability to conduct quantitative usability testing by turning design files into realistic prototypes, making it suitable for assessing user interaction flow and navigation.
     

  • Lookback was chosen for its robust capabilities in remote user testing, especially its screen sharing and video recording features, which enable in-depth qualitative analysis.

Combining Maze's quantitative assessment with Lookback's qualitative analysis offers an in-depth understanding of the user experience.


Usability Metrics

  • Maze excels at providing quantitative success rates, giving a clear indication of how well users accomplish predefined tasks within the prototype. It also provides precise measurements of the time users take to complete tasks, along with error rates (a rough sketch of how such metrics can be computed follows this list).
     

  • Lookback gives more qualitative insight into user behavior during testing sessions; its video recordings let you capture and observe navigation patterns, hesitation points, and interactions outside the scope of the predefined tasks.
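
To make the quantitative side more concrete, here is a minimal sketch of how success rate, average time on task, and misclick rate could be computed from raw session data. The data structure and numbers are hypothetical, purely for illustration; they do not reflect how Maze computes its metrics internally.

```python
from dataclasses import dataclass

@dataclass
class Session:
    """One participant's attempt at a single task (hypothetical data)."""
    completed: bool    # did the participant reach the task's end screen?
    duration_s: float  # time on task, in seconds
    clicks: int        # total clicks recorded
    misclicks: int     # clicks that landed outside any defined hotspot

# Illustrative sample of five recorded sessions
sessions = [
    Session(True, 34.2, 9, 1),
    Session(True, 51.0, 12, 3),
    Session(False, 80.5, 20, 9),
    Session(True, 28.7, 8, 0),
    Session(False, 95.1, 17, 6),
]

success_rate = sum(s.completed for s in sessions) / len(sessions)
avg_time_on_task = sum(s.duration_s for s in sessions) / len(sessions)
misclick_rate = sum(s.misclicks for s in sessions) / sum(s.clicks for s in sessions)

print(f"Success rate:  {success_rate:.0%}")        # 60%
print(f"Avg time/task: {avg_time_on_task:.1f} s")  # 57.9 s
print(f"Misclick rate: {misclick_rate:.0%}")       # 29%
```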
     

Ease of Use

  • Maze offers a simplified user interface that makes it easy to set up tests and prototypes. It offers multiple templates so you can quickly and effectively set up and deploy automated tests. Maze also makes data analysis easy thanks to its design metrics, which include misclicks and time on screen.
     

  • Lookback requires little technical knowledge to deploy a remote user session. It supports both moderated and unmoderated sessions. While interviews are being conducted in moderated sessions, you can collaborate with team members and take notes to highlight key points. The downside of Lookback is that a live website is required for testing.
     

Remote Testing Capabilities

  • Maze focuses on unmoderated testing and offers a variety of question types, such as multiple choice, Likert scale, and open-ended questions (a rough sketch of such a test plan appears after this list). Maze's strong suit is its built-in analytics. It is best suited for quick and easy insight into a new design, but it lacks the real-time interaction that Lookback offers, which can provide a more in-depth understanding of how the user interacts with the design.

     

  • Lookback is mainly used for moderated testing, with its screen sharing, audio/video, and live chat capabilities. It is ideal for more in-depth interviews and exploration, allowing researchers to ask follow-up questions, clarify tasks, and gain more insight into the way the user thinks. Lookback primarily focuses on capturing user sessions rather than providing the advanced data analysis tools that Maze offers.
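
As a point of reference, an unmoderated test plan mixing the question types mentioned above could be represented roughly as follows. This is a hypothetical structure sketched for illustration only, not Maze's actual configuration format or API.

```python
# Hypothetical sketch of an unmoderated test plan; not Maze's actual format.
test_plan = {
    "title": "Account sign-up flow",
    "blocks": [
        {   # the task participants perform in the prototype
            "type": "mission",
            "prompt": "Create a new account starting from the landing page.",
        },
        {
            "type": "multiple_choice",
            "prompt": "Which step felt least clear?",
            "options": ["Email entry", "Password rules", "Confirmation screen"],
        },
        {   # 1 = very difficult, 5 = very easy
            "type": "likert",
            "prompt": "How easy was it to create an account?",
            "scale": 5,
        },
        {
            "type": "open_ended",
            "prompt": "What, if anything, would you change about the sign-up flow?",
        },
    ],
}

# Print a quick outline of the plan
for block in test_plan["blocks"]:
    print(f'{block["type"]:<16} {block["prompt"]}')
```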
     

Integration with Design Tools

  • Maze integrates seamlessly with design tools such as Figma and Adobe XD. This allows you to import a prototype directly into Maze, eliminating the need for a live website to build and deploy a test. Any changes made to prototypes are automatically synced with Maze, maintaining consistency and streamlining the testing process. If your design and testing workflows are tightly integrated, Maze's design tool integration offers significant efficiency benefits.
     

  • Lookback focuses on recording user interactions with existing webpages. It does not allow you to use a prototype file to conduct testing. Nevertheless, it is suitable for situations where design tools are not used.
     


During testing it is important to acknowledge the risk of bias. Strategies should be used to minimize this risk, including but not limited to recruiting a diverse group of participants, preferably from your target audience, and carefully crafting tasks so they do not lead users toward a specific outcome.

Final Thoughts

Choosing the right user testing tools depends on your specific research goals and project requirements. Both Maze and Lookback offer valuable capabilities.

Maze has a streamlined setup process, asynchronous testing, multiple question types, data analysis tools, and design tool integration. It is ideal for quantitative usability testing and for assessing user interaction flows and navigation. Maze is best suited to quickly testing and getting feedback on a prototype before implementing the design. Although it has limited qualitative capabilities, it makes up for that with how easy it makes large-scale testing of a prototype.

Lookback's lightweight setup makes it ideal for qualitative user research and for understanding user behavior and motives. It offers real-time interaction, screen sharing, and video recording. Lookback is great if you wish to get a deeper understanding of how users interact with your designs. It is not best suited to large-scale testing, but this is offset by the fact that you speak directly with participants, so you can pick up on issues faster than you would in an unmoderated research environment.

By utilizing both tools, you can iteratively improve your designs based on data-driven insights and a deeper understanding of your users' needs.
