Mesmer

AI-powered application testing
Mesmer App Graph preview

OVERVIEW

Mesmer is an AI-powered mobile application QA platform. It is currently the only platform that uses machine learning to automate end-to-end testing, visual testing, and accessibility compliance for native iOS and Android applications.

ROLES

Product Design
Prototyping

App Graph

CHALLENGE

The App Graph (formerly known as "App Map") is the first thing a user sees when logging into Mesmer. Users liked the pleasing build animation but were disappointed by its limited capabilities.

Our goal with the redesign was to make it easier for a user to find screens, paths, and real-time application issues.

TEAM

VP of Product
Product Design
Project Manager
Backend Dev
Frontend Dev

App Graph version 1

This is v1 of the App Graph.

App Graph evolution 1 - secondary connections

We played with the idea of grouping screens and showing links between non-consecutive screens, but the result started to look congested.

App Graph evolution 2 - tree on grid

Aligning screenshots to a grid helped with readability but the small screenshot size still made finding a screen challenging.

App Graph evolution 3 - hub and spoke

A hub-and-spoke layout made better use of the space without forcing a top-down hierarchy onto apps.

App Graph version 2

I found that varying screenshot sizes and hiding deeper screens allowed users to quickly find paths from the user-defined home screen.

App Graph performance filters

Adding dynamic filters to the App Graph lets a user see build, accessibility, and performance issues in real time, all in one place.

Screen Results

CHALLENGE

Users expressed a number of frustrations with what is arguably the most important part of the product: screen results. Screenshots were too small, and editing objects and assertions was confusing.

TEAM

VP of Product
Product Design
Project Manager
Frontend Dev

Old screen result detail

The original designs displayed results in a modal, which limited the space available for tablet screenshots. Users also found editing and adding assertions confusing.

New screen results schematic

We had to accommodate three distinct sections on screen results: screenshots (yellow), issues (purple), and the right drawer (orange), where users can access test, step, object, and accessibility data.

My solution was to make each section a flexbox. Screenshots stay centered above issues, and both contract to accommodate the right drawer opening and closing. A user can slide the detections section down to expand the screenshots.
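The layout described above can be sketched as a minimal flexbox arrangement. This is an illustrative sketch only; the class names and the drawer width are assumptions, not the actual product markup.

```css
/* Hypothetical class names; not the production stylesheet. */
.screen-results {
  display: flex;            /* main row: content area + right drawer */
}
.content {
  flex: 1;                  /* contracts as the drawer opens */
  display: flex;
  flex-direction: column;   /* screenshots stacked above issues */
}
.screenshots {
  flex: 1;                  /* grows when the issues section is slid down */
  display: flex;
  justify-content: center;  /* keeps screenshots centered */
}
.issues {
  flex: 0 0 auto;           /* user-resizable detections section */
}
.right-drawer {
  flex: 0 0 320px;          /* assumed width; holds test, step, object,
                               and accessibility data */
}
```

Because each section flexes independently, opening the drawer or resizing the issues section reflows the screenshots without any manual repositioning.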

New screen results with flexboxes

Users wanted less clutter on the screenshots, so I hid all issue overlays by default. Hovering over an issue highlights it on the screenshot.

New screen result accessibility object issue

Clicking an object issue opens its respective panel and highlights the issue in question. Screenshots and issue details contract to accommodate the drawer.

RESULTS

Users expressed satisfaction with being able to more easily mark detections, edit assertions, and edit gestures. What previously took multiple clicks and opening and closing modals was now one click away.