Heuristic Evaluation & Redesign: Walmart

 
 

Context

 
 

Date: February 2020

What: A heuristic evaluation conducted by a group of 3

Where: BrainStation Vancouver Campus

Role: UX/UI Researcher & Designer

Tools: Figma, G Suite

Timeline: 1 week (with concurrent projects)

Project Goals:

  • Critically evaluate the usability of a mobile app for iOS within a specific user flow, on the basis of Nielsen and Molich's User Interface Design Heuristics 

  • Suggest and implement changes to the UI, and document these in a UI library

  • Communicate the evaluation and subsequent redesign in two respective 8-minute presentations

Chosen Application: Walmart for iOS

Chosen User Flow: Selecting a grocery item and proceeding to checkout 

 

Overview

This was a short-term group project I participated in as part of the UX Diploma program at BrainStation. We were tasked with picking a specific user flow of a mobile app, evaluating it against the User Interface Design Heuristics, and then redesigning the interface according to the opportunities for improvement we identified. As the timeline for this project was rather short, we decided to focus on the following 6 of the 10 heuristics:

  1. Visibility of system status

  2. Match between system and real world

  3. Consistency and standards

  4. Flexibility and efficiency of use

  5. Aesthetic and minimalist design

  6. Help users recognize, diagnose and recover from errors

 
 
[Image: usability heuristics]
 

Methodology

We devised a severity scale from 0 to 4 to quantify how much (if at all) a specific UI element violated the chosen design heuristics, where 0 indicates no violation and 4 a catastrophic one.

Due to the time constraint of the project, we decided to limit our evaluation to 1 instance per heuristic.

 
 

Why Walmart?

Browsing the iOS App Store, we found Walmart among the top 10 free apps. For a top free application it had a rather low rating: at the time, 3.2 stars averaged over thousands of reviews. Looking closer at individual reviews, we noticed high variability in the ratings, from very low to very high. I hypothesized that this variability could be due to the app offering a wide range of functionality, some of it more developed than the rest, and that people's motivations for using the app may be just as variable as a result.

This led us to assume that there would be plenty of opportunities to improve usability.

 
 

Evaluation

 
 

User Flow: Purchasing Coffee Using a Filter

 
 
[Image: original screens for the chosen user flow]
 
 
 

Grocery Page

The issue here is the copy on the CTA: is "shop aisles" meant as a noun phrase, or is "shop" a verb (which doesn't make linguistic sense here)? Moreover, the following page (see below) is inconsistent, as its header switches to "Departments". Either way, the ambiguity of the "Shop aisles" CTA keeps it from communicating the intended metaphor of a shopping aisle.

Severity: 2

Suggested improvement: Reduce ambiguity by rewording the CTA

 
[Image: original Grocery page]
 
[Image: original Shop aisles screen]
 

Aisle Categories

The problem here is insufficient visual hierarchy: the top items, "Rollback" and "All Drinks", are ostensibly meant to be categorically separate from the other items, as indicated by the bold copy. However, this hierarchy is likely too subtle to be noticeable, especially for users with visual impairments.

Severity: 3

Suggested improvement: Strengthen the hierarchy by adding emphasis through scale, colour, space, or structure

 
[Image: original filtering screen]

Filtering Search #1

The categorical separation between "Refine by" and "Filter by" is redundant because, at the time of the evaluation, the user could only select one item from the list at a time. As such, there is an opportunity to reduce clutter on the screen.

Severity: 2

Suggested Improvement: Remove "Refine by" and just include “Category” as a part of the “Filter by” list

 
[Image: original filtering screen, continued]

Filtering Search #2 

The second opportunity for improvement on this screen stems from the same underlying issue as the previous one: because only one selection can be made at a time, the UI element in question is redundant and may confuse users.

Severity: 3

Suggested Improvement: Remove "Clear” and “Apply” CTAs as they have no function on that screen

 
[Image: Add to cart screen]

Add Item

We noticed that when we tried to add an item to the cart, there was no clear indication that the item had been added, for two reasons:

  1. There was a variable delay before the blue confirmation toast appeared after tapping, sometimes more than 5 seconds

  2. The button did not change colour when pressed.

Both of these factors present a significant usability issue: if the feedback is unreliable and there is no visible state change, a user may conceivably tap the button multiple times, only to see seconds later that multiple items have been added. This could be a real annoyance.

Severity: 3

Suggested improvement: Implement an active state for the button (with a slight change in hue)
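To illustrate this suggestion, here is a minimal SwiftUI sketch (purely hypothetical, not based on the Walmart app's actual code) of a button that changes hue and label the moment it is tapped, so the user gets immediate feedback even if the confirmation toast is delayed:

```swift
import SwiftUI

// Hypothetical sketch of the suggested improvement: the button shifts hue and
// label as soon as it is tapped, giving immediate feedback even if the
// confirmation toast is slow to appear.
struct AddToCartButton: View {
    @State private var isAdded = false

    var body: some View {
        Button(action: {
            isAdded = true
            // ... fire the actual add-to-cart request here ...
        }) {
            Text(isAdded ? "Added to cart" : "Add to cart")
                .frame(maxWidth: .infinity)
                .padding()
                .background(isAdded ? Color.blue.opacity(0.6) : Color.blue)
                .foregroundColor(.white)
                .cornerRadius(8)
        }
        .disabled(isAdded) // also prevents accidental double-adds
    }
}
```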

 
[Image: original cart screen]
 

Checkout

When initially testing the app, we didn't immediately notice that our cart amount was insufficient for checkout, and so we were confused as to why the CTA was disabled. This could be a source of frustration for users.

Severity: 3

Suggested improvement: Add a centrally aligned modal that appears upon pressing the Check out CTA, indicating that the cart amount is insufficient.
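As a rough illustration of this suggestion, the hypothetical SwiftUI sketch below keeps the CTA tappable and raises a centrally aligned alert when the cart is below an assumed minimum order amount (the $35 threshold is a placeholder, not Walmart's actual rule):

```swift
import SwiftUI

// Hypothetical sketch of the suggested improvement: rather than silently
// disabling "Check out", keep it tappable and show a centrally aligned modal
// explaining that the cart amount is insufficient.
struct CheckoutButton: View {
    let cartTotal: Double
    let minimumOrder: Double = 35.00   // assumed threshold, for illustration only
    @State private var showMinimumAlert = false

    var body: some View {
        Button("Check out") {
            if cartTotal < minimumOrder {
                showMinimumAlert = true
            } else {
                // ... proceed to checkout ...
            }
        }
        .alert(isPresented: $showMinimumAlert) {
            Alert(
                title: Text("Minimum order not reached"),
                message: Text(String(format: "Add $%.2f more to your cart to check out.",
                                     minimumOrder - cartTotal)),
                dismissButton: .default(Text("OK"))
            )
        }
    }
}
```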

 

Results

We tallied up the scores for the identified heuristics, which yielded an overall score of 16 out of 24 (where 24 would be the absolute worst), or an average of 2.67 per heuristic.
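As a minimal illustration of how this tally works (the snippet below is purely illustrative and was not part of the project deliverables), the per-issue severities from the Evaluation section sum and average as follows:

```swift
import Foundation

// Illustrative tally; severities are the per-issue ratings from the Evaluation section.
let severities: [String: Int] = [
    "Grocery Page (ambiguous CTA copy)": 2,
    "Aisle Categories (weak visual hierarchy)": 3,
    "Filtering Search #1 (redundant 'Refine by' group)": 2,
    "Filtering Search #2 (non-functional 'Clear'/'Apply' CTAs)": 3,
    "Add Item (unreliable add-to-cart feedback)": 3,
    "Checkout (disabled CTA with no explanation)": 3,
]

let total = severities.values.reduce(0, +)               // 16
let worstCase = 4 * severities.count                     // 24 (4 = catastrophic)
let average = Double(total) / Double(severities.count)   // ≈ 2.67

print("Overall: \(total)/\(worstCase), average per heuristic: \(String(format: "%.2f", average))")
```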

Out of all issues identified, none were deemed catastrophic (i.e. none received a rating of 4). However, 4 out of the 6 heuristics received a score of 3 (indicating a major usability issue). As such, there is much room for improvement within the chosen task flow. The most salient problems related to the following themes:

  • Inadequate visual feedback from user action

  • Inconsistent use of design conventions

  • Ambiguous terminology

 
[Image: summary of severity ratings]
 

Redesign

 

Redesigned Screens

[Image: redesigned screens for the chosen user flow]

Interactive Prototype 

 
To show the micro-interactions and state changes in our redesigned screens, we made a prototype showcasing the chosen flow with our modified screens.


 
 

UI Library

 
The final step in the project was to individually prepare a UI library for the updated user flow. Here is a link to a presentation deck showcasing the design library.


 
 

Retrospective

 
 

Key Learnings 

Upon completing this project I learned a few things:

  • Personal biases related to technology usage habits, and to the motivations for using particular smartphone apps in the first place, can cloud your judgement as an evaluator of an app's usability. Key take-away: validate your appraisal of issues and the subsequent redesign with a fresh set of users to mitigate personal biases.

  • Profit incentives can potentially justify otherwise questionable design choices: while my team and I were going through the grocery shopping flow, it struck me how some of the design patterns seemed intentionally redundant, as if to increase time spent on the device. For instance, the 'Clear' and 'Apply' CTAs on the category filter screen seemed especially pernicious, as the only function they appeared to serve (at the time of evaluation) was to increase the user's time in the app. While this is only conjecture and, again, might be a figment of my own usage biases, I was unable to see any other purpose for the UI component. In my most charitable interpretation, the design was always intended to be updated to allow the selection of multiple filters, and the UI was kept merely as a proof of concept for stakeholders. But for my purposes, this assumption is just as unfalsifiable as the previous one. Key take-away: market forces may sometimes act in contradiction to the principles of human-centered design (if only temporarily). UX design may, therefore, be a tightrope walk between appeasing company stakeholders and serving users.


 
 

Next Steps

 
[Image: proposed study design]

Validating the Evaluation & Testing the Redesigned User Flow 

An obvious next step in the project would be to actually test the redesigned user flow. In addition, it would be interesting to validate our group's evaluation of the chosen heuristics with other designers (or people trained in the heuristics).

For such a validation test to yield valid results, some experimental control would need to be implemented, e.g. an independent-groups design in which the independent variable is the order the user flows are evaluated in (new-old vs. old-new).

Designers would first be trained on the 0-4 severity scale we used. The dependent variable would be the weighted mean rating for each user flow across the 2 conditions (although opportunities for more sophisticated analyses exist).

To test whether, in the eyes of other designers, the redesigned user flow is an improvement, a detriment, or unchanged, one could take the difference between the weighted average rating for the old user flow and that for the new user flow. Because lower ratings are better, a difference greater than 0 would support the hypothesis that the redesign is an improvement, and (provided the difference is statistically significant) the null hypothesis could be rejected.
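A minimal sketch of this comparison, using entirely hypothetical ratings (the project produced no such data), might look like the following:

```swift
import Foundation

// Hypothetical sketch of the proposed analysis; all ratings below are made up.
// Each value is one trained evaluator's mean severity rating of a flow on the
// same 0-4 scale (lower = better), pooled across the two presentation orders.
let oldFlowRatings: [Double] = [2.7, 3.0, 2.5, 2.8]
let newFlowRatings: [Double] = [1.8, 2.1, 1.5, 2.0]

func mean(_ values: [Double]) -> Double {
    values.reduce(0, +) / Double(values.count)
}

// A positive difference favours the redesign; a significance test (e.g. an
// independent-samples t-test) would still be needed before rejecting the null.
let difference = mean(oldFlowRatings) - mean(newFlowRatings)

print(String(format: "old: %.2f  new: %.2f  difference: %.2f",
             mean(oldFlowRatings), mean(newFlowRatings), difference))
```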

 
 

Investigate More Opportunities for Improvement 

In the evaluation portion of the project, we found many more opportunities to improve the user flow: both additional instances of the 6 heuristics discussed above, and violations of the remaining 4 heuristics. If you're interested in seeing more of our background research, here's a link to additional research conducted in preparation for this project.