Group Project
January 2023

University of Michigan App Case Study

Expertise

UX Research

Platforms

Research and usability

Deliverables

UX Reports, Heuristic Evaluation

Project Overview

Summary

The following is a section-by-section overview of our research and usability work on the official University of Michigan app. Each section gives a high-level review; a full UX report and supporting appendices are available for each section upon request.

Execution

Interaction map

💡 THE GIST: The following interaction map shows the majority of interactions within the app at the time we began our work. It demonstrates that the app's navigational tabs do not interact with one another, and that the depth and breadth of the tabs vary greatly, with some inconsistency within individual tabs.

Based on the structure and breadth of the M Life section, the subsequent interviews focused on that sub-section of the app.

Interviews

💡 THE GIST: Interviews focused on users' previous usage of the app, with particular care paid to the M Life section given its breadth in the interaction map and the client's preference. Four themes (user frustrations, problems with the app, user behaviors, and user wishes) were distilled, each with supporting sub-assertions.

Five interviews were conducted with a mixed population of undergraduate and graduate students, both male and female, all actively enrolled and taking classes at the University of Michigan at the time of interview.

Some key points regarding the M Life tab:

  • The dining sub-section sees the most use across all users
  • The recreation section sees some use, but users expected features that were missing
  • No users mentioned using the printing section
  • Users use a multitude of other apps to supplement the official app

Survey

We generated a survey for the client.

The survey was not deployed, but was piloted with real users of the University of Michigan app.

As they took the survey, these users were asked about the clarity of the questions, whether any desired answer choices were missing, whether all choices were clear, whether any questions swayed their responses, and so on.

As a result of our pilot test, some redundant questions were eliminated, some questions were reworded, some questions were given additional response options, and the flow of the survey was slightly restructured.

Some example changes are shown below. Each question's original wording is on the left; the revision appears on the right, after the "| -> |".

  • What section of the University of Michigan App do you use most per day? [Rank the sections by usage] | -> | What section of the University of Michigan App do you use most per week? [Rank the sections by usage]
  • What alternative apps do you use to complete tasks you would expect to be able to complete in the University of Michigan App? [Check all that apply] | -> | What functions do you expect to see in the University of Michigan App? [Check all that apply]
  • What is the most helpful part of the M Life section for you? [Select one from list of options] | -> | How helpful is the information provided in the M Life Section? [Likert: not at all to extremely helpful; access to this question is contingent on having used the M Life section]
  • Please rank the following sections of M Life based on their clarity. [Likert: unclear to clear] | -> | Please rank the following sections of M Life based on their clarity. [Likert: unclear to clear, individual option for "I don't use this feature"]

Following the piloting and revision process, the survey was passed off to the client.

Competitive analysis

Given the client's current focus on the M Life section of the app, we conducted a comparative analysis of the features within it. Because the official University of Michigan app has no true direct competitor at the broad scale, nor at the scale of the M Life sub-section, the analyses were conducted at the feature level: dining, recreation, and printing. For each of these there exist partial competitors to the app and direct competitors to the feature, in addition to analogous and/or indirect competitors depending on the feature in question.

You may click here to view an informational chart about the relevant competitors, and you may click here to view an analytical chart about the competitors.

Heuristic analysis

💡 THE GIST: The following areas are substantial stumbling blocks for users, based on an evaluation against Nielsen's heuristics: the calendar within the dining section, the structure of lists, the meatball menus, the lack of consistency in aesthetics and structure across pages, and certain help functions within the app.

Five focal points are enumerated below, each referencing the heuristics to which it pertains. You may click here to find an expanded table of relevant heuristics with more specific details of how each heuristic was violated; the corresponding focal points are identified beside each heuristic and labeled "FP". These correspond to the heuristics called out below.

Focal point 1: the calendar in the dining section

  • Heuristics: Visibility of system status (#1), recognition rather than recall (#6)
  • Severity: 4; imperative to fix
  • Explanation: The calendar has a substantial contrast problem that prevents the user from seeing dates before or after today’s date (whitish text on a whitish background). There is also a color distinction between “previous” and “future” dates, but it is effectively invisible due to the contrast issue and the subtlety of the difference.
  • Recommendation: We recommend modifying the calendar feature and correcting the color contrast; this is a priority fix. The distinction between “past” and “future” dates should also be stronger.

Figure 1. The contrast issues in the date selection calendar within dining.

Focal point 2: sorting and organization of lists/information

  • Heuristic: match between system and real world (#2)
  • Severity: 2; minor
  • Explanation: The organization of information within lists is limited: favorites are sorted alphabetically rather than by any real-world metric (location, capacity, etcetera), and nearby locations neither clarify what is meant by “nearby” nor provide meaningful contextual information about proximity. Lastly, “closed” facilities are not distinguished from open ones, and are even displayed first.
  • Recommendation: We recommend considering offering alternate sorting methods or reconsidering the current sorting scheme for favorites in addition to providing location proximity information on “nearby dining” locations to clarify why they are being recommended in the listed order. We also recommend distinguishing closed/inaccessible facilities and deprioritizing them in lists.

Figure 2. Dining near you doesn’t clarify what “near you” means.

Focal point 3: meatball menus are obscuring relevant information

  • Heuristics: Match between system and real world (#2), consistency and standards (#4), aesthetic and minimalist design (#8)
  • Severity: 3; important to fix
  • Explanation: Meatball menus are present throughout the M Life section, but these menus are not intuitive, and it is not easy to infer what content they contain. In addition, pertinent information tends to be hidden within them, with no intuitive cue for how to access it.
  • Recommendation: We recommend re-evaluating the information hierarchy of the pages and bringing some of the information in the meatball menu to “surface level.”

Figure 3. Meatball menus suppress location information and other transit information, in addition to failing to maintain consistency and duplicating information several layers deep.

Focal point 4: some pages are inconsistent in styling and content; styling may additionally be hampering the usability in some areas

  • Heuristics: Consistency and standards (#4), recognition rather than recall (#6)
  • Severity: 1.5; mostly cosmetic, but worth correcting
  • Explanation: Content varies between pages, as does styling (for example, the sub-headers for printing and dining locations). Header sizing is inconsistent with what should receive the most attention, and verbiage isn’t fully consistent throughout the app. White space and overall design don’t fully distinguish the three “sections” of M Life (printing, recreation, dining), which ultimately still look fairly similar and read as though they belong to one another.
    In addition, card size varies: the printing cards are significantly smaller and differ in color and styling, and in the dining section cards vary in length, which further reduces standardization.
  • Recommendation: We recommend standardizing the aesthetics and information hierarchy of the pages, including white space and the “blocking” of like-content, as well as reevaluating how the headers are structured and used within the pages. We also recommend standardizing card size.

Figure 4. Differences in styling/content in some areas; e.g. the dining section and printing section.

Focal point 5: some “help” features aren’t as helpful as they could be due to a need for repeated user intervention

  • Heuristics: Consistency and standards (#4), help and documentation (#10)
  • Severity: 2; minor usability problem
  • Explanation: Icons are not labeled at the top-most level, so users must repeatedly tap into menu items even though the icon key is the same throughout. The information provided when pressing “i” is mostly icon and nutrient information rather than a substantive description of what a dish actually is. Other areas of the M Life section also lack ways to get additional information; the recreation area, for instance, gives the user no ability to “inquire” for more details.
  • Recommendation: We recommend making the “help” documentation (in the form of the icon key) available at a higher level, in addition to providing more “help” information throughout the M Life section, which overall lacks it.

Figure 5. Some icon keys are repeated in different areas; some icons’ meanings are best understood contextually only by looking at the same icon in multiple locations (e.g. “Nutrient Dense”).

Usability testing

💡 THE GIST: Five representative tasks revealed three especially severe issues: users could not identify pertinent allergy information in dining menus, users failed to find transit information hidden within meatball menus, and unclear iconography left users unsure about parking lot identification.

Users were given the following tasks:

  • Task 1: Add three more dining places to your list of favorite dining.
  • Task 2: Assuming you are intolerant to lactose, please review the Lunch menu for East Quad Dining Hall on Wednesday April 5 and tell me which dish you want to try from the Halal section that doesn’t contain milk.
  • Task 3: Imagine that you want to go environmentally friendly; check the menu for dinner on April 5 at Mosher-Jordan dining hall to find one of the Low Carbon Footprint items as well as the nearest bus stop to this dining hall.
  • Task 4: Find the North Campus Recreational Building’s swimming pool operating hours on Wednesday April 5. Since you will be driving, find the nearest Blue permit parking lot and its enforcement hours.
  • Task 5: Find a computing site at Michigan Union where you can use Windows to print your homework for your upcoming class.

Each task was given an overall "task criticality" rating, and each user issue encountered while attempting the task was given an additional "impact" rating. The percentage of users who encountered an issue is denoted by "frequency" (note: 100% of users = 1; 20% of users is 1 user out of 5 and thus 0.2). Frequency, impact, and criticality are multiplied together to produce a severity rating, which informs our understanding of which issues are most significant to users and should thus be prioritized for amelioration.
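As an illustration (not part of the original study materials), the severity arithmetic above can be sketched in a few lines; the issue names and ratings mirror the three focal issues reported below:

```python
# Sketch of the severity calculation used to rank usability issues.
# Severity = task criticality x issue impact x frequency (fraction of
# the 5 participants who encountered the issue).

def severity(criticality: int, impact: int, frequency: float) -> float:
    """Combine the three ratings into a single severity score."""
    return criticality * impact * frequency

# Ratings for the three focal issues reported in this section.
issues = {
    "Allergy info not identifiable": severity(4, 3, 1.0),    # 5/5 users
    "Meatball menu hid transit info": severity(4, 2, 0.8),   # 4/5 users
    "Parking lot iconography unclear": severity(4, 2, 0.6),  # 3/5 users
}

# Sort descending by severity to get the prioritization order.
for name, score in sorted(issues.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.1f}")
```

Sorting by the resulting scores (12.0, 6.4, 4.8) reproduces the prioritization order of the focal issues below.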

All tasks and their child issues may be viewed in the full chart here. The most severe issues are elaborated upon below.

Focal issue 1: Inability to identify pertinent allergy information

  • Task: Task 2: Assuming you are intolerant to lactose, please review the Lunch menu for East Quad Dining Hall on Wednesday April 5 and tell me which dish you want to try from the Halal section that doesn’t contain milk.
  • Severity: 12 [Criticality: 4, Impact: 3, Frequency: 1]
  • Recommendation: Certain iconography and architecture within the app is confusing, particularly within the menu. These icons need to be clarified and additional information included, particularly regarding allergies and dietary restrictions.

Focal issue 2: Users did not locate the meatball menu, which contained transit info

  • Task: Task 3: Imagine that you want to go environmentally friendly; check the menu for dinner on April 5 at Mosher-Jordan dining hall to find one of the Low Carbon Footprint items as well as the nearest bus stop to this dining hall.
  • Severity: 6.4 [Criticality: 4, Impact: 2, Frequency: 0.8]
  • Recommendation: Pertinent information should not be hidden behind nested menus and instead raised to a more surface level, or otherwise relocated to a location users are more likely to anticipate it being.

Focal issue 3: Users were unsure about parking lot identification (unclear iconography)

  • Task: Task 4: Find the North Campus Recreational Building’s swimming pool operating hours on Wednesday April 5. Since you will be driving, find the nearest Blue permit parking lot and its enforcement hours.
  • Severity: 4.8 [Criticality: 4, Impact: 2, Frequency: 0.6]
  • Recommendation: When an icon has a complex or potentially dubious meaning, the icon should be paired with words to ensure clarity and prevent confusion.

Results

Conclusion

Across the methods used above, most of the issues can be sorted into two primary recurring themes:

  1. Unclear hierarchy and organization
  2. Unclear iconography

A lack of clear organization and hierarchy impacts users' ability to successfully navigate and locate features where they expect to find them; unclear iconography makes it difficult for users to understand what they are looking at and greatly increases the difficulty of completing tasks.

Both issues significantly increase cognitive load, but, as our recommendations demonstrate, relatively simple and powerful steps can be taken to ameliorate their causes; the recommendations should be reviewed on an issue-by-issue basis for a finer understanding.