Monitoring Feature

Challenge:

How might we encourage users to engage with our monitoring feature?

Role:

Lead UX Designer
& Lead Researcher

Solution:

Introduction of a new ‘Monitoring Spotlight’ feature and addition of a subtle toggle animation

The Context

As a Q4 goal, stakeholders tasked our team with increasing user engagement with the existing monitoring feature on our member site. This feature notifies users whenever there is updated information on a report they are monitoring. To activate it, users must search for the person they want to monitor, access their report, and toggle the monitoring function to 'ON'. The monitored report then appears on their dashboard for easy access. Because monitored reports are a legacy feature introduced with the initial launch of the member site, we needed to evaluate whether the feature remains relevant and desirable for users, especially given its current low utilization.

To validate our assumptions about the continued relevance of the ‘monitored reports’ feature, I proposed employing the Design Thinking process. This approach would offer valuable user insights, helping us better understand their needs, motivations, and pain points, while also guiding us in developing an effective solution to boost engagement with the feature.

The Design Thinking process

Graphic provided by my DesignLab mentor that I always keep on hand.

Empathize

Since the design thinking process begins with empathy, I aimed to delve deeper into our users' expectations of the member site. Key questions to explore included: How do they regularly use this product? What are their main motivations? Do they encounter any pain points? And are they using our product as intended?

Methodologies

User Interviews

Surveys

Observation

Deliverables

Personas

Affinity Map

User Journey Map

With the assistance of a contract UX designer and the support of our customer service team, we conducted five user interviews, launched a survey on the member site, and analyzed hundreds of user sessions through Hotjar recordings.

Define

After organizing our research observations and responses, we identified three key personas: the ‘Super User’, the ‘Search Impaired’, and the ‘Pinballer’. We decided to focus on the third and most prevalent persona, the ‘Pinballer’. Mapping out this user's experience across the site, we uncovered a major pain point: users were manually monitoring important reports. We observed users repeatedly pulling the same reports, both within a single session and across multiple sessions. To understand their motivations, we asked users about the ‘why’ behind their actions. Their responses revealed that they were checking those specific reports for any updates or changes.

Ideate

After confirming that the 'monitored reports' feature was valuable to our users, we began brainstorming ways to enhance its visibility and clearly demonstrate how it could benefit their experience navigating the member site.

We decided to test several different solutions by running an A/B/C experiment to compare their effectiveness; a sketch of how traffic might be split across variants follows the list of solutions below.

Solution A:

Pop-up message/modal on report page prompting the user to turn on monitoring

Solution B:

Subtle animation added to the monitoring UI toggle element as a visual indicator

Solution C:

New section within the report itself prompting the user to turn on monitoring
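
To make the experiment setup concrete, here is a minimal sketch of deterministic variant assignment, assuming a hash-based bucketing approach. The function names, variant labels, and member IDs are hypothetical illustrations, not our production experimentation code.

```typescript
// Hypothetical sketch: bucketing members into the control group or one of the
// three solution variants. Deterministic hashing keeps a member in the same
// variant across sessions.

type Variant = "control" | "solutionA" | "solutionB" | "solutionC";

const VARIANTS: Variant[] = ["control", "solutionA", "solutionB", "solutionC"];

// Simple string hash (djb2-style) of the member ID.
function hashMemberId(memberId: string): number {
  let h = 5381;
  for (let i = 0; i < memberId.length; i++) {
    h = (h * 33) ^ memberId.charCodeAt(i);
  }
  return h >>> 0; // force unsigned 32-bit
}

function assignVariant(memberId: string): Variant {
  return VARIANTS[hashMemberId(memberId) % VARIANTS.length];
}

// Example: the same member is always routed to the same experience.
console.log(assignVariant("member-10482"));
```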

Prototype & Test

The next step in the process involved creating a working prototype in Figma for usability testing. The prototype served a dual purpose: it allowed us to demonstrate the functionality to the development team and enabled us to test the feature with real users. This testing helped us identify and address any unforeseen issues and confirmed that users could interact with the feature as intended.

From our usability tests, we identified the need for a message banner (success or error) to appear when users toggle the monitoring feature 'ON'. This visual confirmation reassures users by indicating whether their action was successful or if further steps are required.
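
As an illustration of that finding, the sketch below shows what a toggle handler with banner feedback might look like, assuming a REST-style endpoint. The `enableMonitoring` and `showBanner` helpers and the `/api/reports/.../monitoring` route are hypothetical stand-ins, not the member site's actual code.

```typescript
// Hypothetical sketch of the toggle interaction: when a user switches
// monitoring 'ON', show a success or error banner so the action is
// visually confirmed.

async function enableMonitoring(reportId: string): Promise<void> {
  const response = await fetch(`/api/reports/${reportId}/monitoring`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ enabled: true }),
  });
  if (!response.ok) {
    throw new Error(`Monitoring request failed: ${response.status}`);
  }
}

function showBanner(kind: "success" | "error", message: string): void {
  // In the prototype this rendered a dismissible banner at the top of the report.
  console.log(`[${kind}] ${message}`);
}

async function onMonitoringToggled(reportId: string): Promise<void> {
  try {
    await enableMonitoring(reportId);
    showBanner("success", "Monitoring is on. We'll notify you when this report changes.");
  } catch {
    showBanner("error", "We couldn't turn on monitoring. Please try again.");
  }
}

// Example usage from a toggle's change handler:
// toggleElement.addEventListener("change", () => void onMonitoringToggled(reportId));
```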

The Impact

After conducting our A/B/C experiment, we found that the subtle toggle animation (Solution B) performed the best, increasing user engagement by 84.86%. In comparison, Solution A, which used a pop-up modal, increased engagement by 15.76%, while Solution C, which involved a section within the report, resulted in a 9.88% increase. These solutions were all compared to the legacy/control experience.
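
For context on how figures like these are typically derived, the snippet below computes relative lift against the control; the rates in the example are placeholders, not our measured data.

```typescript
// Illustrative only: relative lift of a variant's engagement rate versus the
// control, expressed as a percentage.
function relativeLift(variantRate: number, controlRate: number): number {
  return ((variantRate - controlRate) / controlRate) * 100;
}

// e.g. if the control engaged at 5.0% and a variant at 9.25%:
console.log(relativeLift(0.0925, 0.05).toFixed(2) + "%"); // "85.00%"
```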

Reflecting on this project, it was intriguing to see how subtle visual cues could significantly enhance the user experience compared to aggressively introducing new features. The testing phase also underscored the importance of giving users visual confirmation that validates their actions. Without the usability tests, we might not have identified this crucial need.