Participant Drop-off Predicament

Led a research initiative to understand why 50% of users were abandoning an informative virtual assistant tool. Leveraged interviews and data analysis to investigate the root cause of the issue and propose design solutions to increase engagement and satisfaction.

A GIF showing the first few steps in the CDPH Virtual Assistant COVID case investigation survey

My Role

Co-lead researcher: Designed the flexible interview guide, coordinating pilot interviews to test its flow and surface weaknesses; facilitated interviews; analyzed and synthesized data into tangible design recommendations for the client.

Project Impact

By incorporating features that established trust and put people at ease, we ensured that people actually utilized a helpful tool to keep themselves and others safe from COVID-19.

the problem

Why are people quitting?

Is it time-consuming? Are there too many questions? Is the lack of a time indicator turning people off? Is it too overwhelming to answer questions when you're COVID-positive? Does the survey look like a scam? Are people just apathetic?

We began hypothesizing many reasons why 80% of people never entered the survey, 15% dropped off at the first question, and overall drop-off reached 50% by question four.

planning and scope

figuring out how to interview

With our goals in mind, we knew we wanted to simulate the real-life experience as closely as possible. Because we would conduct these interviews virtually and would not recruit COVID-19-positive individuals, we had to get creative in testing our hypotheses.

With 45 minutes to talk to each participant, we provided context for why someone receives a message from the VA, then asked them to imagine they had just tested positive. We told them to do what they would do in real life, except providing real personal information (for privacy). To get their candid opinions, we asked them to think aloud and warned that we would pause here and there to ask a few questions. We even included a subtle note that they were allowed to “quit” answering questions at any time.

interview guide at a glance

Introduction + overview (and double-checking) of consent

  1. Overview and time constraints of the interview, ensuring that 45 minutes was still feasible for them

  2. Setting the scene: Imagine you just tested positive for COVID-19; act as you would in real life

  3. Provide them with an out - if they would actually quit the survey, we wanted them to!

  4. Based on their behavior and choices, we wanted a different set of questions to understand:

    1. Why they finished the survey, confirming that they actually would in real life (and weren’t satisficing)

    2. Why they chose to quit the survey, probing to see their reactions

  5. Exit questions to understand their emotions regarding the VA, if anything was confusing at any point, and of course, if they’d like to add any thoughts that we didn’t cover in the session

analyzing the data

high-level demographics

A majority of participants (6 out of 9) identified as female; the remaining three identified as male. Four were from Southern California; the rest were from scattered parts of mid-Northern California. All participants indicated that they take precautions to slow the spread of COVID-19, such as wearing masks and limiting social interactions where applicable.

Shown in the first image: transcription and key moments from each interview.

Original insights from each interview, displayed on color-coded virtual sticky notes for each participant

affinity map

We grouped the data into main insights, including:

  • SMS Trustworthiness

    • Who is sending this? Can I trust them? This looks phishy…

  • Authentication Page

    • Why should I have to input my name and birthdate? Why am I getting this?

  • Reassurance / Tone

    • The VA is brusque, too talkative, not explaining what I want to know…

  • Personal Information

    • Hesitant in providing personal information, contact information, and anything around contact tracing

  • Survey Length

    • Most participants expressed survey fatigue and concerns around not knowing the length

Participant data points grouped together by commonalities

moments of hesitancy

There were multiple points in the survey where people hesitated to continue. Below are some quotes from key sections of the survey.

Moments of hesitancy mapped out in a table with quotes from various participants, such as: "I am hesitant to click on random links for security reasons"

key quotes

“I am hesitant to click random links. I would need more information or a reminder of where this information is coming from.”

— participant two

“This feels like doing my taxes”

— participant eight

“I don't feel comfortable providing information about others without knowing how the information will be used. My contacts probably don't want me to either.”

— participant six

insights

01

Connected Journeys

Demonstrating that this is merely a continuation of the preceding testing touchpoint is key to ensuring the user trusts the SMS entry point and chooses to engage with the virtual assistant. And in a world rife with SMS scams, early signs of authenticity are a must.

02

Uneventful Transactions

A positive COVID-19 test result is inconvenient at best, terrifying at worst. A survey that feels brief, seamless, and shoulders the work on behalf of a user will ensure they don't experience any added undue burden.

03

We Have You Covered

The experience should be personable, informative, and emotionally supportive to help in a time of stress.
The expectation of a COVID-19 test result can be anxiety provoking, but the right approach, voice, and tone can make a world of difference.

04

Handle with Care

Sharing information with the government is a sensitive matter. What will be gained and how information will be used are key concerns expressed by people. Sharing information about others without their consent requires a high level of clarity and trust.

05

Right Information, Right Time

Strike the right balance between succinct, necessary instruction and deeper, supplemental information.

Most people want to be told quickly and clearly what to do following a positive result, but preferences vary when it comes to how they want to be told. People want to skip to the relevant information as soon as possible.

design recommendations

01

Connected Journeys

  • A 5-digit SMS sender number, a URL ending in .gov, the CDPH logo, and timestamps engender confidence in authenticity.

  • Lead with simple, specific references that call back to information gathered during the testing interaction (e.g., "we have test results" rather than "important health issues").

  • Give users multiple signals to quickly identify the text message's legitimacy and verify that the communication is truly coming from CDPH (e.g., information that can be cross-referenced and verified on the CDPH website).

  • Provide identical markers (i.e., icons, imagery, branding, vocabulary) to unite the SMS text message with the VA tool authentication page.

  • Be clear at the outset about how the user's data will be handled.

  • Justify this tool specifically in contrast with alternative channels (e.g., Bluetooth contact tracing).

02

Uneventful Transactions

  • The tool should be responsive, pre-populating known data when possible and acknowledging information already provided by the user elsewhere in the survey.

  • Users appreciate a notion of survey completeness; indications of total survey length and progress will encourage them to proceed.

  • It should be clear what happens if a user needs to exit the survey early and there should be mechanisms to save progress.

  • The distinction between optional and mandatory sections and fields in the survey should be clearer.

  • The option to "skip" sections should be visible when possible.

  • Entering information for contacts was the most tedious and labor-intensive portion of the survey; streamline this step wherever possible.

03

We Have You Covered

  • Emotional appeals following a positive result make the user feel that they aren't alone (e.g., “we recognize this is difficult news...we will give you the information and next steps to take care of yourself and your loved ones...”)

  • Informal, conversational voice and tone is comforting and feels appropriate for the scenario.

  • Direct, simple, and clear steps early on are indispensable for those who just need to be told what to do in a time of stress.

  • An "avatar" could help make the chatbot feel more personable and human.

  • Thorough context setting before personal or uncomfortable questions is appreciated (e.g., in advance of the employment question).

  • Incorporating additional resources (e.g., mental health information, answers to common questions) at the end of the survey, accessible at any time, can be a source of comfort.

04

Handle with Care

  • Outlining early on which information won't be collected helps to create trust.

  • Provide clarity about how the government will contact other people and employers, and what details will be disclosed.

  • Understand that some people prefer to reach out on their own and some have already communicated the news to their contacts post exposure/pre-results.

  • Give cues for people to remember who they might have come into contact with and reassure them that there are no consequences if they can’t remember everyone.

  • Users would like to preview the communication and experience they're volunteering their contacts for before they do so.

05

Right Information, Right Time

  • Brevity is paramount: condense instructional and contextual copy as much as possible to reduce reading fatigue.

  • Provide optional pathways for people to learn more about their COVID-19 result and how it impacts someone like them.

  • Users are looking for quick and commanding directives when faced with a positive COVID-19 test result. High level "TL;DR" ("too long; didn't read") takeaways will prioritize focus and help the user feel guided when they need timely instruction.

  • It is critical to iterate on section ordering and overall flow to ensure similar prompts are positioned near each other or consolidated (e.g., "places" and "gatherings") to avoid question redundancy.

  • Users should be able to save all information from the chatbot conversation for later review.

  • Updating the style of drop-downs to make them more discoverable might help people notice supplemental information to check for and understand the ever-changing nature of the virus.