Onboarding New Hudl Users

 

BACKGROUND

In 2018, I was part of a Growth team at Hudl that had originally set out to better connect new users to Hudl's value and improve the overall conversion rate from trial customers to paid customers. Historically, when prospective customers asked a sales representative if they could trial the Hudl product, the rep would have to hand-hold the coach or athletic director through their experience because there was little to no onboarding within the app. This took time away from opening and closing other opportunities, and it meant that trial users had to rely solely on the rep's availability to figure out how to navigate the app and understand the value it could bring to their lives.

We originally set out to solve the trial user experience, but a couple of months into the project we uncovered an internal data integrity issue: we couldn't distinguish between users who were truly in a trial and users who were going through the "trial" flow but had already decided to purchase Hudl. Because we couldn't reliably track these two cohorts without further investment and time, we decided to focus on activating paid, non-trial users as they started to use Hudl.

 

ROLE | DURATION | TEAM

Product Designer | Hudl

2018

My team was made up of a product director, 4 developers, and 1 QA. I also worked closely with members of the Customer Success, Sales, and Support teams.

 

THE PROBLEM

Users who had never used Hudl were essentially left on their own to explore and learn the product, leaving them with outstanding questions and resulting in low engagement. Our Customer Success team did their best to connect Hudl's value to each customer's specific needs and answer questions as they went, but their efforts weren't a scalable solution for all of the new users we had on Hudl. Working closely with our Data Science team, we knew that specific actions taken within the product correlated with an increase in engagement. We had to figure out how to design a scalable experience that enabled new users to take those actions while also meeting their needs.

 

RESEARCH

At Hudl we use the Jobs to Be Done interview technique to understand the deep motivations behind what causes our customers to purchase Hudl (or not renew). In 2017 our research team conducted 30+ interviews with our users and eventually landed on five Hudl Jobs to Be Done.

When my team started this project, we were unsure why users did what they did during a trial, as well as what their specific needs were in the post-purchase phase. We felt the Jobs to Be Done technique was a promising tool to help us find clarity because:

  • It forced us to dig deeper to understand cause and effect

  • It gave us a way to think about a variety of forces that fuel people's motivations and shape their behavior

  • It challenged us to think holistically rather than homing in on specific features

We worked with the User Research team to start digging into what we called "Little Jobs". We were confident in Hudl's Jobs for the Big Hire (when someone purchases the product), but we were unsure about the Little Hires (when they actually use it). So we operated under the assumption that there were Little Jobs behind those Little Hires, and that figuring out what they were for users just getting started with Hudl would help us serve those users better.

After conducting 11 interviews with Hudl users who had completed their trial, we started to analyze what we learned. We used affinity mapping to group insights into categories such as "When I am", "So I can", "More about", "Less about", and "Hiring" vs. "Firing". Grouping the data points this way allowed us to spot patterns and call out the differences and similarities between users' experiences. This led to loose definitions of the Little Jobs, which we could then validate or invalidate at a larger scale.

Surveying Our Users

The next step was to survey new customers using the Little Jobs we had predefined, so we could see whether the actions they took after "categorizing" themselves matched our expectations for that Job. We sent out a Google survey as a low-cost way of fine-tuning our understanding of the Little Jobs they had during their early Hudl experience.

 

After collecting the survey responses, we felt that gathering this data at scale would be really valuable in continuing to refine the Little Jobs definitions. We decided to design an in-app survey that would display when new users first logged in. This flow not only helped us collect more responses and refine the way we asked what our users' needs were, it also served as an MVP of the onboarding guide we were sketching out as a potential solution.

Through this process we were able to distill and define the Little Jobs, which split evenly into two buckets: pre-decision and post-decision.

 
 
 

COMPETITIVE ANALYSIS

To better understand best practices in new user onboarding, we collected a variety of patterns from different SaaS products. We analyzed the use case behind each pattern, which helped our team take its purpose into account when working toward the best solution for our users.

Competitive analysis of onboarding patterns in other products

 

SOLUTION

Because we were focusing on users who had already paid for their subscription and were in the post-decision phase, we wanted to make sure our solution met the needs of those in the Little Job 3 and 4 categories:

  • "I'd like to take the time to learn how to use Hudl, so that I can establish a routine as quickly as possible." (Little Job 3)

  • "I just want to get going and establish a routine with Hudl as quickly as possible." (Little Job 4)

Based on the analysis our Data Science team provided, we found that adding at least 4 athletes in the first 14 days of a team's experience with Hudl was strongly correlated with the team becoming engaged. We defined an "engaged team" as five or more users on a team watching video. In addition to adding athletes to their team, we suspected that adding a schedule entry for their first game and uploading or recording a game to Hudl were important first steps to establishing a routine with the product. The solution needed to emphasize these steps in a way that was easy and didn't feel like it was getting in our users' way.
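To make these thresholds concrete, here is a minimal sketch of how the activation and engagement criteria could be expressed in code. The data shape and field names are illustrative assumptions, not Hudl's actual pipeline.

    from datetime import timedelta

    # Assumed thresholds based on the Data Science analysis described above.
    ATHLETES_NEEDED = 4                    # athletes added early in the experience
    ACTIVATION_WINDOW = timedelta(days=14)
    ENGAGED_VIEWERS_NEEDED = 5             # distinct users watching video

    def meets_activation_criteria(team):
        """Did the team add at least 4 athletes within 14 days of signing up?"""
        added_in_window = [
            a for a in team["athletes"]
            if a["added_at"] - team["signed_up_at"] <= ACTIVATION_WINDOW
        ]
        return len(added_in_window) >= ATHLETES_NEEDED

    def is_engaged(team):
        """Our working definition of an 'engaged team': 5+ users watching video."""
        viewers = {event["user_id"] for event in team["video_watch_events"]}
        return len(viewers) >= ENGAGED_VIEWERS_NEEDED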

From our interviews, we knew that our users typically had a lot to juggle when they signed up for Hudl: they were responsible for making sure athletes knew how to register with Hudl, they had to coordinate who would film their game, and they had to remember to upload it before reviewing it with their team. With this in mind, I mapped out what their experience could look like from the point they signed up for Hudl to after they played their first game, integrating conditional reminders based on outstanding actions in-app.

Using the storyboards designer Allie Ward illustrated for our company vision, I outlined a proposed journey from pre-game to post-game for a coach.

Taking everything we had learned about new users into consideration, and knowing we had specific tasks we wanted them to complete, we felt that a mix of two onboarding patterns, Wizards and Interstitials, would be the best solution for when they logged in. We designed what we internally referred to as the Activation Guide. Because there was typically more than one coach on a team, we designed the guide to track a team's progress so we weren't asking the coaching staff to duplicate efforts in uploading a game or adding athletes.

In addition to prompting a coach to finish the tasks in the guide, we thought it was important to include retention loops that nudged users who fell off the path to activation. At the time, email was the only channel we had for notifying users, so we designed the flows to trigger specific, personalized emails depending on the timing and completion of the steps in the guide. For example, we sent an email with a checklist of items matching the steps in the guide; if a coach had completed one or more of the steps, those items appeared checked in the email so they could see their progress and feel a sense of accomplishment.
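As a rough illustration of this retention-loop logic, here is a sketch of how a reminder email could be selected based on guide progress and inactivity. The step names, timing threshold, and data fields are assumptions for illustration; the real triggers lived in our email tooling.

    from datetime import datetime, timedelta

    # Assumed guide steps, mirroring the Activation Guide described above.
    GUIDE_STEPS = ["add_athletes", "add_schedule_entry", "upload_first_game"]
    NUDGE_AFTER = timedelta(days=3)  # assumed inactivity window before nudging

    def next_reminder_email(team, now=None):
        """Return a checklist-style reminder if the team has stalled, else None."""
        now = now or datetime.utcnow()
        completed = team["completed_steps"]            # e.g. {"add_athletes"}
        remaining = [s for s in GUIDE_STEPS if s not in completed]

        if not remaining:
            return None  # fully activated, nothing to nudge
        if now - team["last_guide_activity_at"] < NUDGE_AFTER:
            return None  # still actively working through the guide

        # Mirror the in-app guide: completed steps render as checked so the
        # coach sees progress, and the next outstanding step is called out.
        return {
            "to": team["head_coach_email"],
            "checklist": [(step, step in completed) for step in GUIDE_STEPS],
            "next_step": remaining[0],
        }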

 

DEMO OF THE ACTIVATION GUIDE & RETENTION LOOPS

The video below provides a walkthrough of the Activation Guide and the emails sent during the coach’s journey.

 
 
 

TESTING THE SOLUTION

My team set up an A/B testing framework so we could test the Activation Guide's efficacy in increasing engagement. We showed the guide to users in the test group and left the experience unchanged (no guide) for the control group. Our product team went through a reorg around the time of launch, so our team unfortunately had to move on to a different project. At that point the test group was showing positive results, adding more team members and uploading video at a faster rate than the control, so we turned the guide on for all paid sales-assisted subscriptions.
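For context on how a test like this is typically wired up, here is a small sketch of deterministic test/control bucketing so a team always sees the same experience on every login. Our actual framework was internal tooling; this hashing approach is an assumption for illustration only.

    import hashlib

    def assign_variant(team_id, experiment="activation-guide"):
        """Deterministically bucket a team into 'test' (guide shown) or 'control'."""
        digest = hashlib.sha256(f"{experiment}:{team_id}".encode()).hexdigest()
        return "test" if int(digest, 16) % 2 == 0 else "control"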

 

REFLECTIONS

This project taught me a lot about the adverse effects of faulty data, the value of A/B testing, and low-cost methods of learning quickly. Discovering a couple of months in that we hadn't been working from clean data sources was a frustration I don't want to go through again; since then, I've made sure to question the integrity of the data sources I work from at the very beginning of a project. A/B testing was an important part of proving or disproving the small experiments we ran, especially when it came time to launch the Activation Guide to a larger subset of teams. That data was essential when deciding whether or not to keep the Guide as the experience new users got when they first logged in. This project also reminded me of the value of "scrappy" methods for validating or invalidating assumptions: using Google surveys was a cheap and fast way to get feedback from our users before investing in building out a similar experience in-app.