Magoosh is all about making sure our students are well educated and happy. But we're also a data-driven business that uses metrics to make decisions: vague notions of happiness are nice, but we want numbers!
So this is the story of how we improved student happiness by A/B testing changes to our product, with the goal not of optimizing clicks, conversions, or revenue, but of maximizing student happiness. To start, though, I'll introduce the metric at hand: Net Promoter Score.
NPS: Our Reliable Referral Indicator
Net Promoter Score is a metric that tells you, on the whole, how willing your customers are to promote your product. Customers are asked, on a scale of 0-10, how likely they would be to recommend your product; 9s and 10s are considered "promoters," 7s and 8s are "passives," and anything 6 or below is a "detractor." Your Net Promoter Score is calculated by subtracting the number of detractors from the number of promoters and dividing by the total number of respondents. As a result, NPS is a percentage somewhere in the range of -100% (all detractors) to 100% (all promoters). Not to brag, but our NPS is high. Really high. Apple high.
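To make the arithmetic concrete, here is a minimal sketch of the NPS calculation in Python (the function name and sample ratings are my own, for illustration):

```python
def net_promoter_score(ratings):
    """Compute NPS from a list of 0-10 'how likely to recommend' ratings.

    Promoters rate 9-10, passives 7-8, detractors 0-6. The result is
    (promoters - detractors) / total, expressed as a percentage, so it
    ranges from -100 (all detractors) to 100 (all promoters).
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Example: 5 promoters, 2 passives, 3 detractors out of 10 responses
print(net_promoter_score([10, 10, 9, 9, 9, 8, 7, 6, 4, 2]))  # → 20.0
```

Note that passives count toward the total number of respondents, so they still pull the score toward zero even though they are neither promoters nor detractors.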
At Magoosh, NPS is one of the most important metrics we track. It helps us determine not only whether students like our customer service and user interface, but also how well our products prepare students for their exams. Most importantly, it has been a reliable leading indicator of growth in word-of-mouth referrals, our largest marketing channel. When NPS is high, students talk about Magoosh and more people buy it!
Historically, we've asked students the NPS question after they've taken their exams (and, importantly, seen their final scores). We do this because our products prepare students for tests, and, really, the proof is in the pudding: you can't fully decide whether you're willing to recommend Magoosh for GRE prep until you've taken the real GRE. The downside is that it can take a while for us to see NPS change in response to product changes. Since we wait until students are done studying to survey them, months can pass between when a student sees a new feature and when she rates our product.
Our NPS Issue: Mismatched Expectations
Because NPS is such an important metric for our company, we take changes very seriously. Earlier this year we saw NPS for our GMAT product dip fairly significantly. Looking into why, we discovered that several passive and detractor students were complaining that they were scoring lower on the real GMAT than they had on their Magoosh practice tests.
Our algorithm was telling students to expect one score, but, for some, their official reports were coming back lower, which was obviously a frustrating experience. These students were still improving their scores significantly, but once you've got a 750 in your mind, a 700 seems disappointing! We determined that we needed to make our score prediction algorithm more accurate, but we were left with a major concern: would an improved algorithm that displayed a lower predicted score be demoralizing for students? Which was worse for customer satisfaction: a lower predicted score while studying, or a disappointing final score after the exam?
The Challenge: Could We Optimize Quickly for NPS?
Normally, when we have questions about what works best for conversion or marketing, we run a quick A/B test. But NPS was different: we'd never A/B tested for NPS before, and our NPS survey only went to students after their exams. It would be months before students who saw the changed algorithm took their exams and we got NPS data back. Making a significant change without knowing how it would affect our word-of-mouth marketing was a big risk.
Our Solution: Bring NPS Inside Our Product
We determined that in order to A/B test the algorithm change, we needed a way to collect NPS data while students were still studying, not just after their exams. We began using a third-party tool called Wootric, which allows us to ask the NPS question inside our product and analyze the data in real time. We then deployed the changed algorithm to half of our GMAT students and matched each "likely to recommend" rating to students in the treatment and control groups. Suddenly NPS had a new use case for us: a powerful, agile product tool.
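Once each rating is tagged with its A/B group, comparing the two groups is straightforward. Here is a rough sketch of that per-group comparison; the grouping structure and the response data are hypothetical, purely for illustration:

```python
def net_promoter_score(ratings):
    """NPS from 0-10 ratings: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical in-product survey responses, keyed by A/B test group.
responses = {
    "control":   [10, 9, 8, 9, 6, 10, 7, 9],
    "treatment": [9, 10, 8, 9, 7, 10, 9, 8],
}

# Compute NPS separately for each group to compare treatment vs. control.
for group, ratings in responses.items():
    print(f"{group}: NPS = {net_promoter_score(ratings):.1f}")
```

In practice you'd also want enough responses per group for the difference to be meaningful before acting on it, since NPS on small samples is noisy.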
It turned out that the improved algorithm did not affect student satisfaction while studying with Magoosh: NPS from both student groups was identical. Knowing this allowed us to roll the change out to all students more quickly. We were also able to track the students in the A/B test over time, and post-exam NPS for students in the treatment group has been a full nine points higher than for the control.
Takeaways from A/B Testing for NPS
1) Include Current Customers in Your Optimizable Funnel
Our goal is always to provide our students with the best possible test prep experience. But since we can't read minds, it's not always easy to know whether what we're doing is actually providing a great experience. It's easy to think of customer acquisition as a funnel, and to wrap our brains around how to A/B test to optimize that funnel. What doesn't come as easily (at least for most startups, and definitely not for Magoosh at first) is thinking of current customers as part of an optimizable funnel too.
2) Optimize Your Products for Referrals
If your business is built on recommendations and word of mouth, you really can't afford not to optimize your products for referrals. This process has helped us make sure that what we're doing makes a meaningful difference for students, and it has given us a useful, repeatable framework for testing future features and products.
3) Focus on Agility
Shift your thinking on NPS from a one-time, transactional model to an ongoing, contextual model. In-product NPS tools like Wootric can help you do this easily, and they can keep track of your A/B test groups as well. You can speed up decision-making and keep a pulse on customer happiness.
Read about it on Wootric's blog too: A/B Testing to Optimize Net Promoter Score at Magoosh.