Why Aren't We Doing More User Testing?
I’ve been thinking a lot about user testing and minimum viable products (MVPs). Right now we’re designing a new workshop. Once we have a workable script, we will ask some learners to participate in a practice run, review their feedback, make revisions, test again, gather feedback, etc. We do several rounds of user testing because each time we get so much helpful information. You can feel what isn’t making sense and where you’re missing transitions. You notice when you lose your participants and when they start paying attention again. You see which questions spark good discussion and which ones fall flat.
An MVP is “that version of a new product a team uses to collect the maximum amount of validated learning about customers with the least effort.”¹
It can’t be just a storyboard; it has to be developed enough that users actually have the intended experience.
But don’t go too far, either: “the least effort” means not sinking too much time or money into a product that is still evolving.
You also want validated learning, which means seeing the experience and the results of the product with your own eyes.
MVP testing shows you what’s missing. It reveals what’s not intuitive. Even small problems become painfully obvious, and early enough that you can fix them or, if necessary, redesign completely. I know we all want these benefits.
The problem is that we rarely see teams include MVP testing in their projects. “Don’t worry, the project team is doing multiple rounds of review.” Okay, but that’s not user testing. “We did a needs analysis.” That’s great, but even the best up-front analysis can only get you so far. Think of all the interpreting you’ve done and all the design decisions you’ve made since then. Your users should be the ones to validate those choices and tell you whether the learning is actually helping them.
As you can probably tell, this is something we are passionate about. MVP testing is critical to delivering effective learning solutions. For those who are currently skipping the MVP phase, find your reason below and review our ideas.
Most of our courses include video.
If the learning experience includes video or extensive custom graphics — basically anything costly — try one of these strategies:
Record a low-fidelity version of the video using your phone.
Create a PowerPoint version and approximate the visual experience using a series of photos or even simple drawings as you narrate the script.
Include some detail about the location, talent, tone, music, and supporting graphics to help your users visualize the experience.
Be sure to present the video(s) as part of the larger course experience so learners can evaluate the big picture and gauge the course’s effectiveness.
Our courses are fairly similar. Our learners are used to them.
In the spirit of continuous improvement, we implore you to do some MVP testing. Maybe you’re making assumptions about your learners. Maybe you haven’t given them a chance to share honest feedback. And here’s an interesting question to ask: What can we remove from the training? Is there any content that is too basic or repetitive? The last thing you want is to unintentionally develop a bloated experience that everyone dreads taking. (We actually have a workshop on this very topic! #TooMuchTraining)
We don’t have time.
We’re not suggesting a return to three-month development cycles; it’s critical to keep pace with the business. Typically, adding one extra week to the timeline is enough for a few rounds of MVP testing. To negotiate the extra time, we suggest explaining that MVP testing is the only way to know whether the course is actually improving performance. As part of the test, ask your learners to practice their new skills and report their results. This is your chance to make sure the experience will have real impact.
If you have other reasons, let us know so we can share ideas. We also look forward to hearing your testing stories and the results you’re seeing!
¹ Ries, Eric (August 3, 2009). "Minimum Viable Product: a guide."
What we’re reading: Don’t Serve Burnt Pizza (And Other Lessons in Building Minimum Lovable Products), First Round Review