To continue the conversation started in this course, with Drew and other user experience professionals, join Drew's Practical UX: Lessons from the Trenches LinkedIn group.
Learning a new skill can sometimes be hard. You might be on a time crunch, or you might just need a little nudge to keep you on track. Finding the time to learn is no easy feat, but with consistent reminders and encouragement, you'll have a better chance of getting back into learning. In this movie, I want to walk you through Reminders, a feature we're adding to lynda.com in 2016 and are still working on as of this recording. I'll walk you through what the feature is, why we decided to build it, how we designed it, and the lessons learned in the process.
We believe that a user's chance of watching the content becomes much higher when they schedule it. They get to choose the day, time, and duration that work best with their busy schedule. Once we understood the problems learners face, the team set off to brainstorm new solutions that could be quickly prototyped, designed, and tested. We had a few constraints to consider for this new feature. We had to validate that it would actually add value to our learners before investing too much time and resources, and we had to find the optimal entry point for it without affecting the learning efficacy of the site.
It had to be responsive, because we wanted to support a mobile web version of this feature; it would live only on the web, not on native platforms. We explored many ideas and the team debated many solutions, working through sketches and designs that we presented to the team and key stakeholders. We decided to build two options and A/B test them, with each variant getting a specific percentage of traffic. For example, variant A, variant B, and the control each got 33.333%.
An A/B test is when you have two versions of a feature and you want to measure which one performs better. The solution that performs better ends up being the one you go with, or it informs the next round of your design iterations. The trick is to make sure each of your designs continues to meet your business objectives and product requirements. The first option was a module, displayed in the video player's location, with prefilled days, the length of your learning session, and the best time that works for you.
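To make the traffic split above concrete, here is a minimal sketch of how an experiment framework might deterministically assign each user to the control or one of the two variants. This is an illustration only; the variant names, experiment key, and hashing approach are assumptions, not details of lynda.com's actual system.

```python
# Deterministic A/B bucketing sketch (hypothetical; names are assumptions).
import hashlib

# Each variant gets an equal share of traffic: control, A, and B (~33.3% each).
VARIANTS = ["control", "variant_a", "variant_b"]

def assign_variant(user_id: str, experiment: str = "reminders") -> str:
    """Hash the user ID so the same user always lands in the same bucket."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(VARIANTS)  # stable index in 0..2
    return VARIANTS[bucket]

# The same user sees the same variant on every visit, which keeps the data clean.
print(assign_variant("user-1234"))
```

Hashing on a stable user ID (rather than randomizing per page load) is what keeps each user's experience consistent for the duration of the test.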
This would display after the learner watches a few videos of the course. The thinking behind having a learner watch a few videos first is to make sure they're interested in the content before we switch out of the video player. The second option is a lightweight solution that bypasses asking the member for specific days, duration, and time. It automatically prepopulates a calendar event in their email client's calendar, such as Outlook, Google, Apple, or Yahoo, and the user can still go in and change any of these details.
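One common way to implement that kind of one-click prepopulation is a calendar URL with the event details already filled in. The sketch below builds such a link for Google Calendar; the title, default duration, and description are made-up values, and this is only an assumed approach, not the team's actual implementation.

```python
# Sketch of prefilling a calendar event via a web link (assumed approach).
from datetime import datetime, timedelta
from urllib.parse import urlencode

def google_calendar_link(title: str, start: datetime, duration_minutes: int) -> str:
    """Build a Google Calendar 'create event' URL with prefilled details."""
    end = start + timedelta(minutes=duration_minutes)
    fmt = "%Y%m%dT%H%M%S"
    params = {
        "action": "TEMPLATE",          # tells Google Calendar to open the event form
        "text": title,                  # event title, editable by the user
        "dates": f"{start.strftime(fmt)}/{end.strftime(fmt)}",
        "details": "Weekly learning reminder",  # hypothetical description
    }
    return "https://calendar.google.com/calendar/render?" + urlencode(params)

# Clicking the resulting link opens a prefilled event the user can still edit.
link = google_calendar_link("Learn UX", datetime(2016, 5, 2, 18, 0), 30)
print(link)
```

Because the link only prefills a draft event, the user keeps full control: they can change the day, time, or duration before saving, which matches the behavior described above.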
We decided to go with two versions because we hypothesized that some members would want specific control over when they learn, while the second option still gives you control but creates the calendar invite with one click. This was a perfect candidate for an A/B test, as qualitative user testing wouldn't be enough to decide which feature would win. In this project, my deliverables consisted of an InVision project for accessing all files, designs, and comments; a box study of the different breakpoints to support, to show how the feature looked at each breakpoint; and a redline spec sheet with logic to help the team build the feature to spec.
As the A/B tests were running, we quickly analyzed the data and realized that users were completing the custom calendar reminder more often than the one-click option. Once we had enough data, we pulled the one-click option out and ramped the remaining variant to 50%. It's important not to have too many variants running at a given time, so that you have cleaner data to report back. Plan on measuring all the microinteractions on screen so you can determine the overall user behavior of the new feature before ramping it to 100% of your users.
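Ramping works the same way as the initial split, just with new percentages once a variant is removed. This sketch shows a deterministic weighted assignment; the weights and variant names are hypothetical and not from the team's actual experiment framework.

```python
# Sketch of ramping traffic after removing a losing variant (hypothetical weights).
import hashlib

def assign_weighted(user_id: str, weights: dict) -> str:
    """Deterministically map a user into variants with the given percentages."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    point = int(digest, 16) % 100  # a stable number in [0, 100) per user
    cumulative = 0
    for variant, pct in weights.items():
        cumulative += pct
        if point < cumulative:
            return variant
    return list(weights)[-1]  # guard against rounding gaps

# After pulling the one-click option, the remaining variant ramps to 50%.
ramped = {"control": 50, "custom_reminder": 50}
print(assign_weighted("user-1234", ramped))
```

Ramping gradually (50% before 100%) limits the blast radius: if the winning variant hides a problem the test didn't surface, only half of users are affected while you keep measuring.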
If you realize that the design variant needs a few more iterations because your goals are not being met, you can run some qualitative user tests to see if you might have missed anything. This is the beauty of trying new things and taking educated risks. If you're watching this course, you might have been part of this test, or perhaps, by the time you're watching this, it could be a shipped feature. I'd love to hear about one of your latest A/B tests in your product, and how you approached it, on our Practical UX: Lessons from the Trenches LinkedIn group.
This will help our UX community see other perspectives and techniques when trying to learn something new.