Ahhhh… the end of the Crash Course. Really it’s the beginning, because if you have followed all 30 lessons so far, you have a lot of new tools to use. But before you get started with professional UX, we need to learn one more thing, so you know what is actually “better”:
Designing with Data: A/B Tests
(If you’re just starting the UX Crash Course: Start Here.)
Let me paint the picture a little bit first…
Imagine that you want to design a page to sell shoes. And, of course, you want to sell as many shoes as possible. What do you think would get more people to buy?
A video of the shoes?
Complete shipping details before they click “buy”?
The logo of the shoe brand?
A money-back guarantee?
How do you choose?
If your first thought was “ask the users!” that’s not a bad idea. But often — when the options are all subjective, like this — asking people just confirms that different people like different things.
So how can you decide between subjective things, like a boss?!
Design all the things! Then launch all the options at the same time, as an A/B Test.
What is an A/B Test?
An A/B test is a way of asking thousands or millions of real visitors which option is best. You design all the options you want to compare, and launch them all!
The test makes sure that each unique visitor sees only one option. Then, after enough people have gone through your test, you can see which version of the design created more clicks. The test should also measure the statistical "confidence level," so you know when you're done (don't stop it too early!).
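That "confidence level" is just statistics. As a rough sketch (not the formula any particular testing tool is guaranteed to use), here is a two-proportion z-test in plain Python: given clicks and visitors for versions A and B, it returns how confident you can be that the difference in click rate is real rather than luck. The numbers in the example are made up.

```python
import math

def conversion_confidence(clicks_a, visitors_a, clicks_b, visitors_b):
    """Two-proportion z-test: how confident can we be that
    version B's click rate really differs from version A's?"""
    p_a = clicks_a / visitors_a
    p_b = clicks_b / visitors_b
    # Pooled click rate under the "no real difference" assumption.
    p = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = math.sqrt(p * (1 - p) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided confidence level from the normal distribution.
    return math.erf(abs(z) / math.sqrt(2))

# Hypothetical data: 200/10,000 clicks on A vs. 260/10,000 on B.
print(conversion_confidence(200, 10_000, 260, 10_000))
```

With those made-up numbers the confidence comes out above 95%, the threshold most teams treat as "done." Run it with only 100 visitors per version and the same click rates, and the confidence drops sharply, which is exactly why stopping a test early is dangerous.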
You can do it with 2 versions, or 20 versions. Just remember: only part of your traffic can see each one, so the more versions you are testing, the more traffic or time you will need.
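Splitting traffic like this is usually done by hashing something stable, like a visitor's cookie ID, so the same person always lands in the same version. A minimal sketch (the function and variant names are illustrative, not from any real testing tool):

```python
import hashlib

def assign_variant(visitor_id: str, variants: list) -> str:
    """Deterministically bucket a visitor into one variant.

    Hashing the visitor ID (e.g. a cookie value) means the same
    person sees the same version on every page load, and traffic
    splits roughly evenly across however many variants you have.
    """
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Hypothetical variants for the shoe page:
variants = ["video", "shipping-details", "brand-logo", "guarantee"]
print(assign_variant("visitor-42", variants))  # same answer every time
```

Because the assignment is deterministic, you don't need to store anything per visitor, and adding a 5th or 20th variant is just a longer list (with correspondingly less traffic per version).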
A couple details:
1) A/B Testing is usually free, other than the time it takes to design and create the pages you're testing. In the big picture, the results can be extremely valuable, so even a small cost to run an A/B Test is well worth it.
2) It is not the same thing as launching a new page and then watching to see if that page seems better than the old one. The only way to compare two designs is to run them both at the same time, using (roughly) equal numbers of people for both.
3) An A/B Test is most reliable when you only change one detail. If two pages are the same, but one has red links and one has blue links, that’s fine. If they also have different menus, then there is no way to tell whether it is the link color or the menu that is making the difference.
4) Testing two totally different pages, like a home page and a checkout form, is completely useless. That’s not a proper A/B Test.
That’s it! If you made it through all 31 lessons, congratulations! You are now a more well-rounded design thinker, and if you practice what you have learned in this course, you will have skills that are in very high demand in the tech industry.