In the business world — which is where UX lives, mostly — everything has a cost. UX is no different. So when you start telling your clients or bosses that they should do A/B testing, they might ask:
“How much does an A/B test cost?”
When people in business ask how much something costs, usually they are not interested in the price itself.
What they really want to know — whether it relates to UX or not — is if this purchase is going to be “worth it”. They want to compare the cost to the benefit, like when you wake up with a hangover and the pizza that was “too unhealthy” yesterday suddenly seems like God’s greatest invention.
Sidenote: an Italian named Raffaele Esposito is credited with inventing the modern pizza (the Margherita), so if you ever wondered what God’s name was, there you go.
So… how much does an A/B test cost, really?
The stupid answer:
“An A/B test takes weeks! We just don’t have a budget for that.”
Yeah… not quite.
The real answer:
A/B testing is very cheap. But not free.
As much as I want to say it is completely free — it’s not. Google Analytics can run A/B tests at no charge, but the software is not the whole story.
Somebody needs to set up the code. That only takes an hour or two, but it’s still somebody’s time.
Somebody needs to design the different versions for the test, and that time isn’t free either.
And in a real company, you might need a meeting about it, and that is 30 minutes of your life that you’ll never get back.
Basically that’s it, but it’s not nothing. Remember: honesty in UX builds trust, so don’t tell people that an A/B test is free. It costs time.
The time you spend waiting for the test to run is not a cost, because you don’t have to do anything while you wait. You don’t have to babysit or check an A/B test. It just… runs. So the start-to-finish time of an A/B test is not relevant to the “cost”.
Facebook is usually more of a cost than waiting for an A/B test to run.
Why this isn’t a stupid question:
NOT doing the A/B test might be the real cost.
It is very common to focus on the cost of an action, rather than the cost of inaction. That can be a huge mistake. And that is the conversation you should have about A/B testing.
The discussion about “return on investment” for an A/B test is often stupid.
If you’re testing your conversion rate, and Version B is 5% better than the existing design, you just added 5% to your registrations or revenue, forever.
Unless you run a high-tech lemonade stand — which would be awesome — a 5% increase in conversion could be thousands or millions of dollars, every year. That’s far more than the one-time cost of the A/B test.
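If you like, you can do the back-of-the-envelope math yourself. Here is a quick sketch — every number in it (traffic, conversion rate, revenue per conversion, hourly rate) is made up for illustration; plug in your own:

```python
# Rough ROI estimate for an A/B test. All numbers are hypothetical.
monthly_visitors = 100_000        # traffic to the tested page
conversion_rate = 0.02            # current rate: 2% of visitors convert
revenue_per_conversion = 50.0     # average value of one conversion, in dollars
lift = 0.05                       # Version B converts 5% better (relative lift)

# One-time cost: a couple hours of dev time, some design time, one meeting.
test_cost = 8 * 100.0             # ~8 hours of people's time at $100/hour

baseline_annual = monthly_visitors * 12 * conversion_rate * revenue_per_conversion
gain_per_year = baseline_annual * lift

print(f"Baseline annual revenue: ${baseline_annual:,.0f}")   # $1,200,000
print(f"Extra revenue per year:  ${gain_per_year:,.0f}")     # $60,000
print(f"One-time test cost:      ${test_cost:,.0f}")         # $800
```

With those made-up numbers, the test pays for itself 75 times over in the first year — and the gain repeats every year after that, for free.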
In business, the technical term for that is “fuck yeah, let’s do it.”
You can never know what the increase will be until you run the test, but 5% is a fairly typical result. It is not uncommon to increase your numbers by 20%, 50%, or even 200% if you nailed the hypothesis.
Some CEOs would be willing to move the company to another continent to double their global revenue. You might only have to do a good A/B test on the right thing.
The hour-or-two of development time sounds pretty good now, right? It probably took longer to design your business cards.
This is why I always say that the best salesperson in any company isn’t on the sales team. It is (potentially) the UX designer.
Tomorrow we will answer: “What if my favorite design doesn’t win the A/B test?”