
A/B Testing is for Idiots

It starts with an innocent question. 
"Which headline do you think sounds better, this or this?"

It seems reasonable; you're just asking for someone's opinion. Why then does it get shot down with obnoxious responses like "How do I know, test it"? Don't get me wrong, I'm all for testing, but for small businesses I think it's completely and utterly pointless 99% of the time, for one simple reason.

The sample size isn't big enough.

When I was 21 or 22, I worked for what was at the time the UK's leading email marketing company, Pure. They invented a tool called The Subject Line Selector. Nowadays it's standard in any email software, but they pioneered it. Here's how it worked: you upload your list (a minimum of 10,000 addresses), then it sends 2,500 emails using one subject line and 2,500 using an alternative. It waits an hour, checks the open rates against each other, sees which one did best, and uses that for the remaining 5,000 sends. Awesome stuff, but here's the thing: it needed that test sample of 5,000 to see if there was any actual pattern. Anything smaller than that is just randomness and unreliable.
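If it helps to see that workflow spelled out, here's a rough Python sketch of the same idea. This isn't Pure's actual code, just an illustration; `send_batch` and `get_open_rate` are placeholders for whatever your email platform actually provides.

```python
import random
import time

def subject_line_selector(recipients, subject_a, subject_b,
                          send_batch, get_open_rate,
                          test_size=2_500, wait_seconds=3_600):
    """Split-test two subject lines on a sample, then send the
    winner to the rest of the list. send_batch(group, subject) and
    get_open_rate(group) are caller-supplied placeholders."""
    recipients = list(recipients)        # don't mutate the caller's list
    if len(recipients) < 4 * test_size:  # the tool required 10,000+
        raise ValueError("List too small for a meaningful test")

    random.shuffle(recipients)           # avoid ordering bias in the samples
    group_a = recipients[:test_size]
    group_b = recipients[test_size:2 * test_size]
    remainder = recipients[2 * test_size:]

    send_batch(group_a, subject_a)
    send_batch(group_b, subject_b)
    time.sleep(wait_seconds)             # give people an hour to open

    # Whichever subject got the better open rate goes to everyone else
    winner = subject_a if get_open_rate(group_a) >= get_open_rate(group_b) else subject_b
    send_batch(remainder, winner)
    return winner
```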

What's your sample size?

Let's say you want to test the headline on your homepage. Seems reasonable enough. You can't even think about drawing any sort of conclusion until a minimum of 5,000 people have seen that test. You might start to see a pattern at that point, but realistically you need at least 10,000 people to confirm that one version has performed better than the other. Say you currently get 50 unique visitors to your site per day. Do you know how long it would take, at those numbers, to get any sort of proof? 200 days. So, given that you shouldn't test more than one thing at a time, you can run fewer than two tests a year.
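The maths behind that is simple enough to sanity-check in a few lines of Python, using the 10,000-view figure from above:

```python
def days_to_run(daily_visitors, target_views=10_000):
    """Days one split test needs before the target number of
    visitors have seen it."""
    return target_views / daily_visitors

days = days_to_run(50)  # 200.0 days at 50 visitors per day
print(f"{days:.0f} days per test, {365 / days:.1f} tests per year")
# -> 200 days per test, 1.8 tests per year
```

Plug in your own traffic numbers and you'll see how quickly this becomes impractical at small scale.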

What's the point?

If your numbers differ from this, adjust the severity of my suggestion accordingly. I do want you to question the point of wasting time agonising over whether an orange button performs better than a green one, though. If your visitor numbers are relatively low, you'd be better off spending an hour doing some marketing or setting up some ads than running a split test. I'm not suggesting you should never test anything. By all means, change your homepage once every six months and test one version against the other. But here's the thing: you've probably changed other, more influential things in that time, like your pricing or how you position yourself, which render your headline test or button colour test useless.

Don't obsess

I'm not 100% against testing. We do it, but I don't obsess over it. We tested our pricing recently by adding a lower tier at £9 per month. Within three days, two people had bought it. Sweet, the test worked. Now forget about it. Testing subtle design changes is a waste of time unless you're dealing with thousands of visits to your site in a short space of time. If you are going to test something, just make sure you give it that exposure of at least 5,000 views, ideally 10,000.

Let others do the work for you

If you want to find out whether a CTA button works better in a mad colour or one more in line with the design of the site, just Google it. Someone with far more testing capability has already tested it a million times over. Read a few articles and go with your best judgment. Test, but don't obsess. Until next time.

Adam Hempenstall is the CEO and Founder of Better Proposals. He started his first web design business at 14 and has since written four books and built an international movement around sending better proposals. Having helped his customers win $500,000,000 in the last 12 months alone, he’s launched the first ever Proposal University where he shares best practices on writing and designing proposals. He co-runs a once-a-year festival called UltraMeet and is a massive FC Barcelona fan.