“Let’s do an A/B test.” It was a common buzzphrase thrown around during my agency and enterprise days as a digital marketer. Today, A/B testing is a popular decision-making tool amongst marketers, product teams and web developers.
However, as much as it is celebrated, there are compelling reasons NOT to A/B test.
But before we get to that, let’s look at what an A/B test is, why people run them and what the benefits are.
What is A/B Testing?
A/B testing, also known as split testing, is a method where you compare two versions of a web page, email, ad or an app feature to see which performs better.
You present version “A” to one group and version “B” to another, tracking how each performs on a specific metric such as click-through rates or conversions.
The idea is simple: change just one element, like a headline, call-to-action button or product image, and see which version gets more favourable results.
The Benefits of A/B Testing
The core benefit of A/B testing is making data-driven decisions to improve business outcomes. Instead of relying on intuition or guesswork, A/B tests provide quantifiable data.
Small changes like tweaking a headline or changing the layout of your landing page can lead to improvements in conversion rates. Theoretically, if you manage to double your conversion rate from 1.5% to 3%, your sales should double, holding all other variables constant.
I use “casual A/B tests” mostly at the ad level. If I am running a video ad on Meta’s advertising platform and it isn’t getting the desired click-through rate, I’ll switch up the headline (the “hook”) and look at the data for a couple of days.
It is also common practice amongst the digital marketing community to test headlines on landing pages.
Here are five common areas where A/B tests are applied:
- Website Design: Testing layouts, call-to-action buttons, or navigation changes to improve user engagement and conversion.
- Email Marketing: Comparing subject lines, email content, or sending times to increase open rates and clicks.
- Advertising: Testing ad copy, images, or targeting to improve click-through rates and ROI.
- App Features: Testing different app functionalities to enhance user experience or retention.
- E-Commerce: Testing product descriptions, pricing strategies, or checkout flows to boost sales.
Free Tools for A/B Testing
A/B tests have to reach “statistical significance”. Statistical significance in an A/B test means that the results aren’t just due to random chance.
It tells you whether the difference between version A and version B is real and not just a coincidence. If a test result is “statistically significant,” it means you can be confident that the changes you made (like a new headline or button colour) caused the difference in user behaviour.
Usually, this confidence is set at 95%, meaning there’s at most a 5% chance the result is down to luck. For your A/B test to stand mathematically, I recommend using the AB Test Significance Calculator by SurveyMonkey.
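If you’d rather sanity-check the math yourself than plug numbers into a calculator, here is a minimal Python sketch of a standard two-proportion z-test, one common way these calculators check significance. The visitor and conversion counts below are hypothetical.

```python
from statistics import NormalDist

def ab_significance(visitors_a, conversions_a, visitors_b, conversions_b):
    """Two-proportion z-test: returns the absolute lift and two-sided p-value."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = (pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return rate_b - rate_a, p_value

# Hypothetical test: 10,000 visitors per version
lift, p = ab_significance(10_000, 500, 10_000, 560)
print(f"Lift: {lift:.2%}, p-value: {p:.3f}")  # "significant" at 95% only if p < 0.05
```

Note that even a 12% relative lift on 10,000 visitors per version can fall just short of the 95% threshold, which is exactly the traffic problem covered next.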
The Math Fun Behind A/B Tests
Now, here’s why I say that most marketers don’t truly understand A/B tests.
When testing small changes in conversion rates, the number of visitors needed grows drastically as the baseline conversion rate decreases:
- 50% to 55%: Needs 1,567 people per group.
- 20% to 22%: Needs 6,347 people per group.
- 10% to 11%: Needs 14,313 people per group.
- 5% to 5.5%: Needs 30,244 people per group.
- 1% to 1.1%: Needs 157,697 people per group.
For a 0.1% to 0.11% increase, you’d need over 1.5 million unique landing page visitors.
Notice that as the base rate gets smaller, the sample size you need for reliable results grows dramatically.
Source: https://www.evanmiller.org/the-low-base-rate-problem.html
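If you want to reproduce figures like these yourself, here is a rough Python sketch of the standard sample size formula for comparing two proportions at 95% confidence and 80% power. The outputs will differ slightly from the numbers above because different calculators use slightly different approximations.

```python
from statistics import NormalDist

def sample_size_per_group(p1, p2, alpha=0.05, power=0.80):
    """Approximate visitors needed per variant to detect a lift from p1 to p2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (p2 - p1) ** 2) + 1

for base, lift in [(0.50, 0.55), (0.20, 0.22), (0.10, 0.11), (0.05, 0.055), (0.01, 0.011)]:
    print(f"{base:.1%} -> {lift:.1%}: {sample_size_per_group(base, lift):,} per group")
```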
Let’s put that math into practical terms.
Let’s assume you are paying $2 per click on Meta ads to drive traffic to a landing page converting at 5%.
You’ll then need at least $60,488 ($2 multiplied by 30,244 clicks) in advertising budget per variant, roughly $121,000 across both versions, to confidently say that a lift from 5% to 5.5% is statistically significant.
That’s ALSO assuming every click results in a unique landing page view.
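In code, that back-of-the-envelope budget looks like this (same hypothetical $2 CPC and the 30,244-per-group figure from above):

```python
cpc = 2.00            # hypothetical cost per click from Meta ads
n_per_group = 30_244  # visitors needed per variant to detect 5% -> 5.5%

print(f"Budget per variant:  ${cpc * n_per_group:,.0f}")      # $60,488
print(f"Budget for the test: ${cpc * n_per_group * 2:,.0f}")  # $120,976
```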
When and Why You Should A/B Test
Look, I can confidently say that if you are fixated on A/B testing button colours, you’re missing the bigger picture.
If your marketing strategy focuses on tweaking minor design elements instead of addressing core customer pain points, product-market fit or trust with your audience, you have a business problem, not a “split testing” problem.
Marketing success is built on understanding your customers: their motivations, needs, and barriers to purchase.
A/B testing may be a buzzword thrown around in marketing discussions, but in reality it takes a good amount of time (and traffic) to gather enough data for statistically significant results.
That isn’t feasible for smaller websites with low traffic or a limited paid media budget. A/B tests also often produce only incremental improvements, which might not justify the time and resources invested.
How about… A/A Testing?
There’s a blogger, Kadavy, who ran A/A tests just to demonstrate the ridiculousness of running A/B tests.
A/A testing is an experiment where two identical versions are tested against each other. Ideally, the results should be the same.
However, if the results differ, it points to sampling error or flaws in your A/B testing setup. This highlights the natural variation that occurs when split testing and shows how often you might get false positives in A/B testing.
The more interesting part is that an apparently significant difference can appear purely by random chance, especially with small sample sizes.
This goes to show that just because a test hits a significance threshold (often 95%) doesn’t mean the results are always valid.
Small Sample Sizes Lead to Misleading Data
When you run tests on a small set of users, you are more likely to get skewed results. If you don’t have enough traffic, the variation between the two groups may seem significant when it could merely be statistical noise.
In Kadavy’s case, his A/A test showed a 300% increase in conversions even though he made no changes, demonstrating how unreliable small-sample tests can be.
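To see how easily a bogus winner appears, here is a small simulation sketch: both “variants” share the exact same true conversion rate, yet a naive significance check still flags some splits as winners, and with tiny samples the spurious “lifts” are enormous. All numbers are made up for illustration.

```python
import random
from statistics import NormalDist

def p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value from a two-proportion z-test."""
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    if se == 0:
        return 1.0
    z = (conv_b / n_b - conv_a / n_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

random.seed(1)
true_rate = 0.05     # both "variants" convert at exactly the same rate
n = 200              # a small sample, typical of a low-traffic site
flagged = []         # apparent relative lifts in tests that "reached significance"

for _ in range(1_000):
    a = sum(random.random() < true_rate for _ in range(n))
    b = sum(random.random() < true_rate for _ in range(n))
    # Skip the rare zero-conversion control to avoid dividing by zero
    if a and p_value(a, n, b, n) < 0.05:
        flagged.append((b - a) / a)  # relative "lift", entirely due to chance

print(f"A/A tests flagged as significant: {len(flagged) / 1_000:.1%}")
if flagged:
    print(f"Largest bogus 'lift' among them: {max(flagged):+.0%}")
```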
For the most part, small businesses often do not have the traffic necessary for statistically valid A/B tests.
Simple question to ask yourself:
- Do you have enough traffic to make the test valid? You can use a sample size calculator to check.
In my experience, you are a hundred percent better off relying on qualitative data like customer feedback than investing heavily in split testing.
Minor Gains are Often an Excuse for a Lack of Vision
In corporate speak, you may sound smart in marketing discussions for proposing A/B tests. However, A/B tests are often a poor excuse for a lack of vision.
Yes, a website that looks like it was designed in the early 2000s isn’t going to need A/B testing. It needs a revamp.
If your customer acquisition funnel has poor images and is barely congruent with your Facebook advertising visuals… does it need an A/B test? Or does it need a revamp?
As mentioned, A/B tests detract from long-term vision and bolder business decisions.
Secondly, if you overly emphasize data and metrics, you can lose sight of creative ideas and larger brand messaging.
Peter Thiel, a well-known tech entrepreneur, criticizes incrementalism, which is just making small improvements to existing ideas. He believes entrepreneurs need bold visions, not constant second-guessing through endless tests.
Eric Ries, known for the Lean Startup concept, also warns against turning A/B testing into a strict routine. He argues that over-relying on tests can hurt creativity and agility.
Both argue that testing can’t replace the big-picture thinking needed for true innovation.
100 Million Dollars in Revenue without a Single A/B Test
Ahrefs is an SEO tools company that grew to $100 million in annual revenue, and they achieved this growth without a single A/B test.
They even did it without traditionally data-driven strategies or tools like Google Analytics, conversion tracking and retargeting.
They offered no discounts, discontinued their affiliate programs and didn’t bid on competitors’ keywords.
Tim Soulo, their chief marketing officer, attributes their success to a focus on product development and constant iteration based on gut feel and qualitative feedback.
My Own Experience with A/B Tests (Or a Lack Of)
I was an employee at a digital marketing agency serving small and medium enterprises in Singapore, and I also managed millions of dollars in paid advertising spend at an enterprise level. I always rejected A/B test proposals, both internally and from vendors. If I wasn’t in the position to make the call, I didn’t pay much attention to A/B tests.
In short, I never bothered.
I also grew my first business, a dating advice blog for men targeted at the Singaporean market, to more than 100k in sales without a single serious A/B test.
If I had a hunch that the headline copy of a three-minute video ad on Meta was causing a drop in click-through rates, I’d duplicate it at the ad level in the ad dashboard, run it side by side with the original for a couple of days, and then look at the results. That is the CLOSEST I came to a proper A/B test.
Multivariate Testing: Can You Test Multiple Things in One Go Then?
Now, can you test multiple changes (variables) simultaneously?
Multivariate testing allows you to test multiple elements simultaneously. For instance, if you’re testing headlines and images, a multivariate test could assess how various combinations perform.
The downside?
Multivariate tests require significantly more traffic to achieve meaningful results, as you need a larger sample size to ensure that your findings are statistically significant.
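As a rough sketch of that traffic explosion, assume a hypothetical test of three headlines against three hero images, and reuse the earlier per-group figure for detecting a 5% to 5.5% lift:

```python
headlines, images = 3, 3        # hypothetical elements under test
combinations = headlines * images

visitors_per_cell = 30_244      # per-variant figure from the 5% -> 5.5% example
total_visitors = combinations * visitors_per_cell

print(f"Combinations to serve: {combinations}")       # 9
print(f"Visitors needed:       {total_visitors:,}")   # 272,196
```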
You’re back to the same constraint: a lack of traffic (and budget to purchase that type of traffic).
Then again, I can assure you that most marketing problems don’t lie in single-variable or multivariate tests. They lie in your advertising creatives, customer research, messaging and lack of product-market fit.
Conclusion
All in all, should you conduct A/B tests? Yes, A/B tests can be helpful if they are implemented on high-traffic sites or customer acquisition funnels with bigger budgets. If you are spending hundreds of thousands on paid media buying per day, then A/B testing can be a good consideration.
Here’s the bottom line: if you are obsessing over testing button colours… then you’re probably missing the point.
In my opinion, marketing success is built on understanding your customers and addressing their pain points. Yes, Ahrefs grew to $100M in revenue without a single A/B test, no Google Analytics and no conversion tracking, just a focus on product development, listening to customers, good content marketing and iteration.