AB Tests That Destroy the Myths Created by AB Testing
- Posted in Conversion Rate Optimization
Remember the good old days of retailing? You could count on at least two truths. First, advertising would get people into your store and, second, you converted some of that traffic into customers – at least 20% of them, with many sectors enjoying rates of 50% or more.
And there was a bonus truth: ‘cart abandonment’ wasn’t in your vocabulary.
Then along came ecommerce. The two truths remained the same. Online advertising, like SEO and PPC, would drive customers to your website. But the second truth proved shocking. While conversions still occurred, online conversion rates were an appalling 2% to 5%.
And the third truth was scorched by 60%+ cart abandonment rates.
As the web evolved and technology made it easier to find answers, AB testing came to the fore as a means to improve conversion rates.
What a concept: serve two variations of a web page to customers and see which one performs better. Then use the winning version in subsequent tests to get even better performance. It seemed like the key to an enviable progression of ever-increasing conversion rates.
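The mechanics behind that concept can be sketched in a few lines. The example below is purely illustrative (the function names and hash-based bucketing scheme are our own, not any particular testing tool's API): assign each visitor to a variant, tally visits and conversions, and compare the rates.

```python
import hashlib

# Running tally per variant: [visits, conversions]
stats = {"A": [0, 0], "B": [0, 0]}

def assign_variant(user_id: str, variants=("A", "B")) -> str:
    """Deterministically bucket a visitor so a returning user
    always sees the same page variation."""
    digest = int(hashlib.md5(user_id.encode()).hexdigest(), 16)
    return variants[digest % len(variants)]

def record_visit(user_id: str, converted: bool) -> None:
    """Log one visit and, if it happened, one conversion."""
    variant = assign_variant(user_id)
    stats[variant][0] += 1
    if converted:
        stats[variant][1] += 1

def conversion_rate(variant: str) -> float:
    visits, conversions = stats[variant]
    return conversions / visits if visits else 0.0
```

Hashing the user ID (rather than assigning variants at random on each page load) keeps the experience consistent for returning visitors, which matters when a purchase happens on a later visit than the first page view.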
AB test results were considered so conclusive and reliable that many site and landing page designers used them as a rule-book for conversion optimization. More and more orange call-to-action buttons appeared, privacy assurances popped up, and content pages increasingly featured headlines that began with "How to…" or "9 Most/Best/Great…".
All seemed well, but, oddly, while many AB tests improved conversion rates by hundreds of percent, average conversion and cart abandonment rates remained the same.
And the reason why is revealed by subsequent AB tests, which show that many of the long-held truths of conversion optimization and design can be mythical.
AB Test Myths Busted by AB Tests
1. Always Have Your Call-to-Action Above the Fold
One of the reasons we so readily ‘swallowed the pill’ of early test results is that they made so much sense. One such result concerns the best placement of your call-to-action on the page.
Early tests showed that, if the CTA was below the fold, many page visitors bounced without heeding the call. The assumption was that, if it was out of sight below the fold, visitors didn’t know the CTA was there.
However, in a number of tests, including one by Marketing Experiments, below-the-fold CTAs outperformed above-the-fold versions.
In the Marketing Experiments test, which showed a 20% increase in conversions, the post-test analysis determined that customers must appreciate the value of taking the call-to-action. If you can’t convey that value above the fold, they won’t take the call-to-action just because it’s there.
2. Video and/or Images Increase Conversion Rates
Another reason we latch on to AB test results is that we can often relate to the findings. As it became easier to add images and videos to landing pages, many test results showed that pages with visuals out-converted text-based versions.
And all it took for us to quickly accept those results as ‘etched-in-stone’ was to consider whether we would prefer to read 300 words or watch a snappy video. Gimme the video.
But, in a recent Invesp test, removing a video from above-the-fold, and replacing it with benefit-driven text, increased conversions by 88.46%.
Why? Not all video is faster at outlining the value of your offer than copy. And, while there are exceptions, most videos lack an actionable call-to-action. The copy added by Invesp was to the point and very clearly outlined the value. And it was accompanied by a clear call-to-action.
3. You Must Highlight Privacy & Trust Factors
Again, you can easily relate to this one. It didn’t take us long to realize that all the personal information we were asked to submit to open an account on ecommerce sites, or to download some marginally helpful whitepaper, would later be used to bombard us with marketing messages – or worse.
We quickly hesitated to give away our personal info and soon AB test results showed that highlighting privacy policies and/or trust icons, especially close to CTAs, boosted conversion rates.
But it looks like consumers are more discerning than we think. In a test outlined last month on visualwebsiteoptimizer.com, adding the reassuring copy “We respect your privacy” immediately below the CTA button decreased conversions by almost 25%.
What happened? In their analysis of the results, VWO concluded that adding the text had the counterintuitive effect of instilling fear into customers’ thought process. Without the ‘privacy’ copy, they didn’t think about it, but when it appeared, they became concerned.
In this case, the page asked only for an email address, not even a name. Especially in light of stronger email filters, customers likely don’t mind swapping such a relatively small amount of info to take advantage of an offer. Privacy concerns don’t enter their minds, unless you place them there.
4. You Must Optimize for Mobile
With the proliferation of mobile devices, it’s a no-brainer that we should test what converts best on them.
One of the first findings was that mobile users preferred sites that displayed well on mobile devices because they didn’t have to pinch, squint or zoom to see content – or buy something.
With mobile commerce growing by double digits year after year, and no end in sight, it has become conventional wisdom that your site must at least use responsive design, if not be designed mobile-first.
But don’t go mobile too fast. As outlined on www.wordstream.com, Jeff Allen of Hanapin Marketing, a big fan of mobile PPC, was surprised by AB test results that showed a desktop landing page outperforming a mobile-specific page by more than 25% in a month-long test.
More bad news for mobile optimization? As shown in the Invesp infographic “U.S. mobile Commerce Sales”, tablets account for almost three times the sales of smartphones – and even Google suggests you serve your desktop site to tablets.
So Should You Just Stop Testing?
No. On the contrary. The fact that AB tests can bust their own myths, with similar tests producing different results, is the perfect case for more testing. Indeed, because you clearly can’t rely on someone else’s results, you must test everything yourself to see what works best for you.
So there’s only one truth about AB testing: test everything and keep testing everything.
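Part of "testing everything" is making sure a winner really won before you act on it. A minimal sketch of that check, using a standard two-proportion z-test with purely illustrative numbers (not figures from any test cited above):

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a, visits_a, conv_b, visits_b):
    """Compare conversion rates of variants A and B.
    Returns the z-score and a two-tailed p-value."""
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    # Pooled rate under the null hypothesis (no real difference)
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the normal CDF (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative: 40/1000 (4%) vs 60/1000 (6%) conversions
z, p = two_proportion_z_test(40, 1000, 60, 1000)
```

With these made-up numbers the p-value falls below the conventional 0.05 threshold, so the lift would usually be read as significant; with smaller samples the same 4%-vs-6% split often would not be, which is exactly why "keep testing" beats borrowing someone else's result.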