5 Tips to Run A/B Tests The Right Way

Khalid Saleh

Khalid Saleh is CEO and co-founder of Invesp. He is the co-author of Amazon.com bestselling book: "Conversion Optimization: The Art and Science of Converting Visitors into Customers." Khalid is an in-demand speaker who has presented at such industry events as SMX, SES, PubCon, Emetrics, ACCM and DMA, among others.
Reading Time: 5 minutes

When you implement changes to your marketing campaign, how do you know your audience will like them?

More importantly, how do you know the changes will convert customers better than what you had before?

This is where A/B testing comes into play.

Besides increasing your conversions, A/B testing helps you look beyond your analytics and figure out what your customers really care about.

As powerful as it may be, A/B testing is also hard for most businesses to get right. So in this article, I’ll share some tips to help you do A/B testing the right way.

1. Focus on what your audience needs

Asking your users questions is a great way to gather information for testing hypotheses. It is also a good starting point for your A/B testing journey.

You can use tools like SurveyMonkey to get feedback from your visitors. One way is to place a small survey in the sidebar and ask people questions like, “How do you think we can improve your experience on the website?” or “Are there any problems you are facing on the website?”

These can range from very subjective questions about the site, like this:

[Image: an open-ended site feedback survey]

Or they can be highly specific questions focused on a single issue, such as these:

[Image: a survey question focused on a single issue]

Surveys aren’t the only way to gather customer feedback. Old-fashioned email and even phone interviews often work even better.

For example, Alex Turnbull, founder of Groove, called more than 500 customers over four weeks to learn why they were leaving and what could be done to get them to stay.

The idea behind getting direct feedback is to apply what you learn to run smart A/B tests that truly impact the conversion rate.

Although surveys are great for getting feedback, do take their results with a pinch of salt. Individual testimony is subjective, after all. Always make sure your behavioral data matches your survey data before making a large-scale change.

2. Don’t follow others blindly

Stop all your guesswork.

You must begin A/B testing without assuming that what worked elsewhere will work for you. Just because a big yellow button worked for one startup doesn’t mean it will work for you as well.

It may very well be the case that a big yellow button converts better than any other color on your website too, but implementing it without A/B testing isn’t the way to go about it. In some cases, violet works better than yellow.

Once you understand this, you can adopt a systematic approach to testing instead of relying on guesswork.

For example, suppose you want to test your CTA button to get better conversions.

A systematic approach to doing this would start with the following assumptions:

  • Users don’t read pages; they scan.
  • Using bold colors helps an element grab the user’s attention.
  • Certain colors evoke certain emotional states (though the effect is often overstated).
  • Using colors that match or approximate your brand colors makes for a more cohesive (and better looking) brand.

By combining these assumptions, you’d narrow your testing ideas down to just 2-3 colors (colors that stand out on the page, evoke your desired emotion, and match your brand colors).
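To make the split concrete, here is a minimal sketch in Python (the variant names and user IDs are made up for illustration) of one common way to run such a test: hash each visitor’s ID so the same user always lands in the same color variant.

```python
# A minimal sketch of even traffic splitting across the shortlisted
# button colors. Variant names here are hypothetical placeholders.
import hashlib

VARIANTS = ["yellow", "violet", "green"]  # the 2-3 colors you narrowed down to

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a user so they always see the same color."""
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

for uid in ["user-101", "user-102", "user-103"]:
    print(uid, "->", assign_variant(uid))
```

Hashing instead of random assignment on every page load keeps the experience stable across visits, which matters when a test runs for weeks.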

For example, HipMunk’s CTA:

  • Sits above the fold
  • Uses a color that stands out and contrasts against the blue of the background
  • Has a visual indicator (a plane icon) to show what it does

[Image: HipMunk’s CTA button]

This is far better than simply testing a yellow button because it worked for somebody else.

3. Don’t brute force and test out all the best practices

Every website has a different list of things to test. You must have come across articles like “13 A/B Testing Best Practices” or “50 Essential Best Practices to Boost Conversion.”

Do you really have time to run tests on every little item in that list? After all, you need results and you need them now.

Narrow the list down to the tests that are relevant to your website and that have the potential to produce higher conversion rates for metrics that actually matter.

For example, if your target is to get more people to download your free guides (in exchange for an email address), it doesn’t make much sense for you to test your site’s footer design. You could focus on your download page design instead and extract much more value.

At the same time, it’s a good idea to knock the “low-hanging fruit” off your testing to-do list quickly. These are usually tests you can execute fast with minimal traffic, such as:

  • Headline value proposition
  • Social proof in sidebar
  • Button CTA
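One simple way to do this prioritization, sketched below in Python, is to score each candidate test on its potential impact, the importance of the page, and ease of implementation, then run the highest scorers first. The scores here are entirely made up for illustration.

```python
# A minimal sketch of ranking candidate tests; all scores are invented.
candidate_tests = [
    {"name": "Headline value proposition", "potential": 8, "importance": 9, "ease": 7},
    {"name": "Social proof in sidebar",    "potential": 6, "importance": 5, "ease": 9},
    {"name": "Button CTA",                 "potential": 7, "importance": 8, "ease": 8},
    {"name": "Footer redesign",            "potential": 2, "importance": 3, "ease": 4},
]

for test in candidate_tests:
    # Equal weights for simplicity; tune these to your own business.
    test["score"] = (test["potential"] + test["importance"] + test["ease"]) / 3

for test in sorted(candidate_tests, key=lambda t: t["score"], reverse=True):
    print(f'{test["score"]:.1f}  {test["name"]}')
```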

4. Clearly define your success metric

Before you run any test, decide how you’re going to measure success for that particular test. Define one success metric that will determine the winner based on the result you’re trying to achieve.

At the same time, ensure that your success metric isn’t so broad that it becomes unmeasurable. Pick something concrete that still points to your eventual goal.

For example, for most businesses, the ultimate goal is to convert as many visitors as possible into customers.

However, tracking the visitor to customer conversion rate can be hard, especially if you have a lengthy sales process.

In such a case, tracking the conversion rate for the intermediate steps leading up to the sale is much more useful.

For example, you could focus on the number of people who enter a landing page and fill out a form to get a free eBook. Your “success metric” in this case would be the percentage of visitors who fill out this form.

Based on this, your testing hypothesis might be, “The more fields on a form, the lower the conversion rate.”

Doing this will help you understand the success of your A/B tests much better.
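To show what determining a winner can look like in practice, here is a minimal sketch (with illustrative numbers, not real data) that compares the form-completion rates of a long form and a short form using a two-proportion z-test:

```python
# A minimal sketch of calling a winner on one success metric:
# the percentage of visitors who complete the download form.
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, z, p_value

# Hypothetical results: A = long form (control), B = short form (variant)
p_a, p_b, z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=168, n_b=2400)
print(f"control {p_a:.1%} vs variant {p_b:.1%} (z={z:.2f}, p={p:.4f})")
```

A small p-value (conventionally below 0.05) suggests the difference is unlikely to be random noise; otherwise, keep the test running or call it inconclusive.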

5. Document your tests and results

For some reason, a lot of businesses neglect this step.

Documenting your tests not only saves you time by keeping you from repeating tests you’ve run in the past, but it also helps educate your employees and successors.

By building a library of test results, you’ll also have valuable data to aid your future site redesign.

Additionally, testing data can reveal deep insights about your customers.

You don’t have to make this very elaborate either. Something like a simple Excel sheet would work quite well:

[Image: a spreadsheet used to document A/B test results]
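If you prefer to automate the log rather than maintain a spreadsheet by hand, here is a minimal sketch that appends each finished test to a CSV file. The column names and the sample row are only suggestions, not a standard format.

```python
# A minimal sketch of a running A/B test log in CSV form.
import csv
from pathlib import Path

LOG = Path("ab_test_log.csv")
FIELDS = ["test_name", "hypothesis", "start", "end",
          "control_rate", "variant_rate", "winner", "notes"]

def log_test(row: dict) -> None:
    """Append one test record, writing the header on first use."""
    is_new = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow(row)

log_test({
    "test_name": "Download form length",
    "hypothesis": "Fewer form fields will raise the completion rate",
    "start": "2016-03-01", "end": "2016-03-21",  # hypothetical dates
    "control_rate": "5.0%", "variant_rate": "7.0%",
    "winner": "variant", "notes": "Short form shipped site-wide",
})
```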

Over to You

A/B testing is part science, part art. The most effective A/B tests combine subjective data from customer interviews, eye-tracking studies, etc. with objective behavioral data to create better-converting pages.

Here’s what you should take away from this post:

  • Understand your audience and what it really wants on a page before you test anything.
  • Define your success metric for each test.
  • Prioritize your tests instead of simply testing every “best practice”.
  • Don’t rely on case studies; what worked for others might not work for you.