What they don’t tell you about A/B testing velocity

    How many A/B tests do you run every month?

    How about every year?

    Now. How many tests should you be running every year?

    You probably don’t know the answer to the third question.

    You’re not alone – most marketers can’t answer that with confidence.

    If you are like most people, you probably launch one or two A/B tests every month – meaning that you perform fewer than 30 tests per year.

    Studies show that only a few companies run more than 15 tests every month. That’s understandable, considering how demanding it can be to build a high-velocity A/B testing program.

    Anyway…in this blog post, I will share a few things that don’t get talked about when it comes to A/B testing velocity. So this one is for optimizers, marketers, founders, CEOs, or just anyone looking to understand what happens behind the scenes of a high-velocity testing program.

    Let’s get the ball rolling…

    What’s A/B testing velocity?

    A/B testing velocity refers to the number of times you launch an A/B test in a given period.

    For example, if you only run one experiment every month, your A/B testing velocity would be 12 tests per year. Likewise, if you run three tests every month, your testing velocity would be 36 tests per year.

    There’s high and low-velocity testing.

    High-velocity testing means launching as many tests as possible, as fast as possible, to accelerate growth. On the other hand, low-velocity testing is about running fewer tests in a given period. The idea behind a low-velocity testing program is that quality matters more than quantity.

    Now that you know the difference between high- and low-velocity testing, which one do you prefer?

    I asked Jeremy Epperson – CRO specialist at Conversion Guides – a similar question, and this is what he had to say:

    “If we are looking at managing low vs. high levels of effort (LOE) testing, then it is a mix based on the overall CRO strategy. For teams newer to CRO, it’s important to start with low LOE tests to reduce technical complexity, simplify, mitigate resources, and help build momentum.

    You can start with low LOE testing, and if you aren’t getting the results, you can shift towards bigger swings tests. We try to focus our testing on isolating specific variables to understand the true impact and extrapolate insights in a clean way. 

    In situations where we have high LOE or technically complex tests, then we push the launch date out for a few weeks on our weekly sprint cycles. That can be balanced across tests with a faster turnaround, so we are maintaining velocity.”

    Shiva Manjunath – Marketing Manager at Gartner – says that test quality can also determine testing velocity:

    “You don’t have to compromise on the quality of some tests. High-quality tests demand more development time and design time. And that will decrease your velocity unless you have many developers and designers.”

    I also reached out to Oliver Palmer – a CRO consultant – and asked him the same question, and this is what he said:

    “Testing velocity is generally dictated by website traffic and the size of your team. If you have lots of traffic and a sizable team, you’re in a good position to run a lot of tests.”

    In other words, Oliver is saying that high-velocity testing is not for everyone. You need to have adequate resources (loads of traffic, high budgets, and sizable teams) if you plan to execute a high-velocity testing program.

    Besides the web traffic and team size, your testing velocity can also be determined by the testing tool you use. Not all testing tools can handle a high-velocity testing program. Some have advanced features that are more reliable for testing at high velocity than others.

    What’s all the fuss about high-velocity testing?

    Although most optimizers favor a high-velocity testing program, not everyone is on the same page.

    For example, Conversion Guides’ Jeremy says:

    “We have tracked data across launching CRO for 155 businesses and have found that there is a positive correlation between increasing testing velocity and improving ROI. This is one of the key quantitative metrics that you can monitor to find improvements in your team, process, and workflows. Of course, velocity is not the only contributor to driving results, but it is a key factor.”

    On the other hand, in this article, Natasha Wahid argues that running more tests does not equate to a successful testing program. She also says:

    “If you decide to test many ideas quickly, you are sacrificing your ability to really validate and leverage an idea. One winning A/B test may mean quick conversion rate lift, but it doesn’t mean you’ve explored the full potential of that idea.”

    In this vein, Shiva says:


    “Just because you have a super high-velocity testing program, it doesn’t mean that the quality of your tests is high. You can run 50 button color tests in one month, and your velocity for that month is 50 tests. But such tests are lower quality tests because they don’t help you gain valuable insights.”

    Companies like Booking.com swear by high-velocity testing. They run more than 24,000 tests every year. At one point, they even had 1,000 tests running at the same time. It sounds crazy, but they consider high-velocity testing a critical component of their growth strategy.

    Invesp’s CEO, Khalid Saleh, was not a huge fan of high-velocity testing at first, but he has since come around:

    “I was not a big believer in the importance of high-velocity testing prior to 2019. We averaged four experiments per month for any of our clients. And our success rate was around 40%. The reason that many companies focus on high-velocity testing is that most of their tests do not succeed. The average industry success rate for most A/B test experiments hovers around 12 to 15 percent. But then I figured out that if we can increase our testing velocity and at the same time keep our success rate on the same levels then it will be a huge win for the companies that we work with.”

    As you can see, at its core, high-velocity testing is based on the philosophy that the more tests you run, the more knowledge – about growing your business – you gain. According to Trevor Hinkle:

    “Massive growth often doesn’t come from one or two big winning tests, but from many smaller wins stacked together, and consistent testing means the next win is always right around the corner.”

    Oliver says it best: “the more frogs you kiss, the more princes you will find!”

    How to measure the velocity of your A/B testing program

    There are actually three metrics you should measure if you want to learn more about the velocity of your testing program:

    Testing Capacity

    How many tests do you launch in a calendar year? You can use this formula to find out:

    (52 weeks ÷ average test duration in weeks) × the number of pages you can test on

    Testing Velocity

    Now that you know your capacity, it’s time to find out how many A/B tests you actually launch in the same period. That number is your testing velocity.

    Testing Coverage

    Once you know the number of tests you can run and the actual number of tests you launch, ask yourself how many days of the year you actually have tests running. Keep track of the periods when you don’t run tests so you can get an exact number.

    6 Tips for building a high-velocity testing program

    Only a handful of companies run more than 15 tests per month. The majority of companies launch one or two tests per month. That is because a high-velocity testing program is not easy to execute. And you might fail a couple of times before creating a consistent high-velocity testing program.

    In this webinar, Booking.com’s ex-Director of Experimentation, Lukas Vermeer, explains that their culture of experimentation is not something they achieved from the get-go.

    He says that it took them 15 years to build. And they had to produce five to six iterations before having a thriving culture of experimentation.

    Lukas also goes on to say:

    “The reality is that you build a platform for four, and then you learn along the way all of the ways that are going to break, and then you scale up to 40, and then you scale up to 400, and then you scale up to four thousand. But, every time you scale up, you learn about some of the things that are breaking and how we will avoid that next time. So, this is an iterative process. I don’t think what we have now; we could have built or designed from the get-go.”

    Jeremy agrees with Lukas about starting small and increasing velocity over time:

    “You have to put the processes in place and refine them before you ramp up testing velocity. Otherwise, you experience breakdowns. So the first 90 days should focus on working on process, documentation, handoffs in the workflows between team members, getting comfortable with scientific testing, pushing many tests through the queue together. When those pieces are in place on whatever timeline works for your team, then you can start to increase the testing velocity.

    There are real-world constraints in any business, including time, resources, knowledge, team skills. So each team has to approach this in a way that works for them. There are a lot of factors with the complexity of starting or managing a CRO program. Find what works for you. Start small and build over time.”

    With that said, here are six key elements to keep in mind when you think of unlocking a high-velocity testing program:

    1. Keep every testable page filled at all times

    To maintain a high testing velocity, you have to make sure that each page (that you can test on) has a specific test running at all times.

    A page with no test running represents a wasted opportunity to learn something that could lead to growth.

    For instance, an eCommerce website is said to have a high testing velocity if there’s always a test running on the homepage, category pages, product pages, cart pages, and checkout pages.

    In this article, Trevor says that if your site’s mobile experience differs from that of a desktop, you should launch separate tests on desktop and mobile.

    But before you go down the rabbit hole, let’s agree that you won’t launch tests just for the sake of having a high testing velocity. Yes, you should always be testing – but every test must be backed by a strong hypothesis.

    As you run many tests simultaneously, it’s crucial to avoid running tests that could potentially conflict with each other or tests that are similar to each other.
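    One common way to keep concurrent tests from colliding is deterministic, hash-based assignment, where each visitor enters at most one experiment per “layer” of the site. Below is a rough Python sketch of that idea; the layer names and experiment names are made-up examples:

```python
import hashlib

# Hypothetical sketch: experiments grouped into layers so that two tests
# touching the same part of the site never run on the same visitor at once.
# Layer and experiment names are illustrative, not a real test queue.
LAYERS = {
    "checkout": ["checkout_cta_copy", "checkout_trust_badges"],
    "homepage": ["homepage_hero_message"],
}

def bucket(visitor_id: str, salt: str, buckets: int) -> int:
    """Stable hash of visitor_id into one of `buckets` slots."""
    digest = hashlib.sha256(f"{salt}:{visitor_id}".encode()).hexdigest()
    return int(digest, 16) % buckets

def assign(visitor_id: str, layer: str) -> str:
    """Pick exactly one experiment per layer for this visitor.

    Hashing with the layer name as salt makes the split deterministic:
    the same visitor always gets the same experiment on repeat visits.
    """
    experiments = LAYERS[layer]
    return experiments[bucket(visitor_id, salt=layer, buckets=len(experiments))]

print(assign("visitor-42", "checkout"))  # same result on every visit
```

    The salt per layer is the key design choice: it keeps assignments in one layer statistically independent of assignments in another.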

    2. Prioritize tests that are easy to implement 

    This is a no-brainer. Your prioritization of testing ideas has to be strategic.

    Focusing on tests that require a lot of effort to implement (such as strategic tests) means you won’t have enough time to run many tests.

    You don’t have to ignore strategic tests, of course, but you should prioritize testing ideas that are more tactical, because they offer a better effort-to-impact ratio.

    So how do you know which testing ideas have the best effort-to-impact ratio?

    That’s an intelligent question to ask.

    You can use Widerfunnel’s PIE framework. It evaluates ideas based on three factors: potential, importance, and ease.

    Potential: Testing ideas are not equal – some have a more significant impact on conversions than others. For instance, testing the messaging of your homepage can have a more substantial effect than testing the size of your security badges.

    Importance: How important is the page you are testing? Your site’s most important pages are those that receive the highest volume of the most expensive traffic.

    Effort: How easy is it to build the test? You should consider two critical factors when it comes to ease: technical implementation and organizational barriers.
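    As an illustration, PIE scoring can be sketched in a few lines of Python. The ideas and 1–10 scores below are hypothetical examples:

```python
# A minimal sketch of PIE prioritization. Ideas and 1-10 scores are
# hypothetical; teams rate each factor themselves and rank by the average.

def pie_score(potential: float, importance: float, ease: float) -> float:
    """Average of the three PIE factors; higher means test sooner."""
    return (potential + importance + ease) / 3

ideas = {
    "homepage messaging rewrite": pie_score(potential=9, importance=9, ease=4),
    "security badge size":        pie_score(potential=3, importance=5, ease=9),
    "product page CTA copy":      pie_score(potential=7, importance=8, ease=8),
}

# Rank the backlog: highest PIE score first.
for idea, score in sorted(ideas.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{score:.1f}  {idea}")
```

    Note how the high-potential homepage rewrite can still rank below a tactical idea once its low ease score drags the average down – which is exactly the effort-versus-impact trade-off described above.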

    3. Build new tests before the current one ends

    Continual ideation and proactivity are critical in a high-velocity testing program. Don’t wait for the current test to finish before you build the next one.

    Jeremy says that they “build their testing roadmaps two weeks ahead of launch at most.” He adds:

    “If you look further out than that, you end up with wasted resources, not needing a test, getting insights that change how you test. A lot of things change rapidly. It’s about striking a balance of testing efficiency vs. needing to be agile with test queue management.

    We “batch” a few tests at a time and push them through the queue together and launch them on our weekly sprint call. If you have enough traffic and pages to test across, then it’s pretty easy to use this approach and avoid the issue of building tests that won’t go live.”

    Oliver concurs:

    “You should build out your next test while one is live. However, to avoid wasted effort, it’s best to ensure that they’re different streams of work which explore tangential hypotheses which are ideally on different parts of the site.”

    But, he also warns:

    “Presupposing the outcome of a live experiment is a humbling and frustrating exercise that is best avoided, in my experience.”

    Constantly scrutinize every touchpoint your customers have with your brand. You can conduct expert reviews, qualitative research, quantitative research, competitive analysis, and usability testing to generate more hypotheses to test.

    As I mentioned earlier, every test has to be backed by a data-driven hypothesis. Otherwise, you will end up testing solutions in search of a problem.

    4. A standardized Quality Assurance process 

    Over the years, we have noticed that the Quality Assurance (QA) process can determine the success or failure of your tests. Visual and functional errors in the variants are likely to distort your test results if you don’t do proper QA.

    Khalid says:

    “QA is an area we have invested in tremendously. You want to make sure that you’re not deploying a test that is going to cause a breakdown in the checkout process. Each and every test we launch is QA’ed by multiple people on our side, and before we even launch any test, we always ask our clients to review those tests before we deploy them.”

    So, whether your testing velocity is high or low, you still have to maintain a high degree of accuracy and produce reliable findings. That is why you should make it a point to quality-assure every test.

    It’s even more important to have a standardized QA process in a high testing velocity program since you will be launching dozens and dozens of tests and making constant changes on your website.

    So, make sure you have an effective, standardized QA process that you run before rolling any test out to your visitors. Trust me; you don’t want to discover a functional or visual error after a test has been running for days or weeks.

    5. Document your learnings 

    It’s true. A lot of learning happens through reflecting on things you have done. That also rings true in the world of A/B testing.

    Documenting your learning is a critical but often underrated step in A/B testing – especially when you are constantly changing your website elements.

    Besides helping you avoid repeating tests you have already conducted, documenting your A/B test learnings enables you to retain insights and provides the source material for your next series of tests.

    At Invesp, in every optimization project we handle, we always show our clients how to document test learnings and share them with every department in their company.

    When it comes to documenting A/B test learnings, people usually make the mistake of only recording tests that produced a lift in conversions, ignoring tests that had inconclusive results or no uplift.


    According to Shiva:

    “Even if there’s no winner in your test, there are always insights. You will know why the test lost and what to avoid next time, and you can always document these findings and communicate them to stakeholders.”

    6. Robust project management 

    A/B testing requires careful management for everything to run smoothly. In a high-velocity testing program, there are many different types of experiments, each needing its own plan. This is why you need a robust project management process.

    According to Khalid:

    “Remember, a single experiment involves many different pieces like CRO analysis, design, implementation, QA, deployment, monitoring, and analysis. These pieces are easy to monitor when you are doing about four experiments a month. It gets a lot more complicated when you cross 10 experiments per month. So, you’d need very detailed project management to make sure that everything gets delivered on time and to keep track of all the different elements happening.”

    Robust project management ensures that there’s rigor in architecting experiments properly so that they fit well within the broader context of your testing program.

    Of course, as you launch more tests in a short space of time, mistakes may emerge, and those issues may result in poor-quality output. But if you have a detailed project management process in place, you mitigate those risks by checking quality at every step along the way.


    A high-velocity testing program is ideal because it enables you to fail fast and learn fast, or to win fast and grow quickly. But it’s important to emphasize that it’s not for every website. You need enough traffic, budget, and a fairly large team to be constantly testing.

    A high-velocity testing program doesn’t mean you should stop tests too soon – make sure your tests run long enough to collect a representative sample before you call them. It’s always important to prioritize quality over quantity.
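    As a rough pre-launch guide to “long enough”, you can estimate the required sample size before starting a test. The sketch below uses a standard normal-approximation formula (two-sided 95% confidence, 80% power); the baseline conversion rate, target lift, and daily traffic figures are hypothetical:

```python
import math

# Rough pre-launch sample-size estimate using a normal approximation
# (z=1.96 for two-sided 95% confidence, z=0.84 for 80% power).
# Baseline rate, target lift, and traffic below are hypothetical examples.

def required_visitors_per_variant(baseline_rate: float, relative_lift: float,
                                  z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Visitors needed in EACH variant to detect the given relative lift."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    delta = p2 - p1
    # Sum of the binomial variances of the two conversion rates.
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / delta ** 2)

# Example: 3% baseline conversion, hoping to detect a 10% relative lift,
# with 5,000 eligible visitors per day split across two variants.
n = required_visitors_per_variant(0.03, 0.10)
days = math.ceil(n * 2 / 5000)
print(f"~{n:,} visitors per variant, roughly {days} days at 5k visitors/day")
```

    Small lifts on low baseline rates demand surprisingly large samples, which is exactly why ending tests early so often produces false winners.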

    And if you are considering building your own high-velocity testing program, be prepared to fail before you succeed – it takes time to build effective and consistent testing. But it’s not impossible; you can achieve it if you follow the tips highlighted in this article.

Simbar Dube

Simba Dube is the Growth Marketing Manager at Invesp. He is passionate about marketing strategy, digital marketing, content marketing, and customer experience optimization.

