A/B and Multivariate Testing

There are two types of tests that a digital marketer uses to optimize creative: A/B or multivariate tests and creative rotation attribution tests.

The first, A/B and multivariate tests, provides insights about the creative elements within an ad. These tests answer questions such as the following: Do price-oriented or feature-oriented ads perform better? Do multi-frame or single-frame ads perform better? Do feature-oriented ads outperform price-oriented ads within any audience segments? Do images improve the performance of price-oriented ads or detract from it? And so on.

Test Planning

Central to planning a test of your display or video creative is ensuring that you generate insights that transfer to future creatives, in as short a time as possible.

A/B tests of digital ads are not like A/B tests of website content or newspaper headlines. The latter are inbound channels in which the primary question is which entire treatment outperforms the others, so that you can decide which treatment to publish on a website or in a news article. Generating insights into why a treatment performs well, based on its content, imagery and stylistic elements, is a lower priority.

In the case of digital ad creative, an outbound channel with a more direct cost to administer tests, the primary question is what can be learned about the performance of individual creative elements - messaging, images, styles - that can be applied to creative design and decisioning now and in the future.

The central constraint in digital creative testing is the time required to run a test, during which traffic is tied up in the test and unavailable for new campaigns and promotions. Adacus reduces testing time using Bayesian A/B test statistics, multivariate test design and offline ad effectiveness tracking (see Pitfalls to Avoid in Creative Testing to learn more).

As a result, the goal of digital creative test planning is efficiency: generating valuable creative insights that transfer to future creative design and decisioning. That efficiency is maximized with the following two test planning tactics.

1. Identify broad creative distinctions to test first

A best practice in A/B testing is to first test coarse-grained differences, then home in on tests of more fine-grained differences.

Testing the difference in performance between messages that reflect differing value propositions - such as a price-oriented vs. a feature-oriented message - will generate results much more quickly than testing the difference between two background colors of a call-to-action button (assuming the ad effectiveness metric is not click-through rate). This is because large differences in outcomes can be detected faster than small differences: the sample size required to detect an effect grows roughly with the inverse square of the effect size, as the sketch below illustrates.
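As a rough illustration, here is a minimal frequentist sample-size calculation (normal approximation, 5% significance, 80% power) using only the Python standard library. The conversion rates are hypothetical, and Adacus itself uses Bayesian statistics, but the scaling intuition - an effect 10x smaller needs roughly 100x the sample - holds either way.

```python
from statistics import NormalDist

def sample_size_per_arm(p1: float, p2: float,
                        alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate per-arm sample size to detect p1 vs. p2
    with a two-sided z-test (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96
    z_power = NormalDist().inv_cdf(power)           # ~0.84
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return round((z_alpha + z_power) ** 2 * variance / (p1 - p2) ** 2)

# Hypothetical baseline: a 2.0% conversion rate.
# Coarse distinction (price vs. feature message): 2.0% vs. 2.5%
print(sample_size_per_arm(0.020, 0.025))    # ~13,800 users per arm
# Fine distinction (button background color): 2.00% vs. 2.05%
print(sample_size_per_arm(0.0200, 0.0205))  # ~1,250,000 users per arm
```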

2. Set up a Multivariate Test Design to ask multiple questions with the same sample

When identifying the distinctions to test, group those that are orthogonal to each other so they can be tested together in a multivariate test design. For example, messaging and imagery can be combined into a 2x2 test, in which the A/B test results are aggregated by message and, separately, by image. This effectively reuses the same sample for multiple A/B tests, as the sketch below shows.
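A minimal sketch of that marginal aggregation, with invented impression and conversion counts for the four cells of a 2x2 message-by-image design:

```python
from collections import defaultdict

# Hypothetical per-cell results from a 2x2 message x image test.
cells = [
    {"message": "price",   "image": "none",   "impressions": 50_000, "conversions": 1_050},
    {"message": "price",   "image": "family", "impressions": 50_000, "conversions": 1_180},
    {"message": "feature", "image": "none",   "impressions": 50_000, "conversions":   940},
    {"message": "feature", "image": "family", "impressions": 50_000, "conversions": 1_020},
]

def marginals(cells, factor):
    """Aggregate the 2x2 cells along one factor, reusing the
    full sample for a separate A/B comparison of that factor."""
    totals = defaultdict(lambda: [0, 0])  # level -> [impressions, conversions]
    for cell in cells:
        totals[cell[factor]][0] += cell["impressions"]
        totals[cell[factor]][1] += cell["conversions"]
    return {level: conv / imp for level, (imp, conv) in totals.items()}

print(marginals(cells, "message"))  # price vs. feature, all 200k impressions
print(marginals(cells, "image"))    # none vs. family, the same 200k impressions
```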

Test Setup

Setting up A/B and multivariate tests in Adacus is incredibly easy. Unlike other ad servers, which require you to create split audience segments and traffic separate creatives to them one by one, Adacus supports testing natively in one simple interface.

Simply create a separate Rotation for each creative, create a new Campaign and click Assign Creative Rotation(s).

In this example, a 2x2 test compares the performance of two messages - Premium Feature vs. Free First Month - and two images - No Image vs. Family Image. As a result, four single-creative rotations are selected for the top (and only) node of the Campaign's Decision Tree.

Adacus automatically splits the percentage of users assigned to each creative rotation equally.
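For intuition, an even split like this is often implemented with deterministic hashing, so each user lands in the same rotation on every impression. The sketch below is a generic illustration of that technique, not a description of Adacus's internal mechanism:

```python
import hashlib

def assign_rotation(user_id: str, rotations: list[str]) -> str:
    """Deterministically map a user to one of N rotations, giving
    each rotation an approximately equal share of users."""
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return rotations[int(digest, 16) % len(rotations)]

rotations = ["premium/no-image", "premium/family",
             "free-month/no-image", "free-month/family"]
print(assign_rotation("user-12345", rotations))
```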

Once the campaign is saved, simply generate the campaign-specific ad tag for the appropriate DSP (or publisher) and you are done.

Analyzing Test Results

Adacus does the statistics for you via an A/B Test Results dashboard that makes it easy to glean creative insights, present results and make decisions based on all of the information.

Simply go to the Analytics tab of Adacus and select the Campaign for which you want to view test results. Below is an example of an A/B/C/D test that compared four messages - Discount, Family, Multicultural and Premium - against a Control message.

[Screenshot: A/B Test Results dashboard on the Analytics tab]
  1. The line chart at the top of the dashboard shows cumulative conversion rates by creative variation over time. Initially, the Discount and Family messages appeared to outperform the other three, but, as is often the case in A/B tests, there was a "regression to the mean" over the course of the test.
  2. The first bar chart column shows the Chance to Beat the control for the four test messages. Whenever the Chance to Beat exceeds 80%, the bar chart flags it with light blue conditional formatting. In this example, no message sufficiently outperformed the control among all users, but when drilling down within audience segments, certain messages did outperform the control. For example, the Family message outperformed the control for users in households with a child.
  3. It's all well and good to know the chance that one creative will outperform another, but that doesn't tell us how much is at risk when we are wrong. That's what the Expected Loss tells us in the dashboard. If the Chance to Beat is 90%, how much conversion rate would we expect to lose, on average, during the 10% of the time we are wrong? In the case of the Family message, the Chance to Beat the control is 78.8% and the Expected Loss is 0.71%. That means we expect the Control to outperform Family 21.2% of the time, but with an average loss in conversions of only 0.71%. By comparison, during the 78.8% of the time that the Family message outperforms the control, the expected gain in conversions is 4.95%. That is a risk many digital marketers would be comfortable taking.
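For readers who want to see the mechanics, Chance to Beat and Expected Loss are standard outputs of a Bayesian A/B test and can be estimated by Monte Carlo sampling from Beta posteriors. The sketch below uses invented conversion counts and one common formulation (uniform priors, average relative loss conditional on being wrong); Adacus's exact priors and loss definition may differ.

```python
import numpy as np

rng = np.random.default_rng(42)

def ab_metrics(conv_v, n_v, conv_c, n_c, draws=200_000):
    """Estimate Chance to Beat and Expected Loss for a variant vs.
    a control, using Beta(1, 1) priors on the conversion rates."""
    variant = rng.beta(1 + conv_v, 1 + n_v - conv_v, draws)
    control = rng.beta(1 + conv_c, 1 + n_c - conv_c, draws)
    chance_to_beat = (variant > control).mean()
    # Average relative shortfall over the draws where the control wins:
    # "how much we lose, on average, during the time we are wrong."
    wrong = control > variant
    expected_loss = ((control[wrong] - variant[wrong]) / control[wrong]).mean()
    return chance_to_beat, expected_loss

# Hypothetical counts: Family message vs. Control.
ctb, loss = ab_metrics(conv_v=1_180, n_v=50_000, conv_c=1_120, n_c=50_000)
print(f"Chance to Beat: {ctb:.1%}, Expected Loss: {loss:.2%}")
```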
