How to A/B test Catalog Ads with Advantage+

A method for A/B testing Catalog Ads on Facebook with Advantage+ campaigns.

Table of Contents

1. Save variants in Confect
2. Create a new Advantage+ campaign
3. Duplicate the campaign and change the design
4. Start the A/B test and select campaigns
5. Wait for the winner
6. Apply learnings

In the majority of cases, we recommend using manual campaigns to set up an A/B test instead. The only way to conduct a reliable A/B test with Advantage+ campaigns is to create an individual campaign for each variation you're testing, whereas a single manual campaign can do the same job.

Why isn't it recommended to A/B test with Advantage+ campaigns?

1. A proper test with Advantage+ campaigns requires one campaign per creative tested. This can quickly add complexity and become overwhelming.

2. A common mistake is adding multiple ads using different designs into ONE Advantage+ campaign. This leads Meta to distribute the budget unequally, so not every design gets a fair chance to deliver.

If you'd rather use a manual campaign, click here to read the guide.

1. Save variants in Confect

The first step is to create the individual designs you'll be using for A/B testing, and add them to the Confect catalog as design variants.

If you haven't yet done so, click here and follow STEP 1 of this guide.

⚠️ IMPORTANT It is very important that all creatives in the test, including your plain "packshot" design, are added to the same product feed as design variants.

Do not A/B test multiple catalogs - the older catalog will have an advantage in the Pixel learning data, making the test unreliable.

If this is your first time customising your Catalog Ads and you want to test the difference between standard, unedited pictures and Confect designs, we recommend creating these designs:

  • Standard "packshot" design (the design includes only the product on a white background)
  • Confect design #1
  • Confect design #2
  • Confect design #3


2. Create a new Advantage+ campaign

The next step is to create the campaigns that will be used for this A/B test. Click "Create" in Ads Manager, select "Sales" and then "Advantage+ shopping campaign".

After creating it, we recommend giving it a recognisable name (e.g. "Advantage+ A/B Test #1").
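If you prefer scripting this setup over clicking through Ads Manager, the same campaign can be created with Meta's Marketing API. Below is a minimal sketch using the official facebook_business Python SDK; the access token, account ID, and name are placeholders, and the smart_promotion_type value is our assumption based on recent API versions, so verify it against Meta's current documentation.

```python
# Minimal sketch: create an Advantage+ shopping campaign via the Meta
# Marketing API. The token and account ID are placeholders you supply.
from facebook_business.api import FacebookAdsApi
from facebook_business.adobjects.adaccount import AdAccount

FacebookAdsApi.init(access_token="YOUR_ACCESS_TOKEN")

account = AdAccount("act_YOUR_AD_ACCOUNT_ID")
campaign = account.create_campaign(params={
    "name": "Advantage+ A/B Test #1",
    "objective": "OUTCOME_SALES",
    # Assumed flag marking the campaign as Advantage+ shopping; enum
    # values can change between API versions, so check Meta's docs.
    "smart_promotion_type": "AUTOMATED_SHOPPING_ADS",
    "special_ad_categories": [],
    "status": "PAUSED",  # keep paused until the whole test is wired up
})
print("Created campaign:", campaign["id"])
```

Keeping the campaign paused until all variants are set up prevents any of them from getting a head start in the Learning Phase.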

3. Duplicate the campaign and change the design

Since you've decided to use Advantage+ campaigns, you'll have to duplicate the entire campaign (not just the ad) once for each variation you're testing.

Click on the campaign and duplicate it (using "Quick duplicate" or CTRL+D).

This will open a new campaign. Rename it so you can keep track, and repeat the duplication until you have one campaign per variant.

For our example, since we're testing one unedited image and three Confect designs, we'll need to duplicate this campaign 3 times, so that we end up with 4 individual campaigns.
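The duplication can also be scripted. This sketch uses the campaign /copies endpoint via the same Python SDK; deep_copy, status_option, and rename_options are parameters of Meta's copy API, but treat the exact values as assumptions to verify, and the variant names are just examples matching this guide.

```python
# Minimal sketch: duplicate one Advantage+ campaign three times, once
# per extra variant, using the Marketing API's campaign copy endpoint.
from facebook_business.api import FacebookAdsApi
from facebook_business.adobjects.campaign import Campaign

FacebookAdsApi.init(access_token="YOUR_ACCESS_TOKEN")

source = Campaign("SOURCE_CAMPAIGN_ID")  # the campaign from step 2
for variant in ["Confect design #1", "Confect design #2", "Confect design #3"]:
    copy = source.create_copy(params={
        "deep_copy": True,          # copy ad sets and ads as well
        "status_option": "PAUSED",  # don't spend until everything is ready
        "rename_options": {"rename_suffix": f" - {variant}"},
    })
    print("Created copy:", copy)
```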

Once you've duplicated the campaign, you'll need to update the catalog design variants used in each campaign. To do that, just follow STEPS 2 and 3 of this guide. Go to each campaign, and make sure the right design variants are selected.

⚠️ IMPORTANT It is important that all the campaigns in the test - including your "original variant" - are 100% new.

Otherwise the old campaign will have a clear advantage, as it won't have to go through the Learning Phase again.

For your own sake, we recommend giving the ads of each variant the same names as your variants in Confect.

4. Start the A/B test and select campaigns

Click the "A/B Test" button (you might have to click "Edit" to find it if it's not visible) to specify the campaign and set up your test.

This will open a window where you'll have to specify the campaigns to be used.

Here it's important to click "Select two existing ads" and then "Next".

In the next window, make sure to select "Existing campaigns".

Then, you'll have to search for each of your Advantage+ campaigns. That's why it's so valuable to name them properly in the previous steps. After you're done, click "Next".

For the key metric, we recommend "Cost per result", as this is among the most important metrics for eCommerce companies. However, this can change based on your advertising objectives (e.g. clicks, leads, etc.).

We recommend running the test for a MINIMUM of 2 weeks, and not drawing conclusions from tests with fewer than 30 results per variant.
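To sanity-check whether your budget can realistically hit those thresholds, a quick back-of-the-envelope calculation helps. The figures below are hypothetical placeholders - swap in your own daily budget and recent cost per result:

```python
# Rough planning check: can each variant reach ~30 results within the
# test window? All figures below are hypothetical examples.
DAILY_BUDGET_PER_CAMPAIGN = 50.0   # e.g. EUR per day, per variant campaign
EXPECTED_COST_PER_RESULT = 20.0    # e.g. your account's recent average
MIN_RESULTS_PER_VARIANT = 30
MIN_TEST_DAYS = 14                 # the 2-week minimum recommended above

results_per_day = DAILY_BUDGET_PER_CAMPAIGN / EXPECTED_COST_PER_RESULT
days_to_threshold = MIN_RESULTS_PER_VARIANT / results_per_day
planned_days = max(MIN_TEST_DAYS, days_to_threshold)

print(f"~{results_per_day:.1f} results/day per variant")
print(f"Run the test for at least {planned_days:.0f} days")
```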

Once the A/B test has been set up, just click "Publish Test" to save your changes and begin testing.

5. Wait for the winner

Now the waiting game begins.

After the testing period has come to an end, compare the key metric(s) between the campaigns to find your winner.

Since all the creatives come from the same catalog, with the same degree of optimisation (i.e. which customers are matched with which products), the only variable creating the difference will be the design itself.
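If you export spend and results per campaign from Ads Manager, the comparison is straightforward to compute. The numbers below are invented purely for illustration:

```python
# Compare variants on cost per result; all spend/result numbers here
# are made-up illustrations, not real benchmarks.
variants = {
    "Packshot (unedited)": {"spend": 700.0, "results": 32},
    "Confect design #1":   {"spend": 700.0, "results": 41},
    "Confect design #2":   {"spend": 700.0, "results": 35},
    "Confect design #3":   {"spend": 700.0, "results": 30},
}

for name, v in variants.items():
    v["cost_per_result"] = v["spend"] / v["results"]
    print(f"{name}: {v['cost_per_result']:.2f} per result ({v['results']} results)")

winner = min(variants, key=lambda n: variants[n]["cost_per_result"])
print("Winner on cost per result:", winner)
# Only draw conclusions once every variant has at least ~30 results.
```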

6. Apply learnings

Lastly, after finding your winning design, it's good practice to reflect on which design choices correlate with higher performance.

You might, for example, find that the main difference between your best and worst performing designs is that the winning variant uses your logo, while the losing one doesn't. In that case, it's a good sign that your target audience responds well to seeing your logo in the ad, and you should base your future designs on this learning.

Remember that the more variation there is between the designs (e.g. different backgrounds, text, elements), the more difficult it will be to judge exactly which change caused the higher, or lower, performance. Still, it can give you a feel for what type of creative makes your audience buy.