A/B test Catalog Ads in Facebook

A guide on how to create an A/B test in Facebook.

Table of Contents

- Save variants in Confect
- Make a Manual sales campaign
- Create one ad set per design
- Change the designs
- Start the A/B test
- Apply learnings

One of the biggest benefits of using multiple designs in a single catalog is that you can A/B test with ease.

With different designs for each product, you can select which creative to use on the ad level.

Summary:
Add multiple design variants to your catalog and create one new ad per tested variant. Run these ads through Meta's A/B test tool.

Steps:
- Add 3+ design variants to your Confect catalog.
- Download our Chrome plugin and log in.
- Create a new Manual sales campaign (just for this test).
- Create one ad set and ad per variant you're testing.
- Switch each ad to a unique variant.
- Use Meta's A/B test tool on this campaign.

We recommend testing multiple designs. Using at least 3 different designs gives you a higher chance of success and helps you learn which design choices work for your audience.

You can find inspiration for what to test here.


Save variants in Confect

The first step is to create the individual designs you'll be using for A/B testing, and add them to the catalog as design variants.

If you haven't yet done so, click here and follow STEP 1 of this guide.

Useful tip 💡
If you don't want to affect your existing catalog ads, set the "Primary" design to a blank design that includes only the product layer (see example below).

Important ⚠️
Test all variants (incl. original) in the same catalog to avoid inaccurate results. Never use multiple catalogs for A/B testing, as an older catalog with more pixel data will always outperform a newer one.

If this is your first time testing Catalog Ads, and you want to test the difference between the original pictures and Confect designs, your setup could look like this: 2 to 4 Confect designs and 1 "Blank template" (a design that only includes the product layer).



Make a Manual sales campaign

The next step is to create a NEW campaign for this A/B test. Click "Create" in Ads Manager, select "Sales", and then "Manual sales campaign".

Using a manual campaign allows us to create multiple ad sets and test between them.
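
The steps above use the Ads Manager UI, but the same campaign can be scripted. Here is a minimal sketch using Meta's Marketing API Python SDK (facebook_business); the access token, account ID, and campaign name are placeholders to fill in:

```python
from facebook_business.api import FacebookAdsApi
from facebook_business.adobjects.adaccount import AdAccount

FacebookAdsApi.init(access_token="<ACCESS_TOKEN>")
account = AdAccount("act_<AD_ACCOUNT_ID>")

# A manual sales campaign, created paused so nothing spends before the
# ad sets and the A/B test are in place.
campaign = account.create_campaign(params={
    "name": "Confect A/B test",
    "objective": "OUTCOME_SALES",
    "status": "PAUSED",
    "special_ad_categories": [],  # required even when no special category applies
})
print("Campaign ID:", campaign["id"])
```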

Important ⚠️
We highly recommend performing the A/B test using a manual sales campaign. This lowers the complexity and helps avoid common pitfalls.
You may use the desired design in an Advantage+ campaign after completing the test.

If you have reasons to use an Advantage+ campaign for testing, click here to see the guide (needs to be followed precisely).


Create one ad set per design

A new campaign starts with one ad set and one ad. Duplicate the ad set using "Quick duplicate" (Ctrl+D).

Duplicate your ad sets so that you have one per design you're testing.

For our example (1 original, 4 designs), we'll need to duplicate until we have 5 individual ad sets.
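
If you manage campaigns through the Marketing API, the copies endpoint does the same duplication. A rough sketch with a placeholder ad set ID; the copy parameters (especially the rename options) should be verified against your API version:

```python
from facebook_business.adobjects.adset import AdSet

template = AdSet("<TEMPLATE_AD_SET_ID>")  # the ad set created with the campaign

# One name per design variant; the template itself will hold the first one.
variant_names = ["Blank template", "Design A", "Design B", "Design C", "Design D"]

# Create one copy per remaining variant. deep_copy=True also copies the ads
# inside the ad set; everything is created paused.
for name in variant_names[1:]:
    response = template.create_copy(params={
        "deep_copy": True,
        "status_option": "PAUSED",
        "rename_options": {"rename_suffix": f" / {name}"},
    })
    print("Created copy:", response)
```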

Important ⚠️
All ads and ad sets in the test, including the original variant, must be entirely new; otherwise, older ads that have already completed the Learning Phase get an unfair advantage.


Change the designs

Once you've duplicated the ad sets, you'll need to change the design variants in each ad.

It's useful to rename each ad and ad set after the design variant it uses.

Go to each ad, and make sure the right design variants are used.
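
In the UI, this variant switch is done per ad with Confect's Chrome plugin. If you script your ads through the Marketing API instead, the equivalent step is attaching a distinct creative to each ad set's ad. A sketch; the ad set and creative IDs are hypothetical placeholders:

```python
from facebook_business.adobjects.adaccount import AdAccount

account = AdAccount("act_<AD_ACCOUNT_ID>")

# One (ad set, creative) pair per design variant; all IDs are placeholders.
variants = {
    "<AD_SET_ID_BLANK>": ("Blank template", "<CREATIVE_ID_BLANK>"),
    "<AD_SET_ID_DESIGN_A>": ("Design A", "<CREATIVE_ID_DESIGN_A>"),
}

for adset_id, (variant_name, creative_id) in variants.items():
    ad = account.create_ad(params={
        "name": f"Catalog ad / {variant_name}",  # name the ad after its variant
        "adset_id": adset_id,
        "creative": {"creative_id": creative_id},
        "status": "PAUSED",
    })
    print("Created ad:", ad["id"])
```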


Start the A/B test

All that's left to do is select the ad sets you'll be using in this test (all the ad sets in this new campaign).

With ad sets selected, click the "A/B Test" button (you might have to click "Edit" to see it).

It's important to pick "Existing ad sets" for comparison (this should be selected by default, but make sure it is correct).

For the key metric, we recommend using "Cost per result"; however, this can change based on your advertising objectives (e.g., clicks, leads, etc.).

Once the A/B test has been set up, just click "Publish Test" to save your changes and begin testing.
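
For completeness, Meta's Marketing API exposes A/B tests as ad studies of type SPLIT_TEST, with one cell per ad set. This is a sketch with placeholder IDs; the ad_studies edge and its cell parameters should be checked against the current API docs:

```python
import time
from facebook_business.adobjects.business import Business

adset_ids = ["<AD_SET_ID_1>", "<AD_SET_ID_2>", "<AD_SET_ID_3>"]

# Split traffic evenly; nudge the first cell so the percentages sum to 100.
shares = [100 // len(adset_ids)] * len(adset_ids)
shares[0] += 100 - sum(shares)

cells = [
    {"name": f"Variant {i + 1}", "treatment_percentage": pct, "adsets": [adset_id]}
    for i, (adset_id, pct) in enumerate(zip(adset_ids, shares))
]

study = Business("<BUSINESS_ID>").create_ad_study(params={
    "name": "Confect design split test",
    "type": "SPLIT_TEST",
    "start_time": int(time.time()),
    "end_time": int(time.time()) + 14 * 24 * 3600,  # the 2-week minimum below
    "cells": cells,
})
print("Ad study ID:", study["id"])
```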

We recommend running the test for a MINIMUM of 2 weeks and not drawing conclusions from tests with fewer than 30 results per variant.
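
The 30-result floor is a rule of thumb. If you want a quick statistical sanity check before calling a winner, a standard two-proportion z-test (plain Python, illustrative numbers only) shows how easily small result counts can be noise:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for H0: both variants convert at the same rate."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_a / n_a - conv_b / n_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Illustrative numbers: 38 results from 21,000 impressions vs. 31 from 20,500.
# Both variants clear the 30-result floor, but p is roughly 0.46, so the
# difference could easily be noise; keep the test running.
print(round(two_proportion_p_value(38, 21_000, 31, 20_500), 2))
```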


Apply learnings

After the testing period ends, compare the key metric(s) between the ad sets to find your winner.

Since all the creatives come from the same catalog, the only variable creating the difference will be the design itself.
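
If you prefer pulling the numbers programmatically, each ad set's spend and results are available through the Insights API. A sketch assuming purchases are your result metric; swap the action_type for whatever your ads optimize toward:

```python
from facebook_business.adobjects.adset import AdSet

adset_ids = ["<AD_SET_ID_1>", "<AD_SET_ID_2>"]  # one per variant

for adset_id in adset_ids:
    for row in AdSet(adset_id).get_insights(
        fields=["adset_name", "spend", "actions"],
        params={"date_preset": "maximum"},  # whole lifetime of the ad set
    ):
        data = row.export_all_data()
        purchases = sum(
            float(a["value"])
            for a in data.get("actions", [])
            if a["action_type"] == "purchase"  # assumption: purchases are the result
        )
        spend = float(data["spend"])
        cpr = spend / purchases if purchases else float("inf")
        print(f'{data["adset_name"]}: {purchases:.0f} results, cost per result {cpr:.2f}')
```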

After finding your winning design, reflect on which design choices led to higher performance.

You might, for example, find that the main difference between your best and worst performing designs is that the winning variant uses your logo, while the losing one doesn't.

In that case, it's a good sign that your target audience responds well to seeing your logo in the ad, and you should base your future designs on this learning.