In most cases, we recommend using a manual campaign to set up an A/B test instead. The only way to run a proper A/B test with Advantage+ campaigns is to create a separate campaign for each variation you're testing (which can become overwhelming with 3 or more variations), whereas a single manual campaign can handle all of them.
If you'd rather use a manual campaign, click here to read the guide.
The first step is to create the individual designs you'll be using for A/B testing, and add them to the Confect catalog as design variants.
If you haven't yet done so, click here and follow STEP 1 of this guide.
If this is your first time customising your Catalog Ads and you want to test the difference between standard, unedited pictures and Confect designs, we recommend creating designs like these, for example:
The next step is to create the campaigns that will be used for this A/B test. Click "Create" in Ads Manager, select "Sales" and then "Advantage+ shopping campaign".
After creating it, we recommend giving it a descriptive name (e.g. Advantage+ A/B Test #1).
Since you've decided to use Advantage+ campaigns, you'll have to duplicate the entire campaign (not just the ad) once for each variation you're testing.
Click on the campaign and duplicate it (using "Quick duplicate" or CTRL+D).
This will open a new campaign. Rename it so you can keep track of it, then repeat the duplication until you have one campaign per variant.
For our example, since we're testing one unedited image and three Confect designs, we'll need to duplicate this campaign 3 times, so that we end up with 4 individual campaigns.
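If you'd rather script the duplication than click "Quick duplicate" several times, the Meta Marketing API exposes a /copies edge on campaigns. The sketch below is a minimal, hedged example in Python using the requests library; the access token, campaign ID, API version and the exact parameter names are assumptions on our part, so double-check them against the current Marketing API reference before running anything.

```python
import requests

# Hypothetical placeholders -- replace with your own account details.
ACCESS_TOKEN = "YOUR_SYSTEM_USER_TOKEN"
CAMPAIGN_ID = "1234567890"   # the Advantage+ campaign you just created
API_VERSION = "v19.0"        # assumed version; use whichever you target
N_EXTRA_VARIANTS = 3         # 3 duplicates -> 4 campaigns in total

copies_url = f"https://graph.facebook.com/{API_VERSION}/{CAMPAIGN_ID}/copies"

for i in range(N_EXTRA_VARIANTS):
    # deep_copy copies the ad sets and ads inside the campaign as well;
    # status_option keeps the copies paused until you've renamed them.
    response = requests.post(
        copies_url,
        data={
            "deep_copy": "true",
            "status_option": "PAUSED",
            "access_token": ACCESS_TOKEN,
        },
    )
    response.raise_for_status()
    print(f"Duplicate #{i + 1} created:", response.json())
```

Whether you duplicate by hand or via a script, the end result should be the same: one paused campaign per design variant, ready to be renamed and edited.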
Once you've duplicated the campaigns, you'll need to update the catalog design variants used in each one. To do that, just follow STEPS 2 and 3 of this guide: go to each campaign and make sure the right design variant is selected.
For your own sake, we recommend giving the ads in each campaign the same name as the corresponding variant in Confect.
Click the "A/B Test" button (you might have to click "Edit" to find it if it's not visible) to specify the campaign and set up your test.
This will open a window where you'll have to specify the campaigns to be used.
Here it's important to click "Select two existing ads" and then "Next".
In the next window, make sure to select "Existing campaigns".
Then, search for each of your Advantage+ campaigns; this is why naming them properly in the previous steps is so valuable. After you're done, click "Next".
For the key metric, we recommend "Cost per result", as it's among the most important metrics for eCommerce companies. However, this can change based on your advertising objectives (e.g. clicks, leads, etc.).
We recommend running the test for at least 2 weeks, and not drawing conclusions from tests with fewer than 30 results per variant.
Once the A/B test has been set up, just click "Publish Test" to save your changes and begin testing.
Now the waiting game begins.
After the testing period has come to an end, compare the key metric(s) between the campaigns to find your winner.
Since all the creatives come from the same catalog and receive the same degree of optimisation (which customers are shown which products), the only variable driving the difference is the design itself.
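If you'd like to double-check the comparison outside Ads Manager, a quick calculation like the sketch below makes the earlier rule of thumb concrete: skip any variant with fewer than 30 results, and among the rest, the lowest cost per result wins. This is plain Python with entirely hypothetical spend and result figures, just to illustrate the arithmetic.

```python
MIN_RESULTS = 30  # don't conclude on variants with fewer results than this

# Hypothetical spend and result counts, read off Ads Manager per campaign.
variants = {
    "Unedited images":   {"spend": 412.50, "results": 38},
    "Confect design #1": {"spend": 398.20, "results": 45},
    "Confect design #2": {"spend": 405.00, "results": 29},
    "Confect design #3": {"spend": 410.75, "results": 41},
}

for name, data in variants.items():
    cost_per_result = data["spend"] / data["results"]
    status = "OK" if data["results"] >= MIN_RESULTS else f"needs more data (<{MIN_RESULTS} results)"
    print(f"{name}: {cost_per_result:.2f} per result -- {status}")

# Only compare variants that cleared the minimum-results threshold.
eligible = {n: d for n, d in variants.items() if d["results"] >= MIN_RESULTS}
winner = min(eligible, key=lambda n: eligible[n]["spend"] / eligible[n]["results"])
print(f"Current front-runner: {winner}")
```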
Lastly, after finding your winning design, it's good practice to reflect on which design choices correlate with higher performance.
You might, for example, find that the main difference between your best and worst performing designs is that the winning variant uses your logo while the losing one doesn't. In that case, it's a good sign that your target audience responds well to seeing your logo in the ad, and you should base your future designs on this learning.
Remember that the more variation there is between the designs (e.g. different backgrounds, text, elements), the more difficult it will be to judge exactly which change caused the higher or lower performance. Still, it can give you a feel for what type of creative makes your audience buy.