
A/B testing your bundles

Learn how to set up A/B tests for your bundles, compare different configurations, and pick a winner based on real performance data.

Written by Atlas Team
Updated this week

A/B testing lets you run multiple versions of the same bundle simultaneously and compare their performance. Traffic is automatically split evenly between variants, so you can test different offer structures, pricing, designs, or copy and let the data tell you which version converts best.

In this article:

  • How A/B testing works

  • Enabling A/B testing

  • Creating variants

  • What you can test

  • How traffic is split

  • Monitoring your A/B test

  • Ending the test and picking a winner

How A/B testing works

When A/B testing is enabled, each visitor to a product page is randomly shown one of your bundle variants. The traffic is split evenly across all variants (50/50 for two variants, 33/33/33 for three, and so on). Each variant is tracked independently so you can compare impressions, clicks, click rates, added revenue, and revenue per visitor side by side.

Once you've collected enough data to see a clear winner, you end the test and choose the winning variant. The losing variants are deleted, and all traffic is directed to the winner going forward.


Enabling A/B testing

Before you can enable A/B testing, you need to save your bundle with products selected. A/B testing is not available on bundles that haven't been saved yet or that don't have products assigned.

Once your bundle is saved with products, you'll see an Enable A/B testing button at the top of the left sidebar in the bundle editor. Click it to activate A/B testing.

When enabled, the left sidebar changes to show an A/B testing variants section at the top with "Variant A" as your current bundle configuration. Everything you had already set up becomes Variant A automatically.


Creating variants

After enabling A/B testing, click Add variant in the A/B testing variants section to create a new variant. You can create up to 4 variants total (Variant A, B, C, and D).

Each variant gets its own complete bundle editor. Click a variant name in the sidebar to switch to it. When you switch, the entire center panel and preview update to show that variant's configuration.

To remove a variant, click the trash icon next to its name in the sidebar.


What you can test

Each variant has its own fully independent bundle editor. This means you can change anything between variants, including:

  • Offer structure (number of offers, pricing types, discount levels)

  • Offer titles, subtitles, and badges

  • Bundle title and layout

  • Colors and text styling

  • Countdown timer settings

  • Subscription configuration

  • Progressive gifts

  • Upsells and gifts on individual offers

There are no restrictions on what can differ between variants. You can make small changes (like testing two different badge labels) or completely different bundle configurations.

💡 For the most useful results, change only one or two things between variants. If Variant A and Variant B are completely different, you'll know which one won but you won't know why. Testing one variable at a time (like "10% discount vs. 15% discount" or "countdown timer on vs. off") gives you actionable insights.


How traffic is split

Traffic is split automatically and evenly across all active variants. You don't need to set a split percentage. If you have two variants, each gets 50% of visitors. If you have three, each gets roughly 33%. With four variants, each gets 25%.

The split is randomized per visitor, so each person who loads a product page with the A/B test active is randomly assigned one of the variants.
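The per-visitor assignment described above amounts to a uniform random pick across the active variants. Here is a minimal sketch of that idea (a hypothetical illustration only, not the app's actual implementation):

```python
import random

def assign_variant(variants):
    """Randomly assign a visitor one of the active variants.

    With N variants, each has a 1/N chance of being shown,
    which produces the even 50/50, 33/33/33, or 25/25/25/25 split.
    """
    return random.choice(variants)

# Example: a two-variant test
variants = ["Variant A", "Variant B"]
chosen = assign_variant(variants)
```

Over many visitors, each variant receives roughly the same share of traffic, which is what makes the per-variant metrics directly comparable.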


Monitoring your A/B test

Bundles with an active A/B test are marked with an A/B Test label on the Bundle Deals dashboard, next to their status badge (Active or Draft).

To view detailed performance data for each variant, click the End A/B Test button on the dashboard row for that bundle. This opens the analytics modal showing per-variant metrics:

  • Impressions: How many times the variant was shown to visitors.

  • Clicks: How many visitors clicked on the bundle offer.

  • Click Rate: Clicks divided by impressions.

  • Added Revenue: Additional revenue generated beyond the base product price.

  • Revenue Per Visitor: Added revenue divided by impressions.

ℹ️ Let your test run long enough to collect meaningful data before making a decision. A few days of traffic is usually not enough. The more visitors each variant receives, the more confident you can be that the results reflect a real difference and not random variation.
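As a rough way to judge "enough data", a standard two-proportion z-test can estimate whether a click-rate gap is likely real or just noise. This is a general statistics sketch under assumed sample numbers, not a feature of the app:

```python
import math

def z_score(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-score comparing two variants' click rates.

    |z| above ~1.96 corresponds to roughly 95% confidence that the
    difference reflects a real effect rather than random variation.
    """
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    return (p_a - p_b) / se

# 2,000 impressions each: 7.5% vs. 6.0% click rate
z = z_score(150, 2000, 120, 2000)   # z ≈ 1.89, just short of 95% confidence
```

In this example the gap looks promising but hasn't cleared the usual significance bar yet, which is exactly the situation where letting the test run longer pays off.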


Ending the test and picking a winner

When you're ready to end the test, click the End A/B Test button on the dashboard for the bundle. This opens the analytics modal showing all variant performance data.

Click Choose next to the variant you want to keep. Once you select your winning variant, all other variants are permanently deleted, and all future traffic is directed to the winner.

⚠️ Choosing a winner is permanent. All other variants are deleted and cannot be recovered. Make sure you've reviewed the data carefully before making your selection.

After ending the test, the bundle returns to a normal, single-variant state with the winning configuration. The A/B Test label is removed from the dashboard and the bundle continues running as a standard bundle.


Next steps

With A/B testing covered, see Adding upsells and gifts to your offers to learn how to attach per-offer add-ons that increase your average order value.
