Why Can CPMs Be Two or Three Times Higher in Meta A/B Tests? Here's What's Happening

Running A/B tests on Meta is an essential part of optimizing your advertising campaigns, but what happens when your CPMs (Cost Per Thousand Impressions) skyrocket? You’ve used the same creative and the same targeting, but your CPMs are suddenly two or three times higher than usual. Sound familiar?

Don’t panic! You’re not alone, and the good news is that there’s a logical explanation behind these sudden (but temporary) cost hikes. In this article, we’ll break down why this happens, why it’s normal, and what you can do about it.
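To see how quickly inefficient delivery inflates this metric, remember that CPM is just spend divided by impressions, times a thousand. The dollar figures below are made up purely for illustration:

```python
def cpm(spend: float, impressions: int) -> float:
    """Cost per thousand impressions."""
    return spend / impressions * 1000

# Regular, fully optimized campaign: $500 buys 100,000 impressions.
regular_cpm = cpm(500, 100_000)   # $5.00

# During a test, the same $500 buys only 40,000 impressions
# while the algorithm relearns how to deliver the ad.
test_cpm = cpm(500, 40_000)       # $12.50 -- 2.5x higher
```

Nothing about the ad changed; only delivery efficiency did, and that alone is enough to double or triple your CPM.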

Keep It Simple

A/B testing lets you explore a variety of variables to refine your ad performance, but the key to success is keeping things simple. Best practice in any Meta test is to change one variable at a time so you get results you can confidently act on. Add too many variables and the data gets muddled, making it difficult to tell what’s driving the changes.

For this article, we’ll focus on tests that maintain a consistent audience, conversion goal, and budget. For example, imagine a test where traffic is split 50/50 between two different landing experiences. Everything is identical except for the landing page itself. This setup will help us explore the underlying reasons behind the surge in CPMs.

Everything is Different in a Test


First and foremost, remember this: when you run an A/B test, Meta treats it as an entirely new scenario, even if you’re using an ad that has previously performed well. The platform doesn’t carry over history or past performance metrics from previous runs of the same ad; it starts from scratch. That means all the optimization built up over time for your ad resets during the A/B test.

Because of this, the numbers you see during testing are going to be different. This makes direct comparisons between A/B tests and regular campaigns a trap that can lead to frustration and confusion.

The Comparison Trap


It’s tempting to compare the performance of an A/B test to a regular campaign, especially if you’re using the same creative and targeting that typically delivers solid results. But this comparison can be misleading. If you’ve experienced a spike in CPMs and a drop in ROAS (Return on Ad Spend) during an A/B test, don’t be discouraged—it’s normal.

In an A/B test, Meta is essentially running a brand-new campaign from scratch, which impacts the auction dynamics and optimizations. So, what causes those CPM spikes? Let’s dive into the key factors.

1. The Learning Phase Is Messing with Your CPMs


Meta's algorithm uses a "learning phase" to optimize ad delivery. During this phase, the system experiments with different strategies to find the most effective way to deliver your ads. Even though you might be using the same creative and audience as before, the learning phase resets during an A/B test. Both variants of your ad start from square one.

The algorithm is gathering data to determine which placements, audiences, and strategies work best, but until it has sufficient information, ad delivery is often inefficient. This inefficiency typically leads to a temporary spike in CPMs.

Pro Tip: Allow the test to run for at least seven days to give Meta enough time to exit the learning phase and optimize delivery. The longer the test runs, the more time Meta has to find a cost-efficient strategy.
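Meta’s documentation ties the learning phase to roughly 50 optimization events (e.g., conversions) per ad set within about a week. Here’s a rough back-of-the-envelope check; the threshold default and the budget/CPA figures are assumptions you should swap for your own numbers:

```python
LEARNING_EVENTS_THRESHOLD = 50  # Meta's rough weekly per-ad-set target

def can_exit_learning(weekly_budget: float, expected_cpa: float,
                      threshold: int = LEARNING_EVENTS_THRESHOLD) -> bool:
    """Estimate whether a weekly budget buys enough optimization
    events (conversions) to clear the learning-phase threshold."""
    expected_events = weekly_budget / expected_cpa
    return expected_events >= threshold

# A $1,400/week budget at a $20 CPA yields ~70 events: enough.
print(can_exit_learning(1_400, 20))   # True

# Split 50/50 in a test, each variant gets $700 -> ~35 events: stuck.
print(can_exit_learning(700, 20))     # False
```

Note the catch for A/B tests: each variant has to clear the threshold on its own, so the same total budget that comfortably exits learning as one campaign can leave both test arms stuck in it.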

2. Splitting Auction Power – You're Competing Against Yourself


In an A/B test, your ads are split into two or more variants, which means that Meta divides your audience between them. Even though you're using the same creative and targeting, this split creates intra-test competition. Essentially, your ads are bidding against each other in Meta's ad auction.

With your budget now divided across multiple variants, each ad has less power in the auction. This fragmentation reduces efficiency and leads to higher CPMs. Meta might bid more aggressively to ensure that each ad variant gets enough impressions to gather statistically significant data.

Solution: Only test one element at a time—whether it’s creative, landing page, or targeting. This minimizes the internal competition within your campaign and helps avoid significant CPM spikes. Also, make sure your team understands that comparing CPMs and ROAS from an A/B test to a standard campaign is like comparing apples and oranges.
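To get a feel for how much data “statistically significant” actually demands, here is a standard two-proportion sample-size estimate (normal approximation, two-sided test). The baseline conversion rate and lift are placeholder assumptions, not figures from Meta:

```python
import math

def sample_size_per_variant(p_base: float, lift: float) -> int:
    """Approximate visitors needed per variant to detect a relative
    `lift` in conversion rate p_base (alpha=0.05 two-sided, 80% power)."""
    p_var = p_base * (1 + lift)
    z_a = 1.96   # two-sided alpha = 0.05
    z_b = 0.84   # ~80% power
    p_bar = (p_base + p_var) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p_base * (1 - p_base)
                             + p_var * (1 - p_var))) ** 2
    return math.ceil(num / (p_base - p_var) ** 2)

# 2% baseline conversion rate, hoping to detect a 20% relative lift:
n = sample_size_per_variant(0.02, 0.20)  # roughly 21,000 per variant
print(n)
```

At modest conversion rates, each variant needs tens of thousands of visitors, which helps explain why Meta bids so aggressively to feed both arms of the test.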

3. Resetting Optimization—A Double-Edged Sword


Meta's algorithm continuously optimizes your regular campaigns based on past performance. It learns over time which audiences respond best, which placements drive the best results, and what budget allocation is optimal. However, in an A/B test, this optimization resets.

Each variant in your test starts fresh, meaning both versions of your ad must relearn the best way to serve ads. This optimization reset leads to temporary inefficiencies, resulting in higher CPMs as Meta gathers fresh data.

Key Insight: These higher CPMs are normal and typically temporary. Once Meta has optimized each ad variant, your CPMs should begin to stabilize.

4. Over-Segmentation: Shrinking Your Audience


A/B tests often force Meta to segment your audience into smaller groups. This can unintentionally shrink your effective audience size, which increases competition. With fewer impressions available for each ad variant, Meta might need to bid more aggressively to win auctions and secure placements—leading to inflated CPMs.

For example, in a regular campaign targeting a broad audience, Meta’s algorithm can optimize ad delivery efficiently across a wide user base. But when that audience is split into smaller chunks during testing, this efficiency drops, and CPMs rise.

5. Ad Auction Dynamics During Testing


Meta's ad auction behaves differently during A/B tests than it does in regular campaigns. The system may experiment with serving your ads in less efficient ways, such as testing different placements, times, or ad frequencies, which can increase CPMs.

Meta’s algorithm prioritizes gathering sufficient data for each ad variant to ensure the test yields meaningful insights. This focus can lead to temporary cost increases, as the system may deliver ads in ways that aren’t yet fully optimized.

Pro Tip: Limit the number of formats and placements being tested. By reducing the complexity, you shorten the learning phase and keep costs under control.

In Conclusion: Understanding the CPM Spike


A sudden rise in CPM during an A/B test on Meta is a normal and expected part of the process. Factors such as the learning phase, intra-test competition, and resetting optimization are the main contributors to these cost spikes.

The good news? These CPM surges are typically temporary. With strategies like allowing more time for learning, simplifying your tests, and minimizing the number of variables, you can mitigate the impact and still get valuable insights to optimize your campaigns.
