
One of Coegi’s best practices is to employ pre-optimized campaigns based on proven results and data-driven strategies, paired with a robust understanding of our client’s business goals. While AI and machine learning have come a long way in ensuring campaign spend is effectively allocated toward the audiences and creatives that contribute to the campaign KPI, A/B testing can be a fruitful tactic for gathering insights that inform future campaigns and optimize current ones.

How can A/B testing be executed for programmatic and social buys? 


When programmatic audiences are set up and targeted as separate line items, AI and machine learning optimizations allocate spend toward the top-performing audiences as long as there is scale. To level the playing field and make it more of a true “test,” Coegi programmatic specialists can also set specific budget maximums and minimums for each audience. However, establishing budgets at this level is not considered best practice, as it counters the benefits of AI.

As with the audience targeting approach outlined above, all creatives have an equal opportunity to serve against specific audiences. The platform uses a data-driven feedback loop to determine how often to serve each creative, and to whom.

That being said, there are numerous benefits to having multiple creative variations with different images, copy and/or calls-to-action. The Coegi team is able to provide reporting that speaks to which images and creative components were the most effective. For true A/B tests, Coegi recommends testing only one creative variable at a time (i.e., image variations OR copy variations).
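The single-variable principle above can be sketched in code. The data structures and names below are purely illustrative (not any ad platform's API): each test cell inherits a fixed base creative and differs only in its image, so any performance difference can be attributed to that one variable.

```python
# Sketch of a single-variable creative test: variants differ only in
# their image, while copy and call-to-action are held constant.
# (Hypothetical data structures -- not any ad platform's actual API.)

BASE_CREATIVE = {
    "copy": "Shop the spring collection today.",
    "cta": "Shop Now",
}

IMAGE_VARIANTS = ["lifestyle.jpg", "product_closeup.jpg", "flat_lay.jpg"]

def build_test_cell(image: str) -> dict:
    """Combine the fixed base creative with one image variant."""
    return {**BASE_CREATIVE, "image": image}

test_cells = [build_test_cell(img) for img in IMAGE_VARIANTS]

# Sanity check: every cell is identical except for the image.
for cell in test_cells:
    assert cell["copy"] == BASE_CREATIVE["copy"]
    assert cell["cta"] == BASE_CREATIVE["cta"]
```

If copy were varied at the same time as the image, a winning cell would not reveal which change drove the lift; holding everything but one field constant keeps the readout clean.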


While most social channels operate similarly to the programmatic process described above, Facebook does have a unique split-testing feature built into the platform that enables machine learning to determine the highest performing campaign variables over a set time period. This allows advertisers to gain campaign insights quickly, enabling faster optimizations.

A Split Test (or A/B Test) helps marketers understand which ad strategies have the strongest impact on the campaign performance. This allows marketers to:

  • Run a controlled test with no audience overlap: Facebook randomizes and splits the audience among ads, creating a fair test
  • Get clean, single-variable tests: everything about the test is identical except the variable being tested
  • Easily measure results: Facebook alerts the Coegi team once a winning variable is found
  • Leverage the results of the split test to improve campaign performance now and in the future

Split testing divides your audience into random, non-overlapping groups, each of which is shown an ad set identical to the others in all aspects except for one distinct difference, or “variable.” The performance of each ad set is then measured against your campaign objective, and the best-performing ad set wins. After the split test is complete, an email is sent out announcing the winning strategy.
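The random, non-overlapping split described above can be sketched as follows. This is a minimal illustration of the partitioning idea, assuming a simple list of user IDs; the `split_audience` helper is hypothetical, not Facebook's implementation.

```python
import random

def split_audience(user_ids: list, n_groups: int, seed: int = 42) -> list:
    """Randomly partition an audience into non-overlapping groups,
    one per ad-set variant (illustrative helper, not a platform API)."""
    rng = random.Random(seed)       # seeded for reproducibility
    shuffled = user_ids[:]          # copy so the input is untouched
    rng.shuffle(shuffled)
    # Deal the shuffled IDs round-robin into n_groups buckets.
    return [shuffled[i::n_groups] for i in range(n_groups)]

audience = list(range(10_000))      # stand-in user IDs
groups = split_audience(audience, n_groups=2)

# Non-overlapping: no user appears in more than one group.
assert not set(groups[0]) & set(groups[1])
# Exhaustive: every user is assigned to exactly one group.
assert sum(len(g) for g in groups) == len(audience)
```

Because assignment is random and the groups never overlap, no user sees more than one variant, which keeps the comparison between ad sets fair.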

Facebook Split Testing Diagram from Instapage

A Split Test can compare up to five assets (for example, five audiences). The variables that can be tested include ads, audience, placements, and delivery optimizations. Split Tests on Facebook can be executed against any objective, although a few variables are limited to certain objectives. Reach out to your Account Manager with any questions.


Author: Elise Stieferman, Client Strategist