

How Aditude Uses A/B Testing to Optimize Ad Performance

Josh Robins
Aug 15, 2024

At Aditude, our commitment to maximizing ad revenue for publishers is unwavering. We leverage advanced technology and innovative solutions to ensure publishers achieve optimal performance. One of our core strategies in this endeavor is A/B testing. 

This blog provides an in-depth explanation of Aditude's A/B testing process and highlights its numerous benefits to potential publishers. By understanding our approach and its advantages, publishers can better appreciate the value of partnering with Aditude to optimize their ad operations and boost their revenue.

Understanding A/B Testing in AdTech

A/B testing involves creating two or more variants (A and B) of a particular element, such as a Prebid configuration or a lazy-load distance setting, and comparing their performance to determine which variant yields better results. This method is essential for refining programmatic advertising strategies and for providing empirical data on what works best.

A/B testing is crucial in optimizing ad performance in the ad tech industry. By systematically testing different variables or designs, publishers can identify and implement the most effective strategies to enhance overall performance. Key metrics evaluated during A/B testing include:

  • Revenue Per Session (RPS) measures the average ad revenue generated per user session on a website. It provides insight into how effectively a site monetizes each visit, allowing publishers to evaluate the performance of their ad strategies and optimize their content to maximize earnings per session.
  • Session RPM (Revenue Per Thousand Sessions) calculates the revenue earned for every 1,000 user sessions. Session RPM gives publishers a normalized view of monetization effectiveness that is easy to compare over time and across sites.
  • eCPM (Effective Cost Per Thousand Impressions) is a standard digital advertising metric measuring the revenue earned per 1,000 ad impressions. eCPM provides a clear indication of ad performance, helping publishers assess the effectiveness of their ad inventory and pricing strategies.
  • Viewability measures the proportion of served ads that users actually see. For an ad to be considered viewable, it must meet specific criteria, such as at least 50% of the ad's pixels being visible in the user's viewport for at least one second for display ads. High viewability rates indicate that ads are more likely to capture user attention and drive engagement, leading to better performance and higher revenue.
  • Fill Rate refers to the percentage of ad requests successfully filled with ads. A high fill rate indicates that a significant portion of the ad inventory is being utilized, which can enhance overall revenue; a low fill rate suggests inefficiencies in ad delivery or a lack of advertiser demand.

These metrics provide valuable insights into the effectiveness of different ad strategies, allowing for continuous optimization.
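
For readers who prefer to see the arithmetic, the short TypeScript sketch below shows how these metrics can be derived from raw counts. The field names and numbers are illustrative assumptions, not Aditude's actual reporting code.

```typescript
// Hypothetical raw aggregates for one test variant; field names are illustrative.
interface VariantAggregates {
  adRevenueUsd: number;         // total ad revenue attributed to the variant
  sessions: number;             // user sessions exposed to the variant
  impressions: number;          // ad impressions rendered
  viewableImpressions: number;  // impressions counted as viewable
  adRequests: number;           // ad requests sent
  filledRequests: number;       // ad requests that returned an ad
}

// Derive the metrics discussed above from the raw counts.
function deriveMetrics(a: VariantAggregates) {
  return {
    revenuePerSession: a.adRevenueUsd / a.sessions,           // average revenue per session
    sessionRpm: (a.adRevenueUsd / a.sessions) * 1000,         // revenue per 1,000 sessions
    ecpm: (a.adRevenueUsd / a.impressions) * 1000,            // revenue per 1,000 impressions
    viewability: a.viewableImpressions / a.impressions,       // share of impressions that were viewable
    fillRate: a.filledRequests / a.adRequests,                 // share of ad requests that were filled
  };
}

// Example: 5,000 sessions generating $120 in ad revenue.
console.log(deriveMetrics({
  adRevenueUsd: 120,
  sessions: 5_000,
  impressions: 24_000,
  viewableImpressions: 15_600,
  adRequests: 26_000,
  filledRequests: 24_000,
}));
```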

Aditude's Approach to A/B Testing

Our A/B testing process is designed to be thorough and meticulously planned. Here is a step-by-step breakdown of how we conduct A/B testing at Aditude:

Step 1: Identifying Goals and KPIs

The first step in our A/B testing process is to identify the test's goals and KPIs. For example, Publisher A might want their below-the-fold (BTF) in-content units to reach 80% viewability on their homepage. We collaborate with publishers to define clear objectives that align with their business goals. This step is crucial for ensuring that the test is focused and relevant.
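
To make this step concrete, a goal like the one above can be captured as a small, structured test plan before any variations are built. The TypeScript sketch below is purely illustrative; the shape and field names are our own assumptions, not an Aditude API.

```typescript
// Hypothetical structure for capturing a test's goal and primary KPI up front.
interface AbTestPlan {
  publisher: string;
  hypothesis: string;        // what we expect to change, and why
  primaryKpi: "viewability" | "sessionRpm" | "ecpm" | "fillRate";
  target: number;            // the success threshold for the primary KPI
  scope: { page: string; adUnits: string[] };
}

const plan: AbTestPlan = {
  publisher: "Publisher A",
  hypothesis: "Tightening lazy-load distance will lift BTF in-content viewability",
  primaryKpi: "viewability",
  target: 0.8,               // the 80% viewability goal from the example above
  scope: { page: "homepage", adUnits: ["btf_incontent_1", "btf_incontent_2"] },
};
```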

Step 2: Designing Test Variations

Based on the publisher's goals, we design the different ad variations to be tested. This involves creating multiple versions of an idea or configuration, with differences that range from minor tweaks to major changes. Some common A/B testing scenarios include:

  • Supply-Side Platforms (SSPs): Testing traffic with and without a particular SSP.
  • Throttling SSP requests: Limiting the number of times an SSP is allowed to bid on a given page, or removing it from certain auctions to keep auctions fast.
  • Lazy Loading Distance: Adjusting how far below the viewport an ad must be before it starts loading (e.g., 100 pixels below the bottom of the viewport).
  • Prebid Settings: Modifying syncs per bidder and Prebid timeout settings to optimize bidding processes (see the sketch after this list).
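
As a rough sketch of what two such variations might look like in the browser, assuming a standard Prebid.js and Google Publisher Tag setup, the example below defines a control and a test configuration. The specific values are placeholders, not Aditude's production settings.

```typescript
// Assumes Prebid.js (pbjs) and Google Publisher Tag (googletag) are already loaded on the page.
declare const pbjs: any;
declare const googletag: any;

// Variant A (control): current timeout, user-sync, and lazy-load settings.
const variantA = {
  bidderTimeout: 1500,              // ms Prebid waits for bids
  syncsPerBidder: 5,                // user syncs allowed per bidder
  lazyLoadFetchMarginPercent: 200,  // fetch ads roughly two viewports early
};

// Variant B (test): shorter auctions, fewer syncs, tighter lazy-load distance.
const variantB = {
  bidderTimeout: 1000,
  syncsPerBidder: 3,
  lazyLoadFetchMarginPercent: 100,  // fetch ads roughly one viewport early
};

function applyVariant(v: typeof variantA): void {
  pbjs.que.push(() => {
    pbjs.setConfig({
      bidderTimeout: v.bidderTimeout,
      userSync: { syncsPerBidder: v.syncsPerBidder },
    });
  });
  googletag.cmd.push(() => {
    googletag.pubads().enableLazyLoad({
      fetchMarginPercent: v.lazyLoadFetchMarginPercent,
      renderMarginPercent: Math.floor(v.lazyLoadFetchMarginPercent / 2),
    });
  });
}

applyVariant(Math.random() < 0.5 ? variantA : variantB); // e.g., a 50/50 split per page view
```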

Step 3: Implementing the Test

Once the test variations have been designed, we deploy them across the publisher's website or ad inventory and put the appropriate tracking in place to measure the KPI being tested. This could be as simple as page-level targeting through GAM, or more complex, such as tracking only certain countries and device types in real time via Insights. This step is critical for ensuring that the test runs smoothly and that data is collected accurately.
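
For the simpler GAM case, page-level key-value targeting is often enough to slice reporting by variant. Below is a minimal sketch assuming Google Publisher Tag is on the page; the key name ab_variant and the 30/70 split are illustrative choices, not a prescribed setup.

```typescript
// Assumes Google Publisher Tag (googletag) is already loaded on the page.
declare const googletag: any;

// Tag every ad request on this page view with the variant it belongs to,
// so GAM reports can be filtered on the "ab_variant" key-value.
function labelPageWithVariant(variant: "control" | "test"): void {
  googletag.cmd.push(() => {
    googletag.pubads().setTargeting("ab_variant", variant);
  });
}

labelPageWithVariant(Math.random() < 0.3 ? "test" : "control"); // e.g., a 30/70 split
```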

Step 4: Monitoring and Collecting Data

During the test, our team continuously monitors the performance of the different ad variations, collecting key metrics such as revenue per session, session RPM, eCPM, viewability, and fill rates. This data provides valuable insight into the effectiveness of each variation. As we monitor performance, our revenue and ad ops teams make data-informed decisions on whether to ramp up the winning variant's share of traffic, roll it out fully, or conclude the test and keep the control.

Step 5: Analyzing Results

After completing the test, we analyze the data to determine which ad variation performed the best. This analysis involves comparing the performance of the control group and the test group across various metrics to identify the most effective strategy.
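
In simplified form, that comparison boils down to computing the lift of the test group over the control on each metric, as in the illustrative sketch below; a real analysis also accounts for sample sizes and statistical significance.

```typescript
// Compare a test variant against the control on each metric and report the lift.
interface VariantMetrics {
  sessionRpm: number;
  ecpm: number;
  viewability: number;
  fillRate: number;
}

function liftPercent(control: number, test: number): number {
  return ((test - control) / control) * 100;
}

function compareVariants(control: VariantMetrics, test: VariantMetrics) {
  return {
    sessionRpmLift: liftPercent(control.sessionRpm, test.sessionRpm),
    ecpmLift: liftPercent(control.ecpm, test.ecpm),
    viewabilityLift: liftPercent(control.viewability, test.viewability),
    fillRateLift: liftPercent(control.fillRate, test.fillRate),
  };
}

// Example: the test variant wins on eCPM but gives up a little fill.
console.log(compareVariants(
  { sessionRpm: 24.0, ecpm: 5.0, viewability: 0.62, fillRate: 0.95 },
  { sessionRpm: 25.9, ecpm: 5.6, viewability: 0.63, fillRate: 0.94 },
));
```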

Step 6: Sharing Insights with Publishers

Once the analysis is complete, we share detailed insights with the publishers, giving them a comprehensive overview of the test results. This step is crucial for helping publishers understand the test's impact and make informed decisions about their ad strategies.

A/B Test Examples

To illustrate the effectiveness of our A/B testing process, here are two real-world examples of publishers who have benefited from partnering with Aditude:

Example #1: Publisher A

  • Objective: Test SSP 1 on 30% of traffic in the US.
  • Goal: Determine if the publisher should add this SSP to their auctions in the US.
  • Approach: Aditude set up the A/B test in the US, with 30% of traffic exposed to SSP 1 and the remaining 70% serving as the control group.
  • Results: With SSP 1, the publisher saw a 6% increase in CPM. 

Example #2: Publisher B

  • Objective: Increase CPM while maintaining fill rates.
  • Goal: Evaluate different models of Aditude's Dynamic Flooring.
  • Approach: Aditude designed and implemented an A/B test with different models of our in-house Dynamic Flooring.
  • Results: The publisher saw a 12% CPM increase while losing less than 1% of fill across the site. This not only achieved the publisher's goal of increasing CPM but also led to a significantly higher RPS.

Interpreting A/B Test Results

A critical aspect of our A/B testing process is interpreting the test results accurately. Here’s how we approach this:

  • Testing at 10% Traffic: We often begin by testing at 10% of traffic to gauge initial performance. This step helps us identify the potential lift and refine the test before scaling up. It's important to take note of the percentage allocation and read the data accordingly, checking diagnostic indicators beyond the KPIs themselves; for example, confirming that an SSP is actually receiving roughly 10% of requests before evaluating its performance on that 10% of traffic (a simple bucketing sketch follows this list).
  • Scaling Up to 50% Traffic: If the initial results are positive, we scale up the test to 50% traffic. This larger sample size provides more robust data and helps confirm the initial findings.
  • Rolling Out Successful Tests: Once we have validated the test results, we implement the successful strategy across all traffic. This ensures that the publisher can fully benefit from the optimized ad strategy.
  • Ensuring Fair Allocation: Our 50/50 tests are designed to provide an unbiased and fair split between the control group and the test group. This approach helps us accurately measure the impact of the test and make data-driven decisions.
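
One common way to implement this kind of staged, fair allocation is deterministic bucketing: hash a stable identifier into a 0 to 100 range and compare it to the current rollout percentage, so a given user stays in the same group as the test ramps from 10% to 50% to 100%. The sketch below is our own illustration of the idea, not Aditude's implementation.

```typescript
// Map a stable identifier (e.g., a first-party ID or session ID) to a bucket in [0, 100).
function bucketFor(id: string): number {
  let hash = 0;
  for (let i = 0; i < id.length; i++) {
    hash = (hash * 31 + id.charCodeAt(i)) >>> 0; // simple 32-bit rolling hash
  }
  return hash % 100;
}

// A user is in the test group if their bucket falls below the current rollout percentage.
// Ramping 10 -> 50 -> 100 only adds users to the test group; nobody flips back and forth.
function assignVariant(id: string, rolloutPercent: number): "test" | "control" {
  return bucketFor(id) < rolloutPercent ? "test" : "control";
}

console.log(assignVariant("user-1234", 10)); // initial 10% test exposure
console.log(assignVariant("user-1234", 50)); // same user, after scaling to 50%
```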

Benefits of A/B Testing with Aditude

Partnering with Aditude for A/B testing offers numerous benefits to publishers:

  • Scalability: Our extensive network allows us to test over a billion impressions a day, providing quicker and more definitive results than in-house testing. This scale lets us deliver actionable insights rapidly.
  • Enhanced Performance: Our A/B testing process is designed to meet a publisher's goals, whether those center on page speed and user experience or on ad performance and revenue growth. By identifying and implementing the most effective strategies, we help publishers achieve their business objectives.
  • Data-Driven Partnerships: At Aditude, we build strong, data-driven partnerships with our publishers. Our A/B testing process provides valuable insights that enable continuous optimization and improvement.
  • Continuous Support and Consultation: Our team offers ongoing support and consultation to ensure publishers can fully benefit from our A/B testing process. We work closely with publishers to help them understand the test results and implement the best strategies for their business.

Conclusion

Aditude’s A/B testing process is a powerful tool for maximizing ad performance and revenue growth. By partnering with us, publishers can leverage our expertise, technology, and dedicated support to achieve their business goals. Contact Aditude today for a consultation and optimize your ad performance through A/B testing.
