The Most Important A/B Tests to Run in 2025: What Our Research Revealed

We analysed over 9,000 A/B tests from 800 businesses to discover which types of experiments really boost referral performance.
By tweaking aspects like the text, rewards, and images in referral offers — and measuring results when tests were statistically significant — we now know the most impactful tests to run in 2025.
We also used Generative AI to analyse parts of our referral offer designs, combining these findings with our historic A/B test results to see which imagery captivates customers the most.
So, which design elements spark referrals? Are simple offers or detailed promotions better for acquiring new customers?
Read on to find out…
Key Findings: Most Impactful Tests
Our A/B testing experts identified distinct patterns in how conversion rates respond across industries and experiment types. Landing page optimization proved particularly effective at boosting conversion rates, and our experts also noted that machine learning algorithms can automate the analysis of user interactions, enabling predictive A/B tests and continuous improvement based on real-time data. Below are our three favourite stats from the research; read on to see which experiments caused the biggest uplifts.
Our top-pick stats
1. White-background offers beat non-white designs 62% of the time (1.6x more likely to win).
2. Human-focused designs lose to designs without humans 60% of the time (1.5x more likely to lose).
3. Product-focused offers beat non-product-led designs 59% of the time (1.4x more likely to win).
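In case you're wondering where the multipliers come from, our reading is that they're simply the ratio of win rate to loss rate. The quick check below (an assumption, not part of the study's stated methodology) reproduces all three figures:

```python
# Assumed derivation of the "x more likely" multipliers above:
# the ratio of win rate to loss rate.
for win_rate in (0.62, 0.60, 0.59):
    print(f"{win_rate:.0%} win rate -> {win_rate / (1 - win_rate):.1f}x more likely")
# 62% -> 1.6x, 60% -> 1.5x, 59% -> 1.4x
```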
Incentive experiment results
These are the big hitters. The median uplift in conversion rate from incentive-based tests was an impressive 91%.
For incentive-based experiments, testing scenarios like minimum spend versus no minimum spend, and percentage discounts versus flat amounts, led to the highest uplifts.
Design experiment results
Never underestimate the power of good design. Our findings show that imagery focused on the product can significantly boost engagement versus simply showing a person or lifestyle image, especially in industries like fashion.
We also found that simple, bright, higher-contrast images perform better than complex, darker, low-contrast images.
Copy experiment results
Words matter. I’m not just saying that as a copywriter. In our A/B tests, concise vs. descriptive language and different lead flows (referee-led vs. referrer-led) produced the highest uplifts.
Sharing experiment results
How do your customers actually want to share referral offers? The list of sharing options, and particularly the order they appear in, can have a huge impact.
In our research, we saw the highest sharing rate uplifts when customers adopted Name Share® and placed it first in their share option list.
Key Findings: Most Impactful Metrics
On average, you’ll see the best performance when you run at least five A/B tests. You’ll likely see an uplift much earlier, most likely after the first or second test, but performance should continue to improve until the fifth experiment. Throughout, statistical analysis is what tells you whether one version of a page or app is genuinely outperforming the other.
But which metrics should you measure for success? In our testing, we looked at:
- Share rate: This tells you how often users spread the word about your offer. Test different messages or incentives to see what gets people talking.
- Conversion rate: This is the percentage of users who take the action you want, like signing up or buying. Test elements like button text or page layout to boost this number.
- Purchase rate: This measures how many referrals actually make a purchase. Tweak offers and checkout processes to see more buys.
Our research shows that these metrics improve significantly with continuous testing.
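To make those definitions concrete, here's a minimal sketch of how the three metrics could be computed from raw funnel counts. The schema and numbers are hypothetical, not our production pipeline:

```python
from dataclasses import dataclass

@dataclass
class ReferralFunnel:
    """Raw counts from a referral campaign (hypothetical schema)."""
    offer_views: int        # users who saw the referral offer
    shares: int             # users who shared it
    referee_signups: int    # referred friends who signed up
    referee_purchases: int  # referred friends who went on to buy

    @property
    def share_rate(self) -> float:
        return self.shares / self.offer_views

    @property
    def conversion_rate(self) -> float:
        return self.referee_signups / self.shares

    @property
    def purchase_rate(self) -> float:
        return self.referee_purchases / self.referee_signups

funnel = ReferralFunnel(offer_views=10_000, shares=1_200,
                        referee_signups=300, referee_purchases=90)
print(f"Share rate:      {funnel.share_rate:.1%}")       # 12.0%
print(f"Conversion rate: {funnel.conversion_rate:.1%}")  # 25.0%
print(f"Purchase rate:   {funnel.purchase_rate:.1%}")    # 30.0%
```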
Key Findings: Industry-Specific Results
From our results, we know that the most impactful A/B tests vary by industry. Here's what we found:
- Home, Pets, and Garden: Incentive-led experiments are by far the most impactful.
- Fashion: Design-led experiments are the most impactful, followed by incentive experiments.
- Health and Beauty: Incentive is the most impactful, followed by design.
- Food and Drink: Incentives are easily the most impactful.
Importance of A/B Testing in Referral Programmes
A/B testing is vital to running a successful referral programme: on average, our top-performing brands acquire 4x more new customers within just six months of continuous testing. The right testing tools are also crucial for running effective experiments and optimizing your referral programme.
Why? Because A/B testing allows you to experiment and fine-tune. Whether you test offers, visuals or messaging, there are many ways to see what actually motivates customers to share and refer friends.
We’ve got the research to prove it.
Tips for Effective A/B Testing
Thought the priceless insights stopped there? Think again. We’ve got a couple of quick tips to help you run better, more insightful A/B tests than ever before.
For more A/B testing advice, check out our video or read our blog on the top A/B testing mistakes to avoid.
- Run simultaneous control and test segments: Make sure your control and test segments run at the same time, so external factors such as economic changes or seasonal effects can't skew your results. Running multiple experiments on the same page can also help you identify which elements work best together for user experience and personalization.
- A/B test by cohort: Showing one variation to one group while simultaneously showing a different version to another gives you accurate insight into which elements resonate best with your audience, without disrupting their experience (see the bucketing sketch below).
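One common way to satisfy both tips is deterministic, hash-based bucketing: hashing the user ID together with the experiment name gives every user a stable variant for the lifetime of the test, keeps control and test groups running simultaneously, and keeps assignments independent across experiments. A minimal sketch, with an illustrative experiment name:

```python
import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants: tuple[str, ...] = ("control", "test")) -> str:
    """Deterministically bucket a user into a variant.

    The same (user, experiment) pair always hashes to the same bucket,
    so a user never flips between variants mid-test, and different
    experiments bucket independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(assign_variant("user-42", "referral-offer-copy-2025"))
```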
Conclusion
Our research revealed how A/B testing can redefine the way you design your referral marketing campaigns. We know which tests produce the greatest impact across key industries and how they can introduce enormous improvements in customer acquisition and engagement.
Let’s recap…
Incentive-driven tests drive the highest impact within the Home, Pets, and Garden industry. In Fashion, design-driven tests were most influential, closely followed by incentive tests.
Our Health and Beauty results show incentive tests come out on top, with design changes close behind. Finally, incentives came in first for Food and Drink when trying to persuade customers to take action.
Ready to turn insights into action? Start A/B testing now and kickstart 2025 with KPIs and results to be proud of. Contact your account manager today.
FAQ
What is A/B testing?
A/B testing, also known as split or bucket testing, is a method of comparing two versions of a webpage or app against each other to see which performs better. It remains one of the most effective ways to optimize performance and improve website conversion rates, which is why leading brands use it continually to guide how they build and update their web content. A test runs two versions of a page against each other, one with the variable under test and one without, to learn which works better.
Why A/B Test?
A/B testing lets SaaS, eCommerce, and other business websites make calculated changes to the user experience while gathering data on the effects of those changes. It can also be focused on a single goal, such as conversion rate optimization, to keep improving an experience iteratively over time. In that sense, A/B testing gives you evidence for making changes to your business while reducing risk and enhancing the customer experience.
What are the types of A/B Tests?
There are several kinds of A/B tests, including:
- Classic A/B testing: Tests two versions of a page against each other to determine which performs better.
- Split testing: The same as A/B testing, except the two variant pages live at different URLs.
- Multivariate testing: Compares several changes at once, indicating how multiple variables interact.
- A/B/n testing: Tests more than two versions of a given element simultaneously.
How to plan an A/B Test?
There's a bit of planning and preparation before an A/B test can be started. Here are the steps:
- Identify an objective or hypothesis: Describe what you want to measure and why.
- Pick a testing tool: Choose the platform you'll use to build and run your test.
- Set up the test: Build the two variations of the page, then configure the test in your tool.
- Determine sample size: Work out how many visitors you need to reach statistical significance (see the sketch after this list).
- Run the test: Start the test and let it run for a predefined period of time.
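For the sample-size step, the standard normal-approximation formula for comparing two conversion rates gives a useful ballpark. A sketch, with illustrative baseline and target rates:

```python
import math
from scipy.stats import norm

def sample_size_per_variant(p_base: float, p_test: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Visitors needed per variant to detect a lift from p_base to p_test
    with a two-sided two-proportion z-test (normal approximation)."""
    z_alpha = norm.ppf(1 - alpha / 2)   # 1.96 for a 5% significance level
    z_beta = norm.ppf(power)            # 0.84 for 80% power
    variance = p_base * (1 - p_base) + p_test * (1 - p_test)
    return math.ceil(variance * (z_alpha + z_beta) ** 2 / (p_base - p_test) ** 2)

# e.g. detecting a lift from a 5% to a 6% conversion rate
print(sample_size_per_variant(0.05, 0.06))  # ~8,156 visitors per variant
```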
How to analyse the test results?
Analysing the results is a crucial step in A/B testing. Here are some key things to keep in mind:
- Statistical significance: The results should be statistically significant and not due to chance.
- KPIs: Monitor key performance indicators, including conversion rate, average order value, and customer lifetime value.
- Test results: Analyse the outcome of the test and pinpoint where the difference between variants lies.
- Baseline data: Keep your pre-existing data in view when interpreting the test results.
- Google Analytics: Google Analytics can be used to track and analyze test results.
By following these steps and keeping these factors in mind, businesses can make data-informed decisions and enhance their website performance along with conversion rates.
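To make the statistical-significance check concrete, here's a minimal two-sided two-proportion z-test you could run on raw conversion counts. The counts below are made up for illustration:

```python
from math import sqrt
from scipy.stats import norm

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - norm.cdf(abs(z)))

# e.g. control converted 400/8,000 visitors; variant converted 480/8,000
p_value = two_proportion_z_test(400, 8_000, 480, 8_000)
print(f"p = {p_value:.3f}")  # p ≈ 0.006 < 0.05, statistically significant
```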
Common A/B Testing Mistakes
A/B testing can be a powerful tool for improving conversion rates and user experience, but it’s not without its challenges. Here are some common mistakes to avoid:
- Not Testing Long Enough: One of the biggest mistakes is ending a test too early. To achieve statistical significance, you need a large enough sample size. Ending a test prematurely can lead to inaccurate conclusions.
- Ignoring External Factors: External factors such as seasonality, economic changes, or marketing campaigns can impact your test results. Make sure to account for these variables when analyzing your data.
- Testing Too Many Variables at Once: While multivariate tests can be useful, testing too many variables simultaneously can make it difficult to pinpoint which change caused the improvement. Start with simple A/B tests before moving on to more complex experiments.
- Not Segmenting Your Audience: Different segments of your audience may respond differently to changes. By not segmenting your audience, you might miss out on valuable insights. Use cohort analysis to understand how different groups react to your tests.
- Overlooking Key Performance Indicators (KPIs): Focus on the right metrics. While conversion rate is important, other KPIs like average order value and customer lifetime value can provide a more comprehensive view of your test’s impact.
What A/B testing tools and resources are available?
There are many A/B testing tools and resources available, each with its own strengths and weaknesses. Here are a few popular options:
- Optimizely: Known for its user-friendly interface, Optimizely offers robust features for A/B testing, multivariate testing, and personalization. It’s a great choice for both beginners and advanced users.
- Google Optimize: Once integrated with Google Analytics, Google Optimize let you run A/B and multivariate tests with ease. Note, though, that Google sunset Optimize in September 2023, so in 2025 you'll need one of the alternatives on this list.
- VWO (Visual Website Optimizer): VWO provides a comprehensive platform for A/B testing, heatmaps, and user recordings. It’s ideal for businesses looking to gain deep insights into user behaviour.
- Adobe Target: Part of the Adobe Experience Cloud, Adobe Target offers advanced targeting and personalization features. It’s best suited for large enterprises with complex testing needs.
- Unbounce: Primarily a landing page builder, Unbounce also offers A/B testing capabilities. It’s perfect for marketers looking to create and test multiple versions of landing pages quickly.
By leveraging these tools, you can streamline your testing process, collect data more efficiently, and make data-driven decisions to optimize your web pages and campaigns.
