How do I set up an auto-winner test?
If you already know how to create an A/B test campaign, then you already (almost!) know how to create an auto-winner test.
Once launched, a percentage of your subscriber list (e.g., 20%) receives each of your test variants. After the test duration elapses (one hour by default), the best-performing variant is sent to the remaining 80% of your subscriber list. Note that you can change both the initial subscriber allocation and how long the initial test runs.
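If it helps to see the split as arithmetic, here's a minimal sketch. The even split of the test pool across variants is an assumption for illustration; Attentive handles the actual allocation internally.

```python
# Illustrative sketch of the allocation arithmetic, not Attentive's internals.
# Assumption: the initial test pool is split evenly across the variants.
def allocate_test_pool(total_subscribers: int, test_pct: float, num_variants: int):
    test_pool = int(total_subscribers * test_pct)   # e.g. 20% of the list
    per_variant = test_pool // num_variants         # even split per variant
    remainder = total_subscribers - test_pool       # waits for the winning variant
    return per_variant, remainder

# A 10,000-subscriber list with a 20% test pool and two variants:
per_variant, remainder = allocate_test_pool(10_000, 0.20, 2)
print(per_variant)  # 1000 -> each variant goes to 1,000 subscribers
print(remainder)    # 8000 -> the winner is later sent to these 8,000
```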
How does Attentive pick a winner?
You can tell Attentive to pick a winner based on one of four criteria:
- CTR: The default. The variant with the highest click-through rate wins.
- Unsubscribe: The variant with the lowest opt-out rate wins.
- Conversion rate: The variant with the highest conversion rate wins.
- Total revenue: The variant with the highest total revenue wins.
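To make the four criteria concrete, here's a minimal sketch of how a winner could be scored and picked. The `VariantStats` fields, criterion names, and scoring rules are illustrative assumptions, not Attentive's actual API:

```python
from dataclasses import dataclass

@dataclass
class VariantStats:
    sends: int
    clicks: int
    unsubscribes: int
    conversions: int
    revenue: float

def score(v: VariantStats, criterion: str) -> float:
    if criterion == "ctr":          # higher click-through rate wins (default)
        return v.clicks / v.sends
    if criterion == "unsubscribe":  # lower opt-out rate wins, so negate it
        return -(v.unsubscribes / v.sends)
    if criterion == "conversion":   # higher conversion rate wins
        return v.conversions / v.sends
    if criterion == "revenue":      # higher total revenue wins
        return v.revenue
    raise ValueError(f"unknown criterion: {criterion}")

def pick_winner(variants: dict[str, VariantStats], criterion: str = "ctr") -> str:
    return max(variants, key=lambda name: score(variants[name], criterion))

variants = {
    "A": VariantStats(sends=1000, clicks=80, unsubscribes=5, conversions=12, revenue=940.0),
    "B": VariantStats(sends=1000, clicks=95, unsubscribes=9, conversions=10, revenue=1020.0),
}
print(pick_winner(variants))                 # "B" (higher CTR)
print(pick_winner(variants, "unsubscribe"))  # "A" (lower opt-out rate)
```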
Can I choose a different criterion for auto-winner A/B test campaigns?
Currently, you can choose among four criteria: highest click-through rate, lowest opt-out rate, highest conversion rate, and highest total revenue.
Why do I have to set a test duration? Why is there a duration at all?
In order for Attentive to give you the most accurate test results possible, the system needs a little bit of time to see which types of messages perform best among your subscribers. The more time it has to evaluate each message variant, and the more types of test variants you provide, the more accurate auto-winner A/B campaigns will be.
By default, the initial auto-winner A/B test duration is one hour.
Based on our experience with other clients, one hour is roughly enough time to get a good sense of subscriber engagement with your message. How many clicked on a link? How many converted and made a purchase? How many subscribers … unsubscribed?
Keep in mind that you can increase the test time for a more accurate auto-winner result, or shorten it to get messages to more subscribers faster. That said, we encourage you to run auto-winner A/B tests for at least 2 to 4 hours, as conversion results aren't realized immediately.
What happens if my A/B test doesn't have a clear winner?
If we can't determine which of your message variants performed best, we'll select one of them at random as the winner.
This usually only happens if none of your variants had any data in the selected window, or if the stats were the same for all of your message variants.
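For illustration, here's a hypothetical version of that fallback logic (the scoring and tie detection are assumptions, not Attentive's implementation):

```python
import random

# If every variant has an identical score (including all zeros from a window
# with no data), fall back to a random choice; otherwise take the top score.
def pick_winner_or_random(scores: dict[str, float]) -> str:
    if len(set(scores.values())) == 1:  # no data, or a perfect tie
        return random.choice(list(scores))
    return max(scores, key=lambda name: scores[name])

print(pick_winner_or_random({"A": 0.0, "B": 0.0, "C": 0.0}))  # random pick
print(pick_winner_or_random({"A": 0.08, "B": 0.095}))         # "B"
```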
You'll get the best results if you run A/B tests with large numbers of subscribers. As a general rule, more subscribers = better test data.
What happens to the test duration time if my message gets delayed?
The test duration is measured from the time the final test message is sent. This means that if there’s a brief delay and it takes longer than expected to send each message in the initial auto-winner test, the test results will be delayed by the same amount of time.
For example, if it takes a total of fifteen minutes to send out each of the initial test messages and the test duration is set to one hour, then you’ll see the winning variant after an hour and fifteen minutes.
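The same example as quick date arithmetic (the timestamps are hypothetical):

```python
from datetime import datetime, timedelta

# The test clock starts once the final test message is sent, so a 15-minute
# send delay shifts the winning send by the same 15 minutes.
launch = datetime(2024, 5, 1, 12, 0)               # hypothetical launch time
sending_finished = launch + timedelta(minutes=15)  # sending took 15 minutes
test_duration = timedelta(hours=1)
winner_sends_at = sending_finished + test_duration
print(winner_sends_at - launch)  # 1:15:00 -> winner goes out 75 minutes after launch
```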
Can auto-winner A/B tests be time zone relative?
Yes, but only if you send the tests to 100% of recipients.
How many variants can I include in an auto-winner A/B test campaign?
As with other A/B test campaigns, you can include up to four variants per auto-winner A/B test campaign.
What’s the minimum number of subscribers to get accurate test results?
As with all A/B multivariate testing, the more subscribers who take the test, the better. That said, you’ll get the best results if your auto-winner A/B test campaigns have at least 500 subscribers.
Also, note that your initial testing pool must include at least 10% of your total subscribers.
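If you want a quick sanity check before launching, here's a hypothetical pre-flight helper based on the two guidelines above (the function and its messages are illustrative, not part of Attentive):

```python
# Check the two thresholds above: at least 500 subscribers overall, and an
# initial test pool of at least 10% of the list.
def validate_auto_winner_setup(total_subscribers: int, test_pct: float) -> list[str]:
    warnings = []
    if total_subscribers < 500:
        warnings.append("Fewer than 500 subscribers; results may be unreliable.")
    if test_pct < 0.10:
        warnings.append("Initial test pool must be at least 10% of the list.")
    return warnings

print(validate_auto_winner_setup(400, 0.05))
# ['Fewer than 500 subscribers; results may be unreliable.',
#  'Initial test pool must be at least 10% of the list.']
```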
Is there a limit to the number of auto-winner A/B test campaigns I can send?
The only limit is the number of tests you can come up with! With Attentive, you can create and send an unlimited number of A/B test campaigns.
Can I still send a regular A/B test campaign if I want to get immediate results?
Yes! You can send a regular A/B test campaign by dragging the slider to 100% when building an A/B test.