When preparing a newsletter issue, it can be difficult to predict exactly how subscribers will react to important elements such as the e‑mail subject or the primary call to action. You may also wish to find out which of several alternatives will have the most positive effect. A/B testing provides a way to answer these questions and optimize your newsletter issues.
It allows you to create multiple versions of each issue, referred to as variants. These variants are then mailed out to a test group of subscribers, typically a relatively small percentage of the newsletter's full mailing list. The best version of the issue can then be identified based on the tracking statistics measured for the test group, and sent to the remaining subscribers. The winning variant can either be selected automatically by the system according to specified criteria, or manually after evaluating the results of the test.
Please note that A/B testing works best for newsletters that have a large number of subscribers. With a small testing group, the results may be heavily affected by random factors and will not be statistically significant.
A/B testing variants of newsletter issues are evaluated based on the actions performed by recipients, so both types of e‑mail tracking need to be enabled for the given newsletter on its Configuration tab (Track opened e-mails and Track clicked links). This also requires the Enable on-line marketing setting to be enabled in Site Manager -> Settings -> On-line marketing.
Additionally, A/B testing is only supported for template-based (static) newsletters.
If the conditions described in the section above are met, A/B tests can be defined through the CMS Desk -> Tools -> Newsletters interface either directly when creating a new newsletter issue via the wizard, or when editing an existing issue on the Content tab. In the case of new issues, it is necessary to Save the content before the advanced actions become available. To add the first testing variant for the issue, click the Create A/B test button offered among the header actions.
This opens a dialog where the new variant can be defined. Start by filling in the Name, which will be used to identify the variant while working with the A/B test. You can then choose one of two possible options that determine the initial content of the variant:
• Create empty variant - select this option if you wish to create the issue variant from scratch. The variant will use the main template set for the newsletter on the Configuration tab and the content of its editable regions will be empty.
• Copy content from another variant - if selected, the content of an existing variant (or the original issue) will be used as a starting point that you can modify as required. Choose the source from the list of issue variants below. The variant will use the same template as the source issue and all editable region content will also be copied. This option makes it easy to create variants for testing small changes, such as a different e‑mail subject or text headline.
Now that the issue has an A/B testing variant defined, a slider will be displayed at the top of the content editing page. You can use it to switch between individual variants, including the original issue. The name of the currently selected variant is shown next to the slider. You may manage the A/B test variants using the following buttons:
• Add variant - creates another variant of the issue. You can add any number of variants.
• Remove variant - deletes the variant currently selected through the slider.
• Edit properties - allows you to change the name of the currently selected variant. If required, you may also rename the original issue.
The settings and content of the currently selected variant can be modified just like when editing a standard issue. Each variant may have a different subject, issue template, editable region content etc. This allows you to test any variables that you need.
When all of the issue's variants are defined as required, the next step is to configure and schedule how the test should be sent out and evaluated. This can be done either in the send step of the new issue wizard or when editing an existing issue on the Send tab.
The size of the subscriber test group can be defined using the slider in the upper part of the page. By moving the slider's handle, you can increase or decrease the number of subscribers that will receive the variants of the newsletter during the testing phase. The test group is automatically balanced so that each variant is sent to the same number of subscribers. Because of this, the overall test group size will always be a multiple of the total number of variants created for the issue.
The remaining subscribers who are not part of the test group will receive the variant that achieves the best results (i.e. the winner) after the testing process is complete.
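The balancing rule described above can be sketched as a short calculation. This is a hypothetical illustration only; `balanced_test_group` is not part of the product.

```python
def balanced_test_group(total_subscribers, test_percentage, variant_count):
    """Round the requested test group size down to an exact multiple of
    the variant count, so every variant reaches the same number of
    subscribers. Hypothetical helper, not the product's actual code."""
    requested = total_subscribers * test_percentage // 100
    per_variant = requested // variant_count
    return per_variant * variant_count, per_variant

# 10% of 10,000 subscribers is 1,000; with 3 variants this rounds down
# to 333 recipients per variant, i.e. a test group of 999.
group_size, per_variant = balanced_test_group(10000, 10, 3)
```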
Using a full test group
It is even possible to set up a scenario where the test group includes 100% of all subscribers. In this case, the A/B test simply provides a way to evenly distribute different versions of the issue between the subscribers and the selection of the winner is only done for statistical purposes.
In the Schedule mail-out section below the slider, you can specify when individual issue variants should be sent to the subscribers from the corresponding portion of the test group. To schedule the mail‑out, enter the required date and time into the field below the list (you may use the Calendar selector or the Now link) and then click OK. This can either be done for a specific variant, all variants or only those selected through the checkboxes in the list. If the mail-out time is the same for multiple variants, they will be sent in sequence with approximately 1 minute intervals between individual variants.
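The one‑minute stagger applied to variants scheduled for the same time can be modeled as follows. This is an illustrative sketch of the described behavior, not the system's implementation.

```python
from datetime import datetime, timedelta

def stagger_mailout(scheduled, gap_minutes=1):
    """Assign effective send times to variants: variants that share the
    same scheduled time are queued sequentially, roughly one minute
    apart (illustrative model of the behavior described above)."""
    effective = {}
    queued_at = {}  # scheduled time -> number of variants already queued
    for variant, when in sorted(scheduled.items(), key=lambda kv: kv[1]):
        offset = queued_at.get(when, 0)
        effective[variant] = when + timedelta(minutes=gap_minutes * offset)
        queued_at[when] = offset + 1
    return effective

nine = datetime(2014, 5, 1, 9, 0)
times = stagger_mailout({"Variant A": nine, "Variant B": nine})
# Variant A goes out at 9:00, Variant B about a minute later.
```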
The configuration made in the Winner selection section determines how the winning variant of the A/B test will be chosen. You can select one of the following options:
• Number of opened e-mails - the system will automatically choose the variant with the highest number of opened e-mails as the winner. This type of testing focuses on optimizing the first impression of the newsletter, i.e. the subject of the e-mails and the sender name or address, not the actual content.
• Total unique clicks - the winner will be chosen automatically according to the number of unique link clicks measured for each variant. Each link placed in the issue's content is only counted once per subscriber, even when clicked multiple times. This option is recommended if the primary goal of your newsletter is to encourage subscribers to follow the links provided in the issues.
• Manually - the winner of the A/B test will not be selected automatically. Instead, the author of the issue (or other authorized users) can monitor the results of the test and choose the winning variant manually at any time.
When using an automatic selection option (one of the first two), it is also necessary to enter the duration of the testing period through the Select a winner after settings below. This way, you can specify how long the system should wait after the last variant is sent out before it chooses a winner and mails it to the remaining subscribers.
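The Total unique clicks metric described above effectively deduplicates clicks per subscriber and link. A minimal sketch of that counting rule (hypothetical, for illustration only):

```python
def total_unique_clicks(click_events):
    """Count each (subscriber, link) pair once, regardless of how many
    times the subscriber clicked that link. Illustrative sketch, not
    the product's tracking code."""
    return len(set(click_events))

events = [
    ("alice", "/spring-sale"),
    ("alice", "/spring-sale"),  # repeat click, not counted again
    ("alice", "/blog"),
    ("bob", "/spring-sale"),
]
# Three distinct (subscriber, link) pairs -> 3 unique clicks.
clicks = total_unique_clicks(events)
```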
Once everything is configured as required, you can confirm that the variants should be sent according to their mail-out scheduling time by clicking the Send button (or Send and close in the new issue wizard). If you only wish to save the configuration of the A/B test without actually starting the mail-out, use the Save (Save without sending) button instead.
The testing phase begins after the first variant is sent out. If you need to make any changes to the configuration of the A/B test or the content of its variants, you can do so by editing the issue.
On the Content tab, you may modify the variants that have not yet been mailed, but the slider actions will be disabled. This means that it is no longer possible to add, remove or rename variants.
If you switch to the Send tab, the test group slider will now be locked. However, you can view the e‑mail tracking data measured for individual variants in the Test results section. The current tracking statistics are shown for each variant, specifically the number of opened e‑mails and the number of unique link clicks performed by subscribers. By clicking on these numbers, you can open a dialog with the details of the corresponding statistic for the given variant. It is also possible to reschedule the sending of variants that have not been mailed yet using the selector and date-time field below the list.
The Select as winner action allows you to manually choose a winner (even when using automatic selection). It opens a confirmation dialog where you can schedule when the winning issue variant should be sent to the remaining subscribers. If you specify a date in the future, you will still have the option of choosing a different winner during the interval before the mail-out.
The winner selection criteria may also be changed at any point while the testing is still in progress.
Special cases with tied results
If a draw occurs at the end of the testing phase (i.e. the top value of the tested statistic is achieved by multiple issue variants), the winner selection is postponed and evaluated again after one hour.
In certain situations, you may need to choose the winner manually even when using automatic selection, e.g. if you are testing the number of opened e-mails and all subscribers in the test group view the received issue.
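The draw handling described in this section amounts to a simple rule: a winner is only declared when one variant leads outright. A hedged sketch of that rule (hypothetical code, not the product's implementation):

```python
def evaluate_winner(stats):
    """Return the single leading variant name, or None when the top
    value is shared by several variants (a draw). On a draw the system
    re-evaluates an hour later; this sketch just signals that case."""
    best = max(stats.values())
    leaders = [name for name, value in stats.items() if value == best]
    return leaders[0] if len(leaders) == 1 else None

evaluate_winner({"Variant A": 120, "Variant B": 95})   # "Variant A" wins
evaluate_winner({"Variant A": 120, "Variant B": 120})  # None: draw, retry later
```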
Once the test is concluded and the winner is decided, the given variant is highlighted by a green background. At this point, the winning issue is mailed out to the remaining subscribers who were not included in the test group and no further actions are possible except for viewing the statistics of the variants.
The overall statistics of the A/B tested issue, including the e-mails used to deliver the winning variant to the subscribers outside of the test group, can be monitored in the usual way on the Issues tab of the newsletter. When viewing the opened e-mail or clickthrough data in the detail dialogs, you may use the additional Variants filter to display either the total (all) values for the entire issue, or only those of specific variants. The statistics of the winning variant include both the corresponding portion of the test group and the remainder of the subscribers who received it after the completion of the testing phase.