A/B testing: Channel

If you've spent any time on my website, you'll know I am not the biggest fan of A/B testing. Does that mean you should never do it? No.

When creating an actionable A/B test, I set out to answer the following questions:

What do I want to test?

Why do I want to test it?

How much time is it going to take me to run this test?

What am I going to do with the results?

With the above answered, I can move forward confidently with a solid testing strategy, knowing that I'm not wasting time.

Introducing: A/B testing owned channels.

What do I want to test?

I want to test the same message across multiple channels to determine whether one channel converts at a higher rate than the others.

I will be running a multivariate test for a SaaS company, focused on encouraging users to come back to their “cart” and complete their subscription to the app.

A faux-abandoned cart!

Why do I want to test it?

I want to understand how certain channels impact user behavior compared to others, and ideally come up with a governance framework for which channel is best suited to which type of message.

How much time is it going to take me to run this test?

I want to run this test over a 30-day period. The message will not always display at the same time for these users, and I believe this timeframe will give me room to reach a healthy sample size.
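One way to sanity-check whether 30 days is long enough is a standard two-proportion sample-size calculation. Every number below (baseline cart-completion rate, target lift, confidence, power) is a placeholder assumption, not data from this test:

```python
import math

# Hypothetical inputs -- swap in your own baseline and target lift.
baseline_rate = 0.03   # assumed current cart-completion rate (3%)
target_rate = 0.045    # smallest rate worth detecting (a 50% relative lift)
z_alpha = 1.96         # two-sided 95% confidence
z_beta = 0.8416        # 80% power

# Classic two-proportion sample-size formula (per variant/channel).
p_bar = (baseline_rate + target_rate) / 2
numerator = (
    z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
    + z_beta * math.sqrt(
        baseline_rate * (1 - baseline_rate)
        + target_rate * (1 - target_rate)
    )
) ** 2
n_per_channel = math.ceil(numerator / (target_rate - baseline_rate) ** 2)

print(f"Need roughly {n_per_channel} eligible users per channel")
```

If 30 days of eligible traffic comfortably clears that per-channel number, the window is probably fine; if not, the test needs to run longer or detect only a larger lift.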

What am I going to do with the results?

I would like to run this test with other message types first, but once I have an idea of which channels convert at the highest rate and volume, I would like to translate those learnings into a framework the rest of the team can reference when developing marketing strategy.

Something that outlines:

  • Promotional campaigns aimed at re-converting lapsed subscribers work best as push notifications

  • Product promotions aimed at driving first-time feature adoption work best as in-app messages…

etc.
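A framework like that could eventually be captured as a simple shared lookup. A minimal sketch, where every message-type-to-channel pairing is invented for illustration rather than taken from real results:

```python
# Hypothetical channel-governance playbook: message type -> best channel.
# All pairings are illustrative placeholders pending real test results.
channel_playbook = {
    "winback_promotion": "push",    # re-converting lapsed subscribers
    "feature_adoption": "in_app",   # first-time feature activation
    "abandoned_cart": "email",      # placeholder pending this test
}

def pick_channel(message_type: str) -> str:
    """Return the recommended channel, defaulting to email if unmapped."""
    return channel_playbook.get(message_type, "email")

print(pick_channel("winback_promotion"))
```

The value of writing it down this way is that the default is explicit: anything the team hasn't tested yet falls back to one agreed channel instead of an ad-hoc choice.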

My hypothesis

My hypothesis is that email will outperform the other two channels, as it offers a larger canvas for the copy and imagery that might convert the subscriber.

The results

In-app messages had the highest conversion rate, but with this channel I believe the subscriber has the most intent. They are inside the app, on their own time. These subscribers might be the most willing to convert, since they are expressing interest in the app by starting a session. However, this channel reached fewer users overall, because not everyone who was eligible for this message opened the app within the eligibility window.

Push notifications, on the other hand, are urgent. They are immediately in the user's face (similar to SMS) and will often have a smaller eligible audience due to fewer users opting into push (always a requirement on iOS, and soon to be one on Android). This channel also demands short and sweet messaging.

Email, as we know, takes people longer to engage with, so should email be given a longer conversion window? Although it reached the most users of the three channels, some users had unsubscribed and were therefore excluded from receiving the message.

Upon reflection, in-app had the highest conversion rate compared to email and push, but it also missed a chunk of the audience because it did not extend outside of the app. Although I don't know if I can declare this test a success, it provided a lot of learnings, discussion points, and opportunities to use these channels together. I personally am a huge fan of testing different message types across channels to see what my audience responds best to - so long as I'm also assessing external factors!
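One way to make the "hard to compare" point concrete is to score each channel on both conversion rate and total conversions. With invented numbers shaped like the pattern above (not the actual test data), the two metrics pick different winners:

```python
# Hypothetical 30-day results for one eligible audience.
# "reached" counts users who could actually see the message in-window.
results = {
    "in_app": {"reached": 3_000, "converted": 240},   # high intent, low reach
    "push":   {"reached": 5_500, "converted": 220},   # opt-in limits audience
    "email":  {"reached": 9_000, "converted": 360},   # widest reach
}

for channel, r in results.items():
    rate = r["converted"] / r["reached"]
    print(f"{channel:7s} rate={rate:.1%}  total conversions={r['converted']}")

# The channel with the best rate and the channel with the most
# conversions need not be the same one.
best_rate = max(results, key=lambda c: results[c]["converted"] / results[c]["reached"])
most_conversions = max(results, key=lambda c: results[c]["converted"])
print(f"best rate: {best_rate}, most conversions: {most_conversions}")
```

In this sketch in-app wins on rate while email wins on volume, which is exactly why declaring a single "winning channel" is tricky.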

TLDR: These channels are difficult to compare.
