Email Power Hour and the Prisoner of A/Bzkaban
Earlier this year we began hosting a recurring event at our San Francisco HQ – Email Power Hour. We invite local email aficionados and geeks alike (no dementors permitted) to come hang out with us and enjoy snacks, drinks and email nerdery. This past Monday marked the third such occurrence and this time we focused on the topic of A/B testing.
After a bit of chatting, snacking, and sipping, we settled in for an hour of email discussion. Alex briefly covered the basics of A/B testing emails, and then we dove into three awesome case studies.
Our friend Alli Shea, head of Growth Marketing over at Couchsurfing, often uses A/B testing in email as a low-friction, low-dependency way not only to optimize emails but also to test product messaging and imagery. Email test results can then serve as evidence when working with product and dev teams on future product changes in other mediums, like website messaging.
For anyone unfamiliar with Couchsurfing, it hosts a network of over 4 million people from around the world who are willing to open their lives and homes (free of charge) to travelers bouncing around the globe. Alli recently ran a copy test on the welcome email series to hone the product messaging for Verification. While Couchsurfing is a free service at its core, it offers a verification option that ensures an ad-free website/app experience as well as an increased likelihood of host acceptance for surfers.
The test was a small, simple variation: changing the messaging strategy for Verification:
A) “Everyone else is doing it”
B) “What’s in it for me”
Option B produced a 23% higher send-to-convert rate than Option A. This messaging strategy, promoting a better host acceptance rate, was then carried over to the Verification landing page. Next up, Alli plans to test other elements of the email, such as the balloon graphic and highlighting different surfers, to find the best combination and continue improving the onboarding series. This discussion surfaced our first takeaway of the evening: document, document, document. Be sure to record variant details and results so you can share learnings with team members and don't have to duplicate tests in the future.
Next up, Justin Khoo of Email on Acid shared a couple of small-batch A/B tests he has run lately with his own projects, Fresh Inbox and Campaign Workhub.
Justin has been working on an email proofing tool called Campaign Workhub, and as he gets ready to launch, he opted to A/B test a message to his signup list:
A) Graphic at the top of the email demonstrates what the tool does, copy references “early access,” and CTA offers a 30% discount for a “limited time.”
B) Shorter length email, green button CTA, messaging is around early access and does not reference an offer.
Version B saw a 45% higher click-through rate than Version A. Discussing this test with the group, we found it interesting that the simpler version without an offer performed better, which led us to a conclusion about the nature of A/B tests and our second best practice for the night: test one element at a time so you can isolate the winning component. On further discussion, the idea was raised that maybe this wasn't a true A/B test after all, since the versions differed so much and Version B never mentioned a cost for the tool. This was also a relatively small-batch test, but Justin looks forward to trying more variations in future tests.
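The "small batch" caveat matters: on a small list, even a big relative lift can be noise. As a rough sanity check (a minimal sketch, not part of Justin's actual process; the send and click counts below are made up for illustration), you can run a two-proportion z-test on the raw click counts before declaring a winner:

```python
import math

def two_proportion_z_test(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test: is the gap in click rates between
    variants A and B likely real, or just sampling noise?"""
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    # Pooled click rate under the null hypothesis (no real difference)
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical small-batch numbers: a ~45% relative lift in clicks
z, p = two_proportion_z_test(20, 500, 29, 500)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these made-up numbers the p-value lands well above 0.05, so a lift that looks dramatic in percentage terms still wouldn't clear a conventional significance bar — a good reason to rerun promising variants on larger sends.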
The third and final A/B test we reviewed was for a weekly newsletter digest email that Justin sends to his Fresh Inbox subscribers. When Justin isn't writing amazing content for Email on Acid or developing his own product (maybe he also uses a Time-Turner like Hermione?), he maintains Fresh Inbox: a great blog covering interactive and advanced email design techniques and news. While an A/B test could target many different email elements — copy, graphics, headings, layout, CTA — this one was all about the subject line.
Subject line A: #EmailGeeks Digest: Interactive Email Galore
Subject line B: #EmailGeeks Digest: Interactive Pokemon GO Email
Anyone who hasn't been living under a proverbial rock this year is aware that Pokemon GO launched a few months ago and has been wildly popular. Justin typically pulls a topic from the newsletter to mention in the subject line; this edition had seven candidates, and featuring Pokemon GO seemed like a no-brainer.
The results were pretty interesting, but not in the way you'd think: the open rates ended up nearly the same (despite an initial surge for Version B, which mentioned Pokemon GO). While open rates were similar, click-through rates were 23% higher for the Pokemon GO version, and the distribution of clicks across articles was noticeably different:
While the article about Nest was clearly more popular under the topic-agnostic Version A subject line, clicks skewed toward Pokemon GO content for Version B. The takeaway from this test: really consider the impact of changes like subject-line topic, as they affect not only open rate but also how readers interact with the email and other engagement metrics. Digest newsletters are a little different, since the topics likely won't all share one goal or theme, but it's something to keep in mind while you test.
That brings our Email Power Hour event recap to a close; the full slide deck from the presentation is located here.
One aside before we go: if you're looking for an email platform to help you run A/B tests (or C/D/E/etc. — you can easily set up as many versions as you'd like), look no further.