Split-Testing Responsive Email Design: You’re Doing it Wrong

I watched a webinar yesterday in which the presenter was strongly advocating a split test between a responsive email and a non-responsive one (ideally the same email, just sans media queries), and I was reminded of a rant I half-wrote on my flight back from the Litmus Conference last year. There are still too many swear words in that, but I still think this approach demonstrates a lack of understanding of the benefits of going responsive.

Responsive email design is an experience thing. It’s about making the experience better for a user when they’re on a mobile device, and that’s a very difficult thing to quantify with numbers. Any difference in opens or clicks can be down to so many things that it’s at best only a vague indicator. Even if you look at conversions, there are so many factors at play that it isn’t a reliable statistic.

The true measure of something like responsive design is to gauge the user’s comprehension of the message and how happy they are with the brand as a result. But whilst that’s perhaps the ultimate aim in marketing, it’s incredibly hard to put a number on.

Incidentally, towards the end of my flight on that trip home, the flight attendants handed out free choc-ices to everyone. That wasn’t part of the in-flight meal (that was long gone), so what was the return on investment on that? Could you split test that and see who re-books another flight? No, you can’t. There’s no reliable correlation there whatsoever. But I’ve still flown with Virgin Atlantic three times since then.

Image: Ivan Colic / Noun Project



  • Emailer

    But equally, companies want to know if it’s making a difference, as they often have to outlay costs to make their templates responsive.

    • http://www.actionrocket.co Elliot Ross

      True – they do, and I agree with that, but the assumption that opens/clicks are the right thing to measure is dangerous. They can be an indicator that something has changed, but it’s hard to make any judgement beyond that, and therefore there’s not much value in it.

  • http://twitter.com/philipstorey Philip Storey

    I agree. Who cares whether responsive design drives more traffic or revenue? It’s a matter of creating the right user experience for mobile scenarios. Simple as that.

    • Kirsty

      Unfortunately, clients mind. When we put the effort into a responsive template and charge them for it, they need to see a return on that investment, especially as we work mostly with small/medium businesses.

    • James

      If the business goal is generating revenue (as most are), then the best user experience aligns with generating the most long-term revenue.

  • James

    Incidentally, towards the end of my flight on that trip home, the flight attendants handed out free choc-ices to everyone. That wasn’t part of the in-flight meal (that was long gone), so what was the return on investment on that? Could you split test that and see who re-books another flight?

    Technically, yes, you could. There is nothing stopping the airline from offering this benefit on half its flights and tracking the results.

    Split testing allows companies to understand the impact of their decisions. Perhaps using opens or clicks does not make sense in this instance, but it’s good practice to test changes: define the desired outcome before implementing the change, identify how it can be measured, and then determine the viability of running the test. In the case of responsive email design, I could see the best metrics taking months or years to collect, making split testing cost-prohibitive (the rough sample-size sketch after this thread illustrates why).

    I guess what I’m saying is that we should be wary of the methodology, not the desire for a data-driven approach. How can we measure the impact and improve the methodology?

    On a personal note, we have tested responsive design on both our website and our emails, with both showing a positive increase in all metrics. Even though we would have moved forward with responsive design either way, it was a very useful endeavor. Conducting the tests not only validated the investment in responsive design, it also quantified the revenue impact, which allowed us to invest further in perfecting and enhancing the responsive design beyond the original budget and scope.
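
To make the viability question above concrete, here is a rough back-of-the-envelope sketch of how many recipients per variant a responsive-vs-non-responsive split test would need before a click-rate difference stops being noise. The numbers (a 2% baseline click rate and a hoped-for 10% relative uplift) and the function name are hypothetical, not taken from the discussion above; this is just the standard two-proportion sample-size calculation.

```python
from statistics import NormalDist


def sample_size_per_variant(baseline, relative_uplift, alpha=0.05, power=0.80):
    """Approximate recipients needed per variant to detect the uplift
    with a two-sided two-proportion z-test (hypothetical helper)."""
    p1 = baseline                            # click rate of the control email
    p2 = baseline * (1 + relative_uplift)    # hoped-for click rate of the variant
    p_bar = (p1 + p2) / 2

    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for significance
    z_beta = NormalDist().inv_cdf(power)           # critical value for power

    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return int(numerator / (p2 - p1) ** 2) + 1


# Hypothetical numbers: 2% baseline click rate, hoping to detect a 10% relative lift.
n = sample_size_per_variant(baseline=0.02, relative_uplift=0.10)
print(f"Roughly {n:,} recipients per variant")  # ~80,000 per variant at these rates
```

At those rates you would need something like 160,000 sends split evenly between the two versions, and far more to detect a smaller lift, which is why a pure opens/clicks split test can take months or years of sends to reach a meaningful answer for smaller senders.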