Testing Personalization, Take 3 [Case Study]
From:
Jeanne S. Jennings -- Author, The Email Marketing Kit
For Immediate Release:
Dateline: Washington, DC
Tuesday, March 7, 2023


Personalization always boosts performance – right? Here we once again test that conventional wisdom. Read on for links to the first two personalization tests we did for this client.

As always, don’t just take these case study test results and apply them to your own program – do your own testing to see how your email subscribers respond to personalization.

Background

Many of the products this client sold were themselves personalized with company logos and names. As a result, showing products that were personalized with the recipient’s name or company logo seemed to make a lot of sense.

This was the third personalization test we did; it was to be a ‘tie-breaker.’ At this point, we had one test that showed a 96% increase in revenue from personalization, and another test which showed a 67% decrease in revenue from personalization. So this test was intended to decide whether or not to make personalized email a standard part of our program.

Test Set-up

For our third personalization test, we decided to do something similar to our first test. We chose a product, an activity book for children, that was traditionally purchased in bulk and personalized with the customer’s school or church name (schools and churches were the target audience).

Our control was what we would usually send – it had an image of the product at the top; in the space where the school or church name would be imprinted it read "Your Imprint Here," and a call-out next to it said "Your Personalization available on the cover!"

For the test version, we were able to dynamically add the name of the recipient’s school or church in place of the generic “Your Imprint Here” message. We kept the “Your Personalization available on the cover!” call-out here, for consistency and so it was clear that these books could be personalized.    

Wireframes of both versions appear below. The gray area was the same for both versions (I’ve truncated the wireframes to save space); the only difference was in the product image, which is green here.

To try to address what we thought may have been a bias in previous tests, we actually split the list into three segments. The first segment was the smallest – if we did not have a school or church name for a recipient, they were put into this segment. Since we didn’t have a school or church name to use for personalization, this group received the “Your Imprint Here” version of the creative.  Doing this allowed us to be certain that everyone in the other two segments was someone we had a valid school or church name for.

Once the segment-one recipients were removed, we split the balance randomly into two groups: one would receive the "Your Imprint Here" control and the other would receive the personalized test.
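In code terms, the segmentation described above might look something like the sketch below. The field names and data shape are my assumptions for illustration – the client's actual subscriber data and split mechanics were surely different.

```python
import random

def segment_for_test(recipients, seed=2023):
    """Split a mailing list into three segments: no-name, control, and test.

    recipients: list of dicts, e.g. {"email": "...", "org_name": "..." or None}
    (this dict shape is a hypothetical stand-in for real subscriber records).
    """
    # Segment 1: no school/church name on file -> always gets the generic
    # "Your Imprint Here" control, since we have nothing to personalize with.
    no_name = [r for r in recipients if not r.get("org_name")]
    eligible = [r for r in recipients if r.get("org_name")]

    # Randomly split the remaining (personalizable) names 50/50
    # into control and test cells.
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    rng.shuffle(eligible)
    half = len(eligible) // 2
    control, test = eligible[:half], eligible[half:]
    return no_name, control, test
```

Filtering out the no-name segment first is what guarantees that everyone in the control and test cells has a valid school or church name on file.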

The control and test groups each contained more than 36,000 names, above my minimum cell size of 20,000, which almost always allows me to get statistically significant results. See the split below.

The 'No School/Church Name Available' segment wasn't used for the test, so the quantity there doesn't matter. Of the rest of the list, 50% received the test/personalized version, while the balance (50%) received the control/non-personalized "Your Imprint Here" version.

As always, we kept everything except the personalization the same, so we could get clear results.

Which version do you think won? Was it the test with the personalization? Or the control without the personalization?

I’ll give you a minute to think about it… then scroll down to get the answer.

Results

Here are the results!

That’s right. Neither version of the email generated a single sale. For the record, neither did the ‘No School or Church Name Available’ segment.

How many of you guessed correctly?

Our key performance indicator (KPI) here is revenue-per-thousand-emails-sent (RPME), because we are looking to optimize revenue.

Note: We used RPME instead of revenue-per-email (RPE) because for this client the RPE numbers were often very small. Increasing the magnitude makes it easier to see variances – and the relative variance between the cells remains the same.
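As a rough illustration of the arithmetic, here is how RPE and RPME relate – the figures below are invented for demonstration, not from this campaign (which, as noted, produced zero revenue):

```python
def rpe(revenue: float, emails_sent: int) -> float:
    """Revenue per email sent (RPE) -- often an awkwardly tiny number."""
    return revenue / emails_sent

def rpme(revenue: float, emails_sent: int) -> float:
    """Revenue per thousand emails sent (RPME) -- same ratio, scaled x1000."""
    return revenue / emails_sent * 1000

# Hypothetical: $180 in revenue from 36,000 emails.
print(rpe(180.0, 36_000))   # 0.005  -> hard to eyeball differences
print(rpme(180.0, 36_000))  # 5.0    -> variances are easier to see
```

Because RPME is just RPE multiplied by 1,000, the relative difference between any two cells is identical under either metric.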

Here are the full results.

As you can see, none of the versions produced any revenue. We checked to make sure the tracking was in place – it was. This wasn’t an error, it was just a campaign with no sales and zero revenue.

Going deeper, we see that the non-personalized control bested the personalized test on click-through rate (CTR), and as a result it also bested the test on click-to-open rate (CTOR). The variance was more than 20%, but even so, none of those clicks turned into sales – just more proof that CTR is not a good KPI.
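For reference, these two rates are typically computed as below. The formulas are the standard industry definitions; the counts are invented purely for illustration:

```python
def ctr(unique_clicks: int, delivered: int) -> float:
    """Click-through rate: unique clicks as a share of delivered emails."""
    return unique_clicks / delivered

def ctor(unique_clicks: int, unique_opens: int) -> float:
    """Click-to-open rate: unique clicks as a share of unique opens."""
    return unique_clicks / unique_opens

# Invented counts: 36,000 delivered, 5,400 opens, 540 clicks.
print(f"CTR:  {ctr(540, 36_000):.2%}")   # CTR:  1.50%
print(f"CTOR: {ctor(540, 5_400):.2%}")   # CTOR: 10.00%
```

Note that neither rate says anything about revenue, which is exactly why a version can win on CTR and CTOR and still lose (or, here, tie at zero) on the metric that matters.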

Were we surprised by the result?

Yes. We were shocked. And saddened. We had such high hopes for personalization after the results of the first test. We hoped that the second test was an anomaly. We had anticipated that using the name of the recipient’s school or church on the product would boost performance. It did not.

Take-aways

So, does this mean that you should not personalize your email messages?  

No.

When we debriefed on this test, I learned that the activity book was a bit of a problem-child product. The client had a lot of them in inventory; they hadn't sold well. The thought was that personalization in the email would help that problem. But as you can see, it did not. Personalization can't turn an undesirable product into one that's wanted.

Since none were sold at all, we chalked this up to a poor product choice.

It’s important to hypothesize about what might have impacted your results, win or lose. Doing this after a losing test is almost more important, as you are often able to adjust your approach and win on your next test.

Which is what we did on our next, our fourth, personalization test. Watch this blog for details…

In the meantime, give it a try and let me know how it goes!

Be safe, stay well,

News Media Interview Contact
Name: Jeanne S. Jennings
Title: Author, The Email Marketing Kit
Dateline: Washington, DC United States
Direct Phone: 202-333-3245
Cell Phone: 202-365-0423