Like any dutiful marketing manager, I ran the test by setting up and sending two versions of the same email, each with a different subject line. But something surprising happened a day later, when I compared the open rates: subject line B ("Register early and save big") won by 33 percent. My intuition had failed me.
I messaged the outcome to my boss, who sauntered over to my workstation with "I told you so, Lizzy" written all over his face. What could I say? He was right that we can't just go with our gut every time — and that these testing tools exist so we can do our work in a more objective, unbiased fashion.
But this insight got me thinking: why do we only ever test subject lines? Subject lines are important, and no doubt a determining factor in whether people open or ignore an email. But what about the other parts worth testing, like time of day, day of the week, layout and design, messaging, and content? All of these affect outcomes and deliverability. So I started drafting an email to my boss, taking care not to come on too strong. (He likes to be the one with the big ideas.)
To my surprise, he took my email well and even complimented my “strategic thinking.” Then he gave me a new assignment: do some research and figure out the most important parts of an email to test, and the tools and technologies that make A/B split testing manageable, not too time-consuming.
That “manageable, not too time-consuming” part was music to my ears. And of course, receiving accolades from my boss left me glowing for days on end.
All of that aside, though, I learned an important lesson in marketing. Setting up and sending A/B split tests might feel like an unnecessary sidestep, but the process leads to findings that can boost your results and help you gain an edge as a modern marketer.
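For the curious, a result like that 33 percent lift is worth sanity-checking before acting on it: with small send lists, a big gap in open rates can still be noise. A common way to check is a two-proportion z-test. Here is a minimal Python sketch; the send and open counts are hypothetical numbers I made up to produce a similar lift, not figures from my actual campaign:

```python
from math import sqrt, erf

def two_proportion_z_test(opens_a, sent_a, opens_b, sent_b):
    """Compare two open rates with a two-sided two-proportion z-test."""
    p_a = opens_a / sent_a
    p_b = opens_b / sent_b
    # Pooled open rate under the null hypothesis (no real difference)
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: 1,000 recipients per variant,
# 15% open rate for A vs. 20% for B (a 33% relative lift)
z, p = two_proportion_z_test(opens_a=150, sent_a=1000,
                             opens_b=200, sent_b=1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up numbers the p-value comes out well under 0.05, so a lift that size on a list that size would be a real winner, not a fluke. With only a hundred recipients per variant, the same percentages would not clear that bar.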