Real Magnet

Wagging the Dog: Using Email Analytics To Evaluate What Email Promotes

Marketing isn’t all about the message and the audience. For marketing to be effective, the product or service being marketed also shapes the results. While it is true that many excellent products never take off due to inadequate marketing, it is equally true that mediocre or unappealing products fail despite the most aggressive and sophisticated campaigns. Look no further than a list of movies opening this weekend – for every $100 million blockbuster on the list, there’s a $5 million bomb, both of which may have been marketed by the same A-list agency with a comparable 7-figure budget.

The same thing may be happening in your organization. Not everything we market is a breakout success. As marketers, we often control only the marketing and not the product itself, so all we can do is try to amplify, optimize and innovate our way to better results for a flagging product. In my experience, it is not uncommon for marketing to be on the hook for an event or webinar or subscription or new product developed by another part of the organization, particularly if the marketing team has been largely successful in the past. There is a bit of irony in there, to be sure. If a marketing team is consistently successful, a poorly performing product is more likely the fault of the product. But more often than not, the questions that arise are more akin to, “How else can we communicate this?” than “What can we do to make this thing more appealing?” It’s easier to change marketing than it is to change a product, and responsibility follows the path of least resistance.

But in these instances, it’s not just the marketer’s responsibility to figure out how to boost results. It is also marketing’s responsibility to let the product team know when the product itself is coming up short. The product team should know so that they can make improvements next time around, or this time if it’s not too late. There is some self-interest in this recommendation, too: the marketer does not want to get stuck marketing the next product with the same shortcomings that made this one so difficult.

It’s always a disappointment when marketing doesn’t generate the results we want. But some dogs just won’t hunt. These are the ones that email should wag, by using analytics to evaluate what is being marketed, not the marketing itself. Here is how to tell whether what needs tweaking is the product, and not the marketing:

1. A history of consistency: First off, your email marketing has to show consistently successful results promoting comparable products – whether they are other conferences or webinars, or selling subscriptions to other products, or launching other new products. Being able to demonstrate – through results – that your email marketing knows what it’s doing is the first step towards convincing yourself that the marketing might not be the problem. Significantly, it is also the first step towards convincing whoever is responsible for the product. They are already fully confident that they know what they are doing, yet something is still not going well. Backing up your suspicions about the product with some data that helps underscore your own expertise can make the discussion less confrontational and help a product manager or event producer entertain your point of view. How much history of strong results is necessary? At least two years if comparable products have been marketed several times in each year, even longer if this product or one like it only hits the market once annually. Results can be skewed – up or down – for one or two cycles based on factors outside of the marketer’s control, such as a poor economy or a position in the marketplace bolstered by a major event or partner.

2. Outliers in results: Once you’ve identified your history of consistency, you’re looking for something in this cycle that is decidedly inconsistent. For example, say you track average open and click-through rates on the 12 events you have marketed in the past three years, and find that open rates range from 15% to 25% and click-throughs from 3% to 5%. In this cycle, your open rates are still strong, but your click-through rates are hovering around 2%. Put it in a chart and you’ll see (and show) how dramatic an aberration the current cycle is. The high open rates in your data suggest that you haven’t lost your audience’s attention, but the flagging click-through rate is an indication that once your audience reads the message, they are not moved to action nearly as readily.
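The comparison above is simple enough to script. Here is a minimal sketch, using hypothetical historical numbers (the rates and the min–max rule are assumptions for illustration; your platform’s exported metrics would replace them):

```python
# Hypothetical open and click-through rates from 12 comparable past campaigns.
historical_open_rates = [0.15, 0.18, 0.22, 0.25, 0.17, 0.20,
                         0.19, 0.24, 0.16, 0.21, 0.23, 0.18]
historical_ctrs = [0.030, 0.040, 0.050, 0.035, 0.045, 0.040,
                   0.050, 0.030, 0.042, 0.038, 0.047, 0.033]

def is_outlier(current, history):
    """Flag a metric that falls outside the historical min-max range."""
    return not (min(history) <= current <= max(history))

# This cycle: open rate still in range, click-through rate below it.
current_open, current_ctr = 0.21, 0.02
print(is_outlier(current_open, historical_open_rates))  # False: opens look normal
print(is_outlier(current_ctr, historical_ctrs))         # True: CTR is the aberration
```

A min–max range is the crudest possible test; with more history you might flag anything more than two standard deviations from the mean instead, but the point is the same – let the chart (or the script) show which metric broke from the pattern.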

3. Confirm with A/B testing: If you do find some outliers in your data, the next step is to play devil’s advocate with yourself. Maybe it is the marketing after all. To find out, try some A/B testing of the way you present the product or offer. You don’t want to stray too far from your regular positioning, but you do want to exhaust the marketing possibilities available to you. Through email this is fairly easy to do. For example, if your open rates remain strong and your click-throughs are low (as above), leave your subject lines alone and focus on presenting the copy differently. Use A/B testing to split your list, focusing on one key attribute in one message, and another in the other. For an event message focused on value, for example, one message might communicate the low overall cost of attending the event, while the other emphasizes the ROI to be gained through attending. Your objective is not to prove that your marketing works but the product is broken. Rather, it’s to do everything you can – within the current positioning of your brand – to see if you can lift results. This way, when the product manager or event producer asks, “Well, did you try this?” you can reply, “Yes, and that and that as well. We can’t move the needle.”
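Before declaring that neither variant moved the needle, it’s worth checking whether the difference between the two messages is even statistically meaningful. One common approach is a two-proportion z-test on the click counts; this sketch uses made-up send and click numbers for illustration:

```python
import math

def two_proportion_z(clicks_a, sends_a, clicks_b, sends_b):
    """Two-proportion z-test comparing click-through rates of an A/B split."""
    p_a = clicks_a / sends_a
    p_b = clicks_b / sends_b
    p_pool = (clicks_a + clicks_b) / (sends_a + sends_b)  # pooled rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

# Hypothetical split: 5,000 recipients per variant.
z = two_proportion_z(clicks_a=110, sends_a=5000, clicks_b=95, sends_b=5000)
significant = abs(z) > 1.96  # roughly the 95% confidence threshold
print(significant)
```

If `significant` comes back `False` across every variant you try – cost framing, ROI framing, and so on – that is exactly the “we can’t move the needle” evidence to bring to the product conversation.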

Once you assure yourself that you’ve done everything you can to improve the response to the product or event you’re marketing, and can reasonably conclude that the shortcomings are in the product itself, then you can have a constructive (and diplomatic) conversation with the folks on the product side.