In content marketing, what you don’t know can hurt you

“But it tested really well…”

Ever heard (or worse, said) that one?

It typically comes from a brand manager who has just been berated by his boss about how terrible the new advertising is and how sales are tanking as a result.

Or maybe it’s the president of a big media company who needs to explain to her shareholders and clients why the “star show” they bought for millions of dollars hasn’t delivered the audiences that the models said it would.

Part of the brand manager’s or president’s case, no doubt, will be a collection of ‘maybes’ that attempt to rationalize the poor performance.

“It was the media plan”

“Competitor x ran a great promotion”

“The weather was particularly great that fall”

“We had no idea a show about a swamp family would rate so highly and steal our audience”

What you’ll never hear them admit, however, is that they’ve been measuring the wrong thing: that they’ve been asking of the data something it simply can’t answer.

But that’s precisely what’s happening.

The typical testing around a new piece of copy or a new show relies on a consumer’s conscious, rationalized response, which is normally captured through a focus group or survey.

But content creators seek an emotional response, not one’s prompted articulation of an emotion. The former is instantaneous, uncontrollable and can powerfully encode to memory; the latter, considered, rationalized and easily forgotten.

We confuse what is felt with what is stated as felt. The problem is that, in seeking to evaluate the former, we actually end up evaluating the latter, which is why you have brand managers and content testers asking of the focus group “data” more than it’s equipped to answer.

The best creatives and producers have known this for some time. That’s why they eschew this type of “creative testing” or “concept optimization”. They lament not only that focus groups destroy good creative but, more specifically, that they’re not even measuring the right thing.

And this can hurt you.

One packaged goods company we worked with struggled to understand why its campaign, which had “tested so well”, wasn’t driving any share gains (and was, in fact, in market during a period of share loss).

They had spent hundreds of thousands in research, and millions in creative production for a series of four spots that sought to create a persona around their brand’s character.

We analyzed the subconscious activity of their target segment against these spots, as well as their previous campaign spots, a handful of top competitors’ spots and a host of other content.

What we found was very different from “tested so well”. The target had completely tuned out for most of each spot. The stories were disjointed, inauthentic and too polished. Competitors, on the other hand, had much more cohesive and authentic storylines, using grainy, vintage imagery to win over a target that traditionally avoids advertising.

Where the audience did tune in (which may explain why the spots “tested well”) was when popular music was featured. The audience likely confused their enjoyment of the music with their opinion of the spots.

The risk here is fairly clear. Your target doesn’t have a great grasp of why they respond the way they do.

To paraphrase Henry Ford, if you accepted what people said about your product and tweaked it accordingly, you’d end up with a faster horse and cart.

Not the Model T.

Or, to put it more bluntly, as one CEO of a big packaged goods company recently told me, “what could a group of 12 idiots in a room tell me about my advertising?”


So what to do?

There are two clear paths forward. The first is to better understand the circumstances in which focus groups are useful. Specifically, if you’re looking to chat with your customers and understand their stated opinions, then yes, focus groups are meaningful, as are surveys, social media monitoring and other methodologies that delve into the stated opinions of your consumers and prospects.

The second path, though, is to recognize the limitations of these methodologies and pursue other means of evaluating the physiological responses of your consumers to your marketing stimuli.

To measure feelings.

It’s the only way to understand whether content truly performs as it’s meant to: by evoking an emotional response. And we can only improve what we can measure.

It’s not as ‘out there’ as you may think, but it is crucial. As an advertiser, you’re competing against more now than ever before for that corner of the consumer’s mind that your brand wants, indeed needs, to occupy. The quickest path to that real estate in the brain is through the heart.

Which means that content and advertising choices should be made to reflect how the audience or consumer actually feels when he or she is consuming that content.

Only then will you know whether it tested well.

Until that time, expect to say and hear “But…” a lot more.