The Warc Blog


On quantitative pre-testing
Left Field

I'm planning - with your help - to use the Warc database to explore current hot topics. Each post will draw on new material appearing in the archive that I think may be interesting and relevant. But your comments will be my guide.

I'm kicking off with an issue that has been running for decades but never seems to get resolved: quantitative pre-testing (or copy testing as many will know it).

I'm prompted by an intriguing feature in January's Admap by Orlando Wood from Brainjuicer.

I've been examining evidence for some time to explain two findings from Les Binet's and my analysis of the IPA dataBank: that pre-tested campaigns generally perform significantly less well in business terms than non-pre-tested ones (the effectiveness success rate); and that the gulf becomes overwhelming if you look at the profitability results:


[Chart: effectiveness and profitability success rates, pre-tested vs non-pre-tested campaigns]

Like many progressive research companies, Brainjuicer starts from two premises: that the level of emotional engagement a communication achieves is what will drive profitability; and that the way this is measured by traditional pre-testing techniques is woefully inadequate.

So they have pitched their 'Facetrace' technique for probing emotional responses against a traditional approach that asks consumers to report how they feel about the campaign. (Traditional measurement has been so comprehensively attacked that it is amazing it is still used.)

The traditional approach also incorporates a lot of standout and persuasion metrics, whereas Brainjuicer's approach uses some clever analytics to weight purely emotional metrics. The yardstick was which approach best predicted the actual scale of business success across a range of IPA Effectiveness Awards case studies. Unsurprisingly, the Brainjuicer approach won hands down, but that isn't the end of the story.

Orlando's article goes some way to explaining this. The dataBank suggests that emotional campaigns are much more profitable than rational ones - and the Brainjuicer experiment suggests that traditional pre-testing methods probably filter out the most emotive campaigns, and therefore also filter out the most profitable ones.

It appears that traditional techniques suffer from a double-whammy of weaknesses: not only are they poor at measuring the potency of emotional campaigns, but also, because of this, they are very poor at predicting the ability of campaigns to reduce price sensitivity (the key driver of profitability and the great strength of emotional campaigns). This is not surprising - these techniques were born from a persuasion model of advertising in which essentially rational campaigns persuaded consumers to buy more of the brand. Some hasty emotional bolt-ons were added to appeal to smart marketers chasing emotional engagement, but more fundamental re-engineering is urgently needed.

If you start from a model that believes in building consumer esteem for brands you will arrive at a technique like Brainjuicer's that places emotional effects at the core, not as a bolt-on. And coming back to their experiment, the key area where Brainjuicer's technique outperformed the traditional approach was in its ability to predict price sensitivity effects. It's a slam dunk that I hope will encourage the many experimental research techniques that aim to improve emotional response measurement and give it its deserved importance.

My closing thought is this. Even if you don't buy my argument and think traditional pre-testing techniques continue to serve us well, then consider the future. The great asset of online within a campaign is its ability to drive word-of-mouse for the brand - campaigns that do this successfully are the high-fliers of effectiveness. Try kidding yourself that traditional pre-testing can predict the WOM potential of a campaign. Given up yet?

Subjects: Marketing, Advertising

25 January 2010 15:59

There is 1 comment on this blog

Two quick points: At Millward Brown we absolutely understand how emotional response can be an important driver of sales success. For findings based on hundreds of ads (rather than just 18), see the Knowledge Point 'Should my advertising stimulate an emotional response?' on the website (in the knowledge center, knowledge points, advertising). Secondly, we know definitively that advertising pre-testing can predict the viral potential of an ad campaign. I'll be showcasing this learning at the WARC MAP 2010 conference on March 10th. Look forward to seeing you there!
Duncan S. 30 November 2010 at 10:16am
