First, a big thank you for the encouraging comments on previous posts – much appreciated.
I'm prompted to post once more on the subject of pre-testing by two recent events. One was seeing Duncan Southgate's fascinating presentation at the MAP 2010 conference, which was mostly concerned with viral viewing and how to pre-test for viral success. The other was re-reading David Penn's fantastic 2008 ESOMAR paper in the light of my recent exploration of the pioneering pre-testing work being done by Brainjuicer and Cogresearch (see my recent posts here and here).
Penn's company, Conquest, is also pioneering a new technique to improve the all-important measurement of emotional response to communications that is at the heart of business success for brands. In some ways similar to Brainjuicer's approach, Conquest's 'Metaphorix' technique uses visual metaphors in the shape of animated avatars as a way of gathering more intuitive and instantaneous emotional responses.
In his paper, Penn quotes extensively from academic research highlighting the inadequacy of traditional verbal questioning techniques in eliciting the true scale of feelings, and supporting the use of metaphors as an improvement.
In a nutshell, the research reminds us that the more we ask consumers to think how they feel about an ad, the more distant and diluted the reported response becomes. Thus you might expect ads intended to work primarily through emotions to be underestimated by traditional verbal research techniques, whereas ads intended to work through rational messages would suffer no such disadvantages.
The research suggests that metaphors work by short-circuiting the consideration process, enabling a consumer's emotional brain to speak more directly in research with less interference from the deadening hand of their cognitive brain. So when compared to a traditional verbal technique, you would expect Metaphorix to be more sensitive to ads with a strong emotional modus operandi. And Conquest's impressive large-scale validation showed exactly that.
In a revealing parallel with Brainjuicer's experimental findings, Conquest found that ads that provoked only low-key emotional responses scored similarly in traditional verbal and Metaphorix methodologies. However, at the high end of the emotional response scale, Metaphorix responses were considerably more elevated than traditional verbal responses (which revealingly had begun to tail off). There seems to be an inescapable conclusion from these and many other pieces of work (e.g. TNS pre-testing Sony Bravia commercials) that traditional pre-testing techniques are simply not up to the demands of the modern highly emotionally charged world of successful communications.
It seems to me that nowhere is emotional engagement more important than for campaigns hoping to enjoy viral success and word-of-mouth in general. How many successful virals do you know that are simply informative?
At MAP 2010, Southgate showed the Old Spice US commercial that has now enjoyed over 5 million YouTube views to illustrate the qualities that we should look for. I found his LEGS criteria for viral success (Laugh-out-loud funny, Edgy, Gripping or Sexy) very convincing and useful, though personally I would have added a few more legs to the beast. So I was expecting to see a new viral pre-testing tool built on these very sensible criteria. Instead he presented 'ABCD' – that is, Awareness Index (surprise!), Buzz, Celebrity and Distinctiveness. Who could argue with a buzz measure, or a distinctiveness one? Celebrity seems to me to be another leg (if you'll excuse my obsession with legs), but perfectly sensible. But the AI? Are we to believe that branded cut-through drives the desire to pass along ads? And that measures of surprise and excitement proved less valuable (as Southgate reported)?
I think the 'powers that be' at Millward Brown should back off Mr Southgate and allow him to develop his original thinking as I suspect he must have wanted. He really does have great LEGS.