Fresh research is helping advertisers to better understand how many ad exposures are required to ensure an effective campaign, writes WARC’s Alex Brownsell.

Arthur Dent, the unfortunate protagonist of Douglas Adams’ oft-adapted (and still hilarious) 1978 BBC radio play The Hitchhiker's Guide to the Galaxy, is underwhelmed to discover that the answer to the Great Question of Life, the Universe and Everything is “forty-two”.

At least Dent was given an answer, however implausible.

In comparison, brands seeking to understand the optimal number of occasions on which a target consumer should have an opportunity to see (or hear) an advert are as much in the dark today as they were 40 years ago, when Herb Krugman, then head of advertising research at General Electric, postulated three to be the ‘magic number’.

Krugman’s approach became the industry norm until the 1990s, when it was challenged by media planner Erwin Ephron, who instead proposed the idea of ‘recency planning’ and the need for advertisers to constantly build reach. Except Ephron himself subsequently changed his mind.

Guided by influential research from Professor Byron Sharp and the Ehrenberg-Bass Institute, advertisers began to prioritise audience reach – not least due to the difficulties of managing frequency across online and offline channels. And the problem will only grow more complex with the demise of third-party cookies, and restrictions around mobile ad tracking.

Marketer confidence is on the line

While frequency management may no longer be a favoured cause among advertising researchers, it is still a subject that causes CMOs to gnash their teeth. Procter & Gamble’s Marc Pritchard is one of several brand leaders to have railed against the perils of “excess frequency”, and the perception that media dollars are being spent on serving ads to consumers who no longer require the exposure or – worse – are actively irritated by the repetition.

In some cases, this fear of excess frequency risks suppressing investment, not least in nascent channels like connected TV. In the words of a media expert I spoke to earlier in the summer, limitations in CTV measurement leave marketers worrying whether they served an ad “to 47 different people, or one guy 47 different times when he flipped between channels on his Apple TV”.

In an effort to dispel some of these concerns, ad tech vendor Innovid partnered with the Association of National Advertisers to carry out a first-of-its-kind study into CTV advertising best practice. The report analysed 35 campaigns across 20 of the largest advertisers, measuring 1.7 billion impressions that totalled over $35m in media expenditure.

Researchers concluded that CTV’s apparent frequency problem is a “myth”. The average frequency was “surprisingly low” over the life of a campaign, reaching just 4.6 exposures, according to Jessica Hogue, Innovid’s GM of Measurement and Analytics.

Eighty-five percent of households were exposed to an ad between one and two times on average, while 14% of households were exposed between three and nine times. Just 1% of households were exposed more than 10 times on average.
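By way of illustration, the sketch below shows how figures like these are typically derived once impressions can be tied back to a household identifier: average frequency is simply total impressions divided by households reached, and the distribution comes from bucketing per-household exposure counts. The data, bucket labels and variable names are hypothetical, not drawn from the Innovid/ANA study.

```python
# Illustrative only: hypothetical household-level impression log, not Innovid/ANA data.
from collections import Counter

# Each entry is the household ID that received one ad impression
impressions = ["hh1", "hh2", "hh1", "hh3", "hh2", "hh1", "hh4", "hh2", "hh5"]

exposures_per_household = Counter(impressions)                  # impressions per household
average_frequency = len(impressions) / len(exposures_per_household)

# Bucket households the way the study reports them: 1-2, 3-9 and 10+ exposures
buckets = Counter(
    "1-2" if n <= 2 else "3-9" if n <= 9 else "10+"
    for n in exposures_per_household.values()
)
share = {b: count / len(exposures_per_household) for b, count in buckets.items()}

print(f"Average frequency: {average_frequency:.1f}")            # impressions / households reached
print({b: f"{pct:.0%}" for b, pct in share.items()})            # share of households per bucket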

Finding today’s frequency ‘magic number’

As helpful as it is to know they are unlikely to be bombarding audiences with the same ad, this leaves marketers no closer to knowing the ideal number of exposures required to ensure a successful campaign.

To fill this knowledge gap, Comcast-owned European pay-TV broadcaster Sky commissioned an analysis of its customer data – and concluded that optimum frequency sits in a range of eight to 14 exposures.

The paper draws its insights from three primary data sources:

  1. Respondent-level data from eight brand evaluation projects across linear TV and video on-demand (VOD), with precise exposure data sourced from Sky’s viewing panel. This study shows diminishing gains after 14 ad exposures.
  2. A controlled frequency cohort analysis on an unnamed financial services brand, which advertised using Sky’s addressable TV platform AdSmart. This returned an optimum frequency of eight ad exposures.
  3. Aggregated campaign-level norms data across AdSmart and VOD, based on 135 studies, with metadata used to assign an average frequency to each campaign. Gains here are strongest for campaigns with a frequency of between seven and nine exposures (a simple sketch of how such a point of diminishing returns can be read from the data follows this list).
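As referenced above, the sketch below shows one simple way a point of diminishing returns might be identified from uplift-by-frequency data: look at the marginal gain each additional exposure adds and flag where it falls below a chosen threshold. The uplift figures and threshold are entirely hypothetical, not Sky’s actual results.

```python
# Illustrative only: hypothetical brand-uplift scores (percentage points) by exposure count.
uplift_by_frequency = {
    1: 1.0, 2: 1.9, 3: 2.7, 4: 3.4, 5: 4.0, 6: 4.5, 7: 5.0, 8: 5.4,
    9: 5.7, 10: 5.9, 11: 6.0, 12: 6.1, 13: 6.15, 14: 6.18, 15: 6.19,
}

# Marginal gain contributed by each additional exposure
marginal = {
    f: uplift_by_frequency[f] - uplift_by_frequency[f - 1]
    for f in sorted(uplift_by_frequency) if f - 1 in uplift_by_frequency
}

# Flag the first frequency where the marginal gain drops below a chosen threshold
THRESHOLD = 0.1  # percentage points per extra exposure (arbitrary, for illustration)
diminishing_point = next(f for f, gain in sorted(marginal.items()) if gain < THRESHOLD)
print(f"Marginal gains fall below {THRESHOLD}pp per exposure at {diminishing_point} exposures")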

There is no “one-size-fits-all” approach to frequency planning, argues Leo Malagoni, Head of Market Research at Sky Media. Brand size, share of voice and category context are just some of the external factors that can influence the optimum level of frequency for advertising. However, by triangulating the findings from separate analyses, it is possible to provide a “steer in the right direction”, he said.

The answer is… 42?

There are further factors to consider before brands can settle the Great Question of Advertising Frequency.

For a start, thanks to the work of experts like Professor Karen Nelson-Field, CEO and Co-founder of Amplified Intelligence, we now know that consumer attention has a significant impact on advertising effectiveness. Not all exposures are equal – something media buyers like Omnicom acknowledge by moving to incorporate attention adjustments into planning decisions.

Then there is the daunting challenge of cross-media measurement – especially in light of the rise of new advertising platforms, like Amazon, with their own unique frequency requirements.

The ANA has partnered with ComScore to develop ‘Virtual People ID’, a system by which it can combine online impressions with TV panel data to help marketers deduplicate audiences across TV and online media (read the ANA’s blog for more on that technology). It will likely take years – rather than months – before any industry-wide cross-media currency is in place.

Yet, for all the apparent setbacks, there is reason for optimism. The pathway is becoming clearer. Marketers have come a long way since Herb Krugman’s assertion that it only took three TV ad exposures to deliver a campaign message. They can take confidence that the industry is getting closer to an empirical solution to the frequency conundrum.