2022 will be the year that accuracy in attention measurement comes into its own, writes Karen Nelson-Field from Amplified Intelligence.

It’s true that attention metrics mean different things depending on which side of the ad transaction you sit. But the beauty of rigorous attention measurement is that it offers value and redemption for the entire ecosystem.

There’s no denying that the buy-side has jumped on the attention measurement ride; the benefits are obvious. Who doesn’t want to optimise their media spend? But it’s simply not true to say that the sell-side has been slow to respond.

We can categorically say that publishers and broadcasters have been early adopters of attention measurement. In fact, in the last couple of years we have collected more data in partnership with the sell-side than the buy-side. Those that have publicly announced they are collecting attention data with us include Meta, Spotify, Val Morgan Cinema, Channel 7 and the Nine Network.

They can see the value in using attention, and what it reveals about a viewing environment, to improve the quality of their platforms and formats in a quantifiable way. Now they have a metric that starts to home in on the human user of their platform. Absolutely everyone benefits from this.

In a recent WARC article, Paul Nesbitt, Director of International Insight and Measurement at Twitch, hit the nail on the head when he said, “The more we can move to a place where we have a common understanding of how attention is measured across vendors, and where their metrics are actually quite similar, that's going to help us to actually navigate our way.”

There is a way we can move toward this common understanding, but it needs to rest on commonly agreed standards.

  1. Only human data. It sounds straightforward, but only human footage collected via an outward-facing device camera can tell us whether a human is paying attention. JavaScript executed inside a browser cannot. Advanced viewability is still device data. Human data tells a human story that device data simply can’t.
  2. Privacy safe. Without trust, the data and what it is telling us is open to question. There must be evidence of meaningful opt-in and opt-out for collection participants, plus data security and retention policies. There is no room for malware dressed up as gaze-based collection. Humans must be treated with respect.
  3. Natural environment. We need to observe natural human behaviour, so attention data needs to be collected in a familiar environment. We don’t want people to concentrate harder; we want them to experience a natural level of distraction. That means real (not just realistic) platforms, passive cameras (not gaze-tracking goggles), no labs and no calibration step.
  4. Accurate gaze estimation. Accuracy comes from extensive training data for the model; without ongoing access to facial footage, models will be less accurate. It also depends on granularity (rolling eyes-on-ad measurement), continuous improvement (from varied data), individual-level data (aggregates are useful but they don’t give enough depth), collection conditions (how many edge cases are included) and validation (testing for inherent biases). A sketch of what this granularity looks like in practice follows this list.
  5. Boundary conditions. The model should be proven across boundary conditions. The baseline needs to hold over a range of conditions, so that it can then be used predictively and at scale.
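To make the granularity and boundary-condition points concrete, here is a minimal, hypothetical Python sketch. It is illustrative only, not any vendor’s actual pipeline: the sampling rate, the frame-level gaze labels and the human-coded validation set are all invented assumptions. It shows how individual-level attention seconds might be computed from frame-by-frame data, and how a per-condition accuracy check could reveal whether a model’s baseline holds across boundary conditions.

```python
# Hypothetical sketch only: not any vendor's actual pipeline.
# Assumes frame-level gaze labels per participant and a human-coded
# ground truth for validation. All names and data are invented.

from collections import defaultdict

FPS = 5  # assumed sampling rate: 5 gaze estimates per second

# Model-predicted frame labels (1 = eyes on ad, 0 = eyes elsewhere),
# keyed by (participant, collection condition).
predicted = {
    ("p1", "daylight"):  [1, 1, 1, 0, 0, 1, 1, 1, 1, 1],
    ("p2", "daylight"):  [0, 0, 1, 1, 0, 0, 0, 1, 0, 0],
    ("p3", "low_light"): [1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
}
# Human-coded ground truth for the same frames (the validation set).
actual = {
    ("p1", "daylight"):  [1, 1, 1, 0, 0, 1, 1, 1, 1, 1],
    ("p2", "daylight"):  [0, 0, 1, 1, 0, 0, 1, 1, 0, 0],
    ("p3", "low_light"): [1, 1, 1, 0, 0, 0, 1, 1, 1, 0],
}

def attention_seconds(labels, fps=FPS):
    """Rolling eyes-on-ad time: frames on ad divided by sampling rate."""
    return sum(labels) / fps

# Individual-level granularity: each viewer keeps their own attention
# profile, where an aggregate average would hide the spread between them.
for key, labels in predicted.items():
    print(key, attention_seconds(labels), "seconds on ad")

# Boundary-condition check: does frame-level accuracy hold up in every
# collection condition, or does one (e.g. low light) drag it down?
hits, totals = defaultdict(int), defaultdict(int)
for (person, condition), pred in predicted.items():
    truth = actual[(person, condition)]
    hits[condition] += sum(p == t for p, t in zip(pred, truth))
    totals[condition] += len(truth)

for condition in hits:
    print(condition, "accuracy:", hits[condition] / totals[condition])
```

The point of the per-condition breakdown is that an overall accuracy figure can look healthy while a single collection condition (low light, unusual viewing angles, other edge cases) quietly fails, which is exactly why the baseline must be proven across boundary conditions before a model is used predictively and at scale.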

As a bare minimum, these standards should be met for attention data. If we know the data meets these criteria, we can start to compare apples with apples. If your vendor can’t say ‘yes’ to these quality criteria, there is a problem. At this stage in the young life of attention measurement, throwing the industry back into a circus of mismatched metrics would do more harm than good.

2022 will be the year that accuracy in attention measurement comes into its own. With increased granularity, we will start to see the patterns that sit beneath the attention-seconds data. It’s time to go beyond eyes-on-screen. We want to know not just what people are watching, but how they are watching.

To do this, there must be a solid foundation of accurate and trustworthy attention data. If we don’t start with that, as we dig deeper, we will find that we have mined the wrong quarry.

More on this coming soon.