Algorithms are playing ever greater roles in our lives and work – a new piece on WARC explores how they can be built fairly from the ground up.
“If you have any hope to create a fair algorithm it is important to understand the context in which it is going to be applied”, writes Henrik Nordmark, Head of Data Science at Profusion, in Algorithmic fairness must be at the heart of the tech we design, a new WARC Exclusive.
Essentially, as algorithms are increasingly used to make decisions affecting the general population, the social and economic implications can no longer be ignored in what has traditionally been treated as a purely engineering problem. Even engineering isn’t neutral.
“It is interesting how the exact same algorithm can be negative, neutral or positive depending on its context of application. This is why the design of an algorithm should never be divorced from its intended social use.”
Of course, the profiles of the teams that typically build and deploy algorithms skew differently from the general population. Diversity is essential.
“This is because diverse teams provide a plurality of backgrounds, experience, skills and knowledge. They are able to provide a complete picture which can avoid unintended consequences, such as failing to account for bias or too narrow a world view of the context of application.”
But the data is just as crucial. Even if a system is created by a diverse team, the data itself may carry biases. “There are ways of detecting whether a dataset is biased relative to what would be a representative sample of the population that you are interested in analysing, but that does not always do you as much good as you might think.”
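One simple way to run the kind of representativeness check Nordmark describes is to compare the demographic mix of a dataset against known population proportions. The sketch below (not from the article; the group labels, proportions and 0.05 threshold are illustrative assumptions) measures the gap as a total variation distance:

```python
# Hypothetical sketch: flag a dataset whose demographic mix drifts
# from known population proportions. Group names, the population
# figures and the 0.05 threshold are illustrative assumptions.
from collections import Counter

def representation_gap(sample_labels, population_props):
    """Total variation distance between the sample's group
    proportions and the reference population proportions."""
    counts = Counter(sample_labels)
    n = len(sample_labels)
    return 0.5 * sum(
        abs(counts.get(group, 0) / n - p)
        for group, p in population_props.items()
    )

# Reference population: 51% group A, 49% group B (illustrative).
population = {"A": 0.51, "B": 0.49}

# A skewed dataset: 80 records from group A, 20 from group B.
sample = ["A"] * 80 + ["B"] * 20

gap = representation_gap(sample, population)
print(f"representation gap: {gap:.2f}")  # 0.29
if gap > 0.05:
    print("dataset is not representative for these groups")
```

As the article cautions, passing a check like this is necessary but not sufficient: a demographically representative sample can still encode biased outcomes in its labels.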
None of this is simple, Nordmark acknowledges, but serious thought, paired with a nuanced understanding of the context from the very beginning, is a good first step.
Sourced from WARC