Like the Zeppelins of old, many of today’s technologies fit a ‘pathological’ profile, argues Malcolm White, and more of us need to take control of the way we use our tech, instead of being used by it.
Here in London, there is a commemorative plaque which reads: 'These premises were totally destroyed by a Zeppelin raid during the World War on September 8th 1915. Rebuilt 1917.'
It wasn't only during the Second World War that Britain was attacked from the air, in what became known as 'The Blitz'. In fact, between 1915 and 1918, Britain was bombed from the air on 103 separate dates – a surprising total that has prompted historians to call these early air raids the 'First Blitz'. Many of these attacks were carried out by the Zeppelin airships of the Imperial German Army and Navy.
To this day, the Zeppelin continues to fascinate many people. Although I'd heard about airship disasters, including the Hindenburg in 1937, what I hadn't realised was just how bad their safety record was. In the 40 years prior to the Hindenburg disaster, a total of 26 hydrogen-filled airships had been destroyed by fire due to accidental causes, sometimes killing everyone on board.
Yet, despite an airship disaster occurring roughly every 18 months, Zeppelins and their like continued to exert a spellbinding power over nations, industrialists and inventors – such as the eponymous Count Zeppelin himself – and over those passengers who were prepared to pay a premium to be transported in an immense vessel filled with more than seven million cubic feet of explosive hydrogen gas.
In his book, Monsters: The Hindenburg Disaster and the Birth of Pathological Technology, Ed Regis dissects this unhealthy fascination and concludes that “never has a technology been so soundly, thoroughly, and utterly discredited as the hydrogen airship. The craft was a complete and final dead end.”
But what caught my eye in Regis' book was his description of the Zeppelin as an example of a 'pathological technology', and his definition of that suggestive phrase. For Regis, there are four things that make technologies 'pathological'. First, they are oversized, in their absolute scale or in their effects. Second, 'pathological technologies' cast such a powerful spell on people that all rational evidence against them is rendered null and void.
Third, their risks – even their blatantly dangerous downsides – are systematically minimised and underplayed. And fourth, there is an extreme mismatch between their benefits and their costs.
It strikes me that many of our modern technologies fit this pathological profile. The likes of Facebook, YouTube, Snapchat, Reddit and Twitter create addictive feedback loops that keep us liking, swiping and in a state of 'continuous partial attention'.
They all exist on an almost unimaginably huge scale; they've got most of us hooked; their downsides get little airtime, certainly compared with how continuously they are lauded; and their meaningful benefits (I'm thinking especially of Twitter here) don't take long to count. They might not cost lives, but they do have other costs.
Talking of Twitter, I recently heard a radio interview with Evan Williams (ex-CEO of Twitter, though still a major shareholder), in which he lamented the dumbing-down created by some forms of online communication. Astonishingly, the respected journalist interviewing him didn't go for the jugular and point out – as I was shouting at the radio – that perhaps 140 characters (recently revised up to 280) might have had something to do with the dumbing-down Williams is belatedly concerned about. Spellbound by the co-founder of Twitter, the journalist did what most of us do when faced with a pathological technology: he acquiesced.
I think it's high time we resisted our tendency to behave in this way. More of us need to take control of our technology, and the way we use it, instead of being used by it. In marketing and advertising, we must not shy away from discussing the pathological effects of programmatic advertising, for example. Just because we can do something doesn't mean that we should. We need to have proper debates about a new technology's pathology, as well as its benefits, while that technology is still nascent. And because AI is already being debated in precisely these terms, I am reasonably optimistic about its future.
Above all, we must remember that while new technologies can be awe-inspiring, they can also blow up in our faces. Just like the Zeppelins.