The regulator’s research, carried out with the Information Commissioner’s Office, asked 1,686 internet users over 16 about their attitudes to, and experiences of, online harm across a range of categories.
This found that eight in ten people (79%) had concerns about aspects of going online, including online content (66%), data/privacy (58%), interactions with other users (55%) and hacking/security (54%).
But rather fewer (45%) reported actual experience of online harm: 20% had received spam emails or communications; 14% had encountered viruses or malicious software; 13% had experienced scams, fraud or identity theft; and just 10% reported seeing fake news or disinformation online.
Among those experiencing online harm, hacking and security had the biggest negative impact.
One in five people said they had reported harmful content encountered online; almost half of those who did so were aged 16-34, while only 16% were over the age of 55. Illegal sexual content was the type of content most likely to be reported, followed by content promoting terrorism and racism.
And, unsurprisingly, protection of children was a leading area of concern, with potential harms such as exploitation, inappropriate content, and bullying, harassment or trolling cited by some respondents.
Ofcom also found that respondents had a “mixed” understanding of how online content is regulated: only 31% believed social media sites are regulated, and 30% believed the same of video-sharing sites. Over half of respondents felt social media needs more regulation.
In a separate discussion document, Ofcom noted how audience expectations and context differ between broadcasting and online environments but highlighted several areas where principles from broadcasting regulation could be relevant as policymakers consider issues around online protection, including transparency and enforcement.
Sourced from Ofcom; additional content by WARC staff