Children have never been exposed to more content, and their privacy should be a priority. Here, SuperAwesome CEO Dylan Collins shares three key areas where change is needed.
2017 saw a seismic shift in the digital landscape for kids. After years of largely ignoring their existence, Silicon Valley is beginning to respond to the need for purpose-built kid-safe technology, as the release of Facebook’s Messenger Kids shows. Yet, as we’ve seen in recent weeks, even companies with all the technical resources in the world are struggling to build fully kid-safe environments. In the wake of advertisers’ mass exodus from YouTube over inappropriate footage of children and lewd comments, it’s as evident as ever that the kids’ digital media ecosystem needs greater moderation and compliance.
How did this happen? Children are hardly a new concept for society. On any given day in the U.S., roughly 11,000 kids go online for the first time. Yet for years, Silicon Valley mostly operated on the basis that kids simply didn’t exist on the internet: a mere tick of a registration box declared that the user was over 13. However, ever since Apple released the iPad in 2010, kids have rapidly gained access to family tablets and secondhand smartphones, and over this period their digital screen-time has increased tenfold. It was inevitable that the needs of kids would become a major topic.
The problem is simple: we have an internet built for adults, and children are now running all over it. In fact, with the growth of online advertising and social networks, the internet has become more of a data-harvesting platform than ever before. This is in direct opposition to what a kids’ internet needs to be. And it’s not just the technology. Engineers and designers who have spent their careers building for adults often carry those assumptions into thinking about kids. This adult engineering bias is a surprisingly significant contributor to the problems we see today, and it creates three major problem areas:
1. Retrospective content moderation
For most adult-oriented services like YouTube, content issues are handled post-event. YouTube may recently have significantly increased the size of its moderation team, but it is still only reviewing content after it has been published. This approach assumes that users are mature enough to deal with imperfect services; it’s why algorithms and automation generally work, and why 95 percent “OK” is typically acceptable in the adult market. But kids are not adults. Their tolerance for error is far lower, and 95 percent “OK” isn’t acceptable. Content moderation for kids has to be pre-event, and it needs to combine human moderation with automation.
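The pre-event model described above can be sketched in a few lines: automation screens everything first, and anything it cannot confidently approve is held for a human reviewer before publication. This is a minimal illustration with made-up function names and toy rules, not any real platform’s system.

```python
# Sketch of pre-event ("pre-publication") moderation: an automated pass
# screens every item, and uncertain cases are held for a human reviewer
# BEFORE anything is published. All names and rules are illustrative.

def automated_check(item: str) -> str:
    """Toy stand-in for an automated classifier.
    Returns 'approve', 'reject', or 'escalate' (uncertain)."""
    banned = {"violence", "gambling"}
    uncertain = {"contest", "chat"}
    words = set(item.lower().split())
    if words & banned:
        return "reject"
    if words & uncertain:
        return "escalate"
    return "approve"

def moderate(item: str, human_review) -> bool:
    """Pre-event gate: nothing is published until approved."""
    verdict = automated_check(item)
    if verdict == "escalate":
        # A human moderator makes the final call before publication.
        verdict = human_review(item)
    return verdict == "approve"

# Usage: a deliberately cautious human reviewer who rejects anything
# the automation escalates. Only clearly safe items get published.
submissions = ["fun cartoon", "live chat invite", "violence clip"]
published = [c for c in submissions
             if moderate(c, human_review=lambda item: "reject")]
# → ["fun cartoon"]
```

The key design point is the default: in a post-event system the item is live while it waits for review, while here the safe state is “unpublished,” which is the error tolerance kids’ services require.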
2. Data harvesting by default
Virtually every consumer service is built around profile data (cookies, social graphs), which forms a picture of a user’s online behavior. Even though data privacy laws like COPPA and GDPR have made this kind of data capture illegal for kids, behind the scenes a great deal of ‘adult’ ad tech is still being used to show ads on kids’ content, facilitating the mass collection of data on kids. The same goes for embedded widgets from YouTube, Facebook and Twitter. And it’s not just advertising: some consumer apps now include technology that lets advertisers listen to local activity. Much of the negative parental reaction to Facebook’s Messenger Kids stems from this growing fear of data harvesting.
3. Thinking that everyone under 13 is a ‘kid’
The single most common mistake made in this sector is releasing a product aimed at the under-13 market and putting the word ‘kids’ in the title. As children get older (7+), they start to develop much more of their own identity and don’t want to be called ‘kids.’ The best way to build a product that is ignored by 7- to 12-year-olds is to use ‘kids’ in the name.
Looking at the respective responses of Facebook and Google to the under-13 audience, you can see both companies going through this learning curve. Google’s ‘YouTube Kids’ (an automated filter for YouTube) and Facebook’s ‘Messenger Kids’ (a data-collecting messenger app) are noble efforts, but clearly they are not going to have a material impact on the actual problems.
We need to start thinking long-term about how we build a digital ecosystem for kids. This is not just about content, but about purpose-built ‘kidtech’ to support monetization, advertising, social engagement, analytics and more. Kidtech rests on three principles:
Technology and processes built specifically for the under-13 digital audience, not re-purposed adult paradigms.
Built with kids’ digital privacy as the core design principle.
Built for sustainability, allowing an economically viable kids’ digital environment for developers.
More kids are going online, at ever-younger ages. Without a much greater emphasis on kidtech, we are going to see more lawsuits, increasingly large fines (especially once GDPR becomes enforceable in May) and the associated brand damage. Worse, at some point there will be a ‘9/11’ event that does real harm to kids on the adult internet. The kidtech revolution needs to begin now.