Facebook’s developer blog has announced Messenger Kids, saying the new app “makes it easier for kids to safely video chat and message with family and friends when they can’t be together in person”. The product’s unveiling follows consultation with the National PTA, as well as parents and experts, the company added.
The app is designed with parental controls that link to a parent’s Facebook account, allowing a parent to choose who their child can speak to online.
Crucially, the post stressed “there are no ads in Messenger Kids and your child’s information isn’t used for ads”. Facebook has also created a dedicated microsite to provide additional information to parents.
Speaking to the Financial Times, VP of messaging products David Marcus said the service’s competitive edge comes from the need for parents to be connected with the parents of their child’s friends, a greater protection than content moderation alone.
Recent revelations about the kind of content accessible through YouTube’s Kids platform, and the subsequent exodus of big brands from YouTube, illustrate the high stakes involved in ensuring children’s platforms are properly moderated and that children are kept safe online.
Facebook stressed the new app would be closely moderated to ensure children cannot access inappropriate material, for example by scanning for naked photos sent across the platform.
Though Facebook has not detailed what that moderation effort will look like, the company said the app will mirror the main Facebook service’s approach of letting users flag and report inappropriate content.
As ever, Facebook is optimistic about the service’s purpose. “When you think about things at scale that we do to get people to care more about Messenger, this is one that addresses a real need for parents,” Marcus told TechCrunch.
Sourced from Facebook, Financial Times, TechCrunch; additional content by WARC staff