Children’s privacy protection is well and truly in the spotlight as we move into spring, and there’s plenty happening in the EU. TikTok has been grabbing the headlines on children’s privacy and safety over the past month, and the other big tech platforms are never far behind. The UK Information Commissioner’s Children’s Code is due to come into force in less than six months, and the Irish Data Protection Commission’s consultation puts children at the heart of approaches to data processing. Later this month the UK’s DCMS will hold a key online event, Safety Tech 2021: Unleashing the Potential for a Safer Internet, tackling hot topics ranging from how safety tech can protect children to moderation and brand protection. And let’s not forget the Online Harms bill and its anticipated passage through the UK parliament.

Meanwhile, in the United States, there are moves to tighten privacy protections for children. The Children’s Online Privacy Protection Act (COPPA) Rule review was brought forward last year in light of the changing privacy landscape and emerging technologies, and the FTC has 176,280 comments to review before we expect to hear from it. In the meantime, several bills have been put forward that raise challenging and interesting ideas for protecting children’s privacy online. Leading the charge are Sens. Markey and Hawley, with their bill to amend COPPA, and Rep. Castor, with her PRVCY Act. Both bills address the need to protect teens, as COPPA covers only children under 13. Sen. Markey has also called for a Bill of Rights for children akin to the rights they enjoy under the EU’s General Data Protection Regulation (GDPR), which came into force in 2018. Rep. Castor’s bill goes further. During a House Committee hearing on child safety online last week (a recording is available), she recommended not only increased protections for teens but also higher penalties for non-compliance and a shift from the actual knowledge standard to constructive knowledge.

Changing the actual knowledge standard (that you know you have children on your service) to constructive knowledge is an interesting topic and one that sparks heated debate between privacy advocates and industry. However, this debate somewhat misses the point. Let’s take a step back and tackle the real issue here, which is “know your audience”. That knowledge would go a long way toward protecting children’s privacy and safety online from the outset. Instead of turning a blind eye, stating “not for children” in their terms of service as a get-out-of-jail-free card, and implementing weak age screens, online platforms need to step up, smarten up their age gates and verify a user’s age. Children should then be given an age-appropriate experience with privacy and safety baked in by design.

In January this year the Italian data protection authority (Garante per la protezione dei dati personali) imposed a limitation on TikTok, ordering it not to process the data of users whose age could not be established with certainty. This followed the tragic death of a ten-year-old girl who had been viewing, and was thought to have taken part in, an asphyxiation game. What does it take to realize that age verification is key on social platforms where children are clearly and constantly present in very high numbers?

Rep. Castor also called for an end to self-regulation. There are millions of apps, sites and online services in the market globally, and policing them all for compliance with regulations is a mammoth task, to say the least. Self-regulation should be just one tool in a toolbox of many that supports regulators in doing their jobs. Companies that join a robust COPPA safe harbor program are putting their heads above the parapet, investing time and money in getting it right. Of course, more needs to be done, and the FTC should have the resources required to do more and to ensure the integrity of the safe harbors. But there is no doubt that safe harbors provide an invaluable service for industry, for regulators and, not least, for families and children. Why throw the baby out with the bathwater?

PRIVO welcomes this long-awaited focus on child and teen privacy protections, but changes and reforms must be well thought out and well informed. The way to achieve this is for industry and regulators to work together, and to ensure that parents and children are part of the conversation.


About the author

Claire Quinn is VP of Compliance for PRIVO. PRIVO is at the heart of developments in this space and will be updating its GDPRkids™ Privacy Assured Program to ensure it remains relevant and mapped to the regulation, the latest guidance and codes. For more information on the program, which demonstrates compliance in relation to children, contact us!