Opinion

Data privacy and protecting children online – finding a path through the complexities

Published: Mar 31, 2025
On March 11, 2025, A&O Shearman hosted a conference for Privacy Law and Business entitled "What's right for children's data? Keeping on the right side of the law". The conference brought together companies, regulators, NGOs, academics and experts from A&O Shearman to discuss how apps, websites and games can be designed in a way that puts children’s interests first and empowers children so they can use the valuable and educational aspects of the internet in a safe way.

The event included speakers from age assurance providers Yoti and VerifyMy; digital platform providers such as Google, TikTok, the BBC and Epic Games; age-appropriate design platform K-ID; data protection regulators from the UK and Ontario; academics from the Universities of Nottingham and Northumbria; and civil society groups Defend Digital Me and 5Rights.

The day was structured around the following themes:

  1. Identifying child users: exploring age assurance;
  2. Navigating and managing risks: framing consent and parental controls around the best interests of children; and
  3. Design: making data rights understandable and accessible for children.

The context for the conference was also framed by a range of regulatory regimes focused on protecting children online. These include the General Data Protection Regulation’s (GDPR) requirement for parental consent under Article 8 and its requirement to address the specific protections children need when their personal data is processed. In the UK, there is also the Age-Appropriate Design Code issued by the Information Commissioner. Focused on wider risks from content and on age assurance requirements, the UK Online Safety Act (UK OSA), the EU Digital Services Act (DSA) and the Australian Online Safety Act (AOSA) are also driving key questions related to data protection.

Businesses whose digital products and services have child users will need to comply with some or all of the regulations referenced above, depending on the nature of their platform. Alongside this, businesses are prioritising the protection of children online in response to societal concern about potential harms, to demonstrate corporate social responsibility, and to protect their reputation and consumer trust.

This blog sets out a summary of the key issues that were discussed across the day and some of the key learnings:

The voices of children and parents need to be effectively considered across the design lifecycle

The voices of children and parents need to be effectively considered across the design lifecycle - from conception through to testing and reporting about how products and services operate for children in practice. 

This is important, as it addresses the risk of tick-box compliance. Ensuring that their views are effectively considered allows children’s experiences online, the likely risks and harms, and the benefits children gain from using digital products and services to be drawn into the design process.

Such an approach will enable a feedback loop of evidence about how solutions are working in practice. It will also enable compliance with data protection by design and foster trust from children and parents that their opinions and experiences are heard and respected. 

Organisations may want to consider giving children the space, voice, audience and influence to make a real difference in practice. Many companies are now using special panels and youth councils to give children a greater voice.

The complexity of children’s experiences online requires a risk-based approach and will often require a combination of solutions 

A risk-based approach, matched by appropriate mitigations, remains an essential tool for implementing an effective approach to engaging with children’s data. This will require different combinations of solutions, recognising the need to take into account the context of the particular digital product or service.

Children’s lives online are shaped by a diverse range of experiences and an overriding impression from the conference was the need to look at risks and challenges regarding children’s data from many angles. For example, alongside technology and regulation, investment in digital literacy may also be a key to addressing the risks that exist. This investment will need to come from business, regulators, civil society and government. 

Data protection by design and safety by design must be central to the approach of online platforms

The conference heard about a range of innovative ways that organisations had developed engaging design for children and effective methods to reach different child audiences. Information provided to children and their parents throughout the online journey should be accessible, empowering and meaningful. This will be needed to meet data protection and online safety compliance requirements and also from the broader perspective of the United Nations Convention on the Rights of the Child.

The importance of designing with the user experience in mind was also a key area of discussion. This includes addressing the risk that online safety solutions, such as age assurance, could create friction for users and drive them to less safe alternatives. Balancing utility with effective protections will be a key challenge for regulators.

Age assurance solutions will need to offer choice whilst providing certified levels of data protection and security compliance 

The safety tech market and age assurance solutions are rapidly evolving to offer a range of options that seek to balance effectiveness and data protection compliance. The panel discussions highlighted that choice and a diverse range of solutions will be important for children, parents and the wide range of platforms and services that are used by children. The age assurance service providers also recognise that inclusivity and bias are key issues to be addressed.

Certification schemes are important in technically demonstrating that age assurance systems achieve an effective age check whilst remaining data protection compliant. Some solutions will be effective across the full range of ages, while others will only provide assurance as to whether the user is under or over 18. The conference also heard about the levels of effectiveness that could be provided by age assurance solutions, often at four levels, ranging from low to very high, involving a combination of safeguards. For age estimation this could include measures to protect against injection attacks. The panellists also recognised the criteria that Ofcom have set for assessing age assurance processes: technically accurate, robust, reliable and fair.

A question was posed about the role that could be played by app stores and whether age verification provided at that level of the ecosystem could then be used by app developers. The challenges of this approach were highlighted in the panel discussions, including whether this would support a risk-based approach if introduced as a standard position for all apps. The fact that a relatively low percentage of parents use those controls already made available by apps and platforms was also cited as an issue. 

Age assurance is not a silver bullet and needs to work as an enabler for age-appropriate solutions to deliver real value as a protection

It will be important that age assurance solutions enhance rather than restrict children’s experiences online. They should be used as a gateway to the provision of age-appropriate protections such as reducing the rate of notifications, default private account settings, tailored information and restrictions on advertisements.

Companies presenting at the conference also provided examples to show how their services were now providing a range of different age-appropriate design protections. They addressed aspects such as different privacy notices and tailored information as well as circumstances where service functionalities were adjusted such as when certain messaging and live streaming features were turned off by default. 

Parental consent can play a valuable role as a safeguard, but efficacy will often depend on a range of factors

Under the GDPR it is important to consider whether consent is the most appropriate lawful basis to rely on to process a child’s personal data or whether other alternatives such as legitimate interest can be considered. 

If an organisation is looking to rely on consent, it may be necessary to obtain parental consent. In that context, and for parental permissions, the conference considered the impact of parental digital literacy, varied parent-child relationships and different approaches to effective transparency for children and parents. It was noted that the operation of parental consent as an ongoing mechanism may need to be renewed and updated over time, including when children become adults.

Panel discussions also illustrated the challenge of taking parental consent concepts from family law into the complexity and scale of the online environment. For example, in a school or healthcare setting it is feasible for consent to be addressed on a one-to-one basis, taking into account individual circumstances and relationships; this is far harder at the scale at which online services operate.

Research from the Digital Regulation Cooperation Forum also illustrates that parents often want to make decisions about their child’s online activity based on a range of factors, not just age but maturity and perceived risk too. The research also indicates that parents often see social media age limits as arbitrary. Alongside this, the technical knowledge of parents can also be a barrier to their controls being an effective mechanism for protecting children online. Research from the Family Online Safety Institute suggests that a child’s approach to online activity, and the relationship with their parent in that regard, is based on curiosity, partnership, empathy and respect.

A graduated approach could be taken that involves parental controls being used, followed by ongoing discussions as children grow and develop, ultimately leading to their autonomy with regard to decisions around their online experience. Such an approach would enable companies to demonstrate a data protection by design approach as well as supporting trusted relationships with children and parents.

It was clear that the distinction between parental consent (to process the child’s personal data) and parental permission (to access an online service or its features) should be recognised, to ensure that effective consent is obtained and appropriate controls are applied.

Lastly, there was recognition that parental controls are becoming more granular and sophisticated in what they can offer in terms of control of the online experience e.g. different chat features within games. This can be positive as it can offer a less binary decision about whether children use certain services and features. 

Globally fragmented approaches remain a challenge 

For companies seeking to roll out solutions that work across jurisdictions, reducing costs and barriers to implementation, it is clear that differing regulatory requirements continue to pose a challenge. Examples include varying ages for parental consent, with some jurisdictions now focused on social media ‘bans’ for certain age groups.

The differences between the UK and Australian approaches were highlighted: while Australia has banned social media for under-16s, the UK takes a risk-based approach under the UK OSA, with a focus on preventing harms via safety by design. Importantly, it was highlighted that Australia does not have human rights embedded in its constitution, and therefore its approach may not work in other jurisdictions.

It was acknowledged that there is valuable complementarity between the UK ICO and European Data Protection Board approaches to age assurance, but there is a risk that some EU data protection authorities will push back on the use of biometrics for age estimation. However, participants cautioned that narrowing down options now by early enforcement would likely make it difficult to maintain an effective range of options for companies to deploy.

Concluding observations 

The priority now placed on data protection and online safety for children was apparent from many presentations, with real examples of meaningful design changes and age assurance technologies maturing into application.   

There is a sense of progress as well as a recognition of the ongoing complexity of the tasks and the importance of a sustainable long-term approach that builds on evidence about what effectively works in practice. A multi-disciplinary approach will also be vital.
