The Age Assurance (Minimum Standards) Bill [HL] is a private member’s bill sponsored by Baroness Kidron (Crossbench). It is due to have its second reading on 19 November 2021.

What is the bill about?

UNICEF has defined age assurance as using “diverse data sources to estimate an individual user’s age (to varying degrees of accuracy)”. It explains that age verification, which is used to more formally establish a user’s age (for example, through requirements for identification), is a subset of age assurance.

Baroness Kidron has described the purpose of her age assurance bill as follows:

There are many calls for age assurance online—particularly in relation to adult content, data protection and age restricted goods and activities—but, in the absence of a regulatory or statutory code, each online provider is deciding for themselves what levels of privacy and efficacy are required, with the inevitable outcome that age assurance is poorly understood and little trusted. Digital technologies are built into many environments that children inhabit and the systems that govern their lives, but not all digital services are, or should be forced to be, suitable for children. The Age Assurance (Minimum Standards) Bill would require Ofcom to produce a statutory code ensuring age assurance systems meet minimum standards. Crucially, the bill is not aimed at one technological approach, but rather sets the bar for all age assurance systems.

Baroness Kidron is chair of the 5Rights Foundation, a charity focused on young people’s experiences online. The foundation has published a detailed report on the need for developing age assurance methods and building a regulatory framework: ‘But How Do They Know it is a Child?’ Age Assurance in the Digital World. The foundation explains:

Rather than viewing it as simply restricting access, we should be looking at age assurance as a chance to invite children into a digital world that offers them greater privacy, freedom from commercial pressures, content and information in formats and language that they like, protection from misinformation or material that promotes harmful activities (such as suicide, self-harm or disordered eating), alongside supporting digital services in their legal duty not to provide children with age restricted contact and content.

However, the Open Rights Group has raised concerns about digital age assurance. Writing in the context of the draft online safety bill, the organisation said age assurance could impact “rights to privacy and freedom of expression” and could also “threaten the integrity of the Internet’s architecture”. It also addressed the Age Assurance (Minimum Standards) Bill itself, stating that it would establish “minimal privacy, ethics, and human rights standards around the use of these technologies”. However, it said “this approach would not provide sufficient protection across the huge range of commercial applications which will be required to implement the technology”.

What would the bill do?

The bill contains five clauses.

Clause 1 would require any age assurance system operated in relation to online or digital services used by UK consumers, or operated in the UK, to comply with the minimum standards set out in clause 2.

Clause 2 states that the minimum standards must be published by Ofcom within six months of the legislation coming into force. It also lists factors that must be applied in the minimum standards, including that they must ensure that any age assurance systems:

  • protect the privacy of users in accordance with applicable laws, including the data protection laws and obligations under the treaties set out at paragraph (k);
  • are proportionate having regard to the risks arising from the product or service and to the purpose of the age assurance system;
  • offer functionality appropriate to the capacity and age of a child who might use the service;
  • are secure, not exposing users or their data to unauthorised disclosure or security breaches, and do not use data gathered for the purposes of the age assurance system for any other purpose;
  • provide appropriate mechanisms and remedies for users to challenge or change decisions if their age is wrongly identified;
  • are accessible and inclusive to users with protected characteristics;
  • do not rely solely on users providing information; and
  • are compliant with certain data protection and rights legislation.

Clause 3 would require Ofcom to make regulations containing enforcement provisions for the minimum standards, also within six months of the passing of the legislation.

Clause 4 provides definitions for some of the terms used within the bill, and clause 5 contains territorial extent and commencement provisions. The bill would apply to England and Wales, Scotland and Northern Ireland.

What does the Government say about age assurance in the UK?

The Department for Digital, Culture, Media and Sport (DCMS) has published guidance for businesses on ensuring they provide age-appropriate online content in the UK. This lists a number of things that all businesses should do, such as incorporating tools for children to report inappropriate content. It also suggests how businesses can go further; for example, by thinking about how to empower children to make informed and safer choices online.

One of the suggestions for what businesses should do is to:

Consider implementing safety technology tools such as age assurance technologies to ensure that children are not able to access content aimed at adult audiences.

The importance of age assurance technologies was emphasised in the Government’s November 2020 report of its Verification of Children Online project, involving GCHQ, DCMS and the Home Office. The report noted that 99% of 12–15 year olds go online regularly, spending on average 20.5 hours a week on the internet. It stressed that the internet provides many benefits to younger people, including learning opportunities, entertainment, and ways to keep in contact with friends and family. However, it also stated that the internet does “present risks to children and some do encounter online content and behaviour that is harmful to them, and in some cases illegal”.

The report provided recommendations for developing age assurance, under four key headings:

1. A regulatory strategy for age assurance

  • Undertaking research on the risks posed to children by online services, to help inform the proportionate and risk-based use of age assurance. This research should engage with industry and subject experts.
  • Taking action to secure regulatory alignment between relevant current and emerging regulatory frameworks. A ‘task force’ of government and relevant regulators would help to deliver this.

2. Encouraging industry’s adoption of age assurance

  • Developing industry benchmarks, facilitated through research on the risks posed to children by online services. This research should engage with industry, regulators and subject experts.
  • Developing best practice examples, in partnership with regulators and industry.

3. Stimulating innovation in the age assurance market

  • Taking action to promote the age assurance market among industry and users.
  • Supporting the development of industry standards to ensure consistency and trust in age assurance solutions.
  • Exploring accessibility to testing data, to improve accuracy in age assurance methods. This is particularly important for methods that rely on training an algorithm, such as age estimation based on biometric data.
  • Taking action within the engineering and design community to ensure that age assurance is considered as part of voluntary design codes of practice.

4. Growing public confidence in age assurance

  • Undertaking research into how age assurance may disproportionately impact on some children, and exploring how these insights can be reflected in the development and implementation of age assurance.
  • Supporting digital parents to gain a better understanding of the safeguards that age assurance offers, and the compliance action taken by providers and platforms.

Current rules and guidance

The Government’s guidance for businesses also highlighted some of the rules and regulations that must already be complied with; for example, it stated that, from 1 November 2020, UK-established video-sharing platforms had to comply with new rules about protecting users from harmful content. Further information on this can be found in guidance published by Ofcom.

In addition, a new age appropriate design code came into force in September 2020, with businesses required to comply with it from 2 September 2021. It focuses on data protection and sets out 15 standards that should be adhered to, including:

  • Best interests of the child: The best interests of the child should be a primary consideration when designing and developing online services likely to be accessed by a child.
  • Age appropriate application: Take a risk-based approach to recognising the age of individual users and effectively apply the standards in this code to child users. Either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from data processing, or apply the standards in this code to all users instead.
  • Data minimisation: Collect and retain only the minimum amount of personal data needed to provide the elements of a service in which a child is actively and knowingly engaged. Give children separate choices over which elements they wish to activate.

The design code is a statutory code, prepared under section 123 of the Data Protection Act 2018. The Information Commissioner’s Office explained that failure to comply with the code would make it less likely that businesses could demonstrate their processing meets the requirements of the General Data Protection Regulation and the Privacy and Electronic Communications Regulations. This could lead to enforcement action being taken.

Cover image by Compare Fibre on Unsplash.