On 23 November 2023, the House of Lords will debate the following question for short debate:

Baroness Kidron (Crossbench) to ask His Majesty’s Government what assessment they have made of the role of educational technology (ed tech) being used in United Kingdom schools in relation to (1) the educational outcomes, (2) the social development, and (3) the privacy of schoolchildren.

1. What is educational technology?

Educational technology, or EdTech, refers to the practice of using technology to support teaching and the effective day-to-day management of education institutions.[1]

The potential uses of EdTech are many and varied. The Department for Education (DfE) notes that the Covid-19 pandemic and the subsequent partial school closures significantly disrupted the delivery of education in England and across the world, and “created an unprecedented need for remote teaching and learning solutions”.[2] During this period, research cited by the DfE suggests that 64% of schools introduced, increased or upgraded their technology, with 80% of schools using either new tools or a mix of new and old. Tools included the use of online learning platforms, digital curriculum content tools and services, and technology to deliver both live remote lessons and pre-recorded lessons online.

As technology and innovation continue to develop rapidly, emerging trends in EdTech include artificial intelligence (AI)-powered learning environments, augmented reality (AR) and virtual reality (VR), automated assessments and adaptive learning.[3]

A 2022 qualitative review of the EdTech sector by the DfE found that the technology was principally being used in three key ways in schools:

1. School management and administration: schools interviewed used specific tools and platforms which allowed them to undertake the day-to-day management and administration of the school more effectively. These included:

    • tools for pupil data management to help with more effective monitoring and to support learner progress
    • tools for engaging with staff for day-to-day communication, sharing school policy, providing training and sharing resources for curriculum planning and delivery
    • tools for engaging with parents to share communications and resources, and for providing updates about pupil progress.

2. Support for teaching and learning: most schools interviewed had chosen to support teaching and learning across the school (both in class and remotely) using VLEs [virtual learning environments], devices or website subscriptions. They had also invested significantly in EdTech devices, including interactive whiteboards, laptops or tablets for learners and staff, and visualisers in a few cases.

Some schools used EdTech for assessment tasks to help reduce teacher workload through automation of marking, moderation and inputting. Some used systems designed specifically for assessment purposes, while others took more informal approaches, such as embedding quizzes and tests in existing VLEs or specific services.

Several schools used EdTech to support SEND [special educational needs and disabilities] learners or those with language barriers. This included the use of tools designed to assess learner needs and adaptive technologies to improve wider curriculum engagement.

3. Pastoral support: EdTech was used to support safeguarding and related liaison with external agencies. Schools valued technologies that allowed them to better understand and monitor pupils’ wellbeing.

A few schools also mentioned using EdTech to promote the importance of mental health and wellbeing and sharing online resources with parents, learners and staff.

Schools also used videoconferencing to support meetings with external stakeholders such as safeguarding professionals and careers guidance professionals. Views on the benefits of this were mixed as some believed a face-to-face approach was preferable.[4]

The study also noted that EdTech was being used to deliver curriculum-specific objectives:

Schools also used specific programmes and online resources to enhance teaching and learning in different curriculum areas. A wide range of products were used and were particularly valued when they (1) benefitted learner engagement and progress, (2) supported in-person and remote teaching, and learning activities, (3) could be tailored to learner needs, (4) allowed teachers to use metadata to provide further support, and (5) provided options to share work with parents.[5]

The study reported that qualitative interviews found schools were mixed in the extent to which they had embraced and embedded EdTech. The report noted that, because of the shifts between face-to-face and remote teaching at different points during the Covid-19 pandemic, more schools had been “propelled to increase and further embed EdTech and to refine existing processes”. The study also noted that a small number of schools interviewed “continued to be cautious about using EdTech, particularly for use in the classroom”.

Yet this picture is evolving rapidly, as examined below.

2. Growth of the UK EdTech sector and rapid AI development

According to the government, the UK’s EdTech sector is the largest in Europe.[6] It also reports that UK schools already spend an estimated £900mn a year on educational technology.

The role of EdTech is likely to grow significantly in coming years, particularly as a result of the rapid evolution and growth of AI technologies, notably generative AI such as large language models (LLMs) including ChatGPT and Bard.[7]

There are already reports that increasing numbers of children are using AI tools in their schoolwork, both with and without the direction of teachers and other educational professionals, as a result of the free availability of the growing number of tools.[8]

Reflecting on the potential use of AI in the American education system, John Bailey, an academic and former director of educational technology at the US Department of Education, wrote in the journal Education Next that AI could serve, or is already serving, in several teaching-and-learning roles:

Instructional assistants. AI’s ability to conduct human-like conversations opens up possibilities for adaptive tutoring or instructional assistants that can help explain difficult concepts to students. AI-based feedback systems can offer constructive critiques on student writing, which can help students fine-tune their writing skills. Some research also suggests certain kinds of prompts can help children generate more fruitful questions about learning. AI models might also support customised learning for students with disabilities and provide translation for English language learners.

Teaching assistants. AI might tackle some of the administrative tasks that keep teachers from investing more time with their peers or students. Early uses include automated routine tasks such as drafting lesson plans, creating differentiated materials, designing worksheets, developing quizzes, and exploring ways of explaining complicated academic materials. AI can also provide educators with recommendations to meet student needs and help teachers reflect, plan, and improve their practice.

Parent assistants. Parents can use AI to generate letters requesting individualized education plan (IEP) services or to ask that a child be evaluated for gifted and talented programs. For parents choosing a school for their child, AI could serve as an administrative assistant, mapping out school options within driving distance of home, generating application timelines, compiling contact information, and the like. Generative AI can even create bedtime stories with evolving plots tailored to a child’s interests.

Administrator assistants. Using generative AI, school administrators can draft various communications, including materials for parents, newsletters, and other community-engagement documents. AI systems can also help with the difficult tasks of organizing class or bus schedules, and they can analyse complex data to identify patterns or needs. ChatGPT can perform sophisticated sentiment analysis that could be useful for measuring school-climate and other survey data.[9]

However, whilst these applications offer “great potential” according to Mr Bailey, he is also clear that they come with significant risks, as explored in section 4 of this briefing.

3. Recent UK developments: Expansion of funding provided to Oak National Academy for AI

The government created the Oak National Academy during the Covid-19 pandemic to support remote learning.[10] It has since been converted into an arm’s-length body that is focused on supporting teachers. This plan was announced in the schools white paper published in March 2022.[11]

In the white paper, the government said that the new body would:

Work with thousands of teachers to co-design, create and continually improve packages of optional, free, adaptable digital curriculum resources and video lessons that are effectively sequenced to help teachers deliver an evidence-based, high-quality curriculum.[12]

It explained that each subject would have a choice of resources to provide variety for teachers. It argued that this “sector-led approach” would draw on expertise and inputs from across the country, involving teachers, schools, trusts, subject associations, national centres of excellence and education publishers.

In October 2023, Prime Minister Rishi Sunak announced that the government would invest up to £2mn in Oak National Academy to create new teaching tools using AI, “marking the first step towards providing every teacher [in England] with a personalised AI lesson-planning assistant”.[13] The government suggests this will support teachers in planning lessons and building classroom quizzes, and reduce workloads.

The government also held a “two-day AI hackathon” hosted by the DfE, in collaboration with Faculty AI, the National Institute of Teaching and the AI in Schools Initiative. The event brought together teachers and leaders from schools and trusts across England to experiment with AI. The government also said that it aimed to reduce workload for teachers and school leaders:

Over the coming months, the government will continue to work with teachers, and experts on the Workload Taskforce to develop solutions to minimise the time teachers spend working beyond their teaching hours. This will support its ambition to reduce working hours for teachers and leaders by five hours per week.[14]

The DfE is also shortly due to publish the results of its ‘AI call for evidence’.[15] This was launched to gather views from educational professionals on the risks, ethical considerations, and possibilities of AI in education. The government states that the “results will support the government’s work to identify AI’s potential and ensure it advances in a safe, reasonable, and fair way”.

4. Are there potential harms which could result from using EdTech, and AI in particular?

‘AI in Education’ is a new organisation formed by leaders in state and independent schools in Britain, and led by Sir Anthony Seldon, head of Epsom College. It aims to inform teachers, school leaders and parents about the potential rewards and harms of EdTech, and AI in particular, which it believes “has the potential to be the greatest benefit but also the greatest threat to our students, staff and schools”. Further, AI in Education contends that the “truth is that AI is moving far too quickly for government or parliament alone to provide the real time advice that schools need”.[16]

Among the risks that AI in Education and others suggest could be caused by the use, or improper use, of AI in schools are the following:

  • infantilisation of students (and staff)
  • moral risk, not least through deepfakes
  • perceptions about cheating and dishonesty
  • lack of responsibility—or answers to the question: who is in charge?
  • impact on jobs[17]

These are similar to the potential risks set out by John Bailey in the journal Education Next, who outlined the following areas of concern:

Student cheating. Students might use AI to solve homework problems or take quizzes. AI-generated essays threaten to undermine learning as well as the college-entrance process. Aside from the ethical issues involved in such cheating, students who use AI to do their work for them may not be learning the content and skills they need.

Bias in AI algorithms. AI systems learn from the data they are trained on. If this data contains biases, those biases can be learned and perpetuated by the AI system. For example, if the data include student-performance information that’s biased toward one ethnicity, gender, or socioeconomic segment, the AI system could learn to favor students from that group. Less cited but still important are potential biases around political ideology and possibly even pedagogical philosophy that may generate responses not aligned to a community’s values.

Privacy concerns. When students or educators interact with generative-AI tools, their conversations and personal information might be stored and analyzed, posing a risk to their privacy. With public AI systems, educators should refrain from inputting or exposing sensitive details about themselves, their colleagues, or their students, including but not limited to private communications, personally identifiable information, health records, academic performance, emotional well-being, and financial information.

Decreased social connection. There is a risk that more time spent using AI systems will come at the cost of less student interaction with both educators and classmates. Children may also begin turning to these conversational AI systems in place of their friends. As a result, AI could intensify and worsen the public health crisis of loneliness, isolation, and lack of connection identified by the US Surgeon General.

Overreliance on technology. Both teachers and students face the risk of becoming overly reliant on AI-driven technology. For students, this could stifle learning, especially the development of critical thinking. This challenge extends to educators as well. While AI can expedite lesson-plan generation, speed does not equate to quality. Teachers may be tempted to accept the initial AI-generated content rather than devote time to reviewing and refining it for optimal educational value.

Equity issues. Not all students have equal access to computer devices and the internet. That imbalance could accelerate a widening of the achievement gap between students from different socioeconomic backgrounds.[18]

Similarly, the 5Rights Foundation, founded and chaired by Baroness Kidron, argues that digital services that children and young people use are not designed to meet their needs or uphold their rights.[19]

The 5Rights Foundation has called for digital products and services used by children to be more explicitly designed with children and their key developmental milestones in mind:

The digital environment looks quite different when we look at it from the point of view of a child’s ability to meet their development goals. It has the potential to be a landscape of opportunity, understanding, risk and challenge; designed with children in mind and overseen by regulatory protections that support parental advice or school supervision. In this version of the future, children would be considered at all points in the design cycle, so that they could be participants in a digital world that has considered their development stages and interacts with them on the basis of their age and maturity.


Digital habits start young and impact the journey to adulthood. The need to respect children’s developmental milestones is paramount and must inform research, policy and practice in the digital environment. We cannot solely rely on the resilience or education of children. Digital service providers must design products and services fit for children by acting above and beyond commercial considerations, and legislation and regulation must ensure a robust regime of accountability, transparency and oversight.[20]

Baroness Kidron was particularly critical of the EdTech sector in a speech at a fringe event held before the government’s recent AI Safety Summit, where she said:

AI in ed tech is already such a problem that UNESCO recently published a 500+ page book, ‘The Ed Tech Tragedy’ that forensically points out the failure to ask basic questions about the quality of outcomes for children—social, developmental and pedagogical/educational—before creating a[n] ed tech market that is cannibalising education systems across the world.[21]

The UNESCO report in question examined how school closures during the Covid-19 pandemic, and what it called the “hard pivot” to remote learning with connected technology, resulted in “numerous unintended and undesirable consequences”.[22] The report stated:

Although connected technology supported the continuation of education for many learners, many more were left behind. Exclusion soared and inequities widened. Achievement levels fell, even for those with access to distance learning. Educational experiences narrowed. Physical and mental health declined. Privatization accelerated, threatening education’s unique standing as a public good and human right. Invasive surveillance endangered the free and open exchange of ideas and undermined trust. Automation replaced human interactions with machine-mediated experiences. And technology production and disposal placed new strains on the environment.

Visions that technology could form the backbone of education and supplant school-based learning—in wide circulation at the outset of the health crisis—had promised better outcomes. Ed-tech proponents held that the immense challenges of school closures could be met with technology and that deeper technology integration would transform education for the better. But these high hopes and expectations unravelled when ed-tech was hurriedly deployed to maintain formal education as Covid-19 tore across countries.

The report questioned whether more and faster integration of technology is desirable for learners, teachers and schools.

Professor Victoria Nash, director and senior policy fellow at the Oxford Internet Institute, has also warned about the potential implications of AI in education, warning that a lack of clear government strategy puts “children at risk of significant harm”.[23] On privacy for example, she notes:

Many digital services and apps harvest huge amounts of data from their users, often in lieu of payment, whilst the terms of service explaining this are painfully obscure. Navigating such data protection responsibilities is complex, and schools are poorly resourced to manage this, both in terms of expertise and infrastructure. Investment and training in data protection is definitely needed, as well as provision of more government support and advice.[24]

Professor Nash argues that ethical guidelines for design and use of AI in schools, certification or accreditation of AI tools, and comprehensive data governance would “provide a decent starting point for a UK strategy that would enable safe, positive and effective exploration of the great potential of AI technologies in schools”.

The DfE noted many of the potential risks with using generative AI in its policy paper ‘Generative artificial intelligence (AI) in education’ (updated 26 October 2023). It said that the Office for AI was currently conducting research into the skills that will be needed for future workforce training. In addition, the DfE will continue to work with experts to:

  • consider and respond to the implications of generative AI and other emerging technologies
  • support primary and secondary schools to teach a knowledge-rich computing curriculum to children up to the age of 16

5. Read more



  1. Department for Education, ‘Education technology for remote teaching’, November 2022, p 5. Return to text
  2. As above, p 6. Return to text
  3. Julia Suk, ‘10 trends in education technology that will have a major impact in 2023’, Hurix Digital, 15 November 2023. Return to text
  4. Department for Education, ‘The education technology market in England’, November 2022, pp 9–10. Return to text
  5. As above, p 10. Return to text
  6. Department for Business and Trade, ‘EdTech’, accessed 17 November 2023. Return to text
  7. For an overview of different types of AI technology, see: House of Lords Library, ‘Artificial intelligence: Development, risks and regulation’, 18 July 2023. Return to text
  8. BBC News, ‘Most of our friends use AI in schoolwork’, 31 October 2023. Return to text
  9. John Bailey, ‘AI in education’, Education Next, August 2023, vol 23, no 4. Return to text
  10. Oak National Academy, ‘About us’, accessed 17 November 2023. Return to text
  11. HM Government, ‘Opportunity for all: Strong schools with great teachers for your child’, March 2022, CP 650. Return to text
  12. As above. Return to text
  13. Department for Education, ‘New support for teachers powered by artificial intelligence’, 30 October 2023. Return to text
  14. As above. Return to text
  15. Department for Education, ‘Generative artificial intelligence in education call for evidence’, 14 June 2023. Return to text
  16. AI in Education, ‘Navigating AI’s benefits and challenges’, accessed 17 November 2023. Return to text
  17. AI in Education, ‘AI’s role in the education revolution: Discover, engage, transform’, accessed 17 November 2023. Return to text
  18. John Bailey, ‘AI in education’, Education Next, August 2023, vol 23, no 4. Return to text
  19. 5Rights Foundation, ‘Digital childhood: Addressing childhood development milestones in the digital environment’, updated October 2023. Return to text
  20. As above. Return to text
  21. 5Rights Foundation, ‘Baroness Kidron on putting children at the start of the AI debate’, 2 November 2023. Return to text
  22. UNESCO, ‘An ed-tech tragedy? Educational technologies and school closures in the time of Covid-19’, 2023, p 19. Return to text
  23. Victoria Nash, ‘Government must do more to regulate AI in education’, Schools Week, 6 May 2023. Return to text
  24. As above. Return to text