If you are looking for support with suicide or self-harm, or just want someone to talk to, the NHS website has links to a number of organisations that can help.

On 20 January 2022, the House of Lords is due to consider the following question for short debate:

Baroness Kidron (Crossbench) to ask Her Majesty’s Government what assessment they have made of the role played by social media in the deaths of UK children, including by suicide, self-harm and murder.

Background

People born from around the year 2000 have been described as ‘digital natives’, growing up in an era of user-friendly digital technology. Social media platforms form part of this environment, and usage of these platforms amongst children is high. Ofcom’s Children and Parents: Media Use and Attitudes Report 2020/21 found that just over half of 5- to 15-year-olds used social media sites or apps. This increased to 87% amongst 12- to 15-year-olds.

Social media platforms, and online platforms in general, can be sources of learning, advice and support for children and young people. However, concern has been expressed that social media use could be contributing to higher rates of self-harm and suicide among children. Cases cited include that of Molly Russell, who took her own life at the age of 14 after viewing graphic images of self-harm and suicide on the social media platform Instagram. Following her death, Instagram said that it would ban graphic images of self-harm as part of a series of changes.

Rates of suicide and self-harm

Suicide

Rates of suicide among male 15- to 19-year-olds have generally increased over the past 10 years, although they are currently lower than the peak rates seen in the late 1980s and late 1990s. There has also been a general increase in rates for females aged 15 to 19 over the last seven years. However, in contrast to males, the highest rate in the series for females was recorded in 2019, and the overall trend for females has been flatter.

Figure 1: Age-specific suicide rates per 100,000 population by sex and for 15- to 19-year-olds, England and Wales, 1981 to 2020 registrations

Source: Office for National Statistics, ‘Dataset: Suicides in England and Wales’, 7 September 2021, table 5. The ONS also publishes rates for 10- to 14-year-olds but due to low numbers it considers these figures to have low reliability.

In 2020, rates of suicide per 100,000 for males and females aged 15 to 19 were 6.8 (115 registered deaths) and 2.8 (45 registered deaths) respectively. The highest rates in 2020 were amongst males and females aged between 45 and 49, at 24.1 per 100,000 (457 registered deaths) and 7.1 per 100,000 (138 registered deaths) respectively. The ONS provides a more in-depth analysis of rates across age groups in its statistics bulletin on suicide in England and Wales.
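
For readers who want to see the arithmetic behind these figures, the sketch below shows how an age-specific rate per 100,000 relates to the number of registered deaths and the size of the population group. This is a minimal illustration of the standard rate formula, not the ONS's exact methodology; the population figures are back-calculated from the quoted rates and death counts and are therefore approximate.

```python
# Minimal sketch: relationship between an age-specific rate per 100,000,
# registered deaths and the population of the age-sex group.
# Figures are taken from the text above; the implied populations are
# approximations, not official ONS estimates.

def rate_per_100k(deaths: int, population: float) -> float:
    """Age-specific rate = deaths / population * 100,000."""
    return deaths / population * 100_000

def implied_population(deaths: int, rate: float) -> float:
    """Back-calculate the population implied by a quoted rate and death count."""
    return deaths / rate * 100_000

# Males aged 15 to 19, England and Wales, 2020: 115 deaths at 6.8 per 100,000.
males_15_19 = implied_population(115, 6.8)    # roughly 1.69 million
# Females aged 15 to 19, 2020: 45 deaths at 2.8 per 100,000.
females_15_19 = implied_population(45, 2.8)   # roughly 1.61 million

print(f"Implied male 15-19 population:   {males_15_19:,.0f}")
print(f"Implied female 15-19 population: {females_15_19:,.0f}")
```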

Self-harm

The Health Behaviour in School-aged Children (HBSC) survey, which reports every four years, provides data on a range of areas, including deliberate self-harm; the 2018 survey covered 3,398 young people in England. Only 15-year-olds were asked about self-harm. Of these, 25% reported that they had ever self-harmed, an increase from 22% in the 2014 survey. Self-harm was more common among girls than boys, but rates had risen more among boys than girls:

Twice as many girls as boys reported that they had undertaken DSH [deliberate self-harm]; 16% of boys compared to 35% of girls. Reporting self-harming has however increased among boys (rising from 11% in 2014 to 16% in 2018); a smaller increase was noted among girls (increasing from 32% to 35%).

The survey found that 48% of young people who reported self-harm had done so only once, although 2% of boys and 6% of girls reported self-harming every day. However, it noted that “fewer young people [both boys and girls] reported self-harming every day compared to the previous 2014 survey (4% vs 13%)”.

An article published in the British Medical Journal in 2017 stated that “the elusive nature of self-harm” represented a major obstacle to quantifying it accurately. However, using the Clinical Practice Research Datalink (an electronic database of primary care patient records), academics at the University of Manchester found that the incidence of self-harm among girls aged 13 to 16 increased by 68% between 2011 and 2014.

In February 2021, data analysed for BBC Radio 4’s File on 4 programme indicated that the number of children aged 9 to 12 who were admitted to hospital for intentionally hurting themselves had risen from 221 in 2013–14 to 508 in 2019–20.

Social media, online content and risks of harm

In its report on freedom of expression in the digital age, the House of Lords Communications and Digital Committee argued that “while providing children with an unprecedented range of information and opportunities to learn, the internet presents risks; children can see violent, pornographic or disturbing content”. The committee cited Ofcom’s Children and Parents: Media Use and Attitudes Report 2020/21, which stated that 48% of parents were concerned about the content on sites or apps that their child visited and 54% were concerned about their children seeing content which encouraged them to harm themselves. Ofcom also found that “although some parents found it hard to control their child’s screen time in 2020, half of parents whose child went online felt that the benefits of the internet for their child outweighed any risks”.

Quantifying a link between social media use and children’s health is complicated. In a commentary on ‘Screen-based activities and children and young people’s mental health and psychosocial wellbeing’, the UK Chief Medical Officers have said that “many factors affect mental health and it can be difficult to disentangle these factors from any effect caused by screen or social media use”.

The House of Commons Science and Technology Committee published a report on the impact of social media and screen-use on young people’s health in January 2019. The committee opened its report by saying that its search for “unambiguous answers” to questions about whether the growing use of social media by children was healthy or harmful was “hindered by the limited quantity and quality of academic evidence available”. The committee stated that:

We found that the majority of published research did not provide a clear indication of causation, but instead indicated a possible correlation between social media/screens and a particular health effect. There was even less focus in published research on exactly who was at risk and if some groups were potentially more vulnerable than others when using screens and social media.

The committee recommended that the Government commission research into this area. However, it also said the absence of “good academic evidence” was not evidence that social media and screens had no effect on young people. The committee stated that alongside evidence that social media could be “a force for good”, it also received evidence of potential negative impacts “from detrimental effects on sleep patterns and body image through to cyberbullying, grooming and ‘sexting’”. The committee argued that “generally, social media was not the root cause of the risk but helped to facilitate it, while also providing the opportunity for a large degree of amplification […] this was particularly apparent in the case of the abuse of children online, via social media”.

In its response to the committee’s report, the Government said it recognised the need for further research and evidence on online harms. It referred to work that the Department for Digital, Culture, Media and Sport (DCMS) had done with the UK Council for Internet Safety (UKCIS) and Ofcom. The Government said that the DCMS had recently commissioned research projects to:

  • build an understanding of online platforms and their current capabilities in mitigating the risk of harm to users;
  • use ethnographic research to observe how parents of 10- to 13-year-olds are helping their children stay safe online; and
  • review the literature on adult experiences of harm online.

DCMS was also developing plans for future research, and the future online harms regulator would work with UK Research and Innovation on targeted research into online harms.

More recently, the Joint Committee on the Draft Online Safety Bill’s December 2021 report stated it had received evidence linking self-harm and suicide attempts with accessing content online:

Ian Russell, founder of the Molly Rose Foundation, told us that in 26 percent of cases where young people present to hospital with self-harm injuries and suicide attempts, those young people have accessed related content online. The Samaritans reported that children as young as 12 have accessed suicide and self-harm material online. We have heard that, while children and young people are particularly at risk, adults can also be led to suicide and self-harm as a consequence of online content and activity.

In its 2018 ‘How Safe are our children?’ report, the NSPCC argued that “cases of abuse that might traditionally [have] been considered ‘offline’, such as institutional abuse, interfamilial abuse or where children are abused by someone in a position of trust, are now likely to have been facilitated through social media”. It noted that children who had been subjected to online abuse had reported a range of negative experiences including self-harm, along with flashbacks, anxiety and self-blame.

The HBSC survey also asked questions about social media use in order to distinguish normative from problematic use. Based on a scoring system (which included reporting at least one of the following behaviours: neglecting other activities, having arguments with others, or having serious conflict with parents or siblings), the survey identified 12% of respondents as having a problematic relationship with social media. Girls were more likely than boys to be identified as problematic users (14% compared with 9%), and problematic social media use was most common amongst 13-year-old girls (18%).
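
As an illustration only, the sketch below shows how a threshold-based classification of this kind might work. The behaviour item names, the threshold and the example data are hypothetical stand-ins; the HBSC survey's actual instrument and scoring rules are not reproduced here.

```python
# Hypothetical sketch of a threshold-based classification of "problematic"
# social media use, in the spirit of the scoring system described above.
# The behaviour items, threshold and example data are illustrative
# assumptions, not the HBSC survey's actual instrument.

BEHAVIOUR_ITEMS = [
    "neglected_other_activities",
    "arguments_with_others",
    "serious_conflict_with_family",
]

def is_problematic(responses: dict[str, bool], threshold: int = 1) -> bool:
    """Classify a respondent as 'problematic' if they report at least
    `threshold` of the listed behaviours."""
    reported = sum(responses.get(item, False) for item in BEHAVIOUR_ITEMS)
    return reported >= threshold

# Example respondent reporting one of the behaviours.
respondent = {
    "neglected_other_activities": True,
    "arguments_with_others": False,
    "serious_conflict_with_family": False,
}
print(is_problematic(respondent))  # True under the at-least-one rule
```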

Writing to the Lancet in July 2019, academics from Bristol Medical School said there was concern that the incidence of affective disorders and self-harm was rising among adolescents. They presented a short analysis of suicide rates among 15- to 19-year-olds which, they argued, suggested that “the increasing number of adolescents presenting with self-harm and affective disorders is likely to be associated with increased morbidity in the population”. They said several factors had been associated with the problem, including social media:

Many factors have been associated with the increasing levels of distress reported by adolescents, including the financial crisis in 2008, social media, cyberbullying, increasing academic pressures, and broader concerns about job prospects, financial security, and global politics.

They argued that “research is urgently needed to clarify whether recent trends reflect a real deterioration in adolescent mental health and, if so, the key drivers of this change”.

More recently, the Covid-19 pandemic has intensified risk factors for child mental health disorders.

There have also been concerns about links between social media and violence amongst young people. In an interview with the Times in 2018, the Metropolitan Police Commissioner, Cressida Dick, expressed concern about “the impact of social media in terms of people being able to go from slightly angry with each other to ‘fight’ very quickly”. This concern has also been expressed more recently in the media, for example in articles in the Independent and the Telegraph in December 2021.

Government position

The Government’s draft Online Safety Bill was published on 12 May 2021. The bill seeks to address a wide range of concerns relating to online safety. Amongst its provisions, the bill would impose “duties of care in relation to illegal content and content that is harmful to children on providers of internet services which allow users to upload and share user-generated content”.

In its December 2020 response to the consultation on its online harms white paper, the Government said its online harms framework would include provisions to address suicide, self-harm and eating disorder content:

  • The online harms framework will place regulatory responsibilities on in-scope companies likely to be accessed by children to protect their child users from harmful content and activity, including suicide, self-harm and eating disorder content. However, there are wider government-led initiatives to develop voluntary cooperation in this area ahead of legislation.
  • The Department of Health and Social Care has coordinated a strategic partnership with social media companies and the Samaritans to set guidance on moderating suicide and self-harm content, and educating users to stay safe online.

The consultation response also discusses the online harms suffered by children and young people, as well as the positive impacts that being online can have for them.

The Government has said the strongest protections in the bill are for children. In answer to a written question on age verification for social media sites, the Government stated that companies covered by the legislation would have to “ensure that only users who are old enough are able to access services which have age restrictions or which risk causing them harm”. Ofcom would be able to take enforcement action against a company that failed to comply. The Government also stated that the largest and most high-risk companies would have to publish annual transparency reports setting out how they are tackling online harms:

The largest and most high-risk companies will also be required to publish annual transparency reports about the steps they are taking to tackle online harms. This will include steps they are taking to fulfil their safety duties and provide a higher level of protection for children. Ofcom can take robust enforcement action where companies do not provide the required information.

Additionally, the Government stated that a roundtable with social media companies had been hosted by the Secretary of State for Digital, Culture, Media and Sport and the Secretary of State for Education, as well as the Children’s Commissioner. The Government said that the companies “pledged to identify further information regarding children on their platforms and the nature of harms children may face”.

The Department of Health and Social Care, through the National Institute for Health Research (NIHR), has funded a systematic review to “explore the relationship between social media and other online content and body image and disordered eating in children and young people”.

Joint Committee on the Draft Online Safety Bill

The Joint Committee on the Draft Online Safety Bill published its report on the draft legislation on 14 December 2021.

The joint committee argued that “for too long” online service providers had been allowed to consider themselves “as neutral platforms which are not responsible for the content that is created and shared by their users”. It argued that this was “why the driving force behind the Online Safety Bill is the belief that these companies must be held liable for the systems they have created to make money for themselves”.

Amongst its findings, the joint committee said it was concerned about ‘design features’ of online services that carried a risk of exacerbating harm. For example, algorithms which promoted content to users based on their interests could “constantly recommend pictures of self-harm to a vulnerable teenager”. The committee argued that addressing risks of this kind was more effective than removing individual pieces of content. It recommended that the bill include a specific responsibility on service providers to ensure they “have in place systems and processes to identify reasonably foreseeable risks of harm arising from the design of their platforms and take proportionate steps to mitigate those risks of harm”.

Speaking in a House of Commons debate on the joint committee’s report, Chris Philp, Parliamentary Under Secretary of State for Digital, Culture, Media and Sport, urged social media firms to act prior to the enactment of the Online Safety Bill. He argued that they could “edit their algorithms tomorrow […] they should not be waiting for us to legislate; they should do the right thing today”.

Social media company policies

The social media platform Facebook (now part of Meta) has stated that it “cares deeply” about the safety of the people who use its applications and has set out its policy on suicide and self-injury. As part of this, it states:

We regularly consult with experts in suicide and self-injury to help inform our policies and enforcement, and work with organisations around the world to provide assistance to people in distress.

While we don’t allow people to intentionally or unintentionally celebrate or promote suicide or self-injury, we do allow people to discuss these topics because we want Facebook to be a space where people can share their experiences, raise awareness about these issues and seek support from one another.

Other social media platforms also have content and behaviour policies. For example, Twitter prohibits the promotion or encouragement of suicide and self-harm and has policies against abusive behaviour and harassment.
