The House of Lords Democracy and Digital Technologies Committee was an ad hoc committee that sat in 2019 and 2020 to consider democracy and digital technologies. Its report, Digital Technology and the Resurrection of Trust, examined the impact of digital technologies on political campaigning, the electoral process, and the public’s engagement with politics and political debate in the UK.

In its report, the committee stated that the digital and social media landscape was dominated by Facebook and Google, which “largely pass under the radar, operating outside the rules that govern electoral politics”. The committee argued that this had become “acutely obvious” during the Covid-19 pandemic, when online misinformation was “not only a real and present danger to our democracy but also to our lives”. It also criticised governments for being “dilatory in adjusting regulatory regimes to capture these realities”, which had led to a “crisis of trust”.

Therefore, the committee concluded that there was a “need for government leadership and regulatory capacity to match the scale and pace of challenges and opportunities that the online world presents”. This included a call for electoral law in the UK to be “completely updated for an online age”.

Recommendations

The committee made 45 recommendations in its report, addressing a number of concerns it had identified, including the “urgent case” for reforming electoral law and “our overwhelming need to become a digitally literate society”. The committee urged the Government to implement its recommendations. These included calls on the Government to:

  • Publish online harms legislation that covers disinformation as part of its scope, within a year of the report’s publication. The legislation would set out a duty of care and empower Ofcom to “sanction platforms” that “fail to comply with their duty of care”. Sanctions would include fines of up to 4% of global turnover and powers to enforce internet service provider blocking of “serially non-compliant platforms”.
  • Establish an independent ombudsman for content moderation decisions. The committee stated that this would provide a point of appeal for people who have been “let down” by a platform’s decisions on whether to remove content. The ombudsman’s decisions should be binding on the platform and, in turn, establish clear standards for future decisions affecting UK users. The report also detailed how platforms would be able to make representations on how such standards are applied within their moderation processes.
  • Work with political parties, the Advertising Standards Authority (ASA) and other regulators to develop a code of practice that, in addition to “appropriate” sanctions, “restricts fundamentally inaccurate advertising” during a parliamentary or mayoral election or referendum. The code of practice would be overseen by a committee including Ofcom, the ASA and the Electoral Commission and would empower them to remove political advertising that breached the code.
  • Bring forward a bill based on proposals made by the Law Commission in its 2020 report that “comprehensively modernises electoral law”. The committee called for the bill to have completed all its stages in Parliament before the next general election. As part of its review into electoral law, the Law Commission aimed to ensure that the law governing the conduct of elections and referendums was “modern, simple, and fit for purpose”. In September 2020, the Government said that it welcomed the report.
  • Legislate “immediately” through secondary legislation to introduce digital imprints for online political campaign material. A digital imprint would include the name and address of the promoter of the material and the name and address of any person on behalf of whom the material is published.
  • Commit to further reforms to electoral law, including real-time databases of all political advertising on online platforms and an increase in the maximum fine that the Electoral Commission can impose on political campaigners to either £500,000 or 4% of the total campaign spend, whichever is the greater amount.

The committee also made recommendations for increasing digital literacy and making “active digital citizens” through digital media literacy initiatives and changes to the school curriculum.

Government response

The Government published its response to the committee’s report on 1 September 2020. It welcomed the report and recognised the committee’s concerns that digital technology had “introduced risks that could undermine democracy”.

In addition, the Government provided responses to all of the committee’s recommendations. A summary of some of the responses is as follows:

  • Online harms legislation: the Online Harms White Paper, published in April 2019, set out proposals for a new regulatory framework that would “hold technology companies accountable for their action to tackle illegal content, safeguard children and uphold their own terms and conditions”. It would be overseen and enforced by an independent regulator, which would be able to take enforcement action, such as notices, warnings and “substantial fines”, against companies that “do not fulfil their duty of care”. The Government stated that it would publish the full response to the white paper by the end of 2020, followed by legislation “as soon as parliamentary time allows”.
  • Independent ombudsman for content moderation decisions: as set out in the white paper, companies would be expected to have effective and accessible mechanisms for user redress. However, the Government stated that it did not support introducing new regulation on the content of political advertising.
  • Code of practice for political advertising: the Government was “committed to promoting transparency around political advertising” and announced that it would be bringing in new rules to require election material to “explicitly” show who is behind it. However, it also reaffirmed that it did not support the regulation of policy or political arguments, whether online or offline, as these “can be rebutted by rival campaigners as part of the normal course of political debate”.
  • Introduce a bill to modernise electoral law: the Government welcomed the Law Commission’s report and said that it would respond to it. It stated that its “immediate priority” was implementing its 2019 manifesto commitments on electoral law, such as introducing identification to vote at polling stations, and “responding to the needs of the electoral sector” by supporting the operations, resilience and security of elections.
  • Introduce imprints on online political material: a technical consultation on the Government’s proposed digital imprints regime was launched in August 2020. The Government stated that whilst it was possible to introduce a regime through secondary legislation, doing so would “present some limitations to the scope of the policy”. Therefore, the Government said it would decide on the legislative approach after consulting on the policy.
  • Further electoral reforms: the Government noted that it was in the process of assessing the current regulatory system for online advertising through its online advertising programme. Responding to the recommendation to increase the maximum fine that the Electoral Commission can issue, the Government said that it had “no plans” to do so. It also explained that the Electoral Commission can refer “more serious matters” to the police, with these matters considered by a court of law, which has the power to impose unlimited fines and prison sentences.

Recent Government policy

Since the report’s publication, the Government has introduced draft legislation on online safety and a bill in Parliament to reform electoral law.

Draft Online Safety Bill

In December 2020, the Government published its full response to the Online Harms White Paper. In its response, the Government stated that the case for “robust regulatory action” had continued to grow. Therefore, as detailed in the white paper, it would create an online harms framework, to be introduced through an online safety bill. The framework would apply to companies whose services host user-generated content or facilitate interaction between users, one or more of whom is based in the UK, as well as to search engines. The legislation would set out a general definition of harmful content and activity, with a limited number of priority categories of harmful content, “posing the greatest risk to users”, set out in secondary legislation.

The Government stated that the framework would be overseen by Ofcom, which would be named as the independent regulator. Ofcom’s primary duty in enforcing the framework would be to improve the safety of users of online services (as well as of non-users who may be directly affected by others’ use of them). This would include “setting codes of practice, establishing a transparency, trust and accountability framework and requiring all in-scope companies to have effective and accessible mechanisms for users to report concerns”. The Government noted that, to ensure the “effective implementation” of the framework, Ofcom would be equipped with “robust enforcement tools” to tackle non-compliance. These would include the power to issue fines of up to £18 million or 10% of global annual turnover, whichever is the higher amount.

In May 2021, the Government published a draft Online Safety Bill. The draft bill underwent pre-legislative scrutiny from September 2021, and the Joint Committee on the Draft Online Safety Bill published its report on the draft legislation in December 2021. In its report, the committee made several recommendations. These included that the bill be “restructured” so that the Government could “set out its objectives clearly at the beginning”, and that the bill should place a specific responsibility on service providers to have systems in place that can identify “reasonably foreseeable risks of harm” arising from their platforms and take “proportionate steps” to mitigate those risks. In January 2022, the Government stated that it was “considering fully and carefully” the joint committee’s recommendations.

Elections Bill 2021–22

The Government introduced the Elections Bill in the House of Commons on 5 July 2021. The bill would make several changes to electoral law, such as requiring voters in UK parliamentary elections and local elections in England to produce photo ID at polling stations and requiring digital campaign material to include digital imprints. The Government has stated that the bill would “strengthen the integrity of UK elections and protect our democracy”. In contrast, the Opposition and other Members of Parliament have criticised the bill, arguing that measures such as requiring ID at polling stations were a form of “voter suppression”.

The bill passed its House of Commons stages on 17 January 2022. It had its first reading in the House of Lords on 18 January 2022. The bill is scheduled to have its second reading in the House of Lords on 23 February 2022.

Further information, including what the bill seeks to do and what amendments were made to it during its House of Commons stages, can be found in the House of Lords Library briefing on the bill.

This article was first published on 7 February 2022. It was updated on 24 February 2022 to include a rescheduled date for the debate to take place in the House of Lords.

Cover image by jmexclusives on Pixabay.