Blog #4 From Print to Pixels: Mapping the developments of UK digital campaign regulation
6 August 2024
By Thomas Montgomerie, PhD candidate, Babes-Bolyai University, Cluj-Napoca, Romania
Digital technologies are now a well-established tool in British political parties’ campaign arsenal, particularly since 2015. Contemporary regulation has attempted to keep pace with the multifaceted nature of digital political campaigns, though some associated risks continue to evade UK parliamentary legislation. So, which aspects have been addressed and what remains broadly unregulated? Looking at the key digital regulatory measures in the UK, I provide an overview of the most important developments to date and consider which pitfalls still need to be addressed.
Before the advent of Web 1.0 and 2.0, political campaigns centred on reaching voters through traditional techniques like canvassing and broadcast advertisements. The Representation of the People Act (1983) (ROPA) laid out specific regulatory rules on expenses, donations and campaign conduct, but these rules were unfit to deal with the complexity of modern digital campaigns that emerged with the launch of social media in the 2000s. Despite calls from the Electoral Commission for stricter candidate spending limits and imprints on digital campaign materials, online electoral campaigns largely operated within an unregulated space and escaped legislative scrutiny.
As digital campaigning became a more prominent electoral feature, and catalysed by the Cambridge Analytica scandal and alleged foreign interference in the 2016 US Presidential election, the onus was on the UK government to develop enhanced regulation amid renewed concerns about risk management and electoral integrity. The Digital Charter and Internet Safety Strategy proposed a social media code of practice and tools to detect and reduce harmful content, but concrete regulatory measures remained absent. It was not until the Data Protection Act (2018) that the Conservative Government signalled its legislative ambitions, providing individuals with new rights over the collection and use of their personal data. Guidance was provided on fairly processing personal data for electoral campaign purposes, but it did not go far enough to ensure full transparency of political advertising.
Following a series of recommendations, including restrictions on foreign donations, increased fines for illegal online campaign activities and improved transparency, Part 6 of the Elections Act (2022) mandated that imprints be included on digital campaign materials, containing the name and address of the promoter. Instances where a campaigner is suspected of having breached the requirements can be investigated and prosecuted, ensuring a degree of compliance with the obligations. The legislation does not, however, require information about the amount spent on digital campaign material, why someone was targeted or what personal data was used. Whilst some of this data may be found for political adverts on Facebook through Meta’s Ad Library, no independent searchable repository exists for content produced and disseminated by parties and other political organisations across online platforms, revealing a shortcoming of the current political advertising regulation.
The Online Safety Act (2023) indirectly addresses specific digital campaign risks, establishing new responsibilities for platforms and search engines to protect user safety and strengthening the enforcement powers of Ofcom. Specifically, Chapter 7 (Section 152) requires Ofcom to launch a committee focused on tackling and preventing online misinformation and disinformation. Chapter 8 outlines steps to improve public understanding and protect against harmful content, while Chapter 3 compels larger online platforms to adhere to their own policies, with fines and disruption measures for non-compliance. Whether these latest measures aid democratic engagement and user empowerment remains to be seen. Despite omitting any explicit reference to digital campaigns (or even elections), the Online Safety Act does begin to tackle false information and places greater responsibility on platforms to improve their services.
Although commendable legislative progress has been made, issues persist with digital campaign content across platforms. Campaign materials are still subject to Section 106 of ROPA, which prohibits false statements about rival candidates’ character or conduct. Few restrictions exist, however, to address dishonest content such as misleading or false policy pledges.
This is particularly concerning for campaigns fought on social media. Beyond the Electoral Commission’s advice that users think critically and fact-check information, campaign content posted on social media is primarily subject to individual platforms’ policies and terms of service. Yet studies have demonstrated the ineffectiveness of these measures in preventing the spread of misleading campaign information. And with the newly elected Labour government’s election manifesto seldom providing concrete policy proposals to address such risks, questions remain about the Government’s commitment to preventing the spread of misinformation online.
Looking ahead, the incoming Labour government brings some uncertainty to the future of digital campaign regulation, and the UK risks falling behind globally, with the EU and US pushing ahead with legislation to keep up with increasingly complex digital campaign practices. Labour’s manifesto promises to ‘build on the Online Safety Act…and explore further measures to keep everyone safe online’ (p.103) and to conduct a ‘Strategic Defence Review’ (p.15), but it offers little detail on what exactly Labour hopes to change in digital campaign regulation. The recent King’s Speech setting out Labour’s plan for government omitted precise details on the manifesto’s proposal to introduce a new Artificial Intelligence Bill, and no part of the manifesto was dedicated to resurrecting or reforming the previous government’s Data Protection and Digital Information Bill.
Notwithstanding these shortcomings, there is still reason to be optimistic. Other political parties are taking more seriously the challenges and opportunities created by digital technologies. The Liberal Democrats propose a Digital Bill of Rights to protect individuals’ rights to freedom of expression and privacy, the introduction of a public database of online political adverts to improve transparency, and a global treaty to tackle disinformation and electoral interference. The Greens have offered similar policy pledges, as well as supporting an internationally coordinated response to the dangers of AI.
Beyond party promises, regulation can go further to improve digital political campaigning on several fronts. The inclusion of fact-check labels on online content published by parties, together with digital literacy training from an early age, can ensure individuals are equipped to source reliable information and help prevent the spread of misinformation. Future regulation should also restrict the use of deepfakes during elections and promote fairer access for parties to campaign online, regardless of size or budget. Finally, to strengthen public oversight, the Government should establish an independent body to oversee digital campaign practices and enforce stringent penalties for non-compliance. Altogether, these recommendations and policy proposals aim to foster a more accessible and transparent digital campaign environment, enabling parties and citizens alike to better navigate the digital frontier in future elections and beyond.