
    Written evidence submitted by Dr Gabriela Borz, Dr Anna Longhini, Thomas Montgomerie (DIS0041)

    Disinformation diplomacy: The UK and the EU regulatory landscape

    Evidence for the Foreign Affairs Committee.

    Dr Gabriela Borz, Senior Lecturer in Politics at the University of Strathclyde, Glasgow, United Kingdom, and Babes-Bolyai University, Cluj-Napoca, Romania,

    Dr Anna Longhini, post-doctoral research fellow, Babes-Bolyai University, Cluj-Napoca, Romania,

    Thomas Montgomerie, doctoral researcher, Babes-Bolyai University, Cluj-Napoca, Romania

     

    Dr Gabriela Borz is the principal investigator of DIGIEFFECT, a NextGenerationEU research project (EUR 1.4 million, 2023-2026) which investigates the risks associated with digital political campaigning in the European Union. More specifically, the project investigates to what degree regulatory instruments adopted by the EU, by national governments and by online corporations contribute to reducing online mis-, dis- and malinformation during electoral campaigns.

    1. Threats of disinformation campaigns

    Disinformation, the deliberate spread of false information with the intention of manipulating voters, creates democratic vulnerabilities at the individual, societal and system levels. In a 2024 DIGIEFFECT survey of 140 political parties with parliamentary representation in Europe, disinformation was the highest-ranked concern among political parties across the 27 EU member states.

    2. What channels and technologies are states and non-state actors using to spread disinformation?

    Large-scale disinformation generates systemic risks, especially when disseminated by foreign actors. The first principle at risk is free and fair elections. Algorithmic amplification of false messages can give an unfair advantage to certain candidates or parties at the expense of others.

    Who disseminates disinformation ‘abroad’? Foreign actors operating through umbrella organisations, elected foreign politicians, individuals acting as opinion leaders (‘influencers’) and, more recently, online supporter groups across various social media platforms.

    Example of a systemic risk: first round of the Romanian presidential elections, 2024

    The first round of the Romanian presidential elections was annulled by the Constitutional Court in late 2024 following suspicions of foreign actor interference. The ‘virtual’ candidate Calin Georgescu won the first round after an intensive TikTok campaign conducted via influencers. He did not participate in major TV debates and was ranked very low in early opinion polls; additionally, he declared zero campaign expenses. Declassified documents submitted by the secret services to the Supreme Council for National Defence stipulate possible interference by external state actors. The presidential election will be rerun in May 2025.

    3. What lessons can the UK learn in countering state-sponsored disinformation?

    Firstly, we briefly present the EU regulatory overview on disinformation and foreign interference. In order to mitigate the risks of disinformation, the EU developed a comprehensive legal framework for countering disinformation. From 2016 until 2024, that is, from the adoption of the General Data Protection Regulation (GDPR) to the Artificial Intelligence Act (DIGIEFFECT 2024), fourteen EU laws, most of which are soft laws (12 out of 14), make reference to foreign actors (or ‘third countries’) and disinformation. This includes references to ‘hybrid threats’, ‘malicious cyber activities’ and ‘cyber-attacks’, ‘disinformation campaigns’ (e.g. the online disinformation campaigns by Russia in 2015), ‘foreign information manipulation and interference’ (FIMI) and ‘interference in elections’.

    Table 1 below highlights pages, articles or recitals where such references in EU laws are made.

    Table 1. An overview of EU laws tackling the link between foreign actors and disinformation

    Year – EU law – References to foreign actors and disinformation
    2016 – Joint Framework on countering hybrid threats

    P. 2: Massive disinformation campaigns, using social media to control the political narrative or to radicalise, recruit and direct proxy actors, can be vehicles for hybrid threats.

    P. 4: Perpetrators of hybrid threats can systematically spread disinformation, including through targeted social media campaigns, thereby seeking to radicalise individuals, destabilise society and control the political narrative.

    P. 13: Targeting hybrid threat financing / Fringe political parties.

    2018 – Increasing resilience and bolstering capabilities to address hybrid threats

    P. 3: Election periods have proven to be a particularly strategic and sensitive target for cyber-enabled attacks and online circumvention of conventional (“off-line”) safeguards and rules such as silence periods, transparent funding rules, and equal treatment of candidates. This has included attacks against electoral infrastructures and campaign IT systems, as well as politically-motivated mass online disinformation campaigns and cyber-attacks by third countries with the aim to discredit and delegitimise democratic elections.

    P.8: The Framework for a Joint EU Diplomatic Response to Malicious Cyber Activities (the “cyber diplomacy toolbox”) sets out […] restrictive measures that can be used to strengthen the EU’s response to activities that harm its political, security and economic interests.

    2018 – Code of Practice on Disinformation

    P. 1: Disinformation (definition).

    P. 5, II.B: Signatories recognize that approaches to issue-based advertising developed should be reflective of the European market for political and issue-based advertising, and take note of the European Commission Recommendation on election cooperation networks, online transparency, protection against cybersecurity incidents and fighting disinformation campaigns in the context of elections to the European Parliament.

    2018 – Action Plan against Disinformation

    P. 2: Online disinformation campaigns in 2015 by Russia.

    P. 7: Set up of the Rapid Alert System (RAS).

    P. 8: Crucial role of online platforms, advertisers and the advertising industry in tackling the disinformation problem.

    2019 – Council Reg. 796 on restrictive measures against cyber-attacks

    Art. 1.1: cyber-attacks […] constitute an external threat to the Union or its Member States.

    Art. 1.4c: Threat to critical State functions […] including public elections or the voting process.

    2019 – Council Decision 797 on restrictive measures against cyber-attacks

    Art. 1: cyber-attacks […] constitute an external threat to the Union or its Member States.

    Art. 1.4c: Threat to critical State functions […] including public elections or the voting process.

    2020 – European Democracy Action Plan

    P. 3: specific measures to: promote free and fair elections and strong democratic participation; support free and independent media; and counter disinformation.

    P. 4: need for more transparency in political advertising and communication.

    P.5: Clearer rules on the financing of European political parties

    P. 8 – 2.4: Promoting democratic engagement and active participation beyond elections

    Pp. 17-18: disinformation (types).

    P. 19: prevention of the manipulative amplification of harmful content / introduce deterrence by imposing costs on actors engaged in influence operations and foreign interference.

    2022 – Directive on measures for a high common level of cybersecurity across the Union

    Rec. 69: disinformation campaigns / higher societal risk (online platforms).

    Art. 1: measures that aim to achieve a high common level of cybersecurity across the Union.

    Art. 10: Computer security incident response teams (CSIRTs)

    2023 – Revised Implementing Guidelines of the Cyber Diplomacy Toolbox

    P. 21, 8.33: EU external policies on digital and cyber issues, as well as countering hybrid threats, including FIMI, should be coherent and mutually reinforcing. Due to geopolitical dynamics, cyber and digital issues are increasingly intertwined at the international level.

    2023 – Commission Work Programme 2024: Delivering today and preparing for tomorrow

    Pp. 11-12: The EU will continue to develop further the toolbox to counter foreign information manipulation and interference.

    2023 – Directive on Transparency of Interest Representation on behalf of Third Countries, Defence of Democracy package (hard law, proposal)

    Rec. 44: citizens as voters […] can be the target for certain interest representation services.

    P.3: high risk of covert foreign interference. The European Parliament and the Council have underlined the importance of addressing the threat to democracy posed by foreign interference. These concerns have intensified since Russia’s war of aggression against Ukraine.

    2023 – Commission Recommendation on inclusive and resilient electoral processes in the Union

    Rec. 48-49: Foreign interference.

    Point 1: This Recommendation promotes high democratic standards for elections in the Union and the enhancement of the European nature and efficient conduct of the elections to the European Parliament.

    2024 – Regulation on the transparency and targeting of political advertising (hard law)

    Rec. 4: Political advertising can be a vector of disinformation, in particular where the advertising does not disclose its political nature, comes from sponsors outside of the Union or is subject to targeting techniques or ad-delivery techniques. […] Transparency of political advertising contributes to enabling voters and individuals in general to better understand when they are being presented with a political advertisement, on whose behalf that advertisement is being made, as well as how and why they are being targeted by a provider of advertising services, so that voters are better placed to make informed choices.

    Rec. 19: Interference in elections by certain third-country entities or third-country nationals, who may sponsor political advertising in the Union, is known to pose a serious threat to democracy […].

    Art. 3.2: Political advertising [Definition].

    2024 – Guidelines for providers of VLOPs and VLOSEs on the mitigation of systemic risks for electoral processes

    P. 1: A wide range of phenomena involving online platforms and search engines give rise to a heightened risk to election integrity. These include, but are not limited to, the proliferation of illegal hate speech online, threats linked to foreign information manipulation and interference (“FIMI”) as well as the wider phenomenon of disinformation, the spread of (violent) extremist content and such with the intent to radicalise people, as well as the spread of content generated through new technologies such as generative Artificial Intelligence (“AI”).

    P. 5, Section 3: Election-specific risk mitigation measures.

    P. 10(e): Political advertising: ads need to be labelled in a clear, salient and unambiguous manner and in real time, to allow users to understand that the content displayed contains political advertising.

    Source: authors’ data, available at www.digieffect.eu (data visuals).

     

    Secondly, our research shows that online platforms, which are the facilitators of online political content, maintain a dialogue with state actors via soft laws. One way to evaluate the effectiveness of EU soft regulation on disinformation is to investigate how online platforms respond to the EU Code of Practice on Disinformation (2018 and 2022). We developed a compliance framework (Borz et al 2024) for assessing platform compliance with EU soft law related to disinformation. Bridging the literatures on EU governance and corporate governance, we outline the following stages in the platform compliance continuum: (1) symbolic commitment: written agreement to regulatory principles, with no clear commitment taken; (2) formal commitment: written pledge to implement principles and actions (new platform policies); (3) implementation of EU law: reported actions aligned with the commitments taken; (4) corporate governance changes: reported organisational and procedural changes in response to soft and hard regulation.
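    The four-stage compliance continuum described above can be sketched as a simple ordered coding scheme. This is an illustrative Python sketch only: the class, function and parameter names are our own shorthand, not part of the DIGIEFFECT framework or its published coding instrument.

    ```python
    from enum import IntEnum


    class ComplianceStage(IntEnum):
        """Illustrative coding of the platform-compliance continuum (after Borz et al 2024)."""
        SYMBOLIC_COMMITMENT = 1   # written agreement to principles, no clear commitment taken
        FORMAL_COMMITMENT = 2     # written pledge to implement principles (new platform policies)
        IMPLEMENTATION = 3        # reported actions aligned with commitments taken
        GOVERNANCE_CHANGE = 4     # reported organisational and procedural changes


    def classify(pledged: bool, reported_actions: bool, org_changes: bool) -> ComplianceStage:
        """Hypothetical helper: maps coded evidence from a platform report to a stage.

        Assigns the highest stage for which evidence is present, reflecting the
        ordered nature of the continuum.
        """
        if org_changes:
            return ComplianceStage.GOVERNANCE_CHANGE
        if reported_actions:
            return ComplianceStage.IMPLEMENTATION
        if pledged:
            return ComplianceStage.FORMAL_COMMITMENT
        return ComplianceStage.SYMBOLIC_COMMITMENT
    ```

    Because the stages are ordered, coded reports can be compared directly, e.g. a platform at stage (3) has progressed further along the continuum than one at stage (2).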

    Our analysis of reports submitted by major VLOPs and VLOSEs under the 2018 and 2022 Code of Practice on Disinformation (CPD) shows that platform responsiveness to EU law is high mainly on the regulatory aspects prioritised by platforms, and when the reporting requirements are not too rigid. Most companies fail to discuss the theme of democratic rights and freedom of expression to the same (or a similar) extent as the EU’s CPD. Instead, most companies devote a large part of their annual reports to integrity of services, empowerment of users and coordinated responsibility. A level of asymmetry between the EU’s priorities and those of each signatory can be observed across most themes.

    Lastly, we address a few regulatory gaps in the UK. Despite the UK’s legislative framework for addressing the complex and multifaceted nature of disinformation campaigns in elections, a few regulatory gaps remain. If left unaddressed, these gaps could further enable malicious state and non-state actors and further erode democratic processes. Here we outline the key gaps in the current legislation and their potential effects for citizens and for democracy more broadly, and offer a list of remedial steps.

    Part 6 of the Elections Act (2022) mandates that a digital imprint containing the name and address of the promoter must be included on certain digital campaign material, both paid and unpaid (organic). Instances where a campaigner is suspected of having breached the requirements can be investigated further, ensuring a degree of compliance with the obligations. However, whilst improving ad transparency for users may increase trust, we argue this is not sufficient to reduce the spread of disinformation or instances of foreign interference. For example, current legislation does not require digital imprints to include any disclaimer about the accuracy of claims made within the content, how much was spent on the material, or comprehensive details about the promoter.

     

    Three issues consequently arise. First, the lack of fact-checking of political materials shared across online platforms means users can still be misled by false and malicious claims, regardless of the inclusion of a digital imprint. Indeed, false and dangerous claims made within a digitally imprinted material could appear more authentic to the public, and in turn exacerbate the spread of disinformation. Digital materials which require an imprint should thus pass through additional authenticity checks by independent moderators before they are published, and should include a fact-checked label.

    Second, failing to include any information about how a digital material is funded opens the possibility of money from outside the UK being used to influence public attitudes or to promote false narratives against a particular candidate, party or political issue. The inclusion of financial information, and of who paid for the digital material, would help to disrupt and deter foreign interference by better tracing how digital materials are funded.

    Finally, beyond providing basic details about the name and address of the promoter, digital imprints do not give users more accessible information about the promoter, such as an organisation’s purpose and main activities. With this information, users could better evaluate the authenticity of content and decide whether to trust the promoter sharing the digital material. A similar issue exists with the digital imprint criterion for paid adverts, which is limited to the payment made to the service provider or platform for hosting the material. We currently lack any transparency about the cost to create, set up, operate or maintain the material, and about who was responsible for funding these steps. This leaves the door open for malicious foreign states to fund and spread disinformation. We recommend greater financial transparency measures for digital materials, which would help detect instances of foreign interference during elections.
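    The contrast between the current imprint requirements and the additional transparency fields recommended above can be represented, purely illustratively, as an extended imprint record. All class and field names below are hypothetical and are not drawn from the Elections Act (2022) or any official schema.

    ```python
    from dataclasses import dataclass
    from typing import List, Optional


    @dataclass
    class DigitalImprint:
        # Fields currently required under Part 6 of the Elections Act (2022)
        promoter_name: str
        promoter_address: str


    @dataclass
    class ExtendedDigitalImprint(DigitalImprint):
        # Additional transparency fields recommended in this submission (hypothetical names)
        fact_check_label: Optional[str] = None    # outcome of independent authenticity checks
        amount_spent_gbp: Optional[float] = None  # creation, set-up and operating costs
        funder: Optional[str] = None              # who paid for the material
        promoter_purpose: Optional[str] = None    # organisation's purpose and main activities


    def missing_transparency_fields(imprint: ExtendedDigitalImprint) -> List[str]:
        """Hypothetical check: list the recommended fields a material still omits."""
        recommended = ("fact_check_label", "amount_spent_gbp", "funder", "promoter_purpose")
        return [field for field in recommended if getattr(imprint, field) is None]
    ```

    A material that carries only today’s legally required imprint would be flagged as omitting all four recommended fields, which is precisely the transparency gap this submission highlights.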

     

    Beyond imprints, digital campaign materials are still subject to Section 106 of the Representation of the People Act (1983), which, alongside broader regulations on expenses, donations and campaign conduct, prohibits false statements about rival candidates’ character or conduct. Scarce restrictions exist, however, for addressing dishonest content such as misleading or false policy pledges. This is particularly concerning for political campaigns, which are increasingly fought online. Apart from advice from the Electoral Commission encouraging users to think critically and fact-check information, campaign content shared on social media is primarily subject to individual platforms’ policies and terms of service. Urgent regulation needs to be introduced which prohibits candidates from spreading disinformation in elections, with stringent measures in place for those seeking to mislead the public and distort debates. Since digital imprints already represent commendable progress in improving transparency and begin to tackle disinformation and foreign interference, we recommend integrating fact-check labels, details about who paid for the material, and more complete information about the promoter, which can address the current regulatory gaps on disinformation and foreign interference.

     

     

    References:

    Borz, G., & De Francesco, F. (2024). Digital political campaigning: contemporary challenges and regulation. Policy Studies, 45(5), 677–691. https://doi.org/10.1080/01442872.2024.2384145

    Borz, G., De Francesco, F., Montgomerie, T. L., & Bellis, M. P. (2024). The EU soft regulation of digital campaigning: regulatory effectiveness through platform compliance to the code of practice on disinformation. Policy Studies, 45(5), 709–729. https://doi.org/10.1080/01442872.2024.2302448

    Borz, G. (2024) Digital political campaigning: Platform compliance with EU law, Research Features, 154. DOI: 10.26904/RF-154-7290336456

    Longhini, A. (2024) How is the EU preparing for the 2024 EU Parliament Elections? The Defence of Democracy Package and beyond. DIGIEFFECT blog#1. Available at: https://digieffect.eu/how-is-the-eu-preparing-for-the-2024-eu-parliament-elections-the-defence-of-democracy-package-and-beyond/

    Mitrea, E. (2024) (Not) Spending it all: restrictions on online political advertising expenses – DIGIEFFECT blog#2. Available at: https://digieffect.eu/not-spending-it-all-restrictions-on-online-political-advertising-expenses/

    Almodt, R. (2024) Digital campaigning: scope, practice and potential described more by opposition parties across Europe, DIGIEFFECT blog#3. Available at: https://digieffect.eu/3-digital-campaigning-scope-practice-and-potential-described-more-by-opposition-parties-across-europe/

    Montgomerie, T. (2024) From print to pixels: mapping the developments of UK digital campaigning, DIGIEFFECT blog#4. Available at: https://digieffect.eu/blog-4-from-print-to-pixels-mapping-the-developments-of-uk-digital-campaign-regulation/

    Borz, G. (2025) Why digital electoral campaigning needs urgent regulation, The Loop, ECPR’s Political Science Blog.

     
