
The Threat of Disinformation Ahead of the UK General Election

3 Jun 2024

Report


A Rise in Fake News:

Polis Analysis’s mission is to provide high-quality, impartial analysis of global politics – a mission that depends on the ability to separate fact from fiction. In a world of ever-advancing technological capabilities, historically low trust in the press, and widespread access to near-infinite digital resources, this task has become increasingly difficult.


Misinformation and disinformation – the unintentional and intentional spreading of false or misleading information, respectively – have posed a challenge to objectivity and truth since long before the 21st century. However, the recent surge in publicly available large language models (LLMs) driven by artificial intelligence (AI) is but one example of modern technology now capable of considerable, even world-altering, harm. Within the realm of fake news, an increased capacity to disseminate rumour, bias, narrative, and lies as though they were fact poses, if left unchecked, an unignorable threat to the public interest [1].


Already in recent years we have witnessed:

  • A 550% increase in deepfake videos between 2019 and 2023 [2]

  • A 3,000% increase in deepfake fraud in 2023 [3]

  • A 2,946% increase in usage of the term “fake news” within Google Books between 2012 and 2019 [4]

  • 88% of individuals surveyed by Polis Analysis feeling “at risk” of, or having already “fallen victim” to, fake news [5]


As claimed by Professor Stephen Hutchings, principal investigator of the University of Manchester/Chatham House (Mis)Translating Deceit research project, disinformation became “a general problem of our era” following the 2016 US Presidential Election [6], and has since witnessed an “unprecedented surge” following Russia’s invasion of Ukraine [7]. Disinformation has subsequently been recognised by NATO as a means by which Russia – the “most significant and direct threat… to peace and stability in the Euro-Atlantic area” – attempts to “establish spheres of influence” against NATO Allies and partners [8] – a message that Polis Analysis itself heard directly when attending the NATO Youth Summit in Miami this May [9]. Still gaining momentum, misinformation and disinformation are now projected by the World Economic Forum (WEF) to be the most severe risk facing the world over the next two years, as well as the fifth-largest risk over the next ten [10].


2024 has been described by Chatham House Research Fellow Isabella Wilkinson as a “perfect storm” for disinformation; she argues that a “whole society potential threat” requires a “whole of society response.” [11] Yet, as the UK heads towards a general election, the country remains behind the curve on the government legislation, social media regulation, and digital literacy skills needed to combat these threats.


The NSA, OSA, and Ofcom:

The Online Safety Act 2023 (OSA) was a great opportunity for Westminster to begin coordinating a proactive, holistic response to the online dangers facing the UK. However, as Polis Analysis repeatedly warned when submitting evidence to the bill’s pre-legislative scrutiny committee, the OSA does not go far enough in tackling online disinformation [12].


Section 165 of the OSA amends the Communications Act 2003, mandating Ofcom to help the public “understand the nature and impact of disinformation and misinformation, and reduce their and others’ exposure to it.” To achieve this, section 152 instructs the regulator to establish an advisory committee on disinformation and misinformation, tasked with producing a report recommending how Ofcom can fulfil these obligations. Yet despite Ofcom’s own written evidence to Parliament’s Joint Committee on the National Security Strategy (JCNSS) Defending Democracy inquiry acknowledging misinformation as the most prevalent potential harm currently encountered by adults, with a majority of respondents to an Ofcom survey considering the issue to be of “high concern” [13], the regulator has failed to make sufficient progress on its legal duties.


The regulator describes 2024 as “a particularly important year for people to feel comfortable and able to recognise misinformation”, citing the upcoming UK and US elections [14]. However, as confirmed in response to a Freedom of Information (FOI) request in April – six months after the bill became law – Ofcom had still not established this committee [15]. The regulator has also separately stated that the committee would only “probably” be assembled “towards the end of this year.” [16] With the report’s 18-month deadline beginning only once the committee has been created, and in spite of the regulator’s own acknowledgement of the imminent dangers fake news poses in 2024, Ofcom’s timeline pushes a final report on how these threats can be countered back to 2026. Despite this rhetoric of urgency, such inaction fails to reflect the immediate gravity of the situation at hand.


The OSA’s establishment of “false communication offences” also entails unnecessary exemptions for organisations in need of greater public scrutiny. While section 179 makes the intentional sending of false information likely to cause non-trivial psychological or physical harm punishable by up to 51 weeks’ imprisonment and/or a fine, the offence does not apply to “recognised news publishers,” to any activity “in connection with anything done” under the Broadcasting Act 1990 or 1996, to multiplex licence holders, or to providers of on-demand programme services [17].


The Edelman Trust Barometer Global Report [18] places the UK bottom and third from bottom out of 28 countries for trust in media and government respectively, while the University of Oxford’s Reuters Institute for the Study of Journalism cites the UK as an example of a growing correlation between declining trust and increasing criticism of the news media [19]. Yet despite evidence of public concern over the current state of our media institutions, section 179 reaffirms a two-tiered justice system that continues to shield the media from fair scrutiny. Legal and illegal behaviour should be decided not by the status or licence of an individual, but by the ethics of their actions.


Despite supporting international agreements and initiatives aimed at tackling these challenges, such as the UN’s AI Resolution [20] and the Framework to Counter Foreign State Information Manipulation [21], without greater efforts to tackle domestic disinformation the UK risks undermining its position as a world leader in democracy and technology. Disinformation campaigns have been deemed a form of “malign foreign interference” and a criminal offence under the National Security Act 2023 [22], but as Ofcom’s analysis states, there is “no ‘generic’ example” of what this looks like, as cross-border influence operations often use a “variety of tactics.” [23] Simply recognising the issue in law is not enough to halt disinformation’s growth, particularly when evidence submitted to the Defending Democracy inquiry describes current legislation as containing “gaps and inconsistencies.” [24]


A more proactive approach to tackling disinformation is necessary if tangible results are to be achieved. Amongst other measures, further legislation is needed from the UK Government to ensure that disinformation becomes neither a key factor in shaping public opinion nor a threat to the country’s democratic responsibilities.


Digital Literacy and Big Tech Apathy:

A lack of sufficient public sector oversight has created a dependency on voluntary cooperation from technology and social media companies to provide adequate protections against the harms of misinformation and disinformation – policies such as risk assessments, content moderation, algorithm testing, and user verification schemes. However, as witnessed in the European Commission’s legal proceedings against many of the Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) liable under the Digital Services Act (DSA) [25], it unfortunately seems that platforms such as Facebook, X, TikTok, and Instagram will not implement appropriate measures unless legally required to do so.


Despite Microsoft recording a net income of nearly £17.2B in the first quarter of 2024 [26], an increase of 20% on the year prior, its Societal Resilience Fund, announced in partnership with OpenAI, dedicates less than £1.6M to targeting attempts to “deceive the voters and undermine democracy” – less than one hundredth of one percent of that quarter’s income (see the calculation below) [27]. Meanwhile, (Mis)Translating Deceit found that Facebook failed to identify 91% of wartime propaganda against Ukraine [28]. Instead of seeking solutions to issues such as this, Meta has chosen to actively restrict political content [29]. As recently as 21 May, JCNSS Chair Dame Margaret Beckett MP described the approaches of a majority of tech and social media companies towards harmful digital content with the potential to undermine UK and global democracy as “uncoordinated” and “silo-led,” continuing that there was “too little evidence… of the foresight we expected: proactively anticipating and developing transparent, independently verifiable and accountable policies to manage the unique threats in a year such as this.” [30]
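
The scale of that gap is easy to verify. A back-of-the-envelope check, using the rounded figures cited above (the calculation, not the figures, is our own):

```python
# Rough check of the fund's size relative to quarterly income, using the
# approximate GBP figures cited in the text ([26], [27]).
fund = 1.6e6          # Societal Resilience Fund commitment, approx. GBP
net_income = 17.2e9   # Microsoft net income, calendar Q1 2024, approx. GBP

share = fund / net_income * 100
print(f"Fund is roughly {share:.4f}% of one quarter's net income")  # ~0.0093%
```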


These inadequacies have consequently placed the burden of protection from disinformation largely onto the public. Yet, as a recent study in Harvard Kennedy School’s Misinformation Review found, nearly 50% of individuals show “no improvement over chance” when differentiating fact from opinion [31] – a problem that even legislation such as the DSA cannot be relied upon to resolve alone. The threat of fake news requires a holistic response, and an increased focus on digital literacy is essential to provide individuals of all ages with the capabilities to identify the cases of disinformation that state apparatuses are so far failing to detect.


Solutions are readily available. One notable example comes from the University of Cambridge, whose fifteen-minute digital literacy game was found in 2019 to reduce the perceived reliability of fake news by an average of 21% [32]. Calls to build public resilience to fake news are not new: as Polis Analysis wrote when submitting its evidence in 2021, when dealing with platforms such as WhatsApp that fall outside large swathes of the regulatory sphere, digital literacy skills are vital to protect individuals within private communication channels [33]. However, as organisations such as Full Fact have argued, “good ideas have not been matched with sufficient resources,” and despite continued calls for digital literacy skills to become “a core part of the UK’s defence against misinformation and disinformation,” their importance remains severely undervalued in the run-up to the general election [34].


As a June 2024 BBC report has shown, key demographics are being targeted and exposed to fake news on platforms such as TikTok in order to sway voting behaviour come 4 July [35]. With government legislation and social media companies currently failing to protect individuals from content intended to mislead and to influence democracy, it is vital that the run-up to the general election is accompanied by a vocal and widespread emphasis on how to detect and flag unreliable content online.


Immediate Concerns Ahead of the UK General Election:

While Polis Analysis remains committed to researching and advocating for long-term policy solutions to both misinformation and disinformation, Prime Minister Rishi Sunak’s decision to announce a snap election [36] only increases the need for immediate answers to these concerns. Polis Analysis therefore calls on all political parties to adopt the following recommendations:


  • The creation of a short-term deadline mandating Ofcom to establish its advisory committee on disinformation and misinformation

  • A reduction in the timeframe of the advisory committee’s final report from 18 to 6 months

  • The removal of media exemptions to false communication offences established in the OSA

  • Mandatory watermark-based authentication of images from public sources and “recognised news publishers” (see the sketch following this list for one possible scheme)

  • The establishment of a public fund awarded to AI startups providing immediate and longer-term solutions to disinformation, rather than a reliance on voluntary compliance by tech and social media giants


  • The creation of a free-to-access public database containing resources focused on key skills to verify online information

  • The implementation of digital literacy classes in education from Key Stage 2 onwards, making sure children understand the potential dangers of disinformation from an early age
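
On the watermark recommendation above: nothing here prescribes a particular design, but as a minimal sketch, the snippet below shows one way a publisher could cryptographically bind an image to its registered identity so that any subsequent edit invalidates the tag. The registry, the publisher identifier, and the HMAC construction are illustrative assumptions only; a production scheme would use asymmetric signatures (or an open provenance standard) so that verification keys could be published openly, and HMAC is used here purely to keep the sketch within Python’s standard library.

```python
import hashlib
import hmac

# Hypothetical registry mapping publisher IDs to signing keys. In a real
# deployment this would hold *public* verification keys for an asymmetric
# signature scheme rather than shared secrets.
PUBLISHER_KEYS = {"example-news": b"registered-signing-key"}

def watermark(image_bytes: bytes, publisher_id: str) -> str:
    """Produce a tag binding the raw image bytes to a registered publisher."""
    key = PUBLISHER_KEYS[publisher_id]
    return hmac.new(key, image_bytes, hashlib.sha256).hexdigest()

def verify(image_bytes: bytes, publisher_id: str, tag: str) -> bool:
    """Recompute the tag; any alteration to the image invalidates it."""
    key = PUBLISHER_KEYS.get(publisher_id)
    if key is None:
        return False  # unknown publisher: provenance cannot be established
    expected = hmac.new(key, image_bytes, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

if __name__ == "__main__":
    image = b"\x89PNG raw image bytes"
    tag = watermark(image, "example-news")
    print(verify(image, "example-news", tag))         # True: image untouched
    print(verify(image + b"x", "example-news", tag))  # False: image edited
```

The useful property is that verification fails on any altered or unregistered image, giving platforms and readers a mechanical way to flag content whose provenance cannot be established.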


References:

[1] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8853081/

[2] https://www.homesecurityheroes.com/state-of-deepfakes/

[3] https://onfido.com/landing/identity-fraud-report/

[4] https://books.google.com/ngrams/graph?content=Fake+News&year_start=1800&year_end=2019&corpus=en-2019&smoothing=3&case_insensitive=true

[5] https://www.polisanalysis.com/fake-news-observatory/polis-survey-on-fake-news-

[6] https://issuu.com/russiaprogram/docs/russia_program_journal_no.3

[7] https://issuu.com/russiaprogram/docs/russia_program_journal_no.3

[8] https://www.nato.int/cps/en/natohq/115204.htm 

[9] https://www.polisanalysis.com/fake-news-observatory/our-message-to-nato%3A-take-disinformation-seriously

[10] https://www.weforum.org/publications/global-risks-report-2024/

[11] https://www.chathamhouse.org/events/all/research-event/developing-new-responses-disinformation (attended by Polis representative)

[12] https://www.legislation.gov.uk/ukpga/2023/50/enacted

[13] https://committees.parliament.uk/writtenevidence/128792/pdf/

[14] https://www.ofcom.org.uk/__data/assets/pdf_file/0020/283025/adults-media-use-and-attitudes-report-2024.pdf

[15] https://www.ofcom.org.uk/__data/assets/pdf_file/0023/285008/advisory-committee-misinformation-disinformation-establishment.pdf

[16] https://committees.parliament.uk/publications/45032/documents/223340/default/

[17] https://www.legislation.gov.uk/ukpga/2023/50/enacted

[18] https://www.edelman.com/sites/g/files/aatuss191/files/2024-02/2024%20Edelman%20Trust%20Barometer%20Global%20Report_FINAL.pdf

[19] https://reutersinstitute.politics.ox.ac.uk/digital-news-report/2023/dnr-executive-summary

[20] https://documents.un.org/doc/undoc/ltd/n24/065/92/pdf/n2406592.pdf?token=wLJtPkFGxRjwyhdgEj&fe=true

[21] https://www.gov.uk/government/news/us-uk-canada-joint-statement-foreign-information-manipulation

[22] https://www.legislation.gov.uk/ukpga/2023/32/pdfs/ukpga_20230032_en.pdf 

[23] https://committees.parliament.uk/writtenevidence/128792/pdf/

[24] https://committees.parliament.uk/writtenevidence/128453/pdf/

[25] https://www.polisanalysis.com/fake-news-observatory/polis-analysis-is-pleased-to-hear-that-the-european-commission-(ec)-has-opened-formal-proceedings-against-meta-under-the-digital-services-act-(dsa)

[26] https://www.microsoft.com/en-us/investor/earnings/fy-2024-q3/press-release-webcast

[27] https://blogs.microsoft.com/on-the-issues/2024/05/07/societal-resilience-fund-open-ai/

[28] https://doi.org/10.37016/mr-2020-136

[29] https://transparency.meta.com/en-gb/features/approach-to-political-content/

[30] https://committees.parliament.uk/work/8131/defending-democracy/news/201621/big-tech-and-social-media-companies-falling-short-ahead-of-uk-general-election/

[31] https://misinforeview.hks.harvard.edu/article/fact-opinion-differentiation/

[32] https://www.cam.ac.uk/research/news/fake-news-vaccine-works-pre-bunk-game-reduces-susceptibility-to-disinformation 

[33] https://committees.parliament.uk/writtenevidence/39266/pdf/

[34] https://fullfact.org/media/uploads/ff2024/18042024-full_fact_report_corrected.pdf

[35] https://www.bbc.co.uk/news/articles/c1ww6vz1l81o

[36] https://www.instituteforgovernment.org.uk/explainer/uk-general-election-july-2024




