
Building Resilience Against Election Influence Operations: Preparing for the European Elections in 2024 and Beyond

Case Studies
Spring | 2024
Authors
Daria Azariev North, Senior Program Manager
David Levine, Senior Elections Integrity Fellow, Alliance for Securing Democracy (ASD)
Krystyna Sikora, Research Assistant, ASD
Nikoleta Diossy, Senior Program Officer
France

Bolster the government’s ability to identify and counter foreign influence operations

France’s experience illustrates ways to counter election disinformation. In the country’s 2017 presidential election, frontrunner Emmanuel Macron, a pro-European centrist, became the target of a coordinated Russian influence campaign to undermine his candidacy against the right-wing Marine Le Pen, Moscow’s preferred candidate. Months before the election, a fake website posing as a reputable Belgian news agency (later traced to a Russian troll factory) reported that Saudi Arabia was financing his campaign. The post generated more than 10,000 likes, shares, and comments on Facebook. Macron was targeted again when hackers affiliated with Russian military intelligence stole internal emails and documents from his campaign team and released them online, along with fraudulent emails, two days before the second round of voting, during the legally mandated 44-hour pre-election media blackout. The so-called “Macron leaks” did not significantly affect the outcome of the election, in which Macron received 66% of the vote. France nonetheless learned from this experience and took proactive steps to protect future elections from information manipulation.

In preparation for its next major election cycle, the 2022 presidential and general elections, France developed a unit to detect, monitor, and counter foreign information operations intended to undermine the country’s stability. Before launching the unit, called Viginum, the government created safeguards to protect privacy rights and freedom of speech. These include regulations on how Viginum collects and uses social media data. Viginum’s research is restricted to open-source data; it is prohibited from interacting with other users, entering private groups, or creating avatars. The inter-ministerial Ethics and Scientific Committee, established within the General Secretariat for National Defense and Security, also supervises all of Viginum’s work.

Before Viginum began working on the 2022 elections, it conducted a landscape review of foreign digital interference in other elections. Viginum representatives met with France’s three election oversight bodies, the National Commission for the Control of the Electoral Campaign, the Constitutional Council, and Arcom, to better understand how elections are administered and to kickstart collaboration. They also consulted German agencies about threats to Germany’s 2021 general elections and tested Viginum’s monitoring capabilities by tracking French-language public debate about the German elections, preparing the unit for issues that malign actors might exploit in the information environment of the French elections.

Viginum began its monitoring operations for the 2022 elections in November 2021. Throughout the campaign cycle, Viginum staff worked closely with Objectif Désinfox, a fact-checking initiative of Agence France-Presse and Google, to characterize flagged content as false or misleading. Viginum also established direct lines of communication with the social media platforms so it could ask them about influence campaigns, identify situations that required further discussion, and request that they flag suspicious incidents. Starting in February 2022, Viginum regularly sent the three election oversight bodies reports describing the campaigns it had detected during the election period, and it scheduled a hearing with them in April to go over specific information manipulation attempts.

Viginum’s communication with different French institutions ensured that they were able to take robust measures to counter information manipulation during the election. During the election cycle, Viginum identified 60 inauthentic occurrences on digital platforms. Of those, five met the definition of foreign interference, including a campaign, which gained traction among domestic right-wing accounts, suggesting that the French government was using voting machines from the Canadian-US firm Dominion Voting Systems to skew election results in favor of Macron. Viginum alerted the Ministry of the Interior, which issued a public refutation of the claim. Despite a number of other false claims targeting candidates and the voting process, largely amplified by domestic social media accounts, none appears to have raised significant doubts about the electoral outcome.

How can this best practice work elsewhere? A government unit that exposes and counters foreign influence operations could be an invaluable tool for protecting a country’s elections and democracy more broadly. However, creating units to monitor the public information space may prompt legitimate concerns about government overreach and potential breaches of individuals’ freedom of speech and privacy. Therefore, governments should give such units clear mandates and ensure they can work within the relevant legal framework and avoid infringing on people’s civil liberties.

To mitigate those concerns, the French unit and similar ones established in other countries (including Sweden’s Psychological Defense Agency and the United States’ Global Engagement Center) have created strict guidelines on what the units will monitor. They focus only on foreign influence and do not monitor the activities of domestic entities. France also created clear regulatory safeguards. These ensure that Viginum respects citizens’ digital rights and personal privacy, including how it collects and uses personal data. A well-qualified ethical and scientific committee supervises its work.

Although not all governments have the resources to create units to identify and counter foreign influence operations, they can still pursue similar efforts via other means. For example, many non-governmental organizations (NGOs), social media platforms, news outlets, and other key stakeholders have initiatives to monitor influence operations or possess the tools to do so. Supporting and building on the work of those partners by funding or collaborating on projects, especially before elections, could also be beneficial. Doing so could generate feedback on information threats similar to that of a dedicated government unit at a fraction of the cost. Such cross-sectoral collaboration can also be a valuable strategy for defending the electoral information environment. 
 

Sweden

Adopt a whole-of-society framework to bolster resilience to withstand information manipulation

Like France, Sweden is a global leader in developing best practices to counter election disinformation. In response to Russia’s interference in the 2016 US presidential elections and other European races, notably the 2017 German federal and French presidential elections, Sweden, a frequent target of Kremlin-sponsored disinformation, prioritized its information influence defenses. Rather than attempting to halt the creation and spread of disinformation, Sweden adopted a whole-of-society approach to build the resilience of institutions and society overall to withstand influence activities. That approach improved the country’s security posture against both foreign and domestic disinformation actors.
 
To safeguard Sweden’s 2018 general election against disinformation campaigns, the country’s Civil Contingencies Agency (MSB) provided election authorities with knowledge, education, and tools to understand foreign influence threats, vulnerabilities in the electoral system, and potential response methods. This included developing a comprehensive training package for election officials to counter disinformation and election security threats. The package was shared with county administrative boards across the country. The MSB also piloted an in-person training for election authorities from all municipalities of Västra Götaland county. With mandates to counter homegrown disinformation, the election authorities drew on the training, knowledge, and support provided by the MSB to counter foreign and domestic purveyors of false information. These readiness activities, which benefited approximately 14,000 civil servants and election officials, helped ensure that the 2018 election ran smoothly, notwithstanding a cyberattack on the Swedish Election Authority that generated a flood of homegrown political mis- and disinformation.

Building on this 2018 success, Sweden doubled down on its whole-of-society efforts in preparation for the 2022 general election by launching the Psychological Defense Agency (MPF). Like France’s Viginum, the MPF leads Sweden’s monitoring of foreign influence efforts. However, the agency has an equally important second role, directing initiatives to strengthen Swedish society’s overall resilience to information manipulation. 

Amid increased tension with Moscow following Sweden’s bid to join NATO, the MPF was on high alert for potential interference in the 2022 election. To better protect the election from potential information threats, the MPF set out to increase public capacity to identify mis- and disinformation. Five months before the election, it launched the “Don’t be fooled” (bli inte lurad) education campaign to raise awareness of threats from foreign influence operations and encourage Swedes to think about the sources and publishers of information before sharing it online. The campaign website provides educational videos, one-pagers, and even a free online course on how to resist foreign influence. The agency plans to expand its public education programs by teaming with a university to create a master’s program in information resilience.

The Swedish government directed additional efforts toward other key stakeholders. For example, the MSB published its handbook for communicators to help journalists and other media actors better identify and respond to false information when assessing information from both foreign and domestic sources. Likewise, in the months leading up to the 2022 election, MPF staff educated Swedish political parties about the possibility that they might become targets of disinformation campaigns and advised them on guarding against such campaigns.

Ultimately, Sweden’s 2022 election did not experience the anticipated degree of foreign interference. Regardless, the country’s success in bolstering resilience against foreign disinformation during elections is reflected both in its high voter turnout (84%) and in general satisfaction with the country’s democratic institutions (79%).

How can this best practice work elsewhere? With influence campaigns becoming more sophisticated and widespread, it is more important than ever that critical election stakeholders, from election management bodies to the media and voters, can identify and defend themselves against information threats. Adopting a whole-of-society framework for countering information manipulation, as Sweden did, particularly during elections, is key to building resilience by fortifying defenses across all sectors of society, strengthening partnerships for sharing information and tools, and fostering trust within the community.

Governments that adopt such a whole-of-society framework should begin with a landscape review of the vulnerabilities that different stakeholders and sectors of society face. Doing so may kickstart collaboration between government and non-government partners across various sectors, help identify priority areas, and assess the feasibility of implementing plans.

In addition to trying to reach various segments of society, governments that adopt a whole-of-society approach should consider diverse proactive measures. A comprehensive framework should incorporate activities that address communication, resilience, disruption, and regulation together rather than covering each theme separately. Such an integrated framework is exemplified by the MPF’s two primary responsibilities: to disrupt foreign influence campaigns and to create initiatives that build resilience against them. Since Sweden adopted its whole-of-society framework, countries that have followed suit include Estonia, Finland, and Latvia.

Estonia

Invest in fact-based journalism and other disinformation resilience programs for segments of society that are most at risk of being targeted by influence campaigns 

Estonia is another country in which to look for best practices on countering election disinformation. Despite having an Internet voting system some security experts view as controversial and being a regular target of Russian cyber and influence operations since regaining its independence in 1991, Estonia has established itself as a digital powerhouse and transitioned to a well-functioning electoral democracy. Estonia provides most public services online, continuously refines and upgrades them, and makes consistent efforts to bolster resilience against information manipulation among segments of Estonian society that are most at risk. Those efforts often focus on Estonia’s Russian-speaking population.

Russian speakers account for 27% of Estonia’s population, many of whom are ethnic Russians who settled in the country during the time of the Soviet Union. Until 2023, the majority of Estonia’s Russian speakers obtained news from Russian-language stations controlled by Moscow. In part for this reason, Russian state media routinely targets the country’s Russian-speaking population with influence campaigns designed to widen fissures with the Estonian majority. The Bronze Soldier incident is a prime example. In 2007, Estonia became the first victim of the Kremlin’s modern influence tactics after a Soviet statue, a flashpoint for competing nationalist narratives, was removed from the center of Estonia’s capital, prompting multi-day riots. 

Understanding this challenge, Estonia sought to offer alternatives to Russian media and engage with its Russian-speaking minority. In 2015, Estonia’s public broadcaster launched a Russian-language television channel, ETV+, alongside a studio in Narva, where 95% of residents speak Russian and over 30% hold Russian passports. The government also offered Estonian language and cultural courses to reduce susceptibility to Russian state narratives.

Estonia’s efforts to bolster resilience against pro-Kremlin disinformation in its Russian-speaking population became more urgent after Russia’s invasion of Ukraine, a little over a year before the country’s March 2023 parliamentary elections. A staunch supporter of Ukraine, Estonia provided Kyiv nearly €500 million in defense assistance (1.4% of its GDP) and took in over 62,000 Ukrainian refugees. Unsurprisingly, the country experienced a surge in disinformation about the war, mainly in the form of Russian propaganda, in the run-up to the parliamentary elections. These narratives largely sought to create tension between the country’s Russian- and Estonian-speaking communities by portraying Ukrainian refugees as “criminals,” suggesting political links to Russia, and falsely reporting that Estonia was supplying arms to Ukraine.

To lessen the risk of these narratives influencing those most susceptible to them, Estonia coordinated closely with its media sector and invested in fact-based journalism for its Russian-speaking population. The Ministry of Culture distributed €1 million in funding to private media to improve Russian-language media coverage. With its allocation, Postimees, one of Estonia’s most prominent newspapers, published a 24-page weekly Russian-language print edition, staffed by 25 Russian-speaking journalists hired to cover domestic and international news. Likewise, ETV+ invested in a daily evening program, a weekly foreign affairs program, and a Russian-language version of the public broadcaster’s video-on-demand platform. Other Estonian media outlets reported similar projects and staff hires.

While engaging with a population living in a separate information ecosystem is not easy, Estonia’s strategy seems to be paying off. Despite widespread disinformation about the war in Ukraine, Russian speakers’ trust in Estonian media has increased, and the share of non-ethnic Estonians who consider Russian media “important sources of information” has dropped from 30% to 10% since Estonia took these steps following the invasion of Ukraine, according to a poll commissioned by the government.

How can this best practice work elsewhere? A pluralistic, free, and truthful media landscape is vital to a vibrant democracy, even more so when the information environment is affected by disinformation and influence campaigns intended to undermine democratic institutions and processes. While no one is completely immune to information manipulation, some populations, particularly members of minority groups, may be at greater risk of being targeted to fuel discontent. Marginalized communities are often primary targets of hostile influence campaigns by the Kremlin, other authoritarian governments, and autocratic domestic actors to exacerbate societal cleavages and discord. Many actors’ divisive tactics across Europe and elsewhere have intersectional impacts on elections and democratic processes. Continually building more inclusive societies and information environments can be a powerful response to such malign influence operations.

While Estonia prioritizes fact-based journalism for its Russian-speaking population, there is an array of other ways that governments can bolster resilience to influence campaigns among those most at risk. These include supporting media literacy education, furthering partnerships or cooperation with respected organizations representing minority interests, and developing voter education programs aimed at boosting trust in election systems. Regardless of the initiative, the most critical component of programs that target specific groups is that they empower communities and make them feel accepted and heard, rather than isolated.

Bosnia and Herzegovina

Adopt a crisis communications strategy to preserve trust in the face of false and misleading information campaigns 

Bosnia and Herzegovina offers valuable lessons on countering disinformation and the importance of building a resilient and cohesive society to resist malign influence. The country’s experiences combating disinformation throughout the electoral cycle are tied to a complex political landscape shaped by a fragile tripartite power-sharing arrangement. Bosnia and Herzegovina’s internal administrative structure was formed following a brutal war of aggression in the 1990s. The peace deal struck in 1995 in Dayton, Ohio, divided the country into two entities: the Serb-dominated Republika Srpska and the Federation of Bosnia and Herzegovina, where Bosniaks and Croats form the majority. A tenuous balance among the three constituent peoples impedes quick responses to emerging threats, increasing susceptibility to the geopolitical interests of Russia and the PRC, in addition to neighboring Croatia and Serbia.
 
Entrenched, divisive rhetoric and narratives have created an ongoing political crisis in the country, including in the run-up to Bosnia and Herzegovina’s 2022 general election. Politicians’ failure to agree on electoral reform was followed by disinformation alleging that the disagreement meant there was no legal basis for elections. Then, government actors obstructed adoption of the election budget, which must be secured within 15 days of the announcement of the election. As a result, the 2022 election was marred by deep political and social polarization, including significant disinformation campaigns, hate speech, and ethnic tensions. Russia-affiliated domestic political actors also spread malign messages. The election period, described by the Central Election Commission (CEC) as “the most turbulent so far,” was punctuated by disinformation campaigns across social media platforms that inflamed social division and undermined trust in the election and the CEC itself. Ahead of the election, Russia leveraged support for separatism and openly backed the secessionist pro-Russian president of Republika Srpska, Milorad Dodik. NATO Secretary General Jens Stoltenberg remarked, “We are concerned by the secessionist and divisive rhetoric as well as malign foreign interference, including Russia.” These occurrences were further exacerbated by challenges that many election management bodies (EMBs) face across the region, such as limited institutional capacity and budgetary constraints. Additionally, a series of direct online attacks on several female candidates and members of the CEC, mainly on social media, underscored the gendered dimension of online hate speech and disinformation and their negative impact on women’s participation in political processes and inclusive elections.

The CEC responded to this challenging landscape by building institutional capacity and developing a proactive crisis communication strategy to anticipate and counter the highest-risk disinformation narratives ahead of the election. The CEC strengthened its capacity in part by incorporating the Crisis Communications and Combating Disinformation Playbook, developed by the International Foundation for Electoral Systems (IFES) in partnership with the Brunswick Group. The lessons from the playbook were reinforced by tailored training on social media engagement for CEC staff. The playbook addresses gaps identified by a regional working group for EMBs on disinformation, electoral integrity, and foreign influence convened by IFES. The working group facilitated the development of relationships with technology firms and provided a platform for countries to learn from one another about shared challenges, good practices, and useful tools. The playbook outlines crisis communications best practices, such as building a sustainable rapid response process. This includes establishing an internal CEC disinformation response team, using early warning tools, conducting stakeholder outreach and education, activating networks for escalation protocols, and creating a rapid response checklist.
 
To prepare organizationally and build trust with voters, the CEC conducted an initial vulnerability assessment in advance of the 2022 election. The assessment identified potential high-risk false narratives that the CEC could expect and for which it could begin to formulate mitigation tactics. A risk matrix enabled the CEC to assess the threat level each narrative could pose to the integrity of the election and to prepare responses that would reduce susceptibility to them. A scenario planning exercise identified potential false narratives that could undermine the election and sow discord among ethnic groups. For example, one scenario accused the CEC of delaying the delivery of critical election materials, such as ballots, to a region populated largely by an ethnic group connected to the opposition. The scenario challenged the validity of the CEC’s explanation about road closures due to extreme weather, using old photos of the roads in perfect condition. To counter this potential narrative, the CEC drafted interim statements, social media posts, and questions and answers targeting different audiences. To further support its response, the CEC developed a list of potential third-party spokespersons who could confirm the condition of the roads and refute the false claims.

The CEC’s proactive preparation for the 2022 election helped ensure effective responses to false narratives across media, proactive communication with voters through social media and traditional media briefings, and the broadcasting of CEC sessions before, during, and after the election. The Organization for Security and Co-operation in Europe/Office for Democratic Institutions and Human Rights (OSCE/ODIHR) Election Observation Mission noted that the CEC “administered the election efficiently, transparently and within the legal deadlines” and “… enjoyed the confidence of most stakeholders.”

How can these practices work elsewhere? As the ultimate authorities on elections, EMBs are the official resources for timely and accurate election information. This responsibility has become even more critical due to a rise in election-related disinformation, malign influence, and increasing polarization of the media environment. To mitigate these threats to electoral integrity, EMBs and other electoral stakeholders must act proactively and adopt crisis communications strategies to help them respond to false information and harmful narratives intended to undermine elections, as Bosnia and Herzegovina did in 2022. At the end of the day, the reputation of EMBs and other stakeholders hinges to a great extent on their ability to promote credibility and build trust in the electoral process. How well they do so can influence the outcome of an election, and with it a country’s ability to preserve democracy.

In building their institutional crisis communication plans, EMBs should consider several key steps. These include identifying target audiences (for example, marginalized groups such as ethnic minorities, women, young people, and persons with disabilities) and their information needs, and developing communication plans that demonstrate core values to each audience to build trust systematically. Proactive, concise, consistent, and direct communication through the EMB’s website or social media can decrease the risk of information being misinterpreted or taken out of context; it also enables EMBs to communicate in a timely manner, which is essential when responding to harmful narratives. Establishing relationships and maintaining consistent communication with voters throughout the electoral cycle increases the likelihood that the electorate receives messages with trust, particularly during periods of crisis. Seemingly simple skills, such as using social media to communicate with the electorate, are critical for EMBs to formulate timely, effective messaging that aligns with their values and target audiences. An effective crisis communication strategy also recognizes that no response is itself a response, although perhaps not a desirable one.

EMBs must also be aware of their election systems’ main vulnerabilities. As Bosnia and Herzegovina demonstrated, conducting self-assessments of institutional weaknesses and identifying potential risk scenarios can enable EMBs to get ahead of possible narratives, develop appropriate responses, and proactively communicate important messages. EMBs can apply the assessment to the country context to isolate common disinformation themes and anticipate scenarios that might be used to discredit public institutions and sow doubt or mistrust among citizens. Early self-assessments and crisis scenario planning build institutional confidence and preparedness that can decrease the time needed to respond to the appearance of an intentionally damaging narrative. That in turn reduces the likelihood of disinformation reaching and influencing large numbers of people.

EMBs should also develop risk matrices or escalation protocols to assess the severity of potential harmful narratives and determine the appropriate level of response. To do so, EMBs should focus predominantly on three areas: a narrative’s potential impact, its credibility, and how quickly it spreads. The severity of the narrative should determine the response and communication strategy.
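As a concrete illustration, the sketch below shows one way such a risk matrix could be encoded in software. It is a minimal example, not drawn from the IFES playbook: the 1-5 scales, the multiplicative severity score, and the tier cut-offs are all illustrative assumptions that an EMB would calibrate to its own context.

```python
from dataclasses import dataclass

@dataclass
class NarrativeRisk:
    """One potentially harmful narrative, scored on the three areas above."""
    name: str
    impact: int       # potential harm to electoral integrity, 1 (low) to 5 (high)
    credibility: int  # how believable it is to its target audience, 1-5
    spread: int       # how quickly it is propagating, 1-5

    def severity(self) -> int:
        # Multiplying the scores (range 1-125) makes a narrative that is
        # high on all three dimensions stand out sharply.
        return self.impact * self.credibility * self.spread

def response_tier(risk: NarrativeRisk) -> str:
    """Map severity to an escalation tier; cut-offs are hypothetical."""
    s = risk.severity()
    if s >= 64:   # high on every dimension: full rapid-response activation
        return "tier 3: immediate public rebuttal via prepared statements"
    if s >= 27:   # moderate: ready holding statements, brief spokespersons
        return "tier 2: prepare responses and monitor closely"
    return "tier 1: log and keep monitoring"

if __name__ == "__main__":
    ballots = NarrativeRisk("delayed ballot delivery", impact=5, credibility=3, spread=4)
    print(ballots.severity(), "->", response_tier(ballots))  # 60 -> tier 2
```

A simple scheme like this is less important for its arithmetic than for forcing the response team to score each narrative consistently and to agree in advance on what level of severity triggers which response.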

Ukraine

Leverage robust, cross-sectoral coordination, particularly on emerging technology, to counter malign influence 
 

As a preeminent target of the Kremlin’s malign interference for decades, Ukraine offers several hard-won lessons for defending the electoral information environment, including from periods when it has been unable to hold elections at ordinary intervals. Since the Revolution of Dignity in 2013-2014 and Russia’s illegal annexation of Crimea in 2014, Ukraine has been at war with Russia on two major fronts: an escalatory kinetic conflict alongside a persistent and increasingly complex battle fought predominantly in the information space. Over the past decade, Ukraine has developed a comprehensive framework to defend its electoral processes, and its society more broadly, from the impacts of Kremlin interference and malign influence efforts. While the country has achieved many successes in building information resilience in recent years, a core pillar of that resilience is deep cross-sectoral cooperation, particularly on emerging technology such as AI and machine learning for detection and response, among civil society, government institutions, media, academia, and the private sector.
 
Heading into Ukraine’s 2020 local elections, the country’s information environment was tainted by an upsurge in Russian disinformation. The Kremlin’s tactics sought to destabilize the country with disinformation about issues such as the COVID-19 pandemic, sowing fear about public safety through mainstream media, including prominent TV channels, and social media. The narratives were crafted to undermine the legitimacy of the Ukrainian state and its institutions, weaken ties between Ukraine and its partners in the West, and promote a positive image of the Russian government. Russia, however, underestimated the long-term preparedness building and cooperation of various Ukrainian sectors.

To counter these narratives, Ukraine created deep networks of cooperation across various sectors of society for sharing lessons learned and highlighting best practices for countering false narratives. Ukraine’s civil society has been instrumental in advancing the country’s narrative response capabilities, increasingly providing timely, measured, factual responses to harmful narratives around elections. This, in turn, has expanded the capacity of independent media to accurately cover these narratives. Organizations such as Detektor Media and StopFake, which monitored election campaigns ahead of the 2020 election, have debunked numerous false Russian narratives since 2014. Those efforts have helped Ukrainians distinguish between fact and fiction.

One successful example of cross-sectoral coordination is HelpSMI, an e-platform created specifically for Ukraine that connects journalists with experts from various fields, civil society, academia, and local businesses. The platform supports democracy building, promotes freedom of speech, and counters malign narratives by gathering trusted sources and reliable expert information for journalists and media outlets in one place. HelpSMI helped uncover a range of issues, including bribery by a local politician one year ahead of Ukraine’s 2020 elections and illicit funding circulating during the 2019 parliamentary and presidential elections, and it helped debunk false narratives about Stepan Bandera, which Russia often uses to paint Ukrainians as supporters of the Nazi regime.

The Ukrainian government also understood the need for better cross-sectoral cooperation with other parts of society to successfully counter malign narratives. Ahead of the 2020 local elections, President Volodymyr Zelenskyy announced to the 75th United Nations General Assembly that Ukraine was ready to set up “the headquarters of the International Office for Countering Disinformation and Propaganda in Kyiv. There is no longer the concept of someone else’s war. Our planet is no longer so big ... when disinformation and fake news can influence global markets, stock exchanges, and even the electoral process.” In March 2021, the Ukrainian government realized these plans by establishing the Center for Countering Disinformation (CCD) within the National Security and Defense Council. The CCD is dedicated to debunking manipulative and misleading Russian narratives, including across social media platforms. It develops reports, articles, and refutations of prominent disinformation narratives that further strengthen public resilience to disinformation. The CCD also fosters cross-sectoral cooperation with civil society organizations such as StopFake.

Shortly after the CCD was established, the Center for Strategic Communication was created under Ukraine’s Ministry of Culture and Information Policy as “one of the mechanisms for countering disinformation, by joint efforts of the state and civil society.” The center focuses on cross-sectoral cooperation with civil society to amplify civil society’s work with the general public, conducting joint information campaigns to build public resilience to malign narratives, and facilitating dialogue between the state and civil society organizations to develop regulatory frameworks. In close cooperation with civil society, the center created the debunking page “Spravdi,” which researches, analyzes, and reports on new false narratives being spread in Ukraine and abroad. In 2021, the center, in cooperation with more than ten non-governmental organizations from Ukraine and abroad, analyzed how the Kremlin spread malign narratives about COVID-19 in Ukraine during the 2020 election year. Spravdi provided the Ukrainian public with verified information that helped it navigate a complicated information space, one that included over 250,000 fake messages about COVID-19 across Ukrainian social networks in 2020.

Ukraine’s media monitoring has helped increase its public’s media literacy and enabled Ukrainians to make more informed choices during the electoral process. However, it takes significant manpower and time to sort through the volume of news online. Additionally, civil society organizations often have financial constraints that make it difficult, if not impossible, to reach broad swaths of the public. As a result, some organizations started using AI to increase their capacity as well as enhance their cooperation with the government and private sector. After Russia’s 2022 invasion, Ukraine leveraged emerging technologies to further expand cross-sectoral cooperation between civil society, government, and the private sector. Government agencies and ministries built relationships with start-ups and private companies, using the latter’s software to swiftly detect and counter Kremlin disinformation on a country-wide level. Start-ups such as Osavul and Mantis Analytics, created in response to the invasion, began to employ large language models and natural language processing to counter the Kremlin’s disinformation narratives about Ukraine. Adopting this new technology helped Ukrainian agencies quickly identify potentially harmful narratives and disseminate factual information before those narratives gained traction.
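Neither company’s pipeline is public, but a common building block of such systems is clustering semantically similar posts so analysts can spot an emerging narrative early. The sketch below is a generic illustration of that idea, assuming the sentence-transformers and scikit-learn libraries; the model choice, sample posts, and distance threshold are all illustrative, not drawn from any of the organizations named above.

```python
# A generic sketch of narrative detection via embedding and clustering.
# This is NOT Osavul's or Mantis Analytics' actual pipeline (neither is
# public); the model, sample posts, and thresholds are illustrative.
from sentence_transformers import SentenceTransformer
from sklearn.cluster import DBSCAN

posts = [
    "Ballots are being withheld on purpose from opposition regions",
    "Election officials are hiding ballot trucks from observers",
    "Turnout in the capital reached a record high this morning",
]

# A multilingual encoder would matter in practice; this small English
# model keeps the example lightweight.
model = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = model.encode(posts, normalize_embeddings=True)

# Group posts whose embeddings are close in cosine distance; posts that
# do not join any cluster are labeled -1 (noise).
labels = DBSCAN(eps=0.35, min_samples=2, metric="cosine").fit_predict(embeddings)

for post, label in zip(posts, labels):
    tag = f"candidate narrative #{label}" if label != -1 else "no cluster"
    print(f"[{tag}] {post}")
```

In a production setting, clusters flagged this way would feed human review and the kind of risk scoring and rapid-response workflow described in the Bosnia and Herzegovina case study above.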

How can these practices be used elsewhere? Ukraine’s approach to fostering robust cross-sectoral cooperation among civil society, government institutions, media, academia, and the private sector demonstrates how countries can build resilient and holistic frameworks to combat disinformation. As the National Endowment for Democracy’s report notes, “civil society organizations should leverage common values and diverse skill sets to form cooperative networks that have the sophistication and speed to combat” the scale of information threats to electoral processes. Creating synergies among various sectors of society can lead to a more informed electorate that trusts public administration and is adept at recognizing malign narratives. However, even the best-staffed organizations across sectors may have limited capacity to monitor emergent disinformation narratives across the global media ecosystem. Ukraine offers valuable lessons for other countries on how a proactive approach to leveraging technology for good can also help build capacity across various sectors. Countries across the region can draw inspiration from the Ukrainian case study, using new technology and machine learning to identify harmful narratives faster and strengthen cross-sectoral cooperation. Just as AI and other emerging technologies offer malign actors tools to undermine elections, with proper safeguards they can also offer EMBs and other critical election stakeholders opportunities to better defend elections.

Although Ukraine was unable to use these technological advances in the postponed 2024 presidential and parliamentary elections, its ability to counter malign Russian narratives has enabled the country to better protect its democracy. The war in Ukraine catalyzed the country’s governing institutions to further modernize and fortify themselves, reducing vulnerability to Russia’s malign influence. Ukraine’s burgeoning, multi-faceted, integrated response can be described as “total democratic resilience.” Through its comprehensive and multidisciplinary approach, Ukraine adopted a broader framework for information resilience than exists in many European countries, including EU member states. Many countries can learn from this example in defending the integrity of their elections.