Digital trust and cybersecurity | March 14, 2024

Securing the 2024 Elections: Cyber Risks and Protective Measures

By Ketevan Wehrsig
2024 is a landmark election year: nearly half of the world's population will be eligible to vote. In today's complex environment, ensuring that elections are secure and free is no longer the sole responsibility of governments, electoral bodies, and policymakers.

With the advent of the digital age, traditional methods of campaigning are no longer sufficient, and the evolution of the information space has brought new challenges and vulnerabilities to the integrity and security of elections. The emergence of generative AI and the widespread use of digital platforms have exacerbated these problems, making it critical to address cyber threats to this year's elections.

Primary risks and challenges

The European Union Agency for Cybersecurity's (ENISA) 2019 report "Election cybersecurity: challenges and opportunities" highlights several cyber threats that can affect the electoral process. Public political campaigns are considered especially at risk, as online disinformation and deepfakes can manipulate public opinion and influence voters' decisions.

The digital and information landscape around elections is constantly evolving, and this evolution regularly creates new obstacles. The development of generative AI has added a further layer of complexity: it can produce highly convincing deepfakes, making it difficult to distinguish what is real from what is not. This was starkly demonstrated during the 2023 parliamentary elections in Slovakia, when an AI-generated audio recording was posted on Facebook just two days before the vote. In the recording, a party leader and a journalist appeared to discuss how to rig the election by buying votes. Although fact-checkers confirmed that the audio was fake, the post was difficult to debunk widely because Slovak law requires media and politicians to remain silent during the 48 hours before elections. The episode highlights the vulnerability of the information space and underscores how critical timing is in responding to such incidents.

The universality of the digital space and the widespread use of digital platforms pose further significant challenges. While new regulatory frameworks are being developed, existing regulations struggle to keep up with the rapid pace of technological change. Moreover, the borderless nature of the Internet makes legislation difficult to enforce effectively, and current EU legislation lacks the power to hold platforms fully accountable for their actions, both within and outside the European Union.

Progress amid challenges

Despite these challenges, there have been positive developments in recent years, and more authorities than ever are recognizing the importance of these issues. In fact, the World Economic Forum's Global Risks Report 2024 identified misinformation and disinformation as the most severe short-term risk.

There have also been significant developments in the European Union's regulatory framework, such as the Digital Services Act, which sets rules for online platforms, holding them accountable for disinformation and helping to ensure a safer cyberspace. The recently adopted AI Act is the world's first comprehensive law regulating artificial intelligence and contains transparency requirements designed to prevent the proliferation of harmful content. In addition, the European Parliament has established a special committee on foreign interference and disinformation (INGE) to prioritize the integrity of the information ecosystem. These and other efforts are crucial steps toward better preparedness for cyber threats during elections.

Possible measures to mitigate cyber risks in elections

To ensure the integrity and safety of elections, it is essential that different stakeholders, including government agencies, platforms, and NGOs, work together. Policy recommendations and guidelines can provide a framework for countering disinformation. The top ten recommendations for "Protecting democratic elections through safeguarding information integrity", published jointly by International IDEA, the Forum on Information & Democracy, and Democracy Reporting International in 2024, emphasize that platforms and states should prioritize trustworthiness and information plurality during elections. They also stress the need to reduce the impact of disinformation and misinformation and to increase the accountability of influential actors, such as journalists and influencers.

To address this global phenomenon effectively, key measures should include enhanced cooperation between platforms, state institutions, election management bodies, the media, and civil society; learning from one another will be crucial in this regard. In addition, AI tools that identify and analyze disinformation can help mitigate its harm in a time-sensitive manner, as sketched below. Efforts to improve the skills and data literacy of journalists are also essential for combating misinformation.
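To make the idea of AI-assisted disinformation detection more concrete, the following is a minimal, purely illustrative Python sketch: a TF-IDF text classifier that scores posts and routes high-scoring ones to human fact-checkers. The training examples, labels, and 0.5 threshold are invented for demonstration only; production systems rely on far larger curated datasets, transformer-based models, and human review.

```python
# Illustrative sketch of a classifier that flags posts for human fact-checkers.
# The tiny labelled dataset below is invented purely for demonstration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training examples: 1 = likely disinformation, 0 = likely benign.
texts = [
    "Leaked audio proves the election is already rigged, share before it is deleted",
    "Polling stations are open from 7 am to 8 pm, bring a valid ID",
    "Secret plan to buy votes revealed, the media is hiding it",
    "The electoral commission has published the official list of candidates",
]
labels = [1, 0, 1, 0]

# TF-IDF features plus logistic regression: a deliberately simple baseline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Score a new post; anything above the threshold is routed to fact-checkers.
post = "Recording shows party leaders discussing how to rig the vote"
score = model.predict_proba([post])[0][1]
print(f"disinformation likelihood: {score:.2f}")
if score > 0.5:
    print("flag for human review")
```

The point of the sketch is the workflow rather than the model: content is scored quickly and escalated to human reviewers, which matters precisely because incidents like the Slovak audio surface only days or hours before a vote.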

It is apparent that the upcoming 2024 elections will face significant cyber risks that could threaten the integrity, security, and credibility of the democratic process. These risks are primarily driven by the rapid evolution of the digital and information space, as well as the challenges posed by generative AI. While regulatory frameworks have been put in place to address these concerns, mitigating cyber risks will require a more collaborative and holistic approach.
