Last week, PinnacleOne considered what the Office of the National Cyber Director’s Annual Report means for modern enterprises.
This week, we highlight the convergence of AI and foreign malign influence efforts in 2024’s global election year.
Please subscribe to read future issues — and forward this newsletter to interested colleagues.
Contact us directly with any comments or questions: [email protected]
Insight Focus | AI and Foreign Election Interference
The 2024 U.S. elections (and many other global elections) face a threat landscape defined by foreign influence actors using time-tested tactics augmented by emerging AI tools to undermine the democratic process. On May 15, 2024, officials from the Intelligence Community, FBI, and CISA testified before the Senate Select Committee on Intelligence to draw public attention to the evolving threat.
Their top-line message: Foreign adversaries – primarily Russia, China, and Iran, along with their commercial enablers – are increasingly attempting to undermine democratic systems through both cyber interference targeting election infrastructure and covert influence operations aimed at civil society.
At the same time, these government leaders and outside experts (including our very own Chris Krebs) are confident that the 2024 election will be technically secure. Krebs noted on Face the Nation yesterday that while the 2020 election was deemed safe and secure, current election systems are even more robust thanks to continued investment and improvement. His concerns, however, remain focused on the growing influence capabilities of foreign actors:
“On influence, the scope, the scale, the technology available to our adversaries, including AI and deep fakes, [makes it] a much more precarious threat environment. The Chinese are active. The Russians are very active. They’ve been using deep fakes in Europe. We’ve seen AI [generated content] pop up in [elections in] Moldova, Slovakia, in Bangladesh. So it is going to be a tool.
My sense, however, is that threats that are AI powered or AI enabled, will be much like what happened in New Hampshire with the Robocall. It will be immediately detected, it will be investigated quickly, and it will be prosecuted. And that’s what’s happening right now.
I think the biggest concern though, is that this is cumulative. It’s accretive. So, rather than one single catastrophic AI-enabled event, it’s gonna be a steady drum beat where we, where the voters, the public are just going to lose confidence and trust in the overarching information ecosystem.”
Intensifying Influence Operations
This concern is merited. Recent intelligence and observed operations show that threat actors have expanded well beyond traditional propaganda, adopting sophisticated tactics to sow discord and interfere with U.S. elections, including:
- Infiltrating organizations to exacerbate polarization and incite social strife;
- Conducting targeted harassment and sting operations against candidates;
- Organizing opposing rallies to trigger violence;
- Undermining democratic allies through narrative manipulation and “gray media”; and
- Circulating deep fakes of political figures to mislead the public.
For example, recent PRC campaigns used AI to target voters in Taiwan and the U.S. with false information laundered through fake social media accounts, intending to identify and exploit divisive domestic political issues and shape election results. Last month, Belgian and Czech leaders called for urgent action to push back on Russian interference in advance of the European elections in June.
The Czech government imposed sanctions on individuals accused of attempting to bribe members of the European Parliament to promote Russian narratives. In January, the European Parliament opened an investigation into a Latvian member reported to have been serving as a Russian agent since at least 2004.
Increased Incentives and Capabilities
Foreign adversaries view election interference as a cost-effective and plausibly deniable means of achieving their strategic goals, and they now have powerful new tools at their disposal. The combination of synthetic media tools and powerful LLMs (many of them open source) democratizes the kind of cross-language disinformation and media manipulation that once required the Internet Research Agency to fill an entire building with fluent English speakers and media designers. The barrier to running a “troll farm” is falling precipitously.
Further, many adversaries have collected, and continue to collect, bulk data on Western publics. Feeding this trove of information into sophisticated analytics engines may enable more precise targeting of selected populations, and even individuals, for bespoke influence at greater scale. While not yet directly observed, such future malign influence operations will be harder to discover, attribute, and counter.
As a result, the threat landscape continues to intensify, with a growing number of foreign actors, including non-state entities, engaging in election interference. In addition, more commercial firms (wittingly or unwittingly) are being used by foreign actors to support influence operations, increasing those operations’ sophistication and making them more difficult to track.
U.S. Efforts to Counter Election Threats
The U.S. government has taken significant steps to bolster its defenses, including:
- Establishing dedicated centers and executive positions within the Intelligence Community, including the Foreign Malign Influence Center and Election Threats Executive;
- Enhancing collaboration among U.S. Cyber Command, CISA, law enforcement, and election officials (many of whom have been granted security clearances) to share information on cyber threats and to disrupt the illicit financial networks and digital infrastructure used for attacks;
- Offering DARPA-developed Semantic Forensics (SemaFor) tools to authenticate suspect foreign media (state and local officials can request authentication assistance through the FBI if they suspect deep fakes are the product of a foreign actor);
- Providing extensive training and security support through CISA (the election sector risk management agency), which works with front-line election officials to assure the integrity of election processes, whose very diversity acts as a bulwark against systemic compromise.
Recommendations for Cybersecurity Professionals and Election Officials
To effectively counter evolving threats, cybersecurity professionals and election officials must:
- Stay informed about the latest adversary tactics, techniques, and procedures;
- Collaborate closely with intelligence, law enforcement, and private sector partners;
- Actively participate in information sharing initiatives, such as the EI-ISAC;
- Leverage CISA’s cybersecurity services and resources;
- Invest in training and awareness programs for staff;
- Continuously update and fortify cybersecurity defenses;
- Leverage tools to detect and attribute synthetic media and AI-enabled influence efforts (a simple illustrative sketch follows this list);
- Work with government partners to establish attribution and response frameworks.
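To make the synthetic media recommendation above slightly more concrete, the minimal Python sketch below shows one basic building block an election office or campaign security team might use: auditing its own published media assets against a recorded SHA-256 manifest so that altered or substituted files are flagged for human review. This is an illustrative example of ours, not a government or vendor tool, and the file names (media_manifest.json, official_media/) are hypothetical placeholders.

```python
# Illustrative sketch (not an official CISA/FBI capability): flag official media
# assets that no longer match a previously recorded SHA-256 manifest, so possible
# tampering or substitution is surfaced for human review.
import hashlib
import json
from pathlib import Path

MANIFEST = Path("media_manifest.json")   # hypothetical: {"video/ad1.mp4": "<sha256 hex>", ...}
MEDIA_DIR = Path("official_media")       # hypothetical directory of published assets


def sha256_of(path: Path) -> str:
    """Hash the file in 1 MB chunks so large video files don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def audit(manifest_path: Path, media_dir: Path) -> list[str]:
    """Return findings: files that are modified, missing, or not in the manifest."""
    expected = json.loads(manifest_path.read_text())
    findings = []
    for rel_path, known_hash in expected.items():
        candidate = media_dir / rel_path
        if not candidate.exists():
            findings.append(f"MISSING   : {rel_path}")
        elif sha256_of(candidate) != known_hash:
            findings.append(f"MODIFIED  : {rel_path}")
    on_disk = {p.relative_to(media_dir).as_posix()
               for p in media_dir.rglob("*") if p.is_file()}
    for extra in sorted(on_disk - expected.keys()):
        findings.append(f"UNEXPECTED: {extra}")
    return findings


if __name__ == "__main__":
    for finding in audit(MANIFEST, MEDIA_DIR):
        print(finding)
```

A baseline like this only establishes which of an organization’s own assets remain authentic; it does not detect synthetic media circulating elsewhere. Suspect third-party content should still be referred through established channels, such as the FBI, for the DARPA-developed forensics noted above.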
A Fraught Year for Elections
The 2024 elections will test the United States’ ability to safeguard its democratic institutions against foreign cyber threats and influence operations. By staying informed, collaborating with partners, leveraging resources, and strengthening defenses, cybersecurity professionals and election officials can play a vital role in ensuring the integrity of the electoral process. Prioritizing the development of robust attribution and response frameworks will be essential to effectively counter these threats and maintain public confidence in the democratic system.