Nisos Tradecraft

A Framework For Tackling Influence Operations During a Busy Election Year

Mar 6, 2024 | Blog

EXECUTIVE SUMMARY

Tackling influence operations (IO) in a normal year is difficult enough. This year, half the world’s population heads to the polls. For social media companies, unfettered IO campaigns targeting platform users—often involving disinformation, misinformation, or malinformation—can undermine user confidence in key markets and present reputational and legal risks. In this report, Nisos breaks down why IO is so complex and explains why it is so difficult to scope, identify, and remediate these activities without dedicated analytical and investigatory support.

Timing: Planning ahead is key to tracking how IO threats to elections evolve. Election campaigns can be long, which means a lot can change, including the prevailing IO narratives. Nisos analyzes the broader IO landscape well ahead of and in the direct lead-up to elections to understand those shifts.

State and Non-State Actors: State actors in the IO space, such as Russia, China, and Iran, are well known, but the range of actors running IO also includes domestic actors such as media outlets, influencers, and political parties.

From Overt to Covert and Everything in Between: Actors involved in IO proliferate their narratives through a variety of means, ranging from very visible, attributable, and overt activities to hidden, difficult-to-attribute, and covert measures. Some IO propagators will also seek to amplify grassroots messages that sow division or further polarization in a country of interest.

Information Flow On and Off Platform: IO rarely takes place on just one platform; it usually flows across multiple platforms, including those with few or no restrictions on content, enabling violative content to spread across the information ecosystem.

Broader Political Context: A country’s history and recent events color the political context—understanding that background is critical in assessing the intent behind election-driven IO campaigns.

Nisos acts as an embedded part of trust and safety teams, leveraging analysts with in-depth expertise in the IO landscape and the tactics, techniques, and procedures (TTPs) that IO actors use, enabling us to detect and often attribute the sophisticated adversaries behind these operations.

BACKGROUND

This year, voters across more than 80 countries will cast their ballots. Elections are particularly challenging for social media companies, as IO occurring on social media platforms can have manifold off-platform repercussions, ranging from violent demonstrations to erosion of trust in democratic processes. These events can degrade user confidence in platforms, pushing users to flee to other online communities or inviting regulatory scrutiny.

It can be overwhelming for online platforms to devise effective solutions to address IO in just one high-profile election, let alone dozens. A significant amount of published material focuses on prominent IO actors and narratives; however, it too often prioritizes tactical concerns over broader business implications. The impact on platforms extends beyond the scope of typical IO research and depends on a company's reach and presence in a given country, volatility in that country, and past perceptions of the platform there. Nisos can help online platforms and their trust and safety teams stay ahead of and defend against the multi-dimensional threats and risks posed by election-driven IO.

To complicate the landscape further, IO actors are using AI to create deepfake videos and voice-cloned audio. While AI has factored into previous elections, generative AI capabilities have significantly improved in recent years, and trust and safety playbooks derived from previous election cycles are already out of date. For example, some residents in New Hampshire received an AI-generated audio recording of President Biden urging voters not to vote in the primary election (see source 1 in appendix). The Pakistan Tehreek-e-Insaf party used generative AI to create footage of Imran Khan, its founder, urging supporters to vote on election day (see source 2 in appendix). Nisos tracks these kinds of trends to maintain an up-to-date understanding of how actors use the information ecosystem to their advantage. We contribute these insights to trust and safety workflows tasked with identifying and tracking new IO techniques and campaigns on-platform.

TIMING

Mitigating IO threats to elections requires planning ahead. A lot can change during an election cycle, including the make-up of the candidate pool and the hot issues that provide fresh fodder for new IO narratives. It is not always feasible to fully attribute the actors behind the most potentially harmful narratives—including those that may lead to an offline event or present the highest reputational risk. At Nisos, we analyze the broader IO landscape months in advance of an election, including forecasting how narratives might evolve closer to the actual vote. We continue to revisit our assessments as election day nears, highlighting more current and specific examples of overt and covert IO, including by way of coordinated inauthentic behavior (CIB) on social media platforms.

In Argentina, then-President Alberto Fernandez announced at the end of April 2023 that he would not run for reelection, only four months before the primaries and six months before the presidential election (see source 3 in appendix). As such, IO campaigns and associated narratives had to adapt to the changing cast of candidates.

In Indonesia, then-presidential candidate Prabowo Subianto did not pick his running mate, Gibran Rakabuming Raka, son of then-President Joko Widodo (Jokowi), until 22 October 2023, less than four months before the election (see source 4 in appendix). Fact-check results for “Gibran” on the Indonesian fact-checking website Cekfakta increased tenfold between the period from 1 January to 21 October 2023 and the period from 22 October 2023 to election day on 14 February 2024.

In early January, Indonesia’s General Election Supervisory Agency determined that Gibran broke campaign rules (see source 5 in appendix). Cekfakta highlighted a misleading video that appeared on social media a week before the election claiming that Gibran’s trial had already started and that he faced disqualification from running for vice president; the footage was actually of another, unrelated trial (see source 6 in appendix).

STATE AND NON-STATE ACTORS

Russia’s efforts to influence US elections and China’s attempts to change voting patterns in Taiwan are prominent examples of how state actors are active in the IO space. Most people are familiar with IO efforts by Russia, China, and Iran, but other countries are growing their own capabilities in this space. Understanding which actors are seeking to amplify election-related narratives helps identify not only the different types of IO propagators but also their varying methods and goals. During elections, we often see state actors boost one candidate or denigrate another to persuade the targeted country’s electorate to vote accordingly.

A declassified State Department cable released in October 2023 said the US intelligence community found evidence that Russian actors made a concerted effort to undermine faith in the voting process in at least nine countries between 2020 and 2022 (see source 7 in appendix).

Taiwan earlier this year said it was documenting its experiences countering Chinese interference in its January election and would make its analysis public (see source 8 in appendix).

The Indian military backed an IO campaign in 2019 that sought to convince Kashmiris that they would be better off under Indian authority (see source 9 in appendix).

State actors are just part of the story; domestic and non-state actors play a role in the IO landscape as well. Such actors could include journalists, influencers, political parties, or other individuals or groups focused on pushing certain perspectives to the public. Overlap between groups of actors is also common, as domestic and non-state actors often share viewpoints with state actors and knowingly or unknowingly propagate the same material. In some cases, these actors have also incited violence.

In 2021, Nisos researchers identified a coordinated, inauthentic network of 317 Twitter accounts that aimed to discourage Honduran voters from supporting either of the two main presidential candidates and to encourage them to abstain from voting entirely (see source 10 in appendix).

The Telegram channel LexeData, which appears to be linked to the Turkish Kurdistan Workers’ Party, collects and shares information on Turkish citizens and incites violence against individuals who insult or offend Kurds (see source 11 in appendix).

During the Colombian election in 2022, Nisos identified a Twitter disinformation campaign by Venezuelan organizations seeking to support then-presidential candidate and current Colombian President Gustavo Petro (see source 12 in appendix).

Some reporters from China’s state-run media outlets have social media accounts where they cast themselves as influencers, posting content that counters Western perceptions, highlights positive stories about China, and amplifies Russian IO narratives (see source 13 in appendix).

DISCLAIMER

The reporting contained herein from the Nisos research organization consists of analysis reflecting assessments of probability and levels of confidence and should not necessarily be construed as fact. All content is provided on an as-is basis and does not constitute professional advice, and its accuracy reflects the reliability, timeliness, authority, and relevancy of the sourcing underlying those analytic assessments.

About Nisos®

Nisos is the Managed Intelligence Company®. We are a trusted digital investigations partner, specializing in unmasking threats to protect people, organizations, and their digital ecosystems in the commercial and public sectors. Our open source intelligence services help security, intelligence, legal, and trust and safety teams make critical decisions, impose real world consequences, and increase adversary costs. For more information, visit: https://www.nisos.com.