Introduction

Companies, politicians, and governments are constantly working to motivate audiences to think and act favorably toward them. Think of a billboard promoting a fast-food chain, a political campaign video on YouTube, or a government-led polio vaccination drive. But some influence operations go too far and undermine democracy, which depends on the integrity of information. Can influence operations be assessed to distinguish those that are acceptable from those that are not?

This paper explores three potential criteria—transparency in origins, quality of content, and calls to action—to assess the acceptability of an influence operation in the context of democracies. It focuses on three case studies: U.S. efforts to sell the war in Iraq; campaigns that fuel climate change denial in the United States; and the WhatsApp-based electoral campaign of the Bharatiya Janata Party (BJP) in India. Each case study examines the following questions within the three criteria: Who is behind the operation? What activities are being carried out? What is the quality of the content involved? Who is the target audience? And what are the means of distribution? By proposing such a framework, the paper aims to foster a much-needed discussion in democracies about what kinds of influence operations are acceptable, thereby guiding the policy, government, and military interventions democracies make in the information environment.

Influence operations are organized attempts to affect an audience or an outcome toward a specific aim. They are conducted by a variety of actors, including advertisers, politicians, governments, activists, agents, opportunists, provocateurs, and celebrities. While several frameworks have emerged to analyze influence operations, they often stop short of providing criteria that might help distinguish influence operations that are acceptable from ones that are not.1 Making such a distinction, however, is a critical part of creating a guiding framework for governance based on democratic principles and values. Drawing from and building on existing literature on influence operations, the framework presented here identifies characteristics that can help assess the acceptability of influence operations.

Three Criteria to Assess the Acceptability of Influence Operations

A practical approach to classifying influence operations—based on the transparency of their origins, the quality of content distributed, and their calls to action—may provide a foundation for determining the degree of the operations’ acceptability. These criteria allow for the creation of a scale of acceptability, which could be used to inform policy development and guide the administration of public education campaigns.2

Beginning with the premise that access to trustworthy information is necessary for democracies to foster an informed electorate, two criteria for assessing the acceptability of an influence operation are the transparency of its origins and the quality of its content. Given the need for democracies to reach consensus to pass laws and make decisions in the public interest, a third criterion is what an influence operation encourages audiences to do, or its call to action. These criteria are described in depth below.

  1. Transparency describes how clear the influence operation’s origins are to the audience. Campaigns whose origins are clearly identifiable are the most acceptable, and those whose origins are completely obfuscated are the least acceptable. Under this approach, running an influence operation while obfuscating its origins, such as by hiding its funding source, would be unacceptable. Examples that would fall at the unacceptable end of the spectrum include a foreign actor attempting to create the appearance of being a domestic source, a politician not clearly disclosing their use of political action committees, or a citizen engaged by a foreign or domestic actor to promote a specific agenda without noting their funding source.
  2. Quality of content relates to the accuracy, completeness, timeliness, and coherence of information along a spectrum.3 Information pollution degrades the information environment and comes in many forms. It includes irrelevant or unsolicited messages, such as spam emails, and redundant or empty information that contributes little to knowledge, as well as information that misleads or is false, such as disinformation.4
  3. Calls to action are what an influence operation asks audiences to do, with peaceable calls to action being more acceptable and those calling for violence or hatred against others less acceptable. For example, Article 20 of the UN International Covenant on Civil and Political Rights prohibits “any advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence.”5 Calls to action that violate human rights could conceivably also infringe on other democratic principles, such as equality.

For an influence operation to be acceptable in the context of democracies, it must be acceptable on all three dimensions. For instance, an influence operation that uses true information but obscures the origins of that information is unacceptable. Likewise, an influence operation that spreads false information but does not obscure its origins or instigate violence is also unacceptable.

Applying the Assessment Framework

To test these criteria, we applied them through a series of questions, outlined in table 1, across three case studies. The three case studies reflect different types of campaigns, including one that crossed borders and two that targeted domestic audiences. The first case study covers U.S. influence efforts regarding the war in Iraq, including both campaigns targeting Iraqis and campaigns soliciting domestic support for the invasion. The second case study assesses an influence operation aimed at promoting climate change denial in the United States. The third case study explores Indian Prime Minister Narendra Modi’s campaign for reelection in 2019.

Table 1. Questions to Guide the Assessment of Influence Operations
Criteria Aspect of the influence operation Assessment against the criteria
Transparency Who is behind the operation? How are the involved actors positioning themselves in the operation? Are they disclosing their role in the operation?
Transparency What activities are the operators undertaking? Do their behaviors
  • obfuscate the origins of the operation;
  • breach terms of service on platforms used; or
  • create an illusion of scale and support?
Quality of content What is the quality of content distributed? Is the nature of the content pushed through the operation
  • demonstrably and repeatedly false;
  • misleading;
  • inciting hatred or violence; or
  • unverified in veracity and/or origin?
Transparency; Calls to action Who is the target audience?
  • Given the operators’ positioning, do they have a legitimate right to target this audience in the manner undertaken?
  • Is the target an important conduit through which the integrity of the information environment is threatened (for instance, journalists, politicians, or influencers)?
  • Is the target audience or the focus of the messaging a protected group (for instance, minority or LGBTQ communities)?
  • What is the influence operation asking audiences to do?
Transparency; Calls to action What are the means of distribution? Do the channels for moving this content
  • obfuscate their origins;
  • increase the scale of the operation; or
  • encourage immediate responses by the target audience?

The U.S. Government’s Efforts to Justify the Iraq Invasion

Assessed through the three criteria introduced in this paper, U.S. government efforts to sell the invasion of Iraq at home and abroad included initiatives that obscured the origins of the influence operation and purveyed questionable claims, eroding the legitimacy of the basis for the invasion over time (see table 2). The campaign by the administration of president George W. Bush to promote invading Iraq remains a textbook example of an influence operation. The campaign targeted domestic audiences in the United States and the United Kingdom, Iraqi military decisionmakers, and others. As demonstrated by numerous investigative reports, the justifications for the invasion used by the Bush administration were questionable.6 And in attempting to win public support for the invasion, the administration created parallel structures to coordinate activities and plant intelligence, often working through third parties.7 These activities included creating official government bodies to justify the war; influencing media coverage to support the war; using think tanks and public relations firms to push the agenda for invasion; and creating Iraqi organizations to lend the appearance of support within Iraq. These efforts were driven by a relatively small group of high-level U.S. political influencers, including the president.8

Several official bodies worked to win support for invading Iraq. The U.S. Defense Department’s Office of Strategic Influence reportedly ran covert influence operations, “planting news items with foreign media organizations through outside concerns that might not have obvious ties to the Pentagon,” until the office was closed in 2002 following public outcry.9 The White House Iraq Group coordinated daily messaging about the administration’s efforts on its so-called war on terror; some commentators on the invasion have viewed the group as a propaganda coordination committee.10 The Committee for the Liberation of Iraq (CLI) was spun off from the Project for the New American Century, a neoconservative think tank, and worked closely with and shared many members with the American Enterprise Institute, another think tank.11 The CLI conducted outreach to journalists across the United States, holding lectures and dinners with administration officials and editorial boards, in a bid to sell the war.12

The U.S. State Department sponsored the Iraq Public Diplomacy Group, which aimed to provide media training to Iraqi dissidents to “help make the Bush administration’s argument for the removal of Saddam Hussein.”13 As an anonymous U.S. official said then, “We’re going to put [Iraqi dissidents] on the front line of winning the public hearts and minds. It’s one thing for an American to get up and talk about regime change in Iraq. It’s quite another thing when Iraqis do it.”14 Such training taught not only how to communicate effectively but also what the messaging ought to be, particularly regarding the promotion of democracy.15

Several public relations firms were also engaged in promoting an invasion of Iraq, including the Rendon Group and the Lincoln Group (formerly known as Iraqex).16 The Rendon Group was credited with setting up and managing the Iraqi National Congress, an “organized opposition movement with a view of overthrowing Saddam Hussein” led by Ahmed Chalabi, whom the Pentagon relied on as a source of information and coordination of anti–Saddam Hussein activities.17 The Rendon Group was reportedly contracted by the U.S. Central Intelligence Agency to run a campaign “to influence global political opinion against Saddam,” which included “offering information to British journalists, and many articles subsequently appeared in the London press.”18 While Saddam Hussein had a horrible human rights track record, some of the stories placed were inaccurate, including claims that Iraq possessed weapons of mass destruction.19

Following the U.S. invasion in March 2003, the planting of news stories continued into the occupation. Details emerged about the U.S. military secretly paying Iraqi newspapers to publish stories written by American troops to burnish the image of the U.S. mission in Iraq.20 The U.S. government’s activities extended to hiring the now discredited British public relations firm Bell Pottinger to develop and run a $540 million covert propaganda campaign in Iraq after the invasion in what is believed to be one of the world’s most costly public relations contracts. Bell Pottinger used staff based in Baghdad to disseminate pro-coalition material. In addition to fake al-Qaeda videos, which were used to monitor people who viewed them, Bell Pottinger made television segments that looked as though they were produced by Arabic-language news networks. The staff created news packages and sent the videos to television stations across the Middle East. The fact that the stories were created with the support of the U.S. government was sometimes kept hidden.21

Beyond planting and faking news stories, the U.S. government’s campaign involved false claims that Saddam Hussein had or was acquiring weapons of mass destruction. These claims, along with accusations that Saddam Hussein had ties to al-Qaeda, were made on at least 532 separate occasions by Bush and three senior U.S. officials—vice president Dick Cheney, national security adviser Condoleezza Rice, and defense secretary Donald Rumsfeld.22 This concerted effort, amplified by thousands of news stories and broadcasts, provided the foundation for the administration’s case for war, with 77 percent of Americans supporting the invasion by April 2003.23 While this effort cannot be labeled disinformation, as it remains unclear whether the U.S. officials knew the information that they were spreading was untrue, the actions eroded public trust nonetheless as details emerged after the invasion through commissions such as the Iraq Inquiry in the United Kingdom.

The effort to sell an invasion of Iraq, not just to Americans but also to U.S. allies and Iraqis, was considerable. It included a co-opting of U.S. and international media, hundreds of millions of dollars spent on public relations firms, and campaigns at the United Nations for resolutions condemning Saddam Hussein and pressing for the disarmament of nonexistent weapons of mass destruction.24 In Iraq, efforts by the U.S.-led military coalition included “radio and television broadcasts and 50 million leaflets” and “[psychological operations] . . . conducted against Iraqi civilians and soldiers.”25 These efforts also attempted to undermine Saddam Hussein’s trust in his own intelligence, combining “[psychological operations], bribery, deception, and a human form of cyberwar.”26

Would the invasion of Iraq have happened without this influence operation and the use of misleading information about weapons of mass destruction? It is hard to say. It will rarely be clear whether and to what extent influence operations cause polities or other targets to behave differently than they otherwise would have, for good or ill. Yet, the public trust necessary to sustain democracies may be undermined if such operations become normalized.

Table 2. Analysis of the U.S. Invasion of Iraq
Aspect of the operation Analysis of the U.S. invasion of Iraq
Who is behind the operation? The U.S. government with help from the British government
What activities are the actors undertaking? Planting false stories both domestically and internationally, among other influence activities
What is the quality of the content? Fabricated intelligence
Who is the target audience? Audiences in the United States and the United Kingdom, the Iraqi military, and others
What are the means of distribution? Official government bodies, media organizations, think tanks and public relations firms, and domestic organizations in Iraq
Assessment of criteria Transparency: The U.S. government set up agencies and organizations to justify the war in Iraq and planted news stories to make support for the war appear both rational and broad-based, thus obfuscating the origins of the content.

Quality of content: Several studies and internal reports have highlighted that much of the intelligence and news around the Iraq invasion was knowingly, demonstrably, and repeatedly false and misleading.

Calls to action: The bulk of the calls to action urged audiences to openly support the U.S. government and military.

Climate Change Denialism in the United States

Climate change denial in the United States falls squarely into the realm of unacceptable influence operations as defined in this paper (see table 3). Its proponents—fossil fuel companies, conservative philanthropists and think tanks, and front groups, among others—hide the origins of misleading and false information and aim to cast doubt on the scientific knowledge and institutions that affirm the reality of human-caused climate change.

While the evidence for human-caused global warming is unambiguous, this is far from being unanimously accepted as a fact in the United States.27 A combination of denialism, attacks on science, and propaganda campaigns forms a sophisticated system that is sometimes referred to as a “denial machine.”28 This denial machine has led to disinformation attacks on prominent climate change activists, such as Greta Thunberg, and subverted educational curricula. Denialism has stalled U.S. action to curb greenhouse gas emissions, impacting global climate agreements and international efforts to combat climate change.29 And doubt about the facts has had severe implications for populations of wild animals such as polar bears, whose habitat has been damaged by a climate change–driven reduction in Arctic sea ice.30

By many accounts, climate change denialism by the fossil fuel industry in the United States has followed the playbook that the tobacco industry used in the 1950s to prevent the public from understanding the negative health effects of smoking.31 The playbook includes “manufactur[ing] uncertainty” through a multistep protocol:32 confusing people through ostensibly independent organizations that further the fossil fuel industry’s claims, co-opting scientists to inaccurately or incompletely represent scientific knowledge, and casting doubt over the accuracy of science and its institutions by attacking peer-review mechanisms or focusing on non-peer-reviewed information.33 Denialism, at the center of this approach, refers to “the employment of rhetorical tactics to give the appearance of argument or legitimate debate, when in actuality there is none.”34 But whereas the word skepticism implies a healthy critical stance toward knowledge production in the spirit of scientific inquiry, denial refers to “motivated rejection of evidence in favor of political or personal views.”35 Framing climate change deniers as skeptics can be interpreted as a success of their informational politics, which seeks to give credence to and create legitimacy for denials that human activity is the cause of climate change.

Motivations for climate change denial are wide-ranging but typically overlap in the opposition to regulations of and restrictions on carbon emissions.36 This opposition can be rooted in faith, ideology, or political affiliation.37 Researchers have further argued that climate change denial is an extension of a world order invested in sustaining capitalist pursuits, uniting those who believe that free markets and technology will solve all societal problems, including climate change.38

Actors leading climate change denialism are varied. Some act publicly, while others act by subsidizing organizations that obfuscate their involvement. The main actors are the fossil fuel industry; conservative think tanks, such as the Heritage Foundation and the Cato Institute; conservative philanthropists; front groups; co-opted scientists; and conservative media, politicians, and bloggers.39 Research has shown that heavily automated accounts have played an important role in propagating climate change denial content on Twitter; researchers found that, on an average day during the 2017 period surrounding then U.S. president Donald Trump’s announcement that the United States would withdraw from the Paris Agreement, 25 percent of the Twitter discourse about climate change was produced by bots.40

Researchers have documented the role of the fossil fuel industry in promoting climate change disinformation. One study showed how the American Petroleum Institute, the leading trade association for the U.S. fossil fuel industry, has known since at least 1980 that carbon emissions can lead to climate change but promoted disinformation despite this knowledge.41 Other researchers have examined how ExxonMobil “embarked on a deliberate campaign of confusion and disinformation, producing a counter-science to manufacture public uncertainty by funding a diffuse network of ideologically driven advocacy organizations, as well as other issues management, public relations, lobbying and legal tactics.”42

Historians have argued that the U.S. culture of climate change denial emerged after the end of the Cold War, when leading scientists at the conservative nonprofit George C. Marshall Institute “turned their attention to environmentalists, who some at the time called ‘watermelons’—green on the outside, ‘red’ on the inside.”43 At the time, acknowledging and attempting to mitigate the real and consequential effects of the Anthropocene and the human-caused greenhouse effect did not align with conservative free-market politics. News media organizations also played a role in the denial machine. For instance, reporting that gave equal space to climate change deniers and believers lent a false equivalence of legitimacy to each group’s claims.44 Furthermore, according to one study, the U.S. news media were more hesitant to cover climate change in the mid-2000s than media organizations in other countries, such as Canada.45

Table 3. Analysis of Climate Change Denial
Aspect of the operation Analysis of climate change denial in the United States
Who is behind the operation? Fossil fuel industry, conservative think tanks, conservative philanthropists, front groups, co-opted scientists, and conservative media and politicians
What activities are the actors undertaking? Using news media and scientific sources to spread false information and funding counter-science organizations
What is the quality of the content? Demonstrably and repeatedly false information being circulated about climate change, in some cases despite the actors’ knowledge that the information was false
Who is the target audience? Audiences in the United States
What are the means of distribution? Astroturfing operations and bots on social media, co-optation of traditional media, and academic and pseudo-academic venues
Assessment of criteria Transparency: Climate change deniers at times obfuscate the origins of funding for particular campaigns, though in other cases false information spreads without any obfuscation.

Quality of content: Much of the climate change denial content has been demonstrably and repeatedly false and misleading. In some cases, actors spread false information knowingly.

Calls to action: Much of the false and misleading information is an attempt to change the narrative around climate change in the United States and lobby against policies that aim to stop and reverse climate change.

The BJP’s 2019 Election Campaign in India

In 2019, the BJP won a landslide victory in the Indian general election, giving Modi a second term as prime minister. Mobilizing voters through social media campaigns was integral to the BJP’s electoral strategy,46 in line with what researcher Seva Gunitsky refers to as a “shift from contestation to co-optation” of social media by governments.47 A closer look at the BJP’s tactics reveals a host of dubious actions and actors. The BJP leveraged social media, particularly WhatsApp, to generate the illusion of mass support and spread misleading, false, and divisive information.48

Volunteers working for the Rashtriya Swayamsevak Sangh (RSS), a paramilitary volunteer organization that promotes Hindu nationalism and maintains links to the BJP, and so-called information technology (IT) cells comprising official party workers collected data on the socioeconomic, religious, and caste status of voters using electricity bills and voter registration lists.49 This data was then used to create thousands of tailor-made WhatsApp groups, which were built to target specific populations with messaging that catered to their beliefs and backgrounds.50 Data collection and microtargeting of this kind are not illegal in India because of the country’s weak data protection laws.51

While all political parties—including the BJP’s rival, the Indian National Congress party—try to influence voters,52 it was the content of the BJP’s targeted messaging that made the campaign an unacceptable influence operation, according to this paper’s criteria (see table 4). Much of the content shared was demonstrably and repeatedly false and misleading. The BJP worked to coordinate and disseminate such messaging through regional party IT cells.53 Oxford’s Computational Propaganda Project found that more than a quarter of the content shared by the BJP on Facebook and WhatsApp in the months preceding the election was “junk news,”54 and more than a third of the visual content the party shared was “divisive and conspiratorial.”55 Several academic teams and journalists found repeated false, misleading, and hateful rhetoric being circulated within the BJP’s WhatsApp groups. Misinformation included divisive Islamophobic messaging and doctored visuals undermining the credibility of the then president of the Congress party, Rahul Gandhi.56

Several social media platforms besides WhatsApp were leveraged by the BJP to disseminate its messaging. The BJP’s IT cells used Facebook, Twitter, Instagram, and the NaMo (Narendra Modi) app to spread pro-BJP, anti–Congress party messages designed to polarize and disinform.57 However, the party’s use of WhatsApp far outstripped the use of other platforms. WhatsApp’s encrypted messaging and ubiquity in the country—an estimated 400 million people used the app at the time in India58—played a part in the BJP seeing it as the most effective communication platform for mobilizing voters. Messages forwarded on WhatsApp often linked to content on other platforms, including ShareChat, Helo, Facebook, Twitter, and YouTube.59

The BJP’s digital informational strategies often subverted or blatantly violated the social media platforms’ policies. In 2019, WhatsApp introduced two key changes to its forwarding policy in India to prevent misinformation from going viral: it limited the number of users in a group to 256 and limited message forwarding to five users.60 However, the BJP bypassed these changes by employing hundreds of thousands of people to create small WhatsApp groups based around local voting sites and forward messages. The party employed an estimated 900,000 “cell phone pramukhs”—individuals given cell phones and assigned to create WhatsApp groups—primarily to disseminate propaganda.61 The BJP also created an illusion of support on Twitter by using fake accounts made by volunteers in its IT cells and through automation.62 Researchers have documented how the BJP used WhatsApp groups to coordinate campaigns to manipulate trends on Twitter.63 As many as 60 percent of Modi’s Twitter followers were purported to be fake accounts at one point,64 and researchers found evidence of computational propaganda, or the use of bots, at play prior to the 2019 election.65 In April 2019, Facebook took down more than 700 pages associated with political parties in India for violating policies on coordinated inauthentic behavior and civic spam.66 Although only fifteen of the pages were pro-BJP, their followers far outnumbered the followers of the 687 pages linked to the Congress party.67 Silver Touch, creator of the NaMo app, ran one of the pages that was taken down—“The India Eye”—which had more than 2.5 million followers and whose content was displayed by default on the NaMo app.68

Assessing the various aspects of the BJP’s electoral strategy on social media platforms against the three criteria proposed in this paper suggests that the BJP’s efforts in the run-up to the 2019 election propagated the illusion of scale and support, produced false and misleading content, and, in some cases, obfuscated the authenticity of users on platforms by using fake accounts and bots.

Table 4. Analysis of the BJP’s 2019 Election Campaign
Aspect of the operation Analysis of the BJP’s 2019 election campaign in India
Who is behind the operation? The BJP’s IT cells and workers on the ground
What activities are the actors undertaking? Creating WhatsApp groups that microtargeted voters and using various social media platforms to spread false information and online abuse
What is the quality of the content? Verifiably false information, often religious in nature
Who is the target audience? Audiences in India
What are the means of distribution? Social media, especially WhatsApp
Assessment of criteria Transparency: The use of fake accounts and the creation of WhatsApp groups obfuscated the origins of the content.

Quality of content: Content shared through WhatsApp groups and on other social media platforms was demonstrably and repeatedly false and misleading. It often targeted caste and religious minorities.

Calls to action: Most efforts were aimed at getting people to vote for the BJP in the 2019 election.

Conclusion

Assessed through the three criteria of transparency, quality of content, and calls to action, all three case studies fall short of acceptability in the context of democratic values. Taken together, these three criteria provide a starting framework to assess the acceptability of influence operations. This metric of acceptability can guide the conduct of influence operations by governmental and nongovernmental actors, as well as the conduct of interventions to curtail or correct operations that are unacceptable.

The first case study showcased how the U.S. government created organizations obfuscating the origins of its influence operations and planted fake stories, thereby compromising transparency and the quality of content. False claims—most importantly about the existence of weapons of mass destruction—make this influence operation problematic in the context of democracy. The scale of the operation, legitimizing a war that lasted decades and led to many deaths, is devastating. The second case showcased how climate change deniers in the United States endeavored to cast doubt on science, produced misleading and false information, attacked public institutions of knowledge, and degraded the information environment by creating a vacuum that was filled with climate change denialism. Because it used front organizations and misleading narratives, this influence operation is problematic in that it lacked transparency and spread low-quality information. The third case highlighted how political actors in India also crossed the lines of acceptability. Through fake accounts, they manufactured the appearance of support, obfuscated the origins of information, and created an air of authentic activity. All three case studies showcased influence operations that crossed the lines of acceptability in the context of democracies. In some cases, the involvement of government officials and politicians made these influence operations more problematic because they potentially eroded trust in public institutions.

It is imperative that democracies articulate principles for how they engage in the information environment and how they counter influence operations.69 By failing to articulate these principles, democracies have left it to authoritarian countries to set the example for other governments on how to govern the information environment.70 Influence operations and lies are part of the authoritarian playbook, but controlling the information space in the name of curbing disinformation is also a key element of autocratic governing. So, the shift toward authoritarianism manifests not only as a government propagating its own lies but also as a government exerting increased control over the information environment, while trust in public institutions is degraded by information pollution. Indeed, democracies around the world are backsliding into authoritarianism, partly due to the degradation of the information environment.71 This trend is observable in India, Hungary, and Ghana, once a bastion of democracy in Africa. Even well-established democracies are addressing problems such as disinformation and foreign interference less effectively than they once did.

This is why the criteria outlined in the paper are so important. Democracies need clear guidance about what is acceptable regarding influence operations and interventions. Under no circumstances, for example, should democratic governments and politicians deliberately mislead citizens. Moreover, the same guidance should apply to all branches of the government, including the military and intelligence communities.

Acknowledgments

Carnegie’s Partnership for Countering Influence Operations is grateful for funding provided by the William and Flora Hewlett Foundation, Craig Newmark Philanthropies, the John S. and James L. Knight Foundation, Microsoft, Meta, Google, Twitter, and WhatsApp. The PCIO is wholly and solely responsible for the contents of its products, written or otherwise. We welcome conversations with new donors. All donations are subject to Carnegie’s donor policy review. We do not allow donors prior approval of drafts, influence on selection of project participants, or any influence over the findings and recommendations of work they may support.

Notes

1 Alexandre Alaphilippe, “Adding a ‘D’ to the ABC Disinformation Framework,” Brookings TechStream, April 27, 2020, https://www.brookings.edu/techstream/adding-a-d-to-the-abc-disinformation-framework/; Camille François, “Actors, Behaviors, Content: A Disinformation ABC,” Transatlantic High Level Working Group on Content Moderation Online and Freedom of Expression, Institute for Information Law, University of Amsterdam, 2019, https://www.ivir.nl/publicaties/download/ABC_Framework_2019_Sept_2019.pdf; James Pamment, “The EU’s Role in Fighting Disinformation: Crafting a Disinformation Framework,” Carnegie Endowment for International Peace, September 24, 2020, https://carnegieendowment.org/2020/09/24/eu-s-role-in-fighting-disinformation-crafting-disinformation-framework-pub-82720; and Ben Nimmo and Eric Hutchins, “Phase-Based Tactical Analysis of Online Operations,” Carnegie Endowment for International Peace, March 16, 2023, https://carnegieendowment.org/2023/03/16/phase-based-tactical-analysis-of-online-operations-pub-89275.

2 We chose to leave out the criterion of scale of operations as it was less relevant to upholding democratic principles. While the three criteria in this paper provide a preliminary foundation, we welcome suggestions for other criteria that may be relevant.

3 Behrouz Ehsani-Moghaddam, Ken Martin, and John A. Queenan, “Data Quality in Healthcare: A Report of Practical Experience with the Canadian Primary Care Sentinel Surveillance Network Data,” Health Information Management 50, no. 1/2 (2021): 88–92, https://doi.org/10.1177/1833358319887743.

4 Jonah Berger, Contagious: Why Things Catch On (New York: Simon & Schuster Paperbacks, 2016).

5 UN General Assembly, “International Covenant on Civil and Political Rights,” December 19, 1966, https://treaties.un.org/doc/publication/unts/volume%20999/volume-999-i-14668-english.pdf.

6 “Report of the Select Committee on Intelligence on Postwar Findings About Iraq’s WMD Programs and Links to Terrorism and How They Compare With Prewar Assessments Together With Additional and Minority Views,” S.Rep. No. 109-331 (September 8, 2006): https://www.congress.gov/congressional-report/109th-congress/senate-report/331/1; and “The Report of the Iraq Inquiry,” UK House of Commons, July 6, 2016, https://www.gov.uk/government/publications/the-report-of-the-iraq-inquiry.

7 “Report of the Select Committee on Intelligence on the U.S. Intelligence Community’s Prewar Intelligence Assessments on Iraq Together With Additional Views,” S.Rep. 108-301 (July 9, 2004): https://www.congress.gov/congressional-report/108th-congress/senate-report/301/1; “Report of the Select Committee on Intelligence on Postwar Findings About Iraq’s WMD”; UK House of Commons, “The Report of the Iraq Inquiry”; and “Comprehensive Report of the Special Advisor to the DCI on Iraq’s WMD,” U.S. Central Intelligence Agency, September 30, 2004, https://www.cia.gov/readingroom/docs/DOC_0001156395.pdf.

8 Amy Gershkoff and Shana Kushner, “Shaping Public Opinion: The 9/11-Iraq Connection in the Bush Administration’s Rhetoric,” Perspectives on Politics 3, no. 3 (2014): 525–537, https://www.cambridge.org/core/journals/perspectives-on-politics/article/shaping-public-opinion-the-911iraq-connection-in-the-bush-administrations-rhetoric/FBF0272D582863800F770F1AEE276593.

9 James Dao and Eric Schmitt, “A Nation Challenged: Hearts and Minds; Pentagon Readies Efforts to Sway Sentiment Abroad,” New York Times, February 19, 2002, https://www.nytimes.com/2002/02/19/world/nation-challenged-hearts-minds-pentagon-readies-efforts-sway-sentiment-abroad.html.

10 Timothy Cake, America’s Alleged Intelligence Failure in the Prelude to Operation Iraqi Freedom: A Study of Analytic Factors (Calgary, Alberta: University of Calgary, 2017).

11 Sheldon Rampton and John Stauber, Weapons of Mass Deception: The Uses of Propaganda in Bush’s War on Iraq (New York: Tarcher & Penguin, 2003).

12 Eric Schmitt, “Threats and Responses: Bipartisan Hawks; New Group Will Lobby for Change in Iraqi Rule,” New York Times, November 15, 2002, https://www.nytimes.com/2002/11/15/world/threats-responses-bipartisan-hawks-new-group-will-lobby-for-change-iraqi-rule.html; and Douglas Quenqua, “Opinion Leaders Unite to Shift Saddam Focus in U.S.,” PR Week, November 25, 2002, https://www.prweek.com/article/165046/opinion-leaders-unite-shift-saddam-focus-us.

13 Douglas Quenqua, “U.S. Training Iraqis in Media to Raise Support for Attack,” PR Week, September 2, 2002, https://www.prweek.com/article/1233725/us-training-iraqis-media-raise-support-attack.

14 Robin Wright, “United States to Train Iraqis in Rhetoric Against Hussein,” Los Angeles Times, August 25, 2002.

15 Rampton and Stauber, Weapons of Mass Deception.

16 Mark Mazzetti and Borzou Daragahi, “U.S. Military Covertly Pays to Run Stories in Iraqi Press,” Los Angeles Times, November 30, 2005, https://www.latimes.com/archives/la-xpm-2005-nov-30-fg-infowar30-story.html.

17 Rampton and Stauber, Weapons of Mass Deception; and Cake, America’s Alleged Intelligence Failure in the Prelude to Operation Iraqi Freedom.

18 Jane Mayer, “The Manipulator,” New Yorker, May 30, 2004, https://www.newyorker.com/magazine/2004/06/07/the-manipulator.

19 Associated Press, “CIA’s Final Report: No WMD found in Iraq,” NBC News, April 25, 2005, https://www.nbcnews.com/id/wbna7634313.

20 Mazzetti and Daragahi, “U.S. Military Covertly Pays to Run Stories in Iraqi Press.”

21 Ben Norton, “U.S. Paid P.R. Firm $540 Million to Make Fake al-Qaida Videos in Iraq Propaganda Program,” Salon.com, October 3, 2016, https://www.salon.com/2016/10/03/u-s-paid-p-r-firm-540-million-to-make-fake-al-qaida-videos-in-iraq-propaganda-program/.

22 Charles Lewis and Mark Reading-Smith, “False Pretenses,” Center for Public Integrity, January 23, 2008, https://publicintegrity.org/politics/false-pretenses/.

23 Eric V. Larson and Bogdan Savych, American Public Support for U.S. Military Operations from Mogadishu to Baghdad (Santa Monica, CA: RAND Corporation, 2005).

24 Russ Hoyle, Going to War: How Misinformation, Disinformation, and Arrogance Led America into Iraq (New York: St. Martin’s Press, 2018).

25 John Ferris, “A New American Way of War? C4ISR in Operation Iraqi Freedom: A Provisional Assessment,” Journal of Military and Strategic Studies 6, no. 1 (2003), https://jmss.org/article/view/57813.

26 Ferris, “A New American Way of War?”

27 Naomi Oreskes and Erik M. Conway, Merchants of Doubt (New York: Bloomsbury Press, 2010); Pamment, “The EU’s Role in Fighting Disinformation”; Theda Skocpol and Alexander Hertel-Fernandez, “The Koch Network and Republican Party Extremism,” Perspectives on Politics 14, no. 3 (2016): 681–699, https://www.cambridge.org/core/journals/perspectives-on-politics/article/koch-network-and-republican-party-extremism/035F3D872B0CE930AF02D7706DF46EEE.

28 Pascal Diethelm and Martin McKee, “Denialism: What Is It and How Should Scientists Respond?” European Journal of Public Health 19, no. 1 (2009): 2–4, https://doi.org/10.1093/eurpub/ckn139; Mark Hoofnagle and Chris Hoofnagle, “What Is Denialism?” Denialism.com, March 18, 2007, https://denialism.com/about/; Oreskes and Conway, Merchants of Doubt; Brad MacKay and Iain Munro, “Information Warfare and New Organizational Landscapes: An Inquiry into the ExxonMobil-Greenpeace Dispute over Climate Change,” Organization Studies 33 no. 11 (2012): 1,507–1,536, https://doi.org/10.1177/0170840612463318; and Riley E. Dunlap and Aaron M. McCright, “Organized Climate Change Denial,” in The Oxford Handbook of Climate Change and Society, ed. J. S. Dryzek, R. B. Norgaard, and D. Schlosberg (Thousand Oaks: Sage, 2012), https://doi.org/10.1093/oxfordhb/9780199566600.003.0010.

29 Jeffrey Pierre and Scott Neuman, “How Decades of Disinformation About Fossil Fuels Halted U.S. Climate Policy,” NPR, October 27, 2021, https://www.npr.org/2021/10/27/1047583610/once-again-the-u-s-has-failed-to-take-sweeping-climate-action-heres-why.

30 Aashka Dave, Emily Boardman Ndulue, Laura Schwartz-Henderson, and Eli Weiner, Targeting Greta Thunberg: A Case Study in Online Mis/Disinformation (Washington, DC: German Marshall Fund, 2020); MacKay and Munro, “Information Warfare and New Organizational Landscapes”; Stuart Tannock, “The Oil Industry in Our Schools: From Petro Pete to Science Capital in the Age of Climate Crisis,” Environmental Education Research 26, no. 4 (2020): 474–490, https://doi.org/10.1080/13504622.2020.1724891; Jeffrey A. Harvey et al., “Internet Blogs, Polar Bears, and Climate-Change Denial by Proxy,” BioScience 68, no. 4 (2018): 281–287, https://doi.org/10.1093/biosci/bix133.

31 Steven A. Kolmes, “Climate Change: A Disinformation Campaign,” Environment 53, no. 4 (2011): 33–37, https://www.tandfonline.com/doi/full/10.1080/00139157.2011.588553.

32 Kolmes, “Climate Change.”

33 Diethelm and McKee, “Denialism”; Kolmes, “Climate Change”; and Oreskes and Conway, Merchants of Doubt.

34 Hoofnagle and Hoofnagle, “What Is Denialism?”

35 Stephan Lewandowsky, “Climate Change, Disinformation, and How to Combat It,” Annual Review of Public Health 42 (2021): 1–21, https://www.annualreviews.org/doi/abs/10.1146/annurev-publhealth-090419-102409.

36 Dunlap and McCright, “Organized Climate Change Denial.”

37 Diethelm and McKee, “Denialism”; and Lewandowsky, “Climate Change, Disinformation, and How to Combat It.”

38 Dunlap and McCright, “Organized Climate Change Denial”; and Oreskes and Conway, Merchants of Doubt.

39 Dunlap and McCright, “Organized Climate Change Denial”; Harvey et al., “Internet Blogs, Polar Bears, and Climate-Change Denial by Proxy”; and Lewandowsky, “Climate Change, Disinformation, and How to Combat It.”

40 Thomas Marlow, Sean Miller, and J. Timmons Roberts, “Bots and Online Climate Discourses: Twitter Discourse on President Trump’s Announcement of U.S. Withdrawal from the Paris Agreement,” Climate Policy, 2021, https://ideas.repec.org/a/taf/tcpoxx/v21y2021i6p765-777.html.

41 Benjamin Franta, “Early Oil Industry Disinformation on Global Warming,” Environmental Politics 30, no. 4 (2021): 663–668, https://doi.org/10.1080/09644016.2020.1863703.

42 MacKay and Munro, “Information Warfare and New Organizational Landscapes.”

43 Naomi Oreskes and Erik M. Conway, “Defeating the Merchants of Doubt,” Nature 465 no. 7299 (2010): 686–8, https://pubmed.ncbi.nlm.nih.gov/20535183/.

44 Lewandowsky, “Climate Change, Disinformation, and How to Combat It.”

45 Jennifer Ellen Good, “The Framing of Climate Change in Canadian, American, and International Newspapers: A Media Propaganda Model Analysis,” Canadian Journal of Communication 33, no. 2 (2008): 233–255, https://cjc.utpjournals.press/doi/pdf/10.22230/cjc.2008v33n2a2017?download=true.

46 Christophe Jaffrelot and Gilles Verniers, “The BJP’s 2019 Election Campaign: Not Business as Usual,” Contemporary South Asia 28, no. 2 (2020): 155–177, https://www.tandfonline.com/doi/abs/10.1080/09584935.2020.1765985.

47 Seva Gunitsky, “Corrupting the Cyber-Commons: Social Media as a Tool of Autocratic Stability,” Perspectives on Politics 13, no. 1 (2015): 42–54, https://www.cambridge.org/core/journals/perspectives-on-politics/article/corrupting-the-cybercommons-social-media-as-a-tool-of-autocratic-stability/CD2CCFAB91935ED3E533B2CBB3F8A4F5.

48 Ualan Campbell-Smith and Samantha Bradshaw, Global Cyber Troops Country Profile: India (Oxford: Computational Propaganda Project, Oxford Internet Institute, 2019); Swati Chaturvedi, I Am a Troll: Inside the Secret World of the BJP’s Digital Army (New Delhi: Juggernaut Books, 2019); and Jaffrelot and Verniers, “The BJP’s 2019 Election Campaign.”

49 Jaffrelot and Verniers, “The BJP’s 2019 Election Campaign.”

50 Jacob Gursky, Katlyn Glover, Katie Joseff, Martin J. Riedl, Jimena Pinzon, Romi Geller, and Samuel C. Woolley, Encrypted Propaganda: Political Manipulation via Encrypted Messaging Apps in the United States, India, and Mexico (Austin, TX: Center for Media Engagement, University of Texas at Austin, 2020).

51 Divij Joshi, “India’s Electoral Laws Are Ill-Equipped to Deal with Digital Propaganda,” Wire, November 28, 2018, https://thewire.in/politics/indias-electoral-laws-are-ill-equipped-to-deal-with-digital-propaganda.

52 Samir Patil, “India Has a Public Health Crisis. It’s Called Fake News,” New York Times, April 29, 2019, https://www.nytimes.com/2019/04/29/opinion/india-elections-disinformation.html; and Snigdha Poonam and Samarth Bansal, “Misinformation Is Endangering India’s Election,” Atlantic, April 1, 2019, https://www.theatlantic.com/international/archive/2019/04/india-misinformation-election-fake-news/586123/.

53 Gursky et al., Encrypted Propaganda.

54 Junk news comes from “sources that deliberately publish misleading, deceptive, or incorrect information packaged as real news” (Samantha Bradshaw, Philip N. Howard, Bence Kollanyi, and Lisa-Maria Neudert, “Sourcing and Automation of Political News and Information on Social Media in the United States, 2016–2018,” Political Communication 37, no. 2 (2020): 173–193, https://doi.org/10.1080/10584609.2019.1663322).

55 Vidya Narayanan, Bence Kollanyi, Ruchi Hajela, Ankita Barthwal, Nahema Marchal, and Philip N. Howard, “News and Information Over Facebook and WhatsApp During the Indian Election Campaign,” Computational Propaganda Project, Oxford Internet Institute (2019), https://demtech.oii.ox.ac.uk/research/posts/news-and-information-over-facebook-and-whatsapp-during-the-indian-election-campaign/.

56 Jaffrelot and Verniers, “The BJP’s 2019 Election Campaign”; Billy Perrigo, “How Volunteers for India’s Ruling Party Are Using WhatsApp to Fuel Fake News Ahead of Elections,” Time, January 25, 2019, https://time.com/5512032/whatsapp-india-election-2019/; and Poonam and Bansal, “Misinformation Is Endangering India’s Election.”

57 Campbell-Smith and Bradshaw, Global Cyber Troops Country Profile: India.

58 Priyanjana Bengani, “India Had Its First ‘WhatsApp Election.’ We Have a Million Messages From It,” Columbia Journalism Review, October 16, 2019, https://www.cjr.org/tow_center/india-whatsapp-analysis-election-security.php.

59 Bengani, “India Had Its First ‘WhatsApp Election.’”

60 WhatsApp, “More Changes to Forwarding,” 2019, https://blog.whatsapp.com/more-changes-to-forwarding. These restrictions have since changed. WhatsApp was experimenting with increasing the group size to 1,024 people as of November 2022. WhatsApp, “Communities Now Available!” 2022, https://blog.whatsapp.com/communities-now-available?lang=en.

61 Jaffrelot and Verniers, “The BJP’s 2019 Election Campaign.”

62 Campbell-Smith and Bradshaw, Global Cyber Troops Country Profile: India; and Chaturvedi, I Am a Troll.

63 Maurice Jakesch, Kiran Garimella, Dean Eckles, and Mor Naaman, “Trend Alert: A Cross-Platform Organization Manipulated Twitter Trends in the Indian General Election,” Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2) (2021): 1–19, https://dl.acm.org/doi/10.1145/3479523.

64 Twiplomacy (@twiplomacy), “World leaders and their Fake followers,” Twitter post with image, February 21, 2018, 3:23 a.m., https://twitter.com/Twiplomacy/status/966226775683534848.

65 Campbell-Smith and Bradshaw, Global Cyber Troops Country Profile: India.

66 Nathaniel Gleicher, “Removing Coordinated Inauthentic Behavior and Spam From India and Pakistan,” Facebook, April 2019, https://about.fb.com/news/2019/04/cib-and-spam-from-india-pakistan/.

67 Aria Thaker, “Most of the Pages Facebook Purged Recently Were Pro-Congress — but the BJP Took a Much Bigger Hit,” Quartz, April 2, 2019, https://qz.com/india/1585087/facebooks-fake-pages-purge-hits-modis-bjp-harder-than-congress.

68 Thaker, “Most of the Pages Facebook Purged Recently Were Pro-Congress.”

69 Alicia Wanless, “One Strategy Democracies Should Use to Counter Disinformation,” Carnegie Endowment for International Peace, March 28, 2022, https://carnegieendowment.org/2022/03/28/one-strategy-democracies-should-use-to-counter-disinformation-pub-86734.

70 Jacob N. Shapiro and Alicia Wanless, “Why Are Authoritarians Framing International Approaches to Disinformation?” Lawfare, December 28, 2021, https://www.lawfaremedia.org/article/why-are-authoritarians-framing-international-approaches-disinformation.

71 “Global State of Democracy Initiative,” International IDEA, accessed July 25, 2023, https://www.idea.int/gsod/.