Testimony: “A Growing Threat: The Impact of Disinformation Targeted at Communities of Color”

 

TESTIMONY OF SAMUEL WOOLLEY, PHD

Program Director, Propaganda Research Lab, Center for Media Engagement
Assistant Professor, School of Journalism & Media
The University of Texas at Austin

-Hearing on-
A Growing Threat: The Impact of Disinformation Targeted at Communities of Color

Introduction

Good morning, Chairman Butterfield, Ranking Member Steil, and members of the subcommittee. Thank you for the opportunity to testify on the impact of disinformation targeted at communities of color. I would also like to thank the other witnesses providing testimony during this hearing, as well as my colleagues and collaborators at the University of Texas at Austin.

I lead the Propaganda Research Lab at UT Austin’s Center for Media Engagement (CME). Our team at CME works across multiple fields to help foster and maintain “a vibrant American media ecosystem that more effectively empowers the public to understand, appreciate, and participate in the democratic exchange of ideas.”1 The Propaganda Research Lab specializes in the study of how emerging media technologies–from social media to virtual reality–are leveraged in efforts to manipulate public opinion. Crucially, we work to develop research-based, technological, and policy-based solutions to this growing informational problem. During my career I have focused on conducting empirical research into how internet-based platforms such as Facebook, YouTube, and WhatsApp are used to spread mis- and disinformation, coordinated political trolling campaigns, and other forms of malign influence. In particular, I concentrate on computational propaganda2: how online automation (often in the form of software-driven “bots”), algorithms, and other digital tools are used to amplify and suppress various streams of propaganda.

In the last five years, I have begun focusing on the human consequences of computational propaganda3. Rather than seeing growing issues like disinformation as technological problems with technological solutions, I view them as deeply rooted social problems magnified by new communication tools. My team focuses on how communities of color and diaspora communities–groups crucial to the success of a vibrant U.S. democracy–are specifically targeted with malicious online influence campaigns during elections. Our work centers the perspectives and experiences of these communities and their own sense-making and evaluations of contemporary informational problems.

Just last month, the Propaganda Research Lab published the paper “Escaping the Mainstream? Pitfalls and Opportunities of Encrypted Messaging Apps and Diaspora Communities in the U.S.”4, which I have attached to my written testimony. In this paper we explore the ways diaspora communities use apps like WhatsApp and Telegram to communicate and organize democratically, but also how they are targeted with disinformation and other forms of political manipulation. For the purposes of this work, we use the term “diaspora communities” to include people who told us that they regularly use encrypted messaging apps (EMAs) to communicate with people in their country of origin or where their family is from, with individuals who share their cultural context, and with people living in the U.S. who identify with the same community. While this approach risks over-including individuals who are not part of identical communities, the connecting thread for our research is the usage of these apps. Our definition also aims to avoid the dominant rationale for the homogenization of people from non-white communities in the U.S. Instead of basing our study on sociologically deterministic inclusion/exclusion criteria, we follow well-established conceptualizations that subvert the determinist scheme in which any nation is portrayed as a product of specific sociological conditions; instead, the nation is an “imagined political community.”5

This work informs a recently initiated analysis of electoral disinformation and propaganda spread during the 2022 midterms, supported by a new grant from the John S. and James L. Knight Foundation: “Addressing Disinformation Campaigns Against Diaspora Communities on Encrypted Messaging Applications.”6 Finally, I am working on the second paper of a two-part report commissioned by the group Protect Democracy; the first paper7 in this series is attached to this testimony. This project focuses on building research-based understandings of how communities of color in Arizona, Georgia, and Wisconsin are contending with electoral disinformation and propaganda. I offer preliminary results of the forthcoming original study (part two of the report) in the following pages.

Today, I will focus on four key points drawn from these–and several other–Propaganda Research Lab studies:

  1. Encrypted messaging apps are critical vectors for false and misleading information: Chat apps are important platforms for democratic communication, and particularly so among diaspora communities. Their particularities and design, however, make them attractive targets for false and misleading information, and difficult venues for interventions by platforms and governments.
  2. Minority groups are targeted by unique strains of propaganda: Propagandists draft specific content, and employ specific tactics, to influence minority groups. These targeted messages undermine our democracy by working to alienate and disengage minority groups. The tactics involved, which are often transnational, challenge content moderation and fact-checking efforts and regularly rely on cross-platform communication.
  3. COVID-19 dis- and misinformation affecting communities of color: For a range of reasons, including historical injustices around medical research, racism inherent in the medical system, corresponding distrust of vaccination, and the promotion of alternative treatments on social media, false information about COVID-19 severely affects communities of color. Because COVID-19 has become a highly politicized issue, it also intersects with and permeates election-related dis- and misinformation.
  4. Structural factors of our information environment inhibit the democratic inclusion of communities of color: These structural issues are tied to long-term efforts to control minority groups’ access to, and understanding of, the country’s electoral and media systems. Online attempts to undermine our democracy by thwarting minority groups’ democratic participation are concerning developments, and they are exacerbated by existing structures of disadvantage and by offline propaganda.

I want to be very clear on what I mean when I say “electoral disinformation”: false content purposely spread during elections with the intent to disinform, demobilize, and/or disenfranchise voters and those involved in other forms of civic engagement. “Electoral propaganda” is a broader term that encapsulates other intentional efforts to manipulate public opinion through illicit means ranging from the purposeful spread of false content–disinformation–to politically motivated harassment and trolling.

KEY POINT ONE

Encrypted messaging apps are critical vectors for false and misleading information

Over the last several years, our research at the Center for Media Engagement’s Propaganda Research Lab has focused on platforms that do not fall squarely within what is conventionally understood as social media. We specifically investigate (often encrypted) chat and messaging apps such as WhatsApp, Telegram, and Signal. Some of these apps rose to prominence after January 6, 2021, when the social media platform Parler was deplatformed and far-right groups congregated on the chat app Telegram soon thereafter.8 Others, such as WhatsApp, are not as widely used across the U.S. general population as they are in other countries, but they are used disproportionately among demographic groups defined by intersecting identities of race/ethnicity and membership in diaspora communities.

For example, according to data from the Pew Research Center, 42% of Hispanic Americans and 24% of Black Americans use WhatsApp, compared with 13% of white Americans.9 Our own survey research, representative of the U.S. population, confirms this pattern and underscores that WhatsApp is used more as a platform for political discussion among Hispanic Americans, Asian Americans, Black Americans, and those identifying with two or more racial/ethnic categories than it is among non-Hispanic whites.10

In our research, we investigate and describe central mechanisms and defining features of encrypted messaging apps. These apps are, in many cases, embedded in everyday life and constitute critical communicative infrastructures11 among networks of trusted connections, coworkers, friends, and family. They are important and valuable civic spaces, yet nefarious actors can exploit their particular features to fly under the radar and promote false and misleading information. It is precisely when propagandists are able to penetrate these spaces–defined by trust, close relationships, and authenticity–that their influence operations become most damaging.

One mechanism that is particularly important when discussing the role of encrypted chat apps–and that my team finds across many such apps–is what we call cascade logic. By this, we mean the ways in which encrypted chat and messaging apps serve as points through which information is either “trafficked upstream (making its way from private conversations into the mainstream),” or “downstream (allowing information to withdraw from the public eye).”12 What is important here is that as information moves through chat apps, it can easily get “distorted, decontextualized, and thereby, transport false information.”13 Chat apps often provide both very public and very private spaces: many of them allow large groups of people to convene in chats or broadcast lists, while also allowing a great deal of conversation in very small, private groups.
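To make the cascade dynamic concrete, the brief sketch below models a single claim hopping from a small private chat toward more public venues. The tier names, message fields, and context-stripping rule are hypothetical illustrations of the general pattern we describe, not a reconstruction of any platform’s actual behavior.

```python
# Toy model of "cascade logic": a claim moves "upstream" from a private chat
# toward more public venues, shedding provenance at each hop. The venues,
# fields, and stripping rule below are hypothetical, for illustration only.
from dataclasses import dataclass, field, replace

@dataclass(frozen=True)
class Message:
    claim: str
    context: dict = field(default_factory=dict)  # provenance that travels with the claim

def forward(msg: Message, keep: set) -> Message:
    """Forwarding keeps the claim but only the context fields the next venue preserves."""
    return replace(msg, context={k: v for k, v in msg.context.items() if k in keep})

original = Message(
    claim="Polling hours have changed",
    context={"author": "local organizer", "date": "2020-10-30", "applies_to": "one county only"},
)

# Hypothetical upstream path: private family chat -> large broadcast list -> public post.
upstream = [
    ("private family chat", {"author", "date", "applies_to"}),
    ("large broadcast list", {"date"}),   # forwarded without attribution
    ("public social media post", set()),  # the claim survives, the context does not
]

msg = original
for venue, preserved_fields in upstream:
    msg = forward(msg, preserved_fields)
    print(f"{venue}: '{msg.claim}' | remaining context: {msg.context}")
```

The point of the toy model is simply that the claim itself survives each forward, while the provenance a recipient would need to evaluate it falls away.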

As our interviews with producers and trackers of propaganda in the United States, India, and Mexico show, encrypted chat apps are a particularly relevant vector for false information globally and transnationally. In these spaces, key informational context can easily be lost when messages are forwarded. Our research also shows how chat apps can be venues for cross-platform coordination, meaning that the manipulation and seeding of content on other platforms is coordinated in the privacy of encrypted chat apps. For instance, white nationalist interviewees in the U.S. described how they had used certain apps for backstage coordination of campaigns intended to seed false information on more public social media platforms, information that might ultimately make its way into legacy media.13

KEY POINT TWO

Minority groups are targeted by unique strains of propaganda

In our research we found that minority groups are targeted by unique strains of propaganda. In other words, specific tactics as well as themes are tailored toward deceiving sub-populations of the American electorate.14 Misinformation targeting minority communities–and particularly communities of color–as a form of voter suppression is not new; as the most recent State of Black America report notes, misinformation is one strategy, alongside gerrymandering, voter suppression, and intimidation, used to exclude minority communities and communities of color from electoral participation.15 False information targeting minority populations in the United States carries two concerning implications: one short-term and one long-term. The short-term implication is the disenfranchisement of these populations, which jeopardizes their voice in imminent political decisions;16 the long-term implication is the erosion of trust in a democratic system17 whose struggles are already defined by histories of exclusion and marginalization.18

In our research, we identified four main themes within contemporary propagandistic messages aimed at diaspora communities in the U.S.: first, the sowing of confusion via translational ambiguities; second, the leveraging of falsehoods to redraw ideological fault lines; third, the use of religion to sow doubt about candidates’ views; and finally, the oversimplification of complex perspectives, policies, and procedures to alter voting decisions.19 Recent research in this space points out how the impact and legacy of imperialism affect some diaspora communities and their respective information infrastructures and media systems, and how language and (mis)translation can play key roles in the spread of false and misleading information.20

To make these themes more tangible, I would like to provide a few examples. During the 2020 presidential campaign, various open and more private social media sites hosted content decrying Joe Biden, then the presidential candidate, as a socialist, or at least as being favored by Latin American socialists.21 Allegedly, Nicolás Maduro’s socialist party in Venezuela was in favor of Biden; allegedly, Joe Biden was a force of creeping socialism in the U.S. While large parts of the American population are likely to disapprove of this messaging, such invocations raise graver concerns among people who have fled regimes such as those in Venezuela and Cuba, or who were brought up with harrowing stories of living under (partial) socialism, as is true of some Colombian diaspora communities.

Another example is propaganda that criticized Joe Biden for being “anti-Catholic.”22 Many of these messages seemed to target not Joe Biden per se, unsurprising given that he is Catholic himself, but the Democratic Party and its stance on abortion policy. For instance, videos were shared with the caption “You cannot be a Catholic and be a Democrat,” elaborating on the conviction that “no Catholic can be aligned with the Democratic Party.” Again, this messaging plays on identity-defining characteristics for some Hispanic voters who are Catholic. It is also important to note that Hispanic/Latino/Latinx voters are not a monolithic group within the electorate,23 and generalizations across subpopulations should be avoided.24

In terms of the specific tactics propagandists tailor toward diaspora communities in the U.S., two dynamics are prevalent: the propaganda’s transnational character and its reliance on intimacy25 and trust within these communities.26

Disinformation has proven borderless, as content is often created in one country with the intent to influence communities in another.27 In other instances, while the messages may have originated or been produced inside the U.S., their distribution tactics were borrowed from popular forms of sharing information in a given target community–for instance, distribution might rely on audio and video messages in addition to text, in part because audio and video are more accessible for some people.28 These tactics challenge content moderation and fact-checking efforts and often rely on cross-platform communication; for example, videos first appeared on YouTube, then were shared on WhatsApp and forwarded to different WhatsApp groups.

KEY POINT THREE

COVID-19 dis- and misinformation affecting communities of color

The informational problems associated with COVID-19 have been referred to as an ‘infodemic’ by the World Health Organization, a term meant to describe “an over-abundance of information — some accurate and some not.”29 In July 2021, President Joe Biden said that social media platforms are ‘killing people,’ referring to how false information related to the pandemic spread through social media.30 Research published in Nature Human Behaviour shows that exposure to false information about COVID-19 can negatively affect people’s intent to get vaccinated.31

False information about COVID-19 affects communities of color severely and disproportionately, for a range of reasons. Many of these reasons are systemic, rooted in historical injustices around medical research, racism inherent in the medical system, corresponding distrust of vaccination, unequal access to health care, and the prevalence and promotion of alternative treatments, to name but a few. For example, a study released by researchers at UCLA found that members of racial and ethnic minority communities in Los Angeles County were influenced by knowledge of historical injustices when making decisions about vaccination.32 In a recent report, First Draft News emphasized that vaccine skepticism among Hispanic/Latino/Latinx Americans is connected to medical exploitation and discrimination, such as the sterilization of Puerto Rican women and of Mexican American women in California during eugenics programs,33 and that among Black Americans it is connected to structural inequities, exploitation, and medical racism.34

This matters in political and election-related contexts. The COVID-19 pandemic has illustrated how hate speech and misinformation are connected, and how they create disparate effects for communities of color. President Donald Trump is notorious for having sown anti-Asian sentiment during his tenure, such as by calling COVID-19 the “Chinese virus”35 or by referring to it as “Kung Flu.”36 Empirical research has found an increase in the spread of hateful and false information about COVID-19 following President Trump’s hateful tweets.37 COVID-19, along with corresponding mask and vaccination policies, has become a highly polarized partisan issue. COVID-19 therefore constitutes a noteworthy vector of election-related mis- and disinformation.

While our research with diaspora and immigrant community leaders in the U.S. was not primarily focused on COVID-19, the pandemic still left its mark on these interviews. To provide one example, a participant who is a member of an organization focused on Indian Americans described an increase in false information about COVID-19 in 2021: “A lot surrounding incorrect information about masks and false coronavirus cures. More recently, Covid vaccine, or how to prevent Covid with different kinds of tea.” A former Democratic strategist, who is a member of several WhatsApp groups of Latinx communities, had also witnessed COVID-19 dis- and misinformation. According to this interviewee, COVID-19 group chats originally created to share information about the pandemic were sprinkled with political disinformation and conspiracy theories: “then, in between all of that [referring to community services and things such as food banks] someone places a conspiracy theory. It goes from religion to things on George Soros, to things about Biden, Obama, and Harris.”

KEY POINT FOUR

Structural factors of our information environment inhibit the democratic inclusion of communities of color

Linked to a previously mentioned theme, namely the oversimplification of complex perspectives, policies, and procedures to alter voting decisions, I would like to elaborate on my fourth and final main takeaway: that communities of color across the United States are more affected by structural issues related to the broader information environment than by purely online disinformation. These structural issues are connected to long-term efforts aimed at controlling communities of color’s access to, and understanding of, the country’s electoral and media systems.

Research for the organization Protect Democracy by Mark Kumleben, Katie Joseff, and me on three U.S. battleground states–Arizona, Georgia, and Wisconsin–finds that communities of color are more challenged by structural disinformation surrounding our elections than by disinformation that originates online with the primary intention of radicalizing recipients or sowing conspiracy.38 In this research, we paid particular attention to interviewees’ concerns about the forthcoming 2022 U.S. midterm elections. While many organizations and individuals have flocked to the study of digital disinformation because they see it as a technological problem with technological solutions, our research underscores that both disinformation and propaganda are social and cultural problems first. Interviewees–primarily community organizers representing various communities of color in these states–told us again and again that a great deal of the content circulating among their groups about how, when, or where to vote, as well as about recent, planned, or possible changes to legislation, is disinformative. Some of it even originates from official state sources.

These efforts to manipulate public opinion are amplified and strengthened via various affordances of new media, but they are rooted in a history in which powerful groups have exerted continuous control over both political franchise and the communication ecosystem.39

During the 2020 election, people across the United States faced a barrage of deceptive and divisive information related to that year’s highly contentious election cycle. In addition to false content about candidates, people were also targeted with patently untrue information about electoral processes.40 For example, in Arizona, Hispanic American and Native American communities faced a cascade of untrue digital messaging on Twitter about the voting process.41 In Wisconsin, multiple communities of color from Madison to Milwaukee were targeted with lies about mail-in ballot fraud and ballot dumping.42

In online chat groups, false information was sometimes shared by group members located in other states, or forwarded from other WhatsApp groups, and presented as relevant to all voters even though it applied only in certain states. One community organizer we spoke to described how members of a national organization dedicated to mobilizing South Asian American voters for Democratic candidates, likely unintentionally, spread dangerous false information by sharing voting rules from California that did not apply in North Carolina. Another interviewee, the co-executive director of an independent political organization with members in more than 12 counties in Florida, cited various examples of relatively unsophisticated attempts at spreading false information, such as providing wrong information about where and when to vote; these, too, fed into the previously described dynamics of voter discouragement.

Research into disinformation targeting communities of color is often defined by arguments that disinformation originates online as part of a propaganda strategy designed to mislead or even radicalize its recipients.43 Contrary to this theme, our research indicates that the “online first” model of polarizing techno-propaganda is less relevant for communities of color. When disinformation seemed to be purposefully designed to deceive voters of color, interviewees told us it circulated mostly through offline channels–via deceptive mailers and misleading campaign advertising in print news, on TV, radio, and even billboards. On the whole, they said that communities of color are most harmed by lack of consistent access to accurate, trusted, local information and by the second-order effects of political disinformation, which create barriers to their participation in civic life.

Given the intricacies of voting, especially for first-time voters but also for returning voters, any seemingly suitable summary or allegedly helpful information is likely to be picked up as guidance within pre-established communities of trust, such as a group chat on an encrypted messaging app composed mostly of community members.44 In addition, some members of minority groups face offline risks related to elections, such as poll workers or individual voters of color encountering vicious and dangerous disinformation-inspired harassment. Voters across all communities are demoralized by general concerns about U.S. politics, structural barriers to participation, and the potential repercussions they face.

These online attempts to undermine our democracy by thwarting minority groups’ democratic participation are concerning developments, and they are exacerbated by structural disadvantages related to the broader information environment. Overall, we are at a crucial time for our democracy, as Richard Hasen points out in a recent Harvard Law Review essay detailing the risk of future election subversion in the U.S.45 Disruptive private action can prevent people from voting. We need to make sure these disruptive actions do not turn into democracy-threatening election sabotage and subversion.

SOLUTIONS

Today’s electoral propaganda is driven by a complex hybrid of political and commercial motivations. Countering it therefore requires a variety of efforts to prevent it, or at least to alleviate its impacts.

In order to be better prepared, we need short-, mid-, and long-term responses.46 For the short term, support for real-time fact-checking is vital.47 Importantly, researchers like Kiran Garimella have built a successful “WhatsApp Monitor” tool for electoral contests in Brazil and India. Tools like this are designed to work alongside the communities that use them, detecting viral false content early on and thereby facilitating fact-checking.48 Such efforts should be accompanied by bottom-up, community-centric, regionally and linguistically specific programs, because the utilization of familiar relationships is particularly relevant for communities of color and because the problem is much more than a technological one.49 This also means that accurate information about electoral processes must be available in a variety of languages,50 and there ought to be more civic support for communities where English is not the dominant language.
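To illustrate the general principle behind such monitoring efforts (and only the principle; this is not the actual WhatsApp Monitor), the sketch below flags claims that circulate unusually often within a short window so that they can be routed to human fact-checkers. The data source, the normalization step, and the threshold are assumptions made for illustration.

```python
# Hypothetical sketch of "flag what is going viral" logic for a community-facing
# monitoring tool. It is NOT the actual WhatsApp Monitor: the inputs (messages
# voluntarily shared by community members or tip lines), the normalization step,
# and the threshold of 25 are assumptions for illustration.
from collections import Counter
from datetime import datetime, timedelta

def normalize(text: str) -> str:
    """Collapse trivial variations so forwarded copies of the same claim match."""
    return " ".join(text.lower().split())

def flag_viral(messages, window=timedelta(hours=6), threshold=25):
    """Return claims seen at least `threshold` times within the most recent `window`."""
    cutoff = max(ts for ts, _ in messages) - window
    counts = Counter(normalize(text) for ts, text in messages if ts >= cutoff)
    return [claim for claim, n in counts.most_common() if n >= threshold]

if __name__ == "__main__":
    # Example: one claim is forwarded 30 times in half an hour, another only 5 times.
    now = datetime.now()
    sample = [(now - timedelta(minutes=i), "Vote by text to save time!") for i in range(30)]
    sample += [(now - timedelta(minutes=i), "Community food bank open Saturday") for i in range(5)]
    print(flag_viral(sample))  # only the heavily forwarded claim is surfaced
```

In practice, any such tool must work with content that community members voluntarily share, and flagged items still require review by fact-checkers with the relevant linguistic and cultural context.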

In the mid-term, we need to work harder to understand the significance of electoral propaganda for minority communities and communities of color in order to create more inclusive democracies. Policymakers should carry these insights into discussions about public trust, moving away from top-down models that derive from the points of view of people in power and instead incorporating insights from the broader electorate. Legislative discussions about regulating the tech sector and content moderation must include equitable representation from minority groups so that their experiences and opinions inform these discussions and any subsequent policy decisions.

The current crisis of public trust is evident among numerous demographics in the United States. Populist rhetoric has found fertile soil in many communities.51 Singling out communities of color should not be viewed as an effort to ‘educate’; rather, Congress and other entities must work to support, listen to, and learn from these communities so that they can better protect them from today’s deluge of disinformation and better enable them to engage civically. Without these communities’ equitable inclusion in the democratic process–through access to quality political information and through elections and all other means of civic engagement–U.S. democracy will deteriorate.

  1. Center for Media Engagement (2022). About us. https://mediaengagement.org/aboutus/ []
  2. Woolley, S. C., & Howard, P. N. (Eds.). (2018). Computational propaganda: Political parties, politicians, and political manipulation on social media. Oxford University Press. []
  3. Joseff, K., & Woolley, S. (2019). The human consequences of computational propaganda: Eight case studies from the 2018 U.S. midterm elections. Institute for the Future. https://www.iftf.org/fileadmin/user_upload/downloads/ourwork/IFTF_Executive_Summary_comp.prop_W_05.07.19_01.pdf []
  4. Trauthig, I. K., & Woolley, S. (2022, March). Escaping the mainstream? Pitfalls and opportunities of encrypted messaging apps and diaspora communities in the U.S. Center for Media Engagement. https://mediaengagement.org/research/encrypted-messaging-apps-and-diasporas []
  5. Anderson, B. (2006). Imagined communities: Reflections on the origin and spread of nationalism. Verso Books. []
  6. Knight Foundation (2022, April 7). Nine universities and nonprofits awarded more than $1.2 million from Knight Foundation to combat disinformation in communities of color. https://knightfoundation.org/press/releases/nine-universities-and-nonprofits-awarded-more-than-1-2-million-from-knight-foundation-to-combat-disinformation-on-communities-of-color/ []
  7. Woolley, S., & Kumleben, M. (2021, November). At the epicenter: Electoral propaganda in targeted communities of color. Protect Democracy. https://s3.documentcloud.org/documents/21099928/at-the-epicenter-electoral-propaganda-in-targeted-communities-of-color.pdf []
  8. DFRLab. (2021, February 11). Extremists on Telegram exploit Parler’s de-platforming to ramp up recruiting. Medium. https://medium.com/dfrlab/extremists-on-telegram-exploit-parlers-de-platforming-to-ramp-up-recruiting-eb1256227a5d []
  9. Perrin, A., & Anderson, M. (2019, April 10). Share of U.S. adults using social media, including Facebook, is mostly unchanged since 2018. Pew Research Center. https://www.pewresearch.org/fact-tank/2019/04/10/share-of-u-s-adults-using-social-media-including-facebook-is-mostly-unchanged-since-2018/ []
  10. Gursky, J., Riedl, M.J., & Woolley, S. (2021, March 19). The disinformation threat to diaspora communities in encrypted chat apps. Brookings Institution, Techstream. https://www.brookings.edu/techstream/the-disinformation-threat-to-diaspora-communities-in-encrypted-chat-apps/ []
  11. Cruz, E. G., & Harindranath, R. (2020). WhatsApp as “technology of life”: Reframing research agendas. First Monday, 25(1). https://doi.org/10.5210/fm.v25i12.10405; Matassi, M., Boczkowski, P. J., & Mitchelstein, E. (2019). Domesticating WhatsApp: Family, friends, work, and study in everyday communication. New Media & Society, 21(10), 2183-2200. https://doi.org/10.1177/1461444819841890 []
  12. Gursky, J., Riedl, M.J., Joseff, K., & Woolley, S. (forthcoming). Chat apps and cascade logic: A multi-platform perspective on India, Mexico, and the United States. Social Media + Society. https://doi.org/10.1177/20563051221094773 []
  13. Gursky, J., Riedl, M.J., Joseff, K., & Woolley, S. (forthcoming). Chat apps and cascade logic: A multi-platform perspective on India, Mexico, and the United States. Social Media + Society. https://doi.org/10.1177/20563051221094773 [][]
  14. Nguyen, S., Kuo, R., Reddi, M., Li, L., & Moran, R. E. (2022). Studying mis- and disinformation in Asian diasporic communities: The need for critical transnational research beyond Anglocentrism. Harvard Kennedy School (HKS) Misinformation Review, 3(2). https://doi.org/10.37016/mr-2020-9 []
  15. National Urban League (2022). Under siege: The plot to destroy democracy. https://soba.iamempowered.com/?_ga=2.237304519.691387762.1650656812-1380527237.1650656812 []
  16. Romero, L. (2020, October 20). Florida’s Latino voters being bombarded with right-wing misinformation, experts and advocates say. ABC News. https://abcnews.go.com/Politics/floridas-latino-voters-bombarded-wing-misinformation-advocates/story?id=73707056 []
  17. Bennett, W. L., & Livingston, S. (Eds.). (2020). The disinformation age: Politics, technology, and disruptive communication in the United States. Cambridge University Press. []
  18. Aguirre Jr, A., Rodriguez, E., & Simmers, J. K. (2011). The cultural production of Mexican identity in the United States: An examination of the Mexican threat narrative. Social Identities, 17(5), 695-707. []
  19. Trauthig, I. K., & Woolley, S. (2022, March). Escaping the mainstream? Pitfalls and opportunities of encrypted messaging apps and diaspora communities in the U.S. Center for Media Engagement. https://mediaengagement.org/research/encrypted-messaging-apps-and-diasporas []
  20. Nguyen, S., Kuo, R., Reddi, M., Li, L., & Moran, R. E. (2022). Studying mis- and disinformation in Asian diasporic communities: The need for critical transnational research beyond Anglocentrism. Harvard Kennedy School (HKS) Misinformation Review, 3(2). https://doi.org/10.37016/mr-2020-95 []
  21. Tankersley, J. (2020, October 14). Why Trump’s efforts to paint Biden as a socialist are not working. The New York Times. https://www.nytimes.com/2020/10/14/business/socialist-biden-trump.html []
  22. Gangitano, A. (2021, June 21). Move by Catholic bishops against Biden brings howls of hypocrisy. The Hill. Retrieved from https://thehill.com/homenews/administration/559491-move-by-catholic-bishops-against-biden-brings-howls-of-hypocrisy []
  23. EquisLabs (2021, April 1). 2020 Post-mortem (part one): Portrait of a persuadable Latino. https://equisresearch.medium.com/2020-post-mortem-part-one-16221adbd2f3 []
  24. Longoria, J., Acosta, D., Urbani, S., & Smith, R. (2021, December 8). A limiting lens: How vaccine misinformation has influenced Hispanic conversations online. First Draft News. https://firstdraftnews.org/long-form-article/covid19-vaccine-misinformation-hispanic-latinx-social-media/ []
  25. Turcotte, J., York, C., Irving, J., Scholl, R. M., & Pingree, R. J. (2015). News recommendations from social media opinion leaders: Effects on media trust and information seeking. Journal of Computer-Mediated Communication 20(5), 520-535. https://doi.org/10.1111/jcc4.12127 []
  26. Cunningham, S. (2001). Popular media as public ‘sphericules’ for diasporic communities. International Journal of Cultural Studies, 4(2). https://doi.org/10.1177/136787790100400201 []
  27. Trauthig, I. K., & Woolley, S. (March, 2022). Escaping the mainstream? Pitfalls and opportunities of encrypted messaging apps and diaspora communities in the U.S. Center for Media Engagement. https://mediaengagement.org/research/encrypted-messaging-apps-and-diasporas []
  28. Hobbs, R., & Frost, R. (2011). Measuring the acquisition of media-literacy skills. Reading Research Quarterly, 38(3), 330-355. https://doi.org/10.1598/RRQ.38.3.2 []
  29. World Health Organization (2020, February 2). Novel Coronavirus (2019-nCoV) Situation Report – 13. World Health Organization. Retrieved from https://www.who.int/docs/default-source/coronaviruse/situation-reports/20200202-sitrep-13-ncov-v3.pdf []
  30. Lee, E. (2021, July 16). ‘They’re killing people’: Biden points the finger at social media platforms for pandemic misinformation. USA Today. Retrieved from: https://www.usatoday.com/story/news/politics/2021/07/16/biden-social-media-killing-people-allowing-misinformation/7996725002/ []
  31. Loomba, S., de Figueiredo, A., Piatek, S.J., de Graaf, K., & Larson, H.J. (2021). Measuring the impact of COVID-19 vaccine misinformation on vaccination intent in the U.K. and U.S.A. Nature Human Behaviour, 5, 337-348. https://doi.org/10.1038/s41562-021-01056-1 []
  32. Carson, S. L., Casillas, A., Castellon-Lopez, Y., Mansfield, L. N., Barron, J., Ntekume, E., … & Brown, A. F. (2021). COVID-19 vaccine decision-making factors in racial and ethnic minority communities in Los Angeles, California. JAMA network open, 4(9), e2127582-e2127582. []
  33. Longoria, J., Acosta, D., Urbani, S., & Smith, R. (2021, December 8). A limiting lens: How vaccine misinformation has influenced Hispanic conversations online. First Draft News. https://firstdraftnews.org/long-form-article/covid19-vaccine-misinformation-hispanic-latinx-social-media/ []
  34. Dodson, K., Mason, J., & Smith, R. (2021, October 13). COVID-19 vaccine misinformation and narratives surrounding Black communities on social media. First Draft News. https://firstdraftnews.org/long-form-article/covid-19-vaccine-misinformation-black-communities/ []
  35. Salcedo, A. (2021, March 19). Racist anti-Asian hashtags spiked after Trump first tweeted ‘Chinese virus,’ study finds. The Washington Post. https://www.washingtonpost.com/nation/2021/03/19/trump-tweets-chinese-virus-racist/ []
  36. Ong, J. C. (2021, October 19). Online disinformation against AAPI communities during the COVID-19 pandemic. Carnegie Endowment for International Peace. https://carnegieendowment.org/2021/10/19/online-disinformation-against-aapi-communities-during-covid-19-pandemic-pub-85515 []
  37. Kim, J. Y., & Kesari, A. (2021). Misinformation and hate speech: The case of anti-Asian hate speech during the COVID-19 pandemic. Journal of Online Trust and Safety, 1(1). https://doi.org/10.54501/jots.v1i1.13 []
  38. Kumleben, M., Joseff, K., & Woolley, S. (forthcoming). Contending with structural disinformation in communities of color. Protect Democracy. []
  39. Kuo, R., & Marwick, A. (2021, August). Critical disinformation studies: History, power, and politics. Harvard Kennedy School (HKS) Misinformation Review, 2(4). https://doi.org/10.37016/mr-2020-76 []
  40. Austin, E. W., Borah, P., & Domgaard, S. (2021). COVID-19 disinformation and political engagement among communities of color: The role of media literacy. Harvard Kennedy School (HKS) Misinformation Review, 1(Special Issue on US Elections and Disinformation). https://doi.org/10.37016/mr-2020-58 []
  41. Quaranta, K. (2020, November 6). Social media disinformation tactics that attempted to deceive Arizonans in the 2020 elections. Cronkite News/Arizona PBS. https://cronkitenews.azpbs.org/2020/11/06/social-media-disinformation-tactics-tried-to-deceive-arizonans-elections-2020/; Kumleben, M., Joseff, K., & Woolley, S. (forthcoming). Contending with structural disinformation in communities of color. Protect Democracy. []
  42. Heim, M., & Litke, E. (2020, November 4). PolitiFact – Fact-checking the avalanche of Wisconsin election misinformation. Politifact. https://www.politifact.com/article/2020/nov/04/fact-checking-avalanche-wisconsin-election-misinfo/ []
  43. Mazzei, P., & Medina, J. (2020, October 21). False political news in Spanish pits Latino voters against Black Lives Matter. The New York Times. https://www.nytimes.com/2020/10/21/us/politics/spanish-election-2020-disinformation.html []
  44. Briggs, P., Burford, B., De Angeli, A., & Lynch, P. (2002). Trust in online advice. Social Science Computer Review, 20(3), 321-332. https://doi.org/10.1177/089443930202000309 []
  45. Hasen, R. (2022, April 20). Identifying and minimizing the risk of election subversion and stolen elections in the contemporary United States. Harvard Law Review, 135, 265-301. https://harvardlawreview.org/2022/04/identifying-and-minimizing-the-risk-of-election-subversion-and-stolen-elections-in-the-contemporary-united-states/ []
  46. Trauthig, I. K., & Woolley, S. (2022, March 23). Digital disinformation increasingly targets the most vulnerable. Centre for International Governance Innovation. https://www.cigionline.org/articles/digital-disinformation-increasingly-targets-the-most-vulnerable/ []
  47. Rossini, P., Stromer-Galley, J., Baptista, E. A., & de Oliveira, Vanessa Veiga (2020). Dysfunctional information sharing on WhatsApp and Facebook: The role of political talk, cross-cutting exposure and social corrections. New Media & Society, 23(8), 2430-2451. https://doi.org/10.1177/1461444820928059 []
  48. Reis, J. C. S., Melo, P., Garimella, K., & Benevenuto, F. (2020). Can WhatsApp benefit from debunked fact-checked stories to reduce misinformation? Harvard Kennedy School (HKS) Misinformation Review, 1(5). https://doi.org/10.37016/mr-2020-035 []
  49. Chang, E., & Zhang, S. (2020, December 11). It’s crucial to understand how misinformation flows through diaspora communities. First Draft News. https://firstdraftnews.org/articles/misinfo-chinese-diaspora/ []
  50. Nguyen, S., Kuo, R., Reddi, M., Li, L., & Moran, R. E. (2022). Studying mis- and disinformation in Asian diasporic communities: The need for critical transnational research beyond Anglocentrism. Harvard Kennedy School (HKS) Misinformation Review, 3(2). https://doi.org/10.37016/mr-2020-9 []
  51. Stier, M., & Freedman, T. (2022, March 1). Why democracy’s in such trouble: A crisis in public trust of government. Politico. https://www.politico.com/news/magazine/2022/03/01/democracy-public-crisis-trust-government-faith-00012565 []