Political campaigns across the United States are paying close attention to the rising political importance of Asian American communities. The logic is simple: these populations are growing and therefore gaining civic clout. Nowhere is this clearer than in Texas. In Houston, Asian Americans are the fastest-growing major demographic group throughout the sprawling suburbs. In Fort Bend County, southwest of the city, the community is made up primarily of Indian and Chinese Americans. In North Texas, Asian populations have grown rapidly over the last decade, by 90% in Irving and by 53% in Tarrant County.
Campaigns are well aware that they need to meet these potential voters where they are, which means moving persuasion efforts to platforms popular among particular demographics. For instance: WhatsApp (by far the most popular chat app in India and, hence, among Indian diaspora communities) and WeChat (the most popular Chinese instant messaging app).
In 2020, these apps were discussed in light of the distinct communicative role they played in political conversations within these communities — with specific attention paid to worrisome flows of false news and misinformation. However, the political importance and the informational challenges of these encrypted messaging apps (EMAs) are still barely understood due to the closed or private nature of these platforms. The Propaganda Lab at the Center for Media Engagement has spent the last several months talking to members of the Indian American and Chinese American communities across Texas in order to better understand how and why political talk — specifically mis- and disinformation — proliferates in such forums. We found that while WhatsApp and WeChat are among the most important platforms for these communities’ political discussions and decisions, too little attention is paid to the virulent falsehoods that are also spread in these spaces.
Messaging apps may be private, but they are still prone to misinformation
What sets WhatsApp and WeChat apart from social media sites like Facebook and Twitter is that they offer private spaces for communication. WhatsApp is end-to-end encrypted (E2EE), meaning messages sent between users cannot be read by the platform itself or by any third party. WeChat, on the other hand, is transport encrypted: messages are protected from third parties in transit, but the platform itself can access them. Networks of people can connect in group chats of up to 512 individuals on WhatsApp and 500 on WeChat. These chats range from small family or friend groups to larger community or neighborhood groups.
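For readers who want a more concrete sense of that distinction, here is a minimal conceptual sketch, written in Python purely for illustration. It is our own simplification, not either platform’s actual cryptographic protocol, and the function names and data structures are invented for this example; it only shows who holds the keys needed to read a message under each model.

```python
# Conceptual sketch only: who can read a message under end-to-end encryption
# versus transport encryption. Not a real protocol implementation.

from dataclasses import dataclass


@dataclass
class Message:
    ciphertext: str
    decryptable_by: set  # parties holding the keys needed to recover the plaintext


def send_e2ee(sender: str, recipient: str, text: str) -> Message:
    # End-to-end encryption (WhatsApp's model): only the sender's and
    # recipient's devices hold decryption keys, so the platform relays
    # ciphertext it cannot read.
    return Message(ciphertext=f"<encrypted:{text}>",
                   decryptable_by={sender, recipient})


def send_transport_encrypted(sender: str, recipient: str, text: str,
                             platform: str) -> Message:
    # Transport encryption (WeChat's model): the connection is shielded from
    # third parties in transit, but the platform itself can access the content.
    return Message(ciphertext=f"<encrypted:{text}>",
                   decryptable_by={sender, recipient, platform})


if __name__ == "__main__":
    e2ee_msg = send_e2ee("alice", "bob", "voting info")
    te_msg = send_transport_encrypted("alice", "bob", "voting info", platform="server")
    print("platform can read E2EE message:", "server" in e2ee_msg.decryptable_by)   # False
    print("platform can read transport-encrypted message:", "server" in te_msg.decryptable_by)  # True
```

The practical upshot is the one that matters for this report: under end-to-end encryption the platform cannot inspect message content at all, while under transport encryption it can, at least in principle. Neither arrangement exposes these conversations to outside fact-checkers.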
Information spreads widely on these platforms through a combination of broadcasting by Official Accounts (OAs) on WeChat and forwarding features that let users easily share messages between chats. WeChat, popular both in China and among Chinese diaspora communities, has over 1.48 million monthly active users in the U.S. Uniquely, it allows content to spread virally through public pages called OAs, which let various “verified” media outlets disseminate news in a format similar to a newsfeed. Many news outlets based in U.S. cities that are home to substantial Chinese populations use OAs to disseminate local news, and several of our interviewees said these OAs are important sources of news for them. However, political influencers and politically motivated groups have capitalized on the ease of information sharing on WeChat via OAs: in 2016, a pro-Trump OA, “The Chinese Voice of America,” amassed over 32,000 followers within months of its creation. Problematically, this allows propaganda and misleading content to appear more legitimate and to spread widely.
Similarly, on WhatsApp, information can be shared through broadcast lists and an easy-to-use forwarding function that allows any message to be passed along to multiple chats. Forwarding on chat apps is a common practice in the Asian American community. For some, particularly members of the immigrant community, forwarding features on EMAs serve as a point of connection with family back home. Zara, a student from the Indian American community in Houston, noted that her parents’ extensive political activity on WhatsApp stems from guilt over having immigrated to the U.S.
“[My parents] forward things because they want to feel like they’re doing something,” Zara said. “Them trying to send these news articles is their way of trying to give back and help their community that they left.”
However, when the mass-forwarded information is false, the spread of misinformation compounds.
“Half of the WhatsApp [content] — whatever you’re forwarding — it gets convoluted through the process — it’s like playing a game of telephone almost — in the end they’re not always getting what could be factual,” said Steven, a member of the Indian American community in Katy.
Although WhatsApp has implemented forwarding limits to address the spread of misinformation via mass forwarding, our interviewees note that they still consistently receive false information. The issue is so widespread that younger generations have even coined the term “WhatsApp degree” to describe their parents’ tendency to confidently share false information from WhatsApp.
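To illustrate why such forwarding limits only partly address the problem, the toy simulation below models how a message travels through overlapping group chats. It is our own back-of-the-envelope sketch with invented parameters, not WhatsApp data and not the platform’s real forwarding rules; it only captures the basic dynamic that capping how many chats each person can forward a message to slows its spread without stopping it.

```python
# Toy simulation (illustrative only): spread of a forwarded message through
# overlapping group chats, with and without a per-user forwarding cap.

import random


def simulate_spread(num_users=10_000, groups_per_user=3, group_size=50,
                    forward_limit=None, forward_prob=0.3, seed=1):
    """Return how many users have seen the message once spread dies out.

    Each user belongs to a few random group chats. A user who sees the message
    forwards it with probability `forward_prob`, to at most `forward_limit`
    of their groups (None = no cap).
    """
    random.seed(seed)
    num_groups = num_users * groups_per_user // group_size
    membership = {u: random.sample(range(num_groups), groups_per_user)
                  for u in range(num_users)}
    groups = {g: [] for g in range(num_groups)}
    for u, gs in membership.items():
        for g in gs:
            groups[g].append(u)

    seen = set(groups[0])        # the message starts in one group chat
    frontier = list(groups[0])
    while frontier:
        next_frontier = []
        for user in frontier:
            if random.random() > forward_prob:
                continue          # this user never forwards the message
            targets = membership[user]
            if forward_limit is not None:
                targets = targets[:forward_limit]
            for g in targets:
                for other in groups[g]:
                    if other not in seen:
                        seen.add(other)
                        next_frontier.append(other)
        frontier = next_frontier
    return len(seen)


if __name__ == "__main__":
    print("no forward cap:", simulate_spread(forward_limit=None))
    print("cap of 1 chat :", simulate_spread(forward_limit=1))
```

Depending on the parameters, a strict cap can noticeably shrink a message’s reach, yet motivated sharers still carry it from group to group, which echoes what interviewees told us: the limits help, but false information keeps arriving.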
Increased trust = increased potency of propaganda and falsehood
These messaging apps fuse the intimacy of private messaging with community-connecting features, creating an environment that is both viral-friendly and sequestered from fact-checking operations. Disinformation that spreads through networks on WeChat and WhatsApp can become even more powerful because such groups are often made up of close family and friends, and information from people we are close to is more likely to be trusted.
“[Community members] trust [information from EMAs] because their friends send it to them — they keep spreading and spreading… and no one is doing fact-checking.” J. Chen (pseudonym)
The concept of relational organizing offers an explanation for why messengers on these platforms are given increased credibility. Political consultants have long found success with relational organizing, a campaign approach that harnesses personal networks to mobilize a community. While political discussions on chat apps within these diaspora communities are not coordinated relational organizing campaigns, the role trust plays in making information persuasive should not be underestimated.
Our research shows that this same dynamic of trusted personal ties drives the spread of false information. Ultimately, this expansive private ecosystem of trusted networks is what makes dubious information so difficult to control on EMAs.
“If I get something from my friends, I will obviously believe it more because I trust my friends…it’s a lot more believable.” Pranav (pseudonym)
Political campaigning enters non-political groups
Our team observed a trend of political conversation infiltrating non-political community groups. Within the Asian American community specifically, EMAs are often used as a tool to connect members to helpful local resources. Parents, for example, use these chats to discuss their children’s education: SAT prep classes, college admissions information, tutoring services, and the like. According to our interviewees, these chats are designed to connect like-minded parents who share a concern for their children’s wellbeing.
At the same time, political and social justice conversations infiltrate non-political community chats when parents see their children’s well-being as being at stake. For instance, during the Black Lives Matter protests in response to the killing of George Floyd, Jeffrey, a member of the Chinese American community in Houston, reported that biased and misleading articles about alleged violence by Black people against Asian people went viral in these Chinese American parent WeChat groups. “It was really blatant what the effects were and what the intentions were behind that,” Jeffrey said. “It was a collective effort…Asian American [parents, specifically moms] were so adamant…about making sure their kids were not interacting with Black people in that period of time.” In those instances, hateful and polarizing misinformation came from domestic actors first, but as the topic went viral, foreign actors capitalized on it, spreading similar messages about the American situation from abroad.
Another interviewee, J. Chen, a member of the Chinese American community in Houston, noted how the dissemination of anti-transgender articles in local neighborhood WeChat groups served as a catalyst for urging community members to vote for certain local policies. We found that information in these chats often relies on fear-mongering tactics that play on a specific community value or concern, which makes it all the more manipulative.
As elections approach, WeChat and WhatsApp are on the brink of turning into divisive platforms
J. Chen said that the spread of false information on WeChat, along with a general tendency among members not to fact-check dubious information, has negative ramifications for communities both online and offline. For instance, during voting cycles, people within these groups will put out persuasive content on WeChat telling others to vote for a certain candidate. “A lot of people don’t know who to vote for, so they are easily swayed,” J. Chen noted. “It is a big political platform, and a lot of people will just blindly follow.”
The implications of this, in the face of dis- and misinformation about particular candidates or causes, are worrisome. For example, numerous posts alleging that Joe Biden had ties to socialism and communism spread over WhatsApp and other media platforms among the Cuban American diaspora in Florida. False content like this can be especially damaging given the social setting: claims about communism carry particular weight for Cubans who have immigrated to this country.
WeChat and WhatsApp will be crucial for the midterms
Given the proximity of the midterm elections and the importance of these digital spaces as both political forums and vectors of false information, we asked our interviewees about their expectations for election-related information on chat apps. Unsurprisingly, perhaps, they consistently expected election-related information, and misinformation, to increase as the midterms draw closer: “there’s an influx [of political information]…when something actually is happening and people are voting,” said Maggie (pseudonym), a member of the Indian American community in Houston.
This signals a need to keep a finger on the pulse of particular streams of false or misleading content circulating over EMAs, particularly streams that concern how, when, or where to vote, or that contain harassment directed at particular groups. Bluntly, we cannot afford to continue ignoring mis- and disinformation in private messaging spaces. Broadly, it is detrimental to the democratic process in the U.S. Specifically, it disinforms and disenfranchises minority groups already facing barriers to political participation.
Interviewee Jeffrey says the solution to these informational problems in closed ecosystems might involve conversations among community members aimed at interrupting falsehoods and potential political echo chambers. “Regarding resolving misinformation in Asian American communities…I think the best thing to do is for young people — and people who don’t use these platforms like WeChat — to counteract it by having conversations and bringing in dialogue from other spaces.” Investments in culturally and linguistically relevant media literacy efforts in these communities, as well as in the civil society groups already helping them become more civically empowered, would also be sound long-term ways of combatting mis- and disinformation, both online and off. A technological fix, like WhatsApp’s limit on the number of times a post can be forwarded, is likely to prove limited compared to educational, community-based solutions.