Many social media users rely on their feeds for news, but those feeds are sometimes flooded with disinformation from people and organizations that disguise their messages as ordinary grassroots opinion, a practice known as "astroturfing."
Tanu Mitra, an assistant professor at the Information School, has been increasingly interested in these disinformation tactics.
“Astroturfing is essentially polluting the existing information environment with deceptive messages as if they were coming from genuine ordinary citizens,” Mitra said. “Censorship, on the other hand, involves selectively removing content to shape narratives. When these two vectors of influence operations act in parallel, they can be highly effective in adversely influencing public opinion.”
She was recently awarded a $245,366 grant from the U.S. Department of Defense (DoD) for a research project that will investigate how these methods of deception affect people’s perception of political events. The project is titled “Models and Methods for Understanding Covert Online Influence.”
Mitra’s interest in the project began when she learned about people in Hong Kong who were receiving false information about protests against a proposed law that would allow suspects in Hong Kong, including foreign nationals, to be extradited to mainland China for trial.
In her grant proposal, she mentioned the rise of the “Fifty Cent Party,” which notoriously posts comments on social media platforms in favor of the Chinese government. She is particularly curious about the effect of astroturfing, which creates the appearance of widespread support, on remote islands in Southeast Asia, a region where people have little say in political affairs imposed by mainland China.
“There is a lot of work done in the U.S. context, but not so much done in the Southeast Asian region,” Mitra said.
In investigating the social effects of astroturfing, the work will also aim to connect online and offline activities.
“When we navigate between the online and offline world, can we actually see some gaps that are not captured in the offline data?” said Mitra, who has been researching how online communities use false information to sway public opinion for the past several years.
Mitra will partner with the College of Information and Cyberspace of the National Defense University (NDU). Together, the two institutions will combine the social and technical aspects of the research: the iSchool will lead the data collection and visualization, while the NDU team will lead the field research and military education.
Integrating academic research and military education will be a new experience, Mitra added, but she is excited about how the research will be received by service members.
Alongside Mitra, Shruti Phadke, an iSchool Ph.D. student, will host lectures for U.S. military personnel at NDU to help them understand basic concepts in computer science and data analytics. These lectures, Mitra said, will help service members be more aware of misleading online information when they serve abroad, especially in regions without a free press.
“Since we’re talking to folks who don’t understand data analysis like we do, [Phadke] has done a really good job figuring out how to translate complex theories in these lectures,” Mitra said.
In addition to her DoD grant, Mitra was recently awarded a $99,707 Fact-Checking Innovation Initiative grant from the Poynter Institute, of which $54,707 will come to the iSchool. She will use the grant, in collaboration with PesaCheck, to study misinformation campaigns in Africa.