Prerna Juneja, a Ph.D. student at the Information School, recently won a fellowship from the Social Science Research Council to support her studies on the role of algorithms in spreading misinformation.
The Social Data Research and Dissertation Fellowship program, offered by the Social Data Initiative and supported by the Omidyar Network, seeks to encourage multifaceted pathways for collecting and analyzing social data, with the larger aim of cultivating robust research on technology and society.
“Misinformation is rampant in online search spaces, and there isn't a better time to study the role search and recommendation algorithms employed by online platforms play in amplifying the misinformation that is presented to users,” she said.
Juneja transferred to the Information School this fall to follow Tanu Mitra, her longtime advisor and a new iSchool faculty member. She first met Mitra when she joined her lab as a first-year Ph.D. student at Virginia Tech.
As Mitra’s advisee, Juneja led multiple projects while also mentoring master’s and undergraduate student research assistants. Her first research project was an offshoot of a class project from a graduate-level Social Computing class she took with Mitra in fall 2018.
“I was thrilled to hear that Juneja won the very competitive fellowship,” Mitra said. “I am strongly convinced that she has everything that is needed to continue being an excellent scholar, an outstanding member of our research community who will make considerable contributions to the field.”
Earlier this month, Juneja had a paper published at the 23rd ACM Conference on Computer-Supported Cooperative Work and Social Computing. Along with Mitra, she investigated whether YouTube’s personalization algorithm (based on age, gender, geolocation or watch history) amplifies misinformation. They found that once users develop a watch history, their demographic information affects the amount of misinformation presented to them in search results and recommendations.
Further analysis revealed that video recommendations for all the topics they studied, except those focused on vaccine controversies, drove people into problematic echo chambers, leading to recommendations for more misinformative videos. For the vaccine topic, people who watched anti-vaccination videos were presented with less misinformation in their recommendations but more misinformation in their search results, compared with those who watched neutral or pro-vaccine videos.
Their findings suggest that YouTube might be modifying its search and recommendation algorithms selectively, handpicking topics that are highlighted by media reports and technology critics.