Informatics students help Google improve bug reporting

By Shanzay Shabi Thursday, June 6, 2024

Google, like many other companies, uses manual and automated processes to investigate security vulnerabilities (also known as bugs or threats) detected by users of its products and services. Amid the influx of user- and AI-generated bug reports, many submissions are duplicates, leaving 90% of bug reports unactionable.

Without adequate quality control, Google’s internal review team wastes time triaging reports of bugs that have already been identified, and users waste time writing and submitting those duplicate reports.

For their Capstone project at the Information School, five Informatics students (pictured, from left) Hitanshu Prajapati, Kyle Raychel, Eddy Peng, Harold Pham and Sami Foell partnered with Google to explore how AI could be used to create a deduplication tool that addresses this issue and improves the accuracy of bug reports.

“Essentially we’re streamlining the process of review for security threat reports and are hoping to increase efficiency for Google’s researchers and bug reporters,” said Pham, the back-end lead on the project. “A lot of what we’re doing right now is experimental and testing the capabilities of AI in the deduplication process.”

The Capstone team created the deduplication tool specifically for Google’s Android-focused Vulnerability Reward Program reporting form, which allows users to report a bug and potentially receive financial compensation. For example, multiple users might report the same authentication or authorization flaw in Google Cloud.

The students have built “Vedette,” an AI assistant, and integrated it into Google’s existing bug report form. Vedette compares each new report’s content against historical reports for similarity and provides analytics. The team’s goal for the assistant was to increase transparency and provide quantifiable metrics, improving the reporting process by automating manual analyses.
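The article does not describe Vedette’s internals, but the general idea of flagging similarity to historical reports can be illustrated with a short, hypothetical Python sketch. The TF-IDF representation, the cosine-similarity threshold and the flag_likely_duplicates helper below are assumptions chosen for illustration, not the team’s actual implementation.

    # Illustrative sketch only; the team's actual method is not public.
    # Assumes a simple TF-IDF + cosine-similarity approach to surface
    # historical reports that closely resemble a new submission.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    def flag_likely_duplicates(new_report, historical_reports, threshold=0.3):
        """Return (index, score) pairs for historical reports similar to new_report."""
        corpus = historical_reports + [new_report]
        vectors = TfidfVectorizer(stop_words="english").fit_transform(corpus)
        # Compare the new report (last row) against every historical report.
        scores = cosine_similarity(vectors[-1], vectors[:-1]).ravel()
        return [(i, float(s)) for i, s in enumerate(scores) if s >= threshold]

    # Example: a new submission checked against two earlier reports.
    history = [
        "Authentication bypass in login flow allows token reuse",
        "Crash when opening a malformed APK in the package installer",
    ]
    new = "Login token can be reused, bypassing authentication checks"
    print(flag_likely_duplicates(new, history))

A production system would likely rely on learned embeddings and carefully tuned thresholds, but the principle is the same: score each incoming report against the historical corpus and flag high-similarity matches for reviewers.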

“Manual deduplication can take several days, but this tool could theoretically conduct those processes in a few seconds,” said Peng, the project manager and product designer. “Our project is trying to champion efficiency while maintaining accuracy and speed throughout.”

By using cutting-edge technology to elevate the bug reporting process, the students are helping Google’s Android Security team save time, effort and resources and focus on investigating novel security threats.

Capstone sponsor and Google security team member Greg Wroblewski was thrilled with the students’ work, which won an Innovation Award at the annual Capstone Gala on May 30. “I’m very impressed by the level of professionalism these students exhibited throughout the project,” he said. “They continued to surprise me with their knowledge of AI and have given an outstanding performance. I’m very proud of all of them.”

By the end of the project, the students will have created two key deliverables: a landing page that conveys the process behind their project, with interactive and video demonstrations, and a functional solution disclosed solely to Google.

“This is just the beginning of what AI can do in security,” said Foell, who served as research lead and product designer. “Because we’re maintaining a morally and ethically sound position throughout our project, I hope that we can set a precedent for automation to assist and empower people, not replace them.” 

As they have experimented with the possibilities of AI in security, the Informatics students have also developed a strong bond.

“My experience has been absolutely great. You know, I love these guys. They’re insanely hard-working and I’m so glad to have met them through Capstone,” said Prajapati, the project’s full-stack engineer.

“This has been amazing. Experimenting with AI and seeing how its capabilities measure up to human reviewers has been incredibly fulfilling,” said Raychel, the AI and data engineer. “I’m excited to see the growth and opportunities that will emerge from our discoveries.”