UW accessibility research could help make cars safer

By Jim Davis | Wednesday, July 31, 2024

More than 3,000 people die, and another 400,000 are injured, in distracted-driving crashes in the U.S. each year, often because drivers are doing such things as fumbling with the navigation system or sending work texts, according to the Centers for Disease Control and Prevention.

Now a team of researchers at the University of Washington is working with the Toyota Research Institute (TRI) to create touchscreen dashboard displays that could lower the risks of driving while distracted.

“When you’re driving, we all know that we probably shouldn’t be looking at the phone or attending to something else, but the reality is that no matter what, people will always do that, people will always multitask,” said Junhan “Judy” Kong, a Ph.D. student at the UW’s Information School.

Kong is the lead student on a team that has been working on a project called the ability-based design mobile toolkit, which provides developers with code to build apps that can adapt to people’s individual abilities and situations. Other members of the team include iSchool Professor Jacob O. Wobbrock and Professor James Fogarty of the Paul G. Allen School of Computer Science & Engineering.

TRI was interested in exploring how this research could be incorporated into the digital dashboard displays of future cars and trucks, so it entered into a research collaboration with the UW team.

“We were talking to the TRI people about our toolkit and they were excited about its potential for the driving scenario,” said Kong, who is pursuing her doctoral work in information science with a focus on human-computer interaction under Wobbrock’s supervision.

The UW team has been working on the toolkit for more than two years to help developers create mobile apps that adapt to the user. Apps are in many ways inaccessible to people with disabilities or to those facing other hindrances, Wobbrock said.

“If you design an app that is easier to use for a person with one arm, for instance, it can also be useful for a person who is carrying a bag of groceries or holding a baby,” Wobbrock said. “When you make designs more accessible for people with disabilities, you often make better designs for everyone.” 

App developers may not always have time to focus on accessibility, so the UW team built the toolkit to supply behind-the-scenes code that developers can apply to their own projects. The challenge was creating a framework for making apps more accessible to everyone.

“It takes some work to organize the universe of incoming data, reasoning about abilities, and adapting user interfaces,” Wobbrock said. “We know what we want to achieve — we want to make it easier for developers to create mobile interfaces that are responsive to a user’s abilities and situations.”

What they developed is code that allows apps to interact with the user based on factors such as touch, gesture, physical activity, and attention. From there, the code attempts to understand the user’s behaviors and situations and then accommodate those conditions. Part of this capability stems from earlier work the UW team published at ASSETS 2022, which revealed correlations between a set of touch metrics and specific fine motor challenges such as tremor.

“You can say it in three verbs — observe, reason, and adapt,” Wobbrock said. “Observe user behavior, reason about user behavior, and then adapt to user behavior.”
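
The toolkit’s code hasn’t been released yet, so the Kotlin sketch below uses invented names (Observations, AbilityEstimate, and so on); it is only meant to illustrate how an observe-reason-adapt pipeline of this kind might be organized, not the team’s actual implementation.

```kotlin
// Illustrative sketch only: the UW toolkit's API has not been
// released, and every name here is invented for this example.

// Observe: the kinds of signals the article says the toolkit watches.
data class Observations(
    val touchJitter: Double,    // summarized from background touch traces
    val isWalking: Boolean,     // from physical-activity sensing
    val lowVisionHint: Boolean  // e.g., from system accessibility settings
)

// Reason: a judgment about the user's abilities and situation.
data class AbilityEstimate(
    val tremorEvident: Boolean,
    val jostled: Boolean,
    val needsLargeText: Boolean
)

fun reason(obs: Observations) = AbilityEstimate(
    tremorEvident = obs.touchJitter > 12.0,  // stand-in threshold, not a research value
    jostled = obs.isWalking,
    needsLargeText = obs.lowVisionHint
)

// Adapt: concrete UI changes like the ones the article describes.
fun adapt(est: AbilityEstimate): List<String> = buildList {
    if (est.needsLargeText) add("scale text and buttons up")
    if (est.tremorEvident) add("enlarge touch targets")
    if (est.jostled) add("widen tap tolerance while walking")
}

fun main() {
    val obs = Observations(touchJitter = 15.0, isWalking = true, lowVisionHint = false)
    println(adapt(reason(obs)))  // [enlarge touch targets, widen tap tolerance while walking]
}
```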

Maybe the code could make buttons and text larger on the phone for someone who is visually impaired. Or maybe the code could help the app account for jostling when someone is walking down the street and trying to type a message.

“Let’s say, for example, you want to know a user’s touch abilities and you turn on the touch observer as a developer,” said Kong, who is interning in Redmond this summer. “What happens is this toolkit captures every single touch trace in the background while a user is using the app. Then you can ask the toolkit if tremor is evident in the user’s touch.”
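
Again, the names in the sketch below (AbilityToolkit, enableTouchObserver, isTremorEvident) are hypothetical stand-ins rather than the toolkit’s published API; the code simply traces the developer flow Kong describes:

```kotlin
// Hypothetical developer-facing flow; these class and method names
// are invented for illustration and are not the toolkit's actual API.
data class Touch(val x: Float, val y: Float, val timeMs: Long)

class AbilityToolkit {
    private val traces = mutableListOf<Touch>()
    private var observing = false

    // 1. The developer turns the touch observer on.
    fun enableTouchObserver() { observing = true }

    // 2. The toolkit captures every touch trace in the background.
    fun onTouch(touch: Touch) { if (observing) traces += touch }

    // 3. The developer asks whether tremor is evident. Here a toy
    //    heuristic (average point-to-point jitter) stands in for the
    //    validated touch metrics from the team's ASSETS 2022 work.
    fun isTremorEvident(): Boolean {
        if (traces.size < 2) return false
        val jitter = traces.zipWithNext { a, b ->
            kotlin.math.abs(b.x - a.x) + kotlin.math.abs(b.y - a.y)
        }.average()
        return jitter > 12.0  // threshold invented for the sketch
    }
}

fun main() {
    val toolkit = AbilityToolkit()
    toolkit.enableTouchObserver()
    // Ordinary app use feeds touches in the background...
    toolkit.onTouch(Touch(100f, 200f, 0))
    toolkit.onTouch(Touch(131f, 188f, 16))
    // ...and the app can adapt once tremor is detected.
    if (toolkit.isTremorEvident()) println("enlarging touch targets")
}
```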

The UW team plans to present its work at MobileHCI 2024 in Melbourne, Australia, later this summer. Afterward, the team intends to release the toolkit and make the code open source for any developer to use. The MobileHCI paper is co-authored by Kong, Fogarty, Wobbrock, and Allen School Ph.D. student Mingyuan Zhong.

Scott Carter, a senior staff research scientist and the project’s lead at TRI, said the research aligns well with the organization’s goal of making cars safer.

“This collaboration spans two divisions at TRI — human-centered AI and human interactive driving — and is a perfect expression of their goals: using data-driven techniques to build models of human behavior that can help humans make decisions that lead to collective well-being,” Carter said. “We are especially excited to work with the UW team as their expertise in adaptive user interfaces is a natural complement to our work in behavioral modeling, driving simulation and interaction.”

The principles used for creating better, more accessible apps may make touch displays more useful and adaptable to drivers. While it would be ideal if drivers stayed focused, Kong said, the research team would like to provide as much support as possible for those who multitask.

“There are so many things going on in the car,” she said. “You might have messages coming in, or you might have music on, or you might be adjusting the navigation system. A better interface may create a better driving experience — safer, more comfortable, and less stressful.”