Designers embed certain assumptions about people’s abilities into the technologies they create. Take, for example, the touchscreens on our computers, tabletops, tablets, and smartphones.
“Touchscreens assume people can point a finger, suspend an arm, reach out, and land cleanly. That’s a lot of assumptions. What if someone can’t do those things?” asks Jacob O. Wobbrock, director of the Mobile and Accessible Design Lab (MAD Lab), a center of innovation housed at the iSchool that makes everyday technologies accessible to people who have disabilities or are in disabling situations.
Under Wobbrock’s mentorship, Ph.D. students and collaborators from information and computer science have produced such inventions as a drawing program controlled entirely by non-speech voice commands, a screen reader that enables visually impaired users to explore documents using hand gestures, and a mouse cursor that magnifies small screen targets for people with motor impairments. They have figured out how to employ eye-gaze gestures for text entry and how to slow a cursor automatically as it nears a target, making the mouse easier to use for people without fine motor control.
A recent MAD Lab invention addresses that touchscreen challenge: “What if someone can’t do those things?” Ph.D. student Martez Mott has developed Smart Touch, which allows people with cerebral palsy, muscular dystrophy, spinal cord injury, and other motor impairments to operate a touchscreen. The system first observes how users touch the screen and then predicts their intended targets, an innovation that can also help people who are “situationally impaired,” like a smartphone user struggling to type while walking along a noisy, crowded sidewalk.
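The observe-then-predict idea can be illustrated with a minimal sketch. This is not the actual Smart Touch algorithm, just a toy example of the general approach: learn a user’s average touch offset from a few observed touches on known targets, then use that offset to predict which target a new touch was aimed at. The helper names `calibrate_offset` and `predict_target` are invented for illustration.

```python
# Illustrative sketch only -- not the MAD Lab's Smart Touch algorithm.
# Idea: observe how a user touches known targets, learn their average
# touch offset, then predict the intended target of a new touch.

def calibrate_offset(samples):
    """samples: list of ((touch_x, touch_y), (target_x, target_y)) pairs
    gathered while observing the user touch known targets."""
    n = len(samples)
    dx = sum(tx - gx for (tx, _), (gx, _) in samples) / n
    dy = sum(ty - gy for (_, ty), (_, gy) in samples) / n
    return dx, dy  # the user's average touch offset

def predict_target(touch, targets, offset):
    """Correct the raw touch point by the learned offset, then return
    the nearest candidate target (e.g., button centers on screen)."""
    x = touch[0] - offset[0]
    y = touch[1] - offset[1]
    return min(targets, key=lambda t: (t[0] - x) ** 2 + (t[1] - y) ** 2)
```

For example, a user who consistently lands low and to the right of each target would yield a positive offset, and a raw touch that falls between two buttons would be attributed to the one nearest the offset-corrected point. Real systems like Smart Touch handle far richer input, such as whole-hand or multi-point contact, than this single-point sketch.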
“Jake gave Smart Touch to me and said, ‘I think you should work on this problem, it’s a good problem.’ And then he let me run with it,” says Mott, one of the students drawn to the MAD Lab by Wobbrock’s award-winning publications and his reputation for exciting HCI (human-computer interaction) work.
“He really wants us to create solutions that will have an impact on people’s lives,” says Mott.
Wobbrock’s zeal for working on real-world problems has won him multiple awards, including the just-announced 2017 ACM SIGCHI Social Impact Award, a prestigious international accolade honoring HCI researchers who apply their work to pressing social needs. The judges wanted to acknowledge Wobbrock’s career-long focus on creative solutions to support people’s accessibility needs, says University of Maryland professor Ben Bederson, SIGCHI Adjunct Chair for Awards. “We liked the combination of his scholarly contributions, his impact on industry, and his approach to accessibility solutions that provide benefits to everybody, including those without specific disabilities.”
Researchers at the MAD Lab pursue “ability-based design,” focusing on what people with disabilities can do rather than what they can’t, and creating technology to support those abilities, whether that means enabling touchscreen tapping with a whole fist or using a person’s “oooh,” “ahhh,” and “eehh” vocalizations to play a video game. Ability-based design also examines “situated abilities,” meaning abilities exercised in the context of real-world use, including that smartphone user walking, distracted, down a busy sidewalk while trying to type.
“We study users’ needs and abilities, invent new technologies, and test prototypes to refine our understanding,” says Wobbrock, an associate professor in the iSchool and, by courtesy, in Computer Science and Engineering. “Everyone in my lab creates prototypes and tests them with users, improving our designs until we have something that is right.”
Wobbrock earned a B.S. in Symbolic Systems (1998) and an M.S. in Computer Science (2000) from Stanford University. After working as a design engineer in Silicon Valley, he returned to academia for his Ph.D. in HCI at Carnegie Mellon University, where he invented ways to improve text entry for people with motor impairments. His advisor was Brad Myers, who will receive this year’s SIGCHI Lifetime Research Award. “I found accessibility work full of fascinating intellectual challenges,” says Wobbrock, a big thinker who was already programming in BASIC on his IBM PCjr in grade school.
Wobbrock, 41, was drawn to the UW in 2006 by researchers coalescing around HCI. Once here, he helped found the multi-departmental DUB Group (design: use: build) that joins faculty, students, and industry partners in HCI and Design. DUB includes the iSchool, Computer Science and Engineering, Human-Centered Design and Engineering, and the Design Division, among others. “We’re all rowing the same boat,” says Wobbrock. “This kind of grassroots organization can only arise at universities whose walls are low and whose professors are eager to collaborate. That’s not widespread in high-powered academia.”
At the heart of Wobbrock’s work is inclusion. For society to progress, he says, all people must have access to information. “I like to quote Dean Kamen: ‘Everybody has to be able to participate in a future they want to live for.’”
Expectations in the lab are high, say students. So is Wobbrock’s confidence in them. “He encourages us to take ownership of our projects, and when you allow students to really own their projects, to do their own thing, it makes us want to be creative, and bring that creative energy to the accessibility community,” says Mott.
Shaun Kane, who earned his Ph.D. at the iSchool in 2011 and is now an assistant professor of computer science at the University of Colorado Boulder, said he is grateful to have had Wobbrock as a mentor. Kane’s doctoral work examined how to make touchscreens more accessible for the blind and visually impaired. “A number of people said this idea was a bad one and that I shouldn’t do it, but Jake said ‘Stick with it and let’s find out. We may find out it’s the wrong approach, but we won’t know until we have something we can try out. We’ll see if it works and if not, we’ll try the next thing.’”
Wobbrock’s measure for success at the MAD Lab is simple. He subscribes to advice from former UW colleague James Landay: “If your students succeed, you will succeed.”
And they do. Major publications have covered the groundbreaking work at the lab, where research has influenced game-changing industry innovations like the Apple iPhone’s VoiceOver feature, which enables people who are blind to operate their smartphones. VoiceOver’s interaction techniques were pioneered by Wobbrock, Kane, and then-computer science Ph.D. student Jeffrey Bigham in a project called Slide Rule. Bigham is now an associate professor at Carnegie Mellon in the same unit from which Wobbrock graduated more than 11 years ago.
Wobbrock is especially proud of the diversity of the MAD Lab. Half of his students and alumni are women. Some have disabilities, are refugees, or are from underrepresented groups. Along with doctoral students, Wobbrock has mentored master’s, undergraduate, and even high-school students.
“With diversity comes fresh ideas,” Wobbrock says. “If everyone comes with the same background or worldview, then everyone generates the same ideas. Diversity is the lifeblood of creativity, and creativity is the coin of the realm in HCI and design.”