You touch, tap, and swipe your way around the modern world. For many, it’s become second nature.
But today’s consumer technologies can frustrate people who lack the manual dexterity they demand. For those whose motor skills are impaired by cerebral palsy, muscular dystrophy, or other conditions that affect motor control, a touchscreen presents a serious challenge. Devices expect users to pinpoint targets with a fingertip; they don’t know how to react when a user drags the side of a hand or an entire palm across the screen.
Enter “Smart Touch,” a system that adapts to however a user touches a screen and greatly improves the accuracy of the device’s response. An idea that grew out of research on Ability-Based Design at the University of Washington Information School, Smart Touch gathers data to learn how the user targets objects on a screen, then uses that data to better interpret the user’s subsequent actions.
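In rough outline, that process has two phases: a calibration phase, in which the system records where a user’s touch actually lands when aiming at known on-screen targets, and a prediction phase, in which new touches are corrected using those recorded examples. The Python sketch below illustrates the general idea only; it is not the published Smart Touch algorithm. The real system works from the full geometry of a hand’s contact with the screen, while this sketch reduces each touch to a single point, and the class name, the nearest-neighbor matching, and the parameter k are illustrative assumptions.

```python
import math

class PersonalizedTouchModel:
    """Toy two-phase model: calibrate on a user's own touches, then
    predict intended targets from new touches. A simplified stand-in
    for the ideas described above, not the published algorithm."""

    def __init__(self):
        # Each template pairs an observed touch point with the offset
        # to the target the user was actually aiming for.
        self.templates = []  # list of ((x, y), (dx, dy))

    def calibrate(self, touch, target):
        """Record one calibration trial: where the touch landed vs.
        where the known target was."""
        offset = (target[0] - touch[0], target[1] - touch[1])
        self.templates.append((touch, offset))

    def predict(self, touch, k=3):
        """Estimate the intended point for a new touch by averaging
        the offsets of the k most similar calibration touches."""
        if not self.templates:
            return touch  # no personalization yet; fall back to the raw touch
        nearest = sorted(self.templates,
                         key=lambda t: math.dist(t[0], touch))[:k]
        dx = sum(off[0] for _, off in nearest) / len(nearest)
        dy = sum(off[1] for _, off in nearest) / len(nearest)
        return (touch[0] + dx, touch[1] + dy)

if __name__ == "__main__":
    model = PersonalizedTouchModel()
    # Calibration: this user's touches land low and to the right of targets.
    model.calibrate(touch=(110, 208), target=(100, 200))
    model.calibrate(touch=(312, 409), target=(300, 400))
    # Prediction: a new touch is shifted back toward the likely target.
    print(model.predict((212, 308)))
```

The detail worth noticing is that all of the adaptation lives in data gathered from the individual user rather than in assumptions about how a “typical” finger touches a screen, which is what lets the same mechanism accommodate a fingertip, a knuckle, or a closed fist.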
“The key concept of Smart Touch is that it allows users to touch in whichever way is most comfortable and natural to them,” said Martez Mott, a fourth-year Ph.D. student at the iSchool. “We’re saying to the user, ‘If it’s most comfortable for you to touch with a closed fist because it is painful or fatiguing to extend your fingers and to open your hand, we’ll allow you to touch with a closed fist.’
“That’s different from other approaches, which have essentially said, ‘You need to change yourself to adapt to the requirements of the system.’”
Mott has spent the past two years working on Smart Touch with iSchool Associate Professor Jacob Wobbrock, who has pioneered the concept of Ability-Based Design – the idea that accessibility and adaptability should be baked into consumer technologies so that they can better match the abilities of their users. By adjusting to a wider spectrum of users’ abilities, this approach makes technology more useful and usable for everyone.
The work on Smart Touch earned Mott and Wobbrock a CHI 2016 Best Paper Award, an honor that goes only to the top 1 percent of papers submitted to the most prestigious annual conference on human-computer interaction. (“CHI” stands for “Computer-Human Interaction” and is the abbreviation for the ACM Conference on Human Factors in Computing Systems.) Also contributing to the paper were Romanian researcher Radu-Daniel Vatavu and University of Colorado at Boulder Assistant Professor Shaun Kane, who earned his Ph.D. at the UW iSchool in 2011.
Wobbrock was Kane’s doctoral advisor then and is Mott’s advisor now. This is the seventh CHI Best Paper Award for Wobbrock, who also has seven honorable mention papers from CHI. Mott, the lead author on Smart Touch, will present the work at CHI in San Jose, California, in May.
The paper details how the researchers conducted dozens of trials with people who have motor impairments, gradually zeroing in on how each person uses a touchscreen. By the end of the trials, Smart Touch was more than three times as accurate at inferring intended targets as the device’s built-in touch sensing.
“Bringing this research to fruition was a slow process, using observational trial and error spread out over a long period of time,” Mott said. But the payoff made it worth the time investment. “It is a really good project in terms of the potential impact it can have.”
Wobbrock’s Ability-Based Design approach has already made an impact on consumer technology. VoiceOver, which is built into the Apple iPhone’s standard software, uses techniques developed by Wobbrock and Kane in 2007. Anyone who is blind or visually impaired can turn on VoiceOver to “read” the screen by swiping a finger and hearing each item spoken aloud.
Similarly, Smart Touch could open another avenue for accessibility in consumer products, but first there is more study to be done. The researchers want to sharpen its accuracy further and see whether they can bring the concept from the interactive tabletops it was developed on to tablets and smartphones, whose smaller screens demand even more precision from users’ touches.
Taken to its fullest potential, Smart Touch could extend beyond helping those with disabilities to improve how touchscreens perform for all users – even those who simply suffer from “fat fingers” when they compose a text message.
“I think Smart Touch has a lot of potential to transform the way systems interpret touch,” Wobbrock said. “And it isn’t just for people with motor impairments. Everyone experiences some degree of ‘impairment’ when they use a smartphone while walking, for example, or while riding on a bumpy bus. Improving the accuracy of touch can benefit everyone.”