In the late 1940s, "assistive technology" might have meant a cane or walker for a wounded World War II vet. Sixty years later, in busy UW labs, the possibilities expand almost daily. Assistive technology experts have created software programs that help disabled people control computers with vowel sounds -- oooh, ahhhh, eeee. They've developed audio technology to help the blind access touch screens. They're working on software that lets the deaf use sign language over mobile phones.
In the thick of the inventive bustle is an iSchool assistant professor with a guiding sense of fairness. "Most businesses can't afford to cater to the community of disabled people as much as they should because they don't conceive it would contribute to the bottom line. As academics, we can pursue things that are not fettered by those bottom-line issues," says Jacob Wobbrock, who worked at Google, DoDots, Microsoft Research, Intel Corporation, and the Intel-Mattel Smart Toy Lab prior to getting his Ph.D. in human-computer interaction from Carnegie Mellon University.
At the UW, Wobbrock heads a human-computer interaction lab he calls AIM, for accessibility, interaction, and mobility. With doctoral students and collaborators from the iSchool and Computer Science, AIM works to break down barriers that prevent hundreds of millions of disabled people from accessing standard one-size-fits-all technology.
The work is intense, the pace fast. "Most of our work we publish at conferences, not in journals -- they're too slow," says Wobbrock.
His long list of collaborative projects includes software that corrects the typing errors of people with motor impairments, a hands-free, voice-driven drawing program, and programs that eliminate the need for precise mouse pointing to hit on-screen targets.
While at Carnegie Mellon, Wobbrock developed an innovative program called EdgeWrite, which he continues to hone at the UW. The program began with a stylus that -- no matter how shaky the hand -- could turn loosely drawn strokes into letters, numbers, and other characters. Collaborators have since created a version of EdgeWrite that operates through eye movement. They're also creating a version for use with Apple iPhones.
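The core idea behind EdgeWrite is simple to sketch: rather than matching the exact shape of a stroke, the published design recognizes a character from the order in which the stroke visits the corners of a small square, which is why a shaky hand does not confuse it. The Python sketch below illustrates only that corner-sequence idea; the gesture table and helper names are hypothetical stand-ins, not Wobbrock's actual alphabet or code.

    # Corner labels for the square EdgeWrite-style input area.
    TL, TR, BR, BL = "TL", "TR", "BR", "BL"

    # Hypothetical corner-sequence-to-character table. The real EdgeWrite
    # alphabet differs; these mappings are illustrative only.
    GESTURES = {
        (TL, TR, BL, BR): "a",
        (TL, BL, BR): "l",
    }

    def nearest_corner(x, y, size=1.0):
        """Snap a raw stroke point to the nearest of the four corners."""
        right = x >= size / 2
        bottom = y >= size / 2
        if bottom:
            return BR if right else BL
        return TR if right else TL

    def recognize(points):
        """Collapse a noisy stroke to its corner sequence and look it up.

        Only the order in which corners are visited matters, so a shaky
        stroke and a clean one that hit the same corners yield the same
        character -- the source of the technique's tremor tolerance.
        """
        seq = []
        for x, y in points:
            corner = nearest_corner(x, y)
            if not seq or seq[-1] != corner:  # ignore repeated corner hits
                seq.append(corner)
        return GESTURES.get(tuple(seq))

    # A wobbly stroke visiting top-left, top-right, bottom-left, and
    # bottom-right still resolves to "a" despite the jitter.
    stroke = [(0.05, 0.10), (0.12, 0.08), (0.90, 0.05), (0.85, 0.15),
              (0.10, 0.92), (0.15, 0.88), (0.93, 0.90)]
    print(recognize(stroke))  # -> a

Because recognition depends only on corner order, not stroke shape, it is plausible to see how the same technique adapts to other input channels, such as the eye-movement and iPhone versions the lab is pursuing.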
Wobbrock consistently wins top recognition for his work. CHI, the leading conference for human-computer interaction, has awarded him three highly competitive Best Paper Awards as well as a Best Paper nomination in 2009. Judges have praised his "excellent mix of theoretical, experimental, and field studies."
Wobbrock says his work is designed to make technology work better for everyone, not just people with disabilities. "But if you pay attention to users with special needs upfront and early in the design process, you often end up serving the needs of the general public as well."