Touch-based screen readers appear to show impact of iSchool research

By Jim Davis | Illustration by Alexia Lozano | Wednesday, March 26, 2025

Tim Paulding ran his fingers across the smooth glass screen of an iPod Touch for the first time in 2010. He was working as a counselor at a summer camp for blind kids outside Grand Rapids, Michigan, when a friend handed him the then top-of-the-line iPod.

Paulding, who is blind, was familiar with Symbian and Windows Mobile phones, which featured keyboards with raised buttons. “I loved them — I could type so fast on those keyboards,” he said. Then he felt the iPod touchscreen.

“I’m like, ‘Dude, what am I gonna do with a touchscreen?’ It’s crazy. I can’t feel anything,” Paulding remembered.

Then, his friend showed him an innovation on the new device. “You drag your finger around and it reads what’s under your finger,” Paulding said. “I just thought it was really amazing. Now, I use something like that every day.”

Today, iPhones and Android phones come with these touch-based screen readers built in. A screen reader helps people who are blind or visually impaired use the device by reading aloud the text that appears on the screen and letting them interact with apps through gestures. Apple’s version is called VoiceOver and Android’s is called TalkBack.
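
To make the interaction concrete: the heart of a touch-based screen reader is “touch exploration,” in which the system continuously determines what lies under the finger and speaks it. Below is a minimal sketch of that idea in Swift, assuming a hypothetical ExplorableView; it hit-tests the view under a moving finger and speaks its accessibility label. It illustrates the general technique only, not Apple’s VoiceOver implementation.

```swift
import UIKit
import AVFoundation

// A minimal sketch of "touch exploration": as a finger slides across the
// screen, speak the accessibility label of whatever view is under it.
// Hypothetical illustration only; not Apple's VoiceOver implementation.
class ExplorableView: UIView {
    private let synthesizer = AVSpeechSynthesizer()
    private weak var lastSpoken: UIView?

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self),
              let target = hitTest(point, with: event),   // find the view under the finger
              target !== lastSpoken,                      // announce only when focus changes
              let label = target.accessibilityLabel
        else { return }
        lastSpoken = target
        _ = synthesizer.stopSpeaking(at: .immediate)      // cut off the previous announcement
        synthesizer.speak(AVSpeechUtterance(string: label))
    }
}
```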

When mobile phones with flat touchscreens started becoming pervasive, it wasn’t clear exactly how people who were visually impaired would use these devices. Researchers at the University of Washington’s Information School may have influenced the direction that major technology companies like Apple and Google took to address this issue.

iSchool Professor Jacob O. Wobbrock and doctoral students Shaun Kane, iSchool Ph.D. ’11, and Jeffrey Bigham, Computer Science & Engineering Ph.D. ’09, developed one of the first touch-based screen readers, called Slide Rule, in 2007.

“It’s rare for academic projects to have quite this line of sight from their academic origins to widespread industry adoption,” Wobbrock said. “This is about as close of a case that ever happens.” 

The original iPhone, which had a touchscreen but no accessibility features for people who were blind, was released in June 2007. It clearly was a transformative device, said Kane, who was pursuing his Ph.D. at the iSchool at the time.

“The iPhone really sparked a conversation about what mobile devices should be like and what are good ways to interact with them,” Kane said. “These smartphones were attractive devices. They were fun. People were excited about them. They certainly had a coolness about them.”

Kane arrived at the University of Washington in 2005 after receiving his Bachelor of Science in computer science from the University of Massachusetts Amherst. He was interested in accessible computing, but UMass didn’t offer it as an area of study, so he went to the iSchool for graduate work.

Kane has a disability that limits movement in his hand. As a teenager, he was interested in computer science but had no desire to work in accessibility. By college, he had changed his mind, finding the work intriguing. When he started on Slide Rule, he didn’t know anyone who was blind, but he felt his research was “disability agnostic.”

Kane’s adviser was Wobbrock, who was also new to the school. When the iPhone arrived, Kane ruminated on how to overcome its accessibility issues. Then, he started talking to people.  

“I was really motivated by this as a problem where there isn't an obvious solution,” said Kane, who went on to become a tenured professor at the University of Colorado, Boulder, and now works at Google Research.

His idea was to create an iPhone app that visually impaired users could open, one that would read aloud the information on the screen and let them interact with it through a series of gestures.
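
As a rough illustration of the kind of gesture-driven interaction Kane had in mind, the Swift sketch below maps flick gestures to moving an audible focus through a short list of items and a double tap to activating the focused item. The class name, items and “activate” behavior are invented for this example; Slide Rule’s actual gesture set, described in the team’s 2008 paper, used techniques such as a one-finger scan with a second-finger tap to select.

```swift
import UIKit
import AVFoundation

// Hypothetical sketch: flicks move an audible focus through a list of items,
// and a double tap activates whatever currently has focus. The item names,
// class name and "activate" behavior are invented for illustration.
class GestureReaderViewController: UIViewController {
    private let items = ["Email", "Music player", "Contacts"]  // placeholder screen items
    private var focusIndex = 0
    private let synthesizer = AVSpeechSynthesizer()

    override func viewDidLoad() {
        super.viewDidLoad()

        let flickRight = UISwipeGestureRecognizer(target: self, action: #selector(nextItem))
        flickRight.direction = .right
        view.addGestureRecognizer(flickRight)

        let flickLeft = UISwipeGestureRecognizer(target: self, action: #selector(previousItem))
        flickLeft.direction = .left
        view.addGestureRecognizer(flickLeft)

        let doubleTap = UITapGestureRecognizer(target: self, action: #selector(activateItem))
        doubleTap.numberOfTapsRequired = 2
        view.addGestureRecognizer(doubleTap)

        speakFocusedItem()  // announce the initial focus on launch
    }

    @objc private func nextItem() { moveFocus(by: 1) }
    @objc private func previousItem() { moveFocus(by: -1) }

    @objc private func activateItem() {
        // A real app would open the focused item here; this sketch just announces it.
        speak("Opening \(items[focusIndex])")
    }

    private func moveFocus(by step: Int) {
        focusIndex = (focusIndex + step + items.count) % items.count  // wrap around the list
        speakFocusedItem()
    }

    private func speakFocusedItem() { speak(items[focusIndex]) }

    private func speak(_ text: String) {
        _ = synthesizer.stopSpeaking(at: .immediate)  // interrupt any ongoing speech
        synthesizer.speak(AVSpeechUtterance(string: text))
    }
}
```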

Most of the people with whom he discussed the issue felt he was taking the wrong approach. Some felt that an attachment with buttons would be the correct course. Others argued that they needed to convince Apple to produce a version of the iPhone with tactile buttons.

“Well, it's much easier to change software than it is to convince people to change hardware, right?” Kane said.

Wobbrock encouraged Kane to work on the idea. Apple initially didn’t allow third-party programmers to create apps for the iPhone, but in 2007, within a couple of months, Kane figured out how to get his own software running on the device anyway.

“Shaun Kane, brilliant as he is, found a way to essentially jailbreak the phone,” said Wobbrock, joking that doing so voided the warranty.

A couple of months later, Kane, Wobbrock and Bigham had developed Slide Rule, a prototype screen reader with an email app, a music player and a contact list.

Kane posted a video of the Slide Rule project on YouTube on May 14, 2008. Kane, Wobbrock and Bigham’s paper on their project, “Slide Rule: Making mobile touch screens accessible to blind people using multi-touch interaction techniques,” was published in October 2008.

Apple introduced VoiceOver on the iPhone for the first time in June 2009, with the iPhone 3GS. Wobbrock said he and Kane looked at VoiceOver soon after it was released.

“The resemblance to Slide Rule was striking in terms of the gestures and design,” Wobbrock said. “Of course, VoiceOver was a full-fledged industry product. It had features, preferences and polish that you have as a commercial product that a research project doesn't.”

Wobbrock learned that Apple engineers were familiar with his team’s work. The evidence? An email from an Apple engineer.

The engineer wrote to Bigham’s adviser, Richard Ladner: “We definitely read through the existing literature before starting. I can say we were certainly aware of this project. We were quite excited to see the [Slide Rule] video when it popped up.” Ladner has kept the identity of the engineer anonymous all these years.

Did Slide Rule influence VoiceOver and TalkBack? Or was it a case of great minds thinking alike? Kane shies away from that debate.

“You should ask other people about that,” Kane said. “I stay out of that, because I’ve heard differing accounts. I think as soon as you see a device that’s transformative in this way, with a new kind of interface, we can all start to see what the accessibility problems might be.”

He’s proud of Slide Rule’s approach, and he learned a valuable lesson from the project, one he’s relied on throughout his career: if someone downplays your idea, keep going until you decide for yourself whether it will work.

“It's definitely a project that I am known for,” Kane said. “Often, people will say, ‘Oh, we read this paper in class.’ That’s really nice.”

In 2019, the research team won the SIGACCESS ASSETS Paper Impact Award, which honors a paper at least 10 years old that has had significant impact on information technology that addresses the needs of people with disabilities. 

Kathleen McCoy was the chairperson of the committee that selected the iSchool paper for the award. She said, at the time the award was given, it was perhaps the most influential ASSETS paper ever published. She still uses the paper in her own classes to teach how to write about accessibility. 

“It's a beautiful paper for the field of human-computer interaction and accessibility, going all the way from a formative study of what would you want to do with this phone for people who are blind to having ideas of how to fix that,” McCoy said.

Was Apple influenced by the iSchool’s research? McCoy points to the timeline: the iSchool team posted its video and published its paper the year before Apple first released VoiceOver on the iPhone.

“Just that it had the same features, the same gestures, I think that’s pretty good evidence that there was some influence going on,” McCoy said. 

She thinks the iSchool research made a lasting difference in the lives of people who are blind. She called it a snowball effect, with accessible iPhones giving users access to other apps that extend accessibility even further.

“What would the world have looked like if someone hadn’t figured this out?” McCoy said. 

For his part, Paulding, whose life changed with touch-based screen readers, recently listened to the Slide Rule video that Kane posted in 2008. 

“It sounded like a screen reader on an iPhone to me,” Paulding said.

Paulding was born with a condition called congenital rubella syndrome after his mother was exposed to German measles when she was pregnant with him. He has no right eye and lost vision in his left eye when he was about 30.

Paulding has taught other people with visual disabilities how to use VoiceOver. Based in Spokane, he works as an orientation and mobility specialist at the nonprofit Lighthouse for the Blind.

Technology is near the top of the list, if not at the top, of what helps people with visual disabilities live an independent life, he said.

“What Apple has done with accessibility has really revolutionized what you can do as a person who's blind using a smartphone,” Paulding said.

He points to apps that have improved accessibility. Soundscape gives people who are blind or visually impaired information about nearby businesses and the street grid. An app called OKO – AI Copilot for the Blind can be used at intersections without an audible pedestrian signal to tell the user whether the signal says “Walk” or “Don’t Walk.”

Mainstream apps such as Google Maps and Apple Maps help guide people who are blind or low-vision through neighborhoods.

“The power of being able to have apps that do things for you, that read things for you, where you can get information and knowledge, that kind of power promotes independence in a huge way,” Paulding said. “It’s massive.”