Getting it Done

“What doesn’t get measured doesn’t get done.” That old saying is true as far as it goes. But while it can be relatively easy to know what you want to measure, it’s another thing entirely to know how to go about doing it. To that end, the iSchool’s U.S. Impact Study, a research group focused on public libraries and community technology centers, has helped community non-profits measure the effectiveness of their public technology programs.

At stake is digital inclusion—the ability of community members to access and use technology. A luxury as recently as ten years ago, digital access is now central to self-sufficiency in America, critical to such necessities as applying for a job, accessing health care, participating in a child’s school progress, and engaging in lifelong learning. Even something as seemingly mundane as comparison shopping now requires basic digital skills.

“You can’t be included in today’s society unless you are part of the digital world,” says Samantha Becker, the research manager working with iSchool faculty advisor Michael Crandall on the U.S. Impact Study. “It’s really not possible anymore.”

In the U.S., the nation’s 9,200 library systems have taken the lead in providing public access to technology, and the U.S. Impact Study’s work focuses primarily on libraries. But libraries are not alone in this outreach. There are also many community technology providers. Mostly non-profits, they often serve niche populations with special needs related to such factors as language, culture and economics.

This fact was not lost on the organizers of the $7.2 billion Broadband Technology Opportunity Program (BTOP). According to the Department of Commerce’s National Telecommunications and Information Administration Website, a portion of this Recovery Act Program was aimed at supporting “Projects that focus on increasing broadband Internet usage and adoption, including among vulnerable populations where broadband technology traditionally has been underutilized.”

The U.S. Impact Study worked with community technology providers across Washington State who received BTOP funding, helping them more effectively collect data about their activities and clients and better evaluate their programs. Their work became a national model for other BTOP recipients. The purpose of measurement was twofold: to help organizations make programmatic decisions, and to help justify future funding in the eyes of politicians and other stakeholders.

While adept at their core outreach capabilities, many technology providers lack measurement savvy. A typical report might simply indicate the total number of participants, with success defined by self-reporting. This is flawed. “You can’t ask a non-user how confident they feel about their skills or whether they’ve learned something, because they don’t know enough yet to know whether they’ve mastered a new skill,” explains Becker. “So you have to figure out different ways of evaluating whether your programs are effective.”

To do this, the U.S. Impact Study established a monitoring and evaluation framework. “We laid out domains of impact on people’s lives, and then enumerated different high-value ways that technology could potentially help people accomplish important personal or family goals.” The U.S. Impact Study identified roughly 160 indicators, and then helped organizations choose the indicators that matched their mandate.

So, for example, instead of asking a user to rate their class experience on a scale of 1-5, an organization might ask, “Were you able to send an attachment with an email after taking this class?” or “Were you able to apply for a job online after taking this class?”

Thanks to the U.S. Impact Study, the resulting hard numbers helped organizations better demonstrate return on investment, which is critical in an era of heightened budgetary scrutiny. To funders, there is a huge difference between “we offered 60 classes” and “we helped 60 people find jobs.” As Becker notes, “It creates a big difference in what you tell your stakeholders.”

The success of this measurement was recently underscored when four of the organizations Becker and her research group worked with went on to qualify for additional support from the City of Seattle’s 2013 Technology Matching Fund. The organizations include Asian Counseling and Referral Service; Jefferson Terrace (Seattle Housing Authority); Horn of Africa Services; and Neighborhood House.

“What the iSchool brings to this is an understanding of digital inclusion in a very broad, societal way,” says Becker, herself an iSchool alumna. “We understand how people seek and find information. We understand how they use technology, and the bigger issues of digital inclusion in terms of how people are really excluded, and what needs to be put into place to make them included.”

Now that the BTOP program has concluded, the U.S. Impact Study has moved on to other challenges. These include the Edge Initiative, a collaboration among many major public library stakeholders that seeks to help libraries evaluate their technology resources and services against a set of benchmarks. They also include the Impact Survey, a tool that helps public libraries survey their patrons about how they use library technology and the impact it has had on their lives—once again for purposes of needs assessment and communication with stakeholders, i.e., data that can be provided to city councils and city managers to justify funding.

Technology can have real and genuine impact on people’s lives. Sadly, however, it’s all too easy to undervalue these enriching benefits if you don’t know what to look for. But thanks to the iSchool’s U.S. Impact Study research group, things are getting measured, and things are getting done.