Working with Outcomes: A Worthwhile Challenge

Written by Heidi Andres, a Teen Services Librarian with Cuyahoga County Public Library in Northeast Ohio. She received outcome measurement training as a fellow in the Treu-Mart Youth Development Fellowship Program of the Weatherhead School of Management at Case Western Reserve University.

Smiley face evaluation, by Flickr Creative Commons user Billsophoto

As a librarian involved in the implementation of youth program outcome measures, I was extremely interested in Johannah Genett’s article, “Measuring Outcomes for Teen Technology Programs,” in the Fall 2014 issue of YALS. Reading this article, I was eager to see how another library system (Hennepin County Library) collects outcome measures for youth programs in comparison to my organization, Cuyahoga County Public Library (CCPL).

In the fall of 2012, Cuyahoga County Public Library formed a youth planning team with the mission of creating outcome measures for youth programming across our 27 branches. This team was made up of CCPL administrative youth staff, three Teen Services librarians (including myself), one Children’s Services librarian, and representatives from an outside youth development organization. Although the three teen staff members had received outcome measurement instruction, learning outcome measurement theory and actually creating and applying the tools proved to be two very different things. Constructing outcomes, indicators, and measurement tools was an eye-opening experience, one that enabled the team to closely examine the library system’s youth programming priorities and goals. The process focused on answering some vital questions: Why do we do what we do? What are our programming strengths, and where can we improve? What kind of impact do we want to have on the young people we serve, and how do we achieve this through the programs we offer? It was our hope that measuring outcomes would not only give us answers to these questions but also provide staff with insight they could use to develop future youth programs with outcomes in mind.

Once the youth planning team was in place, the group felt it was crucial to refine the focus of CCPL’s youth programming priorities before identifying program outcomes. In order to do this, the planning team reviewed a number of sources, including:

  • The Search Institute’s 40 Developmental Assets
  • Cuyahoga County Public Library’s mission, organizational priorities, and youth programming philosophy (which focuses on youth achieving maximum potential, ensuring children enter school ready to learn, and enriching literacy experiences of children, teens, and their families)
  • Our existing Youth Services manual and other in-house program evaluation tools.

The team then brainstormed to produce an overarching youth programming outcome statement which would apply to all Cuyahoga County Public Library youth programs. The resulting youth programming outcome statement is:

“Youth attending programming will view the library as a safe and supportive space where they can express who they are. They will build relationships, interests, and skills through constructive use of time.”

This statement was broad enough to encompass the diverse programming initiated by CCPL youth staff while also highlighting what young people gain from attending library programs.

After we narrowed down our youth program priorities and created our youth programming outcome statement, the team set about deciding what indicators would accurately capture that statement. This process allowed the youth planning team to consider what we want young people to achieve in our programs. However, because Cuyahoga County Public Library offers such a wide variety of youth programs, establishing indicators and a “one-size-fits-all” outcomes tool that could effectively measure every program equally was difficult. To best illuminate the aim and most accurately measure the outcomes of CCPL’s youth programming, the team divided youth library programs into three categories: multi-part programs, stand-alone programs, and youth voice programs. Multi-part programs take place over several days (such as CCPL’s week-long summer camps) or have multiple segments within a single day (e.g., one-day camps), and each session or segment is intended to build upon the previous one. Stand-alone programs are defined as programs where the topic (or performance) is completed within a single occurrence of that program. Youth voice programs, such as library teen advisory groups and boards, are programs in which youth specifically contribute to the library, their peers, and/or the greater community.

Designating three separate program categories made it possible to design indicators and measurement tools specifically for each of the three program types. The youth planning team developed a core set of indicators that applies across all three programming types; the indicators for multi-part programs are as follows:

  • Youth view the library as a nurturing and welcoming environment.
  • Youth develop positive relationships with program presenter(s).
  • Youth value library programming.
  • Youth develop social and interpersonal skills.
  • Youth gain knowledge, skills, or abilities in the area of program focus.

Although these five indicators may initially seem simple, considerable time and effort went into designing indicators that accurately reflect the purpose of CCPL’s youth programming outcome statement. But how, exactly, would we determine whether the indicators are being met and whether young people are having these experiences in our library programs? Like Hennepin County Library, CCPL concluded that a survey would be the best tool to measure the impact of our library programs on young people. The youth planning team members agreed that self-reporting by program attendees via an anonymous survey would allow participants to provide honest feedback and would be the most effective way to determine whether indicators were being met.

In order to design a youth programs outcome measurement survey, the planning team members created survey statements to capture each indicator. Program participants would then rate each statement on a numerically weighted scale, indicating whether they strongly agree (4), agree (3), disagree (2), or strongly disagree (1) that they experienced what the statement describes during the program. A neutral “neither agree nor disagree” option was intentionally omitted from the survey, as it would not provide staff with meaningful feedback. In most cases we use more than one survey statement to measure whether an indicator is being met. For example, the indicator “youth develop positive relationships with program presenter(s)” is measured through two survey statements: “the program leader here cares about me” and “I feel comfortable going to the program leader for help.”
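For readers curious about how a 4-point scale like this can be tabulated, here is a minimal, purely illustrative Python sketch that averages responses per indicator. The two statements come from the examples in this article, but the indicator grouping, the response data, and the code itself are hypothetical; this is not CCPL’s actual survey instrument or tabulation process.

```
# Illustrative sketch: averaging 4-point survey responses per indicator.
# Indicator groupings and response data below are hypothetical examples.

from statistics import mean

# Map each indicator to the survey statements that measure it.
indicator_statements = {
    "Youth develop positive relationships with program presenter(s)": [
        "The program leader here cares about me",
        "I feel comfortable going to the program leader for help",
    ],
    "Youth value library programming": [
        "I like coming to library programs",
    ],
}

# Hypothetical responses: one dict per completed survey, with each
# statement scored from 4 (strongly agree) down to 1 (strongly disagree).
responses = [
    {"The program leader here cares about me": 4,
     "I feel comfortable going to the program leader for help": 3,
     "I like coming to library programs": 4},
    {"The program leader here cares about me": 3,
     "I feel comfortable going to the program leader for help": 4,
     "I like coming to library programs": 2},
]

# Pool every score for an indicator's statements across all surveys.
for indicator, statements in indicator_statements.items():
    scores = [r[s] for r in responses for s in statements if s in r]
    print(f"{indicator}: average {mean(scores):.2f} (n={len(scores)})")
```

The same averaging could of course be done in a spreadsheet; the point is simply that each indicator’s score pools every statement that measures it, rather than relying on a single question.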

Upon completion of the survey statements, draft surveys were presented to a committee of youth services librarians for commentary. Similar to Hennepin County Library’s process, CCPL’s survey statements were also submitted to a group of teens, a branch Teen Advisory Group, for their input. Both library staff and youth customers provided honest feedback and suggestions that were considered as final survey edits were made. Because Cuyahoga County Public Library’s youth programs encompass a wide range of ages (birth to 18 years), the youth planning team devised age-specific surveys for multi-part and stand-alone programs: one for programs serving birth through grade 3 (ages 0-8, early childhood) and one for programs serving grades 4 and up (ages 9-18). The age-specific surveys measure the same outcomes with slightly different indicator statements. The early childhood surveys were designed with the expectation that parents/caregivers would assist children with their responses, so indicator statements are worded from the perspective of the adult. For example, the indicator “youth value library programming” is captured by the statement “My child likes coming to library programs” on the early childhood survey, whereas “I like coming to library programs” is used on the survey for grades 4 and up. For youth voice programs, which are usually geared toward older children, we use only one measurement survey.

Although the survey responses are anonymous, the planning team felt parental/legal guardian consent was necessary since we are requesting information from minors. As a result, signed release forms are required before youth can complete the outcome measurement survey at the conclusion of a program. Taking the survey is completely voluntary. We use an electronic survey (via SurveyMonkey) as an alternative to paper whenever program attendees are not able to take the survey at the end of a program. We piloted the survey at a teen entrepreneurship camp as the last step before implementing it system-wide.

As Ms. Genett stated in her YALS article, “measuring outcomes is a new idea for many librarians,” and this was certainly the case for a number of Cuyahoga County Public Library youth services staff. To familiarize youth personnel with outcomes and get them comfortable with the measurement process, CCPL rolled out youth program outcomes gradually, focusing first solely on the measurement of multi-part programs (specifically summer camps). Two staff training sessions were held in the spring of 2013, and outcome measurement began with youth summer camps that June. Since the initial implementation of youth program outcome measures, CCPL has slowly increased the number and types of youth programs that we measure. For example, we now also conduct outcome measurement surveys for Robotix programs (whether a stand-alone session or a multi-part camp). In addition to the three surveys for the original youth programming categories (multi-part, stand-alone, and youth voice), we recently created a fourth outcome measurement tool for youth writing groups. Although youth writing groups share outcomes and indicators with the other programming categories, a separate survey was necessary in order to capture whether or not participants felt that our programs had improved their writing abilities.

Multi-part and stand-alone surveys are conducted at the close of each program, while writing group surveys (and, eventually, youth voice surveys) are done on a quarterly basis. Completed surveys are sent to youth staff at CCPL’s administrative building, where responses are tabulated and shared with both library staff and potential program funders. CCPL is striving to ultimately measure outcomes for all youth programs. Our goal is to identify and measure the impact programs have on the young people we serve. Such information is not only important for securing funding; it also provides library staff with insightful data to improve the content, delivery, and impact of library youth programming.

Although the task of creating and implementing an outcome measurement system may seem daunting, the long-term benefits outweigh what can be a lengthy start-up process. Library staff have known for years that programs can positively impact young people; outcome measurement tools give us the ability to quantify and qualify our efforts. Librarians want to provide the best possible programs and experiences for the young people we serve, and outcome measures are an important piece of the puzzle.
