Putting Data to Work: Interim Recommendations From The Benchmarking Project
foundation grants and other donations—are what make it possible for OBT to provide general equivalency diploma (GED) preparation, literacy services, business skills training or job placement assistance for 2,000 area residents each year. But the inefficiency of entering data into multiple systems—as well as OBT's own internal database—is frustrating.

OBT's extensive reporting processes reflect the national drive over the past 30 years toward greater accountability for programs receiving government and philanthropic funding. Modeled on the private sector's emphasis on performance standards, grants typically require tangible, measurable results—and outcomes data play a major role in decisions about continued support. Contract payments tied to specific deliverables, such as job placement and retention, are often essential to organizations' cash flow and bottom lines.

While organizations face external pressure to demonstrate results with outcomes data, most also realize that it is critical to use data to strengthen program services and spend resources effectively. As described in Public/Private Ventures' 2006 report
Good Stories Aren’t Enough: Becoming Outcomes-Driven in Workforce Development
, frontline practitioners are increasingly engaged in a "cycle of continuous improvement," analyzing individual data to glean knowledge that can lead to better results. This process is driven by the following questions:

• What are our results? Are we serving our target population? How many participants are showing improved skills? How many are securing employment and sustaining that employment for the long term? Are we seeing an increase in participants' income and benefits?

• What is "good" performance? How do our results compare with others? How are our results changing over time? What level of performance should we expect?
The Benchmarking Project
• The Benchmarking Project began in 2004 with intensive work in three cities to understand the types of data local programs were collecting and related performance management issues.

• We designed and piloted a web-based survey to capture aggregate data from programs about participant demographics, services and outcomes for a recent one-year cohort of enrollees.

• As of March 2010, 214 programs from 159 organizations had submitted data, and additional programs are joining all the time.

• Organizations receive confidential reports that allow them to compare their programs' job placement and retention results (anonymously) with programs that share similar characteristics (e.g., size of cohort or type of service strategy). In each case, the median outcomes serve as initial benchmarks of performance.

• Organizations voluntarily decide to participate in The Benchmarking Project, and programs can update job retention data or submit new surveys on other cohorts at any time. There is no cost to organizations except the time needed to respond to the survey. Participating organizations indicate that the project's "apples to apples" reports are a useful tool for identifying areas of program strength as well as areas needing improvement.

• To support program improvement efforts of participating organizations, The Benchmarking Project provides workshops, webinars, online discussions and other resources related to performance management and effective practice.