Putting Data to Work: Interim Recommendations From The Benchmarking Project
Performance Management: Hard Realities on the Ground
For Alice, a staff member at Opportunities for a Better Tomorrow (OBT), data entry is a long and tedious process. Data entry usually is. But the programs offered by this workforce development organization in Brooklyn are funded through five contracts with various government agencies, each requiring OBT to report information using a different online database. For each contract, Alice or one of her colleagues enters data on program participants' backgrounds and demographics, the services they receive and their job placement or educational outcomes. Each contract has crucial reporting deadlines to meet, but waiting for the online systems to "refresh" after each data field is updated sometimes makes it a slow endeavor. Alice is careful with her data entry, because in a single OBT program one participant's services might be funded through a government contract while services for another are paid for by a foundation grant. OBT staff also have to make sure that backup paper documentation is in participant files, ready for possible contract audits. As time-consuming as this process may be, Alice and her colleagues understand its importance.
The Challenge
Given the ragmentation o the workorce developmentsystem, it is difcult or unders, policymakers andpractitioners to know what outcomes constitute “good”perormance. Furthermore, the diverse reportingrequirements o workorce development unders takesignifcant time and energy to navigate, thus sappingrontline providers’ capacity to use data or programimprovement.
Key Recommendations
Policymakers and unders at every level—but particularlythe ederal level—need to better support a culture ocontinuous learning and improvement across the entiresystem. To do this, they should:1. Move toward more consistent defnitions operormance measures.2. Implement new technology or adapt existing systemsto allow under and program databases to exchangeinormation more easily.3. Provide more useul reports or practitioners aboutlocal and state data trends.4. Oer more opportunities or program providers to learnrom existing research and rom their peers.5. Encourage more programs to participate in TheBenchmarking Project to enhance the feld’s abilityto defne “good” perormance and to strengthenperormance improvement eorts across the system.
by Marty Miles, Sheila Maguire, Stacy Woodruff-Bolte and Carol Clymer
 
The funds from these public contracts—supplemented by private foundation grants and other donations—are what make it possible for OBT to provide general equivalency diploma (GED) preparation, literacy services, business skills training or job placement assistance for 2,000 area residents each year. But the inefficiency of entering data into multiple systems—as well as OBT's own internal database—is frustrating.

OBT's extensive reporting processes reflect the national drive over the past 30 years toward greater accountability for programs receiving government and philanthropic funding. Modeled on the private sector's emphasis on performance standards, grants typically require tangible, measurable results—and outcomes data play a major role in decisions about continued support. Contract payments tied to specific deliverables, such as job placement and retention, are often essential to organizations' cash flow and bottom lines.

While organizations face external pressure to demonstrate results with outcomes data, most also realize that it is critical to use data to strengthen program services and spend resources effectively. As described in Public/Private Ventures' 2006 report Good Stories Aren't Enough: Becoming Outcomes-Driven in Workforce Development, frontline practitioners are increasingly engaged in a "cycle of continuous improvement," analyzing individual data to glean knowledge that can lead to better results. This process is driven by the following questions:
 What are our results?
Are we serving our target population? How many participants are showing improved skills? How many are securing employment and sustaining that employment for the long term? Are we seeing an increase in participants' income and benefits?
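To make these questions concrete, the sketch below shows one way a program might compute such measures from participant-level records. It is purely illustrative: the record fields, thresholds and sample values are invented for this example and are not drawn from OBT's databases or any funder's reporting system.

```python
# Hypothetical sketch: turning participant-level records into the kinds of
# outcome measures the questions above ask about. All fields are illustrative.
from dataclasses import dataclass

@dataclass
class Participant:
    in_target_population: bool
    skills_improved: bool      # e.g., passed a post-test or earned a credential
    placed_in_job: bool
    months_retained: int       # months employed after placement
    wage_gain: float           # change in hourly wage, in dollars

participants = [
    Participant(True,  True,  True,  9,  2.50),
    Participant(True,  False, True,  3,  0.00),
    Participant(False, True,  False, 0,  0.00),
    Participant(True,  True,  True, 12,  1.75),
]

total = len(participants)
placed = [p for p in participants if p.placed_in_job]

print(f"Target population served: {sum(p.in_target_population for p in participants) / total:.0%}")
print(f"Skill gains:              {sum(p.skills_improved for p in participants) / total:.0%}")
print(f"Job placement rate:       {len(placed) / total:.0%}")
print(f"6-month retention rate:   {sum(p.months_retained >= 6 for p in placed) / len(placed):.0%}")
print(f"Average wage gain:        ${sum(p.wage_gain for p in placed) / len(placed):.2f}/hour")
```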
 What is “good” performance?
How do our results compare with others? How are our results changing over time? What level of performance should we expect?
The Benchmarking Project
The Benchmarking Project began in 2004 with intensive work in three cities to understand the types of data local programs were collecting and related performance management issues.

We designed and piloted a web-based survey to capture aggregate data from programs about participant demographics, services and outcomes for a recent one-year cohort of enrollees. As of March 2010, 214 programs from 159 organizations had submitted data, and additional programs are joining all the time.

Organizations receive confidential reports that allow them to compare their programs' job placement and retention results (anonymously) with programs that share similar characteristics (e.g., size of cohort or type of service strategy). In each case, the median outcomes serve as initial benchmarks of performance.

Organizations voluntarily decide to participate in The Benchmarking Project, and programs can update job retention data or submit new surveys on other cohorts at any time. There is no cost to organizations except the time needed to respond to the survey. Participating organizations indicate that the project's "apples to apples" reports are a useful tool for identifying areas of program strength as well as areas needing improvement.

To support program improvement efforts of participating organizations, The Benchmarking Project provides workshops, webinars, online discussions and other resources related to performance management and effective practice.
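As a rough illustration of the sidebar's "apples to apples" idea, the following sketch computes median placement and retention benchmarks for a peer group of similar programs. The field names, grouping rules and program data are hypothetical; the project's actual survey schema and report calculations are not published in this brief.

```python
# Hypothetical sketch: comparing one program's outcomes with the median
# outcomes of "similar" programs, in the spirit of the project's
# "apples to apples" reports. Names, fields and groupings are illustrative.
from statistics import median

# Each record is one program's aggregate survey data for a one-year cohort.
programs = [
    {"name": "Program A", "strategy": "sector training", "cohort": 180,
     "placement_rate": 0.62, "retention_6mo_rate": 0.48},
    {"name": "Program B", "strategy": "sector training", "cohort": 150,
     "placement_rate": 0.70, "retention_6mo_rate": 0.55},
    {"name": "Program C", "strategy": "sector training", "cohort": 210,
     "placement_rate": 0.58, "retention_6mo_rate": 0.44},
    {"name": "Program D", "strategy": "job readiness",   "cohort": 90,
     "placement_rate": 0.51, "retention_6mo_rate": 0.37},
]

def peer_group(program, all_programs):
    """Programs with the same service strategy and a roughly similar cohort size."""
    return [p for p in all_programs
            if p["strategy"] == program["strategy"]
            and 0.5 * program["cohort"] <= p["cohort"] <= 2 * program["cohort"]]

def benchmark(program, all_programs, measure):
    """Median of a measure across the peer group -- the initial benchmark."""
    peers = peer_group(program, all_programs)
    return median(p[measure] for p in peers)

mine = programs[0]
for measure in ("placement_rate", "retention_6mo_rate"):
    print(f"{measure}: program {mine[measure]:.0%} "
          f"vs. peer median {benchmark(mine, programs, measure):.0%}")
```

The median of a comparable peer group, rather than a fixed target, is what the sidebar describes as the "initial benchmark" of performance.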
 
How can we improve?
How do results vary by type of service or participant? What can we learn from peers and research? What do answers to these questions tell us about potential changes in our strategies?

Similar questions about performance exist for workforce development funders and policymakers. The desire to answer them—particularly the "what is 'good' performance?" question—inspired the Annie E. Casey Foundation to launch The Benchmarking Project in partnership with Public/Private Ventures (P/PV). The project's long-term goal is to identify realistic performance standards for programs in the workforce development field. The project is now completing its pilot phase after three years of data collection, and as of March 2010, 214 programs from 159 organizations across the country had submitted confidential data about their programs.

The Benchmarking Project has already provided compelling information about how specific program characteristics correlate with different outcomes and what some reasonable "benchmarks" might be.[1] While this brief includes initial benchmarks gleaned from the 200-plus programs in the current sample, these findings cannot yet be generalized—because the sample is not yet "representative" of the field. The next phase of the project seeks to engage a larger sample of organizations that better represents the many different types of programs and service providers operating across the country, with enhanced technology to accommodate more participation.

Yet even as The Benchmarking Project continues to expand, it already offers important insights about what service providers "on the ground" experience as they seek to better understand and improve their outcomes. This brief highlights issues that currently make it difficult for providers to fully engage in a cycle of continuous improvement and shows how the Benchmarking approach can help advance the use of outcomes data in the workforce field. We hope this information will be useful for policymakers and funders as they consider ways to more effectively support the improvement of workforce development programs—and ultimately produce better results for the participants and businesses they serve.
The Issues
“What Are Our Results?”
According to the National Conference of State Legislatures, at least 12 different federal agencies provide funds for workforce development programs—including the departments of Labor, Health and Human Services, Education, Housing and Urban Development, Agriculture, Transportation, Commerce, Energy and Veterans Affairs—and those funds are then channeled through state and local agencies.[2]
