Humanitarian Impact Evaluation
November 2011
Meeting Note
For more information, please contact:
Elizabeth Bellardo
Senior Program Manager
InterAction
ebellardo@interaction.org
www.InterAction.org
1400 16th Street, NW, Suite 210
Washington, DC 20036
202.667.8227
On Thursday, Oct. 20, 2011, over 40 members of the international humanitarian NGO community gathered to discuss impact evaluation in humanitarian action. This is a short summary of the presentations and discussions.
Improving humanitarian impact assessment: bridging theory and practice
John Mitchell, ALNAP

John Mitchell, Director of ALNAP, presented on a study rethinking the impact of humanitarian aid and the use of humanitarian impact assessments, which have proved difficult to conduct. The aims of the study were to clarify key issues around the use of humanitarian impact assessments, to move toward a shared and better understanding of their nature and uses, and to define a practical vision for future work.

Since the 1980s, three broad trends have contributed to the current interest and debate around humanitarian impact assessment. First, humanitarian aid has expanded and become more politicized. The increase in complex emergencies during the 1990s has resulted in humanitarian principles being increasingly compromised. The changing nature of vulnerability and human suffering has also played a role, as the number and scale of emergencies has increased in the past 20 years and has affected more middle income countries. Lastly, the aims of humanitarian aid have become blurred. The field has become increasingly institutionalized and professionalized, and the main goal is not just saving lives but can also include state building and addressing vulnerability.

Despite the increase in initiatives and the progress made, considerable challenges remain. The study identifies six key challenges: (1) defining impact and impact assessment; (2) diverse stakeholders and interests; (3) indicators, baselines and data; (4) methodologies; (5) collective interpretation and analysis; and (6) capacities and incentives.

Defining impact and impact assessment: While there are widely accepted definitions, there is no single definition that everyone follows. Part of the difficulty is that the ideal model is linear, but real-world impact is different and not so straightforward: it is complex, hard to discern, and can happen a long time after the fact. There is also a tendency to look at contribution rather than attribution. Humanitarian impact assessment is even more difficult due to the rapidly changing nature of humanitarian contexts, the lack of consensus on the objectives of humanitarian aid, and the fact that intended impacts are often unclear and overambitious.

Diverse stakeholders and interests: Though crucial to a successful assessment, different stakeholders have different perceptions of and interests in impact, making it a difficult balancing act. Additionally, different needs may not be reconcilable and achievable in a single impact assessment, i.e. the tension between demonstrating past impact and improving future practice.

Indicators, baselines and data: The choice of indicators is crucial and includes a value judgment on what kinds of changes are significant and for whom. Issues include weak or nonexistent baselines, unavailable or unreliable data, qualitative vs. quantitative data, and monitoring systems that focus on process and outputs.

Methodologies: There are many tools, techniques and approaches available. Some are quantitative and tackle "what and where" questions, while others are qualitative and tackle "how and why" questions. Many humanitarians adopt a mixed methods approach.

Collective interpretation and analysis: One of the biggest issues in impact assessment is how to include the views of beneficiaries and how to use these views and experiences to actually bring about changes and improvements. To date, participation of and accountability to affected populations has not been a key feature of impact assessments.

Capacities and incentives: In general, there is a lack of individual and organizational capacity to carry out good evaluations. Contributing factors are unclear TORs and objectives, high staff turnover and inadequate investment of resources. There are also few genuine incentives to carry out a good assessment.

The study concluded that so far the humanitarian system has been poor at measuring or analyzing impact. Some progress has been made, but not very much.

Discussion: Participants discussed the question of baselines and surrounding issues, highlighting the challenges of a lack of baseline data and of reconstructing baselines. In humanitarian impact assessments there is a tendency to treat the emergency as the zero point and not take into account life before it happened. There is a tendency to reconstruct baselines in order to attribute the consequences of a program, as well as a tendency to oversimplify the situation. Some participants felt it may be an unrealistic expectation to be able to measure a program's or project's impact in a humanitarian context.
Impact Evaluation for the International NGO Community
Carlisle Levine, Moderator
Jeannie Annan, Director of Research and Evaluation, IRC
Dale Hill, Senior M&E Advisor, American Red Cross, International Services
Veronica M. Olazabal, Monitoring and Evaluation Executive, United Methodist Committee on Relief
Hana Crowe, Senior Specialist, Accountability, Save the Children

The panel discussants shared how their respective organizations define impact evaluation, their policies for carrying out impact evaluations, how they measure impact, and various challenges they have encountered. Though the panelists had conducted evaluations in a variety of situations and contexts (both relief and development), all agreed that evaluations can be time consuming, expensive and a burden on staff. Impact evaluation in a humanitarian context is difficult because, in an emergency, the main priority is to save lives rather than to collect data to be used at a later date to evaluate programs. It is also difficult to attribute impact in an emergency situation, since more than one agency often responds. While impact evaluations may help organizations respond more effectively to future situations, learning from evaluations and applying that knowledge to future responses is challenging. It is also difficult to understand the downside of a project, since evaluations focus on those who have benefitted from certain programs and do not take into account those who do not benefit or are worse off.

Following the panel discussion, workshop participants broke into small groups and were asked to answer the following questions:
1. What do we mean by impact evaluation?
2. When is it appropriate to do impact evaluation in a relief or humanitarian context?
3. Why do we undertake impact evaluation in a humanitarian response?
4. How do we undertake humanitarian impact evaluation in a humanitarian response?
5. Who undertakes humanitarian impact evaluation in a humanitarian response context?

Small group discussion highlights
The humanitarian community should take advantage of the current excitement over monitoring and evaluation to encourage greater coordination. Since it is difficult to set and determine baselines in an emergency, one group should act as the lead evaluator and collect and coordinate data to reconstruct baselines for all groups to use.
There are pros and cons to NGO collaboration with academic institutions for impact assessments. More methods could be explored and there would be a greater variety of skill sets and capacities. While it is important to learn from evaluations and share results, there may be some discomfort in publishing results. External collaboration can lead to more risk, and there is a concern about being penalized for producing different results than what was originally expected.
NGOs undertake impact evaluation because there is an increased need to understand what happened and why. External evaluators are useful and provide a certain level of expertise, but they are often unchecked and may have a certain bias. There also needs to be internal support within an organization and greater senior-level buy-in.
Impact evaluation is not easy. When comparing development work and humanitarian action, evaluation may be more important in humanitarian action. Better management is needed for measuring how well we are doing.
There is an inherent tension in whether an impact evaluation is conducted because it has been mandated by a donor or for organizational learning. Donors require one thing, which may contradict what an organization is trying to learn from the evaluation. Organizations are trying to create global standards and indicators, and to conduct an impact evaluation when they pilot a new approach or enter a new sector.
Impact evaluation is just one type of evaluation. One aspect of impact evaluation is the management of beneficiaries' and donors' expectations.
Impact evaluation as a whole can lock evaluators into a theory of causal relationships, which may blind them to unintended consequences of the project. Most evaluations examine whether what the organization set out to do was achieved or not. As a result, the "do no harm" principle and the difficulty in reaching certain groups are ignored, because organizations are focused on demonstrating to donors that they did what they were asked to do.
Having the learning process driven by local actors is both key and ideal. Learning should not be imposed from the top or by donors, but rather should result from wrestling with key questions that will have a meaningful impact on the next phase of implementation. This may or may not be impact evaluation.
There was general agreement that it has been a struggle to improve and strengthen capacity on evaluation. One contributing factor mentioned was high staff turnover.
