
Michael Yeap...PhD Candidate: 20101127 - Evaluation research/method (Patton)


Saturday, November 27, 2010

20101127 - Evaluation research/method (Patton)

Evaluation Research: an interview with Dr Patton

Interviewed by Lisa Waldick.

Comments: Dr Michael Quinn Patton is well known for his evaluation research methods, which sparked my interest in reading this interview. Many development programs are evaluated to determine how effective and useful they are. But how effective and useful are the evaluations themselves? Internationally renowned evaluator Michael Quinn Patton recently came to IDRC (International Development Research Centre) to discuss his approach for making sure evaluations are useful for decision-makers. Dr Patton is the head of an organizational development consulting business, Utilization-Focused Information and Training. Known for five influential books on evaluation, including Qualitative Evaluation and Research Methods, he was the 1984 recipient of the Alva and Gunnar Myrdal Award from the Evaluation Research Society for "outstanding contributions to evaluation use and practice".


Why is it important to evaluate development programs? Don't people on the ground just intuitively understand what is going on in programs?

Our very processes of taking in information distort reality; all the evidence of social science indicates this. We have selective perception: some of us have rose-coloured glasses, some of us are gloom-and-doomers. We are not neutral; there is an emotional content to information. We need disciplined techniques to be able to stand back from that day-to-day world and really be able to see what is going on.

http://michaelyeap.blogspot.com/2010/11/20101127-evaluation-researchmethod.html

25/1/2013


We need approaches to help us stand back from our tendency to have biases, prejudices, and preconceptions.

Can you give an example of how a preconception can influence a project?

There are examples in development that are legendary. One agriculture project grew a bean that cooked faster so it would use less fuel — and therefore you would partly reduce deforestation. But there was a lot of resistance to adopting it. Part of the resistance was because one of the few times women in this culture were able to socialize with each other was when they were cooking. They didn't want a fast-cooking bean. Those are the kinds of things evaluators see when they go in — they see the things that people can't see because they are too close to it. Evaluation is about standing back and being able to see things through somebody else's eyes.

What distinguishes your approach to evaluation?

One of the ways you can distinguish different evaluation approaches is by what they take as their bottom line for the evaluation. For me, it is the pragmatic use of evaluation findings and the evaluation process. In other words, the evaluation is designed and implemented in a way that really makes a difference to improving programs and improving decisions about programs. So the bottom line in my approach is use — that's the reason why my approach is called utilization-focused evaluation.


How do you think the usefulness of evaluations can be increased?

In the timing of the evaluation, for example, it means that you time the findings to match when decisions are really going to be made. A lot of evaluations take place at the end of a project. An evaluation report gets written and it's a very good piece of work. But all the decisions about the future of the project have already been made by the time the evaluation gets done. On paper, it appears to make sense to do an evaluation right at the end of the project to try to capture everything that's gone on. But it turns out not to be useful to do that. Everything that is going to be decided about the future of a program gets decided before the end of the program.


Comments: Evaluation at the end of a program is summative evaluation. Formative evaluation should be done so that improvements can be made during the program, and findings can be used to support decision-making about the future of the program.

How can evaluations inform decision-making?

By knowing what questions the decision-makers bring to a project. So, for example, you have to know: is there consideration being given to expanding the project from one part of the world to another — to adapt the intervention to a new ecosystem or a new group of people? Or do decision-makers already know that resources are declining, and the real question is: can we do more with less? Knowing what the decision context is lets you gather data that is relevant. A lot of evaluations get designed generically. When decision-makers get them, the response is: "Well, that's interesting. But it doesn't help me with my decision. It doesn't answer my question."


What do you see as the difference between research and evaluation?

There's a whole continuum of different kinds of evaluation and different kinds of research. However, on the whole, the purpose of evaluation is to produce useful information for program improvements and decision-making, while the purpose of research is to produce knowledge about how the world works. Because research is driven by the agenda of knowledge production, the standards for evidence are higher and the timelines for generating knowledge can be longer. In evaluation, there are very concrete deadlines for when decisions have to get made and when program action has to be taken. It often means that the levels of evidence involve less certainty than they would under a research approach and that the timelines are much shorter.


If you don't have the highest possible levels of evidence in the evaluation, isn't there a risk of making bad decisions?

In the real world, you don't have perfect knowledge and decisions are going to get made anyway. When a program is coming to an end and a decision has to get made about it, the decision is going to get made whether or not you have perfect knowledge. If you are saying, "No, don't decide now. Wait until I have perfect knowledge," the train is going to pass. The reality is that it's better to have some information in a timely fashion than to have perfect information too late to get used.

What is participatory evaluation? What are its advantages?

Participatory evaluation means involving people in the evaluation — not only to make the findings more relevant and more meaningful to them through their participation, but also to build their capacity for engaging in future evaluations and to deepen their capacity for evaluative thinking. So, let's say you want to do a serious evaluation and you are trying to decide whether to have an external person do it or to do it internally. The external person may do a very good job of generating findings for you, but all the things they learn about how to do evaluation, they take away with them. If you do the evaluation with counterparts in the countries where you are working, then they get the opportunity not only to generate findings and know where those findings come from, but also to learn about evaluative thinking.

What is evaluative thinking?

Evaluative thinking includes a willingness to do reality testing, to ask the question: how do we know what we think we know? It means using data to inform decisions — not making data the only basis of decisions, but bringing data to bear on decisions. Evaluative thinking is not limited to evaluation projects; it's not even limited to formal evaluation. It's an analytical way of thinking that infuses everything that goes on.

What is the hardest thing to teach about evaluation?

The hardest thing I find to teach is how to go from data to recommendations. When you are doing an evaluation, you are looking at what has gone on — a history. But when you write recommendations, you are a futurist. Evaluations can help you make forecasts, but future decisions are not just a function of data. Making good, contextually grounded, politically savvy and doable recommendations is a sophisticated skill. A great evaluator can really show the strengths and weaknesses in a program and can gather good, credible data about what is working and not working. But that doesn't mean that they know how to turn that information into recommendations.


I actually prefer to involve the primary decision-makers who are going to use the evaluation in generating their own recommendations through a process of facilitation and collaboration. I encourage them to look at the data, consider the options, and then come up with their own recommendations in a context that includes their values, experience, and resources.

When is it good not to evaluate a project or program?

You can overdo evaluation, simply because people get sick of it. There are also times of crisis when you need to take action rather than study the questions. I've seen projects go down the tubes while people were studying and evaluating when in fact they needed to take action.

What do you think is the most important key to evaluation?

It is being serious, diligent and disciplined about asking the questions, over and over: "What are we really going to do with this? Why are we doing it? What purpose is it going to serve? How are we going to use this information?" This typically gets answered casually — "We are going to use the evaluation to improve the program" — without asking the more detailed questions: "What do we mean by improve the program? What aspects of the program are we trying to improve?" So a focus develops, driven by use.

Source: http://www.evaluationwiki.org/index.php/Michael_Quinn_Patton

Posted by Michael of MMU at 12:14 AM. Labels: evaluation research, Patton, program evaluation, research method

1 comment:

Michael of MMU

November 27, 2010 at 12:20 AM

He is the author of six evaluation books, including a 4th edition of "Utilization-Focused Evaluation" (Sage, 2008) and "Qualitative Research and Evaluation Methods" (2002, 3rd edition). Previous editions of these books have been used in over 300 universities worldwide. He is also the author of "Creative Evaluation" (2nd ed., Sage, 1987), "Practical Evaluation" (Sage, 1982) and "Culture and Evaluation" (editor, Jossey-Bass, 1985). He is co-author of "Getting to Maybe: How the World Is Changed", which applies complexity theory to social innovation and presents developmental evaluation for evaluating innovations (Random House Canada, 2006).

