VIETNAM AUSTRALIA MONITORING AND EVALUATION STRENGTHENING PROJECT - PHASE II

Findings of Study Tour to review ODA Evaluation Systems in Japan

Prepared for

MPI
2 Hoang Van Thu, Ba Dinh, Hanoi, VIETNAM

23 May 2006
42443867

Prepared by
Vietnam Australia Monitoring and Evaluation Strengthening Project

CONTENTS

1  Purpose and expected outputs
2  Evaluation organisation and functions
3  Summary of findings
   3.1  Ministry of Foreign Affairs (MOFA)
   3.2  Japan Bank for International Cooperation (JBIC)
   3.3  Foundation for Advanced Studies in Development (FASID)
   3.4  Japan Agency for International Cooperation (JICA)
   3.5  Japan Evaluation Society (JES)
4  Lessons learned
5  Recommendations
   5.1  Recommendations for short-term implementation
   5.2  Recommendations for medium-term implementation

ANNEXES
Annex 1: List of Participants
Annex 2: Detailed program

1 Purpose and expected outputs

The Project Document for the Vietnam Australia Monitoring and Evaluation Strengthening Project Phase II (VAMESP II) includes resources for study tours to build capacity in monitoring and evaluation (M&E). Evaluation of ODA investments is central to the pilot M&E system being developed by FERD/MPI with support from VAMESP II, and Japan has one of the most advanced ODA evaluation systems of any OECD country. A small group of senior GoV leaders therefore visited Japan to study evaluation systems, review case studies of sector, program and portfolio evaluations, learn about current issues and research in Japanese evaluation, and compare Japanese and Vietnamese evaluation case studies. A list of participants is attached at Annex 1.

The purpose of the study tour was to:
• See Japanese best-practice evaluation systems in operation
• Learn the methods and tools used by the Japanese government for evaluation of ODA
• Understand how evaluation supports investment and management decisions in Japan
• See how professional evaluation societies build the capacity of evaluation practitioners.

Outputs expected from the study tour included a detailed understanding of how to establish, implement and use evaluation systems for public investment to support results-based management and effective investment formulation. Case study examples of good-practice ODA evaluation are expected to inform the pilot Vietnamese evaluation system.

As detailed in the program attached at Annex 2, during 4 days in Japan the participants met with 5 Japanese agencies to review case studies of best-practice evaluation partnerships in Japan.

2 Evaluation organisation and functions

Evaluation of ODA in Japan is carried out by the Ministry responsible for ODA (MOFA) and its agencies (JICA and JBIC), with support from FASID (training and research) and JES (an association for practitioners from government, academia, the private sector and NGOs). The governing regulation is the Government Policy Evaluation Act (2004), which is administered by the Ministry of Internal Affairs and Communications.

The key functions of each institution can be summarised as:
• MOFA – policy-level evaluation and collation of evaluation results from all ODA
• JBIC and JICA – program-level evaluation, project-level evaluation and use of lessons learned to inform decisions about new investments
• FASID – research and training for improved evaluation methods and tools
• JES – culture change and development of a cadre of professional evaluators.

This integrated system ensures:
• systematic evaluation of the implementation and outcomes of ODA policy, programs and projects
• research for continuous improvement of evaluation methods, tools and capacity
• professional development of evaluation managers and practitioners
• public reporting and accountability of evaluation results and lessons learned.

The following diagrams summarise the organisational relationships and functions of ODA evaluation in Japan.

[Diagrams: organisational relationships and functions of ODA evaluation in Japan]

3 Summary of findings

3.1 Ministry of Foreign Affairs (MOFA)

At MOFA the GOV participants met with Mr Nobuki SUGITA (Deputy Director General of the Economic Cooperation Bureau), Mr Yukio YOSHI (Director of the Aid Planning Division in the Economic Cooperation Bureau), Mr Takeshi SHIIHARA (Assistant Director of the Aid Planning Division) and Ms Naoko UEDA (Deputy Director of the Aid Planning Division). In 2 hours of discussions and presentations, the key findings from MOFA included:

• Importance of overarching legislation to support evaluation – the Government Policy Evaluation Act (2004) provides a framework for institutional arrangements for evaluation in all national government agencies. It also provides a mechanism for recurrent budget allocation to the evaluation plans included in agency budget submissions through the normal budget process.

• Use of a simple PDCA cycle – MOFA uses a simple cycle of plan, do, check, act to structure evaluation into all its activities. This is designed to provide a flow of information to support policy decisions, budget allocation decisions and operational decisions. This is consistent with international good practice for management of investment and emphasises the importance of simple step-by-step systems to build an effective management culture.

• Use of 4 steps in evaluation – consistent with the Vietnam M&E Manual and the training materials developed by FERD/MPI with support from VAMESP II, MOFA uses four steps in evaluation: preparation of the evaluation logframe; planning and development of methods and tools; field work and data collection; and reporting and feedback. (A sketch of this cycle follows this list.)

• National ministry evaluations focus on effectiveness – because of budget constraints and pressure from the public on transparency and effective use of tax revenues, the main focus of MOFA evaluations is effectiveness. This means that data on outputs and outcomes is synthesised and used to report on actual achievement of policy and program purpose.

• Use of third party evaluators to ensure independent and objective results – most of the primary evaluation work in MOFA is outsourced to independent practitioners from consulting firms or universities. This requires MOFA staff to be skilled in evaluation in order to prepare effective TOR and to manage the quality of evaluation outputs.

• Use of joint evaluations to ensure partnership – the Government of Japan is placing increasing importance on joint evaluations with partner countries. The main constraint to this policy shift is the capacity and availability of partner country staff to participate in evaluations. Case studies of country assistance evaluations, prepared in partnership between Japan and the country government concerned, were presented for Cambodia and Tanzania.

• Efficiency and effectiveness of a systematic approach to evaluation – Efficiency: MOFA has clear delegation of roles and responsibilities for evaluation to all its subordinate agencies and organisation units at three levels (policy, program and project). Effectiveness: the evaluation system in MOFA and its agencies such as JICA and JBIC places importance on feedback and use of lessons learned to inform decisions. Ultimately this is reflected in budget requests to MOF, which reviews higher-level evaluation reports before finalising budget decisions.
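To make the cycle concrete, the sketch below represents the PDCA cycle and the four evaluation steps as a simple repeating state machine. This is a minimal illustration of the process described above, not code from any MOFA system; the mapping of the four steps onto the PDCA stages is our paraphrase of the findings.

```python
from enum import Enum

# Minimal sketch: the PDCA cycle mapped onto the four evaluation steps.
# Step descriptions are paraphrased from the findings above (assumption).
class Step(Enum):
    PLAN = "prepare evaluation logframe; plan and develop methods and tools"
    DO = "field work and data collection"
    CHECK = "synthesise output/outcome data; report against purpose"
    ACT = "feed lessons back into policy, budget and operational decisions"

def next_step(step: Step) -> Step:
    """Advance to the next stage, wrapping back to PLAN after ACT."""
    order = list(Step)
    return order[(order.index(step) + 1) % len(order)]

step = Step.PLAN
for _ in range(len(Step)):  # walk one full cycle, then the cycle repeats
    print(f"{step.name}: {step.value}")
    step = next_step(step)
```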

3.2 Japan Bank for International Cooperation (JBIC)

At JBIC the GOV participants met with Mr Shigeru TAKEDA (Senior Executive Director of JBIC), Mr Ryutaro KOGA (Director General of the Development Assistance Operations Evaluation Office in the Project Development Department), Mr Yoshio WADA (Director of the Evaluation Planning Division in the Development Assistance Operations Evaluation Office), Mr Asahiko KARASHIMA (Deputy Director of the Partnership Strategy Division) and Ms Toyoko KODAMA (Evaluation Officer of the Development Assistance Operations Evaluation Office). In 2½ hours of discussions and presentations, the key findings from JBIC included:

• Systematic approach to evaluation – JBIC prescribes a systematic process for evaluation that clearly guides all staff and clients. The process assigns roles and responsibilities at project, program and policy levels and was in place and operational before the GPEA (2004) was enacted.

• Use of 4 steps and 5 criteria in evaluation – consistent with the Vietnam M&E Manual and the training materials developed by FERD/MPI with support from VAMESP II, JBIC uses the four steps in evaluation mentioned above and the five OECD-DAC evaluation criteria: relevance, efficiency, effectiveness, impact and sustainability.

• Use of a systematic rating system to benchmark performance – the highly developed rating system used by JBIC enables qualitative and quantitative data for all 5 evaluation criteria to be assessed and quantified for systematic benchmarking, which provides justification for budget allocation decisions by MOF. This system has been adopted by the Ministry of Finance in Bangkok (one of the lessons learned from the October 2005 Study Tour) and is under active consideration for piloting by FERD/MPI with support from VAMESP II. (A sketch of such a rating calculation follows this list.)

• Use of third party evaluators to ensure independent and objective results – primary evaluation work is outsourced by JBIC to independent practitioners from consulting firms or universities. This requires JBIC staff to be skilled in evaluation in order to prepare effective TOR and to manage the quality of evaluation outputs.

• Use of an independent panel of experts to review evaluation – objectivity and evaluation quality are ensured by an independent panel of experts that reports directly to the JBIC Board. The experts are mostly academics but also include some people from NGOs. JBIC uses the panel to build credibility in evaluation results and to ensure that key lessons learned are communicated to the highest levels.

• Use of a socially recognised expert to independently verify results – wherever possible, JBIC uses a socially recognised expert or leader to independently verify evaluation results. This is primarily a mechanism to address pressure from the public on transparency and effective use of tax revenues.

• Systematic approach to analysis and feedback – JBIC places importance on feedback and use of lessons learned to inform management and investment decisions. Ultimately this is reflected in the quality of outputs and outcomes.

• Systematic approach to feedback including a lessons-learned database – through the use of a lessons-learned database and a standardised evaluation reporting format, JBIC ensures that evaluation results are used as feedback to activity managers as well as to inform those preparing new investments.

• Strong collaboration between the Operations Department and the Evaluation Office to ensure that lessons learned inform future operations – there are strong linkages across JBIC organisational units to ensure that independently derived evaluation results are actively used during management and formulation of investments. This collaboration is driven by a continuous improvement culture and places importance on effectiveness and quality.
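As an illustration of how such a rating system supports benchmarking, the sketch below scores the five DAC criteria and ranks projects on a composite rating. The 1-4 scale, the equal weighting and the project names are assumptions for illustration only; the report does not describe JBIC's actual scale or weights.

```python
# Illustrative sketch only: JBIC rates projects against the five OECD-DAC
# criteria and quantifies the results for benchmarking. A 1 (poor) to
# 4 (highly satisfactory) scale with equal weights is assumed here.
DAC_CRITERIA = ["relevance", "efficiency", "effectiveness", "impact", "sustainability"]

def composite_rating(scores: dict[str, int]) -> float:
    """Average the per-criterion scores into a single benchmark value."""
    missing = set(DAC_CRITERIA) - set(scores)
    if missing:
        raise ValueError(f"missing criteria: {sorted(missing)}")
    return sum(scores[c] for c in DAC_CRITERIA) / len(DAC_CRITERIA)

# Benchmark a (hypothetical) portfolio by ranking on the composite rating.
portfolio = {
    "Project A": {"relevance": 4, "efficiency": 2, "effectiveness": 3, "impact": 3, "sustainability": 2},
    "Project B": {"relevance": 3, "efficiency": 3, "effectiveness": 4, "impact": 3, "sustainability": 3},
}
ranked = sorted(portfolio, key=lambda p: composite_rating(portfolio[p]), reverse=True)
print(ranked)  # ['Project B', 'Project A']
```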

The VAMESP II Project Officer (IT) returned to JBIC the following afternoon to better understand the Lessons Learned Database, which has been acknowledged as useful and comprehensive. The database is operated and maintained by the Evaluation Office with support from the IT Department; officers in the Evaluation Office have been using it since 1999, and it holds information dating back to 1995.

Each record in the database has a unique identifier, lesson type(s) and a summarised free-text description that is linked to the project from which the lesson(s) were drawn. Lessons are categorised by:

• Countries – where JBIC has investments.
• Sectors – the sector categories were agreed in 1995, have not changed much since then, and are used consistently across all JBIC databases. Some complicated sectors are divided into sub-sectors: for instance, the transport sector includes sea, railway and road transport sub-sectors.
• Project phase – Procurement, Cost management, Operation and Maintenance, and Effectiveness and Efficiency.
• Evaluation type – Ex-ante, Mid-term and Terminal.

However, these lessons are not directly linked with the evaluation reports, so users have to look for detailed lessons in the separate project database, using the unique project identifier to link lessons learned and evaluation reports. The project database, which stores reports relating to projects, is easily accessed over the intranet.

The diagram below depicts the institutional arrangement and the data flow of the lessons-learned database. The Evaluation Office (15 officers, of whom 3 evaluation officers are responsible for updating the lessons-learned database) receives evaluation reports and lists of evaluated projects from the Operations Department. The data flow, supported by the IT Department, is:

• make an Excel spreadsheet of all received lessons learned
• update a local Access database
• update the lessons learned to the JBIC intranet SQL database
• link the lessons learned with the corresponding projects
• format the evaluation reports as PDF and publish them on the JBIC website and in the project database.

This flow shows that there is some redundancy of project information across the project and lessons-learned databases: the lessons-learned database keeps only short project information and summaries of lessons learned, while the project database provides a rich source of project profiles, implementation data and lesson identifiers linked to the lessons-learned database. Because of the different functions of each department, the Evaluation Office sees a need to decentralise information so that each department can easily maintain the data concerning its own work.

Both databases are accessible to all staff with an appropriate username and password over the JBIC intranet, allowing officers to look up all the information they need; other staff, however, can only access the query interface of the lessons-learned database. The 3 evaluation officers, who are competent in basic Office skills including Word, Excel and Access, are provided with a simple Access application for updating lessons learned. This application links to the JBIC local database in the IT Department, which then updates the project database with new links to lessons and publishes to the web database. Updating is done annually and requires only one day of work, and training for the 3 officers was also short because of the simplicity of the application. A schema sketch of the two linked databases follows below.
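The two-database arrangement can be sketched as a pair of tables joined through the unique project identifier. The schema below is a minimal illustration assuming SQLite; all table and column names are hypothetical, since the report does not give the real JBIC schema.

```python
import sqlite3

# Minimal sketch of the structure described above: a rich project database
# and a lessons-learned database holding only short summaries, joined
# through the unique project identifier shared by both.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE project (
    project_id   TEXT PRIMARY KEY,   -- unique identifier shared by both databases
    country      TEXT,               -- categorised by country with JBIC investments
    sector       TEXT,               -- sector list agreed in 1995
    sub_sector   TEXT,               -- e.g. sea, railway or road transport
    report_pdf   TEXT                -- published evaluation report
);
CREATE TABLE lesson (
    lesson_id    INTEGER PRIMARY KEY,
    project_id   TEXT REFERENCES project(project_id),
    phase        TEXT,  -- Procurement, Cost management, O&M, Effectiveness and Efficiency
    eval_type    TEXT,  -- Ex-ante, Mid-term, Terminal
    summary      TEXT   -- short free-text description of the lesson
);
""")

con.execute("INSERT INTO project VALUES ('VN-001', 'Vietnam', 'Transport', 'Road', 'vn001.pdf')")
con.execute("INSERT INTO lesson (project_id, phase, eval_type, summary) "
            "VALUES ('VN-001', 'Procurement', 'Terminal', 'Pre-qualify contractors early.')")

# A user who finds a lesson follows project_id back to the full report.
row = con.execute("""
    SELECT l.summary, p.report_pdf
    FROM lesson l JOIN project p ON p.project_id = l.project_id
    WHERE p.sector = 'Transport'
""").fetchone()
print(row)  # ('Pre-qualify contractors early.', 'vn001.pdf')
```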

According to staff interviewed at the Evaluation Office, the database provides comprehensive lessons learned that are very useful for learning from experience and applying it to similar projects and situations. They also acknowledge a strong connection between monitoring and evaluation that is not yet shown clearly in the databases. It is therefore planned to improve the databases in the near future, including further linking the project and lessons-learned databases to reduce redundancy.

3.3 Foundation for Advanced Studies in Development (FASID)

At FASID the GOV participants met with Professor Naonobu MINATO (Acting Director) and Mr Tadashi KIKUCHI (Program Officer). The meeting was opened by Ambassador Toshio TSUNOZAKI, the Executive Director of FASID. In 2 hours of discussions and presentations, the key findings from FASID included:

• The use of PCM and PDM as foundations for systematic evaluation – FASID research and academic affiliates have invested considerable effort in the systematisation of evaluation through the use of Project Cycle Management (PCM) and Project Design Matrix (PDM) methods and tools. The M&E Manual developed by FERD/MPI with support from VAMESP II has adapted some of these methods and tools to Vietnam where appropriate. Resource materials on PCM and PDM are available through the M&E Resource Centre at FERD/MPI.

• The use of 5 DAC criteria for evaluation – FASID recommends the use of the 5 DAC criteria for evaluation (relevance, efficiency, effectiveness, impact and sustainability). This is consistent with the M&E Manual in Vietnam and all the pilot evaluations conducted in Vietnam with support from VAMESP II.

• There are tools available for program evaluation – FASID research has built on international experience to refine tools for use in program evaluation, including the hierarchy tree, the logic tree, and the program logical framework with a hierarchy of indicators and case studies. The Draft National M&E Strategic Plan developed by FERD/MPI with support from VAMESP II has used the hierarchy tree tool for the program of investments proposed in the strategic plan.

• There are ways of evaluating programs and plans such as the SEDP – a case study of Mie Prefecture in Japan demonstrated how a hierarchy of indicators is used for evaluating projects, programs, policies and pillars or goals. This is relevant to the SEDP in Vietnam and highlighted the opportunity to use congruent indicators that link project-level indicators (a large number, focused on activities and outputs) with program-level indicators (a smaller number, focused on outputs and outcomes), with policy or sector indicators (around 5 indicators per policy or sector, focused on outcomes and purpose), and with goal indicators (around 5 macro-economic indicators, focused on purpose and goal). A sketch of such a hierarchy follows this list.

• Evaluation should focus on quality and accountability – in the experience of FASID, evaluation gives the best return on investment when it is focused on improving the quality of investment (for example through management for development results) and on providing transparent information to citizens that accounts for the outcomes of government policies and investments.

• Government commitment to evaluation supports coordination – FASID advised that efficient evaluation relies on political will extended from a government committed to transparency and accountability. In Japan this is set out in the Government Policy Evaluation Act (2004), as already mentioned under MOFA. In such a case coordination is possible because it is assigned to one government agency.

• A case study of the ex post evaluation of Bach Mai Hospital was presented by FASID – demonstrating how leadership led to effective delivery of ODA outcomes.
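The congruent indicator hierarchy described in the Mie Prefecture case study can be sketched as a tree in which many project indicators roll up to progressively fewer program, policy and goal indicators. The node structure and the example indicator names below are assumptions for illustration; they are not taken from the case study.

```python
from dataclasses import dataclass, field

# Sketch of a hierarchy tree of congruent indicators: many project
# indicators feed fewer program indicators, then around five policy or
# sector indicators, then around five goal-level indicators.
@dataclass
class IndicatorNode:
    name: str
    level: str                      # 'goal', 'policy', 'program' or 'project'
    children: list["IndicatorNode"] = field(default_factory=list)

    def count_by_level(self, counts=None):
        """Tally how many indicators sit at each level of the tree."""
        counts = counts if counts is not None else {}
        counts[self.level] = counts.get(self.level, 0) + 1
        for child in self.children:
            child.count_by_level(counts)
        return counts

goal = IndicatorNode("GDP growth", "goal", [
    IndicatorNode("Transport sector efficiency", "policy", [
        IndicatorNode("Rural roads upgraded (km)", "program", [
            IndicatorNode("Road sections completed", "project"),
            IndicatorNode("Maintenance contracts let", "project"),
        ]),
    ]),
])
print(goal.count_by_level())  # {'goal': 1, 'policy': 1, 'program': 1, 'project': 2}
```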

3.4 Japan Agency for International Cooperation (JICA)

The GOV participants met with 12 JICA staff, led by Mr Seiji KOJIMA (Vice-President of JICA and Chair of the JICA Evaluation Study Committee), Mr Fumio KIKUCHI (Resident Representative of JICA Vietnam), Mr Satoru KOHIYAMA (Director General of Regional Department I [SE Asia]), Ms Satoko MIWA (Director of the Office of Evaluation in the JICA Planning and Coordination Department and Co-Chair of the OECD-DAC Committee on Evaluation) and Mr Kazuaki SATO (Deputy Director, Office of Evaluation). They provided the GOV participants with many resources, including the JICA Evaluation Handbook (consistent with the methods and tools set out in the VAMESP II M&E Manual), the JICA Evaluation Reports for 2003, 2004 and 2005, and distance learning materials. Many of these resources are also available on the JICA website. In more than 3 hours of animated discussions and presentations, the key findings from JICA included:

• Evaluation is overseen by an independent group of experts – to ensure objectivity and oversee the quality of evaluation, JICA established an Advisory Committee on Evaluation in 2002 (chaired by Professor Muta, who we met at JES [see below]) that reports to the Board of Vice Presidents through the Evaluation Study Committee (chaired by Mr Kojima, who we met at JICA). These independent experts provide advice to the Board of Vice Presidents to ensure the quality and effective use of evaluation results.

• Evaluation is part of a systematic approach to continuous improvement – under the Government Policy Evaluation Act (2004) JICA is obliged to conduct evaluation. JICA has developed institutional arrangements, methods and tools for systematic evaluation of ODA policies, programs and projects; the agency has conducted evaluation systematically since 1988 and now publishes an annual evaluation report to ensure tax-payer commitment to international cooperation activities.

• There are 3 pillars for effective evaluation – JICA ensures effective utilisation of evaluation results by: (i) expanding the coverage of evaluation; (ii) improving evaluation quality; and (iii) ensuring objectivity and transparency.

• Evaluation systems should ensure objectivity and independence – JICA's major concern is that Japanese citizens have confidence in the international cooperation investments made by the Government of Japan and managed by JICA. For evaluation results to be credible, they must be objective and transparent.

• Evaluation services are outsourced to ensure objectivity – JICA uses consultants, NGOs and academics to conduct project and program evaluations. The Office of Evaluation has 12 staff but an annual budget of US$10 million for terminal and ex post evaluation (US$9m) and program evaluation (US$1m). In addition, the Operations Department has a budget of US$7m within existing or planned projects for ex ante and mid-term evaluations.

• Staff that manage evaluations should be trained in evaluation – even though JICA staff do not conduct evaluations themselves, they are trained in evaluation to ensure that they understand the importance of evaluation results and can effectively manage the consultants implementing evaluations, to ensure quality and objective results.

• National agencies should focus on policy and program evaluation – JICA is placing increasing emphasis on program and "synthesis" or meta-evaluations to provide better feedback on policy decisions and regional investment strategies.

• Line agencies should focus on project evaluation – in parallel with the shift to program evaluations by national agencies, there is increasing delegation of project evaluations to country offices.

• A leader is institutionalised to champion evaluation in each agency unit – JICA recently assigned one existing staff member in most organisational units of the agency to the position of "Evaluation Chief". There are now 60 evaluation chiefs in JICA headquarters and 56 in country offices. The role is to develop an evaluation culture that ensures commitment to evaluation results and their effective use.

• Secondary evaluations are used for quality assurance – JICA uses secondary evaluations by the Advisory Committee on Evaluation to ensure quality. In 2005 the ACE evaluated all terminal evaluations to check for consistency and quality, using a secondary evaluation checklist. Details are reported in the 2005 Annual Evaluation Report (pp112-140). A sketch of such a checklist follows this list.

• Capacity of staff and consultants can be built with distance learning resources – JICA worked with the World Bank Institute to develop distance learning modules for evaluation that can be used by staff in Japan, country office staff and consultants. The materials are all available on the JICA website in both PowerPoint and video formats. Some FERD/MPI and VAMESP II staff have used these training materials (see p41 of the JICA 2004 Annual Evaluation Report).

• Knowledge management enables best use of evaluation results – as evaluation becomes established as a routine function, JICA has developed web-based and internal knowledge management systems to ensure that lessons learned are effectively used for the formulation of new investments and for management for development results. The system is available on the web in Japanese, and lessons learned are fed back to JICA staff and consultants.
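A secondary (meta-) evaluation of the kind used by the ACE can be sketched as a checklist applied to each primary evaluation report. The checklist items and pass-rate calculation below are assumptions for illustration; the report notes only that a secondary evaluation checklist was used.

```python
# Illustrative sketch of a secondary-evaluation quality check. The
# checklist items are hypothetical, drawn from the findings above.
CHECKLIST = [
    "evaluation followed the prescribed four steps",
    "all five DAC criteria were rated",
    "data sources are documented",
    "lessons learned are stated and actionable",
]

def secondary_evaluation(report: dict) -> tuple[float, list[str]]:
    """Return the share of checks passed and the list of failed items."""
    failed = [item for item in CHECKLIST if not report.get(item, False)]
    score = 1 - len(failed) / len(CHECKLIST)
    return score, failed

score, failed = secondary_evaluation({
    "evaluation followed the prescribed four steps": True,
    "all five DAC criteria were rated": True,
    "data sources are documented": False,
    "lessons learned are stated and actionable": True,
})
print(f"{score:.0%} of checks passed; follow up on: {failed}")
# 75% of checks passed; follow up on: ['data sources are documented']
```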

3.5 Japan Evaluation Society (JES)

At JES the GOV participants met with Professor Hiromitsu MUTA (former Vice President of JES and current Chair of the JICA Advisory Committee on Evaluation), Dr Masaoki TAKEUCHI (Executive Managing Director of JES and of the International Development Centre of Japan), Mrs Yoko ISHIDA (Senior Researcher at IDCJ and member of JES) and Mrs Mimi Sheikh (Senior Evaluation Specialist at IDCJ and member of JES). In 2 hours of discussions and presentations, the key findings from JES included:

• National legislation helps develop evaluation systems – the Government Policy Evaluation Act (2004) was a strong driver for the development of an evaluation culture for public investment in Japan. The effectiveness of this is demonstrated by the rapid increase in evaluation agencies in government organisations – from 59 in 2003 to 355 in 2005. It also enabled the development of an annual plan of evaluations that is financed in each agency through the budget process. Evaluation is driven by one agency (the Ministry of Internal Affairs and Communications) and its results are used closely for budget allocation decisions by the Ministry of Finance.

• National and local government agencies use different evaluation methods – national agencies in Japan focus on policy and program evaluation. Local government agencies at Prefecture and City levels use evaluation for projects, with a focus on outputs and outcomes.

• Public investment uses 5 evaluation criteria – in Japan 5 criteria are used for evaluating policy: necessity, efficiency, effectiveness, fairness and priority. This is a localised version of the DAC evaluation criteria used for ODA.

• Investments must have a plan if they are to be efficiently evaluated – all policies and projects evaluated in Japan have planned outputs and outcomes against which actual performance can be evaluated. The importance of adequate plans for public investments is reinforced by administrative requirements imposed by the Ministry of Finance and the Ministry of Internal Affairs and Communications.

• Evaluation reports can contribute to budget allocation decisions – evaluation reports from public agencies are submitted to the Ministry of Internal Affairs and Communications for review and feedback (this applies to national agencies only) and are used by the Ministry of Finance in the budget allocation process.

• An evaluation society supports development of an evaluation culture – by bringing together public servants, private sector practitioners and academics, as well as students, an evaluation society builds a social consensus about the value and use of evaluation. JES uses semi-annual conferences, a journal and training programs to build this culture in Japan.

• A society builds capacity through conferences, a journal and structured training – semi-annual conferences, semi-annual journals and training courses that coincide with each conference are used to build evaluation capacity. The cost of this is covered by membership fees, corporate memberships and conference fees.

• An evaluation society can certify training but not consultants – JES is preparing to provide certified training courses for evaluation practitioners. Participants in these courses would receive certification once they have attended and been tested to a certain level of competence. However, JES does not plan to certify consultants as being competent to conduct evaluations. This is different from the Project Management Societies in the USA and Australia, which conduct certification of practitioners as well as training.

• An evaluation society established as a not-for-profit organisation can provide services to government – JES was reconstituted as a not-for-profit organisation so that it could contract with the Government to undertake high-level evaluations, as directed by the Government Policy Evaluation Act (2004). This particularly related to quality assurance for evaluation through mechanisms such as the secondary evaluations at JICA (see above).

4 Lessons learned

Analysis of the findings presented in Section 3 identified the following lessons learned that are relevant to Vietnam:

• Overarching legislation provides an effective framework for evaluation.
• It is very important to clearly assign responsibilities in the evaluation system: national agencies should focus on evaluation of policy and program effectiveness, while line agencies should focus on evaluation of project efficiency and effectiveness.
• A leader should be institutionalised to champion evaluation in each agency unit.
• Evaluation services should be outsourced to third party evaluators to ensure independent and objective results, and evaluation should be overseen by an independent group of experts.
• A professional evaluation society established as a not-for-profit organisation can provide services to government to support development of an evaluation culture, build capacity through conferences, a journal and structured training, and certify training.
• A systematic approach to evaluation is efficient and has the greatest impact. Project Cycle Management and Logical Frameworks are proven foundations for systematic evaluation, and evaluating programs and plans such as the SEDP is practicable.
• Evaluation reports can contribute to budget allocation decisions in alignment with the policy framework, helping programs and projects to achieve the set targets.
• Evaluation results contribute to continuous improvement through a systematic approach to feedback, including a lessons-learned database.

5 Recommendations

Based on the findings and lessons learned, there is an opportunity for the Government of Vietnam to support implementation of the following short-term and medium-term recommendations, which draw on the review of evaluation of policy and programs in Japan and build on the lessons learned from pilot evaluations implemented with support from the Vietnam Australia Monitoring and Evaluation Strengthening Project.

5.1 Recommendations for short-term implementation

In the next 12 months it is recommended that FERD/MPI lead GOV efforts in M&E to:

• Recommendation 1 – complete and institutionalise a national monitoring and evaluation strategic plan, with inputs from whole-of-government coordinated by FERD/MPI, to provide guidance for the establishment of an official national monitoring and evaluation system.

• Recommendation 2 – complete and institutionalise a national monitoring and evaluation manual to provide guidance for the practical implementation of evaluation work in Vietnam.

• Recommendation 3 – work with other GOV agencies and key donors through the PGAE to prepare a program of GOV and joint evaluations for 2007, to enable resource allocation to be planned as part of the 2007 budget process and the December CG meeting.

• Recommendation 4 – train selected staff in MPI, MOF, MOT, MARD, MOH, MOC and MoET, as well as Provincial DPI, in evaluation practice so that they can more effectively plan and manage evaluations.

• Recommendation 5 – establish a systematic rating system for evaluation results that can be used by MPI, MOF and line agencies to benchmark the performance of public investments in Vietnam, building on the pilot experience of VAMESP II.

• Recommendation 6 – prepare an Annual Evaluation Report on ODA in Vietnam for the December CG meeting that introduces the national M&E system, presents case study examples of pilot evaluations conducted by GOV, and presents the proposed program of GOV and joint evaluations for 2007 set out in Recommendation 3.

• Recommendation 7 – encourage the establishment of a professional society or not-for-profit organisation by evaluation practitioners with the purpose of:
  - supporting development of an evaluation culture
  - building capacity through conferences, a journal and structured training
  - providing services to government and other sectors
  - certifying evaluation training courses and materials.

5.2 Recommendations for medium-term implementation

In the next 2 years it is recommended that FERD/MPI lead GOV efforts in M&E to:

• Recommendation 8 – prepare and issue a Government Policy Evaluation regulation as overarching legislation to provide an effective framework for evaluation of public investments in Vietnam that:
  - requires national agencies to focus on evaluation of policy and program effectiveness
  - requires line agencies to focus on evaluation of project efficiency and effectiveness
  - requires new public investments to have a detailed plan against which performance can be monitored and evaluated
  - institutionalises a position of evaluation focal point in each agency responsible for public investment.

• Recommendation 9 – establish and institutionalise systematic functions and procedures for evaluation of public investment, consistent with the National M&E Strategic Plan, that:
  - enable continuous improvement through effective use of lessons learned from evaluation, by institutionalising close cooperation between units responsible for evaluation and units responsible for investment appraisal and implementation
  - use Project Cycle Management and Logical Frameworks as the foundations for systematic evaluation
  - ensure objective and transparent evaluations by separating those who evaluate from those who manage implementation
  - institutionalise a systematic approach to feedback, including a lessons-learned database in FERD/MPI for ODA and in ASD/MPI for Public Investment.

• Recommendation 10 – MPI to request the Prime Minister to establish an independent group of experts in the Office of Government or the State Audit Office to oversee evaluation of public investment in Vietnam.

Annex 1: List of Participants

The following Government of Vietnam staff and VAMESP II staff participated:

• Dr. Cao Viet Sinh, Vice Minister of Planning and Investment
• Mr. Nguyen Xuan Tien, Director of the International Cooperation Department, Office of Government
• Mr. Tran Quoc Phuong, Deputy Director of the Appraisal and Public Investment Supervision Department, MPI
• Mr. Kieu Tien Quang, Head of General Division, FERD/MPI
• Mr. Mai Huu Dung, Head of Japan and Northeast Asia Division, FERD/MPI
• Mr. Cao Manh Cuong, Deputy Director of the External Finance Department, Ministry of Finance
• Ms. Tran Thi Thu Trang, Secretary of Vice Minister/Interpreter
• Mr. John Fargher, Australian Team Leader of VAMESP II
• Ms. Nguyen Thi Hong Yen, Project Officer of VAMESP II

Annex 2: Detailed program

Sunday 23 April 2006
• Travel from Hanoi to Tokyo (via Hong Kong)

Monday 24 April 2006
• Ministry of Foreign Affairs (Economic Cooperation Bureau), Room 272, Kasumigaseki 2-2-1, Chiyoda-ku, Tokyo 100-8919, Japan. Tel: +81 (0)3-3580-3311
• Meeting with Mr Arakawa, Parliamentary Secretary for Foreign Affairs, MOFA
• Japan Bank for International Cooperation (JBIC), 4-1 Ohtemachi 1-chome, Chiyoda-ku, Tokyo 100-8144, Japan. Tel: 03(5218)3101, Fax: 03(5218)3955
• Lunch with JBIC
• Meeting with Mr Ito, Executive Director of JBIC

Tuesday 25 April 2006
• Foundation for Advanced Studies in International Development (FASID), Chiyoda Kaikan Building (4th and 5th floors), 1-6-17 Kudan-Minami, Chiyoda-ku, Tokyo 102-0074, Japan
• Meeting with the Director of the Economic Cooperation Department
• Local visit and dinner

Wednesday 26 April 2006
• Meeting with the Japan Evaluation Society, International Development Center of Japan, Kyofuku Building (4th and 5th floors), Tomioka 2-chome, Koto-ku, Tokyo 135-0047, Japan
• Courtesy meeting with Mr Sato and Mr Kojima, Vice Chairman of JICA
• JICA Office of Evaluation, 6th–13th floors, Shinjuku Maynds Tower, 1-1 Yoyogi 2-chome, Shibuya-ku, Tokyo 151-8558, Japan. Tel: +81-3-5352-5311/5312

Thursday 27 April 2006
• Meeting with the Ministry of Land, Infrastructure and Transport (Department of Planning and Programs)
• Lunch with the Ministry of Land, Infrastructure and Transport
• Field trip to Subway

Friday 28 April 2006
• Departure for return to Hanoi (via Hong Kong)
