
VIETNAM AUSTRALIA MONITORING

AND EVALUATION STRENGTHENING


PROJECT - PHASE II

Findings of Study Tour to review ODA Evaluation


Systems in Japan

Prepared for
MPI
2 Hoang Van Thu
Ba Dinh Hanoi VIETNAM

23 May 2006

42443867

Prepared by
Vietnam Australia Monitoring and Evaluation Strengthening Project

CONTENTS
1 Purpose and expected outputs

2 Evaluation organisation and functions

3 Summary of findings
3.1 Ministry of Foreign Affairs (MOFA)
3.2 Japan Bank for International Cooperation (JBIC)
3.3 Foundation for Advanced Studies on International Development (FASID)
3.4 Japan International Cooperation Agency (JICA)
3.5 Japan Evaluation Society (JES)

4 Lessons learned

5 Recommendations
5.1 Recommendations for short-term implementation
5.2 Recommendations for medium-term implementation

ANNEXES
Annex 1: List of Participants
Annex 2: Detailed program


1 Purpose and expected outputs


The Project Document for the Vietnam Australia Monitoring and Evaluation
Strengthening Project Phase II (VAMESP II) includes resources for study tours to build
capacity in monitoring and evaluation (M&E). Evaluation of ODA investments is central
to the pilot M&E system being developed by FERD/MPI with support from VAMESP II.
Japan has one of the most advanced ODA evaluation systems of any OECD country.

A small group of senior GoV leaders visited Japan to study evaluation systems. A list of
participants is attached at Annex 1. The purpose of the study tour was to:
• See the Japanese best-practice evaluation system in operation
• Learn the methods and tools used by the Japanese government for evaluation of ODA
• Understand how evaluation supports investment and management decisions in Japan
• See how professional evaluation societies build capacity of evaluation practitioners

Outputs expected from the study tour included a detailed understanding of how to
establish, implement and use evaluation systems for public investment to support results-
based management and effective investment formulation. Case study examples of good
practice ODA evaluation are expected to be provided to inform the pilot Vietnamese
evaluation system.

As detailed in the program attached at Annex 2, during 4 days in Japan participants met
with 5 Japanese agencies to review case studies of best-practice evaluation partnerships in
Japan; review case studies of sector, program and portfolio evaluations; learn about
current issues and research in Japanese evaluation; and compare Japanese and Vietnamese
evaluation case studies.


2 Evaluation organisation and functions


Evaluation of ODA in Japan is carried out by the Ministry responsible for ODA (MOFA)
and its agencies (JICA and JBIC) with support from FASID (training and research) and
JES (association for practitioners from government, academia, private sector and NGOs).
The governing regulation is the Government Policy Evaluation Act (2004), which is
administered by the Ministry of Internal Affairs and Communications.

This integrated system ensures:


• systematic evaluation of the implementation and outcomes of ODA policy, programs
and projects;
• research for continuous improvement of evaluation methods, tools and capacity;
• professional development of evaluation managers and practitioners; and
• public reporting and accountability of evaluation results and lessons learned.

The following diagrams (not reproduced in this text version) summarise the organisational
relationships and functions of ODA evaluation in Japan.

The key functions of each institution can be summarised as:


• MOFA - policy-level evaluation and collation of evaluation results from all ODA
• JBIC and JICA - program-level evaluation, project-level evaluation and using lessons
learned to inform decisions about new investments
• FASID - research and training for improved evaluation methods and tools
• JES – culture change and development of a cadre of professional evaluators.


3 Summary of findings
3.1 Ministry of Foreign Affairs (MOFA)
At MOFA the GOV Participants met with Mr Nobuki SUGITA (Deputy Director General
of the Economic Cooperation Bureau in MOFA), Mr Yukio YOSHI (Director of the Aid
Planning Division in the Economic Cooperation Bureau), Mr Takeshi SHIIHARA
(Assistant Director of the Aid Planning Division), and Ms Naoko UEDA (Deputy
Director of the Aid Planning Division).

In 2 hours of discussions and presentations, the key findings from MOFA included:
• Importance of overarching legislation to support evaluation – the Government
Policy Evaluation Act (2004) provides a framework for institutional arrangements
for evaluation in all national government agencies. This provides a mechanism for
recurrent budget allocation to evaluation plans included in agency budget submissions
through the normal budget process.
• Efficiency and Effectiveness of a systematic approach to evaluation:
- Efficiency: MOFA has clear delegation of roles and responsibilities for evaluation
to all its subordinate agencies and organisation units at three levels (policy,
program and project). This is designed to provide a flow of information to support
policy decisions, budget allocation decisions and operational decisions.
- Effectiveness: the evaluation system in MOFA and its agencies such as JICA and
JBIC places importance on feedback and use of lessons learned to inform
decisions. Ultimately this is reflected in budget requests to MOF, which reviews
higher level evaluation reports before finalising budget decisions.
• National ministry evaluations focus on effectiveness – because of budget
constraints and pressure from the public on transparency and effective use of tax
revenues, the main focus of MOFA evaluations is effectiveness. This means that data
on outputs and outcomes is synthesised and used to report on actual achievement of
policy and program purpose.
• Use of third party evaluators to ensure independent and objective results – most
of the primary evaluation work in MOFA is outsourced to independent practitioners
from consulting firms or universities. This requires MOFA staff to be skilled in
evaluation in order to prepare effective TOR and also to manage the quality of
evaluation outputs.
• Use of 4 steps in evaluation – consistent with the Vietnam M&E Manual and training
materials developed by FERD/MPI with support from VAMESP II, MOFA uses four
steps in evaluation: preparation of evaluation logframe; planning and development of
methods and tools; field work and data collection; reporting and feedback.
• Use of a simple PDCA cycle – MOFA uses a simple cycle of plan, do, check, act to
structure evaluation into all its activities. This is consistent with international good
practice for management of investment but emphasises the importance of simple step-
by-step systems to build an effective management culture.
• Use of joint evaluations to ensure partnership – Government of Japan is placing
increasing importance on joint evaluations with partner countries. The main
constraint to this policy shift is the capacity and availability of partner country staff to
participate in evaluations.


Case studies of country assistance evaluations were presented for Cambodia and
Tanzania. These had been prepared in partnership between Japan and the country
government concerned.

3.2 Japan Bank for International Cooperation (JBIC)


At JBIC the GOV Participants met with Mr Shigeru TAKEDA (Senior Executive
Director of JBIC), Mr Ryutaro KOGA (Director General of the Development Assistance
Operations Evaluation Office in the Project Development Department), Mr Yoshio
WADA (Director Evaluation Planning Division in the Development Assistance
Operations Evaluation Office), Mr Asahiko KARASHIMA (Deputy Director of
Partnership Strategy Division) and Ms Toyoko KODAMA (Evaluation Officer of the
Development Assistance Operations Evaluation Office).

In 2½ hours of discussions and presentations, the key findings from JBIC included:
• Systematic approach to evaluation – JBIC prescribes a systematic process for
evaluation that clearly guides all staff and clients. The process assigns roles and
responsibilities at project, program and policy levels and was in place and operational
before the GPEA (2004) was enacted.
• Systematic approach to feedback including a lessons-learned database – through
the use of a lessons learned database and standardised evaluation reporting format,
JBIC ensures that evaluation results are used for feedback to activity managers as well
as informing those preparing new investments.
• Use of an independent panel of experts to review evaluation – objectivity and
evaluation quality are ensured by an independent panel of experts that report directly
to the JBIC Board. The experts are mostly academics but also include some people
from NGOs. JBIC uses the panel of experts to build credibility in evaluation results
and ensure that key lessons learned are communicated to the highest levels.
• Use of a systematic rating system to benchmark performance – the highly
developed rating system used by JBIC enables qualitative and quantitative data for all
5 evaluation criteria to be assessed and quantified for systematic benchmarking. This
system has been adopted by the Ministry of Finance in Bangkok (one of the lessons
learned from the October 2005 Study Tour) and is under active consideration for
piloting by FERD/MPI with support from VAMESP II. A simple illustrative sketch of
such a rating is given after this list.
• Use of third party evaluators to ensure independent and objective results -
primary evaluation work is outsourced by JBIC to independent practitioners from
consulting firms or universities. This requires JBIC staff to be skilled in evaluation in
order to prepare effective TOR and also to manage the quality of evaluation outputs.
• Use of 4 steps and 5 criteria in evaluation - consistent with the Vietnam M&E
Manual and training materials developed by FERD/MPI with support from VAMESP
II, JBIC uses the four steps in evaluation mentioned above and the five OECD-DAC
evaluation criteria: relevance, efficiency, effectiveness, impact and sustainability.
• Systematic approach to analysis and feedback - JBIC places importance on
feedback and use of lessons learned to inform management and investment decisions.
Ultimately this is reflected in quality of outputs and outcomes, which provides
justification for budget allocation decisions by MOF.
• Use of socially recognised expert to independently verify results – wherever
possible, JBIC uses a socially recognised expert or leader to independently verify
evaluation results. This is primarily a mechanism to address pressure from the public
on transparency and effective use of tax revenues.
• Strong collaboration between Operations Department and Evaluation
Department to ensure that lessons learned inform future operations – there are
strong linkages across JBIC organisational units to ensure that independently derived
evaluation results are actively used during management and formulation of
investments. This collaboration is driven by a continuous improvement culture and
places importance on effectiveness and quality.
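
A minimal sketch of how such a criteria-based rating might work is shown below. This is illustrative only: the 4-point scale, equal weighting and example scores are assumptions made for the sketch, not JBIC's actual rating scheme, which was not documented in detail during the visit.

```python
# Hypothetical sketch only: a simple criteria-based rating for benchmarking
# evaluation results. The 4-point scale and equal weighting are illustrative
# assumptions, not JBIC's actual rating scheme.

DAC_CRITERIA = ("relevance", "efficiency", "effectiveness", "impact", "sustainability")

def overall_rating(scores):
    """Combine per-criterion scores (1 = poor ... 4 = very good) into one
    benchmark score so that projects can be compared systematically."""
    missing = [c for c in DAC_CRITERIA if c not in scores]
    if missing:
        raise ValueError("missing scores for: %s" % ", ".join(missing))
    return sum(scores[c] for c in DAC_CRITERIA) / len(DAC_CRITERIA)

# Example: one project's qualitative assessments converted to the 1-4 scale
project_scores = {
    "relevance": 4,
    "efficiency": 2,
    "effectiveness": 3,
    "impact": 3,
    "sustainability": 2,
}
print(overall_rating(project_scores))  # 2.8 - comparable across projects
```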

The Project Officer (IT) returned to JBIC the following afternoon to better understand the
lessons-learned database, which is acknowledged as useful and comprehensive. Officers in
the Evaluation Office have used the database since 1999, and it contains information dating
back to 1995. The database is operated and maintained by the Evaluation Office with the
support of the IT Department, which is responsible for updating information on the JBIC
intranet and website.

These lessons are categorised by:


• Countries – where JBIC has investments
• Sectors – sector categories were agreed in 1995 and have not changed much since then.
These sectors are used consistently across all JBIC databases. Some complicated sectors
are also divided into sub-sectors: for instance the transport sector includes sea transport,
railway and road transport sub-sectors.
• Evaluation types – including Ex-ante, Mid-term and Terminal. In addition, lessons are
categorised by project phase: Procurement, Operation and Maintenance, Cost
management, Effectiveness and Efficiency.

Each record in this database has a unique identifier, lesson type(s) and a summarised free-
text description that is linked to the project from which the lesson(s) were drawn.
However, these lessons are not directly linked with the evaluation reports, so users
have to look for detailed lessons in the separate project database, which is used to store
reports relating to projects. That database is easily accessed over the intranet, using the
unique project identifier to link lessons learned and evaluation reports.
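
The record structure described above can be sketched as follows. This is a hedged illustration: the field names and example values are assumptions made for the sketch, not the actual JBIC schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative sketch of one record in the lessons-learned database as
# described above. Field names are assumptions, not JBIC's actual schema.
@dataclass
class Lesson:
    lesson_id: str                      # unique identifier of the lesson record
    project_id: str                     # project the lesson(s) were drawn from
    country: str                        # country where JBIC has the investment
    sector: str                         # e.g. "Transport" (sector list agreed in 1995)
    sub_sector: Optional[str] = None    # e.g. "Railway" for complicated sectors
    evaluation_type: str = "Terminal"   # "Ex-ante", "Mid-term" or "Terminal"
    phases: List[str] = field(default_factory=list)   # e.g. ["Procurement"]
    summary: str = ""                   # summarised free-text description

example = Lesson(
    lesson_id="L-0123",
    project_id="P-VN-045",
    country="Viet Nam",
    sector="Transport",
    sub_sector="Road transport",
    phases=["Operation and Maintenance"],
    summary="Allow adequate time for land acquisition in urban road projects.",
)
print(example.project_id)  # key used to reach the detailed project database
```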

The diagram below depicts the institutional arrangements and data flow of the lessons-
learned database.


[Diagram: institutional arrangements and data flow of the lessons-learned database. The
Operations Department passes evaluation reports and the list of evaluated projects to the
Evaluation Office (15 officers, of whom 3 compile all received lessons learned into an Excel
spreadsheet and update a local Access database). The IT Department then loads the lessons
learned into the JBIC intranet SQL database (the lessons-learned database), links each
lesson to its project, and converts evaluation reports to PDF for publication on the JBIC
website and in the project database.]

This diagram shows that there is some redundancy of project information between the
project and lessons-learned databases. However, because of the functions of each
department, they see a need to decentralise information so that each unit can easily
maintain the data relevant to its own work. In addition, both databases are accessible to
all staff with an appropriate username and password over the JBIC intranet, which allows
officers to find all the information they need. For example, the lessons-learned database
keeps only brief project information and summaries of lessons learned, while the project
database provides a rich project profile and all related documents, including evaluation
reports, implementation data and lesson identifiers linked to the lessons-learned database.
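
A minimal sketch of the cross-database lookup described above is given below, with in-memory dictionaries standing in for the intranet SQL and Access databases; all identifiers, titles and file names are invented for illustration.

```python
# Minimal sketch of the lookup described above. A lesson record holds only a
# short summary plus the shared project identifier; the separate project
# database holds the full profile and the evaluation reports. In-memory
# dictionaries stand in for the actual databases; all values are invented.

lessons_db = {
    "L-0123": {"project_id": "P-VN-045",
               "summary": "Allow adequate time for land acquisition."},
}

project_db = {
    "P-VN-045": {"title": "Urban road improvement",
                 "evaluation_reports": ["ex_post_evaluation_2004.pdf"],
                 "lesson_ids": ["L-0123"]},
}

def reports_for_lesson(lesson_id):
    """Follow a lesson's project identifier to the detailed evaluation reports."""
    project_id = lessons_db[lesson_id]["project_id"]
    return project_db[project_id]["evaluation_reports"]

print(reports_for_lesson("L-0123"))  # ['ex_post_evaluation_2004.pdf']
```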

The 3 evaluation officers responsible for updating the lessons-learned database are
competent in basic Office skills including Word, Excel and Access. While other staff can
only access the query interface of the lessons-learned database, those 3 officers are
provided with a simple Access database for updating lessons learned, which is done
annually and requires only one day of work. This Access application links to the JBIC
local database in the IT Department, which then updates the project database with new
links to lessons and publishes to the web database. Training for those 3 officers was also
short because of the simplicity of the application.

According to staff interviewed at the Evaluation Office, this database provides
comprehensive lessons learned, which are very useful for drawing on experience and
applying it to similar projects and situations. The office is also considering linking the
project and lessons-learned databases more closely to reduce redundancy. Furthermore,
staff acknowledge a strong connection between monitoring and evaluation that is not yet
shown clearly in their databases. Further improvement of the databases is therefore
planned in the near future.

3.3 Foundation for Advanced Studies on International Development (FASID)


At FASID the GOV Participants met with Professor Naonobu MINATO (Acting
Director) and Mr Tadashi KIKUCHI (Program Officer). The meeting was opened by
Ambassador Toshio TSUNOZAKI, the Executive Director of FASID.

In 2 hours of discussions and presentations, the key findings from FASID included:
• The use of PCM and PDM as foundations for systematic evaluation – FASID
research and academic affiliates have invested a considerable amount of effort in the
systematisation of evaluation through the use of Project Cycle Management (PCM)
and Project Design Matrix (PDM) methods and tools. The M&E Manual developed
by FERD/MPI with support from VAMESP II has adapted some of these methods and
tools to Vietnam where appropriate. Resource materials on PCM and PDM are
available through the M&E Resource Centre at FERD/MPI.
• The use of 5 DAC Criteria for evaluation – FASID recommends the use of the 5
DAC criteria for evaluation (relevance, effectiveness, efficiency, impact and
sustainability). This is consistent with the M&E Manual in Vietnam and all the pilot
evaluations conducted in Vietnam with support from VAMESP II.
• Evaluation should focus on quality and accountability – in the experience of
FASID, evaluation gives the best return on investment when it is focussed on
improving the quality of investment (for example through management for
development results) and providing transparent information to citizens that accounts
for the outcomes of government policies and investments.
• There are tools available for program evaluation – FASID research has built on
international experience to refine tools for use in program evaluation including the
hierarchy tree, logic tree, program logical framework including a hierarchy of
indicators and case studies. The Draft National M&E Strategic Plan developed by
FERD/MPI with support from VAMESP II has used the hierarchy tree tool for the
program of investments proposed in the strategic plan.
• There are ways of evaluating programs and plans such as SEDP – a case study of
Mie Prefecture in Japan demonstrated how a hierarchy of indicators is used for
evaluating projects, programs, policies and pillars or goals. This is relevant to the
SEDP in Vietnam and highlighted the opportunities to use congruent indicators that
link project-level indicators (a large number focused on activities and outputs); with
program-level indicators (a smaller number focused on outputs and outcomes); with
policy or sector indicators (around 5 indicators per policy or sector focused on
outcomes and purpose); and with goal indicators (around 5 macro-economic indicators
focused on purpose and goal). A simple sketch of such a hierarchy is given after this list.
• Government commitment to evaluation supports coordination – FASID advised
that efficient evaluation relies on the political will of a government committed to
transparency and accountability. In that case coordination is possible because it is
assigned to a single government agency. In Japan this is set out in the Government
Policy Evaluation Act (2004), as already mentioned under MOFA.

A case study of the ex post evaluation of Bach Mai Hospital was presented by FASID –
demonstrating how leadership led to effective delivery of ODA outcomes.

3.4 Japan International Cooperation Agency (JICA)


The GOV Participants met with 12 JICA staff, led by Mr Seiji KOJIMA (Vice-President
JICA and Chair of the JICA Evaluation Study Committee), Mr Satoru KOHIYAMA
(Director General of Regional Department I [SE Asia]), Mr Fumio KIKUCHI (Resident
Representative of JICA Vietnam); Ms Satoko MIWA (Director, Office of Evaluation in
the JICA Planning and Coordination Department and Co-Chair of the OECD-DAC
Committee on Evaluation) and Mr Kazuaki SATO (Deputy Director, Office of
Evaluation).

JICA has developed institutional arrangements, methods and tools for systematic
evaluation of ODA policies, programs and projects. They provided GOV Participants
with many resources including the JICA Evaluation Handbook (consistent with the
methods and tools set out in the VAMESP II M&E Manual), JICA Evaluation Reports for
2003, 2004 and 2005 and also distance learning materials. Many of these resources are
also available on the JICA website.

In more than 3 hours of animated discussions and presentations, the key findings from
JICA included:
• Evaluation is overseen by an independent group of experts – to ensure objectivity
and oversee quality of evaluation, JICA established an Advisory Committee on
Evaluation in 2002 (chaired by Professor Muta, whom we met at JES [see below]) that
reports to the Board of Vice Presidents through the Evaluation Study Committee
(chaired by Mr Kojima, whom we met at JICA). These independent experts provide
advice to the Board of Vice Presidents to ensure the quality and effective use of
evaluation results.
• Evaluation is part of a systematic approach to continuous improvement – under
the Government Policy Evaluation Act (2004), JICA is obliged to conduct evaluation.
However, the agency has conducted evaluation systematically since 1988 and now
publishes its annual evaluation report to ensure taxpayer commitment to international
cooperation activities.
• There are 3 pillars for effective evaluation – JICA ensures effective utilisation of
evaluation results by: (i) expanding the coverage of evaluation, (ii) improving
evaluation quality and (iii) ensuring objectivity and transparency.
• Evaluation systems should ensure objectivity and independence – the major
concern JICA has is that Japanese citizens have confidence in international
cooperation investments made by the Government of Japan and managed by JICA.
For evaluation results to be credible, they must be objective and transparent.

• Evaluation services are outsourced to ensure objectivity – JICA uses consultants,


NGOs and academics to conduct project and program evaluations. The Office of
Evaluation has 12 staff but an annual budget of US$10 million for terminal and ex
post evaluation (US$9m) and program evaluation (US$1m). In addition, the Operations
Department has a budget of US$7m within existing or planned projects for ex ante
and mid-term evaluations.
• Staff that manage evaluations should be trained in evaluation – even though JICA
staff do not conduct evaluations themselves, they are trained in evaluation to ensure
that they understand the importance of evaluation results and can effectively manage
consultants implementing evaluations to ensure quality and objective results.
• Capacity of staff and consultants can be built with distance learning resources –
JICA worked with the World Bank Institute to develop distance learning modules for
evaluation that can be used by staff in Japan, country office staff and consultants.
Some FERD/MPI and VAMESP II staff have used these training materials (see p41 of
JICA 2004 Annual Evaluation Report). The materials are all available on the JICA
website in both PowerPoint and video formats.
• National agencies should focus on policy and program evaluation – JICA is
placing increasing emphasis on program and “synthesis” or meta-evaluations to
provide better feedback on policy decisions and regional investment strategies.
• Line agencies should focus on project evaluation - parallel with the shift to
program evaluations by national agencies is increasing delegation of project
evaluations to country offices.
• Secondary evaluations are used for quality assurance – JICA uses secondary
evaluations by the Advisory Committee on Evaluation to ensure quality. In 2005 the
ACE evaluated all terminal evaluations to check for consistency and quality, using a
secondary evaluation checklist. Lessons learned were fed back to JICA staff and
consultants. Details are reported in the 2005 Annual Evaluation Report (pp. 112-140).
• A leader is institutionalised to champion evaluation in each agency unit – JICA
recently assigned one existing staff member in most organisational units of the agency
to the position of “Evaluation Chief”. The role is to develop an evaluation culture
that ensures commitment to evaluation results and their effective use. There are now
60 evaluation chiefs in JICA headquarters and 56 in country offices.
• Knowledge management enables best use of evaluation results – as evaluation
becomes established as a routine function, JICA has developed web-based and
internal knowledge management systems to ensure that lessons learned are effectively
used for formulation of new investments and management for development results.
The system is available on the web in Japanese.

3.5 Japan Evaluation Society (JES)


At JES the GOV Participants met with Professor Hiromitsu MUTA (former Vice
President of JES and current Chair of the JICA Advisory Committee on Evaluation), Dr
Masaoki TAKEUCHI (Executive Managing Director of JES and the International
Development Centre of Japan), Mrs Yoko ISHIDA (Senior Researcher at IDCJ and
member of JES) and Mrs Mimi Sheikh (Senior Evaluation Specialist at IDJC and member
of JES).


In 2 hours of discussions and presentations, the key findings from JES included:
• National legislation helps develop evaluation systems – the Government Policy
Evaluation Act (2004) was a strong driver for the development of an evaluation
culture for public investment in Japan. Evaluation is driven by one agency (the
Ministry of Internal Affairs and Communications) and closely used for budget allocation
decisions by the Ministry of Finance. The effectiveness of this is demonstrated by the
rapid increase in evaluation agencies in government organisations – from 59 in 2003
to 355 in 2005. It also enabled the development of an annual plan of evaluations that
is financed in each agency through the budget process.
• An evaluation society supports development of an evaluation culture – by
bringing together public servants, private sector practitioners and academics, as well
as students, an evaluation society builds a social consensus about the value and use of
evaluation. JES uses semi-annual conferences, a journal and training programs to
build this culture in Japan.
• A society builds capacity through conferences, a journal and structured training –
semi-annual conferences, a semi-annual journal and training courses that coincide with
each conference are used to build evaluation capacity. The cost of this is covered by
membership fees, corporate memberships and conference fees.
• Investments must have a plan if they are to be efficiently evaluated – all policies
and projects evaluated in Japan have planned outputs and outcomes against which
actual performance can be evaluated. The importance of adequate plans for public
investments is reinforced by administrative requirements imposed by the Ministry of
Finance and the Ministry of Internal Affairs and Communications.
• Evaluation reports can contribute to budget allocation decisions – evaluation
reports from public agencies are submitted to the Ministry of Internal Affairs and
Communications for review and feedback, and used by the Ministry of Finance in the
budget allocation process.
• Public investment uses 5 evaluation criteria – in Japan 5 criteria are used for
evaluating policy: necessity, efficiency, effectiveness, fairness and priority. This is a
localised version of the DAC evaluation criteria used for ODA.
• National and local government agencies use different evaluation methods –
national agencies focus on policy and program evaluation in Japan, as directed by the
Government Policy Evaluation Act (2004), which applies to national agencies only.
Local Government agencies at Prefecture and City levels use evaluation for projects,
with a focus on outputs and outcomes.
• An evaluation society established as a not-for-profit organisation can provide
services to government – JES was reconstituted as a not-for-profit organisation so
that it could contract with the Government to undertake high level evaluations. This
particularly related to quality assurance for evaluation through mechanisms such as
the secondary evaluations at JICA (see above).
• An evaluation society can certify training but not consultants – JES is preparing to
provide certified training courses for evaluation practitioners. Participants in these
courses would receive certification once they attended and were tested to a certain
level of competence. However, JES does not plan to certify consultants as being
competent to conduct evaluations. This is different from the Project Management
Societies in the USA and Australia – which conduct certification of practitioners as well
as training.


4 Lessons learned
Analysis of the findings presented in Section 3 identified the following lessons learned
that are relevant to Vietnam:
• Overarching legislation provides an effective framework for evaluation
• It is very important to clearly assign responsibilities in the evaluation system. Line
agencies should focus on evaluation of project efficiency and effectiveness, while
national agencies should focus on evaluation of policy and program effectiveness. A
leader should be institutionalised to champion evaluation in each agency unit
• Evaluation services should be outsourced to third party evaluators to ensure
independent and objective results, and evaluation should be overseen by an
independent group of experts
• A professional evaluation society established as a not-for-profit organisation can
provide services to government to support development of an evaluation culture, build
capacity through conferences, a journal and structured training, and certify training
• A systematic approach to evaluation is efficient and has the greatest impact. Project
Cycle Management and Logical Frameworks are proven foundations for systematic
evaluation.
• Evaluating programs and plans such as SEDP is practicable
• Evaluation reports can contribute to budget allocation decisions that align the policy
framework, programs and projects with agreed targets. Evaluation results
contribute to continuous improvement through a systematic approach to feedback,
including a lessons-learned database


5 Recommendations
Based on the findings and lessons learned, there is an opportunity for the Government of
Vietnam to support implementation of the following short-term and medium-term
recommendations, which draw on the review of policy and program evaluation in Japan
and build on the lessons learned from pilot evaluations implemented with support from the
Vietnam Australia Monitoring and Evaluation Strengthening Project.

5.1 Recommendations for short-term implementation


In the next 12 months it is recommended that FERD/MPI lead GOV efforts in M&E to:
• Recommendation 1 – complete and institutionalise a national monitoring and
evaluation strategic plan to provide guidance for the establishment of an official
national monitoring and evaluation system, building on the pilot experience of
VAMESP II.
• Recommendation 2 – complete and institutionalise a national monitoring and
evaluation manual to provide guidance for the practical implementation of evaluation
work in Vietnam.
• Recommendation 3 – FERD/MPI to work with other GOV agencies and key donors
through the PGAE to prepare a program of GOV and joint evaluations for 2007 to
enable resource allocation to be planned as part of the 2007 budget process and
December CG meeting.
• Recommendation 4 – train selected staff in MPI, MOF, MOT, MARD, MOH, MOC
and MoET as well as Provincial DPI in evaluation practice so that they can more
effectively plan and manage evaluations.
• Recommendation 5 – establish a systematic rating system for evaluation results that
can be used by MPI, MOF and line agencies to benchmark performance of public
investments in Vietnam.
• Recommendation 6 – prepare an Annual Evaluation Report on ODA in Vietnam for
the December CG meeting, with inputs from whole-of-government coordinated by
FERD/MPI, that introduces the national M&E system, presents case study examples
of pilot evaluations conducted by GOV and presents the proposed program of GOV
and joint evaluations for 2007 set out in Recommendation 3.
• Recommendation 7 – encourage the establishment of a professional society or not-
for-profit organisation by evaluation practitioners with the purpose of:
- supporting development of an evaluation culture
- building capacity through conferences, a journal and structured training
- providing services to government and other sectors
- certifying evaluation training courses and materials


5.2 Recommendations for medium-term implementation


In the next 2 years it is recommended that FERD/MPI lead GOV efforts in M&E to:
• Recommendation 8 - prepare and issue a Government Policy Evaluation regulation
as overarching legislation to provide an effective framework for evaluation of public
investments in Vietnam.
• Recommendation 9 – establish and institutionalise systematic functions and
procedures for evaluation of public investment, consistent with the National M&E
Strategic Plan, that:
- require national agencies to focus on evaluation of policy and program
effectiveness
- require line agencies to focus on evaluation of project efficiency and effectiveness
- require new public investments to have a detailed plan against which performance
can be monitored and evaluated
- institutionalise a position of evaluation focal point in each agency responsible for
public investment
- enable continuous improvement through effective use of lessons learned from
evaluation by institutionalising close cooperation between units responsible for
evaluation and units responsible for investment appraisal and implementation
- use Project Cycle Management and Logical Frameworks as the foundations for
systematic evaluation
- ensure objective and transparent evaluations by separating those who evaluate
from those who manage implementation
- institutionalise a systematic approach to feedback, including a lessons-learned
database, in FERD/MPI for ODA and ASD/MPI for public investment
• Recommendation 10 – MPI to request the Prime Minister to establish an
independent group of experts in the Office of Government or the State Audit Office to
oversee evaluation of public investment in Vietnam.

Annex 1: List of Participants

The following Government of Vietnam staff and VAMESP II staff participated:


• Dr. Cao Viet Sinh, Vice Minister of Planning and Investment
• Mr. Tran Quoc Phuong, Secretary of Vice Minister/Interpreter
• Mr. Kieu Tien Quang, Director of the International Cooperation Department, Office of
Government
• Mr. Mai Huu Dung, Deputy Director of the Appraisal and Public Investment Supervision
Department, MPI
• Ms. Nguyen Thi Hong Yen, Deputy Director of the External Finance Department, Ministry
of Finance
• Mr. Nguyen Xuan Tien, Head of Japan and Northeast Asia Division, FERD/MPI
• Mr. Cao Manh Cuong, Head of General Division, FERD/MPI
• Mr. John Fargher, Australian Team Leader of VAMESP II
• Ms. Tran Thi Thu Trang, Project Officer of VAMESP II

Annex 2: Detailed program
Activities Time
Sunday 23 April 2006
Travel from Hanoi to Tokyo (via Hong Kong) 11.05 – 20.20
Monday 24 April 2006
Ministry of Foreign Affairs (Economic Cooperation Bureau) 09.00 – 11.00
Room 272, Kasumigaseki 2-2-1, Chiyoda-ku, Tokyo 100-8919, Japan
Tel: +81- (0) 3-3580-3311
Japan Bank for International Cooperation (JBIC) 14.00 – 16.00
4-1, Ohtemachi 1-chome, Chiyoda-ku, Tokyo 100-8144, Japan
Tel: 03(5218)3101 Fax: 03(5218)3955
Tuesday 25 April 2006
Foundation for Advanced Studies on International Development (FASID), 10.00 – 12.00
Department of Planning and Programs
Chiyoda Kaikan Building (4th and 5th fl.)
1-6-17, Kudan-Minami, Chiyoda-ku, Tokyo 102-0074, Japan
Courtesy meeting with Mr. Kojima, Vice Chairman of JICA 14.00 – 16.00
JICA Office of Evaluation
6th–13th floors, Shinjuku Maynds Tower, 1-1, Yoyogi 2-chome, Shibuya-ku,
Tokyo 151-8558 Japan; Tel: +81-3-5352-5311/5312
Wednesday 26 April 2006
Meeting with Japan Evaluation Society 10.00 – 12.00
International Development Center of Japan, Kyofuku Building, Tomioka 2-
chome, Koto-ku, Tokyo 135-0047 (4th and 5th fl.), Japan
Lunch with JBIC 12.00 – 13.30
Meeting with Mr. Arakawa, Executive Director of JBIC 13.45 – 14.15
Meeting with Mr. Ito, Parliamentary Secretary for Foreign Affairs, MOFA 15.00
Meeting with Mr. Sato, Director of Economic Cooperation Department 15.45
Local visit and dinner 17.00
Thursday 27 April 2006
Meeting with Ministry of Land, Infrastructure and Transport 10.00 – 12.00
Lunch with Ministry of Land, Infrastructure and Transport 12.00 – 13.30
Field trip to Subway 14.45 – 17.45
Friday 28 April 2006
Departure for return to Hanoi (via Hong Kong) 10.00 – 15.55

