Professional Documents
Culture Documents
Prepared for
MPI
2 Hoang Van Thu
Ba Dinh Hanoi VIETNAM
23 May 2006
42443867
Prepared by
Vietnam Australia Monitoring and Evaluation Strengthening Project
Vietnam Australia Monitoring and Evaluation Strengthening Project - Phase II
Findings of Study Tour to review ODA Evaluation Systems in Japan
CONTENTS
1 Purpose and expected outputs
3 Summary of findings
3.1 Ministry of Foreign Affairs (MOFA)
3.2 Japan Bank for International Cooperation (JBIC)
3.3 Foundation for Advanced Studies on International Development (FASID)
3.4 Japan International Cooperation Agency (JICA)
3.5 Japan Evaluation Society (JES)
4 Lessons learned
5 Recommendations
5.1 Recommendations for short-term implementation
5.2 Recommendations for medium-term implementation
ANNEXES
Annex 1: List of Participants
Annex 2: Detailed program
1 Purpose and expected outputs
A small group of senior GoV leaders visited Japan to study evaluation systems. A list of
participants is attached at Annex 1. The purpose of the study tour was to:
• See the Japanese best-practice evaluation system in operation
• Learn the methods and tools used by the Japanese government for evaluation of ODA
• Understand how evaluation supports investment and management decisions in Japan
• See how professional evaluation societies build the capacity of evaluation practitioners
Outputs expected from the study tour included a detailed understanding of how to
establish, implement and use evaluation systems for public investment to support results-
based management and effective investment formulation. Case study examples of good-
practice ODA evaluation were expected to be provided to inform the pilot Vietnamese
evaluation system.
As detailed in the program attached at Annex 2, during four days in Japan participants met
with five Japanese agencies to: review case studies of best-practice evaluation partnerships
in Japan; review case studies of sector, program and portfolio evaluations in Japan; learn
about current issues and research in Japanese evaluation; and compare Japanese and
Vietnamese evaluation case studies.
The following diagrams summarise the organisational relationships and functions of ODA
evaluation in Japan.
3 Summary of findings
3.1 Ministry of Foreign Affairs (MOFA)
At MOFA the GoV participants met with Mr Nobuki SUGITA (Deputy Director General
of the Economic Cooperation Bureau in MOFA), Mr Yukio YOSHI (Director of the Aid
Planning Division in the Economic Cooperation Bureau), Mr Takeshi SHIIHARA
(Assistant Director of the Aid Planning Division), and Ms Naoko UEDA (Deputy
Director of the Aid Planning Division).
In 2 hours of discussions and presentations, the key findings from MOFA included:
• Importance of overarching legislation to support evaluation – the Government
Policy Evaluations Act (2004) provides a framework for institutional arrangements
for evaluation in all national government agencies. This provides a mechanism for
recurrent budget allocation to evaluation plans included in agency budget submissions
through the normal budget process.
• Efficiency and Effectiveness of a systematic approach to evaluation:
- Efficiency: MOFA has clear delegation of roles and responsibilities for evaluation
to all its subordinate agencies and organisation units at three levels (policy,
program and project). This is designed to provide a flow of information to support
policy decisions, budget allocation decisions and operational decisions.
- Effectiveness: the evaluation system in MOFA and its agencies such as JICA and
JBIC places importance on feedback and use of lessons learned to inform
decisions. Ultimately this is reflected in budget requests to MOF, which reviews
higher level evaluation reports before finalising budget decisions.
• National ministry evaluations focus on effectiveness – because of budget
constraints and pressure from the public on transparency and effective use of tax
revenues, the main focus of MOFA evaluations is effectiveness. This means that data
on outputs and outcomes is synthesised and used to report on actual achievement of
policy and program purpose.
• Use of third party evaluators to ensure independent and objective results – most
of the primary evaluation work in MOFA is outsourced to independent practitioners
from consulting firms or universities. This requires MOFA staff to be skilled in
evaluation in order to prepare effective TOR and also to manage the quality of
evaluation outputs.
• Use of 4 steps in evaluation – consistent with the Vietnam M&E Manual and training
materials developed by FERD/MPI with support from VAMESP II, MOFA uses four
steps in evaluation: preparation of evaluation logframe; planning and development of
methods and tools; field work and data collection; reporting and feedback.
• Use of a simple PDCA cycle – MOFA uses a simple cycle of plan, do, check, act to
structure evaluation into all its activities. This is consistent with international good
practice for management of investment but emphasises the importance of simple step-
by-step systems to build an effective management culture.
• Use of joint evaluations to ensure partnership – the Government of Japan is placing
increasing importance on joint evaluations with partner countries. The main
constraint to this policy shift is the capacity and availability of partner country staff to
participate in evaluations.
Case studies of country assistance evaluations were presented for Cambodia and
Tanzania. These had been prepared in partnership between Japan and the country
government concerned.
3.2 Japan Bank for International Cooperation (JBIC)

In 2½ hours of discussions and presentations, the key findings from JBIC included:
• Systematic approach to evaluation – JBIC prescribes a systematic process for
evaluation that clearly guides all staff and clients. The process assigns roles and
responsibilities at project, program and policy levels and was in place and operational
before the GPEA (2004) was enacted.
• Systematic approach to feedback including a lessons-learned database – through
the use of a lessons learned database and standardised evaluation reporting format,
JBIC ensures that evaluation results are used for feedback to activity managers as well
as informing those preparing new investments.
• Use of an independent panel of experts to review evaluation – objectivity and
evaluation quality are ensured by an independent panel of experts that report directly
to the JBIC Board. The experts are mostly academics but also include some people
from NGOs. JBIC uses the panel of experts to build credibility in evaluation results
and ensure that key lessons learned are communicated to the highest levels.
• Use of a systematic rating system to benchmark performance – the highly
developed rating system used by JBIC enables qualitative and quantitative data for all
5 evaluation criteria to be assessed and quantified for systematic benchmarking. This
system has been adopted by the Ministry of Finance in Bangkok (one of the lessons
learned from the October 2005 Study Tour) and is under active consideration for
piloting by FERD/MPI with support from VAMESP II.
• Use of third party evaluators to ensure independent and objective results -
primary evaluation work is outsourced by JBIC to independent practitioners from
consulting firms or universities. This requires JBIC staff to be skilled in evaluation in
order to prepare effective TOR and also to manage the quality of evaluation outputs.
• Use of 4 steps and 5 criteria in evaluation - consistent with the Vietnam M&E
Manual and training materials developed by FERD/MPI with support from VAMESP
II, JBIC uses the four steps in evaluation mentioned above and the five OECD-DAC
evaluation criteria: relevance, efficiency, effectiveness, impact and sustainability.
• Systematic approach to analysis and feedback - JBIC places importance on
feedback and use of lessons learned to inform management and investment decisions.
Ultimately this is reflected in quality of outputs and outcomes, which provides
justification for budget allocation decisions by MOF.
• Use of a socially recognised expert to independently verify results – wherever
possible, JBIC uses a socially recognised expert or leader to independently verify
evaluation results. This is primarily a mechanism to address pressure from the public
on transparency and effective use of tax revenues.
• Strong collaboration between Operations Department and Evaluation
Department to ensure that lessons learned inform future operations – there are
strong linkages across JBIC organisational units to ensure that independently derived
evaluation results are actively used during management and formulation of
investments. This collaboration is driven by a continuous improvement culture and
places importance on effectiveness and quality.
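The rating idea described above (qualitative and quantitative evidence against all five criteria reduced to a single comparable figure) can be sketched in miniature. The 4-point scale, equal weighting and sample scores below are illustrative assumptions, not JBIC's actual scheme:

```python
# Illustrative only: a 4-point scale and equal weights are assumptions;
# JBIC's actual rating scheme is not reproduced here.
DAC_CRITERIA = ["relevance", "efficiency", "effectiveness", "impact", "sustainability"]

def overall_rating(scores: dict[str, int]) -> float:
    """Combine per-criterion scores (1 = poor .. 4 = high) into one
    benchmarkable figure by simple averaging."""
    missing = set(DAC_CRITERIA) - scores.keys()
    if missing:
        raise ValueError(f"unscored criteria: {sorted(missing)}")
    return sum(scores[c] for c in DAC_CRITERIA) / len(DAC_CRITERIA)

# Hypothetical projects scored against the five DAC criteria.
project_a = {"relevance": 4, "efficiency": 3, "effectiveness": 4, "impact": 3, "sustainability": 2}
project_b = {"relevance": 3, "efficiency": 2, "effectiveness": 3, "impact": 2, "sustainability": 3}

# Benchmarking: rank projects on the combined score.
ranked = sorted([("A", overall_rating(project_a)), ("B", overall_rating(project_b))],
                key=lambda t: t[1], reverse=True)
print(ranked)  # [('A', 3.2), ('B', 2.6)]
```

A quantified score like this is what allows systematic benchmarking across a portfolio, while the underlying qualitative evidence remains in the evaluation report itself.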
The Project Officer-IT returned to JBIC the following afternoon to better understand the
Lessons Learned Database, which has been acknowledged as useful and comprehensive.
Officers in the evaluation office have been using it since 1999, with information dating
back to 1995. The database is operated and maintained by the Evaluation Office with the
support of the IT Department, which is in charge of updating information on the JBIC
intranet and website.
Each record in this database has a unique identifier, one or more lesson types and a
summarised free-text description linked to the project from which the lessons were drawn.
However, the lessons are not directly linked to the evaluation reports, so users have to
look for detailed lessons in the separate project database, which stores project-related
reports. That database is easily accessed over the intranet, using the unique project
identifier to link lessons learned and evaluation reports.
The diagram below depicts the institutional arrangements and data flow of the lessons-
learned database.
[Diagram: data flow of the lessons-learned database. The Operations department provides
evaluation reports and the list of evaluated projects to the Evaluation Office (15 officers).
Three of these officers compile all received lessons learned into an Excel spreadsheet and
update a local Access database. The IT Department maintains the lessons-learned
database.]
This diagram shows that project information is duplicated in both the project and lessons-
learned databases. However, because of the different functions of each department,
decentralising the information lets each department easily maintain the data relating to
its own work. In addition, both databases are accessible over the JBIC intranet to all staff
with appropriate usernames and passwords, allowing officers to find all the information
they need. For example, the lessons-learned database keeps only brief project information
and summaries of lessons learned, while the project database provides a rich project
profile, all related documents including evaluation reports, implementation data, and
lesson identifiers linked back to the lessons-learned database.

The three evaluation officers responsible for updating the lessons-learned database are
competent in basic Office skills, including Word, Excel and Access. While other staff can
access only the query interface of the lessons-learned database, these three officers use a
simple Access application for updating lessons learned, a task that is done annually and
requires only one day of work. The Access application links to the JBIC local database in
the IT Department, which then updates the project database with new links to lessons and
publishes to the web database. Training for the three officers was also short because of
the simplicity of the application.
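The two-database arrangement described above can be illustrated with a small sketch. The table and column names below are hypothetical, not JBIC's actual schema; the point is the shared unique project identifier that links a lesson summary in one database to the full evaluation report in the other:

```python
import sqlite3

# Illustrative sketch only: table and column names are invented,
# not JBIC's actual schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE projects (             -- project database: full profile
    project_id  TEXT PRIMARY KEY,
    profile     TEXT,
    eval_report TEXT                -- reference to the evaluation report
);
CREATE TABLE lessons (              -- lessons-learned database: summaries only
    lesson_id   INTEGER PRIMARY KEY,
    project_id  TEXT REFERENCES projects(project_id),
    lesson_type TEXT,
    summary     TEXT
);
""")
conn.execute("INSERT INTO projects VALUES ('VN-001', 'Hospital upgrade', 'expost_2003.pdf')")
conn.execute("INSERT INTO lessons (project_id, lesson_type, summary) "
             "VALUES ('VN-001', 'procurement', 'Early tendering shortened delays')")

# The shared project identifier is what lets a user move from a lesson
# summary to the detailed evaluation report held in the other database.
row = conn.execute("""
    SELECT l.summary, p.eval_report
    FROM lessons l JOIN projects p ON l.project_id = p.project_id
""").fetchone()
print(row)  # ('Early tendering shortened delays', 'expost_2003.pdf')
```

Keeping only summaries and identifiers in the lessons-learned database is the design choice that allows three officers to maintain it in about one day per year, while the richer project database is maintained separately by the IT Department.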
3.3 Foundation for Advanced Studies on International Development (FASID)

In 2 hours of discussions and presentations, the key findings from FASID included:
• The use of PCM and PDM as foundations for systematic evaluation – FASID
research and academic affiliates have invested a considerable amount of effort in the
systematisation of evaluation through the use of Project Cycle Management (PCM)
and Project Design Matrix (PDM) methods and tools. The M&E Manual developed
by FERD/MPI with support from VAMESP II has adapted some of these methods and
tools to Vietnam where appropriate. Resource materials on PCM and PDM are
available through the M&E Resource Centre at FERD/MPI.
• The use of 5 DAC Criteria for evaluation – FASID recommends the use of the 5
DAC criteria for evaluation (relevance, effectiveness, efficiency, impact and
sustainability). This is consistent with the M&E Manual in Vietnam and all the pilot
evaluations conducted in Vietnam with support from VAMESP II.
• Evaluation should focus on quality and accountability – in the experience of
FASID, evaluation gives the best return on investment when it is focussed on
improving the quality of investment (for example through management for
development results) and providing transparent information to citizens that accounts
for the outcomes of government policies and investments.
• There are tools available for program evaluation – FASID research has built on
international experience to refine tools for use in program evaluation including the
hierarchy tree, logic tree, program logical framework including a hierarchy of
indicators and case studies. The Draft National M&E Strategic Plan developed by
FERD/MPI with support from VAMESP II has used the hierarchy tree tool for the
program of investments proposed in the strategic plan.
• There are ways of evaluating programs and plans such as SEDP – a case study of
Mie Prefecture in Japan demonstrated how a hierarchy of indicators is used for
evaluating projects, programs, policies and pillars or goals. This is relevant to the
SEDP in Vietnam and highlighted the opportunities to use congruent indicators that
link project-level indicators (a large number focused on activities and outputs); with
program-level indicators (a smaller number focused on outputs and outcomes); with
policy or sector indicators (around 5 indicators per policy or sector, focused on
outcomes and impacts).
A case study of the ex post evaluation of Bach Mai Hospital was presented by FASID –
demonstrating how leadership led to effective delivery of ODA outcomes.
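The congruent hierarchy of indicators highlighted in the Mie Prefecture case can be sketched as a nested structure. All names and indicator counts below are invented for illustration; only the shape (many output-focused indicators at project level, fewer at program level, around five per policy) comes from the case study:

```python
from dataclasses import dataclass, field

# Illustrative sketch of "congruent" indicators: each level's indicators
# roll up into the level above. Names are invented examples, not taken
# from the Mie Prefecture case itself.

@dataclass
class Level:
    name: str
    indicators: list[str]
    children: list["Level"] = field(default_factory=list)

project = Level("Rural clinic upgrade",
                ["activities completed", "clinics equipped"])      # output-focused
program = Level("Primary health program",
                ["clinics operational", "patients treated"],        # output/outcome
                children=[project])
policy = Level("Health sector pillar",
               ["infant mortality", "service coverage", "patient satisfaction",
                "staff per capita", "budget execution"],            # ~5 per policy
               children=[program])

def count_indicators(level: Level) -> int:
    """Total indicators at this level and all levels beneath it."""
    return len(level.indicators) + sum(count_indicators(c) for c in level.children)

print(count_indicators(policy))  # 9
```

Linking levels explicitly, as the `children` field does here, is what makes it possible to evaluate a plan such as the SEDP top-down, from pillars through programs to projects, without inventing new indicators at each step.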
3.4 Japan International Cooperation Agency (JICA)

JICA has developed institutional arrangements, methods and tools for systematic
evaluation of ODA policies, programs and projects. It provided the GoV participants
with many resources, including the JICA Evaluation Handbook (consistent with the
methods and tools set out in the VAMESP II M&E Manual), the JICA Evaluation Reports
for 2003, 2004 and 2005, and distance learning materials. Many of these resources are
also available on the JICA website.
In more than 3 hours of animated discussions and presentations, the key findings from
JICA included:
• Evaluation is overseen by an independent group of experts – to ensure objectivity
and oversee quality of evaluation, JICA established an Advisory Committee on
Evaluation in 2002 (chaired by Professor Muta, whom we met at JES; see below) that
reports to the Board of Vice Presidents through the Evaluation Study Committee
(chaired by Mr Kojima, whom we met at JICA). These independent experts provide
advice to the Board of Vice Presidents to ensure the quality and effective use of
evaluation results.
• Evaluation is part of a systematic approach to continuous improvement – under
the Government Policy Evaluations Act (2004), JICA is obliged to conduct evaluation.
However, the agency has conducted evaluation systematically since 1988 and now
publishes an annual evaluation report to maintain taxpayer support for international
cooperation activities.
• There are 3 pillars for effective evaluation – JICA ensures effective utilisation of
evaluation results by: (i) expanding the coverage of evaluation, (ii) improving
evaluation quality and (iii) ensuring objectivity and transparency.
• Evaluation systems should ensure objectivity and independence – the major
concern JICA has is that Japanese citizens have confidence in international
cooperation investments made by the Government of Japan and managed by JICA.
For evaluation results to be credible, they must be objective and transparent.
3.5 Japan Evaluation Society (JES)

In 2 hours of discussions and presentations, the key findings from JES included:
• National legislation helps develop evaluation systems – the Government Policy
Evaluations Act (2004) was a strong driver for the development of an evaluation
culture for public investment in Japan. Evaluation is driven by one agency (the
Ministry of Internal Affairs and Communications), and evaluation results are closely
used for budget allocation decisions by the Ministry of Finance. The effectiveness of
this is demonstrated by the
rapid increase in evaluation agencies in government organisations – from 59 in 2003
to 355 in 2005. It also enabled the development of an annual plan of evaluations that
is financed in each agency through the budget process.
• An evaluation society supports development of an evaluation culture – by
bringing together public servants, private sector practitioners and academics, as well
as students, an evaluation society builds a social consensus about the value and use of
evaluation. JES uses semi-annual conferences, a journal and training programs to
build this culture in Japan.
• A society builds capacity through conferences, a journal and structured training –
semi-annual conferences, a semi-annual journal and training courses that coincide with
each conference are used to build evaluation capacity. The cost of this is covered by
membership fees, corporate memberships and conference fees.
• Investments must have a plan if they are to be efficiently evaluated – all policies
and projects evaluated in Japan have planned outputs and outcomes against which
actual performance can be evaluated. The importance of adequate plans for public
investments is reinforced by administrative requirements imposed by the Ministry of
Finance and the Ministry of Internal Affairs and Communications.
• Evaluation reports can contribute to budget allocation decisions – evaluation
reports from public agencies are submitted to the Ministry of Internal Affairs and
Communications for review and feedback, and used by the Ministry of Finance in the
budget allocation process.
• Public investment uses 5 evaluation criteria – in Japan 5 criteria are used for
evaluating policy: necessity, efficiency, effectiveness, fairness and priority. This is a
localised version of the DAC evaluation criteria used for ODA.
• National and local government agencies use different evaluation methods –
national agencies focus on policy and program evaluation in Japan, as directed by the
Government Policy Evaluations Act (2004), which applies to national agencies only.
Local government agencies at prefecture and city levels use evaluation for projects,
with a focus on outputs and outcomes.
• An evaluation society established as a not-for-profit organisation can provide
services to government – JES was reconstituted as a not-for-profit organisation so
that it could contract with the Government to undertake high level evaluations. This
particularly related to quality assurance for evaluation through mechanisms such as
the secondary evaluations at JICA (see above).
• An evaluation society can certify training but not consultants – JES is preparing to
provide certified training courses for evaluation practitioners. Participants in these
courses would receive certification once they attended and were tested to a certain
level of competence. However, JES does not plan to certify consultants as being
competent to conduct evaluations. This is different from the project management
societies in the USA and Australia, which conduct certification of practitioners as well
as training.
4 Lessons learned
Analysis of the findings presented in Section 3 identified the following lessons learned
that are relevant to Vietnam:
• Overarching legislation provides an effective framework for evaluation
• It is very important to clearly assign responsibilities in the evaluation system: line
agencies should focus on evaluating project efficiency and effectiveness, while national
agencies should focus on evaluating policy and program effectiveness. A champion for
evaluation should be institutionalised in each agency unit
• Evaluation services should be outsourced to third party evaluators to ensure
independent and objective results, and evaluation should be overseen by an
independent group of experts
• A professional evaluation society established as a not-for-profit organisation can
provide services to government to support development of an evaluation culture, build
capacity through conferences, a journal and structured training, and certify training
• A systematic approach to evaluation is efficient and has the greatest impact. Project
Cycle Management and Logical Frameworks are proven foundations for systematic
evaluation.
• Evaluating programs and plans such as SEDP is practicable
• Evaluation reports can contribute to budget allocation decisions, aligned with the
policy framework, programs and projects, to achieve set targets. Evaluation results
contribute to continuous improvement through a systematic approach to feedback,
including a lessons-learned database
5 Recommendations
Based on the findings and lessons learned, there is an opportunity for the Government of
Vietnam to implement the following short-term and medium-term recommendations,
which draw on the review of policy and program evaluation in Japan and build on the
lessons learned from pilot evaluations implemented with support from the Vietnam
Australia Monitoring and Evaluation Strengthening Project.
Annex 1: List of Participants
Annex 2: Detailed program
Activities Time
Sunday 23 April 2006
Travel from Hanoi to Tokyo (via Hong Kong) 11.05 – 20.20
Monday 24 April 2006
Ministry of Foreign Affairs (Economic Cooperation Bureau) 09.00 – 11.00
Room 272, Kasumigaseki 2-2-1, Chiyoda-ku, Tokyo 100-8919, Japan
Tel: +81- (0) 3-3580-3311
Japan Bank for International Cooperation (JBIC) 14.00 – 16.00
4-1, Ohtemachi 1-chome, Chiyoda-ku, Tokyo 100-8144, Japan
Tel: 03(5218)3101 Fax: 03(5218)3955
Tuesday 25 April 2006
Foundation for Advanced Studies on International Development (FASID), 10.00 – 12.00
Department of Planning and Programs
Chiyoda Kaikan Building (4th and 5th fl.)
1-6-17, Kudan-Minami, Chiyoda-ku, Tokyo 102-0074, Japan
Courtesy meeting with Mr. Kojima, Vice Chairman of JICA 14.00 – 16.00
JICA Office of Evaluation
6th–13th floors, Shinjuku Maynds Tower, 1-1, Yoyogi 2-chome, Shibuya-ku,
Tokyo 151-8558 Japan; Tel: +81-3-5352-5311/5312
Wednesday 26 April 2006
Meeting with Japan Evaluation Society 10.00 – 12.00
International Development Center of Japan, Kyofuku Building, Tomioka 2-
chome, Koto-ku, Tokyo 135-0047 (4th and 5th fl.), Japan
Lunch with JBIC 12.00 – 13.30
Meeting with Mr. Arakawa, Executive Director of JBIC 13.45 – 14.15
Meeting with Mr. Ito, Parliamentary Secretary for Foreign Affairs, MOFA 15.00
Meeting with Mr. Sato, Director of Economic Cooperation Department 15.45
Local visit and dinner 17.00
Thursday 27 April 2006
Meeting with Ministry of Land, Infrastructure and Transport 10.00 – 12.00
Lunch with Ministry of Land, Infrastructure and Transport 12.00 – 13.30
Field trip to Subway 14.45 – 17.45
Friday 28 April 2006
Departure for return to Hanoi (via Hong Kong) 10.00 – 15.55