Summary Report on the Regional Training on the Use of the Dashboard
for CCM Oversight of Global Fund Grants

Report prepared by TSF Consultants:
Sally Wellesley and Maria Leny Felix

On behalf of
the Technical Support Facility Southeast Asia and Pacific


Contents
1. Background to the Workshop
2. Summary of the Workshop Proceedings
3. Assessment of Country Preparedness/Need for Dashboards
4. Strengths, Observations and Lessons Learned During the Assignment
5. Recommendations for Future Workshops
6. Evaluation
Annexes
Annex 1: Training Design
Annex 2: Workshop Agenda
Annex 3: Summary of CCM Oversight Capacity Rapid Assessments
Annex 4: Summary of Workshop Evaluations
Annex 5: List of Workshop Participants

List of Presentations1

1 In accompanying folder


1. Background to the Workshop
The Global Fund to Fight AIDS, Tuberculosis and Malaria (GFATM) is a global
public-private partnership dedicated to raising and disbursing large amounts of
additional finance to prevent and treat the three diseases in developing
countries. Since its establishment in 2002, the GFATM has committed
US$6.6 billion to over 300 programs in 135 countries.
The Country Coordinating Mechanism (CCM) is a public-private partnership at the
country level, and includes multistakeholder representation from governments,
multilateral and/or bilateral agencies, nongovernment organizations, academic
institutions, private business and people living with the diseases. The CCM is
central to the Global Fund’s commitment to local ownership and participatory
decision making. Overall governance and oversight of grants is the purview of a
national-level Country Coordinating Mechanism.
The HIV/AIDS Technical Support Facility for Southeast Asia and Pacific (TSF-SEAP),
the Global Fund and UNAIDS have identified strengthening Country Coordination
Mechanisms as one of the priority areas to address in order to accelerate the
effective implementation of Global Fund-funded programmes in SEAP countries.
For this workshop, TSF-SEAP, with Global Fund input, selected participating
CCMs from among the SEAP countries that have implemented the dashboard,
plan to implement it, or started but discontinued its use for various reasons.
The key objective of this workshop was to build the capacity of CCMs in the SEAP
region to more effectively provide appropriate oversight of Global Fund grant
implementation within their respective countries through an introduction to and
training on the use of the CCM Dashboard tool. This was to be achieved
through the design and delivery of a workshop to train stakeholders from the
CCM Oversight Committee, CCM Secretariat and Principal Recipients, as well as
local/national and regional consultants, equipping them with the knowledge and
skills to provide CCMs with the technical support needed to improve their use
of the dashboard.
Also taking part were four Community Resource Persons (CRPs) who are working
at the sub-recipient level and participated as part of TSF’s CRP Capacity
Development Program. The CRPs are personnel serving in functional capacities
(programmatic management, financial management and M&E) who are
nominated from first-time civil society organization Principal Recipients in
Southeast Asia to promote the successful management and implementation of
Round 10 grants. The objective of the CRP Program is to create a pool of
community resource persons with key knowledge and skills on functional areas of
HIV program implementation with a strong focus on Global Fund grant
implementation.
Specifically, the training aimed to provide the participants with knowledge, skills
and tools in:
• Assessing the oversight capacity of the participating CCMs;
• Reviewing the oversight function of the CCM as performed in-country,
particularly in the context of the new guidelines and requirements for
CCMs;
• Understanding the background and processes of the dashboard as a tool
for CCM oversight of Global Fund grants;


• Creating, reviewing, and presenting grant performance information on a
dashboard as part of the CCM’s (and particularly the oversight
committee’s) oversight function;
• Effectively managing, storing, and archiving dashboard records
(particularly by the CCM Secretariat); and
• Developing an in-country Dashboard Implementation Plan.

The training design and modules were prepared by Maria Leny Felix.
The workshop was conducted at the Berjaya Times Square Hotel in Kuala
Lumpur, on November 1-4, 2011, and was attended by a total of 30 participants
from nine countries.2 In most cases3 the country ‘teams’ included at least one
representative each from the CCM and one of the Principal Recipients (PRs), and
some also included a consultant who was working on grant implementation
issues with the CCM or the PR. A full list of the workshop participants is in Annex
5.
The participants included two country partners from the Indonesia CCM and CCM
Secretariat, where the dashboard is in use, who were invited to share their
experience in setting up and using dashboards.
Yiga Josayma (TSF-SEAP Programme & Capacity Development Manager) was the
coordinator of the workshop, while logistical support was provided by Rohini
Indran. Sumathi Govindasamy, TSF-SEAP M&E Manager, provided technical input
on the workshop evaluation. The workshop was also assisted by TSF Consultant
Anne Jamaludin as the workshop rapporteur.

2. Summary of the Workshop Proceedings
The workshop was delivered through a combination of lecture-presentation,
question-and-answer sessions, hands-on practical sessions, group exercises and
plenary presentations/discussions. Throughout, there were opportunities for
participants to share their experiences (such as issues, good practices and
overcoming bottlenecks) of oversight and the use of the dashboard and other
tools for grant oversight. The workshop was conducted in English. The full
Training Design is provided in Annex 1, and all the presentations are provided in
a separate file.
The 4-day training covered the following key topics:
(a) Rapid Assessment of the oversight capacity of the participating CCMs
(b) Oversight Function of the CCM in the context of the Global Fund New
Guidelines and Requirements for CCMs
(c) Background and processes of the dashboard as a tool for CCM oversight of
Global Fund grant implementation
(d) Four aspects of the dashboard:
(1) creating and populating grant dashboards,
(2) reviewing dashboards for trends and issues,
(3) presenting dashboard information to the CCM for review and
discussion, and

2 Cambodia, Myanmar, Papua New Guinea, Fiji, Timor Leste, Pacific Islands Region, Malaysia,
Indonesia, the Philippines and Australia
3 The exception was Fiji, where all participants were from the PR (MoH).

(4) managing, storing, and archiving dashboard records and information; and
(e) Developing a plan for dashboard implementation in-country.

The daily activities are described below.

Day 1
The workshop was officially opened by Graham Smith, Director of TSF-SEAP. The overview of the training and the agenda were then presented. Following the introduction of participants through a group capacity mapping exercise, the consultants asked the participants to share their expectations for the workshop; these were quickly summarized and responded to by the consultants. The participants were also introduced to the concept of ‘host teams’: each day, two or more of the country teams would be asked to link up to present a recap of the previous day, organise energizing activities and act as timekeepers for the day. This concluded the first course module (Introduction to the Course).

Module 2, CCM Oversight Strengthening – Theory and Practice, began with a brief overview of the New Global Fund Architecture, focusing on its implications for CCMs and their oversight responsibilities. The participants were generally familiar with the new architecture, but there was some discussion regarding the role of the LFA and the clarity of the relationship between the CCM and the PR. This prompted a discussion on the division of responsibility and authority between the CCM and the other actors in grant implementation – PRs, SRs and other implementers in general.

This discussion led into the next session, a detailed presentation on the Five Functions of CCMs and the Updated Eligibility Requirements for CCMs, which was followed by a question and answer session. Some of the key issues raised, and the responses, were as follows:
• Who is responsible for SR and SSR selection? At least two countries shared that SRs were selected by a subcommittee of the CCM; in other countries it was the responsibility of the PR.
• Who exercises oversight over the CCM and, if a CCM is failing, who holds it to account? Stakeholders with concerns can contact the CCM team in Geneva or the LFA.
• On CCM endorsement of reports: the GF usually requests the CCM’s endorsement for every report, and if there are changes in the budget or targets the GF will also seek the CCM’s approval. In reality, however, communication between the CCM and the GF Secretariat has not been adequate, and a more standardised communication structure between the CCM and the Global Fund is needed.
• How can civil society and key affected populations be mainstreamed into the CCM and grant implementation when complicated processes, a strong focus on performance-based funding and frequent changes in application formats are a constraint on their participation? There needs to be more investment in strengthening grassroots capacity. It was confirmed that the CCM must include at least one person living with or affected by each of the three diseases, and CCMs should be proactive in organising regular consultations with key affected populations so that they can provide feedback on the performance of CCMs. CCM budgets can be used to cover the costs of meetings and consultations (including transport) for civil society to strengthen their voice on the CCM. Funds should also be allocated to developing a CCM communication plan/strategy that will ensure the equitable, full and timely dissemination of information to all CCM members and stakeholders.

For the final session of the day, having examined the principles of effective oversight in the previous session, the participants were asked to work in their country teams to assess the capacity of their CCM for oversight by identifying the structures, processes and resources in place to support it. This was done using the CCM Oversight Rapid Assessment Tool; a brief summary of the results is provided in Section 3. The day finished with a wrap-up of the topics covered.

Day 2
The day began with a recap of the previous day by the host team and a preview of the day’s agenda. A plenary session followed, in which the groups presented their oversight capacity assessments, focusing on best practices, key constraints and lessons learned. The participants also revisited the differences between oversight and M&E, between CCM oversight and the LFA audit, and between CCM oversight and the annual external audit.

Module 3, Development of the Grant Dashboard for CCM Oversight, was introduced with a presentation on the Evolution of the Grant Dashboard, Purpose and Pilot Results. The presenters highlighted the role of the dashboard as a reporting tool that enables the CCM to make a rapid analysis of grant performance, noting that it strengthens oversight by visually displaying early signs of weak performance and bottlenecks. It was reiterated that oversight is a CCM responsibility, and that the dashboard is a strategic tool that the Oversight Committee can use to help the PR identify problems and bottlenecks in implementation, define solutions and help implement them. During the discussion that followed, the following points were raised:
• What is the value of the dashboard for CCM members? Response: countries with multiple grants that use the dashboard have been able to improve grant performance; it has empowered the CCM in terms of its oversight function.
• The selection of indicators will be subject to a decision by the CCM members and the PRs.
• The dashboard calls for a good relationship between the CCM and the PR, which is important for populating the dashboard.
• It is important to have a communication plan.

The next session, The Generic Grant Dashboard and its Financial, Management and Programmatic Indicators, introduced the dashboard template to the participants with a demonstration of the different pages. Participants were then asked to open the ‘Fictitia’ dashboard on their computers and work in groups to review the indicator information and present an analysis of Fictitia’s grant performance based on the dashboard. Through the group presentations and discussion, the participants made the following observations:
• The dashboard provides an opportunity for the PR as well as the oversight committee, as it makes the PR think about the areas where they are not performing.
• The dashboard clearly shows the links between financial, management and programme performance.
• Those responsible for data entry have to know what the indicators are and where the data comes from.
• It looked complex to start with, but adding the information helped make it more manageable.
• Regarding the roles of the CCM and the PR, it would be good to have the PR support the CCM rather than the CCM doing it on its own.
• The dashboard indicates an issue (e.g. overspending on testing) but does not indicate why this happened.

The final session of the day focused on Indicator Selection and Interpretation. It was explained that whereas the financial and management indicators in the dashboard are pre-defined, the programmatic indicators are grant-specific and should be selected and agreed by the PR and the CCM. The CCM Grant Oversight Tool Set-up and Maintenance Guide emphasizes that CCMs, in consultation with the PRs, must decide on “which indicators to use and how best to adapt the indicators to ensure they are specific and relevant to the grants”. It was emphasized that any GF Top 10 indicators in the performance framework should be selected, and that there should be at least one indicator per grant objective. It was also stressed that the dashboard is not intended to show all the details of performance – these are recorded in the PUDR, grant performance reports and enhanced financial reports. Working in their country teams, the participants first selected a grant to focus on before selecting appropriate programmatic indicators and entering them into the dashboard template. Questions raised included how to capture information that may not be reflected in the PUDR (Response: this can be explained in the ‘comments’ box adjacent to each indicator) and whether indicators M5 and M6 can be customised for malaria or TB grants (Response: yes).

Dr Tine Tombokan, Executive Secretary of CCM Indonesia, and Dr Eddy Lamanepa of the GF AIDS Component M&E Unit, Indonesian Ministry of Health, then presented a Case Study on CCM Indonesia’s Experience of Dashboard Implementation. With multiple grants being managed by nine PRs, the CCM was finding oversight a challenge. With GMS support, Indonesia piloted its first dashboard with the MoH PR for the AIDS component in 2010, and as of the end of 2011 all nine PRs were using dashboards. Participants asked who was responsible for data entry, whether the TWG members were volunteer or paid staff, and what the sources of data were.

Day 3
Following the recap by the host team and a preview of the day’s agenda, the previous day’s work on indicator selection was briefly revisited before participants began the Customization of the Grant Dashboard, referring to grant documentation including the grant agreement, performance framework and PUDRs. Each country team worked on data entry, before going on to validate the data entered and then interpreting and analysing the results. This exercise generated considerable discussion within and between the groups on where to source the data, particularly data concerning Sub-Recipient expenditure and reporting. The two resource persons from Indonesia circulated, working with different groups to help resolve data entry problems and advise on customisation and interpretation.

As the indicator displays were generated, a number of potential problems or bottlenecks immediately became apparent to some of the groups, such as delays in SR reporting and unrealistic targets. After populating their dashboards with as much data as they had available, each group presented their dashboard analysis and the insights they had gained during the process, including any insights regarding the dashboard as an oversight tool. All the groups realised that, whichever party took responsibility for updating the dashboard – the oversight committee or the PR – good communication with the M&E and finance staff at the PR would be critical. Observations and issues raised during the ensuing discussion included the following:
• It takes some effort to find F2 data.
• If there is disbursement for pre-implementation costs prior to Period 1, it could be combined, for dashboard purposes, with the Period 1 disbursement.
• Indicators in Phase 2 or a subsequent funding period (for an SSF grant) could differ from those in the previous phase, to reflect the change in focus from process to outputs.
• The dashboard could be modified to accommodate more than five objectives, but there may be more value in focusing on the most critical ones.
• The ‘recommendations’ page will force CCMs to consider their role in oversight and be specific about what they want to see from the PR in terms of grant performance.

Day 4
After the recap and a preview of the final day’s agenda, Module 4, Grant Dashboard Maintenance, was introduced with a presentation on the need to assign roles during dashboard installation and implementation to the CCM Secretariat, the Oversight Committee, the PR and a dashboard working group or task force, as well as any other partners. The Secretariat’s function is to update the CCM members, including distributing the dashboard. This was followed by a presentation on the various activities needed to maintain the dashboard, including updating, archiving and printing.

The discussion following this session moved on from responsibility for dashboard maintenance to more general issues about its role in oversight. The comments included the following:
• The dashboard in its current form could give a lot of good information, but some items are not captured – for example, the performance specifically of the PR, rather than SR-dependent factors such as reporting on time, which may not be reflected in the PUDR. Further comment: if SRs are not reporting on time, it may be because they lack the capacity to produce the report; this raises the question of how the PR is fulfilling the planned technical assistance to the SRs.
• Regarding the timing of the dashboard: in Indonesia, the PR usually submits the dashboard to the CCM before the LFA verification of the PUDR. After verification and correction by the LFA, the dashboard may be adjusted.
• The Indonesian experience is that the CCM has found it difficult to review the PUDR in its entirety, so the dashboard is a useful tool for identifying early signs of problems and making a diagnosis. The CCM has the right to know.
• Regarding the routine validation of dashboard data by the CCM: in Indonesia, the dashboard data is verified by the CCM Secretariat, which also checks data consistency; the CCM then interprets the data and makes recommendations. CCMs do not need to get too involved in data verification, as it is more important for them to focus on the big picture.
• In dashboard implementation it is important to have negotiation and agreement at all levels, particularly between the CCM and the PR.
• The CCM can use the dashboard as a starting point for asking deeper questions about implementation. Response from the facilitators: the dashboard is just one tool for oversight; dialogue with the PRs, site visits and evaluations are also central to oversight and provide an opportunity for an in-depth look at PR performance.
• Regarding potential conflicts of interest between the CCM and the PRs (because the PRs exercise oversight over the SRs and SSRs, which may be represented on the CCM): this should be covered in the CCM governance manual. At a minimum, no grant recipients should be on the oversight committee.
• The dashboard can be used for SSF (consolidated) grants, as the Indonesia experience has shown.

Module 5, Planning for Dashboard Implementation, began with a brief presentation of the factors that should be considered before a CCM decides to adopt the dashboard as one of its oversight tools. At the very minimum, the country should have:
• a CCM substructure and procedures for oversight;
• a minimum investment of human and financial resources;
• timely and appropriate information of good quality;
• dialogue with the PRs; and
• capacity building sessions for CCM members to improve their skills in analysing information, investigating problems and identifying solutions.

Country teams then worked on the development of a dashboard implementation plan, taking into consideration the factors above as well as any technical assistance needs. The plans were presented to the rest of the group. Key elements of the plans from each country are summarised below:
1. Malaysia
- Develop a governance manual for Malaysia, to ensure oversight functions more effectively.
- Have a meeting on short-term, mid-term and long-term oversight policy.
- Dashboard implementation.
2. Multicountry Western Pacific
- Annual Executive Meeting in 3 weeks – will discuss oversight visits and the dashboard.
- Present Oversight and Dashboard in detail at the Annual Meeting in June 2012 (will seek funds from the GF for the orientation programme).
3. Myanmar
- The dashboard was introduced to the PR in May 2011; the PR will start to familiarise themselves with the dashboard tool.
- Ensure that there is active interaction between the CCM and PRs and good collaboration between both parties.
- Regularly review the dashboard to make sure it is relevant to needs.
- Need to strengthen the role of CCM Oversight members so that they can assist in addressing major and critical issues on grant implementation.
- Need to strengthen the Oversight Committee to ensure they know they have to conduct field visits in conjunction with the dashboard.
4. Cambodia
- Present the dashboard during the CCM meeting in December; discuss at the CCM meeting in February.
- The Oversight Committee will meet to discuss dashboard implementation.
- Provide a short training on the dashboard (2 days, CCM and PR).
- CCM and PR to agree on indicators and customize the dashboard.
5. Fiji
- Set goals, objectives and activities for each stage of implementing the dashboard in Fiji.
- Capacity assessment of the PR.
- Identify human resources and technical assistance to implement the dashboard and oversight plan (early 2012); the Executive Committee will make the decision.
- Dashboard development – CCM, Secretariat, Oversight Committee, technical working group, M&E system.
- Dashboard maintenance and implementation – ongoing.
6. Papua New Guinea
- Will hold a 3-day workshop for the PRs and the Oversight Committee in January 2012 to introduce the dashboard, focusing on data entry and interpreting outcomes.

The following were among the key issues that arose during the final discussion:
• The role of CCM Oversight members needs strengthening so that they can interpret the dashboards and assist in addressing major and critical issues on grant implementation.
• The dashboard could help to support effective problem solving within specified timeframes.
• Dashboards should be regularly reviewed to make sure they remain relevant to needs.
• Human resource and technical assistance needs for oversight (including the dashboard) should be identified.
• Dashboard development should be a collaboration between the CCM, the Oversight Committee and the PR.
• Cultural considerations will play a part in getting the dashboard implemented.

Module 6: Synthesis and Evaluation
Participants completed the post-test and workshop evaluation forms. The facilitators then led a brief wrap-up session in which the participants shared their thoughts about the dashboard, the workshop, what they had learned from other participants, and ways forward for CCMs and oversight. The consensus was that the dashboard is a valuable tool and should be taken up, but that considerable work is needed to adopt it in terms of introducing the concept, securing the human resources and ensuring the supply of data; how it fits into the overall oversight function of the CCM also needs to be considered. The participants reiterated the need for more sources of guidance or online support from the Global Fund, and particularly for a dashboard forum to allow sharing between PRs, CCMs and SRs across countries.

3. Assessment of Country Preparedness/Need for Dashboards
While some of the participating countries are managing multiple grants and would clearly benefit from having a grant dashboard, in others – for example Malaysia and Fiji, which have only one grant each and may not be eligible for further funding – the effort may not be warranted.

Some of the countries reported having oversight committees and oversight plans in place, but these had yet to be operationalised; others were still in the process of establishing an oversight committee. In nearly all countries, oversight structures and processes should be strengthened before introducing the dashboard.

Funding remains an issue: several countries have not yet applied for CCM funding through the Expanded Funding envelope, possibly due to gaps in capacity and in knowledge of funding availability on the part of the CCM secretariat. Some of the countries that have applied for expanded CCM funding have reported waits of several months before the funding is approved. This indicates a need both for technical assistance to prepare the groundwork for submitting a funding application, and for better communication between the Global Fund and the countries on the progress of the application.

The readiness of the individual countries for dashboard implementation, based on the results of the CCM Oversight Capacity Rapid Assessment and observations during the workshop, is summarized below. The score range for the oversight capacity rapid assessment is from 1 to 3, with 1 indicating ‘Beginning’ level, 2 ‘Intermediate’ and 3 ‘Advanced’.

Country readiness for the dashboard
Cambodia – CCM Secretariat: yes. Oversight Committee: yes. Oversight plan: yes, but not fully implemented. Capacity score: 2.75. Dashboard need: 3 SSF grants – would benefit from the dashboard. Intention to implement: plan to provide a short training on the dashboard and present it during the CCM meeting in December; the Oversight Committee will meet to discuss implementation.
Fiji – CCM Secretariat: yes, but still organizing its functions. Oversight Committee: yes, but few skills. Oversight plan: none yet. Capacity score: 1.9. Dashboard need: no need – only one grant, and may not be eligible for further funding (?). Intention to implement: will consult the CCM on whether to do a modified dashboard for reporting, or will develop the dashboard.
Malaysia – CCM Secretariat: yes, but needs capacity building and just starting to function. Oversight Committee: none yet. Oversight plan: none yet. Capacity score: 1.1. Dashboard need: no need – only one grant, and may not be eligible for further funding. Intention to implement: will consult the CCM on whether to do a modified dashboard for reporting, or will develop the dashboard.
Myanmar – CCM Secretariat: yes. Oversight Committee: OC group not yet functioning. Oversight plan: none yet. Capacity score: 2.2. Dashboard need: with 6 SSF grants, would benefit. Intention to implement: dashboard already introduced to the PR.
Papua New Guinea – CCM Secretariat: yes, but needs capacity building. Oversight Committee: yes – just started. Oversight plan: not reported. Capacity score: 2.0. Dashboard need: no data. Intention to implement: will introduce the dashboard to the CCM at a workshop in January 2012.
SPC (Multicountry Western Pacific) – CCM Secretariat: yes. Oversight Committee: no, but there is good capacity for this. Oversight plan: none yet. Capacity score: rapid assessment results not submitted. Dashboard need: no data. Intention to implement: will discuss at the Annual Executive Meeting and present in detail in June 2012.
Timor-Leste – CCM Secretariat: yes (but only one person). Oversight Committee: no; no allocation for this and few skills. Oversight plan: no. Capacity score: 2.0. Dashboard need: with Round 10, will have 3 grants and so could benefit from the dashboard, but needs to strengthen the oversight structure first. Intention to implement: will discuss at a CCM meeting.

4. Strengths, Observations and Lessons Learned During the Assignment
• Participation by country teams with a mix of CCM and PR representatives was an important factor in the success of this workshop. For some, it was the first opportunity to have sustained interaction on grant issues outside a CCM meeting, and it proved a valuable opportunity for each to gain a better understanding of what the other actually does, as well as of their concerns. It provided a strong platform for a shared understanding of the dashboard concept and a chance to discuss potential responsibilities, operational mechanisms and issues.
• One of the high-value aspects of the workshop was having the two consultants from Indonesia share their insights into the decision-making process on adopting the dashboard and their experience of setting it up and maintaining it, as well as being on hand to provide technical guidance and support to the country teams as they worked on their dashboards. They also made very pertinent and helpful contributions to the discussions following the group exercises.
• Participants quickly appreciated the potential value of the dashboard but were keen to customize it to better suit their own purposes. We noted a definite tendency to want to ‘over-complicate’ the dashboard by providing more detail than it was designed to display.
• The CRPs benefitted from the opportunity to gain insights into the PR reporting and CCM oversight processes, and in turn were able to contribute their own knowledge of how and where bottlenecks can arise in implementation and reporting at the community and SR level. Greater exposure to a range of grant management and implementation processes through workshops such as this can only enhance their capacity to support successful grant implementation by community-based organisations.
• Co-facilitation by host teams worked very well. Getting participant ‘host teams’ more closely involved in the delivery of the workshop, by having them take responsibility for providing daily recaps, timekeeping and energizing activities, contributed to fostering a spirit of cooperation and ownership in the proceedings – in addition to the health benefits of brief bursts of physical activity, singing and laughter!

5. Recommendations for Future Workshops
Recommendations for TSF
1. Future dashboard workshops could be made more effective if participants were to decide in advance on one or two grants to work on and bring as much documentation about them as possible (at least the grant agreement and performance framework, plus the latest PUDRs, Grant Performance Reports and Enhanced Financial Reports if available).
2. Provide more opportunities for CCM and PR representatives to take part in capacity development activities together, particularly on oversight issues. The chance to gain a better appreciation of their respective roles, responsibilities and concerns could help to foster better working relationships on grant implementation and oversight.
3. Continue to include representatives from civil society/community constituencies, such as CRPs. The benefits of their participation could be maximised by ensuring that CRPs are part of the country teams attending any such workshops in future. This would give them an opportunity to share valuable data and insights for the dashboard, and their involvement in country dashboard teams could help to ensure support for the dashboard from the CSO and NGO constituencies on the CCM. It would also help to leverage their voice and involvement in the CCM, so that they can play a role in dashboard implementation.
4. Particularly for highly technical trainings such as the dashboard workshop, continue to include resource persons or consultants with direct experience who can share insights and help to deal with the technical questions that arise.
5. Offer a program of technical assistance for CCM Secretariat strengthening, covering basic functions such as setting agendas and taking minutes as well as more advanced topics such as developing communication plans. Among the countries attending the workshop, there were still some in which the CCM Secretariat is essentially a one-person operation. Without sufficient capacity in the secretariat, CCMs will find it difficult to perform their oversight function effectively.

Recommendations for the Global Fund
1. It would be helpful to be able to point participants to a source of online help for dashboard preparation and implementation. From our observations during the workshop, some countries would be ready to implement but would likely have technical questions on inputting data and so on. It might therefore be cost-effective to provide an online helpline or forum that they can refer to.
2. It would also be helpful to have clearer guidance on how the dashboard can be customized, and on how far the Global Fund feels that dashboard adopters can go in this regard without compromising the original design and purpose of the dashboard. It was interesting to hear, for example, from Tatjana Peterson, the Global Fund’s Senior CCM Funding and […], that Sudan has adapted indicator F2 to reflect expenditures by SDA rather than by grant objective. This data is certainly easier to source, since expenditure by grant objective is reflected only in the EFR and not in the PUDR. Again, input from the Global Fund for users would be helpful here.
3. Increase the communication between the Global Fund and CCMs on access to funding. This could include greater clarity over the duration of the approval process and technical assistance on the application process; CCMs also need support for putting essential structures in place to be able to apply for funding.

6. Evaluation
To assess the immediate impact of the training on their knowledge, the participants were given pre- and post-training tests comprising questions on CCM oversight and the oversight tool, covering topics addressed in the training; the same questionnaire was used for both tests. The average score (from 29 participants) on the pre-training test was 55%, increasing to 63% on the post-training test. The design of the questionnaire reduces response shift bias because it accounts for changes in participants’ knowledge arising from the program content, allowing them to assess what they did or did not know at the program outset (Rockwell and Kohn 1989).

A steady improvement in the participants’ understanding of the dashboard was observed by Day 3 of the workshop. At the start of Day 1, only 41% of participants believed they had a good understanding of the different roles of the PR and the other main stakeholders; by the end of Day 4, 90% agreed that they had a better understanding of the different roles of the main stakeholders involved, and a majority (80%) were more confident in selecting suitable indicators to measure the performance of their grants. All the participants were satisfied with the facilitation style of the trainers and agreed that the structure of the workshop optimized learning, and 90% agreed that all of the sessions had a good balance of theory and practice.
only 38% of the participants felt confident about selecting appropriate indicators to track their country’s grant performance. which is achievement of the grant objectives. Technical Support Officer during the workshop debriefing teleconference with TSF and GFATM. This is also linked to Recommendation 5 for TSF regarding the need for more intensive technical assistance to CCMs to help prepare the necessary groundwork for submitting a funding request. and more frequent communication regarding improvements or clarifications that are required for the application to be approved. A high proportion of participants (62%) didn’t know how to check for quality and completeness of the dashboard 15 . the workshop evaluation was designed to provide an in depth understanding of participants’ self perception on their skills and confidence level after the workshop. CCM secretariat and CCM members. but perhaps slightly deflects the focus away from the core issue. some sort of online forum might be an appropriate channel for this. thereby improving accuracy because the participants can reflect on what they learned (Davis 2003). Overall. At the end of Day 2. which focused on quality of data.

issues. It is encouraging to observe the vast improvement in knowledge and skill amongst these participants considering that 52% of them had no previous experience in grant oversight and monitoring. infrastructure.and post-training tests and evaluation questionnaire are included in Annex 4a and 4b respectively. technical assistance and capacity requirement in grant implementation. everyone enthusiastic about tool. At the end of the workshop.” ~ Participant Evaluation Form comment “It's useful but more CCM members should be participating as it will enhance their understanding of their roles and responsibilities. Below are some useful and positive comments from the workshop participants: “ …very good & achieved its objectives. 16 . resources. A copy of the pre. TSF will be following up with the participants at 6 and 12 month interval on how they have utilized and shared the knowledge and skills learned from this workshop with stakeholders in their respective organisations and countries. 79% of the participants were confident about using the information from the dashboard to identify trends.” ~ Participant Evaluation Form comment As part of TSF SEAP ongoing effort to evaluate the benefit of its workshop. but by end of Day 3 only a small number of participants (17%) who are still unsure how to execute this exercise.data before the workshop.
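The score changes reported above are simple aggregates of the individual test results. As an illustration only (the raw response data is not included in this report, so the scores below are hypothetical stand-ins chosen to reproduce the reported averages), the pre/post averages and the percentage-point gain can be computed as follows:

```python
# Illustrative only: hypothetical scores standing in for the raw test data,
# which is not included in this report. Each value is one participant's
# percentage score on the same questionnaire, before and after the training.
pre_scores = [40, 50, 55, 60, 70]   # hypothetical pre-training scores (%)
post_scores = [50, 60, 60, 70, 75]  # hypothetical post-training scores (%)

def average(scores):
    """Mean score, rounded to the nearest whole percent (as in the report)."""
    return round(sum(scores) / len(scores))

pre_avg = average(pre_scores)    # 55, matching the reported pre-test average
post_avg = average(post_scores)  # 63, matching the reported post-test average
gain = post_avg - pre_avg        # improvement in percentage points

print(f"Pre-training average: {pre_avg}%")
print(f"Post-training average: {post_avg}%")
print(f"Gain: {gain} percentage points")
```

With the real data, the same calculation would be run over all 29 participants' scores.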

Annexes

Annex 1: Training Design

Workshop Design
CCM OVERSIGHT STRENGTHENING AND GRANT DASHBOARDS
A Regional Training for GFATM Partners in Southeast Asia and the Pacific
Kuala Lumpur, Malaysia, November 1-4, 2011
Prepared by Maria Leny E. Felix, Consultant

Purpose: To equip CCM members, CCM secretariat staff, principal recipients and consultants with knowledge, skills and tools for strengthening of CCM oversight and use of grant dashboards.

Day One / November 1

8:30-9:30  Registration of participants and distribution of workshop handouts & name tags. Method: registration. Materials: registration forms, handouts, nametags, pens. Responsible: TSF Secretariat.

9:30-10:00  Welcome remarks; background and purpose of the training. Method: plenary discussion/presentation. Materials: handouts, LCD. Responsible: TSF.

10:00-10:30  INTRODUCTION OF PARTICIPANTS & EXPECTATION SETTING: Participants' presentation of training expectations. Objective: To provide participants with the opportunity to share expectations and concerns about the training agenda/topics & begin building rapport among the participants & facilitation team. Method: Capacity Mapping Exercise. Materials: manila papers, meta cards, pens, masking tape. Responsible: Facilitators & TSF Secretariat.

10:30-10:45  TEA BREAK

10:45-11:00  TRAINING OVERVIEW: Agenda, expectations, housekeeping, roles & responsibilities including task clock and task teams. Objective: To review the training agenda & specific objectives, orient participants to the workshop environment, agree on roles & responsibilities in the workshop process, set group norms, and introduce & name the "parking lot". Materials: presentation materials.

11:00-12:15  SESSION 1: Review of Global Fund Grant Architecture & the 5 Functions of the CCM, in line with the New Guidelines and Standards on CCM Eligibility Requirements. Objective: To deepen understanding of CCM roles and responsibilities in GF grant implementation. Method: lecture discussion, Q&A. Materials: handouts, pens/pencils/markers, meta cards/manila papers/LCD. Responsible: Facilitators, TSF Secretariat.

12:15-1:30  LUNCH BREAK

1:30-2:30  SESSION 2: Conduct Rapid Assessment of CCM Oversight Capacity. Objective: Assess the current capacity of the CCM to perform oversight of GF grants. Method: completion & discussion of the RACCMOC Tool, plenary discussion. Materials: Rapid Assessment of CCM Oversight Capacity Tool. Responsible: Facilitators & Participants, TSF Secretariat.

2:30-3:30  SESSION 3: Review of CCM oversight principles, processes and issues. Objective: Deepen understanding of CCM oversight responsibilities and identify issues and bottlenecks in effective performance of CCM grant oversight. Method: lecture discussion, group exercise, brainstorming and experience sharing, Q&A. Materials: presentation materials, handouts, pens/pencils/markers, meta cards/manila papers/LCD. Responsible: Facilitators, Participants, TSF Secretariat.

3:30-3:45  TEA BREAK

3:45-4:45  SESSION 4: Introduction to the Dashboard: The CCM Oversight Strengthening and Grant Dashboard Pilot and its results. Objective: Understand the evolution of the dashboard as a tool for CCM oversight. Method: lecture discussion, group exercise, brainstorming and country-experience sharing, Q&A. Materials: presentation materials, handouts, pens/pencils/markers, meta cards/manila papers/LCD. Responsible: Facilitators, TSF Secretariat.

4:45-5:00  DAY 1 SYNTHESIS

Day Two / November 2

8:30-9:00  Registration of participants & distribution of handouts. Responsible: TSF Secretariat.

9:00-9:30  Recap of Day 1; overview of Day 2 agenda. Method: plenary presentation. Materials: presentation materials, handouts, LCD. Responsible: Facilitators & TSF Secretariat.

9:30-10:30  SESSION 5: Technical introduction to the grant dashboard: overview of the generic grant dashboard Excel workbook, including installation, use and maintenance of the dashboard; a detailed review of the dashboard indicators; sources of information; and best/good practices in dashboard implementation. Objective: Gain knowledge of the dashboard processes and understand the issues/problems experienced by countries using the dashboard tool. Method: lecture discussion, group exercise, brainstorming and country-experience sharing, Q&A. Materials: handouts, pens/pencils/markers, meta cards/manila papers/LCD. Responsible: Facilitators, TSF Secretariat.

10:30-10:45  TEA BREAK

10:45-12:15  SESSION 6a: Group exercise: Understanding and interpreting dashboard information. Objective: Gain skills in interpreting financial, management and programmatic information in the dashboards. Method: country-group work, brainstorming, Q&A. Materials: sample country dashboards, presentation materials, handouts, pens/pencils/markers, meta cards/manila papers/LCD. Responsible: Facilitator & TSF Secretariat.

12:15-1:30  LUNCH BREAK

1:30-3:00  SESSION 6b: Presentation of group outputs & synthesis. Method: plenary presentation, Q&A. Materials: group outputs, LCD. Responsible: Participants & Facilitator.

3:00-3:15  TEA BREAK

3:15-3:45  SESSION 7a: Introduction to selection of indicators and interpretation. Method: lecture-discussion. Materials: copy of Performance Framework, Grant Agreement. Responsible: Facilitator, Participants & TSF Secretariat.

3:45-4:45  SESSION 7b: Group Exercise: Facilitating customization of the generic grant dashboard. Method: group exercise, brainstorming & country-experience sharing, Q&A.

4:45-5:00  DAY 2 SYNTHESIS

Day Three / November 3

8:30-9:00  Registration of participants & distribution of handouts. Responsible: TSF Secretariat.

9:00-9:30  Recap of Day 2; overview of Day 3 agenda. Method: plenary presentation. Materials: presentation materials, handouts, LCD. Responsible: Facilitators & TSF Secretariat.

9:30-10:15  SESSION 7b (continued): Group Exercise: Facilitating customization of the generic grant dashboard.

10:15-10:30  TEA BREAK

10:30-12:15  SESSION 7c: Presentation of group outputs & synthesis. Method: plenary presentation, Q&A. Materials: group outputs, LCD. Responsible: Participants & Facilitator.

12:15-1:30  LUNCH BREAK

1:30-3:00  SESSION 8a: Transferring data from the PU/DR to the grant dashboard. Objectives: 1) Gain knowledge and skills on how to create and populate dashboards; 2) review dashboards for trends and issues; and 3) learn how to present dashboard information to the CCM for review and discussion. Method: country-group work, brainstorming, Q&A. Materials: sample country PU/DRs, presentation materials, handouts, pens/pencils/markers, meta cards/manila papers/LCD. Responsible: Participants & Facilitator, TSF Secretariat.

3:00-3:15  TEA BREAK

3:15-4:45  SESSION 8b: Presentation of Group Exercise Outputs & Synthesis. Method: role-play, plenary presentation, Q&A. Materials: group outputs, LCD. Responsible: Participants & Facilitator.

4:45-5:00  DAY 3 SYNTHESIS

Day Four / November 4

8:30-9:00  Registration of participants & distribution of handouts. Responsible: TSF Secretariat.

9:00-9:30  Recap of Day 3; overview of Day 4 agenda. Method: plenary presentation. Materials: presentation materials, handouts, LCD. Responsible: Facilitators & TSF Secretariat.

9:30-10:00  SESSION 9a: Dashboard installation and maintenance: identifying options, facilitating decision making on roles and responsibilities: Who will manage the grant dashboards? Who will input data to the individual dashboards? Transmitting reports to CCM members and committees; archiving dashboard reports, feedback and follow up; clarifying the roles of the CCM Secretariat, Principal Recipients and CCM members, and facilitating decision making. Objective: Gain knowledge on how to install and maintain the dashboard, including understanding whether or not the dashboard is suitable as a tool for CCM grant oversight in-country. Method: lecture discussion, brainstorming & sharing of experiences, Q&A. Materials: presentation materials, handouts, pens/pencils/markers, meta cards/manila papers/LCD. Responsible: Facilitators & Participants.

10:15-11:15  SESSION 9b: Group Exercise: Dashboard installation and maintenance: identifying options, facilitating decision making on roles and responsibilities. Method: country-group work, brainstorming, plenary discussion, Q&A. Materials: presentation materials, handouts, pens/pencils/markers, meta cards/manila papers/LCD. Responsible: Facilitators & Participants.

11:15-12:15  SESSION 9c: Presentation of Group Exercise Outputs & Synthesis.

12:15-1:30  LUNCH BREAK

1:30-2:30  SESSION 10a: Developing a Plan for Dashboard Implementation/Timeline. Objective: Come up with a draft DB Implementation Plan for presentation to the CCMs of participants. Method: country-group work, brainstorming, plenary discussion, Q&A. Materials: presentation materials, handouts, pens/pencils/markers, meta cards/manila papers/LCD. Responsible: Facilitators & Participants.

2:30-3:30  SESSION 10b: Presentation of Dashboard Plans.

3:30-4:00  SESSION 11: Wrap up/Synthesis: Roadmap to Strengthening CCM Oversight Function and DB Tool Implementation. Objectives: Come up with Action Plans and Recommendations for the CCMs of participating countries. Method: plenary discussion, Q&A. Materials: presentation materials, handouts, pens/pencils/markers, meta cards/manila papers/LCD. Responsible: Facilitator & TSF Secretariat.

4:00-4:30  Evaluation. Method: completion of evaluation forms. Materials: evaluation form. Responsible: Participants.

4:30-5:00  Closing Remarks. Responsible: TSF.

END OF REGIONAL TRAINING

Annex 2: Workshop Agenda

Training on CCM Oversight Strengthening and Development of Grant Dashboard for GFATM Partners in Southeast Asia and the Pacific Region
Kuala Lumpur, 1 – 4 November 2011

Workshop Agenda (by module/day, session and duration)

DAY 1
- Registration (1 hr)
- Opening (30 min)
Module 1: Introduction to the Course
- Introduction of participants (1 hr)
- Expectations Mapping (30 min)
- Pre-test (45 min)
- Course Overview (30 min)
- TEA BREAK; LUNCH BREAK
- Icebreaker (15 min)
Module 2: CCM Oversight Strengthening – Theory and Practice
- The New Global Fund Architecture: Lecture/Q&A (30 min)
- The Five Functions of the CCM, and Standards & Requirements for CCMs: Lecture/Sharing/Q&A (1 hr)
- TEA BREAK
- CCM Oversight: Rapid Assessment/Lecture/Discussion (1.5 hr)
- Wrap up (15 min)

DAY 2
- Registration (30 min)
- Recap & preview (30 min)
Module 3: Development of Grant Dashboard for CCM Oversight
- Evolution of the Grant Dashboard, Purpose and Pilot Results: Brainstorming/Lecture/Discussion (1 hr)
- The Generic Grant Dashboard and its Financial, Management and Programmatic Indicators: Lecture/Group Exercise (1.75 hr)
- TEA BREAK; LUNCH BREAK
- Indonesia Case Study on Dashboard Implementation (1 hr)
- Energizer (15 min)
- 10 a) Indicator Selection and Interpretation: Lecture/Discussion (30 min)
- 10 b) Indicator Selection and Interpretation: Group Practice (45 min)
- 10 c) Indicator Selection and Interpretation: Plenary presentations (45 min)
- Wrap up (15 min)

DAY 3
- Registration (30 min)
- Recap & preview (30 min)
- 11 a) Customization of the Generic Grant Dashboard: Lecture/Discussion (1 hr)
- TEA BREAK
- 11 b) Customization of the Generic Grant Dashboard: Group practice on data entry (1.5 hr)
- LUNCH BREAK
- 11 c) Customization of the Generic Grant Dashboard: Group practice on data validation, interpretation and analysis (1 hr)
- Energizer (15 min)
- 11 d) Customization of the Generic Grant Dashboard: presentation of group outputs (1.25 hr)
- Wrap up (15 min)

DAY 4
- Registration (30 min)
- Recap & preview (30 min)
- 11 d) (continued) Customization of the Generic Grant Dashboard: presentation of group outputs (1 hr)
Module 4: Grant Dashboard Maintenance
- 12) Discussion of Dashboard Maintenance: Lecture/Practice (1 hr)
- TEA BREAK
Module 5: Planning for Dashboard Implementation
- 13 a) Developing a Plan for Dashboard Implementation: Lecture/Q&A (30 min)
- 13 b) Developing a Plan for Dashboard Implementation: Group Exercise (1 hr)
- LUNCH BREAK
- 13 c) Plenary presentation of group outputs (30 min)
- Energizer (15 min)
Module 6: Synthesis and Evaluation
- 14 a) Wrap up & final comments from participants (30 min)
- 14 b) Post-test (15 min)
- Workshop evaluation
- CLOSING AND TEA BREAK


Annex 3: Summary of CCM Oversight Capacity Rapid Assessments

Annex 4a: Pre-/Post-Test Questionnaire

Pre/Post-Test Questionnaire
TRAINING ON OVERSIGHT STRENGTHENING AND DEVELOPMENT OF GRANT DASHBOARDS
Conducted by: Technical Support Facility, Southeast Asia and the Pacific Region
Supported by: UNAIDS and the Global Fund to Fight AIDS, TB and Malaria
Kuala Lumpur, Malaysia, 1-4 November 2011

Please circle your answer:

1. Oversight is a core responsibility of CCM and PRs
a. True  b. False  c. Don't know

2. Oversight is the tracking of the key elements of program/project performance, usually inputs and outputs
a. True  b. False  c. Don't know

3. The Global Fund requires all CCMs to submit and follow an oversight plan for all financing approved by the Global Fund
a. True  b. False  c. Don't know

4. A site visit for oversight is conducted to assess the change in targeted results related to the grant
a. True  b. False  c. Don't know

5. A staff member from a PR or SR can be a member of the oversight committee
a. True  b. False  c. Don't know

6. A Dashboard offers visual signals on key functioning and performance indicators
a. True  b. False  c. Don't know

7. The financial, management and programmatic indicators in the dashboard are pre-defined in the template
a. True  b. False  c. Don't know

8. Before data entry can begin for each grant, the PR will decide on which indicators to use and how best to adapt the indicators
a. True  b. False  c. Don't know

9. Data consistency checks are an important part of dashboard development for data quality and assurance of data integrity
a. True  b. False  c. Don't know

10. The Dashboard tool was piloted and developed by the Global Fund and Grant Management Solutions for PRs' use in monitoring and evaluation of grant implementation
a. True  b. False  c. Don't know

Annex 4b: Workshop Evaluation – Questionnaire

Training on CCM Oversight Strengthening and Development of Grant Dashboard for GFATM Partners in Southeast Asia and the Pacific Region
1 – 4 November 2011, Kuala Lumpur, Malaysia

TRAINING EVALUATION FORM

Please rate each statement by circling the corresponding number of your answer for each item, according to the rating scale below. We appreciate your openness and honesty as it will help us to improve future programs.

Rating scale: 4 – Strongly agree; 3 – Agree; 2 – Disagree; 1 – Strongly disagree

GENERAL QUESTIONS (rate each item 1 2 3 4)
1. I am satisfied with the overall organization of the workshop
2. I am satisfied with the logistics arrangements of the workshop
3. I am satisfied with the accommodations provided
4. I am satisfied with the meals provided
5. I found that the structure of the workshop optimized learning
6. I am satisfied with the general facilitation of the workshop
7. I found the daily recap to be very useful
8. I found the tools used to be very useful
9. I acquired new skills at this workshop
10. I think the session on Grant Oversight as a Core Responsibility of the CCM had a good balance of theory and practice/discussion
11. I think the session on the Generic Grant Dashboard and its Indicators had a good balance of theory and practice
12. The session on the Indonesian case study helped me to recognize and understand some of the issues involved in dashboard implementation
13. I think the session on Indicator Selection and Interpretation had a good balance of theory and practice
14. I think the session on Customizing the Generic Grant Dashboard had a good balance of theory and practice
15. I think the session on Dashboard Maintenance had a good balance of theory and practice
16. I think the session on Developing a Plan for Dashboard Implementation had a good balance of theory and practice

Below are some general knowledge-based statements/questions. Please rate each item twice by circling the corresponding number of your answer, according to the rating scale above: once for NOW, at the end of the workshop, and once for BEFORE the workshop.

Day 1: 1st November 2011
17. I have a good understanding of the key functions, roles and responsibilities of CCMs, and how oversight could be improved in my country.
18. I have basic knowledge of the new Global Fund architecture and how it relates to the new guidelines and requirements for CCMs.
19. I have a good understanding of the different roles of the PRs, CCM secretariat and CCM members in grant oversight.

Day 2: 2nd November 2011
20. I have a good understanding of the concept and purpose of the grant dashboard oversight tool.
21. I have a good understanding of the principles of dashboard indicator selection and interpretation.
22. I feel confident about selecting appropriate indicators to track grant performance in my country.

Day 3: 3rd November 2011
23. I understand where to source the data for entry into the dashboard.
24. I have sufficient knowledge of how to enter data into the dashboard.
25. I understand how to check the quality and completeness of the data entered into the dashboard.
26. I have sufficient knowledge to interpret and explain the data presented in a dashboard.
27. I am confident about using the information in the dashboard to identify trends and issues in grant performance.

Day 4: 4th November 2011
28. I have a good grasp of the different roles of the PRs, CCM secretariat and CCM members in dashboard implementation, and how these might take effect in my country.
29. I have a good understanding of the best practices involved in archiving, distributing and using the dashboard to present information on grant performance.
30. I have a good understanding of how to update, save and print the dashboard.
31. I can identify the resources, infrastructure, technical assistance and capacity building that might be required to set up and implement the grant dashboard in my country.
32. I understand the advantages and limitations of the grant dashboard.
33. I am confident about using the grant dashboard.

34. Please circle the statement that best describes your experience in grant oversight and/or monitoring prior to this workshop:
i. Integrally involved in grant oversight/monitoring
ii. Peripheral involvement in grant oversight/monitoring
iii. No previous experience but will be involved in grant oversight/monitoring in future
iv. Others. Specify: _______________________________________

35. What three sessions in this workshop did you find most useful?
i. ________________________________________
ii. ________________________________________
iii. ________________________________________

36. What three sessions in this workshop did you find least useful?
i. ________________________________________
ii. ________________________________________
iii. ________________________________________

37. Please provide an overall assessment of the workshop, including recommendations on how this workshop can be modified in the future:
___________________________________________________________________________________________

38. How do you plan to use the knowledge and skills learned from this workshop in the next 6 months?
___________________________________________________________________________________________

Annex 5: List of Workshop Participants

REGIONAL TRAINING ON THE USE OF DASHBOARD FOR CCM OVERSIGHT OF GLOBAL FUND GRANTS, November 1-4, 2011, Kuala Lumpur

# | Country | Name | Designation | Email
1. Cambodia – Dr Sin Somuny, Executive Director, MEDiCAM and CCC member – ssin@medicam-cambodia.org
2. Cambodia – Ms Hou Nirmita, Director of the Department of Women and Health, MOWA and Alternate CCC member – hnirmita@yahoo.com
3. Cambodia – Mr Kith Vanthy, Admin Officer, CCC Secretariat – cccadm@ccccambodia.org
4. Myanmar – Ms Cho Suu Suu Khaing, Information Officer, CCM Secretariat – khaingc@unaids.org
5. Myanmar – Dr Sai Kyaw Han, M&E Officer, UNOPS – SaiH@unops.org
6. Myanmar – Mr Pyi Soe, Database coordinator, PR-SC Save the Children – psoe@savechildren.org.mm
7. PNG – Dr Sibauk Bieb, Disease Control Manager, Natl Dept of Health – svbieb@gmail.com
8. PNG – Ms Monica Diapong, CCM Secretariat Administrator – monica.diapong@undpaffiliates.org
9. PNG – Ms Annette Coppola, M and E Advisor, PR – annette.coppola@jtai.com
10. Fiji – Mr Maca Colata, GMU Fiji, MoH – mcolata9@gmail.com
11. Fiji – Ms Kamni Narayan, Executive Secretary, MoH – kamni_narayan@yahoo.com
12. Fiji – Mr Maria Vucago-Poese, Fiji MoH – mariavucago@yahoo.com
13. Timor-Leste – Mr Marcelo Amaral, Principal Recipient M&E Officer – celo_02@yahoo.com
14. Timor-Leste – Mr Fransisco Martins da Silva, CCM Chair – tekitp2000@yahoo.com
15. Timor-Leste – Mr Tibor Van Staveren, Oversight Committee of the CCM-TL – progressio.tl@googlemail.com
16. Timor-Leste – Mr Noe Gaspar Pinto da Costa Pereira Jeronimo, CCM secretariat – ccmsecretariattl@gmail.com
17. Tonga – Mr Peter Thomas, Exec. Committee member – pjthomas@ihug.com.au
18. Fiji – Mr Albert Angelo Lancita Concepcion, Grant Coordinator – HIV – albertc@spc.int
19. Fiji – Ms Jitske Irene Wildschut, Coordinator – Joint Secretariat – jitskew@spc.int
20. Timor-Leste – Mr Hasibul Haque, TSFSEAP Consultant – hasibul.haque@yahoo.com
21. Fiji – Dr Jason Mitchell, TSFSEAP Consultant – JasonM@spc.int
22. Malaysia – Dr Anita Suleiman, CCM Secretariat, MoH – anita.suleiman@me.com
23. Malaysia – Ms Elaine Wong, PR M&E focal point – elaine@mac.com
24. Malaysia – Mr Sri Yusmar Mohd Yusof @ Mitch, Senior Program Manager, PT Foundation – mitch@ptfmalaysia.org
25. Indonesia – Mr Tono Permana Muhamad @ Tono, Koordinator Sekretariat Nasional GWL-INA – tono.gwl.ina@gmail.com
26. Indonesia – Mr Pardamean Napitu @ Aldo, National Program Coordinator, OPSI – Aldo.opsi@gmail.com
27. Philippines – Mr Jonas Villamor Bagas @ Jonas, Senior Political Consultant, TLF Share – jonasbagas@gmail.com
28. Cambodia – Dr Kem Ley, TSFSEAP Consultant – kem_ley@yahoo.com
29. Cambodia – Mr Umakant Singh, TSFSEAP Consultant – uksinku@gmail.com
30. Indonesia – Dr Tine Tombokan, TSFSEAP Consultant – tina@tombokan.com
31. Indonesia – Dr Eddy Lamanepa, PM&E Unit, GFATM AIDS Component, MoH – replam2002@yahoo.com
32. Philippines – Ms Maria Leny Felix, Facilitator – felix.ml08@gmail.com
33. Indonesia – Ms Sally Wellesley, Facilitator – sally.wellesley@gmail.com
34. Malaysia – Mr Graham Smith, Director, TSFSEAP – gsmith@tsfseap.org
35. Malaysia – Ms Yiga Josayma, Programme Manager, TSFSEAP – yiga@tsfseap.org
