
digitalGREEN

Quality Assurance Framework Version: June 11, 2011

1 Introduction
Digital Green builds and deploys information and communication technology to amplify the effectiveness of development efforts to effect sustained social change. Together with our partners, we are accelerating the scale-up of the model across smallholder farming communities in South Asia and Africa. As we expand, we are proactively committed to maintaining quality both in terms of (1) the efficiency of our interventions, which includes the production and dissemination of locally relevant content, as well as (2) their impact, which includes the increased take-up of modern sustainable agricultural practices and the ultimate sustainable improvement in the socioeconomic status and self-efficacy of the communities that we work with. We employ a rigorous process in selecting partners supported by three core pillars: (1) locally relevant domain expertise, (2) established operational scale, and (3) community-level trust. By closely collaborating with organizations that are focused on improving the well-being of grassroots-level communities, our interventions are directed toward building the capacities of our partners to integrate and institutionalize our model of technology and social organization, improving the efficiency of their work and broadening the participation of the community. In a controlled research trial in 16 villages with one NGO partner in 2007-08, Digital Green was shown to increase the adoption of certain agricultural practices seven-fold and, on a cost-per-adoption basis, to be at least ten times as effective per dollar spent in converting farmers to better farming practices than classical approaches to agriculture extension. Over the last three years, we have scaled to the present 800 villages with seven NGO partners and aim to extend this system to potentially 10,000 villages over the next three years in India and other developing countries around the world.
As a result, we have a renewed focus and commitment to ensure high-quality processes and impact in our work. This document provides an overview of the quality assurance
strategies that we currently employ and will further strengthen, as well as, in some cases, new elements that we will introduce to the system. Though there is some overlap, we have segmented our quality assurance strategies into two parts: (1) process quality and (2) content quality. Process quality ensures that the aspects of the Digital Green system are institutionalized with the partners and communities that we work with in a consistent and coherent manner to improve the efficiency of existing extension systems, whereas content quality ensures that the information exchanged across Digital Green-supported extension systems provides sustained, positive value for the members of the communities that use it. To support these protocols, we will create a directorate of quality assurance at Digital Green's headquarters to anchor these processes and to coordinate exchanges and learnings across the organization. The directorate will also work with Digital Green's partners and create supportive structures to manage the process quality and content quality assurance mechanisms described in this document. For instance, a technical advisory panel of experts will be constituted to provide input into assessing and documenting the quality of content and assuring the ultimate impact that we seek to make in improving the livelihoods and empowerment of the community. The following schematic provides an overview of Digital Green's quality assurance framework:


[Schematic: the Quality Assurance Directorate oversees Process Quality and Content Quality, which drive Efficiency and Impact, leading to Improved Livelihoods & Empowerment.]

This quality assurance framework will be reviewed on a biannual basis to assess whether it is achieving the efficiency and impact gains that we seek. Digital Green's board and senior management team, along with representatives of relevant stakeholders, will be requested to participate in these discussions and to determine if there might be opportunities for further refinement. These refinements will be documented in future versions of this document.

2 Process Quality
2.1 Standard Operating Procedures (SOPs)
Since Digital Green's inception as an organization in May 2009, we have actively captured the learnings and experiences of our team and our partners from the field deployments of the Digital Green system in the form of a standard operating procedures framework. This framework continues to be developed in an iterative manner and details, step by step, the elements of the Digital Green system: from selecting partners, to identifying intermediaries from the community, to selecting topics for video production, to mediating the
screening of videos, to the follow-up and monitoring of practices that community members may choose to take up for themselves. This framework has been designed to adapt to the diverse contexts of the organizational and operational mandates and structures of the partners and communities that we work with, as well as to be dynamic to changes over time. Our existing partnerships with eight non-governmental organizations, one government department, and one agribusiness of varying sizes and approaches have provided a richness of experience to prioritize those areas that require greater attention as well as those that might need to be customized. Further, this framework defines the roles of Digital Green, our partners, and the community in assuring consistent processes in our interventions and sustainable impact as a result. The core components of our standard operating procedures framework cover the three primary areas of the Digital Green system: initiation, production, and diffusion.

Each of these components and sub-components is documented and forms the basis of the training curriculum that is used to institutionalize the Digital Green system with appropriate stakeholders within our partner organizations and in the community. Our performance-based training methodology combines classroom training, hands-on demonstrations, supportive supervision, and follow-up in the field to ensure that program executives use the standard operating procedures as a framework for their interventions or document variations and adaptations that might be necessary to suit local conditions. Since Digital Green effectively serves as a trainer of trainers to its partners, we actively seek to identify and address possible attenuation in the coherence and consistency of the model while replicating the system.


2.2 Supportive Supervision & Follow-up Training


Supportive supervision is an attitude first and a process second. It is the creation of an environment that allows trainees to develop and enhance performance regardless of current level of performance or professional expertise.

As our partners build their capacity on the technology and social organizational components of the Digital Green system and integrate it with their existing interventions, we continue to be available in a resource agency function, providing the necessary backstopping support as the training processes and activities occur at multiple levels. For instance, we observe the manner in which community intermediaries who are involved in video production and diffusion are selected and trained to ensure that the community has an active role and that thresholds in motivation, leadership, and communication are developed and maintained. In addition, we work to identify and close gaps in the implementation of the system.

For example, the initial content selected for production and diffusion is largely determined based on the existing programs of our partner extension systems. The Digital Green system captures feedback from the community at the time of disseminations in the form of the questions and interests that individual farmers express at that time, as well as through the adoption of practices (or lack thereof) by individual farmers after the disseminations. Digital Green's technology stack, which includes the Connect Online, Connect Offline (COCO) data management framework as well as the Analytics suite of dashboards, provides an open-access, easy-to-use platform for insight into what is going well and what might need further attention. We spend time, for example, working with our partners to ensure that this data is used to inform the next iteration of content that is produced to progressively better address the needs and interests of the community over time.

As the Digital Green system has multiple touch-points with the partners and communities that it integrates with, we have developed checklists (e.g., Annexure 1) of critical elements that require particular attention, both for post-training reference purposes as well as to provide a mechanism to perform an assessment that can be used to better target additional follow-up trainings and to provide feedback to partners on areas that might require further strengthening. The checklists have been designed using the standard operating procedures framework as a basis and follow a similar flow from initiation to production to dissemination.

2.3 Site Visits


Digital Green's team of trainers periodically visits the field sites of our partners to observe the processes of initiation, video production, and diffusion. For each component, they reference the data that has been recorded through the COCO and Analytics systems and randomly sample activities and artefacts of the interventions. For instance, this includes understanding and observing processes for topic selection, storyboarding, shooting, and editing for video production, and dissemination, adoption, and reporting for video diffusion. The standard operating procedures framework and checklists provide a basis for recommendations on areas that need improvement, but as discussed in the previous section, we also maintain an openness to learn and to incorporate innovations to the model, as well as customizations of the approach that might be necessary to suit the variety of organizations, communities, and contexts that we work with. In all cases, we have found that documenting these experiences, recommendations, and learnings in a structured manner, using tools like checklists and trip reports, is crucial to be able to communicate and follow up with the stakeholders involved as well as to better facilitate exchanges across locations, partners, and team members. We encourage members of partner organizations to perform field visits along similar lines, at a higher frequency and with a greater sample rate in the areas in which they operate, to be able to better address the strengths, weaknesses, opportunities, and threats across broader geographies as well as in a deeper manner through detailed interviews and assessments at the level of individual villages and participants.

2.4 Monthly Reviews


To share experiences across our partners and to develop plans on the way forward on a regular basis, Digital Green convenes monthly review meetings at multiple levels, including: (1) partners' senior management and partners' executive staff, (2) Digital Green's regional leadership and partners' executive staff, and (3) Digital Green's senior leadership and partners' senior management. These convenings are geared to develop an enabling environment in which both
processes as well as impact are reviewed. It is of course possible to have strong processes but no impact; our experience has shown, however, that emphasizing both processes and ultimate impact on the livelihoods of individual community members aligns our partners' existing interventions and objectives with Digital Green's role in improving their efficiency and ability to succeed.

Creating systemic change is an iterative and continuous process. The monthly reviews consider both previous histories as well as project plans to understand current and anticipated challenges, to develop strategies to address them, and to plan more effective ways to implement the activities going forward. At both national and regional levels, partners are also brought together periodically to exchange experiences, challenges, and learnings, both to celebrate the achievements of partners and individual communities and to brainstorm possible issues, such as work plans, human resources, financing, activities, and impact, that could be better discussed as a group.

2.5 Data Management System


A core ethos of the Digital Green system is to partner with existing development organizations and to use data to inform these organizations on ways in which they can improve their efficiency, target their interventions, and better address the needs and interests of the community to have a sustainable impact. To support this aim, Digital Green has developed a technology stack, which includes the Connect Online, Connect Offline (COCO) data management framework and the Analytics suite of web-based dashboards. These systems essentially capture the interactions of individual community members with Digital Green and its partners as they appear in videos, attend video disseminations, express particular questions or interests, and adopt featured practices for themselves over the course of time. The source of this data is largely the community intermediaries who are involved in the video production and dissemination activity. The data is initially recorded on paper forms and then digitally transcribed at block- or district-level locations. These processes of capturing and analysing data have been designed in a structured manner for ease of entry and analysis, as well as to mitigate issues with data reliability. The richness of the data allows both for aggregate and disaggregate analytics in various time-, geographic-, and partner-based dimensions and provides opportunities for
sharing. We have found that it even stimulates healthy competition across partners, since our Analytics suite of web-based dashboards is publicly accessible.
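As an illustration of the aggregate and disaggregate roll-ups this data supports, a minimal sketch follows. The record fields, partner names, and figures are hypothetical, since this document does not specify COCO's actual schema:

```python
from collections import defaultdict

# Hypothetical screening records mirroring the kind of data COCO captures;
# field names and values are illustrative assumptions only.
screenings = [
    {"partner": "NGO-A", "village": "V1", "month": "2011-04", "attendance": 18},
    {"partner": "NGO-A", "village": "V2", "month": "2011-04", "attendance": 25},
    {"partner": "NGO-B", "village": "V3", "month": "2011-05", "attendance": 12},
]

def aggregate(records, keys, value):
    """Roll up a numeric field along any combination of dimensions."""
    totals = defaultdict(int)
    for r in records:
        totals[tuple(r[k] for k in keys)] += r[value]
    return dict(totals)

# Aggregate view: total attendance per partner per month.
by_partner_month = aggregate(screenings, ("partner", "month"), "attendance")
# Disaggregate view: the same data rolled up per village instead.
by_village = aggregate(screenings, ("village",), "attendance")
```

The same helper serves any time-, geographic-, or partner-based dimension simply by changing the key tuple, which is the flexibility the dashboards described above rely on.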
Certain parameters, like adoption data, require additional layers of inspection to be meaningful. In the case of adoptions, this includes a definition of each practice, its core components, and an expected cost-benefit statement on its value. Farmers attend video disseminations with high regularity (usually once per week), so each video focuses on just one practice at a time. For each practice (or video), checklists are developed to support community intermediaries involved in video disseminations to reference its core components as well as to physically verify whether the practices that community members adopt were applied correctly, incorrectly, or adapted innovatively. This dataset also allows us to evaluate the quality of the mediations conducted of the videos, using proximate metrics like the questions and interests expressed during them, and provides insight into those practices that may be the most (or least) popular in a particular location at a particular time. The videos produced by our network of partners and community members are archived on our website and on YouTube, and by linking the data from the usage of the videos, we can recommend videos based on parameters such as geography, crop, seasonality, interest, clarity, and impact. Our data management framework also includes automated mechanisms to check the consistency of the data and to raise red flags (e.g., adoption of a practice prior to its dissemination) that might require further investigation. In addition, our partners and team members regularly sample the raw data from within the system and cross-verify its accuracy in the field.
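One automated consistency check of the kind mentioned above, flagging adoptions recorded before a practice was ever disseminated in a village, might look like this minimal sketch. The record fields and sample values are illustrative assumptions, not COCO's actual schema:

```python
def red_flags(disseminations, adoptions):
    """Flag adoption records dated before the practice's first dissemination
    in that village, or with no dissemination on record at all."""
    first_shown = {}
    for d in disseminations:
        key = (d["village"], d["video"])
        if key not in first_shown or d["date"] < first_shown[key]:
            first_shown[key] = d["date"]
    flags = []
    for a in adoptions:
        shown = first_shown.get((a["village"], a["video"]))
        if shown is None or a["date"] < shown:
            flags.append(a)
    return flags

# Hypothetical records; ISO date strings compare correctly as plain strings.
disseminations = [{"village": "V1", "video": "azolla", "date": "2011-03-01"}]
adoptions = [
    {"village": "V1", "video": "azolla", "date": "2011-02-15"},  # before screening
    {"village": "V1", "video": "azolla", "date": "2011-03-20"},  # plausible
    {"village": "V2", "video": "azolla", "date": "2011-03-05"},  # never screened in V2
]
flagged = red_flags(disseminations, adoptions)
```

Records returned by such a check would not be deleted automatically; as the text notes, they are raised for field verification by partners and team members.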

2.6 Review & Audit


The aforementioned mechanisms provide our team of trainers, partner staff, and community intermediaries a framework and a toolbox to select from for ensuring quality as we (1) bootstrap our partners' existing institutional capacity and target investments in operational tools and processes, (2) monitor and evaluate progress with feedback and refinement support, (3) provide a platform to share content amongst partners, and (4) enable partners to sustain and
expand the program on their own. To provide an additional check and balance, we have instituted a system of biannual, formal reviews and audits in which a team not associated with the project area, but familiar with the Digital Green model, randomly samples villages and conducts structured interactions with community intermediaries, participants (as well as non-participants) in the community, and partner staff at various levels. These audits assess the coherence and consistency of the Digital Green system as well as its impact, gather feedback from members of the community, and provide recommendations on aspects that could be improved. These surveys include qualitative as well as quantitative aspects, as delineated in Annexure 2. Like other quality assurance mechanisms, the purpose of these reviews and audits is not to police our partners or team members, but rather to introduce an external perspective that might better see the forest for the trees from observations across locations and partners and to share input on specific aspects that require strengthening.
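The random sampling step in these audits could be sketched as follows; the village identifiers, sampling fraction, and seed are illustrative assumptions rather than a prescribed audit protocol, and a recorded seed simply makes the draw reproducible for the audit trail:

```python
import random

def sample_villages(villages, fraction=0.1, seed=None):
    """Draw a simple random sample of operational villages for a review round.
    Recording the seed lets auditors reproduce exactly the same draw later."""
    rng = random.Random(seed)
    k = max(1, round(len(villages) * fraction))
    return sorted(rng.sample(villages, k))

# e.g., roughly the 800 villages currently operational.
villages = [f"V{i:03d}" for i in range(1, 801)]
audit_set = sample_villages(villages, fraction=0.05, seed=2011)
```

A stratified draw (e.g., a fixed fraction per partner or district) would be a natural refinement where coverage across partners matters more than a single pooled sample.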

2.7 Village Certification


Bringing these quality assurance measures together, we have defined a process of certifying villages in which the Digital Green system has been operational for at least three months. As discussed earlier, Digital Green employs a rigorous process for selecting partners and has defined a framework for identifying members of the community involved as intermediaries in video production as well as dissemination. The process of video production also includes selecting the members of the community who are featured in the videos. Each of these processes serves as an incentive that each stakeholder seeks: that is, to become a Digital Green partner, to be a resource person in one's community, and to be featured as a role model in one's peer group. At the level of an individual village, we can ensure that these and other aspects of the Digital Green system, as defined by our standard operating procedures framework, meet both the process standards and the outcome impact that we seek to achieve with the partners and communities that we work with. These processes occur on a recurring basis as partner staff members and community intermediaries may change, local contexts may evolve, and improvements to the Digital Green system may be introduced over time. Consequently, villages are re-certified biannually upon completion of the formal review and audit procedures.


3 Content Quality
3.1 Partner Due Diligence
Digital Green uses a rigorous due diligence procedure for identifying its partners based on three primary criteria: (1) scope of locally relevant domain expertise, (2) established level of scale in their existing extension operations, and (3) community-level trust and rapport. For the purpose of ensuring the quality of content, the first criterion of locally relevant domain expertise is of critical importance. We select partners that have demonstrated experience in utilizing a combination of externally sourced research as well as on-farm participatory research trials to determine the package of practices and technologies that they share in a particular geography, as well as their cost-benefits, limitations, etc. Often, we have used a competitive Request for Proposals (RFP) mechanism to select the most suitable partners. Our partners typically have an existing schedule of interventions, which include field demonstrations, exposure visits, farmer field schools, and the like. The content initially produced is seeded with those practices and technologies that are shared across these extension approaches. In addition, the process of producing short videos involves the modularization of complex practices and technologies into digestible forms. Consequently, we work with our partners to standardize the practices in their knowledge banks (i.e., physical or virtual) as appropriate to the location-specific contexts of the communities that they work with. We also ensure that our partners have worked through the relevant linkages across a value chain (e.g., credit, inputs, markets, government schemes) that might be necessary for particular practices or technologies to have value.
The practices are also sequenced to provide the full value to those that adopt an entire package, as well as to build interest amongst the community by first showcasing those practices that provide tangible value within a short duration and later those that might provide longer-term gains.

3.2 Technical Advisory Panel


The Standard Operating Procedures (SOPs) described in Section 2.1 include checkpoints to ensure that videos are vetted by the domain experts in our partner organizations at various stages of the video production process: i.e., from identifying relevant topics, to ensuring the correctness and completeness of storyboards, to reviewing the final video prior to its distribution. These domain experts typically
are based at the district- or block-level locations in which the Digital Green system has been deployed and have specializations in agronomy, animal husbandry, community mobilization, or other related livelihood disciplines. Though several of our partners also have higher-level experts based at state and national levels, we have found that having them vet the content is either cumbersome (e.g., because of delays in the transmission of videos or the translation of local languages) or less than useful (e.g., unknown variances that might be necessary to adapt a practice to suit local conditions). Digital Green's team of trainers is positioned to provide input on the aesthetic and production qualities of videos. To give input on the content itself, we plan to constitute a technical advisory panel of domain experts that are internationally respected in the various disciplines in which content is produced. Initially, Digital Green's quality assurance manager would be responsible for liaising with this panel, whose members would be recruited as consultants to periodically review content and to work with each partner to ensure that they have adequate capacity and systems to vet content quality. Over time, we would look to establish a team in-house to perform these functions.

3.3 Scientific References & Farmer Feedback


Our partner extension systems maintain primary responsibility for determining the practices and technologies exchanged in a particular locale, but Digital Green is involved in facilitating the use of data to inform partners on targeting improvements to their programs and in establishing linkages with external resources to support their interventions. As discussed in Section 2.5, the Digital Green system captures data on the interactions that farmers have with the videos in terms of the interests and questions that they express, as well as whether they adopt the practices featured within the videos for themselves. The videos are archived on YouTube, and Digital Green's website serves as an open-access library for these videos alongside relevant data, including geography, seasonality, crop, practice, partner, interest, clarity, and effectiveness. Essentially, the community's feedback is used to inform our partners to better align their interventions (i.e., the videos they produce as well as their interventions generally) with the needs and interests of the community. This data is of course received after the videos have been shared with the community, but we plan to strengthen the validation of content prior to its release by surveying and referencing relevant scientific literature from
internationally recognized sources. Our technical advisory panel will be charged with working with our partners to do so, and these references will be prominently displayed and linked to the videos on Digital Green's website. We also plan to enter agreements with members of the Consultative Group on International Agricultural Research (CGIAR), including the International Rice Research Institute (IRRI), to work together in strengthening extension systems in India and elsewhere. These collaborations in well-defined geographical areas could facilitate a two-way information exchange wherein research from these organizations would be localized through the Digital Green system and feedback from the community could be used to potentially inform research agendas.

3.4 Participatory Research


Digital Green's emphasis on local relevance typically requires practices and technologies to be adapted to suit the specific contexts in which communities reside. Digital Green's partners focus on the use of sustainable agricultural practices to be able to provide viable choices in their resource-constrained contexts. Though there is growing interest in the international agricultural development community to study sustainable agriculture, some practices may not have been validated in a rigorous, scientific manner. We plan to work with researchers from the University of California, Berkeley, Yale University, and Innovations for Poverty Action to perform a randomized controlled trial of the Digital Green system at scale. In addition to studying the Digital Green system itself, they will assess the effectiveness of the practices in terms of metrics like changes in yield and income. A majority of the videos that Digital Green's partners produce on improved inputs or production techniques emphasize their cost-benefit to allow farmers to make informed decisions on those that they might choose to follow through upon and adopt for themselves. We plan to work with external researchers and our partners to carefully understand how the adoption of practices correlates with changes in yield or income to farmers. Though we have conducted some preliminary, small-scale evaluations along these lines, the ability of Digital Green and its partners to do so in a more systematic fashion has been limited by human and financial constraints. By conducting these research trials in a participatory manner on actual farmer fields, we have an opportunity both to understand the specific nuances that determine the value that a practice
provides to individuals as well as have a site from which videos showcasing the utility of practices over control conditions can be produced and shared with farmers and researchers alike.
