
In 2004, the results for Mexico’s second participation in PISA were released.

These results made it possible to explore performance across states, as they included test scores for the country's 32 states. Colima, Mexico's fourth-smallest state, was an outlier: its average math score of 443 points was closer to those of some European countries (such as Greece and Serbia) than to the Mexican average (0.58 standard deviations higher). What did Colima do to achieve such remarkable results? Experts agreed that Colima took advantage of the decentralization reform enacted in 1994: the state built a learning environment with accountability, training, and organizational change at the school level, and adjusted its educational programs to the specific features and needs of its region (Alvarez et al., 2007; De Hoyos et al., 2017; Garcia-Moreno et al., 2019).
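PISA scores are reported on a scale standardized to an OECD mean of 500 and a standard deviation of 100 points. Assuming that 100-point scale (the text does not state which standard deviation it uses; the paper may instead use a within-Mexico dispersion), the 0.58 standard deviation gap can be converted back into score points with a quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope conversion of Colima's 2003 PISA math gap.
# Assumption: the gap is expressed in the OECD-standardized scale
# (s.d. = 100 points); the true reference s.d. may differ.
PISA_SD = 100            # assumed OECD standard deviation of the PISA scale
colima_math_2003 = 443   # Colima's average math score (from the text)
gap_sd = 0.58            # Colima's gap over the national mean, in s.d. (from the text)

gap_points = gap_sd * PISA_SD                          # ~58 score points
implied_national_mean = colima_math_2003 - gap_points  # ~385 points

print(f"Gap in points: {gap_points:.0f}")
print(f"Implied national mean: {implied_national_mean:.0f}")
```

If the gap were instead expressed in a within-Mexico standard deviation, the point equivalent would be smaller, so this figure should be read as an order-of-magnitude check rather than an exact decomposition.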

For instance, in 1997 Colima started an evaluation system that publicly rewarded the schools with the highest achievement scores. The following year, Colima created a mandatory school-based management program for all schools, the "School Management Program" (Programa de Gestión Escolar, PGE). This seems to be one of the main reasons behind Colima's success. However, Colima's stellar performance did not last. In PISA 2012, while still classified as a high-achieving state, its distance from the national average fell to 0.16 standard deviations. This was driven both by an improvement in the Mexican national average in PISA 2012 (of 0.28 standard deviations) and by a 0.20 standard deviation decrease in Colima's own performance.

What changed so that Colima obtained those results?


This is the story of two impact evaluations that give us a glimpse of what happened in Colima between these two results. Figure 1 depicts Mexican states' PISA math scores in 2003 and 2012; Colima is a high achiever in math in both editions relative to the rest of Mexico.

Figure 1: PISA Math Score across Mexican States, 2003 and 2012
Source: Garcia-Moreno et al. (2019)

Inspired by Colima's local SBM-PGE, in 2002 the Mexican Ministry of Education launched a school-based management (SBM) program called "Programa de Escuelas de Calidad" (PEC). PEC had a new component: direct monetary transfers to schools. The national program designed a selection process that focused the program on the most marginalized schools. The target schools were disadvantaged schools located in poor urban areas, as these were the schools that could benefit most from the program given their relatively high parental participation. In addition, rural schools already had a similar program in place at the time.

Taking advantage of decentralization, educational authorities in Colima did not follow these operational rules. First, the marginality index used to target marginalized urban schools was a categorical dummy with no variance in Colima. Second, Colima already had an SBM program (SBM-PGE) in place; for that reason, it adjusted the national SBM program (SBM-PEC) to reward the schools with the best school plans each year. This adjusted program was open to all public basic-education schools. To participate and obtain a grant, schools had to submit their improvement plan, the School Strategic Transformation Plan, along with a needs assessment and a list of actions to address those needs. Proposals were submitted early in the school year, typically by the end of October. While preparing the local SBM-PGE plan remained mandatory for all schools, not all schools chose to submit their plan to apply for national SBM-PEC benefits.

The national program (SBM-PEC) was experimentally evaluated in 2006, after five years of implementation, using the local authorities' selection rules. No significant differences between treatment and control schools were found in the first year of the program, presumably because the schools included in the evaluation had already been exposed to SBM practices from both the local and the national program before the impact evaluation. Monetary transfers did not increase achievement test scores for the treatment schools; however, test scores increased overall in Colima, for both treatment and control schools, creating a gap with the rest of the non-SBM schools in the state.

Colima used its discretion to change the selection process of the national SBM program. Was this a good decision? Did it improve learning outcomes? First, the grant did not make these PEC schools perform better; presumably, school actors were already making decisions to improve their schools along several dimensions. Second, control schools were high performers, and most of them had experience with both programs (national and local SBM). Third, the program was designed to create competition among schools, which made it attractive to schools with active school actors and well-designed school plans. Nevertheless, schools with more time in the program showed significant increases in learning outcomes in the first year of the evaluation.

In 2009, the Ministry of Education launched a program to tackle the problem of thirty thousand schools with very low math scores. While each Mexican state implemented the national Specific-Attention Program for School Achievement Improvement (Programa de Atención Específica para la Mejora del Logro Educativo, PAE) at its own discretion, Colima's education authorities identified the 108 schools that had obtained the lowest learning outcomes as measured by ENLACE, Mexico's national standardized test.
In February 2010, the state government announced the "performance status" of the selected schools: schools that performed below an arbitrary cutoff were automatically enrolled in the mandatory PAE. Once identified as a PAE school, a school was required to implement a series of pedagogical interventions set by local authorities and to consult with external advisors who assisted in the preparation and organization of a school improvement plan for a period of at least three years. This low-stakes accountability program had an impact of 0.12 standard deviations on test scores during the first year of its implementation (De Hoyos et al., 2017). The program encouraged school actors not only to promote activities that strengthen management, such as the design of a school improvement plan, but also to use the test-score information to adopt a better pedagogical approach to academic challenges. That is, obtaining performance information through the design of the school improvement plan gave schools useful tools and knowledge to tackle problems and set goals more effectively. The results suggest that information from test scores can make school actors' academic decision-making more effective, which could ultimately lead to an improvement in learning outcomes.
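The PAE assignment rule described above is mechanical: a school is enrolled if and only if its ENLACE score falls below the cutoff, which is what makes the program's impact credibly measurable. A minimal sketch of that rule, using a hypothetical cutoff value and made-up scores (the actual ENLACE threshold and school scores are not given in the text):

```python
# Illustrative sketch of PAE's cutoff-based enrollment: schools scoring
# below the threshold are automatically enrolled in the mandatory program.
# CUTOFF and the school scores below are hypothetical, not actual ENLACE data.
CUTOFF = 480.0  # hypothetical ENLACE cutoff score

def assign_pae(enlace_score: float) -> bool:
    """Return True if the school is mandatorily enrolled in PAE."""
    return enlace_score < CUTOFF

schools = {"A": 455.2, "B": 492.7, "C": 479.9}
enrolled = [name for name, score in schools.items() if assign_pae(score)]
print(enrolled)  # prints ['A', 'C']
```

Because enrollment depends only on position relative to the cutoff, schools just below and just above the threshold are otherwise comparable, which is the comparison that underpins the 0.12 standard deviation estimate cited above.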

Figure 2 depicts four math-score trajectories, one for each program group: 1) PEC control and treatment, and 2) PAE control and treatment.

Figure 2: School average performance in Colima by evaluation groups


Source: Authors’ elaboration (2019).

What can we learn?

As part of its own decentralization transition, Colima built an ecosystem for learning interventions in the late 1990s. During this process it created an efficient student-level school information system and an evaluation system for schools' own accountability. This first wave of policies paid off for Colima in PISA.

During the 2000s, Colima consolidated its educational policies by supporting schools with monetary resources through the national SBM program, whose eligibility was strongly associated with high performance on standardized tests. This eligibility design could have widened the gap between well-performing schools and those that were not the best performers but were the neediest. Colima's 2012 PISA results may have been shaped by these policies favoring competition among schools and encouraging school actors to mobilize efforts, strategies, and resources.

Colima then amended the way it targeted schools, using standardized tests to identify lagging schools, which were given the chance to improve their performance. Unfortunately, the PAE program was canceled in 2013 and its impact vanished; those cohorts would have taken PISA in 2015. How does this story end? The National Institute for Educational Evaluation (INEE) canceled not just ENLACE but also state-level PISA from 2014 onward. For that reason, the story is incomplete.
Big picture implications:

Decentralization is a great tool to improve learning, but evidence-based policies help local authorities achieve the most favorable impact.

● National programs can conflict with the local context. A comprehensive assessment is needed to provide evidence supporting educational interventions.
o Educational authorities regularly use observational information with a lag, which makes it hard to address dynamic factors that affect learning.
o General assessments are a great tool for average-performing schools, but we should understand the heterogeneity within and among states, systems, and populations.
● Differentiated interventions will lead to a more efficient, higher-performing system.

On the SBM policy

The timing and type of SBM interventions matter for improving learning outcomes. More sophisticated SBM interventions moving toward full autonomy may help high-performing schools, while moderate interventions may improve decision-making in low-performing schools. For example, PEC was a limited version of school autonomy with an indirect effect on the quality of education. The PEC program increased school actors' role in areas not necessarily related to pedagogical interventions (e.g., setting their own curriculum or hiring and firing teachers).
