
Psychology in the Schools, Vol. 00(0), 2013
© 2013 Wiley Periodicals, Inc.
View this article online at wileyonlinelibrary.com/journal/pits DOI: 10.1002/pits.21703

MEASURING ASPIRATIONS, BELONGING, AND PRODUCTIVITY IN SECONDARY STUDENTS: VALIDATION OF THE STUDENT SCHOOL ENGAGEMENT MEASURE

CYNTHIA E. HAZEL, G. EMMA VAZIRABADI, AND JOHN GALLAGHER
University of Denver

This article proposes a model of student school engagement, comprising aspirations, belonging,
and productivity. From this model, items for the Student School Engagement Measure (SSEM)
were developed. The SSEM was validated with data from 396 eighth graders in an urban school
district. Utilizing structural equation modeling, the second-order empirical model of the SSEM
was found to fit the data well, to have good reliability for the three factors, and to be predictive
of district-identified risk factors and state standardized academic assessment results. These results
suggest that the Student School Engagement Model and the SSEM may be useful tools for
understanding which students might be at increased risk for school dropout and how to intervene
to support school completion. Recommendations for practitioners and future research are given.

School psychologists and educators are increasingly interested in understanding the processes
by which students become engaged in, or disengaged from, school. A growing body of research
demonstrates the tragic outcomes of disengagement, as well as the possibilities for reshaping this
malleable construct by building on student strengths and contextual protective factors (Archambault,
Janosz, Fallu, & Pagani, 2009; Furlong & Christenson, 2008; National Research Council, 2004). This
article contributes to the abilities of school psychologists to provide strengths-based, population-
wide, data-driven services by reviewing the current engagement models, proposing a new model of
engagement, and providing the findings from the validation of a screening instrument to measure
engagement of secondary school students.

STUDENT SCHOOL ENGAGEMENT DEFINITIONS


Students’ engagement with school does not have an agreed-on definition. This is in part because
various researchers conceptualize engagement differently, as can be seen in the terminology
they use. For instance, Christenson and colleagues use the term student engagement (see, e.g.,
Appleton, Christenson, & Furlong, 2008; Christenson, Reschly, & Wylie, 2012), which emphasizes
that engagement is located within students; conversely, Fredricks and others use the term school
engagement (see, e.g., Fredricks, Blumenfeld, Friedel, & Paris, 2003; Jimerson, Campos, & Greif,
2003), which emphasizes engagement with the school context. We believe that both components
are important. We employ the term student school engagement to emphasize that the measurement
of engagement represents the student’s perception of the goodness of fit between his or her
needs and the specific environment (Lewin, 1943). This term emphasizes two equally salient points:
(1) students at one school will vary in their levels of engagement with the school, and (2) some
schools will engender greater (or lesser) engagement overall than will other schools.

This research was partially funded by a grant from the University of Denver. The data presented are based on
the analysis in the doctoral dissertation of Dr. Vazirabadi. Thanks to Jennifer Albanes, Kelli Pfaff, Susan McDonald,
Christina Jack, and Rachel Wonner for their assistance in this work.
Correspondence to: Cynthia E. Hazel, Child, Family, and School Psychology, Morgridge College of Education,
University of Denver, 1999 East Evans Street, Denver CO 80208. E-mail: chazel@du.edu


There is consensus that “engagement is a multi-dimensional construct . . . [that] is highly


influenced by specific facilitators such as family and school expectations” (Christenson, n.d., para. 1)
and represents “the fusion of behavior, emotion, and cognition under the idea of engagement”
(Fredricks, Blumenfeld, & Paris, 2004, p. 61). The following is the definition used in this article:

Student school engagement is a multi-dimensional meta-construct representing a student’s internally


and externally mediated affiliation with and investment in schooling. Student school engagement
is a biopsychosocial phenomenon, occurring in and responding to environmental contexts within a
developmental trajectory. (Hazel, Wonner, & Jack, 2008)

The underlying premises of all definitions are that engagement is plastic (and therefore can be
changed and increased) and that higher levels of engagement result in improved academic perfor-
mance and increased likelihood of school completion for the student.

IMPORTANCE OF STUDENTS’ ENGAGEMENT WITH SCHOOL


Students’ engagement with school has been found to be an important factor in students’ school
success (Fredricks et al., 2004; Kortering & Braziel, 2008) and adolescent well-being (Archambault
et al., 2009). Engagement with school may also lead to enhanced psychosocial engagement across
the lifespan in multiple settings (Furlong et al., 2003). Because engagement facilitates academic and
social learning, school engagement is a relevant construct for all students (Furlong & Christenson,
2008). However, understanding students’ engagement with school has been primarily motivated by
a desire to increase academic achievement and high school graduation rates (Appleton et al., 2008;
Fredricks et al., 2011).
The high school graduation rate has not changed significantly over the past 30 years (Dynarski
et al., 2008). Approximately 75% of ninth graders will graduate from high school with their cohort
(Aud et al., 2011), and almost 4% of students drop out of high school each year (Chapman, Laird,
& KewalRamani, 2010). However, graduation and dropout rates vary by student race and economic
status, state, and school characteristics. For instance, although only 2% of White students drop out
each year, 5% of Hispanic and 6% of Black students drop out each year. Poverty is an even greater
predictor of dropping out; 9% of low-income students drop out each year (Chapman et al., 2010).
Looking across states, there are discrepancies beyond what can be explained by student char-
acteristics. In 2008, 90% of Wisconsin students graduated on time, but only 64% of Mississippi
students did so (Balfanz, Bridgeland, Moore, & Fox, 2010). Even within states and districts, gradu-
ation rates can vary greatly. Schools at which graduation rates are 50% or lower have been labeled
“dropout factories” (Balfanz et al., 2010). Although these schools represent a little more than 10%
of all high schools, they are responsible for half of the nation’s dropouts; these schools are mostly
large urban institutions that disproportionately serve low-income and minority students (Balfanz
et al., 2010).
The three most highly predictive indicators of a student dropping out are the ABCs: Attendance
(below 90%), Behavior (two or more infractions), and Course Performance (inability to read at grade
level by third grade, failure in English or math in sixth through ninth grade, grade point average
(GPA) lower than 2.0, or failure to earn on-time promotion to the tenth grade; Bruce, Bridgeland,
Fox, & Balfanz, 2011). Although the indicators of importance seem to be agreed on, their utility
is still being debated. For instance, in Chicago Public Schools, these indicators had a high school
graduation prediction power of 80% (73% for predicting nongraduates and 85% for predicting
graduates) for ninth-grade students (Allensworth & Easton, 2007). In contrast, a meta-analysis of
similar risk factors showed that the factors were only able to predict who would drop out with 42%
accuracy (Gleason & Dynarski, 2002).




However, many nongraduates begin the process of disengaging and ultimately dropping out of
school much earlier than ninth grade (Finn, 1989; National Research Council, 2004). Identifying
students with low engagement, prior to high school, may provide opportunities to alter trajectories
and increase the probability that these students will complete their schooling.

MEASUREMENT OF ENGAGEMENT
The evolving terminology and definitions of engagement present a number of challenges to
the measurement of the construct (Appleton et al., 2008), not least of which is the differentiation
between a student’s engagement and the variables in the environment that support or hinder a
student from engaging in his or her schooling. Sometimes indicators of engagement (such as
attendance, credits earned, or academic competence) and facilitators of engagement (contextual
factors, such as discipline policies, parental supervision, and peer attitudes) are confused with
engagement (Furlong & Christenson, 2008). Although indicators and facilitators can be expected to
correlate with engagement, they do not capture a student’s engagement with school. For instance,
two students could have low attendance rates; for one student, this could be due to low engagement,
and for the other student, it could be due to illness. Both students are at increased risk for not
completing high school with their cohort should their attendance patterns continue; however, the
appropriate interventions to increase their attendance would be different.
Two models of student school engagement have been empirically validated. These models,
referred to in this article as the School Engagement Model and the Student Engagement Model, will
be reviewed, and a third model, termed the Student School Engagement Model, will be proposed.

School Engagement Model


In the School Engagement Model, Blumenfeld, Fredricks, and colleagues
posited three interrelated domains of engagement: emotional engagement, behavioral engagement,
and cognitive engagement (Finlay, 2006; Fredricks et al., 2003; Fredricks et al., 2004; Jimerson
et al., 2003). Emotional engagement refers to students’ attachment to their teachers and peers,
as well as their feelings about academics and school in general. Behavioral engagement includes
students’ positive conduct, such as effort, persistence, concentration, attention, and contributions
in class (Fredricks et al., 2004). Cognitive engagement comprises both motivation and learning
strategies (Fredricks et al., 2004). Motivation includes students’ investment in their learning, as well
as an intrinsic desire to learn and master the material. Students who are cognitively engaged are
able to self-monitor and evaluate their learning strategies (Fredricks et al., 2004). The authors
posit that emotional engagement may be a precursor to cognitive and behavioral engagement, whereas
cognitive and behavioral engagement appear most directly linked to academic success (Buhs, Ladd,
& Herald, 2006; Fredricks et al., 2004).
In a validation study of the School Engagement Measure (SEM) with elementary students, the
reliability of the three scales ranged from .77 to .86, and the items loaded on the three domains
as predicted (Fredricks et al., 2004). When used with middle school students, the reliability of
the scales ranged from .79 to .92; low and inconsistent correlations between the behavioral and
cognitive scales and academic variables (GPA and attendance) were found (Finlay, 2006). With
Canadian seventh- through ninth-grade students, a higher order three-factor model was validated,
with internal consistency of the three factors ranging from .65 to .88; global school engagement and
behavioral engagement were shown to be significant predictors of school dropout, controlling for
the effects of student age, maternal education, and secondary school retention (Archambault et al.,
2009).




Student Engagement Model


Christenson, Reschly, and Wylie (2012) proposed a four-factor model of Student Engagement:
academic, behavioral, cognitive, and affective engagement. This model differentiates between ob-
servable factors (academic and behavioral engagement) and internal factors (cognitive and affective
engagement) that contribute to students’ engagement. The observable engagement factors can be
determined by observation of behaviors; the internal engagement factors require self-report to as-
sess. Academic engagement is evidenced by behaviors such as time on task, credits earned, and
homework completion. Behavioral engagement is determined by attendance, voluntary classroom
participation, extracurricular participation, and extra-credit options. Cognitive engagement includes
students’ appraisals of self-regulation abilities, the relevance of school to future aspirations, the
value of learning, and ability to set goals and strategize. Affective engagement encompasses stu-
dents’ sense of belonging at school, identification with school, and appraisal of school membership.
Christenson, Reschly, and Wylie (2012) propose that multiple systemic contexts (e.g., family, peers,
and school) influence student engagement and encourage or hinder academic, social, and emotional
outcomes.
The Student Engagement Instrument (SEI), developed to measure cognitive and affective en-
gagement in secondary students, was shown empirically to represent six factors; these factors corre-
lated significantly with academic variables (GPA, reading and math achievement, and suspensions),
and internal consistency of the subscales ranged from .72 to .88 (Appleton, Christenson, Kim, &
Reschly, 2006). Further research has validated five factors (three for affective engagement and two
for cognitive engagement) and has shown invariance across gender and Grades 6 through 12 (Betts,
Appleton, Reschly, Christenson, & Huebner, 2010). Reliability of the factors has been reported to
range from .75 to .89, and the factors correlated with the academic variables of attendance, periods
tardy, and GPA (Christenson, Reschly, & Wylie, 2012).

Student School Engagement Model


The Student School Engagement Model was designed after a comprehensive literature review
that included researching the two previous models, earlier work on student motivation and school
belonging, and dropout prevention. We selected domains that represented students’ self-appraisal of fit
with the school environment and that have been shown to impact student success and persistence
in schooling. The three hypothesized domains of student school engagement were aspirations,
belonging, and productivity. Aspirations were defined as students’ interest and investment in their
education. Aspirations lead to students’ appraisals of the worthwhileness of an education and
its utility to their future. Belonging was defined as students’ identification with school values
and positive relationships with adults and peers at school. This domain was conceptualized as
students’ sense that they are a member of the school community as well as their commitment to the
school’s norms. Productivity was defined as students’ effort, persistence, concentration, attention,
and willingness to work on academic tasks. Productivity encompassed cognitive strategies designed
to monitor and maximize learning.

Validation of the Student School Engagement Measure


The purpose of this study was twofold. The first purpose was to assess the psychometric
properties of a student self-report measure of student school engagement, called the Student School
Engagement Measure (SSEM). The second purpose of the study was to determine the extent to
which the SSEM related to important student outcomes (standardized test and course achievement,
attendance rates, and behavior infractions).




METHOD
Participants
The participants were 396 eighth graders at three middle schools (73% of all eighth-grade
students at these schools) within an urban district in the central mountain region of the United States.
All questionnaire data were screened to determine the presence of outliers and missing data. Three
cases (1.4%) were missing more than 10% of the data; thus, they were eliminated from the analysis.
An additional 4 students were eliminated because they provided inaccurate identification and could
not be matched with records provided by the school district. After elimination of these students, the
sample size was 388 students; of these, 98% had 100% complete protocols. According to the district
records, most (80%) of the students were identified as Hispanic, 56% were male, 69% qualified for
free or reduced-price lunches, 18% qualified for English-language services, 6% had been identified
as gifted and talented, and 8% qualified for special education services.

Instrument Development, Instrument Administration, and Student Outcome Measures


Instrument Development. Development of the SSEM began with a review of relevant literature,
existing measures of engagement, and related constructs. From this review, the stated definition of
engagement was developed and the three domains defined. The research team wrote a large bank
of potential items and refined these to 50 items that addressed the three theorized domains. The
items were intended to avoid the assessment of cultural, academic, and behavioral histories. The
researchers attempted to craft items in language that was developmentally appropriate for the
participant population. The resulting draft survey was subjected to cognitive interviews (Willis,
2005) with eighth-grade students at nonparticipating schools within the same district as the study
participants. The students completed the measure and, as they answered each item, they explained
how they interpreted the item and why they answered as they did. Feedback from the cognitive
interviews was used to refine a few potentially problematic items.
The pilot SSEM consisted of 50 items rated on a 10-point Likert scale of agreement (see Table 1 for
items). The items were not grouped by domain, but were presented in random order to discourage
hurried or thoughtless answers to similar questions. Twelve items were negatively worded (and
reverse coded on analysis) to minimize acquiescence bias (Watson, 1992). Because many families in
this district speak Spanish, the survey instructions and each of the items were translated into Spanish
using a double-blind translation procedure: the English version was translated into Spanish by one
bilingual graduate student, and the Spanish version was translated back into English by a second
bilingual graduate student who had no prior knowledge of the instrument. Once translated back into
English, the instructions and items were compared with the original English version for consistency
and agreement, leading to minor translation improvements. There was one protocol: the directions
and each item were listed first in English and then in Spanish, so that a student could read either or
both languages. For the purpose of assessing convergent and discriminant validity (not discussed in
this article), the pilot SSEM was administered with several other measures, for a total of 124 items
(all listed concurrently in English and Spanish) on the protocol.
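The reverse-coding step can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the list of negatively worded items comes from the items marked as reverse coded in Table 1, and the function name is hypothetical.

```python
# Illustrative sketch of reverse coding the 12 negatively worded SSEM items
# on the 10-point agreement scale: a raw rating r is recoded as 11 - r, so
# that higher scores consistently indicate greater engagement. Item numbers
# are those flagged as reverse coded in Table 1.
REVERSE_CODED_ITEMS = {1, 2, 7, 10, 11, 23, 24, 26, 34, 35, 46, 50}

def reverse_code(responses):
    """responses maps item number -> raw rating (1-10)."""
    return {item: (11 - r if item in REVERSE_CODED_ITEMS else r)
            for item, r in responses.items()}
```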

Instrument Administration. The study instrument was administered electronically to eighth-


grade students at two schools and on paper to students at one school (due to a lack of computer access).
Students took between 10 and 20 minutes to complete the survey protocol. At each administration,
two to three researchers were present to ensure consistent survey administration. Researchers read
the directions aloud in English to students. As the students completed the protocols, the researchers
circulated around the classroom, answering the few questions that arose. Students appeared to work
independently and be focused on completing the protocol.




Table 1
Student School Engagement Measure Initial Items, Items Remaining in the Empirical Model, and Their
Theorized and Empirical Factor Placements

Factor Placement

Itemsa Theorized Empirical

1. I would leave school if I would NOT face any consequences.b A


2. It is okay if I do NOT graduate from high school.b A
3. My family knows how I am doing in school. A P
4. I have fun with my friends at school. B
5. I like most of my teachers. B B
6. I volunteer to help at school. B
7. When I am in class, I just pretend I am working.b P
8. If I do not know what something means, I do something to figure it out. P P
9. I study at home. P P
10. I give up when assignments are hard.b P
11. I think school is a waste of time. b A
12. I plan to pursue more education after high school. A A
13. There is someone in my family who helps me when I have trouble completing A P
my homework.
14. Kids at school like me. B
15. I have at least one adult in this school that I can talk to when I have a problem. B
16. Most days, I look forward to going to school. B B
17. I pay attention to my teachers. P P
18. When I am doing school work, I make sure I understand what I am learning. P P
19. I look for more information about things we are learning in school. P P
20. My school work is important. A P
21. Being successful in school will help me in the future. A A
22. There is someone outside of school that I talk to about my future job plans. A
23. I am bothered by bullies at school.b B
24. My teachers are disrespectful to me.b B
25. I am proud to be a student at this school. B B
26. In school, I do just enough to get by.b P
27. When learning new things, I try to connect them to things I already know. P P
28. When I have an assignment due, I keep working until it is finished. P P
29. Getting good grades is important to me. A A
30. It is important to me to be successful in a job. A A
31. I talk to my family about problems I have at school. A P
32. My friends think it is important to do well in school. B
33. There is a lot I can learn from my teachers. B B
34. I feel left out of school activities.b B
35. When I am in class, I often think about other things.b P
36. I catch myself when I am not paying attention in class. P
37. I try my best on school work. P
38. I work hard in school, even when I would rather be doing something else. A
39. I know what I want to do when I am done with high school. A
40. My family would be disappointed if I did NOT graduate from high school. A
41. I am accepted at school. B
42. Teachers help me to be successful at school. B B
43. I attend school events. B




Table 1
Continued

Factor Placement

Itemsa Theorized Empirical

44. I follow class and school rules. P


45. I know how to study for tests. P P
46. My family does not care if I skip school.b A
47. My friends stand up for me. B
48. I feel like a part of my school. B B
49. I respect my teachers. P
50. I wait until the last minute to start assignments.b P

Note. A = Aspirations; B = Belonging; P = Productivity. a Items in bold were retained in the empirical validation. b Reverse
coded.

Student Outcome Measures. Existing district data were utilized to determine the extent to
which the SSEM domains were related to student outcome measures. The first outcome measure
was a categorized risk score, based on the eighth-grade behaviors of poor attendance (80% or lower),
discipline infractions (one suspension or more), failed language arts course, and failed math course.
A score of 0 indicated meeting none of the risk categories, whereas a 4 indicated the presence of
all four risk behaviors. The presence of these behaviors in eighth grade had been shown within the
district to indicate an increased risk of the student not graduating from high school in 4 years. The
second set of dependent variables was eighth-grade achievement on the state standardized academic
assessment of math, science, reading, and writing.
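The district-style composite risk score described above can be sketched as a simple count of the four indicators. This is a hypothetical reconstruction for illustration; the thresholds come from the text, but the function itself is not the district's actual scoring code.

```python
# Hypothetical sketch of the composite risk score (0-4): one point for each
# eighth-grade risk behavior named in the text -- attendance of 80% or lower,
# one or more suspensions, a failed language arts course, a failed math course.
def risk_score(attendance_rate, suspensions, failed_language_arts, failed_math):
    return (int(attendance_rate <= 0.80)
            + int(suspensions >= 1)
            + int(failed_language_arts)
            + int(failed_math))
```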

Data Analysis Procedures


Prior to testing the measurement models, the reliabilities of the items were assessed. Descriptive
statistics of mean and standard deviation were computed for each item. Then, three analyses were
performed to assess the validity of the factor structure of the SSEM: (1) exploratory factor analysis
was conducted to assess the factor structure of an empirically derived model; (2) confirmatory factor
analysis was performed to determine whether a single factor, the theoretical model, or the empirical
model best fit the data; and (3) criterion-related validity was measured using structural equation
modeling. The Statistical Package for the Social Sciences (SPSS) 19.0 and Analysis of Moment
Structures (AMOS) 7 were utilized for these computations.
Exploratory Factor Analysis. Because the underlying theoretical factors of aspirations, be-
longing, and productivity were assumed to be correlated, oblique rotation was used (Tabachnick &
Fidell, 2001). Items with loadings of at least .40 on one factor were extracted and labeled to derive
the empirical model.
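The extraction rule can be sketched as follows; this is an illustrative example of the .40-loading criterion, not the authors' analysis code. Each item is assigned to the factor on which its absolute loading is largest and is retained only if that loading reaches the threshold.

```python
import numpy as np

# Illustrative sketch: retain items whose largest absolute loading on any
# factor is at least .40, assigning each retained item to that factor.
def assign_items(loadings, threshold=0.40):
    """loadings has shape (n_items, n_factors); returns {item: factor index}."""
    assigned = {}
    for item, row in enumerate(loadings):
        factor = int(np.argmax(np.abs(row)))
        if abs(row[factor]) >= threshold:
            assigned[item] = factor
    return assigned
```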
Confirmatory Factor Analysis. Given that multivariate normality is an assumption of all
subsequent analysis, the skew and kurtosis of the items were calculated. The data violated the
assumption of normality, and so, the variables were transformed. The variables with a positive skew
were transformed using a square root transformation, and the variables with a negative skew were
transformed using a power transformation (Kline, 2005). With the exception of four variables, the
skews of the transformed variables were all below the absolute value of 1.0. Thus, these transformed
variables were used in measurement and structural model tests. Confirmatory factor analysis was
conducted on the single-factor model, theoretical model, and empirical model.
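The normalizing step can be sketched as below. The text does not state which power was applied to negatively skewed variables, so squaring is shown here as one common choice (an assumption for illustration, not the authors' procedure).

```python
import numpy as np

def sample_skew(x):
    """Sample skewness: mean cubed deviation over the cubed standard deviation."""
    d = x - x.mean()
    return float((d ** 3).mean() / x.std() ** 3)

def normalize_item(x):
    """Square-root transform positively skewed variables; apply a power
    transformation (here, squaring -- an assumed choice; cf. Kline, 2005)
    to negatively skewed ones."""
    if sample_skew(x) > 0:
        return np.sqrt(x)    # compresses the long right tail
    return np.square(x)      # stretches the compressed right tail
```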




Table 2
Correlations, Descriptive Statistics, and Cronbach’s Alpha for the SSEM Empirical Factors

Factor           1      2      3      Items  Meana  SD    α
1 Productivity   1.00                 12     6.78   1.82  .92
2 Aspirations    0.93b  1.00          4      8.76   1.68  .85
3 Belonging      0.84b  0.84b  1.00   6      6.75   2.04  .83

Note. SSEM = Student School Engagement Measure. a Range of 1 to 10. b Significant at p < .01.

Criterion-Related Validity. Criterion-related validity was tested through structural equation


modeling for the best fitting model of the SSEM with the state and district student outcome data.
Because the district-developed risk scores included failure in language arts and math, it was not
surprising that the risk scores were found to be negatively correlated with the state-level academic
assessment results. Because of these correlations, models that disaggregated the risk scores were
tested as well as a model in which the risk scores were treated as a composite. The model that was
found to be more robust and parsimonious was the one in which the risk scores were utilized as a
composite and is the one presented in the results.

RESULTS
The following section discusses the psychometric properties of the various structural models of
the SSEM and then provides a model of the empirically supported SSEM with the student outcome
data. See Table 1 for a list of all items in the protocol, those that remained in the empirical model,
and the theoretical and empirical domains of the items.

Reliability of Items
Four items (Items 1, 34, 35, and 46) were excluded because, with these items included, the internal
consistency coefficients for their respective sub-domains were below .70. Two items (Items 23 and 26)
did not load significantly to the Student School Engagement construct and were eliminated. This
left 44 reliable items. For models with domains (the first- and second-order models), items that
cross-loaded onto more than one domain were eliminated using modification indices; this led to the
elimination of Items 2, 4, 10, 24, 36, 40, and 50. For the first- and second-order models, 37 reliable
items remained.
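The item-level reliability screening relies on Cronbach's alpha. A minimal sketch of the coefficient (illustrative, not the authors' code):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)   # sample variance per item
    total_variance = scores.sum(axis=1).var(ddof=1)
    return float(k / (k - 1) * (1 - item_variances.sum() / total_variance))
```

Recomputing alpha with a candidate item removed shows whether that item degrades a sub-domain's internal consistency, which is the logic behind the exclusions described above.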

Development of an Empirical Model


Initially, the principal factor analysis extracted four factors using the minimum-eigenvalue
criterion of 1.0. The percentage of variance accounted for by the four factors was 61%. A total of 15 items
were removed after three iterations due to low communalities (Tabachnick & Fidell, 2001), leaving
22 items. After thorough assessment of the pattern matrix, three factors were specified. The corre-
lations between the factors ranged from .84 to .93, and the reliability of each factor ranged from .83
to .92. See Table 2 for the correlations, means, standard deviations, and reliability measures for
the three factors. Visual examination of the scree plot also confirmed the three-factor solution. The
first factor consisted primarily of items that were hypothesized to measure productivity. Items that
loaded to this factor included “I study at home” and “I look for more information about things we
are learning in school.” Therefore, this factor was labeled Productivity. The second factor consisted
of items that were expected to measure student aspirations (such as “It is important for me to be
successful in a job” and “Being successful in school will help me in the future”); thus, this factor
was labeled Aspirations. Finally, the third factor consisted of items believed to measure belonging




Table 3
Factor Loadings of Empirically Validated Items

Item  Productivity  Aspirations  Belonging
3     .52
5                                .52
8     .54
9     .80
12                  .50
13    .50
16                               .47
17    .69
18    .71
19    .80
20    .54
21                  .57
25                               .78
27    .51
28    .63
29                  .55
30                  .76
31    .50
33                               .52
42                               .67
45    .46
48                               .79

Note. Column placement follows the empirical factor assignments in Table 1.

(i.e., “I feel like a part of my school” and “I am proud to be a student at this school”); this factor was
labeled Belonging. See Table 3 for the retained items with their factor loadings.

Comparison of Models
Table 4 lists the fit indices for the single-factor model, the theoretical model, and the empirically
derived model.
Single-Factor Model. First explored was a single-factor model; see Table 4 for the fit indices.
The comparative fit index (CFI) was below the acceptable criterion of .90 or greater, the normed
chi-square was above the acceptable range of 2 to 3, and the root mean square error of approximation
(RMSEA) was on the cusp of the acceptable range (below .08; Joreskog & Sorbom, 1984; Kline,
2005). Although all path coefficients were statistically significant and in the predicted direction, in
total, the single-factor solution indicated poorer fit than did the models next explored.
First-Order Theoretical Model. The next model tested was the first-order theoretical model,
illustrated in Figure 1. As shown in Table 4, the normed chi-square was within the acceptable
range, the CFI was below the acceptable criterion, and the RMSEA was within the acceptable
range. However, the absolute value of the highest standardized residual was 5.47, which was above
the acceptable limit and higher than for the second-order empirical model. Overall, the fit indices
suggested that the model did not fit the data as well as the empirical model (described next; Joreskog
& Sorbom, 1984).




Table 4
Fit Indices for the Second-Order Empirical, First-Order Theoretical, and Single-Factor Models of the SSEM

                                      Second-Order      First-Order        Single-Factor
Index                                 Empirical Model   Theoretical Model  Model
Chi-Square                            537.18            1748.78            2066.77
Degrees of Freedom                    206               626                629
Significance                          0.00              0.00               0.00
Chi-Square/df                         2.60              2.79               3.29
Comparative Fit Index                 0.92              0.85               0.80
Goodness-of-Fit Index                 0.88              0.77               0.73
Adjusted Goodness-of-Fit Index        0.86              0.75               0.70
Akaike Information Criterion          631.17            1902.48            2214.77
Root Mean Square Error of Approx.     0.07              0.07               0.08
  Lower 90%                           0.06              0.06               0.08
  Upper 90%                           0.07              0.07               0.09
Standardized Root Mean Residual       0.05              0.06               0.06
Highest Standardized Residual Value   2.45              5.47               –2.46

Note. SSEM = Student School Engagement Measure.
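The normed chi-square values in Table 4 follow directly from the reported chi-square and degrees of freedom; a quick consistency check (the table's values are rounded or truncated):

```python
# Normed chi-square = chi-square / degrees of freedom; values of roughly
# 2 to 3 are treated as acceptable in the text. Figures are from Table 4.
models = {
    "second_order_empirical": (537.18, 206),
    "first_order_theoretical": (1748.78, 626),
    "single_factor": (2066.77, 629),
}
normed = {name: chi2 / df for name, (chi2, df) in models.items()}
# 537.18 / 206 is about 2.61 (Table 4 reports 2.60); 1748.78 / 626 is about
# 2.79; 2066.77 / 629 is about 3.29, above the acceptable range.
```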

Second-Order Empirical Model. The second-order empirical model with standardized coeffi-
cients is illustrated in Figure 2. As seen in Table 4, the model fit the data well. The normed chi-square
was within the acceptable range, the CFI was above the acceptable criterion, and the RMSEA was
within the acceptable range (Joreskog & Sorbom, 1984). All indicator variables loaded highly and
significantly onto their respective constructs. Further, all paths from the second-order construct to
the first-order constructs were statistically significant. Four items that had been theorized to belong
in the Aspirations domain were found empirically to belong in the Productivity domain (listed in
Table 1).
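The RMSEA reported for this model can be approximately recovered from the chi-square, degrees of freedom, and sample size (N = 396) using a common point-estimate formula. This is a sketch, not the authors' computation; the published value of .07 may differ slightly because of rounding and the particular formula implemented by the software.

```python
import math

def rmsea(chi2, df, n):
    """Point estimate of the root mean square error of approximation.

    Uses the common formula sqrt(max(0, chi2 - df) / (df * (n - 1))).
    """
    return math.sqrt(max(0.0, chi2 - df) / (df * (n - 1)))

# Second-order empirical model: chi-square of 537.18 on 206 df, N = 396.
# Yields roughly .064, inside the reported 90% interval of .06 to .07.
value = rmsea(537.18, 206, 396)
```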

Criterion-Related Validity
The purpose of the criterion-related analysis was to validate the second-order empirical model
of the SSEM (the best-fitting model) with the criterion of student outcome measures (the state-
measured academic achievement scores and a district-developed composite risk score). See Figure 3
for a diagram of the model and the strength and direction of relationships. The model fit the
data well. The CFI was above the acceptable criterion of .90. The RMSEA was acceptable at .06
(Joreskog & Sorbom, 1984). The normed chi-square was 2.44, within the acceptable range. All
indicator variables loaded highly and significantly onto their respective constructs. Furthermore, the
path coefficients between the SSEM and the student outcomes were statistically significant and in
the predicted directions.

DISCUSSION
Engagement is a malleable construct that may affect students’ school achievement. However,
research regarding what constitutes students’ engagement with school, the factors that contribute to
engagement, and how to measure engagement is still emerging. This study provided an alternative
conceptualization of engagement, the Student School Engagement Model, and a means to screen
students’ school engagement.
Once reliability of the SSEM items was established, three statistical models were compared.
The single-factor model did not fit the data well, suggesting that Student School Engagement is better
defined with factors in addition to the global construct. This finding was in keeping with predominant
conceptualizations of engagement (Appleton et al., 2008), as well as many other instruments that
have been developed to measure engagement (Fredricks et al., 2011).

FIGURE 1. Confirmatory factor analysis results for the first-order theoretical model. (Path diagram not reproduced here.)

FIGURE 2. Confirmatory factor analysis results for the second-order empirical model. (Path diagram not reproduced here.)
FIGURE 3. Structural model of the Student School Engagement Measure, district risk scores, and state standardized academic achievement assessment results. (Path diagram not reproduced here.)

The second-order empirical model (with Student School Engagement as the second-order factor and
the factors of Aspirations, Belonging, and Productivity as first-order factors) best fit the data.
This preliminary validation of the SSEM suggests that Student School Engagement is a multifaceted
construct measured through Aspirations, Belonging, and Productivity. This is a different model of
engagement than others, which have primarily divided engagement into some or all of the fol-
lowing dimensions: affective (or psychological), behavioral, cognitive, and academic (Christenson,
Reschly, & Wylie, 2012; Jimerson et al., 2003; O’Farrell, Morrison, & Furlong, 2006). The reliabil-
ity of the SSEM factors ranged from .83 to .92. This is at the upper range of what has been reported
for the SEM and SEI (range of .77 to .92).
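The factor reliabilities discussed here (.83 to .92) are presumably internal-consistency estimates; the article does not name the coefficient, but Cronbach's alpha is the conventional choice, and it can be computed from raw item scores as follows. The data in this example are invented for illustration.

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of respondents' item scores.

    item_scores: one row per respondent, each row a list of item scores.
    """
    n_items = len(item_scores[0])

    def variance(xs):  # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Variance of each item across respondents
    item_vars = [variance([row[i] for row in item_scores])
                 for i in range(n_items)]
    # Variance of each respondent's total score
    total_var = variance([sum(row) for row in item_scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

# Invented data: four respondents answering three Likert-type items
toy = [[4, 5, 4], [2, 2, 3], [5, 4, 5], [1, 2, 1]]
alpha = cronbach_alpha(toy)  # high, because the toy items covary strongly
```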
When the second-order empirical model of Student School Engagement was analyzed for its
relationship to state-level academic achievement data and district-level risk data, SSEM scores were
shown to positively contribute to achievement on the state assessment of academic achievement
(.18) and protect against district-assessed risks (poor attendance, suspensions, and failure in math
or language arts; −.23). These results are in keeping with what has been found with other engage-
ment instruments. For example, correlations between factors on the SEI and middle school student
attendance and GPA ranged from not significant to .32 (Christenson, Reschly, & Wylie, 2012).
Correlations between the SEM and attendance and GPA ranged from not significant to .37 (Finlay,
2006).
The findings from this study suggest that the SSEM could be a useful predictive measure of
student risk behaviors and academic success. Because the SSEM comprises only 22 items, it would be feasible
for practitioners to utilize the SSEM as a screener for students who might be at increased risk of
school disengagement.
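As a sketch of how the SSEM might function as a screener, the snippet below flags the lowest-scoring fraction of students for follow-up. The 20% fraction, the student identifiers, and the scores are entirely hypothetical; any operational cutoff would need local validation against outcome data.

```python
def flag_low_engagement(scores, fraction=0.2):
    """Flag the lowest-scoring fraction of students for follow-up.

    scores: dict mapping student id to total screener score.
    fraction: proportion of students to flag (illustrative default).
    """
    ranked = sorted(scores, key=scores.get)  # lowest total scores first
    n_flag = max(1, round(len(ranked) * fraction))
    return set(ranked[:n_flag])

# Hypothetical totals on a 22-item screener (ids and scores invented)
demo = {"s01": 88, "s02": 54, "s03": 97, "s04": 61, "s05": 79}
flagged = flag_low_engagement(demo)  # the lowest-scoring student, "s02"
```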

Limitations
Although the SSEM showed promising preliminary results, there are limitations that must be
considered. First, this sample comprised solely eighth-grade students. The transition from middle to
high school has been shown to be a critical juncture in supporting students so that they may remain
on track for successful high school completion (Bruce et al., 2011), so understanding engagement
of eighth graders is extremely useful. However, it will be important to use the SSEM with students
from other secondary grades and see whether the measure proves as valid and reliable for them.
Further, the student participants were drawn from three schools in the same urban district.
How students in this district might differ in their engagement from students in other districts is
unknown. Because 40% of the families in the studied district report that they speak Spanish in the
home, the protocol was designed so that each item was listed simultaneously in English and Spanish.
This allowed students to use either or both languages to understand the items. However, due to
this design, there was no way to assess which language was used by participants and, therefore, no
way to know whether the cultural issues associated with linguistic diversity had an impact on the
reliability or validity of the instrument. The students of this sample were predominantly identified
as Hispanic (80%) and poor (69% qualified for free or reduced-price lunches). These students were
representative of the demographics of this district and similar to student populations in other urban
districts, but not representative of the national student population. Poor and minority students in
urban districts are at increased risk for not completing high school (Balfanz et al., 2010; Chapman
et al., 2010), so understanding engagement in this population is critical. However, it will be important
to collect data from students in other districts to understand whether the SSEM is of similar utility
with non-Hispanic students, middle class students, and students in rural and suburban schools.
Another limitation was sample size. According to Costello and Osborne (2005), the sample
size should be large enough to have a subject-to-item ratio of 10 to 1 for exploratory analysis. This
study’s ratio was 8 to 1. Ideally, data are split in half for exploratory and confirmatory analysis. Due
to a relatively small sample size, the sample was not split into two groups for analysis. Future studies
with independent data sets are needed to further validate the model and instrument.


The SSEM was shown to be predictive of achievement on the state standardized assessment
and district-measured academic risk factors. This suggests that the SSEM is measuring a construct
of importance to students’ academic performance. However, many factors outside the construct
of engagement contribute to academic accomplishments. The SSEM is similar enough to other
measures of students’ engagement with school to suggest that it is measuring engagement, but the
convergent and discriminant validity of the SSEM needs to be empirically established.

Future Directions for Research
The findings presented here suggest that the Student School Engagement Model comprising the
factors of Aspirations, Belonging, and Productivity shows promise and should be investigated further.
Studying results from the SSEM with students of various secondary levels, in multiple districts, and
with students with different demographic profiles would give further information of the validity of
the SSEM. Also, the SSEM should be compared with other measures of engagement and similar
constructs to establish that the SSEM is indeed a measure of engagement. It would be useful to
test the relationship of factors within the SSEM and their predictive validity for student outcomes,
such as course grades and attendance. Of greatest utility will be developing models that predict
student outcomes based on student school engagement profiles and then validating interventions that
interrupt trajectories of school failure.

Implications for School Psychologists
This validation study suggests that the SSEM shows promise as a screener for understanding
students’ engagement with school. It also suggests that conceptualizing secondary students’ engage-
ment with school as comprising their aspirations, sense of belonging, and understanding of how to
be productive in school environments is a viable model for understanding students’ various school
engagement levels. Assessing students’ aspirations, belonging, and productivity may be useful in
deciding how to intervene with students who are at risk for exiting school prematurely.

REFERENCES
Allensworth, E. M., & Easton, J. Q. (2007). What matters for staying on-track and graduating in Chicago Public High Schools:
A close look at course grades, failures, and attendance in the freshman year. Chicago, IL: Consortium on Chicago School
Research at the University of Chicago.
Analysis of Moment Structures (Version 7) [Computer software]. Armonk, NY.
Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school: Critical conceptual and
methodological issues of the construct. Psychology in the Schools, 45, 369 – 386. doi: 10.1002/pits.20303
Appleton, J. J., Christenson, S. L., Kim, D., & Reschly, A. L. (2006). Measuring cognitive and psychological
engagement: Validation of the Student Engagement Instrument. Journal of School Psychology, 44, 427 – 445.
doi: 10.1016/j.jsp.2006.04.002
Archambault, I., Janosz, M., Fallu, J., & Pagani, L. S. (2009). Student engagement and its relationship with early high school
dropout. Journal of Adolescence, 32, 651 – 670. doi: 10.1016/j.adolescence.2008.06.007
Aud, S., Hussar, W., Kena, G., Bianco, K., Frohlich, L., Kemp, J., & Tahan, K. (2011). The condition of education 2011
(NCES 2011–033). Washington, DC: National Center for Education Statistics.
Balfanz, R., Bridgeland, J. M., Moore, L. A., & Fox, J. H. (2010). Building a grad nation: Progress and challenge in ending
the high school dropout epidemic. Washington, DC: Civic Enterprises, Everyone Graduates Center at Johns Hopkins
University, and America’s Promise Alliance.
Betts, J. E., Appleton, J. J., Reschly, A. L., Christenson, S. L., & Huebner, E. S. (2010). A study of the factorial invariance of
the Student Engagement Instrument (SEI): Results from middle and high school students. School Psychology Quarterly,
25, 84 – 93. doi: 10.1037/a0020259
Bruce, M., Bridgeland, J. M., Fox, J. H., & Balfanz, R. (2011). On track for success: The use of early warning indicator and
intervention systems to build a grad nation. Washington, DC: Civic Enterprises and Everyone Graduates Center at Johns
Hopkins University.


Buhs, E. S., Ladd, G. W., & Herald, S. L. (2006). Peer exclusion and victimization: Processes that mediate the relation between
peer group rejection and children’s classroom engagement and achievement? Journal of Educational Psychology, 98,
1 – 13. doi: 10.1037/0022-0663.98.1.1
Chapman, C., Laird, J., & KewalRamani, A. (2010). Trends in high school dropout and completion rates in the United States:
1972–2008. Washington, DC: National Center for Educational Statistics.
Christenson, S. L. (n.d.). Department of Educational Psychology: Sandra L. Christenson. Retrieved from http://www.
cehd.umn.edu/edpsych/people/Faculty/Christenson.html
Christenson, S. L., Reschly, A. L., & Wylie, C. (2012). Handbook of research on student engagement. New York, NY:
Springer.
Christenson, S. L., Reschly, A. L., Appleton, J. J., Berman-Young, S., Spanjers, D. M., & Varro, P. (2008). Best practices
in fostering student engagement. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology (Vol. 6, 5th ed.,
pp. 1099 – 1120). Bethesda, MD: National Association of School Psychologists.
Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: Four recommendations
for getting the most from your analysis. Practical Assessment, Research, and Evaluation, 10. Retrieved from
http://pareonline.net/getvn.asp?v=10&n=7
Dynarski, M., Clarke, L., Cobb, B., Finn, J., Rumberger, R., & Smink, J. (2008). Dropout prevention: A practice guide
(NCEE 2008–4025). Washington, DC: National Center for Education Evaluation and Regional Assistance.
Finlay, K. (2006). Quantifying school engagement: Research report. Denver, CO: National Center for School Engagement.
Finn, J. D. (1989). Withdrawing from school. Review of Educational Research, 59, 117 – 142. doi: 10.3102/
00346543059002117
Fredricks, J. A., Blumenfeld, P., Friedel, J., & Paris, A. (2003). School engagement. In K. A. Moore & L. H. Lippman (Eds.),
What do children need to flourish? Conceptualizing and measuring indicators of positive development (pp. 305 – 321).
New York, NY: Springer.
Fredricks, J. A., Blumenfeld, P., & Paris, A. (2004). School engagement: Potential of the concept, state of the evidence.
Review of Educational Research, 74, 59 – 109. doi: 10.3102/00346543074001059
Fredricks, J., McColskey, W., Meli, J., Montrosse, B., Mordica, J., & Mooney, K. (2011). Measuring student engagement
in upper elementary through high school: A description of 21 instruments. Greensboro, NC: Regional Educational
Laboratory Southeast.
Furlong, M. J., & Christenson, S. L. (2008). Engaging students at school and with learning: A relevant construct for all
students. Psychology in the Schools, 45, 365 – 368. doi: 10.1002/pits.20302
Furlong, M. J., Whipple, A. D., St. Jean, G., Simental, J., Soliz, A., & Punthuna, S. (2003). Multiple contexts of school
engagement: Moving toward a unifying framework for educational research and practice. California School Psychologist,
8, 99 – 113.
Gleason, P., & Dynarski, M. (2002). Do we know whom to serve? Issues in using risk factors to identify dropouts. Journal of
Education for Students Placed at Risk, 7, 25 – 41. doi: 10.1207/s15327671
Hazel, C. E., Wonner, R., & Jack, C. (2008, October). School engagement: What is it and how to get more of it. Paper
presented at the 33rd annual convention of the Colorado Society of School Psychologists, Avon, CO.
Jimerson, S., Campos, E., & Greif, J. (2003). Towards an understanding of definitions and measures of school engagement
and related terms. California School Psychologist, 8, 7 – 28.
Joreskog, K. G., & Sorbom, D. (1984). LISREL 6 user’s guide. Mooresville, IN: Scientific Software International.
Kline, R. B. (2005). Principles and practice of structural equation modeling (2nd ed.). New York, NY: Guilford.
Kortering, L., & Braziel, P. (2008). Engaging youth in school and learning: The emerging key to school success and completion.
Psychology in the Schools, 45, 461 – 465. doi: 10.1002/pits.20309
Lewin, K. (1943/1997). Behavior and development as a function of the total situation. In D. Cartwright (Ed.), Field theory in
social science: Selected theoretical papers. Washington, DC: American Psychological Association.
National Research Council, Committee on Increasing High School Students’ Engagement and Motivation to Learn. (2004).
Engaging schools: Fostering high school students’ motivation to learn. Washington, DC: National Academies Press.
O’Farrell, S. L., Morrison, G. M., & Furlong, M. J. (2006). School engagement. In G. G. Bear & K. M. Minke (Eds.),
Children’s needs III: Development, prevention, and intervention (pp. 45 – 58). Bethesda, MD: National Association of
School Psychologists.
Statistical Package for Social Sciences (Version 19.0) [Computer software]. Armonk, NY.
Tabachnick, B. G., & Fidell, L. S. (2001). Using multivariate statistics (4th ed.). Boston, MA: Allyn & Bacon.
Watson, D. (1992). Correcting for acquiescent response bias in the absence of a balanced scale: An application to class
consciousness. Sociological Methods and Research, 23, 52 – 88. doi: 10.1177/0049124192021001003
Willis, G. (2005). Cognitive interviewing: A tool for improving questionnaire design. Thousand Oaks, CA: Sage.
