Updated 11.2.15

Including Social Emotional Skills and Culture-Climate Surveys in the School Quality Improvement Index

Recommendations for minimum participation rates, how to translate responses into metric results, and how to include them in the Index


Purpose of the SEL & CC Field Test
• Provide actionable information to participating districts and schools
• Test drive the administration process
• Inform decisions about the final instrument for SY 2015-16
• Inform decisions about the method of measurement within the Index & expectations with respect to participation rates
• Engage in analytics to inform the CORE Districts and the field
• Utilize data to inform School Quality Improvement System Program Evaluation efforts (PACE)
• Build a stronger professional network amongst CORE District staff to engage in this work


Purpose of the SEL & CC Field Test
• Provide actionable information to participating districts and schools
• Test drive the administration process
• Inform decisions about the final instrument for SY 2015-16 (TODAY)
• Inform decisions about the method of measurement within the Index & expectations with respect to participation rates (TODAY)
• Engage in analytics to inform the CORE Districts and the field (TODAY)
• Utilize data to inform School Quality Improvement System Program Evaluation efforts (PACE)
• Build a stronger professional network amongst CORE District staff to engage in this work

Context: CORE, the School Quality Improvement Index, and our Measures of Social-Emotional Skills and Culture-Climate Perceptions

CORE is a collaboration among 10 California school districts. We're working together to significantly improve student outcomes. Over 1.1 million students in CORE.

Participating CORE districts: Clovis Unified, Fresno Unified, Garden Grove Unified, Long Beach Unified, Los Angeles Unified, Oakland Unified, Sacramento City Unified, San Francisco Unified, Sanger Unified, and Santa Ana Unified. [Map also shades other districts in CA.]

Our School Quality Improvement Index: College & Career Ready Graduates

Academic Domain
• Achievement and Growth*
• Graduation Rate*
• High School Readiness Rate (Gr. 8)*

Social-Emotional & Culture-Climate Domain
• Chronic Absenteeism
• Suspension/Expulsion Rate*
• ELL Re-Designation Rate*
• Student/Staff/Parent Culture-Climate Surveys
• Social Emotional Skills
• Special Education Disproportionality

Elimination of Disparity and Disproportionality
• All Students Group & Subgroups

Making all students visible: N size of 20, resulting in over 150,000 additional students counted!

Guiding principles:
• Information as "flashlight" (and not a "hammer")
• From a narrow focus to a holistic approach
• Making all students visible
• From just achievement to achievement and growth

Developed through collaboration and partnership:
• Led by the CORE Superintendents
• Guided by the experts in our districts
• With input from hundreds of educators across the CORE districts
• With support from our key partners (e.g., Stanford University, Harvard University)
• With guidance from our Oversight Panel (e.g., PACE, CSBA, ACSA, Ed Trust West, PTA)

Each indicator has been carefully developed, refined, and analyzed before inclusion in the Index.

Meaningful
• Clearly connected (e.g., through research) to college and career readiness, and to the elimination of disparity and disproportionality (e.g., based upon the current presence of substantive gaps in performance).

Measurable
• Evidence of validity, reliability and stability through the examination of baseline and/or field test data.

Actionable
• Evidence from research that schools can influence and impact the outcome in question.
• Evidence from baseline data that schools serving similar youth demonstrate notably different outcomes (such that there is evidence that schools play a substantive role in the outcome).

In 2015, CORE Districts will release the 1st version of the School Quality Improvement Index. Reports support continual improvement for school leaders and teachers.

Results will include performance by the "all students" group and subgroups. Social Emotional Skills and Climate Surveys will become part of the Index in Fall 2016.

CORE is part of the national dialogue on including Social Emotional Skills in Multiple Measure approaches to school quality.

[Images of news headlines about teaching social and emotional skills.]

"We think school quality is not only about academic success but also about developing the whole child. We're putting a flashlight on the social and emotional skills and the role they play …"

With almost half a million students participating, our Spring 2015 Field Test of measures of social-emotional skills lets us explore how to measure these essential skills at scale.

CORE Field Test of Measures of Social Emotional Learning and School Culture-Climate

More than 450,000 students participated in the Spring 2015 field test of SEL measures: 454,298 in total across Fresno, Long Beach, Los Angeles, Oakland, San Francisco, and Santa Ana, with roughly 308,000 of these students in Los Angeles, 45,000 in Long Beach, 34,000 in Fresno, and about 66,000 combined in Oakland, San Francisco, and Santa Ana.

Two districts collected teacher reports on students' SE competencies from more than 2,700 teachers, covering approximately 71,000 students.

District Name    Total Number of Teachers    Number of Students Covered by Teacher Reports
Fresno           2,436                       63,767
Santa Ana        301                         7,293
Total            2,737                       71,060

Social Emotional Skills Cover Four Topics – Including Inter-Personal and Intra-Personal Skills (self-management, growth mindset, self-efficacy, and social awareness)

Sample SEL Items

To assess social-emotional skills, we ask students about their beliefs and behaviors. Below, for instance, are some self-management items.

Please answer how often you did the following during the past 30 days. (Almost Never, Once in a While, Sometimes, Often, Almost All the Time)

During the past 30 days…
• I came to class prepared.
• I remembered and followed directions.
• I got my work done right away instead of waiting until the last minute.
• I paid attention, even when there were distractions.
• I worked independently with focus.
• I stayed calm even when others bothered or criticized me.
• I allowed others to speak without interruption.
• I was polite to adults and peers.
• I kept my temper in check.

Student, Staff and Family Culture-Climate Surveys cover four topics as well.

CLIMATE OF SUPPORT FOR ACADEMIC LEARNING
Students and teachers feel that there is a climate conducive to learning and that teachers use supportive practices, such as encouragement and constructive feedback, varied opportunities to demonstrate knowledge and skills, support for risk-taking and independent thinking, an atmosphere conducive to dialog and questioning, academic challenge, and individual attention to support differentiated learning.

SENSE OF BELONGING (SCHOOL CONNECTEDNESS)
A positive sense of being accepted, valued, and included by others (teacher and peers) in all school settings. Students and parents report feeling welcome at the school.

KNOWLEDGE AND FAIRNESS OF DISCIPLINE, RULES AND NORMS
Clearly communicated rules and expectations about student and adult behavior, especially regarding physical violence, verbal abuse or harassment, and teasing; clear and consistent enforcement and norms for adult intervention.

SAFETY
Students and adults report feeling safe at school and around school, including feeling safe from verbal abuse, teasing, or exclusion by others in the school.

Sample Culture-Climate Items

To assess culture-climate, we ask students, staff and families about their experiences with and perceptions of their school. Below are sample items from the student culture-climate survey.

How strongly do you agree or disagree with the following statements? (Strongly Disagree, Disagree, Neither Disagree Nor Agree, Agree, Strongly Agree)
• Adults at this school encourage me to work hard so I can be successful in college or at the job I choose.
• Teachers go out of their way to help students.
• This school promotes academic success for all students.
• This school is a supportive and inviting place for students to learn.
• Teachers give students a chance to take part in classroom discussions or activities.
• My teachers work hard to help me with my schoolwork when I need it.
• I feel close to people at this school.

SEL & Culture-Climate: A school's culture-climate is related to its social emotional skills reports, and we see a substantive range in school performance despite comparable levels of youth in poverty. The correlation between overall SEL and overall culture-climate is .47.

[Scatter plot of schools: the larger the dot, the higher the percentage of youth in poverty. The two highlighted schools both have close to 90% of youth in poverty.]

Math & SEL: A school's SEL results are also related to performance on other indicators, such as math. In this graph we see that schools with strong SEL generally performed better on SBAC math.

[Scatter plot of schools: the larger the dot, the higher the percentage of youth in poverty.]

Pause & Reflect: What do you find noteworthy or interesting about what you have heard and seen so far? What are you looking forward to learning more about? What opportunities do you foresee in the use of this information?

Recommendations and Rationale: CORE, the School Quality Improvement Index, and our Measures of Social-Emotional Skills and Culture-Climate Perceptions

Minimum Participation Rate Thresholds

Denominator (already agreed upon by the CORE Board):
• Students: student enrollment count in the fall census
• Staff: number of staff members as of the administration date (inclusive of all certificated and classified staff invited to respond to the survey)
• Family: student enrollment count in the fall census

Minimum participation thresholds:
• Elementary School: Students 75%, Staff 75%, Family 40%
• Middle School: Students 75%, Staff 75%, Family 30%
• High School: Students 75%, Staff 75%, Family 25%

Rationale for Staff and Student Participation Thresholds
• Georgia utilizes 75% participation as their threshold for accountability reporting of culture-climate staff and student surveys. (Georgia reports any case with fifteen or more responses.)
• Our research partners have advised us to find a threshold that will maximize the likelihood of a representative sample, and have also suggested that there is not a well-established guideline for survey participation rates in this context.
• 93% of schools in the field study achieved this level of participation on the student surveys. Median participation was 87%.*
• 85% of schools in the field study achieved this level of participation on the staff surveys. Median participation was 71%.*

Rationale for Family Participation Thresholds
• We are unaware of a state benchmark for family survey participation rate expectations.
• Our research partners have advised us to find a threshold that will maximize the likelihood of a representative sample, and have also suggested that there is not a well-established guideline for survey participation rates in this context.
• Median participation rates amongst designation-eligible schools were 39% at elementary, 28% at middle and 22% at high (75th percentile participation was 54% at elementary, 41% at middle and 32% at high). We selected the thresholds as being both reasonable to accomplish and fairly representative.**

*Based upon participation data from LAUSD, LBUSD, OUSD and SFUSD.
**Based upon participation data from all waiver districts.
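A minimal sketch of how these thresholds could be encoded and checked, using hypothetical names (MIN_PARTICIPATION, meets_threshold) that are not part of the CORE documentation:

```python
# Hypothetical encoding of the proposed minimum participation rates.
MIN_PARTICIPATION = {
    # (respondent_group, school_level): minimum participation rate
    ("students", "elementary"): 0.75, ("students", "middle"): 0.75, ("students", "high"): 0.75,
    ("staff", "elementary"): 0.75, ("staff", "middle"): 0.75, ("staff", "high"): 0.75,
    ("family", "elementary"): 0.40, ("family", "middle"): 0.30, ("family", "high"): 0.25,
}


def meets_threshold(respondent_group: str, school_level: str,
                    respondents: int, denominator: int) -> bool:
    """True if the group's participation rate meets the proposed minimum.

    The denominator is the fall-census enrollment count (students, family) or
    the count of certificated/classified staff invited to respond (staff).
    """
    return respondents / denominator >= MIN_PARTICIPATION[(respondent_group, school_level)]


# Example: a middle school with 620 family responses out of 2,300 enrolled students.
print(meets_threshold("family", "middle", 620, 2300))  # False: 27% is below the 30% threshold
```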

Who is eligible to participate in the staff survey?

Recommendation: A staff member is defined as certificated or classified personnel, full-time or part-time, who works at least 50 percent of the day in the school. Itinerant staff may take the survey for more than one school. Districts will provide a school-by-school count of the number of staff members at each school meeting these criteria during the survey window.

Rationale
• This is comparable to the approach in Georgia.
• This approach includes the voice of staff members who spend a substantive amount of time at the school.
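A small illustrative sketch of this eligibility rule; the StaffRecord data model and its field names are assumptions, not CORE's actual data:

```python
from dataclasses import dataclass


@dataclass
class StaffRecord:
    role: str                          # "certificated" or "classified"
    fraction_of_day_at_school: float   # share of the work day spent at this school


def is_survey_eligible(staff: StaffRecord) -> bool:
    """Certificated or classified personnel, full- or part-time, who work at
    least 50 percent of the day in the school."""
    return (staff.role in {"certificated", "classified"}
            and staff.fraction_of_day_at_school >= 0.5)


print(is_survey_eligible(StaffRecord("classified", 0.6)))    # True
print(is_survey_eligible(StaffRecord("certificated", 0.2)))  # False: under 50% of the day here
```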

Participation rates below the minimum threshold

Recommendation: If participation is below the prescribed percentage for the "all students" group or for any particular subgroup, then respondents will be added to the denominator to get to the participation threshold. Any added participants will be treated as a respondent with the lowest possible scores for each item on the survey for the purposes of scoring.

Rationale
• This is comparable to how participation below the expected threshold is treated on academic assessments.
• This approach incentivizes schools to meet the participation threshold.
• This allows us to include results even when the participation rate minimum has not been met.
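A minimal sketch of this padding rule, assuming hypothetical names and, for illustration, a 1-to-5 item scale where 1 is the lowest possible score:

```python
def pad_to_threshold(item_scores: list[list[float]],
                     required_respondents: int,
                     num_items: int,
                     lowest_score: float = 1.0) -> list[list[float]]:
    """If fewer respondents than required answered, append placeholder respondents
    who are assigned the lowest possible score on every item before scoring."""
    padded = list(item_scores)
    shortfall = required_respondents - len(padded)
    for _ in range(max(0, shortfall)):
        padded.append([lowest_score] * num_items)
    return padded


# Example: a subgroup needed 30 respondents (75% of 40 students) but only 24 responded.
responses = [[4, 5, 3]] * 24                      # 24 real respondents, 3 items each
scored_pool = pad_to_threshold(responses, required_respondents=30, num_items=3)
print(len(scored_pool))                           # 30: six lowest-score placeholders added
```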

Subgroup categories for SEL and Culture-Climate

Subgroup                                     Students (SEL & Culture-Climate)   Staff   Family
All Respondents                              Yes                                Yes     Yes
Lowest Performing Racial/Ethnic Subgroup     Yes                                No      Yes
Students with Disabilities                   Yes                                No      Yes
English Learners                             Yes                                No      Yes
Socio-Economically Disadvantaged Students    Yes                                No      Yes

Rationale
• Only in the case of staff are subgroups not applicable, given that we are not able to easily pinpoint the subgroups of students that staff work with.
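One way this table could be captured as configuration (a sketch with hypothetical keys, not CORE's implementation):

```python
SUBGROUPS_REPORTED = {
    "students": ["all_respondents", "lowest_performing_racial_ethnic",
                 "students_with_disabilities", "english_learners",
                 "socioeconomically_disadvantaged"],
    # Staff results are reported for all respondents only, since staff cannot be
    # reliably linked to the student subgroups they work with.
    "staff": ["all_respondents"],
    "family": ["all_respondents", "lowest_performing_racial_ethnic",
               "students_with_disabilities", "english_learners",
               "socioeconomically_disadvantaged"],
}
```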

Weights for Social Emotional Skills and Culture-Climate Surveys

In the approved CORE Waiver, the weight of Social Emotional Skills is 8%, and the weight of Culture-Climate Surveys is 8%. We do not recommend a change to these weights.

If we decide to include each SEL skill as a separate metric on the Index (e.g., separate ratings for self-management, self-efficacy, growth mindset and social awareness), we recommend even weighting of these items (e.g., 2% for each skill).

For Culture-Climate Surveys, we recommend the following breakdown, keeping to the even weighting concept that has been used throughout the Index:
- Students: 2.67%
- Staff: 2.67%
- Family: 2.67%

Rationale
• We do not see a reason to move away from the "even weighting" scheme utilized for other metrics in the Index.
• While one might argue that student and staff participation rates lead to more representative data, this is the one place in the Index that we take family input into account, so we recommend incentivizing participation in the family survey by keeping their weight even with students and staff.
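A small sketch of how these weights could be applied, assuming hypothetical metric names and that each metric has already been converted to Index points on a common scale:

```python
SEL_WEIGHTS = {                       # 8% total, 2% per skill
    "self_management": 0.02,
    "self_efficacy": 0.02,
    "growth_mindset": 0.02,
    "social_awareness": 0.02,
}

CC_WEIGHTS = {                        # 8% total, split evenly across the three surveys
    "student_culture_climate": 0.08 / 3,
    "staff_culture_climate": 0.08 / 3,
    "family_culture_climate": 0.08 / 3,
}


def sel_cc_contribution(index_points: dict[str, float]) -> float:
    """Weighted contribution of the SEL and Culture-Climate metrics to the Index."""
    weights = {**SEL_WEIGHTS, **CC_WEIGHTS}
    return sum(weights[metric] * index_points[metric] for metric in weights)


points = {metric: 7.0 for metric in {**SEL_WEIGHTS, **CC_WEIGHTS}}  # every metric at level 7
print(round(sel_cc_contribution(points), 2))                        # 1.12 = 7 x 16%
```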

Scoring of the Items to Create a Metric Result – Culture-Climate (1 of 2)

We recommend including one composite Culture-Climate result on the Index for students, staff and family, separately and respectively (and not having a separate result for each topic). We do recommend reporting the results both as a composite and as separate topics in a supplementary report.

Rationale
• Three results for culture-climate surveys (one for student results, one for staff results, and one for family results) is more manageable on the main reports in the Index. From there, we intend to support users in drilling into the data to see results by topic.
• Including all four topics separately for each of the three survey groups (students, staff and parents) would create twelve indicators on the Index, which we feel would be too many.
• Culture-climate results by topic are generally strongly related to the composite rating. For instance, composite ratings of student culture-climate are generally correlated at .8 or above with topic-specific ratings. The exception is safety, which is correlated at the .54 level with the composite rating, which is still reasonably strong.
• There is a large enough range in composite culture-climate results to generate ten levels of performance for students, family and staff. The difference between the 10th and 90th percentile results ranges between 12 and 33 percentage points, and is typically at least 18 percentage points, such that performance thresholds would generally be at least 2 percentage points wide.

Scoring of the Items to Create a Metric Result – Culture-Climate (2 of 2)

We recommend the following method for translating survey responses into a metric result: the results will be calculated in terms of percentage of favorable responses. By item type, favorable responses will be defined as follows (generally referring to the top two response choices):
- Items on the agreement scale: Agree / Strongly Agree
- Items on the "how significant of a problem" scale: Insignificant Problem / Mild Problem
- Items on the "how many adults" scale: Nearly All Adults / Most Adults
- Items on the elementary culture-climate scale: Yes, Most of the Time / Yes, All of the Time
- Items on the "how many times" scale (for bullying items): 0 Times

There will be a two-step process for calculating the percentage of favorable responses to ensure even weighting of the topics in the composite:
- Step 1: Calculate the percentage of favorable responses separately by topic.
- Step 2: Average together those topic-specific percentages of favorable responses.

Metric results will be displayed as whole percentages without decimals (e.g., 75% favorable and not 75.3% favorable).

Rationale
• Percent favorable is a standard method of communicating culture-climate results in the field.
• The two-step process for computing composite results creates even weighting amongst the topics.
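A minimal sketch of this two-step percent-favorable calculation, assuming hypothetical data structures (the FAVORABLE map mirrors the item-type definitions above; the function names are not from the CORE documentation):

```python
FAVORABLE = {
    "agreement": {"Agree", "Strongly Agree"},
    "problem": {"Insignificant Problem", "Mild Problem"},
    "how_many_adults": {"Nearly All Adults", "Most Adults"},
    "elementary": {"Yes, Most of the Time", "Yes, All of the Time"},
    "how_many_times": {"0 Times"},                 # bullying items
}


def percent_favorable(responses: list[tuple[str, str]]) -> float:
    """responses: (item_scale, chosen_response) pairs for a single topic."""
    favorable = sum(1 for scale, choice in responses if choice in FAVORABLE[scale])
    return 100.0 * favorable / len(responses)


def composite_percent_favorable(responses_by_topic: dict[str, list[tuple[str, str]]]) -> int:
    """Step 1: percent favorable per topic. Step 2: average the topic percentages.
    Displayed as a whole percentage."""
    topic_results = [percent_favorable(r) for r in responses_by_topic.values()]
    return round(sum(topic_results) / len(topic_results))


example = {
    "safety": [("agreement", "Agree"), ("how_many_times", "2 Times")],
    "sense_of_belonging": [("agreement", "Strongly Agree"), ("agreement", "Disagree")],
}
print(composite_percent_favorable(example))        # (50% + 50%) / 2 -> 50
```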

Scoring of the Items to Create a Metric Result – SEL (1 of 2)

We recommend including separate skill-by-skill results for each Social Emotional Skill on the Index (rather than a composite SEL score).

Rationale
• Including all four skills separately is manageable from a reporting perspective and allows us to emphasize SEL.
• Like culture-climate topics, SEL skill results are generally strongly related to the composite rating. For instance, composite ratings of SEL skills are generally correlated at .7 or above with skill-specific ratings. The exception is growth mindset, which is correlated at a .53 level with the composite rating, which is still reasonably strong.
• The range in performance on the composite measure between the 10th and 90th percentile schools is smaller than the skill-by-skill ranges (which run roughly 0.3 to 0.5 points on the 5-point scale), so the skill-by-skill approach provides greater variation in performance amongst schools. We also generally see greater variation in subgroup performance than all-students performance.
• In models that examine how social-emotional skills predict GPA, we looked both at how each skill taken separately predicts GPA and at how the composite result predicts GPA. Including all four skills separately in the model explains more of the variation in GPA than looking at the composite rating does; in other words, the skills separately provide more predictive power than the composite SEL rating. The composite is a more consistent predictor on its own than any individual skill, and there is not a discernible pattern between middle and high school as to which skill or skills stand out as most predictive.
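An illustrative sketch of the kind of model comparison described in the last bullet, run on synthetic data (this is not CORE's or CEPR's actual analysis; the OLS helper and all names are assumptions):

```python
import numpy as np


def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """Share of variance in y explained by an OLS fit of y on X (with intercept)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()


rng = np.random.default_rng(0)
skills = rng.normal(3.5, 0.5, size=(1000, 4))                    # four SEL skills, 1-5 scale
gpa = 1.0 + skills @ np.array([0.20, 0.10, 0.15, 0.05]) + rng.normal(0, 0.4, 1000)
composite = skills.mean(axis=1)

print(r_squared(skills, gpa))                    # four skills entered separately
print(r_squared(composite.reshape(-1, 1), gpa))  # composite rating alone (typically lower)
```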

Scoring of the Items to Create a Metric Result – SEL (2 of 2)

We recommend the following method for translating survey responses into a metric result: the results will be calculated in terms of a score from 1 to 5 for each metric result. Every item on the instrument has five response options, and each item is scored on a scale from 1 to 5, with "1" referring to the "worst" response option and "5" to the "best" option. We will display two decimal places (e.g., 3.28) to allow the user to see the differences in school results.

There will be a two-step process for calculating the ratings to ensure even weighting of the SEL skills in the composite:
- Step 1: Calculate the ratings separately by skill.
- Step 2: Average together those skill-specific ratings.

Rationale
• Percent favorable does not apply to SEL in the way that it works for culture-climate.
• The five-point approach keeps the maximum amount of information from the data in the indicator.
• The five-point approach is consistent with how field test results have been released thus far.
• The two-step process for computing composite results creates even weighting amongst the skills.
• We explored a more complex approach, where each student's rating was compared to the average rating of peers with the same demographics and we then took the difference between each student and their matched peer. We then tried ranking schools based upon an average of these differences.
  - We did see that some schools would be ranked differently using this approach, but there isn't strong evidence that this is a reasonable approach, and it creates significant complexity in a metric that is already going to be new to users.
  - Further, the fact that the Index includes results for the "all students" group and four subgroup categories gives schools with concentrations of youth in poverty, English Learners, students with disabilities, etc., the opportunity to show strength with those subgroups relative to other schools.
  - We will explore a "growth measure" for SEL after the 2016 administration.
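A minimal sketch of the 1-to-5 scoring with the two-step skill averaging, under the assumption that item responses have already been coded 1-5 (1 = "worst" option, 5 = "best" option); the names are hypothetical:

```python
def skill_rating(item_scores: list[int]) -> float:
    """Average of the pooled 1-5 item scores for one SEL skill."""
    return sum(item_scores) / len(item_scores)


def composite_sel(scores_by_skill: dict[str, list[int]]) -> float:
    """Step 1: rate each skill separately. Step 2: average the skill ratings,
    so each skill carries even weight regardless of how many items it has."""
    ratings = [skill_rating(scores) for scores in scores_by_skill.values()]
    return round(sum(ratings) / len(ratings), 2)   # displayed with two decimal places


school = {
    "self_management": [4, 3, 5, 4],
    "growth_mindset": [2, 3],
    "self_efficacy": [4, 4, 3],
    "social_awareness": [3, 3, 4, 3, 3],
}
print(composite_sel(school))                       # 3.34
```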

Translation of Metric Performance into Index Levels

Just as we do with other metrics in the Index, we will translate metric results into Index Levels (and points) by setting performance thresholds for ten levels of performance. We will use the same method for these metrics that we have used for other indicators:
- Base Level 1 on the 10th percentile from the baseline year (Spring 2016).
- For Levels 2 to 5, ensure that the spread between performance levels is consistent.
- Base Level 6 on the 50th percentile from the baseline year (Spring 2016).
- For Levels 7 to 9, ensure that the spread between performance levels is consistent.
- Base Level 10 on the 90th percentile from the baseline year (Spring 2016).

Since the 2015 assessment was a "field test," we will use Spring 2016 results as the baseline year.

Rationale
• We do not see a reason to move away from the approach utilized for other metrics on the Index.
• That said, waiting for the 2016 data to finalize these thresholds aligns with the messaging of the field test year, and will allow us to adjust based upon minor updates to the items and impacts from the broader set of decisions contained herein.
• On the next page, we provide a set of preliminary thresholds based upon the field test data.
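One plausible reading of this threshold-setting rule, as a sketch (the interpolation details, use of numpy, and function names are assumptions, not CORE's published algorithm): Level 1 sits at the baseline 10th percentile, Level 6 at the 50th, Level 10 at the 90th, with evenly spaced cut points in between.

```python
import numpy as np


def level_thresholds(baseline_results: np.ndarray) -> np.ndarray:
    """Cut points for Levels 1..10 derived from baseline (Spring 2016) results."""
    p10, p50, p90 = np.percentile(baseline_results, [10, 50, 90])
    lower = np.linspace(p10, p50, 6)     # Levels 1-6, even spread for Levels 2-5
    upper = np.linspace(p50, p90, 5)     # Levels 6-10, even spread for Levels 7-9
    return np.concatenate([lower, upper[1:]])


def to_level(result: float, thresholds: np.ndarray) -> int:
    """Map a metric result to the highest level whose threshold it meets (minimum 1)."""
    return max(1, int(np.searchsorted(thresholds, result, side="right")))


baseline = np.random.default_rng(1).normal(70, 10, 500)   # e.g. percent-favorable results
cuts = level_thresholds(baseline)
print(to_level(83.0, cuts), to_level(55.0, cuts))
```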

Preliminary Performance Thresholds – Based upon Field Test Data

Coming soon (in development)

Appendices: Relevant analyses by our partners at the Harvard Center for Education Policy Research

Appendix: Correlations between individual SE competencies and CC domains

School-level results for each social-emotional competency or culture/climate domain were modestly correlated with one another, all in the expected direction.
• At a school level we find modest correlations between individual social-emotional competencies and culture/climate domains, suggesting that these constructs are related while capturing different facets of a school.
• Composite SEL and Culture-Climate ratings are strongly related to each skill/topic.

[Table: school-level correlation matrix across 14 measures – Self-Management; Growth Mindset; Social Awareness; Self-Efficacy (Global, ELA, Math, Science, Social Studies); Mean of Student SEL Measures; Student Climate of Support for Academic Learning; Student Sense of Belonging; Student Knowledge and Fairness of Discipline, Rules and Norms; Student Safety; Standardized Mean of Student School Culture Measures.]

Appendix: Predictive power of individual SEL skills versus the composite

Including all four skills separately in the model explains more of the differences in GPA than looking at the composite rating does. The composite is a more consistent predictor on its own than any individual skill, and there is not a discernible pattern between middle and high school as to which skill or skills stand out as most predictive.

Appendix: Difference in rank between composite and demographic-adjusted school student mean SEL skills

The modal school would have the same ranking (on a scale from 1 to 10) with either method. A relatively small subset of schools would receive a substantively different ranking.

[Histograms of the difference in rank (from -10 to +10) between composite and demographic-adjusted school student mean SEL skills. Elementary schools: Spearman's rho = .66, 624 schools. High schools: Spearman's rho = .72, 187 schools.]