Section 1: Program Description

a. Program history/overview: Compass Learning Odyssey has more than a 30-year history of partnering with schools to gather data and evaluate the effectiveness of its online student achievement solutions. The Compass Learning program was created using input from universally accepted theories and guidelines from cognitive psychology and instructional design experts; market research, including state student performance data, industry association studies, and ongoing customer feedback; and external product research through focus groups and efficacy studies. Glen Ellyn District 41 chose this program to increase student learning and because it is aligned to the NWEA Measures of Academic Progress (MAP), an online test that students take three times a year. When students log in to Odyssey, they find their own learning plan in their Student Portfolio. Unique to each child, the plan focuses on that child's particular learning needs as identified by MAP. The activities are designed by experts to be engaging, fun, and just the right amount of challenge for students. Compass Learning offers "personalized online instruction that tailors learning objectives, content, method, pace, and environment to each student's unique learning needs" (Compass Learning, 2013). Additionally, Compass Learning assesses student interests, learning styles, and expressions; evaluates strengths and weaknesses in academic subjects; creates a personalized learning pathway to instruct and build skills; and provides an engaging online curriculum designed to heighten student achievement and learning. One of the great benefits of this program is educators' ability to monitor student progress, add assignments, and track data. Finally, Compass Learning continually seeks to evaluate and improve its own effectiveness, including making sure that the content is engaging and relevant for students.

b.
Program size, location, and description of organization: Program size: Compass Learning is an online program and therefore has virtually limitless capacity. Compass Learning has served more than 11 million students in 20,000 schools nationwide; the four elementary schools using the program in Glen Ellyn support the learning of more than 2,400 students. Program location(s): The program is being used by Glen Ellyn District 41 students from four elementary schools: Abraham Lincoln Elementary, Benjamin Franklin Elementary, Churchill Elementary, and Forest Glen Elementary. These students are from DuPage
County cities including Glen Ellyn, Bloomingdale, Wheaton, Lombard, and Carol Stream. Because the program is online, it can be used both at home and at school.

Program director(s): The program director for Compass Learning is our Assistant Superintendent of Teaching, Learning, and Accountability, who will facilitate the program throughout the district, with the school board making final decisions. The program director will use data along with staff and stakeholder feedback to evaluate the program.

Organization (may be presented in chart):
September 2011: Literacy specialists, math specialists, and the Assistant Superintendent of Teaching, Learning, and Accountability review and learn about the Compass Learning program.
November 2011: The Assistant Superintendent of Teaching, Learning, and Accountability proposes that the board adopt the Compass Learning program.
February 2012: District 41 receives board approval to adopt the Compass Learning program.
April 2012: Train 20 teachers across grade levels and schools with Compass Learning to be our leaders.
August 2012: Institute Day dedicated to training teachers how to teach their students about Compass Learning; each school will be taught by the teacher leaders within its building.
September 2012: Collect data on the first grade students to compare with end-of-year data for the evaluation.
December 2012: Half day with our Compass Learning instructors refreshing teachers about the program and teaching them how to …
May 2012: Collect data on the first grade students on Compass Learning.
May 2012: Computer-based survey on Compass Learning for parents, students, and teachers to take.
Special political considerations: Compass Learning is an online program, and using it requires students to have a computer with Internet access. District 41 has a population of students who are low-income and/or underprivileged, and many of these students do not have a computer or a means of accessing one. With an expectation that every student access Compass Learning outside of school for at least 60 minutes a night, some of our students are already starting at an unfair disadvantage. In many cases the students who don't have access are also some of our lower-performing students, who could benefit most from the extra practice. Another consideration is that Compass Learning is based on MAP scores only. MAP is another computer-based program purchased by the district and taken three times a year. Compass Learning sets the level at which students work based on their performance on the MAP test; each time the MAP test is taken, student levels are adjusted in Compass Learning. Students who do not test well, or who do not test well online, are at a disadvantage.

c. Evaluation Needs

1. Who wants an evaluation? The stakeholders want this evaluation. The stakeholders in this case are the administrators, teachers, students, parents, and community members.

2. What should be the focus of the evaluation (formative, summative, and/or confirmative)? In this case the district has been using Compass Learning for a year.
A formative evaluation will be done to assess how the program is being used: for example, is it being used in school and at home, and how often? The formative evaluation will also examine whether the areas of weakness identified for students are seen in other settings. After that, a summative evaluation will be employed, focusing on whether the program is improving student test scores and improving the students' areas of weakness. Summative evaluations focus on analyzing the effectiveness and efficiency of learner learning, the cost of program development and continuing expenses in relation to effectiveness and efficiency, attitudes toward the program among learners, faculty, and staff, and the long-term benefits of the instructional program (Morrison). The summative evaluation should have a strong focus on MAP testing data. Because MAP scores can be correlated to a projected ACT score, this also gives an opportunity to follow students to see whether Compass Learning improves on that projected ACT score in reality. If the summative evaluation shows that Compass Learning is an effective tool, we could progress to a confirmative evaluation based on students' actual ACT scores and how many of the students go on to college.

3. Why is an evaluation wanted? The stakeholders would like to justify the costs of the program; assess whether implementing this program will improve student scores on MAP testing; and explore whether that will lead to higher ACT scores and more college-bound students. They would also like to assess the attitudes students, parents, and teachers have toward the program and its use, including the frequency of use.

4. When is an evaluation wanted?
According to Thompson & McClintock (2000), an organization should conduct formative evaluation when a new program is being developed and when an existing program is 1) being modified; 2) having problems with no obvious solutions; or 3) being adapted for a new setting, population, problem, or behavior. Because School District 41 has been using the Compass Learning program for a full academic year without the results they had hoped for, it is time to use a formative assessment to evaluate the program in order to identify the problems and determine the solutions. In the event that the district is more interested in the changes in student MAP scores, an impact evaluation may be used. This type of evaluation provides an organization with baseline data and subsequent evaluations measure the changes in the group. 5. What resources are available to support an evaluation (time, funding, people, software, etc.)?
School District 41 employs several people who would be key in implementing an evaluation. Under the supervision of our Assistant Superintendent of Teaching, Learning, and Accountability, the Director of Communications, along with the Director and Assistant Director of Technology, would be the foundation of support for the evaluation. The district's Instructional Technologist and Webmaster may also play a role in implementing the evaluation throughout the district. Use of a computer-based survey would keep costs down and allow for flexibility in completing the evaluation.
Section 2: Program Goals and Outcomes

a. Program Logic Model

Resources (In order to accomplish our set of activities we will need the following): MAP assessment data (fall-winter-spring); teacher, student, and parent surveys; personnel to compile the data from the surveys.

Activities (In order to address our problem or asset we will conduct the following activities): Administer the MAP assessment 3 times during the school year; hold teacher-student goal-setting conferences; conduct teacher, student, and parent surveys to extract data on frequency of use, student engagement, and student accessibility.

Outputs (We expect that once completed or underway these activities will produce the following evidence of service delivery): # of students that show improvement in areas of weakness; # of students that show improvement on the MAP assessment overall; # of students using Compass Learning and data on frequency of use; levels of student engagement; levels of student accessibility.

Short & Long Term Outcomes (We expect that if completed or ongoing these activities will lead to the following changes in 1-3, then 4-6 years): With continued use of the program, we expect, during the first 1-3 years, to have 80% of students meet or exceed their growth target from the fall test to the spring test. During years 4-6, the expectation will increase to 90% of students meeting or exceeding their growth target from the fall test to the spring test.

Impact (We expect that if completed these activities will lead to the following changes in 7-10 years): Data reflecting higher student ACT scores; data reflecting a higher level of college-ready students; data reflecting more students attending four-year colleges.
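The outcome targets above (80%, then 90%, of students meeting or exceeding their fall-to-spring growth target) come down to a simple percentage over the MAP records. A minimal sketch in Python; the field names and the sample scores are illustrative assumptions, not the actual NWEA export format:

```python
# Sketch: percentage of students meeting their fall-to-spring MAP growth target.
# The keys 'fall', 'spring', and 'target' are hypothetical field names.

def percent_meeting_growth(students):
    """students: list of dicts with 'fall' and 'spring' RIT scores and a 'target' growth goal."""
    if not students:
        return 0.0
    met = sum(1 for s in students if s["spring"] - s["fall"] >= s["target"])
    return 100.0 * met / len(students)

# Illustrative roster (made-up numbers):
roster = [
    {"fall": 190, "spring": 201, "target": 10},  # grew 11 points -> met
    {"fall": 185, "spring": 192, "target": 9},   # grew 7 points  -> not met
    {"fall": 200, "spring": 212, "target": 11},  # grew 12 points -> met
]
```

Comparing the resulting percentage against the 80% (years 1-3) or 90% (years 4-6) threshold gives a direct check on the outcome target.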
b. Stakeholder Identification:
Assistant superintendent of learning: the person in charge of curricular support.
Administrators: people responsible for leading and establishing expectations for use and data collection.
Students: people who will be using the Compass Learning program at home and at school. These students come from four elementary schools in Glen Ellyn, with program users in grades K-5.
Parents: people who will provide at-home support for use of the program.
Community members: people who hold a stake in the education of the community.
School board: elected members of the community who make final operational decisions for the district.
Teachers: district employees responsible for monitoring student use and progress.

c. Stakeholder prioritization: Evaluation data collected from stakeholders. Each group needing evaluation information (assistant superintendent of learning, administrators, students, parents, community members, school board, and teachers) was checked against the following criteria: to develop new initiatives; to make operational decisions; to provide input to the evaluation; to react; and for input on program content. [Stakeholder-by-criteria checklist matrix]
Weighting stakeholders: Prioritizing stakeholders' needs and involvement helps to better use resources to collect data and produce a high-impact evaluation of Compass Learning. As seen in the criteria checklist and evaluation boundaries, the many stakeholders play different roles and will influence the evaluation in unique ways. Greater consideration will be given to those with higher influence on, and involvement with, the Compass Learning program. Final operational decisions within the district are made by the school board, including decisions about curriculum implementation and program evaluation; the board weighs the evaluation input from all other stakeholders to make a justified decision. The assistant superintendent of learning brings the culmination of all data and evaluation results to the school board. This role is crucial because it bridges the input from administrators, teachers, students, parents, and community members. Additionally, teachers, parents, and students play an important role within the evaluation because they work with the program on a day-to-day basis. The individuals in these roles report back on the program content and its effectiveness, provide feedback about what is working and what needs improvement, and will complete surveys throughout the program. The teachers' role in the evaluation is essential because they are heavily involved in collecting evaluation data and provide a voice in helping to develop new initiatives in the district. Administrators oversee the rollout of programs in their schools. Principals make sure to communicate with faculty and parents about the program in a timely manner. As the
program rolls out, they ensure all teachers are comfortable and supported along the way. Finally, community members play a small role in this evaluation process by reacting to its implementation and the costs of the program. However, members are not always directly involved unless they have children currently attending district schools, and therefore do not carry as much weight in the overall evaluation process.

Section 3: Evaluation Design

a. Evaluation purpose: The purpose of this evaluation is to identify whether use of the Compass Learning program leads to an increase in student MAP scores. The evaluation will additionally determine how often the program is accessed by students in the district. The results will be used to determine the cost-benefit of the program at the district level and next steps with regard to training and accessibility, so that the program is implemented in a way that supports the students of District 41.

b. Evaluation boundaries and questions: How the individuals or groups needing evaluation information might use or be affected by the results:
Assistant Superintendent of Learning: Is the program cost effective?
Administrators: Is the program aligned with standards? How does the content impact standardized testing?
Students: Can the students access the program frequently? Are they?
Parents: Do parents understand the purpose in using this program?
Community members: Is this program effective and meaningful for students?
School board: Is the program yielding desired results?
Teachers: How can we improve instruction using this program? Is the program providing rigorous and authentic learning opportunities for students? Is the program having a positive effect on student achievement?
What questions should the evaluation address?
Is this program having the desired impact and yielding positive results in student achievement?
How is this program being implemented at different grade levels and across four buildings?
How are students reacting to this program? Are the lessons and activities engaging and age appropriate?
How are teachers reacting to this program and utilizing it in the classroom?
Are parents using this program at home? How frequently?
b. Evaluation Questions and justifications:
Is this program having the desired impact and yielding positive results in student achievement? Rationale: (summative) to address questions from all stakeholders and the public audience.
Can the students access the program frequently, with ease, from multiple locations? Rationale: (formative) to address questions from parents, teachers, and community members.
Do parents, teachers, and students understand the purpose and big-picture results of using the program regularly? Rationale: (formative) to address questions from administration and goals for program usage.
How did the students react or respond to the lessons? Did they view them as engaging, age appropriate, and meaningful? Rationale: (formative) to address questions from teachers regarding activities found in the program.
c. Evaluation design plan matrix and list of measures:

Evaluation Question 1: Is this program having the desired impact and yielding positive results in student achievement?
Information Required: Student learning, student achievement, and data regarding activities, scores, and correlation to NWEA scores.
Information Source: Student reports on Compass Learning; NWEA MAP scores.
Method for Collecting Information: Documentation review and data spreadsheets from NWEA- and Compass Learning-generated reports; focus group to review data, discuss patterns, and evaluate results.
Sampling: All students in 4th and 5th grade across the four elementary schools.
Information Collection Procedures: Collection will use the user-friendly reports on usage, activity scores, and progress generated by Compass Learning, which can be produced using a variety of factors with ease, along with information from the NWEA MAP test, taken in the fall, winter, and spring each year. Student growth and achievement correlations will be evaluated and measured by a committee with members from all stakeholder groups.
Schedule: Data review by the focus group in October, February, and May to look for growth and impact in program usage and student achievement.
Analysis Procedures: Transcribe the information collected from the data review and focus group; discuss and transcribe data patterns and results.
Interpretation Procedures: Data and documentation review: Do the findings correlate student usage with higher achievement on the MAP test? Focus group: Are there ways to improve student access? Is this program effective in providing a positive correlation with student learning and achievement? Participants: teachers from each school, administrators, and the Superintendent of Student Learning. Guiding questions: What is this program doing well? How can it be better utilized? Is the program yielding desired results?
Reporting Format: Presentation of findings to the teams of excellence and at board meetings to involve all stakeholders, including next steps with the program.
Reporting Schedule: Presentation: early May. Report: late May.
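The interpretation step for this question asks whether student usage correlates with higher MAP achievement. One way a committee could quantify that is a Pearson correlation between usage minutes and fall-to-spring MAP growth. A sketch with made-up illustrative numbers (the real inputs would come from the Compass Learning usage reports and NWEA spreadsheets):

```python
# Sketch: Pearson correlation between Compass Learning usage and MAP growth.
# All data values below are illustrative assumptions, not district data.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length numeric sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

usage_minutes = [30, 45, 60, 90, 120]  # hypothetical weekly usage per student
map_growth = [4, 6, 7, 10, 12]         # hypothetical fall-to-spring RIT growth
```

A value near +1 would suggest heavier users grow more; a value near 0 would suggest no relationship, which would feed directly into the cost-benefit discussion.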
Evaluation Question 2: Can the students access the program frequently, with ease, from multiple locations?
Information Required: Student learning, student achievement, and data regarding activities, scores, and demographics of computer/Internet access at home.
Information Source: Student reports on Compass Learning; NWEA MAP scores; survey of computer/Internet access.
Method for Collecting Information: Survey, documentation of results, and a spreadsheet of data from NWEA MAP.
Sampling: All students in 4th and 5th grade across the four elementary schools.
Information Collection Procedures: Collection will use the user-friendly reports on usage, activity scores, and progress generated by Compass Learning, along with information from the NWEA MAP test, taken in the fall, winter, and spring each year. Survey information will be collected through a paper and/or online survey and placed on a spreadsheet. Student growth and access to the program will be evaluated by a committee drawn from all stakeholder groups.
Schedule: Data review in October, February, and May.
Analysis Procedures: Streamline the information collected from the surveys and look for trends; collect data on a spreadsheet to compare survey results with student progress.
Interpretation Procedures: Focus group: Is there a connection between student progress and access to the program at home? If students have more access to the program, do they show a significant difference in scores compared with those who do not? Participants: teachers from each school, administrators, the Superintendent of Student Learning, and parents. Guiding questions: How are students currently able to access the program? What can be changed to increase usage of the program?
Reporting Format: Presentation of findings to the Board of Education with recommendations for future program usage.
Reporting Schedule: Presentation: early May. Report: late May.
Evaluation Question 3: Do parents, teachers, and students understand the purpose and possible big-picture results of using the program regularly?
Information Required: Parent, teacher, and student feedback regarding their understanding of the purpose and possible big-picture results of using Compass Learning.
Information Source: Data collection from a survey with scales.
Method for Collecting Information: Google Form.
Sampling: All parents and teachers of students using Compass Learning in grades 4-5, and all students using Compass Learning in grades 4-5.
Information Collection Procedures: Three surveys of five questions each, geared specifically to parent, teacher, and student understanding of Compass Learning. After the September round of MAP testing, students will complete the surveys in class, and parents and teachers will be sent the survey via email. All surveys will have a deadline of mid-October, with several reminders to gather the maximum amount of feedback.
Analysis Procedures: Data will be organized and categorized in a Google spreadsheet into five levels of understanding: None, Some, Fair, Good, and Excellent.
Interpretation Procedures: Survey responses will be tallied by scale answer to determine whether more communication is needed for stakeholders to understand the purpose and possible big-picture results of using this program. Audience: school board, administrators, and superintendent. Outcome: an answer to the question, do parents, teachers, and students understand the purpose and possible big-picture results of using the program?
Reporting Format: Presentation of findings to administrators and the Board of Education with recommendations for future communications.
Reporting Schedule: November board meeting.
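The analysis procedure above sorts scale responses into five levels of understanding. Assuming the Google Form scale is coded 1-5 (an assumption about the instrument, not stated in the plan), the tally is straightforward:

```python
# Sketch: tallying 1-5 survey responses into the five understanding levels
# named in the analysis procedure. The 1-5 coding is an assumption.
from collections import Counter

LEVELS = {1: "None", 2: "Some", 3: "Fair", 4: "Good", 5: "Excellent"}

def tally_understanding(responses):
    """responses: list of integers 1-5 from the survey scale."""
    counts = Counter(LEVELS[r] for r in responses)
    # Return every level, including those with zero responses.
    return {label: counts.get(label, 0) for label in LEVELS.values()}

# Hypothetical batch of ten parent responses:
parent_responses = [5, 4, 4, 3, 2, 4, 5, 1, 3, 4]
```

The resulting counts per level map directly onto the spreadsheet columns, so the same tally can be run separately for the parent, teacher, and student surveys.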
Evaluation Question 4: How did the students react or respond to the lessons? Did they view them as engaging, age appropriate, and meaningful?
Information Required: Student feedback about individual lessons accessed on Compass Learning.
Information Source: Collection of data from surveys.
Method for Collecting Information: Google Form.
Sampling: All students in grades 4 and 5 throughout District 41.
Information Collection Procedures: Student surveys will be given twice during the school year. The surveys will prompt students to respond about their personal experiences with Compass Learning with regard to the meaningfulness and engagement of the lessons.
Schedule: Following the September MAP tests, students will be reminded of the procedures and expectations for accessing Compass Learning. Six weeks later (near the end of October), students will complete the first three-question survey; they will complete a second survey near the end of March.
Analysis Procedures: The data will be organized to show how many students rate the lessons as too easy, just right, or challenging with regard to being engaging, age appropriate, and meaningful. This information can help teachers determine whether students need to be assigned more age-appropriate activities. We will also be able to merge the data with the usage results to determine whether there is a correlation between students' feelings toward the activities and their frequency of use.
Interpretation Procedures: Audience: teachers, parents, students, the Superintendent for Student Learning, and the school board. Guiding question: Are students engaged in meaningful activities when assigned lessons on Compass Learning?
Reporting Format: Presentation of findings to staff, administrators, and the Board of Education with recommendations for future communications.
Reporting Schedule: Presentations at the November and April board meetings.
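The analysis procedure above calls for merging lesson ratings with usage results. A minimal sketch of that join, keyed on a hypothetical student ID (the actual export keys are an assumption), averaging weekly usage within each rating category:

```python
# Sketch: join lesson-rating survey results with Compass Learning usage logs
# on a hypothetical student ID, then average weekly usage per rating.
from collections import defaultdict

def mean_usage_by_rating(ratings, usage):
    """ratings: {student_id: rating label}; usage: {student_id: weekly minutes}."""
    buckets = defaultdict(list)
    for sid, rating in ratings.items():
        if sid in usage:                      # inner join on student ID
            buckets[rating].append(usage[sid])
    return {rating: sum(mins) / len(mins) for rating, mins in buckets.items()}

# Hypothetical data for four students:
ratings = {"s1": "just right", "s2": "too easy", "s3": "just right", "s4": "challenging"}
usage = {"s1": 60, "s2": 20, "s3": 80, "s4": 45}
```

If, for example, students rating lessons "too easy" also log far less time, that pattern would support reassigning more appropriately leveled activities.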
d. List of measures/instrument drafts: Our data will be collected in a Google spreadsheet from NWEA MAP testing. These data will be analyzed along with the Compass Learning reports to gauge the effectiveness of the program. Surveys will also be administered throughout the evaluation to teachers, parents, and students.

Section 4: Evaluation Management Plan

a. Proposed implementation timetable
Section 5: References
Compass Learning, Inc. (n.d.). The research behind Compass Learning Odyssey. Retrieved from http://it.dadeschools.net/CompassLearning/ResearchBehindOdyssey.pdf
Compass Learning. (2013). Why Compass Learning? Retrieved from http://www.compasslearning.com/odyssey
Kasarda, J. D. (2012). Demographic trends and enrollment projects. Retrieved from http://www.d41.dupage.k12.il.us/referendum/images/kasardademo.pdf
Morrison. (n.d.). Retrieved from http://moodle2.cedu.niu.edu/mod/resource/view.php?id=612
Thompson, N. J., & McClintock, H. O. (2000). Demonstrating your program's worth. Atlanta, GA: National Center for Injury Prevention and Control.

Section 6: Appendices

a. Measures of Evaluation: Online survey questions for parents, students, and teachers (Google Form for students; Google Form for parents; Google Form for teachers)
b. Resumes
c. CITI Training: