September 24, 2013

The Board received an update from Dr. Christopher Garran, Associate Superintendent of High Schools, and Dr. Erick Lang, Associate Superintendent, Office of Curriculum and Instructional Programs. The title of the agenda item settled (as of 8 a.m.), at least temporarily, on "7.1 final exam and move forward without the score and errors makMath Update (oral) [sic]." The new "Agenda Item Details" describes what was discussed: high schools' use of action plans to provide supports and interventions to struggling math students, and the charge to the Math Exam Work Group "to examine factors that may be contributing to student performance on countywide mathematics semester exams and make recommendations for increased student learning of mathematics" (http://www.boarddocs.com/mabe/mcpsmd/Board.nsf/goto?open&id=9A7T4Q7594DF). This topic strays from the hard facts of the real problem recently re-discovered: a decade of 50 percent math exam failure.

Reorientation to the Discussion and Issue

This follows the Superintendent's own failure to focus. The Agenda Items Summary states that the update would address "progress made since July when the superintendent announced his commitment to ensuring that all students who are not experiencing success in mathematics receive the appropriate support." Dr. Starr's July 1 memorandum noted "concerns about high failure rates" on math semester exams. Dr. Garran's action team was to "establish parameters for individual school mathematics action plans." Dr. Lang's work group was "to examine factors that may be contributing to student performance on countywide mathematics semester exams" (the same verbiage, though only the first half, included for Dr. Lang in the Agenda Item Summary). Dr. Starr cited the inadequacy of the current math curriculum (which he linked neither to action plans nor to exam failures).
Then he discounted exam failure: "The concerns raised by parents about examination performance are understandable and valid, but this information should not be view[ed] in isolation. We must look at it as part of a holistic understanding of how our students are performing in mathematics." According to Dr. Starr, "many of our students are doing very well in mathematics" (citing successful course completion); "However, we know that some students are struggling and we must do better" (Memorandum to the Board, Mathematics Semester Examinations—Action Team and Work Group, July 1, 2013, http://www.montgomeryschoolsmd.org/uploadedFiles/info/belltimesworkgroup(1)/130701%20Math%20Action%20Team%20Work%20Grp.pdf).
MCPS' Math Work Group produced a substantial (if greatly flawed) report addressing the written, implemented, and assessed curriculum; acceleration; and teacher preparation and development (http://www.montgomeryschoolsmd.org/boe/meetings/agenda/2010-11/20101109/3%200%20Memo%20-%20Update%20on%20the%20K-12%20Mathematics%20Work%20Group%20_final_.pdf). As justified by that report, MCPS published a "Mathematics Program for Grades K-12" (http://www.montgomeryschoolsmd.org/uploadedfiles/curriculum/math/mathematicsprogram.pdf). In the meantime, Dr. Starr is implementing "an integrated system of supports" (http://www.montgomeryschoolsmd.org/boe/meetings/agenda/2013/032113/05.0%20Update%20Strategic%20Priority%20Interventions.pdf). Then Dylan Presman unearthed the contemporary exam failure rate; exam failure has persisted for a decade. Does the rediscovery of massive exam failure warrant reversion to discussion of supports, inadequate math curricula, generalized increased student math performance, and generalized math struggles? Or does it warrant focus on the correlations surrounding, and meaning of, math exam failure and MCPS' decade-long inattention to the issue?

Meeting Summary

Dr. Garran described meetings with high school principals and math resource teachers (at which individual students needing supports were identified by name) and the formulation of action plans, initial drafts of which are now being received by his office. Dr. Lang said that the Math Exam Work Group was examining alignment of curricula, tests, and teaching; placement; the purpose of summative exams; performance on formative exams; and maximizing the instructional plan. The Group has met three times and may meet four more times. It is targeted to complete its work in November, but may run longer. MCPS has established a webpage (http://www.montgomeryschoolsmd.org/info/mathexamworkgroup/) with a "brief synopsis" of the meetings; it will share some data with the public.
Board member Durso inquired about a school-initiated Rockville High School survey, the role of resource teachers, and the end goal of the action team and Math Exam Work Group. The end goal is dissemination of best practices, possibly through a "network." Board member O'Neill said that it is important to name names of struggling students to "zero in on effective strategies." She mentioned a parent and student complaint that students are unable to find out what errors they made on summative exams. Dr. Lang said that the Work Group is not reexamining double-period math, but would look for ways to share details of exam errors; the Group is trying to maintain a narrow focus on exams. Ms. O'Neill asked about the balance in grading between homework and final exams.
Board Vice President Phil Kauffman inquired about upper-grade students receiving the new Curriculum 2.0 without the preparation of Curriculum 2.0 in prior grades, about textbooks not aligned with the curriculum (the Work Group is not considering textbooks), and about professional development. Dr. Lang said that teachers had two days of Algebra 1 training over the summer, that collaborative planning is important, and that the Instruction Center offers "just-in-time" videos to support teachers. Mr. Kauffman said that he had reviewed the 2000 minutes (http://www.montgomeryschoolsmd.org/boe/meetings/minutes/2000/minutes.050900.pdf) in which the same issue was addressed. He asked whether MCPS believed that this time we will fix it by doing something different. Dr. Lang said that no major changes in exam performance should be expected this school year and that MCPS will "keep looking" for solutions.

Board member Brandman asked whether we should assess "mastery" by successful course completion or by test. She noted that county-wide tests were implemented "to get at the issue of variability." Dr. Lang responded that this is one of the issues to be considered. There has been some suggestion of demonstrating student knowledge by a course project rather than by summative exam. Students complain of being unable to remember what was learned across a whole semester. Dr. Lang noted that students can access formulae and methods on-line, so may not need to memorize them. Ms. Brandman said that the Work Group should examine what students mean when they report that "I studied hard." Dr. Garran said that Saturday school could be another intervention; he had been contacted by The George B. Thomas, Sr. Learning Academy, Inc.

Board President Chris Barclay asked, "What are we trying to achieve?" Dr. Lang said that MCPS is trying to produce "strong math students" who do not need to take remedial math in college. Mr. Barclay asked how MCPS assesses effective teacher use of the Instruction Center and other resources. Dr. Lang responded that MCPS can measure how often the Instruction Center is accessed, and surveys teachers each year. Mr. Barclay asked what it means to "empower" resource teachers; Dr. Garran responded that they are being called into decision-making and asked to share best practices. Mr. Barclay asked when this conversation would be continued. Dr. Lang said that the Math Exam Work Group was scheduled to complete its work at the end of November. Mr. Barclay said that the Board cannot come back in 2026 with this same conversation. The real issue is how MCPS should give students the opportunity to prove what they know. We cannot shy away from bigger, broader questions.
My Reflections

It was a mistake to schedule a Board discussion of this issue before it is better developed by MCPS and the Math Exam Work Group: premature discussion encourages loss of focus. There was no mention of a comprehensive report (including extensive data) by the Math Exam Work Group. Instead, the public (whose kids have been ill-served by a decade's loss of concern for "mastery") will have a "brief synopsis" of the meetings and a smattering of data. This falls well short of what is appropriate, and tends to sweep the problem under the rug (until the next Dylan Presman in 2026).

Students need to remember and synthesize across a semester what they learn day-to-day. A math "project" (as contrasted with an English term paper) does not test memory and synthesis. Consideration of substituting a project for a summative exam shows a lack of understanding of the theory of summative exams, the historic and continuing lack of function of summative exams in MCPS education, and a willingness to back away from mastery.

There were a few encouraging signs of focus: Dr. Lang's statement that the Math Exam Work Group was focused on exam failure; Mr. Kauffman's mention of the history of failure and Mr. Barclay's concern about revisiting the problem in 2026; and Ms. Brandman's points regarding assessment of "mastery" and the meaning to a student of "hard study." Most of the Board discussion was not focused on exam failure, but on more general issues that had recently been resolved, were being considered in other forums, or were otherwise extraneous. Dr. Lang said that the Math Exam Work Group was examining alignment of curricula, tests, and teaching; placement; the purpose of summative exams; performance on formative exams; and maximizing the instructional plan. The first few foci seem to address math exam failure; the last two are diffuse.
The Math Exam Work Group apparently will not search for data correlations that would show commonalities among students who fail the exam; nor consider Board, MCPS, and Math Work Group responsibility for a decade's persistence of the problem; nor student acceptance of mastery as an objective in light of a decade's discounting by MCPS; nor the means necessary to restore trust between MCPS and parents. The "end goal" of the Math Exam Work Group should not be dissemination of best practices; it should be analysis of which students failed and why. I would have wanted my children to attend a school where mastery was expected.

Appendix: Specific Questions for the MCPS Math Semester Exam Work Group
September 20, 2013
Math Semester Exam Work Group

MCCPTA member Merry Heidorn said September 4 on the GTAletters listserv: "let me know the specific questions you have and we'll see if we can't identify questions you'd like framed on specific issues we're examining. That way, I can truly represent you in a way that's meaningful. Not only that, but it would be as if you were participating as a member of the 'team.'" I have the following "specific questions." (Data correlations should be made by presenting data from a five- to ten-year period.)

1. What is the correlation between students who "skipped" and subsequent math exam failure?
2. What is the correlation between students who did not skip and subsequent math exam failure?
3. What is the correlation between FARMS students and non-FARMS students, respectively, and subsequent math failure?
4. What is the correlation between students who received an E on a semester exam and students who were subsequently skipped?
5. What is the correlation between students who were, and were not, skipped and subsequent successful course completion?
6. What is the correlation between school FARMS rate and the percentage of students in that school who received an E?
7. What is the correlation between students who were, and were not, skipped and SAT math exam score?
8. The outsized math exam failure rate was identified in 2000 and 2004. Was it identified in Board of Education minutes or in an OSA report at any other time(s)?
9. Why did the Board of Education not monitor the math semester exam failure rate (after it was brought to the Board's attention in 2000 and again in 2004)?
10. What MCPS offices and departments are responsible for the math semester exams?
11. Why did those MCPS employees responsible for the math semester exams not notice and report the failure rate?
12. What do summative exams measure?
13. Is summative exam performance related to teaching and/or learning?
14. What are the several uses to which summative exams generally, and math summative exams in particular, are put by MCPS?
15. Are summative exam results used in course placement?
16. Are the math curricula and the math semester exams aligned? What studies determined this?
17. What is the correlation between students who were, and were not, skipped and final exam study efforts?
18. What math semester exam failure data was provided by MCPS to the Math Work Group?
19. Why did the Math Work Group Report not address the math semester exam failure rate?
20. Why did the Math Work Group not analyze the nature and uses of summative assessments?
21. Does the long-term continuation of the math semester exam failures breach the trust among parents, MCPS, and its Board?
22. What steps will MCPS take to address these issues with the transparency necessary to begin to restore trust?
23. What programs will MCPS implement to ameliorate the learning deficiencies, attributable to lax summative assessment expectations, of the thousands of students who have failed math exams over the past decade?
24. Does the November 2013 MSEWG completion date allow sufficient time to responsibly fulfill the charge, comprehend the failure, prepare a comprehensive report, and determine steps to prevent continuation or recurrence?

These "specific questions" supplement my "Math Exam Failure and Math Work Group: With Recommendations for the Math Semester Exam Work Group," dated and forwarded to each member of the MSEWG on August 29, 2013.