https://docs.google.com/file/d/0B25T4337dQ6pSGlSSEs2Y3A4eFk/edit?usp=sharing&pli=1

Preparing Lesson Plans (RPP) for Kurikulum 2013
POSTED BY KOMANG SUARDIKA POSTED ON 7/22/2013 02:15:00 AM

The steps for preparing a lesson plan (Rencana Pelaksanaan Pembelajaran, RPP) fall into three major stages: introductory activities, core activities, and closing activities, detailed as follows:

1. Introductory Activities
- Motivation: the teacher outlines the benefits of studying the material to be taught.
- Providing a frame of reference:
  - relates to the field of study to be learned
  - may take the form of an overview of the main topics and an outline of the subject matter
  - dividing the class into learning groups
  - explaining how the learning activities will be carried out, in accordance with the planned lesson steps
2. Core Activities
- The learning process for achieving the core and basic competencies
- Conducted in a way that is interactive, inspiring, enjoyable, and challenging, and that motivates students
- Uses methods suited to the characteristics of the students and the subject, with exploration, elaboration, and confirmation carried out through the activities of observing, questioning, trying, reasoning, presenting, and creating
3. Closing Activities
- The teacher guides students to produce a summary or conclusions
- Giving a test or assignment and directing follow-up learning, which may take the form of activities outside the classroom, at home, or assignments for remediation or enrichment

Who We Are

John R. Frederiksen

Interests, Background

John R. Frederiksen
frederik@u.washington.edu
University of Washington's Web site
Learning Inquiry Through Reflective Assessment's Web site

Dr. Frederiksen's interests are broadly concerned with the application of the cognitive sciences to learning and instruction within both classroom settings and computer-based learning environments. His background in cognitive science encompasses work in experimental cognitive psychology, artificial intelligence, and educational measurement. His recent work focuses on how middle school science students can develop an understanding of scientific inquiry processes and apply this knowledge in creating models of scientific phenomena. In this context, he is carrying out a longitudinal study of how developing students' inquiry skills may enhance their learning across the middle school curriculum. His research on assessment focuses on both teachers and students. He has studied how teachers' use of video portfolios for assessing teaching may support their inquiry into effective teaching practices, and how students' peer and self assessments of their inquiry processes facilitate their learning. He has also investigated how assessments of scientific inquiry may be incorporated into large-scale science assessments. In prior work, he explored students' understanding of physical theories (particularly of electricity) and, within this domain, developed intelligent computer-based learning environments for understanding basic circuit theory and troubleshooting. In all his research, he applies cognitive theories to educational practice and uses evaluations of instructional processes and outcomes to inform the further development of cognitive theory.


Education

Ph.D., 1966 (Psychology and Psychometrics), Princeton University. B.A., 1963 (Psychology, magna cum laude), Harvard University.

Affiliations, Service

Professor (2001 onward), University of Washington. Adjunct Professor (1990 onward), School of Education, University of California at Berkeley. Principal Scientist (1990-2000), Educational Testing Service. Division Scientist (1987-1990), Senior Scientist (1975-1987), BBN Laboratories. Assistant Professor (1968-1975), Brandeis University. Member of the AAAS, American Educational Research Association, Cognitive Science Society, National Association for Research in Science Teaching, and Sigma Xi. Chair, Lindquist Award Committee, American Educational Research Association, 1997-1998. Member: Technical Advisory Committee, California Assessment Program, 1990-1995; National Center for Research in Mathematical Sciences Education, 1990-1995. Editorial board: Journal of the Learning Sciences.

Publications

Frederiksen, J. R., & White, B. Y. (2002). Conceptualizing and constructing linked models: Creating coherence in complex knowledge systems. In P. Brna, M. Baker, K. Stenning, & A. Tiberghien (Eds.), The Role of Communication in Learning to Model. Mahwah, NJ: Erlbaum. Frederiksen, J. R., White, B. Y., & Gutwill, J. (1999). Dynamic mental models in learning science: The importance of constructing derivational linkages among models. Journal of Research in Science Teaching, 36(7), 806-836. Frederiksen, J. R., Sipusic, M., Sherin, M., and

Wolfe, E. (1998). Video portfolio assessment: Creating a framework for viewing the functions of teaching. Educational Assessment, 5(4), 225-297. White, B., & Frederiksen, J. (1998). Inquiry, Modeling, and Metacognition: Making Science Accessible to All Students. Cognition and Instruction, 16(1), 3-118. Frederiksen, J. R., & Collins, A. (1996). Designing an Assessment System for the Workplace of the Future. In L. B. Resnick, J. Wirt, & D. Jenkins (Eds.), Linking School and Work: Roles for Standards and Assessment (pp. 193-221). Jossey-Bass. Frederiksen, J. R., & White, B. (1995). Teaching and Learning Generic Modeling and Reasoning Skills. Journal of Interactive Learning Environments, 5(1), 33-52. Frederiksen, J. R., & White, B. Y. (1993). The Avionics Job-Family Tutor: An approach to developing generic cognitive skills within a job-situated context. Proceedings of the International Conference on Artificial Intelligence and Education. Frederiksen, J. R., & White, B. Y. (1992). Mental models and understanding: A problem for science education. In E. Scanlon & T. O'Shea (Eds.), New Directions in Educational Technology. New York, NY: Springer-Verlag. White, B. Y., & Frederiksen, J. R. (1990). Causal model progressions as a foundation for intelligent learning environments. Artificial Intelligence, 42, 99-157. Frederiksen, J. R., & Collins, A. (1989). A systems approach to educational testing. Educational Researcher, 18(9), 27-32. Frederiksen, J. R., & White, B. Y. (1989). An Approach to Training Based Upon Principled Task Decomposition. Acta Psychologica, 71(1-3), 89-146.

Who We Are

Barbara Y. White

Interests, Background

Professor White's professional interest is in making science and scientific inquiry interesting and accessible to a wide range of students and teachers. She has been principal investigator on numerous projects concerned with the design of computer-based learning environments and their relationship to theories of human learning, understanding, and problem solving. Furthermore, she has been developing and evaluating new instructional approaches, centered around these environments, that enable students to work together to develop an understanding of both the subject matter and the processes of scientific modeling and inquiry.

Barbara Y. White bywhite@berkeley.edu UC Berkeley's Web site

Education

Ph.D. Computer Science, Massachusetts Institute of Technology, USA, 1981. Completed Ph.D. qualifying program in Psychology, University of New South Wales, Australia, 1975. B.Sc. Mathematics, University of Victoria, Canada, 1971.

Affiliations, Service

Professor (1995 onward), Associate Professor (1989-1995), University of California at Berkeley. Senior Scientist (1985-1989), Scientist (1982-1985), BBN Laboratories, Cambridge, USA. Member of the Cognitive Science Society, the American

Association for Artificial Intelligence, the International AI and Education Society (on executive committee), the American Educational Research Association, and the National Association for Research in Science Teaching. She serves on the editorial boards of the International Journal of Artificial Intelligence in Education and the Journal of Interactive Media in Education.

Presentations

Enabling Young Learners to Develop QP Theories of Minds. Invited address given at the Fifteenth International Workshop on Qualitative Reasoning, San Antonio, May, 2001. Conceptual Tools for Learning through Inquiry and Reflection. Invited talk given at the Annual Meeting of the American Educational Research Association, San Diego, April, 1998. Technological Tools and Instructional Approaches for Improving Science Education. Invited talk presented at the Harvard Graduate School of Education, March, 1997. The ThinkerTools Project: A Computer-Based Curriculum for Scientific Inquiry and Modeling. Invited address presented at the Annual Meeting of the National Association for Research in Science Teaching, San Francisco, April, 1995. Intermediate Abstractions and Causal Models: A Microworld-Based Approach to Science Education. Invited address presented at the World Conference on Artificial Intelligence in Education, Edinburgh, Scotland, August, 1993.

Publications

White, B., Shimoda, T., & Frederiksen, J. Enabling Students to Construct Theories of Collaborative Inquiry and Reflective Learning: Computer Support for Metacognitive Development. International Journal of Artificial Intelligence in Education, 10(2), 1999. Frederiksen, J., White, B., & Gutwill, J. Dynamic Mental Models in Learning Science: The Importance of Constructing Derivational Linkages Among Models. Journal of Research in Science Teaching, 36(7), 806-836, 1999.

White, B., & Frederiksen, J. Inquiry, Modeling, and Metacognition: Making Science Accessible to All Students. Cognition and Instruction, 16(1), 3-118, 1998. White, B., & Schwarz, C. Alternative Approaches to Using Modeling and Simulation Tools for Teaching Science. In N. Roberts, W. Feurzeig, & B. Hunter (Eds.), Computer Modeling and Simulation in Science Education (pp. 226-256). New York, NY: Springer-Verlag, 1998. Frederiksen, J., & White, B. Teaching and Learning Generic Modeling and Reasoning Skills. Journal of Interactive Learning Environments, 5, 33-51, 1998. White, B. Computer Microworlds and Scientific Inquiry: An Alternative Approach to Science Education. In B. Fraser & K. Tobin (Eds.), The International Handbook of Science Education. Netherlands: Kluwer, 1998. White, B. The ThinkerTools Project: Computer Microworlds as Conceptual Tools for Facilitating Scientific Inquiry. In S. Glynn & R. Duit (Eds.), Learning Science in the Schools: Research Reforming Practice (pp. 201-227). Hillsdale, NJ: Lawrence Erlbaum Associates, 1995. White, B. ThinkerTools: Causal Models, Conceptual Change, and Science Education. Cognition and Instruction, 10(1), 1-100, 1993. White, B., Frederiksen, J., & Spoehr, K. Conceptual Models for Understanding the Behavior of Electrical Circuits. In M. Caillot (Ed.), Learning Electricity and Electronics with Advanced Educational Technology (pp. 77-95). New York, NY: Springer-Verlag, 1993. White, B. Causal Models and Intermediate Abstractions: A Missing Link for Successful Science Education? In R. Glaser (Ed.), Advances in Instructional Psychology, Volume 4 (pp. 177-252). Hillsdale, NJ: Lawrence Erlbaum Associates, 1993. White, B. A Microworld-based Approach to Science Education. In E. Scanlon & T. O'Shea (Eds.), New Directions in Educational Technology (pp. 227-242). New York: Springer-Verlag, 1992. White, B., & Frederiksen, J. Causal Model Progressions as a Foundation for Intelligent Learning Environments. Artificial Intelligence, 42, 99-157, 1990.

Frederiksen, J., & White, B. An Approach to Training Based Upon Principled Task Decomposition. ACTA Psychologica, 71, 1-58, 1989. White, B., & Frederiksen, J. Causal Models as Intelligent Learning Environments for Science and Engineering Education. Applied Artificial Intelligence, 3, 167-190, 1989.

Grants

White, B. Modeling, Developing, and Assessing Scientific Inquiry Skills Using a Computer-Based, Inquiry Support Environment. Funded by the National Science Foundation. White, B., & Frederiksen, J. Improving Students' Learning and Achievement Through Developing Generalizable Skills for Inquiry and Self-Reflection. Funded by the Department of Education's Office of Educational Research and Improvement.

Who We Are

ThinkerTools Research Group Members

ThinkerTools Principal Researchers

Barbara Y. White Professor of Science, Math, and Technology Education University of California at Berkeley

Barbara White is Professor of Science, Math, and Technology Education at the University of California at Berkeley, where she chairs the Graduate Group in Science and Math Education and co-directs the Masters and Credential Program in Science and Math Education. Professor White also serves on the editorial boards for the International Journal of Artificial Intelligence in Education and for the Journal of Interactive Media in Education. She is best known in cognitive science for her work on conceptual models, intermediate abstractions, and model progressions; in artificial intelligence for her work on modeling and simulation, particularly qualitative modeling, and the design of intelligent, computer-based learning environments; and in science education for her work on innovative approaches to teaching physics and scientific inquiry. Professor White's professional interests are in making science and scientific inquiry interesting and accessible to a wide range of students and teachers. She has undertaken a variety of projects concerned with the design of computer-based learning environments and their relationship to theories of human learning, understanding, and problem solving. She is also concerned with developing and evaluating new instructional approaches, centered around these environments, that enable students to work together

to develop an understanding of the processes of scientific modeling and inquiry, while they also develop a deep understanding of the subject matter along with widely applicable cognitive, social, and metacognitive capabilities.

John R. Frederiksen Professor of Education University of Washington

John Frederiksen’s research concerns the application of the cognitive sciences to learning and instruction within both classroom settings and computer-based learning environments. His work involves experimental cognitive psychology, artificial intelligence, and educational measurement. Recent research focuses on how middle school science students can develop an understanding of scientific inquiry processes and apply this knowledge in creating models of scientific phenomena. He applies cognitive theories to educational practice, and uses evaluations of instructional processes and outcomes to illuminate further development of cognitive theory. He has served as a principal research scientist and director of the Cognitive Science Research Group at the Educational Testing Service (ETS), and as a member of the planning group of ETS's Center for Performance Assessment. He is currently a Professor of Education at the University of Washington in Seattle. His recent publications include "Dynamic Mental Models in Learning Science: The Importance of Constructing Derivational Linkages among Models," in the Journal of Research in Science Teaching (with B. Y. White, et al., 1999); "Metacognitive Facilitation: An Approach to Making Scientific Inquiry Accessible to All," in Teaching in the Inquiry-Based Science Classroom, edited by J. Minstrell, et al. (with B. Y. White, in press); "Video Portfolio Assessment: Creating a Framework for Viewing the Functions of Teaching," in Educational Assessment (with M. Sipusic, et al., 1998); and "Inquiry, Modeling, and Metacognition: Making Science Accessible to All Students," in Cognition and Instruction (with B. Y. White, 1998).

Allan Collins Professor Emeritus Northwestern University

Allan Collins is Professor Emeritus at Northwestern University. He is a member of the National Academy of Education, a fellow of the American Association for Artificial Intelligence, and served as a founding editor of the journal Cognitive Science and as first chair of the Cognitive Science Society. He is best known in psychology for his work on semantic memory and mental models, in artificial intelligence for his work on plausible reasoning and intelligent tutoring systems, and in education for his work on inquiry teaching, cognitive apprenticeship, situated learning, epistemic games, and systemic validity in educational testing. From 1991 to 1994 he was Co-Director of the US Department of Education’s Center for Technology in Education centered at Bank Street College of Education. In recent years he has been developing a theory of epistemic forms and games. Epistemic forms are the recurring forms that are found among theories in science. Some of the different forms that occur are stage models, hierarchies, aggregate-behavior models, system-dynamics models, and production systems. Inquiry in different disciplines involves mastering how to carry out investigations of phenomena guided by one or more of these target structures. Epistemic games refer to the set of rules and strategies that scientists follow when they carry out an inquiry.

ThinkerTools Project Members

Marcela Borge
cela54@berkeley.edu
MACSME's Web site

Marcela Borge is a Ph.D. candidate in Cognition and Development (Education in Math, Science, and Technology) at the University of California at Berkeley. She has lived in the Bay Area for most of her life. Her research interests include developing students' collaborative learning and problem-solving skills, behavior regulation, and educational software development, areas in which she has done extensive research as a graduate student. Her dissertation is a curriculum development project focused on helping students learn to monitor and regulate collaborative interactions in a science classroom. Students learn to "manage" assigned collaborative roles while immersed in a collaborative project on the human body.

Karen Bush karbush@scglobal.net

Karen Bush has a B.A. in Human Biology, Stanford University (1987), and an M.A. in Science Education and a Life Science Teaching Credential, UC Berkeley (1996). Karen has been teaching science at Longfellow Middle School since 1996. She also works with students in the science credential program at UCB, teaching a methods class and supervising them in their school placements. Karen's interests include:
• increasing the participation of minority students in science
• creating supportive curriculum and learning environments in science for lower-achieving students
• improving training and support for science teachers
• reducing burnout and increasing longevity of science teachers through the creation of improved curriculum and manageable learning environments.

Eric Eslinger eslinger@udel.edu University of Delaware's Web site

Eric Eslinger is an assistant professor at the University of Delaware. His teaching interests are in science education. Eric received his Ph.D. from the University of California at Berkeley.

Suzanna Loper sjloper@berkeley.edu

Suzanna Loper works with the Seeds of Science/Roots of Reading project of GEMS, at the Lawrence Hall of Science, Berkeley, developing elementary school science curricula that integrate inquiry science with literacy.

Tatiana Miller
tatianamiller@sbcglobal.net
tatiana@ucsc.edu

Tatiana F. Miller is in her first year as a doctoral student at the University of California at Santa Cruz, in the specialization of language and literacy, and is part of the research team for the VINE Project (Vocabulary Innovations in Education). She earned her B.A. in Environmental Studies and a minor in Politics from UCSC (1995), and went on to teach science to fifth

and sixth grade students in the outdoor classroom at Foothill Horizons Outdoor Science School and the Santa Cruz County Outdoor Science School. Her science teaching in an informal learning context includes environmental interpretive work for the National Park Service, the U.S. Forest Service, and the Massachusetts Audubon Society. Tatiana earned her M.A. in Teaching and a multiple-subject teaching credential with cross-cultural language and academic development (CLAD) certification from UCSC (2001). She spent the past five years teaching fifth grade at Bay View Elementary School in Santa Cruz, CA, collaborating with members of the ThinkerTools Research Group to design and implement inquiry-learning curricula in both science and language arts content areas.

L.J.C. Shimoda
ljc@shimodaworks.com
info@thinkertools.org
Shimodaworks' Web site

L.J.C. (Linda) Shimoda is a freelance artist working for a variety of clients and in a wide range of media. She creates the art and design for books and their covers, develops logos and icons, most of which are used on the Web, designs Web sites and software, and works on many other projects that incorporate her creative skills. Linda has worked in media ranging from pen-and-ink to photography, digital to brush, and oil to found-objects. Linda has worked with Professor Barbara White and her ThinkerTools research group for a number of years, enjoying the duties of graphic art and design, software and Web design and graphics, database management, and any project requiring her hidden talents of organization and pestering people into doing something.

Todd Shimoda
todd@shimodaworks.com
Shimodaworks' Web site

Todd Shimoda graduated from the SESAME (Science, Engineering, and Mathematics Education) doctoral program at the University of California, Berkeley. He has been a professor at Colorado State University and is a visiting researcher at Cal working on the Web of Inquiry Project. His interests include intelligent agent design, cognitive science, epistemology, and science and health education. He has published in books and journals including Science Education and the Journal of Artificial Intelligence in Education.

Home

ThinkerTools Scientific Inquiry and Modeling Project

In the ThinkerTools Project, researchers from UC Berkeley and elsewhere are collaborating with middle school teachers from inner-city, rural, and suburban public schools. We are developing instructional methods and materials aimed at making scientific inquiry accessible to a wide range of students. Our work includes:

Scientific Inquiry

Creating instructional approaches that foster the development of students' scientific inquiry and reflective learning skills

Inquiry Software

Developing computer-based tools and educational activities that facilitate those skills and understandings

Curricula

Creating curricular materials and teacher's guides which embody these new approaches to science education

Assessments

Developing alternative methods of assessment that evaluate understanding, inquiry, and metacognition rather than rote retention of facts and formulas

Models of Expertise

Creating and testing theories of expertise and its acquisition, including models of scientific inquiry and reflective learning, which form the foundation for these instructional tools, methods, and materials The ThinkerTools Research Group is part of the University of California, Berkeley's Graduate School of Education.

Carl Wieman
From Wikipedia, the free encyclopedia

Carl Edwin Wieman

Wieman (left) with Eric Cornell on the campus of the University of Colorado

Born: March 26, 1951 (age 62), Corvallis, Oregon, U.S.
Nationality: United States
Fields: Physics
Institutions: University of British Columbia; University of Colorado
Alma mater: MIT; Stanford University
Doctoral advisor: Theodor W. Hänsch
Known for: Bose–Einstein condensate
Notable awards: King Faisal International Prize in Science (1997); Lorentz Medal (1998); Benjamin Franklin Medal (2000); Nobel Prize in Physics (2001); Oersted Medal (2007)

Carl Edwin Wieman (born March 26, 1951) is an American physicist at the University of British Columbia and recipient of the Nobel Prize in Physics for the production, in 1995 with Eric Allin Cornell, of the first true Bose–Einstein condensate.
Biography
Wieman was born in Corvallis, Oregon in the United States and graduated from Corvallis High School. Wieman earned his B.S. in 1973 from MIT and his Ph.D. from Stanford University in 1977; he was also awarded a Doctor of Science, honoris causa, from the University of Chicago in 1997. He was awarded the Lorentz Medal in 1998. In 2001, he won the Nobel Prize in Physics, along with Eric Allin Cornell and Wolfgang Ketterle, for fundamental studies of the Bose–Einstein condensate.[1] In 2004, he was named United States Professor of the Year among all doctoral and research universities. Wieman joined the University of British Columbia on 1 January 2007 and is heading a well-endowed science education initiative there; he retains a twenty percent appointment at the University of Colorado at Boulder to head the science education project he founded in Colorado.[2] In the past several years, Wieman has been particularly involved with efforts at improving science education and has conducted educational research on science instruction. Wieman currently serves as Chair of the Board on Science Education of the National Academy of Sciences. He has used and promotes Eric Mazur's "peer instruction", a pedagogical system in which teachers repeatedly ask multiple-choice concept questions during class, and students reply on the spot with little wireless "clicker" devices. If

a large proportion of the class chooses a wrong answer, students discuss among themselves and reply again.[3] In 2007, Wieman was awarded the Oersted Medal, which recognizes notable contributions to the teaching of physics, by the American Association of Physics Teachers (AAPT). Wieman is the founder and chairman of PhET, a web-based initiative of the University of Colorado which provides an extensive suite of simulations to improve the way that physics, chemistry, biology, earth science, and math are taught and learned.[4] Wieman is a member of the USA Science and Engineering Festival's Advisory Board.[5] Wieman was nominated to be Associate Director for Science of the White House Office of Science and Technology Policy on March 24, 2010. His hearing before the Commerce Committee took place on May 20, 2010, and he was approved by unanimous consent. On September 16, 2010, Dr. Wieman was confirmed by unanimous consent.

Selected publications

Donley, Elizabeth A.; Neil R. Claussen; Simon L. Cornish; Jacob L. Roberts; Eric A. Cornell; Carl E. Wieman (2001-07-19). "Dynamics of Collapsing and Exploding Bose–Einstein Condensates". Nature 412 (6844): 295–299. arXiv:cond-mat/0105019. Bibcode:2001Natur.412..295D. doi:10.1038/35085500. PMID 11460153.

Matthews, Michael R.; B.P. Anderson; P.C. Haljan; D.S. Hall; C.E. Wieman; E.A. Cornell (1999). "Vortices in a Bose-Einstein Condensate". Phys. Rev. Lett. 83 (13): 2498–2501. arXiv:cond-mat/9908209. Bibcode:1999PhRvL..83.2498M. doi:10.1103/PhysRevLett.83.2498.

Walker, Thad; David Sesko and Carl Wieman (1990). "Collective Behavior of Optically Trapped Neutral Atoms". Phys. Rev. Lett. 64 (4): 408–411. Bibcode:1990PhRvL..64..408W. doi:10.1103/PhysRevLett.64.408. PMID 10041972.

Tanner, Carol E.; Carl Wieman (1988). "Precision Measurement of the Hyperfine Structure of the 133Cs 6P3/2 State". Phys. Rev. A 38 (3): 1616–1617. Bibcode:1988PhRvA..38.1616T. doi:10.1103/PhysRevA.38.1616. PMID 9900545.

See also

Timeline of low-temperature technology

References

1. http://nobelprize.org/nobel_prizes/physics/laureates/2001/public.html
2. "CU-Boulder Nobel Laureate Carl Wieman Announces Move To British Columbia, Will Remain Linked To CU-Boulder" (Press release). University of Colorado, Boulder. 2006-03-20. Retrieved 2007-10-09.
3. David Epstein (2006-04-07). "Trading Research for Teaching". Inside Higher Ed. Retrieved 2007-10-09.
4. http://ebooks.worldscinet.com/ISBN/9789812813787/9789812813787_0097.html
5. http://www.usasciencefestival.org/about/advisors

External links

Carl Wieman's blog at ScientificBlogging.com
Carl E. Wieman biography at the Nobel Foundation
Globe and Mail article
Carl E. Wieman patents at Patent Genius
Group photograph taken at Lasers '95 including (right to left) Marlan Scully, Theodor W. Hänsch, Carl E. Wieman, and F. J. Duarte.

Oersted Medal
From Wikipedia, the free encyclopedia
For the medal awarded by the Danish Society for the Dissemination of Natural Science, see H. C. Ørsted Medal. The Oersted Medal recognizes notable contributions to the teaching of physics. Established in 1936, it is awarded by the American Association of Physics Teachers. The award is named for Hans Christian Ørsted. It is the Association's most prestigious award. Well-known recipients include Nobel laureates Robert Andrews Millikan, Edward M. Purcell, Richard Feynman, Isidor I. Rabi, Norman F. Ramsey, Hans Bethe, and Carl Wieman; as well as Arnold Sommerfeld, George Uhlenbeck, Jerrold Zacharias, Philip Morrison, Melba Phillips, Victor Weisskopf, Gerald Holton, John A. Wheeler, Frank Oppenheimer, Robert Resnick, Carl Sagan, Freeman Dyson, Daniel Kleppner, and Lawrence Krauss, as well as Anthony French, David Hestenes, Robert Karplus, Robert Pohl, and Francis Sears. The 2008 medalist, Mildred S. Dresselhaus, is the third woman to win the award in its 70-plus year history.

Medalists
William Suddards Franklin – 1936
Edward Herbert Hall – 1937
Alexander Wilmer Duff – 1938
Benjamin Harrison Brown – 1939
Robert Andrews Millikan – 1940
Henry Crew – 1941
none awarded in 1942
George Walter Stewart – 1943
Roland Roy Tileston – 1944
Homer Levi Dodge – 1945
Ray Lee Edwards – 1946
Duane Roller – 1947
William Harley Barber – 1948
Arnold Sommerfeld – 1949
Orrin H. Smith – 1950
John Wesley Hornbeck – 1951
Ansel A. Knowlton – 1952
Richard M. Sutton – 1953
Clifford N. Wall – 1954
Vernet E. Eaton – 1955
George E. Uhlenbeck – 1956
Mark W. Zemansky – 1957
J. W. Buchta – 1958
Paul Kirkpatrick – 1959
Robert W. Pohl – 1960
Jerrold R. Zacharias – 1961
Francis W. Sears – 1962
Francis L. Friedman – 1963
Walter Christian Michels – 1964
Philip Morrison – 1965
Leonard I. Schiff – 1966
Edward M. Purcell – 1967
Harvey E. White – 1968
Eric M. Rogers – 1969
Edwin C. Kemble – 1970
Uri Haber-Schaim – 1971
Richard P. Feynman – 1972
Arnold Arons – 1973
Melba N. Phillips – 1974
Robert Resnick – 1975
Victor F. Weisskopf – 1976
H. Richard Crane – 1977
Wallace A. Hilton – 1978
Charles Kittel – 1979
Paul E. Klopsteg – 1979, Extraordinary Oersted Medal Award
Gerald Holton – 1980
Robert Karplus – 1981
I. I. Rabi – 1982
John A. Wheeler – 1983
Frank Oppenheimer – 1984
Sam Treiman – 1985
Stanley S. Ballard – 1986
Clifford E. Swartz – 1987
Norman F. Ramsey – 1988
Anthony P. French – 1989
Carl E. Sagan – 1990
Freeman Dyson – 1991
Eugen Merzbacher – 1992
Hans A. Bethe – 1993
E. Leonard Jossem – 1994
Robert Beck Clark – 1995
Donald F. Holcomb – 1996
Daniel Kleppner – 1997
Edwin F. Taylor – 1998
David L. Goodstein – 1999
John G. King – 2000
Lillian C. McDermott – 2001
David Hestenes – 2002
Edward W. Kolb – 2003
Lawrence Krauss – 2004
Eugene D. Commins – 2005
Kenneth W. Ford – 2006
Carl Wieman – 2007
Mildred S. Dresselhaus – 2008
George Smoot – 2009
not awarded in 2010
F. James Rutherford – 2011
Charles H. Holbrow – 2012
Edward F. Redish – 2013

External links
Initially, the meaning and purpose of the steps in the Inquiry Cycle may be only partially understood by students. We therefore designed ways to scaffold their inquiry until they can design their own experiments and construct their own laws to characterize their findings. We created both scaffolded activities and environments that enable students to carry out a sequence of activities corresponding to the steps in the Inquiry Cycle. The scaffolded activities guide students as they carry out real-world experiments and help them learn about the processes of experimental design and data analysis, the nature of scientific argument and evidence, and the characteristics of scientific laws and models. The scaffolded environments include computer simulations, which allow students to create and interact with models of force and motion. They also provide analytic tools that help students analyze the results of their computer and real-world experiments. The scaffolded activities and environments make the inquiry process as easy and productive as possible at each stage in learning.

Reflective Assessment. In conjunction with the scaffolded inquiry, students engage in a reflective process in which they evaluate their own and each other's research. This process employs a carefully chosen set of criteria, such as "Being Systematic" and "Reasoning Carefully," that characterize expert scientific inquiry (see Table 1). Students use these criteria to evaluate their work at each step in the Inquiry Cycle, which helps them see the intellectual purpose and properties of the inquiry steps and their sequencing. Students also use the criteria to evaluate their own and each other's work when they complete research projects and present their results to the class. By engaging in these evaluations, in which they talk about and reflect on the characteristics of expert scientific inquiry and the function of each inquiry step, students grow to understand the nature and purpose of inquiry as well as the habits of thought involved.

Generalized Inquiry and Reflection. The students use the Inquiry Cycle and the Reflective Assessment Process repeatedly as the class addresses a series of research questions. With each repetition of the cycle, some of the scaffolding is removed, so that eventually the students are conducting independent inquiry on questions of their own choosing (as in the scaffolding and fading approach of Palincsar and Brown [1984]). These repetitions of the Inquiry Cycle in conjunction with Reflective Assessment help students refine their inquiry processes. Carrying out these activities in new research contexts also enables students to learn how to generalize the inquiry and reflection processes so that they can apply them to learning about future topics.

CRITERIA FOR JUDGING RESEARCH

Understanding
◗ Understanding the Science. Students show that they understand the science developed in the curriculum and can apply it in solving problems, in predicting and explaining real-world phenomena, and in carrying out inquiry projects.
◗ Understanding the Processes of Inquiry. Students can talk about the approach they or others have taken in exploring a research topic. For example, they can describe what kinds of scientific models and inquiry processes were used in carrying out an investigation and in reaching its conclusions.
◗ Making Connections. Students see the big picture and have a clear overview of their work, its purpose, and how it relates to other ideas or situations. They relate new information, ideas, and experimental results to what they already know.

Performance: Doing Science
◗ Being Inventive. Students are creative and examine many possibilities in their work. They show originality and ingenuity in thinking of problems to investigate, in coming up with hypotheses, in designing experiments, in inventing new laws or models, and in applying their models to new situations.
◗ Being Systematic. Students are careful, organized, and logical in planning and carrying out their work. When problems arise, they are thoughtful in examining their progress and deciding whether to alter their approach or strategy.
◗ Using the Tools of Science. Students use the tools and representations of science appropriately. The tools they choose to use (or create) may include things such as laboratory apparatus, measuring instruments, diagrams, graphs, charts, calculators, and computers.
◗ Reasoning Carefully. Students can reason appropriately and carefully using scientific concepts and models. For example, they can argue whether or not a prediction or law that they or others have suggested fits with a scientific model. They can also show how experimental observations support or refute a model.

Social Context of Work
◗ Writing and Communicating Well. Students clearly express their ideas to one another or to an audience through writing, diagrams, and speech. Their communication is clear enough to allow others to understand their work and to reproduce their research.
◗ Teamwork. Students work together as a team to make progress. Students respect each other's contributions and support one another's learning. Students divide their work fairly and make sure that everyone has an important part.
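The judging criteria above can also be viewed as a small rubric data structure: a set of categories, each with named criteria, against which peer and self assessments are scored. The sketch below is purely our own illustration (the names `RUBRIC` and `mean_scores` are invented here), not part of the ThinkerTools materials, but it shows one minimal way such criterion-by-criterion scoring could be organized.

```python
# A hypothetical illustration (ours, not from the ThinkerTools project) of
# the judging criteria organized as a rubric, plus a helper that averages a
# set of peer/self-assessment scores criterion by criterion.

RUBRIC = {
    "Understanding": [
        "Understanding the Science",
        "Understanding the Processes of Inquiry",
        "Making Connections",
    ],
    "Performance: Doing Science": [
        "Being Inventive",
        "Being Systematic",
        "Using the Tools of Science",
        "Reasoning Carefully",
    ],
    "Social Context of Work": [
        "Writing and Communicating Well",
        "Teamwork",
    ],
}

def mean_scores(assessments):
    """Average a list of {criterion: score} dicts, criterion by criterion."""
    totals = {}
    for assessment in assessments:
        for criterion, score in assessment.items():
            totals.setdefault(criterion, []).append(score)
    return {c: sum(scores) / len(scores) for c, scores in totals.items()}

# Two (invented) peer reviews of one research group's project:
peer_reviews = [
    {"Being Systematic": 4, "Reasoning Carefully": 3},
    {"Being Systematic": 2, "Reasoning Carefully": 5},
]
print(mean_scores(peer_reviews))  # {'Being Systematic': 3.0, 'Reasoning Carefully': 4.0}
```

Keeping each criterion's scores separate, rather than collapsing them into one grade, mirrors the point of Reflective Assessment: students see which specific inquiry habits were strong or weak.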

For each new topic in the curriculum, students follow the Inquiry Cycle:

1. Question. As described earlier, the inquiry process begins with developing a research question such as, "What happens to the motion of an object that has been pushed or shoved when there is no friction or gravity acting on it?"

2. Predict. Next, to set the stage for their investigations, the students try to generate alternative predictions and theories about what might happen in some specific situations related to the research question. In other words, they engage in "thought experiments." For example, in Module 1, they are asked to predict what would happen in the following situation: Imagine a ball that is stopped on a frictionless surface, one that is even smoother than ice. Suppose you hit the ball with a mallet. Then, imagine you hit the ball again in the same direction with the same size hit. Would the second hit change the velocity of the ball? If so, describe how it would change and explain why. In response to this question, some students might say, "the second hit does not affect the speed of the ball because it is the same size hit as the first," while others say, "it makes the ball go twice as fast because it gives the ball twice as much force," and still others, "it only makes the ball go a little bit faster because the ball is already moving."

3. Experiment. After presenting their predictions to the class, students break into research groups to design and carry out experiments to test their alternative theories. These investigations make use of both computer simulations and real-world experimental materials.

4. Model. After the students complete their experiments, they analyze their data to see whether there are patterns. They then try to summarize and explain their findings by formulating laws and causal models to characterize their conclusions. Students' models typically take the form "If A then B because ...," for example, "if there are no forces such as friction acting on an object, then it will go forever at the same speed, because there is nothing slowing it down." Combining computer simulations with real-world experiments and the process of model building can help students understand the nature of scientific models. The computer is not the real world; it can only simulate real-world behavior by stepping through time and using rules to determine how an acting force, such as friction or gravity, will change the dot's velocity at each time step. The computer thus actually uses a conceptual model to predict behavior, just as the students will use the conceptual models they construct to predict behavior. In working with the computer, the students' task is to design experiments that will help them induce the laws used by the simulation. This is easier than the corresponding real-world inquiry task. After all, objects in the real world are not driven by laws; rather, the laws merely characterize their behavior. One example of a modeling activity, carried out early in the curriculum, has students explain how their computer and real-world experiments can lead to different conclusions. They might say, for example, that "the computer simulation has no friction, which affects our real-world experiment," or that "the real world does not behave perfectly and does not follow the rules." Working with computer simulations can thus potentially help students develop metacognitive knowledge about what scientific models are and how laws can be used to predict and control behavior. It can also enable them to appreciate the usefulness of creating computer simulations that embody scientific laws and idealized abstractions of real-world behavior, and then using the simulations to carry out experiments to see the implications of a given theory.

Based on the findings of their computer and real-world experiments, the students prepare posters, make oral presentations to the class, and submit project reports. The Inquiry Cycle is used in organizing their reports and presentations. Using writing, graphing, and drawing software (such as ClarisWorks), the students analyze their data and prepare their reports. Then, in a whole-class research symposium, they jointly evaluate the findings of all the research groups and choose the "best" laws and models to explain their data.

5. Apply. Once the class has chosen the best laws and causal models, the students try to apply them to different real-world situations. For example, they might try to predict what happens when you push a hockey puck across the ice. As part of this process, they examine the use of their laws and models for predicting and explaining what will happen. They also investigate the limits of their models (such as, "What happens if the ice is not perfectly smooth?"), which inevitably raises new research questions (such as, "What is the effect of friction?"). This brings the class back to the beginning of the Inquiry Cycle and to the investigation of the next research question in the curriculum.

CYCLING TOWARD INDEPENDENT INQUIRY

The Inquiry Cycle is repeated with each of the curriculum's seven modules. As the curriculum progresses, the physics the students deal with increases in complexity, and so does the inquiry. In the early stages of the curriculum, the inquiry process is heavily scaffolded. In Module 1, students are given the experiments to do and are presented with possible alternative laws to evaluate. In this way, they see examples of experiments and laws before they have to create their own. In Module 2, students are given the experiments to do but must construct the laws for themselves. Then, in Module 3, they design their own experiments and construct their own laws to characterize their findings (see Appendix A). By the end of the curriculum, the students are conducting independent inquiry on topics of their own choosing.

Metacognitive Facilitation

White, B., & Frederiksen, J. "Metacognitive facilitation: An approach to making scientific inquiry accessible to all." In J. Minstrell and E. van Zee (Eds.), Inquiring into Inquiry Learning and Teaching in Science (pp. 331-370). Washington, DC: American Association for the Advancement of Science, 2000.

Metacognitive Facilitation: An Approach to Making Scientific Inquiry Accessible to All
Barbara Y. White, University of California at Berkeley
John R. Frederiksen, Educational Testing Service

TABLE OF CONTENTS
Abstract
Introduction
Facilitating Inquiry within the Classroom Research Community
Facilitating Reflective Assessment within the Classroom Research Community
Instructional Trials of the ThinkerTools Inquiry Curriculum
An Overview of the Results
  The Development of Inquiry Expertise
  The Development of Physics Expertise
  The Impact of Understanding the Reflective Assessment Criteria
The Implications of Our Findings
Metacognitive Facilitation for Teachers
Acknowledgments
References
Appendix A
  An Outline and Checklist for Your Research Reports
  Example Research Report
  Example Self Assessment

ABSTRACT

In the ThinkerTools Inquiry Project, researchers and teachers collaborated to create a computer-enhanced, middle-school, science curriculum that enables students to learn about the processes of scientific inquiry and modeling as they construct a theory of force and motion. The class functions as a research community. Students propose competing theories. They then test their theories by working in groups to design and carry out experiments using both computer models and real-world materials. Finally, they come together to compare their findings and to

try to reach a consensus about the physical laws and causal models that best account for their results. This process is repeated as the students tackle new research questions that foster the evolution of their theories of force and motion.

The ThinkerTools Inquiry Curriculum focuses on facilitating the development of metacognitive knowledge and skills as students learn the inquiry processes needed to create and revise their theories. Instructional trials in urban classrooms revealed that this approach is highly effective in enabling all students to improve their performance on various inquiry and physics measures. The approach incorporates a reflective process in which students evaluate their own and each other's research using a set of criteria that characterize good inquiry, such as reasoning carefully and collaborating well. When this reflective process is included, the curriculum is particularly effective in reducing the performance gap between low- and high-achieving students. These findings have strong implications for what such inquiry-oriented, metacognitively focused curricula can accomplish, particularly in urban school settings in which there are many disadvantaged students. Furthermore, the process of metacognitive facilitation can also be helpful to teachers as they learn how to engage in and reflect on inquiry teaching practices.

INTRODUCTION

Science can be viewed as a process of creating laws, models, and theories that enable one to predict, explain, and control the behavior of the world. Our objective in the ThinkerTools Inquiry Project has been to create an instructional approach that makes this view of understanding and doing science accessible to a wide range of students, including lower-achieving and younger students.
Our hypothesis is that this objective can be achieved by facilitating the development of the relevant metacognitive knowledge and skills: students need to learn about the nature and utility of scientific models as well as the processes by which they are created, tested, and revised. (See Brown [1984], Brown, Collins, and Duguid [1989], Bruer [1993], Collins and Ferguson [1993], Nickerson, Perkins, and Smith [1985], and Resnick [1987] for further discussions regarding the central role that metacognition plays in

learning.)

To test our hypothesis about making scientific inquiry accessible to all students by focusing on the development of metacognitive expertise, we created the ThinkerTools Inquiry Curriculum in which students construct and revise theories of force and motion. The curricular activities and materials are aimed at developing the knowledge and skills that students need to support this inquiry process. The curriculum begins by introducing students to a metacognitive model of research, called "The Inquiry Cycle," and a metacognitive process, called "Reflective Assessment," in which they reflect on their inquiry (see Figures 1 and 2). The Inquiry Cycle consists of five steps and provides a goal structure which students use to guide their inquiry. The curricular activities focus on enabling students to develop the expertise needed to carry out and understand the purpose of the steps in the Inquiry Cycle, as well as to monitor and reflect on their progress as they conduct their research. This is achieved via a constructivist approach that could be characterized as learning metacognitive knowledge and skills through a process of scaffolded inquiry, reflection, and generalization. We call this approach "metacognitive facilitation."

1. Scaffolded Inquiry. We designed scaffolded activities and environments to enable students to learn about inquiry as they engage in authentic scientific research. The scaffolded activities are aimed at helping them learn about the characteristics of scientific laws and models, the processes of experimental design and data analysis, and the nature of scientific argument and proof. The scaffolded environments, which include computer simulations (that allow students to create and interact with models of force and motion) and analytic tools (for analyzing the results of their computer and real-world experiments), make the inquiry process as easy and productive as possible at each stage in learning. These activities and environments enable students to carry out a sequence of activities that correspond to the steps in the Inquiry Cycle. Initially, the meaning and purpose of the steps in the Inquiry Cycle may be only partially understood by students.

2. Reflective Assessment. In conjunction with the scaffolded inquiry, students are introduced to a reflective process in which they evaluate their own and each other's research. This process employs a carefully chosen set of criteria that characterize expert scientific inquiry (such as "Being Systematic" and "Reasoning Carefully" as shown in Figure 2) to enable students to see the intellectual purpose and properties of the inquiry steps and their sequencing. By reflecting on the attributes of each activity and its function in constructing scientific theories, students grow to understand the nature of inquiry and the habits of thought that are involved.

3. Generalized Inquiry and Reflection. The Inquiry Cycle is repeated as the class addresses new research questions. Each time the cycle is repeated, some of the scaffolding is removed so that eventually the students are conducting independent inquiry on questions of their own choosing (as in the scaffolding and fading approach of Palincsar and Brown, 1984). These repetitions of the Inquiry Cycle in conjunction with reflection help students to refine their inquiry processes. Carrying out these processes in new research contexts also enables students to learn how to generalize the inquiry and reflection processes so that they can apply them to learning about new topics in the future.

Figure 1. A model of the scientific inquiry process which students use to guide their research.

Figure 2. The criteria for judging research which students use in the Reflective Assessment Process.

FACILITATING INQUIRY WITHIN THE CLASSROOM RESEARCH COMMUNITY

The project has established Classroom Research Communities in 7th, 8th, and 9th grade science classrooms in middle schools in Berkeley and Oakland. In these classes, inquiry is the basis for developing an understanding of the physics. Physical theories are not directly taught, but are constructed by students themselves. The idea is to teach students how to carry out scientific inquiry, and then have the students discover the basic physical principles for themselves by doing experiments and creating theories. The process of inquiry follows the Inquiry Cycle, shown in Figure 1, which is presented to students as a basis for organizing their explorations into the physics of force and motion. Inquiry begins with finding research questions, that is, finding situations or phenomena students do not yet understand which

become new areas for investigation. Students then use their intuitions (which are often incorrect) to make conjectures about what might happen in such situations. These predictions provide them with a focus as they design experiments that allow them to observe phenomena and test their conjectures. Students then use their findings as a basis for constructing formal laws and models. By applying their models to new situations, students test the range of applicability of their models and, in so doing, identify new research questions for further inquiry. The social organization of the research community is similar to that of an actual scientific community. Inquiry begins with a whole-class forum to develop shared research themes and areas for joint exploration. Research is then carried out in collaborative research groups. The groups then reassemble to conduct a research symposium in which they present their predictions, experiments, and results, as well as the laws and causal models they propose to explain their findings. While the results and models proposed by individual groups may vary in their accuracy, in the research symposium a process of consensus building increases the reliability of the research findings. The goal is, through debate based upon evidence, to arrive at a common, agreed-upon theory of force and motion. Organization of the curriculum. The curriculum is based on a series of investigations of physical phenomena that increase in complexity. On the first day, students toss a hacky sack around the room while the teacher has them observe and list all of the factors that may be involved in determining its motion (how it is thrown, gravity, air resistance, etc.). As an inquiry strategy, the teacher suggests the need to simplify the situation, and this discussion leads to the idea of looking at simpler cases, such as that of one-dimensional motion where there is no friction or gravity (an example is a ball moving through outer space). 
The curriculum is, accordingly, organized around starting with this simple case (Module 1), and then adding successively more complicating factors such as introducing friction (Module 2), varying the mass of the ball (Module 3), exploring two-dimensional motion (Module 4), investigating the effects of gravity (Module 5), and analyzing trajectories (Module 6). At the end of the curriculum, students are presented with a variety of possible

research topics to pursue (such as orbital motion, collisions, etc.), and they carry out research on topics of their own choosing (Module 7).

For each new topic in the curriculum, students follow the Inquiry Cycle:

1. Question. As described above, the inquiry process begins with developing a research question such as, "What happens to the motion of an object that has been pushed or shoved when there is no friction or gravity acting on it?"

2. Predict. Next, to set the stage for their investigations, students try to generate alternative predictions and theories about what might happen in some specific situations that are related to the research question. In other words, they engage in "thought experiments." For example, in Module 1, they are asked to predict what would happen in the following situation. "Imagine a ball that is stopped on a frictionless surface, one that is even smoother than ice. Suppose that you hit the ball with a mallet. Then, imagine you hit the ball again in the same direction with the same size hit. Would the second hit change the velocity of the ball? If so, describe how it would change and explain why." In response to this question, some students might say, "the second hit does not affect the speed of the ball because it's the same size hit as the first;" while others might say, "it makes the ball go twice as fast because it gives the ball twice as much force;" and others might say, "it only makes the ball go a little bit faster because the ball is already moving."

3. Experiment. After presenting their predictions to the class, students break into research groups to design and carry out experiments to test their alternative theories. These investigations make use of both computer simulations and real-world experimental materials.

a. Computer activities and experiments. Computer models and experiments are done using the ThinkerTools software that we developed for the Macintosh computer. This software enables students

to interact with Newtonian models of force and motion (see Figure 3 which shows an example of a computer activity that students use in studying onedimensional motion). It also lets students create their own models and experiments. Using simple drawing tools, students can construct and run computer simulations. Objects (such as the large circle shown in Figure 3) and barriers can be placed on the screen. (The objects are introduced to students as generic objects, simply called "dots," which are the pictorial equivalent of variables that students can map onto different objects such as space ships or billiard balls.) Students can define and change the properties of any object, such as its mass, elasticity (e.g., bouncy or fragile), and velocity. They can then apply impulses to the object to change its velocity using the keyboard or a joystick as in a video game. (Impulses are forces that act for a specified -- usually short-- amount of time like a kick or a hit.) Students can thus create and experiment with a "dot-impulse model" and can discover, for example, that when one applies an impulse in the same direction that the dot is moving, it increases the dot's velocity by one unit of speed. In this way, they can use simulations to discover the laws of physics and their implications.
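The dot-impulse behavior described above can be sketched in a few lines of code. The following is our own minimal illustration, not the actual ThinkerTools software; the class and method names are invented for the example. It also captures the Module 1 "double-hit" thought experiment: each identical hit changes the velocity by the same amount, so two hits leave the dot moving at two units of speed.

```python
# A minimal sketch of a "dot-impulse" model in the spirit of the
# description above (illustrative only, not the ThinkerTools code):
# each standard-sized impulse adds one unit of speed in the
# direction of the hit.

from dataclasses import dataclass

@dataclass
class Dot:
    x: float = 0.0   # position
    vx: float = 0.0  # velocity, in units of speed

    def impulse(self, direction: int) -> None:
        """Apply one standard-sized hit: direction is +1 or -1."""
        self.vx += direction

    def step(self, dt: float = 1.0) -> None:
        """Advance one time step with friction and gravity off."""
        self.x += self.vx * dt

# The double-hit thought experiment from Module 1:
dot = Dot()
dot.impulse(+1)   # first hit:  speed goes 0 -> 1
dot.impulse(+1)   # second hit: speed goes 1 -> 2
print(dot.vx)     # 2.0 -- the second hit does change the velocity
```

With friction and gravity off, repeated calls to `step` simply move the dot at a constant speed, which is the Module 1 discovery the students are working toward.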

Figure 3. The ThinkerTools Software: A modeling and inquiry tool for creating and experimenting with models of force and motion.

Such software enables students to create experimental situations that are difficult or impossible to create in the real world. For example, they can turn friction and gravity on and off and can select different friction laws (i.e., sliding friction or gas/fluid friction). They can also vary the amount of friction or gravity to see what happens. Such experimental manipulations, in which students dramatically alter the parameters of the simulation, allow students to use inquiry strategies, such as "look at extreme cases," which are hard to employ in real-world inquiry. This type of inquiry enables students to see more readily the behavioral implications of the laws of physics and to discover the underlying principles. A major advantage of the software is that it includes measurement tools that allow students to easily make accurate measurements of distances, times, and velocities that are difficult to make in real-world experiments. It also includes graphical representations of variables. For example, in Figure 3 there is a "datacross," which shows the x and y velocity components when the dot is moving. Also, as the dot moves, it can leave behind "dot prints," which show how far it moved in each second, and "thrust prints," which show when an impulse was applied. In addition, the software provides analytic tools, such as being able to step through time in order to analyze what is happening. These representations and analytic tools help students determine the underlying laws of motion. They can also be incorporated within the students' conceptual models to represent and reason about what might happen in successive time steps. Ideally, the software helps students construct conceptual models that are similar to the computer's in that both use diagrammatic representations and employ causal reasoning, stepping through time to analyze events.
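The time-stepping with selectable friction laws described above can be sketched as follows. This is our own illustration under stated assumptions, not the ThinkerTools implementation: we assume sliding friction removes a fixed amount of speed per time step, while gas/fluid friction removes an amount proportional to the current speed; the function and parameter names (`mu`, `drag`) are invented for the example.

```python
# A sketch of one simulation rule: how the dot's velocity changes
# on each time step under different friction laws. Illustrative
# only -- the actual ThinkerTools rules may differ.

def step_velocity(v: float, law: str = "none",
                  mu: float = 0.1, drag: float = 0.1) -> float:
    """Return the velocity after one time step under the given law."""
    if law == "sliding":
        # Sliding friction: constant deceleration that never
        # reverses the direction of motion.
        if v > 0:
            return max(0.0, v - mu)
        if v < 0:
            return min(0.0, v + mu)
        return 0.0
    if law == "fluid":
        # Gas/fluid friction: deceleration proportional to speed.
        return v * (1.0 - drag)
    return v  # friction turned off: the dot coasts forever

# "Look at extreme cases": with friction off, the speed never
# changes; with sliding friction, the dot stops after a fixed
# number of steps.
v = 2.0
for _ in range(30):
    v = step_velocity(v, law="sliding", mu=0.1)
print(v)  # 0.0 -- the dot has come to rest
```

Toggling `law` and varying `mu` or `drag` mimics the kind of parameter manipulation the text describes, making the behavioral consequences of each law easy to compare.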
In this way, such dynamic interactive simulations, combined with these analytic tools, can provide a transition from students' intuitive ways of reasoning about the world to the more abstract formal methods that scientists use for representing and reasoning about a system's behavior (White, 1993b).

b. Real-world experiments. Students are also given a set of materials for conducting real-world experiments. These include "bonkers" (a bonker is a rubber mallet mounted on a stand), balls of varying masses, and measurement tools such as meter sticks and stopwatches (see Figure 4). These tools are coordinated with those used in the ThinkerTools software. For instance, the bonker is similar to the joystick and is used to give a ball a standard-sized impulse. Using such materials, students design and carry out real-world experiments that are related to those done with the computer simulation. Students are also shown stop-motion videos of some of their experiments. Using frame-by-frame presentations, they can attach blank transparencies to the video screen and draw the position of a moving ball after fixed time intervals. These "dot print analyses" allow them to measure the moment-by-moment changes in the ball's velocity.

Figure 4. An illustration of the double-bonk experiment along with the table that students use to record their data.

4. Model. After the students have completed their experiments, they analyze their data to see if there are any patterns. They then try to summarize and explain their findings by formulating a law and a causal model to characterize their conclusions. Students' models typically take the form "If A then B because ...." For example, "if there are no forces like friction acting on an object, then it will go forever at the same speed, because there is nothing to slow it down." The computer simulations, combined with real-world experiments and the process of creating a model, can help students to understand the nature of scientific models. To elaborate, the computer is not the real world; it can only simulate real-world behavior by stepping through time and using rules to determine how forces that are acting (like friction or gravity) will change the dot's velocity on that time step. Thus, the computer is actually using a conceptual model to predict behavior, just as the students will use the conceptual models they construct to predict behavior. In working with the computer, the students' task is to design experiments that will help them induce the laws that are used by the simulation. This is more straightforward than the corresponding real-world inquiry task. After all, objects in the real world are not driven by laws; rather, the laws simply characterize their behavior. One example of a modeling activity, carried out early in the curriculum, has students explain how their computer and real-world experiments could lead to different conclusions. They might say, for instance, that "the computer simulation does not have friction, which is affecting our real-world experiments." Alternatively, they might say that "the real world does not behave perfectly and does not follow rules." Working with a computer simulation can thus potentially help students to develop metaconceptual knowledge about what scientific models are and how laws can be used to predict and control behavior. It can also enable them to appreciate the utility of creating computer simulations that embody scientific laws and idealized abstractions of real-world behavior, and then of using such simulations to do experiments in order to see the implications of a particular theory.

Based on the findings of their computer and real-world experiments, students prepare posters, make oral presentations to the class, and submit project reports. The Inquiry Cycle is used in organizing their reports and presentations. Students use writing, graphing, and drawing software (such as ClarisWorks) for analyzing their data and preparing their reports.
Then, in a whole-class research symposium, they evaluate together the results of all the research groups and choose the "best" laws and models to explain their findings.

5. Apply. Once the class chooses the best laws and causal models, students try to apply them to different real-world situations. For instance, they might try to predict what happens when you hit a hockey puck on ice. As part of this process, they investigate the utility of their laws and models for predicting and explaining what would happen. They also investigate the limits of their models (such as, "What happens if the ice isn't perfectly smooth?"), which inevitably raises new research questions (such as, "What are the effects of friction?"). This brings the class back to the beginning of the Inquiry Cycle and to investigating the next research question in the curriculum.

The Inquiry Cycle is repeated with each of the seven modules of the curriculum. As the curriculum progresses, the physics the students are dealing with increases in complexity, and so does the inquiry. In the early stages of the curriculum, the inquiry process is heavily scaffolded. For example, in Module 1, students are given experiments to do and are presented with alternative possible laws to evaluate. In this way, they see examples of experiments and laws before they have to create their own. In Module 2, students are given experiments to do but have to construct the laws for themselves. Then, in Module 3, they design their own experiments and construct their own laws to characterize their findings (see Appendix A). By the end of the curriculum, the students are carrying out independent inquiry on a topic of their own choosing.

FACILITATING REFLECTIVE ASSESSMENT WITHIN THE CLASSROOM RESEARCH COMMUNITY

In addition to the Inquiry Cycle, which guides the students' research and helps them to understand what the research process is all about, we also developed a set of criteria for characterizing good scientific research. These are presented in Figure 2. They include goal-oriented criteria such as "Understanding the Science" and "Understanding the Processes of Inquiry," process-oriented criteria such as "Being Systematic" and "Reasoning Carefully," and socially-oriented criteria such as "Communicating Well" and "Teamwork." These characterizations of good work are used not only by the teachers in judging the students' research projects, but also by the students themselves.
At the beginning of the curriculum, the criteria are introduced and explained to the students as the "Guidelines for Judging Research" (see Figure 2). Then, at the end of each phase in the Inquiry Cycle, the students monitor their progress by evaluating their work on the two most relevant criteria. At the end of each module, they reflect on their work by evaluating themselves on all of the criteria. Similarly, when they present their research projects to the class, the students evaluate not only their own research projects but also each other's. They give each other feedback both verbally and in writing. These assessment criteria are thus used as a way of introducing students to the characteristics of good research and of helping them monitor and reflect on their inquiry processes.

In what follows, we present sample excerpts from a class's reflective assessment discussion. Students give oral presentations of their projects accompanied by a poster, and they answer questions about their research. Following each presentation, the teacher picks a few of the assessment criteria and asks students in the audience how they would rate the presentation. In these conversations, students are typically respectful of one another and generally give their peers high ratings (i.e., ratings of 3-5 on a 5-point scale). However, within the range of high scores that they use, they do make distinctions among the criteria and offer insightful evaluations of the projects that have been presented. The following presents some examples of such reflective assessment conversations. (Pseudonyms are used throughout, and the transcript has been lightly edited to improve its readability.)

Teacher: Ok, now what we are going to do is give them some feedback. What about their "understanding the process of inquiry"? In terms of their following the steps within the Inquiry Cycle, on a scale from 1 to 5, how would you score them? Vanessa.

Vanessa: I think I would give them a 5 because they followed everything.
First they figured out what they wanted to inquire, and then they made hypotheses, and then they figured out what kind of experiment to do, and then they tried the experiment, and then they figured out what the answer really was and that Jamal's hypothesis was correct.

Teacher: All right, in terms of their performance, "being inventive." Justin?

Justin: Being inventive. I gave them a 5 because they had completely different experiments than almost everyone else's I've seen. So, being inventive, they definitely were very inventive in their experimentation.

Teacher: Ok, good. What about "reasoning carefully?" Jamal, how would you evaluate yourself on that?

Jamal: I gave myself a 5, because I had to compute the dot prints between the experiments we did on mass. So, I had to compute everything. And, I double-checked all of my work.

Teacher: Great. Ok, in terms of the social context of work, "writing and communicating well." Carla, how did you score yourself in that area?

Carla: I gave myself a 4, because I always told Jamal what I thought was good or what I thought was bad, and if we should keep this part of our experiment or not. We would debate on it and finally come up with an answer.

Teacher: What about "teamwork?" Does anyone want to rate that? Teamwork. Nisha.

Nisha: I don't know if I can say because I didn't see them work. (laughter)

Teacher: That's fine. That's fair. You are being honest. Julia?

Julia: I gave them a 5 because they both talked in the presentation, and they worked together very well, and they looked out for each other.

There are various arguments for why incorporating such a Reflective-Assessment Process into the curriculum should be effective. One is the "transparent assessment" argument put forward by Frederiksen and Collins (1989; Frederiksen, 1994), who argue that introducing students to the criteria by which their work will be evaluated enables them to better understand the characteristics of good performance. In addition, there is the argument about the importance of metacognition put forward by researchers (e.g., Baird, Fensham, Gunstone, & White, 1991; Brown, 1984; Brown & Campione, 1996; Collins, Brown, & Newman, 1989; Miller, 1991; Reeve & Brown, 1985; Scardamalia & Bereiter, 1991; Schoenfeld, 1987; Schon, 1987; Towler & Broadfoot, 1992) who maintain that monitoring and reflecting on the process and products of one's own learning is crucial to successful learning, as well as to "learning how to learn." Research on good versus poor learners shows that many students, particularly lower-achieving students, have inadequate metacognitive processes, and their learning suffers accordingly (Campione, 1984; Chi et al., 1989). Thus, if such processes are introduced and supported in the curriculum, the students' learning and inquiry should be enhanced. Instructional trials of the ThinkerTools Inquiry Curriculum in urban classrooms (which included many lower-achieving students) provided an ideal opportunity to test these hypotheses concerning the utility of such a Reflective-Assessment Process.

INSTRUCTIONAL TRIALS OF THE THINKERTOOLS INQUIRY CURRICULUM

In 1994, we conducted instructional trials of the ThinkerTools Inquiry Curriculum. Three teachers used it in their twelve urban classes in grades 7-9. The average amount of time they spent on the curriculum was 10.5 weeks. Two of the teachers had no prior formal physics education. They were all teaching in urban settings in which their class sizes averaged almost thirty students, two thirds of whom were minority students, many from highly disadvantaged backgrounds. We analyzed the effects of the curriculum for students who varied in their degree of educational advantage, as measured by their standardized achievement test scores (CTBS -- the Comprehensive Test of Basic Skills). We compared the performance of these middle-school students with that of high-school physics students. We also carried out a controlled study comparing ThinkerTools classes in which students engaged in the Reflective-Assessment Process with matched "Control" Classes in which they did not. For each of the teachers, half of his or her classes were Reflective-Assessment Classes and the other half were Control Classes.
In the Reflective-Assessment Classes, the students were given the assessment framework (Figure 2), and they continually engaged in monitoring and evaluating their own and each other's research. In the Control Classes, the students were not given an explicit framework for reflecting on their research; instead, they engaged in alternative activities in which they commented on what they did and did not like about the curriculum. In all other respects, the classes participated in the same ThinkerTools inquiry-based science curriculum. There were no significant differences in students' average CTBS scores for the classes that were randomly assigned to the different treatments (reflective assessment vs. control), for the classes of the three different teachers, or for the different grade levels (7th, 8th, and 9th). Thus, the classes were all comparable with regard to achievement test scores.

AN OVERVIEW OF THE RESULTS

Our results show that the curriculum and software modeling tools make the difficult subject of physics understandable and interesting to a wide range of students. Further, the focus on creating models enables students to learn not only about physics, but also about the properties of scientific models and the inquiry processes needed to create them. In addition, engaging in inquiry improves students' attitudes toward learning and doing science (White & Frederiksen, 1998).

The Development of Inquiry Expertise

One of our assessments of students' scientific inquiry expertise was an inquiry test that was given both before and after the ThinkerTools Inquiry Curriculum. In this written test, the students were asked to investigate a specific research question: "What is the relationship between the weight of an object and the effect that sliding friction has on its motion?" The students were asked to come up with alternative, competing hypotheses with regard to this question. Next, they had to design on paper an experiment that would determine what actually happens, and then they had to pretend to carry out their experiment. In other words, they had to conduct it as a thought experiment and make up the data that they thought they would get if they actually carried out their experiment. Finally, they had to analyze their made-up data to reach a conclusion and relate this conclusion back to their original, competing hypotheses.
In scoring this test, the focus was entirely on the students' inquiry process. Whether or not the students' theories embodied the correct physics was regarded as totally irrelevant. Figure 5 presents the gain scores on this test for both low- and high-achieving students, and for students in the Reflective-Assessment and Control Classes. Notice, first, that students in the Reflective-Assessment Classes gained more on this inquiry test. Second, notice that this was particularly true for the low-achieving students. This is the first piece of evidence that the metacognitive Reflective-Assessment Process is beneficial, particularly for academically disadvantaged students.

Figure 5. The mean gain scores on the Inquiry Test for students in the Reflective-Assessment and Control Classes, plotted as a function of their achievement level.

If we examine this finding in more detail by looking at the gain scores for each component of the inquiry test, as shown in Figure 6, we can see that the effect of Reflective Assessment is greatest for the more difficult aspects of the test: making up results, analyzing those made-up results, and relating them back to the original hypotheses. In fact, the largest difference in the gain scores is for a measure we call "coherence," which measures the extent to which the experiments that the students designed address their hypotheses, their made-up results relate to their experiments, their conclusions follow from their results, and their conclusions are related back to their original hypotheses. This kind of overall coherence in research is, we think, a very important indication of sophistication in inquiry. It is on this coherence measure that we see the greatest difference in favor of students who engaged in the metacognitive Reflective-Assessment Process.

Figure 6. Mean gains on the Inquiry Test subscores for students in the Reflective-Assessment and Control Classes.

Next, we turn to the results from the students' research projects. Students carried out two research projects, one about halfway through the curriculum and one at the end. For the sake of brevity, we added the scores for these two projects together, as shown in Figure 7. These results indicate that students in the Reflective-Assessment Classes do significantly better on their research projects than students in the Control Classes. The results also show that the Reflective-Assessment Process is particularly beneficial for the low-achieving students: low-achieving students in the Reflective-Assessment Classes perform almost as well as the high-achieving students. These findings were the same across all three teachers and all three grade levels.

Figure 7. The mean overall scores on their research projects for students in the Reflective-Assessment and Control Classes, plotted as a function of their achievement level.

The Development of Physics Expertise

We now summarize the results from the point of view of the students' understanding of physics. We gave the students a general force-and-motion physics test, both before and after the ThinkerTools curriculum, that includes items in which students are asked to predict and explain how forces will affect an object's motion (such as the item shown in Figure 8). On this test we found significant pre-test to post-test gains. We also found that our middle-school ThinkerTools students do better on such items than do high-school physics students who are taught using traditional approaches. Furthermore, when we analyzed the effects of the curriculum on items that represent near or far transfer in relation to contexts they had studied in the course, we found significant learning effects for both the near and far transfer items. Together, these results show that sophisticated physics can be taught in urban, middle-school classrooms when one makes use of simulation tools combined with scaffolding of the inquiry process. In general, this inquiry-oriented, constructivist approach appears to make physics interesting and accessible to a wider range of students than is possible with traditional approaches (White, 1993a; White & Frederiksen, 1998; White & Horwitz, 1988).

Circle the path the ball would take as it falls to the ground.

Explain the reasons for your choice:

Figure 8. A sample problem from the physics test.

On a set of such items, the ThinkerTools students averaged 68% correct and significantly outperformed the high-school physics students, who averaged 50% correct (t(343) = 4.59, p < .001). What is the effect of the Reflective-Assessment Process on the learning of physics? The assessment criteria were chosen to address principally the process of inquiry and only indirectly the conceptual model of force and motion that students are attempting to construct in their research. Moreover, within the curriculum, students practice Reflective Assessment primarily in the context of judging their own and others' work on projects, not their progress in solving physics problems. Nonetheless, our hypothesis is that Reflective Assessment should influence students' success in developing conceptual models for the physical phenomena they have studied, through its effect in improving the learning of inquiry skills that are instrumental in developing an understanding of physics principles. To evaluate the effects of Reflective Assessment on students' conceptual model for force and motion, we developed a Conceptual Model Test. Our findings, presented in Figure 9, show that the effects of Reflective Assessment extend to students' learning of the science content as well as to their learning of the processes of scientific inquiry, and that the benefits of Reflective Assessment are again greatest for the academically disadvantaged students.

Figure 9. The mean scores on the Conceptual Model Test for students in the Reflective-Assessment and Control Classes, plotted as a function of their achievement level.

The Impact of Understanding the Reflective Assessment Criteria

If we are to attribute these effects of introducing Reflective Assessment to students' developing metacognitive competence, we need to show that the students developed an understanding of the assessment criteria and could use them to describe multiple aspects of their work. One way to evaluate their understanding of the assessment concepts is to compare their use of the criteria in rating their own work with the teachers' evaluation of their work using the same criteria. If students have learned how to use the criteria, their self-assessment ratings should correlate with the teachers' ratings for each of the criteria. We found that students in the Reflective-Assessment Classes, who worked with the criteria throughout the curriculum, showed significant agreement with the teachers in judging their work, while this was not the case for students in the Control Classes, who were given the criteria only at the end of the curriculum for judging their final projects. For example, in judging Reasoning Carefully, students who worked with the assessment criteria throughout the curriculum had a correlation of .58 between their ratings of Reasoning Carefully on their final projects and the teachers' ratings. The average correlation for these students over all of the criteria was .48, which is twice that for students in the Control Classes.

If the Reflective-Assessment criteria are acting as metacognitive tools to help students as they ponder the functions and outcomes of their inquiry processes, then the students' performance in developing their inquiry projects should depend upon how well they have understood the assessment concepts. To evaluate their understanding, we rated whether the evidence they cited in justifying their self-assessments was or was not relevant to the particular criterion they were considering. We then looked at the quality of the students' final projects, comparing students who had developed an understanding of the set of assessment concepts by the end of the curriculum with those who had not. Our results, shown in Figure 10, indicate that students who had learned to use the interpretive concepts appropriately in judging their work produced higher quality projects than students who had not. And again, we found that the benefit of learning to use the assessment criteria was greatest for the low-achieving students.
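The agreement statistic used in this section is a Pearson correlation between students' self-assessment ratings and the teachers' ratings on the same criterion. The following sketch shows how such a correlation is computed; the eight-student ratings are made up for illustration and are not the study's data.

```python
# Computing the Pearson correlation between two sets of ratings,
# as used above to measure student-teacher agreement. The ratings
# below are hypothetical, not data from the study.

from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical 5-point ratings of "Reasoning Carefully" for eight
# students: their self-assessments vs. the teacher's ratings.
self_ratings    = [5, 4, 5, 3, 4, 5, 3, 4]
teacher_ratings = [4, 4, 5, 3, 3, 5, 2, 4]
print(round(pearson_r(self_ratings, teacher_ratings), 2))
```

A correlation near 1 indicates that students rank their work much as the teacher does; a correlation near 0 indicates that the self-ratings carry little information about the teacher's judgments, which is the contrast drawn above between the Reflective-Assessment and Control Classes.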

Figure 10. The mean scores on their Final Projects for students who did and did not provide relevant evidence when justifying their self-assessment scores, plotted as a function of their achievement level.

Taken together, these research findings clearly implicate the use of the assessment criteria as a reflective tool for learning to carry out inquiry. Students in the Reflective-Assessment Classes generated higher-scoring research reports than those in the Control Classes. Further, students who showed a clear understanding of the criteria produced higher quality investigations than those who showed less understanding. Thus, there are strong beneficial effects of introducing a metacognitive language to facilitate students' reflective explorations of their work in classroom conversations and in self-assessment.

An important finding was that the beneficial effect of Reflective Assessment was particularly strong for the lower-achieving students: the Reflective-Assessment Process enabled the lower-achieving students to gain more on the inquiry test (see Figure 5). It also enabled them to perform close to the higher-achieving students on their research projects (see Figure 7). The introduction of Reflective Assessment, while helpful to all, was thus closing the performance gap between the lower- and higher-achieving students. In fact, the Reflective-Assessment Process enabled lower-achieving students to perform equivalently to higher-achieving students on their research projects when they did their research in collaboration with a higher-achieving student. In the Control Classes, in contrast, the lower-achieving students did not do as well as the high-achieving students, regardless of whether or not they collaborated with a higher-achieving student. Thus, there was evidence that social interactions in the Reflective-Assessment Classes -- particularly those between lower- and higher-achieving students -- were important in facilitating learning (cf. Carter & Jones, 1994; Slavin, 1995; Vygotsky, 1978).

THE IMPLICATIONS OF OUR FINDINGS

We think that our findings have strong implications for what such inquiry-oriented, metacognitively focused curricula can accomplish in an urban school setting.
In particular, we argue that three important conclusions follow from our work: (1) to be equitable, science curricula should incorporate reflective inquiry, and assessments of students' learning should include measures of inquiry expertise; (2) students should learn how to transfer the inquiry and reflective assessment processes to other domains so that they "learn how to learn" and can utilize these valuable metacognitive skills in their learning of other school subjects; and (3) such an inquiry-oriented approach to education, in which the development of metacognitive knowledge and skills plays a central role, should be introduced early in the school curriculum (i.e., at the elementary school level).

1. Science curricula should incorporate inquiry and include assessments of students' inquiry expertise. Our results suggest that, from an equity standpoint, curricular approaches can be created that are not merely of equal value for, but actually enhance, the learning of less-advantaged students. Furthermore, to adequately and fairly assess the effectiveness of such curricula, one needs to utilize measures of inquiry expertise, such as our inquiry tests and research projects. If only subject-matter tests are used, the results can be biased against both low-achieving students and female students. For instance, on the research projects, we found that low-achieving students who had the benefit of the Reflective-Assessment Process did almost as well as the high-achieving students. And these results could not be attributed simply to ceiling effects. We also found that the male and female students did equally well on the inquiry tests and research projects. On the physics tests, however, the pattern of results was not comparable: males outperformed females (on both pretests and posttests), and the high-achieving students outperformed the low-achieving students (White & Frederiksen, 1998). Thus, utilizing inquiry tests and research projects in addition to subject-matter tests not only played a valuable role in facilitating the development of inquiry skills, it also produced a more comprehensive and equitable assessment of students' accomplishments in learning science.

2. It is desirable to help students transfer what they learn about inquiry and reflection to the rest of their school curriculum. Students' work in the ThinkerTools Inquiry Curriculum and their performance on the various inquiry assessments indicate that they acquired an understanding of the Inquiry Cycle as well as the knowledge needed to carry out each of the steps in this cycle. They also learned about the forms that scientific laws, models, and theories can take, and about how the development of scientific theories is related to empirical evidence. In addition, they acquired the metacognitive skills of monitoring and reflecting on their inquiry processes. Since all of science can be viewed as a process of constructing models and theories, both the Inquiry Cycle and the Reflective-Assessment Process can be applied to learning and doing all areas of science, not just physics. Thus, understanding and engaging in the Inquiry Cycle and the Reflective-Assessment Process should benefit students in their future science courses. We see evidence of these benefits and of transfer in the subsequent work of ThinkerTools students. For example, 8th-grade students, who did ThinkerTools in the 7th grade, were asked to do research projects that used the Inquiry Cycle. They were free to choose topics other than physics; for instance, one group of students wanted to understand how listening to music affects one's performance on schoolwork. They did an experiment in which their classmates listened to different kinds of music while taking an arithmetic test. They wrote research reports that described how they followed the Inquiry Cycle in planning and carrying out their research, and they evaluated their own and each other's research using the scoring criteria shown in Figure 2. Their teacher reports that their performance on these projects was equal to or better than their performance on their ThinkerTools physics projects. Further, at the end of the curriculum, some students were interviewed and asked if the Inquiry Cycle and Reflective-Assessment Process could be used to help them learn other subjects.
Many of their answers involved highly creative explanations of how these processes could be applied to domains such as history, mathematics, and English, as well as to other areas of science. With regard to the teachers, all of them attest to the benefits of both the Inquiry Cycle and the Reflective Assessment Process, and they have chosen to incorporate them into the other science courses that they teach. In order to make the valuable skills of inquiry,
modeling, and reflection apply to other experimental sciences, such as biology, as well as to the learning of nonscience subjects, various approaches could be pursued. For instance, students could be introduced to a generalized version of the Inquiry Cycle (such as Question, Hypothesize, Investigate, Analyze, Model, and Evaluate, a minor transformation of the more experimentally oriented Inquiry Cycle that students internalize during the ThinkerTools Inquiry Curriculum). This generalization could give students a metacognitive view of learning and inquiry that can be applied to any topic in which building predictive/explanatory models can become the focus. In addition, the students could discuss how the Reflective Assessment Process, which uses the criteria shown in Figure 2 (such as Making Connections, Reasoning Carefully, and Communicating Well), can readily be generalized to learning other science topics as well as to learning in general. Having such explicit discussions of transfer, in conjunction with explicitly using versions of the Inquiry Cycle and Reflective Assessment Process in their science and other curricula, should enable students and teachers to appreciate and benefit from the power of metacognition. Investigating how such generalization and transfer can be achieved will be a major focus of our future research (e.g., White, Shimoda, & Frederiksen, in press).

3. It is important to introduce inquiry-based learning and metacognition early in the school curriculum.

Another major implication of our research is that inquiry and reflective assessment should be taught early. This would enable young students to develop metacognitive skills that are important components of expertise in learning. These skills should help low-achieving students to overcome their educational disadvantages. Our results suggest that inquiry-based science could and should be introduced in the early grades.
Students over a range of grades showed equal degrees of learning using the inquiry curriculum: We found no age differences in students' pretest or posttest scores on the inquiry test over grades ranging from grade 7 to grade 9, nor did we find any age differences in students' gains on the physics tests. In addition, our pretests showed only small gender differences in science content knowledge in the 7th
grade, but these gender differences were much larger in the later grades (White, 1993a; White & Frederiksen, 1998). Together, these results suggest that inquiry-based science could be introduced in earlier grades (see also Metz, 1995), and that doing so in these grades (before gender differences in science content knowledge have developed) might help to eliminate the gender differences that develop in knowledge of and interest in science. We plan to extend our work on the ThinkerTools Inquiry Project to investigate how inquiry, modeling, and metacognition can be taught and assessed in earlier grades. To facilitate this investigation, we are developing a constructivist curriculum, similar to the existing ThinkerTools Inquiry Curriculum, in which low-achieving students work in partnership with high-achieving students to plan, carry out, and critically evaluate research. The subject matter is again the physics of force and motion, which we conjecture is appropriate for elementary school students, who are often involved in sports and focused on the physical world in general. We further propose to help students learn to transfer the inquiry and metacognitive skills which they acquire in this domain to their other school subjects (as outlined above). Our hope is that such an inquiry-oriented curriculum, which focuses on the development of metacognitive skills, will enable all students to "learn how to learn" at an early age and will thereby help underprivileged students to overcome their educational disadvantages.

METACOGNITIVE FACILITATION FOR TEACHERS

We conclude by addressing the question: How can we enable teachers to implement such inquiry-oriented approaches to education?
Our research in which we studied the dissemination of the ThinkerTools Inquiry Curriculum indicates that it is not sufficient to simply provide teachers with teacher's guides that attempt to outline goals, describe activities, and suggest, in a semi-procedural fashion, how the lessons might proceed (White & Frederiksen, 1998). We have found that teachers also need to develop a conceptual framework for characterizing good inquiry teaching and for reflecting on their teaching practices in the same way that students need
to develop criteria for characterizing good scientific research and for reflecting on their inquiry processes. To achieve this goal, we utilized a framework that we developed for the National Board for Professional Teaching Standards (Frederiksen, Sipusic, Sherin, & Wolfe, 1997). This framework, which attempts to characterize expert teaching, includes five major criteria: worthwhile engagement, adept classroom management, effective pedagogy, good classroom climate, and explicit thinking about the subject matter, to which we added a sixth, active inquiry. In this characterization of expert teaching, each of these criteria for good teaching is unpacked into a set of "aspects." For example, Figure 11 illustrates the criterion of "classroom climate," which is defined as "the social environment of the class empowers learning." Under this general criterion, there are five different aspects: engagement, encouragement, rapport, respect, and sensitivity to diversity. Each of these aspects is defined in terms of specific characteristics of classroom practice, such as "humor is used effectively" or "there is a strong connection between students and teacher." Further, each of these specific characteristics of classroom practice is indexed to video clips, called "video snippets," which illustrate it. This framework characterizes good inquiry teaching and provides teachers with video exemplars of teaching practice.
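The hierarchy this framework describes (criterion, then aspects, then specific characteristics of practice, each indexed to video snippets) can be sketched as nested data. In the sketch below, the criterion definition, aspect names, and the two example characteristics come from the text; the snippet identifiers, and the placement of those characteristics under the rapport aspect, are invented placeholders rather than actual Video Portfolio data:

```python
# Sketch of the framework's hierarchy for one criterion ("classroom
# climate"). Each aspect maps characteristics of classroom practice to
# the video snippets that exemplify them. Snippet IDs and the placement
# of the example characteristics under "rapport" are invented.

classroom_climate = {
    "definition": "the social environment of the class empowers learning",
    "aspects": {
        "engagement": {},
        "encouragement": {},
        "rapport": {
            "humor is used effectively": ["snippet-012"],
            "there is a strong connection between students and teacher":
                ["snippet-017", "snippet-031"],
        },
        "respect": {},
        "sensitivity to diversity": {},
    },
}

# Look up the exemplar clips indexed to one characteristic of one aspect:
clips = classroom_climate["aspects"]["rapport"][
    "there is a strong connection between students and teacher"]
print(clips)  # ['snippet-017', 'snippet-031']
```

A nested mapping like this mirrors how Figure 11 "unpacks" a single criterion, and how a teacher or scorer would navigate from a criterion down to concrete video evidence.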

Figure 11. An example of the hierarchical definitions created for each criterion, such as classroom climate, which are used to characterize expert teaching.

Such materials can be used to enable teachers to learn about inquiry teaching and its value, as well as to reflect on their own and each other's teaching practices. For example, we recently tried the following approach with a group of ten student teachers. The student teachers first learned to use the framework outlined above by scoring videotapes of ThinkerTools classrooms. Then they used the framework to facilitate discussions of videotapes of their own teaching. In this way, they participated in what we call "video clubs," which enabled them to reflect on their own teaching practices and, we hoped, to develop better approaches for inquiry teaching. (Video clubs incorporate social activities designed to help teachers reflectively assess and talk about their teaching practices [Frederiksen et al., 1997].) The results have been very encouraging: our findings indicate that engaging in this reflective activity enabled the student teachers to develop a shared language for viewing and talking about teaching which, in turn, led to highly productive conversations in which they explored and reflected on their own teaching practices (Diston, 1997; Frederiksen & White, 1997; Richards & Colety, 1997). We conclude by arguing that the same emphasis on metacognitive facilitation that we have shown to be important and effective for students is beneficial for teachers as well. It can enable teachers to explore the cognitive and social goals related to inquiry teaching and thereby to improve their own teaching practices. Through this approach, both students and teachers can come to understand the goals and processes related to inquiry, and can learn how to engage in effective inquiry learning and teaching.

ACKNOWLEDGMENTS

We gratefully acknowledge the support of our sponsors: the James S. McDonnell Foundation, the National Science Foundation, and the Educational Testing Service.
The ThinkerTools Inquiry Project is a collaborative endeavor involving researchers at UC Berkeley and ETS and middle school teachers in the Berkeley and Oakland public schools. We would like to thank all members of the team for their valuable
contributions to this work.

REFERENCES

Baird, J., Fensham, P., Gunstone, R., & White, R. (1991). The importance of reflection in improving science teaching and learning. Journal of Research in Science Teaching, 28(2), 163-182.
Brown, A. (1984). Metacognition, executive control, self-regulation, and other more mysterious mechanisms. In F. Weinert & R. Kluwe (Eds.), Metacognition, Motivation, and Learning (pp. 60-108). West Germany: Kuhlhammer.
Brown, A., & Campione, J. (1996). Psychological theory and the design of innovative learning environments: On procedures, principles, and systems. In L. Schauble & R. Glaser (Eds.), Innovations in Learning: New Environments for Education (pp. 289-325). Mahwah, NJ: Erlbaum.
Brown, J., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18, 32-42.
Bruer, J. (1993). Schools for Thought: A Science of Learning in the Classroom. Cambridge, MA: MIT Press.
Campione, J. (1984). Metacognitive components of instructional research with problem learners. In F. Weinert & R. Kluwe (Eds.), Metacognition, Motivation, and Learning (pp. 109-132). West Germany: Kuhlhammer.
Carter, G., & Jones, M. (1994). Relationship between ability-paired interactions and the development of fifth graders' concepts of balance. Journal of Research in Science Teaching, 31(8), 847-856.
Chi, M., Bassock, M., Lewis, M., Reimann, P., & Glaser, R. (1989). Self-explanations: How students study and use examples in learning to solve problems. Cognitive Science, 13, 145-182.
Collins, A., Brown, J., & Newman, S. (1989). Cognitive apprenticeship: Teaching the craft of reading, writing, and mathematics. In L. Resnick (Ed.), Knowing, Learning, and Instruction: Essays in Honor of Robert Glaser (pp. 453-494). Mahwah, NJ: Erlbaum.
Collins, A., & Ferguson, W. (1993). Epistemic forms and epistemic games: Structures and strategies to guide inquiry. Educational Psychologist, 28, 25-42.
Diston, J. (1997). Seeing teaching in video: Using an interpretative video framework to broaden preservice teacher development. Unpublished master's project, Graduate School of Education, University of California, Berkeley, CA.
Frederiksen, J. (1994). Assessment as an agent of educational reform. The Educator, 8(2), 2-7.
Frederiksen, J., & Collins, A. (1989). A systems approach to educational testing. Educational Researcher, 18(9), 27-32.
Frederiksen, J., Sipusic, M., Sherin, M., & Wolfe, E. (1997). Video portfolio assessment: Creating a framework for viewing the functions of teaching (Technical Report of the Cognitive Science Research Group). Oakland, CA: Educational Testing Service.
Frederiksen, J. R., & White, B. Y. (1997). Cognitive facilitation: A method for promoting reflective collaboration. In Proceedings of the Second International Conference on Computer Support for Collaborative Learning. Mahwah, NJ: Erlbaum.
Hatano, G., & Inagaki, K. (1991). Sharing cognition through collective comprehension activity. In L. Resnick, J. Levine, & S. Teasley (Eds.), Perspectives on Socially Shared Cognition (pp. 331-348). Washington, DC: American Psychological Association.
Metz, K. (1995). Reassessment of developmental constraints on children's science instruction. Review of Educational Research, 65(2), 93-127.
Miller, M. (1991). Self-assessment as a specific strategy for teaching the gifted learning disabled. Journal for the Education of the Gifted, 14(2), 178-188.
Nickerson, R., Perkins, D., & Smith, E. (1985). The Teaching of Thinking. Mahwah, NJ: Erlbaum.
Palincsar, A., & Brown, A. (1984). Reciprocal teaching of comprehension-fostering and monitoring activities. Cognition and Instruction, 1(2), 117-175.
Reeve, R. A., & Brown, A. L. (1985). Metacognition reconsidered: Implications for intervention research. Journal of Abnormal Child Psychology, 13(3), 343-356.
Resnick, L. (1987). Education and Learning to Think. Washington, DC: National Academy Press.
Richards, S., & Colety, B. (1997). Conversational analysis of the MACSME video analysis class: Impact on and recommendations for the MACSME program. Unpublished master's project, Graduate School of Education, University of California, Berkeley, CA.
Scardamalia, M., & Bereiter, C. (1991). Higher levels of agency for children in knowledge building: A challenge for the design of new knowledge media. The Journal of the Learning Sciences, 1(1), 37-68.
Schoenfeld, A. H. (1987). What's all the fuss about metacognition? In A. H. Schoenfeld (Ed.), Cognitive Science and Mathematics Education (pp. 189-215). Mahwah, NJ: Erlbaum.
Schon, D. (1987). Educating the Reflective Practitioner. San Francisco, CA: Jossey-Bass.
Slavin, R. (1995). Cooperative Learning: Theory, Research, and Practice (2nd ed.). Needham Heights, MA: Allyn and Bacon.
Towler, L., & Broadfoot, P. (1992). Self-assessment in primary school. Educational Review, 44(2), 137-151.
Vygotsky, L. (1978). Mind in Society: The Development of Higher Psychological Processes (M. Cole, V. John-Steiner, S. Scribner, & E. Souberman, Eds. & Trans.). Cambridge, MA: Harvard University Press.
White, B. (1993a). ThinkerTools: Causal models, conceptual change, and science education. Cognition and Instruction, 10(1), 1-100.
White, B. (1993b). Intermediate causal models: A missing link for successful science education? In R. Glaser (Ed.), Advances in Instructional Psychology (Vol. 4, pp. 177-252). Mahwah, NJ: Erlbaum.
White, B., & Frederiksen, J. (1998). Inquiry, modeling, and metacognition: Making science accessible to all students. Cognition and Instruction, 16(1), 3-117.
White, B., & Horwitz, P. (1988). Computer microworlds and conceptual change: A new approach to science education. In P. Ramsden (Ed.), Improving Learning: New Perspectives. London: Kogan Page.
White, B., Shimoda, T., & Frederiksen, J. (in press). Constructing a theory of mind and society: ThinkerTools that support students' metacognitive and metasocial development. In S. Lajoie (Ed.), Computers as Cognitive Tools: The Next Generation. Mahwah, NJ: Erlbaum.

APPENDIX A

This appendix contains the following:
o The outline for research reports that is given to students.
o An example of a student's research report and her self-assessment.

An Outline and Checklist for Your Research Reports
 

Question:
o Clearly state the research question.

Predict:
o What hypotheses did you have about possible answers to the question?
 Explain the reasoning behind each of your hypotheses.

Experiment:
o Describe your computer experiment(s).
 Draw a sketch of your computer model.
 Describe how you used it to carry out your experiment(s).
o Show your data in tables, graphs, or some other representation.
o Describe your real-world experiment(s).
 Draw a sketch of how you set up the lab equipment.
 Describe how you used the equipment to carry out your experiment(s).
o Show your data in tables, graphs, or some other representation.

Model:
o Describe how you analyzed your data and show your work.
o Summarize your conclusions.
 Which of your hypotheses does your data support?
 State any laws that you discovered.
o What is your theory about why this happens?

Apply:
o Show how what you learned could be useful.
 Give some examples.
o What are the limitations of your investigation?
 What remains to be learned about the relationship between the mass of an object and how forces affect its motion?
 What further investigations would you do if you had more time?

An example research report about mass and motion written by a 7th grade student (age 12)

During the past few weeks, my partner and I have been creating and doing experiments and making observations about mass and motion. We had a specific question that we wanted to answer -- how does the mass of a ball affect its speed? I made some predictions about what would happen in our experiments. I thought that if we had two balls of
different masses, the ball with the larger mass would travel faster, because it has more weight to roll forward with, which would help push it. We did two types of experiments to help us answer our research question -- computer and real world. For the computer experiment, we had a ball with a mass of 4 and a ball with a mass of 1. In the real world they are pretty much equal to a billiard ball and a racquetball. We gave each of the balls 5 impulses, and let them go. Each of the balls left dotprints, that showed how far they went for each time step. The ball with the mass of 4 went at a rate of 1.25 cm per time step. The ball with the mass of 1 went at a rate of 5 cm per time step, which was much faster. For the real world experiment, we took a billiard ball (with a mass of 166 gms) and a racquetball (with a mass of 40 gms). We bonked them once with a rubber mallet on a linoleum floor, and timed how long it took them to go 100 cm. We repeated each experiment 3 times and then averaged out the results, so our data could be more accurate. The results of the two balls were similar. The racquetball's average velocity was 200 cm per second, and the billiard ball's was 185.1 cm per second. That is not a very significant difference, because the billiard ball is about 4.25 times more massive than the racquetball. We analyzed our data carefully. We compared the velocities, etc. of the lighter and heavier balls. For the computer experiment, we saw that the distance per time step increased by 4 (from 1.25 cm to 5 cm) when the mass of the ball decreased by 4 (from 4 to 1). This shows a direct relationship between mass and speed. It was very hard to analyze the data from our real world experiment. One reason is that it varies a lot for each trial that we did, so it is hard to know if the conclusions we make will be accurate. We did discover that the racquetball, which was lighter, traveled faster than the billiard ball, which was heavier. 
Our data doesn't support my hypothesis about mass and speed. I thought that the heavier ball would travel faster, but the lighter one always did. I did make some conclusions. From the real world experiment I concluded that the surface of a ball plays a role in how fast it travels. This is one of the reasons that the two balls had similar velocities in our real world
experiment. (The other reason was being inaccurate). The racquetball's surface is rubbery and made to respond to a bonk and the billiard ball's surface is slippery and often makes it roll to one side. This made the balls travel under different circumstances, which had an effect on our results. From the computer experiment I concluded that a ball with a smaller mass goes as many times faster than a ball with a larger mass as it is lighter than it. This happens because there is a direct relationship between mass and speed. For example, if you increase the mass of a ball then the speed it travels at will decrease. I concluded in general, of course, that if you have two balls with different masses that the lighter one will go faster when bonked, pushed, etc. This is because the ball doesn't have as much mass holding it down. The conclusions from our experiments could be useful in real world experiences. If you were playing baseball and you got to choose what ball to use, you would probably choose one with a rubbery surface that can be gripped, over a slippery, plastic ball. You know that the type of surface that a ball has effects how it responds to a hit. If you were trying to play catch with someone you would want to use a tennis ball rather than a billiard ball, because you know that balls with smaller masses travel faster and farther. The investigations that we did do have limitations. In the real world experiments the bonks that we gave the balls could have been different sizes, depending on who bonked the ball. This would affect our results and our conclusions. The experiment didn't show us how fast balls of different masses and similar surfaces travel in the real world. That is something we still can learn about. If there was more time, I would take two balls of different masses with the same kind of surface and figure out their velocities after going 100 cm. Overall, our experiments were worthwhile. 
They proved an important point about how mass affects the velocity of a ball. I liked being able to come up with my own experiments and carrying them out.
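The computer-experiment numbers in this report (five impulses; 1.25 cm per time step for the mass-4 ball versus 5 cm per time step for the mass-1 ball) are consistent with a simple frictionless impulse model: each impulse changes a ball's velocity by impulse/mass, and the ball then coasts, leaving a dot-print each time step. The sketch below is an illustrative reconstruction of that model, not the actual ThinkerTools software; the `Ball` class and the impulse size of 1 are assumptions chosen to reproduce the report's figures.

```python
# Illustrative sketch of a frictionless impulse model consistent with the
# student's computer experiment. The Ball class and impulse size of 1 are
# assumptions, not actual ThinkerTools code.

class Ball:
    def __init__(self, mass):
        self.mass = mass
        self.velocity = 0.0  # cm per time step
        self.position = 0.0  # cm

    def apply_impulse(self, impulse=1.0):
        # An impulse changes velocity by impulse / mass.
        self.velocity += impulse / self.mass

    def step(self):
        # One time step of frictionless motion; returns the dot-print position.
        self.position += self.velocity
        return self.position

heavy, light = Ball(mass=4), Ball(mass=1)
for ball in (heavy, light):
    for _ in range(5):       # five impulses, as in the report
        ball.apply_impulse()

print(heavy.velocity)  # 1.25 cm per time step, as in the report
print(light.velocity)  # 5.0 cm per time step
```

Under this model the mass-1 ball ends up exactly four times faster than the mass-4 ball, which is the inverse relationship between mass and speed that the student inferred from the dot-prints.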

An example of a self assessment written by the student who wrote the preceding research report (age 12)

UNDERSTANDING

Justify your score based on your work. I have a basically clear understanding of how mass affects the motion of a ball in general, but I don't have a completely clear sense of what would happen if friction, etc. was taken into account.

Justify your score based on your work. I used the inquiry cycle a lot in my write up, but not as much while I was carrying out my experiments.

Justify your score based on your work. I made some references to the real world, but I haven't fully made the connection to everyday life.

PERFORMANCE: DOING SCIENCE

Justify your score based on your work. What I did was original, but many other people were original and did the same (or similar) experiment as us.

Justify your score based on your work. On the whole I was organized, but if I had been more precise my results would have been a little more accurate.

Justify your score based on your work. I used many of the tools I had to choose from. I used them in the correct way to get results.

Justify your score based on your work. I took into account the surfaces of the balls in my results, but I didn't always reason carefully. I had to ask for help, but I did compute out our results mathematically.

SOCIAL CONTEXT OF WORK

Justify your score based on your work. I understand the science, but in my writing and comments I might have been unclear to others.

Justify your score based on your work. We got along fairly well and had a good project as a result. However, we had a few arguments.

REFLECTION

How well do you think you evaluated your work using this scorecard? I think I judged myself fairly - not too high or too low. I didn't always refer back to specific parts of my work to justify my score.

copyright 2000 ThinkerTools project
Source: http://thinkertools.org/Pages/paper.html (retrieved 16 October 2013)
