NSF awards nearly $1 million for IU project to investigate how students learn science
FOR IMMEDIATE RELEASE
Sept. 14, 2011
BLOOMINGTON, Ind. -- The National Science Foundation has awarded $951,997 to the Indiana University School of Education for a project, led by Learning Sciences faculty member Nathaniel J.S. Brown, that will give educators more accurate and precise measurements of how students are learning science.
Brown is the principal investigator for "Psychometric and Growth Modeling of Complex Patterns of Learning Resulting from the Interrelationships between Multiple Learning Progressions." One research goal is to develop measures of student learning that work more like measures on a physical scale, where every measurement interval has the same width.
"The distance between 2 centimeters and 3 centimeters is the same as the distance between 15 centimeters and 16 centimeters," Brown said. "To take those standards of measurement and to translate them into the social sciences has been one of the big pushes and developments of the last 50 to 100 years."
Several education organizations have pushed psychometricians -- researchers in the field that studies the construction of educational measurements -- for more sophisticated measures of learning. Among them is the National Research Council, which noted in its 2007 report "Taking Science to School" that learning assessment needs to accurately reflect research on student thinking and be informed by the structure of the subject matter.
Brown will conduct the three-year study with colleagues at the Lawrence Hall of Science at the University of California, Berkeley, a public science center and the leading provider of science education outreach programs in Northern California. Seth Corrigan leads development of student assessments and research measures, and Suzy Loper is the lead science curriculum specialist for the institution's "Seeds of Science/Roots of Reading" project.
The study will examine potential clues to developing "learning progressions," measures of student learning that give evidence about what a student knows in a science subject, something a typical grading scale can't provide. "A kid gets a 97 on a test, a kid gets a 37 on a test; you know that number is high or low, but that's really all that it tells you," Brown said. "It doesn't tell you anything about what that student knows. It doesn't tell you anything about what you should do next with that student."
This study will investigate ways to develop an assessment that could assign meaning to points along a measurement scale. To do that, Brown said, the researchers will take into account the complex patterns of learning through which students gain knowledge. For example, young students often equate the concept of density with mass and volume. From a teaching and learning standpoint, Brown said, it's important to understand when students move beyond such misconceptions to a more sophisticated, multifaceted understanding of a scientific concept.
Similarly, some students learn a lot of content but then stall before reaching the next level of learning because they are still developing their reasoning ability. "If you're not aware of what's going on with the other learning dimensions, it looks like the student makes a lot of progress and then they just sort of sit there for a long time," Brown said. "And that pattern of learning may be because there are multiple learning progressions -- while you're just sitting here, you're actually making progress on this other one. So again, how do you model that?"
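To make the idea concrete, consider a rough sketch of the pattern Brown describes: a simple two-dimensional Rasch-style simulation, using entirely hypothetical item difficulties and ability values rather than the project's actual models or data. Viewed through the content dimension alone, the student's expected score plateaus after the second time point; modeling a second, reasoning dimension shows that progress is still being made.

```python
# Hypothetical sketch (not the project's actual models or data): two Rasch-style
# dimensions, "content" and "reasoning", for one student at three time points.
# The content score plateaus after time 2, so a unidimensional view suggests the
# student has stalled, while the reasoning dimension shows continuing growth.
import numpy as np

def p_correct(ability, difficulty):
    """Rasch model: probability of a correct response, given ability and item difficulty (in logits)."""
    return 1.0 / (1.0 + np.exp(-(ability - difficulty)))

# Hypothetical item difficulties (logits) for each dimension.
content_items = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
reasoning_items = np.array([0.0, 0.5, 1.0, 1.5, 2.0])

# Hypothetical abilities at three time points: content growth levels off after
# time 2, while reasoning ability keeps developing.
content_ability = [-1.0, 1.0, 1.1]
reasoning_ability = [-1.0, -0.5, 1.0]

for t, (a_c, a_r) in enumerate(zip(content_ability, reasoning_ability), start=1):
    exp_content = p_correct(a_c, content_items).sum()
    exp_reasoning = p_correct(a_r, reasoning_items).sum()
    print(f"time {t}: expected content score {exp_content:.1f}/5, "
          f"expected reasoning score {exp_reasoning:.1f}/5")
```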
The project will begin by examining existing research data on elementary, middle school, and high school students to develop measurement models. Next year, the project team will administer assessments to students across those grade levels and apply the models to test whether they are consistent with the newly gathered data. Brown and his team will then analyze and characterize the learning progressions reflected in that student data.
"The result will be a set of models other curriculum designers could use to develop more sophisticated psychometric and growth models," Brown said. "With these, the learning progressions they're using to align their assessments and their curriculum and instruction can reflect the complexity of the learning that's actually happening, not an assumption of constant linear growth along independent dimensions."