PARCC recently released some sample computer-based test items for ELA and high school mathematics, so I thought I would check them out since NY state is still officially planning (eventually?) to use the PARCC assessments.
First, some kudos to the team that created these assessment questions. In general, I found that the questions look for evidence of mathematical reasoning and would be difficult to game with classroom test preparation. What I think is missing is an opportunity for students to demonstrate the complete range of what it means to do mathematics, including posing questions of their own to answer, but for an assessment serving this function, this seems much better than the current generation of standardized tests.
If you want to stop reading this and preview the questions yourself, feel free to do so (if you are only interested in looking at the sample math questions, you'll have to skip through the sample ELA questions).
Here are my preliminary thoughts from attempting the first few problems myself.
A few things seem obvious to me about the computer-based assessment that PARCC is working on.
The first is that many of these questions will still require actual math teachers, with some experience examining student work, to score them. Most of these questions are not just reformatted multiple-choice questions (although some of them are). While this increases the per-student cost of the assessment, I do not think computer programs yet exist that can accurately capture and understand the full range of students' possible mathematical reasoning.
Next, some of the more adaptive and social aspects of the work Dan Meyer and company have put together are not present here. This assessment is intended to capture what students think now, rather than what students are able to do once given more information. It is still an assessment of the content students have learned, and it does not appear to do an ideal job of capturing how students make sense of problems and persevere in solving them, attend to precision, or use any of the other Standards for Mathematical Practice (SMP). While it is clear to me that students will have to draw on these standards when taking this assessment, I do not see how anyone looking at the resulting student work will be able to say with any accuracy what is evidence of each of the SMP.
Unfortunately, unless the Standards for Mathematical Practice are somehow captured by an assessment (one of our project's goals for next year is to attempt to do this systematically), it is unlikely that teachers will use them.
David is a Formative Assessment Specialist for Mathematics at New Visions for Public Schools in NYC. He has been teaching since 2002, and worked in Brooklyn, London, Bangkok, and Vancouver before moving back to the United States. He holds a Master's degree in Educational Technology from UBC and is the co-author of a mathematics textbook. He has been published in ISTE's Leading and Learning, Educational Technology Solutions, The Software Developers Journal, The Bangkok Post, and Edutopia. He blogs with the Cooperative Catalyst, and is the Assessment group facilitator for Edutopia. He also helped organize the first Edcamp in Canada, and TEDxKIDS@BC.