
PARCC sample assessment items for high school math

PARCC recently released some sample computer-based test items for ELA and high school mathematics, so I thought I would check them out since NY state is still officially planning (eventually?) to use the PARCC assessments.

First, some kudos to the team that created these assessment questions. In general, I found that the questions look for evidence of mathematical reasoning and would be difficult to game with classroom test preparation. What I think is missing is an opportunity for students to demonstrate the complete range of what it means to do mathematics, including posing their own questions and answering them, but for an assessment with this function, this seems much better than the current generation of standardized tests.

If you want to stop reading this and preview the questions yourself, feel free to do so (if you are only interested in looking at the sample math questions, you'll have to skip through the sample ELA questions).

Here are my preliminary thoughts from attempting the first few problems myself.

  1. Use of space is critical, and the first assessment question does not use it well. The video below explains my reasoning on this.

  2. The second question has two issues, one of which is really very minor, the other of which is something PARCC should make an effort to fix.

    I'm okay with questions that use approximate models for mathematics, but it might be at least worth noting to students that these models are approximations.

    Taking a test on a computer is already far more distracting than taking one with pencil and paper. Given the research showing that people generally read more slowly, and with less comprehension, on a screen, PARCC should make an effort to mitigate these platform issues as much as possible. Imagine if your work flickered in and out of existence while you were writing it down on paper.
  3. I put the wording of the third question through a reading-level estimation calculator, which estimated the reading level at grade 10. While it is reasonable to expect a certain amount of reading competence from students, we have to be careful that our assessments of students' mathematical thinking aren't actually measuring whether they can read the prompts in our assessment.
  4. Question 5 assumes a certain amount of cultural knowledge, specifically of playing golf. Having worked with students who do not have this sport in their cultural background, I find assessment items like this frustrating. Usually the questions are doable without the cultural knowledge, but imagine you are a student who comes across a question built around an idea you know nothing about. Regardless of whether that knowledge is required to do the mathematics of the problem, it affects students' confidence and therefore their performance.
  5. The sixth question assumes that students have some minor technical knowledge, which I would classify in the same genre as my fourth point; students with a minimal technical background may struggle with the mechanics of this task. This may not affect a huge number of students, but the assessment instrument should be as neutral as possible, so that the greatest number of students can engage with the mathematics of the task rather than its mechanics.
  6. The seventh question has a video. It's probably between 4 and 10 megabytes in size. Can you imagine what this will do to your school's bandwidth if every student in a particular grade is accessing the resource at the same time?
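
To put rough numbers on the bandwidth concern, here's a back-of-envelope sketch. The video size, cohort size, and link speed are all my own assumptions for illustration, not figures from PARCC:

```python
# Back-of-envelope estimate: time for a school link to deliver one
# question's video to a whole grade testing simultaneously.
# All three numbers below are assumptions, not PARCC figures.
video_mb = 8      # assumed video size, megabytes (the post guesses 4-10 MB)
students = 150    # assumed number of students in one grade testing at once
link_mbps = 100   # assumed school internet link, megabits per second

total_megabits = video_mb * 8 * students  # 8 bits per byte
seconds = total_megabits / link_mbps
print(f"{seconds:.0f} seconds of a saturated link")  # 96 seconds
```

Even under these assumptions, that is a minute and a half of a fully saturated link for a single question, and every student who replays the video multiplies it.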


A few things seem obvious to me about the computer-based assessment that PARCC is working on.

The first is that many of these questions will still require actual math teachers, with some experience looking at student work, to score them. Most of these questions are not just reformatted multiple-choice questions (although some of them are). While this increases the per-student cost of the assessment, I do not think that any computer program yet exists that can accurately capture and understand the full range of students' possible mathematical reasoning.

Next, some of the more adaptive and social aspects of the work Dan Meyer and company have put together are not present here. This assessment is intended to capture what students think now, rather than what students are able to do once given more information. It is still an assessment of the content students have learned, and it does not appear to do an ideal job of capturing how students make sense of problems and persevere in solving them, attend to precision, or use any of the other Standards for Mathematical Practice (SMP). While it is clear to me that students will have to use these standards when doing this assessment, I do not see how anyone looking at the resulting student work will be able to say with any accuracy what is evidence of each of the SMP.

Unfortunately, unless the Standards for Mathematical Practice get captured somehow by an assessment (a goal of our project next year is to attempt this systematically), it is unlikely that teachers will use them.



About David

David is a Formative Assessment Specialist for Mathematics at New Visions for Public Schools in NYC. He has been teaching since 2002, and has worked in Brooklyn, London, Bangkok, and Vancouver before moving back to the United States. He has his Masters degree in Educational Technology from UBC, and is the co-author of a mathematics textbook. He has been published in ISTE's Leading and Learning, Educational Technology Solutions, The Software Developers Journal, The Bangkok Post and Edutopia. He blogs with the Cooperative Catalyst, and is the Assessment group facilitator for Edutopia. He has also helped organize the first Edcamp in Canada, and TEDxKIDS@BC.


I agree with your points above, especially about the graph that isn't completely visible. I'd also point out that a large fraction of students are supposed to take these tests on either iPads or Chromebooks, according to Sue Gendron. Have you tried these problems without using a mouse with a scroll wheel? It's a PITA. I'm trying to get my school to purchase the $12 cordless mice so we can share them during testing on Chromebooks. So far, no dice. I'm recommending to anyone who will listen: get some USB mice. It makes a difference.

Question 4 is unintelligible.
Do they really mean f(x+4) = x^2 + 6x?
f(x+4) is not a function; it represents the value of the function f at the argument x+4.
Also, I am guessing that by "vertex" they mean the max or min point. Polygons and polyhedra have vertices.
Are math teachers supposed to teach the pupils all the wrong terminology (as well as the generally understood and reasonably correct terms) just in case some test author was ignorant?
Finally, how long does it take an average pupil to become reasonably good at using this mathematical word/symbol processor?
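
For what it's worth, if the item really does mean f(x+4) = x^2 + 6x for all x, the function f itself can be recovered by substitution. This is my own sketch of that reading (using sympy), not PARCC's intended solution:

```python
import sympy as sp

t = sp.symbols("t")

# Assume the item means f(x + 4) = x**2 + 6*x for all x.
# Substituting t = x + 4, i.e. x = t - 4, recovers f itself:
f_t = sp.expand((t - 4) ** 2 + 6 * (t - 4))
print(f_t)  # t**2 - 2*t - 8

# The turning point (what the item presumably calls the "vertex")
# is where f'(t) = 0:
turning = sp.solve(sp.diff(f_t, t), t)
print(turning, f_t.subs(t, turning[0]))  # [1] -9, i.e. the point (1, -9)
```

So under this reading the question is answerable, but the notation forces students to do a change of variable that the prompt never acknowledges.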

David Wees:

Lots of people use the word vertex to describe the turning point, or minimum, or maximum, or extremum, or point on the axis of symmetry of a parabola.

As for the use of notation, I'm not sure. I'd have go look at the question again to see if all of the function notation they've used makes sense.

Re: vertex of a parabola.
Yes, when a parabola is presented in quadratic-equation form the vertex is at the minimum (or maximum) point. However, the definition of the vertex of a curve is a point where the curvature is at a minimum or maximum, so the vertex of a parabola is at the point where the curve crosses its axis. The parabola in standard graph form is probably the only curve whose minimum is a vertex.
I checked y = x + 1/x: minimum at x = 1, vertex at x = 2^(-1/4), about 0.84.
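
The curvature-extremum definition can be checked numerically. This brute-force scan is my own sketch; by my computation, for y = x + 1/x it puts the curvature maximum, and hence the vertex of this hyperbola, at x = 2^(-1/4) ≈ 0.841, distinct from the minimum at x = 1:

```python
# Curvature of y = x + 1/x on x > 0:  k(x) = |y''| / (1 + y'**2) ** 1.5
def curvature(x):
    y1 = 1.0 - x ** -2   # y'  = 1 - 1/x^2
    y2 = 2.0 * x ** -3   # y'' = 2/x^3
    return abs(y2) / (1.0 + y1 * y1) ** 1.5

# Brute-force scan for the curvature maximum on (0.1, 3.0).
xs = [i / 10000.0 for i in range(1000, 30000)]
best_x = max(xs, key=curvature)
print(round(best_x, 4))  # 0.8409, i.e. 2**(-1/4)
```

The same point falls out analytically: setting the derivative of the curvature to zero reduces to x^(-2) = sqrt(2), which agrees with minimizing the distance from the hyperbola to its center at the origin.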
I also found this horror from the University of Iowa :

" Parabolas are graphs with equations of the form y = a*sqr(x) + b*x + c , with a < > 0 . Parabolas have a highest or a lowest point (depending on whether they open up or down), called the vertex. "

I may be a bit nit-picky, but this sort of stuff gives math a bad name!
Sometimes I think that the poor kids feel like Alice when Humpty Dumpty says, 'When I use a word, it means just what I choose it to mean.'

Thank you for this post. It is good to know that others out there have many of the same thoughts about the released questions.
My concern about question 7 is that students don't have access to the diagram unless they are playing the video. So while they might not need to watch the video several times, they will have to keep playing it to know which point is where once they are ready to start working through a proof. It seems like having a static picture of the diagram showing when the video isn't playing could help.
Scrolling is also definitely an issue. On my Mac, I have found it helpful to reduce the font size to see more on the screen at once (command -).  I couldn't find a correct answer to one of the questions on the Smarter Balanced test released last summer, and it turns out I just couldn't see it on my screen. It looked like I had all of the choices showing & wasn't at all intuitive that I needed to scroll down to see additional choices.
I wrote a post on how my students answered the geometry item (this was before the computer based test was available) & my concern about my students making use of structure differently than the released solution. PARCC noticed my post & at least acted like they were glad for the feedback.

Hi, Gregory Olson from South Alabama's EDM310 class again. I see that you mentioned you would like students to be able to make their own questions and answer them. In our class, we have been discussing project-based learning and how students who interact with the material more learn better than those who just have it written on the board. But how does having a math student write and answer their own questions help? Would it be better for students to write a question, and then have to answer another student's question?

David Wees:

Check out Dan Meyer's Three-Act lessons as one activity structure in which students ask, and then attempt to answer, their own questions. There is structure around the lessons; they are not unguided discovery. But because students ask the questions themselves, they have significantly more ownership over the process, which means they are more engaged with what they are learning, and consequently they learn more.

Did anyone ever skip questions on tests and then go back to complete them when taking exams? I believe there is no going back to previous questions on the PARCC test. Does anyone else think that this is a problem?

David Wees:

I just double-checked. You can't go back to a previous section, but within a section (i.e., the ELA section) you can navigate to and from different questions. This is essentially the same as many exams of this nature: you can't go back and do questions from a completely different section of the exam, but you can flip through and do the questions in the order you want.

Hello Mr. Wees,
   My name is Sydney Potter. I am a student at the University of South Alabama majoring in elementary education. I am currently taking the EDM 310 class and am required to comment on your blog post. As I read over your post, this new program seems unique and likely to open up many new ways to get students engaged in the lesson and help them want to learn every day. Trying new things is always a good idea because kids learn in so many ways, and you never know what may really help a student succeed.

I am a 20+ year math teaching veteran at the high school level and I have been diligently working on PARCC prep materials.
Please consider reviewing my growing list of resource materials at...
