- If you want to walk away from the conference with something different about your practice, focus on one or maybe two ideas during the conference and only go to sessions that will support you in learning and revisiting those ideas. On the other hand, if you don’t know what you are hoping to learn at the conference, you might be better off going to many different sessions, at least in the first couple of days, before deciding. Note, however, that this may be something you can learn from the conference planner rather than by attending a scatter-shot of different sessions.

No one- or two-hour session is likely to lead to any significant change in your practice. Twenty hours thinking about one part of teaching, especially through the different ways different presenters approach the idea, on the other hand, may.

- Skip a couple of sessions and take the time to reflect on your learning and make a plan for implementing the new idea(s) from the conference into your teaching. If you never come up with a plan for implementing new ideas, you will never try them out. The sooner you start planning, the fresher the ideas will be in your mind, and the easier the planning will be. You should also make a note of any questions you still have.

- Find someone with whom to share and discuss your learning experiences at the conference. In my case, I went to NCSM & NCTM with my colleagues, but if you end up going alone, try to arrange a buddy or two early in the week to spend the rest of it with. This is one potentially powerful use of Twitter; if you are active on Twitter, chances are good you will know other people at the conference.

It’s worth noting here that not every presenter at NCSM and NCTM is equally good at communicating what they know, so having someone to talk to after attending a session is useful for filling in whatever parts of the story or idea you missed but someone else caught.

- Take the time to network with other people at the conference, even if this means you may miss a session or two.

- Be strategic in the exhibition hall, if you visit it at all. I find it overwhelming and draining. I typically take 30 minutes to scan through the entire hall quickly to see if there are any *types* of products I don’t know about, and then come back later to follow up if necessary.


Unfortunately, the video tracking is not great, so much of my annotation of the participant ideas is not easy to see as it happens.

The instructional activity itself is called Contemplate, then Calculate and was developed by Grace Kelemanik and Amy Lucenta. The slides, script, resources, and references are available here.

Things I would do differently:

- Set up the projector screen in advance of starting the activity so I don’t have to fiddle with it during the strategy sharing.
- Not wing the recording of noticings and strategies (I ran out of time to prepare before this talk); instead, take the time to make a template for collecting that data.
- Record the initial noticings of participants about the problem as they were happening.
- Bring my own markers so that I can ensure I have access to more than one colour when recording the student ideas.

Things I decided to do or not to do somewhat deliberately:

- I did not focus on student-to-student discourse during the full-group portions, mostly in the interest of time. My meta-objective for this activity was to share the overall structure in a somewhat limited amount of time.
- I did not enforce participants writing using the prompts, mostly because I knew I had little to no relationship with the participants and I wanted to make sure no one felt alienated during this portion of the talk.
- I did make sure that, when recording student strategies, I imposed as little of my own interpretation on them as I could. One of the participants actually came up afterward and said she noticed that I was making an effort to write down representations of what participants were saying rather than filling in too many of the gaps based on my own understanding of the problem.
- I also focused on having participants share multiple strategies for solving this problem rather than attempting to focus participants on one particular strategy.
- I decided to summarize participant ideas at the end rather than take the time to have them share out to the room, mostly in the interest of time.

What questions or feedback do you have about what I did?


The audio isn’t terrific, unfortunately, but it is manageable. The slides, script, references, and resources are available here.


Here is the activity I gave my students to do:

Here’s what I anticipated they might do:

Here’s a sample of what they **actually** did:

This student seems to understand that if you want to shade 1/4 of something, you want to split it into four equal parts and then shade one of those parts. The triangle drawn is more challenging to divide, so the student likely estimated their partitioning of it.

However, if one wants to shade 1/3 of an object, it looks like this student thinks that they need to split the shape into four unequal parts and shade the smallest of those parts, maybe because they think that 1/3 is smaller than 1/4?

This student told me that they know that 1/4 is the same as “half of a half” and so they divided each of the shapes in half, and then divided one of those halves into a further half, and then shaded it.

When it came to shading 1/3 of a shape, though, this student looks like they divided the shape into four equal parts and then shaded three of those parts. My guess is that they thought that fractions involve dividing into four parts, and then used the denominator of the fraction to choose how much to shade? I’m not sure they noticed the inconsistency between this procedure and what they did in the first part.

This last student seems to have done pretty much the same as the second student, but seems aware that when you want to shade 1/3 of a shape, you want to first divide it into three equal parts and then shade one of those parts. I asked them about it and they said they just weren’t sure how to cut the shapes into three equal parts.

With this gap between what I expected to see and what I actually saw, I was left without knowing an effective way to respond. I decided that this task had few entry points for someone who really did not know what 1/4 and 1/3 meant (one of my students’ work was blank, which I am not showing since I do not have permission).

With the worry in the back of my head that I was probably reducing the cognitive demand for students, we split into two groups, with my student-teacher leading one group and me leading the other, and we tried a different task, one which offered some context through which one might make sense of what 1/3 means.

“If you had a pizza and you wanted to share it equally with three people, how would you cut it up so that each person got 1/3 of the pizza?”

With this prompt, students in my group almost all produced work similar to the following:

With one student drawing this:

This second student, by the way, said that pizza comes cut into ten slices, so the best we could hope to do is give each person 3 of the 10 slices and then give the last slice away to someone else.

The knowledge that would have helped me better anticipate student understandings here is what I call “the ways children typically understand mathematical ideas,” and it is the kind of knowledge that is rarely explicitly taught before one starts teaching. Every time I teach a new topic, I notice that the ways students think about the ideas differ from what I expect, and over time I learn to anticipate student thinking better as I get feedback from working with them.

If you had students showing these kinds of responses to this task, what would you do during your next lesson on fractions to support them?


Below is an example of the kind of data that never has a useful impact on instruction. The data content in the picture below is high, but the information content is low. How exactly is this information supposed to help a teacher make sense of what she should do with her students?

Here’s a slightly more useful variation on the same data, where it has been organized to potentially suggest some next steps. However, there’s a critical piece missing from this data – the actual task students did! Without knowing what questions were asked and what the responses below refer to, this is meaningless information. The only take-away I have from the information below is that, for these multiple-choice questions, it is unlikely that students were guessing.

This leads me to believe that one of the issues we have with this data is that it compresses the information about what students did to such a great degree that it is impossible to use the information meaningfully. All of the rich work students have done, and everything they have thought about, has been compressed into a few numbers, and consequently making decisions based on those numbers alone is incredibly difficult.
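The compression argument can be illustrated with a toy sketch. The two response patterns below are invented for illustration, not taken from the data in the pictures:

```python
# Two hypothetical students' results on a five-question multiple-choice
# quiz (1 = correct, 0 = incorrect); the data is invented for illustration.
student_a = [1, 1, 1, 0, 0]  # succeeds on the first three questions
student_b = [0, 0, 1, 1, 1]  # succeeds on the last three questions

# Compressed to a single score, the two students look identical...
score_a, score_b = sum(student_a), sum(student_b)
assert score_a == score_b == 3

# ...even though their question-by-question profiles differ on four of
# the five questions, and so likely call for different responses.
differences = sum(a != b for a, b in zip(student_a, student_b))
print(differences)  # → 4
```

The single score cannot distinguish these two students, even though almost everything about their work differs.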

What can we do differently? One option is to consider hypotheses, like the following, about why students may actually have chosen those multiple-choice responses. Again, the question itself is critical, but at least this leads to some potential things to give students feedback about.

But these suggestions about what students thought if they chose a given response are just hypotheses. They are thought through from an adult’s perspective on the mathematics, and these kinds of analyses are rarely informed by detailed and thoughtful research into children’s mathematical thinking.

While multiple-choice questions are an inexpensive and relatively time-efficient way to gather evidence of student performance, they make it difficult to really capture the richness of student thinking. What, for example, do you think the student below was thinking about when they worked on this problem?

While it is clear that this sample of student work provides potentially powerful information, the challenge with looking at individual student work is that it can be extremely time-consuming to study a variety of student work and come up with systematic responses for that group of students. We have to somehow combine the richness of information provided by what students actually did with a systematic approach, so that we can approach teaching an entire classroom full of students.

One approach we have used in our project to diagnostic assessment is to systematically look at student work and decide what approaches students took and attempt to group their work with other students who appear to have thought similarly. A teacher I know literally uses the desks in his classroom and spreads out all of his student work across the room trying to make sense of the different mathematical ideas students used. Other teachers record their interpretations of the approaches the students used in spreadsheets, like the picture below suggests.

But still, no matter how we organize the information, the question remains: what do we do with it? A further challenge is how to use what we uncover about student thinking in a more timely fashion, rather than only after our class is over.

I’m going to suggest some approaches for both of these questions in a follow-up post, but I’d love to know how you respond to student data, both after the fact and during a class.


Last Saturday, I tried a new instructional activity called Choral Counting. This activity was recommended to me by Magdalene Lampert and comes out of the work she and others have done to support high-quality ambitious teaching.

I’m experimenting with these instructional routines in part because I hope to support the teachers in my project in using them in their instruction, which I can hardly be expected to do without experiencing the routines from the inside myself. It’s clear that this activity is much richer mathematically than one might expect from just hearing about it. One finding I already have is that I need to do a much better job of planning how I will use my space when recording the numbers!

The basic idea of choral counting is easy: students count in unison while you write down the numbers as they chant, and you pause the students to ask them questions about the numbers. What I learned on Saturday is that this activity has a lot of potential to bring out rich mathematical ideas for discussion as a group.

I chose to start at sixteen and count by fives. While the counting was going on, I noticed my students paused a bit at 76, so I stopped and asked them why they thought they had paused. They also paused at 101, which I had anticipated during my lesson planning. And they gave different responses at 111 (many jumped to 121, which I expected).

I don’t know why they found 76 more challenging. Maybe because it was the first time they had to use a digit higher than 6 in either position? 101 is clearer: many students thought “tendy one” and self-corrected before they spoke, which slowed them down. I thought this myself when I first counted through this routine! Many students said 121 instead of 111, and I remember my own son doing something similar when he was learning how to count. I also paused a couple of times to ask students what pattern they noticed, and at one point I asked them to predict what the number would be if we counted four more times. One student proved her answer by counting up by fives; another student said it would be twenty more, because four times five is twenty.

Here’s the data collected by my assistant teacher (one of the children’s older sisters comes to the class, and she has happily volunteered to record information for me and to walk around the class generally supporting students).

Once we got to 131, I asked students what would be the next number that starts with 2. Two students came up with different mathematical arguments that the next number would be 201. One student said that the first digit was clearly a 2 and that the next digit must be a zero, because we are counting by fives. The last digit must be a 1 because our pattern always alternates between 1 and 6 and 1 is smaller than 6. The other student said that if we counted 19 fives or 95 higher from 101, that would be 196 which does not start with 2, but that if we counted 20 fives or 100 higher from 101, that would be 201 which must be the next number starting with 2 since we won’t count a number between 196 and 201.
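As a sanity check on the arithmetic behind the students’ two arguments, here is a small sketch of the count; Python is just a convenient notation here, not part of the activity:

```python
# The choral count: start at 16 and go up by fives.
count = list(range(16, 250, 5))

# The ones digit always alternates between 6 and 1, as the first
# student observed in her argument.
assert all(n % 10 in (6, 1) for n in count)

# Counting four more times from 131 adds 4 * 5 = 20, the reasoning one
# student used when predicting ahead.
assert count[count.index(131) + 4] == 131 + 20

# After 131, the next number in the count that starts with the digit 2
# is 201, matching both students' arguments (101 + 20 fives = 201).
next_with_2 = next(n for n in count if n > 131 and str(n)[0] == "2")
print(next_with_2)  # → 201
```

Both student arguments check out: 196 is the last count before the sequence skips into the 200s, and 201 keeps the alternating ones digit.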

Although the activity went longer than I expected, it was incredibly rich and worth doing.


Too often we ask teachers to listen for an hour or even two to someone talking nearly non-stop about what they know, with perhaps a question thrown in once in a while. Since very few people can talk in an engaging way for this amount of time, it’s no wonder that so many people check out of this kind of experience.

But this experience is even worse because most educators have the background to engage with the ideas being discussed almost immediately, yet lecturing at teachers does not take their prior experiences into account. If someone lectures at teachers for an hour, I assume that they do not understand their topic, or teaching, well enough to think through how someone might engage with the topic more directly (the only exceptions being a keynote or a class numbering in the hundreds, where very little else is possible without technology).

Yesterday I gave teachers a very brief introduction to one of four potential technology tools they might want to explore, and then I gave them access to the resources they would need to explore the tools. While they did this, I circulated around the room. Once it looked like they had made a decision about which tool they intended to explore more deeply, I asked them to group up with people exploring the same tool.

Then I continued to circulate around the room and made a note of who seemed comfortable with the technology, who needed support, and who was vocal and who was not. I checked who attempted to make things with the tools for themselves and who seemed more content to explore things already created. I used the information I gathered to decide when to pause the various groups’ discussion and work, how to structure the ensuing conversation, and what questions might be useful to ask. I used the experiences the teachers were developing with the technology to inform the rest of the session.

I asked teachers to try to fill in a lesson plan template and choose a goal. Most did not, so I decided to model a potential use of the technology using a Noticing and Wondering protocol. We then discussed what goal this activity might support and what the activity would look like if we tried to use pencil and paper.

At no point did I treat the teachers as if they were incapable of figuring out how to use the technology themselves or of thinking through how the technology might support their teaching. I certainly gave them opportunities for feedback on their ideas and offered them support when they seemed to need it, but I treated them as people who think, and treated their ideas as important to surface.

After all, this is all just good teaching, and don’t educators deserve that?


At the beginning of the year, we made two major shifts in our beginning-of-unit diagnostic assessments. The first is that we selected tasks that align more closely with mathematical ideas one might consider prerequisites for the unit. The second is that we developed a more sophisticated protocol for teachers to make sense of the student work.

In prior years, we expected teachers to use a rubric to score the student work and use the scores to make decisions about what to do next. Unfortunately, this process had teachers compress the information from the student work into a single number for each student, and then we had to provide a tool to help teachers unpack the score into mathematical understandings and decide on next steps. A huge amount of potentially useful information is lost in the conversion to a number, which unnecessarily complicates the decision-making process.

Instead, we developed a protocol and a spreadsheet tool so that teachers could look at the student work and systematically record the strategies the students were using as well as how successfully students used these strategies.

In order to develop the protocol and the spreadsheet tool, I took a sample of student work on the task and grouped it according to different types of mathematical strategies students used for each question on the task. I then decided on language that would communicate those strategies to teachers and created a set of instructions on how to go through the student work and record the strategies systematically.

Given the amount of time this takes, we decided to restrict this protocol to just the beginning-of-unit assessments and suggested to teachers that, instead of looking at every single student’s work, they could select a random sample of 20 to 30 students to look at in depth. We also attempted to make it clear that, while we strongly suggest teachers try using this tool, it is not a mandated part of our project; the mandate is for teachers to give an initial assessment to their students and then systematically make sense of the information provided by the task.
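The sampling suggestion is simple to sketch. The roster size, student identifiers, and strategy labels below are all hypothetical, not taken from our actual spreadsheet tool:

```python
import random
from collections import Counter

# A hypothetical class roster; in practice the names would come from
# the pre-populated spreadsheet.
roster = [f"student_{i:02d}" for i in range(1, 91)]

# Select a random sample of 25 papers (within the suggested 20-30 range)
# to study in depth. The fixed seed is only to make this sketch repeatable.
random.seed(7)
sample = random.sample(roster, k=25)
assert len(sample) == 25
assert len(set(sample)) == 25  # sampling is without replacement

# As each sampled paper is read, the teacher records the strategy she
# sees; these labels and counts are invented for illustration.
strategies = Counter({"area model": 11, "repeated addition": 8,
                      "guess and check": 4, "no visible strategy": 2})
assert sum(strategies.values()) == len(sample)
print(strategies.most_common(1))  # the most frequently observed strategy
```

The tally at the end is the key step: it turns a pile of individual papers into a distribution of strategies the teacher can plan instruction around.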

Once we have the spreadsheet tool ready for any given unit, our data researcher uses Autocrat and a custom script a member of the New Vision Cloudlab team wrote to distribute the spreadsheets to teachers and then pre-populate the spreadsheets with their student names.

One theory I have about this work is that an excellent way for teachers to develop their knowledge of how students approach mathematical tasks, and consequently understand mathematical ideas, is to systematically look at student work and record and analyze the actual strategies students used, as represented in their written work.

An interesting finding so far is that although not all of our teachers are using the spreadsheet tool, many of them are systematically sorting their student work by the different strategies used, making sense of the student work, and then deciding on instructional next steps based directly on the student work itself. This is very likely an idea generated by the use of the tool, as we had not witnessed large numbers of teachers in our project using this protocol until this year.

Our hope is that by the end of this year, we will have tasks and tools available for each of the twenty units we are developing as part of our resource support for teachers.


In my experience when I first started using technology in my teaching, my planning protocol went something like this:

- Find some cool gadget or activity and say “Oooo, I have to use this.”
- Shoe-horn it into a lesson even if it didn’t always make sense.
- Wonder why my students didn’t learn a whole lot from the activity.

The advantage of the TTAL protocol is that it puts the goal or focus first and prompts someone planning a lesson to think through how each part of their lesson supports their overall goals for students. I also think that although the TTAL protocol was originally developed for use in developing lessons with a mathematical focus, it could fairly easily be adapted to be more content-agnostic.

The overall protocol goes something like this:

- Set up and select a task based on your goal.
- Support students doing the task.
- Share and discuss the task.

I recommend reading the entire protocol because there is obviously more nuance to the protocol than what I am describing. For example, the TTAL protocol recommends that you do the activity you are planning for students to do, both how you would do it with your knowledge and experience but also to anticipate how your students would do the task with their different knowledge and experience.

A critical idea to keep in mind when making choices about activities for lessons, including ones that involve technology: what your students think about is what they will remember, and what they remember will dictate what they will be able to do.
