
Re-“Designing Lincoln’s Playground”: One Teacher’s Experience with Task Validation

When I first read about the Massachusetts Consortium for Innovative Education Assessment (MCIEA), I knew it was something I had to be a part of. The idea of a richer means of assessment where we look at the whole student, outside of a standardized test, is something that I feel strongly about. Moving in this direction gives me hope that the test anxiety so many of our students suffer from will eventually disappear.

Using performance tasks was something I already enjoyed implementing in my teaching. In the past, I have assessed students with both traditional assessments and with performance assessments. However, after working with MCIEA this year, I have realized that my tasks could be more substantial, cross-curricular, and have a stronger connection to real life. In order to fully connect to the MCIEA goals, I decided to take an old task, “Designing Lincoln’s Playground,” and revamp it.


As a member of MCIEA, I was asked to put aside my fear of public speaking and present “Designing Lincoln’s Playground” to a group of approximately twelve teachers from the various MCIEA districts at a Quality Performance Assessment institute. Within this presentation, the other teachers and I went through what is called a “validation process.”

This process seemed very intimidating at first: the idea of putting my work in front of a group of unfamiliar people and having them review and discuss my task scared me! I was not sure what people would think or how they would react. Yet, the validation process was far more powerful than I could have imagined. I left with feedback that would only make my performance assessment that much stronger in the future.


The first part of the validation process is to look at the standards alignment and depth of knowledge within the assessment. Through effective feedback, I learned that much of my assessment is at a Depth of Knowledge (DOK) level of 2, with some of my assessment falling in the DOK 1 and DOK 3 levels. Right away, I was able to experience how valuable it was to have many other eyes looking over the assessment.

My fellow educators were able to recognize where my assessment could be stronger, and they provided many excellent ideas. My favorite part of this discussion was how to raise the assessment to a DOK level of 4. How could I take this area, perimeter, and playground design idea and have students apply it elsewhere?

One teacher from Boston talked about how I could also take this assessment and connect it to our science curriculum. She suggested that I have the students think about the playground structure and materials if the playground was moved to a desert area or in an area where it rained often. She suggested pushing students to think of the effects of erosion. I LOVED this idea! This connection had never crossed my mind. Within minutes, I had a valuable suggestion on how to strengthen this assessment in the future.


The second part of the validation process focused on clarity, student engagement, and a scoring guide, such as a rubric. My group raised some questions about individual student performance. How was each child going to be assessed? How could I take this partner- and group-based task and make sure that each child was being assessed properly?

What’s interesting about the validation process is that while your group may have questions, as the presenter, you cannot answer them until the very end. This allows the group to engage in a rich discussion without my influence, and it enables me, in turn, to take notes for discussion points later. How to assess each student properly was something I particularly wanted to discuss further.

The other important feedback I received here was regarding my rubric. Going into this presentation, I knew that my rubrics, one for the math portion and one for the ELA essay, were the weakest parts of this assessment. The group discussed adding a column that would separate proficient students from advanced ones, as well as some judgment words in the rubric that could be revised.


The third part of the validation process focused on fairness and universal design. Discussion points from parts one and two came up again here, as some clarification was still needed on how the assessment would work for all students.

Within the task, I provided extensions for above-level students. My group discussed how the extensions could be made available to all students at different levels, and I wholeheartedly agreed with them. The extensions were originally designed for my above-level students to further apply math concepts in a real-life situation, and the feedback really made me think about how I could allow all students to explore this same idea at an appropriate level.


The last part of the validation process determined whether the validation was complete or if it was pending. In this case, my validation was pending because there were several revisions that could be made. What is great about this process, though, is that revisions can always be made. These assessments can always be reworked and made better year after year.

Having multiple educators with different visions and ideas simultaneously looking over one task is not something that I felt comfortable with at first, but I quickly learned how valuable this whole process was. I loved hearing many great ideas from other teachers and the suggestions they had to help make my assessment that much stronger.

I look forward to building more performance assessments and completing this validation process. I know that this is only going to help me increase the rigor, relevance, and accessibility of assessments.

After putting her task through the validation process, Cabral-Pini revised and implemented her assessment. Here is the completed 3D model of the winning playground design.


