In our first blog on this topic, we began to reveal the secret behind the magnetic allure of games, simulations, and other performance-based digital learning experiences. We showed you a well-aligned, well-designed simulation-based science performance task.
In this blog, we’ll show you how the design of a digital performance task directly influences the richness of the data we can gather on student learning. This time, we’ll use a social studies task we developed:
Behind the Scenes: Designing the Boston Massacre Performance Task
We designed this performance task to give students opportunities to visually, graphically, and dynamically analyze text, make inferences based on evidence, and synthesize their understanding.
Why? Let’s examine the standards. The C3 Framework, the Common Core State Standards, and other state and national efforts to align learning expectations with 21st-century workforce demands all emphasize critical analysis of text and the use of textual evidence.
In a paper-based instructional approach, students might read primary source documents and then write the essay you saw in the video. To demonstrate their understanding and synthesize their analysis, students must support their conclusions with evidence gleaned from a close reading of the text.
Without technology, we would see the results of students’ analysis and conclusions only through their writing. To evaluate their work, we might expect students to cite evidence from the text, describe their analytical process clearly, and explain how their conclusions rest on reasoning rather than on opinion or supposition. These are valuable and effective measures of performance, but…
What Could a Digital Performance Task Enable?
Imagine being able to see evidence of students’ analysis, the kinds of suppositions they make, and when and how they change their minds, even before they write about it. Picture literally watching how their prior investigations influence their subsequent decisions. What if we could see not only students’ conclusions, but also how their close reading of a text (or their struggles with it) shapes their entire decision-making process? And what if students could guide, manipulate, and reflect on their own analytical process? We could gather far more evidence of their learning than an essay alone could provide. And the task would more closely mirror our intended learning outcomes by expanding students’ opportunities to visually, graphically, and even metacognitively demonstrate the understandings, knowledge, and skills implied by the standards.
That’s the power of a well-designed, interactive digital performance task. If we want that richness of data, we have to design for it intentionally. We are not interested in technology that simply replicates generally accepted and effective paper-based instructional approaches. Our passion is leveraging technology to transform teaching and learning.
In our next blog in this series, we’ll show you the patterns of student thinking that every teacher wishes they could see. (And yes, it’s now possible!)