Here is another TEI (Technology Enhanced Item) Victory has developed for English Language Arts. We are trying to push the envelope on interactivity while staying within the framework of the PARCC and SBAC assessments that students will soon encounter.
Please take 2 minutes to watch the video; then play with the prototype Writing TEI here.
Decisions, decisions… There are so many ways we could have developed this TEI!
One decision about which we’d like feedback:
Should students have to re-enter correct answers in each trial?
Here’s our thinking: There isn’t a lot of keyboarding in this example. In fact, you can score 100% with only 5 mouse clicks and 5 keystrokes! A student who doesn’t get all the answers correct right away might be annoyed at having to re-enter their correct answers, but think of the benefits:
- We’ll find out if the student was guessing.
- We’ll know whether the student can duplicate their correct answers. This is subtly but significantly different from merely guessing right, because duplication shows the student has a solid understanding rather than luck.
- When students do a lot of these TEIs, they will learn it pays to be careful. If they get an answer right on the first try, they are done. This will nurture a desirable habit of mind—and teachers will thank us!
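The two policies we are weighing can be sketched in a few lines. This is our own illustration of the decision, not code from the shipping TEI; the policy names and list-of-booleans result format are assumptions made for clarity:

```python
# Hypothetical sketch of the two trial policies under discussion.
# results: per-item outcomes from the last trial (True = correct).

def items_to_answer(results, policy):
    """Return the item indices the student must answer on the next trial."""
    if policy == "reenter_all":
        # Version under discussion: everything, even items already correct.
        return list(range(len(results)))
    elif policy == "fix_only":
        # Alternative: the student only revisits their errors.
        return [i for i, ok in enumerate(results) if not ok]
    raise ValueError(f"unknown policy: {policy}")

# Example: items 0 and 3 were wrong on the last trial.
last_trial = [False, True, True, False, True]
print(items_to_answer(last_trial, "reenter_all"))  # [0, 1, 2, 3, 4]
print(items_to_answer(last_trial, "fix_only"))     # [0, 3]
```

Seen this way, the question is simply which branch the TEI should take between trials.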
Keep in mind that we can make multiple versions of this TEI, so your feedback is extremely valuable.
A refresher on how the metacog toolkit works: The log shown in the TEI is for the developer; students won’t see it. Metacog keeps the data anonymous while it is in the cloud; we reattach it to individual students only when we put the results into teachers’ hands. The FAQ on the metacog website explains how we preserve the anonymity of student data until that point.
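To make the "anonymous in the cloud, reattached for the teacher" idea concrete, here is one common pattern for it, sketched in Python. This is our illustration only, not metacog's actual implementation (see their FAQ for that): a mapping table stays on the school side, and only an opaque token ever leaves it.

```python
import uuid

class PseudonymMap:
    """Hypothetical sketch: map real student IDs to opaque cloud tokens.
    The mapping never leaves the local side, so cloud data stays anonymous."""

    def __init__(self):
        self._to_token = {}
        self._to_student = {}

    def tokenize(self, student_id):
        """Return a stable opaque token to send with cloud-bound events."""
        if student_id not in self._to_token:
            token = uuid.uuid4().hex
            self._to_token[student_id] = token
            self._to_student[token] = student_id
        return self._to_token[student_id]

    def reattach(self, token):
        """Done locally, when results are put into the teacher's hands."""
        return self._to_student[token]
```

The same student always gets the same token, so their activity can be analyzed longitudinally in the cloud without anyone there knowing who they are.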
Why use the metacog toolkit with this TEI? For one thing, it makes A-B testing easy. We could create:
- version A — the student has to re-enter correct answers in each trial.
- version B — the student only has to fix their errors in each trial.
In fact, you could do A-B-C-D-E-F-G testing to evaluate any number of variations in the construction of the TEI and how those variations affect the way students interact with it. The metacog toolkit will tell us much more than a raw score; it will yield insights into how students approach the TEI:
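For A-B (or A-through-G) testing to be fair, each student should see the same variant every session. A standard way to get that is to hash an anonymous student ID into a variant bucket; the sketch below is our own illustration of that technique, not a metacog API:

```python
import hashlib

def assign_variant(student_id: str, variants=("A", "B")) -> str:
    """Deterministically assign a student to a TEI variant.
    Hashing (rather than random choice) keeps cohorts stable across sessions."""
    digest = hashlib.sha256(student_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same ID always lands in the same bucket:
print(assign_variant("student-123"))
print(assign_variant("student-123", ("A", "B", "C", "D", "E", "F", "G")))
```

With seven entries in `variants`, this is exactly the A-B-C-D-E-F-G case: each cohort interacts with its own construction of the TEI, and the logged events can then be compared across cohorts.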
- Is there a pattern to the order in which students find the errors, one that reveals which errors are easier or harder to spot?
- Is there a pattern that reveals student misconceptions?
- Do students even look at the feedback? In this TEI there is a clear benefit to looking at the feedback after trial 2: the software shows them where all the errors are. But maybe they don’t use this feature. A product developer could optimize a TEI (or any other learning object) by refining the navigation, layout, functionality, or even the color of certain elements, all based on the metacog feedback.
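The first question above, the order in which errors are found, lends itself to a simple aggregate: the mean discovery rank of each error across students. The sketch below is illustrative only; the log format is invented for the example, and a real metacog event stream would need its own parsing:

```python
from collections import defaultdict

def mean_discovery_rank(logs):
    """logs: {student_id: [error_id, ...] in the order each error was found}.
    Returns {error_id: mean position}; a lower mean = spotted earlier."""
    totals, counts = defaultdict(int), defaultdict(int)
    for found in logs.values():
        for rank, err in enumerate(found, start=1):
            totals[err] += rank
            counts[err] += 1
    return {err: totals[err] / counts[err] for err in totals}

# Two students, three (hypothetical) errors in a writing passage:
logs = {"s1": ["comma", "tense", "spelling"],
        "s2": ["comma", "spelling", "tense"]}
print(mean_discovery_rank(logs))  # the comma error is spotted first by both
```

An error with a consistently high mean rank is one students struggle to spot, which is exactly the kind of insight a raw score cannot give.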
We are enjoying the feedback on our previous blog post. Please use the comment field to continue the discussion…