Victory’s spinoff metacog just released its advanced scoring analytics API at ISTE 2015 in Philadelphia. See the press release here, and feel free to share it with your colleagues.
Why Assessment Has Changed
Assessments have been evolving rapidly, mostly due to these factors:
- The new standards (the Common Core and the Next Generation Science Standards) focus on practices that require higher-order thinking and decision-making skills.
- Jobs are changing, and employers need evidence that prospective employees have the necessary twenty-first-century skills.
- Technology now makes it possible to deliver, and score, richer assessments at scale.
The assessment landscape is of course more complex than this; stay tuned for more detail in future blog posts.
The Problem
One writer recently complained to us about NGSS:
They wrote the NGSS as if they had one goal in mind—don’t allow multiple choice questions. And now I have an assignment to write 100 MC questions for NGSS!
Writing good assessments is an art, but even the best assessments won't be used unless they can be readily scored.
That's why high-stakes assessments still contain so many multiple-choice questions: grading more open-ended items by hand is expensive.
While technology has made great strides in interactive digital assessment, the most robust assessments still have to be hand-scored.
Until now.
The Answer—How metacog Makes Automated Scoring Possible
Above, we explained why there is a need for automated scoring of open-ended assessments.
The video below explains how it is possible, using workflows that are very similar to ones publishers currently use for developing interactive digital assessments.
Lovers of data and details (programmers and psychometricians, for example) can request our more detailed white paper. Enjoy!