All posts by Joel Gendler

Victory’s Vendor and Partnership Processes

In our recent blog post, Instructional Design 101, we provided an overview of several popular instructional design models. One of these, the original ADDIE model, was a linear approach with some iterative features. It evolved to be more cyclical, and spawned many other models. In similar fashion, our linear workflows at Victory have evolved to keep up with rapid changes in our industry.

Watch this video for a quick look at Victory’s vendor and partnership processes. Many projects do not require a partnership process; we originally used it to develop digital products, but it has many benefits for complex print products as well.

The video also references backward design, which we first blogged about in Talking to the Test: The Learning Continuum. In backward design, the initial development focuses on assessments because they determine what evidence we will accept as proof of mastery of the associated learning objectives. Again, not every project warrants a backward-design approach. It makes the most sense for subjects with open-ended user experiences that are hard to assess and hard to teach. We have found that if most of the assessment is traditional, then a traditional development process generally will also be sufficient.

metacog Building Its Deep Learning Analytics Platform with Databricks

Victory’s spinoff metacog was just featured in a blog post by Databricks, a company founded by the team that created Apache Spark, a powerful open-source data processing engine. See the Databricks blog post below.

metacog has been hard at work releasing new capabilities for its learning analytics platform while enhancing existing ones. If you offer subscription-based products, you know that your customers expect continuous improvement. With metacog, we partner with you to deliver new deep learning analytics capabilities that you can easily integrate into your products to generate new data-driven business models and revenue streams.

Why data analytics for adaptive and competency-based learning is so challenging

You may have seen many companies offering data analytics for learning products. If you look closely, most of what is offered is “administrative-level” data and simple scoring data:

  • Time-on-task data – How long did learners use the interactive?
  • “Attendance” data – Did learners participate?
  • SCORM-compliant scores reported to a learning management system (LMS) – How well are learners doing?
  • Simple score reports – How many right, how many wrong?
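To make concrete how shallow these metrics are, here is a minimal sketch of computing them from a log of learner events. The event format and field names are hypothetical, invented for illustration; they do not come from any particular analytics platform.

```python
# Hypothetical event log: each record notes a learner, an action, and a
# timestamp in seconds. All field names here are illustrative only.
events = [
    {"learner": "ana", "action": "start",  "t": 0},
    {"learner": "ana", "action": "answer", "t": 95,  "correct": True},
    {"learner": "ana", "action": "answer", "t": 180, "correct": False},
    {"learner": "ana", "action": "stop",   "t": 240},
    {"learner": "ben", "action": "start",  "t": 10},
    {"learner": "ben", "action": "answer", "t": 70,  "correct": True},
    {"learner": "ben", "action": "stop",   "t": 90},
]

def time_on_task(events, learner):
    """Seconds between a learner's first and last event ('how long?')."""
    times = [e["t"] for e in events if e["learner"] == learner]
    return max(times) - min(times)

def attendance(events):
    """Set of learners who generated any event at all ('did they participate?')."""
    return {e["learner"] for e in events}

def simple_score(events, learner):
    """(right, wrong) counts -- the 'how many right, how many wrong' report."""
    answers = [e for e in events
               if e["learner"] == learner and e["action"] == "answer"]
    right = sum(1 for a in answers if a["correct"])
    return right, len(answers) - right

print(time_on_task(events, "ana"))  # 240
print(sorted(attendance(events)))   # ['ana', 'ben']
print(simple_score(events, "ana"))  # (1, 1)
```

Note that each of these reduces a rich interaction history to one or two numbers; nothing about *how* the learner worked through the task survives.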

It turns out that to improve anything, you have to be able to measure it – but so far in education we have been measuring the wrong thing: the final answer.

This explains why scoring is the key issue. In the past, most open-ended assessments had to be human-scored, which greatly reduced the frequency with which teachers and professors assigned them. Yet it is open-ended tasks that best assess a candidate’s ability to perform well in today’s job market.

Why metacog is different

Continue reading

U.S. Education Market Snapshot: English Language Learners (ELLs)

The Early Days

In the 1960s, Victoria Porras attended Melrose High School in Massachusetts as an exchange student from Bogotá, Colombia. She was far from home. Her host family, teachers, and new schoolmates were friendly and eager to help, but the everyday English spoken in the Boston area is accented, idiomatic, and studded with acronyms.

Victoria decided then that she would find a way to help other exchange students navigate spoken English. The idea of Victory Productions grew from that experience.

When Victoria established Victory Productions in 1995, its mission was to develop educational materials to teach English as a Second Language (ESL), but she found that what the market wanted was Spanish translation. Only later did the market recognize the need for materials dedicated to English Language Learners (ELLs).

The ELL market has certainly changed since those early years. Most early programs focused on teaching English language skills so students could learn in the mainstream classroom. The products offered to the market were Spanish student editions, bilingual student editions, and supplemental programs designed to build English reading skills. Today’s market, however, is different.

Today’s Market for English Language Learners (ELLs)

The U.S. Department of Education recorded 4.85 million English language learners (ELLs) enrolled in public schools in the 2012–2013 academic year. ELLs attend public schools in all 50 states, so the market is national. As shown in the map below, compiled by the Office of English Language Acquisition (OELA), the greatest concentrations of ELLs (by percent of total enrollment) are in western and southwestern states. In 2012, six states had an ELL density of 10% or higher: California, Texas, Nevada, Oregon, Colorado, and New Mexico. On the East Coast, only three states—Florida, New York, and Virginia—ranked among the top fifteen in ELL student density.

U.S. State Map of Highest Density of English Language Learners (ELLs)

Continue reading

Instructional Design 101

Before we plunge into instructional design, let’s step back. What does it mean to design? Here’s the definition, according to Merriam-Webster:

Design

1. to create, fashion, execute, or construct according to plan: devise, contrive

2. a: to conceive and plan out in the mind

b: to have as a purpose: intend

c: to devise for a specific function or end

Design is applied in many fields. Engineers design, construct, test, and refine solutions to problems. Fashion designers bring art to life in clothing, jewelry, and accessories. And then there’s user interface design, video and film design, marketing, and even publishing. In educational publishing, design often refers to graphic design—envisioning and creating the visual look and feel of a book or product. However, graphic design is just one small part of another field of design essential to creating educational materials—instructional design.

Instructional Design Models

Over the years, numerous instructional design models have been developed. These models serve as frameworks for building modules or lessons by:

  • increasing the likelihood that learning occurs, and
  • encouraging learner engagement so that learners learn faster and gain deeper understanding.

Instructional design is the systematic process by which instructional materials are designed, developed, and delivered. Instructional design creates a learning environment that is focused on the learner, with an organized structure of content and activities designed to achieve specific learning objectives. This involves applying educational research and teaching practice to craft curriculum and instructional materials aligned to those objectives, thereby improving learning outcomes.

You often hear about instructional design in the context of technology—specifically, digital learning experiences. But the primary goal of instructional design is not the use of technology—it’s good instruction. Technology is just one tool that can be employed to achieve the larger goal: to improve learning outcomes.

The ADDIE Model

One of the earliest instructional design models, ADDIE, includes these five phases:

Continue reading

Professional Translation and How To Avoid Flying Naked


Why is it important to have a professional write and translate your product?

In this era of new technology and immediacy, it is easy to get carried away with the specialized tools available to get the work done. But remember, they are only tools, which means they are only as good and effective as the person who uses them.

The same thing happens with free translation tools such as Google Translate, and even with professional tools such as Wordfast or SDL Trados. There are many automated translation tools, but used alone they can be more harmful—or comical, for that matter—than useful. In 1977, an airline promoted the leather seats in its first-class sections with the slogan “Fly in leather.” Translated literally into Spanish as “Vuele en cuero,” the slogan really means “Fly naked.” The biggest danger with automated tools is that they tend to translate literally, word by word.

Continue reading

Skate Park Performance Task

We have become proficient at developing performance tasks closely aligned to NGSS (Next Generation Science Standards). Of course, a good performance task aligns to standards across multiple disciplines. The following task was developed for middle grades and for these learning goals.

Please watch the video and then try the performance task. We’d love to hear your feedback!

You have been asked to make the jump safe. The video below explains how to set up a simulation to investigate.

Your Task

Click “Playground” in the PhET® simulation below and set up a jump as shown in the video. Remember to set friction to zero and always release the skateboarder from a height of 5 meters.

Then modify the setup to make the jump safe, where “safe” is defined as converting less than 1/4 of the total energy into thermal energy.

Use your observations of the skateboarder’s motion to explain why reducing thermal energy transfer reduces the risk of injury.
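The task’s “safe” criterion is just energy bookkeeping: with the skateboarder released from rest, all of the energy starts as gravitational potential energy, and the jump is safe when less than 1/4 of it ends up as thermal energy. A minimal sketch of that check follows; the mass and thermal-energy values are made-up examples (the PhET simulation reports these quantities directly).

```python
# Energy bookkeeping behind the "safe jump" criterion.
# The mass and thermal-energy figures below are illustrative values only.
G = 9.8  # gravitational acceleration, m/s^2

def total_energy(mass_kg, release_height_m):
    """Released from rest, all energy starts as gravitational PE: m * g * h."""
    return mass_kg * G * release_height_m

def is_safe(thermal_energy_j, total_energy_j):
    """'Safe' per the task: less than 1/4 of the total energy becomes thermal."""
    return thermal_energy_j < total_energy_j / 4

# Example: a 60 kg skateboarder released from the required 5 m height.
E_total = total_energy(mass_kg=60, release_height_m=5)
print(E_total)                # 2940.0 J total; the safety threshold is 735 J
print(is_safe(600, E_total))  # True:  600 J < 735 J
print(is_safe(900, E_total))  # False: 900 J >= 735 J
```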

Credits:
The Skate Park simulation was developed by PhET.
PhET Interactive Simulations, University of Colorado Boulder, http://phet.colorado.edu.

The performance task was developed and designed by Victory Productions, Inc.
© 2015 Victory Productions, Inc. All Rights Reserved.

Test Drive a Science CEPA (Curriculum-Embedded Performance Assessment)

We continue to explore the changing landscape in STEM assessment. This 5-minute video gives a whirlwind tour of our prototype Science CEPA (Curriculum-Embedded Performance Assessment). We also invite you to test drive the Science CEPA here.

The Common Core Standards for Mathematical Practice and the Next Generation Science Standards (NGSS) both focus on developing practices. Why? One editor summed it up for me beautifully 25 years ago:

Imagine a basketball team that practiced every day but never played a real game. They could get pretty good, especially if they had 10 players to simulate a game. But what if you only let them read the playbook? That’s unfortunately the way we teach science—from a textbook!

If we want to grow scientists, we have to let them play the game, or at least practice. And what happens when you practice? You make mistakes…and learn from them. That’s how you get better.

So we intentionally designed this CEPA to allow students to make mistakes and then correct them. We model the iterative nature of science in Activity 1 (Satellite Photos) and Activity 2 (Evaluate and Revise an Experiment). This directly supports a number of the Science and Engineering Practices (SEPs) in NGSS.

One thing we have learned in our prototyping: it is essential to focus on one or two SEPs. Otherwise, the instruction and embedded assessment become too diluted. We realize there are many ways to develop a Curriculum-Embedded Performance Assessment in science, and that we are presenting just one kind of solution. You may have very different considerations to offer. Your feedback is appreciated! We invite you to join the conversation in the comments.


Writing TEI (Technology Enhanced Item)

Here is another TEI (Technology Enhanced Item) Victory has developed for English Language Arts. We are trying to push the envelope on interactivity while staying within the framework of the PARCC and SBAC assessments that students will soon encounter.

Please take 2 minutes to watch the video; then play with the prototype Writing TEI here.

Decisions, decisions… There are so many ways we could have developed this TEI!

Continue reading