Dashboard helps teachers understand student progress and performance in genetics game

Our dragon genetics games have engaged thousands of students for many years. In that time, teachers have asked for an easy way to track their students’ progress and performance. Until now, teacher reports have been difficult to pull out of our system and impossible to parse in real time. The GeniGUIDE project, in partnership with North Carolina State University, is developing a teacher dashboard to accompany our new Geniventure software. We are currently piloting the beta version of this dashboard in multiple classrooms in Maine, North Carolina, New Jersey, and Massachusetts.

“A dashboard is a visual display of the most important information needed to achieve one or more objectives that has been consolidated on a single computer screen so it can be monitored at a glance.” – Stephen Few

Our dashboard displays information processed by an Intelligent Tutoring System (ITS) integrated into Geniventure. As students complete challenges in the game, they are rewarded with different color crystals for their accomplishments (Figure 1). Students who complete a challenge efficiently and without mistakes receive a blue-green crystal. Those who make a small number of missteps receive a yellow crystal, while those with more mistakes receive a red one. A black “try again” crystal is given to a student who makes too many mistakes to move on. As students level up through the missions, the ITS builds a model of each student’s conceptual understanding of specific learning goals. As student performance on these concepts improves over time, the evidence that they have a solid understanding grows stronger.

Figure 1. Student view within Geniventure of the colored crystals (bottom of screen).
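For readers curious about the underlying logic, the sketch below shows one way a mistake count could map to a crystal color, following the description above. The thresholds (and the simplification that mistakes alone decide the color) are illustrative assumptions, not Geniventure’s actual scoring rules, which also weigh how efficiently a challenge was completed.

```typescript
// A minimal sketch of crystal awarding, assuming color depends only on
// mistake count. The thresholds are invented for illustration and are
// not Geniventure's actual values.
type Crystal = "blue-green" | "yellow" | "red" | "black";

function awardCrystal(mistakes: number): Crystal {
  if (mistakes === 0) return "blue-green"; // efficient, mistake-free run
  if (mistakes <= 2) return "yellow";      // a small number of missteps
  if (mistakes <= 5) return "red";         // more mistakes
  return "black";                          // "try again": too many to move on
}
```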

Our preliminary teacher dashboard design (Figure 2) was guided by three factors. First, we looked back at our many years of classroom observations of teachers who implemented our suite of dragon genetics games—from our most recent Geniverse to GeniGames and BioLogica—and asked: What information could have helped teachers better facilitate student use of the game? Second, we examined recent dashboard designs implemented in prior Concord Consortium projects to help us distinguish between in-class and after-class use. Finally, we looked at other teacher dashboards that are currently available on the market.

Figure 2. Beta version of Geniventure teacher dashboard.

During the pilot testing, we’re closely observing how teachers use the primary view of the dashboard, which provides information on both student progress and performance during class time. We hope to answer the following questions:

  • Can the teacher adequately track student progress through the game?
  • When do teachers intervene and when do they allow students to struggle? (Do teachers first help those students with black or red crystals?)
  • Do teachers look at how many attempts a student made at a challenge?
  • If teachers notice that particular students are ahead of the class, what actions do they take?

The dashboard also displays a graphical representation of student understanding of genetics concepts highlighted in the game. Some concepts are directly related to specific student actions (e.g., two recessive alleles are required to produce a recessive trait), while others are calculated based on performance across certain challenge types (e.g., genotype to phenotype mapping). The teacher can delve deeper into these secondary reports to view not only individual student data (Figure 3), but also aggregated class data (Figure 4); a simple sketch of this aggregation appears after Figure 4 below. Through classroom observations and interviews with teachers, we hope to determine:

  • Do teachers have the time and bandwidth to make sense of the concept understanding graphs during class?
  • To what extent do the concept graphs help teachers understand where individual students, or the entire class, are having trouble?
  • What action, if any, do teachers take based on the concept graphs?

Figure 3. Display of individual student’s conceptual understanding.

Figure 4. Representation of class average conceptual understanding.
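To give a feel for how the class view in Figure 4 can be derived from the individual data in Figure 3, here is a minimal sketch that averages per-student concept scores. The 0-to-1 score scale and the data shapes are assumptions made for illustration; the ITS model behind the real dashboard is richer than a simple average.

```typescript
// Hypothetical per-student concept scores (0 = weak, 1 = strong evidence
// of understanding), averaged into a class-level view as in Figures 3 and 4.
type ConceptScores = Map<string, number>; // concept id -> evidence score

function classAverages(students: ConceptScores[]): ConceptScores {
  const totals = new Map<string, { sum: number; n: number }>();
  for (const scores of students) {
    for (const [concept, score] of scores) {
      const t = totals.get(concept) ?? { sum: 0, n: 0 };
      t.sum += score;
      t.n += 1;
      totals.set(concept, t);
    }
  }
  const averages: ConceptScores = new Map();
  for (const [concept, { sum, n }] of totals) {
    averages.set(concept, sum / n);
  }
  return averages;
}

// Example with invented data:
const alice: ConceptScores = new Map([["recessive-alleles", 0.9], ["geno-to-pheno", 0.4]]);
const bob: ConceptScores = new Map([["recessive-alleles", 0.5], ["geno-to-pheno", 0.6]]);
console.log(classAverages([alice, bob]));
// Map { "recessive-alleles" => 0.7, "geno-to-pheno" => 0.5 }
```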

As our ITS becomes more sophisticated, we plan to expand the set of concepts we track and make better use of student data to inform teachers.

How do you make use of dashboards? Let us know what features you’d like to see as we improve our ITS-enhanced dashboard.

Lights, camera, action: A video that introduces the NGSS practice of scientific argumentation

Following the recommendation to incorporate the Next Generation Science Standards (NGSS) science and engineering practices in their classrooms, schools across the country are looking for ways to integrate scientific argumentation into their curriculum. Since 2012, the High-Adventure Science project, in collaboration with National Geographic Education, has offered free online modules on Earth and space science topics—including climate change, freshwater availability, the future of energy sources, air quality, land management, and the search for life in the universe—that include multiple opportunities for students to engage in argument from evidence.

Over 67,000 teachers and students across the globe have used High-Adventure Science modules. Based on teacher feedback, classroom observations, and analysis of student data, we have learned that when students engage in argumentation from data and model-based evidence, they need considerable support in learning how to write a convincing argument.

Last year, we added an introductory activity to each module in which students learn about the component parts of a scientific argument before they are asked to write one. In this highly scaffolded task, students see written examples of a claim and explanation and learn about uncertainty in scientific data and how to express it. In High-Adventure Science, argumentation takes a special form: a structured multiple-choice claim, an open-ended explanation, a five-point Likert scale uncertainty rating, and an uncertainty rationale.

In this introductory activity, students learn about the components of a good explanation.
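To make the four-part structure concrete, here is one way such an argumentation item could be represented in code. The field names and example values are illustrative assumptions, not the actual High-Adventure Science data model.

```typescript
// Illustrative representation of the four-part argumentation item; field
// names and example values are hypothetical.
interface ArgumentationItem {
  claim: string;                        // selected from multiple-choice options
  explanation: string;                  // open-ended, citing data or model-based evidence
  uncertaintyRating: 1 | 2 | 3 | 4 | 5; // five-point Likert scale
  uncertaintyRationale: string;         // why the student is (or isn't) certain
}

const example: ArgumentationItem = {
  claim: "Groundwater levels in the region will continue to decline.",
  explanation: "In the model, withdrawals exceeded recharge in every scenario I ran.",
  uncertaintyRating: 4,
  uncertaintyRationale: "The model does not account for future changes in rainfall.",
};
```

Keeping all four parts together in one record makes it easy to look for patterns such as highly certain claims paired with weak explanations.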

Even with this new activity, some students still struggled, so we recently created an animated video to introduce the scientific practice of developing an argument. We start by helping students identify the difference between a scientific argument and the so-called “arguments” they may have with their friends (e.g., arguing about favorite ice cream flavors!), and by distinguishing between claims backed by evidence and claims based on opinion. The goal is to introduce students to scientific arguments in a fun and relatable way and to make the terminology and process of scientific argumentation less daunting.

We’re piloting the video in our Will there be enough fresh water? module for select students. We’re looking forward to student and teacher feedback and may revise the video based on their comments. We want everyone to be able to engage in the critical practice of arguing from evidence.

We welcome your comments about our video, as well as your challenges and successes with incorporating the NGSS practice of engaging in argument from evidence.