Technology allows us to record almost everything that happens in the classroom. The fact that students' interactions with learning environments can be logged in every detail raises an interesting question: Is there significant meaning and value in those data, and how can we use them to help students and teachers? A report sponsored by the U.S. Department of Education makes the point:
“New technologies thus bring the potential of transforming education from a data-poor to a data-rich enterprise. Yet while an abundance of data is an advantage, it is not a solution. Data do not interpret themselves and are often confusing — but data can provide evidence for making sound decisions when thoughtfully analyzed.” — Expanding Evidence Approaches for Learning in a Digital World, Office of Educational Technology, U.S. Department of Education, 2013
|A digraph of action transitions|
|A histogram of action intensity|
Here we are not talking about just analyzing students' answers to multiple-choice questions, their scores on quizzes and tests, or how often they log into a learning management system. We are talking about something much more fundamental, something that runs deep in cognition and learning, such as how students conduct a scientific experiment, solve a problem, or design a product. As learning goes deeper in those directions, the data students produce grow bigger. It is by no means an easy task to analyze large volumes of learner data, which contain many noisy elements that cast uncertainty on assessment. The validity of an assessment inference rests on the strength of evidence, and evidence construction often relies on the search for relations, patterns, and trends in student data. With a lot of data, this calls for sophisticated computation similar to cognitive computing.
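To make the idea of mining timestamped learner data concrete, here is a minimal Python sketch that bins logged actions into per-minute counts, the kind of computation behind an action intensity histogram. The log format and action names are hypothetical illustrations, not actual Energy3D log output.

```python
from collections import Counter
from datetime import datetime

# Hypothetical log format: (ISO timestamp, action name) pairs.
# Real logs differ; this only illustrates turning a raw event
# stream into a coarse "action intensity" trend over time.
log = [
    ("2013-04-01T09:00:05", "Add Wall"),
    ("2013-04-01T09:00:12", "Edit Wall"),
    ("2013-04-01T09:03:40", "Add Window"),
    ("2013-04-01T09:04:02", "Rotate Building"),
]

def intensity_per_minute(entries):
    """Count how many actions fall into each one-minute bin."""
    bins = Counter()
    for timestamp, _action in entries:
        t = datetime.fromisoformat(timestamp)
        bins[t.replace(second=0, microsecond=0)] += 1
    return dict(sorted(bins.items()))

for minute, count in intensity_per_minute(log).items():
    print(minute.isoformat(), "#" * count)  # crude text histogram
```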
Data gathered from highly open-ended inquiry and design activities, which are key to the authentic science and engineering practices we want students to learn, are often intensive and “messy.” Without analytic tools that can discern systematic learning from a random walk, researchers and teachers are left with nothing but a DRIP (“data rich, information poor”) problem.
|A polar chart of problem space exploration|
Recognizing the difficulty of analyzing the sheer volume of messy student data, we turned to visual analytics, a category of techniques used extensively in cutting-edge business intelligence systems such as the software developed by SAS and others. We see interactive, visual process analytics as key to accelerating the analysis procedure so that researchers can adjust mining rules easily, view results rapidly, and identify patterns clearly. This kind of visual analytics optimally combines the computational power of the computer, the graphical user interface of the software, and the pattern recognition power of the brain to support complex data analyses in data-intensive educational research.
|A scatter plot of action timeline|
Within a week, I was able to write four interactive graphs and charts for studying four different aspects of the design action data that we collected from our Energy3D CAD software. Recording several weeks of student work on complex engineering design challenges, these datasets are high-dimensional, meaning that it is improper to treat them from a single point of view. For each question we want student data to answer, we usually need a different representation to capture the salient features specific to that question.
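As one illustration of the kind of representation behind the action transition digraph shown above, here is a minimal Python sketch that tallies directed transitions between consecutive design actions. The session sequence and action names are hypothetical, not actual Energy3D data.

```python
from collections import Counter

def transition_counts(actions):
    """Tally directed transitions between consecutive actions.

    The result is the weighted edge list of a digraph: an edge
    (a, b) counts how often action b immediately followed action a.
    """
    return Counter(zip(actions, actions[1:]))

# Hypothetical sequence of design actions from one student session.
session = ["Add Wall", "Edit Wall", "Add Window", "Edit Wall",
           "Add Window", "Analyze Energy", "Edit Wall"]

for (src, dst), weight in transition_counts(session).most_common():
    print(f"{src} -> {dst}: {weight}")
```

Drawing such edge weights as a digraph makes recurring action cycles, such as a student alternating between editing and analyzing, visible at a glance.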