
A mixed-reality gas lab

February 12th, 2013 by Charles Xie
In his Critique of Pure Reason, the Enlightenment philosopher Immanuel Kant asserted that “conception without perception is empty, perception without conception is blind. The understanding can intuit nothing, the senses can think nothing. Only through their unison can knowledge arise.” More than 200 years later, his wisdom is still enlightening our NSF-funded Mixed-Reality Labs project.

Mixed reality (more commonly known as augmented reality) refers to the blending of the real and virtual worlds to create environments where physical and digital objects co-exist and interact in real time, providing user experiences that are impossible in either world alone. Mixed reality is a perfect technology for promoting the unison of perception and conception: perception happens in the real world, whereas conception can be enhanced by the virtual world. By knitting the two worlds together, we can build a pathway that leads from perceptual experiences to conceptual development.

We have developed a prototype of mixed reality for teaching the Kinetic Molecular Theory and the gas laws using our Frame technology. This Gas Frame uses three types of sensors to translate user inputs into changes of variables in a molecular simulation on the computer:

  • A temperature sensor detects thermal changes in the real world and changes the temperature of the gas molecules in the virtual world;
  • A gas pressure sensor detects gas compression or decompression in the real world and changes the density of the gas molecules in the virtual world;
  • A force sensor detects force changes in the real world and changes the force on a piston in the virtual world.

Because of this underlying linkage with the real world through the sensors, the simulation appears "smart" enough to detect user actions and react in meaningful ways.
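The core of each linkage is just a calibrated mapping from a raw sensor reading to a simulation variable. Here is a minimal sketch of that idea; the function and class names are hypothetical (the actual Gas Frame uses Vernier sensors through a Java API), and the calibration ranges are illustrative:

```python
# Sketch of a sensor-to-simulation mapping (hypothetical API; the real
# Gas Frame reads Vernier sensors through a Java wrapper).

def map_range(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map a raw sensor reading into a simulation variable."""
    frac = (value - in_lo) / (in_hi - in_lo)
    frac = max(0.0, min(1.0, frac))  # clamp to the calibrated range
    return out_lo + frac * (out_hi - out_lo)

class GasFrame:
    def __init__(self, simulation):
        self.sim = simulation

    def poll(self, temp_c, pressure_kpa, force_n):
        # Temperature sensor -> temperature of the virtual gas molecules
        self.sim.temperature = map_range(temp_c, 0, 50, 100, 600)
        # Pressure sensor (syringe) -> density of the virtual gas
        self.sim.density = map_range(pressure_kpa, 80, 200, 0.2, 1.0)
        # Force sensor (spring) -> force on the virtual piston
        self.sim.piston_force = map_range(force_n, 0, 10, 0, 50)
```

Each call to `poll` pushes the latest real-world readings into the virtual world, which is what makes the simulation appear to react to the student's actions.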

Each sensor is attached to a physical object installed along the edge of the computer screen (see the illustration above). The temperature sensor is attached to a thermal contact area made of a highly conductive material, the gas pressure sensor to a syringe, and the force sensor to a spring that provides force feedback. These three physical objects contextualize the interactions in the real world. In this way, the Gas Frame not only creates the illusion that students are directly manipulating tiny gas molecules, but also builds a natural association between microscopic concepts and macroscopic perception. By uniting the actions of students in the real world with the reactions of the molecules in the virtual world, the Gas Frame provides an unprecedented way of learning a set of important concepts in physical science.

Pilot tests of the Gas Frame will begin at Concord-Carlisle High School this week and, in collaboration with our project partners Drs. Jennie Chiu and Jie Chao at the University of Virginia, will continue at several middle schools in Virginia shortly. Through this planned sequence of studies, we hope to understand the cognitive aspects of mixed reality, especially whether perceptual changes can lead to conceptual changes in this particular kind of setup.

Acknowledgements: My colleague Ed Hazzard made a beautiful wood prototype of the Frame (in which we can hide the messy wires and sensor parts). The current version of the Gas Frame uses Vernier's sensors and a Java API to their sensors developed primarily by Scott Cytacki. This work is made possible by the National Science Foundation.

Detecting students’ "brain waves" during engineering design using a CAD tool

December 12th, 2012 by Charles Xie
Design a city block with Energy3D.
We have spent the past two weeks in a school, working on a project that aims to understand how students learn engineering design. This has been a difficult research topic, as engineering design is an extremely complicated cognitive process that involves the application of science and mathematics -- two complicated subjects in their own right.


Two types of problems are commonly encountered in the classroom. The first is related to using a "cookbook" approach that confines students to step-by-step procedures to complete a "design" project. I added the quotation marks because this kind of project often leads to identical or similar products from students, violating the first principle of design, which mandates alternatives and variety. However, if we make the design project completely open-ended, we run into the second type of problem: the arbitrariness and caprice of student designs often make it difficult for teachers and researchers to assess student thinking and learning reliably. As much as we want students to be creative and open-minded, we also want to ensure that they learn what is intended, and we must provide an objective way to evaluate their learning outcomes.


To tackle these issues, we are taking a computer science-based approach. Computer-aided design (CAD) tools offer an opportunity to move the entire process of engineering design onto the computer (after all, this is what CAD tools were designed for in industry). What we need to do in our research is add a few more things to support data mining.

A sample design of the city block.
This blog post reports a timeline tool that we have developed to measure students' activity levels while they use a CAD tool (our Energy3D CAD software in this case) to solve a design challenge. The timeline tool is basically a logger that records, at a given frequency (say, 2-4 times a minute), the number of design actions the learner has taken during a design session. These design actions are defined as the "atomic" actions stored in the Undo Manager of the CAD tool. The timeline thus approximately describes the frequency of the user's construction actions with the CAD tool. Because human-computer interaction is ultimately driven by the brain, this kind of timeline data could be regarded as a reflection of the user's "brain wave."
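The logging idea is simple: sample the size of the undo stack at a fixed interval and record how many new atomic actions appeared since the last sample. A minimal sketch (the class is our illustration, not Energy3D's actual implementation):

```python
# Sketch of the timeline logger: at each sampling tick, record how many
# new "atomic" design actions have accumulated in the CAD tool's undo
# stack since the previous tick.

class ActionTimeline:
    def __init__(self):
        self.samples = []      # action counts, one per sampling interval
        self._last_count = 0   # undo-stack size at the previous tick

    def record(self, undo_stack_size):
        """Call every sampling interval (e.g. every 15-30 seconds)."""
        self.samples.append(undo_stack_size - self._last_count)
        self._last_count = undo_stack_size
```

For example, if the undo stack holds 3 actions at the first tick and 7 at the second, the timeline records the two samples 3 and 4; plotting `samples` against time gives the spike graphs discussed below.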

There are four things that characterize such a timeline graph:

A sample timeline graph.
  • The height of a spike measures the action intensity at that moment, i.e., how many actions the user has taken since the last recording;
  • The density of spikes measures the continuity and persistence of actions over a time period;
  • A gap indicates an off-task time window: a short idle window may be an effect of instruction or discussion;
  • The trend of height and density may be related to loss of interest or growing proficiency with the CAD tool: if the intensity (the combination of height and density of spikes) drops consistently over time, the student's interest may be fading; if it increases consistently, the student may be getting better at using the design tool to explore design options.
Timeline graphs from six students.
Of course, this kind of timeline data is not perfect, and it certainly has many limitations as a measure of learning. We are still analyzing the timeline data and juxtaposing them with other artifacts we have gathered from the students to build a more comprehensive picture of design learning. But the timeline analysis represents a first step toward a more rigorous methodology for performance assessment of engineering design.

The six "brain wave" graphs above were collected from six students during a 90-minute class period. Hopefully, these data will lead to a way to identify novice designers' behaviors and patterns as they solve a design challenge.
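The four characteristics listed above each reduce to a simple statistic over the recorded samples. A sketch of how one might compute them (the function and thresholds are ours, for illustration, not part of Energy3D):

```python
def timeline_stats(samples, gap_threshold=0):
    """Summarize a timeline of per-interval action counts."""
    n = len(samples)
    peak = max(samples)                        # tallest spike
    active = sum(1 for s in samples if s > gap_threshold)
    density = active / n                       # fraction of active intervals
    # Longest run of idle intervals = longest off-task gap.
    longest_gap = gap = 0
    for s in samples:
        gap = gap + 1 if s <= gap_threshold else 0
        longest_gap = max(longest_gap, gap)
    # Crude trend: mean of the second half minus mean of the first half.
    half = n // 2
    trend = sum(samples[half:]) / (n - half) - sum(samples[:half]) / half
    return {"peak": peak, "density": density,
            "longest_gap": longest_gap, "trend": trend}
```

A positive `trend` would suggest intensifying activity (possibly growing proficiency), a negative one fading engagement; of course, as noted above, such crude statistics would need to be juxtaposed with other evidence before drawing conclusions about any individual student.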

A theory of multisensory learning for IR visualization of hands-on experiments

July 1st, 2011 by Charles Xie

I have been "shopping" for a learning theory that can frame the value added by IR visualization to hands-on experiments. Here is a candidate theory.

There are four learning pathways to the brain: visual, auditory, kinesthetic, and tactile. The theory holds that memory and learning can be enhanced when multiple pathways are engaged simultaneously.

Let's look at a notorious misconception in heat and temperature. Many people believe that metals are colder than wood or paper. This misconception cannot be easily dispelled because that is how they feel through the sense of touch. As heat transfer is invisible, the tactile experience is all they have.

Now, what if the heat transfer process could be visualized? In other words, what if students could have a multisensory learning experience, feeling and seeing it at the same time? IR imaging has enabled us to design such an experiment. The image above shows an IR view comparing heat flow from hands through paper and metal.

Recent studies by Swedish scholars, including Konrad J. Schönborn, whom I ran into at a conference and who was enticed by my IR magic, showed that adding haptics to visualization can improve student learning of biomolecular interactions such as docking. Visual and tactile sensorimotor interactions can reinforce each other in the cognitive process. In this case, the visualization can "correct" an erroneous idea gained through touch: the IR image shows that the metal is actually warmer than the paper, creating a contradiction with the tactile input that students must reconcile.

Konrad said he would investigate this through a cognitive experiment with students at his university in Sweden. I was psyched. This study would complement what he has already done: in this case, visualization augments touch, exactly the opposite of his prior research on molecular binding, in which haptics augmented visualization.