Personal thermal vision could turn millions of students into the cleantech workforce of today

So we have signed the Paris Agreement and cheered about it. Now what?

More than a year ago, I wrote a proposal to the National Science Foundation to test the feasibility of empowering students to help combat the energy issues of our nation. There are hundreds of millions of buildings in our country, and some of them are pretty big energy losers. The home energy industry currently employs probably 100,000 people at most. It would take them a few decades to weatherize and solarize all these residential and commercial buildings (let alone educate homeowners so that they would take such actions).

But there are millions of students in schools who are probably more likely to be concerned about the world that they are about to inherit. Why not ask them to help?

You probably know a lot of projects with this very same mission. But I want to do something different. Enough messaging has been done. We don't need to hand out more brochures and flyers about the environmental issues that we may be facing. It is time to call for action!

For a number of years, I have been working on infrared thermography and building energy simulation to knock down the technical barriers that these techniques may pose to children. With NSF awarding us a $1.2M grant last year and FLIR releasing a series of inexpensive thermal cameras, the time to bring these tools to large-scale applications in schools has finally arrived.

For more information, see our poster that will be presented at an NSF meeting next week. Note that this project has just begun, so we haven't had a chance to test the solarization part. But the results from the weatherization part based on infrared thermography have been extremely encouraging!

Listen to the data with Visual Process Analytics


Visual analytics provides a powerful way for people to see patterns and trends in data. But in real life, we use both our eyes and ears. Can we also hear patterns and trends if we listen to the data?

I spent a few days studying the JavaScript Sound API and adding simple data sonification to our Visual Process Analytics (VPA) to explore this question. I don't know where adding the auditory sense to the analytics toolkit may lead us, but you never know. It is always good to experiment with various ideas.


Note that the data sonification capabilities of VPA are very experimental at this point. To make matters worse, I am not a musician by any stretch of the imagination. So the sounds generated in the latest version of VPA may sound horrible to you. But this represents a step forward to richer interactions with complex learner data. As my knowledge about music improves, the data should sound less terrifying.

The first test feature added to VPA is very simple: It just converts a time series into a sequence of notes and rests. To adjust the sound, you can change a number of parameters such as pitch, duration, attack, decay, and oscillator types (sine, square, triangle, sawtooth, etc.). All these options are available through the context menu of a time series graph.
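VPA itself runs in JavaScript, but the core mapping of this first feature is language-neutral. Here is a minimal sketch in Python of the idea; the function name, the pitch range, and the treatment of missing points as rests are my own choices for illustration, not VPA's actual code:

```python
def series_to_notes(values, low_midi=48, high_midi=84):
    """Map a time series onto musical notes.

    Each value is scaled linearly onto a MIDI pitch range
    (defaults are arbitrary); None entries become rests.
    Returns a list of (midi_note, frequency_hz) tuples or None.
    """
    present = [v for v in values if v is not None]
    vmin, vmax = min(present), max(present)
    span = (vmax - vmin) or 1.0
    notes = []
    for v in values:
        if v is None:
            notes.append(None)  # a rest
            continue
        midi = round(low_midi + (v - vmin) / span * (high_midi - low_midi))
        freq = 440.0 * 2 ** ((midi - 69) / 12)  # A4 = MIDI 69 = 440 Hz
        notes.append((midi, round(freq, 2)))
    return notes

# Lowest value -> lowest pitch, highest -> highest, None -> rest
print(series_to_notes([0.0, None, 1.0]))
```

The envelope parameters mentioned above (attack, decay, oscillator type) would then be applied when each frequency is handed to the synthesizer.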

As the sound plays, you can also watch a synchronized animation in VPA (as demonstrated by the embedded videos). This means that from now on VPA is a multimodal analytics tool. But I have no plan to rename it, as data visualization is still, and will remain, the dominant mode of the data mining platform.

The next step is to figure out how to synthesize better sounds from multiple types of actions as multiple sources or instruments (much like the Song from Pi). I will start with sonifying the scatter plot in VPA. Stay tuned.

What’s new in Visual Process Analytics Version 0.3


Visual Process Analytics (VPA) is a data mining platform that supports research on how students learn by using complex tools to solve complex problems. The complexity of such learning activities entails complex process data (e.g., event logs) that cannot be easily analyzed. This difficulty calls for data visualization that can at least give researchers a glimpse of the data before they conduct in-depth analyses. To this end, the VPA platform provides many different types of visualization that represent many different aspects of complex processes. These graphic representations should help researchers develop some intuition about the data. We believe VPA is an essential tool for data-intensive research, which will only grow more important as data mining, machine learning, and artificial intelligence play critical roles in effective, personalized education.

Several new features were added to Version 0.3, described as follows:

1) Interactions are provided through context menus. Context menus can be invoked by right-clicking on a visualization. Depending on where the user clicks, a context menu provides the available actions applicable to the selected objects. This allows a complex tool such as VPA to still have a simple, pleasant user interface.

2) Result collectors allow users to gather analysis results and export them in the CSV format. VPA is a data browser that allows users to navigate in the ocean of data from the repositories it connects to. Each step of navigation invokes some calculations behind the scenes. To collect the results of these calculations in a mining session, VPA now has a simple result collector that automatically keeps track of the user's work. A more sophisticated result manager is also being conceptualized and developed to make it possible for users to manage their data mining results in a more flexible way. These results can be exported if needed to be analyzed further using other software tools.
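To illustrate the idea (this is a hypothetical sketch, not VPA's actual implementation, and every name in it is invented), a minimal result collector only needs to accumulate one row per navigation step and serialize the rows as CSV:

```python
import csv
import io

class ResultCollector:
    """Hypothetical sketch of a result collector: records the outcome
    of each analysis step and exports everything as CSV text."""

    def __init__(self, columns):
        self.columns = columns
        self.rows = []

    def record(self, **result):
        # Keep only the declared columns; missing values become blanks.
        self.rows.append([result.get(c, "") for c in self.columns])

    def to_csv(self):
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(self.columns)
        writer.writerows(self.rows)
        return buf.getvalue()

collector = ResultCollector(["student", "action_count"])
collector.record(student="s01", action_count=42)
print(collector.to_csv())
```

The exported text can then be opened in any spreadsheet or statistics package for further analysis.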

3) Cumulative data graphs are available to render a more dramatic view of time series. It is sometimes easier to spot patterns and trends in cumulative graphs. This cumulative analysis applies to all levels of granularity of data supported by VPA (currently, the three granular levels are Top, Medium, and Fine, corresponding to three different ways to categorize action data). VPA also provides a way for users to select variables from a list to be highlighted in cumulative graphs.
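The cumulative view is simply a running sum of the per-interval counts, which turns a noisy series into a monotonic curve whose slope changes are easy to spot. For instance (with made-up action counts):

```python
from itertools import accumulate

# Hypothetical per-interval action counts at one granularity level
counts = [3, 0, 5, 2, 0, 0, 7]

# The monotonic curve plotted in a cumulative graph
cumulative = list(accumulate(counts))
print(cumulative)  # [3, 3, 8, 10, 10, 10, 17]
```

Flat stretches in the cumulative curve reveal periods of inactivity at a glance.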

Many other new features were also added in this version. For example, additional information about classes and students is provided to contextualize each data set. In the coming weeks, the repository will incorporate data from more than 1,200 students in Indiana who have undertaken engineering design projects using our Energy3D software. This unprecedented large-scale database will potentially provide a goldmine of research data in the area of engineering design.

For more information about VPA, see my AERA 2016 presentation.

High-Adventure Science Partnership with National Geographic Education

We are excited to announce that the Concord Consortium’s High-Adventure Science modules are now available on the National Geographic Education website, thanks to a National Science Foundation-funded partnership with National Geographic Education. High-Adventure Science modules have been used by thousands of students so far, and we welcome the opportunity to share our modules with a wider audience of middle and high school teachers and students. All modules will continue to be available on the High-Adventure Science website.

High-Adventure Science: Bringing contemporary science into the classroom

Each week-long High-Adventure Science module is built around an important unanswered question in Earth or environmental science; topics include fresh water availability, climate change, the future of our energy sources, air quality, land management, and the search for life in the universe.

Throughout each module, students learn about the framing question, experiment with interactive computer models, analyze real world data, and attempt to answer the same questions as research scientists. We don’t expect that students will be able to answer the framing questions at the end of the module (after all, scientists are still working to answer them!); rather, we want to engage students in the process of doing science, building arguments around evidence and data and realizing that not knowing the answers (uncertainty) drives scientific progress.

To that end, each module (and associated pre- and post-tests) contains several scientific argumentation item sets. The argumentation item set, with multiple-choice and open-ended questions, prompts students to consider the strengths and weaknesses of the provided data (graphs, models, tables, or text). Our research has shown that, after using High-Adventure Science modules, students improve both their understanding of the science content and their scientific argumentation skills. Register for a free account on the High-Adventure Science portal for access to pre- and post-tests.

Expanded teacher resources through National Geographic Education

Partnering with National Geographic Education has allowed us to provide more support for teachers. On the National Geographic Education website, you’ll find in-depth teaching tips, background information, vocabulary definitions, and links to the standards (NSES, Common Core, ISTE, and NGSS) to which our curricula are aligned. Additionally, each module is linked to related resources in the National Geographic catalog, greatly expanding the resources available to both teachers and students.

Teachers have been excited about the models, real world data, and the argumentation prompts that get students to focus on the evidence when making a scientific claim. (You can hear directly from one of the High-Adventure Science field test teachers at NSTA!)

Come see us at NSTA in Nashville, TN, this week! Stop by the National Geographic booth or come to a presentation about using High-Adventure Science modules in your classroom:

  • “High-Adventure Science: Free Simulations Exploring Earth’s Systems and Sustainability” on Thursday, March 31, from 12:30-1:00 PM in Music City Center, 106A
  • “Integrating Literacy Standards in Science” on Sunday, April 3, from 8:00-9:00 AM in Music City Center, 209A

 

Infrared imaging evidence of geothermal energy in a basement

Geothermal energy is the thermal energy generated and stored in the Earth. The ground maintains a nearly constant temperature six meters (20 feet) down, which is roughly equal to the average annual air temperature at the location. In Boston, this is about 13 °C (55 °F).

You can feel the effect of geothermal energy in a basement, particularly on a hot summer day, when the basement can be significantly cooler. But IR imaging provides a unique visualization of this effect.

I happen to have a sub-basement that is partially buried in the ground. When I did an IR inspection of my basement on a cold night in an attempt to identify places where heat escapes, something that I did not expect struck me: As I scanned the basement, the whole basement floor appeared to be 4-6 °F warmer than the walls. Both the floor and walls of my basement are bare concrete -- there is no insulation -- but the walls are partially or fully exposed to the outside air, which was about 24 °F at that time.

This temperature distribution pattern is opposite to the typical temperature gradient observed in a heated room where the top of a wall is usually a few degrees warmer than the bottom of a wall or the floor as hot air rises to warm up the upper part.

The only explanation for this warmth of the basement floor is geothermal energy, caught by the IR camera.

Visualizing thermal equilibration: IR imaging vs. Energy2D simulation

Figure 1
A classic experiment to show thermal equilibration is to put a small Petri dish filled with some hot or cold water into a larger one filled with tap water around room temperature, as illustrated in Figure 1. Then stick one thermometer in the inner dish and another in the outer dish and take their readings over time.

With a low-cost IR camera like the FLIR C2 camera or FLIR ONE camera, this experiment becomes much more visual (Figure 2). As an IR camera provides a full-field view of the experiment in real time, you get much richer information about the process than a graph of two converging curves from the temperature data read from the two thermometers.
Figure 2

The complete equilibration process typically takes 10-30 minutes, depending on the initial temperature difference between the water in the two dishes and the amount of water in the inner dish. A larger temperature difference or a larger amount of water in the inner dish will require more time to reach the thermal equilibrium.

Another way to quickly show this process is to use our Energy2D software to create a computer simulation (Figure 3). Such a simulation provides a visualization that resembles the IR imaging result. The advantage is that it runs very fast -- only 10 seconds or so are needed to reach the thermal equilibrium. This allows you to test various conditions rapidly, e.g., changing the initial temperature of the water in the inner dish or the outer dish or changing the diameters of the dishes.
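The two converging temperature curves can even be sketched with a lumped two-body model, far simpler than Energy2D's full heat-transfer solver: heat flows between the dishes at a rate proportional to their temperature difference. The conductance and heat capacities below are made-up values for illustration:

```python
def equilibrate(t_inner, t_outer, c_inner, c_outer, k=0.5, dt=1.0, steps=600):
    """Lumped two-body thermal equilibration.

    k is an assumed conductance in J/(s*degC); c_inner and c_outer are
    heat capacities in J/degC. Returns the temperature history.
    """
    history = [(t_inner, t_outer)]
    for _ in range(steps):
        q = k * (t_inner - t_outer) * dt  # heat moved this step (J)
        t_inner -= q / c_inner
        t_outer += q / c_outer
        history.append((t_inner, t_outer))
    return history

# Hot water in the small inner dish, room-temperature water outside
curves = equilibrate(60.0, 20.0, c_inner=40.0, c_outer=400.0)
print(curves[-1])  # both temperatures converge near the weighted mean
```

Doubling the inner heat capacity (more water in the inner dish) or widening the initial temperature gap stretches out the equilibration time, just as observed in the real experiment.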

Figure 3
Both real-world experiments and computer simulations have their own pros and cons. Which one to use depends on your situation. As a scientist, I believe nothing beats real-world experiments in supporting authentic science learning, and we should favor them whenever possible. However, conducting real-world experiments requires a lot of time and resources, which makes it impractical to do so throughout a course. Computer simulations provide an alternative that gives students a sense of real-world experiments without the time and cost. The downside is that a computer simulation is, most of the time, an overly simplified scientific model that lacks the many layers of complexity and the many types of interactions that we experience in reality. In a real-world experiment, there are always unexpected factors and details that need to be attended to. It is these unexpected factors and details that create genuinely profound and exciting teachable moments. This important aspect of science is largely missing in computer simulations, even with a sophisticated computational fluid dynamics tool such as Energy2D.

Here is my balancing of this trade-off equation: It is essential for students to learn simplified scientific models before they can explore complex real-world situations. The models will give students the frameworks needed to make sense of real-world observation. A fair strategy is to use simulations to teach simplified models and then make some time for students to conduct experiments in the real world and learn how to integrate and apply their knowledge about the models to solve real problems.

A side note: You may be wondering how well the Energy2D result agrees with the IR result quantitatively. This is an important question -- if the simulation is not a good approximation of the real-world process, it is not a good simulation and one may challenge its usefulness, even for learning purposes. Figure 4 shows a comparison from a test run. As you can see, while the result predicted by Energy2D agrees in trend with the results observed through IR imaging, there are some features in the real data that may be caused by either human error in taking the data or thermal fluctuations in the room. What is more, after thermal equilibrium was reached, the water in both dishes continued to cool to room temperature and then below due to evaporative cooling. The cooling to room temperature was modeled in the Energy2D simulation through a thermal coupling to the environment, but evaporative cooling was not.

Figure 4

An infrared investigation on a Stirling engine

Figure 1
The year 2016 marks the 200th anniversary of an important invention of Robert Stirling -- the Stirling engine. So I thought I should start this year's blogging with a commemoration article about this truly ingenious invention.

A Stirling engine is a closed-cycle heat engine that operates by cyclic compression and expansion of air or other gas by a temperature difference across the engine. A Stirling engine is able to convert thermal energy into mechanical work.

You can buy an awesome toy Stirling engine from Amazon (perhaps next Christmas's gift for some inquisitive minds). If you put it on top of a cup of hot water, this amazing machine will just run until the hot water cools down to room temperature.

Figure 2
Curious about whether the Stirling cycle would actually accelerate the cooling process, I poured hot water into two identical mugs and covered one of them with the Stirling engine. Then I started the engine and observed what happened to the temperatures through an IR camera. It turned out that the mug covered by the engine maintained a temperature about 10 °C higher than the open mug over about 30 minutes of observation. If you have a chance to do this experiment, you would probably be surprised. The flywheel of the Stirling engine seems to announce that the engine is working very hard, spinning fast and making a lot of noise. But all that energy, visible and audible as it is, is no match for the thermal energy lost through evaporation of water from the open mug (Figure 1).

How does the engine's energy draw compare with plain heat transfer? I found a metal box of approximately the same size and thickness as our Stirling engine. I refilled the two mugs with hot water and covered one with the metal box and the other with the Stirling engine. Then I started the engine and tracked their temperatures through the IR camera. It turned out that the rates of heat loss from the two mugs were about the same over roughly 30 minutes of observation. What this really means is that the energy that drove the engine was very small compared with the thermal energy lost to the environment through heat transfer (Figure 2).

This is understandable because the speed of the flywheel is only a small fraction of the average speed of air molecules (which is about the speed of sound or higher). This investigation also suggests that the Stirling engine is very efficient. Had we insulated the mug, it would have run for hours.
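As a rough check of that claim about molecular speeds, kinetic theory gives the root-mean-square speed of air molecules at room temperature:

```python
import math

R = 8.314      # J/(mol*K), universal gas constant
M_AIR = 0.029  # kg/mol, mean molar mass of air
T = 293.0      # K, room temperature

# Kinetic theory: v_rms = sqrt(3RT/M)
v_rms = math.sqrt(3 * R * T / M_AIR)
print(round(v_rms))  # ~500 m/s, well above the ~343 m/s speed of sound
```

Next to speeds of hundreds of meters per second at the molecular level, the rim speed of a toy flywheel is indeed tiny.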

Chemical imaging using infrared cameras

Figure 1: Evaporative cooling
Scientists have long relied on powerful imaging techniques to see things invisible to the naked eye and thus advance science. Chemical imaging is a technique for visualizing chemical composition and dynamics in time and space as actual events unfold. In this sense, infrared (IR) imaging is a chemical imaging technique, as it allows one to see temporal and spatial changes of temperature distribution and, just like in other chemical imaging techniques, infer what is occurring at the molecular level based on this information.

Figure 2: IR imaging
Most IR cameras are sensitive enough to pick up a temperature difference of 0.1°C or less. This sensitivity makes it possible to detect certain effects from the molecular world. Figure 1 provides an example that suggests this possibility.

This experiment, which concerns the evaporation of water, could not be simpler: Just pour some room-temperature water into a plastic cup, leave it for a few hours, and then aim an IR camera at it. In stark contrast to the thermal background, the whole cup remains 1-2 °C cooler than the room (Figure 2). How much evaporation is enough to keep the cup this cool? Let's do a simple calculation. Our measurement showed that in a typical dry and warm office environment in the winter, a cup of water (10 cm in diameter) loses approximately six grams of water in 24 hours. That is to say, the evaporation rate is 7×10⁻⁵ g/s, or 7×10⁻¹¹ m³/s. Dividing by the area of the cup mouth, which is 0.00785 m², we find that the layer of water that evaporates each second is only about 8.9 nm thick—roughly the length of 30 water molecules lined up shoulder to shoulder! It is amazing that the evaporation of this tiny amount of water at such a slow rate (a second is a very long time for molecules) suffices to sustain a temperature difference of 1-2 °C for the entire cup.
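The arithmetic is easy to verify, starting from the measured 6 g lost per 24 hours:

```python
import math

MASS_PER_DAY = 6.0   # g of water lost in 24 hours (measured)
DENSITY = 1.0e6      # g/m^3 for liquid water
DIAMETER = 0.10      # m, diameter of the cup mouth

rate_g = MASS_PER_DAY / 86400.0        # ~7e-5 g/s
rate_m3 = rate_g / DENSITY             # ~7e-11 m^3/s
area = math.pi * (DIAMETER / 2) ** 2   # ~0.00785 m^2

# Thickness of the water layer that evaporates each second
layer_nm = rate_m3 / area * 1e9
print(round(layer_nm, 1))  # ~8.8 nm (using the rounded 7e-11 m^3/s gives 8.9)
```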

This simple experiment actually raises more questions than it answers. Based on the latent heat of vaporization of water, which is about 2265 J/g, we estimate that the rate of energy loss through evaporation is only 0.16 J/s. This rate of energy loss should have a negligible effect on the 200 g of water in the cup, as the specific heat of water is 4.186 J/(g·°C). So where does this cooling effect come from? How does it persist? Would the temperature of the water be even lower if there were less water in the cup? What would the temperature difference be if the room temperature changed? These questions pose great opportunities to engage students in proposing hypotheses and testing them with more experiments. It is through the quest for the answers that students learn to think and act like scientists.
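The energy side of the estimate can be checked the same way, using the measured 6 g per day and the constants quoted in the text:

```python
LATENT_HEAT = 2265.0    # J/g, latent heat of vaporization of water
SPECIFIC_HEAT = 4.186   # J/(g*degC), specific heat of liquid water
WATER_MASS = 200.0      # g of water in the cup

evap_rate = 6.0 / 86400.0          # g/s, measured loss of 6 g per day
power = evap_rate * LATENT_HEAT    # J/s drawn away by evaporation

# If this power were the only heat flow, the bulk water would cool
# at well under a degree per hour -- hence the puzzle in the text.
cooling_per_hour = power / (WATER_MASS * SPECIFIC_HEAT) * 3600.0
print(round(power, 2), round(cooling_per_hour, 2))
```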

IR imaging is an ideal tool for guided inquiry, as it eliminates tedious data collection procedures and focuses students on data analysis. In the practice of inquiry, data analysis is viewed as more important than data collection in helping students develop their thinking skills and conceptual understanding. Although this cooling effect can also be investigated using a thermometer, students' perception might be quite different. An IR camera immediately shows that the entire cup, not just the water surface, is cooler. Seeing the bulk of the cup in blue may prompt students to think more deeply and invite new questions, whereas a single temperature reading from a thermometer may not deliver the same experience.

Scientists use Energy2D to simulate the effect of micro flow on molecular self-assembly

Copyright: ACS Nano, American Chemical Society
Self-assembled peptide nanostructures have unique properties that lead to applications in electrical devices and functional molecular recognition. Exactly how to control the self-assembly process in a solution is a hot research topic. Since a solution is a fluid, a little fluid mechanics would be needed to understand how micro flow affects the self-assembly of the peptide molecules.

ACS Nano, a journal of the American Chemical Society, published a research article on December 11 that includes a result of using our Energy2D software to simulate turbulent situations in which the non-uniform plumes rising from the substrate result in the formation of randomly arranged diphenylalanine (FF) rods and tubes. This paper, titled "Morphology and Pattern Control of Diphenylalanine Self-Assembly via Evaporative Dewetting," is the result of collaboration between scientists from Nanjing University and the City University of Hong Kong.

We are thrilled that many scientists have used Energy2D in their work. As far as we know, this is the second published scientific research paper that has used Energy2D.

On a separate avenue, many engineers are already using Energy2D to aid their design work. For example, in a German forum about renewable energy, an engineer has recently used the tool to make sense of his experimental results with various air collector designs. He reported that the results are "confirmed by the experiences of several users: pressure losses and less volume of air in the blowing operation" (translated from German using Google Translate).

It is these successful applications of Energy2D in the real world that will make it a relevant tool in science and engineering for a very long time.

Energy3D V5.0 released

Full-scale building energy simulation
Insolation analysis of a city block
We are pleased to announce a milestone version of our Energy3D CAD software. In addition to fixing numerous bugs, Version 5.0 includes many new features that enhance its already powerful concurrent design, simulation, and analysis capabilities.

For example, we have added cut/copy/paste in 3D space that greatly eases 3D construction. With this functionality, laying an array of solar panels on a roof is as simple and intuitive as copying and pasting an existing solar panel. Creating a village or city block is also made easier as a building can be copied and pasted anywhere on the ground -- you can create a number of identical buildings using the copy/paste function and then work to make them different.

Insolation analysis of various houses
Compared with previous versions, the properties of every building element can now be set individually using the corresponding popup menu and window. Being able to set the properties of an individual element is important as it is often a good idea for fenestration on different sides of a building to have different solar heat gain coefficients. The user interface for setting the solar heat gain coefficient, for instance, allows the user to specify whether he or she wants to apply the value to the selected window, all the windows on the selected side, or the entire building.

In a move to simulate machine-learning thermostats such as Google's Nest Thermostat to test the assertion that they can help save energy, we have added programmable thermostats. We have also added a geothermal model that allows for more accurate simulation of heat exchange between a building and the ground. New efforts for modeling weather and landscape more accurately are already on the way.

The goal of Energy3D is to create a software platform that bridges education and industry -- we are already working with leading home energy companies to bring this tool to schools and workplaces. This synergy has led to some interesting and exciting business opportunities that mutually benefit education and industry.

A bonus of this version is that it no longer requires users to install Java. We have provided a Windows installer and a Mac installer that work just like any other familiar software installer. Users should now find it easy to install Energy3D, compared with the previously problematic Java Web Start installer.