Tag Archives: Infrared Imaging

Infrared Street View won Department of Energy’s JUMP competition

Creating an infrared street view using SmartIR and FLIR ONE
Our Infrared Street View (ISV) program has won the JUMP Competition sponsored jointly by CLEAResult, the largest provider of energy efficiency programs and services in North America, and the National Renewable Energy Laboratory (NREL), a national laboratory of the US Department of Energy (DOE). This JUMP Competition called for innovations in using smartphones' sensing capabilities to improve residential energy efficiency. Finalists were selected from a pool of submitted proposals and invited to pitch their ideas to the audience at the CLEAResult Energy Forum held in Austin, TX on October 4-6, 2016. Each competition has only one winner among all the good ideas; this year, ours happened to be it.

IR homework
We envision the Infrared Street View as an infrared (IR) counterpart of Google's Street View (I know, I know, this is probably too big to swallow for an organization that is a few garages small). Unlike Google's Street View in the range of visible light, the Infrared Street View will provide a gigantic database of thermal images in the range of invisible IR light emitted by molecular vibrations related to thermal energy. If you think about these images in a different way, they actually are a massive 3D web of temperature data points. What is the value of this big data? If the data are collected in the right way, they may represent the current state of the energy efficiency of our neighborhoods, towns, cities, and even states. In a sense, what we are talking about is in fact a thermographic information system (TIS).

We are not the only group that realized this possibility (but we are likely the first one that came up with the notion and name of TIS). A few startup companies in the Boston area worked at this frontier earlier in the decade, but none of them tapped into the potential of smartphone technologies. With a handful of drive-by trucks or fly-by drones carrying mounted infrared cameras, it would probably take these companies a century to complete a thermal survey of the entire country. Furthermore, the trucks can only take images from the front of a building and the drones can only take images from above, which means that their data are incomplete and cannot be used to create the thermal web that we are imagining. In some cases, unsolicited thermal scans of people's houses may even cause legal trouble, as thermal signatures may accidentally disclose sensitive information.

Our solution is based on FLIR ONE, a $200-ish thermal camera that can be plugged into a smartphone (iOS or Android). The low cost of FLIR ONE, for the first time in history, makes it possible for the public to participate in this thermal survey. But even with the relatively low price tag, it is simply unrealistic to expect that a lot of people will buy the camera and scan their own houses. So where can we find a lot of users who would volunteer to participate in this effort?

Let's look elsewhere. There are four million children entering the US education system each year. Every single one of them is required to spend a sizable chunk of their education on learning thermal science concepts -- in a way that currently relies on formalism (the book shows you the text and math, you read the text and do the math). IR cameras, capable of visualizing otherwise invisible heat flow and distribution, are no doubt the best tools for teaching and learning thermal energy and heat transfer (except for those who are visually impaired -- my apologies). I think few science teachers would disagree with that. And starting this year, educational technology vendors like Vernier and Pasco are selling IR cameras to schools.

What if we teach students thermal science in the classroom with an IR camera and then ask them to inspect their own homes with the camera as a homework assignment? In the end, we ask them to obtain their parents' permission and contribute their IR images to the Infrared Street View project. If millions of students do this, then we will have an ongoing crowdsourcing project that can engage and mobilize many generations of students to come.

Sensor-based artificial intelligence
"We can't take students' IR images seriously," I hear you object. True, students are not professionals and they make mistakes. But there is a way to teach them how to act and think like professionals, which is actually a goal of the Next Generation Science Standards that will shape US science education for the next two or three decades. Aside from a curriculum that teaches students how to use IR cameras (skills) and how to interpret IR images (concepts), we are also developing a powerful smartphone app called SmartIR. This app has many innovations, but two of them may lead to true breakthroughs in the field of thermography.

Thermogram sphere
The first one is sensor-based intelligence. Modern smartphones have many built-in sensors, including the visible-light cameras. These sensors and cameras are capable of collecting multiple types of data, and the increasingly powerful computer vision libraries enrich this capability even further. Machine learning can infer what students are trying to do by analyzing these data. Based on the analysis results, SmartIR can then automatically guide students in real time. This kind of artificial intelligence (AI) can help students avoid common mistakes in infrared thermography and accelerate their thermal survey, especially when they are scanning buildings independently (when there is no experienced instructor around to help them). For example, the SmartIR app can check whether the inspection is being done at night or during the day. If it is during the day (because the clock says so or the ambient light sensor says so), then SmartIR will suggest that students wait to do their scan until nightfall, which eliminates the side effect of solar heating and increases the indoor-outdoor temperature difference. With an intelligent app like this, we may be able to increase the quality and reliability of the IR images that are fed to the Infrared Street View project.
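To make this decision rule concrete, here is a minimal sketch in Java of how an app could combine the clock with an ambient light reading to decide whether to suggest postponing a scan. The class name, the lux threshold, and the "daytime" hour range are illustrative assumptions, not the actual SmartIR logic.

```java
import java.time.LocalTime;

// A minimal sketch of a day/night scan advisor. The 50-lux threshold and
// the 6:00-20:00 "daytime" window are assumed values for illustration only;
// the real SmartIR app may use different rules and additional sensors.
public class ScanTimeAdvisor {

    // ambientLux: reading from the phone's ambient light sensor, in lux.
    public static String advise(LocalTime now, float ambientLux) {
        boolean clockSaysDay = now.getHour() >= 6 && now.getHour() < 20;
        boolean lightSaysDay = ambientLux > 50f; // bright surroundings
        if (clockSaysDay || lightSaysDay) {
            return "Daytime detected: consider waiting until after dark, "
                 + "when solar heating is gone and the indoor-outdoor "
                 + "temperature difference is larger.";
        }
        return "Conditions look good for an outdoor thermal scan.";
    }

    public static void main(String[] args) {
        System.out.println(advise(LocalTime.of(14, 30), 12000f)); // sunny afternoon
        System.out.println(advise(LocalTime.of(21, 45), 5f));     // after dark
    }
}
```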
Virtual infrared reality (VIR) viewed with Google Cardboard

The second one is virtual infrared reality, or VIR for short, to accomplish true, immersive thermal vision. VIR is a technology that integrates infrared thermography with virtual reality (VR). Based on the orientation and GPS sensors of the phone, SmartIR can map IR images onto what we call a thermogram sphere and then knit them together to render a seamless IR view. A VIR can be uploaded to Google Maps so that the public can experience it using a VR viewer, such as Google Cardboard. We don't know if VIR is going to do any better than 2D IR images in promoting the energy efficiency business, but it is reasonable to assume that many people would not mind seeing a cool (or hot) view like this while searching for their dream houses. For building science professionals, this may even have some implications, because VIR provides a way to naturally organize the thermal images of a building to display a more holistic view of what is going on thermally.
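To illustrate the geometry behind a thermogram sphere, the sketch below converts a reported azimuth and pitch into a direction on a unit sphere, which is where a thermal frame taken at that orientation could be anchored. The angle conventions and the class name are assumptions made for illustration; the actual SmartIR implementation may differ.

```java
// Sketch: place a thermal frame on a unit "thermogram sphere" from the
// phone's orientation. Conventions assumed for illustration: azimuth is
// measured clockwise from north in degrees, pitch is positive above the
// horizon. x points east, y up, z north.
public class ThermogramSphere {

    public static double[] direction(double azimuthDeg, double pitchDeg) {
        double az = Math.toRadians(azimuthDeg);
        double pitch = Math.toRadians(pitchDeg);
        double x = Math.cos(pitch) * Math.sin(az); // east component
        double y = Math.sin(pitch);                // up component
        double z = Math.cos(pitch) * Math.cos(az); // north component
        return new double[] { x, y, z };
    }

    public static void main(String[] args) {
        // A frame taken facing northeast, tilted 10 degrees upward.
        double[] d = direction(45, 10);
        System.out.printf("Frame center on sphere: (%.3f, %.3f, %.3f)%n", d[0], d[1], d[2]);
    }
}
```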

With these innovations, we may eventually be able to realize our vision of inventing a visual 3D web of thermal data, or the thermographic information system, that will provide a massive data set for governments and companies to assess the state of residential energy efficiency on an unprecedented scale and with incredible detail.

National Science Foundation funds chemical imaging research based on infrared thermography

The National Science Foundation (NSF) has awarded Bowling Green State University (BGSU) and the Concord Consortium (CC) an exploratory grant of $300K to investigate how chemical imaging based on infrared (IR) thermography can be used in chemistry labs to support undergraduate learning and teaching.

Chemists often rely on visually striking color changes shown by pH, redox, and other indicators to detect or track chemical changes. About six years ago, I realized that IR imaging may represent a novel class of universal indicators that, instead of using halochromic compounds, use false-color heat maps to visualize any chemical process that involves the absorption, release, or distribution of thermal energy (see my original paper published in 2011). I felt that IR thermography could one day become a powerful imaging technique for studying chemistry and biology. As the technique doesn't involve the use of any chemical substance as a detector, it could be considered a "green" indicator.

Fig. 1: IR-based differential thermal analysis of freezing point depression
Although IR cameras are not new, inexpensive lightweight models have become available only recently. The releases of two competitively priced IR cameras for smartphones in 2014 marked the beginning of an era of personal thermal vision. In January 2014, FLIR Systems unveiled the $349 FLIR ONE, the first thermal camera that could be attached to an iPhone. Months later, a startup company, Seek Thermal, released a $199 IR camera that had an even higher resolution and could be connected to most smartphones. The race was on to make better and cheaper cameras. In January 2015, FLIR announced the second-generation FLIR ONE camera, priced at $231 on Amazon. With an educational discount, the price of an IR camera is now comparable to what a single sensor may cost (e.g., Vernier sells an IR thermometer at $179). All these new cameras can take IR images just like taking conventional photos and record IR videos just like recording conventional videos. The manufacturers also provide application programming interfaces (APIs) for developers to blend thermal vision and computer vision in a smartphone to create interesting apps.

Fig. 2: IR-based differential thermal analysis of enzyme kinetics
Not surprisingly, many educators, including ourselves, have realized the value of IR cameras for teaching topics such as thermal radiation and heat transfer that are naturally supported by IR imaging. Applications in other fields such as chemistry, however, seem less obvious and remain underexplored, even though almost every chemical reaction or phase transition absorbs or releases heat. The NSF project will focus on showing how IR imaging can become an extraordinary tool for chemical education. The project aims to develop seven curriculum units based on the use of IR imaging to support, accelerate, and expand inquiry-based learning for a wide range of chemistry concepts. The units will employ the predict-observe-explain (POE) cycle to scaffold inquiry in laboratory activities based on IR imaging. To demonstrate the versatility and generality of this approach, the units will cover a range of topics, such as thermodynamics, heat transfer, phase change, colligative properties (Figure 1), and enzyme kinetics (Figure 2).

The research will focus on finding robust evidence of learning due to IR imaging, with the goal of identifying underlying cognitive mechanisms and recommending effective strategies for using IR imaging in chemistry education. This study will be conducted with a diverse student population at BGSU, Boston College, Bradley University, Owens Community College, Parkland College, St. John Fisher College, and SUNY Geneseo.

Partial support for this work was provided by the National Science Foundation's Improving Undergraduate STEM Education (IUSE) program under Award No. 1626228. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Infrared Street View selected as a finalist in Department of Energy’s JUMP competition

JUMP is an online crowdsourcing community hosted by five national laboratories of the US Department of Energy (DOE) and some of the top private companies in the buildings sector. The goal is to broaden the pool of people from whom DOE seeks ideas and to move these ideas to the marketplace faster.

In July, the National Renewable Energy Laboratory (NREL) and CLEAResult launched a Call for Innovation to leverage crowdsourcing to solicit new ideas for saving energy in homes based on smartphone technologies. Modern smartphones are packed with a variety of sensors capable of detecting all kinds of things about their surroundings. Smartphones can determine whether people are home, or close to home, which may be useful for managing their HVAC systems and controlling lighting and appliances. Smartphones can also gather and analyze data to inform homeowners and improve residential energy efficiency.

Infrared images of houses
We responded to the call with a proposal to develop a smartphone app that can be used to create an infrared version of Google's Street View, which we call Infrared Street View. NREL notified us this week that the proposal has been selected as a finalist of the competition and invited us to pitch the idea at the CLEAResult Energy Forum in Austin, TX next month.

The app will integrate smartphone-based infrared imaging (e.g., FLIR ONE) and Google Maps, along with built-in sensors of the smartphone such as the GPS sensor and the accelerometer, to create thermal views of streets at night in the winter in order to reveal possible thermal anomalies in neighborhoods and bring awareness of energy efficiency to people. These infrared images may even have business value. For example, they may provide information about the condition of a building's windows that may be useful to companies interested in marketing new windows.

The app will be based on the FLIR ONE SDK and the Google Maps API, backed by a program running in the cloud to collect, process, and serve data. The latest FLIR ONE model now costs $249 and works with common Android and iOS devices, making it possible for us to implement this idea. A virtual reality mode will also be added to enhance the visual effect. So this could be an exciting IR+VR+AR (augmented reality) project.
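As a rough sketch of the kind of record such a cloud backend might collect for each contribution, here is a hypothetical data structure. The class and field names are illustrative assumptions and are not part of the FLIR ONE SDK or the Google Maps API.

```java
// Hypothetical record for one crowd-sourced thermal snapshot. Field names
// are illustrative assumptions; the actual app and backend may differ.
public class ThermalSnapshot {
    final double latitude;      // from the GPS sensor, decimal degrees
    final double longitude;
    final double azimuthDeg;    // camera heading, from orientation sensors
    final double pitchDeg;      // camera tilt
    final long timestampMillis; // when the image was taken
    final String imagePath;     // path or URL of the radiometric image

    ThermalSnapshot(double lat, double lon, double azimuth, double pitch,
                    long timestamp, String imagePath) {
        this.latitude = lat;
        this.longitude = lon;
        this.azimuthDeg = azimuth;
        this.pitchDeg = pitch;
        this.timestampMillis = timestamp;
        this.imagePath = imagePath;
    }

    public static void main(String[] args) {
        ThermalSnapshot s = new ThermalSnapshot(42.45, -71.35, 180, 5,
                System.currentTimeMillis(), "snapshots/house_front.jpg");
        System.out.println("Snapshot at (" + s.latitude + ", " + s.longitude
                + ") facing " + s.azimuthDeg + " degrees");
    }
}
```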

You may be wondering who would be interested in using the app to create the infrared street views. After all, the success of the project depends on the participation of a large number of people. But we are not Google and we do not have the resources to hire a lot of people to do the job. Our plan is to work with schools. We have a current project in which we work with teachers to promote infrared imaging as a novel way to teach thermal energy and heat transfer in classrooms. This is an area in science education that every school covers. Many teachers -- after seeing an infrared camera in action -- are convinced that infrared imaging is the ultimate way to teach thermal science. If this project is used as a capstone activity in thermal science, it is possible that we can reach and motivate thousands of students who would help make this crowdsourcing project a success.

Those who know about earlier efforts may consider this initiative a new attempt to advance the idea. The main new things are: 1) our plan is based on crowdsourcing with a potentially large number of students who are equipped with smartphone-based IR cameras, not a few drive-by trucks with cameras that homeowners have no idea about; 2) the concerns about privacy and legality should be mitigated, as students only scan their own houses and their neighbors' houses with permission from their parents and neighbors, and only publish their images to Google Maps when permitted by their parents and neighbors; and, most importantly, 3) unlike the previous projects that do not put people first, our project starts with the education of children and has a better chance of convincing adults.

Personal thermal vision could turn millions of students into the cleantech workforce of today

So we have signed the Paris Agreement and cheered about it. Now what?

More than a year ago, I wrote a proposal to the National Science Foundation to test the feasibility of empowering students to help combat the energy issues of our nation. There are hundreds of millions of buildings in our country and some of them are pretty big energy losers. The home energy industry currently employs probably 100,000 people at most. It would take them a few decades to weatherize and solarize all these residential and commercial buildings (let alone educate homeowners so that they would take such actions).

But there are millions of students in schools who are probably more likely to be concerned about the world that they are about to inherit. Why not ask them to help?

You probably know of a lot of projects on this very same mission. But I want to do something different. Enough messaging has been done. We don't need to hand out more brochures and flyers about the environmental issues that we may be facing. It is time to call for action!

For a number of years, I have been working on infrared thermography and building energy simulation to knock down the technical barriers that these techniques may pose to children. With NSF awarding us a $1.2M grant last year and FLIR releasing a series of inexpensive thermal cameras, the time for bringing these tools to large-scale application in schools has finally arrived.

For more information, see our poster that will be presented at an NSF meeting next week. Note that this project has just begun, so we haven't had a chance to test the solarization part. But the results from the weatherization part based on infrared thermography have been extremely encouraging!

Infrared imaging evidence of geothermal energy in a basement

Geothermal energy is the thermal energy generated or stored in the Earth. The ground maintains a nearly constant temperature six meters (20 feet) down, which is roughly equal to the average annual air temperature at the location. In Boston, this is about 13 °C (55 °F).

You can feel the effect of geothermal energy in a basement, particularly on a hot summer day, when the basement can be significantly cooler. But IR imaging provides a unique visualization of this effect.

I happen to have a sub-basement that is partially buried in the ground. When I did an IR inspection of my basement on a cold night, in an attempt to identify places where heat escapes, something that I did not expect struck me: as I scanned the basement, the whole basement floor appeared to be 4-6 °F warmer than the walls. Both the floor and the walls of my basement are bare concrete -- there is no insulation, but the walls are partially or fully exposed to the outside air, which was about 24 °F at that time.

This temperature distribution pattern is opposite to the typical temperature gradient observed in a heated room where the top of a wall is usually a few degrees warmer than the bottom of a wall or the floor as hot air rises to warm up the upper part.

The only explanation for the warmth of the basement floor is geothermal energy, captured by the IR camera.

Visualizing thermal equilibration: IR imaging vs. Energy2D simulation

Figure 1
A classic experiment to show thermal equilibration is to put a small Petri dish filled with some hot or cold water into a larger one filled with tap water around room temperature, as illustrated in Figure 1. Then stick one thermometer in the inner dish and another in the outer dish and take their readings over time.

With a low-cost IR camera like the FLIR C2 or FLIR ONE, this experiment becomes much more visual (Figure 2). As an IR camera provides a full-field view of the experiment in real time, you get much richer information about the process than a graph of two converging curves from the temperature data read from the two thermometers.
Figure 2

The complete equilibration process typically takes 10-30 minutes, depending on the initial temperature difference between the water in the two dishes and the amount of water in the inner dish. A larger temperature difference or a larger amount of water in the inner dish will require more time to reach thermal equilibrium.
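For readers who like to see this trend in numbers, here is a minimal lumped two-body model of the two dishes, assuming a constant thermal conductance between them and ignoring heat exchange with the room. All values are illustrative assumptions, not measurements from the experiment.

```java
// A lumped two-body model of the Petri-dish experiment. The conductance k
// and the water masses are assumed values for illustration; heat exchange
// with the room and evaporation are ignored.
public class Equilibration {
    public static void main(String[] args) {
        double c = 4.186;  // specific heat of water, J/(g*C)
        double m1 = 20;    // grams of hot water in the inner dish (assumed)
        double m2 = 100;   // grams of tap water in the outer dish (assumed)
        double t1 = 60;    // initial temperature of the inner dish, deg C
        double t2 = 20;    // initial temperature of the outer dish, deg C
        double k = 0.5;    // assumed thermal conductance between dishes, W/C

        for (int s = 0; s <= 1800; s++) {      // 30 minutes, 1-second steps
            if (s % 300 == 0) {
                System.out.printf("t = %4d s: inner %.1f C, outer %.1f C%n", s, t1, t2);
            }
            double q = k * (t1 - t2);          // heat moved in one second, J
            t1 -= q / (m1 * c);
            t2 += q / (m2 * c);
        }
    }
}
```

In this simplified model the temperature difference decays exponentially, so a larger initial difference or more water in the inner dish stretches out the time needed to get close to equilibrium, consistent with the observation above.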

Another way to quickly show this process is to use our Energy2D software to create a computer simulation (Figure 3). Such a simulation provides a visualization that resembles the IR imaging result. The advantage is that it runs very fast -- only 10 seconds or so are needed to reach thermal equilibrium. This allows you to test various conditions rapidly, e.g., changing the initial temperature of the water in the inner dish or the outer dish, or changing the diameters of the dishes.

Figure 3
Both real-world experiments and computer simulations have their own pros and cons. Exactly which one to use depends on your situation. As a scientist, I believe nothing beats real-world experiments in supporting authentic science learning, and we should favor them whenever possible. However, conducting real-world experiments requires a lot of time and resources, which makes it impractical to do so throughout a course. Computer simulations provide an alternative that allows students to get a sense of real-world experiments without the time and cost. But the downside is that a computer simulation, most of the time, is an overly simplified scientific model that does not have the many layers of complexity and the many types of interactions that we experience in reality. In a real-world experiment, there are always unexpected factors and details that need to be attended to. It is these unexpected factors and details that create genuinely profound and exciting teachable moments. This important aspect of the nature of science is largely missing from computer simulations, even with a sophisticated computational fluid dynamics tool such as Energy2D.

Here is how I balance this trade-off: It is essential for students to learn simplified scientific models before they can explore complex real-world situations. The models will give students the frameworks needed to make sense of real-world observations. A fair strategy is to use simulations to teach simplified models and then make some time for students to conduct experiments in the real world and learn how to integrate and apply their knowledge about the models to solve real problems.

A side note: You may be wondering how well the Energy2D result agrees with the IR result on a quantitative basis. This is an important question -- if the simulation is not a good approximation of the real-world process, it is not a good simulation, and one may challenge its usefulness, even for learning purposes. Figure 4 shows a comparison of a test run. As you can see, while the result predicted by Energy2D agrees in trend with the results observed through IR imaging, there are some details in the real data that may be caused by either human errors in taking the data or thermal fluctuations in the room. What is more, after the thermal equilibrium was reached, the water in both dishes continued to cool down to room temperature and then below, due to evaporative cooling. The cooling to room temperature was modeled in the Energy2D simulation through a thermal coupling to the environment, but evaporative cooling was not.

Figure 4

An infrared investigation on a Stirling engine

Figure 1
The year 2016 marks the 200th anniversary of an important invention by Robert Stirling -- the Stirling engine. So I thought I should start this year's blogging with a commemorative article about this truly ingenious invention.

A Stirling engine is a closed-cycle heat engine that operates by cyclic compression and expansion of air or another gas, driven by a temperature difference across the engine. In this way, a Stirling engine converts thermal energy into mechanical work.

You can buy an awesome toy Stirling engine from Amazon (perhaps next Christmas's gift for some inquisitive minds). If you put it on top of a cup of hot water, this amazing machine will just run until the hot water cools down to room temperature.

Figure 2
Curious about whether the Stirling cycle would actually accelerate the cooling process, I filled two identical mugs with hot water and covered one of them with the Stirling engine. Then I started the engine and observed what happened to the temperature through an IR camera. It turned out that the mug covered by the engine maintained a temperature about 10 °C higher than the open mug over about 30 minutes of observation. If you have a chance to do this experiment, you would probably be surprised. The flywheel of the Stirling engine seems to be announcing that it is working very hard by spinning fast and making a lot of noise. But all that energy, visual and audible as it is, is no match for the thermal energy lost through evaporation of water from the open mug (Figure 1).

How does the Stirling engine compare with plain heat transfer? I found a metal box of approximately the same size and wall thickness as our Stirling engine. I refilled the two mugs with hot water and covered one with the metal box and the other with the Stirling engine. Then I started the engine and tracked their temperatures through the IR camera. It turned out that the rates of heat loss from the two mugs were about the same over about 30 minutes of observation. What this really means is that the energy that drove the engine was actually very small compared with the thermal energy lost to the environment through heat transfer (Figure 2).

This is understandable because the speed of the flywheel is only a small fraction of the average speed of the molecules (which is about the speed of sound or higher). This investigation also suggests that the Stirling engine is very efficient: had we insulated the mug, it would have run for hours.
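To put rough numbers on this, here is a back-of-the-envelope comparison that assumes a toy low-temperature-difference Stirling engine delivers on the order of 20 mW of mechanical power and that a mug of hot water cools by about 15 °C in 30 minutes. Both figures are assumptions for illustration, not measurements from this experiment.

```java
// Back-of-the-envelope comparison of the mechanical output of a toy
// Stirling engine with the heat lost by a cooling mug. All numbers are
// assumed for illustration, not measured values from this experiment.
public class StirlingEstimate {
    public static void main(String[] args) {
        double waterMass = 250;          // g of hot water in the mug (assumed)
        double specificHeat = 4.186;     // J/(g*C)
        double tempDrop = 15;            // deg C lost in 30 minutes (assumed)
        double seconds = 30 * 60;

        double heatLossPower = waterMass * specificHeat * tempDrop / seconds; // W
        double enginePower = 0.02;       // W, typical for a toy engine (assumed)

        System.out.printf("Heat loss from mug:  %.1f W%n", heatLossPower);
        System.out.printf("Engine output:       %.3f W%n", enginePower);
        System.out.printf("Engine output is roughly %.2f%% of the heat flow%n",
                100 * enginePower / heatLossPower);
    }
}
```

Under these assumed numbers, the mechanical output is a fraction of a percent of the heat flowing out of the mug, which is consistent with the observation that covering a mug with the running engine or with a plain metal box made little difference.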

Chemical imaging using infrared cameras

Figure 1: Evaporative cooling
Scientists have long relied on powerful imaging techniques to see things invisible to the naked eye and thus advance science. Chemical imaging is a technique for visualizing chemical composition and dynamics in time and space as actual events unfold. In this sense, infrared (IR) imaging is a chemical imaging technique, as it allows one to see temporal and spatial changes of temperature distribution and, just as in other chemical imaging techniques, infer what is occurring at the molecular level based on this information.

Figure 2: IR imaging
Most IR cameras are sensitive enough to pick up a temperature difference of 0.1°C or less. This sensitivity makes it possible to detect certain effects from the molecular world. Figure 1 provides an example that suggests this possibility.

This experiment, which concerns the evaporation of water, could not be simpler: just pour some room-temperature water into a plastic cup, leave it for a few hours, and then aim an IR camera at it. In stark contrast to the thermal background, the whole cup remains 1-2°C cooler than the room temperature (Figure 2). Just how much evaporation is enough to keep the cup this cool? Let's do a simple calculation. Our measurement showed that in a typical dry and warm office environment in the winter, a cup of water (10 cm diameter) loses approximately six grams of water in 24 hours. That is to say, the evaporation rate is 7×10⁻⁵ g/s, or 7×10⁻¹¹ m³/s. Dividing by the surface area of the cup mouth, which is 0.00785 m², we find that the layer of water that evaporates each second is only 8.9 nm thick—roughly the length of 30 water molecules lined up shoulder to shoulder! It is amazing that the evaporation of this tiny amount of water at such a slow rate (a second is a very long time for molecules) suffices to sustain a temperature difference of 1-2°C for the entire cup.
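The arithmetic above can be checked with a few lines of code, taking the density of water as 1000 kg/m³ and the size of a water molecule as roughly 0.3 nm:

```java
// Check the evaporation arithmetic: 6 g of water lost from a 10 cm
// diameter cup in 24 hours.
public class EvaporationThickness {
    public static void main(String[] args) {
        double massPerDay = 6e-3;                       // kg of water per day
        double rate = massPerDay / (24 * 3600);         // kg/s
        double volumeRate = rate / 1000.0;              // m^3/s (density 1000 kg/m^3)
        double area = Math.PI * Math.pow(0.05, 2);      // cup mouth, m^2 (10 cm diameter)
        double thicknessPerSecond = volumeRate / area;  // m/s

        System.out.printf("Evaporation rate: %.1e kg/s%n", rate);
        System.out.printf("Layer evaporated per second: %.1f nm%n", thicknessPerSecond * 1e9);
        // A water molecule is roughly 0.3 nm across.
        System.out.printf("That is about %.0f water molecules deep%n",
                thicknessPerSecond * 1e9 / 0.3);
    }
}
```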

This simple experiment actually raises more questions than it answers. Based on the latent heat of vaporization of water, which is about 2265 J/g, we estimate that the rate of energy loss through evaporation is only 0.16 J/s. This rate of energy loss should have a negligible effect on the 200 g of water in the cup, as the specific heat of water is 4.186 J/(g·°C). So where does this cooling effect come from? How does it persist? Would the temperature of the water be even lower if there were less water in the cup? What would the temperature difference be if the room temperature changed? These questions pose great opportunities to engage students in proposing hypotheses and testing them with more experiments. It is through the quest for answers that students learn to think and act like scientists.
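The energy estimate in the preceding paragraph can be checked the same way; the cooling-rate line at the end is an illustrative extra, assuming no heat flows back into the cup from the room:

```java
// Check the energy-loss estimate for the evaporating cup.
public class EvaporationPower {
    public static void main(String[] args) {
        double evaporationRate = 7e-5;   // g/s, from the measurement above
        double latentHeat = 2265;        // J/g, latent heat of vaporization
        double power = evaporationRate * latentHeat;  // J/s lost to evaporation

        double waterMass = 200;          // g of water in the cup
        double specificHeat = 4.186;     // J/(g*C)
        double coolingRate = power / (waterMass * specificHeat); // deg C per second

        System.out.printf("Power lost to evaporation: %.2f W%n", power);
        System.out.printf("Cooling rate if nothing replaced that heat: %.4f C/s "
                + "(about %.1f C per hour)%n", coolingRate, coolingRate * 3600);
    }
}
```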

IR imaging is an ideal tool for guided inquiry, as it eliminates tedious data collection procedures and focuses students on data analysis. In the practice of inquiry, data analysis is viewed as more important than data collection in helping students develop their thinking skills and conceptual understanding. Although this cooling effect can also be investigated using a thermometer, students' perception might be quite different. An IR camera immediately shows that the entire cup, not just the water surface, is cooler. Seeing the bulk of the cup in blue may prompt students to think more deeply and invite new questions, whereas a single temperature reading from a thermometer may not deliver the same experience.

Simulating geometric thermal bridges using Energy2D

Fig. 1: IR image of a wall junction (inside) by Stefan Mayer
One of the mysterious things that causes people to scratch their heads when they see an infrared picture of a room is that junctions such as the edges and corners formed by two exterior walls (or by walls, floors, and roofs) often appear colder in the winter than other parts of the walls, as shown in Figure 1. This is, I hear you saying, caused by an air gap between two walls. But it is not that simple! While a leaking gap can certainly do it, the effect is there even without a gap. Better insulation only makes the junctions less cold.

Fig. 2: An Energy2D simulation of thermal bridge corners.
A typical explanation of this phenomenon is that, because the exterior surface of a junction (where heat is lost to the outside) is greater than its interior surface (where heat is gained from the inside), the junction ends up losing thermal energy in the winter more quickly than a straight part of the walls, causing it to be colder. The temperature difference is immediately revealed by a sensitive IR camera. Such a junction is commonly called a geometric thermal bridge, which is different from a material thermal bridge caused by the presence of a more conductive piece in a building assembly, such as a steel stud in a wall or the concrete floor of a balcony.

Fig. 3: IR image of a wall junction (outside) by Stefan Mayer
But the actual heat transfer process is much more complicated and confusing. While a wall junction does create a difference in the surface areas of the interior and exterior of the wall, it also forms a thicker section through which the heat must flow (the section is thicker because the path runs in a diagonal direction). The increased thickness should impede the heat flow, right?

Fig. 4: An Energy2D simulation of an L-shaped wall.
Unclear about the outcome of these competing factors, I made some Energy2D simulations to see if they could help me. Figure 2 shows the first one, which uses a block kept at 20 °C to mimic a warm room and a surrounding environment of 0 °C, with a four-sided wall in between. Temperature sensors are placed at the corners, as well as at the middle of a wall. The results show that, in the steady state, the corners are indeed colder than other parts of the walls. (Note that this simulation only involves heat diffusion, but adding radiation heat transfer should yield similar results.)
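For readers who want to peek under the hood without launching Energy2D, below is a minimal finite-difference sketch (not Energy2D code) of the same setup: a square region held at 20 °C, surroundings held at 0 °C, and a wall ring in between whose cells diffuse heat. The grid size, wall thickness, and step count are arbitrary choices; the point is only that, at steady state, a wall cell near an interior corner reads colder than a wall cell at the middle of a wall.

```java
// Minimal explicit finite-difference sketch of the geometric thermal
// bridge setup: room fixed at 20 C, exterior fixed at 0 C, wall cells
// in between diffuse heat. Grid size and wall thickness are arbitrary.
public class CornerBridge {
    public static void main(String[] args) {
        int n = 60;
        double[][] t = new double[n][n];

        // Geometry: wall ring occupies cells 10..49; room occupies 16..43.
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                t[i][j] = isRoom(i, j) ? 20 : 0;

        double r = 0.25; // diffusion number, at the explicit stability limit
        for (int step = 0; step < 20000; step++) {
            double[][] next = new double[n][n];
            for (int i = 0; i < n; i++) {
                for (int j = 0; j < n; j++) {
                    if (isWall(i, j)) {
                        next[i][j] = t[i][j] + r * (t[i + 1][j] + t[i - 1][j]
                                + t[i][j + 1] + t[i][j - 1] - 4 * t[i][j]);
                    } else {
                        next[i][j] = isRoom(i, j) ? 20 : 0; // fixed temperatures
                    }
                }
            }
            t = next;
        }

        // Interior wall surface: a corner cell vs. a mid-wall cell.
        System.out.printf("Inner wall surface, corner:   %.2f C%n", t[15][15]);
        System.out.printf("Inner wall surface, mid-wall: %.2f C%n", t[15][30]);
    }

    static boolean isWall(int i, int j) {
        return i >= 10 && i < 50 && j >= 10 && j < 50 && !isRoom(i, j);
    }

    static boolean isRoom(int i, int j) {
        return i >= 16 && i < 44 && j >= 16 && j < 44;
    }
}
```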

What about more complex shapes like an L-shaped wall that has both convex and concave junctions? Figure 3 shows the IR image of such a wall junction, taken from the outside of a house. In this image, interestingly enough, the convex edge appears to be colder, but the concave edge appears to be warmer!

The Energy2D simulation (Figure 4) shows a pattern similar to the IR image (Figure 3). The simulation results show that the temperature sensor placed near the concave edge outside the L-shaped room does register a higher temperature than the other sensors.

Now, the interesting question is, does the room lose more energy through a concave junction or a convex one? If we look at the IR image of the interior taken inside the house (Figure 1), we would probably say that the convex junction loses more energy. But if we look at the IR image of the exterior taken outside the house (Figure 3), we would probably say that the concave junction loses more energy.

Which statement is correct? I will leave that to you. You can download the Energy2D simulations from this link, play with them, and see if they help you figure out the answer. These simulations also include simulations of the reverse cases in which heat flows from the outside into the room (the summer condition).

The National Science Foundation funds large-scale applications of infrared cameras in schools


We are pleased to announce that the National Science Foundation has awarded the Concord Consortium, Next Step Living, and Virtual High School a grant of $1.2M to put innovative technologies such as infrared cameras into the hands of thousands of secondary students. This education-industry collaborative will create a technology-enhanced learning pathway from school to home and then to cognate careers, thereby establishing a data-rich testbed for developing and evaluating strategies for translating innovative technology experiences into consistent science learning and career awareness in different settings. While there have been studies on connecting science to everyday life or situating learning in professional scenarios to increase the relevance or authenticity of learning, the strategies of using industry-grade technologies to strengthen these connections have rarely been explored. In many cases, often due to the lack of experience, resources, and curricular support, industry technologies are simply used as showcases or demonstrations to give students a glimpse of how professionals use them to solve problems in the workplace.


Over the last few years, however, quite a number of industry technologies have become widely accessible to schools. For example, Autodesk has announced that their software products will be freely available to all students and teachers around the world. Another example is infrared cameras, which I have been experimenting with and blogging about since 2010. Due to the continuous development of electronics and optics, what used to be a very expensive scientific instrument now costs only a few hundred dollars, with the most affordable infrared camera falling below $200.

The funded project, called Next Step Learning, will be the largest-scale application of infrared cameras in secondary schools -- in terms of the number of students that will be involved in the three-year project. We estimate that dozens of schools and thousands of students in Massachusetts will participate in this project. These students will use infrared cameras provided by the project to thermally inspect their own homes. The images in this blog post are some of the curious images I took in my own house using a FLIR ONE camera attached to an iPhone.

In the broader context, the Next Generation Science Standards (NGSS) envisions "three-dimensional learning," in which the learning of disciplinary core ideas and crosscutting concepts is integrated with science and engineering practices. A goal of the NGSS is to make science education more closely resemble the way scientists and engineers actually think and work. To accomplish this goal, an abundance of opportunities for students to practice science and engineering through solving authentic real-world problems will need to be created and researched. If these learning opportunities are meaningfully connected to current industry practices using industry-grade technologies, they can also increase students' awareness of cognate careers, help them construct professional identities, and prepare them with the knowledge and skills needed by employers, thereby attaining the goals of both science education and workforce development simultaneously. The Next Step Learning project will explore, test, and evaluate this strategy.