Designing 2030: Building the Educational Technology Community of Tomorrow for Design, Interoperability, and Equity

The Concord Consortium is thrilled to announce a new initiative to transform STEM teaching and learning and reach more students with educational technology. By applying current and future technologies in unique ways, generating new collaborations, and leveraging the power of open educational resources, a group of innovative thought leaders is working to revolutionize STEM learning experiences and bring them to a broader, more diverse group of learners. A two-day summit sponsored by the Gordon and Betty Moore Foundation jumpstarted the effort last month. Designing 2030 was the first step on a journey to predict, imagine, and design for 2030.

STEM education professionals, scientists, curriculum developers, learning scientists, and more gathered for the inaugural Designing 2030 Summit.

Thirty STEM education professionals, leaders of informal education institutions, scientists, entrepreneurs, curriculum developers, K-12 educational software programmers, teachers, and learning scientists convened at Dynamicland, an experimental space in Oakland, California, that embodies communal computing. Dynamicland houses a multi-person computer embedded into its physical rooms; it supports collaboration through tangible objects, runs code printed on paper, and projects digital displays onto surfaces.

Concord Consortium Executive Vice President Sherry Hsi welcomed the group to Dynamicland, and President and CEO Chad Dorsey set the meeting in the context of technological advances by describing a timeline of different technologies that have changed how people get information and connect with others—from the advent of social media to digital appliances.

How can technology transform the way we teach and learn science and broaden participation by more learners?

Participants outlined potential educational technology opportunities and challenges faced by school-aged students, who were brought to life by six fictional personas (Figure 1). They sketched their ideas onto posters, then turned these into blueprints for shared actions. Inspirational talks, technology demonstrations, ideation activities, informal meeting time, and open discussion encouraged participants to dream big and craft a vision for the future of teaching and learning.

Figure 1. Fran (top) is a 10-year-old girl living in Cleveland, attending 4th grade in a new public school downtown. Dominic (bottom) is a 16-year-old in Texas, enrolled in a rural high school.

Participants reflected on strategic questions:

  • What should learning look like in 2030?
  • What role will new or yet-unimagined technologies play in aiding and supporting learning?
  • How will information and data help teachers and mentors provide an ongoing ecosystem of support that extends to all learners, regardless of location, economic resources, or background?
  • How will all of these factors work together across learners’ lives in ways that equip them for meaningful contributions as global citizens?

Speakers inspire with farsighted views

Invited speakers ignited new ideas about the future of teaching and learning. Bret Victor, technology visionary and creator of Dynamicland, described a future of computing in which people work together shoulder to shoulder with physical materials and have the agency to change the things they create—in sharp contrast to today's highly independent, isolated users of apps developed by others.

Adam Tobin, CEO of Chabot Space & Science Center, re-envisioned the role of museums and science centers as critical educational hubs to support an integrated STEM learning ecosystem. Colin Dixon, research associate at the Concord Consortium, shared a story of a youth who created a computer from old milk cartons and PVC pipes in a mobile makerspace, suggesting that educators need to create space for surprises and rethink what counts as “technology” in light of young people’s imaginations and maker-oriented opportunities.

Cultural anthropologist Mimi Ito asked us to think about designing educational systems that serve the interests and needs of learners, to reconsider the notion that “kids move through a pipeline from school to the workforce,” and to think instead of a web of relationships, opportunities, pathways, and interests that serve as learner supports over many years. Jeremy Roschelle, executive director of learning sciences research at Digital Promise, also emphasized the idea that education should be less about learning new things (facts, content) and more about collaboration and building new relationships that connect ideas in productive ways.

Judi Fusco, senior research scientist of STEM teaching and learning at Digital Promise, spoke of the importance of teachers, mentors, and coaches to support students, and the role of teachers and others in the community as key propagators of educational research to be put to use in the classroom. Finally, Britte Cheng, assessment researcher formerly at SRI International, described both the need to design educational implementations for equity and justice and the need to design against stereotypes and bias.

As a next step in designing the future of STEM teaching and learning with educational technology, we will convene a group of designers and software developers to turn the ideas and visions from the Designing 2030 Summit into detailed design sketches, prototyped scenarios, and software mash-ups that serve as working examples. We’re designing the future now!

Contact Sherry Hsi for more information about Designing 2030.

Maine Teacher Workshop on Artificial Intelligence in Engineering Education

On June 10-12, we hosted a successful teacher professional development workshop in York, Maine, for 29 teachers from seven states. The theme was the application of artificial intelligence (AI) in engineering education to assist teaching and foster learning. The workshop was supported by generous funding from General Motors and the National Science Foundation.

The teachers explored how the AI tools built into Energy3D could help students learn STEM concepts and skills required by the Next Generation Science Standards (NGSS), especially engineering design. Together we brainstormed how AI applications such as generative design might change their teaching. We believe that AI can transform STEM education in four ways:

  1. Augment students with tools that accelerate problem solving, thereby supporting them in exploring more broadly.
  2. Identify cognitive gaps between students' current knowledge and the learning goals, thereby enabling them to learn more deeply.
  3. Suggest alternative solutions beyond students' current work, thereby spurring them to think more creatively.
  4. Assess students' performance by computing the distances between their solutions and the optimal ones, thereby providing formative feedback during the design process (see the sketch below).

The activities that the teachers tried were situated in the context of building science and solar engineering, facilitated by our Solarize Your World Curriculum. We presented examples that demonstrated the affordances of AI for supporting learning and teaching along these four directions, especially in engineering design, which is highly open-ended. Teachers first learned how to design a solar farm in the conventional way and then learned how to accomplish the same task in the AI way, which -- in theory -- can lead to broader exploration, deeper understanding, better solutions, and faster feedback.
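To make the fourth direction concrete, here is a minimal sketch of how such a solution distance might be computed. The function name, the parameters, and the normalization are illustrative assumptions of mine, not Energy3D's actual feedback mechanism:

    import math

    def solution_distance(student_params, optimal_params):
        """Euclidean distance between a student's design parameters and
        the optimal ones, assuming each is normalized to a 0-1 range."""
        return math.sqrt(sum((s - o) ** 2
                             for s, o in zip(student_params, optimal_params)))

    # Hypothetical design parameters: (tilt angle, row spacing), normalized.
    # A distance that shrinks across revisions signals progress toward the optimum.
    print(solution_distance([0.40, 0.70], [0.55, 0.60]))  # ~0.18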

View my PowerPoint slides for more information.

New Grant to Improve Assessment and Instruction in Elementary Science Classrooms

Eighteen states and the District of Columbia, representing more than a third of the U.S. student population, have adopted the Next Generation Science Standards (NGSS) since their release in 2013, and more are expected to follow. To make the most of NGSS, teachers need three-dimensional assessments that integrate disciplinary core ideas, crosscutting concepts, and science and engineering practices.

We are delighted to collaborate with the Learning Sciences Research Institute at the University of Illinois at Chicago and UChicago STEM Education on a new grant funded by the National Science Foundation to build teacher capacity and to develop and test classroom assessments for formative use that will promote high-quality science instruction and student learning in grades 3-5. These assessments will enable students to put their scientific knowledge to use by engaging in science practices, and will provide teachers with insight into students’ ability to address specific three-dimensional NGSS standards.

The project will work with teachers and other experts to co-develop formative assessment tasks and associated rubrics, and to collect data for evidence-based revision and redesign of the tasks. As teachers use the assessment tasks in their classrooms, the project will study that use to further refine teacher materials and to collect evidence of instructional validity. The project will also develop teacher support materials and foster a community around use of the assessment tasks. The goal is to build the capacity of teachers to implement, and respond formatively to, assessment tasks that are diagnostic and instructionally informative.

The project will seek to answer two research questions:

  • How well do these assessments function with respect to validity for classroom use, particularly as indicators of student proficiency and as tools to support teachers’ instructional practice?
  • In what ways do providing these assessment tasks and rubrics, and supporting teachers in their use, advance teachers’ formative assessment practices to support multi-dimensional science instruction?

Generative Design of Concentrated Solar Power Towers

In a sense, design is about choosing parameters. All the parameters available for adjustment form the basis of the multi-dimensional solution space. The ranges within which the parameters are allowed to change, often due to constraints, set the volume of the feasible region of the solution space where the designer is supposed to work. Parametric design is, to some extent, a way to convert design processes or subprocesses into algorithms that vary the parameters in order to automatically generate a variety of designs. Once such algorithms are established, users can easily create new designs by tweaking parameters without having to repeat the entire process manually. This reliance on computer algorithms to manipulate design elements is called parametricism in modern architecture.
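In code, a parametric design is just a function from parameters to designs. The sketch below, with hypothetical parameter names and ranges, samples the feasible region on a grid; a real generator would emit full 3D models rather than tuples:

    from itertools import product

    def generate_designs(param_ranges, steps=5):
        """Sample each parameter range on a uniform grid and yield one
        candidate design per combination of parameter values."""
        grids = [[lo + i * (hi - lo) / (steps - 1) for i in range(steps)]
                 for lo, hi in param_ranges]
        yield from product(*grids)

    # Hypothetical parameters: panel tilt angle (degrees), row spacing (meters).
    # The constraint-imposed ranges bound the feasible region being swept.
    designs = list(generate_designs([(0.0, 45.0), (2.0, 6.0)]))  # 25 candidates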

Parametricism allows people to use a computer to generate many designs for evaluation, comparison, and selection. If the choice of the parameters is driven by a genetic algorithm, the computer will also be able to spontaneously evolve the designs towards one or more objectives. In this article, I use the design of the heliostat field of a concentrated solar power tower as an example to illustrate how this type of generative design may be used to search for optimal designs in engineering practice. As always, I recorded a screencast video, which used the daily total output of such a power plant on June 22 as the objective function in order to speed up the calculation. The evaluation and ranking of different solutions in the real world must use the annual output or profit as the objective function. For the purpose of demonstration, the simulations I ran for this article were all based on a rather coarse grid (only four points per heliostat) and a pretty large time step (solar radiation computed only once per hour). In real-world applications, a much finer grid and a much smaller time step should be used to increase the accuracy of the calculation of the objective function.


Video: The animation of a generative design process of a heliostat field on an area of 75m×75m for a hypothetical solar power tower in Phoenix, AZ.

Figure 1: A parametric model of the sunflower.
Heliostat fields can take many forms (the radial stagger layout with different heliostat packing densities in multiple zones seems to be the dominant one). One of my earlier (and naïve) attempts was to treat the coordinates of every heliostat as parameters and use genetic algorithms to find optimal coordinates. In principle, there is nothing wrong with this approach. In reality, however, the algorithm tends to generate a lot of heliostat layouts that appear to be random distributions (later on, I realized that the problem is as challenging as protein folding -- when there are a lot of heliostats, there are just too many local optima that can easily trap a genetic algorithm, to the extent that it would probably never find the global optimum within any imaginable computational time frame). While a "messy" layout might in fact generate more electricity than a "neat" one, it is highly unlikely that a serious engineer would recommend such a solution or that a serious manager would approve it, especially for large projects that cost hundreds of millions of dollars to construct. For one thing, a seemingly stochastic distribution would not present the beauty of the Ivanpah Solar Power Facility seen through the lenses of famed photographers like Jamey Stillings.

In this article, I chose a biomimetic pattern proposed by Noone, Torrilhon, and Mitsos in 2012 based on Fermat's spiral as the template. The Fermat spiral can be expressed as a simple parametric equation, which in its discrete form has two parameters: a divergence parameter β that specifies the angle by which each successive point rotates, and a radial parameter b that controls how far each point lies from the origin, as shown in Figure 1.
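To make the template concrete, here is a minimal sketch of the discrete spiral in Python (the function name and default values are mine): the n-th heliostat sits at angle nβ and distance b√n from the tower.

    import math

    def fermat_spiral(n_heliostats, beta_deg=137.508, b=5.0):
        """Place the n-th heliostat at angle n*beta and radial distance
        b*sqrt(n) from the tower at the origin (Vogel's discrete model)."""
        beta = math.radians(beta_deg)
        return [(b * math.sqrt(n) * math.cos(n * beta),
                 b * math.sqrt(n) * math.sin(n * beta))
                for n in range(1, n_heliostats + 1)]

    # beta_deg = 137.508 (the golden angle) gives the sunflower pattern;
    # other divergence angles yield distinct layouts like those in Figure 2.
    field = fermat_spiral(500)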

Figure 2: Possible heliostat field patterns based on Fermat's spiral.
When β = 137.508° (the so-called golden angle), we arrive at Vogel's model that shows the pattern of florets like the ones we see in sunflowers and daisies (Figure 1). Before using a genetic algorithm, I first explored the design possibilities manually by using the spiral layout manager I wrote for Energy3D. Figure 2 shows some of the interesting patterns I came up with that appear to be sufficiently distinct. These patterns may give us some ideas about the solution space.
Figure 3: Standard genetic algorithm result.
Figure 4: Micro genetic algorithm result.

Then I used the standard genetic algorithm to find a viable solution. In this study, I allowed only four parameters to change: the divergence parameter β, the width and height of the heliostats (which affect the radial parameter b), and the radial expansion ratio (the rate at which the radial distance of each successive heliostat grows relative to that of the current one, which determines how much the packing density of the heliostats decreases with distance from the tower). Figure 3 shows the result after evaluating 200 different patterns, which seems to have converged to the sunflower pattern. The corresponding divergence parameter β was found to be 139.215°, the size of the heliostats 4.63m×3.16m, and the radial expansion ratio 0.0003. Note that the difference between β and the golden angle cannot be used alone as the criterion to judge the resemblance of the pattern to the sunflower pattern, as the distribution also depends on the size of the heliostats, which affects the parameter b.
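For readers curious about the mechanics, here is a bare-bones sketch of such a search over the four parameters, assuming a simple generational GA with truncation selection, uniform crossover, and bounded mutation. The parameter ranges are illustrative, and the toy fitness function merely stands in for Energy3D's expensive output simulation:

    import random

    # The four genes: divergence angle beta (degrees), heliostat width and
    # height (meters), and the radial expansion ratio. Ranges are illustrative.
    BOUNDS = [(120.0, 150.0), (2.0, 8.0), (1.0, 5.0), (0.0, 0.001)]

    def field_output(genes):
        """Stand-in for the simulated electricity output of the heliostat
        field laid out with these genes (the real objective function)."""
        beta, width, height, expansion = genes
        return -(beta - 137.508) ** 2 - (width * height - 15.0) ** 2  # toy

    def evolve(pop_size=20, generations=10, mutation_rate=0.1):
        pop = [[random.uniform(lo, hi) for lo, hi in BOUNDS]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=field_output, reverse=True)   # rank by fitness
            survivors = pop[: pop_size // 2]           # truncation selection
            children = []
            while len(survivors) + len(children) < pop_size:
                a, b = random.sample(survivors, 2)
                child = [random.choice(g) for g in zip(a, b)]  # uniform crossover
                for i, (lo, hi) in enumerate(BOUNDS):          # bounded mutation
                    if random.random() < mutation_rate:
                        child[i] = random.uniform(lo, hi)
                children.append(child)
            pop = survivors + children
        return max(pop, key=field_output)

    best_genes = evolve()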

I also tried the micro genetic algorithm. Figure 4 shows the best result after evaluating 200 patterns, which looks quite similar to Figure 3 but performs slightly worse. The corresponding divergence parameter β was found to be 132.600°, the size of the heliostats 4.56m×3.17m, and the radial expansion ratio 0.00033.

In conclusion, genetic algorithms seem to be able to generate Fermat spiral patterns that resemble the sunflower pattern, judging from the appearance of the final layouts.

CODAP Helps Students in Puerto Rico Understand the Effects of Extreme Weather

Students in the Luquillo Schoolyard Project in Puerto Rico are jamming on data. Large, long-term environmental data! And our free, online tool CODAP (Common Online Data Analysis Platform) joined their Data Jam to help students visualize and explore data in an inquiry-oriented way.

El Yunque National Forest, the only tropical rainforest in the U.S. National Forest System, was hit hard in 2017 by Hurricane Maria. (Photo courtesy of U.S. Forest Service.)

Data has become increasingly critical to understanding countless issues from business and politics to medicine and the environment. It’s hard to imagine a profession in the future that will not require data analysis skills. But while the Next Generation Science Standards (NGSS) feature data analysis and interpretation as part of the science and engineering practices, it’s hard for teachers to develop realistic activities using large datasets.

The Luquillo Schoolyard Project has developed a unique way to engage teachers and their students in local environmental issues while learning about big data, science, and research—and they draw and sing, too—all part of a Data Jam!

Data Jam is an outreach project of the National Science Foundation-funded Luquillo Long-Term Ecological Research (LTER) program in the El Yunque tropical rainforest. Students in a Data Jam work together on real data analysis projects, learning to formulate research questions, explore, analyze, and summarize environmental data, and come up with interpretations.

Students using CODAP to investigate environmental data from El Yunque National Forest during a Data Jam. (Photo: Noelia Báez Rodríguez)

Having recently experienced a drought followed by a Category 4 hurricane, Puerto Rico’s middle and high school students know firsthand the impact of weather on the environment. “It’s extremely impressive how resilient these kids are,” said Steven McGee, research associate professor of learning sciences at Northwestern University and president of The Learning Partnership, during a recent Concord Consortium data science education webinar. “We have kids whose school doesn’t have electricity, but they’re so devoted to science that they are coming out to the rainforest to do research.”

Led by teachers who have completed professional development training, students dive into authentic long-term research data from the Luquillo LTER, the Luquillo Critical Zone Observatory, and the U.S. Geological Survey. “Giving students real scientific datasets to explore introduces the messiness of data analysis that motivates the reasons why students should be engaging in basic data analysis strategies,” says McGee. “If students are only exposed to artificial datasets, the learning of basic analysis techniques seems like school exercises.”

Originally the project used Excel to create graphs, but it recently switched to CODAP, our web-based data analysis and visualization tool. “CODAP provides a platform for students to explore different ways to analyze the data,” McGee explains. “It’s easy for students to generate different types of graphs as a means to examine the data from different perspectives. This feature hopefully enables students to reflect on the type of information that can be gleaned from different types of graphs.”

A student presents her research data at the annual student symposium at the University of Puerto Rico. (Photo: Carla López Lloreda)

Successful Data Jam projects present their findings at a public symposium and poster session at the University of Puerto Rico. “A large number of scientists involved with the LTER program come and interact with the kids,” said Noelia Báez Rodríguez, coordinator of the Luquillo LTER Schoolyard program, during the webinar. Students also develop creative ways to communicate their results, including skits, drawings, short stories, poems, and songs—even a rap.*

If their goal is to get students interested in STEM careers, the Luquillo Schoolyard Project has a lot to jam about. Even in the midst of an ongoing environmental crisis, they’re getting students and teachers excited about data science. We’re proud to be a part of their success.

*By students Paul Ortiz and Jonathan Rodriguez as part of the 2016 Data Jam.

National Science Foundation awards new grant to strengthen data literacy across the curriculum

It’s impossible to overstate the importance of getting more students and teachers working with data across all subject areas. Name a problem we face as a society—from combating global warming to feeding the growing population, reducing violence, and increasing equity—and data-savvy people are at the heart of any attempt at a solution.

The Concord Consortium, in collaboration with EDC research scientist Josephine Louie and professors Beth Chance and Soma Roy at California Polytechnic State University, was awarded a grant from the National Science Foundation to develop curriculum materials to improve statistical and data understanding in high school students, particularly among groups underrepresented in STEM. Designed for use in non-AP statistics and mathematics classes, activities will emphasize statistical thinking and quantitative data analysis and use our intuitive web-based Common Online Data Analysis Platform (CODAP) for students to visualize, analyze, and learn from data.

To engage students in meaningful and relevant topics, activities will focus on social justice themes and be co-designed by social science teachers. The datasets will come from the Census Bureau via the Minnesota Population Center and their Integrated Public Use Microdata (IPUMS-USA) project, which has harmonized the data over time so students can study historical trends such as migration, demographic changes, and labor force changes. With its easy-to-use interface, CODAP helps students work with hierarchical data without having to learn to program.

Data from the Minnesota Population Center in CODAP, showing percent of persons over 25 years old with less than primary education. States in darker green have higher percentages.

Students will be able to (1) formulate questions that can be answered with data, (2) design and implement a plan to assemble appropriate data, (3) use numerical and graphical methods to explore the data, and (4) summarize conclusions that relate back to the original questions, citing relevant components of the analysis that support their interpretation and acknowledging other possible interpretations.

Our goal is to integrate data and data analysis throughout the K-12 curriculum, so that when students graduate from high school, data is not foreign to them. This new project is one important step in that direction.

The importance of scientifically literate citizens

At the Concord Consortium our goal is to prepare students to ask questions and use mental models to answer them. Students who develop this habit of mind early on will, we hope, become engaged and scientifically literate adults. And surely they will not lack for important questions to ask!

Here’s an example: According to a recent study published by the National Academy of Sciences, global sea level has increased by more than two inches in this century alone! Why is that happening? People who live on low-lying islands or in coastal cities around the world would really like to know.

Representative Mo Brooks (R-AL), a member of the House Science and Technology Committee and Vice Chair of its Subcommittee on Space, recently proposed a model for this phenomenon. He offered the opinion that a significant cause of the rise in sea level is falling rocks and other erosion, pointing specifically to the California coastline and the White Cliffs of Dover. This debris “forces the sea levels to rise because now you have less space in those oceans because the bottom is moving up,” he explained.

Is he right? Can erosion be causing the rise in sea level? Most important: do you have to be a scientist to address that question?

Actually, anyone can do it. All it takes is a little physics, a little math, and Google.

First, the physics:

  1. When you throw a rock in the ocean, the volume of the ocean goes up by exactly the volume of the rock because…
  2. the water displaced by the rock pushes on the surrounding water and ends up being spread evenly across the surface of the ocean (remember, water is a liquid!).
  3. So the increase in sea level ends up as a thin layer of water spread across the ocean—and the volume of that layer, which is the vertical rise in sea level times the surface area of the ocean, equals the volume of the displaced water, which is just the volume of the rock itself.

Let’s write that up as an equation:

(volume-of-rock) = (surface area of ocean) x (increase in sea level)

Now we need to know the surface area of the ocean. We could estimate it (about 70% of the Earth’s surface is ocean, and the radius of the Earth is about 4,000 miles…), or we could Google it.

From Google: The surface area of the Earth’s oceans is about 360 million square kilometers.

So to make the sea level rise by one inch we would need to throw in a lot of rocks—or one REALLY BIG rock—whose volume is 1 inch times that surface area. How big is that? Here comes the math!

Let’s put everything in feet, so we can compare. A kilometer is 1000 meters and a meter is about 3.28 feet, so a kilometer is 3280 feet, which makes a square kilometer roughly 10.8 million square feet. Multiplying by 360 million square kilometers, the ocean’s surface works out to about 3,900 million million, or 3.9 × 10^15, square feet.

So the volume of rock required to raise the ocean level by one inch (1/12 of a foot) is

(3.9/12) × 10^15 cubic feet, or roughly 0.32 × 10^15 cubic feet, or about 320 trillion cubic feet

How big is 320 trillion? Is it a mountain or a molehill? Turns out, more like a mountain.

Back to Google: The volume of Mount Everest (starting from its base, not from sea level) is 2.1 trillion (2.1 × 10^12) cubic feet. So to cause a one-inch rise in sea level you would need to push into the sea

320 / 2.1 ≈ 150 Mount Everests

That’s a lot of rock! Can erosion possibly account for the equivalent of 150 Mount Everests in just a few years? Back to Google…

There’s not a lot of information concerning falling rocks, it turns out, but topsoil erosion is a major concern to a lot of people, so we do know something about that. The European Commission’s Joint Research Centre on Sustainable Resources estimates that 36 billion tons of soil are washed away, worldwide, every year. A cubic foot of rock weighs about 150 pounds, so those 320 trillion cubic feet of rock would weigh nearly 5 × 10^16 pounds, or about 24 trillion tons. At 36 billion tons a year, it would take soil erosion almost 700 years to raise the ocean level by a single inch. Even if falling rocks and other non-soil debris contributed a similar amount, one inch of rise would still take centuries.
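Here is the whole model as a short Python script, a sanity check anyone can run, using the same rounded inputs as above:

    # Back-of-the-envelope check of the erosion explanation for sea level rise.
    OCEAN_AREA_KM2 = 360e6         # ocean surface area, ~360 million km^2
    FT_PER_KM = 3280               # 1 km is about 3,280 ft
    EVEREST_VOLUME_FT3 = 2.1e12    # Mount Everest above its base, cubic feet
    ROCK_LB_PER_FT3 = 150          # weight of a cubic foot of rock
    EROSION_TONS_PER_YEAR = 36e9   # global topsoil washed away per year

    ocean_area_ft2 = OCEAN_AREA_KM2 * FT_PER_KM ** 2   # ~3.9e15 ft^2
    rise_volume_ft3 = ocean_area_ft2 / 12              # a 1-inch layer of water
    everests = rise_volume_ft3 / EVEREST_VOLUME_FT3
    years = rise_volume_ft3 * ROCK_LB_PER_FT3 / 2000 / EROSION_TONS_PER_YEAR

    print(f"{rise_volume_ft3:.2e} cubic feet of rock")  # ~3.2e14 (320 trillion)
    print(f"{everests:.0f} Mount Everests")             # ~150
    print(f"{years:.0f} years of global soil erosion")  # ~670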

The Next Generation Science Standards call for students—and that really applies to all of us!—to learn to use models to answer questions. When we do that it becomes clear that erosion isn’t to blame for the rise in sea level. And the best part? We don’t need to rely on experts, we can figure it out for ourselves!

Using Artificial Intelligence to Design a Solar Farm

Everyone loves to maximize the return on investment (ROI). If you can effortlessly find a solution that pays a higher profit -- even if only by a few dollars -- why not? The problem is that, in many complicated engineering cases in the real world, such as designing a solar farm, we often don't know exactly what the optimal solutions are. We may know how to get some good solutions based on what textbooks or experts say, but no one in the world can be 100% sure that there aren't better ones waiting to be discovered beyond the solution space that we have explored. As humans, we can easily grow complacent and settle for the solutions that we feel good about, leaving the job (and the reward) of finding better solutions to another time or someone else.

Artificial intelligence (AI) is about to change all that. As design is essentially an evolution of solutions, AI techniques such as genetic algorithms (GA) are an excellent fit for many design problems and can generate a rich variety of competitive designs in the same way genetics does for biology (no two leaves are the same, but they both work). These powerful tools have the potential to help people learn, design, and discover new things. In this article, I demonstrate how GA can be used to design a photovoltaic (PV) solar farm. As always, I first provide a short screencast video, in which I used the daily output or profit as the objective function to speed up the animation so that you can see the evolution driven by GA. The actual assessments, presented in the text that follows the video, are based on the annual output or profit as the objective function. Note that the design process is still geared towards a single objective (i.e., the total output in kWh or the total profit in dollars over a given period of time). Design problems with multiple objectives will be covered later.


In GA, the solution depends largely on the choice of the objective function (or fitness function), which specifies how the main goal is calculated. For example, if the main goal is to generate as much electricity as possible on a given piece of land without concern for the cost of the solar panels, a design in which the solar panels are closely packed may be a good choice. On the other hand, if the main goal is to generate as much electricity as possible from each individual solar panel because of their high price, a design in which rows of solar panels are far away from one another would be a good choice. Unsurprisingly, in the case shown in the video, a single row of solar panels was found to be the best solution. Real-world problems, which aim at maximizing profit, always lie between these two extremes, which is why they must be solved using the principles of engineering design. The video above clearly illustrates the design evolution driven by GA in all three cases (the two extremes and an intermediate).

Figure 1. An Energy3D model of an existing solar farm in Massachusetts.
To test the usefulness of the GA implementation in Energy3D for solving real-world problems, I picked an existing solar farm in Massachusetts (Figure 1) to see if GA could find better solutions. A 3D model of the solar farm had been created in the Virtual Solar Grid based on the information shown on Google Maps, and its annual output was calculated using Energy3D. Because I couldn't be exactly sure about the tilt angle, I also tweaked it a bit manually to ensure that an optimal tilt angle for the array was chosen (I found it to be around 32° in this case). The existing solar farm has 4,542 solar panels, capable of generating 2,255 MWh of electricity each year, based on the analysis result of Energy3D. [I must declare here that the selection of this site was purely for the purpose of scientific research; any opinion expressed as a result of this research should be viewed as exploratory and should not be considered an evaluation of the existing solar farm or its designer(s). There might be other factors beyond my comprehension that caused a designer to choose a particular trade-off. The purpose of this article is to show that, if we know all the factors that need to be considered in such a design task, we can use AI to augment our intelligence, patience, and diligence.]

Figure 2. The results of 10 iterations.
Energy3D has a tool that allows the user to draw a polygon (marked by white lines) within which the solar farm must be designed, ensuring that solutions are always confined to the specified area. I used this tool to set the boundary of the solar farm under design. This took care of an important spatial constraint and guaranteed that GA would always generate solutions on approximately the same land parcel that the existing solar farm occupies.

For the objective function, we can select the total annual output, the average annual output per solar panel, or the annual profit. I chose the annual profit and assumed that the generated electricity would sell for 22.5 cents per kWh (the 2018 average retail price in Massachusetts) and that the daily cost of a solar panel (summing up the cost of maintenance, financing, and so on) would be 20 cents. I didn't know how accurate these ROI numbers were, but let's just go with them for now. The annual profit is the total sale income minus the total operational cost. Qualitatively, we know that a higher electricity price and a lower operational cost both favor using more solar panels, whereas a lower electricity price and a higher operational cost both favor using fewer solar panels. Finding the sweet spots in the middle requires quantitative analyses and comparisons of many different cases, which can be outsourced to AI.
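Restated as code, this fitness function is just a few lines. The constants below come from the assumptions above; the function name and the baseline check are mine:

    PRICE_PER_KWH = 0.225        # 2018 average retail electricity price in MA
    PANEL_COST_PER_DAY = 0.20    # assumed maintenance + financing per panel

    def annual_profit(annual_output_kwh, panel_count):
        """Fitness function: total sale income minus total operational cost."""
        income = annual_output_kwh * PRICE_PER_KWH
        cost = panel_count * PANEL_COST_PER_DAY * 365
        return income - cost

    # Baseline from the existing farm: 4,542 panels producing ~2,255 MWh/year.
    baseline = annual_profit(2_255_000, 4_542)  # ~$176,000/year under this model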

Figure 3: The best design from 2,000 solutions
Figure 4: The second best design from 2,000 solutions.
In Energy3D, GA always starts with the current design as part of the first generation (so if you already have a good design, it will converge quickly). In order for GA not to inherit anything from the existing solar farm, I created an initial model that had only a rack with a few solar panels on it and a zero tilt angle. The size of the population was set to be 20. So at the beginning, this initial model would compete with 19 randomly generated solutions and was almost guaranteed to lose the chance to enter the next generation. In order to stop and check the results, I let GA run for only 10 generations. For convenience, let's call every 10 generations of GA evolution an iteration. Figure 2 shows that GA generated solutions below the supposed human performance in the first two iterations but quickly surpassed it after that. The solution kept improving but got stuck in iterations 5-7 and then it advanced again and stagnated again in iterations 8-10. This process could continue indefinitely, but I decided to terminate it after 10 iterations, or 100 generations. By this time, the software had generated and evaluated 2,000 solutions, which took a few hours as it had to run 2,000 annual simulations for thousands of solar panels.

The best solution (Figure 3) that emerged from these 2,000 generated solutions used 5,420 solar panels fixed at a tilt angle of 28.3° to generate 2,667 MWh per year and was about 16% better than the existing one based on the ROI model described above. The second best solution (Figure 4) used 4,670 solar panels fixed at a tilt angle of 38.6° to generate 2,340 MWh per year and was about 5.5% better than the existing one based on the ROI model. Note that if we use the average annual output per solar panel as the criterion, the second best solution would actually be better than the best one, but we know that the average panel output is not a good choice for the fitness function as it can result in an optimal solution with very few solar panels.

In conclusion, the generative design tools in Energy3D powered by AI can be used to search a large volume of the solution space and find a number of different solutions for the designer to pick and choose from. The ability of AI to transcend human limitations in complex design is a significant application and could not be more exciting! We predict that future work will rely more and more on this power, and today's students should be prepared for it.

Using Artificial Intelligence to Design Energy-Efficient Buildings

The National Science Foundation issued a statement on May 10, 2018 in which the agency envisions that "The effects of AI will be profound. To stay competitive, all companies will, to some extent, have to become AI companies. We are striving to create AI that works for them, and for all Americans." This is probably the strongest message and the clearest marching orders from a top science agency in the world about a particular area of research thus far. The application of AI to the field of design, and more broadly, creativity, is considered by many as the moonshot of the ongoing AI revolution, which is why I have chosen to dedicate a considerable portion of my time and effort to this strategically important area.

I have added two more categories of applications of genetic algorithms (GAs) for assisting engineering design in Energy3D, the main platform on which I am striving to create a "designerly brain." One example is to find the optimal position for adding a new building with glass curtain walls to an open space in an existing urban block so that the new building uses the least amount of energy. The other example is to find the optimal sizes of the windows on different sides of a building so that the building uses the least amount of energy. To give you a quick idea of how GAs work in these cases, I recorded the following two screencast videos from Energy3D. To speed up the search processes visualized in the videos, I chose the daily energy use as the objective function and optimized only for the winter condition. The solutions optimized for the annual energy use are shown later in this article.



Figure 1: A location of the building recommended by GA if it is in Boston.
Figure 2: A location of the building recommended by GA if it is in Phoenix.
For the first example, the energy use of a building in an urban block depends on how much solar energy it receives. In the winter, solar energy is good for the building as it warms the building up and saves heating energy. In the summer, excessive heating caused by solar energy must be removed through air conditioning, increasing the energy use. The exact amount of energy use per year depends on a lot of other factors, such as the fenestration of the building, its insulation, and its size. In this demo, we focus only on searching for a good location for the building, with everything else fixed. I chose a population of 32 individuals and let the GA run for only five generations. Figures 1 and 2 show the final solutions for Boston (a heating-dominant area) and Phoenix (a cooling-dominant area), respectively. Not surprisingly, the GA results suggest that the new building be placed in a location with more solar access in the Boston case and in a location with less solar access in the Phoenix case.

Figure 3: Window sizes of a building recommended by GA for Chicago.
Figure 4: Window sizes of a building recommended by GA for Phoenix.
For the second example, the energy use of a building depends on how much solar energy it receives through the windows and how much thermal energy transfers through the windows (since windows typically have less thermal resistance than walls). In the winter, while a larger window allows more solar energy to shine into the building and warm it up during the day, it also allows more thermal energy to escape through the larger area, especially at night. In the summer, both solar radiation and heat transfer through a larger window will contribute to the increase of the energy needed to cool the building. And this complicated relationship changes when the solution is designed for a different climate. Figures 3 and 4 show the final solutions for Chicago and Phoenix as suggested by the GA results, respectively. Note that not all GA results are acceptable solutions, but they can play advisory roles during a design process, especially for novice designers who do not have anyone to consult with.
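This tradeoff can be caricatured with a toy seasonal model. All the coefficients below are invented for illustration and merely stand in for Energy3D's full simulation:

    def annual_window_energy(area_m2, winter_sun=300.0, summer_sun=200.0,
                             winter_loss=180.0, summer_loss=60.0):
        """Extra energy use (kWh/year) attributable to a window of the given
        area: in winter, solar gain offsets conduction losses; in summer,
        both solar gain and conduction add to the cooling load."""
        winter = (winter_loss - winter_sun) * area_m2
        summer = (summer_loss + summer_sun) * area_m2
        return winter + summer

    # With fixed linear coefficients the optimum sits at a bound; in the real
    # simulation the terms vary with orientation, shading, and climate, which
    # is why a GA searches the window sizes of all facades jointly.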

In conclusion, artificial intelligence techniques such as GAs provide automated procedures that can help designers find optimal solutions more efficiently, freeing them from tedious, repetitive tasks when an extensive search of the solution space is necessary. Energy3D provides an accessible platform that integrates design, visualization, and simulation seamlessly to demonstrate this potential. Our next step is to figure out how to translate this power into instructional intelligence that can help students and designers develop their creative thinking abilities.

3 Reasons to Vote in STEM For All Video Showcase

We’re thrilled to present three videos in the National Science Foundation STEM for All Video Showcase from May 14 to 21! We invite you to view the videos and join the conversation about research projects that are transforming the STEM educational landscape. Please vote for our videos through Facebook, Twitter, or email!

Geniventure

Geniventure is a free online game with an Intelligent Tutoring System that engages students from middle school through higher education in genetics and heredity as they save virtual dragons from extinction. Through scaffolded virtual investigations, students explore the physical traits that result from allele combinations, then zoom into cells and manipulate the proteins that ultimately give rise to those traits.

Watch & Vote


InSPECT

Integrating Computational Thinking and Experimental Science

InSPECT supports the integration of computational thinking (CT) in experimental science with a novel technology-enhanced curriculum, and examines how students engage in CT using these tools for inquiry. InSPECT is designing a series of open-ended high school biology experiments using inexpensive DIY lab instruments developed in partnership with Manylabs, including Dataflow—a digital tool for experimental control and data acquisition using Internet-of-Things sensors.

Watch & Vote


Teaching Environmental Sustainability with Model My Watershed

With our collaborators at Stroud Water Research Center, we’re developing interdisciplinary, place-based, problem-based, hands-on resources and models aligned to NGSS to promote watershed stewardship, geospatial literacy, and systems thinking. We’re introducing middle and high school students to environmental and geospatial science in ways that encourage them to pursue environmental and geoscience careers.

Watch & Vote