Monthly Archives: June 2018

Designing 2030: Building the Educational Technology Community of Tomorrow for Design, Interoperability, and Equity

The Concord Consortium is thrilled to announce a new initiative to transform STEM teaching and learning and reach more students with educational technology. By applying current and future technologies in unique ways, generating new collaborations, and leveraging the power of open educational resources, a group of innovative thought leaders is working to revolutionize STEM learning experiences and bring them to a broader, more diverse group of learners. A two-day summit sponsored by the Gordon and Betty Moore Foundation jumpstarted the effort last month. Designing 2030 was the first step on a journey to predict, imagine, and design for 2030.

STEM education professionals, scientists, curriculum developers, learning scientists, and more gathered for the inaugural Designing 2030 Summit.

Thirty STEM education professionals, leaders of informal education institutions, scientists, entrepreneurs, curriculum developers, K-12 educational software programmers, teachers, and learning scientists convened at Dynamicland, an experimental space in Oakland, California, that embodies communal computing. Dynamicland houses a multi-person computer embedded into physical rooms, which supports collaboration through tangible objects, code printed on paper, and digital projections on surfaces.

Concord Consortium Executive Vice President Sherry Hsi welcomed the group to Dynamicland, and President and CEO Chad Dorsey set the meeting in the context of technological advances by describing a timeline of different technologies that have changed how people get information and connect with others—from the advent of social media to digital appliances.

How can technology transform the way we teach and learn science and broaden participation by more learners?

Participants outlined potential educational technology opportunities and challenges faced by school-aged students, brought to life by six fictional personas (Figure 1). They sketched their ideas onto posters, then turned these into blueprints for shared actions. Inspirational talks, technology demonstrations, ideation activities, informal meeting time, and open discussion encouraged participants to dream big and craft a vision for the future of teaching and learning.

Figure 1. Fran (top) is a 10-year-old girl living in Cleveland, attending 4th grade in a new public school downtown. Dominic (bottom) is a 16-year-old in Texas, enrolled in a rural high school.

Participants reflected on strategic questions:

  • What should learning look like in 2030?
  • What role will new or yet-unimagined technologies play in aiding and supporting learning?
  • How will information and data help teachers and mentors provide an ongoing ecosystem of support that extends to all learners, regardless of location, economic resources, or background?
  • How will all of these factors work together across learners’ lives in ways that equip them for meaningful contributions as global citizens?

Speakers inspire with farsighted views

Invited speakers ignited new ideas about the future of teaching and learning. Bret Victor, technology visionary and creator of Dynamicland, described a future of computing in which people work together shoulder to shoulder with physical materials and have agency to change the things they create—in sharp contrast to highly independent, isolated users of apps developed by others.

Adam Tobin, CEO of Chabot Space & Science Center, re-envisioned the role of museums and science centers as critical educational hubs to support an integrated STEM learning ecosystem. Colin Dixon, research associate at the Concord Consortium, shared a story of a youth who created a computer from old milk cartons and PVC pipes in a mobile makerspace, suggesting that educators need to create space for surprises and rethink what counts as “technology” in light of young people’s imaginations and maker-oriented opportunities.

Cultural anthropologist Mimi Ito asked us to think about designing educational systems that serve the interests and needs of learners and to reconsider the notion that “kids move through a pipeline from school to the workforce”—instead thinking of a web of relationships, opportunities, pathways, and interests that serve as learner supports over many years. Jeremy Roschelle, executive director of learning sciences research at Digital Promise, also emphasized the idea that education should be less about learning new things (facts, content) and more about collaboration and building new relationships that connect ideas in productive ways.

Judi Fusco, senior research scientist of STEM teaching and learning at Digital Promise, spoke of the importance of teachers, mentors, and coaches to support students, and the role of teachers and others in the community as key propagators of educational research to be put to use in the classroom. Finally, Britte Cheng, assessment researcher formerly at SRI International, described both the need to design educational implementations for equity and justice and the need to design against stereotypes and bias.

As a next step in designing the future of STEM teaching and learning with educational technology, we will convene a group of designers and software developers to take the ideas and visions from the Designing 2030 summit and make detailed design sketches, prototype scenarios, and begin software mash-ups as working examples. We’re designing the future now!

Contact Sherry Hsi for more information about Designing 2030.

Maine Teacher Workshop on Artificial Intelligence in Engineering Education

On June 10-12, we hosted a successful teacher professional development workshop in York, Maine for 29 teachers from seven states. The theme was the application of artificial intelligence (AI) in engineering education to assist teaching and foster learning. The workshop was supported by generous funding from General Motors and the National Science Foundation.

The teachers explored how the AI tools built into Energy3D could help students learn STEM concepts and skills required by the Next Generation Science Standards (NGSS), especially engineering design. Together we brainstormed how AI applications such as generative design might change their teaching. We believe AI can transform STEM education in four ways:

  • Augment students with tools that accelerate problem solving, thereby supporting them to explore more broadly.
  • Identify cognitive gaps between students' current knowledge and the learning goals, thereby enabling them to learn more deeply.
  • Suggest alternative solutions beyond students' current work, thereby spurring them to think more creatively.
  • Assess students' performance by computing the distances between their solutions and the optimal ones, thereby providing formative feedback during the design process.

The activities that the teachers tried were situated in the context of building science and solar engineering, facilitated by our Solarize Your World curriculum. We presented examples that demonstrated the affordances of AI for supporting learning and teaching along these four directions, especially in engineering design, which is highly open-ended. Teachers first learned how to design a solar farm in the conventional way and then learned how to accomplish the same task in the AI way, which—in theory—can lead to broader exploration, deeper understanding, better solutions, and faster feedback.
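The fourth point, distance-based assessment, can be illustrated with a toy metric: normalize each design parameter by its allowed range and measure how far a student's design sits from an optimal one. This is a hypothetical sketch; the parameter names, ranges, and "optimal" design below are invented for illustration and are not Energy3D's actual feedback model.

```python
import math

def design_distance(student, optimal, ranges):
    """Normalized Euclidean distance between a student's design
    parameters and a reference optimal design (0 = identical)."""
    total = 0.0
    for key, (lo, hi) in ranges.items():
        d = (student[key] - optimal[key]) / (hi - lo)  # scale to [0, 1]
        total += d * d
    return math.sqrt(total / len(ranges))

# Hypothetical solar-farm design parameters: panel tilt (degrees)
# and row spacing (meters).
ranges = {"tilt": (0.0, 90.0), "row_spacing": (1.0, 10.0)}
optimal = {"tilt": 33.0, "row_spacing": 4.5}
student = {"tilt": 45.0, "row_spacing": 3.0}

score = design_distance(student, optimal, ranges)
print(round(score, 3))
```

A score near zero would indicate a design close to the reference optimum, giving a simple numeric basis for formative feedback during the design process.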

View my PowerPoint slides for more information.

New Grant to Improve Assessment and Instruction in Elementary Science Classrooms

Eighteen states and the District of Columbia, representing more than a third of the U.S. student population, have adopted the Next Generation Science Standards (NGSS) since their release in 2013, and more are expected to follow. To make the most of NGSS, teachers need three-dimensional assessments that integrate disciplinary core ideas, crosscutting concepts, and science and engineering practices.

We are delighted to collaborate with the Learning Sciences Research Institute at the University of Illinois at Chicago and UChicago STEM Education on a new grant funded by the National Science Foundation to build teacher capacity and develop and test classroom assessments for formative use that will promote high-quality science instruction and student learning in grades 3-5. These assessments will enable students to put their scientific knowledge to use by engaging in science practices, and will provide teachers with insight into students’ ability to address specific three-dimensional NGSS standards.

The project will work with teachers and other experts to co-develop formative assessment tasks and associated rubrics, and collect data for evidence-based revision and redesign of the tasks. As teachers are using the assessment tasks in their classrooms, the project will study their usage to further refine teacher materials and to collect evidence of instructional validity. The project will also develop teacher support materials and foster a community around use of the assessment tasks. The goal is to build the capacity of teachers to implement and respond formatively to assessment tasks that are diagnostic and instructionally informative.

The project will seek to answer two research questions:

  • How well do these assessments function with respect to aspects of validity for classroom use, particularly in terms of indicators of student proficiency, and tools to support teacher instructional practice?
  • In what ways do providing these assessment tasks and rubrics, and supporting teachers in their use, advance teachers’ formative assessment practices to support multi-dimensional science instruction?


Generative Design of Concentrated Solar Power Towers

In a sense, design is about choosing parameters. All the parameters available for adjustment form the basis of the multi-dimensional solution space. The ranges within which the parameters are allowed to change, often due to constraints, set the volume of the feasible region of the solution space where the designer is supposed to work. Parametric design is, to some extent, a way to convert design processes or subprocesses into algorithms that vary the parameters in order to automatically generate a variety of designs. Once such algorithms are established, users can easily create new designs by tweaking parameters without having to repeat the entire process manually. The reliance on computer algorithms to manipulate design elements is called parametricism in modern architecture.

Parametricism allows people to use a computer to generate many designs for evaluation, comparison, and selection. If the choice of parameters is driven by a genetic algorithm, the computer can also spontaneously evolve the designs towards one or more objectives. In this article, I use the design of the heliostat field of a concentrated solar power tower as an example to illustrate how this type of generative design may be used to search for optimal designs in engineering practice. As always, I recorded a screencast video; to speed up the calculation, it uses the daily total output of such a power plant on June 22 as the objective function. In the real world, the evaluation and ranking of different solutions must use the annual output or profit as the objective function. For the purpose of demonstration, the simulations I ran for this article were all based on a rather coarse grid (only four points per heliostat) and a fairly large time step (solar radiation calculated only once per hour). In real-world applications, a much finer grid and a much smaller time step should be used to increase the accuracy of the objective function.
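To make the time-step idea concrete, here is a toy version of such a daily-output objective function that steps through the day at a fixed interval. The clear-sky irradiance model, plant efficiency, and Phoenix-like latitude are simplifying assumptions for illustration; this is a stand-in for, not a reproduction of, Energy3D's radiation engine.

```python
import math

def daily_output(field_area_m2, day_of_year=173, efficiency=0.2,
                 latitude_deg=33.45, step_hours=1.0):
    """Toy objective: approximate daily energy output (kWh) by
    summing hourly contributions over the daylight hours."""
    # Solar declination (degrees), standard approximation
    decl = 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))
    lat, dec = math.radians(latitude_deg), math.radians(decl)
    total_kwh = 0.0
    t = 0.0
    while t < 24.0:
        hour_angle = math.radians(15.0 * (t - 12.0))
        # Sine of the solar elevation angle
        sin_el = (math.sin(lat) * math.sin(dec)
                  + math.cos(lat) * math.cos(dec) * math.cos(hour_angle))
        if sin_el > 0.0:  # sun above the horizon
            irradiance_kw = 1.0 * sin_el  # ~1 kW/m^2 at zenith, clear sky
            total_kwh += irradiance_kw * field_area_m2 * efficiency * step_hours
        t += step_hours
    return total_kwh

print(round(daily_output(1000.0), 1))  # rough kWh for 1000 m^2 on June 22
```

Shrinking `step_hours` (and, in the real simulation, refining the per-heliostat grid) makes the sum a closer approximation of the true integral, at proportionally higher computational cost.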


Video: The animation of a generative design process of a heliostat field on an area of 75m×75m for a hypothetical solar power tower in Phoenix, AZ.

Figure 1: A parametric model of the sunflower.
Heliostat fields can take many forms (the radial stagger layout, with different heliostat packing densities in multiple zones, seems to be the dominant one). One of my earlier (and naïve) attempts was to treat the coordinates of every heliostat as parameters and use genetic algorithms to find optimal coordinates. In principle, there is nothing wrong with this approach. In practice, however, the algorithm tends to generate heliostat layouts that look like random distributions (I later realized the problem is as hard as protein folding: with many heliostats, there are simply too many local optima that can trap a genetic algorithm, so it would probably never find the global optimum within any imaginable computational time frame). While a "messy" layout might in fact generate more electricity than a "neat" one, it is highly unlikely that a serious engineer would recommend such a solution or that a serious manager would approve it, especially for large projects that cost hundreds of millions of dollars to construct. For one thing, a seemingly stochastic distribution would not present the beauty of the Ivanpah Solar Power Facility through the lenses of famed photographers like Jamey Stillings.

In this article, I chose a biomimetic pattern, proposed by Noone, Torrilhon, and Mitsos in 2012 and based on Fermat's spiral, as the template. The Fermat spiral can be expressed as a simple parametric equation, which in its discrete form has two parameters: a divergence parameter β that specifies the angle by which each successive point rotates, and a radial parameter b that specifies how far each point lies from the origin, as shown in Figure 1.
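In the standard discrete form (Vogel's formulation), the n-th point of the spiral sits at

```latex
r_n = b\sqrt{n}, \qquad \theta_n = n\,\beta
```

where θ_n is the polar angle and r_n the distance from the origin. The square-root radial growth is what keeps the point density roughly uniform across the field.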

Figure 2: Possible heliostat field patterns based on Fermat's spiral.
When β = 137.508° (the so-called golden angle), we arrive at Vogel's model that shows the pattern of florets like the ones we see in sunflowers and daisies (Figure 1). Before using a genetic algorithm, I first explored the design possibilities manually by using the spiral layout manager I wrote for Energy3D. Figure 2 shows some of the interesting patterns I came up with that appear to be sufficiently distinct. These patterns may give us some ideas about the solution space.
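As a rough illustration, Vogel's model can be generated in a few lines of Python. The point count, radial parameter b, and the 75m×75m clipping below are illustrative choices matching the video's field size, not the exact values used in Energy3D.

```python
import math

def spiral_layout(count, b=1.5, beta_deg=137.508, half_width=37.5):
    """Place points on a discrete Fermat spiral (Vogel's model when
    beta = 137.508 degrees) and clip to a square field."""
    points = []
    beta = math.radians(beta_deg)
    for n in range(1, count + 1):
        r = b * math.sqrt(n)                       # radial distance
        x = r * math.cos(n * beta)                 # rotate by n * beta
        y = r * math.sin(n * beta)
        # Keep only heliostats inside the 75 m x 75 m field
        if abs(x) <= half_width and abs(y) <= half_width:
            points.append((x, y))
    return points

layout = spiral_layout(600)
print(len(layout))
```

Sweeping `beta_deg` away from the golden angle reproduces the distinctly different spiral patterns of Figure 2 from the same two-parameter template.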
Figure 3: Standard genetic algorithm result.
Figure 4: Micro genetic algorithm result.

Then I used the standard genetic algorithm to find a viable solution. In this study, I allowed only four parameters to change: the divergence parameter β, the width and height of the heliostats (which affect the radial parameter b), and the radial expansion ratio (how much the radial distance from one heliostat to the next increases, which controls how the packing density of the heliostats decreases with distance from the tower). Figure 3 shows the result after evaluating 200 different patterns, which seems to have converged to the sunflower pattern. The corresponding divergence parameter β was found to be 139.215°, the size of the heliostats 4.63m×3.16m, and the radial expansion ratio 0.0003. Note that the difference between β and the golden angle cannot alone be used as the criterion to judge the resemblance of the pattern to the sunflower pattern, as the distribution also depends on the size of the heliostats, which affects the parameter b.
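The search itself can be sketched as a plain genetic algorithm over these four parameters. In the sketch below, the fitness function is a deliberately simple stand-in that rewards divergence angles near the golden angle; the real objective in Energy3D is the simulated output of the heliostat field, and the parameter bounds are invented for illustration.

```python
import random

# The four layout parameters and (hypothetical) allowed ranges
BOUNDS = {
    "beta": (120.0, 150.0),      # divergence angle (degrees)
    "width": (2.0, 6.0),         # heliostat width (m)
    "height": (2.0, 5.0),        # heliostat height (m)
    "expansion": (0.0, 0.001),   # radial expansion ratio
}

def random_individual():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def toy_fitness(ind):
    # Stand-in objective: peaks at the golden angle. Real code would
    # build the field and run a radiation simulation instead.
    return -abs(ind["beta"] - 137.508)

def crossover(a, b):
    # Uniform crossover: each gene comes from either parent
    return {k: random.choice((a[k], b[k])) for k in BOUNDS}

def mutate(ind, rate=0.2):
    out = dict(ind)
    for k, (lo, hi) in BOUNDS.items():
        if random.random() < rate:
            out[k] = random.uniform(lo, hi)  # resample within bounds
    return out

def evolve(generations=40, pop_size=20, seed=42):
    random.seed(seed)
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=toy_fitness, reverse=True)  # best first
        elite = pop[: pop_size // 2]             # keep the top half
        children = [mutate(crossover(random.choice(elite),
                                     random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=toy_fitness)

best = evolve()
print(round(best["beta"], 1))
```

Because the top half of each generation survives unchanged, the best fitness never decreases, so the population drifts toward the objective's peak over the 200 or so evaluations, much as in the runs behind Figures 3 and 4.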

I also tried the micro genetic algorithm. Figure 4 shows the best result after evaluating 200 patterns, which looks quite similar to Figure 3 but performs slightly worse. The corresponding divergence parameter β was found to be 132.600°, the size of the heliostats 4.56m×3.17m, and the radial expansion ratio 0.00033.

In conclusion, judging from the final patterns, genetic algorithms seem to be able to generate Fermat spiral layouts that resemble the sunflower pattern.

CODAP Helps Students in Puerto Rico Understand the Effects of Extreme Weather

Students in the Luquillo Schoolyard Project in Puerto Rico are jamming on data. Large, long-term environmental data! And our free, online tool CODAP (Common Online Data Analysis Platform) joined their Data Jam to help students visualize and explore data in an inquiry-oriented way.

El Yunque National Forest, the only tropical rainforest in the U.S. National Forest System, was hit hard in 2017 by Hurricane Maria. (Photo courtesy of U.S. Forest Service.)

Data has become increasingly critical to understanding countless issues from business and politics to medicine and the environment. It’s hard to imagine a profession in the future that will not require data analysis skills. But while the Next Generation Science Standards (NGSS) feature data analysis and interpretation as part of the science and engineering practices, it’s hard for teachers to develop realistic activities using large datasets.

The Luquillo Schoolyard Project has developed a unique way to engage teachers and their students in local environmental issues while learning about big data, science, and research—and they draw and sing, too—all part of a Data Jam!

Data Jam is an outreach project of the National Science Foundation-funded Luquillo Long-Term Ecological Research (LTER) program in the El Yunque tropical rainforest. Students in a Data Jam work together on real data analysis projects, learning to formulate research questions, explore, analyze, and summarize environmental data, and come up with interpretations.

Students using CODAP to investigate environmental data from El Yunque National Forest during a Data Jam. (Photo: Noelia Báez Rodríguez)

Having recently experienced a drought followed by a Category 4 hurricane, Puerto Rico’s middle and high school students know firsthand the impact of weather on the environment. “It’s extremely impressive how resilient these kids are,” said Steven McGee, research associate professor of learning sciences at Northwestern University and president of The Learning Partnership, during a recent Concord Consortium data science education webinar. “We have kids whose school doesn’t have electricity, but they’re so devoted to science that they are coming out to the rainforest to do research.”

Led by teachers who have completed professional development training, students dive into authentic long-term research data from the Luquillo LTER, the Luquillo Critical Zone Observatory, and the U.S. Geological Survey. “Giving students real scientific datasets to explore introduces the messiness of data analysis that motivates the reasons why students should be engaging in basic data analysis strategies,” says McGee. “If students are only exposed to artificial datasets, the learning of basic analysis techniques seems like school exercises.”

Originally the project used Excel to create graphs, but they recently switched to CODAP, our web-based data analysis and visualization tool. “CODAP provides a platform for students to explore different ways to analyze the data,” McGee explains. “It’s easy for students to generate different types of graphs as a means to examine the data from different perspectives. This feature hopefully enables students to reflect on the type of information that can be gleaned from different types of graphs.”

A student presents her research data at the annual student symposium at the University of Puerto Rico. (Photo: Carla López Lloreda)

Successful Data Jam projects present their findings at a public symposium and poster session at the University of Puerto Rico. “A large number of scientists involved with the LTER program come and interact with the kids,” said Noelia Báez Rodríguez, coordinator of the Luquillo LTER Schoolyard program, during the webinar. Students also develop creative ways to communicate their results, including skits, drawings, short stories, poems, and songs—even a rap.*

If their goal is to get students interested in STEM careers, the Luquillo Schoolyard Project has a lot to jam about. Even in the midst of an ongoing environmental crisis, they’re getting students and teachers excited about data science. We’re proud to be a part of their success.

*By students Paul Ortiz and Jonathan Rodriguez as part of the 2016 Data Jam.