Archive for May 2012

Video: Three Views of Molecular Workbench

May 29th, 2012 by The Concord Consortium

The Molecular Workbench has been downloaded over 800,000 times, making it Concord Consortium’s most popular single piece of software. We’re heading to a million and documenting in video both our history and our vision for the future.

Learn from Charles Xie, Senior Scientist and creator of the Molecular Workbench, about the computational engines that accurately simulate atomic motions, quantum waves, and atomic-scale interactions based on fundamental equations and laws in physics.

Amy Pallant, who researched student use of Molecular Workbench, describes the phone calls she made to students months after they’d used the software—and how impressed she was with their memory of the science of atoms and molecules.

Dan Damelin, Technology and Curriculum Developer, recalls his time as a classroom teacher and his frustration with trying to describe atoms and molecules to his students with words and pictures. He wanted more—and found it in Molecular Workbench!

Dan sums up the goal for Molecular Workbench: “It’s going to be just a given that this is a regular tool that will just be part of learning science.” We hope so.

We’re closing in on a million downloads and looking toward the next million.

If we build it, will they come? Feedback from the field.

May 22nd, 2012 by Amy Pallant

The Molecular Workbench team has a unique opportunity—take our wonderful software and increase access to it. But we know that this is no “Field of Dreams” task. If we build it, will they come?

We’re using The Lean Startup as a guide to optimizing our software for the Web. It encourages us to experiment to see which ideas are brilliant and which are crazy, and to get feedback from users early. Rather than assume we know what people want, we’re going out to find out, prepared to shift our ideas along the way. In short: Test. Iterate. Repeat.

So we held our first focus group with several Rhode Island teachers who have been loyal users of Molecular Workbench. Our goal was to get feedback on ways to make our new browser-based MW more valuable to them. We asked them to evaluate new designs (we invite you to take our survey, too). We also asked about tone and length of activities. And the teachers described ways they’d like to select and integrate MW models and activities into their classrooms.

Two major themes emerged: flexibility and student accountability. This confirmed what we knew about the classroom: teachers have limited time, a wide range of learners, a diversity of classes, and pressures around high-stakes tests. We’re now working on prototyping ways to incorporate teacher feedback into our Web-based MW models and activities. We’ll share our progress on our website.

And, of course, we’d love to hear your thoughts in the comments.

Video: Molecular Workbench Brings Science to Life in the Browser

May 15th, 2012 by Cynthia McIntyre

As we make our award-winning Molecular Workbench software more accessible and widely available, we’re documenting our story at the same time. Google’s grant to the Concord Consortium funds the conversion of MW from Java to HTML5 so it will run in modern Web browsers. This will reduce barriers for using the next generation MW in schools. Students will be able to access the software from a Web page on a school computer, iPad, or smartphone, giving them anywhere, anytime access to powerful science learning opportunities.

We’re creating videos to share our conversion story. We’ll describe Molecular Workbench, our technical development process, and the benefits of HTML5. We’ve teamed up with the excellent staff of Good Life Productions to produce these videos.

In the first video, Concord Consortium’s Director of Technology Stephen Bannasch describes the power of the modern Web browser to bring science to life. Enjoy.


Project KTracker kicks off

May 9th, 2012 by Charles Xie

Watch a demo video

We have started to develop a high-quality three-dimensional motion tracking system for science education based on the Microsoft Kinect controller, which was released about 18 months ago. This development is part of the Mixed-Reality Labs project funded by the National Science Foundation.

KTracker will provide a versatile interface between the Kinect and many physics experiments commonly conducted in the classroom. It will also provide natural user interfaces for students to control the software for data collection, analysis, and task management. For example, the data collector will automatically pause whenever the Kinect detects that the experimenter is adjusting the apparatus to create a new experimental condition (during which data collection should be suspended). Or the user can "wave" at the Kinect to instruct the software to invoke a procedure. In this way, the user will not need to switch hands between the apparatus and the computer's keyboard or mouse (a hand-switching routine that will be familiar to any experimentalist reading this post). The Kinect sensor can recognize both the experimenter's gestures and the subject's motions, making it an ideal device for performance assessment based on motor skill analysis.
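As a rough illustration of the hands-free idea, here is a minimal Java sketch that pauses sampling whenever a tracked hand enters a bounding box around the apparatus. The class name, coordinate conventions, and box dimensions are all hypothetical assumptions for this post, not KTracker's actual code.

    /**
     * Illustrative sketch only (not the actual KTracker code): suspend data
     * collection whenever a tracked hand enters the region around the
     * apparatus, so the experimenter never has to reach for the keyboard.
     */
    public class GestureAwarePause {

        // Bounding box around the apparatus, in meters (camera coordinates).
        static final double X_MIN = -0.3, X_MAX = 0.3;
        static final double Y_MIN = -0.2, Y_MAX = 0.4;
        static final double Z_MIN =  0.8, Z_MAX = 1.5;

        private boolean collecting = true;

        /** Called once per skeleton frame with the two hand positions {x, y, z}. */
        public void onHands(double[] leftHand, double[] rightHand) {
            boolean adjusting = inApparatusRegion(leftHand) || inApparatusRegion(rightHand);
            collecting = !adjusting;   // pause while adjusting, resume when both hands are clear
        }

        public boolean isCollecting() {
            return collecting;
        }

        private static boolean inApparatusRegion(double[] p) {
            return p[0] >= X_MIN && p[0] <= X_MAX
                && p[1] >= Y_MIN && p[1] <= Y_MAX
                && p[2] >= Z_MIN && p[2] <= Z_MAX;
        }
    }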

KTracker is not a post-processing tool, and it is not based on video analysis. Thanks to the high-performance infrared-based depth camera built into the Kinect, KTracker is capable of motion tracking and kinematic analysis in real time. This matters because it speeds up data analysis and makes laboratory experiments more interactive.
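To make the real-time claim concrete, here is a small Java sketch of one way a single depth frame could be reduced to a pendulum-bob position: average the pixel coordinates of everything that falls inside a known depth slice. The method name and parameters are illustrative assumptions, not the actual KTracker implementation.

    /**
     * Minimal sketch of per-frame tracking from depth data (illustrative only).
     * The bob is assumed to be the only object inside a known depth slice; its
     * position is the centroid of all pixels whose depth falls in that slice.
     */
    public class DepthTracker {

        /**
         * @param depth  depth frame in millimeters, row-major, width*height values
         * @param width  frame width in pixels
         * @param nearMm near edge of the depth slice containing the bob
         * @param farMm  far edge of the depth slice containing the bob
         * @return {x, y} pixel centroid of the bob, or null if nothing is in the slice
         */
        public static double[] locateBob(short[] depth, int width, int nearMm, int farMm) {
            long sumX = 0, sumY = 0, count = 0;
            for (int i = 0; i < depth.length; i++) {
                int d = depth[i] & 0xFFFF;          // depth values are unsigned 16-bit
                if (d >= nearMm && d <= farMm) {
                    sumX += i % width;              // column
                    sumY += i / width;              // row
                    count++;
                }
            }
            if (count == 0) return null;
            return new double[] { (double) sumX / count, (double) sumY / count };
        }
    }

Running something like this on every frame (roughly 30 times per second) and differencing successive centroids already yields a crude velocity estimate, with no offline video processing.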

KTracker will also integrate a popular physics engine, Box2D, to support simulation fitting. For example, the user can design a computer model of the pendulum shown in the above video and adjust its parameters so that its motion fits what the camera is showing--all in real time. Like the graph demonstrated in the above video, the entire Box2D simulation will be placed in a translucent pane on top of the camera view, making it easy for the user to align the simulation view with the experiment view.
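The fitting idea can be sketched without Box2D itself. The Java fragment below simulates a simple pendulum analytically and sweeps the pendulum length to find the value whose motion best matches the tracked angles; in KTracker the simulated model would be the Box2D world drawn on the translucent pane. The class, the brute-force sweep, and all parameter values are assumptions for illustration only.

    /**
     * Illustrative parameter-fitting sketch (not KTracker's Box2D integration):
     * simulate a simple pendulum for a range of lengths and keep the length
     * whose simulated angle best matches the tracked angles.
     */
    public class PendulumFit {

        static final double G = 9.81;          // m/s^2
        static final double DT = 1.0 / 30.0;   // one Kinect frame

        /** Simulate theta(t) for n frames, starting from rest at theta0 (radians). */
        static double[] simulate(double length, double theta0, int n) {
            double theta = theta0, omega = 0;
            double[] out = new double[n];
            for (int i = 0; i < n; i++) {
                out[i] = theta;
                double alpha = -(G / length) * Math.sin(theta);  // angular acceleration
                omega += alpha * DT;                              // semi-implicit Euler step
                theta += omega * DT;
            }
            return out;
        }

        /** Sweep candidate lengths and return the one minimizing squared error. */
        static double fitLength(double[] observedTheta, double theta0) {
            double best = 0, bestErr = Double.MAX_VALUE;
            for (double length = 0.1; length <= 2.0; length += 0.01) {
                double[] sim = simulate(length, theta0, observedTheta.length);
                double err = 0;
                for (int i = 0; i < sim.length; i++) {
                    double d = sim[i] - observedTheta[i];
                    err += d * d;
                }
                if (err < bestErr) { bestErr = err; best = length; }
            }
            return best;
        }
    }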

KTracker will soon be available for download on our websites. We will keep you posted.

Kinect-based motion tracking and analysis

May 3rd, 2012 by Charles Xie

Click here to watch a video.

Microsoft's Kinect controller offers the first affordable 3D camera that can detect complex three-dimensional motions such as body language and gestures. It provides a compelling solution for motion tracking, which--until now--has typically been based on analyzing conventional RGB data from one or more video cameras.

Conventional motion tracking based on RGB data requires complicated algorithms to process a large amount of video data, making real-time applications hard to implement. The Kinect adds a depth camera that detects the distance between the subject and the sensor by comparing the infrared pattern it emits with the reflection it receives. This gives us a way to dynamically construct a 3D model of whatever is in front of the Kinect at a rate of about 10-30 frames per second, fast enough to build interactive applications (see the video linked under the above image). For as little as $100, we now have a revolutionary tool for tracking the 3D motion of almost anything.
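For reference, here is a minimal Java sketch of how a single depth pixel becomes a 3D point under a simple pinhole camera model. The intrinsic parameters shown are rough nominal values for the original Kinect depth camera (an assumption for illustration); a real application would need per-device calibration.

    /**
     * Sketch of depth-pixel-to-3D-point conversion under a pinhole model.
     * Intrinsics are nominal approximations, not calibrated values.
     */
    public class DepthToPoint {

        // Approximate intrinsics for a 640x480 Kinect depth image.
        static final double FX = 594.0, FY = 594.0;   // focal lengths in pixels
        static final double CX = 320.0, CY = 240.0;   // principal point (image center)

        /** @return {x, y, z} in meters for pixel (u, v) with depth in millimeters. */
        static double[] toWorld(int u, int v, int depthMm) {
            double z = depthMm / 1000.0;
            double x = (u - CX) * z / FX;
            double y = (v - CY) * z / FY;
            return new double[] { x, y, z };
        }
    }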

The demo video in this post shows an example of using the Kinect sensor to track and analyze the motion of a pendulum. The left part of the above image shows the trajectory and velocity vector overlaid on the RGB image of the pendulum, whereas the right part shows the slice of depth data relevant to analyzing the pendulum.
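A hypothetical Java sketch of the overlay bookkeeping: store the tracked bob positions as the trajectory and estimate the current velocity vector with a central finite difference over neighboring frames. Class and method names are illustrative, not taken from the actual software.

    import java.util.ArrayList;
    import java.util.List;

    /**
     * Illustrative trajectory-and-velocity overlay logic: keep a history of
     * tracked positions and estimate velocity by central finite differences.
     */
    public class TrajectoryOverlay {

        static final double DT = 1.0 / 30.0;              // Kinect frame interval
        private final List<double[]> trajectory = new ArrayList<>();

        /** Add the latest tracked position {x, y} in meters. */
        public void addPosition(double x, double y) {
            trajectory.add(new double[] { x, y });
        }

        /** Central-difference velocity {vx, vy} at the most recent usable frame. */
        public double[] currentVelocity() {
            int n = trajectory.size();
            if (n < 3) return new double[] { 0, 0 };
            double[] prev = trajectory.get(n - 3);
            double[] next = trajectory.get(n - 1);
            return new double[] {
                (next[0] - prev[0]) / (2 * DT),
                (next[1] - prev[1]) / (2 * DT)
            };
        }

        /** Points to draw as the trajectory polyline over the RGB image. */
        public List<double[]> points() {
            return trajectory;
        }
    }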

This work is funded by the National Science Foundation.