Last year's discovery at CERN of the Higgs boson, a particle tied to the mechanism thought to give elementary particles their mass, was momentous for physicists everywhere. The discovery is critical to validating a nearly five-decade-old fundamental physics theory, known as the Standard Model, which accounts for all known subatomic particles and their interactions. Scientists, meanwhile, continue their search for answers to weighty unexplained phenomena, such as the existence of dark matter and the fate of the universe's antimatter since the Big Bang.
Fermilab theoretical physicist Paul Mackenzie is leading a multiyear project at the ALCF to shed light on the mysterious particles and forces associated with "physics beyond the Standard Model." According to Mackenzie, the Standard Model has many complex and peculiar features that have led to the nearly universal belief that there is new, as-yet-undiscovered physics that will explain them.
Mackenzie heads a national effort to leverage HPC resources to advance quantum chromodynamics (QCD), the study of how quarks and gluons interact. Supercomputers like Mira enable scientists to study quarks and gluons in situations that are not possible in accelerator and cosmic ray experiments, and they have the computational power needed to give quark-antiquark pairs their proper, very light masses for the first time, removing one of the largest remaining uncertainties in QCD calculations. Read more about Mackenzie's research at the ALCF here.
The highly competitive technical program of the annual Supercomputing conference shows broad participation from ALCF researchers this year. Several papers coauthored by ALCF researchers were accepted, one of which is also a finalist for the ACM Gordon Bell Prize, and all feature work either performed on or optimized for Argonne's Mira supercomputer. Other ALCF participation runs the gamut, from posters and workshops to broader engagement and roundtable sessions.
Fifteen DOE National Laboratories, including Argonne, will be represented in the exhibit hall in a single booth (1327) under the theme "DOE: HPC for a Greener, Smarter, Safer World," with presentations, electronic posters, 3D simulations, demonstrations, and roundtable discussions.
I'll be there, trying to take in as many sessions as possible and to meet old friends, colleagues, and collaborators. If you want to connect, drop me a line at email@example.com.
Two papers that I’m involved with are being presented by their main authors. Here are the links:
Integrating Dynamic Pricing of Electricity into Energy Aware Scheduling for HPC Systems
Performance Characterization and Prediction Based Modeling of Collective Two-Phase I/O
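As a toy illustration of the idea behind the first paper's topic (this is my own hypothetical sketch, not the authors' algorithm), energy-aware scheduling under dynamic pricing amounts to placing deferrable jobs in the hours when electricity is cheapest, subject to each job's deadline:

```python
# Minimal sketch: greedily place each deferrable job at its cheapest
# feasible start hour, given hourly electricity prices and a deadline.
# Names and inputs here are illustrative, not from the paper.

def schedule_jobs(prices, jobs):
    """prices: list of $/MWh for each hour of the planning window.
    jobs: list of (duration_hours, deadline_hour) pairs.
    Returns one chosen start hour per job."""
    starts = []
    for duration, deadline in jobs:
        # Start hours that let the job finish by its deadline.
        candidates = range(0, deadline - duration + 1)
        # Energy cost of running the job starting at hour s.
        cost = lambda s: sum(prices[s:s + duration])
        starts.append(min(candidates, key=cost))
    return starts

if __name__ == "__main__":
    hourly_prices = [50, 40, 20, 20, 35, 60]   # hypothetical day-ahead prices
    jobs = [(2, 6), (1, 4)]                    # (duration, deadline) per job
    print(schedule_jobs(hourly_prices, jobs))  # -> [2, 2]: both jobs hit the cheap hours
```

A production HPC scheduler would, of course, also model node availability and contention between jobs; this sketch optimizes each job's energy cost in isolation.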
I’m thrilled to be part of a new NSF-funded IMAX and digital 3D documentary film project that will introduce audiences to the major scientific instruments being used to explore the origins of the universe. Chief among these are the Large Hadron Collider at CERN and a new generation of supercomputers.
My specific contribution is advisory and relates to the role of supercomputing in this scientific enterprise. Systems like Mira accelerate discoveries in cosmology through large-scale scientific simulation and visualization of enormously complex physical phenomena. (Both simulation and visualization will be featured prominently in the film.) Supercomputers were recently used to generate the largest cosmology simulation ever performed, which will help the scientific community test theories against observational data, such as that from the next generation of sky surveys preparing to come online.
Filming will take place during 2013 and 2014 and will result in a 2D/3D giant screen film, a dome planetarium film, museum exhibits, and other educational materials. It's a great team of investigators that includes media communications guru Mark Kresser, UC Davis physics professor Manuel Calderon de la Barca, the Franklin Institute's Dale McCreedy, and IMAX film director Stephen Low.
A particularly interesting aspect of this outreach project is that it also supports a study of middle school girls' interest and engagement in the topic. (Middle school girls' interest in science and math tends to plummet at this age due to several social factors.) Films like these are high-quality outreach projects that present complex scientific research to the public in an accessible and entertaining way. It's also an excellent opportunity to gain insights into how to develop STEM content for an especially vulnerable group of learners.
In many ways, biofuel research is like modern-day alchemy. The transmutation of biomass materials, which include anything from kitchen and latrine waste to stalky, non-edible plants, into a sustainable and renewable energy source involves catalysts and chemical reactions. The process promises to help meet the world's critical energy challenges.
Biofuel research can also be thought of as the ultimate multi-scale, multi-physics research problem. It represents several interesting biological supply-chain management problems. Not surprisingly, biofuel research spans several domains here at Argonne, and takes place in wet labs and joint institutes across the lab campus. There is also an exciting INCITE research project going on in the ALCF aimed at finding a more effective way to convert plant materials that contain cellulose, such as wood chips and switchgrass, into sugars, which can then be converted into biofuels.
A science team from the National Renewable Energy Laboratory is using Mira to conduct large-scale simulations of the complex cellulose-to-sugar conversion process. The simulations yield data, such as an enzyme's binding free energy, that is difficult to obtain through conventional experimental approaches, helping to accelerate the screening and testing of new enzymes. With such information, researchers will be able to identify potential enzyme modifications and then feed their discoveries into experiments aimed at developing and validating improved catalysts. Read the full research highlight here.
After posting a few months back on the exciting STEAM work at RISD, and the push to integrate art into STEM (science, technology, engineering, and mathematics) curricula, I was eager to attend the "Art of Science Learning" talk at Argonne this week, where I learned about a National Science Foundation program with similar goals.
The Art of Science Learning is a national initiative that uses the arts to spark creativity in science education. The goal of the project's development activities is to experiment with a variety of "innovation incubator" models in cities around the country: one in San Diego (hosted by the Balboa Park Cultural Partnership), one in Chicago (hosted by the Museum of Science and Industry), and one in Worcester, Mass. (hosted by the EcoTarium). These incubators bring together professionals from different fields and the public to collaborate on STEM education and other STEM-related topics of local interest that can be explored through creative learning methodologies.
Chicago incubator director Tim Morrison spoke about this initiative, and the yearlong effort starting this January to address the STEM challenges of urban nutrition.
Projects like these are aimed at exploring a framework to ultimately change the way children are educated in the U.S., one that emphasizes creativity and innovation as a means to build a strong economy. I, for one, am extremely encouraged to see this movement gathering steam. Pun intended.
I’m pleased to announce that Jim Hack and I will be co-editing a special issue of Computing in Science & Engineering magazine on the topic of leadership computing, to be published in fall 2014. The goal of this issue is to explore how leadership computing is being effectively used to support real-world science and engineering applications. The topics of interest and submission guidelines can be found on the CiSE website.