Building a Science-Based Case for Large-Scale Simulation
September 30, 2003
Chapter 3 of the SCaLeS report, "Anatomy of a Large-Scale Simulation," features recent achievements of John Bell and collaborators at Lawrence Berkeley National Laboratory, including the 2048-processor simulation of a turbulent premixed flame shown here (bottom) with an experimental particle image velocimetry crossview of the turbulent "V"-flame for comparison. (SIAM has also noted Bell's achievements; see "LBNL Researchers Bell and Colella Receive First SIAM/ACM Prize in CSE.")
What would you do with a hundred times the computer power and storage now available to you? A thousand? What algorithmic and software technologies would you need to take advantage of your new capacity?
These questions were posed to more than 300 of the nation's leading computational scientists by the Office of Science of the U.S. Department of Energy at a workshop in Arlington, Virginia, June 24-25, 2003. Their answers, captured by a team of about fifty contributing authors, have been preserved in a two-volume report, A Science-Based Case for Large-Scale Simulation (informally dubbed SCaLeS), which is expected to strengthen DOE's commitment to large-scale simulation research. Volume 1 of the report was delivered to Raymond L. Orbach, director of the Office of Science, on July 30 and is available for public download. (Volume 2 will appear in September.) Orbach, who delivered the charge for the report at the June workshop, has established a reputation as an advocate for large-scale simulation as an important complement to theory and experiment in the conduct of the scientific mission of DOE.
The SCaLeS report makes eight recommendations. Many of them echo themes familiar from federal reports on supercomputing dating back at least to the so-called Lax Report of 1982, which is credited with the launch of the supercomputer centers of the National Science Foundation. The SCaLeS report reaffirms six such recommendations:
- Extensive investment in new computational facilities, striking a balance between capability computing for "heroic simulations" that cannot be performed any other way and capacity computing for "production" simulations that contribute to the steady stream of progress
- Sustained collateral investment in the software infrastructure that, together with the hardware, provides the "engines of scientific discovery" across a broad portfolio of scientific applications
- Continuing investment in algorithm research and theoretical development, given that improvements in basic theory and algorithms have contributed as much to increases in computational simulation capability as improvements in hardware and software over the first six decades of scientific computing
- Proactive recruitment of computational scientists as early as possible in the educational process, so that the number of trained computational science professionals is sufficient to meet present and future demands
- Investments in network infrastructure for access and resource sharing, as well as in the software needed to support collaborations among distributed teams of scientists
- A federal complement to commercial research and development of innovative, high-risk computer architectures that are suited to the special requirements of scientific and engineering simulations
To justify these investments, the report ties dozens of scientific goals spanning DOE's mission to enhanced simulation capability---either as the only way to achieve the goals, or as a way to reduce the expense and shorten the lead time of research campaigns in which simulation is combined with theory and experiment. The authors note that, with the know-how in the public domain and the cost of simulation continually dropping, leadership in computational science could easily pass to other countries.
The two leading recommendations of the SCaLeS report, not anticipated as clearly by earlier reports, appear to mark the beginning of a new era of computational science:
- Major new investments in all of the mission areas of DOE's Office of Science to capture new scientific opportunities presented by a fusion of advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering
- Assembly of multidisciplinary teams to provide the broad range of expertise needed to address the intellectual challenges associated with translating advances in science, mathematics, and computer science into simulations that can take full advantage of advanced computers
The computational scientists in Arlington argued that "phase transitions" are occurring in their fields. A similar evolution has been seen in many branches of experimental physics, where individual investigators and small teams have been succeeded in many cases by large groups centered around billion-dollar facilities, such as accelerators, lasers, telescopes, and tokamaks, engaging not only physicists, but also statisticians, engineers, support technicians, and others. Computational science is now spawning multidisciplinary teams of scientists and engineers, mathematicians, computer scientists, and support personnel centered around large computers offering teraflop/s of processing power, petabytes of storage, visualization facilities, and high-bandwidth networking. The United States, with its enviable collection of multiprogram research laboratories and its tradition of university-laboratory collaboration, is well positioned to induce such "phase transitions" throughout the sciences and engineering, from plasma physics to biotechnology.
The scientists identified DOE's current SciDAC (Scientific Discovery through Advanced Computing) initiative as paradigmatic of the multidisciplinary future of large-scale computational science research. An increase in raw simulation capability by a factor of a hundred or a thousand, without a concurrent improvement in algorithms, does not go very far for three-dimensional time-dependent problems. Simply doubling the resolution of a problem uniformly in each of these four dimensions eats up a factor of sixteen in computational complexity, and scientists need many such doublings. Therefore, better models and better adaptive strategies will be required along with bigger computers.
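As a back-of-envelope illustration (not taken from the report itself), the arithmetic behind this argument can be tallied in a few lines of Python: each uniform doubling of resolution in three space dimensions plus time multiplies the work by 2^4 = 16, so even a thousandfold increase in raw capability buys only a couple of doublings.

```python
import math

# Cost multiplier for one uniform doubling of resolution in a
# 3D time-dependent simulation: a factor of 2 in each of three
# space dimensions, plus 2 in time (halved time step) -> 2**4 = 16.
COST_PER_DOUBLING = 2 ** 4

def doublings_affordable(budget_factor):
    """Number of full uniform resolution doublings a raw
    capability increase of `budget_factor` can pay for."""
    return int(math.log(budget_factor, COST_PER_DOUBLING))

for budget in (100, 1000):
    n = doublings_affordable(budget)
    print(f"{budget}x more compute buys {n} resolution doubling(s)")
```

Running this shows that a 100x machine affords only one uniform doubling and a 1000x machine only two, which is why the report argues that better algorithms and adaptive strategies must accompany bigger computers.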
Under SciDAC sponsorship, scientists, mathematicians, and computer scientists are already joining forces to investigate and demonstrate such gains. Computational astrophysicist Tony Mezzacappa of Oak Ridge National Laboratory, who directs a multidisciplinary group that is simulating core-collapse supernovae, has stated that he "would never go back" to working without mathematicians and computer scientists on his team. Mezzacappa's group is one of several working under SciDAC to develop the next generation of community codes for users; other groups are building tools for the developers themselves, so that the latest algorithmic technology will not just migrate into a single application, but will be available across a common interface for many applications.
Volume 2 of the SCaLeS report contains 27 technical chapters in various areas of science, mathematics, and computer science central to the Department of Energy's scientific mission. Scientific areas covered include accelerator design, astrophysics, biology, chemistry, climate, combustion, environment, materials, nanoscience, plasma physics, and quantum chromodynamics (elementary particle physics). In each of these areas, experts from DOE laboratories, universities, industry, and other federal agencies attempt to predict the questions that scientists will be able to address when the next two or three orders of magnitude of computational power and storage become available. Mathematical methods common to simulation in many or all of these areas of computational science were also studied for their projected impact, including multi-physics modeling, multiscale modeling, uncertainty quantification, computational fluid dynamics, transport and kinetic methods, meshing methods, solvers and "fast" algorithms, and discrete mathematics and algorithms. Areas of computer science research deemed critical to progress in computational science at the scale envisioned by DOE include visual data exploration, data management and analysis, programming models and component technology, software engineering and management, computer performance engineering, network access and resource sharing, systems software, and advanced architecture.
In addition to the plenary talk by Orbach, Peter Lax of the Courant Institute gave a plenary retrospective on the report created by the panel he had led two decades earlier. John Grosh of the Department of Defense, co-director of the federal High End Computing Revitalization Task Force, also addressed the SCaLeS workshop, urging the scientists to concentrate on the implications of high-end computing for science; his group (which had conducted a workshop a week earlier and is also scheduled to report this summer) is concerned more with delivery of the cycles required by scientists and other users.
In the aftermath of the success of the Japanese Earth Simulator, which has begun to attract U.S. scientists as users, the national supercomputing community finds itself in a state of retrospection. Several panels and workshops met in the spring of 2003 to seek to define the future of various aspects of federally sponsored high-performance computing. The National Academy of Sciences convened a panel on the "Future of Supercomputing," and the JASONs met during the same week as the SCaLeS workshop to evaluate the use of supercomputers in the Advanced Simulation and Computing (ASCI) initiative of the National Nuclear Security Administration wing of DOE.
An expanded version of the SCaLeS report, with more room for coverage outside DOE's immediate mission areas and for bibliographic information, will appear as a SIAM book, the first in a new series on Computational Science & Engineering.
Volume 1 of the SCaLeS report (72 pages) is available at http://www.pnl.gov/scales.
David Keyes is a professor of applied mathematics at Columbia University and acting director of the Institute for Scientific Computing Research (ISCR) at Lawrence Livermore National Laboratory. Together with Phillip Colella of Lawrence Berkeley National Laboratory, Thom Dunning, Jr., of the University of Tennessee and Oak Ridge National Laboratory, and William Gropp of Argonne National Laboratory, he co-edited the SCaLeS report.