"Nothing Was Ever the Same Again"
September 15, 1998
The Manchester "Baby," whose momentous first computation was performed 50 years ago.
Nicholas J. Higham and David J. Silvester
In 1948, the world's first stored-program computer performed its first (dramatic) computation at the University of Manchester; this spring, researchers gathered at Manchester to celebrate 50 years of progress in computers and numerical analysis.
June 21, 1998, marked the 50th anniversary of the first computations made by the Manchester "Baby"---the world's first stored-program computer, built by Professors Freddie Williams and Tom Kilburn and their team at the University of Manchester. The momentous computation was described by Williams in 1974: "A program was inserted and the start switch pressed. Immediately the spots on the display tube entered a mad dance. In early trials it was a dance of death leading to no useful result, and what was even worse, without yielding any clue as to what was wrong. But one day it stopped, and there, shining brightly in the expected place, was the expected answer. It was a moment to remember. Nothing was ever the same again."
A week of events was organized by the university and the City of Manchester to celebrate the anniversary. It included a series of exhibitions and specialist conferences and a "launch event" at Manchester's Bridgewater Concert Hall, during which a full-scale replica of the Baby was switched on via a live satellite link to the Museum of Science and Industry, half a mile away. After the switch was thrown, honorary University of Manchester degrees were awarded, in a distinctly surreal ceremony combining tradition with huge video screens, to: Emeritus Professor Tom Kilburn; Sebastian de Ferranti, whose company put a revised version of the Baby into commercial production; Chris Burton from the Computer Conservation Society, who led the reconstruction project; and Mike Brady, a Manchester graduate who is now a professor of engineering at Oxford.
Among the specialist conferences preceding the launch event was a day-and-a-half meeting, Numerical Analysis and Computers---50 Years of Progress, organized by the authors and Françoise Tisseur (University of Manchester), under the auspices of the Manchester Centre for Computational Mathematics. The 89 attendees enjoyed 11 invited talks describing the ways in which numerical analysis has been influenced by the development of computers over the last 50 years and predicting future developments.
Gene Golub (Stanford University) opened the specialist conference on June 16 with a talk titled "Early Days of Numerical Computing at Illinois on the ILLIAC." The ILLIAC, born in 1952 and retired in 1962, used a Williams tube memory, as did the Baby. Golub described linear algebra calculations performed on the machine, including one of the earliest uses of iterative refinement. Old methods in numerical analysis can make a comeback, he pointed out, as a result of either enhanced theoretical understanding (as in the case of iterative refinement, in the early 1980s) or the development of new computer architectures.
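Iterative refinement, mentioned by Golub, improves a computed solution to Ax = b by repeatedly solving for a correction driven by the residual, with the residual computed more accurately than the solves. The following is a minimal NumPy sketch of the idea (not the ILLIAC code): the system is solved cheaply in single precision, while residuals and updates are accumulated in double precision.

```python
import numpy as np

def refine(A, b, iters=3):
    """Mixed-precision iterative refinement for A x = b.

    The "inner" solver works in single precision; residuals r = b - A x
    are formed in double precision, and the correction d from A d = r
    is added back: x <- x + d.  (A sketch of the technique only; a real
    implementation would factor A once and reuse the factorization.)
    """
    A32, b32 = A.astype(np.float32), b.astype(np.float32)
    x = np.linalg.solve(A32, b32).astype(np.float64)   # cheap initial solve
    for _ in range(iters):
        r = b - A @ x                                  # residual in double precision
        d = np.linalg.solve(A32, r.astype(np.float32)).astype(np.float64)
        x = x + d                                      # corrected solution
    return x
```

For a reasonably conditioned matrix, a few refinement steps recover close to full double-precision accuracy from single-precision solves, which is precisely why the method resurfaced when new architectures made low-precision arithmetic fast.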
In a talk titled "Early Aces and the NAG Library Machine," Brian Ford (NAG Ltd., Oxford) noted that the first numerical library was published by Wilkes, Wheeler, and Gill (1951) in their book The Preparation of Programs for an Electronic Digital Computer, which contains the first use of the term "subroutine." Ford described the influence of J. H. Wilkinson on software development (along with a trip to the U.S. in 1971 when he was enlisted as Wilkinson's "baggage porter"). Outlining the history of the NAG Library, he explained NAG's early recognition that documentation and thorough testing were as important as user-callable routines.
In a high-tech presentation, "History and Development of Numerical Computation in Mathematica," Mark Sofroniou (Wolfram Research) demonstrated Mathematica through the package's notebook interface on a Pentium PC. Cleve Moler (The MathWorks, Inc.) followed with a stimulating personal perspective on the evolution of Matlab. Using the MathWorks logo, Moler demonstrated the increased sophistication of the graphical interface built into successive versions of Matlab, beginning with the original 1984 Fortran version. In this context, Moler pointed out that the Manchester conference was the first time the three authors of 1960s dissertations devoted to the solution of PDEs on L-shaped regions---Moler himself, Joan Walsh (University of Manchester), and John Reid (Rutherford-Appleton Laboratory)---had simultaneously been in the same room. (The fundamental eigenmode of the L-shaped membrane is the basis of the MathWorks logo.)
In the Tuesday afternoon session, which was devoted to differential equations, Ian Gladwell (Southern Methodist University) gave a talk titled "Software for the Numerical Solution of ODEs---A University of Manchester and NAG Library Perspective." He pointed out that the Runge-Kutta code RKF45 of Shampine and Watts is the most widely used (and plagiarized) ODE solver, and he observed that advances in ODE software tend to lead to added functionality as well as improved efficiency. In an insightful talk on computational initial value problems, Andrew Stuart (Stanford University) then discussed recent developments in the construction of convergence proofs in the context of numerical software, with applications to the Matlab ODE solvers.
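The RKF45 code Gladwell singled out is built on an embedded Runge-Kutta pair: two approximations of different orders are computed from shared stages, their difference serves as a local error estimate, and the step size is grown or shrunk accordingly. The sketch below illustrates that control logic with a much simpler Euler(1)/Heun(2) pair rather than the Fehlberg 4(5) coefficients of the actual RKF45; the structure, not the coefficients, is the point.

```python
import numpy as np

def adaptive_heun(f, t0, y0, t_end, tol=1e-8, h=0.1):
    """Adaptive integration of y' = f(t, y) with an embedded pair.

    Illustrative only: uses explicit Euler (order 1) and Heun (order 2)
    in place of RKF45's 4th/5th-order formulas.  The accept/reject and
    step-rescaling logic mirrors classical embedded-pair codes.
    """
    t, y = t0, np.asarray(y0, dtype=float)
    while t_end - t > 1e-12:
        h = min(h, t_end - t)
        k1 = f(t, y)
        k2 = f(t + h, y + h * k1)
        y_low = y + h * k1                 # explicit Euler, order 1
        y_high = y + 0.5 * h * (k1 + k2)   # Heun's method, order 2
        err = float(np.linalg.norm(y_high - y_low))  # local error estimate
        if err <= tol:                     # accept: keep the higher-order value
            t, y = t + h, y_high
        # rescale: safety factor 0.9, exponent 1/(low order + 1) = 1/2,
        # with growth capped and a small floor to avoid stalling
        h = max(1e-12, min(4 * h, 0.9 * h * (tol / max(err, 1e-16)) ** 0.5))
    return y
```

The advances in functionality Gladwell described (dense output, event location, and the like) are layered on top of exactly this kind of step-control loop.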
In "Fifty Years of Numerical Modelling---From Morphogenesis to CFD and Back," Bill Morton (Bath and Oxford Universities) began by discussing the radar and code-breaking antecedents to digital computers and Alan Turing's role in early computer developments. Morton then gave a personal survey of the key developments in the numerical solution of PDEs over the last 50 years. His emphasis on the frequent resurgence of interest in problems as numerical methods advance (a specific example being morphogenesis) provided an interesting parallel to Golub's comments regarding the resurgence of interest in methods with advances in architectures.
The invited speaker after the conference dinner was Joan Walsh. Resisting the opportunity to be sentimental, she gave a powerful talk titled "Numerical Mathematics: The Seven Ages," in which she traced the development of numerical analysis at Manchester over the last 50 years. David Evans (Nottingham Trent University) added some personal reflections on the establishment of numerical analysis as a discipline at the University of Manchester in the 1950s.
Joan Walsh, after-dinner speaker at the conference commemorating the 50th anniversary of the Baby, traced the 50-year history of numerical analysis at the University of Manchester.
Mike Powell (Cambridge University) teed off the final session with a talk titled "Some Experiences of Algorithms and Fortran Software." Like the other speakers, he recalled early programming experiences, in his case sessions limited to 30 minutes on the Ferranti Mercury computer at AERE Harwell in 1959. He also discussed the historical development of optimization algorithms and the Harwell Subroutine Library, which began in 1963. "Computer users are optimistic," he concluded, "but developers of algorithms should be pessimistic."
Nick Trefethen (Oxford University) had probably the most difficult task of all the speakers at the conference. In his thought-provoking talk, "Predictions for the Next 50 Years," he forecast, among other things, that "determinism in numerical computing will be gone," "multipole methods and their descendants will be ubiquitous," and "the problem of parallel computing will have been blown open by ideas relating to the human brain." (Readers who are intrigued may want to look at the daily "maxims" provided by Trefethen in SIAM News, January/February 1998, page 4.)
Jack Dongarra (University of Tennessee, Knoxville, and Oak Ridge National Laboratory) gave the final presentation, "A Look at Library Software for Linear Algebra: Past, Present, and Future." Following the lead of three other speakers (and in contrast to Powell's classically constructed overheads), Dongarra projected his slides directly from a notebook PC. In preparation for the talk, he had trawled the Internet for facts, figures, and photographs to enliven his concise history of linear algebra software. He also recalled the "lost decade" of parallel software---the 1980s---prior to the availability of parallel programming standards, and anticipated the development of petaflop machines in the near future.
The overwhelming impression left by the speakers was one of considerable progress in numerical analysis over the last 50 years. And while the discipline is strong and healthy, many interesting challenges lie ahead.
The conference was sponsored by the London Mathematical Society, NAG Ltd., the UK and Republic of Ireland (UKIE) SIAM Section, and Wolfram Research. Slides from some of the talks and photographs are available at http://www.ma.man.ac.uk/NAC98.
Nicholas J. Higham is Richardson Professor of Applied Mathematics at the University of Manchester, UK. David J. Silvester is a senior lecturer in the Department of Mathematics at the University of Manchester Institute of Science and Technology, UK.