Former PITAC Chair Details IT Research Challenges
November 21, 2000
Drawing on his experience as PITAC co-chair, Ken Kennedy gave his Puerto Rico audience an insider's view of the committee's work before going on to consider a few of the research challenges for which the SIAM community is especially well equipped.
On July 11, at the 2000 SIAM Annual Meeting, Ken Kennedy of Rice University gave an invited talk on the subject of long-term information technology research. Having served for two years as co-chair (with Bill Joy of Sun Microsystems) of the President's Information Technology Advisory Committee (PITAC)---before passing the torch to Raj Reddy of Carnegie Mellon University and Irving Wladawsky-Berger of IBM---Kennedy was well qualified to explain the interactions between Clinton administration policy and research in high-performance computing, communications, and IT. He began with a detailed look at the committee's findings and recommendations, as set forth in its widely discussed 1999 report.
PITAC was created---amid considerable fanfare---by executive order in February 1997. Barring further executive action, it will disband at the end of its fourth year. Broadly speaking, its purpose is to provide the National Science and Technology Council (NSTC), through the director of the Office of Science and Technology Policy, with advice and information on high-performance communications, computing, and information technology. Industrial members slightly outnumber those with academic affiliations on the 26-member committee, which was directed to review efforts in each major area of the federal investment in computers and communications. The administration's main concern was that support be adequate to maintain U.S. leadership in these critical activities.
The PITAC Report
The committee's principal finding, Kennedy explained, was that there had been a long-term drift away from fundamental research in IT. While R&D budgets for IT have grown steadily in recent years, the growth has not been dramatic. Even though the critical nature of the problems solved by IT is widely recognized---they range from medicine and health care to engineering design, and from there to national defense---and even though the IT industry has accounted for more than 30% of real GDP growth over the last five years, only one in seventy-five federal R&D dollars is directed to IT research. That is hardly surprising, Kennedy pointed out, given that most of the agencies that fund R&D in IT are strongly mission-oriented. Yet it seemed unwise to most members of the committee, which recommended that a determined effort be made to reverse the trend. Specifically, they proposed that the federal government increase its investment in IT R&D by $1.4 billion per year, ramping upward over the next five years, with a clear focus on basic research.
The committee identified four key areas in which increased investment is sorely needed: software; scalable information infrastructures; high-end computing; and social, economic, and workforce issues. PITAC also called for a coherent management strategy, the establishment of clear lines of organizational responsibility, and diverse modes of support for those engaged in basic IT research. A judicious mixture of grants, contracts, and student stipends---of the sort used to finance health care research---was seen as likely to produce the most satisfactory results.
With regard to software, the recommendations were particularly strong. Software research---as opposed to software development---should be a substantial component of every major IT research initiative. Among the areas in need of attention, the committee specifically identified programmer productivity and the shortage of IT professionals. Important projects are languishing for lack of trained personnel. Also pointed out was the all too frequent neglect of software reliability and robustness by developers, whose priorities typically lie elsewhere. Human interface innovations are also needed to enhance user friendliness, lest newly acquired capabilities go to waste.
The committee cited the need for a balanced set of testbeds for networking and Internet research. Even the physics of global-scale networks, blending optical and wireless components with existing satellite technologies, is not well understood. Bandwidth issues abound, as do reservations concerning the overall scalability of the Internet. Talk of a national digital library, much less a next-generation World Wide Web, seems premature until such matters have been clarified.
The committee was adamant that funding for high-end computing must continue. The suppliers of high-end systems are subject to market pressures unknown in the rest of the industry, since their customers (though wealthy) are few in number. Yet high-end computing is essential for science and engineering research, health care applications, and national defense. Numerous applications for high-end computing are ripe for exploration, with new ones being identified every day. Funding agencies, according to the committee, should bear in mind that advances in high-end computing, including algorithm development, eventually find their way to the desktop.
Finally, the committee addressed social, economic, and workforce issues, recommending investment in four areas. Increased use of IT in education would enhance the IT-literacy of the population, rendering it more productive and competitive in emerging global markets. It would also expand the pool of available IT-workers, including minority IT-workers. The committee also recommended an increase in the resources devoted to sorting out the economic and policy implications of technological progress. Kennedy pointed out in particular that investment in university research is absolutely critical to IT-workforce needs: Without it, faculty leave; without them, the flow of graduate students is interrupted; without both, fewer undergraduate computer science degrees are awarded, and fewer U.S. citizens enter the IT profession.
The committee made an effort to anticipate questions sure to be asked. Washington is a city of "simple country lawyers" who, especially in the presence of TV cameras, are invariably curious to learn why unmet needs can't be taken care of by redirecting, rather than expanding, the existing budget. Moreover, at least since the Reagan years, there are always a few who doubt that the government has any business funding research and demand to know "why industry doesn't do this." The committee stressed that rebudgeting won't work in this instance because, among other things, the short-term needs that exhaust the existing budget are exactly that---needs. To take money away from them now would bring important work to a halt, and encourage key personnel to leave the industry. Private corporations are neither willing nor able to fund the needed long-term research.
Industrial research, Kennedy pointed out, is almost always focused---as it should be---on the enormously expensive process known as product development. Only the richest corporations can afford the long lead times and considerable risks inherent in basic research. If anyone knew how to shorten those lead times, or eliminate those risks, research would no longer be research. Corporations have no incentive to fund and develop the sorts of revolutionary technologies that might disrupt their own profit streams. It is more prudent to go on attaching bells and whistles to successive generations of an existing technology, in which their competitive advantages relative to rival suppliers are well established. Revolutionary technologies don't come along all that often, and seem---from a business standpoint---best left to single-product start-ups. When governments support basic research, and then make the results available to all comers, opportunities are created for venture capitalists, currently among the most dynamic, envied, and admired forces in the global economy.
A final justification offered by PITAC for government involvement in research rests on quantitative economic analysis. Kennedy quoted MIT economist Lester Thurow to the effect that society has a higher rate of return on research (approximately 66%) than do those who perform the research (about 24%). Because there is an abundance of non-research projects whose internal rates of return exceed 24%, but whose rates of return to society are negligible, corporate patterns of investment often fail to serve the best interests of society. The argument would be more convincing if Thurow had also estimated the variances of the (presumably random) rates of return whose means are 24% and 66%, respectively, since risk is ordinarily an important element in portfolio decisions. But, assuming that Thurow's estimated rates of return are not grossly inaccurate, his argument in favor of government funding for research is potentially powerful.
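The wedge between private and social returns can be made concrete with a toy comparison. The 24% and 66% figures below are Thurow's as quoted; the competing non-research project and its numbers are invented for illustration.

```python
# Toy illustration of Thurow's argument: a firm ranking projects by
# private (internal) rate of return passes over basic research even when
# research is the better deal for society. Thurow's 24%/66% estimates are
# quoted above; the rival "product tweak" project is hypothetical.

projects = {
    # name: (private_return, social_return)
    "basic research": (0.24, 0.66),
    "product tweak":  (0.30, 0.05),
}

firm_choice = max(projects, key=lambda p: projects[p][0])
society_choice = max(projects, key=lambda p: projects[p][1])

print(firm_choice)     # product tweak: the higher private return wins
print(society_choice)  # basic research: the higher social return loses out
```

The firm's ranking and society's ranking disagree, which is exactly the pattern of underinvestment the committee described.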
Kennedy had good news to report on the funding front. The Clinton administration proposed an additional $366 million for FY 2000, and $605 million for FY 2001, for IT research. Moreover, Congress had actually appropriated $226 million of the recommended $366 million for FY 2000. Better still, the House had passed with virtual unanimity the National Information Technology Research and Development (NITR&D) Act, calling for five years of funding at PITAC-recommended levels, as well as a permanent R&D investment tax credit. Regrettably, the Senate version called for lower spending levels and had yet to pass. Actual appropriations, as opposed to recommendations, are always year-to-year. [In September, NSF announced the first-year IT research awards; see www.itr.nsf.gov/, which also provides information about the submission of proposals for FY 2001.]
"Grand Challenges" in IT
Software reliability was perhaps the most compelling of the challenges identified by Kennedy. As he put it, "Who will pay for bug-free, feature-poor software?" Software developers are paid to add appealing---and capacity-consuming---new features to the standard word-processing, spreadsheet, graphics, database, and report-generating programs that form the backbone of the software industry. After all, uses must be found for all the new computing capacity and memory space brought to market by Moore's law every eighteen months or so. Just as the arithmetic glitch in Intel's original Pentium chips was never even noticed by most users, so do most of the bugs that remain in later versions of standard packages go unremarked.
Market forces alone, then, seem unlikely to elevate industry standards of reliability. If, on the other hand, reliable software modules (similar perhaps to LINPACK and EISPACK) were readily available, software developers would almost certainly incorporate them into their products. Deciding what modules to make available, and persuading taxpayers to pay for them, constitute a very grand challenge.
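The pattern already has one success story: the LINPACK and EISPACK routines live on in LAPACK, which modern tools expose directly. A sketch of the idea, using NumPy's LAPACK-backed solver in place of hand-rolled elimination code:

```python
# Solving a linear system with a mature, heavily tested kernel rather
# than hand-rolled elimination code. NumPy's solver calls LAPACK, the
# successor to the LINPACK/EISPACK libraries mentioned above.
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)      # LAPACK-backed, pivoted, well tested
assert np.allclose(A @ x, b)   # residual check: the solution fits
print(x)                       # [2. 3.]
```

A developer who incorporates such a module inherits decades of numerical care at no cost, which is precisely the leverage the committee had in mind.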
Also compelling were the questions Kennedy asked about Internet scalability, security, and growth. How is the system to handle (say) two billion Internet connections at DSL speeds? And what if those connections are mobile and wireless? The technical challenges are clearly imposing, and little thought is currently being given to the problems presented by such volumes of traffic. It might be mentioned, on the other hand, that the world is home to (only?) six billion souls, most of whom sleep about eight out of every 24 hours. Two billion Internet connections would therefore require that half of all waking men, women, and children---or an equal number of robots---be online simultaneously.
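The arithmetic behind that back-of-envelope estimate is easy to check:

```python
# Back-of-envelope check of the estimate above: with six billion people
# each asleep eight of every 24 hours, two billion simultaneous
# connections means half of everyone awake is online at once.
world_population = 6_000_000_000          # circa 2000
awake_now = world_population * 16 // 24   # awake 16 of every 24 hours

connections = 2_000_000_000
print(awake_now)                # 4000000000 people awake at any moment
print(connections / awake_now)  # 0.5: half of them online simultaneously
```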
Kennedy also considered---with obvious relish---technology that could put the airlines out of business. The merits of such a scheme will seem obvious to anyone who was obliged to change planes in Chicago during the summer of Y2K. What is known as "realistic telepresence" seems a genuinely promising way to go about it. The term refers to technology permitting individuals at remote locations to talk more or less face to face by adding TV screens to their telephones, so that each can see a lifelike picture of the other carried, along with their words, by broad-bandwidth fiber optics.
A primitive device of this kind---called Picturephone---was displayed by AT&T at the 1964 World's Fair in New York. Later that year, the company set up special demonstration booths in New York, Washington, and Chicago, between which customers could hold three-minute conversations for as little as $16. Then, from 1970 to 1974, Picturephone was test-marketed in Pittsburgh and Chicago, at a cost of about $160 a month. But after almost five years, during which only a few hundred paying customers ever signed up, the experiment was abandoned.
Over the next quarter century, a number of other attempts---including one by Panasonic and a second by AT&T---were made without notable success. Then, early in 2000, a company named Talk Visual began advertising a $1500 videophone that transmits via ISDN, and an Italian firm named Aethra began to market a higher-end version at $3500. While still high, those prices are but small fractions of the $10,000 and up charged for even the simplest video-conferencing equipment just five years ago. Executives at Talk Visual hope to be able to offer free video-phones, in return for long-term commitments to purchase the service, sometime in 2001. What they still don't know is whether people really want to see one another while talking on the telephone.
More immediately interesting to many in the audience was Kennedy's vision of the Internet as a problem-solving engine. He currently heads up the Grid Applications Development Software (GrADS) project, which is intended to improve the capacity of the Internet to serve such purposes. To explain the project, he displayed an initially blank map of the U.S., on which icons representing remote hardware installations appeared and disappeared to indicate participation in a problem-solving exercise initiated from a workstation in Houston (see Figure 1). The first icon to appear was the one at the bottom, representing the initiating workstation. It was soon connected to the supercomputer above it, in Minneapolis. Next came a database on the East Coast, followed by a second supercomputer in Southern California, and then another database in the Pacific Northwest. The reader can easily imagine more complicated grids and choreographies, indicated by icons flashing on and off the screen as individual installations enter and leave the problem-specific grid, carrying out assigned tasks, and then going dark. A single installation might conceivably sign on, off, and back on several times in the course of a single "grid-computation."
Figure 1. National distributed computing, as envisioned by the Center for High Performance Software at Rice University.
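The choreography of Figure 1 can be caricatured in a few lines of code. The site names follow the description above, but the task labels and structure are purely illustrative; the real GrADS software must also handle scheduling, data movement, and fault tolerance.

```python
# A toy rendering of the choreography in Figure 1: installations join a
# problem-specific grid, perform an assigned step, and drop off again.
# The step labels are invented for illustration.

class GridSite:
    def __init__(self, name):
        self.name = name

    def run(self, step):
        # Real computation or data access would happen here.
        return f"{self.name}: {step}"

# The sequence of joins described above, initiated from Houston.
choreography = [
    (GridSite("Houston workstation"),               "initiate computation"),
    (GridSite("Minneapolis supercomputer"),         "main simulation"),
    (GridSite("East Coast database"),               "fetch input data"),
    (GridSite("Southern California supercomputer"), "post-processing"),
    (GridSite("Pacific Northwest database"),        "archive results"),
]

log = [site.run(step) for site, step in choreography]
for entry in log:
    print(entry)
```

A richer version would let sites enter and leave repeatedly, as the article notes a single installation might during one "grid-computation."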
The possibilities are both endless and obvious. Yet extensive research and development will be needed to realize them. The end result must offer secure, reliable performance under varying loads, without sacrificing power or user friendliness. To that end, a number of computer languages and problem-solving environments are being developed by GrADS participants and others. Kennedy cited advantages both of what he called "telescoping languages" and of the high-level "scripting languages" that are comprehensible to virtually any computer.

Kennedy identified a number of challenges to which the SIAM community is well equipped to respond. Safeguards of security and reliability are in particularly short supply. Most of the current ones---such as public-key encryption systems---are based on relatively elementary mathematics and may in time be replaced by better systems. Software implementing the existing methods is still in its infancy. Scalable, reconfigurable communication networks require that routing and load-balancing decisions be made in real time, ordinarily via the solution of integer and mixed integer optimization problems. This work is also in its infancy. Robust libraries for scientific computation, to house the components required by problem-solving systems, are not yet readily accessible to many potential users. Finally, the transmission of high-quality video is an urgent priority. The secret, of course, is compression, compression, compression. More than a few SIAM members are already at work on such problems.
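To give a flavor of the load-balancing problem mentioned above: the exact assignment problem is an integer program, but real-time systems often fall back on fast heuristics, such as the classical longest-processing-time greedy rule sketched here. The task sizes are arbitrary illustrative numbers.

```python
# A miniature load-balancing problem: assign tasks to servers so that the
# heaviest server carries as little as possible. Solving it exactly is an
# integer program; this longest-processing-time (LPT) greedy heuristic is
# the kind of fast stand-in a real-time router might use instead.
import heapq

def balance(task_sizes, n_servers):
    """Greedy LPT: biggest task first, always onto the lightest server."""
    loads = [(0, i, []) for i in range(n_servers)]  # (load, id, tasks)
    heapq.heapify(loads)
    for size in sorted(task_sizes, reverse=True):
        load, i, tasks = heapq.heappop(loads)       # lightest server
        heapq.heappush(loads, (load + size, i, tasks + [size]))
    return sorted(loads, key=lambda entry: entry[1])

assignment = balance([7, 5, 4, 3, 2, 2], 2)
print([(load, tasks) for load, _, tasks in assignment])
# [(12, [7, 3, 2]), (11, [5, 4, 2])]
```

Here the heuristic happens to hit the optimum (23 total units cannot be split more evenly than 12 and 11); in general LPT is only guaranteed to come within a constant factor of it, which is why the exact integer-programming versions remain research problems.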
James Case writes from Baltimore, Maryland.