In late 1959, Richard Feynman delivered one of his most famous lectures, "There's Plenty of Room at the Bottom," to a packed room at Caltech. He spoke of a then-fledgling field of new physics at the atomic, or nanometer, scale, foreshadowing many of the research areas on the verge of fruition today: higher densities of information on scaled-down computers; the formation of micromachines (MEMS); the creation of designer materials; and the importance of biological techniques in controlling and manipulating matter at the atomic scale.
Forty years later, on January 21, 2000, President Clinton chose Caltech as the site to announce a bold new federal initiative on behalf of nanoscale science and technology. The FY2001 presidential budget request to Congress calls for a $227 million investment increase in nanoscale science and technology, for a total of $497 million.
The president's proposal is indicative of a growing awareness on the part of government that technology is a critical economic driver, according to Thomas Weber of the National Science Foundation, speaking at a special symposium at the APS March Meeting in Minneapolis. He pointed out that, unlike similar past proposals, the strongest push for the nanotechnology initiative did not originate with the White House Office of Science and Technology Policy, but with the president's National Economic Council. "The Administration realizes that much of the profitability and comforts we have in life, and our strong economy, is a direct result of research that was funded over the years," he said. "And they realize that if that standard of living is going to continue, the nation needs to invest right now so that the economy is still healthy 30 years from now."
Definitions of what constitutes nanotechnology are varied. Evelyn Hu (University of California, Santa Barbara) defined it as "the construction and utilization of functional structures and materials with at least one characteristic dimension at the nanometer scale." After 30 years of invigorating research, scientists now have the techniques and instrumentation required for nanoscale fabrication, and have applied nanotechnology to actual devices, such as quantum well lasers. But to fully realize its potential, Hu identified two critical issues: better control of critical dimensions and, in turn, of the microscopic properties of individual nanostructures; and the integration of those nanostructures into complex hierarchical systems, particularly through the use of such natural templates as molecular self-assembly.
Ever since Intel guru Gordon Moore made his now-famous observation in 1965 - that the number of transistors on a chip will double every 18 months - much discussion has centered on identifying the fundamental limits of Moore's Law. Invariably, such studies target a date roughly 10 years into the future, according to Robert Dynes (University of California, San Diego), for perfectly legitimate reasons. (The current cutoff point is expected to be reached by 2010.) "But the curve just keeps crashing right through them," he said. The reason is that scientists keep coming up with new materials, algorithms, architectures, and other innovative ways to overcome past technological barriers. "I allege that Moore's Law will continue as long as the scientific and engineering community remains healthy," said Dynes. "But if we do not continue to function as an intelligent, creative society, Moore's Law will be limited."
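The compounding behind Moore's observation is worth making explicit. A minimal sketch (not from the article; the function name and 18-month doubling period are illustrative assumptions) of how a fixed doubling period translates into growth over the roughly 10-year horizons such studies target:

```python
def transistor_growth(years, doubling_months=18):
    """Growth factor in transistor count after `years` years,
    assuming one doubling every `doubling_months` months."""
    return 2 ** (years * 12 / doubling_months)

# Over a 10-year horizon, an 18-month doubling period implies
# roughly a hundredfold increase in transistor count.
factor = transistor_growth(10)
```

This compounding is why each predicted cutoff looms so large, and why each new material or architecture that restores the doubling rate pushes the curve "right through" the predicted limit.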
Unfortunately, Dynes has observed some worrying signs that the future is not necessarily bright for science and engineering. While the life sciences have flourished since 1970, federal funding levels for engineering and the physical sciences have been relatively flat. More worrisome is the impact these funding trends are having on the number of students earning degrees in engineering and the physical sciences: the physical sciences rank near the bottom of degrees received in the U.S., just above home economics and parks, recreation and leisure. "These young people are our feedstock," he said. "These are the creative people that we hope will fuel the future of nanotechnology, and they're not going into physical science and engineering."
Patricia Dehmer, who spent many years as a high-level researcher at Argonne National Laboratory before joining the Department of Energy, stressed to the audience that the initiative faces a very tough battle in Congress. Hence the scientific community must communicate with its Congressional representatives if the initiative is to survive - a point that was echoed by each of her fellow speakers. She pointed out that past initiatives had failed simply because "there was no 'buzz' on the Hill," and cited the success of the National Institutes of Health, whose budget has increased $4 billion over the last few years - more than the total funding for the NSF. Its success is largely due to its extraordinarily competent and powerful lobbying organization and substantial grassroots activity by NIH members. In contrast, Michael Lubell, APS director of public affairs, reported that fewer than 500 of the 43,000 APS members have regular contact with Congress.
"The Nanotechnology Initiative will enable the scientific community and our country at large to take advantage of the tremendous potential of nanoscale science and technology across a broad spectrum of basic and applied research," said APS President James Langer, who chaired the symposium. "But it's still a proposal, and it's not the responsibility of the government, but of the scientific community as a whole to help the nation understand the importance of this proposed initiative."
A sampling of nanotechnology-related research presented at the APS March Meeting
-Compiled with the assistance of Philip Schewe/Ben Stein; AIP Public Information