The Beckman Institute for Advanced Science and Technology

'Two heads are better than one' is an adage honored by this research institute, where multidisciplinary collaboration is an art.

Biophysicists, image analysts, and voice-recognition specialists pooled their expertise at the Beckman Institute to develop new ways of manipulating three-dimensional images of complex molecules. By using hand gestures and verbal commands, the researchers can more closely study how molecules respond to changes in energy or chemistry.

Originally published November 1996.

The building's very architecture invites interaction. From the three stories of research laboratories on its north side, occupants can look through large picture windows across a skylit atrium at the people in the five stories of faculty offices on the south side. The separation of the laboratories from the offices is deliberate: it encourages scientists and engineers to walk back and forth several times a day, maximizing chance encounters with colleagues.

Each of the 11 pedestrian bridges uniting the two sides has a scattering of coffee tables, chairs, and whiteboards, to promote impromptu discussions and brainstorming. The wide vistas of the block-long vaulted atrium set ideas soaring while swallowing sound, granting privacy to each of the many conversations it shelters. Intimate lounges tucked in odd nooks complement larger seminar rooms, conference rooms, and the 200-seat auditorium. The building's subliminal message is: meet, talk, share.

Beckman Facilities Tour

The skylit atrium of the multidisciplinary Beckman Institute for Advanced Science and Technology at the University of Illinois at Urbana-Champaign [left] allows scientists and engineers in the biological, chemical, engineering, and computer laboratories and in the faculty offices to view each other. Pedestrian bridges joining the labs and offices provide spaces for researchers to meet informally. The conference tower [above] holds four floors of rooms for more formal meetings.

The 11 pedestrian bridges that join the laboratories and offices are equipped with whiteboards, coffee tables, and chairs, making it easy for Beckman Institute investigators to gather colleagues for impromptu brain-storming sessions. In fact, at the institute, "Let's grab a bridge" has become a synonym for "Let's talk about it."

Most Beckman Institute projects are pursued by teams of two to a dozen investigators. Engineers can design and build entirely new equipment, while scientists often find that investigators in quite different fields have been attacking similar problems, but with different approaches.

Photos: Beckman Institute

A secluded garden with a contemporary sculpture at the side of the Beckman Institute building allows discussions to take place in a relaxed outdoor setting.

The winner of several architectural awards, the unusual structure is the home of the Beckman Institute for Advanced Science and Technology, located on the campus of the University of Illinois in Urbana-Champaign. The architectural message is central to its raison d'etre: support for a multidisciplinary exploration of complex problems at the juncture of biology, cognitive science, and electrical engineering.

"The institute's premise is that these days, problems in science and technology can be better addressed by an approach that transcends disciplinary boundaries," explained Jiri Jonas, who has been the institute's director since 1993. Researchers in one field often discover that workers in another have been attacking the same kinds of problems, but with the different vocabulary of a different background.

Communication and cross-fertilization among disciplines are formalized in weekly lunch seminars. Every Wednesday, a Beckman researcher gives a talk on work in progress and solicits ideas from the diverse audience.

"In coming here [to the Beckman Institute]," said William T. Greenough, professor of psychology, a member of the neuronal pattern analysis group, and one of the institute's first planners, "we made the decision that we would concentrate on things together and help each other."

In the beginning

The Beckman Institute was conceived in 1983, when two groups of campus administrators and professors were asked by Theodore L. Brown, then vice chancellor for research, to come up with a plan to facilitate research on the Illinois campus. Nominally, one committee--chaired by Greenough, psychology professor and then chair of the neuroscience program--was to have been composed of biologists, cognitive scientists, and life scientists; the other--chaired by electrical and computer engineering professor Karl Hess--was to have been composed of engineers and physicists.

In fact, however, members of most disciplines were present on both committees. "People kept saying they were on the wrong one," Greenough recalled, "but it was the genius of Ted Brown to put them together." Although both committees were "led to believe they were competitive, they both independently came up with similar ideas for an interdisciplinary research center."

The next year, their proposals were combined into one, which was submitted to Arnold O. Beckman, an Illinois alumnus (BS, chemical engineering, 1922, and MS, physical chemistry, 1923). He had gone on to get a Ph.D. and become a chemistry professor at the California Institute of Technology in Pasadena, and later, in 1935, founded National Technical Laboratories (renamed Beckman Instruments Inc. in 1950). With the development of several best-selling analytical instruments, the company prospered, and in 1987, its leader was inducted into the National Inventors Hall of Fame.

In 1985, though, Beckman was applying his inventiveness elsewhere: negotiating conditions with both the Illinois state government and the university to ensure ongoing operational support for a potential institute. In October, he and his wife Mabel announced their princely gift of US $40 million to the university to found the Beckman Institute. The sum was "unimaginable at the time to fund a multidisciplinary project of this scale," Greenough recollected.

It would take a year, the university administration estimated, to design the building's basic plan. "Beckman said, 'Why not do it by Dec. 10 of this year?' and then [removed] his hearing aid," Hess recalled. "He knew if there was time, people would find a hair in the soup." The 29 000-square-meter building of brick, terrazzo, and marble was completed in two years, and most of the research groups had moved in by 1989.

Today the Beckman Institute encompasses nearly 1000 faculty, staff, and students (undergraduate and graduate), including about 130 people from the university's National Center for Supercomputing Applications.

Three themes

When Jonas, the institute's director, first arrived at the Beckman in 1993, "there were many different projects," he recalled. "But I felt that if we wanted to be world class, we would have to decide what to be world class in."

By the next year, the whole institute had been restructured to focus on three related research areas: molecular and electronic nanostructures, biological intelligence, and human-computer intelligent interaction. The institute calls these areas its three main research themes.

Research in molecular and electronic nanostructures concentrates on ways of shrinking electronic devices to less than 10 nm. The overarching theme encompasses R&D that ranges from engineers investigating computational electronics and ICs to biophysicists and biochemists working with nanometer-scale devices and chemical systems. After all, "chemists are essentially experts in angstrom systems--but they never talk with people working on electronic devices!" Jonas pointed out.

The cooperation among physicists, engineers, and chemists working under the same umbrella yielded a discovery of great commercial promise just this past year. Joseph Lyding, a professor of electrical and computer engineering who specializes in nanolithography, had started a program to etch lines only a few atoms wide on silicon substrates. Exceptionally hot electrons from a scanning tunneling microscope were to be his etching tool.

Silicon dioxide forms spontaneously on silicon in an oxygen atmosphere. So to do his etching, Lyding first had to cleanse the silicon substrate of the contaminating oxygen. In a step fairly common in IC manufacture, he removed the SiO2 by heating it in a high vacuum to vaporize the oxygen. Next, he passivated the substrate by exposing it to pure atomic hydrogen, which takes up all the loose atomic bonds that would otherwise react with oxygen. Only then could he write on the silicon by stripping off the hydrogen atoms in nanometer-wide lines.

At a suggestion by long-time collaborator Phaedon Avouris, a surface chemist at IBM Corp.'s Thomas J. Watson Research Center in Yorktown Heights, N.Y., Lyding modified the process. He tried passivating the substrate's surface not with atomic hydrogen, but with deuterium, a heavier isotope of hydrogen. "It's standard for a surface scientist trying an experiment with hydrogen to also try it with deuterium," Lyding explained, "because you can change the mass of the atom without changing the chemistry." To his "shock," however, he found that dislodging a deuterium atom with its extra neutron took 100 times as many hot electrons from the scanning tunneling microscope as were required to knock off an ordinary hydrogen atom.

Better engineering through chemistry

The strong isotope correlation remained "just a curiosity" for about eight months, until Oct. 20, 1995, when Lyding happened to chat with Karl Hess. Hess was wondering if Lyding's nanolithography technique might apply to commercial ICs. Lyding asked if anyone had tried using deuterium in IC processing.

According to Lyding, Hess's "jaw dropped." Hess knew that if Lyding's isotope-dependent observations were correct and IC surfaces could be passivated with deuterium, either the speed or the lifetime of devices might be increased. Better yet, perhaps devices could even be hardened for use in environments that are bathed in harsh radiation, such as outer space.

Hess called a former graduate student of his, Isik Kizilyalli, these days an expert on transistor physics and reliability at AT&T Bell Laboratories (now the research arm of Lucent Technologies Inc.) in Orlando, Fla. Kizilyalli provided four specially prepared 150-mm-diameter wafers bearing CMOS transistors, which Hess and Lyding cooked, two in deuterium and two in hydrogen. In November and December, Kizilyalli ran accelerated aging tests on the results. Indeed, "deuterium did slow down the [hot-electron] degradation [of transistors] by factors of 10 to 50, maybe more in some cases," Kizilyalli found.

The Beckman researchers have now processed wafers from several chip manufacturers, duplicating the results obtained at Lucent Technologies. Interest is keen because all CMOS ICs today are passivated with hydrogen. Changing to deuterium would add a mere 30 cents to the cost of a wafer but would extend transistor lifetimes tenfold or more.

The handling precautions and plumbing connections for deuterium and hydrogen are identical, making a change-over "trivial," said Lyding: "You just roll in a bottle of deuterium and hook it up where you'd put the bottle of hydrogen."

"This invention was made possible by the special research atmosphere at the Beckman Institute," declared Jonas. Hess's and Lyding's participation in the same main research theme--along with Lyding's long-standing collaboration with surface chemist Avouris--led to "their frequent discussions of ideas."

Spontaneous nanomushrooms

At the molecular end of the nanostructure research is the work-in-progress that materials scientists Samuel Stupp and Li-Sheng Li, along with graduate students in chemistry, are focusing on in molecular self-assembly. Stupp's group is seeking to create molecules that will spontaneously form into stable, low-energy structures that serve an electronically useful function. Examples might be photonic materials (sensitive to light or emitters of light), piezoelectric devices (in which mechanical strain generates an electric charge, and vice versa), or quantum dots (nanostructures that have unique electronic properties because of their size). The team is exploring materials that can perform complex functions yet are easily made without clean rooms or other complex hardware. Stupp would "love to be able to create an electronically useful material by, say, spray-painting a surface."

Of most interest to the group at the moment is a mushroom-shaped structure about 10 nm tall and 57 nm wide, formed by the spontaneous aggregation of about 100 organic molecules. "Unlike a crystal, the mushroom is not symmetrical," Stupp observed. "It spontaneously forms a stem and a cap, which make the mushrooms pack in unique ways, resulting in materials with special properties." Moreover, the stem is sticky while the cap is smooth like Teflon.

Stupp is also enlisting the aid of Joe Lyding and his nanolithographic capabilities, as well as chemistry graduate student Gregory Tew, in trying to make a mushroom that conducts electricity. The trio want to see if they can possibly generate photons by combining holes and electrons in the confinement of a nanostructure. "The cap can easily donate electrons and the stem can accept electrons," Stupp said. "So maybe we could make an array of millions of quantum dots by self-assembly."

Stupp is growing monolayers of mushrooms on substrates that Lyding has patterned with islands of silicon oxide. (Lyding does the patterning with his scanning tunneling microscope and deuterium-passivation techniques.) The stems stick to the oxide islands, which can be set any desired distance apart. Moreover, the mushrooms have been found to aggregate spontaneously into quantum-dot-sized structures. So far, the precursor molecules have been synthesized and found to glow brilliant blue when irradiated with ultraviolet light, which ultimately could make them useful in displays.

Psyching out the brain

Biological intelligence is the Beckman Institute's second main research theme. It delves into how the brain works--all the way from the molecular to the cognitive level. Along with the expected neurobiologists (concerned with how the nerve cells operate) and cognitive psychologists (who care about how the brain perceives the world), there is "a healthy admixture of engineers and physical scientists," pointed out director Jonas. For the neurobiologists, the presence of the engineers is a boon, he explained: "Traditional neurobiologists rely on the instruments available." But at the institute, the engineers are helping them "develop new ways of studying the brain."

One new instrument is the miniature nuclear magnetic resonance (NMR) machine for studying metabolic activity at the cellular level. The project was headed by Richard Magin, a biomedical engineer who is director of Beckman's Magnetic Resonance Engineering Laboratory. Electrical engineer Timothy Peck built the microcoils, using existing glass capillary tubes (developed by analytical chemist Jonathan Sweedler), while NMR expert Andrew Webb designed the pulse sequences to optimize signal detection.

Conventional NMR works by using pulsed RF fields to stimulate nuclear transitions, setting nuclei resonating in ways that reflect the structure and dynamics of molecules. "To do NMR microscopy for individual cells, you need to shrink the coil to the size of the cells," explained Magin. His group has developed a solenoid smaller than the diameter of a human hair (100 µm), with five to 10 turns of copper wire 12 to 25 µm thick. The tiny coil can be formed around a capillary tube holding only a few picograms of the material under study.

Using the microscopic coils, the researchers can push the limits of NMR detection sensitivity two orders of magnitude farther than for conventional detectors. With nanoliter-volume detection coils as sensitive as this, some of the key chemicals present in single neurons (nerve cells) can be identified.
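The payoff of miniaturization follows from a scaling rule often quoted for solenoidal receiver coils: signal-to-noise per unit sample volume grows roughly as the inverse of the coil diameter. The sketch below uses that rule of thumb with invented coil sizes (not the group's actual figures) to show how a hair-width coil can recover the two-orders-of-magnitude gain mentioned above.

```python
def snr_gain(d_conventional, d_micro):
    """Rule-of-thumb mass sensitivity for a solenoidal NMR receiver coil:
    signal-to-noise per unit sample volume scales roughly as 1/(coil diameter),
    so the relative gain is simply the ratio of the two diameters."""
    return d_conventional / d_micro

# A conventional 5-mm NMR coil versus a hair-width 50-um microcoil
# (diameters in meters; values chosen for illustration only).
gain = snr_gain(5e-3, 50e-6)
print(f"sensitivity gain ~ {gain:.0f}x")  # roughly two orders of magnitude
```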

Since lithographic techniques can be used to etch planar coils as small as 5 um across on a substrate, NMR microscopy could be done on an even smaller scale. "We're trying to move to planar technology because then we can use established microelectronic fabrication techniques to make the mini-NMR machines," explained Magin. "The eventual goal is to have an entire NMR detector on a chip--planar coil, MESFET amplifier, and magnetic field gradient coils, all part of one little microprobe."

These tiny NMR detectors, which in essence function as NMR microscopes at the cellular level, have proven valuable to the Beckman Institute's groups on human intelligence. The devices are helping researchers to link chemical and electrical activity in local regions of the brain with higher cognitive functions, such as learning how to perform a task. "It's a marriage of electrical engineering with chemistry," Magin declared.

Sorting out the chatter

At the cognitive level of biological intelligence, another group is working on a problem that has long stumped experts: an "intelligent" hearing aid that enables a hearing-impaired person to single out individual conversations in an environment with a lot of surrounding noise or cross talk--say, a cocktail party.

Some existing hearing aids amplify signals indiscriminately. Others incorporate algorithms that identify desirable signals and filter out noise on the basis of the acoustic profile of the incoming sounds. The latter engineering approach works fine for identifying and minimizing mechanical noises (which are very different from speech), but fails when the background noise consists of people speaking independently.

Early this year at the institute, a multidisciplinary team began to try out a new approach: emulating how the brain selectively extracts and amplifies acoustic information. The team is headed by Albert Feng, a member of Beckman's neuronal pattern analysis group and head of the university's department of molecular and integrative physiology.

Speech and hearing scientists Robert Bilger and Charissa Lansing provide both insight into the psychoacoustics of human speech perception and knowledge of the current approaches to hearing-aid design. Chen Liu, a biomedical engineer, has been handling the mathematics of the model: designing the algorithms, deriving the model's equations, and programming the computer. And to help identify fruitful approaches to the problem, electrical engineer Bruce Wheeler has lent his expertise on signal processing.

Already, Feng's team has developed a computer simulation of how a biologically inspired intelligent hearing aid could amplify selected sounds in the presence of intense random sounds. The strategy is simple: let the wearer of the hearing aid actively select the desired signal by directing the microphones on the hearing aid (which may be mounted on a pair of eyeglasses) toward the spatial origin of the signal of interest.

"We've been able to separate weak signals from noise even when sources are separated by only a few [angular] degrees," said Feng. Indeed, the prototype has successfully singled out a speech signal with 1/30 the amplitude of the surrounding noise. It can do so even when the speech source and the noise source, viewed from a meter away, are separated by no more than the width of an average person's mouth. Once the desired signal is extracted, it can be amplified and transmitted to the ear.
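Although Feng's actual algorithm is not spelled out here, the core idea of spatially selective listening can be sketched with the classic delay-and-sum technique: align the two microphone channels for the chosen direction so the desired signal adds coherently, while an off-axis interferer, arriving with a different inter-microphone delay, partially cancels. Everything below (sample rate, tone frequencies, delays) is an invented toy scene, not the team's data.

```python
import math

def delay_and_sum(mic1, mic2, shift):
    """Advance mic2 by `shift` samples to align the desired source with mic1,
    then average: the steered source adds coherently, off-axis sound does not."""
    n = len(mic1)
    return [0.5 * (mic1[i] + mic2[(i + shift) % n]) for i in range(n)]

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

# Toy scene, one second at 16 kHz: a 300-Hz "speech" tone arriving broadside
# (zero inter-microphone delay) plus a louder 470-Hz interferer whose off-axis
# path delays it by 12 samples at the second microphone. Values illustrative.
fs = 16000
sig = [math.sin(2 * math.pi * 300 * i / fs) for i in range(fs)]
noise = [3.0 * math.sin(2 * math.pi * 470 * i / fs) for i in range(fs)]

mic1 = [sig[i] + noise[i] for i in range(fs)]
mic2 = [sig[i] + noise[(i - 12) % fs] for i in range(fs)]

out = delay_and_sum(mic1, mic2, 0)  # steer broadside, where the speech is
residual = [out[i] - sig[i] for i in range(fs)]  # interferer left in the output
print(f"interferer RMS in: {rms(noise):.2f}, out: {rms(residual):.2f}")
```

With only two microphones the cancellation is partial; the published dual-microphone schemes add further processing, but the geometry-driven selectivity is the same idea.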

Feng and his colleagues are now conducting tests of the algorithm's effectiveness in an anechoic chamber. The final step will be miniaturization of an actual device. The university has already filed an application for a patent.

Homo ex machina

Human-computer intelligent interaction is the third of the institute's main research themes. It concentrates on combining machine and human intelligence to solve problems. As might be expected, the theme embraces the domains of image formation and analysis, artificial intelligence, robotics, speech recognition, plus hardware and software expertise.

The institute's special angle, however, is its inclusion of experts in human perception and performance, whose know-how ranges from psychology to flight simulation. "You have to understand how people read and react to displays to improve the interface between a human and a computer," observed director Jonas. As a consequence, research with this theme spans the gamut from developing autonomous robots to helping human beings to manipulate data in computers better.

At the robotics extreme, Narendra Ahuja, an artificial intelligence researcher, is collaborating with neuroscientist Mark Nelson and entomologist Fred Delcomyn on a semiautonomous robot that can scramble across difficult terrain. Their aim is a machine that, when equipped with a camera and artificial vision, might be enlisted to clean up hazardous waste sites or find survivors in collapsed buildings.

To be sure, the idea of a walking robot is not new. But many of the prototypes developed by the U.S. National Aeronautics and Space Administration, the Department of Defense, and others have lacked the agility and adaptability to handle varied situations and function even after sustaining damage.

To solve those difficulties, the Beckman researchers have taken a novel approach: modeling their experimental robot on that most durable of insect survivors, the cockroach. Of all insects, the cockroach was chosen because it exhibits many of the properties desired in agile locomotion--and its biology, structure, and dynamics have been well-studied for decades.

The robotic cockroach--which is creepily realistic in its hunched crouch--is two-thirds of a meter long with the same low center of gravity that gives an insect stability even when clambering over broken terrain. The robot's six jointed legs have the same geometry as an insect's: shorter legs in front that move vertically and longer legs in back that move backward at an angle. As with an insect, the legs have their own independent motor control, although they are coordinated through a central nervous system based on digital signal-processing chips.
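The coordination such a controller must enforce can be illustrated by the cockroach's own alternating tripod gait, in which three legs swing forward while the other three keep the body statically stable, then the roles swap. The scheduler below is a hypothetical sketch, not the Beckman group's DSP code; the leg labels are invented.

```python
# Alternating tripod gait: at any instant three legs swing forward while the
# other three form a statically stable support triangle, then the roles swap.
LEGS = ["L1", "R1", "L2", "R2", "L3", "R3"]  # hypothetical leg labels
TRIPOD_A = {"L1", "R2", "L3"}
TRIPOD_B = {"R1", "L2", "R3"}

def gait_schedule(n_phases):
    """Yield (swing, support) leg sets for each half-cycle of the gait."""
    for phase in range(n_phases):
        if phase % 2 == 0:
            yield sorted(TRIPOD_A), sorted(TRIPOD_B)
        else:
            yield sorted(TRIPOD_B), sorted(TRIPOD_A)

for swing, support in gait_schedule(4):
    # A real controller would command each leg's motors here; keeping three
    # legs always grounded preserves static stability on rough terrain.
    print("swing:", swing, "support:", support)
```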

When powered up, this Kafkaesque creation heaves itself to a palsied stand a third of a meter high and, despite its shuddering, is strong enough to resist a visitor's pressing down on its back. The three investigators and their students are now working on programming the robot to take its first steps.

Essential to this project so far have been the combination of Ahuja's expertise in the engineering design of robots, Delcomyn's intimate knowledge of the cockroach's body, and Nelson's understanding of biological control and nervous systems. This interdisciplinary pooling of insights will "become even more [crucial] as the investigators introduce additional sensory and walking capabilities," observed Ahuja. Their research may, incidentally, also prove to be helpful to those studying artificial aids for human limbs.

Molecules in mid-air

A quite different project within the theme of human-computer intelligent interaction is the development of interfaces more effective than the keyboard and the mouse at manipulating complex data in a computer. Physics graduate student William Humphrey and research programmer Andrew Dalke are two members of the theoretical biophysics group, headed by physics professor Klaus Schulten, which is interested in developing simulations of the molecular dynamics (chemical or energy interactions) of biopolymers. The aim is simulations that are both accurate and intuitive to understand and manipulate for the people studying such molecules as proteins, DNA, or membranes.

This dynamic modeling is extraordinarily complex. Each molecule may have anywhere from 3500 to 35 000 atoms, and the time scales of their internal processes range from a few femtoseconds to a nanosecond (or even up to a second in nature). Molecular dynamics simulations start with a computer model of a molecule in its initial condition. The equations of motion that describe the bonds, masses, and charges of atoms within a molecule are then applied, and the results of the simulation compared to the dynamics of its natural counterpart.
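The numerical engine behind such a simulation is typically an integrator that steps the equations of motion forward femtosecond by femtosecond. The sketch below applies the widely used velocity Verlet scheme to a single "atom" on a harmonic bond; the force constant, mass, and time step are arbitrary illustrative units, not parameters from the Beckman group's codes.

```python
def velocity_verlet(x, v, force, mass, dt, steps):
    """Integrate Newton's equations of motion with the velocity Verlet scheme,
    a standard choice in molecular dynamics because it conserves energy well."""
    a = force(x) / mass
    for _ in range(steps):
        x = x + v * dt + 0.5 * a * dt * dt   # new position from current v and a
        a_new = force(x) / mass              # force evaluated at the new position
        v = v + 0.5 * (a + a_new) * dt       # velocity from the averaged accel
        a = a_new
    return x, v

# One "atom" on a harmonic bond, F = -k x (all units illustrative).
k, m, dt = 1.0, 1.0, 0.01
x_end, v_end = velocity_verlet(1.0, 0.0, lambda x: -k * x, m, dt, 10_000)

# Total energy should stay near its initial value of 0.5 * k * x0**2 = 0.5
energy = 0.5 * m * v_end ** 2 + 0.5 * k * x_end ** 2
print(f"energy after 10,000 steps: {energy:.6f}")
```

The same loop, with forces summed over thousands of bonded and nonbonded interactions, is what a production molecular dynamics code runs millions of times per simulation.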

But as such simulations unfold, how can a mere human observer gain a quick and accurate idea of what is happening? To achieve that sensitivity, the Beckman team has developed interactive visualization tools to display the model molecules. A Silicon Graphics workstation with a projection system throws onto a screen two alternating views of a molecule, such as bacteriorhodopsin. Through special glasses, the molecule appears in its full three-dimensional glory as purple spirals curl around a central ball-and-stick model.

A scientist--or a visitor--can make the molecule translate from left to right or even come apart, simply by pointing with an index finger in the desired direction (two video cameras at right angles detect the gesture and transmit the information to the workstation). To view the molecule from a different angle, one calls out the command, "Rotate!" The molecule then slowly pirouettes, in reaction to a speech-recognition module. The whole human-computer interface blends Schulten's expertise in the molecules themselves as well as in visualization software, Thomas Huang's work on gesture recognition, Robert Skeel's development of software and algorithms, and Yunxin Zhao's background in speech recognition.

Helping one another see

Accessible by every Beckman Institute research project is the National Center for Supercomputing Applications (NCSA), part of which is housed in the Beckman Institute building. Through the center, institute researchers have access to such supercomputers as Thinking Machines' 512-node CM-5, Convex C3 and Exemplar, and SGI Power Challenge. Also located in the institute's building are the center's two Silicon Graphics workstation labs and virtual reality facilities, including the CAVE (Cave Automatic Virtual Environment). Named for the Allegory of the Cave found in Plato's Republic, in which the philosopher explored ideas of perception, reality, and illusion, this cubical virtual-reality theater measures about 3 meters on a side. Full-color stereo SGI graphics are projected onto three rear-projection screens for walls and a down-projection screen for the floor, with a resolution of nearly 3072 by 1536 pixels.

Computer-controlled audio surrounds a visitor with sound, while the changing positions of a user's head and hands are tracked with electromagnetic sensors clipped to goggles and fingers. The three-dimensional environment definitely eased the analysis of complex biological data, as became obvious when the task was to map the faint, thread-like tails of two sperm intertwined around a fertilized human ovum.

Central to the support of both the National Center for Supercomputing Applications and the three main research themes is the Visualization Facility on the Beckman Institute's fourth floor. The Viz Lab, as it is known, consists of the Digital Visualization Facility (which has many kinds of computers) and the Microscopy Suite (which has half a dozen state-of-the-art instruments). The Viz Lab is where the data gathered by researchers ends up for processing.

As designed by co-directors Bridget Carragher and Barbara Fossum, the Viz Lab is laid out in six "pods," each with two to four computer workstations, including Unix systems, high-end Macintoshes, and Pentium PCs. The workstations run a wide variety of software, ranging from large image-processing and analysis routines to animation packages. Carragher and Fossum designed the setup with the displays pointing inward toward the center of each pod, so that people are able to see each other's screens "to encourage multidisciplinary interaction," Fossum said. "People see someone else's good techniques and try them themselves and suggest them to others."

The scientists themselves sit and manipulate their data, perhaps calling for help from the Viz Lab's staff of consultants. "We're not just a service bureau," explained Fossum. "Every user works his own data, because that analysis in itself is research on the fly." Besides analyzing and displaying data for research projects, the Viz Lab gives researchers the horsepower to devise animated movies of data that change over time, either for research purposes or even for release to television news programs.

In addition to doing number-crunching, any of the workstations can be used to remotely observe and manipulate samples in two instruments in the Beckman Institute's basement: a scanning transmission electron microscope and a magnetic resonance imaging system. The instruments' control panels are accessed through Netscape or any other World Wide Web browser, so that they can be run from any place that has Web access.

In fact, the Viz Lab's co-directors are working with Paul Lauterbur, the inventor of magnetic resonance imaging, and others toward creating a World Wide Laboratory--the brainchild of Clint Potter, the project's team leader. This lab eventually would also allow remote control of the scanning confocal microscope and atomic force microscope that also share the basement. If all goes well, a researcher from anywhere around the globe could "mail in a sample, a technician could put the sample into the requested microscope and walk out of the room, and the researcher could operate the instrument in real time via the Web," Fossum explained.

Beckman's razor

One stipulation made by Arnold Beckman in his institute's charter is common in industry but rare in academia. It is a quadrennial review: each research group must be scrutinized every four years to be sure it is contributing healthily to and benefiting from the multidisciplinary approach.

The official document, "Beckman Institute Research Review Policies and Procedures," minces no words about what this means: "The long term success of the Beckman Institute very much depends on whether the Institute can maintain its dynamic nature. Clearly, for a mature Institute this necessarily means that some percentage of faculty must return to their home departments in order to allow a healthy influx of new people."

"Beckman was very worried about [intellectual] petrification," Hess observed. Without some institutionalized way of stirring the pot, "the Beckman Institute would become just another building with people in it," Lyding added.

Then there's the practical issue of space. The institute cannot hire faculty directly. Every faculty member has to be tenured or on a tenure track in a home department. "But the existence of the Beckman Institute has been used as leverage [by the university] to hire the best," noted Jennifer M. Quirk, the institute's associate director for external affairs and research. Attracting anyone, however, is moot "if you're full and no one leaves."

Every program in each main research theme, therefore, must go under a microscope every four years to see if it is making the grade. This very month, on Nov. 18 and 19, the first-ever such review will be conducted. Up first will be the molecular and electronic nanostructures theme, chaired by Hess.

Of the review committee's six members, four will come from outside the university and two from the Beckman Institute's own Program Advisory Committee. The four external reviewers will be chosen not only for their research expertise and accomplishment, but also for "their demonstrated breadth of view and experience in interdisciplinary research."

Before the reviewers arrive on campus, they will read statements of the theme's mission, direction, and project work. Also included will be the rationale for why each group's presence in the Beckman Institute both receives and confers benefits. Then the six-member committee will visit the Beckman Institute for eight intense hours of presentations by faculty members, visits to laboratories, and meetings with the institute's director Jonas and associate director Quirk.

The review committee will then draft its evaluation of the theme and each of its component programs. The criteria will include each program's quality with respect to work elsewhere in the same fields, its effectiveness as part of the main research theme and the Beckman Institute at large, and its relevance to the larger goals of all three main research themes. On the basis of the committee's advice, director Jonas will issue a report in January 1997, listing his conclusions and recommendations as to which groups stay and which leave.

Molecular and electronic nanostructures is only the first main research theme to be examined. Next year, human-computer intelligent interaction will come under the microscope, and in 1998 it will be the turn of biological intelligence. The year after that, it will be the National Center for Supercomputing Applications. Although the center is not part of the main research themes, "we want to have more of a relationship with [it] than as a tenant," Quirk added.

This pioneering review "has been talked about for a long time, but there is some resistance" to it, Quirk noted. Yet, all concerned agree that "if the Beckman Institute is to thrive, it needs to have new blood."

To Probe Further

For hot links to the Beckman Institute's World Wide Web home page and the page of the World Wide Laboratory and other research laboratories, consult IEEE Spectrum's own home page at http://www/

A detailed description of the workings of a virtual reality CAVE can be found in "A 'room' with a 'view,' " by Thomas A. DeFanti, Daniel J. Sandin, and Carolina Cruz-Neira in the October 1993 issue of Spectrum, pp. 30-33.
