Managing Multiple Scales for Nanotechnology Research

One of the problems nanotechnology has faced is that it brings back together disparate scientific disciplines that had spent the last century growing increasingly apart. Given the high level of specialization, it had become difficult for a physicist to talk to a biologist, or for the biologist to speak to a chemist, and have them all understand one another.

Now, with nanotechnology, they are all thrown back into the same cauldron of science and need to agree on their terms. Nowhere is this definitional issue more acute than in the area of length and time scales. It was all well and good when crystalline materials and biological materials were kept separate, but with the trend toward hybrid systems it's time to get this sorted.

In a meeting I moderated some time ago with a mix of biologists, chemists, and physicists, the group agreed that an instrument capable of spanning 4 or 5 orders of magnitude, from 0.1 nm to 10 microns, would satisfy everyone performing nanotechnology research. Electron microscopy, with its ability to bridge multiple scales, seemed the most likely candidate to fill the role.
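As a quick sanity check on that figure, the span from 0.1 nm (an angstrom) to 10 microns works out to five orders of magnitude; a minimal sketch of the arithmetic:

```python
import math

# Agreed-upon measurement range for nanotechnology research, in meters.
lower = 0.1e-9   # 0.1 nm, i.e. one angstrom
upper = 10e-6    # 10 microns

# Orders of magnitude spanned = log10 of the ratio of the endpoints.
orders = math.log10(upper / lower)
print(round(orders))  # 5
```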

The physicists were pretty happy, but the biologists were still forlorn. It was difficult to see how, with current techniques and instruments, a living cell could be examined at the nanoscale without cryofreezing it. The only source of information at the atomic scale (between a nanometer and an angstrom) for biological specimens, the biologists lamented, was crystallography.

You can get some information through crystallography when it is combined with other techniques such as activity analysis, cutting and pasting, and so on, the biologists conceded. But the truth is that it's still an ice cube, not exactly representative of the living system you want to analyze.

Computer modeling's role in bridging the gap has its limitations as well, chief among them scaling. It's pretty accurate under 1,000 atoms, but beyond that it all gets a bit compromised and begins to look more like a 2-D image than a 3-D one.

One approach being pursued to overcome this combines precise, first-principles modeling with empirical modeling. This method has proven itself to be quite accurate for purely inorganic systems such as germanium, making possible accurate models of systems of 100,000 atoms or more.

All of this preamble brings me to recent developments at Argonne National Laboratory, where scientists have employed high-intensity X-rays to observe the motions of biological and organic molecules in solution. By combining these observations with their models, whose accuracy they heretofore had no way of checking, they have been able to make movies of a DNA molecule in motion within a solution.

I will have to check in with the biologists to see if they are heartened by this breakthrough.


Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
