Last year the White House announced an oddly titled plan called the “Materials Genome Initiative”. The aim of the initiative—and thus the title—was to apply the same kind of data-crunching firepower that was used to map DNA in the Human Genome Project to the field of materials science.
While one could argue that the White House dubbing this project the Materials Genome Initiative was more a metaphorical flourish than a scientific aim, it does raise the question of whether we can map all of materials science in a way that will improve manufacturing, as the plan sets out to do.
To answer that question, Richard Jones has penned a piece on his blog Soft Machines, which starts by posing the rhetorical question, "Do materials even have genomes?"
Jones raises many of the questions and problems that come with depending on—or expecting—computer simulation to help us design materials to perform the tasks we set for them. Some of the points he makes remind me of a piece I wrote five years ago: Materials By Design: Future Science or Science Fiction?
At the time, I noted that “Any useful software modeling would need to be able to reveal how an alteration in a material’s structure—for example, a change in a crystal’s lattice structure—affects its properties and functions. Such a program would also need to be able to do that across a range of scales, because we also don’t know whether we must look at the atomic or particle level to find out where effects are taking place.”
This concern about problems of scale is reflected in Jones’s piece, but he also asks on what timescale such an endeavor would proceed:
"Even with the fastest computers, you can’t simulate the behavior of a piece of metal by modelling what the atoms are doing in it—there’s just too big a spread of relevant length and timescales. If you wanted to study the way different atoms cluster together as you cast an alloy, you need to be concerned with picosecond times and nanometer lengths, but then if you want to see what happens to a turbine blade made of it in an operating jet engine, you’re interested in meter lengths and timescales of days and years (it is the slow changes in dimension and shape of materials in use—their creep—that often limits their lifetime in high temperature situations)."
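The gulf Jones describes can be made concrete with a little arithmetic: atomistic simulation resolves picoseconds and nanometers, while a turbine blade in service must be understood over meters and years. A rough sketch (the ten-year service life is an illustrative assumption, not a figure from Jones's post):

```python
import math

# Scales resolved by atomistic simulation, per Jones's figures
atomistic_time = 1e-12     # seconds (picoseconds)
atomistic_length = 1e-9    # meters (nanometers)

# Scales relevant to a turbine blade in an operating jet engine
service_time = 10 * 365 * 24 * 3600  # ~10 years in seconds (assumed)
service_length = 1.0                 # meters

# How many orders of magnitude separate the two regimes
time_decades = math.log10(service_time / atomistic_time)
length_decades = math.log10(service_length / atomistic_length)

print(f"timescales span roughly {time_decades:.0f} orders of magnitude")
print(f"length scales span roughly {length_decades:.0f} orders of magnitude")
```

Roughly twenty orders of magnitude in time and nine in length separate the two regimes, which is why no single simulation can bridge them directly.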
Jones points out that developing multi-scale modeling like this is nothing new; he refers to Masao Doi’s Octa project as an example. But such projects remain problematic. There are so many variables involved in materials science, he says, that it is not clear how generic the processes can be made, at least for computer modeling. He further argues that researchers would quickly turn to physical experiments outside of the computer models.
He notes: “I’m skeptical that anyone trying to test out how to shape and weld big structures out of an oxide dispersion strengthened steel (these steels, reinforced with 2 nm nanoparticles of yttrium oxide, are candidate materials for fusion and fourth-generation fission reactors, due to their creep resistance and resistance to radiation damage) [will get far] without getting someone to make a big enough batch to try it out.”
There is no doubt that computer modeling is a fantastic tool, a view Jones seems to share in the piece, but we should not expect materials science to reveal itself the way the DNA molecule was mapped by the Human Genome Project. Whether the Materials Genome Initiative will prove beneficial to US manufacturing is something that can only be judged, fittingly, on a long timescale of its own.
Dexter Johnson is a contributing editor at IEEE Spectrum, with a focus on nanotechnology.