Engineers Evolve Transistors for Next-Gen Chips
Evolutionary algorithms lead to new logic and memory that may smooth the way as CMOS nears its size limits
12 May 2009—Concepts gleaned from the study of evolution could help overcome manufacturing problems in future computer chips. That’s the hope, at least, of researchers in the Intelligent Systems Group at the University of York, in England, who will present their findings next week at the IEEE Congress on Evolutionary Computation, in Trondheim, Norway.
As key components of transistors shrink from 45 to 22 nanometers, tiny natural variations in manufacturing—which make no difference in larger devices—start to affect performance. For instance, there’s no way to control the exact arrangement of atoms of doping elements within a lattice of silicon, and different levels of dopant will alter electrical effects. At these tiny sizes, line edges and surfaces that define components also have a natural roughness that can’t be avoided and can trip up a transistor’s function.
So manufacturers working with complementary metal-oxide semiconductor, or CMOS, technology want circuit designs that still function well despite these inevitable variations. To come up with a set of candidate designs, the York researchers turned to evolutionary algorithms, a technique in use for several years to create new types of software. They started with a mathematical description of a set of standard cells, which are groups of transistors that perform a Boolean logic function, such as AND or XNOR, or a memory function, such as a flip-flop. The algorithm breaks the transistor group down into subsets, analogous to genes for the circuit. It then begins altering those genes to see what new circuit designs result.
The researchers know that if they give an AND cell a certain set of inputs, it should produce a known output. Darwin spoke of survival of the fittest, and in this case, only the cells that produce the correct outputs are considered “fit.” The researchers took the most successful 100 “offspring” and fed them back into the program, eventually producing 200 generations for some smaller circuits and 500 for larger ones. That means one “parent” design could wind up with 50 000 descendants, a number of possible solutions that’s too large to be created by other methods, says James Alfred Walker, a research associate with the group.
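The loop described above—score each circuit against the known truth table, keep the fittest, and mutate them into the next generation—can be sketched in a few lines. This is a deliberately minimal illustration, not the York group’s actual tool: the gate set, genome length, and population sizes here are invented for the example, and the target is a simple AND function built only from NAND and NOR gates.

```python
import random

random.seed(42)  # fixed seed so this illustrative run is repeatable

# Hypothetical sketch: evolve a fixed-length netlist of NAND/NOR gates
# until its final gate computes AND. Values 0/1 stand for logic levels.
OPS = {
    "NAND": lambda x, y: 1 - (x & y),
    "NOR":  lambda x, y: 1 - (x | y),
}
N_GATES = 4  # genome length: four gates; the last gate is the output
INPUTS = [(a, b) for a in (0, 1) for b in (0, 1)]
TARGET = [a & b for a, b in INPUTS]  # AND truth table

def random_gate(pos):
    # A gate at position `pos` may read the two circuit inputs
    # (indices 0 and 1) or any earlier gate's output (2 .. pos + 1).
    op = random.choice(list(OPS))
    return (op, random.randrange(pos + 2), random.randrange(pos + 2))

def random_genome():
    return [random_gate(pos) for pos in range(N_GATES)]

def evaluate(genome, a, b):
    vals = [a, b]
    for op, i, j in genome:
        vals.append(OPS[op](vals[i], vals[j]))
    return vals[-1]

def fitness(genome):
    # Only circuits that reproduce the known truth table count as "fit".
    return sum(evaluate(genome, a, b) == t
               for (a, b), t in zip(INPUTS, TARGET))

def mutate(genome):
    child = list(genome)
    pos = random.randrange(N_GATES)
    child[pos] = random_gate(pos)  # rewrite one "gene" at random
    return child

# Keep the fittest survivors each generation and refill with mutants.
population = [random_genome() for _ in range(50)]
for _ in range(200):  # 200 generations, as for the smaller circuits
    population.sort(key=fitness, reverse=True)
    elite = population[:10]
    population = elite + [mutate(random.choice(elite)) for _ in range(40)]

best = max(population, key=fitness)
print(fitness(best))  # 4 out of 4 means the netlist matches AND everywhere
```

Because NAND is universal, a two-gate solution exists (NAND followed by a self-fed NAND acting as an inverter), so this toy search converges quickly; real standard cells evolve over far larger genomes.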
Not all of the designs are necessarily desirable to industry, he says. Some might achieve the right result but do it by creating a short circuit. “It quite often creates these unconventional designs by making things that really aren’t feasible anymore,” he says. “You may actually create something that performs the necessary task but consumes vast amounts of power or is unacceptably slow.”
With thousands of possibilities in hand, the trick is to find which of the potential circuits can best deal with the variations chipmakers inevitably face. The researchers feed their candidate designs into a computer simulation developed by the Device Modelling Group at the University of Glasgow. The simulation shows how each design would perform based on the different manufacturing variations. Manufacturers could then pick the design that would be likely to give them the best combination of, say, speed and power consumption for their particular objectives.
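Picking "the best combination of, say, speed and power consumption" amounts to discarding any design that another design beats on both metrics at once. One standard way to frame that choice is a Pareto filter; the sketch below uses made-up delay and power figures, not real simulation output from the Glasgow model.

```python
# Hypothetical sketch: keep only the Pareto-optimal speed/power trade-offs
# among candidate designs. The (delay_ns, power_uW) numbers are invented.
designs = {
    "A": (1.2, 40.0),
    "B": (0.9, 55.0),
    "C": (1.5, 35.0),
    "D": (1.3, 60.0),  # dominated: slower *and* hungrier than B
}

def dominates(x, y):
    # x dominates y if it is no worse on every metric and differs somewhere.
    return all(a <= b for a, b in zip(x, y)) and x != y

pareto = {name for name, metrics in designs.items()
          if not any(dominates(other, metrics)
                     for other in designs.values())}
print(sorted(pareto))  # → ['A', 'B', 'C']
```

A manufacturer would then choose among the surviving trade-offs—fastest, most frugal, or something in between—according to its own objectives.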
Other groups are working on similar approaches. Walker says his team’s next step is to improve the algorithms and alter them to run on more-powerful computer clusters, so they can produce even more designs and give more accurate results. The team also wants to move beyond standard logic cells to more-complex functions, making their design tool even more valuable, he says.
About the Author
Neil Savage writes from Lowell, Mass., about lasers, LEDs, optoelectronics, and other technology. In March 2009, he reported on amazingly strong carbon nanotube–based artificial muscles.
To Probe Further
The IEEE Congress on Evolutionary Computation runs 18 to 21 May in Trondheim, Norway. Professor Andy Tyrrell, head of the York group, is chairing the conference.