Tech Talk

2009 May See Reprioritization of Nanotech Concerns

The news cycle for nanotechnology in 2008 is pretty easy to sum up: environmental, health, and safety concerns. Wiki projects, websites, and blogs were created last year, all of them strictly focused on combating the unknown, and largely unproven, dangers of nanotechnology.

But this year, as a friend noted to me as the year turned, "There will be far fewer hippies this year." Unsure of exactly what he meant, I pressed him, and he explained that all those people who had the time and the luxury to concern themselves with the plight of the polar bear will find that much more of their energy will be occupied with their own survival. In other words, setting up think tanks and government-funded research projects to combat threats that don't exist and may never exist will become a little harder to sell when people are facing foreclosure and unemployment.

So if saving the polar bear or defending our privacy from nanobots falls a little down the priority list of concerns for nanotech, what moves up?

Well, we have received our first real predictions for the state of nanotechnology in 2009: "Nanotechnologies In 2009: Creative Destruction or Credit Crunch?", authored by UK-based Tim Harper, CEO of Cientifica.

Apparently the white paper was inspired by Harper's involvement in the World Economic Forum's Summit on the Global Agenda, in which challenges in particular areas are examined; in this case, the area was nanotechnology.

While toxicology issues remained an anchor of the discussions (the summit was still held in 2008, after all), the effects of the economic crisis were already beginning to influence people's thoughts.

As a result, Harper lists five issues that he sees as having a significant impact on nanotechnology in 2009:

• Technology Funding: expect to see things getting worse before they get better

• Purchases of Worthless Intellectual Property

• Academic Funding and Spinouts

• CleanTech

• Nanotechnology Applications

While some have found the "thought-provoking" bits to be those in which Harper has clearly put himself out on the line, with predictions that can later be checked against events, the real meat of the piece is probably found in his urgings.

Nanotech as an area of technology commercialization finds itself in a strange spot. We are more than seven years into huge government and private funding of nanotech, and we can now begin to see the fruits of all that investment. Technologies are ready to be commercialized, and hard-fought profits can finally be won, but in many cases the financial pipeline will get turned off just as it was all about to pay off.

For those with the resources and the fortitude to operate in these perilous times, Harper believes that 2009 could make fortunes, while others may not survive the year.

In any case, claims by poorly informed environmentalists of a "nanotech industry" blinded by a "gold rush" will hopefully find their place in the greater scheme of things: down on the priority pole.

Cracks in Molecular Nanotechnology Orthodoxy Starting to Form

As I suggested might become the case last month, we are now beginning to see that the ideas of Eric Drexler in his Metamodern blog are not quite matching up with those of the proponents of molecular manufacturing.

Drexler presents some of his concerns and doubts about diamondoid mechanosynthesis (DMS), which has been most vigorously pursued by Robert Freitas and Ralph Merkle.

This has started a bit of a debate between the two parties, which is chronicled over at The Next Big Future blog. In it we get parsing of words, like the argument that DMS is not a "necessary first step" for molecular manufacturing but a "highly desirable first step."

Drexler seems to indicate that DMS is not only not a necessary first step but not a highly desirable first step either. Instead, he points to aqueous synthesis at room temperature as an attractive area of research.

This leads again to the debate over the "Wet" or "Dry" approach to advanced nanotech. While this debate constitutes a red herring according to some of the comments on Drexler's blog entry, it's hard to know how you graduate to automated exponential manufacturing if you don't have any basic building blocks.

In any event, the vacuum surrounding molecular manufacturing research appears to have been relieved. Perhaps this will lead to more experiments and progress.

No Surprise: Semi Sector Slumps Badly at 2008 Close

As you might expect, the global economic downturn has hit the semiconductor industry badly, particularly as the past year dragged to a close.

According to the Semiconductor Industry Association (SIA), semiconductor sales plummeted in November 2008 nearly 10 percent from comparable sales a year earlier. In a press release yesterday, the SIA reported that manufacturers sold US $20.8 billion worth of units, as opposed to $23.1 billion for the same 2007 period. The November sales were also down from October sales by 7.2 percent, indicating that the decline was accelerating.
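For the arithmetically inclined, those percentages square with the dollar figures; here is a quick back-of-the-envelope check of my own (the implied October figure is an estimate derived from the reported percentage, not an SIA number):

```python
# Back-of-the-envelope check of the SIA's reported declines.
nov_2008 = 20.8  # worldwide sales, billions of US dollars, November 2008 (SIA)
nov_2007 = 23.1  # same month a year earlier

yoy_change = (nov_2008 - nov_2007) / nov_2007 * 100
print(f"Year-over-year change: {yoy_change:.1f}%")  # about -10.0%, i.e. "nearly 10 percent"

mom_decline = 7.2  # percent drop from October 2008, as reported
oct_2008 = nov_2008 / (1 - mom_decline / 100)
print(f"Implied October 2008 sales: ${oct_2008:.1f} billion")  # roughly $22.4 billion (estimate)
```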

The SIA reported that sales in the Americas fell 19.5 percent while in Europe sales dropped 13.9 percent. The Asia Pacific region had the smallest decline, with sales down 6.2 percent.

"The worldwide economic crisis is having an impact on demand for semiconductors, but to a lesser degree than some other major industry sectors. We expect the industry will remain the second largest exporter in the U.S. for 2008," SIA President George Scalise noted.

"Not all segments of the industry are being affected equally by the downturn. The memory market which has been under severe price pressure throughout the year has seen sales decline significantly while many other product sectors have year-to-date sales above 2007 levels," Scalise added.

The SIA said the industry trade group -- whose members include Intel, Texas Instruments, and Advanced Micro Devices -- was moving quickly to prompt U.S. legislators to pass measures aimed at stimulating investment in the sector as part of the incoming administration's economic incentive plan.

That too should come as no surprise.

Will a New President Shake Up the U.S. Space Program?

NASA is facing a sea change in its mission of human spaceflight, moving from an orbital science orientation to a course of long-term exploration of the near solar system. The new strategy poses enormous challenges for the agency's management, which must rethink the way it marshals its resources.

Under the Bush administration, NASA has been tasked with transitioning from operating a fleet of shuttles, now finishing construction of the International Space Station (ISS), to developing a new generation of conventional launch platforms to send crews to the Moon and, eventually, Mars.

With the Obama presidency approaching, experts in the field are questioning how much of the new program will be embraced by the science advisors coming to a White House confronted with a compromised economy and different political realities.

A feature article in The New York Times today covers the many issues involved in the complex future of the space agency.

The first of these centers on NASA's current plan to ground its shuttles in 2010 and employ the services of the Russian space agency to continue servicing the ISS with its Soyuz spacecraft until 2015, when the new U.S. rockets are ready to fly.

During the presidential campaign, Obama said he supported the new rocket program, known as Project Constellation, but was critical of the proposed shuttle grounding over strictly financial reasons. In effect, he was endorsing a parallel track for both programs, despite the burgeoning costs. But conditions have changed dramatically in the months since the now President-elect spoke those words, and harsher realities may cause him to reconsider his position.

The second major issue concerns the management of the space agency itself. The Times article notes that longstanding complaints about a climate of inflexibility in the administration of NASA programs have never been fully addressed, let alone overcome.

The article states:

Some inside the development program have complained that it is run with a my-way-or-the-highway attitude that stifles dissent and innovation. Jeffrey Finckenor, an engineer who left NASA this year, sent a goodbye letter to colleagues that expressed his frustrations with the program. "At the highest levels of the agency, there seems to be a belief that you can mandate reality," he wrote, "followed by a refusal to accept any information that runs counter to that mandate."

That damning criticism echoes voices heard five years ago after the shuttle Columbia disintegrated upon reentry and a commission studying the disaster concluded that NASA's management philosophy contributed to a chain of events responsible for the failure.

The combination of unchanged intransigence and differences of opinion about direction could lead to quick changes in the space agency's leadership at the hands of a new administration in Washington.

NASA's current boss, Michael D. Griffin, who favors grounding the shuttles, is under contract through 20 January, Inauguration Day, when he will need a new presidential appointment to continue steering the agency.

Articles such as the one in today's Times do not bode well for his future prospects in that regard.

Presidents have historically tended to make up their own minds about which direction to pursue when it comes to the crucial matter of space exploration.

[Update 31 December 2008: In a bit of a controversial move, the wife of NASA Chief Administrator Michael D. Griffin posted e-mail to friends and family on Christmas Eve asking them to sign an online petition requesting President-elect Barack Obama "to consider keeping Mike Griffin on as NASA Administrator," according to a report today from the Associated Press. The online petition was started by former astronaut Scott "Doc" Horowitz of Park City, Utah, who is also a former NASA associate administrator. The AP article states that Mr. Griffin had no prior knowledge of the petition drive.]

Are You Ready for Some 3-D Football?

Hey, that halfback looks like he's coming right at me!

When the topic of 3-D pictures arises, most of us think about cheap cardboard glasses with red and blue plastic lenses allowing us to unevenly watch even cheaper monster movies (and then trying to regain normal vision afterward). Over the years, 3-D projects have come and gone like a vampire that refuses to finally die, even after the stake has been driven through its heart by the public, critics, and technologists alike. Still, some engineers working in the entertainment industry have steadfastly maintained that the big breakthrough in 3-D technology is just around the corner, one that could usher in an era in which consumers will adopt a new way of looking at the world on the screen.

Now, it appears the monster is back and looking for revenge. As reported in an article in the Boston Globe today, the latest incarnation of 3-D is being sponsored by no less an entertainment behemoth than the National Collegiate Athletic Association (or NCAA).

For the biggest football game on the NCAA calendar, the BCS National Championship Game in Miami (between the University of Florida and the University of Oklahoma) on 8 January, Fox Sports will beam a special version of its broadcast to a number of selected movie theaters across the United States using a new implementation of 3-D imaging technology.

The equipment to be employed by Fox Sports for the college championship game will be supplied by a small firm called 3ality Digital Systems, which focuses specifically on using its state-of-the-art 3-D digital imaging technology to capture live events such as concerts and sports matches. According to the Burbank, Calif., firm, its equipment ranges from robotic motion-control systems to image-processing electronics to specially made polarized eyeglasses.

"3-D is going to change the way we experience media," Steve Schklair, chief executive of 3ality Digital, told the Boston Globe. "We've shown it to former NFL players and they say, 'I've been on hundreds of football fields, and this is like standing on the field'."

That's a claim we've heard many times in the past. It may be true, but rightly or wrongly, 3-D still has a lot of historical baggage attached to it, baggage that will make the hill today's firms such as 3ality need to climb a steep one.

It may be a thrill to see a great football game as if you were on the field participating. Still, there's a reason they insist that fans stay in the grandstand.

Broadcast DTV Lives Up to Hype


With all the teeth gnashing, hair pulling and garment rending over the switchover from analog to digital TV (T minus 49 days and counting), the big picture tends to get lost in the kerfuffle: broadcast digital TV looks way better than analog. I discovered this a couple of weeks ago when I helped my parents make the transition, a story that joined all the others as part of Spectrum's special countdown coverage The Day Analog TV Dies (good riddance).

When I was down in Louisville for Thanksgiving, my dad asked me to help him switch his 20-year-old Sony TV from analog to digital. We went to RadioShack with his discount card for a converter box, but the card had expired the day before. Why bother expiring these cards before the switchover date? I answered my own question a couple of weeks later when I went back home to tend to some family matters (can you say "economic stimulus for the consumer electronics industry"?).

The TV in my parents' house holds the same special place that it does in a lot of homes that haven't hitched a ride on the Info Highway: it's the fireplace, the gathering spot, the focus of the living room and of leisure time. So when I arrived at the old homestead ahead of the replacement discount card, I knew it was important to do whatever it took to help my parents make the switch.

I rifled through the Sunday paper, looking at ads for LCD TVs. Sure, I could have gone out and paid full price for the converter box, but the countdown to Hanukkah had also commenced, and I put that $60 converter box expenditure into my own twisted gift-giving calculations. Their old TV was small, 19 inches, so bargain hunting was easy. There is a Circuit City (on life support, like all Circuit Cities) near their place, so I drove over to browse the pre-Xmas, post-Chapter 11 bargains. I found one almost immediately: a 22-inch Zenith. It was a floor model, on sale as is for about 60 percent off, at $150. I didn't hesitate. The picture (piped in through the store's cable feed) was at least as good as the Toshibas surrounding it and certainly better than the old Sony.

I bought the TV and a universal remote (the TV was sold as is, as in someone had lost the remote) and went back to my parents' house. They were thrilled with their early Hanukkah gift. But when we turned it on, we got nothing but static. Duh. This TV needed an external DTV antenna. So I went back to the local RadioShack and bought a $35 amplified VHF/UHF/FM indoor antenna.

Setup was a little funky, considering that the $10 universal remote couldn't control the Zenith's menu; I had to navigate the setup using the buttons on top of the set. I attached the antenna to the TV, plugged it in, and turned it on. I made the TV run through an auto-tuning routine. On the first pass, it picked up four channels; my parents' old TV could pick up eight. That didn't seem right, so I ran the auto-tune routine again: 12 channels this time. One station, WAVE 3, had three channels associated with it instead of just one: a crystal-clear HD channel for its regular broadcast, plus a 24-hour weather channel and a music channel, which had been discontinued (already? I guess the programming experiments will continue like this for years to come). Still, certain channels that had come through clearly on the old set were totally staticky in that digital, pixelated jigsaw mess of picture many DTV converts will become accustomed to in the next few weeks. So I ran the auto-tune routine again. And again. Finally, after some maneuvering of the antenna to put the rabbit ears in the optimum position, we had 27 channels, all of them amazingly clear.

I was shocked. The picture was at least as good as the one I got at home with an HD cable box hooked up to my Olevia monitor (no, it doesn't have a tuner; two years ago, opting for the tunerless monitor saved me about $500). My parents had never seen a picture so clear. And so, while I have been watching HDTV for a while now, I was fascinated, like a child with a new Xmas, I mean Hanukkah, toy. These signals were coming through the air? For free? And many of the stations had added new channels, all of them in beautiful HD color. For less than $200, my parents not only had their first new TV in 20 years, they had 27 HD-quality channels. Hanukkah Joe did good this year.

So far the only complaint is the slow-motion, pixelated picture breakup that happens for reasons unknown. You can be watching a show when suddenly the picture breaks down into small boxes that slide around the screen. Could it be the "bargain" TV gone haywire? Could it be simple signal interference? Maybe someone here has some ideas...

IEDM: 3d Stackenblocken

Actual footage of Intel's D1D production/development fab

This Conan skit explains not just what chipmakers are trying to do about the interchip communication problem, but also the problem that plagues the solution.

First: what is 3-D chip stacking, and why would you do it? Stacking layers of integrated circuits atop each other is a solution to interchip communication problems detailed in a recent Spectrum article on multicore's very bad, no good, terrible day.

At the heart of the trouble is the so-called memory wall: the growing disparity between how fast a CPU can operate on data and how fast it can get the data it needs. Although the number of cores per processor is increasing, the number of connections from the chip to the rest of the computer is not. So keeping all the cores fed with data is a problem.
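To put some rough numbers on that, here is a minimal sketch with assumed, illustrative figures of my own (nothing from the Spectrum article): the bandwidth the cores could consume grows with core count, while the pin-limited bandwidth stays flat.

```python
# Illustrative memory-wall arithmetic; every number here is an assumption.
bytes_per_flop = 2.0     # assumed operand traffic per floating-point operation
flops_per_core = 4e9     # assumed: 4 GFLOP/s per core
pin_bandwidth = 25e9     # assumed: 25 GB/s of off-chip bandwidth, fixed by the package pins

for cores in (1, 2, 4, 8, 16):
    needed = cores * flops_per_core * bytes_per_flop  # bandwidth the cores could consume
    sustainable = min(1.0, pin_bandwidth / needed)    # fraction of peak compute you can feed
    print(f"{cores:2d} cores: want {needed / 1e9:5.0f} GB/s, "
          f"sustainable fraction of peak = {sustainable:.0%}")
```

With these made-up numbers, a single core is perfectly happy while 16 cores can be fed only about a fifth of the time; the exact figures don't matter, the trend does.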

DARPA's exascale computing project showed that the barrier to exascale isn't flops; we have that figured out. Now the problem is the time that data spends in the chip's wiring. The interconnect bottleneck is upon us. And if these technology trends continue, exascale computing will still be "just a dream in 2015."

To deal with the bottleneck situation, the U.S. Department of Energy formed the Institute for Advanced Architectures and Algorithms, where researchers are exploring "tighter, and maybe smarter, integration of memory and processors." Sandia, for example, is looking into stacking memory chips atop processors to improve memory bandwidth.

Solution: Go Up, Not Out

It's not the world's newest idea. IBM showed 3-D stacking techniques at IEDM 2002. Six years later, IEDM 2008 attacked the problem with at least 17 papers and one dedicated panel. The papers featured all the heavy hitters: Tohoku University, CEA-Léti Minatec, IBM's T. J. Watson Research Center, IMEC, NEC, and Qualcomm, among others.

Qualcomm showed that the problem is not limited to processors: memory alone will need this stacking too, especially if it's going to follow the same trend as processors, which it surely will, given the push toward small handheld devices with the same amount of memory as desktop computers but with a much smaller footprint.

Belgian consortium IMEC demonstrated the viability of stacking two CMOS chips atop each other, sticking them together with dielectric glue and connecting the relevant parts with contacts called through-silicon vias (TSVs). As you might imagine, TSVs stick through the silicon substrate to whip information between short, stubby, vertical layers. According to one body of research, TSVs enable I/O power savings up to 98 percent.
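That 98 percent figure sounds dramatic, but it becomes plausible once you compare the energy cost of driving a package pin and board trace with that of driving a via a few tens of micrometers tall. The per-bit energies below are assumed, illustrative values, not numbers from the IMEC work:

```python
# Illustrative I/O energy comparison; both values are assumptions, not measurements.
off_chip_pj_per_bit = 10.0  # assumed: driving a package pin and a board trace
tsv_pj_per_bit = 0.2        # assumed: driving a short through-silicon via

savings = 1 - tsv_pj_per_bit / off_chip_pj_per_bit
print(f"I/O energy savings: {savings:.0%}")  # 98% with these assumed numbers
```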

So what material makes the best vias?

IBM looked at tungsten as a possible material. Others looked at graphene. But mostly, everything still seems to rely on copper, the standard interconnect material, or a copper composite.

One of the issues raised by copper, however, is electromigration: as current flows through the wire, the metal atoms eventually migrate and form voids, and the wire breaks.

Hong Kong University rose to the challenge with a technology whose acronym hovers just a carbon molecule from disaster: Cu/CNT (copper/carbon nanotube) TSVs promise to retain all the conductivity of copper without the electromigration problem or the low melting point. Their results showed that "the EM lifetime of the Cu/CNT composite is more than 5 times longer" than copper alone.
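For reference, the usual way to reason about electromigration lifetime is Black's equation, MTTF = A * J^(-n) * exp(Ea / kT). Here is a small sketch under the assumed simplification that the composite's benefit shows up as a modestly higher activation energy Ea; the parameter values are illustrative guesses, not numbers from the Hong Kong University paper, whose 5x figure comes from measurement:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def em_mttf(j_a_per_cm2, ea_ev, temp_k, a=1.0, n=2.0):
    """Black's equation: MTTF = A * J^-n * exp(Ea / kT)."""
    return a * j_a_per_cm2 ** (-n) * math.exp(ea_ev / (K_BOLTZMANN_EV * temp_k))

J = 2e6    # A/cm^2, assumed stress current density
T = 378.0  # kelvin (105 degrees C), assumed test temperature

cu_only = em_mttf(J, ea_ev=0.80, temp_k=T)  # assumed activation energy for plain copper
cu_cnt = em_mttf(J, ea_ev=0.86, temp_k=T)   # assumed slightly higher Ea for the composite
print(f"Lifetime ratio, Cu/CNT vs. Cu: {cu_cnt / cu_only:.1f}x")  # ~6x with these guesses
```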

The Next Problem

Even with the perfect interconnect, however, 3-D chip stacking still poses a number of problems that need to be dealt with before it will be possible to capitalize on even the best materials.

Depending on what you're stacking, you will run into different problems: logic on memory brings heat problems. (For DRAM, the upper temperature ceiling is 85 degrees C. Logic, however, often runs hotter than 125 degrees.) Memory on logic brings different problems. Chipmakers need to come up with some way of stacking the dies that protects the temperature-sensitive layers. And I don't know what they're going to do about the heat sink.
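A crude, one-resistor thermal estimate, with power and thermal-resistance values I've simply assumed for illustration, shows why that 85-degree DRAM ceiling is the constraint that bites:

```python
# Crude stacked-die thermal estimate; every value here is assumed for illustration.
ambient_c = 45.0        # assumed temperature at the heat sink
r_theta_c_per_w = 0.8   # assumed effective thermal resistance, stack to ambient
logic_power_w = 60.0    # assumed logic-die power
dram_power_w = 5.0      # assumed DRAM-die power

# Glued together, both dies ride on roughly the same combined heat flow.
stack_temp_c = ambient_c + (logic_power_w + dram_power_w) * r_theta_c_per_w
print(f"Estimated stack temperature: {stack_temp_c:.0f} C")
print("Over the 85 C DRAM ceiling" if stack_temp_c > 85 else "Within DRAM spec")
```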

The laundry list just keeps going: very thin dies will be used to stack memory on logic, and thinning them requires a lot of grinding. DRAM bit-cell retention can be adversely affected by the thinning and packaging process.

And that brings us back to Conan

The most important issue is the connections within the stack; reliable connections are particularly important between memory and logic. Lining up these complex CMOS mash-ups is going to be as tough as getting that perfect right angle, and the punishment for low die yield is probably understated in the Stackenblocken bit.

The 32-nm middle child

Intel's done it. According to Intel's roadmap, the first 32-nm chips will roll out in devices in late 2009. By now, 32-nm test chips put the node squarely in the development part of Intel's three-year research-development-manufacturing cycle.

TSMC did it in late September/early October, announcing two different 28-nanometer chipmaking processes slated for fabrication in 2010. IMEC did it and told us how at IEDM (immersion lithography).

The notable odd man out is AMD, which just released Shanghai in November, debuting its hot new 45-nm chip technology just in time to mark the one-year anniversary of Intel's first 45-nm Penryn chips.

With 45 nm already old hat for Intel, and 32-nm chips all set to go in Q4 of next year, all the talk at IEDM was of 22-nm and beyond. As I mentioned in my last post, this renders the 32-nm node effectively chopped liver. So what explains the middle child phenomenon?

22 is the new 45

The big advance that enabled Intel's 45 nanometer technology was the ability to contain leakage current with a new high-k metal gate (HKMG) process.

Intel in particular tends to pull out all the stops to get to one node, and then coasts through the next node. It seems intuitively obvious: you put all your R & D money into reinventing the transistor, and then you push that technology to its limits. By the time the technology has hit a brick wall, you're already working on the next big breakthrough.

For its 32-nm process, according to papers at IEDM 08, Intel is using a second-generation high-k gate material. They're manufacturing the new transistors with 193-nm immersion lithography tools. That's not so radically different from what they did at 45 nm (except that at 45, they used a combination of 193-nm dry tools from ASML and Nikon; now that they've dropped ASML, their immersion litho tools are solely Nikon). Aside from the exclusive choice of Nikon as vendor and the new immersion litho tools, this year's IEDM papers revealed that nothing in the process had changed significantly between 45 and 32.
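The reason a 193-nm tool can print features for a 32-nm node at all comes down to the Rayleigh criterion, half-pitch = k1 * wavelength / NA; putting water between the lens and the wafer pushes the numerical aperture past 1. The k1 and NA values below are commonly cited ballpark figures, not numbers taken from Intel's papers:

```python
# Rayleigh criterion: smallest printable half-pitch = k1 * wavelength / NA.
WAVELENGTH_NM = 193.0  # ArF excimer laser

def half_pitch(k1, numerical_aperture):
    return k1 * WAVELENGTH_NM / numerical_aperture

# Dry tool (assumed NA ~0.93) versus water-immersion tool (assumed NA ~1.35), same assumed k1.
print(f"Dry 193-nm tool:       {half_pitch(0.35, 0.93):.0f} nm half-pitch")
print(f"Immersion 193-nm tool: {half_pitch(0.35, 1.35):.0f} nm half-pitch")
# Roughly 73 nm versus 50 nm; note that a "32-nm node" doesn't mean 32-nm pitches everywhere,
# and the tightest layers still lean on lower k1 tricks and clever layout.
```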

But for the 22-nm node, it looks like lanthanum might be the new it-metal. One of the Intel papers discussed an advanced gate stack for 22-nm low-operating-power applications, with thin cap layers using lanthanum oxide or aluminum oxide on hafnium-based high-k.

45 is the new 90

The middle child phenomenon seems to be cyclical: consider the path from 90 nm to 65 nm node technology.*

In 2006, it was reported that Intel's 65-nm process used "second generation strained silicon with a 10-15% improvement on drive current." [italics mine] When Intel announced its upcoming 90-nm technology node in 2002, strained silicon was the big breakthrough.

Strained silicon allowed Intel to coast through 65, but it stopped working at 45, which is why the CMOS transistor had to be redesigned. So every time you see a second-generation process, it means the company hasn't had to reinvent anything to make the node shrink possible, which means there's a lot less research and a lot more development at that node. And that means you can save the R part of the R & D money for the next node.

My prediction? The next middle child will be the 16-nm node, which will exhaust the limits of the novel technologies that enable 22 nm.

Further support for my crystal ball is brought to you by the New York Times Bits blog, where John Markoff hinted last week that at 11 nm, tiny transistors will be made of III-V hybrids that Intel is just starting to explore.

It might be four chip generations, however, before Intel adds the new hybrid approach to its commercial chips, said Mike Mayberry, the company's director of components research.

Intel is now solidly at 45; four more generations (32, 22, 16, 11) bring us to 11. If these fancy III-V hybrids emerge at the 11-nm node, 16 nm is sure to be the next chopped liver.
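The node names themselves encode the pattern: each generation is roughly a 0.7x linear shrink, which halves the area and is how four steps get you from 45 nm to 11 nm. A quick check:

```python
# Each process generation is roughly a 0.7x linear shrink (about 0.5x the area).
node_nm = 45.0
for _ in range(4):
    node_nm *= 0.7
    print(f"{node_nm:.1f} nm")
# Prints roughly 31.5, 22.0, 15.4, and 10.8: close to the 32, 22, 16, and 11 nm node names.
```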

*This pattern is different for different companies. For IBM, the middle child appears to be 45 nm, which AMD is using for Shanghai, but which had been heavily delayed.

AMD has a cross-licensing agreement with IBM, which was supposed to roll out HKMG technology for the 45-nm node in 2008. But though AMD "desperately needed" high-k at 45 nm to keep up with Intel, EETimes' 2007 IEDM coverage stated that

IBM has yet to roll out high-k in a product. ... [I]n a move that raises questions about the readiness of IBM's technology, AMD said it has yet to make a commitment to use high-k at 45 nm, saying instead that the dielectric shift is an "option" at that node.

Some companies are just going to high-k at 32: IBM and its partners have now announced that their 32-nm HKMG devices will be available to IBM alliance members in the second half of 2009. That includes AMD, but we haven't seen HKMG out of AMD in a product.

Holiday Greetings From the Moon: 40 Years Ago

On 24 December 1968, the crew of Apollo 8 made rendezvous with the moon. That day they became the first humans to journey to another celestial body. The success of their mission laid the groundwork, several months later, for the astronauts of Apollo 11 to land on the lunar surface and return safely to our planet.

The flight of Apollo 8 produced one of the most historic images of all time, Earthrise at Christmas. During their ninth orbit of the moon, the crew took a moment out from their engineering tasks, in consideration of the date, to read passages from the Bible in a live broadcast to the people of Earth.

"In the beginning God created the Heaven and the Earth," Pilot James Lovell began. Then he and his crewmates continued to read aloud from the opening of the Book of Genesis.

Commander Frank Borman concluded the broadcast with the following words: "We close with good night, good luck, a Merry Christmas, and God bless all of you -- all of you on the good Earth."

The passing of the decades has done little to diminish the significance of their message.

May you also share in the spirit of that historic day at this special time of the year.

Virtual Colonoscopy Takes A Real Step Forward

On 19 December, the president of Stony Brook University in New York announced that the university had licensed technologies for virtual colonoscopy invented there, including a computerized technique that makes it possible to see colon walls without the patient having to evacuate the bowels, to Siemens, one of the world's leading makers of medical devices. Virtual colonoscopy uses computerized tomography to create 3D images of the colon, eliminating the need for the fiber-optic endoscope that is snaked through the gastrointestinal tract in a conventional colonoscopy. Stony Brook researchers recently patented a refined electronic colon cleansing technique that will allow clinical radiologists to delete fecal matter and fluids from the 3D colon images so they can see the gut, the whole gut, and nothing but the gut.
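Stony Brook's patented method isn't spelled out in the announcement, but the general idea behind electronic cleansing is well established: the patient drinks a contrast agent that tags residual stool and fluid so it shows up bright in the CT data, and software then reclassifies those bright voxels as air before rendering the 3D colon surface. Here is a minimal, hypothetical sketch of that reclassification step, with an assumed Hounsfield-unit cutoff and no claim that this resembles Stony Brook's actual algorithm:

```python
import numpy as np

def electronic_cleanse(ct_volume_hu, tag_threshold_hu=200, air_hu=-1000):
    """Reclassify contrast-tagged material as air in a CT volume (toy version).

    ct_volume_hu: array of Hounsfield units within the segmented colon region.
    tag_threshold_hu: assumed cutoff above which a voxel is treated as tagged stool or fluid.
    """
    cleansed = ct_volume_hu.copy()
    cleansed[cleansed > tag_threshold_hu] = air_hu  # "delete" the tagged material
    return cleansed

# Toy example: colon wall (~40 HU), tagged fluid (~400 HU), air (-1000 HU).
sample = np.array([40, 400, -1000])
print(electronic_cleanse(sample))  # [   40 -1000 -1000]
```

Real electronic-cleansing algorithms do considerably more than threshold: they have to handle partial-volume voxels at the boundaries between air, tagged fluid, and tissue so the colon wall itself isn't carved away along with the contrast material.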

