Tech Talk

Devices for Diabetics Expand Inward and Outward

Funding and advising the development of an artificial pancreas is a major long-term initiative at the FDA. A couple of interesting advances have recently been made, both commercially and in research, that seem to bring us closer to this goal.

According to the FDA, an artificial pancreas would consist of three components:

"(1) an infusion pump to deliver the required drug, many of which are already available; (2) a continuous glucose monitor, several of which have been approved by the FDA for tracking and trending glucose levels; and (3) an algorithm to communicate between the pump(s) and glucose monitor. An algorithm will receive information from the glucose monitor and convert it to instructions for the infusion pump."

Some continuous blood glucose monitoring devices are already available on the market (here's a good comparative chart). All give periodic updates of blood glucose levels measured from a sensor inserted just beneath the skin. But all fall short, in some serious way or another, of what an artificial pancreas would require. A huge problem, it seems, is the lifespan of the device. The sensors for these models last only a few days and have to be reinserted regularly. Furthermore, the sensor is only partially implanted and connects to a transmitter through the skin.

Last month, engineers at the University of Calgary published an alternative design that mounts a glucose sensor onto a transponder chip. An external reader inductively powers the chip while reading the glucose level, eliminating the need to hook it up to a battery-powered transmitter. This makes the device very small, and thus more durable in the body. Removing the need for a battery also means that the entire chip and sensor can be fully implanted under the skin.

The design also uses an alternative chemical reaction to measure the glucose levels in the body, one that doesn't require oxygen. The oxygen-driven reaction used by other devices produces hydrogen peroxide that can corrode the sensor.

The device hasn't been tested in a living organism yet, and when that happens it will be interesting to see how accurate it actually is. But these are definitely ideas that could improve the available models.

Another thing that is changing is the extent to which these glucose monitors can communicate with computers and other devices. The MyGlucoHealth system uses a traditional pin-prick glucose meter but outfits it with a USB cable and Bluetooth capabilities that make it possible to synchronize data with a diabetes management system. It also keeps doctors and patients up to date on individuals' glucose level fluctuations via text messages and email. This kind of network is likely to be vital in a system that closes the loop with an automatic insulin infusion pump.

Web 2.0 Meets Public Engagement in Nanotechnology

The UK government is taking this idea of public engagement for nanotechnology quite seriously. And it seems that the interactive capabilities of Web 2.0 were just the tool it needed to put this seriousness to work.

First we had BIS (the Department for Business, Innovation & Skills) launch a website earlier this month that urged people to offer their opinions on the UK government’s nanotechnology strategy and even shape its final form.

The premise of the BIS site was characterized by at least one UK-based nanotech expert as a “crowd-sourced nanotechnology strategy”. The site provides a SWOT analysis for each chapter (the chapters are divided between cross-cutting themes and industry sectors), and each chapter poses a handful of questions.

But for all the questions it remains a fairly static site. The questions are already posed for you rather than you posing your questions, for instance. And visually it gives off the aura that this material is not to be touched. One might say it’s the 1.0 of the Web 2.0 in design and feel.

On the other hand, a new UK public engagement website called Nano&me, set up by an organization called the Responsible Nano Forum and funded by a grant from the Department for Innovation, Universities and Skills, takes the visual and interactive capabilities of Web 2.0 and turns them up to 11.

I should say in the name of full disclosure that I helped in the editing of some of the site’s copy. But this material almost seems incidental in the context of the site, which provides every visitor an opportunity to produce their own copy, their own point of view and to set the ground rules for the debate. Quite different from the BIS site, which tells you what the questions are and asks you to just respond to those.

I think if one were to really press the owners of these two sites on what they expected these sites to do, the answer you would probably finally get is that they are experiments, and the truth is the outcomes are quite uncertain.

As a self-confessed cynic, I am not sure that these sites perform much of a civic duty, other than to give politicos something to cover their pudendum and to give the public the false sense that they are actually involved in shaping some policy, even if it’s something as esoteric and ultimately meaningless to them as nanotechnology.

But even as a cynic, I have to admit that it is hard to know how these sites will turn out and what kind of impact they will ultimately have.

Measure for Measure

My daughter is moving to Colorado, and she and her older brother recently took a four-day road trip from his house in Pennsylvania to her new apartment. They both posted their photos on Shutterfly; many of the shots are essentially the same.

The big difference between their photo albums—and it makes a huge difference—is that he took a few extra minutes to add captions. So from his pictures I know that the bronze statue of Abe Lincoln they both stood next to is in Vandalia, which was once the capital of Illinois; that the baseball game they went to was in Kansas City; that the giant cross by the highway is in Effingham, Illinois—which is enough information for a Google search that says the cross is 198 feet tall and was built by the Cross Foundation.

Of course, location information, such as Vandalia, Ill., and Kansas City, Mo., can already be included in a photograph’s metadata, if the camera has GPS. In my ideal universe, all cameras would, and they’d even have little keyboards so you could add a caption right when you take a picture. Photo metadata is phenomenally useful, and, in a world of photo clouds like Shutterfly and Flickr, it’s getting ever more so.

You know what else needs metadata? Engineering numbers. That’s the premise behind Allen Razdow’s new start-up, True Engineering Technology.

Razdow was a co-founder of MathSoft, the company behind Mathcad. In a way reminiscent of Stephen Wolfram’s idea that the Mathematica universe would benefit from databases that could be queried (thus, Wolfram Alpha), Razdow decided that the software on an engineer’s desktop, like Mathcad—but also Microsoft Office—needed metadata for the numbers that move from one program to another. If I had to choose one of these ideas as a winner, it would be Razdow’s.

The idea apparently came slowly to Razdow, who wrote MathCAD back in the 1980s (for MS-DOS!), and with good reason—it requires you to think of numbers in a paradoxical way. While the common conception of computers is that they turn everything into numbers, Razdow’s insight is that the reality is just the reverse.

We take an engineering number, maybe it’s the hydrogen permeability of palladium, or the specific gravity of the railroad ties you just shipped to a customer, and put it into a report that strips it of almost all of its meaning—what reference book the number came from, or when and where it was measured and by whom, the tolerances, and so forth. We take numbers that are ripe with engineering meaning and mathematical context and turn them into flat text. Often—and paradoxically this happens particularly with those bastions of number-crunching, spreadsheets—you don’t even directly know the unit of measurement, because that’s contained in a column heading or a footnote or some other surrounding text.

Consider all the numbers that get used and reused for years, within your company and outside of it. Imagine you’ve worked out a more precise measurement of the hydrogen permeability of the particular palladium alloy to be used in an upcoming product. Or auditors from the Nuclear Regulatory Commission or the Food and Drug Administration have arrived to examine a new report that contains 70 different key engineering numbers that need to be checked, 50 of which were taken from a report that was vetted last year.

Razdow has in mind a plug-in that would encourage you to create metadata for important numbers and would let a number retain that metadata when it’s cut and pasted from one application to another, whether it’s Mathcad to a Word document, or to a PDF, or vice versa. You can hover over someone’s number and see some of that metadata, or, if they make it public, you can click on it and get all of it from a Web page devoted to that number on a public site that Razdow’s company will maintain. True Engineering Technology will make its money by selling a server appliance that will host and manage engineering numbers within an enterprise.
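To make the idea concrete, here is a minimal sketch of a metadata-carrying number; the class, field names, and values are my own illustration, not True Engineering Technology's actual format:

```python
# Sketch of a metadata-carrying number: wrap a value with its unit,
# source, and tolerance so the context travels with it when pasted
# between applications. Everything here is hypothetical.

import json
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class TrueNumber:
    value: float
    unit: str
    quantity: str        # what the number measures
    source: str          # reference book, report, or measurement record
    tolerance: float = 0.0

    def to_json(self) -> str:
        """Serialize the number and its metadata for cut-and-paste."""
        return json.dumps(asdict(self))

    @classmethod
    def from_json(cls, blob: str) -> "TrueNumber":
        return cls(**json.loads(blob))

perm = TrueNumber(
    value=1.6e-8,
    unit="mol m^-1 s^-1 Pa^-0.5",
    quantity="hydrogen permeability of palladium",
    source="hypothetical lab report, 2009",
)
copy = TrueNumber.from_json(perm.to_json())  # the metadata survives the round trip
```

Because the unit, source, and tolerance ride along with the value, a spreadsheet or report that receives the pasted blob no longer has to reconstruct that context from a column heading or a footnote.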

It’s a clever and much-needed idea. Like a lot of other Web 2.0 notions these days—think RSS, for example—it will need widespread adoption by users—in this case engineers—and the software applications that they use. Here’s hoping that happens.

How Will Nano Change the World?

Thus is posed the question for the new video contest put on by the American Chemical Society. In the first contest, the question was simply “What is Nano?” and it turned out that question was best answered with puppets in full-throated song.

I enjoy watching videos as much as the next guy, but I have not quite figured out what purpose these videos are supposed to serve other than to compete in a contest. Are the editors of the ACS’ Chemical & Engineering News supposed to become informed of how nanotechnology is going to impact the world in a way they hadn’t considered before? Are these videos supposed to become teaching tools for preschoolers, as with the puppet video?

I am entertained but I don’t get the purpose, or maybe there isn’t one.


The National Register of Historic Newspapers

When you hear analysts explain what's killing the newspaper business, the answer invariably boils down to two words: “The Internet” or even one: “Craigslist.” A recent CNET story, “Pew Center illustrates how Craigslist is killing newspapers” is typical.

The use of online classifieds sites, such as Craigslist, has more than doubled in the past four years, according to a study published Friday by the Pew Research Center. At the same time that Web classifieds are on the rise, the classifieds business that newspapers once depended on has collapsed.

So it was interesting to hear David Simon interviewed on Bill Moyers Journal. He was on the show back in April, though I listened to it only this week, through the miracle of podcasting. Simon is famous for creating “The Wire,” the HBO series that many critics think is the greatest television show of all time (see, for example, here, here, and here). But before that, he was a reporter for the Baltimore Sun for more than a decade, and he had some interesting things to say about newspapers and the Internet:

Yes, we were doing our job. Making the world safe for democracy. And all of a sudden, terra firma shifted, new technology. Who knew that the Internet was going to overwhelm us? I would buy that if I wasn't in journalism for the years that immediately preceded the Internet because I took the third buyout from the "Baltimore Sun." I was about reporter number 80 or 90 who left, in 1995. Long before the Internet had had its impact. I left at a time-- those buyouts happened when the "Baltimore Sun" was earning 37 percent profits.
You know, we now know this because it's in bankruptcy and the books are open. 37 percent profits. All that R&D money that was supposed to go in to make newspapers more essential, more viable, more able to explain the complexities of the world. It went to shareholders in the Tribune Company. Or the L.A. Times Mirror Company before that. And ultimately, when the Internet did hit, they had an inferior product-- that was not essential enough that they could charge online for it.
I mean, the guys who are running newspapers, over the last 20 or 30 years, have to be singular in the manner in which they destroyed their own industry. It-- it's even more profound than Detroit making Chevy Vegas and Pacers and Gremlins and believing that no self-respecting American would buy a Japanese car in 1973. That-- it's analogous up to a point, except it's not analogous in that a Nissan is a pretty good car, and a Toyota is a pretty good car. The Internet, while it's great for commentary and froth doesn't do very much first generation reporting at all. And it can't sustain that. The economic model can't sustain that kind of reporting. And to lose to that, because you didn't - they had contempt for their own product, these people. I mean, how do-
BILL MOYERS: The publishers. The owners.
DAVID SIMON: Yes, how do you give it away for free? You know, but for 20 years, they looked upon the copy as being the stuff that went around the ads. The ads were the God. And then all of a sudden the ads were not there, and the copy, they had had contempt for. And they had-- they had actually marginalized themselves.
By the time the Internet had its way, I mean, they're down to 180 now. You don't cover the City of Baltimore and a region like Central Maryland with 180 people. You don't cover it well.

The problem doesn't stop at newspapers. (The irony of viewing the entire Moyers show online for free, and reading a free full transcript of it, almost goes without saying.)

Back in March, eMarketer had a rather terrifying article about the magazine business.

It reported that in the U.S. alone, “525 magazines were shut down in 2008” and that “consumer magazine print ad spending in 2008 was down 7.1%.”

And yet, eMarketer's formula is what David Simon might call the same trip down the rabbit hole:

The big move in publishing, however, is online. But the transition may be coming too late for many titles. “While there are pockets of innovation, many print brands have fallen far short of the mark when it comes to their online presence,” says Ms. Krol. “Given the state of the business, publishers need to act quickly to capitalize on brand assets and provide accessible, compelling content to readers, who already have access to a wealth of content online.”
“You have to get your brand online-95% of the magazines out there haven't really done that,” Jim Spanfeller, CEO of, tells eMarketer. “They're not putting out a product that is compelling for an online audience.”
Digital ad revenues for consumer magazines averaged 6.4% in 2008, according to an analysis of 11 major magazine group publishers by Advertising Age, suggesting the publishers have a long way to go in building their digital businesses.

I imagine David Simon summing it up this way: In other words, publishers should put even more of their eggs in the anemic online ad-revenue basket.

And it's not as if Forbes is immune. Far from it.

As it happens, also in March, 24/7 Wall Street had a remarkable and depressing analysis of the business prospects for the business weeklies, “The Sun Sets On BusinessWeek, Forbes, And Fortune.”

The May 11 issue of Fortune Magazine is a perfect demonstration of what the three largest business magazines have done for decades. Its cover story, “How Bernie Did It,” is the culmination of a four-month investigation into the details of Bernie Madoff's life and business operations written and reported by three of Fortune's best editorial staff members, one of whom is a Pulitzer Prize winner. This issue of Fortune is also an example of why the magazine and its competitors, Forbes and BusinessWeek, will soon no longer be able to publish these kinds of stories. The May 11 issue has 92 printed pages and covers. There are only 21 pages of paid advertising compared with more than a hundred pages in a spring issue 20 years ago.

According to the article, “All three of these magazines lost money in the first quarter” although actual numbers are unavailable because Fortune and BusinessWeek are part of larger media organizations and Forbes is privately owned. The article says BusinessWeek “is in the worst shape of the three” and cites industry experts as saying it “has lost money for two years and will lose over $20 million this year if its advertising continues to move down at its current rate and the operation does not make large cost reductions.”

Its advertising pages fell 16% in 2008, according to data from industry research letter MIN. The magazine's ad pages are down 38% this year through the end of April, and in the most recent issue, the drop was an extraordinary 63%. The magazine has more than 220 editorial, support, and management personnel based on the BusinessWeek masthead. This does not include ad sales, production, or circulation staffs. That is a large number of people to put out a magazine that often has fewer than 60 editorial pages and a website with less traffic than based on March figures from online audience research firm comScore. BusinessWeek online had 3.3 million unique visitors and 18 million pageviews., which has a larger monthly audience, had advertising sales of only $30 million in 2008, based on its 10-K, and that number certainly dropped in the first quarter of this year. That is probably a good benchmark for what BusinessWeek brings in for online advertising.

The article concludes “BusinessWeek will not be a weekly magazine with over 200 employees and a rate base of 900,000 at the end of the year. BusinessWeek will have to become a much, much smaller operation.”

As for Fortune, the article says, “management has said that the magazine still makes money, but based on most definitions of profit that is almost certainly not true.” After looking at its large staff and modest Web revenues, the article says “It will have to cut costs and its choices are similar to BusinessWeek's.”

Things are only a little better at Forbes: “Ad pages at Forbes were down 17% last year and are down 19% year-to-date. The most recent issue's ad pages were 33% lower than they were in the same issue last year.” The good and bad news is that staff has already been cut, and not just on the print side:

Forbes has a financial advantage over its two competitors. It has already gone through two large staff layoffs which totaled about 70 people, about 15% of the staff who worked at Forbes and …

24/7 Wall Street notes, though, that Forbes has a strong Website, just as CEO Spanfeller bragged to eMarketer:

The print business at Forbes is doing as poorly as it is at BusinessWeek and Fortune. Forbes has the advantage of a much larger audience online. In the US, it has almost 5.6 million unique visitors and 66 million pageviews. Revenue from the Forbes online business is between $70 million and $80 million, but is not growing. Forbes management might say that its online operations are profitable and that its print business loses money. It is convenient to separate the two businesses, but they share so many resources, that this is not a realistic description of the Forbes overall business.

If we want investigative journalism, whether it's of local politics in Baltimore or business scandals like Madoff's twenty-year Ponzi scheme, we have to be willing to pay for it. So far, the reading and voting public has shown no appetite for doing so. We're right now dismantling journalistic enterprises that took decades to build.

One is reminded of New York's glorious Pennsylvania Station, part pink granite Doric columns, part iron art deco archways. When it was thoughtlessly torn down in 1963, the loss was so mourned that it led to the National Historic Preservation Act of 1966 and its National Register of Historic Places Program.

The equivalent losses have already been felt, and then some: the 146-year-old Seattle Post-Intelligencer, the Rocky Mountain News (140), the Tucson Citizen (138), and the Cincinnati Post (128). But, as you can imagine, David Simon has a theory about why there will be no national historic preservation act for investigative journalism. In his interview, Bill Moyers paraphrased something Simon had said in an earlier interview: "Oh, to be a state or local official in America... without newspapers. It's got to be one of the great dreams in the history of American corruption."

As the printing presses of these journalistic giants grind to a halt, they of course leave websites, ones about as nondescript as the current Penn Station, but woefully understaffed and far less useful. And like the old Penn Station, they won't be rebuilt. Nor will the great magazines whose demise will surely follow.

Responsible Recycling Video Contest Winners Announced

E-waste is a serious problem, and you can help by taking your e-waste to a responsible recycler. That's the message; the challenge to video producers, amateur and professional: turn that message into an interesting and informative video of a minute or so.

The winner: Michael Herp from Aubrey, Texas.


The second-place video:


For more information about the contest, and to view the third-place winners, visit the Silicon Valley Toxics Coalition.

Iberian Nanotech Center Opens with Promise of a New "Age of Discovery"

Assorted Spanish and Portuguese dignitaries assembled for the opening of the new International Iberian Nanotech Laboratory located in Braga, Portugal.

Given such an occasion and such an assembled cast of politicos and even royalty, we were bound to hear some reference to Spain and Portugal’s golden age of discovery. And we were not disappointed.

 "In the age of discovery, we had a lot of success. With this project, Portugal and Spain will chart a new atlas of innovation and will make new discoveries," said Spain’s president, Jose Luis Rodriguez Zapatero.

I commented somewhat skeptically on this research center earlier this year.

I am sure it will be a huge success, as far as we'll ever know. However it turns out, it should prove interesting to see how this center develops, since it should serve as a benchmark for other countries that build large, expensive nanotech research centers without much of a foundation for them in either their scientific or industrial communities.


Digital wizards bring back NASA's lost moon walk video

Today marks the 40th anniversary of Neil Armstrong’s first steps onto the moon. I remember staying up past my bedtime to watch the blurry images on the little black-and-white TV in my New Jersey living room.

NASA later recorded over the original videotape of those images in order to save the cost of buying new tapes. That didn’t mean the recording of the event was lost, but the only copies remaining were simply that, copies, mostly made by TV broadcasters, and they had even less resolution than the original.

NASA turned to Hollywood, specifically, Lowry Digital Images, to fix the problem. Lowry’s an amazing company that works magic on old film and video—I first encountered the firm’s work when profiling then-Lowry employee Ian Caven for IEEE Spectrum’s Dream Jobs Special Report. Caven devised original solutions to difficult image processing problems for Lowry, developing software to remove interference, mold damage, and flicker from old movies.

To restore the Apollo 11 images, the digital imaging specialists at Lowry used a number of copies of the video, including an 8 mm film recorded by pointing a handheld camera at a video monitor at Mission Control. Some sections of the video existed only on this 8 mm film. Lowry Digital used its standard temporal image processing technique, in which it compares information from large numbers of consecutive frames to calculate the optimal contrast, resolution, and noise level, to process the images. It also developed new techniques to fix brightness, ghosting, and smearing. NASA instructed the restorers to leave in some of the original flaws, like dirt on the camera lens, for authenticity.
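The temporal technique can be illustrated in miniature. For static parts of a scene, taking the median of the same pixel across consecutive frames votes out random dropouts and spikes while preserving the underlying signal; this toy version (my own sketch, not Lowry's software) treats each frame as a flat list of pixel values, and real restoration would also have to align moving content between frames:

```python
# Toy illustration of temporal image processing: combine the same pixel
# across consecutive frames with a median to suppress random noise.

from statistics import median

def temporal_median(frames):
    """Combine consecutive frames pixel-by-pixel with a median vote."""
    return [median(pixels) for pixels in zip(*frames)]

# Three noisy captures of the same 4-pixel frame: the outliers
# (a dropout at 0 and a spike at 255) are voted out by the median.
frames = [
    [100, 50, 200, 30],
    [102, 0, 198, 255],
    [98, 52, 202, 28],
]
restored = temporal_median(frames)  # -> [100, 50, 200, 30]
```

Lowry's actual pipeline compares far more frames and also estimates contrast and resolution from them, but the median vote captures the core idea: information that appears consistently across frames is signal, and what doesn't is noise.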

Some short highlights from the restored video are available now; the complete restoration project is slated to wrap up in September.

Q&A With: Michael D. Griffin, Former NASA Administrator

As the Endeavour shuttle headed into orbit and a new NASA administrator was being approved, the most recent chief of the U.S. space program offered his thoughts on his term in office.

Michael D. Griffin, 59, served as the eleventh administrator of NASA from April 2005 to January 2009. He was brought in to oversee the space agency in the wake of the 2003 Columbia shuttle disaster, to restore public confidence in the space effort, and to begin planning for the next generation of American space exploration projects. His accomplishments at NASA include the restoration of the space shuttle program, renewal of work on the International Space Station, repair of the Hubble Space Telescope, and an increased emphasis on earth sciences. He resigned in January this year to enable the new presidential administration to name its own candidate.

Griffin has since accepted a position as a professor of mechanical and aerospace engineering at the University of Alabama at Huntsville, where he will serve as the school's first Eminent Scholar in Engineering, interacting with the city's burgeoning high-tech community.

He granted us an interview on 15 July from his new home in Huntsville, where he was watching the latest shuttle launch, as well as coverage of the Senate's approval of his successor, Charles Bolden. We started with an even more historic NASA moment, though.

Spectrum: This is the week Apollo 11 took off on its voyage forty years ago. Where were you on 20 July 1969, and did the moon landing influence your career decisions as a young man?

Griffin: I was surrounded by a bunch of friends, all of whom were absolutely fascinated by what was happening. And no, the moon landing had nothing to do with my career decision. I was fascinated by science and engineering generally and spaceflight in particular from the time I was five years old.

Spectrum: What do you rank as your most satisfying accomplishment as NASA administrator?

Griffin: I was very proud of the team of senior career civil servants that I assembled during my tenure as administrator. I think it was one of the most competent groups of senior managers ever assembled at NASA.

Spectrum: Why should the engineering societies support the retirement of the space shuttle fleet and replace it with next-generation spacecraft, as envisioned in the Constellation program you have championed?

Griffin: Because it is time to retire the shuttle. Irrespective of other considerations, it cannot take us where we want to go again -- out beyond low earth orbit, to the moon, the near-earth asteroids, and Mars. We should build the new Constellation systems because [our engineers] can.

Spectrum: Where do you think humanity is headed in the exploration of space?

Griffin: There is no question, as I see it, that humans will explore our solar system and, eventually, learn to exploit its resources for economic benefit. And, who knows, maybe in the more distant future we will find a way to reach the stars. The real questions are which humans and when?

About the Author

Kieron Murphy is a contributing editor to IEEE Spectrum.

Top 10 Facts Gleaned At EUV Litho Workshop

Ultraviolet light bulb
IMAGE CREDIT: Wikimedia Commons / Artist: Anakin101

Ah, chipmaking: Unlike the early days of lithography, it’s no longer as simple as just shining some light onto a mask and creating a pattern. Now, the actual photons have to be created through torturous means. That means starting from scratch, more or less, in terms of light sources, machine configuration, photoresist and mask design, among other things (oh, so many other things). I’ve collated the ten most interesting new things I learned about extreme ultraviolet lithography at the EUV Litho workshop last week in Honolulu, Hawaii.

10. The end of Moore’s Law is always seven years away.
Lithography guru Chris Mack explained this tenet in a tutorial on microchip lithography, his tongue firmly planted in his cheek. Why seven?

The next generation (say, the chips with 32-nm feature sizes Intel plans to ship this fall) is already in development; the following generation is in full R&D phase (22 nm in 2011); and the one right after that is already a gleam in the eye of the semiconductor researcher (15 nm in 2013). The one after that (11 nm) is the one no one’s really thought about yet (2015?), and they see only problems without solutions. And since that adds up to about seven years, it’s the ever-moving horizon beyond which all progress breaks down: human sacrifice, cats and dogs living together, transistors obstinately refusing to shed another nanometer.

Mack, as I have mentioned, has a bet with workshop organizer Vivek Bakshi that EUV will go the way of the dodo, or at least the way of 157-nm lithography: “Everyone will talk about it and talk about it for two more years,” he predicted, “and then after two years no one will ever talk about it again at all.”

The stakes? Mack’s Lotus. Bakshi says he already has a set of “EUVL” license plates—in fact he has two, because he “accidentally” ordered an extra set. Next year, he says, he’s going to give the annual EUV Litho workshop award in the form of a laminated EUVL license plate.

9. You can make 15-nm features with 193-nm light.
At least, that’s what Intel’s Sam Sivakumar said in his keynote on Wednesday. I’m talking about the 15-nanometer process node, which, as we know from Bill Arnold’s article, is kind of an invalid concept these days. But for purposes of comparison, today’s cutting-edge chips are at the 45-nm node, meaning the smallest features (usually transistors) have 45-nm dimensions. It’s more complicated than that, so if you like, check out Jeff Hecht’s tutorial.

Sivakumar speculated that Intel would be hedging its bets for the 15-nm node (coming to a laptop near you in 2013, which is a lot closer than you think). Though Intel is on track for development and production of those chips using EUV lithography, they also have a side bet on “extreme double patterning,” which is a term I just made up. It’s double-patterning lithography, which Intel is already going to start using for the 32-nm chips it will ship this fall, but twice as much of it, or let’s call it quadruple patterning. That should slow down their wafer throughput (now at 150, and then divide that by four) sufficiently to motivate them to get EUV on the roadmap.

8. There’s more than one way to skin an EUV photon.

Apparently there is a big difference between the light source that will produce the 13.5-nm photons necessary for lithography, and the light source that will produce 13.5-nm photons for inspecting the masks. The key is finding defects that would interfere with the patterns and make them useless, and researchers don’t know yet how big a defect has to be to get in the way, or how many of them will cause a problem. It’s still very much a work in progress.

The masks are multilayered, and every layer has to be perfect (all sorts of numbers for maximum defects per square centimeter populated the week’s presentations, and they were all incredibly low). The only way to probe down into the mask layer cake to check for those defects is with 13.5-nm photons.

My first thought was that a scanning electron microscope or an atomic force microscope could tackle that job, but no, says Debbie Gustafson, sales VP at Energetiq, a company that produces light sources. “AFM would be much too slow,” she says, “and even if it weren’t, both SEM and AFM only image the surface layer.”

And the light source has to be brighter than the EUV photons that imprint the pattern: 900 kilowatts per square millimeter, according to one presenter from Ritsumeikan University.

7. Tin vs. Xenon: This time it’s personal.
Such light is created not by exploding tin droplets with a carbon dioxide laser (the so-called laser-produced plasma, or LPP, that creates the images), but with xenon gas and electrodes. An electrical discharge heats a cloud of xenon, magnetic fields constrain the resulting plasma into a small space, and EUV photons are released. This is called discharge-produced plasma, or DPP, and Energetiq uses it for mask inspection sources.

Chris Mack says tin will be the light source of the EUV generation. “So far the industry consensus is that the tin and carbon dioxide method will be the winner,” he says. But he’s talking about LPP, not DPP. Xenon arc lamps, which are used in today’s 193-nm lithography, have new life in the defect inspection field.

6. Litho nerds have dreams, too
Q: Why would you want your own laser if you work at Lawrence Berkeley National Laboratory?
A: Because just like astronomers who have to sign up for time on the telescope, researchers working on EUV lithography have to sign up for time on the synchrotron or whatever other crazy light source they have at the national labs. So LBNL lead scientist Patrick Naulleau would have preferred to have his own light source.

NanoUV manufactures next-generation EUV light sources. The company claims that its source is as bright as a synchrotron but without the oppressive footprint of, say, the Large Hadron Collider. The instrument is 45 cm long and 14 cm wide and, according to NanoUV presenter Sergey Zakharov, uses a controversial “plasma lens technology” to produce 120 watts. Naulleau got a little unhappy in the Q&A session. “How is this possible?” he asked, more than once.

But Zakharov skirted the answer and put Naulleau off, telling him to wait until February, when the tool will be revealed to the world.

Unfortunately, attendees who preferred to remain anonymous postulated that the NanoUV light sources appear to transcend the laws of physics.

Eric B. Szarmes from the University of Hawaii’s department of physics and astronomy presented another cutting-edge high-brightness lithography source based on Compton backscattering from electron pulses. If I had to bet on eyebrow-raising numbers, I’d bet on his, because he works with John Madey, who invented the free electron laser at Stanford in 1975.

5. Photoresist: Meet the new boss…
However, unlike the newfangled ways of generating photons, one surprising variable probably won’t create a headache. The photoresist you’d use to expose a wafer with 13.5-nm photons has the same chemical structure as the resist that worked for 248-nm lithography 16 years ago (the industry has been stuck at 193 nm since 2001).

4. I have to learn French!
Pop quiz time. Define “étendue:”

A.     A treacherous postmodernist ballet move that combines a plié with an accent aigu.
B.     What your French teacher used to yell across the room when you weren’t listening.
C.    A measure of attenuation of a beam of light.
D.     Area of the entrance pupil times the solid angle the source subtends as seen from the pupil. These definitions are for infinitesimally small "elements" of area and solid angle however, and have to be summed over both the source and the diaphragm.

I cycled through A, B, and C (C out loud, unfortunately, and in the form of a question) before it was explained to me that étendue is in fact defined as D. The equation boils down to beam area times solid angle of divergence, and the quantity is conserved: the larger the beam, the smaller its divergence. If EUV goes mainstream, you can expect to hear this word as often as I did last week, and you’ll be well-equipped with a definition. So, you’re welcome.
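Definition D fits in a few lines of code (a sketch with hypothetical helper names; Ω = π·sin²θ is the solid angle of a cone of half-angle θ):

```python
import math

def etendue(area: float, half_angle_rad: float) -> float:
    # G = A * Omega, with Omega = pi * sin(theta)^2 for a cone of half-angle theta
    return area * math.pi * math.sin(half_angle_rad) ** 2

def divergence_after_resize(area1: float, theta1: float, area2: float) -> float:
    # Etendue is conserved: A1 * sin(theta1)^2 = A2 * sin(theta2)^2,
    # so expanding the beam area shrinks its divergence angle
    return math.asin(math.sin(theta1) * math.sqrt(area1 / area2))
```

Strictly, the definition sums infinitesimal elements over the source and the pupil; this treats the beam as uniform, which is good enough to see why a bigger beam means a smaller divergence.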

3. EUV Needs A Vacuum

Contamination is a big problem for EUV lithography tools. You have a machine inside which a pulsed laser explodes a series of falling 20- to 40-micrometer tin droplets, creating a super-hot plasma, and that plasma beams out EUV photons.

But that’s just the easy part. Now you have to get them to the mask over the wafer—but remember you have to do it without letting the EUV photons touch anything.  It’s like that game from the 1980s, Operation, where you had to pick up the creepy little plastic liver without touching the metallic sides of the patient’s “cavity.”

EUV photons, which are just on the boundary of where they should be called x-rays, are absorbed by absolutely everything; EUVs go in, but they don’t come out. The list of what is the death of an EUV photon includes: water molecules, air, glass, and metal. In other words, everything you’d expect to find in your average lithography machine.

In order to minimize that absorption, researchers have “child-proofed” their machines almost beyond all reason. Every step, from EUV generation to the plasma to the beam concentrators and collectors, takes place in a vacuum. And now, because you can’t use glass or plastic or saran wrap or any kind of barrier, you can’t separate the vacuum chambers. With pretty much any other light, you could just pass the beam through windows that separate one vacuum chamber from the next. Not with EUV. The area has to be completely open. Now you’re stuck making a huge, long, labyrinthine vacuum chamber in your litho tool.
And everything that interacts with the beam (steerers, light collectors, lenses, and so on) needs to be made of super reflective mirrors. These are based on Bragg reflection, a concept used for optical fibers. A Bragg reflector is created by coating a smooth substrate with several dozen alternating layers of molybdenum and silicon.
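As a back-of-the-envelope check (a sketch, not the mirror designers’ actual recipe), the Bragg condition m·λ = 2d·cosθ tells you roughly what bilayer period such a stack needs:

```python
import math

def bragg_period_nm(wavelength_nm: float, angle_from_normal_deg: float = 0.0,
                    order: int = 1) -> float:
    # Bragg condition for a multilayer mirror: m * wavelength = 2 * d * cos(theta),
    # with theta measured from the surface normal; refraction in the layers is ignored
    theta = math.radians(angle_from_normal_deg)
    return order * wavelength_nm / (2.0 * math.cos(theta))

period = bragg_period_nm(13.5)  # 6.75 nm per Mo/Si bilayer at normal incidence
```

Real Mo/Si stacks run a slightly thicker period (around 6.9 nm), since refraction inside the layers shifts the condition; hence the “ignored” caveat in the comment.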

And it gets worse!

2. Contamination is bad

University of Illinois at Urbana-Champaign plasma physicist David Ruzic uses the very user-friendly term “splats” to describe the small balls of tin that pancake onto the equipment.

The ions and tin splats exploding out of the plasma have no travel restrictions. They can migrate everywhere and anywhere in your machine. The splats are no fun, but you can clean them up, more or less, with RF coils and fancy lasers.

What’s harder to do is guard against the devastation rained on your super-sensitive mirrors by the ions emitted from the plasma. These, according to one presentation, cut short the life of the average satellite-grade Bragg reflector mirror from 30,000 hours to, uh, 2 to 4. HOURS. You think double patterning is expensive? Wait till you’re asking ASML for a new litho tool six times a day (though I’m sure they would find some way to get through the pain…).
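A quick sanity check on that “six times a day” crack (a toy calculation with a hypothetical helper, assuming the tool runs around the clock):

```python
def replacements_per_day(mirror_lifetime_hours: float, hours_per_day: float = 24.0) -> float:
    # How often you'd be on the phone to your tool vendor
    return hours_per_day / mirror_lifetime_hours

replacements_per_day(4.0)       # 6.0, the "six times a day" figure
replacements_per_day(30_000.0)  # 0.0008, i.e. once every three-plus years
```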

1. No technology is immune from the Star Wars metaphor!

Ruzic tells me the preferred method of dealing with line edge roughness (which, according to many attendees, is the limiter for lithography) is shooting a tiny laser (pew! pew! pew!) down the canyon walls of those tiny features to smooth out the rough spots. “It’s like that part where they’re shooting lasers down the canyons in the Death Star,” he explained, warming the nerdiest regions of my heart.

Stay on target!


Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
