Tech Talk

Highlights from National Instruments Week 2009

The annual NI Week was full of cool demos. There was the scarily named robotic "flying blade of amputation," a robotic keyboard-and-glockenspiel duet, and the always-popular Guitar Hero-playing robot (with a new twist). This video provides a taste of the best demos and highlights from the keynote sessions.

As always, NI debuted some new products and gave previews of what's still to come. The most impressive of the future features was the thin-client web interface that will soon be part of its LabVIEW software. NI already has hardware that runs web services accessible via URL, but the next generation of LabVIEW will let users build a web interface right in their browser, without knowing Flash or Java. The best way to understand the new capability is to watch the demo below. I've edited it down to just the essentials of the technology, without all the pageantry of the keynote.
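In the meantime, the URL-accessible web services NI already ships are easy to query from any HTTP client. Here is a minimal sketch in Python; the host, port, and endpoint path are hypothetical placeholders rather than a real NI URL, and the response format (plain text, XML, or JSON) depends on how the publishing VI is configured.

```python
# A minimal sketch of reading a LabVIEW-hosted web service over plain HTTP.
# The host, port, and endpoint path below are hypothetical placeholders, not
# a real NI URL; check your own LabVIEW Web Services configuration.
import urllib.request

SERVICE_URL = "http://crio-controller.local:8080/monitor/temperature"  # placeholder

def read_measurement(url: str = SERVICE_URL, timeout: float = 5.0) -> str:
    """Issue an HTTP GET against the web service and return the raw response body."""
    with urllib.request.urlopen(url, timeout=timeout) as response:
        return response.read().decode("utf-8")

if __name__ == "__main__":
    # The response format (plain text, XML, or JSON) depends on how the
    # VI publishing the service is configured.
    print(read_measurement())
```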

Nanotech Agreement between IBM and Bulgaria Put on Hold

I suggested at the end of May that it appeared as though IBM was starting an entirely new line of business. Small countries were turning to IBM to jump-start their nanotechnology initiatives.

Well, we have seen the first of these announced partnerships fall by the wayside because Bulgaria simply couldn't cover the expense.

It seems that Bulgaria's state spending increased by 30% in the first half of 2009. My speculation is that this increase is likely due to paying unemployment compensation for those laid off during the worldwide recession and to trying a bit too hard to spend its way into the Eurozone. In any case, the finance minister started looking for ways to cut the budget, and nanotechnology rose to the top of the list.

If there is a lesson to be learned here, it is that one of the drawbacks of doing business with small economies is that sometimes they just can't pay the bill.


Fantasy Sports Prep: Semiconductor Edition

IMAGE CREDIT: Wikimedia user Inductiveload

 

I'll be at Hot Chips in late August, along with the rest of the engineering press (eat your heart out, Perez!). In some ways you might say Hot Chips kicks off chip season, which continues with the International Electron Devices Meeting (IEDM) in December and culminates in the Super Bowl of chip talk, the International Solid-State Circuits Conference (ISSCC), in February.

In case you want to do some research for your fantasy semiconductor team, Real World Technologies' David Kanter has compiled an excellent post-game analysis of the 32nm process technologies rolled out at IEDM 2008 and ISSCC 2009. In particular, he discusses the semiconductor industry's move to 32nm manufacturing processes with high-k dielectrics and metal gates (HKMG). (I'd like to point out here that it's not all about process technology: the cyborg moth at ISSCC was so creepy I thought I had woken up in a dystopian Arnold Schwarzenegger movie. Is that reference dated now? Who's the new Arnold? Anyone care to update my pop culture database? But I digress.)

David was kind enough to summarize the article for me. However, you should still go read the whole thing yourself because this is the version for people who are easily distracted by robo-moths and Arnold Schwarzenegger's falling star.

New manufacturing technologies are essential to keeping Moore's Law on track and driving continued advances in microprocessors, graphics processors, FPGAs, ASICs, networking, and other industries that rely on semiconductor technologies. At IEDM 2008 and VLSI 2009, leading-edge manufacturers announced their initial results for 32nm process technologies, discussing key techniques including transistor strain, immersion lithography, double patterning, and, for some, custom illumination.

The process technologies analyzed include:

1.  IBM and AMD's research work on an HKMG 45nm process using silicon-on-insulator (SOI), which is not expected to go into production.

2.  IBM and the Common Platform's HKMG 32nm bulk process.

3.  Intel's high-performance HKMG 32nm process, slated for production at the end of 2009.

4.  TSMC's performance-optimized HKMG 32nm and 28nm processes, expected in 2010.

5.  Intel's low-power 45nm process for SOCs, the first low-power process to feature HKMG.

6.  Toshiba and NEC's HKMG 32nm density-optimized process, which currently uses custom illumination rather than double patterning.

7.  IBM and AMD's high-performance HKMG 32nm SOI process, expected to debut in late 2010.

The results for each process include key density metrics, such as contacted gate pitch and SRAM cell size, and transistor performance metrics, such as NMOS and PMOS drive strength. We include a comparison that puts these newer manufacturing technologies into historical perspective, going back as far as 130nm. New to this year's coverage of IEDM and VLSI is a graphical comparison of density and performance for various 45nm and 32nm process technologies.
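As an aside for anyone new to these density metrics: the "ideal" gain from one process node to the next follows from simple geometry. Area scales with the square of the linear dimension, so a full shrink such as 45nm to 32nm ideally halves area. The quick sketch below shows only those ideal numbers; actual processes, which is exactly what metrics like contacted gate pitch and SRAM cell size reveal, don't always hit them.

```python
# Back-of-the-envelope ideal area scaling between process nodes.
# A full node shrink (e.g., 45nm -> 32nm) ideally scales linear dimensions
# by about 0.7x, so area (and hence density) improves by roughly 2x.
def ideal_area_scaling(old_node_nm: float, new_node_nm: float) -> float:
    """Return the ideal area-shrink factor going from old_node_nm to new_node_nm."""
    linear_ratio = new_node_nm / old_node_nm
    return linear_ratio ** 2

for old, new in [(130, 90), (90, 65), (65, 45), (45, 32)]:
    factor = ideal_area_scaling(old, new)
    print(f"{old}nm -> {new}nm: ideal area factor {factor:.2f} (~{1 / factor:.1f}x density)")
```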

Of particular interest are several facts. First, the rest of the industry, including IBM and AMD, will finally catch up to Intel's manufacturing technology at 32nm by adopting high-k dielectrics and metal gates. Second, there is approximate parity between Intel, AMD, and IBM for manufacturing CPUs. And finally, there is approximate parity between TSMC and the Common Platform for bulk foundry processes.

I for one would like to read more about GlobalFoundries, but I imagine we'll get an eyeful of that from everyone over the next six months.

Nanotechnology Provides the "McGuffin" for Summer Movie Blockbuster

For those of you not familiar with the term "McGuffin": according to Alfred Hitchcock, it comes from the story of two men traveling on a train.

One man asks the other what he is carrying in his luggage. The other man responds that it's a McGuffin. When the first man asks what a McGuffin is, the other says it's a gun for hunting lions in the Scottish Highlands. The first man, nonplussed, responds that there are no lions in the Scottish Highlands, to which the other quickly replies that then it is no McGuffin.

In other words, a McGuffin is an empty, almost entirely meaningless plot device.

It seems that nanotechnology is becoming the new McGuffin for silly Hollywood action movies with the release of this summer's blockbuster "G.I. Joe: The Rise of Cobra."

In this case, the McGuffin is a swarm of little nanobots that are loaded into a warhead and launched at a target, which they devour until they are switched off by remote control. In the trailers you can see the green swarm of nanobugs devouring the Eiffel Tower.

I am afraid nanotechnology is not faring too well in popular culture; it always seems to be a threat, whether in Michael Crichton's "Prey" or the television program "Eleventh Hour."

I guess it’s hard to make cleaner drinking water, cheaper alternative energy or better anti-cancer drug treatments into an exciting and compelling plot element.

Is More Being Spent on Nanotech's Toxicology Than on Nanotech Itself?

As nanotech businesses fall deeper into the abyss and the high-flyers sink into nonexistence, you would think, with all the government talk of nanotech being the future and all the stimulus dollars being spent, that one would hear about some actual money going toward keeping nanotech companies afloat.

Sure, billions are invested in government and university research labs that are supposed to lead to commercial applications. But they never do, because the road from a lab prototype to a commercial product is expensive and there is little or no financial support to carry it off.

Shiny new research centers abound, producing all sorts of research ranging from the "who cares" to the "could be significant," but the commercialization of nanotech continues to flounder.

Meanwhile, there is an endless chorus about the possible dangers of nanomaterials and their proliferation in consumer products. That last bit always leaves me scratching my head. I could have bought "nano pants" five years ago, and five years later that's still about all I can buy if I go out looking for a nanotech-enabled product.

Despite these sinking fortunes for nanotech companies and products, some researchers manage to strike it rich. A professor at Duke University has just received $14 million to figure out the possible detrimental effects of silver nanoparticles, particularly those used in antibacterial socks.

Well done! Chapeau! And all of that. But if that kind of money were spent to support companies trying to bridge the financial chasm between a prototype and a product, I might actually be able to go out and buy a pair of nanosilver socks.

Nanotechnology Adds to Police Arsenal Against Impaired Drivers

Aside from agility tests, police have had no technological way of detecting drivers' use of controlled substances other than alcohol. Cheech and Chong could merrily drive down the highway in their van made out of cannabis, stoned out of their minds, and there was little the police could do to prove it.

According to this rather colorful article, which uses terms like “stoners and dopers,” Philips has developed a hand-held device that employs nanotechnology based on the use of electromagnets and nanoparticles to “separate the sober from the impaired”. 

The article points out that the Netherlands-based Philips will roll this out initially in Europe. But oddly, the article raises the specter of the device being a "privacy-invading drug tester." I am not sure how much privacy you are entitled to when driving impaired on a public road, but in any case I am sure this is not the kind of nanotech invasion of privacy that has some people so concerned.

Five Years After the Release of the Royal Society's Nanotech Report

I guess I have become inured to the idea that there is little synthesis on the issues of the day, only antithesis. That's why, five years after the release of the Royal Society and Royal Academy of Engineering report "Nanoscience and nanotechnologies: opportunities and uncertainties," I am not surprised that we haven't progressed much beyond name-calling regarding the safety of nanomaterials.

The Responsible Nano Forum, which has been quite busy of late with the launching of their new Nano&me website, has just released a report  putting the last five years into some kind of perspective.

Andrew Maynard on his 20/20 Science blog has followed up on this report providing his own perspective on the situation, which scans about right.

While I can appreciate the arguments on both sides (to an extent), it all seems so needlessly polemical.

On the one hand, you have Allan Shalleck over at Nanotech-Now arguing that the media have been chasing sensational headlines and have missed the other side of the story, which is that there have been zero reported health problems caused by nanomaterials…thus far.

Then, on the other hand, you have some NGOs claiming that the environmental benefits of nanomaterials, such as pollution remediation and clean drinking water, which have been used to counterbalance the reported potential negative effects, are overhyped.

Meanwhile you have everyone trying somehow to engage the public on the issue of nanotechnology, or at least get them mildly interested.

Aside from the fact that we don't really have as much hard experimental data on the safety of nanomaterials today as one might have expected five years ago, when the RS and RAE report was published, perhaps more interesting is that the general public not only doesn't care but still doesn't even understand what nanotechnology is. Who can blame them, really?

Robots: The Expensive Way to Prepare Cheap Food

If you've ever watched the giant container loaders in Elizabeth, N.J., or Yokohama Harbor, you've probably wondered whether the same robotic technologies could be used to make ramen soup.

Okay, maybe you never have, but someone seems to—Kenji Nagoya, said to be an industrial robot manufacturer and owner of a new fast-food restaurant where bowls of ramen in pork broth are prepared almost entirely by a pair of robots that look, to me at least, a bit like the container loaders I see from the New Jersey Turnpike.

In a widely copied Reuters video report, Nagoya says, “The benefits of using robots as ramen chefs include the accuracy of timing in boiling noodles, precise movements in adding toppings and consistency in the taste."

The robots are reported to be able to make only 80 bowls a day (though the automated process, which includes heating, but not making, the broth, is said to take less than 2 minutes). The bowls sell for $7 apiece. That gives the shop a total daily revenue of $560, which has to cover the cost of the ingredients, electricity, rent, and the humans who make the broth, serve customers, take their money, and so on. And the robots themselves, of course.
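For what it's worth, the back-of-the-envelope math is easy to reproduce. Only the bowls-per-day and price figures come from the report; the point is simply that $560 a day leaves little room for everything else.

```python
# Reproducing the back-of-the-envelope revenue figure from the report.
bowls_per_day = 80        # reported daily output of the robot pair
price_per_bowl_usd = 7    # reported price per bowl
daily_revenue = bowls_per_day * price_per_bowl_usd
print(f"Daily revenue: ${daily_revenue}")
# -> Daily revenue: $560, which still has to cover ingredients, rent,
#    electricity, the human staff, and the robots themselves.
```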

The shop therefore doesn't make a profit for Nagoya, but it's a great proof of concept and might someday lead to restaurant robots inexpensive enough to replace all those imprecise high school students currently preparing our fast food. (By the way, it's unclear to me whether Nagoya has anything to do with the soon-to-be-closing robot museum in the town of Nagoya.)

There's an additional video of the Nagoya ramen robots here.

The Nagoya robot story has completely overshadowed a robot "somewhere in Yamanashi," Japan, that also helps make ramen soup. Restaurateur Yoshihira Uchida, for whom the robot was created, took the exact opposite approach: his robot custom-prepares the broth, choosing among "40 million recipes" (combinations of broth ingredients), while a human chef makes the noodles.

This Article Has Been Revised to Reflect the Following Correction

Last week, a slew of news outlets (emphasis on “outlets,” as if it were a contraction of “outhouse” and “toilets”) published a story about a man allergic to a particular radio signal. Not just a particular radio frequency, which would be crazy enough, but a particular air interface, a particular protocol, if you will: Wi-Fi signals. No, really. Here's the Daily Mail's headline:

Allergic to wi-fi! How 'electrosmog' leaves Afterlife DJ in agony

I know what you're thinking. Wi-Fi, at least the most popular flavors of it, uses the same 2.4 GHz frequency as cordless phones, garage door openers, and the microwave oven that Steve Miller, aka "Afterlife DJ," probably pops his popcorn in. How could someone be allergic to Wi-Fi and not to a phone or a microwave using the same frequency?

You shouldn't have to know anything about the IEEE 802.11 standard to see instantly that the story is nonsense, but apparently you do if you work for any of the publications that took to the story like a lemming to the sea.

Fox News—you know, the fair and balanced people—took it and ran ( Man Allergic to Wi-Fi, Makes Him Sick, Dizzy, Confused), apparently getting it, like a virus, from The Sun.


This isn't an occasional phenomenon; it seems integral to the Web. But it's not just the Web; it's probably a story as old as history itself, or at least the 1970s. The public radio media-watchdog show "On the Media" had a great piece ("Too Good to Check") this weekend about crazy claims about Walter Cronkite that have been around for decades and resurfaced on the occasion of his recent death.

Did you know, for instance, that Uncle Walter is so identified with the news business that in Sweden an anchorman is called a "Kronkiter"? And speaking of "anchorman," did you know that the word was coined in the '50s to define Cronkite's role on broadcast TV? Turns out, despite what many media eulogies would have you believe, neither of those facts I just asserted is exactly true.

On the Media host Bob Garfield traced the virus's 30-year etiology with the help of Ben Zimmer, executive producer of the Visual Thesaurus.

BOB GARFIELD: Let's start with the Kronkiter bit. Read me, please, the excerpt from the AP obit.

BEN ZIMMER: Well, the obituary that ran in many newspapers came from The Associated Press. The version that ran in The Chicago Tribune, for instance, said, "Cronkite was the broadcaster to whom the title 'anchorman' was first applied. In Sweden, anchors were sometimes termed 'Kronkiters'" (that's with a K), and "In Holland, they were 'Cronkiters'" (that's with a C).

BOB GARFIELD: It scans. I mean, it sort of sounds possible. But what you did was go back to see if it was, you know, true. What did you discover?

BEN ZIMMER: Well, I was not able to discover any evidence in Swedish, Dutch or any other language that news anchors were ever called Kronkiters. So I tried to figure out, well, who started telling this anecdote? And when I first looked, the earliest example I could find was in a 1978 book called Air Time: The Inside Story of CBS News, written by Gary Paul Gates, who was at one time a news writer for Cronkite.

Then in 1979, David Halberstam wrote The Powers That Be, and similarly he had, in Sweden, anchormen were known as Kronkiters. It seemed that these were the earliest examples of this story being told. And what I did was I actually contacted Gary Paul Gates to find out where he got the story from, and it turns out he says he got the story from Halberstam.

At least the Web, when it taketh away the truth, can also giveth it back. For example, when a responsible publication messes up, it can correct the error with lightning speed. And not in some minuscule correction published in an obscure corner of the paper days later that leaves the original nonsense untouched. The corrections can be made to the original article (something Foxnews.com, the Daily Mail, and The Sun have yet to do, by the way), with, hopefully, an editorial note describing the changes. All seven of them, in the case of (speaking of Walter Cronkite) an "appraisal" of the late great newscaster written by New York Times ace appraiser Alessandra Stanley.

Check out the mammoth 200-word correction (numbers added):

An appraisal on Saturday about Walter Cronkite's career included a number of errors. (1) In some copies, it misstated the date that the Rev. Dr. Martin Luther King Jr. was killed and (2) referred incorrectly to Mr. Cronkite's coverage of D-Day. Dr. King was killed on April 4, 1968, not April 30. Mr. Cronkite covered the D-Day landing from a warplane; he did not storm the beaches. In addition, (3) Neil Armstrong set foot on the moon on July 20, 1969, not July 26. (4) "The CBS Evening News" overtook "The Huntley-Brinkley Report" on NBC in the ratings during the 1967-68 television season, not after Chet Huntley retired in 1970. (5) A communications satellite used to relay correspondents' reports from around the world was Telstar, not Telestar. (6) Howard K. Smith was not one of the CBS correspondents Mr. Cronkite would turn to for reports from the field after he became anchor of "The CBS Evening News" in 1962; he left CBS before Mr. Cronkite was the anchor. Because of an editing error, (7) the appraisal also misstated the name of the news agency for which Mr. Cronkite was Moscow bureau chief after World War II. At that time it was United Press, not United Press International.

Correction: make that eight corrections. A week later, the Times added yet one more:

This article has been revised to reflect the following correction: Correction: August 1, 2009 An appraisal on July 18 about Walter Cronkite's career misstated the name of the ABC evening news broadcast. While the program was called “World News Tonight” when Charles Gibson became anchor in May 2006, it is now “World News With Charles Gibson,” not “World News Tonight With Charles Gibson.”

Why stop there? We can get to an even 10 for the Times on the subject of Cronkite if we count the two—yes, two—separate corrections to its obituary of (as opposed to appraisal for) Der Kronkiter.

This article has been revised to reflect the following correction:

Correction: July 21, 2009 Because of an editing error, an obituary Saturday about the CBS newsman Walter Cronkite misspelled the name of the church in Manhattan where his family plans to hold a private funeral service. It is St. Bartholomew's, not Bartholemew's.

This article has been revised to reflect the following correction:

Correction: July 23, 2009 An obituary on Saturday about Walter Cronkite misidentified the country in which he crash-landed a glider as a United Press correspondent in World War II. It was the Netherlands, not Belgium.

It's a wonderful thing, the Web, a gargantuan fact-checking machine it is. We're lucky to have one. It's just too bad that, in the era of the Web, we need one so often. Like the pharmacy owner in the Mad Magazine cartoon who sells both chocolate ice cream and acne medication, the Web fuels its own arms race of truth and falsity.

Devices for Diabetics Expand Inward and Outward

Funding and advising the development of an artificial pancreas is a major long-term initiative at the FDA. A couple of interesting advances have recently been made, both commercially and in research, that seem to bring us closer to this goal.

According to the FDA, an artificial pancreas would consist of three components:

"(1) an infusion pump to deliver the required drug, many of which are already available; (2) a continuous glucose monitor, several of which have been approved by the FDA for tracking and trending glucose levels; and (3) an algorithm to communicate between the pump(s) and glucose monitor. An algorithm will receive information from the glucose monitor and convert it to instructions for the infusion pump."

Some continuous blood glucose monitoring devices are already available on the market (here's a good comparative chart). All give periodic updates of blood glucose levels measured from a sensor inserted just beneath the skin. But all fall short, in some serious way or another, of what an artificial pancreas would require. A huge problem, it seems, is the lifespan of the device. The sensors for these models last only a few days and have to be reinserted regularly. Furthermore, the sensor is only partially implanted and connects to a transmitter through the skin.
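Before getting to the new sensor work, it's worth making the FDA's third component, the algorithm, concrete. Here is a deliberately oversimplified sketch of the kind of logic that converts glucose readings into pump instructions: a bare-bones proportional controller. It is purely illustrative; the target, gain, and rate limit are made-up placeholders, and real closed-loop algorithms under FDA review are far more sophisticated and heavily safety-checked.

```python
# A deliberately oversimplified, illustrative control loop for an artificial
# pancreas: take a reading from a continuous glucose monitor and convert it
# into an infusion-pump command. The target, gain, and rate limit are made-up
# placeholders; real closed-loop algorithms are far more sophisticated.

TARGET_MG_DL = 110.0           # hypothetical target glucose level (mg/dL)
GAIN_UNITS_PER_MG_DL = 0.01    # hypothetical proportional gain (units/hr per mg/dL of error)
MAX_RATE_UNITS_PER_HR = 3.0    # hypothetical safety cap on the infusion rate

def infusion_rate(glucose_mg_dl: float) -> float:
    """Map a glucose reading to an insulin infusion rate in units/hour."""
    error = glucose_mg_dl - TARGET_MG_DL
    if error <= 0:
        return 0.0  # at or below target: deliver no insulin
    return min(GAIN_UNITS_PER_MG_DL * error, MAX_RATE_UNITS_PER_HR)

# Feed a few monitor readings through the controller.
for reading in (95, 120, 180, 250):
    print(f"glucose {reading} mg/dL -> pump rate {infusion_rate(reading):.2f} U/hr")
```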

Last month, engineers at the University of Calgary published an alternative monitor design that mounts a glucose sensor onto a transponder chip. An external reader inductively powers the chip while reading the glucose level, eliminating the need to hook it up to a battery-powered transmitter. This makes the device very small, and thus more durable in the body. Removing the need for a battery also means that the entire chip and sensor can be fully implanted under the skin.

The design also uses an alternative chemical reaction to measure the glucose levels in the body, one that doesn't require oxygen. The oxygen-driven reaction used by other devices produces hydrogen peroxide that can corrode the sensor.

The device hasn't been tested in a living organism yet, and once it is, it will be interesting to see how accurate it actually is. But these are definitely ideas that could improve the available models.

Another thing that is changing is the extent to which these glucose monitors can communicate with computers and other devices. The MyGlucoHealth system uses a traditional pin-prick glucose meter but adds a USB cable and Bluetooth capability, making it possible to synchronize data with a diabetes management system. It also keeps doctors and patients up to date on an individual's glucose fluctuations via text message and email. This kind of network is likely to be vital in a system that closes the loop with an automatic insulin infusion pump.
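The data-plumbing side of such a network is conceptually simple. Below is a rough, hypothetical sketch of threshold-based alerting on readings that have already been synced off a meter; the thresholds, data structure, and notification path are my own illustrative choices, not anything documented for the MyGlucoHealth system.

```python
# A rough, hypothetical sketch of threshold-based alerting on synced glucose
# readings. The thresholds, data structure, and delivery channel here are
# illustrative placeholders, not MyGlucoHealth specifics.
from dataclasses import dataclass
from datetime import datetime

LOW_MG_DL = 70.0    # hypothetical hypoglycemia alert threshold
HIGH_MG_DL = 200.0  # hypothetical hyperglycemia alert threshold

@dataclass
class Reading:
    timestamp: datetime
    glucose_mg_dl: float

def build_alerts(readings: list[Reading]) -> list[str]:
    """Return human-readable alert messages for out-of-range readings."""
    alerts = []
    for r in readings:
        if r.glucose_mg_dl < LOW_MG_DL:
            alerts.append(f"{r.timestamp:%Y-%m-%d %H:%M}: LOW glucose ({r.glucose_mg_dl:.0f} mg/dL)")
        elif r.glucose_mg_dl > HIGH_MG_DL:
            alerts.append(f"{r.timestamp:%Y-%m-%d %H:%M}: HIGH glucose ({r.glucose_mg_dl:.0f} mg/dL)")
    return alerts

if __name__ == "__main__":
    sample = [Reading(datetime(2009, 8, 1, 7, 30), 65),
              Reading(datetime(2009, 8, 1, 12, 15), 145),
              Reading(datetime(2009, 8, 1, 18, 45), 230)]
    for msg in build_alerts(sample):
        print(msg)  # in a real system these would go out as SMS or email
```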
