Tech Talk

Midwest Insurance Company Excludes Nanotechnology from its Policies

I have to admit that I saw this tidbit a week or two ago over at Nanodot and found it so outlandish that I thought it fell into the too-ridiculous-to-comment category.

But people kept sending me links to the news story, usually accompanied by some slack-jawed, bewildered comment.

It is bewildering. First, who is this Des Moines, IA-based Continental Western Insurance Group? I have never heard of the insurer, but I am not a Midwest farm. If someone would like to enlighten me as to the nanoparticle producers they currently insure (or should I say, used to insure), I would welcome the information.

Second, excluding "nanotechnology"?! Okay, you could make a poorly informed decision, taking hearsay over science, that nanoparticles, or more precisely carbon nanotubes, have exhibited some similarities to asbestos, albeit with the research still inconclusive. But nanotechnology?

What is that supposed to be exactly? Will that include STMs and AFMs, key tools in nanotechnology? Will that include the GMR effect used in your computer so you can store 100 gigabytes of family photos?

I have to commend the Nanobusiness Alliance for being extremely restrained in its response:

We believe the decision to exclude "nanotubes and nanotechnology" was not well thought out. Treating nanotechnology as if it is monolithic makes no sense. A technology itself does not have risks and benefits -- only the embodiments of the technology in the form of products do. Furthermore, the definitions were sufficiently broad that almost any business could be subject to the exclusion. This is the first exclusion. We hope that it will be reconsidered or pulled back altogether once the insurer understands the implications of the general-purpose exclusion they created. But we must also educate insurers so that they do not make ill-informed policy like this in the future.

The Nanobusiness Alliance is absolutely correct and at the same time generous to a fault. Instead, I am afraid this is just a further example of how just a small seed of misinformation can lead to dangerous stupidity.

The question I can't seem to resolve is what was the point of the announcement? I keep pondering what possible purpose it served: giving notice to the Midwest nanoparticle industry not to knock on Continental Western's door when looking for a policy? Or demonstrating to its current customers what a forward-thinking, risk-averse trailblazer the company is?

If it's the former, well, I am not sure that they are turning away much business, and the little they do turn away will find the insurance it needs elsewhere. If it's the latter, it is probably a safe bet that the current customers didn't know about nanotechnology, never mind care about its toxicological issues.

Bewildering, indeed.

Keeping score in the digital cinema game: the virtual print fee is winning by a landslide


Digital cinema technology has been viable for several years; the problem has been getting it into the theaters. It's not that theater owners, for the most part, wouldn't love to trade in their film projectors; it's that converting a multiplex to digital is an expensive operation: about $70,000 a screen.

In the December 2006 issue of IEEE Spectrum, author Russell Wintner described a creative solution to this dilemma: a deal among the vendors of digital equipment, the movie studios, and the theaters, in which the vendors would provide the equipment to the theaters at no charge and would be reimbursed by fees paid by the movie studios when they load digital files of movies onto the theater systems. Wintner termed this charge a virtual print fee. An interesting idea at the time, but would anyone sign on? Wintner predicted that they would.

And indeed, they have. Wintner's company, AccessIT, signed four studios this spring--Disney, Fox, Paramount, and Universal--and is busy converting 10,000 North American screens to digital (AccessIT had already installed systems for projecting bits onto 4000 screens under an earlier studio-backed virtual-print deal). And last week a consortium of three of the largest theater chains, Digital Cinema Implementation Partners (DCIP), announced that it had put together a financing package that will fund converting 20,000 North American screens to digital and had signed on five studios--Lionsgate Entertainment, Paramount Pictures, Twentieth Century Fox, Universal Pictures, and Walt Disney Co. Again, the studios will pay a virtual print fee.

Later last week Sony Corp., partnering with Paramount and Twentieth Century Fox, separately announced that it would convert 9000 screens in North America, Europe, and Asia. Again, the conversion will be financed by a virtual print fee. All the consortia estimate this fee to be between $700 and $1000.
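Given the numbers in this post (about $70,000 to convert a screen, and a fee of $700 to $1000), the payback arithmetic is easy to sketch. Here's a minimal Python version; the assumption that the fee accrues once per studio booking of a movie onto a converted screen is mine:

```python
import math

CONVERSION_COST_PER_SCREEN = 70_000  # dollars, from the post

def bookings_to_break_even(fee_dollars: float) -> int:
    """How many virtual-print-fee payments recoup one screen's conversion."""
    return math.ceil(CONVERSION_COST_PER_SCREEN / fee_dollars)

for fee in (700, 1000):
    print(f"At ${fee} per booking: {bookings_to_break_even(fee)} bookings per screen")
# At $700 per booking: 100 bookings per screen
# At $1000 per booking: 70 bookings per screen
```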

As far as Wintner is concerned, the more of these deals that happen, the better. "It means," he says, "the initiative I helped start in 1996 has succeeded, with a commitment to replace 35mm film in all theatres across the U.S. and Canada (a total of about 38,000 screens)."

Next up for Wintner? Taking advantage of digital cinema's 3D capability. "These announcements," he says, "will add enormous momentum to 3D initiatives."

US Army plans to build 500 MW solar thermal plant

The U.S. Department of Defense, as we reported this month, has become the home of several very large-scale renewable energy projects. The reasons are simple: the military owns lots of empty land, it has complete jurisdiction over that territory, and its energy needs are insatiable. To that end, the U.S. Army, which to date has lagged the Air Force and the Navy in its energy initiatives, has just announced plans to build a 500-megawatt solar thermal plant at Fort Irwin, in California. The Mojave Desert, an empty and hot place, has long been the home of solar thermal activity in the United States, in large part because it receives some of the strongest solar radiation in the world. The Army also reaffirmed its interest in a 30-megawatt geothermal power plant at Hawthorne Army Depot, using geothermal research from the Navy.

The Army's endeavor marks the military's first foray into solar thermal. The plant will be about equal in size to the Mojave Solar Park 1, which is being developed by Solel Solar Systems and is expected to be operational in 2011. However, contrary to what this CNET article reports, the Army's solar power plant will not "eclipse today's largest U.S. solar thermal installation of 14 megawatts at Nellis Air Force Base" -- that solar installation, though large, is photovoltaic. For more on the Nellis photovoltaic field and other military energy projects, check out this slide show.

Out of Africa: the sky is the limit

Mobile phones are all the rage in Africa, but their success should not obscure an uncomfortable reality: Internet access remains scarce and too costly.

The solution is neither clear nor inexpensive. Two problems are critical. First, there need to be better communications links within and between African countries. Second, the African continent must have stronger links with the rest of the world.

Undersea cables, coming on stream, seem likely to solve the second problem. The first problem is more nettlesome, though bright minds envision an answer in the sky.

Satellites ought to do the trick, say Google and a communications innovator, Greg Wyler, whom the search-engine company is supporting.

The effort by Wyler's O3b Networks, which would involve 16 satellites, is expensive -- $700 million by one reckoning. There's also the question of whether the approach is commercially viable or would require long-term subsidies from outside donors.

Definitive answers will not come quickly. The task of "wiring" Africa -- amid all the hoopla over the penetration of mobile phones in the poorest parts of the world -- remains daunting. And yet without greater Internet usage, the information economy in Africa will suffer gravely.

Exascale supercomputers: Can't get there from here?

Today Darpa released a report I've been hearing about for months concerning whether and how we could make the next big leap in supercomputing: exascale computing, a 1000x increase over today's machines. Darpa was particularly interested in whether it could be done by 2015.

With regard to whether it could be done by 2015, the answer, according to my read of the executive summary, is a qualified no.

In its own words, here's what the study was after:

The objectives given the study were to understand the course of mainstream computing technology, and determine whether or not it would allow a 1,000X increase in the computational capabilities of computing systems by the 2015 time frame. If current technology trends were deemed as not capable of permitting such increases, then the study was also charged with identifying where were the major challenges, and in what areas may additional targeted research lay the groundwork for overcoming them.

The study was led by Peter Kogge, an IEEE Fellow and professor of computer science and engineering at the University of Notre Dame. (We'll be talking to him next week about the study for further coverage in IEEE Spectrum.) And it had contributions from some of the profession's leading lights, including Stanford's William Dally, HP's Stanley Williams, Micron's Dean Klein, Stanford's Kunle Olukotun, Georgia Tech's Rao Tummala, Intel's Jim Held, and Katherine Yelick (whom I include in this list not because I know who she is, but because she lectured about the "Berkeley Dwarfs").

Darpa's helpers seem to have come to the decision that current technology trends will not allow for exascale computing. That's summed up pretty neatly in this graph, which clearly shows that the trend line in computer performance undershoots exascale in 2015 by an appreciable amount:

[Graph: projected supercomputer performance trend, in gigaflops, falling short of exascale in 2015]

The group found four areas where "current technology trends are simply insufficient" to get to exascale. The first and what they deemed the most pervasive was energy and power. The Darpa group was unable to come up with any combination of mature technologies that could deliver exascale performance at a reasonable power level:

[Graph: projected performance per watt, in gigaflops per watt]

The key, they found, is the power needed not to compute but to move data. Data needs to move on interconnects, and they found that even using some really cool emerging technology it still costs 1 to 3 picojoules for a bit to go through just one interconnect level (like from chip to board or board to rack). Scale that up and you're talking 10 to 30 MW (the equivalent of 167 000 to 500 000 60-watt light bulbs) per level. Eeesh.
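To see how a few picojoules per bit balloons into megawatts, here's the scaling arithmetic as a minimal Python sketch. The energy-per-bit range comes from the report; the aggregate traffic of 10^19 bits per second per interconnect level is my own assumption, chosen to be consistent with the 10 to 30 MW quoted above:

```python
BULB_WATTS = 60  # a 60-watt incandescent bulb, for scale

def interconnect_power_watts(pj_per_bit: float, bits_per_second: float) -> float:
    """Power spent pushing data through one interconnect level."""
    return pj_per_bit * 1e-12 * bits_per_second  # picojoules/bit -> watts

traffic = 1e19  # bits/s crossing one level (assumed, not from the report)
for pj in (1, 3):
    watts = interconnect_power_watts(pj, traffic)
    print(f"{pj} pJ/bit -> {watts / 1e6:.0f} MW, "
          f"about {watts / BULB_WATTS:,.0f} light bulbs")
# 1 pJ/bit -> 10 MW, about 166,667 light bulbs
# 3 pJ/bit -> 30 MW, about 500,000 light bulbs
```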

The other three problems are memory and storage (how to handle a billion 1-GB DRAM chips), concurrency and locality (how to write a program that can handle a billion threads at once), and resiliency (how to prevent and recover from crashes).

These are equally interesting, but the power problem is, I think, what much of today's computing work is really boiling down to. Solve that, and things will look a lot sunnier for everything from high performance computing to embedded sensors.

The full (297-page) Darpa Exascale Computing report is here.

(In the November issue of IEEE Spectrum, watch for a cool simulation that Sandia computer architects did to show another bump in the road to future supercomputers. Their simulations show that as the multicore phenomenon advances in the processor industry, some very important applications will start performing worse.)

Nuclear waste imports can wait

Last July our Sally Adee brought you a story on the controversy over a Utah company's plan to import 18 000 metric tons of Italian nuclear waste into the United States and (after some difficult-to-understand process) dump some of it in Utah.

The Wall Street Journal reports that the Nuclear Regulatory Commission has decided to delay its decision on whether or not the importation can proceed. The NRC is going to sit on its hands until a federal court hears a related case--some time next year.

The delay, says the Journal, gives a boost to a bill that would ban nuclear waste imports (unless they were defense-related). The legislation is currently stuck in committee.

Physics Nobel for why the Big Bang wasn't a big bust

From our intrepid intern, Monica Heger:

The Nobel Prize in Physics was awarded today for discoveries in subatomic physics. Yoichiro Nambu, from the Enrico Fermi Institute at the University of Chicago, won half the award for his discovery of the mechanism of spontaneous broken symmetry in subatomic physics. Two Japanese physicists, Makoto Kobayashi from the High Energy Accelerator Research Organization and Toshihide Maskawa from the Yukawa Institute for Theoretical Physics at Kyoto University, split the other half of the award for their discovery of the origin of broken symmetry, which predicts the existence of at least three families of the fundamental particles known as quarks.

Broken symmetry lies behind the very nature of our existence. At the time of the Big Bang, if equal amounts of matter and antimatter were created, they theoretically would have destroyed each other. Instead, that symmetry was broken, allowing for the existence of our universe. Scientists still do not know how that symmetry was broken.

The three Nobel winners all explained broken symmetry within the framework of the existing laws of physics. Kobayashi and Maskawa were able to do this only by expanding the theory to include at least three families of quarks. The quarks they described in 1973 have only recently been observed in particle-accelerator experiments.

Marching to the Beat of a Different Drummer in Nanotech

Andrew Maynard, in his latest blog post, presents one of the stronger metaphors I have seen to date to describe the state of dialogue (or lack thereof) on the future and direction of nanotech.

Maynard likens the current discourse to that latest social phenomenon, the "silent rave," in which everyone shows up at the same place but listens to their own iPod.

The nanotechnology meetings to which Maynard draws his comparison consist of scientists, policy makers, industry leaders, and NGOs, to name just the main groups, and they are all marching to the beat of different drummers.

What Maynard seems loath to point out is that there may actually be a qualitative difference between the drummers or, to follow his metaphor, the songs. Maybe Ringo Starr was a better drummer than Pete Best.

After reading TNTLog's account of a recent experience at another stakeholder consultation group intended to foster "Fruitful Dialogue," one wonders how fruitful these dialogues can be when one or more of the groups clearly has absolutely no idea what it is talking about.

Is it possible to step in and pull the plug on the iPods of the clearly misinformed? Probably not. The thought that some ideas and opinions are just bogus has become so anathema to "reasonable" people that we have to endure the nonsense, or noise, and hope that the more pleasant notes come to the fore.

Unfortunately, hoping for something to happen doesn't mean that it will.

Flash of Genius: See the Movie, then Read the Article

I watched Flash of Genius in a sort of slack-jawed amazement. The movie, which opened over the weekend, stars Greg Kinnear as Bob Kearns, the inventor of the intermittent windshield wiper, who brought car companies to their knees in the early '90s by winning a patent infringement case against Ford.

The climactic scene is a patent trial! Sure, plenty of movies have trial scenes--but patent law is notoriously opaque. Throw in the complexities of engineering and a mentally disturbed engineer representing himself, and you've got the makings of cinematic Ambien. Mercifully, the trial moves at a brisk clip, with plenty of drama, and the most cogent explanation of the legal standard of non-obviousness that I've ever heard.

And of course, there was the flash of genius, Kearns' "eureka moment," when looking in the mirror and watching his eye blink, he realized he could make a windshield wiper work the same way. I was so excited when I came out of the movie that I started madly Twittering my review.

As I tweeted, I got to thinking about the nature of eureka moments. Kearns' eureka moment was actually several moments spread over a decade. Kearns blinded himself in one eye popping a champagne cork on his wedding night. He and his wife recount this incident throughout the movie as the moment--but the wiper wasn't even a twinkle in his black eye at that time. Then there was the time he was driving his family in a downpour, frustrated by the lack of wiper-speed variability. And then there was the moment when he looked at his eye blinking in the mirror and the first two moments came together--Eureka! Kearns' flash of genius.

Hmm. Flashes of genius, maybe. Or flashes that result in a moment of insight.

But that's really kind of nitpicky. Less so is the scene where Kearns' wiper works--the first time he turns it on. No way, I thought. And when I read John Seabrook's 1993 New Yorker article "The Flash of Genius" on which the movie was loosely (as it turns out) based, I learned that Kearns actually spent months perfecting his invention.

In fact, comparing the article to the movie is an object lesson in how Hollywood distorts complicated issues and complex people into a digestible package of entertainment. There's a laundry list of differences between legal fact and movie fiction. For instance, in the movie Kearns is a professor at Wayne State University teaching applied electrical engineering. In reality, at the time he invented the wiper, Kearns, who had a master's in mechanical engineering and would eventually teach at Wayne State, was commuting to Case Western Reserve in Cleveland, trying to earn his PhD. He came home on weekends to be with his wife and family. The real hero of the story should be his wife, Phyllis, who had to take care of six kids by herself while holding down a substitute teaching job (wait, was that true or made up?).

Finally, there's the explanation of the technology behind the intermittent wiper, or rather the lack thereof. In the movie, we come to understand that Kearns' invention is electronic rather than mechanical, and that it relies on simple components: a transistor, a capacitor, and a variable resistor. How these work together in a novel circuit design (the Invention) is never explained in the movie. But how hard would it have been to have Kinnear explain the mechanism to his kids, using Seabrook's elegant explanation:

The resistor and the capacitor together were the timer, and the transistor worked as the switch. The resistor, which the driver could adjust with a knob, controlled the rate of current flowing into the capacitor. When the voltage in the capacitor reached a certain level, it triggered the transistor; the transistor turned on, and the wipers wiped once. The running of the wiper motor drained voltage out of the capacitor; it sank below the threshold level of the transistor, and the transistor turned off. The wipers dwelled until the capacitor recharged.
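For the technically curious, the timing Seabrook describes is just the charging curve of an RC circuit: the wipers dwell while the capacitor charges through the driver-adjustable resistor toward the transistor's trigger voltage. Here's a minimal sketch of that relationship in Python, with illustrative component values that are my own assumptions, not Kearns' actual design:

```python
import math

def dwell_seconds(r_ohms: float, c_farads: float,
                  v_supply: float = 12.0, v_trigger: float = 8.0) -> float:
    """Dwell time between wipes: how long the capacitor takes to charge
    from 0 V up to the transistor's trigger threshold.
    V(t) = Vs * (1 - exp(-t/RC))  =>  t = -R*C*ln(1 - Vt/Vs)
    """
    return -r_ohms * c_farads * math.log(1 - v_trigger / v_supply)

# Turning the dashboard knob changes R, and with it the pause between wipes.
for r_kohm in (50, 100, 200):
    t = dwell_seconds(r_kohm * 1e3, 47e-6)  # 47 uF capacitor (illustrative)
    print(f"R = {r_kohm} kilohm -> dwell of about {t:.1f} s")
```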

Having said all that, this is a movie well worth seeing. How many times do you see circuit diagrams, even as set pieces, on the silver screen? How many times do engineers star in a film? Kinnear gives a terrific performance. And the ending, though happy enough, underscores the price the late professor Kearns and his family paid for his obsession. See the movie. Then read the article.

Digital TV preview hints at problems; firefighters come to the rescue

Last month broadcasters in Wilmington, N.C., turned off their analog signals, meaning that viewers of over-the-air television had digital television or nothing. This is a preview of the nationwide analog shutdown scheduled for 17 February 2009.

Local government officials worked hard to get the word out, and an estimated 97 percent of Wilmington residents knew about the analog shut-off. The local fire department sent volunteers out to help people hook up their converter boxes. It still didn't go so well. Many viewers lost their favorite television channels altogether; of the 1828 people who complained to the FCC in the first five days after the shutoff, more than half had lost channels. Others called their local television stations to complain.

According to a team of students from Elon University, many problems were related to the antenna. One Wilmington resident quoted in a great blog post about the antenna problems said, "I feel scammed by all these commercials and companies. If getting a new antenna was something they knew we might have to do, why did they not say our antennas would not work?"

So it looks like my unhappy experience in trying to switch to digital was not, unfortunately, an aberration. Just making some rough calculations, I figure that 12 percent of the roughly 15,000 people in Wilmington who don't subscribe to cable or satellite were ticked off enough to call the FCC. Nationwide, 12 percent of 13.4 million such households is 1.6 million. The FCC is going to have a really busy February.
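For the record, here's that back-of-the-envelope math as a quick Python sketch (the 15,000 and 13.4 million figures are the rough over-the-air household counts used above):

```python
wilmington_complaints = 1828        # calls to the FCC in the first five days
wilmington_ota_households = 15_000  # rough figure used above
national_ota_households = 13.4e6    # rough figure used above

rate = wilmington_complaints / wilmington_ota_households
print(f"Complaint rate: {rate:.0%}")  # ~12%
print(f"Projected national calls: {rate * national_ota_households / 1e6:.1f} million")
```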

And it might not end there. In fact, I hope those volunteer firemen keep standing by. Back in August I converted my mother in New Jersey to digital; all went well, she got lots of channels, she was happy. I just found out, however, that in spite of my carefully written instructions, a couple of days after I left she pushed some button out of sequence and hasn't been able to tune in a TV signal since. I wonder if I should have her call her local fire department to sort it out?

For more tales from the digital television transition, as well as links to in-depth coverage of digital television technology, see IEEE Spectrum's Special Report: THE DAY ANALOG TV DIES.
