Tech Talk

Derivatives and the Singularity

In the weekend's New York Times, a can't-miss article from Richard Dooling on the economics of the singularity. In spite of its unlikely title, "The Rise of the Machines" offers the best explanation I've read of what exactly the heck a derivative is. So far, most stories discuss the bailout: Is it wise? They focus on the effects of Wall Street on Main Street. They focus on the personal tragedies of financial titans, as if we're trying to make schadenfreude our national pastime. But what's always elusive in these (otherwise satisfying) narratives about motivation is what the analysts were actually trying to accomplish. Exactly what is a credit default derivative?

It's a "fake" currency in the same way that paper money is a fake currency based on real gold. Unfortunately, the new currency is so complex that only a machine could understand it.

It was easy enough for us humans to understand a stick or a dollar bill when it was backed by something tangible somewhere, but only computers can understand and derive a correlation structure from observed collateralized debt obligation tranche spreads. Which leads us to the next question: Just how much of the world's financial stability now lies in the "hands" of computerized trading algorithms?
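For the curious, here's a toy sketch of the simplest such instrument, a credit default swap: essentially insurance on a bond, with a value that derives from whether the underlying debt goes bad. Every number in the code is invented for illustration; none of it comes from Dooling's article:

```python
# Toy credit default swap: the buyer pays an annual premium to insure
# a bond; the seller pays out the loss if the bond defaults. All
# figures are invented for illustration.

def cds_cash_flows(notional, annual_spread, years, default_year=None,
                   recovery_rate=0.4):
    """Return the protection buyer's net cash flow for each year."""
    flows = []
    for year in range(1, years + 1):
        if default_year is not None and year == default_year:
            # Seller covers the loss: notional minus what's recovered.
            flows.append(notional * (1 - recovery_rate) -
                         notional * annual_spread)
            break
        flows.append(-notional * annual_spread)  # premium paid, no default
    return flows

# Insure $10M of bonds at a 2% spread for 5 years; default hits in year 3.
print(cds_cash_flows(10_000_000, 0.02, 5, default_year=3))
# -> [-200000.0, -200000.0, 5800000.0]
```

The trouble starts when thousands of these contracts are pooled, sliced into tranches, and priced off one another; at that point the valuation really does live inside the machines.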

The unfortunate part, Dooling says, is that the Unabomber already made Dooling's case for him--back in 1995:

But we are suggesting neither that the human race would voluntarily turn power over to the machines nor that the machines would willfully seize power. What we do suggest is that the human race might easily permit itself to drift into a position of such dependence on the machines that it would have no practical choice but to accept all of the machines' decisions. ... Eventually a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage the machines will be in effective control. People won't be able to just turn the machines off, because they will be so dependent on them that turning them off would amount to suicide.

Add that to the list of singularity predictions.

Return of the Solar Power Tower

Last week Spectrum Online ran my profile of Andasol 1, a solar thermal power plant that's set to start up in Andalucia with the largest installation built expressly for storing renewable energy: a set of molten salt storage tanks that will hold enough heat energy to run its 50 MW steam turbine for 7.5 hours after dark. This week brought decisive evidence that another solar thermal design that makes even better use of energy storage -- a so-called 'power tower,' in which sunlight is focused on a central tower -- will also have its moment in the Andalucian sun.

The project, dubbed Gemasolar, will employ sun-tracking mirrors covering an area equal to 40 soccer fields to focus light at the top of a roughly 120-meter-high tower. There the sunlight will heat a solar receiver full of molten salt. In contrast, Andasol 1 (like most of the solar thermal plants under construction in the U.S., Spain, North Africa and the Gulf) uses thousands of square meters of trough-shaped mirrors to focus light on a synthetic oil; energy is stored via heat exchangers that transfer the synthetic oil's heat to a molten salt.

One advantage of the power tower is thus obvious: heating salt directly eliminates the need for heat exchangers, reducing installation and operating costs. Another lies in the fortuitous thermodynamics of heating molten salts, whose maximum safe temperature of 565 C is about 165 C higher than the synthetic oil's.
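A back-of-the-envelope calculation shows why that temperature headroom matters. The sketch below is my own, using rounded textbook figures (a nitrate-salt specific heat of about 1.5 kJ/kg·K, a roughly 40 percent steam-cycle efficiency, and assumed temperature swings), not numbers from either project:

```python
# Rough sizing of a molten-salt store for 7.5 hours of 50 MW output.
# Generic figures, for illustration only: salt specific heat
# ~1.5 kJ/(kg*K), steam-cycle efficiency ~40%.

def salt_mass_tonnes(power_mw, hours, delta_t_kelvin,
                     cp_kj_per_kg_k=1.5, efficiency=0.4):
    thermal_joules = power_mw * 1e6 * hours * 3600 / efficiency
    mass_kg = thermal_joules / (cp_kj_per_kg_k * 1e3 * delta_t_kelvin)
    return mass_kg / 1e3  # metric tonnes

# Trough plant: the oil loop limits the salt's swing to roughly 100 K.
print(round(salt_mass_tonnes(50, 7.5, 100)))   # ~22,500 tonnes
# Tower: heating salt directly allows a swing closer to 275 K.
print(round(salt_mass_tonnes(50, 7.5, 275)))   # ~8,200 tonnes
```

Roughly a threefold-smaller salt inventory for the same hours of storage -- the power tower's pitch in a nutshell.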

Sandia National Lab researchers verified these power tower advantages in the second half of the 1990s, but also suffered through a series of operational difficulties. Five years ago the European Commission provided funding for the Gemasolar project (then known as Solar Tres) to demonstrate that the difficulties could be overcome, but the project foundered on legal issues and changes in Spain's renewable energy law. But engineering continued, and this March the project sprang back to life when its lead proponent, Spanish engineering firm Sener, clinched a solar thermal joint venture with Abu Dhabi's alternative energy program.

With Abu Dhabi's deep pockets, Gemasolar's financing just might survive the current financial crisis. Siemens confirmed this week that the tower is moving forward, disclosing that it will supply the steam turbine to convert the tower's solar-generated heat into up to 19 MW of electricity for the Spanish grid.

For further details on Gemasolar, see this frank telling of its origins, design, and goals on Sener's website. For details on a competing power tower design that directly produces steam, see this white paper from Spain's Abengoa Solar.

Space Elevator Engineers Are Set to Meet in Tokyo

That far-out sci-fi staple known as the space elevator is in the news again, among real engineers who take the idea seriously. An organization known as the Japan Space Elevator Association will hold the 1st Japan Space Elevator Conference in Tokyo on 15-16 November. And the attendees should have a lot to discuss.

A piece from CNN ('Space elevator' would take humans into orbit) reports that interest in developing a space elevator has never been higher, with hundreds of engineers and scientists from Asia, Europe, and the Americas working hard to turn the visionary concept into a reality, possibly within a few decades.

The CNN item refers to the challenge of building a cable that would extend from a ground station to an orbiting outpost thousands of miles above as 'an unprecedented feat of human engineering'. Once built and deployed, the tether would theoretically be capable of conveying an attached platform into space.

To learn more about the space elevator concept, please read our cover story from the August 2005 issue of IEEE Spectrum, A Hoist to the Heavens. In it, space scientist Bradley Carl Edwards writes: 'Roomy elevator cars powered by electricity would speed along the cable. For a fraction of the cost, risk, and complexity of today's rocket boosters, people and cargo would be whisked into space in relative comfort and safety.'

Credit for popularizing the idea of a tethered transport system from the earth to space goes to famous science-fiction author Arthur C. Clarke, who died earlier this year. In his 1979 novel The Fountains of Paradise, Clarke fictionalized a system that had recently been put forth by U.S. space scientist Jerome Pearson.

Before he passed away in March, Clarke spoke with Spectrum contributor Saswato R. Das about the prospects of a space elevator from his hospital bed in Colombo, Sri Lanka. In his last published interview, Final Thoughts from Sir Arthur C. Clarke (1917-2008), Clarke told Das that he thought such a space transport system would be "considered equally important" to the breakthroughs brought about by rockets and satellites.

"I'm often asked when I think the space elevator will be built," Clarke told Das with a smile. "My answer is about 10 years after everyone stops laughing. Maybe 20 years. But I am pretty sure that the space elevator is an important element in future space travel."

Last week, the head of the Japan Space Elevator Association, Akira Tsuchida, told CNN that his group is already working with U.S. and European firms on early cable prototypes based on carbon nanotube technology.

"At present we have a tether which is made of carbon nanotube[s], and has one-third or one-quarter of the strength required to make a space elevator. We expect that we will have strong enough cable in the 2020s or 2030s," Tsuchida noted. "Because we don't have a material which has enough strength to construct [a] space elevator yet, it is difficult to change people's mind[s] so they believe that it can be real."

Next month in Tokyo, real scientists and engineers will gather to grapple with the fiction-inspired notion of hoisting people and objects into space along a tether strong enough to leash a planet. It's a far-fetched idea, all right. But if they can inspire one another into producing a few key breakthroughs, they'll have started a process that might eventually change the minds of people around the world.

Déjà? Are Hybrids Already Passé?

Plugs are definitely in vogue at this week's Mondial de l'Automobile in Paris. So where does the hybrid vehicle fit into the picture? It may not, according to Renault. The French carmaker says that electric vehicles, not hybrids, are needed to deliver the emissions reductions that governments and customers demand.

Renault says that it is engineering a pair of battery-powered electric vehicles (EVs), to be produced starting in 2011, that it claims will be cheaper to build, cost markedly less to power, and produce far less carbon dioxide. Today it unveiled a partnership with utility giant Electricité de France to "establish electric cars as a viable and attractive transport solution for consumers."

And Renault is not the only major automaker planning to produce commuter-oriented EVs. Mitsubishi Motors and Daimler both announced plans in Paris last week to accelerate commercialization of small EVs -- Mitsubishi with its i-MiEV minicar and Daimler with a battery version of its popular Smart Fortwo. Volkswagen's promo materials in Paris confirmed that it would join the EV club, producing a tiny commuter EV called the Up! in 2010, with a top speed of 130 kilometers per hour and roughly 100 kilometers of range.

OK, you say, EVs are à la mode. But what of the hybrid option? The question is partly semantic. Hybrid technology is everywhere if you count the mild hybrids, which employ a small but potent battery to save gas by shutting the combustion engine down at a red light and restarting it on the green instead of idling through; some can also recuperate energy during braking by recharging the battery. This technology is going mainstream: Renault competitor PSA Peugeot Citroën said it alone will install 1 million stop-start systems by 2011. VW spokesperson Martin Hube said his company viewed stop-start as just an evolution of internal combustion drive. "You can call it a mild hybrid but it's just a smart technique," says Hube. "That's nothing new."
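For a rough sense of what stop-start alone buys, here's a back-of-the-envelope sketch. Every figure in it is my own illustrative assumption, not a number from PSA or VW:

```python
# Rough fuel saved by stop-start. Assumed, typical-ish figures: a small
# engine idles at ~0.8 L/hour, and an urban commuter spends ~10 minutes
# a day stopped at lights, 250 driving days a year.
idle_litres_per_hour = 0.8
idle_minutes_per_day = 10
driving_days_per_year = 250

saved = idle_litres_per_hour * (idle_minutes_per_day / 60) * driving_days_per_year
print(f"~{saved:.0f} litres saved per car per year")  # ~33 litres
```

Modest per car, but multiplied by a million installed systems it adds up -- which is presumably PSA's math too.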

No automaker questions whether full hybrids -- cars like the Prius or GM's plug-in Chevy Volt that can drive on either electricity or gasoline -- are something new. But while several showed full hybrid concept cars in Paris, fewer talked up plans to build one. Perhaps they've made the same calculation as Renault: it's not worth the trouble to cram high-energy motors, batteries, and an engine into one vehicle when you can go straight to the full EV instead.

Midwest Insurance Company Excludes Nanotechnology from its Policies

I have to admit that I saw this tidbit a week or two ago over at Nanodot and found it so outlandish that I thought it fell into the too-ridiculous-to-comment category.

But people kept sending me links to the news story, usually accompanied by some slack-jawed, bewildered comment.

It is bewildering. First, who is this Des Moines, IA-based Continental Western Insurance Group? I have never heard of the insurer, but I am not a Midwest farm. If someone would like to enlighten me as to the nanoparticle producers they currently insure (or should I say, used to insure), I would welcome the information.

Second, excluding "nanotechnology"?! Okay, you could make a poorly informed decision that takes hearsay over science, concluding that nanoparticles -- or, more precisely, carbon nanotubes -- have exhibited some similarities to asbestos, albeit with research still inconclusive. But nanotechnology?

What is that supposed to be exactly? Will that include STMs and AFMs, key tools in nanotechnology? Will that include the GMR effect used in your computer so you can store 100 gigabytes of family photos?

I have to commend the Nanobusiness Alliance for being extremely restrained in its response:

We believe the decision to exclude "nanotubes and nanotechnology" was not well thought out. Treating nanotechnology as if it is monolithic makes no sense. A technology itself does not have risks and benefits -- only the embodiments of the technology in the form of products do. Furthermore, the definitions were sufficiently broad that almost any business [could] be subject to the exclusion. This is the first exclusion. We hope that it will be reconsidered or pulled back altogether once the insurer understands the implications of the general-purpose exclusion they created. But we must also educate insurers so that they do not make ill-informed policy like this in the future.

The Nanobusiness Alliance is absolutely correct, and at the same time generous to a fault. Instead, I am afraid this is just a further example of how a small seed of misinformation can lead to dangerous stupidity.

The question I can't seem to resolve is what the point of the announcement was. I keep pondering what possible purpose it served: giving notice to the Midwest nanoparticle industry not to knock on Continental Western's door when looking for a policy? Or demonstrating to its current customers what a forward-thinking, risk-averse trailblazer the company is?

If it's the former, well, I am not sure that they are turning away much business, and the little they do turn away will find the insurance it needs elsewhere. If it's the latter, it's probably a safe bet that those current customers didn't know about nanotechnology, never mind care about its toxicological issues.

Bewildering, indeed.

Keeping score in the digital cinema game: the virtual print fee is winning by a landslide


Digital cinema technology has been viable for several years; the problem has been getting it into theaters. It's not that theater owners, for the most part, wouldn't love to trade in their film projectors; it's that converting a multiplex to digital is an expensive operation: about $70,000 a screen.

In the December 2006 issue of IEEE Spectrum, author Russell Wintner described a creative solution to this dilemma: a deal among the vendors of digital equipment, the movie studios, and the theaters, in which the vendors would provide the equipment to the theaters at no charge and would be reimbursed by fees paid by the movie studios each time they load a digital file of a movie onto a theater's system. Wintner termed this charge a virtual print fee. An interesting idea at the time, but would anyone sign on? Wintner predicted that they would.

And indeed, they have. Wintner's group, AccessIT, signed four studios this spring--Disney, Fox, Paramount, and Universal--and is busy converting 10,000 North American screens to digital (AccessIT had already installed systems for projecting bits onto 4,000 screens under a studio-backed virtual-print deal). And last week a consortium of three of the largest theater chains, Digital Cinema Implementation Partners (DCIP), announced that it had put together a financing package that will fund converting 20,000 North American screens to digital, and signed on five studios: Lions Gate Entertainment, Paramount Pictures, Twentieth Century Fox, Universal Pictures, and Walt Disney Co. Again, the studios will pay a virtual print fee.

Later last week, Sony Corp., partnering with Paramount and Twentieth Century Fox, separately announced that it would convert 9,000 screens in North America, Europe, and Asia. Again, the conversion will be financed by a virtual print fee. All the consortia estimate this fee at between $700 and $1,000.
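The payback arithmetic behind these deals is simple. Here's a sketch using the figures quoted above; the assumption that the fee is collected once per title booked on a screen is mine, for illustration:

```python
# How many fee-bearing bookings recoup one ~$70,000 screen conversion,
# at the $700-$1,000 virtual print fees the consortia are estimating?
conversion_cost = 70_000  # dollars per screen, from the article

for fee in (700, 850, 1_000):
    bookings = conversion_cost / fee
    print(f"${fee} fee: ~{bookings:.0f} bookings to pay off one screen")

# If a screen books ~20 new titles a year, that's roughly 3.5-5 years.
```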

As far as Wintner is concerned, the more of these deals that happen, the better. "It means," he says, "the initiative I helped start in 1996 has succeeded with a commitment to replace 35mm film in all theatres across the U.S. and Canada (a total of about 38,000 screens)."

Next up for Wintner? Taking advantage of digital cinema's 3D capability. "These announcements," he says, "will add enormous momentum to 3D initiatives."

US Army plans to build 500 MW solar thermal plant

The U.S. Department of Defense, as we reported this month, has become the home of several very large-scale renewable energy projects. The reasons are simple: the military owns lots of empty land, it has complete jurisdiction over that territory, and its energy needs are insatiable. Now the U.S. Army, which to date has lagged the Air Force and the Navy in its energy initiatives, has announced plans to build a 500-megawatt solar thermal plant at Fort Irwin, in California. The Mojave Desert, an empty and hot place, has long been the home of solar thermal activity in the United States, in large part because it receives some of the strongest solar radiation in the world. The Army also reaffirmed its interest in a 30-megawatt geothermal power plant at Hawthorne Army Depot, using geothermal research from the Navy.
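A quick back-of-the-envelope suggests why all that empty land matters. The land-use figure below is a generic assumption of mine (roughly 2 hectares per megawatt for trough-style plants), not an Army number:

```python
# Rough land footprint of a 500 MW solar thermal plant, assuming a
# generic ~2 hectares per megawatt for trough-style plants.
plant_mw = 500
hectares = plant_mw * 2
print(f"~{hectares} ha, or ~{hectares / 100:.0f} km^2 "
      f"(~{hectares * 2.47:,.0f} acres)")  # ~10 km^2
```

Ten square kilometers is a sliver of a desert installation the size of Fort Irwin, and a siting headache almost anywhere else.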

The Army's endeavor marks the military's first foray into solar thermal. The plant will be about equal in size to the Mojave Solar Park 1, which is being developed by Solel Solar Systems and is expected to be operational in 2011. However, contrary to what this CNET article reports, the Army's solar power plant will not "eclipse today's largest U.S. solar thermal installation of 14 megawatts at Nellis Air Force Base" -- that solar installation, though large, is photovoltaic. For more on the Nellis photovoltaic field and other military energy projects, check out this slide show.

Out of Africa: the sky is the limit

Mobile phones are all the rage in Africa, but their success should not obscure an uncomfortable reality: Internet access remains scarce and too costly.

The solution is neither clear nor inexpensive. Two problems are critical. First, there need to be better communications links within and between African countries. Second, the African continent must have stronger links with the rest of the world.

Undersea cables, coming on stream, seem likely to solve the second problem. The first problem is more nettlesome, though bright minds envision an answer in the sky.

Satellites ought to do the trick, say Google and a communications innovator, Greg Wyler, whom the search-engine company is supporting.

The effort by Wyler's O3b Networks, which would involve 16 satellites, is expensive -- $700 million by one reckoning. There's also the question of whether the approach is commercially viable, or would require long-term subsidies from outside donors.

Definitive answers will not come quickly. The task of "wiring" Africa -- amid all the hoopla over the penetration of mobile phones in the poorest parts of the world -- remains daunting. And yet without greater Internet usage, the information economy in Africa will suffer gravely.

Exascale supercomputers: Can't get there from here?

Today Darpa released a report I've been hearing about for months concerning whether and how we could make the next big leap in supercomputing: exascale computing, a 1000x increase over today's machines. Darpa was particularly interested in whether it could be done by 2015.

With regard to whether it could be done by 2015, the answer, according to my read of the executive summary, is a qualified no.

In its own words, here's what the study was after:

The objectives given the study were to understand the course of mainstream computing technology, and determine whether or not it would allow a 1,000X increase in the computational capabilities of computing systems by the 2015 time frame. If current technology trends were deemed as not capable of permitting such increases, then the study was also charged with identifying where were the major challenges, and in what areas may additional targeted research lay the groundwork for overcoming them.

The study was led by Peter Kogge, an IEEE Fellow and professor of computer science and engineering at the University of Notre Dame. (We'll be talking to him next week about the study for further coverage in IEEE Spectrum.) And it had contributions from some of the profession's leading lights, including Stanford's William Dally, HP's Stanley Williams, Micron's Dean Klein, Stanford's Kunle Olukotun, Georgia Tech's Rao Tummala, Intel's Jim Held, and Berkeley's Katherine Yelick (whom I include in this list not because I know who she is, but because she lectured about the "Berkeley Dwarfs").

Darpa's helpers seem to have come to the conclusion that current technology trends will not allow for exascale computing. That's summed up pretty neatly in this graph, which clearly shows the trend line in computer performance undershooting exascale in 2015 by an appreciable amount:

[Graph: projected supercomputer performance, in Gflops, undershooting exascale in 2015]

The group found four areas where "current technology trends are simply insufficient" to get to exascale. The first, and what they deemed the most pervasive, was energy and power. The Darpa group was unable to come up with any combination of mature technologies that could deliver exascale performance at a reasonable power level:

[Graph: projected energy efficiency, in Gflops per watt, falling well short of exascale requirements]

The key, they found, is the power needed not to compute but to move data. Data has to travel over interconnects, and the study found that even with some really cool emerging technology it still costs 1 to 3 picojoules for a bit to cross just one interconnect level (like from chip to board or board to rack). Scale that up and you're talking 10 to 30 MW (167 000 to 500 000 60-watt light bulbs) per level. Eeesh.
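Here's how that scaling works out. The sketch below is my own reconstruction; the traffic figure of 10^19 bits per second per level is an illustrative assumption (roughly ten bits of operand movement per floating-point operation at an exaflop), not the report's own traffic model:

```python
# Reconstructing the 10-30 MW per interconnect level. Assume an
# exascale machine (1e18 flop/s) moves ~10 bits per flop across each
# level -- an illustrative guess about the traffic, for scale only.
bits_per_second = 1e18 * 10

for pj_per_bit in (1, 3):
    watts = bits_per_second * pj_per_bit * 1e-12  # picojoules -> joules
    print(f"{pj_per_bit} pJ/bit -> {watts / 1e6:.0f} MW, "
          f"or ~{watts / 60:,.0f} sixty-watt bulbs")
# 1 pJ/bit -> 10 MW (~167,000 bulbs); 3 pJ/bit -> 30 MW (~500,000)
```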

The other three problems are memory and storage (how to manage a billion 1-GB DRAM chips), concurrency and locality (how to write a program that can handle a billion threads at once), and resiliency (how to prevent and recover from crashes).

These are equally interesting, but the power problem is, I think, what much of today's computing work really boils down to. Solve that, and things will look a lot sunnier for everything from high-performance computing to embedded sensors.

The full (297 page) Darpa Exascale Computing report is here.

(In the November issue of IEEE Spectrum, watch for a cool simulation that Sandia computer architects did to show another bump in the road to future supercomputers. Their simulations show that as the multicore phenomenon advances in the processor industry, some very important applications will start performing worse.)

Nuclear waste imports can wait

Last July our Sally Adee brought you a story on the controversy over a Utah company's plan to import 18 000 metric tons of Italian nuclear waste into the United States and (after some difficult-to-understand process) dump some of it in Utah.

The Wall Street Journal reports that the Nuclear Regulatory Commission has decided to delay its decision on whether the importation can proceed. The NRC is going to sit on its hands until a federal court hears a related case--some time next year.

The delay, says the Journal, gives a boost to a bill that would ban nuclear waste imports (unless they were defense-related). The legislation is currently stuck in committee.
