Tech Talk

Space Elevator Engineers Are Set to Meet in Tokyo

That far-out sci-fi staple known as the space elevator is in the news again, this time among real engineers who take the idea seriously. The Japan Space Elevator Association will hold the 1st Japan Space Elevator Conference in Tokyo on 15-16 November. And the attendees should have a lot to discuss.

A piece from CNN ('Space elevator' would take humans into orbit) reports that interest in developing a space elevator has never been higher, with hundreds of engineers and scientists from Asia, Europe, and the Americas working hard to turn the visionary concept into a reality, possibly within a few decades.

The CNN item refers to the challenge of building a cable that would extend from a ground station to an orbiting outpost thousands of miles above as 'an unprecedented feat of human engineering'. Once built and deployed, the tether would theoretically be capable of conveying an attached platform into space.

To learn more about the space elevator concept, please read our cover story from the August 2005 issue of IEEE Spectrum, A Hoist to the Heavens. In it, space scientist Bradley Carl Edwards writes: 'Roomy elevator cars powered by electricity would speed along the cable. For a fraction of the cost, risk, and complexity of today's rocket boosters, people and cargo would be whisked into space in relative comfort and safety.'

Credit for popularizing the idea of a tethered transport system from the earth to space goes to famous science-fiction author Arthur C. Clarke, who died earlier this year. In his 1979 novel The Fountains of Paradise, Clarke fictionalized a system that had recently been put forth by U.S. space scientist Jerome Pearson.

Before he passed away in March, Clarke spoke with Spectrum contributor Saswato R. Das about the prospects of a space elevator from his hospital bed in Colombo, Sri Lanka. In his last published interview, Final Thoughts from Sir Arthur C. Clarke (1917-2008), Clarke told Das that he thought such a space transport system would be "considered equally important" to the breakthroughs brought about by rockets and satellites.

"I'm often asked when I think the space elevator will be built," Clarke told Das with a smile. "My answer is about 10 years after everyone stops laughing. Maybe 20 years. But I am pretty sure that the space elevator is an important element in future space travel."

Last week, the head of the Japan Space Elevator Association, Akira Tsuchida, told CNN that his group is already working with U.S. and European firms on early cable prototypes based on carbon nanotube technology.

"At present we have a tether which is made of carbon nanotube[s], and has one-third or one-quarter of the strength required to make a space elevator. We expect that we will have strong enough cable in the 2020s or 2030s," Tsuchida noted. "Because we don't have a material which has enough strength to construct [a] space elevator yet, it is difficult to change people's mind[s] so they believe that it can be real."

Next month in Tokyo, real scientists and engineers will gather to grapple with the fiction-inspired notion of hoisting people and objects into space along a tether strong enough to leash a planet. It's a far-fetched idea, all right. But if they can inspire one another into producing a few key breakthroughs, they'll have started a process that might eventually change the minds of people around the world.

Déjà? Are Hybrids Already Passé?

Plugs are definitely in vogue at this week's Mondial de l'Automobile in Paris. So where does the hybrid vehicle fit into the picture? It may not, according to Renault. The French carmaker says that electric vehicles, not hybrids, are needed to deliver the emissions reductions that governments and customers demand.

Renault says that it is engineering a pair of battery-powered electric vehicles (EVs), to be produced starting in 2011, that it claims will be cheaper to build, cost markedly less to power, and produce far less carbon dioxide. Today it unveiled a partnership with utility giant Électricité de France to "establish electric cars as a viable and attractive transport solution for consumers."

And Renault is not the only major automaker planning to produce commuter-oriented EVs. Mitsubishi Motors and Daimler both announced plans in Paris last week to accelerate commercialization of small EVs -- Mitsubishi with its i-MiEV minicar and Daimler with a battery version of its popular Smart Fortwo. Volkswagen's promo materials in Paris confirmed it would join the EV club, producing a tiny commuter EV called the Up! in 2010 with a top speed of 130 kilometers per hour and roughly 100 km of range.

OK, you say. EVs are à la mode. But what of the hybrid option? The question is partly semantic. Hybrid technology is everywhere if you count the mild hybrids, which employ a small but potent electric battery to save gas by restarting the combustion engine at a green light instead of idling through the red; some can also recuperate energy during braking by recharging their battery. This technology is going mainstream: Renault competitor PSA Peugeot Citroën said it alone will install 1 million stop-start systems by 2011. VW spokesperson Martin Hube said his company viewed stop-start as just an evolution of internal combustion drive. "You can call it a mild hybrid, but it's just a smart technique," says Hube. "That's nothing new."

No automaker questions whether full hybrids like the Prius or GM's plug-in Chevy Volt that can drive on either electricity or gasoline are something new. But while several showed full hybrid concept cars in Paris, fewer talked up plans to build one. Perhaps they've made the same calculation as Renault: it's not worth the trouble to cram high-energy motors, batteries and an engine into a vehicle when one can go straight to the full EV instead.

Midwest Insurance Company Excludes Nanotechnology from its Policies

I have to admit that I saw this tidbit a week or two ago over at Nanodot and found it so outlandish that I thought it fell into the too-ridiculous-to-comment category.

But people kept sending me links to the news story, usually accompanied by some slack-jawed, bewildered comment.

It is bewildering. First, who is this Des Moines, Iowa-based Continental Western Insurance Group? I have never heard of the insurer, but I am not a Midwest farm. If someone would like to enlighten me as to which nanoparticle producers it currently insures (or should I say, used to insure), I would welcome the information.

Second, excluding "nanotechnology"?! Okay, you could make a poorly informed decision, taking hearsay over science, that nanoparticles, or more precisely carbon nanotubes, have exhibited some similarities to asbestos, albeit with the research still inconclusive. But nanotechnology?

What is that supposed to be exactly? Will that include STMs and AFMs, key tools in nanotechnology? Will that include the GMR effect used in your computer so you can store 100 gigabytes of family photos?

I have to commend the Nanobusiness Alliance for being extremely restrained in its response:

We believe the decision to exclude "nanotubes and nanotechnology" was not well thought out. Treating nanotechnology as if it is monolithic makes no sense. A technology itself does not have risks and benefits -- only the embodiments of the technology in the form of products do. Furthermore, the definitions were sufficiently broad that almost any business could be subject to the exclusion. This is the first exclusion. We hope that it will be reconsidered or pulled back altogether once the insurer understands the implications of the general-purpose exclusion it created. But we must also educate insurers so that they do not make ill-informed policy like this in the future.

The Nanobusiness Alliance is absolutely correct and at the same time generous to a fault. Instead, I am afraid this is just a further example of how just a small seed of misinformation can lead to dangerous stupidity.

The question I can't seem to resolve is what the point of the announcement was. I keep pondering what possible purpose it served: giving notice to the Midwest nanoparticle industry not to knock on Continental Western's door when looking for a policy? Or demonstrating to its current customers what a forward-thinking, risk-averse trailblazer the company is?

If it's the former, well, I am not sure that they are turning away much business, and the little they do turn away will find the insurance it needs. If it's the latter, it is probably a safe bet that the current customers didn't know about nanotechnology, never mind care about its toxicological issues.

Bewildering, indeed.

Keeping score in the digital cinema game: the virtual print fee is winning by a landslide


Digital cinema technology has been viable for several years; the problem has been getting it into theaters. It's not that theater owners, for the most part, wouldn't love to trade in their film projectors; it's that converting a multiplex to digital is an expensive operation: about $70,000 a screen.

In the December 2006 issue of IEEE Spectrum author Russell Wintner described a creative solution to this dilemma: a deal between the vendors of digital equipment, the movie studios, and the theaters in which the vendors would provide the equipment to the theaters at no charge, and would be reimbursed by fees paid by the movie studios when they load digital files of movies onto the theater systems. Wintner termed this charge a virtual print fee. An interesting idea at the time, but would anyone sign on? Wintner predicted that they would.

And indeed, they have. Wintner's group, AccessIT, signed four studios this spring -- Disney, Fox, Paramount, and Universal -- and is busy converting 10,000 North American screens to digital (AccessIT had already installed systems for projecting bits onto 4000 screens under a studio-backed virtual-print deal). And last week a consortium of three of the largest theater chains, Digital Cinema Implementation Partners (DCIP), announced that it had put together a financing package that will fund converting 20,000 North American screens to digital and had signed on five studios: Lions Gate Entertainment, Paramount Pictures, Twentieth Century Fox, Universal Pictures, and Walt Disney Co. Again, the studios will pay a virtual print fee.

Later last week Sony Corp., partnering with Paramount and Twentieth Century Fox, separately announced that it would convert 9000 screens in North America, Europe, and Asia. Again, the conversion will be financed by a virtual print fee. All the consortia are estimating this fee to be between $700 and $1000.
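The virtual-print-fee arithmetic is easy to sanity-check. A minimal sketch, assuming the figures reported above -- roughly $70,000 to convert a screen and a fee of $700 to $1000 per digital "print" delivered -- with everything else (that the fee accrues entirely against the conversion cost) being my simplifying assumption:

```python
# Back-of-the-envelope break-even for the virtual-print-fee model.
# Assumed figures (from the article): ~$70,000 conversion cost per screen,
# and a per-delivery fee estimated at $700-$1000.

CONVERSION_COST_PER_SCREEN = 70_000  # dollars
FEE_LOW, FEE_HIGH = 700, 1_000       # dollars per virtual "print"

# Fee-paying deliveries needed to recoup one screen's conversion cost:
deliveries_at_high_fee = CONVERSION_COST_PER_SCREEN / FEE_HIGH  # 70
deliveries_at_low_fee = CONVERSION_COST_PER_SCREEN / FEE_LOW    # 100

print(f"Break-even after {deliveries_at_high_fee:.0f} to "
      f"{deliveries_at_low_fee:.0f} digital deliveries per screen")
```

In other words, on the order of 70 to 100 fee-paying releases per screen pay off the hardware, which helps explain why the studios rather than the theaters foot the bill.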

As far as Wintner is concerned, the more of these deals that happen, the better. "It means," he says, "the initiative I helped start in 1996 has succeeded, with a commitment to replace 35mm film in all theaters across the U.S. and Canada (a total of about 38,000 screens)."

Next up for Wintner? Taking advantage of digital cinema's 3D capability. "These announcements," he says, "will add enormous momentum to 3D initiatives."

US Army plans to build 500 MW solar thermal plant

The U.S. Department of Defense, as we reported this month, has become the home of several very large-scale renewable energy projects. The reasons are simple: the military owns lots of empty land, it has complete jurisdiction over that territory, and its energy needs are insatiable. To that end, the U.S. Army, which to date has lagged the Air Force and the Navy in its energy initiatives, has just announced plans to build a 500-megawatt solar thermal plant at Fort Irwin, in California. The Mojave desert, an empty and hot place, has long been the home of solar thermal activity in the United States, in large part because it receives some of the strongest solar radiation in the world. The Army also reaffirmed its interest in a 30-megawatt geothermal power plant at Hawthorne Army Depot, using geothermal research from the Navy.

The Army's endeavor marks the military's first foray into solar thermal. The plant will be about equal in size to the Mojave Solar Park 1, which is being developed by Solel Solar Systems and is expected to be operational in 2011. However, contrary to what this CNET article reports, the Army's solar power plant will not "eclipse today's largest U.S. solar thermal installation of 14 megawatts at Nellis Air Force Base" -- that solar installation, though large, is photovoltaic. For more on the Nellis photovoltaic field and other military energy projects, check out this slide show.

Out of Africa: the sky is the limit

Mobile phones are all the rage in Africa, but their success should not obscure an uncomfortable reality: Internet access remains relatively scarce and too costly.

The solution is neither clear nor inexpensive. Two problems are critical. First, there need to be better communications links within and between African countries. Second, the African continent must have stronger links with the rest of the world.

Undersea cables, coming on stream, seem likely to solve the second problem. The first problem is more nettlesome, though bright minds envision an answer in the sky.

Satellites ought to do the trick, say Google and a communications innovator, Greg Wyler, whom the search-engine company is supporting.

The effort by Wyler's O3b Networks, which would involve 16 satellites, is expensive -- $700 million by one reckoning. There's also the question of whether the approach is commercially viable or would require long-term subsidies from outside donors.

Definitive answers will not come quickly. The task of "wiring" Africa -- amid all the hoopla over the penetration of mobile phones in the poorest parts of the world -- remains daunting. And yet without greater Internet usage, the information economy in Africa will suffer gravely.

Exascale supercomputers: Can't get there from here?

Today Darpa released a report I've been hearing about for months concerning whether and how we could make the next big leap in supercomputing: exascale computing, a 1000x increase over today's machines. Darpa was particularly interested in whether it could be done by 2015.

With regard to whether it could be done by 2015, the answer, according to my read of the executive summary, is a qualified no.

In its own words, here's what the study was after:

The objectives given the study were to understand the course of mainstream computing technology, and determine whether or not it would allow a 1,000X increase in the computational capabilities of computing systems by the 2015 time frame. If current technology trends were deemed as not capable of permitting such increases, then the study was also charged with identifying where were the major challenges, and in what areas may additional targeted research lay the groundwork for overcoming them.

The study was led by Peter Kogge, an IEEE Fellow and professor of computer science and engineering at the University of Notre Dame. (We'll be talking to him next week about the study for further coverage in IEEE Spectrum.) And it had contributions from some of the profession's leading lights, including Stanford's William Dally, HP's Stanley Williams, Micron's Dean Klein, Stanford's Kunle Olukotun, Georgia Tech's Rao Tummala, Intel's Jim Held, and Katherine Yelick (whom I include in this list not because I know who she is, but because she lectured about the "Berkeley Dwarfs").

Darpa's helpers seem to have come to the conclusion that current technology trends will not allow for exascale computing. That's summed up pretty neatly in this graph, which clearly shows that the trend line in computer performance undershoots exascale in 2015 by an appreciable amount:


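The shortfall is easy to illustrate with a toy extrapolation. The starting point (about 1 petaflops in 2008, roughly when the first petascale machines appeared) and the 18-month doubling period are my assumptions for illustration, not numbers taken from the report:

```python
# Illustrative only: why a business-as-usual trend undershoots exascale by 2015.
# Assumptions (mine, not the Darpa report's): ~1 petaflops peak performance in
# 2008, doubling every 18 months thereafter.

start_year, start_pflops = 2008, 1.0
doubling_period_years = 1.5

years_elapsed = 2015 - start_year  # 7 years
projected_pflops = start_pflops * 2 ** (years_elapsed / doubling_period_years)

# Exascale = 1000 petaflops; the trend lands in the tens of petaflops.
print(f"~{projected_pflops:.0f} petaflops projected for 2015, "
      f"vs. 1000 petaflops needed for exascale")
```

Under those assumptions the trend delivers only a few dozen petaflops by 2015, more than an order of magnitude short of an exaflop, which matches the report's qualified "no."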
The group found four areas where "current technology trends are simply insufficient" to get to exascale. The first and what they deemed the most pervasive was energy and power. The Darpa group was unable to come up with any combination of mature technologies that could deliver exascale performance at a reasonable power level:


The key, they found, is the power needed not to compute but to move data. Data needs to move over interconnects, and they found that even using some really cool emerging technology, it still costs 1-3 picojoules for a bit to go through just one interconnect level (like from chip to board or board to rack). Scale that up and you're talking 10-30 MW (167 000 to 500 000 60-watt light bulbs) per level. Eeesh.
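The jump from picojoules per bit to megawatts is worth making explicit. A back-of-the-envelope sketch, assuming (my assumption, chosen to reproduce the 10-30 MW figure, not a number quoted from the report) that an exascale machine pushes on the order of 10^19 bits per second across each interconnect level:

```python
# Rough reconstruction of the data-movement power estimate.
# Assumption (not stated in the post): aggregate traffic of ~1e19 bits
# per second crossing each interconnect level.

PJ = 1e-12                  # joules per picojoule
bits_per_second = 1e19      # assumed traffic per interconnect level

for pj_per_bit in (1, 3):   # the report's 1-3 pJ/bit range
    watts = pj_per_bit * PJ * bits_per_second
    print(f"{pj_per_bit} pJ/bit -> {watts / 1e6:.0f} MW "
          f"(~{watts / 60:,.0f} 60-watt light bulbs)")
```

At 1 pJ/bit that works out to 10 MW per level (about 167 000 light bulbs), and at 3 pJ/bit to 30 MW (about 500 000), matching the figures above.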

The other three problems are memory storage (how to handle 1 billion 1-GB DRAM chips), concurrency and locality (how to write a program that can handle a billion threads at once), and resiliency (how to prevent and recover from crashes).

These are equally interesting, but the power problem is, I think, what much of today's computing work is really boiling down to. Solve that, and things will look a lot sunnier for everything from high performance computing to embedded sensors.

The full (297 page) Darpa Exascale Computing report is here.

(In the November issue of IEEE Spectrum, watch for a cool simulation that Sandia computer architects did to show another bump in the road to future supercomputers. Their simulations show that as the multicore phenomenon advances in the processor industry, some very important applications will start performing worse.)

Nuclear waste imports can wait

Last July, our Sally Adee brought you a story on the controversy over a Utah company's plan to import 18 000 metric tons of Italian nuclear waste into the United States and (after some difficult-to-understand process) dump some of it in Utah.

The Wall Street Journal reports that the Nuclear Regulatory Commission has decided to delay its decision on whether or not the importation can proceed. The NRC is going to sit on its hands until a federal court hears a related case -- some time next year.

The delay, says the Journal, gives a boost to a bill that would ban nuclear waste imports (unless they were defense-related). The legislation is currently stuck in committee.

Physics Nobel for why the Big Bang wasn't a big bust

From our intrepid intern, Monica Heger:

The Nobel Prize in physics was awarded today for discoveries in subatomic physics. Yoichiro Nambu, from the Enrico Fermi Institute at the University of Chicago, won half the award for his discovery of the mechanism of spontaneous broken symmetry in subatomic physics. Two Japanese physicists, Makoto Kobayashi from the High Energy Accelerator Research Organization and Toshihide Maskawa from the Yukawa Institute for Theoretical Physics at Kyoto University, split the other half of the award for their discovery of the origin of broken symmetry, which predicts the existence of at least three families of quarks, a fundamental particle.

Broken symmetry lies behind the very nature of our existence. At the time of the Big Bang, if equal amounts of matter and antimatter were created, they theoretically would have destroyed each other. Instead, that symmetry was broken, allowing for the existence of our universe. Scientists still do not know how that symmetry was broken.

The three Nobel winners all explained broken symmetry within the framework of the existing laws of physics. Kobayashi and Maskawa were able to do this only by expanding the quark model to at least three families of quarks. The quarks they described in 1972 have only recently been observed in particle accelerators.

Marching to the Beat of a Different Drummer in Nanotech

Andrew Maynard, in his latest blog post, presents one of the stronger metaphors I have seen to date for the state of dialogue (or lack thereof) on the future and direction of nanotech.

Maynard likens the current discourse to that latest social phenomenon, the "silent rave," in which everyone shows up at the same place but listens to his or her own iPod.

The nanotechnology meetings to which Maynard draws his comparison consist of scientists, policy makers, industry leaders, and NGOs, just to name the main groups, and they are all marching to the beat of different drummers.

What Maynard seems loath to point out is that there may actually be a qualitative difference between the drummers or, to follow his metaphor, the songs. Maybe Ringo Starr was a better drummer than Pete Best.

After reading TNTLog's account of a recent experience at another stakeholder consultation group intended to produce "Fruitful Dialogue," one wonders how fruitful these dialogues can be when one or more of the groups clearly has absolutely no idea what it is talking about.

Is it possible to step in and pull the plug on the iPods of the clearly misinformed? Probably not. The thought that some ideas and opinions are just bogus has become so anathema to "reasonable" people that we have to endure the nonsense, or noise, and hope that the more pleasant notes come to the fore.

Unfortunately, hoping for something to happen doesn't mean that it will.


Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
