Tech Talk

Apple's bad battery news could be good news for Boston Power

In the newspaper last week I read about Apple's admission that some first-generation iPod nanos are overheating due to battery problems. The announcement came in the wake of a Japanese government report that credited overheating in first-generation nanos with causing three fires, two light burn injuries, and twelve damaged cases. The list included one iPod nano that scorched a tatami mat back in January and a second unit that burned sheets of paper this month.

In the mail last week I got an invitation to a party celebrating the fact that Boston Power is shipping its first generation of lithium-ion batteries.

Conclusion: If timing is everything, Boston Power has great timing. This is the startup company I wrote about in the March issue of IEEE Spectrum that began looking for money to fund development of a safer, more reliable lithium-ion battery just before the big Sony battery recalls of 2006. It probably would have gotten funded eventually without battery flameouts being in the news, but Sony's problems certainly didn't hurt.

Now, after keeping a fairly low profile all year, Boston Power is about to send lithium-ion batteries to as-yet-unidentified laptop manufacturers, who will quickly be installing them in computers and shipping them out to consumers. The company also announced that product developers can order batteries for evaluation from its website.

Which is why the nano's troubles are good news for Boston Power. There's nothing like a scorched tatami mat or two to make no-meltdown technology just that much more attractive.

Out of Africa: death of the digital divide

Is the "digital divide," one of the most popular technology buzz terms of the decade, dead?

The question was posed to me last night by Eric Osiakwan, an old friend and Internet promoter from Accra, Ghana. Osiakwan is visiting the U.S. to attend a gathering of global geeks convened by Harvard's Berkman Center next week. As soon as we sat down for beers and pizza in Berkeley's Jupiter cafe, he asked me whether I had seen the confession by Jeffrey Sachs, the economist, in the London Guardian.

I had not, so Eric found the piece on my iPod. Sure enough, Sachs was admitting that for too long he had underplayed the importance of information technology -- computing, communications, and the Internet -- in reducing poverty in Africa.

"The digital divide is beginning to close," Sachs opined.

"Extreme poverty is almost synonymous with extreme isolation, especially rural isolation. But mobile phones and wireless internet end isolation, and will therefore prove to be the most transformative technology of economic development of our time," Sachs added.

"The digital divide is ending not through a burst of civic responsibility, but mainly through market forces," he continued. "Mobile phone technology is so powerful, and costs so little per unit of data transmission, that it has proved possible to sell mobile phone access to the poor."

At this point Osiakwan beamed proudly, but then made an important critical point: on a global scale the digital divide is closing, but within African countries the divide remains -- and may even be worsening, as in many places the gulf between rich and poor is widening.

Another major issue, Osiakwan told me, is the shifting nature of computing and communications. Five years ago, the experts thought the computer would be the engine of networking, even in very poor parts of Africa. Great and expensive initiatives, often backed by governments and charities, arose with the aim of bringing computers to the African masses.

Computers of course remain important in Africa. But two factors changed the equation. First, used computers began to proliferate in African cities. These machines often cost as little as $150, and they are fully functional desktop machines, effective even though they typically run a generation-old operating system. Even laptop computers can be found for $250; again, they are used but in good condition.

These aging computers can no longer be sold in Europe and the U.S., but they are sought after in Africa, and for good reason: they cost one-fifth to one-tenth as much as a new machine.

The second factor dislodging the computer from center stage is the rapid rise of the mobile phone. Phones are becoming more powerful, bringing the convergence of the Internet and mobility ever closer.

As we dove into our pizzas, Osiakwan admitted to me that he never anticipated the sudden ascent of mobile phones -- or the relative lack of excitement about computers today in Africa.

"Mobile phones are where the action is," he said, "but the Internet remains the foundation for the new information society arising in Africa. Without the Internet, the phone would only be for talking."

And, as we all know, talk is cheap.

Engineering a Better Olympic Athlete

During NBC's broadcast of the Closing Ceremony last night, commentator Joshua Cooper Ramo took note of the progress Chinese athletes have made in recent years and referred to China's nationalized sports program as "an engineering project." Fellow commentator Bob Costas was quick to agree, pointing to the country's focus on developing elite athletes from early childhood while paying scant attention to the physical fitness of those of its children who do not show precocious potential as future Olympians.

Not surprisingly, China won the most gold medals (51) of the Games in Beijing. As with many things developed in a Communist state, those medals were the result of a good deal of planning and long-term follow-up. Nationalized sports programs are nothing new in the Olympics. The Soviet Union dominated the Games for years with cradle-to-medal-podium training regimens. But China's athletics "engineering project" still raises questions about the fairness of big, state-controlled programs competing against those of smaller, free-market nations. In other words, were these Games fair for all? Probably not, but the world is not a perfect place.

The bigger question, though, lies implicit in the comments of the NBC commentators: Can a nation engineer premier Olympic athletes, as if they were automobiles or aircraft?

The answer to that appears to be yes. And there are fears now that unscrupulous administrators of sports programs in years ahead could use highly sophisticated methods to give their protégés enhancements that go well beyond those that come from selective recruitment and nationally subsidized training.

In the most controversial (and illegal) example, a practice called gene doping, the medical techniques used to manipulate genetic material for therapeutic purposes are subverted by corrupt physicians to enhance the physical makeup of athletes. In a report from the American Association for the Advancement of Science (see Strong New Measures Against Gene Doping in Sports Urged at Conference Co-Sponsored by AAAS), gene therapy researchers urged participants at a recent meeting of the World Anti-Doping Agency (WADA) held in St. Petersburg, Russia, to advocate for stricter attention to the threat of gene doping in sports.

Gene doping could be used to modify an athlete's own genes to increase muscle mass or boost red blood cell production, for example.

"Science has moved so quickly in gene therapy and because it moves so quickly, it makes the non-therapeutic use of these kinds of methods much more likely and much more imminent," Theodore Friedmann, a former president of the American Society of Gene Therapy, told conference attendees. "And the sooner it pops up in sport, the more likely it is to pop up in other areas."

The experts in illegal performance enhancements called for governments to legislate sanctions against those who might be tempted to tinker with the genetic material of athletes in order to boost their prowess. The experts also called on physicians to be more vigilant in looking for signs that competitors have been genetically enhanced.

So far, according to the scientists, no documented cases of gene doping in the world of sports have been uncovered. That does not mean that somewhere there isn't someone trying to do it. The reward is so great that it seems illogical to believe that some ambitious program wouldn't stoop to the level of trying to engineer a better athlete genetically.

It's one thing to selectively train a young person to become a talented competitor as an adult. It's quite another to try to build one through human chemistry.

Newly Discovered Seismic Fault Could Threaten Indian Point

Earthquake risks in the greater New York City area are reassessed in a major study released today by a team of seismologists based at Columbia University's Lamont-Doherty Earth Observatory, Palisades, N.Y. Though the report reaffirms that large earthquakes are relatively rare in New York, it finds that fault patterns are more complex than previously appreciated. In particular, two fault systems are found to converge very close to the controversial Indian Point nuclear power plant, 24 miles north of the city.

The authors of the study catalogued 383 earthquakes from 1677 to 2007 and, in those three-plus centuries, identified three magnitude 5 quakes capable of causing serious damage. They estimate that a potentially catastrophic magnitude 6 quake might occur every 370 years, and a magnitude 7 every 3,400 years. Though those probabilities are relatively low, the damage risk from a New York City earthquake is still very high because of the city's concentration of people and physical infrastructure, observes Lynn Sykes, the eminent seismologist who led the study.
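Those recurrence intervals translate into small but non-negligible odds on a human timescale. Here's a quick back-of-envelope calculation, assuming (as hazard models often do, though this assumption is mine, not the study's) that earthquakes arrive as a Poisson process:

```python
import math

def prob_at_least_one(recurrence_years: float, window_years: float) -> float:
    """P(at least one event in the window), assuming quakes arrive as a
    Poisson process with the given mean recurrence interval."""
    rate = 1.0 / recurrence_years          # events per year, on average
    return 1.0 - math.exp(-rate * window_years)

# Recurrence intervals from the Lamont-Doherty study
print(f"M6 within 50 years: {prob_at_least_one(370, 50):.0%}")    # ~13%
print(f"M7 within 50 years: {prob_at_least_one(3400, 50):.0%}")   # ~1%
```

By this rough model, the city faces roughly a 13 percent chance of a magnitude 6 quake in any given 50-year span -- low, but not low enough to ignore when the exposed infrastructure is this dense.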

A previously known geologic feature, the Ramapo Seismic Zone, runs from eastern Pennsylvania to the Hudson Valley, passing within a couple of miles of Indian Point, with roughly parallel fault lines to the south, as far down as Harlem. Now, in addition, the study has identified a second fault line that originates to the east near Stamford, Connecticut, and intersects with the Ramapo zone, passing within a mile of Indian Point. Thus, "Indian Point is situated at the intersection of the two most striking linear features marking seismicity," says the paper. "This is clearly one of the least favorable sites in our study area from an earthquake hazard and risk perspective."

In hindsight, even without the new discovery, it's scarcely imaginable that a nuclear power plant would be sited at Indian Point if the decision were to be made today. It's bad enough that the reactor is at the edge of a metropolitan area with 25 million people, and directly upriver of the city, so that if there were a reactor meltdown, the whole harbor estuary would be permanently contaminated. But the issue of whether to keep relicensing Indian Point will nonetheless be a difficult one to resolve. The plant supplies a large fraction of the city's energy, and in terms of climate, it's a big green fraction.

And that's not the only issue the city will have to consider and reconsider. "We need to step backward from the simple old model, where you worry about one large, obvious fault, like they do in California," observes study co-author Leonardo Seeber, reflecting on the webby fault systems they found. "The problem here comes from many subtle faults... Each one is small, but when you add them up, they are probably more dangerous than we thought."

Coastal Cities Climb on Wind Bandwagon, or Try To

Even as Texas oilman T. Boone Pickens attracts national attention with his proposal to vastly increase U.S. reliance on wind energy, meeting personally with presidential candidates Barack Obama and John McCain to highlight the huge wind potential of the nation's Great Plains states, cities on opposite coasts are seeking to get in on the action. Earlier this week, New York City's Mayor Michael Bloomberg announced an initiative to explore all possible applications of wind in the greater metropolitan area, which he believes could have the city relying on wind for 10 percent of its electricity within a decade.

Bloomberg's speech, in which he evoked an image of the Statue of Liberty's torch being illuminated by green energy, got wide attention in the local press -- initially positive, but then quite critical and skeptical. Bloomberg himself backed off from his suggestion the day after making it, expressing doubt as to whether wide deployment of wind in the city would actually make sense.

Separately but complementarily, Bloomberg announced the week before the creation of an expert task force to study how critical infrastructure can be adapted to the effects of climate change. This initiative may be the one that turns out to have more staying power. The panel will be co-chaired by Cynthia Rosenzweig, of Columbia University's Earth Institute, and William Solecki, director of Hunter College's Institute for Sustainable Cities. Rosenzweig and colleagues have been actively advising the city on global warming and infrastructure for several years, and already have issued pioneering studies that have attracted attention in other megacities around the world.

Meanwhile, earlier this summer San Francisco Mayor Gavin Newsom announced the creation of a residential wind working group, tasked with figuring out how to revamp the city's zoning and building codes to allow wind turbines on private lots. On July 25, San Francisco issued an "over-the-counter permitting process for residential and commercial wind turbines," as one of the companies hoping to capitalize on the streamlined procedures -- Whirligig -- put it in a press release. Both the New York City and San Francisco initiatives open opportunities for companies marketing what might be called personal windmills, among which Whirligig is just one.

Where Are the Multi-Fuel Vehicles?


The Fiat Siena Tetrafuel can run on gasoline, ethanol, blends of gasoline and ethanol, and also natural gas. Is your next car going to be a multi-fuel? Photo: Fiat Brazil

A recent New York Times story describes the efforts of billionaire oilman T. Boone Pickens to promote alternative energy, including wind and natural gas. What caught my attention was the beginning of the story, which says demand for natural gas cars like the Honda Civic GX is running high in certain corners of the United States where that fuel has become an attractive alternative to pricey gasoline.

This is interesting because consumers have long been dismissive of natural gas vehicles. The main problem is a lack of natural gas filling stations (there are only about 1,600 in the U.S.). And then there's range: natural gas vehicles have around half the range of comparable gasoline cars. (See other pros and cons here.) These issues have discouraged consumers and automakers alike. The Times reports that Honda plans to produce just 2,000 Civic GX units this year; Ford and GM don't even have natural gas cars to offer.

What puzzles me is the this-or-that fuel approach. You can either run on gasoline or natural gas. Why aren't automakers offering cars designed to run on both?

Note the emphasis on designed. Sure, retrofitted vehicles that can burn gasoline and natural gas have been around for decades. But where are the truly multi-fuel automobiles for the masses?

The beauty of such vehicles is that they help solve one of today's biggest energy problems: uncertainty. With fuel prices oscillating wildly, is it better to stick with gasoline, invest in a natural gas vehicle, buy a hybrid, or what? Who knows? That's why multi-fuel is interesting. You fill up with whatever fuel is cheaper, or available, where you live. It's no silver bullet for the energy crisis, sure, but it just makes sense in places where more than one fuel is available. The point is that multi-fuel could work as a bridge from petroleum to other possible technologies and fuels, be it batteries, hydrogen, cellulosic ethanol, or something else.



Last year, I wrote about one such multi-fuel vehicle, the Siena Tetrafuel, created by Fiat in Brazil. This car can run on pure gasoline, pure ethanol, blends of gas and ethanol in any proportion, and also natural gas. It will burn the natural gas -- the cheapest car fuel in Brazil -- while cruising, and it will switch on the fly to the liquid fuel mix whenever it needs more power. From the article:

And here's the best part: you can put any mixture of gasoline and ethanol into its tank -- from 100 percent gasoline and no ethanol to 100 percent ethanol and no gasoline. The engine automatically adjusts its ignition timing and the quantity of fuel injected into the cylinders on each cycle to get the most power out of whatever mixture you've got while keeping emissions under control.

Cars that can use different mixes of gasoline and alcohol have been around for years. And vehicles that let the driver switch between natural gas and gasoline aren't new, either. But one car that can do both -- switching automatically between the fuels and adjusting its engine to suit an arbitrary gasoline-alcohol mix -- that's very new indeed.

In other words, Fiat engineers designed the Tetrafuel engine -- and programmed its engine control unit -- to operate optimally on all those fuels. Not your usual retrofit. Its multi-fuel capability eliminates the two main problems of natural gas-only vehicles. Can't find a filling station with natural gas? Just use gasoline or ethanol. And with both gasoline and natural gas tanks, range is not a problem anymore.
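To see why the blend matters, consider the stoichiometric air-fuel ratios involved: roughly 14.7:1 by mass for gasoline and 9.0:1 for ethanol (textbook values). A minimal sketch of how an engine control unit might scale fueling for an arbitrary blend -- purely illustrative, not Fiat's actual control logic:

```python
# Hypothetical sketch of flex-fuel fueling. The AFR constants are textbook
# stoichiometric values; the linear-blend ECU logic is an illustration only.

GASOLINE_AFR = 14.7   # kg of air per kg of fuel, stoichiometric
ETHANOL_AFR = 9.0

def stoich_afr(ethanol_fraction: float) -> float:
    """Stoichiometric AFR for a blend, by mass fraction of ethanol."""
    return (1 - ethanol_fraction) * GASOLINE_AFR + ethanol_fraction * ETHANOL_AFR

def fuel_mass_per_cycle(air_mass_g: float, ethanol_fraction: float) -> float:
    """Fuel to inject for a given mass of air drawn into the cylinder."""
    return air_mass_g / stoich_afr(ethanol_fraction)

# E85 needs roughly 50 percent more fuel per cycle than straight gasoline:
print(f"gasoline: {fuel_mass_per_cycle(0.5, 0.0):.3f} g")   # ~0.034 g
print(f"E85:      {fuel_mass_per_cycle(0.5, 0.85):.3f} g")  # ~0.051 g
```

A real ECU would infer the ethanol fraction from an oxygen-sensor feedback loop rather than being told it directly, and would adjust ignition timing too; the sketch only captures the fuel-quantity half of the story.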

Fiat's Brazilian subsidiary unveiled the Siena Tetrafuel almost two years ago. It expected to sell 2,500 units in 2007; it sold more than 10,000. This year it has sold nearly 6,000 so far. In terms of annual sales, the Tetrafuel should represent less than 1 percent of all flex cars sold in Brazil (flex cars can use both gasoline and ethanol; 1.7 million were sold last year). It's still a tiny market. But for Fiat -- and Brazilians -- it's nice to have such an option around in case oil prices skyrocket. It's all about flexibility. (In fact, gas price oscillations and the availability of ethanol at filling stations led to an automotive revolution in Brazil: sales of flex cars went from virtually nothing in 2003 to about 90 percent of all new cars sold last year, when the Brazilian auto industry saw all-time record sales.)

So back to the original question: Why aren't other automakers considering multi-fuel? Well, in a sense they are. There are a number of projects around. BMW has shown off a gasoline-hydrogen luxury sedan. Volvo developed a prototype that runs on gasoline, E85, natural gas, hythane, and biomethane. Most of these projects, however, don't involve mass-produced, affordable vehicles. Automakers say developing multi-fuel vehicles requires a lot of R&D, and that the cars need extra parts like separate tanks and sensors, making them expensive. But how expensive? Fiat, for example, did a good job of keeping the Siena Tetrafuel's price tag low. In Brazil -- the only place where the car is available -- it costs about the same as a regular Honda Fit.

I guess in the end automakers will regard multi-fuel as a niche, too small a market to bother with. They appear to be seeking the "next big thing" that will take them out of the hole they find themselves in. But then again, as uncertainty about energy prices and availability mounts, betting on a single fuel looks like a bad bet. We need more omnivorous vehicles.

Wireless energy

The New York Times reports today from the Intel Developers Forum that Intel is the latest to tantalize us with the promise of wireless power.

Wireless power! No more noodle soup under my desk where brutal Darwinian struggles unfold between the cell phone charger, the laptop charger, the digital camera charger, and the electric carving knife charger*, all jockeying for space on the power strip.

On Thursday, the chip maker plans to demonstrate the use of a magnetic field to broadcast up to 60 watts of power a distance of two to three feet. It says it can do so while losing only 25 percent of the power in transmission.

Intel calls it WiTricity (wireless electricity) and it's built on the shoulders of MIT giant Marin Soljacic. Mauricio Freitas has cool pictures.



Combine this development with the new wave of low-power chips, and I have high hopes for an end to the cold war under my desk.

*(No, I don't have an electric carving knife.)

GM to Market Volt in Europe as Opel or Vauxhall

General Motors has announced plans to launch European versions of its much-ballyhooed Volt, the plug-in hybrid it expects to start producing in 2010, according to a report this week in the Financial Times. GM expects to sell it on the Continent as an Opel, and in the United Kingdom as a Vauxhall. The Volt will be a so-called series hybrid, in which the car is always propelled by its electric motor, with a backup internal combustion engine recharging its lithium-ion battery pack when necessary. Toyota's plug-in electric car, also scheduled for 2010, will be a parallel hybrid, in which the electric motor and internal combustion engine alternately provide traction, as required. According to the FT, groups led by Korea's LG Chem and Boston's A123Systems are competing for the contract to provide the Volt's batteries.

Large Nerd Collider

The Large Hadron Collider is set to fire up on September 10. Not sure why, but don't want to slog through tedious explanations of the Higgs boson and the Standard Model? Have a look at this informative rap narrative, delivered by persons in lab coats and hard hats.

It actually gets kind of catchy.

The Javelin Throw Goes 21st Century

One of the oldest Olympic contests has gotten a major technical upgrade, after thousands of years. The javelin throw is one of the most iconic events of the Games. Athletes have competed in it since 1906 in the modern era and for a millennium in the ancient Olympics. Part of track and field, competition in the venerable spear toss gets underway in Beijing today, and the athletes will have new javelins in hand that should increase the distance of their throws.

History tells us the ancient Greeks competed in flinging the javelin as far back as 3,000 years ago. It was a practical exercise, because the ability to throw a spear accurately at great distance was a much-prized skill for hunters and warriors. In recent times, though, the javelin competition has been restricted to the infield of sports stadiums. After the 2004 Athens Games, the International Amateur Athletic Federation decided that, in addition to metal and fiberglass, javelins could be made of new carbon-fiber materials.

At the Bird's Nest stadium in Beijing, some contestants will be throwing the new OTE Composite FX Carbon/Aluminum Javelin from Gill Athletics, of Champaign, Ill. According to the company, the combination of metal and carbon-fiber composites enables javelin throwers to achieve consistent distances of over 90 meters (the contemporary world record is 98.48 m) using the Composite FX.
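For a sense of what a 90-meter throw demands of the athlete, a simple no-drag projectile model gives the required launch speed. It's a deliberate idealization -- real javelin flight depends heavily on aerodynamic lift and drag, the very qualities the new materials are meant to tune -- but it puts a rough floor on the numbers:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def launch_speed_for_range(range_m: float, angle_deg: float) -> float:
    """Launch speed needed to cover a given range on flat ground,
    ignoring aerodynamic drag and lift (a crude idealization)."""
    theta = math.radians(angle_deg)
    return math.sqrt(range_m * G / math.sin(2 * theta))

# A 90 m throw at a 35-degree release angle implies roughly 31 m/s:
print(f"{launch_speed_for_range(90, 35):.1f} m/s")
```

By this crude estimate, a 90-meter throw at a 35-degree release angle requires launching the javelin at about 31 m/s, or roughly 110 km/h, which is why elite throwers invest so heavily in run-up and release technique.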

The firm boasts on its Internet site that its javelin "does make a difference ... with its superior vibration dampening qualities and flight characteristics."

We'll find out later today whether it has what it takes to fly farther than the other spears at this year's modern Olympics.


Tech Talk

IEEE Spectrum’s general technology blog, featuring news, analysis, and opinions about engineering, consumer electronics, and technology and society, from the editorial staff and freelance contributors.
