EnergyWise

Fast Train Boomerang?

Except for bicycles, mass transit is the greenest of transportation technologies. Experts may quibble about whether electric and hybrid-electric vehicles yield net energy and carbon savings, but there's no doubt that to the extent travelers can be lured from cars onto trains, substantial efficiencies result. As for air travel, its greenhouse gas emissions are so great that some green-minded individuals try to avoid it altogether.

So, as emerging economies like China build out their railway systems while advanced industrial countries intensify efforts to make commuter rail more attractive, provision of advanced fast trains is a highly competitive business. It's been dominated in recent years by Bombardier, the commuter plane and train maker based in Montreal; Alstom, maker of France's famed Train à Grande Vitesse (the TGV); Siemens, the German maker of the InterCity Express (ICE); and Hitachi, manufacturer of Japan's Shinkansen bullet trains.

Bombardier, having assembled and created manufacturing operations around the world, has emerged in recent years as the world's leading trainmaker. The United States, having sorely neglected investment in infrastructure and green technology, may have some influence in fast trains as a consumer, but as a technology developer it's been nowhere. Easily the most influential consumer is China--and it's been leveraging that influence to acquire advanced technology. As a result it may be in a position to challenge the top four manufacturers before this decade is out.

The global significance of green-oriented infrastructure technology should not be underestimated. Ask yourself why China is a major preoccupation and problem for the United States but much more an opportunity for a country like Germany, and the main part of the answer is bound to be Germany's strength in infrastructural engineering--power plants, train equipment, industrial machinery. The U.S. trade deficit with China was $269 billion in 2008, with exports just a fifth of imports; but Germany's China deficit was only $25 billion, with exports two thirds of imports.
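A quick back-of-envelope calculation shows what those ratios imply for the absolute size of each country's trade flows with China. (The deficit figures are from the text; the implied import and export totals below are my own arithmetic, treating the stated ratios as exact.)

```python
# Back out implied imports and exports from a trade deficit and an
# export-to-import ratio: imports - exports = deficit, with
# exports = ratio * imports. All figures in billions of U.S. dollars.

def implied_trade(deficit, export_to_import_ratio):
    """Return (imports, exports) consistent with the given deficit and ratio."""
    imports = deficit / (1 - export_to_import_ratio)
    exports = export_to_import_ratio * imports
    return imports, exports

# U.S., 2008: $269B deficit, exports about one fifth of imports
us_imports, us_exports = implied_trade(269, 1 / 5)
# Germany, 2008: $25B deficit, exports about two thirds of imports
de_imports, de_exports = implied_trade(25, 2 / 3)

print(f"U.S.: imports ~${us_imports:.0f}B, exports ~${us_exports:.0f}B")
print(f"Germany: imports ~${de_imports:.0f}B, exports ~${de_exports:.0f}B")
```

The point the numbers make: Germany's trade with China is far smaller in volume, but far more balanced, precisely because of what it sells.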

With an eye on that kind of situation, the main fast train makers are all developing still faster versions of their current systems. Bombardier's next train is the Zefiro (pictured in the photo in the table of contents). Siemens has dubbed its next train the Velaro D (pictured above). Alstom's is the AGV (Automotrice à Grande Vitesse). Hitachi, not to be left out, has been developing a bullet train with redesigned heavy-duty bogies at a factory in Kent, with Britain's commuter rail networks in mind. Both Alstom and Bombardier have won major orders in Italy, and both companies, along with Siemens, have been involved in helping China build the remarkable high-speed link that now connects Guangzhou with Wuhan. (Guangzhou, near Hong Kong, was formerly known in the West as Canton; Wuhan, on the Yangtze River, is midway between Guangzhou and Beijing.)

But the China market has come at a price. True to its customary operating procedure, China has exacted co-production and technology-transfer agreements as part of its big train purchase deals. As a result, the Financial Times concluded in a recent special report, "the [rail] industry universally expects a large-scale effort by Chinese manufacturers to break into many of the world's most important markets." Until recently, the FT continued, the European manufacturers confidently expected to dominate provisioning of the 40,000 kilometers of dedicated high-speed lines that China plans to build by 2020; few thought that China was close to mastering their technology.

It won't be the first time that China has stunned the world with the speed with which its people are able to get on top of advanced engineering. To be sure, Alstom, Bombardier, Siemens, and Hitachi are confident of their ability to stay ahead with the most advanced trains. But when contracts are let for the fast train lines planned for the United States, with encouragement from stimulus bill funding, the Chinese Ministry of Railways--having concluded a memorandum of understanding with General Electric--is expected to be among the bidders.

"Energy Independence"

The Financial Times reports today in a handful of related articles that the United States is getting set to sell some $123 billion in conventional arms--yes, that's $123 billion in conventional arms--to countries in the Middle East. The biggest single chunk, consisting of new and upgraded F-15 fighters, goes to Saudi Arabia, which also is buying a lot of equipment suited to counter-insurgency fighting. But Kuwait, the United Arab Emirates, Oman and Qatar also are major customers.

Notably, the UAE and Kuwait are buying upgrades to the Patriot short-range missile defense system developed by Raytheon, and the UAE intends to buy Lockheed Martin's high-altitude THAAD missile defense system. Boeing, as builder of the F-15, will be the biggest corporate beneficiary of the sales, followed most likely by Raytheon.

Why are the Gulf states buying? Partly, of course, because they have a lot of money burning holes in their pockets: with high oil prices they have accumulated vast hoards of foreign currency. However, their main immediate motive appears to be fear of a hegemony-minded nuclear Iran and, closely related to that, fear that Israel might attack Iran--and that Iran might then retaliate against them because of their implicit or explicit cooperation with Israel.

I personally do not believe that Israel will attack Iran, or that the United States will do so in Israel's place. But according to the Financial Times's analysts, leaders in the Gulf states are not so confident.

Why is the United States selling? Partly because at a time when everybody is so concerned about job creation lagging the recovery, it doesn't hurt to generate jobs at some of the most successful U.S. companies. But the larger consideration, according to the highly regarded military analyst Anthony Cordesman, is that the U.S. government wants to create a "new post-Iraq security structure that can secure the flow of energy exports to the global economy," as he put it to an FT reporter.

He's got that right. The U.S. government (assuming its leaders are thinking clearly) is not so much concerned about American energy dependence as about the dependence on oil of its big partners in the global economy, Europe and Japan. Contrary to widely held misperceptions, the United States does not import all that much of its oil from the Middle East: only about 15 percent, according to the most recent figures I've seen, whereas over four fifths comes from the Americas and Africa. But Japan and some major countries in Europe are critically dependent on Middle Eastern oil--and as they go, so goes the world economy.

The late Stephen Schneider, the eminent and influential Stanford climatologist, used to illustrate lectures about oil with photos of the U.S. aircraft carriers required to guarantee its safe delivery. His point was that when the total cost of oil is tallied, the cost of the aircraft carriers really should be included. That point is worth recalling, with all the talk we're hearing these days of systematically removing fossil fuel subsidies, worldwide. (If we can't agree on mandatory cuts in carbon emissions, the thinking goes, we ought to at least agree on no longer subsidizing oil and coal.)

Having paid the Gulf states vast sums of money for oil, the advanced industrial states help defray the costs of the militaries they maintain partly to secure the delivery of that oil by selling huge quantities of arms back to those same Gulf states. That in itself seems more than a little questionable. But there are of course other implications. Calculations behind the current deals obviously include a bet that Israel need not feel threatened, because the Gulf states are being sold weapons that were designed and built in the 1970s while Israel is getting a lot of the best brand-new equipment. But if current trends continue, could sheer quantity become a consideration? Is there a point where Saudi Arabia's fleet of F-15s gets so big, for example, that it could threaten any regional neighbor?

MIT Finds Plenty of Uranium to Fuel Nuclear Renaissance

An MIT team issued a report on the nuclear fuel cycle yesterday, the third in a series the university has published in recent years on aspects of our nuclear future. The report's most striking finding--and one that contradicts a lot of conventional wisdom--is that there is plenty of uranium to power a nuclear renaissance, even a quite vigorous one. Just as important, as leaders of the team emphasized at a press event yesterday, since fuel costs account for only 2-4 percent of total nuclear electricity generating costs, even a big increase in uranium costs would have only a minor impact on electricity prices.
MIT guesses that even if the global nuclear enterprise expanded by a factor of 10 in this century and all the reactors operated for 100 years, uranium prices would increase only by about half.
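The arithmetic behind that claim is simple enough to sketch. For simplicity I treat the entire fuel cost share as uranium--an assumption of mine that overstates the effect, since enrichment and fabrication are also part of fuel cost--and apply MIT's roughly 50 percent uranium price rise:

```python
# Rough illustration of why uranium prices barely move electricity prices.
# Simplifying assumption (mine): the whole fuel cost share behaves like
# uranium, which gives an upper bound on the price impact.

def price_impact(fuel_share, uranium_price_increase):
    """Fractional rise in generation cost for a fractional rise in uranium price."""
    return fuel_share * uranium_price_increase

for share in (0.02, 0.04):                 # fuel is 2-4% of generation cost
    bump = price_impact(share, 0.5)        # MIT's ~50% uranium price rise
    print(f"fuel share {share:.0%}: generation cost up at most ~{bump:.1%}")
```

Even at the top of the range, a 50 percent jump in uranium prices moves the cost of nuclear electricity by only a couple of percent.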
In light of that crucial finding, and other circumstances, the report's attitude toward current-generation and next-generation reactors is a little puzzling. Though costs of improved light-water reactors have escalated sharply, with little new demand for them anywhere except East Asia and perhaps India, the report continues to regard them as the "workhorse" of the industry, as a team leader put it yesterday, giving little attention to novel designs for small modular reactors that are under development (such as those Spectrum magazine highlights in its current issue).
A dozen or so years ago, when Ernest J. Moniz was serving as research director in the Clinton Administration's Department of Energy, I asked him for Spectrum why the industry was so slow to develop novel reactor designs. He said it was setting its headlights too low. I took that to mean that Moniz thought more technological ambition was called for.
Yet Moniz (pictured above) was cochairman of the team that produced the MIT fuel cycle report, which shows so little vision about new reactor technology.
This goes especially for fast-breeder technology: MIT treats it as if it were still a plausible candidate for an end-of-century nuclear economy. But why would we need breeders if there's plenty of uranium to go around to fuel conventional reactors for at least a century?
Tom Cochran of the Natural Resources Defense Council pointed out yesterday that the cost of recovering nuclear fuel from spent fuel--whether for use in light-water or breeder reactors--has increased by an order of magnitude in real dollar terms since 1970. The MIT team did not contradict him. But why then would one want to build breeders, which depend on nuclear fuel recycling, now shown to be excessively costly?
MIT's attitude is all the more baffling when one takes into account the history of fast breeder development since the 1960s. Of the major reactor demonstration projects, including France's once-touted Superphenix, not a single one has satisfied expectations. Some have been disasters. None can be called a success.
So where is the fast breeder reactor, and why would one want to operate it, that MIT persists in visualizing for the late 21st century?

It often is argued in nuclear industry circles that reprocessing and recycling of nuclear fuels is needed to alleviate waste disposal problems, even if there is no need for recovered fissile material to power breeder or conventional reactors. But the MIT report also takes a notably sanguine view of what generally seems an intractable dilemma, especially in the United States. Leaders of the team said that geological storage is a sound approach, and in fact that all the approaches to storage are satisfactory--storage at reactor sites, dry-cask storage at interim facilities, and long-term semi-permanent but retrievable geologic disposition.

Nor is a final resolution of the waste problem as urgent as it may appear. "Every country that has examined the problem has concluded that spent fuel should be temporarily stored for 46-60 years [for its reactivity to diminish] before being put into a final repository," said one team member. Thereafter, a repository like Yucca would be filled over a period of perhaps 30 years and then kept open for at least 50 years after that, for ventilation and possible retrieval of spent fuel.

IEA's Tanaka Maps Global Technology Future

The International Energy Agency is not staffed by a bunch of starry-eyed utopians and idealists. Its executive director, Nobuo Tanaka, reminded an audience at Columbia University today that the agency was conceived by Henry Kissinger as a kind of counter-OPEC, an "oil consumer countries' union." The bottom line, he said, is that the IEA countries today maintain a 90-day oil reserve.

Having said that, Tanaka went on to describe the agency's 2010 Energy Technology Perspectives, issued this summer, which calls for nothing less than a revolution in energy technology to meet the 2050 carbon reduction goals adopted at Copenhagen last December. As the report itself says: "Current energy and CO2 trends run directly counter to the repeated warnings sent by the United Nations Intergovernmental Panel on Climate Change, which concludes that reductions of at least 50 percent in global CO2 emissions compared to 2000 levels will need to be achieved by 2050 to limit the long-term global average temperature rise to between 2.0 degrees Celsius and 2.4 degrees C."

Employing a methodology that Tanaka described as "backcasting" rather than "forecasting," the IEA concluded that carbon prices would have to rise to $175 per ton in the coming decades--an order of magnitude higher than current prices in the world's rather limited carbon markets--and that the world's countries will have to spend at least 30 percent more on energy R&D than baseline projections call for. Other policies needed worldwide would include energy and conservation standards, mandated best practices, consumer education, and serious research into consumer behavior.

With an eye on the disappointing Copenhagen outcome, which did not produce new carbon reduction requirements, and the next conference scheduled for Cancun this fall, Tanaka said: "We should not wait for a global deal. We know what to do."

But knowing what to do is not the same as actually doing it. The report and Tanaka himself concede that revolutionary changes in technology and sharp reductions in carbon emissions will not occur unless producers and consumers face higher energy prices. Politicians are going to have to be more courageous and honest in getting that across, Tanaka told me after his talk. But that runs up against a dilemma highlighted in the very first sentences of the technology report: The global near-depression of the last two years reintroduced a concern "that high energy prices can cripple economic growth," and renewed concern about basic energy security.

The IEA, like the Obama administration, would like us to believe that developing greener energy technology, securing future energy supplies, and cutting carbon emissions are all one and the same thing. But voters aren't buying that. So it will take more courage, honesty and eloquence than politicians have mustered so far to persuade them to pay what's required to get what they say they want.

There's no better source for current and future energy trends than the IEA. But in the past I've been mildly critical of the agency for predicting events that it describes in the same breath as unsustainable. This time I'd criticize it not for a lack of economic realism--the point Tanaka seemed sensitive about--but for lack of political realism.

US Offshore Wind Potential Dwarfs All Existing Electricity Capacity

The United States boasts an embarrassment of riches when it comes to offshore wind potential.

According to a report released by the Department of Energy's National Renewable Energy Laboratory, the US maxes out at 4,150.3 gigawatts of potential offshore generating capacity. In 2008, the entire country's electricity generating capacity barely topped 1,000 gigawatts.

Of course, determining the total potential has very little to do with what can or will actually be built, and if the past is any indicator of future performance (it isn't, especially in this case, but still) then we will take advantage of precisely zero of those gigawatts. But still, it is heartening to know that if a strong will develops aimed at plumbing the amazing depth of the offshore energy potential, there will be no lack of places to put the turbines.

The NREL calculated that massive gigawatt figure using average wind speeds at 90 meters above sea level, from shorelines out to a maximum of 50 nautical miles. The 4,150.3 GW would take up more than 830,000 square kilometers of ocean and Great Lakes; more than 127,000 of those square kilometers surround Hawaii, which comes in just ahead of California in total capacity, at 637.4 GW. Even tiny Rhode Island could swing 25.6 GW, ahead of several other coastal states.
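The two headline numbers above can be cross-checked against each other: dividing total capacity by total area gives the capacity density NREL assumed for the resource. (The calculation is mine, using only the figures quoted from the report.)

```python
# Sanity check on the NREL figures quoted above: implied average
# capacity density of the assessed offshore resource area.
total_gw = 4150.3      # total US offshore potential, gigawatts
area_km2 = 830_000     # assessed ocean and Great Lakes area, square kilometers

density_mw_per_km2 = total_gw * 1000 / area_km2
print(f"~{density_mw_per_km2:.1f} MW per square kilometer")
```

The result works out to about 5 MW per square kilometer, a plausible packing density for offshore turbines once wake spacing between machines is accounted for.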

This is no indicator of where the industry is actually heading, and the NREL itself admits that the analysis "does not consider that some offshore areas may be excluded from energy development on the basis of environmental, human use, or technical considerations." But as Cape Wind and various projects off the coasts of Rhode Island, New Jersey and elsewhere slog forward, and as technology also marches apace (floating turbines, anyone?), perhaps some of those gigawatts flying around the US shores might start heading inland soon.

(Image via NREL)

Pure-form Big Chill Scenario Seems Vindicated

For decades, Wallace Broecker's Younger Dryas "big chill" scenario has been the poster-boy example of how sudden and devastating climate change can be: According to his theory, the sudden collapse of a natural ice dam in the St. Lawrence River system about 13,000 years ago let a gigantic mass of fresh water escape from an inland sea into the North Atlantic; that in turn disrupted normal oceanic circulation, shutting down the Gulf Stream and plunging the northern hemisphere into a mini-ice age, just as it was emerging from the last big one.

A few years ago groups of scientists produced evidence that the St. Lawrence dam collapse was induced not by post-ice age warming but by a meteor. Strictly speaking, the question was academic. It didn't really matter, in terms of implications for climate policy, whether the root cause had to do with natural climatic cycles (which by the way are triggered by changes in Earth-Sun orbital geometry), or with something that came in one huge burst from outer space. From a scientific point of view, a meteor impact inducing other changes is one kind of climate "forcing"; a human-induced increase in greenhouse gas levels is another.

But of course that's not exactly how things play in the popular press. Say that a climate catastrophe was caused by a meteor, and it sounds like a freak accident that has no real implications; say that it arose from the natural dynamic of the earth-atmosphere system and it sounds like something to seriously worry about.

So it's of some interest, both political and scientific, to learn that the meteor theory is on its last legs. According to an authoritative account by reporter Richard A. Kerr in the September 3 issue of Science, "Mammoth-Killer Impact Flunks Out," the leading impact specialists are concluding that the main proponents of a Younger Dryas collision have failed to find the critical evidence needed. Chalk another one up for Broecker, seen above setting out on a scientific expedition.

Old-Time Coal

The grassroots rebellion against coal that swept the United States has been a startling and striking development. Until quite recently it seemed impossible to build a new nuclear power plant under any circumstances; almost from one moment to the next it was new coal one couldn't build--or so it seemed.

Actually, according to a recent compilation by the Associated Press, that's not quite so. Since 2008, 16 large coal plants have been completed in the United States, and 16 more are under construction, according to AP's Matthew Brown. "Combined they will produce an estimated 17,900 megawatts of electricity, sufficient to power up to 15.6 million homes--roughly the number of homes in California and Arizona," writes Brown. "They also will generate about 125 million tons of greenhouse gases annually."
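The AP's figures are internally consistent, as a quick check shows. (The capacity factor of 0.85 below is my own assumption for new baseload coal plants, not a figure from the article.)

```python
# Consistency check of the AP figures: capacity per home served and
# implied CO2 intensity of the new plants.
capacity_mw = 17_900          # combined capacity of the 32 plants
homes = 15.6e6                # homes reportedly served
co2_tons_per_year = 125e6     # annual greenhouse gas emissions, tons
capacity_factor = 0.85        # assumed for new baseload coal (my assumption)

kw_per_home = capacity_mw * 1000 / homes
annual_mwh = capacity_mw * capacity_factor * 8760   # hours in a year
tons_per_mwh = co2_tons_per_year / annual_mwh

print(f"~{kw_per_home:.1f} kW of capacity per home")
print(f"~{tons_per_mwh:.2f} tons of CO2 per MWh generated")
```

Roughly 1.1 kW of capacity per home and just under a ton of CO2 per megawatt-hour--both in line with typical figures for coal-fired generation.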

To be sure, until recently Federal regulators expected about 150 new coal plants to be built by now. So, 16 is a much smaller number, but still quite big enough to keep coal in its position as the largest U.S. generator of electricity and by far its single most important source of carbon emissions.

The continued construction of traditional coal plants represents an "acknowledgment that highly touted 'clean coal' technology is still a long way from becoming a reality," as Brown sees it--and it's hard to argue. About $35 billion is being spent on the old-time coal plants, about ten times what the Obama Administration's stimulus bill earmarks for development of clean-coal plants that can capture and store carbon.

Brown also sees an implicit confidence on the part of the utility industry that carbon emissions will not be penalized, but if that's the industry's attitude, it may turn out to be mistaken, despite this year's demise of the administration's proposed cap-and-trade legislation. The Environmental Protection Agency still has the authority to regulate carbon directly as a pollutant, and though that's sure to be challenged in Federal courts, the Supreme Court would have to reverse a rather recent ruling for EPA to be blocked.

Meanwhile there are other ideas on the table. Media magnate Ted Turner, oilman T. Boone Pickens, Silicon Valley entrepreneur Steve Kirsch, Grist blogger Ted Nace, and--last and indeed least--yours truly have all proposed that we simply pay owners of dirty coal plants to shut them down, on the model of the cash for clunkers program. Paying rather than penalizing could defuse a divisive issue and give the economy an added boost, the same way bribing people to buy new cars did.

U.S. National Parks Threatened by Climate Change

Last year Spectrum had occasion to draw attention to cyber threats to Virginia's state computer systems. Now the focus is on climate change and Virginia's major tourist attractions, with the release last week of a report by the Rocky Mountain Climate Organization and the Natural Resources Defense Council: Profile of Virginia's Special Places in Peril.

The report addresses how climate change could affect small sites like Jamestown and one very big one, the gorgeous Shenandoah National Park and its famed Blue Ridge Highway, built during the New Deal as one of FDR's showcase work generation projects. In a press briefing, representatives of NRDC and the Rocky Mountain Climate Organization joined with Virginia conservationists to spell out how global warming could affect places that draw 6 million visits annually and account for 4,000 in-state jobs.

I have to confess finding the warnings about the Blue Ridge somewhat underwhelming. Yes, I don't doubt that the timing and character of the spectacular fall colors may be affected and that ground-feeding birds may be threatened. Yes, it is a really remarkable thing that this greatest of eastern parks is just "a tank of gas away" for many tens of millions of people, some living as far away as New York City. (My son and I discovered that last summer when we drove down from Brooklyn and took the pictures accompanying this post.) But frankly, my gut tells me that even with some pretty unwelcome effects from climate change, experiencing the Shenandoah will be just as astonishing 75 or 100 years from now as it is today.

Jamestown, ironically, is perhaps another story. The site is small and not much to look at. But it is after all the exact spot where Europeans landed in 1607 to establish their first permanent settlement in North America, as every grade schooler learns. No doubt it will be permanently submerged by the end of the century--and so will many another equally compelling site around the world.

At the briefing I naturally wondered why a Rocky Mountain organization was examining an Appalachian park and an Atlantic monument. The answer is that RMCO has been systematically examining the impact of climate change on major national parks, starting in the west, with special attention in one report to Glacier National Park. But there also is a not particularly hidden agenda. In the last year, the Virginia attorney general has been conducting something of a crusade against climate activism: He sought to take legal action in connection with the climategate research imbroglio, and he has taken a leading role in challenges to the Environmental Protection Agency's authority to regulate carbon.

NRDC's Theo Spencer said in opening remarks last week that the report is meant to convey (among other things) that there are more constructive ways of addressing climate change than to conduct a "witch hunt against climate scientists." And the report concerns issues, Spencer continued during the Q&A, that are unfolding "in DC policymakers' backyard."

Stephen H. Schneider: In Memoriam

Though I can't altogether justify it, I can't escape the feeling that the death on July 19 of Stephen H. Schneider somehow got lost in the noise of this summer's surf. The more alert climate bloggers, including Andrew Revkin at the New York Times and William Hewitt at the Foreign Policy Association, immediately reported it and with lively appreciation of Schneider's significance. The New York Times ran a nice obituary, as did other papers. But I didn't see anything on anybody's front page, and there were no after-the-fact tributes in the leading pages of opinion--at least none that I noticed.

For an appreciation, I highly recommend a piece posted by the Bulletin of the Atomic Scientists, "The Passing of a Climate Prodigy." A physicist and engineer turned climatologist, Schneider was a productive scientist throughout a career that took him from GISS to NCAR and finally Stanford. But much more than most scientists, Schneider was eager to involve himself in the details of policy formulation and willing to dirty his hands in politics. (It was mainly for this reason, no doubt, that he was one of MacArthur's designated geniuses.) He founded and led the influential interdisciplinary journal Climatic Change, and he was a lead author for all four of the major IPCC assessment reports.

Under the recommendations of the recent IPCC review panel, Schneider would not have been able to continue as a lead author of future reports, since rotation of authors is called for. But that doesn't mean his voice would not have been heard. At a time when global climate policy is in profound disarray, Schneider's informed attention to the subject will be missed more than ever.

PHOTO CREDIT: Patricia Pooladi, National Academy of Sciences

Macro-Engineering and Renewables: Tilting at Windmills?

As we have covered in this space before, the Desertec Foundation wants to build massive solar thermal installations in the Sahara Desert and elsewhere, pointing out that if only a tiny fraction of the world's deserts were used we could have all the power we would ever need.

But that isn't the only instance of macro-engineering ideas in the renewable energy field: from T. Boone Pickens's ideas around wind power to proposed hydroelectric monstrosities in Africa, people have been thinking BIG when it comes to pushing fossil fuels out of the picture. Some of the ideas seem great, like the solar thermal plants in the desert; but given our general history with projects on such huge scale, is macro-engineering a new energy landscape really the way to go?

Take the idea put forth by researchers in 2007 of a massive dam at the narrow southern entrance to the Red Sea: the project would conceivably generate upwards of 50 gigawatts of electricity, but the authors themselves basically acknowledge such an idea is little more than a thought experiment. "The cost and time-scales involved are beyond normal economical considerations," they write, and seem more interested in simply discussing the implications of such huge projects.

But is that really so off-scale? The existing Three Gorges Dam along the Yangtze River in China (pictured) will peak at just shy of half that capacity, and the proposed Grand Inga Dam along the Congo River will supposedly have a generating capacity of 39 gigawatts.

Clearly, we as a species are capable of projects of seemingly insane scope, and renewable energy is no exception. Dams, of course, are public enemy number one among environmentalists when it comes to renewables, but is it possible that that's just because we haven't completed any really big solar or wind projects? Three Gorges displaced 1.3 million people, had cost overruns in the many billions of dollars, and has been threatened by such largely unforeseeable problems as giant islands of garbage that could have jammed up its turbines.

What would Desertec do to the Sahara? What would happen to America's "wind corridor" through the Midwest if all of the Pickens turbines were actually built? Clearly, I am not suggesting that renewables aren't the most important energy source of the future (they are), only that thinking too big sometimes has unintended consequences. Macro-engineering concepts in other fields are often just ideas that never get built: the giant Shimizu Pyramid, say, or a space elevator. With the need to massively expand renewable energy around the world growing greater with every ton of CO2 emitted, does it make more sense to focus efforts and dollars on individual, very doable projects than on giant quick fixes?

(Photo via Hugh/Wikimedia Commons)
