Energywise

Will Winter Weather Worsen EV Range? Not Hardly.

We’re big fans of MythBusters and love busting some of our own myths.

So we decided to test a myth we hear time and time again: electric cars can’t go very far in the winter because in-cabin electric heating saps the power from the battery pack.

We put on our thermal underwear, grabbed a warm coffee and headed out into unseasonably cold weather with a 2011 Tesla Roadster Sport 2.5 to see just how the sports car coped with ice, snow and a raging cold northerly wind.

As luck would have it, our car for the weekend was exactly the same vehicle we borrowed in October, so we were able to draw a good comparison between the two weekends in true MythBusters style.

As we pulled out of Central London and headed west down one of the many freeways radiating out of the U.K.’s capital, the entire country was under severe weather advisories for heavy snow and extreme cold.

While the temperatures we were set to experience were mild in comparison with a severe northeastern winter, we were promised temperatures as cold as 14 °F, with daytime maximums barely creeping above freezing.

But this didn’t faze the Tesla Roadster Sport 2.5. With powerful seat heaters and a fully electric heater providing enough warmth to make the cabin more than cosy, the Tesla forged forward into the encroaching darkness.

About 40 miles before our destination snow started to fall. Initially light, the snowfall became heavier until our car registered an outside temperature of around 25 °F. 

At this point the Tesla, still in Range mode, was predicting enough power for at least another 80 miles. On arrival, the on-board computer predicted that a further 40 miles in Range mode was possible.

The next day, we took the Tesla on an exhaustive trip designed to give it a thorough working out. First, a 60-mile freeway trip at 70 mph, followed by a further 80-mile meandering route through some of the southwest of England’s most challenging roads.

With the temperature below 27 °F for the entire trip, and dropping to an indicated 20 °F while driving through the iconic Cheddar Gorge, our test car didn’t put a foot wrong, climbing the twisting 1,000-foot road with ease.

Even with the best will in the world we just couldn’t make the Tesla Roadster Sport 2.5 lose grip while driving. The Tesla’s traction control made sure it stayed pointed in the right direction, even when we drove on sheet ice.

In fact, the only way we could force the Tesla to slide around was to find a deserted parking lot and turn the traction control off.

We’d planned to film our frigid fun, but it turned out our camera equipment just couldn’t handle the cold and switched off as soon as it was exposed to the extreme windchill. No such problems for the car, however, which kept on providing heat, power and entertainment for a whole weekend.

In total we used just over 140 kilowatt-hours of energy over the weekend, resulting in a massive 450 miles of snow-filled fun.
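For the curious, those round numbers pin down the car's winter efficiency well enough for a back-of-the-envelope check; here's a minimal sketch using only the figures quoted above:

```python
# Back-of-the-envelope efficiency from the weekend's figures.
energy_kwh = 140.0      # total energy drawn over the weekend (kWh)
distance_miles = 450.0  # total distance covered (miles)

wh_per_mile = energy_kwh * 1000 / distance_miles
print(f"Consumption: {wh_per_mile:.0f} Wh per mile")  # ~311 Wh per mile
```

Roughly 311 watt-hours per mile, in other words, snow and cabin heating included.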

We struggled to find a difference in performance, range, or energy consumption between our cold weekend and our mid-Fall test-drive. Whatever we threw at it, the 2011 Tesla Roadster Sport 2.5 kept going.

Our only problem? A frozen trunk mechanism, after the overnight temperature dipped below 16 °F, which took a few hours of driving to thaw out.

Other than that, we’d have to say that Tesla has managed to change our perception of winter electric vehicle motoring.

Myth Busted.

This article, written by Nikki Gordon-Bloomfield, originally appeared on AllCarsElectric.com, a content partner of IEEE Spectrum.

The Ideal Wind Farm: Tweaking Turbine Spacing to Improve Output

In the early days of wind energy development, it seemed there was little thought put into some of the details of how to put together a wind farm. The Altamont Pass farm might serve as the poster child for early missteps: its small, tightly clustered turbines kill more than 4,000 birds per year (including 70 protected golden eagles). More recently, a lot more thought is going into just how the thousands of turbines the world is building should be spaced.

In a presentation at an American Physical Society meeting this week, Johns Hopkins researcher Charles Meneveau discussed work on an algorithm designed to optimize the placement of turbines in a wind farm. Among the findings -- which are based on computer modeling of the flow of air around spinning turbines -- is that generally we've been placing them too close together.

According to a press release, large turbines (of the five-megawatt variety) should be separated by 15 rotor diameters rather than the seven commonly used today. Turbulence created by the spinning blades muddies the speed and direction of the wind downstream, meaning that turbines placed close together might not be generating as much energy as they could at slightly wider spacings.
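To put those spacing rules in perspective, here's a minimal sketch of what they imply on the ground; the 126-meter rotor diameter is our assumption, roughly typical of a 5-megawatt machine, not a figure from the study:

```python
# Compare turbine counts along a fixed row under the two spacing rules.
# The rotor diameter is an assumed value for a ~5 MW machine; the 7D and
# 15D spacings are the figures discussed above.
rotor_diameter_m = 126.0
row_length_km = 20.0

for spacing_diameters in (7, 15):
    spacing_m = spacing_diameters * rotor_diameter_m
    turbines = int(row_length_km * 1000 // spacing_m) + 1
    print(f"{spacing_diameters}D: turbines {spacing_m:.0f} m apart, "
          f"{turbines} fit in a {row_length_km:.0f}-km row")
```

Doubling the spacing roughly halves the number of machines on a given site, so the modeling question is whether the extra energy each turbine harvests from less-disturbed wind more than makes up for the lower packing density.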

This isn't the first research looking at how to get the most out of a lot of turbines placed close together. Earlier this year, investigators in Spain published a paper in Renewable Energy on an algorithm designed to optimize wind farm arrangement. All of this work is a crucial step in improving wind power's overall viability: the continuing effort to bring the cost per kilowatt down into fossil-fuel range will make each new wind farm that much easier to build.

(Image via Wikimedia Commons)

Stuxnet Sends Ominous Message

Two months ago the German cybersecurity expert Frank Rieger published a compelling analysis of Stuxnet suggesting it targeted Iranian nuclear facilities, quite possibly the big uranium enrichment complex at Natanz. Two weeks ago the U.S. cybersecurity firm Symantec published an exhaustive analysis that showed beyond any reasonable doubt that Natanz was the main target, though perhaps not the only target. All that is arresting enough. But there's also a larger message, namely that any large networked system--from the smart grid to oil refineries or nuclear reactors--could be vulnerable to malware of similar sophistication.

To quote the summary that concludes the Symantec report: "Stuxnet represents the first of many milestones in malicious code history--it is the first to exploit four operating system vulnerabilities, compromise two digital certificates, and inject code into industrial control systems and hide the code from the operator... Stuxnet has highlighted direct-attack attempts on critical infrastructure are possible and not just theory or movie possibilities... Stuxnet is the type of threat we hope to never see again."

Huge Stockpile of Kazakh Weapons-Grade Plutonium Is Secured

The Kazakh and U.S. governments recently moved about 100 tons of spent fuel containing high-grade plutonium from a poorly secured location on the Caspian Sea to a better-secured secret location at the diagonally opposite corner of Kazakhstan. The operation was described last week in a three-part eyewitness report on National Public Radio by Mike Shuster, NPR's longtime roving ace reporter.

What made this maneuver especially important, Shuster explained, was that the plutonium happened to be of the highest quality for nuclear weapons--"ivory" grade, as people in the trade put it.

For those accustomed to thinking of all plutonium as weapons-grade, the Kazakh situation calls for somewhat more explanation than Shuster deemed appropriate for a general audience. It's well known that uranium can either be natural, somewhat enriched for reactors, or highly enriched for weapons, but plutonium is generally thought to be weapons-grade by definition. That's essentially true, but with an important qualification. If the plutonium is produced by neutron bombardment of uranium in a standard power reactor being operated normally, it ends up containing excessive quantities of the higher isotopes (Pu-240, Pu-241, and so on), rendering any weapon made from it prone to premature ignition. Therefore, any country wanting to build good and reliable nuclear weapons will not want to make them from plutonium extracted from standard reactor fuel. (This is why Iran's Bushehr reactor, built with Russian help, is essentially irrelevant to its nuclear-weapons program.)

Even for a general audience, Shuster might have mentioned that the Kazakh plutonium was made in an early breeder reactor, the BN-350, which was customized both to produce electricity and (in the facility seen above) to desalinate water from the Caspian. Breeder-bred plutonium is higher grade, which is one of several reasons why breeders are such a singularly bad idea.

As Tom Cochran explained in a recent interview here, Russia's breeder program ultimately was a failure because the breeder fuel cycle was never "closed." That is to say, the plutonium produced in the breeders was never processed into fresh reactor fuel, to realize the dream of an endlessly self-propagating fuel supply. Thus it came about that Kazakhstan found itself sitting on enough fuel to make nearly 800 quite satisfactory atomic bombs.

Great Lakes Wind

Four years ago, surveying the U.S. president's excellent strategic plan for climate change technology, I was struck by a detail on a map charting the country's wind resources: It showed, of course, a lot of strong winds through the Great Plains and off the two ocean coasts; but it also showed something less obvious—namely, that the entire surface area of the Great Lakes was among the country's very windiest regions.

Why not, I wondered, focus wind energy development just there? In contrast to the Great Plains, which are notoriously distant from the country's major load centers, the Great Lakes are smack dab in the middle of the country's highly populated old industrial heartland, a region that happens to get most of its electricity from carbon-intense coal. And surely it might be easier and less expensive to develop offshore wind on the lakes than on the coasts.

It turns out I'm not the only genius having such thoughts. In one of his recent posts at Energy Central, Bill Opalka speculates that regulatory hurdles and opportunities for citizen intervention may be somewhat less daunting when it comes to Great Lakes wind development, by comparison with projects off the ocean coasts. That may be debatable. And certainly the jury's out on whether Great Lakes projects would be technically easier and overall cheaper.

But a lot of people and a growing number of organizations are exploring these questions. Two months ago, public officials, developers, and stakeholders in the Great Lakes Wind Collaborative met at Case Western Reserve University in Cleveland for their third annual meeting. A member of the National Renewable Energy Laboratory told them that the lakes' wind might be a $100 billion market.

Two years ago, Michigan State University's Land Policy Institute estimated that 100,000 wind turbines situated off Michigan's coasts--the state borders four of the five Great Lakes--could provide as much as 321 GW of generating capacity, if towers were sunk at all depths. If towers were built only at depths comparable to those already achieved elsewhere in offshore wind farms, the lakes' wind potential would be closer to 100 GW--still a prodigious amount.
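Those capacity figures are easy to sanity-check against one another; here's a quick sketch using only the numbers quoted above:

```python
# Sanity check on the Land Policy Institute figures quoted above.
turbines = 100_000
full_build_gw = 321.0     # capacity if towers were sunk at all depths
shallow_build_gw = 100.0  # capacity at depths already proven offshore

print(f"Implied average turbine: {full_build_gw * 1000 / turbines:.1f} MW")
print(f"Shallow-water share: {shallow_build_gw / full_build_gw:.0%}")
```

The implied 3.2-MW average machine is plausible for offshore turbines, and the proven-depth share works out to about a third of the theoretical total.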

At the end of October, the White House, the Energy Department, and the lakes collaborative co-hosted a two-day workshop in Chicago to discuss Great Lakes siting issues in detail. Specific projects are under consideration near Cleveland on Lake Erie, off the eastern shore of Lake Michigan near Pentwater, Mich., and in Lakes Erie and Ontario off New York State's shores.

More on IPCC Reform

As long as we're on the subject of how the Intergovernmental Panel on Climate Change needs to be reformed to be more authoritative, believable, and helpful, we may as well take note of a conference held earlier this week at Columbia University. The panel on IPCC Reform and the Global Climate Challenge brought together scholars from fields ranging from political science to meteorology and produced a remarkable amount of agreement on some key points.

--There's an excessively wide gap between the scientific assessments, on the one hand, and political leaders, policymakers, and members of the general public, on the other. "The IPCC is an assessment of science, by scientists, for scientists," as climatologist Gavin Schmidt put it.

Peter Haas, a specialist on environmental governance, said the assessment reports, besides needing to be more widely seen as legitimate, also should be more timely and pertinent. Only the second of the four assessment reports that have appeared so far was published on the eve of important political decisions being taken, Haas observed.

(Nobody mentioned that the assessment reports also are extremely expensive and bulky, and are never actually seen outside a scientist's office. Why is there not a 175-page summary written for the educated public and published as a paperback?)

--One reason the IPCC has come under criticism for the way it handles uncertainty is the clumsy way its authors are asked to attach error bars to statements, regardless of whether they have any real basis for evaluating the uncertainties. The authors need to source what they know more carefully, and refrain from talking about what they don't know, said Syukuro Manabe, the computer modeling pioneer.

--The separation of science and policy is too sharp, and the analysis of climate impacts and mitigation strategies too weak, as economist Jeffrey Sachs put it. Manabe noted that most of the errors found in the last assessment were in the impacts section--a subject all the more important now that the Kyoto process is being overwhelmed by fast-growing emissions from the BRIC countries.

Suki Manabe (photo), who can be reasonably described as the father of climate modeling,* was saying that if we can't significantly slow global warming, then we'd better start preparing for its consequences, which are going to be far-reaching and serious. We need to understand them much better, he said, but we know for example that water is going to be a huge issue as the world's wet areas get wetter and its dry areas get drier. We're going to have to worry a lot more about things like water transport and desalination, he said.
 _____________________________________________________

*Information on his remarkable career can be found in Spencer Weart's book The Discovery of Global Warming, at the climate website Weart maintains at the American Institute of Physics, or in my book Kicking the Carbon Habit, one chapter of which is a profile of Manabe.

"Cool It" Instead Fuels It

The title of the new movie featuring "skeptical environmentalist" Bjorn Lomborg is of course an intentional pun: It refers at once to cooling the debate over global warming and finding a way to actually cool the planet. The second point may come as a surprise to those who are not aware that Lomborg is not, and never has been, a climate change denier. What he has argued is that proposed solutions to global warming are too expensive and detract from spending on more serious global problems such as disease, ignorance, and poverty.

Cool It opens with a rather long and boring personal introduction of Lomborg and a defense of his controversial book. Only after 20 minutes do we arrive at the 1992 Rio conference, where the U.N. Framework Convention on Climate Change was adopted; it is subjected to an inadequate and tendentious discussion, along with the follow-on Kyoto Protocol and last year's Copenhagen climate conference. We see President Obama telling the assembled delegates in Denmark that we have been talking about climate change for two decades "without much to show for it."

That's a falsehood, and the film does its viewers no service and Lomborg no credit in repeating and endorsing it. The Kyoto Protocol called on the advanced industrial countries to cut their greenhouse gas emissions by 7-8 percent by 2008-12 by comparison with 1990. As of 2007, Germany had cut its emissions by 21.3 percent, the United Kingdom by 17.3 percent, France by 5.3 percent, and the members of the European Union that adopted the protocol by a combined 4.3 percent. In the United States, however, emissions increased by 16.8 percent; in Australia, the other major country that declined to ratify the protocol, they were up 30.1 percent.

Had there been no Rio treaty or Kyoto Protocol, it's reasonable to assume that European emissions would have climbed from 1990 to 2007 almost as much as U.S. and Australian emissions did. So to say we have "nothing" to show for the two agreements is as if five buddies pledged to stop drinking, four succeeded, but the biggest guzzler of all relapsed, and we thereupon declared the abstinence program a total failure. The failure is the hopeless drunk, not the program.

Though the argument in Cool It is based on the relative costs of greenhouse gas reduction versus other strategies, the film is remarkably casual in the way it throws numbers around. It does not explain carefully where its major cost estimates come from or what their basis is, nor does it put them in a meaningful context. Not surprisingly, the movie has left viewers confused. One reviewer came away with the impression that the combined industrial-country cost of complying with Kyoto was said to be $280 million per year, an absurdly small number. The film's actual number, $280 billion, sounds big but actually is not so huge: Assuming the combined annual GDP of the industrial countries is at least $50 trillion, $280 billion is about half of 1 percent.

The same kind of problem arises with the estimate of how much it will cost Europe to attain its 20-20-20 goals, which Lomborg puts at $250 billion per year. We don't really know what the basis of that number is, and to me it sounds rather inflated. But even if it's correct, let's remember that it represents about 1.7 percent of Europe's $15 trillion gross domestic product and about $500 per person per year. Is that too much to pay for insurance against the long-term risks of dangerous global warming? Lomborg says, unintelligibly, that it would be like buying home insurance that covers only the doorframe.
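The arithmetic behind both cost comparisons is simple enough to verify directly. In the sketch below, the GDP figures come from the paragraphs above; the roughly 500 million population is our assumption, implied by the $500-per-person figure:

```python
# Check the cost-context arithmetic from the two paragraphs above.
kyoto_cost = 280e9        # the film's industrial-country Kyoto cost, $/year
industrial_gdp = 50e12    # assumed combined annual GDP of those countries, $
print(f"Kyoto cost as share of GDP: {kyoto_cost / industrial_gdp:.2%}")  # 0.56%

eu_cost = 250e9           # Lomborg's 20-20-20 estimate, $/year
eu_gdp = 15e12            # Europe's gross domestic product, $
eu_population = 500e6     # assumption: roughly the EU's population
print(f"20-20-20 cost as share of GDP: {eu_cost / eu_gdp:.1%}")  # 1.7%
print(f"Per person per year: ${eu_cost / eu_population:,.0f}")   # $500
```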

Lomborg wants to think he's clarifying the debate about global warming, but he's not. What he proposes is that instead of spending $250 billion per year on greenhouse gas reduction, the money should be spent on the development of futuristic gee-whiz energy technologies, about which he continues to be amazingly credulous. Where is the $250 billion to come from? One reviewer came away with the impression that it would come from a carbon tax, and that may indeed be what Lomborg is suggesting.

If so, the supposed maverick is saying essentially what just about every other serious advocate of action on climate is saying: We should either tax carbon or auction emission credits in a cap-and-trade system, and use the very large revenues to fund development of low- and zero-carbon energy technologies. Lomborg is purveying the conventional wisdom, but in claiming to be saying something distinctively different, he's only sowing confusion and tossing random kindling on what's already a blazing fire.

Trenberth Goes Back to Basics on Climate Change

At last count we had about 40 comments on the interview with you about IPCC reform, many of them hostile. Where do you think that extreme hostility is coming from? You and your colleagues must think of yourselves as honest, hard-working scientists, and yet you stand accused of being a “liar,” part of a corrupt enterprise, “the system.”

This kind of thing is not confined to climate science. There are a lot of other examples out there. It's a nasty atmosphere, and any hot topic brings this out. Climate science has become highly politicized, and climate change denial is closely related to the strong opposition many people feel at having the government interfere in their affairs in any way. Under these circumstances something like “climategate” was bound to happen. I received something like 19 pages of e-mail with extremely abusive language in them. I kept asking the mail people why our filters weren’t catching them.

Many of the hostile comments on the interview appear to come from engineers. Some express anger, for example, about IEEE’s stand on climate change, which must be some kind of misunderstanding, as IEEE as such doesn’t take positions on controversial matters of politics and policy. Are you surprised to see well educated, well credentialed individuals expressing such anger?

Yes, I am rather. Last year I gave an invited lecture at an IEEE meeting of aeronautical engineers, with about 700 in attendance. Afterwards a tremendous number of people came up to me and said they had no idea as to the nature of the evidence regarding climate change, how everything fits together. In general scientists and engineers have open minds.

In terms of the major claims that are so hotly contested—that the earth is getting warmer, that human activity is playing a significant role, and that warming will gather pace unless we change our ways rather radically—to what extent do they rest on computer modeling and how much on observational science?

Our knowledge that the earth is warming does not come from modeling at all. It comes from a very wide variety of sources and is incontrovertible—"unequivocal," the IPCC said in its last assessment. The main sources are temperature readings, melting of Arctic sea ice and glaciers, snow extent, rise in sea level, and ocean temperatures.

What about the human role?

We can see that greenhouse gas concentrations are growing in time and that temperatures are rising, and we postulate a causal relationship. But we need computer models to test and prove that relationship. We run the models with the observed changes in the atmosphere and see what temperatures are generated, then remove the gas forcings, and compare the results. Though there's some natural variability, of course, since 1970 human-induced warming has clearly emerged from the noise.

As for prediction of future temperatures and other effects of climate change, there we rely entirely on models.
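The run-it-with-and-without-the-forcings comparison Trenberth describes is done with full general circulation models, but its logic can be shown with a toy zero-dimensional sketch; every number here (the forcing ramp, the response factor, the noise level) is an invented placeholder, not real model input:

```python
import random

# Toy illustration of the attribution logic described above: simulate a
# "climate" with and without a greenhouse forcing, then compare.
# All parameters are invented placeholders, not real model inputs.
random.seed(42)
YEARS = range(1900, 2011)
SENSITIVITY = 0.5   # assumed response, degrees C per W/m^2
NOISE_SIGMA = 0.1   # assumed natural variability, degrees C

def simulate(with_ghg_forcing):
    temps = []
    for year in YEARS:
        # Placeholder forcing that ramps to 2 W/m^2 over the period.
        forcing = 2.0 * (year - 1900) / 110 if with_ghg_forcing else 0.0
        temps.append(SENSITIVITY * forcing + random.gauss(0.0, NOISE_SIGMA))
    return temps

forced = simulate(True)
unforced = simulate(False)
print(f"End-of-run gap, forced minus unforced: "
      f"{forced[-1] - unforced[-1]:.2f} C")
```

In a real attribution study the model is a full GCM driven by observed forcings, and runs are compared across ensembles rather than singly, but the shape of the argument is the same: the warming "emerges from the noise" only in the forced runs.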

What about paleoclimatic research—ice science and such?

It plays a smaller part. It depends on use of proxies to substitute for instrumental readings, and when you go back further than 500 years, the number of such proxies drops and the data get blurrier. As you go back in time, annual resolution is no longer possible. We also know much more about forcings in the modern period: We have information on solar activity going back to Galileo, and we know quite a bit about volcanic activity, which affects the composition of the atmosphere significantly.

Can you say something about the widespread belief that solar activity somehow accounts for the temperature changes we’ve seen in recent decades?

That’s easily disproven. It’s nonsense. Since 1979 we've had spacecraft measuring total solar irradiance, and there's been no change—if anything the sun has cooled slightly. There's nothing in the record that indicates that the sun is responsible for any of the warming in this period.

And further back?

The sun did get somewhat more active early in the last century and probably did account for some of the warming until about 1950. The best current estimate is that it explains about one tenth of a degree Celsius of the warming, which totaled about 0.3 °C from 1900 to 1950. There was a solar minimum in the early 1700s, and it probably was partly behind the so-called Little Ice Age. Going much further back, the alternation of ice ages and interglacial periods was driven by subtle changes in the orbital relationships of sun and earth—the Milankovitch cycles. We learned from studying them that small changes in solar irradiation can generate huge changes in climate. In fact, the changes in radiative forcing from the rise in greenhouse gas concentrations in the atmosphere are much larger than the changes connected with the Milankovitch cycles.

For the record, are you the person who Dave D says messed with the Australian temperature record?

I’m a native of New Zealand and have had nothing to do with the Australian record.

As a person who has contributed a great deal of time to IPCC work, do you feel that lead scientists should be directly compensated by the IPCC?

No, that would only make the process even more vulnerable to the criticism that it is somehow self-serving.

Renaissance Hiccups: What Do Recent Nuclear Reactor Incidents Tell Us?

The nuclear proponents will say that the system is working. Two separate incidents, at the Vermont Yankee and Indian Point nuclear plants, resulted in shutdowns at both sites, but authorities in both cases were confident that the problems posed no threat to the public. And hey, that's the idea, right? Catch things before they really do become a problem. Let's take a look.

The Vermont Yankee plant, in Vernon, Vermont, closed down because a routine check found a leak from a welded-over pipe in the feedwater system, a closed loop that brings water to cool the reactor. But this wasn't just regular water dripping at about 60 drops per minute from the pipe; it was radioactive water. According to an update from Entergy, which owns the plant (and whose Vermont Yankee website can be found at the somewhat ironically domained safecleanreliable.com), the pipe was in a section of the system that could not be repaired while the plant remained active, so the company made a "conservative decision" to pull it offline. Since Sunday, the pipe has apparently been repaired, and the reactor was being powered back up and reconnected to the grid on Wednesday.

Meanwhile, in Buchanan, New York, one of two main transformers exploded on Sunday evening and caused an automatic shutdown of the second of three reactors at Indian Point (pictured). Importantly, these transformers are located away from any area containing nuclear material, so the all-clear report that sounded just a few hours later was undoubtedly justified.

In a statement from -- oh look, Entergy owns this one too! (And Indian Point has a similarly hilarious website, safesecurevital.org.) Anyway, they claim everything is now functioning as normal; nothing to see here, apparently.

And yes, neither of these incidents seems to have been particularly dangerous, and the response systems in place do appear to be working as they should. The bigger issue here is that these aren't necessarily isolated problems. Vermont Yankee became the first plant in more than two decades to be shuttered by the public (well, come 2012, at least) when the state senate voted against renewing its license. The reason? Radioactive tritium leaks. Entergy reported last month that tritium levels are low and that there is no danger of water contamination, but the episode highlights the fact that our nuclear infrastructure is not getting any younger.

Whether or not the next incident actually is a threat is almost beside the point; multi-billion dollar loan guarantees will be a hard political sell if the public keeps hearing about radioactive water leaks and explosions. If the old plants can't keep quiet for a while, the nuclear renaissance might be dead in the water.

(Image via Daniel Case)

New York State's Climate Plan Contrasts with California's

Six years ago the New Yorker magazine caused rather a stir with an article in which writer David Owen described how he and his wife had moved to a utopian green community where they got by living in just 700 square feet, with no dishwasher, garbage disposal, or car, and a monthly electric bill that came to about one dollar. That utopian community was Manhattan.

Yesterday New York State issued an interim greenhouse gas reduction plan, its (outgoing) governor David Paterson having pledged last year to put the state on a path such that its emissions would be 80 percent lower in 2050 than in 1990. That's a weaker goal than California's, which targets 2020, a timeframe in which leaders can be held accountable for their results. The philosophy and approach behind New York's plan also are strikingly different from California's. Yet the two states have two big things in common: relatively low per capita energy consumption and emissions, because of their highly urbanized "green" lifestyles, and populaces and political leaderships that are firmly dedicated to combating global warming.

At some risk of oversimplification, it can be said that California's commitment is tightly bound up with its self-perception as a high-tech state that stands to benefit in the long run from driving development of energy-saving, carbon-reducing technologies, while New York is more focused on helping avoid the local ill effects to be expected from climate change.

Those effects, as spelled out in a fact sheet that accompanied release of the interim report, could include a temperature rise by 2080 of 5 to 7.5 degrees Fahrenheit, heavier rainfall and flooding, more frequent and severe heat waves, and a rise in the level of coastal waters and the Hudson River estuary of 12-55 inches.

The report puts the emphasis squarely on improving the efficiency of buildings, which account for about 40 percent of the state's emissions, and transportation, where New York has been a pioneer in electrifying public transport, notably buses. It also calls for adopting a more aggressive renewable portfolio standard, to encourage lower-carbon generation, and for "expansion" of the Regional Greenhouse Gas Initiative, which envisions a regional carbon-trading system. It characterizes its proposals for buildings and transport as "no regrets" options--that is, policies that would make good sense even in the absence of climate change.

The problem with the plans being made by New York, obviously, is that the state can't go it alone. Avoiding the ill effects of climate change depends on everybody all over the world doing their part. So, while California and New York, Britain, Germany, and Denmark all make good-faith efforts to reduce their emissions, all will be for naught unless they somehow persuade the other big emitters to make equivalent efforts.
