Worried About Being Cold This Winter? How About Installing a Data Furnace?

Mini data centers could in theory heat up a single-family house

2 min read

There was an interesting article over the weekend in the New York Times calling attention to a paper presented last June at the 2011 Usenix Workshop on Hot Topics in Cloud Computing that proposes using cloud server-generated heat to warm up single-family homes.

The paper, The Data Furnace: Heating Up with Cloud Computing (PDF), was co-authored by Jie Liu, Michel Goraczko, Sean James, and Christian Belady of Microsoft Research, and Jiakang Lu and Kamin Whitehouse of the Computer Science Department at the University of Virginia. The authors ask: instead of concentrating cloud-computing servers in giant data centers, why not disperse them among homes and businesses and use the substantial heat they generate to warm those buildings during the winter?

During the summer, the Times article says, "... the servers would still run, but the heat generated would be vented to the outside, as harmless as a clothes dryer’s. The researchers suggest that only if the local temperature reached 95 degrees or above would the machines need to be shut down to avoid overheating." The assumption is that a data furnace would not be cooled.
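The operating behavior described above can be sketched as a toy control policy. The threshold comes from the article; the mode names, function signature, and the assumption that "heating needed" is a simple boolean input are all illustrative, not from the paper:

```python
# Toy sketch of a data furnace's operating modes, as described in the
# article. Thresholds in Fahrenheit; no active cooling is assumed.

SHUTDOWN_F = 95.0  # outdoor temperature at which servers must power down


def furnace_mode(outdoor_f: float, heating_needed: bool) -> str:
    """Decide what to do with server exhaust heat."""
    if outdoor_f >= SHUTDOWN_F:
        return "shutdown"      # avoid overheating; there is no cooling
    if heating_needed:
        return "heat_home"     # duct server exhaust into the house
    return "vent_outside"      # keep running jobs, dump heat outdoors


print(furnace_mode(30.0, True))   # a winter day
print(furnace_mode(80.0, False))  # a summer day
print(furnace_mode(98.0, False))  # a heat wave
```

The key design point the authors rely on is the last branch: because the servers are never actively cooled, the only safety valve on very hot days is shutting them down entirely.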

The concept is envisioned to operate like this: a cloud service provider would sell data furnaces to homeowners at the cost of a conventional oil furnace, and would sell the resulting heat to homeowners at a price matching what they would otherwise pay. The provider would cover both the added electricity needed to run the data furnace and its maintenance costs.

According to their cost-benefit analysis of total cost of ownership (check the paper for their operating assumptions), the authors claim that a "designed correctly" data furnace could not only heat a 1,700-square-foot house to 21 degrees Celsius (70 degrees Fahrenheit) but also yield a total-cost-of-ownership savings of about $300 per server per year to the cloud service provider.
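As a rough sanity check on the heating claim, one can compare server heat output against a house's heating demand. The figures below are illustrative assumptions, not numbers from the paper: a mid-range server drawing about 300 W (nearly all of which becomes heat), and a peak heating demand of roughly 40,000 BTU/h for a house of that size in a cold climate:

```python
# Back-of-envelope check of the data-furnace idea. All figures are
# illustrative assumptions, not taken from the paper.

SERVER_POWER_W = 300      # assumed electrical draw of one server
HEAT_EFFICIENCY = 0.97    # nearly all electrical power ends up as heat

# Assumed peak heating demand for a ~1,700 sq ft house in a cold
# climate: ~40,000 BTU/h, converted to watts (1 BTU/h ~= 0.293 W).
PEAK_DEMAND_W = 40_000 * 0.293

heat_per_server_w = SERVER_POWER_W * HEAT_EFFICIENCY
servers_needed = PEAK_DEMAND_W / heat_per_server_w

print(f"Heat per server: {heat_per_server_w:.0f} W")
print(f"Servers to cover peak demand: {servers_needed:.0f}")
```

Under these assumptions the answer lands in the tens of servers, which is consistent with the paper's framing of a data furnace as a small rack rather than a single machine.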

The authors point out that depending on their configuration and availability, the data furnaces could be suitable for "... many delay-tolerant batch jobs [that] can be performed opportunistically, such as non-real-time web crawling, content indexing, and the processing of large scientific data sets (e.g. astronomical data, genome sequencing operations, and SETI@Home), " or "Internet television services and on-line video rental services [that] could use pre-fetching based on local programming schedules or video queues of people in the local vicinity. Similarly, location-based services such as map serving, traffic estimation, local navigation, and advertisements for local stores are typically requested by customers from the local region."  

The authors say that a number of technical challenges would need to be overcome, including ensuring that a data furnace's power and networking needs don't interfere with a home's normal operational requirements; that the physical and IT security of the data furnace can be maintained; and that any hardware or software failures can be handled quickly and, as much as possible, remotely. As the authors point out, "Even at the event of software failure, the system should continue to provide heat until receiving physical services."

There is a nice summary of the authors' presentation at the Usenix Workshop here (PDF) which also includes questions from the Workshop audience concerning the viability of the idea. Presentation slides also can be found here (PDF).

Any opinions on the idea? Would you be tempted to replace your existing furnace with a data furnace? If not, what would it take?

Photo: iStockphoto


Smokey the AI

Smart image analysis algorithms, fed by cameras carried by drones and ground vehicles, can help power companies prevent forest fires

7 min read

The 2021 Dixie Fire in northern California is suspected of being caused by Pacific Gas & Electric's equipment. The fire is the second-largest in California history.

Robyn Beck/AFP/Getty Images

The 2020 fire season in the United States was the worst in at least 70 years, with some 4 million hectares burned on the West Coast alone. These West Coast fires killed at least 37 people, destroyed hundreds of structures, caused nearly US $20 billion in damage, and filled the air with smoke that threatened the health of millions of people. And this was on top of a 2018 fire season that burned more than 700,000 hectares of land in California, and a 2019-to-2020 wildfire season in Australia that torched nearly 18 million hectares.

While some of these fires started from human carelessness—or arson—far too many were sparked and spread by the electrical power infrastructure and power lines. The California Department of Forestry and Fire Protection (Cal Fire) calculates that nearly 100,000 burned hectares of those 2018 California fires were the fault of the electric power infrastructure, including the devastating Camp Fire, which wiped out most of the town of Paradise. And in July of this year, Pacific Gas & Electric indicated that blown fuses on one of its utility poles may have sparked the Dixie Fire, which burned nearly 400,000 hectares.

Until these recent disasters, most people, even those living in vulnerable areas, didn't give much thought to the fire risk from the electrical infrastructure. Power companies trim trees and inspect lines on a regular—if not particularly frequent—basis.

However, the frequency of these inspections has changed little over the years, even though climate change is causing drier and hotter weather conditions that lead to more intense wildfires. In addition, many key electrical components are operating beyond their expected service lives, including insulators, transformers, arrestors, and splices that are more than 40 years old. Many transmission towers, most built for a 40-year lifespan, are entering their final decade.
