
CES 2012: A Federation of Storage Clouds

2012 will be the year of store-anywhere, access-anywhere

2 min read

Smartphones, tablets, and laptops are connecting us to our content and to each other as never before, and the growing demand not just for information and content, but immediate access to it, is sparking a new wave of cloud storage services—this time, targeted to consumers.

In 2011, advances in software and infrastructure management technologies lowered the costs of remote data center storage, enabling just-in-time delivery of data and services to large numbers of clients. These technologies include virtualization, in which physical resources are subdivided into a much larger number of virtual devices.
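One form of storage virtualization, thin provisioning, lets a data center promise more virtual capacity than it physically has, allocating real blocks only when clients actually write. The toy sketch below (the class and block-granularity model are illustrative, not any vendor's implementation) shows the idea:

```python
class ThinProvisioner:
    """Toy thin provisioning: many virtual disks share one physical pool,
    and a physical block is consumed only when a block is first written."""

    def __init__(self, physical_blocks):
        self.free = physical_blocks   # physical blocks actually on hand
        self.disks = {}               # disk name -> set of written block numbers

    def create_disk(self, name, virtual_size):
        # The advertised virtual_size may exceed remaining physical capacity;
        # nothing is allocated at creation time.
        self.disks[name] = set()

    def write(self, name, block_no):
        if block_no not in self.disks[name]:  # first write to this block
            if self.free == 0:
                raise RuntimeError("physical pool exhausted")
            self.free -= 1
            self.disks[name].add(block_no)

pool = ThinProvisioner(physical_blocks=100)
for i in range(10):
    pool.create_disk(f"vm{i}", virtual_size=50)  # 500 virtual blocks promised
pool.write("vm0", 0)
print(pool.free)  # 99: only one physical block consumed so far
```

Because most virtual disks never fill up, the operator can serve far more clients per physical drive, which is one of the cost reductions the article describes.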

Storage tiering is another hot trend for 2012. In storage tiering, more active content is kept on faster, more expensive storage devices, while less active content is kept on slower, cheaper ones. As a result, a high level of service can be delivered at a lower cost. Deduplication, in which multiple copies of a piece of content are consolidated into a single copy with multiple pointers to it, is also making storage centers more efficient. Altogether, virtualization, deduplication, and storage tiering have increased average data center storage utilization from about 50 percent to as much as 80 percent.

Even so, today’s cloud storage remains more expensive than local storage. In early October 2011, for example, the website Nextag’s price for a 3-terabyte hard disk drive (pre-Thailand flood) was about US $0.04 per gigabyte. The annual fee for Google’s popular cloud storage service, on the other hand, comes to about $0.25 per gigabyte. This price gap, along with the concerns many consumers have about the privacy of personal content stored in public storage services, will likely limit the amount of consumer content stored in the cloud. Cloud storage will still expand greatly in 2012, in part because of the growth of mobile thin clients, with their limited storage capabilities and high demand for remote services through apps.
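The gap is easy to quantify with the figures quoted above (the 100 GB library size is a hypothetical example, and the local figure is a one-time hardware cost while the cloud figure recurs annually):

```python
# Price figures from the article: Oct. 2011 Nextag HDD price vs. Google's
# cloud storage fee. The 100 GB library is an illustrative assumption.
local_per_gb = 0.04    # US$ per GB, 3 TB hard drive (one-time purchase)
cloud_per_gb = 0.25    # US$ per GB per year, cloud storage fee

gigabytes = 100  # hypothetical consumer media library
local_cost = gigabytes * local_per_gb           # pay once
cloud_cost_per_year = gigabytes * cloud_per_gb  # pay every year

print(f"Local: ${local_cost:.2f} once; cloud: ${cloud_cost_per_year:.2f}/year")
print(f"Cloud costs {cloud_per_gb / local_per_gb:.2f}x as much per gigabyte")
```

At these rates the cloud charges roughly six times the local price per gigabyte in the first year alone, which is why price-sensitive consumers keep bulk content at home.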

Another new development is battery-powered, Wi-Fi-enabled storage devices, which can provide 16 to 500 GB of storage from a device small enough to fit into a pocket. These devices, from companies such as Kingston and Seagate, provide extra storage for smartphones, tablets, and other devices with limited storage space; the content is accessed through a downloaded application. Because a user carries them around, they can be considered a sort of personal cloud storage.

A related market, also growing rapidly, is that of network-attached storage (NAS) devices, which let consumers keep their content on personal storage devices in one place. In the past couple of years, companies such as Pogo Storage, Iomega, Western Digital, and G-Tech have been developing NAS devices that, in effect, give consumers private cloud storage services. These devices can be accessed remotely from outside the home and usually support protected access and encryption of stored and shared content. We can think of them as providing a home cloud.

In 2012, these technological advances make home and personal cloud storage available alongside remote data center cloud storage. Together, these various levels of networked storage, connected through the public Internet, are creating a larger federated cloud composed of public and private consumer cloud services. Such a federation could provide new and useful services and entertainment capabilities for consumers in 2012, and a federated cloud storage infrastructure could spur new businesses, new types of content, and new consumer devices to serve these opportunities.

Tom Coughlin is President of Coughlin Associates, an IEEE Senior Member, and Vice President of Operations and Planning of the IEEE Consumer Electronics Society.

