Infrastructure Engineers Prepare for Tsunami of Data from Virtual Reality, IoT

Sony hopes to catch the wave with its optical jukebox for data storage

It’s coming, it’s really coming, and it’s going to be huuuuugggge!

That was the message of the tech company executives keynoting the annual Open Compute Project Summit, held this week in San Jose, Calif.

They were talking about data. A tsunami of data. More data than anybody—even in an era in which 300 hours of video is posted to YouTube every minute—has yet seen.

Before considering the challenges of this data deluge, the executives patted themselves on the back for launching the Open Compute Project in 2011. They boasted that it is leading to a far more adaptable and efficient infrastructure than would have been possible in an era of proprietary systems. And they welcomed several telecommunications companies, including AT&T, Verizon, Deutsche Telekom, and SK Telecom, that have just joined the consortium.

Then they turned to their plans for dealing with the coming wave of data—driven by the growth of the Internet of Things and Virtual Reality—that will soon hit the communications networks and data centers operated by companies like Facebook, Google, and Amazon.

Mahmoud El-Assir, chief information officer for Verizon, said that getting data quickly from the data center to the user will become more of an issue for virtual reality and things like connected cars, in which even small latency issues cause big problems. “We are moving to software defined networks,” he explained, because, to serve VR and the IoT, “the current networks are impossible to scale.” The future data center, El-Assir said, will be small, unmanned, and closer to the customer.

Kangwon Lee, senior vice president for R&D at SK Telecom, concurred, adding that “virtualizing the network gives flexibility and finer grain control.”

“If your customer [say, a gamer with a new Oculus VR] needs a lot of bandwidth,” Lee said, “we can create a virtual pipe to that customer. For a connected car scenario, in which you want very low latency, we can route traffic at the edge of the mobile network.”

Jay Parikh, vice president of engineering at Facebook, pointed to Intel’s Yosemite server system-on-a-chip and the new 100-gigabit Wedge Ethernet switch as advances that will help cope with the coming data tsunami. Parikh indicated that the company is looking to artificial intelligence technology to help solve many of its other problems, but said that one frustrating bottleneck remains: storage.

“Disk drives are getting bigger, but not more reliable,” Parikh said. “Latency is not improving. Flash is getting bigger, but latency is only improving slightly.”

Urs Hölzle, senior vice president for technical infrastructure at Google, the most recent member of the Open Compute Project, is also looking for big improvements in storage. He called for disk manufacturers “to think about alternate form factors and functionality for disks in data centers. Individual operators don’t care about individual disks, they care about a lot of them tied together—so there is a lot of room to save costs and complexity.”

Sony, it turns out, was ready to answer that call. Later that day, the company exhibited for the first time its optical storage system designed specifically for data center use: the Everspan. Horst Schellong, vice president of sales for Sony Optical Archive, said the system is a “scalable, highly reliable product that stores data forever.” (Well, at least what seems like forever compared with today’s alternatives: Sony guarantees the archival disks for 100 years if stored at or below 35 °C/95 °F, and for 50 years otherwise.)

The optical storage system uses a blue laser to read and write to optical disks (though Schellong emphasized that the format is not compatible with today’s Blu-ray disks). Each disk has two sides, with three recording layers on each side, for a total capacity of 300 gigabytes (the company plans to increase this to 1,000 GB). The drive reads and writes both sides simultaneously, using four lasers on each side, and has a peak transfer rate of 315 MB/s. Each storage system includes 64 drives, for an aggregate data transfer rate, Schellong said, of 1 petabyte per day.
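Those numbers roughly tally. The short sketch below is back-of-the-envelope arithmetic based only on the figures quoted above, not anything Sony published: 64 drives at the quoted peak rate would move about 20 GB/s, or roughly 1.7 petabytes per day if they never paused, so the 1-petabyte-per-day figure presumably reflects sustained throughput once robot media swaps are accounted for.

```python
# Back-of-the-envelope check of the Everspan throughput figures quoted
# above (decimal units: 1 GB = 1e3 MB, 1 PB = 1e9 MB).
PEAK_MB_PER_S_PER_DRIVE = 315      # quoted peak transfer rate of one drive
DRIVES_PER_SYSTEM = 64             # drives quoted per storage system
SECONDS_PER_DAY = 24 * 60 * 60

aggregate_mb_per_s = PEAK_MB_PER_S_PER_DRIVE * DRIVES_PER_SYSTEM
peak_pb_per_day = aggregate_mb_per_s * SECONDS_PER_DAY / 1e9

print(f"aggregate peak rate: {aggregate_mb_per_s / 1e3:.1f} GB/s")  # ~20.2 GB/s
print(f"theoretical peak per day: {peak_pb_per_day:.2f} PB")        # ~1.74 PB
```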

The Everspan, it appears, does what Hölzle seems to want—that is, it offers expansion in an efficient way. One set of drives can serve up to 14 expansion units, each holding 680 trays with 64 disks per tray, for a total of 181 petabytes of data. The trays are collected and fed to the readers by two robots—one to fetch the trays, and one to load and unload them. In other words, the system operates like a very smart jukebox. Sony will begin shipping Everspan systems in July.
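The capacity claim also roughly checks out against the per-disk figures above. The sketch below is again simple arithmetic rather than Sony’s spec sheet; the small gap between the roughly 183 PB it yields and the quoted 181 PB presumably comes from formatting or filesystem overhead.

```python
# Rough check of the quoted 181-petabyte capacity, using only the figures
# in the article (decimal units: 1 PB = 1e6 GB).
GB_PER_DISK = 300              # current per-disk capacity
DISKS_PER_TRAY = 64
TRAYS_PER_EXPANSION_UNIT = 680
EXPANSION_UNITS = 14           # maximum served by one set of drives

total_disks = EXPANSION_UNITS * TRAYS_PER_EXPANSION_UNIT * DISKS_PER_TRAY
total_pb = total_disks * GB_PER_DISK / 1e6

print(f"disks in a full system: {total_disks:,}")   # 609,280
print(f"raw capacity: {total_pb:.0f} PB")           # ~183 PB vs. the quoted 181 PB
```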
