Attention among manufacturers is shifting to UHD, which, like 4G in telecoms, is actually a catch-all name for a grab bag of related technological improvements. (UHD covers not just higher resolution, but also options for improved color depth, sound, et cetera.) But before 3-D TV is put back into the technological attic, it's worth discussing why it was such a spectacular failure (and, hopefully, drawing some lessons for the future).
Four main answers seem to be bubbling up:
- A small, but significant, number of consumers either don’t have stereoscopic vision in the first place, or found that the technology gave them eye strain or headaches.
- The initial rush to get 3-D TV technology to market meant that content was often created with immature systems by inexperienced creators, resulting in a great deal of poorly produced 3-D content that alienated early adopters. Good 3-D production requires non-trivial investment in training and equipment.
- 3-D was largely useless as a storytelling tool. With the exception of 2009's Avatar and 2013's Gravity, people were hard-pressed to think of live-action movies that used the technology as an integral part of the cinematic experience.
- Sports, which was considered a potential killer app for 3-D, fell victim to the fact that inviting people over to watch the big game didn't really work when everyone needed glasses, not least because of the expense involved in buying additional pairs.
It's unlikely that the dream of 3-D TV is dead. Indeed, there are already people talking about how 8K video is what's needed to make it really work, so expect the inevitable resurrection attempts in a few years, perhaps with VR-style headsets, such as the Oculus Rift, or a commercialized version of the 3-D headset prototype Sony was demonstrating at its booth. But when it does come around again, it's worth looking at the list above and asking what's really changed.
For more from CES, check out our complete coverage.