Researchers from Brazilian and Canadian universities have developed a spherical display that lets users see and interact with three-dimensional objects. In one demonstration, viewers have the sensation of staring into a snow globe that they can control with simple gestures from any angle.
The device, called Spheree, represents the first display capable of projecting uniform, high-resolution pixels on a spherical surface—a technology that also allows users to manipulate the displayed 3-D objects with gestures.
The Spheree allowed attendees at the SIGGRAPH 2014 convention held in Vancouver last week to play with a Snow Globe 3D animation that included a house, animated snow and a train chugging around the house. That interactive display required eight pocket-size projectors mounted at the base of the globe, as well as software capable of blending the individual projector views into a single uniform image visible from almost any point around the spherical surface.
Small pico-projectors like the ones used for the demonstration have lower resolution and brightness than traditional projectors—a problem for a virtual reality system that aims for high quality. But the international team of Brazilian and Canadian researchers used an auto-calibration algorithm called FastFusion to seamlessly blend the many projected images without losing resolution or brightness. A basic webcam allows the algorithm to see where each projector's image falls on the globe and to compute each image's contribution to the overall final image.
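The core idea of that blending step can be illustrated in a few lines. The sketch below is hypothetical (it is not the actual FastFusion code, and the function name and mask format are assumptions): given per-projector coverage masks of the kind a webcam calibration pass might recover, it computes per-pixel weights so that regions covered by several overlapping projectors end up no brighter than regions covered by just one.

```python
import numpy as np

def blend_weights(coverage_masks):
    """Compute per-pixel blend weights for overlapping projectors.

    coverage_masks: array of shape (n_projectors, H, W), True where
    a projector's image reaches that pixel on the display surface.
    Returns weights of the same shape; at every covered pixel the
    weights across projectors sum to 1, evening out the brightness.
    """
    masks = np.asarray(coverage_masks, dtype=float)
    total = masks.sum(axis=0)          # how many projectors hit each pixel
    total[total == 0] = 1.0            # avoid division by zero off-surface
    return masks / total

# Toy example: two projectors whose images overlap in the middle column.
p1 = np.array([[1, 1, 0]], dtype=bool)
p2 = np.array([[0, 1, 1]], dtype=bool)
w = blend_weights([p1, p2])
# In the overlap, each projector contributes half intensity.
```

In a real system the weights would also be feathered smoothly across overlap boundaries rather than split evenly, but the principle—measure coverage with a camera, then normalize contributions per pixel—is the same.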
The auto-calibration system works with practically any number of pico-projectors, which means researchers could build ever-larger versions of Spheree. The team has already tested a four pico-projector system with an 18-centimeter-wide display and an eight pico-projector system with a 51-centimeter-wide display. Because the design uses no special mirrors or lenses, the overall projected image has no "blind spots."
Spheree also uses six infrared cameras to track the movement of special headbands worn by viewers. A computer uses the tracking data to continuously render perspective-corrected virtual scenes based on the viewer's position relative to the globe. A Leap Motion interface adds gesture control, letting users start, pause, stop, and step animations forward and backward.
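The perspective-correction step amounts to re-rendering the scene from wherever the tracked head happens to be. The snippet below is a hypothetical illustration (not Spheree's actual code): on each tracking update it builds a standard look-at view matrix aimed at the globe's center from the viewer's eye position, which a renderer could then use to draw the corrected scene.

```python
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Build a 4x4 view matrix looking from `eye` toward `target`."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    forward = target - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, up)
    right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = right, true_up, -forward
    view[:3, 3] = -view[:3, :3] @ eye      # translate eye to the origin
    return view

# One tracking update: head tracked at (0.3, 0.2, 1.0) meters,
# globe centered at the origin of the tracking coordinate frame.
V = look_at(eye=(0.3, 0.2, 1.0), target=(0.0, 0.0, 0.0))
```

A renderer fed this matrix sixty times a second would keep the scene locked to the viewer's vantage point as they walk around the sphere.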
The system uses a second computer to run 3-D animations with the open-source Blender software. Researchers envision Spheree helping animators or modelers by showing 3-D computer animations or the results of image-based rendering applications—perhaps as a second screen. A larger version of Spheree might provide walk-around experiences for team projects or show up in interactive museum displays. Future video games or toys might also make use of such technology.
Spheree contributors: University of British Columbia, Canada; Universidade Federal do ABC (UFABC), Brazil; Federal University of São Carlos (UFSCar), Brazil; and University of Saskatchewan, Canada.
Jeremy Hsu has been working as a science and technology journalist in New York City since 2008. He has written on subjects as diverse as supercomputing and wearable electronics for IEEE Spectrum. When he’s not trying to wrap his head around the latest quantum computing news for Spectrum, he also contributes to a variety of publications such as Scientific American, Discover, Popular Science, and others. He is a graduate of New York University’s Science, Health & Environmental Reporting Program.