The annals of film history enshrine the movies that heralded breakthroughs in cinematic technology. There’s The Jazz Singer in 1927, which was the first feature with sound; Becky Sharp in 1935, which pioneered three-color Technicolor; and Glory Road, which…
Glory Road ?
Yes, Glory Road. You may have missed it, but earlier this year that Disney feature about a scrappy Texas basketball team became the first motion picture released in a standardized digital form. Setting aside their film projectors, 29 theater owners ran the movie as a stream of bits from a stack of hard drives. In earlier attempts to launch digital technology, movies were encoded to play on proprietary systems, beginning with Star Wars Episode I: The Phantom Menace, which hit two screens in New Jersey and two more in California on 19 May 1999. The largest nonstandard digital release was the three-dimensional version of Chicken Little, which played on 100 screens in late 2005.
But by the end of this year more than 2000 North American theaters will be projecting bits instead of frames; by the end of 2007, more than 5000 North American screens will be digital. And the digital invasion is advancing around the globe. In Ireland, for example, 500 screens will be digital by the middle of 2007; in India, 2500 will convert by the end of that year.
After nearly a decade of talk and no action on the commercial front, digital cinema is taking the world by storm. The reason for the tempest? In a word: standards. On 27 July 2005, seven movie studios got together and published the first specification for digital cinema, and the motion picture industry launched its biggest transition since black-and-white movies gave way to color.
The ongoing shift to digital cinema will bring major benefits to moviegoers, theater owners, and the movie studios. For moviegoers, the move will mean a larger variety of features and possibly even other entertainment at their local movie theaters. The movies will have higher-quality images, and there will be more offerings in a 3-D format. In a digital world, much of the expense and difficulty of displaying a movie in 3-D disappears, and 3-D becomes a real option for moviemakers instead of just a gimmick [see sidebar, “Digital Cinema: Another Dimension”].
For theater owners, digital will make movies easier and cheaper to handle, ship, store, and discard. But for these exhibitors, the biggest benefit may turn out to be the simple ability to replicate a movie on-site for showing on multiple screens when it becomes an unexpected hit.
Studios will also save money—lots of it: the movie industry estimates that it currently spends close to US $1 billion annually to process and ship 35-millimeter films to theaters; it expects to save several hundreds of millions of dollars when 35-mm films are replaced with digital releases. Already, too, new applications are starting to emerge, including the showing of live sports events, legitimate theater offerings, and even operas at movie theaters.
While the bits themselves are much cheaper to replicate than reels of film, the up-front costs of putting a digital picture in front of a theater audience are about $100 000 per screen compared with about $35 000 for the corresponding film projection equipment. (The cost of the sound system is basically unchanged.)
And therein lies the rub. Theater owners who considered running digital pictures balked at making that kind of investment without assurance that the technology would be compatible with the offerings of all movie studios for the long term.
For instance, when Boeing Digital Cinema, CineComm, and Technicolor introduced incompatible digital cinema systems back in 2001, they didn’t catch on. The systems supplied by CineComm worked only with a particular brand of projector made by a Hughes/JVC joint venture, and that projector didn’t reproduce colors in the same manner as the new projectors being shipped by Texas Instruments. Lack of interoperable standards meant that each system required a separate master, or else the color red on one projector system, for example, might display as pink on the other.
Meanwhile, the Technicolor system used a compression technology from Qualcomm, which was incompatible with the more standard MPEG (for Moving Picture Experts Group) compression used by the Boeing system. Studios had to produce two entirely different digital files. The situation was, frankly, a mess.
So in March 2002, seven studios—Disney, Fox, MGM, Paramount, Sony Pictures Entertainment, Universal, and Warner Brothers Studios—established Digital Cinema Initiatives, in Los Angeles, to create a specification for digital cinema. They first set a quality threshold: the image resolution had to be, at a minimum, 2048 by 1080 pixels, a resolution loosely called 2K. The systems also had to be upgradable to double that resolution, called 4K. The consortium specified that the systems had to produce essentially the same range of colors as film does, with the future potential to include all colors visible to the human eye.
The group then began looking at how equipment manufacturers could put together a system that would provide that high-quality picture using a minimum of protected intellectual property. The studios knew that the fewer proprietary technologies they chose, the more widely and inexpensively they could implement the systems.
They ended up specifying a video compression technology and recommending uncompressed sound. But they didn’t recommend a specific projection technology, though the industry seems, for now, to have settled on Digital Light Processing (DLP), a micromirror system developed by Texas Instruments. Meanwhile, an efficient method of getting the digital files from the studios to the theaters is still evolving.
The picture, of course, is key. And several things conspire against its quality in conventional film prints. After just a dozen showings, dirt, grease, and scratches visibly degrade the image. What’s more, copying a film print through several generations, which is what the film labs do to generate the immense number of copies needed for distribution, also reduces image quality, in the same way that making a photocopy of a photocopy does.
At the outset, a digital picture, with its image clarity and range of color tones, is the equal of a pristine film print displayed on a well-maintained projection system. But the digital version is made from the exact images approved by the director or the studio, and it maintains that quality through an indefinite number of showings and copies.
Today, movies may still be shot on film and then digitized. The digital files typically used in movie production to capture, store, and edit movies after they are shot on 35-mm film are massive, as large as 6000 terabytes. The final uncompressed movie files are a few terabytes. Yet even these files would be too expensive for studios and theaters to store, ship, and handle.
Obviously, some sort of compression was needed to bring costs down. But the movie industry widely recognized that if digital cinema didn’t start out with quality that was as good as 35-mm film, it was doomed. Selecting a compression technology that would enable digital movies to be packed down to a reasonable size and without any visible loss of quality required Hollywood’s most discriminating observers to do a lot of testing. These “golden eyes” included cinematographers, movie directors, theater owners, and studio executives—all people who spend much of their professional careers examining the minute details of images, such as color, contrast, and even the tiniest artifacts that might somehow render an image less realistic.
These cinema experts converged in 2002 on the Hollywood Pacific Theater, a grand old movie palace taken over by the Entertainment Technology Center at the University of Southern California and turned into the industry’s Digital Cinema Laboratory. After replacing the old 35-mm projection systems with the best film projectors available, the group invited digital technology vendors to set up test equipment and asked providers of compression technologies to face off against one another. And the games began.
An important technology contender was MPEG-2, the compression system created in 1994 and now ubiquitously used around the world for television, DVD, and Internet video. The problem with MPEG-2, however, is motion artifacts—the appearance of discontinuity or jerkiness in action scenes, particularly ones involving speeding cars or fire.
These motion artifacts appear because MPEG-2 uses temporal compression. The technique essentially encodes only the differences between frames, so, after the initial frame in a scene, the digital files typically need to add very little information for subsequent frames. But in scenes with a lot of action, many changes occur between frames, and the encoder, forced to describe them all within a fixed bit budget, must approximate, making movement on the screen appear jerky or displaying chunks of the picture as single-color blocks. Motion artifacts are rarely noticeable on a television screen but are all too apparent when magnified on a large theater screen.
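For readers who want to see the idea in code, here is a toy sketch of temporal compression in Python. It is an illustration only, not MPEG-2: real encoders work on motion-compensated blocks of pixels rather than individual values, and all the frame data here is invented.

```python
# Toy illustration of temporal (inter-frame) compression: store the first
# frame whole, then record only the pixels that change between frames.
# Frames are modeled as flat lists of pixel values for simplicity.

def encode_sequence(frames):
    """Encode a key frame followed by per-frame deltas (index, new value)."""
    encoded = [("key", list(frames[0]))]
    for prev, cur in zip(frames, frames[1:]):
        delta = [(i, p) for i, (q, p) in enumerate(zip(prev, cur)) if p != q]
        encoded.append(("delta", delta))
    return encoded

def decode_sequence(encoded):
    """Rebuild every frame by applying each delta to the previous frame."""
    frames = [list(encoded[0][1])]
    for kind, delta in encoded[1:]:
        frame = list(frames[-1])
        for i, p in delta:
            frame[i] = p
        frames.append(frame)
    return frames

# A static scene compresses almost for free; an action scene does not.
static = [[5] * 16, [5] * 16, [5] * 16]
action = [[5] * 16, list(range(16)), list(range(15, -1, -1))]

static_cost = sum(len(d) for kind, d in encode_sequence(static)[1:])
action_cost = sum(len(d) for kind, d in encode_sequence(action)[1:])
assert decode_sequence(encode_sequence(action)) == action
print(static_cost, action_cost)  # the action scene needs far more delta data
```

With a fixed bit budget per frame, the large deltas of the action scene are exactly what a real encoder is forced to approximate, which is where the jerkiness and blockiness come from.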
The specification developers also considered the cost of implementation. Companies offered a variety of compression schemes, but many had costly licensing requirements or restricted manufacturing to single sources of supply. The industry wanted quality, but it also wanted an open and competitive market. The group finally settled on JPEG2000 as providing the best possible image quality with the least-encumbered intellectual property.
The original JPEG format (for Joint Photographic Experts Group), whose development began in 1986, is ubiquitous in consumer digital cameras. JPEG is the popular name for ISO 10918-1, a standard created by the combined efforts of many image-processing experts from industry and academia. The format compresses large image files into manageable sizes through so-called lossy compression, a technique that discards some information. When done well, the algorithms discard mostly information that is unimportant to the human eye or to the human brain as it processes signals from the eye.
JPEG2000, first published in 2003, updates that classic technology. It compresses whole frames as if each were a separate picture, which in fact each is. But instead of using the original JPEG algorithms, which analyzed each image and threw away the least important data, JPEG2000 uses “wavelets.”
In this technique, the compression software repeatedly filters the image into a coarse, low-resolution approximation plus sets of detail coefficients that record what each finer level of resolution adds. A wavelet is a short, localized waveform, and the transform expresses the image as a weighted sum of scaled and shifted copies of it. Most of the detail coefficients turn out to be close to zero, so they can be recorded coarsely or thrown away with little visible effect. When a computer later decompresses the image data for viewing, it runs the transform in reverse to rebuild the picture.
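The idea can be sketched with the Haar wavelet, the simplest member of the family. JPEG2000 itself uses more sophisticated 9/7 and 5/3 filter pairs and works in two dimensions over many levels; the pixel values and threshold below are invented for illustration.

```python
# A minimal sketch of wavelet-style compression using the Haar wavelet.
# One level of the transform splits a signal into pairwise averages (the
# coarse approximation) and pairwise differences (the detail coefficients).

def haar_forward(signal):
    """One Haar level: averages and differences of adjacent sample pairs."""
    averages = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    details = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    return averages, details

def haar_inverse(averages, details):
    """Rebuild the signal exactly from averages and details."""
    signal = []
    for avg, det in zip(averages, details):
        signal += [avg + det, avg - det]
    return signal

def compress(signal, threshold):
    """Lossy step: zero out detail coefficients too small to matter."""
    averages, details = haar_forward(signal)
    details = [d if abs(d) >= threshold else 0 for d in details]
    return averages, details

row = [52, 54, 53, 55, 120, 124, 118, 122]   # one row of pixel values
avgs, dets = compress(row, threshold=3)
restored = haar_inverse(avgs, dets)
print(restored)  # close to the original row; fine detail is smoothed away
```

Notice that the small pixel-to-pixel wiggles vanish while the sharp jump in the middle of the row, the kind of edge the eye cares about, survives intact. That selectivity is what lets wavelet compression discard data with few visible glitches.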
Using wavelets for compression eliminates some of the glitches that appear in images when important data is ignored. However, the process takes a lot of computing power. That’s probably why it hasn’t made it into popular consumer products, but it’s not an insurmountable problem for high-end theater systems.
JPEG2000 also assigns more bits to the digital representation of color, another means of improving the image. Today’s digital cameras typically allocate 8 bits per color, or 24 bits per pixel, to identify the color of that pixel. JPEG2000 allocates 16 bits per color, or 48 bits per pixel, creating a much broader color palette. The result of wavelet compression, combined with more accurate color representation, is a typical file size for a feature movie after JPEG2000 compression of 300 gigabytes or less.
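The arithmetic behind those figures is easy to check. The two-hour runtime below is an assumption made for illustration:

```python
# Back-of-the-envelope check of the numbers in the text: a 2K frame at
# 48 bits per pixel, played at the cinema-standard 24 frames per second,
# compared with the roughly 300-gigabyte JPEG2000 target.

width, height = 2048, 1080        # 2K resolution
bits_per_pixel = 48               # 16 bits per color x 3 colors
fps = 24
runtime_s = 2 * 60 * 60           # assume a two-hour feature

bytes_per_frame = width * height * bits_per_pixel // 8
raw_bytes = bytes_per_frame * fps * runtime_s

print(f"raw frame: {bytes_per_frame / 1e6:.1f} MB")
print(f"raw feature: {raw_bytes / 1e12:.2f} TB")
print(f"compression needed for 300 GB: {raw_bytes / 300e9:.1f}:1")
```

The raw feature comes out at a few terabytes, matching the uncompressed file sizes mentioned earlier, and reaching 300 GB requires only modest compression of roughly 8:1, which is why the quality loss can be kept invisible.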
The industry group spent much less time considering standards for sound. It determined that, because sound accounts for a negligible fraction of a movie's total data, it could simply use standard uncompressed CD-quality audio as represented by the Wave digital encoding format. This audio file format, developed by Microsoft, is common in PCs and game software.
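A quick calculation shows why sound is negligible. The six-channel soundtrack and two-hour runtime below are assumptions for illustration; CD quality means 44.1-kilohertz sampling at 16 bits per sample.

```python
# Rough check of the claim that uncompressed audio is a negligible share
# of a movie's data, compared with the ~300-GB JPEG2000 picture.

sample_rate = 44_100        # samples per second (CD quality)
bits_per_sample = 16        # CD quality
channels = 6                # assume a 5.1-channel soundtrack
runtime_s = 2 * 60 * 60     # assume a two-hour feature

audio_bytes = sample_rate * bits_per_sample * channels * runtime_s // 8
picture_bytes = 300e9       # compressed picture size from the text

print(f"audio: {audio_bytes / 1e9:.1f} GB")
print(f"share of total: {audio_bytes / (audio_bytes + picture_bytes):.1%}")
```

Even with six uncompressed channels, the soundtrack amounts to only a percent or two of the package, so compressing it would buy almost nothing.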
Having settled on the type of data movie studios would send to exhibitors, the next challenge was to figure out what kind of equipment would translate those bits into images on the big screen.
With film, exhibitors have a mature, stable technology that requires little training to use and is reasonably easy to fix when equipment breaks down. Consequently, problems in film projection are rare, and it’s unusual that they interrupt more than a single showing. To measure up, digital cinema systems had to have similar reliability, ease of use, and limited downtime.
At a minimum, a digital cinema system requires a digital projector and a media player. The media player stores the movie and decrypts and then uncompresses the digital cinema files during playback. The system also needs to have user-programmable functions, so that exhibitors can easily select which trailers, advertisements, and movie files to play. Another thing: it can’t let pirates get at the data.
The projector has to be able to take the output of the media player and turn it into the picture that the moviemaker intended. That is, the projector needs to know if the picture is in CinemaScope, which is an extended wide-screen format, or in a squarer format. It also needs to know if the movie is to be presented in 3-D or not [see sidebar, “Digital Cinema: Another Dimension”]. And it has to add forensic data that, though invisible to the moviegoer, will disclose the time and place of the movie’s showing if a video-camera-wielding pirate captures the images.
That’s the minimum. But in today’s world, a single-screen theater is rare. In modern multiplexes, exhibitors play movies on multiple screens and move them from auditorium to auditorium while trying to match seating counts to demand. To operate in a multiplex, the digital cinema media players must be networked with a central management server [see diagram, “Following the Bits”]. This networked system allows exhibitors to move titles between screens within the multiplex and enables third-party support companies to remotely monitor and troubleshoot the system.
After Digital Cinema Initiatives released the first digital cinema specifications in the summer of 2005, a few companies began marketing tools to equip multiplexes. Access Integrated Technologies, Dolby Laboratories, Doremi Labs, Eastman Kodak, NEC, and QuVIS, which had each already started manufacturing digital cinema equipment, introduced upgrades to comply with the new specification.
All systems included provisions for networked control and management of the screens, to allow theater managers to schedule movies and to start and stop shows in one auditorium from any other auditorium in the complex. All but the NEC systems, however, lacked a central library server to allow flexible programming of movies into any of multiple auditoriums. And all would work only if a theater complex installed the same brand of equipment for every screen.
In late 2005, two companies—Christie Digital Systems, in Cypress, Calif., and Access Integrated Technologies, in Morristown, N.J.—jointly developed a central library management system. So far, this is the only system designed to allow theater owners to purchase media players and projectors from multiple manufacturers.
Christie or AccessIT can build the library server on Windows- or Linux-based servers from Compaq, Dell, and Stratus. Media players might come from Dolby, Doremi, Kodak, or QuVIS. The ability to choose adds flexibility and competition that lowers prices. All four players can use the same digital files, removing the biggest headache for the movie studios.
Throughout the development of digital cinema, the industry has been paying a lot of attention to the digital cinema projector. The projector is central to the whole system and has the biggest effect on picture quality. Today’s typical projector uses DLP chips, a technology also used in high-end home projection TVs. DLP chips modulate light by bouncing it off arrays of micromirrors, one array each for the red, green, and blue components of a video picture [see “Goodbye, CRT,” IEEE Spectrum, November].
TI’s DLP Cinema, designed for the commercial market, uses 2 million micromirrors on each chip to produce images 2048 pixels across. Home sets typically contain 1 million mirrors and project images 1280 pixels across; they also have less contrast than the cinema-grade chips.
The 2 million–mirror, 2048-pixel theater system is the pixel equivalent of what the eye can see in 35-mm celluloid film projected from a pristine print by a well-maintained film projector, as verified by ordinary viewers as well as Hollywood’s “golden eyes.” A small number of theaters across the globe have used DLP Cinema successfully for five years without significant problems, demonstrating the technology’s reliability.
Now developing even higher-resolution, 4096-pixel-wide (4K) projectors, JVC, NTT, and Sony are using liquid crystal on silicon (LCOS) technology, which is also found in sets sold for home use. LCOS devices use digital memory cells in a densely packed array. The memory cells cause the liquid crystals to twist and, like a window blind, change the amount of light reflected, from pure white to black. LCOS devices are undeniably promising but still untested in the commercial cinema market.
The two projection technologies also differ in their ability to display 3-D movies. In this area, DLP has the advantage. Current LCOS systems cannot display 3-D very well because of the higher frame rates required, removing one of the advantages digital cinema has over film.
The way things are shaping up today, the movie studios have digital files, and the theater owners are quickly acquiring digital file-management and projection systems. What’s missing is the link between the two—how the studios get the movies in digital format to the theaters. In the short term, the answer is very low-tech: by truck or delivery van. It leaves a lot to be desired; imagine if suddenly e-mail were banned, and to communicate using computers, you had to go back to mailing or hand delivering floppy disks.
To make matters worse, media large enough to hold an entire movie on a single unit, like a hard drive or a tape cartridge, are delicate, making transport tricky. Also, the equipment used to transfer the files to the media servers—the tape readers or hard disk players—is costly.
At one time, the movie industry had high hopes for terrestrial fiber-optic delivery. On the plus side, fiber is secure, because it is nearly impossible to intercept a message that travels by light through thin threads of glass. However, in rural parts of the developed world—to say nothing of the developing world—many theaters are not located near fiber lines.
That leaves satellite. Satellite communications systems are perfect for distributing data from one point to many others. Conditional access methods that allow a sender to address specific receivers and not others can ensure that the files are delivered only to authorized recipients.
Satellite delivery is, by its nature, an automated process. Physical media, on the other hand, require human intervention during delivery, like receiving the delivered package, plugging the media into the in-theater system, and triggering the software to upload the content onto the theater’s server. Automatic satellite delivery simplifies theater management by eliminating the need to schedule labor and removing the element of risk that human intervention inevitably adds.
Nevertheless, analysts disagree on the number of theaters that would have to be able to receive a single satellite transmission to make satellite distribution cost-effective. Some say the number is as low as 30; others argue that some 3000 theaters would have to have satellite access to make it viable.
At the moment, AccessIT and Microspace Communications Corp., a Raleigh, N.C., company that sells video and data satellite services to corporate users, are promoting satellite as their preferred method of delivery. AccessIT has delivered more than 40 feature-length movies this way. While it is still too early to see how the market will react, each company claims that its satellite option is cost-effective and sustainable.
A year ago, with the digital cinema specifications largely complete, networked theater equipment on the market, and a couple of distribution methods worked out, one key question remained: Who was going to pay for this massively expensive conversion? In the United States and Canada alone, some 36 000 screens await conversion, at an estimated cost of $3.5 billion. Add in the rest of the world and the costs climb into the tens of billions. Even for Hollywood, that’s a lot of money.
In the United States alone, the studios stand to save hundreds of millions of dollars a year in a fully digitized theatrical environment. But it is the theater owners who must replace 35-mm systems with plenty of life left in them, and they won’t spend that kind of money on technology that many of them continue to regard as unproven. So third parties stepped in.
Last October, two companies, AccessIT and Technicolor (yes, Technicolor, the same company that introduced color movies 70 years ago), launched the latest attempt to broker a deal to lower the up-front costs to studios and the risks to theater owners. And if in the process they create a sizable business for themselves, so much the better.
The plan is based on a “virtual print fee,” which the movie studios will pay to the company that provides and installs their digital cinema equipment and software for each showing of a movie on the digital cinema system. Because the movie studios will be paying the bulk of the cost of converting the industry to a digital infrastructure, the business proposition is attractive to exhibitors. It’s basically like a mortgage: the virtual print payments end once the vendor has accumulated the original costs of the equipment plus interest.
While the exact dollar figures haven’t been made public, the cost to the movie studio is supposedly lower than the cost of making the equivalent number of film prints. One print typically costs between $1200 and $2000; a studio must make one print for every cinema screen displaying the movie at a given time and replace prints that wear out.
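The mortgage-like arithmetic can be sketched with assumed numbers. Only the $100 000 per-screen equipment figure comes from earlier in this article; the fee, interest rate, and booking schedule below are invented for illustration, since the real terms have not been made public.

```python
# A hedged sketch of how a virtual-print-fee arrangement might amortize
# the equipment cost, like a mortgage: interest accrues on the balance,
# and each movie engagement pays a fee against it until it reaches zero.

equipment_cost = 100_000     # per-screen cost cited in the text
annual_rate = 0.06           # assumed interest rate
fee_per_engagement = 800     # assumed fee per movie booking
engagements_per_year = 26    # assume a new title every two weeks

balance, years = float(equipment_cost), 0
while balance > 0:
    balance = balance * (1 + annual_rate) - fee_per_engagement * engagements_per_year
    years += 1

print(f"paid off after about {years} years")
```

Under these invented terms the equipment pays off in well under a decade, and each fee is comfortably below the $1200 to $2000 cost of a film print, which is the comparison that makes the scheme attractive to the studios.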
Though AccessIT and Technicolor have adopted the same concept to lower up-front costs, they are implementing it competitively. All the major movie studios have agreed to participate in one or both of their plans.
In a few test cases, certain vendors and even the movie studios have paid outright to upgrade theaters, but the virtual print fee is the only broad-based economic model currently rolling out in the United States. The rest of the world is watching to see how this model works out, though in some regions governments may fund the transition.
While screen after screen is going digital, and the eventual transition to digital cinema is now certain, the technology has a lot of room to grow. Projectors with even higher resolution than the 4K systems now in development are on the horizon. Faster computers and increases in network bandwidth will open the door for more powerful systems that will let theater managers transfer movies more quickly between screens and even play movies out of a central library in the theater, eliminating the need for a server for each screen.
New business models will emerge from this evolving digital milieu. Some pioneers, like National CineMedia, in Centennial, Colo., and AccessIT, are already testing the viability of putting sporting events and concerts into movie theaters. BroadwayOnline has captured live Broadway shows (Jekyll & Hyde, Puttin’ It Together, and Smokey Joe’s Café) in high definition and has experimentally exhibited these videos at movie theaters throughout the country since the first DLP Cinema projectors were released five years ago. The Metropolitan Opera in New York City is planning to show some operas in its 2006–2007 series in movie theaters.
In just five or 10 years, the impact of digital cinema is likely to go far beyond the cost savings and quality improvements envisioned now, bringing moviegoers new forms of entertainment that we have yet to imagine.
About the Author
RUSSELL WINTNER has been pioneering digital cinema technology for the past 11 years. As president and chief operating officer of Access Integrated Technologies Digital Media Services, Wintner is responsible for Access Integrated Technologies’ digital content management and delivery services.
To Probe Further
You can download the full specifications for digital cinema from the official Web site of Digital Cinema Initiatives, http://www.dcimovies.com.
For more on Access Integrated Technologies and its software and services for digital cinema, see http://www.accessitx.com.
Details on Texas Instruments’ Digital Light Processing technology and its digital cinema applications are available at http://www.dlpcinema.com.
For daily updates on the progress of the digital cinema rollout, see DCinema Today, an online resource, at http://www.dcinematoday.com.