To TV sports fans, the yellow first-down line appears on stadium grass as if by magic, thanks to the wizardry of a group of former defense engineers
Stan Honey, the force behind the creation of Sportvision’s yellow football line, poses on a miniature football field in the company’s laboratory. The yellow line on the live video feed is generated to appear as if painted on the turf behind Honey. Photo: Robert Houser
It is “one of the most significant innovations to football coverage since instant replay,” pronounces Fred Gaudelli, executive producer of football for the ABC TV network. And yet it stemmed from a failure in hockey broadcasts that was rejected by die-hard fans as a garish gimmick that defaced their game.
If you’ve watched a pro football broadcast in the United States lately, you’ve seen what Gaudelli is talking about: an eerily realistic bright yellow line created on the playing field that shows you exactly how far the offense has to carry the ball to get a first down. “It makes it easier to watch the game, and that is what our job is all about,” Gaudelli tells IEEE Spectrum.
That yellow line has become such a staple in U.S. football that no self-respecting network would think of televising a game without it. It even won an Emmy award, for technical innovation. The tiny company that pioneered the technology, Sportvision Inc., headquartered in Chicago with offices in Mountain View, Calif., now covers up to 300 games a year with 18 crews. The state-of-the-art workstations the crews use, along with sensors and other hardware and software, solve a barrage of fiendishly difficult image-processing problems in a fraction of a second. And some casual viewers have no idea that the yellow line they see on the field is a computer graphics figment no more “real” than the weather maps that seem to swirl behind television meteorologists.
The story of the yellow-line system, which is called 1st & Ten, is a classically circuitous one that begins in the world of military technology. It goes into the hockey arena, where an early version of Sportvision's tracking technology turned the puck into a flashy orb that streaked across the ice with a fiery tail. It moves onto the football field. And most recently, it has branched out into car racing sponsored by NASCAR (the National Association for Stock Car Auto Racing, based in Daytona Beach, Fla.), where broadcasters conjure up graphics to give detailed race data in real time. A future version of the technology will even take NASCAR data and channel it to homes, where it will control the movements of virtual cars in interactive video games so couch potatoes can test their racing skills against the pros.
Faster than a speeding puck
Flash back to 1994. IEEE Member Stan Honey is executive vice president of technology for Rupert Murdoch’s vast media and entertainment empire, News Corp. (Sydney, Australia). It’s June, and Honey is in a meeting with David Hill, then president of News Corp.’s Fox Sports (now chairman and CEO of Fox Sports Television Group). Honey is rhapsodizing about the possibilities of virtual billboards, which would let technicians insert any graphics at will into the images of actual billboards in a stadium. Hill rejects the idea (since implemented by Sportvision and others), but suddenly asks, “Could you track and highlight a hockey puck?”
Honey responds, “I tracked things a lot harder than a hockey puck for the military, David, but you couldn’t afford it.”
“Just how much would it cost?” Hill asks.
“It would take two years to develop and cost about $2 million,” is the reply.
“You don’t understand the economics of sports,” Hill tells Honey. “Write a memo.”
A few days later, back in his office near San Francisco, Honey gets a call from Murdoch, who gets right to the point. “David says you can track and highlight a hockey puck, and you can get it done by the 1996 January All-Star game, and it’d only cost $2 million,” Murdoch says. “That is now your highest priority. If anybody asks you about the money, tell them to call me.”
With just 18 months to go before the game, Honey immediately starts assembling his team, relying heavily on engineers he’d worked with at SRI International in Menlo Park, Calif., in the early 1980s. The group had developed an over-the-horizon radar, underwater sensors, and an ultraprecise radio-positioning system for the military. Some of them had gone on with Honey when he founded Etak Inc., the company that pioneered in-car vehicle navigation; others had stayed at SRI. With the defense industry slumping, Honey has little trouble arousing interest in his project to break new ground in TV sports broadcasting.
The goal is simple. Make the televised image of a hockey puck glow so it’s easier for the viewer to spot, and, when it’s going really fast, put a tail on it showing its path. Murdoch and company hope the system can overcome the main complaint about televised hockey: the trouble that casual viewers have in following the fast-moving puck on TV screens.
Honey lays out his proposal. To sum up: it’s never been done before, it will undoubtedly be hard, but it’s doable. In other words, it’s “just the perfect project.” He quickly gathers a team of 10 and enlists help from Vista Research LLC (New York City), a group of defense engineers, and Shoreline Studios Inc. (Vancouver, B.C., Canada), a spinoff of Silicon Graphics Inc. (Mountain View, Calif.). The project becomes “a taste of Camelot” for the engineers involved, Honey says.
To track the hockey puck, a number of parameters have to be fed into a computer system and updated continually. First, the system has to know exactly where the broadcast cameras are focused. It also has to have some idea of how each camera lens distorts the image; different brands of lenses vary. It then has to figure out which camera’s feed is being displayed to viewers at any moment.
Meanwhile, the system also has to know exactly where the puck is and how fast it is traveling, and then it has to create a graphic based on that data and overlay it onto the video image 60 times a second. All these things have to be exactly synchronized, as the cameras are zooming and panning and the puck is traveling at up to 160 km an hour.
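At its core, the synchronization problem amounts to projecting the puck’s measured 3-D position into the active camera’s image, frame by frame. The sketch below uses an idealized pinhole camera with no lens distortion; every name and coordinate convention in it is an assumption for illustration, not the actual FoxTrax geometry:

```python
import numpy as np

def puck_screen_position(puck_xyz, cam_xyz, pan, tilt, focal_px):
    """Project a 3-D puck position into pixel coordinates for one camera.

    Idealized pinhole model, no lens distortion; pan and tilt in radians.
    When pan = tilt = 0 the camera looks along the +y axis. A simplified
    illustration, not the actual FoxTrax geometry.
    """
    p = np.asarray(puck_xyz, float) - np.asarray(cam_xyz, float)
    cp, sp = np.cos(pan), np.sin(pan)
    ct, st = np.cos(tilt), np.sin(tilt)
    p = np.array([[cp, -sp, 0], [sp, cp, 0], [0, 0, 1]]) @ p   # undo pan
    p = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]]) @ p   # undo tilt
    x, depth, z = p
    if depth <= 0:
        return None                        # puck is behind the camera
    return (focal_px * x / depth, focal_px * z / depth)
```

Run at 60 Hz against the pan, tilt, and zoom readings of whichever camera is on air, a projection like this tells the graphics engine where on screen to paint the glow and the trail.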
Making it work requires putting infrared transmitters in each hockey puck. Calibrating the system proves difficult. It is finally accomplished by drilling holes in the ice and filling them with blue dye to enable the system to accurately calculate locations on the ice. “It took some convincing to be allowed to do that,” recalls Marvin White, now Sportvision’s chief technology officer. Technically, the project is a success, and it comes in on budget and on time.
The technology works fine but manages to offend even hockey fans with its lack of subtlety. Fox Sports Television Group (Los Angeles) chooses to highlight the puck with a large, bright, fuzzy blue spot. A red rocket trail appears, painted right over the players, when the puck is traveling at high velocity. “It was comic-strip-like, Flash Gordon,” Honey says. Called “FoxTrax,” it makes its debut at the 1996 All-Star Game with a series of television commercials and a huge fanfare. Stacks of newspaper articles debate its pros and cons, the “Late Show with David Letterman” spoofs it, and hockey ratings jump to their highest levels ever.
But serious hockey fans hate it. After three years, pro hockey broadcasts in the United States switch networks, and the system dies a quiet death. “There are two ways for a product to fail,” Rick Cavallaro, Sportvision’s vice president of product development, tells Spectrum. “One is for it not to work; the other is for people to say it works great and we hate it.”
In hindsight, Honey says, the system would have been better accepted had it been subtler. Better to have placed the image of a gray disk under the puck—and underneath the players—and shown the puck’s track only when it was moving faster than a preset threshold, he says. That track could have been a clean black line, drawn as if by a drafting tool, making it clear that it was to provide information, not entertainment. “It’s possible to imagine,” Honey says, “that the diehard fans would have seen that as an advantage.”
Off the ice, onto the field
Meanwhile, unwilling to disband his crack team, Honey needs another project—fast. An idea for a fancy new telestrator for commentator John Madden’s play diagrams goes nowhere, but once again, Hill has another idea. “Why don’t you just do the first-down line?” he asks Honey. “It’s clean, it’s simple, and it’s important.” (The telestrator as described by Honey is, nevertheless, developed later and introduced in 2002.)
This time, though, Murdoch does not bring out the checkbook. Honey and his team, along with two News Corp. executives, Jerry Gepner and Bill Squadron, spin out a new company, Sportvision, taking rights to use all the patents and other intellectual property they have from the hockey puck project. In exchange, News Corp. gets 10 percent of the company.
The color map conundrum
Drawing a simple first-down line has got to be much simpler than continually tracking a puck bouncing around and traveling at up to 160 km an hour, right? Wrong. While some elements of the first-down line problem are similar to some in the hockey puck problem—you have to know where the cameras are pointed at all times and figure out where in the image to do your drawing—a number of factors make the seemingly simple task of drawing a line actually more difficult.
For one, the line has to be drawn as if it were under the players, not as an overlay. Also, the distortion of the television lenses becomes more critical—if the puck trail is a little off, it’s no big deal, but if the yellow line is curved incorrectly, it is immediately apparent next to the real white lines on the field. That lens distortion changes constantly as the cameras zoom in and out. Complicating all of this is the fact that, unlike hockey rinks, which are flat, football fields are not—they have a crown down the middle to allow drainage.
But the biggest problem of all is the color “keying.”
Color keying is done all the time in broadcast television and movies. The classic example is the weather forecaster standing in front of a blue screen: the blue areas of the image are replaced with video of a weather map, so the forecaster appears to be standing in front of the map. This type of keying is simple; the processor replaces any blue pixel with the corresponding pixel of the second image and leaves every other pixel alone. When this technology is being used, performers simply do not wear anything blue, or parts of them would seem to disappear.
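The blue-pixel substitution just described can be sketched in a few lines of NumPy. The function name, tolerance, and distance test here are illustrative choices, not a broadcast implementation:

```python
import numpy as np

def chroma_key(foreground, background, key_color=(0, 0, 255), tol=60):
    """Replace pixels near key_color in `foreground` with `background`.

    foreground, background: H x W x 3 uint8 RGB images.
    A pixel counts as "blue screen" if its RGB distance to key_color
    is below `tol`; everything else is left untouched.
    """
    dist = np.linalg.norm(foreground.astype(int) - np.array(key_color),
                          axis=-1)
    mask = dist < tol                      # True where the blue screen shows
    out = foreground.copy()
    out[mask] = background[mask]           # swap in the weather map
    return out
```

A single key color with one tolerance is exactly what a grass field defeats, which is the problem the next paragraphs describe.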
But a football field is not blue; it is many shades of green, and the exact color of the green changes every time a cloud passes over the field. Some fields have patches of brown dirt as well. Some players also have green uniforms, or brownish ones, or uniforms that become stained with grass or mud. As a worst-case scenario, the engineers consider the San Francisco 49ers playing the Green Bay Packers at sunset, after a recent rain: the image will have multiple colors of brown dirt, because there is dirt sitting in the sun, dirt in the shade, dry dirt, and wet dirt. It will have multiple shades of green grass. And it will have the 49ers wearing brown pants and the Packers in their green shirts [see photo, below].
A custom-made color key enables the yellow line to appear only over the green grass, not over the green jerseys. Photo: Robert Houser
Even with all that green and brown to deal with, the system has to make sure that the yellow line is never drawn over a player. Determining the key that defines what is field (to be drawn upon) and what is player (not to be drawn upon) is going to be a challenge.
Cavallaro, the project leader, fears that “in some cases, there just wouldn’t be enough pure color distinction,” resulting in some yellow line appearing incongruously on a player’s shirt, say.
So the first thing the team does is develop a more sophisticated method of color keying than had ever been done before, one that can be redefined quickly as lighting changes. The group produces a huge table of color definitions in multiple formats, the most common of which is RGB, specifying colors by their red, green, and blue components. An operator can use whichever set of color definitions works best in a particular situation. Proving that a sophisticated enough color key can be developed takes a month; then the team moves on to solving its other problems.
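A key built from a table of color definitions, rather than a single hue, might be structured like the sketch below. The class, its entries, and its thresholds are assumptions for illustration, not Sportvision’s actual format:

```python
import numpy as np

class ColorKey:
    """A keyer defined by a table of 'field' colors, each with its own
    tolerance.

    A pixel may be drawn on only if it is close to at least one entry,
    so sunlit grass, shaded grass, dry dirt, and wet dirt can each get
    their own entry while a green jersey of a different shade stays
    protected. Entries can be added or retuned as the light changes.
    """
    def __init__(self):
        self.entries = []                  # list of (rgb vector, tolerance)

    def add(self, rgb, tol):
        self.entries.append((np.array(rgb, dtype=int), tol))

    def mask(self, image):
        """Return an H x W boolean array: True where the line may be drawn."""
        img = image.astype(int)
        m = np.zeros(image.shape[:2], dtype=bool)
        for rgb, tol in self.entries:
            m |= np.linalg.norm(img - rgb, axis=-1) < tol
        return m
```

In use, an operator would load one table for a sunny field and another for an overcast one, which mirrors the article’s point that the key can be redefined quickly as conditions change.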
Each television camera, it turns out, electronically measures the zoom and focus position of its lens. This information is, amazingly, not used for much of anything until Sportvision taps into it to make sure the yellow line is correctly laid out. But zoom and focus information is not enough; the team also has to add sensor rings under the camera that measure its pan and tilt and the attitude of the tripod.
In the latest version, the solution is a controller mounted on the tripod head that collects all this data for each frame of video, synchronized by the camera’s vertical sync signal and then modulated onto an audio frequency before being sent back to the production area via a microphone channel.
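The idea of bundling the sensor readings into one packet per video frame can be sketched as follows. The field layout, byte order, and checksum here are hypothetical stand-ins, not the actual wire format sent down the microphone channel:

```python
import struct

# A hypothetical per-frame camera-sensor packet. One packet is produced per
# video frame, keyed to a frame counter driven by the camera's vertical sync.
FRAME_FMT = "<IffffH"   # frame counter, pan, tilt, zoom, focus, checksum

def pack_frame(frame, pan, tilt, zoom, focus):
    """Serialize one frame's worth of sensor data with a trivial checksum."""
    payload = struct.pack("<Iffff", frame, pan, tilt, zoom, focus)
    checksum = sum(payload) & 0xFFFF
    return payload + struct.pack("<H", checksum)

def unpack_frame(packet):
    """Recover the readings, rejecting packets damaged in transit."""
    frame, pan, tilt, zoom, focus, checksum = struct.unpack(FRAME_FMT, packet)
    if sum(packet[:-2]) & 0xFFFF != checksum:
        raise ValueError("corrupted packet")
    return frame, pan, tilt, zoom, focus
```

Tying each packet to the vertical sync is what lets the graphics engine match every sensor reading to exactly the frame it describes.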
About halfway through the project, the team decides to go with yellow, after considering and rejecting orange, blue, and red. “Yellow isn’t the most obvious choice,” Cavallaro admits. “Orange would make more sense because the chain gang [the officials who mark and measure first downs] has orange flags. But orange looks bad and yellow looks good.”
Finally, it’s all done.
The technology is first embraced by ESPN, which introduces it in the fall of 1998 and receives an Emmy for technical innovation. The introduction is done without fanfare, without teaser advertisements. It just appears one Sunday in September, with the intensity of the line turned down far below what it typically is today; blades of grass seem to peek through, making the line look like yellow chalk on the grass. Viewers are puzzled: is the line really on the field or not? And that mystery initially creates a lot of interest.
Today, Sportvision crews crisscross the country during football season. A crew of two typically arrives the day before the game, along with a small rack of computers, about the size of a dormitory refrigerator. They first measure the slope of the field using a laser surveying system, fit the rings on each camera, and calibrate zoom positions. An hour before the game starts, the operator chooses the color key, based on the colors of the field in both shade and sun and those of the players’ uniforms. During the game itself, every time a new first down is made, an operator looks at the screen and clicks either on the ball or on the chain gang; this positions the line.
Meanwhile, sensors on three main cameras continuously send camera-position information down an audio channel. While the video feed is delayed a fraction of a second, the computers consider all the data, determine whether the line should be drawn in that image, and, if it should, consult the color key to determine which pixels should be switched to yellow. The process repeats 60 times a second.
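Putting the pieces together, each delayed frame passes through compositing logic along these lines. The two masks and the particular shade of yellow are illustrative assumptions, not the production system:

```python
import numpy as np

def render_frame(frame, line_mask, field_mask, yellow=(255, 221, 0)):
    """Composite the first-down line into one delayed video frame.

    frame:      H x W x 3 uint8 image from the on-air camera.
    line_mask:  H x W bool, where the projected line falls on screen
                (computed upstream from camera pan, tilt, zoom, lens
                distortion, and the crown of the field).
    field_mask: H x W bool, pixels the color key says are grass or dirt.
    A sketch only; names and the yellow value are assumptions.
    """
    out = frame.copy()
    # Recolor only pixels that are both on the line and on the field,
    # so the line appears to pass underneath the players.
    drawable = line_mask & field_mask
    out[drawable] = yellow
    return out
```

Running this once per frame, 60 times a second, against freshly computed masks is what makes the line track the camera and stay behind the players.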
Viewers almost immediately accepted the technology. Today, while Sportvision creates the lion’s share of virtual first-down lines, others, like PVI Virtual Media Services LLC, in Lawrenceville, N.J., have gotten into the act.
Next up, NASCAR
Sportvision’s latest conquest is NASCAR racing. A Global Positioning System (GPS) receiver in each race car sends location information to a computer system that, during a two-second broadcast delay, generates graphics and other data on each car’s movement and position on the course, including its speed, brake status (on or off), and tachometer reading [see photo, above]. The technology was first used in a live race in 2001.
The imaging problem was similar to that in football, because the television camera lenses have long zoom ranges. This problem had been solved before. But at its inception, the NASCAR project was faced with a conundrum. It was clear that to get accurate positioning of each car on the track, a GPS receiver was needed in each car. But the conventional wisdom of the time was that GPS could never work in a racing environment because of the amount of electrical noise generated inside each car; the extreme multipath interference due to all the metal nearby as a car raced around a track; the metal fence that leans out over the track, blocking, in effect, half the sky; and the fact that the GPS receivers would have to provide accurate positioning within centimeters instead of the more typical meters.
Sportvision’s technology creates virtual dashboards in this screen shot from an NBC broadcast and one from InDemand (below), a pay-per-view service. Also shown are the race car’s number and position on the track and the name of the driver. Photo: Robert Houser
The solution was to give the GPS receivers a head start in figuring out their position by restricting the possibilities to a three-dimensional ribbon of track, measured in advance to within a few centimeters of accuracy.
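That head start amounts to constraining each position fix to lie on the pre-surveyed ribbon. Below is a much-simplified sketch that snaps a noisy fix to the nearest surveyed point; a real receiver would fold the constraint into the position solution itself and allow lateral offset across the ribbon’s width:

```python
import numpy as np

def snap_to_track(position, track_points):
    """Constrain a noisy 3-D position estimate to a pre-surveyed track.

    track_points: N x 3 array of points sampled along the track ribbon,
    measured in advance to centimeter accuracy. Returns the nearest
    surveyed point. Purely illustrative; not the RACEf/x algorithm.
    """
    d = np.linalg.norm(track_points - np.asarray(position, float), axis=1)
    return track_points[np.argmin(d)]
```

Restricting the solution space this way is what let centimeter-level results survive the multipath, the noise, and the fence blocking half the sky.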
Designing the telemetry, the method by which the GPS information would be sent back to the computer system from the car, was also a challenge. Previous NASCAR telemetry systems relied on a transmission control protocol/Internet protocol (TCP/IP)-like approach of acknowledge/retry. But the Sportvision system didn’t have time for that; it had to get the information for all the cars, compute the graphics, and display them in less than the two seconds the live broadcast was being delayed. So the engineers designed a customized telemetry system that used spread-spectrum modulation of the signals, and assigned each car two predetermined time slots for transmission so there would be no signal collisions.
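The fixed-slot scheme is a form of time-division multiplexing: with every car’s transmission times fixed in advance, there are no collisions and no retry round trips to eat into the two-second budget. A toy sketch, with cycle length and slot counts chosen arbitrarily for illustration:

```python
def transmit_schedule(car_ids, cycle_ms=200, slots_per_car=2):
    """Assign each car fixed, non-overlapping transmission slots.

    Divides one cycle into len(car_ids) * slots_per_car equal slots and
    spreads each car's slots evenly through the cycle, so every car's
    data arrives within one known cycle. Numbers are illustrative, not
    the actual RACEf/x parameters.
    """
    n_slots = len(car_ids) * slots_per_car
    slot_ms = cycle_ms / n_slots
    schedule = {}
    for i, car in enumerate(car_ids):
        schedule[car] = [round((i + k * len(car_ids)) * slot_ms, 3)
                         for k in range(slots_per_car)]
    return schedule
```

Because every start time is distinct and each slot is exactly one slot width long, no two cars ever transmit at once, which is the property the acknowledge/retry approach could not guarantee in time.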
After the design of the NASCAR system, RACEf/x, was complete, the technology proven, and the NASCAR authorities pleased enough with it to mandate its installation in all cars, there was a problem that nearly stopped the technology in its tracks.
RACEf/x was designed to use lithium-manganese dioxide batteries to power each car’s GPS and telemetry package. During the 2001 Watkins Glen (N.Y.) International Race, one battery pack overheated and started a fire in the battery enclosure, emitting smoke into the car. (The fire was extinguished without injury, though the car did not finish.)
Engineers determined that the batteries chosen were not stable under high vibration, and reported this to NASCAR. They then designed and manufactured replacement alkaline battery packs that were much larger and heavier but less flammable, and had those ready to install in time for the next weekend’s race. NASCAR, which had been mandating the use of Sportvision’s technology, gave drivers a chance to opt out; none did.
Internet in the future
Today, the Sportvision crew is expanding the NASCAR tracking system to interactive television and video games. In these games, an actual race is virtually reconstructed with slightly smaller vehicles and a home player’s car is inserted, allowing the gamer to race against the actual drivers. The company has introduced an Internet application called PitCommand, in which paid subscribers can watch a virtual aerial view of the race and view the track and data for any car they select, run instant replays at will, and change perspective.
Ultimately, the Internet is where the company sees its future.
“When the time comes for people at home to make decisions about how to customize their viewing experience, they will need data from the sporting event. And we will have that data,” Honey says. The company already has been granted over 20 patents related to gathering data at an event and sending it downstream to devices in the home.
But that future may take a while to arrive, and Sportvision isn’t holding its breath in anticipation.
The company, founded during the dot-com boom, was initially pressured by investors to become an Internet company instead of focusing on standard broadcast television, which was then perceived as a technology dinosaur. But Sportvision resisted, arguing that TV broadcasters have real money. Then, with its trucks and sensors at every sporting event, the company would be well positioned to dominate interactive Internet sports if and when that time came. Today, Sportvision is a solid, if still small, privately held business. It projects that it will be profitable on an operating basis in 2004.
Honey, meanwhile, stepped down as company president in October to become the navigator of a sailing crew attempting to set the record for the fastest passage around the world. He remains on the board.
“We haven’t made a bazillion doing this,” Honey told Spectrum, “but it’s the most fun we’ve ever had. It’s not like when we were building military systems. Everybody gets to see what we’re doing, and we’re doing something that is both technically hard and is in good cheer.”
To Probe Further
For information about Sportvision and its products, see the company’s website.
The Engineering & Technology History Wiki has a wealth of information about the development of Sportvision’s technology. See in particular the detailed accounts from three of Sportvision’s early collaborators: Rick Cavallaro, Dave Hill, and Stan Honey. Video footage from the company can also be found in the ETHW’s Sportvision article.
Stan Honey’s passion for sailboat racing later led him to become director of technology for the 2013 America’s Cup in San Francisco. In “The Augmented Reality America’s Cup,” he and Ken Milnes described the advanced graphics and real-time tracking technology that’s turning yacht racing into a spectator sport.