When Spectrum Auctions Fail


For some microwave links, cooperation beats competition as a way to share the air


Most people think that the atomic bombings of Hiroshima and Nagasaki in 1945 ended U.S. involvement in the Pacific theater of World War II. In fact, the state of war with Japan persisted, in a technical sense, until September 1951, when the formal peace treaty was signed. “Making peace is like repairing the many strands of an intercontinental cable,” President Truman said at the time. “Each strand must be spliced separately and patiently, until the full flow of communication has been restored.”


Thanks to some then-new technology, more than 30 million U.S. viewers witnessed Truman compare peace making with cable mending during the very first TV broadcast aired from coast to coast. Electrical engineers watching the event might have appreciated the irony: You see, the new technique for linking far-flung TV stations had just made lengthy cables obsolete. Engineers at AT&T instead used a network of microwave transmitters to beam TV signals from point to point across the country.


This system let TV viewers all over the United States watch presidential speeches, documentaries, sporting events, and, of course, a lot of silly sitcoms; “I Love Lucy,” for example, was broadcast nationally just a month later. “Lucy” swiftly became entrenched, as did the microwave systems. Now you see the antennas everywhere: sideways-facing dishes and flat disks hanging on the sides of radio masts, water towers, and tall buildings. 


These antennas don’t carry as much television programming nowadays. But they do carry phone calls and Internet data packets, handle cellphone traffic to and from local cell towers, transmit calls for assistance to firefighters and police, help balance the electric grid, coordinate railroad trains, regulate pressure and flow in oil and natural-gas pipelines, and convey vast amounts of ordinary business data.


The industry refers to these communications channels as fixed microwave, to distinguish them from the many other wireless applications that also use frequencies in the microwave bands, including everyday cellphones and Wi-Fi networking gear. Fixed-microwave links have multiplied over the years, with engineers continually devising ways to meet increasing demands. Now, however, the tradition of letting those engineers work together to squeeze in links is under assault. More and more, government regulators in the United States and the United Kingdom have been awarding licenses for fixed-microwave communications to the highest bidder, auctioning off the spectrum as they have done for many other wireless services.


Basic physics, economics, and real-world experience all suggest this is a bad idea. Left unchallenged, it could needlessly impede a vital and successful mode of radio communications. 


To understand the problem, first consider how this form of radio works. Today’s microwave links are a by‑product of radar, first invented in the 1930s and vastly improved under the pressures of World War II. Radio transmitters before that time—mainly AM broadcast, maritime, and early attempts at mobile communications—sent out signals more or less equally in all directions. The most urgent task of wartime radar, finding the distance and direction of incoming aircraft, called for something different: a narrow beam. So the development of radar yielded, among other things, antennas capable of concentrating radio-frequency energy in one direction.


Engineers soon realized that radar-style antennas would allow for a new kind of radio communication, one that was ideal for exchanging signals between fixed locations, not over a wide area. Telephone companies, to take an early example, had to move large numbers of calls from one switching office to another. For that, point-to-point transmission using a focused beam offered important advantages.


With a larger fraction of the transmitter power aimed at the intended receiver, fewer watts cover more kilometers. A similarly directional antenna on the receiving end, pointed back at the transmitter, further magnifies the incoming signal and makes the receiver relatively insensitive to interference coming from other directions. Also, covert eavesdropping becomes difficult because it requires positioning a receiver within the relatively narrow transmitted beam.


Less significant in those early days, but of great importance as the radio spectrum became more crowded, was the ability of several point-to-point links in the same area to share the same frequency. They can easily do that, so long as different transmissions do not impinge on the same receiving antenna from the same direction. 


The early adopters of this technology were Bell Canada and AT&T, which in the 1950s built continent-spanning microwave systems to carry telephone calls and Teletype messages in addition to television programming. Frequency-division multiplexing allowed for up to 5400 telephone channels on each microwave radio channel, with as many as 10 radio channels combined on one antenna. Towers could be up to 70 kilometers apart. Those early systems used analog modulation. Now, of course, all new systems are digital, which makes them much more efficient and reliable. 


What hasn’t changed much is the shape of the antennas: Typically, they’re still parabolic dish antennas, similar to those found in World War II radar installations. Radio waves diffract to a small extent around the edges of these antennas, so some energy unavoidably radiates off the desired axis. Increasing the size of the antenna diminishes the stray radiation and improves focusing power. In this context, the relevant measure of size is not the centimeter but rather the wavelength of the radio signal involved. Good performance requires an antenna roughly 20 to 40 wavelengths in diameter. So higher frequencies (shorter wavelengths) allow an antenna of the same physical size to produce a tighter beam. 
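The size-versus-wavelength relationship can be made concrete with a common textbook approximation: a dish's half-power beamwidth is roughly 70 wavelengths divided by its diameter, in degrees. The factor of 70 and the example dish sizes below are standard illustrative figures, not numbers from this article.

```python
C = 3.0e8  # speed of light, m/s

# Half-power beamwidth of a parabolic dish, using the textbook rule of thumb
# theta ~= 70 * (wavelength / diameter), in degrees.
def beamwidth_deg(freq_hz: float, diameter_m: float, k: float = 70.0) -> float:
    wavelength_m = C / freq_hz
    return k * wavelength_m / diameter_m

# A 1.5-meter dish at 6 GHz and a dinner-plate-size 0.4-meter dish at 23 GHz
# are both roughly 30 wavelengths across, and so form about the same beam:
print(round(beamwidth_deg(6e9, 1.5), 1))   # ~2.3 degrees
print(round(beamwidth_deg(23e9, 0.4), 1))  # ~2.3 degrees
```

This is why shrinking antennas did not mean sloppier beams: measured in wavelengths, the dishes stayed the same size.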


Wartime radar used frequencies in the tens or hundreds of megahertz, very low by modern standards. That upper limit on frequency was set by the performance of 1940s-era vacuum tubes. As a result, some of the early antennas were tens of meters across, and even then they weren’t all that effective in forming beams. But improvements in vacuum tubes came quickly, followed by bigger improvements in transistors, permitting point-to-point microwave to move to ever-higher operating frequencies.


Today, commercial use extends up to 95 gigahertz, where the corresponding wavelength is only 3 millimeters. And as frequencies have increased, antennas have shrunk. Whereas a typical microwave antenna in the 1950s was the size of a one-car garage, some antennas today are no bigger than a dinner plate.


Besides tight beams and small antennas, higher frequencies also allow for wider radio bandwidths, giving greater carrying capacity, typically measured in bits per second. The older microwave bands, around 2 GHz and below, had radio bandwidths of only a few hundred kilohertz. Typical modulation schemes can convey about 5 bits of information per second per hertz of bandwidth, so the maximum data payload at those frequencies was just a few megabits per second. At the other extreme, in the 92- to 95-GHz region, links can handle tens of gigabits per second. That’s enough to supply broadband Internet service to thousands of customers. 


Higher frequencies provide narrower beams, which, all else being equal, need less clearance over obstructions, so you can get by with shorter supporting structures. The narrower beams are also less susceptible to multipath interference, which arises when the same transmitted signal reaches the receiver both directly and also by reflection, say, from the surface of a body of water, or by refraction, as it passes through atmospheric layers with different densities. The reflected or refracted copy, arriving a little later, can partially or even completely cancel out the direct signal.


But higher frequencies are suitable only for shorter distances. Whereas a 2-GHz link might extend over 100 km, links above 70 GHz are usually limited to less than 1 km. One reason is that at a given distance from the transmitting antenna, the loss in signal strength goes up with the square of the frequency, because of what’s called free-space attenuation. This isn’t really a frequency-dependent loss in the medium itself: It arises because a receiving antenna of a given gain has an effective capture area that shrinks with the square of the wavelength, so it collects less of the passing signal at higher frequencies. Typically, this effect largely cancels out the greater signal strength gained from a more tightly focused beam.
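The frequency-squared penalty is the standard free-space path-loss (Friis) relation; here is a quick sketch, assuming isotropic antennas on both ends of the link.

```python
import math

C = 3.0e8  # speed of light, m/s

# Free-space path loss between isotropic antennas, in decibels.
# The 20*log10(freq) term is the frequency-squared dependence in dB form.
def fspl_db(distance_m: float, freq_hz: float) -> float:
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# Same 10-km path: moving from 2 GHz up to 80 GHz adds 20*log10(40) of loss,
# which the higher-gain, more tightly focused antennas must then win back.
print(round(fspl_db(10e3, 80e9) - fspl_db(10e3, 2e9), 1))  # 32.0 dB
```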


Another impediment to propagation is something known as rain fade, the weakening of a signal through absorption and scattering by raindrops or ice particles in the atmosphere. This phenomenon affects frequencies above about 10 GHz and becomes progressively worse the higher you go.


Radio waves around 60 GHz suffer an even greater problem: They are absorbed by oxygen molecules, and to a 60-GHz signal, clear air will look like dense fog. But systems in this band are still used, for instance, to connect nearby buildings on a single campus. Here, the atmospheric attenuation can be considered a feature rather than a bug, because it allows the same frequency to be employed a short distance away without risking interference. 


Fixed-microwave links are among the most reliable forms of radio communications ever built. A system serving critical needs can routinely be kept running 99.9999 percent of the time, or “six nines” in industry jargon. That works out to a total cumulative downtime of only about 30 seconds per year—not bad for equipment mounted high in the air and exposed to the elements. 
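The availability arithmetic is simple to check:

```python
# Cumulative downtime per year permitted at a given availability level.
def downtime_seconds_per_year(availability: float) -> float:
    return (1.0 - availability) * 365 * 24 * 3600

print(round(downtime_seconds_per_year(0.99999), 0))   # "five nines": ~315 seconds
print(round(downtime_seconds_per_year(0.999999), 1))  # "six nines": ~31.5 seconds
```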


One key to reliability is ensuring that neighboring systems do not interfere with one another. Most countries achieve this through some form of frequency coordination. In the United States, the engineer configuring a new link, often with the help of a frequency-coordination firm, consults a comprehensive database of licensees and prior applicants. The goal is to adjust the design so that it meets certain specified criteria for keeping the system free of interference and for preventing it from interfering with other systems. The designer then sends out details of the proposed system to all other users whose operations might plausibly cause or receive interference. The acquiescence of potential victims of interference is required before the license application goes to the Federal Communications Commission. If another user objects, the engineers on both sides negotiate to resolve the problem. The role of the FCC here is merely to maintain the database, along with setting minimum performance requirements—activities it funds with fees collected from license applicants (currently US $470 per transmitter location for each 10-year license).
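The coordination check at the heart of this process can be caricatured in a few lines. Everything below—the field names, the single "same frequency and overlapping receive direction" test, the tolerance values—is a hypothetical simplification; real coordination studies use detailed propagation and antenna-pattern models.

```python
from dataclasses import dataclass

@dataclass
class Link:
    licensee: str
    freq_ghz: float     # center frequency of the channel
    azimuth_deg: float  # direction the receive antenna points

# Toy victim search: a proposed link is compared against every licensee on
# file, and the application proceeds only once no prior user would see
# interference (or all objections have been negotiated away).
def potential_victims(proposed: Link, database: list[Link],
                      freq_tol_ghz: float = 0.03,
                      angle_tol_deg: float = 5.0) -> list[str]:
    hits = []
    for existing in database:
        same_freq = abs(existing.freq_ghz - proposed.freq_ghz) < freq_tol_ghz
        same_direction = abs(existing.azimuth_deg - proposed.azimuth_deg) < angle_tol_deg
        if same_freq and same_direction:
            hits.append(existing.licensee)
    return hits

db = [Link("RailCo", 6.1, 93.0), Link("GridOp", 6.1, 247.0)]
print(potential_victims(Link("NewISP", 6.1, 95.0), db))  # ['RailCo']
```

Note that GridOp, though on the same frequency, points its antenna elsewhere and so is untouched—the frequency-reuse advantage of narrow beams in action.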


Historically, even companies in direct competition have been helpful to each other through this process (although this spirit of cooperation is starting to break down on certain high-demand routes). The criteria used to identify interfering signals are conservative, so a system that has passed frequency coordination is very likely to coexist successfully with its neighbors. 


Some services get by with no frequency coordination at all. Many Internet providers offer connectivity via directional links in the unlicensed bands near 2.4 and 5.8 GHz. These can operate over tens of kilometers. Because unlicensed users receive no interference protection, they are free to put up antennas almost anywhere they want. Even so, the directional nature and limited power of their transmissions make interference unlikely. You wouldn’t want to control a nuclear power station over these connections, but they can provide satisfactory consumer service. In practice, unlicensed links are also used temporarily for more critical applications that can’t wait for frequency coordination. 


Once a link is up and running, threats to reliability occur even in clear air due to atmospheric turbulence or strong gradients in air density, which are invisible to the eye. These clear-air fades can reduce the signal-to-noise ratio enough to cause errors in the bits received. In severe cases, they may interrupt communications entirely. 


Designers of systems for critical applications must keep their systems working even through the most severe fades. They have three main tools. One is space diversity—transmitting the same information simultaneously over multiple paths, thus improving the odds that at least one keeps working. A second is automatic power control, which has the receiver continuously report back to the transmitter on the strength of the incoming signal. If it drops below some threshold, the transmitter temporarily cranks up the power to compensate. The main disadvantage here is the increased risk of interfering with nearby systems.


The third approach is adaptive modulation. When the receiver reports a drop in signal strength, instead of powering up, the transmitter sends out fewer bits per second. Basic information theory dictates that shifting to a sufficiently low bit rate can maintain some desired (very low) error rate, despite a decrease in the signal-to-noise ratio. It’s a little like shouting to someone in a noisy environment: You instinctively speak more slowly than usual.
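The information-theory point can be made concrete with Shannon's capacity formula, C = B·log2(1 + SNR): for a fixed bandwidth, a lower signal-to-noise ratio still supports error-free communication, just at a lower bit rate. The bandwidth and SNR values below are illustrative assumptions, not figures from the article.

```python
import math

# Shannon capacity: the maximum error-free bit rate for a given bandwidth
# and signal-to-noise ratio (SNR expressed as a linear power ratio).
def shannon_bps(bandwidth_hz: float, snr_linear: float) -> float:
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 30-MHz channel in clear air (SNR 30 dB) vs. during a deep fade (SNR 10 dB):
print(round(shannon_bps(30e6, 10**3.0) / 1e6))  # 299 Mb/s
print(round(shannon_bps(30e6, 10**1.0) / 1e6))  # 104 Mb/s -- slower, but still up
```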


The use of adaptive modulation has long been accepted in Europe and Canada but not the United States, where the technique ran up against a rule that requires fixed-microwave systems to operate at a minimum bit rate, typically between 2.5 and 4.5 bits per second per hertz. The FCC has approved adaptive modulation, but to prevent the deployment of spectrally inefficient systems, it requires transmitters to maintain the specified minimum at least 99.95 percent of the time. This limits the slower modulations to a little over 4 hours per year. That’s enough, though, to keep most links in continuous operation.
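That "a little over 4 hours" follows directly from the 99.95 percent requirement:

```python
# Hours per year a link may fall below the FCC's minimum bit rate,
# given the fraction of time the minimum must be maintained.
def hours_below_minimum(required_fraction: float) -> float:
    return (1.0 - required_fraction) * 365 * 24

print(round(hours_below_minimum(0.9995), 2))  # 4.38 hours per year
```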


The need for fixed-microwave communications is skyrocketing. That’s in no small part because of the increasing numbers of people spending time online and the spread of bandwidth-hungry applications, particularly video. The popularity of smartphones and tablets, whose owners expect broadband speeds no matter where they are, adds to the demand. Their mobile traffic not only requires spectrum for connecting their devices with cell towers but also expands the need for backhaul—the process of moving all the users’ data between towers and the carrier’s network. The microwave dishes that adorn many cell towers provide those links. Expect to see more of them in coming years.


Like most other radio services, fixed microwave has trouble finding enough suitable spectrum. The relatively low frequencies needed for longer links have been in especially short supply since the mid-1990s, when many countries reallocated radio spectrum around 2 GHz, shifting it from fixed microwave to second-generation cellphones.


In many places, fixed-microwave users must share some of their frequency bands with satellite uplinks and downlinks. Frequency coordination in those bands must take account of the satellite facilities. Sharing is usually manageable in bands allocated for satellite uplinks, which threaten microwave receivers with interference only in the vicinity of the satellite earth stations. The downlinks can be more troublesome. In the United States, the proliferation of backyard “receive-only” earth stations in a shared 4-GHz band is especially problematic. In this band, which is often used for TV distribution, each satellite receiver is entitled to interference protection, making it nearly impossible to coordinate new fixed-microwave links. And the practice of mounting satellite earth stations on moving ships has made band sharing all the more difficult, because these ship-borne systems can interfere with microwave towers when they operate near the coasts or on inland waterways.


When fixed-microwave interests go looking for higher frequencies to occupy, they must squeeze in around satellite applications, space research, navigation systems, radar, aeronautical applications, radio astronomy, and more. And some regulatory bodies have unintentionally exacerbated the spectrum shortage.


In the United States, the problem is this: In 1998, rather than expand the traditional system of having engineers coordinate things before a new license is even applied for, the FCC began auctioning off fixed-microwave licenses, at 28 and 31 GHz, and later at 39 and 24 GHz. Some winners of the early auctions paid tens of millions of dollars for their licenses. You might think they’d make good use of them, but it hasn’t turned out that way. Many auction winners haven’t constructed enough links to meet even their minimum requirements for license renewal. As a result, the FCC has since taken back many hundreds of these licenses, leaving the valuable radio spectrum they cover unavailable to anyone in those areas. Not surprisingly, the later auctions of fixed-microwave spectrum brought in far less cash from bidders.


Regulators in the U.K. have tried fixed-microwave auctions as well, with no better results. The first auctions earned limited cash and saw little construction of facilities. Later auctions attracted even less interest from bidders and resulted in even less construction.


Why these auctions haven’t produced better results is complicated. Part of the answer may be that the geographic areas being licensed, typically drawn around population centers, do not conform to any one company’s actual needs.


A fixed-microwave link is typically just one small part of a complex network. Often that network is used to support other kinds of infrastructure—railroads, electrical grids, oil pipelines—for which the microwave links parallel the underlying assets. Networks that support other kinds of commercial operations tend to connect population centers with one another and sometimes with outlying branches of a business. When used for cellular backhaul, the microwave network will reflect the sometimes idiosyncratic layout of the carrier’s facilities. And public-safety backhaul networks, used to relay emergency calls to local police and firefighters, conform to the boundaries of the local jurisdictions they serve.


These users all need highly customized configurations, not the arbitrary areas governments have used in auctioning licenses. But if regulators were to redraw the areas to align with specific needs, each license would likely attract only a single bidder, undercutting the rationale for auctions in the first place. It’s a no-win situation.


So, you might be wondering, why can’t one company secure the license for a region and then simply resell or lease out the appropriate microwave configurations to the organizations within the region that need them? Indeed, this model has been attempted, most notably by a San Francisco–based firm, FiberTower Corp. It won fixed-microwave licenses at 24 and 39 GHz and then sold backhaul services to Sprint and to a local county for 911 emergency calls, among others. But last year the FCC took back several hundred of FiberTower’s licenses, soon after which the company filed for bankruptcy. Its clients, already skittish about not controlling their own facilities, scrambled to maintain their service. Carriers will probably now think twice before trusting their backhaul to a third-party provider. 


In short, auctioning point-to-point microwave licenses just doesn’t make much sense—except perhaps for a few very competitive corridors. Otherwise, it’s better to let engineers coordinate these point-to-point operations, a system that has used the radio spectrum very efficiently ever since the radar engineers of World War II began turning their dishes into extremely reliable cables of air. 


About the Author

A former electrical engineer, Mitchell Lazarus is now a partner in the Washington, D.C., law firm Fletcher, Heald & Hildreth. He helps the Fixed Wireless Communications Coalition and others navigate Federal Communications Commission rules.
