Spectrum is a natural resource, and like any natural resource it must be managed to avoid overuse. Yet there is no consensus on how to do so. Academics have debated spectrum management policy for decades, combining economics, law, and engineering to search for models that distribute spectrum in the most equitable and efficient way. Learning just a little about these theories helps us understand why the rules we have are the way they are. Even better, it helps us make our case when we think the rules are wrong.
Spectrum management is fascinating because it must be revisited in the wake of technologies that redraw the boundaries of the resource’s scarcity.
In 1912, only 127 U.S. wireless shore stations had great difficulty avoiding interference from a handful of Navy ships, amateurs, and each other.
In 2014, billions of wireless devices of all shapes and sizes operate across the United States, and there still isn’t enough spectrum to go around!
There are four schools of thought on spectrum management: open access, command-and-control, market-based, and commons. Let’s take a look.
Open access is spectrum that is entirely unregulated. Anyone can operate any device at any power without prior authorization. This was the state of the spectrum before the Radio Act of 1912, and it was, as we’ve said before, a wild frontier.
In 1910, the Navy sent a letter to the Senate Committee on Commerce, pleading for government intervention.
"[each station]… considers itself independent and claims the right to send forth its electric waves through the ether at any time that it may desire, with the result that there exists in many places a state of chaos… Calls of distress from vessels in peril on the sea go unheeded or are drowned out in the etheric bedlam produced by numerous stations all trying to communicate at once. Mischievous and irresponsible operators seem to take great delight in impersonating other stations and in sending out false calls. It is not putting the case too strongly to state that the situation is intolerable, and is continually growing worse."
There are some proponents who believe that modern digital technologies can circumvent the “etheric bedlam” of open access spectrum. But the prevailing wisdom, as espoused by Jerry Brito, is that at least some form of regulation on devices and their capabilities is required to create a stable wireless ecosystem that does not devolve into chaos.
Command-and-control is the oldest regulation model. Under it, a single institution dictates who may use spectrum, how much they can use, and in what way. In the United States, one branch of the federal government commands and controls spectrum, and I think you know who it is: the Federal Communications Commission. On some frequencies, and historically all frequencies, the FCC accepts applications from potential users and then doles out licenses to the most qualified.
If command-and-control sounds antiquated, that’s because it is. For military and government applications, command-and-control still makes sense. But for civilian applications, it opens many opportunities for corruption, over- or under-representation of groups and industries, and inefficiency. In today’s dynamic and swiftly changing airwaves, the government can rarely maneuver fast enough to keep up with shifting demand. And technology uses spectrum in ways that are far too complex for any human bureaucracy to keep track of, as is the case with mesh networks.
In 1959, future Nobel laureate Ronald Coase advocated releasing spectrum as a form of property, sold to the highest bidder according to market demand. His paper was a response to the inefficiencies and seemingly arbitrary methods the FCC used to dispense spectrum licenses, the monopolization of spectrum by broadcasters in major markets at the expense of other industries and, perhaps, the uneasy notion of complete government control at the tail end of the Red Scare. “There is nothing in the technology of the broadcasting industry which prevents the use of the same mechanism,” Coase said, in response to the government’s view that electromagnetic waves could not be sold at auction because of their unique scarcity.
“Indeed, use of the pricing system is made particularly easy by a circumstance... namely, that the broadcasting industry uses but a small proportion of 'spectrum space.' A broadcasting industry, forced to bid for frequencies, could draw them away from other industries by raising the price it was willing to pay. It is impossible to say whether the result of introducing the pricing system would be that the broadcasting industry would obtain more frequencies than are allocated to it by the Federal Communications Commission. Not having had, in the past, a market for frequencies, we do not know what these various industries would pay for them.”
But now we do: more than a trillion dollars. Coase could not have fathomed the cellular and mobile broadband industries that have sprung up since his paper. Nor could he have fathomed the collusion of the public and private sectors that is the revolving door of the FCC operating under the pretense of market policies.
On that sour note, the last three decades have shown that market-based spectrum policy is vulnerable to three scenarios.
The first is when incumbents using one technology who purchased spectrum in the past are no longer the optimum users of their frequencies. Obsolete incumbents are partly to blame for the details of the upcoming incentive auctions, a joint effort between the FCC and telecom companies to pry spectrum from the hands of cantankerous over-the-air broadcasters and repurpose it for mobile broadband. Their crowbar of choice is money. Lots of it. Meanwhile, little to none of the billions of dollars generated by the auction flows to public coffers.
The second is when the private self-interests of a powerful elite, or political dues owed by politicians, interfere with transparent, public auctions that transfer spectrum to the most ideal user. This was the case in the 1980s, when billions of dollars’ worth of spectrum was put up for a “public” lottery. The application process was so overgrown with technobabble and gotchas that the only way to be approved was to hire expensive legal counsel or have a friend on the inside, looking over your shoulder.
The third is monopolization: when a single party, or a small group of parties, buys up as much spectrum as possible to stifle competition. This is at the center of the battle between big telecom (which owns a great deal of prime beachfront spectrum) and Silicon Valley (which must use telecom’s spectrum to deliver its services).
Borrowing a word from Garrett Hardin’s famous theory of self-interest and resource scarcity, the commons theory seeks to create a spectrum space that anyone can use, as long as they follow certain rules. The rules can be set by a government, a private party, or even by the users themselves, but there must be rules of some sort to prevent interference.
It’s likely you are using a spectrum commons to access this blog post. WiFi is a digital standard that uses the 2.4 GHz ISM frequencies for wireless broadband. The 2.4 GHz band, along with 5.8 GHz and a few other bands, was considered “garbage” by the FCC and other industries because of its crowding, and was deemed an “unlicensed” band. Engineers at IEEE then seized the opportunity of unlicensed use to create a spectrum commons on top of 2.4 GHz to enable short-range wireless data communications.
Billions and billions of dollars worth of (relatively) interference-free WiFi equipment later, there is still no single owner of 2.4 GHz. Rather, anyone may use a device that has been approved by the FCC for use in 2.4 GHz. A panel of engineers dutifully ensures the 802.11 standard for digital information exchange over WiFi is upheld and updated, as necessary. The approval is based on transmission power, channel spacing, interference avoidance techniques, and other things to make sure everyone who wants to use a WiFi device can do so without encountering interference from other WiFi devices.
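To make the channel spacing part of this concrete, here is a small Python sketch (my own illustration, not part of any standard or FCC tooling) of the 2.4 GHz WiFi channel plan: channels 1 through 13 sit 5 MHz apart starting at 2412 MHz, while each 802.11b signal is roughly 22 MHz wide, which is why the classic non-overlapping trio is channels 1, 6, and 11.

```python
# Sketch of the 2.4 GHz WiFi channel plan (channels 1-13; channel 14 is
# a special case in Japan and is omitted here).
CHANNEL_WIDTH_MHZ = 22  # approximate width of an 802.11b DSSS signal

def center_freq_mhz(channel: int) -> int:
    """Center frequency of a 2.4 GHz WiFi channel, in MHz."""
    if not 1 <= channel <= 13:
        raise ValueError("channel must be between 1 and 13")
    # Channel 1 is centered at 2412 MHz; each channel is 5 MHz higher.
    return 2412 + 5 * (channel - 1)

def overlaps(a: int, b: int) -> bool:
    """True if two channels' ~22 MHz-wide signals overlap in frequency."""
    return abs(center_freq_mhz(a) - center_freq_mhz(b)) < CHANNEL_WIDTH_MHZ

print(center_freq_mhz(1), center_freq_mhz(6), center_freq_mhz(11))
# -> 2412 2437 2462: each pair is 25 MHz apart, wider than 22 MHz,
# so channels 1, 6, and 11 do not overlap -- but 1 and 2 do:
print(overlaps(1, 6), overlaps(6, 11), overlaps(1, 2))
# -> False False True
```

Five channels of separation buys 25 MHz of spacing, just clearing the 22 MHz signal width; anything closer, and neighboring access points step on each other.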
Leading image by Brandon Giesbrecht.