Monday, August 31, 2015

FCC approves robotic lawn mower, rejects worst case analysis

On August 12, 2015 the FCC granted a waiver (pdf) of some Part 15 rules to allow iRobot to market a robotic lawn mower transmitting in the 6240-6740 MHz range (Order in Proceeding 15-30).

The National Radio Astronomy Observatory (NRAO) had expressed concern that the waiver could lead to interference to radio astronomy operations.

The Office of Engineering and Technology noted in its waiver grant that "because the NRAO analysis looked at line-of-sight separation distances, it has greatly overestimated the interference potential of transmitters that are located less than two feet above the ground."

It added, "We find that when taking into account the variability in propagation characteristics due to terrain, low antenna heights and other propagation factors, grant of this waiver is very unlikely to increase the potential for harmful interference."

The glass-half-full reading is that the FCC rejected a worst-case analysis; the glass-half-empty view is that it missed an opportunity to do a probabilistic risk analysis that quantified expressions like "greatly overestimated the interference potential" and "very unlikely to increase the potential for harmful interference."

The cynic's take is that this is to be expected; providing hard numbers would expose the Commission to having its reasoning questioned during subsequent litigation.

Worst case in interference analysis for medical interference

In its second order on reconsideration regarding the incentive auction, released on June 19, 2015 (docket 12-268, pdf), the FCC noted that its analysis of interference into wireless medical telemetry systems "is a worst case analysis and in most installations one or more of the parameters we assumed here will provide additional protection" (recon order at para. 119).

Even this wasn't good enough for GE Healthcare, which filed a petition on July 28, 2015 asking the FCC to reconsider its reconsideration, saying "Due to the severe and wide-ranging negative consequences of interference to Channel 37 WMTS, the Commission's expressed intent to use a worst-case (i.e. minimum coupling loss) analysis in evaluating separation between Channel 37 WMTS and 600 MHz band mobile base stations is appropriate, but its adopted separation rules are not, in fact, based on a worst-case analysis, as the Commission appears to believe."
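
For readers who haven't met the term: a minimum coupling loss (MCL) analysis sets every parameter to its most pessimistic value – maximum power, antennas coupled with no excess loss, free-space propagation – and solves for the separation at which the interferer just meets the victim's protection threshold. A minimal sketch with made-up numbers, not the actual WMTS/600 MHz parameters in the docket:

```python
import math

def mcl_separation_km(p_tx_dbm, g_tx_dbi, g_rx_dbi, i_max_dbm, f_hz):
    """Solve free-space path loss for the distance at which the
    interferer just meets the victim's protection threshold,
    with every other loss mechanism set to zero (worst case)."""
    required_loss_db = p_tx_dbm + g_tx_dbi + g_rx_dbi - i_max_dbm
    c = 3e8
    d_m = 10 ** (required_loss_db / 20) * c / (4 * math.pi * f_hz)
    return d_m / 1e3

# Made-up numbers for illustration only:
d = mcl_separation_km(p_tx_dbm=46,     # interferer EIRP
                      g_tx_dbi=0, g_rx_dbi=0,
                      i_max_dbm=-100,  # assumed victim protection level
                      f_hz=608e6)      # Channel 37
print(f"worst-case separation: {d:.0f} km")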

Stacking every pessimistic assumption at once yields separation distances far beyond anything free-space propagation could sustain at these frequencies, which hints at the deeper problem. The trouble with worst case is that there is no worst case: one can always imagine something worse. It's not a sufficiently stable concept to be usable, which leads to oxymorons like the “realistic worst-case” GE Healthcare refers to in its petition.

There’s even a term of art, RWCS, the Reasonable Worst Case Scenario, which has an official definition in the UK: a scenario “designed to exclude theoretically possible scenarios which have so little probability of occurring that planning for them would be likely to lead to disproportionate use of resources” (House of Commons Select Committee Report). (There's also the term “reasonably foreseeable worst case use scenarios,” used in passing in IEC 60601.)

This is related to the unbounded maximum of a distribution. It’s well known among statisticians, though apparently not among many spectrum folk, that the longer you sample a parameter with an unbounded distribution (e.g. a propagation path loss with a log-normal fading distribution), the larger the maximum you will observe.
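
A minimal simulation makes the point; I've assumed an 8 dB shadow-fading standard deviation, a common textbook value:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_db = 8.0  # illustrative log-normal shadow-fading std dev, in dB

# Shadow fading in dB is Gaussian; its maximum grows roughly as
# sigma * sqrt(2 ln n), without bound, as the sample lengthens.
for n in (100, 10_000, 1_000_000):
    fades_db = rng.normal(0.0, sigma_db, size=n)
    print(f"n = {n:>9,}: deepest observed fade = {fades_db.max():5.1f} dB")
```

Keep sampling and the deepest fade you've seen keeps creeping upward; there is no “worst” value to design against.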

Friday, July 31, 2015

Q&A: Risk-assessment, harm claim thresholds and adjudication

In my testimony before the Senate Commerce Committee on Wednesday July 29, 2015 I recommended three spectrum management reforms. A summary and links to the written testimony and video are in an earlier blog post. This post offers some Q&A.

The three reforms were: (1) moving away from worst case interference analysis and using risk-informed methods that consider not only the consequences but also the likelihood of harmful interference; (2) providing more clarity about operators’ interference rights and obligations by specifying harm claim thresholds; and (3) giving any spectrum user the option of taking action directly against any other, either in front of an FCC judge or in a federal Court of Spectrum Claims.

Wednesday, July 29, 2015

Senate Testimony: Risk-assessment, harm claim thresholds and adjudication

I testified today before the Senate Commerce Committee hearing on “Wireless Broadband and the Future of Spectrum Policy.” My written testimony is here; this is the summary I presented during the hearing. I’ve posted some Q&A in a subsequent post. My remarks are recorded in the archived webcast, starting at 58:02; see also a question from Chairman Thune and my reply starting at 2:05:43.

Thursday, February 12, 2015

Risk-informed interference assessment

I've spent the last year or so thinking about ways to complement worst-case interference assessment with a more comprehensive approach that considers many potential interference hazards, not just a possibly implausible nightmare scenario. I have concluded that quantitative risk analysis, used in many regulated industries, is a suitable tool.

Sunday, December 28, 2014

Six weeks of spectrum auction tweets

I created an animated GIF to show how Twitter traffic about spectrum auctions changed over the first six weeks of the AWS-3 auction, i.e. November 15 to December 27.

Saturday, September 06, 2014

Beyond the main stage: Teasing apart twitter traffic about net neutrality

For this installment of the NodeXL Gallery Glimpse, I'm teasing apart the members of the social graph around the net neutrality issue.

Sunday, August 31, 2014

5G on Twitter: NodeXL social network analysis

A NodeXL SNApshot is a great way to catch up with who's saying what about a topic on Twitter. This post discusses the SNApshot that graphs the 1,963 tweets containing the hashtag #5G posted over the period 17 Jul - 29 Aug 2014.

Here's the Gallery Glimpse video:

Wednesday, June 04, 2014

Adjudication versus Enforcement

Mike Marcus (web site) has suggested that enforcement problems can be divided into two categories:
#1. Cases where behavior explicitly violates existing rules, e.g. use of the wrong frequency, or equipment that doesn't comply with rules.
#2. Unanticipated interactions between systems that lead to service degradation but either do not self-evidently violate any rules or raise complex legal questions about whether there is a violation.
Mike suggests that the second category includes "cellular booster" interference to cellular systems, police radar detector "fuzzbuster" interference to VSATs, the Nextel/public safety intermod problem in 800 MHz, and impairment of 700 MHz cellular due to FM transmitter harmonics (discussed on Mike’s blog).
The spectrum community informally refers to both categories as enforcement problems, but the second is really a question of adjudication – a distinction that highlights the FCC’s rudimentary judicial function: it has more than 250 people in the Enforcement Bureau (2014 Budget) but only one (!) administrative law judge.

It seems to me that (1) being clear about the enforcement/adjudication distinction and (2) actually having an adjudication function separate from both rule making (the legislative function) and enforcement (the executive function) would not only help us think more clearly about spectrum problems but would also lead to quicker resolution, to everyone's benefit.


As an administrative agency (caveat: IANAL) the FCC combines the three branches of government under one roof: it makes rules (legislative), decides whether they have been broken (judicial), and detects alleged violations and punishes them if violations are found (executive).

Mike’s Category #1 (explicit violations of existing rules) is enforcement, defined by the OED as “the act of compelling observance of or compliance with a law, rule, or obligation”: it presupposes that adjudication has already taken place. The examples in Category #2 (unanticipated interactions) are actually questions of adjudication, i.e. “A formal judgment on a disputed matter” per the OED: they're difficult precisely because it's not clear whether there's been a violation, or by whom.

The FCC is very loosey-goosey on this distinction, as has been pointed out over the years; see e.g. Ellen Goodman’s 2004 Telecosm paper, Phil Weiser’s 2009 FCC Reform paper and our recent Hamilton Project paper.

Distinguishing clearly between these two categories could also address a blind spot about the need for enforcement in the Dynamic Spectrum Access (DSA) community. If enforcement is addressed at all by advocates of Spectrum Access Systems (SAS), it’s usually waved away with assurances that the rules in the database will solve all problems. (Jerry Park’s presentation at the January 2014 FCC 3.5 GHz SAS workshop is an exception, but even he focuses on attacks on the database rather than on how to decide disputes.)

Mike's distinction made me realize that the DSA/SAS community probably equates enforcement with Category #1. It's then plausible to believe that a system that prevents explicit rules violations solves, or more accurately obviates, "enforcement problems." However, the arcane interactions between radio systems in the wild, and the difficulty of assigning responsibility for them, make it important to highlight the Category #2 problems: these unintended issues are not only more likely than rule-set failures to cause problems – and to cause them unexpectedly – but by their nature they will require judgment (in both the legal sense and the sense of assessing hard-to-compute complexities) to resolve.

Sunday, March 02, 2014

RF Mirror Worlds: Supercomputing meets propagation models, 3D terrain data and ubiquitous sensors

Petri Mähönen has observed that wireless researchers haven’t exploited supercomputing as much as one might expect, especially in comparison with other scientific disciplines such as aerospace, meteorology, oceanography, biology, sociology... If they had, we could be exploring thousands or millions of “Test Cities” rather than just the one contemplated in the PCAST Report (pdf, Chapter 6 and Appendix G). The PCAST budget for the first three years of a Test City (Table G.1) runs to $21 million in operating expenses and $15 million in capital expenses – that would buy a lot of computation!

I suspect (hope!) we’re on the verge of a step change in using software simulation, aka “mirror worlds”, to understand and manage radio systems. The underlying technology has been on the exponential growth curve we’ve all heard about, but hasn’t broken through to high profile visibility. It may soon.

Saturday, February 22, 2014

DoD treats Spectrum as Territory

The U.S. Department of Defense released a spectrum strategy document on Thursday (press release, pdf). I’ll leave discerning what (if anything) is actually new in it to the Pentagon watchers.

I was struck by the implications of the language used: the DoD conceives of spectrum as a place. Given that military success often seems to be framed as controlling or denying territory, this is not an auspicious starting point for spectrum sharing – which is about wireless system coexistence in many intangible dimensions, rather than all-or-nothing control of territory.

Wednesday, October 16, 2013

Unlicensed’s success: physics, not regulation?

Unlicensed allocations have generated a massive, and to many surprising, amount of innovation and value (see the References below). The question is: Why?

Almost all of the value so far has come in the 2.4 GHz ISM band, mostly due to Wi-Fi but also, to a lesser extent, to Bluetooth applications. There is never a single, simple answer to a Why question about a complicated nexus of technology, politics and user behavior, but my impression is that unlicensed partisans believe that it's due pretty much exclusively to the techno-economic characteristics enabled by the rights assignment regime: “openness” (Benkler), “managed commons” (Milgrom, Levin & Eilat), or “rule-based access” (Thanki).

I think it's at least plausible that Wi-Fi's undoubted success has been due to a fortuitous coincidence of band choice, physics and timing as much as to regulation: It turned out that the interference range was small enough that users didn’t really degrade each other’s performance; and the networking needs of their applications could be met by the bandwidth available around them. In other words: the capacity of the channel was larger than the number of people who interfered with each other, multiplied by the data they wanted to move.
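
A back-of-the-envelope sketch of that inequality, with purely illustrative numbers of my own choosing:

```python
# All numbers are illustrative assumptions, not measurements.
usable_capacity_mbps = 20.0   # effective throughput of one Wi-Fi channel
users_in_range = 10           # devices close enough to contend
demand_per_user_mbps = 1.0    # offered load per user

aggregate_demand_mbps = users_in_range * demand_per_user_mbps
congested = aggregate_demand_mbps > usable_capacity_mbps
print(f"demand {aggregate_demand_mbps:.0f} Mb/s vs capacity "
      f"{usable_capacity_mbps:.0f} Mb/s -> "
      f"{'congested' if congested else 'fine'}")
```

On these assumptions the channel has headroom; congestion appears only when the product of contending users and per-user demand overtakes usable capacity.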

Wednesday, October 09, 2013

The Emperor has Objections: Replies to feedback on our “Is Wi-Fi Congested?” paper

Our TPRC 2013 paper “The Emperor has no Problem: Is Wi-Fi Spectrum Really Congested?” has generated quite a bit of interest. Here are responses to some pointed questions and comments we've received.

Monday, March 18, 2013

Counting Spectrum in an Age of Sharing

Mike Marcus’s recent blog post Dueling Spectrum Charts - Part 2 is a nice reminder (not that anyone who reads his blog needs it) that spectrum isn’t just MHz. The current focus on “spectrum sharing” underlines the fact that one has to think of space and time as well as frequency. A more accurate (but also much geekier) metric would weight MHz by the percentage of population covered and the percentage of time allocated.
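
As a toy illustration – and this is my gloss on the idea, with made-up numbers – one could discount raw MHz by the fraction of population and time for which a band is actually available:

```python
def effective_mhz(bandwidth_mhz, pop_fraction, time_fraction):
    """Raw MHz discounted by the fraction of population and time
    for which the band is actually available (my gloss, not an
    established metric)."""
    return bandwidth_mhz * pop_fraction * time_fraction

print(effective_mhz(100, 1.0, 1.0))  # 100.0: available everywhere, always
print(effective_mhz(100, 0.5, 0.2))  # 10.0: half the people, a fifth of the time
```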

Thursday, March 14, 2013

Using an auction to decide the number of 3.5 GHz spectrum access administrators

The FCC faces a choice of whether to authorize one database administrator or many to run the spectrum access system (SAS) that will manage small cell operation in the 3.5 GHz band. This resembles the choice between an exclusive-use licensing regime and an unlicensed one. The FCC could let the market decide by auction, using a simplified version of the 2008 Bykowsky, Olson and Sharkey proposal.