Our TPRC 2013 paper “The Emperor has no Problem: Is Wi-Fi Spectrum Really Congested?” (http://ssrn.com/abstract=2241609) has generated quite a bit of interest. Here are responses to some pointed questions and comments we've received.
Many thanks to Rob Alderfer, Richard Bennett, Brett Glass, Tren Griffin, and Chuck Jackson for their engagement with (though not necessarily endorsement of!) our work. Thanks to my co-authors for their help in writing this post: Andreas Achtzehn, Petri Mähönen, Marina Petrova and Ljiljana Simic.
(Citations for the papers mentioned are given at the end.)
OBJECTION: This is utter speculation; no data was gathered or research done.
True, we don’t report major new field measurements (though we mention work in progress; stay tuned). Instead, we analyzed the relevant engineering literature and concluded that (a) there are various engineering measures of congestion, none of them unequivocal; (b) there was no peer-reviewed evidence of widespread congestion. This led us to propose a set of criteria that could be used to judge whether “congestion” had reached a level that justified regulatory intervention.
OBJECTION: Have you actually talked to any Wi-Fi engineers in the field? They see congestion every day, but they don’t have time to write research papers.
One would expect wireless engineers deploying systems to work at the edge of the envelope, i.e. to deploy as few access points and other infrastructure as possible and still meet customer demand. To do otherwise would not be prudent; if they over-provision their network by purchasing more equipment than is necessary, they are wasting money. They will therefore necessarily be operating their networks at the hairy edge where service degradation becomes a problem, and will often see congestion, or at least worry about it constantly. If they didn’t, they wouldn’t be doing their job.
(As it happens, wireless congestion wasn’t a major concern for the engineers running the University of Aachen network.)
OBJECTION: You need to come to my apartment…
There are no doubt some apartments where wireless connectivity is problematic. However, they can usually be fixed by adding an access point where coverage is weak, using the 5 GHz rather than the 2.4 GHz band, using power line connectivity, or even pulling Ethernet cable. Sure, this costs a bit of money – but as long as only a few users (or many affluent users in special circumstances, e.g. early adopters living in Silicon Valley apartments) have these problems, that’s the socially optimal solution.
And in fact, poor network performance is not necessarily due to spectrum constraints. It can also be due to poor engineering by a service provider, including under-provisioning the link from local switching centers to access points or even from DSLAMs to the Internet.
OBJECTION: Who are you lobbying for?
Nobody; this work has not been directly or indirectly funded by any interest group.
OBJECTION: Measuring packet QoS is simple. People probably don't do it because everyone has encountered Wi-Fi that is highly congested/unusable. So they probably think: that's like research that proves teenage boys are restless.
Measuring network Quality of Service (QoS) metrics is relatively straightforward, although it’s often hard to interpret what the results say about supply/demand imbalances (see Section 4, and Section 5.1 in the paper).
It’s even harder to establish a correspondence between QoS and Quality of Experience; according to MASS Consultants, “… it was not then possible to find any correlations in the experimental data that were acceptable. Correlations were unsuccessfully investigated between the [mean opinion score, a subjective measure of user experience] and different [QoS] statistics describing the frame rate, retry rate, retry ratio and bytes/second.”
Just because everyone thinks something is true doesn’t make it so. For example, a survey by the Pew Research Center in March 2013 found that 56% of Americans believe the number of crimes involving a gun is higher than it was 20 years ago; only 12% say it is lower. In fact, national rates of gun homicide and other violent gun crimes are strikingly lower now than during their peak in the mid-1990s; compared with 1993, the peak of U.S. gun homicides, the firearm homicide rate was 49% lower in 2010, and there were fewer deaths, even though the nation’s population grew.
OBJECTION: Why do you fixate on throughput? Poor latency is also problematic! Picking one metric to decide whether a network is "filled" seems wrong, since latency, packet loss, jitter, etc. all matter.
Our paper analyzes the existing literature, and most papers have focused on throughput. (Sicker et al. is the exception that proves the rule.) Different papers have used different metrics: Jardosh et al. and Raghavendra et al. used link utilization; MASS Consultants used frame rate, retry rate and mean opinion scores; etc.
We agree that it’s not just a throughput game. Latency matters a lot for certain applications like VoIP and twitch games, as does jitter for e.g. small-buffer video streaming; still, throughput matters for other apps like video streaming and bulk file downloads. We turned to economic utility as a way to think about composing all these goods (see Section 5.2 in the paper), but so far we haven’t been able to figure out how to use that literature to combine/weight all these factors.
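A toy illustration (our invention, not from the paper) of why no single metric can decide whether a network is “filled”: application-specific utility functions can rank the same two channel states oppositely. The thresholds below (a 150 ms latency knee for VoIP, 50 Mb/s throughput saturation for bulk transfer) are illustrative assumptions, not measured values.

```python
# Two hypothetical utility functions, each mapping a channel state
# (throughput in Mb/s, latency in ms) to a score in [0, 1].

def voip_utility(throughput_mbps, latency_ms):
    # VoIP needs little bandwidth but degrades sharply past ~150 ms latency.
    if latency_ms <= 150:
        return 1.0
    return max(0.0, 1.0 - (latency_ms - 150) / 150)

def bulk_utility(throughput_mbps, latency_ms):
    # Bulk downloads care almost only about throughput.
    return min(1.0, throughput_mbps / 50.0)

state_a = (5.0, 50.0)     # little throughput headroom, low latency
state_b = (45.0, 300.0)   # plenty of throughput, badly delayed

for name, (tput, lat) in [("A", state_a), ("B", state_b)]:
    print(name, voip_utility(tput, lat), bulk_utility(tput, lat))
# State A looks fine for VoIP and "congested" for bulk transfer;
# state B is the reverse.
```

Any single threshold on throughput (or latency) would call one of these states congested and the other not – and be wrong for half the users.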
In the end it’s a cost-benefit trade-off. More unlicensed spectrum may well be a good deal for society, but it means less capacity for some other (existing) use. A case for reallocation to unlicensed (or licensed cellular, for that matter) should be based on a demonstration of increased social welfare, rather than anecdotal claims of congestion.
OBJECTION: What's a "significant" increase? What's a "valuable" task? Your criteria are meaningless.
Our congestion criteria are intentionally qualitative rather than quantitative. As we note in the paper, deciding whether degradation is significant or persistent, what “best” means, etc. is in the final analysis a matter for policy judgment of the evidence that has been presented. It would be implausible and presumptuous to try to quantify such (necessarily) complex judgments before the fact. Regulatory decisions, like legal judgments, are seldom hard and fast. The relevant adjudicators have to take all the factors into consideration, and decide whether the increase that a plaintiff claims is significant is indeed so.
OBJECTION: Your fifth congestion claim criterion (that problems need to be observed in spite of users’ willingness to pay for the best available service level) ignores the question of whether paying a premium is economically optimal.
This is a good point. A congestion claim would be even more compelling if it demonstrated that a re-allocation of spectrum to unlicensed would yield a net social benefit, i.e. the costs to the incumbents of losing spectrum would be less than the gain to unlicensed users.
OBJECTION: The claim that Wi-Fi congestion happens due to lack of investment is contradicted by the massive investment that various companies have made in Wi-Fi.
We’re not saying that Wi-Fi congestion happens due to lack of investment; on the contrary, we’re suggesting that claims of Wi-Fi “congestion” are not persuasive unless it is shown that widespread degradation is occurring in spite of investment. The evidence we’ve seen suggests that when the investment is made, e.g. in sport stadiums and conference centers, degradation problems recede. The only published data on a corporate campus (Raghavendra et al.) shows scant evidence of congestion according to their chosen metric.
But can one find a network load pattern that’ll break any imaginable infrastructure investment? Of course. It’s a cost-benefit trade-off. You can’t both eat ice cream and not get fat.
OBJECTION: Why should unlicensed advocates have to prove that 2.4 GHz capacity is insufficient?
If someone is asking the government for something, I believe they bear the burden of proving that it’s necessary. So if someone is using claims of insufficient 2.4 GHz capacity as the basis for asking for more unlicensed spectrum, they bear the burden of proof.
OBJECTION: The radio spectrum exists. Unlicensed and licensed should have the same burden of proof. Not allocating spectrum is not optimal.
Although it may not currently be put to its highest use, all spectrum is already notionally allocated – and reallocation costs money. We have tried to develop criteria for judging whether claims that spectrum is under-utilized are well-founded.
Indeed, there should be the same bar for everyone; and many if not all the arguments in our paper apply to claims of spectrum crisis by licensed interests, and can be applied to legacy “command and control” allocations too.
OBJECTION: If you had to summarize the most important conclusion from this work in a single sentence, what would that sentence be?
Claims of Wi-Fi congestion are difficult to make because observations are local, and congestion metrics tell one little, if anything, about whether and why users are suffering service degradation, or what regulators should do about it.
OBJECTION: What is the best single argument that would undercut your analysis?
The absence (or presence) of congestion is irrelevant to the question of whether more unlicensed spectrum should be allocated.
OBJECTION: What single piece of additional information would be most useful to better understand the problem that you have analyzed?
Easy access to data on the state of transmission buffers in access points and clients. As a bonus, it would be good to know how many frames in the transmission buffer are being resent. Since transmission buffer overflow is a clear indication that one is losing packets for some reason (collision, transmission errors, interference, …), this is the best place to observe “congestion” or “packet loss” statistics.
Hardware vendors have good hooks to these data in access points and clients, but the APIs to access this information are not public. If a manufacturer or large hotspot operator were willing to share even a subset of this data it would be heaven for academic researchers.
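If such data were available, the screening itself would be simple. Here is a sketch under an entirely invented record format (there is, as noted, no public API for this): per-interval samples of queue occupancy, queue capacity, resent-frame count, and total frames sent.

```python
# Hypothetical congestion screening from transmit-buffer samples.
# Each sample is (queued, capacity, resent, total_sent) for one
# measurement interval; the format is our assumption, not a vendor API.

def screen(samples):
    """Summarize overflow and resend behavior across intervals."""
    # An interval where the queue hit capacity means frames were
    # (or were about to be) dropped at this node.
    overflows = sum(1 for queued, cap, _, _ in samples if queued >= cap)
    sent = sum(total for _, _, _, total in samples)
    resent = sum(r for _, _, r, _ in samples)
    return {
        "overflow_fraction": overflows / len(samples),
        "resend_fraction": resent / sent if sent else 0.0,
    }

obs = [(12, 64, 3, 180), (64, 64, 41, 95), (7, 64, 1, 210)]
print(screen(obs))
```

Even this would only locate where frames are being lost; attributing the loss to collisions, interference, or transmission errors would still need more instrumentation.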
OBJECTION: Since the Wi-Fi MAC protocol is less than 50% efficient for most frame sizes, you're never going to observe a link occupancy rate much higher than 40%; so how come you [actually, Raghavendra et al.] infer that there isn’t collapse/congestion when the median utilization is under 40%?
It is true that if one just monitors RF duty cycle, 802.11 (like virtually all CSMA variants) is inefficient and one loses about 50% of the air-time due to the protocol. Raghavendra et al. define utilization to include channel sensing time and back-off time slots; it’s not just the on-air time. Jardosh et al. found that throughput increased linearly with utilization up to about 80%, by this definition.
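A back-of-envelope calculation (the numbers are illustrative, not measured) shows why the two definitions of “utilization” diverge so sharply: counting only on-air frame time runs into the CSMA overhead ceiling, while counting sensing and back-off slots as busy time does not.

```python
# Illustrative one-second slice of channel time, split three ways.
airtime_frames_us = 400_000   # time actually spent transmitting frames
sensing_backoff_us = 350_000  # interframe spaces and back-off slots
idle_us = 250_000             # truly idle air time
total_us = airtime_frames_us + sensing_backoff_us + idle_us

# Definition 1: only frames on the air count as "busy".
on_air_only = airtime_frames_us / total_us
# Definition 2 (as in Raghavendra et al.): sensing and back-off count too.
incl_overhead = (airtime_frames_us + sensing_backoff_us) / total_us

print(f"on-air only: {on_air_only:.0%}, incl. overhead: {incl_overhead:.0%}")
```

Under the first definition the channel looks stuck near the 40% ceiling; under the second, the very same channel reads 75%, which is why a median utilization under 40% by the broader definition is meaningful headroom rather than protocol saturation.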
OBJECTION: What would the effect of interference be on WISPs? Would complaints by WISPs be evidence of congestion?
Interference would likely reduce the coverage area of a WISP access point, and reduce throughput. Complaints from WISPs would be evidence of congestion, but not necessarily compelling evidence, since they have a financial incentive to invest no more in infrastructure than necessary, and thus to operate on the hairy edge of congestion; see the earlier point about field engineers seeing congestion every day.
It’s true that extensive interference impairs or destroys long-range services, and that even if there isn’t severe interference at every location, providing a long-range service becomes problematic if it’s present at sufficiently many locations. However, there are solutions to this other than additional spectrum: unlicensed operators can lease or buy licenses that will provide protection against interference, and both licensed and unlicensed operators can reduce cell size.
References
Jardosh, A. P., Ramachandran, K. N., Almeroth, K. C., & Belding-Royer, E. M. (2005). Understanding congestion in IEEE 802.11b wireless networks. In Proceedings of the 2005 Internet Measurement Conference (IMC '05). http://conferences.sigcomm.org/imc/2005/papers/imc05efiles/jardosh/jardosh.pdf.
MASS Consultants. (2009). Estimating the utilization of key license-exempt spectrum bands. Technical Report MC/SC0710/REP003/3, Ofcom. http://www.ofcom.org.uk/research/technology/research/exempt/wifi/.
Raghavendra, R., Padhye, J., Mahajan, R., & Belding, E. (2009). Wi-Fi networks are underutilized. Technical Report MSR-TR-2009-108, Microsoft Research. http://research.microsoft.com/en-us/um/people/ratul/papers/tr2009-wifi-underutilized.pdf.
Sicker, D., Doerr, C., Grunwald, D., Anderson, E., Munsinger, B., & Sheth, A. (2006). Examining the wireless commons. In TPRC 2006. http://ssrn.com/abstract=2103824.