"Spectrum is the equivalent of our highways," says Christopher Guttman-McCabe, vice president of regulatory affairs for CTIA-The Wireless Association, an industry trade group. "That's how we move our traffic. And the volume of that traffic is increasing so dramatically that we need more lanes. We need more highways." (Joelle Tessler, "Wireless companies want a bigger slice of airwaves", Associated Press, posted to SiliconValley.com 12/28/2009)

And it's also as self-serving as they come. What the cellular companies need is data capacity. There are many ways to get it that don't require new radio licenses, notably increasing the density of cell towers and improving antenna technology. But those are more expensive than new licenses, hence the claim that they need "the land".
Monday, December 28, 2009
Sunday, December 27, 2009
Let’s start with a particular musical tradition: harpsichord pieces in the High Baroque. Bach wrote the Goldberg Variations, for example, with a particular instrument and even performer (Goldberg) in mind. The performer has many options, however, regarding tempo and mood. When the same score is played on a different instrument, e.g. the piano, an additional set of choices and opportunities arises.
- Composer – policy maker (legislator or regulator with quasi-legislative powers, like the FCC)
- Score – law, rule or regulation
- Instrument – technology and social context
- Performer – judge (or quasi-judicial actor, e.g. the FCC)
- Audience – interest groups, stakeholders, citizens, etc.
- Changes of instruments (technology) that require only minor changes in the score (law)
- Changes that prompt composers (policy makers) to invent new genres (rules), either as a result of new technologies or the internal development of the genre itself
- Changes brought about by shifts in performance (judicial) practice
Update 12/28/2009: See the comments for some great thoughts from Jon Sallet about the role of improvisation in music and governance. His conclusion: "In a world of change and uncertainty, discretion is an important tool; discretion that is applied by professionals (like trained musicians), within guidelines (like the old rule against using augmented fourths) but that calls upon the expertise of the composer and the performer both to work, as it were, in harmony."
Robert E. Mensel, "'Kodakers Lying in Wait': Amateur Photography and the Right of Privacy in New York, 1885-1915", American Quarterly, Vol. 43, No. 1 (Mar., 1991), pp. 24-45, PDF available.
James V. DeLong, “Avoiding a Tech Train Wreck”, The American, May/June 2008
Saturday, December 26, 2009
You don’t know either, even if you’re a lawyer or scholar who’s written confident diagnoses of, and persuasive curative prescriptions for, various policy problems.
If you’re a regulator, you know you don’t know.
Decision makers have always operated in a world of complexity, contradiction and confusion: you never have all the information you’d like to make a decision, and the data you do have are often inconsistent. It is not clear what is happening, and it is not clear what to do about it. What’s most striking about the last century is that policy makers seem to have been persuaded by economists that they have more control, and more insight, than they used to.
We have less control over the world than we’d like: we would like to prevent unwanted situations, but can’t; or we would like favorable circumstances to continue, but they don’t.
There is a small part of the world where the will has effective control; for the rest, one has to deal with necessity, i.e. circumstances that arise whether you will or no. Science and technology since the Enlightenment have dramatically widened our scope of control; economics has piggy-backed on the success of classical physics to make large claims about its ability to explain and manage society. However, this has had the unfortunate consequence that we no longer feel comfortable accepting necessity. If a situation is avoidable – say, postponing the moment of death through a medical intervention – then it becomes tempting to think that when it comes, someone or something can be held responsible.
As Genevieve Lloyd tells it (and I understand it) in Providence Lost (2009), our culture opted to follow Descartes in his framing of free will: we should do the best we can, and leave the rest to divine Providence, which provides a comforting bound to our responsibilities. In the absence of providence, however, we have no guidance on how to deal with what lies beyond our control. As Lloyd puts it, “the fate of the Cartesian will has been to outlive the model of providence that made it emotionally viable.” She argues that Spinoza’s alternative account of free will, built on the acceptance of necessity, is better suited to our time; there is freedom in how we shape our lives in the face of necessity, and a providential deity is not required.
Our Cartesian heritage can be seen in the response to the financial collapse of recent years: someone or something had to be responsible. If only X had done Y rather than Z… but an equally plausible account is that crises and collapse are inevitable; it was only a matter of time.
I submit that the best response to an uncertain and ever-changing world is to accept it and aim at resilience rather than efficiency. Any diagnosis and prescription should always be provisional; it should be made in the knowledge that it will have to be changed. Using efficiency as the measure of a solution, as neoclassical economics might, is the mark of the neo-Cartesian mind: it assumes that we have enough knowledge of the entire system to find an optimum solution, and that we have enough control to effectuate it. In fact, an optimum probably doesn’t exist; if it does exist, it’s probably unstable; and even if a stable solution exists, we have so little control over the system that we can’t implement it.
The best conceptual framework I’ve found for analyzing problems in this way is the complex systems view, and the most helpful instantiation is the approach to managing ecosystems encapsulated in C. S. Holling’s “adaptive cycle” thinking. (See e.g. Ten Conclusions from the Resilience Project). The adaptive cycle consists of four stages: (1) exploitation of new opportunities following a disturbance; (2) conservation, the slow accumulation of capital and system richness; (3) release of accumulation through a crisis event – cf. Schumpeter’s creative destruction; and (4) reorganization, in which the groundwork for the next round is laid.
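For concreteness, the four stages can be jotted down as a trivial state machine. This is a toy sketch: the stage names come from Holling, but the code itself is mine and purely illustrative.

```python
# Toy sketch of Holling's adaptive cycle as a state machine.
# The stage names come from the text above; everything else is illustrative.
STAGES = ["exploitation", "conservation", "release", "reorganization"]

def next_stage(stage: str) -> str:
    """Return the stage that follows `stage` in the adaptive cycle."""
    i = STAGES.index(stage)
    return STAGES[(i + 1) % len(STAGES)]

# Release (crisis) is not an end state but the prelude to reorganization,
# which in turn sets up the next round of exploitation.
assert next_stage("release") == "reorganization"
assert next_stage("reorganization") == "exploitation"
```

The cycle has no terminal state, which is the point: a crisis is not a failure of the system but a stage of it.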
Two techniques seem to be particularly helpful in applying this approach to governance: simulation and common law. Simulation and modeling exploit the computing power we now have to explore the kinds of outcomes that may be possible given a starting point and alternative strategies; they give one a feel for how resilient or fragile different proposed solutions may be. Simulation may also help understand outcomes; for example, Ofcom uses modeling of radio signal propagation rather than measurement to determine whether licensees in its Spectrum Usage Rights regime are guilty of harmful interference with other licensees. (See e.g. William Webb (2009), “Licensing Spectrum: A discussion of the different approaches to setting spectrum licensing terms”.)
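To give a flavor of what such simulation looks like, here is a deliberately toy Monte Carlo sketch (all numbers invented) that compares an “efficient” strategy holding no reserves against a “resilient” one that sacrifices growth for a buffer, under random shocks:

```python
# Toy Monte Carlo sketch of using simulation to compare the resilience of
# alternative strategies. Every parameter here is invented for illustration.
import random

def survives(buffer_fraction: float, years: int = 20, seed: int = 0) -> float:
    """Fraction of simulated runs in which a system holding the given
    reserve buffer survives `years` of random shocks."""
    rng = random.Random(seed)
    runs, survived = 1000, 0
    for _ in range(runs):
        capital = 1.0
        ok = True
        for _ in range(years):
            capital += 0.05 * (1 - buffer_fraction)      # growth forgone by holding reserves
            if rng.random() < 0.1:                       # 10%/year chance of a shock
                capital -= 0.5 * (1 - buffer_fraction)   # reserves blunt the shock
            if capital <= 0:
                ok = False
                break
        survived += ok
    return survived / runs

# An "efficient" strategy (no buffer) fails more often than a "resilient" one.
print(survives(0.0), survives(0.5))
```

The point is not the numbers, which are made up, but that even a crude model lets one ask how often a strategy survives, rather than how well it does on average.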
A common law approach helps at the other end of the process: Jonathan Sallet has argued persuasively that common-law reasoning is advantageous because it is a good way of creating innovative public policies, and is a sensible method of adapting government oversight to changing technological and economic conditions.
But I could be wrong…
Update 12/28/2009: See the fascinating comments from Rich Thanki, below. He takes two salient lessons from complexity theory: avoid monoculture, and develop rules of thumb. He also provides more than the usual quick Keynes quote about "slaves of some defunct economist."
Thursday, December 24, 2009
Any change in policy has unintended consequences; some of them will be adverse. One has to think carefully before advocating radical change: the benefits of change or the costs of doing nothing should be substantial. One way of beginning a cost/benefit analysis is to understand the underlying forces.
Many arguments have been given for new internet regulation. Cowhey and Aronson (Transforming Global Information and Communication Markets 2009:17) cite three factors that will force change: the modular mixing and matching of technology building blocks; the need to span traditional policy and jurisdictional divides (aka Convergence); and the need to rely more on non-governmental institutions to coordinate and implement global policy. In my paper “Internet Governance as Forestry”, I cite three characteristics of the internet that require new responses: modularity, decentralized self-organization, and rapid change.
Let’s consider, then, the following candidates for radical, unprecedented and transformational change in the internet economy taken from these two lists: modularity, convergence, the “third sector”, decentralization, and rate of change.
I doubt modularity will persist as a characteristic of the internet business. While it is clearly a hallmark of our current stage, it has a long history: the standardization of interchangeable parts is commonly dated to Eli Whitney’s process for manufacturing muskets for the US government in 1798, but there is evidence for standardization of arrowheads and uniform manufacturing techniques in the Bronze Age, and some anthropologists claim there was standardization of Stone Age tools. However, modular technology does not lead inescapably to a modular industry structure. Standard parts have not rendered pre-internet industries immune to anti-trust problems, and it is unlikely they will do so now. The role of modularity in the relationships between companies waxes and wanes, depending on rather than driving industry consolidation and market power.
The good old convergence argument is true enough, but tired. The mixing of broadcasting, telecom and intellectual property regulation brought about by common digital formats will undoubtedly require a huge amount of creative reform of regulation, but I no longer think that the result will be the abolition of regulatory categories based on the commercial and technological status quo.
I would very much like to see such an abolition; I proposed reorganizing the FCC by policy imperatives rather than industry categories in my FCC Reform paper, but I don’t think it’s going to be practical. The human rage to classify will reassert itself. Classifying by policy concern probably won’t work, sad to say, because of how regulation tends to work: take a new problem, fit it into an existing category, and apply the rules of that category. Even if this mechanism yields weird results in times of transition, it’s usually efficient and is likely to persist, even as categories change. We don’t yet have the new categories, but they may well emerge based more on how industry self-organizes than by logic. Judging by today’s behemoths, they might perhaps be networks, cloud services, devices and content (i.e. AT&T, Google/Microsoft, Apple/Dell and Hollywood) replacing broadcasting, telecom, cable and intellectual property (ABC/CBS/NBC, the old AT&T, Comcast and Hollywood).
The internet is no doubt much more decentralized than its forebears, e.g. the telephone network; it is by definition an affiliation of many networks, and a lot of processing is done “at the edges” rather than “in the middle”. There is a linkage between a decentralized architecture and modularity: modularity allows decentralization, and is amplified by it. If or when either regresses to the mean, the other will tend to do so as well. Since I don’t believe that a high and increasing amount of modularity is a persistent attribute of the 21st century communications industry, I don’t believe that high and increasing decentralization is either. However, the current degree of modularity and decentralization has probably put us into a qualitatively different regime; there has been a phase change, so to speak. The polity has just begun to work through the implications, and this will take a decade or more.
The “third sector”: Non-Governmental Institutions (NGOs), non-profits and civil society
Cowhey and Aronson’s interest in NGOs is based in trade, and the organizations they have in mind (ICANN, W3C, IETF) meet the four-part definition offered by Lester Salamon, a political scientist and scholar of US non-profits at Johns Hopkins: they are organizations, i.e., they have an institutional presence and structure; they are private, i.e., they are institutionally separate from the state; they are fundamentally in control of their own affairs; and membership/support is voluntary. Salamon argues that the prominence of NGOs represents an “associational revolution”. I cannot judge whether this phenomenon is transient or not; however, the large organizations clearly provide an alternative venue for governance. For example, Cowhey and Aronson argue that the IETF’s central role in internet standards came about because the US Government decided to delegate authority to it.
If one relaxes the requirement for formal institutional structure, the rise of private, voluntary engagement in politics facilitated by Web 2.0 represents an impetus and perhaps even a venue for new governance. Currently fashionable examples include http://transparencycorps.org/, http://opengov.ideascale.com/ and http://watchdog.net/; tools that facilitate engagement include http://www.opencongress.org/, http://www.opensecrets.org/lobbyists/ and http://www.govtrack.us/. The citizen’s ability to know about the activities of their legislators and to petition them has never been greater; tools for organizing into ad hoc coalitions (most famously the role of http://www.meetup.com/ in the 2004 and 2008 US campaigns) lead to a ferment of groups that may grow into more recognizable institutions. Policy makers will have to invent new ways to track and mollify these groups, at the very least; the Obama Administration appears to be using them to support policy making.
While the decentralized architecture of the internet and the rise of NGOs are different phenomena with different causes, Web 2.0 technologies are beginning to draw them together.
Rate of change
As to whether the rapidity of change is transformative and permanent, I think the answer is No and Yes. The rate of technical and commercial innovation on the internet over the last two decades has been stunning. It has been abetted by modularity, and even more so by the ability of software to morph without having to retool a factory. (Retooling a code base is a non-trivial exercise, though.) However, the internet is growing up and it’s reasonable to expect that the industry and technology will settle into a phase of relative maturity.
On the other hand, while the rate of change may not continue to accelerate, or even continue at its current pace, the political system has to adjust to the stresses that the increase to date has already imposed. William Scheuerman, for example, argues that the “social acceleration of time” has created a profound imbalance between the branches of government in liberal democratic systems like the US. Even if the rate of techno-commercial innovation slows down, the rate at which global markets generate and propagate news will be a challenge for political systems whose time cycles are set in constitutions that change only very slowly, and in human physiology, which changes hardly at all.
Back to Hard Intangibles
A change in context that forces a change in governance doesn’t need to be irreversible for the consequences to be profound. Since history is cumulative, a “phase change” in policy making is a change that never really reverts to its prior form, since the context changes with it. However, some changes are more portentous than others. I’ve argued above that the modularity, convergence and decentralization of the internet are temporary, and part of the regular cyclical flow of industry structure. Changes in tempo and the rise of the third sector seem to me to be more momentous. I think both are rooted in the growing intangibility of our societies, which has been accelerated by ICT: complex software running on powerful processors linked by very fast networks.
I think there is a link back to my 2006/2007 obsession with “hard intangibles” (DeepFreeze9 thread). The ability to compose more components than the mind can manage makes programming/debugging very hard, particularly when those components are so easily mutable: it’s easier to change a line of code than to retool an assembly line. The “soft products” of these technologies, themselves complex, composable and mutable, become the inputs for culture and thus policy making: it’s easier to change web artifacts and social networks than to manage a movement using letters and sailing ships.
I first heard the term used by Rohan Bastin, Associate Professor of Anthropology at Deakin University, in a Philosopher’s Zone interview about Claude Levi-Strauss. “The human rage to classify” is also a chapter title in F. Allan Hanson, The Trouble With Culture: How Computers Are Calming the Culture Wars, SUNY Press 2007
 This prediction contradicts Ray Kurzweil’s contention that technological change accelerates at an exponential rate, and will continue to do so: his “Law of Accelerating Returns” [link, critique]
 William E. Scheuerman, Liberal Democracy and the Social Acceleration of Time (2004). Scheuerman defines social acceleration of time as “a long term yet relatively recent historical process consisting of three central elements: technological acceleration (e.g. the heightening of the rate of technological innovation), the acceleration of social change (referring to accelerated patterns of basic change in the workplace, e.g.), and the acceleration of everyday life (e.g., via new means of high-speed communication or transportation).” I’m indebted to Barb Cherry for introducing me to Scheuerman’s ideas; see e.g. her “Institutional Governance for Essential Industries Under Complexity: Providing Resilience Within the Rule of Law” CommLaw Conspectus 17.1
Human thinking won’t speed up much, if at all – though tools can make it look as if it does. See for example Edwin Hutchins’ wonderful Cognition in the Wild (1996). Hutchins contends that we need to think in terms of “socially distributed cognition” in a system that comprises people and the tools that were made for them by other people.
Monday, December 21, 2009
The legal scholar William Boyd introduced me to the concept of an “object of governance”, i.e. the explicit focus or nominal topic of regulatory activity. Boyd is concerned with deforestation as an object of climate governance; a quick web search throws up examples like organized crime, “The East”, the Sahel, and risk. Objects of communications regulation include personally identifiable information (PII), spectrum, phone service, and the internet.
While most of these objects are intangible, they are at least to some extent thing-like; they’re nouns. It becomes more tricky when regulation addresses behavior – that is, verbs. I’ll work through a few examples in communications regulation where the object of governance started off as a thing/noun, and is becoming a behavior/verb:
Privacy: From PII to Use
The current approach to protecting privacy on the web is rooted in the notion of data security: information exists somewhere, and needs to be protected. However, an alternative conception based on appropriate use rather than access restrictions is emerging. The idea is that the traditional Notice & Choice regime is complemented by a use-and-obligations model in which organizations disclose the purposes to which they intend to put information, and undertake to limit themselves to those uses.
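A minimal sketch of what a use-and-obligations check might look like in code (the data fields and purpose names are invented for illustration, not taken from any proposal):

```python
# Minimal sketch of a use-and-obligations check: an organization declares
# purposes at collection time, and each later use of the data is checked
# against that declaration. Field and purpose names are invented.

DECLARED_PURPOSES = {
    "email": {"billing", "service-notices"},
}

def use_allowed(field: str, purpose: str) -> bool:
    """True only if `purpose` was among the declared purposes for `field`."""
    return purpose in DECLARED_PURPOSES.get(field, set())

assert use_allowed("email", "billing")
assert not use_allowed("email", "marketing")   # undeclared use is refused
```

Note the shift in the object of governance: the check constrains what is done with the data (a verb), not who can reach it (a noun).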
Wireless regulation: From spectrum to radio operation
Radio regulation has been framed in terms of government management of a “spectrum asset” for many decades. Even though in practice the regulations concerned themselves with the operating parameters of transmitters, the idea that some underlying asset existed has been a useful fiction, particularly as the detailed technology and service choices have been increasingly privatized through auctions of general-use licenses.
However, a new generation of radio technologies has called this approach into question. “Open Spectrum” advocates have argued that dynamic wireless technologies obviate many underlying assumptions of current regulation, and prefer “commons” access over exclusive licenses. Some in the RF engineering community recommend that regulation take into account dynamic adaptation at all layers in the network stack, not just at the radio layer. I have argued that a static, spectrum-as-asset approach is not a given; a more dynamic radio-as-trademark metaphor for interference is perfectly workable.
Universal Service: From telephony to internet access
The Universal Service Fund in the US, and its equivalents in other countries, was conceived of as guaranteeing phone service to those who would not otherwise be able to afford it, particularly in rural communities. There is now a great deal of debate about extending the universal service concept to the internet. However, since internet access can come in an unlimited variety of flavors, it is unclear what the goal of the program should be. Phone service is the same everywhere; but what broadband speed is “good enough”? The regulatory debate is moving away from how to fund phone service to how to define baseline access.
Common carriage: From a neutral network to network management
The most recent of these debates concerns the 21st century equivalent of common carriage for the internet. The rallying cry of Network Neutrality had satisfyingly thing-like connotations: there was a network, and it had to have the attribute of neutrality (noun/adjective). Over time it has largely been agreed that network operators should have some discretion in managing the behavior of their networks. The question has now become a behavioral one: what degree of network management (verb) is appropriate?
A shift in the objects of governance from things to behaviors suggests a shift in regulation from ex ante to ex post action, that is, from making detailed rules up-front to stating general principles and enforcing breach after the fact. In Law’s Order, economist David D. Friedman compares speed limits (ex ante) with reckless driving (ex post), and observes that ex post punishments are most useful when the behavior is determined by private knowledge that the regulator cannot observe.
"Ex ante punishments can be imposed only on behavior that a traffic cop can observe; so far, at least, that does not include what is going on inside my head. Ex post punishments can be imposed for outcomes that can be observed due to behavior that cannot—when what is going on inside my head results in my running a red light and colliding with another automobile."

When an object of governance is thing-like, and changes in the attributes of those things are easily observed – a data breach occurs, some packets don’t cross the network – then ex ante rules are attractive. When governance concerns behavior, particularly behavior that is difficult to observe – the uses to which data is put by a company, whether a particular network management technique discriminates against a competitor – then the regulator has to fall back on ex post enforcement. The difficulties with ex post are well-known, though: from providing sufficient clarity up-front about what would constitute a breach, to the political difficulty of exacting very occasional but very large penalties from powerful players.
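The last point can be made quantitative with the standard law-and-economics back-of-envelope (this is the textbook deterrence formula, not a quote from Law's Order): for an ex post penalty to deter, the expected penalty must at least equal the harm, so the penalty must scale inversely with the probability of detection. Rare detection therefore forces very large penalties:

```python
# Back-of-envelope deterrence arithmetic (standard law-and-economics,
# offered as illustration): the expected penalty must match the harm,
# so penalty = harm / probability_of_detection.

def deterrent_penalty(harm: float, p_detect: float) -> float:
    """Smallest penalty that deters, given the chance of being caught."""
    if not 0 < p_detect <= 1:
        raise ValueError("detection probability must be in (0, 1]")
    return harm / p_detect

# A $1M harm detected only 1% of the time needs a $100M penalty to deter.
print(deterrent_penalty(1_000_000, 0.01))  # → 100000000.0
```

This is exactly the "very occasional but very large penalties" problem: the less observable the behavior, the more politically painful the penalty needed to police it.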
 Note that this is not the traditional meaning of the term, which used “object” as synonymous with “objective”, e.g. Edmund Burke: “To govern according to the sense and agreement of the interests of the people is a great and glorious object of governance. This object cannot be obtained but through the medium of popular election, and popular election is a mighty evil.”
Boyd, William, “Ways of Seeing in Environmental Law: How Deforestation Became an Object of Climate Governance”, to be published in Ecology Law Quarterly
 Daniel J. Weitzner, Harold Abelson, Tim Berners-Lee, Joan Feigenbaum, James Hendler, Gerald J. Sussman (2007) “Information Accountability”, Computer Science and Artificial Intelligence Laboratory Technical Report, MIT-CSAIL-TR-2007-034, June 13, 2007
 Business Forum for Consumer Privacy, “A New Approach to Protecting Privacy in the Evolving Digital Economy: A Concept for Discussion”, March 2009
 Kevin Werbach (2003), "Radio Revolution: The Coming of Age of Unlicensed Wireless," New America Foundation and Public Knowledge, no date on document, dated 15 Dec 2003 on NAF site
 Preston Marshall (2009) “Quantifying Aspects of Cognitive Radio and Dynamic Spectrum Access Performance” (see slides 15, 16)
 J Pierre de Vries, (2008) "De-situating spectrum: Rethinking radio policy using non-spatial metaphors" New Frontiers in Dynamic Spectrum Access Networks, 2008 (DySPAN 2008). http://ssrn.com/abstract=1241342
David D. Friedman, Law's Order: What Economics Has to Do with Law and Why It Matters, Princeton University Press, 2001. See Chapter 7 for a discussion of ex ante/ex post.
Friday, December 18, 2009
In the Summary and Conclusions, co-written with Don Abelson, Cowhey and Aronson introduce four “principles” for market governance in the light of current conditions, and ten “norms” needed to implement the principles (see Appendix 1 below). They define market governance as “the mixture of formal and informal rules and the expectations about how markets should logically operate.”
When I look at their norms, I see a set of choices for the set-points of a small number of governance mechanisms:
- Subsidy (Norm 2)
- Competition policy (Norms 3, 5)
- Regulatory “touch” (Norms 1, 4, 6)
- Property rights (Norms 8, 9, 10)
- Public Safety. Protecting citizens is a primary responsibility of government.
- Consumer Protection. Policy makers take action when lawmakers conclude that commercial activity needs to be circumscribed in the public interest.
- Culture and Values. In order to protect and express a culture’s values, policy makers seek to limit some kinds of speech and promote others.
- Government Revenue. Money needs to be raised and redistributed by federal, state and local treasuries; this includes taxes, fees, levies, subsidies, and tax breaks.
- Economic Vitality. A healthy market produces goods and services that citizens value.
The mechanisms of competition policy and property rights are means to the end of economic vitality, my fifth policy imperative. The mechanism of regulatory “touch” is a means that I address in my paper under the heading of Principles (see Appendix 2, below); as it happens, I concur with their recommendations for light touch regulation.
The difference in emphasis is perhaps most noticeable in the absence of norms/mechanisms that speak to the “soft” policy imperatives. While Cowhey & Aronson’s Norm 7 addresses media content, and thus recognizes some value in “culture and values”, my third policy imperative, it is not implementable in the way the others are; it merely recommends a balance between encouraging trade and protecting cultural values. The “public safety” imperative is completely absent. And while one may argue that Imperative 2, “consumer protection”, is to be achieved through competition policy (Norms 3 and 5), Cowhey & Aronson make no explicit mention of consumers.
 Cowhey, Peter F. and Jonathan D. Aronson, Transforming Global Information and Communication Markets: The Political Economy of Innovation, MIT Press (February 15, 2009). Softcopy available at http://globalinfoandtelecom.org/book/ (look for the “Download free under Creative Commons license” link)
 It is telling that Cowhey and Aronson seem to equate the public interest with consumer welfare, an economic construct. For example, on p. 17 they write: “The main challenge for governance is creating appropriate new spaces for market competition that allow the most important potential for innovation to play out in a manner that enhances consumer welfare (the public interest).”
 De Vries, Pierre, “Internet Governance as Forestry: Deriving Policy Principles from Managed Complex Adaptive Systems”, TPRC 2008. Available at SSRN: http://ssrn.com/abstract=1229482
Appendix 1: Four guiding principles and ten norms to help implement them
(Cowhey & Aronson (2009), Table S.1, p. 265)
- Enable transactions among modular ICT building blocks.
- Facilitate interconnection of modular capabilities.
- Facilitate supply chain efficiency, reduce transaction costs.
- Reform domestically to help reorganize global governance.
- Delegate authority flexibly.
- Invest in virtual common capabilities; be competitively neutral.
- Use competition policy to reinforce competitive supply chains.
- Intervene lightly to promote broadband networks.
- Narrow and reset network competition policy. All networks must accept all traffic from other networks. Narrow scope of rules to assure network neutrality. Separate peering and interconnection for provision of VANs.
- Government should allow experiments with new applications.
- Create rules for globalization of multimedia audiovisual content services that encourage international trade and foster localism, pluralism, and diversity.
- Tip practices toward new markets for digital rights.
- Promote commercial exchanges that enhance property rights for personal data and mechanisms to do so.
- Users own their information and may freely transfer it.
- Flexibility: Determine ends, not means.
- Delegation: Most problems should be solved by the market and civil society.
- Big Picture: Take a broad view of the problem and solution space.
- Diversity: Multiple solutions are possible and desirable.
Thursday, December 17, 2009
According to 2002 Census data, the marketing research & public opinion polling industry as a whole had revenues of $10.9 billion; special interests paid Washington lobbyists $3.2 billion in 2008 according to the Center for Responsive Politics. Lobbying is as old as politics, but polling is relatively new (19th century), as is its premise: the importance of mass public opinion in government and diplomacy (18th century). Lobbyists are key players in Washington DC, and there’s a revolving door that moves former federal employees into jobs as lobbyists, and that pulls former hired guns into government careers or political appointments. Polling expertise is a key attribute in top political advisors, and something that politicians – and administrations – do incessantly.
The social media technologies of Web 2.0 will create a lobbying/polling hybrid, a new political power center to rival traditional lobbying and polling. Efforts by government to solicit citizen opinion, like the Ideascale site soliciting input on the National Broadband Plan, or the Open for Questions site run by the White House, are a way for citizens to engage in little-L lobbying. These channels invite manipulation that will amount to big-L lobbying. In the same way that astroturfing co-opted grassroots lobbying, political operatives will co-opt the forms of Web 2.0 citizen participation. Those who are adept at viral marketing will propel political memes into real-time polling tools in a way that amounts to lobbying.
The amplification of the randomly popular that is pervasive on social rating sites like digg will infuse politics, intensifying the temptations of “poll, then decide”. We’ll also likely see something akin to the hollowing out of the media industry mid-list that The Economist charted in “A world of hits”: In movies and books, both blockbusters and the long tail are doing well; the losers are titles (and retailers) in the not-quite-so-good middle ground. Similarly, blockbuster issues will be laid on for the mass public that doesn’t care about politics (shibboleths like taxes and abortion), and niche lobbying on topics like radio spectrum, prison reform, and privacy will become even more fine-grained. Citizen publics will be important in both: as armies of computer-generated extras in the first case, and as engaged semi-experts in the second. Worthy mid-ground issues like trade, education, and energy policy will get steadily shorter shrift.
One implication is that niche topics like hunger policy shouldn’t strive to move up the charts into the middle ground – they’ll just wither there. Rather, niche players should embrace their residence in the long tail and make the most of Web 2.0 phenomena, like Polling x Lobbying, that give them direct access to the appropriate sliver of the policy making elite.
Monday, December 14, 2009
Timothy Mitchell, for example, argues that the economy was created by economists:
The economy is a recent product of socio-technical practice, including the practice of academic economics. Previously, the term “economy” referred to ways of managing resources and exercising power. In the mid-twentieth century, it became an object of power and knowledge. Rival metrological projects brought the economy into being. 
In his chapter in Do Economists Make Markets? On the Performativity of Economics, Michel Callon puts it this way: “To claim that economics is performative is to argue that it does things, rather than simply describing (with greater or lesser degrees of accuracy) an external reality that is not affected by economics.” MacKenzie argues in his chapter of the same book that the Black-Scholes-Merton options pricing model did not merely help traders price something that already existed; it also shaped the market: because most traders ended up using the model, prices converged to what the model predicted.
In the same way, economists who treat spectrum as an asset (see my post "Property rights without assets") are not simply describing an external reality; they are bringing something into being. One of the key tools in this process is metrology: for example, the gathering of GDP data brings into being “the economy” which is reified through numbers like the GDP. In the same way, the program to make an inventory of spectrum buttresses the spectrum-is-real perspective. (More on spectrum inventories in a future post.)
World views have consequences, and thus stakeholders. Those who have a stake in the existing spectrum-based regime gain from this view; questioning the validity of “spectrum” undermines the security of their rights and privileges. This applies not only to capitalists who own spectrum licenses, but also to progressives who base their claims to government supervision of radios on the public ownership of the supposed “spectrum asset”. On the other hand, if one thinks of radio regulation simply in terms of the operating rights associated with radios, then a much more dynamic regime can be imagined – one that would benefit both political and commercial entrepreneurs. A non-spectrum world view might also be attractive to current “spectrum owners” who are discontented with their rights. 
The political and engineering systems that have co-evolved with the spectrum concept have specific characteristics: largely static allocations of rights to operate, defined in terms of fixed frequency ranges. More dynamic approaches don’t fit nicely. For example, Preston Marshall wants to guarantee the right to operate, but not exclusivity over one channel; he proposes to guarantee a licensee (along with others) aggregate access to sufficient frequencies to deliver a certain amount of service.
This is an approach that focuses on behavior, rather than the exclusive ownership of an asset. As I argued in "Property rights without assets", this is perfectly compatible with a property rights regime, since property rights don’t have to be based on an underlying asset.
The bottom line is that a spectrum-as-asset approach leads one to ignore important elements of the radio system, which leads to inferior rights design. Specifically, receivers have been ignored. If one thinks one’s job is to "carve up spectrum", then one doesn’t have to worry about receivers. But when radio is considered as a system, receivers determine interference just as much as transmitters do, so one has to take them into account explicitly. By analogy: if you’re deciding a land trespass case, you don’t worry about whether the farmer is grazing Holsteins or Friesians. But if you’re deciding a trademark dispute, everything depends on what happens in the mind of the consumer (analogous to the receiver).
 Mitchell, Timothy (2008) “Rethinking Economy”, Geoforum, Volume 39, Issue 3, May 2008, pp. 1116-1121.
 MacKenzie, Donald A., Fabian Muniesa, and Lucia Siu (eds.) (2007), Do Economists Make Markets? On the Performativity of Economics, Princeton University Press
 There is debate about the origin and extent of government property rights in spectrum; see for example the Introduction of William L. Fishman, “Property Rights, Reliance, and Retroactivity Under the Communications Act of 1934”, Federal Communications Law Journal, Vol. 50, No. 1. Fishman concludes: “It would probably be better, therefore, to say that the government regulates electromagnetic radiation in certain defined frequencies, rather than to say it regulates spectrum.”
 See e.g. Section 5.4 in the report “Radio Regulation Summit: Defining Inter-channel Operating Rules”.
 For more on the virtues of the radio-as-trademark metaphor, see my blog post “De-situating Spectrum: Non-spatial metaphors for wireless communication”, and paper “De-Situating Spectrum: Rethinking Radio Policy Using Non-Spatial Metaphors”
Thursday, December 10, 2009
I’ve been struck recently that many if not most definitions of property rights seem to turn on a relationship to an asset. For example, Gary Libecap in Contracting for Property Rights defines them as "the social institutions that define or delimit the range of privileges granted to individuals to specific assets" (1990:1); or Yoram Barzel in The Economic Analysis of Property Rights: "Property rights of individuals over assets consist of the rights, or the powers, to consume, obtain income from, and alienate these assets" (1997:2). Such definitions set out to define rights which assure the owner of an asset that they can derive value from that asset.
However, one can have rights to create value that do not require the existence of an underlying asset – unless, of course, one takes the position that the existence of a property right necessarily implies an asset. 
Therefore, let me distinguish between any property right, which is an asset in itself, and a property right to exploit an asset, which entails two assets: the right itself, and the underlying asset. All assets can lead to property rights – perhaps tautologically, in that something might not be counted as an asset if it does not have rights associated with it – but not all property rights require assets.
It always helps to make things concrete. One property right without an underlying asset is a New York taxi cab medallion: it's a right to operate, but there isn't an underlying asset. The right is tied to a particular place (New York), but that place isn't the asset.
Another common asset-less right is a franchise, that is, an agreement to sell a company's products exclusively in a particular area or to operate a business that carries that company's name.
Perhaps my favorite is a trademark, that is, a word, symbol, or phrase, used to identify a particular manufacturer or seller's products and distinguish them from the products of another. One might use the word “Wired” to brand a magazine, but the word isn’t the asset; when I last counted about a year ago, there were about 27 distinct trademarks using the word "wired" in the US.
Notice that permission for an agent to behave in a particular way is the essence of all these rights – and of rights that require assets, too. Therefore, I’d contend that behavior is the key to property rights, and assets are optional.
There are of course many property rights to assets, from owning a pencil to the right to extract oil in a particular region. Note that the underlying assets don't have to be tangible: an algorithm over which one has a patent is a perfectly viable intangible asset (perhaps made so exactly by the property right).
This distinction between property rights that do and do not require underlying assets matters: if one assumes an underlying asset where there is none, one is liable to over-assign rights.
For example, if trademark regulation assumed that the word being used was the asset, then it might give the owner of the trademark the right to all possible (commercial) uses of the word. There would be only one “Wired” trademark in the US, let’s say owned by Condé Nast; the companies who wanted to use the word to sell cologne, art supplies, energy drinks, stationery, electronic door chimes or automobile wheels would be out of luck. This would be a loss because an entrepreneur could apply the letters w-i-r-e-d to some new product that couldn’t be confused with a magazine without seeking (and probably failing to get) Condé Nast’s permission.
Similar reasoning applies to radio regulation. The existence of radio licenses doesn’t mean that there is an underlying asset, “spectrum”. 
If one regards a radio channel as an asset, then (Anglo-American) regulators have shown a proclivity to grant an expansive array of rights. Following the norm of technology and service neutrality, they have defined operating rights so broadly – in order to allow the licensee to operate in any conceivable way – that they preclude pretty much all other operations that radiate energy in the channel, regardless of whether those operations would actually harm the licensee. Such a broad definition forecloses new entry by potentially useful but non-interfering services.
A broad definition also forecloses future arrangements of radio operating rights that are not tied to a channel-based world view. Bands and channels, as regulatory constructs, are in large part a consequence of the two-stage super-heterodyne radio design, which first filters a broad range of frequencies at the "RF stage" and then, after down-conversion, picks out a narrow range at the "IF stage". This old-fashioned approach is increasingly obsolete – but it is enshrined in regulation.
Barzel, Yoram, Economic Analysis of Property Rights, Cambridge University Press 1989, second edition 1997
Libecap, Gary D., Contracting for Property Rights, Cambridge University Press 1990
 A view of property rights that does not require the existence of underlying assets is not identical to the "bundle of rights" approach taught in law school property classes; there it's taken as a given that there's an underlying asset – paradigmatically, real estate – and the bundle explains how it can be simultaneously “owned” by multiple parties.
 The emergence of the spectrum concept suggests that this is, indeed, the conclusion that has been drawn. Perhaps the reasoning that a property right must entail an asset is one of the reasons why “spectrum” has become such an entrenched concept.
 I’m ignoring allowed inter-channel interference; for a discussion of that case, see my report on the meeting held at Silicon Flatirons, “Defining Inter-Channel Operating Rules”
 See e.g. Soni & Newman 2009, "Direct conversion receiver designs enable multi-standard/multi-band operation", RF Designline
Monday, December 07, 2009
The meeting showed there was broad support for taking receivers into account more explicitly when drafting rules, for example by regulating resulting signal levels rather than the customary approach of specifying rules for individual transmitters. This approach focuses on the results of transmission – which includes interference, the bone of contention in most radio regulation debates – rather than the transmission itself.
Ofcom, the UK communications regulator, took an interference-based approach to licensing by creating Spectrum Usage Rights (SURs). However, SURs were roundly rejected by the cellular operators. Ofcom chose not to impose SURs on the mobile industry, on the premise that the goal of SURs was to give license holders more certainty for their own benefit, not the regulator’s.
While there are many other reasons for the cellcos to reject SURs (the problems SURs address, like uncertainty about likely uses and technologies, or disparate uses in adjacent channels, are largely absent in cellular bands), it is clear that Ofcom deferred to the interests of incumbents – potentially at the cost of consumers or new entrants. One of the conclusions of a 2007 report for the European Commission on radio interference regulatory models came to mind:
“Technology and service-neutral licensing (as would be supported by interference-based licensing techniques) offers significant benefit for end-users but not necessarily for spectrum owners and network providers.”

In an essay in honor of Alfred Kahn’s 90th birthday, Phil Weiser observed that airline regulation (where Kahn, the "Father of Airline Deregulation," made his name) and spectrum regulation share some basic characteristics: both regimes emerged from an effort to protect established interests; both limited output by restricting the use of the resource in question; and in both cases, early academic criticism calling for regulatory reform went unheeded. In making the case for Kahn as a political entrepreneur, Weiser argues that he “pursued the objective of eroding the airline industry’s commitment to the legacy regulatory regime by both undermining the manner in which it protected established incumbents and bolstering the strength of those interests that would benefit from deregulation.”
The radio incumbents Weiser had in mind were the broadcasters and not the cellular companies – but it’s not too much of a stretch to attribute at least some of the resistance to new methods of radio regulation to the New Incumbents.
 Phil Weiser (2009), “Alfred Kahn as a Case Study of a Political Entrepreneur: An Essay in Honor of His 90th Birthday.” Review of Network Economics, 2009. Abstract at SSRN. The paper was first delivered at a conference at Silicon Flatirons in Boulder on September 5, 2008.
 See e.g. William Webb (2009), “Licensing Spectrum: A discussion of the different approaches to setting spectrum licensing terms” (PDF); and Ofcom (2008), “Spectrum Usage Rights: A Guide Describing SURs” (PDF)
 Eurostrategies and LS telcom, “Study on radio interference regulatory models in the European Community” 29 November 2007 (PDF)
Monday, November 02, 2009
A recent News Focus piece in Science (Erik Stokstad, On the Origin of Ecological Structure, Science 2 October 2009, Vol. 326. no. 5949, pp. 33 - 35; there's also an interview with Stokstad in the podcast of 2 October, 2009) raises a more fundamental problem with the utility of the metaphor: ecologists themselves are still struggling to understand what dictates the kinds and proportions of organisms in communities ranging from meadows to montane forests.
Stokstad writes that "there is still no consensus on the relative importance of the various forces [that influence community formation, like competition, predation, and disturbance]. Darwin and many later ecologists emphasized competition among species, but proponents of a controversial theory of biodiversity that assumes competition has no impact argue that immigration and other random demographic events can account for much of the apparent makeup of communities. As a result, ecologists have a long way to go to come up with formulas that predict how communities might arise and change."
If ecologists can't explain community dynamics in biology, it's dangerous to make inferences by analogy about the influence of (say) competition and disturbance in commercial systems. Which is a pity, since that's just what I've tried to do myself...
Tuesday, October 27, 2009
Wireless companies are clamoring for "more spectrum", that is, new radio operating licenses that will allow them to satisfy the exploding demand for data capacity (think video streaming to an iPhone). New licenses are a cheaper way to increase capacity than the alternative: building additional cell towers.
The cellular companies have traditionally been opposed to unlicensed radio allocations (like the one that allows Wi-Fi networks) because they saw them as a substitute for licensed ones: the more unlicensed allocations, the fewer licensed ones – and fewer licenses meant higher prices at auction.
However, at least some of them seem to be realizing that unlicensed can help them off-load traffic from their licensed networks. A video stream that arrives on a phone in a coffee shop via a wired connection to the shop and a Wi-Fi link to the device has not crossed the licensed cellular network, freeing up cellular capacity.
Unlicensed is therefore a complement to licensed - just as open source can complement rather than substitute for proprietary software. Even Microsoft now offers some software under open source-like licenses.
Just as with software, one should expect kicking and screaming in the cellular industry, and variation in the degree of acceptance of unlicensed depending on business model and other assets. IBM is a big supporter of open source because it makes money on services rather than software licenses; T-Mobile is more open to unlicensed than Verizon because it has fewer licenses.
Thursday, October 01, 2009
For example, both Michael Calabrese (The End of Spectrum ‘Scarcity’, New America Foundation Wireless Future Program Working Paper No. 25, June 2009) and Kevin Werbach (Castle in the Air: A Domain Name System for Spectrum, TPRC September 2009) have argued that the database(s) contemplated to manage device operation in the TV white spaces could be the foundation for a method to increase the amount of radio operation.
Thinking through how such a database might be used shows the advantage of approaching radio regulation as coordinating operations, rather than using the conventional approach of “dividing up spectrum”.
The regulatory challenge is therefore not "spectrum databases" but "radio operation databases".
To a first approximation – and perhaps even as the ultimate solution, if one uses the “spectrum” approach – a database would be a listing of “vacant” channels; a device would query the database for “available” channels, and operate in one. When one starts from the premise that spectrum is an asset like land, to be divided up and distributed, vacancy is a self-evident concept: it derives from the attributes of the underlying asset, not from the intended use.
However, context is everything in radio operation. Whether harmful interference will result from the operation of an added radio system depends not only on its transmissions, but also on the transmit and receive characteristics of the incumbent system.
Consider, for example, three channels: A, B, and C. Say incumbent system #1 operates in channel A; channels B and C are nominally vacant. Can an incoming system #2 operate in those channels? If both system #1 and #2 use traditional cellular technology (i.e. FDM, e.g. 3G), the answer is yes. But if #1 uses 3G and #2 uses a TDM technology like WiMAX, the answer is no: there needs to be a guard band between them, so system #2 can only use channel C, and channel B must be left “vacant”. (This is a live issue: see e.g. Ars Technica on the argument between T-Mobile and M2Z over the rules for the AWS-3 band.)
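The dependence of “vacancy” on the technology pairing can be sketched in a few lines. The guard-band numbers and the function below are illustrative assumptions for this example, not actual coordination rules:

```python
# Illustrative sketch: whether a nominally "vacant" channel is usable
# depends on the incumbent/incomer technology pairing, not on the
# channel alone. The guard-band requirements are assumed for the example.
CHANNELS = ["A", "B", "C"]

GUARD_CHANNELS = {  # empty channels required between the two systems
    ("FDM", "FDM"): 0,
    ("FDM", "TDM"): 1,
    ("TDM", "FDM"): 1,
    ("TDM", "TDM"): 0,
}

def usable_channels(incumbent_channel: str, incumbent_tech: str,
                    incomer_tech: str) -> list[str]:
    """Channels an incoming system may use next to the incumbent."""
    guard = GUARD_CHANNELS[(incumbent_tech, incomer_tech)]
    i = CHANNELS.index(incumbent_channel)
    blocked = set(range(i - guard, i + guard + 1))  # incumbent +/- guard
    return [ch for j, ch in enumerate(CHANNELS) if j not in blocked]

print(usable_channels("A", "FDM", "FDM"))  # ['B', 'C']
print(usable_channels("A", "FDM", "TDM"))  # ['C'] – channel B stays "vacant"
```

The same pair of nominally vacant channels yields different answers for different incomers – which is exactly why a bare list of vacancies isn’t enough.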
A mental model informed by spectrum-as-land is therefore not an ideal guide to understanding what needs to be in the database. (More generally, one needs to refine the metaphor to better guide regulation, as Weiser and Hatfield did last year by introducing the concept of "zoning the spectrum" in Spectrum Policy Reform and the Next Frontier of Property Rights, 15 Geo. Mason L. Rev. 549.)
An approach grounded in coordinating operations, on the other hand, leads to the understanding that what needs to be in the database is not just a frequency range and geographic region, but all the relevant parameters of an incumbent operation. The short list would add receiver performance (ability to reject interference) and duty cycle (near-constant transmission like cellular systems, vs. very intermittent but intense uses like firefighting) to the usual suspects of transmitter location, emitted power, and transmit mask.
The task is not to find a “vacant channel”, but to determine if an incoming operator will cause harmful interference. This requires, in addition to the operating parameters of the incumbent and incoming systems, information about the spatial distribution of incumbent and incoming radios, and a propagation model to connect the two.
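A minimal sketch of the difference, with made-up field names, illustrative numbers, and free-space loss standing in for a real propagation model (none of this is a proposed schema):

```python
import math
from dataclasses import dataclass

# Illustrative sketch: a record of an operation carries enough
# information – including receiver performance – to ask "would this
# incomer cause harmful interference?" rather than "is this channel vacant?".
@dataclass
class Operation:
    freq_mhz: float          # center frequency
    eirp_dbm: float          # transmit power
    x_km: float              # crude planar location
    y_km: float
    rx_tolerance_dbm: float  # interference power the receivers can reject
    duty_cycle: float        # recorded but unused here; a fuller model
                             # would weight intermittent transmitters

def path_loss_db(freq_mhz: float, dist_km: float) -> float:
    """Free-space path loss – a stand-in for a real propagation model."""
    return 32.45 + 20 * math.log10(freq_mhz) + 20 * math.log10(dist_km)

def harmful(incumbent: Operation, incomer: Operation) -> bool:
    """Does the incomer's signal at the incumbent exceed its tolerance?"""
    dist = math.hypot(incumbent.x_km - incomer.x_km,
                      incumbent.y_km - incomer.y_km)
    received = incomer.eirp_dbm - path_loss_db(incomer.freq_mhz, dist)
    return received > incumbent.rx_tolerance_dbm

tv_station = Operation(600, 80, 0, 0, rx_tolerance_dbm=-95, duty_cycle=1.0)
nearby = Operation(600, 30, 50, 0, rx_tolerance_dbm=-85, duty_cycle=0.3)
distant = Operation(600, 30, 500, 0, rx_tolerance_dbm=-85, duty_cycle=0.3)
print(harmful(tv_station, nearby))   # True – same channel, too close
print(harmful(tv_station, distant))  # False – path loss does the work
```

The point is that the same incomer is acceptable or not depending on the incumbent’s receivers and location – information a bare list of vacant channels cannot carry.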
Ofcom is the regulator that has thought most deeply about ways to better characterize the interference characteristics of radio systems; see e.g. Ofcom’s Guide to Spectrum Usage Rights (SURs) and William Webb’s recent paper Licensing Spectrum: A discussion of the different approaches to setting spectrum licensing terms.
A well-founded framework for generalizing the white space database – where interference management between incumbents and new entrants is hard-coded into the FCC rules for white space device operation – could benefit from new radio operating metaphors (axe to grind: see my De-situating spectrum: Rethinking radio policy using non-spatial metaphors, DySPAN 2008) and the application of a SUR-like approach.
One will also have to think carefully about the minimal set of parameters needed to facilitate interference avoidance, since it's easy but economically inefficient to come up with a very long list of attributes that describe radio operations. The Silicon Flatirons Center recently examined this issue in a summit on defining out-of-band operating rules.
Tuesday, September 29, 2009
My guess is that I don’t need to use the S-word at all; that it can be replaced with simple terms rather than clumsy paraphrases; and that the clarity of a text, and the understanding of the reader, will be greatly improved if one avoids it.
I first raised this possibility in the DySPAN 2008 paper De-Situating Spectrum: Rethinking Radio Policy Using Non-Spatial Metaphors where I recommended a “restatement of wireless policy in terms of system operation rather than spectrum.”
The word “spectrum” has many meanings, depending on context. In policy documents it’s usually short-hand for the topic dealt with by radio regulators, aka their “object of governance”. There is a range of intended meanings, including a radio license; a range of frequencies; or all the parameters (frequency, geography, transmit power mask, allowed use, single or paired bands, etc.) associated with a radio license.
Engineers use “spectrum” to refer to a range of frequencies, or sometimes to electromagnetic phenomena. Less frequently – curiously, since this is closest to the dictionary definition – they use it to refer to the distribution of electromagnetic energy that results from radio operation.
In short, the following substitution covers most cases:

FOR spectrum SAY radio license OR radio operation OR frequencies

The S-word is also used in various combinations; here are some translations:

FOR acquire spectrum SAY acquire permissions
FOR sharing spectrum SAY coordinating operation
FOR use of spectrum SAY operation of radios
FOR spectrum rights SAY rights to operate
FOR spectrum allocation (noun) SAY license type
FOR spectrum allocation (verb) SAY deciding use
FOR spectrum assignment (verb) SAY authorizing a radio operator
FOR Dynamic Spectrum Access ("DSA") SAY dynamic radio operation
FOR stockpiling spectrum SAY stockpiling licenses
FOR demand for spectrum SAY demand for licenses
FOR manage spectrum SAY manage radio operation
FOR improve the efficiency of spectrum use SAY increase concurrent radio operation
FOR a chunk of spectrum SAY operations concentrated in a band

The value of more precise terminology becomes obvious when one looks through this list. One can distinguish between two distinct referents of “spectrum”: the parameters of operation of radios, and the rights to operate. It’s a distinction between assets and operations. One can easily put radio licenses (“spectrum”) on a balance sheet, but not the institutional and technological ways of coordinating radio operation (“spectrum”).
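Read mechanically, the translations form a longest-match-first substitution table. A toy sketch (the helper name and the abbreviated phrase set are mine, for illustration):

```python
import re

# Toy translation table – an abbreviated, illustrative subset of the list
# above. Phrases are applied longest-first so that a compound like
# "spectrum rights" is translated before the bare S-word is reached.
TRANSLATIONS = {
    "stockpiling spectrum": "stockpiling licenses",
    "demand for spectrum": "demand for licenses",
    "sharing spectrum": "coordinating operation",
    "spectrum rights": "rights to operate",
    "use of spectrum": "operation of radios",
    "spectrum": "radio operation",  # generic fallback
}

def de_spectrum(text: str) -> str:
    """Replace S-word phrases with their more precise equivalents."""
    for phrase in sorted(TRANSLATIONS, key=len, reverse=True):
        text = re.sub(re.escape(phrase), TRANSLATIONS[phrase], text,
                      flags=re.IGNORECASE)
    return text

print(de_spectrum("Carriers cite demand for spectrum to justify spectrum rights."))
# Carriers cite demand for licenses to justify rights to operate.
```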
But why bother? An obvious retort is that this is just nitpicking: “Everybody knows what they mean by the word in a given context.” I argue that it’s important because the connotations of words matter, and change how we see the world.
Constant, thoughtless use of the S-word without teasing apart its meanings creates a thing: “the spectrum”. We come to accept as real the illusion that we’re dealing with a concrete thing (like bushels of corn) rather than the behavior of devices and their owners. If one takes away the radios, “spectrum” as an object of governance ceases to exist, although “spectrum” in the sense of “electromagnetic phenomena” persists. This illusion leads to the fallacy that “spectrum” can be counted like bushels of corn, and that it is permanently divisible and alienable, whereas it is in fact a regulated socio-technical arrangement.
I’m not denying that interference can occur between radio systems, nor that property rights can facilitate the coordination of radio operation. Rather, I’m suggesting that “spectrum” leads too easily to important conclusions that need to be considered more deeply, such as that wireless licenses are necessarily exclusive rights to operate in fixed frequency ranges. A focus on the behavior of radio systems, which changes constantly as technology and institutions evolve, rather than some spectrum-as-thing can produce a more robust and efficient way to coordinate radio operation.
Thursday, September 24, 2009
The links between cybersecurity, digital identity and trade were not immediately obvious to me, and since security isn’t an area where I can even pretend to have expertise, it forced me to think through the topic from the ground up.
I ended up reframing the topic as “the protection of assets in the digital age.” Not “digital assets”, although some assets are undoubtedly digital. Some concrete assets have digital dimensions: for example, a compromised SCADA system can deprive a city of its water supply. This is a new risk: the use of standardized/open solutions and the growing internet connections between SCADA systems and office networks have made these systems more vulnerable to attack. And while a person’s reputation isn’t digital as such, information technology has changed how reputations are constructed, disseminated, and need to be protected.
The next step is to categorize the assets that need to be protected, and for that one can consider various attributes. One useful categorization is the motive for threatening assets; I submit that Sex, Money, and Power are the three important motivations (in all things!).
Sex is about status – high status improves reproductive success. Into this category fall hackers who build exploits to show their prowess, and people who want to build a digital persona. (As an alternative nomenclature to sex, money, and power, one might think of Fame, Fortune, and Foreign Affairs.)
Money refers to economic motivations, whether protecting intellectual property rights in content through encryption, or building botnets for fraud or blackmail.
Power is perhaps the least talked about, at least until recently: it’s the pursuit of national interest through IT, e.g. “cyberwar”. The assets in question include critical national infrastructure and sensitive intelligence.
The motivations of sex, money, and power can be mapped against another categorization, that of the asset context. In increasing order of scale, the contexts are the personal, the corporate, and the national (aka social, commercial, and political). (However, note that global corporations actually operate at both a national and transnational scale.)
With these two categorizations, one can then plot topics on a handy grid (apologies about formatting; I haven't grokked how to import tables into Blogger). Grouped by the threat motive, and running across the personal, corporate, and national contexts, the entries include:

Threats (by motive for attack):
- Sex (aka fame): brand hijacking, web defacement
- Money (aka fortune): theft of goods; appropriation of know-how; diverted compute capacity; advantaging national champions; creating non-tariff barriers
- Power (aka foreign affairs): suppression of speech and access; appropriation of IPR; reduced ability to compete; degraded infrastructure & assets

Assets to protect range from employee & customer safety to state assets (incl. military) and political power structures.
A few notes:
- Threats to assets come in various flavors, notably appropriation, destruction, and constraint of use.
- Trade occurs both within and between the contexts: between individual people, between individual companies, and between people and companies – and likewise, at a different resolution scale, between nations.
- It helps to distinguish the What from the How. Security doesn’t appear explicitly in the table, and neither does digital identity; both are means to an end (the “how”) of protecting assets (the “what”). Other means (they do overlap) include encryption, digital rights management, and norms, rules and treaties.