Tuesday, March 24, 2009

Lessons From Software For Patents, vs. Solving the Software Patent Problem

Software patents may be going the way of network neutrality: an arcane policy problem once the preserve of a small circle of wonks is becoming a politicized slanging match. In both cases an esoteric but important research question has become a point of leverage for certain interest groups. In both cases the subject (“network neutrality”, “software patents”) is at best poorly defined, typically has multiple possible meanings, and at worst is so vague as to be useless. And in both cases, the poster child is the small-time innovator, while the sugar daddy is a big money player minimizing costs (e.g. content providers who love net neutrality, and VCs who hate software patents).

I was fortunate to attend the Silicon Flatirons conference on Evaluating Software Patents last week. The legal scholars there agreed that there were many, incompatible ways to define software patents, and the practitioners agreed that even if a definition were stipulated, they’d find a way around any additional burdens imposed by its use.

However, good arguments were made that software has added something new to the intellectual property mix. None of the following attributes of software are decisive, but together they point to changed dynamics:
  • There is a very high likelihood of infringement when producing a software-based product, since so many patents are implicated in any application

  • There is a large group opposing software patents (whatever they are) because such patents undermine their business model, notably the open source / free software movement

  • Unlike many other inventions, software can also be protected via copyright

  • A very large proportion of current patent applications involve software

  • Software-related products are more intangible than traditional mechanical inventions

  • Programmers as a community are more hostile to the use of patents than other inventors
As I understood the observations of the legal scholars (John Duffy, Mark Lemley, and Michael Meurer – apologies for lumping their views together), it didn’t matter whether “software patents” were a definable category; new legal doctrines were required to address the new problems raised by software. For example, a rapidly moving industry like software has a more pressing need for a high standard of obviousness than earlier technologies of more placid times; and the problem of inadvertent infringement needs to be addressed on its own terms, and not just for “software” patents.

Prof. Lemley also pointed out that over the last three years, the courts have fixed most of the problems that have been grist for the debate. Legislation and reform of the patent office will be a long time coming, and we shouldn’t – and don’t need to – wait for them.

The lesson generalizes: rather than tie new methods of governance to the particular technologies or industries that give rise to new problems, one should abstract the problems and solve them generically. Now if only that had happened with network neutrality…

Tuesday, March 03, 2009

Two-way transparency

“Transparent government” is the watchword these days – but it’s the transparency of the proscenium arch. The curtain has been drawn aside a little and we can watch the players, but they care little and know less about what’s going on in the audience. The groundlings aren’t asked to shape the play.

It’s important for citizens to be able to see into government; but it’s just as important for government to understand what citizens want. And in a democracy, it’s most important for citizens to influence government.

Web 2.0 is giving participatory democracy a fillip, as online social networks are drafted into energizing voters. However, much of the “we’re listening to you” is still theater: citizens are asked to submit YouTube videos, and a select few are played back to create the impression that someone is paying attention.

It’s not (just) that politicians don’t want to listen; making sense of the individual opinions of thousands or millions of people is very hard to do in a nuanced way. The dominant method is still counting noses, whether in an election, an opinion poll, or keeping track of how calls from constituents are splitting on a contentious issue. Potential knowledge is boiled away, leaving only numerology at the bottom of the pot.

Advanced computation can help make sense of citizen input. Semantic analysis tools developed to filter spam, mine search queries, collate machine-submitted bug reports, and extract signals intelligence can be applied to provide a narrative. Old technologies should still be used – and used more intensively. Regulatory agencies should poll citizens, and not just depend on lobbyists and lawyers to tell them what’s important. ICT can also turn citizen input from a burden into a blessing if it becomes a cost-effective way to leverage democratizing innovation into innovative democracy, using all the social networking and idea market tools of Web 2.0.
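As a toy illustration of the kind of thing even very simple semantic analysis can do (the comments and stopword list below are invented, and real tools would be far more sophisticated), a few lines of Python can surface recurring themes rather than just tallies:

```python
# A minimal sketch: surface recurring themes in a pile of citizen comments
# by counting terms instead of just counting heads.  The comments and the
# stopword list are made up for illustration.

from collections import Counter
import re

comments = [
    "Please protect net neutrality and keep my broadband provider honest",
    "Broadband prices are far too high in rural areas",
    "Net neutrality matters for small businesses like mine",
    "We still have no broadband at all in our rural county",
]

STOPWORDS = {"and", "are", "at", "all", "far", "for", "in", "my", "no",
             "our", "too", "we", "still", "have", "like", "please"}

def terms(text):
    """Lower-case word tokens with stopwords removed."""
    return [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOPWORDS]

# Count terms across all comments and show the most frequent themes.
counts = Counter(t for c in comments for t in terms(c))
for term, n in counts.most_common(5):
    print(f"{term:12s} {n}")
```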

Wednesday, February 11, 2009

Ecosystems: sustainability or innovation, pick one (at a time)

Business folk, particularly those in IT, love the ecosystem metaphor (perhaps erroneously). One of the reasons, I realized listening to Pamela Passman on a panel at the Silicon Flatirons annual conference, is that it provides validation to both incumbents and challengers. Passman advocated creating a healthy internet ecosystem, and emphasized the importance of both sustainability and innovation. [*]

Both of these are characteristics of ecosystems, but not, as I understand it, at the same time. For example, mammals could only begin their rise after the extinction of the dinosaurs, prompted by a massive meteor strike or large-scale volcanism. The innovation that led to Homo sapiens resulted from a catastrophic breakdown in ecosystem sustainability.

The adaptive cycle model developed by Buzz Holling and his collaborators has ecosystems constantly cycling through four stages: exploitation or growth, a mature conservation phase, a catastrophic release, and finally reorganization leading to new growth. To take the example of a forest: a fire, drought, or insect infestation triggers the breakdown (release) of the intricate and productive biological web that had been established during the preceding conservation phase. This sets the stage for reorganization, during which species that had been excluded in the prior conservation phase move in. As they become established, exploitation of open niches leads to growth. Eventually, we reach another conservation phase. Everything settles down; all the niches become filled, and the network of connections between biomass and nutrients becomes increasingly tight. This is a stable and very productive stage, from the point of view of resource utilization and biomass production. However, the tight linkages make it fragile to sudden release, starting the cycle again.

Ecosystems therefore oscillate between stability and innovation, swinging through repeated crises. By focusing on the appropriate phase, both incumbents and newcomers can see themselves in an ecosystem view. During an exploitation/growth phase, which we have with the Internet at the moment, newcomers are validated by looking back to the preceding reorganization phase which led to their rise, and (re-)emerging incumbents look forward to the impending conservation phase during which they will reap their reward.

What does sustainability mean in this context? Certainly not eternal stability, since that’s not possible. At best, it means managing the ecosystem to limit the severity of the release phases while still generating enough restructuring to allow innovation.

The moral of this story is that ecosystem talk hides, but does not end, the perpetual tussle between newcomers and incumbents. Wise governance needs to find a way to extract the social benefits of both, while recognizing that each represents the eclipse of the other.

Note

[*] Shane Greenstein had a great paper at the conference on what makes for "healthy" behavior in the internet industry, forthcoming in the Journal on Telecommunications and High Technology Law. For a brief summary, see Rocky Radar.

Thursday, January 22, 2009

Evolved to revere teachers


I’ve been keeping an ear on an interview with an Aikido teacher that S. has been listening to. Shaner Sensei frequently points out and marvels at the insights and skills of the teacher who founded this particular branch of the art.

The meditation technique I’m learning is also built around a charismatic teacher, in spite of himself; he keeps rejecting “gurudom”, and focuses attention on the practice. But this teacher, in turn, deeply and publicly reveres the teachers that preceded him.

A predisposition to teacher-reverence is probably inborn. It’s easy to construct an evolutionary-biology Just So Story to explain it. Learning is clearly adaptive, and our genes encourage us to engage in it by making learning pleasurable. Since one learns better when one trusts the teacher, our genes predispose us to revere teachers and put them on pedestals.

Like all behaviors, this has risks as well as benefits. Along with the ability to surrender ourselves to a teaching that brings benefit, comes a proclivity to give ourselves over to people who lead us into evil or oblivion. The problem is that it’s hard to know where a path leads when embarking on it. If one really knew the end-point, one would already have completed the journey. Teacher reverence will therefore continue to beckon us onto both good paths and bad.

This argument goes through with minimal changes for leader-worship. It probably also has a basis in evolutionary fitness, and is also double-edged. I wonder whether one is related to the other. Both are based in respect for a leader, though the purposes (learning and inter-group conflict, respectively) are different.

Note: The image above is a statue of the founder Swami Vishnu-devananda at the Sivananda Ashram Yoga Ranch. It comes from a slide show accompanying a story on spiritual retreats in upstate New York by Shivani Vora, “The Simple Life”, The New York Times, 12 December 2008.

FCC Reform paper

My recent posts on reforming the FCC (here, here, here and here) culminated in a short paper for a conference on this topic in DC on 5 January 2009. I also spoke on one of the panels (video).

Sunday, January 18, 2009

Voting within the Margins

Al Franken seems (for now, at least) to have won the Minnesota Senate election by 225 votes out of a total of about 3 million ballots cast: a margin of roughly 0.0001, or 0.01%.

This margin of victory is tiny; as a fraction, it's of the same order as the change in the length of your car between a day that's freezing and one that's in the 80s. (See here for steel's coefficient of thermal expansion if you want to check my math.)
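Here's the back-of-the-envelope check as a small Python sketch; the expansion coefficient for steel (roughly 12 parts per million per degree Celsius) and the 3 million ballot figure are approximations, good enough for an order-of-magnitude comparison:

```python
# Compare the fractional thermal expansion of a steel car with the
# fractional margin of victory.  Both numbers are approximate.

ALPHA_STEEL = 12e-6              # fractional expansion of steel per degree C (approx.)
delta_t_c = (80 - 32) * 5 / 9    # freezing (32 F) to 80 F, converted to degrees C

car_length_change = ALPHA_STEEL * delta_t_c   # fractional change in car length
victory_margin = 225 / 3_000_000              # Franken's fractional margin

print(f"car length change: {car_length_change:.1e} ({car_length_change:.3%})")
print(f"victory margin   : {victory_margin:.1e} ({victory_margin:.3%})")
# Both land within a factor of a few of 1e-4 -- the same order of magnitude.
```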

This is so small that the result is a toss-up for all practical purposes. Presumably, however, society cannot accept that election results are random; we have to pretend that certainty can be had.

The margins of error of the voting process are sometimes larger than the margin of victory of the winner; this was certainly the case in Minnesota. Philip Howard of the University of Washington found seven such cases in the 2004 elections ("In the Margins: Political Victory in the Context of Technology Error, Residual Votes, and Incident Reports in 2004," 1/6/2005, PDF). He used three ways of thinking about error in an election: technology error, residual votes, and incident reports. For example, Howard cites a 2000 Caltech/MIT study which found that the error rates for a large variety of vote counting processes were all 1% or more. (Recall that the margin of victory in Minnesota was one one-hundredth of this: 0.01%.) He concludes: "In each case, the electoral outcome was legitimated by elections officials, not the electorate, because in very close races the voting process cannot reveal electoral intent."

In Minnesota, with all the recounts, many of those errors were removed. But there are many kinds of randomness in an election beyond the measurement: someone absent-mindedly ticking the wrong box, someone else deciding at random not to vote on a given day, or people who mistake one candidate for another. In the end, we just don't know the answer, and a coin toss (whether overt or hidden) is a fine way to decide the result. If it was a bad choice, the electorate can throw the bum out next time.

Saturday, January 17, 2009

William James, consciousness, and the non-existence of spectrum

We've just started another wonderful Teaching Company course: Daniel Robinson on Consciousness and Its Implications. He quoted from William James's essay Does "Consciousness" Exist? (1904), which reminded me of my spectrum preoccupations:
"I believe that consciousness, when once it has evaporated to this estate of pure diaphaneity, is on the point of disappearing altogether. It is the name of a nonentity, and has no right to a place among first principles. ... For twenty years past I have mistrusted conscousness as an entity: for seven or eight years past I have suggested its non-existence to my students, and tried to give them its pragmatic equivalent in realities of experience. It seems to me that the hour is ripe for it to be openly and universally discarded.

"To deny plumply that consciousness exists seems so absurd on the face of it — for undeniably thoughts do exist — that I fear some readers will follow me no farther. Let me then immediately explain that I mean only to deny that the word stands for an entity, but to insist most emphatically that it does stand for a function." (My italics.)
The distinction between entity that doesn't exist, and a function that does, applies equally well to spectrum. (I outlined my argument regarding spectrum in Newton, Leibnitz and the (non?)existence of spectrum; for more detail, see my article De-situating spectrum: Rethinking radio policy using non-spatial metaphors.) To mash-up William James:
To deny plumply that "spectrum" exists seems so absurd on the face of it — for undeniably "signals" do exist — that I fear some readers will follow me no farther. Let me then immediately explain that I mean only to deny that the word stands for an entity, but to insist most emphatically that it does stand for a function.
In other words, the proper subject of both psychology and wireless regulation is behavior and its results. This becomes all the more important as radios become more sophisticated.

A simple example: in the white space proceeding, the FCC specified different maximum transmit powers for different kinds of unlicensed radios, but required that they all avoid wireless microphones using the same detection sensitivity. This doesn't make engineering sense: the radius of interference of a weak radio is smaller, so it does not need to detect microphones at the same range as a strong radio, and its detection does not have to be as sensitive. A more efficient alternative would be for unlicensed radios to vary their detection sensitivity with their transmit power. The "usable spectrum" is therefore a function of the behavior of the radios concerned, and not just frequencies.
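To make the point concrete, here is a toy calculation under a symmetric log-distance path-loss model; the power levels, propagation exponent, and microphone parameters below are illustrative assumptions, not values from the proceeding:

```python
# Toy illustration (not the FCC's rules): under a symmetric log-distance
# path-loss model, a lower-power white-space device has a smaller
# interference radius, so it can tolerate a less sensitive microphone
# detection threshold.  All numbers are assumptions for illustration.

import math

PL0_DB = 40.0        # assumed path loss at the 1 m reference distance (dB)
PATH_LOSS_EXP = 3.0  # assumed propagation exponent for a cluttered environment
MIC_EIRP_DBM = 10.0  # assumed wireless microphone transmit power
HARMFUL_DBM = -85.0  # assumed signal level that would disrupt the microphone link

def path_loss_db(distance_m):
    """Log-distance path loss relative to a 1 m reference."""
    return PL0_DB + 10 * PATH_LOSS_EXP * math.log10(distance_m)

def interference_radius_m(tx_power_dbm):
    """Distance at which the device's signal falls to the harmful level."""
    return 10 ** ((tx_power_dbm - HARMFUL_DBM - PL0_DB) / (10 * PATH_LOSS_EXP))

def required_sensitivity_dbm(tx_power_dbm):
    """Weakest microphone signal the device must be able to detect: a mic at
    the edge of the device's own interference radius, assuming the path loss
    is the same in both directions."""
    return MIC_EIRP_DBM - path_loss_db(interference_radius_m(tx_power_dbm))

for tx_dbm in (30, 20, 10):   # strong, medium and weak unlicensed devices
    print(f"Tx {tx_dbm:>2} dBm: interference radius ~ "
          f"{interference_radius_m(tx_dbm):6.0f} m, "
          f"detection threshold needed ~ {required_sensitivity_dbm(tx_dbm):6.1f} dBm")
```

Under these admittedly crude assumptions, each dB of reduced transmit power buys a dB of relaxed detection threshold, which is exactly the kind of power-dependent rule argued for above.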

In a similar vein, the boundary between "spectrum licenses" is not really -- or not just -- a frequency, as it might at first sight appear. (Let's leave aside geographical boundaries.) There is no sharp edge, with a radio allowed to transmit any power it wishes "inside its spectrum", and none at all "outside". Instead, there's a gradation of decreasing power for increasing frequency difference. There isn't a boundary where one thing ends and another begins; rather, the boundary is a behavior. This underlines that spectrum, like consciousness for William James, isn't an entity, but rather a function.

Sunday, January 11, 2009

Factoid: Vista is worth negative-$150

According to Silicon Valley Insider, Dell is now charging customers $150 to downgrade from Vista to Windows XP. According to the story, Dell started charging customers an extra $20 to $50 for a downgrade to Windows XP in June, and by October, Dell's XP premium was up to $100.

Not a happy product if people will pay large amounts to avoid having to buy it.

And I thought that old joke about "first prize is a week in Palookaville, and second prize is two weeks there" was just a joke. . .

Saturday, January 10, 2009

Forever blowing bubbles

In a Wall Street Journal op-ed (PDF), Paul Rubin [*] suggests that bubbles and crashes are a natural part of capitalist markets. More to the point, the very factors that have recently increased the efficiency of markets – notably the internet – have also facilitated the formation of bubbles.

Technology is double-edged, as always: the internet facilitates both the functioning and malfunctioning of markets. Of course, the difference between function and malfunction is in the eye of the beholder. As John Sterman famously said, “There are no side effects—only effects.”

While this is a perennial problem, the internet may have caused a qualitative change in the degree of interconnection, which leads to significantly less resilience. Note the paradox: The internet is more resilient as a communication system, but it causes the systems that use it to be less resilient.

The corollary is that regulators face an impossible task: one can’t eliminate the downsides of the internet without simultaneously eliminating the benefits.

My study of the complex adaptive systems literature leads to the same conclusion:
  • more interconnected systems are less resilient
  • crashes are healthy, because they allow new entrants to flourish
  • the regulatory task is not to avoid crashes (this just makes the eventual correction worse) but to manage them
As if the regulatory job weren’t tough enough, the political challenge is even harder. “This should never happen again” are the first words out of a politician’s mouth after a catastrophe. That’s what people want to hear, but it’s not realistic – and not desirable either. It will surely happen again, and it’s necessary for renewal and innovation. Managing crashes includes both reducing their severity and mitigating their impacts.

[*] Paul Rubin is a professor of economics and law at Emory University and a senior fellow at the Technology Policy Institute. He served in a senior position in the Federal Trade Commission in the 1980s.

Friday, December 19, 2008

Factoid: more computers than people in Israel

According to a country ranking of computer ownership published in the 2009 edition of The Economist’s “Pocket World in Figures”, there were 122.1 computers per 100 people in Israel in 2006.

Russia was in the same league as Namibia, with 12 computers per 100 people. The US, Britain and Australia were at 76 per 100 people.

Sunday, December 14, 2008

Reforming the FCC: New Organization

The recommendations I outlined in Reforming the FCC: New Capabilities will only bear fruit in a suitable institutional setting. This post suggests some organizational responses to the challenge of managing 21st century communications in the United States.

In-house expertise

All participants in a complex adaptive system, including regulators, have to innovate to keep pace with ever-changing circumstances. However, change carries risks. The system will adjust to new rules as soon as they are promulgated, and there’s no way to turn back the clock.

The variability and unpredictability of adaptive systems means there cannot be a single, fixed, optimal strategy. The only viable approach is continuing learning and adaptation. As society and technology change ever faster, the institution has to learn faster and plan better. The FCC needs to be a learning organization which adapts at a much faster rate than was required in the past.

The FCC needs to have a strong, in-house basis for understanding the state of play, and anticipating developments. This is necessary both to make smart decisions about how (and whether!) to intervene through rule making, and to make smart/robust rules. Learning and planning are tied together through simulation and other safe-to-fail experiments.

Simply depending on the adversaries in proceedings to provide technical information is not sufficient; even if the input were not biased, one would still need in-house expertise to make informed choices. The Commission has this level of expertise in wireless technology, but perhaps not to the same degree in internet/web technology. Staff can make nuanced judgments about the likelihood and mitigation of radio interference, but has less experience judging network management claims, the implications for consumer privacy of data aggregation and behavioral advertising, or the most effective way to implement law enforcement surveillance on the internet.

With strong in-house capacity, the FCC could make effective use of outside consultants, who have been employed too rarely in recent decades. It could also find ways to exploit the goodwill and expertise of volunteer experts in academia and society at large.

The multiple uncertainties of adaptive systems mean that every institution needs a long memory. Slow variables are often the ones that trigger radical change, but they can only be observed with prolonged attention. The institution needs a strong, constantly renewing base of career officials that can bridge across the terms of political appointees. There has been a renewed awareness in recent years of the degree to which the tenure of career professionals can be vulnerable to political processes.

Individuals matter. The interlinking of activities at various layers of a complex system means that in-house entrepreneurs (and reactionaries) can have disproportionate influence, particularly when a system is in flux. It’s important to foster independence and adaptive management skills, particularly in the American setting where top leadership is politically appointed, and changes frequently. Secondments like Chief Economist and Chief Technologist are an important tool, and should be complemented with fellowships from industry and academia at middle and entry levels in the organization.

New Structure

Developing an open, adaptive and principles-based approach to policy making, and building the capacity to perform new functions, will require changes in the institution’s organization.

The FCC is currently organized, in large part, to reflect the Titles of the Communications Act(s): the Wireline Competition Bureau is responsible for wire-based telephony; the Wireless Telecommunications Bureau oversees spectrum, including cellular telephony; the International Bureau covers satellite and international matters; the Media Bureau regulates radio and TV services. However, the mapping to statute is not exact, which suggests that organization by Title is not a necessity: a re-organization does not require a new Communications Act.

Such a structure cannot effectively address questions that cross industry boundaries – which, in the 21st century, is most of them.

A more effective and stable structure would be organization by policy mandate. This would replace the industry-oriented bureaus by departments with responsibilities for (1) public safety, (2) consumer protection, (3) the protection of culture and values, (4) economic vitality, and (5) the raising of revenues – regardless of industry or technology.

The rudiments of such an organization already exist; the Public Safety & Homeland Security Bureau is responsible for public safety across industries. Other bureaus would have to be split up among domain-oriented departments. The responsibilities of the departments replacing existing bureaus would include:
  1. Public Safety: Access to emergency services, law enforcement surveillance, data retention, and child safety

  2. Consumer Protection: Privacy, fraud, fair trade terms, access for those with disabilities, device certification, universal service, digital inclusion

  3. Culture & Values: Control of speech (obscenity, violence in media), advertising rules

  4. Markets: Economic vitality, anti-trust, allocation of resources (numbers, spectrum), market analysis

  5. Revenue: Taxes, fees, levies, subsidies
No reorganization is perfect, or without cost or risk. However, there are many benefits of this re-arrangement:
  • An alignment with policy mandates will be more stable over time than one based on technology or industry segmentation, which is in constant flux.

  • An organization structured by public interest mandate would require and enable the Commission to take a big-picture approach in every case, and not limit staff to supervising or nurturing a particular, soon-to-be-obsolete industry category.

  • It would weaken the ability of incumbents to dominate all rule makings applicable to them by focusing on a single bureau.

  • A department focused on market issues would highlight the complements, overlaps and conflicts between the FCC and FTC in the anti-trust area.

  • The inclusion of spectrum allocation issues within a department charged with maximizing the economic value of national resources would bring to a head the question of the divided FCC/NTIA management of spectrum in the US, and might provide a path to the long-term resolution of this inefficient anomaly.
Enhancing capabilities like contingency planning, data analysis, and simulation could also lead to restructuring the Offices that serve cross-organizational roles. For example, a modeling group might logically be housed in either the Office of Engineering & Technology (OET) or the Office of Strategic Planning & Policy Analysis (OSP). One can make a case that combining the two Offices would reinvigorate staff in both. Both are tasked with providing expert advice, but OET focuses on radio engineering, while OSP houses lawyers and economists. The necessary cross-disciplinary work will be easier to accomplish in a single organization, even though inter-disciplinary incomprehension may be an obstacle in the early days. An invigorated cross-Commission data collection and analysis operation would be housed in OSP (or its successor), rather than distributed across the industry-oriented bureaus.

Is now the right time?

One needs to ask not only how to reform, but whether.

Reform is a phase in the adaptive cycle: it is the reorganization that follows the crisis of a release phase. (See Reforming the FCC: Etiology for a summary of the adaptive cycle.) While release and restructuring is necessary for the long-term health of a system, it can be painful.

Reform necessarily dissipates the capital accumulated during growth and maturity; it is not something to be embarked on lightly. Is now the right time to reform the FCC?

The goal of wise management is to keep disruptions from flipping a system into an undesirable state, while encouraging the innovation and experimentation that comes with reorganization – not vainly trying to stop restructuring from happening at all. Delayed re-organization amplifies the eventual crisis, increasing the risk of landing up in an unhealthy state; too frequent or premature re-organization never allows the full accumulation of the potential that can fuel the next restructuring cycle.

An accumulation of governance problems is not sufficient to drive a paradigm shift, as Thomas Kuhn demonstrated in the context of scientific revolutions; there needs to be a better alternative. Do we have a better alternative? I don’t know, and complexity science suggests that there’s no way to know for sure. The only way to find out is to try it.

Are the risks acceptable? One of the prices to be paid is that tools that one needs to manage the system are disrupted during reform. For example, trust is an important ingredient of self-regulation, which will be important in the new approach – but trust requires stability, which is reduced during a reform. Fortunately, industry is in a relatively stable phase at the moment, which can accommodate and smooth over disruption at the Commission level. This gives the FCC an opportunity to change while not endangering the stability of the whole system.

A new Communications Act might trigger reform, but that is neither necessary nor likely. Congress will be the last to change in the face of the complexification of communications. Members, particularly influential ones, are typically long standing office-holders with entrenched patrons and perspectives. They will resist the threat to their patronage entailed by a re-organization of regulatory action. When they do act to reform, it will probably be in response to an existential threat to a powerful old-line industry – which will tend to entrench, or at best resist attempts at blurring, existing ways of doing things.

Re-organization will therefore have to be driven by the Chairman, with all the risks (and opportunities) of a process that depends on a single big ego. The choice of a new Chairman will therefore have far-reaching consequences for the effectiveness of the organization.

Conclusions

There is no a priori limit to the number of true statements one can make about a complex, adaptive system, and many of them will be at odds with one another. The role of a regulator is therefore not to establish the ultimate truth as the basis of a correct decision, but rather a never-ending quest to make the best possible call given what can be known at a given moment.

This reality has become more visible and more pressing as the stable world of 20th century communications gives way to the flux of the 21st century internet/web. Even while understanding grows that the FCC’s influence is limited, there is no doubt that it still has great influence, and important responsibilities. The addition of new techniques to its repertoire and a corresponding restructuring of its organization will be essential to wielding its influence wisely, to the benefit of citizens.

The new approach proposed here is premised on dynamics that affect not only the FCC, but all actors in the communications system. These arguments, and all debates about how to reform the FCC, therefore also apply to the broader question of governance of 21st century communications.

Further reading

Weiser, Philip J, “FCC Reform and the Future of Telecommunications Policy” (forthcoming, to be presented at Reforming the Federal Communications Commission, 5 January 2009)

Reforming the FCC: New Capabilities

Complex adaptive systems, like 21st century communications, are by definition difficult to understand and control; I outlined some reasons for this in Reforming the FCC: Etiology. It is often unclear whether they can be managed at all. However, society has well-established expectations of the FCC regarding consumer protection, economic vitality, public safety, raising revenues, and the protection of culture and values.

The staff members of the FCC as currently constituted do a valiant job. However, the use of new techniques and changes to the organization’s structure would enable them to be even more effective in future.

The challenge of managing a complex adaptive system calls for particular capabilities and techniques to be added or enhanced at the Commission. This post will discuss each in turn:
  • Principles rather than Rules
  • Modeling and Simulation
  • Self-regulation
  • Transparency and Intelligibility
  • Data gathering and Interpretation
A subsequent post will consider changes to the institution.

Principles rather than Rules

The goal of policy is to understand the present, anticipate the future, and plot a course between the two. Since the present reality is changing ever more quickly, detailed rules and regulations will often be obsolete before they have been finalized. Responses will necessarily be ad hoc, but they don’t need to be arbitrary: in a complex world, principles are a more appropriate technique than detailed rules.

All managers of adaptive systems have rules of thumb for coping with unpredictability and complexity. I have been particularly inspired by the work of Buzz Holling and colleagues on managed ecosystems, which are a useful analog to 21st century communications. Both ICT and ecosystem managers have to deal with the confluence of socio-political systems and largely self-driven “subject” systems. In ecosystems the subject is biology, and in ICT it is the intrinsic, self-directed creativity of technology. The following principles distill the experience of managing such systems.
  1. Flexibility. Determine ends, not means. Describe and justify the outcomes sought, not the methods to be used to achieve them.

  2. Delegation. Let the market and society solve most problems, not government. Government's role is to provide proper incentives and guidance, and to address critical failures.

  3. Big Picture. Take a broad view of the problem and solution space. Favor generic over sector-, technology- or industry-specific legislation.

  4. Diversity. Seek and support multiple alternative solutions to policy problems. Encourage competition and market entry.
These are not techniques that communications policy makers are accustomed to using.
  1. Intervention by specifying a detailed mechanism for achieving a societal goal – from merger conditions, to pricing network elements, to the exact allowed uses of a spectrum license – has been the rule, and is embedded in custom and statute.

  2. Delegation is at odds with the traditional top-down control of telecoms and broadcasting.

  3. The industry silos that underlie the titles of the Communications Act enshrine a “little picture” approach, where problems are solved piecemeal and in isolation.

  4. While lip service might be paid to diversity and innovation, regulatory capture by industry incumbents prevents competition. The desire for (illusory) control has traditionally seduced regulators into focusing on single rather than multiple solutions.
The actions that follow from the principles will depend on the role of the manager, whether developing policy, writing legislation, or regulating. For regulators like the FCC, the implications include:
  1. Flexibility. Use principles rather than rules. Ensure backstop powers are available if self-regulation fails. Rules, if used, should be technology and business-model neutral. Build in capacity to deal with the unexpected.

  2. Delegation. Intervene if players close to the action demonstrably fail to solve problems flagged by the regulator, or in legislation. Avoid ex ante action unless there is a high probability that a market failure will be locked in.

  3. Big Picture. Avoid silo-specific regulation. Develop an integrated understanding, based on in-house expertise, of the industry and its social context. Use scenario planning to prepare for contingencies such as entrenched market failure.

  4. Diversity. Don't entrench one solution through regulatory preference. Define markets broadly for competitive analysis.
Modeling and Simulation

It is difficult to predict how complex adaptive systems will respond to interventions. Unintended consequences are the rule, not the exception. Experimentation before deploying new rules can reduce the likelihood, or impact, of blunders.

The ability to try out new ideas in state and local jurisdictions is a useful feature of the United States’ federal system. However, new ways to conduct “safe to fail” experiments are needed because geographies are now more interlinked, and less isolated, than they used to be. System modeling made possible by the advent of cheap, fast computing provides a safe way to try out regulatory ideas.

Exploring the consequences of policy choices in simulation can identify which courses of action are most robust under a variety of possible outcomes, and can identify critical preconceptions and biases. It can identify policy choices that are brittle and work in only a narrow set of circumstances, thus leading to more resilient final measures.

Techniques developed for business modeling, social analysis, and long-term policy planning can be applied to the internet/web. For example, agent-based simulations of internet access in the US provide insight into the dynamics of network neutrality regulation. It would be instructive to explore whether the resulting system has multiple stable states, as one might expect from a complex adaptive system; if it does, increased vigilance regarding irreversible transition into a low-diversity content arrangement is called for. Once such a model is in place, one can extend it to do resilience analyses, and factor in political power.
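A deliberately minimal sketch of such an agent-based model might look like the following; the fee, cost, and entry-rate parameters are invented placeholders, and a real model would need calibration against actual market data and many more behavioral rules:

```python
# A tiny agent-based sketch of the kind of model gestured at above -- not a
# calibrated model of US internet access.  Content providers earn revenue in
# proportion to delivered quality; under a "tiered" regime the ISP degrades
# providers that do not pay a prioritization fee.  All parameters are
# illustrative assumptions.

import random

def simulate(neutral, rounds=200, fee=3.0, seed=1):
    """Return content diversity (number of surviving providers) over time."""
    rng = random.Random(seed)
    providers = [{"quality": rng.uniform(0.5, 1.5), "cash": 5.0} for _ in range(30)]
    diversity = []

    for _ in range(rounds):
        # Delivered quality depends on the regime and on who can pay the fee.
        for p in providers:
            pays = (not neutral) and p["cash"] >= fee
            if pays:
                p["cash"] -= fee
            degraded = (not neutral) and not pays
            p["delivered"] = p["quality"] * (0.4 if degraded else 1.0)

        # Consumers split a fixed advertising pie in proportion to delivered quality.
        total = sum(p["delivered"] for p in providers) or 1.0
        for p in providers:
            p["cash"] += 60.0 * p["delivered"] / total - 1.0   # revenue minus fixed cost

        # Broke providers exit; occasionally a new entrant arrives.
        providers = [p for p in providers if p["cash"] > 0]
        if rng.random() < 0.3:
            providers.append({"quality": rng.uniform(0.5, 1.5), "cash": 5.0})

        diversity.append(len(providers))
    return diversity

if __name__ == "__main__":
    for regime, label in ((True, "neutral"), (False, "tiered ")):
        print(f"{label}: surviving providers after 200 rounds = {simulate(regime)[-1]}")
```

Even a skeleton like this forces the modeling choices into the open (what counts as content diversity, how consumers allocate attention, how entry works), which is half the value of the exercise.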

No single model is sufficient; indeed, a focus on a single correct but incomplete model generates long-term problems even while satisfying short-term objectives. Simulation – agent-based techniques as well as traditional econometric modeling – needs to become part of the standard way all parts of the organization make decisions.

Self-regulation

There is growing interest in the value of self- or co-regulation in the communication industry as part of a continuum of approaches ranging from no formal government action, through to full statutory regulation. Industry self-regulation can be more flexible and less costly for both business and consumers than direct government involvement. It is most likely to be effective where there is trust between government, industry, and consumers; enterprises recognize the importance of responsible behavior over the long term; and non-compliance by rogue companies can be contained.

Allowing or encouraging appropriate self-regulation is a way for the FCC to implement the Delegation principle. This is delegation, however, not abdication of responsibility: the Commission retains the responsibility for discharging its social mandates (consumer protection, economic vitality, etc.), but does not necessarily have to use regulation to do so.

Successfully applying self-regulation entails a capability in the organization to make informed judgments about whether the socio-economic context is conducive to effective self-regulation, and whether self-regulatory organizations are meeting their responsibilities. As with all the techniques discussed here, new capabilities have institutional consequences.

Transparency and Intelligibility

Visibility into the workings of a complex system reduces volatility and improves resilience. But how does one get timely information about an elaborate, rapidly-changing socio-economic system like the internet/web? Since funding for monitoring by regulators is limited, it is important to enable surveillance by civil society and the market itself. Limited capacity and visibility at the top of the control hierarchy can be complemented by distributing monitoring throughout the system.

It is important to have effective disclosure mandates on powerful players, such as firms with significant market power – and on the regulators themselves. Internet/web technology itself facilitates monitoring. It makes information more immediately accessible, and enables peer-to-peer disclosure and the pooling of information and knowledge. Pioneers like Zagat, Amazon Reviews, and Wikipedia are being joined by “vigilante transparency” organizations monitoring civil servants, campaign contributions, internet service providers, and even nannies.

One of the lessons of the sub-prime mortgage crisis is that a lack of intelligibility was more problematic than too little transparency. Nobody understood the ramifications of the financial instruments they were creating, managing or (not) regulating. While it’s true that eliminating complexity could stifle innovation, that’s a false choice; nobody would seriously propose to eliminate either complexity or innovation. What is needed is an accounting of intelligibility in the regulatory calculus.

Transparency/intelligibility need not be mandatory; companies should be able to choose obscurity. However, the choice of shrouding their activities could incur the cost of increased regulatory scrutiny – and perhaps higher expectations regarding performance against public interest mandates. For example, Comcast need not explain exactly how it manages its network; but if it chooses obscurity, it should face tougher network neutrality expectations. Microsoft need not disclose its interfaces; but if it chooses obscurity, it should face tougher anti-trust tests. Google need not explain how it uses DoubleClick data to improve ad click-throughs; but if it chooses obscurity, it should face tougher privacy protection requirements.

Data gathering and interpretation

All the above techniques depend on effective data gathering. The FCC cannot do all the work itself, but neither should it be entirely beholden to interested parties to a proceeding, who will necessarily furnish only information that advances their cause. The agency should foster a culture and capability of data collection and interpretation with the goal of learning, not bookkeeping. All these activities can be delegated in part, but in all of them the FCC should retain an in-house capability.

The data that the FCC currently reports is organized by the old-line industry silos, and largely ignores the structure of the 21st century market. There are data for common carrier traffic, competition in satellite services, the cable industry, and wireless – all given separately. Many vital current issues are not covered, such as: the assessment of the consumer value of auctioned spectrum; an inventory of the utilization of spectrum under both NTIA and FCC management; consumer data aggregation practices.

The FCC should also explore new ways to inform its decisions and act on behalf of citizens, such as opinion polling. This technique is often used by other regulators, e.g. Ofcom, but apparently not by the FCC. The Commission will always be catching up with data collection requirements, since the industry changes so quickly. An institutional habit of identifying new data needs and fulfilling them is just as important as high quality reporting against current mandates.

Once data has been collected, it needs to be interpreted. There are many interpretation challenges, not least the flood of short comments in particular proceedings, a flood facilitated by laudable efforts to involve the public such as filing via email and submitting brief comments. For example, in 2006 there were 10,831 one- or two-page filings (excluding ex partes) on FCC Docket 06-74, the AT&T/BellSouth merger; they were essentially all from individual citizens. This was 18% of all the filings in that year (59,081). By comparison, in 2004 there was a total of 25,480 filings.

This is part of a larger challenge of managing multiple constituencies. Not only has the number of industries with an interest in the communications business grown beyond telecommunications and TV broadcasting to include internet content providers, software companies, and new broadcast media, but the public has become increasingly engaged.

The FCC needs to rethink how it can involve a wider community in data collection and analysis. This community includes interested citizens, industry, research vendors, and think tanks. The goal should be to improve the speed and quality of both data collection and interpretation by opening up the process to commercial interests and citizen-consumers.

Further reading

Adaptive systems
Yorque, Ralf, Brian Walker, C S Holling, Lance H Gunderson, Carl Folke, Stephen R Carpenter, and William A Brock, “Toward an Integrative Synthesis” Ch. 16 in Gunderson, Lance H and C S Holling, Panarchy: Understanding transformations in human and natural systems, Island Press (2002)
Complexity theory and governance
Schneider, Volker and Johannes M. Bauer, “Governance: Prospects of Complexity Theory in Revisiting System Theory”, Presented at the annual meeting of the Midwest Political Science Association, Chicago, Illinois, 14 April 2007. Available: http://www.uni-konstanz.de/FuF/Verwiss/Schneider/ePapers/MPSA2007Paper_vs_jmb.pdf.

De Vries, Jean Pierre (2008), "Internet Governance as Forestry: Deriving Policy Principles from Managed Complex Adaptive Systems", TPRC 2008. Available: http://ssrn.com/abstract=1229482.
Self-regulation
Ofcom (2008) Principles for analysing self- and co-regulation: Statement. 10 December 2008. Available: http://www.ofcom.org.uk/consult/condocs/coregulation/statement/

Weiser, Philip J. (2008) “Exploring Self Regulatory Strategies for Network Management”, Flatirons Summit on Information Policy, 9-10 June, 2008. 25 August 2008. Available: http://www.silicon-flatirons.org/documents/publications/summits/WeiserNetworkManagement.pdf
Simulation
Sterman, John D. (2002) “All models are wrong: reflections on becoming a systems scientist: The Jay Wright Forrester Prize Lecture.” System Dynamics Review Vol. 18, No. 4, (Winter 2002): 501–531. Available: http://web.mit.edu/jsterman/www/All_Models_Are_Wrong_(SDR).pdf

Lempert, Robert J., Steven W. Popper, and Steven C. Bankes (2003) “Shaping the Next One Hundred Years: New Methods for Quantitative, Long-Term Policy Analysis” RAND Report MR-1626-RPC, 2003. Available: http://www.rand.org/pubs/monograph_reports/MR1626/index.html

Bauer, Johannes M. (2007) “Dynamic effects of network neutrality,” International Journal of Communication, 1: 531-547. Available: http://ijoc.org/ojs/index.php/ijoc/article/view/156/79tprg
Transparency
Fung, Archon, Mary Graham, David Weil, and Elena Fagotto (2006) "The Effectiveness of Regulatory Disclosure Policies," Journal of Policy Analysis and Management, v. 25, no. 1 (Winter 2006). Available: http://www.hks.harvard.edu/taubmancenter/transparency/downloads/jpam06.pdf

Saturday, December 13, 2008

And it started so small: News-related Etymology

According to various sources at dictionary.com, the word "bribe" is Middle English, from an Old French word for a piece of bread given as alms. The shift of meaning to "gift given to influence corruptly" is first attested in 1535. The original meaning is said to derive from the base bri(m)b-, denoting something small.

(My apologies to Gov. Blagojevich for contributing to the burial of the principle of "innocent until proven guilty" in America. Talk about trial and conviction by public opinion...)

Wednesday, December 10, 2008

Reforming the FCC: Etiology

In Reforming the FCC: Diagnosis, I argued that the FCC must these days supervise a much more complex and adaptive situation. This post examines the causes of this situation. I’ll consider responses in a subsequent post.

As an exercise, consider whether the following attributes apply to 21st century communications (which I’ll also refer to as ICT, for lack of a better term): the absence of a global controller; nested, hierarchical organization; dispersed interactions; never-ending novelty; constant selection among candidate solutions; and rapid adaptation to new circumstances. I believe they definitely describe the internet/web – and they are much less applicable to the silo’d world of telecommunications and analog broadcasting of only a few decades ago.

These attributes are the hallmarks of complex, adaptive, non-linear systems. 21st century communications is a complex adaptive social system, but the FCC was set up to manage a 20th century industry that was complicated but not complex. This is the deep reason why the institution needs to change.

The adaptive cycle

A key attribute of complex adaptive systems is that they cycle through distinct stages. I’ll describe it here using the example of ecosystems (where it was introduced) before turning to ICT.

During the growth stage, there is rapid colonization of recently disturbed areas, for example after a fire or wind storm has removed large amounts of biomass in a forest. The connectedness between organisms is low, which leads to high resilience; the loss of one species doesn’t lead to the loss of another. As the forest matures, it moves into the maturity phase of the cycle, which is dominated by the accumulation of material. The network of connections between biomass and nutrients becomes increasingly tight, and fragile; every niche in the forest is filled, and every resource is used. Organisms become much more interdependent; food chains become dense and interconnected. The maturity phase is followed by a dramatic release, triggered in a forest by fire, drought, insect pests, etc. A lot of energy is unbound, and networks are broken up. This sets the scene for the fourth phase, reorganization: opportunistic species that have been suppressed by the stable configuration of the maturity phase move in. This is a time of innovation and restructuring, laying the groundwork for a return to another growth phase.

The behavior of managed ecosystems is shaped by three properties: the accumulation of potential, the degree of connectedness between elements, and the resilience of the system in the face of shocks. The same properties apply to complex human enterprises like modern communications.

The adaptive cycle alternates periods of gradual accumulation of potential (e.g. biomass, socio-economic capital or know-how, depending on the kind of system) with sudden and often unexpected disruptions that reorganize that potential. Connectedness is high at maturity, but that is also the time when resilience to shocks is at its lowest. This cycle of aggregation followed by restructuring leads to innovation; but the release phase is often a surprise, and frequently an unpleasant one for those who were successful in the maturity phase. It is thus often experienced as a crisis.
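For readers who think better in code, here is a toy rendering of that dynamic (my own crude sketch, not Holling's formal model), in which potential and connectedness accumulate slowly, resilience falls as connectedness rises, and low resilience makes a sudden release ever more likely:

```python
# Toy adaptive cycle: slow accumulation punctuated by stochastic releases.
# The coefficients are arbitrary; the point is the qualitative pattern.

import random

rng = random.Random(42)
potential, connectedness = 0.1, 0.1

for t in range(120):
    # Growth / conservation: potential accumulates, linkages tighten.
    potential = min(1.0, potential + 0.02)
    connectedness = min(1.0, connectedness + 0.015)
    resilience = 1.0 - connectedness      # tighter coupling -> more fragile

    # Release: a shock is more likely when resilience is low.
    if rng.random() < 0.08 * (1.0 - resilience):
        print(f"t={t:3d}  release: potential {potential:.2f} largely dissipates")
        potential *= 0.2       # most accumulated capital is lost
        connectedness = 0.1    # networks break up -> reorganization begins
```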

Decision Environments

One can recognize the phases of the adaptive cycle in the internet/web, and in the larger system of communications governance. It is helpful to parse the system into four decision environments that represent different hierarchical layers:
  • Political system: local, state and federal politicians seeking to advance their causes
  • Inter-organizational system: peer agencies with partially overlapping responsibilities, such as the FCC, FTC and NTIA
  • Organizational system: an agency, in our case the FCC, acting on its “subject” layer, and other organizations, in a context provided by the political systems
  • Market/culture system: companies and citizen/consumers using technology (goods and services) to achieve their various ends, often at odds with each other and with other levels of the system
Each decision environment has its own adaptive cycle which interacts with the others:

  • Political: The political system went through a release phase with the 2008 election, and will spend 2009 in reorganization as players who have been out of office for eight years move into newly opened positions of power (cf. ecological niches), bringing new perspectives with them.
  • Inter-organizational: The new Administration will necessarily bring changes at the top of the FTC and NTIA as well, but the consequences may not be as dramatic as those at the FCC, providing some stability at this layer.
  • Organizational: The FCC is due for “release” with the appointment of new Commissioners and a new Chairman in 2009. There is anecdotal evidence that the large-scale departure of long-serving career staff over the last couple of years represents a release in itself, with the breakup of long-standing networks of expertise and the dissipation of institutional knowledge.
  • Market/culture: The productive parts of the communication system are in or near maturity. Traditional content industries like news, music publishing and TV are at maturity, and some are entering release. Telecoms went through a reorganization following the Telecoms Act of 1996, and is now in a growth stage, judging by the consolidation of AT&T and Verizon. Similarly, the disruptive market-oriented allocation of spectrum through auctions has been absorbed, and there are signs of maturity in the concentration of spectrum in a few hands. There are still pockets of reorganization left over from the last cycle, e.g. cable taking voice share from wireline telcos, and telcos threatening cable’s video business. For all the hype, the PC/internet/web subsystem is well along in the growth phase and nearing maturity (e.g. Microsoft, Cisco, Google). Consumer habits have adapted to the internet and the web, and have become mature.
Surprise

Another hallmark of complex adaptive systems – and one of the hardest challenges for a regulator – is unexpected novelty. Changes in the state of a complex system are usually unexpected, in part because many dynamics are invisible. Surprises are particularly noticeable when they lead to release.

Here are some recent reminders that the innovation that we expect from complex systems usually comes as a surprise:

  • Digital satellite radio expected to compete with traditional radio, not to be swamped by the iPod
  • Digital video as an alternative to broadcast TV came to prominence as low-quality, user-originated content on YouTube, rather than as high quality Video on Demand via cable or DSL
  • The explosion of Wi-Fi (and CDMA cellular telephony) was the consequence of esoteric decisions about unlicensed rules by the FCC in the mid 1980’s
  • The collapse of music publishing – the industry lost a third of its revenues between 1999 and 2006
  • The eclipse of commercial encyclopedias by user-produced content on Wikipedia

Many surprises come from contagion between problem domains that were previously considered distinct. XM/Sirius’s problems came at the intersection of personal computing devices with broadcasting; music publishing’s crisis arose from software and networking innovations that led to the P2P distribution of digital content; and the open source software movement informed Wikipedia.

Problem scope

A consequence of interlocking decision environments and intersecting problem domains is that the unit of analysis for the FCC is no longer a distinct, largely independent, well-defined industry associated with a particular technology and its own Title in the Communications Act.

Attention needs to shift from industries to problem domains, and away from solutions for a particular industry, technology, and even institution or statute. For example, a policy imperative like lawful intercept is no longer limited to telephony, which leads to conflicts such as the competing definitions of information services in CALEA and the Communications Act. This is an example of the importance of the Big Picture principle for managing adaptive systems. (I’ll review this principle and its three companions – Delegation, Flexibility and Diversity – in the next post.)

However, simply broadening some existing statute to cover all new possibilities is counter-productive. It conflicts with the other three principles, and falls victim to the fallacy that narrow-minded control of a single variable leads to a healthy outcome; in adaptive systems, it leads eventually to an even worse crisis.

In conclusion, the FCC is really facing a system problem, not an institutional one. Even if today’s procedural problems within the Commission were completely solved, it would not address the challenges of a qualitatively more complex and unpredictable regulation “subject”, that is, the market/culture system where innovation and growth take place. Nor would it speak to the problems faced at the political level, where the social acceleration of time poses existential challenges to the rule of law, market capitalism, and liberal democracy, and profoundly complicates the separation of powers between the legislature, the executive, and the judiciary.

I’ll turn to the question of how the FCC should respond in the next post.

Further reading

The adaptive cycle:
Holling, C S, Lance H Gunderson and Donald Ludwig, “In Quest of a Theory of Adaptive Change”, Ch 1 of Gunderson, Lance H and C S Holling, Panarchy: Understanding transformations in human and natural systems, Island Press (2002). PDF

Ten Conclusions from the Resilience Project
Decision environments and the challenges individuals face in managing adaptive systems:
Westley, Frances, “The Devil in the Dynamics: Adaptive Management on the Front Line”, Ch. 13 in Gunderson, Lance H and C S Holling, Panarchy: Understanding transformations in human and natural systems, Island Press (2002)
A discussion of the intersection between system resilience, the rule of law, and Scheuerman’s notion of the social acceleration of time
Cherry, Barbara A (2008), “Institutional Governance for Essential Industries Under Complexity: Providing Resilience Within the Rule of Law” CommLaw Conspectus (forthcoming)
An account of the early history of civil spread spectrum
Early Civil Spread Spectrum History, Mike Marcus web site
Collapse of the music industry
theweek.com
economist.com
Growth of cable voice traffic
redorbit.com
gigaom.com

Wednesday, December 03, 2008

Etymology for the day: "larceny"

According to the American Heritage Dictionary via dictionary.com, "larceny" is a Middle English word, from Anglo-Norman larcin, theft, from Latin latrōcinium, robbery, from latrō, robber, mercenary, ultimately from Greek latron, pay, hire.

And the Spanish for a robber or thief is ladrón.

Mercenaries have never had a good rep. . .