The staff members of the FCC as currently constituted do a valiant job. However, new techniques and changes to the organization's structure would enable them to be even more effective in the future.
The challenge of managing a complex adaptive system calls for particular Commission capabilities and techniques to be added or enhanced. This post will discuss each in turn:
- Principles rather than Rules
- Modeling and Simulation
- Self- and Co-regulation
- Transparency and Intelligibility
- Data Gathering and Interpretation
Principles rather than Rules
The goal of policy is to understand the present, anticipate the future, and plot a course between the two. Since the present reality is changing ever more quickly, detailed rules and regulations will often be obsolete before they have been finalized. Responses will necessarily be ad hoc, but they don’t need to be arbitrary: in a complex world, principles are a more appropriate technique than detailed rules.
All managers of adaptive systems have rules of thumb for coping with unpredictability and complexity. I have been particularly inspired by the work of Buzz Holling and colleagues on managed ecosystems, which are a useful analog to 21st century communications. Both ICT and ecosystem managers have to deal with the confluence of socio-political systems and largely self-driven “subject” systems. In ecosystems the subject is biology, and in ICT it is the intrinsic, self-directed creativity of technology. The following principles distill the experience of managing such systems.
- Flexibility. Determine ends, not means. Describe and justify the outcomes sought, not the methods to be used to achieve them.
- Delegation. Let the market and society solve most problems, not government. Government's role is to provide proper incentives and guidance, and to address critical failures.
- Big Picture. Take a broad view of the problem and solution space. Favor generic over sector-, technology- or industry-specific legislation.
- Diversity. Seek and support multiple alternative solutions to policy problems. Encourage competition and market entry.
Current FCC practice is at odds with each of these principles:
- Intervention by specifying a detailed mechanism for achieving a societal goal – from merger conditions, to pricing network elements, to the exact allowed uses of a spectrum license – has been the rule, and is embedded in custom and statute.
- Delegation is at odds with the traditional top-down control of telecoms and broadcasting.
- The industry silos that underlie the titles of the Communications Act enshrine a “little picture” approach, where problems are solved piecemeal and in isolation.
- While lip service might be paid to diversity and innovation, regulatory capture by industry incumbents prevents competition. The desire for (illusory) control has traditionally seduced regulators into focusing on single rather than multiple solutions.
Applying the principles to the Commission's work would mean, concretely:
- Flexibility. Use principles rather than rules. Ensure backstop powers are available if self-regulation fails. Rules, if used, should be technology- and business-model-neutral. Build in capacity to deal with the unexpected.
- Delegation. Intervene if players close to the action demonstrably fail to solve problems flagged by the regulator, or in legislation. Avoid ex ante action unless there is a high probability that a market failure will be locked in.
- Big Picture. Avoid silo-specific regulation. Develop an integrated understanding, based on in-house expertise, of the industry and its social context. Use scenario planning to prepare for contingencies such as entrenched market failure.
- Diversity. Don't entrench one solution through regulatory preference. Define markets broadly for competitive analysis.
Modeling and Simulation
It is difficult to predict how complex adaptive systems will respond to interventions. Unintended consequences are the rule, not the exception. Experimentation before deploying new rules can reduce the likelihood, or impact, of blunders.
The ability to try out new ideas in state and local jurisdictions is a useful feature of the United States’ federal system. However, new ways to conduct “safe to fail” experiments are needed because geographies are now more interlinked, and less isolated, than they used to be. System modeling made possible by the advent of cheap, fast computing provides a safe way to try out regulatory ideas.
Exploring the consequences of policy choices in simulation can identify which courses of action are most robust under a variety of possible outcomes, and can identify critical preconceptions and biases. It can identify policy choices that are brittle and work in only a narrow set of circumstances, thus leading to more resilient final measures.
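One way to make this concrete is a regret-style robustness comparison in the spirit of the long-term policy analysis work cited below: score candidate policies across several plausible futures and prefer the one whose worst-case shortfall is smallest, rather than the one that is best in a single predicted future. A minimal sketch, with entirely hypothetical policies, scenarios, and payoffs:

```python
# Hypothetical payoff table (higher is better): how well each candidate
# policy performs in each of three illustrative futures. All numbers are
# made up for the sake of the example.
payoffs = {
    "detailed_rules":   {"slow_change": 9, "fast_change": 2, "disruption": 1},
    "principles_based": {"slow_change": 7, "fast_change": 6, "disruption": 5},
    "no_action":        {"slow_change": 5, "fast_change": 4, "disruption": 2},
}
scenarios = ["slow_change", "fast_change", "disruption"]

# Regret of a policy in a scenario: its shortfall versus the best-performing
# policy for that scenario. A robust policy keeps worst-case regret small.
best_per_scenario = {s: max(p[s] for p in payoffs.values()) for s in scenarios}
max_regret = {
    name: max(best_per_scenario[s] - p[s] for s in scenarios)
    for name, p in payoffs.items()
}

robust_choice = min(max_regret, key=max_regret.get)
print(max_regret)     # worst-case regret per policy
print(robust_choice)  # policy with the smallest worst-case regret
```

In this toy table the principles-based policy is never the top performer in any single scenario, yet it wins the minimax-regret comparison – exactly the "robust rather than optimal" distinction the text describes.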
Techniques developed for business modeling, social analysis, and long-term policy planning can be applied to the internet/web. For example, agent-based simulations of internet access in the US provide insight into the dynamics of network neutrality regulation. It would be instructive to explore whether the resulting system has multiple stable states, as one might expect from a complex adaptive system; if it does, increased vigilance regarding irreversible transition into a low-diversity content arrangement is called for. Once such a model is in place, one can extend it to do resilience analyses, and factor in political power.
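As a deliberately toy illustration of how such a model can exhibit multiple stable states (nothing here is calibrated to the real US access market; the "open" versus "managed" labels and the reinforcement rule are assumptions for the sketch):

```python
import random

# Toy agent-based sketch: each of n content providers sits on either an
# "open" or a "managed" access arrangement. Each step one randomly chosen
# provider re-decides, and a network effect makes the currently more
# popular arrangement disproportionately attractive. Positive feedback
# drives the system toward one of two absorbing states (all-open or
# all-managed); which one wins depends on the starting point - i.e., the
# model has multiple stable equilibria.

def simulate(n=100, open0=50, steps=20000, seed=1):
    rng = random.Random(seed)
    open_count = open0
    for _ in range(steps):
        if open_count in (0, n):  # absorbed: a stable state has been reached
            break
        s = open_count / n
        # Reinforcement rule: the probability of choosing "open" grows
        # faster than its share (s^2 vs (1-s)^2) - a standard toy form
        # of positive feedback.
        p_open = s * s / (s * s + (1 - s) * (1 - s))
        was_open = rng.random() < s       # the picked provider is open w.p. s
        now_open = rng.random() < p_open  # its new choice
        open_count += now_open - was_open  # +1, 0, or -1
    return open_count / n

# Same dynamics, different starting shares -> different stable outcomes.
print(simulate(open0=30))  # tends toward 0.0 (all-managed lock-in)
print(simulate(open0=70))  # tends toward 1.0 (all-open)
```

Running the same dynamics from different starting shares lands the system in different absorbing states; the policy-relevant question is whether an intervention, or its absence, could tip the real system irreversibly across such a threshold.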
No single model is sufficient; indeed, a focus on a single correct but incomplete model generates long-term problems even while satisfying short-term objectives. Simulation – agent-based techniques as well as traditional econometric modeling – needs to become part of the standard way all parts of the organization make decisions.
Self- and Co-regulation
There is growing interest in the value of self- or co-regulation in the communication industry as part of a continuum of approaches ranging from no formal government action through to full statutory regulation. Industry self-regulation can be more flexible and less costly for both business and consumers than direct government involvement. It is most likely to be effective where there is trust between government, industry, and consumers; enterprises recognize the importance of responsible behavior over the long term; and non-compliance by rogue companies can be contained.
Allowing or encouraging appropriate self-regulation is a way for the FCC to implement the Delegation principle. This is delegation, however, not abdication of responsibility: the Commission retains the responsibility for discharging its social mandates (consumer protection, economic vitality, etc.), but does not necessarily have to use regulation to do so.
Successfully applying self-regulation entails a capability in the organization to make informed judgments about whether the socio-economic context is conducive to effective self-regulation, and whether self-regulatory organizations are meeting their responsibilities. As with all the techniques discussed here, new capabilities have institutional consequences.
Transparency and Intelligibility
Visibility into the workings of a complex system reduces volatility and improves resilience. But how does one get timely information about an elaborate, rapidly-changing socio-economic system like the internet/web? Since funding for monitoring by regulators is limited, it is important to enable surveillance by civil society and the market itself. Limited capacity and visibility at the top of the control hierarchy can be complemented by distributing monitoring throughout the system.
It is important to have effective disclosure mandates on powerful players, such as firms with significant market power – and on the regulators themselves. Internet/web technology itself facilitates monitoring: it makes information more immediately accessible, and enables peer-to-peer disclosure and the pooling of information and knowledge. Pioneers like Zagat, Amazon Reviews, and Wikipedia are being joined by "vigilante transparency" organizations that monitor civil servants, campaign contributions, internet service providers, and even nannies.
One of the lessons of the sub-prime mortgage crisis is that a lack of intelligibility was more problematic than too little transparency. Nobody understood the ramifications of the financial instruments they were creating, managing or (not) regulating. While it’s true that eliminating complexity could stifle innovation, that’s a false choice; nobody would seriously propose to eliminate either complexity or innovation. What is needed is an accounting of intelligibility in the regulatory calculus.
Transparency/intelligibility need not be mandatory; companies should be able to choose obscurity. However, the choice of shrouding their activities could incur the cost of increased regulatory scrutiny – and perhaps higher expectations regarding performance against public interest mandates. For example, Comcast need not explain exactly how it manages its network; but if it chooses obscurity, it should face tougher network neutrality expectations. Microsoft need not disclose its interfaces; but if it chooses obscurity, it should face tougher anti-trust tests. Google need not explain how it uses DoubleClick data to improve ad click-throughs; but if it chooses obscurity, it should face tougher privacy protection requirements.
Data Gathering and Interpretation
All the above techniques depend on effective data gathering. The FCC cannot do all the work itself, but neither should it be entirely beholden to interested parties to a proceeding, who will necessarily furnish only information that advances their cause. The agency should foster a culture and capability of data collection and interpretation with the goal of learning, not bookkeeping. All these activities can be delegated in part, but in all of them the FCC should retain an in-house capability.
The data that the FCC currently reports is organized by the old-line industry silos, and largely ignores the structure of the 21st century market. There are separate reports on common carrier traffic, competition in satellite services, the cable industry, and wireless. Many vital current issues are not covered at all, such as the consumer value of auctioned spectrum, an inventory of spectrum utilization under both NTIA and FCC management, and consumer data-aggregation practices.
The FCC should also explore new ways to inform its decisions and act on behalf of citizens, such as opinion polling. This technique is often used by other regulators, e.g. Ofcom, but apparently not by the FCC. The Commission will always be catching up with data collection requirements, since the industry changes so quickly. An institutional habit of identifying new data needs and fulfilling them is just as important as high quality reporting against current mandates.
Once data has been collected, it needs to be interpreted. There are many interpretation challenges, not least the flood of short comments in particular proceedings – a flood facilitated by laudable efforts to involve the public, such as filing via email and submitting brief comments. For example, in 2006 there were 10,831 one- or two-page filings (excluding ex partes) on FCC Docket 06-74, the AT&T/BellSouth merger, essentially all from individual citizens. That was 18% of all 59,081 filings that year; by comparison, 2004 saw a total of 25,480 filings.
This is part of a larger challenge of managing multiple constituencies. Not only has the number of industries with an interest in the communications business grown beyond telecommunications and TV broadcasting to include internet content providers, software companies, and new broadcast media, but the public has also become increasingly engaged.
The FCC needs to rethink how it can involve a wider community in data collection and analysis. This community includes interested citizens, industry, research vendors, and think tanks. The goal should be to improve the speed and quality of both data collection and interpretation by opening up the process to commercial interests and citizen-consumers.
Yorque, Ralf, Brian Walker, C. S. Holling, Lance H. Gunderson, Carl Folke, Stephen R. Carpenter, and William A. Brock, "Toward an Integrative Synthesis," Ch. 16 in Gunderson, Lance H. and C. S. Holling (eds.), Panarchy: Understanding Transformations in Human and Natural Systems, Island Press (2002).
Complexity theory and governance
Schneider, Volker and Johannes M. Bauer, "Governance: Prospects of Complexity Theory in Revisiting System Theory," presented at the annual meeting of the Midwest Political Science Association, Chicago, Illinois, 14 April 2007. Available: http://www.uni-konstanz.de/FuF/Verwiss/Schneider/ePapers/MPSA2007Paper_vs_jmb.pdf
Self-regulation
De Vries, Jean Pierre (2008), "Internet Governance as Forestry: Deriving Policy Principles from Managed Complex Adaptive Systems", TPRC 2008. Available: http://ssrn.com/abstract=1229482.
Ofcom (2008) "Principles for Analysing Self- and Co-regulation: Statement," 10 December 2008. Available: http://www.ofcom.org.uk/consult/condocs/coregulation/statement/
Simulation
Weiser, Philip J. (2008) “Exploring Self Regulatory Strategies for Network Management”, Flatirons Summit on Information Policy, 9-10 June, 2008. 25 August 2008. Available: http://www.silicon-flatirons.org/documents/publications/summits/WeiserNetworkManagement.pdf
Sterman, John D. (2002) "All Models Are Wrong: Reflections on Becoming a Systems Scientist: The Jay Wright Forrester Prize Lecture," System Dynamics Review Vol. 18, No. 4 (Winter 2002): 501–531. Available: http://web.mit.edu/jsterman/www/All_Models_Are_Wrong_(SDR).pdf
Transparency
Lempert, Robert J., Steven W. Popper, and Steven C. Bankes (2003) “Shaping the Next One Hundred Years: New Methods for Quantitative, Long-Term Policy Analysis” RAND Report MR-1626-RPC, 2003. Available: http://www.rand.org/pubs/monograph_reports/MR1626/index.html
Bauer, Johannes M. (2007) "Dynamic Effects of Network Neutrality," International Journal of Communication, 1: 531–547. Available: http://ijoc.org/ojs/index.php/ijoc/article/view/156/79
Fung, Archon, Mary Graham, David Weil, and Elena Fagotto (2006) "The Effectiveness of Regulatory Disclosure Policies," Journal of Policy Analysis and Management, v. 25, no. 1 (Winter 2006). Available: http://www.hks.harvard.edu/taubmancenter/transparency/downloads/jpam06.pdf