According to a country ranking of computer ownership published in the 2009 edition of The Economist’s “Pocket World in Figures”, there were 122.1 computers per 100 people in Israel in 2006.
Russia was in the same league as Namibia, with 12 computers per 100 people. The US, Britain and Australia were at 76 per 100 people.
"in this world, there is one awful thing, and that is that everyone has their reasons" --- attrib. to Jean Renoir (details in the Quotes blog.)
Friday, December 19, 2008
Sunday, December 14, 2008
Reforming the FCC: New Organization
The recommendations I outlined in Reforming the FCC: New Capabilities will only bear fruit in a suitable institutional setting. This post suggests some organizational responses to the challenge of managing 21st century communications in the United States.
In-house expertise
All participants in a complex adaptive system, including regulators, have to innovate to keep pace with ever-changing circumstances. However, change carries risks. The system will adjust to new rules as soon as they are promulgated, and there’s no way to turn back the clock.
The variability and unpredictability of adaptive systems means there cannot be a single, fixed, optimal strategy. The only viable approach is continuing learning and adaptation. As society and technology change ever faster, the institution has to learn faster and plan better. The FCC needs to be a learning organization which adapts at a much faster rate than was required in the past.
The FCC needs to have a strong, in-house basis for understanding the state of play, and anticipating developments. This is necessary both to make smart decisions about how (and whether!) to intervene through rule making, and to make smart/robust rules. Learning and planning are tied together through simulation and other safe-to-fail experiments.
Simply depending on the adversaries in proceedings to provide technical information is not sufficient; even if the input were not biased, one would still need in-house expertise to make informed choices. The Commission has this level of expertise in wireless technology, but perhaps not to the same degree in internet/web technology. Staff can make nuanced judgments about the likelihood and mitigation of radio interference, but has less experience judging network management claims, the implications for consumer privacy of data aggregation and behavioral advertising, or the most effective way to implement law enforcement surveillance on the internet.
With strong in-house capacity, the FCC could make effective use of outside consultants, which it has employed too rarely in recent decades. It could also find ways to exploit the goodwill and expertise of volunteer experts in academia and society at large.
The multiple uncertainties of adaptive systems mean that every institution needs a long memory. Slow variables are often the ones that trigger radical change, but they can only be observed with prolonged attention. The institution needs a strong, constantly renewing base of career officials that can bridge across the terms of political appointees. There has been a renewed awareness in recent years of the degree to which the tenure of career professionals can be vulnerable to political processes.
Individuals matter. The interlinking of activities at various layers of a complex system means that in-house entrepreneurs (and reactionaries) can have disproportionate influence, particularly when a system is in flux. It’s important to foster independence and adaptive management skills, particularly in the American setting where top leadership is politically appointed, and changes frequently. Secondments like Chief Economist and Chief Technologist are an important tool, and should be complemented with fellowships from industry and academia at middle and entry levels in the organization.
New Structure
Developing an open, adaptive and principles-based approach to policy making, and building the capacity to perform new functions, will require changes in the institution’s organization.
The FCC is currently organized, in large part, to reflect the Titles of the Communications Act(s): the Wireline Competition Bureau is responsible for wire-based telephony; the Wireless Telecommunications Bureau oversees spectrum, including cellular telephony; the International Bureau covers satellite and international matters; the Media Bureau regulates radio and TV services. However, the mapping to statute is not exact, which suggests that organization by Title is not a necessity: a re-organization does not require a new Communications Act.
Such a structure cannot effectively address questions that cross industry boundaries – which, in the 21st century, is most of them.
A more effective and stable structure would be organization by policy mandate. This would replace the industry-oriented bureaus by departments with responsibilities for (1) public safety, (2) consumer protection, (3) the protection of culture and values, (4) economic vitality, and (5) the raising of revenues – regardless of industry or technology.
The rudiments of such an organization already exist; the Public Safety & Homeland Security Bureau is responsible for public safety across industries. Other bureaus would have to be split up among domain-oriented departments. The responsibilities of the departments replacing existing bureaus would include:
- Public Safety: Access to emergency services, law enforcement surveillance, data retention, and child safety
- Consumer Protection: Privacy, fraud, fair trade terms, access for those with disabilities, device certification, universal service, digital inclusion
- Culture & Values: Control of speech (obscenity, violence in media), advertising rules
- Markets: Economic vitality, anti-trust, allocation of resources (numbers, spectrum), market analysis
- Revenue: Taxes, fees, levies, subsidies
Such an organization would have several advantages:
- An alignment with policy mandates will be more stable over time than one based on technology or industry segmentation, which is in constant flux.
- An organization structured by public interest mandate would require and enable the Commission to take a big-picture approach in every case, rather than limiting staff to supervising or nurturing a particular, soon-to-be-obsolete industry category.
- It would weaken the ability of incumbents to dominate all rule makings applicable to them by focusing on a single bureau.
- A department focused on market issues would highlight the complements, overlaps and conflicts between the FCC and FTC in the anti-trust area.
- The inclusion of spectrum allocation issues within a department charged with maximizing the economic value of national resources would force the question of the divided FCC/NTIA management of spectrum in the US, and might provide a path to the long-term resolution of this inefficient anomaly.
Is now the right time?
One needs to ask not only how to reform, but whether.
Reform is a phase in the adaptive cycle: it is the reorganization that follows the crisis of a release phase. (See Reforming the FCC: Etiology for a summary of the adaptive cycle.) While release and restructuring is necessary for the long-term health of a system, it can be painful.
Reform necessarily dissipates the capital accumulated during growth and maturity; it is not something to be embarked on lightly. Is now the right time to reform the FCC?
The goal of wise management is to keep disruptions from flipping a system into an undesirable state, while encouraging the innovation and experimentation that comes with reorganization – not vainly trying to stop restructuring from happening at all. Delayed re-organization amplifies the eventual crisis, increasing the risk of landing up in an unhealthy state; too frequent or premature re-organization never allows the full accumulation of the potential that can fuel the next restructuring cycle.
An accumulation of governance problems is not sufficient to drive a paradigm shift, as Thomas Kuhn demonstrated in the context of scientific revolutions; there needs to be a better alternative. Do we have a better alternative? I don’t know, and complexity science suggests that there’s no way to know for sure. The only way to find out is to try it.
Are the risks acceptable? One of the prices to be paid is that tools one needs to manage the system are disrupted during reform. For example, trust is an essential ingredient of self-regulation, which will be important in the new approach – but trust requires stability, which is reduced during a reform. Fortunately, industry is in a relatively stable phase at the moment, which can accommodate and smooth over disruption at the Commission level. This gives the FCC an opportunity to change while not endangering the stability of the whole system.
A new Communications Act might trigger reform, but that is neither necessary nor likely. Congress will be the last to change in the face of the complexification of communications. Members, particularly influential ones, are typically long-standing office-holders with entrenched patrons and perspectives. They will resist the threat to their patronage entailed by a re-organization of regulatory action. When they do act to reform, it will probably be in response to an existential threat to a powerful old-line industry – which will tend to entrench, or at best resist attempts at blurring, existing ways of doing things.
Re-organization will therefore have to be driven by the Chairman, with all the risks (and opportunities) of a process that depends on a single big ego. The choice of a new Chairman will therefore have far-reaching consequences for the effectiveness of the organization.
Conclusions
There is no a priori limit to the number of true statements one can make about a complex, adaptive system, and many of them will be at odds with one another. The role of a regulator is therefore not to establish the ultimate truth as the basis of a correct decision, but rather a never-ending quest to make the best possible call given what can be known at a given moment.
This reality has become more visible and more pressing as the stable world of 20th century communications gives way to the flux of the 21st century internet/web. Even while understanding grows that the FCC’s influence is limited, there is no doubt that it still has great influence, and important responsibilities. The addition of new techniques to its repertoire and a corresponding restructuring of its organization will be essential to wielding its influence wisely, to the benefit of citizens.
The new approach proposed here is premised on dynamics that affect not only the FCC, but all actors in the communications system. These arguments, and all debates about how to reform the FCC, therefore also apply to the broader question of governance of 21st century communications.
Further reading
Weiser, Philip J, “FCC Reform and the Future of Telecommunications Policy” (forthcoming, to be presented at Reforming the Federal Communications Commission, 5 January 2009)
Reforming the FCC: New Capabilities
Complex adaptive systems, like 21st century communications, are by definition difficult to understand and control; I outlined some reasons for this in Reforming the FCC: Etiology. It is often unclear whether they can be managed at all. However, society has well-established expectations of the FCC regarding consumer protection, economic vitality, public safety, raising revenues, and the protection of culture and values.
The staff members of the FCC as currently constituted do a valiant job. However, the use of new techniques, and changes to the organization’s structure would enable them to be even more effective in future.
The challenge of managing a complex adaptive system calls for particular capabilities and techniques to be added to, or enhanced within, the Commission. This post will discuss each in turn:
- Principles rather than Rules
- Modeling and Simulation
- Self-regulation
- Transparency and Intelligibility
- Data gathering and Interpretation
Principles rather than Rules
The goal of policy is to understand the present, anticipate the future, and plot a course between the two. Since the present reality is changing ever more quickly, detailed rules and regulations will often be obsolete before they have been finalized. Responses will necessarily be ad hoc, but they don’t need to be arbitrary: in a complex world, principles are a more appropriate technique than detailed rules.
All managers of adaptive systems have rules of thumb for coping with unpredictability and complexity. I have been particularly inspired by the work of Buzz Holling and colleagues on managed ecosystems, which are a useful analog to 21st century communications. Both ICT and ecosystem managers have to deal with the confluence of socio-political systems and largely self-driven “subject” systems. In ecosystems the subject is biology, and in ICT it is the intrinsic, self-directed creativity of technology. The following principles distill the experience of managing such systems.
- Flexibility. Determine ends, not means. Describe and justify the outcomes sought, not the methods to be used to achieve them.
- Delegation. Let the market and society solve most problems, not government. Government's role is to provide proper incentives and guidance, and to address critical failures.
- Big Picture. Take a broad view of the problem and solution space. Favor generic over sector-, technology- or industry-specific legislation.
- Diversity. Seek and support multiple alternative solutions to policy problems. Encourage competition and market entry.
These principles run counter to much of the FCC's traditional practice:
- Detailed intervention – specifying a particular mechanism for achieving a societal goal, from merger conditions, to pricing network elements, to the exact allowed uses of a spectrum license – has been the rule, and is embedded in custom and statute.
- Delegation is at odds with the traditional top-down control of telecoms and broadcasting.
- The industry silos that underlie the titles of the Communications Act enshrine a “little picture” approach, where problems are solved piecemeal and in isolation.
- While lip service might be paid to diversity and innovation, regulatory capture by industry incumbents prevents competition. The desire for (illusory) control has traditionally seduced regulators into focusing on single rather than multiple solutions.
Applying the principles to the Commission's work would imply the following:
- Flexibility. Use principles rather than rules. Ensure backstop powers are available if self-regulation fails. Rules, if used, should be technology and business-model neutral. Build in capacity to deal with the unexpected.
- Delegation. Intervene if players close to the action demonstrably fail to solve problems flagged by the regulator, or in legislation. Avoid ex ante action unless there is a high probability that a market failure will be locked in.
- Big Picture. Avoid silo-specific regulation. Develop an integrated understanding, based on in-house expertise, of the industry and its social context. Use scenario planning to prepare for contingencies such as entrenched market failure.
- Diversity. Don't entrench one solution through regulatory preference. Define markets broadly for competitive analysis.
Modeling and Simulation
It is difficult to predict how complex adaptive systems will respond to interventions. Unintended consequences are the rule, not the exception. Experimentation before deploying new rules can reduce the likelihood, or impact, of blunders.
The ability to try out new ideas in state and local jurisdictions is a useful feature of the United States’ federal system. However, new ways to conduct “safe to fail” experiments are needed because geographies are now more interlinked, and less isolated, than they used to be. System modeling made possible by the advent of cheap, fast computing provides a safe way to try out regulatory ideas.
Exploring the consequences of policy choices in simulation can identify which courses of action are most robust under a variety of possible outcomes, and can identify critical preconceptions and biases. It can identify policy choices that are brittle and work in only a narrow set of circumstances, thus leading to more resilient final measures.
Techniques developed for business modeling, social analysis, and long-term policy planning can be applied to the internet/web. For example, agent-based simulations of internet access in the US provide insight into the dynamics of network neutrality regulation. It would be instructive to explore whether the resulting system has multiple stable states, as one might expect from a complex adaptive system; if it does, increased vigilance regarding irreversible transition into a low-diversity content arrangement is called for. Once such a model is in place, one can extend it to do resilience analyses, and factor in political power.
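To make this concrete, here is a minimal, self-contained sketch of the kind of agent-based exercise described above – a toy in which consumers repeatedly choose among content providers, with and without paid prioritization by an ISP, run across many random seeds to see whether the market tips into a low-diversity state. Every agent, parameter and update rule is invented for illustration; this is not any actual FCC or academic model.

```python
import random
from collections import Counter

def simulate(priority_fee_allowed, n_consumers=1000, n_providers=10,
             steps=50, quality_boost=0.3, seed=0):
    """Toy agent-based sketch: consumers repeatedly pick a content provider.
    Choice probability is proportional to provider popularity (a network
    effect) times perceived delivery quality. If paid prioritization is
    allowed, one provider buys a quality boost from the ISP."""
    rng = random.Random(seed)
    popularity = [1.0] * n_providers              # start roughly equal
    quality = [1.0] * n_providers
    if priority_fee_allowed:
        quality[0] += quality_boost               # provider 0 pays for priority
    for _ in range(steps):
        choices = Counter()
        for _ in range(n_consumers):
            weights = [p * q for p, q in zip(popularity, quality)]
            r, acc = rng.uniform(0, sum(weights)), 0.0
            for i, w in enumerate(weights):
                acc += w
                if r <= acc:
                    choices[i] += 1
                    break
        for i in range(n_providers):              # popularity tracks market share
            share = choices[i] / n_consumers      # (a preferential-attachment effect)
            popularity[i] = 0.9 * popularity[i] + 0.1 * share * n_providers
    total = sum(popularity)
    shares = [p / total for p in popularity]
    return 1.0 / sum(s * s for s in shares)       # "effective number" of providers

# Robustness check: compare content diversity under the two regimes across
# many random seeds, rather than trusting any single run.
for regime in (False, True):
    outcomes = sorted(simulate(regime, seed=s) for s in range(20))
    label = "paid priority" if regime else "neutral"
    print(f"{label:<14} median diversity: {outcomes[len(outcomes) // 2]:.2f}")
```

Even a caricature like this makes the questions in the text operational: does the paid-priority regime settle into a visibly different, lower-diversity stable state, and how sensitive is that outcome to the assumed parameters?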
No single model is sufficient; indeed, a focus on a single correct but incomplete model generates long-term problems even while satisfying short-term objectives. Simulation – agent-based techniques as well as traditional econometric modeling – needs to become part of the standard way all parts of the organization make decisions.
Self-regulation
There is growing interest in the value of self- or co-regulation in the communication industry as part of a continuum of approaches ranging from no formal government action, through to full statutory regulation. Industry self-regulation can be more flexible and less costly for both business and consumers than direct government involvement. It is most likely to be effective where there is trust between government, industry, and consumers; enterprises recognize the importance of responsible behavior over the long term; and non-compliance by rogue companies can be contained.
Allowing or encouraging appropriate self-regulation is a way for the FCC to implement the Delegation principle. This is delegation, however, not abdication of responsibility: the Commission retains the responsibility for discharging its social mandates (consumer protection, economic vitality, etc.), but does not necessarily have to use regulation to do so.
Successfully applying self-regulation entails a capability in the organization to make informed judgments about whether the socio-economic context is conducive to effective self-regulation, and whether self-regulatory organizations are meeting their responsibilities. As with all the techniques discussed here, new capabilities have institutional consequences.
Transparency and Intelligibility
Visibility into the workings of a complex system reduces volatility and improves resilience. But how does one get timely information about an elaborate, rapidly-changing socio-economic system like the internet/web? Since funding for monitoring by regulators is limited, it is important to enable surveillance by civil society and the market itself. Limited capacity and visibility at the top of the control hierarchy can be complemented by distributing monitoring throughout the system.
It is important to have effective disclosure mandates on powerful players, such as firms with significant market power – and the regulators themselves. Internet/web technology itself facilitates monitoring. It makes information more immediately accessible, and enables peer-to-peer disclosure and the pooling of information and knowledge. The pioneers like Zagat, Amazon Reviews, and Wikipedia are being joined by “vigilante transparency” organizations monitoring civil servants, campaign contributions, internet service providers, and even nannies.
One of the lessons of the sub-prime mortgage crisis is that a lack of intelligibility was more problematic than too little transparency. Nobody understood the ramifications of the financial instruments they were creating, managing or (not) regulating. While it’s true that eliminating complexity could stifle innovation, that’s a false choice; nobody would seriously propose to eliminate either complexity or innovation. What is needed is an accounting of intelligibility in the regulatory calculus.
Transparency/intelligibility need not be mandatory; companies should be able to choose obscurity. However, the choice of shrouding their activities could incur the cost of increased regulatory scrutiny – and perhaps higher expectations regarding performance against public interest mandates. For example, Comcast need not explain exactly how it manages its network; but if it chooses obscurity, it should face tougher network neutrality expectations. Microsoft need not disclose its interfaces; but if it chooses obscurity, it should face tougher anti-trust tests. Google need not explain how it uses DoubleClick data to improve ad click-throughs; but if it chooses obscurity, it should face tougher privacy protection requirements.
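The quid pro quo sketched above is, at bottom, a simple decision rule. The fragment below is purely illustrative: the companies and mandates are the examples from the paragraph, while the scrutiny tiers and the rule itself are assumptions made for the sake of the sketch, not anything the FCC applies.

```python
# Hypothetical encoding of the "obscurity buys scrutiny" trade-off described
# in the text; the tiers and the rule are illustrative assumptions, not policy.
SCRUTINY = {"transparent": "baseline", "obscure": "heightened"}

def expectations(company, mandate, disclosure_choice):
    return f"{company}: {SCRUTINY[disclosure_choice]} {mandate} expectations"

cases = [
    ("Comcast",   "network neutrality", "obscure"),
    ("Microsoft", "anti-trust",         "transparent"),
    ("Google",    "privacy protection", "obscure"),
]
for company, mandate, choice in cases:
    print(expectations(company, mandate, choice))
```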
Data gathering and interpretation
All the above techniques depend on effective data gathering. The FCC cannot do all the work itself, but neither should it be entirely beholden to interested parties to a proceeding, who will necessarily furnish only information that advances their cause. The agency should foster a culture and capability of data collection and interpretation with the goal of learning, not bookkeeping. All these activities can be delegated in part, but in all of them the FCC should retain an in-house capability.
The data that the FCC currently reports is organized by the old-line industry silos, and largely ignores the structure of the 21st century market. There are data for common carrier traffic, competition in satellite services, the cable industry, and wireless – all given separately. Many vital current issues are not covered, such as: the assessment of the consumer value of auctioned spectrum; an inventory of the utilization of spectrum under both NTIA and FCC management; consumer data aggregation practices.
The FCC should also explore new ways to inform its decisions and act on behalf of citizens, such as opinion polling. This technique is often used by other regulators, e.g. Ofcom, but apparently not by the FCC. The Commission will always be catching up with data collection requirements, since the industry changes so quickly. An institutional habit of identifying new data needs and fulfilling them is just as important as high quality reporting against current mandates.
Once data has been collected, it needs to be interpreted. There are many interpretation challenges, not least the flood of short comments in particular proceedings, a flood facilitated by laudable efforts to involve the public such as filing via email and accepting brief comments. For example, in 2006 there were 10,831 one- or two-page filings (excluding ex partes) on FCC Docket 06-74, the AT&T/BellSouth merger; they were essentially all from individual citizens. This was 18% of all the filings in that year (59,081). By comparison, in 2004 there was a total of 25,480 filings.
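A quick back-of-the-envelope check of those figures (the counts are taken from the paragraph above; only the arithmetic is added here):

```python
# Filing counts quoted in the text; the percentages are simple arithmetic.
short_filings_2006 = 10_831   # one- or two-page filings on Docket 06-74
total_filings_2006 = 59_081   # all filings received in 2006
total_filings_2004 = 25_480   # all filings received in 2004

share = short_filings_2006 / total_filings_2006
growth = total_filings_2006 / total_filings_2004

print(f"Short Docket 06-74 filings as a share of all 2006 filings: {share:.1%}")
print(f"Growth in total filings, 2004 to 2006: {growth:.1f}x")
```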
This flood of comments is part of a larger challenge of managing multiple constituencies. Not only has the number of industries with an interest in the communication business grown beyond telecommunications and TV broadcasting to include internet content providers, software companies and new broadcasting media, but the public has become increasingly engaged.
The FCC needs to rethink how it can involve a wider community in data collection and analysis. This community includes interested citizens, industry, research vendors, and think tanks. The goal should be to improve the speed and quality of both data collection and interpretation by opening up the process to commercial interests and citizen-consumers.
Further reading
Adaptive systems
Yorque, Ralf, Brian Walker, C S Holling, Lance H Gunderson, Carl Folke, Stephen R Carpenter, and William A Brock, "Toward an Integrative Synthesis", Ch. 16 in Gunderson, Lance H and C S Holling, Panarchy: Understanding transformations in human and natural systems, Island Press (2002)
Complexity theory and governance
Schneider, Volker and Johannes M. Bauer, "Governance: Prospects of Complexity Theory in Revisiting System Theory", presented at the annual meeting of the Midwest Political Science Association, Chicago, Illinois, 14 April 2007. Available: http://www.uni-konstanz.de/FuF/Verwiss/Schneider/ePapers/MPSA2007Paper_vs_jmb.pdf
Self-regulation
De Vries, Jean Pierre (2008), "Internet Governance as Forestry: Deriving Policy Principles from Managed Complex Adaptive Systems", TPRC 2008. Available: http://ssrn.com/abstract=1229482.
Ofcom (2008) Principles for analysing self- and co-regulation: Statement. 10 December 2008. Available: http://www.ofcom.org.uk/consult/condocs/coregulation/statement/
Simulation
Weiser, Philip J. (2008) “Exploring Self Regulatory Strategies for Network Management”, Flatirons Summit on Information Policy, 9-10 June, 2008. 25 August 2008. Available: http://www.silicon-flatirons.org/documents/publications/summits/WeiserNetworkManagement.pdf
Sterman, John D. (2002) "All models are wrong: reflections on becoming a systems scientist: The Jay Wright Forrester Prize Lecture." System Dynamics Review Vol. 18, No. 4 (Winter 2002): 501–531. Available: http://web.mit.edu/jsterman/www/All_Models_Are_Wrong_(SDR).pdf
Transparency
Lempert, Robert J., Steven W. Popper, and Steven C. Bankes (2003) “Shaping the Next One Hundred Years: New Methods for Quantitative, Long-Term Policy Analysis” RAND Report MR-1626-RPC, 2003. Available: http://www.rand.org/pubs/monograph_reports/MR1626/index.html
Bauer, Johannes M. (2007) “Dynamic effects of network neutrality,” International Journal of Communication, 1: 531-547. Available: http://ijoc.org/ojs/index.php/ijoc/article/view/156/79tprg
Fung, Archon, Mary Graham, David Weil, and Elena Fagotto (2006) "The Effectiveness of Regulatory Disclosure Policies," Journal of Policy Analysis and Management, v. 25, no. 1 (Winter 2006). Available: http://www.hks.harvard.edu/taubmancenter/transparency/downloads/jpam06.pdf
Saturday, December 13, 2008
And it started so small: News-related Etymology
According to various sources at dictionary.com, the word "bribe" is Middle English, from an Old French word for a piece of bread given as alms. The shift of meaning to "gift given to influence corruptly" is first attested in 1535. The original sense of the word is said to come from the base bri(m)b-, denoting something small.
(My apologies to Gov. Blagojevich for contributing to the burial of the principle of "innocent until proven guilty" in America. Talk about trial and conviction by public opinion...)
Wednesday, December 10, 2008
Reforming the FCC: Etiology
In Reforming the FCC: Diagnosis, I argued that the FCC must these days supervise a much more complex and adaptive situation. This post examines the causes of this situation. I’ll consider responses in a subsequent post.
As an exercise, consider whether the following attributes apply to 21st century communications (which I’ll also refer to as ICT, for lack of a better term): the absence of a global controller; nested, hierarchical organization; dispersed interactions; never-ending novelty; constant selection among candidate solutions; and rapid adaptation to new circumstances. I believe they definitely describe the internet/web – and they are much less applicable to the silo’d world of telecommunications and analog broadcasting of only a few decades ago.
These attributes are the hallmarks of complexity and adaptive, non-linear systems. 21st Century communications is a complex adaptive social system, but the FCC was set up to manage a 20th century industry which was complicated but not complex. This is the deep reason why the institution needs to change.
The adaptive cycle
A key attribute of complex adaptive systems is that they cycle through distinct stages. I’ll describe it here using the example of ecosystems (where it was introduced) before turning to ICT.
During the growth stage, there is rapid colonization of recently disturbed areas, for example after a fire or wind storm has removed large amounts of biomass in a forest. The connectedness between organisms is low, which leads to high resilience; the loss of one species doesn’t lead to the loss of another. As the forest matures, it moves into the maturity phase of the cycle, which is dominated by the accumulation of material. The network of connections between biomass and nutrients becomes increasingly tight, and fragile; every niche in the forest is filled, and every resource is used. Organisms become much more interdependent; food chains become dense and interconnected. The maturity phase is followed by a dramatic release, triggered in a forest by fire, drought, insect pests, etc. A lot of energy is unbound, and networks are broken up. This sets the scene for the fourth phase, reorganization: opportunistic species that have been suppressed by the stable configuration of the maturity phase move in. This is a time of innovation and restructuring, laying the groundwork for a return to another growth phase.
The behavior of managed ecosystems is shaped by three properties: the accumulation of potential, the degree of connectedness between elements, and the resilience of the system in the face of shocks. The same properties apply to complex human enterprises like modern communications.
The adaptive cycle alternates periods of gradual accumulation of potential (e.g. biomass, socio-economic capital or know-how, depending on the kind of system) with sudden and often unexpected disruptions that reorganize that potential. Connectedness is high at maturity, but that is also the time when resilience to shocks is at its lowest. This cycle of aggregation followed by restructuring leads to innovation; but the release phase is often a surprise, and frequently an unpleasant one for those who were successful in the maturity phase. It is thus often experienced as a crisis.
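For readers who prefer something concrete, here is a toy numerical caricature of the cycle just described: potential and connectedness accumulate through growth and maturity, resilience falls as connectedness tightens, and a shock the system can no longer absorb triggers release and then reorganization. All quantities and update rules are invented for illustration; they are not calibrated to any real ecosystem or industry.

```python
import random

rng = random.Random(42)
potential, connectedness = 0.1, 0.1
phase = "growth"

for t in range(60):
    if phase in ("growth", "maturity"):
        potential = min(1.0, potential + 0.05)        # capital/biomass accumulates
        connectedness = min(1.0, connectedness + 0.04)
        phase = "maturity" if connectedness > 0.6 else "growth"
        resilience = 1.0 - connectedness              # tightly coupled = fragile
        if rng.random() * 0.5 > resilience:           # a shock the system can't absorb
            phase = "release"
    elif phase == "release":
        potential *= 0.3                              # accumulated potential dissipates
        connectedness *= 0.2                          # networks break up
        phase = "reorganization"
    else:                                             # reorganization: innovation seeds
        potential += 0.1                              # the next growth phase
        connectedness = 0.1
        phase = "growth"
    print(f"t={t:2d}  phase={phase:<15} potential={potential:.2f}  "
          f"connectedness={connectedness:.2f}")
```

The point of such a sketch is not prediction but intuition: release arrives most readily when connectedness, and hence fragility, is highest – exactly the pattern described above for mature systems.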
Decision Environments
One can recognize the phases of the adaptive cycle in the internet/web, and in the larger system of communications governance. It is helpful to parse the system into four decision environments that represent different hierarchical layers:
- Political system: local, state and federal politicians seeking to advance their causes
- Inter-organizational system: peer agencies with partially overlapping responsibilities, such as the FCC, FTC and NTIA
- Organizational system: an agency, in our case the FCC, acting on its “subject” layer, and other organizations, in a context provided by the political systems
- Market/culture system: companies and citizen/consumers using technology (goods and services) to achieve their various ends, often at odds with each other and with other levels of the system
Each of these environments is at a different point in the adaptive cycle:
- Political: The political system went through a release phase with the 2008 election, and will spend 2009 in reorganization as players who have been out of office for eight years move into newly opened positions of power (cf. ecological niches), bringing new perspectives with them.
- Inter-organizational: The new Administration will necessarily bring changes at the top of the FTC and NTIA as well, but the consequences may not be as dramatic as those at the FCC, providing some stability at this layer.
- Organizational: The FCC is due for "release" with the appointment of new Commissioners and a new Chairman in 2009. There is anecdotal evidence that the large-scale departures of long-serving career staff in the last couple of years represent a release in themselves, with the breakup of long-standing networks of expertise and the dissipation of institutional knowledge.
- Market/culture: The productive parts of the communication system are in or near maturity. Traditional content industries like news, music publishing and TV are at maturity, and some are entering release. Telecoms went through a re-organization following the Telecoms Act of 1996, and is now in a growth stage, judging by the consolidation of AT&T and Verizon. Similarly, the disruptive market-oriented allocation of spectrum through auctions has been absorbed, and there are signs of maturity in the concentration of spectrum in a few hands. There are still pockets of reorganization left over from the last cycle, e.g. cable taking voice share from wire line telcos, and telcos threatening cable's video business. For all the hype, the PC/internet/web subsystem is well along in the growth phase and nearing maturity (e.g. Microsoft, Cisco, Google). Consumer habits have adapted to the internet and the web, and have become mature.
Another hallmark of complex adaptive systems – and one of the hardest challenges for a regulator – is unexpected novelty. Changes in the state of a complex system are usually unexpected, in part because many dynamics are invisible. Surprises are particularly noticeable when they lead to release.
Here are some recent reminders that the innovation that we expect from complex systems usually comes as a surprise:
- Digital satellite radio expected to compete with traditional radio, not to be swamped by the iPod
- Digital video as an alternative to broadcast TV came to prominence as low-quality, user-originated content on YouTube, rather than as high quality Video on Demand via cable or DSL
- The explosion of Wi-Fi (and CDMA cellular telephony) was the consequence of esoteric decisions about unlicensed rules by the FCC in the mid 1980’s
- The collapse of music publishing – the industry lost a third of its revenues between 1999 and 2006
- The eclipse of commercial encyclopedias by user-produced content on Wikipedia
Many surprises come from contagion between problem domains that were previously considered distinct. XM/Sirius’s problems came at the intersection of personal computing devices with broadcasting; music publishing’s crisis arose from software and networking innovations that led to the P2P distribution of digital content; and the open source software movement informed Wikipedia.
Problem scope
A consequence of interlocking decision environments and intersecting problem domains is that the unit of analysis for the FCC is no longer a distinct, largely independent, well-defined industry associated with a particular technology and its own Title in the Communications Act.
Attention needs to shift from industries to problem domains, and away from solutions tied to a particular industry, technology, or even institution or statute. For example, a policy imperative like lawful intercept is no longer limited to telephony, which leads to conflicts such as the competing definitions of information services in CALEA and the Communications Act. This is an example of the importance of the Big Picture principle for managing adaptive systems. (I’ll review this principle and its three companions – Delegation, Flexibility and Diversity – in the next post.)
However, simply broadening some existing statute to cover all new possibilities is counter-productive. It conflicts with the other three principles, and falls victim to the fallacy that narrow-minded control of a single variable leads to a healthy outcome; in adaptive systems, it leads eventually to an even worse crisis.
In conclusion, the FCC is really facing a system problem, not an institutional one. Even if today’s procedural problems within the Commission were completely solved, that would not address the challenges of a qualitatively more complex and unpredictable regulatory “subject”, that is, the market/culture system where innovation and growth take place. Nor would it speak to the problems faced at the political level, where the social acceleration of time poses existential challenges to the rule of law, the separation of powers between the legislature, executive and judiciary, market capitalism, and liberal democracy.
I’ll turn to the question of how the FCC should respond in the next post.
Further reading
The adaptive cycle:
Holling, C. S., Lance H. Gunderson and Donald Ludwig, “In Quest of a Theory of Adaptive Change”, Ch. 1 in Gunderson, Lance H. and C. S. Holling, Panarchy: Understanding Transformations in Human and Natural Systems, Island Press (2002). PDF
Decision environments and the challenges individuals face in managing adaptive systems:
Ten Conclusions from the Resilience Project
Westley, Frances, “The Devil in the Dynamics: Adaptive Management on the Front Line”, Ch. 13 in Gunderson, Lance H. and C. S. Holling, Panarchy: Understanding Transformations in Human and Natural Systems, Island Press (2002)
A discussion of the intersection between system resilience, the rule of law, and Scheuerman’s notion of the social acceleration of time:
Cherry, Barbara A. (2008), “Institutional Governance for Essential Industries Under Complexity: Providing Resilience Within the Rule of Law”, CommLaw Conspectus (forthcoming)
An account of the early history of civil spread spectrum:
Early Civil Spread Spectrum History, Mike Marcus web site
Collapse of the music industry:
theweek.com
Growth of cable voice traffic:
economist.com
redorbit.com
gigaom.com
Wednesday, December 03, 2008
Etymology for the day: "larceny"
According to the American Heritage Dictionary via dictionary.com, "larceny" is a Middle English word, from Anglo-Norman larcin, theft, from Latin latrōcinium, robbery, from latrō, robber, mercenary, ultimately from Greek latron, pay, hire.
And the Spanish for a robber or thief is ladrón.
Mercenaries have never had a good rep. . .
Tuesday, December 02, 2008
Reforming the FCC: Diagnosis
The financial crisis has called into question how markets are regulated; calls for reforming the FCC have been growing louder for some years. The legal/regulatory shortcomings of the FCC are a topic of frequent conversation (e.g., GMU Sep 08, PK/Silicon Flatirons Jan 09). It is therefore instructive to ask why the Commission has ended up in this situation. Some of the problems are due to the personalities and politics of the moment, and are thus transitory. Some are due to its terms of operation; the FCC’s structure and mission are determined by the Communications Act, and won’t change fundamentally unless the Act changes. The deeper cause, which most interests me, is a change in the nature of what is being regulated: the transformation of the communications business from telecoms+broadcasting to the internet.
Since the mid-90s, computer, information and communication services have come to dwarf telecommunications services. For example, the graphic at the top of this post charts the service exports of the OECD countries according to the OECD Communications Outlook 2007 (p. 256). This was not only a quantitative change; computing brought a qualitative change. The internet/web is modular, decentralized, self-organizing, adaptive and diverse on a fundamentally different scale to telecommunications (Internet Governance as Forestry). These are all characteristics that distinguish complex systems from merely complicated ones.
An analogy may help: the FCC in the telecoms era was like a farmer managing agricultural production; today it is like a ranger responsible for a wilderness. A farmer can decide which crops to cultivate, where to plant them, and when to rotate – though the plants do the work of converting sunlight to carbohydrate, and the animals convert food to meat. Some inputs, like weather and market conditions, are unpredictable, but many – irrigation, fertilizer, seed type, antibiotics – are under the farmer’s control. (And even weather and market risk is mitigated by massive government subsidies for major crops.) The desired output is well-defined and easily measurable. Rangers, on the other hand, have to deal with a very different balance of power and responsibility. They have to protect endangered species, prevent catastrophic fires, and provide access to citizens, but have little or no control over the animals and plants in the ecosystem, or the inputs in the form of weather, migrating animals, or pests.
This limited control implies that detailed, rule-based regulation is no longer sustainable. An approach based on principles, supported by tools such as transparency and computer simulation, is the only viable strategy. Rules can determine which crop hybrid to use for a particular market need given climate and soil type; but principles – such as flexibility, taking a big picture view, fostering diversity, and delegating responsibility – are unavoidable when managing an ecosystem.
In a New Yorker article about the financial crisis, James Surowiecki uses a sport analogy to explain the difference between principles and rules:
It’s something like the difference between football and soccer. Football, like most American sports, is heavily rule-bound. There’s an elaborate rulebook that sharply limits what players can and can’t do (down to where they have to stand on the field), and its dictates are followed with great care. Soccer is a more principles-based game. There are fewer rules, and the referee is given far more authority than officials in most American sports to interpret them and to shape game play and outcomes. For instance, a soccer referee keeps the game time, and at game’s end has the discretion to add as many or as few minutes of extra time as he deems necessary. There’s also less obsession with precision—players making a free kick or throw-in don’t have to pinpoint exactly where it should be taken from. As long as it’s in the general vicinity of the right spot, it’s O.K.
--- James Surowiecki, Parsing Paulson, The New Yorker, 2 Dec 2008
Pursuing this metaphor, the FCC is not only the referee of a football game; it also makes the rules – often as the game goes along.
I’ll suggest some possible ways for a new FCC to manage the new communications business in an upcoming post. However, a caveat: The ICT business hasn’t had a crisis of melt-down proportions, as finance has had, to concentrate the mind. It remains to be seen how the change in power in DC will affect this process. Some of the loudest calls for change at the FCC have come from the Right, arguing that the FCC regulates too much and too intrusively; the Left has chimed in, arguing that it regulates too ineffectively. With Democrats now in control of both Congress and the Administration, and the GOP in some disarray, the pressure to reform the FCC may well abate; calls for its abolition will certainly have less resonance.
Newton, Leibniz and the (non?)existence of spectrum
My argument that spectrum doesn’t exist [see Note below for a recap] has deeper philosophical roots than I’d realized.
As I learned in a lecture on the Philosopher’s Zone podcast, my contention parallels an argument between Leibniz and Newton about whether space and time have an independent existence. Here’s Graham Priest, Professor of Philosophy at Melbourne:
“In the 17th century, there was a famous debate between Newton and Leibniz about the nature of space and time. And this was the question they focused on: Could you pick up the whole physical cosmos and move it five miles to the east and put it down? Or, could everything happen exactly the same way it does now, except all half an hour later? Newton said Yes to both those questions; Leibniz said No. Because Newton said space and time are kind of like buckets and it makes sense to suppose that the buckets have a certain kind of reality, and the contents just hang in there. So space is something, it will be there even if there were no events in space and time. And time likewise. So space and time have a certain kind of self existence, they don't depend on anything.
“Leibniz said this: No, it doesn't make any sense to suppose that you could lift everything up and move it five miles to the east, or that things could have started five minutes later. Because nothing would really have changed. Space and time aren't kind of big buckets that you put things in. Space and time are nothing more than the internal relationships between the events that happen in space and time. So if all the relationships, the befores and afters, the lefts and rights, if all those are the same, then nothing's changed. So if you moved everything, supposing you could, five miles to the east, all the spatial relationships would have remained the same. So nothing would have changed. So if you have this view of the nature of space and time, then space and time do not have the same kind of self existence, ontological existence that they have for Newton.
“In the jargon of Buddhist philosophy, Newton thought that space and time had self existence. They were there independently of anything else. But for Leibniz they did not, they were just things which kind of support each other by a system of relationships.”
Source: Why Asian philosophy? (podcast and transcript), The Philosopher’s Zone, 18 October 2008, a program of ABC Radio National.
Wikipedia’s article on the philosophy of space and time includes a paragraph on the Leibniz/Newton debate. The question of whether or not time and space exist independently of the mind leads, in this case, to either the “absolutist” position that they are real objects themselves, or the “relationalist” position that they are merely orderings upon actual objects. Wikipedia summarizes the positions as follows:
Leibniz describes a space that exists only as a relation between objects, and which has no existence apart from the existence of those objects. Motion exists only as a relation between those objects. Newtonian space provided the absolute frame of reference within which objects can have motion. In Newton’s system the frame of reference exists independently of the objects which are contained in it. These objects can be described as moving in relation to space itself.
This notion of space and time as big buckets that you put things in reminds me of the dominant metaphor for spectrum: it’s like space, and often a container. (Blog post: De-situating Spectrum: Non-spatial metaphors for wireless communication. Papers: Imagining Radio: Mental Models of Wireless Communication; De-situating spectrum: Rethinking radio policy using non-spatial metaphors.)
When it comes to spectrum, I’m a relationalist. Frequency has no existence apart from the existence of electromagnetic radiation, and therefore there is no spectrum resource separable from the operation of radio systems.
Note: Why “spectrum doesn’t exist”
I contend that the term “spectrum” as commonly used doesn’t have any meaningful referent. It typically occurs as a synonym for frequency or a frequency band, as in “the service is in UHF spectrum”. However, frequency is simply a marker; spectrum aka frequency has no existence independent of the measured radiation. It therefore doesn’t make sense to talk of a “spectrum resource”.
Use of the term in the dictionary sense, to refer to a distribution of electromagnetic radiation (as in “the spectrum of that transmitter”) is rare. In this sense a spectrum exists when a radio transmitter is on, but not when it’s off. Again, “spectrum as a resource” isn’t meaningful.
The only arguably meaningful definition is that “spectrum” means “wireless operation”. This is the only way I can make sense of a term like “spectrum license”; it’s a license to operate a radio in a given way. However, few people use the word spectrum in a way that suggests they have “wireless operation” in mind as a synonym – though they may end up with this definition when pressed.
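The relationalist point can also be illustrated in code. In the hypothetical sketch below (the class, function and numbers are my own inventions for this example, not a real measurement model), radiated power is computed only from whichever transmitters happen to be operating; there is no stored “spectrum” object anywhere.

```python
# Hypothetical sketch of the relational view: no stored "spectrum" object exists,
# only radiation derived from transmitters that are actually operating.
from dataclasses import dataclass

@dataclass
class Transmitter:
    name: str
    center_mhz: float   # carrier frequency in MHz
    power_w: float      # transmit power in watts
    on: bool = False

def radiated_power_at(freq_mhz: float, transmitters: list[Transmitter]) -> float:
    """Total power observed at a frequency; zero when nothing is transmitting."""
    total = 0.0
    for tx in transmitters:
        # Crude 200 kHz channel model: count a transmitter if it is on and nearby in frequency.
        if tx.on and abs(tx.center_mhz - freq_mhz) < 0.1:
            total += tx.power_w
    return total

txs = [Transmitter("station A", 100.1, 50.0), Transmitter("station B", 100.1, 10.0)]
print(radiated_power_at(100.1, txs))  # 0.0 -- no emission, so no "spectrum" to speak of
txs[0].on = True
print(radiated_power_at(100.1, txs))  # 50.0 -- the distribution exists only while A operates
```

In this toy model the “spectrum” is only ever a derived quantity: switch every transmitter off and the distribution simply isn’t there, which is the sense in which “wireless operation”, not a standalone resource, is the only meaningful referent.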