According to a country ranking of computer ownership published in the 2009 edition of The Economist’s “Pocket World in Figures”, there were 122.1 computers per 100 people in Israel in 2006.
Russia was in the same league as Namibia, with 12 computers per 100 people. The US, Britain and Australia were at 76 per 100 people.
"in this world, there is one awful thing, and that is that everyone has their reasons" --- attrib. to Jean Renoir (details in the Quotes blog.)
Sunday, December 14, 2008
Reforming the FCC: New Organization
The recommendations I outlined in Reforming the FCC: New Capabilities will only bear fruit in a suitable institutional setting. This post suggests some organizational responses to the challenge of managing 21st century communications in the United States.
In-house expertise
All participants in a complex adaptive system, including regulators, have to innovate to keep pace with ever-changing circumstances. However, change carries risks. The system will adjust to new rules as soon as they are promulgated, and there’s no way to turn back the clock.
The variability and unpredictability of adaptive systems means there cannot be a single, fixed, optimal strategy. The only viable approach is continuing learning and adaptation. As society and technology change ever faster, the institution has to learn faster and plan better. The FCC needs to be a learning organization which adapts at a much faster rate than was required in the past.
The FCC needs to have a strong, in-house basis for understanding the state of play, and anticipating developments. This is necessary both to make smart decisions about how (and whether!) to intervene through rule making, and to make smart/robust rules. Learning and planning are tied together through simulation and other safe-to-fail experiments.
Simply depending on the adversaries in proceedings to provide technical information is not sufficient; even if the input were not biased, one would still need in-house expertise to make informed choices. The Commission has this level of expertise in wireless technology, but perhaps not to the same degree in internet/web technology. Staff can make nuanced judgments about the likelihood and mitigation of radio interference, but has less experience judging network management claims, the implications for consumer privacy of data aggregation and behavioral advertising, or the most effective way to implement law enforcement surveillance on the internet.
With strong in-house capacity, the FCC could make effective use of outside consultants, which it has employed too rarely in recent decades. It could also find ways to exploit the goodwill and expertise of volunteer experts in academia and society at large.
The multiple uncertainties of adaptive systems mean that every institution needs a long memory. Slow variables are often the ones that trigger radical change, but they can only be observed with prolonged attention. The institution needs a strong, constantly renewing base of career officials that can bridge across the terms of political appointees. There has been a renewed awareness in recent years of the degree to which the tenure of career professionals can be vulnerable to political processes.
Individuals matter. The interlinking of activities at various layers of a complex system means that in-house entrepreneurs (and reactionaries) can have disproportionate influence, particularly when a system is in flux. It’s important to foster independence and adaptive management skills, particularly in the American setting where top leadership is politically appointed, and changes frequently. Secondments like Chief Economist and Chief Technologist are an important tool, and should be complemented with fellowships from industry and academia at middle and entry levels in the organization.
New Structure
Developing an open, adaptive and principles-based approach to policy making, and building the capacity to perform new functions, will require changes in the institution’s organization.
The FCC is currently organized, in large part, to reflect the Titles of the Communications Act(s): the Wireline Competition Bureau is responsible for wire-based telephony; the Wireless Telecommunications Bureau oversees spectrum, including cellular telephony; the International Bureau covers satellite and international matters; the Media Bureau regulates radio and TV services. However, the mapping to statute is not exact, which suggests that organization by Title is not a necessity: a re-organization does not require a new Communications Act.
Such a structure cannot effectively address questions that cross industry boundaries – which, in the 21st century, is most of them.
A more effective and stable structure would be organization by policy mandate. This would replace the industry-oriented bureaus by departments with responsibilities for (1) public safety, (2) consumer protection, (3) the protection of culture and values, (4) economic vitality, and (5) the raising of revenues – regardless of industry or technology.
The rudiments of such an organization already exist; the Public Safety & Homeland Security Bureau is responsible for public safety across industries. Other bureaus would have to be split up among domain-oriented departments. The responsibilities of the departments replacing existing bureaus would include:
- Public Safety: Access to emergency services, law enforcement surveillance, data retention, and child safety
- Consumer Protection: Privacy, fraud, fair trade terms, access for those with disabilities, device certification, universal service, digital inclusion
- Culture & Values: Control of speech (obscenity, violence in media), advertising rules
- Markets: Economic vitality, anti-trust, allocation of resources (numbers, spectrum), market analysis
- Revenue: Taxes, fees, levies, subsidies
Such a mandate-based organization would have several advantages (a schematic sketch of the mapping follows this list):
- An alignment with policy mandates will be more stable over time than one based on technology or industry segmentation, which is in constant flux.
- An organization structured by public-interest mandate would require and enable the Commission to take a big-picture approach in every case, rather than limiting staff to supervising or nurturing a particular, soon-to-be-obsolete industry category.
- It would weaken the ability of incumbents to dominate all the rule makings that apply to them by concentrating their attention on a single bureau.
- A department focused on market issues would highlight the complements, overlaps and conflicts between the FCC and FTC in the anti-trust area.
- The inclusion of spectrum allocation issues within a department charged with maximizing the economic value of national resources would force the question of the divided FCC/NTIA management of spectrum in the US, and might provide a path to the long-term resolution of this inefficient anomaly.
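Purely as an illustration of how such a mandate-based structure might be represented and used, here is a minimal sketch. The departments and their responsibilities come from the list above; the issue tags and the routing function are hypothetical, not an actual FCC taxonomy.

```python
# Illustrative sketch of the mandate-based structure proposed above: each
# department is defined by a policy mandate rather than an industry, and a
# proceeding is routed to every mandate it touches. Issue tags are hypothetical.
DEPARTMENTS = {
    "Public Safety":       {"emergency access", "lawful intercept", "data retention", "child safety"},
    "Consumer Protection": {"privacy", "fraud", "fair trade terms", "accessibility",
                            "device certification", "universal service", "digital inclusion"},
    "Culture & Values":    {"obscenity", "violence in media", "advertising rules"},
    "Markets":             {"competition", "spectrum allocation", "numbering", "market analysis"},
    "Revenue":             {"taxes", "fees", "levies", "subsidies"},
}

def route(issue_tags):
    """Return the departments whose mandate a proceeding touches."""
    return sorted(d for d, mandate in DEPARTMENTS.items() if mandate & issue_tags)

# A cross-cutting proceeding (e.g. behavioral advertising over broadband)
# lands in several departments at once, rather than in one industry bureau.
print(route({"privacy", "advertising rules", "competition"}))
# -> ['Consumer Protection', 'Culture & Values', 'Markets']
```

The point of the exercise is that a question which crosses industry boundaries naturally lands in several departments at once, instead of being owned by a single bureau.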
Is now the right time?
One needs to ask not only how to reform, but whether.
Reform is a phase in the adaptive cycle: it is the reorganization that follows the crisis of a release phase. (See Reforming the FCC: Etiology for a summary of the adaptive cycle.) While release and restructuring is necessary for the long-term health of a system, it can be painful.
Reform necessarily dissipates the capital accumulated during growth and maturity; it is not something to be embarked on lightly. Is now the right time to reform the FCC?
The goal of wise management is to keep disruptions from flipping a system into an undesirable state, while encouraging the innovation and experimentation that comes with reorganization – not vainly trying to stop restructuring from happening at all. Delayed re-organization amplifies the eventual crisis, increasing the risk of landing up in an unhealthy state; too frequent or premature re-organization never allows the full accumulation of the potential that can fuel the next restructuring cycle.
An accumulation of governance problems is not sufficient to drive a paradigm shift, as Thomas Kuhn demonstrated in the context of scientific revolutions; there needs to be a better alternative. Do we have a better alternative? I don’t know, and complexity science suggests that there’s no way to know for sure. The only way to find out is to try it.
Are the risks acceptable? One of the prices to be paid is that tools that one needs to manage the system are disrupted during reform. For example, trust is an important ingredient of self-regulation, which will be important in the new approach – but trust requires stability, which is reduced during a reform. Fortunately, industry is in a relatively stable phase at the moment, which can accommodate and smooth over disruption at the Commission level. This gives the FCC an opportunity to change while not endangering the stability of the whole system.
A new Communications Act might trigger reform, but that is neither necessary nor likely. Congress will be the last to change in the face of the complexification of communications. Members, particularly influential ones, are typically long-standing office-holders with entrenched patrons and perspectives. They will resist the threat to their patronage entailed by a re-organization of regulatory action. When they do act to reform, it will probably be in response to an existential threat to a powerful old-line industry – which will tend to entrench, or at best resist attempts at blurring, existing ways of doing things.
Re-organization will therefore have to be driven by the Chairman, with all the risks (and opportunities) of a process that depends on a single big ego. The choice of a new Chairman will therefore have far-reaching consequences for the effectiveness of the organization.
Conclusions
There is no a priori limit to the number of true statements one can make about a complex, adaptive system, and many of them will be at odds with one another. The role of a regulator is therefore not to establish the ultimate truth as the basis of a correct decision, but rather a never-ending quest to make the best possible call given what can be known at a given moment.
This reality has become more visible and more pressing as the stable world of 20th century communications gives way to the flux of the 21st century internet/web. Even while understanding grows that the FCC’s influence is limited, there is no doubt that it still has great influence, and important responsibilities. The addition of new techniques to its repertoire and a corresponding restructuring of its organization will be essential to wielding its influence wisely, to the benefit of citizens.
The new approach proposed here is premised on dynamics that affect not only the FCC, but all actors in the communications system. These arguments, and all debates about how to reform the FCC, therefore also apply to the broader question of governance of 21st century communications.
Further reading
Weiser, Philip J, “FCC Reform and the Future of Telecommunications Policy” (forthcoming, to be presented at Reforming the Federal Communications Commission, 5 January 2009)
Reforming the FCC: New Capabilities
Complex adaptive systems, like 21st century communications, are by definition difficult to understand and control; I outlined some reasons for this in Reforming the FCC: Etiology. It is often unclear whether they can be managed at all. However, society has well-established expectations of the FCC regarding consumer protection, economic vitality, public safety, raising revenues, and the protection of culture and values.
The staff members of the FCC as currently constituted do a valiant job. However, new techniques and changes to the organization’s structure would enable them to be even more effective in future.
The challenge of managing a complex adaptive system calls for particular Commission capabilities and techniques to be added or enhanced. This post will discuss each in turn:
- Principles rather than Rules
- Modeling and Simulation
- Self-regulation
- Transparency and Intelligibility
- Data gathering and Interpretation
Principles rather than Rules
The goal of policy is to understand the present, anticipate the future, and plot a course between the two. Since the present reality is changing ever more quickly, detailed rules and regulations will often be obsolete before they have been finalized. Responses will necessarily be ad hoc, but they don’t need to be arbitrary: in a complex world, principles are a more appropriate technique than detailed rules.
All managers of adaptive systems have rules of thumb for coping with unpredictability and complexity. I have been particularly inspired by the work of Buzz Holling and colleagues on managed ecosystems, which are a useful analog to 21st century communications. Both ICT and ecosystem managers have to deal with the confluence of socio-political systems and largely self-driven “subject” systems. In ecosystems the subject is biology, and in ICT it is the intrinsic, self-directed creativity of technology. The following principles distill the experience of managing such systems.
- Flexibility. Determine ends, not means. Describe and justify the outcomes sought, not the methods to be used to achieve them.
- Delegation. Let the market and society solve most problems, not government. Government's role is to provide proper incentives and guidance, and to address critical failures.
- Big Picture. Take a broad view of the problem and solution space. Favor generic over sector-, technology- or industry-specific legislation.
- Diversity. Seek and support multiple alternative solutions to policy problems. Encourage competition and market entry.
The FCC’s traditional way of operating is at odds with each of these principles:
- Intervention by specifying a detailed mechanism for achieving a societal goal – from merger conditions, to pricing network elements, to the exact allowed uses of a spectrum license – has been the rule, and is embedded in custom and statute.
- Delegation is at odds with the traditional top-down control of telecoms and broadcasting.
- The industry silos that underlie the titles of the Communications Act enshrine a “little picture” approach, where problems are solved piecemeal and in isolation.
- While lip service might be paid to diversity and innovation, regulatory capture by industry incumbents prevents competition. The desire for (illusory) control has traditionally seduced regulators into focusing on single rather than multiple solutions.
In practice, applying the principles to the FCC would mean:
- Flexibility. Use principles rather than rules. Ensure backstop powers are available if self-regulation fails. Rules, if used, should be technology- and business-model-neutral. Build in capacity to deal with the unexpected.
- Delegation. Intervene if players close to the action demonstrably fail to solve problems flagged by the regulator, or in legislation. Avoid ex ante action unless there is a high probability that a market failure will be locked in.
- Big Picture. Avoid silo-specific regulation. Develop an integrated understanding, based on in-house expertise, of the industry and its social context. Use scenario planning to prepare for contingencies such as entrenched market failure.
- Diversity. Don't entrench one solution through regulatory preference. Define markets broadly for competitive analysis.
It is difficult to predict how complex adaptive systems will respond to interventions. Unintended consequences are the rule, not the exception. Experimentation before deploying new rules can reduce the likelihood, or impact, of blunders.
The ability to try out new ideas in state and local jurisdictions is a useful feature of the United States’ federal system. However, new ways to conduct “safe to fail” experiments are needed because geographies are now more interlinked, and less isolated, than they used to be. System modeling made possible by the advent of cheap, fast computing provides a safe way to try out regulatory ideas.
Exploring the consequences of policy choices in simulation can identify which courses of action are most robust under a variety of possible outcomes, and can identify critical preconceptions and biases. It can identify policy choices that are brittle and work in only a narrow set of circumstances, thus leading to more resilient final measures.
Techniques developed for business modeling, social analysis, and long-term policy planning can be applied to the internet/web. For example, agent-based simulations of internet access in the US provide insight into the dynamics of network neutrality regulation. It would be instructive to explore whether the resulting system has multiple stable states, as one might expect from a complex adaptive system; if it does, increased vigilance regarding irreversible transition into a low-diversity content arrangement is called for. Once such a model is in place, one can extend it to do resilience analyses, and factor in political power.
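To make the technique concrete, here is a minimal agent-based sketch, assuming entirely invented agent rules and parameters: content providers compete for audience share, a handful of hypothetical providers buy priority delivery from an access network, and each run reports how many independent providers keep a viable audience. It illustrates the kind of simulation described above, not any actual model of the US access market.

```python
# Minimal agent-based sketch of network-neutrality dynamics.
# All agent rules and parameters are invented for illustration only.
import random

def run(seed, steps=200, n_content=50, priority_boost=1.3, priority_fee=True):
    """Return the fraction of independent content providers keeping a viable audience."""
    random.seed(seed)
    # Random initial audience shares, normalized to sum to 1.
    share = [random.random() for _ in range(n_content)]
    total = sum(share)
    share = [s / total for s in share]
    affiliated = set(range(5))  # hypothetical providers with paid priority delivery
    for _ in range(steps):
        # Perceived quality tracks audience share; priority delivery boosts it.
        quality = [s * (priority_boost if (priority_fee and i in affiliated) else 1.0)
                   for i, s in enumerate(share)]
        q_total = sum(quality)
        # Consumers drift toward providers with higher perceived quality.
        share = [0.9 * s + 0.1 * (q / q_total) for s, q in zip(share, quality)]
    survivors = sum(1 for i, s in enumerate(share)
                    if i not in affiliated and s > 0.005)  # viability threshold: 0.5% share
    return survivors / (n_content - len(affiliated))

if __name__ == "__main__":
    for fee in (False, True):
        runs = [run(seed, priority_fee=fee) for seed in range(10)]
        print(f"priority_fee={fee}: mean share of viable independents = "
              f"{sum(runs) / len(runs):.2f}")
```

Extending such a toy with entry, pricing, and regulatory levers is how one would probe for multiple stable states and test which rules remain robust across them.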
No single model is sufficient; indeed, a focus on a single correct but incomplete model generates long-term problems even while satisfying short-term objectives. Simulation – agent-based techniques as well as traditional econometric modeling – needs to become part of the standard way all parts of the organization make decisions.
Self-regulation
There is growing interest in the value of self- or co-regulation in the communication industry as part of a continuum of approaches ranging from no formal government action, through to full statutory regulation. Industry self-regulation can be more flexible and less costly for both business and consumers than direct government involvement. It is most likely to be effective where there is trust between government, industry, and consumers; enterprises recognize the importance of responsible behavior over the long term; and non-compliance by rogue companies can be contained.
Allowing or encouraging appropriate self-regulation is a way for the FCC to implement the Delegation principle. This is delegation, however, not abdication of responsibility: the Commission retains the responsibility for discharging its social mandates (consumer protection, economic vitality, etc.), but does not necessarily have to use regulation to do so.
Successfully applying self-regulation entails a capability in the organization to make informed judgments about whether the socio-economic context is conducive to effective self-regulation, and whether self-regulatory organizations are meeting their responsibilities. As with all the techniques discussed here, new capabilities have institutional consequences.
Transparency and Intelligibility
Visibility into the workings of a complex system reduces volatility and improves resilience. But how does one get timely information about an elaborate, rapidly-changing socio-economic system like the internet/web? Since funding for monitoring by regulators is limited, it is important to enable surveillance by civil society and the market itself. Limited capacity and visibility at the top of the control hierarchy can be complemented by distributing monitoring throughout the system.
It is important to have effective disclosure mandates on powerful players, such as firms with significant market power – and on the regulators themselves. Internet/web technology itself facilitates monitoring: it makes information more immediately accessible, and enables peer-to-peer disclosure and the pooling of information and knowledge. Pioneers like Zagat, Amazon Reviews, and Wikipedia are being joined by “vigilante transparency” organizations monitoring civil servants, campaign contributions, internet service providers, and even nannies.
One of the lessons of the sub-prime mortgage crisis is that a lack of intelligibility was more problematic than too little transparency. Nobody understood the ramifications of the financial instruments they were creating, managing or (not) regulating. While it’s true that eliminating complexity could stifle innovation, that’s a false choice; nobody would seriously propose to eliminate either complexity or innovation. What is needed is an accounting of intelligibility in the regulatory calculus.
Transparency/intelligibility need not be mandatory; companies should be able to choose obscurity. However, the choice of shrouding their activities could incur the cost of increased regulatory scrutiny – and perhaps higher expectations regarding performance against public interest mandates. For example, Comcast need not explain exactly how it manages its network; but if it chooses obscurity, it should face tougher network neutrality expectations. Microsoft need not disclose its interfaces; but if it chooses obscurity, it should face tougher anti-trust tests. Google need not explain how it uses DoubleClick data to improve ad click-throughs; but if it chooses obscurity, it should face tougher privacy protection requirements.
Data gathering and interpretation
All the above techniques depend on effective data gathering. The FCC cannot do all the work itself, but neither should it be entirely beholden to interested parties to a proceeding, who will necessarily furnish only information that advances their cause. The agency should foster a culture and capability of data collection and interpretation with the goal of learning, not bookkeeping. All these activities can be delegated in part, but in all of them the FCC should retain an in-house capability.
The data that the FCC currently reports is organized by the old-line industry silos, and largely ignores the structure of the 21st century market. There are data for common carrier traffic, competition in satellite services, the cable industry, and wireless – all given separately. Many vital current issues are not covered, such as: the assessment of the consumer value of auctioned spectrum; an inventory of the utilization of spectrum under both NTIA and FCC management; consumer data aggregation practices.
The FCC should also explore new ways to inform its decisions and act on behalf of citizens, such as opinion polling. This technique is often used by other regulators, e.g. Ofcom, but apparently not by the FCC. The Commission will always be catching up with data collection requirements, since the industry changes so quickly. An institutional habit of identifying new data needs and fulfilling them is just as important as high quality reporting against current mandates.
Once data has been collected, it needs to be interpreted. There are many interpretation challenges, not least the flood of short comments in some proceedings, a flood facilitated by laudable efforts to involve the public, such as filing via email and accepting brief comments. For example, in 2006 there were 10,831 one- or two-page filings (excluding ex partes) on FCC Docket 06-74, the AT&T/BellSouth merger; essentially all of them came from individual citizens. That was 18% of all the filings in that year (59,081). By comparison, 2004 saw a total of 25,480 filings.
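As a trivial example of the arithmetic involved, the figures quoted above can be checked directly; the numbers below are the ones given in the text, and nothing else is assumed.

```python
# Back-of-the-envelope check of the filing figures quoted above.
short_filings_2006 = 10_831   # one- or two-page filings on Docket 06-74 (excl. ex partes)
total_filings_2006 = 59_081   # all filings in 2006
total_filings_2004 = 25_480   # all filings in 2004

share_short = short_filings_2006 / total_filings_2006
growth = total_filings_2006 / total_filings_2004

print(f"Short 06-74 filings as share of all 2006 filings: {share_short:.1%}")  # ~18.3%
print(f"Growth in total filings, 2004 to 2006: {growth:.1f}x")                 # ~2.3x
```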
The flood of filings is part of a larger challenge: managing multiple constituencies. Not only has the number of industries with an interest in the communication business grown beyond telecommunications and TV broadcasting to include internet content providers, software companies and new broadcasting media, but the public has become increasingly engaged.
The FCC needs to rethink how it can involve a wider community in data collection and analysis. This community includes interested citizens, industry, research vendors, and think tanks. The goal should be to improve the speed and quality of both data collection and interpretation by opening up the process to commercial interests and citizen-consumers.
Further reading
Adaptive systems
Yorque, Ralf, Brian Walker, C S Holling, Lance H Gunderson, Carl Folke, Stephen R Carpenter, and William A Brock, “Toward an Integrative Synthesis”, Ch. 16 in Gunderson, Lance H and C S Holling, Panarchy: Understanding transformations in human and natural systems, Island Press (2002)
Complexity theory and governance
Schneider, Volker and Johannes M. Bauer, “Governance: Prospects of Complexity Theory in Revisiting System Theory”, presented at the annual meeting of the Midwest Political Science Association, Chicago, Illinois, 14 April 2007. Available: http://www.uni-konstanz.de/FuF/Verwiss/Schneider/ePapers/MPSA2007Paper_vs_jmb.pdf
Self-regulation
De Vries, Jean Pierre (2008), "Internet Governance as Forestry: Deriving Policy Principles from Managed Complex Adaptive Systems", TPRC 2008. Available: http://ssrn.com/abstract=1229482.
Ofcom (2008) Principles for analysing self- and co-regulation: Statement. 10 December 2008. Available: http://www.ofcom.org.uk/consult/condocs/coregulation/statement/
Simulation
Weiser, Philip J. (2008) “Exploring Self Regulatory Strategies for Network Management”, Flatirons Summit on Information Policy, 9-10 June, 2008. 25 August 2008. Available: http://www.silicon-flatirons.org/documents/publications/summits/WeiserNetworkManagement.pdf
Sterman, John D. (2002) “All models are wrong: reflections on becoming a systems scientist: The Jay Wright Forrester Prize Lecture.” System Dynamics Review Vol. 18, No. 4, (Winter 2002): 501–531. Available: http://web.mit.edu/jsterman/www/All_Models_Are_Wrong_(SDR).pdf
Transparency
Lempert, Robert J., Steven W. Popper, and Steven C. Bankes (2003) “Shaping the Next One Hundred Years: New Methods for Quantitative, Long-Term Policy Analysis” RAND Report MR-1626-RPC, 2003. Available: http://www.rand.org/pubs/monograph_reports/MR1626/index.html
Bauer, Johannes M. (2007) “Dynamic effects of network neutrality,” International Journal of Communication, 1: 531-547. Available: http://ijoc.org/ojs/index.php/ijoc/article/view/156/79
Fung, Archon, Mary Graham, David Weil, and Elena Fagotto (2006) "The Effectiveness of Regulatory Disclosure Policies," Journal of Policy Analysis and Management, v. 25, no. 1 (Winter 2006). Available: http://www.hks.harvard.edu/taubmancenter/transparency/downloads/jpam06.pdf
Saturday, December 13, 2008
And it started so small: News-related Etymology
According to various sources at dictionary.com, the word "bribe" is Middle English, from an Old French word for a piece of bread given as alms. The shift of meaning to "gift given to influence corruptly" is first attested in 1535. The original sense is said to come from the base bri(m)b-, denoting something small.
(My apologies to Gov. Blagojevich for contributing to the burial of the principle of "innocent until proven guilty" in America. Talk about trial and conviction by public opinion...)
Wednesday, December 10, 2008
Reforming the FCC: Etiology
In Reforming the FCC: Diagnosis, I argued that the FCC must these days supervise a much more complex and adaptive situation. This post examines the causes of this situation. I’ll consider responses in a subsequent post.
As an exercise, consider whether the following attributes apply to 21st century communications (which I’ll also refer to as ICT, for lack of a better term): the absence of a global controller; nested, hierarchical organization; dispersed interactions; never-ending novelty; constant selection among candidate solutions; and rapid adaptation to new circumstances. I believe they definitely describe the internet/web – and they are much less applicable to the silo’d world of telecommunications and analog broadcasting of only a few decades ago.
These attributes are the hallmarks of complexity and adaptive, non-linear systems. 21st Century communications is a complex adaptive social system, but the FCC was set up to manage a 20th century industry which was complicated but not complex. This is the deep reason why the institution needs to change.
The adaptive cycle
A key attribute of complex adaptive systems is that they cycle through distinct stages. I’ll describe it here using the example of ecosystems (where it was introduced) before turning to ICT.
During the growth stage, there is rapid colonization of recently disturbed areas, for example after a fire or wind storm has removed large amounts of biomass in a forest. The connectedness between organisms is low, which leads to high resilience; the loss of one species doesn’t lead to the loss of another. As the forest matures, it moves into the maturity phase of the cycle, which is dominated by the accumulation of material. The network of connections between biomass and nutrients becomes increasingly tight, and fragile; every niche in the forest is filled, and every resource is used. Organisms become much more interdependent; food chains become dense and interconnected. The maturity phase is followed by a dramatic release, triggered in a forest by fire, drought, insect pests, etc. A lot of energy is unbound, and networks are broken up. This sets the scene for the fourth phase, reorganization: opportunistic species that have been suppressed by the stable configuration of the maturity phase move in. This is a time of innovation and restructuring, laying the groundwork for a return to another growth phase.
The behavior of managed ecosystems is shaped by three properties: the accumulation of potential, the degree of connectedness between elements, and the resilience of the system in the face of shocks. The same properties apply to complex human enterprises like modern communications.
The adaptive cycle alternates periods of gradual accumulation of potential (e.g. biomass, socio-economic capital or know-how, depending on the kind of system) with sudden and often unexpected disruptions that reorganize that potential. Connectedness is high at maturity, but that is also the time when resilience to shocks is at its lowest. This cycle of aggregation followed by restructuring leads to innovation; but the release phase is often a surprise, and frequently an unpleasant one for those who were successful in the maturity phase. It is thus often experienced as a crisis.
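For readers who prefer to see the cycle in numbers, here is a toy caricature with entirely made-up update rules: potential and connectedness accumulate slowly, resilience is assumed to fall as connectedness rises, and any shock that exceeds the current resilience triggers release and reorganization.

```python
# Toy caricature of the adaptive cycle. The update rules and constants are
# invented solely to illustrate the qualitative story told in the text.
import random

random.seed(1)
potential, connectedness = 0.1, 0.1
for t in range(40):
    # Growth and maturity: slow, saturating accumulation.
    potential += 0.08 * (1.0 - potential)
    connectedness += 0.06 * (1.0 - connectedness)
    # Tightly connected systems are assumed to be less resilient to shocks.
    resilience = 1.0 - connectedness
    shock = random.random() * 0.5
    phase = "growth" if potential < 0.6 else "maturity"
    if shock > resilience:
        phase = "release -> reorganization"
        # Release dissipates most accumulated potential and breaks up networks;
        # a remnant seeds the next growth phase.
        potential *= 0.2
        connectedness *= 0.1
    print(f"t={t:02d}  {phase:26s} potential={potential:.2f} "
          f"connectedness={connectedness:.2f} resilience={resilience:.2f}")
```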
Decision Environments
One can recognize the phases of the adaptive cycle in the internet/web, and in the larger system of communications governance. It is helpful to parse the system into four decision environments that represent different hierarchical layers:
- Political system: local, state and federal politicians seeking to advance their causes
- Inter-organizational system: peer agencies with partially overlapping responsibilities, such as the FCC, FTC and NTIA
- Organizational system: an agency, in our case the FCC, acting on its “subject” layer, and other organizations, in a context provided by the political systems
- Market/culture system: companies and citizen/consumers using technology (goods and services) to achieve their various ends, often at odds with each other and other levels of system
Each layer is at a different point in the adaptive cycle:
- Political: The political system went through a release phase with the 2008 election, and will spend 2009 in reorganization as players who have been out of office for eight years move into newly opened positions of power (cf. ecological niches), bringing new perspectives with them.
- Inter-organizational: The new Administration will necessarily bring changes at the top of the FTC and NTIA as well, but the consequences may not be as dramatic as those at the FCC, providing some stability at this layer.
- Organizational: The FCC is due for “release” with the appointment of new Commissioners and Chairman in 2009. There is anecdotal evidence that large-scale departures of long-serving career staff in the last couple of years represent a release in themselves, with the breakup of long-standing networks of expertise and the dissipation of institutional knowledge.
- Market/culture: The productive parts of the communication system are in or near maturity. Traditional content industries like news, music publishing and TV are at maturity, and some are entering release. Telecoms went through a re-organization following the Telecommunications Act of 1996, and is now in a growth stage, judging by the consolidation of AT&T and Verizon. Similarly, the disruptive market-oriented allocation of spectrum through auctions has been absorbed, and there are signs of maturity in the concentration of spectrum in a few hands. There are still pockets of reorganization left over from the last cycle, e.g. cable taking voice share from wireline telcos, and telcos threatening cable’s video business. For all the hype, the PC/internet/web subsystem is well along in the growth phase and nearing maturity (e.g. Microsoft, Cisco, Google). Consumer habits have adapted to the internet and the web, and have become mature.
Another hallmark of complex adaptive systems – and one of the hardest challenges for a regulator – is unexpected novelty. Changes in the state of a complex system are usually unexpected, in part because many dynamics are invisible. Surprises are particularly noticeable when they lead to release.
Here are some recent reminders that the innovation that we expect from complex systems usually comes as a surprise:
- Digital satellite radio was expected to compete with traditional radio, only to be swamped by the iPod
- Digital video as an alternative to broadcast TV came to prominence as low-quality, user-originated content on YouTube, rather than as high quality Video on Demand via cable or DSL
- The explosion of Wi-Fi (and CDMA cellular telephony) was the consequence of esoteric decisions about unlicensed rules by the FCC in the mid-1980s
- The collapse of music publishing – the industry lost a third of its revenues between 1999 and 2006
- The eclipse of commercial encyclopedias by user-produced content on Wikipedia
Many surprises come from contagion between problem domains that were previously considered distinct. XM/Sirius’s problems came at the intersection of personal computing devices with broadcasting; music publishing’s crisis arose from software and networking innovations that led to the P2P distribution of digital content; and the open source software movement informed Wikipedia.
Problem scope
A consequence of interlocking decision environments and intersecting problem domains is that the unit of analysis for the FCC is no longer a distinct, largely independent, well-defined industry associated with a particular technology and its own Title in the Communications Act.
Attention needs to shift from industries to problem domains, and away from solutions for a particular industry, technology and even institution or statute. For example, a policy imperative like lawful intercept is no longer limited to telephony, which leads to conflicts such as the competing definitions of information services in CALEA and the Communications Act. This is an example of the importance of the Big Picture principle for managing adaptive systems. (I’ll review this principle and its three companions – Delegation, Flexibility and Diversity – in the next post.)
However, simply broadening some existing statute to cover all new possibilities is counter-productive. It conflicts with the other three principles, and falls victim to the fallacy that narrow-minded control of a single variable leads to a healthy outcome; in adaptive systems, it leads eventually to an even worse crisis.
In conclusion, the FCC is really facing a system problem, not an institutional one. Even if today’s procedural problems within the Commission were completely solved, it would not address the challenges of a qualitatively more complex and unpredictable regulation “subject”, that is, the market/culture system where innovation and growth take place. Nor would it speak to the problems faced at the political level, where the social acceleration of time poses existential challenges to the rule of law, market capitalism, and liberal democracy, and profoundly complicates the separation of powers between the legislature, executive, and judiciary.
I’ll turn to the question of how the FCC should respond in the next post.
Further reading
The adaptive cycle:
Holling, C S, Lance H Gunderson and Donald Ludwig, “In Quest of a Theory of Adaptive Change”, Ch. 1 in Gunderson, Lance H and C S Holling, Panarchy: Understanding Transformations in Human and Natural Systems, Island Press (2002). PDF
Decision environments and the challenges individuals face in managing adaptive systems:
Ten Conclusions from the Resilience Project
Westley, Frances, “The Devil in the Dynamics: Adaptive Management on the Front Line”, Ch. 13 in Gunderson, Lance H and C S Holling, Panarchy: Understanding Transformations in Human and Natural Systems, Island Press (2002)
A discussion of the intersection between system resilience, the rule of law, and Scheuerman’s notion of the social acceleration of time:
Cherry, Barbara A (2008), “Institutional Governance for Essential Industries Under Complexity: Providing Resilience Within the Rule of Law”, CommLaw Conspectus (forthcoming)
An account of the early history of civil spread spectrum:
Early Civil Spread Spectrum History, Mike Marcus web site
Collapse of the music industry:
theweek.com
Growth of cable voice traffic:
economist.com
redorbit.com
gigaom.com
Wednesday, December 03, 2008
Etymology for the day: "larceny"
According to the American Heritage Dictionary via dictionary.com, "larceny" is a Middle English word, from Anglo-Norman larcin, theft, from Latin latrōcinium, robbery, from latrō, robber, mercenary, ultimately from Greek latron, pay, hire.
And the Spanish for a robber or thief is ladrón.
Mercenaries have never had a good rep. . .
Tuesday, December 02, 2008
Reforming the FCC: Diagnosis
The financial crisis has called into question how markets are regulated; calls for reforming the FCC have been growing louder for some years. The legal/regulatory shortcomings of the FCC are a topic of frequent conversation (e.g., GMU Sep 08, PK/Silicon Flatirons Jan 09). It is therefore instructive to ask why it has ended up in this situation. Some of the problems are due to the personalities and politics of the moment, and are thus transitory. Some are due to its terms of operation; the FCC’s structure and mission are determined by the Communications Act, and won’t change fundamentally unless the Act changes. The deeper cause, which most interests me, is a change in the nature of what is being regulated: the transformation of the communications business from telecoms+broadcasting to the internet.
Since the mid-90s, the computer, information and communication services have come to dwarf telecommunications services. For example, the graphic at the top of this post charts the service exports of the OECD countries according to the OECD Communications Outlook 2007 (p. 256). This was not only a quantitative change; computing brought a qualitative change. The internet/web is modular, decentralized, self-organizing, adaptive and diverse on a fundamentally different scale to telecommunications (Internet Governance as Forestry). These are all characteristics that distinguish complex systems from merely complicated ones.
An analogy may help: the FCC in the telecoms era was like a farmer managing agricultural production; today it is like a ranger responsible for a wilderness. A farmer can decide which crops to cultivate, where to plant them, and when to rotate – though the plants do the work of converting sunlight to carbohydrate, and the animals convert food to meat. Some inputs, like weather and market conditions, are unpredictable, but many – irrigation, fertilizer, seed type, antibiotics – are under the farmer’s control. (And even weather and market risk is mitigated by massive government subsidies for major crops.) The desired output is well-defined and easily measurable. Rangers, on the other hand, have to deal with a very different balance of power and responsibility. They have to protect endangered species, prevent catastrophic fires, and provide access to citizens, but have little or no control over the animals and plants in the ecosystem, or the inputs in the form of weather, migrating animals, or pests.
This limited control implies that detailed, rule-based regulation is no longer sustainable. An approach based on principles, supported by tools such as transparency and computer simulation, is the only viable strategy. Rules can determine which crop hybrid to use for a particular market need given climate and soil type; but principles – such as flexibility, taking a big picture view, fostering diversity, and delegating responsibility – are unavoidable when managing an ecosystem.
In a New Yorker article about the financial crisis, James Surowiecki uses a sport analogy to explain the difference between principles and rules:
It’s something like the difference between football and soccer. Football, like most American sports, is heavily rule-bound. There’s an elaborate rulebook that sharply limits what players can and can’t do (down to where they have to stand on the field), and its dictates are followed with great care. Soccer is a more principles-based game. There are fewer rules, and the referee is given far more authority than officials in most American sports to interpret them and to shape game play and outcomes. For instance, a soccer referee keeps the game time, and at game’s end has the discretion to add as many or as few minutes of extra time as he deems necessary. There’s also less obsession with precision—players making a free kick or throw-in don’t have to pinpoint exactly where it should be taken from. As long as it’s in the general vicinity of the right spot, it’s O.K.
--- James Surowiecki, Parsing Paulson, The New Yorker, 2 Dec 2008
Pursuing this metaphor, the FCC is not only the referee of a football game, it also makes the rules – often as the game goes along.
I’ll suggest some possible ways for a new FCC to manage the new communications business in an upcoming post. However, a caveat: The ICT business hasn’t had a crisis of melt-down proportions, as finance has had, to concentrate the mind. It remains to be seen how the change in power in DC will affect this process. Some of the loudest calls for change at the FCC have come from the Right, arguing that the FCC regulates too much and too intrusively; the Left has chimed in, arguing that it regulates too ineffectively. With Democrats now in control of both Congress and the Administration, and the GOP in some disarray, the pressure to reform the FCC may well abate; calls for its abolition will certainly have less resonance.
Newton, Leibnitz and the (non?)existence of spectrum
My argument that spectrum doesn’t exist [see Note below for a recap] has deeper philosophical roots than I’d realized.
As I learned in a lecture on the Philosopher’s Zone podcast, my contention parallels an argument between Leibnitz and Newton about whether space and time have an independent existence. Here’s Graham Priest, Professor of Philosophy at Melbourne:
“In the 17th century, there was a famous debate between Newton and Leibniz about the nature of space and time. And this was the question they focused on: Could you pick up the whole physical cosmos and move it five miles to the east and put it down? Or, could everything happen exactly the same way it does now, except all half an hour later? Newton said Yes to both those questions; Leibniz said No. Because Newton said space and time are kind of like buckets and it makes sense to suppose that the buckets have a certain kind of reality, and the contents just hang in there. So space is something, it will be there even if there were no events in space and time. And time likewise. So space and time have a certain kind of self existence, they don't depend on anything.
“Leibniz said this: No, it doesn't make any sense to suppose that you could lift everything up and move it five miles to the east, or that things could have started five minutes later. Because nothing would really have changed. Space and time aren't kind of big buckets that you put things in. Space and time are nothing more than the internal relationships between the events that happen in space and time. So if all the relationships, the befores and afters, the lefts and rights, if all those are the same, then nothing's changed. So if you moved everything supposing you could, five miles to the east, all the spatial relationships would have remained the same. So nothing would have changed. So if you have this view of the nature of space and time, then space and time do not have the same kind of self existence, ontological existence that they have for Newton.
“In the jargon of Buddhist philosophy, Newton thought that space and time had self existence. They were there independently of anything else. But for Leibniz they did not, they were just things in a system of relationships which kind of support each other by a system of relationships.”
Source: Why Asian philosophy? (podcast and transcript), The Philosopher’s Zone, 18 October 2008, a program of ABC Radio National.
Wikipedia’s article on the philosophy of space and time includes a paragraph on the Leibnitz/Newton debate. The question of whether or not time and space exist independently of the mind leads, in this case, to either the “absolutist” position that they are real objects themselves, or the “relationalist” position that they are merely orderings upon actual objects. Wikipedia summarizes the positions as follows:
Leibniz describes a space that exists only as a relation between objects, and which has no existence apart from the existence of those objects. Motion exists only as a relation between those objects. Newtonian space provided the absolute frame of reference within which objects can have motion. In Newton’s system the frame of reference exists independently of the objects which are contained in it. These objects can be described as moving in relation to space itself.
This notion of space and time as big buckets that you put things in reminds me of the dominant metaphor for spectrum: it’s like space, and often a container. (Blog post: De-situating Spectrum: Non-spatial metaphors for wireless communication. Papers: Imagining Radio: Mental Models of Wireless Communication; De-situating spectrum: Rethinking radio policy using non-spatial metaphors.)
When it comes to spectrum, I’m a relationalist. Frequency has no existence apart from the existence of electromagnetic radiation, and therefore there is no spectrum resource separable from the operation of radio systems.
Note: Why “spectrum doesn’t exist”
I contend that the term “spectrum” as commonly used doesn’t have any meaningful referent. It typically occurs as a synonym for frequency or a frequency band, as in “the service is in UHF spectrum”. However, frequency is simply a marker; spectrum aka frequency has no existence independent of the measured radiation. It therefore doesn’t make sense to talk of a “spectrum resource”.
Use of the term in the dictionary sense, to refer to a distribution of electromagnetic radiation (as in “the spectrum of that transmitter”) is rare. In this sense a spectrum exists when a radio transmitter is on, but not when it’s off. Again, “spectrum as a resource” isn’t meaningful.
The only arguably meaningful definition is that “spectrum” means “wireless operation”. This is the only way I can make sense of a term like “spectrum license”; it’s a license to operate a radio in a given way. However, few people use the word spectrum in a way that suggests they have “wireless operation” in mind as a synonym – though they may end up with this definition when pressed.
Friday, November 28, 2008
The Finance Goose and the Telecoms Gander
I’ve been trying to think through the analogies between finance and ICT (aka telecoms [1]) in the hope of gleaning insights about ICT regulation from the market melt-down. (Recent posts: From transparency to intelligibility in regulating finance, Lessons for communications regulation from banking complexity, More on Intelligibility vs. Transparency.)
While finance and ICT are both complex adaptive systems, there are some deep differences [2] – deep enough that “Re-regulate, Baby!” thinking about financial markets shouldn’t automatically include ICT. In other words: while self-regulation on Wall Street may be anathema in the current climate, it should still be on the menu for ICT.
Money is a core commodity
The dotcom bust of 2001 was severe, but pales in comparison to the Savings & Loan debacle, let alone the current crisis. The ICT business doesn’t have the societal or financial leverage to drive meltdowns that rock society. Finance is about making money with money, and money drives the economy. Money is the ultimate commodity; when you can’t get money, nothing works.
ICT is not (yet?) so central. Information, for all the talk about bits and bytes, is not really a commodity. A dollar is a dollar is a dollar, but a brother could be a sibling, a comrade in arms, or any human being, depending on the context [3]. Distortions of information transfer, whether in transport or content, are therefore not as leveraged as bottlenecks in the money supply.
Information flows are undoubtedly important, and their interruption would cause disruption. For example, cargo ships need to provide electronic manifests to their destination ports 24 hours before arrival in the US. If this data flow were blocked, the movement of goods would stop. However, this is a point failure; it isn’t obvious to me how something like this could cause a cascade of failures, as we saw when banks stopped lending to each other [4].
Leveraged Intangibles
Finance is more leveraged than ICT. It’s more abstract, not least because money is a more “pure” commodity than information; that is, it’s more generic. The sub-prime crisis was the collapse of a tower of derivatives: loans were bundled into CDOs, which were then re-bundled into CDOs, and again, and again. There was double leverage. First, the obvious leverage of betting with borrowed money; second, the recursive bundling of financial instruments to magnify even those inflated returns. The tower of derivatives was possible because its bricks were intangible; failure only came when the conceptual opacity of the structure overwhelmed our individual and institutional comprehension, rather than when its weight exceeded the compressive strength of its materials.
ICT also has its fair share of intangibles; many problems of large-scale software development are due to the opacity of boundless webs and towers of abstractions. However, the recursiveness is not as thoroughgoing, at least at the lower levels of the stack. The risk of network infrastructure companies misusing their market power in interconnection, say, is limited by the fact that no part of the network is many abstraction steps away from tangible wires and routers.
The risks do become greater in the higher network layers, such as applications and content. Software carries more and more value here; even though the bits and MIPS live in data centers somewhere, complex layers of abstract processing can create unexpected risks. One example is the way in which personally identifiable information doesn't have to be a name and address: when sufficiently many seemingly random items can be aggregated, someone becomes uniquely identifiable even though they didn't provide their name.
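A minimal Python sketch of that aggregation risk – the records, attribute names and values are invented for illustration, standing in for the kind of seemingly innocuous items an ad network might log. Each field on its own looks harmless, but the combination can shrink someone’s anonymity set to one:

```python
# Sketch: none of these fields is a name, yet combining them can single out one person.
# The records and attribute names are invented for illustration.
from collections import Counter

records = [
    {"zip": "98103", "birth_year": 1971, "gender": "F", "browser": "Firefox"},
    {"zip": "98103", "birth_year": 1971, "gender": "F", "browser": "Safari"},
    {"zip": "98103", "birth_year": 1984, "gender": "M", "browser": "Firefox"},
    {"zip": "98052", "birth_year": 1971, "gender": "F", "browser": "Firefox"},
]

def anonymity_set_sizes(records, quasi_identifiers):
    """How many records share each combination of the chosen attributes?"""
    return Counter(tuple(r[k] for k in quasi_identifiers) for r in records)

# Each attribute alone leaves people in large groups...
print(anonymity_set_sizes(records, ["zip"]))

# ...but the combination leaves some people in an anonymity set of one.
sizes = anonymity_set_sizes(records, ["zip", "birth_year", "gender", "browser"])
unique = [combo for combo, n in sizes.items() if n == 1]
print(f"{len(unique)} of {len(records)} records are uniquely identifiable")
```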
Subjectivity
The finance business is shot through with unquantifiable and unpredictable subjectivity, notably trust, greed, and panic. Of course, all businesses including ICT rely on trust, etc. However, in finance subjective assessments drive minute-by-minute market decisions. When banks lost faith that their counter-parties would still be solvent the next morning, all were sucked down a maelstrom of mutual distrust. Businesses all try to quantify trust, but it’s a fragile thing, particularly when assets are intangible and unintelligible. Investors thought they could depend on ratings agencies to measure risk; when it turned out that they couldn’t due to the agencies’ conflicts of interest, the downward spiral started (and was accelerated by leverage).
The ICT business, at least at the lower transport levels, is much less dependent on subjective assessments. One can measure up-time and packet loss objectively. Things are less sure at the content layers, as can be seen in the rumblings about the incidence of click fraud, and whether the click-through accounting of search engine operators can be trusted; so far, though, there’s been no evidence of a rickety tower of dubious reputation.
Conclusions
Finance today arguably needs more supervision because of the wide ramifications of unfettered greed, fear or stupidity. The impacts are so large because of the amplifying effects of leverage and intangibility; the risk is greater because of the opacity due to unintelligibility.
ICT also has leverage, intangibility and opacity, but not at the same scale. Therefore, objections to delegated regulation in finance do not transfer automatically to ICT.
Counter-intuitively (to me, at least), the parts of the ICT business that are most defensible against claims for re-regulation are those that have a great deal of physical infrastructure. The more software-intensive parts of the cloud are most vulnerable to analogies with the runaway risks we’ve seen in financial markets.
Notes
[1] While telecoms is an easy old word that everybody knows, it really doesn’t capture the present situation. It connotes old technologies like telephony, ignores the media, and misses the importance of computing and software. There is as yet no better, commonly used term, and so I’ll reluctantly use the acronym ICT (Information and Communication Technologies). ICT is about business and policy as well as technology, but it’s a little more familiar, and shorter, than “connected computing”, my other preferred term.
[2] Jonathan Sallet observes that the financial crisis derives from market externalities that put all of society at risk (personal communication, 26 Nov 2008). The very large scope of this risk can be used to justify government intervention. We’re hoping to combine our thinking in an upcoming note.
[3] I’m toying with the notion of doing a metaphor analysis of information. At first sight, the discourse seems to be driven by an Information Is a Fluid analogy; it’s a substance of which one can have more or less. This metaphor is both pervasive and open to criticism. Reddy introduced the Communication Is a Conduit metaphor for knowledge transfer; this is related to Lakoff’s Ideas Are Objects. See here for his critique, and citation of his paper.
[4] Just because I can’t see a cascade doesn’t mean it isn’t there, of course; it may just be my uninformed and uninspired imagination. Network security analysts have, I’m sure, constructed many nightmare scenarios. The weakness of my analysis here is that my argument for the implausibility of a meltdown rests in part on the fact that it hasn’t happened – yet. The 9/11 fallacy. . . .
Friday, November 21, 2008
More on Intelligibility vs. Transparency
A commentary by Richard Thaler and Cass Sunstein, the co-authors of Nudge, also notes that the growing complexity of the financial world needs more attention; cf. my recent post Lessons for communications regulation from banking complexity.
Not so long ago, most mortgages were of the 30-year fixed-rate variety. Shopping was simple: find the lowest monthly payment. Now they come in countless forms. Even experts have trouble comparing them and a low initial monthly payment can be a misleading guide to total costs (and risks). A main cause of the mortgage crisis is that borrowers did not understand the terms of their loans. Even those who tried to read the fine print felt their eyes glazing over, especially after their mortgage broker assured them that they had an amazing deal.
Yet growing complexity on the borrowers’ side was trivial compared with what was going on at the banks. Mortgages used to be held by the banks that initiated the loans. Now they are sliced into mortgage-backed securities, which include arcane derivative products.
--- Human frailty caused this crisis, Financial Times, 11 November 2008. Thanks to Andrew Sterling for the link.
Thaler & Sunstein conclude that regulators therefore need to help people manage complexity and resist temptation. They reject the option of requiring simplicity, on the grounds that this would stifle innovation, and recommend improving disclosure instead.
I’ve been thinking about bounded rationality for some time; see the Hard Intangibles thread. It’s one of the fundamental challenges of managing complex adaptive systems. I, like many others, recommended disclosure (aka transparency) as a key tool for internet governance; see e.g. my Internet Governance as Forestry paper).
However, the more I think about transparency, the more skeptical I become. I’ve concluded that in finance, at least, the problem isn’t disclosure but intelligibility; see e.g. my post From transparency to intelligibility in regulating finance. While it’s true that eliminating complexity could stifle innovation, that’s a false choice; nobody’s proposing to eliminate either complexity or innovation. It’s our infatuation with novelty, as well as our greed, that got us into this problem, and we have to manage our urges in both respects.
I suspect that one can make the intelligibility argument just as well for computing & communications as for finance – though the lack of a Comms Chernobyl will make it harder to sell the idea in that industry.
Transparency/intelligibility need not be mandatory; companies should be able to choose obscurity. However, those that choose to keep their activities shrouded should bear the cost of increased regulatory scrutiny – and perhaps higher expectations regarding performance against public interest mandates. Comcast need not explain exactly how it manages its network; but if it chooses obscurity, it should face tougher network neutrality expectations. Microsoft does not have to disclose its interfaces; but if it chooses obscurity, it should face a tougher anti-trust test. Google need not explain how it uses DoubleClick data to improve ad click-throughs; but if it chooses obscurity, it should face tougher privacy protection requirements.
My “transparency to intelligibility” post proposed algorithmic complexity as a way to measure intelligibility. That’s not the only method. Another (prompted by the financial crisis, and teetering software stacks) is an abstraction ladder: the more steps between a derivative and its underlying asset, the higher it is on the abstraction ladder, and the less intelligible and more risky it should be deemed to be. In computing & communications as in finance, the abstraction ladder counts the number of rungs up from atoms. The networking stack is an example: from wires in the ground one climbs up to links, networks, sessions, applications. On the premise that atoms are easier to observe than bits, and that piling up inscrutable and unstable combinations is easier the higher you go, services at higher layers will be subject to closer regulatory scrutiny, other things (like market concentration) being equal.
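A toy version of the ladder, to show how it might be operationalized. The layer list, the example services and the scrutiny thresholds are all assumptions of mine, not anything the Commission uses; the point is only that rung-counting can be made mechanical:

```python
# Toy "abstraction ladder": count the rungs between a service and physical
# infrastructure, and let the count set a presumptive level of scrutiny.
# The ladder, the example services and the thresholds are illustrative assumptions.
LADDER = ["wires", "links", "networks", "sessions", "applications", "aggregated data"]

def rungs_from_atoms(layer: str) -> int:
    """How many abstraction steps above tangible infrastructure a layer sits."""
    return LADDER.index(layer)

def presumptive_scrutiny(layer: str) -> str:
    rungs = rungs_from_atoms(layer)
    if rungs <= 1:
        return "light"
    if rungs <= 3:
        return "moderate"
    return "close"

for service, layer in [("metro fiber operator", "links"),
                       ("ISP traffic management", "networks"),
                       ("behavioral ad targeting", "aggregated data")]:
    print(f"{service}: {rungs_from_atoms(layer)} rungs up -> "
          f"{presumptive_scrutiny(layer)} scrutiny")
```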
Monday, November 17, 2008
Lessons for communications regulation from banking complexity
A New Scientist story on Why the financial system is like an ecosystem (Debora Mackenzie, 22 October 2008) traces how the science of complexity might prevent future breakdowns of the world’s financial system.
The lessons apply to communications regulation, too. Both finance and the ICT business (“Information & Computer Technology”) are complex systems. The recommendations in the article resonate with the conclusions I came to in my paper Internet Governance as Forestry. This post explores some of the resonances.
New Scientist observes:
“Existing economic policies are based on the theory that the economic world is made up of a series of simple, largely separate transaction-based markets. This misses the fact that all these transactions affect each other, complexity researchers say. Instead, they see the global financial system as a network of complex interrelationships, like an electrical power grid or an ecosystem such as a pond or swamp.”
Consequently, the accumulation of small, slow changes can trigger a sudden crisis. Johan Rockström of the Stockholm Environment Institute is quoted as saying,
"Slow changes have been accumulating for years, such as levels of indebtedness. None on their own seemed big enough to trigger a response. But then you get a trigger - one investment bank falls - and the whole system can then flip into an alternative stable state, with different rules, such as mistrust."
This is reminiscent of my Big Picture principle, which can be summarized as “take a broad view of the problem and solution space; prefer generic to sector-, technology-, or industry-specific legislation.”
The question for communications regulation is whether phase changes such as those we’ve seen in finance and ecosystems have occurred, or could occur in the future. Other than the periodic consolidation and break-up of telecom monopolies, and the vertical integration of the cable and media businesses in the 80s, conclusive evidence of big phase transitions in communications is hard to find. Is there currently a slow accumulation of small changes which will lead to a big shift? There are two obvious candidates: the erosion of network neutrality, and growth of personal information bases (cf. behavioral advertising, Phorm, more).
The New Scientist article suggests that unremarked linkages, such as the increase in cross-border investments since 1995, allowed the collapse of the US real estate market to reverberate around the world. The most obvious linkage in communications is “convergence”, the use of the same underlying technology to provide a myriad of services. Common technology facilitates commercial consolidation in infrastructure equipment (e.g. Cisco routers), tools (e.g. Microsoft’s web browser), and services (e.g. Google advertising). Convergence ties together areas of regulation that used to be distinct. For example, TV programs are distributed through broadcasting, cable, podcasts, and mobile phones; how should one ensure access for the disabled in this situation? There are also links from one network layer to another, as internet pipe providers use Phorm-like technologies to track which web sites their users visit.
Increased connectivity makes the financial system less diverse and more vulnerable to dramatic shifts. “The source of the current problems is ignoring interdependence," according to Yaneer Bar-Yam, head of the New England Complex Systems Institute in Cambridge, Massachusetts. Telecoms convergence creates a similar risk, with substantial horizontal concentration: Cisco has 60% market share in core routers, Internet Explorer holds 70% web browser share, and Google has 60% search share and 70% online advertising share. While modularity and thus substitutability of parts in the internet/web may limit this concentration, it needs to be carefully monitored, as captured by my Diversity principle: “Allow and support multiple solutions to policy problems; encourage competition and market entry.” Integration is a successful strategy (cf. Apple) that some find disconcerting (cf. Zittrain); it is likely to become more pervasive as the industry matures.
Diversity allows ecosystems to remain resilient as conditions change. In the quest to achieve these results, regulators have to be careful to avoid rigidity, a temptation because the financial system is so fluid. Here’s Bar-Yam again, from the New Scientist article: “Governments will have to be very careful, and set rules and limits for the system without actually telling people what to do.” To manage this risk in the comms context, I proposed the principles of Delegation (most problems should be solved by the market and society, not by government; government's role is to provide proper incentives and guidance, and to intervene to solve critical shortcomings) and Flexibility (determine ends, not means; describe and justify the outcomes sought, not the methods to be used to achieve them).
The article closes by quoting Bar-Yam: “At its core the science of complex systems is about collective behaviour.” He goes on to say that economic policy has so far failed to take into account the complexity and consequent unpredictability of such behavior, and calls for the use of testable models. This will be important in communications regulation, too. Simulations of the internet/web can help to improve policy makers’ intuition about unpredictable systems with many variables. Exploring the consequences of policy choices in simulation can identify which courses of action are most robust under a variety of possible outcomes. It’s the 21st Century version of letting states and regions experiment with regulation, which is eventually pre-empted by federal rules. Policy simulation will allow decision makers to “sweat in training rather than bleed in combat.” Since any solution embodies a set of assumptions and biases, constructing a wide range of simulations can expose hidden preconceptions. They can then eliminate policy choices that work in only a narrow set of circumstances, leading to more resilient final measures.
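As a sketch of what “sweating in training” could look like computationally, here is a minimal Monte Carlo exercise. The candidate policies, the scenario variables and the payoff numbers are invented purely for illustration; a real exercise would use far richer agent-based models, but the logic of keeping the option whose worst case is least bad is the same:

```python
# Minimal sketch of robustness testing: score candidate rules across many random
# scenarios and keep the one whose worst case is least bad (a maximin criterion).
# Policies, scenario parameters and payoff formulas are invented for illustration.
import random

POLICIES = ["strict ex-ante rules", "case-by-case enforcement", "disclosure only"]

def payoff(policy: str, scenario: dict) -> float:
    """Made-up payoffs: rigid rules do badly when innovation is fast,
    lighter-touch approaches do badly when market concentration is high."""
    if policy == "strict ex-ante rules":
        return 1.0 - 1.5 * scenario["innovation_rate"]
    if policy == "case-by-case enforcement":
        return 0.6 - 0.5 * scenario["concentration"]
    return 0.8 - 1.2 * scenario["concentration"]

def most_robust_policy(trials=10_000, seed=42):
    random.seed(seed)
    worst = {p: float("inf") for p in POLICIES}
    for _ in range(trials):
        scenario = {"innovation_rate": random.random(),
                    "concentration": random.random()}
        for p in POLICIES:
            worst[p] = min(worst[p], payoff(p, scenario))
    return max(worst, key=worst.get), worst

best, worst_cases = most_robust_policy()
print(f"most robust under a maximin criterion: {best}")
print(worst_cases)
```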
Update 28 Nov 2008:
I came across a very apposite comment on the value of simulation in the New Scientist editorial for the July 19, 2008 issue (No. 2665). The editorial is a critique of mainstream economics’ disinterest in agent-based models. It closes by saying:
“Although the present crisis was not caused by poor economic models, those models have extended its reach by nurturing the complacent view that markets are inherently stable. And while no one should expect better models alone to prevent future crises, they may give regulators better ways to assess market dynamics, detect early signs of trouble and police markets.”
Tuesday, October 21, 2008
From transparency to intelligibility in regulating finance
As I've been reading coverage of the rolling financial crisis, it occurred to me again that one of the tools that I’ve recommended for managing complex regulatory systems - transparency - might be relevant, though with a tweak.
More than lack of transparency, a root cause of the melt-down seems to have been a lack of intelligibility. The constant refrain is that nobody understood the ramifications of the financial instruments they were creating, managing or (not) regulating.
I’ve explored this topic in the context of software development on this blog (the hard intangibles thread), and I'm now convinced that the cognitive constraints that lead to problems on large software projects apply in finance, too. As I blogged last August:
“The sub-prime mortgage debacle is a problem of cognitive complexity. A lack of understanding of the risks entailed by deeply nested loan relationships is leading to a lack of trust in the markets, and this uncertainty is leading to a sell-off.”
At that point I hesitated to draw the corollary: that limits should be imposed on the complexity of the intangible structures we create.
The current approach is that complex novel approaches are left unregulated, on the assumption that only “informed investors”, those who are supposedly smart enough to understand the risks, will be exposed to losses. We’ve learned that this is not the case: the informed investors are pretty dumb, and the rest of us pay for their ignorance.
I now think we should invert the regulatory presumption: the more complicated an instrument, the more firmly it should be supervised.
The hard question is how to measure the intelligibility of financial instruments in order to decide if they deserve additional scrutiny. The Mom Test for user interface design - "would your mom be able to figure this out?" – seems reasonable, but it’s hard to see how the SEC would use it in practice. A more commonly used equivalent, the Politician Test, doesn’t help either since the comprehension of politicians is a function of campaign contributions.
We’re left with algorithmic complexity: the length of the program required to specify the object. Financial wizards will surely plead commercial confidentiality in order to avoid disclosing their algorithms; but a private assessment by an impartial regulator need not lead to a leakage of competitive advantage.
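As a rough illustration of how such a test might work without seeing proprietary code: true algorithmic (Kolmogorov) complexity is uncomputable, but the compressed size of an instrument’s full specification is a crude proxy for “the length of the program required to specify the object”. The two term sheets below are invented, and a real assessment would of course use the complete legal documentation:

```python
# Crude proxy for the intelligibility test proposed above. Kolmogorov complexity
# is uncomputable, so the compressed size of the instrument's specification stands
# in for "length of the program required to specify the object".
# Both term sheets are invented for illustration.
import zlib

plain_vanilla = (
    "30-year fixed-rate mortgage. Principal 300000. Rate 6.0 percent. "
    "360 equal monthly payments. No prepayment penalty."
)

synthetic_cdo = (
    "Tranche A references pool P1 of adjustable-rate loans resetting off index L "
    "plus margin M, subordinated to tranche B, which references mezzanine notes of "
    "CDO C2, itself collateralized by credit default swaps on tranches of C3, "
    "subject to triggers T1 through T9 and waterfall rules W1 through W14, with "
    "step-up coupons, over-collateralization tests and early-amortization events."
)

def complexity_proxy(spec: str) -> int:
    """Bytes needed to describe the instrument after compression (lower = simpler)."""
    return len(zlib.compress(spec.encode("utf-8"), 9))

for name, spec in [("plain vanilla mortgage", plain_vanilla),
                   ("synthetic CDO", synthetic_cdo)]:
    print(f"{name}: {complexity_proxy(spec)} compressed bytes")
```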
Monday, October 13, 2008
The loss of paint on panel
At the end of Jared Carter’s poem “At the Sign-Painter’s,” he explains why he liked these men best of all the laborers he met, accompanying his father on his rounds:
For the wooden rod with its black knob resting lightly
Against the primed surface, for the slow sweep and whisper
Of the brush—liked seeing the ghost letters in pencil
Gradually filling out, fresh and wet and gleaming, words
Forming out of all that darkness, that huge disorder.
I felt nostalgia for the loss of finger-touch with stuff we make for a living. A sign maker now works with mouse and keyboard, tweaking pixels on a glass screen – no more “the slow sweep and whisper / Of the brush.” The signs themselves are made at many removes, the “fresh and wet and gleaming” paint now applied in the invisible sanctum of an industrial ink-jet printer.
Some few people still manipulate their work: chefs, the cutters of hair, surgeons, those who care for children and the sick. A blessed few, though not financially; except for surgeons, most are paid minimum wage.
The rest of us have to wrestle meaning out of the “huge disorder” with tools that are themselves intangible, information worker implements operated with plastic prostheses.
----
The poem appears on p. 172 of Fifty Years of American Poetry, New York: Harry N Abrams, 1984. It originally appeared in Work, for the Night is Coming, 1980.
Tuesday, September 30, 2008
Expect whining: The Boomers in Retirement
McKinsey did a study (free registration required) of the impact of retiring baby boomers in November 2007. As one might expect, the report is brimming with factoids. The ones that jumped out at me:
- 60 percent of the boomers won’t be able to maintain a lifestyle close to their current one without continuing to work.
- The same percentage of older boomers already suffers from chronic health problems.
- 43 percent already are frustrated that they aren’t leading the lives they expected.
- Boomers will control nearly 60 percent of US net wealth in 2015 (see Exhibit 2).
- There will be more than 45 million households with people from 51 to 70 years old, compared with about 25 million for the “silent” generation, born from 1925 to 1945.
- Their real disposable income and consumption will be roughly 40 percent higher.
Guess who’ll be demanding a bail-out of their own in 2015, just as we’re (hopefully) getting over the current one?
Monday, September 22, 2008
Spectrum auctions aren’t neutral
Spectrum allocation isn’t neutral regarding technology or services, even when no obvious strings are attached. Even in contemporary auctions, where regulators appear not to be making technology bets, they are in fact doing so.
In the old days, radio licenses were issued via beauty contests, with stringent conditions on the kind of services and technology that a licensee could use. Television is a good example: the license limits use to TV broadcasts, and specifies the technology to be used (such as NTSC analog and ATSC digital in the US). Such “command and control” regulation has been much decried in recent years: industry is better placed to make technology decisions than the government, say the Right, and beauty contests lead to windfall profits, say the Left.
The Right favors the auction of flexible-use licenses which don’t specify the service to be offered or the technology to be used. This has been largely implemented in the cellular bands, where operators could choose whether and when to move from analog to digital technology, and which technology to use. [1] AT&T, for example, uses TDMA, while Verizon uses CDMA.
However, even if conditions aren’t attached to the frequencies, the way in which frequencies are packaged into bands limits the technologies that can be used, and thus the services that can be offered. A current example is the debate in Europe about allocating spectrum for future wireless services. Unstrung reports:
. . . the CEPT working group will likely recommend two band plan options: one for frequency division duplex (FDD), which uses different frequencies to transmit and receive signals, and the other for time division duplex (TDD), which uses one channel and timed transmissions to send and receive. Cellular operators have traditionally favored FDD systems. [my italics]
Regulators will have to make a choice between FDD and TDD, which entails a choice between services and vendors. FDD is voice-oriented and aligned with the cellular industry (UMTS) and Qualcomm, while TDD is data-oriented and aligned with the WiMAX camp and Intel.
--- Note ---
[1] Some remnants of the old philosophy remain. The PCS rules, for example, allow licensees to provide any mobile communications service, but prohibit broadcasting as defined in the Communications Act.
Wednesday, September 17, 2008
Rebooting the broadband debate
Rob Atkinson and his colleagues at ITIF have written an even-handed and insightful report on “Explaining International Broadband Leadership”.
They found that while the United States is behind other countries in broadband deployment, speed and price, not all of it can be blamed on the government – though good policies can make a difference. It’s harder than many on the Left claim to find a silver bullet in the experience of some other country (France, South Korea, etc.), but despite claims from the Right, one can learn something from their best practices. Government leadership and competition matter, but so do market incentives on both the supply and demand side.
Atkinson, Correa and Hedlund’s headline policy recommendation is that we end the “either-or shouting matches”. However, the question is How? They call for a “pragmatic discussion”, but that’s the end result, not the means to the end. It’s true, as they say, that we should be able to agree that the United States can do better on broadband, but we can only move beyond a divisive and unproductive debate if the conversation is reframed – and if we can recruit new, less entrenched, participants to the table.
One could engage industry and society at large (rather than just companies and activists with narrow issue agendas) if broadband were tied to commercial and personal success.
Workforce development: “Telecommuting” is a very tired meme nowadays, but it had power back in the day. If the Fortune 500 came to believe that universal affordable broadband would make them more competitive, and if the AFL-CIO came to believe that it would make workers more employable, then the debate might shift.
More sales: The “information superhighway” is just as tired, but the notion that the interstates and local roads are good for business is as true now as it ever was. If US retailers of goods and services (including entertainment) came to believe that they’d generate more profitable sales if the network was faster and cheaper, and if populist protectionists came to believe that fast local broadband was a bulwark against losing business to them furriners, then the debate might shift.
Energy: If one could make an argument that the US could get to energy independence sooner by moving bits rather than atoms, then the debate might shift. Gas prices will fluctuate, but the trend will be up. If you’re moving atoms, the world isn’t flat. Broadband can enable gains from local specialization based on knowledge, rather than production costs of commodities.
Wednesday, September 10, 2008
Protecting receivers vs. authorizing transmitters
When governments hand out permissions to operate radios – licenses, for example – they think in terms of transmitters: within the licensed frequency range you can broadcast at such-and-such a power, and outside those frequencies you can transmit only at some other, much lower, power. [1] This distribution of broadcast power is often called a “transmission mask”.
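To make the notion concrete, here is a minimal sketch of a transmission mask as a lookup table from frequency to permitted power; the band edges and power limits are invented for illustration, not taken from any actual license.

# A transmission mask: (low Hz, high Hz, max EIRP in dBm). Illustrative numbers only.
TRANSMISSION_MASK = [
    (698e6, 704e6, 30.0),   # in-band: full licensed power
    (704e6, 710e6, -13.0),  # out-of-band: much lower power permitted
]

def mask_limit_dbm(freq_hz: float) -> float:
    # Return the maximum power a licensee may radiate at a given frequency.
    for low, high, limit in TRANSMISSION_MASK:
        if low <= freq_hz < high:
            return limit
    return float("-inf")  # no authorization at all outside the mask

print(mask_limit_dbm(700e6))   # 30.0 dBm inside the licensed band
print(mask_limit_dbm(706e6))   # -13.0 dBm in the adjacent band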
In thinking about new ways of regulating radio, I’ve come to believe that a transmission mask is not sufficient; it helps to include receiver parameters. [2] But that’s not the point of this post; if you’re interested, read my paper at SSRN.
Today’s question is: if transmitter parameters are not sufficient, could one do without them completely? Can you go the whole hog, and only specify a receiver mask? I think you can, and I’m encouraged that Dale Hatfield tells me Robert Matheson concluded this some time ago, though I haven’t found a reference yet.
Receiver and transmitter parameters are figure and ground. Assume a steady state where all systems operate without interfering with each other. Transmissions will by definition have been chosen to prevent interference with receivers in adjacent frequency bands. The result of all the transmissions is a level of electromagnetic energy at every receiver which is low enough that no harmful interference results. (I’m ignoring receiver specifications [2] and the geographical distribution of transmitters in this discussion.) Each transmission propagates through space to a receiver, resulting in the allowed receiver mask:
{all transmission masks} -> {propagation} -> {resulting receiver mask}
One can also invert the calculation: given a receiver mask and propagation model, one can determine what the allowed transmissions should be.
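A rough sketch of that inversion, with everything about it assumed for illustration (free-space propagation, a single transmitter, made-up numbers):

import math

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    # Free-space path loss in dB: 20*log10(d) + 20*log10(f) - 147.55 (d in m, f in Hz).
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_hz) - 147.55

def allowed_tx_power_dbm(receiver_mask_dbm: float, distance_m: float,
                         freq_hz: float) -> float:
    # Invert {transmission} -> {propagation} -> {receiver mask}: the most a
    # transmitter may radiate so that, after path loss, the energy arriving
    # at the protected receiver stays at or below its mask.
    return receiver_mask_dbm + free_space_path_loss_db(distance_m, freq_hz)

# E.g. a -110 dBm receiver mask and a transmitter 2 km away at 700 MHz.
print(allowed_tx_power_dbm(-110.0, 2000.0, 700e6))   # roughly -15 dBm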
A license defined in terms of receiver masks would allow the licensee to transmit any amount of energy, as long as the energy arriving at other licensees’ receivers does not exceed their masks. It would guarantee that nobody else is allowed to radiate energy which leads to the allowed levels being exceeded at the licensee’s receiver.
One can compare reception-based vs. transmission-based licenses by thinking about property rights. The two important attributes here are exclusivity (if I’m a licensee, I can prevent anyone from transmitting in my frequency “band”), and autonomy (within my “band”, I can do what I like, notably acting in a way that makes money). [3], [4]
A transmitter-based license focuses on autonomy by defining transmission parameters. It specifies what a licensee is allowed to do, but it doesn’t provide any guarantee of exclusivity. A receiver-only right reverses this emphasis: by specifying what would amount to harm to a receiver, it provides a way to make exclusivity real in practice. The constraints on autonomy are implicit in the receiver-rights of others: as long as a licensee doesn’t interfere with other licensees’ exclusivity, it can do what it likes.
Anything transmitter-only rights can do, receiver-only rights can also do. They are mirror images of each other. The information burdens are also mirror images. Receiver rights place a burden on all the other rights holders to figure out if their transmissions will transgress a receiver mask. While transmission rights don’t impose an information overhead upfront, the rights holder bears the burden of uncertainty: they may be blindsided at some future date by a new transmission right that reduces the value of the system they’ve deployed. The fight between M2Z and T-Mobile is a good example: T-Mobile claims that the proposed terms of a new cellular license M2Z seeks (AWS-3) will cause harmful interference to its operations under an existing license (AWS-1).
I like receiver-only rights because they put the focus on the ability of a wireless system to operate in the presence of noise, which one can only do by taking receiver parameters into account. However, enforcement proceedings may appear more complicated in this case, since it isn’t obvious which of many transmissions is responsible for a receiver mask being exceeded: a spectrum analyzer at a receiver can only measure the sum of all radiated power. Today the regulator has what looks like a big stick: it can objectively check whether a licensee’s equipment meets or violates its approved transmission mask. The stick doesn’t actually help solve the most difficult interference problem, the case where all transmissions meet their masks, but it gives the regulator power that it will be loath to give up.
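To see why the enforcement problem is awkward, here is a toy calculation (all numbers invented): three transmissions, each arriving at the receiver well below an illustrative mask, together push the total over it – and the measured aggregate says nothing about which one to blame.

import math

def dbm_to_mw(dbm: float) -> float:
    return 10 ** (dbm / 10)

def mw_to_dbm(mw: float) -> float:
    return 10 * math.log10(mw)

receiver_mask_dbm = -100.0
# Power arriving at the receiver from three transmitters, each individually modest.
contributions_dbm = [-103.0, -104.0, -105.0]

# A spectrum analyzer at the receiver sees only the sum of all radiated power.
aggregate_dbm = mw_to_dbm(sum(dbm_to_mw(p) for p in contributions_dbm))
print(round(aggregate_dbm, 1), aggregate_dbm > receiver_mask_dbm)   # -99.2 True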
Conclusion
There is a choice in creating radio rights between protecting receivers and authorizing transmitters – or some mixture of the two. The current approach limits the discussion to ways of authorizing transmitters. A more nuanced analysis of the trade-offs is required, and this post has only begun it.
Notes
[1] Since this is going to get pretty geeky, I’m going to leave out a lot of other important stuff, including geography (radio licenses are typically restricted to a certain area) and duty cycles (how often transmitters operate).
[2] Receiver parameters. I distinguish between a receiver mask, by which I mean a distribution of RF energy (i.e., a spectrum) which represents the worst-case noise environment in which a receiver needs to be able to operate, and a receiver specification, by which I mean the ability of the receiver to detect and decode wanted signals in the presence of unwanted ones.
[3] According to Gary Libecap in Contracting for Property Rights (1989), p. 2: “Private ownership . . . may involve a variety of rights, including the right to exclude nonowners from access, the right to appropriate the stream of rents from use of and investments in the resource, and the right to sell or otherwise transfer the resource to others.”
[4] “Band” here means the collection of constraints on operation. In the conventional approach, it’s usually taken to mean a frequency band and geographical region.
Saturday, August 30, 2008
Analog and digital religions
Are you saved?
In the Christian tradition, the answer is binary: either you are, or you aren’t. Even if you’re not sure, God has decided. You’re either going to heaven or hell, with a side trip through purgatory for some denominations.
When the Son of Man comes in his glory, and all the angels with him, he will sit on his throne in heavenly glory. All the nations will be gathered before him, and he will separate the people one from another as a shepherd separates the sheep from the goats. He will put the sheep on his right and the goats on his left. (Matthew 25:31-33)
There are other traditions where salvation is an “analog” process. The release from suffering comes about gradually through hard work.
Just as when a carpenter or carpenter's apprentice sees the marks of his fingers or thumb on the handle of his adze but does not know, “Today my adze handle wore down this much, or yesterday it wore down that much, or the day before yesterday it wore down this much,” still he knows it is worn through when it is worn through. (Samyutta Nikaya 22.101)
Our physical existence is analog: things wear down gradually, like the handle of an adze over years of use. (In case you also don’t remember what an adze is: it’s a tool used for smoothing rough-cut wood in hand woodworking.) On the other hand, technology is increasingly digital: something either works perfectly, or not at all. [*] The reception of analog TV will gradually get worse as one moves further and further away from a transmission tower, but digital TV quality falls off a cliff at a certain distance. It’s perfect, and then suddenly the screen is black.
It’s curious that we’ve taken so easily to binary, digital technologies given that we evolved in a physical reality that is analog and continuous. I suspect it’s because our minds categorize: someone is either male or female, friend or foe, sheep or goat; the fruit is on the tree, in the basket, or on the ground. The classifying knack makes binary outcomes intelligible both in the spiritual life and in modern technology, though one may also have a gradualist religion, or a liking for vinyl records.
- - - -
* As always, yes, there are exceptions that prove the rule. PC performance can degrade gradually as a disk gets fragmented or an application accumulates memory leaks; glass and ceramic will fracture catastrophically.
Monday, August 25, 2008
The Soreness of Losing – Clinton Edition
A dark cloud of cranky Clintonism is hanging over the Democratic convention in Denver. Dark muttering about not supporting Obama because Clinton (either one) was disrespected just won’t go away.
There are many plausible explanations, including egotism, frustrated feminism and the Boomer/GenX divide. I rather like an appeal to the psychological phenomenon of loss aversion: people feel a loss more keenly than a gain.
Technically, loss aversion is the tendency to prefer avoiding losses over acquiring equivalent gains. I like to think of it this way: imagine a store selling widgets. It can either sell them for $100 and offer a 5% discount for cash, or sell them at $95 but impose a $5 surcharge on anyone buying with a credit card. The discount feels like a gain to the cash customer, and the surcharge feels like a loss to the credit card buyer. The net effect is the same, but the loss is felt more keenly than the gain. Therefore, stores are more likely to post the credit card price and offer a cash discount than to impose a surcharge.
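For the quantitatively inclined, here is a sketch of that asymmetry using the prospect-theory value function; the curvature and loss-weighting parameters are the commonly cited estimates from Tversky and Kahneman’s 1992 work, used here purely for illustration.

def felt_value(x: float, alpha: float = 0.88, loss_weight: float = 2.25) -> float:
    # Prospect-theory value function: gains are discounted (alpha < 1),
    # losses are weighted roughly twice as heavily (loss_weight > 1).
    return x ** alpha if x >= 0 else -loss_weight * ((-x) ** alpha)

print(felt_value(5.0))    # the $5 cash discount, felt as a gain:  about +4.1
print(felt_value(-5.0))   # the $5 card surcharge, felt as a loss: about -9.3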
Clinton supporters went into the primary campaign assuming that they were going to win. Obama’s win is a keen loss to them; something that these people felt they already “had” is being taken away. For the Obamans, on the other hand, the win was a bonus; they never really expected it. They’re happy about it, of course, but don’t feel it as profoundly as the Clintonistas feel their loss.
There is probably little that the Obama campaign can do to assuage their pain. The best hope for the Democratic party is the operation of another cognitive bias: the tendency for people to overestimate the length or the intensity of an emotion, known as impact bias. Even though the Clintonistas may not feel that way now, by the time the November election comes around their current disaffection will have passed, and they will vote the Democratic ticket.
Sunday, August 24, 2008
Why is it hard to be good?
Getting into his stride, productivity guru David Allen asks his $595/head audience, “How many of you have fallen off the wagon?” That is: how many, after having already forked out money at least once for his Getting Things Done regimen, have relapsed into their old bad habits? Many of them sheepishly raise their hands [1].
Something similar happens every week in churches, temples, synagogues, mosques and meeting houses around the world, though it’s usually cheaper and less glitzy: people go to be reminded, again and again and again, to practice virtue, and quit their vices.
Why is it so hard to be good, and so easy to be bad? Nobody needs a motivational speaker to remind them to sin. The goal is not to be virtuous for the sake of it, of course. Virtue is necessary for salvation. But the question still stands: Why is the path to salvation the difficult one?
If the virtues were adaptive, one could be sure that evolution would have made them pleasurable. We don’t have to be reminded to eat and procreate; it’s stopping ourselves eating and coupling in “inappropriate” ways that takes effort [2]. Perhaps salvation is a goal of the mind, not the body.
Morality comes to the fore when evolution by culture starts to outstrip human evolution by nature – times when the selection of memes becomes more important than the selection of genes.
The Axial Age was such a tipping point. Around 500 BCE, the function of major religions shifted from cosmic maintenance to personal transformation [3]. This was a time when urbanization and mobility were increasing. Literacy and technology moved into the cultural mainstream. There was a decisive change in people’s sense of individuality: a growing consciousness of humans as moral agents responsible for their own actions, an increasing awareness of the experience of death, and a preoccupation with what lay beyond.
Our struggle with virtue might be the clash between what it takes to be happy in an urbanized, technological society, vs. what’s required in a pre-literate life closer to unmediated nature.
The puzzle of the dark triad is instructive. The triad is a complex of anti-social behaviors that has serious social downsides: people who are narcissistic, sociopathic or Machiavellian risk being shunned by others, leaving them vulnerable to all the risks of being a loner outside the social circle. And yet, those behaviors persist; they must be adaptive. It seems that the dark triad helps you get laid (if you’re male). Such people are also useful as wartime leaders.
The dark triad, and other immoral behavior, is sometimes adaptive. Morality could be the way that complex societies compensate for their down sides. The struggle for virtue is the price our minds pay for the benefits our genes get from immoral behavior.
----- Notes -----
[1] This is a paraphrase of reporting in “Getting Serious About Getting Things Done,” Business Week, August 14, 2008
[2] Big sins, unlike venial ones, usually do require persuasion, as in the pep talks that sellers of shady financial products get before they hit the phone banks. And we do occasionally commit acts of kindness without having to force ourselves – though that’s rare enough to deserve remark. Those good behaviors that do “come naturally”, like caring for our own children, scarcely count as virtues.
[3] John Hick, An Interpretation of Religion, 1989, pp. 22-29