Friday, November 28, 2008
I’ve been trying to think through the analogies between finance and ICT (aka telecoms [1]) in the hope of gleaning insights about ICT regulation from the market melt-down. (Recent posts: From transparency to intelligibility in regulating finance, Lessons for communications regulation from banking complexity, More on Intelligibility vs. Transparency.)
While finance and ICT are both complex adaptive systems, there are some deep differences [2] – deep enough that “Re-regulate, Baby!” thinking about financial markets shouldn’t automatically include ICT. In other words: while self-regulation on Wall Street may be anathema in the current climate, it should still be on the menu for ICT.
Money is a core commodity
The dotcom bust of 2001 was severe, but pales in comparison to the Savings & Loan debacle, let alone the current crisis. The ICT business doesn’t have the societal or financial leverage to drive meltdowns that rock society. Finance is about making money with money, and money drives the economy. Money is the ultimate commodity; when you can’t get money, nothing works.
ICT is not (yet?) so central. Information, for all the talk about bits and bytes, is not really a commodity. A dollar is a dollar is a dollar, but a brother could be a sibling, a comrade in arms, or any human being, depending on the context [3]. Distortions of information transfer, whether in transport or content, are therefore not as leveraged as bottlenecks in the money supply.
Information flows are undoubtedly important, and their interruption would cause disruption. For example, cargo ships need to provide electronic manifests to their destination ports 24 hours before arrival in the US. If this data flow were blocked, the movement of goods would stop. However, this is a point failure; it isn’t obvious to me how something like this could cause a cascade of failures, as we saw when banks stopped lending to each other [4].
Leveraged Intangibles
Finance is more leveraged than ICT. It’s more abstract, not least because money is a more “pure” commodity than information; that is, it’s more generic. The sub-prime crisis was the collapse of a tower of derivatives: loans were bundled into CDOs, which were then re-bundled into CDOs, and again, and again. There was double leverage. First, the obvious leverage of betting with borrowed money; second, the recursive bundling of financial instruments to magnify even those inflated returns. The tower of derivatives was possible because its bricks were intangible; failure only came when the conceptual opacity of the structure overwhelmed our individual and institutional comprehension, rather than when its weight exceeded the compressive strength of its materials.
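To make the recursion concrete, here is a toy sketch in Python. Everything in it is invented, and it assumes defaults are independent (precisely the assumption that failed in reality), so take it as an illustration of amplification rather than a pricing model. A “tranche” here is wiped out if more than a tenth of the assets beneath it default:

```python
# Toy model of recursive bundling ("CDO-squared"). All numbers are invented,
# and defaults are assumed independent, so this illustrates amplification only.
from math import comb

def wipeout_prob(p: float, n: int = 100, attach: int = 10) -> float:
    """Probability that more than `attach` of `n` assets, each defaulting
    independently with probability `p`, default, wiping out the tranche."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(attach + 1, n + 1))

for p in (0.05, 0.10):
    cdo = wipeout_prob(p)            # a tranche over 100 loans
    cdo_squared = wipeout_prob(cdo)  # the same structure built from 100 such tranches
    print(f"loan default rate {p:.0%}: CDO tranche {cdo:.4%}, CDO-squared {cdo_squared:.4%}")
```

Doubling the underlying default rate barely matters for any single loan, but it takes the re-bundled layer from essentially riskless to near-certain wipe-out. That non-linearity is the second, recursive kind of leverage.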
ICT also has its fair share of intangibles; many problems of large-scale software development are due to the opacity of boundless webs and towers of abstractions. However, the recursiveness is not as thoroughgoing, at least at the lower levels of the stack. The risk of network infrastructure companies misusing their market power in interconnection, say, is limited by the fact that no part of the network is many abstraction steps away from tangible wires and routers.
The risks do become greater in the higher network layers, such as applications and content. Software carries more and more of the value here; even though the bits and MIPS live in data centers somewhere, complex layers of abstract processing can create unexpected risks. One example is the way personally identifiable information doesn’t have to be a name and address: when enough seemingly innocuous items are aggregated, someone becomes uniquely identifiable even though they never provided their name.
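Here is a minimal sketch of that aggregation effect, using invented records. No attribute on its own names anyone; the intersection of three does:

```python
# No single attribute below identifies anyone, but their intersection can be
# unique. The records are invented for the example.
records = [
    {"zip": "98052", "birth_year": 1971, "sex": "F", "car": "hybrid"},
    {"zip": "98052", "birth_year": 1971, "sex": "M", "car": "hybrid"},
    {"zip": "98052", "birth_year": 1964, "sex": "F", "car": "sedan"},
    {"zip": "98112", "birth_year": 1971, "sex": "F", "car": "hybrid"},
]

def matches(observed: dict) -> list:
    """Return every record consistent with the observed attributes."""
    return [r for r in records if all(r[k] == v for k, v in observed.items())]

print(len(matches({"zip": "98052"})))        # 3 candidates: a ZIP code is not an identity
print(len(matches({"birth_year": 1971})))    # 3 candidates again
print(len(matches({"zip": "98052", "birth_year": 1971, "sex": "F"})))  # 1: identified
```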
Subjectivity
The finance business is shot through with unquantifiable and unpredictable subjectivity, notably trust, greed, and panic. Of course, all businesses including ICT rely on trust, etc. However, in finance subjective assessments drive minute-by-minute market decisions. When banks lost faith that their counter-parties would still be solvent the next morning, all were sucked down a maelstrom of mutual distrust. Businesses all try to quantify trust, but it’s a fragile thing, particularly when assets are intangible and unintelligible. Investors thought they could depend on ratings agencies to measure risk; when it turned out that they couldn’t due to the agencies’ conflicts of interest, the downward spiral started (and was accelerated by leverage).
The ICT business, at least at the lower transport levels, is much less dependent on subjective assessments. One can measure up-time and packet loss objectively. Things are less sure at the content layers, as can be seen in the rumblings about the incidence of click fraud, and whether the click-through accounting of search engine operators can be trusted; so far, though, there’s been no evidence of a rickety tower of dubious reputation.
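To illustrate how mechanical these transport-layer measurements are, here is a small sketch using made-up probe results and an invented outage log:

```python
# Transport-layer metrics are mechanical to compute and to audit.
# The probe results and outage figures below are made up.
probes = [True, True, False, True, True, True, False, True, True, True]  # reply received?
loss_rate = probes.count(False) / len(probes)
print(f"packet loss: {loss_rate:.1%}")  # 20.0%, comparable against an SLA with no judgment calls

month_secs = 30 * 24 * 3600  # a 30-day month
outage_secs = 7 * 60         # one logged 7-minute outage
print(f"availability: {(month_secs - outage_secs) / month_secs:.4%}")
```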
Conclusions
Finance today arguably needs more supervision because of the wide ramifications of unfettered greed, fear or stupidity. The impacts are so large because of the amplifying effects of leverage and intangibility; the risk is greater because the resulting structures are so opaque as to be unintelligible.
ICT also has leverage, intangibility and opacity, but not at the same scale. Therefore, objections to delegated regulation in finance do not transfer automatically to ICT.
Counter-intuitively (to me, at least), the parts of the ICT business that are most defensible against calls for re-regulation are those that have a great deal of physical infrastructure. The more software-intensive parts of the cloud are the most vulnerable to analogies with the runaway risks we’ve seen in financial markets.
Notes
[1] While telecoms is an easy old word that everybody knows, it really doesn’t capture the present situation. It connotes old technologies like telephony, ignores the media, and misses the importance of computing and software. There is as yet no better, commonly used term, and so I’ll reluctantly use the acronym ICT (Information and Communication Technologies). ICT is about business and policy as well as technology, but it’s a little more familiar, and shorter, than “connected computing”, my other preferred term.
[2] Jonathan Sallet observes that the financial crisis derives from market externalities that put all of society at risk (personal communication, 26 Nov 2008). The very large scope of this risk can be used to justify government intervention. We’re hoping to combine our thinking in an upcoming note.
[3] I’m toying with the notion of doing a metaphor analysis of information. At first sight, the discourse seems to be driven by an Information Is a Fluid analogy; it’s a substance of which one can have more or less. This metaphor is both pervasive and open to criticism. Reddy introduced the Communication Is a Conduit metaphor for knowledge transfer; this is related to Lakoff’s Ideas Are Objects. See here for his critique, and citation of his paper.
[4] Just because I can’t see a cascade doesn’t mean it isn’t there, of course; it may just be my uninformed and uninspired imagination. Network security analysts have, I’m sure, constructed many nightmare scenarios. The weakness of my analysis here is that my argument for the implausibility of a meltdown rests in part on the fact that it hasn’t happened – yet. The 9/11 fallacy. . . .
"in this world, there is one awful thing, and that is that everyone has their reasons" --- attrib. to Jean Renoir (details in the Quotes blog.)
Friday, November 28, 2008
Friday, November 21, 2008
More on Intelligibility vs. Transparency
A commentary by Richard Thaler and Cass Sunstein, the co-authors of Nudge, also notes that the growing complexity of the financial world needs more attention; cf. my recent post Lessons for communications regulation from banking complexity.
Not so long ago, most mortgages were of the 30-year fixed-rate variety. Shopping was simple: find the lowest monthly payment. Now they come in countless forms. Even experts have trouble comparing them and a low initial monthly payment can be a misleading guide to total costs (and risks). A main cause of the mortgage crisis is that borrowers did not understand the terms of their loans. Even those who tried to read the fine print felt their eyes glazing over, especially after their mortgage broker assured them that they had an amazing deal.
Yet growing complexity on the borrowers’ side was trivial compared with what was going on at the banks. Mortgages used to be held by the banks that initiated the loans. Now they are sliced into mortgage-backed securities, which include arcane derivative products.
--- Human frailty caused this crisis, Financial Times, 11 November 2008. Thanks to Andrew Sterling for the link.
Thaler & Sunstein conclude that regulators therefore need to help people manage complexity and resist temptation. They reject the option of requiring simplicity, on the grounds that this would stifle innovation, and recommend improved disclosure instead.
I’ve been thinking about bounded rationality for some time; see the Hard Intangibles thread. It’s one of the fundamental challenges of managing complex adaptive systems. I, like many others, recommended disclosure (aka transparency) as a key tool for internet governance; see e.g. my Internet Governance as Forestry paper.
However, the more I think about transparency, the more skeptical I become. I’ve concluded that in finance, at least, the problem isn’t disclosure but intelligibility; see e.g. my post From transparency to intelligibility in regulating finance. While it’s true that eliminating complexity could stifle innovation, that’s a false choice; nobody’s proposing to eliminate either complexity or innovation. It’s our infatuation with novelty, as well as our greed, that got us into this problem, and we have to manage our urges in both respects.
I suspect that one can make the intelligibility argument just as well for computing & communications as for finance – though the lack of a Comms Chernobyl will make it harder to sell the idea in that industry.
Transparency/intelligibility need not be mandatory; companies should be able to choose obscurity. However, those that keep their activities shrouded will bear the cost of increased regulatory scrutiny – and perhaps higher expectations regarding performance against public interest mandates. Comcast need not explain exactly how it manages its network; but if it chooses obscurity, it should face tougher network neutrality expectations. Microsoft does not have to disclose its interfaces; but if it chooses obscurity, it should face a tougher anti-trust test. Google need not explain how it uses DoubleClick data to improve ad click-throughs; but if it chooses obscurity, it should face tougher privacy protection requirements.
My “transparency to intelligibility” post proposed algorithmic complexity as a way to measure intelligibility. That’s not the only method. Another (prompted by the financial crisis, and teetering software stacks) is an abstraction ladder: the more steps between a derivative and its underlying asset, the higher it is on the abstraction ladder, and the less intelligible and more risky it should be deemed to be. In computing & communications as in finance, the abstraction ladder counts the number of rungs up from atoms. The networking stack is an example: from wires in the ground one climbs up to links, networks, sessions, applications. On the premise that atoms are easier to observe than bits, and that piling up inscrutable and unstable combinations is easier the higher you go, services at higher layers would be subject to closer regulatory scrutiny, other things (like market concentration) being equal.
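Here is a crude sketch of how such a rung count might be computed. The chains below (the networking stack, and a securitization ladder from house to CDO-squared) are my own illustrative groundings, not a proposed taxonomy:

```python
# Each entity points at what it is built on; "atoms" have no parent.
# The groundings here are illustrative assumptions, not a proposed taxonomy.
builds_on = {
    "copper/fiber": None,   # tangible: rung 0
    "link": "copper/fiber",
    "network": "link",
    "session": "network",
    "application": "session",
    "house": None,          # tangible: rung 0
    "mortgage": "house",
    "MBS": "mortgage",
    "CDO": "MBS",
    "CDO-squared": "CDO",
}

def rungs(name: str) -> int:
    """Count the abstraction steps from `name` down to something tangible."""
    parent = builds_on[name]
    return 0 if parent is None else 1 + rungs(parent)

for item in ("application", "CDO-squared"):
    print(f"{item}: {rungs(item)} rungs up the ladder")  # more rungs, more scrutiny
```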
Monday, November 17, 2008
Lessons for communications regulation from banking complexity
A New Scientist story on Why the financial system is like an ecosystem (Debora Mackenzie, 22 October 2008) traces how the science of complexity might prevent future breakdowns of the world’s financial system.
The lessons apply to communications regulation, too. Both finance and the ICT business (“Information and Communication Technologies”) are complex systems. The recommendations in the article resonate with the conclusions I came to in my paper Internet Governance as Forestry. This post explores some of the resonances.
New Scientist observes:
“Existing economic policies are based on the theory that the economic world is made up of a series of simple, largely separate transaction-based markets. This misses the fact that all these transactions affect each other, complexity researchers say. Instead, they see the global financial system as a network of complex interrelationships, like an electrical power grid or an ecosystem such as a pond or swamp.”
Consequently, the accumulation of small, slow changes can trigger a sudden crisis. Johan Rockström of the Stockholm Environment Institute is quoted as saying,
"Slow changes have been accumulating for years, such as levels of indebtedness. None on their own seemed big enough to trigger a response. But then you get a trigger - one investment bank falls - and the whole system can then flip into an alternative stable state, with different rules, such as mistrust."This is reminiscent of my Big Picture principle, which can be summarized as “take a broad view of the problem and solution space; prefer generic to sector-, technology-, or industry-specific legislation.”
The question for communications regulation is whether phase changes such as those we’ve seen in finance and ecosystems have occurred, or could occur in the future. Other than the periodic consolidation and break-up of telecom monopolies, and the vertical integration of the cable and media businesses in the 80s, conclusive evidence of big phase transitions in communications is hard to find. Is there currently a slow accumulation of small changes which will lead to a big shift? There are two obvious candidates: the erosion of network neutrality, and growth of personal information bases (cf. behavioral advertising, Phorm, more).
The New Scientist article suggests that unremarked linkages, such as the increase in cross-border investments since 1995, allowed the collapse of the US real estate market to reverberate around the world. The most obvious linkage in communications is “convergence”, the use of the same underlying technology to provide a myriad of services. Common technology facilitates commercial consolidation in infrastructure equipment (e.g. Cisco routers), tools (e.g. Microsoft’s web browser), and services (e.g. Google advertising). Convergence ties together areas of regulation that used to be distinct. For example, TV programs are distributed through broadcasting, cable, podcasts, mobile phones; how should one ensure access to the disabled in this situation? There are also links from one network layer to another, as internet pipe providers use Phorm-like technologies to track which web sites their users visit.
Increased connectivity makes the financial system less diverse and more vulnerable to dramatic shifts. “The source of the current problems is ignoring interdependence,” according to Yaneer Bar-Yam, head of the New England Complex Systems Institute in Cambridge, Massachusetts. Telecoms convergence creates a similar risk, with substantial horizontal concentration: Cisco has 60% market share in core routers, Internet Explorer holds 70% web browser share, and Google has 60% search share and 70% online advertising share. While modularity and thus substitutability of parts in the internet/web may limit this concentration, it needs to be carefully monitored, as captured by my Diversity principle: “Allow and support multiple solutions to policy problems; encourage competition and market entry.” Integration is a successful strategy (cf. Apple) that some find disconcerting (cf. Zittrain); it is likely to become more pervasive as the industry matures.
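As a back-of-the-envelope check, the quoted shares can be folded into the Herfindahl-Hirschman Index, the standard summary score for market concentration. How the remaining share splits among smaller players in this sketch is my assumption:

```python
# HHI = sum of squared market shares (in percent). The first share in each
# list is from the post; the split of the remainder is an assumption.
def hhi(shares_pct):
    return sum(s ** 2 for s in shares_pct)

core_routers = [60, 30, 10]   # Cisco at 60%, rest assumed
browsers = [70, 20, 10]       # Internet Explorer at 70%, rest assumed
print(f"core routers HHI: {hhi(core_routers)}")  # 4600
print(f"browsers HHI: {hhi(browsers)}")          # 5400, far above conventional "highly concentrated" lines
```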
Diversity allows ecosystems to remain resilient as conditions change. In the quest to achieve these results, regulators have to be careful to avoid rigidity, a temptation because the financial system is so fluid. Here’s Bar-Yam again, from the New Scientist article: “Governments will have to be very careful, and set rules and limits for the system without actually telling people what to do.” To manage this risk in the comms context, I proposed the principles of Delegation (most problems should be solved by the market and society, not by government; government's role is to provide proper incentives and guidance, and to intervene to solve critical shortcomings) and Flexibility (determine ends, not means; describe and justify the outcomes sought, not the methods to be used to achieve them).
The article closes by quoting Bar-Yam: “At its core the science of complex systems is about collective behaviour.” He goes on to say that economic policy has so far failed to take into account the complexity and consequent unpredictability of such behavior, and calls for the use of testable models. This will be important in communications regulation, too. Simulations of the internet/web can help to improve policy makers’ intuition about unpredictable systems with many variables. Exploring the consequences of policy choices in simulation can identify which courses of action are most robust under a variety of possible outcomes. It’s the 21st Century version of letting states and regions experiment with regulation, which is eventually pre-empted by federal rules. Policy simulation will allow decision makers to “sweat in training rather than bleed in combat.” Since any solution embodies a set of assumptions and biases, constructing a wide range of simulations can expose hidden preconceptions. They can then eliminate policy choices that work in only a narrow set of circumstances, leading to more resilient final measures.
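To make the simulation idea tangible, here is a deliberately minimal agent-based sketch. Its topology, parameters, and panic rule are all invented; what it shows is how a single behavioral threshold can separate a contained failure from the system-wide flip into mistrust that Rockström describes:

```python
import random

# Each bank watches a few counterparties and stops lending once too many of
# them have stopped. Topology, parameters, and the panic rule are invented;
# the point is the flip, not the numbers.
def run(panic_threshold: int, n: int = 50, watched: int = 5, seed: int = 1) -> int:
    random.seed(seed)
    peers = {i: random.sample([j for j in range(n) if j != i], watched) for i in range(n)}
    lending = [True] * n
    lending[0] = False  # the trigger: one institution fails
    changed = True
    while changed:      # iterate to a fixed point
        changed = False
        for i in range(n):
            if lending[i] and sum(not lending[j] for j in peers[i]) >= panic_threshold:
                lending[i] = False
                changed = True
    return sum(lending)  # banks still lending at the end

for t in (1, 2, 3):
    print(f"panic threshold {t}: {run(t)} of 50 banks still lending")
```

Sweeping parameters like this across many random topologies is the “sweat in training” exercise: it reveals which interventions are robust across circumstances and which work only in a narrow band.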
Update 28 Nov 2008:
I came across a very apposite comment on the value of simulation in the New Scientist editorial for the July 19, 2008 issue (No. 2665). The editorial is a critique of mainstream economics’ disinterest in agent-based models. It closes by saying:
“Although the present crisis was not caused by poor economic models, those models have extended its reach by nurturing the complacent view that markets are inherently stable. And while no one should expect better models alone to prevent future crises, they may give regulators better ways to assess market dynamics, detect early signs of trouble and police markets.”