Factoid (verbatim) from Nassim Nicholas Taleb's The Black Swan: The Impact of the Highly Improbable, Random House 2007, p. 275.
In the caption to Figure 14 on the following page, which illustrates S&P 500 returns, Taleb observes: "This is only one of many such tests. While it is quite convincing on a casual read, there are many more-convincing ones from a mathematical standpoint, such as the incidence of 10 sigma events."
(This post marks the end of my attempt to use MSN Spaces for my Factoids site. It was just too clunky. I'll now post factoids here; as they pile up, you'll find all of them by clicking on the "factoids" label at the end of the post.)
"in this world, there is one awful thing, and that is that everyone has their reasons" --- attrib. to Jean Renoir (details in the Quotes blog.)
Thursday, June 28, 2007
Wednesday, June 27, 2007
Eco mumbo jumbo
I’m coming to the conclusion that the “business ecosystem” metaphor is nonsense. That’s a pity, since I speculated in Tweaking the Web Metaphor that the food web might be a useful metaphor for the internet, conceived as a “module ecosystem.” [1] Bugs in the business ecosystem mapping would be even more unfortunate for people who’ve made strategic business decisions on the basis of this flawed metaphor.
“Business is an ecosystem” is an analogy, and like any argument from analogy it is valid to the extent that the essential similarities of the two concepts are greater than the essential differences. I will try to show (at too much length for a blog post, I know...) that the differences are much greater than the similarities.
This biological analogy is very popular. A Google search on "business ecosystem" yielded about 154,000 hits, "software ecosystem" gave 76,000 hits (Microsoft appears in 47,200 of them), and “computing ecosystem” 18,000 hits.
So where’s the problem?
Let me count the ways.
1. A biological ecosystem is analyzed in terms of species, each of which represents thousands or millions of organisms. Business ecosystems are described in terms of firms: just one of each. A food web of species summarizes billions of interactions among organisms; a business web of companies is simply the interaction among the firms studied.
2. Species are connected, primarily, by flows of energy and nutrients. A is eaten by B is eaten by C is eaten by D, etc. Energy is lost as heat at every step. In the business system, a link primarily represents company B buying something from company A. Goods flow from A to B, and money flows back. Both A and B gain, otherwise they wouldn’t have entered into the transaction. Therefore, the system isn’t lossy, as it is in a food web. In fact, gains from trade suggest that specialization, which leads to more interacting firms, creates more value. The links between companies could also stand for co-marketing relationships, technology sharing and licensing agreements, collusion, cross-shareholding, etc.; however, these have the same non-zero-sum characteristics as monetary exchange does.
3. One might sidestep these problems by claiming that species are mapped to firms, and individual animals are mapped to the products that a firm sells. That solves the multiplicity mismatch in #1, and, if one just considers the material content of products, the entropy problem in #2. However, two problems remain. First, the value of products is mostly in the knowledge they embody, not their matter; knowledge (aka value add) is created at every step, the inverse of what happens with the 2nd Law of Thermodynamics. Second, companies sell many diverse products. The fudge only works if a species is mapped to a product unit (in fact, to the part of a product unit that produces a single SKU), rather than to a firm.
4. Species change slowly, and their role in an ecosystem changes very slowly; on the other hand, companies can change their role in the blink of an eye through merger, acquisition or divestiture. Interactions between firms can be changed by contract, whereas those between species are not negotiable, except perhaps over very long time scales through the evolution of defensive strategies.
5. Biological systems are unstable; the driving force of ecological succession is catastrophe. [2] Businesses seek stability, and the biological metaphor is used as a source of techniques to increase resilience; see e.g. Iansiti and Levien’s claim that keystone species lead to increased stability in an ecosystem. [3], [4] If one seeks stability, biological systems are not a good place to look.
6. Biological systems don’t have goals, but human ones do. There are no regulatory systems external to ecosystems, but there are many in human markets, such as the rule of law and anti-trust. Natural processes don’t care about equity or justice, but societies do, and impose them on business systems. If ecosystems were a good model for business networks, there would be no need for anti-trust in markets.
7. End-consumers are not represented at all in the “business ecosystem” model. Von Hippel and others [5] who study collaborative innovation could be seen to be pointing to customers - or at least some of them - as a node in the business ecosystem, but the same singularity/multiplicity problems noted above apply here.
8. Companies are exhorted to invest in their ecosystem if they want to become keystone species. Keystone species don’t necessarily (or usually) represent a lot of biomass, so it’s not clear why a firm would want to be a keystone. (And of course, the metaphor leaves unstated whether biomass maps to total revenue, profitability, return on investment, or something else.) More generally: being a keystone species isn’t a matter of choice for the animal concerned; the keystone relationship arises from the interactions among species as a matter of course.
The business ecosystem metaphor in use
Iansiti and Levien are high profile proponents of business ecosystems. [3] [4] In The Keystone Advantage, they motivate the analogy between business networks and biological ecosystems by arguing that both are “formed by large, loosely connected networks of entities”, both “interact with each other in complex ways”, and that “[f]irms and species are therefore simultaneously influenced by their internal capabilities and by their complex interactions with the rest of the ecosystem.” They state that the key analogy they draw “is between the characteristic behavior of a species in an ecosystem and the operating strategy of a strategic business unit.” They declare the stakes when they continue: “To the extent that the comparison of business units to ecosystems [I presume they mean “to species”] is a valid one, it suggests that some of the lessons from biological networks can fruitfully be applied to business networks.”
To caricature their argument: Ecosystems are networks; business networks are networks; therefore business networks are ecosystems. Hmmm...
They were more circumspect in the papers that preceded the book, where they try to dodge the weakness of the analogy that underpins their argument by contending that they don’t mean it: “[W]e are not arguing here that industries are ecosystems or even that it makes sense to organize them as if they were, but that biological ecosystems serve both as a source of vivid and useful terminology as well as a providing some specific and powerful insights into the different roles played by firms” ([4], footnote 10). They want to have it both ways: get the rhetorical boost of a powerful biological metaphor, but avoid dealing with parts of the mapping that are inaccurate or misleading. Yet by their own statement in the book, if industries cannot be compared to ecosystems, then their insights cannot be validated by the analogy. And they do attempt a mapping: for example, they try to answer the question “What makes a healthy business ecosystem?” by examining ecosystem phenomena like (1) hubs, which are said to account for the fundamental robustness of nature’s webs, (2) robustness measured by survival rates in a given ecosystem, (3) productivity of an ecosystem analogized to total factor productivity analysis in economics, and (4) niche creation.
In most if not all cases, the appeal to ecosystem is very superficial; no substantive analogy is drawn. For example, Messerschmitt and Szyperski’s book [6], which made it into softcover, is entitled Software Ecosystem, but its remit seems to be simply to examine software “in the context of its users, its developers, its buyers and sellers, its operators, society and government, lawyers, and economics”; the word ecosystem doesn’t even appear in the index. (Disclaimer: I haven’t read the book.) The word ecosystem is simply meant to evoke a community of interdependent actors, with no reference beyond that to dynamics or behavior.
A more generic flaw with the business ecosystem metaphor is that most people are more familiar with businesses than with ecosystems. Successful metaphors usually explain complex or less-known things in terms of simpler, more familiar ones. Shall I compare thee to a Summer's day? The rhetorical appeal of the business ecosystem analogy must lie beyond its rather weak ability to domesticate a strange idea.
Why do careful scholars resort to the ecosystem metaphor in spite of its obvious flaws? The image of nature is a symbol too potent to pass up. Nature represents The Good (at least in our culture at this time), and therefore an appeal to a natural order is a compelling argument if one’s claims bear some resemblance to what’s happening in nature. However, if nothing else, this reminds me of Hegel’s historicist cop-out that what is real is rational, and what is rational is real. Just because nature is constructed in a certain way doesn’t mean that industries should be.
Perhaps my standards for metaphors are too high. To me, a conceptual metaphor is a mapping of one set of ideas onto another; one has to take the “bad” elements of the mapping with the “good”. If the good outweighs the bad, and if the metaphor produces insight and new ideas, then the analogy has value. Others just take the “good” and simply ignore the “bad”. For me, a metaphor is a take-it-all-or-leave-it set menu, not something to pick from à la carte.
Tentative conclusion
Does this all matter? Yes, but I still have to work out the details. For now I just claim that the weakness of the mapping between biological and business systems means that any argument one might make about the goodness of “business ecosystems” in general and “keystone species” in particular is potentially misleading. It could delude firms into making unsound investments, e.g. in “building ecosystems,” and lead policy makers into dangerous judgments.
Notes
[1] The module ecosystem differs from the business ecosystem in that species, the nodes of the food web, are mapped to functional modules, rather than to individual companies. However, the glaring weaknesses of the business ecosystem metaphor undermine my confidence in the whole approach.
[2] John Harte, in “Business as a Living System: The Value of Industrial Ecology (A Roundtable Discussion),” California Management Review (Spring 2001), argues that the ecological sustainability practices under the banner of “industrial ecology” are worthy and important, but that they do not mimic the way natural ecosystems work. His ideas are reflected in items #2, #5 and #6. He also notes that human processes are much more efficient at converting energy than natural ones are – photosynthesis is only about half a percent efficient, whereas power plants at 30% are sixty times more efficient. Note, however, that industrial ecology, defined as the proposition that industrial systems should be seen as closed-loop ecosystems where wastes of one process become inputs for another process (Wikipedia, ISIE), differs from the business ecosystem idea as I treat it here, i.e. that industry organization (regardless of ecological impact) can be understood as an ecosystem.
[3] Marco Iansiti and Roy Levien, “The New Operational Dynamics of Business Ecosystems: Implications for Policy, Operations and Technology Strategy,” Harvard Business School Working Paper 03-030 (2003)
[4] Marco Iansiti and Roy Levien, The Keystone Advantage: What the New Dynamics of Business Ecosystems Mean for Strategy, Innovation, and Sustainability, Harvard Business School Press, 2004
[5] Eric von Hippel, Democratizing Innovation, MIT Press, 2005; see also, e.g., Charles Leadbeater
[6] David Messerschmitt and Clemens Szyperski, Software Ecosystem – Understanding an Indispensable Technology and Industry, MIT Press, 2003
A perspective on persistence
A wonderful fragment from the poem Seventeen Pebbles, from Jane Hirshfield's 2006 collection After: Poems (p. 61)
For days a fly travelled loudly
from window to window,
until at last it landed on one I could open.
It left without thanks or glancing back,
believing only - quite correctly - in its own persistence.
Thursday, June 14, 2007
The Narrative Fallacy, Data Compression, and Counting Characters
I’m very grateful to Tren Griffin and Pierre-Yves Saintoyant for independently suggesting that I read Nassim Nicholas Taleb’s “The Black Swan: The Impact of the Highly Improbable.” Both must’ve realized how relevant his thinking is to my exploration of hard intangibles. At times it felt as if the book was written with me in mind.
One of the human failings that Taleb warns against – “rails against” might be more accurate – is the narrative fallacy. He argues that our inclination to narrate derives from the constraints on information retrieval (Chapter 6, “The Narrative Fallacy”, p. 68-9). He notes three problems: information is costly to obtain, costly to store, and costly to manipulate and retrieve. He notes, “With so many brain cells – one hundred billion (and counting) – the attic is quite large, so the difficulties probably do not arise from storage-capacity limitations, but may just be indexing problems. . .” He then goes on to argue that narrative is a useful form of information compression.
I’m not sure what Taleb means by “indexing”, but I suspect that the compression is needed for extracting meaning, not for storing the raw information. It’s true that stories provide a useful retrieval frame; since there’s only a limited number of them, we can perhaps first remember the base story, and then the variation. However, the long-term storage capacity of the brain seems to be essentially unbounded. What’s limited is our ability to manipulate variables in short-term memory; according to Halford et al., we can handle only about four concurrent items.
Joseph Campbell reportedly claimed that there were seven basic plots, an idea elaborated by Christopher Booker in The Seven Basic Plots; see here for a summary. The number is pretty arbitrary; Cecil Adams reports on a variety of plot counts, between one and sixty-nine. While the number of “basic plots” is arbitrary, the number of key relationships is probably more constant. I’m going to have to get hold of Booker’s book to check out this hypothesis, but in the meantime, a blog post by JL Lockett about 36 Basic Plots lists the main characters; there are typically three of them, or sometimes four. Now, there are many more characters in most plays and novels – but the number of them interacting at any given time is also around four.
I think one might even be able to separate out the data storage from the relationship storage limits: {stories} x {variations} allows one to remember many more narratives than simple {stories}, but I expect that the number of relationships in a given instance of {stories} x {variations} will be no greater than that in a given story.
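A rough illustration of the point, sketched in Python with made-up numbers (the plot and variation counts below are assumptions purely for arithmetic, not data from Booker or Halford): combining a handful of base stories with variations multiplies what can be indexed and retrieved, while the number of relationships that must be juggled in any single instance stays at the same small bound.

```python
# Illustrative only: all figures are invented for the sake of the arithmetic.
base_stories = 7           # a Booker-style count of "basic plots"
variations_per_story = 50  # hypothetical rememberable variations per plot

# Storage side: stories x variations multiplies the number of retrievable narratives.
retrievable_narratives = base_stories * variations_per_story
print(retrievable_narratives)  # 350, versus only 7 bare plots

# Processing side: any one instance still involves only a handful of
# concurrently interacting characters/relationships (~4, per Halford et al.).
max_concurrent_relationships = 4
```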
More generally: if making meaning is a function of juggling relationships (cf. semiotics), a limit on the number of concurrent relationships our brains can handle represents a limit on our ability to find meaning in the world.
Friday, June 08, 2007
Tweaking the Web Metaphor
Today’s modular Internet needs a metaphor make-over. The “silo” and “layer” frameworks that have guided thinking about it so far are no longer adequate. It’s time to reinvent a well-worn metaphor: the Web as a web. [1], [2]
The silo model divided up the communications business by end-user experiences like telephony, cable and broadcast television, assuming that each experience has its own infrastructure. The distinct experiences with unique public policy aspects remain, but the silos are growing together at the infrastructure level since all media are now moved around as TCP/IP packet flows. The layer model reflects this integration of different media all using the same protocols. It’s relevant when one takes an infrastructure perspective, but doesn’t take into account the very real differences between, say, real-time voice chat, blogs, and digital video feeds. One might say that the silo model works best “at the top” and a layer model “at the bottom”; in the middle, it’s a mess.
Time to revive the web metaphor, with a twist. The “web” of the World Wide Web refers to the network of pointers from one web page to another. [3] The nodes are pages, and the connections between them are hyperlinks. The “info-web” model I’m exploring here proposes a different mapping: the connections in the web represent information flow, not hyperlinks, and the nodes where they connect are not individual pages but rather functional categories, like blogs, social networking sites, search portals, and individuals.
Food webs
It’s a web as in an ecosystem food web, where the nodes are species and the links are flows of energy and nutrients. The simplest view is that of a food chain: in a Swedish lake, say, ospreys eat pike, which eat perch, which eat bleak, which eat freshwater shrimp, which eat phytoplankton, which get their energy from the sun via photosynthesis. A chain is a very simple model which shows only a linear path for energy and material transfer. (The Layers model resembles a food chain, where network components at one layer pass down communications traffic to the layer below for processing.)
A food web extends the food chain concept to a complex network of interactions. It takes into account aspects ignored in a chain, such as consumers which eat, and are eaten by, multiple species; parasites, and organisms that decompose others; and very big animals that eat very small ones (e.g. whales and plankton). The nodes in a food web are species, and the links between them represent one organism consuming another. While the nodes are multiply connected, there is some degree of hierarchy, since in an ecosystem there’s always a foundation species that harvests energy directly from non-organic sources, usually a plant using sunlight. Each successive organism is at a higher trophic level; first the phytoplankton, then the shrimp, then the bleak, etc.
Info-webs
In an eco-based web model for the Internet, the species in a bio-web are mapped to functionality modules as described in my earlier post, A modular net. For example, a YouTube video clip plugged into a MySpace page running on a Firefox browser on a Windows PC might correspond to the osprey, fish, shrimp, and plankton in the simple example above. In the same way that there might be other predators beyond ospreys feeding on fish, there might be many other plug-ins on the MySpace page for IM, audio, etc. In a bio-web, a link between species A and B means “A eats B”. In the info-web model of the Internet, a link means “information flows from A to B.” Value is added to information in the nodes through processing (e.g. playing a video) or combination (e.g. a mash-up). For example, a movie recommender embedded in Facebook gets its information from a database hosted somewhere else, and integrates it into a user’s page. Information transport is therefore key. One can think of the links as being many-stranded if there are many alternative ways of getting the relevant information across, or single-stranded if there are only one or two communications options (e.g. for web search one can use Wi-Fi, 3G data, DSL, cable modem, etc., but for high-def video on demand there are many fewer choices).
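As a very rough sketch of what this mapping might look like in data terms (the module names and strand counts below are invented for illustration, not a proposed taxonomy), one can treat the info-web as a directed graph whose nodes are functional modules and whose edges carry information flows, annotated with how many transport “strands” are available:

```python
# Toy info-web: nodes are functional modules, a directed edge means
# "information flows from A to B", and the value is the number of
# alternative transport options ("strands"). All entries are hypothetical.
info_web = {
    ("video_host", "profile_page"): 3,       # e.g. DSL, cable, Wi-Fi
    ("recommender_db", "profile_page"): 3,
    ("profile_page", "browser"): 3,
    ("hd_video_service", "set_top_box"): 1,  # few choices for high-def video
}

def single_stranded_links(web):
    """Links with only one transport option are potential choke points."""
    return [edge for edge, strands in web.items() if strands == 1]

print(single_stranded_links(info_web))
# [('hd_video_service', 'set_top_box')]
```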
The analogy between the info-web and the food web diverges when one considers what flows across the links. In the Internet, information flows around the web; in the biological case, it’s energy and nutrients. Information can be created at any stage in an information web and increases with each step, whereas energy is conserved, and available energy decreases as one moves up the trophic levels of an ecological pyramid. There is a sequence of “infotrophic levels” where information value is added at each step. However, since the amount of information grows with each step, the “information pyramid” is inverted relative to the ecological one: it grows wider from the bottom to the top, rather than narrower.
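A back-of-the-envelope illustration of the contrast (the 10% trophic transfer efficiency is a textbook rule of thumb, and the 50% value-added factor is an arbitrary assumption): available energy shrinks at each trophic level, while information value grows at each “infotrophic” level.

```python
# Ecological pyramid: roughly 10% of energy is passed up each trophic level
# (a stock rule of thumb, used here only for illustration).
energy = [100.0]
for _ in range(3):
    energy.append(energy[-1] * 0.10)
print(energy)      # [100.0, 10.0, 1.0, 0.1]  -- narrows toward the top

# "Information pyramid": assume, arbitrarily, that each processing step adds
# 50% more value by combining or transforming what it receives.
info_value = [100.0]
for _ in range(3):
    info_value.append(info_value[-1] * 1.5)
print(info_value)  # [100.0, 150.0, 225.0, 337.5] -- widens toward the top
```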
Implications for policy making
The Internet is a complex web of interlocking services, and is approaching the richness of simple biological ecosystems. In the same way that humans can’t control ecosystems, regulators cannot understand, let alone supervise, all the detailed interactions of the Internet. One may be able to understand the interactions at a local level, e.g. how IP-based voice communications plug into web services, but the system is too big to wrap one’s head around all the dynamics at the same time. [4] This is why a market-based approach is advisable. Markets are the best available way to optimize social systems by distributing decision making among many participants. Markets aren’t perfect, of course, and there are social imperatives like public safety and justice that need government intervention. The info-web model suggests ways to find leverage points where regulators should focus their attention, and also provides salutary lessons about the limits of the effectiveness of human ecosystem management.
For example, a keystone species is one that has a disproportionate effect on its environment relative to its abundance. Black-tailed prairie dogs are a keystone species of the prairie ecosystem; more than 200 other wildlife species have been observed on or near prairie dog colonies. Such an organism plays a role in its ecosystem that is analogous to the role of a keystone in an arch. An ecosystem may experience a dramatic shift if a keystone species is removed, even though that species was a small part of the ecosystem by measures of biomass or productivity. Regulators could apply leverage on “keystone species” rather than searching for bottlenecks or abuse of market power. This would provide a basis for both supportive and punitive action. At the moment, search engines are “keystone species” – they play a vital role not only in connecting consumers with information, but also in generating revenue that feeds many business models. One might say that Google is the phytoplankton of the Internet Ocean, converting the light of user attention into the energy of money. Local Internet Service Providers may also be keystone species. In an earlier phase of the net, portals were keystone species. Keystone services provide a point of leverage for regulators; they can wield disproportionate influence by controlling the behavior of these services.
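To make the keystone notion slightly more operational, here is a toy illustration only (the node names, “abundance” figures, and scoring rule are all invented, not a proposed regulatory metric): score each node in an info-web by how much influence it wields relative to its size.

```python
# Toy keystone score: influence relative to abundance. All numbers are invented.
# "dependents" ~ how many other nodes rely on this one; "abundance_share" ~ its
# share of the system by some size measure (revenue, traffic, "biomass"...).
nodes = {
    # name: (dependents, abundance_share)
    "search_engine": (40, 0.05),
    "local_isp":     (25, 0.03),
    "video_portal":  (10, 0.20),
    "generic_blog":  (1,  0.01),
}

keystone_scores = {
    name: dependents / abundance
    for name, (dependents, abundance) in nodes.items()
}

# Nodes with high scores have a disproportionate effect relative to their size,
# i.e. keystone-like candidates for regulatory attention.
for name, score in sorted(keystone_scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.0f}")
```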
The unintended side effects of intervention in ecosystems stand as a warning to regulators to tread carefully. For example, the Christian Science Monitor reported recently on efforts to eradicate buffelgrass from the Sonoran Desert. It was introduced by government officials after the Dust Bowl in an attempt to hold the soil and provide feed for cattle. It’s unfortunately turned out to be an invasive weed that threatens the desert ecology, choking out native plants like the iconic saguaro cactus. Another example of biological control gone wrong is the introduction of the cane toad into Australia in 1935 to control two insect pests of sugar cane: it did not control the insects and the Cane Toad itself became an invasive species. By contrast, the release of myxomatosis in 1950 was successful in controlling feral rabbits in that country.
----- Notes -----
[1] Steven Johnson’s Discover essay, republished as “Why the web is like a rain forest” in The Best of Technology Writing, ed. Brendan Koerner, helped inspire this thinking.
[2] This is a rough first draft of ideas. There are still many gaps and ambiguities. The nature of the nodes is still vague: are they applications/services (LinkedIn is one node, Facebook is another), application categories (all kinds of social networking sites are one node), market segments, or something else? How and where does the end user fit in? How can one use this model to address questions of VOIP regulation, accessibility directives, culture quotas for video, and other hot topics in Internet policy? Much work remains to be done. The representation of transport services as links rather than nodes may change. The difference in conservation laws needs to be worked out: sunlight, water and nutrients are limited, conserved, rival resources in the web, whereas information is non-rival and can be produced anywhere. Connections need to be made with prior work on metaphors for communications technologies, e.g. Susan Crawford’s Internet Think, Danny Hillis’s Knowledge Web, and Douglas Kellner’s “Metaphors and New Technologies: A Critical Analysis.”
[3] The word web derives from the Old Norse vefr, which is akin to weave. It thus refers to a fabric, or cloth. In many usages, e.g. food webs, there are assumed to be knots or nodes at the intersection of warp and weft, which occur in nets, but not in fabrics.
[4] This is a link to the Hard Intangibles problem more generally, via the limit (about four) on the number of independent variables that humans can process simultaneously.
Thursday, June 07, 2007
Algo Trading
This week's New Scientist has a good review article on algorithmic trading (Robert Matthews, "Gordon Gekko makes way for trading software," 30 May 2007).
Some excerpts:
"Investors have realised that the processing speed and sheer volume of trades a computer can make can help them to outwit the sharpest of dealers. . . . Ten years ago, algo-trading was almost non-existent, but according to a recent report by [Brad] Bailey, now at the Boston-based consulting firm Aite Group, one-third of all trading decisions in US markets are now made by machines. He predicts that by 2010 more than half will be done this way. At Deutsche Bank in London, over 70 per cent of a category of foreign currency trades, called "spot trades", are now carried out without human intervention every day."
"Silicon is taking over from carbon on Wall Street," says Bailey.
"Dave Cliff, a computer scientist at Southampton University and founder of Syritta, a UK-based consultancy firm that develops algo-trading software [has turned to genetic algorithms to manage the large number of parameters that have to be tweaked.] His new system takes an initial set of guesses about the optimal selection of market parameters, tests how well each parameter describes prevailing market conditions, and then "breeds" a new selection from these to arrive at a more effective set. This evolutionary cycle is repeated until optimum values for the parameters are reached which the algo then uses to trade with."
"Human traders can make up for the lack of data with instinct and experience, and hooking human instinct up to computing power is now at the leading edge of algo trading. The result is software that helps the trader come up with ideas for bagging some alpha, and tests those ideas in simulations to see if they'll fly. With so many variables, it's easy to make mistakes, but the computer can spot them before unleashing the algo upon the market."