Homer-Dixon’s The Ingenuity Gap helped me realize that perhaps the difference between software and more traditional engineering is that bridges (say) are complicated, while software is complex. I follow the distinction I’ve seen in the systems theory literature: complicated refers to something with many parts, whereas complex refers to something that exhibits unpredictable, emergent behavior. Something complicated may not be complex (e.g. a watch), and a complex system might not be complicated (e.g. a cellular automaton).
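To see how little machinery “complex” actually requires, here is a minimal sketch in Python (my own illustration, not anything from the book) of an elementary cellular automaton. The entire update rule fits in one byte, yet Rule 30’s output is irregular enough that it has been used as a pseudo-random number generator: simple parts, emergent behavior.

```python
# Elementary cellular automaton, Rule 30: an uncomplicated system
# (one byte of rules, a row of bits) with complex, emergent behavior.
RULE = 30

def step(cells):
    """Update every cell from its (left, center, right) neighborhood,
    treating the row as circular. The 3-bit neighborhood indexes into
    the bits of RULE."""
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 31
row[15] = 1  # start from a single live cell in the middle
for _ in range(16):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

Run it and the familiar chaotic Rule 30 triangle appears: nothing in the one-line rule hints at the pattern it produces.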
A large piece of code meets the criteria for a complex adaptive system: there are many, densely interconnected parts that affect the behavior of the whole in a non-linear way which cannot be understood simply by looking at the components in isolation. A code module can be specified and tested individually, but its behavior in the context of all the other components in a large piece of code can only be observed – and even then not truly understood – by observing the behavior of the whole. If software were linear and merely complicated, extensive testing wouldn’t be required after individually debugged modules are combined into a build.
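As a deliberately contrived illustration of that last point (hypothetical module names, Python chosen for brevity), the sketch below shows two “modules” that each pass their unit tests in isolation but interact through shared state, so the combined build behaves in a way neither module’s tests ever exercised.

```python
# Two modules, each correct by its own unit tests, that misbehave
# together because they are coupled through shared mutable state.
shared_config = {"retries": 3}

def fetcher():
    """Module A: retry a flaky operation `retries` times."""
    return f"fetched with {shared_config['retries']} retries"

def tuner():
    """Module B: 'optimize' startup time by disabling retries."""
    shared_config["retries"] = 0
    return "tuned"

# Each module's own tests pass in isolation:
assert fetcher() == "fetched with 3 retries"
assert tuner() == "tuned"

# But in the combined build, module B's side effect has silently
# changed module A's behavior -- something no isolated test observed:
print(fetcher())  # "fetched with 0 retries"
```

The bug lives in neither module; it emerges from their interaction, which is exactly why the integrated build still needs testing.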
Some have compared software to bridges while calling (properly) for better coding practices. Holding software development to the same standards as other facets of critical infrastructure is questionable, however, because a software program is to a bridge as the progress of a cocktail party is to a watch. Both parties and watches have a large number of components that can be characterized individually, but what happens at a given party can only be predicted in rough terms (it’s complex because of the human participants) while a watch’s behavior is deterministic (though it’s complicated).
This challenge in writing software is of the “intrinsically hard” kind. It is independent of human cognition because it catches you eventually, no matter how clever or dumb you are (once you’re at least smart enough to build complex software systems at all).
Geek’s addendum: Definitions of complexity
Homer-Dixon’s definition of complexity has six elements. (1) Complex systems are made up of a large number of components (and are thus complicated, in the sense above). (2) There is a dense web of causal connections between the parts, which leads to coupling and feedback. (3) The components are interdependent, i.e. removing a piece changes the function of the remainder. (I think this is actually more about resilience than complexity.) (4) Complex systems are open to being affected by events outside their boundaries. (5) They display synergy, i.e. the combined effect of changes to individual components differs in kind from the sum of the individual changes. (6) They exhibit non-linear behavior, in that a change in a system can produce an effect that is disproportionate to the cause.
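Element (6) can be demonstrated with a single line of arithmetic. The sketch below uses the logistic map, a standard textbook example of non-linearity (mine, not Homer-Dixon’s), to show a cause of one part in a million producing an effect out of all proportion.

```python
# The logistic map x' = r*x*(1 - x) is chaotic at r = 4: trajectories
# that start almost identically diverge until they are uncorrelated.
def trajectory(x, r=4.0, steps=30):
    out = []
    for _ in range(steps):
        x = r * x * (1 - x)
        out.append(x)
    return out

a = trajectory(0.300000)
b = trajectory(0.300001)  # perturb the "cause" by one part in a million
for t in (5, 15, 25):
    print(f"step {t:2d}: |difference| = {abs(a[t] - b[t]):.6f}")
# The gap roughly doubles each step; by step 25 the two runs bear no
# relation to each other -- a disproportionate effect from a tiny cause.
```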
Sutherland and van den Heuvel (2002) analyze the case of enterprise applications built using distributed object technology. They point out that such systems exhibit highly unpredictable, non-linear behavior in which even minor occurrences can have major implications, and they observe that recursively writing higher-level languages supported by lower-level ones, a source of the power of computing, induces emergent behavior. They cite Wegner (1995) as having shown that interactive systems are not Turing machines: “All interactions in these systems cannot be anticipated because behavior emerges from interaction of system components with the external environment. Such systems can never be fully tested, nor can they be fully specified.” They use Holland’s (1995) synthesis to show how enterprise application integration (EAI) can be understood as a complex adaptive system (CAS).
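Here is a toy sketch of Wegner’s point, under my own assumptions rather than his formalism: when a component’s behavior depends on the order of its interactions with the environment, even three external events can yield several distinct end states, and a real system faces an unbounded stream of them, which is why exhaustive testing is out of reach.

```python
# A stateful component whose final state depends on the *history* of
# external events, not just their set. A tester must cover orderings,
# and the number of orderings grows factorially with the event count.
from itertools import permutations

class Account:
    def __init__(self):
        self.balance = 0
    def handle(self, event):
        kind, amount = event
        if kind == "deposit":
            self.balance += amount
        elif kind == "fee":  # a fee can't push the balance below zero
            self.balance = max(self.balance - amount, 0)

events = [("deposit", 10), ("fee", 5), ("fee", 10)]
outcomes = set()
for order in permutations(events):
    acct = Account()
    for e in order:
        acct.handle(e)
    outcomes.add(acct.balance)
print(sorted(outcomes))  # [0, 5, 10]: three end states from three events
```

Three events already produce three distinct outcomes; an open-ended interactive system, fed events it cannot anticipate, can never have its behavior enumerated in advance.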
References
Sutherland, J. and van den Heuvel, W.-J. (2002). “Enterprise application integration encounters complex adaptive systems: a business object perspective.” Proceedings of the 35th Annual Hawaii International Conference on System Sciences (HICSS), 2002.
Wegner, P. (1995). “Interactive Foundations of Object-Based Programming.” IEEE Computer 28(10): 70-72, 1995.
Holland, J. H. (1995). Hidden Order: How Adaptation Builds Complexity. Reading, Mass.: Addison-Wesley, 1995.