Tobias Burgers alerted me to Sean Lawson’s 2013 paper “Beyond Cyber-Doom: Assessing the Limits of Hypothetical Scenarios in the Framing of Cyber-Threats” (DOI). Lawson’s article helped me further understand the servant/master narrative that seems to be a trope of technology stories.
Lawson gives a useful inventory of cyber-doom scenarios. (Given the paper’s date, the examples are vintage 2010; I wonder if those stories have become more or less prevalent...)
I thought his description of “securitization theory” was a rather neat motivator for my work, too: this theory “begins with the observation that threats to security are neither natural nor given, but rather must be constructed through political discourse.” Of course, story is the salient part of discourse to me.
His set-up of cyber-doom scenarios is also useful; as a basis for a few comments, here’s his definition:
Cyber-doom scenarios are hypothetical stories about prospective impacts of a cyberattack and are meant to serve as cautionary tales that focus attention on cybersecurity. These stories typically follow a set pattern involving a cyberattack disrupting or destroying critical infrastructure. Examples include attacks against the electrical grid leading to mass blackouts, against the financial system leading to economic losses or complete economic collapse, against the transportation system leading to planes and trains crashing, against dams leading floodgates to open, or against nuclear power plants leading to meltdowns.
A few notes on this description:
- Scenarios are clearly stories, even in the strict sense of a series of logically and chronologically related events that are caused or experienced by actors (following Mieke Bal’s definition). Merriam-Webster defines scenario as “a sequence of events especially when imagined,” which underlines the hypothetical quality.
- When used in scenario planning, such stories are meant to test the envelope within which hazard responses will hold up. They therefore, almost necessarily, include worst-case hypotheticals. However, in Lawson’s telling, they are given an apocalyptic tone. Even though I don’t think he plays this card explicitly, I sense a logos-mythos bias: policy-makers should be rational, and not fall for religiously-inflected hysteria about end times. (Cf. Lawson's note that “[s]everal scholars have noted a divergence between the rhetoric of cyber-doom scenarios and the reality of actual cyberattacks.”)
- The underlining of their function as cautionary tales seems important. At one point I was framing my work by saying that one can understand the world through facts, feelings, and fables. I chose "fables" because of the alliteration, but the downside was that everyone understands fables to have a moral – which myths and socially significant stories don’t necessarily have. (To be powerful, I think they have to be ambiguous, and thus not force a particular moral imperative down the reader’s throat.)
- Infrastructure compromise seems to be a key ingredient, at least in Lawson’s telling. As the editor of the Christian Science Monitor wrote apropos the recent Texas winter storm, “A former editor of mine once argued that ‘infrastructure’ is the most boring word in journalism – guaranteed to induce drowsiness within 10 seconds. . . . Infrastructure seems to matter only when it fails.” One of the rhetorical weaknesses of the cyber-doom narrative, I’d submit, is that most people don’t care about infrastructure, most of the time. (Cf. the appalling state of US infrastructure.)
One of the key questions, I think, is whether the public is aware of these scenarios, and takes them seriously. (Using my “know/believe/act” rubric: most of them know these stories; many of them believe them; and some want their politicians to act.) The same question, of course, also applies to the policy-making elites – I suspect some of them don’t believe the stories but exploit them to achieve their policy aims anyway.
I like the fear-justification for the success of cyber-doom rhetoric, though I suspect there isn’t much quantitative evidence to support the claim; the three papers Lawson cites are essays. I think the “technology out of control” trope is a powerful one, though – perhaps necessarily? – the arguments for it seem to be mostly discursive cultural analysis rather than empirical work (e.g., Langdon Winner and Leo Marx). Lawson’s summary points to the mythological function of these stories:
Many of the concerns found in contemporary cybersecurity discourse are not unique, but rather, have strong corollaries in early 20th century concerns about society’s increasing reliance upon interdependent and seemingly fragile infrastructure systems of various types, including electronic communication networks.
A narrative version of this would be that society has become increasingly dependent on an increasingly powerful and unreliable servant. Lawson gives various real-world examples (the US Navy’s reluctance to adopt the radio; anxieties about telegraph systems; and airpower theories), but I’m reminded of E. M. Forster’s “The Machine Stops,” in which the collapse of technology brings down civilization.
Lawson’s analysis is driven by his objection to “over-reliance on hypothetical scenarios instead of empirical data,” and his belief that historical and sociological data “[cast] serious doubt on the assumptions underlying cyber-doom scenarios by demonstrating that both infrastructures and societies are more resilient than often assumed.” That may well be the case, but the fact that he and his peers have had to make this case implies that these views are persistent, pervasive and effective. In the context of his section on blackouts, he suggests that these ideas “persist because of the persistence of a technological determinist mindset among officials, the media, and the general public.” If that’s true, it means that the belief that a society's technology determines the development of its social structure and cultural values (Wikipedia) functions as a guiding myth. In narrative terms, we have a generalization of the unreliable servant story: Technology dictates how Society behaves and evolves – in effect, the servant has become the master.
I won’t get into Lawson’s policy recommendations, except to say that he occasionally engages in doom-laden hypotheticals himself. For example, he argues that the creation of US Cyber Command “is fraught with danger” since it could undermine US policy preferences for an open Internet, and accelerate the militarization of cyberspace. This is ironic, since Lawson frames the policy debate with himself on the “just the facts, ma’am” side, with the scenario-mongers on the other.