Friday, July 22, 2005

Bloggregators

John Reith
Blogs aren't newspapers, though some like Slashdot are beginning to look that way. Newspapers are magical brews: toe of sports, eye of news, wool of gossip, tip of hope, root of evil. Lord Reith's agenda for the BBC applies, though perhaps no longer in his preferred order: educate, inform and entertain. Editors not only give us what they think we want, but also what they want us to think.

Newspapers, even those online, have a richer visual structure than most blogs, reflecting their more diverse content. See, for example, the New York Times and the Washington Post: multiple columns, rich typography, and multiple horizontal "folds".

One could assemble a compelling community-written alternative to conventional newspapers out of the ingredients on the web. The parts and pieces are there, but I haven't found the gestalt:

Browser-based RSS readers (like My Yahoo!) allow me to construct a list of content from blogs and news outlets, but don't give me the control over layout that I want. It's a one-dimensional list, with some options for a side-bar. The stories are also represented by just a headline.

Google News allows me to re-arrange the sections I'm interested in, and create custom sections - but sources the content from news sites, not blogs (yet?).

I'd like to compose a front page out of individual stories from my feeds (a la My Yahoo RSS) that fit my keyword requirements (a la Google), presented as link+precis. I want a filtered, aggregated view of my blog feeds - hence the term "bloggregator".

Since I can imagine this, somebody's already built it, perhaps with a nifty combination of RSS and greasemonkey. Dear Reader, please let me know via the Comments where to find it.
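
If it doesn't exist yet, the core of it isn't much code. Here's a minimal sketch in Python of the filtering-and-precis step I have in mind, using the feedparser library; the feed URLs and keywords are placeholders, and a real bloggregator would still need the newspaper-style layout on top:

# A minimal "bloggregator" sketch: pull a handful of RSS/Atom feeds, keep only
# the entries that match my keywords, and emit a flat link+precis listing.
# The feed URLs and keywords below are placeholders, not real endpoints.
import feedparser

FEEDS = [
    "http://example.com/blog-a/rss.xml",
    "http://example.com/blog-b/atom.xml",
]
KEYWORDS = ["apple", "drm", "voip"]  # the "a la Google" keyword filter

def matching_entries(feed_url, keywords):
    """Yield (title, link, precis) for entries that mention any keyword."""
    parsed = feedparser.parse(feed_url)
    for entry in parsed.entries:
        text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
        if any(keyword in text for keyword in keywords):
            precis = entry.get("summary", "")[:200]  # crude precis: first 200 characters
            yield entry.get("title", "(untitled)"), entry.get("link", ""), precis

def front_page(feeds, keywords):
    """Assemble the filtered entries into a bare-bones HTML listing."""
    items = []
    for url in feeds:
        items.extend(matching_entries(url, keywords))
    return "\n".join(
        f'<p><a href="{link}">{title}</a><br/>{precis}...</p>'
        for title, link, precis in items
    )

if __name__ == "__main__":
    print(front_page(FEEDS, KEYWORDS))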

The Windows Vista: coming soon to a car dealership near you

Brian Valentine
If Blogger had categories, this would be filed under "Cheap Shots". Or maybe "Everybody's an Expert". Still, I can't help myself...

"Windows Vista" is the official name for code base formerly known as Longhorn (Microsoft announcement). The tag line is "Clear, Confident, Connected: Bringing Clarity to your World".

Huh?

I don't look to an operating system to help me "cut through the clutter and focus on what's important", which is what Clear seems to refer to. I suppose the Confident is trying to resonate with the Trustworthy Computing message, and persuade people that Windows is more secure than the alternatives. I'm sure it has nothing to do with the fact that CIOs are using Firefox because it's thought to be more secure than IE. As for Connected: well, we're celebrating the tenth anniversary of the Internet right now, so it'll be very topical in 2006.

The name reminds me of faded car brands: the Ford Pinto, the Chrysler Volare, the Chevrolet Nova, the Buick Regal. So: would you buy a used car from that man?

Wednesday, July 20, 2005

Practice (continued): getting better, but not best

Miladin had a perceptive comment about "best". It reminded me of the rhyme I learned as a kid:
Good, better, best
Never let us rest
'Til our good is better
And our better, best.

The propaganda for the existence of "best" (which I think the two of us agree is a myth) clearly starts early. Where does it come from? It's possible that the notion of "best" is wired in; our brains could come with a built-in notion of a superlative, and a built-in drive to reach it. However, I think one can get by without this assumption; relativism will suffice.

Comparisons are the stuff of life. All living beings act on the basis of A being more or less desirable than B. This mate is more fit than that one; this food is better than that food; this place is safer than that place. Along with comparison comes preference, that is, making the choice of one thing over another. This then leads to a series: C better than D, better than E ... In principle, of course, there will always be a B better than C, and an F worse than E. Since we're finite beings, however, it's a good simplifying step to assume that the series is finite, and that C is the best one can do. It facilitates the decision, which is evolutionarily adaptive; if you always keep waiting for something better to come along, you'll never get laid, fed, or secure.

Anxiety is the stuff of life: animals are always striving to do better. If they don't, their genes won't survive. Focused anxiety is more tolerable than the free-floating kind, and hence the endless search for "best". Many cultural artifacts are designed to reassure us that "best" exists, and that there is therefore hope of respite from our anxiety. To take two stereotypical examples: romances where the heroine eventually lands the man of her dreams and lives happily ever after; and sport, where every game has a result, and every contest (in America, at least) has a winner.

In a more honest world, the doggerel might read:
Good, better, ... um
Never let us bum
'Til our good is better
And the rest - stay mum

Miladin also sought advice on how to organize time better. I have two recommendations. David Allen's "Getting Things Done: The Art of Stress-Free Productivity" is a great self-help (!) guide to creating actionable To-Dos. Since a lot of stress comes from a big reading load, I'd recommend Mortimer Adler's "How to Read a Book"; for a quick intro to it, see the last chapter in Robert Hagstrom's "Investing: The Last Liberal Art".

Saturday, July 16, 2005

The practice of practice

The endless shelves of the Self-Help Section shout out our imperfections to the world. So much to be fixed! Self-improvement is an endless road. Since our lives are finite, we are doomed never to reach the destination. The destinations, plural, to be painfully precise. Examine your life for even a few moments, and you can list endless topics for improvement, from a golf swing to showing compassion.

Christianity offers great comfort; God's grace comes to the rescue to fill the infinite gap between our inadequacy and the criteria for admission to Heaven. Buddhism is less forgiving; since suffering is rooted in one's desire, only your own efforts can break the cycle of wanting and lacking. However, even here religion has constructed hope: karma gives you multiple shots in successive lives at getting it right.

Some systems of thought don't try to solve the puzzle, and simply give advice on muddling through as best one can - Taoism, for example. Lin Yutang extols the importance of loafing, leisurely walks, and long talks with friends. My mother likes to quote GK Chesterton: "If a thing is worth doing, it is worth doing badly."

All of them advise one to be mindful of the challenge, even if it is merely to ignore it. But the challenge is to thread the path between pride and despair.

Any practice requires endless practice, whether it is spiritual growth or cooking. There is never an end to learning, and so the amount to be learned is infinite. Any progress is infinitesimal measured against what needs to be done; mathematically, any finite quantity set against the infinite amounts to zero. Judged against the task, one can only despair.

Ah, but one can take heart by looking back and seeing how far one's come. No matter how inconsequential the progress, it is more than would've happened if you had made no effort at all. Even with a step back for every two steps forward, the headway you make is significant and can be a source of pride. As you advance down the path, though, a day's progress becomes an ever-smaller fraction of the journey to date...

Perhaps the only escape is to focus on the moment, and ignore both past and future. One has to find joy in the practice for its own sake, regardless of its greater purpose. Only in Zeno's story does Achilles concede the race because the Tortoise persuades him that he can never catch up. The real-life Achilles just runs, wins, and (because he's a mean bastard) has the Tortoise for dinner.

Wednesday, July 13, 2005

Bringing in the Apple harvest

Some Apple news in the last few days gives a snapshot of the most vulnerable moment in their product cycle: nearing the end of one innovation, and before the next one has appeared.

Business Week tells the story of the iPod business's fading lustre: Wall Street is getting skittish about not seeing any more blow-out quarters in the pipeline. At the same time, a study by SG Cowen (N=1,400) indicates that the iPod is exposing Windows users to Apple, and increasing their interest in Macs.

The iPod party may well be ending, but it doesn’t follow that the Apple party is ending. Jobs is executing a very neat “niche fast follower” strategy, which is sustainable while he’s around. He takes a technology that’s proven to work, but only at a level that’s accessible to experts and early adopters. Apple then does a superb integration job, producing a product that’s a joy to use for the non-expert, though still pricey; their customer sweet spot is rich non-geeks. After a while the rest of the market catches up, and Apple has to move on to the next opportunity. It's done this for the personal computer (Mac), laptop (PowerBook), and MP3 player (iPod). This strategy will work as long as the industry keeps churning out new product ideas that have merit – the cool stuff that’s rough around the edges until Steve Jobs does his magic.

People who focus on the end-game for a particular product line will under-estimate Apple’s long-term viability. Certainly Apple will eventually be overtaken by the mass market in any given segment; but in the long run we’re all dead. For folks with a bias towards the mass market, and who look to asymptotic end-games, Apple will always be an eventual loser. It does lose eventually in any given segment - but not as a business. And the financial reality isn’t too shabby. Looking at some financial indicators, here’s how Apple and MSFT stack up:

Stock: AAPL -- MSFT
P/E: 42 -- 25
Operating margin: 9.17% -- 42.55% (Significant Market Power, gotta love it)
Return on equity: 13.8% -- 19.07%

What's the next iPod? Time and Steve Jobs will tell. Apple hasn't broken into the home entertainment market in a significant way yet, a scenario the SG Cowen data points towards. It’s likely that there will be more and more mixed-platform homes, as people who own and use Windows add Macs. Microsoft will have to start addressing this in the same way that it had to come to terms with Unix+Windows shops in enterprises. There’s an opportunity to get a head start on Windows plug-ins that will make Macs play well with MS machines - and conversely, there's an opportunity for Apple to turn Windows PCs into Mac peripherals.

Admittedly, even optimistic share projections for Apple are relatively tiny: SG Cowen believes that 6% share is achievable by 2008. However, this is the kind of tail that can wag the dog, as we’ve seen with Firefox vs. IE.

Batten down the hatches for a hard winter in the Apple orchard, but keep an eye out for the buds of new innovation.

Friday, July 08, 2005

He was listening to his neighbor when the wind blew the door open. The spotted dog scratched itself. From then on, he itched whenever the wind blew.

Different parts of his body would itch depending on the time of day, the phase of the moon, and what people were saying. He would scratch first one place and then another, a pilgrimage of irritation, determined in part by the interlocking rhythms of time, in part contingent on the random words he heard.

He left his house to search for a place where the wind didn't blow. He criss-crossed the continents, his path enveloping the world, seeking quiet.

We buried him under that tree up on the hill. Sometimes, when the wind blows, you can still hear him scratching himself.

Sunday, July 03, 2005

So true it hurts #2: getting old

I'm officially old. The first undeniable sign was when hair started growing out of my ears and nostrils -- eeeewwww! Then I started getting bushy eyebrows -- gross! But it was yesterday that the medic told me I had some arthritis. ("Minimal", mind you, so don't get carried away.)

At least I haven't received my AARP membership card yet...

So true it hurts #1: blogging

In today's Doonesbury, the host of a radio talk-show asks his blogger-guest, "Isn't blogging for angry, semi-employed losers who are too untalented or too lazy to get real jobs in journalism?"

Hey, I'm not angry!

Saturday, July 02, 2005

The social contract of new technology

Donald Bruce has used the example of genetically modified food (GM food, for short) to illustrate how the notion of a social contract can be used to build a shared vision for risky new technologies [1]. I will simplify his model and show how it can be used to assess the risk profile of new information and communication technologies.

Dr Bruce argues that the conflict about GM food in Europe is rooted in a lack of trust between the biotech industry and their customers. Many consumers don’t see the benefits of GM food, are concerned about the long-term impacts of “meddling with nature”, and suspect that industry is pursuing its own interests over theirs. Industry and its allies believe that consumers are uninformed and irrational, and that they are slowing down the introduction of a technology that has widespread social benefits.

Bruce argues that a dozen parameters, listed in [1], influence whether society will embrace the benefits of a new technology, and accept its risks and disruptions. I’ve simplified this list into three themes: Why do it? Who’s in control? How do I feel about it?

The first theme (Why do it?) concerns the value proposition: what benefits are being offered at what risk?

Benefit: does it offer realistic, tangible benefits to the consumer?

Risk: how often can a bad consequence be expected, what’s the likely magnitude, and is it noticeable or insidious? [2]

The second theme (Who’s in control?) addresses questions of power: If consumers don’t feel in control of the technology, they are more likely to resist it.

Choice: is it a voluntary risk assumed by consumers, or one imposed by another party?

Trust: if imposed by someone else, how much do we trust those who are in control? [3]

The third theme (How do I feel about it?) concerns attitudes and reputation:

Values: does the technology uphold or challenge basic values?

Familiarity: is the technology familiar and understood, and socially embedded?

Precedent: if it is unfamiliar, has something like it gone wrong before or proved reliable?

Profile: has it been given a positive or negative image in the media?

All these considerations are more or less subjective. Different people weigh risks and benefits in different ways; some are more comfortable ceding control to companies than governments, or vice versa; different people have different basic values; and something that’s familiar to me may be alien to you. The adoption of a technology is not simply, or even largely, a technical matter [4]. If the makers of technology ignore this, their initiatives are at risk.

Technologists and business leaders often live in a very different world from their customers, and stopping to listen to the unwashed masses is hard for both geeks and execs. The news media are a useful resource but are often discounted, discredited or ignored since they are seen as biased bearers of bad tidings; in fact, they may simply be representing the interests of a broader cross section of society.

The informatics industry is in the fortunate situation that it hasn’t experienced the melt-down of confidence that GM food has suffered in Europe. Hence, one doesn’t need to use a social contract analysis to figure out how to build a positive shared vision of technology, as Donald Bruce has done for biotech. However, there are deep similarities. The positive self-description of biotech noted by Bruce is similar to informatics’ self-image: discovery, innovation, enhancement, efficiency, prosperity, and growth. And as he says of biotech, "Underlying all, and largely taken for granted, is an Enlightenment vision of rational human progress through technology."

Still, many new information and communication technologies are at risk of social conflict. This approach offers a useful checklist for assessing those risks. I’ll give two examples of how this tool could be used. I leave as an exercise its application to more contentious topics like uniform software pricing in rich and poor countries, software patents, and the misuse of email and browsers to commit identity theft.

Preventing piracy through digital rights management technology (DRM): three thumbs down

Why Do It? The benefits to consumers of DRM are not tangible; it presents itself as an inconvenience. Creators assert that the flow of content will dry up without rights control technologies to protect their investment, but this loss to consumers won’t be immediately visible. The risk of losing rights to copy which have become customary with analog technologies is much more easily grasped.

Who’s in Control? The owners of content are clearly calling the shots, though the providers of the underlying tools are also implicated when consumers confront this technology. The customer has little choice but to accept DRM when it is imposed, and finds it infuriating when, say, some CDs don’t play in their PCs. Consumers are unlikely to feel they have much in common with corporate giants like Time Warner and Microsoft, and trust will be low.

How do I feel about it? While the technology upholds traditional values like not stealing, a new set of values is emerging that finds nothing wrong in freely sharing digital media. The technology is unfamiliar and hard to use, and the precedent of the failure of copy-protecting dongles once used with computer software is not encouraging. The public profile of the technology is still up for grabs; the mainstream media have yet to define an image either positive or negative.

Voice over IP (VoIP): three thumbs up

Why Do It? The benefits are immediate and tangible: phone calls cost less. A notable risk is that a call to the fire brigade or ambulance won’t go through. This is a low-frequency, high-magnitude risk, and is thus getting a lot of coverage in the press. However, it’s an understandable and mitigatable risk for most people. VoIP’s threat to the social revenue base built on legacy taxes is a long-term and esoteric risk; few consumers understand this impact, and most will discount it. The main risks associated with the Internet, identity theft and harm to children, are not obviously associated with VoIP.

Who’s in Control? The consumer is in charge of deploying this technology for their own use. Risks like failed emergency calls are taken on voluntarily (though one can argue about education, notice and choice). Customers have to trust their Internet service providers, but not in any unusual way. As an Internet technology, VoIP also partakes of the halo of citizen empowerment that the web has acquired.

How do I feel about it? The auguries are good on this score, too. The technology builds on the commonly held belief that the Internet empowers individuals, and offers cheap and useful new products. The technology is familiar, since it resembles traditional telephony; for those who have some on-line experience, Internet Voice services like Skype resemble the known technology of Instant Messaging. There is no widely held precedent of something like this having led to disastrous consequences, and the media profile is mixed to positive.
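
To make the checklist concrete, here is the simplified rubric as a small Python data structure, with the two examples above scored per theme. The -1/0/+1 scoring and the "thumbs" tally are my own illustrative convention, not part of Bruce's framework:

# The simplified social-contract checklist as a data structure, with the DRM and
# VoIP examples above scored per theme. The -1/0/+1 scores per theme are an
# illustrative convention, not part of Bruce's framework.
from dataclasses import dataclass

@dataclass
class Assessment:
    technology: str
    why_do_it: int        # Benefit, Risk
    whos_in_control: int  # Choice, Trust
    how_do_i_feel: int    # Values, Familiarity, Precedent, Profile

    def thumbs(self) -> str:
        scores = (self.why_do_it, self.whos_in_control, self.how_do_i_feel)
        up = sum(s > 0 for s in scores)
        down = sum(s < 0 for s in scores)
        return f"{up} thumbs up, {down} thumbs down"

drm = Assessment("DRM", why_do_it=-1, whos_in_control=-1, how_do_i_feel=-1)
voip = Assessment("VoIP", why_do_it=+1, whos_in_control=+1, how_do_i_feel=+1)

for assessment in (drm, voip):
    print(assessment.technology, "->", assessment.thumbs())
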
----- ----- -----

[1] Donald M Bruce, A Social Contract for Biotechnology - Shared Visions for Risky Technologies? I found this a very useful and thought-provoking document. I do get a little uneasy, though, whenever someone ascribes opinions and motives to "people" or "the ordinary public"; there’s a narrow line between being an advocate and being patronizing.

I was alerted to the fascinating work done by Dr Bruce at the Society, Religion and Technology Project of the Church of Scotland by an opinion column that he wrote in the New Scientist of 11 June 2005, Nanotechnology: making the world better?

[2] The psychology of risk aversion plays an important role here. When facing choices with comparable returns, people tend to choose the less-risky alternative, even though traditional economic calculation would suggest that the choices are interchangeable. When making decisions under uncertainty, people will tend to take decisions which minimize loss, even if that isn’t the economically rational behavior. Consequently, there is a greater aversion to high consequence risks, even if their likelihood is small. See http://en.wikipedia.org/wiki/Loss_aversion for definitions and links.

[3] We trust another party if we believe that they will act in our best interests in a future, often unforeseen, circumstance. Trust is in large part a matter of a shared vision: how much do we share their values, motivations and goals? Vision is often expressed as a projection of the consequences of a set of perceptions about current situations and trends. This projection is driven by the values held by the visionary; if the values are not aligned, then the vision will not be persuasive. It’s a three-stage process in which perceptions are modulated by values to produce a vision: Perception -> values -> vision.

[4] I blogged at some length on this topic last week, under the heading Technology isn't Destiny.

Thursday, June 30, 2005

Hostages to fortune

Wired News reports that an old piece of writing may come back to haunt Bram Cohen, creator of BitTorrent, in the light of the recent Supreme Court decision on Grokster. According to Wired, Cohen said he's unhappy that the Supreme Court's decision is forcing him to confront something he wrote more than five years ago. He added, "Anybody who thinks that they might produce technology at some point in the future that might be used for piracy has to watch everything that they say."

It's not just someone who might produce technology who has to be careful; it's everyone. Something I wrote in this explicitly personal blog was used to gain leverage against my employer. After that, I went through a great deal of soul searching about whether I should continue to write in my own name. It would have been much easier to write anonymously, as many do. I decided that the risk was outweighed by the beneficial discipline (and terror) of writing under my own name.

Forgotten indiscretions have always had the power to come back to haunt us. The difference today is that they're so indestructible. Digital posts are backed up and cached and copied and never go away (except for the ones you'd like to keep, which are governed by Murphy's Law and disappear without a trace). As I said in my post about Miranda Murphy, the following rules apply:
Everything you say digitally will be remembered, and can be used against you.
If something you say can be misinterpreted, it will.
Caveat auctor!

English as a foreign language

Lady Catherine: 'She sallied forth to scold [any erring tenants] into harmony and plenty'
I am, at last, reading Jane Austen. The English in her novels is not even 200 years old, yet it surprises me at every turn.

The spelling is markedly different, the most noticeable change being words written separately that we have since joined: "any body" for anybody, "no body" for nobody. In contrast, the punctuation is not that alien, though there are, as one would expect, more commas than we'd use.

The most striking are words whose use reflects a different social milieu. Take "condescension", for example. Here's the insufferable Mr Collins describing his patroness in Pride and Prejudice: "... he had never in his life witnessed such behaviour in a person of rank -- such affability and condescension, as he had himself experienced from Lady Catherine. She had been graciously pleased to approve of both the discourses which he had already had the honour of preaching before her." [1] Elsewhere, reference is made to Mr Collins' admiration of "... Lady Catherine's condescension as he knew not how to admire enough" [2].

While we think of condescension as a failing, indicating arrogance and offensively patronizing [3] behavior, it also has the meaning of "affability to your inferiors and temporary disregard for differences of position or rank" [4] -- clearly a good thing in a patron. In Austen's world, class distinctions are a matter of endless attention and vital importance to one's quality of life, and hence a superior who deigns to ignore them is offering a great courtesy to their inferiors.

It's not the language that's foreign after all, but the world that it is describing.

----------

[1] Pride & Prejudice, Chapter XIV of Volume I (Chap. 14)

[2] Pride & Prejudice, Chapter VI of Volume II (Chap. 29)

[3] Here's another one: patronizing. One meaning is "to treat with condescension", which is bad these days; but it also means "to act as a patron, to support or sponsor", which is a good thing (dictionary.com)

[4] WordNet and American Heritage Dictionary, cited in dictionary.com

Wednesday, June 29, 2005

There's 4.6 of them born every minute

Gartner notes that 2.4 million consumers have reported losing money directly due to phishing attacks in the year to May 2005; half of those consumers lost a combined $929 million in the 12 months preceding the survey.

(2.4 million suckers divided by 525,600 minutes/year gives 4.566. The loss per user is impressive, at almost $1,000, though I don't know why Gartner only gives numbers for half of them... I'd guess that gives a more impressive number.)

Phishing solicitations increased 28% from 57 million during the 12-month period ended in April 2004 to 73 million for the year ended May 2005. This is a smaller number than I would've guessed, given the prominence this topic gets. I suspect that the number of people losing money has increased more rapidly, though the press stories on the Gartner report don't mention this.

It's having an impact: according to Gartner, 33% of online shoppers are buying fewer items due to concerns about online fraud, and 75% are more cautious about where they shop online. The scary implication is that 25% are not more cautious about where they shop online...

Sources: Internet Retailer, ZDNet's IT Facts

Monday, June 27, 2005

Where do you want to be today?

Microsoft is dabbling in content again, according to Stephanie Olsen's news.com blog.

As she points out, Microsoft has zigged and zagged on the topic of content. I think MSN still hasn't resolved what it wants to be when it grows up. It seems torn between the Google (advertising) and Yahoo (content) models. Does MSN want to be Madison Avenue, or Hollywood?

It would seem to be best if they're neither; that way they don't compete head-on. MSN Spaces is cleverly not competing directly with Blogger; Spaces (and Yahoo360, too, judging by the beta) is going after small groups of friends, whereas Blogger is oriented to people with Technorati ambitions. MSN does have a great asset in their instant messaging user base, but they're probably still running second to Yahoo in community software.

Some places that come to mind for MSN: Wall Street (money and business, though Yahoo is ahead there), Main Street (merchandizing, though eBay and Amazon are the name brands), Elm Street (home and family). It looks like they're going for Elm Street.

Sunday, June 26, 2005

Technology isn't Destiny

Steve Heims argues that the ethos of science rests on two pillars: that science is value neutral, and that the results of science are unequivocally good [1]. I would add a third: that progress is inexorable. While this belief system is no longer held unquestioningly in scientific circles, it’s still going strong in technology. It can lead industries to underestimate the power of their opponents; this has happened with genetically modified foods, and may happen again soon with Digital Rights Management Systems (DRM).

According to Heims, John von Neumann (a paragon of the rationalistic approach) viewed the march of technology as inevitable and beyond human control. He also believed that all technologies were ultimately constructive and beneficial. Taken together, these two beliefs imply that a technologist is not responsible for any negative outcomes: if there are any harms, they only apply in the short term; and even if there were long-term harms, they’re inevitable [2].

The development of nuclear weapons called this value system into question: it is hard to argue that the science of the Bomb was independent of the political process, since it was funded as part of a war effort. It's even harder to argue that the invention of the Bomb was an unalloyed good. These days the hottest issues are in biology. Stem cell research is the subject of great political controversy, as is human cloning in general; and the risks and benefits of Genetically Modified Organisms (GMOs) in the food supply have been hotly debated.

However, this belief system is still going strong in the IT business. Staking a claim to "Grove's Law" earlier this year, Intel’s outgoing CEO said: "Technology will always win. You can delay technology by legal interference, but technology will flow around legal barriers." [3]

Most technology visionaries still treat their dreams as being independent of politics, religion, and social debates in general. For example, most technologists resist the idea that their work should be the subject of regulation. It’s commonly argued that light regulation of emerging technology is the most appropriate course of action. This only follows if one accepts the premises that innovation is beneficial (the second pillar), and that regulation slows down innovation.

In reality, technology is not value neutral. One need only look at the United States’ R&D tax credit [4] to realize that society has chosen to fund specific kinds of innovation in specific industries: physical or biological science, engineering, or computer science. Most science is funded by government grants, and the size and focus of these awards are the results of social decision making, often with an eye to technological applications which reflect specific socio-political agendas. There are many reasons to subsidize R&D - creating jobs, creating wealth, generating competition, creating national champions – all of which are to some degree at odds with each other.

Arguments in favor of the pillars often depend on discounting the importance of time. If a technology hasn’t yet triumphed, or its benefits are unclear, it’s argued that one simply hasn’t waited long enough. Proponents of technological determinism assume that the benefit exists, and that it is only a matter of time before it shows itself: a Platonic ideal consonant with the “math envy” at the root of many technologists’ world view [5]. This also accounts for the ultimately frustrating vagueness of technological prognostications: visionaries are happy to tell you what will happen, but are careful never to guess about when.

Some may argue that science will inevitably progress, regardless of local political agendas: for example, if the US government won't fund stem cell research, then the State of California will. The discoveries will be made somewhere. However, these very decisions to fund or not are the result of lobbying and polls, and in their nature contingent. Little progress will occur in unfashionable areas, but this doesn’t help the skeptic’s argument: it’s impossible to prove a negative.

The fact is that political and social processes can speed technology up or slow it down: bomb making was speeded up, and human cloning has been slowed down. History is path dependent, and these interventions affect the package of technologies that results. It is only if one believes that the outcomes of science and technology are inevitable, not only in their existence but also in their form, that timing becomes irrelevant.

Technology is a new addition to the social ecosystem, and it has intended as well as unintended consequences. The unintended consequences can be ignored if one believes in the Second Pillar: that the outcome of technical development is always beneficial. The realist, on the other hand, needs to plan for the unintended consequences, and society – you and I, in other words – needs to make a conscious collective decision on the risks vs. benefits of new technologies. Norbert Wiener had a fine sense of this imperative; here’s how Steve Heims describes Wiener’s world view [6]:


“Wiener is asking the user of powerful automated tools to reflect upon what his true objectives are -- to appreciate that multiple objectives usually conflict with each other and that to be able to articulate what one truly wishes implies a profound and sophisticated understanding of things and people, including oneself. This constitutes an important shift from the traditional view of technology: instead of thinking of a new technology merely as something that enables you to do such-and-such (the attitude of the "gadgeteer"), you come to realize that by making it part of your ecological system you grant it the power to alter your future, for better or worse. Just what part you wish it to play in your life and what relation to it you wish to have are the choices at issue.”
I believe that every technology is embedded in a value system, and that outcomes are neither pre-ordained nor unquestionably good. I would thus argue that technologists need to understand their social context if they are not to be surprised by cultural resistance.

One can see this playing out in DRM today. The companies providing the technology argue that they are not the ones limiting customer choice; they merely provide the tools which content companies can use to enforce their rights in whatever way they choose. This is the First Pillar in action: the technology itself is value neutral. The technologists focus, understandably, on the benefits of their technology, and don’t see (or admit) any downside to it; the Second Pillar. And third: since the technology has been developed, its deployment is inevitable. Alternatives such as levies are discounted as a blunt instrument, historically obsolete, or unfair [7].

Technology companies have a blind spot to alternative futures in which DRM does not inevitably triumph, and underestimate the power of social movements that don’t buy into the three pillars to block their chosen solution. The blank incomprehension among many in the biotech industry at the rejection of genetically modified food in the European Union is a precedent the ICT industry cannot ignore.

My general conclusion is that technology is not destiny. Technology is part of a complex social process, and the outcomes are uncertain. The best technical solution will not necessarily win in the market (cf. Betamax vs. VHS). Conversely, the optimal business result, let alone the optimal social result, is not necessarily built on the optimal technical solution. Put another way: The best technical architecture isn’t necessarily the best business architecture.

----------

[1] Steve J Heims, John von Neumann and Norbert Wiener: From Mathematics to the Technologies of Life and Death, MIT Press, 1980, Ch. 1, Von Neumann, Only Human in Spite of Himself, p 360

[2] Heims, ibid, p 367

[3] Michael Kanellos, blogging the 18 May 2005 Intel shareholders meeting

[4] Resources on the R&D tax credit: assessment of impacts, news coverage of its extension in late 2004, a summary of the technology industry position, a backgrounder on state and federal credits; a summary of who qualifies

[5] It’s often said that those in the social sciences, or even softer natural sciences, have “physics envy” (in biology, in economics). By “math envy” I mean that technologists would like to believe that their work is timeless and true; the social contexts within which they appear are contingent and ultimately irrelevant. I prefer to use this rather than “physics envy”, since it seems to me that even physicists are jealous of the eternal truths supposedly obtainable via mathematics.

[6] Heims, ibid, Ch. 13, Wiener, the Independent Intellectual, p 341

[7] The European ICT trade organization EICTA makes the argument for DRM and against copyright levies concisely here. For a more detailed argument, see this. Those in favor of levies include collecting societies (eg in the US, ASCAP and BMI) and the free culture movement.

Friday, June 24, 2005

A tonne per terabyte

A conversation with Jon Pincus inspired me to wonder what the ecological impact of massive data storage might be. As I’ll outline below, I reckon a terabyte of storage generates a tonne of carbon dioxide every year, about the same as the per-passenger CO2 emissions of a flight from New York to LA, and about 5-10% of what a typical first-worlder generates in a year.

Working off Barroso, Dean and Hölzle’s paper on the Google cluster architecture, I infer that the energy consumption for storage is approximately 1W/GB: a dual 1.4-GHz Intel Pentium III server with two 80GB drives draws 120W per server. Adding Google’s estimate of about 40% for cooling gives 165 W per 160 GB, or 1 W/GB (watt/gigabyte). [1]

The environmental advocacy group SEEN estimates carbon dioxide production for a variety of power plant types. For a 1 megawatt (MW) plant the numbers range from 7,900 tonnes/year for coal to 4,000 tonnes/year for a gas-fired plant. I’ll pick 5,000 tonnes per MW.year for lack of knowledge about average capacity. That’s to say: 5,000,000 kg per 1,000,000 W.year, or 5 kg/W over a year.

Combining the two: 1 W/GB times 5 kg/W = 5 kg of CO2 per GB. To simplify further, and to be conservative, let’s say I’m off by a factor of five; that is, either servers or power plants are five times more power efficient than I’m estimating. That gives 1 kg/GB.

Since one metric tonne is 1,000 kg, and a terabyte is (roughly) 1,000 gigabytes, we get to a nicely memorable number: a tonne of CO2 emitted per terabyte of data storage per year.
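
For anyone who wants to audit the chain of estimates, here it is in one place as a few lines of Python, using only the figures quoted above:

# Back-of-the-envelope check of the tonne-per-terabyte estimate, using the
# figures quoted in this post.
watts_per_160_gb = 165.0                 # 120 W server plus ~40% cooling overhead
watts_per_gb = watts_per_160_gb / 160    # ~1 W/GB

kg_co2_per_watt_year = 5_000_000 / 1_000_000   # 5,000 tonnes per MW-year -> 5 kg per W-year

kg_per_gb_year = watts_per_gb * kg_co2_per_watt_year   # ~5 kg CO2 per GB per year
kg_per_gb_year_conservative = kg_per_gb_year / 5       # allow a 5x efficiency margin -> ~1 kg

tonnes_per_tb_year = kg_per_gb_year_conservative * 1000 / 1000   # 1,000 GB/TB, 1,000 kg/tonne
print(f"{watts_per_gb:.2f} W/GB; {kg_per_gb_year:.1f} kg CO2/GB-year; "
      f"~{tonnes_per_tb_year:.1f} tonne CO2/TB-year after the 5x margin")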

For context, first worlders generate about 20 tonnes of CO2 per year each [2]. Most people aren’t storing anything like a terabyte of personal data yet, and so the load their storage places on the atmosphere is relatively small. However, once we all start using up our TB of gMail storage, that tonne/terabyte will become a significant part of a first worlder’s personal CO2 emissions.

----------

[1] As a reality check, a Maxtor 320GB NAS is spec’d at a power consumption of 150 watts, or about 0.5 W/GB. Peter Harrison suggests a rule of thumb that each watt of power consumed requires a watt of cooling, again taking us to about 1 W/GB.

[2] One activist site estimates that a typical family of three with two cars, who flies to an annual vacation, might produce 50 tonnes of CO2 a year; the same family, living in a small, efficient house with no car, and no annual flight, might produce 10 tonnes. Jerry Hannan gives different figures of the same order: A car and driver produce about 5.5 tons of CO2 per year; when all fossil fuel is considered, every man, woman, and child can be said to be responsible for 18.7 tons of CO2 per year. Air travel generates a lot of carbon dioxide. The City of Seattle uses a figure of 0.34 kg per passenger air mile. It’s 2,800 miles from LA to New York, which works out to about 0.95 tonnes of CO2 per passenger.