Sunday, July 31, 2005

Abetting Amazon




Dear Reader, I wanted to let you know that I've signed up for the Amazon Associates program. Purchases you make as a result of links from my blog will accrue to my referral account (total earnings to date: $0.00).

I wasn't sure whether to flag this, since possible reactions span the gamut from "Ick!" to "Duh!" Since I had qualms, though, I wanted to err on the side of full disclosure.

So, Buy Buy Buy, but for now, Wait Wait Wait. I haven't gone through the schlepp of branding the links yet...

Engineering jobs

Google and Yahoo are today's geek magnets, according to Ben Elgin's report in Business Week. Search engines are where the action is, not only for investors but also for engineers: the technical problems are hard, the impact of your code is huge, and the respect you can earn (though perhaps not the equity) is massive.

Back at the height of the dot-com boom in 2000, the cool plays were about business model, marketing and early mover advantage as much as software: think Amazon and eBay. Uber-geeks were necessary but not sufficient for businesses that competed in retail and B2B. The pendulum has swung in five years. In the mid-Nineties -- five years earlier still, and a full ten-year cycle ago -- the big news was Apple/Microsoft, and the introduction of the browser; Pure Tech again.

It's an internecine tussle in the digital world today, rather than companies old and new fighting over the bits vs. bricks boundary. Sure, Google and Yahoo are roiling the business models of existing companies, but those affected are themselves largely operating in "concept space", like advertising agencies and content companies.

The hiring landscape has changed a lot, though. While US industry leaders still ritually complain about not getting enough visas to bring in the talent they need, many have also moved a lot of development off-shore. A Bay Area veteran recently told me that today's Valley is much more about VCs and senior leadership teams than it used to be, because so many companies have moved routine development to India.

There's the conundrum for Americans contemplating a software career: Tech companies can never get enough of the very best people, but the second raters can't get a job. If you're a star, you'll be fine; if you're not...

It's particularly acute in software engineering. A superb developer can do something in a tenth or a hundredth of the time it would take a journeyman. Legends abound about the uber-geek who came in on Monday, after a weekend of non-stop coding, with an astounding new operating system, compiler, or product prototype.

In civil engineering, by contrast, the very best practitioners can produce qualitatively much better work, but might only do it in half the time of the average worker. I suspect the difference is between fields that are enmeshed in the physical world and those that operate in the world of ideas. I predict that there is much greater non-linearity between the best and the average professional in fields like software, law and mathematics than in fields like mechanical engineering, medicine and psychology. One could test this hypothesis by comparing the width of the spread between the highest, lowest and average salaries in these fields.
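To make that salary-spread test concrete, here is a minimal sketch. The salary figures are invented for illustration (not real survey data), and the spread measure is the simplest one possible: the ratio of the highest to the lowest salary in a field.

```python
# Hypothesis: "pure" fields (software) show a wider spread between the
# best- and worst-paid practitioners than "messy" fields (civil engineering).
# The numbers below are hypothetical, chosen only to illustrate the test.

def spread_ratio(salaries):
    """Ratio of the highest to the lowest salary in a field."""
    return max(salaries) / min(salaries)

software = [45_000, 70_000, 95_000, 180_000, 400_000]   # invented
civil_eng = [50_000, 62_000, 75_000, 90_000, 110_000]   # invented

print(f"software:  {spread_ratio(software):.1f}x")
print(f"civil eng: {spread_ratio(civil_eng):.1f}x")
```

A real test would use percentile spreads (say, 90th/10th) from salary-survey data rather than raw extremes, which are noisy.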

It's the difference between "pure" and "messy", and it maps to hiring trends. The IT business is in a "pure" phase right now, trying to invent new search algorithms to make sense of the mountain of latent meaning in the web. If history is any guide, the focus will shift back soon enough to the messy business of embedding these technologies in existing value chains: going from bits versus bricks, to bits and bricks.

Wednesday, July 27, 2005

Bubbles as taxes?

Investment bubbles leave significant social value when they burst: we have a super-cheap Internet backbone because of over-investment in fiber capacity. Many investors lost their shirts as a result of lemming-like investment decisions, fraud (when a company like WorldCom seems to be doing so well, its competitors persuade themselves that there must be a pony, and keep throwing good money after bad), irrational optimism, and confirmation bias.

This is not a new phenomenon. Stock investors lost huge amounts during the build-out of the railroads in the 19th Century.

Large private losses which result in common social assets substitute for taxes. Investors are in effect handing over their money to the commonwealth. Since large investors are more likely to be rich than poor, large losses are more likely to fall on the rich than the poor. This "tax" is therefore nicely progressive.

Questions: What is the relative size of investment losses vs. ordinary taxes for large net worth individuals in the course of a bust cycle? (One would have to include the use of losses to reduce taxes on other income in the calculation.) Do rich people lose more than ordinary ones in a bust? (Wealthy folk have portfolios, but the rest of us have retirement accounts; since there are more of us, we may bear as heavy a burden in aggregate.) Do government losses when bubbles burst off-set the social gain of losses incurred by investors? (Governments often invest alongside the private sector, or provide significant tax incentives.)

Saturday, July 23, 2005

You know it's a bubble when...

... a For Rent sign shows up for the first time ever in your neighborhood.

(More and more people are buying houses on spec and renting them until they find a Greater Fool.)

Archi-sects

Notre Dame du Haut by Le Corbusier, one of the High Priests of architecture
Kyril Faenov recommended Christopher Alexander's new opus to me: The Phenomenon of Life: The Nature of Order. Beyond the subject matter, the reviews are fascinating for the way they reveal architects as the ultimate cultists. That's ironic, since Alexander has always expounded an approach that is supposed to replace brainwashing with reasoning and understanding.

Reactions range from "book of the Century" and "magnificent", to "arbitrary and very personal pseudo-metaphysical remarks about architecture" and "flawed". Any book elicits a range of reactions, but this reminded me how architects are carried away by fads. They love movements and messiahs: modernism, brutalism, neo-classicism, post-modernism, ...; Kahn, Foster, Libeskind, Calatrava, Gehry, ...

All culture is constantly swept by fashions, but the populace cannot avoid looking at and "wearing" what entrances architects. Couture and paintings are avoidable, but buildings aren't.

I suspect architects are particularly susceptible because they operate in the borderland between disciplines. They are neither sculptors nor engineers; their work isn't celebrated at the Venice Biennale, and an engineer has to sign their plans to certify that the building won't fall down. They don’t originate the idea to build something (developers do that), nor do they drive it to completion (the builders do that). They are supposed to be pragmatic enough to come up with something that can be built and used, yet idealistic enough to concoct compelling visions that can rally a city.

When you're on the margins, you have to live large or cower. It helps to have larger-than-life role models - hence the hero worship.

Architects have to navigate the complex terrain between a client's dream and the practicalities of building something usable. As Kyril pointed out to me,
"It is a place where you quickly realize that no amount of logic and analysis can help you make decisions, help get people on the same page, help you even get started. Something else that comes from inside breaks the inevitable paradoxes and deadlocks. The arrogant among them root their vision from which the design is built in their ego, the reflective ones seem compelled to dig deeper, which inevitably leads to philosophical/psychological questions. "
I suspect software architects may be in a similar situation to building architects, though their heroes and grand ideas are less visible to the public. Here's Kyril's comment:

"I think that software architects are in the same situation as the building architects when they have to and do take into the account business needs, as well as customer realities. That is a small percentage, from what I observe. Many are more like artists, find their rooting in the purity and rationality of a software idea, and avoid staring into the dark void of the business/customer/product nexus. I think entrepreneurs of any kind come closest to living in that nexus, with the added complexity of having to satisfy the irrational needs of their employees in addition to customers."

I just love that image of the dark void of the business/consumer/product nexus... small wonder that many architects need a little messianic motivation to keep them sane.

Friday, July 22, 2005

You know it's a bubble when...

... a guy stands at an intersection waving a sign that says, "Free Real Estate Course".

(Seen yesterday near the Microsoft campus in Redmond, WA.)

Bloggregators

John Reith
Blogs aren't newspapers, though some like Slashdot are beginning to look that way. Newspapers are magical brews: toe of sports, eye of news, wool of gossip, tip of hope, root of evil. Lord Reith's agenda for the BBC applies, though perhaps no longer in his preferred order: educate, inform and entertain. Editors not only give us what they think we want, but also what they want us to think.

Newspapers, even those online, have a richer visual structure than most blogs, reflecting their more diverse content. See, for example, the New York Times and the Washington Post: multiple columns, rich typography, and multiple horizontal "folds".

One could assemble a compelling community-written alternative to conventional newspapers out of the ingredients on the web. The parts and pieces are there, but I haven't found the gestalt:

Browser-based RSS readers (like My Yahoo!) allow me to construct a list of content from blogs and news outlets, but don't give me the control over layout that I want. It's a one-dimensional list, with some options for a side-bar. The stories are also represented by just a headline.

Google News allows me to re-arrange the sections I'm interested in, and create custom sections - but sources the content from news sites, not blogs (yet?).

I'd like to compose a front page out of individual stories from my feeds (a la My Yahoo RSS) that fit my keyword requirements (a la Google), presented as link+precis. I want a filtered, aggregated view of my blog feeds - hence the term "bloggregator".

Since I can imagine this, somebody's already built it, perhaps with a nifty combination of RSS and greasemonkey. Dear Reader, please let me know via the Comments where to find it.
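The filtering-and-precis step can be sketched in a few lines of Python. The entries below are hand-written dicts standing in for what an RSS library such as feedparser would return from real feeds; the function names and the keyword-matching scheme are my own invention, just a sketch of the idea:

```python
# A minimal sketch of the "bloggregator": take entries from several feeds,
# keep only those matching keyword requirements, and emit each as
# link + precis. In practice entries would come from feedparser
# (feedparser.parse(url).entries); plain dicts are used here to stay
# self-contained.

def matches(entry, keywords):
    """True if any keyword appears in the entry's title or summary."""
    text = (entry["title"] + " " + entry["summary"]).lower()
    return any(k.lower() in text for k in keywords)

def front_page(entries, keywords, precis_len=80):
    """Render matching entries as 'title (link): precis' lines."""
    return [
        f"{e['title']} ({e['link']}): {e['summary'][:precis_len]}"
        for e in entries
        if matches(e, keywords)
    ]

entries = [
    {"title": "New search algorithm", "link": "http://example.com/a",
     "summary": "A search engine unveils a ranking tweak..."},
    {"title": "Gardening tips", "link": "http://example.com/b",
     "summary": "Roses need pruning in spring..."},
]
for line in front_page(entries, ["search", "rss"]):
    print(line)
```

The hard part the sketch ignores is layout: turning that flat list into a multi-column front page with horizontal folds, which is exactly what the browser-based readers don't offer.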

The Windows Vista: coming soon to a car dealership near you

Brian Valentine
If Blogger had categories, this would be filed under "Cheap Shots". Or maybe "Everybody's an Expert". Still, I can't help myself...

"Windows Vista" is the official name for the code base formerly known as Longhorn (Microsoft announcement). The tag line is "Clear, Confident, Connected: Bringing Clarity to your World".

Huh?

I don't look to an operating system to help me "cut through the clutter and focus on what's important", which is what Clear seems to refer to. I suppose the Confident is trying to resonate with the Trustworthy Computing message, and persuade people that Windows is more secure than the alternatives. I'm sure it has nothing to do with the fact that CIOs are using Firefox because it's thought to be more secure than IE. As for Connected: well, we're celebrating the tenth anniversary of the Internet right now, so it'll be very topical in 2006.

The name reminds me of faded car brands: the Ford Pinto, the Chrysler Volare, the Chevrolet Nova, the Buick Regal. So: would you buy a used car from that man?

Wednesday, July 20, 2005

Practice (continued): getting better, but not best

Miladin had a perceptive comment about "best". It reminded me of the rhyme I learned as a kid:
Good, better, best
Never let us rest
'Til our good is better
And our better, best.

The propaganda for the existence of "best" (which I think the two of us agree is a myth) clearly starts early. Where does it come from? It's possible that the notion of "best" is wired in; our brains could come with a built-in notion of a superlative, and a built-in drive to reach it. However, I think one can get by without this assumption; relativism will suffice.

Comparisons are the stuff of life. All living beings act on the basis of A being more or less desirable than B. This mate is more fit than that one; this food is better than that food; this place is safer than that place. Along with comparison comes preference, that is, making the choice of one thing over another. This then leads to a series: C better than D, better than E ... In principle, of course, there will always be a B better than C, and an F worse than E. Since we're finite beings, however, it's a good simplifying step to assume that the series is finite, and that C is the best one can do. It facilitates the decision, which is evolutionarily adaptive; if you always keep waiting for something better to come along, you'll never get laid, fed, or secure.

Anxiety is the stuff of life: animals are always striving to do better. If they don't, their genes won't survive. Focused anxiety is more tolerable than the free-floating kind, and hence the endless search for "best". Many cultural artifacts are designed to reassure us that "best" exists, and that there is therefore hope of respite from our anxiety. To take two stereotypical examples: romances where the heroine eventually lands the man of her dreams and lives happily ever after; and sport, where every game has a result, and every contest (in America, at least) has a winner.

In a more honest world, the doggerel might read:
Good, better, ... um
Never let us bum
'Til our good is better
And the rest - stay mum

Miladin also sought advice on how to organize time better. I have two recommendations. David Allen's "Getting Things Done: The Art of Stress-Free Productivity" is a great self-help (!) guide to creating actionable To-Dos. Since a lot of stress comes from a big reading load, I'd recommend Mortimer Adler's "How to Read a Book"; for a quick intro to it, see the last chapter in Robert Hagstrom's "Investing: The Last Liberal Art".

Saturday, July 16, 2005

The practice of practice

The endless shelves of the Self-Help Section shout out our imperfections to the world. So much to be fixed! Self-improvement is an endless road. Since our lives are finite, we are doomed never to reach the destination. The destinations, plural, to be painfully precise. Examine your life for even a few moments, and you can list endless topics for improvement, from a golf swing to showing compassion.

Christianity offers great comfort; God's grace comes to the rescue to fill the infinite gap between our inadequacy and the criteria for admission to Heaven. Buddhism is less forgiving; since suffering is rooted in one's desire, only your own efforts can break the cycle of wanting and lacking. However, even here religion has constructed hope: karma gives you multiple shots in successive lives at getting it right.

Some systems of thought don't try to solve the puzzle, and simply give advice on muddling through as best one can - Taoism, for example. Lin Yutang extols the importance of loafing, leisurely walks, and long talks with friends. My mother likes to quote GK Chesterton: "If a thing is worth doing, it is worth doing badly."

All of them advise one to be mindful of the challenge, even if it is merely to ignore it. But the challenge is to thread the path between pride and despair.

Any practice requires endless practice, whether it is spiritual growth or cooking. There is never an end to learning, and so the amount to be learned is infinite. Any progress is infinitesimal measured against what needs to be done; mathematically, any finite fraction of the infinite is zero. Judged against the task, one can only despair.

Ah, but one can take heart by looking back and seeing how far one's come. No matter how inconsequential the progress, it is more than would've happened if you had made no effort at all. Even with a step back for every two steps forward, the headway you make is significant and can be a source of pride. As you advance down the path, though, a day's progress becomes an ever-smaller fraction of the journey to date...

Perhaps the only escape is to focus on the moment, and ignore both past and future. One has to find joy in the practice for its own sake, regardless of its greater purpose. Only in Zeno's story does Achilles concede the race because the Tortoise persuades him that he can never catch up. The real-life Achilles just runs, wins, and (because he's a mean bastard) has the Tortoise for dinner.

Wednesday, July 13, 2005

Bringing in the Apple harvest

Some Apple news in the last few days gives a snapshot of the most vulnerable moment in their product cycle: nearing the end of one innovation, and before the next one has appeared.

Business Week tells the story of the iPod business's fading lustre: Wall Street is getting skittish about not seeing any more blow-out quarters in the pipeline. At the same time, a study by SG Cowen (N=1,400) indicates that the iPod is exposing Windows users to Apple, and increasing their interest in Macs.

The iPod party may well be ending, but it doesn’t follow that the Apple party is ending. Jobs is executing a very neat “niche fast follower” strategy, which is sustainable while he’s around. He takes a technology that’s proven to work, but only at a level that’s accessible to experts and early adopters. Apple then does a superb integration job, producing a product that’s a joy to use for the non-expert, though still pricey; their customer sweet spot is rich non-geeks. After a while the rest of the market catches up, and Apple has to move on to the next opportunity. It's done this for the personal computer (Mac), laptop (PowerBook), and MP3 player (iPod). This strategy will work as long as the industry keeps churning out new product ideas that have merit – the cool stuff that’s rough around the edges until Steve Jobs does his magic.

People who focus on the end-game for a particular product line will under-estimate Apple’s long-term viability. Certainly Apple will eventually be overtaken by the mass market in any given segment; but in the long run we’re all dead. For folks with a bias towards the mass market, and who look to asymptotic end-games, Apple will always be an eventual loser. It does lose eventually in any given segment - but not as a business. The financial reality isn’t too shabby. Looking at some financial indicators, here’s how Apple and MSFT stack up:

Stock: AAPL -- MSFT
P/E: 42 -- 25
Operating margin: 9.17% -- 42.55% (Significant Market Power, gotta love it)
Return on equity: 13.8% -- 19.07%

What's the next iPod? Time and Steve Jobs will tell. Apple hasn't broken into the home entertainment market in a significant way yet, a scenario the SG Cowen data points towards. It’s likely that there will be more and more mixed-platform homes, as people who own and use Windows add Macs. Microsoft will have to start addressing this in the same way that it had to come to terms with Unix+Windows shops in enterprises. There’s an opportunity to get a head start on Windows plug-ins that will make Macs play well with MS machines - and conversely, there's an opportunity for Apple to turn Windows PCs into Mac peripherals.

Admittedly, even optimistic share projections for Apple are relatively tiny: SG Cowen believes that 6% share is achievable by 2008. However, this is the kind of tail that can wag the dog, as we’ve seen with Firefox vs. IE.

Batten down the hatches for a hard winter in the Apple orchard, but keep an eye out for the buds of new innovation.

Friday, July 08, 2005

He was listening to his neighbor when the wind blew the door open. The spotted dog scratched itself. From then on, he itched whenever the wind blew.

Different parts of his body would itch depending on the time of day, the phase of the moon, and what people were saying. He would scratch first one place and then another, a pilgrimage of irritation, determined in part by the interlocking rhythms of time, in part contingent on the random words he heard.

He left his house to search for a place where the wind didn't blow. He criss-crossed the continents, his path enveloping the world, seeking quiet.

We buried him under that tree up on the hill. Sometimes, when the wind blows, you can still hear him scratching himself.

Sunday, July 03, 2005

So true it hurts #2: getting old

I'm officially old. The first undeniable sign was when hair started growing out of my ears and nostrils -- eeeewwww! Then I started getting bushy eyebrows -- gross! But it was yesterday that the medic told me I had some arthritis. ("Minimal", mind you, so don't get carried away.)

At least I haven't received my AARP membership card yet...

So true it hurts #1: blogging

In today's Doonesbury, the host of a radio talk-show asks his blogger-guest, "Isn't blogging for angry, semi-employed losers who are too untalented or too lazy to get real jobs in journalism?"

Hey, I'm not angry!

Saturday, July 02, 2005

The social contract of new technology

Donald Bruce has used the example of genetically modified food (GM food, for short) to illustrate how the notion of a social contract can be used to build a shared vision for risky new technologies [1]. I will simplify his model and show how it can be used to assess the risk profile of new information and communication technologies.

Dr Bruce argues that the conflict about GM food in Europe is rooted in a lack of trust between the biotech industry and their customers. Many consumers don’t see the benefits of GM food and are concerned about the long-term impacts of “meddling with nature”; they suspect that industry is pursuing its own interests over theirs. Industry and its allies believe that consumers are uninformed and irrational, and are slowing down the introduction of a technology that has widespread social benefits.

Bruce argues that a dozen parameters, listed in [1], influence whether society will embrace the benefits of a new technology, and accept its risks and disruptions. I’ve simplified this list into three themes: Why do it? Who’s in control? How do I feel about it?

The first theme (Why do it?) concerns the value proposition: what benefits are being offered at what risk?

Benefit: does it offer realistic, tangible benefits to the consumer?

Risk: how often can a bad consequence be expected, what’s the likely magnitude, and is it noticeable or insidious? [2]

The second theme (Who’s in control?) addresses questions of power: If consumers don’t feel in control of the technology, they are more likely to resist it.

Choice: is it a voluntary risk assumed by consumers, or one imposed by another party?

Trust: if imposed by someone else, how much do we trust those who are in control? [3]
The third theme (How do I feel about it?) concerns attitudes and reputation:

Values: does the technology uphold or challenge basic values?

Familiarity: is the technology familiar and understood, and socially embedded?

Precedent: if it is unfamiliar, has something like it gone wrong before or proved reliable?

Profile: has it been given a positive or negative image in the media?

All these considerations are more or less subjective. Different people weigh risks and benefits in different ways; some are more comfortable ceding control to companies than governments, or vice versa; different people have different basic values; and something that’s familiar to me may be alien to you. The adoption of a technology is not simply, or even largely, a technical matter [4]. If the makers of technology ignore this, their initiatives are at risk.

Technologists and business leaders often live in a very different world from their customers, and stopping to listen to the unwashed masses is hard for both geeks and execs. The news media are a useful resource but are often discounted, discredited or ignored since they are seen as biased bearers of bad tidings; in fact, they may simply be representing the interests of a broader cross section of society.

The informatics industry is in the fortunate situation that it hasn’t experienced the melt-down of confidence that GM food has suffered in Europe. Hence, one doesn’t need to use a social contract analysis to figure out how to build a positive shared vision of technology, as Donald Bruce has done for biotech. However, there are deep similarities. The positive self-description of biotech noted by Bruce is similar to informatics’ self-image: discovery, innovation, enhancement, efficiency, prosperity, and growth. And as he says of biotech, "Underlying all, and largely taken for granted, is an Enlightenment vision of rational human progress through technology."

Still, many new information and communication technologies are at risk of social conflict. This approach offers a useful checklist for assessing those risks. I’ll give two examples of how this tool could be used. I leave as an exercise its application to more contentious topics like uniform software pricing in rich and poor countries, software patents, and the misuse of email and browsers to commit identity theft.

Preventing piracy through digital rights management technology (DRM): three thumbs down

Why Do It? The benefits to consumers of DRM are not tangible; it presents itself as an inconvenience. Creators assert that the flow of content will dry up without rights control technologies to protect their investment, but this loss to consumers won’t be immediately visible. The risk of losing rights to copy which have become customary with analog technologies is much more easily grasped.

Who’s in Control? The owners of content are clearly calling the shots, though the providers of the underlying tools are also implicated when consumers confront this technology. The customer has little choice but to accept DRM when it is imposed, and finds it infuriating when, say, some CDs don’t play in their PCs. Consumers are unlikely to feel they have much in common with corporate giants like Time Warner and Microsoft, and trust will be low.

How do I feel about it? While the technology upholds traditional values like not stealing, a new set of values is emerging that finds nothing wrong in freely sharing digital media. The technology is unfamiliar and hard to use, and the precedent of the failure of copy-protecting dongles once used with computer software is not encouraging. The public profile of the technology is still up for grabs; the mainstream media have yet to define an image either positive or negative.
Voice over IP (VoIP): three thumbs up

Why Do It? The benefits are immediate and tangible: phone calls cost less. A notable risk is that a call to the fire brigade or ambulance won’t go through. This is a low-frequency, high-magnitude risk, and is thus getting a lot of coverage in the press. However, it’s an understandable and mitigatable risk for most people. VoIP’s threat to the social revenue base built on legacy taxes is a long-term and esoteric risk; few consumers understand this impact, and are likely to discount it. The main risks associated with the Internet, identity theft and harm to children, are not obviously associated with VoIP.

Who’s in Control? The consumer is in charge of deploying this technology for their own use. Risks like failed emergency calls are taken on voluntarily (though one can argue about education, notice and choice). Customers have to trust their Internet service providers, but not in any unusual way. As an Internet technology, VoIP also partakes of the halo of citizen empowerment that the web has acquired.

How do I feel about it? The auguries are good on this score, too. The technology builds on the commonly held belief that the Internet empowers individuals, and offers cheap and useful new products. The technology is familiar, since it resembles traditional telephony; for those who have some on-line experience, Internet Voice services like Skype resemble the known technology of Instant Messaging. There is no widely held precedent of something like this having led to disastrous consequences, and the media profile is mixed to positive.
----- ----- -----

[1] Donald M Bruce, A Social Contract for Biotechnology - Shared Visions for Risky Technologies? I found this a very useful and thought-provoking document. I do get a little uneasy, though, whenever someone ascribes opinions and motives to "people" or "the ordinary public"; there’s a narrow line between being an advocate and being patronizing.

I was alerted to the fascinating work done by Dr Bruce at the Society, Religion and Technology Project of the Church of Scotland by an opinion column that he wrote in the New Scientist of 11 June 2005, Nanotechnology: making the world better?

[2] The psychology of risk aversion plays an important role here. When facing choices with comparable returns, people tend to choose the less-risky alternative, even though traditional economic calculation would suggest that the choices are interchangeable. When making decisions under uncertainty, people tend to take decisions which minimize loss, even if that isn’t the economically rational behavior. Consequently, there is a greater aversion to high-consequence risks, even if their likelihood is small. See http://en.wikipedia.org/wiki/Loss_aversion for definitions and links.
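For the mathematically inclined, the standard formalization of this asymmetry is Tversky and Kahneman's prospect-theory value function. A minimal sketch, using their published parameter estimates (alpha ≈ 0.88 for diminishing sensitivity, lambda ≈ 2.25 for loss aversion):

```python
# Sketch of the Tversky-Kahneman prospect-theory value function, which
# formalizes loss aversion: losses are weighted roughly twice as heavily
# as equal-sized gains. Parameters are their published estimates.

def value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain/loss of x, relative to the status quo."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

# A fair coin flip: win $100 or lose $100. Expected money is zero,
# but expected subjective value is negative - so people decline the bet.
flip = 0.5 * value(100) + 0.5 * value(-100)
print(f"{flip:.1f}")  # negative
```

This is why a low-probability, high-consequence risk (a failed emergency call) gets more attention than an expected-value calculation alone would predict.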

[3] We trust another party if we believe that they will act in our best interests in a future, often unforeseen, circumstance. Trust is in large part a matter of a shared vision: how much do we share their values, motivations and goals? Vision is often expressed as a projection of the consequences of a set of perceptions about current situations and trends. This projection is driven by the values held by the visionary; if the values are not aligned, then the vision will not be persuasive. It’s a three-stage process in which perceptions are modulated by values to produce a vision: perception -> values -> vision.

[4] I blogged at some length on this topic last week, under the heading Technology isn't Destiny.