The Economist this week writes about the increasing use of algorithms, for everything from book recommendations to running supply chains (Business by numbers, 13 Sep 2007). It suggests that algorithms are now pervasive.
The most powerful algorithms are those that do real-time optimization. They could help UPS recalibrate deliveries on the fly, and reorder airplane departure queues at airports to improve throughput. More down-to-earth applications include sophisticated calculations of consumer preference that end up predicting where to put biscuits on supermarket shelves.
If that’s true, the underlying fragility of algorithms is now pervasive. The fragility is not just due to the risk of software bugs, or vulnerability to hackers; it’s also a consequence of limitations on our ability to conceive of, implement, and manage very large complex systems.
The day-to-day use of these programs shows that they work very well almost all of the time. The occasional difficulty – from Facebook third-party plug-in applications breaking for mysterious reasons to the sub-prime mortgage meltdown – reminds us that the algorithmic underpinnings of our society are not foolproof.
In the same issue, the Economist reviews Ian Ayres’s book Super Crunchers: Why Thinking-by-Numbers Is the New Way to Be Smart, about automated decision making (The death of expertise, 13 Sep 2007). According to their reading of his book, “The sheer quantity of data and the computer power now available make it possible for automated processes to surpass human experts in fields as diverse as rating wines, writing film dialogue and choosing titles for books.” Once computers can do a better job of diagnosing disease, what’s left for the doctor to do? Bank loan officers have already faced this question, and have had to move into customer-relations jobs. I used to worry about the employment implications; I still do, but now I also worry about relying on complex software systems.