Sunday, January 29, 2006

Special Present fallacy explains accelerating paradigm shifts

In Two breakthroughs per century per billion people I pointed out that the accelerating rate of paradigm shifts cited by Kurzweil in The Singularity Is Near can be explained by the growth of global population over the last two thousand years. It is not necessary to invoke compounding complexity effects. Of course, human population growth can only account for events within human history; what about the doubly exponential acceleration rate of earlier milestones in the Modis data set?

I will show below that the whole data set can be explained by a commonplace event-selection behavior among list makers. The pattern in the data arises automatically if list makers lock their timeline to the present day and then scatter events evenly over a timeline where each tick marks an increasing power of ten.

The further something is from us in time, the less interested we are in it. Someone living in 1600 would list as many earth-shattering events in the preceding century as someone living today might for the period 1900 to 2000. However, a list maker living today would probably just boil them all down to “the Reformation”, while insisting that the century leading up to our time contained radio, television, powered flight, nuclear weapons, transistors, the Internet, and the sequencing of the genome.

Let us assume that a list maker (1) wants to show historical events over a very long time period, and (2) wants to make it relevant to the present reader by including some recent events.

Since recent events are exceedingly close to each other on a long time scale, one needs a scale that zooms in the closer it comes to the present. An obvious and common way to do this is to use a logarithmic timeline. Imagine a ruler whose ticks are numbered 1, 2, 3, and so on, where the tick numbered n marks 10^n years ago: tick 1 marks 10^1 = 10 years ago, tick 2 marks 10^2 = 100 years ago, tick 3 marks 10^3 = 1000 years ago, and so forth. (The notation 10^n stands for 10 raised to the n-th power; n is the logarithm of 10^n, that is, n = log(10^n) – hence the term “logarithmic scale”.)
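For concreteness, here is a minimal Python sketch of that ruler (the range of ticks shown is arbitrary):

    # A logarithmic ruler: the tick numbered n marks 10^n years before the present.
    for n in range(1, 6):
        print(f"tick {n}: 10^{n} = {10 ** n:>6} years ago")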

I’ll now show that scattering events regularly over such a time line leads to the log-log acceleration which so captivated Modis and Kurzweil.

Assume that a series of events is scattered regularly on a log line. Let two successive events be numbered “i” and “i+1”, counting backwards in time, and let t(i) stand for how long before the present event i occurred. A regular gap between the events then means that

log ( t(i+1) ) – log ( t(i) ) = some constant, for any i

Since for logarithms log(a) – log(b) = log(a/b), the left-hand side equals log( t(i+1)/t(i) ); exponentiating both sides then gives

t(i+1)/t(i) = some other constant

Adding and subtracting t(i) in the numerator gives

( t(i) + t(i+1) – t(i) ) / t(i) = that constant

The fraction splits into t(i)/t(i) + ( t(i+1) – t(i) ) / t(i) = 1 + ( t(i+1) – t(i) ) / t(i); folding the 1 into the constant, we find

( t(i+1) – t(i) ) / t(i) = another constant

… or in other words

t(i+1) – t(i) is proportional to t(i)
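A quick numerical check in Python bears this out. In the sketch below, the events sit half a tick apart on the log line – the spacing is arbitrary, chosen only to make the even scatter concrete – and the ratio of gap to elapsed time comes out the same for every pair:

    # Events scattered evenly on a log timeline: event i sits at
    # t(i) = 10^(0.5 * i) years before the present. The ratio gap/t(i)
    # should come out constant (here 10^0.5 - 1, about 2.16).
    events = [10 ** (0.5 * i) for i in range(1, 8)]
    for t_i, t_next in zip(events, events[1:]):
        gap = t_next - t_i
        print(f"t(i) = {t_i:10.1f}   gap = {gap:10.1f}   gap/t(i) = {gap / t_i:.4f}")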

Put in the language of Kurzweil’s charts, the proportionality says that the “time to next event” is proportional to the “time before the present”. And this is indeed what the data shows. I’ve reproduced the Modis data on a linear scale at the top of the post; the data is in the third tab, called “Milestone data”, in this spreadsheet. If the proportionality holds, the data points should fall on a straight line. A linear fit to the data has an R-squared of 0.7, which is surprisingly good given the squidginess of the data; S., my in-house statistician, grimaces when R-squared is less than 0.8, but doesn’t laugh out loud until it’s less than 0.5.
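For anyone who wants to reproduce the fit, a sketch along these lines works. The five (time before present, time to next event) pairs below are invented placeholders, not the Modis numbers – those live in the spreadsheet:

    import numpy as np

    # Placeholder (time before present, time to next event) pairs, in years;
    # substitute the values from the "Milestone data" tab of the spreadsheet.
    t_before = np.array([1.0e9, 1.0e8, 1.0e7, 1.0e6, 1.0e5])
    t_gap = np.array([8.0e8, 1.2e8, 0.9e7, 1.1e6, 0.8e5])

    # Least-squares line through the points, and R-squared for the fit.
    slope, intercept = np.polyfit(t_before, t_gap, 1)
    residuals = t_gap - (slope * t_before + intercept)
    r_squared = 1 - (residuals ** 2).sum() / ((t_gap - t_gap.mean()) ** 2).sum()
    print(f"slope = {slope:.3f}, R^2 = {r_squared:.3f}")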

I have thus shown that the acceleration of the intervals between cosmic milestones on a log-log scale can be explained by a list maker evenly distributing their chosen events on a power-of-ten timeline (or any other logarithmic scale). Since this is a common way for scientists to think about data, it is a plausible explanation – more plausible, in my book, than a mysterious cumulative complexity effect.

This distribution of milestones is an example of the Special Present fallacy because of assumption #2 above: making the timeline relevant to the present reader by including some recent events. Since the list maker tends to believe that recent events are more significant than distant ones, they feel obligated to include the present. If they did not need to do that, a linear (rather than a logarithmic) scale would have sufficed. This is not to discount the importance of linking the timeline to the present: finding a way to represent concepts on a human scale is critical to conveying meaning to humans.

At least one point is still unresolved: what’s the interplay between the two explanations (population growth vs. event selection bias) I’ve given? How much of the behavior of the data is due to population growth, and how much is due to the Special Present fallacy? My hunch is that one could accommodate some Special Present effects while still keeping population growth as the major driver over the historical period. I have not validated this, and given the sparseness and arbitrariness of the milestone data sets, it may not be worth the trouble.

I’d like to thank S. for her help in developing this argument.