Monday, February 12, 2007

More cores than cognition

Intel's announcement of an 80-core chip reminds me of the cognitive challenge entailed by parallel programming. According to David A. Patterson, a UC Berkeley microprocessor expert, “If we can figure out how to program thousands of cores on a chip, the future looks rosy. If we can’t figure it out, then things look dark.” (New York Times story, 11 Feb 2007.)

Programming thousands of cores is a hard research question. The shopping list of open tasks in a Berkeley white paper on the topic indicates just how far we have to go. Having observed some work on parallel computing, I suspect that the cognitive limitations that have lurked below the surface of programming so far (e.g., limits on the number of independent variables humans can reason about simultaneously) will rise up with a vengeance in parallel programming. We simply may not be able to figure it out.
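To make the cognitive load concrete, consider how fast the space of behaviors a programmer must reason about grows. Here is a back-of-the-envelope sketch in Python (my own illustration, not from the Berkeley paper; the function name and step counts are arbitrary): the number of distinct instruction interleavings of t threads, each executing n sequential steps, is the multinomial coefficient (t·n)! / (n!)^t.

    from math import factorial

    def interleavings(threads: int, steps: int) -> int:
        # Distinct ways to interleave `threads` instruction sequences
        # of `steps` instructions each: (threads*steps)! / (steps!)**threads
        return factorial(threads * steps) // factorial(steps) ** threads

    for t in (2, 4, 8):
        print(f"{t} threads x 10 steps each: {interleavings(t, 10):,} interleavings")

Two ten-step threads already admit 184,756 interleavings; eight admit roughly 10^66. No working memory holds that.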

Programming tools are surely part of the picture, but tools have typically automated and accelerated activities that humans had already mastered. It's an open question whether we can conceive of 1000-core parallel processing sufficiently well to create tools for it.

Hardware proposes, software disposes. Chip performance has grown exponentially, while software productivity has grown at best linearly. Many-core chips will make that mismatch qualitatively as well as quantitatively worse.
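Amdahl's law makes the mismatch concrete: speedup on n cores is bounded by 1 / ((1 - p) + p/n), where p is the fraction of the work that parallelizes. A quick sketch (the law is standard; the particular fractions are illustrative):

    def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
        # Amdahl's law: the serial fraction (1 - p) caps the speedup,
        # no matter how many cores run the parallel part.
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

    for p in (0.90, 0.99):
        for n in (80, 1000):
            print(f"p = {p:.2f}, {n:>4} cores: {amdahl_speedup(p, n):5.1f}x")

Even a program that is 99% parallel tops out near 91x on a thousand cores; at 90% parallel, Intel's 80 cores buy barely 9x. The serial residue, the part human brains must restructure by hand, dominates.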

This mismatch between hardware and software is a weakness in the singularitarian argument that exploding compute power will soon trigger a phase transition in humanity and culture. All those CPU cycles are useless if they cannot be programmed efficiently, and our brains may be the limiting factor. We may need a transhuman mind to break through the barrier to transhumanity. Anyone got a time machine?
