Bailey, a former executive at Thinking Machines, the manufacturer of one of the earliest lines of parallel-processor computers, argues that computers using parallel processing, as opposed to traditional linear processing, will change the way we understand intelligence. Drawing on stories and examples from Galileo to contemporary thinkers, he seeks to explain why the parallel-processing approach will revolutionize information processing and analysis. Some of his examples and analogies are straightforward and understandable, but too often he makes unclear chronological and conceptual jumps. Part philosophy, part history of science, part computer science history, and part technological prediction, this book is difficult to follow and thus unconvincing. For academic libraries. --Hilary Burton, Lawrence Livermore National Lab., Livermore, Cal.
After Thought
After Thought's essential premise is that the old ways of understanding the world are about to be swept away by new mathematical methods called "intermaths", much as the ancients' descriptive, geometry-based understanding was swept away by Newton's analytical approach. By helping us understand the first transition, author James Bailey hopes to provide a conceptual model for the second.
"At first the transition to a new vocabulary does not affect the way we understand the world itself. The epicycles of the moon's behavior, for example, were simply reexpressed with sines and cosines instead of with circles and lines. Soon, however, a change of vocabulary is accompanied by a much deeper change in understanding. Scientists choose underlying fictions that are well-adapted to their vocabulary. Where once they looked up into the sky and saw epicycles, for example, they -- and we -- came to see fields of gravitational force instead." -- After Thought, page 80.
After Thought is at its best in the first two sections, wherein Bailey takes us inside the classical mind and then through the paradigm shift that occurred with the invention of the calculus. The presentation is scholarly yet engrossing, synthesizing concepts and figures from hundreds of years of scientific and philosophical literature, Descartes side by side with Thoreau. The history of how
the organization and methods of the human "computers" of yore have been reflected in -- and, indeed, severely constrained -- the design of modern digital computers was new to me, and I found it quite interesting.
In the third section of the book, Bailey turns to the "intermaths" and discusses a grab bag of computational techniques and topics including neural nets, genetic algorithms, cellular automata, classifier systems, emergent phenomena, fractals, chaos theory, agents, simulation, and even weather and economic modeling. One
particularly interesting passage describes a mechanistic implementation of a neural net using a class of students (pages 127-128). The explosion of the Internet and the possibility of integrating the processing power now isolated on a hundred million desktops are also woven into the argument.
In the end, After Thought becomes an appeal to faith. Bailey predicts that the "intermaths" will make current technology and "Cartesian thinking" obsolete, but offers only a muddle of anecdotes to support those prophecies. His discussions of operational hardware are limited to Danny Hillis's "Connection Machine," but both that product and its parent, Thinking Machines Corporation, are now defunct. His most convincing case histories are those that use neural nets, but neural nets do not in fact require any radical change in how computers are organized, nor do they even rely on parallel computation to be useful.
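The point about hardware is easy to demonstrate: a neural net is, at bottom, just arithmetic in a loop. The sketch below (my own illustration, not code from the book) trains a single perceptron on the logical AND function using one strictly sequential loop on a conventional CPU; all names here are hypothetical.

```python
# Minimal perceptron trained on the AND function, running strictly
# serially on an ordinary CPU -- no parallel hardware required.
# (Illustrative sketch, not taken from After Thought.)

def train_perceptron(samples, epochs=20, lr=0.1):
    """samples: list of ((x1, x2), target) pairs with targets 0 or 1."""
    w = [0.0, 0.0]  # connection weights
    b = 0.0         # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Step activation: the unit fires if the weighted sum exceeds zero.
            out = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = target - out
            # Classic perceptron learning rule: nudge weights toward the target.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0

if __name__ == "__main__":
    and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w, b = train_perceptron(and_data)
    print([predict(w, b, x1, x2) for (x1, x2), _ in and_data])  # [0, 0, 0, 1]
```

Nothing here depends on how the machine is organized; the same loop runs unchanged on a 1990s desktop or a parallel supercomputer, which is precisely the reviewer's objection.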
Are the proponents of the "intermaths" merely fringe players? Or are they visionaries trying to implement brilliant ideas with inappropriate hardware -- like Charles Babbage and the Difference Engine? Only time will tell. --Dr. Dobb's Electronic Review of Computer Books