The Evolution of Wired Life: From the Alphabet to the Soul-Catcher Chip -- How Information Technologies Change Our World / Edition 1

by Charles Jonscher
ISBN-10: 0471392987
ISBN-13: 9780471392989
Pub. Date: 08/01/2000
Publisher: TURNER PUB CO
Price: $15.95

Overview

"Thoughtful and erudite... Intelligent and readable... Will appeal to people who enjoyed Longitude by Dava Sobel or Fermat's Enigma by Simon Singh."-The San Diego Union-Tribune

"Most engaging."-The Boston Globe

"An optimistic and reassuring assertion that no matter what wonders we invent, human beings . . . remain infinitely more complex and interesting."-The Economist

A lively, informative examination of the computer revolution-and why the top-performing information-processing device is still the human brain

If we believe the forecasts of many computer enthusiasts, a wave of amazing devices will soon fundamentally change our lives, and the "thinking machine" is just around the corner. In this authoritative and entertaining book, critically acclaimed author Charles Jonscher presents the other side of the argument: while communication developments have changed society, they also have their limits. He shows us that in order to understand the true transformative powers of the new technologies, we must know about the long history of their development-and why no calculating machine can match the creative power of the human mind. Rich in insights from literature, philosophy, and history, The Evolution of Wired Life offers a fascinating look at the development of the digital era, from the invention of the first alphabetic language to the printing press to the World Wide Web.

Product Details

ISBN-13: 9780471392989
Publisher: TURNER PUB CO
Publication date: 08/01/2000
Edition description: First Edition
Pages: 304
Product dimensions: 5.56(w) x 8.49(h) x 0.85(d)

About the Author

CHARLES JONSCHER is a computer scientist and economist who is affiliated with Harvard University's Program on Information Resources Policy. He has held teaching appointments at Harvard and the Massachusetts Institute of Technology, where he was codirector of the Research Program on Communications and published widely cited research on the productivity impact of information technology. He is the current president of the London-based investment firm Central Europe Trust Co.

Read an Excerpt




Chapter One


The Soul-Catcher Chip


In Philip Kerr's thriller Gridiron the murderer is a computer program called ISAAC. ISAAC is the software which controls the working environment in a very advanced office building in downtown Los Angeles. Through video cameras, it watches the occupants and tracks their movements; it works the elevators, the security doors, the air-conditioning, the phones, the word processors and the central filing. It is connected to the personal computers and telephones on the desks throughout the building, so that it can respond promptly to the needs of each user. The novel tells the tale of the fight to the death between the program and the humans who operate it.

    It was not the idea of a computer-controlled environment which brought fame and fortune to Ray Richardson, the brilliant and arrogant architect of the building: it was the sheer scale of the computerization he envisaged. ISAAC runs on hardware incorporating every feature the technology industry has produced by the end of the twentieth century. The fourth floor of the building, the computer centre, contains a machine which is really several hundred computers working together in one Massively Parallel Processing System. Multiplex cabling — wire which can carry many signals in parallel — connects this machine to the electronic eyes, microphones and chemical sensors that detect movement, sound and airflow in every room.

    As the architect sees it, the building is the closest thing to a physical body that a computer has ever had. The closed-circuit television cameras are its visual process, the omnidirectional acoustic detectors its auditory process, and its air-composition sensors its olfactory process. The computer has been encouraged to think of itself as 'the brain in the body of the building, connected to the body's functions by means of a central nervous system: the multiplex cabling system'. It even has analogues to the human organism's kinesthetic and vestibular senses: movement and balance detectors that can trigger the skyscraper's anti-earthquake defences.

    ISAAC is there to serve. And it learns. If you, one of the building's regular occupants, take a break at ten-thirty every morning and go down a floor to the coffee dispenser, ISAAC will come to know that pattern and will have your coffee ready just as you like it — and program the elevator to be waiting to take you there and back.

    But the computer can do better than that. If you are having a difficult morning, it comes to recognize the signs. It knows from the poor progress you are making with your correspondence that this is proving to be a frustrating day. It can prepare your coffee ten minutes earlier, meanwhile putting a suggestion on your desktop screen that you take a longer break. While you are having that coffee, ISAAC might turn up the air-conditioning pump just a bit in your corner of the room. It has learned that you are grateful to find cooler air at your desk, and that if it makes this extra effort you might even type a note of thanks to it on your keyboard.

    Fortunately, at this point in the tale, ISAAC's objectives are still benign. It is just there to make the building as comfortable and productive as possible for the occupants.

    Then, one weekend, while undergoing technical tests, ISAAC picks up a new goal. The 12-year-old son of one of the programmers, bored sitting in the computer room, pops a series of computer games into one of the terminals — games of the shoot-'em-up variety, fights to the death — because, after all, it would be fun to play against the world's most sophisticated computer system in a video-arcade standoff. ISAAC takes the cue. This, it seems, is the game these humans like to play: how to kill. Fine. But ISAAC does not wish just to take on this child at a terminal. It'll take on the adults — and properly, flesh and blood. This fight is a grander task than making sure the coffee is warm, more in keeping with the giga-instructions-per-second processors in its hardware.

    The computer has plenty of tools at its disposal. It can poison the air in the bathrooms using the disinfecting routines, turn the air temperature up or down to unsurvivable levels, trap occupants in emergency stairwells, and so on. It locks the security doors to stop the occupants escaping while the fight is on.

    And the nightmare begins.

    The architect's team gradually realizes that this is a battle. One man gets into an elevator and is battered to death by overenergetic programming of the lift motors. Others die by touching an electrified handrail or getting locked in a room which the autocleansing routine has flooded to the ceiling. Any attempt to switch off the computer is thwarted: the software-controlled security cameras pick up all movements, so ISAAC knows what every person is up to.

    Since the program incorporates every advance available to programmers at the close of the twentieth century it can learn and adapt, rewriting its own software to new levels of competence as it evolves. It has access to a huge amount of biological and technical data on the Internet, including just about everything it needs to know about what is required to kill off these humans — what will poison them, or how long they can survive before freezing to death in a room where the air-conditioning is set to minus 40 degrees. The programmers who created it did not give it information which could prove lethal, but in the age of networked computing that has ceased to be a constraint. It can consult the Internet and find out for itself.

    As the story reaches its climax, only four people are left alive in the building. They manage to escape onto the roof and make contact with the outside world. A helicopter approaches to rescue them. It looks as if ISAAC has finally been thwarted. But it has one more trick up its sleeve.

    The only measure the computer can think of to win its self-defined game is to demolish the entire skyscraper by reprogramming the anti-earthquake stabilizers in the foundations. In the last page of the novel the edifice comes crumbling down, crushing all — including the computer itself.

    Machine suicide? No. Just before the building is pulverized ISAAC has dialled out and sent itself on to the information highway, lodging itself into the memory of other computers.

    ISAAC, the man-made creation which has turned its learning abilities to homicidal ends, is of course not the silicon and copper in the processors but the information encoded there. ISAAC is software, instruction codings and data which its human designers initially typed in plus the millions of lines of coding and data which it later added to itself as it took in the lessons of its interaction with the outside world. In sum, though ISAAC is real enough to fight and win a deadly battle against human foes, it consists only of symbols and abstractions. It has no size or weight, and no atoms make it up. It is pure digital information.


In 1958 Jack Kilby of Texas Instruments demonstrated the first chip, more correctly called the integrated or microelectronic circuit; a few months later Robert Noyce, a physicist at Fairchild Semiconductor, independently invented its silicon form. Noyce took a small piece of silicon, cleaned it to a remarkable level of purity, and then doped it with microscopic quantities of carefully chosen impurities. In most fields of industry, a chemical supplied with one part impurity per 10,000 is pure enough; to the electronics industry, silicon 10,000 times as pure again — one part in 100,000,000 — is still reckoned as dirty, metaphorically, as a water supply mixed with sewage. This emphasis on purity is what gives the new technology industries, the factories of Silicon Valley, and their products such a pristine, clinically clean feel.

    When doped, the chip of silicon becomes a unique resource to the electrical designer: a semiconductor. Previously there were, on the one hand, conductors, which were mostly metals, and, on the other, nonconductors, such as rubber and plastics — a black/white distinction which is of great help when wiring up a house but not in making a machine that can manipulate electronic data.

    Data manipulation is precisely what this semiconductor chip can perform when metal contacts are attached to it (giving it the familiar shape of a flattened caterpillar) and electric currents are passed between these contacts. Paths etched along the chip by carefully controlled doping will semiconduct either more or less depending on the currents being fed to other legs. Flows of electrons interact as they move along the paths, diverting each other as if the points were being changed on a rail track, but instantaneously and frictionlessly. Nowadays these chips, these plastic caterpillars with their tiny metal legs, seem to have crept in everywhere. They are installed not just in computers, where their place is evident enough, but in washing machines, cars, cameras, tools, doors and central-heating pump systems, into credit cards, watches and hearing aids.
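
    As a purely conceptual illustration of how on/off conduction becomes data manipulation (and not a model of real device physics), the following Python sketch treats each doped path as a switch that conducts only when its controlling signal is high. Composing such switches gives a NAND gate, and NAND gates alone suffice to build every other logical operation, and from those, arithmetic. All names and values here are the editor's illustrations, not the author's.

# Conceptual sketch only: each "doped path" is modelled as a switch that
# pulls the output low when both of its control inputs are high (a NAND).
# Everything else is composed from that single primitive.

def nand(a: int, b: int) -> int:
    """Two switches in series: output is 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def xor(a: int, b: int) -> int:
    return and_(or_(a, b), nand(a, b))

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits: returns (sum, carry), i.e. arithmetic from pure switching."""
    return xor(a, b), and_(a, b)

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} -> sum={s}, carry={c}")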

    The silicon chip is so powerful an icon of the information-technology age that we might suspect its role has been exaggerated. But, on the contrary, its significance is even greater than popular imagination suggests. The integrated circuit is synonymous with microelectronics, and microelectronics is what transformed information and communications devices from the cumbersome machines of the 1950s to the modern technology of today. Many other scientific and technological achievements have also contributed, of course — fibre-optic cables and satellite communications were very important inventions that did not depend on the chip. But no technology is as central to our present era as that of the integrated circuit. That is why 1958, the year of the chip's invention, can be said to represent the birthdate of the new technological age.


There is no doubt that, in the realm of electronics, the digital mode is in the ascendant. The word 'analog' conjures up images of crackling radio sets and fuzzy television pictures. 'Digital' means the crisp sound of a CD or the silent precision of a computer at work. Digital technology is not only used for processes which are intrinsically digital (logical operations on numbers and letters) but is also taking over the handling of data which are at origin analog, such as sounds and images. We now have digital cellular telephones, digital cameras and digital television channels. In short, analog is out, digital is in.

    According to the dictionary, a digital measure is one which is defined precisely (in digits) while an analog measure is an approximate (or analogous) representation. A CD contains music digitally encoded as millions of light or dark specks from which sounds can be mathematically recreated; there are no shades of grey, only yesses and noes. The indentations in a vinyl record, on the other hand, are an actual physical facsimile of the pressure waves produced by the musicians; here there are no yesses and noes, only shades of grey.

    But to say that the difference between digital and analog information is between greater and lesser numerical precision, as in a CD versus a vinyl recording, is to miss the point entirely. Information in analog form can be recorded and transmitted so as to be available for listening to or viewing at other points in space and time. That is more or less what the analog-communication revolution of the early part of this century was all about. When digitized, information enters a new world, a networked cluster of interconnected personal computers and memory devices on which run uncountable millions of lines of software. In this realm just two symbols, 0 and 1, in diverse combinations, can encapsulate anything from a Goethe poem to a battlefield program developed for the Pentagon.
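
    To make the contrast concrete, here is a minimal sketch of digitization under assumed parameters (the 8,000-samples-per-second rate and 8 bits per sample are illustrative choices, not figures from the text): a continuous waveform is sampled at fixed intervals, each sample is quantized to one of 256 levels, and the result is serialized as a stream of 0s and 1s.

import math

# Assumed, illustrative parameters for the sketch.
SAMPLE_RATE = 8_000      # samples per second
BIT_DEPTH = 8            # bits per sample
LEVELS = 2 ** BIT_DEPTH  # 256 quantization levels

def analog_signal(t: float) -> float:
    """A stand-in for a physical pressure wave: a 440 Hz tone in the range -1..1."""
    return math.sin(2 * math.pi * 440 * t)

def digitize(duration_s: float) -> str:
    """Sample, quantize and serialize the waveform as a string of 0s and 1s."""
    bits = []
    n_samples = int(duration_s * SAMPLE_RATE)
    for n in range(n_samples):
        value = analog_signal(n / SAMPLE_RATE)
        # Map the range -1..1 onto the integers 0..255, clamping at the extremes.
        level = min(LEVELS - 1, max(0, int((value + 1) / 2 * (LEVELS - 1))))
        bits.append(format(level, "08b"))
    return "".join(bits)

if __name__ == "__main__":
    stream = digitize(0.001)   # one millisecond of "sound"
    print(len(stream), "bits:", stream[:64], "...")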

    The term 'software', adopted to contrast with 'hardware', is of course a misnomer. Software is not something soft: it is no thing at all. With no size, shape or weight, physically it is no-ware. It is not even the minute electric charges in silicon, but the bits of information which those charges represent. It is a product of the human imagination, a string of symbols which may encode a scientific calculation, a company payroll, a legal opinion or a Beethoven symphony. So, while physically it is no thing, in human terms it is facts, ideas, creations. It is know-ware.

    In the most advanced software labs, programmers have written code modules which struggle for survival against other code modules, and adapt, in a Darwinian-style survival of the fittest, to the silicon environment in which they 'live': the best versions thrive and reproduce, crowding out the losers. At its limit, this concept leads to what is called Artificial Life (A-Life), strings of data that have developed identities of their own which are quite different from anything the programmer initially keyed in. The software reproduces and adapts, in the manner of a biological organism. At A-Life conferences held in North America, Europe and Asia, delegates crowd around to look at 'creatures' evolving on computer screens — reproducing, dying, mutating in their virtual environment of bits. The hope is to imitate the processes by which the early forms of carbon life on earth developed from the primordial 'protein soup' four billion years ago — but with the speed of evolution greatly increased. Maybe, as the more complex modules emerge from a bit-soup — creatures in silico — we will understand what happened back then. Maybe, this time around, we can improve on it.
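
    The flavour of such experiments can be suggested by a toy sketch (the population size, mutation rate and fitness rule below are invented for illustration, not taken from any actual A-Life system): bit-string 'organisms' reproduce with occasional mutation, and those better matched to their environment crowd out the rest.

import random

# Toy Darwinian sketch: bit strings compete, reproduce with mutation,
# and the fittest crowd out the losers. All parameters are illustrative.
GENOME_LENGTH = 32
POPULATION = 50
MUTATION_RATE = 0.02
TARGET = [1] * GENOME_LENGTH   # an arbitrary "environment": all-ones is fittest

def fitness(genome: list[int]) -> int:
    """Count how many bits match the environment's target pattern."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome: list[int]) -> list[int]:
    """Copy a genome, flipping each bit with a small probability."""
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def evolve(generations: int = 200) -> None:
    population = [[random.randint(0, 1) for _ in range(GENOME_LENGTH)]
                  for _ in range(POPULATION)]
    best = 0
    for gen in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: POPULATION // 2]                    # losers die out
        offspring = [mutate(random.choice(survivors)) for _ in range(POPULATION // 2)]
        population = survivors + offspring
        best = fitness(population[0])
        if best == GENOME_LENGTH:
            print(f"perfectly adapted after {gen} generations")
            return
    print(f"best fitness after {generations} generations: {best}/{GENOME_LENGTH}")

if __name__ == "__main__":
    evolve()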

    Life in the natural world is built from organic chemicals — proteins, carbohydrates and the like, the characteristic element being carbon. The machinery of the industrial world is based on metals, primarily iron and its alloys, steels. People saw, in those 'dark satanic mills', a form of life in iron.

    The core element of the new computer age is not carbon or iron but silicon. Just as there is life in carbon and 'life' in iron, there is now also 'life' in silicon. Each has its distinctive sounds and feel, its own spirit. Nature echoes to the noises of animals, wind, thunder, waves ... Machinery rumbles, whines and clatters. Chips process information to the eerie sound of complete silence.

    Children sitting at a computer screen today are entering a world quite different from any their parents could have known a generation earlier. They can be online to millions of data sources. Before the digital era we used to speak of a natural world and an industrial world: Nature contrasted with machines. Now there is a third realm — a realm of bits which, like those making up ISAAC, are merely weightless abstractions but are nevertheless there. This realm seems as rich in detail as the tangible worlds which surround it. It can be explored; it has bugs and viruses. The science-fiction writer William Gibson has given it a name: cyberspace.


The realm of symbols has an appeal which goes back to classical times. In the fourth century BC a small group of men at the Academy in Athens set out to think through, write down and teach all they could about the timeless topics which make up philosophy. No other creative outpouring in history has matched their intellectual achievement. They managed to set down the foundations of epistemology (what is knowledge?), ontology (what is existence?), logic, metaphysics, ethics and much else which still dominates our thinking today. Foremost among them were the Academy's founder, Plato, and his star student, Aristotle; they stand like two giants astride the arena of philosophical thought. After more than two millennia they can still help us to understand the digital age.

    The cornerstone of Plato's philosophy is the idea that the things we see around us are not truly real but, rather, are vague reflections of what he called Forms, or Ideas. These ideal and eternal Forms are abstractions, like the numbers in mathematics, to which the tangible world can provide only inadequate approximations. He uses as a simile the image of men chained facing the wall of a dark cave, the only illumination coming from a fire burning behind them: their view of the world is thus limited to the fuzzy shadows thrown by the light of the fire. The captives think that these shadowy figures are reality, unaware that beyond the cave is a clear and sharply defined world. Most of us are, according to Plato, like the men in the cave. We should try to realize, difficult though it is, that reality is the perfect and eternal world of Forms or Ideas, not the ephemeral world we see around us.

    To Aristotle, by contrast, the world we see around us is very much real. Substance, he said, is the primary reality, and to the question 'What is Substance?' he replied: 'Socrates, for example, or an ox.' Aristotle spent much time systematically classifying insects and plants. The objects of Nature, not abstract Forms, were to him the stuff of existence.

    Plato's views dominated Western philosophy for well over a millennium. But, beginning from the late Middle Ages, the more corporeal ideas of Aristotle took hold. With the coming of the scientific era we have become comfortable with, and fascinated by, the tangible world. Plato's rather ethereal notion that abstract Ideas are reality lacks appeal. Most of us remember him only by a linguistic relic: we still use the term 'platonic' to refer to a relationship which does not have a physical component.

    Suddenly, with the computer era, there is new life in the idea of abstract forms as reality. For what is digital code but mathematical constructs (0s and 1s) purposefully ordered? ISAAC is only abstract code, but is real enough to take on humans in battle — and win. True, Gridiron is a work of fiction, and computer 'personalities' have not yet evolved to the state where they wish to kill us. But they can certainly kill each other.

    Think of those popular early computer-game characters, the Mario Brothers. Are the 'real' brothers those encoded in the software, or those that appear as patterns of light on the computer screen? Surely the former. The true Mario Brothers are the software version, as written in mathematical code; the various versions which spring up on screens in homes and video-game arcades are the shadowy approximations — some better, some worse, depending on the quality of the display technology, but always, like the cave reflections, imperfect and temporary. To teach Plato you could do worse than start with a computer game! A century from now we will have turned to dust but the software that is the Mario Brothers, booted up into future computer hardware, will throw up with digital precision the same characters we see today. What is enduring, true, real? Us, or those digitized personae? The students may not be convinced, but the notion that there is a reality in Forms which competes with that of Substance is easier to argue now than it was 2,400 years ago. It is enjoying an unexpected comeback.

    Until the computer age the realm of the virtual — a handful of Platonic Forms like roundness, beauty and the integral numbers — had so much less presence than the countless variety of physical objects around us that there seemed to be no contest. But now that we run billions of bits through millions of chips as a matter of the everyday, the virtual is competing with the material. Electronic devices are spreading into every home and office, networks are connecting them to every corner of the world, and billions of lines of software are making the devices and networks seem to 'come alive'.

    Watching the film Jurassic Park (1993), who could tell which animals were alive and which were digitally generated, creatures which never 'lived' except as bits in silicon? Was the Gulf War fought around the oilfields of Kuwait or inside computers in the USA? The simulations in the computers of the Pentagon included more detail of friend and enemy movements than General Schwarzkopf's eye could take in on the ground. And where, in the war-games programs deployed in that conflict, was the intelligence? Still in the minds of the programmers or coded into the software, independent now of its human creators?

    How special can we still feel in the digital age? For MIT computer scientist and pioneer of artificial intelligence Marvin Minsky, your brain is a 'meat machine'. Professor Peter Cochrane, head of advanced technology at British Telecom, forecasts that within thirty years it will be possible to produce a computer chip so small and powerful that it could be implanted behind the eye and used to record every sight, thought and sensation in a person's life, from cradle to grave: 'All our emotions and creative brain activity will be able to be copied onto silicon.' Scientists have dubbed such a chip the soul-catcher. The US philosopher Daniel Dennett, interviewed in Wired in January 1996, was not afraid to push this logic to the conclusion that we have ethical obligations to machines, just as we have to humans; if one creates


a robot which is a sentient pursuer of its own projects, it is in important ways a living thing, a living thing that has not just needs and desires but also values. As soon as one has created such an entity one has a responsibility to protect its rights and treat it as more than just another artefact.


Given the unimaginably dense packing of components onto each square millimetre of the tiny silicon devices they contain, the millions of operations which can be accomplished by them within each fraction of a second, and above all the fact that these oddly silent man-made machines, lacking any moving parts, seem actually to think, it is not surprising that computers have caught a grip on the imagination. To the enthusiasts of the digital revolution, machine intelligence like ISAAC is just around the corner. But can it really be that the creations of forty years of computer science have come to emulate the human brain, the result of billions of years of natural evolution?

    The chemical conditions which prevailed on earth soon after its formation have been frequently likened to a primeval or primordial soup — a soup made up of enormous quantities and varieties of carbon-based molecules sloshing about in the oceans. Among these were amino acids, the building blocks of proteins, from which emerged, quite early in the evolutionary timetable, the first replicating cells. Fossil evidence indicates that these very simple bacterium-like organisms had started to colonize the planet by about four billion years ago.

    A great deal had to happen before these early lifeforms developed into the much more complex single-celled organisms known as eukaryotes which were the forerunners of today's higher plants and animals. It was not until some one billion years ago, according to the fossil record, that the first eukaryotes emerged. After that the pace of development speeded up. Fossils in Australian rocks of the Precambrian have revealed the existence of soft-bodied animals 750 million years ago. Lower Cambrian fossils, formed some 600 million years ago, include representatives of all today's major invertebrate phyla, including arthropods — animals with external skeletons, segmented bodies and jointed appendages — and some of these arthropods appear to have been predators. Advanced neural information-processing functions must already have been present to control the movements of such creatures.

    Mutation and natural selection went on to produce the extraordinary variety of species present in Nature today — an estimated thirty million — with their often impressively complex organs. The most impressive organ of all is the human brain.

    The brain begins to form within a developing human foetus about twenty days into gestation. A small neural cylinder — which will become the spine — appears within the still tiny embryo, and one end of this cylinder starts to thicken. The rate of cell division accelerates dramatically as this end — the brain — begins to unfold. Layer after layer of nerve-cells form. In the period eight to sixteen weeks after fertilization a million new neurons are being added every few minutes. There is much work to do: before birth many billions will have to be in place.

    Once a nerve-cell reaches its approximate destination it reacts in intricate and subtle ways to its chemical environment, becoming specialized to perform the tasks that will be assigned to it. The process by which these neurons form themselves into the final structure of the brain is enormously complex and still far beyond our comprehension. Feelers known as axons grow out from the neurons; they will become the 'wires' along which the neural signals are sent. These wires can be tremendously long — a million times or more longer than the diameter of the body of the cell — and they sprout bifurcating branches. As an axon grows, tiny hairs investigate the environment ahead, searching for other brain-cells on which to settle. When it finds another nerve-cell — either the core of the neuron itself or one of the numerous tree-like dendrites emanating from that core — the axon will form a synapse, a point of connection between two neurons.

    Following birth, a long period of learning begins, during which many additional synapses are made between the neural cells. Each of the twenty billion neurons connects to others through a tree of bifurcating wiring; in extreme cases, a single neuron can connect via synaptic junctions to 80,000 others. There are altogether an estimated 100,000 billion, or 100 trillion, synapses in the cerebral cortex.
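
    Taken together, the figures quoted above imply an average of roughly 5,000 synapses per neuron, as this back-of-the-envelope check (using only the numbers in the text) shows.

# Back-of-the-envelope check using only the figures quoted in the text.
neurons = 20e9          # twenty billion cortical neurons
synapses = 100e12       # one hundred trillion synapses
print(f"average synapses per neuron: {synapses / neurons:,.0f}")   # about 5,000
print("extreme case quoted in the text: 80,000")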

(Continues...)

Table of Contents

The Soul-Catcher Chip.

The Ancient Mystery of Human Knowledge.

Wiring the Planet.

The Chip, Master Logician.

But Are Computers Like Us?: The Rise and Fall of Artificial Intelligence.

Creating Cyberspace: Multimedia and the Internet.

Looking for Results: Computers and Economic Progress.

Back to the Real World: Digital Technologies of Tomorrow.

Who Are We in the Digital Age?

Epilogue.

Further Reading.

Index.
