
The Computer and the Brain Revisited

IP.com Disclosure Number: IPCOM000129614D
Original Publication Date: 1989-Jun-30
Included in the Prior Art Database: 2005-Oct-06
Document File: 6 page(s) / 28K

Publishing Venue

Software Patent Institute

Related People

TERRENCE J. SEJNOWSKI: AUTHOR

Abstract

Terrence Sejnowski assesses von Neumann's contribution of mathematical and computational tools for the development of computational neuroscience. He surveys the progress that has been made in this field since von Neumann's death and outlines the difficulties that remain.

This text was extracted from a PDF file.
This is the abbreviated version, containing approximately 16% of the total text.

Copyright © 1989 by the American Federation of Information Processing Societies, Inc. Used with permission.

The Computer and the Brain Revisited

TERRENCE J. SEJNOWSKI

Categories and Subject Descriptors: K.2 [Computing Milieux]: History of Computing -- hardware, people, systems, theory; H.1.1 [Information Systems]: Models and Principles -- Systems and Information Theory; H.1.2 [Information Systems]: User/Machine Systems -- Human information processing.

I first read John von Neumann's book The Computer and the Brain in the summer of 1970, while studying for the general examination for doctoral candidacy in physics at Princeton. Ever since then, I have been thinking about the issues von Neumann raised in his book. Rereading the book recently has highlighted the progress that has been made in trying to understand information processing in the brain, as well as the difficulties that remain.

When von Neumann wrote the manuscript for the Silliman Lectures at Yale in 1956, the general character of electrical transmission and communication between neurons had just recently been elucidated through the seminal work of Alan Hodgkin and Andrew Huxley on the squid giant axon, and of Bernard Katz on the frog neuromuscular junction. The all-or-none nature of the action potential had suggested analogies with binary gates in digital computers (McCulloch and Pitts 1943), but the analog nature of neural integration was just beginning to be fully appreciated. Typically, the accuracy of numerical calculation in a modern digital computer is 8 to 16 significant figures. But in a neuron, signaling by means of the average firing rate has at best one or two significant figures of accuracy. We still do not understand how information is represented in the brain in such a way that low accuracy is not a problem.
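
To make the precision gap concrete, here is a minimal sketch in Python (not from the article; it assumes Poisson-like spike statistics and an illustrative neuron firing at 50 spikes per second, read out over a 100-millisecond window) comparing the relative accuracy of a firing-rate estimate with that of double-precision floating point:

    import math

    # Illustrative numbers, not from the article: a neuron firing at about
    # 50 spikes/s, read out by counting spikes in a single 100 ms window.
    rate_hz = 50.0
    window_s = 0.1
    expected_count = rate_hz * window_s            # about 5 spikes per window

    # For Poisson-like spiking the count fluctuates with std ~ sqrt(mean),
    # so the relative error of a one-window rate estimate is roughly:
    relative_error = math.sqrt(expected_count) / expected_count
    significant_figures = -math.log10(relative_error)
    print(f"firing-rate estimate: ~{relative_error:.0%} relative error "
          f"(~{significant_figures:.1f} significant figures)")

    # A 64-bit float, by contrast, carries about 15-16 significant decimal
    # digits (machine epsilon 2**-52).
    print(f"double precision: ~{52 * math.log10(2):.0f} significant figures")

Averaging over longer windows or over many neurons improves such an estimate only as the square root of the number of spikes counted, which is consistent with the one or two significant figures noted above.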

Von Neumann recognized that the reliance of the brain on analog-signal processing had far-reaching significance for the style of computation that the brain was capable of supporting. He pointed out that the logical depth of a calculation, for example, can be very great for a digital computer that retains high accuracy at each step in the calculation; but for an analog system like the brain, the compounding of errors causes severe problems after only a few steps. Much of the work in artificial intelligence depends on the efficient use of a sequential, symbol-processing architecture, and on tree searches that have great logical depth. The model of computation based on logic that led to sequential architecture also served as a model for human reasoning in cognitive science (Newell and Simon 1976). The recent availability of d...
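
A small numerical sketch (purely illustrative; the 10 percent per-stage noise level and the doubling operation are assumptions, not figures from von Neumann or the article) shows how errors compound with logical depth when every stage of an analog chain adds independent noise, while an exact digital chain does not:

    import random

    random.seed(0)
    PER_STAGE_NOISE = 0.10   # assumed ~10% relative error per analog stage

    def analog_chain(x, depth):
        """Apply `depth` stages that each double x and add relative noise."""
        for _ in range(depth):
            x = 2.0 * x * (1.0 + random.gauss(0.0, PER_STAGE_NOISE))
        return x

    for depth in (1, 2, 5, 10, 20):
        exact = 2.0 ** depth              # the error-free digital result
        noisy = analog_chain(1.0, depth)
        rel_err = abs(noisy - exact) / exact
        print(f"logical depth {depth:2d}: relative error ~{rel_err:.0%}")

With independent per-stage noise the relative error grows roughly as the square root of the depth, so a chain that begins with only one or two significant figures loses what little accuracy it has within a handful of stages, whereas a digital machine carrying 15 digits can sustain calculations of enormous logical depth.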