The Brain - How Does it Compute?
Idan Segev
The brain is living proof that physical, chemical, and electrical components can display highly developed levels of intelligence. Compared with the brains of the simplest animals, artificial mechanisms like the GOLEM - whether software-designed or hardware-based - are highly primitive attempts to solve real-life problems. But how do the ten billion cells that make up the human brain control the functioning of the body, memory, emotions, and the performance of creative tasks? The time has come to synthesize - to go beyond the enormous achievements of the 20th century in exploring neural mechanisms at the anatomical, physiological, and molecular levels - and to develop a theory or working model that connects the mechanistic level and the behavioral level.
There is growing awareness in the 21st century that the classic disciplines in and of themselves will not bring about the scientific breakthrough required to solve the mysteries of the brain. Finding a solution requires a thorough understanding of biology, chemistry and physics, mathematics, philosophy, and cognitive psychology, as well as computer science and hardware design. Neural computation is a new field of research. It reflects an emerging interdisciplinary approach that attempts to discover the principles underlying the computation and processing of information in the brain, and to lay the groundwork for constructing artificial intelligence mechanisms. Innovative research centers and advanced teaching methods in neural computation are gaining momentum throughout the world. The largest center of this type is at the Hebrew University of Jerusalem.
This new type of scientist, with broadly based experimental and theoretical abilities, has taken on the task of achieving that breakthrough in our understanding of the brain's computational methods within the first half of this century. In addition to solving the greatest scientific puzzle ever, the breakthrough will allow us to replace parts of a living brain, construct artificial brains, and make more effective use of our own brains. The breakthrough will be at least as momentous as the Industrial Revolution and the current Information Revolution. It will alter our lives in far-reaching and fascinating ways.
The Brain Computes!
For a young child, there is nothing simpler than identifying his mother from different points of view - even if she has a new hairstyle, the background lighting changes, or she is frowning rather than smiling. He can differentiate between different musical selections, learn the alphabet, and extend his hand to grab hold of a cup of milk. For a beetle, it is essential to identify the sources and quality of food, find a partner, and escape danger. All of that is the work of the nervous system, whose main function is to process sensory information (input) and generate appropriate behavior (output). The general operation of the nervous system in processing sensory input and creating appropriate output is called neural computation. We can say that the brain is capable of many things, but it has one truly phenomenal characteristic - it computes!
But how does the brain go about it? How do the physical components of the nervous system - the ion channels in the neural membrane, the synapses and neurotransmitters that connect the neurons, the uniquely structured nerve cells, and the large networks created by the neurons - perform the task of computing? This mystery is, perhaps, the greatest intellectual challenge of the century. The field of neural computation has made this challenge its goal. One of the first steps toward reaching the goal is to educate a new type of multidisciplinary scientist with a solid background both in biology and experimental psychology and in the theory of complex systems (such as our brain). Technology-intensive countries are already conducting advanced research and development programs for the promotion and use of artificial intelligence. We have an urgent need for a theory that explains how neural mechanisms support mental processes and behavior. That is the ultimate goal of neural computation.
Levels of Description, Levels of Understanding
Neuroscience in the 20th century was noted for the development of sophisticated techniques that gave us an inside view of the nervous system at different levels. The application of the electron microscope in the 1950s allowed scientists to identify types of synapses (excitatory and inhibitory) and pinpoint their location on the dendrite. Voltage clamp and patch clamp methods revealed the mechanisms responsible for generating and transmitting electrical signals in neurons. Molecular techniques allow scientists to manipulate the structure of specific ion channels and receptors in order to better understand their role in creating memory and behavior. Recent advances in optical technology - such as video DIC microscopy and the two-photon microscope - enable optical imaging of the electrical activity and the dendritic tree of individual neurons in vivo, while the animal is processing sensory information. Functional magnetic resonance imaging (fMRI) allows us to image specific areas of the brain that are involved in the performance of defined tasks. Slowly but surely, we have been opening the "grey box" and learning how it works. We are becoming acquainted with the different levels of the "meat machine," as computer scientist Marvin Minsky described it, from its molecular fabric to its functional organization (using PET and fMRI). But to find the solution we need something more.
One of the basic questions we need to address is the correct level of description for drawing a connection between the material base of the nervous system and the computational processes conducted by the brain. While this question has been widely debated, many scientists agree with the following statements by Braitenberg in his "Manifesto of Brain Science": "Brain computation is a delicate activity that requires a spatial resolution of one to ten micrometers. The functional elements are fibers and neurons, and in most cases many neurons are involved in a specific functional structure… We believe that the distribution of electrical signals, observed at a spatial resolution of one micrometer and a temporal resolution of a thousandth of a second, is all we need to know as the material representative of behavior. Behavior is subject to change by hormones, medication, or a pathological disturbance in the balance of some transmitter substance, but the actual result is always expressed in terms of the occurrence or non-occurrence of a change in neuronal potential."
If this is, indeed, true, we must focus on the way electrical signals are created, coded, and transmitted by individual neurons and small groups of neurons (in the cerebral cortex, for example) while the animal performs a specific behavioral task. This would require the immediate development of new techniques allowing us to record from many cells simultaneously, both intracellularly and extracellularly. This technological challenge is beginning to draw attention. In vivo recording using the patch clamp technique, combined with the two-photon microscope, produces the necessary level of resolution. These methods - together with the development of multiple-electrode systems for simultaneously recording the spiking activity of many neurons in different processing areas of behaving animals - bring us very close to observing the computing brain at the appropriate spatial and temporal resolution.
But, alas, even this is not sufficient to understand how the brain performs its computational function. There is no theory allowing us to interpret the large corpus of "brain data" in functional or behavioral terms. In this respect, the neurosciences are still in their infancy. Lacking such a theory, we cannot say, "We understand how the brain works."
The Urgent Need for a Theory of the Brain
We can take any computational system, such as a personal computer, disassemble it into its tiniest components, and describe what we see. We can also record the electrical signals flowing within these components while a given problem is solved (2 × 2, for example). However, in order to say we understand how the computer works, we need to know the mathematical algorithms that it applies. We need to know that numbers are represented as binary digits and that logic gates carry out the calculation. Therefore, a thorough understanding of the digital computer requires theoretical knowledge of mathematical logic.
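To make the analogy concrete, here is a minimal sketch in Python (an illustration of the idea, not part of the article's sources): a half-adder built from elementary logic gates, showing how the arithmetic a computer performs ultimately rests on Boolean operations over binary digits.

```python
# Minimal sketch: arithmetic reduced to logic gates.
# A half-adder built from XOR and AND adds two binary digits.

def AND(a: int, b: int) -> int:
    return a & b

def XOR(a: int, b: int) -> int:
    return a ^ b

def half_adder(a: int, b: int):
    """Add two bits; return (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} -> carry {c}, sum {s}")
```

Knowing the truth tables of the gates is not enough; it is the algorithm (binary addition) that tells us what the circuit is doing.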
In the same way, we can describe the building blocks of the nervous system - the synapses, the dendrites, the axons, and the ion channels that cross the membrane. We can also characterize the ionic mechanisms underlying the initiation and conductance of the action potential along the axon and of the synaptic potentials in the dendritic tree. But how do the action potential (or a burst of action potentials), the transmitter released into the synaptic gap, and the post-synaptic potentials in the dendrites represent (encode) a relevant sensory stimulus? What are the (mathematical) operations these signals perform? Are neurons logic gates, and can networks of neurons perform complicated logical operations? Is it more appropriate to use Shannon's information theory as the universal mathematical framework according to which the brain computes? Or are the dynamics of neuronal networks best described and interpreted in terms of non-linear systems theory? It appears that we will need to develop a totally new mathematical theory in order to understand how the physics of the brain brings about the occurrence of a particular behavior.
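As one illustration of the Shannon-style question raised above, the following sketch (with purely illustrative numbers and a deliberately simplified synapse model, not data from any study cited here) computes how many bits an unreliable synapse can transmit about the arrival of a presynaptic spike.

```python
# Hedged sketch: how much information, in Shannon's sense, does an
# unreliable synapse transmit? X = presynaptic spike (1) or silence (0);
# Y = postsynaptic response. The synapse releases transmitter with
# probability p_release when a spike arrives and, for simplicity,
# never releases spontaneously.

from math import log2

def H(probabilities):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

def synapse_information(p_spike: float, p_release: float) -> float:
    """Mutual information I(X;Y) = H(Y) - H(Y|X), in bits per trial."""
    p_response = p_spike * p_release                      # P(Y = 1)
    h_y = H([p_response, 1 - p_response])                 # response entropy
    h_y_given_x = p_spike * H([p_release, 1 - p_release]) # noise entropy
    return h_y - h_y_given_x

if __name__ == "__main__":
    for p_release in (0.1, 0.5, 0.9):
        bits = synapse_information(p_spike=0.5, p_release=p_release)
        print(f"release probability {p_release:.1f}: {bits:.3f} bits per trial")
```

Even this toy calculation makes the point that a noisy, probabilistic synapse transmits well under one bit per trial, which is exactly the kind of quantity an information-theoretic framework lets us measure.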
Theory and models have already deepened our knowledge of various aspects of brain functioning. On the biophysical level (more than on the computational level), the most prominent examples are the Hodgkin and Huxley model (1952; Nobel Prize, 1963) of the initiation and conductance of the action potential along the axon, and the Rall model (1967) of the integration of synaptic inputs in the dendritic tree. There are currently efforts, supported by the United States-Israel Binational Science Foundation (BSF), to combine these biophysical models with models that treat neurons as sophisticated computing devices (data processors, akin to microchips). With the help of information theory, we can calculate the amount of information that a given synapse transmits from the dendritic region to the axon, and we can interpret the functional role of neural noise (the stochastic behavior of ion channels, for example, and the probabilistic nature of synaptic transmission).
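For readers who want to see what such a biophysical model looks like in practice, here is a compact sketch of the Hodgkin-Huxley equations integrated numerically with a simple Euler scheme; the parameter values are the standard squid-axon set, and the stimulus amplitude and duration are arbitrary illustrative choices.

```python
# Compact sketch of the Hodgkin-Huxley (1952) model of the action potential.
import math

C_M = 1.0                              # membrane capacitance, uF/cm^2
G_NA, G_K, G_L = 120.0, 36.0, 0.3      # maximal conductances, mS/cm^2
E_NA, E_K, E_L = 50.0, -77.0, -54.4    # reversal potentials, mV

# Voltage-dependent rate functions for the gating variables m, h, n
def alpha_m(v): return 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
def beta_m(v):  return 4.0 * math.exp(-(v + 65.0) / 18.0)
def alpha_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
def beta_h(v):  return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
def alpha_n(v): return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
def beta_n(v):  return 0.125 * math.exp(-(v + 65.0) / 80.0)

def simulate(i_amp=10.0, t_end=50.0, dt=0.01):
    """Integrate the HH equations under a constant current step (uA/cm^2)."""
    v = -65.0
    m = alpha_m(v) / (alpha_m(v) + beta_m(v))   # start gates at steady state
    h = alpha_h(v) / (alpha_h(v) + beta_h(v))
    n = alpha_n(v) / (alpha_n(v) + beta_n(v))
    trace = []
    for _ in range(int(t_end / dt)):
        i_na = G_NA * m**3 * h * (v - E_NA)     # sodium current
        i_k = G_K * n**4 * (v - E_K)            # potassium current
        i_l = G_L * (v - E_L)                   # leak current
        dv = (i_amp - i_na - i_k - i_l) / C_M
        m += dt * (alpha_m(v) * (1 - m) - beta_m(v) * m)
        h += dt * (alpha_h(v) * (1 - h) - beta_h(v) * h)
        n += dt * (alpha_n(v) * (1 - n) - beta_n(v) * n)
        v += dt * dv
        trace.append(v)
    return trace

if __name__ == "__main__":
    v_trace = simulate()
    print(f"peak membrane potential: {max(v_trace):.1f} mV")  # spikes overshoot ~+40 mV
```

Four coupled differential equations, fit to voltage clamp data, are enough to reproduce the initiation and shape of the spike; the open question is what those spikes mean computationally.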
At a higher level, the work of McCulloch and Pitts (1943) was apparently the first attempt to devise a theory that links the level of brain mechanisms with the level of mental processes. These researchers were the first to study neural computation. It is interesting to note that their motivation came from Leibniz's doctrine, which held that all logic can be reduced to arithmetic, such as that expressed in the binary code. The "all or nothing" behavior of the action potential and the newly discovered effects of excitatory and inhibitory synapses led to their groundbreaking 1943 article, "A Logical Calculus of the Ideas Immanent in Nervous Activity." As they themselves knew, real neurons are not simply binary gates. Still, their work left a deep stamp on the development of modern computers and on more recent attempts to develop a computational theory of the brain.
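The spirit of their proposal can be captured in a few lines: a binary threshold unit with excitatory inputs that sum and an inhibitory input that vetoes firing, which is enough to realize elementary logic gates. The specific gates below are standard textbook illustrations rather than examples taken from the 1943 paper.

```python
# Sketch of a McCulloch-Pitts threshold unit: binary inputs and output,
# excitatory inputs add, any active inhibitory input vetoes firing, and
# the unit fires when the excitatory sum reaches the threshold.

def mcp_neuron(excitatory, inhibitory, threshold):
    """Return 1 if the unit fires, 0 otherwise."""
    if any(inhibitory):                  # absolute inhibition, as in the 1943 model
        return 0
    return 1 if sum(excitatory) >= threshold else 0

# Logic gates realized by single threshold units
AND = lambda a, b: mcp_neuron([a, b], [], threshold=2)
OR  = lambda a, b: mcp_neuron([a, b], [], threshold=1)
NOT = lambda a:    mcp_neuron([1], [a], threshold=1)   # constant excitatory drive

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "AND:", AND(a, b), "OR:", OR(a, b))
    print("NOT 0 =", NOT(0), " NOT 1 =", NOT(1))
```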
Another powerful example of high-level brain theory, based on the statistical mechanics of disordered systems, is Hopfield's model (1982; 1984) of associative memory and learning in large networks of the cerebral cortex. His work has had enormous impact in industry. More significantly, however, research on artificial neural networks, which lends itself both to analytical treatment and to computer simulation, has supplied keys to the general principles underlying the computation of orientation selectivity, depth perception, and motion detection in the cerebral cortex. In any case, the challenge of understanding the circuits of the cerebral cortex remains: the great theory of the computing brain has yet to be invented.
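The flavor of Hopfield's associative memory can likewise be conveyed in a short sketch: binary units, symmetric Hebbian weights, and asynchronous updates that pull a corrupted input back toward the nearest stored pattern. The network size and the amount of corruption below are arbitrary illustrative choices.

```python
# Sketch of a Hopfield (1982) associative memory: +1/-1 units, a symmetric
# weight matrix built by the Hebbian outer-product rule, and asynchronous
# updates that descend an energy function until a stored pattern is recalled.
import numpy as np

def train(patterns):
    """Hebbian outer-product rule; symmetric weights with zero diagonal."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0.0)
    return w / n

def recall(w, state, n_sweeps=10, rng=None):
    """Asynchronous threshold updates over several sweeps."""
    if rng is None:
        rng = np.random.default_rng(0)
    state = state.copy()
    for _ in range(n_sweeps):
        for i in rng.permutation(len(state)):
            state[i] = 1 if w[i] @ state >= 0 else -1
    return state

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    patterns = rng.choice([-1, 1], size=(3, 100))     # three random 100-unit memories
    w = train(patterns)
    probe = patterns[0].copy()
    flip = rng.choice(100, size=20, replace=False)    # corrupt 20% of the bits
    probe[flip] *= -1
    recovered = recall(w, probe, rng=rng)
    print("overlap with stored memory:", int(recovered @ patterns[0]), "/ 100")
```

Run on three random 100-unit patterns, a probe with 20% of its bits flipped typically converges back to the stored memory within a few sweeps, which is the sense in which the network "remembers" by relaxing into an attractor.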
When the Breakthrough Comes
There are now centers for the study of neural computation throughout the world. One of the first and largest is the Interdisciplinary Center for Neural Computation at the Hebrew University of Jerusalem. Its staff has launched highly original and innovative projects, many of which have enjoyed the support of the BSF. It has succeeded in drawing top-level students to its doctoral program and is helping to create the next generation of neuroscientists. There are currently 60 doctoral students enrolled. A new generation of brain researchers is on the way; experimenters and theorists are working together on the same brain. These young, bright scientists will apply the rapidly growing power of computation and innovative technology to brain research. Given their extensive experience and theoretical background, there is room for optimism - the solution is in sight. When it appears, our lives will change dramatically. Intelligent robots will perform many of our daily tasks. They will drive cars, clean the house, iron clothes, and go shopping. These devices will be able to learn from experience and improve their performance. They will play soccer with us and plan tomorrow's schedule for us in accordance with the weather forecast. They will read books for us and perform medical diagnoses. Industry will be fully automated, and its control mechanisms will be artificial brains. As a result, we will have a lot of time for thinking, creating, and enjoying each other's company and the world around us. There are, of course, ethical issues involved in the creation of intelligent mechanisms. Will these robots have emotions? Self-awareness? An independent will? These and other yet-unknown issues will undoubtedly be our central concerns in the 21st century - the "century of the brain." This much is clear: we are on the brink of a great adventure.
Bibliography
- Braitenberg, V. (1992). Manifesto of brain science. In: Information Processing in the Cortex: Experiments and Theory, eds. A. Aertsen and V. Braitenberg, pp. 473-477. Springer.
- McCulloch, W., and Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics 5:115-133.
- Hopfield, J.J. (1982). Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences 79:2554-2558.
- Hodgkin, A.L., and Huxley, A.F. (1952). A quantitative description of membrane current and its application to conduction and excitation in nerve. Journal of Physiology 117:500-544.
- Rall, W. (1977). Core conductor theory and cable properties of neurons. In: Handbook of Physiology, The Nervous System, Vol. 1, Cellular Biology of Neurons, eds. E.R. Kandel, J.M. Brookhart, and V.B. Mountcastle, pp. 39-97. Bethesda, MD: American Physiological Society.