Robin Robertson's Home Page
Title: The Case of the Missing Third.
Abstract: How is it that form arises out of chaos? How do we reconcile mind with body? In attempting to deal with these primary questions, time and again a "missing third" is posited that lies between extremes. The problem of the "missing third" can be traced through nearly the entire history of thought. The form it takes, the problems that arise from it, and the solutions suggested for resolving it are each representative of an age. This paper traces the issue from Plato and Parmenides in the 5th - 4th centuries B.C.; to Neoplatonism in the 3rd - 5th centuries A.D.; to Locke and Descartes in the 17th century; on to Berkeley and Kant in the 18th century; Fechner and Wundt in the 19th century; to behaviorism, Gestalt psychology, and then Jung early in the 20th century; and culminates with ethology and cybernetics later in the 20th century, stopping just short of dynamics.
The problem of the "missing third" can be traced through nearly the entire history of thought. The form it takes, the problems that arise from it, the solutions suggested for resolving it, are each representative of an age. In Process and Reality, Alfred North Whitehead said that "The safest general characterization of the European philosophical tradition is that it consists of a series of footnotes to Plato" (Whitehead, 1929, 39). So our proper starting point is Plato.
Parmenides' Third Man Argument Against Plato's Ideals
In Plato's dialogue Parmenides, a young Socrates presents his theory that the "things" we think of as real in the physical world necessarily acquire their form from a world of idealized "forms," commonly known today as Platonic "ideas" or "ideals." Aristotle used the term "Third Man" for Parmenides' concise argument against Socrates' (that is, Plato's) model. Only Parmenides' side is presented below, as Socrates merely acquiesces at each point, as is common in the dialectical dialogues of Plato.
I imagine your ground for believing in a single form in each case is this. When it seems to you that a number of things are large, there seems, I suppose, to be a certain single character which is the same when you look at them all; hence you think that largeness is a single thing. But now take largeness itself and the other things which are large. Suppose you look at all these in the same way in your mind's eye, will not yet another unity make its appearance--a largeness by virtue of which they all appear large? If so, a second form of largeness will present itself, over and above largeness itself and the things that share in it, and again, covering all these, yet another, which will make all of them large. So each of your forms will no longer be one, but an indefinite number (Cornford, 1957).
Parmenides is arguing that if some quality such as largeness exists as an independent entity, then there must be some connecting element, here referred to as a "second form of largeness," between those things which are large and "largeness" itself. If that argument is accepted, then one is involved in an infinite regress, with largeness-1 leading to largeness-2, and so on, ad infinitum. So any attempt to find ideal forms ends up denying those ideal forms, since each ideal form has an infinite number of versions.
In the same dialogue, Parmenides presents his answer to such problems: that there is a single underlying principle of being, "whether one is or is not, one and the others in relation to themselves and one another, all of them, in every way, are and are not, and appear to be and appear not to be" (Cornford, 1957). In other words, the only way to get out of the infinite regress is to realize that there is no multiplicity, that all is one.
Parmenides' disciple Zeno wrote a book whose object, according to Plato: "…was to defend the system of Parmenides by attacking the common conceptions of things" (Heath, 1981). The Neoplatonist philosopher Proclus (of whom we'll hear more later) said that the book "elaborated forty different paradoxes following from the assumption of plurality and motion, all of them apparently based on the difficulties deriving from an analysis of the continuum" (Gillispie, 1980). Aristotle preserved four of these paradoxes; fragments of several others survive from other sources. Though all present the problem of the missing third to one degree or another, two do so most explicitly:
In "Dichotomy," Zeno argues that a mid-point always lies between any starting and ending point. But that mid-point can be considered as a new ending point and, therefore a new mid-point can be identified between it and the starting point. And so on.
In the famous race between Achilles and the Tortoise, if the Tortoise is given a lead to begin, Achilles can never catch him because, by the time Achilles reaches the Tortoise's starting point, the Tortoise will have moved farther. When Achilles reaches that point, the Tortoise will have moved again. And so on.
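Modern arithmetic shows why Achilles nonetheless overtakes the Tortoise: the infinitely many stages form a geometric series whose total time is finite. The sketch below uses illustrative numbers (a 100-unit head start, speeds of 10 and 1) that are not from the dialogue itself.

```python
# Worked arithmetic for the Achilles paradox (illustrative numbers).
# Each "stage" ends when Achilles reaches the Tortoise's previous position,
# so the remaining gap shrinks by a factor of 10 per stage.
head_start = 100.0
achilles_speed = 10.0
tortoise_speed = 1.0

gap = head_start
elapsed = 0.0
for _ in range(50):                      # fifty of Zeno's infinitely many stages
    stage_time = gap / achilles_speed    # time for Achilles to close the current gap
    elapsed += stage_time
    gap = tortoise_speed * stage_time    # the Tortoise has crept ahead again

# The series of stage times is geometric: 10 + 1 + 1/10 + ... = 10 / (1 - 1/10).
# The same answer falls out of the relative speeds directly:
catch_up_time = head_start / (achilles_speed - tortoise_speed)
print(round(elapsed, 6), round(catch_up_time, 6))  # both ≈ 11.111111
```

Summing the stages and dividing the head start by the difference in speeds agree: the "endless" chase lasts just over eleven time units.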
Why do these paradoxes occur? Zeno would agree with Parmenides that the problem is the assumption that time and space are divisible, when, in actuality, space and time are each indivisible concepts, impossible to break into parts.
As the name implies, Neoplatonism was a new formulation of the philosophy of Plato. However, Neoplatonist philosophers did not consider that they were saying anything new; they felt that they were just developing ideas which were already in Plato, either explicitly or implicitly. In fact, they didn't see any dramatic split between the ideas of Plato, with his world of ideal forms, and Aristotle, with his emphasis on the actual forms we encounter in nature.
A brief history of Neoplatonism necessarily includes three figures: Plotinus, Iamblichus, and Proclus. Plotinus, who lived in the 3rd century A.D., was the founder of Neoplatonism. He was born in Egypt, traveled widely in Syria and Persia, and settled in Rome. He was the first clear expositor of the central theme of Neoplatonism. In brief, this is the idea that there is a single undifferentiated, divine One, and everything else in existence came about as an "emanation" from this One. We can substitute the word God for the One, as long as we don't attribute any anthropomorphic qualities to God. It's harder to find a word that fully captures what the Neoplatonists meant by "emanate." They didn't mean "create" in the normal sense of most creation myths. The emanations are not separate from the One; they partake of the One. They may be diffuse, but they are still a part of the One. Radiate is a close parallel, as when the sun radiates light. This idea of emanations is what allows the Neoplatonists to reconcile Parmenides' One with Plato's Ideals, without falling into an infinite regress.
They argued that there is a graduated hierarchy of levels of the emanations that leads down from the divine to the mundane. (The Gnostics had a very similar concept in their mythologies.) The first level of emanations is Nous or Supreme Intelligence. This is the level of Plato's ideal forms. The second level is that of the World Soul. The last level is the world itself. Though each of these levels is different and progressively further from the One, each is still in part composed of the One. So from an undifferentiated One emerges three levels.
The next great Neoplatonic thinker was the Syrian Iamblichus, who was born the same year Plotinus died. He vastly elaborated Plotinus' concept of the three-way path down from the Nous to the World, in large part drawing on the Pythagorean idea of the divinity of number. Iamblichus divided the level of Supreme Intelligence into two separate triads. The level of World Soul was divided into another triad. Each such division had categories and sub-categories galore. But the idea of a triad held throughout.
The high-point of Neoplatonism was reached in Athens in the late 5th century by Proclus. Proclus was born in Constantinople, studied in both Alexandria and Athens, and was the head of the Neoplatonic School of Athens until his death. He took the idea of a hierarchy of emanations from the One to its limit and argued that Iamblichus' complex hierarchy was holographic at every level, since the emanations of the One necessarily partook of the One. Now, of course, he didn't call it holographic in the same sense that, for example, neurophysiologist Karl Pribram talks of the world as being holographic. But Proclus did argue that it was possible at any level to have direct apprehension of the One.
Where Plato argued that the things of the world were pale copies of ideal forms, and Parmenides and his disciple Zeno argued that everything in nature constituted a single, indivisible whole, the Neoplatonists accepted both. They were able to reconcile these two opposing views by positing a hierarchy between the One and the World, each stage of which was potentially holographic with the One. To put this into a two-stage form (rather than the three-stages of Neoplatonism) that sounds less "mystical," and anticipates later views, we might say that: (1) the organizing principles of experiential realities are to be found in logical realities; and (2) the organizing principles of logical realities are to be found in first principles outside of those realities.
Of course, this was not clear yet when the Neoplatonists formulated their structure. Let us now move forward over a thousand years to a quaternity of philosophers who bring new views to the table.
Seventeenth century British philosopher John Locke was the first since Plato to offer a new theory of ideas. He argued that the mind, at birth, was a tabula rasa (blank slate) upon which sensory experience wrote. We accumulated sensations which were stored as simple ideas. The mind combined those simple ideas into more complex ideas. But all ideas were directly or indirectly from sensory experience of the world.
At roughly the same time, in France, mathematician and philosopher Rene Descartes proclaimed his credo of "Cogito, ergo sum" ("I think, therefore I am"). In the eighteenth century, British philosopher Bishop George Berkeley took Descartes' concept to its logical extreme and turned Locke's model on its head. Instead of the primacy of the world, Berkeley asserted the primacy of the mind: all we ever know are our thoughts. We have no proof that any world exists outside those thoughts. (Though Berkeley allowed it to exist by asserting that the world was a thought in the mind of God.)
Clearly there was a need for a new "third" to resolve this paradox. This was provided in 1781, when Immanuel Kant published his Critique of Pure Reason. Kant agreed with Locke that there is a world outside us. He also agreed with Berkeley that all we can ever really experience are our thoughts. He argued though that the human mind is hardly a blank slate at birth. Instead it contains inherent "categories" through which we filter sensory data. We can think of those "categories" as in some way related to Plato's ideas. Instead of existing in some unknown location outside of the world, however, they exist in the mind. Kant said that judgement consisted of taking sensory data in through these inherent structures, and applying logic to reach a conclusion. In Kant's words "thoughts without content are empty, and intuitions without concepts are blind" (Reese, 1980, 276-80).
Though we might never be able to experience das ding an sich (the thing in itself), our minds are themselves structured much as the world is structured, and contain necessarily true categories with which we perceive the world. Kant's categories formed the missing third that lay between the mind and the world. Basically, Kant was the first to state explicitly the first of the two laws we presented earlier: "The organizing principles of experiential realities are to be found in logical realities." As we move forward in time, we will see experiential proof of this idea.
German philosopher Gustav Fechner was an interesting study in contrasts, and his work reflected those contrasts. Throughout his life he was pulled between the twin poles of mind and matter, in his case expressed by his interest in both metaphysics and science. His early work was first in physiology, then in physics. He experienced a crisis in mid-life, where he became ill and withdrew from the world. His earlier work no longer had meaning for him unless he could find some way of reconciling the science he practiced with the spiritual world in which he believed. After a dozen years as a reclusive invalid, he suddenly recovered.
The primary result of [the crisis] was a deepening of Fechner's religious consciousness and his interest in the problem of the soul.…his philosophical solution of the spiritual problem lay in his affirmation of the identity of mind and matter and in his assurance that the entire universe can be regarded as readily from the point of view of its consciousness (Boring, 1950, 278).
Fechner felt that if the mind and body were two manifestations of a single consciousness (i.e., Parmenides' One), then there should be physical constants that reflected the connection between the two. He wanted to find some way to actually quantify physical relationships between the mental and physical realms. Fechner found just such quantification in the work of Ernst Heinrich Weber. Weber conducted a series of experiments in which he tried to discover the threshold of sensory awareness. He had a subject hold a weight, then a second weight which was slightly heavier. He kept increasing the difference between the weights until the subject could detect a difference. He found that "the smallest perceptible difference between two weights can be stated as a ratio between the weights, a ratio that is independent of the magnitudes of the weights" (Heidbreder, 1933, 81; Boring, 1950, 113).
Not only that, but the ratio was constant for all subjects, and moreover, the results were similar in his experiments with visual and auditory response. That may sound commonplace. After all, it merely says that the heavier the weight, the larger the difference between it and the second weight before any difference can be detected. This means, however, that the sensory world is experienced through relationship, not through absolute difference. But if there was no intermediate third between the world and the mind, then our experience should be absolute, not relative.
To Fechner, Weber's experiments were a revelation, and he called the results Weber's Law. Actually, it might better be called Fechner's Law since Weber never saw the full significance of his discoveries, nor did he develop the general form of the law. As Fechner stated Weber's Law, sensation (which Fechner felt to be a purely mental process) varied with the natural logarithm of the stimulus (which Fechner felt was purely physical). Another way of expressing Weber's Law is that response varies arithmetically as stimulus varies geometrically. Fechner spent the next decade extending that research, to which he gave the name psychophysics. (2)
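The relation between Weber's ratio law and Fechner's logarithmic form can be sketched numerically. In this sketch the Weber fraction of 0.02 and the scaling constant k = 1 are illustrative values, not figures from Weber's or Fechner's experiments.

```python
import math

# Weber: the just-noticeable difference is a fixed ratio of the base stimulus.
WEBER_FRACTION = 0.02  # hypothetical constant, e.g. for lifted weights

def jnd(stimulus):
    """Smallest detectable increment for a given stimulus magnitude."""
    return WEBER_FRACTION * stimulus

# Fechner: sensation varies with the natural logarithm of the stimulus.
def sensation(stimulus, k=1.0, threshold=1.0):
    """Fechner's law: S = k * ln(R / R0), with R0 the threshold stimulus."""
    return k * math.log(stimulus / threshold)

# A stimulus growing geometrically (x10 per step) yields a sensation growing
# arithmetically (equal increments) -- "response varies arithmetically as
# stimulus varies geometrically."
for stimulus in (10.0, 100.0, 1000.0):
    print(stimulus, jnd(stimulus), round(sensation(stimulus), 3))
```

Each tenfold increase in the stimulus adds the same fixed amount of sensation, while the absolute just-noticeable difference keeps growing with the stimulus: experience is relational, not absolute.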
In the course of that research, he developed many of the techniques of experimental psychology. He felt satisfied that he had accomplished his goal: to quantify the relationship between mind and body. As he said, "the idea of a psychophysical threshold is of the utmost importance because it gives a firm foundation to that of the unconscious generally" (Jung, 1969, 166).
Experimental psychologists don't tend to be philosophical, and few have agreed with his self-assessment. Psychologist and philosopher William James expressed their point-of-view, when he said that "…in the humble opinion of the present writer, the proper psychological outcome [of Fechner's research] is just nothing" (James, 1950, 534). James' main objection was that Fechner was making a metaphysical judgement (i.e., sensation is mind) about something that he felt would prove to be merely physiological.
William James established a psychological laboratory at Harvard in 1875, where he studied psychophysics, hypnosis, trance states, and other related areas. But, as in so many areas, James' view was broader than that of most of his colleagues. If early experimental psychologists had to deal with the psyche at all, they wanted it reduced to consciousness, and consciousness reduced to tiny little elements that could be experimentally introspected. Four years later, in 1879, Wilhelm Wundt founded a psychological laboratory in Leipzig to do just that; it was his work and the work of his pupils in his laboratory that led to the widespread development of experimental psychology. He called his method of research scientific introspectionism. While the name introspectionism might bring to mind deep, and perhaps fuzzy, thoughts, that was hardly Wundt's territory. Rather, he studied the simple reactions that behavioral psychology has since mapped so well, but did it through experiments with trained human subjects who made simple discriminations. This seemed a reasonable extension of Weber's Law. Though Wundt's laboratory studied a wide range of areas beyond this, in America we largely know his work only through his student Titchener, who brought the concept of introspectionism to America's fledgling experimental psychology laboratories, but little else. Introspectionism is now a relic of its time. It was this narrowness that led to the counter-development of behavioral psychology, which explicitly denied that any missing third such as mind was needed to understand behavior.
Early in the twentieth-century, in reaction against Wundt's introspectionism, John B. Watson developed a stimulus-response model that became behaviorism. All behavior is considered to be composed of s-r chains, where a stimulus leads to a response, which in turn becomes the stimulus for still another response. Both stimulus and response are purely physiological events, and hence fully measurable. The mind in between can be safely ignored as, at most, an epiphenomenon.
Behaviorism fit both the climate of the time and the American character. Though never influential in Europe, it quickly became the dominant branch of American psychology. At roughly the same time that Watson was developing his s-r model, German psychologists explored Gestalt psychology. Gestalt is a difficult word to translate but roughly means the total context. Gestalt psychologists argued that perception, for example, could never be understood by simple chains of associations or stimulus-response units. In a series of classic experiments, independently conducted by German Gestalt psychologist Wolfgang Kohler and American neuropsychologist Karl Lashley, chickens were taught to distinguish two different shades of gray; we'll call them Gray and Dark-Gray. If the chickens correctly identified Dark-Gray, they received a reward of food. With the chickens trained to pick Dark-Gray, the psychologists brought in a still darker gray, which we'll call Gray-Black. They exposed the chickens to Dark-Gray and Gray-Black.
Now, if behaviorism was correct, the chickens should once more have picked Dark-Gray, because they had supposedly made a straightforward association of Dark-Gray with a reward of food. In fact, a small percentage of chickens did just that. But the overwhelming majority of chickens picked Gray-Black, the darker shade. In other words, most chickens had associated a relationship, "darker", with getting a reward (Kohler, 1947, 118).
It is important to stress that the chickens were not exposed to a series of situations in which they had to pick the darker shade of gray. They were only trained with the one pair of grays, yet most learned from that one situation to pick the darker shade. In these experiments, stimuli did not lead immediately to response; "between the stimulus and the response, there occurs the process of organization" (Kohler, 1947, 119). Thus, even at the sensory level, experience is structured relationally. These experiments were extended to apes (thus much closer to humans), and to the perception of size (where the relationship of smaller to larger proved more primary than the actual size of an object.)
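The contrast between the two accounts can be put in a toy sketch. The shade labels follow the text; the numeric darkness values and the two "learner" functions are hypothetical, for illustration only.

```python
# A toy contrast between the behaviorist and Gestalt accounts of the
# gray-discrimination experiment (values are hypothetical).
shades = {"Gray": 0.5, "Dark-Gray": 0.7, "Gray-Black": 0.9}  # darkness values

def absolute_learner(choices, trained_on="Dark-Gray"):
    """Behaviorist account: respond to the exact trained stimulus if present."""
    return trained_on if trained_on in choices else None

def relational_learner(choices):
    """Gestalt account: respond to the learned relation, 'the darker of the pair'."""
    return max(choices, key=lambda name: shades[name])

test_pair = ["Dark-Gray", "Gray-Black"]
print(absolute_learner(test_pair))    # Dark-Gray: the stimulus that was rewarded
print(relational_learner(test_pair))  # Gray-Black: the darker shade, as most chickens chose
```

Only the relational learner reproduces what the majority of Kohler's and Lashley's chickens actually did, which is the point of the experiments: between stimulus and response there is a process of organization.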
These, and many similar experiments, show that not only humans, but animals also organize sensory perception. The question of how much organization goes on at various levels between the original sensory perception and the eventual complex organization in the psyche is an open question. But clearly "The organizing principles of experiential realities are to be found in logical realities;" logic is, of course, used here in the broadest sense of some organizational model, some algorithm.
Platonic Ideals Become Jungian Archetypes
At much the same time as both behaviorism and Gestalt psychology were emerging, a third voice appeared from a new direction: depth psychology. Psychology as a science, both experimental and clinical, emerged in the late nineteenth and early twentieth centuries. As much as Wundt dominated experimental psychology, Sigmund Freud dominated clinical psychology to an even greater extent. Wundt is now a forgotten name except among those studying the history of psychology, while Freud is a name known to nearly everyone.
But it's less Freud who concerns us here than his younger colleague: Carl Gustav Jung. Early in their association, Freud anointed Jung as his successor, only to then excommunicate him when Jung's views took directions antithetical to Freud's. While working as a young doctor at the Burgholzli Mental Clinic in Switzerland, Jung conducted word association experiments where he not only recorded the patient's response to a stimulus word, but also measured the reaction time of the response. Interestingly, he found that the responses with the longest reaction times tended to cluster around subjects that had emotional significance for the patient. In other words, Jung had discovered once again that there was a "missing third" between action and reaction. For the first time, however, these intervening organizational entities were associated with emotional subject matter which could be classified into groupings that Jung later found to be collective; that is, not simply specific to an individual patient. Jung termed these clusters of emotionally loaded concepts "complexes".
Freud was fascinated by this concept and immediately incorporated it into his own model. "Complexes" (plural), however, were reduced to a single complex: the Oedipus complex.
Freud made the Oedipal complex the cornerstone of his theory; it was supposedly the single most significant psychic element which underlay masculine development. Jung saw something much more exciting in Freud's discovery: the idea that all the ancient myths still lived inside each of us. In the story of Oedipus, Freud found a description for all psychic development, Jung a single example of a multitude of psychic invariants inside each of us. Jung felt that psychology could escape from personal history by turning to the history of the race as it was recorded in mythology. This historical approach provided both a standpoint outside the patient, and a lever to move the patient's psyche. Jung immediately began to pursue this exciting new direction in psychology.
In 1912, Jung published the first fruits of his research as Transformations and Symbols of the Libido (later extensively rewritten and published as Symbols of Transformation (CW 5) in 1952, then corrected in 1967). In a dazzling display of scholarly detective work, Jung turned to the whole field of mythology for amplification of the fantasies of a single woman in the incipient stages of schizophrenia. Where Freud "reduced" fantasy and dream images to a single mythological reference (the Oedipus complex), Jung "amplified" the images in her fantasies by showing parallels throughout the varied mythologies of many cultures and ages. This was too much for Freud and he soon broke off relations with Jung.
Freud theorized that all complexes revolved around sexually significant events from a patient's early life. He reasoned that the process of psychoanalysis should be able to bring the personal associations to mind one at a time. Eventually the chain of associations would lead back to a sexually charged event from childhood: an Oedipal event. Once the patient uncovered the primal event that lay at the root of his complex, there would be nothing left in the complex and the patient would be cured. This is a logically tidy theory that, unfortunately, doesn't match the facts.
When Jung explored his patients' complexes, he found something quite different. The patient didn't automatically get well when all of her personal associations had been brought to light. Nor was there always (or even frequently) a primal event at the core of the complex. Instead, Jung found that after everything personal was made conscious, there still remained a core of incredible emotional power. Instead of defusing the energy, the energy increased. What could form such a core? Why did it have such energy? Jung used the term archetype (from the Greek for "prime imprinter") for the core of a complex.
Jung's understanding of the nature of archetypes developed slowly over nearly fifty years. Initially, he thought they were simply the primordial images which he observed in the dreams and fantasies of patients, as well as in the myths, legends and fairy tales of the human race. But gradually he came to understand that while the archetype was almost invariant (or at least changing as slowly as anything else in evolution), it took outer forms that conformed to different cultures in different times. And still later he realized that the archetype was essentially two-sided, appearing either as a primordial image or as an instinctual behavior. The archetype took explicit form only through experience in the outer world. Because of this two-sided quality, he came to view archetypes as the underlying structure of both the psyche and the outer world. Toward the end of his life, he took a final jump and theorized that, since archetypes were devoid of content, the primary archetypes of order were the simple counting numbers.
Archetypes, like Plato's ideas, were ideal forms; they didn't, however, exist in some idealized world, but instead both in the inner world of the mind and the structure of physical reality. Jung's theory subsumed both of our earlier suppositions, that: (1) the organizing principles of experiential realities are to be found in logical realities, and (2) the organizing principles of logical realities are to be found in first principles outside of those realities.
Like Kant's categories, our perceptions of reality were necessarily structured by the archetypes, but they were much more extensive than Kant had theorized. There are seemingly archetypes for every person, place, object, or situation which has had emotional power for a large number of people over a large period of time.
Jung liked to compare an archetype to a river bed which has been slowly formed over thousands of years. Originally there was just a stream of water which followed the path of least resistance on its way to the sea. The water followed any of a number of different branches, one as likely as the next. However, as time slowly passed, it became less and less likely that the water would take any path other than those already traced many times previously.
However, if circumstances changed drastically enough, a new path could still form. For example, if a rockfall dammed up a portion of the river, the water would be forced to take a new direction. If it eventually branched back into the old path, at least it would have taken a new route for part of the journey. But perhaps part of the river never made it back to the old bed and a new tributary formed. The emergence of a new archetype is similar.
In the latter half of the 20th century, science started to find proof for something quite like Jung's concept of archetypes, though neither Jung nor archetypes were ever mentioned. Following Watson & Crick's dramatic discovery of the structure of DNA in 1953, molecular biology began to make astonishing advances in demonstrating how the person we are is stored in our DNA. But of more interest for our discussion were the discoveries in ethology, which carefully traced how, in the animal kingdom, inherited behaviors stored in the brain are triggered by environmental cues. As with so many areas of thought in psychology, William James had already realized that the brain exists as part of the world, and must necessarily be adapted to it.
Mental facts cannot properly be studied apart from the physical environment of which they take cognizance…our inner faculties are adapted in advance to the features of the world in which we dwell…Mind and world in short have evolved together, and in consequence are something of a mutual fit (Anderson & Rosenfeld, 1988, 1).
The brain preserves its evolutionary history in its structure. It has stored an immense number of behaviors that have proved useful for our ancestors in the human and animal world over vast periods of time. Certain situations repeat over and over for every member of a species. Frogs have to be very good at recognizing flies or they will go hungry. Humans have to be very good at recognizing human faces or they can't function within any human social structure. So there has to be a great deal of specificity in what is stored in the brain (Anderson & Rosenfeld, 1988, 2). On the other hand, every creature is in large part born into a world unique to it, and has to be able to adapt to that world.
For example, though all creatures are born from a mother and, in a majority of species, are raised by the mother, mothers come in a great variety of different packages. Nature has to handle both the specificity and the variety. For example, in his King Solomon's Ring (1952, 47), Nobel prize winning ethologist Konrad Lorenz explains that "greylag goslings unquestioningly accept the first living being whom they meet as their mother." But with mallards, he found that it wasn't enough to be the first person they saw at birth; in addition, he "must quack like a mother mallard in order to make the little ducks run after me" (Lorenz, 1952, 48). That turned out to do the trick. Though he looked about as much like a mother mallard "as Calvin Coolidge looks like Metro Goldwyn Mayer's lion" (as James Thurber so perfectly described the wolf masquerading as Red Riding Hood's grandmother in his Fables for our Time), "anything that emits the right quack note will be considered as mother" (Lorenz, 1952, 48).
The mother's response is as instinctive as the baby's; more recent research has shown this extends to humans as well as other animals. Nor are such inherited behaviors limited to mother/child behavior. Courting rituals are also instinctive in most species. An adult male jackdaw fell in love with Lorenz and tried all the wiles that normally proved successful with female jackdaws. For example, he kept trying to feed Lorenz delicacies like ground-up worms. Like a true scientist, Lorenz suffered this disgusting diet as long as he could. When he was finally so sick of the taste of worms that he refused to open his mouth, the jackdaw filled his ear with worms, then was disappointed when this proved a less than successful romantic strategy (Lorenz, 1952, 153-4).
A baby's recognition of mother, an adult's repertoire of courting behavior, both innate. Recognition of danger is also often inborn. "Magpies, mallards or robins, prepare at once for flight at their very first sight of a cat, a fox or even a squirrel. They behave in just the same way, whether reared by man or by their own parents." In contrast, jackdaws, who we have seen are born knowing so much about love, have to be taught to recognize danger by their parents. Though they may not be born with a sense of self-preservation, they are, however, born with an innate need to protect their young at all costs. Since jackdaws--including baby jackdaws--are black, "any living being that carries a black thing, dangling or fluttering, becomes the object of furious onslaught." Poor Lorenz discovered this to his distress when he accidentally picked up a baby jackdaw and was instantly attacked by the mother. He dropped the baby, and was left with a wounded and bloody hand. Now forewarned, Lorenz systematically explored just how far this instinctual behavior extended. He found, for example, that he could safely carry his black camera, but "the jackdaws would start their rattling cry as soon as I pulled out the black paper strips of the pack film which fluttered to and fro in the breeze" (all quotes Lorenz, 1952, 157-9). This happened even though the adult birds knew Lorenz to be their friend and no threat to their children. Instinct simply took over.
Though we have a considerably greater ability to take conscious control of our actions than a jackdaw, a huge number of our behaviors are already stored away at birth in our brains, ready to be triggered into action when needed. The missing third not only resides in the brain, but it is pre-adapted because it and its environment evolved together. But we still need one further piece to our puzzle: a model for the dynamic relationship of brain, mind, and world.
Cybernetics: McCulloch-Pitts vs. Gregory Bateson
When [Norbert] Wiener brought the feedback idea to the foreground, not only did it become immediately recognized as a fundamental concept, but it also raised major philosophical questions as to the validity of the cause-effect doctrine.…the nature of feedback is that it gives a mechanism, which is independent of particular properties, of components, for constituting a stable unit. And from this mechanism, the appearance of stability gives a rationale to the observed purposive behavior of systems and a possibility of understanding teleology (Varela, 1979, 166-167).
Before non-linear dynamics, there was general systems theory. And before general systems theory, there was cybernetics. Cybernetics first emerged from the fertile mind of Norbert Wiener (1948/1961), with a healthy dose of John von Neumann tossed into the mix. But it was really the Macy Conferences--a series of 10 interdisciplinary conferences beginning in 1946 and continuing through 1953--that took cybernetics from the idea stage to a fully-developed field.
Throughout the annual conferences of the Society for Chaos Theory in Psychology, there has always been a creative tension between two complementary (and sometimes opposing) groups, which I usually characterize in my mind as the "techies" and the "metaphorists." The Macy conferences had their equivalent split between what historian Steve Heims calls the "cyberneticians" and the "social scientists." (3) During WWII, social science had moved out of the college classroom and beyond theory into application, due to the need for psychologists, anthropologists, and sociologists in the war effort. The Macy conferences marked the first time hard science and soft science came together to discuss a single idea, each trying to offer its own unique perspective.
What was so important about cybernetics that it could unite such a varied group, much as chaos theory would later unite a similarly broad spectrum of disciplines?
First, it was concerned with goal-directed actions, where an organism acts with a purpose.…Second, the model replaced the traditional cause-and-effect relation of a stimulus leading to a response by a "circular causality" requiring negative feedback (Heims, 1991, 15).
Previously science had relegated ideas of goal-directedness--teleology--to the scrapheap of pre-scientific notions like vitalism or spontaneous generation. Cause-and-effect, building ineluctably from past causes to future effects, could explain all. Cybernetics supplied a new way of viewing reality that lent itself to scientific description and even mechanical realization, yet could deal with situations too complex for simple causality to explain. At its simplest level, imagine a thermostat regulating air conditioning. The thermostat will continuously compare the actual temperature--let's say it's 75 degrees--with whatever desired temperature you have set the thermostat at--70 degrees in this example. As long as the actual temperature is warmer than the desired temperature, the air conditioning will come on and stay on. When the actual temperature reaches 70 degrees, the thermostat will shut off the air conditioning. It will stay off until the temperature once more climbs above the desired setting. And this will be done in a mechanical manner which doesn't involve itself with philosophical issues like teleology.
Even our little thermostat already involves both the key concepts in cybernetics: positive and negative feedback. Positive: keep doing what you're doing; negative: hey, stop already. Seen in that light, cybernetics hardly seems startling, but it was a major break from Newtonian dynamics.
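The thermostat loop just described can be sketched in a few lines of Python. This is an illustrative reconstruction only: the function names and the one-degree-per-step dynamics are invented for the example, not drawn from any real control system.

```python
def thermostat_step(actual, setpoint):
    """Negative feedback: run the A/C only while actual exceeds setpoint."""
    return actual > setpoint

def simulate(start_temp, setpoint, steps):
    """Crude room model: the A/C cools 1 degree per step; otherwise the room warms 1."""
    temp, history = start_temp, []
    for _ in range(steps):
        cooling = thermostat_step(temp, setpoint)
        temp += -1 if cooling else 1
        history.append((temp, cooling))
    return history

# Starting at 75 degrees with a 70-degree setpoint, the loop drives the
# room down to the setpoint, then hovers around it -- "circular causality"
# in miniature, with no appeal to teleology anywhere in the mechanism.
print(simulate(75, 70, 8))
```

Note that the loop never "aims" at 70 degrees in any philosophical sense; the apparent purposiveness is entirely a product of the feedback comparison, which is exactly the point cybernetics was making.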
Let's take two contrasting examples of the attendees of the Macy conferences, one from the "cyberneticians" and one from the "social scientists." Representing the cyberneticians, psychiatrist and research physiologist Warren McCulloch combined with boy genius (well, almost a boy, as he was only 20 at the time) Walter Pitts to create the McCulloch-Pitts model of neural nets. This was the first "computer" model of the brain; what was truly remarkable is that the concept preceded actual computers. All references in McCulloch's and Pitts' original paper were to mathematical logic. They viewed the neuron as a binary device which could receive either excitatory or inhibitory inputs from the synapses. When the combined excitatory inputs pass a certain threshold, and no inhibitory input is active, the neuron fires; otherwise it stays off. If enough of these simplified neurons are linked together into a closed network, McCulloch and Pitts proved that a large class of logical problems can be solved. Essentially, they presented the brain as a universal Turing machine, or what we now call a computer (McCulloch and Pitts, 1943). Of course, we now know that the neuron, and the linkages between neurons, are much more complex than this, more organic, less mechanical. Still, this was a brilliant starting point.
In his youth, after studying Bertrand Russell's then-seminal work in which mathematics was viewed as a subset of logic, McCulloch was tantalized by the deep philosophical question "what is a man that he can know a number?" Attempting to answer that question led him first into psychology, then neurophysiology, but it was the power of logic that drove him. Pitts was an autodidact who "was known to master the contents of a textbook in a field new to him in a few days." Even gifted scientists would defer to him for a logical analysis of any issue within their own fields. McCulloch was for a good while a father figure for Pitts (as was also, to a lesser extent, Norbert Wiener). Together McCulloch and Pitts made an impressive single representative for the power of logic.
Anthropologist Gregory Bateson was an equally impressive spokesman for the other camp. He grew up in an almost overwhelmingly intellectual British family. His father was a "major, albeit controversial, figure within British biology" who championed the rediscovery of Mendel's work. Growing up with biological issues in his bones, he initially chose zoology as a field, then moved on to anthropology, to which he brought an ecological point of view long before ecology existed as a field. In his book on the group, Heims characterizes Bateson as a "scout" not only for the Macy group, but throughout his career. "From anthropology and learning theory he moved to psychiatry, behavior of otters and octopus, theory of humor, kinesics, language and learning among dolphins, and theory of evolution" (Heims, 1991, 251).
At the Macy conferences Bateson often presented anthropological examples of cybernetic feedback mechanisms in various cultures, "including the case of the Iatmul culture in which a transvestite ceremony served as a homeostatic mechanism whenever a characteristic pattern of aggressive actions within the tribe threatened to divide them" (Heims, 1991; Bateson, 1958).
That must seem like pretty small stuff in comparison with the McCulloch-Pitts model which led to neural nets. But Bateson's ability to cut across academic disciplines allowed him to scout out new areas of thought, to sniff out what was important, to ask the central questions before anyone else even knew there was anything to be asked. Of course, like all scouts, once a new field--like cybernetics--achieved respectability, his questions were unfortunately often dismissed as irrelevant by those more intent on the matters at hand. We are only now coming to reexamine questions Bateson raised half a century ago. For example, in his own work, he distinguished learning from "learning to learn" ("proto-learning" and "deutero-learning" respectively). Applying that to computers, Bateson asked the group "whether computers can learn to learn, and how in a formal mathematical way one could distinguish that from plain learning" (Heims, 1991, 24, also see Wiener, 1950/54).
Through the decade of the conferences, and beyond them until his death in 1980, Bateson continued to extend the scope of these cybernetic questions. In 1968, fifteen years after the last Macy conference, he organized a conference on the "Effects of Conscious Purpose on Human Adaptation." As an introduction, he offered 11 "considerations." The first seven discuss cybernetic systems in general. The last four move into the relationship between mind and world.
8. The content of the screen of consciousness is systematically selected from the enormously great plethora of mental events. But of the rules and preferences of this selection, very little is known.…
9. It appears, however, that the system of selection of information for the screen of consciousness is importantly related to "purpose," "attention," and similar phenomena which are also in need of definition, elucidation, etc.
10. If consciousness has feedback upon the remainder of mind and if consciousness deals only with a skewed sample of the events of the total mind, then there must exist a systemic (i.e., non-random) difference between the conscious views of self and the world and the true nature of self and the world. Such a difference must distort the processes of adaptation.
11. It is suggested that the specific nature of this distortion is such that the cybernetic nature of self and the world tends to be imperceptible to consciousness (Bateson, M., 1972, 16).
As we see, the question of the missing third has now moved forward into the nature of consciousness itself and its emergence from the "cybernetic nature of self and the world." It is at this point that chaos theory, autopoiesis, and non-linear dynamics enter the picture, each with its own attempt to advance the issue still further.
We began this essay with Socrates' assertion that the qualities we attach to the things of the world are merely copies of copies of idealized forms that exist outside the world. Parmenides argued that any such idealized quality could then be included among the things to which it applied, so there would necessarily be a third quality, one which described both the worldly objects and the Platonic ideal. This would lead into an infinite regress. The problem arises, he argued, because the world is an undifferentiated whole, a One, which includes both ideal and actual.
The Neoplatonists of the 3rd to 5th century, resolved this problem by positing a graduated hierarchy between highest and lowest rather than a continuum, with each stage composed of emanations from the One. In its most developed form, expressed by Proclus, each stage was holographic, since it contained the possibility of immediate apprehension of the One.
With the emergence of science in the seventeenth century, Locke proposed a new model in which the mind was a tabula rasa, a blank slate, at birth. All ideas contained in the mind were either records of sensations, or of combinations of sensations. At much the same time, Descartes asserted cogito, ergo sum (I think, therefore I am), taking our thoughts as the proper starting point for any examination of the world.
In the eighteenth century, Berkeley took Descartes' starting point to its logical conclusion and denied the reality of the outside world, since all we know are our thoughts. Late in the century, Kant took the biggest step forward since Plato and resolved the paradox by asserting that the mind contains inherent categories which organize our sensory perceptions of physical reality. Hence there was a missing third already existent in the mind.
In the nineteenth century, Fechner fleshed out Weber's experimental results on the perception of weight differences to demonstrate that there was an intermediate organization lying between physical stimuli and our sensory awareness of those stimuli. Under Wundt, Fechner's techniques, if not his conclusions, led to the development of experimental psychology, though in a narrow form Wundt called introspectionism.
In the early twentieth century, in reaction against Wundt's work, Watson developed behaviorism, which denied the need for mind, and simply viewed behavior as composed of a series of stimulus-response chains. Hence no missing third. But Kohler, Lashley and other Gestalt psychologists demonstrated clearly that Fechner was correct that sensory perception was relational, not direct. Jung then took the final step with his model of archetypes, which provided a model of the structure of the missing third that lies between the mind and the world.
Later in the twentieth century, ethology provided details for just how archetypes are stored in the brain and triggered by environmental cues at necessary points of development. And cybernetics demonstrated that mind, brain, and world are tied together in a loop in which each informs the other. This leads to the emergence of consciousness, a consciousness unaware of this cybernetic relationship. And that is the point at which non-linear dynamics begins its own contribution.
Anderson, J. and Rosenfeld, E. (eds.) (1988). Neurocomputing: Foundations of Research. (Cambridge: MIT Press).
Bateson, G. (1958). Naven, 6th edition, (Stanford: Stanford University Press).
Bateson, M. (1972). Our Own Metaphor (New York: Alfred A. Knopf).
Boring, E. (1950). A History of Experimental Psychology, 2nd ed. (New York: Appleton-Century-Crofts).
Cornford, F., Trans. (1957). Plato and Parmenides. (New York: The Liberal Arts Press). Excerpt from the World Wide Web: http://faculty.washington.edu/smcohen/parm132.html.
Gillispie, C. (1980). Dictionary of Scientific Biography. (New York: MacMillan). Excerpt from the World Wide Web: http://www-gap.dcs.st-and.ac.uk/~history/Mathematicians/Zeno_of_Elea.html.
Heath, T. (1981). A History of Greek Mathematics: From Thales to Euclid. (New York: Dover).
Heidbreder, E. (1933). Seven Psychologies. (New York: Appleton-Century-Crofts).
Heims, S. (1991). The Cybernetics Group. (Cambridge, Mass.: MIT Press).
James, W. (1950). The Principles of Psychology, vol.1. 2 volume reprint of 1st ed. of 1890. (New York: Dover Publications).
Jung, C. (1960). Collected Works, Vol. 8: The Structure and Dynamics of the Psyche. (Princeton: Bollingen Series, Princeton University Press).
Jung, C. (1967). Collected Works, Vol. 3: Symbols of Transformation, 2nd ed. (Princeton: Bollingen Series, Princeton University Press).
Jung, C. (1969). Collected Works, Vol. 8: The Structure and Dynamics of the Psyche, 2nd ed. (Princeton: Bollingen Series, Princeton University Press).
Kohler, W. (1947). Gestalt Psychology. (New York: Mentor Books, New American Library).
Lorenz, K. (1952). King Solomon's Ring. (New York: Time Incorporated).
McCulloch, W. and Pitts, W. (1943). "A logical calculus of the ideas immanent in nervous activity", Bulletin of Mathematical Biophysics 5: 115-133.
Reese, W. (1980). Dictionary of Philosophy and Religion. (New Jersey: Humanities Press).
Stevens, S. (1951). Handbook of Experimental Psychology. (New York: Wiley).
Stevens, S. (1958). "Problems and Methods of Psychophysics", Psychological Bulletin, 55.
Varela, F. (1979). Principles of Biological Autonomy. (New York: North Holland).
Wiener, N. (1950/54). The Human Use of Human Beings: Cybernetics and Society, revised edition. (New York: Da Capo Press).
Wiener, N. (1948/1961). Cybernetics, 2nd edition. (Cambridge, Mass.: MIT Press).
Whitehead, A. (1929). Process and Reality, Corrected Edition. David Ray Griffin and Donald W. Sherburne, Eds. (New York: The Free Press, 1978).
1. I'm indebted to Robert Porter, Ph.D. for this concise formulation.
2. A hundred years later, S. S. Stevens vastly extended Weber's and Fechner's research on psychophysics. As with most pioneering results, Weber's initial guess was overly crude. Stevens showed that the ratio between sensation and stimulus was a power function, not merely a logarithmic function; i.e., the sensation equaled some constant that varied with the particular kind of sensation being measured, times the measurement of the stimulus to some power (often fractional). Though this modified Weber's Law, it further confirmed that the relationship between sensation and stimulus is relational, not direct (Stevens, 1951, 1958).
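The contrast between the two laws can be made concrete in a few lines of Python. The function names are illustrative, and the sample exponent is only a stand-in for the modality-specific values Stevens measured:

```python
import math

def fechner(intensity, k=1.0):
    """Weber-Fechner: sensation grows as the log of the stimulus."""
    return k * math.log(intensity)

def stevens(intensity, k=1.0, n=1/3):
    """Stevens: sensation grows as a power of the stimulus, S = k * I**n.
    The exponent n varies with the modality; 1/3 here is just a sample value."""
    return k * intensity ** n

# Under either law, doubling the stimulus never simply doubles the sensation:
# what the mind reports depends on ratios and exponents, not on the raw
# physical difference -- the relationship is relational, not direct.
for intensity in (1, 2, 4, 8):
    print(intensity, fechner(intensity), stevens(intensity))
```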
3. In this section on the Macy conferences, I'm largely drawing on (Heims, 1991).