1.
Before Darwin, many scholars wrote about the origins of man and the beginnings of mental life. Such writings, however, were frankly speculative: there were few agreed-upon facts, nor was there a comprehensive theoretical frame within which to situate facts and suppositions. Darwin’s epoch-making writings changed forever the status of human beings’ reflections about themselves and their minds. Darwin put forth the most plausible general account of the evolutionary origins of contemporary forms of life—an account long accepted as correct in its basic outlines. In addition, he stimulated students of biology and human behavior to collect and interpret data relevant to the actual, as opposed to the supposed, “prehistory” of mind and man.
What had been a trickle of writings about prehistoric life became a torrent of publications in the half century following Darwin. Dozens of scholars conjectured about the antecedents of contemporary man or described more primitive forms of mental life in nonhuman animals. In nearly all cases, what emerged as the peak of intellectual evolution bore a marked resemblance to the authors of these books—European and North American scholars who, in a contemporary phrase, were “stale, male, and pale.” So much energy was devoted to the often speculative search for mankind’s intellectual “roots” that in 1866 the Linguistic Society of Paris actually banned papers on the origins of language, which of course did not stop them from being written.
While the temptation to publish grand theories about the prehistory of the mind was never completely quelled, such work had fallen distinctly out of favor by the middle years of this century. Most scholars agreed that there were already too many accounts that could not be properly evaluated, among them Freud’s view that prehistoric social life originated in the consumption by sons of their murdered father’s corpse. It made more sense for scholars to set themselves to less ambitious tasks, and to try to get at least part of the “prehistory riddle” straight. They could investigate the mental life of infants, or the development of early tools in East Africa, or brain changes that might have facilitated the development of speech, or the stunning images of animals found in the caves of southern Europe. Finally, the pseudoscientific evolutionary and eugenic program advanced by the Nazis gave a deservedly bad name to the practice of ranking individuals (or minds) according to some metrical index of sophistication, complexity, or “full” humanity.
Still, it is not possible—and probably not desirable—for scholars to desist entirely from attempting to sketch a broad picture of human prehistory. Textbook authors often attempt to do so, and important proposals about early development have also come from philosophers such as Ernst Cassirer, who examined the fundamental practices of art, myth, and religion as they might have first emerged; brain scientists like Harry Jerison, who charted the growth patterns of the brain in different species; and archaeologists like Alexander Marshack, who discerned early notational systems in the apparently haphazard scratches on pieces of bone.
Other theories have been more controversial, while also having more popular appeal. In the 1970s, for example, the psychologist Julian Jaynes created a stir with his claim that human self-consciousness as we know it did not develop until well into the classical era. In Jaynes’s account, the Greeks described in the Iliad heard voices that told them what to do, while the Greeks portrayed in the Odyssey were capable of making their own plans. Another controversial theory was proposed in the 1980s, when the historian Riane Eisler concluded from a study of artifacts that early human societies were primarily matriarchal, with power eventually being seized by the initially subsidiary males.
In a complex and inevitably controversial field like “prehistory,” works that successfully combine the scholarly and the popular are rare. In recent years, the most impressive effort of this sort has been undertaken by the Canadian neuropsychologist Merlin Donald. In his carefully reasoned Origins of the Modern Mind, a book that draws explicitly on several scholarly disciplines, Donald claims that the human line of descent in the past two million years has passed through three major transitions. In the first, which separated hominids from other apes, early humans became able to use their bodies to imitate older and more sophisticated members of the same group. In the second transition, humans developed distinctive neuronal and anatomical systems that allowed them to use spoken language and to make up and tell stories. In the third transition, which created modern man, humans invented symbolic and notational systems that could eventually be used to preserve memories and transmit complex forms of culture, including art and science. Donald’s account, necessarily a broad one, is considered by many scholars as a plausible outline of human origins.
2.
The British archaeologist Steven Mithen states explicitly, “I want to follow in Donald’s footsteps, although I believe he made some fundamental errors in his work—otherwise there would be no need for this book.” One error that Donald made, in Mithen’s view, was to give excessive attention to psychological data. Mithen understandably looks to his own discipline of archaeology when he declares, “If you wish to know about the mind, do not ask only psychologists and philosophers: make sure you also ask an archaeologist.” And he sees this discipline—with himself as interpreter—as equal to the task. “The human mind is,” he confidently writes,
a product of evolution, not supernatural creation. I have laid bare the evidence. I have specified the “whats,” the “whens” and the “whys” for the evolution of the mind. I have explained how the potential arose in the mind to undertake science, create art and believe in religious ideologies.
We are all familiar with some of the powerful metaphors, analogies, and images that have been used in the sciences and social sciences: Darwin’s tangled bank of species, Freud’s portrayal of the ego trying to control the id as a rider perched on an unruly horse. Of course, metaphors can also mislead: the kinds of experience that they purport to connect may prove incommensurate in various ways. Clearly Mithen is smitten by the metaphoric bug, but he is not always well served by this tendency toward epistemological romance.
To describe evolutionary processes, Mithen relies heavily on a set of evocative images. The principal organizing metaphor of the book construes the history of the mind as a four-act play. If the metaphor of a series of acts seems banal, the play is a peculiar one, to say the least—a set of Beckett-like lengthy silences, punctuated by a few bursts of Shakespearean energy.
Act One—covering the period 6 million to 4.5 million years ago—occurs in Africa and involves only an ancestral ape. There may be a set of primitive tools lying about, but otherwise the scenery is dark and nothing in the archaeological evidence suggests human intelligence is emerging. Act Two—spanning 4.5 million to 1.8 million years ago—consists of two scenes. The first scene involves Australopithecus—at first living in wooded environments, later walking around more freely and climbing trees. The famous Lucy makes an appearance here. The second scene, beginning about 2 million years ago, ushers in the Homo lineage. Homo habilis possesses rough stone tools, which he uses to extract meat from the corpses of dead animals.
The third act covers the period from 1.8 million years ago (the start of the Pleistocene) to 100,000 years ago. Homo erectus appears at several points around the world. His brain is larger, and hand axes make their appearance, first chipped from stone some 1.4 million years ago. The second scene, called the Middle Paleolithic, commences 200,000 years ago. Pear-shaped stone hand axes give way to tools that produce flakes, stone points, and other more nuanced instruments. By about 150,000 years ago, Homo neanderthalensis appears, with his finely crafted tools of stone and, possibly, wood. Mithen expresses his surprise that life at this late point in the prehistorical drama remains tedious, with the same set of tools being used for narrow purposes and with no hint in the archaeological evidence of art, science, or religion.
Act Four, commencing 100,000 years ago, Mithen describes as “a much shorter act, into which are squeezed three scenes packed with more dramatic action than in all the rest of the play.” The star is Homo sapiens sapiens. For the first time, we can see vestiges of practices that remind us of our own concerns: the dead are regularly buried, some with objects that are placed within burial sites; harpoons are made of bone; boats are built; dwellings are erected; walls are painted; animals are carved; clothes are sewn with bone needles; people decorate their bodies with beads and pendants. Finally, in the third scene of the final act, people in the Near East plant crops, domesticate animals, build towns and cities, create and use systems of notation. After six million years of relative inaction, material events and mental activity proliferate at such an explosive rate that it becomes difficult to order and make sense of them.
When he comes to addressing the nature and workings of the human mind, Mithen begins by discussing—and then rejecting—two metaphors that have wide appeal but seem inadequate. It does not suffice, he says, to consider the mind as a sponge which just soaks up material—such an image cannot account for how people solve problems or compare and contrast items of information. Nor is it convincing to see the mind as a general-purpose computer running a learning program. While it is certainly helpful to possess a computer, or a set of computers, the mind also thinks, creates, imagines—all activities that fall outside our current conceptions of what a computer can do. Mithen declares, “Maybe when we think of the mind as either a sponge or a computer program we are joining the psychological equivalent of the flat earth society.”
But Mithen still tries to find his own apt metaphors. Borrowing a figure from psychology, he invites us to think of the mind as a Swiss army knife. On this view, rather than being a single all-purpose mechanism, the mind is better thought of as a series of tools, each of which has evolved to carry out a specific kind of operation. In introducing this image, Mithen draws on a large number of writings, such as the philosopher Jerry Fodor’s description of different “modules of the mind” (itself based on Noam Chomsky’s theory that the mind comprises not one but many separate “thinking” devices, each with a separate purpose); the “faculties” that the evolutionary psychologists Leda Cosmides and John Tooby believe have evolved through natural selection to cope with adaptive problems faced by prehistoric hunters, gatherers, and scavengers; and my own theory of eight discrete human “intelligences,” ranging from logical to personal intelligences to the intelligence of the naturalist.
Finally, and most strikingly, Mithen construes the mind as a cathedral, or a series of cathedrals, by which he does not of course mean literal churches but buildings whose structure, including their component chapels, emerged in ways that are analogous to the evolution of the mind. Mithen argues that historically early cathedrals were thick-walled buildings that had one nave and several smaller semi-independent chapels built off from it, like outbuildings on a saltbox farmhouse. By comparison, later cathedrals are taller and more graceful, with a complex, built-in system of chapels that connect both to the central nave and to each other.
Yet if the nave-chapel metaphor is truly to stand for the idea of a general intelligence buttressed, as evolution progresses, by a growing number of specialized intelligences (“chapels”), Mithen would presumably want to show that real cathedrals grew up in the same progressive way as well—but some of them didn’t. Cathedrals such as Chartres were conceived and created by purposeful human beings as unified projects. Mithen confuses the issue, and the reader, by setting up a parallel between the history of the mind and the history of cathedral architecture when no clear correspondence between them exists. (It is in any case an odd metaphor since cathedrals are sacred enclosures which have a specific function of serving as the principal church of a diocese; and the practices in the different parts of a cathedral, including its chapels, are drawn from a common faith and common set of rituals.) When Mithen suggests that the “acts” of human cognitive evolution can be thought of as a cathedral he seems to be stretching a metaphor and forcing us to strip it of its habitual associations.
Nor is it easy to accept Mithen’s additional idea that each newborn infant harbors a mental cathedral that gets constructed and finished as the child matures. Succumbing to the temptation to conflate the personal and historical that has attracted so many evolutionary theorists since the eighteenth century, Mithen proposes that the evolution of the mental “cathedral” during prehistory bears a marked resemblance to the evolution of the mental cathedral during the childhood of a contemporary person. The mind of an infant or of a Neanderthal is, in Mithen’s view, Romanesque; the mind of an adult Homo sapiens sapiens is, in contrast, a veritable Chartres. In his view, ontogeny recapitulates phylogeny in three distinct phases, which overlap—somewhat confusingly—with the prehistorical drama in four acts.
What are those phases? Phase One consists of a mind—the mind of infants and early hominids—with a “nave” of general intelligence that can, in rough fashion, process information, make connections, and solve problems. Phase Two, characterizing young children and hominids of the middle period, consists of a cathedral with some general intelligence, but with multiple new chapels, each harboring specialized intelligences. These chapels probably number at least four: one for technology (the development of tools); one for natural history (sensitivity to plants, animals, and the environment); one for language; and one for social understanding—a grasp, for example, of what it is to participate in a group. During this historical phase the various chapels function in relative independence of one another. An individual may have had strong technological and strong social understandings, for example, but had no way of getting those two faculties to work together. (An early hominid may have found ways to make use of a stick of wood to probe for termites without being able to share this skill with fellow hominids.)
Mithen describes in detail the emergence and ultimate interaction of the separate faculties during the different phases of human evolution. According to his account, social intelligence—a sense of one’s relations with the members of one’s immediate group—emerges first and is, indeed, quite developed even during the opening scenes of the history of the hominid line. During the second act, what Mithen calls “natural history intelligence”—an ability to assess and take advantage of the natural environment—comes into play. So does a tool-using intelligence; but in the early phases of hominid evolution this remains unconnected to natural history intelligence—the hominid is alert to the natural environment but can’t effectively use his simple tools to make consequential changes in it.
During the third phase, Mithen finds the beginnings of the capacity to use language, but this is almost exclusively tied to social interchanges—a kind of “grooming in words.” (The assumption here is that, much like apes, preverbal hominids spent up to a third of their time “grooming” each other as a means of creating and solidifying social bonds. Since language was a much more effective way of sustaining these bonds and took only a fraction of the time absorbed by grooming, it presented a great adaptive advantage.) During the early fourth phase, some important links were made: between the “social” and “natural history” capacities about 100,000 years ago, and between the technical and naturalistic capacities about 30,000 years ago. It was then that tools started to be used to transform nature, in cutting trees and making use of animal skins, for example. The dramatic climax features an explosion of the linguistic faculty—“once Early Humans started talking, they just couldn’t stop,” Mithen declares—and the ultimate yoking of all four faculties with one another.
It is the capacity to connect the various intelligences, faculties, modules, or chapels that characterizes the fully evolved individual—be it Homo sapiens sapiens 30,000 years ago or the mature adult of today. There may, Mithen thinks, be some kind of general intelligence at work. There may even be a superchapel that handles “meta-representation” (to which I shall return); but what allows human beings to become truly human is their capacity to bring together the operation of different faculties in order to solve problems, create products, and make original contributions in art, science, religion, and even cognitive archaeology.
3.
To support his metaphors and arguments, Mithen relies on two rather different lines of evidence, which, he hopes, will complement and bolster each other. First of all, there is the evidence that he has assembled as a practicing cognitive archaeologist. In impressive, well-illustrated tables, he collates the facts as they are now agreed upon by archaeologists—for example, the kinds and distribution of tools used during the Pleistocene, which began some 1.8 million years ago. He takes up as well issues that remain mired in controversy—for example, whether Homo sapiens sapiens originated independently more than once. (He believes it did not.)
By imagining the kinds of thinking that might occur, given the constraints imposed by the presence or absence of various mental modules, Mithen attempts to “feel himself” into the life situations faced by different lineages of early hominids. Drawing on a variety of evidence, he explains in great detail the different survival pressures that, one can assume, beset individuals who lived in small or large groups at different times (scattered through the four acts) and in different ecological niches—forested, open, warm, frigid.
The second line of evidence on which Mithen relies comes from psychology. Among psychologists, there are fierce disputes about the nature of different modules, intelligences, and faculties—how many there are, how they function, their sources, the extent to which they can work together casually or deliberately. While acknowledging these debates, Mithen attempts to deal ecumenically with them, trying to extract what the different theories have in common and to place the competing claims into a broader frame. The result—his “architecture of the mind,” for which, again, his grand metaphor is the cathedral—is a chronology of the mind’s evolution that he divides into phases based on often contradictory bodies of research. In the final phase, separate mental functions—or “services”—become harmonized and are able to work in concert.
Mithen claims here to be drawing on a consensus among a variety of theorists. Indeed, in one breathless paragraph, he refers to Jerry Fodor’s distinctly non-modular central processes, Howard Gardner’s seamlessly operated multiple intelligences, Paul Rozin’s capacities that become extended into other domains, Annette Karmiloff-Smith’s capacities for re-representing knowledge, the “mapping across knowledge systems” of Susan Carey and Elizabeth Spelke, and Margaret Boden’s claim that creativity arises from the “transformation of conceptual spaces.”
Since Mithen refers in passages such as this one to several highly technical bodies of knowledge, few readers will be able to evaluate each aspect of his argument. To turn first to cognitive archaeology, my own reading of the literature suggests that a number of his analyses are quite controversial. Mithen’s firmest evidence necessarily lies with what has survived: the nature and distribution of tools and carcasses of various sorts. Even here, however, there is plenty of room for dispute—for example, about Mithen’s conclusions from his investigations of tool use during the Middle Pleistocene in southeast England. He argues that sophisticated systems for making hand axes, with provisions for training the young to manufacture and use them, developed in colder places with relatively open patterns of vegetation, where large groups could gather. By contrast, the technology of the more primitive flake (little bits of rough, sharp stone) was associated with warmer, heavily forested environments where only small groups gathered and there was little transmission of knowledge.
Perhaps this is so. But the ecology of England at the time is not well known; it is not clear which hominid line was involved; and new evidence about chimpanzee capacities suggests that early hominids may have had far greater symbolic and mimetic abilities than Mithen attributes to them. Nor can the size of the hominid groups that gathered in one place or another reliably be inferred from, say, the size of piles of flakes. Even worse, as every cognitive archaeologist knows, tomorrow’s discovery can topple any intricate explanatory scheme, suggesting how fragmentary our knowledge of prehistory remains. Since reading Mithen’s book in late 1996, I have learned of new investigations suggesting that hominids (with stone tools) existed 400,000 years earlier than previously thought; that Homo erectus still lived as recently as 27,000 years ago; that finely crafted spears were created 400,000 years ago; that dogs may have been domesticated 135,000 years ago; and that Neanderthals may have composed music for the flute. Each of these finds differs from Mithen’s account.
When he discusses the behavior of prehistoric man, Mithen cannot venture far beyond speculation. How can we know, for example, about the incidence and nature of linguistic utterances that occurred 200,000 years ago? How can we be confident about the extent to which mothers—or fathers or siblings or playmates—attempted to show children different ways to make tools as well as to hunt and to prepare food? The temptation to choose the interpretation that fits one’s own theory is difficult to resist. All too often—and particularly when he is trying to decide whether modules are being used in ways that reinforce one another—Mithen succumbs to this temptation. His recurring dilemma is revealed when, speaking of an early artistic image, he declares, “We cannot prove, but equally cannot doubt, that it represents a being in the mythology of the Upper Paleolithic groups of southern Germany.”
When it comes to the literature on child development and adult cognition, some of the recent evidence is available for inspection—and, as a cognitive-developmental psychologist, I find most convincing Mithen’s claim that human intelligence lies in the capacity to make connections: through using metaphors as Mithen tries to do, for instance, or through the unexpected juxtaposition of images that make us laugh. To make connections is to link the various quasi-independent intellectual modules (as one does in learning to attach meanings to one’s own or others’ squiggles on a slab of stone or a piece of paper). Whatever the deficiencies of the cathedral metaphor itself, Mithen contributes to scholarship in the ways he elaborates on it. Here his metaphorical approach invites neurologists and psychologists to explain just how it is that different parts of the mind/brain learn to “speak to one another.”
On the other hand, I am not in the least persuaded by Mithen’s argument that the earliest phases of cognition—whether in prehistory or in the mind of an infant—entail a kind of general intelligence. As we have seen, Mithen believes that the earliest hominids practiced only simple kinds of tool use. They could not, for example, combine the specific kinds of intelligence needed for both toolmaking and hunting. When they engaged in hunting and toolmaking at this early stage, they did so, he writes, by means of “general intelligence.” In invoking general intelligence, he ignores whatever specific sensory, perceptual, conceptual, or emotional skills the hominids may have employed—singularly or in concert—in social relations or in the way they used tools.
A similar confusion arises when Mithen bases his argument about the infant’s mind on his reading of two well-known authorities in developmental psychology: Patricia Greenfield, who has argued that both the use of tools and early language exploit the same regions of the cortex; and Annette Karmiloff-Smith, a modular theorist who has nonetheless endorsed parts of Piaget’s analysis of general intellectual development. But even if Greenfield’s argument has merit, it relates at most to two specific capacities: the sequence of actions in the use of tools and the ordering of linguistic elements in verbal communication. The apparent parallels in these two “actions” could be deep analogies or they could be more superficial coincidences. In any event such parallels do not take account of a large number of other cognitive faculties which Mithen himself describes, such as “social” intelligence and “natural history” intelligence.
Moreover, Mithen’s version of Karmiloff-Smith’s interpretation of Piaget is hardly convincing. In endorsing Piaget’s approach to developmental psychology, Karmiloff-Smith suggests that there may be some “across-the-board” changes in cognition—for example, when young people become explicitly aware of knowledge that was previously intuitive or when older individuals become capable of “meta-representation”—classifying or recording their own representations through the use of a “higher-order” language. However, Karmiloff-Smith takes care to argue that even such “across-the-board” cognitive changes will occur at different times for different mental modules, such as those activated in playing a musical instrument or mastering the use of irregular verbs.
Drawing on research on early infancy, Karmiloff-Smith gives a picture of cognition that has gained considerable acceptance. That is, during the very first months of life, human infants display a dizzying array of quite specific, often unrelated modular capacities. These include the abilities to recognize the sounds of adult language; to appreciate musical tonality, including the differences between consonant and dissonant intervals; to recognize human facial configurations; to engage in highly specific communicative exchanges with loving caretakers; to appreciate simple numerical relations and operations; and to imitate the actions of others, even when they cannot observe their own bodies. Moreover, they understand the basic properties of different objects a full year before Piaget believed that they could do so. If the archaeological record changes quickly, reports on infants’ hitherto unsuspected skills accumulate on a monthly basis.
Far from providing evidence for the existence and predominance of a general intelligence, recent research on early infancy provides the strongest clues to the inherent modularity of human cognition. The problem, as Mithen himself recognizes elsewhere, is not getting these modules to work—they are constructed so that they automatically become active under the appropriate circumstances. What we don’t know is how the various modules somehow become able to work together.
Why should Mithen, among others, be so confused about general intelligence? Why does he invoke a generalized capacity to explain both the development of the infant mind and the history of hominid development? In my view the blame partly lies with the dominance of a certain epistemology and partly with our ordinary language. Dating back to the British empiricists, a strong strain in our intellectual tradition posits certain basic operations of the mind, including the capacities to perceive, compare, associate, and infer; and such capacities are too often identified with general intelligence. The currently popular metaphor of the computer continues that tradition. Neurons and minds are said to work by means of basic operations—and, by referring to these operations (and the various degrees of skill they entail), we can compare minds across millennia or among different individuals or species. This perspective also colors our language. Because we talk readily about intelligence (or general intelligence), we assume that such an entity must exist and be measurable.
What is wrong with the seemingly plausible notion of general intelligence is that it has neither a reasonable definition nor evidence to support its existence. Those who use the words “intelligence” or “general intelligence” sidestep the question of what we mean by a general intelligence as opposed to a collection of separate ones. Mithen is at his least consistent here. At various times, he uses general intelligence to refer to the activities of species fifty million years ago, such as the ability to find food or to make cost/benefit analyses; he also identifies general intelligence as the ability of modern humans to create complex tools or to engage in art, religion, or science. He speaks vaguely of “general-purpose learning and decision-making rules.” “General” has become so general that it denotes emptiness and simplicity, as well as complexity, fluidity, and abstractness.
Empirical evidence shows that the mind—human or prehuman—is distinguished precisely by the fact that it does not treat all experiences or all problems as equal and does not harbor all-purpose rules or operations. Whether one deals with bees, ants, birds, rats, or human beings, the story is always the same: certain kinds of information are readily apprehended, easily processed, difficult to forget—while others are only mastered with difficulty or are ignored altogether. Try to get an infant to recognize faces upside down, or a toddler to speak a language which does not make phonemic distinctions or which requires that the child attend to every other word. You will soon discover the powerful, specific constraints on cognition in Homo sapiens sapiens. That we know less about Australopithecus does not warrant our assumption that this ancestor used general intelligence. The problem is to figure out the specific kinds of intelligence of which Australopithecus was capable—just as we need to explore the nature of song in sparrows, or maze-running in rats, or dance communication in bees, or foraging in ants—and try to understand the highly particular nature of these species’ so-called general intelligence.
Fortunately for Mithen, the emptiness of the concept of general intelligence does not in any decisive sense undermine his argument. Indeed, were he to jettison it he could simplify his account of prehistory—reduce it, in fact, to two successive phases of human evolution. In the first there were a number of specific but unconnected chambers in the cathedral. In the second there was a larger and better integrated set of chambers as well as new meta-chambers (for such “modern” functions as consciousness).
On balance, I am sympathetic to what Mithen has set out to do in The Prehistory of the Mind, and what he has achieved. This seems a good time to attempt to integrate the separate intellectual traditions represented by evolutionary psychology, developmental psychology, brain study, and cognitive archaeology. No doubt there are many ways to approach such a project, but Mithen’s is one plausible effort, and others will benefit from his attempt—just as Mithen himself was stimulated by Merlin Donald’s conclusions about origins.
Mithen’s achievement—and it is not a small one—is to bring together the many specific discoveries of cognitive archaeology into a systematic account. Even those who reject his various metaphorical blueprints of mental evolution will benefit from his up-to-date and well-organized accounts of scholarly research. Moreover, and more importantly, the particular question addressed by Mithen—how specific modules come to work together to form the creative aspects of human intellect—seems to me precisely the right one for scholars to be examining at the present time. Notwithstanding his somewhat grandiose claims, Mithen’s book does not explain the creativity of artists or scientists, or, indeed, of Mithen himself. That is a task that remains for humanists and scientists. But by drawing on the indispensable contributions of archaeology to cognitive history, his book begins to explain how we evolved as a species whose members can think about such things.