Critique of Piagetian Approach

Although the Piagetian approach provided useful insights into cognitive evolution (see previous post), it ultimately proved to be limited in its applicability.

Why Piaget’s genetic epistemology was a useful tool:

  • Piaget intended that the theory apply to all developmental sequences, and thus it was a general theory of development based on a study of children. Piaget himself even argued that it applied to the development of scientific theories themselves (recursion, anyone?). If the theory were truly general, it would apply in all circumstances, including the evolution of hominin cognition. When I applied his theory, I argued that I was applying a general theory of cognitive development to stone tools, not a model of children’s thinking. As you might imagine, no one paid attention to this assertion, and I was accused of comparing Homo erectus to modern eight-year-olds, something I never actually did.
  • Genetic epistemology identified and described styles of thinking that were less powerful than those used by modern adults. Several of these styles of thinking appeared to apply to non-human primates, and the Piagetian approach gained considerable traction among primatologists, beginning with Susan Parker and Kathleen Gibson’s classic 1979 article in Behavioral and Brain Sciences. Piaget’s stages could, in fact, be used as hypotheses for phylogenetic stages. I was able to point to specific characteristics of stone tools as reflecting abilities typical of specific stages of development. In other words, Piaget’s scheme could operate as an independent scale of cognitive development.
  • As the most influential theory of cognitive development of the twentieth century, Piaget’s theory carried a lot of weight. Bringing it to bear on Palaeolithic remains provided me with a very powerful tool. Indeed, it was more like a bludgeon when talking to paleoanthropologists who had little or no background in psychology.

Some of the insights provided by genetic epistemology remain sound, especially the basic sequence in the development of spatial cognition. But eventually, I abandoned use of the theory, and it is perhaps instructive to examine why.

Why I abandoned the approach in the 1990s:

  • My first inkling that all was not rosy came in a review of my work in the French journal L’Homme by Scott Atran. He commended my method, i.e., using an explicit cognitive theory to interpret stone tools, but condemned my choice of Piaget, referring to it as a ‘theoretical motley.’ I was a bit shocked, but shouldn’t have been. My strategy had always been to read Piaget as a primary source, and not to refer to the secondary literature, including any criticism. When I looked seriously at the critical literature, I found that all was not well in Piagetian developmental psychology. Indeed, to put it frankly, Piaget was on the way out, for a number of reasons. Among others, it turned out that children did not all pass through his stages in the same order, let alone at the same time. Piaget himself knew this, but did not think it important. But it was. Interestingly, Atran’s review remains the single most intelligent and useful piece of criticism I have ever received.
  • But more important for me was the fact that the theory did not ultimately work for human evolution, either. Even from the beginning I was troubled by some of the conclusions entailed by the theory. For example, the evidence from spatial cognition indicated adult-level intelligence by probably half a million years ago. Yet very little about the 500,000-year-old archaeological record appears modern. I fell back on a kind of ‘unexpressed potential’ argument, with culture change accounting for all subsequent developments. But it was weak, and I knew it.
  • Theories of general intelligence had fallen out of fashion in favor of modular models of mental life (still strangely enshrined in popular literature, as when a failing student claims to have great emotional intelligence…). It did look as if many human cognitive abilities were not manifestations of a single general intelligence, but instead needed to be treated separately. Piaget’s theory did not match up well.
  • The theory more or less ignored the brain. Of course, most psychological theories at the time did, too. But things were changing rapidly. First, information processing theories set out to duplicate neural functioning, and none of these approaches could be reconciled with Piagetian development. Then, in the 1980s and 1990s, the brain sciences saw dramatic developments; science finally began to understand a little about how the brain worked, and Piagetian development again did not fare well.
  • Finally, the Piagetian approach did not itself generate any further useful questions. I could sit back and defend my initial formulation, continue to tweak it through more precise chronologies, or look for a more defensible and productive theory. This sounds very clear and straightforward in retrospect, but at the time it was not, and I floundered about a bit in the late 1980s before settling on cognitive neuroscience.

Piaget and archaeology

This post is more autobiographical than most.

I mentioned in an earlier post that my avenue to cognitive archaeology was opened by a bit of bad luck, and a bit of good luck. My initial doctoral research proposal centered on measuring the edge angles and weights of Sangoan core axes in order to get some idea of what their function might have been (microwear analysis on most of them was impossible because of weathering). Unfortunately, when I requested access to the crucial collections from Kalambo Falls, I was denied; Desmond Clark himself had planned to work on the collections at the same time I wanted to. That was the bit of bad luck. But I had an idea that I’d been mulling over for almost a year, and which had its nascence in a graduate seminar on primitive technology led by Charles Keller. He took a rather unconventional approach to the topic. Instead of asking us to develop our own research topics, he assigned each participant an important twentieth-century thinker/theorist and sent us off to find out if that person had written or said anything that might help anthropologists understand how people make and use tools. My friend Mike Michlovic (Minnesota State University, Moorhead) was assigned philosopher Michael Polanyi, another was assigned sociologist Talcott Parsons, and so on. I was assigned Jean Piaget. This was the bit of good luck (I could have been assigned Wittgenstein…). I had barely heard of Piaget, and at that point knew nothing about his field of study or about his theoretical orientation.
The University of Illinois research library consisted of ten floors of book stacks, over ten million volumes in all. I had a carrel in the stacks, which allowed me freedom to wander around. The experience was quite different from googling. The major danger was stumbling across some ancient tome of wisdom on a remote subject, and distracting oneself for hours. All of Piaget’s books were shelved together, and there were a lot of them, in the original French and also in English translation. I initially read The Origins of Intelligence in Children, which was a summary of his early work in child development. At first it struck me as all rather tangential to tool use. Piaget observed his own children, especially as infants, as they came to understand the world they lived in. Based on these observations he developed a stage model for child cognitive development that became the most influential theory of child development in the twentieth century. It was qualitative research; Piaget was not very interested in samples or distributions or variance. But there was something quite fascinating about his accounts of how infants and young children appeared to understand the world. I next read several of his more narrowly focused studies, including The Child’s Conception of Space (co-authored with Bärbel Inhelder). Because none of these mentioned tool use at all, I turned to his more philosophical studies, Structuralism and Biology and Knowledge. I had received a heavy dose of structuralism as an undergraduate, so the learning curve was not too steep. Biology and Knowledge turned out to be a real eye-opener. Piaget had been trained in 19th-century French evolutionary science, which was markedly non-Darwinian. Indeed, Piaget’s first publications were on intergenerational changes in the shape of freshwater mollusks living in turbulent water (still cited today as an example of the Baldwin effect). Thus his entire perspective on evolution was very different from the Synthetic Theory version that prevailed in American academia at that time. Piaget’s perspective was provocative, even maddening, but it didn’t seem to have much to do with tool making and tool use. I dutifully wrote up as much and reported back to the group.
But Piaget had managed to capture my imagination. I recall very clearly the ‘aha’ moment when it dawned on me that Piaget’s scheme could be applied more or less directly to the Palaeolithic record through the avenue of spatial cognition. In hindsight this should have occurred to me from the beginning, but my initial charge, if you recall, was to ferret out insights about tool making and tool use, not spatial cognition, and I had focused more on Piaget’s proposed mechanisms of assimilation and accommodation, not his stage scheme. Piaget had himself contemplated the relevance of his theory for human evolution:

“The fundamental hypothesis of genetic epistemology is that there is a parallelism between the progress made in the logical and rational organization of knowledge and the corresponding formative psychological processes. Well, now, if that is our hypothesis, what will be our field of study? Of course, the most fruitful, most obvious field of study would be reconstituting human prehistory – the history of human thinking in prehistoric man. Unfortunately, we are not very well informed about the psychology of Neanderthal man or about the psychology of Homo siniensis of Teilhard de Chardin. Since this field of biogenesis is not available to us, we shall do what biologists do and turn to ontogenesis…” (Piaget 1970, Genetic Epistemology, p. 13).
Piaget may have thought that a cognitive archaeology would be the most fruitful approach, but members of my thesis committee were not as sanguine. The task was now to convince them that my ‘aha’ moment could be operationalized in an analysis of stone tools. I settled on a typological approach, defining analytical types based on Piaget and Inhelder’s study of spatial cognition. Thus my attributes included topological features (e.g., inside vs. outside a boundary), projective features (e.g., artificially straight edges), and Euclidean features (e.g., regular cross-sections). I was ultimately able to convince my committee that it was possible. I then applied for and was awarded an NSF dissertation improvement grant, and headed off to Dar es Salaam, Tanzania, to examine collections.
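Purely as an illustration of that logic (and not the coding scheme actually used in the dissertation), the idea of analytical types built from spatial attributes can be sketched in a few lines of Python. The attribute names, the ordering of the categories, and the example artifact below are hypothetical stand-ins.

```python
# A minimal, hypothetical sketch of attribute coding by spatial category.
# Attribute names and the example artifact are illustrative assumptions,
# not the original analysis.

from dataclasses import dataclass


@dataclass
class ArtifactCoding:
    """Presence/absence of spatial-cognition attributes observed on one artifact."""
    topological: bool  # e.g., flaking respects an inside/outside boundary
    projective: bool   # e.g., artificially straight edges
    euclidean: bool    # e.g., regular, symmetrical cross-sections

    def highest_category(self) -> str:
        """Return the most demanding spatial category evidenced by the piece."""
        if self.euclidean:
            return "Euclidean"
        if self.projective:
            return "projective"
        if self.topological:
            return "topological"
        return "none recorded"


# Example: a hypothetical biface with straightened edges but irregular sections.
biface = ArtifactCoding(topological=True, projective=True, euclidean=False)
print(biface.highest_category())  # -> projective
```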

Next: Critiques of the Piagetian approach

The role of the ‘New Archaeology’ in Evolutionary Cognitive Archaeology

The perspective on archaeological practice termed the ‘New Archaeology’ in the late 1960s, and now generally referred to as processual archaeology, had its roots 20 years earlier in the U.S. with Walter Taylor’s A Study of Archaeology (1948). Taylor’s book was a reaction against narrow classificatory approaches that yielded typologies and local culture sequences, but which revealed relatively little about people or life in the past. Taylor called for a more anthropological archaeology. U.S. archaeologists worked mostly in anthropology departments, and thus in the 1950s and 1960s they began to make conscious efforts to “put the Indian behind the artifact,” as it was indelicately phrased. Willey and Phillips’ Method and Theory in American Archaeology (1958) was the classic statement of this stance, and the guiding perspective for most of U.S. archaeology in the 1960s. In the U.K., Grahame Clark’s excavation at the Mesolithic site of Star Carr in the early 1950s broadened the focus of prehistoric archaeology to include economic life, so the shift in perspective was not entirely American in origin. The 1960s saw significant growth in U.S. anthropology programs, including archaeology, accompanied by dramatic increases in available research funding from a number of sources. Universities hired many young archaeologists, some of whom, including Lewis Binford, made the reorientation of archaeological thinking something of a crusade. This period in the history of archaeology has been amply covered by several historians of archaeology (e.g., Bruce Trigger), so I thought that I would place a personal spin on a few of the components of the ‘New’ archaeology that influenced my thinking.

I encountered my first debate in archaeological method and theory as an undergraduate in about 1968 (politically and socially, a very dramatic and pivotal year). This was the debate about the epistemological status of archaeological typologies. Were archaeological types ‘real’ or ‘natural,’ or just useful archaeological constructs? The consensus settled on the latter; archaeologists constructed typologies to solve (usually) culture-historic problems, and there was no expectation that the people in the past who made the artifacts would have used, or even understood, the types. This stance effectively extinguished one of the few embers of cognition still burning in prehistoric archaeology: had our typologies coincided with prehistoric classification systems, they would have provided a glimpse into prehistoric thinking. At the time, of course, this did not bother me. I embraced the enthusiasm for analytical types (I even used them in my dissertation), and along with most of my contemporaries abandoned any attempt to recognize ‘natural’ types.

When I entered graduate study in 1971 the ‘New Archaeology,’ as it had come to be known, was in full swing and much of the passion for reform was carried by young PhDs and graduate students. For a Palaeolithic specialist like me (though I set out to study the African Iron Age…) there were two major components to the ‘New’ archaeology – a self-conscious adherence to methodological strictures derived from the Philosophy of Science, especially the Vienna Circle, and a commitment to materialist/ecological models of culture.

The philosophy of science component was arguably the more significant. To be persuasive, archaeological discourse needed to be scientific discourse; that is, archaeological arguments needed to be structured in a scientific way. At the time the Vienna Circle was still very influential in the philosophy of science, and I remember reading turgid texts by Carl Hempel and others who dictated the use of deductive reasoning and the construction of covering laws. Fellow students in the University of Illinois program even invited a faculty member from the philosophy department to come tell us how to reason; his amusing reply was that his scholarly task was to figure out how we reasoned, not vice versa, and that it was silly to ask a philosopher. The passion for covering laws and other technical aspects of deductive reasoning eventually waned, but two components of scientific reasoning became established archaeological practice – the use of hypothesis testing, and the explicit use of theory to generate those hypotheses. Both were to become important components of evolutionary cognitive archaeology as it is practiced today, but cognitive archaeology itself was not an immediate product of the New Archaeology, and the reasons lie in the second major component of the approach – materialist theories of culture.

One of the ironies of the history of the New Archaeology is that it advocated for explicit use of theory in archaeological practice, but then advocated for only one: materialism. This was especially true for Palaeolithic studies for one good reason, and one not-so-good reason. Materialist theories of various stripes (e.g., Marxism, cultural materialism, cultural ecology, etc.) emphasize the ‘means of production’ as the determining or primary factors in human culture. Thus, for example, the nature of religious life is seen to be determined, or at least framed, by how people acquired and distributed food (hunting and gathering vs. farming, and so on). Now, just about the only remains Palaeolithic archaeologists ever find are tools and garbage, both closely linked to means of production. Thus, materialist theories have a natural appeal. Note that actual actors and minds were not subjects of study; at most they were seen as components in natural systems.

The not-so-good reason for the dominance of materialism was the undue influence of a few individual scholars. In the 1960s Palaeolithic specialists in academia were relatively few, and individual academic networks were paramount in professional life. One of the central actors in Palaeolithic research at the time was Lewis Binford, who had established his materialist credentials in a famous debate in the literature with François Bordes about the meaning of the Mousterian variants. Bordes argued that these different tool sets represented separate Mousterian cultures or groups; Binford countered that they were simply different functional assemblages made to perform different tasks, and had no implications for culture or group affiliation. Binford was a radical materialist, and had no patience for any other theoretical stance. He was also a voluble and acerbic critic of others, occasionally venturing close to ad hominem attacks. He famously described any reference to cognition as ‘paleo-psychology’ and mocked it as unscientific nonsense. Unfortunately, there was no effective counterweight in Palaeolithic studies. Even Glynn Isaac, who did more than anyone to inject new thinking into African Stone Age archaeology, and who was more open-minded about theory, was a target of Binford’s venom.

By the late 1970s Palaeolithic studies were firmly committed to scientific archaeology and materialist causality. Few other perspectives ever appeared in print. There is another irony here. By the late 1960s anthropology had more or less abandoned materialist theory in favor of structuralist models. As a result, archaeological and anthropological theory began to diverge. By the 1980s many archaeologists returned to the anthropological fold, embracing structuralist and post-modern perspectives on culture. But Palaeolithic specialists continued to adhere to a materialist/ecological party line, and most continue to do so to this day. There were a few exceptions working in the U.K., such as Clive Gamble, John Gowlett, and Colin Renfrew, but in the U.S. the climate among Palaeolithic specialists was quite hostile to anything smacking of structuralism or, worse, post-modernism.

I chose as my initial PhD research project a classic materialist study, measuring edge angles on Sangoan core axes (Sangoan was an Early Stone Age/Middle Stone Age ‘transitional’ industry) in order to learn something of their function. By a mixture of bad luck and good luck I ended up doing something very different.

Next time: How a self-respecting Palaeolithic specialist got caught in a Piagetian web…

What were the intellectual roots of evolutionary cognitive archaeology?

(Idiosyncratic musing by Thomas Wynn on the recent history of evolutionary cognitive archaeology)

In my online course on the history of cognitive archaeology I begin with a classic 1969 paper in Current Anthropology written by Ralph Holloway, who was and is the dean of American paleoneurologists. But that paper did not emerge from a vacuum, and I thought it appropriate to briefly examine the intellectual trends in (mostly) American anthropology that set the stage.

Interest in prehistoric minds stretches back, in Europe at least, to the initial acceptance of deep antiquity in the Nineteenth Century. However, early speculations were not informed by formal theories of cognition, largely because there were few, if any, such theories; the psychological science of the time was itself in its infancy. Instead, understandings of prehistoric minds were derived largely from the ethnological literature, which at the time was influenced by unilineal evolutionary thinking of the “savagery, barbarism and civilization” variety, and an unexamined assumption of European ascendancy. This bias played out in a way that had unfortunate long-term consequences for our understanding of Neanderthals in particular, who were assigned to a lower mental grade based on their robust anatomy, their less diverse range of artifacts, and the chronological accident of having their remains stratified below more European-looking people. After establishing the fact of deep antiquity, archaeologists focused on the basic questions of what and when: what artifacts did Palaeolithic people make, and when did they make them. To do this they borrowed heavily from geology (for chronology) and paleontology (for classifying artifacts and assemblages).

In the early Twentieth Century behaviorism in psychology and historical particularism in anthropology combined to all but banish serious interest in prehistoric mental life, at least among American archaeologists. In psychology, behaviorism treated minds as essentially tabula rasa, blank slates, upon which more complex learning contexts would inscribe more complex mental responses. In anthropology, the historical particularism of Boas and his students viewed cultural differences as reflecting only different local histories, and not any universal stages of development. This was a backlash against the racist and Eurocentric stance of most Nineteenth Century anthropology. The development of cultures became the only topic of importance; there was no need to invoke anything about prehistoric minds. Archaeologists were quite comfortable with particularism because it encouraged them to describe local developmental sequences, without any need to explain how or why. The local sequence was king. When Gordon Willey and Philip Phillips famously averred that American “…archaeology is anthropology or it is nothing…” (1958, Method and Theory in American Archaeology) they were operating within a historical particularist context, with a goal of tracing culture-historic sequences in the New World.

In the 1950s and 1960s many anthropologists eschewed particularism, and embraced a new kind of general theory about the nature of culture – and mind – namely, structuralism. The underlying principles that people used to structure social and religious life must, structuralists reasoned, at some point be mental structures, though at the time anthropologists shied away from specifying just what these structures might be in a neurobiological sense. At the same time psychology was beginning to reintroduce minds into the study of mental life. Cognitive psychologists, in response to the failures of behaviorism, began to argue that the mind was not a blank slate, but in fact came with an inherent structure that guided or predisposed people to certain behaviors.

However, instead of embracing structuralism or cognitivism, most archaeologists in the 1960s either stuck to particularism, or embraced materialist theories of culture, which had experienced only limited popularity in anthropology as a whole. Materialist theories, in which the productive system is seen to dictate other aspects of culture, were almost ideal for archaeologists, especially archaeologists of the deep past where technical and economic data were just about the only data ever recovered. This was not a particularly friendly environment for the development of cognitive archaeology. Minds, even individuals, were considered by most Palaeolithic specialists to be impossible to study, but more significantly, brains and minds were thought to have had no power to influence the evolution of hominins or hominin culture.

In his 1969 article in Current Anthropology Holloway specifically addressed the significance of stone tools associated with early hominin fossils. Given the prevalence of structural and linguistic models in American anthropology, it is perhaps not a surprise that Holloway applied this perspective to the early stone tools. He had done his graduate work at the University of California, Berkeley, where Sherwood Washburn had developed a more holistic and dynamic physical anthropology that incorporated perspectives from primatology and ecological theory. The result was less emphasis on anatomical minutiae, and more interest in adaptation. Holloway was interested in the evolution of the brain as reflected in endocasts. For Holloway it was a given that changes in hominin brains must have been accompanied by changes in behavior. He linked hominin endocasts to adaptive behavior by constructing an argument about minds, suggesting that the patterned action required to make stone tools was akin to the regularities of syntactical communication, and that therefore structurally-patterned culture must have been a leading element in the evolution of the early hominin brain and cognition. This was a landmark paper for human paleontology and for the thread of research that became evolutionary cognitive archaeology. In retrospect, two components of Holloway’s argument stand out: 1) the explicit use of archaeological remains as evidence of cognition, and 2) the use of an established theory, in this case a linguistic/structural model of culture. Unfortunately, while Holloway’s conclusion was well received, his method had little influence on archaeological practice, which was then entering the chaotic days of the ‘New Archaeology,’ out of which a decidedly materialist/ecological perspective came to dominate Palaeolithic research.

NEXT TIME: Thoughts on the “New Archaeology” and Palaeolithic archaeology in the 1970s