
Archaeology, evolution, and the evidence of early human conflict

When did humans begin to kill humans? Or, more precisely, when did groups of humans learn to cooperate to kill members of other groups? The debate on this subject is old and intense. For much of the 20th century the scholarly consensus followed the philosophical principle associated with the 18th-century French philosopher Jean-Jacques Rousseau, in which “primitive” humans, in an egalitarian “state of nature,” were naturally and fundamentally peaceful. They presumably lived in an environment of abundant shared resources and were not cursed by violent territoriality or any other form of conflict. Thomas Hobbes’s 17th-century opposing formulation of a state of nature in which each human warred against all, leading to lives he famously described as “solitary, poor, nasty, brutish, and short,” was constructed, like Rousseau’s, to explain the origins of government, but neither reflects any real understanding of human prehistory. These two ways of construing human behavior have polarized debate ever since.

Perhaps the most notorious moment in this ongoing discussion occurred in 1986 at a UNESCO meeting in Seville, Spain. Attended by a host of eminent scientists, the conference issued a statement—later adopted by the American Anthropological Association, the American Psychological Association, and the American Sociological Association—asserting in highly categorical language that “it is scientifically incorrect to say that we have inherited a tendency to make war from our animal ancestors. … [And] it is scientifically incorrect to say that war or any other violent behaviour is genetically programmed into our human nature. … [And] it is scientifically incorrect to say that in the course of human evolution there has been a selection for aggressive behaviour more than for other kinds of behaviour.”

This sort of position staking does a disservice to honest investigation of the human condition and its origins, but the debate began to shift dramatically in 1996, with the publication of archaeologist Lawrence Keeley’s War Before Civilization. Keeley made an impassioned plea for reimagining the role of violence in human experience, and he made the striking claim that pre-state societies experienced extreme male fatality rates. Keeley’s work and additional studies have settled on estimates of between 15 and 25 percent of males and roughly 5 percent of females in human forager societies dying from warfare, a per capita rate that far exceeds those of later state-based societies.

Since Keeley’s work, archaeologists and anthropologists have renewed the debate originally opened by Rousseau and Hobbes. In the simplest terms, one school of thought, the “deep rooters,” believes in the long evolutionary history of intergroup violence; the other, the “inventors,” prefers a more recent origin for human conflict, emerging only because of changes in human social organization. The debate between the two schools continues, even as new evidence accumulates. What follows examines the deep-rooter position and emphasizes the idea that conflict and cooperation are inextricably linked. The best and the worst of humanity evolved hand in hand.

Recognizable states warring with each other and with surrounding nonstate peoples had clearly emerged by 3500 BC in Uruk in Mesopotamia, by 3100 BC in predynastic Egypt, by roughly 1800 BC in north China (the Xia, or Erlitou, culture), and by 2500 to 2000 BC in India (the Harappan culture, although the evidence for war there is not as marked as elsewhere). Sedentism, agriculture, and state formation occurred much later in the Western Hemisphere—in central Mexico by 100 BC and in the highlands of Peru by around AD 500. These dates mark a kind of endpoint from which to proceed backward in time, moving from what is incontrovertibly state-based war deeper into the human prehistoric and evolutionary past.

Virtually all scholars now accept that early agricultural communities engaged in warfare well before the emergence of the state. Considerable continuity existed between later “states” and the complex social organizations that immediately preceded them. The archaeological evidence for pre-state warfare in the agricultural Neolithic includes fortifications, weapons, paintings, and traces of settlement destruction. Çatalhöyük, a site in Turkey occupied between 9,500 and 7,700 years ago, and Tell Maghzaliyah, a site in northern Iraq dating back 9,000 years, were clearly fortified. Near Çatalhöyük, the site of Hacilar shows a clear archaeological sequence: an unwalled village destroyed around 7,500 years ago, rebuilt with a wall four to six feet thick, destroyed again 250 years later, rebuilt with even stronger walls, and finally destroyed and abandoned around 6,800 years ago—all well before the emergence of states.

Moving backward in time, sedentary settlement based on intensive local foraging emerged in the Near East around 13,000 BC, with agriculture following by around 9000 BC. In Central Europe sedentism emerged around 10,500 BC, with agriculture arriving from the Near East by around 5500 BC, almost certainly through migration. There are three famous massacre sites associated with these transitional periods, as well as an assortment of other sites showing violent death and even cannibalism. Right on the cusp of the emergence of agriculture in Europe is the 7,000-year-old mass burial at Talheim, near Heilbronn, Germany. The bodies of 34 individuals were unearthed in a pit there; most of them died violently, probably by execution (the bodies lack defensive wounds, and many of the fatal head blows are in an identical location on the skull). The second site in Europe, Ofnet Cave in Bavaria, contains two pits where a similar number of individuals were buried. The site dates back about 8,500 years, before the arrival of farming in the region. At least half of the skulls show trauma or cut marks. The skulls were curated—preserved and displayed in some way—but they certainly suggest some form of lethal conflict. The third and most famous site is the 13,000-year-old cemetery at Jebel Sahaba in the Sudan, in which 59 burials, including many women and children, were deposited in small groups over time, and at least half of the skeletons have stone points embedded in them. The site is usually interpreted as suggesting repeated attacks on a substantial community by an enemy or group of enemies.

Some scholars argue that Talheim, Ofnet, and Jebel Sahaba all represent conflict from locations and times during which humans were settling into sedentary patterns of life that emphasized and hardened territoriality and intensified competition for local resources. But moving still further back in time, it becomes clear that the pivotal issue is not sedentism but biological selection. Did selection operate during human evolution to favor traits—especially larger group size and self-sacrificing cooperation (usually called altruism)—that provided a competitive advantage in intergroup conflict?

To begin that discussion requires some basic background on human evolution and the migrations out of Africa. The dominant model for the populating of the planet, first by early hominids and then by modern humans, is now the “two-wave” model. The first wave of Homo migrated out of Africa around 1.5 to 1.8 million years ago. Those migrants established populations in Europe and southwest Asia and eventually evolved into Homo neanderthalensis. Others in that wave moved into East and South Asia and followed separate evolutionary tracks that are less well documented but that eventually led to genetic intermingling with both Neanderthals and modern Homo sapiens. Meanwhile, the Homo population remaining in Africa also continued to evolve, eventually producing anatomically “modern” humans, Homo sapiens, between 400,000 and 200,000 years ago.

Then, about 125,000 to 100,000 years ago, a second wave of emigration from Africa began, now of modern humans. That wave stalled or slowed in the Middle East and then restarted later and at much greater speed. It is possible that this starting, stalling, and restarting was tied to specific evolutionary developments in modern humans, so let’s review what was happening in the minds and bodies of early H. sapiens.

About 100,000 to 70,000 years ago, modern humans in Africa began to display creative behaviors that indicated self-awareness, complex problem solving, and even ritual. This was roughly contemporaneous with the beginning of the second wave of migration. That wave may have stalled in the Middle East because of competition with long-resident Homo populations there. Then around 60,000 to 40,000 years ago, human cognitive development accelerated further, marked by a diversification of tool types and materials used and of art and ornamentation; distant sourcing of materials; and perhaps even a new stage in language development. These new skills may have played a role in the rapid population expansion that saw H. sapiens spread across Eurasia with unprecedented speed. The “social brain” theory (which among other things correlates the size of the neocortex to the size of the sustainable social network) suggests that in addition to new tools, the new human cognitive and linguistic capability contributed to the establishment and maintenance of larger groups that could sustain multiple tiers of organization. Where chimpanzees could sustain non-tiered communities of maybe 50, Neanderthals had moved to a tiered social organization of camps of 12 to 15 individuals grouped into macrobands (the latter functioning as the equivalent of a community) of 200 or so. Foraging H. sapiens, however, had camps ranging from 15 to 50 individuals, with an average of 17 bands in a macroband, and thus a coherent group ranging from 250 to 800 individuals; some theorists posit the emergence already in the Upper Paleolithic (50,000 to 10,000 years ago) of an even higher tier of “global networks” of 2,000 to 2,500 people. These connections were sustained through language and other forms of symbolic activity, such as ritual, death ceremonies, and body adornment.

H. sapiens’ enhanced social capability and new technologies (notably the atlatl spear thrower developed 30,000 to 40,000 years ago) gave them a competitive advantage as they expanded into territories inhabited by other archaic humans. There have long been theories and suggestions that the in-migration of modern humans pressured, isolated, and eventually eliminated the Neanderthal population, if not through actual genocide then by pushing them into more and more marginal zones where their reproduction rates suffered. There is little physical evidence to suggest this competition was always violent, but some evidence does exist for both intra-Neanderthal violence and violence between Neanderthals and modern humans.

There is substantial skeletal evidence of violent death and dietary cannibalism among Neanderthals. A dramatic example emerged relatively recently at the El Sidrón site in Spain. DNA analysis of the 12 Neanderthal individuals found there has shown that they were a family group of related males and children, buried approximately 50,000 years ago along with females not related to them by blood; all of them appear to have been killed in a single incident and butchered for food. While other examples of Neanderthal cannibalism exist, only some 400 Neanderthal individuals have ever been discovered—and not often their whole skeletons—so no statistical conclusions about the prevalence of violence can be reliably drawn. Currently, only 27 Neanderthal and 19 H. sapiens skeletons from the Paleolithic show evidence of violent death. But many fatal injuries would leave no mark on the skeleton, and victims of massacres in the prehistoric past were as likely as not to be left unburied, their remains scattered. Such numbers and conditions make it difficult to sustain statistical claims for the omnipresence of violence (or its lack), but the massacre and eating of a family at El Sidrón, like the much later bodies at Jebel Sahaba in Sudan, certainly point to the possibility of at least intermittent extreme, indiscriminate, and presumably terrifying violence.

There is also evidence for cannibalism among Paleolithic modern humans. About 14,700 years ago at Gough’s Cave in Britain, five bodies were processed, the long bones cracked to extract marrow and one skull shaped into a drinking cup. Both Neanderthals and early H. sapiens therefore provide at least some evidence for intraspecies cannibalism and violence. But what about H. sapiens violence against Neanderthals? Here the evidence is scant, and there is an emerging possibility that at least in parts of Europe, particularly the Iberian Peninsula, the chronological overlap of Neanderthals and H. sapiens may have been shorter than once believed, or even regionally nonexistent. Some evidence does exist: at Les Rois Cave in France, the bones of a modern human were found alongside the skull of a Neanderthal child, and the child’s jawbone bears the marks of having been processed for food. And a recent reanalysis of a wound on a Neanderthal skeleton known as Shanidar 3 (found in Iraq’s Zagros Mountains in the 1950s) suggests that it was inflicted by a thrown spear, possibly launched with an atlatl, and therefore almost certainly by a modern human.

The basic contention here is that coordinated lethal conflict between groups of humans intent on increasing their access to resources of various types has existed in all times and places, though not continuously. Conflict or war does not dominate the archaeological record of pre-state societies, but it does suffuse it, deep into the Paleolithic. We can turn to our closest biological relatives, the chimpanzees, for further evidence of the likely role of conflict in early human societies.

Animal behaviorists used to see intraspecies conflict as generally resolved by submission or flight, but more and more evidence points to lethal intraspecies violence, particularly among social animals like ants, meerkats and banded mongooses, gerbils, lions, prairie dogs, wolves, spotted hyenas, and especially chimpanzees. Chimpanzee group conflicts have striking parallels to those found in human forager societies. First, conflict is not endemic but episodic—proximity does not guarantee conflict, but neighboring communities live under the threat of conflict nonetheless and behave accordingly, with territories clearly marked, patrolled, and defended. Second, casualty rates can be very high. Third, conflict does not begin through simple aggression but rather in response to ecological pressures and perceptions of advantage. Specifically, conflicts begin when one group enjoys an overall numerical advantage, and attacks are launched only when a localized numerical edge exists, usually with the benefit of surprise. Fourth, raids are conducted by groups of males and are shot through with male sociality and even training activities: adult chimpanzees have been observed correcting the patrolling behaviors of adolescent members. Finally, females might be killed but are more likely to be captured and socially absorbed by the winning group.

This collection of behaviors can only be called war. It is organized group activity, conducted with lethal effects designed to diminish one group for the benefit of another. Its existence among our closest living genetic relatives suggests that even where the archaeological evidence falters, human evolution likely has long taken place in the context of such violence.

Did violence among our own hominid ancestors go back far enough, and was it frequent enough, to have evolutionary effects? In other words, did death by fellow human generate an evolutionary selection effect? Economist Samuel Bowles, using a variety of estimates of lethality—including from the ethnographic record of hunter-gatherer warfare, from the conflicts of chimpanzees, and from the limited archaeological evidence of prehistoric forager societies—has developed a mathematical model he says indicates that “for many groups and for substantial periods of human prehistory, lethal group conflict may have been frequent enough” and lethal enough to have had a selection effect, and furthermore, that it was selecting for “quite costly forms of altruism.” Cooperation and conflict, it seems, proved to be two sides of the same evolutionary coin, each reinforcing the other: groups evolved larger and more successful systems of cooperation in order to succeed at conflict, and the persistence of conflict necessitated ever more complex forms of cooperation.

This sort of conclusion presumes an evolutionary role in creating or shaping complex forms of human behavior, something that modern scholars have only recently begun to accept. Yet the last few decades have seen increasing evidence for exactly that, especially for natural selection that affects group behavior. Of particular importance for military historians is the problem of altruism addressed by Bowles—that is, the willingness of an individual to take risks, even self-sacrificial risks, on behalf of another or others in the group, however large that group might be. Genetically speaking, altruism seems to violate the primary imperative to survive and reproduce. If altruism had biological origins (and it is visible in certain nonhuman cooperating species), what selection processes created it?

The current orthodoxy on this subject argues that evolutionary selection works at an individual level, favoring traits that promote individual survival and reproduction. In this view, “species” do not evolve as a group; individuals within a species evolve, and the most reproductively successful then outcompete the other individuals and pass on their genes more prolifically until their traits come to redefine the species. Within this individually competitive dynamic, what then would promote risky altruism?

Beginning in the mid-1960s, and later under the rubric of “sociobiology,” biologists argued for the principle of “inclusive fitness,” in which individuals within a group cooperate in order to promote the survival of their own genetic code via that code’s survival in close relatives. The more closely related the group, the greater the incentive to cooperate: a brother who acts to promote his siblings’ reproductive success is also propagating his own genetic code, since each sibling carries, on average, 50 percent of it.
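The quantitative intuition behind inclusive fitness is usually written as Hamilton’s rule, which the article does not quote but which underlies the sibling example; a minimal statement in standard notation runs as follows.

```latex
% Hamilton's rule (standard formalization of inclusive fitness; not quoted in the article):
% an altruistic act is favored by selection when
\[
  rB > C
\]
% r = genetic relatedness of actor to recipient (r = 1/2 for full siblings)
% B = reproductive benefit conferred on the recipient
% C = reproductive cost borne by the altruist
% For full siblings, self-sacrifice pays off in gene-copy terms whenever B > 2C.
```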

More recently, biologist E. O. Wilson, for decades one of the most prominent champions of inclusive fitness, along with Martin Nowak and other biologists using evolutionary game theory, has argued that there is in fact such a thing as selection operating at a group level—not just on individuals. Without completely dismissing inclusive fitness, they now argue instead for multilevel selection: individual selection continues, including for group behaviors, but group selection also operates. Furthermore, and this is why it matters here, the selection pressure encouraging group behaviors is competition from members of the same species who live in a different group or groups.

For Wilson, the key criterion that initiated group selection favoring cooperation was the creation of a “protected nest,” a form of living that promotes sociality without necessarily eliminating continued intragroup competition and selection. From that initial stage, Nowak argues that populations of cooperators will eventually outcompete selfish populations. Selfish “defectors” tend to have an early advantage in competition, but as their reputation as defectors grows, that advantage diminishes in the face of united cooperators. In short, hereditary altruists form groups that cooperate and organize to outcompete nonaltruist groups, “through both direct conflict and differential competence in exploiting the environment.”
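The tension Nowak describes, in which defectors win inside groups while cooperator-heavy groups win between them, can be made concrete with a toy simulation. The sketch below is illustrative only: it is not Bowles’s or Nowak’s published model, and every parameter value is an arbitrary assumption chosen for demonstration.

```python
import random

# A toy multilevel-selection sketch, illustrative only: this is not
# Bowles's or Nowak's published model, and every parameter below is an
# arbitrary assumption chosen for demonstration. Within each group,
# defectors out-reproduce cooperators because they share the group
# benefit without paying the cost; between groups, cooperator-heavy
# groups win conflicts and replace the losers. Whether altruism
# survives depends on how frequent and decisive intergroup conflict is
# relative to the defectors' within-group advantage.

N_GROUPS = 20
GROUP_SIZE = 25
COST = 0.05            # fitness cost paid by each cooperator
BENEFIT = 0.5          # group benefit, scaled by the share of cooperators
CONFLICTS_PER_GEN = 5  # intergroup conflicts per generation
GENERATIONS = 300

def fitness(is_cooperator, group):
    """Individual fitness: shared group benefit minus the cooperator's cost."""
    coop_share = sum(group) / len(group)
    return 1.0 + BENEFIT * coop_share - (COST if is_cooperator else 0.0)

random.seed(0)
groups = [[random.random() < 0.5 for _ in range(GROUP_SIZE)]
          for _ in range(N_GROUPS)]  # True = cooperator, False = defector

for _ in range(GENERATIONS):
    # Within-group selection: offspring drawn in proportion to individual fitness.
    groups = [random.choices(g, weights=[fitness(a, g) for a in g], k=GROUP_SIZE)
              for g in groups]
    # Between-group selection: the group with more cooperators wins the
    # conflict and replaces the losing group with a copy of itself.
    for _ in range(CONFLICTS_PER_GEN):
        a, b = random.sample(range(N_GROUPS), 2)
        winner, loser = (a, b) if sum(groups[a]) >= sum(groups[b]) else (b, a)
        groups[loser] = list(groups[winner])

coop_share = sum(sum(g) for g in groups) / (N_GROUPS * GROUP_SIZE)
print(f"cooperator share after {GENERATIONS} generations: {coop_share:.2f}")
```

With conflict frequent and decisive enough, cooperator-heavy groups tend to spread despite the within-group cost of altruism; remove the conflict step and defectors reliably take over. That trade-off is the sense in which cooperation and conflict behave as two sides of the same evolutionary coin.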

Therefore, competition and its most dramatic form—violent conflict—acted as a selection pressure on human biological evolution, favoring group behaviors like male solidarity, a “shoot-on-sight” attitude toward intrusive strangers, the ability to incorporate the defeated remnants of other groups, risk calculation in assessing the threat of other groups, and, most important, greater social complexity to sustain the larger group sizes that provide an important advantage in violent conflict.

It is this last characteristic that points us to the other side of the conflict coin: cooperation. For humans, being “good” at conflict meant more than physical size, keen eyesight, or other physical attributes. It also meant selection for cooperation. Recent studies of infants suggest that the ability and desire to reward cooperators exist at the genetic level, and ethnographic work among band-level societies frequently shows not only that they depend on cooperation within the group but that the group actively controls aggressive, dominance-seeking individuals. The evolution of cooperation is thus linked to the existence of lethal conflict, which promoted, in Bowles’s words: “a particular form of [cooperative] altruism, often hostile toward outsiders and punishing toward insiders who violate norms, [and it] coevolved with a set of institutions—sharing food and making war are examples—that at once protected a group’s altruistic members and made group-level cooperation the sine qua non of survival.” Modern humans are thus products of a grand tension: internal cooperation was used to compete more aggressively for resources at the expense of outsiders of the same species; confronting such aggression demanded ever better cooperation to enlarge group size; and an enlarged group needed more resources to sustain itself, and so on.

While the development of weapons for hunting and killing at a distance—including handheld spears some 400,000 years ago—may also have altered the patterns of human conflict by increasing its risk, what is being suggested here is that a “man the competitor” hypothesis (as distinct from “man the hunter” or “man the hunted”) smoothly links biological evolution in a continuum with cultural evolution—what Wilson calls “gene-culture co-evolution.” It is this dynamic of humans in competition with each other and therefore seeking ever more organizational capacity that was the real shaper of war as a social phenomenon, first biologically then culturally. As conflict continued in cultural time (as opposed to evolutionary time), it continued to promote greater cooperation within the group, with solutions now achieved through cultural adaptation and transmission at a much greater rate than had occurred genetically. Solutions to selective pressures could now be reached through learning and problem solving, then taught to succeeding generations.

Is it from this evolutionary process that we developed the intense bonds that exist between small groups of men engaged in conflict? And is it also here that we find the powerful and persistent ethnocentrism that enables and even encourages one group to label another as “others” and thereby to treat them with more extreme violence? This is not to claim that we are trapped in these behaviors—culture still overrules biology, especially in group behaviors—but it does suggest their power. And finally, it suggests that human cooperation and complexity evolved both biologically and culturally in the face of human conflict—that they were and remain two sides of the same coin.

 

Wayne E. Lee is a professor of history at the University of North Carolina and chair of the Curriculum in Peace, War, and Defense. He has written or contributed to five books on human conflict.

Originally published in the January 2015 issue of Military History Quarterly.