roots of domination

Over on Hipcrime Vocab, an awesome new summary of the trip from egalitarian tribes to civ.

I have a few comments. We need a better understanding than Harris offered of the move away from a society committed to leveling, and of the rise of the Big Men. As escapefromWisconsin puts it, “in such societies, aggrandizing members … encourage the production of surpluses by which they throw lavish feasts to enhance their prestige and status.” Yes, but a society based on the values of ‘vigilant sharing’ would not have allowed striving for prestige and status in the first place.

I disagree that slavery emerged because the agrarian lifestyle is backbreaking. There is plenty of evidence that foragers/horticulturists lived very well; they had some surplus, and they still had their leisure. Slavery became a necessity only after top-heavy elites made mincemeat of the economic patterns linked to sharing. It’s the overhead, stupid! 🙂

And finally, the progression from egalitarian band to despotism already happened within the egalitarian bands themselves. There is a creepy account of a Greenland Inuit group that fell prey to a despotic shaman who murdered people and stole women. The band became so terrified that its members were unable, at the time this early account was written, to strike back. We don’t know whether they finally managed to assassinate him, or whether they all snuck off in the middle of the night. In other words, it is possible to hoard power and become a despot without first taking the entrepreneurial path of Big Men.

Welcome, commenters!

Once aggrandizers are given an inch of leeway under favorable resource conditions, they quickly stretch that inch into a mile and keep on going.
— Brian Hayden

Once upon a time, there lived the ancestral apes that gave rise to humans, chimpanzees and bonobos.
In all likelihood, they lived in bands dominated by the strongest, most aggressive individuals — the male alphas. This tends to produce a rather disagreeable state of affairs in which anyone can be humiliated or brutalized at any moment, and the best food and most mates go to just a few. Even baboons would rather opt out when the opportunity arises! In addition, our growing brains demanded the fats found only in scarce meat, which the alphas commandeered.

Evolution snaked forward. The chimps pretty much put up with the tried and true. Bonobos evolved out of this unpleasant arrangement into an alliance of females, cemented by mutual sexual pleasuring. Humans likewise evolved out of it and into an alliance of betas, cemented by unprecedented, increasingly subtle communication abilities, eventually including laughter and speech.

In conjunction with weapons-at-a-distance that equalized brawn and brains, power came to be shared, and so was the meat. The resulting egalitarian bands, a durable and satisfying arrangement, saw humans through the harshness of repeated ice ages and other natural calamities. During this time, humans became survivors par excellence on the planetary stage. The egalitarian strategy of “vigilant sharing” had proven itself a winner.

When did our first egalitarian revolution occur? Nobody knows, as yet. Some experts posit it could be as far back as when we came down from the trees; others place it within our sapiens timeline. The oldest known wooden, fire-hardened spears date from about 300,000–400,000 years ago.

This agreeable social arrangement began to unravel slightly in areas of plenty in the late European Paleolithic, and gradually wound down among the so-called “complex hunter-gatherers” after 15,000 years ago. Complex or transegalitarian foragers were people who forged new pathways into competition, accumulation, increasingly violent conflict, and ratcheting economic growth. Individuals known in the literature as Big Men or aggrandizers led this “elitist revolution,” becoming quite the experts at getting people to crank out work and surpluses, by hook or by crook.

In the beginning, these hardworking, enterprising, and generous leaders couched their projects in the language of altruism and community. But being “triple-A” (aggressive, acquisitive, ambitious) personalities, they were also surreptitiously looking out for number one. As more and more wealth of the tribe flowed through their hands, they learned to skim a little, then a bit more, for themselves. They finessed a plethora of strategies that created social imbalances among the people of the tribe. At first, only a few families were left behind, and most did well in the aftermath of Big Men’s projects. But in time, poverty spread apace with increasing social stratification. And after a few millennia of these increasingly manipulative and coercive tactics, the very individuals who early on worked the hardest and kept the least became those who worked the least and kept the most.

As the ratchet picked up speed, wealth and power inequalities grew to such an extent that a genetic bottleneck shows up around 8,000 years ago [reports just off the press, here and here] in various communities of the mid to late Neolithic. Just like in the days of our apish ancestors, the most aggressive alphas grabbed the best food and most of the mates. H. sapiens went baboon.

More work meant more food meant more people. Aggressive, accumulative, highly competitive societies gained a short-term advantage and pushed out those who stayed with the old relaxed, egalitarian lifestyle. The needs of power came to trump the needs of life on the “Parable of the Tribes” planet. Elite-run societies are very good at producing goods; they nevertheless have a variety of disadvantages, the key one being this: aggrandizers have a problem with brakes. In the long run, they drive their societies off a cliff.

And here we are. Time, once again, for a crash. Except, this time, it’s global. Except, this time, it’s affecting the entire web of life our own lives depend on. The planetary ecosystems are devastated; some are dying. Our fellow creatures are disappearing forever. The soils that feed us are blowing away and turning into desert. There are invisible poisons everywhere, in the air we breathe, in the food we eat, in mothers’ milk. Clean water has become a rare commodity. Oceans are chock-full of garbage. Pathetically enough, the aggrandizers are losing their touch: jobs are vanishing at a time when people depend on them for their entire livelihoods. A stain of misery seeps across the anguished blue planet.

Our leading aggrandizers, of course, are not paying attention. It is one of the privileges of the elite: not having to listen to the peons. Not having to listen to bad news. Not having to face feedback that is simply inconvenient to their plans and schemes, inconvenient to getting even richer and more powerful. One of the cherished perks of being rich and powerful is ignoring anyone who isn’t. Why not continue to live in a bubble and pretend that the bubble that’s lasted so long is permanently impervious to reality?

The Earth is running out — out of minerals, out of peoples and places to exploit, out of space for waste, out of patience. And the teetering tower of complexity, having reached the point of diminishing returns, stirs deep memories of quite another lifeway. Our species knows how to handle hardship and austerity — this knowledge is part of our genetic endowment. When resource conditions worsen to the point that aggrandizing behaviors again pose a threat to community and survival, humans set down tight limits on greed and narrow self-interest. I reckon we are about there. Time for the second egalitarian revolution, don’t you think?


Originally, I planned two major posts summing up in detail the history of our species. Unfortunately, it turned into a big slog. I left the project a few years back, unfinished, and it would require several months of dogged research now. My life is too unsettled at the moment to allow that. But at the same time, it is impossible to sally forth into deeper explorations of early agriculture and social complexities without at least sketching a map of our “true history” — true, in this case, meaning a clear focus on the full span of our time as the species H. sapiens, not more, and not less.

Somebody ought to write a beautiful coffee table book, showing vividly the utter awesomeness of the Paleolithic world where megafauna roamed free, humans were just one species among many, and elephants were the “lords of creation” and doing an excellent job of it! An eye-opening and radicalizing bit of time travel it has been for me. So, here is a quickie, to share what I’ve discovered. Caveat: this is my own synthesis; others may disagree with some of the details; there is little in deep history that is not contested…

  • Curtain opens at about 200,000 years ago, as the world heads into another ice age. Sapiens in southern Africa; Neanderthals in Europe and northern Asia; several other descendants of erectus in southeast Asia. Humans talk, use fire, hunt, cook, and make rafts, fire-hardened spears, and simple stone tools.
  • Sapiens love to inhabit caves near rivers or the ocean; a number of them have been excavated and described in southern parts of Africa. Humans thrive in small egalitarian bands of 20 to 40 people; very local trade exists between bands.
  • Ice age comes to an end around 130,000 years ago, and for a while it’s quite hot. The vast majority of human artifacts from this interglacial come from the Neanderthals. Artifacts get more interesting. Humans love ochre and other pretty rocks. They invent fancy glue, make composite tools (wood and bone), fish hooks, bury their dead.
  • The climate cools again toward another ice age. The massive Toba eruption (c. 71,000 ya) causes a six-year winter, and sapiens barely escape extinction.

  • About 60,000 years ago, descendants of erectus float or sail to Australia, and sapiens begin moving out of Africa.
  • 50,000 years ago… many more tools, much improved; something is happening to sapiens brain, enabling a cultural shift into greater complexity of both language and artifacts. Art becomes common. Flutes. Sewn clothing. Conscience emerges.
  • Sapiens are coexisting and occasionally mating with Neanderthals in Europe, until 25,000 years ago. Pockets of humans survive the ice age at higher latitudes in refugia where megafauna is particularly plentiful. In these spots, culture flowers, tools are finessed, caves are painted, rituals are performed. First child-dog bond in evidence some 33,000 years ago. America discovered and begins to be settled.
  • R.I.P. our Neanderthal cousin
  • Ice age maximum reached at 20,000 years ago. The cold drought kills perhaps 90% of humans in Australia. Abrupt warming fosters flourishing sapiens cultures in Europe and the Near East; horses and reindeer are actively cared for and seeds sown. Pigs domesticated by Anatolian foragers around 13,000 ya. Inequalities begin to emerge in some bands. Resurgence of ice during the Younger Dryas period (c. 12,900 ya to 11,700 ya). The construction of monumental Göbekli Tepe begins.
  • 10,000 years ago, a warm, moist world of plenty; in a few areas, humans begin to settle down and build more permanent shelters and walls; cultivation of plants and animals intensifies, and populations grow. Some human groups transition from egalitarian to Big Man (transegalitarian) social structures. First towns (and regional proto-civilizations) emerge in the Near East; people flock there voluntarily; peace and relative equality reign. First regional environmental collapses resulting from human activity are experienced toward the end of the Neolithic.
  • 6,000 years ago, first transitions to advanced metallurgy, bronze weapons, domination, and war. The very first incarnation of “this civilization” emerges in Sumer. Women are actively marginalized, social stratification increases, and health and longevity deteriorate for those lower on the pecking order. Non-civilized tribes begin to be pushed out. Wholesale slaughter of regional megafauna emerges as a status sport. Amazing art and devious cruelty advance apace.
  • First brutal empires (Akkad, Babylonia, and Assyria) emerge about 4,000 years ago. War and standing armies assume a menacing presence in a few places. But most areas of the globe continue to be settled by egalitarian or transegalitarian tribes (and remained so until recently). Sahara forms (without human help).
  • By 2,000 years ago, many societies continue to intensify and great religions emerge and manage to modify somewhat the brutality of the age of empires. Civilized humans preen as rational beings and lords of creation and begin to take over everything they can reach. Writing spreads. So do plagues. Mathematics, science and frequent technological breakthroughs start to make a difference in the human condition. Oceania settled by intrepid explorers in outrigger canoes.
  • 250 years ago, industrial civilization’s “Satanic mills” move into “mow down the living planet” mode, encourage out of control human reproduction, and filthify everything. Last autonomous tribes on the way out. Planet increasingly devastated. At the same time, some humans reap unprecedented benefits — including longer life-spans — from advancing understandings of science and technology. Ideology of progress and sharing the pie quells unrest. Then, within the space of a few decades, this civilization begins to show serious cracks. Elites keep their heads firmly wedged, er, in sand. Humans are, overall, increasingly well-connected, educated, stumped, and suffering from multiple addictions. Will they survive?


A properly socialized individual had a powerful sense that the wild world was feeding him, and he ought to be as grateful and as anxious to act decently as he would to any human who fed him out of sheer kindness.
— E.N. Anderson, Ecologies of the Heart

People intuitively view agriculture as the root of domination because intensifying food economies made possible large surpluses which could then support elites and their servants. As indeed they did. But the link with agriculture is conditional.

Certain well-endowed economies (whether foraging, horticulture, field agriculture, or grazing) make large surpluses possible. But they do not make them inevitable. Food harvests — of any kind — do not lead to surplus unless the people in question decide to produce it. Given that humans generally have better things to do with themselves than toil, they tend to work as little as necessary to cover their food needs, plus a little extra for the winter or an upcoming celebration. If they planted a field of rye and it produced twice as much as they expected, they’d be likely to plant half as much next year and spare themselves the extra work. If salmon or anchovies are particularly plentiful this year, why not kick back and enjoy the easy life?

And indeed, there is a great deal of evidence that “agriculture does not automatically create a food surplus. We know this because many agricultural people of the world produce no such surplus. Virtually all Amazonian Indians, for example, were agricultural, but in aboriginal times they did not produce a food surplus. That it was technically feasible for them to produce such a surplus is shown by the fact that, under the stimulus of European settlers’ desire for food, a number of tribes did raise manioc in amounts well above their own needs, for the purpose of trading.” These tribespeople went back to underproduction when their trading needs were satisfied.

Even the simplest foragers often produced some subsistence surplus. They were, however, not exercised much by planning ahead, and often blew through the entire cache at a midwinter feast, going hungry shortly thereafter, trusting that the world would provide. Many anthropologists noted that strictures against taking “more than you need” were extant in these societies.

Boreal Algonquians expected intermittent periods of hunger during the winter, and these fasts—and even the possible threat of death—were preferable to the planning and labor entailed by food storage. The definition of the resource situation was one in which animals were ordinarily available and hunger a predictable, endurable, and usually transient aspect of the winter round. It is precisely in this arbitrary weighting of risk aversion and optimism that the operation of the cultural logic of Cree labor is specifiable. The costs of the labor, always potentially superfluous, entailed in storage was reckoned disproportionate to the reliability ensured by the surplus. Before approximately 1900, boreal forest Algonquians often fasted and sometimes perished for lack of food. These tragedies would have occurred less frequently if more intensive food storage had been practiced. Experiencing long-term game shortages as though they were new instances of transient scarcity, the Algonquians continued, with some concessions, “to let tomorrow provide for itself.” The decision to store less and starve more (or, among Chipewyans, to store more and starve less) was not objectively determined by the Canadian Shield ecosystem, the limits of the technology, or caloric efficiency. The paradox of the starving Montagnais consuming all their preserved eels in autumn feasts is a particularly forceful example of the meaningful construction of utility, efficiency, and the entire structure of foraging labor and consumption. This skepticism toward advanced planning and reliability is not limited exclusively to foragers. Audrey Richards’s (1932) classic monograph on the Bemba is a detailed exposition of an agricultural society whose members preferred transient hunger to what they deemed excessive labor.

To broaden the areal focus, comparable practices existed even in a “delayed return” foraging society like the Alaskan Koyukons who occupied sedentary winter villages provisioned by preserved fish and caribou meat. According to Sullivan (1942), the Koyukons sometimes disposed of their stored foods during lavish feasts in late summer, midwinter, and early spring. The midwinter feasts, in particular, sometimes occasioned hardship if hunting was unsuccessful, but they continued into the present century. The Koyukon feasts pose the same paradox as the Montagnais: the surplus was accumulated and preserved but then consumed, precluding its use to level fluctuations in the long term. Murphy (1970:153) described among the Brazilian Munduruçu “the hunter’s glut, an abundance of meat that had to be consumed before it spoiled, and the men stayed at home because further hunting would have been a crime against the game and because they had to apply themselves steadily to the serious business of eating.”

These subsistence surpluses hedge the bets of survival a little; much of the time, though, simple (or “immediate return”) foragers only get enough to eat for the next several days. Surplus that goes beyond subsistence is a luxury good. Since it is above what the community needs, it can be traded, or given away, and no one is the worse off. It is not the little extra a community needs to weather a winter or to set aside seed for spring planting. That “little extra” is needed for survival and cannot be derailed toward optional undertakings. Luxury surplus is the kind that can support elites.

The extant records, like the ones quoted above, show that even the most basic subsistence surpluses were the result of choice. All the more so, then, were luxury surpluses a matter of choice (within forager, horticultural, and agricultural economies alike). They cannot be the automatic result of the agricultural way of life. There will be no surplus, no matter how abundant the land, unless the people in question decide to override their culture’s disapproval, begin taking more than they need, and devote much more effort to storage techniques. And it appears that the first people who chose to produce luxury surpluses were very ancient complex (or “delayed-return”) foragers. Brian Hayden has this to say:

From all the indications that prehistorians have gathered, it appears that humans have existed for well over 2 million years in a state of relative equality. It is possible to perceive the glimmerings of some changes toward socioeconomic inequality around 50,000 years ago. These changes became more pronounced in some areas about 30,000 years ago, and then became especially dramatic and widespread after about 15,000 years ago.

The shift toward socioeconomic inequality is not tied to food production, but occurred well before agriculture emerged. At the end of the Pleistocene, these changes occurred independently in a number of different areas of the globe. Thus the emergence of significant inequality followed a pattern that is strikingly similar to the emergence of food production, but preceded it by many millennia. (Richman, Poorman, Beggarman, Chief, 2007)

There we have it. The root of domination lies in the Paleolithic, deep in forager world.

Gravettian man

Our human forebears everywhere did not just passively gather food and basketry materials but actively tended the plant and animal populations on which they relied. There was no clear-cut distinction between hunter-gatherers and the more “advanced” agricultural peoples of the ancient world. Moreover, California Indians had likely completed the initial steps in the long process of domesticating wild species…
— Kat Anderson, Tending the Wild

In Agriculture: villain or boon companion, I argued that we sapiens have been cultivators since time immemorial, that a combination of foraging and cultivation is a sensible, durable way of life that has served us well, and that the “origin of agriculture” is really the intensification of cultivation that becomes visible in the archeological record.

I have since been stymied in my quest for clearer understanding by the ongoing insistence of some folks to paint agricultural cultivation into a corner as a disastrous turn for humans and the root of our present troubles. They point to foraging and horticulture as modes of food production that avoid the damage agriculture has brought about. I wanted to test this claim.

It became quickly apparent to me that one does not need agriculture to intensify and produce an increasing surplus. For example, the rich salmon-and-candlefish-based economy of the Kwakiutl provided plenty of surplus to support elites and even to motivate slavery. Foragers are said to live in harmony with their environment, to keep their populations low and their hierarchies flat (if any). Unfortunately, it ain’t necessarily so. There are compelling data showing that the Australian aborigines wreaked continent-wide devastation with their use of fire on a highly vulnerable landscape, degrading the vegetation, causing massive runoff and loss of soil during monsoons, and eventually precipitating a change in climate for the worse. While in North America the native tribes may have had but little to do with megafauna extinction, not so in Australia. The human-precipitated change of vegetation deprived the largest and most specialized browsers of adequate food, and they began to disappear not long after the arrival of humans, some 45,000 years ago, along with their marsupial predators. That should hardly be surprising, as the same story repeated many millennia later with the colonization of Far Oceania. For example, in New Zealand, the South Island Maori, former horticulturists who returned to foraging as more suited to that environment, slaughtered the moas and other vulnerable creatures in an orgy of gluttony, only to turn on each other when protein ran low. The populations of both aborigines and Maori fluctuated according to food availability. Some of the tribes lived in hierarchical societies.

It has also been claimed that horticulturists for the most part remain egalitarian and lack despots, armies, and centralized control hierarchies, and have built-in constraints against large populations and the hoarding of surplus. Nothing could be further from the truth. There have indeed been some horticulturists who remained egalitarian, chose to limit their population when it was getting out of hand, and whose gardens and edible forests left the soil and ecosystem in good shape. The small island of Tikopia comes to mind. But they seem no more common than those horticulturists (such as the Easter Islanders and many others) who pillaged their new island homes, wiping out much of the native flora and fauna and permanently degrading the living environment. The horticulturists who settled Far Oceania were generally rigidly ranked peoples whose chiefs extracted a goodly portion of the harvest, waged wars on neighbors, built fancy tombs and megaliths, and occasionally came close to state formation. The puzzle of intensification cannot be sidestepped by a reference to a golden age of horticulture.

Still, it bears stressing that many — perhaps most? — ancient forager/cultivator societies coexisted very well with their landbase. For example, the Moriori, cousins of the Maori, also switched to settled foraging on Chatham Islands, and were such careful stewards of their environment that seal colonies flourished within a stone’s throw from their villages. They lived notably egalitarian lives and carefully controlled their population. Until they were wiped out by the Maori, they were an impressive example of cool temperate region people living in close symbiosis with their ecosystem.

The illuminating and well-researched book Tending the Wild documents various Indian tribes who were also, by and large, careful stewards of their coastal California homelands. “They were able to harvest the foods and basketry and construction materials they needed each year while conserving — and sometimes increasing — the plant populations from which these came. The rich knowledge of how nature works and how to judiciously harvest and steward its plants and animals without destroying them was hard-earned; it was the product of keen observation, patience, experimentation, and long-term relationships with plants and animals.” Living amid a natural environment as abundant as that of the Kwakiutl further north, they did not succumb to ongoing intensification, and continued to share any accumulated seasonal surpluses. Why did the Kwakiutl intensify, while their close neighbors to the south, the coastal Yurok, did not?

I conclude that neither the foraging nor the horticultural mode of food production is by itself a guarantee against ongoing intensification and the eventual damage it brings. There is a streak of persistent idealization of the forager and simple horticulturist among primitivists and other uncivilization-minded people. Slavery might be reframed as “captivity,” environmental damage rationalized, potlatches celebrated as evidence of gift economies rather than economic warfare, and discussion shut off. Surely it’s not necessary to ostracize people who point out the facts on the ground and the need for a rethink? After all, egalitarian forager/cultivators do show us that this particular mode of existence — so successful and durable during most of our species’ history — functioned mostly within the ‘Law of limits’ that allows ecosystems to thrive.

Below is an artist’s portrait of the California flightless diving ducks. They were finally driven extinct by the Indians who could reach Catalina Island by boat. But… it took them 8,000 years to do it.

flightless duck

The rise of pristine states would appear to be best understood as a consequence of the intensification of agricultural production. Any increase in the quantity of soil, water, minerals, or plants put into a particular production process per unit of time constitutes intensification.
— Marvin Harris

A great and absorbing book. Very much recommended. I figure it must be one of the essential sources used by Daniel Quinn when he wrote Ishmael.

Marvin Harris was a well-known anthropologist and author who nearly cracked the puzzle I have been writing about: what is the root of domination, and with it, the root of our dysfunctional civilization? He writes (it’s so clear and good, I quote at length):

In most band and village societies before the evolution of the state [er, this civ], the average human being enjoyed economic and political freedoms which only a privileged minority enjoys today. Men decided for themselves how long they would work on a particular day, what they would work at — or if they would work at all. Women, too, generally set up their own daily schedules and paced themselves on an individual basis. There were few routines. People did what they had to do, but the where and when of it was not laid out by someone else. No executives, foremen or bosses stood apart, measuring and counting. No one said how many deer or rabbits you had to catch or how many wild yams you had to dig up. A man might decide it was a good day to string his bow, pile on thatch, look for feathers, or lounge about the camp. A woman might decide to look for grubs, collect firewood, plait a basket, or visit her mother. If the cultures of modern band and village peoples can be relied upon to reveal the past, work got done this way for tens of thousands of years. Moreover, wood for the bow, leaves for the thatch, birds for the feathers, logs for the grubs, fiber for the basket — all were there for everyone to take. Earth, water, plants, and game were communally owned. Every man and woman held title to an equal share of nature. Neither rent, taxes, nor tribute kept people from doing what they wanted to do.

With the rise of the state all of this was swept away. For the past five or six millennia, nine-tenths of all the people who ever lived did so as peasants or as members of some other servile caste or class. With the rise of the state, ordinary men seeking to use nature’s bounty had to get someone else’s permission and had to pay for it with taxes, tribute, or extra labor. The weapons and techniques of war and organized aggression were taken away from them and turned over to specialist-soldiers and policemen controlled by military, religious, and civil bureaucrats. For the first time there appeared on earth kings, dictators, high priests, emperors, prime ministers, presidents, governors, mayors, generals, admirals, police chiefs, judges, lawyers, and jailers, along with dungeons, jails, penitentiaries, and concentration camps. Under the tutelage of the state, human beings learned for the first time how to bow, grovel, kneel, and kowtow. In many ways the rise of the state was the descent of the world from freedom to slavery.

How did this happen?

How the heck indeed. Here is Harris’ logic.

  1. Forager peoples were unable to effectively limit their population growth.
  2. Population pressure forced them into intensification of food production.
  3. Intensification of production led to domestication of plants and animals and other aspects of what we see in the historical record as “agriculture”, and sooner or later, sedentary settlement.
  4. Farmers tend to encourage intensification by conspicuously rewarding those who work harder than others (and by creating institutions that do so).
  5. These early rewarded production-intensifiers are known in anthropology as the Big Men (or lately as aggrandizers). They specialize in getting people to work harder, and in redistributing the resulting bounty via feasts and ritual celebrations. They accumulate followers and renown, rather than wealth. Production-intensifiers are the new cultural heroes, the community benefactors and “great providers”; they are the people to whom power flows and who are readily given leadership roles.
  6. The Big Men build exclusive club houses for their male followers, where they reward them with prostitutes and copious amounts of delicacies. It is not much of a step to begin diverting some of the wealth to equipping and training these men as warriors, and leading them into war parties where booty provides further rewards.
  7. Under certain conditions, amidst growing imbalance of power between ordinary producers and redistributors, these Big Men gradually set themselves up above their fellows, skim off more and more of the surplus that flows through them for self-aggrandizement, image-building, and solidifying their monopoly over coercion, and become the original nucleus of the ruling classes of the first states. And so, as Harris notes, we end up with a system where “those who worked hardest and kept the least became those who worked the least and kept the most.”

Makes a lot of sense to me. Marvin Harris’s writings ought to be widely read. He touches on many other topics, including the subordination of women, the origin of war, shifting patterns of human and animal sacrifice, and the origin of vegetarian diets. I was particularly intrigued by his claim that domestication was “the greatest conservation movement of all times,” whereby certain tasty animals were saved from the extinction that surely would have followed from their ongoing overhunting. What a shame they didn’t start with the mini-mammoths!

My disagreement with him is primarily with the starting point of his sequence. The argument regarding forager population control does not hold water. He wrote in the days when “population pressure” was widely regarded as the engine that drove the origin of agriculture, so it is not surprising that he thought this way. His argument goes as follows: forager women’s fertility works out to about 4 children per woman, taking into account their lean body mass, the “contraceptive on the hip” effect, and disease. That’s twice the replacement level. Therefore, foragers had to resort to life-threatening abortions and particularly to infanticide. Nobody likes to kill their own babies. Therefore people would rather work harder and intensify. (In addition, he argues rather ingeniously that it was female infanticide, balanced by the killing of young men in warfare, that kept the population in check.)

I have studied two cultures that were successful in limiting their populations, and while infanticide played a role in one (Tikopia), it did not play a role in the other (Moriori). More to the point, each whole culture was shaped so as to limit population. Even in Tikopia, abortion and infanticide were more of a last resort. Exhortation by the chiefs, peer pressure to remain unmarried and childless, coitus interruptus, marriage customs (only first sons were allowed to reproduce), dangerous heroic sea voyages, and no doubt many other supporting customs combined to keep the population at a steady state. The Moriori, about whom less is known, are said to have used castration of a certain percentage of the boys as their primary population-limiting measure, and since they regarded the killing of other humans with horror, infanticide (or warfare) was not in their repertoire. (There must be much more to the story because, obviously, even one man can impregnate all the fertile women in a small society.) The picture I see is of cultures that wove population limits deep into their cultural fabric. When they were motivated, they had the tools to succeed. This is why I do not accept Harris’s thesis that population pressure started the whole cycle.

I continue to side with Daniel Quinn and others who maintain that more food leads to more people (other things being equal), and not the other way around. It all starts with the intensification of food production. Population pressure is the result, not the cause. Population growth is a function of the food supply. Once intensification of food production got under way, once foragers or cultivators got onto the “more food, more people” treadmill that Quinn describes so ably in his Story of B (B’s lecture on population), the “food race” became a vicious circle.

To finish the chapter in Cannibals and Kings that describes his formidable logic in detail (viz. chapter 7, The Origin of Pristine States), Harris says:

The consolidation of governmental power would have taken place as a series of natural, beneficial and only slightly extra-legal responses to current conditions, with each new acquisition of power representing only a small departure from contemporary practice. By the time the remnants of the old councils sank into impotence before the rising power of the king, no one would remember the time when the king had been only a glorified Big Man whose exalted status rested on the charity of his friends and relatives.

Then, let us remember. Let us make it part of the Great Remembering.

big man

[Fourth part of a series: 1, 2, 3]

This was the tremendous strength of the tribal way, that its success did not depend on people being better. It worked for people the way they are – unimproved, unenlightened, troublesome, disruptive, selfish, mean, cruel, greedy and violent.
— Daniel Quinn

Is domination in our genes? It seems very likely. After all, the bands of our closest primate relatives are “run” by alpha leaders: among chimpanzees, the strongest males dominate the troop; among gorillas, a big male presides over a harem; and among the bonobos, both alpha females and related males wield power in the band. It is therefore highly probable that domineering alpha individuals led the bands of the early hominids. Domination conferred advantages: those who could snatch the most resources and mate with the most females “won” by surviving and passing on their genes. But at a certain point along our evolution our ancestors became radically egalitarian, sharing power and economic resources among all members. They lived as near-equals, had direct access to food and basic necessities, enjoyed modest affluence along with freedom and leisure, and refused to tolerate grabs for power, wealth, and prestige. This successful and durable adaptation is documented not only by archeological evidence but also by ample ancient and recent ethnographic accounts of “primitive” societies. [A sampler of links: on human reciprocity and its evolution, on the Batek people, and on tribal egalitarian ways.]

How did this transformation come about? Here is the argument. Our distant ancestors, just like chimps have been observed to do, chafed under the rule of the alphas. Nobody likes to be bullied on a regular basis. Nobody likes to have their food stolen by the bigger fellows just because they can. While rank and file chimps put the kibosh on their alphas only occasionally, stone age hominids figured out how to do it so regularly and thoroughly that a new social system was born. This is such an important and surprising development that we may speak of an egalitarian revolution.

Humans are unique among animals in cooperating in large groups of unrelated individuals, with a high degree of resource sharing. These features challenge traditional evolutionary theories built on kin selection or reciprocity. A recent theoretical model … takes a fresh look at the ‘egalitarian revolution’ that separates humans from our closest relatives, the great apes. The model suggests that information from within-group conflicts leads to the emergence of cooperative alliances and social networks.
Understanding the “Egalitarian Revolution” in human social evolution

The conjecture has it that it happened when our ancestors became communicative enough to form discreet coalitions, well enough armed to easily threaten or kill an upstart, and motivated to fairly share the meat needed for their growing brains. Nobody knows how long ago this may have been. Computer models have shown that the change may have occurred quite fast, within a few generations. We do know that big-game spears date back at least 400,000 years, that the later Homo erectus had a large brain, and that hunting is probably far older than had been thought. Some anthropologists put the egalitarian revolution at perhaps 100,000 years ago, but allow that it may well have happened much earlier. Others go back as far as 2 million years, to the beginning of the Paleolithic. I am taking the liberty here of assuming, not unreasonably, that we sapiens entered our speciation in the egalitarian mold.

Before 12,000 years ago, humans basically were egalitarian. They lived in what might be called societies of equals, with minimal political centralization and no social classes. Everyone participated in group decisions, and outside the family there were no dominators. Rather often the egalitarianism of hunter-gatherers pertains more to males than females, but the women enjoy far more political potency than did the women of Athens, and these mobile foragers kept no slaves. Their highly equalized version of political life goes far back into prehistory…
Christopher Boehm, Hierarchy in the Forest

For thousands of generations since the egalitarian revolution, we lived in small bands where the many set limits over the few for the benefit of all. The betas put an effective check on the alphas by wit, wisdom and alliance. Aggrandizing individuals who got out of hand were brought down a peg or eliminated. And so the evolutionary advantage went to the cooperators. In the former alpha-led system the advantage was to the strong, and the weak suffered. In the new system the advantage was to the weak(er), and most did well as a result. This state of affairs required continued vigilance, and an ongoing culture of egalitarian traditions of checks and balances. Our ancestors formed a new status quo that suited evolving human awareness, well-being and conscience better than domination. They came upon a strategy of effectively resisting power abuse by advantaging cooperative, sharing, pro-social behaviors.

This remarkable pattern of “vigilant sharing” saw humans through severe ice ages, intense global warmings and volcanic winters. It saw them through all the hardships our species has suffered in the 200,000 years of its existence, and that’s no small thing. A social system where vigilance against Hyde-ish behaviors is coupled with sharing most of the Earth’s bounty confers an evolutionary advantage. During difficult times, tribes that look after each other survive. Those that allow self-aggrandizing alphas to rise into dominance and hoard resources are at a survival disadvantage. After all, those human bands where some gorged on meat while others starved would have, other things being equal, done poorly in ice age competition with other groups whose members were all relatively well fed, or in coping with the hardships of a frozen, arid world.

There were always failures. Despotic or greedy individuals managed to snatch power for a while and disturbed the equilibrium. But overall this only reinforced the traditions and customs mitigating these weaknesses. Our ancestors did not try to convert human nature into something else. They shrewdly acted on what human nature really was, and cultural evolution did the rest. Displaying the same sharp wit as certain astute American Founders of 200+ years ago, they understood that human society must acknowledge and be shaped around human weaknesses, vices and foibles. They built in checks and balances that curbed the — certain to occur — misuse of power and incipient greed. Their leadership patterns can be described as ad hoc egalitarian meritocracy: people rose into leadership on the basis of helpful qualities, were carefully watched, and were unseated if power went to their heads.

Human beings, after all, are not created equal in ability. It is the responsibility of the community to make sure that ambitious or aggressive individuals don’t overstep the boundaries leading to power abuse, while at the same time giving these naturally advantaged people enough leeway that they may benefit the community through their talents and leadership. It’s a balancing act that requires constant care… like driving a car. All goes well most of the time, because continual vigilance is practiced, and small adjustments are easily and continuously made. If the driver stops paying attention, however, trying to right the situation will probably be hard and painful once the tree approacheth, ready to smack the vehicle. And so also, once a dominant individual or a clique muscles their way into power, the cost of dealing with them can be quite high. Egalitarians understand well that power goes to people’s heads with tedious regularity, that it devolves on the rest of the community to be alert to it, and that it is the responsibility of the weak to curb the strong.

Let’s go back to the time when the ice began to let up, some 17,000 years ago. There had been occasional societies in the European Paleolithic where a measure of economic and political inequality took hold for a time. Nevertheless, the predominant pattern is remarkable. Here we are, egalitarian to the bone. We are sharers, our possessions are few, we are on the lookout for upstarts and hoarders, standing up for the weaker members of the band. We murder each other with unsettling frequency, mostly men killing other men while competing for women. We skirmish against other bands and tribes, but casualties are limited. Occasionally, a despotic individual arises, wreaks damage, and is eliminated. We live within modest abundance, and famines, as well as a great many later diseases, are largely unknown. We are still both nice and nasty inside, but over the last several hundred thousand years have become remarkably nicer in our behavior within the tribe. The underdogs unite to keep the bullies in check for the benefit of all.

Vigilant sharing of power and resources has been the preferred mode of our species’ existence for most of its time on Earth. Did these cultures halt human evil? No; they circumscribed Hyde. And if they could do it, why not us? Finding a way to reconnect with our egalitarian past in the near future seems more and more like the sweetest dream worth pursuing.

Longbottom, at the end of this lesson we will feed a few drops of this potion to your toad and see what happens. Perhaps that will encourage you to do it properly.
— Severus Snape

But wait a minute, you might be saying. [For the first two parts of the series, see here and here.] If our nature is dark and light, then we may as well throw in the towel. If this malfunctioning human system called civilization is simply an outgrowth of who we are, then any other system we create will also be fatally flawed, right? A valid concern; does either theory or history bear it out?

Our shadow side, our dappled human psyche, explains much. It particularly explains the everyday evils stemming from our mistaken or malign intentions and misguided actions. But does it explain enough? Human nature did not abruptly change 6,000 years ago. Yet as history shows, there was a distinct cultural and behavioral break with what went before. Human existence — first in Mesopotamia, then elsewhere — suffered a profound, alarming, and sudden setback, as city-states and then empires rose, wars were institutionalized, human cruelty reached horrifying heights, economies turned to steady plunder, and stratification, slavery and perpetual indebtedness pushed large numbers of human beings into inhuman misery. This civilization, with its dark heart of conquest and domination, was born.

Our species is at least 200,000 years old. A mere 6,000 years ago, unprecedented, massively destructive social systems began to rise. How could this possibly be explained by recourse to human nature? Consider an alternative hypothesis. Let us begin by noting that there are two depths of social evil. There is the moderately greater poverty of some within a community. And then there is obscene destitution in the shadow of a palace. There are raiding parties of a couple dozen warriors clashing. And then there is war. There is the painful and often lethal gauntlet that war captives had to run among some Indian tribes. And then there is the destruction of a city where all men are tortured and slaughtered, all women and children sold into slavery and the fields are salted so nothing can ever grow there again (e.g. the Roman sack of Carthage). There is the petty despot of a chief. And then there is the king or modern dictator. There is the raid against a nearby settlement to steal their goods. And then there is the breach of a dam unleashed against another town to destroy all that live there, as the “civilized” people of Sumer liked to do to their neighbors. It is one thing to capture the children of one’s enemies. Quite another to see to it that “their children were beheaded, flayed alive or roasted over a slow fire,” courtesy of the Assyrians. It is one thing to have imperfect human societies where some levels of antisocial harm are expected. And yet quite another to build social systems that glory in violence, cruelty and plunder.

Suppose we agree that we are neither “basically good” nor depraved and rotten to the core. Our mixed character challenges our evolving conscience, but, left to our own devices, the harm we do is mostly commonplace. Often, we blunder badly. Sometimes, our motives are frankly malevolent in small insidious ways. But the extreme evils listed above cannot be inflicted at the personal or small-group level. They require a socio-economic system that amplifies Hyde.

Societies that ignore Hyde and leave him at large suffer profound detrimental consequences. After all, the Hyde/Jekyll problem is not symmetrical: the damage done by antisocials wounds us all and is often impossible to right. The people killed in wars cannot be brought back, and a ruined landbase may not be able to heal within a timeframe meaningful to mortals. And our Jekylls are always busy cleaning up after the Hydes. All in all, it adds up to Hyde coming out ahead. It can get worse, of course; a culture can amplify Hyde and disadvantage Jekyll to such an extent that it ends up with a social system run off a cliff by psychopaths. But there is a third option: put limits on Hyde, and Jekyll can spend his energies in pro-social undertakings.

The unfortunate Dr. Jekyll is trapped in a paradigm that gives Hyde an advantage, mirroring the way this whole civilization is structured. But it’s not hard to imagine a possible happier ending to the Jekyll/Hyde tale. How about this? The good doctor does not work in secret, looking for fame as a lone genius, but is part of a team of colleagues. These people come to spot him as he drinks the potion, aware that the results might be — shall we say — iffy. When Hyde makes his appearance, they are ready. Safely constraining the dangerous shapeshifter, they contact other allies to issue a warning and urge the development of an effective antidote. When Jekyll reappears, they persuade him to remain under watch just in case a flashback occurs, and keep monitoring the drug’s residual effects over time. And the brew? Lock it up in a safe and leave room for a sequel?

In the original novella by Robert Louis Stevenson, Jekyll gets badly addicted to the heady rush of “being Hyde.” Let a chill pass down the spine as we contemplate our own horror of being strung out on seductive daily doses of vile molochov cocktails… In life, there are no guarantees; Hyde lives in us. He too deserves his due. Would Jekyll’s friends be up to considering Hyde’s needs along with Jekyll’s? Facing him with awareness, wisdom and kindness may tip the balance for society at large.

This alternative telling, it seems to me, illustrates the behavior of a levelheaded society as well. A sane culture bent on long-term survival embodies the understanding that we are all better off if we look out for one another, and that the fruits of human intelligence are just another part of the commons, developed and shared collaboratively. It remembers with particular urgency to acknowledge and set limits on Hyde in tandem with Jekyll’s growing powers. A commonsensical precaution, not requiring extra high levels of intelligence or advanced training, wouldn’t you say? And this is exactly what our Paleolithic forebears proceeded to do.

The problem with people is that they’re only human.
— Bill Watterson

We emerged from the mists of our deep history into awareness as rather appalling, scary creatures, and also as rather wonderful, amazing creatures… something completely new in the world: animals who told stories, who learned to laugh, befriended other species, created new things with their deft hands and spun delightful images with their clever brains. Initially, our powers were small. We lived, more or less, in harmony with the world, like other creatures. We were no more — but no less! — awful and destructive than the hyena or the shark. Like a hyena or a shark, we heedlessly grabbed what the planet offered. But since our powers were small, any damage we did was small. The limits Mother Nature places upon all organisms limited us as well.

Still, we evolved. Any shark or hyena that grew an opposable thumb and a manipulative brain would be a very alarming creature indeed, especially as it began to evade, at least for a time, the bounds nature places on all organisms. I don’t blame humans for becoming more destructive as our capabilities grew. Would any other animal behave differently? As our behavioral repertoire expanded — better language, cooperation, skills of survival, hunting, reasoning abilities, symbols, dexterity — our ability to change the world, to help or harm it, increased apace. We became very good at survival, and very good at destroying what stood in our way. Any large predator – had it evolved such abilities – would become a very dangerous creature indeed, to self and others. Jekyll’s powers grew. How wonderful. But so did Hyde’s. How very, very inauspicious.

As our capabilities expanded, humans began to cause significant damage to certain parts of the planet. We damaged a large part of the continent of Australia and its climate through fires. Imagine: small roaming bands of humans equipped only with simple stone age tools managed to bring ruin to an entire continent! We probably had a hand in wiping out our cousins, the Neanderthals, and perhaps other descendants of erectus as well. Our greedy hunting methods included mass stampedes of hundreds of animals over cliffs and into cul-de-sacs when only a few of their bodies could be used. With a variety of improved hunting strategies, we began to have significant impact on certain animal populations, and likely contributed to the extinctions of the Upper Paleolithic. We certainly caused great devastation much later as the outlying Pacific islands were settled or used as larders by passing sailors.

Picture leaving a few pairs of predators — say, cats — on pristine Easter Island. They would have multiplied, wiped out the naïve fauna in relatively short order and collapsed, leaving an impoverished island behind. Just the way the Polynesians did it. We think that humans should know better. But the cats’ predicament is our predicament too. Even today, with all the bells and whistles of modern life, we are not good at dealing with the future staring us in the face. We find ourselves just as unable to modify our destructive behaviors as did the hapless Easter Islanders.

Face to face with a more realistic assessment of human nature, is pessimism or cynicism called for? I don’t see it that way. I do think evolution has saddled us with a problem that calls for a great deal of caution. Our shadow side will not magically disappear through therapy, religion, bootstrap evolution, self-discipline, or the 12 steps. Nor will it disappear by sloughing off civilization. This is who we are: dangerous, amazing, limited human animals. We must face what is in terrible glory inside us. To let our heart be broken by who we are. To know, to surrender to the truth, and to find peace. Then we can quit tearing Mother Nature to pieces in revenge for having made us so imperfect, so “fallen.” Then, we can finally stop destroying each other and the planet we love.

If we believe in the fundamental goodness of man, we are doomed.
— Dr. Robert Hare

We may as well start with some very bad news, and get it out of the way. We humans are naturally violent, acquisitive, greedy, negligent, aggressive, destructive, petty, mean, self-centered, and sometimes abysmally foolish. And now for the very good news: it’s also in our nature to be peaceful, giving, generous, caring, gentle, creative, broad-minded, kind, altruistic, and sometimes profoundly wise. We are domineering, yet we long for equality.

Mother Culture of the so-called progressive worldview vigorously disagrees. As a reaction to the often knee-jerk blaming of human nature for the failings of civilization, many of us moderns bought the other side of the coin. Haven’t we been told by the various gurus of enlightened 21st-century thinking that human nature is “basically good”? Like some anxiety-management self-help circle, we indulge in endless mutual assurances that I’m OK and you’re OK. But the façade of “goodness” crumbles rather quickly under the critical gaze of those who lose faith in the ready blandishments. “There are more and more factors beginning to push us out of the comfortable pew where we mostly once worshiped our species, our ‘leaders’, our civilization, our perception of unlimited human capacity and entitlement and manifest destiny.” Indeed. And along with the worship of our species goes the often uncritical defense of the species’ nature. These particular worshipers fish around for evidence that our primate cousins are gentle giants, that our paleolithic ancestors lived non-violent lives, that hunting and omnivory were really somehow imposed upon us mild-mannered fruit-eaters, and that human aggression is really learned — not innate — and can be erased with another kind of learning.

When people argue on behalf of benevolent human nature, the argument often takes this informal shape: It is quite evident that most of us behave in fairly innocuous ways most of the time. But look at all the horrible things people have done – now a list of genocides, tortures, and other ghastly deeds emerges – that is not us, is it? The Hitlers of this world are caused by… culture, stress, poor upbringing, perhaps even innate pathologies. But that’s not us! See, most humans are basically good. Such an argument rests on a false dilemma: either we are basically good, or we are genocidal maniacs and perverts. There is a third possibility: that we are both good and bad in fundamental common measure. And this point of view, called by some social scientists “the ambivalence model of human nature,” is the keystone of my own understanding. I used to believe otherwise. I once defended vigorously the “basically good” point of view. But events in my own life — in my own behavior! — eventually prompted me to take a harder look.

I now accept a different argument, one rooted in the evidence of primitive tribes. Their profound egalitarianism, radical sharing, steady emphasis on social harmony, and the rarity of serious armed conflict rightly astound the modern mind. But it would be a romantic misdirection to claim that greed, violence or power abuse is absent among them. Hiding one’s kill from others, shirking common work, eagerness to inflict severe damage on neighbors, and upstartism have been documented time and again, even among remote or newly contacted tribes. Significant levels of violence — mostly among males competing for females, and in skirmishes between bands — have been recorded in most primitive societies.

What is the evidence from our far-ancient ancestors and other primates? An erectus find displays the remains of a human being who had been scalped and whose eyes had been gouged out. There is evidence of interhuman violence, including human sacrifice, in cave art and Upper Paleolithic remains. And a massacre from about 12,000 years ago shows half of a small settlement dispatched by human weapons. Chimpanzees have been observed to terrorize and kill other chimps. It has finally been understood that intraspecific violence is common among animals, including our closest primate relatives. We are no different.

It is the propensity for killing that allows both chimps and humans to be such good hunters. Bonobos were said by eager romanticizers a while back “to have lost the desire to kill.” But careful study shows bonobo females organizing themselves into precise, coordinated, swift and deadly hunting bands as they go after monkeys. It is hard to believe we would have evolved into fierce predators had there been no biological basis for it.

And then there is cannibalism. Well documented among the erectus, Neanderthals, and sapiens, it presents a picture of our nature many of us would prefer not to know. But the evidence cannot be ignored. Both long-ago ancestors and more recent tribal peoples hunted fellow humans as prey. Eating one’s fellows out of dire hunger, reproductive reasons, and cage confinement is not uncommon in the animal kingdom. But gastronomic cannibalism, the hunting of one’s own kind in plentiful times for food is far more unusual. We stand in the company of bull frogs, scorpions, king cobras, sharks, and our primate cousins, the common chimps. Isn’t that alone something to gag on?

Benevolent, us?! Trees are benevolent beings. We are not. Besides, any animal species has it in its power to wreak a lot of damage on Earth by overbreeding, overtrampling, overkilling and overconsuming. This is true from bacteria all the way to mammals. It is true of us.

The dark and light nature of our species was vividly portrayed by that classic of a film, Dr. Jekyll and Mr. Hyde (1931). Dr. Jekyll, a noble humanitarian, develops an elixir that — he hopes — will improve upon human nature. He tests his potion on himself and morphs into the hairy, coarse, nasty Mr. Hyde who goes off on a rampage. The story ends badly. To be rid of Hyde, the world must be rid of Jekyll. The Jekyll/Hyde metaphor is a powerful reminder of the underlying light-and-shadow that lives in ambivalent, dappled symbiosis in all of us.

Where once humans were blamed for the imperfection of civilization, turning it upside down blames civilization for the imperfection of humans. “It is the psychotic demands of civilization that have created these very troubling forms of social disintegration along with the weakness that haunts individuals in their complicit acquiescence, in their enslavement to these urban walls and the psychopathologies they generate.” Human evils are symptoms of stress-related mental illness caused by our culture. If that is true — and the project of Enlightenment has believed it to be so — then all we need is to shuck off the burden, heal, and enjoy plenty of freedom. More freedom! How sweet it rang in the French revolution. How sweet the sound in all the propaganda for modernity. But if human nature is dark and light, then more freedom for Jekyll will always and inevitably lead to more freedom for Hyde… and that seems like a singularly bad idea.