Baishiya Karst Cave, where the fossil was found, is both a Buddhist sanctuary and a tourist attraction.
Dongju Zhang/Lanzhou University
Back in July 2016, I was in Spain’s Balearic Islands for a short holiday when, coming back from a dive, I found an email from China waiting for me. The email asked my opinion about a strange mandible discovered on the Tibetan Plateau, in a place called Xiahe, China. At the sight of the pictures on my screen, my heart jumped. The fossil was quite complete and clearly nonmodern. Six weeks later, the archaeologist Dongju Zhang visited me in Leipzig, Germany, to discuss our collaboration. Before September ended, I was in Lanzhou, a town on the Yellow River at the foot of the Tibetan Plateau, to see the fossilized jaw in person. With a team of specialists, we were embarking on a remarkable adventure.
That jaw turned out to be Denisovan, a result we published today in the journal Nature. It is the first fossil of this elusive branch of hominins found outside of Siberia’s Denisova Cave, and it provides critical clues to what Denisovans looked like. With the jaw and a new molecular technique, we and other researchers can begin to identify as Denisovan other fossils that are already in collections.
Denisovans are one of the most mysterious groups of hominins. The research community’s fascination largely results from the extraordinary conditions of their discovery. In 2007, a team led by evolutionary geneticist Svante Pääbo from the Max Planck Institute for Evolutionary Anthropology in Leipzig identified Neanderthal DNA in fossil bones coming from Okladnikov Cave in the Altai region of southern Siberia. Neanderthals had not previously been found so far east in Eurasia, and this discovery intensified the quest for ancient DNA throughout Siberia. These efforts led researchers to the Denisova Cave, where ancient DNA was also found. I was part of the group that published those results and, 10 years later, I still vividly remember the excitement of that discovery: the DNA from the Denisova Cave was not Neanderthal DNA—it was DNA of “something else.”
The team was able to identify this partial mandible as Denisovan by analyzing its degraded proteins. Dongju Zhang/Lanzhou University
This “something else” was revealed to be a sister group that had split from the Neanderthals 450,000 years ago: Denisovans. The Denisova Cave offers outstanding conditions for the preservation of ancient DNA, with an average annual temperature close to zero degrees Celsius. Unfortunately, it has been more often occupied by carnivores than by hominins, and the bones found in the site are rarely bigger than a thumb, so the anatomy of Denisovans has remained elusive.
I had long suspected that Denisovans represented a substantial portion of the already-known Chinese fossil record; they just hadn’t been identified. Researchers know that traces of Denisovan DNA are found in people today all over East Asia and, to a greater extent, in Australia and Melanesia. Most likely, modern Homo sapiens moving into East Asia between 80,000 and 40,000 years ago interbred with Denisovans, including in places much further south than Siberia. But this has been impossible to prove. No Denisovan DNA has been extracted from existing fossils outside the Denisova Cave, largely because warmer environments do not preserve DNA for very long. And we couldn’t connect specimens from Denisova Cave to other fossils based on how they looked, because the Denisova remains offered too little anatomical information for comparison. The Xiahe jaw, we hoped, could bridge the gap.
A couple of years ago one of my Ph.D. students, Frido Welker, demonstrated how ancient proteins, which can be preserved much longer than DNA, could, in the absence of DNA, be used to map out hominin groups. From the genome sequencing of Neanderthals and Denisovans, it is possible to predict the structure of these proteins and build a hominin family tree. The Xiahe mandible did not contain ancient DNA, but its teeth did yield degraded proteins. In 2017 we extracted these proteins—the first time this has been done from an archaic Chinese hominin—analyzed them, and found that they sit on the same branch of the hominin family tree as the specimens from the Denisova Cave. This was the kind of Eureka moment scientists have at most a few times in their lives.
This Xiahe jaw arrived on the scientific stage after a long journey. In 1980, an anonymous monk recovered the specimen while visiting the Baishiya Karst Cave, near Xiahe, to pray and meditate. Local people used to grind the “holy bones” collected in this cave to use them as medicine; this one, maybe more precious because it was clearly human, escaped destruction. Instead, it was offered to the Sixth Gung-Thang Living Buddha, who later passed it on to scientists at Lanzhou University. It was only in 2010 that a team from that university, led by paleoclimatologist and Quaternary geologist Fahu Chen, could start to investigate the Baishiya Karst Cave and its surroundings. Then, eventually, came that startling email to me in 2016.
We no longer know exactly where in the cave the jaw bone was found. But a crust of carbonates covering the fossil has been dated, using isotopic chemistry, to 160,000 years before present. That’s 120,000 years older than any archaeological traces of humans in the region. The Baishiya Karst Cave is more than 3,000 meters above sea level, at the foot of an impressive white cliff facing south toward the Ganjia Basin. It is a huge, dry, and relatively warm cave—a good place to live, especially during glacial episodes like the one developing 160,000 years ago.
Dongju Zhang (top right in the trench) leads an excavation in Baishiya Karst Cave in 2018. Dongju Zhang/Lanzhou University
In 2016, Zhang was finally allowed to start an archaeological survey inside the cave, which is a Buddhist sanctuary. She discovered stone artifacts and eventually began a more systematic excavation in 2018. Her team has so far found a large number of stone tools and animal bones with cut marks. These artifacts will provide invaluable information about how Denisovans lived and adapted to the environment on the high Tibetan Plateau.
The morphology of the Xiahe jaw is reminiscent of that of other Eurasian Middle Pleistocene hominins. As I hoped, this fossil now lets us say that other Chinese specimens, in particular an archaic mandible found off the shore of Taiwan and reported in 2015, most likely also belong to the Denisovans. Researchers have been searching for a long time for a fossil that can be used to “diagnose” Denisovans. A piece of skull was recently found in Denisova Cave, but it was too small to use to identify other fossils. The Xiahe mandible is complete enough to now revisit the rich Chinese hominin fossil collection and identify other fossils as Denisovan, even without DNA evidence. I have no doubt that in the future the sequencing of ancient proteins will complement these morphological analyses.
But the most extraordinary aspect of our findings, in my opinion, is the demonstration that such archaic hominins could successfully live in this challenging high-altitude environment, more than 120,000 years before modern H. sapiens settled on the Tibetan Plateau. It seems that a gene variant that helps modern populations on the Tibetan Plateau to adapt to high-altitude hypoxia was inherited from these Denisovans.
A new phase in the deciphering of human evolution in Asia has begun. Human evolution in this part of the world is much more complex than was thought; the simplistic model of a local and direct evolution from Homo erectus to present-day Asians needs to be abandoned. And from the Baishiya Karst Cave, there will surely be more discoveries to come.
Homo luzonensis’ teeth are unusual in that the premolars (two teeth on the left) are relatively large, while the molars (three on the right) are smaller than other hominin molars.
Callao Cave Archaeology Project
This week, anthropologists working in the Philippines unveil new fossils that they say belong to a previously undiscovered species of human relatives. The fossils come from Callao Cave, on the northern island of Luzon, and are at least 50,000 years old.
The team, led by Florent Détroit of the National Museum of Natural History in Paris, has named the new species Homo luzonensis after the island where it lived.
With only seven teeth, three foot bones, two finger bones, and a fragment of thigh bone, the set of Callao fossils doesn’t give much to go on. Their small size is reminiscent of Homo floresiensis, the tiny-bodied species discovered in 2003 on the island of Flores, Indonesia, that lived around the same time. But there aren’t enough remains here to say just how tall Homo luzonensis was. And, unfortunately, the team was unsuccessful in attempts to find DNA. Many people will wonder, on such slim evidence, if the declaration of a new species is warranted.
I was fortunate to be a part of the team that discovered the new hominin species Homo naledi, which lived in South Africa around 250,000 years ago. That work was published in 2015. Such discoveries seemed almost unimaginable 20 years ago, when I was finishing my Ph.D. At that time, some of the most respected anthropologists actually suggested that the hunt for hominin fossils was almost over. Funding agencies directed their efforts away from exploring for new fossils and toward new technologies to wring more precious data from fossils discovered in the past.
Callao Cave on Luzon Island, Philippines, where the fossils were discovered. Callao Cave Archaeology Project
Yet the last 20 years have seen an unprecedented burst of new discoveries. Some, like H. naledi and H. floresiensis, represent branches of the human family tree that separated from the modern human line quite early and yet survived until a surprisingly recent time.
Was H. luzonensis another such population? To establish that these fragmentary fossils justify recognition as a new species, a key first step is to rule out that they belong to modern humans. Living people of the Philippines include some very small-bodied groups. Small size alone is not enough to place the Callao fossil teeth outside the range of modern people.
To go further, Détroit and colleagues studied the details of the bones and teeth. They represent a mash of features confusingly reminiscent of a huge range of other hominins, and together they make for something new and hard to classify. The molars, for example, are small compared to those of every other known species, while the adjacent premolars, bizarrely, are not so small. The molar crowns have a simple, humanlike pattern, but the premolars bear resemblance to the larger teeth more typical in older species, including H. floresiensis and some early specimens of Homo erectus. Some premolars have three roots, as sometimes found in H. erectus and more distant human relatives. The toe and finger bones also seem different from modern humans: One finger bone is curved, and the toe doesn’t seem to have been able to bend upward at the ball of the foot as much as ours. In some ways, these bones resemble hominins that lived more than 2 million years ago, such as Lucy’s species, Australopithecus afarensis. No other known species shares the whole set of features found at Callao.
So, what does this discovery mean? To me, it solidifies the case that ancient human relatives were a lot smarter and more adaptable than we used to give them credit for.
Flores lies about 2,000 miles to the south of Luzon, but both islands share a peculiar geography: Land bridges never connected these islands to the Asian continent. Another large, disconnected island in the region is Sulawesi. There, stone tools from a site called Talepu were made by hominins more than 118,000 years ago, though no fossils have been found yet to indicate who was making them. Some anthropologists have thought that the colonization of such islands over water was due to luck. Maybe ancient storms or tsunamis washed a few unsuspecting survivors onto ancient beaches. But where one strange event might be attributed to luck, three are much more interesting.
There is evidence for early hominins on each of these three islands. John Hawks
The evidence for life on these islands goes back a long way. Some hominins were making stone tools on Flores more than a million years ago, and the oldest hominin fossil on that island is around 700,000 years old. Last year, paleoarchaeologist Thomas Ingicco, from the National Museum of Natural History in Paris, and colleagues reported on work at the site of Kalinga, Luzon. There, they found stone tools and butchered rhinoceros bones, also around 700,000 years old. Very early forms of Homo must have overcome barriers and found new ways of life in places with very different climates and plant and animal communities than their African ancestors knew. Meanwhile, within Africa, a diversity of hominin species continued to exist throughout most of the last million years.
It’s too early for us to say whether the earliest inhabitants of Flores and Luzon gave rise to H. floresiensis and H. luzonensis. I wouldn’t bet on it. Many new arrivals may have come between the first occupations and the later appearance of modern people in the region. One such arrival may have been the Denisovans, a mysterious group known from DNA evidence. Today’s people of the Philippines bear genetic traces of Denisovan ancestry, and new analyses of Denisovan genetic contribution in New Guinea suggest deep roots for this ancient group. Could the Denisovans have existed on Flores, Sulawesi, or the Philippines?
To answer such questions, we must reinvest in exploration. The new discoveries of the past decade or so have transformed the field of human origins. New methods of exploration, and more intensive exploration of underrepresented regions, have introduced a new paradigm. Ancient groups of human relatives were varied and adaptable. They sometimes mixed with one another, and that mixing gave rise to new evolutionary solutions. Our species today is the lone survivor of this complicated history. We have replaced or absorbed every other branch of our family tree.
Many more of these branches are surely waiting for us to find them.
Northern Ethiopia was once home to a vast, ancient lake. Saber-toothed cats prowled around it, giant crocodiles swam within. The streams and rivers that fed it—over 3 million years ago, during the Pliocene—left behind trails of sediment that have now hardened into sandstone.
Deposited within these layers are fossils: some of early hominins, along with the bones of hippos, antelope, and elephants. Anthropologist Jessica Thompson encountered two of these specimens, from an area named Dikika, in 2010.
At the time, she was a visiting researcher at the Institute of Human Origins at Arizona State University. With no explanation of the bones’ history to go on, she analyzed them and found signs of butchery. Percussion marks suggested someone may have accessed the marrow; cut marks hinted that flesh was stripped from bone. To her surprise, the specimens were 3.4 million years old, pushing such butchery behaviors back 800,000 years earlier than conventional estimates suggest. That fact got Thompson, now an assistant professor in the department of anthropology at Yale University, thinking there might be more traces of tool use from those early times.
In a wide-ranging review published in the February issue of Current Anthropology, Thompson joins a team of researchers to weave together several strands of recent evidence and propose a new theory about the transition to large animal consumption by our ancestors. The prevailing view, supported by a confluence of fossil evidence from sites in Ethiopia, is that the emergence of flaked tool use and meat consumption led to the cerebral expansion that kickstarted human evolution more than 2 million years ago. Thompson and her colleagues disagree: Rather than using sharpened stones to hunt and scrape meat from animals, they suggest, earlier hominins may have first bashed bones to harvest fatty nutrients from marrow and brains.
Humans are the only primate to regularly consume animals larger than themselves. This nutritional exploitation, something Thompson and her colleagues call the “human predatory pattern,” has long been synonymous with the flesh-eating, man-the-hunter view of human origins.
Because large animals such as antelope pack a serious micro- and macronutrient punch, scientists have thought their meat contributed to humanity’s outsized brains. A consensus arose in the 1950s that our ancestors first hunted small animals before moving on to larger beasts around 2.6 million years ago. Flaked tool use and meat eating became defining characteristics of the Homo genus.
“It’s a very appealing story,” says Thompson. “Right around that time there appeared to be the first stone tools and butchery marks. You have the origins of our Homo genus. A lot of people like to associate that with what it means to be human.”
Then, starting in the mid-1980s, an opposing theory arose in which Homo’s emergence wasn’t so tightly coupled with the origins of hunting and predatory dominance. Rather, early hominins first accessed brain-feeding nutrients through scavenging large animal carcasses. The debate has rolled on through the decades, with evidence for the scavenging theory gradually building.
In pursuit of nutrients from marrow and brain, early hominins likely smashed animal bones with percussive tools, such as the flint hammerstones in the top row. Flaked stone tools, such as the ax-head fragment in the lower photo, may have been crafted for other tasks. Frank Basford/Wikimedia Commons (Top/Bottom)
The new paper goes further: Harvesting outer-bone meat would have come at significant costs, the authors argue. The chance of encountering predators is high when scraping raw flesh from a carcass. Chewing raw meat without specialized teeth doesn’t give much energetic benefit, studies have shown. In addition, meat exposed to the elements will quickly rot.
Marrow and brains, meanwhile, are locked inside bones and stay fresh longer. These highly nutritious parts also supply precursors of the fatty acids involved in brain and eye development. And more easily than flesh-meat, bones could be carried away from carcass sites, safe from predators.
Conventional thinking has been that the behavioral package of early hominins was to go after meat and marrow together, explains Briana Pobiner, a paleoanthropologist at the Smithsonian Institution, who did not contribute to the new paper. But in the new paper, she says, “This team has shown that marrow may have in fact been more important. It’s a nuance, but an important nuance.”
The Pliocene—between 5.3 and 2.6 million years ago—was an era of dramatic change. An intensely variable and cooling climate transformed vast swaths of rainforest into mosaics of grassland and savanna. Large clearings spawned ecological niches for opportunistic and versatile hominins, such as Australopithecus (a likely contender for the ancestor of Homo) and Kenyanthropus, to fill. Larger predators may well have left carcasses for them to scavenge.
Evidence suggests hominins shifted their diet around 3.76 million years ago as they took advantage of the open spaces. By around 3.5 million years ago, some species of Australopithecus already showed increased brain sizes, up to 30 percent larger than chimpanzees of comparable body size. Canines had shrunk to proportions later seen in the genus Homo, and hand morphology was already more human than ape, with potential both for terrestrial travel and tool use.
Percussive tools, the authors argue, were the key to the transition to large animal exploitation. Rocks could bash open bones, exposing the marrow inside. The alternative—that humans sharpened stone against stone, creating a flaked tool to carve meat from bone—seems more onerous, they say. They argue that such meat carving and the associated tool creation would likely come later.
As to who wielded these percussive instruments, the timeline presents a puzzle. The earliest Homo specimen is now dated to 2.8 million years. The Dikika fossils suggest butchery behaviors at 3.4 million years ago. Homo may have emerged earlier than scientists suspected—a theory that would need more fossil evidence to support it—or another hominin, such as Australopithecus, may have created tools before Homo.
Some scholars aren’t convinced by the study’s arguments, however. For example, Craig Stanford, an anthropologist at the University of Southern California, questions the emphasis on hominin scavenging behavior appearing before hunting. “We have no examples today of animals that scavenge but don’t hunt,” he adds.
To test the new theory, the review authors suggest seeking out further evidence of percussive tools that predate flaked tools. Researchers could, they note, broaden the search for the signatures of such instruments within both the existing fossil record and at dig sites. Thompson’s graduate students, for example, are using 3D scanning and artificial intelligence techniques to improve the identification of marks on fossils—whether they were created by early hominins, saber-toothed cats, hyenas, or other types of creatures.
What they uncover could deal a blow to their theory, but it will also, undoubtedly, enrich our understanding of how our ancestors evolved.
A chunk of a Denisovan skull has been identified for the first time—a dramatic contribution to the handful of known samples from one of the most obscure branches of the hominin family tree. Paleoanthropologist Bence Viola from the University of Toronto will discuss the as-yet-unpublished discovery at the upcoming meeting of the American Association of Physical Anthropologists in Cleveland, Ohio, at the end of March.
Very little is known about the Denisovans, an extinct branch of hominins that was a sister group to Neanderthals. Only four individual Denisovans had been identified previously, all from one cave in Siberia. The first Denisovan was described in 2010 from the fragment of a pinky finger bone, and three more were identified from teeth. This skull piece, excavated about three years ago in that same Siberian cave, represents a fifth individual.
“It’s very nice that we finally have fragments like this,” says Viola. “It’s not a full skull, but it’s a piece of a skull. It gives us more. Compared to the finger and the teeth, it’s nice to have.” But, he adds, it’s hardly a full skeleton. “We’re always greedy,” he laughs. “We want more.”
The new discovery consists of two connecting fragments from the back, left-hand side of the parietal bone, which forms the sides and roof of the skull. Together, they measure about 8 cm by 5 cm. DNA analysis proves that the piece is Denisovan, though it’s too old to date with radiocarbon techniques. Viola and colleagues have compared the fragments to the remains of modern humans and Neanderthals, according to the conference abstract, although Viola is unwilling to discuss the details of what they learned until the work is published.
“This is exciting,” says Chris Stringer, a paleoanthropologist at the Natural History Museum in London, U.K., who wasn’t involved with the work but will be presenting in the same upcoming conference session about Denisovans. “But, of course, it is only a small fragment. It’s as important in raising hopes that yet more complete material will be recovered.”
Sadly, the newfound piece is not large enough to serve as a reference for identifying other skulls found elsewhere as Denisovan without genetic evidence to back up the diagnosis. Researchers are still waiting for a find like that, which would likely help to boost their collection and understanding of Denisovans. They think the extinct hominins once roamed widely across Asia, but since most fossils aren’t well enough preserved to allow for genetic analysis, it has been hard to identify Denisovans elsewhere.
In 2017, some researchers wondered whether two partial skulls found in China might be Denisovan, but this remains unconfirmed. “It should be possible to see how well this [new find] matches with Chinese fossils such as those from Xuchang, which people like me have speculated might be Denisovans,” says Stringer.
Nicola Jones is a freelance science journalist living in Pemberton, British Columbia.
The town of Kabwe sits about 70 miles north of Zambia’s capital, Lusaka, as the crow flies. Just over 200,000 people live in this major transportation crossroads. Like most of this south-central African nation, Kabwe is perched on a high and vast plateau, a land of red soils dotted with shrubby legumes and canopies of small, spindly miombo trees.
Kabwe’s story is defined in part by a mine that opened in the early 1900s after rich deposits of lead and zinc were discovered on the edge of the town. Kabwe—then called Broken Hill—became a major mining center, producing profits for British interests and, later, important metals for the Allies in both world wars. At that time, Zambia was known as Northern Rhodesia, after British mining magnate Cecil Rhodes, whose name has come to symbolize the worst evils of his nation’s colonialism.
The mine shut down in 1994 after its deeper deposits of zinc and lead were exhausted, 30 years after Zambia achieved its independence. But it left the town with a toxic legacy of lead contamination. Recent studies have found that nearly all of Kabwe’s children have blood lead levels so high that their health is in serious danger. Environmentalists consider Kabwe to be one of the most polluted cities on Earth. And they are concerned by reports that the Zambian government has given the London-based international minerals company Jubilee Metals Group permission to begin collecting lead and zinc from surface deposits this year.
Yet the Kabwe mine also left a happier legacy, one that all of humankind can celebrate: In 1921, miners working there discovered a fossilized skull of a possible human ancestor, along with some other bones thought to be associated with it. Dubbed “Rhodesian Man,” this hominin may occupy a pivotal place in the evolutionary transition from Homo erectus to Homo sapiens. Today anthropologists refer to the find as the “Kabwe skull” and recognize it as the first early human fossil discovered in Africa, found at a time when most scientists were looking to Asia or even Europe for the origin of our species.
Soon after its discovery, mining officials sent the fossils to the British Museum for study. In subsequent years, the skull and other remains stayed in the U.K., and today they reside in London’s Natural History Museum. The Zambians have been trying to get them back for decades, to no avail.
The U.K. has a reputation for fiercely resisting the return of antiquities acquired during colonial times, especially tourism-generating treasures like the Parthenon sculptures (a.k.a. Elgin marbles) from Greece and the Rosetta Stone from Egypt. But last May, at a meeting of the United Nations Educational, Scientific, and Cultural Organization (UNESCO) heritage committee in Paris, Zambian representatives finally broke through British resistance. The United Kingdom agreed to sit down and talk about the possible repatriation of the skull and related fossils to Zambia.
British and Zambian officials are being tight-lipped about the upcoming negotiations, citing their sensitive diplomatic nature. Yet the decision to discuss repatriating the skull fits a more recent pattern wherein former colonial powers have begun returning cultural artifacts and human remains to Indigenous peoples. Examples include Germany’s return to Namibia of the remains of more than 25 victims of a colonial-era genocide, including 19 skulls that had been taken out of Africa in the early 20th century for “anthropological” research; France’s return of the remains of Saartjie Baartman (also known as “Hottentot Venus”) to South Africa for proper burial; and the Smithsonian Institution’s repatriation of human remains to New Zealand and to Native American tribes in the United States.
Namibian officials seek the repatriation of remains of former tribespeople who were murdered and removed from the region during a German-led genocide in the early 20th century. Kay Nietfeld/Getty Images
If Britain agrees to return the remains of the Rhodesian Man, the Zambians say, it could provide a major boost to their national identity and would represent a significant victory for repatriation efforts worldwide. At stake are key issues about the rights of former colonies to control their own heritage and the responsibilities of former colonial powers to own up to the sins of the past.
On June 17, 1921, a Swiss miner named Tom Zwigelaar and a young African miner, working at Broken Hill, uncovered the Rhodesian Man. We owe details of the discovery to Aleš Hrdlička, a famed anthropologist at the Smithsonian Institution, who traveled to Northern Rhodesia to gather more information about the exact location where the skull had been found.
Hrdlička spoke with Zwigelaar himself, who was still working at the mine, about the find. “[Zwigelaar] was found to be a serious middle-aged man, not highly educated but of good common sense,” Hrdlička wrote in a 1926 issue of the American Journal of Physical Anthropology.
Zwigelaar told Hrdlička that he was working with an African miner—whose name was never recorded—at about 10 a.m. that June morning in a pocket of the mine with a lot of lead ore. “After one of the strokes of the pick, some of the stuff fell off, and there was the skull looking at me,” Zwigelaar told him. Hrdlička’s paper included a photo, taken by the mine’s manager shortly after the discovery, of Zwigelaar leaning against the mine shaft holding the skull on the palm of his outstretched left hand.
The skull didn’t stay in Northern Rhodesia for long. A doctor at the Broken Hill Hospital examined it and immediately suspected its scientific importance. About five months later, the Rhodesia Broken Hill Mine Company shipped it off to England, donating the find to the British Museum in London. There, the museum’s keeper of geology, renowned paleontologist Arthur Smith Woodward, named it Homo rhodesiensis. Woodward and other scholars recognized that the skull, with a brain size of about 1,300 cubic centimeters—within the range of H. sapiens—was an important human ancestor. Many anthropologists today classify the skull as belonging to the species Homo heidelbergensis, a descendant of H. erectus. Some researchers think H. heidelbergensis is, in turn, the common ancestor of modern humans and Neanderthals.
Discovered in 1921, the Kabwe skull belonged to a hominin who lived some 300,000 years ago. British Museum/Wikimedia Commons
Over the years, the “Kabwe skull” has continued to attract scientific attention, especially because it is one of the best-preserved hominin fossils from its time period of roughly 300,000 years ago. That timing coincides with when human evolution experts think H. sapiens diverged from more archaic hominins in Africa. Thus, the Kabwe skull—which includes ancient features such as prominent brow ridges and modern features such as a globular-shaped brain case—could represent a transitional step in human evolution.
In 2016, scientists sampled it for ancient DNA, although so far attempts to sequence even part of the Rhodesian Man’s genome have been unsuccessful. In just the past two years, at least 10 papers have been published about the fossils, many based in whole or in part on CT scanning data from the skull. These studies seem to confirm that the Kabwe individual descended from the earlier H. erectus, but its relationship to later humans, such as H. sapiens and the Neanderthals, is still a matter of debate. Researchers at the Natural History Museum are trying to help resolve this mystery by dating the skull more precisely using modern methods.
Zambia achieved its independence from Britain in 1964. A decade later, the young nation began trying to get the Kabwe skull and associated bones back, but the British government either rejected or ignored its requests.
In recent years, Zambian researchers and cultural officials have left a paper trail of detailed and eloquent arguments—citing moral grounds and international law on cultural artifacts and human remains—for the return of the Kabwe skull. In many ways, the case for repatriation echoes that made by Native Americans under the 1990 Native American Graves Protection and Repatriation Act, which provides for the return of artifacts and human remains under certain defined conditions.
Key to Zambia’s position is a colonial law from 1912 called the Bushman Relics Proclamation. Zambia interprets the proclamation to mean that no cultural artifacts or human remains could be removed from Northern Rhodesia without a permit from the British South Africa Company, which at that time was chartered by the British government to administer the protectorate. The Zambians insist that no such permit was ever issued to the Broken Hill mining company when it donated the skull to the British Museum.
In a 2013 paper in the African Archaeological Review, which chronicles some of this history, Zambian historian Francis Musonda contended that the removal of the skull from Zambia occurred in a colonial context that is anachronistic today. (Greece has made similar arguments for the return of the Parthenon sculptures housed in the British Museum.)
Greece wants Britain to return the Parthenon marbles, some of which are shown here.
“African people find it unacceptable for a British institution to provide a repository for an African fossil when they themselves have the capacity to do so,” Musonda wrote. “This has put the country in an embarrassing and awkward position because of the impression created that it is incapable of looking after its own hominin fossil.” And even if the conditions in Zambia were “not ideal,” Musonda argued, “why not assist the country in creating conditions deemed suitable for the object?”
It is still unclear when the actual negotiations between Zambia and the U.K. will begin, and Zambian officials have expressed some frustration at the delays—both in the Zambian press and to SAPIENS. In the meantime, writes Flexon Mizinga, executive secretary of the National Museums Board of Zambia, in an email, “it would be premature to make any statements on this matter before exhausting consultations in progress with the Government of Zambia.” Mizinga adds that “we wouldn’t want to jeopardize the multilateral issues involved.”
A spokesperson for the U.K.’s Department for Digital, Culture, Media, and Sport was equally circumspect, only confirming the UNESCO agreement and that it was expected to lead to “discussions to find a mutually acceptable solution to the Broken Hill skull case.”
Nevertheless, the news did get considerable coverage in the Zambian press, and on June 1, 2018, the Zambian delegation in Paris put out a detailed statement hailing the diplomatic breakthrough.
In email correspondence, numerous human evolution researchers expressed sympathy with Zambia’s position. “We should all support repatriation of cultural objects looted or taken away, for whatever reason, from any country in the world,” writes Yohannes Haile-Selassie, curator of physical anthropology at the Cleveland Museum of Natural History.
“The pride that Africans feel about ancestry is unwavering and pronounced,” writes Wendy Black, curator of pre-colonial archaeology at the Iziko Museums of South Africa in Cape Town. “For Africans, repatriation of these items is viewed as part of a healing process, where all parties concerned can make amends, forgive, and move on.”
A CT scan of the Kabwe skull shows the fine structure of its internal features, which can be compared evolutionarily to other ancient skulls. Bruner & Manzi/The Anatomical Record
Haile-Selassie, an Ethiopian who is active in paleoanthropological research in his native country, points out that the key question is who will have control—not only of the actual skull but also the CT scans and other digitalized information that have been gathered on the specimen over many years. That information, which the Natural History Museum now considers its intellectual property, should be under Zambia’s control, he contends.
Another issue is access, which human evolution researchers are eager to maintain. “It is an aesthetically beautiful specimen,” writes Leslie Aiello, president of the American Association of Physical Anthropologists. Aiello, formerly at the University College London and a past president of the Wenner-Gren Foundation, which funds SAPIENS, adds that “while in London, I used to ask them to get it out occasionally just so I could admire it!”
Some researchers think that before the Kabwe skull is repatriated, certain conditions should be met to ensure that scientists will be able to access and study it in the future. Gerhard Weber, an anthropologist at the University of Vienna in Austria who has studied CT scans of the skull, writes that even if it is scanned and digitalized with “the best resolution possible” before it leaves London, “this will only help for a limited time. … There will be new methods in 20, 50, or 100 years that we cannot even imagine today.” Weber suggests that agreements should be negotiated with Zambia that guarantee access to the skull so that novel techniques can be applied. He also thinks an international committee should monitor these agreements.
But Black and others don’t see any inherent barriers to access if Zambia gets the Kabwe skull back. “Researchers travel to Africa all the time,” Black maintains. “Museums in South Africa and Kenya, for example, provide access to their collections to numerous researchers from around the world each year. … One could apply for access, as all researchers do at all institutions.”
Furthermore, writes Rebecca Ackermann, a biological anthropologist at the University of Cape Town in South Africa, “a move to Lusaka would certainly make it easier for African researchers to study the remains.” Indeed, the Kabwe skull, discovered when Zambia was still under colonial rule and then whisked out of the country, is an outlier; most hominin fossils discovered in Africa—such as Ethiopia’s “Lucy” and the many fossils found in Kenya, Tanzania, and South Africa—have remained in their countries of origin.
Ackermann and others contend that non-African researchers and museums must be willing to loosen their grips on the spoils of the colonial past, even ones that are vital to our understanding of human origins. “There has been a whole lot of taking and comparatively little giving back,” she contends. Keeping the skull in British—not Zambian—hands, she argues, perpetuates a colonial legacy. “Any claim that Zambians can’t take responsibility for their own heritage is frankly racist.”
Michael Balter is a freelance writer and reporter based in the New York City area.
Tooth pendants (one pictured here), along with other artifacts discovered at Denisova Cave, mark the earliest evidence of human ornamentation—between 43,000 and 49,000 years ago—in northern Eurasia.
Tom Higham/University of Oxford
An extinct branch of hominins called the Denisovans is one of the most elusive members of our extended family tree: So far there have been only four individuals found in a single Siberian cave. Now researchers have done the painstaking work of dating the fossils, sediments, and artifacts found in that famous cave, including what might be the first evidence for crafts made by our long-lost cousins.
The Denisova Cave in the foothills of Russia’s Altai Mountains has a long history of occupation and has proven a gold mine for anthropologists trying to untangle the relationship between hominin groups living in Eurasia hundreds of thousands of years ago. The cave—which has three chambers and is about the size of a modern four-bedroom home—was used as recently as the 1700s by a hermit named Denis, which is where it got its modern name (in Russian, “the cave of Denis”). Its earlier inhabitants have proven harder to pin down.
Researchers have been finding and studying fossils from this cave since at least the 1970s. In 2010, the genetic analysis of a fragment of pinky finger bone prompted the identification and naming of the Denisovans, a sister group to Neanderthals. The two lineages parted ways about 400,000 years ago. So far, Denisovan remains haven’t been confirmed anywhere else in the world, although DNA studies suggest that they once lived widely across Asia.
Lush vegetation now covers the hillside where the entrance to Denisova Cave lies. Richard Roberts
Although thousands of tiny fossils have been found inside the Denisova Cave, many of these are from animals and only about a dozen individual hominins have been identified from bone and teeth—including three other Denisovans, three Neanderthals, and some unidentified hominins. Last year, genetic work revealed that one of the cave’s fossils is from the first-known hybrid of a Denisovan and a Neanderthal, news that won headlines around the world.
With so few remains and artifacts, and no fire pits, it seems that the people of the time preferred to live in the open air and only came into the cave periodically, perhaps during heavy rain, says Bence Viola, a paleoanthropologist at the University of Toronto. “There are a lot of unpleasant creatures in there: hyenas, bats, pigeons. It can be disgusting,” says Viola. The cave’s location in chilly Siberia has made it a good place for preserving DNA. “The Altai is nice and cold, and the caves are like big fridges,” says Viola.
But dating all the tiny scraps found in the cave has proven tricky because they are mostly smaller than a centimeter across and older than can be reliably dated using radiocarbon dating, which works best for things 50,000 years old or younger. “It requires a huge amount of investment from a range of different people and techniques, so it inevitably takes a long time and effort to bring it together,” says archaeologist and earth scientist Zenobia Jacobs of the University of Wollongong in New South Wales, Australia.
Today two papers published in Nature lay bare the history of the cave. Archaeologist Katerina Douka of the Max Planck Institute for the Science of Human History in Jena, Germany, and colleagues, Viola among them, analyzed results from a combination of techniques (radiocarbon dating, genetics, and optical dating) to track fossils and artifacts. Optical dating works by measuring how much stored energy remains in some minerals, including quartz, from the last time they were exposed to sunlight. Meanwhile, Jacobs and co-authors used optical dating on more than 100 samples of cave-floor sediments to fill in the complete timeline of hominin occupation, along with clues about the area’s climate based on animal and plant remains.
Together the works suggest that hominins have been living sporadically in this cave for about 300,000 years. Fossils and DNA traces in the soil show both Denisovans and Neanderthals living in the cave between about 200,000 and 90,000 years ago, says Jacobs, with Denisovans staying as late as about 50,000 years ago. The overlapping dates make sense, given the presence of a hybrid in the cave. “You could say it was a Denisovan cave, and the Neanderthals just visited for a while,” says Sharon Browning, a biostatistician at the University of Washington, Seattle, who has worked on Denisovan remains but wasn’t involved with either new study. “Though the Neanderthal occupation appears to have extended for tens of thousands of years, so it was a long visit.”
Perhaps most exciting are some pendants made from deer and elk teeth, along with bone points that might have been used to pierce clothing for sewing. These have been dated at 43,000–49,000 years old, making them the oldest such artifacts in northern Eurasia. Older jewelry has been found elsewhere—shell beads discovered in Israel are at least 100,000 years old. But these ancient pendants could possibly be the first evidence of Denisovans making arts and crafts. Alternatively, the jewelry could come from modern humans, who are known to have been living elsewhere in Eurasia at that time.
“The big question is: Who produced these bone points and pendants? That is something we just don’t know,” says Viola. “Sadly, the pendants don’t come with a name tag.”
Nicola Jones is a freelance science journalist living in Pemberton, British Columbia.
This 3,200-year-old find is exciting because it shows that the ancient Egyptians shared our love of cheese—to the extent it was given as a funerary offering. But not only that, it also fits into archaeology’s growing understanding of the importance of dairy to the development of the human diet in Europe.
Dairy in diets
About two-thirds of the world’s population is lactose intolerant. So although dairy products are a daily part of the diet for many people living in Europe, northern India, and North America, drinking milk in adulthood only became possible within the last 4,500 years, from the Bronze Age onward.
For most of human history, adults lost the ability to consume milk after infancy—and the same is true of people who are lactose intolerant today. After weaning, people with lactose intolerance can no longer produce lactase, the enzyme needed to break down the lactose sugars in fresh milk into compounds that can be easily digested. If they consume dairy products, they experience unpleasant symptoms such as bloating, flatulence, and diarrhea.
This map shows the percentage of adults that can digest lactose in the Indigenous populations of the Old World. Circles mark sample locations. Joe Roe
Ancient DNA analysis on human skeletons from prehistoric Europe places the earliest appearance of the lactase gene (LCT)—which keeps adults producing lactase—at 2500 B.C. But there is plenty of evidence from the Neolithic period (around 6000–2500 B.C. in Europe) that milk was being consumed.
This is not totally surprising though, as the Neolithic marks the start of farming in most regions of Europe—and the first time humans lived closely alongside animals. And although they were unable to digest milk, we know that Neolithic populations were processing milk into substances they could consume.
Using a technique called lipid analysis, archaeologists can examine sherds of ancient pottery and identify the fats absorbed into the clay. This allows them to find out what was cooked or processed inside the vessels.
Although it is not yet possible to identify the species of animal, dairy fats can be distinguished. It is more challenging to determine which techniques were being used to make dairy products safe to consume, as there are many possibilities. Fermenting milk, for example, breaks down the lactose sugar into lactic acid. Cheese is low in lactose because making it involves separating the curds from the whey, in which the majority of the lactose sugars remain.
Clay sieves from Poland, similar to modern cheese sieves, have been found with dairy lipids preserved in their pores, suggesting that they were used to separate curds from whey. Whether the curds were then consumed, or attempts were made to preserve them by pressing them into a harder cheese, is unknown. Fermenting milk was also within our ancestors’ reach, but it is harder to explore with the techniques currently available to archaeology.
Early cheese making
While the techniques from bioarchaeology have provided this fantastic detail on Neolithic diets, where the science stops, experimental archaeology can explore what was possible.
We have been making cheese using the utensils, plants, and techniques available to Neolithic farmers. The aim of the experiments is not to faithfully recreate early cheeses, but to begin to capture some of the decisions available to early cheese makers—and the experiments have thrown up some interesting results.
By using these ancient techniques, we have discovered that a wealth of different means of curdling the milk would have been possible, each producing different forms, tastes, and amounts of cheese.
And the spread of such specialist knowledge may have been akin to the spread of bronze smelting at the end of the Neolithic. Dairy may have had a special status among foodstuffs. For example, at the major late Neolithic feasting site of Durrington Walls, not far from and contemporary with Stonehenge, dairy residues were found in a particular kind of pottery vessel and concentrated in the area around a timber circle—a form of late Neolithic monument.
From the Bronze Age, however, lactase persistence offered an advantage to some people who were able to pass this on to their offspring. It also seems that this advantage was not solely because of increased calorie and nutrient intake, but also because of the special status dairy foods may have had. The development of this biological adaptation to fresh milk took place after humans had already found ways to safely include dairy products in the diet.
This shows not only that humans are able to manipulate our food to make it edible, but also that what we consume can lead to new adaptations in our biology.
Penny Bickle is a lecturer in archaeology at the University of York in the U.K.
Suzana Herculano-Houzel spent most of 2003 perfecting a macabre recipe—a formula for brain soup. Sometimes she froze the jiggly tissue in liquid nitrogen, and then she liquefied it in a blender. Other times she soaked it in formaldehyde and then mashed it in detergent, yielding a smooth, pink slurry.
Herculano-Houzel had completed her Ph.D. in neuroscience several years earlier, and in 2002, she had begun working as an assistant professor at the Federal University of Rio de Janeiro in Brazil. She had no real funding, no laboratory of her own—just a few feet of counter space borrowed from a colleague.
“I was interested in questions that could be answered with very little money [and] very little technology,” she recalls. Even so, she had a bold idea. With some effort—and luck—she hoped to accomplish something with her kitchen-blender project that had bedeviled scientists for over a century: to count the number of cells in the brain—not just the human brain, but also the brains of marmosets, macaque monkeys, shrews, giraffes, elephants, and dozens of other mammals.
Her method might have seemed carelessly destructive at first. How could annihilating such a fragile and complex organ provide any useful insights? But 15 years on, the work of Herculano-Houzel and her team has overturned some long-held ideas about the evolution of the human mind. It is helping to reveal the fundamental design principles of brains and the biological basis of intelligence: why some large brains lead to enhanced intelligence while others provide no benefit at all. Her work has unveiled a subtle tweak in brain organization that happened more than 60 million years ago, not long after primates branched off from their rodent-like cousins. It might have been a tiny change—but without it, humans never could have evolved.
The questions that Herculano-Houzel sought to answer go back more than 100 years, to a time when scientists were just starting to study the relationship between brain size and intelligence.
In August 1891, laborers working for the Dutch anatomist Eugène Dubois began excavating trenches along a steep riverbank on the Indonesian island of Java. Dubois hoped to find early hominin remains.
The first Homo erectus fossil ever discovered, found in 1891 in Java, Indonesia, brought new questions about the relationship between brain size and intelligence in the Homo genus. In this photo, the two white squares indicate where the femur (left) and the skullcap (right) of this “Java man” were unearthed. Aleš Hrdlička/Wikimedia Commons
Over the course of 15 months, layers of sandstone and hardened volcanic gravel yielded the petrified bones of elephants and rhinos, and, most importantly, the skullcap, left femur, and two molars of a human-like creature thought to have died nearly a million years before. That specimen, named Pithecanthropus erectus, and later Java man, would eventually come to be known as the first example of Homo erectus.
Dubois made it his mission to infer the intelligence of this early hominin. But he had only three fragments of seemingly relevant information: its estimated brain size, stature, and body weight. Would this be enough?
Zoologists had long noticed that when they compared different species of animals, those with bigger bodies had larger brains. It seemed as if the ratio of brain weight to body weight was governed by a mathematical law. As a start, Dubois set out to identify that law. He gathered the brain and body weights of several dozen animal species (as measured by other scientists), and using these, he calculated the mathematical rate at which brain size expands relative to body size. This exercise seemed to reveal that across all vertebrates, the brain really does expand at a similar rate relative to body size.
Dubois reasoned that as body size increases, the brain must expand for reasons of neural housekeeping: Bigger animals should require more neurons just to keep up with the mounting chores of running a larger body. This increase in brain size would add nothing to intelligence, he believed. After all, a cow has a brain at least 200 times larger than a rat’s, but it doesn’t seem any smarter. But deviations from that mathematical line, Dubois thought, would reflect an animal’s intelligence. Species with bigger-than-predicted brains would be smarter than average, while those with smaller-than-predicted brains would be dumber. Dubois’ calculations suggested that his Java man was indeed a smart cookie, with a relative brain size—and intelligence—that fell somewhere between modern humans and chimpanzees.
Dubois’ formula was later revised by other scientists, but his general approach, which came to be known as “allometric scaling,” persisted. More modern estimates suggest that mammalian brain mass scales with roughly the two-thirds power of body mass. So a dachshund, weighing roughly 27 times more than a squirrel, should have a brain about 9 times bigger—and in fact, it does. This concept of allometric scaling came to permeate the discussion of how brains relate to intelligence for the next hundred years.
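As a quick sanity check, the two-thirds power law can be worked through in a few lines of Python; the 27-to-1 dachshund-to-squirrel weight ratio is the article’s illustrative figure:

```python
# Allometric scaling: across mammals, brain mass grows roughly as body
# mass raised to the two-thirds power, so the expected ratio between two
# species' brain masses is (body-mass ratio) ** (2/3).

def expected_brain_ratio(body_mass_ratio: float, exponent: float = 2 / 3) -> float:
    """Predicted brain-mass ratio for a given body-mass ratio."""
    return body_mass_ratio ** exponent

# A dachshund weighing ~27 times as much as a squirrel should have a
# brain about 27 ** (2/3) = 9 times bigger.
print(round(expected_brain_ratio(27)))  # prints 9
```

The encephalization quotient described below is then just a species’ measured brain mass divided by the mass this kind of formula predicts for its body size.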
Seeing this uniform relationship between body and brain mass, scientists developed a new measure called encephalization quotient (EQ). EQ is the ratio of a species’ actual brain mass to its predicted brain mass. It became a widely used shorthand for intelligence. As expected, humans led the pack with an EQ of 7.4 to 7.8, followed by other high achievers such as dolphins (about 5), chimpanzees (2.2 to 2.5), and squirrel monkeys (roughly 2.3). Dogs and cats fell in the middle, with EQs of around 1.0 to 1.2, while rats, rabbits, and oxen brought up the rear, with values of 0.4 to 0.5. This way of thinking about brains and intelligence has been “very, very dominant” for decades, says Evan MacLean, an evolutionary anthropologist at the University of Arizona in Tucson. “It’s sort of a fundamental insight.”
The encephalization quotient measures the ratio of a species’ actual brain mass to its predicted brain mass. Cay Leytham-Powell/SAPIENS
This paradigm still held sway when Herculano-Houzel was going through graduate school in the 1990s. “The intuition behind it made perfect sense,” she says. When she began trying to count neurons in the early 2000s, she imagined herself simply adding a layer of nuance to the conversation. She didn’t necessarily expect to undermine it.
By the early 2000s, scientists had already been counting neurons for decades. It was slow, painstaking work, usually done by cutting brain tissue into ultra-thin prosciutto-like slices and viewing these under a microscope. Researchers typically counted hundreds of cells per slice. Tallying enough neurons to estimate the average number of cells for a single species was time-consuming, and the results were often uncertain. Each nerve cell is branched like a twisty oak tree; its limbs and twigs crisscross with those of other cells, making it hard to know where one cell ends and another begins.
This is the problem that Herculano-Houzel set out to solve. By early 2003, she realized that the best way to count nerve cells in brain tissue might be to eliminate the complexity altogether. It occurred to her that each nerve cell, no matter how branched and contorted, should contain only one nucleus—the little sphere that holds the cell’s DNA. All she had to do was find a way to dissolve the brain tissue while keeping the nuclei intact. Then she could count the nuclei to figure out how many cells there were; it would be as simple as counting checkers on a checkerboard.
After 18 months, she settled on a procedure that involved hardening the brain tissue with formaldehyde and then mashing it gently with detergent—repeatedly pushing a plunger into the glass tube, turning it as she went, until she had a uniform slurry. She diluted the liquid, squeezed a drop of it onto a glass slide, and peered at it through a microscope. A constellation of blue dots lay scattered across her field of view: the cell nuclei, lit up with a DNA-binding dye. By staining the nuclei with a second dye, which binds to specialized nerve proteins, she could count how many of them came from nerve cells—the cells that actually process information in brains—rather than other types of cells found in brain tissue.
Neuroscientist Suzana Herculano-Houzel holds up a tube that contains a liquid suspension of all the cell nuclei that once made up a mouse brain. James Duncan Davidson/Flickr
Herculano-Houzel counted a few hundred nerve cells over the course of 15 minutes; by multiplying this number up to the entire volume of liquid, she was able to calculate a totally new piece of information: An entire rat brain contains about 200 million nerve cells.
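The extrapolation step, counting nuclei in one small drop and scaling up to the whole tube, is simple proportion arithmetic. In the sketch below, every specific count and volume is hypothetical, chosen only so the result matches the article’s roughly 200 million figure:

```python
# Scale a nucleus count from a small counted drop up to the full
# suspension volume. The drop count and volumes here are hypothetical,
# for illustration; only the final order of magnitude (~200 million
# neurons per rat brain) comes from the article.

def total_nuclei(count_in_drop: float, drop_volume_ml: float,
                 total_volume_ml: float) -> float:
    """Extrapolate a per-drop count to the entire suspension."""
    return count_in_drop * (total_volume_ml / drop_volume_ml)

estimate = total_nuclei(count_in_drop=400, drop_volume_ml=1e-4,
                        total_volume_ml=50)
print(f"{estimate:,.0f} neurons")  # about 200 million
```

Because the slurry is uniform, any drop is a fair sample, which is exactly why dissolving the brain’s structure makes the count tractable.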
She looked at brains from five other rodents, from the 40-gram mouse to the 48-kilogram capybara (the largest rodent in the world, native to Herculano-Houzel’s home country of Brazil). Her results revealed that as brains get larger and heavier from one species of rodent to another, the number of neurons grows more slowly than the mass of the brain itself: A capybara’s brain is 190 times larger than a mouse’s, but it has only 22 times as many neurons.
Then in 2006, Herculano-Houzel got her hands on the brains of six primate species during a visit with Jon Kaas, a brain scientist at Vanderbilt University in Nashville, Tennessee. And this is where things got even more interesting.
What Herculano-Houzel found in these primates was totally different from rodents. “The primate brains had many more neurons than we expected,” she says. “It was right there, staring us in the face.”
The assumption that everyone had been making “was very obviously wrong.”
Herculano-Houzel saw a clear mathematical trend among these six species that are alive today: As the primate brain expands from one species to another, the number of neurons rises quickly enough to keep pace with the growing brain size. This means that the neurons aren’t ballooning in size and taking up more space, as they do in rodents. Instead, they stay compact. An owl monkey, with a brain twice as large as a marmoset, actually has twice as many neurons—whereas doubling the size of a rodent brain often yields only 20 to 30 percent more neurons. And a macaque monkey, with a brain 11 times larger than a marmoset, has 10 times as many nerve cells.
The assumption that everyone had been making, that different mammalian species’ brains scaled up the same way, “was very obviously wrong,” says Herculano-Houzel. Primate brains were very different from those of rodents.
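The contrast can be made concrete by computing the scaling exponents implied by the article’s own ratios. This is a rough two-point estimate, not the fitted values from the published papers:

```python
import math

def implied_exponent(neuron_ratio: float, brain_mass_ratio: float) -> float:
    """Solve neuron_ratio = brain_mass_ratio ** k for the exponent k."""
    return math.log(neuron_ratio) / math.log(brain_mass_ratio)

# Rodents: a capybara brain is 190x a mouse's but holds only 22x the neurons.
rodent_k = implied_exponent(22, 190)

# Primates: a macaque brain is 11x a marmoset's and holds 10x the neurons.
primate_k = implied_exponent(10, 11)

print(f"rodent exponent ~{rodent_k:.2f}, primate exponent ~{primate_k:.2f}")
# rodent ~0.59 (strongly sublinear), primate ~0.96 (nearly linear)
```

An exponent near 1 means neuron count keeps pace with brain mass; an exponent well below 1 means each extra gram of brain buys fewer and fewer neurons.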
Herculano-Houzel published these first nonhuman primate results with Kaas and two other co-authors in 2007. And in 2009, she confirmed that this pattern holds true from small-brained primates all the way up to humans: At roughly 1,500 grams, the human brain weighs 190 times as much as a marmoset brain and holds 134 times as many nerve cells—about 86 billion in total. Her subsequent studies, published between 2009 and 2017, suggest that other major mammal groups, such as insectivores and cloven-hoofed artiodactyls (like pigs, antelopes, and giraffes), follow the rodent-like scaling pattern, with neuron numbers increasing much more slowly than brain mass. “There’s a huge difference between primates and non-primates,” says Herculano-Houzel, who moved to Vanderbilt University in 2016.
Her results didn’t reveal the exact process of evolution that led to the modern human brain. After all, she could only count brain cells in species that currently exist—and because they’re alive today, they aren’t human ancestors. But by studying a diversity of brains, from small to big, Herculano-Houzel learned about the design principles of brains. She came to understand that primate and rodent brains faced very different constraints in the way that they could evolve.
People in the anthropological community have responded positively to her work—though with a touch of caution. Robert Barton, an anthropologist who studies brain evolution and behavior at Durham University in the U.K., is convinced that neurons are packed more densely in the brains of primates than they are in those of other mammals. But he’s not yet convinced that the mathematical trend line—the rate at which brains add new neurons as they get bigger from species to species—is any greater in primates compared to other mammals. “I’d like to see more data before I completely believe it,” he says. He points out that Herculano-Houzel has so far studied the brains of about a dozen, out of several hundred known, primate species.
As brain size expanded over the course of primate evolution, the number of neurons in the primate brain increased quickly, leading to big improvements in cognition. In rodents, however, the expansion of brain size led to only small increases in the number of neurons, with little or no improvement in cognitive ability. Catherine Gilman/SAPIENS
But Herculano-Houzel’s results have already dealt a serious blow to conventional wisdom. Scientists who calculated EQs had assumed that they were making apples-to-apples comparisons—that the relationship between brain size and number of neurons was uniform across all mammals. Herculano-Houzel showed that this wasn’t so.
“It’s a brilliant insight,” says MacLean, who himself has spent years studying the intellectual capacities of animals. “It’s pushed the field forward enormously.”
MacLean’s own work has also undermined the universality of EQ. His study, published with a large consortium of co-authors in 2014, compared the brains and cognitive abilities of 36 animal species, including 23 primates, a sprinkling of other mammals, and seven birds. MacLean assessed them on their capacity for impulse control (measured, for example, by an animal’s ability to calmly reach around a transparent barrier to obtain some food, rather than smashing against it in an impulsive grab). Impulse control is an important component of intelligence, which, unlike algebra skills, can be measured across diverse species.
MacLean found that EQ did a poor job of predicting this quality. Chimpanzees and gorillas have mediocre EQs of 1.5 to 2.5, but, says MacLean, “they did super well [in impulse control]. They were at the top.” Squirrel monkeys, meanwhile, scored far worse than chimps and gorillas on self-control, even though this species sports an EQ of 2.3.
Despite a relatively small sampling of animals and a lot of scatter in the data, MacLean found that the best predictor for self-control was absolute brain volume, uncorrected for body size: Chimps and gorillas may have EQs no better than squirrel monkeys, but their brains, in absolute terms, are 15 to 20 times bigger. (Their EQs may be thrown off because they have unusually big bodies, not small brains.) For primates, a bigger brain was a better brain, regardless of the animal’s size. (This was also the case for birds.)
In 2017, Herculano-Houzel published a study in which she looked at the same measurements of impulse control that MacLean had used, but she compared them to a new variable: the number of neurons that each species has in its cerebral cortex—the upper layer of brain tissue, often folded, that performs advanced cognitive functions, such as recognizing objects. Herculano-Houzel found that the number of cortical neurons predicted self-control about as well as absolute brain size had in MacLean’s study—and it also smoothed out a major glitch in his results: Birds may have tiny brains, but Herculano-Houzel found that those brains are densely packed. The Eurasian jay has a brain smaller than a walnut, but it has nearly 530 million neurons in its pallium (the brain structure in birds that is roughly equivalent to the mammalian cortex). Her numbers provided a compelling explanation for why these birds scored better on impulse control than did some primates with brains five times larger.
“The simplest, most important factor that should limit cognitive capacity,” concludes Herculano-Houzel, “is the number of neurons that an animal has in the cortex.”
If the secret to intelligence is simply having more neurons, then one might ask why rodents and other mammals didn’t just evolve bigger brains to accommodate their larger neurons. The reason is that ballooning neuron size presents a staggering problem. It eventually becomes unsustainable. Just consider a hypothetical rodent with the same number of neurons as a human—about 86 billion. That beast would need to drag around a brain weighing 35 kilograms. That’s nearly 25 times bigger than a human brain—about as heavy as nine gallons of water. “It’s biologically implausible,” says MacLean. It “would be insane—you couldn’t walk.”
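The figures in that thought experiment can be sanity-checked with simple arithmetic, assuming a human brain of about 1.4 kilograms and a US gallon of water weighing about 3.785 kilograms:

```python
# Back-of-envelope check of the hypothetical 35 kg rodent brain.
# Assumed reference values: human brain ~1.4 kg; 1 US gallon of
# water ~3.785 kg.
rodent_brain_kg = 35.0
human_brain_kg = 1.4
gallon_kg = 3.785

print(rodent_brain_kg / human_brain_kg)  # 25.0 times a human brain
print(rodent_brain_kg / gallon_kg)       # roughly 9 gallons of water
```

Both ratios line up with the article’s “nearly 25 times” and “nine gallons” comparisons.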
White matter in the brain contains fat-coated axons that make long-distance connections between neurons in gray matter. Frontiers in Psychiatry
This problem of ballooning neuron size was probably one of the major factors that limited brain expansion in most species. The burning question is how primates managed to avoid this problem.
The curse of ever-expanding neuron size may stem from the basic fact that brains function as networks in which individual neurons send signals to one another. As brains get bigger, each nerve cell must stay connected with more and more other neurons. And in bigger brains, those other neurons are located farther and farther away.
“Those are problems that have to be solved when you enlarge brains,” says Kaas, who often collaborates with Herculano-Houzel. He hypothesized that rodents and most other mammals addressed these problems in a simple way: by growing communication wires, called axons, that are longer, causing each neuron to take up more space.
In 2013, Herculano-Houzel found evidence for this theory by looking at white matter in the brains of five rodent and nine primate species. White matter contains much of the brain’s wiring—the fat-coated axons that cortical neurons use to make long-distance connections. Her work showed that the volume of white matter grows much more quickly in rodent species with larger brains than it does in primates. A large rodent called an agouti has eight times as many cortical nerve cells as a mouse, while its white matter takes up an astonishing 77 times as much space. But a capuchin monkey, with eight times as many cortical neurons as a small primate called a galago, has only 11 times as much white matter.
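A quick way to see how different the two scaling regimes are is to turn the ratios reported above into power-law exponents (if white matter grows as neurons raised to some power k, then k is the log of the volume ratio over the log of the neuron ratio). This two-species estimate is only a rough sketch, not the fitted exponents from Herculano-Houzel’s study:

```python
import math

# If white_matter ∝ neurons ** k, then for a pair of species:
#   k = log(white-matter ratio) / log(neuron ratio)

def scaling_exponent(neuron_ratio, white_matter_ratio):
    return math.log(white_matter_ratio) / math.log(neuron_ratio)

# Rodents: agouti vs. mouse — 8x the neurons, 77x the white matter.
print(f"rodents:  k ≈ {scaling_exponent(8, 77):.2f}")
# Primates: capuchin vs. galago — 8x the neurons, 11x the white matter.
print(f"primates: k ≈ {scaling_exponent(8, 11):.2f}")
```

The rodent exponent works out to about 2.1 and the primate exponent to about 1.15: in primates, wiring grows almost in step with neuron count, while in rodents the wiring swells roughly with the square of it, crowding out the neurons themselves.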
So as rodent brains get bigger, more and more brain volume has to be devoted to the wires that simply transmit information. Those wires don’t just get longer, they also get thicker—which allows signals to travel at a higher speed, to make up for the longer distances they have to cover. As a result, less and less space is available for the nerve cells that do the important work of actually processing information.
The downfall of rodents, in other words, is that their brains don’t adapt well to the problems of being big. They don’t compensate efficiently for the communication bottlenecks that emerge as brains increase in size. This constraint has severely limited their capacity for intelligence.
Primates, on the other hand, do adapt to these challenges. As primate brains become larger from species to species, their blueprints do gradually change—allowing them to circumvent the problem of long-distance communication.
Kaas thinks that primates managed to keep most of their neurons the same size by shifting…
The urge to find the animal “roots” of human behavior is enticing because humans are animals. We are mammals, primates, and hominoids (the superfamily of apes). Because of these realities, we share more of our evolutionary history, our DNA, and our physiology with chimpanzees (including bonobos) than with any other living thing. In light of our commonalities, many researchers look to the chimpanzee world in order to better understand the human one.
The argument goes that if warfare, sexual coercion, male aggression, the creation and use of tools, hunting, and other patterns show up in both chimpanzees and humans, then these are evolutionarily old, shared traits. Thus, understanding the reasons behind these behaviors in chimps can offer insight into similar behaviors in humans.
This premise is nice, but it is mostly wrong.
There is significant overlap between humans and chimpanzees. However, to draw evolutionary insights from comparisons between species, we must be sure that we are comparing the same underlying evolutionary processes and that “similar” patterns are indeed similar.
Most of what chimpanzees and humans do today is not directly comparable—because we have evolved independently for millions of years. Along those very different evolutionary paths, both species have picked up a suite of distinctive ways of being in the world.
Recently, and with good reason, much of our focus has been on our extensive connections to our closest relatives and the rest of the natural world. There are many reasons to study these links, including as an opportunity to dispel erroneous and even dangerous thinking. Historically, through to the present day, certain scholars, philosophers, and religious thinkers have argued that human “nature” lies outside of the natural world, with no connection to other living things. For example, a conviction in “human uniqueness” has been used to justify the exploitation of other species. And a refusal to recognize our connection to other primates is a hallmark of anti-evolutionary dogma; rejecting our evolutionary past denies our biology and history, and negates basic realities about our bodies and lives. Recognizing that we share so much with other primates, especially chimpanzees, enables us to tackle human hubris and rebut those who perceive humans as being above or outside of the natural world.
Aspects of our morphology and physiology—that is, the way we look and function—and our general patterns of social behavior are connected to the fact that we are primates. Like chimpanzees (and most primates), we humans place great importance on our social world and abilities. Our society, with its obligatory social intensity, is part of being a primate, part of our evolutionary heritage.
Chimpanzees do have complex socially acquired behaviors—such as breaking open nuts with rocks, as these adult females are doing—but it’s impossible to equate the social worlds of humans and chimps. Anup Shah/Getty Images
Researchers have observed multiple chimpanzee societies for more than 50 years, so we know a lot about what these primates do and what they don’t do. Chimpanzees have a fascinating array of social traditions (using stone tools to crack open nuts, drinking from leaf cups, “fishing” for termites) and capacities (complex social hierarchies, deep social relationships, complex group conflict with other communities of chimps). Studying these behaviors can tell us a lot about chimpanzees and their evolution. It may also reveal some things about humans.
But we also know chimpanzees don’t have cash economies, governments, religious institutions, creeds, or fanatics. They don’t arrest and deport one another, or create massive economies of material and social inequity. They don’t change planet-wide ecosystems, build cities and airplanes, drive thousands of other species toward extinction, or write science blogs. We do.
We are a very particular mammal, primate, and hominoid that is able to look at the world around us, see it as it is, imagine entirely new possibilities, and convert those imaginings into material reality—or at least try to. We have evolved the capacity to be the most compassionate, the cruelest, the most creative, and the most destructive of all life on this planet. And we demonstrate these abilities often. How this difference came to be matters. It’s only by delving into humanity’s distinctive evolutionary history, since our split with the other ape lineages, that we are better able to develop a fuller understanding of the human niche, of what makes us specifically human.
Focusing primarily on the continuities between humans and chimpanzees (and other animals) without recognizing, understanding, and investigating the discontinuities confounds our ability to offer evolutionary insights into critical contemporary challenges. Racism and global climate change are not explained by our shared history with chimpanzees, nor are gender diversity, the #MeToo movement, and the recent rise in nationalism.
In the 7 to 10 million years since the human-chimpanzee lineage split, we have changed a lot. Genetically, we’ve accumulated 17 million novel single-nucleotide polymorphisms—that is, single-location changes in our DNA. We’ve also had 2.5 million insertions and deletions (stretches of DNA gained or lost) when compared with chimpanzee genomes. New research has even identified dozens of genes related to brain structure and function that differ between humans and chimpanzees. Morphologically, we’ve increased our fat-to-muscle ratio and reorganized the patterns of bones, muscles, and ligaments in our hands, feet, faces, and lower limbs.
A brief journey through our evolutionary history further illuminates the complexity of the differences between us and our primate relatives. The first evidence for members of our own genus, Homo, dates to between 2 and 3 million years ago. Like all primates, these earliest ancestors had complex social lives, but the intensity of their group cohesion exceeded that of most other primates. They also walked on their hind legs.
Known as the “Ledi jaw,” the oldest known fossil of the genus Homo dates to between 2.75 and 2.8 million years ago. Brian Villmoare
These ancestors could see in stones the potential for a tool, a vision they inherited from the hominins who came before them. They not only reshaped rock into new forms for their benefit, they cultivated, expanded, and mastered that ability—going beyond any other species on this planet.
In the next 1.5 million years or so, Homo changed in increasingly complex ways, both behaviorally and neurobiologically. A powerful feedback loop had begun: New abilities created opportunities that in turn increased what we could achieve. The innovations included more complex stone toolmaking, communal care of children, control of fire, and the creation of meaning-laden materials. Our capacity to forage grew, as did our ecological expansion across the planet.
And our brain increased in size and complexity. Today, when contrasted with other mammals, including many primates, our brains have grown as much as six times larger relative to our body size. We’ve ramped up the relative size and complexity of the frontal lobes and the overall cortex, areas of the brain particularly associated with complex thought, planning, and decision-making. It takes as much as three times as long for humans to develop an adult brain compared with our closest relatives.
In the last 300,000 to 400,000 years, human material and social complexity ratcheted up. Human societies dawned as geographically distant populations of Homo began to interact with greater frequency. Out of these associations, distinctively human ecologies were born.
At this time, we began to develop a rich and amazingly complex suite of morphological, vocal, gestural, and symbolic processes (called “language”) that enabled humans to share and receive information at multiple levels. We can now discuss the past and future, inner states and imaginings, and hopes and dreams. We convey these concepts far and wide, and even insert them into material items, such as books, papers, and recordings. Thanks to recorded histories and stories, humans—unlike any of our relatives—can communicate our thoughts, ideas, experiences, wishes, and visions to other humans even after we are dead.
A figurine made from mammoth ivory some 12,000 to 52,000 years ago reveals our ancestors’ desire to create meaning out of everyday phenomena. Dea/G. Cigolini/Getty Images
Material evidence for our ancestors’ creative meaning-making suggests that such practices became commonplace between roughly 300,000 and 40,000 years ago. Engravings, beads, cave art, and figurines reveal how our forebears imagined novel items and representations, and how they created or modified materials to re-envision themselves and their world. This period is also likely when they began to create explanations for observable phenomena, such as storms, the movement of the moon, and even death.
These people had the capacity to think in ways similar to how we think today. And the resemblance did not end there. Fossils from this time period look more like the bones of contemporary humans than any previous populations.
By a few hundred thousand years ago, we had grown out of socially complex hominin origins to become organisms who existed in a highly constructed, innovative, and hyper-complex niche. Our species had become one that used imagination, complex material and social networks, and dense, multifarious communication to reshape itself and the world around it.
In the last 8,000 to 15,000 years or so, we have created concepts of property, ownership, and identity that have shaped how we structure our lives. Breakthroughs in how we grow and store food, and the domestication of some animals and plants, allowed us to arrange ourselves in new residential patterns. The resulting transformations in our social organizations ushered in creative ways of making a living.
And structural innovations and new social phenomena housed and birthed emerging social orders. Cities, communal monumental architecture, religious institutions, large-scale politics and economies, inequality, and warfare all arose, expanded, and flourished. Each advance reconfigured the possibilities and patterns of distinctively human behavior.
During this recent phase in human history, gender, politics, and economics influenced—to a much greater degree than in previous eras—how people thought about themselves and how they experienced and envisioned the larger world. These recent, and increasingly complicated, processes opened the doors for progressively more and more structured, far-reaching, and unequal human social realities.
Chimpanzees (and many other animals) do have complex societies and social lives. But it is critical to place humanity and all that our species has done in the context of our distinctive evolutionary history.
All of the skills that humans acquire, use, and alter across our lifetimes are particularly (but not exclusively) facilitated by the processes and patterns of our distinctive evolutionary past and present. Human behavior has to be examined in the context of what humans are and do, how we develop our bodies and minds. Many human processes have no direct comparisons in chimpanzees because of the substantive differences between our evolutionary trajectories.
It is therefore wrong to compare Trump to a chimpanzee. Stating that Trump is like an aggressive alpha male chimp implies that the deep explanation for his behavior stems from evolutionary roots and behavioral patterns he shares with chimpanzees. That stance disregards the distinctively human processes and contexts at play in contemporary humanity that underpin and facilitate his actions. These contexts include the history and current reality of specific inequalities that stem from economic, racist, and sexist processes in our society. We need to draw on these realities in order to effectively critique and challenge his behavior.
Humans and chimpanzees do share much in common, but when it comes to dealing with contemporary human behavior, we must look to human evolutionary histories and current realities. That approach will get us a lot further than facile comparisons to our closest relatives.
Around the world, more than 200 million people live with an infection from the hepatitis B virus, a pathogen that can reside in the human body for long stretches of time and cause serious complications, such as chronic liver disease. Despite its prevalence, little is known about the ancestral roots of the virus. New findings, published today in Nature, reveal some of the oldest samples of the virus to date—between 800 and 4,500 years old—and provide fresh insights into its origin and evolution.
Although scientists have previously uncovered the hepatitis B virus in two 16th-century mummies, most investigations into the pathogen have focused on DNA sequences extracted from humans during the last 50 years, says Barbara Mühlemann, a doctoral student studying viral evolution at the University of Cambridge in the U.K. So uncovering these ancient genomes was “a bit like finding a fossil for the first time,” Mühlemann says.
To probe for signs of ancient outbreaks in individuals who occupied Eurasia from the Bronze Age to the medieval period, the researchers examined DNA extracted from the remains of 304 ancient humans. These samples, which spanned a range of more than 6,000 years, were from large collections of skeletons from two previous examinations of human genomes in this region.
Approximately half of those ancient DNA samples came from another new Nature study, published today, that reports an analysis of the genomes of people who lived on the Eurasian steppe from the period right after the Bronze Age through medieval times. These types of studies are only possible because of the advances in genetic techniques and decades of painstaking work by archaeologists who excavated the burial mounds to unearth the remains, says Peter de Barros Damgaard, a doctoral student in molecular anthropology at the Natural History Museum of Denmark, and co-author of both of the new studies. “We were hugely privileged to work with all these skeletons at once.”
When the researchers set out to search for viruses within these samples, they were not sure what they would discover, says Mühlemann, a co-author of the hepatitis study. While other groups had found signs of bacteria from ancient plagues in similarly aged samples, no one had looked for viruses. “We didn’t know if they would be preserved for such a long time,” she notes.
So it came as a surprise when the team found that 25 individuals showed signs of a hepatitis B infection. Of those, only 12 viral genomes, ranging between 800 and 4,500 years old, were intact enough for further analysis. While these are some of the oldest hepatitis viruses identified in ancient human remains to date, it is not particularly surprising that people were infected with this pathogen over thousands of years, Mühlemann says.
Further analyses of the genomes of these ancient viruses revealed that most of them could fit into a modern-day genotype, or category based on similarities in their DNA. Three of the genetic sequences, however, did not have a modern match and were linked to at least one newly uncovered strain that is now extinct. “We have really no idea of what diversity [of viruses] was in ancestral populations of humans,” says Hendrik Poinar, an evolutionary geneticist at McMaster University in Canada, who was not involved in this research. Evolutionary theory predicts that some viral sequences will be lost by random chance, and finding an extinct strain of hepatitis B suggests that the diversity of the virus may have been much greater in the past than it is today, he explains.
The researchers also found that some of the ancient viral sequences came from regions not typically associated with their specific genotypes today. For example, some scientists have proposed, based on its modern-day distribution, that the hepatitis B genotype A (a particular strain) originated in Africa and spread through the slave trade between the 16th and 19th centuries. The new findings, however, reveal that this strain of the virus existed in two approximately 4,000-year-old Eurasian samples and in one Hungarian Scythian sample that was around 2,600 years old.
This map shows the worldwide distribution of the hepatitis B virus in 2005. Centers for Disease Control/Wikimedia Commons
“Our data suggest that the geographic origin of genotype A is not in Africa,” Mühlemann says. However, she adds, it’s still possible that the virus moved to other parts of the world through the slave trade, or that an earlier version of the pathogen in Africa remains undiscovered.
Further insights into the dynamics of ancient human groups may also help shed light on how the virus evolved. Findings from the new human genome study, for example, reveal that at least with respect to the Scythians, who occupied the Eurasian steppe from 800 to 200 B.C., previously established notions of large-scale migrations may be incorrect. Rather, the researchers’ discovery of genetically similar groups suggests that change may have occurred through smaller-scale movements. If you take these insights and apply them to the hepatitis work, it raises the question of how, exactly, these types of interactions affected the spread of ancient hepatitis infections, says Michael Frachetti, a professor of archaeology at Washington University in St. Louis, who was not involved in the work.
Mühlemann marvels that these ancient DNA sequences “open a window into studying virus evolution that we haven’t had before.” And by revealing modifications that have occurred in the past, she says, “these studies could shed light on how the virus might change its genome in the future.”