Tuesday, January 6, 2009

Archaeogenetics Illuminates Prehistory


Mystery man: Cro-Magnons emerged about 40,000 years ago and were very similar to us. So why did it take tens of thousands of years for civilization to take hold?
Credit: Bettmann/Corbis
Prehistory: The Making of the Human Mind
By Colin Renfrew
Modern Library, 2008, $23.00

How did we become the thinking animals that we are? That's the question at the heart of the study of human prehistory--and the one that Colin Renfrew has been asking since the summer of 1962, when he traveled to Milos, one of the Cycladic Islands in the Aegean Sea, a source of the black obsidian that was the earliest commodity traded by humans.

Renfrew--Lord Renfrew of Kaimsthorn since he was made a British life peer in 1991 to honor his many contributions to archaeology--was then a graduate student at Cambridge. As an undergraduate, he'd first studied natural sciences before moving on to archaeology; thus, seeking a means to determine the provenance of the obsidian that prehistoric peoples favored for toolmaking, he tried the novel tactic of using optical emission spectroscopy to analyze its trace elements.

"We really hit lucky," Renfrew told me recently. "Obsidian makes much thinner, sharper blades than flint and so was a preferred substance found at almost all the early Neolithic sites in Greece. In fact, we learned it was already traded during the Upper Paleolithic." Yet the principal quarries for obsidian in the Aegean were on Milos. "So the material documents the earliest known seafaring," Renfrew says. "We needed nevertheless to be sure where it was coming from. Trace-element analysis let us characterize each different obsidian source, since they're created by relatively recent volcanoes and tend to be consistently distinguishable." Renfrew found that he could clearly graph how far the material had traveled: obsidian from a site in Anatolia (modern Turkey), in one instance, had been transported approximately 500 miles to Palestine. Overall, the picture that emerged suggested a world where most people never traveled more than a few miles from where they were born, but a few went everywhere. "It's an interesting picture," Renfrew says. "It was the seafarers who traveled distances, getting around the Aegean Islands quite widely and clearly doing that before the origins of farming."

Next, Renfrew turned his attention to what had been a cherished assumption in archaeology: that prehistoric cultural innovation originated in the Near East and diffused to Europe. "Just in archaeological terms, I didn't think that argument was very good," he says. "In Bulgaria and Romania, I'd been struck by the early metallurgy at some sites. So when radiocarbon dating arrived--particularly when tree-ring calibration came through in the late 1960s--the penny dropped." The new technological methods proved that, indeed, certain artifacts in Central and Western Europe were older than their supposed Near Eastern forerunners. Renfrew wrote a book, Before Civilization: The Radiocarbon Revolution and Prehistoric Europe (1973), pointing out that "the previous diffusionist chronology collapsed at several points."
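The radiocarbon half of that revolution rests on simple decay arithmetic: the fraction of carbon-14 a sample retains, relative to the modern standard, yields a conventional radiocarbon age, which tree-ring calibration then converts to calendar years. A minimal sketch of the uncalibrated step follows; the calibration curve itself (such as IntCal) is omitted.

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years; 5568 / ln(2), fixed by convention

def radiocarbon_age(fraction_modern: float) -> float:
    """Conventional (uncalibrated) radiocarbon age in years BP, from the
    ratio of a sample's 14C activity to the modern standard."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# A sample retaining 48% of modern 14C activity:
print(round(radiocarbon_age(0.48)))  # ~5896 years BP, before calibration
```

Tree-ring calibration matters because atmospheric carbon-14 levels have fluctuated; mapping raw ages onto dated tree rings is what pushed the European dates back and made "the penny drop."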

Over the decades, Renfrew has remained at his field's cutting edge; he was among the earliest advocates of technologies like computer modeling and positron emission tomography (PET), the latter to examine contemporary subjects' brain activities as they replicated the toolmaking of Lower Paleolithic hominids. In his latest book, Prehistory: The Making of the Human Mind, Renfrew has not only produced a summary of by far the greater part of human history but also provided an account of archaeology's advance since European scholars realized some 150 years ago that the human past extended many millennia further back than 4004 b.c.e. (the 17th-century theologian Bishop Ussher's estimate of when God had created the world). Given its vast subject and its strictures of length, probably the only real criticism one can make of the book is that in its index, under the letter R, the author is missing. It's a significant omission: Renfrew has informed today's understanding of human prehistory much as he says Gordon Childe--who originated the concepts of the Neolithic and urban revolutions--shaped thinking during the first half of the 20th century. Like Childe, he has been one of the great archaeological synthesizers, working to construct a theory of global human development. For Renfrew, all archaeology ultimately leads to cognitive archaeology--the branch that investigates the development of human cognition.

In particular, Renfrew has been preoccupied by what he has dubbed the "sapient paradox": the immense time lag between the emergence of anatomically modern human beings and the advent of the cultural behaviors that we take to define humanity.

Prehistory is defined as that period of human history during which people either hadn't yet achieved literacy--our basic information storage technology--or left behind no written records. Thus, in Egypt, prehistory ended around 3000 b.c.e., in the Early Dynastic Period, when hieroglyph-inscribed monuments, clay tablets, and papyrus appeared; in Papua New Guinea, conversely, it ended as recently as the end of the last century. Archaeologists and anthropologists accept this region-by-region definition of prehistory's conclusion, but they agree less about its beginning. A few have seen prehistory as commencing as recently as around 40,000 b.c.e., with the emergence of Cro-Magnon man, who as Homo sapiens sapiens was almost indistinguishable from us (although Cro-Magnons, on average, had larger brains and more robust physiologies). However, most experts would probably say that prehistory began in the Middle Pleistocene, as many as 200,000 years ago--when Homo neanderthalensis (sometimes classified as Homo sapiens neanderthalensis) and archaic Homo sapiens emerged. Either way, it's assumed that the appearance of Homo sapiens sapiens triggered "a new pace of change ... that set cultural development upon [an] ... accelerating path of development," as Renfrew writes in Prehistory. But Renfrew thinks that this acceleration must have been due to something else.

"The evidence that Homo sapiens' arrival equates with full linguistic abilities, the human behavioral revolution, and so on is very limited," Renfrew told me, adding that he sees nothing clearly separating the flint tools of the Neanderthals from those associated with Homo sapiens. As for the cave paintings at Altamira, Lascaux, and other Southern European sites, which are 15,000 to 17,000 years old: "They're amazing, but stylistically singular and very restricted in their distribution. They mightn't be characteristic of early Homo sapiens." Overall, Renfrew thinks, if aliens from space had compared Homo sapiens hunter-gatherers with their earlier counterparts, they probably wouldn't have seen much difference.

Two and a half million years ago, the first protohumans, Homo habilis, shaped stones to take the place of the claws and fangs they lacked, using them to kill small animals and scavenge the remains of larger ones. The payoff was immense: whereas metabolic needs like food processing constrain brain size for most mammals, eating meat enabled habilis to start evolving a smaller gut, freeing that metabolic energy for the brain's use. After a few hundred thousand years, later hominids like erectus and ergaster had developed straightened finger bones, stronger thumbs, and longer legs. The expansion of hominid brains--they were twice as big within a million years, three times by the Middle Paleolithic--enabled symbolic communication and abstract thought. By 50,000 b.c.e., our ancestors had spread from Africa through Asia, Europe, and Australia.

Archaeogenetics Emerges
The paradox, or puzzle, is this: if archaic Homo sapiens emerged as long as 200,000 years ago, why did our species need so many millennia before its transition, 12,000 to 10,000 years ago, from the hunter-gatherer nomadism that characterized all previous hominids to permanent, year-round settlement, which then allowed the elaboration of humankind's cultural efforts? To answer this question, Renfrew calls for a grand synthesis of three approaches: scientific archaeology, which collects hard data through radiocarbon dating and similar technologies; linguistic study aimed at constructing clear histories of the world's languages; and molecular genetic analysis.

Renfrew sees this last approach, which he calls archaeogenetics, as progressing most rapidly. So far, archaeogenetics has relied principally on analysis of human mitochondrial DNA (mtDNA), which is found not in the paired chromosomes within cell nuclei but in small circular molecules inside the mitochondria that generate most of the cell's chemical energy. Unlike chromosomal DNA, mtDNA derives only from the ovum--so it represents only the maternal lineage--and does not recombine from generation to generation. Apart from mutation, then, it passes unchanged from mother to child. And over thousands of years, single-nucleotide polymorphisms--mutations that alter a single DNA base pair--do accumulate in mitochondria at a statistically predictable rate. Given that mutation rate, researchers can analyze and compare mtDNA samples from individuals throughout the world, using the similarities and differences to construct a great human family tree.
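The molecular-clock reasoning behind that family tree is simple enough to sketch: count the single-nucleotide differences between two aligned sequences, then divide by an assumed mutation rate to estimate when the maternal lineages diverged. The sequences and the rate below are toy placeholders, not measured values.

```python
def pairwise_diffs(seq_a: str, seq_b: str) -> int:
    """Count single-nucleotide differences between two aligned sequences."""
    return sum(1 for a, b in zip(seq_a, seq_b) if a != b)

def divergence_years(seq_a: str, seq_b: str,
                     rate_per_site_per_year: float) -> float:
    """Rough molecular-clock estimate of time since two lineages shared a
    maternal ancestor. Mutations accumulate on both branches, hence the 2."""
    diffs_per_site = pairwise_diffs(seq_a, seq_b) / len(seq_a)
    return diffs_per_site / (2 * rate_per_site_per_year)

# Toy aligned fragments; the rate is a placeholder, not a measured value.
a = "ACGTACGTACGTACGTACGT"
b = "ACGTACGAACGTACGTACTT"
print(divergence_years(a, b, rate_per_site_per_year=1e-7))  # 500000.0
```

Real studies refine this with calibrated rates and tree-building across thousands of samples, but the principle--differences divided by rate equals time--is the same.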

Furthermore, Renfrew told me, "studies of mtDNA mutation rates give an approximate chronology that ties quite nicely to data from radiocarbon dating of fossil remains." Like radiocarbon dating itself, mtDNA analysis has refuted long-cherished myths about race by showing that humankind almost certainly had a single origin in Africa, with our main dispersal out of that continent occurring about 60,000 years ago and probably involving a relatively small number of humans. During humanity's global diaspora, many populations grew isolated. Today, mitochondrial haplogroups--groups that share common ancestors--are identifiable as originating in Africa, Europe, Asia, the Americas, and the Pacific Islands.

Mitochondrial-DNA analysis is only one tool in an expanding genomic arsenal. The fuller picture is, perhaps, even more dramatic than Renfrew suggests. Increasingly, we look like just one taxonomic variant within the continuum of the hominid clade: a FOXP2 gene variant strongly implicated in our language capabilities, for instance, is one we shared with Neanderthals 60,000 to 100,000 years ago. According to John Hawks, an anthropologist and population geneticist at the University of Wisconsin-Madison, Neanderthals and Homo sapiens may well have interbred: "No primate species have established reproductive boundaries into sterility in less than a couple of million years. Neanderthals and ourselves resemble, maybe, chimps and bonobos, which are geographically separated in nature but hybridize freely if placed together in a zoo." In short, though we tend to be species-centric about the concept of humanity, the reality is that all organisms are temporary receptacles into which DNA pours itself, and interspecies boundaries are more fluid and tenuous than we've thought. In a sense, the idea of Homo sapiens as a distinct species is one more racial myth.

Other assumptions don't hold up any better. Not only did Cro-Magnons have larger brains than we do, for example, but the difference was big. "In the last 10,000 years, our brains have shrunk about 200 cubic centimeters," Hawks explains. "If we shrunk another 200, we'd be the equivalent of Homo erectus. One possibility is this represents greater efficiency--our brains using less energy, needing less developmental time, and signaling faster. Alternatively, of course, we're getting dumber."

Pondering these and similar questions, Hawks and other researchers wondered if data from the International HapMap Project--a consortium established to catalogue the patterns of human genetic variation in different populations around the globe--could help clarify matters.

In population genetics, "linkage disequilibrium" means that certain alleles--the alternative versions of a given gene responsible for variations such as brown or blue eyes--occur together more frequently than chance alone can explain. It is a signature of recent selection: an advantageous new mutation that spreads rapidly drags its neighboring variants along with it before recombination has time to break up the association. Hawks and his colleagues applied novel genome-scanning approaches to HapMap data to track linkage disequilibrium and then, in December 2007, published a controversial paper, "Recent Acceleration of Human Adaptive Evolution," in the Proceedings of the National Academy of Sciences.
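The basic statistic is easy to state: for two loci, the disequilibrium D is the excess frequency of a haplotype over what independent assortment would predict, and r-squared is its normalized strength. A minimal sketch with invented haplotype data follows; genome-wide scans like the one in the PNAS paper use far more elaborate methods built on the same quantity.

```python
def linkage_disequilibrium(haplotypes: list[tuple[str, str]]):
    """Compute D and r^2 for two biallelic loci from phased haplotypes.
    Each haplotype is a pair of alleles at the two loci, e.g. ('A', 'B')."""
    n = len(haplotypes)
    p_a = sum(1 for h in haplotypes if h[0] == "A") / n       # allele A, locus 1
    p_b = sum(1 for h in haplotypes if h[1] == "B") / n       # allele B, locus 2
    p_ab = sum(1 for h in haplotypes if h == ("A", "B")) / n  # A-B haplotype
    d = p_ab - p_a * p_b  # D = 0 when the alleles associate at random
    r2 = d * d / (p_a * (1 - p_a) * p_b * (1 - p_b))
    return d, r2

# Invented sample in which A and B co-occur more often than chance predicts:
haps = [("A", "B")] * 6 + [("a", "b")] * 6 + [("A", "b")] * 2 + [("a", "B")] * 2
print(linkage_disequilibrium(haps))  # -> (0.125, 0.25): nonzero LD
```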

"When I was in graduate school in the mid-'90s, the dogma was that culture had halted evolution," Hawks told me. But he and his colleagues found genomic evidence that, on the contrary, culture has increased the pace of human evolution over the last 40,000 years, and especially over the last 10,000. What's driven this acceleration, they argued in PNAS, is the global human population explosion that commenced 10,000 years ago, as a consequence of the agricultural revolution. Humankind invented agriculture, started eating different foods, and began dwelling in cities; populations expanded, allowing large numbers of mutations. Natural selection promoted the spread of beneficial variations.

According to Hawks, evidence indicates recent selection on more than 1,800 human genes. Beyond identifying a selected allele, he adds, analysis can often determine from its sequence something of what the allele does. Hawks believes that some of the new alleles confer new digestive capabilities, as with glucose and lactose tolerance; pathogen resistance, as against malaria; improved capacity for DNA repair, which may be associated with human longevity; and new neurotransmitter variations, like the dopamine variant DRD4-7R, which was strongly selected for in some populations perhaps 40,000 years ago and is implicated in heightened tendencies toward impulsiveness, attention deficit disorder, and alcoholism. (More-conservative population geneticists argue that while humans are probably still evolving, it's not clear that evolution is accelerating, and still less certain which alleles are of recent origin.)

Discussing differences in populations isn't something our egalitarian society enjoys. But one of Hawks's coauthors, Henry Harpending, a population geneticist and anthropologist at the University of Utah, thinks it should be: "Citizens should appreciate that evolution is ongoing, numerous real human differences exist, and we're hurting many people by denying them." Harpending notes, too, that life-sciences industries followed up on the paper, seeking opportunities for drug development and personalized medicine. "In the face of embarrassed silence from the world's scientists, they're not inhibited," he says. "They want to make money and are on it like crows on roadkill." If Harpending is right, we will learn new facts about human development whether we want to or not.

Mark Williams is a contributing editor to Technology Review.
