It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Saturday, March 13, 2021
We Finally Know The True Age of The Huge, Mysterious Objects in Laos' Plain of Jars
In total, there are thousands of them – a giant landscape of strange, hollowed jars, carved from ancient stone. Some have lids. Most are open to the sky.
These surreal cauldron-like megaliths in Laos are known as the Plain of Jars, archaeological relics whose original purpose is still shrouded in mystery, their significance long forgotten.
For several decades, researchers have suggested the jars were a part of prehistoric burial practices. Local legends and lore suggest the jars, some of them up to three metres (nearly ten feet) tall, were used for storage of food, alcohol, and rainwater, among other things.
For tragic reasons, it's been almost impossible for modern archaeologists to study the sites and discover the truth.
The Plain of Jars region and Laos as a whole still bear the terrible legacy of millions of unexploded bombs dropped by the US Air Force in the 1960s.
To this day, hundreds of innocent Laotians die every year as a result, decades after the conflict's official end.
And so the mystery of the Jars endures, with fewer than 10 percent of the megaliths having been investigated, researchers say.
In recent years, however, expeditions within selected safe sites have commenced, and archaeologists are now making important discoveries about these unusual objects, some of which stand alone, while others are clustered in great groups.
"Until now, it has not been possible to estimate when the jars were first placed on the landscape or from where the stone was sourced," an international team explains in a new paper detailing the latest research.
According to their analysis – which used a technique called Optically Stimulated Luminescence (OSL) to date sediment beneath the stones – the jars were positioned potentially as early as the late second millennium BCE.
Evidence of varied mortuary practices at some of the sites – including primary burials of human skeletons as well as bundled or jarred collections of bones – was also dated, using radiocarbon, suggesting activity between the 9th and 13th centuries CE.
On the face of the latest evidence, this means the placement of the jars pre-dates the confirmed mortuary activity around them, potentially by thousands of years. As for what that means, we don't yet know.
"The data presented here strongly suggests that the placement of the megaliths preceded the mortuary activity around the jars, indicating re-use of the sites and enduring ritual significance," the researchers write.
However, previous research has suggested the mortuary rituals may be as old as the stone placements themselves, so it's possible wider searches would reveal a more continuous timeline of human activity.
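For readers curious about the dating side, radiocarbon ages like those for the burials above are conventionally derived from how much of a sample's original carbon-14 remains. The sketch below shows only that basic arithmetic with a hypothetical measurement; the study itself would have reported calibrated ages, which additionally require a calibration curve.

```python
import math

# Conventional radiocarbon ages use the Libby mean life of 8,033 years.
# Converting to calendar years (as done for archaeological burials) requires
# a calibration curve and is not shown here.
LIBBY_MEAN_LIFE = 8033.0  # years

def radiocarbon_age(fraction_modern: float) -> float:
    """Conventional radiocarbon age (years BP) from the measured 14C fraction."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# Hypothetical example: a sample retaining ~88% of its original carbon-14
print(round(radiocarbon_age(0.88)))  # ~1027 years BP
```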
Another puzzle that remains is how the jars got to their current positions.
Examination of megaliths at one site suggests the most likely quarry was 8 kilometres (5 miles) away from where the jars ended up – so just how the ancient culture that created these objects (some estimated to weigh more than 30 tonnes) managed to transport them is yet another unknown.
Still a mystery for the ages, then, and no mistake.
The findings are reported in PLOS One (link not yet live at the time of publishing).
Contentious Hypothesis Posits Human Brains Grew Larger as We Hunted Smaller Prey
(juliawhite/iStock/Getty Images)
STEPHANIE PAPPAS, LIVE SCIENCE 12 MARCH 2021
Over the course of the Pleistocene epoch, between 2.6 million years ago and 11,700 years ago, the brains of humans and their relatives grew.
Now, scientists from Tel Aviv University have a new hypothesis as to why: As the largest animals on the landscape disappeared, the scientists propose, human brains had to grow to enable the hunting of smaller, swifter prey.
This hypothesis argues that early humans specialized in taking down the largest animals, such as elephants, which would have provided ample fatty meals. When these animals' numbers declined, humans with bigger brains, who presumably had more brainpower, were better at adapting and capturing smaller prey, which led to better survival for the brainiacs.
Ultimately, adult human brains expanded from an average of 40 cubic inches (650 cubic centimeters) 2 million years ago to about 92 cubic inches (1,500 cubic cm) on the cusp of the agricultural revolution about 10,000 years ago.
The hypothesis also explains why brain size shrank slightly, to about 80 cubic inches (1,300 cubic cm), after farming began: The extra tissue was no longer needed to maximize hunting success.
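As a quick sanity check of the figures quoted above, the cubic-inch and cubic-centimeter values are simply unit conversions of one another (1 cubic inch is roughly 16.39 cubic centimeters); a minimal sketch:

```python
# Quick check of the brain-volume figures quoted above
# (1 cubic inch = 16.387 cubic centimeters).
CM3_PER_IN3 = 16.387

for label, cubic_inches in [("~2 million years ago", 40),
                            ("pre-agriculture", 92),
                            ("post-agriculture", 80)]:
    print(f"{label}: {cubic_inches} cu in = {cubic_inches * CM3_PER_IN3:.0f} cu cm")

# 40 -> ~655, 92 -> ~1508, 80 -> ~1311, matching the rounded 650, 1,500
# and 1,300 cubic cm figures in the article.
```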
This new hypothesis bucks a trend in human origins studies. Many scholars in the field now argue that human brains grew in response to a lot of little pressures rather than one big one.
But Tel Aviv University archaeologists Miki Ben-Dor and Ran Barkai argue that one major change in the environment would provide a better explanation.
"We see the decline in prey size as a unifying explanation not only to brain expansion, but to many other transformations in human biology and culture, and we claim it provides a good incentive for these changes," Barkai wrote in an email to Live Science.
"[Scholars of human origins] are not used to looking for a single explanation that will cover a diversity of adaptations. It is time, we believe, to think otherwise." Big prey, growing brains
The growth of the human brain is evolutionarily outstanding because the brain is a costly organ. The Homo sapiens brain uses 20 percent of the body's oxygen at rest despite making up only 2 percent of the body's weight. An average human brain today weighs 2.98 lbs. (1,352 grams), far exceeding the brains of chimpanzees, our nearest living relatives, at 0.85 lb. (384 grams).
Barkai and Ben-Dor's hypothesis hinges on the notion that human ancestors, starting with Homo habilis and peaking with Homo erectus, spent the early Pleistocene as expert carnivores, taking down the biggest, slowest prey that Africa had to offer.
Megaherbivores, the researchers argue in a paper published March 5 in the journal Yearbook of Physical Anthropology, would have provided ample calories and nutrients with less effort than foraging plants or stalking smaller prey. Modern humans are better at digesting fat than other primates are, Barkai and Ben-Dor said, and humans' physiology, including stomach acidity and gut design, indicates adaptations for eating fatty meat.
In another paper, published Feb. 19 in the journal Quaternary, the researchers argue that human species' tools and lifestyle are consistent with a shift from large prey to small prey.
In Barkai's fieldwork in Africa, for example, he has found Homo erectus sites strewn with elephant bones, which disappear at later sites from between 200,000 and 400,000 years ago. The human ancestors at those more recent sites seemed to have been eating mostly fallow deer, Ben-Dor wrote in an email to Live Science.
Overall, megaherbivores weighing over 2,200 lbs. (1,000 kilograms) began to decline across Africa around 4.6 million years ago, with herbivores over 770 lbs. (350 kg) declining around 1 million years ago, the researchers wrote in their paper.
It's not clear what caused this decline, but it could have been climate change, human hunting, or a combination of the two. As the biggest, slowest, fattiest animals disappeared from the landscape, humans would have been forced to adapt by switching to smaller animals.
This switch, the researchers argue, would have put evolutionary pressure on human brains to grow larger because hunting small animals would have been more complicated, given that smaller prey is harder to track and catch.
These growing brains would then explain many of the behavioral changes across the Pleistocene. Hunters of small, fleet prey may have needed to develop language and complex social structures to successfully communicate the location of prey and coordinate tracking it.
Better control of fire would have allowed human ancestors to extract as many calories as possible from smaller animals, including grease and oil from their bones. Tool and weapon technology would have had to advance to allow hunters to bring down and dress small game, according to Barkai and Ben-Dor.
A fuzzy past
Single hypotheses for human brain evolution haven't held up well in the past, however, said Richard Potts, a paleoanthropologist and head of the Smithsonian's Human Origins Program in Washington, D.C., who wasn't involved in the research.
And there are debates about many of the arguments in the new hypothesis.
For example, Potts told Live Science, it's not clear whether early humans hunted megaherbivores at all. There are human cut marks on large-mammal bones at some sites, but no one knows whether the humans killed the animals or scavenged them.
The researchers also sometimes use arguments from one time period that might not apply to earlier times and places, Potts said.
For example, the evidence suggests a preference for large prey by Neanderthals living in Europe 400,000 years ago, which would have served those human relatives well in winter, when plants were scarce. But the same thing might not have held true a few hundred thousand or a million years earlier in tropical Africa, Potts said.
And when it comes to brains, size isn't everything. Complicating the picture, brain shape also evolved over the Pleistocene, and some human relatives – such as Homo floresiensis, which lived in what is now Indonesia between 60,000 and 100,000 years ago – had small brains. H. floresiensis hunted both small elephants and large rodents despite its small brain.
The period over which humans and their relatives experienced this brain expansion is poorly understood, with few fossil records to go on.
For example, there are perhaps three or four sites firmly dated to between 300,000 and 400,000 years ago in Africa that are certainly related to humans and their ancestors, said John Hawks, a paleoanthropologist at the University of Wisconsin–Madison who was not involved in the research and was skeptical of its conclusions.
The human family tree was complicated over the course of the Pleistocene, with many branches, and the growth in brain size wasn't linear. Nor were the declines in large animals, Hawks told Live Science.
"They've sketched out a picture in which the megaherbivores decline and the brains increase, and if you look at that through a telescope, it sort of looks true," Hawks told Live Science. "But actually, if you look at the details on either side, brain size was more complicated, megaherbivores were more complicated, and it's not like we can draw a straightforward relationship between them."
The paper does, however, draw attention to the fact that human species may indeed have hunted large mammals during the Pleistocene, Hawks said.
There is a natural bias in fossil sites against preserving large mammals because human hunters or scavengers wouldn't have dragged an entire elephant back to camp; they would have sliced off packets of meat instead, leaving no evidence of the feast at their home sites for future paleontologists and archaeologists.
"I'm sure we're going to be talking more and more about what was the role of megaherbivores in human subsistence, and were they important to us becoming human?" Hawks said.
This article was originally published by Live Science.
Some Women Were Big-Game Hunters, Complicating Ancient Gender Roles
Archeological evidence from Peru has revealed that some ancient big-game hunters were, in fact, women
(Matthew Verdolivo/UC Davis)
VIVEK VENKATARAMAN, THE CONVERSATION 13 MARCH 2021
Early anthropologists viewed hunting – done by men – as the prime driver of human evolution, bestowing upon our early ancestors bipedalism, big brains, tools, and a lust for violence. In this narrative, hunting also gave rise to the nuclear family, as women waited at home for men to bring home the meat.
As an anthropologist who studies hunting and gathering societies, I was thrilled by the discovery of female skeletons buried with big-game hunting paraphernalia, a pattern that raises important questions about ancient gender roles. But I found most of the media coverage it generated disappointingly inaccurate.
Responding to the finding, journalist Annalee Newitz wrote: "Nicknamed 'man the hunter,' this is the notion that men and women in ancient societies had strictly defined roles: Men hunted, and women gathered. Now, this theory may be crumbling."
In fact, that theory died a well-deserved death decades ago.
In 1966, 75 anthropologists (70 of whom were men) held a symposium called "Man the Hunter" at the University of Chicago to address one of humanity's grand questions: How did people live before agriculture?
It was there in Chicago that real-life data confronted the myth of Man the Hunter. Researchers showed that women worked just as hard as men, and plant foods gathered by women were crucially important in hunter-gatherer diets.
Hunter-gatherer movement patterns were driven by a variety of ecological factors, not just game. And many hunter-gatherers were quite peaceful and egalitarian. Hunting wasn't the sole driver or unifying theory of human evolution after all.
Even so, among documented hunter-gatherer societies, most hunters are men. But it is a mistake to conflate this pattern of "most hunters are men" with the myth of Man the Hunter. That myth was born of assumptions, not careful empirical research.
Through decades of field research, anthropologists have developed a more flexible and capacious view of human labor. According to this view, women are not bound by biology to gather, nor men to hunt. In fact, several accounts of women's hunting in foraging societies had emerged by the mid-1980s.
In this context, ancient female hunters are an expectation, not a surprise. And the focus on Man the Hunter distracts from the more important question of how a society with female big-game hunters might be constructed.
After all, women are perfectly capable of hunting, yet in most hunter-gatherer societies they don't do it very often.
Practical constraints, such as the demands of child care, play a role in shaping risk preferences. In hunter-gatherers, men's hunting is risky, meaning it carries a high chance of failure. Men tend to hunt alone or in small groups and target big game with projectile weapons, which often require fast-paced, long-distance travel.
In contrast, women prefer to hunt in groups and focus on smaller, easier-to-capture prey closer to camps, often with the aid of dogs.
Women are often crucial to the hunting success of others, whether through logistical or ritual assistance. Husbands and wives sometimes work collaboratively; in these instances, women may help trap an animal, then club it to death and carry the meat home. And in big-game hunting societies, women provide support to hunters by manufacturing clothing, weaponry, and transportation equipment.
They may also participate in hunting directly by locating, then surrounding and driving game toward a killing location, as seen among high-latitude reindeer hunters and Plains bison hunters. As the authors of the new paper speculate, this is likely how the Peruvian female hunters killed game.
Updated views on plant gathering provide insight into why women may choose not to hunt at all. No one doubts that hunting is hard, but early anthropologists often assumed women's gathering was simple and easy.
This turns out to be wrong. Like hunting, gathering demands extensive ecological knowledge and skill that is socially learned and cultivated over a lifetime.
As a result, hunter-gatherers face tough choices about how to divide difficult labor in a 24-hour day. In this context, economic considerations show that it pays to specialize: modest comparative advantages – speed and strength, and the incompatibilities posed by child care – can lead to divisions of labor that increase overall food acquisition by the group.
From this perspective, women's decisions to hunt less than men may be a rational decision about allocating effort.
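A toy example illustrates the comparative-advantage logic behind that claim. The numbers below are purely illustrative, not data from any foraging society: two foragers who each split the day between hunting and gathering bring in less food overall than if each specializes in the task where their relative advantage lies.

```python
# Hypothetical comparative-advantage sketch (illustrative numbers only):
# Forager A can get 4 prey-equivalents hunting OR 3 tuber-bundles gathering per day;
# Forager B can get 1 prey-equivalent hunting OR 3 tuber-bundles gathering per day.

def day(hunt_rate, gather_rate, frac_hunting):
    """Return (meat, plants) for one day split between hunting and gathering."""
    return hunt_rate * frac_hunting, gather_rate * (1 - frac_hunting)

# Each forager splits the day evenly:
mixed = [day(4, 3, 0.5), day(1, 3, 0.5)]
# A specializes in hunting, B in gathering:
specialized = [day(4, 3, 1.0), day(1, 3, 0.0)]

for label, result in [("mixed", mixed), ("specialized", specialized)]:
    meat = sum(m for m, _ in result)
    plants = sum(p for _, p in result)
    print(f"{label}: {meat} meat, {plants} plants")
# mixed:       2.5 meat, 3.0 plants
# specialized: 4.0 meat, 3.0 plants  -> same plant intake, more meat overall
```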
The Batek people
Many have assumed that by not hunting, women are relegated to lower status. But is that true?
I conduct my work among the Batek people, hunter-gatherers from the rainforests of Malaysia who are widely considered one of the most gender-egalitarian societies in the world. They have little material inequality, share food widely, abhor violence, and emphasize individual autonomy.
When day breaks at camp, Batek men trek far, usually alone, to hunt monkeys with blowpipes. The women gather tubers or fruit in small groups closer to camp. Nothing prohibits Batek women from hunting, unlike in some hunter-gatherer societies where, for example, women are forbidden from touching hunting weapons.
Batek women sometimes join in group hunts of bamboo rats, but this is otherwise rare. There are exceptions, though: some teenage girls develop an interest in blowpipe hunting that carries into adulthood.
The Batek people say this division of labor comes down to strength differences, incompatibility with child care, and differences in knowledge specialization.
Hunting has great cultural significance, but women's knowledge of plant distributions is crucial for collective decisions like moving camp. The Batek conceive of themselves as a co-operative and interdependent group in which each person makes a unique and important contribution toward a communal goal.
Beyond Man the Hunter
Contrary to news reports, the archeological findings from Peru accord well with current knowledge about how and why men and women divide labor among hunter-gatherers. And it has little to do with the myth of Man the Hunter. The Peruvian hunter-gatherers were big-game specialists who used spear-throwing technologies that were likely relatively easy to learn. This may have enabled more flexible divisions of labor and broader participation in hunting by women, similar to what we see among some hunter-gatherers today.
New research on neglected topics like the determinants of women's status and risk-seeking economic behavior in traditional societies promises to shed light on this issue. But as the case of the Batek people shows, among a liberated society of equals, status and power have little to do with who brings in the meat.
In both popular culture and the academic record, the Scythians have been described as a force to be reckoned with. For hundreds of years, they ruled the Eurasian steppe, fierce warriors given an even bigger advantage by their highly mobile, nomadic lifestyle.
Or so we have thought for millennia. According to a new analysis of Scythian bones, this perception is not quite the full picture; in fact, some of the people we group in with the Scythians often did settle down, living more agrarian lifestyles around urban centers.
"Our study demonstrates overall low levels of human mobility in the vicinity of key urban locales of the Scythian era, in contrast to previous stereotypes of highly nomadic populations," said anthropologist Alicia Ventresca Miller of the University of Michigan.
"While long-distance mobility increased during the Scythian era relative to preceding periods, it was limited to a small percentage of individuals."
Our understanding of the people we classify as Scythians, who rose and thrived between 700 BCE and 200 BCE, is based on a number of different sources. There are historical records, including reports from the contemporaneous Greek historian Herodotus; and there's the archaeological record, which is rich with the trappings of a warlike nomadic lifestyle, such as weapons, horse tack and burial mounds.
But the steppe is a large place, 500 years is a fairly long time, and humans are complex. Although all the people in that place and time tend to get grouped together under the Scythian label, the research of Ventresca Miller and her colleagues suggests that several, perhaps even many, diverse groups lived on the Pontic steppe during that time.
The team conducted an isotopic analysis of 'Scythian' teeth and bones found throughout what is now Ukraine, and discovered that those people likely had a more stationary lifestyle - growing millet and raising livestock - than the predominant image of the wild 'barbarians' suggests.
The teeth and bones belonged to 56 individuals whose remains were found at three burial sites in central and eastern Ukraine - Belsk, Mamai-Gora and Medvin. From them, the researchers were able to extract enough material to conduct an isotope analysis. They also analysed bones from a pair of sheep and a pig found buried at Belsk, providing additional context on livestock and what the ancient people ate.
This technique can reveal when and where a person lived. Combinations of isotopes in the soil can be taken up by plants, to be eaten and absorbed by people and other animals. In the case of strontium, the mix of isotopes replaces a small portion of the calcium in their teeth and bones, preserving the ratio as a record of their diet.
Since each geographic location has a different isotopic signature, and because some isotopes decay at a known rate, these isotopes can be used to place the source of a person's diet not just in geographic space, but also time.
Isotope analysis can even reveal if a person moved around from place to place over the course of their lifetime, so it would be a particularly useful tool for understanding the movement of the Scythians.
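In practice, mobility is often assessed by comparing an individual's strontium isotope ratio against the locally bioavailable range; values falling outside that range suggest the person grew up somewhere else. The sketch below illustrates that comparison with entirely hypothetical ratios and baselines, not values from the Ukrainian sites.

```python
# Hypothetical sketch: flag individuals as "non-local" if their tooth-enamel
# 87Sr/86Sr ratio falls outside a locally measured baseline range.
# All numbers below are illustrative, not data from the study.

local_baseline = (0.7090, 0.7105)  # assumed local bioavailable Sr range

individuals = {
    "burial_01": 0.7098,   # within baseline -> likely local
    "burial_02": 0.7131,   # outside baseline -> likely spent childhood elsewhere
}

for name, ratio in individuals.items():
    lo, hi = local_baseline
    status = "local" if lo <= ratio <= hi else "non-local"
    print(f"{name}: 87Sr/86Sr = {ratio:.4f} -> {status}")
```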
The researchers analysed isotopes of strontium, oxygen, nitrogen, and carbon, and compared them to previous studies of human populations in Ukraine from the Neolithic through to the Iron Age. They found strong evidence for the consumption of millet at all three sites, suggesting a reliance on agriculture. Two individuals from Mamai-Gora were found to have been highly mobile; these two ate less millet than the people who didn't move around.
Although these people did move around more than in previous eras, the findings suggest that, by and large, they tended to settle down, farm domesticated grain, and raise livestock, the researchers said.
"The Scythian epoch was clearly a period of contradictions, with strong evidence for complex interactions between agro-pastoralists and pastoralists that contributed to population aggregation in urban locales," Ventresca Miller said.
"This study highlights the potential use of using isotopic analysis to directly assess prevailing models of economies and mobilities during the Scythian era."
The team hopes that future work will include larger, multi-generational samples to unveil a more complete picture of how people moved around - or didn't - on the Eurasian steppe during the time of the Scythians. They also hope to study bones from people of different social status, including those buried in rich graves.
This, they believe, could help us move away from cliches and stereotypes towards a richer, more realistic understanding of human history.
"It is clear that if we are to truly uncover the 'Scythians' we need to accept that the Eurasian steppe was home to a myriad of dynamic cultures and subsistence strategies during the Iron Age," the researchers wrote in their paper.
"In fact, it is perhaps variability, rather than a uniformity of nomadic warriors, that truly frames the Scythians as predecessors to incipient globalization in Eurasia."
Look far enough back in time, and a pattern may emerge. After studying thousands of ancient fossils, paleontologist Jack Sepkoski identified just such a thing in 1981: an epic sequence of life and death, etched into the skeletons of the last 500 million years.
The late Sepkoski, a professor at the University of Chicago, discovered what became known as the three great evolutionary faunas of marine animals – a trio of successive explosions in biodiversity in the ocean over the course of the Phanerozoic Eon.
These giant bloomings of marine life were bookended by catastrophes of world-changing scale: extinction-level events precipitating mass animal die-offs – simultaneously clearing the stage for new creatures to emerge and prosper in the spaces they left behind.
But it doesn't have to happen that way, a new study suggests. Equally powerful forces – capable of shaping macroevolutionary processes with planetary implications – don't always require asteroids or supervolcanoes. Sometimes the fire comes from within.
"The fossil record tells us that some of the key transitions in the history of life were rapid changes triggered by abrupt external factors," explains paleontologist Michal Kowalewski from the University of Florida.
"But this study shows that some of those major transitions were more gradual and may have been driven by biological interactions between organisms."
The case in point is what's known as the Mesozoic Marine Revolution. Commencing roughly 150 to 200 million years ago, this transition represents all the macroevolutionary changes that took place as marine predators like bony fish, crustaceans, and predatory snails increased in numbers, forcing their invertebrate prey, such as mollusks, to adapt defenses against boring and shell-crushing attacks.
In the new research, which used modeling to demonstrate the network of relationships between giant assemblages of prehistoric marine lifeforms, the team found that the Mesozoic Marine Revolution effectively represents a fourth, unrecognized chapter of surging biodiversity within the Phanerozoic – equal in its power to the three great evolutionary faunas Sepkoski identified decades ago.
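The underlying idea of such a model can be pictured as a bipartite network linking time intervals (or assemblages) to the taxa that occur in them, which is then partitioned into groups. The snippet below is a heavily simplified, hypothetical illustration of that structure using placeholder taxa and intervals; the actual study relies on far richer occurrence data and flow-based community detection.

```python
# Minimal, hypothetical sketch of a bipartite occurrence network: one set of
# nodes for time intervals / assemblages, another for taxa, with an edge
# wherever a taxon occurs in an interval. Placeholder data only.
import networkx as nx

occurrences = {
    "Cambrian":   ["Trilobita", "Lingulata"],
    "Ordovician": ["Trilobita", "Crinoidea", "Orthida"],
    "Jurassic":   ["Neogastropoda", "Teleostei", "Decapoda"],
}

G = nx.Graph()
for interval, taxa in occurrences.items():
    for taxon in taxa:
        G.add_edge(interval, taxon)

# Grouping intervals by their shared taxa is what delineates "evolutionary
# faunas" in the real analysis (done there with more sophisticated methods).
print(nx.number_of_nodes(G), nx.number_of_edges(G))
```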
"We are integrating the two hypotheses – the Mesozoic Marine Revolution and the three great evolutionary faunas into a single story," explains first author and paleontologist Alexis Rojas from UmeÃ¥ University in Sweden. "Instead of three phases of life, the model shows four."
Ultimately, although the Mesozoic Marine Revolution was characterized by gradual ecological changes produced by marine life interactions over millions of years, the researchers say it nonetheless triggered a prolonged biotic transition comparable in magnitude to the end-Permian transition.
This episode, often called the Great Dying, occurred approximately 250 million years ago and was Earth's most severe mass extinction event, wiping out approximately 80 percent of all marine species (and 70 percent of terrestrial vertebrates).
In the aftermath, life rebounded with the third great evolutionary fauna, known as the Modern fauna in Sepkoski's framework.
But according to Rojas, Kowalewski, and their team, the Modern period intersected with the Mesozoic Marine Revolution, contributing to a recognizable transition in biodiversity in Earth's marine life during the mid-Cretaceous period, about 129 million years ago.
"What we actually built is an abstracted fossil record that provides a unique perspective of the organization of marine life," Rojas says.
"At the most basic levels, this map shows ocean regions with particular animals," he adds. "The building blocks of our study are the individual animals themselves."
It's tricky to figure out what Earth might have looked like in the early years before life emerged. Geological detectives have now obtained more evidence that it was rather different to the planet we live on today.
According to a new analysis of the features of Earth's mantle over its long history, our whole world was once engulfed by a vast ocean, with very few or no land masses at all. It was an extremely soggy space rock.
So where the heck did all the water go? According to a team of researchers led by planetary scientist Junjie Dong of Harvard University, minerals deep inside the mantle slowly drank up ancient Earth's oceans to leave what we have today.
"We find that water storage capacity in a hot, early mantle may have been smaller than the amount of water Earth's mantle currently holds, so the additional water in the mantle today would have resided on the surface of the early Earth and formed bigger oceans.
"Our results suggest that the long‐held assumption that the surface oceans' volume remained nearly constant through geologic time may need to be reassessed."
Deep underground, a great deal of water is thought to be stored in the form of hydroxy group compounds - made up of oxygen and hydrogen atoms. In particular, the water is stored in two high-pressure forms of the volcanic mineral olivine, hydrous wadsleyite and ringwoodite. Samples of wadsleyite deep underground could contain around 3 percent H2O by weight; ringwoodite around 1 percent.
Previous research on the two minerals subjected them to the high pressures and temperatures of the mantle of modern day Earth to figure out these storage capacities. Dong and his team saw another opportunity. They compiled all the available mineral physics data, and quantified the water storage capacity of wadsleyite and ringwoodite across a wider range of temperatures.
The results showed that the two minerals have lower storage capacities at higher temperatures. Because the early Earth, which formed 4.54 billion years ago, was much warmer internally than it is today (its internal heat loss is extremely slow and has nothing to do with the surface climate), the water storage capacity of the mantle is higher now than it once was.
Moreover, as more olivine minerals are crystallizing out of Earth's magma over time, the water storage capacity of the mantle would increase that way, too.
In all, the difference in water storage capacity would be significant, even though the team was conservative with its calculations.
"The bulk water storage capacity of Earth's solid mantle was significantly affected by secular cooling due to its constituent minerals' temperature‐dependent storage capacities," the researchers wrote.
"The mantle's water storage capacity today is 1.86 to 4.41 times the modern surface ocean mass."
If the water stored in the mantle today is greater than its storage capacity in the Archean Eon, between 2.5 and 4 billion years ago, it's possible that the world was flooded and the continents swamped, the researchers found.
This finding is in agreement with a previous study that found, based on an abundance of certain isotopes of oxygen preserved in a geological record of the early ocean, that Earth 3.2 billion years ago had way less land than it does today.
If this is the case, it could help us answer burning questions about other aspects of Earth's history, such as where life emerged around 3.5 billion years ago. There's an ongoing debate over whether life first formed in saltwater oceans or freshwater ponds on land masses; if the entire planet was engulfed by oceans, that would solve that mystery.
Furthermore, the findings could also help us in the search for extraterrestrial life. Evidence suggests that ocean worlds are abundant in our Universe, so looking for signatures of these soggy planets could help us identify potentially hospitable worlds. And it could strengthen the case for looking for life on ocean worlds in our own Solar System, such as Europa and Enceladus.
Not least, it helps us better understand the delicate evolution of our planet, and the strange, often seemingly inhospitable turns along the way that eventually led to the emergence of humanity.
New research led by the University of Cambridge has found rare evidence - preserved in the chemistry of ancient rocks from Greenland - which tells of a time when Earth was almost entirely molten.
The study, published in the journal Science Advances, yields information on an important period in our planet's formation, when a deep sea of incandescent magma stretched across Earth's surface and extended hundreds of kilometres into its interior.
It is the gradual cooling and crystallisation of this 'magma ocean' that set the chemistry of Earth's interior - a defining stage in the assembly of our planet's structure and the formation of our early atmosphere.
Scientists know that catastrophic impacts during the formation of the Earth and Moon would have generated enough energy to melt our planet's interior. But we don't know much about this distant and fiery phase of Earth's history because tectonic processes have recycled almost all rocks older than 4 billion years.
Now researchers have found the chemical remnants of the magma ocean in 3.6-billion-year-old rocks from southwestern Greenland.
The findings support the long-held theory that Earth was once almost entirely molten and provide a window into a time when the planet started to solidify and develop the chemistry that now governs its internal structure. The research suggests that other rocks on Earth's surface may also preserve evidence of ancient magma oceans.
"There are few opportunities to get geological constraints on the events in the first billion years of Earth's history. It's astonishing that we can even hold these rocks in our hands - let alone get so much detail about the early history of our planet," said lead author Dr Helen Williams, from Cambridge's Department of Earth Sciences.
The study brings forensic chemical analysis together with thermodynamic modelling in search of the primeval origins of the Greenland rocks, and how they got to the surface.
At first glance, the rocks that make up Greenland's Isua supracrustal belt look just like any modern basalt you'd find on the sea floor. But this outcrop, which was first described in the 1960s, is the oldest exposure of rocks on Earth. It is known to contain the earliest evidence of microbial life and plate tectonics.
The new research shows that the Isua rocks also preserve rare evidence which even predates plate tectonics - the residues of some of the crystals left behind as that magma ocean cooled.
"It was a combination of some new chemical analyses we did and the previously published data that flagged to us that the Isua rocks might contain traces of ancient material. The hafnium and neodymium isotopes were really tantalizing, because those isotope systems are very hard to modify - so we had to look at their chemistry in more detail," said co-author Dr Hanika Rizo, from Carleton University.
Iron isotopic systematics confirmed to Williams and the team that the Isua rocks were derived from parts of the Earth's interior that formed as a consequence of magma ocean crystallisation.
Most of this primeval rock has been mixed up by convection in the mantle, but scientists think that some isolated zones deep at the mantle-core boundary - ancient crystal graveyards - may have remained undisturbed for billions of years.
It's the relics of these crystal graveyards that Williams and her colleagues observed in the Isua rock chemistry. "Those samples with the iron fingerprint also have a tungsten anomaly - a signature of Earth's formation - which makes us think that their origin can be traced back to these primeval crystals," said Williams.
But how did these signals from the deep mantle find their way up to the surface? Their isotopic makeup shows they were not just funnelled up from melting at the core-mantle boundary. Their journey was more circuitous, involving several stages of crystallization and remelting - a kind of distillation process. The mix of ancient crystals and magma would have first migrated to the upper mantle, where it was churned up to create a 'marble cake' of rocks from different depths. Later melting of that hybrid of rocks is what produced the magma which fed this part of Greenland.
The team's findings suggest that modern hotspot volcanoes, which are thought to have formed relatively recently, may actually be influenced by ancient processes.
"The geochemical signals we report in the Greenland rocks bear similarities to rocks erupted from hotspot volcanoes like Hawaii - something we are interested in is whether they might also be tapping into the depths and accessing regions of the interior usually beyond our reach," said Dr Oliver Shorttle, who is jointly based at Cambridge's Department of Earth Sciences and Institute of Astronomy.
The team's findings came out of a project funded by Deep Volatiles, a NERC-funded 5-year research programme. They now plan to continue their quest to understand the magma ocean by widening their search for clues in ancient rocks and experimentally modelling isotopic fractionation in the lower mantle.
"We've been able to unpick what one part of our planet's interior was doing billions of years ago, but to fill in the picture further we must keep searching for more chemical clues in ancient rocks," said co-author Dr Simon Matthews from the University of Iceland.
Scientists have often been reluctant to look for chemical evidence of these ancient events. "The evidence is often altered by the course of time. But the fact we found what we did suggests that the chemistry of other ancient rocks may yield further insights into the Earth's formation and evolution - and that's immensely exciting," said Williams.
Zealandia Switch may be the missing link in understanding ice age climates
Orono, Maine -- The origins of ice age climate changes may lie in the Southern Hemisphere, where interactions among the westerly wind system, the Southern Ocean and the tropical Pacific can trigger rapid, global changes in atmospheric temperature, according to an international research team led by the University of Maine.
The mechanism, dubbed the Zealandia Switch, relates to the general position of the Southern Hemisphere westerly wind belt -- the strongest wind system on Earth -- and the continental platforms of the southwest Pacific Ocean, and their control on ocean currents. Shifts in the latitude of the westerly winds affect the strength of the subtropical oceanic gyres and, in turn, influence the release of energy from the tropical ocean waters, the planet's "heat engine." Tropical heat spreads rapidly through the atmosphere and ocean to the polar regions of both hemispheres, acting as the planet's thermostat.
The Southern Hemisphere climate dynamics may be the missing link in understanding longstanding questions about ice ages, based on the findings of the research team from UMaine, Columbia University's Lamont-Doherty Earth Observatory, the University of Arizona, and GNS Science in New Zealand, published in Quaternary Science Reviews.
For more than a quarter-century, George Denton, UMaine Libra Professor of Geological Sciences, the journal article's first author, has led research reconstructing the history of mountain glaciers in the Southern Hemisphere. In the late 1980s, he and Wallace Broecker, a geochemist at Columbia University, noted that a key question about ice ages remained unresolved -- the link between ice age climate and the orbital cycles in the length and strength of the Earth's seasons. Evidence showed that ice age climate changes were synchronous in both polar hemispheres, with rapid transitions from glacial to interglacial global climate conditions. They concluded that existing theories could not adequately account for changes in seasonality, ice sheet size and regional climate.
Mountain glaciers are highly sensitive to climate and well suited to climatic reconstruction, using distinctive moraine deposits that mark the former glacier limits. In the 1990s, Denton led research teams in the mapping and dating of moraine sequences in South America and, more recently, in New Zealand's Southern Alps, with co-author David Barrell, geologist and geomorphologist with the New Zealand government's geoscience research institute, GNS Science.
With advances in isotopic dating of moraines in the mid-2000s, Denton teamed up with Columbia University's Joerg Schaefer, who directs the Cosmogenic Nuclide Laboratory at the Lamont-Doherty Earth Observatory. Together with CU-LDEO colleague and co-author Michael Kaplan, Schaefer, Denton, and UMaine assistant professor and co-author Aaron Putnam have guided a succession of UMaine graduate student field and laboratory projects (including Putnam's Ph.D. work) that have developed a chronology of climate-induced glacier changes in the Southern Alps spanning many tens of thousands of years. The most recent participant in the UMaine-CU partnership is UMaine Ph.D. student and co-author Peter Strand.
Collectively, the UMaine, CU-LDEO and GNS Science partners have worked to create and compile mountain glacier chronologies from New Zealand and South America, producing a comprehensive chronology of glacier extent during and since the last ice age. The team then compared the moraine dating to paleoclimate data worldwide to gain insights into the climate dynamics of ice ages and millennial-scale abrupt climate events. The findings highlight a general global synchronicity of mountain-glacier advance and retreat during the last ice age.
Deep insights into the climate dynamics come from co-author Joellen Russell, climate scientist at the University of Arizona and Thomas R. Brown Distinguished Chair of Integrative Science. Following on her longstanding efforts at modeling the climatic modulation of the westerly winds, she evaluated simulations done as part of the Southern Ocean Model Intercomparison Project, part of the Southern Ocean Carbon and Climate Observations and Modeling initiative. The modeling showed the changes to the southern wind systems have profound consequences for the global heat budget, as monitored by glacier systems.
The "switch" takes its name from Zealandia, a largely submerged continental platform about a third of the size of Australia, with the islands of New Zealand being the largest emergent parts. Zealandia presents a physical impediment to ocean current flow. When the westerly wind belt is farther north, the southward flow of warm ocean water from the tropical Pacific is directed north of the New Zealand landmass (glacial mode). With the wind belt farther south, warm ocean water extends to the south of New Zealand (interglacial mode). Computer modelling shows that global climate effects arise from the latitude at which the westerlies are circulating. A southward shift of the southern westerlies invigorates water circulation in the South Pacific and Southern oceans, and warms the surface ocean waters across much of the globe.
The researchers hypothesize that subtle changes in the Earth's orbit affect the behavior of the Southern Hemisphere westerly winds, and that behavior lies at the heart of global ice age cycles. This perspective is fundamentally different from the long-held view that orbital influences on the extent of Northern Hemisphere continental ice sheets regulate ice age climates. Adding weight to the Zealandia Switch hypothesis is that the Southern Hemisphere westerlies regulate the exchange of carbon dioxide and heat between the ocean and atmosphere, and, thus, exert a further influence on global climate.
"Together with interhemispheric paleoclimate records and with the results of coupled ocean-atmosphere climate modeling, these findings suggest a big, fast and global end to the last ice age in which a southern-sourced warming episode linked the hemispheres," according to the researchers, whose work was funded by the Comer Family Foundation, the Quesada Family Foundation, the National Science Foundation and the New Zealand government.
The last glacial termination was a global warming episode that led to extreme seasonality (winter vs. summer conditions) in northern latitudes by stimulating a flush of meltwater and icebergs into the North Atlantic from adjoining ice sheets. Summer warming led to freshwater influx, resulting in widespread North Atlantic sea ice that caused very cold northern winters and amplified the annual southward shift of the Intertropical Convergence Zone and the monsoonal rain belts. Although this has created an impression of differing temperature responses between the polar hemispheres, the so-called "bipolar seesaw," the researchers suggest this is due to contrasting interregional effects of global warming or cooling. A succession of short-lived, abrupt episodes of cold northern winters during the last ice age is suggested to have been caused by temporary shifts of the Zealandia Switch mechanism.
The southward shift of the Southern Hemisphere westerlies at the termination of the last ice age was accompanied by gradual but sustained release of carbon dioxide from the Southern Ocean, which may have helped to lock the climate system into a warm interglacial mode.
The researchers suggest that the introduction of fossil CO2 into the atmosphere may be reawakening the same dynamics that ended the last ice age, potentially propelling the climate system into a new mode.
"The mapping and dating of mid-latitude Southern Hemisphere mountain-glacier moraines leads us to the view that the latitude and strength of the austral westerlies, and their effect on the tropical/subtropical ocean, particularly in the region spanning the Indo-Pacific Warm Pool and Tasman Sea through to the Southern Ocean, provides an explanation for driving orbital-scale global shifts between glacial and interglacial climatic modes, via the Zealandia Switch mechanism," the research team wrote. "Such behavior of the ocean-atmosphere system may be operative in today's warming world, introducing a distinctly nonlinear mechanism for accelerating global warming due to atmospheric CO2 rise."
Glaciers and enigmatic stone stripes in the Ethiopian highlands
As the driver of global atmospheric and ocean circulation, the tropics play a central role in understanding past and future climate change. Both global climate simulations and worldwide ocean temperature reconstructions indicate that the cooling in the tropics during the last cold period, which began about 115,000 years ago, was much weaker than in the temperate zone and the polar regions. The extent to which this general statement also applies to the tropical high mountains of Eastern Africa and elsewhere is, however, doubted on the basis of palaeoclimatic, geological and ecological studies at high elevations.
A research team led by Alexander Groos, Heinz Veit (both from the Institute of Geography) and Naki Akçar (Institute of Geological Sciences) at the University of Bern, in collaboration with colleagues from ETH Zurich, the University of Marburg and the University of Ankara, used the Ethiopian Highlands as a test site to investigate the extent and impact of regional cooling on tropical mountains during the last glacial period. The results have been published in the scientific journals Science Advances and Earth Surface Dynamics.
Formation of plateau and valley glaciers
"The Ethiopian Highlands are currently not covered by ice, despite its elevation of over 4,000 m," explains Groos, who studied the glacial, climatic and landscape history of the Bale and Arsi Mountains in the southern highlands as part of his dissertation. "Moraines and other land forms, however, attest to the fact that these mountains were glaciated during the last cold period," he continues.
Moraine boulders in the Bale and Arsi Mountains were mapped and sampled in the field and later dated using the cosmogenic chlorine isotope 36Cl to accurately determine the extent and timing of past glaciations. The researchers were in for a surprise: "Our results show that glaciers in the southern Ethiopian Highlands reached their maximum extent between 40,000 and 30,000 years ago," says Groos, "several thousand years earlier than in other mountainous regions in Eastern Africa and worldwide." In total, the glaciers in the southern highlands covered an area of more than 350 km² at their maximum. In addition to a cooling of at least 4 to 6 °C, the extensive volcanic plateaus above 4,000 m favored the development of glaciation of this magnitude.
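For context on how moraine boulders yield ages: cosmogenic 36Cl accumulates in rock surfaces once a retreating glacier exposes them, so the measured concentration can be inverted for an exposure age. The sketch below uses the standard no-erosion buildup equation with purely illustrative numbers; real 36Cl dating involves site-specific production rates and several corrections.

```python
import math

# Minimal sketch of cosmogenic surface-exposure dating (no erosion, no burial):
# N(t) = (P / lam) * (1 - exp(-lam * t))  =>  t = -ln(1 - N*lam/P) / lam
# The production rate P and concentration N below are purely illustrative.
LAMBDA_CL36 = math.log(2) / 301_000  # 36Cl decay constant (half-life ~301 kyr)

def exposure_age(concentration: float, production_rate: float) -> float:
    """Exposure age in years from nuclide concentration (atoms/g) and
    surface production rate (atoms/g/yr)."""
    return -math.log(1 - concentration * LAMBDA_CL36 / production_rate) / LAMBDA_CL36

# Hypothetical boulder: 350,000 atoms/g at a production rate of 10 atoms/g/yr
print(round(exposure_age(350_000, 10)))  # ~36,500 years
```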
The researchers gained important insights by comparing the newly reconstructed glacier fluctuations in the Ethiopian Highlands with those of the highest East African mountains and climate archives from the Great African Rift Valley. "The cross-comparisons show that the tropical mountains in Eastern Africa have experienced a more pronounced cooling than the surrounding lowlands," Groos concludes. "Furthermore, the results suggest a nonuniform response by East African glaciers and ice caps to climate changes during the last cold period, which can be attributed to regional differences in precipitation distribution and mountain relief, among other factors," he continues.
Image caption: Satellite image of the up to 200 m long, 15 m wide and 2 m deep sorted stone stripes on the southern Sanetti Plateau (ca. 3,900 m a.s.l.). The stone stripes were probably formed during the last glacial period under much cooler conditions and can best be explained by a natural sorting of the stones in the course of the cyclic freezing and thawing of the ground. (Credit: Alexander R. Groos / Digital Globe Foundation)
The enigma of the stone stripes
During their fieldwork on the central Sanetti Plateau in the Bale Mountains, the researchers also came across gigantic stone stripes (up to 1,000 m long, 15 m wide and 2 m deep) outside the area of the former ice cap. "The existence of these stone stripes on a tropical plateau surprised us, as so-called periglacial landforms of this magnitude were previously only known from the temperate zone and polar regions and are associated with ground temperatures around freezing point," Groos said. However, the average ground temperature on the Sanetti Plateau is currently about 11 °C.
The large boulders and basalt columns that make up the stone stripes originally came from heavily eroded rock formations and volcanic plugs. As things stand, the researchers assume that the stone stripes were formed during the last glacial period through natural sorting of the previously chaotically distributed rocks in the course of the periodic freezing and thawing of the ground near the former ice cap. However, locally this would have required a drop in the mean ground temperature of at least 11 °C and in the mean air temperature of at least 7 °C. Whether this unprecedented cooling is a regional phenomenon or exemplary for the cooling of tropical high mountains during the last glacial period must be shown by future studies from other tropical mountain regions.
Image caption: Sampling of an erratic boulder on a terminal moraine in the Arsi Mountains, indicating the extent of valley glaciers during the last glacial period. The rock sample was then processed at the University of Bern and dated at ETH Zurich to determine the timing of the glacial stages. (Credit: Serdar Yesilyurt)
Evaluating the rehabilitation of an old mine waste rock pile
The Cabeza de los Gatos waste rock pile, left from mining activities in the town of Tharsis (Huelva), underwent a rehabilitation process consisting of remodelling the slope of the pile, applying liming materials and then a layer of soil. Finally, trees and shrubs typical of the area were planted and a hydroseeding with a mixture of shrub and herbaceous seeds was applied. Twelve years later, a study led by researchers from IRNAS-CSIC, in collaboration with Sabina Rossini Oliva, a researcher from the University of Seville and the Environment and Water Agency of Andalusia (AMAYA), has proven the effectiveness of this sort of rehabilitation.
"The results obtained show that the steps taken were successful. Now, twelve years later, woody vegetation covers the upper part of the rehabilitated area and provides a green visual landscape for the inhabitants of the village. Furthermore, it has been proved that the aerial part of the vegetation growing on this upper part does not present significant levels of potentially toxic trace elements," Rossini explained.
In contrast, experts point out that the lower part of the slope is almost devoid of vegetation. Moreover, some of the plants growing at the base of the slope, such as gum rockrose (Cistus ladanifer), have cadmium concentrations exceeding the maximum tolerable level for animals, meaning they represent a toxicity risk for livestock.
"Our view is that this anomaly stems from acidic drainage from the pile which has caused a sharp drop in soil pH at the bottom of the slope. This has increased the presence of potentially toxic trace elements and thus their accumulation in plant leaves. This adverse effect has manifested as lower vegetation cover," the University of Seville researcher added.
In light of the results obtained, an alternative technique is proposed for the rehabilitation of mine rock waste piles by classifying and selectively managing the mine waste. This technique consists of managing excavation waste and selecting it based on its geochemical properties, separating materials that may generate acid from those that do not. The non-acid-generating waste should then be placed on the outside of the waste rock pile as a protective layer to prevent water contamination. This reduces infiltration rates, seepage and the negative effect of acid drainage from the mine waste.