Saturday, March 13, 2021

Alberta's energy war room says animated Netflix film is 'full of lies' about oil industry

Joel Dryden CBC
3/12/2021

© Netflix/YouTube The Canadian Energy Centre sent an email to its subscribers on Thursday criticizing the Netflix film Bigfoot Family, which focuses on the quest of the mythical ape-like creature to use his fame to protect a wildlife reserve from an oil…

The head of the Canadian Energy Centre (CEC) says a recent computer-animated Netflix family film centred on the exploits of the mythical ape-like creature known as Bigfoot depicts "misinformation" and "villainizes energy workers."

The 2020 film, titled Bigfoot Family, sees the eponymous Bigfoot team up with son Adam to protect a wildlife reserve from an oil company in Alaska.


The CEC, sometimes referred to as Alberta's energy "war room," was launched by the ruling United Conservative Party (UCP) in December 2019, with the stated goals of promoting Alberta oil and gas, countering what it calls misinformation, and producing media content.

The government-funded, privately operated initiative launched with an annual budget of $30 million, though that was reduced when the pandemic hit.

In a letter sent to subscribers to CEC's email list on Thursday, the centre said Bigfoot Family was "full of lies and misinformation." It pointed to one scene in which oil is extracted by blowing up a valley using "glowing red bombs."

Tom Olsen, the head of the war room, said the CEC responded to the film after a parent flagged it. 

TOM WAS HEAD OF KLEIN'S PR DEPARTMENT AFTER BEING THE EDITOR OF A SUN NEWSPAPER. TOM HAS SUCKED OFF THE TAXPAYER'S TEAT FOR A LONG TIME, GOING BACK TO THE 44-YEAR RULE OF THE PCs, THE PARENT PARTY OF KENNEY'S UCP

"The film claims an oil company intends to use a bomb to blow apart a mountain landscape within a wildlife preserve, then flood a pristine valley with oil. All while lying about it," Olsen said in an email to CBC News.

"It villainizes energy workers and disparages the industry's record on and commitment to environmental protection."

HE LIES

Olsen said more than 1,000 Canadians have sent emails to Netflix Canada to show their concern. Netflix Canada did not immediately respond to a request for comment, so CBC could not confirm that figure.

The CEC did not immediately respond to a follow-up query from CBC about what the end goal of the campaign was or a request to interview the parent who had flagged the film.


The provincial government has sought in recent months to attract film studios to produce more movies and TV shows in Alberta, and Netflix, among others, has been in discussions to increase its presence in the province.

Making headlines

In his email, Olsen said the CEC has worked to provide information about the importance of Alberta's oil and gas industry to Canada's recovery after the pandemic.

"We have promoted Indigenous opportunity provided by the energy sector, environmental gains by industry and consistently release peer-reviewed research pieces on the reality of fossil fuels in Canada and around the world," he said.

However, the CEC has also found itself in the headlines for a number of gaffes. Shortly after its launch, the CEC said it would change its logo after it was revealed the logo already represented an American tech company.


Shortly afterward, the Canadian Association of Journalists protested after war room staff referred to themselves as reporters while speaking with sources.

In February 2020, Olsen apologized after the CEC Twitter account sent a series of tweets attacking the New York Times, some of which alleged the Times had been "called out for anti-Semitism countless times" and "[had] a 'dodgy' track record."


Jared Wesley, an associate professor of political science at the University of Alberta, said attacks like those against Bigfoot Family draw more attention to negative stereotypes that people have about the oil and gas industry.


"It may actually end up drawing negative reaction from people who are in the middle on this issue and need to be persuaded of a different view on the oilsands," Wesley said.
© Samuel Martin/CBC University of Alberta political science professor Jared Wesley says the Canadian Energy Centre's campaign against the Netflix children's film Bigfoot Family might do more harm than good when it comes to the oil and gas industry.

Wesley said it was difficult to ascertain what the CEC's campaign against Netflix might achieve, but if it was an attempt to shame people on the progressive left, it was likely to backfire.

"It feels good to lash out against seemingly progressive or liberal companies or representations of oil and gas that don't fit with people's positive images of that industry ... but what are the downstream effects of that on the representation of the industry?" Wesley said.

Andrew Leach, associate professor in the Alberta School of Business at the University of Alberta, said the images in the film specifically found to be contentious by the CEC weren't that far off from reality.

"The specific thing they took issue with was the idea that it's unfathomable that there would be a secret campaign to use a bomb to blow up a mountain to flood a valley to fill it with oil," Leach said.

"That was actually pretty darn close to what was once proposed for the oilsands."


On Twitter, Lethbridge NDP MLA Shannon Phillips compared the campaign around the Bigfoot Family film to the conservative backlash over Hasbro's recent decision to remove the gender from the branding of its Mr. Potato Head toy.


"The UCP is divided. They don't have a plan for jobs. Their leader is deeply unpopular," Phillips wrote. "So they go to these 'outrage' issues to raise money and find unity. And public money pays for it


Study adds more evidence of antibiotic overuse in COVID-19 patients

A new analysis of COVID-19 hospitalizations in the United States during the first 6 months of the pandemic shows that more than half resulted in a patient receiving an antibiotic.

The study, by researchers with the Pew Charitable Trusts' antibiotic resistance project, also found that more than a third of admissions resulted in a patient being prescribed multiple antibiotics. But only 29% of those admitted with the virus were diagnosed as having a bacterial infection.

The findings of the study, which is the largest study to date on antibiotic use in US COVID-19 patients, add to the growing body of research on antibiotic prescribing during the early months of the pandemic. Studies to date have estimated that anywhere from 55% to 98% of hospitalized COVID patients around the world were treated with antibiotics, while only a fraction had a bacterial co-infection that would require their use. This has led to widespread concern about unnecessary antibiotic use during the pandemic.

"Other studies have shown that there is this high level of antibiotic prescribing in this population, while the occurrence or frequency of bacterial co-infections appears to be low," said Pew's Rachel Zetts, MPH, a co-author of the study.

"What our study really highlights is the important need to focus on antibiotic prescribing in this population, and to ensure that antibiotic stewardship programs are well positioned to improve prescribing a year into the pandemic." 

High rate of empiric prescribing

Using IBM Watson Health's electronic health records database, Zetts and her colleagues analyzed data on 5,898 unique US hospital admissions from February through July 2020, representing 4,980 patients. Nearly half of the admissions involved patients aged 56 and older, 52% were women, and most admissions involved patients in the Midwest (84%) and the South (14%). Fifty-nine percent of the hospitalizations lasted 1 to 3 days. Around 58% of the admissions occurred in June (18%) and July (40%).

"We were looking at prescribing primarily during the early summer months of the pandemic," Zetts said.

Within the study population, 52% of admissions resulted in at least one antibiotic being prescribed, with 82% of those patients receiving antibiotics at admission and 96% within the first 48 hours of hospitalization. Thirty-six percent of admissions resulted in more than one antibiotic prescription.

Analysis of diagnostic codes found that, in 20% of admissions, the patient was diagnosed with bacterial pneumonia, while 9% were diagnosed with a urinary tract infection. But that could be an overestimation, since diagnostic coding captures both confirmed and suspected infections. Only 7% of COVID-19 admissions were found to have positive bacterial culture results from blood, urine, and respiratory samples. 

Study co-author David Hyun, MD, director of Pew's antibiotic resistance project, suggested that multiple factors likely played a role in more than half of the COVID-19 admissions receiving an empiric antibiotic. Among them were the length of time it took to get results on COVID-19 tests and other microbiologic tests that could confirm or rule out a bacterial infection.

Early on in the pandemic, some hospital labs were taking several days to confirm COVID test results. And bacterial culture results typically take at least 48 hours.

In addition, even if it was suspected that patients had COVID-19, clinicians may have had concerns that patients had a bacterial infection on top of their viral infection. The study found that patients who received antibiotics were more likely to have inflammatory markers that could indicate an infection requiring antibiotics.

Hyun also noted that most of the patients were in the Midwest and South, in places that were likely experiencing their first surge of COVID-19 infections and were dealing with an infection that had no established treatment or management plan.

"This is what happens when a novel pathogen enters and creates a public health crisis," he said. "Beyond the diagnostic limitations, early on in the pandemic, there was also not a lot of information or research published in terms of the natural course of COVID infections."

But the analysis also found that far fewer patients—15% of the admissions—received additional antibiotics after 48 hours, which suggests that once clinicians did have test results in hand and bacterial infections were ruled out, antibiotic stewardship principles may have helped minimize antibiotic use.

"That does indicate some degree of de-escalation of antibiotic prescribing as doctors received additional information on their patients," Zetts said.

Data emphasize importance of stewardship

The findings from the Pew study are in line with a study published in Clinical Infectious Diseases in August 2020 that looked at antibiotic prescribing in 1,705 COVID-19 patients at 38 hospitals in Michigan from mid-March to mid-June 2020. The study found that 56.6% of those patients received empiric antibiotics, and only 3.5% had bacterial co-infections.

Valerie Vaughn, MD, who led that study and reviewed the data and methodology on the Pew study, said the results confirm what many other studies have found—that there has been a lot of antibiotic use, but very few bacterial co-infections, in COVID-19 patients.

"We're using a ton of antibiotics for COVID, and it's probably not necessary," said Vaughn, director of hospital medicine research at the University of Utah School of Medicine.

In addition to the reasons cited in the study, Vaughn noted one of the potential reasons antibiotics have been used so frequently in COVID-19 patients is that many antibiotic stewardship leaders in hospitals, including clinicians and pharmacists, have been pulled away from those duties to focus on COVID-19, leaving less time to monitor appropriate antibiotic use.

"They've been called upon by hospitals to write their COVID guidelines [and] to help with remdesivir allocation or vaccine distribution," she said. "So I think what we're going to see long-term is that there is more antibiotic use as well because our antibiotic stewardship leaders are focusing on COVID."

Zetts said the findings are an important reminder not only of why antibiotic stewardship programs are necessary, but why they need continued support and funding.

"Having those stewardship programs in place helps ensure there is a team that looks at antibiotic prescribing patterns, identifies areas for improvement, and is well-situated to provide real-time information to physicians as they're making prescribing decisions," Zetts said. "What this data really highlights is the need to have these programs in place now, and to make sure they remain in place moving forward."

This Is a Piece of a Lost Protoplanet, And It's Officially Older Than Earth



A piece of EC 002. (A. Irving, public domain)

MICHELLE STARR
9 MARCH 2021

A chunk of meteorite found in the desert sands of Algeria could be a piece of a baby planet that never made it.

According to an in-depth analysis of the rock's composition and age, not only is the meteorite known as Erg Chech 002 older than Earth, it formed volcanically - suggesting that it could have once been part of the crust of an object known as a protoplanet.

As such, it represents a rare opportunity to study the early stages of planet formation, and learn more about the conditions in the earliest days of the Solar System, when the planets we know and love today were still forming.

EC 002 was found in May of last year: several chunks of rock with a combined weight of 32 kilograms (70 pounds), recovered in the Erg Chech sand sea in southwestern Algeria. It was fairly quickly identified as unusual; rather than the chondritic composition of most recovered meteorites - which form when bits of dust and rock stick together - its texture was igneous, with pyroxene crystal inclusions.

It was therefore classified as an achondrite: a meteorite made of what seems to be volcanic material, originating on a body that underwent internal melting to differentiate the core from the crust - a protoplanet, one of the middle stages of planet formation.

Of the tens of thousands of meteorites that have been identified, only a few thousand - 3,179, according to the Meteoritical Bulletin Database - are achondrites.

Most of these achondrites come from one of two parent bodies, and are basaltic in composition. This means that they cannot tell us much about the diversity of protoplanets in the early Solar System.


EC 002, on the other hand, is not basaltic, but a type of volcanic rock known as andesite, a team of scientists led by geochemist Jean-Alix Barrat of the University of Western Brittany in France has determined.

Of all the meteorites we have found to date, even among achondrites, that makes EC 002 extremely rare - and opens up a new avenue for understanding planet formation.

According to the team's analysis, the rock is ancient. The radioactive decay of aluminium-26 to magnesium-26 suggests that the rock crystallised around 4.565 billion years ago, in a parent body that accreted 4.566 billion years ago. For context, Earth is 4.54 billion years old.
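For context, the arithmetic behind that kind of dating is the standard exponential decay law. Here is a minimal sketch in Python; the half-life is the published value for aluminium-26, but the isotope ratios are invented for illustration, and the actual study would use a far more careful Al-Mg isochron approach anchored to meteoritic reference materials.

```python
import math

def decay_age(initial_ratio, measured_ratio, half_life):
    """Elapsed time implied by exponential decay: N(t) = N0 * exp(-lambda*t)."""
    decay_const = math.log(2) / half_life
    return math.log(initial_ratio / measured_ratio) / decay_const

# Hypothetical ratios, for illustration only: a sample whose Al-26
# abundance has fallen to a quarter of the assumed initial value
# has aged exactly two half-lives.
AL26_HALF_LIFE_MYR = 0.717  # published half-life of aluminium-26
print(f"{decay_age(5e-5, 1.25e-5, AL26_HALF_LIFE_MYR):.2f} Myr")  # ~1.43
```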

"This meteorite is the oldest magmatic rock analysed to date and sheds light on the formation of the primordial crusts that covered the oldest protoplanets," the researchers wrote in their paper.

Unlike basalt, which forms from the rapid cooling of lava rich in magnesium and iron, andesite is composed primarily of sodium-rich silicates, and - on Earth, at least - forms in subduction zones, where the edge of one tectonic plate is pushed underneath another.


Although andesite is rarely found in meteorites, its recent discovery in specimens recovered in Antarctica and Mauritania prompted scientists to investigate how it might occur. Experimental evidence suggests that it can form from the melting of chondritic material.

Because chondritic bodies are so common in the Solar System, it's possible that the formation of protoplanets with andesite crusts was also common. However, when the team compared the spectral characteristics of EC 002 - that is, the way it interacts with light - with the spectral characteristics of asteroids, they could find nothing in the Solar System that matched the meteorite.
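The article doesn't spell out the team's matching procedure, but as a rough illustration of how a spectral comparison can be scored, here is a toy sketch using a root-mean-square misfit between normalized reflectance curves. All numbers are invented; this is not the researchers' actual pipeline.

```python
import numpy as np

def rms_misfit(spectrum_a, spectrum_b):
    """RMS difference between two reflectance spectra sampled at the
    same wavelengths, after normalizing each to unit mean."""
    a = spectrum_a / spectrum_a.mean()
    b = spectrum_b / spectrum_b.mean()
    return np.sqrt(np.mean((a - b) ** 2))

# Invented reflectance values at shared wavelength bins.
meteorite = np.array([0.12, 0.15, 0.18, 0.20, 0.21])
asteroid = np.array([0.10, 0.11, 0.13, 0.16, 0.19])
print(f"misfit: {rms_misfit(meteorite, asteroid):.3f}")
# A survey would repeat this against a whole catalog of asteroid
# spectra and flag the smallest misfits as candidate matches.
```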

Andesitic crustal remains are not only rare in the meteorite record; they are also rare in the asteroid belt. Which raises the question: if the formation process was so simple and common, then where the heck did all the differentiated protoplanets get to?

The same place most of the material in the Solar System ended up, probably: they were either pulverised or incorporated into larger rocky bodies - or, perhaps, a combination of both.

Since EC 002 is a little older than Earth, it's even possible that its protoplanetary siblings went on to help build Earth from a knot of denser material in the dust cloud that orbited the baby Sun.

Although we have a pretty decent grip on how baby planets are born, growing over millions of years as clumps of rocks and dust stick together, the specifics of the process are a little more mysterious.

EC 002 represents a spectacular opportunity to fine-tune our understanding of how our home system emerged from the dust.

The research has been published in PNAS.

We Finally Know The True Age of The Huge, Mysterious Objects in Laos' Plain of Jars

(Chris Hellier/Getty Images)

PETER DOCKRILL

9 MARCH 2021

In total, there are thousands of them – a giant landscape of strange, hollowed jars, carved from ancient stone. Some have lids. Most are open to the sky.

These surreal cauldron-like megaliths in Laos are known as the Plain of Jars: archaeological relics whose original purpose is still shrouded in mystery, their significance long forgotten.

For several decades, researchers have suggested the jars were a part of prehistoric burial practices. Local legends and lore suggest the jars, some of them up to three metres (nearly ten feet) tall, were used for storage of food, alcohol, and rainwater, among other things.

For tragic reasons, it's been almost impossible for modern archaeologists to study the sites and discover the truth.

The Plain of Jars region and Laos as a whole still bear the terrible legacy of millions of unexploded bombs dropped by the US Air Force in the 1960s.

To this day, hundreds of innocent Laotians die every year as a result, decades after the conflict's official end.

And so the mystery of the Jars endures, with fewer than 10 percent of the megaliths having been investigated, researchers say.

In recent years, however, expeditions within selected safe sites have commenced, and archaeologists are now making important discoveries about these unusual objects, some of which stand alone, while others are clustered in great groups.

"Until now, it has not been possible to estimate when the jars were first placed on the landscape or from where the stone was sourced," an international team explains in a new paper detailing the latest research.

According to their analysis – using a technique called Optically Stimulated Luminescence (OSL) to date the sediment beneath the jars – the jars were positioned potentially as early as the late second millennium BCE.

Evidence of varied mortuary practices discovered at some of the sites – including primary burial of human skeletons, as well as bundled or jarred collections of bones – was also dated, by radiocarbon dating, suggesting activity between the 9th and 13th centuries CE.
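As a back-of-the-envelope illustration of the radiocarbon side (not the calibrated workflow the team would actually use), the surviving carbon-14 fraction converts to an age via the same decay law. This sketch assumes the 5,730-year Cambridge half-life and an invented measurement.

```python
import math

C14_HALF_LIFE_YR = 5730  # Cambridge half-life of carbon-14

def radiocarbon_age(fraction_remaining):
    """Uncalibrated age in years from the surviving C-14 fraction."""
    return C14_HALF_LIFE_YR / math.log(2) * math.log(1 / fraction_remaining)

# Invented measurement: a bone retaining ~88% of its original C-14
# dates to roughly a thousand years before present, i.e. around the
# 10th century CE, inside the window cited above.
print(f"{radiocarbon_age(0.88):.0f} years before present")  # ~1056
```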

On the face of the most recent evidence, this means the placement of the jars pre-dates the confirmed mortuary practices by potentially thousands of years. As for what that means, we don't yet know.

"The data presented here strongly suggests that the placement of the megaliths preceded the mortuary activity around the jars, indicating re-use of the sites and enduring ritual significance," the researchers write.

However, previous research has suggested the mortuary rituals may be as old as the stone placements themselves, so it's possible wider searches would reveal a more continuous timeline of human activity.

Another puzzle that remains is how the jars got to their current positions.

Examination of megaliths at one site suggests the most likely quarry was 8 kilometres (5 miles) from where the jars ended up – so just how the ancient culture that created these objects (estimated to weigh as much as 30 tonnes) managed to transport them is yet another unknown.

Still a mystery for the ages, then, and no mistake.

The findings are reported in PLOS One (link not yet live at the time of publishing).

Contentious Hypothesis Posits Human Brains Grew Larger as We Hunted Smaller Prey



(juliawhite/iStock/Getty Images)


STEPHANIE PAPPAS, LIVE SCIENCE
12 MARCH 2021

Over the course of the Pleistocene epoch, between 2.6 million years ago and 11,700 years ago, the brains of humans and their relatives grew.

Now, scientists from Tel Aviv University have a new hypothesis as to why: As the largest animals on the landscape disappeared, the scientists propose, human brains had to grow to enable the hunting of smaller, swifter prey.

This hypothesis argues that early humans specialized in taking down the largest animals, such as elephants, which would have provided ample fatty meals. When these animals' numbers declined, humans with bigger brains, who presumably had more brainpower, were better at adapting and capturing smaller prey, which led to better survival for the brainiacs.

Ultimately, adult human brains expanded from an average of 40 cubic inches (650 cubic centimeters) 2 million years ago to about 92 cubic inches (1,500 cubic cm) on the cusp of the agricultural revolution about 10,000 years ago.

The hypothesis also explains why brain size shrank slightly, to about 80 cubic inches (1,300 cubic cm), after farming began: The extra tissue was no longer needed to maximize hunting success.

Related: See photos of our closest human ancestor

This new hypothesis bucks a trend in human origins studies. Many scholars in the field now argue that human brains grew in response to a lot of little pressures rather than one big one.

But Tel Aviv University archaeologists Miki Ben-Dor and Ran Barkai argue that one major change in the environment would provide a better explanation.

"We see the decline in prey size as a unifying explanation not only to brain expansion, but to many other transformations in human biology and culture, and we claim it provides a good incentive for these changes," Barkai wrote in an email to Live Science.

"[Scholars of human origins] are not used to looking for a single explanation that will cover a diversity of adaptations. It is time, we believe, to think otherwise."
Big prey, growing brains

The growth of the human brain is evolutionarily outstanding because the brain is a costly organ. The Homo sapiens brain uses 20 percent of the body's oxygen at rest despite making up only 2 percent of the body's weight. An average human brain today weighs 2.98 lbs. (1,352 grams), far exceeding the brains of chimpanzees, our nearest living relatives, at 0.85 lb. (384 grams).

Related: In photos: Hominin skulls with mixed traits discovered

Barkai and Ben-Dor's hypothesis hinges on the notion that human ancestors, starting with Homo habilis and peaking with Homo erectus, spent the early Pleistocene as expert carnivores, taking down the biggest, slowest prey that Africa had to offer.

Megaherbivores, the researchers argue in a paper published March 5 in the journal Yearbook of Physical Anthropology, would have provided ample calories and nutrients with less effort than foraging plants or stalking smaller prey. Modern humans are better at digesting fat than other primates are, Barkai and Ben-Dor said, and humans' physiology, including stomach acidity and gut design, indicate adaptations for eating fatty meat.

In another paper, published Feb. 19 in the journal Quaternary, the researchers argue that human species' tools and lifestyle are consistent with a shift from large prey to small prey.

In Barkai's fieldwork in Africa, for example, he has found Homo erectus sites strewn with elephant bones, which disappear at later sites from between 200,000 and 400,000 years ago. The human ancestors at those more recent sites seemed to have been eating mostly fallow deer, Ben-Dor wrote in an email to Live Science.

Overall, megaherbivores weighing over 2,200 lbs. (1,000 kilograms) began to decline across Africa around 4.6 million years ago, with herbivores over 770 lbs. (350 kg) declining around 1 million years ago, the researchers wrote in their paper.

It's not clear what caused this decline, but it could have been climate change, human hunting, or a combination of the two. As the biggest, slowest, fattiest animals disappeared from the landscape, humans would have been forced to adapt by switching to smaller animals.

This switch, the researchers argue, would have put evolutionary pressure on human brains to grow larger because hunting small animals would have been more complicated, given that smaller prey is harder to track and catch.

These growing brains would then explain many of the behavioral changes across the Pleistocene. Hunters of small, fleet prey may have needed to develop language and complex social structures to successfully communicate the location of prey and coordinate tracking it.

Better control of fire would have allowed human ancestors to extract as many calories as possible from smaller animals, including grease and oil from their bones. Tool and weapon technology would have had to advance to allow hunters to bring down and dress small game, according to Barkai and Ben-Dor.
A fuzzy past

Single hypotheses for human brain evolution haven't held up well in the past, however, said Richard Potts, a paleoanthropologist and head of the Smithsonian's Human Origins Program in Washington, D.C., who wasn't involved in the research.

And there are debates about many of the arguments in the new hypothesis.

For example, Potts told Live Science, it's not clear whether early humans hunted megaherbivores at all. There are human cut marks on large-mammal bones at some sites, but no one knows whether the humans killed the animals or scavenged them.

The researchers also sometimes use arguments from one time period that might not apply to earlier times and places, Potts said.

For example, the evidence suggests a preference for large prey by Neanderthals living in Europe 400,000 years ago, which would have served those human relatives well in winter, when plants were scarce. But the same thing might not have held true a few hundred thousand or a million years earlier in tropical Africa, Potts said.

And when it comes to brains, size isn't everything. Complicating the picture, brain shape also evolved over the Pleistocene, and some human relatives – such as Homo floresiensis, which lived in what is now Indonesia between 60,000 and 100,000 years ago – had small brains. H. floresiensis hunted both small elephants and large rodents despite its small brain.

The period over which humans and their relatives experienced this brain expansion is poorly understood, with few fossil records to go on.

For example, there are perhaps three or four sites firmly dated to between 300,000 and 400,000 years ago in Africa that are certainly related to humans and their ancestors, said John Hawks, a paleoanthropologist at the University of Wisconsin–Madison who was not involved in the research and was skeptical of its conclusions.

The human family tree was complicated over the course of the Pleistocene, with many branches, and the growth in brain size wasn't linear. Nor were the declines in large animals, Hawks told Live Science.

"They've sketched out a picture in which the megaherbivores decline and the brains increase, and if you look at that through a telescope, it sort of looks true," Hawks told Live Science. "But actually, if you look at the details on either side, brain size was more complicated, megaherbivores were more complicated, and it's not like we can draw a straightforward relationship between them."

The paper does, however, draw attention to the fact that human species may indeed have hunted large mammals during the Pleistocene, Hawks said.

There is a natural bias in fossil sites against preserving large mammals because human hunters or scavengers wouldn't have dragged an entire elephant back to camp; they would have sliced off packets of meat instead, leaving no evidence of the feast at their home sites for future paleontologists and archaeologists.

"I'm sure we're going to be talking more and more about what was the role of megaherbivores in human subsistence, and were they important to us becoming human?" Hawks said.



This article was originally published by Live Science
Some Women Were Big-Game Hunters, Complicating Ancient Gender Roles

Archeological evidence from Peru has revealed that some ancient big-game hunters were, in fact, women


(Matthew Verdolivo/UC Davis)


VIVEK VENKATARAMAN, THE CONVERSATION
13 MARCH 2021

Archeological evidence from Peru has revealed that some ancient big-game hunters were, in fact, women, challenging what science writer James Gorman wrote was "one of the most widely held tenets about ancient hunters and gatherers – that males hunted and females gathered."


"Man the Hunter" is a narrative of human origins developed by early 20th-century anthropologists armed with their imaginations and a handful of fossils.

They viewed hunting – done by men – as the prime driver of human evolution, bestowing upon our early ancestors bipedalism, big brains, tools, and a lust for violence. In this narrative, hunting also gave rise to the nuclear family, as women waited at home for men to bring home the meat.

As an anthropologist who studies hunting and gathering societies, I was thrilled by the discovery of female skeletons buried with big-game hunting paraphernalia, a pattern that raises important questions about ancient gender roles. But I found most of the media coverage it generated disappointingly inaccurate.

Responding to the finding, journalist Annalee Newitz wrote: "Nicknamed 'man the hunter,' this is the notion that men and women in ancient societies had strictly defined roles: Men hunted, and women gathered. Now, this theory may be crumbling."

In fact, that theory died a well-deserved death decades ago.



In 1966, 75 anthropologists (70 of whom were men) held a symposium called "Man the Hunter" at the University of Chicago to address one of humanity's grand questions: How did people live before agriculture?

The researchers had lived with and studied contemporary populations of hunting and gathering peoples around the world, from jungle to tundra.

It was there in Chicago that real-life data confronted the myth of Man the Hunter. Researchers showed that women worked just as hard as men, and plant foods gathered by women were crucially important in hunter-gatherer diets.

Hunter-gatherer movement patterns were driven by a variety of ecological factors, not just game. And many hunter-gatherers were quite peaceful and egalitarian. Hunting wasn't the sole driver or unifying theory of human evolution after all.

By the late 1970s, as anthropologists carried out further research on hunter-gatherers and paid attention to issues of gender, the myth of Man the Hunter fell into disfavor.

Updating beliefs

Even so, subsequent research has affirmed a simple division of labor among hunter-gatherers: men mostly hunt, and women mostly gather. When anthropologist Carol Ember surveyed 179 societies, she found only 13 in which women participated in hunting.

But it is a mistake to conflate this pattern of "most hunters are men" among hunter-gatherers with the myth of Man the Hunter. That myth was born of assumptions, not careful empirical research.

Through decades of field research, anthropologists have developed a more flexible and capacious view of human labor. According to this view, women are not bound by biology to gather, nor men to hunt. In fact, several accounts of women's hunting in foraging societies had emerged by the mid-1980s.


In this context, ancient female hunters are an expectation, not a surprise. And the focus on Man the Hunter distracts from the more important question of how a society with female big-game hunters might be constructed.

After all, women are perfectly capable of hunting, yet in most hunter-gatherer societies they don't do it very often.

Hunting and child care

One prominent explanation, elaborated in 1970 by feminist anthropologist Judith Brown, is that the demands of hunting conflict with the provision of child care.

This was supported in a recent review of women's hunting that surveyed traditional societies around the world; the authors found that pregnant or lactating women do not often hunt, and those with dependents only hunt when child care is available or rich hunting grounds are close to camp.

These constraints play a role in shaping risk preferences. In hunter-gatherers, men's hunting is risky, meaning it carries a high chance of failure. Men tend to hunt alone or in small groups and target big game with projectile weapons, which often require fast-paced, long-distance travel.

In contrast, women prefer to hunt in groups and focus on smaller, easier-to-capture prey closer to camps, often with the aid of dogs.

                       WHO DOMESTICATED DOGS? WOMEN 

https://plawiuk.blogspot.com/2021/03/contentious-hypothesis-posits-humans.html

Women are often crucial to the hunting success of others, whether through logistical or ritual assistance. Husbands and wives sometimes work collaboratively; in these instances, women may help trap an animal, then club it to death and carry the meat home. And in big-game hunting societies, women provide support to hunters by manufacturing clothing, weaponry, and transportation equipment.

They may also participate in hunting directly by locating, then surrounding and driving game toward a killing location, as seen among high-latitude reindeer hunters and Plains bison hunters. As the authors of the new paper speculate, this is likely how the Peruvian female hunters killed game.

Updated views on plant gathering provide insight into why women may choose not to hunt at all. No one doubts that hunting is hard, but early anthropologists often assumed women's gathering was simple and easy.

This turns out to be wrong. Like hunting, gathering demands extensive ecological knowledge and skill that is socially learned and cultivated over a lifetime.

As a result, hunter-gatherers face tough choices about how to divide difficult labor in a 24-hour day. In this context, economic considerations show that it pays to specialize: modest comparative advantages – speed and strength, and the incompatibilities posed by child care – can lead to divisions of labor that increase overall food acquisition by the group.

From this perspective, women's decisions to hunt less than men may be a rational decision about allocating effort.

The Batek people

Many have assumed that by not hunting, women are relegated to lower status. But is that true?

I conduct my work among the Batek people, hunter-gatherers from the rainforests of Malaysia who are widely considered one of the most gender-egalitarian societies in the world. They have little material inequality, share food widely, abhor violence, and emphasize individual autonomy.

When day breaks at camp, Batek men trek far, usually alone, to hunt monkeys with blowpipes. The women gather tubers or fruit in small groups closer to camp. Nothing prohibits women from hunting, as is the case with some hunter-gatherers where, for example, touching hunting weapons is forbidden.

Batek women sometimes join in group hunts of bamboo rats, but it is otherwise rare. However, there are exceptions. Some teenage girls develop an interest in blowpipe hunting that carries into adulthood.

The Batek people say this division of labor comes down to strength differences, incompatibility with child care, and differences in knowledge specialization.

Hunting has great cultural significance, but women's knowledge of plant distributions is crucial for collective decisions like moving camp. The Batek conceive of themselves as a co-operative and interdependent group in which each person makes a unique and important contribution toward a communal goal.

Beyond Man the Hunter

Contrary to news reports, the archeological findings from Peru accord well with current knowledge about how and why men and women divide labor among hunter-gatherers. And it has little to do with the myth of Man the Hunter.

The Peruvian hunter-gatherers were big-game specialists who used spear-throwing technologies that were likely relatively easy to learn. This may have enabled more flexible divisions of labor and broader participation in hunting by women, similar to what we see among some hunter-gatherers today.

The social implications beyond these facts are not clear. That's because one's role in food collection has no simple relation to status or power dynamics.

New research on neglected topics like the determinants of women's status and risk-seeking economic behavior in traditional societies promises to shed light on this issue. But as the case of the Batek people shows, among a liberated society of equals, status and power have little to do with who brings in the meat.

Vivek Venkataraman, Assistant Professor of Anthropology and Archaeology, University of Calgary.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Secrets From Ancient Bones Have Changed What We Know About The Scythians


Scythian stone carving in Adyr-Kahn. (avtk/iStock/Getty Images Plus)


MICHELLE STARR
11 MARCH 2021
LIVE SCIENCE/SCIENCE ALERT

In both popular culture and the academic record, the Scythians have been described as a force to be reckoned with. For hundreds of years, they ruled the Eurasian steppe, fierce warriors given an even bigger advantage by their highly mobile, nomadic lifestyle.

Or so we have thought for millennia. According to a new analysis of Scythian bones, this perception is not quite the full picture; in fact, some of the people we group in with the Scythians often did settle down, living more agrarian lifestyles with urban centers.

"Our study demonstrates overall low levels of human mobility in the vicinity of key urban locales of the Scythian era, in contrast to previous stereotypes of highly nomadic populations," said anthropologist Alicia Ventresca Miller of the University of Michigan.

"While long-distance mobility increased during the Scythian era relative to preceding periods, it was limited to a small percentage of individuals."

Our understanding of the people we classify as Scythians, who rose and thrived between 700 BCE and 200 BCE, is based on a number of different sources. There are historical records, including reports from the contemporaneous Greek historian Herodotus; and there's the archaeological record, which is rich with the trappings of a warlike nomadic lifestyle, such as weapons, horse tack and burial mounds.

But the steppe is a large place, 500 years is a fairly long time, and humans are complex. Although all the people in that place and time tend to get grouped together under the Scythian label, the research of Ventresca Miller and her colleagues suggests that several, perhaps even many, diverse groups lived on the Pontic steppe during that time.


The team conducted an isotopic analysis of 'Scythian' teeth and bones found throughout what is now Ukraine, and discovered that those people likely had a more stationary lifestyle - growing millet and raising livestock - than the predominant image of the wild 'barbarians' suggests.

The teeth and bones belonged to 56 individuals whose remains were found at three burial sites in central and eastern Ukraine - Belsk, Mamai-Gora and Medvin. From them, the researchers were able to extract enough material to conduct an isotope analysis. They also analysed bones from a pair of sheep and a pig found buried at Belsk, providing additional context on livestock and what the ancient people ate.

This technique can reveal when and where a person lived. Combinations of isotopes in the soil are taken up by plants, then eaten and absorbed by people and other animals. In the case of strontium, the element substitutes for a small portion of the calcium in teeth and bones, preserving the isotope ratio as a record of diet.

Since each geographic location has a different isotopic signature, and because some isotopes decay at a known rate, these isotopes can be used to place the source of a person's diet not just in geographic space, but also time.
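To make the idea concrete, here is a minimal sketch of a two-endmember strontium mixing model of the kind provenance studies often use. The 87Sr/86Sr values are invented, and a real analysis would also weight by strontium concentration and compare against mapped regional baselines.

```python
def local_fraction(sample_ratio, local_ratio, distant_ratio):
    """Fraction of dietary intake attributable to the 'local' source
    in a simple two-endmember strontium mixing model."""
    return (sample_ratio - distant_ratio) / (local_ratio - distant_ratio)

# Invented 87Sr/86Sr values: if local bedrock imprints 0.7095, a
# distant region 0.7110, and a tooth measures 0.7098, about 80% of
# the strontium signal looks locally sourced.
f = local_fraction(0.7098, local_ratio=0.7095, distant_ratio=0.7110)
print(f"local fraction: {f:.2f}")  # 0.80
```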


Isotope analysis can even reveal if a person moved around from place to place over the course of their lifetime, so it would be a particularly useful tool for understanding the movement of the Scythians.

The researchers analysed isotopes of strontium, oxygen, nitrogen, and carbon, and compared them with previous studies of human populations in Ukraine from the Neolithic through to the Iron Age. They found strong evidence for the consumption of millet at all three sites, suggesting a reliance on agriculture. Two individuals from Mamai-Gora were found to have been highly mobile; these two ate less millet than the people who didn't move around.

Although these people did move around more than in previous eras, the findings suggest that, by and large, they tended to settle down, farm domesticated grain, and raise livestock, the researchers said.

"The Scythian epoch was clearly a period of contradictions, with strong evidence for complex interactions between agro-pastoralists and pastoralists that contributed to population aggregation in urban locales," Ventresca Miller said.

"This study highlights the potential use of using isotopic analysis to directly assess prevailing models of economies and mobilities during the Scythian era."

The team hopes that future work will include larger, multi-generational samples to unveil a more complete picture of how people moved around - or didn't - on the Eurasian steppe during the time of the Scythians. They also hope to study bones from people of different social status, including those buried in rich graves.

This, they believe, could help us move away from cliches and stereotypes towards a richer, more realistic understanding of human history.

"It is clear that if we are to truly uncover the 'Scythians' we need to accept that the Eurasian steppe was home to a myriad of dynamic cultures and subsistence strategies during the Iron Age," the researchers wrote in their paper.

"In fact, it is perhaps variability, rather than a uniformity of nomadic warriors, that truly frames the Scythians as predecessors to incipient globalization in Eurasia."

The research has been published in PLOS One.


An Underwater Revolution Millions of Years Ago Rewrote The Script of The Ocean



Illustration of the marine arthropod Sanctacaris
(Sebastian Kaulitzki/Getty Images)


PETER DOCKRILL
13 MARCH 2021

Look far enough back in time, and a pattern may emerge. After studying thousands of ancient fossils, paleontologist Jack Sepkoski identified just such a thing in 1981: an epic sequence of life and death, etched into the skeletons of the last 500 million years.

The late Sepkoski, a professor at the University of Chicago, discovered what became known as the three great evolutionary faunas of marine animals – a trio of successive explosions in biodiversity in the ocean over the course of the Phanerozoic Eon.

These giant bloomings of marine life were bookended by catastrophes of world-changing scale: extinction-level events precipitating mass animal die-offs – simultaneously clearing the stage for new creatures to emerge and prosper in the spaces they left behind.

But it doesn't have to happen that way, a new study suggests. Equally powerful forces – capable of shaping macroevolutionary processes with planetary implications – don't always require asteroids or supervolcanoes.

Sometimes the fire comes from within.


"The fossil record tells us that some of the key transitions in the history of life were rapid changes triggered by abrupt external factors," explains paleontologist Michal Kowalewski from the University of Florida.

"But this study shows that some of those major transitions were more gradual and may have been driven by biological interactions between organisms."

The case in point is what's known as the Mesozoic Marine Revolution. Commencing roughly 150-200 million years ago, this transition represents all the macroevolutionary changes that took place as marine predators like bony fish, crustaceans, and predatory snails increased in numbers, forcing their invertebrate prey, such as mollusks, to adapt defenses against boring and shell-crushing attacks.

In the new research, which used modeling to demonstrate the network of relationships between giant assemblages of prehistoric marine lifeforms, the team found that the Mesozoic Marine Revolution effectively represents a fourth, unrecognized chapter of surging biodiversity within the Phanerozoic – equal in its power to the three great evolutionary faunas Sepkoski identified decades ago.

"We are integrating the two hypotheses – the Mesozoic Marine Revolution and the three great evolutionary faunas into a single story," explains first author and paleontologist Alexis Rojas from UmeÃ¥ University in Sweden.

"Instead of three phases of life, the model shows four."

Ultimately, although the Mesozoic Marine Revolution was characterized by gradual ecological changes produced by marine life interactions over millions of years, the researchers say it nonetheless triggered a prolonged biotic transition comparable in magnitude to the end-Permian transition.

This episode, often called the Great Dying, occurred approximately 250 million years ago and was Earth's most severe mass extinction event, wiping out approximately 80 percent of all marine species (and 70 percent of terrestrial vertebrates).

In the aftermath, life rebounded with the third great evolutionary fauna, known as the Modern fauna period, per Sepkoski's framework.

But according to Rojas, Kowalewski, and their team, the Modern period intersected with the Mesozoic Marine Revolution, contributing to a recognizable transition in biodiversity in Earth's marine life during the mid-Cretaceous period, about 129 million years ago.

"What we actually built is an abstracted fossil record that provides a unique perspective of the organization of marine life," Rojas says.

"At the most basic levels, this map shows ocean regions with particular animals," he adds. "The building blocks of our study are the individual animals themselves."

The findings are reported in Communications Biology.

Ancient Earth Really Was a Serene Water World, New Evidence Confirms


(WIN-Initiative/Getty Images)

MICHELLE STARR
11 MARCH 2021

It's tricky to figure out what Earth might have looked like in the early years before life emerged. Geological detectives have now obtained more evidence that it was rather different to the planet we live on today.

According to a new analysis of the features of Earth's mantle over its long history, our whole world was once engulfed by a vast ocean, with very few or no land masses at all. It was an extremely soggy space rock.

So where the heck did all the water go? According to a team of researchers led by planetary scientist Junjie Dong of Harvard University, minerals deep inside the mantle slowly drank up ancient Earth's oceans to leave what we have today.

"We calculated the water storage capacity in Earth's solid mantle as a function of mantle temperature," the researchers wrote in their paper.

"We find that water storage capacity in a hot, early mantle may have been smaller than the amount of water Earth's mantle currently holds, so the additional water in the mantle today would have resided on the surface of the early Earth and formed bigger oceans.

"Our results suggest that the long‐held assumption that the surface oceans' volume remained nearly constant through geologic time may need to be reassessed."

Deep underground, a great deal of water is thought to be stored in the form of hydroxy group compounds - made up of oxygen and hydrogen atoms. In particular, the water is stored in two high-pressure forms of the volcanic mineral olivine, hydrous wadsleyite and ringwoodite. Samples of wadsleyite deep underground could contain around 3 percent H2O by weight; ringwoodite around 1 percent.


Previous research on the two minerals subjected them to the high pressures and temperatures of the mantle of modern day Earth to figure out these storage capacities. Dong and his team saw another opportunity. They compiled all the available mineral physics data, and quantified the water storage capacity of wadsleyite and ringwoodite across a wider range of temperatures.

The results showed that the two minerals have lower storage capacities at higher temperatures. Because baby Earth, which formed 4.54 billion years ago, was much warmer internally than it is today (its internal heat is still decreasing, albeit very slowly, and that decline has absolutely nothing to do with the external climate), the water storage capacity of the mantle now is higher than it once was.

Moreover, as more olivine minerals are crystallizing out of Earth's magma over time, the water storage capacity of the mantle would increase that way, too.

In all, the difference in water storage capacity would be significant, even though the team was conservative with its calculations.

"The bulk water storage capacity of Earth's solid mantle was significantly affected by secular cooling due to its constituent minerals' temperature‐dependent storage capacities," the researchers wrote.


"The mantle's water storage capacity today is 1.86 to 4.41 times the modern surface ocean mass."

If the water stored in the mantle today is greater than its storage capacity in the Archean Eon, between 2.5 and 4 billion years ago, it's possible that the world was flooded and the continents swamped, the researchers found.

This finding is in agreement with a previous study that found, based on an abundance of certain isotopes of oxygen preserved in a geological record of the early ocean, that Earth 3.2 billion years ago had way less land than it does today.

If this is the case, it could help us answer burning questions about other aspects of Earth's history, such as where life emerged around 3.5 billion years ago. There's an ongoing debate over whether life first formed in saltwater oceans or freshwater ponds on land masses; if the entire planet was engulfed by oceans, that would solve that mystery.

Furthermore, the findings could also help us in the search for extraterrestrial life. Evidence suggests that ocean worlds are abundant in our Universe, so looking for signatures of these soggy planets could help us identify potentially hospitable worlds. And it could strengthen the case for looking for life on ocean worlds in our own Solar System, such as Europa and Enceladus.

Not least, it helps us better understand the delicate evolution of our planet, and the strange, often seemingly inhospitable turns along the way that eventually led to the emergence of humanity.

The research has been published in AGU Advances.