Monday, April 27, 2020

THE LITTLE DEATH

Scientists unveil how general anesthesia works

A study in rat brain slices and living mice reveals how general anesthesia dampens high-frequency brain activity by weakening synapses
OKINAWA INSTITUTE OF SCIENCE AND TECHNOLOGY (OIST) GRADUATE UNIVERSITY

Hailed as one of the most important medical advances, the discovery of general anesthetics - compounds which induce unconsciousness, prevent control of movement and block pain - helped transform dangerous and traumatic operations into safe and routine surgery. But despite their importance, scientists still don't understand exactly how general anesthetics work.
Now, in a study published this week in the Journal of Neuroscience, researchers from the Okinawa Institute of Science and Technology Graduate University (OIST) and Nagoya University have revealed how a commonly used general anesthetic called isoflurane weakens the transmission of electrical signals between neurons, at junctions called synapses.
"Importantly, we found that isoflurane did not block the transmission of all electrical signals equally; the anesthetic had the strongest effect on higher frequency impulses that are required for functions such as cognition or movement, whilst it had minimal effect on low frequency impulses that control life-supporting functions, such as breathing," said Professor Tomoyuki Takahashi, who leads the Cellular and Molecular Synaptic Function (CMSF) Unit at OIST. "This explains how isoflurane is able to cause anesthesia, by preferentially blocking the high frequency signals."
At synapses, signals are sent by presynaptic neurons and received by postsynaptic neurons. At most synapses, communication occurs via chemical messengers - or neurotransmitters.
When an electrical nerve impulse, or action potential, arrives at the end of the presynaptic neuron, it causes synaptic vesicles - tiny membrane 'packets' that contain neurotransmitters - to fuse with the terminal membrane, releasing the neurotransmitters into the gap between neurons. When the postsynaptic neuron senses enough neurotransmitter, it fires an action potential of its own.
The CMSF unit used rat brain slices to study a giant synapse called the calyx of Held. The scientists induced electrical signals at different frequencies and then detected the action potentials generated in the postsynaptic neuron. They found that as they increased the frequency of electrical signals, isoflurane had a stronger effect on blocking transmission.
To corroborate his unit's findings, Takahashi reached out to Dr. Takayuki Yamashita, a researcher from Nagoya University who conducted experiments on synapses, called cortico-cortical synapses, in the brains of living mice.
Yamashita found that the anesthetic affected cortico-cortical synapses in a similar way to the calyx of Held. When the mice were anesthetized using isoflurane, high frequency transmission was strongly reduced whilst there was less effect on low frequency transmission.
"These experiments both confirmed how isoflurane acts as a general anesthetic," said Takahashi. "But we wanted to understand what underlying mechanisms isoflurane targets to weaken synapses in this frequency-dependent manner."
Tracking down the targets
Further experiments showed that isoflurane reduced the amount of neurotransmitter released, both by lowering the probability that vesicles are released and by reducing the maximum number of vesicles able to be released at a time.
The scientists therefore examined whether isoflurane affected calcium ion channels, which are key in the process of vesicle release. When action potentials arrive at the presynaptic terminal, calcium ion channels in the membrane open, allowing calcium ions to flood in. Synaptic vesicles then detect this rise in calcium, and they fuse with the membrane. The researchers found that isoflurane lowered calcium influx by blocking calcium ion channels, which in turn reduced the probability of vesicle release.
"However, this mechanism alone could not explain how isoflurane reduces the number of releasable vesicles, or the frequency-dependent nature of isoflurane's effect," said Takahashi.
The scientists hypothesized that isoflurane could reduce the number of releasable vesicles by either directly blocking the process of vesicle release by exocytosis, or by indirectly blocking vesicle recycling, where vesicles are reformed by endocytosis and then refilled with neurotransmitter, ready to be released again.
By electrically measuring the changes in the surface area of the presynaptic terminal membrane, which is increased by exocytosis and decreased by endocytosis, the scientists concluded that isoflurane only affected vesicle release by exocytosis, likely by blocking exocytic machinery.
"Crucially, we found that this block only had a major effect on high frequency signals, suggesting that this block on exocytic machinery is the key to isoflurane's anesthetizing effect," said Takahashi.
The scientists proposed that high frequency action potentials trigger such a massive influx of calcium into the presynaptic terminal that isoflurane cannot effectively reduce the calcium concentration. Synaptic strength is therefore weakened predominantly by the direct block of exocytic machinery rather than a reduced probability of vesicle release.
Meanwhile, low frequency impulses trigger less exocytosis, so isoflurane's block of the exocytic machinery has little effect. And although isoflurane effectively reduces calcium entry into the presynaptic terminal, the resulting drop in vesicle release probability is not, by itself, powerful enough to block postsynaptic action potentials at the calyx of Held, and it has only a minor effect at cortico-cortical synapses. Low frequency transmission is therefore maintained.
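To make that frequency-dependence argument concrete, here is a minimal toy simulation in Python; it is not the authors' model, and the pool size, release probability, recovery rate, postsynaptic threshold and the assumed isoflurane effects are invented purely for illustration. It tracks a depleting, recovering pool of releasable vesicles and counts how many spikes in a train release enough transmitter to trigger the postsynaptic cell.

import math

def transmitted_fraction(freq_hz, n_spikes, pool_size, p_release,
                         recovery_rate, threshold):
    """Fraction of presynaptic spikes that release enough transmitter to
    trigger a postsynaptic action potential (release > threshold)."""
    dt = 1.0 / freq_hz          # interval between spikes (s)
    available = pool_size       # releasable vesicles, pool starts full
    successes = 0
    for _ in range(n_spikes):
        released = p_release * available
        if released > threshold:
            successes += 1
        available -= released
        # partial recovery of the pool between spikes
        available += (pool_size - available) * (1.0 - math.exp(-recovery_rate * dt))
    return successes / n_spikes

for freq in (1, 10, 100):
    control = transmitted_fraction(freq, 50, pool_size=100, p_release=0.4,
                                   recovery_rate=2.0, threshold=1.5)
    # assumed isoflurane effect: lower release probability (calcium channel block)
    # plus a smaller releasable pool (block of the exocytic machinery)
    iso = transmitted_fraction(freq, 50, pool_size=60, p_release=0.3,
                               recovery_rate=2.0, threshold=1.5)
    print(f"{freq:3d} Hz: control {control:.0%} transmitted, isoflurane {iso:.0%} transmitted")

In this toy, the "anesthetized" synapse keeps transmitting at low frequencies but fails for much of a 100 Hz train, mirroring the frequency-selective block described above.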
Overall, the series of experiments provides compelling evidence of how isoflurane weakens synapses to induce anesthesia.
"Now that we have established techniques of manipulating and deciphering presynaptic mechanisms, we are ready to apply these techniques to tougher questions, such as presynaptic mechanisms underlying symptoms of neurodegenerative diseases," said Takahashi. "That will be our next challenge."
###
New species of moths discovered in the Alps named after three famous alpinists


PENSOFT PUBLISHERS

A CURVED-HORN MOTH OF THE GENUS CARYOCOLUM FEEDING ON A CARNATION PLANT. THIS GENUS FEEDS EXCLUSIVELY ON PLANTS IN THE CARNATION FAMILY (CARYOPHYLLACEAE). CREDIT: P. BUCHNER / TIROLER LANDESMUSEEN

The discovery of new, still unnamed animal species in a well-researched European region like the Alps is always a small sensation. All the more surprising is the description of three species new to science that had previously been misidentified as long-known alpine moths.


During a genetic project of the Tyrolean State Museums in Innsbruck (Austria), Austrian entomologist and head of the Natural Science Collections Peter Huemer used an integrative research approach relying on molecular methods to study four European moths. Despite having been known for decades, these species remained controversial because of the many unknowns surrounding their biology.


In the end, however, it turned out that the scientist was dealing not with four, but with seven species. The three that did not add up were indeed previously unknown species. Huemer therefore described the moths in a paper in the open-access, peer-reviewed journal Alpine Entomology. Curiously, all three species were given the names of legendary alpinists: Reinhold Messner, Peter Habeler and David Lama.

Tribute to three legends in alpinism

"The idea to name the new species in honour of three world-renowned climbers was absolutely no coincidence," explains Huemer.

One of the newly described species, Caryocolum messneri, or Messner's Curved-horn moth, is dedicated to Reinhold Messner. Messner is a famous alpinist who was the first to reach Mount Everest without additional oxygen, and also the first climber to ascend all fourteen peaks over 8,000 metres. For decades, he has been inspiring followers through lectures and books. He is also behind the Messner Mountain Museum project, which comprises six museums at different locations in South Tyrol, northern Italy, each tasked with educating visitors on "man's encounter with mountains" by showcasing the science of mountains and glaciers, the history of mountaineering and rock climbing, the history of mythical mountains, and the history of mountain-dwelling people.

"So what could have been a better fit for a name for the species that flutters on the doorstep of his residence, the Juval Castle in South Tyrol?" says Huemer.

The second new species, Caryocolum habeleri, or Habeler's Curved-horn moth, honours another extraordinary mountaineer: Peter Habeler. Having joined Messner on his expedition to Mount Everest, he too climbed the mountain without additional oxygen, a historic first. Another of his achievements is climbing the famous Eiger North Face in a mere 10 hours. Additionally, together with the study's author, he sits on the advisory board of the nature conservation foundation "Blühendes Österreich". However, the species' name is also a nod to Peter Habeler's cousin, Heinz Habeler, recognised as "the master of butterfly and moth research in Styria". His collection is now housed in the Tyrolean State Museums.

The third alpinist whose name is immortalised in a species name is David Lama, recognised by Huemer especially for his commitment to conservation. To protect endangered butterflies along the steep railway embankments in Innsbruck, Lama once organised volunteers in a remarkable conservation effort. Nevertheless, Lama earned his fame for his spectacular climbing achievements, including the first free ascent of the Compressor route on the south-eastern flank of Cerro Torre.

"Unfortunately, David lost his life far too soon in a tragic avalanche accident on 16 April 2019 in Banff National Park, Canada. Now, Caryocolum lamai (Lama's Curved-horn moth) is supposed to make him 'immortal' also in the natural sciences," says Huemer.

Many unresolved questions

The newly described moth species are closely related and belong to the genus Caryocolum of the so-called Curved-horn moths (family Gelechiidae).

As caterpillars, the species of this genus live exclusively on carnation plants. Even though the biology of the new moths is still unknown, their collection localities suggest that plants such as the stone carnation are likely their hosts. All species are restricted to dry and sunny habitats and sometimes inhabit altitudes of up to 2,500 m. So far, they have only been observed at artificial light at night.

While Messner's Curved-horn moth occurs from northern Italy to Greece, the range of Habeler's Curved-horn moth is limited to the regions between southern France, northern Switzerland and southeastern Germany. Caryocolum lamai, on the other hand, inhabits only a small area in the western Alps of Italy and France.

Research on alpine butterflies and moths has been an important scientific focus at the Tyrolean State Museums for decades. In 30 years, Peter Huemer has discovered and named over 100 species of lepidopterans previously unknown to science. All these new discoveries repeatedly show the gaps in the study of biodiversity, even in Central Europe.

"How could we possibly protect a species that we don't even have a name for is one of the key questions for science that derives from these studies," says Huemer in conclusion.

###

Original source:

Huemer P (2020) Integrative revision of the Caryocolum schleichi species group - a striking example of a temporally changing species concept (Lepidoptera, Gelechiidae). Alpine Entomology 4: 39-63. https://doi.org/10.3897/alpento.4.50703

Origins of humankind
Professor Lee Berger has launched a series of online video lectures that will feature some of our most prized fossil discoveries including Taung Child, Homo naledi and Australopithecus sediba.
Virtual and augmented reality: warnings about the ethical dangers

Limits between reality and virtual experiences
UNIVERSITY OF BARCELONA


RESEARCHERS URGE FOR MORE SCIENTIFIC RESEARCH THAT COULD FORM THE BASIS OF A REGULATORY REGIME SUCH AS THERE IS FOR CINEMA. CREDIT: UNIVERSITY OF BARCELONA

Research on virtual reality began in the eighties, but only now has good-quality VR become accessible to the public, and it could soon become a mass consumer product. However, there is almost no scientific knowledge of the long-term effects of virtual reality, nor any oversight of content.

An international group of researchers, with the participation of Mel Slater, director of the Event Lab at the Faculty of Psychology of the University of Barcelona (UB), has published a new paper in the journal Frontiers in Virtual Reality reflecting on the potential ethical problems of the mass spread of virtual and augmented reality. The researchers warn about the dangers that might arise as these technologies become more and more realistic and call for new research to address these scenarios.

Other participants in the study are representatives of major companies and institutions such as BBC R&D, Digital Catapult, Dimension/Hammerhead VR, Facebook London, NESTA, Jigsaw (part of Google), Magic Leap, Microsoft Research, and University College London.

Potential negative effects of virtual or augmented reality have not been explored

Virtual reality started more than forty years ago in a form we would recognize today: a stereo head-mounted display, head tracking, and computer graphics generated images. Despite different ups and downs in the development of the technology, a huge amount of research has been carried out across a vast range of applications in the last twenty-five years: from medicine to business, from psychotherapy to industry, from sports to travel.

However, its possible negative effects have not been explored, especially when very high quality visual and behavioral realism of virtual humans is becoming increasingly likely in the near future. Elements and even experiences in virtual or augmented reality may become indistinguishable from reality very soon. "For example, a normal computer game may be violent, but it is violence depicted through a screen, involving tiny characters. In VR you are part of the scenario, everything happening is around you, the characters are life-sized, they can look you in the eye. This is a qualitatively different type of experience compared to video games or movies," said Mel Slater, also a member of the Institute of Neurosciences of the UB (UBNeuro).

"Virtual Reality has been used almost exclusively for the good, mostly focused on psychotherapy, but - continues Mel Slater - like any technology, it can be used for good or evil. In order to prevent its use for evil we need better scientific understanding, and therefore research into its effects."

Limits between reality and virtual experiences

To reflect on the potential dangers of virtual and augmented reality, the working group set out to consider possible worst-case scenarios and possible areas of research to address them. They concentrated mainly on problems that might arise as XR (extended reality, the umbrella term for virtual and augmented reality) becomes more and more realistic. As virtual reality becomes more real, people may find it difficult to distinguish between reality and virtual reality - for example, remembering virtual events as if they had been real, or failing to distinguish over time between events that really happened and those that happened in virtual reality.

Researchers also highlighted another potential problem with ultrarealistic experiences: we do not know what the after-effects and mental consequences of using virtual reality are - for example in extremely violent games - nor those of the transition from virtual reality back to the real world.

"After an intense and emotional experience in virtual reality, you take the headset off, and you are suddenly in the very different real world. We are not good at rapid adjustment of behavior and emotion regulation. Re-entry to the real world, especially after repeated exposure to virtual reality, might lead to disturbances of various types: cognitive (did something happen in XR or in real life?), emotional (cause of emotions is not real, for example your avatar was insulted by a fictional virtual character), and behavioral: for example, actions accepted in XR may not be socially accepted in the real world)," explained Mel Slater.

Social isolation is another issue highlighted in the paper. "It is possible that some people may use XR to such an extent that they lose face-to-face social contact with other people and withdraw from society," said Mel Slater.

Other issues they warn about are data privacy and the dangers of identity impersonation and fake news. "People (e.g., politicians) could be shown carrying out actions in virtual reality that they never did in reality. Although the same is true with video alone, in virtual reality it is more powerful because it seems to happen life-sized in the same space in which you are located. It happens in front of you, not through a screen," said Slater.

A regulatory regime similar to cinema
Given these ethical challenges, researchers highlighted that there is essentially no data that can help in addressing them. So, besides the potential problems, some important research questions are outlined in the paper, such as the long-term effects of XR use, or whether XR experiences can be used to manipulate memory, or if people will continue to distinguish real from virtual events.

"It is especially important to make the creators of virtual and augmented reality applications aware of these possible dangers. However, this must be based on scientific study rather than opinion, so an urgent consideration is the funding of interdisciplinary research to address these and other issues. The most important is scientific research that could form the basis of a regulatory regime such as there is for cinema, with classifications of content by age and other conditions," said Mel Slater.

A permanent working group

After this research paper, the working group will continue with regular meetings and discussions. "We plan to hold a one-day Workshop for Industry where we present these issues in a public forum, and form a more permanent working group from this to advise industry, government and international bodies," said Mel Slater.

###

University of the Witwatersrand publishes first clinical data on COVID-19 in South Africa

This special issue of the Wits Journal of Clinical Medicine covers a range of aspects of the pandemic, from the clinical, through ethical, to the social dynamics of its impact


UNIVERSITY OF THE WITWATERSRAND

THE JOURNAL, PUBLISHED BY WITS UNIVERSITY PRESS UNDER OPEN ACCESS CONDITIONS, PRESENTS THE FIRST CLINICAL DATA ON COVID-19 PUBLISHED IN SOUTH AFRICA. CREDIT: WITS UNIVERSITY
http://www.wits.ac.za/covid19/covid19-news/latest/wits-publishes-first-clinical-data-on-covid-19-in-south-africa.html


For example, do they resuscitate patients even though the prospects of recovery are slim and the risk to the healthcare team high? How should healthcare workers respond?

This is one of the topics covered in the COVID-19 Special Issue of the Wits Journal of Clinical Medicine. The journal, published by Wits University Press under open access conditions, presents the first clinical data on COVID-19 published in South Africa.

The special issue covers a range of aspects of the pandemic, from the clinical, through ethical, to the social dynamics of its impact.

Highlights include:

Diagnosis of COVID-19: Considerations, controversies and challenges in South Africa
Social dimensions of COVID-19 in South Africa: A neglected element of the treatment plan
Clinical ethical challenges in the COVID-19 crisis in South Africa
Potential impact of SARS-CoV-2 infection in HIV-positive patients in South Africa
COVID-19: The concept of herd immunity - is it a strategy for South Africa?

Editor-in-chief of the Wits Journal of Clinical Medicine Professor Pravin Manga says it is important for health professionals and the public to have access to scientific information:

"Social media is awash with all sorts of quackery regarding prevention and treatment remedies for COVID-19 and it is during these times that we need to be rational and be guided by science rather than by emotion."

Manga is Professor of Internal Medicine, School of Clinical Medicine, in the Faculty of Health Sciences at Wits.

In addition to serving as a public resource, the issue includes more than 10 articles aimed at guiding healthcare workers and addressing the ethical aspects of the pandemic.

For example, which patients should healthcare workers admit to already-filled intensive care units and who decides this? Associate Professor Kevin Behrens, Director and Head of the Steve Biko Centre for Bioethics discusses such ethical conundrums in his review.

Professor Laurel Baldwin-Ragaven in the Department of Family Medicine and Primary Care presents a sobering overview of how some of South Africa's vast social disparities may manifest during COVID-19, and outlines our clinical and social responsibilities.

Professor Charles Feldman in the Division of Pulmonology, Department of Internal Medicine at Wits interrogates the aggravating issue of South Africa's additional burdens of HIV and tuberculosis and the risk of COVID-19 infection in HIV-positive patients.

Professor Ismail S. Kalla in Pulmonology in Internal Medicine and Professor Abdullah Laher in Emergency Medicine in the School of Clinical Medicine explore whether herd immunity offers possibilities as a strategy for fighting COVID-19 in South Africa.

This special issue includes a guide to worldwide Medical Resources on COVID-19, as well as a handy poster-style patient guide for healthcare workers.

###

*Covid-19 Resources for Academics and Students: Access the entire Wits University Press eBook collection in your university library through JSTOR and Proquest until 1 July 2020.
How do epidemics spread and persist before and after introduction of a vaccine?

Modeling 40 years of measles dynamics demonstrates the complexity of epidemic spread and the impact of vaccination


PENN STATE

Illustration of the observed measles epidemics in cities and towns of England and Wales from 1944-94 (vertical Z-axis represents the timeline) for the largest locations. New research reveals that, prior to the introduction of the measles vaccine, transmission patterns were likely driven both by periodic epidemics in large population centers and by spread among smaller towns. CREDIT: MAX S. Y. LAU

In the most detailed study to date of epidemic spread, an international team of researchers has modeled measles dynamics based on over 40 years of data collected in England and Wales. The models--which span the prevaccination period, the introduction of measles vaccination, and local elimination by vaccination in the 1990s--reveal that, before the introduction of a vaccine, measles could persist both in large population centers and through spread among sets of smaller towns. The study also provides critical data on the importance of spatial modeling for the long-term control of global epidemics and could help inform the long-term public health response to the current COVID-19 pandemic.

A paper describing the study appears April 27, 2020 in the journal Nature Ecology & Evolution.

"During the last 20 years there have been tremendous inroads towards eradicating measles--one of the major killers of children globally--as annual deaths have been driven down from more than a million to less than 200,000," said Ottar N. Bjørnstad, Distinguished Professor of Entomology and Biology at Penn State and one of the leaders of the research team. "However, previous efforts to eradicate smallpox and polio highlight the complexity of moving from local control to global eradication. Our study provides critical data on how long-term control efforts will need both general and detailed spatial models to finally stop this deadly disease."

Prior to the introduction of a vaccine, the number of measles cases in England and Wales would undergo periodic--often biennial--epidemics. This pattern, driven by herd immunity, is common among a number of diseases and in other locales. The researchers sought to locate the reservoirs where the virus persists in the dips between epidemics, which are the sources for reintroduction of the virus into the general populace in the next major epidemic. This persistence question is central to understanding the dynamics of measles and other viral diseases and to coordinating public health interventions.
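The recurrent epidemics described here are commonly illustrated with simple time-series SIR ("TSIR") models. The sketch below is only an illustrative toy, not the spatial model fitted in the study, and every number in it (population size, biweekly births, transmission rate, seasonal forcing) is an assumed placeholder; it shows how seasonally forced transmission plus birth-driven replenishment of susceptibles produces major epidemics separated by deep troughs, the troughs being where the persistence question arises.

import math

N = 3_000_000          # assumed city population
births = 2_300         # assumed births per biweek (replenish susceptibles)
alpha = 0.97           # mixing exponent commonly used in TSIR models
beta0 = 30.0           # assumed mean transmission rate per biweek
amplitude = 0.25       # assumed seasonal forcing (e.g. school terms)

S, I = 0.06 * N, 50.0  # initial susceptible and infected numbers
cases = []
for t in range(26 * 20):                      # 20 years of biweekly steps
    beta = beta0 * (1.0 + amplitude * math.cos(2.0 * math.pi * t / 26.0))
    new_I = beta * S * (I ** alpha) / N       # expected new infections
    new_I = min(new_I, S)                     # cannot infect more than S
    S = S + births - new_I
    I = max(new_I, 1.0)                       # floor mimics occasional imported cases
    cases.append(new_I)

# yearly epidemic peaks: with these made-up parameters, major outbreaks recur,
# separated by low-incidence troughs -- the troughs where persistence matters
for year in range(20):
    peak = max(cases[year * 26:(year + 1) * 26])
    print(f"year {year + 1:2d}: peak biweekly cases ~ {peak:8.0f}")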

The research team combined spatial modeling with the detailed historical data of measles cases in England and Wales to address these questions. The uniquely detailed dataset includes weekly measles reports from almost a thousand locations across England and Wales beginning in 1944 and continuing until the disease was all but locally eliminated by vaccination in the 1990s.

"Previous work stressed the importance of large centers as sources of infection," said Bryan T. Grenfell, the Kathryn Briger and Sarah Fenton Professor of Ecology and Evolutionary Biology and of Public Affairs at Princeton University and another leader of the research team. "However, our new modeling shows that local spread among smaller towns can also contribute to persistence of the virus."

The researchers' new model quantifies the relative influence of different sources of infection, including major cities, spread among smaller towns, and unidentifiable outside sources. Following the introduction of vaccination, the source of reintroduction shifted from a combination of large centers and local spread to mainly unidentifiable sources, possibly outside of England and Wales.

"Having access to this unique dataset allowed us to test these news models of measles dynamics with unprecedented rigor," said Max S. Y. Lau, assistant professor in the Rollins School of Public Health at Emory University and first author of the paper. "Going forward, we can apply what we learn from this test case to understand disease spread beyond measles."

"Measles has always been the 'model organism' of epidemic dynamics--like C. elegans or the fruitfly are for evolution--and, along with influenza, a paradigm for understanding herd immunity," said Grenfell. "So, as COVID-19 approaches endemicity, these new models can help us understand and prepare for modeling its spatial spread, as well as understand the impact that the eventual development of a vaccine might have on its dynamics."

Beyond the COVID-19 pandemic, the models also could help scientists understand how diseases survive and spread at a time when a portion of the public is opposed to vaccines, said co-author C. Jessica E. Metcalf, assistant professor of ecology and evolutionary biology and public affairs. "Understanding the drivers of persistence is also of growing importance in a context of growing vaccine hesitancy, which further complicates dynamics and amplifies the challenges of control," she said.

The researchers added that a wide perspective should be taken when applying the results to other diseases.

"Our model and previous experience highlights the complexity of globally eradicating a virus," said Bjørnstad. "Smallpox was eradicated by 1977 through a massive global effort of mass-vaccination of all children, followed by targeted efforts in regional hotspots and finally local quarantining and ring vaccination to squash the scourge. Polio, in contrast, while also targeted through vaccination for more than 50 years keeps escaping 'the final blow' as it successfully shifts and diffuses across regional pockets of susceptible individuals to evade eradication."

###

In addition to Bjørnstad, Grenfell, Lau, and Metcalf, the research team included, from Princeton, Ph.D. candidates Alexander Becker in ecology and evolutionary biology and Hannah M. Korevaar in the Office of Population Research, and postdoctoral researcher Quentin Caudron; and Darren J. Shaw at the University of Edinburgh.

The research was supported by the RAPIDD Program of the US Department of Homeland Security and the Fogarty International Centre, the US National Institutes of Health, the Eunice Kennedy Shriver National Institute of Child Health & Human Development of the National Institutes of Health, and the US National Science Foundation.
Disappearance of animal species takes mental, cultural and material toll on humans

Hunter-gatherer societies have had profoundly deep relationships with the animals they hunted, Tel Aviv University researchers say




AMERICAN FRIENDS OF TEL AVIV UNIVERSITY

For thousands of years, indigenous hunting societies have subsisted on specific animals for their survival. How have these hunter-gatherers been affected when these animals migrate or go extinct?

To answer this and other questions, Tel Aviv University (TAU) researchers conducted a broad survey of several hunter-gatherer societies across history in a retrospective study published on January 30 in Time and Mind. The study, led by Eyal Halfon and Prof. Ran Barkai of TAU's Department of Archeology and Ancient Near Eastern Cultures, sheds new light on the deep, multidimensional connection between humans and animals.

"There has been much discussion of the impact of people on the disappearance of animal species, mostly through hunting," explains Halfon. "But we flipped the issue to discover how the disappearance of animals -- either through extinction or migration -- has affected people."

The research reveals that these societies expressed a deep emotional and psychological connection with the animal species they hunted, especially after their disappearance. The study will help anthropologists and others understand the profound environmental changes taking place in our own lifetimes.

Halfon and Prof. Barkai conducted a survey of different historical periods and geographical locations, focusing on hunter-gatherer societies that hunted animals as the basis for their subsistence. They also investigated situations in which these animals became extinct or moved to more hospitable regions as a result of climate change.

"We found that humans reacted to the loss of the animal they hunted -- a significant partner in deep, varied and fundamental ways," Halfon says.

The new research explores hunter-gatherer societies throughout human history, from those dating back hundreds of thousands of years to modern-day societies that still function much the way prehistoric groups did. Ten case studies illustrate the deep connection -- existential, physical, spiritual and emotional -- between humans and animals they hunted.

"Many hunter-gatherer populations were based on one type of animal that provided many necessities such as food, clothing, tools and fuel," Prof. Barkai says. "For example, until 400,000 years ago prehistoric humans in Israel hunted elephants. Up to 40,000 years ago, residents of Northern Siberia hunted the woolly mammoth. When these animals disappeared from those areas, this had major ramifications for humans, who needed to respond and adapt to a new situation. Some had to completely change their way of life to survive."

 According to the study, human groups adapted in different ways. Siberian residents seeking sustenance after the disappearance of mammoths migrated east and became the first settlers of Alaska and northern Canada. Cave dwellers in central Israel's Qesem Cave (excavated by Prof. Barkai) hunted fallow deer, far smaller than elephants, which required agility and social connections instead of robust physical strength. This necessitated far-reaching changes in their material and social culture and, subsequently, physical structure.

Halfon stresses the emotional reaction to an animal group's disappearance. "Humans felt deeply connected to the animals they hunted, considering them partners in nature, and appreciating them for the livelihood and sustenance they provided," he says. "We believe they never forgot these animals -- even long after they disappeared from the landscape."

An intriguing example of this kind of memory can be found in engravings from the Late Paleolithic period in Europe, which feature animals like mammoths and seals. Studies show that most of these depictions were created long after these two animals disappeared from the vicinity.

"These depictions reflect a simple human emotion we all know very well: longing," says Halfon. "Early humans remembered the animals that disappeared and perpetuated them, just like a poet who writes a song about his beloved who left him."

According to Prof. Barkai, another emotional response was a sense of responsibility -- even guilt. "Indigenous hunter-gatherer societies have been very careful to maintain clear rules about hunting. As a result, when an animal disappears, they ask: 'Did we behave properly? Is it angry and punishing us? What can we do to convince it to come back?'" he concludes. "Such a reaction has been exhibited by modern-day hunter-gatherer societies as well."

###

American Friends of Tel Aviv University supports Israel's most influential, comprehensive and sought-after center of higher learning, Tel Aviv University (TAU). TAU is recognized and celebrated internationally for creating an innovative, entrepreneurial culture on campus that generates inventions, startups and economic development in Israel. TAU is ranked ninth in the world, and first in Israel, for producing start-up founders of billion-dollar companies, an achievement that surpassed several Ivy League universities. To date, 2,500 US patents have been filed by Tel Aviv University researchers -- ranking TAU #1 in Israel, #10 outside of the US and #66 in the world.
A new explanation for the origins of human fatherhood

Dads emerged to reap the benefits of partnerships with females and other males, a team of anthropologists and economists argues


BOSTON COLLEGE
Chestnut Hill, Mass. (4/27/2020) - Humans differ from other primates in the types and amounts of care that males provide for their offspring. The precise timing of the emergence of human "fatherhood" is unknown, but a new theory proposes that it emerged from a need for partnership in response to changing ecological conditions, U.S. and French researchers report today in the Proceedings of the National Academy of Sciences.

The new theory was developed using tools of economists and knowledge of the economic and reproductive behavior of human foragers. The theory focuses on the benefits of a "fit" between exclusive partners that enabled the strengths of males and females to provide for one another and their offspring, according to researchers from Boston College, Chapman University, University of New Mexico, and the University of Toulouse in France.

Scientists have long tried to explain how human fatherhood emerged. Paternal care - those investments in offspring made by a biological father - is rare among mammals but widespread across modern human subsistence societies. Much of men's parental investment consists of provisioning relatively helpless children with food for prolonged periods of time - for as long as two decades among modern hunter-gatherers. This is a sharp break with other great apes, whose observed mating systems do not encourage paternal provisioning.

How paternal provisioning arose in humans is remarkable and puzzling, and explanations have revolved around a discussion of two types of males dubbed "Dads" and "Cads".

With promiscuous mating, a would-be Dad who provides food for a mate and their joint offspring without seeking additional mates risks being outcompeted in terms of biological fitness by a Cad, who focuses only on promiscuous mating instead of investing in offspring. Such a competitive disadvantage creates a formidable barrier for Dads to emerge when Cads abound.

An oft-invoked explanation for the evolution of paternal provisioning in humans is that ancestral females started mating preferentially with males who provided them with food, in exchange for female sexual fidelity. This explanation is insufficient for several reasons, the researchers write.

Instead, the team of anthropologists and economists argues that ecological change would have sufficed to trigger the spread of Dads, even in the face of female sexual infidelity, according to the report, "Paternal provisioning results from ecological change."

The key force in the theory of paternal provisioning is complementarities - in essence the cooperation between females and males, as well as between males. Complementarities are synergistic effects that increase per-capita benefits, which may arise from dividing labor and/or pooling resources. The path to complementarities began roughly five to eight million years ago, with a gradual drying in Africa, and a progressively greater need to rely on nutritious, diverse, spatially dispersed and relatively hard-to-obtain foods, including animal products.

In response to ecological change, ancestral hominins adapted in various ways, including efficient bipedal locomotion, dietary flexibility, and an ability to thrive in diverse environments, facilitated by tool use. Complementarities between males and females would have resulted from the nutrients that each sex specialized in acquiring: protein and fat acquired by males paired well with carbohydrates acquired by females.

Complementarities between males would have resulted from higher returns from hunting in groups instead of in isolation, and from food sharing to lower starvation risk. Dietary reliance on animal products is thus a key feature underlying these complementarities between and within sexes.

These complementarities would have led to a substantial increase in the impact of food provided by a Dad on the survival of his mate's offspring.

Using evolutionary game theory, the authors show that this impact can lead Dads to gain a fitness advantage over Cads, although Cads may still co-exist with Dads under certain conditions. If sons inherit their biological father's traits, then over time Dads will increase in number in a population. Theoretically connecting the evolution of paternal provisioning to ecological change allows the authors to make novel predictions about the paleontological and archeological record.
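As a purely illustrative sketch of that game-theoretic logic (this is not the model in the PNAS paper; the baseline payoffs and the frequency-dependent mating bonus below are hypothetical), a discrete replicator dynamic shows the Dad strategy spreading once the survival benefit of provisioning is large enough, and coexisting with Cads when the benefit is more modest.

def dad_frequency(survival_boost, generations=200, dad_freq=0.05):
    """Evolve the frequency of the Dad strategy under discrete replicator
    dynamics with assumed, frequency-dependent payoffs."""
    for _ in range(generations):
        fitness_dad = survival_boost            # payoff from provisioned offspring
        # assumed payoff to Cads: a baseline plus a mating bonus that grows
        # when Dads are common (more provisioned offspring available to sire)
        fitness_cad = 1.0 + 0.6 * dad_freq
        mean_fitness = dad_freq * fitness_dad + (1.0 - dad_freq) * fitness_cad
        dad_freq *= fitness_dad / mean_fitness  # replicator update
    return dad_freq

for boost in (1.2, 1.6, 2.0):
    print(f"survival boost {boost:.1f}: Dad frequency after 200 generations = "
          f"{dad_frequency(boost):.2f}")

With the smallest boost the two strategies settle into coexistence, while larger boosts drive Dads toward fixation, echoing the argument that a sufficient ecological boost to the value of provisioning can let Dads invade.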

###

Skoltech research shows how a 'Swiss Army knife' protein helps phages disarm their victims



SKOLKOVO INSTITUTE OF SCIENCE AND TECHNOLOGY (SKOLTECH)


OCR, A DNA MIMIC PROTEIN OF THE WELL-STUDIED T7 PHAGE, CAN PROTECT THE VIRUS FROM BREX (FOR BACTERIOPHAGE EXCLUSION), A POORLY STUDIED SET OF BACTERIAL DEFENSE MECHANISMS UTILIZED BY, AMONG OTHERS, ESCHERICHIA COLI. CREDIT: PAVEL ODINEV / SKOLTECH

Researchers from the Severinov Laboratory at Skoltech, along with their colleagues from Switzerland and Israel, have investigated a poorly studied bacterial BREX defense mechanism to show that it can be "turned off" by a multipurpose viral protein that successfully impersonates DNA. The paper was published in the journal Nucleic Acids Research.

In the never-ending war between bacteria and viruses that infect them, it is extremely important to know where your DNA is. To protect themselves from hostile invasion, bacteria have learned to "mark" their own genetic material by methylating it at specific sites in the genome. All "unmarked" DNA, such as that of a bacteriophage, is then recognized, cleaved and degraded by an army of endonucleases. These are called restriction modification (RM) systems. The phages, in turn, have learned to evade these RM systems by using DNA mimic proteins. A protein that chemically "looks" like DNA to the bacterial restriction complex can bind it and prevent it from ever getting to the actual phage DNA.

Skoltech PhD student Artem Isaev and his colleagues from Tel Aviv University and Philip Morris International R&D have shown that Ocr, a DNA mimic protein of the well-studied T7 phage, is in fact a multipurpose tool. Besides inhibition of RM type I systems, it can also protect the phage from BREX (for BacteRiophage EXclusion), another set of bacterial defense mechanisms utilized by, among others, Escherichia coli, which T7 commonly infects.

"Five years ago, we knew about restriction modification, CRISPR and Toxin-Antitoxin abortive infection systems, but recently bioinformatics has shown us that it is just a small proportion of the real diversity of defensive strategies employed by bacteria to cope with phage infections. BREX was the first in a row of novel phage defense systems: they are found in around 1 in 10 of all microorganisms, and in bacteria they are even more widespread than CRISPR. Yet we still don't know the function of five out of six BREX genes and how they are working together to provide protection," Isaev says.

It is not yet known whether BREX simply destroys the incoming phage DNA or somehow inhibits its replication, but almost all BREX mechanisms employ a BrxX methyltransferase, an enzyme in charge of "marking" bacterial DNA for self-recognition. The Ocr protein apparently binds to this methyltransferase and thus prevents the BREX system from both methylating the host DNA and attacking the non-methylated phage genome.

"We have shown that Ocr interacts with methyltransferase and inhibits host DNA methylation. In theory, this can lead to an autoimmune response: as host DNA is no longer "marked" as such, BREX exclusion complexes should attack it. Yet, we do not see self-toxicity after expression of Ocr, which indicates that Ocr inhibits these exclusion complexes as well, and thus BREX methyltransferase should be involved at the active stage of defense. Ocr is already known as an inhibitor of type I RM systems, and these systems also require methytransferase for restriction complexes. There are other similarities between BREX and RM systems, and we hope that they would help us understand how BREX functions," Isaev explains.

Other DNA mimic proteins do not seem to overcome the BREX defense, so researchers intend to further investigate how exactly Ocr does its job. As bacterial defense systems mostly deal with DNA recognition and manipulation, they can become powerful tools for molecular biology and medicine. Molecular cloning is possible thanks to the discovery and description of RM systems, and CRISPR has brought about the age of genome editing. Moreover, studying the arsenals of bacteria and phages may prove useful in "recruiting" the viruses as novel antimicrobial agents in the fight against antibiotic-resistant bacteria.

"Bacteria have been combatting phages for more than a billion years, and this constant "arms race" is one of the major evolutionary forces in the microworld. Both sides have developed an enormous arsenal of strategies to fight each other, and a great diversity of molecular machines has been invented in the process. For me personally, it's just fascinating to study what else is hidden in the genome and what novel mechanism we can discover in the process," Isaev concludes.

Rice engineers: Make wastewater drinkable again

Model shows recycling could dramatically slash cities' need for fresh water resources


RICE UNIVERSITY



A SCHEMATIC REPRESENTATION OF THE HYBRID WATER SUPPLY SYSTEM DEVELOPED BY ENGINEERS AT RICE UNIVERSITY. THE RESEARCHERS SUGGEST THAT DELIVERING WATER TO CITY DWELLERS CAN BECOME FAR MORE EFFICIENT, AND THAT IT SHOULD INVOLVE A HEALTHY LEVEL OF RECYCLED WASTEWATER. CREDIT: LU LIU/RICE UNIVERSITY

HOUSTON - (April 27, 2020) - Delivering water to city dwellers can become far more efficient, according to Rice University researchers who say it should involve a healthy level of recycled wastewater.

Using Houston as a model, researchers at Rice's Brown School of Engineering have developed a plan that could reduce the need for surface water (from rivers, reservoirs or wells) by 28% by recycling wastewater to make it drinkable once again.

While the cost of energy needed for future advanced purification systems would be significant, they say the savings realized by supplementing fresh water shipped from a distance with the "direct potable reuse" of municipal wastewater would more than make up for the expense.

And the water would be better to boot.

A comprehensive model of the environmental and economic impact and benefits of such a system was developed by Rice researchers associated with the National Science Foundation-backed Nanosystems Engineering Research Center for Nanotechnology-Enabled Water Treatment (NEWT).

Rice environmental engineer Qilin Li is corresponding author and postdoctoral researcher Lu Liu is lead author of the study, which appears in Nature Sustainability.

It shows how Houston's planned reconfiguration of its current wastewater treatment system, by which it will eventually consolidate the number of treatment plants from 39 to 12, can be enhanced to "future-proof" water distribution in the city.

"All the technologies needed to treat wastewater to drinking water quality are available," Li said. "The issue is that today, they're still pretty expensive. So a very important part of the paper is to look at how cheap the technology needs to become in order for the whole thing to make sense financially and energy-wise."

Advanced water treatment happens to be a subject of intense study by scientists and engineers at the many institutions, including Rice, associated with NEWT.

"Another way to improve potable water would be to cut its travel time," she said. Water delivered by a system with many distribution points would pick up fewer chemical and biological contaminants en route. Houston, she noted, already has well-distributed wastewater treatment, and making that water drinkable would facilitate shorter travel times to homes.

The model shows there will always be a tradeoff between the acquisition of potable water, the energy required to treat it, and the cost of transporting it without affecting its quality. The study evaluated these conflicting objectives and exhaustively examined the possible configurations to find systems that strike a reasonable balance.
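One simple way to picture that search, shown here purely as an illustration with made-up numbers rather than the Rice/NEWT model, is to enumerate candidate supply configurations and keep only the non-dominated (Pareto-optimal) ones, those for which no objective can be improved without worsening another.

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    cost: float         # relative capital plus operating cost (lower is better)
    energy: float       # relative energy use (lower is better)
    travel_time: float  # mean water travel time to homes, hours (lower is better)

# hypothetical candidate configurations -- all numbers are invented
options = [
    Option("centralized surface water only", cost=1.0, energy=1.0, travel_time=30.0),
    Option("centralized plus potable reuse", cost=1.2, energy=1.4, travel_time=18.0),
    Option("distributed potable reuse", cost=1.5, energy=1.3, travel_time=8.0),
    Option("over-built hybrid", cost=2.0, energy=1.8, travel_time=12.0),
]

def dominates(a: Option, b: Option) -> bool:
    """True if a is at least as good as b on every objective and strictly
    better on at least one."""
    at_least = (a.cost <= b.cost and a.energy <= b.energy
                and a.travel_time <= b.travel_time)
    strictly = (a.cost < b.cost or a.energy < b.energy
                or a.travel_time < b.travel_time)
    return at_least and strictly

pareto = [o for o in options if not any(dominates(other, o) for other in options)]
for o in pareto:
    print("non-dominated option:", o.name)

In a real assessment, each candidate's cost, energy and travel-time numbers would come from the engineering model, and decision-makers would choose among the surviving options.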

"Ultimately, we want to know what our next-generation water supply system should look like," Li said. "How does the scale of the system affect distribution? Should it be one gigantic, centralized water source or several smaller distributed sources?

"In that case, how many sources should there be, how big of an area should each supply and where should they be located? These are all questions we are studying," she said. "A lot of people have talked about this, but very little quantitative work has been done to show the numbers."

Li admitted Houston may not be the most representative of major municipal infrastructure systems because the city's wastewater system is already highly distributed, but its water supply system is not. The challenge of having a highly centralized water supply was demonstrated by a dramatic 96-inch water main break this February that cut off much of the city's supply.

"That was an extraordinary example, but there are many small leaks that go undetected underground that potentially allow contaminants into homes," she said.

The study only looked at direct potable reuse, which the model shows as a more economic option for established cities, but she said the best option for a new development -- that is, building a distribution system for the first time -- may be to have separate delivery of potable and nonpotable water.

"That would be prohibitive cost-wise in a place like Houston, but it would be cheaper for a new community, where wastewater effluent can be minimally treated, not quite drinkable but sufficient for irrigation or flushing toilets," Li said.

"Though maybe it would be to Houston's advantage to use detention ponds that already exist throughout the city to store stormwater and treat it for nonpotable use."

###

Li is a professor of civil and environmental engineering, materials science and nanoengineering and chemical and biomolecular engineering. Co-authors of the paper are Rice alumnus Evan Lopez; Leonardo Dueñas-Osorio, an associate professor of civil and environmental engineering; Lauren Stadler, an assistant professor of civil and environmental engineering; Pedro Alvarez, the George R. Brown Professor of Civil and Environmental Engineering and a professor of chemistry and of chemical and biomolecular engineering and director of NEWT; and Yuefeng Xie, a professor of environmental engineering at Penn State University and an adjunct chair professor at Tsinghua University, China.

The National Science Foundation and the National Natural Science Foundation of China supported the research.

Read the abstract at https://www.nature.com/articles/s41893-020-0518-5

This news release can be found online at https://news.rice.edu/2020/04/27/rice-engineers-make-wastewater-drinkable-again/

Follow Rice News and Media Relations via Twitter @RiceUNews

Related materials:

Qilin Li Research Group: http://qilinli.rice.edu

Department of Civil and Environmental Engineering: https://cee.rice.edu

George R. Brown School of Engineering: https://engineering.rice.edu

Image for download:

https://news-network.rice.edu/news/files/2020/04/0427_URBAN-1-WEB.jpg

A schematic representation of the hybrid water supply system developed by engineers at Rice University. The researchers suggest that delivering water to city dwellers can become far more efficient, and that it should involve a healthy level of recycled wastewater. (Credit: Lu Liu/Rice University)

Located on a 300-acre forested campus in Houston, Rice University is consistently ranked among the nation's top 20 universities by U.S. News & World Report. Rice has highly respected schools of Architecture, Business, Continuing Studies, Engineering, Humanities, Music, Natural Sciences and Social Sciences and is home to the Baker Institute for Public Policy. With 3,962 undergraduates and 3,027 graduate students, Rice's undergraduate student-to-faculty ratio is just under 6-to-1. Its residential college system builds close-knit communities and lifelong friendships, just one reason why Rice is ranked No. 1 for lots of race/class interaction and No. 4 for quality of life by the Princeton Review. Rice is also rated as a best value among private universities by Kiplinger's Personal Finance.
Increasing green spaces in cities could prevent many premature deaths every year

A study in The Lancet Planetary Health applies a novel methodology in Philadelphia, replicable to other cities, to estimate the health impact of increasing the city's tree canopy
BARCELONA INSTITUTE FOR GLOBAL HEALTH (ISGLOBAL)

Increasing the tree canopy to 30% of land area in the city of Philadelphia (United States) could prevent over 400 premature deaths across the city every year and yield an estimated annual economic benefit of almost four billion dollars. This is the conclusion of a study published in the journal The Lancet Planetary Health, which has, for the first time, analysed the impact of increasing green spaces on premature mortality in an entire city. The project was led by the Barcelona Institute for Global Health (ISGlobal), a centre supported by "la Caixa" Foundation, and the United States Forest Service.

Research has shown that green spaces in urban settings are associated with benefits for the physical and mental health of the city's residents. A recent systematic review and meta-analysis carried out by ISGlobal, the University of Colorado and the World Health Organization (WHO) concluded that residential green spaces can protect against premature all-cause mortality. The meta-analysis, which included nine longitudinal studies involving over eight million people in seven different countries, found a significant association between an increase in green space around homes and a reduction in premature mortality.

In the new study, researchers used the dose-response function from the meta-analysis to carry out a health impact assessment and estimate the number of all-cause deaths that could be prevented if green spaces in a whole city were increased. The team studied three different possible scenarios for the city of Philadelphia for 2025. The most ambitious was based on the current goal as set by the City Council of an increase in tree coverage to 30% of land area in each of the city's neighbourhoods (current coverage is 20% for the city as a whole). The other two scenarios were less ambitious. Data on the existing canopy was obtained from aerial and satellite imagery, which allowed the researchers to measure the tree coverage by viewing the crown, leaves, branches and stems from above.
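The core calculation behind such a health impact assessment is simple to sketch. The snippet below uses placeholder numbers (the relative risk, the greenness increment and the baseline adult mortality are assumptions, not the study's published inputs) to show the structure: rescale a protective relative risk to the expected change in exposure and apply it to baseline deaths.

def prevented_deaths(baseline_deaths, rr_per_increment, exposure_increase,
                     increment=0.1):
    """Estimate annual deaths prevented by an increase in greenness.

    rr_per_increment: relative risk of all-cause mortality per `increment`
    of the greenness metric (values below 1 mean greener areas are protective).
    exposure_increase: expected change in the same greenness metric.
    """
    # rescale the relative risk to the scenario's exposure change
    rr_scenario = rr_per_increment ** (exposure_increase / increment)
    return baseline_deaths * (1.0 - rr_scenario)

# hypothetical inputs: 13,000 annual adult deaths, RR = 0.96 per 0.1 increase
# in a greenness index, and a 0.1 increase under the most ambitious scenario
print(f"deaths prevented per year ~ {prevented_deaths(13_000, 0.96, 0.1):.0f}")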

The results of the analysis showed that if Philadelphia achieves its goal of increasing tree coverage to 30% of the city by 2025, 403 premature adult deaths would be prevented each year, representing 3% of the city's annual mortality. The two more moderate scenarios were also associated with significant reductions in annual mortality: a 5% and 10% increase in tree canopy could result in an annual reduction of 271 and 376 deaths, respectively.

"Achieving this goal does not come without challenges. Large tree planting initiatives are faced with many problems, including losses from climate change, tree pests and invasive species, and urban development", explains Michelle Kondo, first author of the study.

"Although every city has its own characteristics, this study provides an example for all the cities in the world: many lives can be saved by increasing trees and greening urban environments, even at modest levels" concludes Mark Nieuwenhuijsen, the study coordinator and director of ISGlobal's Urban Planning, Environment and Health Initiative. "What's more," he adds "green spaces increase biodiversity and reduce the impact of climate change, making our cities more sustainable and more liveable".

Impact on Poor Neighbourhoods


The study also showed that neighbourhoods with a low socioeconomic level would benefit most from any increase in green spaces. "Many of the deaths prevented would be in the poorest areas of the city, even with a moderate increase in the number of trees," comments Kondo.

Philadelphia is the poorest of the ten largest cities in the United States and its mortality rate is higher than the national average. "Urban reforestation programmes are not only essential for improving public health, they are also a way to reduce health inequities and promote environmental justice," she adds.

###

Reference

Kondo MC, Mueller N, Locke DH, Roman LA, Rojas-Rueda D, Schinasi L, Gascon M, Nieuwenhuijsen M, Greening Cities Can Prevent Premature Deaths: Health Impact Assessment of Philadelphia's 2015 Tree Canopy Cover Goals. The Lancet Planetary Health. April 2020.

Factors associated with disaster preparedness among US households


Association of US Households’ Disaster Preparedness With Socioeconomic Characteristics, Composition, and Region

JAMA Netw Open. 2020;3(4):e206881. doi:10.1001/jamanetworkopen.2020.6881
Key Points
Question  What characteristics are associated with resource- and action-based disaster preparedness among US households?
Findings  In this cross-sectional study using nationally representative survey data from 16 725 US households, 68.9% of households fulfilled at least half of recommended preparedness items, but households were more likely to fulfill resource- than action-based items. Fulfillment of specific resource- or action-based items differed substantially by household characteristics.
Meaning  These findings suggest that because households may treat preparedness items differently, targeted strategies are needed to promote preparedness across communities, and risk communication should emphasize the importance of both resource- and action-based preparedness.

Abstract
Importance  Since the terrorist attacks on September 11, 2001, the US government has promoted household disaster preparedness, but preparedness remains low.
Objective  To identify disparities in disaster preparedness among US households.
Design, Setting, and Participants  This cross-sectional study used data from a nationally representative sample of US households from the 2017 American Housing Survey’s topical section on preparedness to assess associations of disaster preparedness with households’ socioeconomic characteristics, composition, and region. Logistic regressions were used to assess associations of household characteristics with overall preparedness, resource- and action-based preparedness, and specific preparedness items. Data analyses were completed on March 27, 2020.
Exposures  Combined household income, head of household’s education level, race/ethnicity, marital status of head of household, head of household aged 65 years or older, presence of children or a household member with a disability, and region.
Main Outcomes and Measures  Nine actionable preparedness items, such as having an emergency carry-on kit (resource), food and water stockpiles (resource), and alternative communication plans and meeting locations (action). Items were summed for the measures of overall, resource-based, and action-based preparedness, with preparedness defined as meeting at least half of the criteria.
Results  Among 16 725 included households, 9103 household heads were men (54.4%), 11 687 were married (69.9%), and 10 749 (66.1%) had some college education or higher. In all, 1969 household heads (11.8%) were black, while 2696 were Hispanic/Latino (16.1%); 3579 household heads (21.4%) were 65 years or older. A total of 7163 households (42.8%) included children, and 3533 households (21.2%) included a person with a disability. Households were more likely to fulfill at least half of the criteria for resource-based preparedness (10 950 households [65.5%]) than for action-based preparedness (6876 households [41.1%]). Wealthy households and those with household heads aged 65 years or older were more likely to fulfill at least half of resource-based items (wealthy households: adjusted odds ratio [aOR] by logged income, 1.18 [95% CI, 1.13-1.22]; household heads age ≥65 years: aOR, 1.42 [95% CI, 1.29-1.55]) but less likely to fulfill action-based items (wealthy households: aOR, 0.96 [95% CI, 0.93-0.99]; household heads age ≥65 years: aOR, 0.92 [95% CI, 0.84-0.99]). Households with black household heads were more likely to fulfill items directly related to emergencies (carry-on emergency kit: aOR, 1.26 [95% CI, 1.14-1.39]; alternative communication plan: aOR, 1.55 [95% CI, 1.39-1.72]; alternative meeting location: aOR, 1.18 [95% CI, 1.07-1.31]) but less likely to fulfill resource-based items (at least half of resource items: aOR, 0.89 [95% CI, 0.80-0.99]).
Conclusions and Relevance  These findings suggest that types of preparedness vary by household characteristics. Targeted strategies are needed to promote preparedness across communities.
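
The adjusted odds ratios (aORs) above come from logistic regressions of binary preparedness outcomes (meeting at least half of the resource- or action-based items) on household characteristics. As a rough illustration of that analysis step only, and not the authors' code, the sketch below fits such a model to simulated toy data and exponentiates the coefficients to obtain aORs with 95% confidence intervals; the variable names are invented, and the survey design weights used in the actual study are omitted.

    # Minimal sketch of the aOR calculation on simulated data (not the study's code).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 2000  # toy sample; the actual survey covered 16 725 households

    # Hypothetical variables standing in for the American Housing Survey fields.
    df = pd.DataFrame({
        "resource_prepared": rng.integers(0, 2, n),   # 1 = met at least half of resource-based items
        "log_income": rng.normal(10.5, 0.6, n),       # logged combined household income
        "head_65_plus": rng.integers(0, 2, n),        # head of household aged 65 or older
        "has_children": rng.integers(0, 2, n),
        "member_disability": rng.integers(0, 2, n),
    })

    # Logistic regression of the binary preparedness outcome on household characteristics.
    model = smf.logit(
        "resource_prepared ~ log_income + head_65_plus + has_children + member_disability",
        data=df,
    ).fit(disp=False)

    # Adjusted odds ratios and 95% CIs: exponentiate the coefficients and interval bounds.
    aor = np.exp(model.params).rename("aOR")
    ci = np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
    print(pd.concat([aor, ci], axis=1))

With real survey data, the exponentiated coefficients would be read as in the Results above: values greater than 1 indicate higher odds of preparedness for that characteristic, values below 1 lower odds.
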
Rapid evolution in fish: genomic changes within a generation

UNIVERSITY OF BASEL

The researchers released 3000 experimental fish into a natural river habitat without stickleback, exposing them to natural selection. After one year, the remaining fish were recaptured and examined genetically. Credit: University of Basel, Dario Moser

Evolution is usually viewed as a slow process, with changes in traits emerging only over thousands of generations. In recent years, however, research has indicated that adaptation in specific traits can occur much more quickly. Yet very few studies outside microorganisms have been able to demonstrate empirically how quickly natural selection shapes the whole genome.

A research team led by Dr. Daniel Berner at the University of Basel's Department of Environmental Sciences has now provided evidence for rapid evolution within a single generation, using threespine stickleback fish as a model organism. The five-year study combined lab work, field experiments, mathematical modeling and genomic analysis.

Different habitats: lakes and rivers

In the Lake Constance area, stickleback have adapted to ecologically different habitats - lakes and rivers. To examine how quickly adaptation occurs across the genome, lake- and river-dwelling fish were crossed in the laboratory over several generations. The genomes of the two ecotypes were thus mixed, resulting in a genetically diverse experimental population.

In a second step, the researchers released thousands of these experimental fish into a natural river habitat without resident stickleback, exposing them to natural selection. After a year, the remaining fish were recaptured and examined genetically.

"The hypothesis of this experiment was that in the river habitat in which the experimental animals had to survive, genetic variants of the original river population would increase in frequency," says Berner. "However, we had no idea whether this would be measurable within a single generation."

Genomic analysis confirms hypothesis

To record potential changes in the genome, the researchers first had to identify the DNA regions most likely to be targeted by natural selection. To do so, they compared the original lake and river populations based on DNA sequence data. This revealed hundreds of regions in the genome likely important for adapting to the lake and river conditions. In precisely these regions, the experimental population's DNA sequence data from before and after the field experiment were then compared to identify changes in the frequency of genetic variants.

The result supported the hypothesis: on average, the frequency of the river variants increased by around 2.5% at the expense of the lake variants. "This difference might appear small at first glance, but is truly substantial when extrapolated over a few dozen generations," says Berner. The experiment demonstrates that evolution can occur very quickly right in front of our eyes - and not only in microorganisms. "Such rapid evolution may help some organisms to cope with the current rapid environmental changes caused by humans," Berner concludes.
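
To see why a shift of around 2.5% per generation is substantial, the short sketch below simply compounds such a shift over a few dozen generations. It is an illustration under stated assumptions rather than the study's model: the starting frequency of 0.5 is hypothetical, and a constant additive change per generation is a simplification of how selection actually moves allele frequencies.

    # Illustration only: extrapolate a constant per-generation allele-frequency shift.
    # The ~2.5% figure is the average change reported in the experiment; the starting
    # frequency and the assumption of a constant shift are hypothetical simplifications.

    def extrapolate_frequency(p0: float, shift_per_gen: float, generations: int) -> float:
        """River-variant frequency after a given number of generations, capped at 1.0."""
        p = p0
        for _ in range(generations):
            p = min(1.0, p + shift_per_gen)
        return p

    p0 = 0.5       # mixed experimental population: roughly equal lake and river variants
    shift = 0.025  # average per-generation increase of the river variants

    for gens in (1, 10, 20, 40):
        freq = extrapolate_frequency(p0, shift, gens)
        print(f"after {gens:2d} generation(s): river-variant frequency ~ {freq:.2f}")

Under these assumptions the river variants would come to dominate the candidate regions within a few dozen generations, which is the sense in which a seemingly small per-generation change is "truly substantial".
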

###
'We urgently need a renewed public debate about new breeding technologies'

Göttingen agricultural economist evaluates research results on food security
UNIVERSITY OF GÖTTINGEN

Gene-editing technologies can help to make smallholder farming in Africa more productive, environmentally friendly, and climate smart. Credit: S Koppmair
Plant breeding has considerably increased agricultural yields in recent decades and thus made a major contribution to combating global hunger and poverty. At the same time, however, the intensification of farming has had negative environmental effects. Increases in food production will continue to be crucial for the future because the world population and demand continue to grow. A recent study by the University of Göttingen shows that new plant breeding technologies - such as genetic engineering and gene editing - can help to increase food production whilst being more environmentally friendly. The results have been published in the journal Applied Economic Perspectives and Policy.

For the study, the agricultural economist Professor Matin Qaim from the University of Göttingen evaluated 50 years' worth of research from around the world. Since the 1960s, there has been a "Green Revolution": high-yielding varieties of wheat, rice and maize have been developed and widely cultivated, especially in Asia and Latin America. As a result, crop yields tripled, which not only increased farmers' incomes, but also improved ordinary consumers' access to food. However, the high yields were accompanied by intensive use of chemical fertilizers and pesticides. In addition, developments concentrated on a small number of cereal species, which led to a reduction in agricultural diversity.

"Cereals are cheap sources of calories, which explains why hunger has been significantly reduced in recent decades," explains Qaim. "Unfortunately, the Green Revolution was less successful in combating widespread micronutrient deficiency. This requires a more balanced diet and more diversity in agricultural production with more beans, vegetables, fruit and other locally adapted species."

New molecular breeding technologies can help modify plants to produce higher yields. In addition, plants can be bred to require less fertilizer and pesticides because they make better use of soil nutrients and are more resistant to disease, pests and extreme weather conditions. The new technologies can also speed up the breeding of new traits, enabling faster adaptation to climate change. "Even though new methods like CRISPR were only invented a few years ago, they have already been used successfully in many different crops. The methods are relatively simple and inexpensive, so even small laboratories can use them to improve local species. This is an important step towards greater diversity," says Qaim.

Genetically modified organisms (GMO) have been grown for about 25 years but remain controversial, especially in Europe. Although these varieties are classified as safe in scientific studies, there is still widespread public concern about possible health and environmental risks. To a large extent, these concerns are related to the fact that GMOs contain genes introduced from other species. "The low public acceptance has led to high regulatory hurdles that are not scientifically justified and seriously slow down the development of the technology," explains Qaim. Gene-editing methods are different because they usually do not transfer foreign genes. In principle, the targeted changes in the plant's DNA could occur naturally. "The problem is that the regulatory authorities in Europe treat plants developed using gene editing in the same way as genetically modified plants where foreign genes are introduced. This fuels public fears and prevents the further development and use of the technology in agriculture. Unfortunately, the European attitude has far-reaching implications for many poor countries, especially in Africa, where new agricultural technologies are particularly important for small farms," says Qaim. "We urgently need a different and more evidence-based public discourse about new breeding technologies, because they can make an important contribution to sustainable agriculture and food security."

###

Original publication: Qaim, M. (2020). Role of new plant breeding technologies for food security and sustainable agricultural development. Applied Economic Perspectives and Policy, https://doi.org/10.1002/aepp.13044

Paper also available here: https://onlinelibrary.wiley.com/doi/full/10.1002/aepp.13044

Contact:

Professor Matin Qaim
University of Göttingen
Faculty of Agricultural Sciences - Department of Agricultural Economics and Rural Development
Platz der Göttinger Sieben 5, 37073 Göttingen, Germany
Tel: +49 (0)551 39-24806
Email: mqaim@uni-goettingen.de
http://www.uni-goettingen.de/en/42360.html
Papua New Guinea highland research redates Neolithic period


UNIVERSITY OF OTAGO

University of Otago Professor of Archaeology Glenn Summerhayes with field crew in Papua New Guinea. Credit: University of Otago
A new report published in Science Advances on the emergence of agriculture in highland Papua New Guinea shows advancements often associated with a later Neolithic period occurred about 1000 years earlier than previously thought.


University of Otago Archaeology Programme Professor and report co-author Glenn Summerhayes says the findings in "Emergence of a Neolithic in highland New Guinea by 5000 to 4000 years ago" provide insights into when and how the highlands were first occupied; the role of economic plants in this process; the development of trade routes which led to the translocation of plants and technologies; and an associated record of landscape, environment and climate change through time.


The report details the earliest figurative stone carving and formally manufactured pestles in Oceania, dating to 5050 to 4200 years ago, which were found at a dig site in Waim. Also found were the earliest planilateral axe-adzes uncovered in New Guinea to date, and the first evidence for fibrecraft and interisland obsidian transfer from neighbouring islands over distances of at least 800km.
"The new evidence from Waim fills a critical gap in our understanding of the social changes and technological innovations that have contributed to the developing cultural diversity in New Guinea," Professor Summerhayes says.

The combination of symbolic social systems, complex technologies, and highland agricultural intensification supports an independent emergence of a Neolithic around 1000 years before the arrival of Neolithic migrants, the Lapita, from Southeast Asia. When considered together with a growing corpus of studies indicating expansion and intensification of agricultural practices, these combined cultural elements represent the development of a regionally distinct Neolithic.
The research establishes dating for other finds at the site, including a fire lighting tool, postholes, and a fibrecraft tool with ochre, possibly used for colouring string fibre.

The report suggests that increased population pressure on unevenly distributed natural resources likely drove this process, an interpretation further supported by patterns of language and genetic divergence.

The project arose out of an Australian Research Council Grant awarded to Dr Judith Field (University of New South Wales) and Professor Summerhayes.

"Former Otago postgraduate student Dr Ben Shaw was employed as postdoctoral fellow to do the "leg work in the field" and Dr Anne Ford (Otago Archaeology Programme) contributed to understandings of the stone tool technologies. As it worked out many of these rich discoveries were made by Dr Shaw. It was one of the best appointments Dr Field and I have ever made. I am proud of our Otago graduates who are some of the best in the world."

Professor Summerhayes and his team had previously completed a Marsden-funded project in the Ivane Valley of Papua New Guinea, establishing the beginning of human occupation at 50,000 years ago. The results of this work were published in Science in 2010.

"This project is a follow-on where we wanted to construct a chronology of human presence in the Simbai/Kaironk Valley of Papua New Guinea by systematic archaeological survey with subsequent excavation and analysis of a select number of sites.

"This work tracks long-term patterns of settlement history, resource use and trade, and establishes an environmental context for these developments by compiling vegetation histories, with particular attention paid to fire histories, indicators of landscape disturbance and markers of climate variability. This will add to understandings of peoples' impact on the environment."

Professor Summerhayes received a Marsden grant in late 2019 for his project "Crossing the divide from Asia to the Pacific: Understanding Austronesian colonisation gateways into the Pacific". This will involve work in the Ramu Valley, which was once part of an inland sea, and will tie in the developments of Highland New Guinea, with the movements of Austronesian speakers into the Pacific.

###

For further information contact
Professor Glenn Summerhayes
University of Otago, Department of Archaeology
Email: glenn.summerhayes@otago.ac.nz
Milk pioneers: East African herders consumed milk 5,000 years ago


WASHINGTON UNIVERSITY IN ST. LOUIS


A modern-day Kenyan collects fresh cow's milk in a gourd. Credit: Oliver Rudd
When you pour a bowl of cereal, you probably aren't considering how humans came to enjoy milk in the first place. But animal milk was essential to east African herders at least 5,000 years ago, according to a new study that uncovers consumption habits in what is now Kenya and Tanzania -- and sheds light on human evolution.

Katherine M. Grillo, assistant professor of anthropology at the University of Florida and a 2012 PhD graduate of Washington University in St. Louis, teamed up with researchers, including Washington University's Fiona Marshall, the James W. and Jean L. Davis Professor in Arts & Sciences, for the study published this week in the Proceedings of the National Academy of Sciences. Julie Dunne at the University of Bristol in the United Kingdom is co-first author on the paper with Grillo.

After excavating pottery at sites throughout east Africa, team members analyzed organic lipid residues left in the pottery and were able to see evidence of milk, meat and plant processing.

"(This is) the first direct evidence we've ever had for milk or plant processing by ancient pastoralist societies in eastern Africa," Grillo said.

"The milk traces in ancient pots confirms the story that bones have been telling us about how pastoralists lived in eastern Africa 5,000 to 3,000 years ago -- an area still famous for cattle herding and the historic way of life of people such as Maasai and Turkana," Marshall said.

Marshall continued: "Most people don't think about the fact that we are not really designed to drink milk as adults -- most mammals can't. People who had mutations that allowed them to digest fresh milk survived better, we think, among herders in Africa. But there's a lot we don't know about how, where and when this happened.

"It's important because we still rely on our genetics to be able to drink fresh cow's milk once we are adults."

This research shows, for the first time, that herders who specialized in cattle -- as opposed to hunting the abundant wildlife of the Mara Serengeti -- were certainly drinking milk.

"One of the reasons pastoralism has been so successful around the world is that humans have developed lactase persistence -- the ability to digest milk due to the presence of specific alleles," Grillo said.

Notably, in east Africa there are distinctive genetic bases for lactase persistence that are different from other parts of the world. Geneticists believed that this ability to digest milk evolved in northeast Africa around 5,000 years ago, but archaeologists knew little about the archaeological contexts in which that evolution took place.

The development of pastoralism in Africa is unique as well: herding societies there developed in areas that often can't support agriculture.

###