Saturday, May 16, 2020

Scientists report critical reduction in Indian lion genome diversity

Modern bioinformatics allows us to look into the past and find out when certain species diverged during evolution, which of them are still genetically close to each other, and which are not.


ITMO UNIVERSITY

Modern bioinformatics allows us to look into the past and find out when certain species diverged during evolution, which of them are still genetically close to each other, and which are not. An international team of researchers, including the academic supervisor of ITMO University's Laboratory of Genomic Diversity and researchers from the University of Copenhagen, the Barcelona Institute of Science and Technology, and other research centers from all over the world, analyzed the genomes of extinct and living lions. They managed to determine when the divergence took place, as well as draw several other conclusions on the genetic diversity of the modern lion population in India. The results are published in the journal Proceedings of the National Academy of Sciences of the United States of America.

Lions are one of the most powerful and dangerous predators on the planet. Many cities and countries use lion images on their coats of arms as a symbol of power and strength.

However, according to scientists, lions are now in great danger of extinction. Over the last two centuries, the population has declined by about 90%, and during the last 150 years the Cape and Barbary lions were exterminated. Today, apart from zoos, these animals can be found only in Western and Central Africa, as well as in the Gir National Park and Wildlife Sanctuary in Gujarat, India. To understand the decline of the lion population and prevent their extinction, scientists need to answer several questions concerning their evolutionary history.

An international research team, which includes specialists from different countries and continents, as well as Stephen O'Brien, the academic supervisor of ITMO University's Laboratory of Genomic Diversity, has analyzed the remains of cave lions kept in museums and found during paleontological expeditions. The results of the analysis were compared with those of modern lions, and it was concluded that cave lions and modern lions diverged about 500,000 years ago.

The scientists have also discovered that the ancestors of lions that used to live in Central and Western Africa diverged about 70,000 years ago from the ancestors of the subspecies that once inhabited Northern Africa and now inhabits India. The popular myth that lions were artificially brought to India in the pre-colonial era is therefore disproved.

This research may have a direct impact on attempts to restore the North African lion population. If scientists manage to determine the closest relatives of the Barbary subspecies, restoration efforts will be more scientifically substantiated and possibly more successful. This is all the more promising because several zoos keep animals believed to be descended from Barbary lions gifted to Moroccan rulers in the 19th century.

One of the important conclusions the scientists came to is that there is no clear evidence that cave lions, which went extinct 10,000-15,000 years ago, or the Cape and Barbary lions, which went extinct recently, suffered from low genetic diversity. This means their extinction was evidently not caused by degeneration and the accumulation of deleterious mutations in the population.

However, the reduction in genome diversity can be clearly detected in the modern population of Indian lions. As they have been living in a comparatively small area for centuries, inbreeding often took place. This resulted in cranial defects, low sperm count and testosterone levels, as well as smaller manes. These facts should be taken into consideration during the attempts to save the Indian lion population from extinction.

"The obtained results demonstrate the power given to us by the era of genome research. We can apply it to discover the secrets of the past by reading the fragments of DNA taken from modern species' ancestors. Apart from that, a troubling reduction in Indian lion genetic material was proved," Stephen O'Brien concludes.

The carnivorous plant lifestyle is gene costly
UNIVERSITY OF WÜRZBURG



Image: The genomes of the carnivorous plants Venus flytrap, spoon-leaved sundew and waterwheel (from left) are decoded. Credit: Dirk Becker and Sönke Scherzer / University of Würzburg

Plants can produce energy-rich biomass with the help of light, water and carbon dioxide. This is why they are at the beginning of food chains. But carnivorous plants have turned the tables and hunt animals. Insects are their main food source.
A publication in the journal Current Biology now sheds light on the secret life of the green carnivores. The plant scientist Rainer Hedrich and the evolutionary bioinformatician Jörg Schultz, both from Julius-Maximilians-Universität (JMU) Würzburg in Bavaria, Germany, and their colleague Mitsuyasu Hasebe from the University of Okazaki (Japan) have deciphered and analysed the genomes of three carnivorous plant species.
They studied the Venus flytrap Dionaea muscipula, which originates from North America, the globally occurring waterwheel plant Aldrovanda vesiculosa and the spoon-leaved sundew Drosera spatulata, which is widely distributed in Asia.
All three belong to the sundew family. Nevertheless, they have each conquered different habitats and developed their own trapping mechanisms. In Dionaea and Aldrovanda, the ends of the leaves are transformed into folding traps. The sundew, on the other hand, attaches its prey to the leaf surface with sticky tentacles.
Basic genes for carnivory
The first thing the international research team found out was that, despite their different lifestyles and trapping mechanisms, Venus flytrap, sundew and waterwheel have a common "basic set" of genes that are essential for the carnivorous lifestyle.
"The function of these genes is related to the ability to sense and digest prey animals and to utilise their nutrients," explains Rainer Hedrich.
"We were able to trace the origin of the carnivory genes back to a duplication event that occurred many millions of years ago in the genome of the last common ancestor of the three carnivorous species," says Jörg Schultz. The duplication of the entire genome has provided evolution with an ideal playing ground for developing new functions.
Genetic poverty despite a special way of life
To their surprise, the researchers discovered that the plants do not need a particularly large number of genes for carnivory. Instead, the three species studied are actually among the most gene-poor plants known. Drosera has 18,111, Dionaea 21,135 and Aldrovanda 25,123 genes. In contrast, most plants have between 30,000 and 40,000 genes.
How can this be reconciled with the fact that a wealth of new genes is usually needed to develop new ways of life? "This can only mean that the specialization in animal food was accompanied by an increase in the number of genes, but also a massive loss of genes," concludes developmental biologist Hasebe.
Root genes are active in the trapping organs
Most of the genes required for the insect traps are also found in slightly modified form in normal plants. "In carnivorous plants, several genes are active in the trapping organs, which in other plants have their effect in the root. In the trapping organs, these genes are only switched on when the prey is secure," explains Hedrich. This finding is consistent with the fact that the roots are considerably reduced in Venus flytrap and sundew. In the waterwheel they are completely absent.
Further research into the trapping function
The researchers now have an insight into the evolution of carnivory in plants and hold three blueprints for this particular way of life in their hands. Their next goal is to gain an even better understanding of the molecular basis of the trapping function.
"We have found that the Venus flytrap counts the electrical stimuli triggered by the prey, can remember this number for a certain time and finally makes a decision that corresponds to the number," says Hedrich. Now it is important to understand the biophysical-biochemical principle according to which carnivorous plants count.

Catnip's chemical attractant is new twist on old family tradition

FLORIDA MUSEUM OF NATURAL HISTORY
Image: Catnip evolved its cat-attracting chemical, actually an insect repellant, tens of millions of years after one of its ancestors had lost the ability to make this type of compound. Credit: Alex Abair
GAINESVILLE, Fla. --- Catnip is most famous for its ability to launch felines into a euphoric frenzy, but the origin of its cat-attracting chemical is a remarkable example of evolutionary innovation.
While the compound nepetalactone drives two-thirds of cats batty, likely by mimicking sex pheromones, its real purpose is protecting catnip from pests. Nepetalactone belongs to a class of chemicals called iridoids, which can repel insects as effectively as DEET.
Many of catnip's relatives in the mint family use iridoids as chemical armor. But an international team of researchers found the ancient ancestor of catnip lost a key iridoid-making gene. Descendants in this lineage - herbs such as basil, oregano, rosemary, lemon balm and mint - had to lean on other defenses, with one notable exception: catnip, which revived the family tradition by evolving a new iridoid production line from scratch.
The findings, including the first detailed look at catnip's nepetalactone recipe, were published today in Science Advances. They provide crucial insights into how plants lose and regain defensive compounds, said study co-author Pamela Soltis, Florida Museum of Natural History curator and University of Florida distinguished professor.
"If we know how evolutionarily flexible a trait is, we can hypothesize about how easy or difficult it might be to modify the trait in another species through plant breeding, genetic engineering or gene editing," she said. "It might be possible to make a crop more resistant to pests if we know that a close relative re-evolved a compound that had previously been lost."
Many plants in the mint family also have medicinally important compounds, said study co-author Douglas Soltis, Florida Museum curator and UF distinguished professor, pointing to iridoid-derived cancer treatments as an example.
"Understanding these plants' underlying biochemical pathways is key to using them for human health," he said.
Researchers sequenced the genomes of two species of catnip and of hyssop, a close relative that does not produce iridoids. By comparing the genomes, analyzing the mint family tree and studying ancestral genes and enzymes, they were able to trace the sequence of events that led to the loss of iridoid production in catnip's ancestor between 55 and 65 million years ago and its re-emergence tens of millions of years later.
The deletion of a gene erased the ability of plants in the subfamily Nepetoideae to make iridoids. Whether the gene deletion was the result of a sudden mutation or a gradual "phasing out" of iridoid production as these plants switched to other chemical defenses remains unclear, Pamela Soltis said.
Without this gene, catnip had to co-opt a related gene to build a new biochemical pathway for making iridoids, beginning about 20 million years ago, Douglas Soltis said.
"It's sort of like, 'I lost my screwdriver, and this one isn't quite the same, but it will work,'" he said, quoting "Jurassic Park" character Ian Malcolm: "'Life, uh, finds a way.'"
The new pathway resulted in nepetalactone, which maintains some hallmark iridoid features, but has a unique chemical structure and properties, the researchers said. The enzymes involved in its production are not found in any related plant species.
"There is a lot of evolutionary back-and-forth in all sorts of characteristics in plants - such as the origin of succulence in cacti, jade plants and aloe, or multiple derivations of red or purple pigments in distantly related species," Pamela Soltis said. "But whenever the 'same' thing re-evolves, it always turns out that it has done so slightly differently - always with a 'twist.'"
Why catnip re-evolved the ability to produce iridoids is "the next big question," Douglas Soltis said.
"As the mint family migrated across Eurasia, semi-arid habitats could have imposed new selective pressures," he said. "Maybe iridoids are more effective as defense compounds in those environments. That can't explain the origin of the new pathway, but it can explain the selection for it once it originates."
###
The study is part of the Mint Genome Project, a National Science Foundation-funded initiative that is unravelling the chemistry of the mint family's botanical compounds and how they're produced.
"Plants are constantly evolving new chemistry," said the study's lead researcher Sarah O'Connor, director of the Department of Natural Product Biosynthesis at Germany's Max Planck Institute for Chemical Ecology. "With our research, we would like to get snapshots of this evolution in action."
Grant Godden, Tal Kinser and Miao Sun of the Florida Museum also co-authored the study.

Further evidence does not support hydroxychloroquine for patients with COVID-19

Adverse events were more common in those receiving the drug
BMJ
The anti-inflammatory drug hydroxychloroquine does not significantly reduce admission to intensive care or death in patients hospitalised with pneumonia due to covid-19, finds a study from France published by The BMJ today.
A randomised clinical trial from China also published today shows that hospitalised patients with mild to moderate persistent covid-19 who received hydroxychloroquine did not clear the virus more quickly than those receiving standard care. Adverse events were higher in those who received hydroxychloroquine.
Taken together, the results do not support routine use of hydroxychloroquine for patients with covid-19.
Hydroxychloroquine can reduce inflammation, pain, and swelling, and is widely used to treat rheumatic diseases. It is also used as an anti-malarial drug. Lab tests showed promising results, but accumulating trial and observational evidence has called into question whether there are any meaningful clinical benefits for patients with covid-19.
Despite this, hydroxychloroquine has already been included in Chinese guidelines on how best to manage the disease, and the US Food and Drug Administration (FDA) issued an emergency use authorization to allow the drug to be provided to certain hospitalized patients. The FDA has since warned against use outside clinical trials or hospital settings due to the risk of heart rhythm problems.
In the first study, researchers in France assessed the effectiveness and safety of hydroxychloroquine compared with standard care in adults admitted to hospital with pneumonia due to covid-19 who needed oxygen.
Of 181 patients, 84 received hydroxychloroquine within 48 hours of admission and 97 did not (control group).
They found no meaningful differences between the groups for transfer to intensive care, death within 7 days, or developing acute respiratory distress syndrome within 10 days.
The researchers say that caution is needed in the interpretation of their results, but that their findings do not support the use of hydroxychloroquine in patients hospitalised with covid-19 pneumonia.
In the second study, researchers in China assessed the effectiveness and safety of hydroxychloroquine compared with standard care in 150 adults hospitalised with mainly mild or moderate covid-19.
Patients were randomly split into two groups. Half received hydroxychloroquine in addition to standard care and the others received standard care only (control group).
By day 28, tests revealed similar rates of covid-19 in the two groups but adverse events were more common in those who received hydroxychloroquine. Symptom alleviation and time to relief of symptoms also did not differ meaningfully between the two groups.
While further work is needed to confirm these results, the authors say that their findings do not support the use of hydroxychloroquine to treat patients with persistent mild to moderate covid-19.
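For readers curious how such between-group comparisons are typically quantified, the short Python sketch below computes a risk ratio with an approximate 95% confidence interval. It is illustrative only: the group sizes mirror the French trial (84 versus 97 patients), but the event counts are hypothetical placeholders, not figures reported in either study.

# Minimal sketch of a two-arm outcome comparison via risk ratio (illustrative only).
# Event counts below are hypothetical; only the group sizes (84 vs 97) come from
# the French trial described above.
import math

def risk_ratio_ci(events_treat, n_treat, events_ctrl, n_ctrl, z=1.96):
    """Risk ratio (treatment vs control) with an approximate 95% CI (log method)."""
    risk_treat = events_treat / n_treat
    risk_ctrl = events_ctrl / n_ctrl
    rr = risk_treat / risk_ctrl
    # Standard error of log(RR) for independent binomial proportions
    se = math.sqrt(1/events_treat - 1/n_treat + 1/events_ctrl - 1/n_ctrl)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical example: 16/84 events on hydroxychloroquine vs 18/97 on standard care.
rr, lo, hi = risk_ratio_ci(16, 84, 18, 97)
print(f"Risk ratio {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")

A ratio close to 1 with a confidence interval spanning 1, as in this made-up example, is the kind of result described as showing "no meaningful differences" between the arms.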
###
Peer reviewed? Yes
Evidence type: Observational comparative study; Randomised controlled trial
Subjects: Patients with covid-19

Cahokia's rise parallels onset of corn agriculture

UNIVERSITY OF ILLINOIS AT URBANA-CHAMPAIGN, NEWS BUREAU
Image: Corn cultivation began in the vicinity of the city of Cahokia between A.D. 900 and 1000, researchers report in a new study. Its arrival may have contributed to the abrupt... Credit: Graphic by Diana Yates
CHAMPAIGN, Ill. -- Corn cultivation spread from Mesoamerica to what is now the American Southwest by about 4000 B.C., but how and when the crop made it to other parts of North America is still a subject of debate. In a new study, scientists report that corn was not grown in the ancient metropolis of Cahokia until sometime between A.D. 900 and 1000, a relatively late date that corresponds to the start of the city's rapid expansion.
The findings are published in the journal American Antiquity.
The research team determined the age of charred corn kernels found in homes, shrines and other archaeological contexts in and around Cahokia. The researchers also looked at carbon isotopes in the teeth and bones of 108 humans and 15 dogs buried in the vicinity.
Carbon-isotope ratios differ among food sources, with isotope ratios of corn being significantly higher than those of almost all other native plant species in the region. By analyzing the ratio of carbon 12 to carbon 13 in teeth and bones, the team determined the relative proportion of different types of foods the people of Cahokia ate in different time periods.
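As a rough illustration of how such dietary estimates work, the sketch below applies a simple two-end-member mixing model: corn is a C4 plant with a much higher (less negative) carbon-13 signature than the native C3 plants of the region, so a sample's value can be read as a blend of the two. The end-member and sample values here are hypothetical round numbers, not measurements from the study, and the study's actual method may differ.

# Illustrative two-end-member carbon-isotope mixing model (hypothetical values,
# not the study's data or its exact method).
DELTA_C3 = -21.0   # assumed collagen delta-13C for a diet of native C3 plants (per mil)
DELTA_C4 = -7.0    # assumed collagen delta-13C for a corn-dominated (C4) diet (per mil)

def corn_fraction(delta_sample: float) -> float:
    """Estimate the fraction of dietary carbon derived from corn (0 to 1)."""
    frac = (delta_sample - DELTA_C3) / (DELTA_C4 - DELTA_C3)
    return min(max(frac, 0.0), 1.0)  # clamp to the physically meaningful range

for delta in (-20.0, -15.0, -10.0):
    print(f"delta-13C = {delta:6.1f} per mil -> roughly {corn_fraction(delta):.0%} corn-derived carbon")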
The corn remnants and isotope analyses revealed that corn consumption began in Cahokia between 900 and 1000. This was just before the city grew into a major metropolis.
"There's been an idea that corn came to the central Mississippi River valley at about the time of Christ, and the evolution of maize in this part of the world was really, really slow," said retired state archaeologist Thomas Emerson, who led the study. "But this Cahokia data is saying that no, actually, corn arrived here very late. And in fact, corn may be the foundation of the city."
The research team included Illinois State Archaeological Survey archaeobotanist Mary Simon; bioarchaeologist Kristin Hedman; radiocarbon dating analyst Matthew Fort; and former graduate student Kelsey Witt, now a postdoctoral researcher at Brown University.
Beginning in about 1050, Cahokia grew from "a little village of a few hundred people to part of a city with 5,000 to 10,000 people in an archaeological instant," Emerson said. The population eventually expanded to at least 40,000. This early experiment in urban living was short-lived, however. By 1350, after a period of drought and civil strife, most of the city's population had dispersed.
Scientists who theorize that corn came to the central Mississippi River valley early in the first millennium A.D. are overlooking the fact that the plant had to adapt to a completely different light and temperature regime before it could be cultivated in the higher latitudes, said Simon, who conducted an exhaustive analysis of corn kernels found at Cahokia and elsewhere in the Midwest.
"Corn was originally cultivated in Mesoamerica," she said. "Its flowering time and production time are controlled by the amount of sunlight it gets. When it got up into this region, its flowering was no longer corresponding to the available daylight. If you planted it in the spring, it wouldn't even start to flower until August, and winter would set in before you could harvest your crop."
The plant had to evolve to survive in this northerly climate, Simon said.
"It was probably only marginally adapted to high latitudes in what is now the southwestern United States by 0 A.D.," she said. "So, the potential for successful cultivation in the Midwest at this early date is highly problematic."
When they analyzed the carbon isotopes in the teeth and bones of 108 individuals buried in Cahokia between 600 and 1400, researchers saw a signature consistent with corn consumption beginning abruptly between 950 and 1000, Hedman said. The data from dogs buried at and near Cahokia also corresponded to this timeline.
"That's where you see this big jump in the appearance of corn in the diet," Hedman said. "This correlates very closely with what Mary Simon is finding with the dates on the maize."
"Between 900 and 1000 is also when you start to see a real shift in the culture of Cahokia," Emerson said. "This was the beginning of mound construction. There was a massive growth of population and a dramatic shift in ideology with the appearance of fertility iconography."
Artifacts uncovered from Cahokia include flint-clay figurines of women engaged in agricultural activities and vessels marked with symbols of water and fertility. Some of the items depict crops such as sunflowers and squash that predated the arrival of corn.
"It wasn't like the Cahokians didn't already have an agricultural base when corn arrived on the scene," Simon said. "They were preadapted to the whole idea of cultivation."
The absence of corn iconography in artifacts from the city reflects corn's status as a relative newcomer to the region at the time Cahokia first flourished, Emerson said.
Built near present-day St. Louis, the ancient city of Cahokia was an early experiment in urban living.
###
The Illinois Department of Transportation and ISAS supported this research. ISAS is a division of the Prairie Research Institute at the University of Illinois at Urbana-Champaign.
Editor's notes:
The paper "Isotopic confirmation of the timing and intensity of maize consumption in Greater Cahokia" is available online and from the U. of I. News Bureau

New Chicago Booth research suggests patients prefer expert guidance for medical decisions

Findings from professors Emma Levine and Celia Gaertig show, now more than ever, people want paternalistic advice -- not autonomy -- for key healthcare choices
UNIVERSITY OF CHICAGO BOOTH SCHOOL OF BUSINESS
Over the past several decades, the United States medical system has increasingly prioritized patient autonomy. However, new research from University of Chicago Booth School of Business Professors Emma Levine and Celia Gaertig, and Northwestern Ph.D. candidate Samantha Kassirer, suggests that in times of uncertainty, people want expert guidance when making choices about their medical care.
The study, released by Proceedings of the National Academy of Sciences (PNAS), examines the important question of how patients, and advisees in general, react to full decisional autonomy when making difficult decisions about their health. The researchers found that advisers who gave advisees decisional autonomy rather than offering paternalistic advice were judged to be less competent and less helpful. As a result, advisees were less likely to return to these advisers or to recommend them.
"It's clear that many of us don't want to be responsible for difficult decisions, but doctors seem more concerned than other experts that their advice might infringe on autonomy, and more worried about being blamed later," said Levine, assistant professor of behavioral science and Charles E. Merrill Faculty Scholar at Chicago Booth. "Our results suggest that advisees facing difficult decisions do not perceive autonomy as the gold standard."
The study also indicates that the preference for paternalistic guidance could extend beyond doctors. The researchers asked another set of participants to choose between two hypothetical investments, with some participants receiving recommendations from financial advisers while others did not. In a different experiment, they asked participants to imagine being given feedback from a boss about an upcoming presentation. In both cases, participants continued to prefer paternalistic advice. Moreover, in a final experiment, they didn't get angry at advisers for what turned out to be a bad outcome.
A summary of findings also appears in the Chicago Booth Review.
###

COVID-19 and terrorism: Assessing the short and long-term impacts of terrorism

CRANFIELD UNIVERSITY
A new report authored by Pool Re and Cranfield University's Andrew Silke, Professor of Terrorism, Risk and Resilience, reveals how the COVID-19 pandemic is already having a significant impact on terrorism around the world.
The report, 'COVID-19 and terrorism: assessing the short-and long-term impact' reveals:
  • There is a mixed picture on the level of attacks in the short-term - lockdown measures will tend to inhibit attacks but terrorist propaganda calling for attacks (while authorities are distracted, etc.) will incite some incidents.
  • Much propaganda - and particularly that connected to far-right extremism - is focusing on conspiracy theories connected to COVID-19 and this has already inspired plots and attacks.
  • Islamist extremist propaganda is focusing more on the vulnerability of government opponents distracted by the pandemic and the opportunity this presents for attacks.
  • There is a significant current increase in online extremist activity, raising the risk of increasing short-to-medium term radicalisation.
  • There are strong long-term concerns that states weakened by the serious economic consequences of the pandemic will be more vulnerable to the emergence/resurgence of terrorist groups in many parts of the world.
Launching the report, Andrew Silke, Pool Re and Cranfield University's Professor of Terrorism, Risk and Resilience, said: "The pandemic is likely to have a mixed impact on terrorism trends in the short term. While lockdown measures may represent obstacles to terrorists carrying out real-world attacks, many terrorist groups have also flagged that the pandemic has left government and security resources severely stretched.
"As a result, the ability of government, intelligence and law enforcement agencies to focus on traditional priorities such as counterterrorism has been undermined."
Commenting on CBRN weapons, Professor Silke continues: "One genuine concern is that COVID-19 may lead to a resurgence in interest among terrorists for using chemical, biological, radiological and nuclear weapons. Historically, a range of terrorist movements have been interested in bioterrorism though there have been very few successful attacks by terrorists using biological weapons. While serious obstacles certainly remain, the huge impact of COVID-19 may re-ignite some interest in biological weapons."
Pool Re's Chief Resilience Officer, Ed Butler said: "This report is very timely and worth digesting at a time when we are quite rightly focussed on the near-term issues and human and economic devastation being caused by this global pandemic. However, Pool Re's core purpose remains the provision of terrorism reinsurance and we need to continue to understand the contemporary terrorist threats as well as horizon scan the future landscape. Pool Re's strategic relationship with Cranfield University underpins the importance we attach to collaborating with academia in understanding and mitigating against catastrophic perils."
###

COVID-19: Hospital response risks worsening health inequalities

SAGE
Disadvantaged and marginalised people face worsening health inequalities as a result of the difficult choices made by NHS hospitals in response to the Covid-19 pandemic. Public health doctors, writing in the Journal of the Royal Society of Medicine, say that the restriction of non-urgent clinical services, such as gynaecology, sexual health and paediatrics, and the precipitous decline in emergency department attendances will disproportionately affect marginalised groups. Emergency departments, which in March 2020 saw a 44% decline in attendances, are often used for routine care by vulnerable people, such as homeless people and migrants, who can find it difficult to access general practice and other community services.
In their article, the authors explore the nature of health inequalities relating to the response to Covid-19 by hospital trusts and suggest approaches to reduce them. One concern highlighted is the suspension of carbon monoxide screening for pregnant women. Younger women, and those living in more deprived areas, are more likely to smoke during pregnancy. Lead author Sophie Coronini-Cronberg, consultant in public health at Chelsea and Westminster NHS Foundation Trust, said: "It remains vital that maternity services continue to ask women (and their partners) if they smoke or have recently quit, and continue to refer those who smoke for specialist cessation support." Official guidance advises postponing face-to-face smoking cessation clinics during the pandemic. Ms Coronini-Cronberg said: "We encourage providers to offer alternative remote services, to ensure these are equitable and to promote these tenaciously."
The authors also point to the problem of inaccurate baseline data for disease prevalence and progression, which for many conditions can vary by ethnicity. Ms Coronini-Cronberg said: "It is imperative that we rigorously capture baseline data so that we understand the impact of key risk factors on disease prognosis, including Covid-19." The authors write that while ethnicity data are generally accurately captured for white British patients, for minority groups only 60-80% of hospital records capture ethnicity correctly. "We risk reaching incorrect conclusions based on flawed data", they say.
Other areas of concern highlighted by the authors include the inequalities faced by contracted workers who may provide critical hospital functions such as security, cleaning, portering and catering and who are more likely to be migrants.
The authors conclude: "The NHS has taken swift action to expand capacity and reorganise services to help ensure that health services can help with an influx of seriously ill Covid-19 patients. Difficult choices have been made, and some unintended consequences are inevitable. Policymakers, managers and clinicians should take pause during this phase to protect the most vulnerable groups in our society from negative unintended consequences and avoid worsening health inequalities."
###
CFC replacements are a source of persistent organic pollution in the Arctic
Degraded, toxic compounds from CFC replacements found in ice in the Canadian Arctic
UNIVERSITY OF ALBERTA
Substances used to replace ozone-depleting chlorofluorocarbons (CFCs) may be just as problematic as their predecessors, a new study shows.
In 1987, Canada implemented the Montreal Protocol, a global agreement to protect Earth's ozone layer by ceasing the use of substances like CFCs. Unfortunately, the substances used to replace CFCs are proving problematic as well, with accumulating levels of their degradation products recently found in the Canadian Arctic.
"In many ways, the degradation products from these substances may be just as concerning as the original chemical they were meant to replace," said Alison Criscitiello, director of the Canadian Ice Core Lab (CICL), housed in the University of Alberta's Faculty of Science. "We are seeing significant levels of these short-chain acids accumulating in the Devon Ice Cap, and this study links some of them directly to CFC replacement compounds."
An ice core drilled on the summit of Devon Ice Cap in the Canadian high Arctic shows a tenfold increase in short-chain perfluorocarboxylic acid (scPFCA) deposition between 1986 and 2014. scPFCAs form through atmospheric oxidation of several industrial chemicals, some of which are CFC replacement compounds. scPFCAs are highly mobile persistent organic pollutants and belong to the class of so-called "forever chemicals" because they do not break down. A few preliminary studies have shown toxicity of these substances to plants and invertebrates.
"This is the first multi-decadal temporal record of scPFCA deposition in the Arctic," explained Criscitiello. "Our results suggest that the CFC-replacement compounds mandated by the Montreal Protocol are the dominant source of some scPFCAs to remote regions."
Over the past four years, Criscitiello and colleagues drilled four ice cores across the eastern Canadian high Arctic. This interdisciplinary work is thanks to a strong collaboration between Criscitiello and the labs of York University atmospheric chemist Cora Young and Environment and Climate Change Canada research scientist Amila De Silva.
These same Canadian Arctic ice cores also contain significant levels of perfluoroalkyl acids (PFAAs). These results demonstrate that both perfluoroalkyl carboxylic acids (PFCAs) and perfluorooctane sulfonate (PFOS) have continuous and increasing deposition on the Devon Ice Cap despite North American and international regulations and phase-outs. This is the likely result of ongoing manufacture, use, and emissions of these persistent pollutants, as well as their precursors and other new compounds in regions outside of North America.
"These results show the need for a more holistic approach when deciding to ban and replace chemical compounds," explained Criscitiello. "Chemicals degrade, and developing a strong understanding of how they degrade in the environment, and what they degrade to, is vital."
###
The paper, "Ice core record of persistent short?chain fluorinated alkyl acids: Evidence of the impact from global environmental regulations," is published in Geophysical Research Letters(doi: 10.1029/2020GL087535).

UMD researchers seek to reduce food waste and establish the science of food date labeling

New study highlights the importance of interdisciplinary collaborations to reduce global food waste due to date labeling
UNIVERSITY OF MARYLAND
Image: Grocery market. Credit: Mohamed Mahmoud Hassan
Minimizing food waste is top of mind right now during the COVID-19 global pandemic, with the public concerned about the potential ramifications for our food supply chain. But even before COVID-19, given concerns about a rapidly growing population and hunger around the world, the Food and Agriculture Organization of the United Nations (FAO) issued a global call for zero tolerance on food waste. However, the lack of regulation, standardization, and general understanding of date labeling on food products (such as "best by" and "use by" dates) leads to billions of dollars per year in food waste in the United States alone. Many people don't realize that date labels on food products (with the exception of infant formula) are entirely at the manufacturer's discretion and are not supported by robust scientific evidence. To address this concern and combat global food waste, researchers at the University of Maryland have come together across departments in the College of Agriculture & Natural Resources with the goal of clarifying the science, or lack thereof, behind food date labels, highlighting the need for interdisciplinary research, and surveying global research trends in their new publication in Food Control.
"We have 50 different types of date labels that are currently used in the US because there is no regulation - best by, best if used by, use by - and we as consumers don't know what these things mean," says Debasmita Patra, assistant research professor in Environmental Science and Technology and lead author on the paper. "The labeling is the manufacturer's best estimation based on taste or whatever else, and it is not scientifically proven. But our future intention is to scientifically prove what is the best way to label foods. As a consumer and as a mom, a best by date might raise food safety concerns, but date labeling and food safety are not connected to each other right now, which is a wide source of confusion. And when billions of dollars are just going to the trash because of this, it's not a small thing."
According to the United States Department of Agriculture's Economic Research Service (USDA-ERS), Americans discard or waste about 133 billion pounds of food each year, representing $161 billion and a 31% loss of food at the retail and consumer level. According to the FDA, 90% of Americans say they are likely to prematurely discard food because they misinterpret date labels, whether out of food safety concerns or uncertainty about how to properly store the product. This simple confusion accounts for 20% of the total annual food waste in the United States, representing more than 26 billion pounds per year and over $32 billion in food waste.
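A quick back-of-the-envelope check, sketched in Python below, shows how the label-confusion figures follow from the USDA-ERS totals quoted above; this is bookkeeping on the release's own rounded numbers, not an independent estimate.

# Consistency check on the figures quoted above (press-release numbers, rounded).
total_waste_lbs = 133e9       # ~133 billion pounds of food wasted per year (USDA-ERS)
total_waste_usd = 161e9       # ~$161 billion per year
label_confusion_share = 0.20  # ~20% of annual food waste attributed to date-label confusion

print(f"Waste from label confusion: {total_waste_lbs * label_confusion_share / 1e9:.1f} billion lbs")
print(f"Value of that waste:        ${total_waste_usd * label_confusion_share / 1e9:.1f} billion")
# -> roughly 26.6 billion pounds and about $32 billion, matching the figures above.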
"Food waste is a significant threat to food security," adds Paul Leisnham, associate professor in Environmental Science and Technology and co-author. "Recognition of food waste due to confusion over date labeling is growing, but few studies have summarized the status of the research on this topic."
This was the goal of their latest publication, gathering support and background for their future work to reduce food waste, and providing guidance for future areas of research in this field. In order to achieve this, Patra enlisted Leisnham in her own department, but also relied on computational support and food quality and safety expertise from Abani Pradhan, associate professor in Nutrition and Food Science, and his postdoctoral fellow Collins Tanui, both co-authors on the paper.
"We wanted to see the trends and give some suggestions, because the paper shows that we are some of the very few who are thinking about truly interdisciplinary research connecting food labeling to food waste," says Patra. "In fact, we were joking because one major finding was that environmental sciences and food science departments don't seem to collaborate on this topic, so we are doing something unique here at UMD."
"Our paper underlined the fact that future research on food waste and date labeling needs to take an interdisciplinary approach to better explore the perspectives of multiple stakeholders, adds Leisnham. "Expertise from environmental science, food science, sociology, Extension education, and other disciplines can more effectively develop interventions to reduce behaviors that may increase food waste. This is an environmental issue, but involves the knowledge, attitudes, perceptions, and social behaviors of multiple stakeholders, including retailers, food-service providers, and diverse consumers."
The collaboration between environmental sciences and food sciences at UMD is an example of this collaboration in action, with the goal of establishing what science, if any, already underlies date labeling and connecting this to food quality and safety.
"Utilizing my expertise in experimental and mathematical modeling work, we aim to scientifically evaluate the quality characteristics, shelf life, and food spoilage risk of food products," says Pradhan. "This would help in determining if the food products are of good quality beyond the mentioned dates, rather than discarding them prematurely. We anticipate to reduce food waste through our ongoing and future research findings."
Patra stresses the importance of further collaboration through University of Maryland Extension (UME) to have maximum impact on food waste. "Where is the confusion coming from?" says Patra. "If we understand that, maybe we can better disseminate the information through our Extension work."
Patra adds, "Food is something that is involved in everybody's life, and so everyone needs to be a good food manager. But even now, there is no robust scientific evidence behind date labels, and yet those labels govern people's purchasing behavior. People look for something that has a longer 'best by' date thinking they are getting something better. And when you throw that food away, you are not only wasting the food, but also all the economics associated with that, like production costs, transportation from the whole farm to fork chain, and everything else that brought you that product just to be thrown away. Food safety, regulation, and education need to all combine to help solve this problem, which is why interdisciplinary collaboration is so important."
###
The paper, entitled "Evaluation of global research trends in the area of food waste due to date labeling using a scientometrics approach," is published in Food Control, DOI: 10.1016/j.foodcont.2020.107307.