Thursday, February 25, 2021

Reducing salt in Parmigiano Reggiano cheese might not negatively affect its flavor

AMERICAN CHEMICAL SOCIETY

Research News

Aged cheeses pack a punch of nutty, sharp flavor. Before they're fully mature, aged cheeses are either waxed or placed in brine for weeks to create a natural rind. However, the high salt content in brined cheeses deters some consumers. Now, researchers reporting in ACS Food Science & Technology present a shortened brining time for Parmigiano Reggiano that results in a less salty product, while still potentially maintaining the cheese's distinctive texture and flavor compounds.

Parmigiano Reggiano is a lactose-free, crumbly and hard cheese. Manufactured in select provinces in Italy, its protected designation of origin status requires that certain production processes, such as a minimum 12-month ripening period, be performed. Ripening or maturing imparts the cheese's recognizable taste as milk solids are converted to flavor compounds. But before that, cheese wheels are placed in a saturated brine solution for weeks. The added salt plays a key role in the ripening process by modulating microbial growth, enzyme activity and the separation of solids from liquids, hardening the final product. One enzyme-mediated reaction is lipolysis, in which triglyceride fats in milk break down into their key components -- free fatty acids and diacylglycerides. Free fatty acids not only contribute to the taste of the cheese but are also precursors to other flavor molecules. So, Silvia Marzocchi and colleagues wanted to test the impact of brining time on the lipolysis reactions responsible for the free fatty acids involved in Parmigiano Reggiano's flavor profile and distinctive characteristics.

The researchers had five Parmigiano Reggiano dairies brine several cheese wheels by immersing them in a saturated salt solution for either the traditional 18 days or a shorter 12-day period. The wheels were then ripened for 15 months under conditions typical for this type of cheese. Salt content in the fully ripened cheese was 9% lower in the samples brined for the shorter time than in those given the longer treatment. Unexpectedly, the researchers found no differences in moisture level, cholesterol or total fat between the two sets of cheeses. The team also observed no major variations in compounds involved in the flavor profile, as most of the 32 free fatty acids had overlapping concentration ranges between the two groups. Yet overall, in the cheeses with the shorter salting time, the concentration ranges of total free fatty acids and total diacylglycerides were 260% and 100% higher, respectively, than in the traditionally brined version, suggesting that the lower salt-to-moisture ratio left more water available for lipolysis and sped up the enzymatic breakdown of triglycerides. The researchers say a reduced brining time for Parmigiano Reggiano could yield a product that appeals to salt-conscious consumers, but sensory tests are still needed to determine whether consumers can detect differences in overall taste and texture.

###

The authors acknowledge funding from the PARENT Project, the European Regional Development Fund to the Emilia-Romagna Region and the Consejo Nacional de Investigaciones Científicas y Técnicas.

The paper is freely available as an ACS AuthorChoice article.

Celebrating Black chemists and chemical engineers

AMERICAN CHEMICAL SOCIETY

Chemical & Engineering News (C&EN), the newsmagazine of the American Chemical Society (ACS), is celebrating Black chemists and chemical engineers with a special issue highlighting Black chemists who work across the fields of biotechnology, solar energy, pharmaceuticals and more. Guest edited by Massachusetts Institute of Technology (MIT) drug delivery pioneer Paula Hammond, Ph.D., this special issue showcases Black scientists, spotlighting their scientific passions and career accomplishments.

"In bringing into focus the unique lives of this set of accomplished Black scientists in chemistry and chemical engineering, it is my hope that we open the door to more frequent and constant recognition of our presence in the field," Hammond wrote in her introductory remarks. "We have always been present in the sciences -- but now more than ever, we must appreciate and acknowledge the presence of Black people and other people of color. We must find ways to continue to raise our voices and celebrate our work. As a nation, we all benefit from the huge talent gained when all are included in the science enterprise."

Among the chemists and chemical engineers featured in the 2021 Trailblazers issue are Karen Akinsanya, Ph.D., of Schrödinger, Inc., on artificial-intelligence-driven drug discovery; Oluwatoyin Asojo, Ph.D., of Hampton University on her calling to develop drugs for neglected diseases; Squire J. Booker, Ph.D., of Penn State University on the catalytic moments of his career; Cato T. Laurencin, M.D., Ph.D., of the University of Connecticut on his twin passions for surgery and biomedical engineering; and Kristala L. J. Prather, Ph.D., of MIT on harnessing the synthetic power of microbial systems. The print issue, which features original content and photography by Black creators, was released on February 22.

###

The paper, "C&EN's Trailblazers," is freely available here.

For more of the latest research news, register for our upcoming meeting, ACS Spring 2021. Journalists and public information officers are encouraged to apply for complimentary press registration by emailing us at newsroom@acs.org.


COVID-19 isolation linked to increased domestic violence, researchers suggest

Financial stress contributes

UNIVERSITY OF CALIFORNIA - DAVIS

Research News

While COVID-19-related lockdowns may have decreased the spread of a deadly virus, they appear to have created an ideal environment for increased domestic violence.

Data collected in surveys of nearly 400 adults over 10 weeks beginning in April 2020 suggest that more services and communication are needed so that front-line workers in health care and food banks, for example -- not only social workers, doctors and therapists -- can spot the signs and ask clients about potential intimate partner violence. They could then help direct victims to resources, said Clare Cannon, assistant professor of social and environmental justice in the Department of Human Ecology and lead author of the study.

The paper, "COVID-19, intimate partner violence, and communication ecologies," was published this month in American Behavioral Scientist. Study co-authors include Regardt Ferreira and Frederick Buttell, both of Tulane University, and Jennifer First, of University of Tennessee-Knoxville.

"The pandemic, like other kinds of disasters, exacerbates the social and livelihood stresses and circumstances that we know lead to intimate partner violence," said Cannon. She explained that increased social isolation during COVID-19 has created an environment where victims and aggressors, or potential aggressors in a relationship, cannot easily separate themselves from each other. The extra stress also can cause mental health issues, increasing individuals' perceived stress and reactions to stress through violence and other means.

"Compounding these stressors, those fleeing abuse may not have a place to get away from abusive partners," Cannon said.

Intimate partner violence is defined as physical, emotional, psychological or economic abuse and stalking or sexual harm by a current or former partner or spouse, according to the Centers for Disease Control and Prevention. Crime statistics indicate that 16 percent of homicides are perpetrated by a partner. Further, the CDC says, 25 percent of women and 10 percent of men experience some form of intimate partner violence in their lifetime.

Research participants in the study completed an online survey asking about previous disaster experience, perceived stress, their current situation as it relates to COVID-19, whether they had experienced intimate partner violence, and their personal and household demographics. In all, 374 people completed the survey. Respondents, whose average age was 47, were asked how COVID-19 had affected them financially and otherwise.

Of the respondents, 39 reported having experienced violence in their relationship, and 74 percent of those people were women.

Although only 10 percent of the sample reported experiencing intimate partner violence, those who had experienced it reported more stress than those who had not. Furthermore, the results show that as perceived stress increased, participants were more likely to become victims of violence.
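The proportions above follow directly from the counts reported in the release; a quick illustrative check (using only figures stated in the article, not the study's underlying data):

```python
# Counts as reported in the press release above.
respondents = 374
ipv_reports = 39

share = ipv_reports / respondents
print(f"share reporting intimate partner violence: {share:.1%}")  # 10.4%
```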

"Importantly," Cannon said, "these data do not suggest causality and there is no way to determine if intimate partner violence was present in those relationships prior to the pandemic. What the data do suggest, however, is that experiencing such violence is related to reporting more exposure to stress."

Researchers found that as people find themselves in a more tenuous financial situation due to COVID-19, "there are more things to worry about and subsequently argue about. In many instances, that type of situation leads to an occasion for intimate partner violence."

"In our sample's case, as people lost their jobs and suffered financial losses, they also likely increased their worry about eviction," Cannon said. Notably, similar findings linking financial and job loss stresses with increased intimate partner violence were reported in the 2008 recession, Cannon said.

Researchers said their findings show a need for more communication resources for families -- potentially coming from government and nongovernment sources of support and information. By increasing public awareness of resources available to the broader community, community members, trusted friends, neighbors, and family members may be better able to connect those affected by domestic violence with resources, such as shelters, treatment intervention programs and therapeutic professionals such as social workers, therapists and others, researchers said.


Apollo rock samples capture key moments in the Moon's early history, study finds

BROWN UNIVERSITY

Research News

PROVIDENCE, R.I. [Brown University] -- Volcanic rock samples collected during NASA's Apollo missions bear the isotopic signature of key events in the early evolution of the Moon, a new analysis found. Those events include the formation of the Moon's iron core, as well as the crystallization of the lunar magma ocean -- the sea of molten rock thought to have covered the Moon for around 100 million years after it formed.

The analysis, published in the journal Science Advances, used a technique called secondary ion mass spectrometry (SIMS) to study volcanic glasses returned from the Apollo 15 and 17 missions, which are thought to represent some of the most primitive volcanic material on the Moon. The study looked specifically at sulfur isotope composition, which can reveal details about the chemical evolution of lavas from their generation through transport and eruption.

"For many years it appeared as though the lunar basaltic rock samples analyzed had a very limited variation in sulfur isotope ratios," said Alberto Saal, a geology professor at Brown University and study co-author. "That would suggest that the interior of the Moon has a basically homogeneous sulfur isotopic composition. But using modern in situ analytical techniques, we show that the isotope ratios of the volcanic glasses actually have a fairly wide range, and those variations can be explained by events early in lunar history."

The sulfur signature of interest is the ratio of the "heavy" sulfur-34 isotope to the lighter sulfur-32. Initial studies of lunar volcanic samples found that they uniformly leaned toward the heavier sulfur-34. The nearly homogeneous sulfur isotope ratio was in contrast with large variations in other elements and isotopes detected in the lunar samples.

This new study looked at 67 individual volcanic glass samples and their melt inclusions -- tiny blobs of molten lava trapped within crystals inside the glass. Melt inclusions capture the lava before sulfur and other volatile elements are released as gas during eruption -- a process called degassing. As such, they offer a pristine picture of what the original source lava was like. Using the SIMS at the Carnegie Institution for Science, Saal and his colleague, the late Carnegie scientist Eric Hauri, were able to measure the sulfur isotopes in these pristine melt inclusions and glasses, and use those results to calibrate a model of the degassing process for all the samples.

"Once we know the degassing, then we can estimate back the original sulfur isotope composition of the sources that produced these lavas," Saal said.
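The article doesn't give the degassing model's equations, but isotope evolution during degassing is commonly approximated with Rayleigh fractionation. The sketch below illustrates that idea only; the model choice, parameter names, and every number are assumptions for illustration, not values from the study:

```python
import math

def rayleigh_delta(delta0_permil, f_remaining, eps_permil):
    """d34S of the residual melt after Rayleigh-style degassing.

    delta0_permil -- initial d34S of the melt (permil)
    f_remaining   -- fraction of sulfur still dissolved in the melt (0 < f <= 1)
    eps_permil    -- gas-melt fractionation, ~1000 * (alpha - 1); positive when
                     the escaping gas preferentially carries the heavy isotope,
                     which drives the residual melt isotopically lighter
    """
    return delta0_permil + eps_permil * math.log(f_remaining)

# Hypothetical example: a melt starting at d34S = 0 permil that loses half
# its sulfur, with the gas slightly enriched in 34S (eps = +2 permil).
print(f"{rayleigh_delta(0.0, 0.5, 2.0):+.2f} permil")  # -1.39 permil
```

Fitting such a curve to the measured glasses and inverting it for the starting composition is, in spirit, the "estimate back" step Saal describes.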

Those calculations revealed that the lavas had been derived from different reservoirs within the interior of the Moon with a wide range of sulfur isotope ratios. The researchers then showed that the range of values detected in the samples could be explained by events in the Moon's early history.

The lighter isotope ratio in some of the volcanic glasses, for example, is consistent with the segregation of the iron core from the early molten Moon. When an iron core separates from other material in a planetary body, it takes a bit of sulfur with it. The sulfur that's taken tends to be the heavier sulfur-34 isotope, leaving the remaining magma enriched in the lighter sulfur-32.

"The values we see in some of the volcanic glasses are fully consistent with models of the core segregation process," Saal said.

The heavier isotope values can be explained by the further cooling and crystallization of the early molten Moon. The crystallization process removes sulfur from the magma pool, producing solid reservoirs with heavier sulfur-34. That process is the likely source of the heavier isotope values found in some of the volcanic glasses and basaltic rocks returned from the Moon.

"Our results suggest that these samples record these critical events in lunar history," Saal said. "As we keep looking at these samples with newer and better techniques, we keep learning new things."

More work needs to be done -- and more samples need to be analyzed -- to fully understand the sulfur isotopic composition of the Moon, Saal says. But these new results help to clarify long-standing questions about the composition of the Moon's interior, and they bring scientists one step closer to understanding the formation and early history of the Moon.

###

The research was funded by NASA's Solar System Workings program (80NSSC20K0461).

The risks of communicating extreme climate forecasts

COLLEGE OF ENGINEERING, CARNEGIE MELLON UNIVERSITY

Research News

For decades, climate change researchers and activists have used dramatic forecasts to attempt to influence public perception of the problem and as a call to action on climate change. These forecasts have frequently been for events that might be called "apocalyptic," because they predict cataclysmic events resulting from climate change.

In a new paper published in the International Journal of Global Warming, Carnegie Mellon University's David Rode and Paul Fischbeck argue that making such forecasts can be counterproductive. "Truly apocalyptic forecasts can only ever be observed in their failure--that is, the world did not end as predicted," says Rode, adjunct research faculty with the Carnegie Mellon Electricity Industry Center, "and observing a string of repeated apocalyptic forecast failures can undermine the public's trust in the underlying science."

Rode and Fischbeck, professor of Social & Decision Sciences and Engineering & Public Policy, collected 79 predictions of climate-caused apocalypse going back to the first Earth Day in 1970. With the passage of time, many of these forecasts have since expired; the dates have come and gone uneventfully. In fact, 48 (61%) of the predictions have already expired as of the end of 2020.

Fischbeck noted, "From a forecasting perspective, the 'problem' is not only that all of the expired forecasts were wrong, but also that so many of them never admitted to any uncertainty about the date. About 43% of the forecasts in our dataset made no mention of uncertainty."

In some cases, the forecasters were both explicit and certain. For example, Stanford University biologist Paul Ehrlich and British environmental activist Prince Charles are serial failed forecasters, repeatedly expressing high degrees of certainty about apocalyptic climate events.

Rode commented, "Ehrlich has made predictions of environmental collapse going back to 1970 that he has described as having 'near certainty'. Prince Charles has similarly warned repeatedly of 'irretrievable ecosystem collapse' if actions were not taken, and when expired, repeated the prediction with a new definitive end date. Their predictions have repeatedly been apocalyptic and highly certain...and so far, they've also been wrong."

The researchers noted that the average time horizon before a climate apocalypse for the 11 predictions made prior to 2000 was 22 years, while for the 68 predictions made after 2000, the average time horizon was 21 years. Despite the passage of time, little has changed: across half a century of forecasts, the apocalypse is always about 20 years out.
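The two group averages are consistent with the 79-forecast total quoted earlier, and pooling them confirms the roughly 20-year horizon (a back-of-envelope check using only numbers stated in the release):

```python
# Group sizes and mean horizons as quoted in the release.
pre_2000 = (11, 22)    # (count, mean years to predicted apocalypse)
post_2000 = (68, 21)

total = pre_2000[0] + post_2000[0]
pooled = (pre_2000[0] * pre_2000[1] + post_2000[0] * post_2000[1]) / total
print(total, round(pooled, 1))  # 79 21.1
```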

Fischbeck continued, "It's like the boy who repeatedly cried wolf. If I observe many successive forecast failures, I may be unwilling to take future forecasts seriously."

That's a problem for climate science, say Rode and Fischbeck.

"The underlying science of climate change has many solid results," says Fischbeck. "The problem is often the leap in connecting the prediction of climate events to the prediction of the consequences of those events." Human efforts at adaptation and mitigation, together with the complexity of socio-physical systems, mean that the prediction of sea level rise, for example, may not necessarily lead to apocalyptic flooding.

"By linking the climate event and the potential consequence for dramatic effect," noted Rode, "a failure to observe the consequence may unfairly call into question the legitimacy of the science behind the climate event."

With the new Biden administration making climate change policy a top priority, trust in scientific predictions about climate change is more crucial than ever; however, scientists will have to be wary in qualifying their predictions. In measuring the proliferation of the forecasts through search results, the authors found that forecasts that did not mention uncertainty in their apocalyptic date tended to be more visible (i.e., have more search results available). Making sensational predictions of the doom of humanity, while scientifically dubious, has still proven tempting for those wishing to grab headlines.

The trouble with this is that scientists, due to their training, tend to make more cautious statements and more often include references to uncertainty. Rode and Fischbeck found that while 81% of the forecasts made by scientists referenced uncertainty, less than half of the forecasts made by non-scientists did.

"This is not surprising," said Rode, "but it is troubling when you consider that forecasts that reference uncertainty are less visible on the web. This results in the most visible voices often being the least qualified."

Rode and Fischbeck argue that scientists must take extraordinary caution in communicating events of great consequence. When it comes to climate change, the authors advise "thinking small." That is, focusing on making predictions that are less grandiose and shorter in term. "If you want people to believe big predictions, you first need to convince them that you can make little predictions," says Rode.

Fischbeck added, "We need forecasts of a greater variety of climate variables, we need them made on a regular basis, and we need expert assessments of their uncertainties so people can better calibrate themselves to the accuracy of the forecaster."

Over 80% of Atlantic Rainforest remnants have been impacted by human activity

Researchers estimated biodiversity and biomass losses in the biome using data from 1,819 forest inventories. In terms of carbon storage, the losses correspond to the destruction of 70,000 km² of forest

FUNDAÇÃO DE AMPARO À PESQUISA DO ESTADO DE SÃO PAULO

Research News

IMAGE: BIODIVERSITY AND BIOMASS LOSSES IN THE BIOME USING DATA FROM 1,819 FOREST INVENTORIES. IN TERMS OF CARBON STORAGE, THE LOSSES CORRESPOND TO THE DESTRUCTION OF 70,000 KM² OF FOREST, REPRESENTING...

CREDIT: RENATO DE LIMA/USP

A Brazilian study published (http://www.nature.com/articles/s41467-020-20217-w) in Nature Communications shows that human activities have directly or indirectly caused biodiversity and biomass losses in over 80% of the remaining Atlantic Rainforest fragments.

According to the authors, in terms of carbon storage, the biomass erosion corresponds to the destruction of 70,000 square kilometers (km²) of forest - almost 10 million soccer pitches - or USD 2.3 billion-USD 2.6 billion in carbon credits. "These figures have direct implications for mechanisms of climate change mitigation," they state in the article.

Atlantic Rainforest remnants in Brazil are strung along its long coastline. The biome once covered 15% of Brazil, totaling 1,315,460 km². Only 20% of the original area is now left. The fragments are of varying sizes and have different characteristics.
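The quoted areas are easy to sanity-check. The soccer-pitch comparison depends on what pitch size is assumed; the release does not state one, so the full-size 105 m x 68 m pitch below is an assumption:

```python
original_biome_km2 = 1_315_460    # former extent of the Atlantic Rainforest
remaining_km2 = 0.20 * original_biome_km2
loss_equivalent_km2 = 70_000      # biomass loss expressed as destroyed forest

pitch_km2 = 105 * 68 / 1_000_000  # assumed full-size pitch, 0.00714 km2
pitches_millions = loss_equivalent_km2 / pitch_km2 / 1e6

print(f"remaining forest: {remaining_km2:,.0f} km2")           # 263,092 km2
print(f"soccer-pitch equivalent: {pitches_millions:.1f} million")  # 9.8 million
```

With that pitch size, 70,000 km² comes out to about 9.8 million pitches, matching the "almost 10 million" in the release.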

To estimate the impact of human activity on these remnants, the researchers used data from 1,819 forest inventories conducted by several research groups.

"These inventories are a sort of tree census. The researchers go into the field and choose an area to survey, typically 100 meters by 100 meters. All the trees found within this perimeter are identified, analyzed, and measured," said Renato de Lima (https://bv.fapesp.br/en/pesquisador/668300/renato-augusto-ferreira-de-lima), a researcher at the University of São Paulo's Institute of Biosciences (IB-USP) and leader of the study. "We compiled all the data available in the scientific literature and calculated the average loss of biodiversity and biomass in the fragments studied, which represent 1% of the biome. We then used statistical methods to extrapolate the results to the fragments not studied, assuming that the impact would be constant throughout the Atlantic Rainforest biome."
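The extrapolation Lima describes can be sketched in miniature: average the losses measured in the surveyed fragments, then apply that mean rate biome-wide under the constant-impact assumption. All numbers below are invented for illustration; the study's actual statistical treatment is more sophisticated:

```python
# Hypothetical per-fragment biomass-loss fractions from surveyed plots.
surveyed_losses = [0.28, 0.31, 0.22, 0.35, 0.26]
mean_loss = sum(surveyed_losses) / len(surveyed_losses)

# Constant-impact assumption: apply the surveyed mean to the whole remnant.
remnant_area_km2 = 263_092                  # ~20% of the original biome
biome_wide_loss_km2 = mean_loss * remnant_area_km2

print(f"mean surveyed loss: {mean_loss:.1%}")  # 28.4%
```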

After identifying the tree species in a fragment, the researchers estimated the size of their seeds and also what they call the "ecological or successional group". These two factors indicate how healthy the forest is, according to Lima. "There are hardy plants that require very little in the way of local resources and can grow on wasteland, pasture, forest borders, etc. These are known as pioneer species. A Brazilian example is the Ambay pumpwood [Cecropia pachystachya]," he said.

Pioneer tree species tend to produce seeds of smaller size, but in large numbers, because each seed has such a small chance of germinating. At the opposite extreme are climax species that flourish only in favorable environments, such as Brazilwood (Paubrasilia echinata) or various species of the genus Ocotea. These trees produce larger seeds with a substantial store of nutrients.

"This kind of seed requires a heavier investment by the parent tree in terms of energy," Lima said. "Areas in which climax species are present typically support more diversified fauna, so they serve as a marker of overall forest quality. Areas in which pioneer species predominate have probably been disturbed in the recent past."

The IB-USP group set out to show how the loss of late-successional species correlated with overall biodiversity loss and also with biomass loss, which represents the reduction in the forest's capacity to store carbon and keep this greenhouse gas out of the atmosphere. They found the forest fragments studied to have 25%-32% less biomass, 23%-31% fewer tree species, and 33%-42% fewer individuals belonging to late-successional, large-seeded, and endemic species.

The analysis also showed that biodiversity and biomass erosion were lower in strictly protected conservation units, especially large ones. "The smaller the forest fragment and the larger the edge area, the easier it is for people to gain access and disturb the remnant," Lima said.

On the positive side, degraded forest areas can recoup their carbon storage capacity if they are restored. "Combating deforestation and restoring totally degraded open areas such as pasturelands have been a major focus. These two strategies are very important, but we shouldn't forget the fragments in the middle," Lima said.

According to Paulo Inácio Prado (https://bv.fapesp.br/en/pesquisador/3487/paulo-inacio-de-knegt-lopez-de-prado), a professor at IB-USP and last author of the study, restored forest remnants can attract billions of dollars in investment relating to carbon credits. "Degraded forests should no longer be considered a liability. They're an opportunity to attract investment, create jobs and conserve what still remains of the Atlantic Rainforest," he said.

Lima believes this could be an attractive strategy for landowners in protected areas of the biome. "There's no need to reduce the amount of available arable land. Instead, we should increase the biomass in forest fragments, recouping part of the cost of restoration in the form of carbon credits," he said. "There will be no future for the Atlantic Rainforest without the owners of private properties. Only 9% of the remaining forest fragments are on state-owned land."

Database

According to Lima, the study began during his postdoctoral research, which was supported by São Paulo Research Foundation - FAPESP (https://bv.fapesp.br/en/bolsas/145695) and supervised by Prado. The aim was to identify the key factors that determine biodiversity and biomass loss in remnants of Atlantic Rainforest. "We found human action to be a major factor," he said. "We considered activities such as logging, hunting, and invasion by exotic species, as well as the indirect effects of forest fragmentation."

The data obtained from the 1,819 forest inventories used in the research is stored in a repository called TreeCo (http://labtrop.ib.usp.br/doku.php?id=projetos:treeco:start), short for Neotropical Tree Communities. Lima developed the database during his postdoctoral fellowship and still runs it. Its contents are described in an article published in Biodiversity and Conservation (https://link.springer.com/article/10.1007/s10531-015-0953-1). It is open to other research groups interested in sharing data on Neotropical forests.

"The repository became a byproduct of my postdoctoral project, and more than ten PhD and master's candidates are using it in their research," Lima said.

###


About São Paulo Research Foundation (FAPESP)

The São Paulo Research Foundation (FAPESP) is a public institution with the mission of supporting scientific research in all fields of knowledge by awarding scholarships, fellowships and grants to investigators linked with higher education and research institutions in the State of São Paulo, Brazil. FAPESP is aware that the very best research can only be done by working with the best researchers internationally. Therefore, it has established partnerships with funding agencies, higher education, private companies, and research organizations in other countries known for the quality of their research and has been encouraging scientists funded by its grants to further develop their international collaboration. You can learn more about FAPESP at http://www.fapesp.br/en and visit FAPESP news agency at http://www.agencia.fapesp.br/en to keep updated with the latest scientific breakthroughs FAPESP helps achieve through its many programs, awards and research centers. You may also subscribe to FAPESP news agency at http://agencia.fapesp.br/subscribe.

POSTMODERN PARACELSUS ALCHEMY

'Miracle poison' for novel therapeutics

Researchers prove they can engineer botulinum toxin proteins to find new targets with high selectivity, a critical advance toward potential new treatments for everything from neuroregeneration to cytokine storm

HARVARD UNIVERSITY

Research News

IMAGE: ENGINEERED BOTULINUM TOXIN COULD LEAD TO NEW TREATMENTS FOR A RANGE OF MALADIES, INCLUDING NERVE AND BRAIN DAMAGE, MUSCLE INJURY, AND RAMPANT INFLAMMATION.

CREDIT: THE LIU LAB

When people hear botulinum toxin, they often think one of two things: a cosmetic that makes frown lines disappear or a deadly poison.

But the "miracle poison," as it's also known, has been approved by the FDA to treat a suite of maladies, including chronic migraines, uncontrolled blinking, and certain muscle spasms. And now, a team of researchers from Harvard University and the Broad Institute has, for the first time, shown it can rapidly evolve the toxin in the laboratory to target a variety of different proteins, creating a suite of bespoke, super-selective enzymes called proteases with the potential to aid in neuroregeneration, regulate growth hormones, calm rampant inflammation, or dampen the life-threatening immune overreaction called cytokine storm.

"In theory, there is a really high ceiling for the number and type of conditions where you could intervene," said Travis Blum, a postdoctoral researcher in the Department of Chemistry and Chemical Biology and first author on the study published in Science. The study was the culmination of a collaboration with Min Dong, an associate professor at the Harvard Medical School, and David Liu, the Thomas Dudley Cabot Professor of the Natural Sciences, a Howard Hughes Medical Institute Investigator, and a core faculty member of the Broad Institute.

Together, the team achieved two firsts: They successfully reprogrammed proteases--enzymes that cut proteins to either activate or deactivate them--to cut entirely new protein targets, even some with little or no similarity to the native targets of the starting proteases, and to simultaneously avoid engaging their original targets. They also started to address what Blum called a "classical challenge in biology": designing treatments that can cross into a cell. Unlike most large proteins, botulinum toxin proteases can enter neurons in large numbers, giving them a wider reach that makes them all the more appealing as potential therapeutics.

Now, the team's technology can evolve custom proteases with tailor-made instructions for which protein to cut. "Such a capability could make 'editing the proteome' feasible," said Liu, "in ways that complement the recent development of technologies to edit the genome."

Current gene-editing technologies often target chronic diseases like sickle cell anemia, caused by an underlying genetic error. Correct the error, and the symptoms fade. But some acute illnesses, like neurological damage following a stroke, aren't caused by a genetic mistake. That's where protease-based therapies come in: The proteins can help boost the body's ability to heal something like nerve damage through a temporary or even one-time treatment.

Scientists have been eager to use proteases to treat disease for decades. Unlike antibodies, which can only attack specific alien substances in the body, proteases can find and attach to any number of proteins, and, once bound, can do more than just destroy their target. They could, for example, reactivate dormant proteins.

"Despite these important features, proteases have not been widely adopted as human therapeutics," said Liu, "primarily because of the lack of a technology to generate proteases that cleave protein targets of our choosing."

But Liu has a technological ace in his pocket: PACE (which stands for phage-assisted continuous evolution). A Liu lab invention, the platform rapidly evolves novel proteins with valuable features. PACE, Liu said, can evolve dozens of generations of proteins a day with minimal human intervention. Using PACE, the team first taught so-called "promiscuous" proteases--those that naturally target a wide swath of proteins--to stop cutting certain targets and become far more selective. When that worked, they moved on to the bigger challenge: Teaching a protease to only recognize an entirely new target, one outside its natural wheelhouse.

"At the outset," said Blum, "we didn't know if it was even feasible to take this unique class of proteases and evolve them or teach them to cleave something new because that had never been done before." ("It was a moonshot to begin with," said Michael Packer, a previous Liu lab member and an author on the paper). But the proteases outperformed the team's expectations. With PACE, they evolved four proteases from three families of botulinum toxin; all four had no detected activity on their original targets and cut their new targets with a high level of specificity (ranging from 218- to more than 11,000,000-fold). The proteases also retained their valuable ability to enter cells. "You end up with a powerful tool to do intracellular therapy," said Blum. "In theory."

"In theory" because, while this work provides a strong foundation for the rapid generation of many new proteases with new capabilities, far more work needs to be done before such proteases can be used to treat humans. There are other limitations, too: The proteins are not ideal as treatments for chronic diseases because, over time, the body's immune system will recognize them as alien substances and attack and defuse them. While botulinum toxin lasts longer than most proteins in cells (up to three months as opposed to the typical protein lifecycle of hours or days), the team's evolved proteins might end up with shorter lifetimes, which could diminish their effectiveness.

Still, since the immune system takes time to identify foreign substances, the proteases could be effective for temporary treatments. And, to side-step the immune response, the team is also looking to evolve other classes of mammalian proteases since the human body is less likely to attack proteins that resemble their own. Because their work on botulinum toxin proteases proved so successful, the team plans to continue to tinker with those, too, which means continuing their fruitful collaboration with Min Dong, who not only has the required permission from the Centers for Disease Control (CDC) to work with botulinum toxin but provides critical perspective on the potential medical applications and targets for the proteases.

"We're still trying to understand the system's limitations, but in an ideal world," said Blum, "we can think about using these toxins to theoretically cleave any protein of interest." They just have to choose which proteins to go after next.

###

Salmon scales reveal substantial decline in wild salmon population & diversity

Researchers from Simon Fraser University analyzed 100-year-old salmon scales to assess the health of wild salmon populations

SIMON FRASER UNIVERSITY

Research News

IMAGE: THE COLLECTION OF 100-YEAR-OLD WILD SALMON SCALES FROM THE SKEENA RIVER.

CREDIT: MICHAEL PRICE

The diversity and numbers of wild salmon in Northern B.C. have declined approximately 70 per cent over the past century, according to a new Simon Fraser University study.

Researchers drawing on 100-year-old salmon scales report that recent numbers of wild adult sockeye salmon returning to the Skeena River are 70 per cent lower than 100 years ago. Wild salmon diversity in the Skeena watershed has similarly declined by 70 per cent over the last century.

The research undertaken by Simon Fraser University (SFU) and Fisheries and Oceans Canada was published today in the Journal of Applied Ecology.

The research team applied modern genetic tools to salmon scales collected from commercial fisheries during 1913-1947 to reconstruct historical abundance and diversity of populations for comparison with recent information.

The analysis revealed that Canada's second-largest salmon watershed, the Skeena River, once hosted a diverse sockeye salmon portfolio composed of many populations that fluctuated from year to year yet remained relatively stable overall. That portfolio has largely eroded over the last century, such that the watershed is now dominated by a single population, supported primarily by artificial production from spawning channels.
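The "portfolio" metaphor can be made concrete: when many populations fluctuate out of phase, their aggregate is proportionally steadier than any single population, just as a diversified stock portfolio is steadier than its holdings. A minimal sketch of that damping effect, using invented numbers rather than Skeena data:

```python
import statistics

def cv(series):
    """Coefficient of variation: stdev relative to the mean."""
    return statistics.stdev(series) / statistics.mean(series)

# Hypothetical annual returns (thousands of fish) for three populations
# that fluctuate out of phase with one another.
pop_a = [10, 30, 20, 40, 15]
pop_b = [35, 15, 30, 10, 25]
pop_c = [20, 25, 35, 15, 30]

# The watershed-level return is the sum across populations.
aggregate = [a + b + c for a, b, c in zip(pop_a, pop_b, pop_c)]

mean_component_cv = statistics.mean(cv(p) for p in (pop_a, pop_b, pop_c))
aggregate_cv = cv(aggregate)

# The aggregate fluctuates proportionally less than its components.
# This damping is the portfolio effect, and it erodes when a single
# population comes to dominate the watershed.
print(round(mean_component_cv, 2), round(aggregate_cv, 2))
```

With these numbers the aggregate's relative variability is roughly a quarter of the average population's, which is why losing population diversity (not just total abundance) makes the system more volatile.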

"Our study provides a rare example of the extent of erosion of within-species biodiversity over the last century of human influence," says Michael Price, an SFU PhD candidate and lead author. "That loss in abundance and diversity from wild populations has weakened the adaptive potential for salmon to survive and thrive in an increasingly variable environment influenced by climate change."

Life-cycle diversity has also shifted: populations are migrating from freshwater at an earlier age and spending more time in the ocean.

"Rebuilding a diversity of abundant wild populations - that is, maintaining functioning portfolios - should help ensure that important salmon watersheds like the Skeena are robust to global change," says John Reynolds, co-author, SFU professor, and Tom Buell BC Leadership Chair in Aquatic Conservation.

This research can help inform status assessments and rebuilding plan discussions for threatened salmon populations by expanding our understanding of historical diversity and production potential.


CAPTION

Wild sockeye salmon in the river.

HERE COMES SPRING COVID GARDENING

Pioneering research reveals gardens are secret powerhouse for pollinators

UNIVERSITY OF BRISTOL

Research News

IMAGE: RESIDENTIAL GARDENS UNDERPIN THE URBAN NECTAR SUPPLY, AND MANY CAN BE EXTREMELY RICH IN FLOWERING PLANTS.

CREDIT: NICHOLAS TEW

Home gardens are by far the biggest source of food for pollinating insects, including bees and wasps, in cities and towns, according to new research.

The study, led by the University of Bristol and published today in the Journal of Ecology, measured for the first time how much nectar is produced in urban areas and discovered residential gardens accounted for the vast majority - some 85 per cent on average.

Results showed that three gardens together generated, on average, around a teaspoon a day of Nature's ambrosia: nectar, the unique sugar-rich liquid found in flowers, which pollinators drink for energy. A teaspoon may not sound like much to a human but, scaled by body weight, it is the equivalent of more than a tonne to an adult, and it is enough to fuel thousands of flying bees. The farther bees and their fellow pollinators can fly, the greater the diversity of flora and fauna they can maintain.
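The teaspoon-to-tonne comparison survives a quick back-of-envelope check. The bee and human masses below are rough assumed figures for illustration, not numbers from the study:

```python
# Back-of-envelope check of the teaspoon-to-tonne body-mass comparison.
# All figures are assumptions for illustration, not study data.
teaspoon_nectar_g = 5.0    # one teaspoon of nectar is on the order of 5 g
bee_mass_g = 0.1           # a worker bee weighs roughly 0.1 g
human_mass_g = 70_000.0    # a 70 kg adult

# Scale the teaspoon by the human-to-bee mass ratio.
human_equivalent_g = teaspoon_nectar_g * (human_mass_g / bee_mass_g)
human_equivalent_tonnes = human_equivalent_g / 1_000_000

# Comes out to a few tonnes, consistent with "more than a tonne".
print(round(human_equivalent_tonnes, 1))
```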

Ecologist Nicholas Tew, lead author of the study, said: "Although the quantity and diversity of nectar has been measured in the countryside, this wasn't the case in urban areas, so we decided to investigate.

"We expected private gardens in towns and cities to be a plentiful source of nectar, but didn't anticipate the scale of production would be to such an overwhelming extent. Our findings highlight the pivotal role they play in supporting pollinators and promoting biodiversity in urban areas across the country."

The research, carried out in partnership with the universities of Edinburgh and Reading and the Royal Horticultural Society, examined nectar production in four major UK towns and cities: Bristol, Edinburgh, Leeds, and Reading. Nectar production was measured for nearly 200 plant species by extracting nectar from more than 3,000 individual flowers with a fine glass tube; the sugar concentration of the nectar was then quantified with a refractometer, a device that measures how much light refracts when it passes through a solution.
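In outline, per-flower sugar content follows from the two measurements described above: nectar volume (drawn into the glass tube) and sugar concentration (from the refractometer, typically read in degrees Brix, i.e. grams of sugar per 100 g of solution). A hedged sketch of the conversion; the linear density approximation is an assumption for illustration, not the procedure used in the study:

```python
def sugar_per_flower_mg(nectar_volume_ul: float, brix: float) -> float:
    """Estimate the sugar mass (mg) in one flower's nectar.

    brix is grams of sugar per 100 g of solution. The density formula
    below is a crude linear approximation for sugar solutions, used
    here purely as an illustrative assumption.
    """
    density_g_per_ml = 1.0 + 0.004 * brix          # rough linear fit
    nectar_mass_mg = nectar_volume_ul * density_g_per_ml  # 1 uL water ~ 1 mg
    return nectar_mass_mg * brix / 100.0

# Example: 0.5 uL of nectar at 40 degrees Brix
print(round(sugar_per_flower_mg(0.5, 40.0), 3))
```

Summing such per-flower estimates over the flowers per plant and plants per unit area is what lets a survey like this compare the nectar supply of gardens, parks, and farmland.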

"We found the nectar supply in urban landscapes is more diverse, in other words comes from more plant species, than in farmland and nature reserves, and this urban nectar supply is critically underpinned by private gardens," said Nicholas Tew, who is studying for a PhD in Ecology.

CAPTION

Even balconies and window boxes in densely urban regions can provide food for pollinators.

CREDIT

Nicholas Tew

"Gardens are so important because they produce the most nectar per unit area of land and they cover the largest area of land in the cities we studied."

Nearly a third (29 per cent) of the land in the urban areas studied comprised domestic gardens: six times the area of parks and 40 times the area of allotments.

"The research illustrates the huge role gardeners play in pollinator conservation, as without gardens there would be far less food for pollinators, which include bees, wasps, butterflies, moths, flies, and beetles in towns and cities. It is vital that new housing developments include gardens and also important for gardeners to try to make sure their gardens are as good as possible for pollinators," Nicholas Tew explained.

"Ways to do this include planting nectar-rich flowers, ensuring there is always something in flower from early spring to late autumn, mowing the lawn less often to let dandelions, clovers, daisies and other plants flower, avoiding spraying pesticides, which can harm pollinators, and avoiding covering the garden in paving, decking or artificial turf."

Dr Stephanie Bird, an entomologist at the Royal Horticultural Society, which helped fund the research, said: "This research highlights the importance of gardens in supporting our pollinating insects and how gardeners can have a positive impact through their planting decisions. Gardens should not be seen in isolation - instead they are a network of resources offering valuable habitats and provisions when maintained with pollinators in mind."


CAPTION

One way to help pollinators (even in a small garden) is to allow part of your lawn to grow into meadow.

Paper

'Quantifying nectar production by flowering plants in urban and rural landscapes' by N.E.Tew et al in Journal of Ecology

Notes to editors

A range of images, including caption and credit details, can be found here:

https://drive.google.com/drive/folders/1P8BZDDB7Ry1UW5RxASd2KihVypQpMotf?usp=sharing

Nicholas Tew is available for interview. For interview requests and any other related enquiries, please email: Nicholas.tew@bristol.ac.uk and/or Victoria.tagg@bristol.ac.uk