Tuesday, November 17, 2020

Existing antidepressant helps to inhibit growth of cancer cells in lab animals

KU LEUVEN

Research News

New research has shown that the antidepressant sertraline helps to inhibit the growth of cancer cells. The substance acts on a metabolic addiction that allows different types of cancer to grow. This is shown by a study on cell cultures and lab animals performed by various research labs of KU Leuven. Their findings were published in Molecular Cancer Therapeutics, a journal of the American Association for Cancer Research.

Cancer cells use different biological mechanisms to stimulate their growth. In certain types of breast cancer, leukaemia, skin cancer, brain tumours and lung cancer, among others, the malignant cells produce large amounts of serine and glycine, two amino acids. This production stimulates the growth of cancer cells to such an extent that they become addicted to serine and glycine.

"This mechanism is an interesting target because cancer cells are so dependent on it", says Professor Kim De Keersmaecker, head of the Laboratory for Disease Mechanisms in Cancer (LDMC). "Healthy cells use this mechanism to a lesser extent and also take up serine and glycine from food. This is not sufficient for cancer cells, however, meaning they start producing more. If we can halt this production, we will be able to fight the cancer without affecting healthy cells."

From yeast to mice

In their search for a substance that influences the synthesis of serine and glycine, the researchers turned to a database of existing medicines. In a first phase, Professor Bruno Cammue's research group at the Centre for Microbial and Plant Genetics (CMPG) tested 1,600 substances on yeast cells.

"Because there are also yeasts, or moulds, which depend on the same mechanism", explains research coordinator Dr Karin Thevissen. "Certain yeasts produce these amino acids to protect themselves against antifungals. In addition, you can easily grow yeast cells, allowing you to test many different substances."

The screening showed that the antidepressant sertraline was the most effective substance. "Other studies had already indicated that sertraline has a certain anti-cancer activity, but there was no explanation for this yet", mention researchers Shauni Geeraerts (LDMC and CMPG) and Kim Kampen (LDMC). "In this study, we've been able to demonstrate that sertraline inhibits the production of serine and glycine, causing decreased growth of cancer cells. We also found that this substance is most effective in combination with other therapeutic agents. In studies with mice we saw that sertraline in combination with another therapy strongly inhibits the growth of breast cancer cells."

Considerable potential

"Now that we've been able to identify this mechanism for breast cancer, we can start examining other types of cancer that are also addicted to serine and glycine synthesis", says Professor De Keersmaecker. "This is for example the case in T-cell leukaemia, but also in certain types of brain, lung and skin cancer. The more tumours we can identify that are sensitive to sertraline, the better the prospects are for helping patients in the future."

"These are, of course, results of experimental research, not clinical studies, but we can be optimistic about the potential. The safety of using sertraline in humans has already been well described, which is a great advantage. That's why we are also looking for industrial partners to develop this further."

###

Reducing aerosol pollution without cutting carbon dioxide could make the planet hotter

Solving one environmental problem could create another

UNIVERSITY OF CALIFORNIA - RIVERSIDE

Research News

IMAGE: A system of currents known as the Atlantic Meridional Overturning Circulation carries warm water into the North Atlantic. It could be disturbed if CO2 and aerosols are not simultaneously cut.

CREDIT: R. Curry, Woods Hole Oceanographic Institution/Science/USGCRP

Humans must reduce carbon dioxide and aerosol pollution simultaneously to avoid weakening the ocean's ability to keep the planet cool, new research shows.

Aerosol pollution refers to particles in the air emitted by vehicles and factories that burn fossil fuels. This pollution contributes to asthma, bronchitis, and long-term irritation of the respiratory tract, which can lead to cancer.

"The conundrum," explained UC Riverside climate scientist and study co-author Robert Allen, "is that aerosols cause poor air quality and lead to premature deaths. However, these particles have a net cooling impact on the climate, so when you cut them that leads to a net warming effect."

Much research has examined aerosol impacts on air quality and land surface temperatures. Less explored is the way aerosols might impact the oceans, which is the focus of a UC Riverside study now published in the journal Science Advances.

The research team created detailed computer models to determine the impact on oceans under two different scenarios -- one in which only aerosols are reduced, and another in which greenhouse gases like carbon dioxide and methane are also reduced.

"The first scenario leads to the surprising result that fewer aerosols in the atmosphere could shift the region where most of the ocean is taking up heat, from the Southern Ocean toward the North Atlantic," Allen said.

In particular, the Atlantic meridional overturning circulation, or AMOC, would be disturbed as aerosols are removed from the atmosphere, the study found. The AMOC pulls warm water further north and pushes colder water south, helping keep the climate of higher-latitude land areas, such as Europe, relatively mild.

Roughly half the carbon dioxide humans put into the atmosphere -- mostly through fossil fuel combustion and deforestation -- stays there, and the remaining half is taken up by land and vegetation, as well as the ocean.

One of the ways the ocean takes up our carbon dioxide emissions is through AMOC circulation.

"A projected decline in manmade aerosols potentially induces a weakening of the AMOC, which plays an important role in ocean heat uptake and storage in the North Atlantic," said Wei Liu, an assistant professor of climate change and sustainability at UCR.

In addition, the researchers said a rise in sea level would occur if the North Atlantic Ocean were to get warmer.

This current study focused on ocean heat uptake and circulation via the AMOC. However, Allen explained the study did not attempt to rigorously identify the mechanisms by which aerosol reductions weaken the AMOC. Those mechanisms will be the focus of future studies.

Ultimately, the researchers conclude that even without a more in-depth explanation of the weakening mechanisms, it is necessary to reduce greenhouse gases and aerosols in tandem.

The Intergovernmental Panel on Climate Change recommends making every attempt to prevent the planet from reaching 1.5 degrees Celsius above pre-industrial levels in order to mitigate the worst effects of global warming.

Humans have already increased carbon dioxide levels by almost 50% since the 1850s, and they continue to rise worldwide. Stabilizing carbon dioxide at current levels would require zero net emissions before the year 2070, which is ambitious, but critical.

"Assuming complete removal, aerosols at most will cause warming of about 1 K," said Allen. "However, aerosol-induced warming, as well as the associated ocean circulation changes, can be moderated by rigorous cuts in greenhouse gases including methane and carbon dioxide."

###

Tropical peatland conservation could protect humans from new diseases

UNIVERSITY OF EXETER

Research News

IMAGE: Local fishers working under thick haze conditions from peatland fires in Central Kalimantan, Indonesia.

CREDIT: Suzanne Turnock / Borneo Nature Foundation

Conservation of tropical peatlands could reduce the impacts of the COVID-19 pandemic and the likelihood of new diseases jumping from animals to humans, researchers say.

The scientists reviewed existing evidence and concluded the high biodiversity in tropical peat-swamp forests, combined with habitat destruction and wildlife harvesting, created "suitable conditions" for emerging infectious diseases (EIDs) that could jump to humans.

COVID-19 did not emerge in a tropical peatland area - but HIV/AIDS and the joint-first case of Ebola both originated in areas with extensive peatlands.

The study also assessed the possible impact of COVID-19 on tropical peatland conservation and local communities - and identified "numerous potential threats" to both.

Led by the University of Exeter, the international study team comprised researchers from countries with large tropical peatlands, including Indonesia, DR Congo and Perú.

"We're not saying tropical peatlands are unique in this respect - but they are one important habitat where zoonotic diseases (those that jump from animals to humans) could emerge," said lead author Dr Mark Harrison, of the Centre for Ecology and Conservation on Exeter's Penryn Campus in Cornwall, UK and Borneo Nature Foundation International.

"Tropical peat-swamp forests are rich in fauna and flora, including numerous vertebrates known to represent zoonotic EID risk, such as bats, rodents, pangolins and primates.

"Exploitation and fragmentation of these habitats, as well as peat wildfires (ultimately driven by human activity) and wildlife harvesting bring more and more people into close contact with peatland biodiversity, increasing the potential for zoonotic disease transmission.

"Our review shows that protecting tropical peatlands isn't therefore just about wildlife and carbon emissions - it's also important for human health."

The study also notes "high impacts" of COVID-19 in some countries with large tropical peatland areas, some of which are relatively poorly resourced to tackle pandemics.

"Many communities in these areas are remote, relatively poor, disconnected, have limited infrastructure, sub-standard or non-existent medical facilities, and depend heavily on external trade," said Dr Ifo Suspense, of Université Marien, Republic of Congo, who contributed to the review.

"As a result, the direct and indirect impacts of COVID-19 may be particularly severe in these communities."

Dr Muhammad Ali Imron, from University Gadjah Mada in Indonesia, who was also involved in the study, said: "Additionally, major wildfires in peatland areas cause massive air pollution, particularly in South East Asia, increasing the threat to human health from respiratory diseases like COVID-19.

"In terms of the impacts on peatlands themselves, we reveal that conservation, research and training are all being affected by the pandemic, which may result in increased habitat encroachment, wildlife harvesting and fires started to clear vegetation".

The study concludes: "Sustainable management of tropical peatlands and their wildlife is important for mitigating impacts of the COVID-19 pandemic, and reducing the potential for future zoonotic EID emergence and severity, thus strengthening arguments for their conservation and restoration."

To this end, the study identifies a number of opportunities and recommendations for researchers, field projects, policy makers and donors.

###

The paper, published in the journal PeerJ, is entitled: "Tropical peatlands and their conservation are important in the context of COVID-19 and potential future (zoonotic) disease pandemics."

CAPTION: Peatland fire encroaching into forest in Central Kalimantan, Indonesia.

US agricultural water use declining for most crops and livestock production

UNIVERSITY OF ILLINOIS COLLEGE OF AGRICULTURAL, CONSUMER AND ENVIRONMENTAL SCIENCES

Research News

URBANA, Ill. - Climate change and a growing world population require efficient use of natural resources. Water is a crucial component in food production, and water management strategies are needed to support worldwide changes in food consumption and dietary patterns.

Agricultural production and food manufacturing account for a third of water usage in the U.S. Water use fluctuates with weather patterns but is also affected by shifts in production technology, supply-chain linkages, and domestic and foreign consumer demand.

A comprehensive University of Illinois study looked at water withdrawals in U.S. agriculture and food production from 1995 to 2010. The main trend was a decline in water use, driven by a combination of factors.

"Overall, the use of water for irrigation decreased by 8.3% over this period," says Sandy Dall'erba, regional economist at U of I and co-author on the study.

"However, one needs to identify the drivers of water use by crop as they differ from one commodity to the next, so water-saving strategies for one crop may not be relevant for another one," Dall'erba explains. "For instance, water use in cereal grains, fruits, and vegetables is mostly driven by the efficiency of the irrigation system, domestic per-capita income, and sales to the food processing industry. If irrigation is more efficient, water demand decreases. When demand for fruits and vegetables decreased in 2005-2010 during the financial crisis, so did demand for water."

Oilseed crops, on the other hand, have experienced a 98% increase in water demand over the period. The change is primarily driven by international supply-chain linkages: foreign companies, mostly in China, have purchased large amounts of U.S. oilseed crops for further processing.

"There has also been a shift in consumer demand from red meat to white meat in the U.S. People consume less beef and more chicken, which require 3.5 times less water per pound of production. Those trends in consumption and taste have helped the U.S. reduce water use for livestock by 14%," Dall'erba says.

Dall'erba and co-author Andre Avelino performed a structural decomposition analysis, looking at 18 factors that drive U.S. water withdrawals across eight crops, six livestock categories, and 11 food manufacturing industries.

Based on data from Exiobase, a global supply-chain database, their analysis included water that's embedded into production at all stages of the domestic and international supply chain, from crops and livestock to processed food production -- highlighting the interconnectedness of global agribusiness.

For example, crops produced in the U.S. may rely on fertilizers produced in a different country. Similarly, soybeans produced in the U.S. could be used for food processing in China, or to feed livestock in Europe.
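To make the idea of a structural decomposition concrete, here is a minimal, purely illustrative sketch in Python. It splits a change in water withdrawals into just two drivers -- water intensity and economic output -- using hypothetical numbers; the published analysis decomposes withdrawals across 18 drivers and many sectors, and this is not the authors' code.

# Minimal illustrative sketch (hypothetical numbers, not the study's model):
# decompose the change in sectoral water withdrawals, W = intensity x output,
# into an intensity effect and an output effect (polar-average form).

def decompose(intensity0, output0, intensity1, output1):
    """Split the change in water use between two benchmark years into
    an intensity effect and an output effect."""
    intensity_effect = (intensity1 - intensity0) * (output0 + output1) / 2
    output_effect = (intensity0 + intensity1) / 2 * (output1 - output0)
    return intensity_effect, output_effect

# Hypothetical crop sector: irrigation grows more efficient (intensity falls)
# while output rises slightly between the two benchmark years.
i_eff, o_eff = decompose(intensity0=1.2, output0=100.0, intensity1=1.0, output1=105.0)
print(f"intensity effect: {i_eff:+.1f}, output effect: {o_eff:+.1f}, net: {i_eff + o_eff:+.1f}")
# -> intensity effect: -20.5, output effect: +5.5, net: -15.0

In this toy case the efficiency gain outweighs the growth in output, so total withdrawals fall -- the same qualitative pattern the study reports for irrigation overall.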

The current U.S.-China trade war is likely to affect these supply-chain linkages, as Chinese imports of oilseeds shift to South America and Europe. The U.S. exported less soybean and pork to China over the last two years; therefore, less water was embedded into those exports. However, the next few years under a new U.S. administration may see an improvement in these relationships, Dall'erba notes.

The COVID-19 pandemic is also likely affecting water usage. Unemployment and economic crises have always impacted consumer demand, and international trade has sharply declined since the pandemic began. The 2008 recession resulted in decreased water usage and similar effects are expected in the current crisis, Dall'erba states.

Traditionally, scientists measuring the amount of water associated with production and the supply chain rely on a worldwide data set called the Water Footprint Network (WFN), which is based on a crop water-use model. However, Dall'erba and Avelino used data from the U.S. Geological Survey (USGS), which are based on observations rather than physical models.

"With the USGS data we find a decrease in the amount of water used, while the WFN data indicated a small increase. The difference is not large, but it's still a big deal, because you would find the wrong trend and reach misleading conclusions if you use the wrong data set," Dall'erba explains.

"This is important information for researchers. If you're in a situation where you have access to data based on observations rather than crop models, you should use those official data, especially because water saving policies are based on this dataset," he notes.

The questions addressed in this research are extremely relevant for any country with substantial agricultural production, Dall'erba says. "Namely, how can we feed the 10 billion people we expect to be at the global level by 2080, considering that we cannot necessarily expand the amount of land that's going to be used? And, given climate change, there is quite a lot of uncertainty with respect to the availability of water needed to grow crops and feed livestock in the years to come."

Water management strategies may include farm-level efforts such as increasing efficiency of the irrigation system, switching crops, and growing genetically modified crops.

Other measures may include policies aimed at affecting consumer behavior such as increasing taxes on water-intensive products and supporting ecolabeling, Dall'erba suggests.

Ecolabeling would require food manufacturing companies to report the amounts of water, carbon dioxide emissions, and labor associated with production. That could help consumers make informed choices and potentially shift consumption to less water-intensive products, he concludes.

###

The article, "What Factors Drive the Changes in Water Withdrawals in the U.S. Agriculture and Food Manufacturing Industries between 1995 and 2010?" is published in Environmental Science and Technology. [doi.org/10.1021/acs.est.9b07071]

Authors are Sandy Dall'erba, Regional Economics Applications Laboratory, Department of Agricultural and Consumer Economics in the College of Agricultural, Consumer and Environmental Sciences, University of Illinois, and Andre Avelino, National Renewable Energy Laboratory.

The research was funded by a United States Department of Agriculture hatch grant.

Henderson Island fossils reveal new Polynesian sandpiper species

CANTERBURY MUSEUM

Research News

IMAGE: The extinct Kiritimati Sandpiper, Prosobonia cancellata - a close cousin of the newly discovered Prosobonia sauli.

CREDIT: Illustration by George Edward Lodge, 1907

Fossil bones collected in the early 1990s on Henderson Island, part of the Pitcairn Group, have revealed a new species of Polynesian sandpiper.

The Henderson Sandpiper, a small wading bird that has been extinct for centuries, is described in an article in the Zoological Journal of the Linnean Society published last week.

The newly-described bird is formally named Prosobonia sauli after Cook Islands-based ornithologist and conservationist Edward K Saul.

A team of researchers from New Zealand, Australia, Denmark, Switzerland, the Netherlands and China, led by Canterbury Museum Research Curator Natural History Dr Vanesa De Pietri, described the Henderson Sandpiper from 61 fossilised bones cared for by the Natural History Museum at Tring in England.

Canterbury Museum Visiting Researcher Dr Graham Wragg collected the bones from caves and overhangs on Henderson Island in 1991 and 1992 during the Sir Peter Scott Commemorative Expedition to the Pitcairn Islands.

Prosobonia sauli is the fifth known species of Polynesian sandpiper. All but one of the species, the endangered Tuamotu Sandpiper (Prosobonia parvirostris), are extinct.

"We think Prosobonia sauli probably went extinct soon after humans arrived on Henderson Island, which archaeologists estimate happened no earlier than the eleventh century," says Dr De Pietri.

"It's possible these humans brought with them the Polynesian rat, which Polynesian sandpiper populations are very vulnerable to."

DNA of the living Tuamotu Sandpiper and the extinct Tahiti Sandpiper (Prosobonia leucoptera), which is known only from a skin in the Naturalis Biodiversity Center in the Netherlands, was used to determine how Polynesian sandpipers are related to other wading birds.

"We found that Polynesian sandpipers are early-diverging members of a group that includes calidrine sandpipers and turnstones. They are unlike other sandpipers in that they are restricted to islands of the Pacific and do not migrate," says Dr De Pietri.

Comparisons with the other two extinct Polynesian sandpiper species, the Kiritimati Sandpiper (Prosobonia cancellata) and the Mo'orea Sandpiper (Prosobonia ellisi), are complicated. These birds are known only from illustrations primarily by William Wade Ellis, an artist and Surgeon's Mate on Captain James Cook's third expedition, who probably saw the birds alive in the 1770s.

Compared to the Tuamotu Sandpiper, its geographically closest cousin, the Henderson Sandpiper had longer legs and a wider, straighter bill, indicating how it foraged for food. It probably adapted to the habitats available on Henderson Island, which are different to those on other islands where Polynesian sandpipers were found.

Henderson Island is the largest island in the Pitcairn Group, in the middle of the South Pacific Ocean. It has been uninhabited since around the fifteenth century and was designated a World Heritage Site by UNESCO in 1988.

Dr Paul Scofield, Canterbury Museum Senior Curator Natural History and one of the study's co-authors, says Henderson Island is home to a number of unique species, a handful of which are landbirds like the Henderson Sandpiper.

"The island is really quite remarkable because every landbird species that lives there, or that we know used to live there, is not found anywhere else," he says.

Dr De Pietri says the study shows the need to protect the one remaining Polynesian sandpiper species, the Tuamotu Sandpiper.

"We know that just a few centuries ago there were at least five Polynesian sandpiper species scattered around the Pacific. Now there's only one, and its numbers are declining, so we need to ensure we look after the remaining populations."

###

This research was supported by a grant from the Marsden Fund Council, managed by the Royal Society Te Apārangi, as well as the R S Allan Fund managed by Canterbury Museum.

CAPTION: The Henderson Island Sandpiper bones were excavated from caves during the Sir Peter Scott Commemorative Expedition in the early 1990s. Canterbury Museum Visiting Researcher Dr Graham Wragg, one of the paper's co-authors, is second from left in this photo.


Boosted signal

Novel analytic approach enhances nuclear magnetic resonance signal detection in previously 'invisible' regions

UNIVERSITY OF CALIFORNIA - SANTA BARBARA

Research News

First introduced into wide use in the middle of the 20th century, nuclear magnetic resonance (NMR) has since become an indispensable technique for examining materials down to their atoms, revealing molecular structure and other details without interfering with the material itself.

"It's a broadly used technique in chemical analysis, materials characterization, MRI -- situations in which you do a non-invasive analysis, but with atomic and molecular details," said UC Santa Barbara chemistry professor Songi Han. By placing a sample in a strong magnetic field and then probing it with radio waves scientists can determine from the response from the oscillating nuclei in the material's atoms the molecular structure of the material.

"However, the problem with NMR has been that because it's such a low-energy technique, it's not very sensitive," Han said. "It's very detailed, but you don't get much signal." As a result, large amounts of sample material may be needed relative to other techniques, and the signals' general weakness makes NMR less than ideal for studying complex chemical processes.

One remedy to this situation lies in dynamic nuclear polarization (DNP), a popular technique in which energy is "borrowed" from nearby electrons to enhance the signal emanating from the nuclei.

"Electrons have much higher energy than nuclei," Han explained. Built into specially-designed "radical" molecules, these unpaired electrons' polarization is transferred to the nuclei to improve their signal.

As hot a topic as DNP has become in the past decade, however, Han thinks we're still just scratching the surface.

"Despite DNP fundamentally changing the landscape of NMR, at the end of the day, only a handful of designer polarizing agents have been used," Han said. "A polarizing agent has been used to polarize hydrogen nuclei, but the power of DNP is greater than that. In principle, many other sources of electron spin can polarize many other types of nuclear spin."

In a paper published in the journal Chem, Han and colleagues push the boundaries of NMR with the first demonstration of dynamic nuclear polarization using the transition metal vanadium (IV). According to Han, their new approach -- dubbed "hyperfine DNP spectroscopy" -- offers a glimpse into the typically obscure local chemistry around transition metals, which are important for processes such as catalysis and reduction-oxidation reactions.

"Now we may be able to use endogenous metals that are present in catalysts and in many other important materials," Han said, without having to add polarizing agents -- those radical molecules -- to produce a stronger NMR signal.

The irony with transition metals such as vanadium and copper, Han explained, is that those atoms tend to be functional centers -- places where important chemistry takes place.

"And those exact action centers and functional centers have been very difficult to analyze (with NMR) because they tend to become invisible," she said. The electron spins in the transition metal tend to shorten the lifetime of the NMR signal, she explained, making them disappear before they can be detected.

This wouldn't be the first time chemistry around transition metals has been observed, Han said, pointing to studies that looked at the chemical environments around gadolinium and manganese. But the commercially-available instrument used in those studies offered "a very narrow view."

"But there are many more metals that are much more important for chemistry," she said. "So we developed and optimized instrumentation that enhances the frequency range from the very narrow scope of a commercial instrument to a much broader range."

With their hyperfine DNP spectroscopy, the researchers also found that the signal is indeed wiped out within a certain region around the metal called the spin diffusion barrier, but if the nuclei are located outside that zone, the signal becomes visible.

"There are ways to lighten up that environment, but you need to know how and why," Han said, adding that the paper's co-lead authors, Sheetal Kumar Jain of UC Santa Barbara and Chung-Jui Yu of Northwestern University will continue to explore and apply this new method as they pursue their academic and research careers.

###

Other contributors to the research on this paper include Christopher Blake Wilson and Tarnuma Tabassum of UC Santa Barbara; and Danna E. Freedman of Northwestern University.

Biochar from agricultural waste products can adsorb contaminants in wastewater

PENN STATE

Research News

UNIVERSITY PARK, Pa. -- Biochar -- a charcoal-like substance made primarily from agricultural waste products -- holds promise for removing emerging contaminants such as pharmaceuticals from treated wastewater.

That's the conclusion of a team of researchers that conducted a novel study that evaluated and compared the ability of biochar derived from two common leftover agricultural materials -- cotton gin waste and guayule bagasse -- to adsorb three common pharmaceutical compounds from an aqueous solution. In adsorption, one material, like a pharmaceutical compound, sticks to the surface of another, like the solid biochar particle. Conversely, in absorption, one material is taken internally into another; for example, a sponge absorbs water.

Guayule, a shrub that grows in the arid Southwest, provided the waste for one of the biochars tested in the research. More properly called Parthenium argentatum, it has been cultivated as a source of rubber and latex. The plant is chopped to the ground and its branches mashed up to extract the latex; the dry, pulpy, fibrous residue left behind after the stalks are crushed is called bagasse.

The results are important, according to researcher Herschel Elliott, Penn State professor of agricultural and biological engineering, College of Agricultural Sciences, because they demonstrate the potential for biochar made from plentiful agricultural wastes -- that otherwise must be disposed of -- to serve as a low-cost additional treatment for reducing contaminants in treated wastewater used for irrigation.

"Most sewage treatment plants are currently not equipped to remove emerging contaminants such as pharmaceuticals, and if those toxic compounds can be removed by biochars, then wastewater can be recycled in irrigation systems," he said. "That beneficial reuse is critical in regions such as the U.S. Southwest, where a lack of water hinders crop production."

The pharmaceutical compounds used in the study to test whether the biochars would adsorb them from aqueous solution were: sulfapyridine, an antibacterial medication no longer prescribed for treatment of infections in humans but commonly used in veterinary medicine; docusate, widely used in medicines as a laxative and stool softener; and erythromycin, an antibiotic used to treat infections and acne.

The results, published today (Nov. 16) in Biochar, suggest biochars made from agricultural waste materials could act as effective adsorbents to remove pharmaceuticals from reclaimed water prior to irrigation. However, the biochar derived from cotton gin waste was much more efficient.

In the research, it adsorbed 98% of the docusate, 74% of the erythromycin and 70% of the sulfapyridine in aqueous solution. By comparison, the biochar derived from guayule bagasse adsorbed 50% of the docusate, 50% of the erythromycin and just 5% of the sulfapyridine.

The research revealed that a temperature increase, from about 650 to about 1,300 degrees Fahrenheit (roughly 343 to 704 degrees Celsius), in the oxygen-free pyrolysis process used to convert the agricultural waste materials to biochars resulted in a greatly enhanced capacity to adsorb the pharmaceutical compounds.

"The most innovative part about the research was the use of the guayule bagasse because there have been no previous studies on using that material to produce biochar for the removal of emerging contaminants," said lead researcher Marlene Ndoun, a doctoral student in Penn State's Department of Agricultural and Biological Engineering. "Same for cotton gin waste -- research has been done on potential ways to remove other contaminants, but this is the first study to use cotton gin waste specifically to remove pharmaceuticals from water."

For Ndoun, the research is more than theoretical. She said she wants to scale up the technology and make a difference in the world. Because cotton gin waste is widely available, even in the poorest regions, she believes it holds promise as a source of biochar to decontaminate water.

"I am originally from Cameroon, and the reason I'm even here is because I'm looking for ways to filter water in resource-limited communities, such as where I grew up," she said. "We think if this could be scaled up, it would be ideal for use in countries in sub-Saharan Africa, where people don't have access to sophisticated equipment to purify their water."

The next step, Ndoun explained, would be to develop a mixture of biochar material capable of adsorbing a wide range of contaminants from water.

"Beyond removing emerging contaminants such as pharmaceuticals, I am interested in blending biochar materials so that we have low-cost filters able to remove the typical contaminants we find in water, such as bacteria and organic matter," said Ndoun.

###

Also involved in this research at Penn State were Heather Preisendanz, associate professor of agricultural and biological engineering, and Jack Watson, professor of soil science, soil physics and biogeochemistry; and Clinton Williams and Allan Knopf, scientists at the U.S. Department of Agriculture's Agricultural Research Service Arid Lands Agricultural Research Center, Maricopa, Arizona.

The U.S. Department of Agriculture's National Institute of Food and Agriculture and USDA's Agricultural Research Service supported this research.

Solitary bees are born with a functional internal clock - unlike honeybees

Developmental lag in the circadian clock may facilitate sociality

FRONTIERS

Research News

IMAGE: Female of the solitary red mason bee inside the locomotor activity monitor system for registering activity rhythms.

CREDIT: The authors

Social insects like honeybees and hornets evolved from solitary bees and wasps, respectively. A common trait of many social insects is age-specific behavior: when they emerge from the pupa, workers typically specialize in around-the-clock tasks inside the darkness of the nest, starting with brood care. But they gradually shift towards more cyclic tasks away from center of the nest as they get older -- culminating in foraging outside, exclusively in daylight, towards the end of their life. Here, researchers find evidence that this shift from around-the-clock to rhythmic tasks, which does not occur in solitary insects, seems to be driven by a slower maturation of the endogenous (i.e. internal) "circadian" clock of social honeybees compared to solitary bees.

They find that in the solitary red mason bee Osmia bicornis, where females of every age forage to provide food for offspring, females and males emerge with a mature, fully functional circadian clock, as shown by their 24-h movement rhythm and the activity of the brain cells that produce the "pacemaker" protein Pigment-Dispersing Factor (PDF).

"Our results indicate that the maturation of the circadian clock is delayed in social honeybees as compared to solitary mason bees. We predict that rapid maturation of the circadian clock is the ancestral condition and will be found throughout the solitary bees and wasps, which all need to forage and perform brood care throughout their lifespan. In contrast, a delay in maturation may have evolved secondarily in social species, to enable age-related behavioral shifts from acyclic brood care to daily foraging," says Dr Katharina Beer, a postdoctoral scientist at the Department of Animal Ecology and Tropical Biology of the Julius Maximilian Universitaet Wuerzburg, Germany. Her study, done with Professor Charlotte Helfrich-Foerster, the Chair of Neurobiology and Genetics at Julius Maximilian University, is published in the open-access journal Frontiers in Cell and Developmental Biology.

The red mason bee is an important pollinator for agriculture, occurring across Europe, North Africa, and the Near East. Unlike in social insects, there is no infertile worker caste: females overwinter in their natal nest to emerge in the spring, mate, and build new nests in hollow stems. Females forage on many different plant species for pollen and nectar, which they store inside a row of sealed cells, laying a single egg per cell.

Beer and Helfrich-Foerster collected newly emerged bees -- 40 workers (females) from isolated brood combs of honeybees (Apis mellifera), and 56 females and 31 males from isolated pupal cocoons of O. bicornis -- and placed these inside separate tubes inside a Locomotor Activity Monitor system, an apparatus for registering the activity of individual bees. Infrared beams crossing the center of each tube are interrupted whenever the bee moves. By registering the activity pattern, in darkness and at constant temperature, around the clock for 3 to 45 days after emergence, the researchers could derive the rhythm of each bee.
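As a purely illustrative sketch (not the authors' analysis pipeline), this kind of rhythm extraction can be approximated in a few lines of Python: given counts of beam interruptions for one bee, binned by hour, find the lag with the strongest autocorrelation and check whether it sits near 24 hours. The hourly binning and the autocorrelation approach are assumptions made for the example.

# Illustrative sketch (assumed hourly binning; not the study's pipeline):
# estimate the dominant activity period of one bee from beam-break counts.
import numpy as np

def dominant_period(hourly_counts, min_h=16, max_h=32):
    """Return the lag (in hours) with the strongest autocorrelation."""
    x = np.asarray(hourly_counts, dtype=float)
    x = x - x.mean()
    best_lag, best_r = None, -np.inf
    for lag in range(min_h, max_h + 1):
        r = np.corrcoef(x[:-lag], x[lag:])[0, 1]   # autocorrelation at this lag
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r

# Simulated bee with a clear ~24-hour rhythm over five days of recording.
hours = np.arange(5 * 24)
counts = 50 + 40 * np.sin(2 * np.pi * hours / 24) + np.random.poisson(5, hours.size)
print(dominant_period(counts))   # expect a lag near 24 with a high correlation

A bee without a functional clock would instead show uniformly weak correlations across all tested lags, which is the pattern reported below for newly emerged honeybees.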

The results showed that none of the honeybees displayed a spontaneous ca. 24-h rhythm immediately after emergence from the pupa -- regardless of the degree of contact with the natal colony -- while 88% of O. bicornis females and males did so. The 12% of O. bicornis that showed no rhythm died young. Over time, however, at an age of at least two days, each honeybee also developed a pronounced ca. 24-h rhythm. Beer and Helfrich-Foerster conclude that solitary O. bicornis bees, but not social honeybees, emerge with a functional endogenous circadian clock.

What is the neural basis for this difference? To answer this question, the researchers used immunohistochemistry to compare the maturation of the neurons that synthesize PDF in the two species. In the brain of insects, a cluster of specialized cells called the lateral ventral neurons functions as a circadian pacemaker, secreting bursts of PDF, a so-called neuromodulator which affects the activity of the central nervous system. By staining dissected, fixed brains with two antibodies, the first recognizing PDF and the second, labeled with a fluorescent tag, binding to the first, Beer and Helfrich-Foerster could count the PDF-producing neurons in each brain hemisphere at different ages, from just before emergence to four weeks after. They show that these steadily increase in number with age in honeybees, but not in O. bicornis.

The authors conclude that the circadian pacemaker is fully mature in newly emerged O. bicornis, but needs to develop further to become fully active in honeybees, explaining why young honeybees don't yet show a circadian rhythm. This delay in maturation is likely an evolutionary adaptation to sociality, where young bees need to engage in around-the-clock care of their immature siblings.

"It will be most interesting to perform similar studies on other social insects such as ants, which have various forms of social behavior: some display age-related behavior like in honeybees, wile others don't. If we find similar differences in the maturation of the endogenous clock related to the different social structures of these ant species, this will strongly support our hypothesis," concludes Helfrich-Foerster.

###

Dairy cows exposed to heavy metals worsen antibiotic-resistant pathogen crisis

PENN STATE

Research News

UNIVERSITY PARK, Pa. -- Dairy cows, exposed for a few years to drinking water contaminated with heavy metals, carry more pathogens loaded with antimicrobial-resistance genes able to tolerate and survive various antibiotics.

That's the finding of a team of researchers that conducted a study of two dairy herds in Brazil four years after a dam holding mining waste ruptured, and it spotlights a threat to human health, the researchers contend.

The study is the first to show that long-term persistence of heavy metals in the environment may trigger genetic changes and interfere with the microorganism communities that colonize dairy cows, according to researcher Erika Ganda, assistant professor of food animal microbiomes, Penn State.

"Our findings are important because if bacterial antimicrobial resistance is transferred via the food chain by milk or meat consumption, it would have substantial implications for human health," she said. "What we saw is, when heavy metal contamination is in the environment, there is potential for an increase of so-called 'superbugs.'"

A declaration from the World Health Organization supports Ganda's assertion, saying that resistance to antimicrobials is one of the top 10 global public health threats facing humanity. Antimicrobial resistance occurs when bacteria, viruses, fungi and parasites change over time and no longer respond to medicines, making infections harder to treat and increasing the risk of disease spread, severe illness and death.

A South American environmental calamity triggered the research. In 2015, in what became known as the Mariana dam disaster, the Fundao Tailings Dam suffered a catastrophic failure and released more than 11 billion gallons of iron ore waste. The huge wave of toxic mud flowed into the Doce River basin surrounding Mariana City in Minas Gerais, a state in southeast Brazil.

Following this catastrophe, the team analyzed the consequences of long-term exposure to contaminated drinking water on dairy cattle.

To reach their conclusions, researchers identified bacterial antimicrobial-resistance genes in the feces, rumen fluid and nasal passages of 16 dairy cattle in the area contaminated by the iron ore waste four years after the environmental disaster. Researchers compared samples taken from those animals to analogous samples from 16 dairy cattle on an unaffected farm, about 220 miles away.

The microorganism community in the cattle continuously exposed to contaminated water differed in many ways from that of the cows not exposed to heavy metals, noted researcher Natalia Carrillo Gaeta, doctoral student and research assistant in the Department of Preventive Veterinary Medicine and Animal Health, University of Sao Paulo, Brazil.

The relative abundance and prevalence of bacterial antimicrobial-resistance genes were higher in cattle at the heavy metals-affected farm than in cattle at the non-contaminated farm, she pointed out.

The data, published today (Nov. 16) in Frontiers in Microbiology, suggest that exposure to heavy metal contamination results in the selection of bacteria that have resistance genes to heavy metals, biocides and several drugs, Gaeta explained. "We found that bacterial antimicrobial-resistance genes are most readily detected in fecal samples."

The link between heavy metal concentration in the environment and increased prevalence of antibiotic resistance in bacteria has been seen before, Ganda said. Known as the "co-resistance" phenomenon, it arises from the physical proximity of different types of resistance genes located on the same genetic element.

"As a result of this connection, the transfer of one gene providing heavy metal resistance, may occur in concert with the transfer of the closest gene, providing antibiotic resistance, she said. "Consequently, some resistance mechanisms are shared between antibiotics and heavy metals."

Ganda's research group in the College of Agricultural Sciences works with the one-health perspective, which focuses on the interaction between animals, people and the environment. She believes this research presents a good description of a one-health problem.

"In this Brazilian environmental disaster, not only were several people and animals killed by the devastating flood caused by the dam rupture, but the contamination persisted in the environment and made it into dairy cows, that could potentially pose another risk for humans," Ganda said. "If these animals are colonized, resistant bacteria could also make it to humans and colonize them through the food chain."

###

Also involved in the research at Penn State were Asha Marie Miles, postdoctoral scholar in the Department of Animal Science, and Emily Bean, doctoral candidate in the Intercollege Graduate Degree Program in Integrative and Biomedical Physiology; and Daniel Ubriaco Oliveira Gonçalves de Carvalho, Mario Augusto Reyes Alema, Jeferson Silva Carvalho and Lilian Gregory, Department of Internal Medicine, School of Veterinary Medicine and Animal Science, University of Sao Paulo, Brazil.

This work was supported by Sao Paulo Research Foundation, the Brazilian Agency CAPES and the U.S. Department of Agriculture's National Institute of Food and Agriculture.

Chronic alcohol use reshapes the brain's immune landscape, driving anxiety and addiction

In their quest to understand how the immune system is implicated in alcohol use disorder, Scripps Research scientists have found a potential path for treatment

SCRIPPS RESEARCH INSTITUTE

Research News

LA JOLLA, CA--Deep within the brain, a small almond-shaped region called the amygdala plays a vital role in how we exhibit emotion, behavior and motivation. Understandably, it's also strongly implicated in alcohol abuse, making it a long-running focus of Marisa Roberto, PhD, professor in Scripps Research's Department of Molecular Medicine.

Now, for the first time, Roberto and her team have identified important changes to anti-inflammatory mechanisms and cellular activity in the amygdala that drive alcohol addiction. By countering this process in mice, they were able to stop excessive alcohol consumption--revealing a potential treatment path for alcohol use disorder. The study is published in Progress in Neurobiology.

"We found that chronic alcohol exposure compromises brain immune cells, which are important for maintaining healthy neurons," says Reesha Patel, PhD, a postdoctoral fellow in Roberto's lab and first author of the study. "The resulting damage fuels anxiety and alcohol drinking that may lead to alcohol use disorder."

Roberto's study looked specifically at an immune protein called Interleukin 10, or IL-10, which is prevalent in the brain. IL-10 is known to have potent anti-inflammatory properties, which ensures that the immune system doesn't respond too powerfully to disease threats. In the brain, IL-10 helps to limit inflammation from injury or disease, such as stroke or Alzheimer's. But it also appears to influence key behaviors associated with chronic alcohol use.

In mice with chronic alcohol use, IL-10 was significantly reduced in the amygdala and didn't signal properly to neurons, contributing to increased alcohol intake. By boosting IL-10 signaling in the brain, however, the scientists could reverse the aberrant effects. Notably, they observed a stark reduction in anxiety-like behaviors and motivation to drink alcohol.

"We've shown that inflammatory immune responses in the brain are very much at play in the development and maintenance of alcohol use disorder," Roberto says. "But perhaps more importantly, we provided a new framework for therapeutic intervention, pointing to anti-inflammatory mechanisms."

Alcohol use disorder is widespread, affecting some 15 million people in the United States, and few effective treatments exist. By examining how brain cells change with prolonged exposure to alcohol, Roberto's lab has uncovered many possible new therapeutic approaches for those with alcohol addiction.

In the latest study, Roberto's lab collaborated with Silke Paust, PhD, associate professor in the Department of Immunology and Microbiology. Paust and her team determined the precise immune cells throughout the whole brain that are affected by chronic alcohol use. The findings revealed a large shift in the brain immune landscape, with increased levels of immune cells known as microglia and T-regulatory cells, which produce IL-10.

Despite a higher number of IL-10-producing cells in the whole brain of mice with prolonged alcohol use, the amygdala told a different story. In that region, levels of IL-10 were lower and its signaling function was compromised -- suggesting that the immune system in the amygdala responds uniquely to chronic alcohol use.

This study complements recent findings by the Roberto lab demonstrating a causal role for microglia in the development of alcohol dependence.

Future studies will build on these findings to identify exactly how and when IL-10 signals to neurons in the amygdala and other addiction-related brain circuits to alter behavior.

###

The study, "IL-10 normalizes aberrant amygdala GABA transmission and reverses anxiety-like behavior and dependence-induced escalation of alcohol intake," was authored by Reesha Patel, Sarah Wolfe, Michal Bajo, Shawn Abeynaike, Amanda Pahng, Vittoria Borgonetti, Shannon D'Ambrosio, Rana Nikzad, Scott Edwards, Silke Paust, Amanda Roberts and Marisa Roberto.

Support for this study was provided by the National Institute on Alcohol Abuse and Alcoholism (grants AA021491, AA013498, AA006420, AA017447, AA015566, AA027700, AA026765, AA007456) and the Pearson Center for Alcoholism and Addiction Research.