Friday, March 12, 2021

UConn researcher finds 'Goldilocks problem' in child welfare decision-making

UNIVERSITY OF CONNECTICUT

Research News

When something bad happens to a child, the public and policy response is swift and forceful.

How could this have happened?

What went wrong?

What do we do to make sure it never happens again?

When a family becomes erroneously or unnecessarily enmeshed in the child welfare system, that burden is largely invisible - a burden borne mostly by the family itself.

In both situations, the fault for the systemic failure is often placed on the caseworker - overburdened, under-resourced, and forced to make quick and critical judgments about the risk of harm or neglect to children.

But, according to a new study coauthored by a researcher in the UConn School of Social Work, a major tool used in child welfare decision-making - and the way agencies try to implement it - may be part of the problem.

"I think it seems appealing to have a consistent way to do something," says Megan Feely, an assistant professor of social work who specializes in child welfare and child maltreatment prevention. "It's when you get into the details that it becomes kind of murky."

In their study - recently published in the journal Social Service Review - Feely and coauthor Emily Bosk, an assistant professor at the Rutgers University School of Social Work, examined the application of the Structured Decision-Making Model's Risk Assessment in two states.

"What to do with children who need to be safe, and families who may need help keeping their children safe, seem like some of the most important decisions a state will make," says Feely, "and it's really sort of shocking how little attention has been given to how these decisions are made - these incredibly, incredibly important decisions."

Commonly called the "RA," the risk assessment is an actuarially based prognostic tool that provides a checklist for child welfare workers to use to help assess a family's future risk. It asks questions - Has the family been involved in child welfare before? Have they had an allegation of neglect? Does the primary caretaker have a substance use problem? Do they have a current or past mental health problem? Are the children medically fragile? - and then categorizes the family as low, medium, high, or intensive risk, based on the worker's responses.
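The checklist-then-categorize logic described above can be sketched in a few lines of code. This is purely illustrative: the item names come from the article, but the weights and cut-points below are invented, since the actual SDM scoring rubric is not described here.

```python
# Hypothetical weights for the RA items mentioned in the article.
# The real Structured Decision-Making Risk Assessment uses a validated
# actuarial rubric; these numbers are invented for illustration.
RISK_ITEMS = {
    "prior_child_welfare_involvement": 2,
    "prior_neglect_allegation": 2,
    "caretaker_substance_use": 3,
    "caretaker_mental_health_problem": 2,
    "medically_fragile_children": 1,
}

def ra_category(responses):
    """Sum the weights of items answered 'yes' and map the total to a risk band."""
    score = sum(w for item, w in RISK_ITEMS.items() if responses.get(item))
    if score <= 2:
        return "low"
    if score <= 4:
        return "medium"
    if score <= 7:
        return "high"
    return "intensive"

print(ra_category({"prior_neglect_allegation": True}))  # low
```

The sketch makes the study's structural point visible: the tool emits a single hard number and category, with no built-in step for combining that output with clinical judgment.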

The RA is considered the gold standard in child welfare decision-making, developed with the goal of providing a level of standardization and predictability. It was intended to be used in conjunction with workers' clinical judgment, but designed to eliminate some of the most glaring problems with clinical decision-making, such as individual variation in the interpretation of the same set of facts, implicit bias, and lack of knowledge about empirically established risk factors.

"The RA is premised on the idea that when workers follow it, different individuals are reasonably likely to come to the same conclusion about case actions," Bosk and Feely wrote. "No longer will outcomes be random - that is, contingent on which worker a family is assigned."

For their study, Bosk and Feely examined the RA's use - reviewing policies and interviewing caseworkers and their supervisors - and found drastically different applications of the assessment between the two states at the organizational level. In the first, the application of the RA had been mandated by the legislature, and the tool was used strictly and in place of clinical judgment. In the other, while the RA was always completed, it was not a significant factor in decision-making, with clinical judgment typically driving decisions.

"We call this 'the Goldilocks problem,' because one state essentially totally privileges the RA score," Feely says, "so it's a too tight interpretation of what to do with it. And in the other, most workers don't really use it, so it's an overly loose interpretation of what to do with it and how to integrate it into clinical judgment. There's no middle point."

In the so-called "tight state," workers explained they were unable to use anything other than the RA to make case determinations, which was not the intended application of the assessment by its developers. Clinical assessment was discouraged and, because of the rigidity of the framework, some workers would intentionally circumvent the RA - changing scores to either increase or decrease the predicted risk - in order to achieve a case trajectory that better matched their otherwise disregarded clinical judgment.

By contrast, in the so-called "loose state," workers were required to complete the RA, but it had little to no role in case decision-making, with the majority of workers relying on their clinical judgments and consultations with their supervisors to decide case trajectories. While the workers had significantly more flexibility in their decision-making, the researchers found, the potentially systemizing and standardizing effect of the RA was eliminated.

The problem, Feely says, comes down to a flaw in the RA itself: while the developers intended for the RA to be used in conjunction with clinical judgment, they never provided any guidance or methodology on how to integrate the two. The propensity is to blame the workers, or the agencies, for the RA's shortcomings, she says, but workers consistently found the tool to be problematic, and the study validates those concerns.

"Without guidance, it's not clear how to integrate them, exactly, because it's not another piece of more qualitative information, which we would use in clinical judgment, but a hard number," Feely says. "We found that organizational context really matters for the application of the RA, and that because it's not specified in the model, organizations are responsible for figuring out how to integrate the score with clinical judgment themselves."

While that led some workers in the "tight state" to manipulate the RA, it also led workers to escalate cases involving families that, under clinical judgment, would likely not have been considered at risk. Feely said that unnecessarily high rates of child welfare involvement, particularly in marginalized communities or communities with many Black, Indigenous, or other people of color, contribute to the overall sense that the system is unfair.

"You can see how frustrating it would be if your child had autism, or was categorized as having behavioral or mental health issues, and you were on antidepressants, and then all of a sudden you're labeled as at risk," she says. "You can't do anything about those things. You can't fix them. You're not going to go off your antidepressants, because that obviously would make it worse."

She continues, "It feels like the conservative option is to err on the side of having more false positives, where people that are really not at risk are misidentified as at risk. But there are real downsides to that, and I think that, in child welfare, we're seeing a sort of paralleling with some of the attention that's on police - there are longer-term big consequences when we keep getting it wrong, because people don't trust the system."

While a clinical-based approach offers more nuance, she says, it also loses what could make the process more consistent. As the RA and other prognostic tools and their potential use in child welfare situations are being discussed, Feely said that this study offers a cautionary tale that should encourage policymakers to be wary of trusting a tool more than is warranted.

"A main issue is really having a more open discussion of how these sort of probability-based tools should be included into the context of clinical decision-making," she says. "I think that the move toward trying to incorporate more evidence and a more scientific base in social services, is positive, but I think it has to be really carefully balanced with the limits of that science. Overestimating the science, and the veracity of it, and its ability to be applicable in a particular situation, can be just as problematic for families and society as under-using it."


 

COVID-19 testing in schools complex but doable, worth the effort -- RAND study

RAND CORPORATION

Research News

A RAND Corporation report funded by The Rockefeller Foundation shows that COVID-19 testing can be effectively integrated into K-12 schools' pandemic response plans, helping families and staff feel more comfortable with in-person instruction.

The report found that even for well-resourced districts and schools, launching a COVID-19 testing program was a major undertaking that required access to rapid-turnaround tests, additional staffing or strong partners for logistical support, technical assistance for the design and execution of testing programs, and a strategy for successfully engaging the school community to participate in testing.

"Our interviews with K-12 schools show that COVID-19 testing is complex, but doable," said Laura Faherty, lead author and a physician policy researcher at nonprofit, nonpartisan RAND. "Many early adopters found it critical to partner with local public health departments, local health systems, their peers and testing vendors to launch feasible COVID-19 testing programs. But even with strong partnerships in place, school and district leaders described just how much effort it takes to make a testing program run smoothly. They described being 'hungry' for guidance and in need of a lot of technical assistance."

Researchers examined the experiences of schools and districts that were early adopters of COVID-19 testing in Fall 2020, including insights from a national scan of schools as well as more than 80 interviews with K-12 school and district leaders in December 2020.

According to the report, COVID-19 testing programs were more often implemented in public and independent schools with access to resources such as expert advice, sufficient funding and strong local partnerships.

Schools encountered varying degrees of hesitation around testing and used several strategies to encourage participation. Privacy concerns were common: Staff and parents worried their health information could be shared with others. Members of school communities expressed concern about how their test samples would be used. Another barrier to engaging their communities in testing was doubt about the actual risk posed by COVID-19.

"To give their communities peace of mind to return to in-person learning, schools often started small, using a pilot period to work out many of the details before fully launching," said Benjamin Master, co-leader of the project and a policy researcher at RAND. "The most scalable testing models that we identified involved districts that were able to access expert technical support and partners who could help manage the logistics of testing."

The report provides recommendations for how schools can develop, and policymakers can support, effective COVID-19 testing programs. Among them:

  • expand funding to ensure schools can access tests, hire additional staff and contract with vendors as needed to implement testing
  • continue to invest in research and development of testing solutions that are convenient for schools to use
  • provide incentives such as paid sick leave for teachers and families to participate in testing and isolate as needed
  • promote the use of standardized metrics and data platforms to support decisionmaking
  • provide resources to strengthen state and local health departments so they're fully equipped to help schools design testing programs and respond to positive tests.

The Rockefeller Foundation's funding for the report is part of its portfolio of work aimed at disseminating lessons learned so far about the use of COVID-19 testing to help facilitate school reopening.

###

Other authors of the report, "COVID-19 Testing in K-12 Schools: Insights from Early Adopters," are Elizabeth Steiner, Julia Kaufman, Zachary Predmore, Laura Stelitano, Jennifer Leschitz, Brian Phillips, Heather Schwartz and Rebecca Wolfe.

RAND Health Care promotes healthier societies by improving health care systems in the United States and other countries. RAND Education and Labor is dedicated to improving education and expanding economic opportunities for all through research and analysis. Its researchers address key policy issues in U.S. and international education systems and labor markets, from pre-kindergarten to retirement planning.

New tool makes students better at detecting fake imagery and videos

UPPSALA UNIVERSITY

Research News

IMAGE: Researchers at Uppsala University have developed a digital self-test that trains users to assess news items, images and videos presented on social media. (Credit: Thomas Nygren)

Researchers at Uppsala University have developed a digital self-test that trains users to assess news items, images and videos presented on social media. The self-test has also been evaluated in a scientific study, which confirmed the researchers' hypothesis that the tool genuinely improved the students' ability to apply critical thinking to digital sources.

The new tool and the scientific review of it are part of the News Evaluator project to investigate new methods of enhancing young people's capacity for critical awareness of digital sources, a key component of digital literacy.

"As research leader in the project, I'm surprised how complicated it is to develop this type of tool against misleading information - one that's usable on a large scale. Obviously, critically assessing digital sources is complicated. We've been working on various designs and tests, with major experiments in school settings, for years. Now we've finally got a tool that evidently works. The effect is clearly positive and now we launch the self-test on our News Evaluator website http://www.newsevaluator.com, so that all anyone can test themselves for free," says Thomas Nygren, associate professor at Uppsala University.

The tool is structured in a way that allows students to work with it, online, on their own. They get to see news articles in a social-media format, with pictures or videos, and the task is to determine how credible they are. Is there really wood pulp in Parmesan cheese, for instance?

"The aim is for the students to get better at uncovering what isn't true, but also improve their understanding of what may be true even if it seems unlikely at first," Nygren says.

The tool includes built-in guidance as user support. Students can follow how a professional would have gone about investigating the authenticity of the statements or images - by opening a new window and doing a separate search alongside the test, or by doing a reverse image search, for example. The students are encouraged to learn "lateral reading" (verifying what you read by double-checking it against other sources). After solving the tasks, the students get feedback on their performance.

When the tool was tested with just over 200 students, it proved to have a beneficial effect on their ability to assess sources critically. Students who had received guidance and feedback from the tool showed distinctly better results than those who had not been given this support. The tool also produced better results, in terms of this ability, than other comparable initiatives that require teacher participation and more time.

Apart from practical tips such as opening a new search tab, doing reverse image searches and not always choosing the search result at the top of the results page (but, rather, one that comes from a source you recognise), Nygren has a general piece of advice that can help us all become more critically aware in the digital world:

"Make sure you keep up to date with information and news from trustworthy sources with credible practices of fact-checking, such as the national TV news programmes or an established daily newspaper. It's difficult and arduous being critical about sources all the time."

###

The News Evaluator project is a collaboration among the Swedish non-profit organisation Public & Science (Vetenskap & Allmänhet, VA), Uppsala University and RISE Research Institutes of Sweden. It is funded by Vinnova (Sweden's innovation agency) and Uppsala University.

Link to the tool: http://www.newsevaluator.com


Unveiling the cause of onion center rot

CARL R. WOESE INSTITUTE FOR GENOMIC BIOLOGY, UNIVERSITY OF ILLINOIS AT URBANA-CHAMPAIGN

Research News

IMAGE: Pantaphos, which is produced by the plant pathogen Pantoea ananatis, is responsible for causing onion center rot. (Credit: Alex Polidore)

Since 1983, the bacterium Pantoea ananatis has been known to infect several important crops, including onions, rice, and corn. It was unclear, however, which molecules were involved. A new study, published in mBio, has identified one of the culprits: pantaphos. Intriguingly, the researchers discovered that pantaphos can also act as an herbicide and is toxic to glioblastoma cells, making it an exciting candidate for agricultural and biomedical applications.

"Herbicide resistant weeds are an issue in agriculture," said William Metcalf (MMG leader), a professor of microbiology. "Unfortunately, there hasn't been a new class of herbicide for over 30 years. If we can understand how pantaphos causes onion rot, we can solve a big problem."

Scientists at Cornell University had previously compared the disease-causing strains of P. ananatis to other innocuous strains. They identified a group of genes, designated as hvr, that were responsible for causing onion rot. The researchers in the current paper were inspired by these studies, and they subsequently identified which molecules were produced by these genes and found pantaphos.

"We can inject onions with purified pantaphos and cause onion rot. The injected onions start rotting, and become gross and mushy. It was exciting to see," said Alexander Polidore, a PhD student in the Metcalf lab. "Additionally, bacteria that cannot synthesize this molecule cannot cause onion rot, which means that it is necessary to cause the infection."

"If we can fully understand how pantaphos is made by the bacteria, we can identify multiple steps of intervention. If we can stop any one of those processes, we can get rid of onion rot," Metcalf said.

Intriguingly, pantaphos has also shown promise as an effective herbicide. "I compared pantaphos to Liberty and Roundup, which are common herbicides, and it was just as good - or even better - against typical weeds such as mustard seedlings," Polidore said.

An important requirement for an herbicide is that it kills weeds but remains non-toxic to animals, including humans. Therefore, the researchers tested the toxicity of pantaphos against other organisms. "Although it does not affect other bacteria and fungi, we found that it is moderately toxic to normal human cell lines, but strikingly toxic to glioblastoma cell lines. We were excited because those cancer cells are notoriously hard to kill," Polidore said.

Although pantaphos is somewhat toxic to human cell lines, it is possible that it will not be toxic to whole animals. "If you feed pantaphos to a mouse and it doesn't get taken up in the intestine, it will be much less toxic," Metcalf said. "Thus, our cell line studies are preliminary and will require follow-up experiments to define the level of toxicity in humans."

The researchers are currently trying to understand how pantaphos kills weeds and glioblastoma cells. The results of those studies may help them design modified versions of the molecule that affect only the desirable targets. Using bioinformatic analysis, they have also found similar hvr genes in other bacteria, indicating that there may be several pantaphos-like molecules that could be used as potential herbicides or therapeutic drugs.

###

The study "A Phosphonate Natural Product Made by Pantoea ananatis is Necessary and Sufficient for the Hallmark Lesions of Onion Center Rot" can be found at DOI: 10.1128/mBio.03402-20. The study was funded by the National Institutes of Health.

Study finds increased risk of death among breast cancer patients who drink sugar-sweetened soda

Women who reported drinking non-diet soda five times or more per week were 85% more likely to die from breast cancer

UNIVERSITY AT BUFFALO

Research News

IMAGE: Study first author Nadia Koyratty is an epidemiology PhD candidate in the University at Buffalo School of Public Health and Health Professions. (Credit: University at Buffalo)

BUFFALO, N.Y. -- New research from the University at Buffalo suggests that breast cancer patients who drink sugar-sweetened beverages regularly are at increased risk for death from any cause and breast cancer in particular.

Compared to women who never or rarely drank non-diet soda, those who reported drinking non-diet soda five times or more per week had a 62% higher likelihood of dying from any cause, and were 85% more likely to die from breast cancer specifically. The findings were published online ahead of print March 2 in Cancer Epidemiology, Biomarkers & Prevention, a journal of the American Association for Cancer Research.

Research on soda and breast cancer is fairly new, says study first author Nadia Koyratty, a PhD candidate in the Department of Epidemiology and Environmental Health in UB's School of Public Health and Health Professions. Because breast cancer is so common, lifestyle recommendations for breast cancer survivors are of considerable importance. And despite the negative health outcomes associated with drinking soda - such as weight gain, Type 2 diabetes and cardiovascular disease - many people continue to drink sugar-sweetened sodas.

There have been only a few observational studies examining the association between sugar-sweetened beverages and cancer mortality. "This study is one of the few that looks at the prognosis of women with breast cancer with respect to non-diet soda consumption," Koyratty says.

Researchers assessed the relationship between sugar-sweetened soda and both all-cause and breast cancer mortality among 927 women aged 35 to 79 who had been diagnosed with breast cancer. Participants were enrolled in the Western New York Exposures and Breast Cancer (WEB) Study and followed for a median of nearly 19 years.

The study used a food frequency questionnaire to assess participants' food and beverage intake in the 12 to 24 months prior to diagnosis of breast cancer. Of the more than 900 women diagnosed with breast cancer, 41% had died by the end of the follow-up period. Among the participants who had died, there was a higher percentage of women who reported high frequency of sugar-sweetened soda consumption compared to the women who were still living.

The associations did not change when researchers included diet soda consumption as a variable.

Why the focus on non-diet soda?

"Non-diet sodas are the highest contributors of sugar and extra calories to the diet, but they do not bring anything else that is nutritionally beneficial," Koyratty explains. "On the other hand, teas, coffees and 100% fruit juices, unless sugars are added, are healthier beverage options because they do add to the nutritive value through antioxidants and vitamins."

Sugar-sweetened sodas contain large quantities of sucrose and fructose, which give them a higher glycemic load than most other foods or beverages. The resulting elevated levels of blood glucose and insulin may lead to conditions that have been associated with higher risk of breast cancer, the researchers note.

"There are more than 3.5 million breast cancer survivors alive in the U.S. today. We need to better understand the factors that affect their health," said study senior author Jo L. Freudenheim, PhD, SUNY Distinguished Professor in the Department of Epidemiology and Environmental Health in UB's School of Public Health and Health Professions.

"While we need more studies to confirm our findings, this study provides evidence that diet may impact longevity of women after breast cancer," Freudenheim added.

###

Study co-authors include Susan McCann, PhD, of Roswell Park Comprehensive Cancer Center; Amy Millen, PhD, associate professor of epidemiology and environmental health at UB; and Maurizio Trevisan, MD, of Vin University.

The research was supported in part by grants from the Army Medical Research and Materiel Command, the National Cancer Institute, and the National Institute on Alcohol Abuse and Alcoholism.

Breaking the warp barrier for faster-than-light travel

Astrophysicist at Göttingen University discovers new theoretical hyper-fast soliton solutions

UNIVERSITY OF GÖTTINGEN

Research News

IMAGE: Artistic impression of different spacecraft designs considering theoretical shapes of different kinds of "warp bubbles". (Credit: E. Lentz)

If travel to distant stars within an individual's lifetime is going to be possible, a means of faster-than-light propulsion will have to be found. To date, even recent research about superluminal (faster-than-light) transport based on Einstein's theory of general relativity would require vast amounts of hypothetical particles and states of matter that have "exotic" physical properties such as negative energy density. This type of matter either cannot currently be found or cannot be manufactured in viable quantities. In contrast, new research carried out at the University of Göttingen gets around this problem by constructing a new class of hyper-fast 'solitons' using sources with only positive energies that can enable travel at any speed. This reignites debate about the possibility of faster-than-light travel based on conventional physics. The research is published in the journal Classical and Quantum Gravity.

The author of the paper, Dr Erik Lentz, analysed existing research and discovered gaps in previous 'warp drive' studies. Lentz noticed that there existed yet-to-be-explored configurations of space-time curvature organized into 'solitons' that have the potential to solve the puzzle while being physically viable. A soliton - in this context also informally referred to as a 'warp bubble' - is a compact wave that maintains its shape and moves at constant velocity. Lentz derived the Einstein equations for unexplored soliton configurations (where the space-time metric's shift vector components obey a hyperbolic relation), finding that the altered space-time geometries could be formed in a way that worked even with conventional energy sources. In essence, the new method uses the very structure of space and time arranged in a soliton to provide a solution to faster-than-light travel, which - unlike other research - would only need sources with positive energy densities. No "exotic" negative energy densities are needed.

If sufficient energy could be generated, the equations used in this research would allow space travel to Proxima Centauri, our nearest star, and back to Earth in years instead of decades or millennia. That means an individual could travel there and back within their lifetime. In comparison, current rocket technology would take more than 50,000 years for a one-way journey. In addition, the solitons (warp bubbles) were configured to contain a region with minimal tidal forces such that the passing of time inside the soliton matches the time outside: an ideal environment for a spacecraft. This means there would not be the complications of the so-called "twin paradox" whereby one twin travelling near the speed of light would age much more slowly than the other twin who stayed on Earth: in fact, according to the recent equations both twins would be the same age when reunited.
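The comparison above is easy to check with back-of-the-envelope arithmetic, taking the distance to Proxima Centauri as roughly 4.25 light-years:

```python
# Rough check of the travel times quoted in the article.
distance_ly = 4.25             # distance to Proxima Centauri, in light-years
rocket_years_one_way = 50_000  # chemical-rocket estimate cited in the article

# A soliton moving at light speed covers one light-year per year,
# so a round trip takes about twice the distance, expressed in years.
soliton_round_trip_years = 2 * distance_ly
print(soliton_round_trip_years)  # 8.5 -- "years instead of decades or millennia"

# Implied average speed of current rockets, as a fraction of light speed.
print(distance_ly / rocket_years_one_way)  # 8.5e-05
```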

"This work has moved the problem of faster-than-light travel one step away from theoretical research in fundamental physics and closer to engineering. The next step is to figure out how to bring down the astronomical amount of energy needed to within the range of today's technologies, such as a large modern nuclear fission power plant. Then we can talk about building the first prototypes," says Lentz.

Currently, the amount of energy required for this new type of space propulsion drive is still immense. Lentz explains, "The energy required for this drive travelling at light speed encompassing a spacecraft of 100 meters in radius is on the order of hundreds of times of the mass of the planet Jupiter. The energy savings would need to be drastic, of approximately 30 orders of magnitude to be in range of modern nuclear fission reactors." He goes on to say: "Fortunately, several energy-saving mechanisms have been proposed in earlier research that can potentially lower the energy required by nearly 60 orders of magnitude." Lentz is currently in the early stages of determining if these methods can be modified, or if new mechanisms are needed to bring the energy required down to what is currently possible.
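Lentz's "approximately 30 orders of magnitude" figure can be sanity-checked with rough numbers: Jupiter's mass is about 1.9 × 10^27 kg, "hundreds of Jupiter masses" can be taken as roughly 100 at the low end, and a large fission plant can be approximated as 1 GW running for a year.

```python
import math

C = 2.998e8          # speed of light, m/s
M_JUPITER = 1.9e27   # mass of Jupiter, kg

# Mass-energy of ~100 Jupiter masses (lower end of "hundreds"), via E = m * c^2.
drive_energy_j = 100 * M_JUPITER * C**2

# One year of output from a 1 GW fission plant.
seconds_per_year = 3.156e7
plant_energy_j = 1e9 * seconds_per_year

# Gap in orders of magnitude -- close to the ~30 quoted in the article.
print(round(math.log10(drive_energy_j / plant_energy_j)))  # 30
```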

###

Original publication: Erik W Lentz, Breaking the Warp Barrier: Hyper-Fast Solitons in Einstein-Maxwell-Plasma Theory, Classical and Quantum Gravity, March 2021. DOI: 10.1088/1361-6382/abe692


CAPTION: Image showing how long it would take different types of spacecraft to travel from our solar system to Proxima Centauri (the nearest known star). Currently, the only option would be to use a chemical rocket, meaning a journey time of over 50,000 years. (Credit: E. Lentz)

Bacterial film separates water from oil

NORTH CAROLINA STATE UNIVERSITY

Research News

Researchers have demonstrated that a slimy, yet tough, type of biofilm that certain bacteria make for protection and to help them move around can also be used to separate water and oil. The material may be useful for applications such as cleaning contaminated waters.

In the journal Langmuir, North Carolina State University researchers reported the findings of an experiment in which they used a material produced by the bacterium Gluconacetobacter hansenii as a filter to separate water from an oil mixture.

"It's really remarkable to think that these little bugs can make this stuff that is so perfect in many ways," said Lucian Lucia, the study's corresponding author and an associate professor of forest biomaterials and chemistry at NC State.

The biofilm the bacteria make and release into their environment is made of cellulose, which is the same material that gives plants a sturdy structure in their cell walls. However, when bacteria make cellulose, it has a tightly packed, crystalline structure, researchers said.

"It's one of the purest, if not the purest, forms of cellulose out there," Lucia said. "It's very well structured. It's very water loving, and it's got a very high crystallinity, so it packs very beautifully. Once you strip out the bacteria, you have this amazingly tough material that has a real robustness, or toughness."

The bacteria make the film to protect themselves, the researchers said.

"If you leave something like an unwashed dish out, it can turn all slimy and gross - that's a biofilm," said study co-author Wendy Krause, associate professor of textile engineering, chemistry and science at NC State. "Different bacteria make different biofilms. The bacterial film that we're studying is made of cellulose. The bacteria are making it because they live on it and in it. They're making their home."

In the experiment, the researchers used the bacteria as factories for cellulose nanofibers. They then removed the bacteria and their non-cellulose residue. Finally, the researchers used the resulting cellulose membrane to see if it could separate water from a solution containing both oil and water.

They found the material was effective at removing water, and it was sturdy.

"The oil doesn't want to go through the membrane; it has a repulsive effect to it," Lucia said. "It's super fat-hating."

"If the oil and water were highly mixed, it doesn't matter," Krause added. "You could put an immersion blender into the solution, and the membrane will still separate the water and oil."

Researchers see a variety of potential applications for the material in situations where you need to recover water from an oily mixture - whether it be to clean water contaminated with a textile dye or for environmental remediation. In future work, the researchers want to explore how they can tailor the membrane by chemically modifying it for certain applications.

The study, "Bacterial Superoleophobic Fibrous Matrices: A Naturally Occurring Liquid-Infused System for Oil-Water Separation," was published online in the journal Langmuir on Feb. 19.

###

Note to editors: The abstract follows.

"Bacterial Superoleophobic Fibrous Matrices: A Naturally Occurring Liquid-Infused System for Oil-Water Separation."

Published online in Langmuir on Feb. 19, 2021.

Authors: Zahra Ashrafi, Zimu Hu, Lucian Lucia and Wendy Krause.

DOI: 10.1021/acs.langmuir.0c02717

Abstract: Nanocellulose fibers bioengineered by bacteria are a high performance three-dimensional cross-linked network which can confine a dispersed liquid medium such as water. The strong chemical and physical interactions of dispersed water molecules with the entangled cellulosic network allow these materials to be ideal substrates for effective liquid separation. This type of phenomenon can be characterized as green with no equivalent precedent; its performance and sustainability relative to other cellulose-based or synthetic membranes are shown herein to be superior. In this work, we demonstrated that the renewable bacterial nanocellulosic membrane can be used as a stable liquid-infused system for the development of soft surfaces with superwettability and special adhesion properties and thus address intractable issues normally encountered by solid surfaces.


Deforestation's effects on malaria rates vary by time and distance

Study shows that deforestation in Southeast Asia increases malaria infections before leading to later reductions, although these effects can vary by the location of forest loss

ELIFE

Research News

Deforestation may cause an initial increase in malaria infections across Southeast Asia before leading to later decreases, a study published today in eLife suggests.

The results may help malaria control programs in the region develop better strategies for eliminating malaria infections and educating residents on how to protect themselves from infection.

Mosquitoes spread the malaria parasite to humans, causing infections that can be severe and sometimes deadly. In the area along the Mekong river in Southeast Asia, many residents hunt or harvest wood in the surrounding forests, which can increase their risk of infection. Yet recent outbreaks of malaria in the region have also been linked to deforestation.

"As countries in the region focus their malaria control and elimination efforts on reducing forest-related transmission, understanding the impact of deforestation on malaria rates is essential," says first author Francois Rerolle, Graduate Student Researcher at the University of California San Francisco (UCSF), US, who works within the UCSF Malaria Elimination Initiative.

To better understand the effects of deforestation on malaria transmission, Rerolle and colleagues examined both forest cover data and village-level malaria incidence data from 2013-2016 in two regions within the Greater Mekong Sub-region.

They found that in the first two years following deforestation activities, malaria infections increased in villages in the area, but then decreased in later years. This trend was mostly driven by infections with the malaria parasite Plasmodium falciparum. Deforestation in the immediate 1-10-kilometer radius surrounding villages did not affect malaria rates, but deforestation in a wider 30-kilometer radius around the villages did. The authors say this is likely due to the effect that wider deforestation can have on human behaviour. "We suspect that people making longer and deeper trips into the forest results in increased exposure to mosquitoes, putting forest-goers at risk," Rerolle explains.
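The buffer-radius comparison described above can be sketched in a few lines. This is a hedged illustration using entirely synthetic village data, not the authors' analysis (which used satellite forest-cover data and village-level incidence records); it simply shows how incidence correlating with forest loss in a wide buffer but not a narrow one would appear in such data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_villages = 200

# Synthetic fraction of forest cover lost within two buffer radii (hypothetical)
loss_10km = rng.uniform(0, 0.3, n_villages)
loss_30km = rng.uniform(0, 0.3, n_villages)

# Hypothetical incidence per 1,000 person-years, driven only by wide-radius
# loss, mirroring the study's qualitative finding
incidence = 5 + 40 * loss_30km + rng.normal(0, 1, n_villages)

# Correlation of incidence with forest loss at each radius
r_10 = np.corrcoef(loss_10km, incidence)[0, 1]
r_30 = np.corrcoef(loss_30km, incidence)[0, 1]
print(f"correlation with 10 km loss: {r_10:.2f}")
print(f"correlation with 30 km loss: {r_30:.2f}")
```

In data shaped like this, the 30-kilometer correlation is strong while the 10-kilometer correlation is near zero, which is the pattern the authors interpret as forest-goers travelling longer distances into cleared landscapes.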

Previously, studies on the Amazon in South America have found increased malaria infections in the first 6-8 years after deforestation, after which malaria rates fall. The difference in timing may be due to regional differences. The previous studies in the Amazon looked at deforestation driven by non-indigenous people moving deeper into the forest, while communities in the current study have long lived at the forest edges and rely on subsistence agriculture.

"Our work provides a more complete picture of the nuanced effects of deforestation on malaria infections," says senior author Adam Bennett, Program Lead at the UCSF Malaria Elimination Initiative. "It may encourage more in-depth studies on the environmental and behavioural drivers of malaria to help inform strategies for disease elimination."

###

Media contact

Emily Packer, Media Relations Manager
eLife
e.packer@elifesciences.org
+44 (0)1223 855373

About eLife

eLife is a non-profit organisation created by funders and led by researchers. Our mission is to accelerate discovery by operating a platform for research communication that encourages and recognises the most responsible behaviours. We aim to publish work of the highest standards and importance in all areas of biology and medicine, including Epidemiology and Global Health, while exploring creative new ways to improve how research is assessed and published. eLife receives financial support and strategic guidance from the Howard Hughes Medical Institute, the Knut and Alice Wallenberg Foundation, the Max Planck Society and Wellcome. Learn more at https://eLifesciences.org/about.

To read the latest Epidemiology and Global Health research published in eLife, visit https://eLifesciences.org/subjects/epidemiology-global-health.

About the UCSF Malaria Elimination Initiative

The Malaria Elimination Initiative works in partnership with malaria endemic countries and regions to advance evidence-based malaria policy and practice. Learn more at http://www.shrinkingthemalariamap.org.

Both old and young fish sustain fisheries

ARC CENTRE OF EXCELLENCE FOR CORAL REEF STUDIES

Research News

IMAGE: There are four species of coral grouper found commonly on the Great Barrier Reef. The bar-cheek coral grouper (Plectropomus maculatus) is distinguished by elongated dot patterns.

CREDIT: PHIL WOODHEAD, WET IMAGE UNDERWATER PHOTOGRAPHY.

Scientists have used modern genetic techniques to prove age-old assumptions about what sizes of fish to leave in the sea to preserve the future of local fisheries.

"We've known for decades that bigger fish produce exponentially more eggs," said the lead author of the new study, Charles Lavin, who is a research fellow from James Cook University (JCU) and Nord University in Norway.

"However, we also found while these big fish contributed significantly to keeping the population going--they are also rare."

Co-author Dr Hugo Harrison from the ARC Centre of Excellence for Coral Reef Studies at JCU said as fish grow older, they become more fertile and their chances of having babies increase.

"This is an age-old assumption of fisheries management--and with the help of modern genetics, we can show that this assumption is correct."

"But the smaller fish are just as important to keeping populations going. They may have fewer babies, but they also are more abundant."

The study used genetic parentage analysis to identify which adult coral groupers (Plectropomus maculatus) contribute to replenishing fished populations in the Great Barrier Reef Marine Park (GBRMP).

The authors found that large coral groupers are important because they are more likely to replenish the fish stocks removed from the fishery. However, smaller fish are still making a meaningful contribution.

"We show that minimum size-limits on catches are effective at protecting the reproductively mature coral grouper," Mr Lavin said. "This ensures all fish have the opportunity to reproduce at least once prior to being caught."

The authors said all fisheries must ensure there are enough fish reproducing to replace the portion of the population that are caught.

"We're fortunate in the GBRMP to have measures in place that protect both the small and larger fish," Dr Harrison said.

"These ensure our fisheries remain sustainable and can bounce back quickly after a disturbance."

In the GBRMP, catches of coral grouper are limited by size and catch limits, as well as seasonal closures to ensure the fishery is productive and sustainable.

"It's encouraging that these measures are effective," Mr Lavin said.

"But it's important that we also protect the bigger, rarer fish inside no-take marine reserves because they are super-productive," he said.

"For the fisher, this means there will always be fish to catch."

###

PAPER

Lavin C, Jones G, Williamson D, Harrison H. (2021). 'Minimum size limits and the reproductive value of numerous, young, mature female fish'. Proceedings of the Royal Society B. DOI: 10.1098/rspb.2020.2714

Researchers modify air quality models to reflect polluted reality in Latin America

NORTH CAROLINA STATE UNIVERSITY

Research News

IMAGE: Computational models of air quality have long been used to shed light on pollution control efforts in the United States and Europe, but the tools have not found widespread adoption...

CREDIT: JAMES EAST

Computational models of air quality have long been used to shed light on pollution control efforts in the United States and Europe, but the tools have not found widespread adoption in Latin America. New work from North Carolina State University and Universidad de La Salle demonstrates how these models can be adapted to offer practical insights into air quality challenges in the Americas outside the U.S.

Computational air quality models can be used in multiple ways. For example, they can be used to determine which sources are responsible for what fraction of air pollution. They can also help authorities predict how air pollution might change if different pollution control methods are adopted.
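Both uses can be sketched with a deliberately simplified linear apportionment: assume each source's contribution to ambient PM2.5 scales with its emissions, then zero out a source to project a control scenario. Real air quality models, including the one adapted in this work, resolve atmospheric chemistry and transport; every number below is hypothetical.

```python
# Hypothetical annual emissions by source (tonnes/year)
emissions = {
    "unpaved road dust": 1200,
    "heavy-duty vehicles": 900,
    "motorcycles": 300,
    "industry": 600,
}
total = sum(emissions.values())

# Source apportionment: each source's share of total emissions
shares = {src: e / total for src, e in emissions.items()}

# Scenario projection: paving roads eliminates road dust, other sources unchanged
scenario = dict(emissions, **{"unpaved road dust": 0})
reduction = 1 - sum(scenario.values()) / total

print(shares)
print(f"emissions reduction under paving scenario: {reduction:.0%}")
```

A linear sketch like this ignores secondary particle formation and meteorology, which is precisely why full modeling frameworks are needed for decisions in a real city.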

"Historically, it's been very challenging to apply these modeling tools in Latin America, so it has rarely been done," says Fernando Garcia Menendez, corresponding author of a paper on the work and an assistant professor of environmental engineering at NC State. "This is important because the region has many areas that are dealing with significant air pollution, and these modeling tools can help governments identify the most cost-effective ways of achieving air quality improvements."

One challenge to using computational air quality models in Latin America is that the relevant modeling frameworks were developed largely in the context of the U.S. and Europe. That means that some of the assumptions that modelers took for granted when developing the tools don't always apply in Latin American cities. Furthermore, computational resources and trained environmental modelers are still scarce in the region.

For example, there is often substantially less air emissions data available. In addition, some contributors to air pollution are common across Latin American metro areas but differ from what we see in the U.S. - more unpaved roads, an older cargo fleet, a large number of motorcycles, informal economies, and so on.

With that in mind, Garcia Menendez developed a research project with collaborators at the Universidad de La Salle, in Bogotá, Colombia. Specifically, the research team fine-tuned a modeling framework to reflect the air pollution dynamics in Bogotá and investigate the city's air quality problems. The collaborators at Universidad de La Salle also collected air pollution data that allowed the team to assess the accuracy of its modeling results.

"Our paper outlines the techniques we've used to perform computational modeling of air quality issues in a large Latin American city," says James East, first author of the paper and a Ph.D. student at NC State. "This not only demonstrates that it can be done, but provides an approach that others can use to provide insights into air pollution in other parts of the region that are experiencing similar issues."

While the paper focuses on an air quality model for fine particulate matter (PM2.5), the researchers say that the model could be used to look at other air pollutants. Exposure to PM2.5 is associated with a wide variety of health problems, including heart and lung disease.

In their proof-of-concept demonstration, the researchers found that the largest local sources of PM2.5 in Bogotá were dust from unpaved roads and emissions from heavy-duty vehicles. However, when the model was used to project future air quality, the study also found that while paving roads would decrease air pollution in some parts of the city, different emission sources would still lead to increased air pollution in other parts of the city - unless other emission control measures were also implemented.

In short, the model offered practical insights into possible solutions for a complex metropolitan area of 10 million people.

"These findings are of interest to environmental authorities, from the local to the national level, who are pursuing ways to effectively address air pollution in Bogotá and other Colombian cities," says Jorge Pachon, a co-author of the paper and an associate professor at the Universidad de La Salle.

###

The paper, "Air quality modeling to inform pollution mitigation strategies in a Latin American megacity," is published in Science of The Total Environment. The paper was co-authored by Juan Sebastian Montealegre of the Universidad de La Salle. The work was partially funded by Ecopetrol, Colombia's national petroleum company.