Wednesday, August 28, 2024

 

Mizzou researchers explore solutions to help reduce nurse burnout



Study finds giving nurses massages during their shifts may improve physical health and mental well-being



University of Missouri-Columbia

Image: Massage therapy. Credit: University of Missouri





COLUMBIA, Mo. -- Even before the coronavirus pandemic, high rates of burnout and staffing shortages plagued the nursing industry, primarily because of the stressful demands of the job. The COVID-19 pandemic only amplified these challenges, and with nearly a third of all Missouri nurses nearing retirement, improving nurse retention is key to avoiding an impending nursing workforce crisis in our state.

Despite dozens of studies proving burnout is an issue, few provide interventions to help nurses — and their patients — overcome its challenges.

A recent study by the University of Missouri has found that a simple and common-sense solution — giving nurses massages during their work shift — not only reduces their physical aches and pains but also leaves them feeling mentally rejuvenated to return to work. The findings can help leaders in health care and other industries with high rates of burnout consider the impact massages or other interventions can have on improving employee well-being and reducing high rates of staff turnover.

From the bedside to the research lab

Jennifer Hulett has been a nurse for 30 years and is now a researcher at Mizzou’s Sinclair School of Nursing. She knows firsthand how 12-hour shifts lead to physical aches and pains, chronic stress and a high rate of burnout.

With so many nurses leaving the profession for less stressful careers, burnout has created a constantly revolving door of staffing shortages throughout the nursing industry; on average, a new nurse burns out within 18 months on the job.

“I have seen over the years the physical and mental toll the job puts on nurses, and many nurses are not healthy as a result,” Hulett said. “It is unfortunate because nurses dedicate their careers to taking care of their patients, but no one is taking care of the nurses. I’m determined to change the culture of the nursing industry in a way that improves well-being through mind-body interventions.”

In the recently published study, nurses were surveyed on their physical symptoms related to aches and pains as well as their mental well-being before and after receiving 15-minute massages twice per week during their work shift for a month.

“After just one month of the intervention, the nurses reported fewer aches and pains after receiving the massages,” Hulett said. “Perhaps the most important finding was the nurses often reported feeling rejuvenated to go back to work after the massages, improving their overall mental well-being.”

Hulett’s main objective is to create a healthier work environment where nurses are excited to go to work and want to remain in the profession long-term. Massages could be just one intervention among a toolbox of options nurses could potentially choose from. She added that more research is needed to further explore other types of interventions to see which ones might be most effective in improving employee well-being.

“It is time to start thinking outside the box because if we do nothing, current staffing shortages will continue to get worse,” Hulett said. “We have seen over the years that it becomes a vicious cycle where you are constantly hiring new nurses without enough experienced nurses to mentor and train the new hires. This ultimately impacts the quality of care that gets delivered to patients, and that is another critical topic future research can explore.”

While this particular study focused on burnout in the nursing industry, Hulett added that other healthcare professionals and professions that experience high rates of staffing shortages because of burnout could benefit from this type of intervention.

“Massage therapy for hospital-based nurses: A proof-of-concept study” was published in Complementary Therapies in Clinical Practice. Hulett collaborated on the study with Susan Scott.

 

Algorithm raises new questions about Cascadia earthquake record




University of Texas at Austin
Image: Research Professors Zoltán Sylvester (left) and Jacob Covault in the core viewing facility at The University of Texas at Austin’s Bureau of Economic Geology. An algorithm they developed for correlating turbidites in geologic cores is raising questions about the earthquake record of Cascadia. Examples of turbidites from Cascadia are shown on the screen behind them. Credit: The University of Texas at Austin/Jackson School of Geosciences




The Cascadia subduction zone in the Pacific Northwest has a history of producing powerful and destructive earthquakes that have sunk forests and spawned tsunamis that reached all the way to the shores of Japan.

The most recent great earthquake was in 1700. But it probably won’t be the last. And the area that stands to be affected is now home to bustling metropolises with millions of people.

Figuring out the frequency of earthquakes – and when the next “big one” will happen – is an active scientific question that involves looking for signs of past earthquakes in the geologic record in the form of shaken up rocks, sediment and landscapes.

However, a study by scientists at The University of Texas at Austin and collaborators is calling into question the reliability of an earthquake record that covers thousands of years – a type of geologic deposit called a turbidite that’s found in the strata of the seafloor.  

The researchers analyzed a selection of turbidite layers from the Cascadia subduction zone dating back about 12,000 years with an algorithm that assessed how well the layers correlated with one another.

They found that, in most cases, the correlation between the turbidite samples was no better than random. Since turbidites can be caused by a range of phenomena, and not just earthquakes, the results suggest that the turbidite record’s connection to past earthquakes is more uncertain than previously thought.     

"We would like everyone citing the intervals of Cascadia subduction earthquakes to understand that these timelines are being questioned by this study," said Joan Gomberg, a research geophysicist at the U.S. Geological Survey and study co-author. "It’s important to conduct further research to refine these intervals. What we do know is that Cascadia was seismically active in the past and will be in the future, so ultimately, people need to be prepared."

The results don’t necessarily change the estimated earthquake frequency in Cascadia, which is about every 500 years, said the researchers. The current frequency estimate is based on a range of data and interpretations, not just the turbidites analyzed in this study. However, the results do highlight the need for more research on turbidite layers, specifically, and how they relate to each other and large earthquakes.

Co-author Jacob Covault, a research professor at the UT Jackson School of Geosciences, said the algorithm offers a quantitative tool that provides a replicable method for interpreting ancient earthquake records, which are usually based on more qualitative descriptions of the geology and its potential associations.

“This tool provides a repeatable result, so everybody can see the same thing,” said Covault, the co-principal investigator of the Quantitative Clastics lab at the Jackson School’s Bureau of Economic Geology. “You can potentially argue with that result, but at least you have a baseline, an approach that is reproducible.”  

The results were published in the journal GSA Bulletin. The study includes researchers from the USGS, Stanford University and the Alaska Division of Geological & Geophysical Surveys.  

Turbidites are the remnants of underwater landslides. They’re made of sediments that settled back down to the seafloor after being flung into the water by the turbulent motion of sediment rushing across the ocean floor. The sediment in these layers has a distinctive gradation, with coarser grains at the bottom and finer ones at the top.

But there’s more than one way to make a turbidite layer. Earthquakes can cause landslides when they shake up the seafloor. But so can storms, floods and a range of other natural phenomena, albeit on a smaller geographic scale.

Currently, connecting turbidites to past earthquakes usually involves finding them in geologic cores taken from the seafloor. If a turbidite shows up in roughly the same spot in multiple samples across a relatively large area, it’s counted as a remnant of a past earthquake, according to the researchers.

Although carbon dating samples can help narrow down timing, there’s still a lot of uncertainty in interpreting if samples that appear at about the same time and place are connected by the same event.

Getting a better handle on how different turbidite samples relate to one another inspired the researchers to apply a more quantitative method – an algorithm called “dynamic time warping” – to the turbidite data. The algorithmic method dates back to the 1970s and has a wide range of applications, from voice recognition to smoothing out graphics in dynamic VR environments.

This is the first time it has been applied to analyzing turbidites, said co-author Zoltán Sylvester, a research professor at the Jackson School and co-principal investigator of the Quantitative Clastics Lab, who led the adaption of the algorithm for analyzing turbidites.  

“This algorithm has been a key component of a lot of the projects I have worked on,” said Sylvester. “But it’s still very much underused in the geosciences.”

The algorithm detects similarity between two signals that may vary in timing or pace and determines how closely the data in one matches the other.

For voice recognition software, that means recognizing key words even though they might be spoken at different speeds or pitches. For the turbidites, it involves recognizing shared magnetic properties between different turbidite samples that may look different from location to location despite originating from the same event.

“Correlating turbidites is no simple task,” said co-author Nora Nieminski, the coastal hazards program manager for the Alaska Division of Geological & Geophysical Surveys. “Turbidites commonly demonstrate significant lateral variability that reflect their variable flow dynamics. Therefore, it is not expected for turbidites to preserve the same depositional character over great distances, or even small distances in many cases, particularly along active margins like Cascadia or across various depositional environments.”

The researchers also subjected the correlations produced by the algorithm to another level of scrutiny. They compared the results with correlations calculated from synthetic data: 10,000 pairs of randomly generated turbidite layers. This synthetic comparison served as a control against coincidental matches in the actual samples.
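
To make those two steps concrete, here is a minimal, self-contained Python sketch of how a dynamic-time-warping comparison and a random-pair control might look. The toy signals, the reduced pair count (500 rather than the study’s 10,000, to keep it quick), and all names are illustrative assumptions, not the authors’ code or data.

```python
# Hedged sketch: (1) a basic dynamic time warping (DTW) cost between two 1-D
# logs, and (2) a null distribution built from random synthetic log pairs to
# judge whether an observed pair is more similar than chance. Everything here
# is a toy stand-in for the study's magnetic susceptibility data.
import numpy as np

def dtw_distance(a, b):
    """DTW alignment cost between two 1-D signals (smaller = more similar)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])              # local mismatch
            cost[i, j] = d + min(cost[i - 1, j],      # stretch one signal
                                 cost[i, j - 1],      # stretch the other
                                 cost[i - 1, j - 1])  # step both together
    return cost[n, m]

rng = np.random.default_rng(0)

def random_log(length=60):
    """Smoothed random walk standing in for a synthetic turbidite log."""
    walk = np.cumsum(rng.normal(size=length))
    return (walk - walk.mean()) / walk.std()

# Two toy "cores" recording the same pulse, stretched differently with depth.
depth_a, depth_b = np.linspace(0, 1, 60), np.linspace(0, 1, 90)
log_a = np.exp(-((depth_a - 0.4) ** 2) / 0.01)
log_b = np.exp(-((depth_b - 0.5) ** 2) / 0.02)
observed = dtw_distance(log_a, log_b)

# Null distribution from random pairs (the study used 10,000; 500 is enough here).
null_costs = np.array([dtw_distance(random_log(), random_log()) for _ in range(500)])
p_value = np.mean(null_costs <= observed)   # fraction of random pairs at least as similar

print(f"Observed DTW cost {observed:.1f}, empirical p = {p_value:.2f}")
```

Under this framing, a correlation between two cores is only taken seriously if their observed DTW cost beats most random pairs, the same kind of logic the researchers applied to the real magnetic susceptibility logs.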

The researchers applied their technique to magnetic susceptibility logs for turbidite layers in nine geologic cores that were collected during a scientific cruise in 1999. They found that in most cases, the connection between turbidite layers that had been previously correlated was no better than random.  The only exception to this trend was for turbidite layers that were relatively close together – no more than about 15 miles apart. 

The researchers emphasize that the algorithm is just one way of analyzing turbidites, and that the inclusion of other data could change the degree of correlation between the cores one way or another. But according to these results, the presence of turbidites at the same time and general area in the geologic record is not enough to definitively connect them to one another.

And although algorithms and machine learning approaches can help with that task, it’s up to geoscientists to interpret the results and see where the research leads.

“We are here for answering questions not just applying the tool,” Sylvester said. “But at the same time, if you are doing this kind of work, then it forces you to think very carefully.”   

Image: A figure comparing the results of earlier turbidite correlation research to results calculated by an algorithm developed at The University of Texas at Austin. Black dashed lines indicate similar research results; red dashed lines indicate different results. Credit: Zoltan Sylvester

 

Illinois researchers develop near-infrared spectroscopy models to analyze corn kernels, biomass




University of Illinois College of Agricultural, Consumer and Environmental Sciences
Image: University of Illinois researchers developed a global model for corn kernel analysis with NIR spectroscopy. Credit: College of ACES




URBANA, Ill. – In the agricultural and food industry, determining the chemical composition of raw materials is important for production efficiency, application, and price. Traditional laboratory testing is time-consuming, complicated, and expensive. New research from the University of Illinois Urbana-Champaign demonstrates that near-infrared (NIR) spectroscopy and machine learning can provide quick, accurate, and cost-effective product analysis.

In two studies, the researchers explore the use of NIR spectroscopy for analyzing characteristics of corn kernels and sorghum biomass.

“NIR spectroscopy has many advantages over traditional methods. It is fast, accurate, and inexpensive. Unlike lab analysis, it does not require the use of chemicals, so it’s more environmentally sustainable. It does not destroy the samples, and you can analyze multiple features at the same time. Once the system is set up, anyone can run it with minimal training,” said Mohammed Kamruzzaman, assistant professor in the Department of Agricultural and Biological Engineering (ABE), part of the College of Agricultural, Consumer and Environmental Sciences and The Grainger College of Engineering at U. of I. He is a co-author on both papers.

In the first study, the researchers created a global model for corn kernel analysis. Moisture and protein content impact nutritional value, processing efficiency, and price of corn, so the information is crucial for the grain processing industry. 

NIR and other spectroscopic techniques are indirect methods. They measure how a material absorbs or emits light at different wavelengths, then construct a unique spectrum that is translated into product characteristics with machine learning models. Many food and agricultural processing facilities already have NIR equipment, but models need to be trained for specific purposes.

“Corn grown in different locations varies because of soil, environment, management, and other factors. If you train the model with corn from one location, it will not be accurate elsewhere,” Kamruzzaman said.

To address this issue and develop a model that applies in many different locations, the researchers collected corn samples from seven countries – Argentina, Brazil, India, Indonesia, Serbia, Tunisia, and the USA. 

“To analyze moisture and protein in the corn kernels, we combined gradient-boosting machines with partial least squares regression. This is a novel approach that yields accurate, reliable results,” said Runyu Zheng, a doctoral student in ABE and lead author on the first study.
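
As a rough, hedged illustration of that combination (not the published model), the sketch below uses scikit-learn to rank synthetic NIR wavelengths with a gradient-boosting machine and then fit a partial least squares regression on the selected bands. The spectra, the number of retained bands, and the hyperparameters are all assumptions made for the example.

```python
# Illustrative sketch: gradient-boosting feature selection followed by PLS
# regression on NIR-like spectra. Data and settings are invented, not the
# study's corn kernel calibration.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)

# Synthetic stand-in for kernel spectra: 300 samples x 700 wavelengths,
# with "protein" driven by a handful of informative bands.
n_samples, n_wavelengths = 300, 700
X = rng.normal(size=(n_samples, n_wavelengths))
informative = [120, 121, 122, 450, 451, 600]
protein = X[:, informative].sum(axis=1) + rng.normal(scale=0.5, size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(X, protein, random_state=0)

# Step 1: rank wavelengths by gradient-boosting feature importance.
gbm = GradientBoostingRegressor(n_estimators=200, random_state=0)
gbm.fit(X_train, y_train)
top = np.argsort(gbm.feature_importances_)[::-1][:30]   # keep 30 "best" bands

# Step 2: fit PLS regression on the selected wavelengths only.
pls = PLSRegression(n_components=5)
pls.fit(X_train[:, top], y_train)
pred = pls.predict(X_test[:, top]).ravel()
print(f"R^2 on held-out spectra: {r2_score(y_test, pred):.2f}")
```

The intent of a feature-selection step like this is to prune hundreds of uninformative wavelengths before the PLS model is fit, which generally keeps the final calibration simpler.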

While the model is not 100% global, it provides considerable variability in the data and will work in many locations. If needed, it can be updated with additional samples from new locations, Kamruzzaman noted.

In the second study, the researchers focused on sorghum biomass, which can serve as a renewable, cost-effective, and high-yield feedstock for biofuel.

Biomass conversion into biofuels depends on chemical composition, so a rapid and efficient method of sorghum biomass characterization could assist biofuel, breeding, and other relevant industries, the researchers explained.

Using sorghum from the University of Illinois Energy Farm, they were able to accurately and reliably predict moisture, ash, lignin, and other features. 

“We first scanned the samples and obtained NIR spectra as an output. This is like a fingerprint that is unique to different chemical compositions and structural properties. Then we used chemometrics – a mathematical-statistical approach – to develop the prediction models and applications,” said Md Wadud Ahmed, a doctoral student in ABE and lead author on the second paper.
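
The following sketch outlines that fingerprint-to-prediction workflow with standard chemometrics steps: Savitzky-Golay derivative preprocessing followed by a multi-target PLS model. The spectra, reference values, and settings are invented placeholders and do not reflect the actual sorghum data or the published models.

```python
# Illustrative chemometrics workflow: smooth/derivative preprocessing of NIR
# spectra, then one PLS model predicting several biomass properties at once.
# All numbers are made up for the example.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

# Fake sorghum spectra (200 samples x 500 wavelengths) and three lab-measured
# reference properties: moisture, ash, lignin.
X = np.cumsum(rng.normal(size=(200, 500)), axis=1)      # smooth-ish baselines
Y = rng.normal(loc=[10.0, 5.0, 18.0], scale=1.0, size=(200, 3))

# First-derivative Savitzky-Golay preprocessing, a common chemometrics step.
X_prep = savgol_filter(X, window_length=11, polyorder=2, deriv=1, axis=1)

pls = PLSRegression(n_components=8)
pls.fit(X_prep, Y)                                       # multi-target fit
moisture, ash, lignin = pls.predict(X_prep[:1]).ravel()
print(f"Predicted moisture {moisture:.1f}%, ash {ash:.1f}%, lignin {lignin:.1f}%")
```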

While NIR spectroscopy is not as accurate as lab analysis, it is more than sufficient for practical purposes and can provide fast, efficient screening methods for industrial use, Kamruzzaman said.

“A major advantage of this technology is that you don’t need to remove and destroy products. You can simply take samples for measurement, scan them, and then return them to the production stream. In some cases, you can even scan the samples directly in the production line. NIR spectroscopy provides a lot of flexibility for industrial usage,” he concluded. 

The first paper, “Optimizing feature selection with gradient boosting machines in PLS regression for predicting moisture and protein in multi-country corn kernels via NIR spectroscopy,” is published in Food Chemistry [DOI: 10.1016/j.foodchem.2024.140062].

The second paper, “Rapid and high-throughput determination of sorghum (Sorghum bicolor) biomass composition using near infrared spectroscopy and chemometrics,” is published in Biomass and Bioenergy [DOI: 10.1016/j.biombioe.2024.107276]. This work was funded by the DOE Center for Advanced Bioenergy and Bioproducts Innovation (U.S. Department of Energy, Office of Science, Biological and Environmental Research Program under Award Number DE-SC0018420).

Image: Md Wadud Ahmed, a doctoral student at the University of Illinois, used NIR spectroscopy and machine learning to analyze the composition of sorghum biomass. Credit: College of ACES

 

Plenty of ups-and-downs are key to a great story, new research finds




University of Toronto, Rotman School of Management
Image: Samsun Knight is an assistant professor of marketing at University of Toronto’s Rotman School of Management and faculty affiliate at the University of Toronto School of Cities, where he studies quantitative retail marketing, optimal targeting and machine learning. He also researches computational text analysis and natural-language processing, with a focus on measuring and understanding narratives. Separately, he is a writer and graduate of the Iowa Writers’ Workshop, where he was a Truman Capote Fellow. His debut novel, The Diver, was recently published by University of Iowa Press. His next novel, Likeness, is forthcoming in 2025. Credit: Samsun Knight




Toronto - Since at least Aristotle, writers and scholars have debated what makes for a great story. One of them is Samsun Knight, a novelist who is also an economist and assistant professor of marketing at the University of Toronto’s Rotman School of Management. With a scientist’s tools, he’s done what previous theorizers have failed to: put theory to the test and demonstrate the key factor for empirically predicting which stories will be snore fests and which will leave audiences hungry for more.

It turns out to be “narrative reversals” -- lots of them, and the bigger the better. Commonly known as changes of fortune or turning points, these are moments where characters’ fortunes swing from good to bad and vice versa. Prof. Knight and fellow researchers found that stories rich in reversals boosted popularity and engagement with audiences across a range of media, from television to crowdfunding pitches.

“The best-written stories were always either ‘building up’ a current reversal, or introducing a new plot point,” says Prof. Knight. “In our analysis, the best writers were those that were able to maintain both many plot points and strong build-up for each plot point across the course of the narrative.”

The researchers analyzed nearly 30,000 television shows, movies, novels, and crowdfunding pitches using computational linguistics, a blend of computer science and language analysis. This allowed them to quantify not only the number of reversals in a text but also their degree, or intensity, by assigning numerical values to words based on how positive or negative they were.
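
As a toy illustration of that idea (not the study’s pipeline), the sketch below scores each sentence of a text with a tiny valence lexicon and then counts sign flips in the resulting sentiment arc, along with the size of each swing. The word list and the example story are invented.

```python
# Hedged illustration: sentence-level valence scores, then count reversals
# (sign changes) and sum their magnitudes. The lexicon and story are placeholders.
import re

VALENCE = {"joy": 2.0, "love": 1.5, "win": 1.5, "hope": 1.0,
           "loss": -1.5, "fear": -1.0, "ruin": -2.0, "betray": -2.0}

def sentence_score(sentence):
    """Mean valence of the known words in a sentence (0 if none match)."""
    words = re.findall(r"[a-z]+", sentence.lower())
    hits = [VALENCE[w] for w in words if w in VALENCE]
    return sum(hits) / len(hits) if hits else 0.0

def reversals(text):
    """Return (count, total magnitude) of sign flips in the sentiment arc."""
    scores = [sentence_score(s) for s in re.split(r"[.!?]", text) if s.strip()]
    count, magnitude = 0, 0.0
    prev = None
    for s in scores:
        if s == 0.0:
            continue                      # skip neutral sentences
        if prev is not None and (s > 0) != (prev > 0):
            count += 1
            magnitude += abs(s - prev)    # size of the swing
        prev = s
    return count, magnitude

story = "They win the prize and feel joy. Then ruin arrives. Hope returns at last."
print(reversals(story))   # e.g. (2, ...): good -> bad -> good
```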

Movies and television shows with more and bigger reversals were better rated on the popular ratings site IMDb. Books with the most and biggest reversals were downloaded more than twice as much as books with the fewest reversals from the free online library Project Gutenberg. And GoFundMe pitches with more and larger reversals were more likely to hit their fundraising goal, by as much as 39 per cent.

The Greek philosopher Aristotle was the first to identify peripeteia, the sudden reversal of circumstances, as a key feature of a good story. Other thinkers have added their ideas since then, including American playwright and dramaturg Leon Katz whose scholarship particularly inspired Prof. Knight’s research. Katz “described the reversal as the basic unit of narrative, just as a sentence is the basic unit of a paragraph, or the syllogism is the basic unit of a logical proof,” says Prof. Knight.

In addition to helping psychologists understand how narrative works to educate, inform and inspire people, the findings may also benefit storytellers of all kinds.

“Hopefully our research can help build a pedagogy for writers that allows them to rely on the accumulated knowledge of Aristotle et al. without having to ‘reinvent the wheel’ on their own every time,” says Prof. Knight.

That includes himself. With another novel on the way, he was recently working on a chapter with a big reveal. “I realized that this drop might hit harder if I gave the character more positive moments before pulling the rug out from under them,” he says.

The research was co-authored with Matthew D. Rocklage and Yakov Bart, both of Boston’s Northeastern University. It appears in Science Advances.

Bringing together high-impact faculty research and thought leadership on one searchable platform, the Rotman Insights Hub offers articles, podcasts, opinions, books and videos representing the latest in management thinking and providing insights into the key issues facing business and society. Visit www.rotman.utoronto.ca/insightshub.

The Rotman School of Management is part of the University of Toronto, a global centre of research and teaching excellence at the heart of Canada’s commercial capital. Rotman is a catalyst for transformative learning, insights and public engagement, bringing together diverse views and initiatives around a defining purpose: to create value for business and society. For more information, visit www.rotman.utoronto.ca

-30-

For more information:

Ken McGuffin

Manager, Media Relations

Rotman School of Management

University of Toronto

E-mail: mcguffin@rotman.utoronto.ca

 

 

Keeping native bees buzzing requires rethinking pest control



New research adds solid evidence to the suspicion that steep declines in America’s wild bee populations stem in large part from the use of pesticides. Saving the crucial pollinators requires new approaches to managing pesky insects, say the USC Dornsife researchers behind the study.




University of Southern California

Image: Valley carpenter bee. Credit: Teagan Baiotto, PhD student in the Ecological Data Science Lab at the University of Southern California Dornsife





Whether you’re strolling through a garden, wandering a park, or simply enjoying an open space in the United States, you’re likely to notice bees buzzing about the flowers. While honeybees, imported from Europe in the 17th century to produce honey, are the most recognizable, they aren’t the only bees at work. If you’re a keen observer, you might spot some of the thousands of less familiar, native bee species that call these spaces home. 

Native wild bees play a crucial ecological role, ensuring the survival and reproduction of countless plant species — including many agricultural crops — by spreading pollen as they forage for food. Unfortunately, their numbers seem to be declining, and despite experts suggesting multiple causes, the exact reason remains a mystery.

A new study published in Nature Sustainability sheds light on one potential cause: pesticide use. The research reveals a stark decline in the number of wild bee sightings, with appearances of some species dropping as much as 56% in areas of high pesticide use compared to areas with no pesticide use.

The study points to pesticides as a significant factor in wild bee decline and suggests that alternative pest control methods, such as those proposed by the U.S. Environmental Protection Agency, could reduce the damage.

Pesticide effects on wild bee populations scrutinized

Loss of wild bees could disrupt entire ecosystems, affecting not just plants but also the wildlife that depend on those plants for food and habitat. The multibillion-dollar agricultural industry could also suffer; wild bees, alongside honeybees, play a crucial role in pollinating three-quarters of food crops and nearly 90% of flowering plant species.

Recognizing the urgent threat posed by bee population declines, Laura Melissa Guzman of the USC Dornsife College of Letters, Arts and Sciences, along with an international team of researchers, set out to investigate the impact of pesticides on wild bees. They also examined the effects of agricultural practices and how the presence of honeybee colonies might influence wild bee populations.

Guzman, Gabilan Assistant Professor of Biological Sciences and Quantitative and Computational Biology, and the team inspected museum records, ecological surveys and community science data collected between 1996 and 2015 from across the contiguous United States. 

Using advanced computational methods, they sifted through more than 200,000 unique observations of over 1,000 species — representing one-third of all known bee species in the U.S. — to assess how frequently different species were observed in various locations.

 In addition, they analyzed data from several government sources, such as the U.S. Geological Survey’s National Land Cover Database and Pesticide National Synthesis Project. The former tracks U.S. land cover types (crop, urban, forest, wetland, etc.) with snapshots taken every two to three years from 2001 to 2016, while the latter provides detailed data on pesticide use by county from 1992 to 2021. 

By integrating these resources, the researchers correlated factors such as land use, pesticide application, honeybee colony presence, and types of agricultural crops with wild bee sightings over the past two to three decades.
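
For readers curious what such an integration might look like in code, here is a speculative sketch in which county-year tables of bee sightings, pesticide use, and land cover are merged and fed to a simple Poisson regression. The column names, toy numbers, and model choice are assumptions for illustration; the study’s actual computational methods were considerably more sophisticated.

```python
# Speculative sketch of county-level data integration and a simple model of
# sighting counts. All tables and values are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical inputs: one row per county-year.
bees = pd.DataFrame({
    "county": ["A", "A", "B", "B"], "year": [2005, 2010, 2005, 2010],
    "sightings": [40, 22, 35, 30],
})
pesticides = pd.DataFrame({
    "county": ["A", "A", "B", "B"], "year": [2005, 2010, 2005, 2010],
    "kg_applied": [120.0, 400.0, 80.0, 95.0],
})
landcover = pd.DataFrame({
    "county": ["A", "B"], "pct_cropland": [55.0, 20.0],
})

df = (bees.merge(pesticides, on=["county", "year"])
          .merge(landcover, on="county"))

# Poisson regression of sightings on pesticide use and cropland share.
model = smf.glm("sightings ~ kg_applied + pct_cropland",
                data=df, family=sm.families.Poisson()).fit()
print(model.summary())
```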

Pesticides emerge as a top factor harming wild bees

The research provides compelling evidence that pesticide use is a major contributor to the declining numbers of wild bees. The study found a strong correlation between pesticide use and fewer wild bee sightings, suggesting a direct link between pesticide exposure and bee population declines. 

Some scientists have speculated that certain crops might adversely affect wild bees. However, Guzman and the team uncovered evidence to the contrary. Among crops frequented by pollinators, they found just as many wild bees in counties with extensive agriculture as in counties with little.

Interestingly, the study hinted that the presence of colonies of honeybees, an invasive species, had almost no effect on wild bee populations, despite some evidence to the contrary. The researchers caution, however, that they need more detailed data and further study to confirm this conclusion.

“While our calculations are sophisticated, much of the spatial and temporal data is coarse,” Guzman said. “We plan to refine our analysis and fill in the gaps as much as possible.”

Wild bees need alternative pest management methods 

The researchers view their findings as compelling evidence that alternative pest control strategies, such as integrated pest management, are essential for conserving these crucial pollinators. 

Integrated pest management involves controlling pests by using natural predators, modifying practices to reduce pest establishment, and using traps, barriers and other physical means, with pesticide use reserved as a last resort.

The team also emphasizes the need for more long-term studies that collect data on more localized bee populations over extended periods. “We need to combine these large-scale studies that span continents with field experiments that expose bees to chemicals over longer periods and under natural conditions to get a clearer picture of how these chemicals affect bees,” Guzman said.

Building a case for better pesticide risk assessment 

The current study builds on work published earlier this year by Guzman and scientists from Washington State University and Canada’s Université Laval. That study found that ecological risk assessments (ERAs) underestimate pesticide threats to wild bees and other pollinators.

Currently, ERAs measure pesticide effects on honeybees, often in lab studies, then extrapolate those findings to native bee species. However, Guzman and her colleagues revealed that current ERAs vary wildly — as much as a million-fold — when estimating how lethal pesticides are just to honeybees. And many wild bees are even more sensitive to pesticides, compounding the problem, the research showed.

“When we only focus on the western honeybee, we’re ignoring the unique responses of other wild bee species to pesticide exposure,” Guzman said, calling for regulatory agencies, scientists and policymakers to rethink ERA methods.

“More data and analysis on the long-term effects of pesticides will help guide these efforts to the benefit of all pollinators, including wild bees,” Guzman said.

About the study

In addition to corresponding author Guzman, study authors include Elizabeth Elle and Leithen M’Gonigle of Simon Fraser University; Lora Morandin of the Pollinator Partnership; Neil Cobb of Biodiversity Outreach Network (BON); Paige Chesshire of BON and Northern Arizona University; Lindsie McCabe of the USDA-ARS Pollinating Insects Research Unit; Alice Hughes of the University of Hong Kong; and Michael Orr of State Museum of Natural History Stuttgart.

 


Race and ethnicity and diagnostic testing for common conditions in the acute care setting



 News Release 

JAMA Network





About The Study: White patients discharged from the emergency department with a nonspecific diagnosis of interest were significantly more likely than Black patients to receive related diagnostic testing in this study. The extent to which this represents diagnostic test overuse in white patients vs undertesting and missed diagnoses in Black patients deserves further study. 

Corresponding Author: To contact the corresponding author, Michael I. Ellenbogen, M.D., email mellenb6@jhmi.edu.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2024.30306)

Editor’s Note: Please see the article for additional information, including other authors, author contributions and affiliations, conflict of interest and financial disclosures, and funding and support.

#  #  #

Embed this link to provide your readers free access to the full-text article. This link will be live at the embargo time: http://jamanetwork.com/journals/jamanetworkopen/fullarticle/10.1001/jamanetworkopen.2024.30306?utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_term=082724

About JAMA Network Open: JAMA Network Open is an online-only open access general medical journal from the JAMA Network. On weekdays, the journal publishes peer-reviewed clinical research and commentary in more than 40 medical and health subject areas. Every article is free online from the day of publication.