Monday, April 22, 2024

 

Unveiling the secrets of Montesinho's honey: a blend of tradition and science



ZHEJIANG UNIVERSITY
IMAGE: Principal component analysis (PCA) biplot for the 15 honey parameters under study. Circles represent honey samples from Montesinho Natural Park (MNP); triangles represent honey samples from the boundary limits of the MNP (red triangles, M2; yellow triangles, M1); squares represent commercial honey samples (dark green squares, C2; soft green squares, C1).

CREDIT: Food Quality and Safety




A study illuminates the physicochemical properties and nutritional value of honey from Montesinho Natural Park (MNP), a cherished natural reserve in Portugal. This research serves as both a tribute to the enduring heritage of Portuguese honey and a significant advancement in comprehending how geographical factors influence honey's quality. 

Montesinho Natural Park (MNP) is celebrated for its diverse flora and unique climate, creating an ideal environment for producing honey with distinctive attributes. With the demand for natural products and Protected Designation of Origin (PDO) honey on the rise, understanding the nuances that contribute to honey quality becomes paramount.

Published in the prestigious Food Quality and Safety journal on March 11, 2024, a new study (DOI: 10.1093/fqsafe/fyae015) by researchers from REQUIMTE/LAQV and Instituto Superior de Engenharia do Porto offers a comprehensive analysis of various honey samples from MNP. The study reveals that the honey consistently meets quality and regulatory standards.

The study conducted a thorough analysis of 13 honey samples from diverse apiaries. Employing sophisticated techniques, researchers assessed crucial physicochemical and nutritional variables, such as moisture, ash content, acidity, electrical conductivity, proteins, carbohydrates, and minerals. Additionally, they looked into specific indicators like 5-hydroxymethylfurfural, diastase activity, and color. Across the board, the honey demonstrated remarkable uniformity in quality, consistently adhering to or surpassing the rigorous criteria set by honey regulations. Notably, the study found that the honey's excellence was unaffected by MNP's varying geographical terrains, indicating a minimal impact of location on its key qualities. The uniformity in quality across different apiaries within MNP confirmed the honey's compliance with legislative standards and its alignment with established quality benchmarks, showcasing MNP honey's adherence to its esteemed PDO status.
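The kind of principal component analysis shown in the study's biplot can be sketched in a few lines. The values below are randomly generated placeholders, not the study's measurements, and all names are illustrative; the code demonstrates the technique, not the authors' analysis:

```python
import numpy as np

# 13 honey samples x 5 of the 15 measured parameters (e.g. moisture,
# ash, acidity, conductivity, HMF) -- values are random stand-ins.
rng = np.random.default_rng(0)
X = rng.normal(size=(13, 5))

# Standardize each parameter, then compute principal components via SVD.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)

scores = U * S                    # sample coordinates in PC space
loadings = Vt.T                   # parameter directions (the biplot arrows)
explained = S**2 / np.sum(S**2)   # variance explained per component

print(scores[:, :2].shape)  # (13, 2): coordinates for a 2-D biplot
```

Plotting the first two columns of `scores` together with the `loadings` arrows reproduces the familiar biplot layout, with nearby points indicating samples with similar parameter profiles.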

Professor Cristina Delerue-Matos, a leading authority in food quality and one of the study’s principal investigators, emphasized the importance of this research: "Our findings not only confirm the superior quality of MNP honey but also highlight the negligible impact of geographical variations within the park on its physicochemical properties. This underscores the park's unique biodiversity and the consistent practices of local beekeepers in maintaining honey quality."

This research not only reinforces the prestigious PDO status of MNP honey but also illuminates the path for future studies aimed at preserving the legacy and integrity of honey production in natural reserves amidst changing environmental conditions.

###

References

DOI

10.1093/fqsafe/fyae015

Original Source URL

https://doi.org/10.1093/fqsafe/fyae015

Funding information

S. Soares and M. Moreira thank FCT (Fundação para a Ciência e Tecnologia) for funding through the Scientific Employment Stimulus—Individual Call (CEECIND/00588/2022 and CEECIND/02702/2017, respectively). The authors also thank the project SYSTEMIC “an integrated approach to the challenge of sustainable food systems: adaptive and mitigatory strategies to address climate change and malnutrition”.

About Food Quality and Safety (FQS)

Food Quality and Safety (FQS) is an open-access, international, peer-reviewed journal that provides a platform for emerging and innovative science and technology in the agro-food field, publishing up-to-date research in food quality, food safety, food nutrition and human health. It is indexed in SCI-E, with a 2022 Impact Factor (IF) of 5.6 and a 5-year IF of 6.2.

 

Despite AI advancements, human oversight remains essential


Study reveals the limitations of large language models in medical coding



THE MOUNT SINAI HOSPITAL / MOUNT SINAI SCHOOL OF MEDICINE




New York, NY [April 22, 2024]—State-of-the-art artificial intelligence systems known as large language models (LLMs) are poor medical coders, according to researchers at the Icahn School of Medicine at Mount Sinai. Their study, published in the April 19 online issue of NEJM AI [DOI: 10.1056/AIdbp2300040], emphasizes the necessity for refinement and validation of these technologies before considering clinical implementation.

The study extracted a list of more than 27,000 unique diagnosis and procedure codes from 12 months of routine care in the Mount Sinai Health System, while excluding identifiable patient data. Using the description for each code, the researchers prompted models from OpenAI, Google, and Meta to output the most accurate medical codes. The generated codes were compared with the original codes and errors were analyzed for any patterns.
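The exact-match comparison described above can be sketched as follows. The helper function and the toy code pairs are illustrative assumptions, not the study's actual pipeline or data:

```python
# Hypothetical sketch of an exact-match evaluation for generated medical
# codes; names and example codes are invented for illustration.

def exact_match_rate(reference_codes, generated_codes):
    """Fraction of generated codes that exactly match the reference code."""
    assert len(reference_codes) == len(generated_codes)
    matches = sum(
        ref.strip().upper() == gen.strip().upper()
        for ref, gen in zip(reference_codes, generated_codes)
    )
    return matches / len(reference_codes)

# Toy example: three ICD-style code pairs, two of which match exactly.
refs = ["600.10", "E03.9", "K21.9"]
gens = ["600.10", "E03.8", "K21.9"]
print(round(exact_match_rate(refs, gens), 3))  # 0.667
```

Error analysis then looks at the non-matching pairs, for example to separate codes that are outright wrong from codes that are technically different but semantically close, as the study did.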

The investigators reported that all of the studied large language models, including GPT-4, GPT-3.5, Gemini-pro, and Llama-2-70b, showed limited accuracy (below 50 percent) in reproducing the original medical codes, highlighting a significant gap in their usefulness for medical coding. GPT-4 demonstrated the best performance, with the highest exact match rates for ICD-9-CM (45.9 percent), ICD-10-CM (33.9 percent), and CPT codes (49.8 percent).

GPT-4 also produced the highest proportion of incorrectly generated codes that still conveyed the correct meaning. For example, when given the ICD-9-CM description "nodular prostate without urinary obstruction," GPT-4 generated a code for "nodular prostate," showcasing its comparatively nuanced understanding of medical terminology. However, even considering these technically correct codes, an unacceptably large number of errors remained.

The next best-performing model, GPT-3.5, had the greatest tendency toward being vague. It had the highest proportion of incorrectly generated codes that were accurate but more general in nature compared to the precise codes. In this case, when provided with the ICD-9-CM description "unspecified adverse effect of anesthesia," GPT-3.5 generated a code for "other specified adverse effects, not elsewhere classified."

"Our findings underscore the critical need for rigorous evaluation and refinement before deploying AI technologies in sensitive operational areas like medical coding," says study corresponding author Ali Soroush, MD, MS, Assistant Professor of Data-Driven and Digital Medicine (D3M), and Medicine (Gastroenterology), at Icahn Mount Sinai. "While AI holds great potential, it must be approached with caution and ongoing development to ensure its reliability and efficacy in health care."

One potential application for these models in the health care industry, say the investigators, is automating the assignment of medical codes for reimbursement and research purposes based on clinical text.

“Previous studies indicate that newer large language models struggle with numerical tasks. However, the extent of their accuracy in assigning medical codes from clinical text had not been thoroughly investigated across different models,” says co-senior author Eyal Klang, MD, Director of the D3M’s Generative AI Research Program. "Therefore, our aim was to assess whether these models could effectively perform the fundamental task of matching a medical code to its corresponding official text description."

The study authors proposed that integrating LLMs with expert knowledge could automate medical code extraction, potentially enhancing billing accuracy and reducing administrative costs in health care.

"This study sheds light on the current capabilities and challenges of AI in health care, emphasizing the need for careful consideration and additional refinement prior to widespread adoption,” says co-senior author Girish Nadkarni, MD, MPH, Irene and Dr. Arthur M. Fishberg Professor of Medicine at Icahn Mount Sinai, Director of The Charles Bronfman Institute of Personalized Medicine, and System Chief of D3M.

The researchers caution that the study's artificial task may not fully represent real-world scenarios where LLM performance could be worse.

Next, the research team plans to develop tailored LLM tools for accurate medical data extraction and billing code assignment, aiming to improve quality and efficiency in health care operations.

The study is titled “Generative Large Language Models are Poor Medical Coders: A Benchmarking Analysis of Medical Code Querying.”

The remaining authors on the paper, all with Icahn Mount Sinai except where indicated, are: Benjamin S. Glicksberg, PhD; Eyal Zimlichman, MD (Sheba Medical Center and Tel Aviv University, Israel); Yiftach Barash (Tel Aviv University and Sheba Medical Center, Israel); Robert Freeman, RN, MSN, NE-BC; and Alexander W. Charney, MD, PhD.

This research was supported by the AGA Research Foundation’s 2023 AGA-Amgen Fellowship-to-Faculty Transition Award AGA2023-32-06 and an NIH UL1TR004419 award.

The researchers affirm that the study was conducted without the use of any Protected Health Information (“PHI”).

Please see the study to view more details, including on competing interests.

-####-

About the Icahn School of Medicine at Mount Sinai

The Icahn School of Medicine at Mount Sinai is internationally renowned for its outstanding research, educational, and clinical care programs. It is the sole academic partner for the eight member hospitals* of the Mount Sinai Health System, one of the largest academic health systems in the United States, providing care to a large and diverse patient population.

Ranked 13th nationwide in National Institutes of Health (NIH) funding and in the 99th percentile in research dollars per investigator according to the Association of American Medical Colleges, Icahn Mount Sinai has a talented, productive, and successful faculty. More than 3,000 full-time scientists, educators, and clinicians work within and across 44 academic departments and 36 multidisciplinary institutes, a structure that facilitates tremendous collaboration and synergy. Our emphasis on translational research and therapeutics is evident in such diverse areas as genomics/big data, virology, neuroscience, cardiology, geriatrics, and gastrointestinal and liver diseases.

Icahn Mount Sinai offers highly competitive MD, PhD, and Master’s degree programs, with current enrollment of approximately 1,300 students. It has the largest graduate medical education program in the country, with more than 2,000 clinical residents and fellows training throughout the Health System. In addition, more than 550 postdoctoral research fellows are in training within the Health System. 

A culture of innovation and discovery permeates every Icahn Mount Sinai program. Mount Sinai’s technology transfer office, one of the largest in the country, partners with faculty and trainees to pursue optimal commercialization of intellectual property to ensure that Mount Sinai discoveries and innovations translate into healthcare products and services that benefit the public.

Icahn Mount Sinai’s commitment to breakthrough science and clinical care is enhanced by academic affiliations that supplement and complement the School’s programs.

Through the Mount Sinai Innovation Partners (MSIP), the Health System facilitates the real-world application and commercialization of medical breakthroughs made at Mount Sinai. Additionally, MSIP develops research partnerships with industry leaders such as Merck & Co., AstraZeneca, Novo Nordisk, and others.

The Icahn School of Medicine at Mount Sinai is located in New York City on the border between the Upper East Side and East Harlem, and classroom teaching takes place on a campus facing Central Park. Icahn Mount Sinai’s location offers many opportunities to interact with and care for diverse communities. Learning extends well beyond the borders of our physical campus, to the eight hospitals of the Mount Sinai Health System, our academic affiliates, and globally.

------------------------------------------------------- 

Mount Sinai Health System member hospitals: The Mount Sinai Hospital; Mount Sinai Beth Israel; Mount Sinai Brooklyn; Mount Sinai Morningside; Mount Sinai Queens; Mount Sinai South Nassau; Mount Sinai West; and New York Eye and Ear Infirmary of Mount Sinai.


In psychedelic therapy, clinician-patient bond may matter most


Study links relationship strength to reduced depression for up to 1 year




Peer-Reviewed Publication

OHIO STATE UNIVERSITY




COLUMBUS, Ohio – Drug effects have dominated the national conversation about psychedelics for medical treatment, but a new study suggests that when it comes to reducing depression with psychedelic-assisted therapy, what matters most is a strong relationship between the therapist and study participant.

Researchers analyzed data from a 2021 clinical trial which found that psilocybin (the psychoactive compound in "magic mushrooms") combined with psychotherapy was effective at treating major depressive disorder in adults.

Data included depression outcomes and participant reports about their experiences with the drugs and their connection with therapists. Results showed that the stronger the relationship between a participant and clinician – called a therapeutic alliance – the lower the depression scores were one year later.

“What persisted the most was the connection between the therapeutic alliance and long-term outcomes, which indicates the importance of a strong relationship,” said lead author Adam Levin, a psychiatry and behavioral health resident in The Ohio State University College of Medicine.

Past research has consistently found that as mental health treatments changed, a trusting relationship between clients and clinicians has remained key to better outcomes, said senior author Alan Davis, associate professor and director of the Center for Psychedelic Drug Research and Education in The Ohio State University College of Social Work.

“This concept is not novel. What is novel is that very few people have explored this concept as part of psychedelic-assisted therapy,” Davis said. “This data suggests that psychedelic-assisted therapy relies heavily on the therapeutic alliance, just like any other treatment.”

The study was published recently in the journal PLOS ONE.

Twenty-four adults who participated in the trial received two doses of psilocybin and 11 hours of psychotherapy. Participants completed the therapeutic alliance questionnaire, assessing the strength of the therapist-participant relationship, three times: after eight hours of preparation therapy and one week after each psilocybin treatment.

Participants also completed questionnaires about any mystical and psychologically insightful experiences they had during the drug treatment sessions. Their depression symptoms were assessed one week, four weeks, and up to one year after the trial’s end.

The analysis showed that the overall alliance score increased over time and revealed a correlation between a higher alliance score and more acute mystical and/or psychologically insightful experiences from the drug treatment. Acute effects were linked to lower depression at the four-week point after treatment, but were not associated with better depression outcomes a year after the trial.

“The mystical experience, which is something that is most often reported as related to outcome, was not related to the depression scores at 12 months,” Davis said. “We’re not saying this means acute effects aren’t important – psychological insight was still predictive of improvement in the long term. But this does start to situate the importance and meaning of the therapeutic alliance alongside these more well-established effects that people talk about.”

That said, the analysis showed that a stronger relationship during the final therapy preparation session predicted a more mystical and psychologically insightful experience – which in turn was linked to further strengthening the therapeutic alliance.

“That’s why I think the relationship has been shown to be impactful in this analysis – because, really, the whole intervention is designed for us to establish the trust and rapport that’s needed for someone to go into an alternative consciousness safely,” Davis said.

Considering that psychedelics carry a stigma as Schedule I drugs under the Controlled Substances Act, efforts to minimize negative experiences in future studies of their therapeutic potential should be paramount – and therapy is critical to creating a supportive environment for patients, the authors said.

This study ideally will help clearly position psychedelics treatment as a psychotherapeutic intervention moving forward – rather than its primary purpose being administration of a drug, Levin said.

“This isn’t a case where we should try to fit psychedelics into the existing psychiatric paradigm – I think the paradigm should expand to include what we’re learning from psychedelics,” Levin said. “Our concern is that any effort to minimize therapeutic support could lead to safety concerns or adverse events. And what we showed in this study is evidence for the importance of the alliance in not just preventing those types of events, but also in optimizing therapeutic outcomes.”

This work was supported by the Center for Psychedelic and Consciousness Research, funded by the Steven & Alexandra Cohen Foundation, the RiverStyx Foundation and private donors. It was also supported by the Center for Psychedelic Drug Research and Education (CPDRE), funded by anonymous donors.

Additional co-authors are Rafaelle Lancelotta, Nathan Sepeda and Theodore Wagener of Ohio State, and Natalie Gukasyan, Sandeep Nayak, Frederick Barrett and Roland Griffiths of the Center for Psychedelic and Consciousness Research at Johns Hopkins University, where Davis is an affiliate.

#

 

Bringing bio-inspired robots to life


Nebraska researcher Eric Markvicka gets NSF CAREER Award to pursue manufacture of novel materials for soft robotics and stretchable electronics



UNIVERSITY OF NEBRASKA-LINCOLN

IMAGE: Eric Markvicka (left), assistant professor of mechanical and materials engineering, holds a tray of liquid metal samples while graduate student Ethan Krings works on a sample at right. Markvicka has received a CAREER award from the National Science Foundation to advance his work with room-temperature, non-toxic liquid metals.

CREDIT: Craig Chandler/University Communication and Marketing/University of Nebraska-Lincoln




Engineers are increasingly eager to develop robots that mimic the behavior of animals and biological organisms, whose adaptability, resilience and efficiency have been refined over millions of years of evolution.

In bringing bio-inspired robots to life, scientists must first create soft matter counterparts that match the softness and functionality of biological tissue. University of Nebraska–Lincoln engineer Eric Markvicka is at the forefront of these efforts. He recently received a five-year, $690,000 grant from the National Science Foundation’s Faculty Early Career Development Program to advance work on a manufacturing approach that would produce a novel class of materials that could propel the fields of soft robotics, stretchable electronics and beyond.

It would be the first manufacturing strategy to yield stable mixtures of liquid metals with a wide range of solid particle additives to achieve enhanced properties — including thermal and electrical conductivity, fluidity and capacity for self-repair — that exceed anything on today’s market.

The composites would be suitable for use in additive manufacturing, commonly known as 3D printing, and would accelerate momentum toward 4D printing, which produces machines that can morph to adapt to different environments.

“At the end of this project, we’ll have a manufacturing strategy for creating diverse liquid metal mixtures that are appropriate for additive manufacturing,” said Markvicka, assistant professor of mechanical and materials engineering. “Such capabilities will enable the engineering of new materials for hybrid 4D additive manufacturing where we can create a number of things, from robotics to machines, that ultimately mimic biological organisms.”

Scientists have long recognized the commercial potential of liquid metals, which offer infinite mechanical failure strain and enhanced thermal and electrical conductivity compared to elastomers or other fluids. They can also repair themselves, much like many natural organisms: Markvicka’s previous work shows that electrical circuits formed by liquid metal droplets can reconfigure after damage occurs.

IMAGE: Graduate student Ethan Krings holds a sample of room-temperature, non-toxic liquid metals.

CREDIT: Craig Chandler/University Communication and Marketing/University of Nebraska-Lincoln

To further enhance these properties, researchers have tried adding solid particles to liquid metals. But these efforts have run up against two major barriers. For one, most liquid metal varieties include gallium, one of the most highly reactive metals on the periodic table. Many additives, such as copper or aluminum, react with gallium in a way that transforms the mixture from a liquid to a solid, compromising the fluidic advantage.

When the additive doesn’t react with gallium, a different problem arises. The solid particles are enveloped in a thin oxide layer created by the liquid metal’s high surface tension. That layer allows the particles to be incorporated into the metal — but in the process, air voids form. These voids expand at higher temperatures, reducing the material’s thermal conductivity.

Markvicka’s strategy takes aim at both problems by adding an intermediate layer, or coating, to the additive particles. This nanoscale layer serves a dual purpose: For particles that would alloy, or mix, with gallium, it acts as a corrosion barrier, thwarting the reaction that would mitigate fluidity. For nonalloying particles, the layer acts as a wetting agent, enabling the material to integrate into the liquid metal without forming air voids.

Markvicka is one of just a handful of researchers nationwide who are exploring the use of coatings in liquid metal mixtures. If successful, the approach would enable engineers to make custom liquid metal mixtures with particular physical, chemical and rheological traits.

“Few have taken this interfacial engineering approach to not only protect the filler, in terms of the corrosion barrier that’s incorporated, but also develop the ability to take particles that don’t reactively wet and incorporate them,” he said.

Markvicka will also advance a new frontier in engineering education by emphasizing not just the technical side of the field, but also the business dimension. This “entrepreneering” approach merges entrepreneurial literacy with research-based education.

“We provide a mix of foundational and specialized technical courses; however, we rarely discuss economic impact or market viability when designing or creating new products,” he said. “In the rapidly changing job market, engineers who understand the entrepreneurial process are better positioned to seize opportunities and succeed in uncertain and complex situations.”

In concert with Nebraska’s Established Program to Stimulate Competitive Research, Markvicka will develop the first mobile science lab on manufacturing, which will integrate concepts from his research and entrepreneurial skills. It will be distributed to middle and high school students across Nebraska. Markvicka will collaborate with middle school teachers to help them incorporate the mobile lab into their classrooms.

On campus, Markvicka will integrate entrepreneering-based modules into traditional engineering courses, ensuring that a broader range of students are exposed to entrepreneurship principles.

“This isn’t just for people who are interested in starting a company or working for a startup,” he said. “These skills are also important for people working in industry. They need to know how you begin to vet and understand if an idea is viable commercially.”

The National Science Foundation’s CAREER award supports pre-tenure faculty who exemplify the role of teacher-scholars through outstanding research, excellent education and the integration of education and research.

 

Could automation, electrification of long-haul trucking reduce environmental impacts?




UNIVERSITY OF MICHIGAN





April 22, 2024

A new University of Michigan study finds that automation and electrification of long-haul trucking can reduce urban health impacts and environmental damages. 

For long-haul routes below 300 miles, electrification can reduce air pollution and greenhouse gas damages by 13%, or $587 million annually, according to the study. For long-haul routes above 300 miles, electrification of just the urban segments facilitated by hub-based automation of highway driving can reduce damages by 35%, or $220 million annually.
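A quick back-of-envelope check, using only the percentages and dollar figures quoted above, yields the implied total annual damages for each route class. This arithmetic is an inference from the press release's numbers, not a value reported by the study:

```python
# Implied totals from the quoted figures; my arithmetic, not study output.

short_haul_avoided = 587   # $M/year avoided on routes under 300 miles
short_haul_share = 0.13    # fraction of those damages avoided

long_haul_avoided = 220    # $M/year avoided on urban legs of longer routes
long_haul_share = 0.35     # fraction of those damages avoided

# Implied total annual damages per route class, in $M/year:
print(round(short_haul_avoided / short_haul_share))  # 4515
print(round(long_haul_avoided / long_haul_share))    # 629
```

In other words, the quoted percentages imply roughly $4.5 billion per year in baseline damages on the shorter routes and about $630 million per year on the routes affected by hub-based automation, if the figures are taken at face value.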

"It's the first study we know of that simultaneously studies a realistic model of automation and a realistic model of electrification—things that are feasible in the near term—and assesses their environmental benefits," said lead author Parth Vaishnav, assistant professor at the U-M School for Environment and Sustainability.

Vaishnav explains more about the study, which was published online April 22 in the journal Transportation Research Part D.

Why did you want to focus on how automation and electrification might reduce health and environmental impacts from trucking?

Both automation and electrification of trucking are occurring in parts of the country, but separately. Electrification is difficult for very long routes. Diesel trucks can go 900 miles on a single tank. With today's technology, it is impractical to build an electric truck that can do that, because the battery would be so heavy that the truck would have no capacity to carry any payload. But it is possible to build an electric truck that can go up to 300 miles. Automation is easy on interstates (our past work shows that even truck drivers think that portion of their job can be automated) but very difficult in cities. So, we decided to see what would happen if you combined the best of both worlds: electrify all routes shorter than 300 miles, and for longer routes, electrify the portion of the route that occurs in cities but keep human drivers. The benefit of doing this is that you cut tailpipe pollution in the places where it can do the most harm: urban areas, where lots of people breathe it.

What are the study's key takeaways/findings?

There is great environmental and health benefit from electrifying routes shorter than 300 miles: About half a billion dollars' worth of health and environmental harm would be avoided each year. Routes longer than 300 miles are responsible for a smaller share of freight ton miles than are shorter routes. Nonetheless, electrifying even the urban portions on these routes would cut health and environmental harms by over a third, or $200 million per year. That's true with today's grid, which still relies quite heavily on fossil fuels. As we clean up the grid, the benefits will grow.  

Did anything surprise you about your findings? If so, what and why?

One of the challenges with using battery electric trucks on short routes is that you face a trade-off between flexibility and efficiency. You want the smallest possible battery that will serve all the routes you want to serve because batteries are heavy, and you don't want to waste energy hauling around extra battery weight. On the other hand, you want the truck to be able to be flexible in what it does; for example, to serve routes of all kinds of different lengths over its lifetime. This means that you want to have a battery that is large enough to serve any route that the truck might be called upon to serve. To our surprise, this efficiency penalty was rather low—about 3%. This is good news, because you can have trucks with a few standard battery sizes without paying a massive efficiency penalty. 
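The flexibility-versus-efficiency trade-off can be illustrated with a rough calculation. Every number here, including the assumption that energy use scales partly with gross vehicle weight, is a placeholder for illustration rather than a figure from the study:

```python
# Rough, illustrative estimate of the efficiency penalty from carrying a
# larger-than-needed battery. All inputs are assumptions, not study values.

def efficiency_penalty(base_weight_tons, extra_battery_tons,
                       weight_sensitivity=0.6):
    """Fractional increase in energy per mile from extra battery mass,
    assuming only part of energy use scales with gross vehicle weight."""
    return weight_sensitivity * extra_battery_tons / base_weight_tons

# Assumed numbers: a 35-ton rig hauling 2 extra tons of battery.
print(round(efficiency_penalty(35, 2), 3))  # 0.034
```

With these assumed inputs the penalty lands near 3 percent, the same order of magnitude the researchers report, which is why a few standard battery sizes can serve many routes without a large efficiency cost.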

What would you like policy makers to take away from your paper in light of the Biden administration's new rules on heavy trucks?

That the electrification of trucking produces outsize benefits, even if it is done with today's technology in niche applications where it makes the most sense. However, these benefits are contingent on deploying a charging infrastructure that works for trucking: This means building out both chargers and strengthening the electricity grid. The broader lesson is that there is a lot of room for both regulatory and operational creativity. For example, we assume that long-haul routes can be split into interstate and urban legs, and that before a truck enters or leaves the interstate, it stops to switch the trailer from an electric to diesel prime mover. These locations can be brownfield sites—for example, the sites of old retail malls—that are revived with a new purpose. 

The study's co-authors include Yizhou Tian of the University of Michigan College of Engineering and Cecelia Isaac and Aniruddh Mohan of Princeton University's Andlinger Center for Energy and Environment.

Study: Automation and electrification in long-haul trucking cuts urban health and environmental damages (DOI: 10.1016/j.trd.2024.104187)

 

Written by Lori Atherton, U-M School for Environment and Sustainability

 

International Lawyer from the University of Warwick calls for fairness in WHO Pandemic Treaty Talks


As the World Health Organization (WHO) pushes for countries to seal the Pandemic Treaty by May this year, researchers at the University of Warwick and King's College London stress the need for fair negotiations



UNIVERSITY OF WARWICK





The opinion piece, featured in the journal PLOS Global Public Health, is led by Professor Sharifah Sekalala. The team highlights the importance of considering "Time Equity" in these talks, urging caution in setting deadlines and calling for the burden to be shared when time is tight.

Since COVID-19 hit, demands for health equity have echoed worldwide, sparking discussions on fair frameworks to tackle future crises. Negotiating the Pandemic Treaty stands out as a crucial step, aiming for fair vaccine distribution, accessible healthcare, and stronger health systems.

Unlike past agreements, urgency for the Pandemic Treaty has surged, with the WHO Director-General calling for swift action. But rushing poses risks, especially in fairly dividing time and resources among all involved. Low- and middle-income countries (LMICs) risk being sidelined, forced into a speedy process without their say.

Professor Sekalala warns against haste, noting LMICs' challenges in such negotiations. Balancing speed with inclusivity is vital to protect everyone's interests.

Professor Sekalala said: "LMICs face inherent challenges in multilateral treaty negotiations, stemming from resource inequities and knowledge asymmetries."

"Balancing the need for expediency with inclusivity and transparency is crucial to safeguard the interests of all parties involved."

As talks heat up, researchers suggest a fairer approach, viewing the May 2024 deadline as a chance for dialogue, especially for LMICs with limited representation.

Sekalala stresses outcomes that benefit LMICs, such as time-limited waivers on intellectual property and sustainable health financing. Prioritising these will make the Pandemic Treaty a tool for global health fairness.

ENDS
For more information, or to arrange interviews, please contact:
Bron Mills
Communications Officer
bron.mills@warwick.ac.uk | +447824 540 720
Full report available for viewing here: https://journals.plos.org/globalpublichealth/article?id=10.1371/journal.pgph.0003118

 

Role of indoor dust in indoor air quality gets a closer look




WASHINGTON UNIVERSITY IN ST. LOUIS




By Beth Miller

House dust, that ever-present nuisance, is made up of a variety of particles, including dead skin cells and pet dander, along with airborne pollutants that settle onto and stick to dust particles, creating a risk to human health.

Jenna Ditto, assistant professor of energy, environmental & chemical engineering in the McKelvey School of Engineering at Washington University in St. Louis, is taking a closer look at the chemistry of indoor dust with a three-year, $453,000 grant from the National Science Foundation.

“Our goal is to look at the complex mixture of chemicals within the dust and try to characterize not only what’s in the dust, but how the dust chemically transforms in a typical indoor environment,” Ditto said. “Dust particles are a reservoir for various chemicals that start in the air but may stick to the dust and persist there over time.”

Ditto and her team plan to combine indoor dust samples with chemicals such as motor vehicle exhaust and fumes from cooking or from burning biomass, then determine how the chemicals transform within the dust, what products result from the combinations and their impact on human exposure.

“Motor vehicle exhaust and biomass burning smoke from outdoors may enter homes and impact indoor environmental quality, while cooking is a major source that impacts indoor air directly,” Ditto said. “Our research will help us to understand more about how these different pollutant sources impact indoor dust chemistry, especially in homes more susceptible to higher concentrations of pollutants that transfer from air to dust and might persist in dust for long periods of time.”

Later in the research, Ditto’s team plans to use indoor dust samples from homes in the St. Louis area. Once the team has determined the chemical composition and exposure risks of the samples and of indoor air, team members plan to share results with the public through an interactive online tool, giving residents a picture of dust chemistry in the region.

Ditto and her lab also plan to hold workshops for middle school students and their teachers through the university’s Institute for School Partnership.

The work is an outgrowth of one avenue of research Ditto completed through a $25,000 Collaboration Initiation Grant from McKelvey Engineering and the Institute of Clinical and Translational Sciences at the School of Medicine in 2023. The program awards one-year grants to tenure-track faculty for projects that foster collaborative research between McKelvey Engineering departments and other university departments, and serves as a pathway for faculty to apply for larger interdisciplinary grants.

 

Research showcases Indigenous stewardship’s role in forest ecosystem resilience



Peer-Reviewed Publication

OREGON STATE UNIVERSITY




CORVALLIS, Oregon – Oregon State University researchers have teamed with the Karuk Tribe to create a novel computer simulation model that showcases Indigenous fire stewardship’s role in forest ecosystem health.

Western scientists and land managers have become increasingly cognizant of cultural burning, but its extent and purpose are generally absent from fire modeling research, said Skye Greenler, who led the partnership when she was a graduate research fellow in the OSU College of Forestry.

“We developed this project in collaboration with the Karuk Tribe to explore the impact of cultural burning at a landscape scale in a completely new way,” she said. “The information that went into this model is not new at all – it’s been held by Karuk Tribal members for millennia – but we developed new methods to bring the knowledge together and display it in a way that showcases the extent of Indigenous cultural stewardship across this landscape.”

Greenler and collaborators including Chris Dunn and John Bailey of Oregon State say that understanding the interactions among humans, fire processes and pathways towards coexistence with wildfire has become increasingly urgent as the social, ecological and economic impacts of fire have intensified in recent years.

The research focused on 1,000 square miles of Karuk Aboriginal Territory in the western Klamath Mountains of northern California. Working with the Karuk Tribe Department of Natural Resources, OSU scientists developed historical estimates for cultural ignition locations, frequency and timing. Statistical parameters were collaboratively developed and honed with Tribal members and knowledge holders using interviews, historical and contemporary maps, ethnographies, recent ecological studies and generational knowledge.

Published in Ecological Applications, the findings show that before the arrival of European colonizers, cultural burning was extensive across the landscape, with an estimated 6,972 cultural ignitions occurring annually, averaging about 6.5 ignitions per year for each Indigenous fire steward.
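Taken together, those two figures imply the approximate number of Indigenous fire stewards active across the study landscape; the short back-of-envelope calculation below is an inference from the numbers quoted above, not a figure reported by the study itself:

```python
# Back-of-envelope check on the quoted ignition figures (an inference,
# not a number taken from the paper itself):
# ~6,972 cultural ignitions per year, at ~6.5 ignitions per steward per year,
# implies on the order of a thousand active fire stewards.
annual_ignitions = 6972
ignitions_per_steward_per_year = 6.5

implied_stewards = annual_ignitions / ignitions_per_steward_per_year
print(round(implied_stewards))  # roughly 1073 stewards
```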

The researchers, who included scientists from the University of Washington, the Mid-Klamath Watershed Council, and the U.S. Forest Service’s Pacific Northwest and Pacific Southwest research stations, found that the timing and location of burning was often guided by the ecology of specific cultural resources, fuel receptivity, seasonal movement patterns and spiritual practices.

“The ignition characteristics we document align closely with data on historical fire regimes and vegetation but differ substantially from the location and timing of the ignitions happening now,” Greenler said. “This work shows the importance of cultural burning for developing and maintaining the ecosystems present at the time of colonization and underscores the need to work collaboratively with Indigenous communities to restore ecocultural processes.”

Northern California’s western Klamath Mountains are a diverse and highly fire-prone ecosystem that historically burned frequently at low or moderate severity but has recently experienced many extensive or severe wildfires, the researchers point out.

“Collaboratively integrating western and Indigenous fire science and knowledge systems in this research will help reinstate fire on this landscape to achieve socioecological resource values with benefits both to tribes and the public,” said the Forest Service’s Frank Lake, a Karuk Tribal descendant who earned a doctorate from the OSU College of Forestry.

The study area is one of many fire-prone landscapes in the western United States that have a long history of cultural burning, defined as the purposeful use of fire by an Indigenous group to promote food, medicinal and material resources. Cultural burning also alters wildfire risk and how fires spread.

“It would be incredible to go back in time and experience the study landscape with open oak groves, hillsides of high quality and abundant first foods, frequent, small cultural burns in the mornings or late afternoons, and little fear of large wildfires when dry lightning storms passed over the mountains in August,” Greenler said. “A recent shift within western scientific and management communities is towards a greater interest in supporting Indigenous fire stewardship practices that better balance relationships between people and fire.”

The increased interest follows decades of work from Indigenous communities to assert sovereign rights to land stewardship, emphasize the importance of cultural burning and build collaborative relationships and policies that integrate cultural burning into research, management and restoration practices, the scientists say.

“The lens with which scientists view data in observational scientific studies affects their interpretation of the results,” Dunn said. “As our acceptance of Indigenous Knowledge grows, we are experiencing a shift in our interpretation of fire regimes, ecological outcomes and humans in these systems. This does not invalidate previous studies but builds upon them for a more accurate understanding of history, and with that, where we are going in the future.”

Indigenous fire stewardship is deeply place-based, Greenler said, meaning the collaborators’ findings are not directly transferable to other landscapes, but the process that was used to develop and model estimates of cultural ignitions could “absolutely be applied” in other landscapes to better understand the impacts and patterns of cultural burning.

“It was a real pleasure to work on this project,” said Bill Tripp, director of natural resources and environmental policy for the Karuk Tribe. “Being able to incorporate Karuk Indigenous Knowledge, practice and belief systems into a product that can help assess historic fire regimes in a way that protects our proprietary information and locational data in regard to the site-specific resources we are working to enhance is of vital importance to our ecocultural revitalization efforts.”

Greenler stresses the importance of remembering that cultural burning remains an important practice on many landscapes today, including in northern California. She would like the collaboration with the Karuk Tribe to help provide information for society to use in restoring balanced human-fire relationships – which in some ways may emulate historical practices and in other ways may not, she said.

“I hope for all of us that we can continue to learn from Indigenous communities, uplift Indigenous Knowledge and sovereignty, and work towards a future where we can better co-exist with fire on our landscapes,” Greenler said.