It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Monday, December 18, 2023
Few patients receive opioid agonist therapy after opioid overdose, despite benefits
In the week following any hospital visit for an overdose, only 1 in 18 people with opioid use disorder begin a treatment known to be highly effective in reducing illness and deaths, according to new research in CMAJ (Canadian Medical Association Journal) https://www.cmaj.ca/lookup/doi/10.1503/cmaj.231014.
“These results highlight critical missed opportunities to prevent future mortality and morbidity related to opioid use, despite connection to health care for many patients in the days after a toxicity event,” writes Dr. Tara Gomes, a researcher at ICES and St. Michael’s Hospital, part of Unity Health Toronto, with coauthors.
Opioid use disorder (OUD) is a major public health issue, with an almost threefold increase in opioid-related emergency department visits between 2016 and 2021 in Ontario and a 32% increase in related hospitalizations in Canada.
Opioid agonist therapy (OAT) is highly effective at reducing illness and deaths in patients with OUD.
Using data from ICES, researchers looked at trends in OAT initiation rates for 20 702 emergency department visits and inpatient hospital admissions for opioid toxicity between January 2013 and March 2020. The median age of patients was 35 years, 65% were male and 90% lived in urban areas. Of the total visits, 29% were from patients who had previously visited hospital for opioid overdoses, and 24% had been dispensed OAT in the last year.
Only 4.1% of hospital encounters for opioid overdoses led to OAT initiation within a week of discharge from hospital. Despite increased advocacy and publication of a 2018 national guideline recommending buprenorphine–naloxone as the preferred first-line treatment for OUD, there was no significant increase in OAT initiation rates. Studies show that risk of death is highest in the days following an overdose, and that patients are more likely to continue OAT if it is started in the emergency department, underlining the need for prompt initiation of treatment.
“Our research shows that there were substantial disparities in OAT initiation rates, with potential barriers to prescribing for older patients, those with mental health diagnoses, and those in the lowest neighbourhood income quintile. Although OAT initiation rates have gradually increased since 2016, the release of the national OUD management guideline in 2018 was not independently associated with changes in this trajectory,” write the authors.
To increase treatment initiation rates, they suggest institutional OAT training, creating OAT initiation protocols, promoting awareness of referral resources with outpatient addictions programs, and more.
A related practice article illustrates the challenges of treating hospitalized patients with multiple substance use disorders, who often experience undertreated withdrawal and pain.
Initiation of opioid agonist therapy after hospital visits for opioid poisonings in Ontario
ARTICLE PUBLICATION DATE
18-Dec-2023
Researchers invent "methane cleaner": Could become a permanent fixture in cattle and pig barns
In a spectacular new study, researchers from the University of Copenhagen have used UV light and chlorine to eradicate low-concentration methane from air. The result gets us closer to being able to remove greenhouse gases from livestock housing
In a spectacular new study, researchers from the University of Copenhagen have used light and chlorine to eradicate low-concentration methane from air. The result gets us closer to being able to remove greenhouse gases from livestock housing, biogas production plants and wastewater treatment plants to benefit the climate. The research has just been published in the journal Environmental Research Letters.
The Intergovernmental Panel on Climate Change (IPCC) has determined that reducing methane gas emissions will immediately reduce the rise in global temperatures. The gas is up to 85 times more potent a greenhouse gas than CO2, and more than half of it is emitted by human sources, with cattle and fossil fuel production accounting for the largest share.
A unique new method developed by a research team at the University of Copenhagen’s Department of Chemistry and spin-out company Ambient Carbon has succeeded in removing methane from air.
"A large part of our methane emissions comes from millions of low-concentration point sources like cattle and pig barns. In practice, methane from these sources has been impossible to concentrate into higher levels or remove. But our new result proves that it is possible using the reaction chamber that we’ve have built," says Matthew Stanley Johnson, the UCPH atmospheric chemistry professor who led the study.
Earlier, Johnson presented the research results at COP 28 in Dubai via an online connection and in Washington D.C. at the National Academy of Sciences, which advises the US government on science and technology.
Reactor cleans methane from air
Methane can be burnt off from air if its concentration exceeds 4 percent. But most human-caused emissions are below 0.1 percent and therefore cannot be burnt off.
To remove methane from air, the researchers built a reaction chamber that, to the uninitiated, looks like an elongated metal box with heaps of hoses and measuring instruments. Inside the box, a chain reaction of chemical compounds takes place, which ends up breaking down the methane and removing a large portion of the gas from air.
"In the scientific study, we’ve proven that our reaction chamber can eliminate 58 percent of methane from air. And, since submitting the study, we have improved our results in the laboratory so that the reaction chamber is now at 88 percent," says Matthew Stanley Johnson.
Chlorine is key to the discovery. Using chlorine and the energy from light, researchers can remove methane from air much more efficiently than the way it happens in the atmosphere, where the process typically takes 10-12 years.
"Methane decomposes at a snail's pace because the gas isn’t especially happy about reacting with other things in the atmosphere. However, we’ve discovered that, with the help of light and chlorine, we can trigger a reaction and break down the methane roughly 100 million times faster than in nature," explains Johnson.
Up next: livestock stalls, wastewater treatment plants and biogas plants
A 40ft shipping container will soon arrive at the Department of Chemistry. When it does, it will become a larger prototype of the reaction chamber that the researchers built in the laboratory. It will be a "methane cleaner" which, in principle, will be able to be connected to the ventilation system in a livestock barn.
"Today’s livestock farms are high-tech facilities where ammonia is already removed from air. As such, removing methane through existing air purification systems is an obvious solution," explains Professor Johnson.
The same applies to biogas and wastewater treatment plants, which are some of the largest human-made sources of methane emissions in Denmark after cattle production.
As a preliminary investigation for this study, the researchers traveled around the country measuring how much methane leaks from cattle stalls, wastewater treatment plants and biogas plants. In several places, the researchers were able to document that a large amount of methane leaks into the atmosphere from these plants.
"For example, Denmark is a pioneer when it comes to producing biogas. But if just a few percent of the methane from this process escapes, it counteracts any climate gains," concludes Johnson.
The research is funded by a grant from Innovation Fund Denmark for the PERMA project, a part of AgriFoodTure. The research was conducted in collaboration between the University of Copenhagen, Aarhus University, Arla, Skov and the UCPH spin-out company Ambient Carbon, started and now headed by Professor Matthew Stanley Johnson. The company was started to develop MEPS (Methane Eradication Photochemical System) technology and make it available to society.
The researchers built a reaction chamber and devised a method that simulates and greatly accelerates methane's natural degradation process.
They dubbed the method the Methane Eradication Photochemical System (MEPS) and it degrades methane 100 million times faster than in nature.
The method works by introducing chlorine molecules into a reaction chamber with methane gas. The researchers then shine UV light onto the chlorine molecules. The light’s energy causes the molecules to split and form two chlorine atoms.
The chlorine atoms then steal a hydrogen atom from the methane, causing it to fall apart and decompose. The chlorine product (hydrochloric acid) is captured and subsequently recycled in the chamber.
The methane is converted into carbon dioxide (CO2), carbon monoxide (CO) and hydrogen (H2), just as it is in the natural process in the atmosphere.
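For readers who want the underlying chemistry, the chain reaction described above can be sketched in simplified form; the oxidation of the methyl radical actually runs through several intermediates (such as formaldehyde) that are collapsed into the final step here:

\begin{align*}
\mathrm{Cl_2} + h\nu &\rightarrow 2\,\mathrm{Cl}^{\bullet} && \text{(UV light splits chlorine into atoms)}\\
\mathrm{Cl}^{\bullet} + \mathrm{CH_4} &\rightarrow \mathrm{CH_3}^{\bullet} + \mathrm{HCl} && \text{(hydrogen abstraction from methane)}\\
\mathrm{CH_3}^{\bullet} + \mathrm{O_2} &\rightarrow \ldots \rightarrow \mathrm{CO_2},\ \mathrm{CO},\ \mathrm{H_2} && \text{(stepwise oxidation, simplified)}
\end{align*}

The HCl formed in the second step is the chlorine product that is captured and recycled in the chamber.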
More about methane (CH4):
Methane can be burned off to remove it from air, but its concentration must be over 4% (40,000 parts per million, ppm) to be flammable. As most human-caused emissions are below 0.1 percent, they cannot be burned.
The Intergovernmental Panel on Climate Change (IPCC) has determined that reducing methane gas emissions will immediately reduce the rise in global temperatures.
Methane is a greenhouse gas that is emitted naturally from, among other things, wetlands and from man-made sources such as food production, natural gas and sewage treatment plants.
Today, methane is responsible for about a third of the global warming driven by greenhouse gas emissions.
It takes methane 10-12 years to decompose naturally in the atmosphere, where it is converted into carbon dioxide.
Over a 25-year period, methane is 85 times worse for the climate than CO2. Over a 100-year period, methane is 30 times worse for the climate than CO2.
The concentration of methane in the atmosphere has increased by 150% since the mid-1700s.
According to the IPCC, methane alone has increased anthropogenic radiative forcing by 1.19 W/m², which is responsible for a 0.6 °C increase in global average surface air temperature.
The University of Exeter study, published in The Journal of Nutrition, is the first of its kind to demonstrate that two of the most commercially available algal species are rich in protein and that their ingestion supports muscle remodeling in young healthy adults. The findings suggest that algae may be an interesting and sustainable alternative to animal-derived protein for maintaining and building muscle.
Researcher Ino Van Der Heijden from the University of Exeter said: “Our work has shown algae could become part of a secure and sustainable food future. With more and more people trying to eat less meat because of ethical and environmental reasons, there is growing interest in nonanimal-derived and sustainably produced protein. We believe it’s important and necessary to start looking into these alternatives and we’ve identified algae as a promising novel protein source.”
Foods rich in protein and essential amino acids have the capacity to stimulate muscle protein synthesis, which can be measured in the laboratory by determining the incorporation of labelled amino acids into muscle tissue proteins and translated to a rate over time. Animal-derived protein sources robustly stimulate resting and post-exercise muscle protein synthesis.
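How such a rate is calculated is not spelled out in this summary, but incorporation measurements of this kind are conventionally expressed as a fractional synthesis rate (FSR); the symbols below are generic rather than taken from the paper:

\[
\mathrm{FSR}\;(\%/\mathrm{h}) = \frac{E_{\mathrm{bound}}(t_2) - E_{\mathrm{bound}}(t_1)}{E_{\mathrm{precursor}} \times (t_2 - t_1)} \times 100
\]

Here, \(E_{\mathrm{bound}}\) is the labelled amino acid enrichment measured in the muscle (e.g., myofibrillar) protein at two biopsy time points, and \(E_{\mathrm{precursor}}\) is the enrichment of the free amino acid pool assumed to supply synthesis.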
However, because animal-based protein production raises growing ethical and environmental concerns, algae have emerged as an intriguing, environmentally friendly alternative to animal-derived protein. Cultivated under controlled conditions, spirulina and chlorella are the two most commercially available algae; both contain high doses of micronutrients and are rich in protein. Yet the capacity of spirulina and chlorella to stimulate myofibrillar protein synthesis in humans remained unknown.
To bridge the knowledge gap, University of Exeter researchers assessed the impact of ingesting spirulina and chlorella, compared with an established high-quality nonanimal-derived dietary protein source (fungal-derived mycoprotein), on blood amino acid concentrations, as well as resting and post-exercise myofibrillar protein synthesis rates. Thirty-six healthy young adults participated in a randomized, double-blind trial. Following a bout of one-legged resistance exercise, participants ingested a drink containing 25 grams of protein from fungal-derived mycoprotein, spirulina or chlorella. Blood and skeletal muscle samples were collected at baseline and during a four-hour post-feeding and post-exercise period. Blood amino acid concentrations and myofibrillar protein synthesis rates in rested and exercised tissue were assessed.
Protein ingestion increased blood amino acid concentrations, with the most rapid rise and the highest peak responses following consumption of spirulina compared with mycoprotein and chlorella. Protein ingestion also increased myofibrillar protein synthesis rates in both rested and exercised tissue, with no differences between groups, but with higher rates in exercised compared with rested muscle.
This study is the first of its kind to demonstrate that ingestion of spirulina or chlorella robustly stimulates myofibrillar protein synthesis in resting and exercised muscle tissue, and to an equivalent extent as a high-quality nonanimal-derived counterpart (mycoprotein).
In a companion commentary, Lucy Rogers and Professor Leigh Breen from the University of Birmingham highlight the strengths and utility of these novel findings, while identifying paths forward for future research that focuses on diverse populations such as older adults.
Algae Ingestion Increases Resting and Exercised Myofibrillar Protein Synthesis Rates to a Similar Extent as Mycoprotein in Young Adults.
ARTICLE PUBLICATION DATE
14-Dec-2023
DOD HAS THE BIGGEST BUDGET IN THE U$A
Unconventional cancer research consortium created with $3.2M grant from US Department of Defense
Researchers from disparate disciplines located at USC, Cedars-Sinai Medical Center, Children’s Hospital Los Angeles and Stanford University gather to find solutions to cancer through the newly formed Convergent Science Cancer Consortium.
Funding an unconventional approach to fighting cancer that emphasizes the integration of diverse scientific disciplines, the U.S. Department of Defense has awarded $3.2 million to establish the Convergent Science Cancer Consortium (CSCC), led by Dean’s Professor of Biological Sciences Peter Kuhn at the USC Dornsife College of Letters, Arts and Sciences.
The consortium, which includes Stanford University, Cedars-Sinai Medical Center and Children’s Hospital Los Angeles as inaugural members, unites experts from fields such as biology, engineering, mathematics and computer science, to discover more effective treatment strategies through a more holistic understanding of cancer, particularly for hard-to-treat forms such as bladder cancer, sarcomas and metastatic cancers.
A global asset in finding cancer solutions
A leading cause of death worldwide, cancer presents complex challenges that often exceed the scope of traditional research methods. By fostering collaboration across various scientific domains, the CSCC intends to overcome these limitations. And because it will include scientists from across disciplines and institutions, their proposed solutions are likely to be more versatile and applicable to a diverse range of populations and health care systems.
“By integrating diverse scientific insights, we can develop more effective, tailored treatments for patients, especially those fighting intractable forms of cancer,” said Kuhn.
The CSCC places a special emphasis on understanding and addressing cancer in military personnel, a group often exposed to unique environmental risks such as asbestos in conflict zones.
This focus, relatively rare in cancer research, offers significant benefits not only to military personnel but also to civilians exposed to similar hazards. The insights gained could lead to better prevention and treatment strategies for cancers caused by specific environmental factors.
More personal, more effective cancer treatment
The consortium’s approach aims to advance personalized cancer treatments that consider individual patient profiles. This could result in more effective treatments with reduced side effects. They will also study technologies for real-time monitoring, such as wearable devices, that hold promise for early detection and intervention, potentially improving patient survival rates.
Co-principal investigator Dan Theodorescu is the director of Cedars-Sinai Cancer Center and a renowned physician scientist and cancer expert. Theodorescu combines molecular and cell biology with computational methods in his research. He and Kuhn have long held a belief that convergent science holds exceptional power to find cancer solutions.
Co-principal investigator Christina Curtis of Stanford University brings extensive biomedical data science expertise. Using the latest computational methods, she’ll lead efforts to parse the copious and varied information the consortium researchers will produce.
Co-investigator Fariba Navid of Children’s Hospital Los Angeles has expertise in pediatric bone and soft tissue sarcomas. She will continue an ongoing collaboration with Kuhn to assess circulating tumor cells in these cancers.
The establishment of the Convergent Science Cancer Consortium, much of which is modeled on the recently established Convergent Science Virtual Cancer Center led by Theodorescu as director and Kuhn as deputy director, marks a pivotal moment in cancer research, according to the researchers — one that “means new avenues of hope for patients and their families,” said Kuhn.
Big Science in the 21st Century – a new ebook published by IOP Publishing
IOP Publishing is proud to announce the release of ‘Big Science in the 21st Century’, a comprehensive exploration of the impact of Big Science on our society and the new perspectives it opens on evaluating its societal benefits.
Authored by a diverse group of contributors, the book offers a multifaceted view of the challenges, merits, and transformations of Big Science across different disciplines and geographical boundaries. It delves into the transformative role of Big Science in shaping the world we live in, from its historical roots in the aftermath of the Second World War to its contemporary interdisciplinary and international nature.
Big Science in the 21st Century is organized in five parts, each offering unique insights into the impact of Big Science. The first part looks at lessons from Big Science organizations and best practices for increasing the return of benefits to society. The second part offers the voices of key economists who have worked on assessment exercises concerning the socioeconomic benefits of large-scale research infrastructures; the editors also strove to include a historical perspective on these debates. The essays in part three trace the development of Big Science in the aftermath of the Second World War. The fourth part focuses on the educational and cultural impacts that Big Science has beyond the laboratory, from the art gallery to the school classroom. In the final part, the editors aimed to bring a more global perspective, with contributions from outside North America and Western Europe.
As noted in the preface: “Rather than an exhaustive list, this book aspires to offer a comprehensive overview of the different ways in which Big Science impacts our society and consequently opens new perspectives in thinking how to evaluate its societal benefits. It should offer a glimpse on the complex realities that characterize the development and present status of Big Science.”
Panagiotis Charitos, one of the editors of Big Science in the 21st Century says: “The culmination of massive investment and intense international collaboration, ‘Big Science’ projects sit at the forefront of today’s scientific innovations. They reach beyond disciplinary boundaries to deliver global visibility, recognition, and impact for researchers.”
Charitos continues: “The different viewpoints in this book demonstrate the ways in which Big Science has delivered intellectual, utilitarian, and economic progress. It serves as a reminder that Big Science, along its many coordinates, should be evaluated for its scientific, economic, educational, and cultural impact. The essays also stress that this path is not always linear while also giving voice to concerns around “Big Science” projects and the distribution of resources.”
Miriam Maus, Chief Publishing Officer at IOP Publishing, says: “’Big Science’ projects span many issues, from combatting social imbalance to building the Large Hadron Collider. We are proud to publish a book that explores the depth and breadth of Big Science projects with perspectives from a wide variety of contributors across the globe.”
The full book is available on the IOP Publishing Bookstore and is aimed at professionals involved in science policy and administration, economists interested in evaluating the results of scientific research, and anyone with an interest in scientific outreach and communication. The collection of essays is intended to stimulate interdisciplinary discussions, with the hope of yielding new research tools for measuring the impact of Big Science and creating connections between economists, historians, and those working in science and technology studies.
Five chapters of the book are available to read for free, covering topics including the development of CERN, the rise of ‘Big Science,’ its societal impacts in the 21st century, communicating Big Science in a post-war period, and the transition to an open science model.
ASU research reveals regions in U.S. where heat adaptation and mitigation efforts can most benefit future populations
Study models the benefits of strategies to cope with and alleviate extreme heat in 47 of the largest U.S. cities through the end of the century
Tempe, Ariz., December 18, 2023 – Extreme heat waves, once considered rare, are now frequent and severe in cities due to climate change. Phoenix faced such a brutal heat wave in July of 2023 when it endured 31 consecutive days of high temperatures of at least 110 F. The severity of the heat wave triggered a state of emergency. In June of 2021, the town of Lytton, B.C., Canada, hit a blistering 121 F, leading to a fire that burnt most of the village. This pattern repeated in Europe in 2022, where heat caused fatal illnesses, wildfires and damaged infrastructure, highlighting an urgent need for climate adaptation measures.
In order to make crucial urban-planning decisions across the United States, city managers and stakeholders will need to better understand the outcomes of potential solutions that address the immediate impacts of heat exposure on cities and the long-term climate impacts, both individually and together.
New research published in the January issue of Nature Cities examines, for the first time, the potential benefits of combining heat adaptation strategies – such as implementing cool roofs and planting street trees – with mitigation strategies – such as reducing greenhouse gas emissions – to lessen heat exposure across major U.S. cities. It also identifies the regions in which these strategies could best benefit future populations.
“Research to date has focused on the reduction of harmful impacts on cities resulting from either increased emissions of greenhouse gasses or direct effects from the built environment,” said Matei Georgescu, lead author of the paper and an associate professor at Arizona State University’s School of Geographical Sciences and Urban Planning. “Our work highlights the value of adaptation to reduce human heat exposure at the city level but then goes further to emphasize the benefits of deploying adaptation strategies in tandem with mitigation strategies.”
The paper, “Quantifying the decrease in heat exposure through adaptation and mitigation in twenty-first-century U.S. cities,” is coauthored by Ashley Broadbent of the National Institute of Water and Atmospheric Research, Wellington, New Zealand, and E. Scott Krayenhoff of the School of Environmental Sciences at the University of Guelph, Ontario, Canada.
In the study, the researchers used computer models to simulate future climate conditions that account for urban expansion, greenhouse gas emissions and population movement through the end of the century.
Then they examined the extent to which adaptation and mitigation strategies, in isolation and in tandem, can reduce population heat exposure across end-of-century U.S. cities and large urban areas.
The adaptation strategies examined included cool and evaporative roofs on buildings and street trees, applied uniformly across all cities. Mitigation strategies involved reductions in global greenhouse gas emissions. The simulations compared a contemporary decade (2000-2009) against end-of-century projections (2090-2099).
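The release does not state the exposure metric used in the simulations, but a common choice in studies like this is population-weighted time above a heat threshold. The short sketch below is only a minimal illustration of how baseline, adaptation, mitigation and combined scenarios might be compared under that assumption; the function name, temperatures, threshold and cooling offsets are all invented for the example and are not results from the paper.

import numpy as np

def person_hours_above(temps_c, population, threshold_c=35.0):
    # Population heat exposure: number of hours above the threshold,
    # weighted by the number of residents assumed to experience them.
    hot_hours = np.sum(temps_c > threshold_c)
    return hot_hours * population

# Illustrative hourly summer temperatures for one city (made-up numbers).
rng = np.random.default_rng(0)
baseline = rng.normal(33.0, 4.0, size=24 * 90)   # end-of-century, no intervention
scenarios = {
    "adaptation": baseline - 1.0,   # assumed local cooling from cool roofs / street trees
    "mitigation": baseline - 1.5,   # assumed regional cooling from lower emissions
    "combined": baseline - 2.5,     # both together (simple additive assumption)
}

population = 1_500_000
base_exposure = person_hours_above(baseline, population)
for name, temps in scenarios.items():
    reduction = 1.0 - person_hours_above(temps, population) / base_exposure
    print(f"{name}: {reduction:.0%} reduction in person-hours above 35 °C")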
The study found that some cities, like Tulsa, Okla., will respond better to adaptation strategies such as deploying street trees and cool roofs to cope with heat, while others, like Denver, will benefit more from reducing greenhouse gas emissions.
The study also found that exposure to extreme heat tends to remain higher for people in southern-latitude cities than in those at northern latitudes. This pattern holds both for adaptation strategies, which help people cope with heat in the near term, and for mitigation strategies, which tackle the longer-term drivers of climate change.
The researchers also discovered that the effectiveness of these strategies in reducing extreme heat exposure varies throughout the day, but remains consistent during nighttime.
When simultaneously implementing adaptation and mitigation measures, the study shows the benefits are the greatest in the Northeast and Midwest regions, encompassing cities like New York, Boston and Chicago. Sun Belt cities, including Los Angeles and Miami, face more limited heat exposure reductions.
Relative increases in population heat exposure persist even after implementing adaptation and mitigation measures, but they are limited to urban areas in the Southeast, Great Plains and Southwest.
“We underscore the importance of characterizing such results on the basis of individual urban environments, as it paves the way for prioritizing strategies with identified impacts at the urban, rather than broader regional or national scale,” said Georgescu, who is also director of ASU’s Urban Climate Research Center. “The study helps us develop a timeline of implementation strategies to enhance livability in our cities.”
Moving forward
In the future, the researchers suggest advancing the modeling work to examine how much adaptation strategies targeted at specific neighborhoods within cities can reduce adverse heat-related impacts, while also facilitating collaboration between cities and academic institutions.
“Collaboration between cities and academic institutions is crucial to gather important data and develop sound policies that can effectively protect at-risk communities from the effects of climate change and the added burden of heat from the built environment,” said Georgescu. “By understanding which strategies work best at the local level, and how such strategies may work differently depending on geographical context, we can create effective plans to tackle place-based climate challenges while continuing to work on mitigation strategies that deal with long-term consequences. Working together is key to creating better strategies for a sustainable and resilient future.”
Scientists looking to uncover the mysteries of the underwater world have more valuable information at their fingertips thanks to an international team that has produced an inventory of species confirmed or expected to produce sound underwater.
Led by Audrey Looby from the University of Florida Department of Fisheries and Aquatic Sciences, the Global Library of Underwater Biological Sounds working group collaborated with the World Register of Marine Species to document 729 species of aquatic mammals, other tetrapods, fishes, and invertebrates that produce active or passive sounds. In addition, the inventory includes another 21,911 species that are considered likely to produce sounds.
With more than 70% of the Earth’s surface covered in water, most of the planet’s habitats are aquatic, and there is a misconception that most aquatic organisms are silent. The newly published comprehensive digital database on what animals are known to make sounds is the first of its kind and can revolutionize marine and aquatic science, the researchers said.
“Eavesdropping on underwater sounds can reveal a plethora of information about the species that produce them and is useful for a variety of applications, ranging from fisheries management and invasive species detection to improved restoration outcomes and assessment of human environmental impacts,” said Looby, who also co-created FishSounds, which offers a comprehensive, global inventory of fish sound production research.
The team’s research, “Global Inventory of Species Categorized by Known Underwater Sonifery,” was published Monday in Scientific Data and involved 19 authors from six countries, funding from the Richard Lounsbery Foundation and centuries of scientific effort to document underwater sounds.
“Understanding how marine species interact with their environments is of global importance, and this data being freely available is a major step toward that goal,” said Kieran Cox, a member of the research team and a Natural Sciences and Engineering Research Council of Canada fellow.
Most people are familiar with whale or dolphin sounds but are often surprised to learn that many fishes and invertebrates use sounds to communicate, too, Looby said.
“Our dataset helps demonstrate how widespread underwater sound production really is across a variety of animals, but also that we still have a lot to learn,” she said.