Thursday, January 12, 2023

Use of cannabis, other pain treatments among adults with chronic pain in states with medical cannabis programs

JAMA Network Open

Peer-Reviewed Publication

JAMA NETWORK

About The Study: In this survey study of 1,661 adults with chronic pain in states with medical cannabis laws, 3 in 10 reported using cannabis to manage their pain. Most of those who used cannabis to treat chronic pain reported substituting it for other pain medications, including prescription opioids. This high degree of substitution for both opioid and nonopioid treatments underscores the need for research to clarify the effectiveness and potential adverse consequences of cannabis for chronic pain.

Authors: Mark C. Bicket, M.D., Ph.D., of the University of Michigan in Ann Arbor, is the corresponding author.  

This link will be live at the embargo time http://jamanetwork.com/journals/jamanetworkopen/fullarticle/10.1001/jamanetworkopen.2022.49797?utm_source=For_The_Media&utm_medium=referral&utm_campaign=ftm_links&utm_term=010623

About JAMA Network Open: JAMA Network Open is an online-only open access general medical journal from the JAMA Network. On weekdays, the journal publishes peer-reviewed clinical research and commentary in more than 40 medical and health subject areas. Every article is free online from the day of publication.

Study reveals average age at conception for men versus women over past 250,000 years

Evolutionary biologists at IU found that fathers are consistently older than mothers throughout human evolutionary history, but that age gap has shrunk

Peer-Reviewed Publication

INDIANA UNIVERSITY

Age at conception 

IMAGE: Graphs showing the average age at conception for men versus women over the past 250,000 years.

CREDIT: Hahn, Wang, et al., Indiana University

BLOOMINGTON, Ind. — The length of a specific generation can tell us a lot about the biology and social organization of humans. Now, researchers at Indiana University can determine the average age that women and men had children throughout human evolutionary history with a new method they developed using DNA mutations.

The researchers said this work can help us understand the environmental challenges experienced by our ancestors and may also help us in predicting the effects of future environmental change on human societies.

“Through our research on modern humans, we noticed that we could predict the age at which people had children from the types of DNA mutations they left to their children,” said study co-author Matthew Hahn, Distinguished Professor of biology in the College of Arts and Sciences and of computer science in the Luddy School of Informatics, Computing and Engineering at IU Bloomington. “We then applied this model to our human ancestors to determine what age our ancestors procreated.”

According to the study, published today in Science Advances and co-authored by IU postdoctoral researcher Richard Wang, the average age at which humans had children over the past 250,000 years is 26.9 years. Fathers were consistently older, at 30.7 years on average, than mothers, at 23.2 years on average, but the age gap has shrunk in the past 5,000 years, with the study’s most recent estimates of maternal age averaging 26.4 years. The shrinking gap appears to be largely due to mothers having children at older ages.

Other than the recent uptick in maternal age at childbirth, the researchers found that parental age has not increased steadily over time; it may have dipped around 10,000 years ago because of population growth coinciding with the rise of civilization.

“These mutations from the past accumulate with every generation and exist in humans today,” Wang said. “We can now identify these mutations, see how they differ between male and female parents, and how they change as a function of parental age."

Children’s DNA inherited from their parents contains roughly 25 to 75 new mutations, allowing scientists to compare parents and offspring and to classify the kind of mutation that occurred. Looking at mutations in thousands of children, IU researchers noticed a pattern: the kinds of mutations children carry depend on the ages of the mother and the father.
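The core idea here, that mutation counts rise predictably with parental age, can be sketched with a toy calculation. This is an illustrative simplification, not the authors' actual model, and the trio numbers below are hypothetical:

```python
# Toy sketch: de novo mutation counts rise roughly linearly with
# parental age, so a linear fit can be inverted to estimate a parent's
# age from a child's mutation count. All data are hypothetical.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical trios: (paternal age at conception, paternal de novo mutations)
ages = [22, 28, 33, 39, 45]
muts = [35, 44, 52, 61, 70]

a, b = fit_line(ages, muts)

def predict_age(mutation_count):
    """Invert the fitted line: age = (count - a) / b."""
    return (mutation_count - a) / b
```

The actual study works with the full mutation *spectrum* (which kinds of mutations occur, not just how many), and separates maternal from paternal contributions, but the fit-then-invert logic is the same in spirit.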

Previous genetic approaches to estimating historical generation times relied on the compounding effects of either recombination or mutation, measured from the divergence of modern human DNA sequences from ancient samples. But the results were averaged across males and females and across the past 40,000 to 45,000 years.

Hahn, Wang and their co-authors built a model that uses de novo mutations — a genetic alteration that is present for the first time in one family member as a result of a variant or mutation in a germ cell of one of the parents or that arises in the fertilized egg during early embryogenesis — to separately estimate the male and female generation times at many different points throughout the past 250,000 years.

The researchers were not originally seeking to understand the relationship between sex and age at conception over time; they were conducting a broader investigation into the number of mutations passed from parents to children. They noticed the age-based mutation patterns only while seeking to understand differences and similarities between these patterns in humans and other mammals, such as cats, bears and macaques.

“The story of human history is pieced together from a diverse set of sources: written records, archaeological findings, fossils, etc.,” Wang said. “Our genomes, the DNA found in every one of our cells, offer a kind of manuscript of human evolutionary history. The findings from our genetic analysis confirm some things we knew from other sources (such as the recent rise in parental age), but also offer a richer understanding of the demography of ancient humans. These findings contribute to a better understanding of our shared history.”

Additional contributors to this research were Samer I. Al-Saffar, a graduate student at IU at the time of the study, and Jeffrey Rogers of the Baylor College of Medicine.

New USC study challenges previous ideas regarding Alzheimer’s disease

Increase in amyloid beta protein in the brain, often thought to be directly involved in Alzheimer’s pathology, may instead be a general change that occurs with age even in healthy brains, says senior author Caleb Finch

Peer-Reviewed Publication

UNIVERSITY OF SOUTHERN CALIFORNIA

A new USC Leonard Davis School of Gerontology study challenges existing ideas of how buildup of a protein called amyloid beta (Aβ) in the brain is related to Alzheimer’s disease.

While buildup of amyloid protein has been associated with Alzheimer’s-related neurodegeneration, little is known about how the protein relates to normal brain aging, said University Professor Caleb Finch, the study’s senior author and holder of the ARCO/William F. Kieschnick Chair in the Neurobiology of Aging at the USC Leonard Davis School.

To explore the levels of Aβ in human brains, the researchers analyzed tissue samples from both healthy brains and brains of patients with dementia. More severe Alzheimer’s cases were indicated by higher Braak staging scores, a measurement of how widely signs of Alzheimer’s pathology are found within the brain.

The analysis revealed that older, cognitively healthy brains showed similar amounts of dissolvable, non-fibrillar amyloid protein as brains of Alzheimer’s patients. But, as the researchers expected, the brains of Alzheimer’s patients had higher amounts of insoluble Aβ fibrils, the form of amyloid protein that aggregates to form the telltale “plaques” seen in the disease, said Max Thorwald, the study’s first author and a postdoctoral researcher at the USC Leonard Davis School.

The findings challenge the idea that simply having higher amounts of amyloid protein in general is an underlying cause of Alzheimer’s, say Finch and Thorwald. Instead, the increase in soluble Aβ may be a general aging-related change in the brain not specific to Alzheimer’s, while higher levels of fibrillary amyloid appear to be a better indicator of poorer brain health.

Rather than Alzheimer’s simply involving increased production of Aβ protein, the more important issue may be a reduced ability to effectively clear the protein and stave off the creation of plaque-contributing fibrillary amyloid, Thorwald said.

“These findings further support the use of aggregated, or fibrillary, amyloid as a biomarker for Alzheimer’s treatments,” Thorwald said. “The site in which amyloid processing occurs has less precursor and enzyme available for processing, which may suggest the removal of amyloid as a key issue during Alzheimer’s.”

Increases in amyloid levels happen during early adulthood and differ by brain region. Further studies, including those investigating drugs to possibly break down amyloid, should incorporate positron emission tomography (PET) imaging in both healthy individuals and Alzheimer’s patients of a wide range of ages to determine how and where amyloid processing and removal changes in the brain over time, he added.

“The brain’s frontal cortex has more amyloid production compared to the cerebellum during the aging process in human brains, which coincides with their Alzheimer’s-correlated pathologies in late life,” Thorwald said. “Future projects should examine amyloid over the life course in both cognitively normal and Alzheimer’s patients with both modulation of amyloid processing or removal of amyloid through monoclonal antibodies currently used in clinical trials for Alzheimer’s treatment.”

The monoclonal antibody treatment lecanemab has been observed to reduce Aβ plaques in clinical trials and recently received FDA approval for its potential to slow cognitive decline in Alzheimer’s patients, but the results warrant further careful research into long-term impact, Finch said.

"Lecanemab clearly works to diminish fibrillar amyloid," he said. "However, we are concerned with major side effects, including brain swelling and bleeding, that were 100% more than in controls, with unknown delayed or latent impact."

Learning more about how the brain processes and removes proteins such as Aβ could provide important insights into Alzheimer’s disease and its causes. Finch noted that very few cases of dementia occur with amyloid plaques, or masses of aggregated Aβ protein, as the only pathology present in affected patients’ brains. Instead, most cases present with more complicated tissue abnormalities, from buildup of additional types of protein to small bleeds in the brain: “The aging brain is a jungle.”

The study, “Amyloid futures in the expanding pathology of brain aging and dementia,” appeared online on December 19, 2022 in the journal Alzheimer’s &amp; Dementia. Along with Finch and Thorwald, coauthors include Justine Silva and Elizabeth Head of the University of California, Irvine.

Thorwald was supported by National Institute on Aging (NIA) grant T32-AG000037, and Silva and Head were supported by NIA grant P30AG066519. Lab studies were supported by National Institutes of Health grants to Finch (R01-AG051521, P50-AG05142, and P01-AG055367). Brain specimens were obtained from Alzheimer’s Disease Research Center Tissue Cores at USC (P50-AG005142 and AG066530), UC Irvine (P30-AG066519), and University of Washington (P30-AG066509 and U01-AG006781).

Integrating research infrastructures into infectious diseases surveillance operations: Focus on biobanks

Peer-Reviewed Publication

COMPUSCRIPT LTD

Focus on biobanks

https://doi.org/10.1016/j.bsheal.2022.10.001

 

Technological advances in the first two decades of the 21st century have profoundly impacted medical research, with large population cohorts and the biological sample collections and datasets held in biobanks becoming valued global resources that guide biomedical research, drug development, and medical practice. However, for biobanks to maximize the impact and scientific reach of their resources, they need to act within a complex network of infrastructures and activities. Different mechanisms have therefore emerged by which biobanks, including those for infectious diseases, can emerge as (part of) infrastructures, integrate within existing ones, or become independent yet interoperable components of the existing infrastructural landscape. To date, however, these mechanisms have received limited study. This article aims to address that knowledge gap: it illustrates these three high-level ways in which such infrastructures could integrate their activities and identifies the key pre-conditions for doing so, drawing on specific examples.

 

Keywords: Biobanking, Research infrastructures, Infectious diseases, Surveillance, Integration

 

 

# # # # # #

 

Biosafety and Health is sponsored by the Chinese Medical Association, managed by National Institute for Viral

Disease Control and Prevention, Chinese Center for Disease Control and Prevention (China CDC).

For more information, please visit https://www.journals.elsevier.com/biosafety-and-health

Editorial Board: https://www.sciencedirect.com/journal/biosafety-and-health/about/editorial-board

Biosafety and Health is available on ScienceDirect (https://www.sciencedirect.com/journal/biosafety-and-health).

Submissions to Biosafety and Health may be made using Editorial Manager®

(https://www.editorialmanager.com/bsheal/default.aspx).

CiteScore: 4.8

ISSN 2590-0536

 

# # # # # #

 

Article reference: Plebeian B. Medina, Jennifer Kealy, Zisis Kozlakidis, Integrating research infrastructures into infectious diseases surveillance operations: Focus on biobanks, Biosafety and Health, Volume 4, Issue 6, 2022, Pages 410-413, ISSN 2590-0536, https://doi.org/10.1016/j.bsheal.2022.10.001

Texas A&M research aims to improve Lyme disease diagnostics

Scientists are testing Raman spectroscopy, a technique used to detect vibrations at the molecular level, as a diagnostic tool for Lyme disease.

Peer-Reviewed Publication

TEXAS A&M UNIVERSITY

Research by two Texas A&M University scientists is focused on improving Lyme disease treatment outcomes by developing a test that’s both more accurate and more efficient than the current test for the infection. 

Lyme disease, the fastest growing vector-borne illness in the U.S., according to the Bay Area Lyme Foundation, is challenging to diagnose and can only be treated in the early stages of infection. Once the infection spreads to the nervous and muscular systems, it is both harder to detect and less susceptible to antibiotics. 

Dr. Artem Rogovskyy, an associate professor at the Texas A&M School of Veterinary Medicine & Biomedical Sciences (VMBS), and Dr. Dmitry Kurouski, an assistant professor in the Texas A&M Department of Biochemistry & Biophysics and the Department of Biomedical Engineering, are testing Raman spectroscopy, a technique used to detect vibrations at the molecular level, as a diagnostic tool for Lyme disease. 

The results of Rogovskyy and Kurouski’s second paper on Raman spectroscopy as a diagnostic tool for Lyme disease show that blood samples from mice and humans infected with the Lyme pathogen were identified more accurately with the Raman spectroscopy test than with two-tiered serology, the only method currently approved for diagnosing Lyme disease in humans in the United States.

“We're trying to develop a better test that would be simple, inexpensive and accurate,” Rogovskyy said. “By accurate, I mean highly sensitive and highly specific at the same time.”
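The two accuracy notions Rogovskyy names can be made concrete. Sensitivity is the fraction of truly infected samples the test flags as positive; specificity is the fraction of uninfected samples it correctly calls negative. A minimal sketch, with hypothetical counts:

```python
# Sensitivity and specificity from a test's confusion-matrix counts.
# The counts used below are illustrative, not from the study.

def sensitivity(true_pos, false_neg):
    """Fraction of infected samples that test positive."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of uninfected samples that test negative."""
    return true_neg / (true_neg + false_pos)

# Hypothetical evaluation: 100 infected and 100 uninfected samples.
sens = sensitivity(true_pos=90, false_neg=10)   # 0.90
spec = specificity(true_neg=95, false_pos=5)    # 0.95
```

A test can score well on one measure and poorly on the other, which is why Rogovskyy stresses being "highly sensitive and highly specific at the same time."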

The increased accuracy of Raman spectroscopy testing could improve Lyme disease diagnostic practices for both humans and animals believed to have been in contact with the disease. 

For animals, the new test would require a smaller sample that could easily be taken in the field away from a veterinary clinic or hospital, thus improving mobile veterinary practices. 

For humans, Raman spectroscopy testing could significantly decrease the amount of time needed to complete testing, increase the accuracy of the diagnosis, lower the cost of diagnosing the disease, and improve overall health outcomes by definitively diagnosing the disease earlier. 

Rogovskyy said the team is in the process of validating the test through additional studies. If validated, it could become an important tool for diagnosing Lyme disease worldwide, especially in more remote areas outside the U.S. where the disease is prevalent, by enabling testing outside of traditional medical and hospital settings.

The researchers’ collaborative efforts have received funding from the Bay Area Lyme Foundation, a nonprofit that collaborates with world-class scientists and institutions to accelerate medical breakthroughs for Lyme disease. They also received human blood samples from the Lyme Disease Biobank, a clinical specimen repository.  

Rogovskyy and Kurouski’s first paper on Raman spectroscopy was the first proof-of-concept study to explore the technique for diagnosing mice infected with the Lyme pathogen. Their second paper added data from mice infected with European Lyme pathogens and included numerous human blood samples supplied by the Lyme Disease Biobank.

Rogovskyy anticipates the team may be able to publish more findings in about two years, once the next phase of their research, which entails testing human samples in a blind manner, is complete.

NSF awards UMBC’s Lauren Clay $624K Convergence Accelerator grant to address food insecurity in disasters

Grant and Award Announcement

UNIVERSITY OF MARYLAND BALTIMORE COUNTY

Lauren Clay 

IMAGE: Lauren Clay, associate professor and chair of emergency health services at UMBC

CREDIT: Marlayna Demond for UMBC

Longstanding food insecurity problems in the U.S. and around the world, exacerbated by the pandemic, are projected to increase over the coming decades as food, water, and energy demands increase and environmental crises worsen. With this in mind, the National Science Foundation (NSF) is investing $11 million toward solutions to address the nutritional needs of vulnerable and under-resourced communities through its Convergence Accelerator Program.

UMBC’s Lauren Clay, associate professor and chair of emergency health services, is one of 16 Convergence Accelerator awardees selected for Phase I of the program. Clay was awarded $624,000 for her project to improve food system resilience and decrease disaster-induced food insecurity in communities impacted by hurricanes.

Supporting food system resilience

Clay’s proposal explains that 11 to 15 percent of the U.S. population experienced food insecurity annually between 2008 and 2018, and that households struggling with food insecurity before a disaster are at greatest risk for serious food access issues when a disaster strikes, and long after.

“Food and nutrition insecurity rates can increase threefold following disasters,” Clay notes. “Increased food and nutrition insecurity rates persist for years while households and communities recover.”

“Communities across the U.S. are planning for growing threats related to climate disasters. Food security is a basic human need and is highly susceptible to disruption when families and communities experience disasters,” says Clay. “I’m excited to work with a multi-disciplinary and multi-sector team to develop a new tool for measuring community food security to support communities planning for, responding to, and recovering from hurricanes.”

Converging on solutions

The NSF Convergence Accelerator Program seeks to address national-scale challenges in science, engineering, and society through a collaborative research process that brings together expertise from multiple scientific disciplines, known as convergence research. The food and nutrition focus was recently added to the Convergence Accelerator, which also includes approaches towards combating challenges related to population health and climate change.

“We hope to create a group of synergistic efforts that advance regenerative agriculture practices, reduce water usage, provide equitable access to nutritious and affordable food for disadvantaged communities, and spur technology and job creation,” says Douglas Maughan, head of the Convergence Accelerator, in NSF’s announcement of award recipients.

Over the course of nine months, Clay and her team will work to develop the Food Index for Resilience, Security, & Tangible Solutions, called FIRST. This index will measure food system functioning in communities and is intended to be a resource that can be used to respond to and recover from disasters and environmental changes. 

This effort builds on Clay’s prior and ongoing research to address disaster-specific food insecurity issues. She was also recently awarded an NSF Faculty Early Career Development (CAREER) award to develop a sociocultural model called Food Environment in Disasters (FED) and other tools to improve the understanding and monitoring of food availability, acceptability, and accessibility during disasters. 

Following Phase I of this project, participating teams will take part in a formal pitch and Phase II proposal and could receive up to $5 million of additional support. Selected Phase II teams will further develop their solutions and sustainability development plans over the course of 24 months, to rapidly meet the needs of global communities.

AIR POLLUTION

Researchers study new particle formation events in the urban atmosphere

Study gives first evidence on the importance of transport in these events in the urban atmosphere

Peer-Reviewed Publication

PARTICUOLOGY

Sketch map for how the “polluted” atmospheric new particle formation events occur 

IMAGE: Scientists from Peking University, University of Gothenburg, and Shanghai Academy of Environmental Sciences discovered the combined impacts of regional transport and local nucleation on nanoparticles in the urban atmosphere.

CREDIT: Dongjie Shang, Min Hu, et al., Peking University (PKU)

An international research team has conducted a study of new particle formation (NPF) events in the atmosphere of Beijing, which provides the first evidence of the importance of transport in NPF events in the urban atmosphere. Such atmospheric NPF events influence air quality, climate, and human health.

 

The team's findings were published in the journal Particuology on January 9, 2023.

 

NPF events are an important source of secondary particles in the atmosphere, significantly influencing cloud albedo and air quality. The mechanisms by which NPF events occur under high aerosol loadings (so-called "polluted" NPF events) have not previously been fully understood, limiting the precision of climate models and the effectiveness of particle pollution controls.

 

To better understand how these "polluted" NPF events occur, the research team conducted a one-month comprehensive field measurement in Beijing during the summer of 2016. The team discovered that the "clean" NPF events were caused by local nucleation and growth, while the "polluted" NPF events were caused by both local nucleation-growth and regional transport. Regional transport is the process in which pollutants from upwind sources impact the air quality in a downwind location. The team's findings emphasize the importance of transport for nanoparticles in relatively polluted atmospheres and show that regional joint air pollution control is an essential policy.

 

Researchers have widely studied atmospheric NPF in China because of its negative impact on air quality, climate and human health. Since 2004, the PKU aerosol team has conducted continuous measurements of NPF and observed a unique polluted type of NPF process in the atmosphere of Beijing. During this process, the background particle level was high and the particle burst covered a wide size range (3-20 nm). "Although many mechanisms of NPF events in clean atmosphere have been established, the mechanism of how polluted NPF events occurred remained ambiguous," said Min Hu, a professor at Peking University.

 

To analyze the effects of transport on NPF events, the team conducted their one-month observation during the summer of 2016 at a suburban site in Beijing, about 40 km northwest of the urban center. This site is strongly influenced by regional transport controlled by mountain and valley breeze. They comprehensively investigated both "clean" and "polluted" NPF events using particle, precursor and meteorological data. To discover if the NPF events occurred on a larger scale, they conducted simultaneous measurements in the urban area at the main campus of Peking University.

 

During the summer of 2016, the team found that the polluted NPF events were caused by both regional transport and local nucleation of nanoparticles in the atmosphere. Transport brings 3-20 nm particles from upwind areas in the morning, and local nucleation contributes molecular clusters, such as sulfuric acid dimers, from around 12:00 local time. The team also found that wind direction differs on the "polluted" NPF days compared with the normal "clean" NPF days and the days without NPF events. "Our findings imply that even when the local emission of particles and NPF gaseous precursors is strictly controlled, the transport can still produce large amounts of secondary particles in the local atmosphere and then trigger the haze events. Thus, joint control measures are highly required on the regional scale to achieve further particle pollution mitigation," said Hu.
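The day-classification logic described above can be caricatured in a few lines. This is only a hedged sketch of the distinction the study draws; the function name, inputs, and the background-loading threshold are all hypothetical, not the study's actual criteria:

```python
# Illustrative classifier for observation days: an NPF day requires a
# burst of 3-20 nm particles; the day is "polluted" NPF if it occurs
# against a high background aerosol loading. Threshold is hypothetical.

def classify_day(burst_3_20nm, background_pm_ugm3, high_loading=35.0):
    """Label a day as 'no NPF', 'clean NPF', or 'polluted NPF'."""
    if not burst_3_20nm:
        return "no NPF"
    if background_pm_ugm3 >= high_loading:
        return "polluted NPF"
    return "clean NPF"
```

In the study itself the distinction also drew on precursor measurements, meteorology, and wind direction, so a real classification would use far more than two inputs.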

 

Looking ahead to future research, the team plans to conduct on-site monitoring campaigns and laboratory simulations with more parameters, including organic molecules, to gain a deeper, molecular-level understanding of the "polluted" NPF events. "The ultimate goal is to improve the mechanisms of NPF events, both clean and polluted type, in the air quality models and climate models, and to reduce uncertainties in policies addressing climate change and controlling air pollution," said Hu.

 

The research team includes Dongjie Shang, Min Hu, Lizi Tang, Xin Fang, Ying Liu, Yusheng Wu, Zhuofei Du, Xuhui Cai from Peking University; Min Hu, Zhijun Wu, Song Guo, and Yuanhang Zhang from Peking University and Nanjing University of Information Science & Technology; Shengrong Lou from Shanghai Academy of Environmental Sciences; and Mattias Hallquist from University of Gothenburg.

 

This research is funded by the National Natural Science Foundation of China (NSFC), the NSFC - Creative Research Group Fund, the National Key Research and Development Program of China, and the bilateral Sweden–China framework program "Photochemical smog in China: formation, transformation, impact and abatement strategies."

 

Particuology (IF=3.251) is an interdisciplinary journal that publishes frontier research articles and critical reviews on the discovery, formulation and engineering of particulate materials, processes and systems. Topics are broadly relevant to the production of materials, pharmaceuticals and food, the conversion of energy resources, and protection of the environment. For more information, please visit: https://www.journals.elsevier.com/particuology.

Solar-powered system converts plastic and greenhouse gases into sustainable fuels

Peer-Reviewed Publication

UNIVERSITY OF CAMBRIDGE

Solar-powered system converts plastic and greenhouse gases into sustainable fuels 

IMAGE: Researchers have developed a system that can transform plastic waste and greenhouse gases into sustainable fuels and other valuable products, using just the energy from the Sun.

CREDIT: University of Cambridge

Researchers have developed a system that can transform plastic waste and greenhouse gases into sustainable fuels and other valuable products – using just the energy from the Sun.

The researchers, from the University of Cambridge, developed the system, which can convert two waste streams into two chemical products at the same time – the first time this has been achieved in a solar-powered reactor.

The reactor converts the carbon dioxide (CO2) and plastics into different products that are useful in a range of industries. In tests, CO2 was converted into syngas, a key building block for sustainable liquid fuels, and plastic bottles were converted into glycolic acid, which is widely used in the cosmetics industry. The system can easily be tuned to produce different products by changing the type of catalyst used in the reactor.

Converting plastics and greenhouse gases – two of the biggest threats facing the natural world – into useful and valuable products using solar energy is an important step in the transition to a more sustainable, circular economy. The results are reported in the journal Nature Synthesis.

“Converting waste into something useful using solar energy is a major goal of our research,” said Professor Erwin Reisner from the Yusuf Hamied Department of Chemistry, the paper’s senior author. “Plastic pollution is a huge problem worldwide, and often, many of the plastics we throw into recycling bins are incinerated or end up in landfill.”

Reisner also leads the Cambridge Circular Plastics Centre (CirPlas), which aims to eliminate plastic waste by combining blue-sky thinking with practical measures.

Other solar-powered ‘recycling’ technologies hold promise for addressing plastic pollution and for reducing the amount of greenhouse gases in the atmosphere, but to date, they have not been combined in a single process.

“A solar-driven technology that could help to address plastic pollution and greenhouse gases at the same time could be a game-changer in the development of a circular economy,” said Subhajit Bhattacharjee, the paper’s co-first author.

“We also need something that’s tuneable, so that you can easily make changes depending on the final product you want,” said co-first author Dr Motiar Rahaman.

The researchers developed an integrated reactor with two separate compartments: one for plastic, and one for greenhouse gases. The reactor uses a light absorber based on perovskite – a promising alternative to silicon for next-generation solar cells.

The team designed different catalysts, which were integrated into the light absorber. By changing the catalyst, the researchers could then change the end product. Tests of the reactor under normal temperature and pressure conditions showed that the reactor could efficiently convert PET plastic bottles and CO2 into different carbon-based fuels such as CO, syngas or formate, in addition to glycolic acid. The Cambridge-developed reactor produced these products at a rate that is also much higher than conventional photocatalytic CO2 reduction processes.

“Generally, CO2 conversion requires a lot of energy, but with our system, basically you just shine a light at it, and it starts converting harmful products into something useful and sustainable,” said Rahaman. “Prior to this system, we didn’t have anything that could make high-value products selectively and efficiently.”

“What’s so special about this system is the versatility and tuneability – we’re making fairly simple carbon-based molecules right now, but in future, we could be able to tune the system to make far more complex products, just by changing the catalyst,” said Bhattacharjee.

Reisner recently received new funding from the European Research Council to help the development of their solar-powered reactor. Over the next five years, they hope to further develop the reactor to produce more complex molecules. The researchers say that similar techniques could someday be used to develop an entirely solar-powered recycling plant.

“Developing a circular economy, where we make useful things from waste instead of throwing it into landfill, is vital if we’re going to meaningfully address the climate crisis and protect the natural world,” said Reisner. “And powering these solutions using the Sun means that we’re doing it cleanly and sustainably.”

The research was supported in part by the European Union, the European Research Council, the Cambridge Trust, Hermann and Marianne Straniak Stiftung, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI). Erwin Reisner is a Fellow of St John’s College, Cambridge.


Researchers have developed a system that can transform plastic waste and greenhouse gases into sustainable fuels and other valuable products – using just the energy from the Sun. The researchers, from the University of Cambridge, developed the system, which can convert two waste streams into two chemical products at the same time – the first time this has been achieved in a solar-powered reactor.

CREDIT: UNIVERSITY OF CAMBRIDGE

Chemical researchers discover catalyst to make renewable paints, coatings, and diapers

Research discovery will enable the manufacturing of biorenewable materials from trees and corn

Peer-Reviewed Publication

UNIVERSITY OF MINNESOTA

Acrylic acid graphic 

IMAGE: UNIVERSITY OF MINNESOTA RESEARCHERS HAVE INVENTED NEW CATALYST TECHNOLOGY THAT IMPROVES THE PROCESS AND SUBSTANTIALLY REDUCES THE COST OF MANUFACTURING RENEWABLE CHEMICALS THAT CAN BE USED IN A WIDE RANGE OF PRODUCTS INCLUDING PAINTS, COATINGS AND DIAPERS. view more 

CREDIT: JOHN BEUMER, NSF CENTER FOR SUSTAINABLE POLYMERS, UNIVERSITY OF MINNESOTA

A team led by University of Minnesota Twin Cities researchers has invented a groundbreaking new catalyst technology that converts renewable materials like trees and corn into the key chemicals acrylic acid and acrylates, which are used in paints, coatings, and superabsorbent polymers. The new catalyst technology is also highly efficient, which means lower costs for manufacturing renewable chemicals.

The new catalyst formulation converts lactic acid-based chemicals derived from corn to acrylic acid and acrylates with the highest yield achieved to date. The technology exhibits substantially higher performance when benchmarked against other classes of leading catalysts.

The research is published online in the Journal of the American Chemical Society Gold (JACS Au), a leading open access journal of the American Chemical Society. 

The research team was supported by the U.S. National Science Foundation through the NSF Center for Sustainable Polymers, a multi-university collaborative team with a mission to transform how plastics are made, unmade, and remade through innovative research.

The public is most familiar with acrylic acid and associated acrylates through their use in everyday items, from paints and coatings to sticky adhesives to the superabsorbent materials used in diapers. These chemicals and materials have been made from fossil fuels for the past century. Over the last few decades, however, the corn industry has expanded beyond food and livestock feed into the manufacturing of useful chemicals. One such corn-derived chemical is sustainable lactic acid, a key ingredient in the manufacture of polylactic acid (PLA), the renewable and compostable plastic used in many everyday applications.

Lactic acid can also be converted to acrylic acid and acrylates using catalysts. However, until this new catalyst discovery, traditional catalysts were very inefficient, achieving low yields and making the overall process too expensive.
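As a back-of-the-envelope illustration of why yield matters here (this calculation is not part of the study; the molar masses are standard values), the conversion of lactic acid to acrylic acid is a dehydration that loses one water molecule per molecule converted, which caps the theoretical mass yield at about 80%:

```python
# Theoretical mass yield for lactic acid -> acrylic acid + water
# (dehydration), using standard atomic weights.
ATOMIC_MASS = {"C": 12.011, "H": 1.008, "O": 15.999}

def molar_mass(formula: dict) -> float:
    """Sum atomic masses for a formula given as {element: count}."""
    return sum(ATOMIC_MASS[el] * n for el, n in formula.items())

lactic_acid = molar_mass({"C": 3, "H": 6, "O": 3})   # C3H6O3, ~90.08 g/mol
acrylic_acid = molar_mass({"C": 3, "H": 4, "O": 2})  # C3H4O2, ~72.06 g/mol

# Even at 100% molar conversion, each kilogram of lactic acid can give
# at most ~0.80 kg of acrylic acid, since one H2O is lost per molecule.
max_mass_yield = acrylic_acid / lactic_acid
print(f"{max_mass_yield:.2f}")  # -> 0.80
```

Any real process yield therefore sits below this stoichiometric ceiling; a catalyst that pushes the achieved molar yield toward 100% directly translates into less wasted feedstock and lower cost.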

“Our new catalyst formulation discovery achieves the highest yield to date of acrylic acid from lactic acid,” said Paul Dauenhauer, professor in the University of Minnesota Department of Chemical Engineering and Materials Science. “We benchmarked the performance of our new catalyst to all prior catalysts, and the performance far exceeds previous examples.”

The new catalyst formulation substantially reduces the cost of manufacturing renewable acrylic acid and acrylates from corn by improving yield and reducing waste. For the first time, this could bring the price of renewable acrylic acid below that of its fossil-derived counterpart.

The economic opportunity generated by the new catalyst is being pursued by Låkril Technologies, a startup company that aims to manufacture low-cost renewable acrylic acid and acrylates. Låkril Technologies has licensed the catalyst technology from the University of Minnesota and will develop it beyond the laboratory.

“Chemical manufacturing has relied on a class of catalysts called ‘zeolites’ for half a century,” says Dr. Chris Nicholas, CEO of Låkril Technologies. “Because the new catalyst discovery is based on a zeolite formulation already available at scale, our new process to make acrylic acid and acrylates will achieve low cost with low risk.”

Låkril Technologies, located in Chicago, has already received $1.4 million in pre-seed financing to scale the process. The Iowa Corn Growers Association led the financing, with participation from the Kentucky Corn Growers Association, along with grants from the Minnesota Corn Research and Promotion Council, the Indiana Corn Marketing Council, and the Corn Marketing Council of Michigan, as well as Small Business Innovation Research (SBIR) awards from both the U.S. Department of Agriculture and the U.S. Department of Energy.

At the University of Minnesota, the research team plans to continue their basic research on catalyst design to understand the fundamental aspects of the chemistry with financial support from the Center for Sustainable Polymers headquartered at the University of Minnesota.

“This is a wonderful example of how addressing important basic research questions that are at the heart of fundamental catalysis can lead to innovative new processes that have true technological promise,” said Marc Hillmyer, director of the Center for Sustainable Polymers and a professor in the University of Minnesota Department of Chemistry. “A grand challenge in the Center for Sustainable Polymers is the efficient and sustainable conversion of biomass to polymer ingredients, and this work represents a groundbreaking solution to that challenge that will have lasting impact.”

Learn more about renewable chemistry on the Dauenhauer Research Group website.