Thursday, August 18, 2022

Floating ‘artificial leaves’ ride the wave of clean fuel production

Peer-Reviewed Publication

UNIVERSITY OF CAMBRIDGE

Floating 'artificial leaf' in Cambridge 

IMAGE: A FLOATING ARTIFICIAL LEAF - WHICH GENERATES CLEAN FUEL FROM SUNLIGHT AND WATER - ON THE RIVER CAM NEAR KING'S COLLEGE CHAPEL IN CAMBRIDGE, UK.

CREDIT: VIRGIL ANDREI

Researchers have developed floating ‘artificial leaves’ that generate clean fuels from sunlight and water, and could eventually operate on a large scale at sea.

The researchers, from the University of Cambridge, designed ultra-thin, flexible devices, which take their inspiration from photosynthesis – the process by which plants convert sunlight into food. Since the low-cost, autonomous devices are light enough to float, they could be used to generate a sustainable alternative to petrol without taking up space on land.

Outdoor tests of the lightweight leaves on the River Cam – near iconic Cambridge sites including the Bridge of Sighs, the Wren Library and King’s College Chapel – showed that they can convert sunlight into fuels as efficiently as plant leaves.

This is the first time that clean fuel has been generated on water, and if scaled up, the artificial leaves could be used on polluted waterways, in ports or even at sea, and could help reduce the global shipping industry’s reliance on fossil fuels. The results are reported in the journal Nature.

While renewable energy technologies, such as wind and solar, have become significantly cheaper and more available in recent years, for industries such as shipping, decarbonisation is a much taller order. Around 80% of global trade is transported by cargo vessels powered by fossil fuels, yet the sector has received remarkably little attention in discussions around the climate crisis.

For several years, Professor Erwin Reisner’s research group in Cambridge has been working to address this problem by developing sustainable alternatives to petrol based on the principles of photosynthesis. In 2019, they developed an artificial leaf, which makes syngas – a key intermediate in the production of many chemicals and pharmaceuticals – from sunlight, carbon dioxide and water.

The earlier prototype generated fuel by combining two light absorbers with suitable catalysts. However, it incorporated thick glass substrates and moisture protective coatings, which made the device bulky.

“Artificial leaves could substantially lower the cost of sustainable fuel production, but since they’re both heavy and fragile, they’re difficult to produce at scale and transport,” said Dr Virgil Andrei from Cambridge’s Yusuf Hamied Department of Chemistry, the paper’s co-lead author.

“We wanted to see how far we can trim down the materials these devices use, while not affecting their performance,” said Reisner, who led the research. “If we can trim the materials down far enough that they’re light enough to float, then it opens up whole new ways that these artificial leaves could be used.”

For the new version of the artificial leaf, the researchers took their inspiration from the electronics industry, where miniaturisation techniques have led to the creation of smartphones and flexible displays, revolutionising the field.

The challenge for the Cambridge researchers was how to deposit light absorbers onto lightweight substrates and protect them against water infiltration. To overcome these challenges, the team used thin-film metal oxides and materials known as perovskites, which can be coated onto flexible plastic and metal foils. The devices were covered with micrometre-thin, water-repellent carbon-based layers that prevented moisture degradation. They ended up with a device that not only works, but also looks like a real leaf.

“This study demonstrates that artificial leaves are compatible with modern fabrication techniques, representing an early step towards the automation and up-scaling of solar fuel production,” said Andrei. “These leaves combine the advantages of most solar fuel technologies, as they achieve the low weight of powder suspensions and the high performance of wired systems.”

Tests of the new artificial leaves showed that they can split water into hydrogen and oxygen, or reduce CO2 to syngas. While additional improvements will need to be made before they are ready for commercial applications, the researchers say this development opens whole new avenues in their work.

“Solar farms have become popular for electricity production; we envision similar farms for fuel synthesis,” said Andrei. “These could supply coastal settlements, remote islands, cover industrial ponds, or avoid water evaporation from irrigation canals.”

“Many renewable energy technologies, including solar fuel technologies, can take up large amounts of space on land, so moving production to open water would mean that clean energy and land use aren’t competing with one another,” said Reisner. “In theory, you could roll up these devices and put them almost anywhere, in almost any country, which would also help with energy security.”

The research was supported in part by the European Research Council, the Cambridge Trust, the Winton Programme for the Physics of Sustainability, the Royal Academy of Engineering, and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI). Virgil Andrei and Erwin Reisner are Fellows of St John’s College, Cambridge.

Floating artificial leaf

CAPTION

A floating artificial leaf - which generates clean fuel from sunlight and water - on the River Cam near the Bridge of Sighs in Cambridge, UK.

CREDIT

Virgil Andrei

Artificial leaf in the lab 

CAPTION

Researchers from the University of Cambridge have designed ultra-thin, flexible devices, which take their inspiration from photosynthesis – the process by which plants convert sunlight into food. Since the low-cost, autonomous devices are light enough to float, they could be used to generate a sustainable alternative to petrol without taking up space on land.

CREDIT

Virgil Andrei



Capturing high pressures in diamond capsules

Peer-Reviewed Publication

CENTER FOR HIGH PRESSURE SCIENCE & TECHNOLOGY ADVANCED RESEARCH

Nanostructured diamond capsules (NDCs) 

IMAGE: A PERSPECTIVE CARTOON DRAWING OF NANOSTRUCTURED DIAMOND CAPSULES (NDCS) WITH HIGH-PRESSURE ARGON NANODOMAINS EMBEDDED IN THE MATRIX. MODERN DIAGNOSTIC PROBES SUCH AS HARD/SOFT X-RAY, ULTRAVIOLET-VISIBLE-INFRARED LIGHT, ELECTRON, AND NEUTRON BEAMS ARE ALL APPLICABLE TO THE NDCS SAMPLES FOR INVESTIGATIONS.

CREDIT: CHARLES ZENG

Preservation of the high-pressure states of materials at ambient conditions is a long-sought-after goal for fundamental research and practical applications. A team of scientists led by Drs. Zhidan (Denise) Zeng, Qiaoshi Zeng, and Ho-Kwang Mao from the Center for High Pressure Science and Technology Advanced Research (HPSTAR) and Prof. Wendy Mao from Stanford University report an innovative breakthrough where they were able to maintain the extraordinary properties of high-pressure materials in free-standing, nanostructured diamond capsules without the support of traditional bulky pressure vessels. Their work was recently published in Nature.

Modern technology is built upon access to materials with suitable physical and chemical properties that can be used to perform specific functions in various devices. Technological advances, therefore, are often dictated by the development of superior materials with desirable properties. High pressure can drastically alter or tune the properties of all materials, providing fertile ground for discovering novel materials with extremely favorable properties. The caveat, however, is that these favorable properties often exist only under pressure, while the sample remains in a bulky high-pressure vessel, limiting scientific investigation and potential applications. For the past century, scientists have tried to overcome this difficulty, succeeding only with "quenchable" phases, in which novel materials synthesized at high pressure retain their favorable properties after the pressure is released. A well-known example is the high-pressure conversion of ordinary carbon into diamond, which keeps its brilliance and other exceptional properties after being retrieved at ordinary pressures. Unfortunately, such quenchable phases are extremely rare, largely rendering high-pressure materials studies of academic interest only, with little practical value in the ambient environment.

The HPSTAR and Stanford research group developed a novel approach that can quench even tenuous gases and preserve their high-pressure properties. They compressed glassy carbon, an amorphous form of porous carbon, together with argon gas to 50 gigapascals (about 500,000 times atmospheric pressure) and heated the sample to 3,320 degrees Fahrenheit. Glassy carbon, which is impermeable to gases at ordinary conditions, absorbs argon like a sponge at high pressures. The high pressure and temperature convert the carbon into diamond and trap the now-solid, high-pressure argon in its pores. The sample retrieved at ambient conditions behaves like a nanocrystalline diamond composite whose numerous isolated pores act like tiny diamond capsules filled with argon. The residual pressure preserved in the argon by the diamond capsules is as high as 22 gigapascals — about 220 times the pressure at the bottom of the Mariana Trench. Better yet, the pressurized argon is sealed in by only nanometer-thick diamond skins, allowing its extraordinary properties to be probed by modern analytical instruments that require near-vacuum environments, such as electron microscopy.

"We directly observe many nanometer-sized high-pressure argon grains encapsulated in the nano-diamond matrix by high-resolution transmission electron microscopy; thus, we named them nanostructured diamond capsules (NDCs)," explained Dr. Denise Zhidan Zeng, the lead author of this work. "One of the keys to realizing the concept of NDCs is choosing the right carbon precursor, which is sp2-bonded and has pre-existing enclosed sample chambers. This is by no means limited to glassy carbon. Therefore, a variety of crystalline, amorphous, and low-dimensional carbon allotropes could potentially be used as the precursor carbon as well, offering a wide range of capsule materials for optimization of the NDC process."

"Obtaining consistent results from multiple complementary diagnostic probes is a hallmark of modern materials research. However, in-situ high-pressure studies have always required highly penetrating probes, such as hard x-rays, because of the thick high-pressure vessel walls involved. Therefore, many powerful and versatile probes that require a near-vacuum environment, such as electron microscopy and vacuum-ultraviolet to soft x-ray spectroscopy, unfortunately remain incompatible with high-pressure science and technology. This has severely hindered our efforts to understand many high-pressure materials," said Dr. Qiaoshi Zeng. "By synthesizing NDCs, we offer a general method to remove bulky pressure vessels while maintaining the high-pressure conditions, and therefore the high-pressure behavior, in our samples. We can now employ almost all modern diagnostic probes to obtain detailed information on the atomic/electronic structures, compositions, and bonding nature of materials at high pressure inside NDCs, including various techniques based on transmission electron microscopy. We are excited about the possibility that an approach based on NDCs will bring high-pressure explorations on par with conventional condensed-matter investigations and applications."

"Beyond the gases that we explored in our study, we also expect the concept of NDCs to be generally applicable to various solid samples," said Prof. Wendy Mao. "Moreover, NDC samples are in principle cumulative, with the potential for unlimited, repeated syntheses, thus removing the limitation that high-pressure phenomena exist only in a tiny sample inside a large pressure chamber. Therefore, our work demonstrates the first, critical step toward the grand challenge of practical applications for previously unquenchable high-pressure materials."

This work involved many other co-authors from HPSTAR in China and Argonne National Laboratory in the USA. Financial support came from the National Key Research and Development Program of China (2019YFA0708502, 2018YFA0703400, and 2021YFA0718900) and the National Natural Science Foundation of China (51871054 and U1930401).

More information: "Preservation of high-pressure volatiles in nanostructured diamond capsules", Zeng et al., Nature 608, 513–517 (2022), https://www.nature.com/articles/s41586-022-04955-z.

New University of Rochester study shows promising alternative to opioids for pain after dental extractions

Study published in JAMA Network Open

Peer-Reviewed Publication

UNIVERSITY OF ROCHESTER MEDICAL CENTER

As the opioid epidemic rages on during the COVID-19 pandemic, a study by the University of Rochester Medical Center’s Eastman Institute for Oral Health (EIOH) shows promise for a non-opioid alternative for acute dental pain.

Prescription opioid-involved death rates increased more than 16% from 2019 to 2020, and prescription opioids were involved in nearly 18% of all opioid overdose deaths, according to the Centers for Disease Control and Prevention.

Published today in JAMA Network Open, the study compares two large groups of patients representing an equal number of men and women ages 18-93 who had dental extractions at the Institute’s Howitt Urgent Dental Care clinic in Rochester, New York.

“We hypothesized that using a combination of the non-opioid pain medications and adding gabapentin to the mix for pain would be an effective strategy to minimize or eliminate opioids for dental pain,” said Yanfang Ren, DDS, PhD, MPH, professor and clinical chief, Howitt Urgent Dental Care. 

For the 3,300 patients in the first group in 2012, those with mild pain were treated with ibuprofen or acetaminophen. For moderate to severe pain, patients were prescribed higher doses of ibuprofen or opioid combinations including hydrocodone, oxycodone or codeine.

But the second group of nearly 3,800 people who had extractions from March 2021 through February 2022, received no opioids whatsoever.

Similarly, for mild pain, the second group was prescribed acetaminophen or ibuprofen. For moderate to severe pain, higher doses of ibuprofen or an ibuprofen and acetaminophen combination were prescribed.

For patients who were unable to take ibuprofen or acetaminophen due to health issues or potential interactions with existing medications, an opioid combination was often prescribed in 2012, but a gabapentin combination was used instead in the second group as a non-opioid alternative. The results were promising when effectiveness was evaluated using a real-world measure: the proportion of patients returning for additional pain treatment after receiving the prescribed analgesics.

“Although the American Dental Association recommends nonsteroidal anti-inflammatory drugs for managing pain,” Dr. Ren said, “dentists frequently prescribe opioids for dental pain and contribute substantially to new and persistent opioid use. This may partly be explained by the lack of alternatives to opioids, especially when the ibuprofen or acetaminophen are ineffective or can’t be tolerated.”

“This study represents continued efforts by our team and other dentists to minimize the use of opioids for dental pain,” said Eli Eliav, DDS, PhD, the director of EIOH and a pain expert.  “Additional studies, preferably randomized controlled clinical trials, are needed to confirm the safety and effectiveness of this approach. It is our duty to continuously seek safe and effective treatment for our patients in pain.”

Other authors on the study include Qirong Huang, DDS, MS, Linda Rasubala, DDS, PhD, Richard H. Gracely, PhD, Junad Khan, BDS, MSD, MPH, PhD, and Eli Eliav, DMD, PhD.

A new neuromorphic chip for AI on the edge, at a small fraction of the energy and size of today’s compute platforms

The NeuRRAM chip is the first compute-in-memory chip to demonstrate a wide range of AI applications at a fraction of the energy consumed by other platforms while maintaining equivalent accuracy

Peer-Reviewed Publication

UNIVERSITY OF CALIFORNIA - SAN DIEGO

The NeuRRAM neuromorphic chip 

IMAGE: A TEAM OF INTERNATIONAL RESEARCHERS DESIGNED, MANUFACTURED AND TESTED THE NEURRAM CHIP.

CREDIT: DAVID BAILLOT/UNIVERSITY OF CALIFORNIA SAN DIEGO

An international team of researchers has designed and built a chip that runs computations directly in memory and can run a wide variety of AI applications, all at a fraction of the energy consumed by general-purpose AI computing platforms.

The NeuRRAM neuromorphic chip brings AI a step closer to running on a broad range of edge devices, disconnected from the cloud, where they can perform sophisticated cognitive tasks anywhere and anytime without relying on a network connection to a centralized server. Applications abound in every corner of the world and every facet of our lives, and range from smart watches to VR headsets, smart earbuds, smart sensors in factories and rovers for space exploration.

The NeuRRAM chip is not only twice as energy efficient as the state-of-the-art “compute-in-memory” chips, an innovative class of hybrid chips that runs computations in memory, it also delivers results that are just as accurate as conventional digital chips. Conventional AI platforms are a lot bulkier and typically are constrained to using large data servers operating in the cloud. 

In addition, the NeuRRAM chip is highly versatile and supports many different neural network models and architectures. As a result, the chip can be used for many different applications, including image recognition and reconstruction as well as voice recognition.

“The conventional wisdom is that the higher efficiency of compute-in-memory is at the cost of versatility, but our NeuRRAM chip obtains efficiency while not sacrificing versatility,” said Weier Wan, the paper’s first corresponding author and a recent Ph.D. graduate of Stanford University who worked on the chip while at UC San Diego, where he was co-advised by Gert Cauwenberghs in the Department of Bioengineering. 

The research team, co-led by bioengineers at the University of California San Diego, presents their results in the Aug. 17 issue of Nature.

Currently, AI computing is both power hungry and computationally expensive. Most AI applications on edge devices involve moving data from the devices to the cloud, where the AI processes and analyzes it. Then the results are moved back to the device. That’s because most edge devices are battery-powered and as a result only have a limited amount of power that can be dedicated to computing. 

By reducing power consumption needed for AI inference at the edge, this NeuRRAM chip could lead to more robust, smarter and accessible edge devices and smarter manufacturing. It could also lead to better data privacy as the transfer of data from devices to the cloud comes with increased security risks. 

On AI chips, moving data from memory to computing units is one major bottleneck. 

“It’s the equivalent of doing an eight-hour commute for a two-hour work day,” Wan said. 

To solve this data transfer issue, researchers used what is known as resistive random-access memory, a type of non-volatile memory that allows for computation directly within memory rather than in separate computing units. RRAM and other emerging memory technologies used as synapse arrays for neuromorphic computing were pioneered in the lab of Philip Wong, Wan’s advisor at Stanford and a main contributor to this work. Computation with RRAM chips is not necessarily new, but generally it leads to a decrease in the accuracy of the computations performed on the chip and a lack of flexibility in the chip’s architecture. 

"Compute-in-memory has been common practice in neuromorphic engineering since it was introduced more than 30 years ago,” Cauwenberghs said.  “What is new with NeuRRAM is that the extreme efficiency now goes together with great flexibility for diverse AI applications with almost no loss in accuracy over standard digital general-purpose compute platforms."

A carefully crafted methodology was key to the work with multiple levels of “co-optimization” across the abstraction layers of hardware and software, from the design of the chip to its configuration to run various AI tasks. In addition, the team made sure to account for various constraints that span from memory device physics to circuits and network architecture. 

“This chip now provides us with a platform to address these problems across the stack, from devices and circuits to algorithms,” said Siddharth Joshi, an assistant professor of computer science and engineering at the University of Notre Dame, who started working on the project as a Ph.D. student and postdoctoral researcher in Cauwenberghs’ lab at UC San Diego.

Chip performance

Researchers measured the chip’s energy efficiency using a metric known as energy-delay product, or EDP. EDP combines the amount of energy consumed for each operation and the amount of time it takes to complete the operation. By this measure, the NeuRRAM chip achieves 1.6 to 2.3 times lower EDP (lower is better) and 7 to 13 times higher computational density than state-of-the-art chips.
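As a quick illustration of how the metric works, here is a minimal sketch in Python; the per-operation energy and delay numbers below are invented for illustration and are not taken from the paper:

```python
def energy_delay_product(energy_joules, delay_seconds):
    """EDP = energy consumed per operation x time to complete it (lower is better)."""
    return energy_joules * delay_seconds

# Hypothetical per-operation figures, for illustration only
edp_conventional = energy_delay_product(2.0e-12, 1.0e-9)  # 2 pJ per op, 1 ns per op
edp_neurram = energy_delay_product(1.0e-12, 0.9e-9)       # 1 pJ per op, 0.9 ns per op
improvement = edp_conventional / edp_neurram              # ~2.2x lower EDP
```

Because EDP multiplies energy by delay, a chip that lowers energy per operation while also shaving latency compounds its advantage, which is why EDP is often preferred over energy alone when comparing accelerators.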

Researchers ran various AI tasks on the chip. It achieved 99% accuracy on a handwritten digit recognition task; 85.7% on an image classification task; and 84.7% on a Google speech command recognition task. In addition, the chip also achieved a 70% reduction in image-reconstruction error on an image-recovery task. These results are comparable to existing digital chips that perform computation under the same bit-precision, but with drastic savings in energy. 

Researchers point out that one key contribution of the paper is that all the results featured were obtained directly on the hardware. In many previous works on compute-in-memory chips, AI benchmark results were often obtained partially through software simulation.

Next steps include improving architectures and circuits and scaling the design to more advanced technology nodes. Researchers also plan to tackle other applications, such as spiking neural networks.

“We can do better at the device level, improve circuit design to implement additional features and address diverse applications with our dynamic NeuRRAM platform,” said Rajkumar Kubendran, an assistant professor at the University of Pittsburgh, who started work on the project while a Ph.D. student in Cauwenberghs’ research group at UC San Diego.

In addition, Wan is a founding member of a startup that works on productizing compute-in-memory technology. “As a researcher and an engineer, my ambition is to bring research innovations from labs into practical use,” Wan said.

New architecture 

The key to NeuRRAM’s energy efficiency is an innovative method to sense output in memory. Conventional approaches use voltage as input and measure current as the result. But this leads to the need for more complex and more power hungry circuits. In NeuRRAM, the team engineered a neuron circuit that senses voltage and performs analog-to-digital conversion in an energy efficient manner. This voltage-mode sensing can activate all the rows and all the columns of an RRAM array in a single computing cycle, allowing higher parallelism. 

In the NeuRRAM architecture, CMOS neuron circuits are physically interleaved with the RRAM weights. This differs from conventional designs, where CMOS circuits typically sit on the periphery of the RRAM array. The neuron’s connections with the RRAM array can be configured to serve as either input or output of the neuron. This allows neural network inference in various data-flow directions without incurring overheads in area or power consumption, which in turn makes the architecture easier to reconfigure.

To make sure that accuracy of the AI computations can be preserved across various neural network architectures, researchers developed a set of hardware algorithm co-optimization techniques. The techniques were verified on various neural networks including convolutional neural networks, long short-term memory, and restricted Boltzmann machines. 

As a neuromorphic AI chip, NeuRRAM performs parallel distributed processing across 48 neurosynaptic cores. To simultaneously achieve high versatility and high efficiency, NeuRRAM supports data-parallelism by mapping a layer of the neural network model onto multiple cores for parallel inference on multiple data. It also offers model-parallelism by mapping different layers of a model onto different cores and performing inference in a pipelined fashion.
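The two mapping strategies can be sketched in software terms; the layer names and core indices below are hypothetical, since the chip's actual mapping toolchain is not described in the release:

```python
# Sketch of data- vs model-parallel mapping of a network onto cores.
# Layer names and core IDs are invented for illustration only.

def data_parallel_map(layer, batch, cores):
    """Replicate one layer across several cores; each core handles a slice of the batch."""
    chunk = len(batch) // len(cores)
    return {core: (layer, batch[i * chunk:(i + 1) * chunk])
            for i, core in enumerate(cores)}

def model_parallel_map(layers, cores):
    """Assign successive layers to successive cores for pipelined inference."""
    return {core: layer for core, layer in zip(cores, layers)}

# One layer, a batch of 8 inputs, spread over 4 cores
data_plan = data_parallel_map("conv1", list(range(8)), cores=[0, 1, 2, 3])
# Three layers pipelined across three cores
pipeline_plan = model_parallel_map(["conv1", "conv2", "fc"], cores=[4, 5, 6])
```

Data-parallelism boosts throughput on large batches of the same layer, while model-parallelism keeps every core busy on a different pipeline stage; NeuRRAM's 48 cores allow both at once.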

An international research team

The work is the result of an international team of researchers. 

The UC San Diego team designed the CMOS circuits that implement the neural functions interfacing with the RRAM arrays to support the synaptic functions in the chip’s architecture, for high efficiency and versatility. Wan, working closely with the entire team, implemented the design; characterized the chip; trained the AI models; and executed the experiments. Wan also developed a software toolchain that maps AI applications onto the chip. 

The RRAM synapse array and its operating conditions were extensively characterized and optimized at Stanford University. 

The RRAM array was fabricated and integrated onto CMOS at Tsinghua University. 

The Team at Notre Dame contributed to both the design and architecture of the chip and the subsequent machine learning model design and training.

The research started as part of the National Science Foundation funded Expeditions in Computing project on Visual Cortex on Silicon at Penn State University, with continued funding support from the Office of Naval Research Science of AI program, the Semiconductor Research Corporation and DARPA JUMP program, and Western Digital Corporation. 

A compute-in-memory chip based on resistive random-access memory

Weier Wan, Rajkumar Kubendran, Stephen Deiss, Siddharth Joshi, Gert Cauwenberghs, University of California San Diego

Weier Wan, S. Burc Eryilmaz, Priyanka Raina, H-S Philip Wong, Stanford University

Clemens Schaefer, Siddharth Joshi, University of Notre Dame

Rajkumar Kubendran, University of Pittsburgh 

Wenqiang Zhang, Dabin Wu, He Qian, Bin Gao, Huaqiang Wu, Tsinghua University

Corresponding authors: Wan, Gao, Joshi, Wu, Wong and Cauwenberghs

CAPTION

A close up of the NeuRRAM chip

CREDIT

David Baillot/University of California San Diego

  

CAPTION

The NeuRRAM chip uses an innovative architecture that has been co-optimized across the stack.

CREDIT

David Baillot/University of California San Diego

How environmental changes affect the shapes of RNA in living cells

Peer-Reviewed Publication

JOHN INNES CENTRE

RNA diverse structure 

IMAGE: NEW TECHNOLOGY WHICH IS CAPABLE OF PROFILING RNA STRUCTURE AT THE RESOLUTION OF A SINGLE MOLECULE IN LIVE CELLS.

CREDIT: JOHN INNES CENTRE

The impact of environmental conditions on the dynamic structures of RNAs in living cells has been revealed by innovative technology developed by researchers at the John Innes Centre.

The research, the result of a collaboration between the groups of Professor Dame Caroline Dean FRS and Dr Yiliang Ding, increases our understanding of what happens at the cellular level in response to environmental signals. This raises the possibility of using this knowledge to fine-tune crops or develop RNA-based therapies for diseases such as COVID-19 (SARS-CoV-2).

Previous research by these groups showed that two important genetic elements, COOLAIR and FLC, interact to regulate plant molecular responses to warm and cold.

But it was unclear how the RNA structure of COOLAIR contributes to the regulation of FLC - a genetic brake on flowering in plants.

Researchers in the Ding group developed a new technology which is capable of profiling RNA structure at the resolution of a single molecule in live cells.

Using this technique allowed them to observe RNA structural changes. In warm conditions COOLAIR RNA adopts three predominant structures and these shapes and proportions changed after the plants were exposed to cold temperatures.

They noticed that changes in RNA conformations in one hyper-variable region of COOLAIR changed FLC expression.

By introducing mutations into the sequence of this RNA region, the researchers were able to alter the flowering time of the plants.

Dr Ding said: “Our work has shown that RNAs can adopt different conformations, or structures. These diverse conformations dynamically change in response to external conditions. In this study by tuning the RNA structure, we altered the flowering time of the plant.”

The understanding of how RNA structure affects RNA function and the ability to engineer plant genomes at the RNA cellular level, increases the possibility of designing crop types with more desirable agronomic and nutritional traits.

The group says that the technology can also be applied to human cells where RNA structures could serve as a guide for designing RNA-based therapies.

First author Dr Pan Zhu said: “Every RNA is likely to have its own RNA structure landscapes and conformational diversities. Our technology will allow us to explore the pervasive functional importance of RNA structures in the RNAs of interest such as SARS-CoV-2.”

The group will now look to share their new technology with RNA-based industrial or academic collaborators.

During the process of gene expression DNA is transcribed into RNA which is then used to make proteins. RNA is often referred to as the “skinny molecule” because it is single stranded, but recent work has highlighted its structural diversity and how these structures affect gene regulation and protein synthesis.

In plants FLC acts as a brake on flowering, a key part of a molecular mechanism which ensures the plant only flowers when it has reached a required level of cold exposure. COOLAIR is antisense to FLC, binding to it and blocking it from being transcribed after cold exposure. Knowledge of these mechanisms will be key to understanding the consequences of climate change.

The study, “In vivo single-molecule analysis reveals COOLAIR RNA structural diversity”, appears in Nature: https://www.nature.com/articles/s41586-022-05135-9

THIRD WORLD U$A

Rural cancer patients undergo less radiation treatment and experience poorer outcomes

Peer-Reviewed Publication

VIRGINIA COMMONWEALTH UNIVERSITY

Recently published research out of VCU Massey Cancer Center demonstrated that, among patients receiving radiation treatment, those who live in rural areas experience higher mortality rates than nonrural cancer patients. Patients living in rural areas were also more likely to miss their radiation appointments, especially at the beginning of their treatment.

Unfortunately, patients who live in rural areas have a higher mortality rate than those living in nonrural areas. Given that nonadherence to cancer treatment is associated with higher rates of cancer recurrence and poorer survival, a team of cancer researchers led by Bonny Morris, Ph.D., RN, set out to see whether geographic differences in nonadherence could be contributing to the higher cancer mortality rate observed among rural residents. Nonadherence is classified as missing a treatment appointment. The researchers examined medical records data from 3,000 cancer patients to assess how nonadherence to radiation treatment and overall survival were affected by characteristics such as where a person lives, their age, race and other social and clinical factors.

The study found that rural residence was associated with both radiation treatment nonadherence and poorer overall survival. Rural patients with a treatment delay had the lowest overall survival, compared to both nonrural patients and rural patients without a delay.

“Not receiving the prescribed radiation treatment plan impacted patients’ chance of survival, especially for patients living in rural areas who miss treatments at the beginning of their treatment course,” Morris said. “The two-year survival rate was 76% for nonrural residents who did not experience a treatment delay versus 27% for rural residents who experienced a treatment delay. The health care system can play an important role, and I would say has an ethical responsibility, to reduce, and better yet, eliminate these disparities through targeted support programs like patient navigation.”

Study co-author Bernard Fuemmeler, Ph.D., M.P.H., associate director for population sciences and Gordon D. Ginder, M.D., Chair in Cancer Research at VCU Massey Cancer Center, said the findings reinforce the cancer center’s mission to promote health equity by understanding the gaps in cancer care and care delivery to better serve the community. Massey’s catchment area consists of 66 counties and cities, spanning a variety of rural and nonrural communities in the eastern, central and southern portions of Virginia.

“Our data stress the importance for cancer centers across the country with similar catchment areas to identify and attend to the needs of the rural populations they serve,” Fuemmeler said.

Along with rural residents, the research showed that widowed patients, advanced stage cancer patients and lung cancer patients were more likely to be nonadherent to radiation therapy. The study authors suggest that radiation treatment programs should consider implementing nonadherence risk assessments to understand access barriers for each patient based on a multitude of factors, including geographic residence, marital and employment status, cancer type, disease stage and treatment delay. Additionally, they suggest that services such as patient navigation and referrals to social work or financial counselors should be routinely offered to support treatment adherence, particularly for rural patients who are delayed in starting radiation.

Morris reports that her background as someone who grew up in a rural area as well as her clinical background as an oncology nurse helped her to design this study. While a doctoral student at the VCU School of Medicine, she was awarded an F99/K00 fellow award through the National Cancer Institute (F99CA245799), a predoctoral-postdoctoral transition award that funded the present study. She is now a postdoctoral fellow at Wake Forest University School of Medicine and in the K00 phase of this award, which will allow her to further her research on how to improve adherence among rural patient populations.

Collaborators on this study include Emma Fields, M.D., and Roy Sabo, Ph.D., members of the Developmental Therapeutics research program at Massey; and Ryan Hughes, M.D., radiation oncologist, and Kathryn Weaver, Ph.D., in the Department of Social Sciences and Health Policy at Wake Forest University School of Medicine. This study was also supported by the VCU Massey Cancer Center Biostatistical Shared Resource, and, in part, with funding from NIH-NCI Cancer Center Support Grant (P30 CA016059).

FOR PROFIT HEALTHCARE U$A

Current insurer calculation of qualified payment amount for out-of-network (OON) care may violate No Surprises Act


Skewing data when calculating key rate for OON care weakens community anesthesiology, emergency and radiology practices, and can be used to narrow provider networks and reduce patient access to care

Reports and Proceedings

AMERICAN SOCIETY OF ANESTHESIOLOGISTS

CHICAGO – In possible violation of the No Surprises Act, health insurance company calculations of Qualified Payment Amounts (QPA) for anesthesiology, emergency medicine and radiology services (and possibly other specialty services) likely include rates from primary care provider (PCP) contracts. A new study conducted by Avalere Health and commissioned by three national physician organizations examined a subpopulation of PCPs and determined that contracting practices may directly impact the QPA.   

Despite the law’s directive that QPA calculation be based on payment data from the “same or similar specialty” in the same geographic region, insurers may be calculating median in-network rates for specialty services using PCP contracted rates for services that were never negotiated, may never be provided by those physicians and may never be paid. This method may violate the law and produce insurer-calculated QPAs that do not represent typical payments for these services.

Insurers have already attempted to use the new law to raise profits by artificially lowering the QPA. Further, they have cancelled contracts of providers who declined large reimbursement cuts. Using such erroneous data in QPA calculation may further narrow medical networks, deny patients their choice of providers and could delay diagnosis and treatment of illness and injury.

“This new research raises significant questions about the accuracy of insurer calculated QPAs,” said American Society of Anesthesiologists (ASA) President Randall M. Clark, M.D., FASA. “We have received reports of extremely low QPAs that bear absolutely no resemblance to actual in-network rates in the geographic area; yet these same rates are being used by insurers as their initial payment.”

In the study, 75 primary care practice employees who have a role in contracting with insurers were surveyed regarding whether they contract with insurers for services they rarely or never provide, as well as negotiation practices related to these services. The survey found that 68% of respondents had services that they rarely provide (fewer than twice a year) included in their contracts, and 57% of respondents had services that they never provide included in their contract.

Congress debated including a benchmark for determining payment rates for out-of-network providers in the No Surprises Act but decided not to dictate a specific rate. Such a rate, Congress argued, could not adequately reflect the different specialties and regions care is provided in across the country. Instead, the law directed insurance companies to calculate their own QPAs and directed independent arbiters to take these QPAs into consideration when settling payment disputes. The QPA is intended to reflect a health insurer’s median contracted rate to providers based on the region in which they practice and for providers in the “same or similar specialty.”

Unfortunately, insurance companies may be incorporating contracted rates for services like those included in the PCP contracts that will never be provided by those physicians and for which insurers will never pay. Additionally, the inclusion of rates not actively negotiated by such providers will likely artificially lower the QPA, effectively reducing payments for the anesthesiologists, emergency physicians and radiologists, who actually do provide those services.
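The dilution effect described here follows directly from how a median behaves. A toy calculation makes the mechanism concrete (all rates below are invented for illustration and are not drawn from the Avalere study or any real contract):

```python
from statistics import median

# Hypothetical contracted rates (in dollars) for one anesthesiology
# service code in a single geographic region, actively negotiated by
# specialists who actually bill the service.
specialty_rates = [1200, 1350, 1400, 1500, 1650]

# Hypothetical low rates for the same code appearing in PCP contracts,
# never negotiated and never billed by those physicians.
pcp_rates = [300, 350, 400, 450, 500]

# QPA computed only from the "same or similar specialty", as the
# No Surprises Act directs.
qpa_specialty_only = median(specialty_rates)   # 1400

# QPA computed with the never-billed PCP rates mixed in: the median
# drops sharply even though no actual payment for the service changed.
qpa_with_pcp = median(specialty_rates + pcp_rates)  # 850.0

print(qpa_specialty_only, qpa_with_pcp)
```

In this sketch, including the never-negotiated PCP rates cuts the calculated QPA roughly in half, which is the distortion the physician groups argue violates the “same or similar specialty” requirement.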

“On behalf of all our members, the American College of Emergency Physicians (ACEP), the American College of Radiology® (ACR®) and ASA call on policymakers to eliminate the QPA as the sole/primary factor in arbitration and ensure the integrity of the QPA, by insisting they be calculated based on “same or similar specialty” in-network rates, thus mitigating the unintended consequences of relying on health insurers’ median in-network rates based partially on data from providers who don’t actively negotiate those rates,” said Gillian Schmitz, M.D., FACEP, president of ACEP. “Physicians rely on fair reimbursement to keep their doors open and continue providing lifesaving medical care to their patients.”

“This study demonstrates yet another reason why policymakers cannot make the QPA the primary factor in such arbitration nor base important regulations solely on insurer-produced and defined data,” said Jacqueline Bello, M.D., FACR, Chair of the American College of Radiology Board of Chancellors. “Decisionmakers should work with healthcare providers and insurers to consider relevant and verifiable data, ensure that patients have adequate provider networks, and protect access to care by ensuring sensible, sustainable reimbursement.”     

About the American Society of Anesthesiologists
Founded in 1905, the American Society of Anesthesiologists (ASA) is an educational, research and scientific society with more than 55,000 members organized to raise and maintain the standards of the medical practice of anesthesiology. ASA is committed to ensuring physician anesthesiologists evaluate and supervise the medical care of patients before, during and after surgery to provide the highest quality and safest care every patient deserves.

For more information on the field of anesthesiology, visit the American Society of Anesthesiologists online at asahq.org. To learn more about the role physician anesthesiologists play in ensuring patient safety, visit asahq.org/madeforthismoment. Like ASA on Facebook and follow ASALifeline on Twitter.

About the American College of Emergency Physicians
The American College of Emergency Physicians (ACEP) is the national medical society representing emergency medicine. Through continuing education, research, public education, and advocacy, ACEP advances emergency care on behalf of its 40,000 emergency physician members, and the more than 150 million Americans they treat on an annual basis. For more information, visit http://www.acep.org and www.emergencyphysicians.org.

About the American College of Radiology
The American College of Radiology (ACR), founded in 1924, is a professional medical society dedicated to serving patients and society by empowering radiology professionals to advance the practice, science and professions of radiological care.

 

#          #          #

MY BREAKFAST FOR THIRTY YEARS

Coffee and cigarettes: Research sheds new light on nicotine and morning brew

Peer-Reviewed Publication

UNIVERSITY OF FLORIDA

For some smokers, the first cigarette of the day is just not as satisfying without a cup of coffee. That could be more than just a morning habit: Chemical compounds in roasted coffee beans may help lighten the effects of morning nicotine cravings, University of Florida researchers have found.

In a cell-based study, the researchers identified two compounds in coffee that directly affect certain high-sensitivity nicotine receptors in the brain. In smokers, these brain receptors can be hypersensitive after a night of nicotine withdrawal.

The recently published findings have yet to be tested in humans but are an important step toward better understanding how coffee and cigarettes affect nicotine receptors in the brain, said Roger L. Papke, Ph.D., a pharmacology professor in the UF College of Medicine. Caffeine is coffee’s feel-good ingredient for most people, but smokers may get another kind of boost.

“Many people like caffeine in the morning but there are other molecules in coffee that may explain why cigarette smokers want their coffee,” Papke said.

The researchers applied a dark-roasted coffee solution to cells that express a particular human nicotine receptor. An organic chemical compound in coffee may help restore the nicotine receptor dysfunction that leads to nicotine cravings in smokers, the researchers concluded.

The findings have led Papke to a broader hypothesis: One of the compounds in brewed coffee, known as n-MP, may help to quell morning nicotine cravings.

Papke said he was intrigued by the idea that nicotine-dependent smokers associate tobacco use with coffee in the morning and alcohol in the evening. While alcohol’s effect on nicotine receptors in the brain has been thoroughly researched, the receptors’ interaction with coffee has been studied less.

“Many people look for coffee in the morning because of the caffeine. But was the coffee doing anything else to smokers? We wanted to know if there were other things in coffee that were affecting the brain’s nicotine receptors,” Papke said.

The findings, he said, provide a good foundation for behavioral scientists who could further study nicotine withdrawal in animal models. 

Funding for the research was provided by the National Institutes of Health.