Monday, August 29, 2022

Research reveals widespread use of ineffective COVID-19 treatments after FDA deauthorized their use

Products unlikely to provide benefit given to over 150,000 patients

Peer-Reviewed Publication

BETH ISRAEL DEACONESS MEDICAL CENTER

BOSTON – Monoclonal antibodies are laboratory-designed treatments tailor-made to fight specific infections. In early 2021, the U.S. Food & Drug Administration issued emergency use authorization for two monoclonal antibodies (bamlanivimab/etesevimab and casirivimab/imdevimab) for the treatment of mild to moderate COVID-19 in high-risk, non-hospitalized patients. However, these treatments were shown not to work against the Omicron variant of COVID-19, which emerged in the United States in December 2021 and was responsible for a record-breaking COVID-19 surge in the winter of 2021-22. As a result of the monoclonal antibodies’ reduced efficacy against the variant, the FDA deauthorized their use in early January 2022.                                                                     

In a paper published in JAMA Network Open, physician-scientists at Beth Israel Deaconess Medical Center (BIDMC) assessed the use of these two monoclonal antibodies for patients with COVID-19 before and after FDA deauthorization. The team observed that though overall use of the two monoclonal antibodies declined gradually following deauthorization, a large number of doses were administered to patients well into 2022. Altogether, over 158,000 doses of monoclonal antibodies were administered, providing little to no benefit to patients and potentially adding millions of dollars in costs. Whether the FDA will take regulatory action against those violating guidance remains unknown at this time. 

“Continued use of these treatments represents low-value care and may reflect conflicting state government guidance or a lack of hospital awareness of deauthorization,” said lead author Timothy Anderson, MD, MAS, Lead for Improving Value in Healthcare at the Center for Healthcare Delivery Science at BIDMC and assistant professor of medicine at Harvard Medical School. “Though the FDA clearly stated these treatments were no longer authorized for use, the FDA did not fully revoke their emergency use authorizations based on the possibility that they may work to treat future COVID-19 variants. This could have led to confusion and misinterpretation.” 

Anderson and colleagues examined mandatory public reporting by hospitals to the U.S. Department of Health and Human Services from October 2021 to June 2022. They observed that in early 2022, hospitals administered more than 158,000 doses of the deauthorized monoclonal antibody treatments bamlanivimab/etesevimab and casirivimab/imdevimab. The researchers also saw wide variability by state in the treatments’ use following deauthorization.  

While use of the ineffective medications steadily declined after deauthorization, the proportion of COVID-19 cases for which the deauthorized treatments were used did not peak until late March. Moreover, usage following deauthorization varied widely by state, with Florida and New York accounting for 24 percent and 20 percent of monoclonal antibody use in 2022, respectively. Eleven states administered more than half of their remaining supply after deauthorization, while 14 states used less than 10 percent of their remaining supply.  

“We believe these findings are quite surprising and indicate a need for the FDA to investigate the continued use of treatments found to not be effective for COVID-19,” said senior author Jennifer Stevens, MD, director of the Center for Healthcare Delivery Science at BIDMC and associate professor of medicine at Harvard Medical School. “Efforts to improve transparency, equity and value in the COVID-19 response should include public facility-level reporting for all COVID-19 therapies. We hope that our findings will lead to greater attention and more diligent regulation by health care providers and government agencies to prohibit the use of unauthorized treatments.”   

Co-authors included Ashley O’Donoghue, PhD, Oren Mechanic, MD, MPH, and Tenzin Dechen, MPH, of BIDMC. 

This study was funded by an unrestricted philanthropic gift from Google.org.   

Anderson reports receiving grants from the National Institute on Aging, American College of Cardiology and Boston Claude D. Pepper Older Americans Independence Center outside the submitted work, and honoraria from Alosa Health, a nonprofit educational organization with no relationship to any drug or device manufacturers.   

About Beth Israel Deaconess Medical Center  

Beth Israel Deaconess Medical Center is a patient care, teaching and research affiliate of Harvard Medical School and consistently ranks as a national leader among independent hospitals in National Institutes of Health funding. BIDMC is the official hospital of the Boston Red Sox.  

Beth Israel Deaconess Medical Center is a part of Beth Israel Lahey Health, a health care system that brings together academic medical centers and teaching hospitals, community and specialty hospitals, more than 4,800 physicians and 36,000 employees in a shared mission to expand access to great care and advance the science and practice of medicine through groundbreaking research and education.  

###  

New method of measuring economic inequality could improve policy outcomes

Policy-makers looking to address income inequality would be better off going beyond the most commonly used metric, the ‘Gini coefficient’

Peer-Reviewed Publication

UNIVERSITY OF EXETER

Social scientists have urged policy-makers and governments to rethink how income inequality is measured.

In a new study published in Nature Human Behaviour, researchers from the University of Exeter Business School, Harvard Business School, and the University of Bremen found that policy-makers looking to address income inequality would be better off becoming more systematic about how they measure inequality and going beyond the most commonly used metric, the ‘Gini coefficient’.

The Gini coefficient is the most widely-used metric for understanding inequality, used by governments and statistics bureaus around the world and commonly cited in news media and policy discussions.

But according to the authors, the metric is not the most effective at measuring certain aspects of income inequality because it condenses a lot of information into a single parameter.

As a result of this shortcoming, the measure is unable to distinguish where on the income spectrum inequality is most concentrated.

Co-author Professor Jon M. Jachimowicz of the Harvard Business School explained: “If you look at the Bronx in New York and neighboring Westchester County, both have high inequality as measured by the Gini coefficient, but the Bronx’s inequality is driven predominantly by a difference between low-income and medium-to-high earners, whereas inequality in Westchester is mostly driven by the super-rich.

“If we devised policy to address inequality based solely on the Gini, we would treat the Bronx and Westchester the same. But that may not be the right thing to do.”

The researchers analysed around 3,000 US county-level income distributions – covering over 97% of the US – using 17 different models for measuring income inequality.

They found that a metric comprising two separate variables, called the “Ortega parameters,” outperformed the single-parameter Gini coefficient model in this dataset.

The researchers said this is because each Ortega parameter focuses on a different aspect of the income distribution: the first captures the distribution of income between low-income earners and medium-to-high income earners, while the second captures the extent to which super-high earners pull away from the rest.
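
For readers who want a concrete feel for the two approaches, here is a minimal Python sketch. It assumes the commonly cited two-parameter Ortega form of the Lorenz curve, L(p) = p^alpha * (1 - (1 - p)^beta); the paper's exact parameterization and fitting procedure may differ, and the synthetic incomes below are purely illustrative.

```python
# Hypothetical sketch: compute the single-number Gini coefficient and fit a
# two-parameter Lorenz curve in the spirit of the Ortega family. The functional
# form and bounds below are assumptions, not the paper's exact model.
import numpy as np
from scipy.optimize import curve_fit

def gini(incomes):
    """Gini coefficient from individual incomes (0 = perfect equality)."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = x.size
    cum = np.cumsum(x)
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n

def ortega_lorenz(p, alpha, beta):
    """Two-parameter Lorenz curve: alpha tracks the bottom of the distribution,
    beta the pull of the very top (illustrative parameterization)."""
    return p**alpha * (1 - (1 - p)**beta)

def fit_ortega(incomes):
    """Fit the two Ortega-style parameters to the empirical Lorenz curve."""
    x = np.sort(np.asarray(incomes, dtype=float))
    p = np.arange(1, x.size + 1) / x.size   # cumulative population share
    L = np.cumsum(x) / x.sum()              # cumulative income share
    (alpha, beta), _ = curve_fit(ortega_lorenz, p, L, p0=[0.5, 0.5],
                                 bounds=([0.0, 1e-6], [np.inf, 1.0]))
    return alpha, beta

# Synthetic "county" incomes, purely for illustration.
rng = np.random.default_rng(0)
incomes = rng.lognormal(mean=10.5, sigma=0.8, size=5_000)
print("Gini:", round(gini(incomes), 3))
print("Ortega parameters (alpha, beta):", fit_ortega(incomes))
```

The point of the two fitted numbers is exactly the one made above: two counties can share a similar Gini value while differing sharply in whether inequality is concentrated at the bottom or at the very top.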

According to the authors, an advantage of a metric that accurately captures inequality in a given dataset is that it can reveal new insights about the relationship between inequality and policy outcomes.

For example, when measuring inequality using the Gini coefficient, the researchers found no correlation with obesity or educational outcomes.

But using the Ortega parameters, they found links between greater ‘bottom-concentrated inequality’ (the difference between low-income and medium-to-high income earners) and more obesity and a lower share of the population having a degree.

Areas with higher top-concentrated inequality (the difference between super-high earners and the rest) were in contrast associated with less obesity and a greater share of the population having degrees.

The paper could have far-reaching implications for economic research and policymakers alike.

Professor Oliver Hauser, Associate Professor of Economics at the University of Exeter Business School said: “One way to understand the diverging beliefs about inequality and preferences for redistribution may be to focus on what kind of inequality people are most dissatisfied with.”

“This becomes clearer when discussing potential measures taken to redress inequality. For example, reducing top-concentrated economic inequality could be achieved by raising top income taxes, while reducing bottom-concentrated inequality may involve raising the minimum wage.”

“Our approach and findings suggest that moving beyond the overall concentration of inequality as reflected in the Gini coefficient may be fruitful in both pinpointing how different kinds of inequality affect outcomes and how to make meaningful change to redress inequality.”

“Measuring Inequality Beyond the Gini Coefficient May Clarify Conflicting Findings,” co-authored by Ph.D. candidate Kristin Blesch, Professor Oliver Hauser, and Professor Jon M. Jachimowicz, is published in Nature Human Behaviour.

Ancient landslide destroyed area size of Cincinnati

By understanding dynamics of landslides, researchers hope to prevent future ones

Peer-Reviewed Publication

UNIVERSITY OF CINCINNATI

IMAGE: UC geologist Daniel Sturmer studied a massive landslide outside Las Vegas that scattered debris over an area the size of a small city more than 5 million years ago. CREDIT: Joseph Fuqua/UC

University of Cincinnati geologists reconstructed a massive landslide in Nevada that wiped out an area the size of a small city more than 5 million years ago.

UC College of Arts and Sciences graduate Nick Ferry and UC assistant professor of geology Daniel Sturmer pieced together details of the Blue Diamond landslide, a natural disaster that sent rocks and boulders tumbling more than 6 miles across what is now a desert outside Las Vegas.

The landslide in Red Rock Canyon National Conservation Area had such mass and force that it propelled fragmented rock 2 miles uphill, cresting the enormous Blue Diamond Hill, and flattened an area larger than downtown Cincinnati. Rubble from the landslide stretches over an area of more than 7 square miles.

“You can imagine this being pretty catastrophic in nature,” said Ferry, now a doctoral student at the University of Kansas.

The study was published in the Journal of Sedimentary Research.

The Blue Diamond catastrophe was a rock avalanche, an extremely rapid landslide that propelled fragmented rock and soil downhill at more than 5 meters per second.

Piecing together details of a landslide that occurred millions of years ago is a challenge, but geologists use a combination of field observations and laboratory analysis. By learning more about these natural disasters, researchers hope to find ways to predict future ones.

That’s becoming increasingly important as people build homes, schools and businesses in more precarious places, Sturmer said.

“Landslides are one of the major disasters in the world in areas where you have significant topography, which represents a growing percentage of where people live,” Sturmer said. “It’s critical to be able to predict these hazards and prevent them or at least be smart about construction when you’re expanding cities.

“It’s one of the critical things geoscientists are doing.”

Ferry said the landslide might have been triggered by heavy rain. This part of Nevada was much wetter millions of years ago than it is today.

“We believe one reason the landslide reached so far is because it was flowing over saturated substrate, which reduced the friction resistance,” Ferry said.

Determining the date of a landslide is a little tricky. Researchers can use radiocarbon dating for events that occurred less than 50,000 years ago. For older events, they can turn to cosmic ray exposure dating. Researchers can tell how long a rock has been exposed on the surface of the Earth by studying the isotopes of certain elements in the rocks that were affected by high-energy cosmic rays. It’s a tool also used to study the movement of glaciers.
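
As a rough illustration of that idea, the simplest exposure-age relationship assumes a constant production rate and no erosion or burial, so the nuclide concentration follows N(t) = (P/λ)(1 − e^(−λt)) and can be inverted for the age. The short sketch below uses that textbook relationship with placeholder numbers; it is not the dating model or any data from this study.

```python
# Rough illustration of cosmogenic exposure dating in its simplest form:
# constant production rate, no erosion or burial. The concentration follows
# N(t) = (P / lam) * (1 - exp(-lam * t)), which is inverted here for the age t.
# All numbers are placeholders, not data from this study.
import math

def exposure_age(N, P, half_life_yr):
    """Age in years from nuclide concentration N (atoms/g), surface production
    rate P (atoms/g/yr), and the nuclide's half-life (yr)."""
    lam = math.log(2) / half_life_yr          # decay constant (1/yr)
    return -math.log(1 - N * lam / P) / lam

# Example: a beryllium-10 concentration of 5e6 atoms/g at a production rate of
# 5 atoms/g/yr corresponds to an exposure age of roughly 1.4 million years.
print(f"{exposure_age(N=5e6, P=5.0, half_life_yr=1.39e6):.2e} years")
```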

UC researchers say the Blue Diamond landslide occurred sometime between 5 million and 23 million years ago, during the Miocene Epoch. This was a period of rapid evolutionary diversity. One Miocene bird, Argentavis, found in South America, weighed as much as 200 pounds and had a wingspan of 23 feet.

The landslide originated in the Wilson Cliffs, which today stand about 6,500 feet above sea level but were probably much higher at the time of the disaster. Material ended up 6 miles away atop Blue Diamond Hill, which today stands at 4,931 feet above sea level. Sturmer said it’s possible that this hill has moved over the millennia through plate tectonics.

Regardless, the landslide would have scarred the Earth in thunderous, dramatic fashion.

“Philosophically, the way I think about landslides is that they heal the surface of the Earth. They’re big agents of change,” Sturmer said. “Plate tectonics creates these mountains with enormous potential energy. And they’re worn down by erosion and from these landslide events that reduce that potential energy.”

The study was supported by the American Chemical Society’s Petroleum Research Fund and grants from the Association of Environmental and Engineering Geologists.

Landslides aren’t just a concern in mountainous regions. Cincinnati is extremely susceptible to landslides, Sturmer said.

Ohio’s Department of Transportation monitors and inspects more than 10,000 identified landslide-prone areas across the state. Each is rated for public risk based on its history and hazard to motorists, among other factors. The highest risk landslides are inspected each year.

“The hills in the Cincinnati area are the reason that we are so susceptible to landslides,” Sturmer said.

Ice Age glaciers helped create Cincinnati’s steep topography: erosion by streams and rivers carved steep hills and valleys. In addition, deposits left by the glaciers are typically loose and full of weak clays, Sturmer said. When these deposits get wet from heavy rain or melting snow, the hillsides can be prone to landslides, he said.

“The combination of steep-sided valleys, clay-filled, unconsolidated sediment and wet climate make the Cincinnati area one of the most landslide-prone metro areas in the country,” Sturmer said.

Geologists monitor hillsides that could threaten public safety or infrastructure and study the historical record to see if these steep hills have failed before, he said.

Sturmer said geologists look for telltale warning signs of landslide development such as crescent-shaped cracks on high slopes or the slight tilt of power poles, fence posts or trees. They also look for evidence of previous slides such as bumpy, irregular slopes or cracks in buildings or foundations and plumbing. The hillsides might have curved tree trunks from where the ground shifted or long debris scars stretching downhill, he said.

“You can see many of these features on hillsides driving around town,” he said.

Ferry said he was excited to do fieldwork as a UC geology student.

“Getting out to Nevada to see the massive scale of this landslide was amazing,” Ferry said. “As a geology student, you’re not limited to just studying geology locally. You can travel to places where you can do research that interests you.

“It was invaluable in making me a better geologist,” he said.


CAPTION: A landslide more than 5 million years ago originated at the Wilson Cliffs and covered an area of 7 square miles. CREDIT: Nick Ferry

CAPTION: UC graduate Nick Ferry examines the geology of the Blue Diamond Hill landslide. CREDIT: Daniel Sturmer

CAPTION: A view of the Wilson Cliffs from Blue Diamond Hill. A landslide here sent fractured rock hurtling across an area of 7 square miles. CREDIT: Nick Ferry


UW researchers bring first underwater messaging app to smartphones

Peer-Reviewed Publication

UNIVERSITY OF WASHINGTON

IMAGE: A team of University of Washington researchers developed AquaApp, the first mobile app for sending and receiving messages underwater that works on existing smartphones, with no special hardware required. The app overcomes a number of technical challenges related to the underwater environment to enable two-way communication and networking during activities like snorkeling and scuba diving. CREDIT: University of Washington

For millions of people who participate in activities such as snorkeling and scuba diving each year, hand signals are the only option for communicating safety and directional information underwater. While recreational divers may employ around 20 signals, professional divers’ vocabulary can exceed 200 signals on topics ranging from oxygen level, to the proximity of aquatic species, to the performance of cooperative tasks.

The visual nature of these hand signals limits their effectiveness at distance and in low visibility. Two-way text messaging is a potential alternative, but one that requires expensive custom hardware that is not widely available.

Researchers at the University of Washington show how to achieve underwater messaging on billions of existing smartphones and smartwatches using only software. The team developed AquaApp, the first mobile app for acoustic-based communication and networking underwater that can be used with existing devices such as smartphones and smartwatches.

The researchers presented their paper describing AquaApp Aug. 25 at SIGCOMM 2022.

"Smartphones rely on radio signals like WiFi and Bluetooth for wireless communication. Those don’t propagate well underwater, but acoustic signals do," said co-lead author Tuochao Chen, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering. "With AquaApp, we demonstrate underwater messaging using the speaker and microphone widely available on smartphones and watches. Other than downloading an app to their phone, the only thing people will need is a waterproof phone case rated for the depth of their dive."

The AquaApp interface enables users to select from a list of 240 pre-set messages that correspond to hand signals employed by professional divers, with the 20 most common signals prominently displayed for easy access. Users can also filter messages according to eight categories, including directional indicators, environmental factors and equipment status.

In building the app, the team had to overcome a variety of technical challenges that do not arise on dry land.

"The underwater scenario surfaces new problems compared to applications over the air," said co-lead author Justin Chan, a doctoral student in the Allen School. "For example, fluctuations in signal strength are aggravated due to reflections from the surface, floor and coastline. Motion caused by nearby humans, waves and objects can interfere with data transmission. Further, microphones and speakers have different characteristics across smartphone models. We had to adapt in real time to these and other factors to ensure AquaApp would work under real-world conditions."

Other challenges included addressing the tendency for devices to rapidly shift position and proximity in the current, and the various noise profiles the app might encounter due to the presence of vessels, animals and even low-flying aircraft.

The team created an algorithm that allows AquaApp to optimize, in real time, the bitrate and acoustic frequencies of each transmission based on certain parameters, including distance, noise and variations in frequency response across devices.

Here's how it works: When one user wants to send a message to another device, their app first sends a quick note, called a preamble, to the other device. AquaApp on the second device runs the algorithm to determine the best conditions to receive the preamble. Then it tells the first device to use those same conditions to send the actual message.
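
As a rough sketch of what such a preamble-based handshake could look like, the hypothetical Python below has the receiver score a few candidate acoustic settings from the preamble and reply with the one the sender should use. The field names, frequency bands, bitrates, and SNR threshold are illustrative assumptions, not the actual AquaApp protocol or its parameters.

```python
# Hypothetical sketch of a preamble-based negotiation like the one described
# above. Candidate bands, bitrates, and the SNR threshold are illustrative
# assumptions, not AquaApp's actual protocol or parameters.
from dataclasses import dataclass

# Candidate acoustic configurations the receiver can score (illustrative values).
CANDIDATES = [
    {"center_freq_hz": 1_000, "bitrate_bps": 100},
    {"center_freq_hz": 3_000, "bitrate_bps": 400},
    {"center_freq_hz": 8_000, "bitrate_bps": 1_000},
]

@dataclass
class PreambleReport:
    """What the receiver measures while listening to the sender's preamble."""
    snr_db_by_freq: dict  # measured signal-to-noise ratio per candidate band

def choose_config(report: PreambleReport) -> dict:
    """Pick the fastest candidate whose band still has acceptable SNR."""
    usable = [c for c in CANDIDATES
              if report.snr_db_by_freq.get(c["center_freq_hz"], -99.0) >= 10.0]
    if not usable:
        return CANDIDATES[0]  # fall back to the most robust (slowest) setting
    return max(usable, key=lambda c: c["bitrate_bps"])

# The receiver hears the preamble, measures conditions, and replies with the
# configuration the sender should use for the actual message.
report = PreambleReport(snr_db_by_freq={1_000: 22.0, 3_000: 14.5, 8_000: 6.0})
print(choose_config(report))  # -> {'center_freq_hz': 3000, 'bitrate_bps': 400}
```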

To support messaging between multiple devices, the researchers developed a networking protocol that shares access to the underwater network, akin to how WiFi networks referee internet traffic. AquaApp can accommodate up to 60 unique users on its local network at one time.

The team tested the real-world utility of the AquaApp system in six locations offering a variety of water conditions and activity levels, including under a bridge in calm water, at a popular waterfront park with strong currents, next to the fishing dock of a busy lake and in a bay with strong waves. The researchers evaluated the app’s performance at distances of up to 113 meters and depths of up to 12 meters.

"Based on our experiments, up to 30 meters is the ideal range for sending and receiving messages underwater, and 100 meters for transmitting SoS beacons," Chen said. "These capabilities should be sufficient for most recreational and professional scenarios."

The researchers also measured AquaApp’s impact on battery life by continuously running the system on two Samsung Galaxy S9 smartphones at maximum volume and with screens activated. The app reduced the devices’ battery power by just 32% over the course of four hours, which is within the maximum recommended dive time for recreational scuba diving.

"AquaApp brings underwater communication to the masses," said senior author Shyam Gollakota, a UW professor in the Allen School. "The state of underwater networking today is similar to ARPANET, the precursor of the internet, in the 1970s, where only a select few had access to the internet. AquaApp has the potential to change that status quo by democratizing underwater technology and making it as easy as downloading software on your smartphone."

The team’s data and open-source Android code are available on the AquaApp website.

The researchers are supported by the Moore Inventor Fellowship and the National Science Foundation.

###

For more information, contact underwatermessaging@cs.washington.edu.

Researchers develop equations to prevent the collapse of our globe’s most imperiled ecosystems

Scientists can now predict and compare tipping points so that resources can be directed where they are most urgently needed

Peer-Reviewed Publication

RENSSELAER POLYTECHNIC INSTITUTE

IMAGE: Jianxi Gao, Rensselaer Polytechnic Institute. CREDIT: Rensselaer Polytechnic Institute

TROY, N.Y. —  News headlines on extreme weather, melting ice caps, and threatened species are daily reminders of our changing environment. The profound scale and intensity of these challenges may leave one to wonder, “What should we do first?” Researchers recently developed formulas that help answer that question, effectively creating a method to triage declining ecosystems by measuring and comparing their distance to tipping points.

In research just published in Nature Ecology & Evolution, a team led by Jianxi Gao, assistant professor of computer science at Rensselaer Polytechnic Institute, developed equations that allow distances to tipping points to be compared across various mutualistic systems. In other words, for the first time, diverse environments can be analyzed to determine how close they are to becoming completely, and perhaps irrevocably, changed, and they can be compared with one another to determine which areas need intervention most urgently.

Previously, scientists could detect early warning signals that a system might be approaching its tipping point, but they could not assign an exact value to a system’s distance from that point: a value that would indicate how likely the system is to shift from the desired state to an undesired one, or how easily a tipping point could be reached.

Gao’s team developed a general dimension reduction approach to simplify the data in complex systems, allowing accurate measurements of distances to tipping points in diverse ecosystems. The team also developed a scaling factor that positions the resilience of different systems on the same scale so they may be compared.
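
To make the idea of a measurable "distance to a tipping point" concrete, here is a minimal sketch using a textbook one-dimensional model with a fold bifurcation (logistic growth with a saturating loss term), not the paper's actual reduced equations or its scaling factor. Once a system has been collapsed to a single effective variable, the margin between the current stress level and the level at which the desired state disappears can be estimated numerically.

```python
# Minimal sketch: estimating the distance to a fold-type tipping point in a
# reduced one-dimensional model. The dynamics are a textbook toy (logistic
# growth with a saturating loss term), not the paper's reduction equations.

def dxdt(x, c, K=10.0):
    """Effective one-dimensional dynamics: growth minus a stress-controlled loss."""
    return x * (1 - x / K) - c * x**2 / (1 + x**2)

def settled_state(c, x0=9.0, dt=0.01, steps=100_000):
    """Relax from a healthy initial state and return the state it settles into."""
    x = x0
    for _ in range(steps):
        x += dt * dxdt(x, c)
    return x

def critical_stress(c_lo=0.0, c_hi=5.0, tol=1e-2):
    """Bisect for the stress level at which the healthy (high) state collapses."""
    while c_hi - c_lo > tol:
        c_mid = 0.5 * (c_lo + c_hi)
        if settled_state(c_mid) > 1.0:   # still in the desired, high-biomass state
            c_lo = c_mid
        else:                            # collapsed to the degraded, low state
            c_hi = c_mid
    return 0.5 * (c_lo + c_hi)

c_now = 1.5                              # current stress on this toy ecosystem
c_crit = critical_stress()
print(f"tipping point near c = {c_crit:.2f}; remaining margin = {c_crit - c_now:.2f}")
```

A cross-system comparison of the kind the paper describes would additionally rescale such margins so that systems with very different dynamics can be ranked on a common axis.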

“With so many ecosystems struggling from the impacts of climate change, being able to convey how little time we have left to intervene before a tipping point is reached is critical,” said Curt Breneman, Dean of the Rensselaer School of Science. “Mobilization will not happen without a sense of urgency.”  

Gao’s team studied 54 different environments from all over the world and analyzed the many factors that control their resilience. Species loss, invasions, human activities, and environmental changes all cause “perturbations” in an ecosystem, but its likelihood of collapse is determined by the ecosystem’s structural properties. For example, if a few trees are cut down in a thick forest, the impact on the ecosystem will be minimal because new trees will grow and the system will recover to its previous state. However, in an area where trees are lacking, the loss of a few may have a more profound impact, and the system may transition to an undesired state from which it is difficult to recover. In mathematical terms, resilience is the distance to the boundary of the basin of attraction.

“For example, if one basin of attraction is the forest and the other is the savannah, the system may or may not transfer to savannah because of many factors,” Gao said. “The basin of attraction refers to the region of these factors inside high-dimensional space. Where is the region of forest where, if you cross the boundary, it changes to savannah? If a system remains within the boundary, it will always recover. Only when it crosses some value will it change into another state and be unable to recover.”

Gao’s team believes that the method can be used not only to determine the resilience of ecological systems, but also of biological, engineering, and social systems.

“The dimension reduction approach is very general and can be applied to different types of systems,” Gao said. “It’s universal.”

The team also measured the tipping point within a supply chain network.

On an optimistic note, the team’s research does not end with tipping points: they are also pursuing a restoration algorithm for when systems fail.

Gao was joined in the research by Huixin Zhang and Weidong Zhang of Shanghai Jiao Tong University, Qi “Ryan” Wang of Northeastern University, and Shlomo Havlin of Bar-Ilan University. Their work was supported by an NSF CAREER Award.

View a short video on the research here.

About Rensselaer Polytechnic Institute

Founded in 1824, Rensselaer Polytechnic Institute is America’s first technological research university. Rensselaer encompasses five schools, over 30 research centers, more than 140 academic programs including 25 new programs, and a dynamic community made up of over 6,800 students and 104,000 living alumni. Rensselaer faculty and alumni include upwards of 155 National Academy members, six members of the National Inventors Hall of Fame, six National Medal of Technology winners, five National Medal of Science winners, and a Nobel Prize winner in Physics. With nearly 200 years of experience advancing scientific and technological knowledge, Rensselaer remains focused on addressing global challenges with a spirit of ingenuity and collaboration. To learn more, please visit www.rpi.edu.

 

For general inquiries: newsmedia@rpi.edu

 

Visit the Rensselaer research and discovery blog: https://everydaymatters.rpi.edu/

 

Follow us on Twitter: @RPINews

 


Can we reverse the effects of age-related memory loss? Experts say yes

BU professor Robert Reinhart explains how a 20-minute noninvasive treatment regimen can improve both short-term and long-term memory

Peer-Reviewed Publication

BOSTON UNIVERSITY

IMAGE: A researcher administers the stimulation treatment via a cap. CREDIT: Courtesy of Robert Reinhart

According to the Alzheimer’s Association, an estimated 6.5 million Americans aged 65 and older are living with Alzheimer’s in 2022. That figure is predicted to nearly double by 2050.

Dr. Robert Reinhart, an assistant professor at Boston University’s College of Arts and Sciences and director of the Cognitive & Clinical Neuroscience Laboratory, recognizes the severity of this issue and is working to create treatments to help people with brain disorders.

In a recent study published in Nature Neuroscience, Reinhart and his team of researchers explain how their noninvasive treatment, delivered through electrodes in a wearable cap, can improve memory function in older adults and bring us one step closer to providing a more efficient treatment for memory loss.

Robert Reinhart explains his research, findings, and potential impact in the Q&A below.

What did your study find?

We delivered our noninvasive treatment to target memory function through scalp electrodes and found that electrical brain stimulation for 20 minutes on four consecutive days can improve working memory and long-term memory in individuals 65 years and older for at least one month.

Low-frequency stimulation improved working memory on days three and four and one month after the intervention; high-frequency stimulation, on the other hand, improved long-term memory on days two through four and one month after the intervention. Our findings demonstrate that the plasticity of the aging brain can be selectively and sustainably altered using these two treatments.

How does this paper expand on your previous study with electrostimulation treatment for memory loss?

In this new study, we used multiple, consecutive days of stimulation for 20 minutes to cause long-lasting memory improvements that lasted one month. Previously, the effects lasted only 50 minutes.

Another important difference is that in this recent study we developed two brain stimulation protocols — one for selectively improving short-term memory via low-frequency parietal stimulation, and another protocol for selectively improving long-term memory via high-frequency prefrontal stimulation.

What are the real-world implications of these findings?

An increasingly older population leads to additional personal, social, healthcare and economic costs. A factor greatly contributing to these costs is the impairment in basic memory systems essential for activities of everyday life, such as making financial decisions or understanding language.

Memory decline varies in severity across individuals during aging, with a rapid decline potentially predicting Alzheimer’s disease and other dementias.

Existing therapeutic approaches for impaired cognition are limited by mixed treatment outcomes, slow improvement, and accompanying risks and side effects. For those reasons, there’s an urgent need to develop innovative therapeutic interventions that can provide rapid and sustainable improvements with minimal side effects.

Clinically, this is important because there are people with only short-term memory problems and others with only long-term memory problems. So, having tools in hand that can address each of these memory systems is of great value.

What’s next?

Further research is needed to determine whether these effects can last beyond one month and whether these specific methods can also enhance memory function in individuals with impaired cognition due to brain disorders and in those at risk for dementia.