Friday, March 21, 2025

 

A less toxic way to manufacture daily goods




University of California - San Diego




Diisocyanates are used in the preparation of all polyurethanes, ranging from the foams used in shoe soles to the thermoplastics used in cell phone cases. Aromatic diisocyanates, which give polyurethane foams their structure, are commonly prepared on the megaton scale in highly secure facilities due to the use of phosgene, a highly reactive and toxic chemical reagent. Michael Burkart’s lab at UC San Diego recently reported the preparation of fully bio-based aromatic diisocyanates from a simple monosaccharide, D-galactose. This new route avoids the use of transition metals, gaseous reagents, or any high-pressure/high-temperature reactions. As an application, the team demonstrated the synthesis of a thermoplastic polyurethane (TPU) using these renewable diisocyanates, which shows material properties equivalent to those of petroleum-based TPUs. These materials can serve as drop-in replacements for existing polyurethanes, which can now be sourced from 100% bio-based materials. Next, the team is developing scale-up procedures to prepare them in kilogram quantities for prototype applications.

 

VR study reveals how pain and fear weaken sense of body ownership



A virtual reality experiment suggests that fear, pain, and expectations shaped by prior experiences can disrupt the mind’s grip on the body.





Hiroshima University

Full-body illusion experiment: Influence of the manipulation of top-down interpretation 


This illustration shows our experimental setup where participants viewed a virtual body through VR goggles and received touch on their back. The experiment revealed that different interpretations of the virtual body led to different outcomes: when interpreted as "my body," the illusion occurred, but when interpreted as "my body with abdominal pain," the illusion was inhibited.


Credit: Kazuki Yamamoto & Takashi Nakao, Graduate School of Humanities and Social Sciences, Hiroshima University




A study from Hiroshima University found that when people were told to imagine their virtual bodies in pain, their brains resisted the illusion of ownership. Their findings could provide insights into why some people may struggle with feeling connected to their own bodies, particularly in contexts involving depersonalization or negative physical states.

The sense of body ownership—the feeling that our body belongs to us—is crucial in distinguishing ourselves from objects and responding to threats. Researchers study it using techniques like the rubber hand illusion (RHI) and full-body illusion (FBI), in which synchronized visual and tactile stimulation leads an individual to feel ownership of a fake or virtual body. To explore how body ownership can be disrupted, the researchers tested whether top-down factors—prior knowledge, memories, and beliefs that shape how we perceive and interpret new stimuli—weaken the illusion when participants are asked to identify with a virtual body in a negative physical state.

Researchers published their results in Frontiers in Psychology in December 2024.

“Using the full-body illusion in virtual reality—where people begin to feel a virtual body as their own—we investigated how interpreting the virtual body as one's own body, while in a negative physical state, affects this illusion. This research can possibly relate to depersonalization, a condition where people struggle to feel their body as their own,” said Kazuki Yamamoto, researcher and author of the study.

Participants viewed a virtual body from behind through a virtual reality (VR) headset and were asked to imagine it as their own. They then watched the virtual body's back being stroked while their own back was stroked in synchrony, which successfully elicited the illusion. This synchronized visual-tactile stimulation is a bottom-up manipulation—one that starts from external stimuli and integrates the visual and tactile information—and is a well-tested way to induce the FBI.

To test the effect of top-down factors, participants followed the same procedure but were additionally instructed to interpret the virtual body as their own body in a negative physical state (feeling abdominal pain). After the participant watched the virtual body's back being stroked, a fear stimulus was presented: a knife driving into the virtual body's back. The fear response was measured via skin conductance, and the size of that response was taken as an index of how strongly the participant identified with the virtual body.
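For readers unfamiliar with skin-conductance measures: a response is conventionally quantified as the peak rise in conductance within a short window after stimulus onset. The sketch below is purely illustrative—the 1–5 s window, the baseline choice, and the toy data are common conventions and made-up values, not parameters from the study:

```python
def scr_amplitude(trace, fs, onset_s, window_s=(1.0, 5.0)):
    """Peak skin-conductance rise (microsiemens) after a stimulus.

    trace: conductance samples; fs: sampling rate (Hz);
    onset_s: stimulus onset time (s). The 1-5 s response window and
    onset-value baseline are common conventions, not taken from the paper.
    """
    base = trace[int(onset_s * fs)]            # conductance at stimulus onset
    lo = int((onset_s + window_s[0]) * fs)     # window start, in samples
    hi = int((onset_s + window_s[1]) * fs)     # window end, in samples
    return max(trace[lo:hi]) - base

# Toy trace at 10 Hz: flat 2.0 uS baseline, then a response peaking at 2.6 uS.
fs = 10
trace = [2.0] * 20 + [2.0 + 0.6 * min(i, 20 - i) / 10 for i in range(20)] + [2.0] * 20
print(round(scr_amplitude(trace, fs, onset_s=2.0), 2))  # 0.6
```

A larger amplitude on knife trials is then read as a stronger fear response, and hence stronger identification with the virtual body.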

A central point of this study is testing whether top-down factors—expectations or biases about what something should feel like, based on prior experience—can also influence the sense of body ownership.

Results indicated that the full-body illusion was inhibited when participants were asked to view the virtual body as their own while in abdominal pain, and that participants with stronger depersonalization tendencies showed a weaker FBI.

Researchers suggest this could be due to multiple factors, one of which is the top-down manipulation itself. Another possibility is that participants had difficulty perceiving the negative physical symptoms, and therefore had difficulty fully establishing the interpretation that “the virtual body is my body,” which is key for the illusion to occur.

Given these findings, more research is needed to fully understand why the illusion was inhibited.

“While we observed this inhibitory effect, further research is needed to determine whether it was specifically due to the negative interpretation or to differences between actual and virtual body states,” said Takashi Nakao, researcher and author of the study.

The foundation provided by this study and subsequent work can aid clinical interventions for people with disturbed body ownership, such as individuals with depersonalization-derealization disorder. Restoring a stable sense of body ownership could greatly improve their lives, both for safety and for everyday sensation and perception.

###

Kazuki Yamamoto and Takashi Nakao of the Graduate School of Humanities and Social Sciences at Hiroshima University contributed to this research.

The Japan Society for the Promotion of Science funded this research.

About Hiroshima University

Since its foundation in 1949, Hiroshima University has striven to become one of the most prominent and comprehensive universities in Japan for the promotion and development of scholarship and education. Consisting of 12 schools for undergraduate level and 5 graduate schools, ranging from natural sciences to humanities and social sciences, the university has grown into one of the most distinguished comprehensive research universities in Japan. English website: https://www.hiroshima-u.ac.jp/en

 

NIH researchers develop eye drops that slow vision loss in animals



Treatment shows potential to slow the progression of human degenerative eye diseases, including retinitis pigmentosa




NIH/National Eye Institute

PEDF and peptides 


Model of PEDF protein alongside the 17-mer and H105A peptides. Amino acid 105, which is changed from histidine in PEDF and the 17-mer peptide to alanine in the H105A peptide, is shown in green.


Credit: NIH/National Eye Institute




Researchers at the National Institutes of Health (NIH) have developed eye drops that extend vision in animal models of a group of inherited diseases that lead to progressive vision loss in humans, known as retinitis pigmentosa. The eye drops contain a small fragment derived from a protein made by the body and found in the eye, known as pigment epithelium-derived factor (PEDF). PEDF helps preserve cells in the eye’s retina. A report on the study is published in Communications Medicine.

“While not a cure, this study shows that PEDF-based eye drops can slow progression of a variety of degenerative retinal diseases in animals, including various types of retinitis pigmentosa and dry age-related macular degeneration (AMD),” said Patricia Becerra, Ph.D., chief of NIH’s Section on Protein Structure and Function at the National Eye Institute and senior author of the study. “Given these results, we’re excited to begin trials of these eye drops in people.”

All degenerative retinal diseases have cellular stress in common. While the source of the stress may vary—dozens of mutations and gene variants have been linked to retinitis pigmentosa, AMD, and other disorders—high levels of cellular stress cause retinal cells to gradually lose function and die. Progressive loss of photoreceptor cells leads to vision loss and eventually blindness.

Previous research from Becerra’s lab revealed that, in a mouse model, the natural protein PEDF can help retinal cells stave off the effects of cellular stress. However, the full PEDF protein is too large to pass through the outer eye tissues to reach the retina, and the complete protein has multiple functions in retinal tissue, making it impractical as a treatment. To optimize the molecule's ability to preserve retinal cells and to help the molecule reach the back of the eye, Becerra developed a series of short peptides derived from a region of PEDF that supports cell viability. These small peptides can move through eye tissues to bind with PEDF receptor proteins on the surface of the retina.

In this new study, led by first author Alexandra Bernardo-Colón, Becerra’s team created two eye drop formulations, each containing a short peptide. The first peptide candidate, called “17-mer,” contains 17 amino acids found in the active region of PEDF. A second peptide, H105A, is similar but binds more strongly to the PEDF receptor. Peptides applied to mice as drops on the eye’s surface were found in high concentration in the retina within 60 minutes, slowly decreasing over the next 24 to 48 hours. Neither peptide caused toxicity or other side effects.

When administered once daily to young mice with retinitis pigmentosa-like disease, H105A slowed photoreceptor degeneration and vision loss. To test the drops, the investigators used specially bred mice that lose their photoreceptors shortly after birth. Once cell loss begins, the majority of photoreceptors die in a week. When given peptide eye drops through that one-week period, mice retained up to 75% of photoreceptors and continued to have strong retinal responses to light, while those given a placebo had few remaining photoreceptors and little functional vision at the end of the week.

“For the first time, we show that eye drops containing these short peptides can pass into the eye and have a therapeutic effect on the retina,” said Bernardo-Colón. “Animals given the H105A peptide have dramatically healthier-looking retinas, with no negative side effects.”

A variety of gene-specific therapies are under development for many types of retinitis pigmentosa, which generally start in childhood and progress over many years. These PEDF-derived peptide eye drops could play a crucial role in preserving cells while waiting for these gene therapies to become clinically available.

To test whether photoreceptors preserved through the eye drop treatment are healthy enough for gene therapy to work, collaborators Valeria Marigo, Ph.D. and Andrea Bighinati, Ph.D., University of Modena, Italy, treated mice with gene therapy at the end of the week-long eye drop regimen. The gene therapy successfully preserved vision for at least an additional six months.  

To gauge whether the eye drops could work in humans without testing in humans directly, the researchers worked with Natalia Vergara, Ph.D., University of Colorado Anschutz, Aurora, to test the peptides in a human retinal tissue model of retinal degeneration. Grown in a dish from human cells, the retina-like tissues were exposed to chemicals that induced high levels of cellular stress. Without the peptides, the cells of the tissue model died quickly, but with the peptides, the retinal tissues remained viable. These human tissue data provide a key first step supporting human trials of the eye drops.

The research was funded by the NEI Intramural Research Program. Additional funding was provided by the Prevention of Blindness Society, Fondazione Telethon, HEAL-ITALIA Foundation, CellSight Development Fund, and Research to Prevent Blindness.

Reference: Bernardo-Colón A, Bighinati A, Parween S, Debnath S, Piano I, Adani E, Corsi F, Gargini C, Vergara N, Marigo V, and Becerra SP. “H105A peptide eye drops promote photoreceptor survival in murine and human models of retinal degeneration.” Mar 21, 2025, Comms Med. https://doi.org/10.1038/s43856-025-00789-8

###

NEI leads the federal government’s research on the visual system and eye diseases. NEI supports basic and clinical science programs to develop sight-saving treatments and address special needs of people with vision loss. For more information, visit https://www.nei.nih.gov.

About the National Institutes of Health (NIH): NIH, the nation’s medical research agency, includes 27 Institutes and Centers and is a component of the U.S. Department of Health and Human Services. NIH is the primary federal agency conducting and supporting basic, clinical, and translational medical research, and is investigating the causes, treatments, and cures for both common and rare diseases. For more information about NIH and its programs, visit https://www.nih.gov/.

NIH…Turning Discovery Into Health®


How black holes could nurture life



A new study co-led by Dartmouth researchers shows how radiation from black holes could have a nurturing effect on life.




Dartmouth College




At the center of most large galaxies, including our own Milky Way, sits a supermassive black hole. Interstellar gas periodically falls into the orbit of these bottomless pits, switching the black hole into active galactic nucleus (AGN)-mode, blasting high-energy radiation across the galaxy.

It's not an environment you'd expect a plant or animal to thrive in. But in a surprising new study in the Astrophysical Journal, researchers at Dartmouth and the University of Exeter show that AGN radiation can have a paradoxically nurturing effect on life. Rather than doom a species to oblivion, it can help ensure its success.

The study may be the first to concretely measure, via computer simulations, how an AGN's ultraviolet radiation can transform a planet's atmosphere to help or hinder life. Consistent with studies looking at the effects of solar radiation, the researchers found that the benefits—or harms—depend on how close the planet is to the source of the radiation, and whether life has already gained a toehold.

"Once life exists, and has oxygenated the atmosphere, the radiation becomes less devastating and possibly even a good thing," says the study's lead author, Kendall Sippy, who graduated from Dartmouth last year. "Once that bridge is crossed, the planet becomes more resilient to UV radiation and protected from potential extinction events."

The researchers simulated the effects of AGN radiation on not only Earth, but Earth-like planets of varying atmospheric composition. If oxygen was already present, they found, the radiation would set off chemical reactions causing the planet's protective ozone layer to grow. The more oxygenated the atmosphere, the greater the effect.

High-energy light reacts readily with oxygen, splitting the molecule into single atoms that recombine to form ozone. As ozone builds up in the upper atmosphere, it deflects more and more dangerous radiation back into space. Earth owes its favorable climate to a similar process that happened about 2 billion years ago with the first oxygen-producing microbes.
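The oxygen-to-ozone feedback described above can be caricatured in a few lines of code. This is a deliberately crude two-reaction sketch with arbitrary illustrative rate constants—not the PALEO model used in the study—showing only the qualitative point that more oxygen (or stronger radiation) yields more ozone:

```python
def ozone_buildup(o2, uv, steps=1000, dt=0.01):
    """Toy sketch of the photochemistry described in the text:
    UV splits O2 into O atoms, and O atoms attach to O2 to form ozone.
    o2 is held fixed; rate constants (0.1, 1.0) are made-up values,
    not measured photochemical rates.
    """
    o, o3 = 0.0, 0.0
    for _ in range(steps):
        photolysis = uv * o2 * 0.1   # O2 + hv -> 2 O
        recombine = o * o2 * 1.0     # O + O2 -> O3
        o += (2 * photolysis - recombine) * dt
        o3 += recombine * dt
    return o3

# The more oxygenated the atmosphere, the greater the ozone buildup.
print(ozone_buildup(o2=1.0, uv=1.0) > ozone_buildup(o2=0.2, uv=1.0))  # True
```

The real simulations track far more species and reactions, but the same directional feedback—oxygen plus radiation begets shielding ozone—drives the result.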

Radiation from the sun helped Earth's fledgling life oxygenate the atmosphere and add ozone to it. As the planet's protective ozone blanket thickened, life flourished, producing more oxygen and, in turn, more ozone. Under the Gaia hypothesis, such beneficial feedback loops allowed complex life to emerge.

"If life can quickly oxygenate a planet's atmosphere, ozone can help regulate the atmosphere to favor the conditions life needs to grow," says study co-author Jake Eager-Nash, a postdoctoral fellow at the University of Victoria. "Without climate-regulating feedback mechanisms, life may die out fast."

Earth, in real life, is not close enough to its resident black hole, Sagittarius A*, to feel its effects, even in AGN-mode. But the researchers wanted to see what could happen if Earth were much closer to a hypothetical AGN, and thus exposed to radiation billions of times greater.

Recreating Earth's oxygen-free atmosphere in the Archean, they found that the radiation would all but preclude life from developing. But as oxygen levels rose, nearing modern levels, Earth's ozone layer would grow and shield the ground below from dangerous radiation.

"With modern oxygen levels, this would take a few days, which would hopefully mean that life could survive," Eager-Nash says. "We were surprised by how quickly ozone levels would respond."

When they looked at what could happen on an Earth-like planet in an older galaxy with stars clustered closer to its AGN, they found a much different picture. In a "red nugget relic" galaxy like NGC 1277, the effects would be lethal. Stars in more massive elliptical galaxies, like Messier 87, or in our spiral Milky Way, are spread out more, and thus farther from an AGN's dangerous radiation.

The stars align aboard the Queen Mary 2

Sippy came to Dartmouth with a keen interest in black holes, and by the end of second term, had joined the lab of study co-author Ryan Hickox, professor and chair of physics and astronomy. Later, while debating a potential senior project on AGN radiation, fate intervened.

Heading to England for a sabbatical in 2023, Hickox booked a trip on the Queen Mary 2 so he could bring his dog, Benjamin. Aboard the ship, he got to chatting with an astrophysicist from Exeter, Nathan Mayne, who was a guest speaker on the ship. They quickly realized they had a mutual interest in radiation, and that the PALEO software Mayne had been using to model solar radiation on exoplanet atmospheres could be applied to the more powerful rays of an AGN.

The encounter would clear the way for Sippy to work with Eager-Nash, then a PhD student in Mayne's lab. Using the programming language Julia, they fed their model the initial concentrations of oxygen and other atmospheric gases on their Earth-like planet.

"It models every chemical reaction that could take place," says Sippy. "It returns plots of how much radiation is hitting the surface at different wavelengths, and the concentration of each gas in your model atmosphere, at different points in time."

The feedback loop they discovered in an oxygenated atmosphere was unexpected. "Our collaborators don't work on black hole radiation, so they were unfamiliar with the spectrum of a black hole and how much brighter an AGN could get than a star, depending on how close you are to it," says Hickox.

Without the kismet that brought the two labs together, the project might never have happened. "It's the kind of insight you can only really get by combining different sets of expertise," he adds.

After graduating from Dartmouth, Sippy left for Middlebury College, to work as a post-baccalaureate researcher in the lab of McKinley Brumback, who worked in Hickox's lab as a PhD student and is now an assistant professor of physics at Middlebury studying accreting neutron star X-ray binaries.

She brought a unique perspective to the project. In the X-ray binaries that she studies, a neutron star pulls matter from a normal star, causing in-falling material to heat up and emit X-rays.

While an AGN can take up to millions of years to flip between active and inactive states, X-ray binaries can change in mere days to months. "A lot of the same physics that applies to AGNs applies to X-ray binaries, but the time scales are much faster than for an AGN," she says.

Brumback contributed to the AGN analysis and served as a "slightly removed reader" to make sure the paper was accessible to non-experts, she says.

"Thanks to Kendall's excellent writing, it definitely was!"

 

How the brain responds to prices: Scientists discover neural marker for price perception



National Research University Higher School of Economics
Figure 1. Results of MEG experiment 


Evoked responses to the congruent and incongruent target words 'cheap' or 'expensive' in the price judgment task (A–C) and the semantic task (D). The graph shows responses to the target words following (A) relatively high prices, (B) relatively low prices, and (C) all price ranges combined.


Credit: Gorin A, Kuznetsova E, Kislov A, Levchenko E, Klucharev V, Moiseeva V, Yurchenko A, Luzhin A, Galkina N and Shestakova AN (2025) Neural correlates of the non-optimal price: an MEG/EEG study. Front. Hum. Neurosci.



Russian scientists have discovered how the brain makes purchasing decisions. Using electroencephalography (EEG) and magnetoencephalography (MEG), researchers found that the brain responds almost instantly when a product's price deviates from expectations. This response engages brain regions involved in evaluating rewards and learning from past decisions. Thus, perceiving a product's value is not merely a conscious choice but also a function of automatic cognitive mechanisms. The results have been published in Frontiers in Human Neuroscience.

Every day, people are faced with prices of food, technology, and services. Sometimes, a product seems overpriced, while other times, it appears suspiciously cheap. Do consumers consciously judge prices, or does the brain do it automatically? Researchers from HSE University and Neurotrend, a Russian neuromarketing company, set out to investigate how the brain responds to unexpected prices.

Participants in the experiment were shown images of mobile phones (iPhone, Nokia, Xiaomi), followed by their hypothetical prices. The prices could be above, below, or matching the actual market value of the products. Afterward, the target word 'expensive' or 'cheap' appeared on the screen, and participants had to determine whether the word matched the given price. Throughout the experiment, the researchers recorded participants' brain activity using EEG and MEG, methods that track changes in brain neuron activity. A total of 65 individuals participated in the study.

The findings reveal that perceiving prices significantly different from the actual market value triggered a strong N400 signal, an electrical impulse in the brain typically generated when confronted with unexpected information. Notably, prices perceived as excessively high triggered a stronger response than those seen as too low. The scientists explain this by suggesting that implausibly high discounts may be perceived with scepticism. Additionally, it appears that the brain's response can vary depending on the product's brand. For the Xiaomi mobile phone, the price range that triggered a strong N400 response was found to be broader. This may suggest that people did not have a clear enough understanding of the real market value of this product.
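For readers unfamiliar with evoked responses such as the N400: they are obtained by averaging many stimulus-locked EEG/MEG epochs and then comparing amplitudes in a time window of interest. The sketch below is illustrative only—the 300–500 ms window is the conventional N400 range and the data are synthetic, none of it taken from the study itself:

```python
def evoked_n400(epochs, fs, window_ms=(300, 500)):
    """Mean amplitude in the N400 window of the trial-averaged response.

    epochs: equal-length voltage traces time-locked to word onset;
    fs: sampling rate in Hz. The 300-500 ms window is the conventional
    N400 range, not a parameter reported by the authors.
    """
    n = len(epochs)
    avg = [sum(t[i] for t in epochs) / n for i in range(len(epochs[0]))]
    lo = int(window_ms[0] / 1000 * fs)
    hi = int(window_ms[1] / 1000 * fs)
    seg = avg[lo:hi]
    return sum(seg) / len(seg)

# Toy data at 100 Hz: incongruent trials carry an extra negativity ~300-500 ms.
fs = 100
congruent = [[0.0] * 60 for _ in range(10)]
incongruent = [[-2.0 if 30 <= i < 50 else 0.0 for i in range(60)] for _ in range(10)]
print(evoked_n400(incongruent, fs) < evoked_n400(congruent, fs))  # True
```

A more negative average in that window for incongruent price-word pairings is the signature the researchers report as the N400 effect.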

Andrew Kislov

'Back when I was in my bachelor's programme at HSE University, I wondered whether it was possible to determine from brain activity what price a person considers acceptable. Our experiments have confirmed that it is indeed possible,' comments Andrew Kislov, doctoral student at the HSE Faculty of Social Sciences and co-author of the study. 'Globally, we are working to develop an objective method for assessing customer preferences. To what extent, in doing so, do we have the right to invade a person's inner world? This is a good question, but in this project, we simply aimed to determine the maximum price that would be comfortable for people, and this method does not pose any real threat to customers.'

To identify which regions of the brain respond to prices, the researchers analysed MEG data. They found that when perceiving 'non-optimal' prices, the frontal cortex and anterior cingulate gyrus—regions responsible for decision-making and assessing rewards—were activated. 

Vasily Klucharev

'The results demonstrate that when the price does not meet expectations, the brain responds almost instantly. Moreover, the response is linked to brain regions involved in assessing rewards and learning from past decisions. This means that the perception of a product's value is part of automatic cognitive mechanisms that are activated long before an individual consciously makes a decision,' explains the chief researcher Vasily Klucharev, Head of the International Laboratory of Social Neurobiology.

The study also provides marketers with new tools: instead of relying solely on surveys, they can gain insight into consumers' perception of prices at the neurocognitive level.

Anna Shestakova

'Marketers are increasingly saying that conventional consumer surveys don't provide a complete picture, as people cannot always explain why a certain price seems too high or too low to them. People often say what they think is expected of them. Therefore, we conducted this study in collaboration with a leading neuromarketing company, Neurotrend, and discovered that it is possible to examine an individual's brain and determine whether a specific product price meets their expectations. This approach can help predict how people will perceive the price of a new product even before it is released to the market,' explains the chief researcher Anna Shestakova, Director of the HSE Institute for Cognitive Neuroscience.