Monday, April 29, 2024

 SPACE

 

Gemini South reveals origin of unexpected differences in giant binary stars


Astronomers confirm that differences in the chemical composition of binary stars can be traced back to the earliest stages of their formation


ASSOCIATION OF UNIVERSITIES FOR RESEARCH IN ASTRONOMY (AURA)

Artist’s Impression of a Giant-Giant Binary 

IMAGE: 

This artist’s impression illustrates a binary pair of giant stars. Despite being born from the same molecular cloud, astronomers often detect differences in binary stars’ chemical compositions and planetary systems. One star in this system is shown to host three small, rocky planets, while the other star hosts two gas giants. Using Gemini South’s GHOST, a team of astronomers have confirmed for the first time that these differences can be traced back to inhomogeneities in the primordial molecular cloud from which the stars were born.


CREDIT: NOIRLAB/NSF/AURA/J. DA SILVA (SPACEENGINE)/M. ZAMANI




It is estimated that up to 85% of stars exist in binary star systems, some even in systems with three or more stars. These stellar pairs are born together out of the same molecular cloud from a shared abundance of chemical building blocks, so astronomers would expect to find that they have nearly identical compositions and planetary systems. However, for many binaries that isn’t the case. While some proposed explanations attribute these dissimilarities to events occurring after the stars evolved, a team of astronomers have confirmed for the first time that they can actually originate from before the stars even began to form.

Led by Carlos Saffe of the Institute of Astronomical, Earth and Space Sciences (ICATE-CONICET) in Argentina, the team used the Gemini South telescope in Chile, one half of the International Gemini Observatory, supported in part by the U.S. National Science Foundation and operated by NSF NOIRLab. With the new, precise Gemini High Resolution Optical SpecTrograph (GHOST) the team studied the different wavelengths of light, or spectra, given off by a pair of giant stars, which revealed significant differences in their chemical make-up. “GHOST’s extremely high-quality spectra offered unprecedented resolution,” said Saffe, “allowing us to measure the stars’ stellar parameters and chemical abundances with the highest possible precision.” These measurements revealed that one star had higher abundances of heavy elements than the other. To disentangle the origin of this discrepancy, the team used a unique approach.

Previous studies have proposed three possible explanations for observed chemical differences between binary stars. Two of them involve processes that would occur well into the stars’ evolution: atomic diffusion, or the settling of chemical elements into gradient layers depending on each star’s temperature and surface gravity; and the engulfment of a small, rocky planet, which would introduce chemical variations in a star’s composition.

The third possible explanation looks back at the beginning of the stars’ formation, suggesting that the differences originate from primordial, or pre-existing, areas of nonuniformity within the molecular cloud. In simpler terms, if the molecular cloud has an uneven distribution of chemical elements, then stars born within that cloud will have different compositions depending on which elements were available at the location where each formed.

So far, studies have concluded that all three explanations are probable; however, these studies focused solely on main-sequence binaries. The ‘main-sequence’ is the stage where a star spends most of its existence, and the majority of stars in the Universe are main-sequence stars, including our Sun. Instead, Saffe and his team observed a binary consisting of two giant stars. These stars possess extremely deep and strongly turbulent external layers, or convective zones. Owing to the properties of these thick convective zones, the team was able to rule out two of the three possible explanations.

The continuous swirling of fluid within the convective zone would make it difficult for material to settle into layers, meaning giant stars are less sensitive to the effects of atomic diffusion — ruling out the first explanation. The thick external layer also means that a planetary engulfment would not change a star’s composition much since the ingested material would rapidly be diluted — ruling out the second explanation. This leaves primordial inhomogeneities within the molecular cloud as the confirmed explanation. “This is the first time astronomers have been able to confirm that differences between binary stars begin at the earliest stages of their formation,” said Saffe.

“Using the precision-measurement capabilities provided by the GHOST instrument, Gemini South is now collecting observations of stars at the end of their lives to reveal the environment in which they were born,” says Martin Still, NSF program director for the International Gemini Observatory. “This gives us the ability to explore how the conditions in which stars form can influence their entire existence over millions or billions of years.”

Three consequences of this study are of particular significance. First, these results offer an explanation for why astronomers see binary stars with such different planetary systems. “Different planetary systems could mean very different planets — rocky, Earth-like, ice giants, gas giants — that orbit their host stars at different distances and where the potential to support life might be very different,” said Saffe.

Second, these results pose a crucial challenge to the concept of chemical tagging — using chemical composition to identify stars that came from the same environment or stellar nursery — by showing that stars with different chemical compositions can still have the same origin.

Finally, observed differences previously attributed to planetary impacts on a star’s surface will need to be reviewed, as they may instead have been present from the very beginning of the star’s life.

“By showing for the first time that primordial differences really are present and responsible for differences between twin stars, we show that star and planet formation could be more complex than initially thought,” said Saffe. “The Universe loves diversity!”

More information

This research was presented in a paper accepted in Astronomy & Astrophysics Letters. DOI: 10.1051/0004-6361/202449263

The team is composed of C. Saffe (ICATE-CONICET/UNSJ, Argentina), P. Miquelarena (ICATE-CONICET/UNSJ, Argentina), J. Alacoria (ICATE-CONICET, Argentina), E. Martioli (LNA/MCTI, Brasil), M. Flores (ICATE-CONICET/UNSJ, Argentina), M. Jaque Arancibia (Universidad de La Serena, Chile), R. Angeloni (International Gemini Observatory/NSF NOIRLab, Chile), E. Jofré (OAC/CONICET, Argentina), J. Yana Galarza (Carnegie Institution for Science, CA), E. González (UNSJ, Argentina), and A. Collado (ICATE-CONICET/UNSJ, Argentina).

NSF NOIRLab (U.S. National Science Foundation National Optical-Infrared Astronomy Research Laboratory), the U.S. center for ground-based optical-infrared astronomy, operates the International Gemini Observatory (a facility of NSF, NRC–Canada, ANID–Chile, MCTIC–Brazil, MINCyT–Argentina, and KASI–Republic of Korea), Kitt Peak National Observatory (KPNO), Cerro Tololo Inter-American Observatory (CTIO), the Community Science and Data Center (CSDC), and Vera C. Rubin Observatory (operated in cooperation with the Department of Energy’s SLAC National Accelerator Laboratory). It is managed by the Association of Universities for Research in Astronomy (AURA) under a cooperative agreement with NSF and is headquartered in Tucson, Arizona. The astronomical community is honored to have the opportunity to conduct astronomical research on I’oligam Du’ag (Kitt Peak) in Arizona, on Maunakea in Hawai‘i, and on Cerro Tololo and Cerro Pachón in Chile. We recognize and acknowledge the very significant cultural role and reverence that these sites have to the Tohono O’odham Nation, to the Native Hawaiian community, and to the local communities in Chile, respectively.



Probing the effects of interplanetary space on asteroid Ryugu



Peer-Reviewed Publication

HOKKAIDO UNIVERSITY

Conceptual illustration of the study 

IMAGE: 

Conceptual illustration of the study


CREDIT: YUKI KIMURA



Samples reveal evidence of changes experienced by the surface of asteroid Ryugu, some probably due to micrometeoroid bombardment.

Analyzing samples retrieved from the asteroid Ryugu by the Japan Aerospace Exploration Agency’s (JAXA) Hayabusa2 spacecraft has revealed new insights into the magnetic and physical bombardment environment of interplanetary space. The results of the study, carried out by Professor Yuki Kimura at Hokkaido University and co-workers at 13 other institutions in Japan, are published in the journal Nature Communications.

The investigations used electron waves penetrating the samples to reveal details of their structure and magnetic and electric properties, a technique called electron holography.

Hayabusa2 reached asteroid Ryugu on 27 June 2018, collected samples during two delicate touchdowns, and then returned the jettisoned samples to Earth in December 2020. The spacecraft is now continuing its journey through space, with plans for it to observe two other asteroids in 2029 and 2031.

One advantage of collecting samples directly from an asteroid is that it allows researchers to examine long-term effects of its exposure to the environment of space. The ‘solar wind’ of high-energy particles from the Sun and bombardment by micrometeoroids cause changes known as space weathering. It is impossible to study these changes precisely using most of the meteorite samples that land naturally on Earth, partly due to their origin from the internal parts of an asteroid, and also due to the effects of their fiery descent through the atmosphere.

“The signatures of space weathering we have detected directly will give us a better understanding of some of the phenomena occurring in the Solar System,” says Kimura. He explains that the strength of the magnetic field in the early solar system decreased as planets formed, and measuring the remnant magnetization on asteroids can reveal information about the magnetic field in the very early stages of the solar system.

Kimura adds, “In future work, our results could also help to reveal the relative ages of surfaces on airless bodies and assist in the accurate interpretation of remote sensing data obtained from these bodies.”

One particularly interesting finding was that small mineral grains called framboids, composed of magnetite, a form of iron oxide, had completely lost their normal magnetic properties. The researchers suggest this was due to collisions with high-velocity micrometeoroids between 2 and 20 micrometers in diameter. The framboids were surrounded by thousands of metallic iron nanoparticles. Future studies of these nanoparticles will hopefully reveal insights into the magnetic field that the asteroid has experienced over long periods of time.

“Although our study is primarily for fundamental scientific interest and understanding, it could also help estimate the degree of degradation likely to be caused by space dust impacting robotic or manned spacecraft at high velocity,” Kimura concludes.


Magnetite (round particles) cut from a Ryugu sample

Iron nanoparticles distributed around pseudo-magnetite 



China Set To Launch Robotic Spacecraft To Moon's 'Hidden' Side

By Reuters 
Published on: April 29, 2024



The Chang'e 6 lunar probe and the Long March-5 Y8 carrier rocket combination sit atop the launch pad at the Wenchang Space Launch Site in Hainan province, China April 27, 2024. cnsphoto via REUTERS

China will send a robotic spacecraft in coming days on a round trip to the moon's far side in the first of three technically demanding missions that will pave the way for an inaugural Chinese crewed landing and a base on the lunar south pole.

Since the first Chang'e mission in 2007, named after the mythical Chinese moon goddess, China has made leaps forward in its lunar exploration, narrowing the technological chasm with the United States and Russia.

In 2020, China brought back samples from the moon's near side in the first sample retrieval in more than four decades, confirming for the first time it could safely return an uncrewed spacecraft to Earth from the lunar surface.

This week, China is expected to launch Chang'e-6 using the backup spacecraft from the 2020 mission, and collect soil and rocks from the side of the moon that permanently faces away from Earth.

With no direct line of sight with the Earth, Chang'e-6 must rely on a recently deployed relay satellite orbiting the moon during its 53-day mission, including a never-before attempted ascent from the moon's "hidden" side on its return journey home.

The same relay satellite will support the uncrewed Chang'e-7 and 8 missions in 2026 and 2028, respectively, when China starts to explore the south pole for water and build a rudimentary outpost with Russia. China aims to put its astronauts on the moon by 2030.

Beijing's polar plans have worried NASA, whose administrator, Bill Nelson, has repeatedly warned that China would claim any water resources as its own. Beijing says it remains committed to cooperation with all nations on building a "shared" future.

On Chang'e-6, China will carry payloads from France, Italy, Sweden and Pakistan, and on Chang'e-7, payloads from Russia, Switzerland and Thailand.

NASA is banned by U.S. law from any collaboration, direct or indirect, with China.


Under the separate NASA-led Artemis programme, U.S. astronauts will land near the south pole in 2026, the first humans on the moon since 1972.

"International cooperation is key (to lunar exploration)," Clive Neal, professor of planetary geology at the University of Notre Dame, told Reuters. "It's just that China and the U.S. aren't cooperating right now. I hope that will happen."


SOUTH POLE AMBITIONS

Chang'e 6 will attempt to land on the northeastern side of the vast South Pole-Aitken Basin, the oldest known impact crater in the solar system.

The southernmost landing ever was carried out in February by IM-1, a joint mission between NASA and the Texas-based private firm Intuitive Machines.

After touchdown at Malapert A, a site near the south pole that was believed to be relatively flat, the spacecraft tilted sharply to one side amid a host of technical problems, reflecting the high-risk nature of lunar landings.

The south pole has been described by scientists as the "golden belt" for lunar exploration.

Polar ice could sustain long-term research bases without relying on expensive resources transported from Earth. India's Chandrayaan-1 launched in 2008 confirmed the existence of ice inside polar craters.

Chang'e-6's sample return could also shed more light on the early evolution of the moon and the inner solar system.

The lack of volcanic activity on the moon's far side means there are more craters not covered by ancient lava flows, preserving materials from the moon's early formation.

So far, all lunar samples taken by the United States and the former Soviet Union in the 1970s and China in 2020 were from the moon's near side, where volcanism had been far more active.


Chang'e-6, after a successful landing, will collect about 2 kilogrammes (4.4 pounds) of samples with a mechanical scoop and a drill.

"If successful, China's Chang'e-6 mission would be a milestone-making event," Leonard David, author of "Moon Rush: The New Space Race", told Reuters. "The robotic reach to the Moon's far side, and bringing specimens back to Earth, helps fill in the blanks about the still-murky origin of our Moon."


UC Irvine astronomers’ simulations support dark matter theory


The tests addressed the elusive matter’s existence despite its never having been directly detected



UNIVERSITY OF CALIFORNIA - IRVINE





Irvine, Calif., April 29, 2024 — Computer simulations by astronomers support the idea that dark matter – matter that no one has yet directly detected but which many physicists think must be there to explain several aspects of the observable universe – exists, according to the researchers, who include those at the University of California, Irvine. 

 

The work addresses a fundamental debate in astrophysics – does invisible dark matter need to exist to explain how the universe works the way it does, or can physicists explain how things work based solely on the matter we can directly observe? Currently, many physicists think something like dark matter must exist to explain the motions of stars and galaxies. 

 

“Our paper shows how we can use real, observed relationships as a basis to test two different models to describe the universe,” said Francisco Mercado, lead author and recent Ph.D. graduate from the UC Irvine Department of Physics & Astronomy who is now a postdoctoral scholar at Pomona College. “We put forth a powerful test to discriminate between the two models.” 

 

The test involved running computer simulations with both types of matter – normal and dark – to explain the presence of intriguing features measured in real galaxies. The team reported their results in Monthly Notices of the Royal Astronomical Society.
 

The features in galaxies the team found “are expected to appear in a universe with dark matter but would be difficult to explain in a universe without it,” said Mercado. “We show that such features appear in observations of many real galaxies. If we take these data at face value, this reaffirms the position of the dark matter model as the one that best describes the universe we live in.”

 

These features, Mercado noted, describe patterns in the motions of stars and gas in galaxies that seem to be possible only in a universe with dark matter. 

 

“Observed galaxies seem to obey a tight relationship between the matter we see and the inferred dark matter we detect, so much so that some have suggested that what we call dark matter is really evidence that our theory of gravity is wrong,” said co-author James Bullock, professor of physics at UCI and dean of the UCI School of Physical Sciences. “What we showed is that not only does dark matter predict the relationship, but for many galaxies it can explain what we see more naturally than modified gravity. I come away even more convinced that dark matter is the right model.”

 

The features also appear in observations made by proponents of a dark matter-free universe. “The observations we examined – the very observations where we found these features – were conducted by adherents of dark matter-free theories,” said co-author Jorge Moreno, associate professor of physics and astronomy at Pomona College. “Despite their obvious presence, little-to-no analysis was performed on these features by that community. It took folks like us, scientists working with both regular and dark matter, to start the conversation.” 

 

Moreno added that he expects debate within his research community to follow in the wake of the study, but that there may be room for common ground, as the team also found that such features only appear in their simulations when there is both dark matter and normal matter in the universe. 

 

“As stars are born and die, they explode into supernovae, which can shape the centers of galaxies, naturally explaining the existence of these features,” said Moreno. “Simply put, the features we examined in observations require both the existence of dark matter and the incorporation of normal-matter physics.” 

 

Now that the dark matter model of the universe appears to be the leading one, the next step, Mercado explained, is to see whether the relationship remains consistent across different dark matter models.

 

“It would be interesting to see if we could use this same relationship to even distinguish between different dark matter models,” said Mercado. “Understanding how this relationship changes under distinct dark matter models could help us constrain the properties of dark matter itself.”

 

Funding came from a National Science Foundation MSP-Ascend Award AST-2316748 to Mercado. Mercado and Bullock were supported by NSF grant AST-1910965 and NASA grant 80NSSC22K0827. Moreno receives funding from the Hirsch Foundation. Collaborators include Michael Boylan-Kolchin (The University of Texas at Austin), Philip F. Hopkins (California Institute of Technology), Andrew Wetzel (University of California, Davis), Claude-André Faucher-Giguère (Northwestern University) and Jenna Samuel (The University of Texas at Austin).

 

About the University of California, Irvine: Founded in 1965, UC Irvine is a member of the prestigious Association of American Universities and is ranked among the nation’s top 10 public universities by U.S. News & World Report. The campus has produced five Nobel laureates and is known for its academic achievement, premier research, innovation and anteater mascot. Led by Chancellor Howard Gillman, UC Irvine has more than 36,000 students and offers 224 degree programs. It’s located in one of the world’s safest and most economically vibrant communities and is Orange County’s second-largest employer, contributing $7 billion annually to the local economy and $8 billion statewide. For more on UC Irvine, visit www.uci.edu.

 

Media access: Radio programs/stations may, for a fee, use an on-campus ISDN line to interview UC Irvine faculty and experts, subject to availability and university approval. For more UC Irvine news, visit news.uci.edu. Additional resources for journalists may be found at https://news.uci.edu/media-resources/.

 

Competition from “skinny label” generics saved Medicare billions


Savings were greatest for rosuvastatin, pregabalin, and imatinib


Peer-Reviewed Publication

AMERICAN COLLEGE OF PHYSICIANS


 


 

Embargoed for release until 5:00 p.m. ET on Monday 29 April 2024    
Annals of Internal Medicine Tip Sheet     

@Annalsofim    
Below please find summaries of new articles that will be published in the next issue of Annals of Internal Medicine. The summaries are not intended to substitute for the full articles as a source of information. This information is under strict embargo and by taking it into possession, media representatives are committing to the terms of the embargo not only on their own behalf, but also on behalf of the organization they represent.    
----------------------------    

1. Competition from “skinny label” generics saved Medicare billions

Savings were greatest for rosuvastatin, pregabalin, and imatinib

Abstract: https://www.acpjournals.org/doi/10.7326/M23-3212  

URL goes live when the embargo lifts     

An analysis of 15 name-brand drugs and their “skinny label” generic counterparts found that competition from these counterparts saved Medicare Part D nearly $15 billion from 2015 to 2021. Skinny labeling allows generic drug manufacturers to exclude labeling information that remains patent-protected by the brand name manufacturer. However, a recent federal appeals court ruling involving a skinny-label generic version of the beta-blocker carvedilol (Coreg; GlaxoSmithKline) has increased liability risk for manufacturers of skinny-label generics. The brief research report is published in Annals of Internal Medicine.

Researchers from Brigham and Women’s Hospital and Harvard Medical School conducted an analysis of 15 brand-name drugs with a first-to-market skinny-label generic from 2015 to 2019. The authors compared actual spending on each brand-name drug and its skinny-label generics with projected spending had the skinny-label generics not been introduced. They estimated that actual Medicare spending on these 15 drugs and their skinny-label generics was $16.8 billion, and projected spending without generic competition was $31.5 billion, saving Medicare approximately $14.6 billion from 2015-2021. They note that savings were the greatest for rosuvastatin (Crestor, AstraZeneca; $6.5 billion), pregabalin (Lyrica, Pfizer; $4.2 billion), and imatinib (Gleevec, Novartis; $3.1 billion). The authors caution that deterring the use of skinny labeling, as the recent federal appeals court case might do, could be costly for Medicare and other US payers. They suggest that Congress should reinforce the skinny-label pathway by creating a safe harbor that protects manufacturers engaged in skinny labeling from induced patent infringement lawsuits. 

Media contacts: For an embargoed PDF, please contact Angela Collom at acollom@acponline.org. To speak with the corresponding author, Benjamin N. Rome, MD, MPH, please contact BROME@BWH.HARVARD.EDU.

----------------------------    

2. Semaglutide suitable for people with HIV and fatty liver disease

Abstract: https://www.acpjournals.org/doi/10.7326/M23-3354    

URL goes live when the embargo lifts      

A study of persons with HIV (PWH) and metabolic dysfunction–associated steatotic liver disease (MASLD), also known as ‘fatty liver disease,’ found that semaglutide was highly effective at reducing liver fat and cardiovascular disease risk in this population. The brief research report was published in Annals of Internal Medicine.

Researchers conducted a study of 51 PWH with central adiposity, insulin resistance or prediabetes, and fatty liver disease who were observed over a period of 24 weeks and given semaglutide. The authors found that 29 percent of participants had complete resolution of MASLD, and 58 percent had a relative reduction in liver fat of at least 30 percent. They also report that all participants tolerated 1 mg weekly semaglutide. According to the authors, given the high cardiometabolic disease burden and growing obesity epidemic among PWH, semaglutide may reduce CVD risk while preventing progressive liver disease.

Media contacts: For an embargoed PDF, please contact Angela Collom at acollom@acponline.org. To speak with the corresponding author, Jordan E. Lake, MD, MSCR, please contact Jordan.E.Lake@uth.tmc.edu or 713-500-3030.

----------------------------    

3. Small gains in survival with modestly higher risk for adverse events for cancer patients participating in experimental clinical trials

Abstract: https://www.acpjournals.org/doi/10.7326/M23-2515    

URL goes live when the embargo lifts      

A systematic review and analysis of 128 trials found that cancer patients’ participation in experimental clinical trials is associated with survival gains that are statistically significant but not clinically important compared with patients given standard-of-care treatments. These gains also came at the cost of greater risk for serious adverse events (SAEs). The review is published in Annals of Internal Medicine.

Researchers from McGill University conducted a systematic review of 128 trials comprising 141 comparisons of a new drug with a comparator. These comparisons included 47,050 patients. The authors found that patients in experimental trials gained about 5 weeks of progression-free survival (PFS) and overall survival (OS) compared with patients receiving the usual standard of care. They also note that patients in phase 3 studies or in trials sponsored by large pharmaceutical companies seem to have greater clinical benefit. However, the authors also found that gains in PFS or OS are potentially offset by a modest but statistically significant increase in risk for SAEs among patients assigned to experimental groups, corresponding to a 7.40 percent increase in absolute risk for an SAE in this group. The authors believe their findings are best interpreted as suggesting that access to experimental interventions that have not yet received full FDA approval is associated with a marginal but nonzero clinical benefit.

Media contacts: For an embargoed PDF, please contact Angela Collom at acollom@acponline.org. To speak with the corresponding author, Jonathan Kimmelman, PhD, please contact jonathan.kimmelman@mcgill.ca.

----------------------------  

4. Ultrasound may be a viable first-line diagnostic tool for persons with suspected GCA

Abstract: https://www.acpjournals.org/doi/10.7326/M23-3417    

URL goes live when the embargo lifts      

A cohort study of persons with suspected giant cell arteritis (GCA) found that using ultrasound of the temporal arteries as a first-line diagnostic tool in patients with high clinical suspicion of GCA avoided further diagnostic tests in patients with a positive ultrasound. The study is published in Annals of Internal Medicine.

Researchers from the Centre Hospitalier Rochefort, Rochefort, France, conducted a prospective cohort study of 165 patients with high clinical suspicion of GCA. The authors examined the use of color Doppler ultrasound of the temporal artery as a first-line diagnostic test and temporal artery biopsy (TAB) as a secondary test. The authors found that a diagnosis of GCA was confirmed in 44%, 17%, and 21% of patients by ultrasound, TAB, and clinical expertise and/or other imaging tests, respectively. According to the authors, their findings show that the use of temporal artery ultrasound may be an efficient way to make the diagnosis of GCA in patients with high clinical suspicion and to reduce imaging costs and the need for biopsy, thereby limiting complications and the need for a surgeon.

Media contacts: For an embargoed PDF, please contact Angela Collom at acollom@acponline.org. To speak with the corresponding author, Guillaume Denis, MD, please contact guillaume.denis@ght-atlantique17.fr.

----------------------------    

 

 

Haiti study suggests early-onset heart failure is prevalent form of heart disease in low-income countries



WEILL CORNELL MEDICINE





Early-onset heart failure is alarmingly common in urban Haiti—over 15-fold higher than previously estimated—according to a study conducted by Weill Cornell Medicine researchers in partnership with the Haitian medical organization GHESKIO. Heart failure occurs when the heart muscle can no longer pump an adequate amount of blood throughout the body.

The study indicates that the nature of cardiovascular disease in Haiti, and perhaps other low- and middle-income nations, differs from wealthier countries where ischemic heart disease, also called coronary heart disease, is prevalent. This condition, in which the heart doesn't receive enough blood flow and oxygen due to narrowing of the heart arteries, was assumed to be the global norm.

The findings, published on April 4 in The Lancet Regional Health—Americas, are from one of the world’s first population-based cardiovascular clinical studies in a low-income setting designed to understand the landscape of heart disease. The cohort included 3,003 residents in Haiti’s capital, Port-au-Prince, who were clinically assessed for heart conditions.

Dr. Margaret McNairy and Dr. Lily Yan, members of Weill Cornell Medicine’s Center for Global Health and the Division of General Internal Medicine, worked with Dr. Jean Pape, executive director of GHESKIO, to lead the research. Dr. Pape is also the Howard and Carol Holtzmann Professor in Clinical Medicine at Weill Cornell Medicine. They determined that heart failure affected nearly 12 percent of participants, with a median age of 57, much younger than in the United States.

“What we found differed strikingly from assumptions based on studies in wealthy countries,” said Dr. McNairy, the principal investigator and an associate professor of medicine who has worked with GHESKIO over the past decade. “It’s a paradigm shift for our understanding of cardiovascular disease in under-resourced countries and important to guide health planning for future prevention and treatment.”

Cardiovascular diseases cause more death and disability worldwide than any other type of condition. However, without data collected on the ground it was unclear whether they affect people in the same way globally.

“In a place like Haiti, people live in an environment where they encounter extreme poverty, more pollution and experience high levels of stress, including from civil unrest,” said corresponding author Dr. Yan, an assistant professor of medicine at Weill Cornell Medicine. “Consequently, we thought there would be differences in heart disease and the factors driving it.” This study tested that suspicion. 

Distinctive Risks for an All-Too-Common Problem

GHESKIO community outreach workers recruited residents from a random sample of households in urban Port-au-Prince and invited them to participate by visiting the GHESKIO clinic. Once there, participants answered questions about their health and received a comprehensive physical exam including blood pressure measurements and blood work. They also had non-invasive tests including an electrocardiogram (ECG), which measures the electrical activity of the heart, and an echocardiogram, which assesses heart function.

Delving into the data, the researchers used international guidelines to determine cardiovascular conditions, including heart failure, stroke, heart attack and chest pain. Nearly 15 percent of Haitian adults had one or more of these conditions, indicating cardiovascular diseases are a widespread issue. Heart failure, specifically stiffening of the muscle, affected the majority of these patients.

Their analysis found that elevated blood pressure, known as hypertension, was the most common risk factor for heart failure among Haitians. However, a previous study showed that only 13 percent of Haitian adults with hypertension had their blood pressure under control.

When establishing infrastructure to address cardiovascular diseases in Haiti, “this study suggests that we need to shift focus to early prevention of hypertension and heart failure,” Dr. McNairy said. The team is also studying the underlying factors contributing to this health crisis, including the adversity Haitians routinely face such as hunger, poverty, poor sanitation, stress and other factors.

This study was funded by NIH grants R01HL143788, D43TW011972, and K24HL163393, clinicaltrials.gov NCT03892265.

 

"BioBlitz" citizen science reveals urban biodiversity, guides management



AMERICAN INSTITUTE OF BIOLOGICAL SCIENCES




Citizen scientists are uncovering rare animal, plant, and fungi species in areas where they have never been seen before, increasing our knowledge of urban biodiversity and proving the existence of local species long thought extinct. The approach used is called a BioBlitz, a biological census in which citizen scientists contribute photographs or audio of living organisms they can see or hear in a designated area over a particular period, creating a snapshot of an area’s biodiversity.

In a recently published article in the journal BioScience, Dr. Esti Palma (University of Melbourne) and colleagues use the 2021 Melbourne City Nature Challenge as a case study to outline best practices for future BioBlitzes. During that effort, citizen scientists observed 135 different animal, plant and fungi species that had never been recorded in their local area. They also found 26 species that had not been recorded in Melbourne for at least 30 years. One rare species rediscovered was the thin strawberry weevil, a tiny species not seen for more than 44 years.

Palma explains that sprawl and population growth mean that it is crucial to understand what native and introduced plants and animals live in urban reserves and across public and private greenspace. “It can be hard for us to notice as we go about our busy lives, but cities are filled with indigenous insects, spiders and plants, as well as birds, frogs, fungi, small reptiles and invertebrates like snails,” he said. BioBlitzes are vital to managers, say the authors, because they present "local governments with a cost-effective tool to make informed, evidence-based management and policy decisions, improve education and engagement programs, foster cross-council collaborations, and support a stronger sense of environmental stewardship within the local community."

Coauthor Dr. Luis Mata (University of Melbourne) states that the 2021 evaluation also provided academically rigorous evidence of the benefits of citizen science events, as well as ways to make BioBlitzes even more useful, including conducting them across seasons or at night, with more tools and training to assist participants in collecting high-quality data. “As the citizen science movement grows, there is more potential for them to contribute timely, targeted, and high-quality records to shape local policies, as well as management, education, and research,” Dr Mata said.

The City Nature Challenge began in 2016 when staff at the Natural History Museum of Los Angeles and the California Academy of Sciences conceived a friendly competition between San Francisco and Los Angeles to see which city could record the largest number of species by the largest number of participants over an eight-day period.

This year, hundreds of Victorians used their smartphones or cameras to participate in the 2024 City Nature Challenge urban BioBlitz, which began on Friday, 26 April, across more than 25 councils in metropolitan Melbourne. More than 600 cities worldwide are participating this year.

Media enquiries: Mia Tyquin | +61 403 671 863| mia.tyquin@unimelb.edu.au

 

Long snouts protect foxes when diving headfirst in snow



CORNELL UNIVERSITY




ITHACA, N.Y. – When hunting for mice in winter, red and arctic foxes are known to plunge headfirst at speeds of 2-4 meters per second, but their sharp noses reduce the impact force in snow and protect them from injury, according to a new Cornell University study.

The fundamental research sheds light on the biomechanics of the unique hunting behavior (known as mousing), advances our understanding of animal adaptations and offers insights into snow injuries people experience during snowboarding or skiing.

The study was published April 29 in the Proceedings of the National Academy of Sciences.

While there have been many studies of water birds and animals such as porpoises and dolphins diving from air into water, interactions between animals and the air-snow interface have not been well-researched. Snow has fluid-like properties when light and fluffy, and solid-like properties when compacted, such as when people make snowballs.

“The fox’s sharp snout doesn’t significantly compress the snow, it penetrates it without much resistance,” said Sunghwan Jung, the paper’s corresponding author and professor of biological and environmental engineering. Jisoo Yuk, a doctoral student in Jung’s lab, is the paper’s first author.

In the study, the authors scanned skulls of red and arctic foxes (from the Canidae family) and lynx and puma skulls (from the Felidae family) at the American Museum of Natural History in Manhattan. They 3D-printed the skulls and attached each to a sensor that measured impact force. The skulls were then dropped into both snow and water, and the researchers entered data into computer models to compare impacts of both.
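As a rough, hypothetical illustration of how such drop-test data can be compared (this is not the authors' analysis code), the short Python sketch below computes the peak force and the impulse from a sampled force-time trace of the kind an impact sensor would record; the sampling rate and the two example traces are invented for the example.

```python
import numpy as np

def impact_summary(force_n, sample_rate_hz):
    """Summarize a force-time trace from an impact sensor.

    force_n:        1-D array of force readings in newtons.
    sample_rate_hz: sensor sampling rate.
    Returns the peak force (N) and the impulse (N*s), approximated as the
    sum of the readings times the sampling interval.
    """
    force_n = np.asarray(force_n, dtype=float)
    dt = 1.0 / sample_rate_hz
    return float(force_n.max()), float(force_n.sum() * dt)

# Hypothetical 20 ms traces sampled at ~10 kHz: a sharper snout spreads a similar
# impulse over a longer time, giving a lower peak force than a blunt one
# (illustrative shapes only, not measured data).
t = np.linspace(0.0, 0.02, 200)
sharp_snout = 30.0 * np.exp(-((t - 0.010) / 0.0040) ** 2)
blunt_snout = 90.0 * np.exp(-((t - 0.010) / 0.0015) ** 2)

for name, trace in [("sharp snout", sharp_snout), ("blunt snout", blunt_snout)]:
    peak, impulse = impact_summary(trace, sample_rate_hz=10_000)
    print(f"{name}: peak force {peak:.1f} N, impulse {impulse:.3f} N*s")
```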

Jung and colleagues found that the foxes’ sharp snouts penetrated the snow with little resistance, minimizing potential tissue damage during a headfirst dive. “Without much compression, in spite of the high-speed impact, the snow behaves like water,” Jung said. But the flat Felidae snouts compressed the snow upon impact, creating a large and potentially damaging resistance.

When mousing in snow, the fox’s long snout also allows it to reach its prey earlier, as mice are very sensitive to movements in their environment and can quickly escape. Other behavioral studies have shown that prior to pouncing, foxes shake their heads to listen to the rustling sounds of mice or other animals beneath the snow’s surface, thereby gauging the depth of the sound source.

“This is a very dangerous process, but we haven’t had reports of foxes getting injured,” Jung said.

The study was funded by the National Science Foundation.

For additional information, see this Cornell Chronicle story.

-30-

 

Laser imaging could offer early detection for at-risk artwork


Time is robbing some historical paintings of their yellow colors. Technique could spot the first signs of fading before they’re visible to the eye



DUKE UNIVERSITY




DURHAM, N.C. -- Look closely at Impressionist paintings in museums compared with photos of them taken 50 years ago, and you might notice something odd:  some are losing their bright yellow hues.

Take the dramatic sunset in Edvard Munch’s famous painting “The Scream.” Portions of the sky that were once a vivid orangish yellow have faded to off-white.

Likewise, some of the sunny yellow that Henri Matisse brushed between the reclining nudes in his painting “The Joy of Life” is now more of a drab beige.

Several other paintings from this period are facing similar issues. The bright yellow paint these artists used was made from the chemical compound cadmium sulfide. The pigment was beloved by many European artists of the late 19th and early 20th centuries. Claude Monet, Vincent van Gogh, and Pablo Picasso all brushed their canvasses with it.

“So many painters really loved this pigment,” said Yue Zhou, who earned her Ph.D. in the lab of Duke chemistry professor Warren Warren.

But as the decades passed, many artists and art conservators realized they had a problem:  Their cadmium yellow brushstrokes didn’t look as vibrant as they once did.

The passage of time exposes artwork to light, moisture, dust and other elements of nature that can make pigments vulnerable to fading and discoloration.

In a new study, Duke University researchers show that a laser microscopy technique they developed could offer a means of early detection, making it possible to identify the first tiny signs of color change even before they’re visible to the eye.

Several techniques exist to study what pigments were used in a painting and how much they’ve broken down. But they typically involve scraping off a tiny chip of paint with a scalpel to analyze its composition. That method can damage the piece and limits the area to be studied, Zhou said.

“It's a little like surgery,” she added.

Enter pump-probe microscopy. It can peer into layers of paint and detect chemical changes that mark the onset of a pigment’s decay, without taking cross-sections of the original artwork.

The technique uses ultra-fast pulses of harmless visible or near-infrared light, lasting less than a trillionth of a second, and measures how they interact with pigments in the paint. The resulting signals can be used as chemical fingerprints to identify which compounds are present.

By focusing the laser beam at different locations and depths within the sample, the researchers are able to create 3D maps of certain pigments and monitor what’s happening at scales as small as a hundredth of a millimeter.
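As a loose sketch of how fingerprint-based mapping of this kind can work in principle (a hypothetical illustration, not the Duke group's pipeline), the Python snippet below raster-scans a small 3D grid, compares the measured response at each point against reference fingerprints for known pigments, and labels each voxel with the best match. The reference curves, the measure_response interface, and the correlation-based matching are all assumptions made for the example.

```python
import numpy as np

# Hypothetical reference pump-probe responses ("fingerprints") for two pigments,
# sampled at a fixed set of pump-probe time delays. Real fingerprints would be
# measured on known pigment standards.
DELAYS = np.linspace(0.0, 10.0, 50)            # picoseconds (illustrative)
FINGERPRINTS = {
    "cadmium sulfide": np.exp(-DELAYS / 3.0),
    "cadmium sulfate": np.exp(-DELAYS / 0.8),
}

def best_match(signal, fingerprints):
    """Return the pigment whose reference fingerprint correlates best with the signal."""
    scores = {
        name: float(np.corrcoef(signal, ref)[0, 1])
        for name, ref in fingerprints.items()
    }
    return max(scores, key=scores.get)

def map_pigments(measure_response, shape_xyz, fingerprints=FINGERPRINTS):
    """Raster-scan a 3D grid and label each voxel with its best-matching pigment.

    measure_response(x, y, z) is assumed to return the pump-probe signal at that
    voxel (in a real setup this would drive the scanning stage and record the
    detector output).
    """
    nx, ny, nz = shape_xyz
    labels = np.empty(shape_xyz, dtype=object)
    for x in range(nx):
        for y in range(ny):
            for z in range(nz):
                labels[x, y, z] = best_match(measure_response(x, y, z), fingerprints)
    return labels

# Tiny demo with a fake sample whose deepest voxels have degraded to a sulfate-like response.
def fake_sample(x, y, z):
    ref = FINGERPRINTS["cadmium sulfate" if z > 1 else "cadmium sulfide"]
    noise = 0.05 * np.random.default_rng(x * 100 + y * 10 + z).normal(size=ref.size)
    return ref + noise

labels = map_pigments(fake_sample, (2, 2, 3))
print(labels[0, 0, :])   # expected: sulfide, sulfide, sulfate along depth
```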

For the new study, published April 26 in the Journal of Physics: Photonics, the researchers used pump-probe microscopy to analyze samples of cadmium yellow paint subjected to an artificial aging process.

In a lab on Duke’s west campus, Zhou stirred up samples of the famous color. Taking a bottle of powdered cadmium sulfide pigment off a shelf, she mixed it with linseed oil and then brushed it on microscope slides to dry.

Some samples were left in a dark and dry environment, protected from moisture and light damage. But the rest were placed in a special chamber and exposed to light and high humidity -- factors known to wreak havoc on unstable colors.

The researchers then imaged the paint samples using pump-probe microscopy to track the degradation progress on a microscopic scale.

Compared with control samples, the samples that got the aging treatment emerged looking the worse for wear. After four weeks in the aging chamber, they had faded to lighter shades of yellow.

But even before these changes became noticeable, clear signs of decay were already apparent in the pump-probe data, Zhou said.

The cadmium sulfide signal started to wane as early as week one, eventually decreasing by more than 80% by week four.

The signal loss is a result of chemical changes in the pigments, Zhou said. Moisture triggers the transformation of cadmium sulfide, which is yellow, into cadmium sulfate, which is white -- resulting in a whitish or dull cast.

Senior co-authors Warren and Martin Fischer originally developed the technique to analyze pigments in human tissue, not works of art -- to inspect skin moles for signs of cancer. But then they realized the same approach could be used for art conservation.

There is a caveat: while the technique spots early changes in a nondestructive way, conservators can’t easily recreate the bulky laser setup in their own museums.

In the future, the team says it might be possible to develop a cheaper, more portable version that can be used to study paintings that are too vulnerable or large to transport and analyze off site.

Of course, any color loss that has already happened can’t be reversed. But one day, art conservators might have a new tool to spot these changes earlier and take steps to slow or stop the process in its beginning stages.

The research has potential applications beyond artists’ pigments. Looking at cadmium yellow degradation in century-old paintings could help researchers better understand modern materials that are vulnerable to the elements too, such as the cadmium sulfide used in solar cells, Warren said.

This research was supported by grants from the National Science Foundation (CHE-2108623) and from the Chan Zuckerberg Initiative (2021242921).

CITATION: "Non-Destructive Three-Dimensional Imaging of Artificially Degraded CdS Paints by Pump-Probe Microscopy," Yue Zhou, David Grass, Warren S. Warren, and Martin C. Fischer. Journal of Physics: Photonics, April 14, 2024. DOI: 10.1088/2515-7647/ad3e65

 

 

Scientists harness the wind as a tool to move objects


New approach allows contactless or remote manipulation of objects by machines or robots



AALTO UNIVERSITY





Researchers have developed a technique to move objects around with a jet of wind. The new approach makes it possible to manipulate objects at a distance and could be integrated into robots to give machines ethereal fingers.

‘Airflow or wind is everywhere in our living environment, moving around objects like pollen, pathogens, droplets, seeds and leaves. Wind has also been actively used in industry and in our everyday lives – for example, in leaf blowers to clean leaves. But so far, we can’t control the direction the leaves move – we can only blow them together into a pile,’ says Professor Quan Zhou from Aalto University, who led the study.

The first step in manipulating objects with wind is understanding how objects move in the airflow. To that end, a research team at Aalto University recorded thousands of sample movements in an artificially generated airflow and used these to build templates of how objects move on a surface in a jet of air.

The team’s analysis showed that even though the airflow is generally chaotic, it’s still regular enough to move objects in a controlled way in different directions – even back towards the nozzle blowing out the air.

‘We designed an algorithm that controls the direction of the air nozzle with two motors. The jet of air is blown onto the surface from several meters away and to the side of the object, so the generated airflow field moves the object in the desired direction. The control algorithm repeatedly adjusts the direction of the air nozzle so that the airflow moves the objects along the desired trajectory,’ explains Zhou.

‘Our observations allowed us to use airflow to move objects along different paths, like circles or even complex letter-like paths. Our method is versatile in terms of the object’s shape and material – we can control the movement of objects of almost any shape,’ he continues.
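A minimal sketch of this kind of closed-loop control, under assumptions that are not from the Aalto implementation, might look like the Python below: a tracker reports the object's position, a calibration maps surface points to pan/tilt angles for the two nozzle motors, and the controller repeatedly re-aims the jet so the surface airflow nudges the object toward the next waypoint. The learned movement templates described above are reduced here to a single assumed offset rule (blow at a point on the far side of the object from its target), and all function names are hypothetical.

```python
import math

def aim_point(obj_xy, target_xy, offset=0.3):
    """Choose where to blow on the surface: a point on the far side of the object
    from its target, so the roughly radial surface flow spreading from the jet's
    impact point pushes the object toward the target (an assumed rule standing in
    for the learned movement templates)."""
    dx, dy = target_xy[0] - obj_xy[0], target_xy[1] - obj_xy[1]
    dist = math.hypot(dx, dy)
    if dist < 1e-9:
        return obj_xy
    ux, uy = dx / dist, dy / dist
    return (obj_xy[0] - offset * ux, obj_xy[1] - offset * uy)

def follow_trajectory(track_object, xy_to_angles, set_nozzle_angles,
                      waypoints, tolerance=0.05, max_steps=2000):
    """Repeatedly re-aim the nozzle so the airflow moves the object through the waypoints.

    track_object()               -> current (x, y) of the object, e.g. from a camera.
    xy_to_angles(x, y)           -> (pan, tilt) that makes the jet hit surface point (x, y)
                                    (a calibration the real system would have to provide).
    set_nozzle_angles(pan, tilt) -> command the two gimbal motors.
    """
    for target in waypoints:
        for _ in range(max_steps):
            pos = track_object()
            if math.hypot(target[0] - pos[0], target[1] - pos[1]) < tolerance:
                break                                  # close enough, next waypoint
            pan, tilt = xy_to_angles(*aim_point(pos, target))
            set_nozzle_angles(pan, tilt)
```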

The technology still needs to be refined, but the researchers are optimistic about the untapped potential of their nature-inspired approach. It could be used to collect items that are scattered on a surface, such as pushing debris and waste to collection points. It could also be useful in complex processing tasks where physical contact is impossible, such as handling electrical circuits.

‘We believe that this technique could get even better with a deeper understanding of the characteristics of the airflow field, which is what we’re working on next,’ says Zhou.

The article has been published in Advanced Intelligent Systems. DOI: http://doi.org/10.1002/aisy.202400174