It’s possible that I shall make an ass of myself. But in that case one can always get out of it with a little dialectic. I have, of course, so worded my proposition as to be right either way (K.Marx, Letter to F.Engels on the Indian Mutiny)
Wednesday, July 30, 2025
SPACE/COSMOS
University of Maryland geophysicist helps identify moonquake dangers that could threaten future missions
A computer simulation depicting the seismic waves emanating from a shallow moonquake occurring on the Lee-Lincoln scarp in the Taurus-Littrow Valley on the Moon and interacting with the Apollo 17 Lunar Module landing site. The audio is the corresponding vertical ground motion from the simulation. Both audio and video are sped up by a factor of 10. The background image is a globe mosaic image from the Lunar Reconnaissance Orbiter Camera Wide Angle Camera (LROC-WAC). Red and blue are positive (upward ground motion) and negative (downward ground motion) polarities of the wave.
A new paper revealed that ground acceleration from moonquakes, rather than meteor impacts, was responsible for shifting lunar landscapes at the moon’s Taurus-Littrow valley, where Apollo 17 astronauts landed in 1972. The study also pinpointed a possible cause for those surface changes and assessed damage risk using new models of the quakes—findings that may impact the safety of future lunar missions and the establishment of long-term bases on the moon.
The paper, authored by Smithsonian Senior Scientist Emeritus Thomas R. Watters and University of Maryland Associate Professor of Geology Nicholas Schmerr, was published in the journal Science Advances on July 30, 2025.
The scientists analyzed evidence from the Apollo 17 landing site, where NASA astronauts collected samples from boulder falls and landslides that were likely triggered by moonquakes. By studying the geological evidence left behind, the researchers were able to estimate the strength of these ancient moonquakes and identify their most probable source.
“We don’t have the sort of strong motion instruments that can measure seismic activity on the moon like we do on Earth, so we had to look for other ways to evaluate how much ground motion there may have been, like boulder falls and landslides that get mobilized by these seismic events,” Schmerr said.
The scientists found that moonquakes with magnitudes around 3.0—relatively weak by Earth standards but significant if close to the source—occurred repeatedly over the last 90 million years along the Lee-Lincoln fault, a geological fracture that crosses the valley floor. The pattern suggests that the fault, just one of thousands of similar faults on the moon, may still be active.
“The global distribution of young thrust faults like the Lee-Lincoln fault, their potential to be still active and the potential to form new thrust faults from ongoing contraction should be considered when planning the location and assessing stability of permanent outposts on the moon,” Watters said.
Watters and Schmerr also calculated lunar seismic risk, estimating a one in 20 million chance of a potentially damaging moonquake occurring on any given day near an active fault.
“It doesn’t sound like much, but everything in life is a calculated risk,” Schmerr noted. “The risk of something catastrophic happening isn’t zero, and while it’s small, it’s not something you can completely ignore while planning long-term infrastructure on the lunar surface.”
The researchers found that short-term missions, such as Apollo 17, were relatively low-risk, but longer-duration projects face incrementally higher exposure. Future missions with higher aspect ratio landers—such as the Starship Human Landing System—could be vulnerable to ground acceleration from nearby moonquakes that would threaten their stability. The findings are particularly relevant as NASA continues the Artemis program, which aims to establish a sustained human presence on the moon. Watters and Schmerr emphasized that future missions face additional considerations beyond Apollo-era risks.
“If astronauts are there for a day, they’d just have very bad luck if there was a damaging event,” Schmerr added. “But if you have a habitat or crewed mission up on the moon for a whole decade, that’s 3,650 days times 1 in 20 million, or the risk of a hazardous moonquake becoming about 1 in 5,500. It’s similar to going from the extremely low odds of winning a lottery to much higher odds of being dealt a four of a kind poker hand.”
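For readers who want to check that figure, here is a minimal sketch of the arithmetic, assuming the study's 1-in-20-million daily probability and treating days as independent (the quoted "3,650 times 1 in 20 million" is the linear approximation of the same calculation):

```python
# Cumulative moonquake risk, assuming an independent 1-in-20-million chance
# of a damaging event near an active fault on any given day (study figure).
p_daily = 1 / 20_000_000

def cumulative_risk(days: int) -> float:
    """Probability of at least one damaging moonquake over `days` days."""
    return 1 - (1 - p_daily) ** days

print(f"1-day stay:      1 in {1 / cumulative_risk(1):,.0f}")      # 1 in 20,000,000
print(f"10-year habitat: 1 in {1 / cumulative_risk(3650):,.0f}")   # ~1 in 5,500
```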
Schmerr believes that his work with Watters represents a new frontier in lunar paleoseismology—the study of ancient earthquakes. Unlike on Earth, where scientists can dig trenches to study historic seismic activity, lunar researchers must rely on creative approaches using existing data and samples. Schmerr expects this field to rapidly advance with new technology, high resolution orbital imaging, and future Artemis missions that will deploy seismometers with 50-plus years of technological improvements over the Apollo-era instruments.
“We want to make sure that our exploration of the moon is done safely and that investments are made in a way that’s carefully thought out,” Schmerr said. “The conclusion we came to is: don’t build right on top of a scarp, or recently active fault. The farther away from a scarp, the lesser the hazard.”
###
This article was adapted from text provided by the Center for Earth and Planetary Studies at the National Air and Space Museum, Smithsonian Institution.
This research was funded by NASA’s Lunar Reconnaissance Orbiter mission, launched on 18 June 2009. LRO is managed by the NASA Goddard Space Flight Center for the Science Mission Directorate. This article does not necessarily reflect the views of this organization.
Apollo 17 Astronaut Harrison H. Schmitt samples the boulder at Station 7 located at the base of North Massif in the Taurus-Littrow valley. This large boulder was dislodged by a strong moonquake that occurred about 28.5 million years ago. The source of the quake was likely an event on the Lee-Lincoln fault.
Paleoseismic activity in the moon’s Taurus-Littrow valley inferred from boulder falls and landslides
Article Publication Date
30-Jul-2025
Study proposes a new window for dark matter research
The model considers two DM particles, one stable and one unstable, as well as a vector mediator similar to the photon but with mass, which would mediate interactions with ordinary matter particles.
Fundação de Amparo à Pesquisa do Estado de São Paulo
It is now understood that all known matter, that is, the matter studied by science and harnessed by technology, constitutes only 5% of the content of the universe. The rest is composed of two unknown components: dark matter (about 27%) and dark energy (about 68%). This calculation, confirmed decades ago, continues to surprise both laypeople and scientists alike.
In the case of dark matter (DM), there is abundant evidence that it really exists, all resulting from its gravitational interaction with ordinary matter. This evidence comes from sources such as the rotation curves of stars in galaxies, discrepancies in the movement of galaxies in clusters, the formation of large-scale structures in the universe, and cosmic background radiation, which is distributed uniformly throughout space. Despite knowing with a high degree of certainty that DM exists, we do not know what it is. Several models proposed thus far have failed.
A new study by researchers at the University of São Paulo (USP) in Brazil proposes an inelastic DM model that interacts with ordinary matter through a vector mediator similar to a photon, but with mass. The aim is to open a new window of observation. An article on the subject was published in the Journal of High Energy Physics.
“In this work, we consider a DM model composed of a dark sector with light particles that interact weakly with the known particles of the Standard Model [SM],” says Ana Luisa Foguel, a Ph.D. student at the Physics Institute (IF-USP) and the first author of the article.
Context
Initially, the search for DM focused on heavy candidates with masses much greater than that of an electron or even the heaviest particles in the SM. The idea was that, because they were so massive, these particles could not be produced by particle colliders, which did not yet have sufficient energy. However, even with the experiments carried out at the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN, in its official French acronym), no new particles beyond those of the SM have been observed.
Consequently, some in the scientific community shifted their focus to searching for light particles with extremely weak interactions. The idea was that such particles had not yet been observed because they interact very weakly with ordinary matter. To investigate signs of these particles, experiments needed to move toward the so-called “intensity frontier,” meaning they would have to measure couplings and interactions with increasing precision to detect any discrepancies that might signal the existence of something new.
Thermal freeze-out
The study is moving in this direction. “When considering a new DM model, the first thing is to know how it was possible to produce the right amount of such a component. This amount is now measured very precisely, with data from cosmic background radiation, for example. And several mechanisms are known that could have led to the generation of DM in the early universe. One of the most theoretically motivated is the so-called ‘thermal freeze-out,’” says Foguel.
In particle physics and cosmology, thermal freeze-out is the moment when certain particles decouple from the thermal bath, meaning the interactions that transform these particles into other SM particles are no longer sufficient. After this point, since there are no processes that can alter the number of these particles, their abundance “freezes,” remaining virtually unchanged.
“This mechanism is interesting and well known, as we have several examples of SM particles whose abundance was generated in this way. Therefore, it’s natural to consider that the components of DM were generated by a similar mechanism,” comments the researcher.
In this mechanism, DM candidate particles are in a “thermal bath” with ordinary matter particles shortly after the beginning of the universe. In other words, all particles interact very quickly to share the same temperature. As the universe expands and cools, the particles lose this thermal contact. This process is called “freeze-out.”
“The exact moment of decoupling depends on the probability of interactions between DM particles and SM particles. This probability is parameterized by a quantity called the cross-section, sigma. If sigma is very small, DM particles decouple very early and their abundance is very high. Conversely, if it’s very large, the DM remains in thermal contact longer, annihilating itself into SM particles, so that when it decouples later, it doesn’t have sufficient abundance,” points out Foguel.
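The inverse relationship Foguel describes is captured by the standard textbook freeze-out estimate (a back-of-the-envelope relation from cosmology, not a formula quoted from the paper), in which the relic abundance scales inversely with the thermally averaged annihilation cross-section:

$$\Omega_{\rm DM} h^{2} \;\approx\; \frac{3\times 10^{-27}\ \mathrm{cm^{3}\,s^{-1}}}{\langle \sigma v \rangle}$$

A larger cross-section keeps dark matter in thermal contact longer and burns more of it away, leaving a smaller relic abundance; a smaller cross-section freezes it out early with too much.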
In the case of light DM, the interaction with ordinary matter occurs through a portal. In other words, DM particles do not couple directly with all SM particles, but rather with a mediator particle that facilitates interaction between DM and the SM. The cross-section of this interaction is proportional to the mass of the DM and inversely proportional to the mass of the portal particle. Thus, for a light candidate with a mass below the gigaelectronvolt scale to exist, the portal cannot be too heavy. Therefore, the SM bosons that mediate weak interactions (W+, W-, and Z0) would not function as portals. A new dark particle must be introduced to mediate between the DM and the SM.
“In our model, this particle that mediates the relationship between the two sectors is a vector boson (ZQ). It behaves like a photon, the particle that mediates electromagnetic interactions, but it has mass. In addition, the difference in this model is that this mediator also interacts directly with other SM particles,” says the researcher.
This mediator would connect the SM particles to the DM particles. According to the proposed model, there are two types of these particles: a stable particle (χ₁), which would make up DM itself, and a slightly heavier unstable particle (χ₂). These particles would always interact with the ZQ mediator together. In other words, the mediator would interact with both at the same time. This would constitute a specific type of DM called “inelastic dark matter.” In addition, χ₂ could decay into χ₁ and SM particles. This work demonstrates that these arrangements can explain the abundance of DM in the universe while circumventing the experimental limits that prevent its detection.
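Schematically, the defining feature of inelastic DM is that the mediator couples to the two states off-diagonally, so every vertex involves both χ₁ and χ₂ at once. In the generic notation common in the inelastic dark matter literature (not necessarily the paper's own conventions), the interaction term reads

$$\mathcal{L} \supset g_{D}\, Z_{Q}^{\mu}\, \bar{\chi}_{2}\gamma_{\mu}\chi_{1} + \mathrm{h.c.},$$

where $g_{D}$ is a generic dark-sector coupling. Because there is no diagonal χ₁–χ₁ coupling, direct detection requires up-scattering into the heavier χ₂, which is kinematically suppressed, a point Foguel expands on below.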
“It’s worth noting that models such as ours, with inelastic DM, are interesting because in addition to explaining the efficient generation of DM through the freeze-out mechanism, they also make it possible to circumvent the current limits of direct and indirect detection, as well as the limits of cosmology. The reason comes from the fact that, as χ₂ isn’t stable and interactions depend on χ₂, there isn’t enough χ₂ population during the recombination epoch to inject energy into the plasma, which could have modified the cosmic background radiation. And there’s also no χ₂ in the current universe to decay or annihilate with χ₁, producing signals that enable indirect detection. Furthermore, for χ₁ to interact in direct detection experiments, it’d have to transform into χ₂, which is very difficult because χ₂ is more massive,” Foguel explains.
Overcoming the “vanilla” model
According to the researcher, the proposed new model would serve as an alternative to the “vanilla” model of inelastic DM, in which the mediator does not couple directly to the Standard Model particles, interacting with them only indirectly. In particle physics, “vanilla” is used to designate the most basic and minimalist version of a model with the fewest theoretical ingredients possible.
“The ‘vanilla’ model has already been practically ruled out, because almost all of the parameters that reproduce the correct abundance of DM have been discarded by experimental searches. Thus, the main objective of our work was to show that by considering a simple modification of this model – allowing mediators with direct rather than indirect couplings – we can potentially ‘save’ inelastic DM,” Foguel explains.
“Considering the proposed models, we first calculated the abundance of DM using the freeze-out process and made a code available online that allows these calculations to be reproduced, showing the regions of the parameter space that produce inelastic DM with the correct abundance for different choices of the charge Q. After that, we focused on the limits of different experiments. And we concluded that, for certain models, new regions of the parameter space are ‘unlocked,’ that is, there are parameters that reproduce the correct abundance of DM and haven’t yet been excluded. Some of these parameter regions could be investigated in future experiments.”
Renata Zukanovich Funchal, a full professor at IF-USP, Foguel’s advisor, study coordinator, and co-author of the article, summarizes: “The use of more general vector mediators opens a new window for viable models of inelastic DM, with direct consequences for decay rates, experimental signatures, and cosmological limits.”
About São Paulo Research Foundation (FAPESP) The São Paulo Research Foundation (FAPESP) is a public institution with the mission of supporting scientific research in all fields of knowledge by awarding scholarships, fellowships and grants to investigators linked with higher education and research institutions in the State of São Paulo, Brazil. FAPESP is aware that the very best research can only be done by working with the best researchers internationally. Therefore, it has established partnerships with funding agencies, higher education, private companies, and research organizations in other countries known for the quality of their research and has been encouraging scientists funded by its grants to further develop their international collaboration. You can learn more about FAPESP at www.fapesp.br/en and visit FAPESP news agency at www.agencia.fapesp.br/en to keep updated with the latest scientific breakthroughs FAPESP helps achieve through its many programs, awards and research centers. You may also subscribe to FAPESP news agency at http://agencia.fapesp.br/subscribe.
Ribbons in the sky: Space radio telescope reveals plasma jet in a supermassive black hole binary candidate
An international team of astronomers has captured one of the most detailed images yet of a spectacular, ribbon-like jet emerging from the heart of the distant active galaxy, OJ 287
Left: The RadioAstron Space VLBI mission combined a spaceborne radio telescope (Spektr-R) with 27 ground-based radio telescopes worldwide to create a virtual telescope five times the Earth’s diameter, in an observation of the enigmatic quasar OJ 287. Right: Progressive zoom into OJ 287’s jet at increasing resolution. Top: 15 GHz VLBA (18 pc scale). Middle: 43 GHz VLBI (4.5 pc scale). Bottom: 22 GHz RadioAstron (1.8 pc scale), revealing the ribbon-like structure with multiple sharp bends for the first time.
An international team of astronomers has captured one of the most detailed images yet of a spectacular, ribbon-like jet emerging from the heart of the distant active galaxy, OJ 287. Enabled by the RadioAstron space telescope in combination with a global network of radio observatories, this breakthrough sheds new light on the extreme environments surrounding supermassive black holes and the powerful jets they launch into space.
Located about 5 billion light-years from Earth, OJ 287 has long intrigued scientists for its dramatic bursts of light and enigmatic behavior. It is suspected to hold in its central region a binary system of two supermassive black holes with a total mass exceeding a billion solar masses. Now, for the first time, researchers have peered into its core with high spatial resolution, revealing a sharply bent, continuous “ribbon” of plasma twisting and turning as it streams from the galaxy’s center.
The findings have been published in the scientific journal Astronomy & Astrophysics by researchers from across Spain, Germany, The Netherlands, South Korea, Italy, the United States, and Russia. Some of them have worked on the RadioAstron project since its inception. “I joined the project in 1979, before many of my co-authors were born,” says Prof. Leonid Gurvits of Delft University of Technology (The Netherlands), whose decades of expertise in space VLBI (Very Long Baseline Interferometry) and active galactic nuclei helped shape the study.
Huge virtual telescope
OJ 287 is one of the best places in the Universe to study how two gigantic black holes interact. Although black holes themselves are invisible, we can detect them because they pull in matter from their surroundings. As gas and dust spiral inward, they heat up and emit radiation that we can observe with telescopes.
The RadioAstron Space VLBI mission combined a spaceborne radio telescope with 27 ground-based radio telescopes worldwide in this study, creating a virtual telescope five times the diameter of Earth, thus achieving a resolution equivalent to reading a newspaper in New York from Delft. This remarkable clarity revealed not only the intricate ribbon shape but also regions within the jet hotter than 10 trillion Kelvin, signaling extreme energy and motion near the black hole.
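As a rough sanity check on that claim, the diffraction-limited resolution of the combined array can be estimated from the observing wavelength and the maximum baseline; the sketch below uses the 22 GHz band and five-Earth-diameter baseline mentioned in the figure caption, with illustrative numbers only:

```python
# Diffraction-limited resolution of the space-ground array, using the
# 22 GHz observing band and a baseline of roughly five Earth diameters.
c = 3.0e8                       # speed of light, m/s
freq = 22e9                     # observing frequency, Hz
baseline = 5 * 12_742e3         # ~5 Earth diameters, m

theta = (c / freq) / baseline   # resolution in radians, ~2e-10 rad
print(f"~{theta * 206_265 * 1e6:.0f} microarcseconds")

# Over the ~5,900 km from Delft to New York, that angle spans about a
# millimetre, roughly the stroke width of newspaper print.
print(f"~{theta * 5.9e6 * 1000:.1f} mm at 5,900 km")
```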
The birth of a shock wave
Polarisation measurements showed the jet’s magnetic field aligned along its length, providing new clues about how such jets are launched and shaped. The team also witnessed the very first moments of the birth of a new shock wave in the jet, which later collided with a stationary shock, an event that coincided with the historic detection of trillion-electron-volt gamma rays from OJ 287 in early 2017. “We captured the birth of a jet component and watched it travel down this beautiful ribbon until it hit a shock wave and produced the most energetic gamma rays ever detected from this source,” explains first author Dr. Thalia Traianou from Heidelberg University.
Black Hole Binaries and Gravitational Waves
These striking jet observations may also offer clues to a much deeper mystery. OJ 287 has puzzled astronomers since the 1880s with unusual brightness variations that follow a mysterious ~60-year cycle, suggesting the presence of not one, but two supermassive black holes locked in orbit. The newly imaged jet structure supports this idea. If two massive black holes are orbiting each other in the galaxy’s core, their motion could periodically twist and reorient the jet. Such effects are now becoming detectable with highly accurate radio recordings. The observed twist also aligns with long-term wobbling seen in OJ 287’s jet, likely caused by precession driven by the same orbital motion.
OJ 287 is therefore a prime target for studying how binary supermassive black hole systems evolve and eventually merge. The mergers are expected to generate powerful gravitational waves. These ripples in spacetime can potentially be detected, in particular by the joint ESA and NASA flagship mission LISA (Laser Interferometer Space Antenna) scheduled to launch in 2035.
A glimpse of what's to come
While this study uses only radio observations, it lays important groundwork for multi-messenger astronomy. This is a new approach that combines signals such as electromagnetic radiation, gravitational waves, and neutrinos to explore the Universe in unprecedented detail. Future observations of OJ 287 and similar systems could provide both types of signals, offering a more complete picture of black hole mergers.
“One of the beautiful things about fundamental science is the unpredictability of its impact,” says Gurvits. “When electricity was discovered two hundred years ago, no one could have imagined how deeply it would shape modern society. It’s the same with our research: we don’t know when and what its effects will be. But that uncertainty is part of what makes fundamental science so exciting. That said, it is certain that this RadioAstron study is a prelude to the upcoming transformational discoveries in the new era of multi-messenger astronomy.”
For more than 150 years, the brightness variations of the OJ 287 galaxy, five billion light years away, have both puzzled and fascinated astronomers, who suspect that two supermassive black holes are merging in its core. An international research team led by Dr Efthalia Traianou of Heidelberg University recently succeeded in taking an image of the heart of the galaxy at a special level of detail. The groundbreaking image, taken with the aid of a space radio telescope, shows a heretofore unknown, heavily curved segment of the plasma jet spinning off the galaxy’s center. The image provides new insights into the extreme conditions that prevail around supermassive black holes.
The core of the OJ 287 galaxy belongs to the class of blazars, which exhibit high activity and striking luminosity. The driving forces behind these active galactic cores are black holes. They absorb matter from their surroundings and can fling it off in the form of giant plasma jets composed of cosmic radiation, heat, heavy atoms, and magnetic fields. “We have never before observed a structure in the OJ 287 galaxy at the level of detail seen in the new image,” emphasizes Dr Traianou, a postdoctoral researcher in the team of Dr Roman Gold at the Interdisciplinary Center for Scientific Computing of Heidelberg University.
The image, which penetrates deep into the galaxy’s center, reveals the sharply curved, ribbon-like structure of the jet; it also points to new insights into the composition and the behavior of the plasma jet. Some regions exceed temperatures of ten trillion kelvin – evidence of extreme energy and movement being released in close proximity to a black hole. The researchers also observed the formation, spread, and collision of a new shock wave along the jet and link it to an unusual gamma-ray measurement from 2017, with energies in the trillion-electronvolt range.
The image in the radio range was taken with a ground-space radio interferometer consisting of a radio telescope in Earth orbit – the ten-meter antenna of the RadioAstron mission on board the Spektr-R satellite – and a network of 27 ground observatories distributed across the Earth. In this way, the researchers were able to create a virtual telescope with a diameter five times greater than that of the Earth; its high resolution stems from the large distances between the individual radio observatories. The image is based on interferometry, a measurement technique that exploits the wave nature of light and the interference of overlapping waves.
The interferometric image underpins the assumption that a binary supermassive black hole is located inside galaxy OJ 287. It also provides important information on how the movements of such black holes influence the form and orientation of the plasma jets emitted. “Its special properties make the galaxy an ideal candidate for further research into merging black holes and the associated gravitational waves,” states Efthalia Traianou.
Institutions from Germany, Italy, Russia, Spain, South Korea, and the US all contributed to the research. It was supported by various research and funding institutions. The research results were published in the journal “Astronomy & Astrophysics”.
A new 'cosmic veil' developed by engineers at the University of Surrey could help perovskite solar cells survive in space, opening a new gateway to lighter, cheaper and more efficient solar power for satellites and spacecraft.
Perovskite solar cells are a next-generation lightweight, low-cost solar technology that can be made more easily and last longer than traditional panels – but they are still vulnerable to damage in the harsh conditions of space.
Working with partners at Oxford University, the University of New South Wales in Australia, and institutions across South Korea – including Chungbuk National University, Gyeongsang National University and KRICT – researchers from Surrey's Advanced Technology Institute have created a thin protective coating using propane-1,3-diammonium iodide (PDAI₂).
The study has been published in the journal Joule.
Dr Jae Sung Yun, Lecturer in Energy Technology and co-author of this study from the University of Surrey, said:
"Perovskite solar cells are promising for space, but the various sources of radiation in our solar system are still a major threat – especially to the organic molecules that make them work. Our coating helps protect those fragile parts, stopping them from breaking down and helping the cells stay efficient for longer."
To test the coating's effect, the team exposed treated and untreated versions to high levels of proton radiation – simulating more than 20 years of exposure in low-Earth orbit. The treated cells held up far better. They lost significantly less efficiency and showed fewer signs of internal damage, thanks to the protective layer stopping harmful chemical reactions before they could take hold. PDAI₂ works by stabilising unstable molecules, preventing them from reacting and turning into gases like ammonia or hydrogen, which would otherwise escape and weaken the cell.
Professor Ravi Silva, Director of the Advanced Technology Institute and Interim Director of the Surrey Institute for Sustainability at the University of Surrey, said:
"This project is a brilliant example of how our cross-institute collaborations can deliver real impact. By bringing together expertise from the Advanced Technology Institute, the Surrey Ion Beam Centre, and the Institute for Sustainability, we're able to tackle complex global challenges – like developing the next generation of clean energy technologies for space."
Reaction scheme and energetics of the investigated reaction of the helium hydride ion with deuterium. It is a swift and barrierless reaction, contrary to earlier theories. Background: The planetary nebula NGC 7027, with molecular hydrogen visible in red.
Credit: Schematic: MPIK; Background Image: W. B. Latter (SIRTF Science Center/Caltech) and NASA
Immediately after the Big Bang, which occurred around 13.8 billion years ago, the universe was dominated by unimaginably high temperatures and densities. However, after just a few seconds, it had cooled down enough for the first elements to form, primarily hydrogen and helium. These were still completely ionised at this point, as it took almost 380,000 years for the temperature in the universe to drop enough for neutral atoms to form through recombination with free electrons. This paved the way for the first chemical reactions.
The first molecule to form was the helium hydride ion (HeH+), created from a neutral helium atom and a proton (an ionised hydrogen atom). Its formation marked the beginning of a chain of reactions leading to molecular hydrogen (H₂), which is by far the most common molecule in the universe.
Recombination was followed by the 'dark age' of cosmology: although the universe was now transparent due to the binding of free electrons, there were still no light-emitting objects, such as stars. Several hundred million years passed before the first stars formed.
During this early phase of the universe, however, simple molecules such as HeH⁺ and H₂ were essential to the formation of the first stars. In order for the contracting gas cloud of a protostar to collapse to the point where nuclear fusion can begin, heat must be dissipated. This occurs through collisions that excite atoms and molecules, which then emit this energy in the form of photons. Below approximately 10,000 degrees Celsius, however, this process becomes ineffective for the dominant hydrogen atoms. Further cooling can only take place via molecules that can emit additional energy through rotation and vibration. Due to its pronounced dipole moment, the HeH⁺ ion is particularly effective at these low temperatures and has long been considered a potentially important candidate for cooling in the formation of the first stars. Consequently, the concentration of helium hydride ions in the universe may significantly impact the effectiveness of early star formation.
During this period, collisions with free hydrogen atoms were a major degradation pathway for HeH⁺, forming a neutral helium atom and an H₂⁺ ion. These subsequently reacted with another H atom to form a neutral H₂ molecule and a proton, leading to the formation of molecular hydrogen.
Researchers at the Max-Planck-Institut für Kernphysik (MPIK) in Heidelberg have now successfully recreated this reaction under conditions similar to those in the early universe for the first time. They investigated the reaction of HeH⁺ with deuterium, an isotope of hydrogen containing an additional neutron in the atomic nucleus alongside a proton. When HeH⁺ reacts with deuterium, an HD⁺ ion is formed instead of H₂⁺, alongside the neutral helium atom.
The experiment was carried out at the Cryogenic Storage Ring (CSR) at the MPIK in Heidelberg — a globally unique instrument for investigating molecular and atomic reactions under space-like conditions. For this purpose, HeH⁺ ions were stored in the 35-metre-circumference ion storage ring for up to 60 seconds at a few kelvins (-267 °C), and were superimposed with a beam of neutral deuterium atoms. By adjusting the relative speeds of the two particle beams, the scientists were able to study how the collision rate varies with collision energy, which is directly related to temperature.
They found that, contrary to earlier predictions, the rate at which this reaction proceeds does not slow down with decreasing temperature, but remains almost constant. “Previous theories predicted a significant decrease in the reaction probability at low temperatures, but we were unable to verify this in either the experiment or new theoretical calculations by our colleagues,” explains Dr Holger Kreckel from the MPIK. “The reactions of HeH⁺ with neutral hydrogen and deuterium therefore appear to have been far more important for chemistry in the early universe than previously assumed,” he continues. This observation is consistent with the findings of a group of theoretical physicists led by Yohann Scribano, who identified an error in the calculation of the potential surface used in all previous calculations for this reaction. The new calculations using the improved potential surface now align closely with the CSR experiment.
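One familiar reference point for why a barrierless ion–neutral reaction can keep an almost constant rate down to low temperatures is the classical Langevin capture model (a textbook result, offered here only as context, not as the authors' new calculation). In Gaussian units the capture rate coefficient is

$$k_{\rm L} = 2\pi e \sqrt{\frac{\alpha}{\mu}},$$

which depends only on the elementary charge $e$, the polarizability $\alpha$ of the neutral collision partner, and the reduced mass $\mu$ of the pair, and not on temperature at all.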
Since the concentrations of molecules such as HeH⁺ and molecular hydrogen (H₂ or HD) played an important role in the formation of the first stars, this result brings us closer to solving the mystery of their formation.
Experimental confirmation of barrierless reactions between HeH+ and deuterium atoms suggests a lower abundance of the first molecules at very high redshifts
Article Publication Date
29-Jul-2025
International research led by York University professor sheds light on 'lava planets'
Novel models of planetary interiors give scientists a framework to interpret current observations of distant exoplanets from space- and ground-based telescopes
Artistic illustration of the internal structure of a lava planet in a cold state, showing a day‑side magma ocean overlain by a mineral atmosphere. The arrows indicate the direction of heat transport within the planet’s interior and the thermal radiation emitted from its night side.
TORONTO, July 29, 2025 – A new paper led by a York University professor and published today in Nature Astronomy introduces a simple theoretical framework to describe the evolution of the coupled interior–atmosphere system of hot rocky exoplanets known as “lava planets.”
“Lava planets are in such extreme orbital configurations that our knowledge of rocky planets in the solar system does not directly apply, leaving scientists uncertain about what to expect when observing lava planets,” says first author Charles-Édouard Boukaré, Assistant Professor in York University’s Department of Physics and Astronomy in the Faculty of Science.
“Our simulations propose a conceptual framework for interpreting their evolution and provide scenarios to probe their internal dynamics and chemical changes over time. These processes, though greatly amplified in lava planets, are fundamentally the same as those that shape rocky planets in our own solar system.”
Exotic worlds may unveil processes driving planetary evolution
Lava planets are Earth- to super-Earth–sized worlds orbiting extremely close to their host stars, completing an orbit in less than a single Earth day. Much like Earth’s Moon, they are expected to be tidally locked, always showing the same face to their star. Their dayside surfaces reach such extreme temperatures that silicate rocks melt – and even vaporize – creating conditions unlike anything in our solar system. These exotic worlds, easily observable due to their ultra-short orbital periods, provide unique insights into the fundamental processes that shape planetary evolution.
Probing planetary interiors through atmosphere and surface properties
The study combines expertise in geophysical fluid mechanics, exoplanetary atmospheres, and mineralogy to explore how the compositions of lava planets evolve through a process akin to distillation. When rocks melt or vaporize, elements such as magnesium, iron, silicon, oxygen, sodium, and potassium partition differently between vapor, liquid, and solid phases. The unique orbital configuration of lava planets maintains vapor–liquid and solid–liquid equilibria over billions of years, driving long-term chemical evolution.
The paper, “The role of interior dynamics and differentiation on the surface and atmosphere of lava planets,” was co-authored by Daphné Lemasquerier (University of St Andrews), Nicolas B. Cowan (McGill University), Lisa Dang (University of Waterloo), Henri Samuel, James Badro, Aurélien Falco and Sébastien Charnoz (Université Paris Cité).
Using unprecedented numerical simulations, the team predicts two end-member evolutionary states:
• Fully molten interior (likely young planets): The atmosphere mirrors the bulk planetary composition, and heat transport within the molten interior keeps the nightside surface hot and dynamic.
• Mostly solid interior (likely older planets): Only a shallow lava ocean remains on the dayside, and the atmosphere becomes depleted in elements such as sodium, potassium, and iron.
Testing hypotheses with the James Webb Space Telescope
Boukaré explains that this research on lava exoplanets began as a highly exploratory effort with few initial expectations. It builds on a novel modeling approach he developed to study molten rocky planets in collaboration with colleagues at the Institut de Physique du Globe de Paris, Université Paris Cité, published in Nature earlier this year.
What began as an exploratory study has since opened a promising new line of research. The predictions outlined in this work helped secure 100 hours of observation time on the James Webb Space Telescope (JWST) — the most advanced infrared observatory ever built, featuring a 6.5‑metre segmented mirror and ultra‑sensitive instruments capable of probing the earliest galaxies and the atmospheres of distant exoplanets with unprecedented precision. These upcoming JWST observations, led by co-author Prof. Dang, will directly test the theoretical framework proposed in this study.
“We really hope we can observe and distinguish old lava planets from young lava planets. If we can do this, it would mark an important step toward moving beyond the traditional snapshot view of exoplanets,” says Boukaré.
Illustration credit: Romain Jean-Jaques (Instagram: @romainjean.jacques)
-30-
York University is a modern, multi-campus, urban university located in Toronto, Ontario. Backed by a diverse group of students, faculty, staff, alumni and partners, we bring a uniquely global perspective to help solve societal challenges, drive positive change, and prepare our students for success. York's fully bilingual Glendon Campus is home to Southern Ontario's Centre of Excellence for French Language and Bilingual Postsecondary Education. York’s campuses in Costa Rica and India offer students exceptional transnational learning opportunities and innovative programs. Together, we can make things right for our communities, our planet, and our future.
Media Contact: Emina Gamulin, York University Media Relations, 437-217-6362, egamulin@yorku.ca
The instrument NIRPS is installed on the 3.6-metre telescope at La Silla Observatory in Chile. In this picture: a spectrum of the star Proxima Centauri taken by NIRPS during preliminary tests in June 2023.
An international team led by the Universities of Geneva (UNIGE) and Montreal published the first results today from the NIRPS spectrograph installed on the European Southern Observatory’s (ESO) 3.6-meter telescope in La Silla, Chile. This new instrument, which operates in the near infrared, offers exceptional performance in detecting and characterising exoplanets, particularly around red dwarfs. By combining NIRPS with the HARPS spectrograph, which operates in visible light, astronomers have access to unrivalled spectral coverage for studying exoplanets. The first five scientific papers can be found in the journal Astronomy & Astrophysics.
The Near-InfraRed Planet Searcher (NIRPS) is a high-resolution spectrograph designed to search for and study exoplanets around stars smaller and cooler than our Sun. Located on the 3.6-meter telescope at La Silla Observatory in Chile, NIRPS officially began scientific observations in April 2023. Its development and construction are the result of a large consortium of scientists and engineers from Canada, Switzerland, Spain, Portugal, France, and Brazil, with the support of the European Southern Observatory (ESO). More than 140 experts contributed to the project, including a large team from the Astronomy Department of the UNIGE Faculty of Science and the National Research Centre PlanetS.
NIRPS is specially designed to observe in the near-infrared wavelengths. Next to it is the HARPS spectrograph, also designed by Swiss scientists, which has been hunting for exoplanets in visible light since 2003.
“This new instrument is the result of technological innovations and the fruit of an international collaboration,” says François Bouchy, Associate Professor at the Department of Astronomy, co-leader of the NIRPS project and lead author of the paper describing the instrument’s performance and scientific objectives. “We are proud of the unique and unrivalled performance of NIRPS and excited by the first scientific results.”
The combination of HARPS and NIRPS offers outstanding spectral coverage for studying and searching for exoplanets. The unique performance of the HARPS+NIRPS tandem makes it one of ESO’s most requested astronomical instruments of the past semester. In parallel with this first-light paper, which accompanies the commissioning and science validation of the brand-new instrument, the consortium is publishing four papers in the journal A&A with the first astrophysical results from NIRPS observations.
Scrutinising the atmosphere of exoplanets
The precision of NIRPS in the near-infrared and the possibility of combining it with HARPS in the visible make it possible to study the atmospheres of planets as they pass in front of their star. For their first observations, the astronomers examined the atmospheres of two well-known gas giant exoplanets: WASP-189 b and WASP-69 b.
The former has one of the most extreme atmospheres, so extreme that evaporated iron can be detected. However, it is only detected in the visible with HARPS and not in the near infrared with NIRPS. “Iron also exhibits spectral signatures in the near infrared. So we should be able to detect it with NIRPS too!” explains Valentina Vaulato, PhD student at the Department of Astronomy and first author of the study conducted on WASP-189 b. “Hence there must be another chemical element hiding the iron signature in the near infrared but not in the visible. The hydride anion – a hydrogen atom with two electrons instead of one – is our prime suspect,” concludes the researcher.
NIRPS observations of the second exoplanet, WASP-69 b, reveal a long tail of helium gas escaping from its comet-like atmosphere. This observation, one of the most detailed of its kind, sheds new light on the evolution of planetary atmospheres under the effect of intense radiation from the host star.
Detecting exoplanets in the infrared
NIRPS’ prime targets are the cool red stars known as M dwarfs, by far the most common stars in the galaxy, as they shine more brightly in the near-infrared than in the visible. In its first months of operation, scientists from the NIRPS consortium were able to confirm with unprecedented accuracy the presence of Proxima Centauri b, an Earth-like planet located in the habitable zone of the red dwarf Proxima Centauri, the closest star to our solar system. The team also found evidence of a second, less massive planet orbiting this star.
NIRPS is also the only near-infrared instrument to observe our Sun every day, to better understand stellar activity and how to limit its impact on the characterisation of exoplanet atmospheres and the detection of Earth-like exoplanets.
SwRI’s 1,260-square-foot indoor spherical near-field antenna range is equipped with a built-in overhead half-ton hoist to install and test large antennas up to 10 feet in diameter and 1,000 pounds. The chamber allows data collection when the antenna far-field distance exceeds the size of the range, a new capability for SwRI.
SAN ANTONIO — July 29, 2025 — Southwest Research Institute (SwRI) is expanding its antenna measurement capabilities with a state-of-the-art spherical near-field antenna range. The 1,260-square-foot indoor range, lined with radio frequency and microwave foam absorbers, is equipped to accurately sample the near field of an antenna. Near-field measurements can be mathematically transformed into far-field data.
“Near field” refers to the complex electromagnetic fields close to the antenna, while the “far field” encompasses the predictable planar waves farther away from the antenna. Analyzing both fields allows a more complete performance evaluation of an antenna under test. Near-field measurements are typically collected in a planar, cylindrical or spherical formation.
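A small illustration of why this matters, using the conventional Fraunhofer far-field criterion R ≥ 2D²/λ (a standard antenna-theory rule of thumb, not an SwRI formula), with the chamber's stated 10-foot antenna size and 40 GHz upper frequency:

```python
# Conventional far-field (Fraunhofer) distance: R >= 2 * D^2 / wavelength.
c = 3.0e8  # speed of light, m/s

def far_field_distance(diameter_m: float, freq_hz: float) -> float:
    """Distance beyond which an antenna's radiation is effectively planar."""
    wavelength = c / freq_hz
    return 2 * diameter_m ** 2 / wavelength

# A 10-foot (~3 m) antenna measured at 40 GHz:
print(f"{far_field_distance(3.0, 40e9):,.0f} m")  # ~2,400 m, far beyond any indoor room
```

No indoor chamber is kilometres long, which is why the range samples the near field and mathematically transforms the data into far-field patterns.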
“Spherical data collection is the most comprehensive and flexible method of measuring antenna patterns for all antenna types,” said Dr. Jimmy Li, a lead engineer in SwRI’s Defense and Intelligence Solutions Division. “We get the full 3D radiation pattern data for an antenna — not just limited perspectives obtained with other methods.”
Antennas enable the transmission and reception of signals and are the interface between electromagnetic waves and electronic devices. Testing is necessary to evaluate performance and ensure compliance with industry standards and regulations. Antennas are crucial for a wide range of technologies, including cellular networks, Wi-Fi, radar, satellite communications and positioning and navigation systems.
SwRI’s spherical near-field antenna range offers several new advantages, including:
No restrictions associated with antenna far-field distances exceeding the size of the range because far-field data can be calculated from near-field measurements
A built-in, overhead, half-ton hoist to install large antennas up to 10 feet in diameter and 1,000 pounds
Operation at frequencies from 200 megahertz to 40 gigahertz
Faster data collection enabled by continuous-rotation sampling
Full characterization of antenna radiated patterns, including multi-polarization 3D patterns across all angles
Faulty antenna element diagnostics, array performance evaluations, radome systems tuning and reflector surface area mapping
No limitations due to weather fluctuations that impact outdoor ranges
No limitations from Federal Communications Commission (FCC) regulations that impact outdoor ranges
“Because the range is indoors, we do not have to follow FCC requirements regarding antenna height and other transmission restrictions for outdoor antenna testing,” said Nils Smith, vice president of SwRI’s Defense and Intelligence Solutions Division. “We can now perform thorough testing of antennas on-site at SwRI, and we have more versatility. That translates into a stronger analysis of antenna performance for our clients.”
In the next phase of development, the SwRI spherical near-field antenna range will support emerging millimeter wave technologies, a crucial component for ultra-fast 5G data transmission.
SwRI designs, develops and tests antennas and signal processing software for government and industry clients, including the Navy, Air Force, Marines and intelligence entities.
The antenna testing chamber, lined with radio frequency and microwave foam absorbers, is equipped to accurately sample the near field of an antenna. Those measurements are then mathematically transformed into far-field data. “Near field” refers to the complex electromagnetic fields close to the antenna, while the “far field” encompasses predictable planar waves farther away from the antenna. Analyzing both fields allows a more complete performance evaluation of an antenna under test.
SwRI’s new indoor antenna range supports 3D spherical data collection, the most comprehensive and flexible method of measuring antenna patterns for all antenna types. Other techniques, such as planar or cylindrical collection methods, offer limited perspectives.
Credit
Southwest Research Institute
China's 30-meter Landsat composites: a new era in Earth observation
The new dataset introduces a seamless, annual Leaf-On Landsat composite data cube for China, spanning from 1985 to 2023. By overcoming challenges like cloud cover, sensor inconsistencies, and data gaps, this innovation provides researchers with a high-quality, readily available resource, streamlining environmental monitoring and land use analysis.
The increasing demand for high-quality, temporally consistent satellite imagery has highlighted gaps in data processing resources, particularly in China. Unlike researchers in the United States, who can draw on a preprocessed Analysis Ready Data (ARD) product for Landsat imagery, Chinese researchers have lacked such an efficient resource. This has led to time-consuming processing, limiting the accuracy and scope of their research. The new Landsat composite data cube fills this gap, offering a reliable, seamless dataset for environmental monitoring and land use studies, which is crucial for large-scale analyses and informed decision-making. Based on these challenges, there is a pressing need to develop such comprehensive and easily accessible datasets.
The research team, led by Yaotong Cai and colleagues, published (DOI: 10.34133/remotesensing.0698) their findings in Journal of Remote Sensing on July 2, 2025. They present the first-ever seamless, annual Leaf-On Landsat composite data cube for China, covering the years 1985 to 2023. This data cube addresses major challenges such as cloud contamination, sensor calibration differences, and data gaps that hindered previous efforts. The newly developed composite dataset simplifies satellite data processing, making it an invaluable tool for monitoring vegetation dynamics and supporting land use policies.
The study introduces an innovative Landsat data cube, which aggregates images from multiple Landsat sensors. The team utilized advanced techniques like medoid compositing and gap-filling to generate high-quality, cloud-free composites. By leveraging segmented linear interpolation, they effectively addressed data gaps caused by cloud cover and sensor failures. The dataset spans 39 years, capturing vegetation dynamics across diverse regions of China. Its high temporal consistency and improved spectral fidelity make it a valuable resource for environmental assessments and long-term land use research.
The research team employed a systematic approach to generate the dataset, starting with Landsat imagery from 1985 to 2023. They used surface reflectance data from Landsat 4/5, 7, and 8/9, all processed through Google Earth Engine for consistency. Cloud and shadow contamination were removed using a quality assessment (QA) band, and data from different sensors were harmonized to reduce discrepancies. The core innovation lies in the medoid compositing method, which selects the most representative pixel each year, minimizing outliers and maintaining the integrity of the data. This method significantly reduces noise and enhances temporal consistency. To address missing data, the team applied gap-filling techniques, using segmented linear interpolation to seamlessly integrate data across years. The result is a comprehensive and reliable dataset that will serve as a foundation for future remote sensing studies.
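A minimal sketch of the medoid idea follows, assuming each pixel has several cloud-free observations in a given year, each a vector of surface-reflectance bands (the array contents are illustrative, not values from the dataset):

```python
import numpy as np

def medoid_composite(obs: np.ndarray) -> np.ndarray:
    """
    obs: (n_observations, n_bands) reflectance vectors for one pixel and year.
    Returns the observation with the smallest summed Euclidean distance to all
    the others, i.e. the most representative clear-sky spectrum.
    """
    dists = np.linalg.norm(obs[:, None, :] - obs[None, :, :], axis=-1)
    return obs[dists.sum(axis=1).argmin()]

# Three observations of one pixel in six Landsat bands; the outlier
# (residual cloud) is automatically avoided.
pixel_year = np.array([
    [0.04, 0.07, 0.06, 0.30, 0.22, 0.12],
    [0.05, 0.08, 0.07, 0.32, 0.23, 0.13],
    [0.45, 0.48, 0.50, 0.55, 0.40, 0.35],
])
print(medoid_composite(pixel_year))  # prints one of the clear spectra
```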
Dr. Yaotong Cai, lead author, commented, "This dataset is a significant breakthrough for environmental monitoring in China. It not only simplifies satellite data processing but also provides a long-term resource for research on land use, climate change, and biodiversity conservation. Our methodology offers a robust solution for handling the challenges posed by cloud cover and sensor inconsistencies, and we hope it will drive future research."
The study used Landsat imagery processed into surface reflectance using the Land Surface Reflectance Code (LaSRC). The medoid compositing method was employed to select the most representative pixel for each year, while data gaps were filled using a segmented linear interpolation algorithm. The study applied rigorous cloud and shadow masking, ensuring that the composite images were predominantly clear-sky, enhancing data accuracy. The dataset was assessed for consistency using correlation coefficients, ensuring the quality and reliability of the final product.
The newly developed dataset is poised to transform environmental research and land use management in China. Its applications include monitoring forest cover, assessing the effects of climate change, and supporting biodiversity conservation efforts. Future work will focus on refining the cloud and shadow masking algorithms, integrating additional satellite data sources for enhanced coverage, and expanding the dataset to include leaf-off periods. This ongoing improvement will further increase its usefulness in real-time environmental monitoring and global climate studies.
This work was supported in part by the National Science Foundation for Distinguished Young Scholars of China under Grant 42225107; in part by the National Key Research and Development Program of China under Grant 2022YFB3903402; and in part by the National Natural Science Foundation of China under Grant 42171409 and Grant 42171410.
The Journal of Remote Sensing, an online-only Open Access journal published in association with AIR-CAS, promotes the theory, science, and technology of remote sensing, as well as interdisciplinary research within earth and information science.
Scientists have been able to recreate the extreme conditions found on icy moons in deep space, revealing the unstable behaviour of water.
In the near-zero pressure environment of space, water behaves very differently from how it does on Earth. It simultaneously undergoes both boiling and freezing.
Icy moons have a frozen exterior, with liquid oceans lying below the ice crust. Just as lava reshapes the Earth’s surface through volcanic activity, water reshapes icy moons through a process called cryovolcanism.
To understand how the altered behaviour of water might be driving geologic change on the icy moons, researchers from the University of Sheffield, the Open University and the Czech Academy of Sciences used a specially-constructed low-pressure chamber to create the near-vacuum like conditions found on Europa and Enceladus.
Europa is an icy moon orbiting Jupiter; Enceladus orbits Saturn.
Both icy moons have a frozen exterior. On Enceladus the temperature at the equator is -193 degrees C. Astronomers have seen evidence of giant jets of water vapour and water particles being vented or ejected into space by a volcano-like process known as explosive cryovolcanism.
There is an allied process called effusive cryovolcanism, where liquid is released as a flow on the surface of the icy moons - akin to a lava flow found on Earth - although evidence for such activity is hard to detect.
The research team wanted to see if they could identify how effusive cryovolcanism happens by studying the behaviour of water in a near-vacuum environment. The findings are published in the journal Earth and Planetary Science Letters.
They used a low-pressure chamber known as ‘George’, the Large Dirty Mars Chamber, housed at the Open University. For the first time, scientists were able to run experiments with relatively large volumes of water and, through observation ports, filmed what was happening.
As pressure inside the chamber was lowered, the water began to bubble and boil, despite being cold. Boiling created vapour which transported heat away from the water, and the water cooled, reaching its freezing point - and floating pieces of ice formed. They continued to grow in size, with new ice forming around their edges.
Within a few minutes, most of the water was covered by thin ice.
Below the ice covering, the liquid water continued to boil, with bubbles breaking through or deforming the ice layer, allowing water to effuse or escape through cracks onto the ice surface. Earlier studies involving much smaller volumes of water suggested thick ice would form and rapidly seal off the water to prevent further boiling.
Dr Frances Butcher, Research Fellow in the School of Geography and Planning at the University of Sheffield and one of the study’s authors, said: “The ice layer that forms is weak and full of holes and bubbles.
“If the ice was stronger, it would likely seal off the liquid water below and prevent further boiling. But our experiments show that as the water boils, the gas that is released gets trapped under the icy crust. Pressure builds, the ice cracks, the gas escapes, and liquid water can briefly seep through the cracks onto the surface of the ice - only to be exposed again to the low-pressure environment.
“As soon as new fractures appear, water begins to boil again, and the entire process repeats itself.”
On Earth, water follows well-known physical rules: it freezes below 0 °C and boils above 100 °C.
Dr Petr Broz, from the Institute of Geophysics at the Czech Academy of Sciences and lead author of the study, said: “We found that the freezing process of water under very low pressure is much more complex than previously thought.
“In such conditions, water rapidly boils even at low temperatures, as it is not stable under low pressure. Simultaneously, it evaporates and begins to freeze, driven by the intense cooling effect caused by the evaporation itself. The ice crust that forms is repeatedly disrupted by vapour bubbles, which lift and fracture the ice, significantly slowing down, complicating, and prolonging the freezing process.”
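An illustrative estimate of the effect Dr Brož describes, using the Clausius-Clapeyron relation with a constant heat of vaporisation (a rough approximation for orientation only, not a calculation from the study):

```python
import math

R = 8.314          # gas constant, J/(mol K)
L_vap = 44_000     # approximate enthalpy of vaporisation of water, J/mol
T_ref, P_ref = 373.15, 101_325   # boiling point of water at 1 atm

def boiling_point_k(pressure_pa: float) -> float:
    """Approximate boiling temperature (K) at a given ambient pressure."""
    return 1.0 / (1.0 / T_ref - R * math.log(pressure_pa / P_ref) / L_vap)

# Near the triple-point pressure of ~611 Pa the estimate gives roughly 1 C
# (the exact value is 0.01 C), so liquid water boils and freezes at
# essentially the same temperature - the behaviour seen in the chamber.
print(f"{boiling_point_k(611) - 273.15:.1f} °C")
```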
The researchers hope their investigation will help identify ancient signs of cryovolcanic activity not only on icy moons but across other celestial bodies in the Solar System.
The process the scientists observed of bubbles rising up and deforming the ice cap resulted in an uneven ice crust with bumps and depressions.
Manish Patel, Professor of Planetary Science at the Open University, who supervises the Mars simulation facility, said: “These topographic irregularities - caused by trapped vapour beneath the ice - may leave distinct signatures that could be detectable by orbiting spacecraft, for example by those equipped with radars, offering a potential new way to identify ancient cryovolcanic activity.
“This could provide valuable clues for planning future missions to these remote worlds—and help us better understand the still mysterious process of cryovolcanism.”
The research was funded by the Czech Science Foundation.
ENDS
Note to editors
For further details please contact David Lewis, media officer at the University of Sheffield: 07710 01328 or davidlewis@davidlewismedia.org
Author information
Dr. Petr Brož, lead author: petr.broz@ig.cas.cz
Dr. Frances Butcher, co-author: f.butcher@sheffield.ac.uk
Prof. Manish Patel, co-author: manish.patel@open.ac.uk
The University of Sheffield
The University of Sheffield is a leading Russell Group university, with a world-class reputation, ranked within the Top 100 universities in the world (QS World University Rankings 2026 and Times Higher Education World University Rankings 2025). Over 30,000 students from 150 countries study at Sheffield and in a truly global community, they learn alongside over 1,500 of the world’s leading academics.
Sheffield’s world-shaping research feeds into its excellent education. Students learn at the leading edge of discovery from researchers who are tackling today’s biggest global challenges.
At its core, Sheffield is a place where independent thinkers can come together in pursuit of a shared ambition. To ask bold questions, push boundaries, and make a difference. This is what makes the University of Sheffield one of the best in the world.
From the first documented use of penicillin as a therapy in 1930, to building Europe’s largest research-led manufacturing cluster, Sheffield’s inventive spirit and top quality research environment sets it apart.
Current research partners include Boeing, Rolls-Royce, Unilever, AstraZeneca, GlaxoSmithKline, Siemens and Airbus, as well as many government agencies and charitable foundations.
Sheffield was voted University of the Year in 2024 at the Whatuni Student Choice Awards - the largest annual university awards in the UK voted for exclusively by students. The award reflects a commitment to world-class education and an outstanding student experience. Its Students' Union, which is home to more than 350 societies and clubs, was also named Best Students’ Union in the UK for seven consecutive years between 2017-2024.
Over 300,000 Sheffield alumni from 205 different countries have a significant influence across the world, with six Nobel Prize winners amongst former staff and students.