Tuesday, April 29, 2025

SPACE/COSMOS

Flares from magnetized stars can forge planets’ worth of gold, other heavy elements


Flatiron Institute researchers calculate that a single flare from a supermagnetized star called a magnetar can produce the equivalent of 27 moons’ worth of the universe’s heaviest atoms, such as gold, platinum and uranium.



Simons Foundation

Image: Magnetar Flare

In an ejection that would have caused its rotation to slow, a magnetar is depicted losing material into space in this artist’s concept. The magnetar’s strong, twisted magnetic field lines (shown in green) can influence the flow of electrically charged material from the object, which is a type of neutron star.

Credit: NASA/JPL-Caltech





Astronomers have discovered a previously unknown birthplace of some of the universe’s rarest elements: a giant flare unleashed by a supermagnetized star. The astronomers calculated that such flares could be responsible for forging up to 10 percent of our galaxy’s gold, platinum and other heavy elements.

The discovery also resolves a decades-long mystery concerning a bright flash of light and particles spotted by a space telescope in December 2004. The light came from a magnetar — a type of star wrapped in magnetic fields trillions of times as strong as Earth’s — that had unleashed a giant flare. The powerful blast of radiation only lasted a few seconds, but it released more energy than our sun does in 1 million years. While the flare’s origin was quickly identified, a second, smaller signal from the star, peaking 10 minutes later, confounded scientists at the time. For 20 years, that signal went unexplained.
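
For a sense of scale, the Sun’s output over a million years follows directly from its luminosity; the short sketch below (using the standard solar luminosity value, which is not quoted in the article) works through that arithmetic and the average power a few-second flare of that energy would imply.

```python
# Energy the Sun radiates over 1 million years, from its luminosity.
SOLAR_LUMINOSITY_W = 3.8e26     # watts (standard value, not from the article)
SECONDS_PER_YEAR = 3.15e7

sun_energy_1myr_joules = SOLAR_LUMINOSITY_W * SECONDS_PER_YEAR * 1e6
print(f"Sun's output over 1 million years: {sun_energy_1myr_joules:.1e} J")

# The article says the flare released at least this much energy in a few
# seconds, which implies an enormous average power:
FLARE_DURATION_S = 3.0          # "a few seconds", per the text
print(f"Implied average flare power: ~{sun_energy_1myr_joules / FLARE_DURATION_S:.0e} W")
```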

Now, a new insight by astronomers at the Flatiron Institute’s Center for Computational Astrophysics (CCA) in New York City has revealed that the unexplained smaller signal marked the rare birth of heavy elements such as gold and platinum. In addition to confirming another source of these elements, the astronomers estimated that the 2004 flare alone produced the equivalent of a third of Earth’s mass in heavy metals. They report their discovery in a paper published on April 29 in The Astrophysical Journal Letters.

“This is really just the second time we've ever directly seen proof of where these elements form,” says study co-author Brian Metzger, a senior research scientist at the CCA and a professor at Columbia University; the first was the observation of neutron star mergers. “It’s a substantial leap in our understanding of heavy element production.”

Most of the elements we know and love today weren’t always around. Hydrogen, helium and a dash of lithium were formed in the Big Bang, but almost everything else has been manufactured by stars during their lives or in their violent deaths. While scientists thoroughly understand where and how the lighter elements are made, our understanding of where many of the heaviest, neutron-rich elements — those heavier than iron — are produced remains incomplete.

These elements, which include uranium and strontium, are produced in a set of nuclear reactions known as the rapid neutron-capture process, or r-process. This process requires an excess of free neutrons — something found only in extreme environments. Astronomers therefore considered supernovae and neutron star mergers the most promising candidate r-process sites.

It wasn’t until 2017 that astronomers were able to confirm an r-process site, when they observed the collision of two neutron stars. These stars are the collapsed remnants of former stellar giants and are made of a soup of neutrons so dense that a single tablespoon would weigh more than 1 billion tons. The 2017 observations showed that the cataclysmic collision of two of these stars creates the neutron-rich environment needed for the formation of r-process elements.
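
That tablespoon figure is easy to sanity-check. The sketch below uses a typical textbook neutron-star density of about 4 × 10^17 kg/m³ and a 15 ml tablespoon; both are assumptions, not numbers from the study.

```python
# Back-of-the-envelope check: mass of one tablespoon of neutron-star matter.
# Assumed values (typical textbook figures, not from the study):
NEUTRON_STAR_DENSITY = 4e17   # kg per cubic metre, roughly nuclear density
TABLESPOON_VOLUME = 15e-6     # cubic metres (15 millilitres)

mass_kg = NEUTRON_STAR_DENSITY * TABLESPOON_VOLUME
mass_billion_tonnes = mass_kg / 1e12          # 1 billion tonnes = 1e12 kg

print(f"{mass_kg:.1e} kg, about {mass_billion_tonnes:.0f} billion tonnes")
# -> roughly 6e12 kg, i.e. several billion tonnes, consistent with the
#    "more than 1 billion tons" figure above.
```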

However, astronomers realized that these rare collisions alone can’t account for all the r-process-produced elements we see today. Some suspected that magnetars, which are highly magnetized neutron stars, could also be a source.

Metzger and colleagues calculated in 2024 that giant flares could eject material from a magnetar’s crust into space, where r-process elements could form.

“It’s pretty incredible to think that some of the heavy elements all around us, like the precious metals in our phones and computers, are produced in these crazy extreme environments,” says Anirudh Patel, a doctoral candidate at Columbia University and lead author on the new study.

The group’s calculations show that these giant flares create unstable, heavy radioactive nuclei, which decay into stable elements such as gold. As the radioactive elements decay, they emit a glow of light, in addition to minting new elements.

The group also calculated in 2024 that the glow from the radioactive decays would be visible as a burst of gamma rays, a form of highly energetic light. When they discussed their findings with observational gamma-ray astronomers, the group learned that one such signal had in fact been seen decades earlier but had never been explained. Since there’s little overlap between the study of magnetar activity and research on heavy-element synthesis, no one had previously proposed element production as the cause of the signal.

“The event had kind of been forgotten over the years,” Metzger says. “But we very quickly realized that our model was a perfect fit for it.”

In the new paper, the astronomers used the observations of the 2004 event to estimate that the flare produced 2 million billion billion kilograms (about 2 × 10^24 kilograms) of heavy elements, roughly a third of Earth’s mass, or about 27 times the mass of the Moon. From this, they estimate that one to 10 percent of all r-process elements in our galaxy today were created in these giant flares. The remainder could come from neutron star mergers, but with only one magnetar giant flare and one merger ever documented, it’s hard to know exact percentages — or if that’s even the whole story.
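
The unit conversions behind those comparisons are straightforward; the sketch below uses standard reference masses for the Moon and Earth (not values quoted in the study) to express the estimated yield in familiar terms.

```python
# Express the estimated r-process yield of the 2004 flare in familiar units.
# "2 million billion billion kilograms" = 2e6 * 1e9 * 1e9 kg = 2e24 kg.
FLARE_YIELD_KG = 2e24

# Standard reference masses (not quoted in the study):
MOON_MASS_KG = 7.35e22
EARTH_MASS_KG = 5.97e24

print(f"~{FLARE_YIELD_KG / MOON_MASS_KG:.0f} Moon masses")    # ~27
print(f"~{FLARE_YIELD_KG / EARTH_MASS_KG:.2f} Earth masses")  # ~0.34
```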

“We can't exclude that there could be third or fourth sites out there that we just haven’t seen yet,” Metzger says.

“The interesting thing about these giant flares is that they can occur really early in galactic history,” Patel adds. “Magnetar giant flares could be the solution to a problem we’ve had where there are more heavy elements seen in young galaxies than could be created from neutron star collisions alone.”

To narrow down the percentages, more magnetar giant flares need to be observed. Telescopes like NASA’s Compton Spectrometer and Imager (COSI) mission, set to launch in 2027, will help better capture these signals. Large magnetar flares seem to occur every few decades in the Milky Way and about once a year across the visible universe — but the trick is to catch one in time.

“Once a gamma-ray burst is detected, you have to point an ultraviolet telescope at the source within 10 to 15 minutes to see the signal's peak and confirm r-process elements are made there,” Metzger says. “It’ll be a fun chase.”

###

About the Flatiron Institute

The Flatiron Institute is the research division of the Simons Foundation. The institute's mission is to advance scientific research through computational methods, including data analysis, theory, modeling and simulation. The institute's Center for Computational Astrophysics creates new computational frameworks that allow scientists to analyze big astronomical datasets and to understand complex, multi-scale physics in a cosmological context.

 

Chip-shop fish among key seabed engineers




University of Exeter

Image: Atlantic cod

Credit: Alex Mustard





Many of the fish we eat play a key role in maintaining the seabed – and therefore our climate, new research shows.

Convex Seascape Survey scientists assessed the role of fish in bioturbation (churning and reworking sediments) in shallow UK seas.

The Atlantic cod – a staple in chip shops – jointly topped the list of these important “ecosystem engineers” (along with Atlantic hagfish and European eel).

In total, 185 fish species were found to play a role in bioturbation – and 120 of these are targeted by commercial fishing.

“Ocean sediments are the world’s largest reservoir of organic carbon – so what happens on the seabed matters for our climate,” said University of Exeter PhD student Mara Fischer, who led the study.

“Bioturbation is very important for how the seabed takes up and stores organic carbon, so the process is vital to our understanding of how the ocean absorbs greenhouse gases to slow the rate of climate change.

“Bioturbation is also important for seabed and wider ocean ecosystems.

“We have a good understanding of how invertebrates contribute to global bioturbation – but until now, we have been missing half the story.

“Our study is the first to attempt to quantify the bioturbation impact of fish, and it shows they play a significant, widespread role.”  

Overfished and overlooked

Co-author Professor Callum Roberts, from the Centre for Ecology and Conservation at Exeter’s Penryn Campus in Cornwall, said: “We also found that species with the highest bioturbation impacts are among the most vulnerable to threats such as commercial fishing.

“Many of the largest and most powerful diggers and disturbers of seabed sediments, like giant skates, halibut and cod, have been so overfished they have all but vanished from our seas.

“These losses translate into big, but still uncertain, changes in the way seabed ecosystems work.”

The researchers examined records for all fish species living on the UK continental shelf, and found more than half have a role in bioturbation – sifting and excavating sediment during foraging, burrowing and/or building nests.

The researchers combined these different ways of reworking the sediments – termed bioturbation modes – with the size of each fish and how often it disturbs the sediment to calculate a bioturbation impact score for each species (a minimal sketch of this kind of scoring follows the examples below).

Examples include:

  • European eel. Bioturbation mode: burrower. Bioturbation score (out of 125): 100. IUCN conservation status: critically endangered. Fished primarily using traps and fyke nets, they are considered a delicacy in many parts of Europe and Asia – commonly prepared as smoked eel or dishes like eel pie and eel soup. Threats include climate change, diseases and parasites, habitat loss, pollutants and fishing.
  • Atlantic cod. Bioturbation mode: vertical excavator. Bioturbation score: 100. IUCN status: vulnerable. Primarily fished using trawling and longlining, they are consumed in many forms, including fish and chips, fresh fillets, salted cod, and cod liver oil. Threats include overfishing, climate change and habitat degradation. Populations have declined in several parts of its range, particularly the North Sea and West Atlantic.
  • Common skate. Bioturbation mode: lateral excavator. Bioturbation score: 50. IUCN status: critically endangered. Historically targeted by trawling and longlining, this species is now protected in several regions – but often caught accidentally (bycatch). Numbers have drastically declined due to overfishing. The species is vulnerable due to its large size, slow growth rate, and low reproductive rate – only about 40 eggs are laid every other year, and each generation takes 11 years to reach maturity.
  • Black seabream. Bioturbation mode: nest builder. Bioturbation score: 36. IUCN status: least concern. Primarily caught using bottom trawling, gillnets, and hook and line. Fishing during the spawning season in April and May can impact population replenishment. Bottom trawling at this time has the potential to remove the fish, nests and eggs.
  • Red gurnard. Bioturbation mode: sediment sifter. Bioturbation score: 16. IUCN status: least concern. Historically not of major interest to commercial fisheries, the species has been targeted more in recent years (including in Cornwall). It is mainly caught by trawlers. There is currently no management for any gurnard species in the EU: no minimum landing size, no quota, etc – which could lead to unsustainable fishing.
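
The press release does not spell out the scoring formula, but a maximum score of 125 is consistent with three traits each rated on a 1-to-5 scale and multiplied together. The sketch below illustrates that kind of multiplicative scoring; the trait ratings assigned to each species are hypothetical examples chosen only to reproduce the published totals, not data from the paper.

```python
# Illustrative sketch of a multiplicative bioturbation impact score.
# Assumption: three traits, each rated 1-5, multiplied to a maximum of 125.
# The per-species ratings below are made-up examples, not the study's data.

def impact_score(mode_rating: int, size_rating: int, frequency_rating: int) -> int:
    """Combine three 1-5 trait ratings into a single impact score (max 125)."""
    for rating in (mode_rating, size_rating, frequency_rating):
        if not 1 <= rating <= 5:
            raise ValueError("each rating must be between 1 and 5")
    return mode_rating * size_rating * frequency_rating

examples = {
    # species: (mode, size, frequency) -- hypothetical ratings only
    "European eel":   (5, 5, 4),
    "Atlantic cod":   (5, 5, 4),
    "Common skate":   (5, 5, 2),
    "Black seabream": (4, 3, 3),
    "Red gurnard":    (4, 2, 2),
}

for species, ratings in examples.items():
    print(f"{species:15s} -> {impact_score(*ratings)}")
```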

Julie Hawkins, another author of the study, commented: “Anyone who has spent time underwater, whether snorkelling or diving, knows that fish are constantly digging up the seabed.

“It’s hard to believe that such an obvious and important activity has been largely overlooked when it comes to understanding ocean carbon burial.”

The Convex Seascape Survey is a partnership between Blue Marine Foundation, the University of Exeter and Convex Group Limited. The ambitious five-year global research programme is the largest attempt yet to build a greater understanding of the properties of the ocean and its continental shelves and their role in the Earth’s carbon cycle, as part of the urgent effort to slow climate change.

The paper, published in the journal Marine Environmental Research, is entitled: “A functional assessment of fish as bioturbators and their vulnerability to local extinction.”

 

Greasing the wheels of the energy transition to address climate change and the fossil fuel phase-out



University of South Australia





The global energy system may face an inescapable trade-off between urgently addressing climate change and avoiding an energy shortfall, according to a new energy scenario tool developed by University of South Australia researchers and published in the open access journal Energies.

The Global Renewable Energy and Sectoral Electrification model, dubbed ‘GREaSE’, has been developed by UniSA Associate Professor James Hopeward with three civil engineering graduates.

“In essence, it’s an exploratory tool, designed to be simple and easy for anyone to use, to test what-if scenarios that aren’t covered by conventional energy and climate models,” Assoc Prof Hopeward says.

Three Honours students – Shannon O’Connor, Richard Davis and Peter Akiki – started working on the model in 2023, hoping to fill a critical gap in the energy and climate debate.

“When we hear about climate change, we’re typically presented with two opposing scenario archetypes,” Assoc Prof Hopeward says.

“On the one hand, there are scenarios of unchecked growth in fossil fuels, leading to climate disaster, while on the other hand there are utopian scenarios of renewable energy abundance.”

The students posed the question: what if the more likely reality is somewhere in between the two extremes? And if it is, what might we be missing in terms of risks to people and the planet?

After graduating, the team continued to work with Assoc Prof Hopeward to develop and refine the model, culminating in the publication of ‘GREaSE’ in Energies.

Using the model, the researchers have simulated a range of plausible future scenarios including rapid curtailment of fossil fuels, high and low per-capita demand, and different scenarios of electrification.

According to Richard Davis, “a striking similarity across scenarios is the inevitable transition to renewable energy – whether it’s proactive to address carbon emissions, or reactive because fossil fuels start running short.”

But achieving the rapid cuts necessary to meet the 1.5°C target set out in the Paris Agreement presents a serious challenge.

As Ms O’Connor points out, “even with today’s rapid expansion of renewable energy, the modelling suggests it can’t expand fast enough to fill the gap left by the phase-out of fossil fuels, creating a 20 to 30-year gap between demand and supply.

“By 2050 or so, we could potentially expect renewable supply to catch up, meaning future demand could largely be met by renewables, but while we’re building that new system, we might need to rebalance our expectations around how much energy we’re going to have to power our economies.”
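
GREaSE itself is free and open source, but even a toy calculation conveys the shape of the supply gap described above. The sketch below is not the GREaSE model: it simply assumes fossil supply is phased out linearly over 15 years while renewables grow from a small base at an assumed 8 percent per year, and counts how long total supply falls short of a flat demand. All numbers are illustrative.

```python
# Toy what-if scenario -- NOT the GREaSE model. All numbers are illustrative
# assumptions: fossil supply phased out linearly, renewables growing
# exponentially from a small base, demand held flat.

DEMAND = 100.0            # total demand (arbitrary energy units per year)
FOSSIL_START = 85.0       # fossil supply in year 0
RENEWABLE_START = 15.0    # renewable supply in year 0
PHASEOUT_YEARS = 15       # fossil supply reaches zero after this many years
RENEWABLE_GROWTH = 0.08   # assumed growth rate of renewable supply per year

shortfall_years = []
for year in range(61):
    fossil = max(FOSSIL_START * (1 - year / PHASEOUT_YEARS), 0.0)
    renewable = min(RENEWABLE_START * (1 + RENEWABLE_GROWTH) ** year, DEMAND)
    if fossil + renewable < DEMAND:
        shortfall_years.append(year)

print(f"Supply falls short of demand for {len(shortfall_years)} years "
      f"(years {shortfall_years[0]} to {shortfall_years[-1]} of the simulation)")
# With these assumptions the gap lasts roughly a quarter of a century,
# the same order as the 20- to 30-year gap described in the article.
```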

The researchers stress that the modelling is not an argument for abandoning emissions targets in favour of scaling up fossil fuels; doing so, they say, would only “push the transition a few more years down the road”.

Assoc Prof Hopeward says it is also unlikely that nuclear power could fill the gap, due to its small global potential.

“Even if the world’s recoverable uranium resources were much larger, it would scale up even more slowly than renewables like solar and wind,” he says.

“We have to face facts: our long-term energy future is dominated by renewables. We could transition now and take the hit in terms of energy supply, or we could transition later, once we’ve burned the last of the fossil fuel. We would still have to deal with essentially the same transformation, just in the midst of potentially catastrophic climate change.

“It’s a bit like being told by your doctor to eat healthier and start exercising. You’ve got the choice to avoid making the tough changes now, and just take your chances with surviving the heart attack later, or you get on with what you know you need to do. We would argue that we really need to put our global energy consumption on a diet, ASAP.”

The researchers have designed the model to be simple, free and open source, in the hope that it sparks a wider conversation around energy and climate futures.

 

Full paper details:

Hopeward, J., Davis, R., O'Connor, S. and Akiki, P. (2025). The Global Renewable Energy and Sectoral Electrification (GREaSE) Model for Rapid Energy Transition Scenarios. Energies, 18(9), 2205. https://www.mdpi.com/1996-1073/18/9/2205

 

 

 

Deepfakes now come with a realistic heartbeat, making them harder to unmask



Deepfakes could soon be able to evade a detection technique that monitors a person’s on-screen face for a pulse



Frontiers





Imagine a world where deepfakes have become so good that no detection mechanism can unmask them as imposters. This would be a bonanza for criminals and malignant state actors: for example, they might use deepfakes to slander rival political candidates or frame inconvenient defenders of human rights.

This nightmare scenario isn’t real yet, but for years, methods for creating deepfakes have been locked in a ‘technological arms race’ against detection algorithms. And now, scientists have shown that deepfakes have gained a significant advantage: the lack of a pulse no longer gives them away.

“Here we show for the first time that recent high-quality deepfake videos can feature a realistic heartbeat and minute changes in the color of the face, which makes them much harder to detect,” said Dr Peter Eisert, a professor at the Humboldt University of Berlin and the corresponding author of a new study in Frontiers in Imaging.

Deepfake creators use deep learning to manipulate videos and audio files. They alter facial expressions and gestures, for example swapping these between different people. Their purpose isn’t necessarily malign: for example, apps that can turn you into a cat or digitally age you are immensely popular and harmless fun.

The analysis of how light is transmitted through the skin and underlying blood vessels has long been indispensable in medicine, for example in pulse oximeters. Its digital cousin, so-called remote photoplethysmography (rPPG), is an emerging method in telehealth that uses webcams to estimate vital signs. But rPPG can, in theory, also be used in deepfake detectors.

In recent years, such experimental rPPG-based deepfake detectors have proven good at distinguishing between real and deepfaked videos. These successes led some experts to conclude that current deepfakes couldn’t yet mimic a realistic heart rate. But now, it appears that this complacent view is outdated.

Fake it till you make it

Eisert and colleagues first coded a state-of-the-art deepfake detector which automatically extracts and analyzes the pulse rate from videos. It uses novel methods to compensate for movement and remove noise, and needs just 10 seconds of video of a single person’s face to work.
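
The paper’s detector relies on its own motion compensation and denoising, but the core rPPG idea can be sketched in a few lines: average the green channel over a face region frame by frame, band-pass the resulting signal to plausible heart-rate frequencies, and read off the dominant frequency. The sketch below is a generic illustration of that idea, not the authors’ method; the pre-cropped face frames and the synthetic test clip are assumptions.

```python
# Generic rPPG pulse-rate sketch (NOT the method from the paper):
# mean green-channel intensity over a face region, band-pass filtered,
# dominant frequency converted to beats per minute.
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_pulse_bpm(frames: np.ndarray, fps: float,
                       low_hz: float = 0.7, high_hz: float = 3.0) -> float:
    """frames: array of shape (n_frames, height, width, 3), RGB, already
    cropped to the face region (face detection is omitted here)."""
    # 1. One sample per frame: mean green-channel intensity over the crop.
    signal = frames[..., 1].mean(axis=(1, 2)).astype(float)
    signal -= signal.mean()

    # 2. Band-pass to plausible heart rates (0.7-3.0 Hz = 42-180 bpm).
    b, a = butter(3, [low_hz, high_hz], btype="bandpass", fs=fps)
    filtered = filtfilt(b, a, signal)

    # 3. Dominant frequency of the filtered signal, in beats per minute.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    return float(freqs[np.argmax(spectrum)] * 60.0)

# Example with synthetic data: a 10-second clip at 30 fps with a faint
# 1.2 Hz (72 bpm) brightness oscillation added to noise.
rng = np.random.default_rng(0)
t = np.arange(300) / 30.0
frames = rng.normal(128, 2, size=(300, 32, 32, 3))
frames[..., 1] += 0.5 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
print(f"Estimated pulse: {estimate_pulse_bpm(frames, fps=30.0):.0f} bpm")
```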

The authors also created their own dataset of driving videos, whose facial motion was used to create deepfakes of different target identities. During filming, an ECG tracked the heartbeat of the people being filmed, which allowed the researchers to confirm that the rPPG measurements made by their new detector were highly accurate: there was a difference of only two to three beats per minute between the estimates and the true pulse rate. For good measure, the authors also let their detector loose on two older, widely used collections of videos of real people. Here, too, they were able to extract heartbeat signals from all genuine videos.

But what would happen if they used the same detector to analyze known deepfakes?

To test this, Eisert and colleagues used recent deepfake methods to swap faces between genuine videos in their collection. To their surprise, their detector perceived a pulse in the deepfakes as well – even though they hadn’t deliberately added one. This spurious pulse typically appeared highly realistic.

Taking heart

“Our results show that a realistic heartbeat may be added by an attacker on purpose, but can also be ‘inherited’ inadvertently from the driving genuine video. Small variations in skin tone of the real person get transferred to the deepfake together with facial motion, so that the original pulse is replicated in the fake video,” said Eisert.

Fortunately, there is reason for optimism, concluded the authors. Deepfake detectors might catch up with deepfakes again if they were to focus on local blood flow within the face, rather than on the global pulse rate.

“Our experiments have shown that current deepfakes may show a realistic heartbeat, but do not show physiologically realistic variations in blood flow across space and time within the face,” said Eisert.

“We suggest that this weakness of state-of-the-art deepfakes should be exploited by the next generation of deepfake detectors.”