Showing posts sorted by date for query KOBE.

Monday, November 18, 2024

 

Political shadows cast by the Antarctic curtain



Kobe University

To protect the West Antarctic Ice Sheet from melting, researchers have proposed installing a gigantic underwater curtain on the Antarctic seabed. The political ramifications of such a superproject, however, urgently require careful consideration by scholars of international law, who can anticipate potential political fault lines for the Antarctic Treaty System that has preserved the seventh continent as a place for peaceful scientific exploration.


Credit: SHIBATA Akiho





The scientific debate around the installation of a massive underwater curtain to protect Antarctic ice sheets from melting lacks its vital political perspective. A Kobe University research team argues that the serious questions around authority, sovereignty and security should be addressed proactively by the scientific community to avoid the protected seventh continent becoming the scene or object of international discord.

A January 2024 article in Nature put the spotlight on a bold idea, originally proposed by Finnish researchers, to save the West Antarctic Ice Sheet from melting; its collapse is estimated to potentially raise global sea levels by up to 5 meters. The idea of installing an underwater curtain 80 kilometers long and 100 meters high to prevent warm ocean water from reaching the glaciers made an international splash, and “What had been a technical discussion among some scientists quickly became a social debate involving the general public,” says Kobe University international law researcher SHIBATA Akiho. In the scientific debate, however, the political aspect has either been completely ignored or dangerously downplayed, which runs the risk of kindling conflict around a project that is meant to protect humanity, in a setting that has been a model for peaceful international collaboration for over 60 years.

As experts on the international law that governs the Antarctic’s peaceful existence dedicated to scientific investigation, Shibata and a visiting scholar from the Peace Research Institute Frankfurt, Patrick FLAMM, scrambled to put together a careful analysis of the political repercussions of the global superproject. Shibata says, “We believe that it was important to publish a paper within one year of the original proposal, before the social debate takes on a life of its own.”

In a policy paper now published in the journal International Affairs, the Kobe University researcher points out consequences along three main themes: authority, sovereignty and security. Concerns about authority ask who is in a position to decide on the realization of such a project and what this means for the power balance in the body governing access to the Antarctic. Sovereignty concerns are centered around the implications for extant and dormant territorial claims. And questions around security consider how to practically safeguard a structure that would certainly be seen as planetary critical infrastructure. Shibata sums up, saying: “This paper sheds light on the political and legal ‘shadows’ hidden behind the exciting surface of science and technology. However, we believe that it is necessary for the members of society to make decisions on the development of these technologies based on a thorough understanding of such negative aspects.”

While the researchers write that “In the current climate, with growing international rivalry and great power strategic competition, it would be an extremely unlikely diplomatic achievement to secure the level of international cooperation … required for the proposed glacial geoengineering infrastructures,” they also point out a way forward by looking back. In the early 1980s, a smoldering conflict around guidelines for Antarctic mineral extraction got resolved by the 1991 “Protocol on Environmental Protection to the Antarctic Treaty,” which proactively prohibited mining in the Antarctic indefinitely. This solution set a precedent for the treaty parties to seek solutions that avoid international discord over the Antarctic.

The Kobe University law expert is careful to point out that prohibition is not the default solution, however. He explains: “Recently, momentum has gathered among natural scientists to examine such technologies more multilaterally from the viewpoint of whether they are appropriate in the first place. If in such a deeper scientific and technical discussion the argument is that there are social benefits that outweigh the governance risks we have presented, then again, we international political scientists and international legal scholars need to be involved in this discussion. Perhaps then the discussion will no longer be about protecting the key principles of the current Antarctic Treaty System while considering this technology but about modifying those key principles themselves.”

This research was funded by the Japan Society for the Promotion of Science (grant 21K18124) and the Kobe University Strategic International Collaborative Research Grant Type C. It was conducted in collaboration with a researcher from the Peace Research Institute Frankfurt.

Kobe University is a national university with roots dating back to the Kobe Commercial School founded in 1902. It is now one of Japan’s leading comprehensive research universities with nearly 16,000 students and nearly 1,700 faculty in 10 faculties and schools and 15 graduate schools. Combining the social and natural sciences to cultivate leaders with an interdisciplinary perspective, Kobe University creates knowledge and fosters innovation to address society’s challenges.

Friday, November 15, 2024

 SPAGYRIC HERBALISM



Bioengineered yeast mass produces herbal medicine




Kobe University

The yeast Komagataella phaffii is well-suited to produce components for the class of chemicals artepillin C belongs to, can be grown at high cell densities, and does not produce alcohol, which would otherwise limit cell growth.


Credit: BAMBA Takahiro




Herbal medicine is difficult to produce on an industrial scale. A team of Kobe University bioengineers manipulated the cellular machinery in a species of yeast so that one such molecule can now be produced in a fermenter at unprecedented concentrations. The achievement also points the way to the microbial production of other plant-derived compounds.

Herbal medicinal products offer many beneficial health effects, but they are often unsuitable for mass production. One example is artepillin C, which has antimicrobial, anti-inflammatory, antioxidant, and anticancer action, but is only available as a bee culture product. The Kobe University bioengineer HASUNUMA Tomohisa says: “To obtain a high-yield and low-cost supply, it is desirable to produce it in bioengineered microorganisms which can be grown in fermenters.” This, however, comes with its own technical challenges.

To begin with, one needs to identify the enzyme, the molecular machine, that the plant uses to manufacture a specific product. “The plant enzyme that’s key to artepillin C production had only recently been discovered by YAZAKI Kazufumi at Kyoto University. He asked us whether we can use it to produce the compound in microorganisms due to our experience with microbial production,” says Hasunuma. The team then tried to introduce the gene coding for the enzyme into the yeast Komagataella phaffii, which, compared to brewer’s yeast, is better able to produce components for this class of chemicals, can be grown at higher cell densities, and does not produce alcohol, which would otherwise limit cell growth.

In the journal ACS Synthetic Biology, they now report that their bioengineered yeast produced ten times as much artepillin C as could be achieved before. They accomplished this feat by carefully tuning key steps along the molecular production line of artepillin C. Hasunuma adds: “Another interesting aspect is that artepillin C is not excreted into the growth medium readily and tends to accumulate inside the cell. It was therefore necessary to grow the yeast cells in our fermenters to high densities, which we achieved by removing some of the mutations introduced for technical reasons but that stand in the way of the organism’s dense growth.”

The Kobe University bioengineer already has ideas about how to further improve production. One approach will be to further raise the efficiency of the final, critical chemical step by modifying the responsible enzyme or by increasing the pool of precursor chemicals. Another approach may be to find a way of transporting artepillin C out of the cell. “If we can modify a transporter, a molecular structure that transports chemicals in and out of cells, such that it exports the product into the medium while keeping the precursors in the cell, we could achieve even higher yields,” Hasunuma says.

The implications of this study, however, go beyond the production of this particular compound. Hasunuma explains, “Since thousands of compounds with a very similar chemical structure exist naturally, there is the very real possibility that the knowledge gained from the production of artepillin C can be applied to the microbial production of other plant-derived compounds.”

This research was funded by the Japan Society for the Promotion of Science (grant 23H04967), the RIKEN Cluster for Science, Technology and Innovation Hub and the Japan Science and Technology Agency (grant JPMJGX23B4). It was conducted in collaboration with researchers from Kyoto University and the RIKEN Center for Sustainable Resource Science.

Kobe University is a national university with roots dating back to the Kobe Commercial School founded in 1902. It is now one of Japan’s leading comprehensive research universities with nearly 16,000 students and nearly 1,700 faculty in 10 faculties and schools and 15 graduate schools. Combining the social and natural sciences to cultivate leaders with an interdisciplinary perspective, Kobe University creates knowledge and fosters innovation to address society’s challenges.


Through introducing plant enzymes that can catalyze key steps along the molecular production line of artepillin C into yeast cells, and by tuning the balance of precursor molecules, the team around Kobe University bioengineer HASUNUMA Tomohisa produced artepillin C in fermenters at unprecedented concentrations.

Wednesday, November 13, 2024

Machine learning predicts highest-risk groundwater sites to improve water quality monitoring




North Carolina State University




An interdisciplinary team of researchers has developed a machine learning framework that uses limited water quality samples to predict which inorganic pollutants are likely to be present in a groundwater supply. The new tool allows regulators and public health authorities to prioritize specific aquifers for water quality testing.

This proof-of-concept work focused on Arizona and North Carolina but could be applied to fill critical gaps in groundwater quality data in any region.

Groundwater is a source of drinking water for millions and often contains pollutants that pose health risks. However, many regions lack complete groundwater quality datasets.

“Monitoring water quality is time-consuming and expensive, and the more pollutants you test for, the more time-consuming and expensive it is,” says Yaroslava Yingling, co-corresponding author of a paper describing the work and Kobe Steel Distinguished Professor of Materials Science and Engineering at North Carolina State University.

“As a result, there is interest in identifying which groundwater supplies should be prioritized for testing, maximizing limited monitoring resources,” Yingling says. “We know that naturally occurring pollutants, such as arsenic or lead, tend to occur in conjunction with other specific elements due to geological and environmental factors. This posed an important data question: with limited water quality data for a groundwater supply, could we predict the presence and concentrations of other pollutants?”

“Along with identifying elements that pose a risk to human health, we also wanted to see if we could predict the presence of other elements – such as phosphorus – which can be beneficial in agricultural contexts but may pose environmental risks in other settings,” says Alexey Gulyuk, a co-first author of the paper and a teaching professor of materials science and engineering at NC State.

To address this challenge, the researchers drew on a huge data set, encompassing more than 140 years of water quality monitoring data for groundwater in the states of North Carolina and Arizona. Altogether, the data set included more than 20 million data points, covering more than 50 water quality parameters.

“We used this data set to ‘train’ a machine learning model to predict which elements would be present based on the available water quality data,” says Akhlak Ul Mahmood, co-first author of this work and a former Ph.D. student at NC State. “In other words, if we only have data on a handful of parameters, the program could still predict which inorganic pollutants were likely to be in the water, as well as how abundant those pollutants are likely to be.”
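The core idea, predicting an untested pollutant from the parameters that were measured at sites where both are known, can be sketched with a simple linear stand-in for the study's machine learning models (all variables and numbers below are synthetic illustrations, not the actual data set):

```python
import numpy as np

# Synthetic stand-in for the study's data: arsenic co-occurs with iron and pH
# here by construction. None of these numbers come from the actual data set.
rng = np.random.default_rng(0)
n = 200
ph = rng.normal(7.0, 0.5, n)
iron = np.clip(rng.normal(0.3, 0.1, n) + 0.1 * (7.5 - ph), 0.0, None)
arsenic = 0.02 * iron + rng.normal(0.0, 0.001, n)

measured = rng.random(n) < 0.5          # sites where arsenic was actually tested
A = np.column_stack([np.ones(n), ph, iron])

# Fit on tested sites only, then predict arsenic at the untested sites.
coef, *_ = np.linalg.lstsq(A[measured], arsenic[measured], rcond=None)
pred = A[~measured] @ coef

mae = np.abs(pred - arsenic[~measured]).mean()
print(f"mean absolute error at untested sites: {mae:.4f}")
```

The published framework uses multiple imputation over more than 50 water quality parameters rather than a single linear fit, but the flow is the same: learn the co-occurrence structure from sampled sites, then flag unsampled sites whose predicted concentrations approach regulatory limits.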

One key finding of the study is that the model suggests pollutants are exceeding drinking water standards in more groundwater sources than previously documented. While actual data from the field indicated that 75-80% of sampled locations were within safe limits, the machine learning framework predicts that only 15% to 55% of the sites may truly be risk-free.

“As a result, we’ve identified quite a few groundwater sites that should be prioritized for additional testing,” says Minhazul Islam, co-first author of the paper and a Ph.D. student at Arizona State University. “By identifying potential ‘hot spots,’ state agencies and municipalities can strategically allocate resources to high-risk areas, ensuring more targeted sampling and effective water treatment solutions.”

“It’s extremely promising and we think it works well,” Gulyuk says. “However, the real test will be when we begin using the model in the real world and seeing if the prediction accuracy holds up.”

Moving forward, researchers plan to enhance the model by expanding its training data across diverse U.S. regions; integrating new data sources, such as environmental data layers, to address emerging contaminants; and conducting real-world testing to ensure robust, targeted groundwater safety measures worldwide.

“We see tremendous potential in this approach,” says Paul Westerhoff, co-corresponding author and Regents’ Professor in the School of Sustainable Engineering and the Built Environment at ASU. “By continuously improving its accuracy and expanding its reach, we’re laying the groundwork for proactive water safety measures across the globe.”

“This model also offers a promising tool for tracking phosphorus levels in groundwater, helping us identify and address potential contamination risks more efficiently,” says Jacob Jones, director of the National Science Foundation-funded Science and Technologies for Phosphorus Sustainability (STEPS) Center at NC State, which helped fund this work. “Looking ahead, extending this model to support broader phosphorus sustainability could have a significant impact, enabling us to manage this critical nutrient across various ecosystems and agricultural systems, ultimately fostering more sustainable practices.”

The paper, “Multiple Data Imputation Methods Advance Risk Analysis and Treatability of Co-occurring Inorganic Chemicals in Groundwater,” is published open access in the journal Environmental Science & Technology. The paper was co-authored by Emily Briese and Mohit Malu, both Ph.D. students at Arizona State; Carmen Velasco, a former postdoctoral researcher at Arizona State; Naushita Sharma, a postdoctoral researcher at Oak Ridge National Laboratory; and Andreas Spanias, a professor of digital signal processing at Arizona State.

This work was supported by the NSF STEPS Center; and by the Metals and Metal Mixtures: Cognitive Aging, Remediation and Exposure Sources (MEMCARE) Superfund Research Center based at Harvard University, which is supported by the National Institute of Environmental Health Science under grant P42ES030990.

 SPACE/COSMOS

 

Astronomers’ theory of how galaxies formed may be upended


New research from Case Western Reserve University questions standard model



Case Western Reserve University


Protogalaxies as seen by the James Webb Space Telescope


Credit: NASA



CLEVELAND—The standard model for how galaxies formed in the early universe predicted that the James Webb Space Telescope (JWST) would see dim signals from small, primitive galaxies. But data are not confirming the popular hypothesis that invisible dark matter helped the earliest stars and galaxies clump together.

Instead, the oldest galaxies are large and bright, in agreement with an alternate theory of gravity, according to new research from Case Western Reserve University published Tuesday, November 12, in The Astrophysical Journal. The results challenge astronomers’ understanding of the early universe.

“What the theory of dark matter predicted is not what we see,” said Case Western Reserve astrophysicist Stacy McGaugh, whose paper describes structure formation in the early universe.

McGaugh, professor and director of astronomy at Case Western Reserve, said instead of dark matter, modified gravity might have played a role. He says a theory known as MOND, for Modified Newtonian Dynamics, predicted in 1998 that structure formation in the early universe would have happened very quickly—much faster than the theory of Cold Dark Matter, known as lambda-CDM, predicted.

JWST was designed to answer some of the biggest questions in the universe, such as how and when stars and galaxies formed. Until it was launched in 2021, no telescope was able to see that deeply into the universe and that far back in time.

Lambda-CDM predicts that galaxies were formed by gradual accretion of matter from small to larger structures, due to the extra gravity provided by the mass of dark matter.

“Astronomers invented dark matter to explain how you get from a very smooth early universe to big galaxies with lots of empty space between them that we see today,” McGaugh said.

The small pieces assembled into larger and larger structures until galaxies formed. JWST should be able to see these small galaxy precursors as dim light.

“The expectation was that every big galaxy we see in the nearby universe would have started from these itty-bitty pieces,” he said.

But even at higher and higher redshift—looking earlier and earlier into the evolution of the universe—the signals are larger and brighter than expected.

MOND predicted that the mass that becomes a galaxy assembles rapidly and initially expands outward with the rest of the universe. The stronger force of gravity slows, then reverses, the expansion, and the material collapses on itself to form a galaxy. In this theory, there is no dark matter at all.

The large and bright structures seen by JWST very early in the universe were predicted by MOND over a quarter century ago, McGaugh said. He co-authored the paper with former Case Western Reserve postdoctoral researcher Federico Lelli, now at INAF—Arcetri Astrophysical Observatory in Italy, and former graduate student Jay Franck. The fourth coauthor is James Schombert from the University of Oregon.

“The bottom line is, ‘I told you so,’” McGaugh said. “I was raised to think that saying that was rude, but that’s the whole point of the scientific method: Make predictions and then check which come true.” He added that finding a theory compatible with both MOND and General Relativity is still a great challenge.



 

NASA’s Swift studies gas-churning monster black holes



NASA/Goddard Space Flight Center

A pair of monster black holes swirl in a cloud of gas in this artist’s concept of AT 2021hdr, a recurring outburst studied by NASA’s Neil Gehrels Swift Observatory and the Zwicky Transient Facility at Palomar Observatory in California.


Credit: NASA/Aurore Simonnet (Sonoma State University)




Scientists using observations from NASA’s Neil Gehrels Swift Observatory have discovered, for the first time, the signal from a pair of monster black holes disrupting a cloud of gas in the center of a galaxy.

“It’s a very weird event, called AT 2021hdr, that keeps recurring every few months,” said Lorena Hernández-García, an astrophysicist at the Millennium Institute of Astrophysics, the Millennium Nucleus on Transversal Research and Technology to Explore Supermassive Black Holes, and University of Valparaíso in Chile. “We think that a gas cloud engulfed the black holes. As they orbit each other, the black holes interact with the cloud, perturbing and consuming its gas. This produces an oscillating pattern in the light from the system.”  

A paper about AT 2021hdr, led by Hernández-García, was published Nov. 13 in the journal Astronomy & Astrophysics.

The dual black holes are in the center of a galaxy called 2MASX J21240027+3409114, located 1 billion light-years away in the northern constellation Cygnus. The pair are about 16 billion miles (26 billion kilometers) apart, close enough that light only takes a day to travel between them. Together they contain 40 million times the Sun’s mass.

Scientists estimate the black holes complete an orbit every 130 days and will collide and merge in approximately 70,000 years.
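These quoted figures are mutually consistent, which a back-of-envelope check with Kepler's third law confirms (a sketch using the article's numbers as inputs, not the authors' calculation):

```python
import math

G = 6.674e-11                      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30                   # solar mass, kg
C = 299_792_458.0                  # speed of light, m/s

separation = 26e9 * 1e3            # 26 billion km, in meters
total_mass = 40e6 * M_SUN          # 40 million solar masses combined

# Light travel time across the binary separation: about one day, as stated.
light_days = separation / C / 86_400

# Kepler's third law for the relative orbit: P = 2*pi*sqrt(a^3 / (G*M)).
period_days = 2 * math.pi * math.sqrt(separation**3 / (G * total_mass)) / 86_400

print(f"light travel time ~ {light_days:.2f} days")
print(f"orbital period    ~ {period_days:.0f} days")  # close to the quoted 130 days
```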

AT 2021hdr was first spotted in March 2021 by the Caltech-led ZTF (Zwicky Transient Facility) at the Palomar Observatory in California. It was flagged as a potentially interesting source by ALeRCE (Automatic Learning for the Rapid Classification of Events). This multidisciplinary team combines artificial intelligence tools with human expertise to report events in the night sky to the astronomical community using the mountains of data collected by survey programs like ZTF.

“Although this flare was originally thought to be a supernova, outbursts in 2022 made us think of other explanations,” said co-author Alejandra Muñoz-Arancibia, an ALeRCE team member and astrophysicist at the Millennium Institute of Astrophysics and the Center for Mathematical Modeling at the University of Chile. “Each subsequent event has helped us refine our model of what’s going on in the system.”

Since the first flare, ZTF has detected outbursts from AT 2021hdr every 60 to 90 days.    

Hernández-García and her team have been observing the source with Swift since November 2022. Swift helped them determine that the binary produces oscillations in ultraviolet and X-ray light on the same time scales as ZTF sees them in the visible range.

The researchers conducted a Goldilocks-type elimination of different models to explain what they saw in the data.

Initially, they thought the signal could be the byproduct of normal activity in the galactic center. Then they considered whether a tidal disruption event — the destruction of a star that wandered too close to one of the black holes — could be the cause.

Finally, they settled on another possibility, the tidal disruption of a gas cloud, one that was bigger than the binary itself. When the cloud encountered the black holes, gravity ripped it apart, forming filaments around the pair, and friction started to heat it. The gas got particularly dense and hot close to the black holes. As the binary orbits, the complex interplay of forces ejects some of the gas from the system on each rotation. These interactions produce the fluctuating light Swift and ZTF observe.

Hernández-García and her team plan to continue observations of AT 2021hdr to better understand the system and improve their models. They’re also interested in studying its home galaxy, which is currently merging with another one nearby — an event first reported in their paper.

“As Swift approaches its 20th anniversary, it’s incredible to see all the new science it’s still helping the community accomplish,” said S. Bradley Cenko, Swift’s principal investigator at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “There’s still so much it has left to teach us about our ever-changing cosmos.”

NASA’s missions are part of a growing, worldwide network watching for changes in the sky to solve mysteries of how the universe works.

Goddard manages the Swift mission in collaboration with Penn State, the Los Alamos National Laboratory in New Mexico, and Northrop Grumman Space Systems in Dulles, Virginia. Other partners include the University of Leicester and Mullard Space Science Laboratory in the United Kingdom, Brera Observatory in Italy, and the Italian Space Agency.



Astronomers discover mysterious ‘Red Monster’ galaxies in the early Universe



An international team that includes the University of Bath has discovered three ultra-massive galaxies (‘Red Monsters’) in the early Universe forming at unexpected speeds, challenging current models of galaxy formation.



University of Bath


The three Red Monsters represent the core findings of this work – these extremely massive and dusty galaxies in the first billion years after the Big Bang indicate that the early Universe formed stars more efficiently than expected. Image taken by the James Webb Space Telescope.


Credit: NASA/CSA/ESA, M. Xiao & P. A. Oesch (University of Geneva), G. Brammer (Niels Bohr Institute), Dawn JWST Archive




An international team that was led by the University of Geneva (UNIGE) and includes Professor Stijn Wuyts from the University of Bath in the UK has identified three ultra-massive galaxies – each nearly as massive as the Milky Way – that had already assembled within the first billion years after the Big Bang.

The researchers’ results indicate that the formation of stars in the early Universe was far more efficient than previously thought, challenging existing galaxy formation models.

The surprising discovery – described today in the journal Nature – was made by the James Webb Space Telescope (JWST) as part of the JWST FRESCO programme.

The programme set out to systematically analyse a complete sample of emission-line galaxies (ELGs) within the first billion years of cosmic history. ELGs exhibit strong emission lines in their spectra (a spectrum is the range of different wavelengths of light emitted). These emission lines appear as bright lines at specific wavelengths, standing out against the darker background of the spectrum.

The presence of emission lines enabled the team to accurately pin down the distances to the galaxies in the sample. In turn, precise knowledge of the distances and emission line strengths allowed the researchers to reliably measure the amount of stars contained within the galaxies. Three stood out by their large stellar content.
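The distance measurement works through redshift: the expansion of the Universe stretches a galaxy's emission lines to longer wavelengths, and the shift of a known line pins down how far back in time the light was emitted. A minimal sketch, with a hypothetical observed wavelength rather than an actual FRESCO measurement:

```python
H_ALPHA_REST_NM = 656.28  # rest wavelength of the H-alpha emission line, nm

def redshift(observed_nm: float, rest_nm: float = H_ALPHA_REST_NM) -> float:
    """Redshift z from an identified emission line: z = lambda_obs / lambda_rest - 1."""
    return observed_nm / rest_nm - 1.0

# An H-alpha line detected near 4.6 micrometers (hypothetical value) implies
# z of about 6, i.e. light emitted within the first billion years.
z = redshift(4594.0)
print(f"z = {z:.2f}")
```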

“Finding three such massive beasts among the sample poses a tantalising puzzle”, said Professor Wuyts, co-author of the Nature study and Hiroko Sherwin Chair in Extragalactic Astronomy at Bath’s Department of Physics.

“Many processes in galaxy evolution have a tendency to introduce a rate-limiting step in how efficiently gas can convert into stars, yet somehow these Red Monsters appear to have swiftly evaded most of these hurdles.”

Fast growing Red Monsters

Until now, it was believed that all galaxies formed gradually within large halos of dark matter. Dark matter halos capture gas (atoms and molecules) into gravitationally bound structures. Typically, 20% of this gas, at most, is converted into stars in galaxies. However, the new findings challenge this view, revealing that massive galaxies in the early Universe may have grown far more rapidly and efficiently than previously thought.

Detail in the FRESCO study was captured through ‘slitless spectroscopy’ with JWST’s Near Infrared Camera, a surveying method that allows light to be captured and unravelled into its constituent wavelengths for all objects in a field of view. This makes it an excellent method for measuring accurate distances and physical characteristics of galaxies.

JWST's unparalleled capabilities have allowed astronomers to systematically study galaxies in the very distant and early Universe, providing insights into massive and dust-obscured galaxies. By analysing galaxies included in the FRESCO survey, scientists found that most galaxies fit existing models. However, they also found three surprisingly massive galaxies, with stellar masses comparable to today’s Milky Way.

These galaxies are forming stars nearly twice as efficiently as lower-mass galaxies from the same epoch or ordinary galaxies at later times in cosmic history. Due to their high dust content, which gives these three massive galaxies a distinct red appearance in JWST images, they have been named the three Red Monsters.
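The scale of the tension can be illustrated with round, assumed numbers (not the study's measurements): at the conventional ceiling of roughly 20% of a halo's gas turning into stars, even a large early halo yields only about half a Milky Way's worth of stars.

```python
# Back-of-envelope stellar mass from a dark matter halo, in solar masses.
# All inputs are illustrative round numbers, not values from the paper.
F_BARYON = 0.16            # approximate cosmic baryon (gas) fraction of a halo

def stellar_mass(halo_mass: float, efficiency: float) -> float:
    """Stars formed if `efficiency` of the halo's gas is converted."""
    return F_BARYON * halo_mass * efficiency

halo = 1e12                # a massive early halo, in solar masses
print(f"at 20% efficiency: {stellar_mass(halo, 0.2):.1e} solar masses")
print(f"at ~2x that:       {stellar_mass(halo, 0.4):.1e} solar masses")
```

Doubling the efficiency, as the Red Monsters appear to require, is what brings the stellar mass up to roughly the Milky Way's commonly quoted ~6 x 10^10 solar masses within the first billion years.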

Dr Mengyuan Xiao, lead author of the new study and postdoctoral researcher at UNIGE, said: “Our findings are reshaping our understanding of galaxy formation in the early Universe.”

Dr David Elbaz, director of research at CEA Paris-Saclay and collaborator on this project, said: “The massive properties of these Red Monsters were hardly determined before JWST, as they are optically invisible due to dust attenuation.”

A Milestone in Galaxy Observations

Pascal Oesch, associate professor in the Department of Astronomy at the UNIGE, and principal investigator of the observation programme, said: “Our findings highlight the remarkable power of NIRCam/grism spectroscopy. The instrument on board the space telescope allows us to identify and study the growth of galaxies over time, and to obtain a clearer picture of how stellar mass accumulates over the course of cosmic history.”

While these findings do not conflict with the standard cosmological model, they raise questions for galaxy formation theories, specifically the issue of ‘too many, too massive’ galaxies in the early Universe.

Current models may need to consider unique processes that allowed certain early massive galaxies to achieve such efficient star formation and thus form very rapidly, very early in the Universe. Future observations with JWST and the Atacama Large Millimeter Array (ALMA) telescope will provide further insights into these ultra-massive Red Monsters and reveal larger samples of such sources.

Dr Xiao said: “These results indicate that galaxies in the early Universe could form stars with unexpected efficiency. As we study these galaxies in more depth, they will offer new insights into the conditions that shaped the Universe’s earliest epochs. The Red Monsters are just the beginning of a new era in our exploration of the early Universe.”

Professor Wuyts added: "That is what is so great about astronomy: we're constantly being surprised by new discoveries. Already in its first few years of operation, JWST has thrown us a couple of curveballs. In more ways than one, it has shown us that some galaxies mature rapidly during the first chapters of cosmic history."


FRIB research team identifies flaw in physics models of massive stars and supernovae



An international team of researchers led by scientists from the Facility for Rare Isotope Beams uncovered evidence that astrophysics models of massive stars and supernovae are inconsistent with observational gamma-ray astronomy



Michigan State University Facility for Rare Isotope Beams


An international team of researchers, led by scientists at FRIB, uncovered evidence that astrophysics models of massive stars and supernovae are inconsistent with observational gamma-ray astronomy. The team published its findings in Nature Communications.

view more 

Credit: Image courtesy of the Facility for Rare Isotope Beams




Artemis Spyrou, professor of physics at the Facility for Rare Isotope Beams (FRIB) and in the Michigan State University (MSU) Department of Physics and Astronomy, led an international research team that investigated iron-60, an unstable isotope, using a new experimental method. The team, which included Sean Liddick, associate professor of chemistry at FRIB and in MSU's Department of Chemistry and head of FRIB's Experimental Nuclear Science Department, as well as 11 FRIB graduate students and postdoctoral researchers, published its findings in Nature Communications.

Iron-60 interests astrophysicists because it originates inside massive stars and is ejected from supernovae across the galaxy. To investigate the isotope, Spyrou’s team conducted an experiment at the National Superconducting Cyclotron Laboratory (FRIB’s predecessor) using a novel method developed jointly with Ann-Cecilie Larsen, professor of nuclear and energy physics, and Magne Guttormsen, professor emeritus, both at the University of Oslo in Norway.

“The unique thing that we brought into this collaboration was that we combined our expertise in nuclear reactions, isotope beams, and beta decay to learn about a reaction that we can’t measure directly,” Spyrou said. “For this paper, we sought to measure enough of the properties surrounding the reaction we were interested in so that we could constrain it better than before.”

Models are essential for predicting rare astrophysical events

Iron-60 has a long half-life for an unstable isotope—more than 2 million years—so it leaves a lasting signature of the supernova from which it originated. Specifically, iron-60 emits gamma rays as it decays that scientists can measure and analyze for clues about the life cycle of stars and the mechanisms of their explosive deaths. Physicists rely on this data to create and improve astrophysical models.

“One of the overarching goals of nuclear science is to achieve a comprehensive, predictive model of a nucleus that will accurately describe the nuclear properties of any atomic system,” said Liddick, “but we just don’t have that yet. We have to experimentally measure these processes first.” Scientists need to produce these rare isotopes, observe them, and then compare their findings with the model’s prediction to check for accuracy.

“To study these nuclei, we can’t just find them naturally on Earth,” said Spyrou. “We have to make them. And that is the specialty of FRIB—to get stable isotopes that we can find, accelerate them, fragment them, and then produce these exotic isotopes, which might only live for a few milliseconds, so we can study them.” To that end, Spyrou and her team devised an experiment that served two purposes: First, they aimed to constrain the neutron-capture process that transforms the isotope iron-59 into iron-60; second, they wanted to use the resulting data to investigate long-standing discrepancies between supernova model predictions and the observed traces of these isotopes.

New method enables better study of short-lived isotopes

While iron-60 has a relatively long half-life, its neighbor iron-59 is less stable and will decay with a half-life of 44 days. This makes the neutron capture on iron-59 especially challenging to measure in the laboratory since it decays away before reasonable measurements can be performed. To overcome this problem, the scientists developed their own indirect methods of constraining this reaction experimentally.
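The scale of the problem follows directly from the exponential decay law, N(t) = N₀ · 2^(−t/T½). A back-of-envelope sketch (the 44-day and roughly 2-million-year half-lives are from the article; the one-year comparison window is an illustrative assumption, not a timescale from the study):

```python
def fraction_remaining(t_days: float, t_half_days: float) -> float:
    """Fraction of a sample left after t_days, for an isotope with
    half-life t_half_days, via N(t)/N0 = 2**(-t / t_half)."""
    return 0.5 ** (t_days / t_half_days)

IRON59_HALF_LIFE = 44.0            # days, as cited in the article
IRON60_HALF_LIFE = 2.0e6 * 365.25  # days (~2 million years)

year = 365.25  # illustrative observation window, in days
print(f"iron-59 left after 1 year: {fraction_remaining(year, IRON59_HALF_LIFE):.2e}")
print(f"iron-60 left after 1 year: {fraction_remaining(year, IRON60_HALF_LIFE):.6f}")
```

After a single year, well under one percent of an iron-59 sample would survive, while an iron-60 sample is essentially untouched, which is why iron-60 can serve as a long-lived supernova tracer but neutron capture on iron-59 must be constrained indirectly.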

Spyrou and Liddick worked closely with their colleagues at the University of Oslo to develop a new method for studying these highly unstable isotopes. The result, called the beta-Oslo Method, is a variation of the Oslo Method first developed by project co-author Guttormsen at the Oslo Cyclotron Laboratory. Guttormsen’s approach uses a nuclear reaction to populate a nucleus so that researchers can measure its properties. Though the method has proven over several decades to have many applications in astrophysics and nuclear structure, it could previously be applied only to stable or near-stable isotopes. By combining their expertise in detection, beta decay, and reactions, the researchers devised a way to populate a target nucleus using the process of beta decay itself rather than a reaction. This innovative approach produced the isotope they were looking for much more efficiently and provided a path to constraining neutron-capture reactions on short-lived nuclei.

“The beta-Oslo method is still the only technique that can give us some of these constraints on very exotic nuclei that are far from stability,” said Spyrou.

Correcting the models will take time

After constraining these key uncertainties about the nuclear reaction network that produces iron-60, Spyrou’s team concluded that the likelihood of that reaction happening inside a massive star is higher than model predictions by as much as a factor of two. The researchers now believe that theoretical models of supernovae are flawed, and that there are specific stellar properties that are still incorrectly represented. In their paper’s conclusion, the researchers stated, “The solution to the puzzle must come from the stellar modeling by, for example, reducing stellar rotation, assuming smaller explodability mass limits for massive stars, or modifying other stellar parameters.”

This discovery not only has far-reaching implications for the theoretical understanding of massive stars and the conditions inside them, but it also further demonstrates that the beta-Oslo Method will be a valuable tool for scientists moving forward. “This wouldn’t have worked without our project partners at the University of Oslo, who inspired Artemis and me when they presented the Oslo method at a 2014 seminar at MSU,” said Liddick. “We approached them that day with our question about using beta decay, and discussions took off from there. We’ve worked together ever since, and I have no doubt we will continue to collaborate long into the future.”

Sarah Waldrip is a freelance science writer.

Michigan State University (MSU) operates the Facility for Rare Isotope Beams (FRIB) as a user facility for the U.S. Department of Energy Office of Science (DOE-SC), with financial support from and furthering the mission of the DOE-SC Office of Nuclear Physics. Hosting the most powerful heavy-ion accelerator, FRIB enables scientists to make discoveries about the properties of rare isotopes in order to better understand the physics of nuclei, nuclear astrophysics, fundamental interactions, and applications for society, including in medicine, homeland security, and industry.

The U.S. Department of Energy Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of today’s most pressing challenges. For more information, visit energy.gov/science.