Friday, January 17, 2025

 

Fresh, direct evidence for tiny drops of quark-gluon plasma



Particles of light emitted from collisions of deuterons with gold ions provide direct evidence that energetic jets get 'stuck' — a key signature of quark-gluon plasma



DOE/Brookhaven National Laboratory

Gabor David and Axel Drees 


Stony Brook University physicists Gabor David and Axel Drees sketch out how a signal of jet energy loss in deuteron-gold collisions at the Relativistic Heavy Ion Collider (RHIC) supports the case that these collisions create small specks of quark-gluon plasma, a form of matter that permeated the early universe.


Credit: Kevin Coughlin/Brookhaven National Laboratory




UPTON, N.Y. — A new analysis of data from the PHENIX experiment at the Relativistic Heavy Ion Collider (RHIC) reveals fresh evidence that collisions of even very small nuclei with large ones might create tiny specks of a quark-gluon plasma (QGP). Scientists believe such a substance of free quarks and gluons, the building blocks of protons and neutrons, permeated the universe a fraction of a second after the Big Bang. RHIC’s energetic smashups of gold ions — the nuclei of gold atoms that have been stripped of their electrons — routinely create a QGP by “melting” these nuclear building blocks so scientists can study the QGP’s properties.

Physicists originally thought that collisions of smaller ions with large ones wouldn’t create a QGP because the small ion wouldn’t deposit enough energy to melt the large ion’s protons and neutrons. But evidence from PHENIX has long suggested that these small collision systems generate particle flow patterns that are consistent with the existence of tiny specks of the primordial soup, the QGP. The new findings, just published in Physical Review Letters, bolster the case for these tiny droplets of the QGP. The paper provides the first direct evidence that energetic particles generated in RHIC’s small collision systems sometimes lose energy and slow down significantly on the way out.

“We found, for the first time in a small collision system, the suppression of energetic particles, which is one of two main pieces of evidence for the QGP,” said PHENIX Collaboration Spokesperson Yasuyuki Akiba, a physicist at Japan’s RIKEN Nishina Center for Accelerator-Based Science and Experiment Group Leader at the RIKEN-BNL Research Center (RBRC) at Brookhaven Lab.

Jet quenching as a sign of QGP

Looking for the suppression of high-energy jets of particles, or jet “quenching,” has been a key goal from the earliest days at RHIC, a DOE Office of Science user facility for nuclear physics research that began operating at Brookhaven Lab in 2000. Jets are created when a quark or gluon within a proton or neutron in one of RHIC’s ion beams collides intensely with a quark or gluon in the nuclear particles that make up the beam traveling in the opposite direction. These strong interactions can kick single quarks or gluons free from the colliding nuclear building blocks with tremendous amounts of energy, which quickly transforms the energetic particles into cascades, or jets, of other particles.

If the collision doesn’t melt the nuclear matter into a soup of free quarks and gluons — the QGP — then these energetic jets of particles, or their decay products, sail out freely to be counted by RHIC’s detectors. But if the collisions do form a QGP, the kicked-free quark or gluon, despite its energy, gets caught up in interactions with the quarks and gluons that make up the plasma.

“Those interactions lead to energy loss,” explained Gabor David, a PHENIX physicist from Stony Brook University (SBU) who was one of the leaders of the new analysis.

“You can think about it like the difference between running through air and running through water,” he said. The QGP is like the water; it slows the particles down. As a result, jets reach the detector with only a fraction of their original energy.

To look for this suppression, the physicists first must estimate the number of energetic particles expected from the gold-gold smashups by mathematically scaling up from simple proton-proton collisions to the number of protons and neutrons involved in collisions of heavier ions such as gold. The calculated values indirectly indicate whether the collision happens dead-center between the two gold ions or is a glancing collision in which the ions sideswipe one another at the edges. Central collisions are expected to create more jets than peripheral ones. But they're also more likely to generate a larger volume of QGP and therefore stronger jet suppression.

This method has worked beautifully for the gold-gold smashups.

“We expected we should see 1,000 times the number of energetic particles, or jets, in the most central gold-gold collisions compared to proton-proton collisions,” Akiba said. “But we saw only about 200 times the proton-proton level, one-fifth the expected number. That’s a factor of five suppression.”
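The arithmetic behind that statement can be spelled out in a few lines. Physicists usually express it as a "nuclear modification factor," the measured yield divided by the expected scaled-up yield; the values below are the round numbers from the quote, not the published PHENIX measurements:

```python
# Suppression factor from the round numbers quoted above.
# Illustrative figures only, not the published PHENIX data values.

expected_scaling = 1000   # energetic particles expected relative to proton-proton
observed_scaling = 200    # energetic particles actually seen relative to proton-proton

# Nuclear modification factor: observed yield / expected (scaled-up) yield
r_aa = observed_scaling / expected_scaling   # 0.2
suppression = 1 / r_aa                       # factor of 5

print(f"R_AA = {r_aa}, suppression = factor of {suppression:.0f}")
```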

This jet suppression is a clear sign that the gold-gold collisions are generating the QGP. It’s also consistent with another key signature of the QGP formation in these collisions — namely, characteristic patterns of particle flow caused by hydrodynamic properties of the “perfect liquid” plasma.

When PHENIX scientists observed similar hydrodynamic flow patterns in small collision systems, hinting that there might be tiny drops of the QGP, they set out to search for jet suppression in those events as well. The results were a surprise: While the most central collisions of particles such as deuterons — one proton and one neutron — with gold ions exhibited signs of jet suppression, more peripheral collisions seemed to show an increase in energetic jets.

“There was no explanation for why this should happen — absolutely none,” David said.

Turning to direct photons

As it turns out, the surprising increase was an artifact of the indirect way the physicists had determined the centrality of the collisions. They discovered this by trying an alternate and more direct approach, as described in the new paper. Instead of using calculations based on a geometric model to estimate the number of nuclear particles — protons and neutrons — participating in the collisions, they used a direct measurement of those interactions by counting so-called “direct” photons.

This is possible because just as a RHIC collision can kick an energetic quark or gluon free, that interaction can also produce a high-energy photon, or particle of light. These direct photons are produced in the collision right along with, and in amounts proportional to, the kicked-free quarks and gluons.

So, by counting the direct photons that strike their detector, the PHENIX scientists could directly measure the centrality of the collisions and know exactly how many energetic quarks or gluons were kicked free — that is, how many jets to expect.

“The more central the collision is, the more interactions there can be between the quarks and gluons of a small colliding deuteron with the quarks and gluons in the protons and neutrons of a gold ion,” explained Axel Drees of SBU, another leader of the analysis. “So, central smashups produce more direct photons and should produce more energetic jet particles than glancing collisions do.”

But unlike the quarks and gluons, the photons don’t interact with the QGP.

“If photons are created, they escape the QGP completely without any energy loss,” Drees said.

So, if there’s no QGP, the photons and energetic particles should be detected in proportionate amounts. But if in central collisions the number of energetic jet particles detected is significantly lower than the number of direct photons of the same energy, that could be a sign that a QGP is present, quenching the jets.
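The logic of that comparison can be sketched in a few lines of code. The function and the counts below are purely hypothetical illustrations of the idea, not the PHENIX analysis itself:

```python
# Hypothetical illustration of the photon-based comparison described above.
# With no QGP, jets and direct photons of the same energy should be detected
# in a fixed proportion; a ratio well below 1 signals jet quenching.

def quenching_ratio(jets_detected, photons_detected, jets_per_photon=1.0):
    """Observed jets divided by the number expected from the photon count."""
    expected_jets = photons_detected * jets_per_photon
    return jets_detected / expected_jets

# Illustrative counts only (not PHENIX measurements):
central = quenching_ratio(jets_detected=600, photons_detected=1000)     # 0.6
peripheral = quenching_ratio(jets_detected=990, photons_detected=1000)  # 0.99

print(f"central: {central:.2f} (suppressed), peripheral: {peripheral:.2f} (~1)")
```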

Niveditha Ramasubramanian, who was a graduate student advised by David at the time, undertook the challenging task of teasing out the direct photon signals from PHENIX’s deuteron-gold collision data. When her analysis was complete, the earlier, unexplained increase in jets emerging from peripheral collisions completely disappeared. But there was still a strong signal of suppression in the most central collisions.

“The initial motivation to do this complex analysis was only to better understand the strange increase in energetic jets in peripheral collisions, which we did,” said Ramasubramanian, a co-author on the paper who earned her Ph.D. — and a Thesis Award at the 2022 RHIC & AGS Users Meeting — for her contributions to this result. Now a staff scientist at the French National Centre for Scientific Research, she added, “The suppression that we observed in the most central collisions was entirely unexpected.”

“When we use the direct photons as a precise, accurate measure of the collision centrality, we can see the suppression [in central collisions] unambiguously,” Akiba said.

David noted that, “The new method relies solely on observable quantities, avoiding the use of theoretical models.”

The next step will be to apply the same method to other small collision systems.

“Ongoing analyses of PHENIX’s proton-gold and helium-3-gold data with the same technique will help to further clarify the origins of this suppression, either confirming our current understanding or pointing to competing explanations,” Drees said.

This research was funded by the DOE Office of Science (NP), the National Science Foundation, and a range of U.S. and international universities and organizations listed in the scientific paper. The PHENIX experiment collected data at RHIC from 2000 until 2016 and, as this paper indicates, analysis of its data is ongoing.

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.

Follow @BrookhavenLab on social media. Find us on Instagram, LinkedIn, X, and Facebook.


UT secures $20 million DOE grant to develop critical nuclear fusion materials




University of Tennessee at Knoxville

UT secures $20M DOE grant 


Professor Steven Zinkle, UT-ORNL Governor’s Chair for Nuclear Materials, talks with Munireach Nannory, a first-year master's student in civil and structural engineering, while preparing to test materials samples using a 3MV tandem accelerator with multiple beamlines and stations in the Ion Beam Materials Laboratory (IBML) inside Senter Hall.


Credit: Steven Bridges/University of Tennessee




The University of Tennessee, Knoxville’s Tickle College of Engineering has been awarded a $20 million grant from the U.S. Department of Energy for a groundbreaking project led by UT-Oak Ridge National Laboratory Governor’s Chair for Nuclear Materials Steve Zinkle. The project, known as the Integrated Materials Program to Accelerate Chamber Technologies, or IMPACT, aims to revolutionize the design and manufacturing of high-performance materials for fusion energy systems.

“We are excited to embark on this ambitious program,” Zinkle said. “Our assembled multidisciplinary team includes experts with a demonstrated track record of using science-based methods to rapidly design, fabricate and deploy advanced structural materials.”

One of the biggest challenges in making fusion energy commercially viable is the lack of nuclear-code-qualified high-temperature structural materials that can be used in fusion reactors. IMPACT aims to create a process and database for the first-ever American Society of Mechanical Engineers Boiler and Pressure Vessel code qualification for a fusion material and to demonstrate how these new materials can more quickly move from code qualification to engineering application.

UT has one of the best nuclear engineering programs in the country, including a new minor that launched in fall 2024. The IMPACT team led by Zinkle includes three other Tickle College of Engineering faculty members: Eric Lass, an assistant professor in the Department of Materials Science and Engineering; Bradley Jared, an associate professor in the Department of Mechanical, Aerospace, and Biomedical Engineering; and Khalid Hattar, an associate professor in the Department of Nuclear Engineering.

The DOE launched the FIRE Collaboratives initiative last year to establish collaborative networks that bridge the gap between fusion research and industry.

FIRE Collaboratives consist of teams from government facilities, academia and industry that come together to address technical challenges on the road to commercial fusion development. Through the FIRE Collaboratives, the DOE hopes to accelerate the transition of scientific discoveries into commercial fusion applications.

The other institutions involved in the project are Oak Ridge National Laboratory; Stony Brook University; the University of Michigan; Northwestern University; Massachusetts Institute of Technology; the University of California, Santa Barbara; Idaho National Laboratory; the University of Miami; and the University of California, Los Angeles.


US Department of Energy announces selectees for $107 million Fusion Innovation Research Engine (FIRE) Collaboratives, and progress in Milestone Program inspired by NASA



New awards from DOE will support acceleration of commercial fusion energy toward viability



DOE/US Department of Energy



WASHINGTON, D.C. — The U.S. Department of Energy (DOE) today announced $107 million in funding for six projects in the Fusion Innovation Research Engine (FIRE) Collaboratives, and that several privately funded fusion companies have completed early critical-path science and technology (S&T) milestones in the Milestone-Based Fusion Development Program (“the Milestone Program”). Both programs, administered by DOE’s Fusion Energy Sciences (FES) program in the Office of Science, are cornerstones of DOE’s fusion strategy to accelerate the viability of commercial fusion energy.

“The launch of the DOE Milestone Program and FIRE Collaboratives are critical steps in accelerating progress toward the U.S. Bold Decadal Vision for Commercial Fusion Energy,” said Deputy Secretary of Energy David Turk. “As the world races to make fusion a viable source of energy for industry and consumers, these programs signal that the U.S. intends to be the first to commercialize fusion energy through strong partnerships among our National Laboratories, universities, and the private sector to realize industry-led designs for fusion pilot plants.” 

FIRE Collaboratives Project Selections 

The FIRE Collaboratives are aimed at creating a fusion energy S&T innovation ecosystem by forming virtual, centrally managed teams called “Collaboratives” that have a collective goal of bridging FES’s basic science research programs with the needs of the growing fusion industry, including the activities supported under the Milestone Program. 

This initiative represents a significant step forward in FES’s commitment to advancing fusion energy research and development, and aims to create new economic opportunities, maintain US leadership in fusion, bolster US-based manufacturing and supply chains, and enable the development of technologies crucial for national security, energy security, and defense. 

FES is pleased to announce the first awards for the FIRE Collaboratives that support materials and technologies required by a diverse set of fusion concepts. They include developing nuclear blanket testing capabilities at Idaho National Laboratory, materials development at the University of Tennessee – Knoxville, materials testing and advanced simulation capabilities at the Massachusetts Institute of Technology, target injector technology for inertial fusion energy concepts, and fusion fuel-cycle testing capabilities at Savannah River National Laboratory.  

Total anticipated funding for the FIRE Collaboratives is $180 million for projects lasting up to four years. Additional awards drawing from the same pool of proposals may be made in the future, contingent on the availability of funds appropriated by Congress.

The list of projects and more information can be found on the Fusion Energy Sciences program homepage.

Selection for award negotiations is not a commitment by DOE to issue an award or provide funding. Before funding is issued, DOE and the applicants will undergo a negotiation process, and DOE may cancel negotiations and rescind the selection for any reason during that time.   

Progress in the Milestone Program 

The Milestone Program is modeled in part after the NASA Commercial Orbital Transportation Services (COTS) program. With relatively modest federal investments, the COTS program enabled private companies to meet early technical milestones on the way to building today’s commercial space-launch industry. 

Analogous to the earlier NASA COTS program, privately funded fusion companies in the DOE Milestone Program pursue both S&T and business/commercialization milestones (mutually negotiated with DOE). They receive federal payments after DOE verifies completion of each milestone through independent, expert review. The private company provides greater than 50% (in many cases much greater than 50%) of the cost to meet milestones. The company benefits both through the non-dilutive capital it receives from the government as well as through DOE’s validation of milestone completion, which are both helpful for subsequent private fundraising. 

Thus, the Milestone Program acts as a catalyst, with strategic federal investments significantly amplified by follow-on private funding. To date, Milestone awardees have collectively raised over $350 million of new private funding since their selection into the program was announced in May 2023, compared to the $46 million of federal funding initially committed for negotiated milestones. The benefit to the U.S. public is the de-risking of multiple fusion-development paths that have been peer-reviewed as technically credible and are well aligned with commercial factors and needs.

Specifically, the eight awardees of the present DOE Milestone Program are working to resolve critical-path S&T gaps, in partnership with national laboratories and universities, toward realizing preliminary engineering designs for their fusion pilot plants (FPPs). The most aggressive and best-funded of the awardees are aiming for successful FPP preliminary-design reviews by the late 2020s to meet the ambitious and aspirational timeline of an operating FPP by the mid-2030s.

S&T milestones that have been met by companies thus far include the following: 

Quantitative metrics were required to be met for these milestones. The specific metrics are typically protected information of the companies. The other Milestone awardees are Commonwealth Fusion Systems, Tokamak Energy, Type One Energy, Xcimer Energy, and Zap Energy, all of whom are working on their early S&T milestones as well. 

All eight awardees are presently working toward presenting pre-conceptual designs and technology roadmaps of their FPP concepts within the first 18 months of the Milestone Program, roughly late 2025. If they successfully meet these milestones, they will proceed into the next phase of the program, where all the awardees plan to build and operate major next-step integrated experiments and/or demonstrate some of the critical underlying technologies for their FPPs. Continued progress is contingent on Congressional appropriations and successful negotiation of future milestones.

The DOE Milestone-Based Fusion Development Program was first authorized in the Energy Act of 2020 and received its first funding appropriation in fiscal year 2022. The program was announced in September 2022 and, following a rigorous merit-review process, eight selectees were announced in May 2023. An initial $46 million has been obligated for the first 18 months of the program. The program is authorized for a total of $415 million through fiscal year 2027 in the CHIPS and Science Act of 2022.


State-of-the-art fusion simulation leads three scientists to the 2024 Kaul Foundation Prize


Their simulation is one of many critical insights that have come from decades of work on a computer code known as XGC



DOE/Princeton Plasma Physics Laboratory

PPPL’s Seung-Hoe Ku, Choongseok (CS) Chang, and Robert Hager 


The winners of the 2024 Kaul Foundation Prize for Excellence in Plasma Physics Research and Technology Development: PPPL’s Seung-Hoe Ku, Choongseok (CS) Chang, and Robert Hager.


Credit: Michael Livingston / PPPL Communications Department




Three scientists were awarded the 2024 Kaul Foundation Prize for Excellence in Plasma Physics Research and Technology Development based on their decades of groundbreaking research about how plasma behaves in fusion reactors. 

Choongseok (CS) Chang, Seung-Hoe Ku and Robert Hager of the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) were recognized “for experimentally validated simulations of turbulence-broadened divertor heat flux widths using the X-Point Included Gyrokinetic Code (XGC),” following decades of research developing comprehensive simulations to model the fusion plasma edge. 

Recently, the scientists, in collaboration with researchers from the Massachusetts Institute of Technology (MIT) and other collaborators working on the DIII-D fusion device at General Atomics, determined that these simulations closely matched experiments on DIII-D. The research showed that turbulence doubles the exhaust layer width in conditions similar to those expected in a commercial-scale reactor such as ITER. This is an important experimental validation that XGC can describe the relevant underlying physics, supporting XGC predictions that ITER could have a much broader exhaust footprint than present tokamak data would suggest. 

This simulation code has been critical for a great deal of research that has advanced fusion science. The code simulates the whole volume of a tokamak plasma, especially the edge region of the magnetically confined plasma that includes the area where magnetic field lines cross, which is known as the X-point. This area is particularly important to study because of its reduced confining magnetic field strength, which can allow plasma particles to escape confinement. XGC is widely considered one of the best such codes available and is used by researchers worldwide on some of the planet’s most powerful computers.

“This work has brought great honor to the Lab,” said PPPL Director Steve Cowley when he presented the award at PPPL’s annual State of the Lab address. “This high-performance computing exascale project code, developed at our Lab, is also being honored by the U.S. Secretary of Energy with the prestigious Secretary’s Honor Award as part of the broader exascale computing initiative.”

Using very powerful hardware, exascale computers can perform one quintillion (a billion billion) calculations per second, far faster than any previous generation of supercomputers. 
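To put that number in perspective, here is a rough back-of-the-envelope comparison; the 100-gigaflop figure for an ordinary laptop is an assumption for illustration only:

```python
# "Exascale" means on the order of 10**18 calculations per second.
# Rough, illustrative comparison with an ordinary laptop (assumed figure).

exascale_ops = 1e18   # operations per second, exascale machine
laptop_ops = 1e11     # ~100 gigaflops, assumed laptop throughput

seconds = exascale_ops / laptop_ops   # 10 million laptop-seconds of work
laptop_days = seconds / 86400         # roughly 116 laptop-days

print(f"One second of exascale work ~ {laptop_days:.0f} laptop-days")
```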

Each recipient of the annual Kaul Foundation Prize receives $7,500 in recognition of their scientific achievement. The prize was established with funds from the late PPPL Director Ronald C. Davidson’s 1993 Award for Excellence in Science, Education and Physics. It honors outstanding contributions to research in plasma physics. 

Notably, the 2024 winners used XGC to determine critical details about how ions and electrons escape the core plasma during fusion when the plasma is confined by magnetic fields inside a tokamak. Their highly sophisticated simulation is for ITER, the multinational fusion facility under construction in France. The simulation suggests that a key region of the ITER wall should not get as hot as once feared based on the experimental data from present-day tokamaks.

“We would like to thank the national and international XGC team members. For the divertor heat load research, credit goes to the ITER Organization collaborators led by Alberto Loarte and Richard Pitts; PPPL, DIII-D, Alcator C-Mod, National Spherical Torus Experiment and Lawrence Livermore National Laboratory participants in the 2016 DOE Office of Science national theory milestone activities that led to the discovery of the ion leakage and turbulent electron loss physics that are responsible for plasma exhaust. We also thank the joint PPPL/DIII-D research team led by Alessandro Bortolon for the XGC application on DIII-D edge plasma and Darin Ernst of MIT for collaborating with us to simulate his experiments in ITER-like conditions, which turned out to be ideal for validating the XGC simulations,” said Chang. 

“We hope to see more experimental validations on other tokamaks. We also would like to thank the tremendous support from the DOE program managers, DOE computer centers and PPPL management, which made the research possible.”

Choongseok (CS) Chang

After earning a doctoral degree in physics from the University of Texas at Austin in 1979, Chang was a senior scientist at General Atomics in San Diego before holding a tenured position at the Korea Advanced Institute of Science & Technology (KAIST). He later moved back to the U.S. and worked at the Courant Institute of Mathematical Sciences at New York University (NYU) before ultimately landing at PPPL in 2011.

Chang’s long career in plasma physics has focused on solving key theoretical challenges to make fusion a reliable source of electricity. Chang has spent decades leading multidisciplinary teams of physicists, applied mathematicians and computer scientists — including those who use artificial intelligence — to lead the development of XGC and simulate the extremely complex environment inside fusion reactors. 

“Plasma is not a single physics phenomenon. Several physics interact together. But that was a very difficult theory to develop,” said Chang. Additionally, the problem was multiscale, meaning it needed to be studied at multiple levels of detail. 

“Fortunately, I had a hunch in the late 1990s that computers would become more and more powerful so we could solve these problems,” Chang said. He recruited “a few brilliant students” to work on this important task. Among them, Ku carried much of the workload. At the time, creating models that handled multiple physics simultaneously was considered nearly impossible. But Chang and the talented team – especially Ku – persisted. Ultimately, the work developing the necessary computer codes that could realize his multiphysics vision would receive substantial funding from the DOE and recognition from major U.S. computer centers. The success of this work eventually led Chang to resign from his positions in Korea and at NYU to fully dedicate himself to the XGC project and scientific discovery at PPPL. More young and talented physicists joined the development team and raised the code to a higher capability level. Among them, Hager became another distinguished developer and physics researcher.

One of the most rewarding aspects for Chang is seeing his younger group members become successful computational physicists in their own right. His advice to young physicists is to think big. “Don’t be afraid to attack challenging and ambitious scientific problems,” he said.

Seung-Hoe Ku

Ku has been a research physicist at the Lab since 2011, following Chang’s move. He received his doctoral degree in physics from KAIST in 2004. 

Ku has been deeply involved in the research and development of the XGC code for decades. While still a graduate student at KAIST, he wrote, essentially single-handedly, an initial version of what would become the backbone of XGC.

“This has been a lifelong pursuit,” Ku said. He has seen the code through many iterations, moving it from a two-dimensional code into three dimensions and adding code to include turbulence, for example.

“When I extended it to 3D, a few people came on board to help with code performance,” Ku said. Now, many people around the world are working on XGC, with Ku and Hager focusing on managing the core of the code. 

Ku has been interested in physics since middle school. In high school, he also developed an interest in coding. With some friends, Ku wrote what he describes as a precursor to the popular video game Angry Birds. “You throw the ball, and then it calculates the trajectory,” Ku said. “At the time, it was just for fun. But I think that’s my first physics simulation of particles.”

Ku would like to thank his wife, Haehyun Nam, for her patience.

Robert Hager

Hager received a doctoral degree in plasma physics jointly from the Technical University of Munich and the Max Planck Institute for Plasma Physics in 2011. The following year, he came to PPPL as a postdoctoral researcher. Hager has been working with Chang and Ku on XGC ever since. He became a core developer and is now a research physicist at the Lab. 

“Winning the Kaul Prize is confirmation that what we’ve been doing all those years actually makes sense and produces good results,” Hager said. People sometimes question why he would work so hard on a code that is so complex it can only run using the world’s most powerful computers. “Now, finally, I think more people are seeing our results and realizing we can reproduce what people are seeing in experiments and get better insights,” Hager said.

In addition to being one of the main authors and managers of XGC, Hager is also responsible for training and supporting XGC users worldwide.

Hager says the field was definitely the right choice for him. “As a scientist, you sometimes have long stretches where nothing seems to work. But when you find a solution, you understand something new, and that is so rewarding. I also like the technical aspect, tinkering with computer tools.”

Like many in the field, Hager was initially drawn to plasma physics because of the environmental aspect of clean energy from fusion. However, there were also personal factors: His position at Max Planck brought him closer to his girlfriend, who he would later marry. “I would like to thank my wife, Sofia, my Ph.D. supervisor Klaus Hallatschek and everyone who helped make XGC what it is today.”




Research sparks prevention techniques for wildfires and outages

Engineers are studying ways to monitor power lines for faults and failures to correct those conditions before an outage or fire occurs.


Texas A&M University

 Feature Story | 16-Jan-2025


Fires can start in tens of milliseconds, and approximately 10% of wildfires are started by something related to the power system.

For over 30 years, researchers in the Power System Automation Laboratory in the Department of Electrical and Computer Engineering at Texas A&M University have been studying ways to monitor electric power lines for faults and failures to correct those conditions before an outage, fire or catastrophic failure occurs.

In that time, they have conducted several projects to enhance their Distribution Fault Anticipation (DFA) system. The idea is to anticipate a fault before it happens.

The four-man team of Professor B. Don Russell and Principal Research Engineers Carl Benner, Jeffrey Wischkaemper and Karthick Manivannan, who have been conducting research together for decades, is working on a nearly $3.2 million Department of Energy (DOE) project titled “Preventing Wildfire Ignition from Powerline Equipment Failures Using ML-Based Classification of Real-Time Electrical Measurements.”

What started as a focus on diagnosing equipment failures on power lines evolved into preventing the wildfires and power outages those failures can cause.

“We recognized that a lot of what we were detecting — failures and abnormal conditions out on the power system — were also causing wildfires. And that was just serendipitous. That wasn’t because we set out to do something to fix wildfires,” Russell said.

Amid growing drought over the last decade, a changing climate, and shifting rain and humidity levels around the country, wildfires have notably increased.

“About 10 years ago, we recognized there was a very substantial increase in the number of wildfires that had been started by electric power lines in the United States,” Russell said.

Since power lines are pervasive across the United States, a fire could start almost anywhere. Much of the vulnerable territory is in rural areas, where no one is present to observe a fire start and report it.

For one project, the team worked for four years with seven Texas utility companies and received a multi-million-dollar grant from the state of Texas to study the impact of power lines on wildfire ignition and what could be done to prevent fires.

During this project, the team developed techniques that could find and allow companies to fix mechanisms that start wildfires. The main causes of wildfires are weather and human actions, such as dry lightning, an unattended barbecue pit, or burning trash. Each year, up to 10% of fires are started by power lines.

How Do Fires Get Started From Power Lines?

Power line conductors in the air can move around substantially in high wind conditions, and if they hit each other—called a conductor clash—they throw off incandescent metal particles that are ignition mechanisms.

Fires can also start from equipment failure. This can occur if a transformer explodes or a pole falls over and lines end up on the ground, or a connector, the device holding the power lines together, overheats and drops melted metal.

The single largest issue is when power lines break in the air, fall to earth and arc to the ground. Arcing conditions sometimes only last for half a second, but it doesn’t take long to start a fire. To make matters worse, sometimes these ground fault conditions are not easily detectable.

The team is also examining how conditions in the air affect power lines, with the aim of preventing them from falling. The team’s algorithms can detect small arcing conditions and failures from miles away, and the DFA system can give utilities up to weeks of advance notice of a problem, long before catastrophic failure.
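The article does not describe the detection algorithms themselves. Purely as an illustration of the general idea, the sketch below (an assumption for this article, not the actual DFA method) scores each 60 Hz cycle of a sampled current waveform by its high-frequency energy, since arcing superimposes broadband noise on the smooth fundamental, and flags cycles that stand far above the baseline:

```python
import numpy as np

def arcing_score(current, fs=15360, f0=60, hf_cutoff=2000):
    """Score each power cycle by its high-frequency spectral energy.

    Arcing superimposes broadband high-frequency noise on the otherwise
    smooth 60 Hz current waveform, so a cycle whose energy above
    `hf_cutoff` Hz far exceeds the baseline is suspicious.
    """
    samples_per_cycle = fs // f0                 # 256 samples per 60 Hz cycle
    n_cycles = len(current) // samples_per_cycle
    hf_bin = int(hf_cutoff * samples_per_cycle / fs)  # first FFT bin above cutoff
    scores = []
    for c in range(n_cycles):
        cycle = current[c * samples_per_cycle:(c + 1) * samples_per_cycle]
        mags = np.abs(np.fft.rfft(cycle))
        scores.append(np.sum(mags[hf_bin:] ** 2))     # energy above cutoff
    return np.array(scores)

def flag_cycles(scores, k=5.0):
    """Flag cycles whose high-frequency energy exceeds k times the median."""
    baseline = np.median(scores)
    return scores > k * max(baseline, 1e-12)
```

On a synthetic one-second 60 Hz waveform with a noise burst injected into a single cycle, only that cycle is flagged. The sample rate, cutoff, and threshold here are placeholder values, not DFA parameters.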

“If we can find the issue and tell utilities it’s happening today, they can find and fix it by tomorrow,” Russell said. “We’ve been able to predict the location of power line equipment failure, which left alone for two weeks could have etched through the conductor and dropped lines to the ground. But we identified the problem weeks in advance and were able to prevent disaster.”

For example, MidSouth Electric Cooperative, a utility company in Navasota, tested DFA and uses it daily. In one instance, a clamp started arcing in the Sam Houston National Forest, which could easily start a fire. The DFA system identified the issue and notified the utility company. They were able to repair the clamp and prevent an outage or a fire.

Using AI And Machine Learning

Currently, the team is using conventional algorithms and computer science programming to conduct research. However, their DOE project will look at expanding that technology through artificial intelligence and machine learning to improve DFA sensitivity and reliability for early detection of failures.
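The project’s ML approach is not detailed in the article. As a minimal, purely illustrative sketch, the following pure-NumPy nearest-centroid classifier separates “normal” from “arcing” measurement windows using two hypothetical features (mean disturbance level and burstiness); every feature and name here is an assumption, not the DFA system’s actual design:

```python
import numpy as np

def features(cycle_scores):
    """Summarize a window of per-cycle disturbance scores into two
    hypothetical features: mean level and burstiness (max / mean)."""
    m = np.mean(cycle_scores)
    return np.array([m, np.max(cycle_scores) / max(m, 1e-12)])

class NearestCentroid:
    """Tiny stand-in for an ML classifier: label by closest class centroid."""

    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = {
            c: np.mean([x for x, label in zip(X, y) if label == c], axis=0)
            for c in self.labels_
        }
        return self

    def predict(self, X):
        return [min(self.labels_,
                    key=lambda c: np.linalg.norm(x - self.centroids_[c]))
                for x in X]
```

Trained on a handful of synthetic windows, the classifier labels a new bursty window as “arcing.” A production system would use far richer waveform features and a real learned model; this only shows the classification framing.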

“When the Department of Energy asked for projects under the Office of Cybersecurity, Energy Security and Emergency Response program, we proposed to take all the work we had done over the past 30 years as the foundation and build on that to look for substantial improvements using machine learning and artificial intelligence,” Russell said.

“We’ve been able to detect many failure mechanisms for 15 years, and we’re getting better and better at it, but machine learning carries some distinct possibilities for the future,” Russell added. “Hopefully, it will result in even better diagnostic tools to find those things that cause not only wildfires, but also outages for customers.”

The DOE project also has international scope. The team currently has test systems in Australia, New Zealand, Scotland and England, as well as the United States. Australia and New Zealand face a high risk of wildfires, while England and Scotland are primarily concerned with improving reliability.

“This is a tool, not just for wildfire mitigation, but to significantly improve the reliability of service. It would benefit everybody that uses it, even if they’re not in a wildfire-prone area,” Russell said.

By Katie Satterlee, Texas A&M Engineering

###

 

East Asia meets Europe in Lower Austria



Ancient genomes show the integration of genetically different groups into the same early medieval Avar society in the Vienna Basin, Austria



Max Planck Institute for Evolutionary Anthropology

Avar-period cloak clasp 

image: 

An Avar-period cloak clasp from a female grave at Moedling, Austria. Archers were associated with a higher social status.

view more 

Credit: © Benedict Seidl, benedicts.1995@gmail.com




Research combining the expertise of several disciplines has found that genes and culture do not have to match. The latest findings from the European Research Council project HistoGenes come from a genetic study of burial grounds from the Avar period in the 8th century CE. The Avars had arrived in the 6th century from the East Asian steppes and settled in East Central Europe among a mixed population.

Despite their rich archaeological heritage, many questions remained. Were the people buried in these sites descendants of the Avar conquerors or of the previous population that was integrated into the Avar society? Or had these two groups long since mixed, as many expected? The analysis of two large sites south of Vienna, of 500 graves in Mödling and almost 150 in Leobersdorf, brought unexpected results.

When the researchers looked at the ancient DNA extracted from the human remains from these neighbouring sites, they were very surprised. While the population of Leobersdorf was mostly of East Asian origin, those buried in Mödling had ancestry associated with European populations. “The genetic difference between these groups was very clear and consistent for most individuals at the sites,” says Ke Wang, a geneticist and one of the lead authors of the study.

Before genetic analysis, no large difference between the sites had been observed. The archaeological remains of the two communities and their way of life were very similar. "The cultural integration apparently worked despite major genetic differences, and these people were obviously regarded as Avars," says Walter Pohl from the Austrian Academy of Sciences, a historian and one of the senior authors of the study.

Peaceful coexistence

The historical records agree with the evidence from anthropology and archaeology that this was one of the most peaceful times in the history of the Vienna Basin, in spite of the reputation of the Avars as warriors. "We find no battle injuries on the skeletons and there are hardly any signs of deficiencies," explains Doris Pany-Kucera, anthropologist at the Natural History Museum Vienna and one of the lead authors of the study. Also, weapons were only occasionally placed in the graves.

Thanks to the sampling strategy and highly sensitive genetic analysis, it was possible to discover a high number of relatives among the deceased. “The large number of genetic relationships between the individuals allowed us to reconstruct contemporary six-generation-long pedigrees at each site,” says Zuzana Hofmanová from the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany and Masaryk University in Brno, Czechia, a geneticist and one of the senior lead authors of the study.
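The press release does not detail the kinship analysis, but pedigree reconstruction typically begins by binning pairwise kinship coefficients into degrees of relatedness. As a hedged sketch, the cutoffs below are the standard powers-of-two thresholds popularized by the KING software; the study’s actual pipeline for low-coverage ancient DNA is necessarily more involved:

```python
def relatedness_degree(phi):
    """Map a pairwise kinship coefficient phi to a relatedness class.

    Expected phi: ~0.25 for parent-offspring or full siblings,
    ~0.125 for second-degree relatives (grandparent, half-sibling,
    avuncular), ~0.0625 for third-degree relatives (e.g., first cousins).
    Thresholds sit at the geometric midpoints between these values.
    """
    if phi > 0.354:
        return "identical"          # identical twins or duplicate sample
    if phi > 0.177:
        return "first degree"
    if phi > 0.0884:
        return "second degree"
    if phi > 0.0442:
        return "third degree"
    return "unrelated/distant"
```

Chaining many such pairwise calls, constrained by age, sex, and mitochondrial/Y-chromosome lineages, is what lets researchers assemble multi-generation pedigrees like the six-generation ones reported here.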

Only in exceptional cases did an individual have no biological connection to anyone else at their burial ground. Yet the researchers found no consanguineous relationships, even between distant relatives. Interestingly, they were able to determine that almost none of the mothers had local ancestors: they must therefore have come from other regions and other communities. At the same time, there were hardly any genetic connections between Mödling and Leobersdorf.

Both communities followed a similar social practice of choosing partners from certain other communities, through which their different ancestry was preserved: the women who became mothers in Leobersdorf apparently came from communities that also descended from East Asia (possibly from the centre of the Avar realm), while in Mödling they were of European descent. Yet they did not differ in status or wealth. "Status symbols such as belt fittings depicting griffins, and their culture and customs were the same. Most likely both considered themselves Avars," says Bendeguz Tobias, an archaeologist and one of the lead authors of the study.

Such large studies that systematically investigate burial grounds are still rare in the field. “Mödling burial ground is one of the largest ever analyzed genetically, and such results hold a lot of potential for future research in various disciplines,” says Johannes Krause, director at the Max Planck Institute for Evolutionary Anthropology and one of the senior authors of the study.

Coat clasp with glass inlay - an Avar-period artefact that is typically associated with higher social status of women as well as higher biological connectivity as revealed by ancient DNA.

Credit

© Benedict Seidl, benedicts.1995@gmail.com