Saturday, October 12, 2024

 SCI-FI-TEK

New AI models of plasma heating lead to important corrections in computer code used for fusion research



Researchers find an effective alternative to overcome modeling limitations using machine learning



DOE/Princeton Plasma Physics Laboratory

Image: Álvaro Sánchez Villar, an associate research physicist at the U.S. Department of Energy’s Princeton Plasma Physics Laboratory, has developed new AI models for plasma heating that increase prediction speed while preserving accuracy and provide accurate predictions where the original numerical codes failed.

Credit: Michael Livingston / PPPL Communications Department




New artificial intelligence (AI) models for plasma heating can do more than was previously thought possible, not only increasing the prediction speed 10 million times while preserving accuracy, but also correctly predicting plasma heating in cases where the original numerical code failed. The models will be presented on October 11 at the 66th Annual Meeting of the American Physical Society Division of Plasma Physics in Atlanta.

“With our intelligence, we can train the AI to go even beyond the limitations of available numerical models,” said Álvaro Sánchez-Villar, an associate research physicist at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL). Sánchez-Villar is the lead author on a new peer-reviewed journal article in Nuclear Fusion about the work. It was part of a project that spanned five research institutions.

The models use machine learning, a type of AI, to predict how electrons and ions in a plasma behave when ion cyclotron range of frequency (ICRF) heating is applied in fusion experiments. The models are trained on data generated by a computer code. While much of that data agreed with past results, in some extreme scenarios the code produced output the researchers did not expect.

“We observed a parametric regime in which the heating profiles featured erratic spikes in rather arbitrary locations,” said Sánchez-Villar. “There was nothing physical to explain those spikes.” 

Because nothing physical explained the spikes, they pointed to a failure of the numerical code itself. The team therefore excluded the unphysical cases when curating the training data, and the resulting AI models predicted physically sensible heating profiles even in the regime where the code broke down. “This means that, practically, our surrogate implementation was equivalent to fixing the original code, just based on a careful curation of the data,” said Sánchez-Villar. “As with every technology, with an intelligent use, AI can help us solve problems not only faster, but better than before, and overcome our own human constraints.”

As expected, the models also improved the computation times for ICRF heating. Those times fell from roughly 60 seconds to 2 microseconds, enabling faster simulations without notably impacting the accuracy. This improvement will help scientists and engineers explore the best ways to make fusion a practical power source.
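In outline, this kind of surrogate modeling replaces an expensive physics code with a cheap fitted function trained on the code's own output. The sketch below is a deliberately simplified, hypothetical illustration in NumPy: a toy Gaussian "heating profile" stands in for the real ICRF solver, and a least-squares polynomial fit stands in for the paper's neural networks.

```python
import numpy as np

# Toy stand-in for the expensive ICRF heating code: maps one plasma
# parameter to a 1D heating profile (the real solver takes ~60 s per call).
def expensive_heating_code(param, r):
    return np.exp(-((r - param) ** 2) / 0.1)

rng = np.random.default_rng(0)
r = np.linspace(0.0, 1.0, 50)           # normalized radial grid
params = rng.uniform(0.2, 0.8, 200)     # sampled input parameters

# Offline step: generate (and curate) training data with the slow code.
X = np.vander(params, 6)                # degree-5 polynomial features
Y = np.stack([expensive_heating_code(p, r) for p in params])

# Fit a linear surrogate by least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def surrogate(param):
    """Cheap replacement: one small matrix product per prediction."""
    return np.vander(np.atleast_1d(param), 6) @ W

pred = surrogate(0.5)[0]
truth = expensive_heating_code(0.5, r)
```

Once the fit is done offline, each surrogate evaluation is a single small matrix product, which is where the orders-of-magnitude speedup comes from.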

Other researchers on the project include Zhe Bai, Nicola Bertelli, E. Wes Bethel, Julien Hillairet, Talita Perciano, Syun’ichi Shiraiwa, Gregory M. Wallace and John C. Wright. The work was supported by the U.S. Department of Energy under Contract Number DE-AC02-09CH11466. This research used resources of the National Energy Research Scientific Computing Center (NERSC) operated under Contract No. DE-AC02-05CH11231 using NERSC Award FES m3716 for 2023. 


Department of Energy announces $49 million for research on foundational laboratory fusion



Projects address scientific gaps foundational to enabling fusion energy



DOE/US Department of Energy




WASHINGTON, D.C. - As the Department of Energy (DOE) continues to accelerate a clean-energy future that includes fusion technology, a total of $49 million in funding for 19 projects was announced today in the Foundational Fusion Materials, Nuclear Science, and Technology programs.

The purpose of the funding is to reorient the laboratory-based foundational and basic science research programs to better align with and support the new Fusion Energy Sciences (FES) program vision.

“The Fusion Nuclear Science Foundational research program, in enabling research and development and furthering research in fusion nuclear science and fusion materials, is vital to addressing critical scientific gaps foundational to enabling fusion energy,” said Jean Paul Allain, DOE Associate Director of Science for Fusion Energy Sciences.

Fusion energy holds the potential to revolutionize the world’s energy supply by providing a virtually limitless, clean, and sustainable power source. Unlike current nuclear power, which relies on splitting atoms (fission), fusion mimics the process that powers the sun by combining atomic nuclei to release massive amounts of energy.  

Fusion produces no long-lived radioactive waste, emits no carbon dioxide, and uses abundant fuels like hydrogen. If harnessed successfully, fusion energy could provide a safe, reliable solution to meet global energy demands while significantly reducing the environmental impact of power generation.

The projects funded under this initiative cover a wide range of cutting-edge research areas, each crucial to the development of fusion energy technology. For instance, scientists are testing new magnet designs that will help control the extremely hot plasma needed for fusion.

Other teams are working on materials that can withstand the damage caused by plasma, ensuring that systems used for maintaining the plasma remain functional and efficient. Some researchers are investigating blanket materials, which are designed to absorb heat from the plasma and turn it into usable energy, while also studying how these materials can be made durable enough to function in such an extreme environment.

Additionally, efforts are being made to improve fuel cycle systems, which help maintain the continuous flow of the fuel necessary for fusion reactions. Finally, advanced structural materials are being developed to construct stronger, more durable components that can endure the harsh conditions inside the fusion systems. Together, these projects aim to advance our understanding and capability in the pursuit of clean, sustainable fusion energy.

The projects were selected by competitive peer review under the DOE Lab Call: Opportunities in Foundational Fusion Materials, Nuclear Science, and Technology.  

Total funding is $49 million for projects lasting up to three years in duration, with $7 million in Fiscal Year 2024 dollars and outyear funding contingent on congressional appropriations. The list of projects and more information can be found on the Fusion Energy Sciences program homepage.

Selection for award negotiations is not a commitment by DOE to issue an award or provide funding. Before funding is issued, DOE and the applicants will undergo a negotiation process, and DOE may cancel negotiations and rescind the selection for any reason during that time.   



 

People with dyslexia and dyscalculia show less bias, study shows



Study is the first to explore how people with the conditions acquire biases developed automatically in the general population



University of Plymouth





Dyslexia and dyscalculia are best known for the challenges they pose to people’s literacy and numeracy.

However, a new study has shown those with the conditions may have previously unheralded strengths, namely reduced bias against others based on characteristics such as their disability, race or gender.

The research set out to explore whether, and to what extent, those with the two learning difficulties acquire certain traits that most people with no disabilities develop automatically.

To do this, it examined data amassed through Project Implicit’s Implicit Association Test (IAT) database on attitudes to disability.

It analysed both the responses to a number of questions – and the time taken to respond – from more than 460,000 people, including 22,747 with physical disabilities, 14,700 with dyslexia, and 1,721 with dyscalculia.

The results showed those with dyslexia exhibited significantly less implicit bias than those with no disability, something the researchers had predicted based on existing knowledge of the condition and its effects.

However, those with dyscalculia exhibited even less bias than both those with dyslexia and the wider population.

The researchers say the results provide new insights into the development of people with conditions such as dyslexia and dyscalculia.

However, they say they are particularly striking in that they support previous suggestions that not all the impacts of such conditions are as negative as many people might perceive them to be.

Published in the journal Neurodiversity, the study was led by Dr Gray Atherton and Dr Liam Cross from the University of Plymouth’s School of Psychology.

It is part of their wider programme of research looking at the experiences of people with a range of neurodiverse conditions, which has recently involved studies looking at whether board and role-playing games have benefits for those with autism.

Dr Cross said: “For most people, categorising social groups is one of the first things we automatically learn how to do. It means we inherently know things about people’s race and gender and, as a result, begin to exhibit certain implicit and psychological biases. Our study shows there are groups within society who, in line with difficulties developing literacy and numeracy skills, may not develop these biases. Our contention would be that not having these biases is a good thing, for the individuals concerned and for modern society as a whole.”

Dr Atherton added: “Our research always tries to explore if there are ways to consider a learning disability as far more than just an individual or a negative issue. Based on this new study, we can see how people with dyslexia and dyscalculia are potentially not being influenced by many of society’s rules in the way that those without such a condition are. It reinforces the message that not everyone needs to be at the same level of competence, as they can bring other strengths to society’s table.”

 

Mcity opens for remote testing of autonomous vehicle technologies, calls for federal standards



Remote testing could accelerate the adoption of autonomous vehicles, but innovators need to know the safety regulations they must meet




University of Michigan






As Mcity begins welcoming researchers in autonomous and connected vehicle technologies from around the U.S. to be remote users of its physical and virtual testing environment, its leadership is calling for federal standards for safety testing, arguing that the lack of clear goalposts is hampering development. 

 

The opening also coincides with a new industry partnership project announced at the NVIDIA AI Summit.

 

Following two years of upgrades, the University of Michigan's hub for autonomous testing has officially kicked off its remote program with a series of demonstrations for industry and state and federal government officials. By allowing outside researchers to tap into Mcity research vehicles and testing technologies from afar, U-M seeks to accelerate the development of autonomous vehicle technologies to enable safer and more efficient transportation. Funded by the National Science Foundation, the capabilities are initially available to academic researchers. 

 

"This is particularly valuable for academic institutions, including many that serve minority students, that do not have the resources of large, public institutions like U-M," said Henry Liu, U-M professor of civil and environmental engineering and director of both Mcity and the Center for Connected and Automated Transportation. "We call it Mcity 2.0—the next-generation automated vehicle test facility.

 

"We believe Mcity 2.0's capabilities will expand the volume and increase the speed of AV research that will be conducted and published. This could help put more refined, safer AVs on our roads sooner."

 

At the same time, a new Mcity partnership announced at the NVIDIA AI Summit in Washington, D.C., aims to advance its remote testing capabilities for industry as well as academic users. Mcity simulation tools are being tapped for a new project in partnership with NVIDIA and nonprofit research managing firm MITRE. The goal is a full simulation version of Mcity that industry and academic researchers can use to rigorously test and validate AVs, accelerating their development and adoption. 

 

An invitation to researchers across the U.S. 

 

In the coming year, Mcity plans to assist 10 research teams from academic institutions in the U.S. with their testing needs. Researchers can submit proposals to Mcity.

 

By connecting to Mcity's cloud-based digital infrastructure, researchers operating from their home bases can test their autonomous algorithms in virtual and mixed reality environments. Using 5G wireless communications, they can control physical vehicles and traffic signals on Mcity's test streets, and receive real-time data in return—all while protecting proprietary information.

 

"We invite researchers from across the country to connect with us and join the effort to accelerate developing and deploying connected and autonomous vehicles that will make transportation safer, cleaner and more accessible," said Karen Thole, the Robert J. Vlasic Dean of Engineering. "Please add your brain power to advancing tomorrow's technologies."

 

The critical role of federal regulation

 

While Mcity looks to accelerate that research, U-M researchers see federal leadership—in the form of testing standards and funding—as crucial to keeping it on track. Several unanswered questions lie at the heart of the matter, including:

 

  • How should AVs be tested and evaluated? How safe is safe enough?
  • How do we properly prepare our infrastructure to support AV technologies?
  • How should AVs be licensed?
  • How should AVs be insured?

 

"Typically, regulation follows innovation," Liu said. "With AVs, however, the lack of a federal safety testing framework has created uncertainty that is thwarting innovation and delaying the adoption of technologies that we know can save lives and improve the quality of life for millions of people who do not have reliable access to transportation."

 

Mcity has developed a two-part methodology for evaluating AV safety, called the Mcity Safety Assessment Program. The first component tests driving algorithms' basic competency in ordinary traffic scenarios. The second component challenges the software with a diverse set of dangerous driving scenarios—representing those that most often result in crashes, injuries and fatalities.
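The two-tier structure described above can be sketched as a simple gating harness. Everything here is hypothetical: the scenario names, the 0-to-1 scoring, and the pass threshold are illustrative choices, not Mcity's actual test suite.

```python
# Minimal sketch of a two-tier scenario-based AV evaluation, loosely
# modeled on the description above; names and criteria are hypothetical.

def run_scenario(policy, scenario):
    """Score a driving policy on one scenario (0.0-1.0); stubbed here."""
    return policy(scenario)

def assess(policy, competency, challenge, pass_threshold=0.9):
    # Tier 1: basic competency in ordinary traffic must all pass first.
    tier1 = {s: run_scenario(policy, s) for s in competency}
    if any(score < pass_threshold for score in tier1.values()):
        return {"passed": False, "tier": 1, "scores": tier1}
    # Tier 2: crash-prone edge cases representing the scenarios that
    # most often result in crashes, injuries and fatalities.
    tier2 = {s: run_scenario(policy, s) for s in challenge}
    passed = all(score >= pass_threshold for score in tier2.values())
    return {"passed": passed, "tier": 2, "scores": tier2}

competency = ["car_following", "lane_keeping", "signalized_intersection"]
challenge = ["unprotected_left_turn", "pedestrian_dart_out", "cut_in"]

report = assess(lambda s: 0.95, competency, challenge)  # dummy policy
```

The gating design mirrors the program's logic: a vehicle that cannot handle ordinary traffic never reaches the dangerous-scenario tier.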


Opened in 2015, Mcity was the world's first purpose-built proving ground for connected and automated vehicles. In 2022, a $5.1 million grant from the National Science Foundation helped U-M develop Mcity's digital infrastructure—leading to Mcity 2.0.

 

Plant stem cells: Better understanding the biological mechanism of growth control




University of Freiburg






Plants form new leaves, flowers and roots at the tips of shoots and roots, in specific growth regions known as meristems. These meristems contain stem cells that divide as needed and form new cells that develop into specialised tissue. Using the example of plant roots, researchers from Freiburg have now been able to decipher which regulatory mechanisms ensure that growth in the meristem occurs in a controlled manner. The results have been published in the journal Nature Plants.

Stem cells are dependent on signals from other cells

The fact that stem cells can continuously divide and form progenitor cells for specialised tissues is not a matter of course: signals from other cells are necessary to control the properties of stem cells. This dependence on signalling processes is also a protective mechanism. If stem cells were able to multiply uncontrollably, this would lead to uncontrolled growth, as in the case of cancer.

WOX5 is an important signalling molecule that regulates stem cells in the meristem. However, the mechanism through which it does this was previously unknown. A research team led by Prof. Dr. Thomas Laux, a member of the CIBSS – Centre for Integrative Biological Signalling Studies Cluster of Excellence at the University of Freiburg, has now succeeded in decoding this mechanism. The team identified HAN as an essential factor that transmits the function of WOX5.

The gene-regulating molecule HAN is an important regulator for plant growth.

“We were able to show that HAN transmits the WOX5 signal and ensures that the CDF4 gene remains inactive in stem cells,” explains Laux. “CDF4 would otherwise cause stem cell properties to be inhibited. By suppressing CDF4, HAN allows the stem cells in the root meristem to remain undifferentiated and continue dividing.”

The team used molecular biology methods as well as mathematical modelling. The models offer a possible explanation as to why this seemingly complicated mechanism could be an advantage for the plant: the involvement of HAN as a link between WOX5 and CDF4 appears to make the regulation of stem cells less sensitive to environmental influences. “In further investigations, we now want to find out whether the multi-level nature of the process actually has the effect that we see in the modelling,” says Laux.

Process also important for plant breeding

A precise understanding of the processes by which plants grow is an important basis for breeding more resilient or higher-yielding crops. This is because it allows the targeted identification and selection of plants that can grow and produce yields even under less-than-ideal conditions, such as extreme weather.

 

 

Using AI and iNaturalist, scientists build one of the highest resolution maps yet of California plants



Citizen scientists can help monitor the state's plants in the face of global change



University of California - Berkeley

Image: Using more than 652,000 observations uploaded to iNaturalist (left), UC Berkeley scientists created an AI model to predict the distribution of 2,221 species of plants around the state. To train the deep learning network, the species observations were linked to 256 x 256 meter remote sensing images taken from the 2012 National Agricultural Imagery Program (right) and with climate variables.

Credit: Moi Exposito-Alonso and Lauren Gillespie, UC Berkeley




With the help of deep learning, University of California, Berkeley, scientists have leveraged citizen science data from the app iNaturalist to create the highest resolution maps yet of plant distributions throughout California.

iNaturalist is a widely-used cellphone app, originally developed by UC Berkeley students, that allows people to upload photos and the location data of plants, animals or any other life they encounter and then crowdsource their identity. The app currently has more than 8 million users worldwide who collectively have uploaded more than 200 million observations.

The researchers used a type of artificial intelligence called a convolutional neural network, which is a deep learning model, to correlate the citizen science data for plants in California with high-resolution remote-sensing satellite or airplane images of the state. The network discovered correlations that were then used to predict the current range of 2,221 plant species throughout California, down to scales of a few square meters.
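Conceptually, such a model maps a multi-band image patch to per-species presence probabilities. The NumPy sketch below shows only the shape of that pipeline, with random, untrained 3x3 filters and made-up data; the actual Deepbiosphere model is a trained deep network.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 4-band (e.g. RGB + near-infrared) aerial patches,
# labeled with which of n_species plants were observed there.
n_patches, bands, size, n_species = 64, 4, 16, 5
patches = rng.random((n_patches, bands, size, size))
labels = rng.integers(0, 2, (n_patches, n_species))  # multi-label presence

def conv_features(x, kernels):
    """Valid 2D cross-correlation of each kernel over all bands, then
    ReLU and global average pooling: the core ops of a conv layer."""
    n, b, h, w = x.shape
    k = kernels.shape[-1]
    feats = []
    for kern in kernels:                       # one feature map per kernel
        acc = np.zeros((n, h - k + 1, w - k + 1))
        for i in range(k):
            for j in range(k):
                acc += np.einsum("nbhw,b->nhw",
                                 x[:, :, i:i + h - k + 1, j:j + w - k + 1],
                                 kern[:, i, j])
        feats.append(np.maximum(acc, 0).mean(axis=(1, 2)))  # ReLU + pool
    return np.stack(feats, axis=1)

kernels = rng.standard_normal((8, bands, 3, 3))  # 8 random 3x3 filters
X = conv_features(patches, kernels)              # (n_patches, 8)

# Logistic "head": per-species presence probability from pooled features.
W = rng.standard_normal((8, n_species)) * 0.1
probs = 1.0 / (1.0 + np.exp(-(X @ W)))
```

In a real model the filters and head weights would be learned from the iNaturalist labels rather than drawn at random.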

Botanists usually build high-quality maps of species by painstakingly listing all plant species in an area, but this is not feasible outside of a few small natural areas or national parks. Instead, the AI model, called Deepbiosphere, leverages free data from iNaturalist and remote sensing airplanes or satellites that now cover the entire globe. Given enough observations by citizen scientists, the model could be deployed in countries lacking detailed scientific data on plant distributions and habitats to monitor vegetation change, such as deforestation or regrowth after wildfires.

The findings were published Sept. 5 in the journal Proceedings of the National Academy of Sciences by Moisés "Moi" Expósito-Alonso, a UC Berkeley assistant professor of integrative biology, first author Lauren Gillespie, a doctoral student in computer science at Stanford University, and their colleagues. Gillespie currently has a Fulbright U.S. Student Program grant to use similar techniques to detect patterns of plant biodiversity in Brazil.

“During my year here in Brazil, we’ve seen the worst drought on record and one of the worst fire seasons on record,” Gillespie said. “Remote sensing data so far has been able to tell us where these fires have happened or where the drought is worst, and with the help of deep learning approaches like Deepbiosphere, soon it will tell us what’s happening to individual species on the ground.”

"That is a goal — to expand it to many places," Expósito-Alonso said. "Almost everybody in the world has smartphones now, so maybe people will start taking pictures of natural habitats and this will be able to be done globally. At some point, this is going to allow us to have layers in Google Maps showing where all the species are, so we can protect them. That's our dream."

Apart from being free and covering most of Earth, remote sensing data are also more fine-grained and more frequently updated than other information sources, such as regional climate maps, which often have a resolution of a few kilometers. Using citizen science data with remote sensing images — just the basic infrared maps that provide only a picture and the temperature — could allow daily monitoring of landscape changes that are hard to track.

Such monitoring can help conservationists discover hotspots of change or pinpoint species-rich areas in need of protection.

"With remote sensing, almost every few days there are new pictures of Earth with 1 meter resolution," Expósito-Alonso said. "These now allow us to potentially track in real time shifts in distributions of plants, shifts in distributions of ecosystems. If people are deforesting remote places in the Amazon, now they cannot get away with it — it gets flagged through this prediction network."

Expósito-Alonso, who moved from Stanford to UC Berkeley earlier this year, is an evolutionary biologist interested in how plants evolve genetically to adapt to climate change.

"I felt an urge to have a scalable method to know where plants are and how they're shifting," he said. "We already know that they're trying to migrate to cooler areas, that they're trying to adapt to the environment that they're facing now. The core part of our lab is understanding those shifts and those impacts and whether plants will evolve to adapt."

In the study, the researchers tested Deepbiosphere by excluding some iNaturalist data from the AI training set and then later asking the AI model to predict the plants in the excluded area. The AI model had an accuracy of 89% in identifying the presence of species, compared to 27% for previous methods. They also pitted it against other models developed to predict where plants are growing around California and how they will migrate with rising temperatures and changing rainfall. One of these models is Maxent, developed at the American Museum of Natural History, which uses climate grids and georeferenced plant data. Deepbiosphere performed significantly better than Maxent.
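The evaluation described above follows a standard spatial-holdout pattern: withhold all observations from one region, train without them, then score predictions there. A schematic with synthetic data and a deliberately trivial baseline "model", purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic observation records, loosely shaped like iNaturalist data
# for California: (longitude, latitude, species_id).
lon = rng.uniform(-124.0, -114.0, 1000)
lat = rng.uniform(32.0, 42.0, 1000)
species = rng.integers(0, 50, 1000)

# Spatial holdout: everything in one geographic block is excluded from
# training and used only for testing, so the model is scored on an
# area it has never seen.
held_out = (lon > -119.0) & (lat > 39.0)
train_species = species[~held_out]
test_species = species[held_out]

# Trivial baseline "model": predict a species present in the held-out
# block only if it was observed anywhere in the training region.
known = np.isin(test_species, train_species)
recall = known.mean()   # fraction of held-out observations recovered
```

A real evaluation would compare per-species presence/absence predictions against the held-out records, as the 89%-vs-27% comparison in the paper does.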

They also tested Deepbiosphere against detailed plant maps created for some of the state's parks. It predicted with 81.4% accuracy the location of redwoods in Redwood National Park in Northern California and accurately captured (R² = 0.53) the burn severity caused by the 2013 Rim Fire in Yosemite National Park.

"What was incredible about this model that Lauren came up with is that you are just training it with publicly available data that people keep uploading with their phones, but you can extract enough information to be able to create nicely defined maps at high resolution," Expósito-Alonso said. "The next question, once we understand the geographic impacts, is, ‘Are plants going to adapt?’"

Megan Ruffley, also of the Carnegie Institution for Science at Stanford, is a co-author of the paper. The research, which began when Expósito-Alonso was a staff associate at the Carnegie Institution for Science and an assistant professor at Stanford, was funded by the Carnegie Institution for Science.


Reprogramming wood-degrading mushroom enzymes for the biorecycling of plastic



INRAE - National Research Institute for Agriculture, Food and Environment





Plastic pollution is ubiquitous in the environment and managing plastic waste is a global problem. In addition to developing more sustainable ways to produce and use plastic, one solution to the problem is to develop biorecycling technology. The very nature of plastic, made with highly resistant polymers so that it does not break down, makes this a huge scientific challenge. Yet plastic shares analogous properties with other – natural – recalcitrant polymers, like wood cellulose, which can be broken down by filamentous fungi. The fungi achieve this by secreting an arsenal of enzymes. Notably, they secrete very special enzymes known as “lytic polysaccharide monooxygenases”, or LPMOs, capable of breaking down the surface of cellulose to weaken it and make complete degradation easier. These properties make LPMOs perfect candidates for engineering to create new functions, like breaking down plastics.

Chimera enzymes that recognise plastics

LPMO enzymes are usually composed of two modules: a binding module that enables them to recognise and bind to a specific polymer – cellulose, in a natural setting – and a catalytic module that breaks down the cellulose surface. Using industrially scalable protein engineering processes, scientists replaced the binding module with modules that target different plastics, creating chimera LPMOs that can recognise and bind to several types of plastics. Some were also able to make holes in the surface of polyhydroxyalkanoate, a biosourced plastic known as PHA.

The researchers will now evaluate how well these chimera enzymes break down different types of plastics in order to select the most effective ones, to further engineer them and combine them into enzyme ‘cocktails’ with the objective of creating an enzymatic tool kit for the biorecycling of plastics.

Do fungi recognize shapes?



Tohoku University

Image: Fungal mycelial networks connecting wood blocks arranged in circle (left) and cross (right) shapes.

Credit: ©Yu Fukasawa et al.




Can organisms without a brain still show signs of intelligence? Researchers at Tohoku University and Nagaoka College had this question in mind when conducting a study to measure the decision-making processes in fungi. While it may sound like science fiction, this level of basal cognition is possible even in fungi.

"You'd be surprised at just how much fungi are capable of," remarks Yu Fukasawa of Tohoku University. "They have memories, they learn, and they can make decisions. Quite frankly, the differences in how they solve problems compared to humans are mind-blowing."

Fungi grow by releasing spores, which can germinate and form long, spidery threads underground (a mycelium). We typically only see the tiny mushrooms on the surface without realizing that there's a vast network of interconnected mycelium beneath our feet. It is through this network that information can be shared, somewhat like neural connections in the brain.

The present study examined how a wood-decaying mycelial network responded to two different situations: wood blocks placed in a circle versus a cross arrangement. If the fungi didn't display decision-making skills, they would simply spread out from a central point without consideration for the position of the blocks. Remarkably, this is not what the researchers witnessed.

For the cross arrangement, the degree of connection was greater in the outermost four blocks. It was hypothesized that this was because the outermost blocks can serve as "outposts" for the mycelial network to embark on foraging expeditions, so denser connections were required. In the circle arrangement, the degree of connection was the same at any given block. However, the dead centre of the circle remained clear. It was proposed that the mycelial network did not see a benefit in overextending itself in an already well-populated area.

These findings suggest that the mycelial network was able to communicate information about its surroundings throughout the entire network, and change its direction of growth accordingly based on the shape.

Our comprehension of the mysterious world of fungi is limited, especially when compared to our knowledge of plants and animals. This research will help us better understand how biotic ecosystems function and how different types of cognition evolved in organisms.

These results were published in Fungal Ecology on September 12, 2024.

 

Plant Science Research collaboration will explore key mosses critical to storing carbon


Preserving peat bogs is essential for combating global climate change




Donald Danforth Plant Science Center

Image: Sona Pandey, PhD, in the lab.

Credit: Donald Danforth Plant Science Center




ST. LOUIS, MO, October 9, 2024 - Plant scientists at the Donald Danforth Plant Science Center and the HudsonAlpha Institute for Biotechnology have been awarded a four-year National Science Foundation (NSF) Enabling Discovery through GEnomics (EDGE) grant to advance their understanding of sphagnum moss, a crucial component of peatlands and a vital player in global ecosystems. The collaborative research team will develop genetic and genomic resources to study sphagnum's life cycle, growth, and adaptation to various environmental conditions.

Sphagnum mosses are the main constituents and engineers of peatland ecosystems. Peatland, or bog, ecosystems are home to various flora adapted to waterlogged, nutrient-poor, acidic conditions. The soft, spongy ground of peat bogs comprises living and decaying sphagnum moss that creates an acidic environment to prevent the decomposition of organic materials. This means that all of the carbon stored within the plant material is locked inside the bogs instead of being released into the atmosphere.

While peat bogs constitute only about 3% of Earth's land area, they store a staggering 30% of the planet's terrestrial carbon — twice as much as all the world's forest biomass combined. This carbon, locked away for millennia, is at risk of being released into the atmosphere due to drainage, other human activities, and climate warming. Preserving peat bogs is essential for combating global climate change and safeguarding our planet's future.

“Regardless of their fascinating features, impacts on global carbon cycles, and unique position in land plant evolution, studies on Sphagnum spp. are limited mostly to eco-physiological analysis, and functional genomic resources remain scarce,” said lead principal investigator Sona Pandey, PhD, a member of the Danforth Center.

How Sphagnum spp. respond and adapt to climate change is a predictor of the stability and distribution of peatlands and of the future trajectory of global carbon fluxes. Establishing sphagnum as a functional genomics model will therefore help translate knowledge of its morphology, physiology, and genes into ecosystem-level effects; conversely, ecosystem-level traits can be correlated with specific gene functions and mechanisms.

“Despite its clear importance to our environment, relatively little is known about the genetics and biology of sphagnum mosses, and particularly how their sex chromosomes function, as well as how the presence of different sexes of these species can alter ecosystem dynamics,” says HudsonAlpha Faculty Investigator Alex Harkess, PhD, a Co-PI on the grant. 

The project will kick off at the Danforth Center, where Pandey and her team will develop and optimize protocols for successfully and consistently growing sphagnum mosses in a lab setting. The team will record detailed morphology and growth patterns of sphagnum and study how varying conditions affect its growth and decay. They will also create tools to manipulate sphagnum genes and study the functions of specific genes involved in important biological processes. In addition, the team will identify the growth conditions and tissue types needed to create a sphagnum gene expression atlas and develop further genomics tools.

“To accomplish this, we must have access to reliable sphagnum cultures in the lab, and we must be able to determine the growth and decay patterns, developmental transitions and physiology of these underappreciated plants. We’re excited to bring the expertise we’ve gained working with other mosses to this globally important project,” Pandey added.

“Different sphagnum species act together to engineer the peatland ecosystem,” said Danforth Center Postdoctoral Associate Boominathan Mohanasundaram, PhD. “For example, S. fuscum forms dry mounds where growth and decay are slow, while S. cuspidatum grows submerged in bog pools and decays relatively faster. By studying these two species from the extremes of the water-level and decay-rate spectrum, we will be able to assess the overall impact of climate factors on peatlands.”

At HudsonAlpha, Harkess and his lab will generate a comprehensive gene expression atlas spanning various sphagnum tissues, which will provide a valuable resource for future studies. The team will also examine sphagnum moss sex chromosomes.

“Sphagnum mosses are dioecious plants, having individuals with separate male and female reproductive organs,” says HudsonAlpha Senior Scientist Sarah Carey, PhD, who will lead the genomics projects. “A recent study in one species of sphagnum shows that genes on the sex chromosomes are potentially important to moss growth in peat bogs and how it contends with its acidic environment. We will assemble more sphagnum sex chromosomes to get a better picture of what all is controlled by interactions with the sex chromosomes.”  

By understanding sphagnum mosses more completely on a genetic level, scientists across many disciplines will gain insights into the evolution of land plants and access more tools to develop strategies to protect peatlands, which are crucial for mitigating climate change and preserving biodiversity.

About The Donald Danforth Plant Science Center
Founded in 1998, the Donald Danforth Plant Science Center is a nonprofit research institute with a mission to improve the human condition through plant science. Research, education and outreach aim to have an impact at the nexus of food security and the environment and position the St. Louis region as a world center for plant science. The Center’s work is funded through competitive grants from many sources, including the National Science Foundation, National Institutes of Health, U.S. Department of Energy, the Bill & Melinda Gates Foundation, and through the support of individuals and corporations.

About HudsonAlpha Institute for Biotechnology 
HudsonAlpha Institute for Biotechnology is a nonprofit organization dedicated to innovating the field of genomic technology and science within health, agriculture, education, and commercialization. Founded in 2008 in Huntsville, the institute strives to spark scientific discoveries that impact human health and well-being while bringing genomic medicine into clinical care. HudsonAlpha fosters life sciences entrepreneurship and encourages the creation of a genomics-literate society. The campus includes nearly 50 biotech companies and consists of 152 acres nestled within Cummings Research Park, the second-largest research park in the nation. To learn more about HudsonAlpha, visit hudsonalpha.org.

 

European project "Pro-Wild" to protect and promote crop wild relatives launched




INRAE - National Research Institute for Agriculture, Food and Environment





"Pro-Wild," short for "Protect and Promote Crop Wild Relatives," is focused on three of Europe’s most widely cultivated crops: wheat, sugar beet, and oilseed rape. The development of new varieties has been stressed as a critical means of mitigating the anticipated impacts of climate change. The "Pro-Wild" project aims to better preserve, characterize, and exploit the genetic diversity of crop wild relatives (CWR) to breed better-adapted crops.

The "Pro-Wild" project, which is running from 2024-2029, will address three core areas:

  • In-situ Conservation: Mapping, characterizing and protecting CWR in their natural habitats.
  • Ex-situ Conservation: Propagating, conserving, and cataloguing CWR outside their natural environments to safeguard their genetic material.
  • Pre-breeding Activities: Identifying desirable traits in CWR and incorporating these traits into elite breeding programs to enhance the genetic diversity and resilience of cultivated crops.

The ultimate goal of "Pro-Wild" is to improve the adaptability of wheat, sugar beet, and oilseed rape to changing climatic conditions while maintaining resistance to pests and diseases and improving end-use quality. By reintegrating the genetic diversity of wild relatives, the project aims to enhance the sustainability and security of agriculture in Europe and beyond, with project partners in Turkey and Israel.

The project officially kicked off in the first week of September with a meeting in Clermont-Ferrand, France. Representatives from the 19 international partner institutions gathered to outline the project's roadmap and collaborative efforts. The project is managed and coordinated by INRAE.

"Pro-Wild" is funded by the European Union under the Horizon Europe program, with support from UK Research and Innovation (UKRI) and the Swiss State Secretariat for Education, Research and Innovation (SERI).


Project Features

Pro Wild – Protect and Promote Crop Wild Relatives

Coordination by INRAE – 19 partners: INRAE, National Research Institute for Agriculture, Food and the Environment, France / IAM, Institut Agro Montpellier, France / WEIZMANN, Weizmann Institute of Science, Israel / IPK, Leibniz Institute of Plant Genetics and Crop Plant Research, Germany / BOKU, University of Natural Resources and Life Sciences Vienna, Austria / ARO, The Agricultural Research Organisation of Israel – The Volcani Centre, Israel / CREA, Council for Agricultural Research and Economics, Italy / UNIPG, University of Perugia, Italy / UNIPA, University of Palermo, Italy / CUK, Çukurova University, Türkiye / INN, Innolea, France / KWS, KWS SAAT SE & Co. KGaA, Germany / IT, INRAE Transfert SAS, France / ELGO, Hellenic Agricultural Organization – Dimitra, Greece / IfZ, Institute of Sugar Beet Research, Germany / HCC, Highclere Consulting, Romania / SV, SESVanderHave, Belgium / FiBL, Research Institute of Organic Agriculture, Switzerland / JIC, John Innes Centre, United Kingdom

European funding under the Horizon Europe programme: €5,472,484.50

Horizon Europe project no. 101134965

www.pro-wild.eu - LinkedIn - X