Monday, April 08, 2024

 

How scientists are accelerating chemistry discoveries with automation


New statistical-modeling workflow may help advance drug discovery and synthetic chemistry



DOE/LAWRENCE BERKELEY NATIONAL LABORATORY

Image: Berkeley Lab scientists have developed a new automated workflow that applies statistical analysis to process data from nuclear magnetic resonance (NMR) spectroscopy. The advance could help speed the discovery of new pharmaceutical drugs and accelerate the development of new chemical reactions. (Credit: Jenny Nuss/Berkeley Lab)




A new automated workflow developed by scientists at Lawrence Berkeley National Laboratory (Berkeley Lab) has the potential to allow researchers to analyze the products of their reaction experiments in real time, a key capability needed for future automated chemical processes.

The workflow – which applies statistical analysis to process data from nuclear magnetic resonance (NMR) spectroscopy – could help speed the discovery of new pharmaceutical drugs and accelerate the development of new chemical reactions.

The Berkeley Lab scientists who developed the groundbreaking technique say that the workflow can quickly identify the molecular structure of products formed by chemical reactions that have never been studied before. They recently reported their findings in the Journal of Chemical Information and Modeling.

In addition to drug discovery and chemical reaction development, the workflow could also help researchers who are developing new catalysts. Catalysts are substances that facilitate a chemical reaction in the production of useful new products like renewable fuels or biodegradable plastics.

“What excites people the most about this technique is its potential for real-time reaction analysis, which is an integral part of automated chemistry,” said first author Maxwell C. Venetos, a former researcher in Berkeley Lab’s Materials Sciences Division and former graduate student researcher in materials sciences at UC Berkeley. He completed his doctoral studies last year. “Our workflow really allows you to start pursuing the unknown. You are no longer constrained by things that you already know the answer to.”

The new workflow can also identify isomers, which are molecules with the same chemical formula but different atomic arrangements. This could greatly accelerate synthetic chemistry processes in pharmaceutical research, for example. “This workflow is the first of its kind where users can generate their own library, and tune it to the quality of that library, without relying on an external database,” Venetos said.

Advancing new applications

In the pharmaceutical industry, drug developers currently use machine-learning algorithms to virtually screen hundreds of chemical compounds to identify potential new drug candidates that are more likely to be effective against specific cancers and other diseases. These screening methods comb through online libraries or databases of known compounds (or reaction products) and match them with likely drug “targets” in cells.

But if a drug researcher is experimenting with molecules so new that their chemical structures don’t yet exist in a database, they must typically spend days in the lab to sort out the mixture’s molecular makeup: First, by running the reaction products through a purification machine, and then using one of the most useful characterization tools in a synthetic chemist’s arsenal, an NMR spectrometer, to identify and measure the molecules in the mixture one at a time.

“But with our new workflow, you could feasibly do all of that work within a couple of hours,” Venetos said. The time-savings come from the workflow’s ability to rapidly and accurately analyze the NMR spectra of unpurified reaction mixtures that contain multiple compounds, a task that is impossible through conventional NMR spectral analysis methods.
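
The heart of that capability is treating the measured spectrum of an unpurified mixture as a weighted combination of candidate component spectra. The following toy sketch (synthetic Lorentzian peaks and a non-negative least-squares fit, not the published workflow) illustrates the idea:

import numpy as np
from scipy.optimize import nnls

# Toy illustration: an observed 1D NMR spectrum modelled as a non-negative
# weighted sum of simulated candidate spectra (e.g. from quantum-chemical shift predictions).
rng = np.random.default_rng(0)

def lorentzian(ppm, center, width=0.02):
    return width**2 / ((ppm - center)**2 + width**2)

ppm = np.linspace(0, 10, 2000)
candidates = np.stack([
    lorentzian(ppm, 1.20) + lorentzian(ppm, 3.4),   # simulated spectrum of compound A
    lorentzian(ppm, 1.25) + lorentzian(ppm, 7.1),   # compound B, an isomer-like near match
    lorentzian(ppm, 2.00) + lorentzian(ppm, 5.5),   # compound C
], axis=1)                                           # shape: (n_points, n_candidates)

true_weights = np.array([0.6, 0.3, 0.1])
observed = candidates @ true_weights + 0.01 * rng.normal(size=ppm.size)

weights, _ = nnls(candidates, observed)              # non-negative least-squares fit
print("estimated relative concentrations:", weights / weights.sum())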

“I’m very excited about this work as it applies novel data-driven methods to the age-old problem of accelerating synthesis and characterization,” said senior author Kristin Persson, a faculty senior scientist in Berkeley Lab’s Materials Sciences Division and UC Berkeley professor of materials science and engineering who also leads the Materials Project.

Experimental results

In addition to being much faster than benchtop purification methods, the new workflow has the potential to be just as accurate. NMR simulation experiments performed using the National Energy Research Scientific Computing Center (NERSC) at Berkeley Lab with support from the Materials Project showed that the new workflow can correctly identify compound molecules in reaction mixtures that produce isomers, and also predict the relative concentrations of those compounds.

To ensure high statistical accuracy, the research team analyzed the NMR spectra with a sophisticated Markov chain Monte Carlo algorithm known as Hamiltonian Monte Carlo (HMC). They also performed advanced theoretical calculations based on a method called density-functional theory.
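
A minimal sketch of what a Hamiltonian Monte Carlo sampler for relative concentrations might look like is shown below. It assumes a Gaussian noise model, a softmax parameterization of the mixture weights, and finite-difference gradients, none of which are taken from the paper:

import numpy as np

rng = np.random.default_rng(1)

# Tiny synthetic problem: observed spectrum y as a weighted sum of 3 candidate spectra (columns of A)
A = rng.random((200, 3))
true_w = np.array([0.6, 0.3, 0.1])
sigma = 0.05
y = A @ true_w + sigma * rng.normal(size=200)

def to_weights(theta):
    # softmax keeps the weights positive and summing to one
    e = np.exp(theta - theta.max())
    return e / e.sum()

def log_post(theta):
    resid = y - A @ to_weights(theta)
    return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * np.sum(theta**2)   # Gaussian likelihood + weak prior

def grad_log_post(theta, eps=1e-5):
    # finite-difference gradient keeps the sketch short; real code would use automatic differentiation
    g = np.zeros_like(theta)
    for i in range(theta.size):
        d = np.zeros_like(theta); d[i] = eps
        g[i] = (log_post(theta + d) - log_post(theta - d)) / (2 * eps)
    return g

def hmc_step(theta, step=0.01, n_leapfrog=30):
    p = rng.normal(size=theta.shape)                       # sample an auxiliary momentum
    th, pn = theta.copy(), p + 0.5 * step * grad_log_post(theta)
    for _ in range(n_leapfrog):                            # leapfrog integration of the Hamiltonian dynamics
        th = th + step * pn
        pn = pn + step * grad_log_post(th)
    pn = pn - 0.5 * step * grad_log_post(th)               # correct the final momentum half-step
    log_accept = (log_post(th) - 0.5 * pn @ pn) - (log_post(theta) - 0.5 * p @ p)
    return th if np.log(rng.uniform()) < log_accept else theta   # Metropolis accept/reject

theta, samples = np.zeros(3), []
for i in range(2000):
    theta = hmc_step(theta)
    if i >= 500:                                           # discard burn-in
        samples.append(to_weights(theta))
print("posterior mean relative concentrations:", np.mean(samples, axis=0))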

Venetos designed the automated workflow as open source so that users can run it on an ordinary desktop computer. That convenience will come in handy for anyone from industry or academia.

The technique sprouted from conversations between the Persson group and experimental collaborators Masha Elkin and Connor Delaney, former postdoctoral researchers in the John Hartwig group at UC Berkeley. Elkin is now a professor of chemistry at the Massachusetts Institute of Technology, and Delaney a professor of chemistry at the University of Texas at Dallas.

“In chemistry reaction development, we are constantly spending time to figure out what a reaction made and in what ratio,” said John Hartwig, a senior faculty scientist in Berkeley Lab’s Chemical Sciences Division and UC Berkeley professor of chemistry. “Certain NMR spectrometry methods are precise, but if one is deciphering the contents of a crude reaction mixture containing a bunch of unknown potential products, those methods are far too slow to have as part of a high-throughput experimental or automated workflow. And that's where this new capability to predict the NMR spectrum could help,” he said.

Now that they’ve demonstrated the automated workflow’s potential, Persson and team hope to incorporate it into an automated laboratory that analyzes the NMR data of thousands or even millions of new chemical reactions at a time.

Other authors on the paper include Masha Elkin, Connor Delaney, and John Hartwig at UC Berkeley.

NERSC is a DOE Office of Science user facility at Berkeley Lab.

The work was supported by the U.S. Department of Energy’s Office of Science, the U.S. National Science Foundation, and the National Institutes of Health.

###

Lawrence Berkeley National Laboratory (Berkeley Lab) is committed to delivering solutions for humankind through research in clean energy, a healthy planet, and discovery science. Founded in 1931 on the belief that the biggest problems are best addressed by teams, Berkeley Lab and its scientists have been recognized with 16 Nobel Prizes. Researchers from around the world rely on the Lab’s world-class scientific facilities for their own pioneering research. Berkeley Lab is a multiprogram national laboratory managed by the University of California for the U.S. Department of Energy’s Office of Science.

DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

 

 

 

PFAS ‘forever chemicals’ above drinking water guidelines in global source water


We’re likely underestimating the future impact of PFAS in the environment, new UNSW Sydney-led study shows.



UNIVERSITY OF NEW SOUTH WALES





Per- and polyfluoroalkyl substances – commonly known as PFAS – are a group of over 14,000 human-made chemicals that have been popular since the 1950s for their diverse abilities to resist heat, water, grease and stains.

They’ve been commonly found in household products like non-stick frypans, clothing, cosmetics, insecticides, and food packaging, as well as specialty industry products, like firefighting foam. 

But despite their broad skillset, the chemicals have a dark side: they’re known as ‘forever chemicals’ as once they’re in the environment – or our bodies – they don’t degrade further.  

PFAS have been linked to environmental and health issues, including some cancers, but a lot remains unknown about the true scale and potential impacts of the problem – including how much is in our water supply. 

A new UNSW-led international study, published today in Nature Geoscience, assessed the levels of PFAS contamination in surface and ground water around the globe.  

It found that much of our global source water exceeds PFAS safe drinking limits. 

“Many of our source waters are above PFAS regulatory limits,” says senior author of the study, UNSW Engineering Professor Denis O’Carroll.

“We already knew that PFAS is pervasive in the environment, but I was surprised to find out the large fraction of source waters that are above drinking water advisory recommendations,” he says. “We're talking above 5 per cent, and it goes over 50 per cent in some cases.” 

The research team pulled together PFAS measurements from sources around the world, including government reports, databases, and peer-reviewed literature. Altogether, they collated more than 45,000 data points spanning roughly 20 years.

It’s the first study to quantify the environmental burden of PFAS on a global scale. 

The study also found high concentrations of PFAS in Australia, with many locations above recommended drinking water levels. This tended to be in areas where firefighting foams had been used in the past, like military institutions and fire training facilities.  

Prof. O’Carroll stresses that these PFAS traces are found in source water, such as dams, and not drinking water itself – drinking water goes through treatment plants, some of which are designed to reduce the amount of chemicals such as PFAS in our water before it comes out of the tap.  

But some water providers – for example, Sydney Water – don’t routinely measure the broad range of PFAS potentially in our drinking water, says Prof. O’Carroll.

“Drinking water is largely safe, and I don't hesitate drinking it,” he says. “I also don’t suggest that bottled water is better, because it doesn’t mean that they’ve done anything differently than what comes out of the tap. 

“But I certainly think that monitoring PFAS levels and making the data easily available is worthwhile.” 

A contentious debate: how much PFAS is too much? 

Most people in Australia – and in many places around the world – are likely to have low levels of PFAS in their bodies.  

But the potential health risks of PFAS chemicals are poorly understood and haven’t been agreed on universally. 

According to an Australian Government expert health panel, there is limited to no evidence that PFAS poses clinically significant harm to human health. Further afield, however, peak bodies in the US and Europe suggest that PFAS is linked to adverse health outcomes, such as lower birth weight in babies, higher cholesterol levels, reduced kidney function, thyroid disease, altered sex hormone levels, reduced vaccine response, and liver, kidney, and testicular cancers.

In 2023, the World Health Organisation’s International Agency for Research on Cancer (IARC) classified PFOA, a type of PFAS, as a Group 1 human carcinogen.

While PFAS has been linked to many of these health outcomes, they haven’t necessarily been shown to cause them – but given the potential risks and ‘forever’ nature of these chemicals, many regulatory bodies have tightened PFAS use and introduced safe drinking water limits as a precaution. 

“Two forms of PFAS initially raised concerns about 20 years ago: PFOS and PFOA,” says Prof. O’Carroll.

“These chemicals are regulated to different extents around the world. In the US, the proposed drinking water limits for PFOS and PFOA are four nanograms per litre.” 

A third PFAS, PFHxS, is also regulated in Australia, where the sum of PFOS and PFHxS is limited to 70 nanograms per litre – well above the four nanograms per litre limit applied to PFOS and PFOA in the US.

But Australia’s acceptable level for PFOA in drinking water is even higher.

“PFOA, on the other hand, is regulated in Australia at 560 nanograms per litre, which is two orders of magnitude higher than in the US,” says Prof. O’Carroll. 

While Australia’s limits seem relaxed compared to the US, both countries’ recommended drinking water guidelines pale in comparison with Canada’s: rather than limiting only two or three forms of PFAS in drinking water, Canada tallies up the sum of all 14,000 PFAS and caps that total at 30 nanograms per litre.

The study found that 69 per cent of global groundwater samples with no known contamination source exceeded Health Canada’s safe drinking water criteria, while 32 per cent of the same samples exceeded the US’s proposed drinking water hazard index. 
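
To make the different thresholds quoted above concrete, here is a minimal sketch that checks a single hypothetical water sample against them. The sample concentrations are invented for illustration, and the simple sum comparisons are an assumption, not the study's methodology:

# Hypothetical sample concentrations in nanograms per litre (ng/L)
sample = {"PFOS": 12.0, "PFOA": 3.0, "PFHxS": 20.0}

US_PROPOSED_EACH = 4.0        # proposed US limit quoted above for PFOS and for PFOA
AUS_PFOS_PLUS_PFHXS = 70.0    # Australian limit on the sum of PFOS and PFHxS
AUS_PFOA = 560.0              # Australian limit on PFOA
CANADA_SUM_PFAS = 30.0        # Health Canada limit on the sum of measured PFAS

print("Exceeds US proposed PFOS/PFOA limits:",
      sample["PFOS"] > US_PROPOSED_EACH or sample["PFOA"] > US_PROPOSED_EACH)
print("Exceeds Australian PFOS + PFHxS limit:",
      sample["PFOS"] + sample["PFHxS"] > AUS_PFOS_PLUS_PFHXS)
print("Exceeds Australian PFOA limit:", sample["PFOA"] > AUS_PFOA)
print("Exceeds Canadian sum-of-PFAS limit:", sum(sample.values()) > CANADA_SUM_PFAS)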

“There’s debate about what level PFAS should be regulated to,” says Prof. O’Carroll. “Australia has much higher limits than the US, but the question is why. 

“Both health bodies would have different reasoning for that, and there’s not a really strong consensus here.” 

An underestimated risk 

The study suggests that actual PFAS pollution in global water resources could be higher than suspected. 

This is, in part, due to us only monitoring and regulating a limited number of the 14,000 PFAS in existence, and also because the levels of PFAS in consumer products are higher than expected. 

“There’s a real unknown amount of PFAS that we’re not measuring in the environment,” says Prof. O’Carroll. “Commercial products like garments and food packaging have a lot more PFAS in them than we realise.  

“This means we’re likely underestimating the environmental burden posed by PFAS.” 

Prof. O’Carroll and his team are now trying to develop their research by quantifying these levels of PFAS from commercial products in the environment.  

They’re also working to develop technologies that can degrade PFAS in drinking water systems, and looking at developing predictive models that determine where PFAS will go in the environment. 

“Part of this is figuring out how PFAS will associate with different parts of the environment and our bodies – proteins, for example,” says Prof. O’Carroll. 

These studies will be in progress over the next two years and aim to be completed by 2026. 

In the meantime, Prof. O’Carroll says manufacturers and consumers alike need to be careful and do their due diligence when using products containing PFAS.

“We manufacture and distribute a lot of chemicals without having a full assessment on their potential health impacts,” he says. 

“We should have judicious use of some of these chemicals. Just because they’re available doesn’t mean that we should use them.”

This 3D printer can figure out how to print with an unknown material

The advance could help make 3D printing more sustainable, enabling printing with renewable or recyclable materials that are difficult to characterize.


MASSACHUSETTS INSTITUTE OF TECHNOLOGY

Image: Researchers developed a 3D printer that can automatically identify the parameters of an unknown material on its own. (Credit: Courtesy of Neil Gershenfeld, Jake Read, et al.)




While 3D printing has exploded in popularity, many of the plastic materials these printers use to create objects cannot be easily recycled. While new sustainable materials are emerging for use in 3D printing, they remain difficult to adopt because 3D printer settings need to be adjusted for each material, a process generally done by hand.

To print a new material from scratch, one must typically enter up to 100 parameters into the software that controls how the printer will extrude the material as it fabricates an object. Commonly used materials, like mass-manufactured polymers, have established sets of parameters that were perfected through tedious, trial-and-error processes.

But the properties of renewable and recyclable materials can fluctuate widely based on their composition, so fixed parameter sets are nearly impossible to create. In this case, users must come up with all these parameters by hand.

Researchers tackled this problem by developing a 3D printer that can automatically identify the parameters of an unknown material on its own. 

A collaborative team from MIT’s Center for Bits and Atoms (CBA), the U.S. National Institute of Standards and Technology (NIST), and the National Center for Scientific Research in Greece (Demokritos) modified the extruder, the “heart” of a 3D printer, so it can measure the forces and flow of a material.

These data, gathered through a 20-minute test, are fed into a mathematical function that is used to automatically generate printing parameters. These parameters can be entered into off-the-shelf 3D printing software and used to print with a never-before-seen material.  

The automatically generated parameters can replace about half of the parameters that typically must be tuned by hand. In a series of test prints with unique materials, including several renewable materials, the researchers showed that their method can consistently produce viable parameters. 

This research could help to reduce the environmental impact of additive manufacturing, which typically relies on nonrecyclable polymers and resins derived from fossil fuels.

“In this paper, we demonstrate a method that can take all these interesting materials that are bio-based and made from various sustainable sources and show that the printer can figure out by itself how to print those materials. The goal is to make 3D printing more sustainable,” says senior author Neil Gershenfeld, who leads CBA.

His co-authors include first author Jake Read, a graduate student in the CBA who led the printer development; Jonathan Seppala, a chemical engineer in the Materials Science and Engineering Division of NIST; Filippos Tourlomousis, a former CBA postdoc who now heads the Autonomous Science Lab at Demokritos; James Warren, who leads the Materials Genome Program at NIST; and Nicole Bakker, a research assistant at CBA. The research is published in the journal Integrating Materials and Manufacturing Innovation.

Shifting material properties

In fused filament fabrication (FFF), which is often used in rapid prototyping, molten polymers are extruded through a heated nozzle layer-by-layer to build a part. Software, called a slicer, provides instructions to the machine, but the slicer must be configured to work with a particular material. 

Using renewable or recycled materials in an FFF 3D printer is especially challenging because there are so many variables that affect the material properties. 

For instance, a bio-based polymer or resin might be composed of different mixes of plants based on the season. The properties of recycled materials also vary widely based on what is available to recycle. 

“In ‘Back to the Future,’ there is a ‘Mr. Fusion’ blender where Doc just throws whatever he has into the blender and it works [as a power source for the DeLorean time machine]. That is the same idea here. Ideally, with plastics recycling, you could just shred what you have and print with it. But, with current feed-forward systems, that won’t work because if your filament changes significantly during the print, everything would break,” Read says.

To overcome these challenges, the researchers developed a 3D printer and workflow to automatically identify viable process parameters for any unknown material.

They started with a 3D printer their lab had previously developed that can capture data and provide feedback as it operates. The researchers added three instruments to the machine’s extruder that take measurements which are used to calculate parameters.

A load cell measures the pressure being exerted on the printing filament, while a feed rate sensor measures the thickness of the filament and the actual rate at which it is being fed through the printer.
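
For example, the measured feed rate and filament thickness together determine the volumetric flow of material through the nozzle. A minimal sketch of that arithmetic (the function and variable names are illustrative, not from the paper):

import math

def volumetric_flow_rate(feed_rate_mm_s: float, filament_diameter_mm: float) -> float:
    """Volumetric flow through the nozzle (mm^3/s) from the measured feed rate and filament diameter."""
    cross_section_mm2 = math.pi * (filament_diameter_mm / 2.0) ** 2
    return feed_rate_mm_s * cross_section_mm2

# Example: 1.75 mm filament fed at 2 mm/s
print(volumetric_flow_rate(2.0, 1.75))   # about 4.8 mm^3/s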

“This fusion of measurement, modeling, and manufacturing is at the heart of the collaboration between NIST and CBA, as we work to develop what we’ve termed ‘computational metrology,’” says Warren.

These measurements can be used to calculate the two most important, yet difficult to determine, printing parameters: flow rate and temperature. Nearly half of all print settings in standard software are related to these two parameters.  

Deriving a dataset

Once they had the new instruments in place, the researchers developed a 20-minute test that generates a series of temperature and pressure readings at different flow rates. Essentially, the test involves setting the print nozzle at its hottest temperature, flowing the material through at a fixed rate, and then turning the heater off.

“It was really difficult to figure out how to make that test work. Trying to find the limits of the extruder means that you are going to break the extruder pretty often while you are testing it. The notion of turning the heater off and just passively taking measurements was the ‘aha’ moment,” says Read.

These data are entered into a function that automatically generates real parameters for the material and machine configuration, based on relative temperature and pressure inputs. The user can then enter those parameters into 3D printing software and generate instructions for the printer.
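
The release does not spell out that mapping function, but the gist can be sketched as choosing the fastest flow rate whose measured extrusion pressure stays within a safe range, along with the temperature recorded at that point. In the toy example below, the test log, pressure limit, and names are all hypothetical:

# Hypothetical readings from the 20-minute test: (flow rate mm^3/s, nozzle temperature °C, pressure a.u.)
test_log = [
    (1.0, 215.0, 0.8),
    (2.0, 212.0, 1.4),
    (4.0, 208.0, 2.9),
    (6.0, 205.0, 5.2),
    (8.0, 201.0, 9.8),
]

MAX_SAFE_PRESSURE = 5.0   # hypothetical limit before the extruder starts skipping or under-extruding

def pick_parameters(log, max_pressure=MAX_SAFE_PRESSURE):
    """Return (flow rate, temperature) for the fastest test point that stays under the pressure limit."""
    viable = [(flow, temp) for flow, temp, pressure in log if pressure <= max_pressure]
    if not viable:
        raise ValueError("no viable operating point found in the test data")
    return max(viable)    # tuples compare by flow rate first, so the highest viable flow rate wins

print(pick_parameters(test_log))   # -> (4.0, 208.0)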

In experiments with six different materials, several of which were bio-based, the method automatically generated viable parameters that consistently led to successful prints of a complex object.

Moving forward, the researchers plan to integrate this process with 3D printing software so parameters don’t need to be entered manually. In addition, they want to enhance their workflow by incorporating a thermodynamic model of the hot end, which is the part of the printer that melts the filament.

This collaboration is now more broadly developing computational metrology, in which the output of a measurement is a predictive model rather than just a parameter. The researchers will be applying this in other areas of advanced manufacturing, as well as in expanding access to metrology.

This research is supported, in part, by the National Institute of Standards and Technology and the Center for Bits and Atoms Consortia.

###

Written by Adam Zewe, MIT News

Paper: “Online Measurement for Parameter Discovery in Fused Filament Fabrication”

https://link.springer.com/article/10.1007/s40192-024-00350-w

 


Proof-of-principle demonstration of 3-D magnetic recording


Possibility of ultra-high density hard disk drives with areal densities exceeding 10 Tbit/in² using multi-level magnetic recording



NATIONAL INSTITUTE FOR MATERIALS SCIENCE, JAPAN

Image: Schematic view of (top) currently used HAMR and (bottom) three-dimensional magnetic recording systems. In the three-dimensional magnetic recording system, the Curie temperature of each recording layer differs by about 100 K and data are written to each layer by adjusting the laser power. (Credit: Yukiko Takahashi, NIMS; Thomas Chang, Seagate Technology; Simon Greaves, Tohoku University)




1. Research groups from NIMS, Seagate Technology, and Tohoku University have made a breakthrough in the field of hard disk drives (HDD) by demonstrating the feasibility of multi-level recording using a three-dimensional magnetic recording medium to store digital information. The research groups have shown that this technology can be used to increase the storage capacity of HDDs, which could lead to more efficient and cost-effective data storage solutions in the future.

2. Data centers are increasingly storing vast amounts of data on hard disk drives (HDDs) that use perpendicular magnetic recording (PMR) to store information at areal densities of around 1.5 Tbit/in². Transitioning to higher areal densities requires a high-anisotropy magnetic recording medium consisting of FePt grains, written with the assistance of laser heating. This method, known as heat-assisted magnetic recording (HAMR), is capable of sustaining areal recording densities of up to 10 Tbit/in². Densities beyond 10 Tbit/in² become possible with a newly demonstrated principle: storing three or four recording levels per location instead of the two (binary) levels used in current HDD technology.
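
The capacity gain from multi-level recording follows directly from the number of levels, since each recording location stores log2(levels) bits instead of one. A quick back-of-the-envelope check, taking the 10 Tbit/in² HAMR figure quoted above as the single-level baseline:

import math

hamr_binary_density = 10.0   # Tbit/in², single-level (binary) HAMR figure quoted above

for levels in (2, 3, 4):
    bits_per_location = math.log2(levels)
    print(f"{levels} levels -> {bits_per_location:.2f} bits per location -> "
          f"{hamr_binary_density * bits_per_location:.1f} Tbit/in²")
# 2 levels -> 1.00 bits per location -> 10.0 Tbit/in²
# 3 levels -> 1.58 bits per location -> 15.8 Tbit/in²
# 4 levels -> 2.00 bits per location -> 20.0 Tbit/in²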

3. In this study, we succeeded in arranging the FePt recording layers three-dimensionally by fabricating lattice-matched FePt/Ru/FePt multilayer films, with Ru as a spacer layer. Magnetization measurements show that the two FePt layers have different Curie temperatures, which makes three-dimensional recording possible by adjusting the laser power during writing. In addition, we have demonstrated the principle of 3D recording through recording simulations, using a media model that mimics the microstructure and magnetic properties of the fabricated media.
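
Conceptually, layer selection works by tuning the laser power relative to the two Curie temperatures. The sketch below assumes hypothetical Curie temperatures about 100 K apart (as in the figure caption) and a two-pass convention in which the higher-Tc layer is written first with a hotter pulse and the lower-Tc layer is then written with a cooler pulse that leaves the first layer untouched; none of these specific values come from the paper:

# Hypothetical Curie temperatures (K), roughly 100 K apart as described above
TC_HIGH, TC_LOW = 700.0, 600.0
MARGIN = 30.0   # hypothetical heating margin above the Curie point

def write_cell(bit_for_high_tc_layer: int, bit_for_low_tc_layer: int):
    """Return the two (peak temperature, bit) write passes for one recording location."""
    return [
        (TC_HIGH + MARGIN, bit_for_high_tc_layer),  # hot pass: both layers exceed their Curie points
        (TC_LOW + MARGIN, bit_for_low_tc_layer),    # cooler pass: only the low-Tc layer is re-magnetized
    ]

print(write_cell(1, 0))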

4. The three-dimensional magnetic recording method can increase recording capacity by stacking recording layers in three dimensions. This means that more digital information can be stored with fewer HDDs, leading to energy savings for data centers. In the future, we plan to develop processes to reduce the size of FePt grains, to improve the orientation and magnetic anisotropy, and to stack more FePt layers to realize a media structure suitable for practical use as a high-density HDD.

***

5. This research was conducted by Dr. P. Tozman, Distinguished Researcher, and Dr. Yukiko Takahashi, Group Leader of NIMS Center for Magnetic and Spintronics Materials Research, Dr. T.Y. Chang, Researcher at Seagate Technology, and Prof. S.J. Greaves of Tohoku University. This work was supported by Japan Science and Technology Agency (JST) Strategic Basic Research Programs (CREST) "Integrated Devices and Systems Utilizing Information Carriers" JPMJCR22C3.

6. This research was published in Acta Materialia on March 24, 2024.


 

Gendered recommendations in 19th century list of books for boys and girls set the stage for field of children’s literature today



CARNEGIE MELLON UNIVERSITY




Children’s literature became a distinct category during the Progressive Era in the United States, largely through the work of professional “book women” like children’s librarians, publishers, and teachers. In a chapter in a new book, researchers examine one of the first attempts to formalize a selection of existing literature into a canon of children’s books, the 1882 pamphlet Books for the Young by Caroline M. Hewins. They also analyze the books selected by Hewins, with a focus on books designated for boys only and for girls only.

The chapter, by researchers at Carnegie Mellon University, appears in Corpora and Rhetorically Informed Text Analysis: The Diverse Applications of DocuScope.

“Hewins’ booklist featured more than a thousand titles and is widely acknowledged by children’s literature scholars to be a major milestone in the formation of the field, but the actual texts are understudied,” explains Rebekah Fitzsimmons, assistant teaching professor of professional communication at Carnegie Mellon’s Heinz College, who coauthored the chapter.

“Our analysis found that Hewins assigned a narrower scope of text types to the category of books that girls will like, which sets a precedent for future gendering of readership for the field of children’s literature.”

Children’s literature seems omnipresent today, but this category of publishing did not exist until the turn of the 20th century, when Progressive Era social reform spurred advocates to establish a category of literature for children. Scholars now view the lists of recommended books written by these knowledgeable librarians (like Hewins) as foundational. But little attention has been paid to examining the stylistic and rhetorical elements of the books themselves.

In this chapter, Fitzsimmons and her coauthor examine what drew Hewins to select specific titles for the first edition of the list of books she compiled. Using DocuScope, a computer-based, rhetorically informed, dictionary-based tagging system created by faculty in Carnegie Mellon’s English department, the researchers identified patterns, including how Hewins selected books for gendered groups of readers (i.e., books designated as especially good for boys, especially good for girls, or appropriate for a general audience).

The books Hewins recommended for girls were largely place-based in spaces like home or school; were more likely to be written in the first or second person; and were more likely to be focused on negative emotions, acts, relationships, or values. In contrast, the books she recommended for boys were far more likely to be high in reasoning and confidence (a marker of nonfiction texts) and included settings beyond home and school.
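
DocuScope itself is a large, rhetorically informed, dictionary-based tagger developed at Carnegie Mellon; the toy sketch below shows dictionary-based tagging in miniature, with tiny made-up word lists loosely echoing the categories above rather than the actual DocuScope dictionaries:

from collections import Counter

# Toy category word lists; the real DocuScope dictionaries are far larger and more nuanced.
CATEGORIES = {
    "first_second_person": {"i", "me", "my", "we", "our", "you", "your"},
    "negative_emotion": {"sad", "fear", "angry", "sorrow", "cruel"},
    "confidence": {"certainly", "clearly", "indeed", "must", "proven"},
}

def tag_counts(text: str) -> Counter:
    """Count how many tokens in the text fall into each category."""
    tokens = [token.strip(".,;:!?\"'") for token in text.lower().split()]
    counts = Counter()
    for category, words in CATEGORIES.items():
        counts[category] = sum(token in words for token in tokens)
    return counts

print(tag_counts("You must not fear, my dear; we shall certainly prevail."))
# Counter({'first_second_person': 3, 'confidence': 2, 'negative_emotion': 1})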

In addition, Hewins’ recommendations shed light on the era’s ideas about which occupations and hobbies interested boys and girls. For example, she suggested books on manufacturing and the sciences for boys, and recommended books on household arts and amusements for girls.

“Given the importance of this list in children’s literature studies, our analysis reveals what a knowledgeable book expert of the 1880s considered gender-appropriate reading—including insights into how Victorian notions of binary gendered spheres intersected with ideas of gendered readership,” notes Gisele (Xinyu) Wu, who was a senior in the statistics department at Carnegie Mellon when she coauthored the chapter.

“But it also demonstrates that Hewins’ selection of books according to those cultural norms established standards about the kinds of books girls like or don’t like, and tells us how those choices continue to shape children’s literature today.”

Fitzsimmons plans to continue to work on this project with assistance from Carnegie Mellon’s statistics department’s capstone teams (which help demonstrate students’ mastery of subjects or fields of study), comparing lists from other librarians published in 1904 and 1940.

Relatedly, in a separate chapter in another book, Fitzsimmons examined Hewins’ 1882 canon-formation project and her pamphlet. She explored the language Hewins used to bolster the ethos of her pamphlet and to discuss books suitable for boys and girls, and her attempts to wrangle the emerging field of children’s literature into usable categories. She also parsed the political maneuvers of this canon-forming project, which was meant to establish children’s literature as a respectable sub-field of literature and thereby make a case for valuing the expertise of children’s librarians, grounded in taste, discernment, broad knowledge, and hands-on experience. This chapter appears in Children’s Literature and Childhood Discourses: Exploring Identity through Fiction.

 

Boreal forest and tundra regions worst hit over next 500 years of climate change, study shows


The boreal forest, covering much of Canada and Alaska, and the treeless shrublands to the north of the forest region, may be among the worst impacted by climate change over the next 500 years, according to a new study.


UNIVERSITY OF YORK






 

The study, led by researchers at the White Rose universities of York and Leeds, together with colleagues at Oxford, Montreal, and ETH Zurich in Switzerland, ran a widely-used climate model with different atmospheric concentrations of carbon dioxide to assess the impact climate change could have on the distribution of ecosystems across the planet up to the year 2500.

 

Most climate prediction models run to the year 2100, but researchers are keen to explore longer-term projections that give a global picture of how much humans, animals and plant-life may need to adapt to climate change beyond the next century, which is important as long-lived trees adapt at scales of centuries rather than decades.

 

Modelling climate change over a 500 year period shows that much of the boreal forest, the Earth’s northernmost forests and most significant provider of carbon storage and clean water, could be seriously impacted, along with tundra regions, treeless shrublands north of the boreal forest that play a significant role in regulating the Earth’s climate. 

 

Tundra regions have already seen new plants colonising lands that would once have been too cold for them to survive on, and as the planet continues to warm, the region’s ability to cool tropical heat and push it back down to the equator is reduced.

 

This means that if there is not a rapid halt in emitting greenhouse gases, large parts of some of the hottest countries on Earth will become too hot to be easily inhabited and considerable changes would have to be made to daily life to exist there.

 

The researchers highlight that although we are already starting to see animals and plants migrating as they try to adapt to changing climate conditions, this could intensify in the future. As the study highlights, some species, like trees, migrate much more slowly than animals and humans can, and so some plants and animals will be lost altogether, threatening the survival of today’s ecosystems.

 

Dr Christopher Lyon, from the University of York’s Department of Environment and Geography and Leverhulme Centre for Anthropocene Biodiversity, said: “We know now that some aspects of climate change are inevitable and so a level of adaptation is required, but how extensive these adaptations need to be is still in our hands. It is, therefore, useful to look beyond the UN’s 2030 and 2050 carbon emission targets, as well as the 2100 climate model predictions, as we know that climate change won’t stop there.

 

“By looking much further into the future - the future that our grandchildren will face - we can see that there is a significant difference between climate change rates, species migration rates, and their migration ability. Trees, for example, will migrate much more slowly than birds and mammals, and boreal decline radically changes the ecosystems they’ve formed since the glaciers retreated about 12,000 years ago.

 

“Those species that can’t adapt or move to more suitable locations will radically decline in number and range or even go extinct.”

 

The study highlights that current boreal regions are colder and less densely populated, but changing environments may mean more people migrate to these landscapes as they warm in the future, increasing the pressures on ecosystems and species.  

 

Migration on this scale also relies on political cooperation from countries around the world, and researchers point out that given current global conflicts and divisions, this could be one of the most significant barriers to successful climate adaptation. 

 

Dr Lyon said: “What’s most important, I think, is that the long-term projections highlight the scale of the change we, and especially our children and grandchildren face - even under the lower warming scenarios - and the need to start thinking very hard now about what it will take for all of us to live justly in those possible worlds.”

 

Dr Bethany Allen from ETH (Federal Institute of Technology) Zurich, added: "Our study indicates the longevity and severity of the impacts that human-induced climate change will have on the biosphere. The need to protect boreal forest and tundra biomes is particularly pressing, and our results demonstrate how large-scale geographic shifts in the areas occupied by these biomes might be necessary in order to preserve them over the next few hundred years."


The research is published in the journal Philosophical Transactions of the Royal Society B and funded in part by the White Rose Collaboration Fund and the Leverhulme Trust via the Leverhulme Centre for Anthropocene Biodiversity.

 

Climate change threatens Antarctic meteorites



ETH ZURICH

Image: Field guide in a blue ice area during a mission to take ice samples. Photo taken during the 2023-2024 fieldwork mission of the Instituto Antártico Chileno (INACH) to Union Glacier, Ellsworth Mountains, Antarctica. (Credit: Veronica Tollenaar, Université Libre de Bruxelles)




Using artificial intelligence, satellite observations, and climate model projections, a team of researchers from Switzerland and Belgium calculate that for every tenth of a degree of increase in global air temperature, an average of nearly 9,000 meteorites disappear from the surface of the ice sheet. This loss has major implications, as meteorites are unique samples of extraterrestrial bodies that provide insights into the origin of life on Earth and the formation of the Moon.

Disappearing at an alarming rate

By 2050, about a quarter of the estimated 300,000 to 800,000 meteorites in Antarctica will be lost due to glacial melt. By the end of the century, researchers anticipate that loss could approach three-quarters of the meteorites on the continent under a high-warming scenario.

Published in the journal Nature Climate Change, the study was co-led by Harry Zekollari while working under Professor Daniel Farinotti in the Laboratory of Hydraulics, Hydrology and Glaciology at the Department of Civil, Environmental and Geomatic Engineering at ETH Zurich. Zekollari and co-lead Veronica Tollenaar, Université Libre de Bruxelles, reveal in the study that ongoing warming results in the loss of about 5,000 meteorites a year, outpacing the collection efforts of Antarctic meteorites by a factor of five.
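
Taken together, the figures quoted in this release imply some simple arithmetic. The sketch below only restates those numbers; the annual collection rate is inferred from the stated factor of five and should be read as an approximation:

loss_per_tenth_degree = 9_000     # meteorites lost per 0.1 °C of global air temperature increase
annual_loss = 5_000               # meteorites currently lost per year
collection_shortfall_factor = 5   # loss outpaces collection by a factor of five

print("loss per full degree of warming:", loss_per_tenth_degree * 10)                           # 90,000
print("implied annual collection rate:", annual_loss // collection_shortfall_factor)            # about 1,000
print("implied net annual decline:", annual_loss - annual_loss // collection_shortfall_factor)  # about 4,000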

Meteorites – time capsules of the universe

Zekollari, now an Associate Professor of Glaciology at Vrije Universiteit Brussel, calls for a major international effort to preserve the scientific value of meteorites: “We need to accelerate and intensify efforts to recover Antarctic meteorites. The loss of Antarctic meteorites is much like the loss of data that scientists glean from ice cores collected from vanishing glaciers – once they disappear, so do some of the secrets of the universe.”

Meteorites are fragments from space that provide unique information about our solar system. Antarctica is the most prolific place to find meteorites, and to date, about 60 percent of all meteorites ever found on Earth have been collected from the surface of the Antarctic ice sheet. The flow of the ice sheet concentrates meteorites in so-called “meteorite stranding zones”, where their dark crust allows them to be easily detected. In addition to intensifying recovery operations, there is potential to increase the efficiency of meteorite recovery missions in the short term. This potential relies mainly on data-driven analysis to identify unexplored meteorite stranding zones and mapping areas exposing blue ice where meteorites are often found.
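
The “data-driven analysis” mentioned here amounts to classifying candidate blue-ice areas by how likely they are to be meteorite stranding zones. The toy sketch below illustrates that kind of classification with entirely hypothetical features, labels, and a made-up labelling rule; it is not the team’s model or data:

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical features per candidate area: [surface temperature (°C), ice flow speed (m/yr), slope (°)]
X = rng.normal(loc=[-25.0, 2.0, 1.0], scale=[5.0, 1.0, 0.5], size=(200, 3))

# Hypothetical labels: 1 = known meteorite stranding zone, 0 = not (a made-up rule standing in for field data)
y = ((X[:, 0] < -27.0) & (X[:, 1] < 2.0)).astype(int)

model = LogisticRegression().fit(X, y)

candidate = np.array([[-30.0, 1.2, 0.8]])   # an unexplored blue-ice area (hypothetical values)
print("probability of being a stranding zone:", model.predict_proba(candidate)[0, 1])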

Extraterrestrial heritage slipping away

Due to their dark colour, meteorites preferentially heat up with respect to the surrounding ice. As this heat transfers from the meteorites to the ice, it can warm up the ice, and eventually cause the ice to locally melt, leading to a sinking of meteorites underneath the surface of the ice sheet. Once the meteorites enter the ice sheet, even at shallow depths, they cannot be detected anymore, and they are thus lost for science.

As atmospheric temperatures increase, so does the surface temperature of the ice, intensifying the loss. “Even when temperatures of the ice are well below zero, the dark meteorites warm up so much in the sun that they can melt the ice directly beneath the meteorite. Through this process, the warm meteorite creates a local depression in the ice and over time fully disappears under the surface,” says Tollenaar.

Scientists conclude that in the long-term, the only way to preserve most of the remaining unrecovered Antarctic meteorites is to rapidly reduce greenhouse gas emissions.

Image: Antarctic meteorite (HUT 18036) partially in the ice, in contrast to most samples, which are collected while lying on the surface. Meteorite collected by the Lost Meteorites of Antarctica project. (Credit: Katherine Joy, The University of Manchester, The Lost Meteorites of Antarctica project)