Friday, May 12, 2023

Metal-filtering sponge removes lead from water

Reusable sponge can capture and recover critical metals and heavy-metal pollutants

Peer-Reviewed Publication

NORTHWESTERN UNIVERSITY

Lead-filtering sponge 

IMAGE: COMMERCIALLY AVAILABLE CELLULOSE SPONGE COATED IN MANGANESE-DOPED GOETHITE NANOPARTICLES

CREDIT: CAROLINE HARMS/NORTHWESTERN UNIVERSITY

Northwestern University engineers have developed a new sponge that can remove metals — including toxic heavy metals like lead and critical metals like cobalt — from contaminated water, leaving safe, drinkable water behind.

In proof-of-concept experiments, the researchers tested their new sponge on a highly contaminated sample of tap water, containing more than 1 part per million of lead. With one use, the sponge filtered lead to below detectable levels.

After using the sponge, researchers also were able to successfully recover metals and reuse the sponge for multiple cycles. The new sponge shows promise for future use as an inexpensive, easy-to-use tool in home water filters or large-scale environmental remediation efforts.

The study was published May 10 in the journal ACS ES&T Water. The paper outlines the new research and sets design rules for optimizing similar platforms for removing — and recovering — other heavy-metal toxins, including cadmium, arsenic, cobalt and chromium.

“The presence of heavy metals in the water supply is an enormous public health challenge for the entire globe,” said Northwestern’s Vinayak Dravid, senior author of the study. “It is a gigaton problem that requires solutions that can be deployed easily, effectively and inexpensively. That’s where our sponge comes in. It can remove the pollution and then be used again and again.”

Dravid is the Abraham Harris Professor of Materials Science and Engineering at Northwestern’s McCormick School of Engineering and director of global initiatives at the International Institute for Nanotechnology.

Sopping up spills

The project builds on Dravid’s previous work to develop highly porous sponges for various aspects of environmental remediation. In May 2020, his team unveiled a new sponge designed to clean up oil spills. The nanoparticle-coated sponge, which is now being commercialized by Northwestern spinoff MFNS Tech, offers a more efficient, economic, ecofriendly and reusable alternative to current approaches to oil spills.

But Dravid knew it wasn’t enough.

“When there is an oil spill, you can remove the oil,” he said. “But there also are toxic heavy metals — like mercury, cadmium, sulfur and lead — in those spills. So, even when you remove the oil, some of the other toxins might remain.”

Rinse and repeat

To tackle this aspect of the issue, Dravid’s team again turned to sponges coated with an ultrathin layer of nanoparticles. After testing many different types of nanoparticles, the team found that a manganese-doped goethite coating worked best. Not only are manganese-doped goethite nanoparticles inexpensive to make, easily available and nontoxic to humans, they also have the properties necessary to selectively remediate heavy metals.

“You want a material with a high surface area, so there’s more room for the lead ions to stick to it,” said Benjamin Shindel, a Ph.D. student in Dravid’s lab and the paper’s first author. “These nanoparticles have high surface areas and abundant reactive surface sites for adsorption and are stable, so they can be reused many times.”

The team synthesized slurries of manganese-doped goethite nanoparticles, as well as several other compositions of nanoparticles, and coated commercially available cellulose sponges with these slurries. Then, they rinsed the coated sponges with water in order to wash away any loose particles. The final coatings measured just tens of nanometers in thickness.

When submerged in contaminated water, the nanoparticle-coated sponge effectively sequestered lead ions. The U.S. Food and Drug Administration requires that bottled drinking water contain less than 5 parts per billion of lead. In filtration trials, the sponge lowered the amount of lead to approximately 2 parts per billion, making the water safe to drink.

“We’re really happy with that,” Shindel said. “Of course, this performance can vary based on several factors. For instance, if you have a large sponge in a tiny volume of water, it will perform better than a tiny sponge in a huge lake.”
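As a rough illustration of the figures reported above, the arithmetic below computes the removal efficiency implied by a >1 part-per-million starting concentration and the ~2 parts-per-billion result, and checks it against the FDA bottled-water limit. The exact input values are taken from this release and used only for illustration:

```python
# Illustrative arithmetic based on the figures reported in this release.
# Concentrations in parts per billion (ppb); 1 ppm = 1000 ppb.
initial_lead_ppb = 1000.0   # ">1 part per million" starting sample
final_lead_ppb = 2.0        # reported post-filtration level
fda_limit_ppb = 5.0         # FDA limit for lead in bottled drinking water

removal_pct = 100.0 * (initial_lead_ppb - final_lead_ppb) / initial_lead_ppb
meets_fda = final_lead_ppb < fda_limit_ppb

print(f"Removal efficiency: {removal_pct:.1f}%")   # 99.8%
print(f"Below FDA limit: {meets_fda}")             # True
```

Since the starting sample held "more than" 1 ppm, the true removal efficiency would be at least this high; as Shindel notes, performance also scales with the ratio of sponge to water volume.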

Recovery bypasses mining

From there, the team rinsed the sponge with mildly acidified water, which Shindel likened to “having the same acidity of lemonade.” The acidic solution caused the sponge to release the lead ions, readying it for another use. Although the sponge’s performance declined after the first use, it still recovered more than 90% of the ions during subsequent use cycles.

This ability to gather and then recover heavy metals is particularly valuable for removing rare, critical metals, such as cobalt, from water sources. A common ingredient in lithium-ion batteries, cobalt is energetically expensive to mine and accompanied by a laundry list of environmental and human costs.

If researchers could develop a sponge that selectively removes rare metals, including cobalt, from water, then those metals could be recycled into products like batteries.

“For renewable energy technologies, like batteries and fuel cells, there is a need for metal recovery,” Dravid said. “Otherwise, there is not enough cobalt in the world for the growing number of batteries. We must find ways to recover metals from very dilute solutions. Otherwise, it becomes poisonous and toxic, just sitting there in the water. We might as well make something valuable with it.”

Standardized scale

As a part of the study, Dravid and his team set new design rules to help others develop tools to target particular metals, including cobalt. Specifically, they pinpointed which low-cost and nontoxic nanoparticles also have high surface areas and affinities for sticking to metal ions. They studied the performance of coatings of manganese, iron, aluminum and zinc oxides on lead adsorption. Then, they established relationships between the structures of these nanoparticles and their adsorptive properties.

Called Nanomaterial Sponge Coatings for Heavy Metals (or “Nano-SCHeMe”), the environmental remediation platform can help other researchers differentiate which nanomaterials are best suited for particular applications.

“I’ve read a lot of literature that compares different coatings and adsorbents,” said Caroline Harms, an undergraduate student in Dravid’s lab and paper co-author. “There really is a lack of standardization in the field. By analyzing different types of nanoparticles, we developed a comparative scale that actually works for all of them. It could have a lot of implications in moving the field forward.”

Dravid and his team imagine that their sponge could be used in commercial water filters, for environmental clean-up or as an added step in water reclamation and treatment facilities.

“This work may be pertinent to water quality issues both locally and globally,” Shindel said. “We want to see this out in the world, where it can make a real impact.”

The study, “Nano-SCHeMe: Nanomaterial Sponge Coatings for Heavy Metals, an environmental remediation platform,” was supported by the National Science Foundation and U.S. Department of Energy.

Editor’s note: Dravid and Northwestern have financial interests (equities, royalties) in MFNS Tech.

 

Improved microphysics modeling of clouds


Important physics happens at the microscopic level in clouds, and enhanced modeling will improve future weather, climate and solar energy forecasting

Peer-Reviewed Publication

INSTITUTE OF ATMOSPHERIC PHYSICS, CHINESE ACADEMY OF SCIENCES

Cover 

IMAGE: THE REVIEW IS PUBLISHED IN THE MAY ISSUE OF ADVANCES IN ATMOSPHERIC SCIENCE.

CREDIT: ADVANCES IN ATMOSPHERIC SCIENCE

Clouds are made up of individual, microscopic spheres of water, or hydrometeors, that change and interact with one another based on environmental conditions and the characteristics of the hydrometeor population, such as size and water phase: liquid, ice or vapor.  Improved modeling of the effects of environmental conditions on hydrometeor populations can enhance short- and longer-term weather forecasting and optimize solar energy capture.  This review highlights understudied, yet promising areas of cloud microphysics and modeling and outlines the future challenges and directions of this emerging field.

The microscopic hydrometeors that make up clouds vary based on their size and water phase. Processes such as condensation, along with less obvious processes like turbulence, radiative effects, lightning and chemical processes, interact with one another to regulate the hydrometeor population and the characteristics of a cloud at the microscopic level. Modeling the effects of these processes on hydrometeors, both individually and collectively, is immensely challenging. A team of leading atmospheric scientists, including researchers from Brookhaven National Laboratory, USA; McGill University, Canada; Nanjing University of Information Science and Technology, China; and the National Center for Atmospheric Research, USA, identified areas where more research is required to improve modeling equations, as well as challenges that will need to be overcome to improve weather forecasting, climate prediction and solar energy harvesting to optimize the use of renewable energy sources.

 

The research team published their review in the May issue of Advances in Atmospheric Science.  

“We wanted to review some key aspects of explicit modeling and representation in various computer models of cloud microphysics, identify outstanding challenges, and discuss future research directions,” said Yangang Liu, first author of the team’s review and senior scientist in the Environmental and Climate Sciences Department at Brookhaven National Laboratory in Upton, New York, USA.  “This review integrates different topics on which each of the coauthors have expertise into a cohesive unification, including different approaches for developing bulk microphysics parameterizations (approximate representations), … explicit modeling, and theoretical formulations. We believe that such an integrated approach is crucial for further advancing the field,” said Liu.

 

“Cloud microphysics and their representation is becoming increasingly important as the resolutions of numerical weather prediction and climate models improve. Furthermore, significant knowledge gaps remain and need to be addressed in our physical understanding of cloud microphysical processes [such as] turbulence-microphysics interactions,” said Liu. Relatively little is known about the effects of small-scale turbulence on hydrometeors and other processes in the cloud. Turbulence-related processes, in particular, have not been included in most atmospheric modeling, despite the important role turbulence plays in the microphysics of clouds. Additional research in this area could significantly improve future modeling.

 

Liu and his colleagues identified several other challenges that, once overcome, will greatly advance the field’s understanding of cloud processes and improve prediction.  For example, comparing the different cloud microphysics modeling strategies to one another and understanding how and why they differ could enhance the accuracy of each platform.   Additionally, seeking effective evaluation and integration of models and observations at different scales is needed to address cloud-related problems.

 

Increased computational power and use of artificial intelligence will also present new tools for researchers to improve the modeling of cloud microphysics in the future. Direct numerical simulations (DNSs), in particular, require a great deal of computing power to resolve individual hydrometeor particles, but have already significantly advanced modeling of warm cloud processes. “We plan to further advance the specific areas each of us [has] been focusing on as well as go beyond our comfort zones to seek effective integration by capitalizing on advances in other disciplines such as computational technologies, machine learning and complex systems science, including development of bulk microphysics parameterizations, explicit modeling, and theoretical formulations,” said Liu.

Other contributors include Man-Kong Yau from the Department of Atmospheric Sciences at McGill University in Montreal, Canada; Shin-ichiro Shima from the Graduate School of Information Science at the University of Hyogo in Kobe, Japan; Chunsong Lu from the College of Atmospheric Physics at the Nanjing University of Information Science and Technology in Nanjing, China; and Sisi Chen from the National Center for Atmospheric Research in Boulder, Colorado, USA.

Great Basin: History of water supply in one of the driest regions in the USA

An international team including Simon Steidle from the Quaternary Research Group at the Department of Geology at the University of Innsbruck has reconstructed the evolution of groundwater in the Great Basin, USA, up to 350,000 years into the past.

Peer-Reviewed Publication

UNIVERSITY OF INNSBRUCK

Devils Hole and its Climate History 

IMAGE: THE INNSBRUCK GEOLOGY TEAM HAS BEEN CARRYING OUT INVESTIGATIONS IN DEVILS HOLE FOR YEARS, HERE DESCENDING INTO THE CAVE TO TAKE DRILL CORES UNDER WATER.

CREDIT: ROBBIE SHONE

The team led by Christoph Spötl has been investigating the famous "Devils Hole" cave system in Nevada since 2010, during spectacular expeditions. Using the calcite deposits in the cave, the researchers have already reconstructed the development of the water level in the cave back several hundred thousand years. In the current study, this information has now been combined with a numerical groundwater model for this arid region. "Based on our extensive sampling in Devils Hole, we have a large amount of data that provides information on the evolution of the water table. By combining this with groundwater models from the US Geological Survey, we can now draw quantitative conclusions about changes in precipitation for the entire region over the last 350,000 years using the precise data from the cave," explains the geologist Simon Steidle. In drylands like the southwest of the USA, precipitation is particularly important and groundwater data is a mirror of changes in the hydroclimate. "The results can be useful for developing water management strategies and sustainable use of groundwater resources, such as how much water can be withdrawn for agricultural purposes."

Drought increases sensitivity

The new data suggest that the elevation of the water table in Devils Hole was three to four times more sensitive to groundwater recharge during dry climates than during wetter climates of the past. "Given that drought conditions will likely increase even more in the future due to the ongoing climate crisis, our results highlight the vulnerability of large aquifers, and thus the alteration of the most important freshwater resource in this area of the United States," Steidle said. The minimum groundwater level in Devils Hole during the peak of the interglacial warm periods was no more than 1.6 metres below today's level, which corresponds to a decrease in groundwater recharge of less than 17% compared to today's conditions. During the glacial periods, however, the level was at least 9.5 metres above today's level, which means an increase in groundwater recharge of almost 250% compared to today's conditions.
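A back-of-envelope check of the sensitivity claim can be made from the figures quoted above (1.6 m of water-table change for a <17% recharge decrease in warm periods, versus 9.5 m for an almost 250% increase in glacials). The calculation below is purely illustrative; because the quoted percentages are bounds rather than exact values, it yields a lower-bound ratio broadly consistent with the stated three-to-four-fold difference:

```python
# Rough sensitivity check using only the figures quoted in this release.
# "Sensitivity" here: metres of water-table change per percent change in recharge.
dry_change_m, dry_recharge_pct = 1.6, 17.0     # interglacial (dry) case, bounds
wet_change_m, wet_recharge_pct = 9.5, 250.0    # glacial (wet) case, bounds

dry_sensitivity = dry_change_m / dry_recharge_pct   # ~0.094 m per %
wet_sensitivity = wet_change_m / wet_recharge_pct   # ~0.038 m per %

ratio = dry_sensitivity / wet_sensitivity
print(f"Dry/wet sensitivity ratio: at least ~{ratio:.1f}x")
```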
The new information is relevant especially for the already highly endangered Devils Hole pupfish, a fish measuring just a few centimetres whose only habitat is the water in Devils Hole. The habitat of this species is thus the smallest of all known vertebrates (about half the size of an average classroom). Even small changes in water availability triggered by abstraction of groundwater for irrigation purposes or by climate change are of utmost importance for its survival.

Using an underwater drill specially developed for the expedition to Devils Hole, the researchers took samples from the calcite deposits in the cave.

CREDIT

Robbie Shone

Publication:
A 350,000-year history of groundwater recharge in the southern Great Basin, USA: Tracie R. Jackson, Simon D. Steidle, Kathleen A. Wendt, Yuri Dublyansky, R. Lawrence Edwards & Christoph Spötl. Communications Earth & Environment, 4:98 (2023), https://doi.org/10.1038/s43247-023-00762-0

Culprit behind destruction of New York’s first dinosaur museum revealed

Peer-Reviewed Publication

UNIVERSITY OF BRISTOL

Fig 1 

IMAGE: BENJAMIN WATERHOUSE HAWKINS' CONCEPTUAL DRAWING OF THE PALEOZOIC MUSEUM.

CREDIT: ANNUAL REPORT OF THE BOARD OF COMMISSIONERS OF THE CENTRAL PARK (1858)

A new paper from the University of Bristol rewrites the history of the darkest, most bizarre event in the history of palaeontology.

In New York, in May of 1871, the partially built, life-size models of dinosaurs and other prehistoric creatures destined for a prestigious new museum in Central Park were totally destroyed in a violent act of malicious vandalism by a gang of thugs with sledgehammers. The shattered pieces were carted away and buried somewhere in the park, never to be seen again.

Until now, the heinous act had been attributed to former American politician William ‘Boss’ Tweed.

But now, a new paper from Ms Victoria Coules of Bristol’s Department of History of Art and Professor Michael Benton of Bristol’s School of Earth Sciences sheds new light on the incident and, contrary to previous accounts, identifies who was really behind the order and what drove him to such wanton destruction – Henry Hilton, the Treasurer and VP of Central Park.

“It’s all to do with the struggle for control of New York city in the years following the American Civil War (1861-1865),” said Ms Coules. “The city was at the centre of a power struggle – a battle for control of the city’s finances and lucrative building and development contracts.”

As the city grew, the iconic Central Park was taking shape. More than just a green space, it was to have other attractions, including the Paleozoic Museum. British sculptor Benjamin Waterhouse Hawkins – who had created the Crystal Palace Dinosaurs, the life-size models of prehistoric creatures in London – had travelled to America and was commissioned to build American versions of the models for the Paleozoic Museum.

But the notorious William "Boss" Tweed had taken command of the city and, in sweeping changes to the city's governance, put his own henchmen in charge of city departments - including Central Park. They cancelled the partially complete project in late 1870, and there the matter would have lain, but in May 1871 someone ordered a gang of workmen to destroy all of its partly finished contents.

Professor Benton explains: “Previous accounts of the incident had always reported that this was done under the personal instruction of “Boss” Tweed himself, for various motives from raging that the display would be blasphemous, to vengeance for a perceived criticism of him in a New York Times report of the project’s cancellation.”

“Reading these reports, something didn’t look right,” Ms Coules said. “At the time Tweed was fighting for his political life, already accused of corruption and financial wrong-doings, so why was he so involved in a museum project?” She added, “So we went back to the original sources and found that it wasn't Tweed - and the motive was not blasphemy or hurt vanity.”

The situation was complicated by two other projects in development at the same time in Central Park, the American Museum of Natural History (AMNH) and the Central Park Zoo. But, as Professor Benton explained, "drawing on the detailed annual reports and minutes of Central Park, along with reports in the New York Times, we can show that the real villain was one strange character by the name of Henry Hilton.”

Ms Coules adds: “Because all the primary sources are now available online, we could study them in detail – and we could show that the destruction was ordered in a meeting by the real culprit, Henry Hilton, the Treasurer and VP of Central Park – and it was carried out the day after this meeting.”

Hilton was already notorious for other eccentric decisions. When he noticed a bronze statue in the Park, he ordered it painted white, and when a whale skeleton was donated to the American Museum of Natural History, he had that painted white as well. Later in life, other ill-judged decisions included cheating a widow out of her inheritance, squandering a huge fortune, and trashing businesses and livelihoods along the way.

Professor Benton concluded: “This might seem like a local act of thuggery but correcting the record is hugely important in our understanding of the history of palaeontology. We show it wasn’t blasphemy, or an act of petty vengeance by William Tweed, but the act of a very strange individual who made equally bizarre decisions about how artefacts should be treated – painting statues or whale skeletons white and destroying the museum models. He can be seen as the villain of the piece but as a character, Hilton remains an enigmatic mystery.”

The article

‘The curious case of Central Park's dinosaurs: The destruction of Benjamin Waterhouse Hawkins' Paleozoic Museum revisited’ by Victoria Coules and Michael J. Benton. Proceedings of the Geologists’ Association, published online.

Benjamin Waterhouse Hawkins' studio at the Central Park Arsenal, with models of extinct animals.

CREDIT

Published in The 12th Annual Report of the Board of Commissioners of the Central Park for the Year Ending December 31, 1868

Scientists discover fire records embedded within sand dunes

The discovery could expand scientific understanding of fire histories to arid regions around the world

Peer-Reviewed Publication

DESERT RESEARCH INSTITUTE

Fire in the dunes 

IMAGE: THE COOROIBAH WILDFIRE SWEEPS DOWN THE COOLOOLA SAND DUNES IN AUSTRALIA.

CREDIT: PHOTO BY MICHAEL FORD

Knowing how the frequency and intensity of wildfires has changed over time offers scientists a glimpse into Earth’s past landscapes, as well as an understanding of future climate change impacts. To reconstruct fire records, researchers rely heavily on sediment records from lake beds, but this means that fire histories from arid regions are often overlooked. Now, a new study shows that sand dunes can serve as repositories of fire history and aid in expanding scientific understanding of fire regimes around the world.

Published May 11 in Quaternary Research, the study is the first to examine sedimentary records preserved in foot-slope deposits of sand dunes. The research team, led by Nicholas Patton, Ph.D., a postdoctoral researcher now at DRI, studied four sand dunes at the Cooloola Sand Mass in Australia. Australia is one of the world’s most fire-prone landscapes, with a long history of both natural and cultural burning, and vast expanses without lakes or ponds to gather sedimentary records from. The researchers aimed to prove that these sand dune deposits could be used to reconstruct reliable, multi-millennial fire histories. These previously unrecognized archives could potentially be used in arid regions around the world to fill knowledge gaps in places where fire shapes the landscape.

“Many fire and paleoclimate records are located where there's a lot of water bodies such as lakes, peats, and bogs,” Patton says. “And because of this, most global models really have a bias towards temperate regions.”

The Cooloola Sand Mass consists of enormous – up to 240-meter-tall – sand dunes that build up at the coast and gradually shift inland from the power of the wind. By identifying the age of the dunes using a technique called optically stimulated luminescence dating, or OSL, Patton’s team found that the four dunes span the Holocene, representing the last approximately 12,000 years. 

Once a dune is stable, meaning it is no longer growing but slowly degrading, the force of gravity acts on the dune slopes to collect falling sand at the base, along with the remnants of charcoal from local fires that deposited on the dune’s surface. This sediment builds up over time, layering charcoal from fire events that can be reliably identified using radiocarbon dating.

“We were digging soil pits at the base of the dunes and were seeing a lot of charcoal – more charcoal than we expected,” says Patton. “And we thought maybe we could utilize these deposits to reconstruct local fires within the area.”    

Patton found that on the younger dunes (at 500 years old and 2,000 years old), charcoal layers represented individual fires, because the steep slope of the dunes quickly buried each layer. However, the older dunes (at 5,000 years old and 10,000 years old) had more gradual slopes that blended charcoal from different fires over time, providing a better understanding of periods of increased or decreased fire frequency.

The dunes offered localized fire histories from within an approximate 100-meter radius, so fire records vary somewhat amongst the four dunes, which spanned approximately 2 kilometers. However, Patton’s team compared their results to other fire records from the region found in lake and swamp deposits. Similar to the regional records, their findings showed three major periods of fire activity over the past 7,000 years.

The researchers write that similar records are likely held in sand dunes around the world, and that regions like California and the Southwest U.S. could benefit from a better understanding of regional fire history. Embedded within the fire records is not only information about natural wildfires, but also the way that humans influenced fire regimes.

“Fire histories are important for understanding how fire was used in the past for cultural purposes, whether that was to clear fields for agriculture or for hunting,” Patton says.

Patton hopes to continue this line of research at other dunes near the Cooloola Sand Mass that are nearly 1 million years old to obtain a long-term fire history for the region. Because Australia has had human communities for at least 60-70 thousand years, and quite possibly longer, these records could help understand the relationship between humans and historical fire regimes.

“These kinds of long-term records aren't always available within lake sediments, but they might be available within these dune deposits,” Patton says. “That's pretty exciting.”

###

 

More information: The full study, Reconstructing Holocene fire records using dune foot-slope deposits at the Cooloola Sand Mass, Australia, is available from Quaternary Research.
DOI: https://doi.org/10.1017/qua.2023.14

Study authors include: Nicholas Patton (DRI/Univ. of Canterbury, NZ/Univ. of Queensland, AUS), James Shulmeister (Univ. of Canterbury, NZ/Univ. of Queensland, AUS), Quan Hua (Australian Nuclear Science and Technology Organization), Peter Almond (Lincoln University, NZ), Tammy Rittenour (Utah State Univ.), Johanna Hanson (Univ. of Canterbury, NZ), Aloysius Grealy (Univ. of Queensland, AUS), Jack Gilroy (Univ. of Queensland, AUS), Daniel Ellerton (Univ. of Queensland, AUS/Stockholm Univ.)

About DRI

The Desert Research Institute (DRI) is a recognized world leader in basic and applied environmental research. Committed to scientific excellence and integrity, DRI faculty, students who work alongside them, and staff have developed scientific knowledge and innovative technologies in research projects around the globe. Since 1959, DRI’s research has advanced scientific knowledge on topics ranging from humans’ impact on the environment to the environment’s impact on humans. DRI’s impactful science and inspiring solutions support Nevada’s diverse economy, provide science-based educational opportunities, and inform policymakers, business leaders, and community members. With campuses in Las Vegas and Reno, DRI serves as the non-profit research arm of the Nevada System of Higher Education. For more information, please visit www.dri.edu.

 

Scientists find fire records inside sand dunes

Peer-Reviewed Publication

UNIVERSITY OF QUEENSLAND

Fire 

IMAGE: FRASER ISLAND FIRE ON K’GARI IN 2020 WHICH BURNED OVER 50% OF THE ISLAND.

CREDIT: QUEENSLAND FIRE AND EMERGENCY SERVICES

A previously unrecognised sedimentary archive in sand dunes could unlock a repository of fire records, a discovery that could expand fire histories across the globe.

The research, conducted by Dr Nicholas Patton during his PhD at The University of Queensland, has solved a persistent problem facing historians investigating changing fire patterns.

“Knowing how the frequency and intensity of wildfires has changed over time offers scientists a glimpse into Earth’s past landscapes, as well as an understanding of future climate change impacts,” Dr Patton said.

“To reconstruct fire records, researchers usually rely heavily on sediment records from lake beds, but this means that fire histories from dryland regions are often overlooked.”

“We’ve now shown that sand dunes can serve as repositories of fire history and aid in expanding scientific understanding of fire regimes around the world.”

The study is the first to systematically examine sedimentary records preserved in foot-slope deposits of sand dunes – specifically, four sand dunes at the Cooloola Sand Mass in Queensland, looking at approximately 12,000 years of history.

The researchers aimed to prove that these sand dune deposits could be used to reconstruct reliable, multi-millennial fire histories.

“The Cooloola Sand Mass consists of large sand dunes that were created off the coast and moved inland from the power of the wind,” Dr Patton said.

“We were digging soil pits at the base of the dunes and were seeing a lot of charcoal – more charcoal than we expected.

“And we thought maybe we could utilise these deposits to reconstruct local fires within the area.”

They were correct, and were delighted to see that their dune-based fire history findings successfully matched other fire records from the region found in lake and swamp deposits.

“We found that on the younger dunes – at 500 years old and 2,000 years old – charcoal layers represented individual fires, because the steep slope of the dunes quickly buried each layer,” Dr Patton said.

“However, the older dunes – at 5,000 years old and 10,000 years old – had more gradual slopes that blended charcoal from different fires over time, providing a better understanding of periods of increased or decreased fire frequency.”

The dunes offered localised fire histories from within an approximate 100-metre radius, so fire records varied somewhat amongst the four dunes, which spanned approximately two kilometres.

“Similar records are likely held in sand dunes around the world, and regions like California and the Southwest U.S. could benefit from a better understanding of regional fire history,” Dr Patton said.

“Embedded within the fire records is not only information about natural wildfires, but also the way that humans influenced fire regimes.

“Fire histories are important for understanding how fire was used in the past for cultural purposes, whether that was to clear fields for agriculture or for hunting.

“These records have the potential to unlock the role climate and/or Indigenous peoples had on the landscape from regions where they are rare or absent.

“It would be exciting to see this work extended into the Kimberley and the dune areas along the northern Australian coast where humans have lived for tens of thousands of years.”

The research was conducted between UQ, The University of Canterbury and the Desert Research Institute.

It is published in Quaternary Research.


Diagram of sedimentary records preserved in foot-slope deposits at the base of sand dunes.

CREDIT

Dr Nicholas Patton