Wednesday, November 13, 2024

Machine learning predicts highest-risk groundwater sites to improve water quality monitoring




North Carolina State University




An interdisciplinary team of researchers has developed a machine learning framework that uses limited water quality samples to predict which inorganic pollutants are likely to be present in a groundwater supply. The new tool allows regulators and public health authorities to prioritize specific aquifers for water quality testing.

This proof-of-concept work focused on Arizona and North Carolina but could be applied to fill critical data gaps in groundwater quality in any region.

Groundwater is a source of drinking water for millions and often contains pollutants that pose health risks. However, many regions lack complete groundwater quality datasets.

“Monitoring water quality is time-consuming and expensive, and the more pollutants you test for, the more time-consuming and expensive it is,” says Yaroslava Yingling, co-corresponding author of a paper describing the work and Kobe Steel Distinguished Professor of Materials Science and Engineering at North Carolina State University.

“As a result, there is interest in identifying which groundwater supplies should be prioritized for testing, maximizing limited monitoring resources,” Yingling says. “We know that naturally occurring pollutants, such as arsenic or lead, tend to occur in conjunction with other specific elements due to geological and environmental factors. This posed an important data question: with limited water quality data for a groundwater supply, could we predict the presence and concentrations of other pollutants?”

“Along with identifying elements that pose a risk to human health, we also wanted to see if we could predict the presence of other elements – such as phosphorus – which can be beneficial in agricultural contexts but may pose environmental risks in other settings,” says Alexey Gulyuk, a co-first author of the paper and a teaching professor of materials science and engineering at NC State.

To address this challenge, the researchers drew on a huge data set, encompassing more than 140 years of water quality monitoring data for groundwater in the states of North Carolina and Arizona. Altogether, the data set included more than 20 million data points, covering more than 50 water quality parameters.

“We used this data set to ‘train’ a machine learning model to predict which elements would be present based on the available water quality data,” says Akhlak Ul Mahmood, co-first author of this work and a former Ph.D. student at NC State. “In other words, if we only have data on a handful of parameters, the program could still predict which inorganic pollutants were likely to be in the water, as well as how abundant those pollutants are likely to be.”
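
A minimal sketch of this kind of multivariate prediction is shown below, assuming a tabular dataset of per-site water quality parameters with gaps; the column names, toy values, and the scikit-learn iterative imputer are illustrative assumptions rather than the authors' actual pipeline.

```python
# Hedged sketch: predict missing inorganic-pollutant concentrations from the
# water quality parameters that were actually measured at each site.
# Column names, values, and the model choice are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor

# Each row is a sampled site; NaN marks parameters that were never tested.
samples = pd.DataFrame({
    "pH":         [7.2, 6.8, 7.9, np.nan, 7.1, 6.5],
    "sulfate":    [120, 85, np.nan, 240, 60, np.nan],            # mg/L
    "arsenic":    [0.004, np.nan, 0.012, np.nan, 0.002, 0.020],  # mg/L
    "phosphorus": [np.nan, 0.06, np.nan, 0.31, 0.04, np.nan],    # mg/L
})

# Iteratively model each parameter as a function of the others and fill the
# gaps with the model's predictions.
imputer = IterativeImputer(
    estimator=RandomForestRegressor(n_estimators=200, random_state=0),
    max_iter=10,
    random_state=0,
)
completed = pd.DataFrame(imputer.fit_transform(samples), columns=samples.columns)
print(completed.round(3))
```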

One key finding of the study is that the model suggests pollutants are exceeding drinking water standards in more groundwater sources than previously documented. While actual data from the field indicated that 75-80% of sampled locations were within safe limits, the machine learning framework predicts that only 15% to 55% of the sites may truly be risk-free.
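
A site could then be flagged for follow-up sampling by comparing predicted concentrations against drinking water limits. The sketch below uses the U.S. EPA arsenic maximum contaminant level and lead action level as screening thresholds; treating them this way, along with the toy concentrations, is an illustrative assumption, not the study's criterion.

```python
# Hedged sketch: flag sites whose predicted concentrations exceed drinking
# water limits. The limits are the U.S. EPA arsenic MCL and lead action level
# (mg/L); using them as screening thresholds here is illustrative only.
import pandas as pd

LIMITS = {"arsenic": 0.010, "lead": 0.015}  # mg/L

predicted = pd.DataFrame(
    {"arsenic": [0.004, 0.019, 0.008], "lead": [0.002, 0.001, 0.022]},
    index=["site_A", "site_B", "site_C"],
)

# Prioritize a site for follow-up sampling if any predicted pollutant
# concentration exceeds its limit.
exceeds = predicted.ge(pd.Series(LIMITS))
priority_sites = predicted.index[exceeds.any(axis=1)]
print(list(priority_sites))  # -> ['site_B', 'site_C']
```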

“As a result, we’ve identified quite a few groundwater sites that should be prioritized for additional testing,” says Minhazul Islam, co-first author of the paper and a Ph.D. student at Arizona State University. “By identifying potential ‘hot spots,’ state agencies and municipalities can strategically allocate resources to high-risk areas, ensuring more targeted sampling and effective water treatment solutions.”

“It’s extremely promising and we think it works well,” Gulyuk says. “However, the real test will be when we begin using the model in the real world and seeing if the prediction accuracy holds up.”

Moving forward, researchers plan to enhance the model by expanding its training data across diverse U.S. regions; integrating new data sources, such as environmental data layers, to address emerging contaminants; and conducting real-world testing to ensure robust, targeted groundwater safety measures worldwide.

“We see tremendous potential in this approach,” says Paul Westerhoff, co-corresponding author and Regents’ Professor in the School of Sustainable Engineering and the Built Environment at ASU. “By continuously improving its accuracy and expanding its reach, we’re laying the groundwork for proactive water safety measures across the globe.”

“This model also offers a promising tool for tracking phosphorus levels in groundwater, helping us identify and address potential contamination risks more efficiently,” says Jacob Jones, director of the National Science Foundation-funded Science and Technologies for Phosphorus Sustainability (STEPS) Center at NC State, which helped fund this work. “Looking ahead, extending this model to support broader phosphorus sustainability could have a significant impact, enabling us to manage this critical nutrient across various ecosystems and agricultural systems, ultimately fostering more sustainable practices.”

The paper, “Multiple Data Imputation Methods Advance Risk Analysis and Treatability of Co-occurring Inorganic Chemicals in Groundwater,” is published open access in the journal Environmental Science & Technology. The paper was co-authored by Emily Briese and Mohit Malu, both Ph.D. students at Arizona State; Carmen Velasco, a former postdoctoral researcher at Arizona State; Naushita Sharma, a postdoctoral researcher at Oak Ridge National Laboratory; and Andreas Spanias, a professor of digital signal processing at Arizona State.

This work was supported by the NSF STEPS Center and by the Metals and Metal Mixtures: Cognitive Aging, Remediation and Exposure Sources (MEMCARE) Superfund Research Center based at Harvard University, which is supported by the National Institute of Environmental Health Sciences under grant P42ES030990.

 

Multidrug-resistant bacterium posing a global public health threat is detected in Northeast Brazil



This particular strain of Klebsiella pneumoniae had previously been detected in the United States. The bacterium, which frequently causes infections in hospitals, is not eliminated by any existing antibiotic.



Fundação de Amparo à Pesquisa do Estado de São Paulo

Image: Antibiotic susceptibility test showing Klebsiella pneumoniae to be resistant to all antibiotics. Thin paper discs, each containing an antibiotic, were placed on an agar plate with the bacterium, which continued to grow in contact with all 15 antibiotics tested. (Credit: Rubens Renato Carmo/ICB-USP)




A strain of the bacterium Klebsiella pneumoniae isolated from an 86-year-old woman with a urinary infection admitted to hospital in Brazil’s Northeast region in 2022 proved resistant to all available antibiotics. The patient died 24 hours after being hospitalized.

A group of researchers supported by FAPESP sequenced the bacterium’s genome and compared it with a database of 48 similar sequences. Alarmingly, the results showed that the strain in question had previously been detected in the United States, was already circulating in Brazil, and could potentially spread around the world.

An article on the study is published in The Lancet Microbe.

“The bacterium is so versatile that it adapts to changes in treatment, easily acquiring resistance mechanisms not targeted by existing drugs either singly or in combination. It could become endemic to healthcare facilities worldwide,” said Nilton Lincopan, last author of the article and a professor at the University of São Paulo’s Biomedical Sciences Institute (ICB-USP) in Brazil.

Lincopan runs One Health Brazilian Resistance (OneBR), a database with epidemiological, phenotypical and genomic data on bacteria classified as “critical priority pathogens” by the World Health Organization (WHO).

This classification encompasses bacteria for which few therapeutic options are available, that merit containment measures to prevent dissemination, and that should be prioritized for the purposes of research and development of novel antimicrobials.

OneBR’s database currently holds 700 genomes of human and animal pathogens.

The OneBR platform is supported by FAPESP, the National Council for Scientific and Technological Development (CNPq, an arm of the Brazilian government) and the Bill & Melinda Gates Foundation (read more at: agencia.fapesp.br/38759). 

Health services are required to notify the local epidemiological surveillance authority when multidrug-resistant strains like this are detected. The patient must be isolated, and all health workers involved must take extra precautions to prevent transmission of the pathogen to other patients.

“As an opportunistic pathogen, the bacterium may not cause disease in patients with normal immunity, but in people with low immunity, it can cause severe infections. In the hospital environment, patients in intensive care units [ICUs] or being treated for other diseases can acquire a secondary infection such as pneumonia. With no treatment available and with a depressed immune system, they often die,” Lincopan said.

Favored by the pandemic

The authors note that a rapid increase in pan-beta-lactam-resistant K. pneumoniae co-producing carbapenemases was observed across Latin America and the Caribbean during the COVID-19 pandemic. Carbapenemases are enzymes that hydrolyze most antimicrobial compounds, making them ineffective. Beta-lactams are the most widely used class of antibiotics. 

The spread of these bacteria was reported to the Pan American Health Organization (PAHO) and WHO, which issued an epidemiological alert. 

A global genomic analysis published recently by a group led by Fábio Sellera, a professor at the Metropolitan University of Santos (UNIMES) in Brazil, also reported rapid growth in multidrug-resistant bacteria, stressing that the high prevalence of K. pneumoniae strains indicates a novel resistance trend and a serious public health threat.

An antibiotic combining a third-generation cephalosporin, ceftazidime, with a new beta-lactamase inhibitor, avibactam, was approved by the United States Food and Drug Administration (FDA) in 2015 and is indicated for treatment of K. pneumoniae carbapenemase (KPC)-producing bacteria.

Ceftazidime/avibactam was approved by ANVISA, the Brazilian equivalent of the FDA, in 2018 in view of the large number of reported KPC infections.

“Hospitalization of people with COVID-19 associated with secondary infections by this type of bacterium probably led to a global increase in the use of ceftazidime/avibactam, favoring the emergence of strains resistant to this new antibiotic,” Lincopan said.

The standard procedure for patients hospitalized with suspected bacterial infections would be to collect clinical material for confirmation of the diagnosis and testing of susceptibility to the different antimicrobials available.

“KPC-producing strains treated with ceftazidime/avibactam probably evolved rapidly, acquiring resistance to this latest therapeutic option. We now have carbapenemase co-producing strains that don’t respond to treatment with beta-lactams,” he said.

Besides the need for continuous monitoring of the pathogenic bacteria found in hospitals, the researchers also stress the importance of rational prescribing of antibiotics. The message for patients is that when antibiotics are prescribed, they must complete the full course of treatment even if they feel well after two or three days; this also helps prevent the emergence of more drug-resistant strains.

The first author of the article was Felipe Vásquez Ponce, a PhD candidate at ICB-USP. The study was also supported by a PhD scholarship awarded to Johana Becerra, a researcher at ICB-USP and a member of the OneBR team.

About São Paulo Research Foundation (FAPESP)

The São Paulo Research Foundation (FAPESP) is a public institution with the mission of supporting scientific research in all fields of knowledge by awarding scholarships, fellowships and grants to investigators linked with higher education and research institutions in the State of São Paulo, Brazil. FAPESP is aware that the very best research can only be done by working with the best researchers internationally. Therefore, it has established partnerships with funding agencies, higher education, private companies, and research organizations in other countries known for the quality of their research and has been encouraging scientists funded by its grants to further develop their international collaboration. You can learn more about FAPESP at www.fapesp.br/en and visit FAPESP news agency at www.agencia.fapesp.br/en to keep updated with the latest scientific breakthroughs FAPESP helps achieve through its many programs, awards and research centers. You may also subscribe to FAPESP news agency at http://agencia.fapesp.br/subscribe.

 

Novel machine learning techniques measure ocean oxygen loss more accurately




Georgia Institute of Technology
Image: Basin-scale oxygen inventory trends, with the global ocean divided into 10 basins. Blue lines and shading show the ensemble mean and range for ship-only reconstructions; red lines and shading show reconstructions using both ship and Argo float data. The research team divided up the task by working on different basins. (Credit: Georgia Institute of Technology)





Oxygen is essential for living organisms, particularly multicellular life, to metabolize organic matter and energize all life activities. About half of the oxygen we breathe comes from terrestrial plant life, such as forests and grasslands, while the other half is produced through photosynthesis by marine algae in the ocean's surface waters.

Oxygen concentrations are declining in many parts of the world’s oceans. Experts believe this drop is linked to the ocean’s surface warming and its impacts on the physics and chemistry of seawater, though the problem is not fully understood. Temperature plays a crucial role in determining how oxygen dissolves in seawater; as water warms, it loses its ability to hold gas.

“Calculating the amount of oxygen lost from the oceans is challenging due to limited historical measurements and inconsistent timing,” said Taka Ito, oceanographer and professor in the School of Earth and Atmospheric Sciences at Georgia Tech. “To understand global oxygen levels and their changes, we need to fill in many data gaps.”

A group of student researchers sought to address this issue. Led by Ito, the team developed a new machine learning-based approach to more accurately understand and represent the decline in global ocean oxygen levels. Combining historical shipboard measurements with data from autonomous Argo floats, the team also generated a monthly map of oxygen content that visualizes the ocean’s oxygen decline over several decades. Their research was published in the Journal of Geophysical Research: Machine Learning and Computation.

“Marine scientists need to understand the distribution of oxygen in the ocean, how much it's changing, where the changes are occurring, and why,” said Ahron Cervania, a Ph.D. student in Ito’s lab. “Statistical methods have long been used for these estimates, but machine learning techniques can improve the accuracy and resolution of our oxygen assessments.”

The project began three years ago with support from the National Science Foundation, and the team initially focused solely on Atlantic Ocean data to test the new method. They used a computational model to generate hypothetical observations, which allowed them to assess how well they could reconstruct missing oxygen level information using only a fraction of the data combined with machine learning. After developing this method, the team expanded to global ocean observations, involving undergraduate students and delegating tasks across different ocean basins.

Under Ito’s guidance, Cervania and other student researchers developed algorithms to analyze the relationships between oxygen content and variables like temperature, salinity, and pressure. They used a dataset of historical, ship-based oxygen observations since the 1960s and recent data from Argo floats — autonomous drifting devices that collect and measure temperature and salinity. Although oxygen data existed before the 1960s, earlier records have accuracy issues, so the team focused on data from the 1960s onward. They then created a global monthly map of ocean oxygen content from 1965 to the present.
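
A minimal sketch of the kind of regression described here, learning dissolved oxygen as a function of temperature, salinity, and pressure and then scoring it on held-out data; the random forest model and the synthetic measurements are illustrative assumptions, not the team's actual algorithm.

```python
# Hedged sketch: learn dissolved oxygen from temperature, salinity, and
# pressure, then evaluate on held-out data, mimicking prediction at locations
# where only those variables were measured. Synthetic data for illustration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
temperature = rng.uniform(-2, 30, n)   # degrees Celsius
salinity = rng.uniform(32, 37, n)      # practical salinity units
pressure = rng.uniform(0, 2000, n)     # decibars (roughly depth in meters)

# Toy relationship standing in for shipboard oxygen measurements:
# colder water holds more dissolved oxygen, with depth structure and noise.
oxygen = (350 - 8.0 * temperature + 2.0 * (salinity - 35)
          - 0.02 * pressure + rng.normal(0, 5, n))  # micromoles per kg

X = np.column_stack([temperature, salinity, pressure])
X_train, X_test, y_train, y_test = train_test_split(X, oxygen, random_state=0)

model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 3))
```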

“Using a machine learning approach, we were able to assess the rate of oxygen loss more precisely across different periods and locations,” Cervania said. “Our findings indicate that incorporating float data significantly enhances the estimate of oxygen loss while also reducing uncertainty.”

The team found that the world’s oceans have lost oxygen at a rate of about 0.7% per decade from 1970 to 2010. This estimate suggests a relatively rapid ocean response to recent climate change, with potential long-term impacts on marine ecosystems’ health and sustainability. Their estimate also falls within the range of decline suggested by other studies, indicating the accuracy and efficacy of their approach.

“We calculated trends in global oxygen levels and the ocean’s inventory, essentially looking at the rate of change over the last five decades,” Cervania said. “It’s encouraging to see that our rate aligns with previous estimates from other methods, which gives us confidence. We are building a robust estimate from both our study and other studies.”
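
As a rough illustration of how such a decadal rate can be expressed, the sketch below fits a linear trend to an annual oxygen inventory and converts the slope to percent per decade; the inventory values are synthetic stand-ins, not the study's reconstruction.

```python
# Hedged sketch: express a fitted linear trend in a global oxygen inventory
# as percent change per decade. Inventory values are synthetic stand-ins.
import numpy as np

years = np.arange(1970, 2011)
rng = np.random.default_rng(1)
# Toy inventory (arbitrary units) declining about 0.07% per year, plus noise.
inventory = 100 * (1 - 0.0007 * (years - 1970)) + rng.normal(0, 0.05, years.size)

slope, intercept = np.polyfit(years, inventory, 1)       # units per year
percent_per_decade = 10 * slope / inventory.mean() * 100
print(f"trend: {percent_per_decade:.2f}% per decade")    # roughly -0.7
```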

According to Ito, the team’s new approach addresses an ongoing challenge in the oceanographic community: how to effectively combine different data sources with varying accuracies and uncertainties to better understand ocean changes.  

“The integration of advanced technologies like machine learning will be essential in filling data gaps and providing a clearer picture of how our oceans are responding to climate change,” Ito said.

 

Citation: Ito, T., Cervania, A., Cross, K., Ainchwar, S., & Delawalla, S. (2024). Mapping dissolved oxygen concentrations by combining shipboard and Argo observations using machine learning algorithms. Journal of Geophysical Research: Machine Learning and Computation, 1, e2024JH000272. 

DOI: https://doi.org/10.1029/2024JH000272

Funding: National Science Foundation

 

#####

The Georgia Institute of Technology, or Georgia Tech, is one of the top public research universities in the U.S., developing leaders who advance technology and improve the human condition. The Institute offers business, computing, design, engineering, liberal arts, and sciences degrees. Its more than 45,000 undergraduate and graduate students, representing 50 states and more than 148 countries, study at the main campus in Atlanta, at campuses in France and China, and through distance and online learning. As a leading technological university, Georgia Tech is an engine of economic development for Georgia, the Southeast, and the nation, conducting more than $1 billion in research annually for government, industry, and society.

 

Low-cost method removes micro- and nanoplastics from water



The strategy developed at the University of São Paulo uses magnetic nanoparticles that bind to tiny plastic particles and permit their removal with the aid of a magnet



Fundação de Amparo à Pesquisa do Estado de São Paulo

Image: Summary of the purification process: water polluted by microplastics (PET); addition of magnetic nanoparticles functionalized with polydopamine and lipase; removal of the nanoparticles, with the microplastics attached, using a magnet. (Credit: Henrique Eisi Toma)




Researchers at the University of São Paulo (USP) in Brazil have developed a novel nanotechnology-based solution for the removal of micro- and nanoplastics from water. An article on the research, which was supported by FAPESP, is published in the journal Micron.

Tiny plastic particles are ubiquitous in the world today and may currently be one of the most important environmental problems, after the climate emergency and the accelerating extinction of species and ecosystems. Microplastics are in the soil, water and air, and in the bodies of animals and humans. They come from everyday consumer goods and from wear-and-tear on larger materials. They are found everywhere and in every kind of environment. A major source is the water used to wash clothes made of synthetic fibers. Microplastics currently cannot be filtered out of wastewater and eventually penetrate the soil, water table, rivers, oceans and atmosphere.

Defined as fragments of up to 1 millimeter, microplastics proper are a well-identified and visible problem. Nanoplastics, however, are a thousand times smaller and are proving an even more insidious hazard, since they can pass through key biological barriers and reach vital organs. A recent scientific study, for example, detected their presence in the human brain.

“Nanoparticles aren’t visible to the naked eye or detectable using conventional microscopes, so they’re very hard to identify and remove from water treatment systems,” said Henrique Eisi Toma, a professor at the Institute of Chemistry (IQ-USP) and last author of the Micron article.

The procedure developed at USP uses magnetic nanoparticles functionalized with polydopamine, a polymer derived from dopamine, a neurotransmitter present in the human organism. These nanoparticles can bind to micro- and nanoplastics waste, and the combined particles can then be removed from water via application of a magnetic field.

“Polydopamine is a substance that mimics the adhesive properties of mussels, which cling very tenaciously to many surfaces. It adheres firmly to fragments of plastic in water and enables the magnetic nanoparticles to capture them. This undesirable material can then be removed from the water with a magnet,” Toma said.

The process has already been proven effective for removing micro- and nanoplastics from water, especially in treatment systems. However, the research group also aims to degrade them using specific enzymes such as lipase, which can break down polyethylene terephthalate (PET) into its basic components. Application of the enzymes decomposes PET and other widely used plastics into smaller molecules, which can be reused to produce plastic materials. “Our goal isn’t just to remove plastic from water but also to contribute to its recycling in a sustainable manner,” Toma said.

PET is a raw material for plastic bottles and other items. It is a major pollutant, not least because its degradation produces terephthalic acid (C6H4(COOH)2) and ethylene glycol (C2H4(OH)2), both of which are toxic. “Lipase breaks down PET into these initial monomeric forms, which can be reused to synthesize new PETs. Our study focused on PET, but other researchers can include other specific enzymes to process different plastics, such as polyamide or nylon, for example,” he said.

In the study led by Toma, magnetic nanoparticles of iron (II, III) oxide, or black iron oxide (Fe₃O₄), were synthesized by co-precipitation and later coated with polydopamine (PDA) by partially oxidizing dopamine in a mildly alkaline solution to form Fe₃O₄@PDA. Lipase was immobilized on this substrate. Hyperspectral Raman microscopy was used to monitor sequestration and degradation of the plastic in real time.

Complex problem

The term “plastics” refers to a wide array of synthetic or semi-synthetic polymers, most of which are derived from fossil fuels. Their malleability, flexibility, light weight, durability and low cost have assured their presence in countless products used in everyday life. Concerns regarding the residues and waste produced by this highly intensive use have led to a search for alternatives, such as bioplastics. Instead of nonrenewable petrochemicals, bioplastics are derived from renewable and biodegradable sources.

“It’s a good idea, but before they fully degrade, bioplastics also fragment and form micro- or nanoplastics. Being biocompatible, they’re even more insidious because they can interact more directly with our organisms and trigger biological reactions,” Toma said.

Another troubling piece of information provided by Toma is that bottled mineral water may be even more contaminated by bioplastics than the treated potable water we consume in our homes. “Treated potable water undergoes processes such as filtration, coagulation and flotation to eliminate most residues, whereas mineral water, which is better in some ways – it’s lighter, contains more salts and tastes better, for example – isn’t processed in any of these ways because that would destroy its properties. If the environment from which it’s collected is contaminated by bioplastics, these particles will reach the consumer,” he said.

In sum, the challenge is daunting and there are no obvious answers. The nanotechnology presented by Toma and collaborators offers a promising solution to a problem whose full extent is only just starting to be understood. He urges other researchers to persist in the search for solutions and appeals to public administrators to take the problem seriously.


 

Protecting tax whistleblowers pays off



A New York law that shields and rewards whistleblowers led to less tax dodging from corporations



University of Texas at Austin





AUSTIN, Texas — The federal tax gap — money people and companies owe Uncle Sam but fail to pay on time — has climbed to historic highs: $696 billion in 2022, according to the IRS. It’s money that, if recouped, could fund infrastructure or education or pay down government debt.

One way to collect that money is through lawsuits prompted by corporate whistleblowers — often present or former employees who know a company’s finances and expose its transgressions.

Federal law includes protections and rewards for tax whistleblowers. Washington, D.C., also has such measures, but at the state level, only New York does. Proposals have failed to pass in at least four other states. Could these laws be a relatively cheap and easy way for states to narrow their tax gaps?

New research from Texas McCombs suggests that the answer is yes. Aruhn Venkat, assistant professor of accounting, found that a 2010 New York whistleblower law deterred businesses from avoiding state income tax. It brought in an extra 7.7% in state tax revenue, amounting to $281 million a year.

Weighed against costs of enforcement, the state’s return on investment exceeded 3,000%. “These whistleblower laws do work, and they’re reasonably inexpensive from a government perspective,” Venkat says.
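
As a back-of-the-envelope check on how those figures fit together, assuming the textbook definition ROI = (gain - cost) / cost, an extra $281 million a year at a return above 3,000% implies annual enforcement costs of roughly $9 million or less. The implied cost below is an inference for illustration, not a number reported by the study.

```python
# Hedged arithmetic: under the textbook definition ROI = (gain - cost) / cost,
# back out the enforcement cost implied by the reported figures. The result is
# an inference for illustration, not a number from the study.
extra_revenue = 281e6   # extra state tax revenue per year, from the article
roi_multiple = 30.0     # "exceeded 3,000%", expressed as a multiple

implied_cost = extra_revenue / (roi_multiple + 1)
print(f"implied annual enforcement cost: about ${implied_cost / 1e6:.1f} million or less")
```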

Prodding Companies To Pay

The 2010 amendment to New York’s False Claims Act (FCA) awards whistleblowers up to 30% of delinquent taxes recovered. It protects them from retaliation, and it also makes companies pay their attorneys’ fees.

In 2018, for example, Sprint paid $330 million to New York’s attorney general to settle a tax fraud case, with $63 million going to the whistleblower.

Did such examples scare other companies into dodging fewer taxes? With Yoojin Lee from California State University, Shaphan Ng from Singapore Management University, and Terry Shevlin from the University of California-Irvine, Venkat looked at hundreds of companies subject to New York’s FCA.

Using financial data for 2005-2015 — from five years before to five years after the law was passed — the researchers calculated the companies’ effective tax rates: the shares of income they paid in state taxes. They compared those rates with those for similar companies in neighboring states; a simplified sketch of that comparison appears after the list below. They found:

  • On average, companies increased their state effective tax rates 7.1% after the law’s passage, indicating lower tax avoidance.
  • Rates for the most aggressive tax avoiders increased 34.9%.
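
A minimal sketch of the before-and-after comparison described above: compute each firm's effective state tax rate and contrast New York-exposed firms with similar firms in neighboring states across the law's passage. The toy data and the simple difference-in-differences calculation are illustrative assumptions, not the authors' sample or estimation strategy.

```python
# Hedged sketch: compute effective state tax rates and a simple
# difference-in-differences contrast between New York-exposed firms and
# similar firms in neighboring states, before and after the 2010 law.
# Toy data; not the authors' sample or estimation strategy.
import pandas as pd

firms = pd.DataFrame({
    "firm":          ["A", "A", "B", "B", "C", "C", "D", "D"],
    "ny_exposed":    [1, 1, 1, 1, 0, 0, 0, 0],   # subject to NY's FCA or not
    "post_law":      [0, 1, 0, 1, 0, 1, 0, 1],   # before/after 2010
    "state_tax":     [4.0, 4.6, 3.0, 3.9, 3.5, 3.6, 2.8, 2.9],  # $ millions
    "pretax_income": [100, 100, 60, 65, 80, 82, 70, 70],        # $ millions
})

# Effective state tax rate: share of pre-tax income paid in state taxes.
firms["etr"] = firms["state_tax"] / firms["pretax_income"]

means = firms.groupby(["ny_exposed", "post_law"])["etr"].mean()
did = (means[1, 1] - means[1, 0]) - (means[0, 1] - means[0, 0])
print(f"difference-in-differences in effective tax rate: {did:.3%}")
```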

Companies paid more in taxes, in part, because they reduced their use of several avoidance strategies, such as shifting money overseas or moving it to separate companies called special purpose entities.

The law had a double punch, the researchers found. Both general knowledge of the law and publicity around actual whistleblower lawsuits had deterrent effects on companies.

“They’re complementary,” Venkat says. “Just the passage of the law will deter state tax avoidance. So, you don’t need only whistleblower events to deter firms.”

Risks and Rewards

The researchers found a few downsides to the law. Companies with minimal business in New York lowered their numbers of establishments in the state after the FCA. But those that already had large presences did not reduce them.

“There could be a cost here, with firms just leaving,” Venkat says. “But whatever is lost from a few firms leaving is probably made up by others reducing their tax avoidance.”

The law doesn’t always work out for whistleblowers, either, he adds. They run the risk that their claims may not be vindicated legally.

“It can still be a raw deal for employees who are whistleblowers, because they do end up being fired fairly often, even though that’s illegal,” he says. “There can be a cost for employees who do this.”

“The Deterrence Effects of Tax Whistleblower Laws: Evidence from New York’s False Claims Acts” is published in Management Science.

 

LSU researchers excavate earliest ancient Maya salt works


With funding from the National Science Foundation, a team of archaeologists from LSU and the University of Texas at Tyler has excavated the earliest known ancient Maya salt works in southern Belize, as reported in the journal Antiquity.



Louisiana State University

Image: An excavation grid is put in place to mark an area of high-density pottery on the sea floor. (Credit: E.C. Sills)




The team was led by LSU Alumni Professor Heather McKillop, who in 2004 first discovered wooden buildings preserved there below the sea floor, along with associated artifacts and the only known ancient Maya wooden canoe paddle.

Her key collaborator, Assistant Professor Elizabeth Sills at the University of Texas at Tyler, began working with McKillop as a master’s student and then as a doctoral student at LSU.

Since their initial discovery of wood below the sea floor in Belize, the team has uncovered an extensive pattern of sites that include “salt kitchens” for boiling seawater in pots over a fire to make salt, residences for salt workers, and the remains of other pole and thatch buildings.

All were remarkably well preserved in red mangrove peat in shallow coastal lagoons. Since 2004, the LSU research team has mapped as many as 70 underwater sites, with 4,042 wooden posts marking the outlines of ancient buildings.

In 2023, the team returned to Belize to excavate a site called Jay-yi Nah, which curiously lacked the broken pots so common at other salt works, although a few pottery sherds were found.

“These resembled sherds from the nearby island site of Wild Cane Cay, which I had previously excavated,” McKillop said. “So, I suggested to Sills that we survey Jay-yi Nah again for posts and sea floor artifacts.”

After their excavations, McKillop stayed in a nearby town to study the artifacts from Jay-yi Nah. As reported in Antiquity, the materials they found contrasted with those from other nearby underwater sites, which had imported pottery, obsidian, and high-quality chert, or flint.

“At first, this was perplexing,” McKillop said. “But a radiocarbon date on a post we’d found at Jay-yi Nah provided an Early Classic date, AD 250-600, and solved the mystery.”

Jay-yi Nah turned out to be much older than the other underwater sites. Through their findings, the researchers learned Jay-yi Nah had developed as a local enterprise, without the outside trade connections that developed later during the Late Classic period (AD 650-800), when the inland Maya population reached its peak with a high demand for salt—a basic biological necessity in short supply in the inland cities.

Jay-yi Nah had started as a small salt-making site, with ties to the nearby community on Wild Cane Cay that also made salt during the Early Classic period. Abundant fish bones preserved in anaerobic deposits at Wild Cane Cay suggest some salt was made there for salting fish for later consumption or trade.

 

Building a diverse wildland fire workforce to meet future challenges



Stanford report provides a blueprint for fostering a more inclusive, diverse and well-supported workforce to meet the increasing need for fire mitigation and management.



Stanford Woods Institute for the Environment





Every year around this time, California’s wildland firefighters hold their breath as hot, dry winds threaten to spread flames across the state. As such conflagrations grow in size and severity throughout the Western U.S., the strain on fire managers has intensified. A new report from Stanford University’s Climate and Energy Policy Program provides a blueprint for fostering a more inclusive, diverse and well-supported workforce to meet the increasing need for fire mitigation and management.

“The wellbeing of the wildland fire workforce has received national attention, yet recruitment and retention challenges specific to women, people of color, and other underrepresented groups in the field have too often been overlooked,” said report coauthor Abigail Varney, a wildland fire fellow at the Stanford Doerr School of Sustainability and a federal wildland firefighter. “Expanding workforce capacity and effectiveness will require increased investment in more equitably supporting all of our firefighters.”

The report addresses the cultural, structural, and capacity-related barriers that have historically prevented a more diverse group of people from entering and succeeding in the fire management field. Overcoming these obstacles will be critical to attracting and retaining a workforce capable of addressing the wildfire crisis, the authors write.

[A related webinar on Nov. 14 will present the paper’s results and feature state, federal, and international experts in a discussion about how to overcome persistent cultural and structural challenges that have impeded the development of a more diverse, inclusive, and equitable fire management workforce.]

Overcoming barriers to change

Despite a growing recognition of the benefits of workforce diversity, the wildland fire profession remains largely homogenous. According to the U.S. Government Accountability Office, 84% of the federal fire management workforce identifies as male, and 72% identifies as white. The Stanford report points to several factors contributing to this lack of diversity, including implicit bias, inequitable career advancement opportunities, and a workplace culture that has historically marginalized women and people of color.

The report recommends several strategies to increase diversity and inclusivity in fire management. Among them: outreach efforts aimed at underrepresented populations, the creation of more inclusive onboarding processes, and the development of resources to support the families of firefighters.

Improving health equity and workplace culture

Wildland firefighters face numerous physical and mental health challenges, including exposure to hazardous environments, long hours, and high levels of stress. However, women and people of color may experience these health threats differently because of their specific needs or vulnerabilities. Among the report’s recommendations for improving health equity: expanding insurance coverage for reproductive health services, including female-reproductive organ cancers in presumptive coverage legislation, and increasing mental health services that are gender- and culture-responsive.

Another crucial change, according to the report: promoting a workplace culture within fire management agencies that fosters an inclusive environment where all employees feel valued and supported. The authors recommend regular workplace culture assessments, as well as mandatory training programs to address implicit bias, harassment, and discrimination. “By taking the actions recommended in this report, lawmakers and fire management agencies will not just address barriers to recruitment and retention, but also help build a workforce that is well equipped and supported to meet the challenges ahead,” said report coauthor Cassandra Jurenci, a wildfire legal fellow at the Stanford Law School.Coauthors of the paper also include Michael Wara, director of the Climate and Energy Policy Program at the Stanford Woods Institute for the Environment and Michael Mastrandrea, research director of the program. Wara and Mastrandrea are senior director for policy and director for policy, respectively, in the Stanford Doerr School of Sustainability’s Sustainability Accelerator.