Thursday, August 28, 2025

 

Researchers use seaweed to manufacture raw materials for civil construction



The accumulation of these algae on beaches can harm health, tourism, fishing, and biodiversity. They are usually collected and disposed of in landfills, but a study by Brazilian researchers has found a use for the biomass: producing lightweight ceramic aggregates.



Fundação de Amparo à Pesquisa do Estado de São Paulo

Image: Large quantities of brown algae have been washing up on beaches in northern Brazil, the Caribbean, and the United States. Credit: João Adriano Rossignolo/FZEA-USP





Brazilian researchers have developed a ceramic clay that is lighter than the clay normally used in civil construction by adding algae of the genus Sargassum to the manufacturing process.

These brown algae, also known as sargassum, are common in the central Atlantic Ocean. However, they have been washing up in large quantities on beaches in the Caribbean, the United States, and northern Brazil, where they have become problematic. Their accumulation on beaches can harm human health due to the gases emitted during decomposition, as well as damage tourism, fishing, and local biodiversity.

“Normally, sargassum is collected and disposed of in landfills without any practical use. That’s why we decided to research a way to take advantage of this large amount of biomass on the beaches,” says João Adriano Rossignolo, a professor at the Department of Biosystems Engineering at the School of Animal Science and Food Engineering of the University of São Paulo (FZEA-USP) who coordinated the study.

In partnership with researchers from the Federal University of São Carlos (UFSCar), the USP group tested using the algae in the manufacture of ceramic clays, which are used in construction to reduce the weight of concrete, in slabs to improve thermal comfort, and in gardening. The research was supported by FAPESP.

Sargassum was incorporated into the samples at proportions of 20% and 40%, as well as 0% for comparison. The samples were molded and sintered – a process that compacts clay using heat to make it solid – at temperatures of 800 °C, 900 °C, and 1,000 °C in conventional and microwave ovens.

Once the materials were ready, several tests were performed to evaluate factors such as water absorption, porosity, and compressive strength. Additionally, a comparison was made between the life cycle of conventional expanded clay and the different formulations with added sargassum, a methodology that assesses the environmental impacts of a product from raw material extraction to final disposal.

The results, published in the Journal of Materials in Civil Engineering, indicate that adding sargassum reduced the apparent density of lightweight ceramic clay aggregates, particularly at a concentration of 40%. However, only materials sintered in a microwave oven met the strength requirements at all temperatures. In terms of life cycle, the versions with sargassum performed better environmentally than conventional expanded clay.

This led the researchers to conclude that lightweight ceramic clay aggregates with microwave-sintered sargassum are a viable alternative to mitigate the damage caused by the large amount of algae on the coast. This alternative reduces the consumption of natural resources and increases energy efficiency.

In addition to this work, the team evaluated the feasibility of using algae to produce particulate panels for the furniture and construction industries, as well as fiber cement tiles. They used sargassum ash as a substitute for limestone. “The results were surprising, as we were able to use 30% sargassum in the panels and replace 100% of the limestone with its ash, with results that fully comply with current standards for these products and improve the durability and mechanical properties of the materials,” says Rossignolo.

About São Paulo Research Foundation (FAPESP)
The São Paulo Research Foundation (FAPESP) is a public institution with the mission of supporting scientific research in all fields of knowledge by awarding scholarships, fellowships and grants to investigators linked with higher education and research institutions in the State of São Paulo, Brazil. FAPESP is aware that the very best research can only be done by working with the best researchers internationally. Therefore, it has established partnerships with funding agencies, higher education, private companies, and research organizations in other countries known for the quality of their research and has been encouraging scientists funded by its grants to further develop their international collaboration. You can learn more about FAPESP at www.fapesp.br/en and visit FAPESP news agency at www.agencia.fapesp.br/en to keep updated with the latest scientific breakthroughs FAPESP helps achieve through its many programs, awards and research centers. You may also subscribe to FAPESP news agency at http://agencia.fapesp.br/subscribe. 

 

FAU tells ‘story’ of Atlantic’s sargassum surge using 40 years of data





Florida Atlantic University

Image: Brian Lapointe, Ph.D., emerges from sargassum at Little Palm Island in June 2014. Credit: FAU Harbor Branch





Researchers at Florida Atlantic University’s Harbor Branch Oceanographic Institute have released a landmark review tracing four decades of changes in pelagic sargassum – free-floating brown seaweed that plays a vital role in the Atlantic Ocean ecosystem.

Once thought to be primarily confined to the nutrient-poor waters of the Sargasso Sea, sargassum is now recognized as a rapidly growing and widely distributed marine organism, whose expansion across the Atlantic is closely linked to both natural processes and human-induced nutrient enrichment.

The review, published in the journal Harmful Algae, sheds new light on the origins and development of the Great Atlantic Sargassum Belt (GASB), a massive recurring bloom of sargassum that stretches across the Atlantic Ocean from the coast of West Africa to the Gulf of America.

Since its first appearance in 2011, this belt has formed nearly every year – except in 2013 – and in May reached a new record biomass of 37.5 million tons. This does not include the baseline biomass of 7.3 million tons historically estimated in the Sargasso Sea.

By combining historical oceanographic observations, modern satellite imagery, and advanced biogeochemical analyses, this review provides a comprehensive framework for understanding the dramatic changes in sargassum distribution, productivity and nutrient dynamics. It also highlights the broader implications of anthropogenic nutrient enrichment on ocean ecology and the need for coordinated international efforts to monitor and manage the impacts of these massive seaweed blooms.

“Our review takes a deep dive into the changing story of sargassum – how it’s growing, what’s fueling that growth, and why we’re seeing such a dramatic increase in biomass across the North Atlantic,” said Brian Lapointe, Ph.D., lead author and a research professor at FAU Harbor Branch. “By examining shifts in its nutrient composition – particularly nitrogen, phosphorus and carbon – and how those elements vary over time and space, we’re beginning to understand the larger environmental forces at play.”

Early in the review, Lapointe and co-authors Deanna F. Webber, a research coordinator, and Rachel Brewton, Ph.D., an assistant research professor, both with FAU Harbor Branch, explain that early oceanographers charted the Sargasso Sea based on surface sightings of sargassum, believing the seaweed thrived in its warm, clear, but nutrient-poor waters. However, this notion created a paradox when mid-20th-century oceanographers described the region as a “biological desert.”

Recent satellite observations, ocean circulation models, and field studies have resolved this paradox by tracing the seasonal transport of sargassum from nutrient-rich coastal areas, particularly the western Gulf of America, to the open ocean via the Loop Current and Gulf Stream. These findings support early theories by explorers who proposed that Gulf-originating sargassum could feed populations in the Sargasso Sea.

Remote sensing technology played a pivotal role in these discoveries. In 2004 and 2005, satellites captured extensive sargassum windrows – long, narrow lines or bands of floating sargassum – in the western Gulf of America, a region experiencing increased nutrient loads from river systems such as the Mississippi and Atchafalaya.

“These nutrient-rich waters fueled high biomass events along the Gulf Coast, resulting in mass strandings, costly beach cleanups and even the emergency shutdown of a Florida nuclear power plant in 1991,” Lapointe said. “A major focus of our review is the elemental composition of sargassum tissue and how it has changed over time.”

Laboratory experiments and field research dating back to the 1980s confirmed that sargassum grows more quickly and is more productive in nutrient-enriched neritic waters than in the oligotrophic waters of the open ocean. Controlled studies revealed that the two primary species, Sargassum natans and Sargassum fluitans, can double their biomass in just 11 days under optimal conditions. These studies also established that phosphorus is often the primary limiting nutrient for growth, although nitrogen also plays a critical role.
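An 11-day doubling time pins down a specific exponential growth rate. As a quick illustrative calculation (not from the study itself):

```python
import math

# Exponential growth N(t) = N0 * exp(r * t).
# A doubling time of 11 days implies r = ln(2) / 11 per day.
doubling_days = 11
r = math.log(2) / doubling_days        # ~0.063 per day
factor_30_days = math.exp(r * 30)      # growth factor over one month
print(round(r, 3), round(factor_30_days, 1))  # 0.063 6.6
```

In other words, under sustained optimal conditions a patch could grow more than sixfold in a month, which helps explain how blooms reach tens of millions of tons.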

From the 1980s to the 2020s, the nitrogen content of sargassum increased by more than 50%, while phosphorus content decreased slightly, leading to a sharp rise in the nitrogen-to-phosphorus (N:P) ratio.

“These changes reflect a shift away from natural oceanic nutrient sources like upwelling and vertical mixing, and toward land-based inputs such as agricultural runoff, wastewater discharge and atmospheric deposition,” said Lapointe. “Carbon levels in sargassum also rose, contributing to changes in overall stoichiometry and further highlighting the impact of external nutrient loading on marine primary producers.”

The review also explores how nutrient recycling within sargassum windrows, including excretion by associated marine organisms and microbial breakdown of organic matter, can sustain growth in nutrient-poor environments. This micro-scale recycling is critical in maintaining sargassum populations in parts of the ocean that would otherwise not support high levels of productivity.

Data from sargassum collected near the Amazon River mouth support the hypothesis that nutrient outflows from this major river contribute significantly to the development of the GASB. Variations in sargassum biomass have been linked to flood and drought cycles in the Amazon basin, further connecting land-based nutrient inputs to the open ocean.

The formation of the GASB appears to have been seeded by an extreme atmospheric event – the negative phase of the North Atlantic Oscillation in 2009 to 2010, which may have helped shift surface waters and sargassum from the Sargasso Sea southward into the tropical Atlantic.

However, the researchers caution that there is no direct evidence of this movement. Moreover, genetic and morphological data suggest that some sargassum populations, particularly the dominant S. natans var. wingei, were already present in the tropical Atlantic prior to 2011, indicating that this region may have had an overlooked role in the early development of the GASB.

“The expansion of sargassum isn’t just an ecological curiosity – it has real impacts on coastal communities. The massive blooms can clog beaches, affect fisheries and tourism, and pose health risks,” said Lapointe. “Understanding why sargassum is growing so much is crucial for managing these impacts. Our review helps to connect the dots between land-based nutrient pollution, ocean circulation, and the unprecedented expansion of sargassum across an entire ocean basin.”

This work was funded by the Florida Department of Emergency Management, United States Environmental Protection Agency, South Florida Program Project, and the NOAA Monitoring and Event Response for Harmful Algal Blooms program. Historical studies included within the review were funded by the NASA Ocean Biology and Biogeochemistry Program and Ecological Forecast Program, NOAA RESTORE Science Program, National Science Foundation, “Save Our Seas” Specialty License Plate and discretionary funds, granted through the Harbor Branch Oceanographic Institute Foundation, and a Red Wright Fellowship from the Bermuda Biological Station.

- FAU -

Image: Sargassum on a beach in Palm Beach County in 2021. Credit: Brian Lapointe, FAU Harbor Branch


About Harbor Branch Oceanographic Institute:
Founded in 1971, Harbor Branch Oceanographic Institute at Florida Atlantic University is a research community of marine scientists, engineers, educators, and other professionals focused on Ocean Science for a Better World. The institute drives innovation in ocean engineering, at-sea operations, drug discovery and biotechnology from the oceans, coastal ecology and conservation, marine mammal research and conservation, aquaculture, ocean observing systems and marine education. For more information, visit www.fau.edu/hboi.

 

About Florida Atlantic University:

Florida Atlantic University serves more than 32,000 undergraduate and graduate students across six campuses located along the Southeast Florida coast. It is one of only 21 institutions in the country designated by the Carnegie Classification of Institutions of Higher Education as an “R1: Very High Research Spending and Doctorate Production” university and an “Opportunity College and University” for providing greater access to higher education as well as higher earnings for students after graduation. In 2025, Florida Atlantic was nationally recognized as a Top 25 Best-In-Class College and as “one of the country’s most effective engines of upward mobility” by Washington Monthly magazine. Increasingly a first-choice university for students in both Florida and across the nation, Florida Atlantic welcomed its most academically competitive incoming class in the university’s history in Fall 2025. For more information, visit www.fau.edu.

 

 

Robot regret: New research helps robots make safer decisions around humans




University of Colorado at Boulder

Image: From left, engineering professor Morteza Lahijanian and graduate student Karan Muvvala watch as a robotic arm completes a task using wooden blocks. Credit: Casey Cass/University of Colorado Boulder




Imagine for a moment that you’re in an auto factory. A robot and a human are working next to each other on the production line. The robot is busy rapidly assembling car doors while the human runs quality control, inspecting the doors for damage and making sure they come together as they should.

Robots and humans can make formidable teams in manufacturing, health care and numerous other industries. While the robot might be quicker and more effective at monotonous, repetitive tasks like assembling large auto parts, the person can excel at certain tasks that are more complex or require more dexterity.

But there can be a dark side to these robot-human interactions. People are prone to making mistakes and acting unpredictably, which can create unexpected situations that robots aren’t prepared to handle. The results can be tragic.

New and emerging research could change the way robots handle the uncertainty that comes hand-in-hand with human interactions. Morteza Lahijanian, an associate professor in CU Boulder’s Ann and H.J. Smead Department of Aerospace Engineering Sciences, develops processes that let robots make safer decisions around humans while still trying to complete their tasks efficiently.

In a new study presented at the International Joint Conference on Artificial Intelligence in August 2025, Lahijanian and graduate students Karan Muvvala and Qi Heng Ho devised new algorithms that help robots create the best possible outcomes from their actions in situations that carry some uncertainty and risk.

“How do we go from very structured environments where there is no human, where the robots are doing everything by themselves, to unstructured environments where there are a lot of uncertainties and other agents?” Lahijanian asked.

“If you’re a robot, you have to be able to interact with others. You have to put yourself out there and take a risk and see what happens. But how do you make that decision, and how much risk do you want to tolerate?”

Similar to humans, robots have mental models that they use to make decisions. When working with a human, a robot will try to predict the person’s actions and respond accordingly. The robot is optimized for completing a task—assembling an auto part, for example—but ideally, it will also take other factors into consideration.

In the new study, the research team drew upon game theory, a mathematical concept that originated in economics, to develop the new algorithms for robots. Game theory analyzes how companies, governments and individuals make decisions in a system where other “players” are also making choices that affect the ultimate outcome.

In robotics, game theory conceptualizes a robot as being one of numerous players in a game that it’s trying to win. For a robot, “winning” is completing a task successfully—but winning is never guaranteed when there’s a human in the mix, and keeping the human safe is also a top priority.

So instead of trying to guarantee a robot will always win, the researchers proposed the concept of a robot finding an “admissible strategy.” Using such a strategy, a robot will accomplish as much of its task as possible while also minimizing any harm, including to a human.

“In choosing a strategy, you don't want the robot to seem very adversarial,” said Lahijanian. “In order to give that softness to the robot, we look at the notion of regret. Is the robot going to regret its action in the future? And in optimizing for the best action at the moment, you try to take an action that you won't regret.”
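The regret idea can be made concrete with a toy minimax-regret rule. This is only an illustrative sketch, not the authors' algorithm: the action names, payoff numbers, and the two human "behaviors" below are all invented. For each candidate robot action, regret under a given human behavior is the gap between the best payoff achievable for that behavior and the payoff actually received; the robot picks the action whose worst-case regret is smallest.

```python
# Toy minimax-regret action selection (illustrative only).
def minimax_regret(payoffs):
    """payoffs[action][behavior] -> utility; returns the chosen action key."""
    behaviors = next(iter(payoffs.values())).keys()
    # Best payoff attainable for each possible human behavior.
    best = {b: max(p[b] for p in payoffs.values()) for b in behaviors}
    # Worst-case regret of each action across behaviors.
    worst_regret = {
        a: max(best[b] - p[b] for b in behaviors)
        for a, p in payoffs.items()
    }
    return min(worst_regret, key=worst_regret.get)

# Hypothetical utilities for (robot action, human behavior) pairs.
payoffs = {
    "continue_task":  {"cooperative": 10, "erratic": -5},
    "move_to_safety": {"cooperative": 6,  "erratic": 4},
}

print(minimax_regret(payoffs))  # move_to_safety
```

Here "continue_task" pays off best if the human cooperates, but carries a regret of 9 if the human acts erratically; "move_to_safety" never regrets more than 4, so the rule chooses the softer, safer action.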

Let’s go back to the auto factory where the robot and human are working side-by-side. If the person makes mistakes or is not cooperative, using the researchers’ algorithms, a robot could take matters into its own hands. If the person is making mistakes, the robot will try to fix these without endangering the person. But if that doesn’t work, the robot could, for example, pick up what it’s working on and take it to a safer area to finish its task.

Much like a chess champion who thinks several turns ahead about an opponent’s possible moves, a robot will try to anticipate what a person will do and stay several steps ahead of them, Lahijanian said. 

But the goal is not to attempt the impossible and perfectly predict a person’s actions. Instead, the goal is to create robots that put people’s safety first.

“If you want to have collaboration between a human and a robot, the robot has to adjust itself to the human. We don't want humans to adjust themselves to the robot,” he said. “You can have a human who is a novice and doesn't know what they're doing, or you can have a human who is an expert. But as a robot, you don't know which kind of human you're going to get. So you need to have a strategy for all possible cases.”

And when robots can work safely alongside humans, they can enhance people's lives and provide real and tangible benefits to society.

As more industries embrace robots and artificial intelligence, there are many lingering questions about what AI will ultimately be capable of doing, whether it will be able to take over the jobs that people have historically done, and what that could mean for humanity. But there are upsides to robots being able to take on certain types of jobs. They could work in fields with labor shortages, such as health care for older populations, and physically challenging jobs that may take a toll on workers’ health.

Lahijanian also believes that, when they're used correctly, robots and AI can enhance human talents and expand what we're capable of doing.

"Human-robot collaboration is about combining complementary strengths: humans contribute intelligence, judgment, and flexibility, while robots offer precision, strength, and reliability," he said.

"Together, they can achieve more than either could alone, safely and efficiently."

Image: Karan Muvvala watches the robotic arm pick up a blue block. Credit: Casey Cass/University of Colorado Boulder

 

Wildfire mitigation strategies can cut destruction by half, study finds



A new UC Berkeley-led study demonstrates how home hardening and defensible space can have a major impact on wildfire risk.



University of California - Berkeley





Since January’s wildfires flattened entire neighborhoods in Los Angeles, displacing 12,900 households and causing an estimated $30 billion in losses, California’s many other fire-prone communities have been eager for solutions to better protect themselves.

A new UC Berkeley-led study provides these communities and their lawmakers with actionable data on how wildfire mitigation strategies can reduce the destructiveness of wildfires by as much as 50%.

One option to reduce wildfire damage is home hardening, which describes a variety of structural modifications that homeowners can use to make their houses less susceptible to fire. These include using fire-resistant siding and roofing materials, covering vents to prevent embers from entering the home, and upgrading to double-paned tempered glass windows that are less likely to break in a fire. Another strategy, defensible space, refers to a vegetation-free “buffer zone” around a home or structure. Because renovating existing homes is not always easy or cheap, data on the effectiveness of these measures is key to justifying future investment. 

In the study, the researchers used state-of-the-art wildfire simulation tools, combined with real-world data from five of the most destructive fires that occurred in California before 2022, to quantify the impact of these strategies.

The study found that home hardening and defensible space together can double the number of homes and other structures that survive a blaze. Notably, the researchers also demonstrated that just removing the vegetation within a 5-foot perimeter of homes — the subject of California’s proposed Zone Zero regulations — could reduce structure losses by 17%.

“I view this as really powerful evidence that the mitigation measures that are available to us, hardening and defensible space, actually have some real-world effectiveness,” said study senior author Michael Gollner, associate professor of mechanical engineering at Berkeley.

These strategies may further prevent loss and death by slowing the spread of fire, giving residents more time to evacuate and emergency responders more time to arrive at the scene, Gollner said.

“We can't always change the spacing between structures or the exposure from flames and embers,” Gollner said. “But even within those limitations, we still have the power to cut the destruction in half, if not more. That is very powerful.” 

The study was published online today (Aug. 28) in the journal Nature Communications, and was supported by grants from the California Department of Forestry and Fire Protection (CAL FIRE) through the Forest Health program, the Gordon and Betty Moore Foundation and the National Science Foundation. 

Investment in mitigation pays off

To measure the impact of wildfire mitigation strategies, a research team led by Gollner and Berkeley postdoctoral scholar Maryam Zamanialaei took advantage of CAL FIRE’s unique damage inspection database (DINS), which includes data from on-the-ground surveys of all structures that were damaged or destroyed in major California wildfires since 2013. The study focused on the 2017 Tubbs and Thomas fires, the 2018 Camp fire, the 2019 Kincade fire and the 2020 Glass fire. To build a comprehensive data set, researchers then added information from a variety of other geospatial sources to better define the spacing between each building, the construction materials used and the density of vegetation surrounding each structure.

A unique aspect of the study was the use of state-of-the-art simulation tools to model how wildfire might have spread through each community, allowing researchers to account for fire exposure to each structure.

By applying advanced machine learning techniques to the combined dataset, they developed a data-driven model that predicts structure survivability with 82% accuracy and disentangles how factors such as structure spacing, fire exposure, construction materials and defensible space combine to influence risk. 
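The paper's actual model is not described in detail here, but the general approach, a classifier trained on per-structure features to predict a binary survived/destroyed label, can be sketched in miniature. Everything below is hypothetical: the feature names, the toy data, and the from-scratch logistic regression stand in for whatever machine learning pipeline the study actually used.

```python
import math

# Minimal from-scratch logistic regression (a sketch, NOT the study's model):
# predict survived (1) vs. destroyed (0) from per-structure features.
# Feature names and all numbers below are hypothetical.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.05, epochs=3000):
    """Stochastic gradient descent on the log-loss."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of the log-loss with respect to z
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Toy rows: [separation_distance_m, defensible_space_m] -> survived?
X = [[2, 0], [3, 1], [15, 5], [20, 10], [4, 0], [18, 8]]
y = [0, 0, 1, 1, 0, 1]
w, b = train(X, y)

def predict(x):
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) > 0.5

print(predict([16, 6]), predict([2, 1]))  # True False
```

Once trained, such a model lets you vary one feature at a time (say, widening the defensible space) while holding the others fixed, which is the kind of "tuning in" to individual risk factors the researchers describe.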

“We wanted to identify the risk factors that make a structure susceptible to loss,” Zamanialaei said.

“It’s possible that a well-protected home may have a low chance of survival because of everything around it,” Gollner added. “The model allows us to tune in to see the impact of each factor and how they interplay.”

Their research identified structure separation distance as the most influential factor driving structure loss, especially in densely built areas where wildfire spreads from building to building. Flame length also emerged as a critical contributor.

In addition, construction features such as exterior siding and window materials substantially contributed to the vulnerability of structures. The findings highlight how building arrangement and exposure to flames, combined with ignition resistance, all contribute to wildfire risk. 

However, for mitigation strategies to work best, they need to be adopted by everyone in a fire-prone community, Gollner said. As the fierce debates over Zone Zero regulations illustrate, it can be challenging to cultivate the social and political will to implement these changes on a large scale. 

“Much of what you can do to prevent these fires from spreading through the whole community happens on an individual's property and depends on what your neighbor does,” Gollner said. “This is a really challenging social, economic and political problem that requires a lot of groups working together.”

He hopes that the study further highlights the importance — and positive impact — of this challenging work. 

“We need to justify the investments we're making in mitigation, and I was glad to see that for many of them, we do see significant payback in terms of risk reduction,” Gollner said.

Additional co-authors of the study include Daniel San Martin of the Universidad Técnica Federico Santa María; Maria Theodori and Dwi Purnomo of UC Berkeley; Ali Tohidi, Arnaud Trouvé and Yiren Qin of the University of Maryland; and Chris Lautenberger of Cloudfire.