Friday, August 08, 2025

SFU indoor berry research scales up and branches out with support from the Homegrown Innovation Challenge



Simon Fraser University





Greenhouse berry production research at Simon Fraser University (SFU) is ramping up thanks to $5 million in new funding over three years from the Weston Family Foundation’s Homegrown Innovation Challenge.

The SFU-led project—developed in collaboration with industry partner BeriTech—is one of four Canadian projects selected to participate in the Challenge’s Scaling Phase. With this support, the team will continue advancing their indoor blueberry trials while branching out to include raspberries and blackberries. The goal: to develop sustainable, scalable, and economically viable indoor growing systems that can support Canadian farmers year-round.

Led by biological sciences professor Jim Mattsson, the team is working to address key challenges in indoor production, from plant genetics to growing conditions.

“Berry plants are less demanding than crops such as tomatoes and peppers, but they also have a lower yield. We aim to identify the sweet spot between inputs and outputs to provide berries at an affordable price,” explains BeriTech chief science officer Eric Gerbrandt.

“We’re working with raspberry farmers already. They’re eager to extend the season on both sides, and they are very interested in greenhouse technology, but they don’t have the know-how to do it.”

In order to make indoor berry farming viable at scale, the team is developing compact, high-yield berry varieties as well as the technology and growing systems needed for them to thrive.

“You can buy both raspberry and blueberry plants that have a shorter stature, but they have a fairly low yield,” Mattsson explains. “Certain genes have been knocked out, creating a plant that is essentially less healthy, and therefore smaller. But if you knock out the right gene, you can get a healthy plant that is smaller.”

Alongside this work on genetics, BeriTech is designing low-cost, modular systems to provide optimal growing conditions at a price point that farmers can afford.

“We’re taking a fairly simple approach,” Mattsson explains, “a turnkey solution based on existing technologies to keep the cost low for farmers.”

This work also contributes to a broader protocol for adapting other crops to indoor production, laying the groundwork for more resilient food systems. With increasing uncertainty due to climate change and global supply chain disruptions, expanding Canada’s capacity to grow food year-round would also decrease our dependence on imports.

While delivering high yields is important, Mattsson believes Canadians shouldn’t have to compromise on flavour or nutrition, so the team is also working on improving the flavour of the berries.

“Flavour boils down to two things: sugar content, and then some flavouring compounds,” he explains. “We're going to try to get higher amounts of raspberry ketone, the flavouring agent that provides a specific raspberry flavour. We're aiming for that taste you remember from childhood, the way berries should taste.”

Delivered over six years and funded by the Weston Family Foundation, the $33 million Homegrown Innovation Challenge supports the development of tools and technologies to enable Canadian producers to grow berries out of season, sustainably and competitively. The Foundation believes that achieving out-of-season berry production can also unlock solutions for myriad other fruits and vegetables.

 

Expanding Montana’s precision ag: Anish Sapkota studies water, soil and more in agricultural systems




Montana State University
Image: Montana State University assistant professor Anish Sapkota has earned a 2025 Emerging Scientist Award from the Western Society of Crop Science. Sapkota works with drones and remote sensing among other areas of precision agriculture. (Credit: MSU photo by Marcus "Doc" Cravens)





BOZEMAN – Emerging opportunities in the realm of precision agriculture are countless, spanning earth, sky and everything in between. At Montana State University, faculty are delving into many of them with the goal of improving resources for the state’s agricultural producers.

Anish Sapkota is one of those faculty members, an assistant professor of precision agriculture in the College of Agriculture’s Department of Land Resources and Environmental Sciences. Precision agriculture involves the incorporation of the latest data and technology into agricultural production, and Sapkota’s breadth of research exploring precision management of water and fertilizer earned him the Western Society of Crop Science’s 2025 Emerging Scientist Award in June.

Sapkota, who is originally from Nepal, joined the faculty a year ago, but he also received his master’s degree from MSU in 2018. He then completed a doctorate at the University of California, Riverside and postdoctoral research at the University of California, Davis.

“I gained a diverse experience working in traditional agricultural systems and with new technologies, bringing those together to understand what’s happening below ground, near ground and above ground,” said Sapkota. “And then there comes this opportunity to be able to come here. I’m so excited to be back at Montana State.”

Sapkota’s research focuses on abiotic stressors, which are those that don’t come from living things such as pests or disease. Drought, heat and soil nutrients can present abiotic stress, he said, and precision agriculture presents a wealth of potential remedies. His work exploring stress in crops including wheat and alfalfa earned him the WSCS Emerging Scientist Award, presented to a young researcher who has made significant contributions to the scientific profession with particular impact in agricultural industries.

The more data researchers can collect from a field, the more precise management can become, Sapkota said. With the amount of freely available data from satellites, drones and sensors, agricultural fields can be analyzed for everything from water stress to nutrient stress in different parts of the field. Producers can then apply only what is needed, whether water or fertilizer, at the right time and exactly where it will have the most benefit. That practice, called variable rate application, allows farmers to save both money and products by not wasting them where they won’t be effective.
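Sapkota’s specific tools aren’t detailed here, but the logic of variable rate application can be sketched in a few lines of code. The Python snippet below is a hypothetical illustration rather than MSU software: the zone names, NDVI values and rate bounds are invented, and it simply maps a remotely sensed vigor index to a per-zone nitrogen rate so that inputs go only where they are expected to help.

```python
# Hypothetical sketch of variable rate application -- not MSU's software.
# Translate a per-zone crop vigor reading (NDVI from drone or satellite imagery)
# into a per-zone nitrogen rate instead of one flat rate for the whole field.

UNIFORM_RATE_LB_PER_ACRE = 80.0   # what a flat application might use (assumed)
MIN_RATE, MAX_RATE = 40.0, 100.0  # agronomic bounds per acre (assumed values)

def nitrogen_rate(ndvi: float) -> float:
    """Map crop vigor (NDVI, 0-1) to a nitrogen rate: stressed zones get more."""
    # Linear ramp: NDVI 0.9 (healthy) -> MIN_RATE, NDVI 0.3 (stressed) -> MAX_RATE
    frac = (0.9 - ndvi) / (0.9 - 0.3)
    frac = min(max(frac, 0.0), 1.0)
    return MIN_RATE + frac * (MAX_RATE - MIN_RATE)

# Invented management zones with NDVI sampled from imagery
zones = {"zone_A": 0.82, "zone_B": 0.55, "zone_C": 0.38}

prescription = {z: round(nitrogen_rate(v), 1) for z, v in zones.items()}
print(prescription)   # {'zone_A': 48.0, 'zone_B': 75.0, 'zone_C': 92.0}
print(sum(prescription.values()), "lb total vs", UNIFORM_RATE_LB_PER_ACRE * len(zones), "lb uniform")
```

In this toy prescription the stressed zone receives more nitrogen and the healthy zone less, and the total applied (215 lb) comes in under a uniform 80 lb-per-zone pass (240 lb).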

Sapkota is also collaborating with other MSU faculty and Montana growers to evaluate technology such as soil moisture sensors, drones and precision irrigation tools so that farmers can use the latest developments with confidence.

“Dr. Sapkota truly is an emerging outstanding scientist developing and applying new technologies to improve crop production efficiency in Montana. I am proud of his work and congratulate him on this well-deserved recognition,” said Sreekala Bajwa, MSU’s vice president for agriculture. “Dr. Sapkota’s research is a great example of how the new investments our state is making in the Montana Agricultural Experiment Station’s precision agriculture programs are paying off.”

MSU continues to develop research programs and coursework in precision agriculture. As part of that effort, Sapkota has graduate students and research assistants working alongside him. He said that any interested student can find a place to conduct precision agriculture research at MSU, with the work constantly expanding both in geographic region and in topics of interest.

The opportunities are endless, he said, and Montana is fertile ground for exploration. Helping producers to make their operations sustainable and productive causes beneficial ripple effects across the state and the industry.

“Montana is very diverse in terms of cropping systems and practices. We need to understand these differences to better manage agricultural needs and inputs,” he said. “Agriculture is a major industry in Montana, so if we can contribute to addressing problems in agriculture, it means we create lasting impact across the region.” 

North Atlantic faces more hurricane clusters as climate warms



Fudan University





Tropical cyclone cluster events over the North Atlantic. This image from NOAA's GOES-16 satellite on September 14, 2020, shows five tropical systems spinning in the Atlantic basin at one time. From left to right: Hurricane Sally in the Gulf of Mexico, Hurricane Paulette east of the Carolinas, the remnants of Tropical Storm Rene in the central Atlantic, and Tropical Storms Teddy and Vicky in the eastern Atlantic. A total of 10 named storms formed in September 2020 — the most for any month on record. (Image credit: NOAA)

Tropical cyclones, commonly known as typhoons or hurricanes, can form in clusters and impact coastal regions back-to-back. For example, Hurricanes Harvey, Irma and Maria hit the U.S. sequentially within one month in 2017. The Federal Emergency Management Agency failed to provide adequate support to hurricane victims in Puerto Rico when Maria struck because most rescue resources and specialized disaster staffers were already deployed for the responses to Hurricanes Harvey and Irma.

A new study published in Nature Climate Change confirms these hurricane clusters are becoming more frequent in the North Atlantic in recent decades—a trend projected to continue in the near future.

Tropical cyclone clusters are events in which two or more tropical cyclones are present simultaneously within the same basin. This phenomenon is not rare, as historically only 40% of tropical cyclones appeared alone. Beyond the combined impacts of individual storms, tropical cyclone clusters can cause disproportionate damage, as coastal communities and infrastructure need time to bounce back from the impact of the first storm. Understanding tropical cyclone clusters and their future is thus important for coastal risk management.

Analysing historical observations of tropical cyclones, the authors found that during the past few decades the chance of tropical cyclone clustering decreased in the Northwestern Pacific basin while it increased in the North Atlantic basin. “We tried to develop a probabilistic framework to understand this trend,” said Dazhi Xi, a climatologist at HKU who co-led the study and developed the methodology. “If tropical cyclone clusters are formed by chance, then only storm frequency, storm duration, and storm seasonality can affect that chance. So, as a first attempt, we simulated the formation of tropical cyclone clusters with a probabilistic model considering only these three mechanisms, and hoped we could find out why tropical cyclone clusters changed over the past decades.”
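A toy Monte Carlo version of that chance-only baseline can make the idea concrete. The sketch below is not the authors’ published model: the genesis rate, season length, storm durations and flat seasonal distribution are invented simplifications, and it only shows how frequency and duration alone set the odds of storms overlapping by coincidence.

```python
# Toy Monte Carlo baseline for "clusters by chance" -- not the authors' model.
# Storm genesis is a Poisson process over a simplified 120-day season with a
# flat seasonal distribution; durations are exponential. All rates are invented.
import random

def simulate_season(mean_storms: float, mean_duration_days: float, season_days: int = 120) -> bool:
    """Return True if at least two storms coexist at some point in one simulated season."""
    genesis_rate = mean_storms / season_days            # storms per day
    t, intervals = 0.0, []
    while True:
        t += random.expovariate(genesis_rate)           # next genesis time
        if t > season_days:
            break
        intervals.append((t, t + random.expovariate(1.0 / mean_duration_days)))
    # Starts are already in increasing order, so a cluster exists iff some storm
    # begins before the previous one has dissipated.
    return any(intervals[i + 1][0] < intervals[i][1] for i in range(len(intervals) - 1))

def cluster_probability(mean_storms: float, mean_duration_days: float, trials: int = 20_000) -> float:
    hits = sum(simulate_season(mean_storms, mean_duration_days) for _ in range(trials))
    return hits / trials

# More storms per season -> clusters become far more likely purely by chance.
print(cluster_probability(mean_storms=6, mean_duration_days=6))
print(cluster_probability(mean_storms=14, mean_duration_days=6))
```

Years in which observed clustering far exceeds such a baseline are the ones the study attributes to physical linkages rather than coincidence.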

However, the probabilistic model was only partly successful. For some years, it significantly underestimated the chance of tropical cyclone clustering, because some storms coexist with others not simply by chance but through a physical linkage. “The seemingly failed statistical model soon became a powerful tool that can distinguish physically linked tropical cyclone clusters from those that occur by pure chance,” said Wen Zhou, a climatologist at Fudan University and the corresponding author of the study. For the years in which the probabilistic model fails, the authors find that synoptic-scale waves, a series of train-like atmospheric disturbances, enhance the chance of tropical cyclone cluster formation.

The study further discovered that the La Niña-like global warming pattern, characterized by slower warming in the Eastern Pacific compared to the Western Pacific, is the reason behind the observed shift in the tropical cyclone cluster hotspot. “The warming pattern not only modulates the frequency of tropical cyclones in the North Atlantic and Northwestern Pacific basins, but also affects the strength of the synoptic-scale waves, together causing the shift of the tropical cyclone cluster hotspot from the Northwestern Pacific to the North Atlantic basin,” said Zheng-Hang Fu, a PhD student at Fudan University who co-led the study.

The research establishes a probabilistic baseline model for investigating tropical cyclone cluster events and their underlying physical mechanisms. This framework not only explains the observed shift of the tropical cyclone cluster hotspot from the Northwestern Pacific to the North Atlantic basin, but also provides a transferable methodology applicable to other ocean basins worldwide. Importantly, the authors identify the North Atlantic as an emerging hotspot for tropical cyclone clusters in recent decades. This finding calls for heightened attention from Atlantic coastal nations, urging them to develop proactive strategies against these compounding hazards.

References:

Fu, Z.H., D. Xi, S.-P. Xie, W. Zhou, N. Lin, J. Zhao, X. Wang, and J.C.L. Chan, 2025: Shifting hotspot of tropical cyclone clusters in a warming climate. Nature Climate Change, 15. https://doi.org/10.1038/s41558-025-02397-9

Wildfire collaborative responds to community concerns about air quality

New study seeks to answer Los Angeles residents’ questions about how to find reliable air quality information during fires



Cary Institute of Ecosystem Studies

Image: Smoke from the Palisades Fire over LA on January 8, 2025. (Credit: Jeff Suer)





When several wildfires ignited in Los Angeles in early January of 2025, Miriam Marlier’s friends and neighbors came to her with questions about how to find reliable air quality information.

Marlier is a UCLA scientist and a member of the Western Fire and Forest Resilience Collaborative (WFFRC), a program of Cary Institute of Ecosystem Studies led by Cary forest ecologist Winslow Hansen. Guided by decision makers, the collaborative conducts research to inform solutions to the Western US fire crisis.

During the January fires, “people were urgently trying to find out whether the air was safe to breathe, where to find information about air quality, and whether the information was comprehensive enough for them to make decisions to protect themselves and their families,” explained WFFRC postdoctoral associate Claire Schollaert.

In response to these urgent questions, Schollaert, Marlier, and colleagues quickly launched a study to assess whether publicly available data sources captured an increase in air pollution during the fires, how air quality changed over space and time, and how the trends differed between data sets. The findings were published in Environmental Science and Technology Letters in July, and provide insights into how these tools can better support public health response during smoke events.

“This study is the first to my knowledge that quantifies how the LA Fires affected air quality at very fine spatial and temporal resolutions,” said Hansen. “It serves as a valuable baseline from an extraordinary event.” The study analyzed hourly data from nearly 750 low-cost and regulatory air pollution sensors throughout LA.

During large wildfires, government agencies generally recommend that residents check the Environmental Protection Agency’s air quality index, which is fed by data from regulatory monitors. The AirNow Fire and Smoke Map, another commonly recommended tool, relies on data from EPA monitors, low-cost PurpleAir sensors, and satellite data. But these tools utilize different data sources and often consider different time periods, potentially leaving residents confused.

“Smoke exposure is the main way that most people are impacted by fires,” said Schollaert. “Providing the public with understandable and accurate air quality information is crucial to reducing risk.”

For the study, the researchers compared Los Angeles air quality data from EPA regulatory monitors, low-cost PurpleAir sensors, and satellites. “We wanted to understand the smoke plume dynamics from the perspective of multiple, often complementary, air quality data sources,” explains Marlier. 

Findings confirmed that the fire events did degrade air quality in the city, particularly on January 8 and 9. But air quality varied markedly from hour to hour and day to day during the course of the fires, and “the Santa Ana winds, which played a role in spreading the fires, also did a decent job of pushing smoke from the Palisades fire offshore pretty quickly,” said Schollaert. Those same winds, however, spread smoke from the Eaton fire across the most populated parts of the county before pushing the plume out over the Pacific. 

The team found that while regulatory monitors are essential, they often miss local variability due to limited spatial coverage. Integrating low-cost sensors and satellite data on public risk communication platforms can help fill critical gaps. This growing network of sensors can work with, and strengthen, regulatory systems, offering more responsive public health strategies and better community protection.

Satellite observations capture broader patterns of smoke movement through the atmosphere and could help to further fill in gaps between on-the-ground monitors, but the data are not as accessible to the public. 

Smoke impacts were largely consistent across these different data sources, the team found, but differences in spatial distribution and averaging times could lead to discrepancies in air quality readings, which could be confusing to users. 
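A toy calculation (with invented hourly PM2.5 values, not data from the study) shows how the averaging window alone can paint very different pictures of the same smoke event:

```python
# Invented hourly PM2.5 values (micrograms per cubic metre) for one smoky day.
# The same record looks very different depending on the averaging window reported.
hourly_pm25 = [8, 9, 10, 35, 120, 160, 140, 90, 40, 20, 12, 10,
               9, 9, 8, 8, 9, 10, 11, 10, 9, 9, 8, 8]

peak_hour = max(hourly_pm25)                           # what a "right now" check during the plume shows
worst_3h = max(sum(hourly_pm25[i:i + 3]) / 3 for i in range(len(hourly_pm25) - 2))
daily_mean = sum(hourly_pm25) / len(hourly_pm25)       # what a 24-hour average shows

print(f"peak hour: {peak_hour}, worst 3-h mean: {worst_3h:.0f}, 24-h mean: {daily_mean:.0f}")
# Output: peak hour: 160, worst 3-h mean: 140, 24-h mean: 32 -- hazardous in the
# moment, yet the daily average of the same data lands in a much lower category.
```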

Based on their findings, the authors call for more ground-based sensors to improve air pollution monitoring, and for the improved integration of satellite data into user-friendly air quality platforms. Preferably, these additional sensors and data sources would be integrated into the most commonly used air quality index resources, said Schollaert. 

“People want one number that lets them know whether it’s safe to go running outside or take their kid to the playground,” said Schollaert. “It’s our job as researchers to try to figure out how to make that information as reliable as possible.”

“We live in an era of data saturation,” said Hansen. “Often these datasets tell us slightly different things at different resolutions of space and time. Now more than ever, effective actionable science is about harmonizing and synthesizing these diverse datasets to provide consistent, robust, and digestible insights to the public, managers, and policy makers.” 

Responsiveness to real-world needs is baked into the WFFRC approach, and the new study provides one example of how this strategy can lead to meaningful impacts and important scientific insights. 

Daily hazard mapping system data, LA Fires, January 7–12 


Daily hazard mapping system smoke plumes with average daily PM2.5 concentrations from AirNow (large circles) and PurpleAir (small circles) from the first six days of the fire-impacted period (January 7–12).

This research was made possible, in part, by funding from the Gordon and Betty Moore Foundation under grant #11974. The paper is a contribution of the Western Fire and Forest Resilience Collaborative and the Climate and Wildfire Research Initiative at UCLA.

Western Fire and Forest Resilience Collaborative (WFFRC) is an interdisciplinary research program of Cary Institute of Ecosystem Studies focused on advancing fire ecology and forest resilience science across the western U.S. By connecting science with practice, WFFRC aims to advance fire and forest resilience strategies that are ecologically sound, science-informed, and adaptable to rapidly changing environmental conditions.

Cary Institute of Ecosystem Studies is an independent nonprofit center for environmental research. Since 1983, our scientists have been investigating the complex interactions that govern the natural world and the impacts of climate change on these systems. Our findings lead to more effective resource management, policy actions, and environmental literacy. Staff are global experts in the ecology of forests, freshwater, disease, and cities.

Article: “Public data reveal extent of air quality impacts during 2025 Los Angeles wildfires,” Environmental Science and Technology Letters.

Debris slide risk doesn’t always rise after a wildfire, study finds


Researchers found no increase in landslides in the Columbia River Gorge after a major fire




University of Oregon

Image: Looking down the Leavens Creek watershed, where a fatal debris flow event occurred following the Eagle Creek Fire. Notice the steep, rocky channels and waterfalls that debris flows travel down. (Credit: Google Earth)




In the wake of a wildfire, there’s often an assumption that burned landscapes will be more susceptible to landslides. But new research from the University of Oregon suggests it’s not always that simple.  

An analysis of the Columbia River Gorge, which runs along the border between Oregon and Washington, shows that steep, rocky watersheds in that area have been prone to debris flows and rockfall for thousands of years. Those events didn’t measurably increase after the Eagle Creek Fire, which scorched 47,000 acres of the gorge over the course of three months in 2017. 

UO geologist Josh Roering and members of his lab published their findings Aug. 8 in Science Advances, highlighting the importance of context and geological history in landslide risk assessments. The study also could help inform safety and hazards awareness projects in the gorge, in both burned and nonburned areas.  

After the Eagle Creek Fire, Oregon land managers were concerned about landslides, especially in the vicinity of the Interstate 84 transportation corridor that runs through the gorge. Their fears were, in large part, informed by what’s transpired in places like Southern California, where post-fire slides have caused devastating casualties and millions of dollars in damage.  

That phenomenon can happen because as wildfire destroys vegetation and groundcover, slopes become more prone to debris movement, erosion and rock fall, Roering said, which can be more easily triggered by rain and storm events.  

“When Eagle Creek burned up such a massive area of the Columbia River Gorge, the natural question was: Is that going to happen here?” Roering said. “The gorge provided a great laboratory to examine how fire affects steep and rocky landscapes.” 

In his lab’s latest project, Roering and doctoral student Maryn Sanders analyzed recent debris flows in the gorge to better understand the likelihood of slope movement after a fire and to explore how to predict when debris flows will occur. Debris flows occur when loose sediment — like mud, rocks and other debris — rapidly moves down a slope, often fueled by a storm or heavy rain.  

Sanders and her team turned to a remote-sensing technology known as airborne lidar, or light detection and ranging, which allows them to see through the tree cover so they can analyze physical changes on the ground below, like where erosion has occurred. That tool, alongside field observations, helped them map out debris flows so they could assess movement across the study area. 
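As a rough illustration of that approach (a hypothetical sketch, not the lab’s actual workflow), differencing bare-earth elevation models from repeat lidar flights highlights where material has been scoured away or deposited:

```python
# Conceptual sketch of lidar change detection -- not the UO lab's pipeline.
# Two small arrays stand in for pre- and post-event bare-earth elevation models
# (metres); real DEMs would be read from lidar-derived rasters.
import numpy as np

pre_event = np.array([[120.0, 118.5, 117.0],
                      [119.0, 117.5, 116.0],
                      [118.0, 116.5, 115.0]])

post_event = np.array([[120.0, 118.5, 117.0],
                       [118.2, 116.4, 116.0],
                       [118.0, 116.6, 115.9]])

diff = post_event - pre_event      # negative = elevation lost (scour/erosion)
threshold = 0.5                    # metres; ignore changes below lidar noise (assumed value)

eroded = diff < -threshold         # candidate debris-flow scour cells
deposited = diff > threshold       # candidate deposition, e.g. on fans downslope

print("eroded cells:", int(eroded.sum()), "| deposited cells:", int(deposited.sum()))
print("net elevation change summed over cells (m):", round(float(diff.sum()), 2))
```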

As Sanders mapped the data, she found that many debris flows were concentrated in the watersheds near Dodson, just a few miles east of Multnomah Falls on the Oregon side of the gorge. Those are some of the steepest and fastest eroding watersheds in the state.  

The debris flows in that region have been especially frequent and destructive. They’ve caused fatalities and threatened additional human lives, homes and infrastructure, which makes them even more vital for state agencies to understand.

Sanders noticed a few interesting characteristics of the landscape as she studied the data, which suggested fire might not be the most significant cause of slope movement in that area. It also hinted that steep, rocky terrain behaves differently than slopes in a place like Southern California.    

The researchers found massive amounts of sediment accumulation in fan-like formations at the base of the rocky catchments in gorge watersheds. At first glance, those features looked unassuming because they were covered in vegetation, but with lidar imaging it was clear something more notable was going on beneath the surface. 

“The size and makeup of the fans suggest that frequent debris flows have been happening in these watersheds for a really long period of time, in the magnitude of thousands of years,” Sanders said. 

She also observed that the slopes were collecting sediment much faster than more stable terrain does, likely through temperature fluctuations that cause rockfall. That sets them up to produce debris flows more frequently, typically every few decades.  

Sanders took a closer look and analyzed the erosion rates in the area. She found frequent debris flows throughout its geological history and saw that the landscape had behaved in a consistent manner over thousands of years, something that remained relatively unchanged after the 2017 fire.  

“Because we found similar rates of erosion before and after the fire, we believe the rocky environment was not as sensitive to fire,” she said. “Our analysis suggests that fire plays a relatively small role in triggering these events and emphasizes how important it is to consider the history of place.” 

Still, the frequency, size and nature of debris flows in the gorge remains an ongoing cause for concern. The researchers are in the final stages of developing a tool that could help the Oregon Department of Transportation and other stakeholders predict debris flows in the gorge. That would help them make better use of safety features like roadside warning signs and closures, alerting travelers about the heightened risk of landslides during intense storms.  

“These watersheds are highly active and inherently hazardous, irrespective of fire,” Sanders said. “We want our research to help agencies like ODOT better understand this geologically complex landscape.”