Monday, March 25, 2024

 

Honey bees at risk for colony collapse from longer, warmer fall seasons



Cold storage for colonies could help mitigate climate change effects



Peer-Reviewed Publication

WASHINGTON STATE UNIVERSITY

Image: Washington State University researchers and students collect samples and perform honey bee colony health assessments in orchards near Modesto, CA. Every year more than 2 million honey bee hives from across the country are moved to California to pollinate almond trees in February. To manage bee health and the logistics of the move, many commercial beekeepers are starting to use indoor cold storage for their hives – a practice that researchers have found might also help prevent colony collapses from longer, warm autumns due to climate change. (Credit: Brandon Hopkins, Washington State University)




PULLMAN, Wash. – The famous work ethic of honey bees might spell disaster for these busy crop pollinators as the climate warms, new research indicates.

Flying shortens the lives of bees, and worker honey bees will fly to find flowers whenever the weather is right, regardless of how much honey is already in the hive. Using climate and bee population models, researchers found that increasingly long autumns with good flying weather for bees raise the likelihood of colony collapse in the spring.

The study, published in Scientific Reports, focused on the Pacific Northwest but holds implications for hives across the U.S. The researchers also modeled a promising mitigation: putting colonies into indoor cold storage, so honey bees will cluster in their hive before too many workers wear out.

“This is a case where a small amount of warming, even in the near future, will make a big impact on honey bees,” said lead author Kirti Rajagopalan, a Washington State University climate researcher. “It’s not like this is something that can be expected 80 years from now. It is a more immediate impact that needs to be planned for.”

For this study, researchers ran simulations through a honey bee population dynamics model using climate projections for 2050 and for the end of the century in 2100. They found that honey bee colonies spending the winter outside in many areas of the Pacific Northwest would likely experience spring colony collapses in both the near- and long-term scenarios. Collapses occurred both under a simulation in which climate change continues on its current trajectory and under one in which greenhouse gas emissions are reduced in the near future.

Worker honey bees will forage for food whenever temperatures rise above about 50 degrees Fahrenheit. When it gets colder, they cluster in the hive, huddling with other bees, eating honey reserves and shivering, which helps keep the bees warm. In the spring, the adult worker bees start flying again. That means they also start dying. If too many older worker bees die before their replacements emerge ready to forage, the whole colony can collapse. Scientists have estimated this happens when there are fewer than 5,000 to 9,000 adult bees in the hive.
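To make that trade-off concrete, here is a minimal toy simulation of the mechanism described above: the longer the stretch of autumn days warm enough for flight, the more workers wear out before winter clustering begins. The mortality rates, starting population and day counts are illustrative assumptions, not parameters from the researchers' population dynamics model.

```python
# Toy sketch of the mechanism described above, not the study's model.
# All rates and counts below are made-up placeholders.
MORTALITY_FLYING = 0.030      # assumed daily death rate on days warm enough to fly (above ~50 F)
MORTALITY_CLUSTERED = 0.004   # assumed daily death rate while clustered in the hive
COLLAPSE_THRESHOLD = 9000     # collapse risk below roughly 5,000-9,000 adults

def spring_population(warm_autumn_days, start_adults=30000, winter_days=120):
    """Adults left at the end of winter, before new spring foragers emerge."""
    adults = start_adults
    for _ in range(warm_autumn_days):    # autumn flying days wear workers out quickly
        adults *= 1 - MORTALITY_FLYING
    for _ in range(winter_days):         # clustered bees die far more slowly
        adults *= 1 - MORTALITY_CLUSTERED
    return int(adults)

for autumn_days in (20, 45, 70):         # progressively longer warm autumns
    n = spring_population(autumn_days)
    status = "collapse risk" if n < COLLAPSE_THRESHOLD else "likely viable"
    print(f"{autumn_days} warm autumn days -> {n} adults in spring ({status})")
```

With these placeholder rates, only the short-autumn colony comes through winter above the collapse threshold, which is the qualitative pattern the study describes for warmer future autumns.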

This study found that colonies wintering outside in colder areas, like Omak in the far north of Washington state, might still do all right under climate change. But for honey bee colonies in many other places, like Richland, Washington, near the Oregon border, staying outside in the winter would mean the spring hive population would plummet to fewer than 9,000 adults by 2050 and fewer than 5,000 by the end of the century.

The authors note that the simulations just looked at seasonal factors like temperature, wind and the amount of daylight, making them fairly conservative models.

“Our simulations are showing that even if there is no nutritional stress, no pathogens, no pesticides – just the conditions in fall and winter are enough to compromise the age structure of a colony. So when the hive comes out of winter, the bees are dying faster than they're being born,” said co-author Gloria DeGrandi-Hoffman, a research leader at the U.S. Department of Agriculture’s Carl Hayden Bee Research Center.

The researchers also simulated a potential mitigation, placing honey bee hive boxes in cold storage so the bees start to cluster earlier and save workers. For instance, in the Richland scenarios, by the end of the century, having bees in cold storage from October to April would boost the spring hive population to over 15,000 compared to around 5,000 to 8,000 if they were kept outside.

A relatively new practice, cold storage is gaining popularity among commercial beekeepers to help manage bee health and for the logistics involved in moving hives to California to pollinate almond trees in February, an event that draws more than two million hives from across the country.

“A lot of beekeepers are already practicing this management technique of storing bees indoors because it has a lot of immediate potential to help in a number of ways,” said co-author Brandon Hopkins, a WSU entomologist. “These findings demonstrate that there are additional benefits to this practice for the survival of colonies in a changing climate.”

This research received support from the Washington Department of Agriculture’s Specialty Crop Block Grant.


 

Persian plateau unveiled as crucial hub for early human migration out of Africa



GRIFFITH UNIVERSITY

Image: Pebdeh Cave, located in the southern Zagros Mountains. Pebdeh was occupied by hunter-gatherers as early as 42,000 years ago. (Credit: Mohammad Javad Shoaee)




A new study combining genetic, palaeoecological, and archaeological evidence has unveiled the Persian Plateau as a pivotal geographic location serving as a hub for Homo sapiens during the early stages of their migration out of Africa.  

This revelation sheds new light on the complex journey of human populations, challenging previous understandings of our species' expansion into Eurasia. 

The study, published in Nature Communications, highlights a crucial period between approximately 70,000 to 45,000 years ago when human populations did not uniformly spread across Eurasia, leaving a gap in our understanding of their whereabouts during this time frame. 

Key findings from the research include: 

  • The Persian Plateau as a hub for early human settlement: Using a novel genetic approach combined with palaeoecological modelling, the study revealed the Persian Plateau as the region from which the population waves that settled all of Eurasia originated.

  • This region emerged as a suitable habitat capable of supporting a larger population compared with other areas in West Asia. 

  • Genetic resemblance in ancient and modern populations: The genetic component identified in populations from the Persian Plateau underlines the region's long-lasting differentiation, consistent with its role as a hub, and is ancestral to the genetic components of the populations already known to have inhabited the Plateau.

  • Such a genetic signature was detected thanks to a new approach that disentangles 40,000 years of admixture and other confounding events. This genetic connection underscores the Plateau's significance as a pivotal location for early human settlement and subsequent migrations. 

Study co-author Professor Michael Petraglia, Director of Griffith University’s Australian Research Centre for Human Evolution, said the findings provide a much clearer picture of these early human movements.

“Our multidisciplinary study provides a more coherent view of the ancient past, offering insights into the critical period between the Out of Africa expansion and the differentiation of Eurasian populations,” Professor Petraglia said.  

“The Persian Plateau emerges as a key region, underlining the need for further archaeological explorations." 

First author Leonardo Vallini of the University of Padova, Italy, said: “The discovery elucidates a 20,000 year long portion of the history of Homo sapiens outside of Africa, a timeframe during which we interacted with Neanderthal populations, and sheds light on the relationships between various Eurasian populations, providing crucial clues for understanding the demographic history of our species across Europe, East Asia, and Oceania.”  

Senior author, Professor Luca Pagani added: “The revelation of the Persian Plateau as a hub for early human migration opens new doors for archaeological exploration, enriching our understanding of our species' journey across continents and highlighting this region's pivotal role in shaping human history.” 

The study ‘The Persian Plateau served as Hub for Homo sapiens after the main Out of Africa dispersal’ has been published in Nature Communications. 


20,000 years of shared history on the Persian plateau



Peer-Reviewed Publication

ESTONIAN RESEARCH COUNCIL

Figure 1: The Persian Plateau, the most likely place where the ancestors of all present-day non-Africans lived for the 20,000 years that followed their migration out of Africa, a period during which they also mixed their genes with those of the Neanderthals. (Credit: the authors of the original publication)




All present-day non-African human populations are the result of subdivisions that took place after their ancestors left Africa at least 60,000 years ago. How long did it take for these separations to take place? Almost 20,000 years, during which they were all part of a single population. Where did they live for all this time? We don't know yet.

This is a conversation that could have taken place a year ago. Now it is possible to give clearer answers to these questions, thanks to the study recently published in Nature Communications (1), led by researchers from the University of Padova in collaboration with the University of Bologna (Department of Cultural Heritage), Griffith University in Brisbane, the Max Planck Institute in Jena and the University of Turin.

The ancestors of all present-day Eurasians, Americans and Oceanians moved out of Africa between 70 and 60 thousand years ago. After reaching Eurasia, these early settlers idled for some millennia as a homogeneous population, in a presumably localized area, before expanding their range across the whole continent and beyond. This event set the basis for the genetic divergence between present-day Europeans and East Asians, and can be dated to around 45 thousand years ago. The dynamics that led to the broader colonization of Eurasia were already reconstructed by some of the authors in a previous publication in 2022 (2), and occurred through a series of chronologically, genetically and culturally distinct expansions. By contrast, the geographic area where the ancestors of all non-Africans lived after the Out of Africa dispersal, and which acted as a “Hub” for the subsequent movements of Homo sapiens, has been the matter of a long-standing debate, with most of West Asia, North Africa, South Asia and even South East Asia having been listed as potentially suitable locations.

In their latest work, the authors deployed a novel genetic approach and identified ancient and modern populations from the Persian Plateau as the ones carrying genetic traces that most closely resemble the features of the Hub population, pinpointing the area as the likely homeland of all early Eurasians. “The most difficult part” says Leonardo Vallini, first author of the study, “has been to disentangle the various confounding factors constituted by 45 thousand years of population movements and admixtures that took place after the Hub was settled”.

The multidisciplinary study also investigated the paleoecological characteristics of the area at the time, and confirmed it as suitable for human occupation, potentially capable of sustaining a larger population than other parts of West Asia. “Identifying the Persian Plateau as a Hub for early human migration opens new doors for archaeological and palaeoanthropological research” added co-author Professor Michael Petraglia of Griffith University in Brisbane.

In fact, the Persian plateau will be the focus of the ERC Synergy Project 'LAST NEANDERTHALS', recently awarded to co-author Stefano Benazzi, professor at the University of Bologna (Department of Cultural Heritage). "In line with the results of the study," says Benazzi, "this ERC project aims to explore and unravel the intricate biocultural events that occurred between 60,000 and 40,000 years ago, focusing also on the Persian Plateau".

“With our work we found a home to 20,000 years of shared history between Europeans, East Asians, Native Americans and Oceanians. This leg of the human journey out of Africa is fascinating, since it is the one where we also met and mixed our genes with the ones of Neanderthals” concluded Professor Luca Pagani, senior author of the study.

References

  1. Leonardo Vallini, Carlo Zampieri, Mohamed Javad Shoaee, Eugenio Bortolini, Giulia Marciani, Serena Aneli, Telmo Pievani, Stefano Benazzi, Alberto Barausse, Massimo Mezzavilla, Michael D. Petraglia, Luca Pagani. The Persian Plateau served as Hub for Homo sapiens after the main Out of Africa dispersal. Nature Communications (2024). https://www.nature.com/articles/s41467-024-46161-7
  2. Leonardo Vallini, Giulia Marciani, Serena Aneli, Eugenio Bortolini, Stefano Benazzi, Telmo Pievani, Luca Pagani. Genetics and Material Culture Support Repeated Expansions into Paleolithic Eurasia from a Population Hub Out of Africa. Genome Biology and Evolution, Volume 14, Issue 4, April 2022, evac045. https://doi.org/10.1093/gbe/evac045

 

Link to the study: https://www.nature.com/articles/s41467-024-46161-7

Title: The Persian Plateau served as Hub for Homo sapiens after the main Out of Africa dispersal – «Nature Communications» – 2024

Authors: Leonardo Vallini, Carlo Zampieri, Mohamed Javad Shoaee, Eugenio Bortolini, Giulia Marciani, Serena Aneli, Telmo Pievani, Stefano Benazzi, Alberto Barausse, Massimo Mezzavilla, Michael D. Petraglia, Luca Pagani

 

Humans pass more viruses to other animals than we catch from them



UNIVERSITY COLLEGE LONDON





Humans pass on more viruses to domestic and wild animals than we catch from them, according to a major new analysis of viral genomes by UCL researchers.

For the new paper published in Nature Ecology & Evolution, the team analysed all publicly available viral genome sequences, to reconstruct where viruses have jumped from one host to infect another vertebrate species.

Most emerging and re-emerging infectious diseases are caused by viruses circulating in animals. When these viruses cross over from animals into humans, a process known as zoonosis, they can cause disease outbreaks, epidemics and pandemics such as Ebola, flu or Covid-19. Given the enormous impact of zoonotic diseases on public health, humans have generally been considered as a sink for viruses rather than a source, with human-to-animal transmission of viruses receiving far less attention.

For the study, the research team developed and applied methodological tools to analyse the nearly 12 million viral genomes that have been deposited on public databases to date. Leveraging this data, they reconstructed the evolutionary histories and past host jumps of viruses across 32 viral families, and looked for which parts of the viral genomes acquired mutations during host jumps.

The scientists found that roughly twice as many host jumps were inferred to be from humans to other animals (known as anthroponosis) as the other way round. This pattern was consistent throughout most of the viral families considered. Additionally, they found even more animal-to-animal host jumps that did not involve humans.
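As a rough illustration of the kind of tally behind that comparison, the snippet below classifies a handful of hypothetical inferred host jumps by direction. The host pairs are invented purely for illustration; the study itself inferred jumps from reconstructed phylogenies across 32 viral families.

```python
from collections import Counter

# Hypothetical examples only; the study inferred jumps from viral phylogenies,
# not from a hand-made list like this.
inferred_jumps = [
    ("human", "pig"), ("human", "mink"), ("bird", "human"),
    ("human", "deer"), ("bat", "pig"), ("pig", "human"),
]

def classify(source_host, new_host):
    """Label the direction of a single inferred host jump."""
    if source_host == "human":
        return "anthroponotic (human -> animal)"
    if new_host == "human":
        return "zoonotic (animal -> human)"
    return "animal -> animal"

counts = Counter(classify(src, dst) for src, dst in inferred_jumps)
for direction, n in counts.most_common():
    print(f"{direction}: {n}")
```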

The team’s work highlights the high, and largely underappreciated, frequency with which human viruses spread from humans into wild and domestic animals.

Co-author Professor Francois Balloux (UCL Genetics Institute) said: “We should consider humans just as one node in a vast network of hosts endlessly exchanging pathogens, rather than a sink for zoonotic bugs.

“By surveying and monitoring transmission of viruses between animals and humans, in either direction, we can better understand viral evolution and hopefully be more prepared for future outbreaks and epidemics of novel illnesses, while also aiding conservation efforts.”

The findings also show that, on average, viral host jumps are associated with an increase in genetic changes, or mutations in viruses, relative to their continued evolution alongside just one host animal, reflecting how viruses must adapt to better exploit their new hosts.

Further, viruses that already infect many different animals show weaker signals of this adaptive process, suggesting that viruses with broader host ranges may possess traits that make them inherently more capable of infecting a diverse range of hosts, whereas other viruses may require more extensive adaptations to infect a new host species.

Lead author, PhD student Cedric Tan (UCL Genetics Institute and Francis Crick Institute) said: “When animals catch viruses from humans, this can not only harm the animal and potentially pose a conservation threat to the species, but it may also cause new problems for humans by impacting food security if large numbers of livestock need to be culled to prevent an epidemic, as has been happening over recent years with the H5N1 bird flu strain.

“Additionally, if a virus carried by humans infects a new animal species, the virus might continue to thrive even if eradicated among humans, or even evolve new adaptations before it winds up infecting humans again.

“Understanding how and why viruses evolve to jump into different hosts across the wider tree of life may help us figure out how new viral diseases emerge in humans and animals.”

Cell entry is generally seen as the first step for a virus to infect a host. However, the team found that many of the adaptations associated with host jumps were not found in the viral proteins that enable them to attach to and enter host cells, which points to viral host adaptation being a complex process that remains to be fully understood.

Co-author Dr Lucy van Dorp (UCL Genetics Institute) said: “Our research was made possible only by the countless research teams that have openly shared their data via public databases. The key challenge, moving forward, is to integrate the knowledge and tools from diverse disciplines including genomics, epidemiology, and ecology to enhance our understanding of host jumps.”

 

New UM study reveals unintended consequences of fire suppression



THE UNIVERSITY OF MONTANA

Image: New research from the University of Montana suggests attempting to suppress all wildfires causes them to burn with greater severity. (Credit: University of Montana photo by Tommy Martino)




MISSOULA – The escalation of extreme wildfires globally has prompted a critical examination of wildfire management strategies. A new study from the University of Montana reveals how fire suppression ensures that wildfires will burn under extreme conditions at high severity, exacerbating the impacts of climate change and fuel accumulation.

The study used computer simulations to show that attempting to suppress all wildfires results in fires burning with more severe ecological impacts, with accelerated increases in burned area beyond those expected from fuel accumulation or climate change.

“Fire suppression has unintended consequences,” said lead author Mark Kreider, a Ph.D. candidate in the forest and conservation sciences program at UM. “We’ve known for a long time that suppressing fires leads to fuel accumulation. Here, we show a separate counter-intuitive outcome.”

Though fire suppression reduces the overall area burned, it mainly eliminates low- and moderate-intensity fires. As a result, the remaining fires are biased to be more extreme, Kreider said. The new study, published March 25 in Nature Communications, shows how this “suppression bias” causes average fire severity to increase substantially.
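The selection effect can be sketched in a few lines of code: if fires are drawn from a skewed intensity distribution and everything below some threshold is successfully put out, the fires that remain are, on average, far more intense, even though the underlying fire regime has not changed. The distribution and threshold below are arbitrary placeholders, not values from the study's simulations.

```python
import numpy as np

# Illustrative sketch of the suppression bias, not the study's simulation.
rng = np.random.default_rng(42)
intensities = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)  # arbitrary units

suppression_threshold = 2.0   # assumed: low- and moderate-intensity fires get put out
escaped = intensities[intensities > suppression_threshold]      # fires that burn anyway

print(f"mean intensity, no suppression: {intensities.mean():.2f}")
print(f"mean intensity, escaped fires:  {escaped.mean():.2f}")
print(f"share of fires that escape:     {escaped.size / intensities.size:.1%}")
```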

“Over a human lifespan, the modeled impacts of the suppression bias outweigh those from fuel accumulation or climate change alone,” he said. “This suggests that suppression may exert a significant and underappreciated influence on patterns of fire globally.”

Kreider led the research as part of his Ph.D. dissertation work with the support of a National Science Foundation Graduate Research Fellowship.

Fire suppression exacerbated the trends already caused by climate change and fuel accumulation, the study found, causing areas burned to increase three to five times faster over time relative to a world with no suppression.

Suppression, through preferentially removing low- and moderate-severity fire, also raised average fire severity by an amount equivalent to a century of fuel accumulation or climate change.

“By attempting to suppress all fires, we are bringing a more severe future to the present,” said Kreider.

Andrew Larson, Kreider’s Ph.D. adviser and a professor of forest ecology at UM, said this has significant impacts on ecosystems.

“Traditional suppression removes the low-severity fires that help perpetuate healthy forests by consuming fuels and preferentially killing thin-barked tree species,” Larson said. “I wonder how much we are altering natural selection with fire suppression by exposing plants and animals to relatively less low-severity fire and relatively more high-severity fire.”

However, the new findings also show that allowing more low- and moderate-intensity fire can reduce or reverse the impacts of the suppression bias. Suppression strategies that allow fire to burn under moderate weather conditions – while still suppressing fires during more dangerous fire weather – reduced average fire severity and moderated the rate of burned area increase, the team found.

“It may seem counterintuitive, but our work clearly highlights that part of addressing our nation’s fire crisis is learning how to accept more fires burning when safely possible,” said Philip Higuera, a co-author and UM professor of fire ecology. “That’s as important as fuels reduction and addressing global warming.”

Developing and implementing technologies and strategies to safely manage wildfires during moderate burning conditions is essential, Kreider said. This approach may be just as effective as other necessary interventions, like mitigating climate change and decreasing unintentional human-related ignitions.

The article, “Fire suppression makes wildfires more severe and accentuates impacts of climate change and fuel accumulation,” was co-authored by Kreider, Larson, Higuera, William Rice, and Nadia White from the University of Montana, as well as Sean Parks, an ecologist with the Aldo Leopold Wilderness Research Institute.

###

 

Small changes can yield big savings in agricultural water use


California agriculture could potentially avoid more extreme water-saving measures as it faces challenges from climate change and water scarcity



UNIVERSITY OF CALIFORNIA - SANTA BARBARA

Image: Three schemes (switching crops, changing farming practices, and fallowing fields) all yielded average water savings of around 10%. (Credit: Boser et al.)




(Santa Barbara, Calif.) — While Hollywood and Silicon Valley love the limelight, California is an agricultural powerhouse, too. Agricultural products sold in the Golden State totaled $59 billion in 2022. But rising temperatures, declining precipitation and decades of overpumping may require drastic changes to farming. Legislation to address the problem could even see fields taken out of cultivation.

Fortunately, a study out of UC Santa Barbara suggests less extreme measures could help address California’s water issues. Researchers combined remote sensing, big data and machine learning to estimate how much water crops use in the state’s Central Valley. The results, published in Nature Communications, suggest that variation in efficiency due to farming practices could save as much water as switching crops or fallowing fields.

“There’s an opportunity for less obtrusive methods of saving water to be more important than we originally thought,” said lead author Anna Boser, a doctoral student at UCSB’s Bren School of Environmental Science & Management. “So we might not have to make as many changes in land use as we originally thought.”

California’s fertile soils and Mediterranean climate enable farmers to cultivate high-value crops that just aren’t viable in the rest of the country. Over a third of the country’s vegetables, and nearly three-quarters of fruits and nuts, are grown in California, according to the state’s Department of Food and Agriculture.

But many of these crops are quite thirsty. Agriculture accounts for around 80% of water used in California, explained co-author Kelly Caylor, a professor at the Bren School. “Declining groundwater levels and a changing climate put pressure on the availability of irrigation water, making it critical to determine how we can ‘do more with less.’”

In 2014, Sacramento passed the Sustainable Groundwater Management Act (SGMA) to secure California’s water resources. SGMA mandates that every groundwater basin in the state must be sustainable by 2040. Each basin created a local agency tasked with developing a plan to meet this goal. Mostly, that means ensuring that we don’t pump more water out of the ground than what seeps in. We will need to reduce total groundwater use by 20% to 50% by 2040, depending on the basin, Boser said. But to accomplish this, we need an idea of how much water farms use, and what fraction of that actually makes it to crops.

Modeling water use

Scientists have a variety of methods to estimate the amount of water ascending from the Earth’s surface to the atmosphere due to evaporation and transpiration through plant leaves. Notably, evaporation cools things down. “When we get hot, we sweat to cool off. The Earth does something similar,” Boser said. Scientists look at how warm the ground is and see how much energy it’s getting from sunlight and the atmosphere. If the ground is cooler than expected, it means some of that energy was used to turn water into vapor, which cools down that spot.
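That energy-balance logic can be written down directly: whatever absorbed energy is not warming the ground or the air must be going into evaporating water. The flux values below are arbitrary round numbers chosen for illustration; satellite-based estimates use far more detailed models.

```python
# Simplified surface energy balance behind remote-sensing ET estimates.
# Flux values are illustrative assumptions, not measurements.
LATENT_HEAT = 2.45e6        # J per kg of water vaporized (approx., 20 C)

net_radiation = 200.0       # W/m^2, assumed daily average energy absorbed
soil_heat_flux = 20.0       # W/m^2, assumed heat conducted into the ground
sensible_heat = 60.0        # W/m^2 warming the air; small when the surface stays cool

latent_heat_flux = net_radiation - soil_heat_flux - sensible_heat   # W/m^2
evaporation_rate = latent_heat_flux / LATENT_HEAT                   # kg/m^2/s

mm_per_day = evaporation_rate * 86_400   # 1 kg of water per m^2 equals 1 mm depth
print(f"implied evapotranspiration: {mm_per_day:.1f} mm/day")       # ~4 mm/day
```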

An evapotranspiration database called OpenET became publicly available in early 2023. It provides satellite-based evapotranspiration estimates for the western United States. But Boser was interested in the water being used specifically by crops. So, she compared transpiration in fallowed fields to active fields across the Central Valley. Subtracting evapotranspiration in fallow fields from total evapotranspiration yields the amount of water that crops are actually consuming.

Unfortunately for Boser, farmers don’t fallow fields randomly. Often they’ll take their lowest-yielding fields out of production. That creates systematic differences between fallowed and cultivated fields, which could skew Boser’s analysis. So, she created a machine-learning model to conduct a weighted comparison between active and fallowed land, accounting for factors like location, topography and soil quality.

She trained the model on 60% of the areas and tested its results on 30%, fine-tuning the algorithm until its predictions matched the actual conditions in these fields within 10 milliliters per square meter per day, on average. Now confident in her model, she applied it to the rest of California’s Central Valley.
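The core counterfactual idea can be sketched as follows: fit a model of evapotranspiration on fallowed fields as a function of field characteristics, predict that fallow baseline for every cultivated field, and treat the difference from observed evapotranspiration as crop water use. The file name, column names and choice of regressor here are hypothetical; the paper's weighted machine-learning comparison is more involved.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Sketch of the fallow-counterfactual idea, not the paper's exact pipeline.
# Assumed columns: 'et' (evapotranspiration, mm/day), a 'fallow' flag, and
# covariates such as soil quality, elevation and mean temperature.
fields = pd.read_csv("central_valley_fields.csv")   # hypothetical input file
covariates = ["soil_quality", "elevation", "mean_temp"]

# 1. Learn what ET looks like on fallow land, given field characteristics.
fallow = fields[fields["fallow"] == 1]
model = GradientBoostingRegressor().fit(fallow[covariates], fallow["et"])

# 2. Predict the fallow counterfactual for every cultivated field.
active = fields[fields["fallow"] == 0].copy()
active["et_fallow_pred"] = model.predict(active[covariates])

# 3. Crop transpiration = observed ET minus the fallow counterfactual.
active["crop_transpiration"] = active["et"] - active["et_fallow_pred"]
print(active["crop_transpiration"].describe())
```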

Encouraging results

Crop type only explained 34% of the variation in water consumption. “What that means is maybe we’re overlooking some other ways that we could save water,” Boser said. She continued to investigate the model, controlling for factors like location, topography, local climate, soil quality and orchard age (when applicable). Ultimately, a full 10% of crop transpiration could be saved if the top 50% of water users reduced their water consumption to match that of their median-consuming neighbors. Boser attributes these savings to differences in “farming practices.”

Now, 10% might not sound like a lot, but it’s comparable to a number of other interventions. The authors also estimated the effect of switching crops. If the same 50% of farmers switched to the median water-intensive crops for their area, agricultural evapotranspiration would drop by 10%.

Meanwhile, if the state took the top 5% most water-hungry fields out of production, the model says agricultural evapotranspiration would drop by, you guessed it: 10%. This suggests that addressing inefficiencies in farming practices could save as much water as switching crops or taking fields out of cultivation.
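The arithmetic behind the farming-practices estimate is simple once fields are grouped by comparable conditions: cap every above-median user at the median of its group and see how much total water use falls. The toy table below is invented purely to illustrate that calculation; the study controlled for crop, location, climate, soil and orchard age rather than the two placeholder columns used here.

```python
import pandas as pd

# Invented example of the "reduce heavy users to the median" calculation.
fields = pd.DataFrame({
    "crop":   ["almond", "almond", "almond", "tomato", "tomato", "tomato"],
    "region": ["kern",   "kern",   "kern",   "yolo",   "yolo",   "yolo"],
    "water":  [9.0, 7.0, 5.0, 4.0, 3.5, 2.0],   # crop water use, arbitrary units
})

# Median water use among comparable fields (same crop and region).
group_median = fields.groupby(["crop", "region"])["water"].transform("median")

# Above-median users drop to the median; everyone else is unchanged.
capped = fields["water"].clip(upper=group_median)

saving = 1 - capped.sum() / fields["water"].sum()
print(f"water saved by capping at the group median: {saving:.1%}")
```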

To be fair, the results from fallowing would affect only 5% of cultivated land, as opposed to 50% for crop changes and improved farming practices. “We’re probably going to have to fall back on fallowing at least a little bit,” Boser said, “but hopefully not as much as we were originally expecting.”

The authors want to figure out what practices farmers are using that account for the 10% differences in crop water usage. Some examples include mulching, no-till planting, using drought-tolerant varietals, and deficit irrigation — where you provide less water than the crop could theoretically consume. Deficit irrigation already yields good results in viticulture, where vintners find it can improve the quality of wine.

Changing irrigation practices could also help reduce water use. Irrigation efficiency accounts for the fraction of water a farm uses that actually gets consumed by crops. Inefficiencies include leakage, weed growth and evaporation in transport and in the field. These weren’t within the scope of Boser’s model, which only considers transpiration by the crops themselves. Inefficiencies happen before the water even gets to the plants.

According to Boser, up to 60% of the water a farm uses actually passes through the roots of its crops. Clearly there’s plenty of potential gains in this area, though it isn’t clear what efficiency is actually attainable, she said. “Irrigation efficiency is actually quite poorly understood.”

Better characterizing this is on the team’s to-do list. They hope to identify the causes of irrigation inefficiencies, quantify the efficiencies of different types of irrigation, and learn how climate and geography affect irrigation efficiency. All this will require collecting empirical data.

California is at a critical crossroads in water management. For the first time in its history, the state is putting in place regulations that require substantial reductions in groundwater extraction, including in regions where livelihoods depend on thirsty agricultural production.

“This paper uses novel, data-driven methods to show that, contrary to popular belief, there is large potential to cut water use in California agriculture without fallowing fields,” said co-author Tamma Carleton, an assistant professor at UCSB’s Bren School. “This raises the possibility that the state can continue its role as an agricultural powerhouse while also sustainably managing an essential natural resource.”

 

Better phosphorus use can ensure its stocks last more than 500 years and boost global food production - new evidence shows


More efficient use of phosphorus could see limited stocks of the important fertiliser last more than 500 years and boost global food production to feed growing populations.



LANCASTER UNIVERSITY





More efficient use of phosphorus could see limited stocks of the important fertiliser last more than 500 years and boost global food production to feed growing populations.

But these benefits will only happen if countries are less wasteful with how they use phosphorus, a study published today in Nature Food shows.

Around 30-40 per cent of farm soils have over-applications of phosphorus, with European and North American countries over-applying the most.

The global population is due to hit nearly 10 billion by 2050 and it is estimated that to feed this increased population a further 500 million hectares of arable land will be needed – unless phosphorus can be more efficiently used to boost and maintain crop yields.

Listed as a critical raw material by the European Union, and recently a topic of discussion by the United Nations Environment Assembly, globally 20,500 kilotons of phosphorus are applied to agricultural soils each year as fertiliser.

Concerns have been raised about its limited supply and its loss to freshwater, where it can degrade water quality. Phosphorus predominantly comes from mined phosphate rock, of which there are relatively few sources, located in countries such as Morocco and Russia.

Previous estimates of how much phosphorus we have left globally have varied greatly from between 30 to over 300 years. These prior estimates were based on current wasteful practices continuing and contained a lot of uncertainty.

This latest research on global phosphorus use and soil concentrations, by scientists at Lancaster University in the UK as well as AgResearch and Lincoln University in New Zealand, examined the concentrations of phosphorus in farm soils across the globe needed for optimum growth of 28 major food crops, from wheat and maize to rice and apples. The research revealed soils that did not contain enough phosphorus, as well as soils that contained concentrations higher than plants need for optimal growth.

Their findings shed new light on the amounts of phosphorus available in soils and needed as fertilisers and reveal that phosphorus reserves could last for up to 531 years if we use it more efficiently and equitably – that’s 77 years longer than if we stick with current practices.

Professor Phil Haygarth of Lancaster University and co-author of the paper said: “Phosphorus is an essential fertiliser that drives food production on farms around the world. It’s the ‘energy’ of agriculture that drives our food systems, but we need to manage our supplies carefully.

“We need to seek ways to be more efficient and sustainable with its use and our study shows that there’s considerable potential to improve the efficiency of how we use phosphorus fertilisers. We show it’s possible to optimise global food production without accelerating the depletion of precious and finite global phosphorus fertiliser reserves.  

“We are unlikely to run out of phosphorus in the next 500 years, but only if we apply as much as needed to produce optimal crop yields and stop wasteful over-applications.”

The research team calculated 10,556 kt of phosphorus is wasted each year through over-application with much of that dominated by wheat and grassland in Europe and maize and rice in Asia.

Professor Richard McDowell of Lincoln University and AgResearch New Zealand and lead author of the study said: “Many farmers over-apply phosphorus to bank it in the soil. However, only a fraction of soil phosphorus can be used by plants. Adjusting applications to sustain the levels that plants need to produce optimal yields negates the need for phosphorus to be wasted. If there are excessive levels in soil that plants can’t use, phosphorus can potentially be lost to water, which risks causing water quality problems like eutrophication.”

But it is not all about reductions. The scientists, using data for farmland globally, also calculated that around the world nearly three quarters of farmed soils are in phosphorus deficit - with phosphorus deficits being most acute in Asian countries such as India. As a result, the researchers calculate that globally there needs to be an application of almost 57,000 kt of phosphorus to alleviate those soils in deficit to boost crop yields.

They then calculated that 17,500 kt of phosphorus is needed each year to maintain optimum soil phosphorus concentrations. This would result in a global reduction in the demand for phosphorus by around 3,000 kt annually.
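A back-of-envelope check ties these figures together, under the simplifying assumption of a fixed total reserve drawn down in proportion to annual demand (the paper's own projections are more detailed): cutting annual applications from roughly 20,500 kt to 17,500 kt saves about 3,000 kt a year, and stretching a fixed reserve by that ratio takes a current-practice horizon of roughly 454 years (531 minus the reported 77-year gain) out to about 530 years.

```python
# Consistency check of the reported figures, assuming a fixed reserve that
# is drawn down in proportion to annual phosphorus demand.
current_use_kt = 20_500      # kt applied to agricultural soils each year
optimal_use_kt = 17_500      # kt per year needed to maintain optimal soil levels

annual_saving_kt = current_use_kt - optimal_use_kt
print(f"annual demand reduction: {annual_saving_kt:,} kt")           # 3,000 kt

# Reported lifetimes: up to 531 years with efficient use, 77 years longer
# than current practice, implying roughly 531 - 77 = 454 years today.
current_lifetime_yr = 531 - 77
scaled_lifetime_yr = current_lifetime_yr * current_use_kt / optimal_use_kt
print(f"lifetime scaled by the demand ratio: {scaled_lifetime_yr:.0f} years")  # ~532
```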

Professor McDowell said: “The science is clear, but to use phosphorus efficiently and extend supplies, governments need to collaborate to make policies that promote phosphorus use only where needed. That will involve balancing distributions of phosphorus for optimal crop growth and reducing subsidies that sustain overuse of phosphorus and likely cause water quality problems.”

The findings are outlined in the paper ‘Phosphorus applications adjusted to optimal crop yields can help sustain global phosphorus reserves’ published by Nature Food.

The paper’s authors are Professor Richard McDowell and Peter Pletnyakov of Lincoln University and AgResearch, and Professor Phil Haygarth of Lancaster University. Professor McDowell was funded by New Zealand’s Our Land and Water National Science Challenge.