Saturday, May 04, 2024

 

To bend the curve of biodiversity loss, nature recovery must be integrated across all sectors


UTRECHT UNIVERSITY
Image: Montagu's Harrier, a rare bird of prey, hunting in an agricultural landscape in the north of the Netherlands. Credit: Hens Runhaar





The alarming rates of biodiversity loss worldwide have made clear that the classical way of governing biodiversity recovery, based on protected areas and programmes for the protection of endangered species, is not enough. To tackle this, almost 200 countries committed to the active ‘mainstreaming’, or integration, of biodiversity targets into policies and plans across relevant sectors. However, research led by Utrecht University and the UFZ Helmholtz Centre for Environmental Research suggests that this has so far been largely ineffective due to non-binding commitments, vaguely formulated targets, “add-on” biodiversity initiatives, and too few resources. “Top-down regulation is also needed,” say the authors.

‘Biodiversity mainstreaming’ refers to the process of integrating biodiversity considerations into various sectors, policies, and plans. It aims to ensure that biodiversity conservation and sustainable use are incorporated into decision-making across different sectors such as agriculture, forestry, urban planning, and infrastructure development. The goal is to make biodiversity a central consideration rather than treating it as a separate or peripheral issue.

Since it was first drafted in 1992, almost 200 countries have ratified the UN Convention on Biological Diversity (CBD) and committed to ‘bending the curve of biodiversity loss’. Biodiversity mainstreaming is an increasingly key strategy, with the CBD’s Kunming-Montreal Global Biodiversity Framework, adopted in 2022, promoted as a ‘Paris Agreement for Nature’.

Current mainstreaming efforts ineffective

Although mainstreaming biodiversity targets into sectoral policies is considered essential to address the direct and indirect drivers of biodiversity loss, such as land use change, resource exploitation, pollution, and consumption patterns, the study, which analysed 43 studies on the topic, shows that current efforts are ineffective. “Biodiversity targets are often vague, initiatives are ‘add-on’ rather than integrated, and resources allocated to biodiversity recovery are insufficient,” explains lead author Hens Runhaar, Professor of Sustainable Food System Governance at Utrecht University’s Copernicus Institute of Sustainable Development. For example, biodiversity initiatives often exist in isolation from policies that directly regulate drivers of biodiversity loss, such as agricultural intensification or spatial planning.

"Biodiversity targets are often vague, initiatives are “add-on” rather than integrated, and resources allocated to biodiversity recovery are insufficient"

Conflicting targets between sectors and unclear responsibilities also hamper effective mainstreaming. It is often assumed that efforts to conserve or restore biodiversity imply a loss of productivity in sectors like agriculture, forestry and fisheries. However, there is growing evidence to the contrary. For example, increasing plant biodiversity in grasslands can help dairy farmers become more resilient against droughts, which occur more often due to climate change. “This discourse, in combination with a predominantly voluntary approach, has also contributed to the ineffectiveness of mainstreaming efforts,” says Runhaar.

On the positive side, the authors note, biodiversity loss is increasingly considered a risk among financial institutions. In its 2023 report, the European Central Bank calculated that about 75% of bank loans in the euro area go to companies that are highly dependent on at least one ecosystem service, and are therefore exposed to biodiversity loss. “This is ramping up financial interest in halting biodiversity loss,” says co-author Yves Zinngrebe, researcher at the UFZ Helmholtz Centre for Environmental Research in Germany.

Both “sticks” and “carrots” needed

A combination of regulatory measures (“sticks”) and incentives (“carrots”) may be more effective than voluntary approaches alone, say the authors. Legal requirements for biodiversity action, along with showcasing the benefits of biodiversity to different sectors, could encourage greater commitment to mainstreaming. “For example, the increased popularity of ‘urban Nature Based Solutions’ that simultaneously contribute to biodiversity, climate change adaptation, social cohesion, and healthy urban living suggests win-wins are possible,” says Runhaar, “but this way of thinking is not widely accepted yet.”

 

Centipedes used in traditional Chinese medicine offer leads for kidney treatment




AMERICAN CHEMICAL SOCIETY





A venomous, 8-inch centipede may be the stuff of nightmares, but it could save the lives of those affected by kidney disease. Researchers report in the Journal of Natural Products that the many-legged critter — used in traditional Chinese medicine — contains alkaloids that, in cell cultures, reduced inflammation and renal fibrosis, both of which contribute to kidney disease. The journal is copublished by the American Chemical Society and the American Society of Pharmacognosy.

Some 1,500 species of animals are used in traditional Chinese medicine, but little is known about many of the secondary metabolites their bodies produce for specialized functions such as immobilizing prey. The few compounds that have been studied, such as toad venom for cancer treatment, have proved to be fruitful leads for drug development. So, Yong-Xian Cheng and colleagues decided to examine the secondary metabolites produced by the Chinese red-headed centipede (Scolopendra subspinipes mutilans). The venomous centipede has been used for thousands of years in treatments for conditions including epilepsy, tuberculosis, burns and cardiovascular disease.

The researchers mixed a sample of dried centipede powder with ethanol to extract numerous compounds from the animals and then separated and identified the constituents with techniques such as chromatography and spectrometry. The team found 12 new quinoline and isoquinoline alkaloids, including some with unusual molecular structures, along with a half dozen other alkaloids that had previously been detected in this species or in plants. In cell cultures, some of the alkaloids showed anti-inflammatory behavior, while a portion also reduced renal fibrosis. This buildup of connective tissue is associated with chronic kidney disease and is stimulated by inflammation. Finally, the researchers identified a protein that plays a role in renal fibrosis and that was targeted by the most effective dual-function alkaloid. This information could provide a lead for developing treatments for kidney disease, according to the researchers.

The authors acknowledge funding from the Shenzhen Science and Technology Program.

###

The American Chemical Society (ACS) is a nonprofit organization chartered by the U.S. Congress. ACS’ mission is to advance the broader chemistry enterprise and its practitioners for the benefit of Earth and all its people. The Society is a global leader in promoting excellence in science education and providing access to chemistry-related information and research through its multiple research solutions, peer-reviewed journals, scientific conferences, eBooks and weekly news periodical Chemical & Engineering News. ACS journals are among the most cited, most trusted and most read within the scientific literature; however, ACS itself does not conduct chemical research. As a leader in scientific information solutions, its CAS division partners with global innovators to accelerate breakthroughs by curating, connecting and analyzing the world’s scientific knowledge. ACS’ main offices are in Washington, D.C., and Columbus, Ohio.

To automatically receive news releases from the American Chemical Society, contact newsroom@acs.org.

Note: ACS does not conduct research, but publishes and publicizes peer-reviewed scientific studies.


 

Georgia Tech and Meta create massive open dataset to advance AI solutions for carbon capture



The project aims to accelerate direct air capture development while significantly reducing costs.



Peer-Reviewed Publication

GEORGIA INSTITUTE OF TECHNOLOGY

Video: Georgia Tech and Meta collaborated to produce a massive open database for designing direct air capture technologies. Credit: Georgia Institute of Technology; video: Chris McKenney; MOF model screen capture: Logan Brabson




To avoid catastrophic climate impacts, excessive carbon emissions must be addressed. At this point, cutting emissions isn’t enough. Direct air capture, a technology that pulls carbon dioxide out of ambient air, has great potential to help solve the problem.

But there’s a big challenge. For direct air capture technology, every type of environment and location requires a uniquely specific design. A direct air capture configuration in Texas, for example, would necessarily be different from one in Iceland. These systems must be designed with exact parameters for humidity, temperature, and air flows for each place.

Now, Georgia Tech and Meta have collaborated to produce a massive database, potentially making it easier and faster to design and implement direct air capture technologies. The open-source database enabled the team to train an AI model that is orders of magnitude faster than existing chemistry simulations. The project, named OpenDAC, could accelerate climate solutions the planet desperately needs.

The team’s research was published in ACS Central Science, a journal of the American Chemical Society.

“For direct air capture, there are many ideas about how best to take advantage of the air flows and temperature swings of a given environment,” said Andrew J. Medford, associate professor in the School of Chemical and Biomolecular Engineering (ChBE) and a lead author of the paper. “But a major problem is finding a material that can capture carbon efficiently under each environment’s specific conditions.”

Their idea was to “create a database and a set of tools to help engineers broadly, who need to find the right material that can work,” Medford said. “We wanted to use computing to take them from not knowing where to start to giving them a robust list of materials to synthesize and try.”

Containing reaction data for 8,400 different materials and powered by nearly 40 million quantum mechanics calculations, the database is, the team believes, the largest and most robust dataset of its kind.

Building a Partnership (and a Database)

Researchers with Meta’s Fundamental AI Research (FAIR) team were looking for ways to harness their machine learning prowess to address climate change. They landed on direct air capture as a promising technology and needed to find a partner with expertise in materials chemistry as it relates to carbon capture. They went straight to Georgia Tech.

David Sholl, ChBE professor, Cecile L. and David I.J. Wang Faculty Fellow, and director of Oak Ridge National Laboratory’s Transformational Decarbonization Initiative, is one of the world’s top experts in metal-organic frameworks (MOFs). These are a class of materials promising for direct air capture because of their cagelike structure and proven ability to attract and trap carbon dioxide. Sholl brought Medford, who specializes in applying machine learning models to atomistic and quantum mechanical simulations as they relate to chemistry, into the project.

Sholl, Medford, and their students provided all the inputs for the database. Because the database predicts the MOF interactions and the energy output of those interactions, considerable information was required.

They needed to know the structure of nearly every known MOF — both the MOF structure by itself and the structure of the MOF interacting with carbon dioxide and water molecules.

“To predict what a material might do, you need to know where every single atom is and what its chemical element is,” Medford said. “Figuring out the inputs for the database was half of the problem, and that’s where our Georgia Tech team brought the core expertise.”

The team took advantage of large collections of MOF structures that Sholl and his collaborators had previously developed. They also created a large collection of structures that included imperfections found in practical materials.

The Power of Machine Learning

Anuroop Sriram, research engineering lead at FAIR and first author on the paper, generated the database by running quantum chemistry computations on the inputs provided by the Georgia Tech team. These calculations used about 400 million CPU hours, which is hundreds of times more computing than the average academic computing lab can do in a year.

FAIR also trained machine learning models on the database. Once trained on the 40 million calculations, the machine learning models were able to accurately predict how the thousands of MOFs would interact with carbon dioxide.

The team demonstrated that their AI models are powerful new tools for material discovery, offering comparable accuracy to traditional quantum chemistry calculations while being much faster. These features will allow other researchers to extend the work to explore many other MOFs in the future.

“Our goal was to look at the set of all known MOFs and find those that most strongly attract carbon dioxide while not attracting other air components like water vapor, and using these highly accurate quantum computations to do so,” Sriram said. “To our knowledge, this is something no other carbon capture database has been able to do.”

Putting their own database to use, the Georgia Tech and Meta teams identified 241 MOFs of exceptionally high potential for direct air capture.
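To make that screening step concrete, below is a minimal, hypothetical Python sketch of how predicted energies can be turned into such a shortlist: the adsorption energy of each guest molecule is taken as E(MOF+guest) − E(MOF) − E(guest), and a framework is kept only if it binds CO2 strongly while binding water weakly. The reference energies, thresholds, and MOF names are illustrative placeholders, not values from the OpenDAC dataset or the team's code; in the actual workflow the energies would come from the trained machine learning model.

# Illustrative screening sketch (not the OpenDAC code): keep MOFs whose predicted
# CO2 adsorption is strong while water adsorption stays weak. All numbers are
# hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class MOFPrediction:
    name: str
    e_mof: float          # predicted energy of the empty framework (eV)
    e_mof_co2: float      # predicted energy of framework + CO2 (eV)
    e_mof_h2o: float      # predicted energy of framework + H2O (eV)

E_CO2 = -22.0             # placeholder reference energy of an isolated CO2 molecule (eV)
E_H2O = -14.2             # placeholder reference energy of an isolated H2O molecule (eV)

def adsorption_energy(e_complex: float, e_host: float, e_guest: float) -> float:
    """E_ads = E(host+guest) - E(host) - E(guest); more negative means stronger binding."""
    return e_complex - e_host - e_guest

def screen(mofs, co2_cutoff=-0.5, h2o_cutoff=-0.3):
    """Keep MOFs that bind CO2 below co2_cutoff but bind water above h2o_cutoff."""
    hits = []
    for m in mofs:
        e_ads_co2 = adsorption_energy(m.e_mof_co2, m.e_mof, E_CO2)
        e_ads_h2o = adsorption_energy(m.e_mof_h2o, m.e_mof, E_H2O)
        if e_ads_co2 < co2_cutoff and e_ads_h2o > h2o_cutoff:
            hits.append((m.name, e_ads_co2, e_ads_h2o))
    return sorted(hits, key=lambda t: t[1])   # strongest CO2 binders first

if __name__ == "__main__":
    candidates = [
        MOFPrediction("MOF-A", -100.0, -122.7, -114.4),
        MOFPrediction("MOF-B", -100.0, -122.3, -115.1),
    ]
    for name, e_co2, e_h2o in screen(candidates):
        print(f"{name}: E_ads(CO2) = {e_co2:.2f} eV, E_ads(H2O) = {e_h2o:.2f} eV")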

Moving Forward With Impact

“According to the UN and most industrialized countries, we need to get to net-zero carbon dioxide emissions by 2050,” said Matt Uyttendaele, director of Meta’s FAIR chemistry team and a co-author on the paper. “Most of that must happen by outright stopping carbon emissions, but we must also address historical carbon emissions and sectors of the economy that are very hard to decarbonize — such as aviation and heavy industry. That’s why CO2 removal technologies like direct air capture must come online in the next 25 years.” 

While direct air capture is still a nascent field, the researchers say it’s crucial that groundbreaking tools — like the OpenDAC database made available in the team’s paper — are in development now. 

“There is not going to be one solution that will get us to net-zero emissions,” Sriram said. “Direct air capture has great potential but needs to be scaled up significantly before we can make a real impact. I think the only way we can get there is by finding better materials.”

The researchers from both teams hope the scientific community will join the search for suitable materials. The entire OpenDAC dataset project is open source, from the data to the models to the algorithms.

“I hope this accelerates the development of negative-emission technologies like direct air capture that may not have been possible otherwise,” Medford said. “As a species, we must solve this problem at some point. I hope this work can contribute to getting us there, and I think it has a real shot at doing that.”

 

Note: Georgia Tech ChBE graduate students Sihoon Choi, Logan Brabson, and Xiaohan Yu made major contributions and are co-authors of the paper.

 

Citation: A. Sriram et al, The Open DAC 2023 Dataset and Challenges for Sorbent Discovery in Direct Air Capture, ACS Central Science (2024).

DOI: https://doi.org/10.1021/acscentsci.3c01629

Writer: Catherine Barzler

Video: Christopher McKenney

#####

The Georgia Institute of Technology, or Georgia Tech, is one of the top public research universities in the U.S., developing leaders who advance technology and improve the human condition. The Institute offers business, computing, design, engineering, liberal arts, and sciences degrees. Its more than 45,000 undergraduate and graduate students, representing 50 states and more than 148 countries, study at the main campus in Atlanta, at campuses in France and China, and through distance and online learning. As a leading technological university, Georgia Tech is an engine of economic development for Georgia, the Southeast, and the nation, conducting more than $1 billion in research annually for government, industry, and society.

 

Analysis of flour and rice shows high levels of harmful fungal toxins



The foods, found in the homes of Brazilian families participating in the research, were stored for future consumption. The study is the first in Brazil to use biomarkers to characterize the risk associated with mycotoxins in the diet.




FUNDAÇÃO DE AMPARO À PESQUISA DO ESTADO DE SÃO PAULO

Image: Exposure to mycotoxins through food can trigger a range of health problems, especially in children and adolescents. Credit: André Borges/Agência Brasil




By analyzing samples of flour and rice stored in homes in Ribeirão Preto, in the interior of the state of São Paulo (Brazil), researchers from the University of São Paulo (USP) found the presence of high levels of fungal toxins (mycotoxins). The results of the study, supported by FAPESP, were published in the journal Food Research International.

As the authors point out, dietary exposure to mycotoxins can trigger a range of health problems, especially in children and adolescents. The data therefore reinforce the importance of storing foods such as grains and flours in dry places and protecting them from insects to avoid the risk of contamination.

“There are more than 400 toxins that fungi produce to defend themselves or to interact with other organisms. Six of these substances, which we call the superpower girls, require more attention because they’re carcinogenic, immunosuppressive or act as endocrine disruptors [cause changes in the body’s hormonal balance]. It’s something that needs a lot of attention because of its harmful effects on health,” says Carlos Augusto Fernandes de Oliveira, professor at the Faculty of Animal Science and Food Engineering (FZEA-USP), at the Pirassununga campus, and coordinator of the study.

The six toxins of concern were found in all the food samples analyzed: aflatoxins (AFs), fumonisins (FBs), zearalenone (ZEN), T-2 toxin, deoxynivalenol (DON) and ochratoxin A (OTA). In the case of the mycotoxins FBs, ZEN and DON, the levels were above the tolerance limit set by the health authorities. This study was the first in Brazil to use biomarkers to characterize the risk associated with mycotoxins in the diet of children and adolescents.
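As a rough illustration of how such dietary risks are usually characterised, the Python sketch below computes an estimated daily intake (contamination level times daily consumption, divided by body weight) and compares it with a health-based guidance value. Every number in it is a placeholder assumption, not a result from the USP study.

# Illustrative sketch of a dietary exposure estimate for a mycotoxin; all values
# below are hypothetical placeholders, not measurements from this study.
def estimated_daily_intake(concentration_ug_per_kg, consumption_kg_per_day, body_weight_kg):
    """EDI in micrograms per kg of body weight per day."""
    return concentration_ug_per_kg * consumption_kg_per_day / body_weight_kg

# Hypothetical example: a 25 kg child eating 150 g of rice per day contaminated
# with 500 µg/kg of deoxynivalenol (DON).
edi = estimated_daily_intake(concentration_ug_per_kg=500,
                             consumption_kg_per_day=0.150,
                             body_weight_kg=25)

TDI = 1.0  # placeholder tolerable daily intake, µg/kg bw/day (illustrative only)
print(f"EDI = {edi:.2f} µg/kg bw/day ({edi / TDI:.0%} of the assumed TDI)")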


Image: The six mycotoxins of concern were found in all the food samples analyzed. Credit: researchers’ collection

Oliveira explains that aflatoxin B1, discovered in the 1960s, is the most potent natural carcinogen known. The substance damages the DNA of animals, causing genetic mutations that can lead to the development of liver carcinoma. There are also other effects, such as immunosuppression, reproductive problems and teratogenesis (harm to the embryo, fetus or child when pregnant or lactating women pass the toxins on to them).

“There’s no substance known to man in nature that has the carcinogenic power of this mycotoxin, only rare exceptions created in the laboratory, such as dioxins,” says the researcher.

Deoxynivalenol, which was found at high levels in the samples analyzed, although not carcinogenic, can lower the immunity of contaminated people. “It also has an effect on the gastrointestinal system. In animals, for example, it causes so much irritation that they regurgitate. That’s why it’s commonly called vomitoxin,” he says.

Fumonisin B1 is considered a possible human carcinogen and can cause esophageal cancer and other hepatotoxic problems, as can ochratoxin A, another potential carcinogen. Zearalenone, found at high levels in the food samples analyzed, has a structure similar to that of the female hormone estrogen and can cause problems associated with excess estrogen in the body (hyperestrogenism).

“So they’re toxins with heavy consequences. Unlike lead or other chemical contaminants such as bisphenol [found in some plastics], these mycotoxins are not cumulative. However, they do have a progressive effect. This means, for example, that with exposure to B1 molecules, at some point it’ll no longer be possible to repair the DNA damaged by the mycotoxin. This is when cancer can develop. That’s why we’re concerned about children and adolescents, who tend to be more sensitive to toxins in general,” he says.  

The analyses were carried out using ultra-performance liquid chromatography coupled with tandem mass spectrometry (UPLC-MS/MS, a method that allows different substances in a mixture to be distinguished on the basis of molecular weight). The 230 food samples analyzed were available for consumption in the homes of 67 children, including 21 preschoolers (3 to 6 years old), 15 schoolchildren (7 to 10 years old), and 31 adolescents (11 to 17 years old). 

The group is carrying out a second phase of work to further determine the level of contamination. Urine samples have been collected from children and adolescents, and the researchers are in the process of analyzing the results.

“By analyzing biomarkers found in urine, it’s possible to assess exposure to mycotoxins, since the excretion of biomarkers correlates well with the ingestion of some mycotoxins. This will allow us to anticipate the potential effects of contamination,” says Oliveira.

About São Paulo Research Foundation (FAPESP)

The São Paulo Research Foundation (FAPESP) is a public institution with the mission of supporting scientific research in all fields of knowledge by awarding scholarships, fellowships and grants to investigators linked with higher education and research institutions in the State of São Paulo, Brazil. FAPESP is aware that the very best research can only be done by working with the best researchers internationally. Therefore, it has established partnerships with funding agencies, higher education, private companies, and research organizations in other countries known for the quality of their research and has been encouraging scientists funded by its grants to further develop their international collaboration. You can learn more about FAPESP at www.fapesp.br/en and visit FAPESP news agency at www.agencia.fapesp.br/en to keep updated with the latest scientific breakthroughs FAPESP helps achieve through its many programs, awards and research centers. You may also subscribe to FAPESP news agency at http://agencia.fapesp.br/subscribe.

 

Bumblebee nests are overheating due to climate change, threatening future populations



Wild bumblebees all over the world need a similar nest temperature — with global warming driving temperatures up, their nests could be too hot for them to thrive



FRONTIERS




As a result of the climate crisis, global warming is driving up temperatures around the world — and bumblebees, like humans, are struggling to cope with homes that can’t beat the heat. In a new article published in Frontiers in Bee Science, scientists identify rising heat as a potential culprit for the decline in bumblebee populations worldwide, compromising bumblebees’ ability to construct livable nests in which healthy larvae can develop.

“The decline in populations and ranges of several species of bumblebees may be explained by issues of overheating of the nests and the brood,” said Dr Peter Kevan of the University of Guelph, Canada, lead author of the article. “The constraints on the survival of the bumblebee brood indicate that heat is likely a major factor, with heating of the nest above about 35 degrees Celsius being lethal, despite the remarkable capacity of bumblebees to thermoregulate.”

The sting in the tale

There are many bumblebee species around the world, living in many different environments. Many of these species are in a decline linked to climate change, but identifying a causative factor has proven difficult. However, by reviewing the literature, Kevan and colleagues identified a critical commonality between these species, regardless of geographic range: the optimal temperature of their nests, 28-32 degrees Celsius.

“We can assume that the similarity reflects the evolutionary relatedness of the various species,” said Kevan.

Because this characteristic appears to be common to so many species, it may have limited evolutionary plasticity, meaning the bumblebees would find it hard to adapt to rising temperatures and would struggle to remain within their thermal neutral zone — the range of temperatures within which maintaining body temperature requires minimal metabolic expenditure. Heat stress that takes a species out of this range is dangerous.

“Excessively high temperatures are more harmful to most animals and plants than cool temperatures. When conditions are cool, organisms that do not metabolically regulate their body temperatures simply slow down, but when temperatures get too high metabolic processes start to break down and cease,” said Kevan. “Death ensues quickly.”

Reviewing 180 years of literature, Kevan and colleagues found that bumblebees seem to be able to survive at up to 36 degrees Celsius and develop optimally at around 30-32 degrees Celsius — though this might differ between species and biogeographical conditions. While bumblebees have some behavioral adaptations which allow them to thermoregulate, this may not be enough to deal with climate change.

Additionally, the bumblebee colony acts as a ‘superorganism’, in which reproductive fitness depends on the collective survival and reproduction of the colony rather than on individual bees. One bumblebee may cope better with the heat than another, but if the nest is too hot to raise healthy larvae, the whole colony suffers, regardless of individual bumblebees’ adaptation.

On a wing and a prayer

“The effect of high nest temperatures has not been studied very much, which is surprising,” said Kevan. “We can surmise that nest temperatures above the mid-30s Celsius would likely be highly detrimental and that above about 35 Celsius death would occur, probably quite quickly.”

Studies of honeybees show that higher nest temperatures compromise bee queens’ strength and reproductive ability, and lead to smaller worker bees in poorer condition. If heat has a similar effect on bumblebees, so that colonies produce less healthy offspring at a higher temperature, then global warming could be directly leading to their decline.

To ensure that bumblebees continue to thrive, the scientists call for more research into what they say is an understudied aspect of bumblebee ecology: nest morphology, material properties, temperature, and thermoregulation. It may be possible for some bumblebee colonies to adapt their nest site choice and form or behavior to cool their nests. Ground-penetrating radar could help study ground-nesting species, while flow-through respirometry analysis of nests at different temperatures might help scientists gauge the stress placed on the bee colonies inside. We need both to understand how different colonies cope with the same conditions and how different species cope with different conditions, including whether some bumblebee species have broader thermal neutral zones, affording them more resilience.

“We hope that future scientists may take the ideas we present and apply them to their own research on bumblebee health and conservation,” concluded Kevan.

 

Researchers collaborate with the shipping industry to cut costs, fuel consumption and greenhouse gas emissions in shipping




ABO AKADEMI UNIVERSITY




A common challenge in shipping occurs when ships arrive promptly at their destination, only to find a crowded harbour. Subsequently, they are often required to wait outside the harbour or anchor until port services and a quayside become available.

According to a report from the International Maritime Organization (IMO), it is not uncommon for ships to spend 5–10% of their time waiting to enter port. Excessive speeds followed by extended waiting times with engines running result in a notable increase in fuel consumption. This is a problem that impacts both the climate and the economy.
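To see why “hurry up and wait” is costly, the back-of-the-envelope Python sketch below compares sailing at full speed and then idling outside a congested harbour with slowing down just enough to arrive when a berth is free. It relies on the common rule of thumb that a ship's hourly fuel burn scales roughly with the cube of its speed; the distance, speeds, and fuel figures are illustrative assumptions and are not taken from the IMO report or the MISSION project.

# Toy comparison of "sail fast, then wait at anchor" vs. "slow steaming, just-in-time
# arrival". Uses the rough cube law for hourly fuel burn; all numbers are assumptions.
def voyage_fuel(distance_nm, speed_kn, fuel_at_design_tph, design_speed_kn,
                waiting_hours=0.0, fuel_waiting_tph=0.5):
    """Total fuel (tonnes) for sailing the leg plus idling at anchor."""
    hours_sailing = distance_nm / speed_kn
    fuel_per_hour = fuel_at_design_tph * (speed_kn / design_speed_kn) ** 3
    return hours_sailing * fuel_per_hour + waiting_hours * fuel_waiting_tph

DISTANCE = 1000      # nautical miles to the port (assumed)
DESIGN_SPEED = 14.0  # knots at which the engine burns FUEL_DESIGN per hour (assumed)
FUEL_DESIGN = 2.0    # tonnes of fuel per hour at design speed (assumed)

# Scenario 1: full speed, then 12 hours waiting outside a congested harbour.
fast = voyage_fuel(DISTANCE, 14.0, FUEL_DESIGN, DESIGN_SPEED, waiting_hours=12)

# Scenario 2: slow down so arrival matches the berth slot 12 hours later; no waiting.
slow_speed = DISTANCE / (DISTANCE / 14.0 + 12)   # knots needed to arrive just in time
slow = voyage_fuel(DISTANCE, slow_speed, FUEL_DESIGN, DESIGN_SPEED)

print(f"Fast + wait : {fast:.0f} t")
print(f"Just-in-time: {slow:.0f} t ({100 * (fast - slow) / fast:.0f}% less fuel)")

Even in this toy case the just-in-time schedule burns markedly less fuel, which is the kind of saving that coordinated scheduling between ships and ports is meant to capture.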

Several European universities, ports, shipping companies and technology firms have now joined forces to develop a software system and business models aimed at optimising travel and port calls for both existing and new shipping routes, incorporating land-based services. The MISSION project is funded by the European Commission with a total budget of 7.5 million euros.

The primary goal of the project is to promote transparency and foster cooperation among stakeholders within maritime transport networks. Through coordinated ship scheduling and an optimisation of ship operations and port services, the objective is to achieve a substantial increase in energy efficiency and a 10-20% reduction in fuel consumption, consequently resulting in lowered greenhouse gas emissions.

– The MISSION system empowers decision-makers to efficiently manage port traffic, encompassing both sea and inland operations through digitalisation and streamlined processes for operations, communication and administrative tasks. This eliminates bottlenecks in the overall maritime supply chain, leading to substantial economic benefits for shipping and environmental advantages for society, according to Magnus Hellström, Professor of Industrial Economics and leader of Åbo Akademi University’s participation in the project.

Åbo Akademi University is spearheading the development of green business models to facilitate the deployment and commercialisation of the system within the project. Henrik Ringbom, Professor of Marine Law at Åbo Akademi University, is leading the legal aspects of the project, which involves exploring opportunities to support and enhance regulatory measures by the authorities.

The project encompasses approximately 30 European universities, shipping companies, technology firms and port companies. MISSION is a Horizon Europe project that spans from January 1, 2024, to June 30, 2027.

Find out more on the project website.

For more information, please contact:

Magnus Hellström, Professor of Industrial Economics, Åbo Akademi University
magnus.hellstrom@abo.fi

 

Economies take off with new airports



A global study by an SUTD researcher in collaboration with scientists from Japan explores the economic benefits of airport investment in emerging economies using nighttime satellite imagery.




SINGAPORE UNIVERSITY OF TECHNOLOGY AND DESIGN

Image: Air transport network and nighttime light intensity around Asia for 2019. Credit: Jin Murakami, with data from ICAO Global Traffic Flow (2019) and NASA Earth Observatory (2019)





Be it for work or vacation, chances are that many will have passed through an airport. In the largest cities, the airport presents to travellers the first glimpse of a new land and a reflection of the surrounding city. Beyond first impressions, airports stand as an important economic hub for local policymakers, with a continuous flow of goods and passengers fuelling the urban economy.

However, current literature on the economic impact of airport investment is primarily based on developed cities localised to North America and Europe. In these cases, airport capacity development is focused on upgrading existing infrastructure. For emerging cities in less developed countries, especially for those intending to build their first airport, the economic benefits are not as clear-cut. 

Recognising the importance of a more generalisable analysis, Assistant Professor Jin Murakami of the Singapore University of Technology and Design (SUTD) collaborated with researchers in Japan to close this gap. In their paper ‘Does new airport investment promote urban economic development?: Global evidence from nighttime light data’, Asst Prof Murakami highlights that developing new airport infrastructure often requires billions in capital investment, with evidence and information on long-term returns playing a key role in decision-making. 

To conduct the study, the team first required access to city-scale statistical data across the global regions, including emerging economies in Asia and the Middle East. Previous studies on large cities used the gross domestic product (GDP) and employment rates as key indicators for economic growth. In smaller cities, however, these statistics are poor and inconsistent, and in some cases, altogether unavailable. 

Inspired by other empirical studies on emerging economies, the team used nighttime light intensity (NTL) as a proxy for the degree of economic development. Urban researchers have noted that urbanisation, employment, industrial production, and energy consumption activities can be tied to light emissions at night. With modern satellite observation technologies, nighttime images over a wide span of geographical locations and time periods have become readily available. Asst Prof Murakami and his team were able to examine cities with and without new airport construction across the world at finer spatial resolution and more regular time intervals.

Modelling the relationship between airport development and economic growth proved to be a difficult task. “The two-way causal relationships between transport infrastructure investment and urban economic growth have long been debatable. It is also likely that rapid urban economic growth calls for new airport construction projects,” explains Asst Prof Murakami. 

To distil the “net” economic impact of new airport construction, a difference-in-differences (DID) method was applied that compares the changes in NTL over the years between similar cities that differ only by local airport construction. The team scoured more than 13,000 cities worldwide to identify suitable control groups for comparison, systematically matching each case study.
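For readers unfamiliar with the method, the minimal Python sketch below applies the difference-in-differences idea to a tiny synthetic panel: the coefficient on the interaction between “treated” (a city that built an airport) and “post” (years after construction) estimates the net change in nighttime light that common time trends cannot explain. The column names and numbers are hypothetical and are not the study's data.

# Minimal difference-in-differences sketch on a synthetic two-period panel.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "city":    ["A", "A", "B", "B", "C", "C", "D", "D"],
    "treated": [1, 1, 1, 1, 0, 0, 0, 0],     # 1 = city built a new airport
    "post":    [0, 1, 0, 1, 0, 1, 0, 1],     # 0 = before construction, 1 = after
    "ntl":     [10.0, 14.5, 9.5, 13.8, 10.2, 11.0, 9.8, 10.7],  # nighttime light proxy
})

# The coefficient on treated:post is the DID estimate of the airport's "net" effect
# on nighttime light intensity, after differencing out shared time trends.
model = smf.ols("ntl ~ treated * post", data=df).fit()
print("DID estimate:", model.params["treated:post"])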

The team found that less-developed cities without airports would benefit the most from constructing their first airport. The increased accessibility to the global aviation network brings about broader economic benefits and opportunities to such communities. The researchers suggest that new airports go beyond expanding capacity to meet air transport demand—they promote inclusive growth and connectivity to people living in remote locations.

The holistic analysis of recent airport construction projects provides new, generalisable global evidence that can help small towns and cities make pivotal decisions where regional accessibility is a concern. With their findings, Asst Prof Murakami hopes to encourage global investors, policymakers, and financial institutions to reassess the bankability or economic feasibility of airport development projects in emerging economies based on current environmental, social, and governance (ESG) investing criteria.

“This study stresses the importance of incorporating the wider economic benefits to be attained in relatively small and/or geographically remote cities, towns, and communities in airport capital investment evaluation practices for regional network connectivity and inclusive economic growth, particularly in Asia and the Middle East,” emphasises Asst Prof Murakami.

Acknowledgements:
This work was partially supported by the Asian Development Bank (ADB) (Project Number: 51280–001, Asia Infrastructure Insights R&D “Research on the Spatial Characteristics of Airport System Development within Asia’s Developing Cities”) and by the Singapore University of Technology and Design Growth Plan (SGP)–Aviation (Grant Ref No. PIE-SGP-AN-2020–01, Thrust 5: “The Growth of Hub-Airports as Employment Centers”). 

Reference:
Does new airport investment promote urban economic development?: Global evidence from nighttime light data, Transportation Research Part A: Policy and Practice (DOI:10.1016/j.tra.2023.103948)
 

 

A cost-efficient path to a renewable energy grid for Australia



PNAS NEXUS
Image: Generation and multi-timescale storage requirements for renewable electricity grids in Australia, for an interconnected eastern (National Electricity Market, NEM) and western (Wholesale Electricity Market, WEM) grid. Based on a sensitivity analysis of technology cost, the figure shows additional generator capacities, power and energy capacities for each storage technology, and the associated investment. Credit: Shaikh et al.




A model charts the most cost-efficient path to a fully renewable electricity grid for Australia. Raheel Ahmed Shaikh and colleagues modeled possible scenarios for Australia’s eastern and western grids, using solar and wind generation, short-to-long-term energy storage, and financial input data to explore the least-cost capacity mix. Going completely renewable would require significant expansion of both generation and storage. Interconnecting the two grids would reduce generation capacity needs by 6% and storage power capacity needs by 14%. The least-cost renewable-only grid would be dominated by wind, with 50–75% of energy contributed by turbines.

Storage would be mandatory for any fully renewable grid: Australia would need the ability to store up to four days of demand. That represents 13 times more storage power capacity and over 40 times more storage energy capacity than the country has at present, considering batteries, pumped hydro, and hydrogen storage. An 82% renewable grid, by contrast, would require only a fourfold increase in storage power capacity and a threefold increase in energy capacity. According to the authors, the optimal route to a fully renewable grid would require an investment of approximately A$130–150 billion, around 8–10% of the country’s Gross Domestic Product, assuming future technology development and cost reduction.
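To give a feel for where multi-day storage requirements come from, the toy Python sketch below runs an idealised, lossless storage reservoir against a synthetic fortnight of hourly demand, wind, and solar and reports the energy capacity needed to cover every shortfall. It ignores round-trip efficiency, power limits, transmission, and cost, and all of its numbers are made up; it illustrates the sizing logic only and is not the authors' model.

# Toy storage-sizing illustration with synthetic profiles (not real Australian data).
import numpy as np

rng = np.random.default_rng(42)
hours = 24 * 14                                   # a synthetic two-week trace

# Hourly profiles in GW (made-up shapes and magnitudes)
demand = 25 + 5 * np.sin(2 * np.pi * np.arange(hours) / 24)                 # daily cycle
solar = 20 * np.clip(np.sin(2 * np.pi * ((np.arange(hours) % 24) - 6) / 24), 0, None)
wind = np.repeat(40 * rng.random(hours // 24), 24)                           # calm and windy days

surplus = solar + wind - demand                   # GW; negative values are shortfalls

# Ideal lossless storage that starts full: the energy capacity it needs equals the
# deepest drawdown of the running energy balance below its running maximum.
balance = np.cumsum(surplus)                      # GWh relative to the starting level
required_energy = float((np.maximum.accumulate(balance) - balance).max())

print(f"Storage energy needed for this trace: {required_energy:.0f} GWh, "
      f"about {required_energy / (demand.mean() * 24):.1f} days of average demand")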