Tuesday, April 25, 2023

Music for sleeping and music for studying share surprising similarities, study finds

A recent study on Spotify data reveals which types of music are used to help listeners focus while studying and fall asleep

Peer-Reviewed Publication

AARHUS UNIVERSITY

What type of music do you use while studying? What type of music would you use to fall asleep at night? Have you ever wondered why you choose certain types of music? It turns out that the music used for both these situations is actually pretty similar.

According to a recent study from Aarhus University, which analyzed data from the streaming service Spotify, the music people listen to while studying and the music they listen to while falling asleep are more similar to each other than either is to music in general.

The researchers used both qualitative and quantitative analysis to compare the two types of music based on tracks, genres and audio features.

The study found that people use similar types of music to accompany these tasks.

“Our study suggests that music used for studying and music used for sleeping share many characteristics in terms of tracks, genres and audio features. This similarity highlights the potential of music to create a pleasant but not too disturbing atmosphere, enabling individuals to focus while studying and to relax before sleeping,” says Rebecca Jane Scarratt, PhD student at the Center for Music in the Brain at Aarhus University.

Relaxing effects on the brain

The researchers analyzed a large number of playlists apparently used for studying or for relaxation before bedtime and found that these two types of music shared similar characteristics, such as slow tempo and repetitive patterns.

Among the most common genres found in both datasets were pop, lo-fi, classical and ambient music.

According to the study, the similarities could be attributed to their calming and relaxing effects on the brain. The slow tempo and repetitive patterns of the music help to lower heart rate and reduce stress, creating a conducive environment for both studying and sleeping.

The researchers also used statistical methods to compare the audio features between different datasets and determine whether there were significant differences.

They found significant differences between the sleep and general datasets in “Loudness”, “Energy” and “Valence”, the last of which refers to the emotional tone or mood of a piece of music. The same was the case between the study and general datasets, but there was no such large difference between the study and sleep datasets.
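As a rough illustration of the kind of comparison involved, the sketch below (in Python, with invented data) contrasts a single Spotify audio feature, "Energy", across hypothetical sleep, study and general track samples using a Mann-Whitney U test. The feature values, sample sizes and choice of test are illustrative assumptions; the study's actual datasets and statistical procedure are described in the publication.

```python
# Hypothetical sketch: comparing one Spotify audio feature across playlist datasets.
# The feature values below are randomly generated stand-ins, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy per-track samples of "energy" (Spotify reports it on a 0-1 scale)
sleep_energy = rng.beta(2, 8, size=500)    # quiet, low-energy tracks
study_energy = rng.beta(2, 7, size=500)    # similar profile to sleep music
general_energy = rng.beta(5, 5, size=500)  # broader mix of popular music

def compare(a, b, label):
    """Mann-Whitney U test plus a simple difference in medians."""
    u, p = stats.mannwhitneyu(a, b, alternative="two-sided")
    print(f"{label}: p = {p:.3g}, median diff = {np.median(a) - np.median(b):+.3f}")

compare(sleep_energy, general_energy, "sleep vs general")
compare(study_energy, general_energy, "study vs general")
compare(study_energy, sleep_energy,   "study vs sleep")
```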

According to Rebecca, the findings are the beginning of a new research trend that compares music used for different activities and could lead to a better understanding of how music affects our cognitive and emotional states. The findings shed light on the difference between how music is assumed to be used in theory and how it is actually used in practice.

“While more research is needed to fully understand the relationship between music and cognitive processes, our study provides a starting point for exploring the impact of music on our cognitive and emotional states, and how it may enhance our daily lives.” says Rebecca Jane Scarratt.


Behind the research result

Study type: Empirical big data

External financing: Center for Music in the Brain is funded by the Danish National Research Foundation (DNRF117). 

Conflict of interest: The authors declare no conflicting interests.

Link to publication: https://doi.org/10.1038/s41598-023-31692-8

Swedish quantum computer applied to chemistry for the first time


Peer-Reviewed Publication

CHALMERS UNIVERSITY OF TECHNOLOGY

Image: Martin Rahm, Associate Professor in Theoretical Chemistry at the Department of Chemistry and Chemical Engineering, Chalmers University of Technology. Credit: Johan Bodell/Chalmers

There are high expectations that quantum computers may deliver revolutionary new possibilities for simulating chemical processes. This could have a major impact on everything from the development of new pharmaceuticals to new materials. Researchers at Chalmers University have now, for the first time in Sweden, used a quantum computer to undertake calculations within a real-life case in chemistry.

“Quantum computers could in theory be used to handle cases where electrons and atomic nuclei move in more complicated ways. If we can learn to utilise their full potential, we should be able to advance the boundaries of what is possible to calculate and understand,” says Martin Rahm, Associate Professor in Theoretical Chemistry at the Department of Chemistry and Chemical Engineering, who has led the study.

Within the field of quantum chemistry, the laws of quantum mechanics are used to understand which chemical reactions are possible, which structures and materials can be developed, and what characteristics they have. Such studies are normally undertaken with the help of supercomputers built with conventional logic circuits. There is, however, a limit to which calculations conventional computers can handle. Because the laws of quantum mechanics describe the behaviour of nature at the subatomic level, many researchers believe that a quantum computer should be better equipped to perform molecular calculations than a conventional computer.

“Most things in this world are inherently chemical. For example, our energy carriers, within biology as well as in old or new cars, are made up of electrons and atomic nuclei arranged in different ways in molecules and materials. Some of the problems we solve in the field of quantum chemistry are to calculate which of these arrangements are more likely or advantageous, along with their characteristics,” says Martin Rahm.

A new method minimises errors in the quantum chemical calculations

There is still a way to go before quantum computers can achieve what the researchers are aiming for. This field of research is still young and the small model calculations that are run are complicated by noise from the quantum computer’s surroundings. However, Martin Rahm and his colleagues have now found a method that they see as an important step forward. The method is called Reference-State Error Mitigation (REM) and works by correcting for the errors that occur due to noise by utilising the calculations from both a quantum computer and a conventional computer.

“The study is a proof-of-concept that our method can improve the quality of quantum-chemical calculations. It is a useful tool that we will use to improve our calculations on quantum computers moving forward,” says Martin Rahm.

The principle behind the method is to first consider a reference state by describing and solving the same problem on both a conventional and a quantum computer. This reference state represents a simpler description of a molecule than the original problem intended to be solved by the quantum computer. A conventional computer can solve this simpler version of the problem quickly. By comparing the results from both computers, an exact estimate can be made for the amount of error caused by noise. The difference between the two computers’ solutions for the reference problem can then be used to correct the solution for the original, more complex, problem when it is run on the quantum processor. By combining this new method with data from Chalmers’ quantum computer Särimner* the researchers have succeeded in calculating the intrinsic energy of small example molecules such as hydrogen and lithium hydride. Equivalent calculations can be carried out more quickly on a conventional computer, but the new method represents an important development and is the first demonstration of a quantum chemical calculation on a quantum computer in Sweden.
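In essence, the correction amounts to a simple subtraction. The sketch below (Python, with invented energies) shows that arithmetic only; how the reference state is actually chosen, prepared and measured in the Reference-State Error Mitigation method is described in the paper.

```python
# Minimal sketch of the reference-state correction idea, with made-up numbers.
# The "exact" reference value would come from a conventional computer, the
# "noisy" values from the quantum processor; none of these numbers are from
# the actual study.

def rem_correct(e_noisy, e_ref_noisy, e_ref_exact):
    """Shift the noisy result by the error observed on the simpler reference state."""
    error_estimate = e_ref_noisy - e_ref_exact
    return e_noisy - error_estimate

# Toy example (energies in hartree, invented for illustration):
e_ref_exact = -1.117   # reference problem solved on a conventional computer
e_ref_noisy = -1.052   # same reference problem measured on the noisy quantum device
e_noisy     = -1.080   # the full, more complex problem measured on the quantum device

print(f"corrected energy = {rem_correct(e_noisy, e_ref_noisy, e_ref_exact):.3f} hartree")
```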

“We see good possibilities for further development of the method to allow calculations of larger and more complex molecules, when the next generation of quantum computers are ready,” says Martin Rahm.

Quantum computer built at Chalmers

The research has been conducted in close collaboration with colleagues at the Department of Microtechnology and Nanoscience. They have built the quantum computers that are used in the study, and helped perform the sensitive measurements that are needed for the chemical calculations.

“It is only by using real quantum algorithms that we can understand how our hardware really works and how we can improve it. Chemical calculations are one of the first areas where we believe that quantum computers will be useful, so our collaboration with Martin Rahm’s group is especially valuable,” says Jonas Bylander, Associate Professor in Quantum Technology at the Department of Microtechnology and Nanoscience.

The quantum computer at Chalmers with the outer shielding of the dilution refrigerator removed. Credit: Chalmers/Anna-Lena Lundqvist

More about the research

Read the article Reference-State Error Mitigation: A Strategy for High Accuracy Quantum Computation of Chemistry in the Journal of Chemical Theory and Computation.

The article is written by Phalgun Lolur, Mårten Skogh, Werner Dobrautz, Christopher Warren, Janka Biznárová, Amr Osman, Giovanna Tancredi, Göran Wendin, Jonas Bylander, and Martin Rahm. The researchers are active at Chalmers University of Technology.

The research has been conducted in cooperation with the Wallenberg Centre for Quantum Technology (WACQT) and the EU-project OpensuperQ. OpensuperQ connects universities and companies in 10 European countries with the aim of building a quantum computer, and its extension will contribute further funding to researchers at Chalmers for their work with quantum chemical calculations.

*Särimner is the name of a quantum processor with five qubits, or quantum bits, built by Chalmers within the framework of the Wallenberg Center for Quantum Technology (WACQT). Its name is borrowed from Nordic mythology, in which the pig Särimner was butchered and eaten every day, only to be resurrected. Särimner has now been replaced by a larger computer with 25 qubits and the goal for WACQT is to build a quantum computer with 100 qubits that can solve problems far beyond the capacity of today’s best conventional super-computers.

 

Is Deep Learning a necessary ingredient for Artificial Intelligence?

Deep learning appears to be a key magical ingredient for the realization of many artificial intelligence tasks. However, these tasks can also be realized efficiently using simpler, shallow architectures

Peer-Reviewed Publication

BAR-ILAN UNIVERSITY

Image: Scheme of deep machine learning consisting of many layers (left) vs. shallow brain learning consisting of a few layers with enlarged width (right). For more detail see https://www.nature.com/articles/s41598-023-32559-8. Credit: Prof. Ido Kanter, Bar-Ilan University

The earliest artificial neural network, the Perceptron, was introduced approximately 65 years ago and consisted of just one layer. To solve more complex classification tasks, however, more advanced neural network architectures consisting of numerous feedforward (consecutive) layers were later introduced. This multilayer architecture is the essential component of current deep learning algorithms. It improves the performance of analytical and physical tasks without human intervention, and lies behind everyday automation products such as emerging technologies for self-driving cars and autonomous chatbots.

The key question driving new research published today in Scientific Reports is whether efficient learning of non-trivial classification tasks can be achieved using brain-inspired shallow feedforward networks, while potentially requiring less computational complexity. “A positive answer questions the need for deep learning architectures, and might direct the development of unique hardware for the efficient and fast implementation of shallow learning,” said Prof. Ido Kanter, of Bar-Ilan's Department of Physics and Gonda (Goldschmied) Multidisciplinary Brain Research Center, who led the research. “Additionally, it would demonstrate how brain-inspired shallow learning has advanced computational capability with reduced complexity and energy consumption.”

"We've shown that efficient learning on an artificial shallow architecture can achieve the same classification success rates that previously were achieved by deep learning architectures consisting of many layers and filters, but with less computational complexity,” said Yarden Tzach, a PhD student and contributor to this work.  “However, the efficient realization of shallow architectures requires a shift in the properties of advanced GPU technology, and future dedicated hardware developments,” he added.

The efficient learning on brain-inspired shallow architectures goes hand in hand with efficient dendritic tree learning, which is based on previous experimental research by Prof. Kanter on sub-dendritic adaptation using neuronal cultures, together with other anisotropic properties of neurons, such as different spike waveforms, refractory periods and maximal transmission rates (see also a video on dendritic learning: https://vimeo.com/702894966).

For years, brain dynamics and machine learning were researched independently; recently, however, brain dynamics has been revealed as a source of new types of efficient artificial intelligence.

How bee-friendly is the forest?


Peer-Reviewed Publication

UNIVERSITY OF WÜRZBURG

Image: A honeybee (Apis mellifera) collects honeydew on a fir tree. The study shows that the beech-dominated Steigerwald provides insufficient food resources for honeybees. Credit: Ingo Arndt

Bees are generally associated with flowering meadows rather than with dense forests. Woodland, however, is considered the original habitat of the western honeybee (Apis mellifera), as it offers nesting sites in the form of tree cavities. Researchers at the Julius-Maximilians-Universität Würzburg (JMU) have now investigated the extent to which contemporary deciduous forests are suitable as foraging habitats for the busy insects.

For this purpose, Benjamin Rutschmann and Patrick Kohl installed twelve normal-sized honeybee colonies in observation hives across the Steigerwald; the proportion of forest in the surroundings varied for each colony. The two scientists conduct research at JMU in the Chair of Animal Ecology and Tropical Biology (Zoology III), which is headed by Professor Ingolf Steffan-Dewenter. He was also involved in the study, which has now appeared in the Journal of Applied Ecology.

Eavesdropping on bee dances

Honeybees communicate via the so-called waggle dance. The team analyzed a total of 2,022 of these dances, video-recorded across the bees’ main foraging season from March to August. Because the bees communicate the approximate location of a food source to their conspecifics during these dances, the scientists were able to draw conclusions about foraging distances and habitat preferences. The surprising result was that the bees used the forest far less than expected based on its contribution to land cover. Colonies that lived deep in the forest often had to travel long distances to find food.
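For readers unfamiliar with dance decoding, the sketch below shows in simplified form how a waggle run could be converted into an approximate foraging distance and compass bearing. The duration-to-distance calibration of roughly one kilometre per second of waggling is an assumption for illustration only, not the calibration used in the study.

```python
# Hypothetical sketch of decoding a waggle dance into distance and direction.
# Both the calibration and the example numbers are illustrative assumptions.

def decode_waggle(duration_s, dance_angle_deg, sun_azimuth_deg,
                  metres_per_second=1000.0):
    """Return an approximate foraging distance (m) and compass bearing (deg).

    duration_s: duration of the waggle run
    dance_angle_deg: angle of the run relative to vertical on the comb
    sun_azimuth_deg: compass azimuth of the sun at the time of the dance
    metres_per_second: assumed calibration (~1 km per second of waggling)
    """
    distance_m = duration_s * metres_per_second
    bearing_deg = (sun_azimuth_deg + dance_angle_deg) % 360.0
    return distance_m, bearing_deg

# Example: a 1.8 s waggle run, 30 degrees right of vertical, sun at 135 degrees
dist, bearing = decode_waggle(1.8, 30.0, 135.0)
print(f"~{dist:.0f} m towards {bearing:.0f} degrees")
```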

"Especially in late summer, the supply of pollen in the forest was not guaranteed or insufficient, besides this being an especially critical time for the bee colonies and their brood," says Rutschmann. One of the main reasons for this, he says, is the tree species beech, which makes up more than 40 percent of the tree population in the Steigerwald: "Beech forests are dark, there is not much growing on the ground. Hardly any plants can cope with the light conditions in beech forests after the canopy closes, so a diverse herb layer that would be so important for bees is missing," according to the biologist.

Bees need more diverse forests

Honeydew or flowering tree species, such as linden, black locust, and chestnut, or shrubs such as blackberry and raspberry, do provide bees with an important source of carbohydrates and, in some cases, pollen as a source of protein during short periods of the year; however, bees need a balanced food supply throughout the season. "For a more bee-friendly environment, forests should be diversified with insect-pollinated trees – cherry, linden, maple, willow, horse chestnut or sweet chestnut," Rutschmann advises. Allowing secondary succession in forest gaps, the natural return of flora and fauna typical of a site, could help.

As if the lack of food were not enough of a problem, wild honeybee colonies in managed forests are also hampered by the low availability of tree hollows.

A possible next step would be to compare the Steigerwald with other European forest areas that differ in tree species composition and management: "Especially the comparison with protected areas, where greater disturbances occur, would be interesting," says Benjamin Rutschmann. More natural disturbance and less optimization for timber production should not only increase floral diversity in the forest, but also improve the chances of survival of wild-living honeybee colonies.

Not only honeybees benefit

So, honeybees need a more diverse forest as a habitat. Once established, they also contribute significantly to biodiversity conservation in return. After all, the overwhelming majority of plants depend on cross-pollination. The honeybee, in turn, is one of the most important pollinators, along with numerous other species of wild bees.

A more diverse forest benefits not only the honeybee, but ultimately the forest itself – a diverse ecosystem is a healthy ecosystem and less susceptible to pest infestation, for example. "Converting forests to species-rich mixed deciduous forests not only promotes biodiversity, but also adaptation to future climate conditions," emphasizes Ingolf Steffan-Dewenter.

Wild garlic (Allium ursinum) blooming in the forest in springtime. Diverse vegetation is essential for the survival of honeybees.

Gloomy outlook: little sunlight penetrates the dense canopy of the beech forest.

Credit: Ingo Arndt

Predicting regional organic carbon in deep soils

Peer-Reviewed Publication

SCIENCE CHINA PRESS

Image: The performance of the seven currently reported depth distribution functions in fitting deep soil organic carbon (SOC). The box charts show the coefficient of determination (R2) and root mean square error (RMSE) of the seven functions fitted to SOC concentration along soil profiles on China's Loess Plateau (a, c) and in other regions (southeast China, southern Kenya and southeast Brazil together) (b, d). The upper, middle and lower edges of each box indicate the 75% quartile, median and 25% quartile of R2 or RMSE, respectively; grey points show R2 or RMSE for each measured profile. The x-axis lists the seven depth distribution functions. NEF: negative exponential function; EDF: exponential decay function; PF: power function; LF: logarithmic function; TEF: type III exponential function; FIPF: first-degree inverse polynomial function; REF: revised exponential function. Credit: ©Science China Press

Field sampling combined with laboratory analysis is the most commonly used approach to obtaining deep soil organic carbon (SOC) data and has been applied widely for more than a century. It provides the most accurate measurements of deep SOC concentration, but it is highly time-consuming and labor-intensive and is not practical at large spatial scales. Alternatively, mathematical functions that predict SOC in deep soils offer a quick technique for regional assessment. Depth distribution functions, which describe the vertical distribution of SOC with soil depth, have been used to estimate deep SOC concentration in various regions and ecosystems. This method requires SOC data collected from multiple layers down to a depth of at least 100 cm to obtain the parameters of the function. In addition, the fits of the various functions have rarely been compared, leading to considerable arbitrariness in selecting a depth distribution function and to poorer goodness of fit between the selected function and the measured data. Moreover, this method has mainly been applied at the site scale. These drawbacks restrict the accurate estimation of deep SOC at regional or larger spatial scales.

Jingjing Wang et al. compiled regional SOC datasets from measured profiles and from the International Soil Reference and Information Centre (ISRIC) Soil Information System database. The datasets were used to compare how well the seven currently used depth distribution functions fit the vertical distribution patterns of SOC, in order to select the optimal function. The team then developed an approach for predicting deep SOC at the regional scale by analyzing the relationships between the parameters of the optimal depth distribution function and soil properties in the 0-40 cm topsoil layer.

The team demonstrated that the negative exponential function can effectively simulate the vertical distribution pattern of SOC along soil profiles in different regions, and that its parameters (i.e., Ce and k) were linearly correlated with SOC in topsoils (0-40 cm) at the regional scale. By combining the negative exponential function with parameters derived from these linear relationships, the authors developed a quick approach to predict SOC concentration in deep soils (down to 500 cm) at the regional scale. The approach was shown to perform well in predicting deep SOC in various regions.
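The sketch below illustrates, with synthetic numbers and one common parameterization of a negative exponential depth function, how such a function can be fitted to profile measurements and then extrapolated to deeper layers. The functional form, starting values and data are assumptions for illustration; the exact formulation and prediction relationships are given in the paper.

```python
# Minimal sketch: fit a negative exponential depth function to SOC data and
# extrapolate to deeper layers. All numbers are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def soc_depth(z_cm, c0, ce, k):
    """SOC concentration (g/kg) as a negative exponential function of depth (cm)."""
    return ce + (c0 - ce) * np.exp(-k * z_cm)

# Synthetic profile measurements: depth (cm) and SOC concentration (g/kg)
depth = np.array([5, 15, 30, 50, 80, 100])
soc   = np.array([12.0, 9.5, 6.0, 3.8, 2.5, 2.1])

params, _ = curve_fit(soc_depth, depth, soc, p0=[12.0, 2.0, 0.03])
c0, ce, k = params
print(f"fitted C0={c0:.2f}, Ce={ce:.2f}, k={k:.4f}")

# Predict SOC at 100 cm steps down to 500 cm from the fitted function
deep = np.arange(100, 501, 100)
print(dict(zip(deep.tolist(), np.round(soc_depth(deep, *params), 2).tolist())))
```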

See the article:

Wang J, Wei X, Jia X, Huang M, Liu Z, Yao Y, Shao M. 2023. An empirical approach to predict regional organic carbon in deep soils. Science China Earth Sciences, 66(3): 583–593, https://doi.org/10.1007/s11430-022-1032-2

Greenhouse gas release from permafrost is influenced by mineral binding processes

Peer-Reviewed Publication

UNIVERSITY OF COLOGNE

Image: A piece of the permafrost core extracted during drilling operations on the New Siberian Islands in the north of eastern Siberia. Credit: Lutz Schirrmeister, Alfred Wegener Institute

About a quarter of the organic carbon contained in ice-rich Arctic permafrost is more difficult for microorganisms to utilize. The reason for this is a strong binding of the organic material originating from dead plant remains to mineral soil particles. That is the result of a study conducted by a research group led by Professor Dr Janet Rethemeyer and Dr Jannik Martens at the University of Cologne’s Institute of Geology and Mineralogy. Accurate predictions of the release of greenhouse gases from permafrost deposits are therefore more complex than previously assumed.

The results of the joint project, which was funded by the German Federal Ministry of Education and Research (BMBF), are published in the article ‘Stabilization of mineral-associated organic carbon in Pleistocene permafrost’ in the journal Nature Communications.

The Arctic is warming dramatically fast compared to other parts of the world. Much of it is covered by permafrost and contains large amounts of carbon, almost twice as much as the atmosphere. This carbon comes from plants that have grown over thousands of years, decomposed in the soil and then become ‘frozen’. Due to strongly rising temperatures in the Arctic, this gigantic freezer is thawing fast. The old carbon stored in it can now be degraded by microorganisms, releasing carbon dioxide and methane into the atmosphere. These greenhouse gases accelerate global warming. The warmer it gets, the more greenhouse gases are in turn released from the permafrost, causing temperatures to rise further and frozen soils and sediments to thaw even faster. “There is a feedback of carbon in permafrost with climate, the strength of which depends largely on those factors that influence microbial degradation,” said Janet Rethemeyer.

In the joint research project, scientists from the Institute of Zoology at the University of Cologne, the University of Tübingen, the Technical University of Munich and the Alfred-Wegener-Institute in Potsdam studied long permafrost cores from the Siberian Arctic. The cores come from very ice-rich, fine-grained sediments – similar to loess in our latitudes – that were deposited in large areas of Siberia and Alaska during the last ice age. The cores, up to 12 metres long, comprise sediments deposited over a period of up to 55,000 years.

The analyses of the permafrost cores show that a significant part (25-35 %) of the carbon is associated with the mineral particles and thus more difficult to access for microorganisms. “Predictions of interactions between thawing permafrost and climate are very complicated because the microbial degradability of the organic material in the sediments has varied greatly over the last 55,000 years. This is due to the different climatic conditions during this long period of deposition,” Janet Rethemeyer explained. Warmer and wetter conditions resulted in poorer binding of carbon to the mineral particles, while a colder and drier climate led to stronger binding, primarily to iron oxides. Stronger binding to iron oxides means that the decomposition rates of old plant material are lower, as Professor Dr. Michael Bonkowski from the Institute of Zoology, Department of Terrestrial Ecology at the University of Cologne has shown in laboratory experiments.

“These new findings can make a significant contribution to making computer models for forecasting greenhouse gas emissions from thawing permafrost more reliable,” said Jannik Martens, who is currently conducting research at Columbia University in New York.

Drought thresholds that impact vegetation growth reveal the critical points of nonlinear vegetation response to soil moisture stress

Peer-Reviewed Publication

SCIENCE CHINA PRESS

In a new study, a group from Peking University, China, presents a novel, data-led method that identifies, at each location, the onset and extent of vegetation suppression under increasing levels of drought.

The drought threshold at which damage starts to occur is identified from simultaneous data streams of soil moisture content and satellite measurements of plant and tree “greenness”. Specifically, vegetation lushness during the growing period is based on the Normalized Difference Vegetation Index (NDVI), the kernel NDVI, the near-infrared reflectance of vegetation (NIRv) and solar-induced chlorophyll fluorescence (SIF), which are measures of vegetation greenness and productivity.

The researchers find that vegetation behaves nonlinearly as soil moisture stress rises. They identify an inflection point that clearly delimits two distinct phases in the response of vegetation growth to increasing drought stress. In the first phase, vegetation is stable and resilient to soil moisture fluctuations because soil moisture is plentiful. In the second phase, vegetation growth decreases rapidly as drought intensifies. “Using our framework, we detect well-defined thresholds of soil moisture beyond which vegetation changes from highly resilient to highly vulnerable as soil water stress intensifies,” says Dr. Xiangyi Li, first author of this work.
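A minimal, data-driven way to locate such an inflection point is a two-segment ("broken-stick") fit, sketched below with synthetic data and a simple grid search over candidate breakpoints. The authors' framework, and the vegetation proxies it uses, are considerably more elaborate than this illustration.

```python
# Illustrative two-segment fit for locating a soil-moisture threshold in the
# vegetation response. Data are synthetic; not the authors' actual method.
import numpy as np

rng = np.random.default_rng(1)
sm = np.sort(rng.uniform(0.05, 0.45, 300))             # soil moisture (m3/m3)
threshold_true = 0.20
ndvi = np.where(sm > threshold_true, 0.65,              # stable phase
                0.65 - 2.0 * (threshold_true - sm))     # rapid decline under stress
ndvi = ndvi + rng.normal(0, 0.02, sm.size)              # observation noise

def sse_for_breakpoint(bp):
    """Sum of squared errors of a two-segment linear fit split at bp."""
    total = 0.0
    for mask in (sm <= bp, sm > bp):
        if mask.sum() < 10:
            return np.inf
        coeffs = np.polyfit(sm[mask], ndvi[mask], 1)
        total += np.sum((ndvi[mask] - np.polyval(coeffs, sm[mask])) ** 2)
    return total

candidates = np.linspace(0.08, 0.40, 200)
best = candidates[np.argmin([sse_for_breakpoint(b) for b in candidates])]
print(f"estimated soil-moisture threshold = {best:.3f} (true value {threshold_true})")
```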

The team show drought thresholds vary geographically, with more forested areas having lower thresholds, making them less sensitive to any emerging drought than less forested regions. The threshold representation, based purely on data, reveals that even state-of-the-art vegetation models often fail to describe the extent to which drought can lower vegetation health. Conversely, current models are overly sensitive to imposed drought conditions for some humid areas with high forest cover. "Our data-driven parameter-sparse representation of drought impacts is a much-needed way to benchmark ecological models", adds Xiangyi.

Arguably, the physical components of climate models have been developed over a longer period and are more reliable. The researchers therefore merge estimates of future meteorological change, including drought, with their observationally constrained descriptions of vegetation response to water stress. Combining these lines of evidence reveals hotspots in East Asia, Europe, the Amazon, southern Australia, and eastern and southern Africa where the risk of drought-induced vegetation damage will increase substantially by the end of the 21st century under a “business-as-usual” emissions scenario.

The analysis was led by Prof. Shilong Piao and Dr. Xiangyi Li from the College of Urban and Environmental Sciences, Peking University, and supported by scientists from Germany, the UK, Spain and the US.


See the article:

Global variations in critical drought thresholds that impact vegetation

https://doi.org/10.1093/nsr/nwad049

Can deep learning help us save mangrove forests?

Peer-Reviewed Publication

PENSOFT PUBLISHERS

Image: Mangrove ecosystems of Qeshm Island in the Persian Gulf, Iran. Credit: Neda Bihamta Toosi

Mangrove forests are an essential component of the coastal zones in tropical and subtropical areas, providing a wide range of goods and ecosystem services that play a vital role in ecology. They are also threatened, disappearing, and degraded across the globe.

One way to stimulate effective mangrove conservation and encourage policies for their protection is to carefully assess mangrove habitats and how they change, and identify fragmented areas. But obtaining this kind of information is not always an easy task.

“Since mangrove forests are located in tidal zones and marshy areas, they are hardly accessible,” says Dr. Neda Bihamta Toosi, postdoc at Isfahan University of Technology in Iran working on landscape pattern changes using remote sensing. In a recent study in the journal Nature Conservation, together with a team of authors, she explored ways to classify these fragile ecosystems using machine learning.

Comparing the performance of different combinations of satellite images and classification techniques, the researchers looked at how good each method was at mapping mangrove ecosystems.

“We developed a novel method with a focus on landscape ecology for mapping the spatial disturbance of mangrove ecosystems,” she explains. “The provided disturbance maps facilitate future management and planning activities for mangrove ecosystems in an efficient way, thus supporting the sustainable conservation of these coastal areas.”

The results of the study showed that object-oriented classification of fused Sentinel images can significantly improve the accuracy of mangrove land use/land cover classification.

“Assessing and monitoring the condition of such ecosystems using model-based landscape metrics and principal component analysis techniques is a time- and cost-effective approach. The use of multispectral remote sensing data to generate a detailed land cover map was essential, and freely available Sentinel-2 data will guarantee its continuity in future,” explains Dr. Bihamta Toosi.
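As a simplified, pixel-based stand-in for the object-oriented workflow described above, the scikit-learn sketch below classifies a synthetic stack of fused-image features with a random forest. The feature values and labels are random, so the reported accuracy is meaningless; the study's actual classifiers, image fusion, segmentation and accuracy assessment are detailed in the article.

```python
# Simplified sketch: random-forest land-cover classification on a synthetic
# feature stack standing in for fused Sentinel-1/-2 data. Not the study's
# object-oriented method; purely structural illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n_pixels, n_features = 2000, 12          # e.g., optical bands plus SAR backscatter
X = rng.normal(size=(n_pixels, n_features))
y = rng.integers(0, 3, size=n_pixels)    # 0 = mangrove, 1 = water, 2 = bare/other

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("overall accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```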

The research team hopes this approach can be used to provide information on the trend of changes in land cover that affect the development and management of mangrove ecosystems, supporting better planning and decision-making.

“Our results on the mapping of mangrove ecosystems can contribute to the improvement of management and conservation strategies for these ecosystems impacted by human activities,” they write in their study.



Image: Mangrove ecosystems on Qeshm Island in the Persian Gulf, Iran. Credit: Neda Bihamta Toosi

Research article:

Soffianian AR, Toosi NB, Asgarian A, Regnauld H, Fakheran S, Waser LT (2023) Evaluating resampled and fused Sentinel-2 data and machine-learning algorithms for mangrove mapping in the northern coast of Qeshm island, Iran. Nature Conservation 52: 1-22. https://doi.org/10.3897/natureconservation.52.89639