Thursday, August 26, 2021

New technology lays groundwork for large-scale, high-resolution 3D displays

Projection approach could enable large, high-definition 3D images for digital signs and immersive experiences

Peer-Reviewed Publication

THE OPTICAL SOCIETY

Optical setup

Image: Researchers combined two different light-field display technologies to project large-scale 3D images with almost diffraction-limited resolution. Their optical setup is shown.

Credit: Byoungho Lee, Seoul National University

WASHINGTON — Researchers have developed a prototype display that uses projection to create large-scale 3D images with ultra-high definition. The new approach helps overcome the limitations of light-field projection, which can create natural-looking 3D images that don’t require special 3D glasses for viewing.

“Our optical design could make it practical to replace 2D flat panel displays with 3D images for digital signs, entertainment, education and other applications where 3D images provide a significant enhancement,” said research team leader Byoungho Lee from Seoul National University in Korea. “Our design could also be modified to provide immersive experiences in movie theaters, for example.”

In The Optical Society (OSA) journal Optics Letters, the researchers describe how they combine two different light-field display technologies to project large-scale 3D images with almost diffraction-limited resolution. The new display is autostereoscopic, meaning it produces a different image for each viewing angle so that the 3D scene can be viewed from various positions.

“We developed a way to carry out all the display processes optically without any digital processing,” said Lee. “This compensates for the limitations of each display technology to allow the creation of high-resolution 3D images on a large screen.”

CAPTION

The new display optically transforms the object display volume generated from the multifocal display into the projection volume for integral imaging by automatically mapping the rays through a microlens array (optical pickup). The transformed information can be enlarged to the large screen through a projection lens. After the projection, the object display volume is reconstructed, passing through another lens array in a similar manner to the existing integral imaging system.

CREDIT

Byoungho Lee, Seoul National University

Combining technologies

Light-field displays work by reproducing light that is reflected from an object in a way that corresponds to the actual visible position. Because autostereoscopic light field displays produce different images for different viewing angles, they require a huge amount of information to be processed. This demand creates a tradeoff between resolution and the size of the displayed image because the hardware of the display gets overwhelmed by the amount of information required.

To overcome this limitation, the researchers designed a new optical configuration that combines a multifocal display with integral imaging. Typically, a multifocal display can generate a high-quality volumetric image, but it is technically difficult to implement on a large-screen system. On the other hand, integral imaging is better at enlarging images.

In the new design, the multifocal display generates a high-resolution 3D, or volumetric, scene while the integral imaging technology enlarges it for viewing on a large screen. The information conversion between the multifocal display and integral imaging is all performed optically without any digital processing.

“Our method goes beyond merely combining two existing methods to achieve an ultrahigh-definition volumetric light-field display with almost diffraction-limited resolution,” said Lee. “We also found a way to effectively resolve the difficulty of enlarging a volumetric scene and overcame problems with information loss that tend to affect integral imaging.”

Large and high-resolution 3D images

After verifying the resolution of their prototype system, the researchers qualitatively confirmed that a volumetric image was reconstructed. The tests showed that the prototype can synthesize a volumetric image of 21.4 cm x 21.4 cm x 32 cm, equivalent to 28.6 megapixels and 36 times the resolution of the original image.
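As a rough sanity check on these figures, the implied pixel pitch and the size of the original image can be worked out. This is a back-of-the-envelope sketch only: it assumes the 28.6 megapixels describe a single 21.4 cm x 21.4 cm lateral plane, which the release does not state.

```python
import math

# Figures quoted in the release; the per-plane interpretation is an assumption.
width_mm = 214.0              # lateral extent of the volumetric image
total_pixels = 28.6e6         # quoted equivalent pixel count
magnification = 36            # quoted resolution gain over the original image

# If the 28.6 MP were spread over one 21.4 cm x 21.4 cm lateral plane,
# the implied pixel pitch would be:
pixels_per_side = math.sqrt(total_pixels)
pitch_um = width_mm / pixels_per_side * 1000

# The 36x gain implies the original image had roughly:
original_pixels = total_pixels / magnification

print(f"~{pixels_per_side:.0f} px per side, pitch ~{pitch_um:.0f} um")
print(f"original image ~{original_pixels / 1e6:.2f} MP")
```

Under those assumptions the pitch comes out near 40 micrometers, i.e. finer than a typical flat-panel pixel, which is consistent with the "almost diffraction-limited" claim.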

“Our approach is very efficient at processing information, which enables a low computing cost as well as simple, high-quality, real-time system configuration,” said Lee. “The optical design can also be seamlessly integrated with various techniques used in existing light-field displays.”

The researchers are now working to optimize the optics and further reduce the complexity of the multifocal display to make the projector more compact. They note that because the system is a fusion of two different technologies, the performance of their proposed system will likely improve as each technology develops.

Paper: Y. Jo, K. Bang, D. Yoo, B. Lee, “Ultrahigh-definition volumetric light field projection,” Opt. Lett. 46(17), 4212-4215 (2021).
DOI: https://doi.org/10.1364/OL.431156

About Optics Letters
Optics Letters offers rapid dissemination of new results in all areas of optical science with short, original, peer-reviewed communications. Optics Letters accepts papers that are noteworthy to a substantial part of the optics community. Published by The Optical Society, the journal is led by Editor-in-Chief Miguel Alonso of the Institut Fresnel, École Centrale de Marseille and Aix-Marseille Université, France, and the University of Rochester, USA. Optics Letters is available online at OSA Publishing.

About The Optical Society

The Optical Society (OSA) is dedicated to promoting the generation, application, archiving, and dissemination of knowledge in optics and photonics worldwide. Founded in 1916, it is the leading organization for scientists, engineers, business professionals, students, and others interested in the science of light. OSA’s renowned publications, meetings, online resources, and in-person activities fuel discoveries, shape real-life applications and accelerate scientific, technical, and educational achievement.

Media Contact:

mediarelations@osa.org             

 

US Department of Energy’s Argonne National Laboratory and Hewlett Packard Enterprise prepare for exascale era with new testbed supercomputer


Business Announcement

DOE/ARGONNE NATIONAL LABORATORY

Polaris Rendering

Image: Installation of the Polaris supercomputer system at the Argonne Leadership Computing Facility in August 2021. Polaris provides researchers with a powerful new tool to prepare for science in the exascale era, when computers will perform a billion billion calculations per second.

Credit: Argonne National Laboratory

Argonne National Laboratory’s new system, Polaris, built by HPE, will optimize AI, engineering and scientific projects for the forthcoming Aurora exascale supercomputer.

LEMONT, Ill., and HOUSTON, August 25, 2021 — The U.S. Department of Energy’s Argonne National Laboratory and Hewlett Packard Enterprise (NYSE: HPE) today unveiled a new testbed supercomputer to prepare critical workloads for future exascale systems that will deliver up to four times faster performance than Argonne’s current supercomputers.

The new system, which Argonne has named Polaris, will be built by HPE, and hosted and managed by the Argonne Leadership Computing Facility (ALCF), a U.S. DOE Office of Science User Facility. It will enable scientists and developers to test and optimize software codes and applications to tackle a range of artificial intelligence (AI), engineering and scientific projects planned for the forthcoming exascale supercomputer, Aurora, a joint collaboration between Argonne, Intel and HPE.

“Polaris is well equipped to help move the ALCF into the exascale era of computational science by accelerating the application of AI capabilities to the growing data and simulation demands of our users.” — Michael E. Papka, ALCF director

Polaris is designed with industry-leading high performance computing (HPC) and AI solutions to advance investigations into society’s most complex and pressing issues, from understanding the biology of viruses to revealing the secrets of the universe. It will also augment Argonne’s ongoing efforts and achievements in areas such as clean energy, climate resilience and manufacturing.

In addition, Polaris will help researchers integrate HPC and AI with other experimental facilities, including Argonne’s Advanced Photon Source and the Center for Nanoscale Materials, both DOE Office of Science User Facilities.

“Polaris is well equipped to help move the ALCF into the exascale era of computational science by accelerating the application of AI capabilities to the growing data and simulation demands of our users,” said Michael E. Papka, director at the ALCF. ​“Beyond getting us ready for Aurora, Polaris will further provide a platform to experiment with the integration of supercomputers and large-scale experiment facilities, like the Advanced Photon Source, making HPC available to more scientific communities. Polaris will also provide a broader opportunity to help prototype and test the integration of HPC with real-time experiments and sensor networks.”

Polaris: Argonne’s North Star propels new era of exascale

Polaris will deliver approximately 44 petaflops of peak double precision performance and nearly 1.4 exaflops of theoretical AI performance, which is based on mixed-precision compute capabilities.

It will be built using 280 HPE Apollo Gen10 Plus systems, which are HPC and AI architectures built for the exascale era and customized to include the following end-to-end solutions:

  • Powerful compute to improve modeling, simulation and data-intensive workflows, using 560 2nd and 3rd Gen AMD EPYC™ processors.
  • Supercharged AI capabilities to support data- and image-intensive workloads while optimizing future exascale-level GPU-enabled deployments, using 2240 NVIDIA® A100 Tensor Core GPUs, making Polaris the ALCF’s largest GPU-based system to date.
  • Higher speed and congestion control for larger data-intensive and AI workloads with HPE Slingshot, the world’s only high-performance Ethernet fabric designed for HPC and AI solutions. HPE Slingshot will also be featured in Argonne’s Aurora exascale system.
  • Fine-grained, centralized monitoring and management for optimal performance with HPE Performance Cluster Manager, a system management software solution.
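The quoted peak figures line up with the GPU count above. The sketch below checks this arithmetic using NVIDIA's published per-GPU A100 rates (FP64 Tensor Core: 19.5 TFLOPS; FP16/BF16 Tensor Core with structured sparsity: 624 TFLOPS); the release itself does not give this breakdown, so this is an illustrative reconstruction rather than an official accounting.

```python
# Sanity check of Polaris's quoted peak numbers against per-GPU figures.
num_gpus = 2240

fp64_tflops_per_gpu = 19.5   # NVIDIA A100 FP64 Tensor Core peak (assumed basis)
ai_tflops_per_gpu = 624.0    # A100 mixed precision with sparsity (assumed basis)

peak_fp64_pflops = num_gpus * fp64_tflops_per_gpu / 1e3
peak_ai_eflops = num_gpus * ai_tflops_per_gpu / 1e6

print(f"peak FP64: ~{peak_fp64_pflops:.1f} PF")  # ~43.7 PF, matching "approximately 44"
print(f"peak AI:   ~{peak_ai_eflops:.2f} EF")    # ~1.40 EF, matching "nearly 1.4"
```

The close match suggests the "44 petaflops" and "1.4 exaflops" figures are essentially the GPU count times the per-device peaks, before any CPU contribution.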

 ​“As we approach the exascale era, which will power a new age of insight and innovation, high performance computing (HPC) will play a critical role in harnessing data to take on the world’s most pressing challenges. Increasingly, the computational power and scale required to process artificial intelligence and machine learning data sets can only be delivered through HPC systems, and HPE uniquely provides a powerful, software-driven platform capable of tackling complex scientific data and simulations,” said Justin Hotard, senior vice president and general manager, HPC and Mission Critical Solutions at HPE. ​“The U.S. Department of Energy’s (DOE) Office of Science continues to make tremendous impacts in accelerating scientific and engineering breakthroughs using HPC. Our latest collaboration with the DOE’s Argonne National Laboratory to build and deliver the Polaris testbed supercomputer will further its mission by preparing users for the magnitude of technological advancement that exascale systems will deliver.”

Polaris prepares scientists to tackle exascale-level problems

Initially, Polaris will be dedicated to research teams participating in initiatives such as the DOE’s Exascale Computing Project and the ALCF’s Aurora Early Science Program, which are already tackling complex issues such as:

  • Advancing cancer treatment by accelerating research in understanding the role of biological variables in a tumor cell’s path by advancing the use of data science to drive analysis of extreme-scale fluid-structure-interaction simulations; and predicting drug response to tumor cells by enabling billions of virtual drugs to be screened from single to numerous combinations, while predicting their effects on tumorous cells.
  • Advancing the nation’s energy security, while minimizing climate impact with biochemical research through the NWChemEx project, funded by the DOE’s Office of Science Biological and Environmental Research. Researchers are solving the molecular problems in biofuel production by developing models that optimize feedstock to produce biomass and analyze the process of converting biomass materials into biofuels.
  • Expanding the boundaries of physics with particle collision research in the ATLAS experiment, which uses the Large Hadron Collider, the world’s most powerful particle accelerator, sited at CERN, near Geneva Switzerland. Scientists study the complex products from particle collisions in very large detectors to deepen our understanding of the fundamental constituents of matter, including the search for evidence of dark matter.

User communities within the DOE’s Exascale Computing Project will also use Polaris for optimizing engineering tasks for Argonne’s Aurora, which includes scaling of combined CPU- and GPU-enabled systems and the complex integration of workflows combining modeling, simulation, AI and other data-intensive components.

The delivery and installation of Polaris is scheduled to begin this month. It will go into use starting early 2022 and will be open to the broader HPC community in spring of 2022 to prepare workloads for the next generation of DOE’s high performance computing resources.

AMD, the AMD Arrow logo, EPYC and combinations thereof are trademarks of Advanced Micro Devices, Inc.

About Hewlett Packard Enterprise

Hewlett Packard Enterprise (NYSE: HPE) is the global edge-to-cloud company that helps organizations accelerate outcomes by unlocking value from all of their data, everywhere. Built on decades of reimagining the future and innovating to advance the way people live and work, HPE delivers unique, open and intelligent technology solutions as a service – spanning Compute, Storage, Software, Intelligent Edge, High Performance Computing and Mission Critical Solutions – with a consistent experience across all clouds and edges, designed to help customers develop new business models, engage in new ways, and increase operational performance. For more information, visit: www.hpe.com.

The Argonne Leadership Computing Facility provides supercomputing capabilities to the scientific and engineering community to advance fundamental discovery and understanding in a broad range of disciplines. Supported by the U.S. Department of Energy’s (DOE’s) Office of Science, Advanced Scientific Computing Research (ASCR) program, the ALCF is one of two DOE Leadership Computing Facilities in the nation dedicated to open science.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.


Carnegie Mellon, University of Washington to pioneer platforms that harness astrophysical data to unravel the universe’s mysteries


Grant and Award Announcement

CARNEGIE MELLON UNIVERSITY

The Rubin Observatory

Image: Drone photos of the Rubin Observatory summit facility (provided by dome surveyor Oscar Rivera) show advancement to near 100% on dome cladding completion, as well as closure of the louvers, rear door and shutters.

Credit: Rubin Obs/NSF/AURA

Carnegie Mellon University and the University of Washington have announced an expansive, multi-year collaboration to create new software platforms to analyze large astronomical datasets generated by the upcoming Legacy Survey of Space and Time (LSST), which will be carried out by the Vera C. Rubin Observatory in northern Chile. The open-source platforms are part of the new LSST Interdisciplinary Network for Collaboration and Computing (LINCC) and will fundamentally change how scientists use modern computational methods to make sense of big data.

Through the LSST, the Rubin Observatory, a joint initiative of the National Science Foundation and the Department of Energy, will collect and process more than 20 terabytes of data each night — and up to 10 petabytes each year for 10 years — and will build detailed composite images of the southern sky. Over its expected decade of observations, astrophysicists estimate the Department of Energy’s LSST Camera will detect and capture images of an estimated 30 billion stars, galaxies, stellar clusters and asteroids. Each point in the sky will be visited around 1,000 times over the survey’s 10 years, providing researchers with valuable time series data.
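The quoted data rates are internally consistent, as a quick back-of-the-envelope check shows. The inputs below are the figures from the announcement; the per-point revisit cadence derived at the end is a rough inference of ours, not a stated survey parameter.

```python
# Back-of-the-envelope check on the LSST's quoted data rates.
tb_per_night = 20
nights_per_year = 365        # upper bound; weather reduces actual observing nights
survey_years = 10
visits_per_point = 1000      # quoted visits to each sky point over the survey

raw_pb_per_year = tb_per_night * nights_per_year / 1000
total_pb = 10 * survey_years  # using the quoted ceiling of 10 PB per year

visits_per_year = visits_per_point / survey_years
print(f"~{raw_pb_per_year:.1f} PB/yr raw (quoted ceiling: 10 PB/yr)")
print(f"~{total_pb} PB over the survey; ~{visits_per_year:.0f} visits per sky point per year")
```

Twenty terabytes a night comes to roughly 7.3 petabytes a year, sitting comfortably under the quoted 10 PB/year ceiling once processed data products are included, and 1,000 visits over ten years implies each point is revisited about 100 times a year, i.e. every few nights.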

Scientists plan to use this data to address fundamental questions about our universe, such as the formation of our solar system, the course of near-Earth asteroids, the birth and death of stars, the nature of dark matter and dark energy, the universe’s murky early years and its ultimate fate, among other things.

"Our goal is to maximize the scientific output and societal impact of Rubin LSST, and these analysis tools will go a huge way toward doing just that,” said Jeno Sokoloski, director for science at the LSST Corporation. “They will be freely available to all researchers, students, teachers and members of the general public."

The Rubin Observatory will produce an unprecedented data set through the LSST. To take advantage of this opportunity, the LSST Corporation created the LSST Interdisciplinary Network for Collaboration and Computing (LINCC), whose launch was announced August 9 at the Rubin Observatory Project & Community Workshop. One of LINCC’s primary goals is to create new and improved analysis infrastructure that can accommodate the scale and complexity of the data and yield meaningful, useful pipelines of discovery for LSST data.

“Many of the LSST’s science objectives share common traits and computational challenges. If we develop our algorithms and analysis frameworks with forethought, we can use them to enable many of the survey’s core science objectives,” said Rachel Mandelbaum, professor of physics and member of the McWilliams Center for Cosmology at Carnegie Mellon.

CAPTION

The complete focal plane of the future LSST Camera is more than 60 cm (2 feet) wide and contains 189 individual sensors that will produce 3200-megapixel images.

CREDIT

Jacqueline Orrell/SLAC National Accelerator Laboratory/NSF/DOE/Rubin Observatory/AURA

The LINCC analysis platforms are supported by Schmidt Futures, a philanthropic initiative founded by Eric and Wendy Schmidt that bets early on exceptional people making the world better. This project is part of Schmidt Futures’ work in astrophysics, which aims to accelerate our knowledge about the universe by supporting the development of software and hardware platforms to facilitate research across the field of astronomy.

“Many years ago, the Schmidt family provided one of the first grants to advance the original design of the Vera C. Rubin Observatory. We believe this telescope is one of the most important and eagerly awaited instruments in astrophysics in this decade. By developing platforms to analyze the astronomical datasets captured by the LSST, Carnegie Mellon University and the University of Washington are transforming what is possible in the field of astronomy,” said Stuart Feldman, chief scientist at Schmidt Futures.

“Tools that utilize the power of cloud computing will allow any researcher to search and analyze data at the scale of the LSST, not just speeding up the rate at which we make discoveries but changing the scientific questions that we can ask,” said Andrew Connolly, a professor of astronomy, director of the eScience Institute and former director of the Data Intensive Research in Astrophysics and Cosmology (DiRAC) Institute at the University of Washington.

Connolly and Carnegie Mellon’s Mandelbaum will co-lead the project, which will consist of programmers and scientists based at Carnegie Mellon and the University of Washington, who will create platforms using professional software engineering practices and tools. Specifically, they will create a “cloud-first” system that also supports high-performance computing (HPC) systems in partnership with the Pittsburgh Supercomputing Center (PSC), a joint effort of Carnegie Mellon and the University of Pittsburgh, and the National Science Foundation’s NOIRLab. LSSTC will run programs to engage the LSST Science Collaborations and broader science community in the design, testing and use of the new tools.

“The software funded by this gift will magnify the scientific return on the public investment by the National Science Foundation and the Department of Energy to build and operate Rubin Observatory’s revolutionary telescope, camera and data systems,” said Adam Bolton, director of the Community Science and Data Center (CSDC) at NSF’s NOIRLab. CSDC will collaborate with LINCC scientists and engineers to make the LINCC framework accessible to the broader astronomical community.

Through this new project, new algorithms and processing pipelines developed at LINCC will be able to be used across fields within astrophysics and cosmology to sift through false signals, filter out noise in the data and flag potentially important objects for follow-up observations. The tools developed by LINCC will support a “census of our solar system” that will chart the courses of asteroids; help researchers to understand how the universe changes with time; and build a 3D view of the universe’s history.

“The Pittsburgh Supercomputing Center is very excited to continue to support data-intensive astrophysics research being done by scientists worldwide. The work will set the stage for the forefront of computational infrastructure by providing the community with tools and frameworks to handle the massive amount of data coming off of the next generation of telescopes,” said Shawn Brown, director of the PSC.

Northwestern University and the University of Arizona, in addition to Carnegie Mellon and the University of Washington, are hub sites for LINCC. The University of Pittsburgh will partner with the Carnegie Mellon hub.  

 

The climate emergency in a COVID year


Business Announcement

AMERICAN INSTITUTE OF BIOLOGICAL SCIENCES

The BioScience Talks podcast features discussions of topical issues related to the biological sciences.

In a year marked by unprecedented flooding, deadly avalanches, and scorching heat waves and wildfires, the climate emergency's enormous cost—whether measured in lost resources or human lives—is all too apparent. Writing in BioScience, a group led by William J. Ripple and Christopher Wolf, both with Oregon State University, update their striking 2019 "World Scientists’ Warning of a Climate Emergency" with new data on the climate's health. The news is not good.

            Although fossil fuel use dipped slightly in 2020, a widely predicted result of the COVID-19 pandemic, the authors report that carbon dioxide, methane, and nitrous oxide "have all set new year-to-date records for atmospheric concentrations in both 2020 and 2021." Furthermore, 16 out of 31 tracked planetary vital signs, reflecting metrics such as greenhouse gas concentrations, ocean heat content, and ice mass, have also set disquieting records. However, there were a few bright spots, including fossil fuel subsidies reaching a record low and fossil fuel divestment reaching a record high.

            In this episode of BioScience Talks, coauthor Jillian Gregg, who is with the Sustainability Double Degree program and the Department of Crop and Soil Science at Oregon State University, joins us to discuss the latest climate update and the urgent actions needed to ensure the long-term sustainability of human civilization.

To hear the whole discussion, visit this link for this latest episode of the BioScience Talks podcast.

###

BioScience, published monthly by Oxford Journals, is the journal of the American Institute of Biological Sciences (AIBS). BioScience is a forum for integrating the life sciences that publishes commentary and peer-reviewed articles. The journal has been published since 1964. AIBS is an organization for professional scientific societies and organizations, and individuals, involved with biology. AIBS provides decision-makers with high-quality, vetted information for the advancement of biology and society. Follow BioScience on Twitter @AIBSbiology.

Oxford Journals is a division of Oxford University Press. Oxford Journals publishes well over 300 academic and research journals covering a broad range of subject areas, two-thirds of which are published in collaboration with learned societies and other international organizations. The division has been publishing journals for more than a century, and as part of the world’s oldest and largest university press, has more than 500 years of publishing expertise behind it. Follow Oxford Journals on Twitter @OxfordJournals.

 

What should an ideal refugee protection system look like?


Our research may hold the answer

Meeting Announcement

THE UNIVERSITY OF BERGEN

Get to know PROTECT's research

Video: “PROTECT the Right to International Protection: A Pendulum between Globalization and Nativization?” is an EU-funded research project launched on 1 February 2020. We study the impacts of the UN's Global Compacts on Refugees and Migration, which are two non-binding frameworks promoting international cooperation and responsibility-sharing as key solutions to handle global refugee flows. By studying how the Compacts are received and implemented in different countries, and how they interact with existing legal frameworks and governance architectures, we investigate the Compacts' impact on refugees' right to international protection.

Credit: PROTECT project

Can non-binding agreements hold the necessary solutions?
In 2018, a number of countries signed the UN’s Refugee and Migration Compacts, two new agreements that aim to consolidate the objectives and obligations outlined by the Refugee Convention. One of the Compacts’ most important goals is securing a better and fairer distribution of the world’s refugees: over 60 per cent of all refugees are currently hosted by just ten countries, and Turkey alone hosts 3.5 million Syrian refugees. Moreover, 86 per cent of all refugees live in poor developing countries, which lack the necessary means to handle and support large refugee flows.

The EU funded research project PROTECT studies the Compacts’ potential and how they impact global refugee protection and governance – and ultimately whether they have a positive impact on states’ ability and will to fulfil the objectives of the Refugee Convention.

At our fully-digital Midterm Conference on 26 - 27 August 2021, we will share our initial findings with the world. Here, we will take stock of current challenges – and solutions – linked to global refugee protection. We will propose answers to the following questions:

  • Is the international refugee protection system an efficient tool to meet today's massive humanitarian challenges? If not, how can it be improved? 

  • Are the Global Compact on Migration and the Global Compact on Refugees a new opportunity, or a hindrance, for advancing the right to international protection?

  • What about the European Commission's latest proposal on a New Migration and Asylum Pact and the current CEAS? 

  • How are the EU, Canada, and South Africa facing protection challenges? 

  • What kind of pressures are decision-makers encountering when making and implementing refugee and asylum policies? 

Practicalities:

In order to avoid Zoom fatigue, we have pre-recorded many of the conference talks, available here. At the conference, you can catch short live presentations and interactive Q&A through Padlet. We have invited speakers representing both academia and the policy world.

THE ECOLOGY OF BIRDS

Bird communities threatened by urbanization


Research team led by Göttingen University investigates farmland birds in an Indian megacity

Peer-Reviewed Publication

UNIVERSITY OF GÖTTINGEN

Indian Paradise Flycatcher

Image: Many birds, like this Indian paradise flycatcher (Terpsiphone paradisi), which eats insects, would not thrive in an urban environment.

Credit: Gabriel Marcacci

Urbanization is one of the most drastic forms of land-use change, and its negative consequences on biodiversity have been studied extensively in temperate countries such as Germany. However, less research has been conducted in tropical regions from the Global South, where most of the ongoing and future urbanization hotspots are located, and little is known about its effects on agricultural biodiversity and associated ecosystems. A research team from the University of Göttingen and the University of Hohenheim, in collaboration with the University of Agricultural Sciences of Bangalore in India, investigated the effects of urbanization on farmland bird communities in and around Bangalore, a city of over 10 million inhabitants in South India. They found that urbanization homogenizes farmland bird communities, filtering out species with certain functional traits, such as insect-eating birds, which are important for pest control. The results were published in Global Change Biology.

A local ornithological expert conducted regular bird surveys over one year and recorded 126 bird species. Using remote sensing techniques, the researchers processed satellite pictures to produce a map of different land uses, from which urbanization intensity was measured as the proportion of sealed surfaces and buildings in the landscape. They then analyzed how farmland bird communities changed along a gradient of urbanization using newly developed statistical analyses.
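The urbanization-intensity metric described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' actual pipeline: it assumes a classified land-use raster in which certain (hypothetical) class codes mark sealed surfaces, and scores each survey point by the fraction of sealed pixels in a square window around it.

```python
import numpy as np

# Hypothetical class codes for "sealed" land use (e.g. 1 = building, 2 = paved).
SEALED_CLASSES = {1, 2}

def urbanization_intensity(landuse: np.ndarray, row: int, col: int,
                           half_window: int = 10) -> float:
    """Fraction of sealed pixels in a square window around (row, col)."""
    r0, r1 = max(0, row - half_window), row + half_window + 1
    c0, c1 = max(0, col - half_window), col + half_window + 1
    patch = landuse[r0:r1, c0:c1]
    return float(np.isin(patch, list(SEALED_CLASSES)).mean())

# Toy raster: a 40x40 rural scene (class 0) with a built-up block in one corner.
raster = np.zeros((40, 40), dtype=int)
raster[:10, :10] = 1

print(urbanization_intensity(raster, 5, 5))    # near the built-up block: high
print(urbanization_intensity(raster, 30, 30))  # deep in farmland: 0.0
```

Computing this score at each bird-survey location yields the continuous urbanization gradient along which community changes can then be modeled.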

“We found that urban bird communities were impoverished subsets of rural communities, both in terms of species composition and the ecological functions they provide,” says first author Gabriel Marcacci, PhD student in the Functional Agrobiodiversity group at the University of Göttingen. “Birds perform important roles in the environment, for instance controlling pests by eating insects, scavenging and removing carrion, or eating fruits and dispersing the seeds. But only bird species that are well adapted to urban environments such as pigeons or crows can thrive.”

“The homogenization – resulting from losing diversity through the exclusion of certain groups – of farmland bird communities may disrupt important ecosystem functions and services in urban agroecosystems such as pest removal by insect-eating birds,” explains Professor Catrin Westphal, head of the Functional Agrobiodiversity group. Urban communities were found to be more sensitive to species loss, endangering ecosystem resilience.

“Our study underscores urbanization as a serious threat to biological communities and ecosystem functioning that may affect food production systems,“ adds Professor Ingo Graß, head of the Department of Ecology of Tropical Agricultural Systems at the University of Hohenheim. Professor Teja Tscharntke concludes, “This concern is especially relevant for countries from the Global South where urban agriculture and ecosystem services play an increasingly important role for food security.”

CAPTION

Urban farms in Bangalore (India), where the study was carried out: vegetable fields next to newly built high-rise buildings.

CREDIT

Arne Wenzel


Original publication: Marcacci, G. et al. (2021) “Taxonomic and functional homogenization of farmland birds along an urbanization gradient in a tropical megacity”, Global Change Biology. https://doi.org/10.1111/gcb.15755

 

Research helps decision-makers achieve sustainability goals


Press invitation: The 2021 Gothenburg International Research Conference on Sustainable Development Goal 8

Meeting Announcement

UNIVERSITY OF GOTHENBURG

On 26-27 August, researchers from different parts of the world will gather at the School of Business, Economics, and Law at the University of Gothenburg to discuss research that can contribute to achieving the UN's Global Goal 8 - decent work and economic growth. The aim is to help decision-makers implement policies that support that sustainability goal.

Sustainable Development Goal (SDG) 8 is an overarching and very ambitious goal with several very specific sub-goals. It covers both the pace and the content of growth, as well as a range of factors concerning working conditions, the creation of meaningful employment, slavery, innovation, resource efficiency, sustainable tourism, and much more.

The conference aims to present an overview of research in this area and how it can contribute to achieving sub-goals within SDG 8.

“With SDG 8, we test a way for researchers to get involved in societal issues with the tools we have. We start from politically defined goals and look at how we can contribute with our research - it is actually somewhat of a new way of working. The aim is to increase the availability of relevant research”, says Gunnar Köhlin, Director of Environment for Development and one of the organisers.

Some universities around the world have taken on the responsibility of coordinating research related to one of the Global Goals. The University of Gothenburg has shouldered the responsibility for SDG 8 and collaborates with eight universities in Asia, Africa, and South America.

Media are welcome to attend the conference. All parts can be followed digitally. To register, email jenni.stromstad@gu.se.

The 2021 Gothenburg International Research Conference on Sustainable Development Goal 8
Date and venue: 26 – 27 August. This is a hybrid conference where some of the participants are present at the School of Business, Economics and Law, Vasagatan 1, Gothenburg, Sweden, and others participate digitally.

Participants: About 120 Swedish and international researchers, many from low- or middle-income countries.

From the programme:

  • Discussions on research related to SDG 8 and its various sub-goals.
  • Pam Fredman, President, International Association of Universities (IAU): ”IAU and the role of Higher Education and Research in achieving the SDGs”
  • Johannes Siegrist, Heinrich-Heine-University Düsseldorf: “Threats to decent work and workers' health: Challenges for research and policy”
  • Steven Stone, Chief, Resources and Markets Branch, UN Environment Programme (UNEP): “Sustainable Economic Growth: Oxymoron or Challenge of the Century?”
  • Chantal Line Carpentier, United Nations Conference on Trade and Development (UNCTAD): “SDGs, their history, future, and implications?”
  • Driss Ezzine de Blas, Chief Investigator at CIRAD and visiting researcher at the Center for the Sustainable Development Goals for Latin America and the Caribbean at Universidad de los Andes.

See full programme

Organisers: School of Business, Economics, and Law at the University of Gothenburg, Gothenburg Centre for Sustainable Development (GMV), the SDG 8 Initiative at the University of Gothenburg.

Contact:
Gunnar Köhlin, Director, Environment for Development
+46 31-786 44 26, +46 705-35 05 08, gunnar.kohlin@efd.gu.se

Jenni Strömstad, Project Leader, the SDG 8 Initiative
+46 31-786 58 22, +46 766-18 58 22, jenni.stromstad@gu.se