Sunday, August 01, 2021

Golden age of genre films leads Australian industry revival


Despite being often dismissed by critics, genre movies are leading a box-office boom for Australian films

Book Announcement

QUEENSLAND UNIVERSITY OF TECHNOLOGY

IMAGE: AUSTRALIAN GENRE FILM BOOK COVER

CREDIT: ROUTLEDGE

The Australian film industry is in the middle of a renaissance, with locally produced popular genre movies riding a wave of success, from action, comedy and sci-fi films to crime, thrillers, and our unique horror productions, says a QUT researcher.

Associate Professor Mark Ryan, a lecturer in film, screen and animation, and a chief investigator with QUT’s Digital Media Research Centre, argues the study of film genre has been long overlooked in Australian film studies and deserves more attention.

He adds that despite some exceptions, critics in the past have largely relegated local genre films to B-movie status and celebrated Australian movies that contributed to a sense of national identity and history; films like Sunday Too Far Away, Breaker Morant, and Gallipoli.

“Since the mid-1970s, genre films made largely for a popular, commercial audience have held a tenuous position in industry practices, film culture, policy frameworks and public funding initiatives, but we are now in a golden age of genre cinema,” said Professor Ryan, who, along with Associate Professor Kelly McWilliam from the University of Southern Queensland, has co-edited a new book, Australian Genre Film (Routledge).

“The 1980s was a productive period for genre films but the 1990s Australian output was dominated by independent art films, dramas and comedies. National film policy prioritised the development of an industry contributing to the screening and preservation of Australian culture, identity, and history. Genre films were not seen to fit that brief.

“Between the mid-2000s and 2020, however, there was a surge in action, adventure, horror and science fiction films. Our filmmakers are turning out more movies across a broad spectrum of popular movie types, from low-budget to blockbuster productions, including some that blend more than one genre, such as Mystery Road (thriller and crime) or Wolf Creek (horror and road movie).”

According to Professor Ryan, the current boom in genre film making shows no sign of slowing down and began after the federal government created Screen Australia, which in turn introduced critical core production and post-production finance incentives.

“A key role of Screen Australia was to support ‘commercially focused’ films with the potential to appeal to audiences and it has certainly succeeded in this. Between 2007 and 2017, its Producer Offset contributed over AU$900 million in rebates to producers in support of 291 features,” he said.

“Eight of the top 20 highest-earning Australian feature films of all time were produced in the last decade, and each of these films has a dominant primary genre or hybrid generic identity. They include Mao’s Last Dancer (biopic), The Sapphires (musical), The Great Gatsby (drama/romance), The Water Diviner (war), The Dressmaker (revenge/thriller) and Mad Max: Fury Road (action/road).

“Previously, commercial movie genres associated with Hollywood cinema such as action, horror, thrillers, and the musical were viewed by film critics as American genres to the extent that it became ‘unacceptable’ to make Australian films with Hollywood generic conventions.

“The rise of the genre film has also been fuelled by an explosion in guerrilla filmmaking thanks to greater availability of low-cost high-definition cameras and digital editing equipment alongside the proliferation of video on demand platforms, as well as increased production of transnational cinema.”

Professor Ryan said that while the study of movie genres was a well-established subfield of film studies, the focus had overwhelmingly been on Hollywood. His book turns the lens to Australia, analysing 13 significant Australian genres: action, biopics, comedy, crime, horror, musical, road movie, romance, science fiction, teen, thriller, war, and the Western.

His own field of expertise is horror, which he covers in the chapter “A Monstrous Landscape Filled with Killer Animals and Madmen: Tropes of Contemporary Australian Horror Movies”.

“Unlike other major movie genres such as fantasy, action, and science fiction, which until quite recently struggled to gain a foothold in the Australian feature film industry, horror has been an enduring genre for independent filmmakers over the last 50 years,” he said.

“In the last 20 years, purely in terms of the volume of films produced, the horror genre has become a far more significant form of production for the Australian film industry even if it remains largely a cult genre.

“Since the mid-2000s, horror has become both a staple and, at times, a commercially viable genre for low-budget filmmakers targeting international markets. In this time, the Australian film industry has produced several high-profile horror movies that circulated widely in international cinema markets, such as Bait, Saw, Daybreakers and Wolf Creek.

“However, the genre is often still dismissed by reviewers as unworthy of serious critical attention unless a film can be considered within a more reputable critical context, like The Babadook, which was internationally acclaimed and received as art-horror when it was released in 2014,” he said.

Australian Genre Film is published by Routledge. 


 

New research infuses equity principles into the algorithm development process


Peer-Reviewed Publication

NYU TANDON SCHOOL OF ENGINEERING

BROOKLYN, New York, Thursday, July 29, 2021 – In the U.S., where one was born, one’s social and economic background, the neighborhoods in which one spends one’s formative years, and where one grows old account for 25% to 60% of deaths in any given year, partly because these forces play a significant role in the occurrence and outcomes of heart disease, cancer, unintentional injuries, chronic lower respiratory diseases, and cerebrovascular diseases, the five leading causes of death.

While data on such “macro” factors is critical to tracking and predicting health outcomes for individuals and communities, analysts who apply machine-learning tools to health outcomes tend to rely on “micro” data constrained to purely clinical settings and driven by healthcare data and processes inside the hospital, leaving factors that could shed light on healthcare disparities in the dark.

Researchers at the NYU Tandon School of Engineering and NYU School of Global Public Health (NYU GPH), in a new perspective, “Machine learning and algorithmic fairness in public and population health,” in Nature Machine Intelligence, aim to activate the machine learning community to account for “macro” factors and their impact on health. Thinking outside the clinical “box” and beyond the strict limits of individual factors, Rumi Chunara, associate professor of computer science and engineering at NYU Tandon and of biostatistics at the NYU GPH, found a new approach to incorporating the larger web of relevant data for predictive modeling for individual and community health outcomes.

“Research on what causes and what reduces inequity shows that to avoid creating more disparities it is essential to consider upstream factors as well,” explained Chunara. She noted, on the one hand, the large body of work on AI and machine learning implementation in healthcare in areas like image analysis, radiography, and pathology, and on the other, the strong awareness and advocacy focused on such areas as structural racism, police brutality, and healthcare disparities that came to light around the COVID-19 pandemic.

“Our goal is to take that work and the explosion of data-rich machine learning in healthcare, and create a holistic view beyond the clinical setting, incorporating data about communities and the environment.”

Chunara, along with her doctoral students Vishwali Mhasawade and Yuan Zhao, at NYU Tandon and NYU GPH, respectively, leveraged the Social Ecological Model, a framework for understanding how the health, habits and behavior of an individual are affected by factors such as public policies at the national and international level and the availability of health resources within a community and neighborhood. The team demonstrates how principles of this model can guide algorithm development so that algorithms are designed and used more equitably.

The researchers organized existing work into a taxonomy of the tasks for which machine learning and AI are used, spanning prediction, intervention, effect identification and allocation, to show examples of how a multi-level perspective can be leveraged. The authors also show how the same framework applies to considerations of data privacy, governance, and best practices for shifting the healthcare burden away from individuals and toward improving equity.

As an example of such approaches, members of the same team recently presented a new approach, “causal multi-level fairness,” at the AAAI/ACM Conference on Artificial Intelligence, Ethics and Society, which draws on the larger web of relevant data when assessing the fairness of algorithms. This work builds on the field of “algorithmic fairness,” which, to date, has been limited by its exclusive focus on individual-level attributes such as gender and race.

In this work Mhasawade and Chunara formalized a novel approach to understanding fairness relationships using tools from causal inference, synthesizing a means by which an investigator could assess and account for effects of sensitive macro attributes and not merely individual factors. They developed the algorithm for their approach and provided the settings under which it is applicable. They also illustrated their method on data showing how predictions based merely on data points associated with labels like race, income and gender are of limited value if sensitive attributes are not accounted for, or are accounted for without proper context.

“As in healthcare, algorithmic fairness tends to be focused on labels — men and women, Black versus white, etc. — without considering multiple layers of influence from a causal perspective to decide what is fair and unfair in predictions,” said Chunara. “Our work presents a framework for thinking not only about equity in algorithms but also what types of data we use in them.”
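The pitfall Chunara describes can be made concrete with a toy calculation (entirely hypothetical data invented for illustration; the group labels and the "neighborhood resources" variable are not from the study). When an outcome is driven by a macro factor that is correlated with group membership, an individual-level analysis sees a group disparity that disappears once the macro factor is accounted for:

```python
# Hypothetical records: (group, neighborhood_resources, good_outcome).
# The outcome depends only on the macro factor (neighborhood resources),
# but group membership is correlated with neighborhood in this toy data.
data = (
    [("A", "high", 1)] * 3 + [("A", "high", 0)] * 1 +
    [("A", "low", 1)] * 1 + [("A", "low", 0)] * 3 +
    [("B", "high", 1)] * 3 + [("B", "high", 0)] * 1 +
    [("B", "low", 1)] * 3 + [("B", "low", 0)] * 9
)

def rate(rows):
    """Fraction of records with a good outcome."""
    return sum(r[2] for r in rows) / len(rows)

def group_rate(g):
    # Individual-level view: outcomes appear to differ by group.
    return rate([r for r in data if r[0] == g])

def stratified_rate(g, hood):
    # Multi-level view: within each neighborhood stratum the groups match.
    return rate([r for r in data if r[0] == g and r[1] == hood])
```

Here groups A and B have identical outcome rates within each neighborhood stratum; the apparent group gap comes entirely from which neighborhoods the groups live in, exactly the kind of upstream factor a clinical-data-only model cannot see.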

The study, “Machine learning and algorithmic fairness in public and population health,” is available at https://www.nature.com/articles/s42256-021-00373-4

 

About the New York University Tandon School of Engineering

The NYU Tandon School of Engineering dates to 1854, the founding date for both the New York University School of Civil Engineering and Architecture and the Brooklyn Collegiate and Polytechnic Institute. A January 2014 merger created a comprehensive school of education and research in engineering and applied sciences as part of a global university, with close connections to engineering programs at NYU Abu Dhabi and NYU Shanghai. NYU Tandon is rooted in a vibrant tradition of entrepreneurship, intellectual curiosity, and innovative solutions to humanity’s most pressing global challenges. Research at Tandon focuses on vital intersections between communications/IT, cybersecurity, and data science/AI/robotics systems and tools and critical areas of society that they influence, including emerging media, health, sustainability, and urban living. We believe diversity is integral to excellence, and are creating a vibrant, inclusive, and equitable environment for all of our students, faculty and staff. For more information, visit engineering.nyu.edu.

 

 

###

 

 

Optimizing Remote Direct Memory Access for a faster, more robust internet

Lehigh University Computer science and engineering researcher Roberto Palmieri wins NSF CAREER award to take RDMA technology to the next level

Grant and Award Announcement

LEHIGH UNIVERSITY


IMAGE: NSF CAREER AWARD WINNER ROBERTO PALMIERI IS AN ASSISTANT PROFESSOR OF COMPUTER SCIENCE AND ENGINEERING IN LEHIGH UNIVERSITY'S P.C. ROSSIN COLLEGE OF ENGINEERING AND APPLIED SCIENCE.

CREDIT: LEHIGH UNIVERSITY

We are not a species that likes to wait. Especially when it comes to our online demands—we want instant responses to our queries, and immediate confirmation of our posts.

Meeting such expectations requires distributed computing systems capable of meeting demand while still preserving the integrity of the data they are providing. Distributed systems enable resource sharing in the form of hardware, software, or data, and comprise multiple machines connected through a network. The internet is the largest, best known example; others include social networks, online gaming, and e-commerce.

Such systems must perform innumerable complex interactions—fast—for potentially millions of users, without ruining the data. In other words, a travel site has to accommodate simultaneous requests for a flight, hotel, and rental car from millions of travelers, without screwing any of those requests up. If a site is at all glitchy or slow, users will go somewhere else.

Improving that speed is at the heart of Roberto Palmieri’s research. Palmieri, an assistant professor of computer science and engineering in Lehigh University’s P.C. Rossin College of Engineering and Applied Science, recently won support from the National Science Foundation’s Faculty Early Career Development (CAREER) program for his proposal to optimize the technology known as Remote Direct Memory Access (RDMA) to better serve the massive number of internet-user requests.

The prestigious NSF CAREER award is given annually to junior faculty members across the U.S. who exemplify the role of teacher-scholars through outstanding research, excellent education, and the integration of education and research. Each award provides stable support at the level of approximately $500,000 for a five-year period.

“The general idea is that we have a lot of data within a given system, and this data doesn’t fit on a single machine,” says Palmieri. “It’s distributed on multiple machines, and there are operations working on this shared data. Ten, twenty years ago, a certain level of performance was good enough, but now there are so many services available on the internet, the infrastructure has to keep up with this increased workload. We want to make the operations performed by those machines go as fast as possible.”

RDMA is a fairly recent technology that changed the way computers communicate. At a basic level, traditional communication involved one machine sending a request to another for a particular service. The second machine had to devote resources to processing and responding to the message, and that all took time. RDMA disrupted that pattern.

“So now, if a machine wants something from another machine, it will not ask for it,” he says. “It will just take it by interacting directly with that machine’s RDMA card. Which means that, instead of spending resources handling the message, the machine can focus on its specific business application. With RDMA, we’re talking about tens of nanoseconds for two machines to interact. Whereas before, we were talking about tens or hundreds of milliseconds. If you’re posting something on social media, and one interaction takes hundreds of milliseconds, and you need 10 interactions, the user is now waiting nearly a second, and starting to think, Why am I waiting so long?”
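The arithmetic in that example is easy to check (the numbers below are the article's order-of-magnitude figures, not measurements):

```python
# Back-of-envelope latency budget from the quote above.
PRE_RDMA_INTERACTION_S = 100e-3   # ~hundreds of milliseconds per interaction
RDMA_INTERACTION_S = 50e-9        # ~tens of nanoseconds per interaction
INTERACTIONS_PER_POST = 10        # interactions needed for one user action

# Pre-RDMA: 10 interactions at ~100 ms each is about a second of waiting.
pre_rdma_wait = INTERACTIONS_PER_POST * PRE_RDMA_INTERACTION_S

# With RDMA: the same 10 interactions finish in well under a microsecond.
rdma_wait = INTERACTIONS_PER_POST * RDMA_INTERACTION_S
```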

And when it comes to businesses competing for users, timing is everything.

Palmieri equates the difference between pre-RDMA days and now to using snail mail versus email. If you had to mail a letter and then wait for a response, you might not ask certain questions in that letter.

“If I have to decide whether I should put salt on my pasta, I’ll send you an email because I know that in a minute, you can answer.”

RDMA is a superfast delivery system. But it’s one that Palmieri intends to make even faster. In part, by going back to a long-held theory.

Before the arrival of RDMA, researchers had theorized that one way to speed up communication between machines would be to migrate required data from the computer that has it, to the one that wants it. That way, the next time a machine needed something, it didn’t have to ask for it. With the data stored locally, it could perform operations quicker. But at the time, says Palmieri, such migration couldn’t be done efficiently. Once RDMA was developed, retrieving data became so fast and cheap (in terms of performance cost) that migration no longer seemed necessary.

“People said, ‘I’m just going to go and get memory whenever I need it.’ What I’m saying is, ‘No, let’s go back to what we knew was optimal before, which was migrating memory to a local node,’” he says. “Let us redesign that software component called the directory that allows memory to move and tracks where it is in the system. If we can move this memory efficiently, then basically every machine can interact with memory that is local. Subsequent requests for operations will then not even trigger a remote operation; it will all be done locally, which is shown to have the best performance. It’s at least one order of magnitude faster than even an RDMA operation.”
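A minimal sketch of that idea (a toy Python model, not Palmieri's actual design: the class name and API are invented, and a Python dict stands in for RDMA-accessible memory). A directory tracks which node currently owns each datum and migrates it to the requester on access, so repeated accesses become local:

```python
class ToyDirectory:
    """Toy model of the 'directory' idea described above.

    After the first (remote) access, the datum lives on the requesting
    node, so subsequent accesses are local. Real systems migrate memory
    regions over RDMA; here plain dicts stand in for node memory.
    """

    def __init__(self):
        self.owner = {}   # key -> node currently holding the datum
        self.store = {}   # (node, key) -> value

    def write(self, node, key, value):
        self.owner[key] = node
        self.store[(node, key)] = value

    def read(self, node, key):
        holder = self.owner[key]
        was_local = (holder == node)
        value = self.store.pop((holder, key))
        # Migrate the datum to the requesting node.
        self.owner[key] = node
        self.store[(node, key)] = value
        return value, was_local
```

After the first read pulls a datum across, every later read from the same node hits local memory, the case the quote describes as at least an order of magnitude faster than even an RDMA operation.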

To do this, Palmieri and his team plan to redesign algorithms and protocols to fully exploit the capabilities of RDMA. Everything they produce will eventually become open-source, so others can build on it. A portion of Palmieri’s proposal is also directed at sparking more interest among students in computer systems.

“Getting students excited about something that’s intangible is hard,” he says. “To work on systems, students need to learn a lot of advanced concepts. How to work with the hardware and the operating system. You have to understand algorithms and protocols. So even though the ability to build infrastructure and software systems is in high demand, I attribute the lack of enthusiasm for the field to these barriers. You need so much knowledge before you can even start to get excited.”

To stoke interest, he’ll produce software that will allow students to see the potential in accessing hundreds of machines with just a few lines of code and truly appreciate nanosecond speed.

For Palmieri and his team, the potential to realize an outcome that was once theoretical is beyond exciting. And getting to this point, he says, would have been impossible without the ingenuity of his own students.

“This is a collective work. And it’s very unexplored. We had these continuous brainstorming sessions where we were trying to figure out something that no one else has ever done, and they were crazy good,” he says. “I get to do the talks and the interviews, but the students are at the core of the actual work.”

###

About Roberto Palmieri

Roberto Palmieri is an assistant professor in the Department of Computer Science and Engineering at Lehigh University, where he co-leads the Scalable Systems Software (SSS) Research Group. He joined the faculty of the P.C. Rossin College of Engineering and Applied Science in 2017, and was previously a research assistant professor at Virginia Tech. He earned his PhD, MS, and BS in computer engineering from Sapienza University of Rome.

Palmieri’s research interests focus on different aspects of concurrency, synchronization, data structures, distributed computing, heterogeneous systems, key-value stores, and distributed systems, spanning from theory to practice. He is passionate about designing and implementing synchronization protocols optimized for a wide range of deployments, from multicore architectures to cluster-scale and geo-distributed infrastructures.

Related Links:
Rossin College Faculty Profile: Roberto Palmieri
NSF Faculty Early Career Development (CAREER) program
NSF Award Abstract (2045976): "Distributed Protocols and Primitives Optimized for RDMA Read/Write Operations in Data Management"
Lehigh University: Roberto Palmieri
Lehigh University: Scalable Systems Software (SSS) Research Group
Lehigh University: Institute for Data, Intelligent Systems, and Computation

A bionic oil

Phytoplankton Dicrateria rotunda synthesizes hydrocarbons equivalent to petroleum

Peer-Reviewed Publication

TOYOHASHI UNIVERSITY OF TECHNOLOGY (TUT)

IMAGE: SAMPLING SITE OF THE ARCTIC STRAIN ARC1 OF D. ROTUNDA IN THE ARCTIC OCEAN (THE CHUKCHI SEA). THE RED CIRCLE INDICATES 70° 0.06ʹ N, 168° 44.96ʹ W.

CREDIT: COPYRIGHT (C) TOYOHASHI UNIVERSITY OF TECHNOLOGY. ALL RIGHTS RESERVED.

Key points

  • We discovered that Dicrateria rotunda (D. rotunda), a species of Haptophyte, can synthesize hydrocarbons equivalent to petroleum (saturated hydrocarbons with a carbon number ranging from 10 to 38). No organism capable of petroleum production has been previously reported.

  • All of the eleven Dicrateria strains examined in this study, including ARC1 obtained during a science cruise in the Arctic Ocean, were found to be capable of synthesizing a series of saturated hydrocarbons, indicating that this capability is common to the entire Dicrateria genus.

  • The saturated hydrocarbon content of the D. rotunda ARC1 increased under dark and nitrogen-deficient conditions. Understanding the physiological function and synthesis pathways of these saturated hydrocarbons may contribute to the development of biofuels in the future.

Overview

Director-General Naomi Harada and colleagues from the Research Institute for Global Change at the Japan Agency for Marine-Earth Science and Technology, in collaboration with Assistant Professor Yuu Hirose from Toyohashi University of Technology and Specially Appointed Professor Kazuyoshi Murata from the National Institute for Physiological Sciences, discovered that the phytoplankton Dicrateria rotunda (D. rotunda) can synthesize a series of saturated hydrocarbons with a carbon number ranging from 10 to 38*.

A phytoplankton community was collected from seawater of the Chukchi Sea during a science cruise of the research vessel “Mirai” in the Arctic Ocean in 2013, from which the team isolated and cultured the Arctic strain of D. rotunda, ARC1. ARC1 contained a series of saturated hydrocarbons with a carbon number ranging from 10 to 38, which are categorized as petrol (carbon number 10 to 15), diesel oil (carbon number 16 to 20), and fuel oil (carbon number 21 or higher). Moreover, the team examined ten additional strains of Dicrateria stored in culture collections, all of which were found to be similarly capable of hydrocarbon synthesis, indicating that this capability is common to the entire Dicrateria genus. This study is the first to report an organism with the capability to synthesize hydrocarbons equivalent to petroleum.
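For readers who want the mapping spelled out, the carbon-number ranges above translate directly into fuel categories. The helper below is a hypothetical restatement of the ranges reported in the release, not code from the paper:

```python
def fuel_category(carbon_number):
    """Classify a saturated hydrocarbon by carbon number, using the
    ranges reported for D. rotunda (carbon numbers 10 to 38)."""
    if 10 <= carbon_number <= 15:
        return "petrol"
    if 16 <= carbon_number <= 20:
        return "diesel oil"
    if 21 <= carbon_number <= 38:
        return "fuel oil"
    return "outside the reported C10-C38 range"
```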

The capability of the ARC1 strain to synthesize saturated hydrocarbons was shown to increase depending on the environmental conditions, and the findings of this study are expected to contribute to the development of biofuels in the future.

This study was supported by JSPS Grants-in-Aid for Scientific Research JP22221003 and JP15H05712.

 

Reference:

Naomi Harada, Yuu Hirose, Song Chihong, Hirofumi Kurita, Miyako Sato, Jonaotaro Onodera, Kazuyoshi Murata, Fumihiro Itoh (2021) “A novel characteristic of a phytoplankton as a potential source of straight-chain alkanes” Sci. Rep. 11, 14190, 10.1038/s41598-021-93204-w.

 

Supplemental information:

* Saturated hydrocarbons: Organic compounds composed of carbon and hydrogen. The smallest saturated hydrocarbon is methane (CH4), which has a carbon number of one.


 

Solar-powered microbes to feed the world?


International research team shows that protein from microbes uses a fraction of the resources of conventional farming

Peer-Reviewed Publication

UNIVERSITY OF GÖTTINGEN

IMAGE: CURRENTLY, AGRICULTURE USES ENERGY FROM THE SUN TO CONVERT CARBON DIOXIDE, WATER, AND NITROGEN INTO VALUABLE PROTEINS VIA PLANTS AND LIVESTOCK. A NEW FOOD SYSTEM THAT INCORPORATES SOLAR-POWERED MICROBIAL PRODUCTION COULD SUBSTANTIALLY INCREASE PROTEIN SUPPLY WHILE USING LESS LAND AND PREVENTING FERTILISERS FROM POLLUTING THE ENVIRONMENT.

CREDIT: PAUL VAN LEER, DORIAN LEGER, SILVIO MATASSA, ELAD NOOR, ALON SHEPON, RON MILO, ARREN BAR-EVEN

Microbes have played a key role in our food and drinks – from cheese to beer – for millennia, but their impact on our nutrition may soon become even more important. The world is facing growing food challenges as the human population continues to increase alongside its demand for resource-intensive animal products. If these needs are to be met solely by conventional agriculture, the environmental cost will be huge. An international research team led by a Göttingen University alumnus has now shown that using solar panels to produce microbial protein – which is rich not just in protein but also in other nutrients – is more sustainable, efficient and environmentally friendly than growing conventional crops. This method uses solar energy, land, nutrients, and carbon dioxide from the air. Their research was published in PNAS.

Using computer simulations drawing directly from laboratory results, the researchers modelled large-scale microbial food production facilities, which use solar energy, air, water, and nutrients to grow microbes. The protein-rich biomass is harvested and processed, and the resulting powder can be used as feed for animals, or as food for people. The study carried out an analysis of the energy requirements for each step, from the very start to the end product, taking into account: electricity generation (from solar panels), electrochemical production of energy-rich substrate for the microbes, microbe cultivation, harvesting, and processing the protein-rich biomass. Several types of microbes and growth strategies were compared in order to identify the most efficient.

The study found that for each kilo of protein produced, solar-powered microbes require only 10% of the land area compared to even the most efficient plant crop – soybean. The study calculated that even in northern climates with less sunshine, the yields of solar-powered microbial foods could far outproduce staple crops, while minimizing water and fertilizer use. Importantly, this production could also be located in regions not suitable for agriculture, such as deserts.

In previous research, the protein from these types of microbes has shown beneficial effects when fed to livestock and is already produced at large scale in the EU. “We expect that microbial protein will also be beneficial as a supplement to our diets, since it provides a high-quality protein source composed of all essential amino acids, as well as vitamins and minerals,” explains first author Dorian Leger, who carried out the work in the MPI of Molecular Plant Physiology while studying at the University of Göttingen, along with colleagues from Italy and Israel. “This technology has the potential to support food production while preventing damage to the environment. Current farming methods contribute to polluted ecosystems and depleted water reserves worldwide.”

At the moment, 30-40% of the Earth’s land is used for farming, yet one in ten people are undernourished. Leger says, “Integrating the cultivation of nutrient-rich microbes with renewable energy systems, such as solar panels, has the potential to produce more food with less resources. This could free up vast amounts of agricultural land, and, in addition, prevent the further destruction of natural ecosystems thereby making a valuable contribution to conservation and sustainability whilst promoting food availability globally.”

Contact:

Dorian Leger

Email: dorian.dleger@gmail.com

Tel: +49 1520 571 9379

dorianleger.com

 

New research finds gay and bisexual men still earn less

Academics discover discrepancies in earnings despite anti-discrimination legislation

Peer-Reviewed Publication

ANGLIA RUSKIN UNIVERSITY

A new study has found that gay and bisexual men are still earning less than heterosexual men despite legislation aimed at reducing discrimination in the workplace.

Researchers from Anglia Ruskin University (ARU) analysed 24 studies published between 2012 and 2020 covering countries in Europe, North America and Australia.

Their analysis, published in the Journal of Population Economics, found that gay men earned on average 6.8% less than heterosexual men across all countries covered in the study.

Bisexual men earned 10.3% less than heterosexual men on average, while bisexual women earned 5.1% less than heterosexual women. Lesbian women earned 7.1% more than heterosexual women.

In the UK, gay and bisexual men together earned 4.7% less than heterosexual men, and in the USA they earned 10.9% less.

In the UK, workplace prejudice against individuals due to their sexual orientation or sex is prohibited under the Equality Act 2010. However, despite this legislation, the research suggests that gay men and bisexual men and women still earn less than their heterosexual counterparts.

Professor Nick Drydakis, author of the study and Director of the Centre for Pluralist Economics at ARU, said: “The persistence of earnings penalties for gay men and bisexual men and women in the face of anti-discrimination policies represents a cause for concern. 

“Legislation and workplace guidelines should guarantee that people receive the same pay and not experience any form of workplace bias simply because of their sexual orientation or gender identity status.

“Inclusive policies should embrace diversity by encouraging under-represented groups to apply for jobs or promotions and providing support to LGBTIQ+ employees to raise concerns and receive fair treatment.

“Standing against discrimination and celebrating and supporting LGBTIQ+ diversity should form a part of HR policies.”

Ultramarathon Mountain Race weather was predicted; impact was not


Researchers call for forecasts to better communicate weather effects on human health and activities.

Peer-Reviewed Publication

INSTITUTE OF ATMOSPHERIC PHYSICS, CHINESE ACADEMY OF SCIENCES

IMAGE: A CONCEPTUAL MODEL OF THE WARNING CHAIN WAS PRODUCED BY THE WMO/WWRP'S HIGH-IMPACT WEATHER (HIWEATHER) PROJECT IN PURSUIT OF SUCCESSFUL WEATHER-HAZARD WARNINGS. FROM OBSERVATION TO DECISION MAKING, THERE ARE SEVERAL “MOUNTAINS AND VALLEYS” TO OVERCOME. COLLABORATIONS AMONG SCIENTISTS IN DIFFERENT AREAS, GOVERNMENTS, SOCIAL MEDIA, AND CITIZENS ARE NEEDED FOR SUCCESSFUL HAZARD PREVENTION.

CREDIT: MODEL DESIGN BY WMO/WWRP HIWEATHER PROJECT

On May 22, about 170 ultramarathon racers were climbing an eight-kilometer stretch of sand and gravel to a mountain top in Yellow River Stone Forest Park in northwest China. Suddenly, it seemed, the temperature dropped, the wind picked up, and the skies opened. Dressed mostly in shorts and t-shirts, with only foil blankets for protection, 29 runners were trapped, exposed to the elements. Twenty-one runners died and eight suffered injuries, all from hypothermia. The tragedy sparked outrage in China: the weather was forecast, so why didn’t the race organizers delay the race or better prepare the participants?

According to a multi-institute research team, the challenge is not necessarily predicting the weather, but rather better predicting the weather’s impact. They published an analysis of the May 22 weather events and a call to action for more comprehensive warning systems on July 29 in Advances in Atmospheric Sciences.

“A good quality weather forecast does not always lead to a successful hazard warning if gaps exist in the warning chain,” said first author Qinghong Zhang, professor in the Department of Atmospheric and Oceanic Sciences, Peking University, and director of the HIWeather International Coordination Office of the WWRP (World Weather Research Program) under the WMO (World Meteorological Organization). “The goal of this study is to identify the possible gaps where efforts can be put forth to improve future high-impact weather warnings.”

In the reporting of the event, according to Zhang, descriptions differed: the media called the weather an “extreme” event, while the government described it as a “high-impact” event. In this paper, Zhang and her team analyzed six years of hourly observations for the month of May in the area where the weather most affected the runners. They found that the weather that day was indeed extreme, as the cold temperature was in the bottom 5% and the high wind speed was in the top 5% of the 4,464 records examined. Set against the human activities of the day, the weather was also high impact. Without the race occurring that day, Zhang said, the weather likely would have had little effect on human life and property.

“Our study indicated that our current medium-range forecasting models are capable of making reasonable forecasts for this event several days before its occurrence,” Zhang said, noting that the models predicted a nearly perfect 30-hour forecast of the weather leading to and during the event. “The tragedy occurred partly because detailed weather forecasts for the race site tailored to the specific needs of the users were not available. What the race organizers and runners really needed was the hypothermia risk forecast.”

The researchers point to the efforts of the WMO WWRP’s 10-year international High-Impact Weather Project (HIWeather), which aims to develop a chain of multidisciplinary inputs to better understand not just weather forecasts but their potential impacts, and how best to mitigate them.

“More than only forecasting the weather, forecasting the hazards associated with the weather and their human and societal impacts are necessary in order to reduce the loss of life and property,” Zhang said. “Collaborations among scientists in different areas, governments, social media and citizens are needed for successful hazard prevention.”

Other contributors include Chan-Pang Ng, Department of Atmospheric and Oceanic Sciences, Peking University; Kan Dai, Jun Xu and Jian Tang, China Meteorological Administration; Juanzhen Sun, National Center for Atmospheric Research, United States; and Mu Mu, Department of Atmospheric and Oceanic Sciences and Institute of Atmospheric Sciences, Fudan University.

Gaining insight into the Universe’s most powerful objects and events


A timely new collection reminds us that even in times of great hardship, our understanding of the Universe’s most explosive, spectacular and mysterious events and objects continues to grow

Peer-Reviewed Publication

SPRINGER

Supernovae and neutron star mergers are among the Universe’s most powerful events, and neutron stars among its most mysterious objects, left over after massive stars exhaust their nuclear fuel. A new special issue of EPJ Special Topics entitled ‘Nuclear astrophysics in our time: supernovae, neutron stars and binary neutron star mergers’, edited by Debades Bandyopadhyay, Saha Institute of Nuclear Physics, Kolkata, India, brings together several papers that document our understanding of these astrophysical events and compact stars.

The collection comes at an important time for astrophysics and for our understanding of the core collapses that give rise to the supernovae that create neutron stars and black holes. The first direct detection of gravitational waves in 2015 by the LIGO/Virgo collaboration was followed by the first observation of a merger between neutron stars in a binary system in 2017.

The gravitational wave signal from the shell elliptical galaxy NGC 4993, which became known as GW170817, opened up new avenues for astrophysics and provided astronomy with a massive boost, particularly when it comes to the study of the nuclear processes that occur towards the end of a star’s lifetime.

The publication of this collection is made even more timely by the fact that it coincides with the 100-year anniversary of the introduction of the Saha thermal ionization equation. The formula, which integrated atomic physics and statistical mechanics to explain the spectral classification of stars and the processes occurring in their outer layers, was developed by a young scientist named Meghnad Saha from the University of Calcutta.
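For reference, the Saha equation relates the populations of adjacent ionization states of an element in thermal equilibrium; this is its standard textbook form, not a formula reproduced from the collection itself:

```latex
\frac{n_{i+1}\, n_e}{n_i}
  = \frac{2\, g_{i+1}}{g_i}
    \left( \frac{2\pi m_e k_B T}{h^2} \right)^{3/2}
    e^{-\chi_i / (k_B T)}
```

Here \(n_i\) and \(n_{i+1}\) are the number densities of ions in successive ionization states, \(n_e\) is the free-electron density, \(g_i\) and \(g_{i+1}\) are statistical weights, \(\chi_i\) is the ionization energy, and \(T\) is the temperature. The strong temperature dependence is what lets the equation link a star’s spectral class to the physical conditions in its outer layers.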

Saha wrote the series of papers introducing the equation while the Spanish flu pandemic ravaged the globe, just as the COVID-19 pandemic now causes similar hardship. As such, the collection offers a timely and inspiring reminder that even at the most challenging of times, science presses on and our understanding and knowledge of the Universe continues to be refined.

The papers included in the collection also highlight the fact that while understanding these objects and events is important in itself, broader progress in astronomy and astrophysics hinges on a better grasp of these powerful phenomena.

 

References

D. Bandyopadhyay “Nuclear astrophysics in our time: supernovae, neutron stars and binary neutron star mergers” Eur. Phys. J. Spec. Top. 230, 491–493 (2021). https://doi.org/10.1140/epjs/s11734-021-00009-z