Thursday, April 22, 2021


'Doomsday' climate tipping points have wiggle room: study

AFP EARTH DAY 2021

Global warming thresholds that could tip massive ice sheets into irreversible melting or see the Amazon rainforest shrivel into savannah have "grace periods", giving humanity more time to draw down planet-warming carbon emissions, researchers have calculated.

More than a dozen tipping points triggered mainly by rising temperatures could unleash catastrophic changes in Earth's climate system.

As the Paris Agreement goal of a 1.5 degree Celsius cap above pre-industrial levels slips out of reach, this is potentially very good news -- although no reason to relax -- scientists said.

Ice sheets atop Greenland and West Antarctica hold enough frozen water to lift oceans a dozen metres (40 feet), drowning cities and redrawing the planet's coastlines.

Greenhouse gases escaping from Siberian permafrost could overwhelm already strained efforts to curb man-made carbon pollution.

Monsoon rains in South Asia, the polar ice cap, coral reef ecosystems, the jet stream, the Amazon basin -- all are vulnerable to point-of-no-return transitions that would radically alter the world as we know it.

Up to now, scientists have focused mostly on temperature thresholds and timing: how much warming will it take to trigger each tipping point?

- 'Period of grace' -

The threshold for the two ice sheets is probably below 2C, and may have already been passed, experts say. To melt the permafrost, however, it will likely take at least another degree or two of warming.

But new research led by Paul Ritchie and Peter Cox from the School of Engineering, Mathematics and Physical Sciences at Exeter University asks a different question.

Once a tipping point tripwire has been triggered, can we row back by lowering temperatures, whether by sucking CO2 out of the air or injecting a sunscreen into the stratosphere?

"Our analysis shows that it is possible to overshoot tipping point thresholds without leading to an abrupt and permanent climate change -- as long as the overshoot is for a short period of time," Cox, senior author of a study published Wednesday in Nature, told AFP.

"There is a 'period of grace' once the threshold is crossed when it is still possible to avoid tipping."

In the study, Cox and colleagues looked at four parts of the climate system: ice sheets, monsoons, the Amazon basin and the ocean conveyor belt of currents that drives cold and warm water across the globe, known as the Atlantic Meridional Overturning Circulation (AMOC).

Ice sheets and ocean currents, they found through modelling, have slow-onset tipping points which unfold on a timescale of centuries.

- Dire climate shifts -


This suggests that even if temperatures rise above the tipping point threshold and set in motion a profound change, there is still time to pull back from the cliff's edge by lowering the global thermometer.

"The slow timescale of some of the most worrying tipping points gives us a better chance of reversing global warming before we tip into a less favourable world," Cox said.

But the same timing does not apply to Amazon forest dieback and disruption of the monsoon, which could flip into a new state within decades, the researchers found.

For the Amazon, that shift is also hastened by massive deforestation.

"The study shows that if temperatures overshoot the tipping points for too long, we may set in place irreversible changes in the climate system, such as dying of the Amazon rainforest," commented William Collins, a professor of meteorology at the University of Reading.

Other scientists not involved in the research agreed with the findings but cautioned against complacency.

"This may offer a very tiny bit of wiggle room in our expectations of when such dire climate shifts might become permanent and extremely damaging as Earth continues to warm," Grant Allen, a professor of atmospheric physics at the University of Manchester told the London-based Science Media Centre.

"But a brief delay to some of global warming's more catastrophic consequences does not mean that it is any less dangerous now."

mh/klm/bp

U.S. to double public climate finance to developing countries by 2024

By Andrea Shalal 
EARTH DAY 2021

© Reuters/KEVIN LAMARQUE FILE PHOTO: U.S. President Joe Biden speaks about administration plans to confront climate change at the White House ceremony in Washington

WASHINGTON (Reuters) - The United States on Thursday said it will boost public climate finance to help poor countries reduce greenhouse gas emissions and adapt to a changing climate, doubling funding by 2024 from average levels hit during the Obama administration.

The White House said it was embracing "ambitious but attainable goals" for international aid to developing countries given the urgency of the climate crisis and to compensate for a sharp drop in U.S. funding during the Trump administration.

As part of the goal, the White House said that by 2024 it would triple financing of climate adaptation, which focuses on adjustments to current or expected climate change. It said it will work with Congress to enact needed legislation.

Leonardo Martinez-Diaz, a top aide to Biden's climate envoy John Kerry, said total U.S. international public climate finance averaged around $2.8 billion a year during the baseline period from fiscal year 2013 to 2017, with around $500 million going toward adaptation. That was the most recent period where U.S. climate finance was at an all-time high, he said on Twitter.
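Taken at face value, those baselines imply rough dollar targets for 2024. A back-of-the-envelope sketch follows; the White House published multipliers rather than exact dollar goals, so these figures are inferred, not official:

```python
# Implied 2024 targets from the baselines cited above. Illustrative
# arithmetic only; the fact sheet gave multipliers, not dollar figures.
baseline_total = 2.8e9       # avg annual U.S. public climate finance, FY2013-17
baseline_adaptation = 0.5e9  # portion of that going to adaptation

print(f"doubled total finance: ${2 * baseline_total / 1e9:.1f} billion per year")
print(f"tripled adaptation:    ${3 * baseline_adaptation / 1e9:.1f} billion per year")
```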

In a fact sheet, the White House said U.S. agencies, working with development partners, would prioritize climate in their investments, expand technical assistance and increase funding for adaptation and resilience.

It said the U.S. Agency for International Development (USAID) would release a new Climate Change Strategy in November 2021, at the 26th Conference of the Parties to the United Nations Framework Convention on Climate Change (COP26).

The U.S. International Development Finance Corporation (DFC) would change its development strategy to include climate for the first time, and prioritize climate mitigation and adaptation.

The Millennium Challenge Corporation's new climate strategy would focus on climate-smart development and sustainable infrastructure, aiming to put more than 50% of its funding into climate-related investments over the next five years, it said.

The U.S. Treasury would direct U.S. executive directors in multilateral development banks (MDBs) to ensure that those institutions, including the World Bank, set and apply ambitious climate finance targets and policies.

The plan also calls for ending international investments in carbon-intensive fossil fuel-based energy projects, and steering capital toward climate-aligned investments.

Treasury, together with the Organisation for Economic Co-operation and Development, will spearhead efforts to modify guidelines for official export financing to reorient financing away from carbon-intensive activities, it said.

(Reporting by Andrea Shalal; Editing by David Gregorio)

Three years of Arctic grayling surveys completed


Three years of snorkeling surveys to study Arctic grayling populations in the Ingenika River watershed have recently finished.

The Ingenika population has been isolated in the watershed for more than 50 years due to the creation of the Williston Reservoir. Long-term population monitoring is focused on three rivers: the Ingenika, Finlay, and Mesilinka, all within the Tsay Keh Dene traditional territory.

“We are losing our pristine wilderness at an alarming rate in B.C. and this is one of the last strongholds of the southern extent of the Arctic grayling range,” the narrator says in a video by Chu Cho Environmental, the Tsay Keh Dene’s research and consulting firm that conducted the surveys.

Surveys began in 2018 with biologists John Hagen, Mike Stamford, and Cody Haggard.

Swimming downstream in lane formation, the divers were able to take more accurate counts of fish, recording about four to six fish per kilometre in each of the three years of the study. Counts were recorded on waterproof data sheets strapped to the divers' wrists.

Chu Cho CEO Mike Tilson says the highest density of Arctic grayling was found in a single five-kilometre channel on the river.

"There's no development, no forestry, no roads. Maybe a little bit of old traditional trails, sacred Dene trails. But it is very remote country," Tilson said. "There's a significant section of the stream that's prime habitat for conservation. It's not degraded habitat, its pristine habitat."

The surveys were funded by the Fish and Wildlife Compensation Program, and provide key information about abundance, trends, potential threats, and limiting factors for the fish for the program's Arctic Grayling Monitoring Framework.

A report has been prepared and is expected for public release in the next few months. The findings will be used to inform conservation efforts and policy, but Tilson stressed that more surveys are needed to flesh out the data, noting surveys every three out of 10 years would be ideal.

"If we value arctic graylings, then this becomes a key section in which we can identify that its protection is essential for the survival of the species," said Tilson, noting the remoteness of the area protects the fish population.

tsummer@ahnfsj.ca

Tom Summer, Local Journalism Initiative, Alaska Highway News
4/21/2021

EARTH DAY
Respect for Indigenous knowledge must lead nature conservation efforts in Canada


Andrea Reid, Assistant Professor, Institute for the Oceans and Fisheries, University of British Columbia, 
Joseph Bennett, Assistant Professor, Institute of Environmental Science and Department of Biology, Carleton University, 
Rachel Buxton, Research scientist, Department of Biology, Carleton University, 
Paul A. Smith, Adjunct Research Scientist, Conservation Biology and Arctic Ecology, Carleton University 

The past year has taught us important lessons about the consequences of the harm humans are inflicting on the natural world.

© (Shutterstock) Clayoquot Sound, part of the Tla-o-qui-aht territory, has been the site of numerous protests against logging the forest. Meares Island was declared a Tribal Park in 1984.

We’re confronting a global biodiversity crisis, losing species and natural places at an unprecedented rate. We’ve also experienced the repercussions of nature’s decline through COVID-19, whose jump from animals to humans may be linked to habitat destruction. And yet the international community failed to achieve a single Aichi Biodiversity Target to curb environmental destruction by 2020.

To combat the biodiversity crisis, we need to fundamentally shift our economy and society and make nature conservation the norm. We urgently need a better understanding of how to motivate people, overcome political barriers, influence economies and alter policies to benefit nature.

To guide this change, we need to prioritize Indigenous leadership, respect Indigenous knowledge holders as experts and acknowledge Indigenous communities as co-creators of solutions.

Canada’s role in conservation

Over a quarter of the world’s intact ecosystems are found within Canada, making the country uniquely positioned to make significant contributions to meeting global biodiversity targets.

Canada's vast forests and wetlands sequester carbon, so its environmental policies can have a ripple effect across the entire world. For example, our boreal forests and wetlands store enormous amounts of carbon, and policies that remove protection could have large climate implications.

Read more: How to meet the ambitious target of conserving 30 per cent of Earth by 2030

Canada has a history of taking conservation action — it was the first industrialized nation to commit to meeting conservation targets — and the vast majority of Canadians consider protecting nature a priority. And yet challenges remain.

Many species are in decline, including iconic Canadian species such as caribou and loons. Canada’s economic reliance on natural resource extraction adds difficult dimensions to many decisions.

And Canada’s early efforts to protect natural areas dispossessed Indigenous peoples of their land, including the founding of famous parks like Jasper and Banff.

There remain significant barriers to the meaningful and rightful participation of Indigenous peoples in conservation.

The need for action

Last year in Ottawa, we brought together many of the country’s leading experts in nature conservation — from academia, government, NGOs and Indigenous organizations. We set out to assess the information needed to inform policy and practice for biodiversity conservation in Canada. The consensus was that we need to move from collecting information to mobilizing action.

Our biggest information needs are not facts and figures about plants and animals. We already have much of the information we need for nature conservation in Canada.

Read more: Acid rain and mercury legacy decreases the number of loon chicks in Ontario lakes

Instead, we need to make conserving nature part of everything we do, and we need to do this quickly. A key component of moving from information to action will be working across scientific disciplines — conservation biologists teaming up with data scientists, for example. We live in the information age, where petabytes of information to guide nature conservation are generated by social media and community science.

Also, scientists must team up with decision-makers to ensure that the information they generate is useful for policy, and that policies are based on the best available evidence. Social scientists can play an important role in understanding how to motivate societal change towards widespread nature conservation.

The role of Indigenous peoples

Since time immemorial, Indigenous peoples have lived sustainably in the place now called Canada. Indigenous peoples have unique and diverse ways of knowing and understanding the natural world, and humans’ place within it.

Indigenous knowledge systems are shaped by cultural heritage and law, carried by languages and stories, and are tightly linked with a relationship to the land. Colonization attempted to marginalize and extinguish Indigenous ways of knowing.

Yet, Indigenous peoples remain outstanding caretakers of the natural environment. They are typically on the front lines of land defence issues, and lands managed by Indigenous communities tend to be healthier than other areas. In addition, Indigenous rights over land and resources are enshrined in international resolutions.
© THE CANADIAN PRESS/Jason Franson Supporters of the Wet'suwet'en hereditary chiefs who oppose the Coastal GasLink pipeline set up a support station near Houston, B.C., in January 2020.

Indigenous knowledge systems must play a central role for Canada’s nature conservation efforts to be successful. A pathway to working together and embracing multiple worldviews includes the Mi’kmaw principle of Two-Eyed Seeing:

“seeing from one eye with the strengths of Indigenous knowledges and ways of knowing, and from the other with the strengths of western knowledges and ways of knowing, and learning to use both eyes together, for the benefit of all.”

Given the ongoing failure to meet conservation targets and the continued destruction of nature at rates unseen in human history, new approaches are needed. Canada can help lead the way towards nature conservation by focusing on generating societal change and creating more effective action, returning power to Indigenous hands and amplifying Indigenous voices.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Joseph Bennett receives funding from The Natural Sciences and Engineering Research Council of Canada (NSERC), Environment and Climate Change Canada, and the Nature Conservancy of Canada.

Paul A. Smith receives funding from Environment and Climate Change Canada, the Natural Sciences and Engineering Research Council of Canada, the Nunavut Wildlife Management Board, and Natural Resources Canada's Polar Continental Shelf Program.

Andrea Reid and Rachel Buxton do not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and have disclosed no relevant affiliations beyond their academic appointment.

ALBERTA
Hostile takeover target Inter Pipeline reports 60% of Heartland plant is contracted



CALGARY — Inter Pipeline Ltd. says it has signed take-or-pay contracts covering about 60 per cent of production from the Heartland Petrochemical Complex it's building northeast of Edmonton as it continues to fight off a hostile takeover bid by Brookfield Infrastructure Partners LP.

The Calgary-based company said Thursday the deals with a total of seven parties have an average duration of nine years and it is confident it will achieve its goal of 70 per cent contracted production before the plant, designed to convert propane into polypropylene plastic pellets, starts up in early 2022.

“Our announcement today is very positive for shareholders. It provides clarity around the success we've had in creating what we call more like an infrastructure-based contracting model for the plant,” Inter CEO Christian Bayle said in an interview.

The contracts are with a mixture of North American polypropylene consumers as well as Canadian and multinational producers who will pay to have their propane transformed into polypropylene at the plant and then share in the profit when that product goes to market, he said.

When Brookfield made its takeover offer in February of $16.50 per share in cash or 0.206 of a Brookfield Infrastructure Corp. class-A exchangeable share (a deal that values Inter at $7.1 billion), it said it would consider increasing the bid if Inter could "substantiate" growth and commercialization plans for the $4-billion Heartland project.

Inter said Thursday it expects Heartland to generate annual adjusted earnings before interest, taxes, depreciation and amortization of between $400 million and $450 million in its first full year of operation, based on 70 per cent stable contracted sales and 30 per cent merchant sales.

In estimating the merchant sales, Inter said it is assuming a US$1,200 per tonne difference between North American posted polypropylene and Edmonton propane prices, a "conservative" assumption based on the current spread of US$2,300 per tonne and the seven-year average of US$1,400 per tonne.
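To get a feel for what the spread assumption means in dollar terms, here is a rough sensitivity sketch. The article does not state Heartland's throughput, so the annual output below is a hypothetical figure chosen purely for illustration:

```python
# Rough sensitivity of Heartland's merchant-sales margin to the
# polypropylene-propane spread, using the spreads quoted above.
# The plant's throughput is NOT given in the article; 500,000
# tonnes/year is a hypothetical figure for illustration only.
MERCHANT_SHARE = 0.30      # 30 per cent merchant sales, per the company
TONNES_PER_YEAR = 500_000  # hypothetical annual polypropylene output

for label, spread in [("assumed", 1200), ("7-yr average", 1400), ("current", 2300)]:
    margin_musd = MERCHANT_SHARE * TONNES_PER_YEAR * spread / 1e6
    print(f"{label:>12} spread US${spread}/t -> ~US${margin_musd:,.0f}M gross margin")
```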

Brookfield did not immediately respond to a request for comment on Thursday.

Bayle laughed when asked if Brookfield had called him after the announcement, then declined to answer due to confidentiality rules associated with the strategic alternatives review launched by the Inter board after the Brookfield bid.

"How that translates into a reaction from Brookfield, that's really a question for them," he said.

"When it comes to our strategic review process, clearly it has to be a net positive."

He said the company's long search for a Heartland partner and the review are expected to "all funnel together" before the hostile bid expires on June 7.

Heartland was given a financial boost two weeks ago when the United Conservative Party government in Alberta committed $408 million in cash grants over three years once the complex is operational to replace $200 million in royalty credits granted by the previous NDP government.

In a report, analysts with Stifel FirstEnergy said achieving 60 per cent contracted sales is below target but "is likely above market expectations." It added Inter's adjusted EBITDA forecast is less than expected given the Alberta cash injection announcement.

"While we view the update as positive given the previous uncertainty around the project, we expect the name will continue to trade on news related to the hostile takeover from Brookfield," said ATB Capital Markets in a report.

It added that it expects the contract news could generate more announcements in relation to Inter's review and partner search.

This report by The Canadian Press was first published April 22, 2021.

Companies in this story: (TSX:IPL, TSX:BIP.UN, TSX:BIPC)

Dan Healing, The Canadian Press

Researchers show how 'theory of mind' influences advertising skepticism


New study parses out when customers may feel undue persuasion and reduce their willingness to purchase a product


UNIVERSITY OF OREGON

Research News

EUGENE, Ore. -- April 22, 2021 -- Product marketers should be clear in their messaging to avoid customer skepticism that makes them feel duped, according to University of Oregon research.

At issue in a new study, published in the Journal of Business Research, was a social-cognitive construct called theory of mind, which considers how well people assess the mental states and apparent goals of others.

Developmental psychologists link it to an ability to show empathy. In business, the study, led by former UO doctoral student Elizabeth Minton, showed it also can influence a person's recognition of being persuaded. And that affects a person's evaluation and willingness to buy a product, she found.

"There has been some research on adult theory of mind, particularly in understanding sales communications," said co-author T. Bettina Cornwell, head of the Department of Marketing and a Philip H. Knight Chair at the UO. "However, there hasn't been a lot of attention to variations of how it plays out."

In young children, Cornwell said, theory of mind is easily seen. For them, she said, their parents' minds and aspirations are their minds, too. If a child wants a doll for a birthday, then mom does, too. Later, a child separates that thinking, realizing that mom may instead prefer perfume or dinner out.

In the project, Minton, now an associate professor of marketing at the University of Wyoming, designed four experiments in collaboration with Cornwell and Hong Yuan, the Richard P. Booth Associate Professor and Research Scholar of marketing and director of the UO's Business Research Institute.

"We wanted to know when the recognition of persuasion becomes particularly important," Minton said. "At what point will a person be misled?"

The first experiment involved a pool of 61 online participants who considered a visual with a character describing "a soap that smells good and is gentle on your hands." A second condition had the additional text, "You HAVE to buy it." Results showed that no matter the text in the advertisement, higher theory of mind increased skepticism and, in turn, attitudes toward the product and purchase intentions declined.

Next, 238 subjects were recruited from the customer database of a real company that produces a stevia leaf cocoa syrup. All saw the same general text about the sweetener, but across three ads the accompanying visuals changed. One ad had no additional visuals or text, one showed an outline of a girl on a bicycle with a persuasive message, and the last added a speech bubble containing persuasive words spoken by the girl.

Again, the pattern of results showed that across the advertising types, theory of mind increased advertising skepticism and, in turn, attitude, purchase intentions and willingness to pay for the product declined.
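The release does not spell out the statistical procedure, but the pattern described is a classic mediation: theory of mind raises skepticism, which in turn lowers product attitudes and purchase intent. Below is a minimal sketch of such an analysis on simulated data; the variable names and effect sizes are invented, not the study's.

```python
# Mediation sketch: theory of mind -> skepticism -> purchase intent.
# Simulated data; names and effect sizes are invented for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 238
tom = rng.normal(size=n)                            # theory-of-mind score
skepticism = 0.6 * tom + rng.normal(size=n)         # ToM raises skepticism
intent = -0.5 * skepticism + rng.normal(size=n)     # skepticism lowers intent

a = sm.OLS(skepticism, sm.add_constant(tom)).fit()  # path a: ToM -> skepticism
b = sm.OLS(intent, sm.add_constant(
        np.column_stack([skepticism, tom]))).fit()  # path b, controlling for ToM

print(f"a  (ToM -> skepticism):    {a.params[1]:+.2f}")
print(f"b  (skepticism -> intent): {b.params[1]:+.2f}")
print(f"indirect effect a*b:       {a.params[1] * b.params[1]:+.2f}")
```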

A third experiment dealt with transparency. An online group of 200 adults saw an advertisement for a limited time offer to get seven packets of seed butter for free but with shipping costs of $10.99. About half of the participants saw the price information on the same page (high transparency) and the rest on a second page (low transparency).

Same-page high transparency produced little skepticism. The delayed low transparency presentation, however, significantly raised flags of skepticism and reduced the subjects' attitudes, purchase intentions and willingness to pay.

"This clearly showed us a sense of transparency that people appreciated," Yuan said. "It tells us that we, as advertisers and marketers, probably need to focus on transparency."

The final experiment, with 215 undergraduate students, considered possible boundaries to how theory of mind influences advertising skepticism as found in the first three studies. Participants viewed one advertisement, for either a private-use item (socks) or a public-use item (a graphic T-shirt), with varied text and visuals.

The logic is that because theory of mind is a social processing capability, the researchers said, it might be emphasized in a situation where the product is publicly seen and noticeable.

Skepticism, as in previous experiments, remained high in participants with high levels of theory of mind, but the private-versus-public nature had strong effects. Those showing higher skepticism reported higher valuations for the private products, while participants with lower levels of skepticism showed higher valuation for the public products.

"These findings continue to provide evidence for the importance of understanding theory of mind's influence on response to potential persuasion episodes and the interaction with a product's private versus public nature," the researchers suggest.

The study, Minton said, opens a public policy question about advertising being allowed to potentially persuade people who haven't developed a strong theory of mind.

"It's probably not advantageous for marketers to promote a product in a way that has low transparency and makes a consumer have to work hard to understand the offer," Cornwell said.

"One, consumers who don't have high theory of mind may be duped then frustrated while those who have high theory of mind, and thus more skepticism, will be irritated by the way the offer is presented and be less willing to buy," she said. "Neither of the outcomes is good."

###

A Sheth Foundation Dissertation Award received by Minton in 2013 supported the research.

Links:

Journal of Business Research paper: https://doi.org/10.1016/j.jbusres.2021.02.002

About T. Bettina Cornwell: https://business.uoregon.edu/faculty/t-bettina-cornwell

About Elizabeth Minton: http://www.uwyo.edu/mgtmkt/faculty-staff/faculty-pages/minton.html

About Hong Yuan: https://business.uoregon.edu/faculty/hong-yuan

Lundquist College of Business: https://business.uoregon.edu/

California's wildfire season has lengthened, and its peak is now earlier in the year

UCI environmental engineers analyzed a century's worth of the Golden State's blazes

UNIVERSITY OF CALIFORNIA - IRVINE

Research News

Irvine, Calif., April 22, 2021 -- California's wildfire problem, fueled by a concurrence of climate change and a heightened risk of human-caused ignitions in once uninhabited areas, has been getting worse with each passing year of the 21st century.

Researchers in the Department of Civil & Environmental Engineering at the University of California, Irvine have conducted a thorough analysis of California Department of Forestry and Fire Protection wildfire statistics from 2000 to 2019, comparing them with data from 1920 to 1999. They learned that the annual burn season has lengthened in the past two decades and that the yearly peak has shifted from August to July. The team's findings are the subject of a study published today in the open-access journal Scientific Reports.

The study is a focused examination of fire frequency, burned area and myriad drivers of the catastrophically destructive events. The team found that the number of hot spots - places with severe fire risk - has grown significantly in recent years, fueled by higher annual mean temperatures, greater vapor pressure deficit (lack of air moisture), drought, and an elevated chance of blazes being sparked through such human causes as power line disruptions, construction, transportation, campfires, discarded cigarettes and fireworks.

"CALFIRE data show that each new year of the 21st century has been a record breaker in terms of wildfire damage in California," said co-author Tirtha Banerjee, UCI assistant professor of civil & environmental engineering. "We also have seen that about 80 percent of the total number of the state's wildfires over the past few decades have been small, measuring less that 500 acres. But when fires get large, their deadliness greatly increases."

Banerjee said that to gain a proper understanding of the growth of fire risk in California, it's important to put large and small incidents into separate buckets. By doing this, the team learned that 1,247 out of 6,336 wildfires, about 20 percent, accounted for 97 percent of the total burned area in the 2000 to 2020 period.

"And more than nine-tenths of the casualties and property losses can be attributed to fires exceeding the 500-acre threshold," Banerjee said.

He added that over the past two decades, there has been a significant increase in "extreme" wildfires scorching more than 10,000 acres. Coinciding with that has been a rapid uptick in the frequency of small, human-caused blazes.

One of the most alarming findings of the study, according to lead author Shu Li, a Ph.D. student in Banerjee's laboratory, is the substantial spatial growth of fire risk throughout the state. From 1920 to 1999, California's only hot spot with "very high wildfire density" was Los Angeles County. In the past 20 years, that designation has expanded greatly in Southern California to include Ventura County and portions of Riverside, San Diego and San Bernardino counties.

Even in northern California, areas known by fire managers as the Nevada-Yuba-Placer Unit and the Tuolumne-Calaveras Unit have newly emerged as high-density wildfire regions.

"Before 2000, there were almost no human-caused wildfires along California's Pacific coastline, but now nearly every coastal county is experiencing increased risk, and the San Benito-Monterey Unit and the San Luis Obispo Unit have even become new hot spots," said Li.

Many of the major fires in the northern part of the state are naturally occurring, predominantly ignited by lightning. But the majority of the increase in fire probability in recent years can be blamed on an expansion of the wilderness-urban interface. As people move into previously unpopulated areas, they bring their bad fire management habits with them.

"The concurrence of human-caused climate change, which is drying out our forests and grasslands and creating longer stretches of hot weather, and a steady influx of people into remote areas is creating conditions for the perfect fire storm," said Banerjee. "But there is some good news in all of this; human-caused fire risk can be mitigated by better fire management practices by humans."

He said he hoped the study, and the near real-time analysis of fire risk in California's natural environment that it provides, can be used by government agencies and public policy officials to both prevent and combat costly blazes.

###

Funding for this research project was provided by the Henry Samueli School of Engineering at UCI and the UC Office of the President.

About the University of California, Irvine: Founded in 1965, UCI is the youngest member of the prestigious Association of American Universities and is ranked among the nation's top 10 public universities by U.S. News & World Report. The campus has produced three Nobel laureates and is known for its academic achievement, premier research, innovation and anteater mascot. Led by Chancellor Howard Gillman, UCI has more than 36,000 students and offers 224 degree programs. It's located in one of the world's safest and most economically vibrant communities and is Orange County's second-largest employer, contributing $7 billion annually to the local economy and $8 billion statewide. For more on UCI, visit http://www.uci.edu.

Media access: Radio programs/stations may, for a fee, use an on-campus ISDN line to interview UCI faculty and experts, subject to availability and university approval. For more UCI news, visit news.uci.edu. Additional resources for journalists may be found at communications.uci.edu/for-journalists.

Ground and satellite observations map building damage after Beirut explosion

SEISMOLOGICAL SOCIETY OF AMERICA



Research News

Days after the 4 August 2020 massive explosion at the port of Beirut in Lebanon, researchers were on the ground mapping the impacts of the explosion in the port and surrounding city.

The goal was to document and preserve data on structural and façade damage before rebuilding, said University of California, Los Angeles civil and environmental engineer Jonathan Stewart, who spoke about the effort at the Seismological Society of America (SSA)'s 2021 Annual Meeting.

The effort also provided an opportunity to compare NASA Jet Propulsion Laboratory satellite surveys of the blast effects with data collected from the ground surveys. Stewart and his colleagues concluded that satellite-based Damage Proxy Maps were effective at identifying severely damaged buildings and undamaged buildings, but were less effective for assessing intermediate levels of structural or façade damage.

"The main take away is that the Damage Proxy Maps can definitely distinguish severe damage from lack of damage" for both structural and façade assessments, Stewart said, "but they are not as good at finer tuning."

"If what you're interested in is a fairly detailed picture of what has happened, it's not able to replace a person who actually knows what they're doing looking at the structure, particularly from the inside," he added.

The reconnaissance of the Beirut blast was organized through the National Science Foundation-sponsored Geotechnical Extreme Events Reconnaissance Association (GEER). In addition to Stewart and his colleagues at the American University of Beirut, the team included members from the University of Illinois and the University of Calabria in Italy. The information analyzed by the GEER team can help engineers learn more about how to build safely against similarly destructive events, including earthquakes, in the future.

Their findings, detailed in a GEER report, also "make some recommendations about how you can optimize human resources when doing these inspections," Stewart said.

On that August day, a fire at the port detonated an estimated 2.75 kilotons TNT equivalent of ammonium nitrate and fuel, an event about the size of a magnitude 3.3 earthquake. Within days, engineers at the American University of Beirut "had set up a hotline where people could call in who were concerned with the stability of damaged structures," Stewart said.

Professors and students made visits to inspect and assess the stability of these structures and others, but the in-person visits were scaled back in September due to COVID-19. After that, the researchers depended on street view surveys, using GoPro 360-degree cameras mounted on cars driven around the city.

The damage was ranked using scales adapted from those used for post-earthquake events, said Stewart. For instance, structural damage was ranked on a scale that began with minor damage to non-load bearing elements up to the complete collapse of a structure. Façade damage was ranked using a scale that begins with cracked windows and extends to complete blowout of windows and doors.

The spatial patterns of damage from an explosion differ from those seen in an earthquake. Site conditions such as underlying soil matter much more when it comes to the structural impact of an earthquake, while explosion damage depends "on how much are you feeling that blast," Stewart explained. "With an explosion, the damage decreases with distance and with the number of buildings between you and the blast that can deflect its effects."

Stewart isn't an expert in explosion seismology, but he has experience in assessing structural damage after earthquakes from his work in post-earthquake zones with GEER. He reached out to a colleague at the American University of Beirut after the disaster to offer his help in collecting observations that could be useful to future researchers and engineers.

"We felt that it was important to gather perishable data that we anticipate will be useful to people who study blast effects in an urban setting, and to learn something from this disaster to improve our resilience to future such disasters," he said.

###

Stress test finds cracks in the resistance of harmful hospital bugs

JOHN INNES CENTRE

Research News

IMAGE: The opportunistic human pathogen Pseudomonas aeruginosa (credit: John Innes Centre)

Research has identified critical factors that enable dangerous bacteria to spread disease by surviving on surfaces in hospitals and kitchens.

The study into the mechanisms that enable the opportunistic human pathogen Pseudomonas aeruginosa to survive on surfaces could lead to new ways of targeting harmful bacteria.

To survive outside their host, pathogenic bacteria must withstand various environmental stresses. One protective mechanism involves the sugar molecule trehalose, which is associated with tolerance of a range of external stresses, particularly osmotic shock - sudden changes to the salt concentration surrounding cells.

Researchers at the John Innes Centre analysed how trehalose is metabolised by P. aeruginosa to define its role in protection against external stresses.

Combining analytical biochemistry and reverse genetics - using mutated bacteria lacking key functions - they show that trehalose metabolism in P. aeruginosa is connected to biosynthesis of the carbon storage molecule glycogen.

Experiments showed that disruption of either trehalose or glycogen pathways significantly reduced the ability of P. aeruginosa to survive on man-made surfaces such as kitchen or hospital counters.

The study found that while both trehalose and glycogen are important for stress tolerance in P. aeruginosa they counter distinct stresses: trehalose helps the bacteria to survive in conditions of elevated salt; glycogen contributes to survival in dry (desiccated) environments.

The findings raise the possibility of targeting the trehalose and glycogen pathways to limit pathogen survival on man-made surfaces.

"We have shown how a dangerous human pathogen Pseudomonas aeruginosa responds to environmental challenges, such as salt stress or drying out. Disrupting the production of certain stress-tolerance sugars in this bug significantly reduces its ability to survive on kitchen and hospital worksurfaces," said corresponding author of the study Dr Jacob Malone.

An unexpected finding was how the bacteria operates different pathways for different stresses, said Dr Malone: "Conventional wisdom says that trehalose was responsible for both phenotypes, but we have shown that trehalose only protects against osmo-stress and glycogen is needed to protect against desiccation. We were also surprised to see such a marked drop in surface survival when we disrupted the pathways in the bugs."

The next step for the research is to understand how trehalose and glycogen metabolic pathways are regulated in P. aeruginosa and closely related species. The group also wants to understand how glycogen accumulation allows the bacteria to survive in dry environments and provide more explanation of how and when different parts of the pathways are turned on and off.

P. aeruginosa is a significant pathogen in animals as well as humans. In humans it primarily affects immunocompromised individuals, where it is a major cause of pneumonia and hospital-acquired infections. Chronic P. aeruginosa infections occur in 80% of adult cystic fibrosis patients, where it is the primary cause of morbidity and mortality.

###

The study: Trehalose and α-glucan mediate distinct abiotic stress responses in Pseudomonas aeruginosa appears in PLOS Genetics

This research was funded by UK Research and Innovation | Biotechnology and Biological Sciences Research Council (BBSRC) Institute Strategic Program Grants BB/J004553/1 (Biotic Interactions) and BBS/E/J/000PR9797 (Plant Health) to the John Innes Centre and by two BBSRC DTP PhD studentships. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Researchers identify predictive factors of delirium in Sub-Saharan Africa

VANDERBILT UNIVERSITY MEDICAL CENTER

Research News

IMAGE: Douglas Heimburger, MD, MS, professor of Medicine and core faculty at the Vanderbilt Institute for Global Health (credit: Vanderbilt University Medical Center)

Severity of illness, history of stroke, and being divorced or widowed were independently predictive of delirium in hospitalized patients in Zambia, according to a study published in PLOS ONE.

A collaborative team of researchers from Vanderbilt University Medical Center and the University of Zambia Teaching Hospital published the risk factors as a follow-up look at the prevalence and impact of delirium, a form of acute brain dysfunction, in lower-resourced hospitals. Findings published in February showed delirium is widespread in patients admitted to the University Teaching Hospital, and the duration of delirium predicted both mortality and disability at six months after discharge.

The studies represent novel research in lower-resourced hospitals, and the findings highlight the breadth of a serious health problem that has existed off the radar, said Kondwelani Mateyo, MBChB, MMed, the hospital's chief pulmonary and critical care physician.

The next step is to explore interventional therapies while raising awareness about the prevalence and risks of delirium -- especially given that nearly 50% of people had delirium upon admission to the hospital. In the U.S., for instance, delirium more often develops in patients after admission and while in the ICU.

"We have to start the conversation, and you can't do that with the absence of data or evidence. We are making people aware that delirium is here and it's a widespread problem that has, up to this point, not been quantified. With these data we're able to see it's in our city, and we think this is representative of hospitals in the country and our region. Delirium is a factor in mortality and cognitive impairment," said Mateyo.

The findings on risk factors, coupled with the recent data on delirium prevalence, are a critical step toward finding ways to screen for and treat patients at high risk for delirium in order to drive improvements in long-term survival and functional status, said Justin Banerdt, MD, MPH, internal medicine resident at Yale School of Medicine and corresponding author, who led the study on the ground in Zambia while an MD/MPH student at Vanderbilt University School of Medicine. For example, findings from the study suggest that widespread use of validated severity-of-illness scores may allow health care providers in low-income countries to triage patients at high risk of delirium for further assessment and care.
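"Independently predictive" implies a multivariable model in which each factor contributes after adjusting for the others. Below is a minimal sketch of such a model on simulated data; the predictors mirror those reported, but the effect sizes are invented and the study's actual specification is in the PLOS ONE paper.

```python
# Sketch of a multivariable logistic model for delirium risk factors.
# Simulated data with invented effect sizes -- see the PLOS ONE paper
# for the study's actual model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 700
severity = rng.normal(size=n)              # illness severity score
stroke = rng.binomial(1, 0.15, size=n)     # history of stroke
divorced_widowed = rng.binomial(1, 0.20, size=n)

logit_p = -1.0 + 0.8 * severity + 0.7 * stroke + 0.5 * divorced_widowed
delirium = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([severity, stroke, divorced_widowed]))
fit = sm.Logit(delirium, X).fit(disp=False)
print("adjusted odds ratios:", np.round(np.exp(fit.params[1:]), 2))
```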

"One of the striking findings in our study is that nearly 50% of patients in this acutely ill non-ICU population had delirium at admission. Not only was severity of illness at admission a powerful, independent predictor of delirium, but there was also a substantial burden of critical illness at hospital presentation, suggesting that severe illness is a significant contributor to the high prevalence of delirium seen in this cohort. This speaks to how in the future we must endeavor to understand what is putting these patients at risk in the community before they even arrive at the hospital so that delirium-- and critical illness more generally-- can be identified earlier and managed more effectively across all levels of the health care system. This is an opportunity to look beyond the walls of the hospital to address inequities in a poor and very vulnerable population through health system strengthening and critical care capacity-building in Zambia," said Banerdt.

Delirium is an independent predictor of long-term mortality as well as cognitive and functional disability. Health care costs attributable to delirium have been estimated to range from $143 billion to $152 billion annually in the U.S.

"Dr. Mateyo is a path-breaker in Zambia and a huge advocate for his patients. Our team's results show that health systems need to be expanded and broadened. Since we found that people are coming to the hospital with delirium, we need to find ways to develop a higher consciousness in communities about what delirium is. Dr. Mateyo's expertise and leadership, and that of his colleagues at the University of Zambia and the Zambia Ministry of Health, will be key in showing the global public health community how to positively impact the lives of patients," said Douglas Heimburger, MD, MS, professor of Medicine and core faculty at the Vanderbilt Institute for Global Health. Heimburger leads projects with grant funding from the Fogarty International Center of the National Institutes of Health (NIH).

###