Saturday, January 14, 2023

Offshore and coastal risk analyses may misrepresent wave storms from extreme weather like bomb cyclones

A new study shows that models used across the globe to estimate the height of coastal storm waves — critical information when building ports, reinforcing infrastructure and estimating flooding — can differ by several feet

Peer-Reviewed Publication

UNIVERSITY OF CENTRAL FLORIDA

ORLANDO, Jan. 12, 2023 – Extreme weather events, such as the storm waves generated by the cyclone that recently moved across California, may be underestimated in many models used to estimate coastal flooding, according to a new study led by University of Central Florida researchers.

The study, which was published this week in the journal Science Advances, shows that models used across the globe to estimate the height of coastal storm waves — critical information when building ports, reinforcing infrastructure and estimating flooding — can differ by several feet, leaving some areas potentially underprotected during extreme wave events.

The first-of-its-kind study compared wave estimates from 12 global models with historical buoy observations across different ocean areas.

It found that existing models can reliably represent historical buoy observations across many areas, but no model performs well everywhere, and estimates of extreme wave heights can vary between models by up to approximately 20 feet in some places.
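The kind of inter-model comparison described above can be sketched in a few lines. The numbers below are invented for illustration, not taken from the study: for each buoy site, each model supplies an extreme-wave-height estimate, and the spread is the gap between the highest and lowest estimates.

```python
# Hypothetical sketch of the comparison described: 12 models, each giving an
# extreme-wave-height estimate (feet) at several buoy sites. All values are
# invented; the point is how inter-model spread is measured.
import numpy as np

rng = np.random.default_rng(0)
n_models, n_sites = 12, 5

# Invented estimates: baseline of 30 ft plus up to 8 ft of model disagreement.
estimates = 30 + 8 * rng.random((n_models, n_sites))

# Inter-model spread at each site: largest minus smallest estimate.
spread = estimates.max(axis=0) - estimates.min(axis=0)
for site, s in enumerate(spread):
    print(f"site {site}: inter-model spread = {s:.1f} ft")
```

A large spread at a site flags a location where risk assessments based on any single model could be misleading.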

The study shows that such discrepancies largely exceed projected 21st-century changes in coastal extreme wave events under a high-emission scenario (about 9 degrees Fahrenheit of warming by 2100).

“Extreme wave events pose a major risk to offshore-coastal infrastructure and can dramatically affect global coastlines and natural ecosystems through widespread flooding and erosion,” says Joao Morim, a coastal climate researcher in UCF’s Department of Civil, Environmental, and Construction Engineering and the study’s lead author. “Understanding their magnitude and frequency for today’s climate is required for offshore and coastal designs and risk assessments, and it’s essential to assess potential future changes.”

Morim says that, because observations with global coverage are lacking, wave models are typically used for many different applications, such as coastal flood risk assessments.

 “We show that depending on the product used, estimated flooding can vary by several feet,” Morim says.

The researchers found that to assess offshore and coastal risk more accurately and inform future mitigation and adaptation responses, decision makers and policymakers must account for differences between global wave models and incorporate observational measurements from today’s climate.

This means considering different models and using additional, modern observational data to better constrain extreme wave estimates under current climate conditions.

This should be done before considering changes due to global warming and sea level rise, which are the main focus of many current global and regional flood risk assessments, the researchers say.

The additional data could include long-term observational records from satellite missions and more data from buoys.

“I feel that our results provide a clear message that published broad-scale offshore and coastal risk assessments for present and future climate using extreme wave data taken from specific global and regional wave model products can be misleading and need to consider multiple products to fully assess offshore-coastal impacts and make robust conclusions,” Morim says.

“The results, which are surprising given that estimates between current models can differ by several feet, show that we need to consider uncertainties in extreme wave event estimates for today’s climate while also addressing projected future changes due to global climate change,” he says.

For Florida, the study shows considerable uncertainty in wave flooding estimates for the state’s west coast, stemming from the wave models analyzed and the discrepancies between them, which partly result from their differing ability to resolve extreme wave storms caused by tropical cyclones.

“We feel that our research improves our current knowledge and makes a clear point that no uncertainty can be neglected when considering global and regional adaptation,” Morim says.

Study co-authors were Thomas Wahl, an associate professor in UCF’s Department of Civil, Environmental and Construction Engineering; Sean Vitousek with the Pacific Coastal Marine Science Center, U.S. Geological Survey; Sara Santamaria-Aguilar, a postdoctoral researcher in UCF’s Department of Civil, Environmental and Construction Engineering; Ian Young with the Department of Infrastructure Engineering, University of Melbourne, Parkville, Victoria; and Mark Hemer with the Commonwealth Scientific and Industrial Research Organisation (CSIRO) Oceans and Atmosphere, Hobart, Tasmania, Australia.

The research was funded with support by the UCF Pre-eminent Postdoctoral Program; the NASA Sea Level Change Team; the U.S. National Science Foundation; the U.S. Geological Survey; the Australian Research Council; the Australian Commonwealth National Environmental Science Program and CSIRO’s Climate Science Centre.

Morim received his doctorate in climate change and coastal risk from Griffith University on the Gold Coast, Queensland, Australia. He joined UCF in 2021.

Wahl earned his doctorate in civil engineering from the University of Siegen, Germany, and joined UCF’s Department of Civil, Environmental and Construction Engineering, part of UCF’s College of Engineering and Computer Science, in 2017. He is also a member of UCF’s National Center for Integrated Coastal Research and Sustainable Coastal Systems faculty cluster.

Study title: Understanding uncertainties in contemporary and future extreme wave events for broad-scale impact and adaptation planning

CONTACT: Robert H. Wells, Office of Research, robert.wells@ucf.edu

 

Mapping out key sources of emissions for climate change mitigation

Peer-Reviewed Publication

NATIONAL INSTITUTE FOR ENVIRONMENTAL STUDIES

Fig 1. Regional forcing contributions under the historical and two future scenarios with low forcing levels 

IMAGE: Regional forcing contributions in 2016 under the historical scenario (A-1) and in 2100 under the 1.9 W/m² (A-2) and 2.6 W/m² (A-3) scenarios. Panel B shows the forcing contributions in 2100 relative to 2016 levels under the 1.9 W/m² (B-1) and 2.6 W/m² (B-2) scenarios. The value on top of each bar indicates the mean, and the error bar indicates the one-sigma uncertainty obtained from the ensemble of emulated climate models.

CREDIT: XUANMING SU

Different regions of the world and different sectors of activity (e.g., energy, transportation, agriculture) emit different amounts of greenhouse gases and other air pollutants that affect climate change. Knowing the details of these contributions can help policy makers decide where to focus their efforts to meet their targets under the Paris Agreement. This treaty, signed in 2015, set goals to limit global warming to well below 2°C and to pursue efforts to limit the increase to 1.5°C above preindustrial levels, to avoid dangerous impacts of climate change. Each of these gases interacts with incoming and outgoing heat (or radiation) in different ways, and together they determine the amount of warming (or cooling) and climate change we experience. It is therefore important to understand how these interactions with heat, or “radiative forcings,” might change between regions and sectors over time so we can mitigate climate change effectively.
A team of researchers therefore set out to study how these contributions from different regions of the world, sectors and types of climate forcers may vary. Their study covers both historical contributions and future scenarios developed by research groups around the world, with two low forcing targets: 1.9 W/m² (radiative forcing corresponding to a temperature increase of 1.5°C) and 2.6 W/m² (2°C). Using substantial computational resources, they calculated the contributions from different regions, sectors and climate forcers with a climate model. The authors note that their analysis is broader than previous studies: no study had yet assessed the contributions from different regions, sectors and climate forcers at multiple points in the past and the future within a single analytical framework. Their comprehensive attribution assessment shows that achieving these low forcing levels relies strongly on negative CO2 emissions, through carbon capture and storage methods, under the future scenarios they evaluated. It also shows that most developing regions and certain sectors, such as housing and transport, could produce larger forcings in 2100 than at present, even though they are projected to invest substantial effort in decarbonising. Lastly, it highlights that China, followed by the US, has a crucial role to play in bringing the current trajectory down to meet these goals.
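The bookkeeping behind such an attribution can be illustrated with a toy example. The regions and numbers below are invented placeholders, not values from the paper: each region contributes some forcing in W/m², and the contributions sum to the global total from which shares are computed.

```python
# Toy attribution bookkeeping: invented regional forcing contributions (W/m^2)
# summing to a global total. Not data from the study.
contributions = {
    "USA": 0.40, "China": 0.45, "EU": 0.25,
    "India": 0.20, "Middle East & Africa": 0.35, "Other": 0.45,
}

total = sum(contributions.values())
shares = {region: value / total for region, value in contributions.items()}

# Report regions from largest to smallest share of the global forcing.
for region, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{region}: {100 * share:.0f}% of total forcing ({total:.2f} W/m^2)")
```

The same decomposition can be nested by sector and by individual forcer, which is what lets the study compare contributions across all three dimensions in one framework.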

Regional contributions
Figure 1 from the study highlights three regions accounting for nearly half of the total radiative forcing, in both the historical and future scenarios: the USA, China and the European Union. While the shares of the USA (green) and the European Union (violet) are projected to decrease in both low forcing scenarios, China’s contribution is projected to increase, as is the case for many developing regions (India, the Middle East and North Africa, Sub-Saharan Africa and Other Asia).

Sectoral contributions
Figure 2 shows that energy and industry would remain the two main sectors contributing to radiative forcing in both low forcing scenarios. In addition, negative CO2 emissions (grey) would be essential to keeping forcing at these low levels in 2100, using technologies of carbon capture and storage or BECCS (bioenergy with carbon capture and storage). As the figure shows, the major contributor to radiative forcing is CO2 (red). The researchers attribute this to the atmospheric lifetimes of the gases. The scenarios assume that measures to reduce emissions of air pollutants will continue in the coming decades, and the radiative forcings of methane and tropospheric ozone associated with these measures are relatively short-lived (decades), leaving little impact by 2100. CO2, however, is a long-lived gas, and even though strong emission reductions are assumed in the near term under both scenarios, the remaining CO2 emissions will still leave a strong footprint on radiative forcing at the end of the century. This highlights the need to make cutting CO2 emissions the first priority for long-term climate mitigation. It is interesting to note that the agricultural sector still holds an important share by 2100, reflecting that reducing emissions in this sector (mainly methane and nitrous oxide) will remain difficult through the end of the century.

This international study, involving research institutes in Japan and France, is a comprehensive assessment of the contributions to radiative forcing by region, sector and climate forcer, for both historical and future scenarios. It provides a valuable tool for mapping out where to direct mitigation efforts efficiently to align with these low forcing scenarios and meet the goal of limiting global warming to well below 2°C.

Forcing contributions of climate forcers in different sectors in 2016 under the historical scenario (A) and in 2100 under the 1.9 W/m² (B) and 2.6 W/m² (C) scenarios. The value on top of each bar indicates the mean, and the error bar indicates the one-sigma uncertainty obtained from the ensemble of emulated climate models. Open burning includes agricultural waste burning, forest burning and grass burning. Industry includes industry and solvents. Transport includes surface transport, aviation and international shipping.

CREDIT

Xuanming Su

All hands on deck

Ecologists and environmental scientists identify priorities within synthesis research to address pressing issues

Peer-Reviewed Publication

UNIVERSITY OF CALIFORNIA - SANTA BARBARA

In the face of dramatic and accelerating consequences of climate change, the many, often separate disciplines within ecology and environmental science need to come together to find answers and solutions to address and adapt to the increasingly complex shifts in our environment.

To meet the enormous challenge, more than 100 members of the environmental research community convened to organize and prioritize themes and issues to create a unified front in synthesis research.

“Over the next decade we will be facing huge environmental challenges and need to galvanize global efforts to address them,” said ecologist Ben Halpern, who led the large group through a virtual workshop at UC Santa Barbara’s National Center for Ecological Analysis & Synthesis. “We brought together a diverse community of ecologists and environmental scientists — about 120 people — to share ideas and key questions and help boil all that down into a set of priorities to guide the research community in the coming decade. We hope that people can use this work to support their efforts to pursue pressing research needs.”

Their results are published in the journal Ecosphere.

The group identified several priority topics: diversity, equity, inclusion and justice; coupled human-natural systems; actionable and use-inspired science; scale; generality, complexity and resilience; and predictability — all issues that become more significant as the opportunities for synthesis science grow with the increasing scope, scale and speed of data collection.

“There was a large emphasis on diversity, equity, inclusion and justice as both a focal point for synthesis research and a change in the process of how synthesis is done,” Halpern said. By lowering the barriers to the diversity of participants, according to the researchers, synthesis scientists would be able to better define and answer research questions that would result in policies that are more relevant and impactful for broader communities.

The other priorities, Halpern noted, “include ones that have been key issues for decades and remain unresolved and high priorities, as well as new topics.” The workshop group, which consisted of members from a wide array of career stages, institutions, backgrounds and geographies, noted issues such as how human values and decisions affect environmental outcomes; how to embrace local, Indigenous and experiential knowledge; and how to approach the complexity that emerges as shifts in one system drive major changes in others.

Meanwhile, synthesis science must also incorporate understanding across spatial and temporal scales, while acquiring sufficient replication and avoiding bias in its search for general principles to explain patterns and processes. Importantly, according to the researchers, in addressing pressing environmental questions, scientists must adopt an approach in which predictions are iteratively tested and updated with continuous feedback, as opposed to reaching for “perfect” forecasts in a world with constantly changing variables.

In the process and practice of synthesis science, the group identified two common threads: expanding participation and increasing the availability of data. Efforts must be made to support representation and integration of diverse scientists and perspectives, and to engage with marginalized communities, they said, while generating, integrating and providing access to high-quality data.

While the rate of climate change is increasing with effects that may seem daunting at times to the researchers who study them, Halpern is encouraged by the willingness of the workshop participants to come forward and share ideas — a model which he hopes becomes a way forward for synthesis research.

“I have never led a paper with so many coauthors, and in a way that prioritized equal voice and participation,” he said. “This work was truly a collaborative effort at a scale I never imagined possible. It was really inspiring to engage so many people, and I learned so much from hearing the feedback and revisions and conversations that ensued during the process of writing and revising this paper.”

New guidance to reduce human error and improve safety released as UK health service is brought to its knees

Human factors can be addressed in all parts of the healthcare system to reduce the risk and impact of human errors and adverse events


Peer-Reviewed Publication

AAGBI

The UK National Health Service is being brought to its knees by a perfect storm of difficulties: poor patient flow through hospitals, resulting in crowded emergency departments and long ambulance waits; rising respiratory illness, increasing workloads and staff absences; and pre-existing staff shortages compounded by strikes over pay and working conditions among nurses, ambulance crews and, potentially, junior doctors.

More than ever, there is a need to strengthen systems ensuring patient safety and to reduce the impact of human error in health care, using so-called ‘human factors’, an evidence-based scientific discipline used in safety critical industries. New guidance is being published today in the journal Anaesthesia (a journal of the Association of Anaesthetists) for clinicians, departments, hospitals and national healthcare organisations, to enable them to design and maintain safe systems that will reduce the risk and potential impact of human error by individuals or teams.

Human error, reluctance to challenge authority and reliance on inadequate systems were among key factors in a number of high-profile deaths, including those of Elaine Bromiley (2005) and Glenda Logsdail (2020) (see notes to editors). Both patients were having straightforward operations but teamworking and communication problems played a significant part in both of their deaths.

“We are not only discussing avoidable deaths here, but also long-term consequences in patients who survive when avoidable errors and adverse events occur,” says guidance co-author Dr Fiona Kelly, Consultant in Anaesthesia and Intensive Care Medicine at the Royal United Hospitals Bath NHS Foundation Trust, Bath, UK. “The new guidance has analysed all the potential areas in anaesthesia where human error can creep in with potentially devastating outcomes, and is likely to be applicable to other healthcare specialities. This ‘human factors’ approach aims to make it easy for workers to do the right thing, and difficult, or ideally impossible, for them to do the wrong thing.”

The guidance has been produced by a working party of the Difficult Airway Society and the Association of Anaesthetists, and is supported by the Royal College of Anaesthetists and other national organisations. The team includes Dr Kelly and Professor Chris Frerk, Consultant Anaesthetist at Northampton General Hospital, University of Leicester Medical School, and Chair of the Clinical Human Factors Group (a charity dedicated to making healthcare safer). The guidance is also being presented in a special session at the Winter Scientific Meeting of the Association of Anaesthetists in London from January 12-13.

The wide-ranging guidance addresses issues including the design of operating theatres; well-designed medical equipment and use of the most effective equipment; effective use of checklists before operating; encouraging staff of any seniority to speak up if they have safety concerns; the ability to learn not only from situations where things have gone wrong, sometimes fatally so, but also from situations where things have gone well; and training and education.

Human factors principles and strategies have been incorporated successfully into safety critical industries, including nuclear power, offshore oil and gas, aviation, construction, rail and the military. Implementation of human factors principles, education and methods within healthcare has made some progress in the past 20 years, including, for example, the adoption of regular team briefings and staff safety huddles. This has been due in part to the work of Martin Bromiley (husband of Elaine) who is an airline pilot fully accustomed to the day-in day-out use of safety procedures that minimise the risk of air accidents. The incredibly helpful and cooperative approach of both Mr Bromiley, and others including Richard Logsdail (husband of Glenda), in helping address systemic faults involved in the death of their partners, has played a huge role in enabling today’s new guidance.

Human factors strategies can be categorised into four domains arranged in a pyramid shape (figure 1 in the full paper) according to their likely effectiveness. Design (of environment, equipment and systems) is the strategy likely to be most effective and forms the base of the pyramid. ‘Designing out’ the chance of an error occurring reduces the requirement for the exceptional human performance commonly relied upon in healthcare. Design strategies are followed in order by barriers (which trap errors to prevent them progressing), mitigations (which reduce the consequences of errors, such as analysis of deaths or critical events) and education and training. However, the authors explain that the current UK healthcare system is more like the pyramid turned upside-down (figure 2 in the full paper), with heavy reliance on high levels of human performance and a resulting small and unstable foundation for safety. In the current UK healthcare climate, the authors say, this upside-down pyramid is even more unstable.

After a five-year process, the Working Group agreed on 12 recommendations to form the new guidance, split among different areas of the ‘pyramid’ (see notes to editors for full recommendations).

The authors say: “Healthcare relies on high levels of human performance, as described by the `human as the hero´ concept. However, human performance varies and is recognised to fall in high-pressure situations, meaning that it is not a reliable method of ensuring safety. Other safety-critical industries embed human factors principles into all aspects of their organisations to improve safety and reduce reliance on exceptional human performance; there is potential to do the same in anaesthesia and in healthcare in the broader sense.”

Adding that improving human factors is in no way a substitute for proper funding and resourcing of hospitals and healthcare systems, the authors conclude: “Although applying human factors science has the potential to save money in the long term, its proper implementation may require investment before reward can be reaped.”

Professor Frerk adds: “Care given within the NHS has become increasingly complex over the last 20 years, and is now delivered by teams of nurses, physiotherapists, pharmacists, doctors and other support staff. As many as 1 in 10 patients are harmed during their interactions with the NHS; this is not because staff don’t care or don’t try hard to do the right thing. The old model of healthcare where we assumed that having knowledge (and experience) about the heart, lungs, cancer or arthritis would be enough to keep patients safe is just not the case anymore. Adopting a ‘human factors’ approach across healthcare, to eliminate the risk and impact of human error, has been on the national agenda for more than 10 years. With this guidance, we want to start to move this approach into everyday practice for everyone in the NHS.”

Notes to editors:

 

The 12 recommendations:

 

Design

 

  1. Design of medical equipment should include input from human factors experts at an early stage (where it is possible to still change the design if necessary – this is not currently always the case). The medical equipment procurement process should include human factors assessments.
  2. Design of drug ampoules and packaging should incorporate human factors principles to optimise readability and reduce the risk of mis-selection: anaesthetists, pharmacists and procurement departments should ensure that these principles are prioritised during their purchasing processes. Improvements that could make a difference include making the drug name more prominent than the manufacturer’s name and logo, prioritising generic drug names over trade names, and considering standardised use of colour while being mindful of the impact of colour blindness.
  3. Design of safe working environments should incorporate human factors principles. Regular reviews should be carried out to ensure that safety has not been compromised – this can cover anything from the design of the whole hospital to operating theatre design, and how moveable equipment is used in each operation.

 

Barriers

 

  1. Operating theatre list planning and scheduling should include additional time allocated for complex cases and for high turnover lists to enable adequate preparation and reduce time pressures on staff.
  2. Cognitive aids, including algorithms and checklists, should be designed and tested using human factors principles to ensure usability and efficacy.
  3. Non-technical skills can be learned and developed, and should be practised during everyday work to ensure that staff become skilled in their use and are able to use them effectively. (Many examples exist and include all staff wearing a name badge and using first names of team members, and situational awareness – being aware of what has happened in a situation, what is currently happening, and what could happen in the coming moments.)

 

Mitigations

 

  1. Investigation of critical incidents and adverse events should be performed by teams that include members with human factors training, using a human factors investigative tool, and lessons identified should be shared. For example, a new patient safety incident response framework (PSIRF), based on human factors principles, is replacing the root cause analysis investigation tools currently in use in the UK healthcare system. (Hospitals have been given until autumn 2023 to implement this.)
  2. Morbidity and mortality meetings should be part of the regular work of all anaesthetic departments and should also include learning from cases that go well. Time within job plans should be allocated to enable staff to prepare for and attend these meetings.

 

Education and training

 

  1. Human factors education and training should be provided at an appropriate level for all anaesthetists and all members of operating theatre teams. It should include the role of good design in healthcare, an appreciation of a systems perspective, the importance of non-technical skills and strategies to improve these.
  2. Non-technical skills training and interprofessional simulation training: Teams that work together should train together. Non-technical skills should be learned during classroom and in-theatre teaching, woven into all anaesthetic workshops and courses and rehearsed during regular interprofessional simulation training. Time and resources should be allocated to allow for this.

 

Well-being

 

1. Staff well-being should be optimised by hospitals and anaesthetic departments by implementing organisational strategies.

 

Strategy

 

1. Each anaesthetic department should have a human factors lead with an appropriate level of training. Every hospital should have patient safety leads with appropriate training and qualifications; in England, this is already included in Health Education England recommendations.

 

Cutting costs and emissions in beef production

Peer-Reviewed Publication

UNIVERSITY OF QUEENSLAND

Cattle 

IMAGE: The tool could have the triple effect of helping the global beef industry reduce costs and greenhouse gas emissions while meeting demand.

CREDIT: ADOBE

A research team led by the University of Queensland has developed a tool to help the global beef industry simultaneously reduce costs and greenhouse gas emissions while meeting demand for meat.

The team assessed the economic and emissions impacts of different cattle feeds at different locations around the globe to formulate a framework to guide and inform industry sustainability efforts.

Postdoctoral Research Fellow Adam C. Castonguay from UQ’s School of Veterinary Science said the study showed that as much as 85 per cent of emissions could be cut without an overall economic hit to the beef sector.

“This can be achieved by opting for more efficient feeds and locations, and restoring forests in inefficient areas, without increasing global costs of production or reducing demand for beef,” Mr Castonguay said.

“We have mapped out the most efficient locations around the world to produce beef and the maps change when factors are altered, such as how much society values reducing emissions over reducing production costs.

“This has given us an unprecedented insight into the ‘what, where, and why’ of beef production at a global level and decisions about the future of the industry can be informed by inputting trade-offs and opportunities.”
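The trade-off the researchers describe, where the "best" feed and location shift as society's weighting of emissions against costs changes, can be sketched as a weighted-sum objective. The options, prices and emissions figures below are entirely hypothetical, invented only to show the mechanics.

```python
# Minimal sketch of the cost-vs-emissions trade-off described above.
# Options and numbers are hypothetical, not from the study.
options = [
    # (name, cost per tonne of beef, emissions per tonne of beef)
    ("grain-fed, region A", 2400.0, 18.0),
    ("grass-fed, region B", 2100.0, 30.0),
    ("improved feed, region C", 2300.0, 14.0),
]

def best_option(weight_emissions: float, emissions_price: float = 100.0):
    """Pick the option minimising cost + weight * price * emissions.

    weight_emissions encodes how much society values cutting emissions
    relative to cutting costs (0 = costs only)."""
    return min(
        options,
        key=lambda o: o[1] + weight_emissions * emissions_price * o[2],
    )

print(best_option(0.0)[0])  # -> grass-fed, region B (cheapest, ignoring emissions)
print(best_option(1.0)[0])  # -> improved feed, region C (emissions weighted in)
```

Sweeping the weight from 0 upward reproduces, in miniature, the way the study's global maps of preferred production locations change as the relative value placed on emission cuts grows.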

The research group says the tool could be used by governments and industry to develop policy and strategy.  

“There will be continued global demand for beef and there are a huge number of livelihoods associated with it, so this research aims to find an appropriate balance to maintain the bottom line of the sector,” Mr Castonguay said.

“Further economic modelling and fine-tuning the data for specific locations would reveal the implications of any changes, including on beef prices for consumers.”

Mr Castonguay said the optimisation method developed by the team using mapping technology overcame historic roadblocks to finding an environmental-economic balance.

“There are many innovations in cattle feed to increase productivity or reduce emissions which have not been analysed as a trade-off with other values and goals,” he said.

“Our results highlight the massive potential for improvements in the way we produce beef, to help us to meet global sustainability goals.

“The extent to which we reduce emissions and production costs depends on our values or preferences as a society.”

This research has been published in Nature Sustainability.

Symptoms of illness help pathogens spread amongst songbirds

Both songbirds and humans suffer from contagious disease outbreaks. Studying what aspects of illness makes a sick animal contagious can help predict whether some pathogens will become more harmful over time

Peer-Reviewed Publication

VIRGINIA TECH

Bird 

IMAGE: This image shows the inert fluorescent powder placed around the eyes of birds to measure how well they "spread" the powder while infected with conjunctivitis. Photo courtesy of Dana Hawley.

CREDIT: VIRGINIA TECH

It’s “Treasure Island” author Robert Louis Stevenson who is credited with coining the phrase, “You cannot make an omelet without breaking eggs.” For us humans, it’s now cliché. For pathogens, it’s words to live by. Or, rather, spread by.

Like all living organisms, pathogens want to thrive. Aside from cellular reproduction, though, the best future for them lies in moving from host to host. Think of each host as Stevenson’s eggs, unwittingly waiting to be, if not broken, certainly cracked. Meaning ill. That’s why pathogens — from conjunctivitis, commonly known as pink eye, to a common cold, to a disease as severe as COVID-19 — make their hosts sick: Spread is sometimes only made possible by expulsion via swollen red eyes, coughing or sneezing, or by passing through bodily fluids, according to Virginia Tech biologist Dana Hawley.

“For a pathogen, ‘spreading’ is their key form of reproduction. And when we think about why pathogens make their hosts sick, it’s long been a mystery, because making a host sick or making your host die is superficially not a good way for a pathogen to be able to spread. A very sick host will stay home and not interact as much as others, which means less spread potential for a pathogen,” said Hawley, a professor in the Department of Biological Sciences, part of the Virginia Tech College of Science.

But here’s the caveat: “Making your hosts feel ill can be important for getting some of the copies of yourself out of the host you are infecting and into another. So there is a trade-off for the pathogen,” Hawley said. “Making your host feel sick means that host may not interact with as many other hosts as they normally would — this is bad for the pathogen — but when they do have interactions, a very sick host that is coughing or has swollen eyes is going to be much more likely to spread whatever pathogen it has in its body. This is good for the pathogen.”
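The trade-off Hawley describes can be sketched as a toy model: a pathogen's expected spread is the product of how often its host interacts with others (which falls as symptoms worsen) and how likely each interaction is to transmit (which rises as symptoms worsen). The functional forms and numbers below are hypothetical illustrations, not parameters from the study.

```python
def contacts_per_day(severity):
    """Sicker hosts interact less (severity ranges from 0 to 1)."""
    return 10.0 * (1.0 - 0.7 * severity)

def transmission_per_contact(severity):
    """Sicker hosts deposit more pathogen per interaction."""
    return 0.05 + 0.4 * severity

def expected_spread(severity):
    """Expected transmissions per day: contacts x per-contact risk."""
    return contacts_per_day(severity) * transmission_per_contact(severity)

for s in (0.0, 0.5, 1.0):
    print(f"severity={s:.1f}  expected spread={expected_spread(s):.2f}")
```

With these made-up curves, moderate-to-high severity spreads more than no symptoms at all, but the product eventually turns down as severely ill hosts withdraw — the "up to some limit" that Hawley notes.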

Using songbirds, a type known as finches, whose populations are affected by a pink-eye disease in nature, Hawley and a team of researchers from Virginia Tech and the University of Memphis in Tennessee have shown just how easily these pathogens — in this case, a form of conjunctivitis common in birds, but harmless to humans — spread. And they did it without having the pink eye pathogen itself spread from bird to bird.

Instead, the team used UV fluorescent powder coatings and tracked the powder rather than the pathogen. During the experiment, birds were divided into three groups — not ill, mildly ill, and strongly ill with conjunctivitis — and each bird was housed with four healthy flockmates. By applying a powder coating around the outer eye of each bird, but not inside the eye, the researchers could track how much powder was spread to flockmates from birds that were strongly, mildly, or not ill with pink eye.

“We weren’t actually tracing the spread of conjunctivitis. We were tracing the spread of powder as a model for the likely spread of conjunctivitis,” said Hawley, who is also an affiliated member of the Virginia Tech Fralin Life Sciences Institute’s Global Change Center and its Center for Emerging, Zoonotic, and Arthropod-borne Pathogens, about the study, published today in the journal Royal Society Open Science.

During the study, birds were kept in large flight cages, sharing feeders and potentially spreading powder to cage mates. Feeder surfaces were one of the main avenues of the powder’s travel, according to the study.

There were some surprises along the way, Hawley said. The finches experiencing the strongest conjunctivitis symptoms were far less likely to eat but nonetheless spread the powder at a higher rate than the mildly ill birds who spent more time feeding.

“In our study system, the benefits of making your host sicker by increasing eye swelling outweighed the cost of making the finches feed and interact less,” Hawley said. “So overall, this pathogen is going to likely evolve to cause more harm to birds in nature so that it can spread at a higher rate, but up to some limit, because if the pathogen kills a bird immediately, the pathogen doesn’t have a chance to spread at all.”

The human factor

What does all this mean for humans and the spread of a common cold or COVID-19 at a doctor’s office or the cinema, or the spread of conjunctivitis amongst young children at a nursery school or day care?

The mantra of “stay home if you’re sick” still applies more than ever, according to Hawley. This study suggests that symptoms will make the spread of anything you are infected with much more likely. And wear a face mask — and not just for potential spread of COVID. “Wearing masks when you are coughing from any illness can likely go a long way in preventing disease spread,” Hawley said. “For pink eye, keeping kids isolated is going to be key because young children are just not going to be able to wash their hands or avoid touching each other — speaking from experience as a parent.”

Evolution plays another part. Again, pathogens are living organisms and prone to the rules of life, including evolution. “This goes back to the idea that everyone hoped that COVID would evolve to become milder over time,” Hawley said. “Our study shows that the pressures on pathogens are complicated. On the one hand, being mild is good for pathogens if it keeps your host out and about and in others’ company — good for spread, but on the other hand, being mild may mean that none of the pathogen makes it out of the host and into another because your host isn’t coughing or depositing as much pathogen onto hands or other surfaces. So pathogens are in many cases going to be favored to make us sick.”

In other words, for humans, common sense actions can prevent us from being those proverbial eggs spoken of by Stevenson.

Co-authors on the paper include Courtney A. Thomason, a former postdoctoral associate at Virginia Tech from January 2015 to October 2017, who is now at the Tennessee Department of Environment and Conservation; Matt Aberle, who earned a master's degree in biological sciences in 2018 and is now a wildlife biologist with Oregon Department of Forestry; and Richard Brown, who graduated with a bachelor’s degree in wildlife conservation from the Virginia Tech College of Natural Resources and Environment in 2017 and is now a master's student at George Mason University; as well as James S. Adelman, an assistant professor at the University of Memphis.

Funding for the study came from two National Science Foundation grants and additional funding from the National Institute of General Medical Sciences, the latter part of the National Institutes of Health.


As climate warms, drier air likely to be more stressful than less rainfall for Douglas-fir trees

Peer-Reviewed Publication

OREGON STATE UNIVERSITY

IMAGE: Douglas-fir.

CREDIT: Lina DiGregorio

CORVALLIS, Ore. – Douglas-fir trees will likely experience more stress from drier air as the climate changes than they will from less rain, computer modeling by Oregon State University scientists shows.

The research is important because Douglas-fir are widespread throughout the Pacific Northwest, an iconic species with ecological, cultural and economic significance, and learning how the trees respond to drought is crucial for understanding forest sensitivity to a shifting climate.

Douglas-fir grow in a range that stretches from northern British Columbia to central California, and also includes the Rocky Mountains and northeastern Mexico. In Oregon, Douglas-fir are found in a variety of mixed conifer and hardwood forests, from sea level to 5,000 feet, and can reach a massive size; a tree on Bureau of Land Management land in Coos County is more than 300 feet tall and greater than 11 feet in diameter.

Native Americans traditionally used the wood of Douglas-fir, Oregon’s official state tree since 1936, for fuel and for tools, its pitch as a sealant and many parts of the tree for medicinal purposes.

A versatile timber tree, Douglas-fir is a source of softwood products including boards, railroad ties, plywood veneer and wood fiber. Oregon leads all U.S. states in softwood production and most of that is Douglas-fir.

The OSU study, published in Agricultural and Forest Meteorology, simulated the response of a 50-year-old stand of Douglas-fir on the Oregon Cascade Range’s west slope to less rain and higher “vapor pressure deficit,” or VPD – basically the atmosphere’s drying power.
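VPD has a simple definition: the gap between how much water vapor the air could hold at saturation and how much it actually holds. A common way to compute it from temperature and relative humidity uses the Tetens approximation for saturation vapor pressure (the FAO-56 form); this is a standard meteorological formula, not code from the study.

```python
import math

def saturation_vapor_pressure_kpa(temp_c):
    """Tetens approximation for saturation vapor pressure (kPa) over water."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def vpd_kpa(temp_c, rel_humidity_pct):
    """Vapor pressure deficit: saturation pressure minus actual vapor pressure."""
    es = saturation_vapor_pressure_kpa(temp_c)
    ea = es * rel_humidity_pct / 100.0
    return es - ea

# Warmer air at the same relative humidity has a larger deficit,
# i.e. greater drying power on foliage and soil.
print(round(vpd_kpa(20.0, 50.0), 2))
print(round(vpd_kpa(35.0, 50.0), 2))
```

Because saturation pressure rises roughly exponentially with temperature, heat waves drive VPD up sharply even when humidity is unchanged — which is why the study flags heat-driven VPD, rather than rainfall alone, as the larger stressor.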

A team led by Karla Jarecke, a postdoctoral researcher in the OSU College of Earth, Ocean, and Atmospheric Sciences, sought to look at how the mechanisms behind carbon fixation and water “fluxes” – exchanges of water between trees and the atmosphere – would respond to decreases in rainfall and increases in VPD.

Douglas-fir, like other plants, create food for themselves using sunlight, carbon dioxide and water during photosynthesis. The process pulls CO2, a greenhouse gas, from the air, releases oxygen and results in the long-term storage of carbon in the wood and roots.

“What governs carbon fixation and water fluxes in response to increased temperatures and water limitation in regions with Mediterranean climates – wet winters and dry summers – is only partially understood,” said Jarecke, who began the research as a doctoral student in the OSU College of Forestry. “High VPD and lack of soil moisture can create significant water stress in forests, but dry atmosphere and lack of rainfall are strongly linked, making it difficult to discern their independent effects. They tend to both occur during the summer.”

Jarecke and collaborators including the College of Forestry’s Kevin Bladon and Linnia Hawkins and the U.S. Forest Service’s Steven Wondzell used a computer model to disentangle the effects of the two phenomena. The model uses a series of equations that illustrate how well Douglas-firs are equipped to deal with water stress, and it showed that less spring and summer rain is likely to have a comparatively smaller impact on forest productivity than increased VPD.

“Decreasing spring and summer precipitation did not have much of an effect on Douglas-fir water stress because moisture remained plentiful deep in the soil profile,” Jarecke said. “This demonstrated that the effect of reduced rainfall under future climate change may be minimal but will depend on subsurface water availability, which is determined by soil properties and rooting depths.”

She said heat-driven increases in vapor pressure deficit, however, are likely to cause water stress regardless of the amount of moisture in the soil, adding that “many knowledge gaps remain concerning how trees will respond to extreme temperatures and VPD anomalies such as the record-breaking temperatures that occurred in the Northwest in the summer of 2021.”

Bladon added that the Oregon State study shows the important role of atmospheric droughts in creating stress conditions for trees.

“This has potential implications for not only driving substantial tree mortality, but also influencing wildfires, as other studies have shown strong relationships between VPD and forest area burned in the western United States,” he said.

Karla Jarecke, left, and Lauren Roof collect soil samples (photo by Lina DiGregorio).